The Challenge of Large-Scale Literacy Improvement
ERIC Educational Resources Information Center
Levin, Ben
2010-01-01
This paper discusses the challenge of making large-scale improvements in literacy in schools across an entire education system. Despite growing interest and rhetoric, there are very few examples of sustained, large-scale change efforts around school-age literacy. The paper reviews 2 instances of such efforts, in England and Ontario. After…
Intra-reach headwater fish assemblage structure
McKenna, James E.
2017-01-01
Large-scale conservation efforts can take advantage of modern large databases and regional modeling and assessment methods. However, these broad-scale efforts often assume uniform average habitat conditions and/or species assemblages within stream reaches.
MAINTAINING DATA QUALITY IN THE PERFORMANCE OF A LARGE SCALE INTEGRATED MONITORING EFFORT
Macauley, John M. and Linda C. Harwell. In press. Maintaining Data Quality in the Performance of a Large Scale Integrated Monitoring Effort (Abstract). To be presented at EMAP Symposium 2004: Integrated Monitoring and Assessment for Effective Water Quality Management, 3-7 May 200...
The Need for Large-Scale, Longitudinal Empirical Studies in Middle Level Education Research
ERIC Educational Resources Information Center
Mertens, Steven B.; Caskey, Micki M.; Flowers, Nancy
2016-01-01
This essay describes and discusses the ongoing need for large-scale, longitudinal, empirical research studies focused on middle grades education. After a statement of the problem and concerns, the essay describes and critiques several prior middle grades efforts and research studies. Recommendations for future research efforts to inform policy…
Grossman, Murray; Powers, John; Ash, Sherry; McMillan, Corey; Burkholder, Lisa; Irwin, David; Trojanowski, John Q.
2012-01-01
Non-fluent/agrammatic primary progressive aphasia (naPPA) is a progressive neurodegenerative condition most prominently associated with slowed, effortful speech. A clinical imaging marker of naPPA is disease centered in the left inferior frontal lobe. We used multimodal imaging to assess large-scale neural networks underlying effortful expression in 15 patients with sporadic naPPA due to frontotemporal lobar degeneration (FTLD) spectrum pathology. Effortful speech in these patients is related in part to impaired grammatical processing, and to phonologic speech errors. Gray matter (GM) imaging shows frontal and anterior-superior temporal atrophy, most prominently in the left hemisphere. Diffusion tensor imaging reveals reduced fractional anisotropy in several white matter (WM) tracts mediating projections between left frontal and other GM regions. Regression analyses suggest disruption of three large-scale GM-WM neural networks in naPPA that support fluent, grammatical expression. These findings emphasize the role of large-scale neural networks in language, and demonstrate associated language deficits in naPPA. PMID:23218686
Critical Issues in Large-Scale Assessment: A Resource Guide.
ERIC Educational Resources Information Center
Redfield, Doris
The purpose of this document is to provide practical guidance and support for the design, development, and implementation of large-scale assessment systems that are grounded in research and best practice. Information is included about existing large-scale testing efforts, including national testing programs, state testing programs, and…
ESRI applications of GIS technology: Mineral resource development
NASA Technical Reports Server (NTRS)
Derrenbacher, W.
1981-01-01
The application of geographic information systems technology to large scale regional assessment related to mineral resource development, identifying candidate sites for related industry, and evaluating sites for waste disposal is discussed. Efforts to develop data bases were conducted at scales ranging from 1:3,000,000 to 1:25,000. In several instances, broad screening was conducted for large areas at a very general scale with more detailed studies subsequently undertaken in promising areas windowed out of the generalized data base. Increasingly, the systems which are developed are structured as the spatial framework for the long-term collection, storage, referencing, and retrieval of vast amounts of data about large regions. Typically, the reconnaissance data base for a large region is structured at 1:250,000 scale, data bases for smaller areas being structured at 1:25,000, 1:50,000 or 1:63,360. An integrated data base for the coterminous US was implemented at a scale of 1:3,000,000 for two separate efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurd, Alan J.
2016-04-29
While the stated reason for asking this question is “to understand better our ability to warn policy makers in the unlikely event of an unanticipated SRM geoengineering deployment or large-scale field experiment”, my colleagues and I felt that motives would be important context because the scale of any meaningful SRM deployment would be so large that covert deployment seems impossible. However, several motives emerged that suggest a less-than-global effort might be important.
Large Composite Structures Processing Technologies for Reusable Launch Vehicles
NASA Technical Reports Server (NTRS)
Clinton, R. G., Jr.; Vickers, J. H.; McMahon, W. M.; Hulcher, A. B.; Johnston, N. J.; Cano, R. J.; Belvin, H. L.; McIver, K.; Franklin, W.; Sidwell, D.
2001-01-01
Significant efforts have been devoted to establishing the technology foundation to enable the progression to large scale composite structures fabrication. We are not capable today of fabricating many of the composite structures envisioned for the second generation reusable launch vehicle (RLV). Conventional 'aerospace' manufacturing and processing methodologies (fiber placement, autoclave, tooling) will require substantial investment and lead time to scale up. Out-of-autoclave process techniques will require aggressive efforts to mature the selected technologies and to scale up. Focused composite processing technology development and demonstration programs utilizing the building block approach are required to enable envisioned second generation RLV large composite structures applications. Government/industry partnerships have demonstrated success in this area and represent the best combination of skills and capabilities to achieve this goal.
Gray, B.R.; Shi, W.; Houser, J.N.; Rogala, J.T.; Guan, Z.; Cochran-Biederman, J. L.
2011-01-01
Ecological restoration efforts in large rivers generally aim to ameliorate ecological effects associated with large-scale modification of those rivers. This study examined whether the effects of restoration efforts, specifically those of island construction, within a largely open water restoration area of the Upper Mississippi River (UMR) might be seen at the spatial scale of that 3476 ha area. The cumulative effects of island construction, when observed over multiple years, were postulated to have made the restoration area increasingly similar to a positive reference area (a proximate area comprising contiguous backwater areas) and increasingly different from two negative reference areas. The negative reference areas represented the Mississippi River main channel in an area proximate to the restoration area and an open water area in a related Mississippi River reach that has seen relatively little restoration effort. Inferences on the effects of restoration were made by comparing constrained and unconstrained models of summer chlorophyll a (CHL), summer inorganic suspended solids (ISS) and counts of benthic mayfly larvae. Constrained models forced trends in means or in both means and sampling variances to become, over time, increasingly similar to those in the positive reference area and increasingly dissimilar to those in the negative reference areas. Trends were estimated over 12- (mayflies) or 14-year sampling periods, and were evaluated using model information criteria. Based on these methods, restoration effects were observed for CHL and mayflies while evidence in favour of restoration effects on ISS was equivocal. These findings suggest that the cumulative effects of island building at relatively large spatial scales within large rivers may be estimated using data from large-scale surveillance monitoring programs. Published in 2010 by John Wiley & Sons, Ltd.
Schmoldt, D.L.; Peterson, D.L.; Keane, R.E.; Lenihan, J.M.; McKenzie, D.; Weise, D.R.; Sandberg, D.V.
1999-01-01
A team of fire scientists and resource managers convened 17-19 April 1996 in Seattle, Washington, to assess the effects of fire disturbance on ecosystems. Objectives of this workshop were to develop scientific recommendations for future fire research and management activities. These recommendations included a series of numerically ranked scientific and managerial questions and responses focusing on (1) links among fire effects, fuels, and climate; (2) fire as a large-scale disturbance; (3) fire-effects modeling structures; and (4) managerial concerns, applications, and decision support. At the present time, understanding of fire effects and the ability to extrapolate fire-effects knowledge to large spatial scales are limited, because most data have been collected at small spatial scales for specific applications. Although we clearly need more large-scale fire-effects data, it will be more expedient to concentrate efforts on improving and linking existing models that simulate fire effects in a georeferenced format while integrating empirical data as they become available. A significant component of this effort should be improved communication between modelers and managers to develop modeling tools to use in a planning context. Another component of this modeling effort should improve our ability to predict the interactions of fire and potential climatic change at very large spatial scales. The priority issues and approaches described here provide a template for fire science and fire management programs in the next decade and beyond.
Michael K. Young; Kevin S. McKelvey; Kristine L. Pilgrim; Michael K. Schwartz
2013-01-01
There is growing interest in broad-scale biodiversity assessments that can serve as benchmarks for identifying ecological change. Genetic tools have been used for such assessments for decades, but spatial sampling considerations have largely been ignored. Here, we demonstrate how intensive sampling efforts across a large geographical scale can influence identification...
ERIC Educational Resources Information Center
Glazer, Joshua L.; Peurach, Donald J.
2013-01-01
The development and scale-up of school improvement networks is among the most important educational innovations of the last decade, and current federal, state, and district efforts attempt to use school improvement networks as a mechanism for supporting large-scale change. The potential of improvement networks, however, rests on the extent to…
Large-Scale 3D Printing: The Way Forward
NASA Astrophysics Data System (ADS)
Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid
2018-03-01
Research on small-scale 3D printing has rapidly evolved, where numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.
Scaling NASA Applications to 1024 CPUs on Origin 3K
NASA Technical Reports Server (NTRS)
Taft, Jim
2002-01-01
The long and highly successful joint SGI-NASA research effort in ever larger SSI systems was to a large degree the result of the successful development of the MLP scalable parallel programming paradigm developed at ARC: 1) MLP scaling in real production codes justified ever larger systems at NAS; 2) MLP scaling on the 256p Origin 2000 gave SGI impetus to productize 256p; 3) MLP scaling on 512p gave SGI courage to build the 1024p O3K; and 4) the history of MLP success resulted in an IBM Star Cluster based MLP effort.
Evaluating stream trout habitat on large-scale aerial color photographs
Wallace J. Greentree; Robert C. Aldrich
1976-01-01
Large-scale aerial color photographs were used to evaluate trout habitat by studying stream and streambank conditions. Ninety-two percent of these conditions could be identified correctly on the color photographs. Color photographs taken 1 year apart showed that rehabilitation efforts resulted in stream vegetation changes. Water depth was correlated with film density:...
Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, Wes
2016-07-24
The primary challenge motivating this team’s work is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who are able to perform analysis only on a small fraction of the data they compute, resulting in the very real likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, an approach that is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by DOE science projects. In large part, our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE HPC facilities, though we expected to have impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve that objective, we assembled a unique team of researchers consisting of representatives from DOE national laboratories, academia, and industry, and engaged in software technology R&D, as well as close partnerships with DOE science code teams, to produce software technologies that were shown to run effectively at scale on DOE HPC platforms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiaoliang; Stauffer, Philip H.
This effort is designed to expedite learnings from existing and planned large demonstration projects and their associated research through effective knowledge sharing among participants in the US and China.
Taking Teacher Learning to Scale: Sharing Knowledge and Spreading Ideas across Geographies
ERIC Educational Resources Information Center
Klein, Emily J.; Jaffe-Walter, Reva; Riordan, Megan
2016-01-01
This research reports data from case studies of three intermediary organizations facing the challenge of scaling up teacher learning. The turn of the century launched scaling-up efforts of all three intermediaries, growing from intimate groups, where founding teachers and staff were key supports for teacher learning, to large multistate…
Multisite Studies and Scaling up in Educational Research
ERIC Educational Resources Information Center
Harwell, Michael
2012-01-01
A scale-up study in education typically expands the sample of students, schools, districts, and/or practices or materials used in smaller studies in ways that build in heterogeneity. Yet surprisingly little is known about the factors that promote successful scaling up efforts in education, in large part due to the absence of empirically supported…
Shake Test Results and Dynamic Calibration Efforts for the Large Rotor Test Apparatus
NASA Technical Reports Server (NTRS)
Russell, Carl R.
2014-01-01
A shake test of the Large Rotor Test Apparatus (LRTA) was performed in an effort to enhance NASA's capability to measure dynamic hub loads for full-scale rotor tests. This paper documents the results of the shake test as well as efforts to calibrate the LRTA balance system to measure dynamic loads. Dynamic rotor loads are the primary source of vibration in helicopters and other rotorcraft, leading to passenger discomfort and damage due to fatigue of aircraft components. There are novel methods being developed to reduce rotor vibrations, but measuring the actual vibration reductions on full-scale rotors remains a challenge. In order to measure rotor forces on the LRTA, a balance system in the non-rotating frame is used. The forces at the balance can then be translated to the hub reference frame to measure the rotor loads. Because the LRTA has its own dynamic response, the balance system must be calibrated to include the natural frequencies of the test rig.
Large Scale Structure Studies: Final Results from a Rich Cluster Redshift Survey
NASA Astrophysics Data System (ADS)
Slinglend, K.; Batuski, D.; Haase, S.; Hill, J.
1995-12-01
The results from the COBE satellite show the existence of structure on scales on the order of 10% or more of the horizon scale of the universe. Rich clusters of galaxies from the Abell-ACO catalogs show evidence of structure on scales of 100 Mpc and hold the promise of confirming structure on the scale of the COBE result. Unfortunately, until now, redshift information has been unavailable for a large percentage of these clusters, so present knowledge of their three-dimensional distribution has quite large uncertainties. Our approach in this effort has been to use the MX multifiber spectrometer on the Steward 2.3m to measure redshifts of at least ten galaxies in each of 88 Abell cluster fields with richness class R>= 1 and mag10 <= 16.8 (estimated z<= 0.12) and zero or one measured redshifts. This work has resulted in a deeper, 95% complete and more reliable sample of 3-D positions of rich clusters. The primary intent of this survey has been to constrain theoretical models for the formation of the structure we see in the universe today through 2-pt. spatial correlation function and other analyses of the large scale structures traced by these clusters. In addition, we have obtained enough redshifts per cluster to greatly improve the quality and size of the sample of reliable cluster velocity dispersions available for use in other studies of cluster properties. This new data has also allowed the construction of an updated and more reliable supercluster candidate catalog. Our efforts have resulted in effectively doubling the volume traced by these clusters. Presented here is the resulting 2-pt. spatial correlation function, as well as density plots and several other figures quantifying the large scale structure from this much deeper and complete sample. Also, with 10 or more redshifts in most of our cluster fields, we have investigated the extent of projection effects within the Abell catalog in an effort to quantify and understand how this may affect the Abell sample.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Kai; Kim, Donghoe; Whitaker, James B
Rapid development of perovskite solar cells (PSCs) during the past several years has made this photovoltaic (PV) technology a serious contender for potential large-scale deployment on the terawatt scale in the PV market. To successfully transition PSC technology from the laboratory to industry scale, substantial efforts need to focus on scalable fabrication of high-performance perovskite modules with minimum negative environmental impact. Here, we provide an overview of the current research and our perspective regarding PSC technology toward future large-scale manufacturing and deployment. Several key challenges discussed are (1) a scalable process for large-area perovskite module fabrication; (2) less hazardous chemical routes for PSC fabrication; and (3) suitable perovskite module designs for different applications.
Enhancing ecosystem restoration efficiency through spatial and temporal coordination.
Neeson, Thomas M; Ferris, Michael C; Diebel, Matthew W; Doran, Patrick J; O'Hanley, Jesse R; McIntyre, Peter B
2015-05-12
In many large ecosystems, conservation projects are selected by a diverse set of actors operating independently at spatial scales ranging from local to international. Although small-scale decision making can leverage local expert knowledge, it also may be an inefficient means of achieving large-scale objectives if piecemeal efforts are poorly coordinated. Here, we assess the value of coordinating efforts in both space and time to maximize the restoration of aquatic ecosystem connectivity. Habitat fragmentation is a leading driver of declining biodiversity and ecosystem services in rivers worldwide, and we simultaneously evaluate optimal barrier removal strategies for 661 tributary rivers of the Laurentian Great Lakes, which are fragmented by at least 6,692 dams and 232,068 road crossings. We find that coordinating barrier removals across the entire basin is nine times more efficient at reconnecting fish to headwater breeding grounds than optimizing independently for each watershed. Similarly, a one-time pulse of restoration investment is up to 10 times more efficient than annual allocations totaling the same amount. Despite widespread emphasis on dams as key barriers in river networks, improving road culvert passability is also essential for efficiently restoring connectivity to the Great Lakes. Our results highlight the dramatic economic and ecological advantages of coordinating efforts in both space and time during restoration of large ecosystems.
Webinar July 28: H2@Scale - A Potential Opportunity | News | NREL
role of hydrogen at the grid scale and the efforts of a large, national lab team assembled to evaluate the potential of hydrogen to play a critical role in our energy future. Presenters will share facts
Book Review: Large-Scale Ecosystem Restoration: Five Case Studies from the United States
Broad-scale ecosystem restoration efforts involve a very complex set of ecological and societal components, and the success of any ecosystem restoration project rests on an integrated approach to implementation. Editors Mary Doyle and Cynthia Drew have successfully synthesized ma...
NASA Astrophysics Data System (ADS)
Fischer, P. D.; Brown, M. E.; Trumbo, S. K.; Hand, K. P.
2017-01-01
We present spatially resolved spectroscopic observations of Europa’s surface at 3-4 μm obtained with the near-infrared spectrograph and adaptive optics system on the Keck II telescope. These are the highest quality spatially resolved reflectance spectra of Europa’s surface at 3-4 μm. The observations spatially resolve Europa’s large-scale compositional units at a resolution of several hundred kilometers. The spectra show distinct features and geographic variations associated with known compositional units; in particular, large-scale leading hemisphere chaos shows a characteristic longward shift in peak reflectance near 3.7 μm compared to icy regions. These observations complement previous spectra of large-scale chaos, and can aid efforts to identify the endogenous non-ice species.
A leap forward in geographic scale for forest ectomycorrhizal fungi
Filipa Cox; Nadia Barsoum; Martin I. Bidartondo; Isabella Børja; Erik Lilleskov; Lars O. Nilsson; Pasi Rautio; Kath Tubby; Lars Vesterdal
2010-01-01
In this letter we propose a first large-scale assessment of mycorrhizas with a European-wide network of intensively monitored forest plots as a research platform. This effort would create a qualitative and quantitative shift in mycorrhizal research by delivering the first continental-scale map of mycorrhizal fungi. Readers may note that several excellent detailed...
An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science.
2012-11-01
Reproducibility is a defining feature of science. However, because of strong incentives for innovation and weak incentives for confirmation, direct replication is rarely practiced or published. The Reproducibility Project is an open, large-scale, collaborative effort to systematically examine the rate and predictors of reproducibility in psychological science. So far, 72 volunteer researchers from 41 institutions have organized to openly and transparently replicate studies published in three prominent psychological journals in 2008. Multiple methods will be used to evaluate the findings, calculate an empirical rate of replication, and investigate factors that predict reproducibility. Whatever the result, a better understanding of reproducibility will ultimately improve confidence in scientific methodology and findings. © The Author(s) 2012.
Understanding middle managers' influence in implementing patient safety culture.
Gutberg, Jennifer; Berta, Whitney
2017-08-22
The past fifteen years have been marked by large-scale change efforts undertaken by healthcare organizations to improve patient safety and patient-centered care. Despite substantial investment of effort and resources, many of these large-scale or "radical change" initiatives, like those in other industries, have enjoyed limited success - with practice and behavioural changes neither fully adopted nor ultimately sustained - which has in large part been ascribed to inadequate implementation efforts. Culture change to "patient safety culture" (PSC) is among these radical change initiatives, where results to date have been mixed at best. This paper responds to calls for research that focus on explicating factors that affect efforts to implement radical change in healthcare contexts, and focuses on PSC as the radical change implementation. Specifically, this paper offers a novel conceptual model based on Organizational Learning Theory to explain the ability of middle managers in healthcare organizations to influence patient safety culture change. We propose that middle managers can capitalize on their unique position between upper and lower levels in the organization and engage in 'ambidextrous' learning that is critical to implementing and sustaining radical change. This organizational learning perspective offers an innovative way of framing the mid-level managers' role, through both explorative and exploitative activities, which further considers the necessary organizational context in which they operate.
Swept-Wing Ice Accretion Characterization and Aerodynamics
NASA Technical Reports Server (NTRS)
Broeren, Andy P.; Potapczuk, Mark G.; Riley, James T.; Villedieu, Philippe; Moens, Frederic; Bragg, Michael B.
2013-01-01
NASA, FAA, ONERA, the University of Illinois and Boeing have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large-scale, three-dimensional swept wings. The overall goal is to improve the fidelity of experimental and computational simulation methods for swept-wing ice accretion formation and resulting aerodynamic effect. A seven-phase research effort has been designed that incorporates ice-accretion and aerodynamic experiments and computational simulations. As the baseline, full-scale, swept-wing-reference geometry, this research will utilize the 65% scale Common Research Model configuration. Ice-accretion testing will be conducted in the NASA Icing Research Tunnel for three hybrid swept-wing models representing the 20%, 64% and 83% semispan stations of the baseline-reference wing. Three-dimensional measurement techniques are being developed and validated to document the experimental ice-accretion geometries. Artificial ice shapes of varying geometric fidelity will be developed for aerodynamic testing over a large Reynolds number range in the ONERA F1 pressurized wind tunnel and in a smaller-scale atmospheric wind tunnel. Concurrent research will be conducted to explore and further develop the use of computational simulation tools for ice accretion and aerodynamics on swept wings. The combined results of this research effort will result in an improved understanding of the ice formation and aerodynamic effects on swept wings. The purpose of this paper is to describe this research effort in more detail and report on the current results and status to date.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Patrick
The primary challenge motivating this project is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis only on a small fraction of the data they calculate, resulting in the substantial likelihood of lost or missed science when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnerships with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.
Fabrication of the HIAD Large-Scale Demonstration Assembly and Upcoming Mission Applications
NASA Technical Reports Server (NTRS)
Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; DiNonno, J. M.; Cheatwood, F. M.
2017-01-01
Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3 to 6 m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that they will use a HIAD (10-12 m) as part of their Sensible, Modular, Autonomous Return Technology (SMART) for their upcoming Vulcan rocket. ULA expects SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15 m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12 m Mars Human-Scale Pathfinder HIAD conceptual design, constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61 m cross-section) and six subscale tori (0.25 m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner.
In late 2016, the three tori required for the large-scale demonstration assembly were fabricated and then integrated in early 2017. The design includes provisions to add the remaining four tori necessary to complete the assembly of the 12 m Human-Scale Pathfinder HIAD in the event future project funding becomes available. This presentation will discuss the HIAD large-scale demonstration assembly design and fabrication performed in the last year, including the precursor tori development and the partial-stack fabrication. Potential near-term and future 10-15 m HIAD applications will also be discussed.
Lessons from SMD experience with approaches to the evaluation of fare changes
DOT National Transportation Integrated Search
1980-01-01
Over the past several years UMTA's Service and Methods Demonstration Program (SMD) has undertaken a large number of studies of the effects of fare changes, both increases and decreases. Some of these studies have been large-scale efforts directed at ...
Incentives and Test-Based Accountability in Education
ERIC Educational Resources Information Center
Hout, Michael, Ed.; Elliott, Stuart W., Ed.
2011-01-01
In recent years there have been increasing efforts to use accountability systems based on large-scale tests of students as a mechanism for improving student achievement. The federal No Child Left Behind Act (NCLB) is a prominent example of such an effort, but it is only the continuation of a steady trend toward greater test-based accountability in…
An Overview of NASA Efforts on Zero Boiloff Storage of Cryogenic Propellants
NASA Technical Reports Server (NTRS)
Hastings, Leon J.; Plachta, D. W.; Salerno, L.; Kittel, P.; Haynes, Davy (Technical Monitor)
2001-01-01
Future mission planning within NASA has increasingly motivated consideration of cryogenic propellant storage durations on the order of years as opposed to a few weeks or months. Furthermore, the advancement of cryocooler and passive insulation technologies in recent years has substantially improved the prospects for zero boiloff storage of cryogenics. Accordingly, a cooperative effort by NASA's Ames Research Center (ARC), Glenn Research Center (GRC), and Marshall Space Flight Center (MSFC) has been implemented to develop and demonstrate "zero boiloff" concepts for in-space storage of cryogenic propellants, particularly liquid hydrogen and oxygen. ARC is leading the development of flight-type cryocoolers, GRC the subsystem development and small scale testing, and MSFC the large scale and integrated system level testing. Thermal and fluid modeling involves a combined effort by the three Centers. Recent accomplishments include: 1) development of "zero boiloff" analytical modeling techniques for sizing the storage tankage, passive insulation, cryocooler, power source mass, and radiators; 2) an early subscale demonstration with liquid hydrogen; 3) procurement of a flight-type 10 watt, 95 K pulse tube cryocooler for liquid oxygen storage; and 4) assembly of a large-scale test article for an early demonstration of the integrated operation of passive insulation, destratification/pressure control, and cryocooler (commercial unit) subsystems to achieve zero boiloff storage of liquid hydrogen. Near term plans include the large-scale integrated system demonstration testing this summer, subsystem testing of the flight-type pulse-tube cryocooler with liquid nitrogen (oxygen simulant), and continued development of a flight-type liquid hydrogen pulse tube cryocooler.
Computerization of Library and Information Services in Mainland China.
ERIC Educational Resources Information Center
Lin, Sharon Chien
1994-01-01
Describes two phases of the automation of library and information services in mainland China. From 1974-86, much effort was concentrated on developing computer systems, databases, online retrieval, and networking. From 1986 to the present, practical progress became possible largely because of CD-ROM technology; and large scale networking for…
Satellite-based peatland mapping: potential of the MODIS sensor.
D. Pflugmacher; O.N. Krankina; W.B. Cohen
2006-01-01
Peatlands play a major role in the global carbon cycle but are largely overlooked in current large-scale vegetation mapping efforts. In this study, we investigated the potential of the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor to capture extent and distribution of peatlands in the St. Petersburg region of Russia.
2015-02-11
A similar risk-based approach may be appropriate for deploying military personnel. e) If DoD were to consider implementing a large-scale pre...quality of existing spirometry programs prior to considering a larger-scale pre-deployment effort. Identifying an accelerated decrease in spirometry...baseline spirometry on a wider scale. e) Conduct pre-deployment baseline spirometry if there is a significant risk of exposure to a pulmonary hazard based
Validity of Scores for a Developmental Writing Scale Based on Automated Scoring
ERIC Educational Resources Information Center
Attali, Yigal; Powers, Donald
2009-01-01
A developmental writing scale for timed essay-writing performance was created on the basis of automatically computed indicators of writing fluency, word choice, and conventions of standard written English. In a large-scale data collection effort that involved a national sample of more than 12,000 students from 4th, 6th, 8th, 10th, and 12th grade,…
Harrington, Brian A.; Brown, S.; Corven, James; Bart, Jonathan
2002-01-01
Shorebirds are among the most highly migratory creatures on earth. Both the study of their ecology and ongoing efforts to conserve their populations must reflect this central aspect of their biology. Many species of shorebirds use migration and staging sites scattered throughout the hemisphere to complete their annual migrations between breeding areas and nonbreeding habitats (Morrison 1984). The vast distances between habitats they use pose significant challenges for studying their migration ecology. At the same time, the large number of political boundaries shorebirds cross during their epic migrations create parallel challenges for organizations working on their management and conservation.Nebel et al. (2002) represent a collaborative effort to understand the conservation implications of Western Sandpiper (Calidris mauri) migration ecology on a scale worthy of this highly migratory species. The data sets involved in the analysis come from four U.S. states, two Canadian provinces, and a total of five nations. Only by collaborating on this historic scale were the authors able to assemble the information necessary to understand important aspects of the migration ecology of this species, and the implications for conservation of the patterns they discovered.Collaborative approaches to shorebird migration ecology developed slowly over several decades. The same period also saw the creation of large-scale efforts to monitor and conserve shorebirds. This overview first traces the history of the study of migration ecology of shorebirds during that fertile period, and then describes the monitoring and protection efforts that have been developed in an attempt to address the enormous issues of scale posed by shorebird migration ecology and conservation.
In 1990, EMAP's Coastal Monitoring Program conducted its first regional sampling program in the Virginian Province. This first effort focused only at large spatial scales (regional) with some stratification to examine estuarine types. In the ensuing decade, EMAP-Coastal has condu...
The Role of Teacher Leaders in Scaling Up Standards-Based Reform.
ERIC Educational Resources Information Center
Swanson, Judy; Snell, Jean; Koency, Gina; Berns, Barbara
This study examined 10 urban middle school teacher leaders who played significant roles in their districts' and states' large-scale standards reform efforts. Interviews, observations, and shadowing were conducted during the first year to examine the teachers' scope of work. Observations focused on teachers working with a range of students and with…
Current Barriers to Large-scale Interoperability of Traceability Technology in the Seafood Sector.
Hardt, Marah J; Flett, Keith; Howell, Colleen J
2017-08-01
Interoperability is a critical component of full-chain digital traceability, but is almost nonexistent in the seafood industry. Using both quantitative and qualitative methodology, this study explores the barriers impeding progress toward large-scale interoperability among digital traceability systems in the seafood sector from the perspectives of seafood companies, technology vendors, and supply chains as a whole. We highlight lessons from recent research and field work focused on implementing traceability across full supply chains and make some recommendations for next steps in terms of overcoming challenges and scaling current efforts. © 2017 Institute of Food Technologists®.
Fisher research in the US Rocky Mountains: A critical overview
Michael Schwartz; J. Sauder
2013-01-01
In this talk we review the recent fisher research and monitoring efforts that have occurred throughout Idaho and Montana in the past two decades. We begin this talk with a summary of the habitat relationship work that has examined fisher habitat use at multiple scales. These have largely been conducted using radio and satellite telemetry, although a new, joint effort to use...
ERIC Educational Resources Information Center
Kaskie, Brian; Walker, Mark; Andersson, Matthew
2017-01-01
The aging of the academic workforce is becoming more relevant to policy discussions in higher education. Yet there has been no formal, large-scale analysis of institutional efforts to develop policies and programs for aging employees. We fielded a representative survey of human resource specialists at 187 colleges and universities across the…
NASA Astrophysics Data System (ADS)
Sobel, A. H.; Wang, S.; Bellon, G.; Sessions, S. L.; Woolnough, S.
2013-12-01
Parameterizations of large-scale dynamics have been developed in the past decade for studying the interaction between tropical convection and large-scale dynamics, based on our physical understanding of the tropical atmosphere. A principal advantage of these methods is that they offer a pathway to attack the key question of what controls large-scale variations of tropical deep convection. These methods have been used with both single column models (SCMs) and cloud-resolving models (CRMs) to study the interaction of deep convection with several kinds of environmental forcings. While much has been learned from these efforts, different groups' efforts are somewhat hard to compare. Different models, different versions of the large-scale parameterization methods, and experimental designs that differ in other ways are used. It is not obvious which choices are consequential to the scientific conclusions drawn and which are not. The methods have matured to the point that there is value in an intercomparison project. In this context, the Global Atmospheric Systems Study - Weak Temperature Gradient (GASS-WTG) project was proposed at the Pan-GASS meeting in September 2012. The weak temperature gradient approximation is one method to parameterize large-scale dynamics, and is used in the project name for historical reasons and simplicity, but another method, the damped gravity wave (DGW) method, will also be used in the project. The goal of the GASS-WTG project is to develop community understanding of the parameterization methods currently in use. Their strengths, weaknesses, and functionality in models with different physics and numerics will be explored in detail, and their utility to improve our understanding of tropical weather and climate phenomena will be further evaluated. This presentation will introduce the intercomparison project, including background, goals, and overview of the proposed experimental design. 
Interested groups will be invited to join (it will not be too late), and preliminary results will be presented.
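The weak temperature gradient balance named in the project title can be illustrated with a minimal calculation: under WTG, the large-scale vertical velocity is diagnosed so that adiabatic cooling balances diabatic heating, w · dθ/dz = Q. The heating profile and static stability below are illustrative assumptions, not values from the GASS-WTG experimental design.

```python
import numpy as np

# WTG balance: large-scale ascent w is diagnosed from a prescribed
# convective heating profile Q (K/s) and a fixed dry static stability.
z = np.linspace(1e3, 15e3, 50)             # height (m), above the boundary layer
dtheta_dz = 4.0e-3                         # static stability (K/m), rough tropical value
Q = (5.0 / 86400.0) * np.sin(np.pi * (z - 1e3) / 14e3)  # heating, peak ~5 K/day

w_wtg = Q / dtheta_dz                      # diagnosed vertical velocity (m/s)
print(f"peak WTG ascent: {w_wtg.max():.4f} m/s")  # roughly 1.4 cm/s
```

A 5 K/day heating peak thus implies ascent of order centimeters per second, the magnitude typical of tropical large-scale circulations.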
Kennedy, Jacob J.; Abbatiello, Susan E.; Kim, Kyunggon; Yan, Ping; Whiteaker, Jeffrey R.; Lin, Chenwei; Kim, Jun Seok; Zhang, Yuzheng; Wang, Xianlong; Ivey, Richard G.; Zhao, Lei; Min, Hophil; Lee, Youngju; Yu, Myeong-Hee; Yang, Eun Gyeong; Lee, Cheolju; Wang, Pei; Rodriguez, Henry; Kim, Youngsoo; Carr, Steven A.; Paulovich, Amanda G.
2014-01-01
The successful application of MRM in biological specimens raises the exciting possibility that assays can be configured to measure all human proteins, resulting in an assay resource that would promote advances in biomedical research. We report the results of a pilot study designed to test the feasibility of a large-scale, international effort in MRM assay generation. We have configured, validated across three laboratories, and made publicly available as a resource to the community 645 novel MRM assays representing 319 proteins expressed in human breast cancer. Assays were multiplexed in groups of >150 peptides and deployed to quantify endogenous analyte in a panel of breast cancer-related cell lines. Median assay precision was 5.4%, with high inter-laboratory correlation (R2 >0.96). Peptide measurements in breast cancer cell lines were able to discriminate amongst molecular subtypes and identify genome-driven changes in the cancer proteome. These results establish the feasibility of a scaled, international effort. PMID:24317253
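The "median assay precision" figure quoted above summarizes per-assay coefficients of variation (CV) across replicate measurements. A minimal sketch of that summary statistic, using synthetic replicate peak areas rather than the study's data:

```python
import numpy as np

# Hypothetical replicate peak areas (6 peptides x 4 replicate injections);
# illustrative only -- not the study's measurements.
rng = np.random.default_rng(7)
areas = rng.normal(loc=1.0e6, scale=5.0e4, size=(6, 4))

# per-peptide CV (%) = sample standard deviation / mean, then take the median
cv = areas.std(axis=1, ddof=1) / areas.mean(axis=1) * 100.0
print(f"median CV = {np.median(cv):.1f}%")
```

Reporting the median rather than the mean keeps the summary robust to a few poorly behaved assays.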
Role of substrate quality on IC performance and yields
NASA Technical Reports Server (NTRS)
Thomas, R. N.
1981-01-01
The development of silicon and gallium arsenide crystal growth for the production of large-diameter substrates is discussed. Large-area substrates of significantly improved compositional purity, dopant distribution, and structural perfection on a microscopic as well as macroscopic scale are important requirements. The exploratory use of magnetic fields to suppress convection effects in Czochralski crystal growth is addressed. The growth of large crystals in space appears impractical at present; however, efforts to improve substrate quality could benefit from the experience gained in smaller-scale growth experiments conducted in the zero-gravity environment of space.
NASA Technical Reports Server (NTRS)
Fujiwara, Gustavo; Bragg, Mike; Triphahn, Chris; Wiberg, Brock; Woodard, Brian; Loth, Eric; Malone, Adam; Paul, Bernard; Pitera, David; Wilcox, Pete;
2017-01-01
This report presents the key results from the first two years of a program to develop experimental icing simulation capabilities for full-scale swept wings. This investigation was undertaken as part of a larger collaborative research effort on ice accretion and aerodynamics for large-scale swept wings. Ice accretion and the resulting aerodynamic effect on large-scale swept wings present a significant airplane design and certification challenge to airframe manufacturers, certification authorities, and research organizations alike. While the effect of ice accretion on straight wings has been studied in detail for many years, the available data on swept-wing icing are much more limited, especially at larger scales.
Caldwell, Robert R
2011-12-28
The challenge to understand the physical origin of the cosmic acceleration is framed as a problem of gravitation: specifically, does the relationship between stress-energy and space-time curvature differ on large scales from the predictions of general relativity? In this article, we describe efforts to model and test a generalized relationship between the matter and the metric using cosmological observations. Late-time tracers of large-scale structure, including the cosmic microwave background, weak gravitational lensing, and clustering, are shown to provide good tests of the proposed solution. Current data come very close to providing a critical test, leaving only a small window in parameter space in the case that the generalized relationship is scale-free above galactic scales.
Scaling-up NLP Pipelines to Process Large Corpora of Clinical Notes.
Divita, G; Carter, M; Redd, A; Zeng, Q; Gupta, K; Trautner, B; Samore, M; Gundlapalli, A
2015-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". This paper describes the scale-up efforts at the VA Salt Lake City Health Care System to address processing large corpora of clinical notes through a natural language processing (NLP) pipeline. The use case described is a current project focused on detecting the presence of an indwelling urinary catheter in hospitalized patients and subsequent catheter-associated urinary tract infections. An NLP algorithm using v3NLP was developed to detect the presence of an indwelling urinary catheter in hospitalized patients. The algorithm was tested on a small corpus of notes on patients for whom the presence or absence of a catheter was already known (reference standard). In planning for a scale-up, we estimated that the original algorithm would have taken 2.4 days to run on a larger corpus of notes for this project (550,000 notes), and 27 days for a corpus of 6 million records representative of a national sample of notes. We approached scaling up NLP pipelines through three techniques: pipeline replication via multi-threading, intra-annotator threading for tasks that can be further decomposed, and remote annotator services, which enable annotator scale-out. The scale-up reduced the average time to process a record from 206 milliseconds to 17 milliseconds, a 12-fold increase in performance when applied to a corpus of 550,000 notes. Purposely simplistic in nature, these scale-up efforts are the straightforward evolution from small-scale NLP processing to larger-scale extraction without incurring the complexities inherent in the underlying UIMA framework. These efforts represent generalizable and widely applicable techniques that will aid other computationally complex NLP pipelines that need to be scaled out for processing and analyzing big data.
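The first of the three techniques, pipeline replication via multi-threading, can be sketched in a few lines. The `annotate()` function below is a toy stand-in for a v3NLP annotator (an assumption for illustration, not the actual algorithm); the pattern of fanning a note corpus out across replicated pipeline instances is the one described:

```python
from concurrent.futures import ThreadPoolExecutor

def annotate(note):
    # toy stand-in for an NLP annotator: flag notes suggesting an
    # indwelling urinary catheter (not the study's v3NLP algorithm)
    text = note.lower()
    return "foley" in text or "indwelling catheter" in text

def process_corpus(notes, workers=4):
    # pipeline replication: run `workers` copies of the annotator
    # concurrently over the corpus, preserving input order
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(annotate, notes))

corpus = ["Foley catheter in place", "No acute distress"] * 1000
flags = process_corpus(corpus)
print(sum(flags))  # -> 1000
```

For CPU-bound annotators a process pool (or, as in the paper, annotator services on remote hosts) would replace the thread pool, but the fan-out pattern is the same.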
Information about the San Francisco Bay Water Quality Project (SFBWQP) Urban Greening Bay Area, a large-scale effort to re-envision urban landscapes to include green infrastructure (GI), making communities more livable and reducing stormwater runoff.
Home - The Cancer Genome Atlas - Cancer Genome - TCGA
The Cancer Genome Atlas (TCGA) is a comprehensive and coordinated effort to accelerate our understanding of the molecular basis of cancer through the application of genome analysis technologies, including large-scale genome sequencing.
Recent developments in microfluidic large scale integration.
Araci, Ismail Emre; Brisk, Philip
2014-02-01
In 2002, Thorsen et al. integrated thousands of micromechanical valves on a single microfluidic chip and demonstrated that control of the fluidic networks can be simplified through multiplexors [1]. This enabled the realization of highly parallel and automated fluidic processes with a substantial sample-economy advantage. Moreover, the fabrication of these devices by multilayer soft lithography was easy and reliable, and hence contributed to the power of the technology: microfluidic large-scale integration (mLSI). Since then, mLSI has found use in a wide variety of applications in biology and chemistry. In the meantime, efforts to improve the technology have been ongoing. These efforts mostly focus on novel materials, components, micromechanical valve actuation methods, and chip architectures for mLSI. In this review, these technological advances are discussed and recent examples of mLSI applications are summarized. Copyright © 2013 Elsevier Ltd. All rights reserved.
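The multiplexor result from Thorsen et al. has a simple combinatorial core: addressing flow channels in binary, with each address bit implemented as a control line plus its complement, means N flow channels need only about 2·log2(N) control lines. A quick sketch of that scaling:

```python
import math

def control_lines(n_flow):
    # binary multiplexing of micromechanical valves: each address bit
    # needs one control line and its complement, so N flow channels
    # are addressed by 2 * ceil(log2(N)) control lines
    return 2 * math.ceil(math.log2(n_flow))

print(control_lines(1024))  # -> 20: twenty control lines address 1024 channels
```

This logarithmic scaling is what makes chips with thousands of valves practical: doubling the number of addressable flow channels costs only two more control lines.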
Seitsinger, Anne M; Felner, Robert D; Brand, Stephen; Burns, Amy
2008-08-01
As schools move forward with comprehensive school reform, parents' roles have shifted and been redefined. Parent-teacher communication is critical to student success, yet how schools and teachers contact parents is the subject of few studies. Evaluations of school-change efforts require reliable and useful measures of teachers' practices in communicating with parents. The structure of teacher-parent-contact practices was examined using data from multiple, longitudinal cohorts of schools and teachers from a large-scale project and found to be a reliable and stable measure of parent contact across building levels and localities. Teacher/school practices in contacting parents were found to be significantly related to parent reports of school contact performance and student academic adjustment and achievement. Implications for school improvement efforts are discussed.
Integrating market processes into utility resource planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahn, E.P.
1992-11-01
Integrated resource planning has resulted in an abundance of alternatives for meeting existing and new demand for electricity services: (1) utility demand-side management (DSM) programs, (2) DSM bidding, (3) competitive bidding for private power supplies, (4) utility re-powering, and (5) new utility construction. Each alternative relies on a different degree of planning for implementation and, therefore, each alternative relies on markets to a greater or lesser degree. This paper shows how the interaction of planning processes and market forces results in resource allocations among the alternatives. The discussion focuses on three phenomena that are driving forces behind the "unanticipated consequences" of contemporary integrated resource planning efforts. These forces are: (1) large-scale DSM efforts, (2) customer bypass, and (3) large-scale independent power projects. 22 refs., 3 figs., 2 tabs.
Evaluation Findings from High School Reform Efforts in Baltimore
ERIC Educational Resources Information Center
Smerdon, Becky; Cohen, Jennifer
2009-01-01
The Baltimore City Public School System (BCPSS) is one of the first urban districts in the country to undertake large-scale high school reform, phasing in small learning communities by opening new high schools and transforming large, comprehensive high schools into small high schools. With support from the Bill & Melinda Gates Foundation, a…
A Navy Shore Activity Manpower Planning System for Civilians. Technical Report No. 24.
ERIC Educational Resources Information Center
Niehaus, R. J.; Sholtz, D.
This report describes the U.S. Navy Shore Activity Manpower Planning System (SAMPS) advanced development research project. This effort is aimed at large-scale feasibility tests of manpower models for large Naval installations. These local planning systems are integrated with Navy-wide information systems on a data-communications network accessible…
Brief Self-Report Scales Assessing Life History Dimensions of Mating and Parenting Effort.
Kruger, Daniel J
2017-01-01
Life history theory (LHT) is a powerful evolutionary framework for understanding physiological, psychological, and behavioral variation both between and within species. Researchers and theorists are increasingly integrating LHT into evolutionary psychology, as it provides a strong foundation for research across many topical areas. Human life history variation has been represented in psychological and behavioral research in several ways, including indicators of conditions in the developmental environment, indicators of conditions in the current environment, and indicators of maturation and life milestones (e.g., menarche, initial sexual activity, first pregnancy), and in self-report survey scale measures. Survey scale measures have included constructs such as time perspective and future discounting, although the most widely used index is a constellation of indicators assessing the K-factor, thought to index general life history speed (from fast to slow). The current project examined the utility of two brief self-report survey measures assessing the life history dimensions of mating effort and parenting effort with a large undergraduate sample in the United States. Consistent with the theory, items reflected two inversely related dimensions. In regressions including the K-factor, the Mating Effort Scale proved to be a powerful predictor of other constructs and indicators related to life history variation. The Parenting Effort Scale had less predictive power overall, although it explained unique variance across several constructs and was the only unique predictor of the number of long-term (serious and committed) relationships. These scales may be valuable additions to self-report survey research projects examining life history variation.
Kotamäki, Niina; Thessler, Sirpa; Koskiaho, Jari; Hannukkala, Asko O.; Huitu, Hanna; Huttula, Timo; Havento, Jukka; Järvenpää, Markku
2009-01-01
Sensor networks are increasingly being implemented for environmental monitoring and agriculture to provide spatially accurate, continuous environmental information and (near) real-time applications. These networks produce a large amount of data, which poses challenges for ensuring data quality and extracting relevant information. In the present paper we describe a river-basin-scale wireless sensor network for agriculture and water monitoring. The network, called SoilWeather, is unique and the first of its type in Finland. The performance of the network is assessed from the user and maintainer perspectives, concentrating on data quality, network maintenance, and applications. The results showed that the SoilWeather network has functioned relatively reliably, but also that maintenance and data-quality assurance by automatic algorithms and calibration samples require considerable effort, especially in continuous water monitoring over large areas. We see great benefits in sensor networks enabling continuous, real-time monitoring, while data-quality control and maintenance efforts highlight the need for tight collaboration between sensor and sensor-network owners to decrease costs and increase the quality of the sensor data in large-scale applications. PMID:22574050
RAID-2: Design and implementation of a large scale disk array controller
NASA Technical Reports Server (NTRS)
Katz, R. H.; Chen, P. M.; Drapeau, A. L.; Lee, E. K.; Lutz, K.; Miller, E. L.; Seshan, S.; Patterson, D. A.
1992-01-01
We describe the implementation of a large scale disk array controller and subsystem incorporating over 100 high performance 3.5 inch disk drives. It is designed to provide 40 MB/s sustained performance and 40 GB capacity in three 19 inch racks. The array controller forms an integral part of a file server that attaches to a Gb/s local area network. The controller implements a high bandwidth interconnect between an interleaved memory, an XOR calculation engine, the network interface (HIPPI), and the disk interfaces (SCSI). The system is now functionally operational, and we are tuning its performance. We review the design decisions, history, and lessons learned from this three year university implementation effort to construct a truly large scale system assembly.
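The XOR calculation engine mentioned above computes the parity that makes a redundant disk array recoverable. As an illustrative sketch of the parity principle only (not the RAID-2 hardware design), the idea can be shown in a few lines: parity is the bytewise XOR of the data blocks, so any single lost block can be rebuilt from the survivors plus parity.

```python
def xor_blocks(blocks):
    """Bytewise XOR of equal-length byte blocks."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

# three hypothetical data blocks striped across disks
data = [b"disk0data", b"disk1data", b"disk2data"]
parity = xor_blocks(data)

# simulate losing disk 1 and rebuilding it from the others plus parity;
# this works because XOR is its own inverse
recovered = xor_blocks([data[0], data[2], parity])
assert recovered == data[1]
```

The same cancellation property is what lets an array controller regenerate any one failed member on the fly.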
Evaluation of Hydrogel Technologies for the Decontamination ...
This research effort was developed to evaluate intermediate-level (between bench-scale and large-scale or wide-area implementation) decontamination procedures, materials, technologies, and techniques used to remove radioactive material from different surfaces. In the event of a radiological incident, application of this technology would primarily be intended for decontamination of high-value buildings, important infrastructure, and landmarks.
Native fish conservation areas: a vision for large-scale conservation of native fish communities
Jack E. Williams; Richard N. Williams; Russell F. Thurow; Leah Elwell; David P. Philipp; Fred A. Harris; Jeffrey L. Kershner; Patrick J. Martinez; Dirk Miller; Gordon H. Reeves; Christopher A. Frissell; James R. Sedell
2011-01-01
The status of freshwater fishes continues to decline despite substantial conservation efforts to reverse this trend and recover threatened and endangered aquatic species. Lack of success is partially due to working at smaller spatial scales and focusing on habitats and species that are already degraded. Protecting entire watersheds and aquatic communities, which we...
NASA Astrophysics Data System (ADS)
Arrigo, J. S.; Famiglietti, J. S.; Murdoch, L. C.; Lakshmi, V.; Hooper, R. P.
2012-12-01
The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) continues a major effort towards supporting Community Hydrologic Modeling. From 2009 - 2011, the Community Hydrologic Modeling Platform (CHyMP) initiative held three workshops, the ultimate goal of which was to produce recommendations and an implementation plan to establish a community modeling program that enables comprehensive simulation of water anywhere on the North American continent. Such an effort would include connections to and advances in global climate models, biogeochemistry, and efforts of other disciplines that require an understanding of water patterns and processes in the environment. To achieve such a vision will require substantial investment in human and cyber-infrastructure and significant advances in the science of hydrologic modeling and spatial scaling. CHyMP concluded with a final workshop, held March 2011, and produced several recommendations. CUAHSI and the university community continue to advance community modeling and implement these recommendations through several related and follow on efforts. Key results from the final 2011 workshop included agreement among participants that the community is ready to move forward with implementation. It is recognized that initial implementation of this larger effort can begin with simulation capabilities that currently exist, or that can be easily developed. CHyMP identified four key activities in support of community modeling: benchmarking, dataset evaluation and development, platform evaluation, and developing a national water model framework. Key findings included: 1) The community supported the idea of a National Water Model framework; a community effort is needed to explore what the ultimate implementation of a National Water Model is. A true community modeling effort would support the modeling of "water anywhere" and would include all relevant scales and processes. 
2) Implementation of a community modeling program could initially focus on continental scale modeling of water quantity (rather than quality). The goal of this initial model is the comprehensive description of water stores and fluxes in such a way as to permit linkage to GCMs, biogeochemical, ecological, and geomorphic models. This continental scale focus allows systematic evaluation of our current state of knowledge and data, leverages existing efforts done by large scale modelers, contributes to scientific discovery that informs globally and societally relevant questions, and provides an initial framework to evaluate hydrologic information relevant to other disciplines and a structure into which to incorporate other classes of hydrologic models. 3) Dataset development will be a key aspect of any successful national water model implementation. Our current knowledge of the subsurface is limiting our ability to truly integrate soil and groundwater into large scale models, and to answer critical science questions with societal relevance (e.g., groundwater's influence on climate). 4) The CHyMP workshops and efforts to date have achieved collaboration between university scientists, government agencies and the private sector that must be maintained. Follow-on efforts in community modeling should aim at leveraging and maintaining this collaboration for maximum scientific and societal benefit.
Using Sunlight and Cell Networks to Bring Fleet Tracking to Small Scale Fisheries
NASA Astrophysics Data System (ADS)
Garren, M.; Selbie, H.; Suchomel, D.; McDonald, W.; Solomon, D.
2016-12-01
Traditionally, the efforts of small scale fisheries have not been easily incorporated into the global picture of fishing effort and activity. That means that the activities of the vast majority (~90%) of fishing vessels in the world have remained unquantified and largely opaque. With newly developed technology that harnesses solar power and cost-effective cellular networks to transmit data, it is becoming possible to provide vessel tracking systems on a large scale for vessels of all sizes. Furthermore, capitalizing on the relatively inexpensive cellular networks to transfer the data enables data of much higher granularity to be captured. By recording a vessel's position every few seconds, instead of minutes to hours as is typical of most satellite-based systems, we are able to resolve a diverse array of behaviors happening at sea including when and where fishing occurred and what type of fishing gear was used. This high granularity data is both incredibly useful and also a challenge to manage and mine. New approaches for handling and processing this continuous data stream of vessel positions are being developed to extract the most informative and actionable pieces of information for a variety of audiences including governing agencies, industry supply chains seeking transparency, non-profit organizations supporting conservation efforts, academic researchers and the fishers and boat owners.
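The behavior-resolution claim above (separating fishing from transit using second-scale positions) can be illustrated with a minimal speed-threshold classifier. The threshold, sample rate, and track below are hypothetical, not the authors' algorithm — real gear classification uses far richer track features.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def label_segments(track, max_fishing_speed_ms=1.5):
    """Label each segment of a (t_seconds, lat, lon) track as 'fishing'
    when the implied speed drops below an illustrative threshold."""
    labels = []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(track, track[1:]):
        speed = haversine_m(la0, lo0, la1, lo1) / (t1 - t0)
        labels.append("fishing" if speed < max_fishing_speed_ms else "transit")
    return labels

# hypothetical 10-second fixes: two fast segments (~5 m/s), then nearly stationary
track = [(0, 0.0, 0.0), (10, 0.00045, 0.0), (20, 0.00090, 0.0), (30, 0.00091, 0.0)]
labels = label_segments(track)
```

With positions only every few hours, the slow third segment would be invisible, which is the granularity argument the abstract makes.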
Validation of the RAGE Hydrocode for Impacts into Volatile-Rich Targets
NASA Astrophysics Data System (ADS)
Plesko, C. S.; Asphaug, E.; Coker, R. F.; Wohletz, K. H.; Korycansky, D. G.; Gisler, G. R.
2007-12-01
In preparation for a detailed study of large-scale impacts into the Martian surface, we have validated the RAGE hydrocode (Gittings et al., in press, CSD) against a suite of experiments and statistical models. We present comparisons of hydrocode models to centimeter-scale gas gun impacts (Nakazawa et al. 2002), an underground nuclear test (Perret, 1971), and crater scaling laws (Holsapple 1993, O'Keefe and Ahrens 1993). We have also conducted model convergence and uncertainty analyses which will be presented. Results to date are encouraging for our current model goals, and indicate areas where the hydrocode may be extended in the future. This validation work is focused on questions related to the specific problem of large impacts into volatile-rich targets. The overall goal of this effort is to be able to realistically model large-scale Noachian, and possibly post-Noachian, impacts on Mars, not so much to model the crater morphology as to understand the evolution of target volatiles in the post-impact regime, and to explore how large craters might set the stage for post-impact hydrogeologic evolution both locally (in the crater subsurface) and globally, due to the redistribution of volatiles from the surface and subsurface into the atmosphere. This work is performed under the auspices of IGPP and the DOE at LANL under contracts W-7405-ENG-36 and DE-AC52-06NA25396. Effort by DK and EA is sponsored by NASA's Mars Fundamental Research Program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fischer, P. D.; Brown, M. E.; Trumbo, S. K.
2017-01-01
We present spatially resolved spectroscopic observations of Europa’s surface at 3–4 μm obtained with the near-infrared spectrograph and adaptive optics system on the Keck II telescope. These are the highest quality spatially resolved reflectance spectra of Europa’s surface at 3–4 μm. The observations spatially resolve Europa’s large-scale compositional units at a resolution of several hundred kilometers. The spectra show distinct features and geographic variations associated with known compositional units; in particular, large-scale leading hemisphere chaos shows a characteristic longward shift in peak reflectance near 3.7 μm compared to icy regions. These observations complement previous spectra of large-scale chaos, and can aid efforts to identify the endogenous non-ice species.
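The "longward shift in peak reflectance" can be quantified simply as the difference between the wavelengths at which two spectra peak. The Gaussian curves below are toy stand-ins for the chaos and icy-terrain spectra, not the observed Keck data:

```python
import numpy as np

wavelengths = np.linspace(3.0, 4.0, 101)  # micron grid, 0.01 micron step

# toy spectra: an "icy" unit peaking at 3.60 micron and a "chaos" unit at 3.70
ice = np.exp(-((wavelengths - 3.60) / 0.15) ** 2)
chaos = np.exp(-((wavelengths - 3.70) / 0.15) ** 2)

# peak-reflectance wavelengths via argmax, and the longward shift between them
shift = wavelengths[np.argmax(chaos)] - wavelengths[np.argmax(ice)]
```

A positive `shift` corresponds to the chaos peak lying longward of the ice peak, the geographic signature the abstract describes.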
Systems Biology-Based Investigation of Host-Plasmodium Interactions.
Smith, Maren L; Styczynski, Mark P
2018-05-18
Malaria is a serious, complex disease caused by parasites of the genus Plasmodium. Plasmodium parasites affect multiple tissues as they evade immune responses, replicate, sexually reproduce, and transmit between vertebrate and invertebrate hosts. The explosion of omics technologies has enabled large-scale collection of Plasmodium infection data, revealing systems-scale patterns, mechanisms of pathogenesis, and the ways that host and pathogen affect each other. Here, we provide an overview of recent efforts using systems biology approaches to study host-Plasmodium interactions and the biological themes that have emerged from these efforts. We discuss some of the challenges in using systems biology for this goal, key research efforts needed to address those issues, and promising future malaria applications of systems biology.
Shandas, Vivek; Voelkel, Jackson; Rao, Meenakshi; George, Linda
2016-01-01
Reducing exposure to degraded air quality is essential for building healthy cities. Although air quality and population vary at fine spatial scales, current regulatory and public health frameworks assess human exposures using county- or city-scales. We build on a spatial analysis technique, dasymetric mapping, for allocating urban populations that, together with emerging fine-scale measurements of air pollution, addresses three objectives: (1) evaluate the role of spatial scale in estimating exposure; (2) identify urban communities that are disproportionately burdened by poor air quality; and (3) estimate reduction in mobile sources of pollutants due to local tree-planting efforts using nitrogen dioxide. Our results show a maximum value of 197% difference between cadastrally-informed dasymetric system (CIDS) and standard estimations of population exposure to degraded air quality for small spatial extent analyses, and a lack of substantial difference for large spatial extent analyses. These results provide the foundation for improving policies for managing air quality, and targeting mitigation efforts to address challenges of environmental justice. PMID:27527205
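The core dasymetric step used above — reallocating a coarse census count onto finer zones in proportion to habitable (e.g. cadastral residential) area — can be sketched in a few lines. The zone names and areas below are invented for illustration and are not from the study:

```python
def dasymetric_allocate(total_population, zones):
    """Allocate a census unit's population across sub-zones in proportion
    to residential (habitable) area: the basic dasymetric-mapping step."""
    habitable = sum(z["residential_area"] for z in zones)
    return {
        z["id"]: total_population * z["residential_area"] / habitable
        for z in zones
    }

# hypothetical census tract of 1000 people split over three blocks
blocks = [
    {"id": "A", "residential_area": 30.0},  # hectares of residential land
    {"id": "B", "residential_area": 10.0},
    {"id": "C", "residential_area": 0.0},   # a park: receives no population
]
alloc = dasymetric_allocate(1000, blocks)
```

Pairing the refined population surface with fine-scale pollution measurements is what drives the exposure differences the abstract reports at small spatial extents.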
Automated Scheduling of Science Activities for Titan Encounters by Cassini
NASA Technical Reports Server (NTRS)
Ray, Trina L.; Knight, Russel L.; Mohr, Dave
2014-01-01
In an effort to demonstrate the efficacy of automated planning and scheduling techniques for large missions, we have adapted ASPEN (Activity Scheduling and Planning Environment) [1] and CLASP (Compressed Large-scale Activity Scheduling and Planning) [2] to the domain of scheduling high-level science goals into conflict-free operations plans for Titan encounters by the Cassini spacecraft.
ERIC Educational Resources Information Center
Margolis, Peter A.; DeWalt, Darren A.; Simon, Janet E.; Horowitz, Sheldon; Scoville, Richard; Kahn, Norman; Perelman, Robert; Bagley, Bruce; Miles, Paul
2010-01-01
Improving Performance in Practice (IPIP) is a large system intervention designed to align efforts and motivate the creation of a tiered system of improvement at the national, state, practice, and patient levels, assisting primary-care physicians and their practice teams to assess and measurably improve the quality of care for chronic illness and…
Matthew B. Russell; Anthony W. D' Amato; Bethany K. Schulz; Christopher W. Woodall; Grant M. Domke; John B. Bradford
2014-01-01
The contribution of understorey vegetation (UVEG) to forest ecosystem biomass and carbon (C) across diverse forest types has, to date, eluded quantification at regional and national scales. Efforts to quantify UVEG C have been limited to field-intensive studies or broad-scale modelling approaches lacking field measurements. Although large-scale inventories of UVEG C are...
Progress and limitations on quantifying nutrient and carbon loading to coastal waters
NASA Astrophysics Data System (ADS)
Stets, E.; Oelsner, G. P.; Stackpoole, S. M.
2017-12-01
Riverine export of nutrients and carbon to estuarine and coastal waters are important determinants of coastal ecosystem health and provide necessary insight into global biogeochemical cycles. Quantification of coastal solute loads typically relies upon modeling based on observations of concentration and discharge from selected rivers draining to the coast. Most large-scale river export models require unidirectional flow and thus are referenced to monitoring locations at the head of tide, which can be located far inland. As a result, the contributions of the coastal plain, tidal wetlands, and concentrated coastal development are often poorly represented in regional and continental-scale estimates of solute delivery to coastal waters. However, site-specific studies have found that these areas are disproportionately active in terms of nutrient and carbon export. Modeling efforts to upscale fluxes from these areas, while not common, also suggest an outsized importance to coastal flux estimates. This presentation will focus on illustrating how the problem of under-representation of near-shore environments impacts large-scale coastal flux estimates in the context of recent regional and continental-scale assessments. Alternate approaches to capturing the influence of the near-coastal terrestrial inputs including recent data aggregation efforts and modeling approaches will be discussed.
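The load estimates discussed above ultimately rest on combining concentration and discharge observations. A minimal sketch of that computation follows, handling only the unit conversion; operational load models additionally interpolate and regress between sparse concentration samples. The values are invented for illustration:

```python
def solute_load_kg(conc_mg_per_l, discharge_m3_per_s, dt_s):
    """Total solute load over a record of paired samples.
    1 mg/L equals 1 g/m^3, so c * Q gives an instantaneous flux in g/s;
    summing over time steps of dt_s seconds yields grams, then kilograms."""
    grams = sum(c * q * dt_s for c, q in zip(conc_mg_per_l, discharge_m3_per_s))
    return grams / 1000.0

# two hypothetical daily samples: 2 mg/L nitrate at 100 m^3/s discharge
load = solute_load_kg([2.0, 2.0], [100.0, 100.0], dt_s=86400)
```

The under-representation problem the abstract raises is that when the gauge sits at the head of tide, the `c` and `Q` series never see inputs added downstream of it.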
CPTAC | Office of Cancer Clinical Proteomics Research
The National Cancer Institute’s Clinical Proteomic Tumor Analysis Consortium (CPTAC) is a national effort to accelerate the understanding of the molecular basis of cancer through the application of large-scale proteome and genome analysis, or proteogenomics.
Leveraging Resources to Address Transportation Needs: Transportation Pooled Fund Program
DOT National Transportation Integrated Search
2004-05-28
This brochure describes the Transportation Pooled Fund (TPF) Program. The objectives of the TPF Program are to leverage resources, avoid duplication of effort, undertake large-scale projects, obtain greater input on project definition, achieve broade...
Enabling large-scale viscoelastic calculations via neural network acceleration
NASA Astrophysics Data System (ADS)
Robinson DeVries, P.; Thompson, T. B.; Meade, B. J.
2017-12-01
One of the most significant challenges involved in efforts to understand the effects of repeated earthquake cycle activity are the computational costs of large-scale viscoelastic earthquake cycle models. Deep artificial neural networks (ANNs) can be used to discover new, compact, and accurate computational representations of viscoelastic physics. Once found, these efficient ANN representations may replace computationally intensive viscoelastic codes and accelerate large-scale viscoelastic calculations by more than 50,000%. This magnitude of acceleration enables the modeling of geometrically complex faults over thousands of earthquake cycles across wider ranges of model parameters and at larger spatial and temporal scales than have been previously possible. Perhaps most interestingly from a scientific perspective, ANN representations of viscoelastic physics may lead to basic advances in the understanding of the underlying model phenomenology. We demonstrate the potential of artificial neural networks to illuminate fundamental physical insights with specific examples.
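The surrogate idea above — train a cheap network on input/output pairs from the expensive code, then query the network instead — can be sketched with a random-feature network (one frozen random hidden layer plus a least-squares readout) standing in for the deep ANNs used in the study. The one-dimensional "expensive model" below is a placeholder, not viscoelastic physics:

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_model(x):
    # stand-in for a costly calculation (e.g. a viscoelastic solver)
    return np.sin(3 * x) * np.exp(-x)

# offline phase: sample the expensive model once
x_train = np.linspace(0.0, 2.0, 200)
y_train = expensive_model(x_train)

# one frozen random hidden layer; only the linear readout is fitted,
# so "training" is a single least-squares solve
W = rng.normal(scale=3.0, size=(1, 128))
b = rng.uniform(-3.0, 3.0, size=128)

def features(x):
    return np.tanh(x[:, None] * W + b)

beta, *_ = np.linalg.lstsq(features(x_train), y_train, rcond=None)

def surrogate(x):
    # cheap replacement: a matrix multiply instead of the full solver
    return features(x) @ beta

x_test = np.linspace(0.0, 2.0, 57)
max_err = np.max(np.abs(surrogate(x_test) - expensive_model(x_test)))
```

The speedup comes from the same substitution at much larger scale: once fitted, evaluating the network costs a few dense matrix products regardless of how expensive the original solver was.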
Toward exascale production of recombinant adeno-associated virus for gene transfer applications.
Cecchini, S; Negrete, A; Kotin, R M
2008-06-01
To gain acceptance as a medical treatment, adeno-associated virus (AAV) vectors require a scalable and economical production method. Recent developments indicate that recombinant AAV (rAAV) production in insect cells is compatible with current good manufacturing practice production on an industrial scale. This platform can fully support development of rAAV therapeutics from tissue culture to small animal models, to large animal models, to toxicology studies, to Phase I clinical trials and beyond. Efforts to characterize, optimize and develop insect cell-based rAAV production have culminated in successful bioreactor-scale production of rAAV, with total yields potentially capable of approaching the exa (10^18) scale. These advances in large-scale AAV production will allow us to address specific catastrophic, intractable human diseases such as Duchenne muscular dystrophy, for which large amounts of recombinant vector are essential for successful outcome.
Remote Imaging Applied to Schistosomiasis Control: The Anning River Project
NASA Technical Reports Server (NTRS)
Seto, Edmund Y. W.; Maszle, Don R.; Spear, Robert C.; Gong, Peng
1997-01-01
The use of satellite imaging to remotely detect areas of high risk for transmission of infectious disease is an appealing prospect for large-scale monitoring of these diseases. The detection of large-scale environmental determinants of disease risk, often called landscape epidemiology, has been motivated by several authors (Pavlovsky 1966; Meade et al. 1988). The basic notion is that large-scale factors such as population density, air temperature, hydrological conditions, soil type, and vegetation can determine in a coarse fashion the local conditions contributing to disease vector abundance and human contact with disease agents. These large-scale factors can often be remotely detected by sensors or cameras mounted on satellite or aircraft platforms and can thus be used in a predictive model to mark high risk areas of transmission and to target control or monitoring efforts. A review of satellite technologies for this purpose was recently presented by Washino and Wood (1994) and Hay (1997) and Hay et al. (1997).
NASA Technical Reports Server (NTRS)
McGowan, Anna-Maria R.; Seifert, Colleen M.; Papalambros, Panos Y.
2012-01-01
The design of large-scale complex engineered systems (LaCES) such as an aircraft is inherently interdisciplinary. Multiple engineering disciplines, drawing from a team of hundreds to thousands of engineers and scientists, are woven together throughout the research, development, and systems engineering processes to realize one system. Though research and development (R&D) is typically focused in single disciplines, the interdependencies involved in LaCES require interdisciplinary R&D efforts. This study investigates the interdisciplinary interactions that take place during the R&D and early conceptual design phases in the design of LaCES. Our theoretical framework is informed by both engineering practices and social science research on complex organizations. This paper provides a preliminary perspective on some of the organizational influences on interdisciplinary interactions based on organization theory (specifically sensemaking), data from a survey of LaCES experts, and the authors' experience in research and design. The analysis reveals couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their significant benefit to the engineered system, such as innovation and problem mitigation. Substantial obstacles to interdisciplinarity beyond engineering are uncovered, including communication and organizational challenges. Addressing these challenges may ultimately foster greater efficiencies in the design and development of LaCES and improve system performance by assisting with the collective integration of interdependent knowledge bases early in the R&D effort. This research suggests that organizational and human dynamics heavily influence and even constrain the engineering effort for large-scale complex systems.
An Overview of the Launch Vehicle Blast Environments Development Efforts
NASA Technical Reports Server (NTRS)
Richardson, Erin; Bangham, Mike; Blackwood, James; Skinner, Troy; Hays, Michael; Jackson, Austin; Richman, Ben
2014-01-01
NASA has been funding an ongoing development program to characterize the explosive environments produced during a catastrophic launch vehicle accident. These studies and small-scale tests are focused on the near field environments that threaten the crew. The results indicate that these environments are unlikely to result in immediate destruction of the crew modules. The effort began as an independent assessment by NASA safety organizations, followed by the Ares program and NASA Engineering and Safety Center and now as a Space Launch Systems (SLS) focused effort. The development effort is using the test and accident data available from public or NASA sources as well as focused scaled tests that are examining the fundamental aspects of uncontained explosions of Hydrogen and air and Hydrogen and Oxygen. The primary risk to the crew appears to be the high-energy fragments and these are being characterized for the SLS. The development efforts will characterize the thermal environment of the explosions as well to ensure that the risk is well understood and to document the overall energy balance of an explosion. The effort is multi-path in that analytical, computational and focused testing is being used to develop the knowledge to understand potential SLS explosions. This is an ongoing program with plans that expand the development from fundamental testing at small-scale levels to large-scale tests that can be used to validate models for commercial programs. The ultimate goal is to develop a knowledge base that can be used by vehicle designers to maximize crew survival in an explosion.
MONITORING DECLINING METAPOPULATIONS: INSIGHTS FROM A MODEL SIMULATION
Pond-breeding amphibians, host-specialist butterflies, and a variety of other organisms have been shown to exhibit population structures and dynamics consistent with metapopulation theory. In recent years large-scale biodiversity monitoring efforts have been initiated in many reg...
Lessons Learned from the Everglades Collaborative Adaptive Management Program
Recent technical papers explore whether adaptive management (AM) is useful for environmental management and restoration efforts and discuss the many challenges to overcome for successful implementation, especially for large-scale restoration programs (McLain and Lee 1996; Levine ...
Large Scale Evaluation of Nickel Aluminide Rolls
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2005-09-01
This completed project was a joint effort between Oak Ridge National Laboratory and Bethlehem Steel (now Mittal Steel) to demonstrate the effectiveness of using nickel aluminide intermetallic alloy rolls as part of an updated, energy-efficient, commercial annealing furnace system.
ATLAS and LHC computing on CRAY
NASA Astrophysics Data System (ADS)
Sciacca, F. G.; Haug, S.; ATLAS Collaboration
2017-10-01
Access to and exploitation of large scale computing resources, such as those offered by general purpose HPC centres, is one important measure for ATLAS and the other Large Hadron Collider experiments in order to meet the challenge posed by the full exploitation of the future data within the constraints of flat budgets. We report on the effort of moving the Swiss WLCG T2 computing, serving ATLAS, CMS and LHCb, from a dedicated cluster to the large Cray systems at the Swiss National Supercomputing Centre CSCS. These systems not only offer very efficient hardware, cooling and highly competent operators, but also have large backfill potential due to their size and multidisciplinary usage, as well as potential gains due to economies of scale. Technical solutions, performance, expected return and future plans are discussed.
Vulnerability of China's nearshore ecosystems under intensive mariculture development.
Liu, Hui; Su, Jilan
2017-04-01
Rapid economic development and increasing population in China have exerted tremendous pressures on the coastal ecosystems. In addition to land-based pollutants and reclamation, fast expansion of large-scale intensive mariculture activities has also brought about additional effects. So far, the ecological impact of rapid mariculture development and its large-scale operations has not drawn enough attention. In this paper, the rapid development of mariculture in China is reviewed, China's effort in the application of ecological mariculture is examined, and the vulnerability of the marine ecosystem to mariculture impact is evaluated through a number of examples. Removal or reduction of large and forage fish, due both to habitat loss from reclamation/mariculture and to overfishing for food or fishmeal, may have far-reaching effects on the coastal and shelf ecosystems in the long run. Large-scale intensive mariculture operations carry with them undesirable biological and biochemical characteristics, which may have consequences on natural ecosystems beyond normally perceived spatial and temporal boundaries. As our understanding of the possible impacts of large-scale intensive mariculture is lagging far behind its development, much research is urgently needed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haid, D.A.; Fietz, W.A.
1969-06-01
The effort to scale up the plasma-arc process to produce large solenoids and saddle coils is described. Large coils (up to 16-3/4 in. and 41-in. length) of three different configurations, helical, "pancake" and "saddle," were fabricated using the plasma arc process.
Lester O. Dillard; Kevin R. Russell; W. Mark Ford
2008-01-01
The federally threatened Cheat Mountain salamander (Plethodon nettingi; hereafter CMS) is known to occur in approximately 70 small, scattered populations in the Allegheny Mountains of eastern West Virginia, USA. Current conservation and management efforts on federal, state, and private lands involving CMS largely rely on small scale, largely...
NASA Astrophysics Data System (ADS)
Hostache, Renaud; Rains, Dominik; Chini, Marco; Lievens, Hans; Verhoest, Niko E. C.; Matgen, Patrick
2017-04-01
Motivated by climate change and its impact on the scarcity or excess of water in many parts of the world, several agencies and research institutions have taken initiatives in monitoring and predicting the hydrologic cycle at a global scale. Such a monitoring/prediction effort is important for understanding the vulnerability to extreme hydrological events and for providing early warnings. This can be based on an optimal combination of hydro-meteorological models and remote sensing, in which satellite measurements can be used as forcing or calibration data or for regularly updating the model states or parameters. Many advances have been made in these domains and the near future will bring new opportunities with respect to remote sensing as a result of the increasing number of spaceborne sensors enabling the large scale monitoring of water resources. Besides these advances, there is currently a tendency to refine and further complicate physically-based hydrologic models to better capture the hydrologic processes at hand. However, this may not necessarily be beneficial for large-scale hydrology, as computational efforts increase significantly. As a matter of fact, a novel thematic science question to be investigated is whether a flexible conceptual model can match the performance of a complex physically-based model for hydrologic simulations at large scale. In this context, the main objective of this study is to investigate how innovative techniques that allow for the estimation of soil moisture from satellite data can help in reducing errors and uncertainties in large scale conceptual hydro-meteorological modelling. A spatially distributed conceptual hydrologic model has been set up based on recent developments of the SUPERFLEX modelling framework. As it requires limited computational effort, this model enables early warnings for large areas.
Using the ERA-Interim public dataset as forcing and coupled with the CMEM radiative transfer model, SUPERFLEX is capable of predicting runoff, soil moisture, and SMOS-like brightness temperature time series. Such a model is traditionally calibrated using only discharge measurements. In this study we designed a multi-objective calibration procedure based on both discharge measurements and SMOS-derived brightness temperature observations in order to evaluate the added value of remotely sensed soil moisture data in the calibration process. As a test case we set up the SUPERFLEX model for the large scale Murray-Darling catchment in Australia (~1 million km²). When compared to in situ soil moisture time series, model predictions show good agreement, resulting in correlation coefficients exceeding 70% and root mean squared errors below 1%. When benchmarked against the physically based land surface model CLM, SUPERFLEX exhibits similar performance levels. By adapting the runoff routing function within the SUPERFLEX model, the predicted discharge results in a Nash-Sutcliffe efficiency exceeding 0.7 over both the calibration and the validation periods.
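The Nash-Sutcliffe efficiency used as the discharge skill score above is straightforward to compute; the observed and simulated series below are invented for illustration:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
    is no better than always predicting the observed mean, and negative
    values mean it is worse than the mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

# hypothetical discharge series (e.g. m^3/s at a gauge)
obs = [3.0, 5.0, 9.0, 4.0, 6.0]
sim = [3.2, 4.6, 8.5, 4.4, 6.3]
nse = nash_sutcliffe(obs, sim)
```

An NSE above 0.7, as reported for SUPERFLEX, means the model removes more than 70% of the variance left by a mean-flow baseline.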
An Analysis of Rich Cluster Redshift Survey Data for Large Scale Structure Studies
NASA Astrophysics Data System (ADS)
Slinglend, K.; Batuski, D.; Haase, S.; Hill, J.
1994-12-01
The results from the COBE satellite show the existence of structure on scales on the order of 10% or more of the horizon scale of the universe. Rich clusters of galaxies from Abell's catalog show evidence of structure on scales of 100 Mpc and may hold the promise of confirming structure on the scale of the COBE result. However, many Abell clusters have zero or only one measured redshift, so present knowledge of their three dimensional distribution has quite large uncertainties. The shortage of measured redshifts for these clusters may also mask a problem of projection effects corrupting the membership counts for the clusters. Our approach in this effort has been to use the MX multifiber spectrometer on the Steward 2.3m to measure redshifts of at least ten galaxies in each of 80 Abell cluster fields with richness class R>= 1 and mag10 <= 16.8 (estimated z<= 0.12) and zero or one measured redshifts. This work will result in a deeper, more complete (and reliable) sample of positions of rich clusters. Our primary intent for the sample is for two-point correlation and other studies of the large scale structure traced by these clusters in an effort to constrain theoretical models for structure formation. We are also obtaining enough redshifts per cluster so that a much better sample of reliable cluster velocity dispersions will be available for other studies of cluster properties. To date, we have collected such data for 64 clusters, and for most of them, we have seven or more cluster members with redshifts, allowing for reliable velocity dispersion calculations. Velocity histograms and stripe density plots for several interesting cluster fields are presented, along with summary tables of cluster redshift results. 
Also, with 10 or more redshifts in most of our cluster fields (30 arcmin square, just about an `Abell diameter' at z ~ 0.1), we have investigated the extent of projection effects within the Abell catalog in an effort to quantify and understand how they may affect the Abell sample.
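As background, a cluster's rest-frame line-of-sight velocity dispersion can be estimated from member redshifts as sketched below; the redshift values are hypothetical, and the (1 + z) rest-frame correction is the standard one:

```python
import numpy as np

C_KMS = 299792.458  # speed of light in km/s

def velocity_dispersion(redshifts):
    """Line-of-sight velocity dispersion of cluster members, with peculiar
    velocities measured relative to the mean cluster redshift and corrected
    to the cluster rest frame by the (1 + z_cl) factor."""
    z = np.asarray(redshifts, float)
    z_cl = z.mean()
    v_pec = C_KMS * (z - z_cl) / (1.0 + z_cl)
    return v_pec.std(ddof=1)

# Ten hypothetical member redshifts for a cluster near z ~ 0.1
members = [0.0990, 0.0995, 0.1000, 0.1003, 0.1007,
           0.0998, 0.1010, 0.0992, 0.1005, 0.1000]
sigma = velocity_dispersion(members)  # km/s
```

Robust estimators (e.g. the biweight) are often preferred for small samples; the plain sample standard deviation above is the simplest choice.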
Heavy hydrocarbon main injector technology
NASA Technical Reports Server (NTRS)
Fisher, S. C.; Arbit, H. A.
1988-01-01
One of the key components of the Advanced Launch System (ALS) is a large liquid-rocket booster engine. To keep overall vehicle size and cost down, this engine will probably use liquid oxygen (LOX) and a heavy hydrocarbon, such as RP-1, as propellants and operate at relatively high chamber pressures to increase overall performance. A technology program (Heavy Hydrocarbon Main Injector Technology) is being conducted. The main objective of this effort is to develop a logic plan and a supporting experimental database to reduce the risk of developing a large-scale (approximately 750,000 lb thrust), high-performance main injector system. The overall approach and program plan, from initial analyses to large-scale, two-dimensional combustor design and test, and the current status of the program are discussed. Progress includes performance and stability analyses, cold flow tests of an injector model, design and fabrication of subscale injectors and calorimeter combustors for performance, heat transfer, and dynamic stability tests, and preparation of hot-fire test plans. Related current high-pressure LOX/RP-1 injector technology efforts are also briefly discussed.
Shear-driven dynamo waves at high magnetic Reynolds number.
Tobias, S M; Cattaneo, F
2013-05-23
Astrophysical magnetic fields often display remarkable organization, despite being generated by dynamo action driven by turbulent flows at high conductivity. An example is the eleven-year solar cycle, which shows spatial coherence over the entire solar surface. The difficulty in understanding the emergence of this large-scale organization is that whereas at low conductivity (measured by the magnetic Reynolds number, Rm) dynamo fields are well organized, at high Rm their structure is dominated by rapidly varying small-scale fluctuations. This arises because the smallest scales have the highest rate of strain, and can amplify magnetic field most efficiently. Therefore most of the effort to find flows whose large-scale dynamo properties persist at high Rm has been frustrated. Here we report high-resolution simulations of a dynamo that can generate organized fields at high Rm; indeed, the generation mechanism, which involves the interaction between helical flows and shear, only becomes effective at large Rm. The shear does not enhance generation at large scales, as is commonly thought; instead it reduces generation at small scales. The solution consists of propagating dynamo waves, whose existence was postulated more than 60 years ago and which have since been used to model the solar cycle.
MODFLOW-LGR: Practical application to a large regional dataset
NASA Astrophysics Data System (ADS)
Barnes, D.; Coulibaly, K. M.
2011-12-01
In many areas of the US, including southwest Florida, large regional-scale groundwater models have been developed to aid in decision making and water resources management. These models are subsequently used as a basis for site-specific investigations. Because the large scale of these regional models is not appropriate for local application, refinement is necessary to analyze the local effects of pumping wells and groundwater related projects at specific sites. The most commonly used approach to date is Telescopic Mesh Refinement or TMR. It allows the extraction of a subset of the large regional model with boundary conditions derived from the regional model results. The extracted model is then updated and refined for local use using a variable sized grid focused on the area of interest. MODFLOW-LGR, local grid refinement, is an alternative approach which allows model discretization at a finer resolution in areas of interest and provides coupling between the larger "parent" model and the locally refined "child." In the present work, these two approaches are tested on a mining impact assessment case in southwest Florida using a large regional dataset (The Lower West Coast Surficial Aquifer System Model). Various metrics for performance are considered. They include: computation time, water balance (as compared to the variable sized grid), calibration, implementation effort, and application advantages and limitations. The results indicate that MODFLOW-LGR is a useful tool to improve local resolution of regional scale models. While performance metrics, such as computation time, are case-dependent (model size, refinement level, stresses involved), implementation effort, particularly when regional models of suitable scale are available, can be minimized. The creation of multiple child models within a larger scale parent model makes it possible to reuse the same calibrated regional dataset with minimal modification. 
In cases similar to the Lower West Coast model, where a model is larger than optimal for direct application as a parent grid, a combination of TMR and LGR approaches should be used to develop a suitable parent grid.
Effects of Ensemble Configuration on Estimates of Regional Climate Uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldenson, N.; Mauger, G.; Leung, L. R.
Internal variability in the climate system can contribute substantial uncertainty to climate projections, particularly at regional scales. Internal variability can be quantified using large ensembles of simulations that are identical but for perturbed initial conditions. Here we compare methods for quantifying internal variability. Our study region spans the west coast of North America, which is strongly influenced by El Niño and other large-scale dynamics through their contribution to large-scale internal variability. Using a statistical framework to simultaneously account for multiple sources of uncertainty, we find that internal variability can be quantified consistently using a large ensemble or an ensemble of opportunity that includes small ensembles from multiple models and climate scenarios. The latter also produces estimates of uncertainty due to model differences. We conclude that projection uncertainties are best assessed using small single-model ensembles from as many model-scenario pairings as computationally feasible, which has implications for ensemble design in large modeling efforts.
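A toy version of this variance partition, assuming projections are arranged as a (models x members) array; the paper's statistical framework is considerably more elaborate:

```python
import numpy as np

def decompose_uncertainty(proj):
    """proj: shape (n_models, n_members), one projected quantity per
    model and initial-condition member.

    Returns (internal, model) standard deviations:
      internal - mean across models of the within-model member spread
                 (initial-condition, i.e. internal, variability)
      model    - spread of the model means (model uncertainty)
    """
    proj = np.asarray(proj, float)
    internal = float(np.mean(np.std(proj, axis=1, ddof=1)))
    model = float(np.std(np.mean(proj, axis=1), ddof=1))
    return internal, model

# Three hypothetical models x five members: model offsets plus internal noise
rng = np.random.default_rng(0)
proj = np.array([2.0, 2.5, 3.0])[:, None] + rng.normal(0.0, 0.3, (3, 5))
internal, model = decompose_uncertainty(proj)
```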
Data integration in the era of omics: current and future challenges
2014-01-01
Integrating heterogeneous and large omics data sets constitutes not only a conceptual challenge but also a practical hurdle in the daily analysis of omics data. With the rise of novel omics technologies and through large-scale consortia projects, biological systems are being investigated at an unprecedented scale, generating heterogeneous and often large data sets. These data sets encourage researchers to develop novel data integration methodologies. In this introduction we review the definition of, and characterize current efforts on, data integration in the life sciences. We used a web survey to assess current research projects on data integration and to tap into the views, needs, and challenges as currently perceived by parts of the research community. PMID:25032990
Applications of large-scale density functional theory in biology
NASA Astrophysics Data System (ADS)
Cole, Daniel J.; Hine, Nicholas D. M.
2016-10-01
Density functional theory (DFT) has become a routine tool for the computation of electronic structure in the physics, materials and chemistry fields. Yet the application of traditional DFT to problems in the biological sciences is hindered, to a large extent, by the unfavourable scaling of the computational effort with system size. Here, we review some of the major software and functionality advances that enable insightful electronic structure calculations to be performed on systems comprising many thousands of atoms. We describe some of the early applications of large-scale DFT to the computation of the electronic properties and structure of biomolecules, as well as to paradigmatic problems in enzymology, metalloproteins, photosynthesis and computer-aided drug design. With this review, we hope to demonstrate that first-principles modelling of biological structure-function relationships is approaching reality.
Jones, Alvin; Ingram, M Victoria
2011-10-01
Using a relatively new statistical paradigm, Optimal Data Analysis (ODA; Yarnold & Soltysik, 2005), this research demonstrated that newly developed scales for the Minnesota Multiphasic Personality Inventory-2 (MMPI-2) and MMPI-2 Restructured Form (MMPI-2-RF), specifically designed to assess over-reporting of cognitive and/or somatic symptoms, were more effective than the MMPI-2 F-family of scales in predicting effort status on tests of cognitive functioning in a sample of 288 military members. ODA demonstrated that when all scales were performing at their theoretical maximum possible level of classification accuracy, the Henry Heilbronner Index (HHI), Response Bias Scale (RBS), Fake Bad Scale (FBS), and the Symptom Validity Scale (FBS-r) outperformed the F-family of scales on a variety of ODA indexes of classification accuracy, including an omnibus measure (effect strength total, EST) of the descriptive and prognostic utility of ODA models developed for each scale. Based on the guidelines suggested by Yarnold and Soltysik for evaluating effect strengths for ODA models, the newly developed scales had effect strengths that were moderate (37.66 to 45.68), whereas the F-family scales had effect strengths that ranged from weak to moderate (15.42 to 32.80). In addition, traditional analysis demonstrated that HHI, RBS, FBS, and FBS-r had large effect sizes (0.98 to 1.16), based on Cohen's (1988) suggested categorization of effect size, when comparing mean scores for adequate versus inadequate effort groups, whereas the F-family of scales had small to medium effect sizes (0.25 to 0.76). The MMPI-2-RF Infrequent Somatic Responses Scale (F(S)) tended to perform similarly to F, the best-performing F-family scale.
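In the univariate case, an ODA model amounts to searching for the cutpoint on a scale that maximizes classification accuracy; the sketch below illustrates that idea alongside Cohen's d, using made-up validity-scale scores (this is the spirit of the approach, not the full ODA software's algorithm):

```python
import numpy as np

def optimal_cutpoint(scores, labels):
    """Brute-force search for the threshold that maximizes overall
    classification accuracy - the core of a univariate ODA model
    (the full ODA paradigm adds further indexes, e.g. effect strength)."""
    scores, labels = np.asarray(scores, float), np.asarray(labels, int)
    best_cut, best_acc = None, -1.0
    for cut in np.unique(scores):
        acc = float(np.mean((scores >= cut).astype(int) == labels))
        if acc > best_acc:
            best_cut, best_acc = cut, acc
    return best_cut, best_acc

def cohens_d(a, b):
    """Cohen's d with a pooled standard deviation."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    pooled = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                     / (len(a) + len(b) - 2))
    return float((a.mean() - b.mean()) / pooled)

# Hypothetical scale scores for inadequate (1) vs adequate (0) effort groups
scores = [1, 2, 3, 10, 11, 12]
labels = [0, 0, 0, 1, 1, 1]
cut, acc = optimal_cutpoint(scores, labels)
```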
Cost-Driven Design of a Large Scale X-Plane
NASA Technical Reports Server (NTRS)
Welstead, Jason R.; Frederic, Peter C.; Frederick, Michael A.; Jacobson, Steven R.; Berton, Jeffrey J.
2017-01-01
A conceptual design process focused on the development of a low-cost, large-scale X-plane was developed as part of an internal research and development effort. One of the concepts considered in this process was the double-bubble configuration recently developed as an advanced single-aisle-class commercial transport similar in size to a Boeing 737-800 or Airbus A320. The study objective was to reduce the contractor cost from contract award to first test flight to less than $100 million, with first flight within three years of contract award. Methods and strategies for reducing cost are discussed.
Large-Scale Demonstration of Liquid Hydrogen Storage with Zero Boiloff for In-Space Applications
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Bryant, C. B.; Flachbart, R. H.; Holt, K. A.; Johnson, E.; Hedayat, A.; Hipp, B.; Plachta, D. W.
2010-01-01
Cryocooler and passive insulation technology advances have substantially improved prospects for zero-boiloff cryogenic storage. Therefore, a cooperative effort by NASA's Ames Research Center, Glenn Research Center, and Marshall Space Flight Center (MSFC) was implemented to develop zero-boiloff concepts for in-space cryogenic storage. Described herein is one program element: a large-scale, zero-boiloff demonstration using the MSFC multipurpose hydrogen test bed (MHTB). A commercial cryocooler was interfaced with an existing MHTB spray-bar mixer and insulation system in a manner that enabled a balance between incoming and extracted thermal energy.
Montague, P. Read; Dolan, Raymond J.; Friston, Karl J.; Dayan, Peter
2013-01-01
Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects. PMID:22177032
Hieu, Nguyen Trong; Brochier, Timothée; Tri, Nguyen-Huu; Auger, Pierre; Brehmer, Patrice
2014-09-01
We consider a fishery model with two sites: (1) a marine protected area (MPA) where fishing is prohibited and (2) an area where the fish population is harvested. We assume that fish can migrate from the MPA to the fishing area at a very fast time scale and that the spatial organisation of fish can change from small to large clusters of schools at a fast time scale. The growth of the fish population and the catch are assumed to occur at a slow time scale. The complete model is a system of five ordinary differential equations with three time scales. We take advantage of the time scales, using aggregation-of-variables methods, to derive a reduced model governing the total fish density and fishing effort at the slow time scale. We analyze this aggregated model and show that, under some conditions, there exists an equilibrium corresponding to a sustainable fishery. Our results suggest that in small pelagic fisheries the yield is maximal when the fish population is distributed among both small and large clusters of schools.
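A hedged sketch of a two-equation slow-time-scale model in this spirit, using a generic Gordon-Schaefer-type stock equation with open-access effort dynamics and invented parameter values (the paper's reduced model and parameters differ in detail):

```python
from scipy.integrate import solve_ivp

# Illustrative parameters only - not the paper's values
r, K = 1.0, 10.0   # fish growth rate and carrying capacity
q, p = 0.5, 0.3    # catchability and protected (MPA) fraction of the stock
a, c = 0.4, 0.6    # revenue-to-effort conversion and cost per unit effort

def aggregated(t, y):
    """Slow dynamics of total fish density n and fishing effort E; only the
    unprotected fraction (1 - p) of the stock is exposed to harvesting."""
    n, E = y
    dn = r * n * (1 - n / K) - q * (1 - p) * n * E
    dE = a * q * (1 - p) * n * E - c * E
    return [dn, dE]

sol = solve_ivp(aggregated, (0.0, 200.0), [5.0, 1.0], rtol=1e-8, atol=1e-10)
n_eq, E_eq = sol.y[:, -1]  # approaches the sustainable-fishery equilibrium
```

For this form the interior equilibrium is n* = c / (a q (1 - p)) and E* = r (1 - n*/K) / (q (1 - p)), toward which the trajectory spirals.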
Large-scale particle acceleration by magnetic reconnection during solar flares
NASA Astrophysics Data System (ADS)
Li, X.; Guo, F.; Li, H.; Li, G.; Li, S.
2017-12-01
Magnetic reconnection that triggers explosive magnetic energy release has been widely invoked to explain large-scale particle acceleration during solar flares. While great effort has been spent studying the acceleration mechanism in small-scale kinetic simulations, few studies have made predictions for acceleration at scales comparable to the flare reconnection region. Here we present a new approach to this problem. We solve the large-scale energetic-particle transport equation in the fluid velocity and magnetic fields from high-Lundquist-number MHD simulations of reconnection layers. This approach is based on examining the dominant acceleration mechanism and pitch-angle scattering in kinetic simulations. Due to the fluid compression in reconnection outflows and merging magnetic islands, particles are accelerated to high energies and develop power-law energy distributions. We find that the acceleration efficiency and power-law index depend critically on the upstream plasma beta and the magnitude of the guide field (the magnetic field component perpendicular to the reconnecting component), as they influence the compressibility of the reconnection layer. We also find that the accelerated high-energy particles are mostly concentrated in large magnetic islands, making the islands a source of energetic particles and high-energy emission. These findings may explain the acceleration process in large-scale magnetic reconnection during solar flares and the temporal and spatial emission properties observed in different flare events.
Law Enforcement Efforts to Control Domestically Grown Marijuana.
1984-05-25
marijuana grown indoors, the involvement of large criminal organizations, and the patterns of domestic marijuana distribution. In response to a GAO...information is particularly important if the amount of marijuana grown indoors and the number of large-scale cultivation and distribution organizations... marijuana indoors is becoming increasingly popular. A 1982 narcotics assessment by the Western States Information Network (WSIN)2 of marijuana
Michael G. Harrington; Erin Noonan-Wright; Mitchell Doherty
2007-01-01
Many of the coniferous zones in the Western United States where fires were historically frequent have seen large increases in stand densities and associated forest fuels due to 20th-century anthropogenic influences. This condition is partially responsible for contemporary large, uncharacteristically severe wildfires. Therefore, considerable effort is under way to...
SAR STUDY OF NASAL TOXICITY: LESSONS FOR MODELING SMALL TOXICITY DATASETS
Most toxicity data, particularly from whole animal bioassays, are generated without the needs or capabilities of structure-activity relationship (SAR) modeling in mind. Some toxicity endpoints have been of sufficient regulatory concern to warrant large scale testing efforts (e.g....
TREATMENT OF MUNICIPAL WASTEWATERS BY THE FLUIDIZED BED BIOREACTOR PROCESS
A 2-year, large-scale pilot investigation was conducted at the City of Newburgh Water Pollution Control Plant, Newburgh, NY, to demonstrate the application of the fluidized bed bioreactor process to the treatment of municipal wastewaters. The experimental effort investigated the ...
A MANAGEMENT SUPPORT SYSTEM FOR GREAT LAKES COASTAL WETLANDS
The Great Lakes National Program Office in conjunction with the Great Lakes Commission and other researchers is leading a large scale collaborative effort that will yield, in unprecedented detail, a management support system for Great Lakes coastal wetlands. This entails the dev...
Evaluating Green/Gray Infrastructure for CSO/Stormwater Control
The NRMRL is conducting this project to evaluate the water quality and quantity benefits of a large-scale application of green infrastructure (low-impact development/best management practices) retrofits in an entire subcatchment. It will document ORD's effort to demonstrate the e...
Nakamura, Brad J; Mueller, Charles W; Higa-McMillan, Charmaine; Okamura, Kelsie H; Chang, Jaime P; Slavin, Lesley; Shimabukuro, Scott
2014-01-01
Hawaii's Child and Adolescent Mental Health Division provides a unique illustration of a youth public mental health system with a long and successful history of large-scale quality improvement initiatives. Many advances are linked to flexibly organizing and applying knowledge gained from the scientific literature, and they move beyond installing a limited number of brand-named treatment approaches that might be directly relevant only to a small handful of system youth. This article takes a knowledge-to-action perspective and outlines five knowledge management strategies currently under way in Hawaii. Each strategy represents one component of a larger coordinated effort at engineering a service system focused on delivering both brand-named treatment approaches and complementary strategies informed by the evidence base. The five knowledge management examples are (a) a set of modular-based professional training activities for currently practicing therapists, (b) an outreach initiative for supporting youth evidence-based practices training at Hawaii's mental health-related professional programs, (c) an effort to increase consumer knowledge of and demand for youth evidence-based practices, (d) a practice and progress agency performance feedback system, and (e) a sampling of system-level research studies focused on understanding treatment as usual. We end by outlining a small set of lessons learned and a longer-term vision for embedding these efforts into the system's infrastructure.
Forcey, Greg M.; Thogmartin, Wayne E.; Linz, George M.; McKann, Patrick C.
2014-01-01
Bird populations are influenced by many environmental factors at both large and small scales. Our study evaluated the influences of regional climate and land-use variables on the Northern Harrier (Circus cyaneus), Black Tern (Childonias niger), and Marsh Wren (Cistothorus palustris) in the prairie potholes of the upper Midwest of the United States. These species were chosen because their diverse habitat preferences represent the spectrum of habitat conditions present in the prairie potholes, ranging from open prairies to dense cattail marshes. We evaluated land-use covariates at three logarithmic spatial scales (1,000 ha, 10,000 ha, and 100,000 ha) and constructed models a priori using information from published habitat associations and climatic influences. The strongest influences on the abundance of each of the three species were the percentage of wetland area across all three spatial scales and precipitation in the year preceding that in which bird surveys were conducted. Even among scales ranging over three orders of magnitude, the influence of spatial scale was small, as models with the same variables expressed at different scales were often in the best model subset. Examination of the effects of large-scale environmental variables on wetland birds elucidated relationships overlooked in many smaller-scale studies, such as the influences of climate and habitat variables at landscape scales. Given the spatial variation in the abundance of our focal species within the prairie potholes, our model predictions are especially useful for targeting locations, such as northeastern South Dakota and central North Dakota, where management and conservation efforts would be optimally beneficial. This modeling approach can also be applied to other species and geographic areas to focus landscape conservation efforts and subsequent small-scale studies, especially in constrained economic climates.
Value-focused framework for defining landscape-scale conservation targets
Romañach, Stephanie; Benscoter, Allison M.; Brandt, Laura A.
2016-01-01
Conservation of natural resources can be challenging in a rapidly changing world and requires collaborative efforts for success. Conservation planning is the process of deciding how to protect, conserve, and enhance or minimize loss of natural and cultural resources. Establishing conservation targets (also called indicators or endpoints), the measurable expressions of desired resource conditions, can help with conservation planning from the site-specific up to the landscape scale. Using conservation targets and tracking them through time can deliver benefits such as insight into ecosystem health and early warnings about undesirable trends. We describe an approach using value-focused thinking to develop statewide conservation targets for Florida. Using such an approach allowed us to first identify stakeholder objectives and then define conservation targets to meet those objectives. Stakeholders were able to see how their shared efforts fit into the broader conservation context, and also to anticipate the benefits of multi-agency and multi-organization collaboration. We developed an iterative process for large-scale conservation planning that included defining a shared framework for the process and the conservation targets themselves, as well as developing management and monitoring strategies for evaluating their effectiveness. The process we describe is applicable to other geographies where multiple parties are seeking to implement collaborative, large-scale biological planning.
De La Vega, Francisco M; Dailey, David; Ziegle, Janet; Williams, Julie; Madden, Dawn; Gilbert, Dennis A
2002-06-01
Since public and private efforts announced the first draft of the human genome last year, researchers have reported great numbers of single nucleotide polymorphisms (SNPs). We believe that the availability of well-mapped, quality SNP markers constitutes the gateway to a revolution in genetics and personalized medicine that will lead to better diagnosis and treatment of common complex disorders. A new generation of tools and public SNP resources for pharmacogenomic and genetic studies--specifically for candidate-gene, candidate-region, and whole-genome association studies--will form part of the new scientific landscape. This will only be possible through the greater accessibility of SNP resources and superior high-throughput instrumentation-assay systems that enable affordable, highly productive large-scale genetic studies. We are contributing to this effort by developing a high-quality linkage disequilibrium SNP marker map and an accompanying set of ready-to-use, validated SNP assays across every gene in the human genome. This effort incorporates both the public sequence and SNP data sources and Celera Genomics' human genome assembly and enormous resource of physically mapped SNPs (approximately 4,000,000 unique records). This article discusses our approach and methodology for designing the map, choosing quality SNPs, designing and validating the assays, and obtaining population frequencies of the polymorphisms. We also discuss an advanced, high-performance SNP assay chemistry (a new generation of the TaqMan probe-based, 5' nuclease assay) and a high-throughput instrumentation-software system for large-scale genotyping. We provide the new SNP map and validation information, validated SNP assays and reagents, and instrumentation systems as a novel resource for genetic discoveries.
Harvey, Léa; Fortin, Daniel
2013-01-01
Spatial heterogeneity in the strength of trophic interactions is a fundamental property of food web spatial dynamics. The feeding effort of herbivores should reflect adaptive decisions that only become rewarding when foraging gains exceed 1) the metabolic costs, 2) the missed opportunity costs of not foraging elsewhere, and 3) the foraging costs of anti-predator behaviour. Two aspects of these costs remain largely unexplored: the link between the strength of plant-herbivore interactions and the spatial scale of food-quality assessment, and the predator-prey spatial game. We modeled the foraging effort of free-ranging plains bison (Bison bison bison) in winter, within a mosaic of discrete meadows. Spatial patterns of bison herbivory were largely driven by a search for high net energy gains and, to a lesser degree, by the spatial game with grey wolves (Canis lupus). Bison decreased local feeding effort with increasing metabolic and missed opportunity costs. Bison herbivory was most consistent with a broad-scale assessment of food patch quality, i.e., bison grazed more intensively in patches with a low missed opportunity cost relative to other patches available in the landscape. Bison and wolves had a higher probability of using the same meadows than expected randomly. This co-occurrence indicates wolves are ahead in the spatial game they play with bison. Wolves influenced bison foraging at fine scale, as bison tended to consume less biomass at each feeding station when in meadows where the risk of a wolf's arrival was relatively high. Also, bison left more high-quality vegetation in large than small meadows. This behavior does not maximize their energy intake rate, but is consistent with bison playing a shell game with wolves. Our assessment of bison foraging in a natural setting clarifies the complex nature of plant-herbivore interactions under predation risk, and reveals how spatial patterns in herbivory emerge from multi-scale landscape heterogeneity. PMID:24039909
Portable parallel stochastic optimization for the design of aeropropulsion components
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Rhodes, G. S.
1994-01-01
This report presents the results of Phase 1 research to develop a methodology for performing large-scale Multi-disciplinary Stochastic Optimization (MSO) for the design of aerospace systems ranging from aeropropulsion components to complete aircraft configurations. The current research recognizes that such design optimization problems are computationally expensive and require the use of either massively parallel or multiple-processor computers. The methodology also recognizes that many operational and performance parameters are uncertain, and that uncertainty must be considered explicitly to achieve optimum performance and cost. The objective of this Phase 1 research was to begin the development of an MSO methodology that is portable to a wide variety of hardware platforms, while achieving efficient, large-scale parallelism when multiple processors are available. The first effort in the project was a literature review of available computer hardware, as well as a review of portable, parallel programming environments. The second effort was to implement the MSO methodology for an example problem using the portable parallel programming environment Parallel Virtual Machine (PVM). The third and final effort was to demonstrate the example on a variety of computers, including a distributed-memory multiprocessor, a distributed-memory network of workstations, and a single-processor workstation. Results indicate that the MSO methodology is well suited to large-scale aerospace design problems. Nearly perfect linear speedup was demonstrated for computation of optimization sensitivity coefficients on both a 128-node distributed-memory multiprocessor (the Intel iPSC/860) and a network of workstations (speedups of almost 19 times achieved for 20 workstations). Very high parallel efficiencies (75 percent for 31 processors and 60 percent for 50 processors) were also achieved for computation of aerodynamic influence coefficients on the Intel.
Finally, the multi-level parallelization strategy that will be needed for large-scale MSO problems was demonstrated to be highly efficient. The same parallel code instructions were used on both platforms, demonstrating portability. There are many applications to which MSO can be applied, including NASA's High-Speed Civil Transport and advanced propulsion systems. The use of MSO will reduce design and development time and testing costs dramatically.
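The speedups and efficiencies quoted above follow from the usual definitions; the timings below are illustrative, not the report's measurements:

```python
def speedup(t_serial, t_parallel):
    """Ratio of serial to parallel wall-clock time."""
    return t_serial / t_parallel

def parallel_efficiency(t_serial, t_parallel, n_procs):
    """Fraction of ideal linear speedup actually achieved."""
    return speedup(t_serial, t_parallel) / n_procs

# A run that is 19x faster on 20 workstations, as reported above
eff = parallel_efficiency(100.0, 100.0 / 19.0, 20)  # 0.95, i.e. 95 percent
```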
Large scale in vivo recordings to study neuronal biophysics.
Giocomo, Lisa M
2015-06-01
Over the last several years, technological advances have enabled researchers to more readily observe single-cell membrane biophysics in awake, behaving animals. Studies utilizing these technologies have provided important insights into the mechanisms generating functional neural codes in both sensory and non-sensory cortical circuits. Crucial for a deeper understanding of how membrane biophysics controls circuit dynamics, however, is a continued effort to move toward large-scale studies of membrane biophysics, in terms of the numbers of neurons and ion channels examined. Future work faces a number of theoretical and technical challenges on this front, but recent technological developments hold great promise for a larger-scale understanding of how membrane biophysics contribute to circuit coding and computation. Copyright © 2014 Elsevier Ltd. All rights reserved.
Heating and Large Scale Dynamics of the Solar Corona
NASA Technical Reports Server (NTRS)
Schnack, Dalton D.
2000-01-01
The effort was concentrated in the following areas: coronal heating mechanisms; unstructured adaptive grid algorithms; and numerical modeling of magnetic reconnection in the MRX experiment, including the effects of toroidal magnetic field and finite pressure, ohmic heating and vertical magnetic field, and dynamic mesh adaptation.
Transforming Power Systems; 21st Century Power Partnership
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2015-05-20
The 21st Century Power Partnership - a multilateral effort of the Clean Energy Ministerial - serves as a platform for public-private collaboration to advance integrated solutions for the large-scale deployment of renewable energy in combination with deep energy efficiency and smart grid solutions.
ERIC Educational Resources Information Center
Coffey, Dave
2006-01-01
The scale of the mechanical and plumbing systems required to support a large, multi-building academic health sciences/research center entails a lot of ductwork. Getting mechanical systems installed and running while carrying out activities from other building disciplines requires a great deal of coordinated effort. A university and its…
TOWARD ERROR ANALYSIS OF LARGE-SCALE FOREST CARBON BUDGETS
Quantification of forest carbon sources and sinks is an important part of national inventories of net greenhouse gas emissions. Several such forest carbon budgets have been constructed, but little effort has been made to analyse the sources of error and how these errors propagate...
Estimating carbon fluxes on small rotationally grazed pastures
USDA-ARS?s Scientific Manuscript database
Satellite-based Normalized Difference Vegetation Index (NDVI) data have been extensively used for estimating gross primary productivity (GPP) and yield of grazing lands throughout the world. Large-scale estimates of GPP are a necessary component of efforts to monitor the soil carbon balance of grazi...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grossman, Max; Pritchard Jr., Howard Porter; Budimlic, Zoran
2016-12-22
Graph500 [14] is an effort to offer a standardized benchmark across large-scale distributed platforms which captures the behavior of common communicationbound graph algorithms. Graph500 differs from other large-scale benchmarking efforts (such as HPL [6] or HPGMG [7]) primarily in the irregularity of its computation and data access patterns. The core computational kernel of Graph500 is a breadth-first search (BFS) implemented on an undirected graph. The output of Graph500 is a spanning tree of the input graph, usually represented by a predecessor mapping for every node in the graph. The Graph500 benchmark defines several pre-defined input sizes for implementers to testmore » against. This report summarizes investigation into implementing the Graph500 benchmark on OpenSHMEM, and focuses on first building a strong and practical understanding of the strengths and limitations of past work before proposing and developing novel extensions.« less
NASA Astrophysics Data System (ADS)
Yue, Y.; Tong, X.; Wang, K.; Fensholt, R.; Brandt, M.
2017-12-01
With the aim to combat desertification and improve the ecological environment, mega-engineering afforestation projects have been launched in the karst regions of southwest China around the turn of the new millennium. A positive impact of these projects on vegetation cover has been shown, however, it remains unclear if conservation efforts have been able to effectively restore ecosystem properties and reduce the sensitivity of the karst ecosystem to climate variations at large scales. Here we use passive microwave and optical satellite time series data combined with the ecosystem model LPJ-GUESS and show widespread increase in vegetation cover with a clear demarcation at the Chinese national border contrasting the conditions of neighboring countries. We apply a breakpoint detection to identify permanent changes in vegetation time series and assess the vegetation's sensitivity against climate before and after the breakpoints. A majority (74%) of the breakpoints were detected between 2001 and 2004 and are remarkably in line with the implementation and spatial extent of the Grain to Green project. We stratify the counties of the study area into four groups according to the extent of Grain to Green conservation areas and find distinct differences between the groups. Vegetation trends are similar prior to afforestation activities (1982-2000), but clearly diverge at a later stage, following the spatial extent of conservation areas. Moreover, vegetation cover dynamics were increasingly decoupled from climatic influence in areas of high conservation efforts. Whereas both vegetation resilience and resistance were considerably improved in areas with large conservation efforts thereby showing an increase in ecosystem stability, ongoing degradation and an amplified sensitivity to climate variability was found in areas with limited project implementation. 
Our study concludes that large scale conservation projects can regionally contribute to a greening Earth and are able to mitigate desertification by increasing the vegetation cover and reducing the ecosystem sensitivity to climate change, however, degradation remains a serious issue in the karst ecosystem of southwest China.
NASA Technical Reports Server (NTRS)
Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.
1975-01-01
The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.
Culture and cognition in health systems change.
Evans, Jenna M; Baker, G Ross; Berta, Whitney; Barnsley, Jan
2015-01-01
Large-scale change involves modifying not only the structures and functions of multiple organizations, but also the mindsets and behaviours of diverse stakeholders. This paper focuses on the latter: the informal, less visible, and often neglected psychological and social factors implicated in change efforts. The purpose of this paper is to differentiate between the concepts of organizational culture and mental models, to argue for the value of applying a shared mental models (SMM) framework to large-scale change, and to suggest directions for future research. The authors provide an overview of SMM theory and use it to explore the dynamic relationship between culture and cognition. The contributions and limitations of the theory to change efforts are also discussed. Culture and cognition are complementary perspectives, providing insight into two different levels of the change process. SMM theory draws attention to important questions that add value to existing perspectives on large-scale change. The authors outline these questions for future research and argue that research and practice in this domain may be best served by focusing less on the potentially narrow goal of "achieving consensus" and more on identifying, understanding, and managing cognitive convergences and divergences as part of broader research and change management programmes. Drawing from both cultural and cognitive paradigms can provide researchers with a more complete picture of the processes by which coordinated action are achieved in complex change initiatives in the healthcare domain.
2011-12-01
road oil, aviation gasoline, kerosene, lubricants, naphtha-type jet fuel, pentanes plus, petrochemical feedstocks, special naphthas, still gas... refinery gas), waxes, miscellaneous products, and crude oil burned as fuel. Figure 2. Uses of Oil (EIA, 2010a, p. 148) There is no significant body of...1. Large-Scale Efforts in the 1990s There have been efforts in the past to bring about the adoption of EVs or other zero- emissions vehicles. There
2008-11-01
In 2004, senior military commanders called for a “ Manhattan Project -like” effort against IEDs, and the Department of Defense (DOD) later...reference to the Manhattan Project by U.S. Central Command leaders was meant to convey the need for a large-scale, focused effort, combining the nation’s...of a highway in southern Iraq. USA Photo/Master Sergeant Lek Mateo. 15 JIEDDO TODAY We’ve got to have something like the Manhattan Project . General
Barbera, J; Macintyre, A; Gostin, L; Inglesby, T; O'Toole, T; DeAtley, C; Tonat, K; Layton, M
2001-12-05
Concern for potential bioterrorist attacks causing mass casualties has increased recently. Particular attention has been paid to scenarios in which a biological agent capable of person-to-person transmission, such as smallpox, is intentionally released among civilians. Multiple public health interventions are possible to effect disease containment in this context. One disease control measure that has been regularly proposed in various settings is the imposition of large-scale or geographic quarantine on the potentially exposed population. Although large-scale quarantine has not been implemented in recent US history, it has been used on a small scale in biological hoaxes, and it has been invoked in federally sponsored bioterrorism exercises. This article reviews the scientific principles that are relevant to the likely effectiveness of quarantine, the logistic barriers to its implementation, legal issues that a large-scale quarantine raises, and possible adverse consequences that might result from quarantine action. Imposition of large-scale quarantine-compulsory sequestration of groups of possibly exposed persons or human confinement within certain geographic areas to prevent spread of contagious disease-should not be considered a primary public health strategy in most imaginable circumstances. In the majority of contexts, other less extreme public health actions are likely to be more effective and create fewer unintended adverse consequences than quarantine. Actions and areas for future research, policy development, and response planning efforts are provided.
Kongelf, Anine; Bandewar, Sunita V S; Bharat, Shalini; Collumbien, Martine
2015-01-01
In the last decade, community mobilisation (CM) interventions targeting female sex workers (FSWs) have been scaled-up in India's national response to the HIV epidemic. This included the Bill and Melinda Gates Foundation's Avahan programme which adopted a business approach to plan and manage implementation at scale. With the focus of evaluation efforts on measuring effectiveness and health impacts there has been little analysis thus far of the interaction of the CM interventions with the sex work industry in complex urban environments. Between March and July 2012 semi-structured, in-depth interviews and focus group discussions were conducted with 63 HIV intervention implementers, to explore challenges of HIV prevention among FSWs in Mumbai. A thematic analysis identified contextual factors that impact CM implementation. Large-scale interventions are not only impacted by, but were shown to shape the dynamic social context. Registration practices and programme monitoring were experienced as stigmatising, reflected in shifting client preferences towards women not disclosing as 'sex workers'. This combined with urban redevelopment and gentrification of traditional red light areas, forcing dispersal and more 'hidden' ways of solicitation, further challenging outreach and collectivisation. Participants reported that brothel owners and 'pimps' continued to restrict access to sex workers and the heterogeneous 'community' of FSWs remains fragmented with high levels of mobility. Stakeholder engagement was poor and mobilising around HIV prevention not compelling. Interventions largely failed to respond to community needs as strong target-orientation skewed activities towards those most easily measured and reported. Large-scale interventions have been impacted by and contributed to an increasingly complex sex work environment in Mumbai, challenging outreach and mobilisation efforts. 
Sex workers remain a vulnerable and disempowered group needing continued support and more comprehensive services.
A state-based national network for effective wildlife conservation
Meretsky, Vicky J.; Maguire, Lynn A.; Davis, Frank W.; Stoms, David M.; Scott, J. Michael; Figg, Dennis; Goble, Dale D.; Griffith, Brad; Henke, Scott E.; Vaughn, Jacqueline; Yaffee, Steven L.
2012-01-01
State wildlife conservation programs provide a strong foundation for biodiversity conservation in the United States, building on state wildlife action plans. However, states may miss the species that are at the most risk at rangewide scales, and threats such as novel diseases and climate change increasingly act at regional and national levels. Regional collaborations among states and their partners have had impressive successes, and several federal programs now incorporate state priorities. However, regional collaborations are uneven across the country, and no national counterpart exists to support efforts at that scale. A national conservation-support program could fill this gap and could work across the conservation community to identify large-scale conservation needs and support efforts to meet them. By providing important information-sharing and capacity-building services, such a program would advance collaborative conservation among the states and their partners, thus increasing both the effectiveness and the efficiency of conservation in the United States.
The effects of climate change associated abiotic stresses on maize phytochemical defenses
USDA-ARS?s Scientific Manuscript database
Reliable large-scale maize production is an essential component of global food security; however, sustained efforts are needed to ensure optimized resilience under diverse crop stress conditions. Climate changes are expected to increase the frequency and intensity of both abiotic and biotic stress. ...
Slowing the flow: Setting priorities and defining success in Lake Superior’s South Shore watersheds
For over 60 years, watershed conservation efforts to improve water quality have largely focused on restoring and protecting hydrology under the mantra “slow the flow”. This approach seeks to reduce peak flows with landscape scale watershed restoration approaches that ...
The National Near-Road Mobile Source Air Toxics Study: Las Vegas
EPA, in collaboration with FHWA, has been involved in a large-scale monitoring research study in an effort to characterize highway vehicle emissions in a near-road environment. The pollutants of interest include particulate matter with aerodynamic diameter less than 2.5 microns ...
Student Engagement in Inclusive Classrooms
ERIC Educational Resources Information Center
Rangvid, Beatrice Schindler
2018-01-01
Using large scale survey data, I document substantial differences in behavioural engagement (defined as involvement in academic and social activities, cooperative participation in learning, and motivation and effort) and emotional engagement levels (defined as a sense of belonging and well-being at school) between students with and without special…
pre-feasibility analysis; wind data analysis; the small wind turbine certification process; economic Regional Test Center effort, analysis of the potential economic impact of large-scale MHK deployment off pre-feasibility analysis. Tony is an engineer officer in the Army Reserve. He has deployed twice
ERIC Educational Resources Information Center
Crane, Earl Newell
2013-01-01
The research problem that inspired this effort is the challenge of managing the security of systems in large-scale heterogeneous networked environments. Human intervention is slow and limited: humans operate at much slower speeds than networked computer communications and there are few humans associated with each network. Enabling each node in the…
Progress toward a low budget reference grade genome assembly
USDA-ARS?s Scientific Manuscript database
Reference quality de novo genome assemblies were once solely the domain of large, well-funded genome projects. While next-generation short read technology removed some of the cost barriers, accurate chromosome-scale assembly remains a real challenge. Here we present efforts to de novo assemble the...
Ecological Regional Analysis Applied to Campus Sustainability Performance
ERIC Educational Resources Information Center
Weber, Shana; Newman, Julie; Hill, Adam
2017-01-01
Purpose: Sustainability performance in higher education is often evaluated at a generalized large scale. It remains unknown to what extent campus efforts address regional sustainability needs. This study begins to address this gap by evaluating trends in performance through the lens of regional environmental characteristics.…
Monitoring aquatic resources for regional assessments requires an accurate and comprehensive inventory of the resource and useful classification of exosystem similarities. Our research effort to create an electronic database and work with various ways to classify coastal wetlands...
Strategic Planning Tools for Large-Scale Technology-Based Assessments
ERIC Educational Resources Information Center
Koomen, Marten; Zoanetti, Nathan
2018-01-01
Education systems are increasingly being called upon to implement new technology-based assessment systems that generate efficiencies, better meet changing stakeholder expectations, or fulfil new assessment purposes. These assessment systems require coordinated organisational effort to implement and can be expensive in time, skill and other…
An Overview of Science Education in the Caribbean: Research, Policy and Practice.
ERIC Educational Resources Information Center
Sweeney, Aldrin E.
2003-01-01
Analyzes science education in the Caribbean and provides examples of science education policy and practice. Emphasizes large-scale national efforts in Barbados, Bermuda, and Jamaica. Discusses and provides recommendations for future directions in science education in these countries. (Contains 88 references.) (Author/NB)
Counihan, Timothy D.; Waite, Ian R.; Casper, Andrew F.; Ward, David L.; Sauer, Jennifer S.; Irwin, Elise R.; Chapman, Colin G.; Ickes, Brian; Paukert, Craig P.; Kosovich, John J.; Bayer, Jennifer M.
2018-01-01
Understanding trends in the diverse resources provided by large rivers will help balance tradeoffs among stakeholders and inform strategies to mitigate the effects of landscape scale stressors such as climate change and invasive species. Absent a cohesive coordinated effort to assess trends in important large river resources, a logical starting point is to assess our ability to draw inferences from existing efforts. In this paper, we use a common analytical framework to analyze data from five disparate fish monitoring programs to better understand the nature of spatial and temporal trends in large river fish assemblages. We evaluated data from programs that monitor fishes in the Colorado, Columbia, Illinois, Mississippi, and Tallapoosa rivers using non-metric dimensional scaling ordinations and associated tests to evaluate trends in fish assemblage structure and native fish biodiversity. Our results indicate that fish assemblages exhibited significant spatial and temporal trends in all five of the rivers. We also document native species diversity trends that were variable within and between rivers and generally more evident in rivers with higher species richness and programs of longer duration. We discuss shared and basin-specific landscape level stressors. Having a basic understanding of the nature and extent of trends in fish assemblages is a necessary first step towards understanding factors affecting biodiversity and fisheries in large rivers.
Waite, Ian R.; Casper, Andrew F.; Ward, David L.; Sauer, Jennifer S.; Irwin, Elise R.; Chapman, Colin G.; Ickes, Brian S.; Paukert, Craig P.; Kosovich, John J.; Bayer, Jennifer M.
2018-01-01
Understanding trends in the diverse resources provided by large rivers will help balance tradeoffs among stakeholders and inform strategies to mitigate the effects of landscape scale stressors such as climate change and invasive species. Absent a cohesive coordinated effort to assess trends in important large river resources, a logical starting point is to assess our ability to draw inferences from existing efforts. In this paper, we use a common analytical framework to analyze data from five disparate fish monitoring programs to better understand the nature of spatial and temporal trends in large river fish assemblages. We evaluated data from programs that monitor fishes in the Colorado, Columbia, Illinois, Mississippi, and Tallapoosa rivers using non-metric dimensional scaling ordinations and associated tests to evaluate trends in fish assemblage structure and native fish biodiversity. Our results indicate that fish assemblages exhibited significant spatial and temporal trends in all five of the rivers. We also document native species diversity trends that were variable within and between rivers and generally more evident in rivers with higher species richness and programs of longer duration. We discuss shared and basin-specific landscape level stressors. Having a basic understanding of the nature and extent of trends in fish assemblages is a necessary first step towards understanding factors affecting biodiversity and fisheries in large rivers. PMID:29364953
Physical habitat monitoring strategy (PHAMS) for reach-scale restoration effectiveness monitoring
Jones, Krista L.; O'Daniel, Scott J.; Beechie, Tim J.; Zakrajsek, John; Webster, John G.
2015-04-14
Habitat restoration efforts by the Confederated Tribes of the Umatilla Indian Reservation (CTUIR) have shifted from the site scale (1-10 meters) to the reach scale (100-1,000 meters). This shift was in response to the growing scientific emphasis on process-based restoration and to support from the 2007 Accords Agreement with the Bonneville Power Administration. With the increased size of restoration projects, the CTUIR and other agencies are in need of applicable monitoring methods for assessing large-scale changes in river and floodplain habitats following restoration. The goal of the Physical Habitat Monitoring Strategy is to outline methods that are useful for capturing reach-scale changes in surface and groundwater hydrology, geomorphology, hydrologic connectivity, and riparian vegetation at restoration projects. The Physical Habitat Monitoring Strategy aims to avoid duplication with existing regional effectiveness monitoring protocols by identifying complimentary reach-scale metrics and methods that may improve the ability of CTUIR and others to detect instream and riparian changes at large restoration projects.
How do you modernize a health service? A realist evaluation of whole-scale transformation in london.
Greenhalgh, Trisha; Humphrey, Charlotte; Hughes, Jane; Macfarlane, Fraser; Butler, Ceri; Pawson, Ray
2009-06-01
Large-scale, whole-systems interventions in health care require imaginative approaches to evaluation that go beyond assessing progress against predefined goals and milestones. This project evaluated a major change effort in inner London, funded by a charitable donation of approximately $21 million, which spanned four large health care organizations, covered three services (stroke, kidney, and sexual health), and sought to "modernize" these services with a view to making health care more efficient, effective, and patient centered. This organizational case study draws on the principles of realist evaluation, a largely qualitative approach that is centrally concerned with testing and refining program theories by exploring the complex and dynamic interaction among context, mechanism, and outcome. This approach used multiple data sources and methods in a pragmatic and reflexive manner to build a picture of the case and follow its fortunes over the three-year study period. The methods included ethnographic observation, semistructured interviews, and scrutiny of documents and other contemporaneous materials. As well as providing ongoing formative feedback to the change teams in specific areas of activity, we undertook a more abstract, interpretive analysis, which explored the context-mechanism-outcome relationship using the guiding question "what works, for whom, under what circumstances?" In this example of large-scale service transformation, numerous projects and subprojects emerged, fed into one another, and evolved over time. Six broad mechanisms appeared to be driving the efforts of change agents: integrating services across providers, finding and using evidence, involving service users in the modernization effort, supporting self-care, developing the workforce, and extending the range of services. Within each of these mechanisms, different teams chose widely differing approaches and met with differing success. 
The realist analysis of the fortunes of different subprojects identified aspects of context and mechanism that accounted for observed outcomes (both intended and unintended). This study was one of the first applications of realist evaluation to a large-scale change effort in health care. Even when an ambitious change program shifts from its original goals and meets unforeseen challenges (indeed, precisely because the program morphs and adapts over time), realist evaluation can draw useful lessons about how particular preconditions make particular outcomes more likely, even though it cannot produce predictive guidance or a simple recipe for success. Noting recent calls by others for the greater use of realist evaluation in health care, this article considers some of the challenges and limitations of this method in the light of this experience and suggests that its use will require some fundamental changes in the worldview of some health services researchers.
The future of emissions trading in light of the acid rain experience
DOE Office of Scientific and Technical Information (OSTI.GOV)
McLean, B.J.; Rico, R.
1995-12-31
The idea of emissions trading was developed more than two decades ago by environmental economists eager to provide new ideas for how to improve the efficiency of environmental protection. However, early emissions trading efforts were built on the historical {open_quotes}command and control{close_quotes} infrastructure which has dominated U.S. environmental protection until today. The {open_quotes}command and control{close_quotes} model initially had advantages that were of a very pragmatic character: it assured large pollution reductions in a time when large, cheap reductions were available and necessary; and it did not require a sophisticated government infrastructure. Within the last five years, large-scale emission trading programsmore » have been successfully designed and started that are fundamentally different from the earlier efforts, creating a new paradigm for environmental control just when our understanding of environmental problems is changing as well. The purpose of this paper is to focus on the largest national-scale program--the Acid Rain Program--and from that experience, forecast when emission trading programs may be headed based on our understanding of the factors currently influencing environmental management. The first section of this paper will briefly review the history of emissions trading programs, followed by a summary of the features of the Acid Rain Program, highlighting those features that distinguish it from previous efforts. The last section addresses the opportunities for emissions trading (and its probable future directions).« less
NASA Astrophysics Data System (ADS)
Howard, E. A.; Coleman, K. J.; Barford, C. L.; Kucharik, C.; Foley, J. A.
2005-12-01
Understanding environmental problems that cross physical and disciplinary boundaries requires a more holistic view of the world - a "systems" approach. Yet it is a challenge for many learners to start thinking this way, particularly when the problems are large in scale and not easily visible. We will describe our online university course, "Humans and the Changing Biosphere," which takes a whole-systems perspective for teaching regional to global-scale environmental science concepts, including climate, hydrology, ecology, and human demographics. We will share our syllabus and learning objectives and summarize our efforts to incorporate "best" practices for online teaching. We will describe challenges we have faced, and our efforts to reach different learner types. Our goals for this presentation are: (1) to communicate how a systems approach ties together environmental sciences (including climate, hydrology, ecology, biogeochemistry, and demography) that are often taught as separate disciplines; (2) to generate discussion about challenges of teaching large-scale environmental processes; (3) to share our experiences in teaching these topics online; (4) to receive ideas and feedback on future teaching strategies. We will explain why we developed this course online, and share our experiences about benefits and challenges of teaching over the web - including some suggestions about how to use technology to supplement face-to-face learning experiences (and vice versa). We will summarize assessment data about what students learned during the course, and discuss key misconceptions and barriers to learning. We will highlight the role of an online discussion board in creating classroom community, identifying misconceptions, and engaging different types of learners.
Cresswell, Kathrin; Morrison, Zoe; Crowe, Sarah; Robertson, Ann; Sheikh, Aziz
2011-01-01
The absence of meaningful end user engagement has repeatedly been highlighted as a key factor contributing to 'failed' implementations of electronic health records (EHRs), but achieving this is particularly challenging in the context of national scale initiatives. In 2002, the National Health Service (NHS) embarked on a so-called 'top-down' national implementation strategy aimed at introducing commercial, centrally procured, EHRs into hospitals throughout England. We aimed to examine approaches to, and experiences of, user engagement in the context of a large-scale EHR implementation across purposefully selected hospital care providers implementing early versions of nationally procured software. We conducted a qualitative, case-study based, socio-technically informed, longitudinal investigation, purposefully sampling and collecting data from four hospitals. Our data comprised a total of 123 semi-structured interviews with users and managers, 15 interviews with additional stakeholders, 43 hours of non-participant observations of meetings and system use, and relevant organisation-specific documents from each case study site. Analysis was thematic, building on an existing model of user engagement that was originally developed in the context of studying the implementation of relatively simple technologies in commercial settings. NVivo8 software was used to facilitate coding. Despite an enduring commitment to the vision of shared EHRs and an appreciation of their potential benefits, meaningful end user engagement was never achieved. Hospital staff were not consulted in systems choice, leading to frustration; they were then further alienated by the implementation of systems that they perceived as inadequately customised. Various efforts to achieve local engagement were attempted, but these were in effect risk mitigation strategies. 
We found the role of clinical champions to be important in these engagement efforts, but progress was hampered by the hierarchical structures within healthcare teams. As a result, engagement efforts focused mainly on clinical staff with inadequate consideration of management and administrative staff. This work has allowed us to further develop an existing model of user engagement from the commercial sector and adapt it to inform user engagement in the context of large-scale eHealth implementations. By identifying key points of possible engagement, disengagement and re-engagement, this model will we hope both help those planning similar large-scale EHR implementation efforts and act as a much needed catalyst to further research in this neglected field of enquiry.
NASA Technical Reports Server (NTRS)
Saunders, J. D.; Stueber, T. J.; Thomas, S. R.; Suder, K. L.; Weir, L. J.; Sanders, B. W.
2012-01-01
Status on an effort to develop Turbine Based Combined Cycle (TBCC) propulsion is described. This propulsion technology can enable reliable and reusable space launch systems. TBCC propulsion offers improved performance and safety over rocket propulsion. The potential to realize aircraft-like operations and reduced maintenance are additional benefits. Among the most critical TBCC enabling technologies are: 1) mode transition from turbine to scramjet propulsion, 2) high Mach turbine engines and 3) TBCC integration. To address these TBCC challenges, the effort is centered on a propulsion mode transition experiment and includes analytical research. The test program, the Combined-Cycle Engine Large Scale Inlet Mode Transition Experiment (CCE LIMX), was conceived to integrate TBCC propulsion with proposed hypersonic vehicles. The goals address: (1) dual inlet operability and performance, (2) mode-transition sequences enabling a switch between turbine and scramjet flow paths, and (3) turbine engine transients during transition. Four test phases are planned from which a database can be used to both validate design and analysis codes and characterize operability and integration issues for TBCC propulsion. In this paper we discuss the research objectives, features of the CCE hardware and test plans, and status of the parametric inlet characterization testing which began in 2011. This effort is sponsored by the NASA Fundamental Aeronautics Hypersonics project.
NASP and ISPA Response to the Japanese Natural Disaster
ERIC Educational Resources Information Center
Pfohl, Bill; Cowan, Katherine
2011-01-01
The authors have worked together with the NASP (National Association of School Psychologists) National Emergency Assistance Team (NEAT) for a decade to help coordinate communications around large-scale crisis response efforts. The massive earthquake and tsunami that devastated the northeastern part of Japan and the subsequent response represented…
Turning of COGS moves forward findings for hormonally mediated cancers.
Sakoda, Lori C; Jorgenson, Eric; Witte, John S
2013-04-01
The large-scale Collaborative Oncological Gene-environment Study (COGS) presents new findings that further characterize the genetic bases of breast, ovarian and prostate cancers. We summarize and provide insights into this collection of papers from COGS and discuss the implications of the results and future directions for such efforts.
Effects of Landscape Conditions and Management Practices on Lakes in Northeastern USA.
Lakes continue to face escalating pressures associated with land cover change and growing human populations. The U.S. EPA National Lakes Assessment, which sampled 1,028 lakes during the summer of 2007 using a probabilistic survey, was the first large scale effort to determine the...
Multi-profile analysis of soil moisture within the U.S. Climate Reference Network
USDA-ARS?s Scientific Manuscript database
Soil moisture estimates are crucial for hydrologic modeling and agricultural decision-support efforts. These measurements are also pivotal for long-term inquiries regarding the impacts of climate change and the resulting droughts over large spatial and temporal scales. However, it has only been t...
Students' Test Motivation in PISA: The Case of Norway
ERIC Educational Resources Information Center
Hopfenbeck, Therese N.; Kjaernsli, Marit
2016-01-01
Do students make their best effort in large-scale assessment studies such as the "Programme for International Student Assessment" (PISA)? Despite six cycles of PISA surveys from 2000 to 2015, empirical studies regarding students' test motivation and experience of the tests are sparse. The present study examines students' test motivation…
Implementing Technology: A Change Process
ERIC Educational Resources Information Center
Atwell, Nedra; Maxwell, Marge; Romero, Elizabeth
2008-01-01
The state of Kentucky has embarked upon a large scale systems change effort to integrate Universal Design for Learning (UDL) principles, including use of digital curriculum and computerized reading supports to improve overall student achievement. A major component of this initiative is the use of Read & Write Gold. As higher expectations are…
Plant community dynamics 25 years after juniper control
USDA-ARS?s Scientific Manuscript database
The expansion of piñon-juniper woodlands the past 100 to 150 years in the western United States has resulted in large scale efforts to kill trees and recover sagebrush steppe rangelands. Western juniper (Juniperus occidentalis spp. occidentalis Hook.) expansion in the northern Great Basin has reduc...
Portfolios in Practice: What Is a Portfolio?
ERIC Educational Resources Information Center
Arter, Judith A.
A consortium effort sponsored by the Northwest Regional Evaluation Association has arrived at a workable definition of a portfolio that takes into account the viewpoints of teachers and those interested in large-scale assessment. The modified definition states that: "A student portfolio is a purposeful collection of student work that tells…
Bottomland hardwood afforestation: State of the art
Emile S. Gardiner; D. Ramsey Russell; Mark Oliver; Lamar C. Dorris
2000-01-01
Over the past decade, land managers have implemented large-scale afforestation operations across the Southern United States to rehabilitate agricultural land historically converted from bottomland hardwood forest cover types. These afforestation efforts were initially concentrated on public land managed by State or Federal Government agencies, but later shifted...
Fundraising in Community College Foundations. ERIC Digest.
ERIC Educational Resources Information Center
Schuyler, Gwyer
In response to declining local and state appropriations for public education, community colleges have taken steps to formalize fundraising efforts by creating institutional foundations as recipients of tax-deductible contributions. Large-scale external fundraising at community colleges began as a result of the 1965 Higher Education Act and the…
Building software tools to help contextualize and interpret monitoring data
USDA-ARS?s Scientific Manuscript database
Even modest monitoring efforts at landscape scales produce large volumes of data. These are most useful if they can be interpreted relative to land potential or other similar sites. However, for many ecological systems reference conditions may not be defined or are poorly described, which hinders und...
Plant succession and approaches to community restoration
Bruce A. Roundy
2005-01-01
The processes of vegetation change over time, or plant succession, are also the processes involved in plant community restoration. Restoration efforts attempt to use designed disturbance, seedbed preparation and sowing methods, and selection of adapted and compatible native plant materials to enhance ecological function. The large scale of wildfires and weed invasion...
Evaluation of the Teaching American History Program
ERIC Educational Resources Information Center
Humphrey, Daniel C.; Chang-Ross, Christopher; Donnelly, Mary Beth; Hersh, Lauren; Skolnik, Heidi
2005-01-01
Nearly 20 years ago, the first national assessment of student achievement in U.S. history yielded disappointing results. Although policy-makers and researchers expressed great concern about the low scores, the federal government did not undertake large-scale efforts to address poor student performance, and few research dollars were dedicated to…
Daniel.Studer@nrel.gov | 303-275-4368 Daniel joined NREL in 2009. As a member of the Commercial Buildings group, he works with EnergyPlus to identify large-scale opportunities for reducing and optimizing commercial building energy consumption. Recently, Daniel led NREL's commercial building workforce development efforts, and he is leading...
College-Bound Communities. Lumina Foundation Focus™. Summer 2014
ERIC Educational Resources Information Center
Giegerich, Steve
2014-01-01
Research shows a direct correlation between thriving cities and high levels of college-level learning. Regions with robust levels of educational attainment have stronger economies, greater individual earning power, and better quality of life. The Lumina Foundation is actively supporting large-scale efforts in 55 metro regions--to help adults…
REGRESSION MODELS THAT RELATE STREAMS TO WATERSHEDS: COPING WITH NUMEROUS, COLLINEAR PREDICTORS
GIS efforts can produce a very large number of watershed variables (climate, land use/land cover and topography, all defined for multiple areas of influence) that could serve as candidate predictors in a regression model of reach-scale stream features. Invariably, many of these ...
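The coping problem this record names — many candidate predictors that are strongly collinear — has standard remedies such as ridge regression, which stabilizes coefficient estimates by penalizing the near-singular directions of the design matrix. The sketch below is purely illustrative (the predictor names, data, and penalty are invented, not taken from the study):

```python
# Hypothetical sketch: ridge regression with two nearly collinear
# watershed predictors (forest cover summarized over two overlapping
# areas of influence) plus elevation. All data are simulated.
import numpy as np
from numpy.linalg import solve

rng = np.random.default_rng(0)
n = 200
forest_pct = rng.uniform(0, 100, n)
forest_buffer_pct = forest_pct + rng.normal(0, 2, n)  # near-copy of forest_pct
elevation = rng.uniform(100, 900, n)
X = np.column_stack([forest_pct, forest_buffer_pct, elevation])
y = 0.05 * forest_pct - 0.002 * elevation + rng.normal(0, 0.5, n)

# Standardize predictors, center the response, then solve the ridge
# normal equations: beta = (X'X + lambda * I)^-1 X'y
Xs = (X - X.mean(0)) / X.std(0)
ys = y - y.mean()
lam = 10.0
beta = solve(Xs.T @ Xs + lam * np.eye(Xs.shape[1]), Xs.T @ ys)

# The penalty splits the shared signal between the two collinear
# predictors instead of producing wild, opposite-signed coefficients.
print(beta)
```

Ordinary least squares on the same design would be numerically unstable in the forest/forest-buffer direction; the ridge penalty trades a little bias for much lower coefficient variance.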
Keep New Mexico Beautiful, Recycling Project Successful
ERIC Educational Resources Information Center
Bickel, Victor R.
1975-01-01
Through the efforts of community groups, the support of local industries, and the state government, Keep New Mexico Beautiful, Inc. (KNMB) is now operating a large-scale recycling business. KNMB has been able to save tons of natural resources, provide local employment, and educate the public to this environmental concern. (MA)
Absorptive Capacity: A Conceptual Framework for Understanding District Central Office Learning
ERIC Educational Resources Information Center
Farrell, Caitlin C.; Coburn, Cynthia E.
2017-01-01
Globally, school systems are pressed to engage in large-scale school improvement. In the United States and other countries, school district central offices and other local governing agencies often engage with external organizations and individuals to support such educational change efforts. However, initiatives with external partners are not…
Elementary Administrators' Mathematics Supervision and Self-Efficacy Development
ERIC Educational Resources Information Center
Johnson, Kelly M. Gomez
2017-01-01
Mathematics curriculum reform is changing the content and resources in today's elementary classrooms as well as the culture of mathematics teaching and learning. Administrators face the challenge of leading large-scale curricular change efforts with limited prior knowledge or experiences with reform curricula structures. Administrators, as the…
A Short History of Performance Assessment: Lessons Learned.
ERIC Educational Resources Information Center
Madaus, George F.; O'Dwyer, Laura M.
1999-01-01
Places performance assessment in the context of high-stakes uses, describes underlying technologies, and outlines the history of performance testing from 210 B.C.E. to the present. Historical issues of fairness, efficiency, cost, and infrastructure influence contemporary efforts to use performance assessments in large-scale, high-stakes testing…
A Systemically Collaborative Approach to Achieving Equity in Higher Education
ERIC Educational Resources Information Center
Prystowsky, Richard J.
2018-01-01
Colleges and universities have long recognized the need to address inequities affecting students from underrepresented or underserved groups. Despite efforts undertaken by dedicated individuals, large-scale, national change in this area has not been realized. In this article, we address two major factors underlying this disappointing result (the…
Large-Scale Implementation of Formative Assessment Practices in an Examination-Oriented Culture
ERIC Educational Resources Information Center
Ratnam-Lim, Christina Tong Li; Tan, Kelvin Heng Kiat
2015-01-01
Singapore's education system has often been characterised as exam-oriented. This paper describes efforts ("windmills") made by the Government to constructively respond to the "winds of change" in the education system. A committee called the Primary Education Review and Implementation (PERI) Committee was appointed to study and…
Case study: Prioritization strategies for reforestation of minelands to benefit Cerulean Warblers
McDermott, Molly E.; Shumar, Matthew B.; Wood, Petra Bohall
2013-01-01
The central Appalachian landscape is being heavily altered by surface coal mining. The practice of Mountaintop Removal/Valley Fill (MTRVF) mining has transformed large areas of mature forest to non-forest and created much forest edge, affecting habitat quality for mature forest wildlife. The Appalachian Regional Reforestation Initiative is working to restore mined areas to native hardwood forest conditions, and strategies are needed to prioritize restoration efforts for wildlife. We present mineland reforestation guidelines for the imperiled Cerulean Warbler, considered a useful umbrella species, in its breeding range. In 2009, we surveyed forest predicted to have Cerulean Warblers near mined areas in the MTRVF region of West Virginia and Kentucky. We visited 36 transect routes and completed songbird surveys on 151 points along these routes. Cerulean Warblers were present at points with fewer large-scale canopy disturbances and more mature oak-hickory forest. We tested the accuracy of a predictive map for this species and demonstrated that it can be useful to guide reforestation efforts. We then developed a map of hot spot locations that can be used to determine potential habitat suitability. Restoration efforts would have greatest benefit for Cerulean Warblers and other mature forest birds if concentrated near a relative-abundance hot spot, on north- and east-facing ridgetops surrounded by mature deciduous forest, and prioritized to reduce edges and connect isolated forest patches. Our multi-scale approach for prioritizing restoration efforts using an umbrella species may be applied to restore habitat impacted by a variety of landscape disturbances.
Haugum, Mona; Danielsen, Kirsten; Iversen, Hilde Hestad; Bjertnaes, Oyvind
2014-12-01
An important goal for national and large-scale surveys of user experiences is quality improvement. However, large-scale surveys are normally conducted by a professional external surveyor, creating an institutionalized division between the measurement of user experiences and the quality work that is performed locally. The aim of this study was to identify and describe scientific studies related to the use of national and large-scale surveys of user experiences in local quality work. Ovid EMBASE, Ovid MEDLINE, Ovid PsycINFO and the Cochrane Database of Systematic Reviews. Scientific publications about user experiences and satisfaction that addressed the extent to which data from national and other large-scale user experience surveys are used for local quality work in the health services. Themes of interest were identified and a narrative analysis was undertaken. Thirteen publications were included; all differed substantially in several characteristics. The results show that large-scale surveys of user experiences are used in local quality work. The types of follow-up activity varied considerably from conducting a follow-up analysis of user experience survey data to information sharing and more-systematic efforts to use the data as a basis for improving the quality of care. This review shows that large-scale surveys of user experiences are used in local quality work. However, there is a need for more, better and standardized research in this field. The considerable variation in follow-up activities points to the need for systematic guidance on how to use data in local quality work. © The Author 2014. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
Gergel, Sarah E.; Vincent, Amanda C. J.
2018-01-01
Locally sustainable resource extraction activities, at times, transform into ecologically detrimental enterprises. Understanding such transitions is a primary challenge for conservation and management of many ecosystems. In marine systems, over-exploitation of small-scale fisheries creates problems such as reduced biodiversity and lower catches. However, long-term documentation of how governance and associated changes in fishing gears may have contributed to such declines is often lacking. Using fisher interviews, we characterized fishing gear dynamics over 60 years (1950–2010) in a coral reef ecosystem in the Philippines subject to changing fishing regulations. In aggregate, fishers greatly diversified their use of fishing gears. However, most individual fishers used one or two gears at a time (mean number of fishing gears < 2 in all years). Individual fishing effort (days per year) was fairly steady over the study period, but cumulative fishing effort by all fishers increased 240%. In particular, we document large increases in total effort by fishers using nets and diving. Other fishing gears experienced less pronounced changes in total effort over time. Fishing intensified through escalating use of non-selective, active, and destructive fishing gears. We also found that policies promoting higher production over sustainability influenced the use of fishing gears, with changes in gear use persisting decades after those same policies were stopped. Our quantitative evidence shows dynamic changes in fishing gear use over time and indicates that gears used in contemporary small-scale fisheries impact oceans more than those used in earlier decades. PMID:29538370
Dorazio, Robert; Delampady, Mohan; Dey, Soumen; Gopalaswamy, Arjun M.; Karanth, K. Ullas; Nichols, James D.
2017-01-01
Conservationists and managers are continually under pressure from the public, the media, and political policy makers to provide “tiger numbers,” not just for protected reserves, but also for large spatial scales, including landscapes, regions, states, nations, and even globally. Estimating the abundance of tigers within relatively small areas (e.g., protected reserves) is becoming increasingly tractable (see Chaps. 9 and 10), but doing so for larger spatial scales still presents a formidable challenge. Those who seek “tiger numbers” are often not satisfied by estimates of tiger occupancy alone, regardless of the reliability of the estimates (see Chaps. 4 and 5). As a result, wherever tiger conservation efforts are underway, either substantially or nominally, scientists and managers are frequently asked to provide putative large-scale tiger numbers based either on a total count or on an extrapolation of some sort (see Chaps. 1 and 2).
Raising awareness of the importance of funding for tuberculosis small-molecule research.
Riccardi, Giovanna; Old, Iain G; Ekins, Sean
2017-03-01
Tuberculosis (TB) drug discovery research is hampered by several factors, but as in many research areas, the available funding is insufficient to support the needs of research and development. Recent years have seen various large collaborative efforts involving public-private partnerships, mimicking the situation during the golden age of antibiotic drug discovery during the 1950s and 1960s. The large-scale collaborative efforts funded by the European Union (EU) are now subject to diminishing financial support. As a result, TB researchers are increasingly looking for novel forms of funding, such as crowdfunding, to fill this gap. Any potential solution will require a careful reassessment of the incentives to encourage additional organizations to provide funding. Copyright © 2016 Elsevier Ltd. All rights reserved.
Engineering large-scale agent-based systems with consensus
NASA Technical Reports Server (NTRS)
Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.
1994-01-01
The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.
Report on the Human Genome Initiative for the Office of Health and Environmental Research
DOE R&D Accomplishments Database
Tinoco, I.; Cahill, G.; Cantor, C.; Caskey, T.; Dulbecco, R.; Engelhardt, D. L.; Hood, L.; Lerman, L. S.; Mendelsohn, M. L.; Sinsheimer, R. L.; Smith, T.; Soll, D.; Stormo, G.; White, R. L.
1987-04-01
The report urges DOE and the Nation to commit to a large, multi-year, multidisciplinary, technological undertaking to order and sequence the human genome. This effort will first require significant innovation in general capability to manipulate DNA, major new analytical methods for ordering and sequencing, theoretical developments in computer science and mathematical biology, and great expansions in our ability to store and manipulate the information and to interface it with other large and diverse genetic databases. The actual ordering and sequencing involves the coordinated processing of some 3 billion bases from a reference human genome. Science is poised on the rudimentary edge of being able to read and understand human genes. A concerted, broadly based, scientific effort to provide new methods of sufficient power and scale should transform this activity from an inefficient one-gene-at-a-time, single laboratory effort into a coordinated, worldwide, comprehensive reading of "the book of man". The effort will be extraordinary in scope and magnitude, but so will be the benefit to biological understanding, new technology and the diagnosis and treatment of human disease.
Fabio, Anthony; Geller, Ruth; Bazaco, Michael; Bear, Todd M; Foulds, Abigail L; Duell, Jessica; Sharma, Ravi
2015-01-01
Emerging research highlights the promise of community- and policy-level strategies in preventing youth violence. Large-scale economic developments, such as sports and entertainment arenas and casinos, may improve the living conditions, economics, public health, and overall wellbeing of area residents and may influence rates of violence within communities. To assess the effect of community economic development efforts on neighborhood residents' perceptions on violence, safety, and economic benefits. Telephone survey in 2011 using a listed sample of randomly selected numbers in six Pittsburgh neighborhoods. Descriptive analyses examined measures of perceived violence and safety and economic benefit. Responses were compared across neighborhoods using chi-square tests for multiple comparisons. Survey results were compared to census and police data. Residents in neighborhoods with the large-scale economic developments reported more casino-specific and arena-specific economic benefits. However, 42% of participants in the neighborhood with the entertainment arena felt there was an increase in crime, and 29% of respondents from the neighborhood with the casino felt there was an increase. In contrast, crime decreased in both neighborhoods. Large-scale economic developments have a direct influence on the perception of violence, despite actual violence rates.
NASA Astrophysics Data System (ADS)
Akanda, A. S.; Jutla, A. S.; Islam, S.
2009-12-01
Despite ravaging the continents through seven global pandemics in past centuries, the seasonal and interannual variability of cholera outbreaks remains a mystery. Previous studies have focused on the role of various environmental and climatic factors, but provided little or no predictive capability. Recent findings suggest a more prominent role of large scale hydroclimatic extremes - droughts and floods - and attempt to explain the seasonality and the unique dual cholera peaks in the Bengal Delta region of South Asia. We investigate the seasonal and interannual nature of cholera epidemiology in three geographically distinct locations within the region to identify the larger scale hydroclimatic controls that can set the ecological and environmental ‘stage’ for outbreaks and have significant memory on a seasonal scale. Here we show that two distinctly different, pre and post monsoon, cholera transmission mechanisms related to large scale climatic controls prevail in the region. An implication of our findings is that extreme climatic events such as prolonged droughts, record floods, and major cyclones may cause major disruption in the ecosystem and trigger large epidemics. We postulate that a quantitative understanding of the large-scale hydroclimatic controls and dominant processes with significant system memory will form the basis for forecasting such epidemic outbreaks. A multivariate regression method using these predictor variables to develop probabilistic forecasts of cholera outbreaks will be explored. Forecasts from such a system with a seasonal lead-time are likely to have measurable impact on early cholera detection and prevention efforts in endemic regions.
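The kind of multivariate regression forecast proposed in this abstract can be sketched, very loosely, as a logistic regression mapping seasonal hydroclimatic predictors to an outbreak probability. This is not the authors' model; the predictors, data, and coefficients below are all invented for illustration:

```python
# Illustrative sketch only: logistic regression of outbreak occurrence
# on two hypothetical hydroclimatic indices. All data are simulated.
import numpy as np

rng = np.random.default_rng(1)
n = 300
drought_index = rng.normal(0, 1, n)  # hypothetical pre-monsoon dryness
flood_index = rng.normal(0, 1, n)    # hypothetical post-monsoon inundation
logit = -1.0 + 1.5 * drought_index + 1.2 * flood_index
outbreak = rng.random(n) < 1 / (1 + np.exp(-logit))  # simulated outcomes

# Fit by plain gradient ascent on the log-likelihood
X = np.column_stack([np.ones(n), drought_index, flood_index])
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.01 * X.T @ (outbreak - p) / n

# Probabilistic forecast for a hypothetical severe-drought season
p_new = 1 / (1 + np.exp(-(w @ [1.0, 2.0, 0.0])))
print(p_new)
```

The output is a probability rather than a yes/no call, which matches the abstract's emphasis on probabilistic forecasts with seasonal lead time.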
NASA Technical Reports Server (NTRS)
1973-01-01
The results are reported of the NASA/Drexel research effort which was conducted in two separate phases. The initial phase stressed exploration of the problem from the point of view of three primary research areas and the building of a multidisciplinary team. The final phase consisted of a clinical demonstration program in which the research associates consulted with the County Executive of New Castle County, Delaware, to aid in solving actual problems confronting the County Government. The three primary research areas of the initial phase are identified as technology, management science, and behavioral science. Five specific projects which made up the research effort are treated separately. A final section contains the conclusions drawn from total research effort as well as from the specific projects.
Bangladesh arsenic mitigation programs: lessons from the past
Milton, Abul Hasnat; Hore, Samar Kumar; Hossain, Mohammad Zahid; Rahman, Mahfuzar
2012-01-01
Ensuring access to safe drinking water by 2015 is a global commitment by the Millennium Development Goals (MDGs). In Bangladesh, significant achievements in providing safe water were made earlier by nationwide tubewell-installation programme. This achievement was overshadowed in 1993 by the presence of arsenic in underground water. A total of 6 million tubewells have been tested for arsenic since then, the results of which warranted immediate mitigation. Mitigation measures included tubewell testing and replacing; usage of deeper wells; surface water preservation and treatment; use of sanitary dug wells, river sand and pond sand filters; rainwater collection and storage; household-scale and large-scale arsenic filtrations; and rural pipeline water supply installation. Shallow tubewell installation was discouraged. Efforts have been made to increase people's awareness. This paper describes the lessons learned about mitigation efforts by the authors from experience of arsenic-related work. In spite of national mitigation plans and efforts, a few challenges still persist: inadequate coordination between stakeholders, differences in inter-sectoral attitudes, inadequate research to identify region-specific, suitable safe water options, poor quality of works by various implementing agencies, and inadequate dissemination of the knowledge and experiences to the people by those organizations. Issues such as long-time adaptation using ground water, poor surface water quality including bad smell and turbidity, and refusal to using neighbor's water have delayed mitigation measures so far. Region-specific mitigation water supply policy led by the health sector could be adopted with multisectoral involvement and responsibility. Large-scale piped water supply could be arranged through Public Private Partnerships (PPP) in new national approach. PMID:22558005
Pre-Launch Risk Reduction Activities Conducted at KSC for the International Space Station
NASA Technical Reports Server (NTRS)
Kirkpatrick, Paul
2011-01-01
In the development of any large scale space-based multi-piece assembly effort, planning must include provisions for testing and verification; not only of the individual pieces but also of the pieces together. Without such testing on the ground, the risk to cost, schedule and technical performance increases substantially. This paper will review the efforts undertaken by the International Space Station (ISS), including the International Partners, during the pre-launch phase, primarily at KSC, to reduce the risks associated with the on-orbit assembly and operation of the ISS.
The development of a solar-powered residential heating and cooling system
NASA Technical Reports Server (NTRS)
1974-01-01
Efforts to demonstrate the engineering feasibility of utilizing solar power for residential heating and cooling are described. These efforts were concentrated on the analysis, design, and test of a full-scale demonstration system which is currently under construction at the National Aeronautics and Space Administration, Marshall Space Flight Center, Huntsville, Alabama. The basic solar heating and cooling system under development utilizes a flat plate solar energy collector, a large water tank for thermal energy storage, heat exchangers for space heating and water heating, and an absorption cycle air conditioner for space cooling.
Update on worldwide efforts to prevent type 1 diabetes.
Skyler, Jay S
2008-12-01
This paper reviews worldwide efforts to interdict the type 1 diabetes (T1D) disease process, during the stage of evolution of the disease prior to the time of disease onset. The goal of intervention before disease onset is to arrest immune destruction and thus prevent or delay clinical disease. In this regard, there have been several large-scale multicenter randomized controlled clinical trials designed to prevent T1D. These have tested nicotinamide, parenteral insulin, oral insulin, nasal insulin, and the elimination of cow's milk from infant feeding.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shea, M.
1995-09-01
The proper isolation of radioactive waste is one of today`s most pressing environmental issues. Research is being carried out by many countries around the world in order to answer critical and perplexing questions regarding the safe disposal of radioactive waste. Natural analogue studies are an increasingly important facet of this international research effort. The Pocos de Caldas Project represents a major effort of the international technical and scientific community towards addressing one of modern civilization`s most critical environmental issues - radioactive waste isolation.
Mate, Kedar S; Ngidi, Wilbroda Hlolisile; Reddy, Jennifer; Mphatswe, Wendy; Rollins, Nigel; Barker, Pierre
2013-11-01
New approaches are needed to evaluate quality improvement (QI) within large-scale public health efforts. This case report details challenges to large-scale QI evaluation, and proposes solutions relying on adaptive study design. We used two sequential evaluative methods to study a QI effort to improve delivery of HIV preventive care in public health facilities in three districts in KwaZulu-Natal, South Africa, over a 3-year period. We initially used a cluster randomised controlled trial (RCT) design. During the RCT study period, tensions arose between intervention implementation and evaluation design due to loss of integrity of the randomisation unit over time, pressure to implement changes across the randomisation unit boundaries, and use of administrative rather than functional structures for the randomisation. In response to this loss of design integrity, we switched to a more flexible intervention design and a mixed-methods quasi-experimental evaluation relying on both a qualitative analysis and an interrupted time series quantitative analysis. Cluster RCT designs may not be optimal for evaluating complex interventions to improve implementation in uncontrolled 'real world' settings. More flexible, context-sensitive evaluation designs offer a better balance of the need to adjust the intervention during the evaluation to meet implementation challenges while providing the data required to evaluate effectiveness. Our case study involved HIV care in a resource-limited setting, but these issues likely apply to complex improvement interventions in other settings.
An investigation of small scales of turbulence in a boundary layer at high Reynolds numbers
NASA Technical Reports Server (NTRS)
Wallace, James M.; Ong, L.; Balint, J.-L.
1993-01-01
The assumption that turbulence at large wave-numbers is isotropic and has universal spectral characteristics which are independent of the flow geometry, at least for high Reynolds numbers, has been a cornerstone of closure theories as well as of the most promising recent development in the effort to predict turbulent flows, viz. large eddy simulations. This hypothesis was first advanced by Kolmogorov based on the supposition that turbulent kinetic energy cascades down the scales (up the wave-numbers) of turbulence and that, if the number of these cascade steps is sufficiently large (i.e. the wave-number range is large), then the effects of anisotropies at the large scales are lost in the energy transfer process. Experimental attempts were repeatedly made to verify this fundamental assumption. However, Van Atta has recently suggested that an examination of the scalar and velocity gradient fields is necessary to definitively verify this hypothesis or prove it to be unfounded. Of course, this must be carried out in a flow with a sufficiently high Reynolds number to provide the necessary separation of scales and thus to unambiguously allow for the possibility of local isotropy at large wave-numbers. An opportunity to use our 12-sensor hot-wire probe to address this issue directly was made available at the 80'x120' wind tunnel at the NASA Ames Research Center, which is normally used for full-scale aircraft tests. An initial report on this high Reynolds number experiment and progress toward its evaluation is presented.
Large Scale Bacterial Colony Screening of Diversified FRET Biosensors
Litzlbauer, Julia; Schifferer, Martina; Ng, David; Fabritius, Arne; Thestrup, Thomas; Griesbeck, Oliver
2015-01-01
Biosensors based on Förster Resonance Energy Transfer (FRET) between fluorescent protein mutants have started to revolutionize physiology and biochemistry. However, many types of FRET biosensors show relatively small FRET changes, making measurements with these probes challenging when used under sub-optimal experimental conditions. Thus, a major effort in the field currently lies in designing new optimization strategies for these types of sensors. Here we describe procedures for optimizing FRET changes by large scale screening of mutant biosensor libraries in bacterial colonies. We describe optimization of biosensor expression, permeabilization of bacteria, software tools for analysis, and screening conditions. The procedures reported here may help in improving FRET changes in multiple suitable classes of biosensors. PMID:26061878
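The screening metric in such library screens is the fractional change in the FRET ratio (acceptor/donor emission) between resting and stimulated states. A minimal Python sketch of ranking colonies by that metric follows; the colony names and intensity values are hypothetical, and this is an illustration of the principle, not the authors' analysis software.

```python
# Hypothetical sketch: rank biosensor colonies by fractional FRET ratio
# change (delta-R / R), the figure of merit for screens like the one above.

def fret_ratio(acceptor: float, donor: float) -> float:
    """Ratio of acceptor to donor emission for one colony image."""
    return acceptor / donor

def ratio_change(before: tuple, after: tuple) -> float:
    """Fractional FRET ratio change between resting and stimulated states."""
    r0 = fret_ratio(*before)
    r1 = fret_ratio(*after)
    return (r1 - r0) / r0

# colony name -> ((acceptor, donor) at rest, (acceptor, donor) stimulated)
colonies = {
    "mutant_A": ((120.0, 100.0), (300.0, 80.0)),
    "mutant_B": ((110.0, 100.0), (140.0, 95.0)),
}

# Best candidates are the colonies with the largest ratio change.
ranked = sorted(colonies, key=lambda c: ratio_change(*colonies[c]), reverse=True)
print(ranked)
```

In a real screen the intensities would come from segmented colony images before and after analyte exposure, but the ranking step reduces to exactly this comparison.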
NASA Technical Reports Server (NTRS)
Walkmeyer, J.
1973-01-01
This memorandum explores a host of considerations meriting attention from those who are concerned with designing organizational structures for development and control of a large scale educational telecommunications system using satellites. Part of a broader investigation at Washington University into the potential uses of fixed/broadcast satellites in U.S. education, this study lays the groundwork for a later effort to spell out a small number of hypothetical organizational blueprints for such a system and for assessment of potential short and long term impacts. The memorandum consists of two main parts: Part A deals with subjects of system-wide concern, while Part B deals with matters related to specific system components.
Bridging the Science/Policy Gap through Boundary Chain Partnerships and Communities of Practice
NASA Astrophysics Data System (ADS)
Kalafatis, S.
2014-12-01
Generating the capacity to facilitate the informed usage of climate change science by decision makers on a large scale is fast becoming an area of great concern. While research demonstrates that sustained interactions between producers of such information and potential users can overcome barriers to information usage, it also demonstrates the high resource demand of these efforts. Our social science work at Great Lakes Integrated Sciences and Assessments (GLISA) sheds light on scaling up the usability of climate science through two research areas. The first focuses on partnerships with other boundary organizations that GLISA has leveraged - the "boundary chains" approach. These partnerships reduce the transaction costs involved with outreach and have enhanced the scope of GLISA's climate service efforts to encompass new users such as First Nations groups in Wisconsin and Michigan and underserved neighborhoods in St. Paul, Minnesota. The second research area looks at the development of information usability across the regional scale of the eight Great Lakes states. It has identified the critical role that communities of practice are playing in making information usable to large groups of users who work in similar contexts and have similar information needs. Both these research areas demonstrate the emerging potential of flexible knowledge networks to enhance society's ability to prepare for the impacts of climate change.
Cloud Computing for Complex Performance Codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin
This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.
NASA Technical Reports Server (NTRS)
Grung, B. L.; Heaps, J. D.; Schmit, F. M.; Schuldt, S. B.; Zook, J. D.
1981-01-01
The technical feasibility of producing solar-cell-quality sheet silicon to meet the Department of Energy (DOE) 1986 overall price goal of $0.70/watt was investigated. With the silicon-on-ceramic (SOC) approach, a low-cost ceramic substrate is coated with large-grain polycrystalline silicon by unidirectional solidification of molten silicon. This effort was divided into several areas of investigation in order to most efficiently meet the goals of the program. These areas include: (1) dip-coating; (2) continuous coating, designated SCIM-coating, an acronym for Silicon Coating by an Inverted Meniscus (SCIM); (3) material characterization; (4) cell fabrication and evaluation; and (5) theoretical analysis. Both coating approaches were successful in producing thin layers of large-grain, solar-cell-quality silicon. The dip-coating approach was investigated first, and considerable effort was given to this technique. The SCIM technique was then adopted because of its scale-up potential and its capability to produce large areas of SOC more conveniently.
Changing American Education. Recapturing the Past or Inventing the Future?
ERIC Educational Resources Information Center
Borman, Kathryn M., Ed.; Greenman, Nancy P., Ed.
This book examines the nature of comprehensive, large scale historical and social changes that contextualize educational reform, and it amplifies the meaning of lessons learned by those who have assisted in change efforts. It also examines how the rhetoric of educational change may fall short of the reality, as translated to processes and…
Erin L. Landguth; Michael K. Schwartz
2014-01-01
One of the most pressing issues in spatial genetics concerns sampling. Traditionally, substructure and gene flow are estimated for individuals sampled within discrete populations. Because many species may be continuously distributed across a landscape without discrete boundaries, understanding sampling issues becomes paramount. Given large-scale, geographically broad...
Learning a Living: First Results of the Adult Literacy and Life Skills Survey
ERIC Educational Resources Information Center
OECD Publishing (NJ1), 2005
2005-01-01
The Adult Literacy and Life Skills Survey (ALL) is a large-scale co-operative effort undertaken by governments, national statistics agencies, research institutions and multi-lateral agencies. The development and management of the study were co-ordinated by Statistics Canada and the Educational Testing Service (ETS) in collaboration with the…
ERIC Educational Resources Information Center
Patel, Vimla L.; Branch, Timothy; Gutnik, Lily; Arocha, Jose F.
2006-01-01
High-risk behavior in youths related to HIV transmission continues to occur despite large-scale efforts to disseminate information about safe sexual practices through education. Our study examined the relationships among knowledge, decision-making strategies, and risk assessment about HIV by youths during peer group focused discussions. Two focus…
Implementing Assessment Engineering in the Uniform Certified Public Accountant (CPA) Examination
ERIC Educational Resources Information Center
Burke, Matthew; Devore, Richard; Stopek, Josh
2013-01-01
This paper describes efforts to bring principled assessment design to a large-scale, high-stakes licensure examination by employing the frameworks of Assessment Engineering (AE), the Revised Bloom's Taxonomy (RBT), and Cognitive Task Analysis (CTA). The Uniform CPA Examination is practice-oriented and focuses on the skills of accounting. In…
Design for a Study of American Youth.
ERIC Educational Resources Information Center
Flanagan, John C.; And Others
Project TALENT is a large-scale, long-range educational research effort aimed at developing methods for the identification, development, and utilization of human talents, which has involved some 440,000 students in 1,353 public, private, and parochial secondary schools in all parts of the country. Data collected through teacher-administered tests,…
Conservation of Louisiana's coastal wetland forests
Jim L. Chambers; Richard F. Keim; William H. Conner; John W. Jr. Day; Stephen P. Faulkner; Emile S. Gardiner; Melinda s. Hughes; Sammy L. King; Kenneth W. McLeod; Craig A. Miller; J. Andrew Nyman; Gary P. Shaffer
2006-01-01
Large-scale efforts to protect and restore coastal wetlands and the concurrent renewal of forest harvesting in cypress-tupelo swamps have brought new attention to Louisiana's coastal wetland forests in recent years. Our understanding of these coastal wetland forests has been limited by inadequate data and the lack of a comprehensive review of existing information...
Responding to Terrorism Victims: Oklahoma City and Beyond.
ERIC Educational Resources Information Center
Dinsmore, Janet
This report identifies the special measures needed to protect the rights and meet the needs of victims of a large-scale terrorist attack involving mass casualties. In particular, it demonstrates efforts required to ensure an effective response to victims' rights and their short- and long-term emotional and psychological needs as an integral part…
Laying a Solid Foundation: Strategies for Effective Program Replication
ERIC Educational Resources Information Center
Summerville, Geri
2009-01-01
The replication of proven social programs is a cost-effective and efficient way to achieve large-scale, positive social change. Yet there has been little guidance available about how to approach program replication and limited development of systems--at local, state or federal levels--to support replication efforts. "Laying a Solid Foundation:…
Integrating social, economic, and ecological values across large landscapes
Jessica E. Halofsky; Megan K. Creutzburg; Miles A. Hemstrom
2014-01-01
The Integrated Landscape Assessment Project (ILAP) was a multiyear effort to produce information, maps, and models to help land managers, policymakers, and others conduct mid- to broad-scale (e.g., watersheds to states and larger areas) prioritization of land management actions, perform landscape assessments, and estimate cumulative effects of management actions for...
ERIC Educational Resources Information Center
Geller, Cornelia; Neumann, Knut; Boone, William J.; Fischer, Hans E.
2014-01-01
This manuscript details our efforts to assess and compare students' learning about electricity in three countries. As our world is increasingly driven by technological advancements, the education of future citizens in science becomes one important resource for economic productivity. Not surprisingly international large-scale assessments are viewed…
Detoxification of Mycotoxins and Other Compounds of Military Interest
1987-01-14
…to a conclusion. A newly recognized, naturally occurring glutathione derivative … has been prepared, and methods for large scale preparation of … whereas others contained as much as 50-70% of the new material. An effort is currently in progress to determine the nature of the new compound.
ERIC Educational Resources Information Center
Henderson, Daphne Carr; Rupley, William H.; Nichols, Janet Alys; Nichols, William Dee; Rasinski, Timothy V.
2018-01-01
Current professional development efforts in writing at the secondary level have not resulted in student improvement on large-scale writing assessments. To maximize funding resources and instructional time, school leaders need a way to determine professional development content for writing teachers that aligns with specific student outcomes. The…
Evidence-Based Practice for Teachers of Children with Autism: A Dynamic Approach
ERIC Educational Resources Information Center
Lubas, Margaret; Mitchell, Jennifer; De Leo, Gianluca
2016-01-01
Evidence-based practice related to autism research is a controversial topic. Governmental entities and national agencies are defining evidence-based practice as a specific set of interventions that educators should implement; however, large-scale efforts to generalize autism research, which are often single-subject case designs, may be a setback…
Lakes continue to face escalating pressures associated with land cover change and growing human populations. The U.S. EPA National Lakes Assessment, which sampled more than 1000 lakes in a probabilistic survey, was the first large scale effort to characterize the condition of lak...
ERIC Educational Resources Information Center
McHugh, R. Kathryn; Barlow, David H.
2010-01-01
Recognizing an urgent need for increased access to evidenced-based psychological treatments, public health authorities have recently allocated over $2 billion to better disseminate these interventions. In response, implementation of these programs has begun, some of it on a very large scale, with substantial implications for the science and…
What Googling Trends Tell Us About Public Interest in Earthquakes
NASA Astrophysics Data System (ADS)
Tan, Y. J.; Maharjan, R.
2017-12-01
Previous studies have shown that immediately after large earthquakes, there is a period of increased public interest. This represents a window of opportunity for science communication and disaster relief fundraising efforts to reach more people. However, how public interest varies for different earthquakes has not been quantified systematically on a global scale. We analyze how global search interest for the term "earthquake" on Google varies following earthquakes of magnitude ≥ 5.5 from 2004 to 2016. We find that there is a spike in search interest after large earthquakes followed by an exponential temporal decay. Preliminary results suggest that the period of increased search interest scales with death toll and correlates with the period of increased media coverage. This suggests that the relationship between the period of increased public interest in earthquakes and death toll might be an effect of differences in media coverage. However, public interest never remains elevated for more than three weeks. Therefore, to take advantage of this short period of increased public interest, science communication and disaster relief fundraising efforts have to act promptly following devastating earthquakes.
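The exponential temporal decay described above can be characterized by a single decay constant, estimated by log-linear least squares on the daily search counts. The sketch below uses synthetic, noiseless data rather than the study's actual Google Trends series; the functional form interest ≈ A·exp(-k·t) is the assumption being fit.

```python
import math

# Hedged sketch: estimate the decay constant of post-event search interest
# by ordinary least squares on log-transformed counts, assuming
# interest ~ A * exp(-k * t). The series below is synthetic.

def fit_decay(days, interest):
    """Return (A, k) for interest ~ A * exp(-k * day)."""
    n = len(days)
    ys = [math.log(v) for v in interest]
    mx = sum(days) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(days, ys)) / \
            sum((x - mx) ** 2 for x in days)
    return math.exp(my - slope * mx), -slope

days = list(range(10))
interest = [100.0 * math.exp(-0.5 * t) for t in days]  # synthetic series
A, k = fit_decay(days, interest)
print(round(A), round(k, 3))  # recovers the generating A=100, k=0.5
```

With real search data the residuals around the fitted line, rather than the fit itself, would carry the noise; comparing fitted k across events is one way to quantify how the attention window scales with death toll.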
A Computational Chemistry Database for Semiconductor Processing
NASA Technical Reports Server (NTRS)
Jaffe, R.; Meyyappan, M.; Arnold, J. O. (Technical Monitor)
1998-01-01
The concept of a 'virtual reactor' or 'virtual prototyping' has received much attention recently in the semiconductor industry. Commercial codes to simulate thermal CVD and plasma processes have become available to aid in equipment and process design efforts. The virtual prototyping effort would go nowhere if the codes did not come with a reliable database of chemical and physical properties of the gases involved in semiconductor processing. Commercial code vendors have no capability to generate such a database and instead leave to the user the task of finding whatever is needed. While individual investigations of interesting chemical systems continue at universities, there has not been any large-scale effort to create a database. In this presentation, we outline our efforts in this area, which focus on five topics: (1) thermal CVD reaction mechanisms and rate constants; (2) thermochemical properties; (3) transport properties; (4) electron-molecule collision cross sections; and (5) gas-surface interactions.
Exact-Differential Large-Scale Traffic Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanai, Masatoshi; Suzumura, Toyotaro; Theodoropoulos, Georgios
2015-01-01
Analyzing large-scale traffic by simulation requires repeating execution many times with various patterns of scenarios or parameters. Such repeated execution introduces substantial redundancy, because the change from one scenario to the next is usually minor, for example, blocking only one road or changing the speed limit on several roads. In this paper, we propose a new redundancy reduction technique, called exact-differential simulation, which makes it possible to simulate only the changed scenarios in later executions while keeping exactly the same results as a whole simulation. The paper consists of two main efforts: (i) a key idea and algorithm of the exact-differential simulation, and (ii) a method to build large-scale traffic simulation on top of the exact-differential simulation. In experiments on a Tokyo traffic simulation, the exact-differential simulation shows a 7.26-fold elapsed-time improvement on average, and a 2.26-fold improvement even in the worst case, compared with the whole simulation.
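The exact-differential idea can be illustrated on a toy per-road model: cache the baseline run's results and recompute only the roads whose parameters changed, yielding output identical to a full re-run. This sketch is illustrative only, not the authors' implementation, and ignores the hard part of real traffic simulation (interactions between roads, which force recomputation to propagate).

```python
# Toy illustration of differential re-execution: reuse cached per-road
# results and recompute only what a new scenario changes.

def travel_time(length_km, speed_kmh):
    """Free-flow travel time in minutes for one road."""
    return length_km / speed_kmh * 60.0

def simulate(scenario):
    """Full run: compute travel time for every road."""
    return {road: travel_time(*params) for road, params in scenario.items()}

def simulate_differential(base_scenario, base_result, new_scenario):
    """Re-run only roads whose parameters differ from the baseline."""
    result = dict(base_result)
    for road, params in new_scenario.items():
        if base_scenario.get(road) != params:
            result[road] = travel_time(*params)
    return result

base = {"r1": (10.0, 60.0), "r2": (5.0, 30.0)}
base_result = simulate(base)
changed = {"r1": (10.0, 60.0), "r2": (5.0, 20.0)}  # lower speed limit on r2

# The differential run matches a full re-run exactly (hence "exact").
assert simulate_differential(base, base_result, changed) == simulate(changed)
```

The "exactness" claim is the point: unlike approximate warm-start schemes, the differential run must be bitwise identical to simulating the new scenario from scratch.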
Large-Scale Low-Boom Inlet Test Overview
NASA Technical Reports Server (NTRS)
Hirt, Stefanie
2011-01-01
This presentation provides a high level overview of the Large-Scale Low-Boom Inlet Test and was presented at the Fundamental Aeronautics 2011 Technical Conference. In October 2010 a low-boom supersonic inlet concept with flow control was tested in the 8'x6' supersonic wind tunnel at NASA Glenn Research Center (GRC). The primary objectives of the test were to evaluate the inlet stability and operability of a large-scale low-boom supersonic inlet concept by acquiring performance and flowfield validation data, as well as to evaluate simple, passive, bleedless inlet boundary layer control options. During this effort two models were tested: a dual stream inlet intended to model potential flight hardware and a single stream design to study a zero-degree external cowl angle and to permit surface flow visualization of the vortex generator flow control on the internal centerbody surface. The tests were conducted by a team of researchers from NASA GRC, Gulfstream Aerospace Corporation, the University of Illinois at Urbana-Champaign, and the University of Virginia.
Evolving from bioinformatics in-the-small to bioinformatics in-the-large.
Parker, D Stott; Gorlick, Michael M; Lee, Christopher J
2003-01-01
We argue the significance of a fundamental shift in bioinformatics, from in-the-small to in-the-large. Adopting a large-scale perspective is a way to manage the problems endemic to the world of the small: constellations of incompatible tools for which the effort required to assemble an integrated system exceeds the perceived benefit of the integration. Where bioinformatics in-the-small is about data and tools, bioinformatics in-the-large is about metadata and dependencies. Dependencies represent the complexities of large-scale integration, including the requirements and assumptions governing the composition of tools. The popular make utility is a very effective system for defining and maintaining simple dependencies, and it offers a number of insights about the essence of bioinformatics in-the-large. Keeping an in-the-large perspective has been very useful to us in large bioinformatics projects. We give two fairly different examples, and extract lessons from them showing how it has helped. These examples both suggest the benefit of explicitly defining and managing knowledge flows and knowledge maps (which represent metadata regarding types, flows, and dependencies), and also suggest approaches for developing bioinformatics database systems. Generally, we argue that large-scale engineering principles can be successfully adapted from disciplines such as software engineering and data management, and that having an in-the-large perspective will be a key advantage in the next phase of bioinformatics development.
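The make model the abstract highlights reduces to a walk of the prerequisite graph: every prerequisite is brought up to date before the target that depends on it. A small sketch of that ordering, with hypothetical bioinformatics targets standing in for real pipeline steps:

```python
# Sketch of make-style dependency ordering: a depth-first post-order walk
# that visits prerequisites before the targets that depend on them.
# Target names are hypothetical; real make also checks timestamps.

def build_order(deps, target, seen=None, order=None):
    """Return targets in the order a make-like tool would build them."""
    if seen is None:
        seen, order = set(), []
    for prereq in deps.get(target, []):
        if prereq not in seen:
            build_order(deps, prereq, seen, order)
    if target not in seen:
        seen.add(target)
        order.append(target)
    return order

# target -> prerequisites, e.g. a toy sequence-analysis pipeline
deps = {
    "report": ["aligned", "annotated"],
    "aligned": ["reads"],
    "annotated": ["aligned"],
}
print(build_order(deps, "report"))  # ['reads', 'aligned', 'annotated', 'report']
```

Note that "aligned" is a prerequisite of two targets but is built only once; that sharing of intermediate results is exactly the dependency bookkeeping the authors argue defines bioinformatics in-the-large.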
Scale-dependent feedbacks between patch size and plant reproduction in desert grassland
Svejcar, Lauren N.; Bestelmeyer, Brandon T.; Duniway, Michael C.; James, Darren K.
2015-01-01
Theoretical models suggest that scale-dependent feedbacks between plant reproductive success and plant patch size govern transitions from highly to sparsely vegetated states in drylands, yet there is scant empirical evidence for these mechanisms. Scale-dependent feedback models suggest that an optimal patch size exists for growth and reproduction of plants and that a threshold patch organization exists below which positive feedbacks between vegetation and resources can break down, leading to critical transitions. We examined the relationship between patch size and plant reproduction using an experiment in a Chihuahuan Desert grassland. We tested the hypothesis that reproductive effort and success of a dominant grass (Bouteloua eriopoda) would vary predictably with patch size. We found that focal plants in medium-sized patches featured higher rates of grass reproductive success than when plants occupied either large patch interiors or small patches. These patterns support the existence of scale-dependent feedbacks in Chihuahuan Desert grasslands and indicate an optimal patch size for reproductive effort and success in B. eriopoda. We discuss the implications of these results for detecting ecological thresholds in desert grasslands.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scales, John
The broad purpose of CSM's 6-year (3 years plus renewal) DOE project was to develop and apply new experimental physics technology to the material characterization of rocks at the grain scale or smaller. This is motivated by the knowledge that the bulk chemistry and physics of rocks are strongly influenced by processes occurring at the grain scale: the flow of fluids, cation exchange, the state of cementation of grains, and many more. It may also be possible in some cases to "upscale" or homogenize the mesoscopic properties of rocks in order to directly infer the large-scale properties of formations, but that is not our central goal; understanding the physics and chemistry at the small scale is. During the first 3 years, most effort was devoted to developing and validating the near-field scanning technology. During the 3-year renewal phase, most effort was focused on applying the technology in the labs of Professor Batzle (now deceased) in Geophysics and Professor Prasad in Petroleum Engineering.
Identifying western yellow-billed cuckoo breeding habitat with a dual modelling approach
Johnson, Matthew J.; Hatten, James R.; Holmes, Jennifer A.; Shafroth, Patrick B.
2017-01-01
The western population of the yellow-billed cuckoo (Coccyzus americanus) was recently listed as threatened under the federal Endangered Species Act. Yellow-billed cuckoo conservation efforts require the identification of features and area requirements associated with high quality, riparian forest habitat at spatial scales that range from nest microhabitat to landscape, as well as lower-suitability areas that can be enhanced or restored. Spatially explicit models inform conservation efforts by increasing ecological understanding of a target species, especially at landscape scales. Previous yellow-billed cuckoo modelling efforts derived plant-community maps from aerial photography, an expensive and oftentimes inconsistent approach. Satellite models can remotely map vegetation features (e.g., vegetation density, heterogeneity in vegetation density or structure) across large areas with near perfect repeatability, but they usually cannot identify plant communities. We used aerial photos and satellite imagery, and a hierarchical spatial scale approach, to identify yellow-billed cuckoo breeding habitat along the Lower Colorado River and its tributaries. Aerial-photo and satellite models identified several key features associated with yellow-billed cuckoo breeding locations: (1) a 4.5 ha core area of dense cottonwood-willow vegetation, (2) a large native, heterogeneously dense forest (72 ha) around the core area, and (3) moderately rough topography. The odds of yellow-billed cuckoo occurrence decreased rapidly as the amount of tamarisk cover increased or when cottonwood-willow vegetation was limited. We achieved model accuracies of 75–80% in the project area the following year after updating the imagery and location data. The two model types had very similar probability maps, largely predicting the same areas as high quality habitat. 
While each model provided unique information, a dual-modelling approach provided a more complete picture of yellow-billed cuckoo habitat requirements and will be useful for management and conservation activities.
From efficacy research to large-scale impact on undernutrition: the role of organizational cultures.
Pelletier, David; Pelto, Gretel
2013-11-01
Undernutrition in low-income countries is receiving unprecedented attention at global and national levels due to the convergence of many forces, including strong evidence concerning its magnitude, consequences, and potential solutions and effective advocacy by many organizations. The translation of this attention into large-scale reductions in undernutrition at the country level requires the alignment and support of many organizations in the development and implementation of a coherent policy agenda for nutrition, including the strengthening of operational and strategic capacities and a supportive research agenda. However, many countries experience difficulties achieving such alignment. This article uses the concept of organizational culture to better understand some of the reasons for these difficulties. This concept is applied to the constellation of organizations that make up the "National Nutrition Network" in a given country and some of the individual organizations within that network, including academic institutions that conduct research on undernutrition. We illustrate this concept through a case study involving a middle-income country. We conclude that efforts to align organizations in support of coherent nutrition agendas should do the following: 1) make intentional and sustained efforts to foster common understanding, shared learning, and socialization of new members and other elements of a shared culture among partners; 2) seek a way to frame problems and solutions in a fashion that enables individual organizations to secure some of their particular interests by joining the effort; and 3) not only advocate on the importance of nutrition but also insist that high-level officials hold organizations accountable for aligning in support of common-interest solutions (through some elements of a common culture) that can be effective and appropriate in the national context. 
We further conclude that a culture change is needed within academic departments if the discipline of nutrition is to play a central role in translating the findings from efficacy trials into large-scale reductions in undernutrition.
A Commercialization Roadmap for Carbon-Negative Energy Systems
NASA Astrophysics Data System (ADS)
Sanchez, D.
2016-12-01
The Intergovernmental Panel on Climate Change (IPCC) envisages the need for large-scale deployment of net-negative CO2 emissions technologies by mid-century to meet stringent climate mitigation goals and yield a net drawdown of atmospheric carbon. Yet there are few commercial deployments of BECCS outside of niche markets, creating uncertainty about commercialization pathways and sustainability impacts at scale. This uncertainty is exacerbated by the absence of a strong policy framework, such as high carbon prices and research coordination. Here, we propose a strategy for the potential commercial deployment of BECCS. This roadmap proceeds via three steps: 1) via capture and utilization of biogenic CO2 from existing bioenergy facilities, notably ethanol fermentation, 2) via thermochemical co-conversion of biomass and fossil fuels, particularly coal, and 3) via dedicated, large-scale BECCS. Although biochemical conversion is a proven first market for BECCS, this trajectory alone is unlikely to drive commercialization of BECCS at the gigatonne scale. In contrast to biochemical conversion, thermochemical conversion of coal and biomass enables large-scale production of fuels and electricity with a wide range of carbon intensities, process efficiencies and process scales. Aside from systems integration, primarily technical barriers are involved in large-scale biomass logistics, gasification and gas cleaning. Key uncertainties around large-scale BECCS deployment are not limited to commercialization pathways; rather, they include physical constraints on biomass cultivation or CO2 storage, as well as social barriers, including public acceptance of new technologies and conceptions of renewable and fossil energy, which co-conversion systems confound. Despite sustainability risks, this commercialization strategy presents a pathway where energy suppliers, manufacturers and governments could transition from laggards to leaders in climate change mitigation efforts.
Bait preference by the Argentine ant (Hymenoptera: Formicidae) in Haleakala National Park, Hawaii
Krushelnycky, Paul D.; Reimer, Neil J.
1998-01-01
The Argentine ant, Linepithema humile (Mayr), has proven to be a threat to native arthropod species in Haleakala National Park, Maui, HI, and is also a potential threat to the park's native flora. As it continues to expand its range, an effort has been undertaken to eradicate it, or at the least, control its spread. The 1st part of this effort focused on finding a bait carrier for subsequent toxicant-based control tests. A year-long bait preference test was implemented at each of the ant's 2 infestation sites in Haleakala National Park, in which 6 solid baits and 2 liquid baits were assessed for attractiveness and feasibility for large scale control. At both sites, a toxicant-free formulation of Maxforce, a protein-based granular bait made from ground silkworm, Bombyx mori (L.), pupae, and a 25% sugar water solution were the most attractive baits. Ants took more Maxforce (without toxicant) and sugar water than all other baits, including honey granules and a fish protein bait. Sugar water, however, is difficult to distribute over large natural areas. Maxforce was therefore concluded to be the best bait carrier for toxicant-based control at Haleakala National Park because of its attractiveness and its ease for large scale broadcast dispersal.
Brusseau, M. L.; Hatton, J.; DiGuiseppi, W.
2011-01-01
The long-term impact of source-zone remediation efforts was assessed for a large site contaminated by trichloroethene. The impact of the remediation efforts (soil vapor extraction and in-situ chemical oxidation) was assessed through analysis of plume-scale contaminant mass discharge, which was measured using a high-resolution data set obtained from 23 years of operation of a large pump-and-treat system. The initial contaminant mass discharge peaked at approximately 7 kg/d, and then declined to approximately 2 kg/d. This latter value was sustained for several years prior to the initiation of source-zone remediation efforts. The contaminant mass discharge in 2010, measured several years after completion of the two source-zone remediation actions, was approximately 0.2 kg/d, which is ten times lower than the value prior to source-zone remediation. The time-continuous contaminant mass discharge data can be used to evaluate the impact of the source-zone remediation efforts on reducing the time required to operate the pump-and-treat system, and to estimate the cost savings associated with the decreased operational period. While significant reductions have been achieved, it is evident that the remediation efforts have not completely eliminated contaminant mass discharge and associated risk. Remaining contaminant mass contributing to the current mass discharge is hypothesized to comprise poorly-accessible mass in the source zones, as well as aqueous (and sorbed) mass present in the extensive lower-permeability units located within and adjacent to the contaminant plume. The fate of these sources is an issue of critical import to the remediation of chlorinated-solvent contaminated sites, and development of methods to address these sources will be required to achieve successful long-term management of such sites and to ultimately transition them to closure. PMID:22115080
Crops In Silico: Generating Virtual Crops Using an Integrative and Multi-scale Modeling Platform
Marshall-Colon, Amy; Long, Stephen P.; Allen, Douglas K.; Allen, Gabrielle; Beard, Daniel A.; Benes, Bedrich; von Caemmerer, Susanne; Christensen, A. J.; Cox, Donna J.; Hart, John C.; Hirst, Peter M.; Kannan, Kavya; Katz, Daniel S.; Lynch, Jonathan P.; Millar, Andrew J.; Panneerselvam, Balaji; Price, Nathan D.; Prusinkiewicz, Przemyslaw; Raila, David; Shekar, Rachel G.; Shrivastava, Stuti; Shukla, Diwakar; Srinivasan, Venkatraman; Stitt, Mark; Turk, Matthew J.; Voit, Eberhard O.; Wang, Yu; Yin, Xinyou; Zhu, Xin-Guang
2017-01-01
Multi-scale models can facilitate whole plant simulations by linking gene networks, protein synthesis, metabolic pathways, physiology, and growth. Whole plant models can be further integrated with ecosystem, weather, and climate models to predict how various interactions respond to environmental perturbations. These models have the potential to fill in missing mechanistic details and generate new hypotheses to prioritize directed engineering efforts. Outcomes will potentially accelerate improvement of crop yield, sustainability, and increase future food security. It is time for a paradigm shift in plant modeling, from largely isolated efforts to a connected community that takes advantage of advances in high performance computing and mechanistic understanding of plant processes. Tools for guiding future crop breeding and engineering, understanding the implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment are urgently needed. The purpose of this perspective is to introduce Crops in silico (cropsinsilico.org), an integrative and multi-scale modeling platform, as one solution that combines isolated modeling efforts toward the generation of virtual crops, which is open and accessible to the entire plant biology community. The major challenges involved both in the development and deployment of a shared, multi-scale modeling platform, which are summarized in this prospectus, were recently identified during the first Crops in silico Symposium and Workshop. PMID:28555150
Rabinowitz, Amanda R; Merritt, Victoria; Arnett, Peter A
2016-08-01
Baseline neuropsychological testing is commonly used in the management of sports-related concussion. However, underperformance due to poor effort could lead to invalid conclusions regarding postconcussion cognitive decline. We designed the Motivation Behaviors Checklist (MBC) as an observational rating scale to assess effort towards baseline neuropsychological testing. Here we present preliminary data in support of its reliability and validity. MBC items were generated based on the consensus of a panel of graduate students, undergraduates, and a clinical neuropsychologist who conduct neuropsychological evaluations for a sports concussion management program. A total of 261 college athletes were administered a standard neuropsychological test battery in addition to the MBC. A subset of evaluations (n = 101) was videotaped and viewed by a second rater. Exploratory factor analysis (EFA) was used to refine the scale, and reliability and validity were evaluated. EFA revealed that the MBC items represent four latent factors: Complaints, Poor Focus, Psychomotor Agitation, and Impulsivity. Reliability analyses demonstrated that the MBC has good inter-rater reliability (intraclass correlation coefficient, ICC = .767) and internal consistency (α = .839). The construct validity of the MBC is supported by large correlations with examiners' ratings of effort (ρ = -.623) and medium-sized relationships with cognitive performance and self-ratings of effort (|ρ| between .263 and .345). Discriminant validity was supported by nonsignificant correlations with measures of depression and postconcussion symptoms (ρ = .056 and .082, respectively). These findings provide preliminary evidence that the MBC could be a useful adjunct to baseline neuropsychological evaluations for sports-concussion management.
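The internal-consistency figure reported for the MBC (Cronbach's α = .839) comes from a standard computation that is easy to reproduce. The sketch below applies the usual formula to hypothetical item ratings; the 6 × 4 score matrix is illustrative only and has no connection to the actual MBC data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of subjects' total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical ratings: 6 subjects x 4 checklist items (not MBC data)
ratings = np.array([
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [2, 2, 3, 2],
])
print(round(cronbach_alpha(ratings), 3))
```

Higher values indicate that the items covary strongly relative to their individual noise, which is what "internal consistency" summarizes in a single coefficient.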
Quality Function Deployment for Large Systems
NASA Technical Reports Server (NTRS)
Dean, Edwin B.
1992-01-01
Quality Function Deployment (QFD) is typically applied to small subsystems. This paper describes efforts to extend QFD to large scale systems. It links QFD to the system engineering process, the concurrent engineering process, the robust design process, and the costing process. The effect is to generate a tightly linked project management process of high dimensionality which flushes out issues early to provide a high quality, low cost, and, hence, competitive product. A pre-QFD matrix linking customers to customer desires is described.
Aircraft Measurements for Understanding Air-Sea Coupling and Improving Coupled Model Predictions
2013-09-30
physical parameterizations of the coupled model in various large-scale forcing conditions. OBJECTIVES The NOAA WP-3D efforts of DYNAMO/LASP intend...various phases of the MJO; 3) to extend point measurements on island and ships to a broader area near the DYNAMO region; and 4) to obtain a suite of...upper ocean characteristics from a large number of AXBT/AXCTD data. In addition, as one of the unique measurement strategies of the LASP/DYNAMO WP-3D project
NASA Astrophysics Data System (ADS)
DeLong, S.; Henderson, W. M.
2012-12-01
The use of erosion control structures to mitigate or even reverse erosion and to restore ecological function along dryland channels (arroyos and gullies) has led to a long list of both successful and failed restoration efforts. We propose that successful implementation of "engineering" approaches to fluvial restoration that include in-channel control structures requires either a quantitative approach to design (by scientists and engineers), or intimate on-the-ground knowledge, local observation, and a commitment to adapt and maintain restoration efforts in response to landscape change (by local land managers), or both. We further propose that the biophysical interactions among engineering, sedimentation, flood hydrology and vegetation reestablishment are what determine resilience to destructive extreme events that commonly cause erosion control structure failure. Our insights come from comprehensive monitoring of a remarkable experiment underway at Rancho San Bernardino, Sonora, MX. At this site, private landowners are working to restore ecosystem function to riparian corridors and former ciénega wetlands using cessation of grazing; vegetation planting; upland grass restoration; large scale rock gabions (up to 100 m wide) to encourage local sediment deposition and water storage; and large earthen berms (up to 900 m wide) with cement spillways that form reservoirs that fill rapidly with water and sediment. Well-planned and managed erosion control structures have been used elsewhere successfully in smaller gully networks, but we are unaware of a comparable attempt to use gabions and berms for the sole purpose of ecological restoration along >10 km of arroyo channels draining watersheds on the order of ~400 km2 and larger. We present an approach to monitoring the efficacy of arroyo channel restoration using terrestrial and airborne LiDAR, remote sensing, streamflow monitoring, shallow groundwater monitoring, hydrological modeling and field observation.
Our methods allow us to directly quantify the magnitude of sedimentation (and hence reversal of arroyo cutting) upstream of in-channel structures as a function of hydrology, and to quantify the dampening of flood energy caused by erosion control structures and by the restoration of riparian vegetation. We are also able to create a surface water budget that constrains water storage and infiltration by monitoring streamflow at several places above, within, and downstream of restoration efforts. We also speculate on the resilience of such efforts. Quantifying the effects of the restoration efforts at Rancho San Bernardino may prove useful in guiding similar large-scale ecological restoration efforts elsewhere in degraded dryland landscapes.
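The surface water budget described above reduces to a simple mass balance over a monitored reach. A minimal sketch, with entirely hypothetical inflow and outflow volumes (none of these numbers come from the Rancho San Bernardino monitoring), might look like:

```python
def reach_water_balance(q_in_m3, q_out_m3, precip_m3=0.0, evap_m3=0.0):
    """Volume retained within a reach (storage plus infiltration) over one flow
    event, from streamflow measured above and below the restoration structures."""
    return q_in_m3 + precip_m3 - q_out_m3 - evap_m3

# Hypothetical event: 50,000 m3 enters the reach and 32,000 m3 exits downstream
retained = reach_water_balance(q_in_m3=50_000, q_out_m3=32_000)
print(retained)  # residual volume stored behind gabions/berms or infiltrated (m3)
```

Closing this budget at several cross-sections above, within, and downstream of the restoration works, as the authors describe, would let the residual be partitioned between surface storage and infiltration to shallow groundwater.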
EPA, in collaboration with FHWA, has been involved in a large-scale monitoring research study in an effort to characterize highway vehicle emissions in a near-road environment. The pollutants of interest include particulate matter with aerodynamic diameter less than 2.5 microns ...
ERIC Educational Resources Information Center
Nelson-Barber, Sharon; Trumbull, Elise
2015-01-01
This monograph explores the ways in which large-scale school reform efforts play out in American Indian/Alaska Native communities and schools, starting from a historical and cultural perspective, and focusing on the translation of research into concrete steps leading to American Indian/Alaska Native student academic success and personal well-being.
ERIC Educational Resources Information Center
Powers, Stephen; And Others
Sex differences in attributions for success and failure in algebra of Samoan community college students were examined and compared with attributions of a large group of mainland U.S. students. The study included the Mathematics Attribution Scale: Algebra Version (MAS), which assessed students' attributions of achievement in algebra to their effort,…
The problem of ecological scaling in spatially complex, nonequilibrium ecological systems [chapter 3]
Samuel A. Cushman; Jeremy Littell; Kevin McGarigal
2010-01-01
In the previous chapter we reviewed the challenges posed by spatial complexity and temporal disequilibrium to efforts to understand and predict the structure and dynamics of ecological systems. The central theme was that spatial variability in the environment and population processes fundamentally alters the interactions between species and their environments, largely...
Bureaucratic Activism and Radical School Change in Tamil Nadu, India
ERIC Educational Resources Information Center
Niesz, Tricia; Krishnamurthy, Ramchandar
2013-01-01
In 2007, Activity Based Learning (ABL), a child-centered, activity-based method of pedagogical practice, transformed classrooms in all of the over 37,000 primary-level government schools in Tamil Nadu, India. The large scale, rapid pace, and radical nature of educational change set the ABL initiative apart from most school reform efforts.…
Processing of Fine-Scale Piezoelectric Ceramic/Polymer Composites for Sensors and Actuators
NASA Technical Reports Server (NTRS)
Janas, V. F.; Safari, A.
1996-01-01
The objective of the research effort at Rutgers is the development of lead zirconate titanate (PZT) ceramic/polymer composites with different designs for transducer applications including hydrophones, biomedical imaging, non-destructive testing, and air imaging. In this review, methods for processing both large area and multifunctional ceramic/polymer composites for acoustic transducers are discussed.
Feedback, Goal Setting, and Incentives Effects on Organizational Productivity.
ERIC Educational Resources Information Center
Pritchard, Robert D.; And Others
This technical paper is one of three produced by a large-scale effort aimed at implementing a new approach to measuring productivity, and using that approach to assess the impact of feedback, goal setting, and incentives on productivity. The productivity measurement system was developed for five units in the maintenance and supply areas at an Air…
Mass Digitization at Yale University Library: Exposing the Treasures in Our Stacks
ERIC Educational Resources Information Center
Weintraub, Jennifer; Wisner, Melissa
2008-01-01
In September 2007, Yale University Library (YUL) and Microsoft agreed to partner in a large-scale project to digitize 100,000 books from the YUL collections--an ambitious effort that would substantially increase the library's digitized holdings, particularly in the area of its own text collections. YUL has been digitizing materials from its…
Improving the Validity and Reliability of Large Scale Writing Assessment.
ERIC Educational Resources Information Center
Fenton, Ray; Straugh, Tom; Stofflet, Fred; Garrison, Steve
This paper examines the efforts of the Anchorage School District, Alaska, to improve the validity of its writing assessment as a useful tool for the training of teachers and the characterization of the quality of student writing. The paper examines how a number of changes in the process and scoring of the Anchorage Writing Assessment affected the…
Symposium on Documentation Planning in Developing Countries at Bad Godesberg, 28-30 November 1967.
ERIC Educational Resources Information Center
German Foundation for International Development, Bonn (West Germany).
One reason given for the failure of the large-scale efforts in the decade 1955-1965 to increase significantly the rate of economic and technological growth in the "developing" countries of the world has been insufficient utilization of existing information essential to this development. Motivated by this belief and the opinion that this…
Positive effects of afforestation efforts on the health of urban soils
Emily E. Oldfield; Alexander J. Felson; Stephen A. Wood; Richard A. Hallett; Michael S. Strickland; Mark A. Bradford
2014-01-01
Large-scale tree planting projects in cities are increasingly implemented as a strategy to improve the urban environment. Trees provide multiple benefits in cities, including reduction of urban temperatures, improved air quality, mitigation of storm-water run-off, and provision of wildlife habitat. How urban afforestation affects the properties and functions of urban...
Irreconcilable Differences? Education Vouchers and the Suburban Response
ERIC Educational Resources Information Center
d'Entremont, Chad; Huerta, Luis A.
2007-01-01
This article discusses the limited use of education vouchers in an era of unprecedented growth in school choice. It is divided into two parts: first, a description of the policy, political, and legal barriers that may limit the expansion of large-scale voucher programs is presented. Discussion then shifts to the efforts of voucher advocates to…
Gaming the System: Culture, Process, and Perspectives Supporting a Game and App Design Curriculum
ERIC Educational Resources Information Center
Herro, Danielle
2015-01-01
Games and digital media experiences permeate the lives of youth. Researchers have argued the participatory attributes and cognitive benefits of gaming and media production for more than a decade, relying on socio-cultural theory to bolster their claims. Only recently have large-scale efforts ensued towards moving game play and design into formal…
The forest ecosystem of southeast Alaska: 7. Forest ecology and timber management.
Arland S. Harris; Wilbur A. Farr
1974-01-01
Large-scale use of the timber resource of southeast Alaska began in 1953 after long efforts to establish a timber industry. Development and present status of the industry and present management of the timber resource are summarized, stressing the biological basis for timber management activities in southeast Alaska today. Ecological and silvicultural considerations...
Patterns of Drug Use Among College Students. A Preliminary Report.
ERIC Educational Resources Information Center
Mizner, George L.; And Others
Initial data from a survey of drug usage among college students were presented. A large-scale effort was made to produce reliable figures on: (1) drug use patterns; (2) attitudes toward drug use; and (3) incidence of drug use among college students. Questionnaires were answered by 26,000 college students from the Denver-Boulder area, who were…
Targeted business intelligence pays off.
Hennen, James
2009-03-01
Application business intelligence can accomplish much of what large-scale, enterprisewide efforts can accomplish: focus on a variety of data that are interrelated in a meaningful way; support decision making at multiple levels within a given organization; leverage data that are already captured but not fully used; and provide actionable information and support quick response via a dashboard or control panel.
Historical open forest ecosystems in the Missouri Ozarks: reconstruction and restoration targets
Brice B. Hanberry; D. Todd Jones-Farrand; John M. Kabrick
2014-01-01
Current forests no longer resemble historical open forest ecosystems in the eastern United States. In the absence of representative forest ecosystems under a continuous surface fire regime at a large scale, reconstruction of historical landscapes can provide a reference for restoration efforts. For initial expert-assigned vegetation phases ranging from prairie to...
ERIC Educational Resources Information Center
Bryson, Susan E.; Koegel, Lynn K.; Koegel, Robert L.; Openden, Daniel; Smith, Isabel M.; Nefdt, Nicolette
2007-01-01
This paper describes a collaborative effort aimed at province-wide dissemination and implementation of pivotal response treatment (PRT) for young children with autism spectrum disorder (ASD) in Nova Scotia, Canada. Three critical components of the associated training model are described: (1) direct training of treatment teams (parents, one-to-one…
The National Cancer Institute's (NCI) Clinical Proteomic Technologies for Cancer (CPTC) initiative at the National Institutes of Health has entered into a memorandum of understanding (MOU) with the Korea Institute of Science and Technology (KIST). This MOU promotes proteomic technology optimization and standards implementation in large-scale international programs.
Learning Analysis of K-12 Students' Online Problem Solving: A Three-Stage Assessment Approach
ERIC Educational Resources Information Center
Hu, Yiling; Wu, Bian; Gu, Xiaoqing
2017-01-01
Problem solving is considered a fundamental human skill. However, large-scale assessment of problem solving in K-12 education remains a challenging task. Researchers have argued for the development of an enhanced assessment approach through joint effort from multiple disciplines. In this study, a three-stage approach based on an evidence-centered…
Knowledge, Education and Research: Making Common Cause across Communities of Practice
ERIC Educational Resources Information Center
Moss, Gemma
2016-01-01
This article considers how knowledge, education and research interact as the institutional structures that support them change. Many efforts at large-scale education reform depend upon the proposition that what counts as useful knowledge can be easily defined, without reference to the specific contexts in which that knowledge will be set to work.…
Advancing effects analysis for integrated, large-scale wildfire risk assessment
Matthew P. Thompson; David E. Calkin; Julie W. Gilbertson-Day; Alan A. Ager
2011-01-01
In this article, we describe the design and development of a quantitative, geospatial risk assessment tool intended to facilitate monitoring trends in wildfire risk over time and to provide information useful in prioritizing fuels treatments and mitigation measures. The research effort is designed to develop, from a strategic view, a first approximation of how both...
A Year of Progress in School-to-Career System Building. The Benchmark Communities Initiative.
ERIC Educational Resources Information Center
Martinez, Martha I.; And Others
This document examines the first year of Jobs for the Future's Benchmark Communities Initiative (BCI), a 5-year effort to achieve the following: large-scale systemic restructuring of K-16 educational systems; involvement of significant numbers of employers in work and learning partnerships; and development of the infrastructure necessary to…
The Common Core State Standards: School Reform at Three Suburban Middle Schools
ERIC Educational Resources Information Center
Morante-Brock, Sandra
2014-01-01
A growing body of research supports the idea that large scale school reform efforts often fail to create sustained change within the public school sector. Proponents of school reform argue that implementing school reform, effectively and with fidelity, can work to ensure the success of reform initiatives in public education. When implementing deep…
Restoring bottomland hardwood forests: A comparison of four techniques
John A. Stanturf; Emile S. Gardiner; James P. Shepard; Callie J. Schweitzer; C. Jeffrey Portwood; Lamar Dorris
2004-01-01
Large-scale afforestation of former agricultural lands in the Lower Mississippi Alluvial Valley (LMAV) is one of the largest forest restoration efforts in the world and continues to attract interest from landowners, policy makers, scientists, and managers. The decision by many landowners to afforest these lands has been aided in part by the increased availability of...
Multi-resource and multi-scale approaches for meeting the challenge of managing multiple species
Frank R. Thompson; Deborah M. Finch; John R. Probst; Glen D. Gaines; David S. Dobkin
1999-01-01
The large number of Neotropical migratory bird (NTMB) species and their diverse habitat requirements create conflicts and difficulties for land managers and conservationists. We provide examples of assessments or conservation efforts that attempt to address the problem of managing for multiple NTMB species. We advocate approaches at a variety of spatial and geographic...
Lindsay K. Campbell; Erika S. Svendsen; Lara A. Roman
2016-01-01
Cities are increasingly engaging in sustainability efforts and investment in green infrastructure, including large-scale urban tree planting campaigns. In this context, researchers and practitioners are working jointly to develop applicable knowledge for planning and managing the urban forest. This paper presents three case studies of knowledge co-production in the...
The hardwood ecosystem experiment: extension and outreach
Brian J. MacGowan; Lenny D. Farlee; Robert N. Chapman
2013-01-01
The Hardwood Ecosystem Experiment (HEE) in Indiana is a long-term, large-scale experimental study of forest management and its impacts on plants and animals. Information from the HEE should and will be made available to a diverse group of potential users. This paper summarizes educational efforts during the pre-treatment period and highlights potential mechanisms and...
Dams and Intergovernmental Transfers
NASA Astrophysics Data System (ADS)
Bao, X.
2012-12-01
Gainers and losers are always associated with large-scale hydrological infrastructure construction, such as dams, canals and water treatment facilities. Since most of these projects are public services and public goods, some of their uneven impacts cannot be fully resolved by markets. This paper explores whether governments make any effort to balance the uneven distributional impacts caused by dam construction. It shows that dam construction brought an average 2% decrease in per capita tax revenue in upstream counties, a 30% increase in dam-location counties, and an insignificant increase in downstream counties. Similar distributional impacts were observed for other outcome variables, like rural income and agricultural crop yields, though the impacts differ across crops. The paper also found some balancing efforts from inter-governmental transfers to reduce the unevenly distributed impacts caused by dam construction. However, overall the inter-governmental fiscal transfer efforts were not large enough to fully correct those uneven distributions, as reflected in a 2% decrease of per capita GDP in upstream counties and increases of per capita GDP in local and downstream counties. This paper may shed some light on governmental considerations in the decision-making process for large hydrological infrastructures.
Kongelf, Anine; Bandewar, Sunita V. S.; Bharat, Shalini; Collumbien, Martine
2015-01-01
Background: In the last decade, community mobilisation (CM) interventions targeting female sex workers (FSWs) have been scaled up in India’s national response to the HIV epidemic. This included the Bill and Melinda Gates Foundation’s Avahan programme, which adopted a business approach to plan and manage implementation at scale. With the focus of evaluation efforts on measuring effectiveness and health impacts, there has been little analysis thus far of the interaction of the CM interventions with the sex work industry in complex urban environments. Methods and Findings: Between March and July 2012, semi-structured, in-depth interviews and focus group discussions were conducted with 63 HIV intervention implementers to explore challenges of HIV prevention among FSWs in Mumbai. A thematic analysis identified contextual factors that impact CM implementation. Large-scale interventions are not only impacted by, but were shown to shape, the dynamic social context. Registration practices and programme monitoring were experienced as stigmatising, reflected in shifting client preferences towards women not disclosing as ‘sex workers’. This, combined with urban redevelopment and gentrification of traditional red light areas, forced dispersal and more ‘hidden’ ways of solicitation, further challenging outreach and collectivisation. Participants reported that brothel owners and ‘pimps’ continued to restrict access to sex workers, and the heterogeneous ‘community’ of FSWs remains fragmented with high levels of mobility. Stakeholder engagement was poor and mobilising around HIV prevention was not compelling. Interventions largely failed to respond to community needs as strong target-orientation skewed activities towards those most easily measured and reported. Conclusion: Large-scale interventions have been impacted by and contributed to an increasingly complex sex work environment in Mumbai, challenging outreach and mobilisation efforts.
Sex workers remain a vulnerable and disempowered group needing continued support and more comprehensive services. PMID:25811484
Santangeli, Andrea; Arroyo, Beatriz; Millon, Alexandre; Bretagnolle, Vincent
2015-08-01
1. Modern farming practices threaten wildlife in different ways, and failure to identify the complexity of multiple threats acting in synergy may result in ineffective management. To protect ground-nesting birds in farmland, monitoring and mitigating the impacts of mechanical harvesting is crucial. 2. Here, we use 6 years of data from a nationwide volunteer-based monitoring scheme for the Montagu's harrier, a ground-nesting raptor, in French farmlands. We assess the effectiveness of alternative nest protection measures and map their potential benefit to the species. 3. We show that unprotected nests in cultivated land are strongly negatively affected by harvesting and thus require active management. Further, we show that protection from harvesting alone (e.g. by leaving a small unharvested buffer around the nest) is impaired by post-harvest predation at nests that become highly conspicuous after harvest. Measures that simultaneously protect from harvesting and predation (by adding a fence around the nest) significantly enhance nest productivity. 4. The map of expected gain from nest protection in relation to available volunteer workforce pinpoints large areas of high expected gain from nest protection that are not matched by equally high workforce availability. This mismatch suggests that the impact of nest protection can be further improved by increasing volunteer effort in key areas where it is low relative to the expected gain. 5. Synthesis and applications. This study shows that the synergistic interplay of multiple factors (e.g. mechanical harvesting and predation) may completely undermine the success of well-intentioned conservation efforts. However, identifying areas where the greatest expected gains can be achieved relative to effort expended can minimize the risk of wasted volunteer actions.
Overall, this study underscores the importance of citizen science for collecting large-scale data, useful for producing science and ultimately informing large-scale evidence-based conservation actions within an adaptive management framework.
Fast Bound Methods for Large Scale Simulation with Application for Engineering Optimization
NASA Technical Reports Server (NTRS)
Patera, Anthony T.; Peraire, Jaime; Zang, Thomas A. (Technical Monitor)
2002-01-01
In this work, we have focused on fast bound methods for large scale simulation with application to engineering optimization. The emphasis is on the development of techniques that provide both very fast turnaround and a certificate of fidelity; these attributes ensure that the results are indeed relevant to - and trustworthy within - the engineering context. The bound methodology which underlies this work has many different instantiations: finite element approximation; iterative solution techniques; and reduced-basis (parameter) approximation. In this grant we have, in fact, treated all three, but most of our effort has been concentrated on the first and third. We describe these below briefly - but with a pointer to an Appendix which describes, in some detail, the current "state of the art."
HIV Topical Microbicides: Steer the Ship or Run Aground
Gross, Michael
2004-01-01
Six HIV candidate microbicides are scheduled to enter 6 large-scale effectiveness trials in the next year. The selection of products for testing and the design of this group of trials should be reconsidered to provide an answer to a key question now before the field: Does a sulfonated polyanion, delivered intravaginally as a gel, block HIV attachment to target cells with sufficient potency to protect women from sexually acquired HIV infection? Paradoxically, entering more candidates into more trials may confuse or compromise efforts to identify an effective product. Instead, a single trial of the most promising product(s) best serves the current candidates while also preserving resources needed to promptly advance innovative new protective concepts into future large-scale trials. PMID:15226123
NASA Technical Reports Server (NTRS)
Aanstoos, J. V.; Snyder, W. E.
1981-01-01
Anticipated major advances in integrated circuit technology in the near future are described, as well as their impact on satellite onboard signal processing systems. Dramatic improvements in chip density, speed, power consumption, and system reliability are expected from very large scale integration (VLSI). These improvements will enable more intelligence to be placed on remote sensing platforms in space, meeting the goals of NASA's information adaptive system concept, a major component of the NASA End-to-End Data System program. A forecast of VLSI technological advances is presented, including a description of the Defense Department's very high speed integrated circuit program, a seven-year research and development effort.
Large-scale quantitative analysis of painting arts.
Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong
2014-12-11
Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.
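The three image measures named in this abstract (usage of individual colors, variety of colors, roughness of the brightness) can be approximated with elementary statistics on an RGB array. The sketch below is illustrative only: the entropy-based variety measure and the adjacent-pixel roughness proxy are my own simplified stand-ins, not the paper's exact definitions.

```python
import numpy as np

def color_variety(img, bins=16):
    """Shannon entropy (bits) of a quantized RGB histogram: a simple proxy for color variety."""
    q = (img // (256 // bins)).reshape(-1, 3)
    codes = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    counts = np.bincount(codes, minlength=bins ** 3).astype(float)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

def brightness_roughness(img):
    """Mean absolute brightness jump between horizontally adjacent pixels: a crude roughness proxy."""
    gray = img.mean(axis=2)
    return float(np.abs(np.diff(gray, axis=1)).mean())

rng = np.random.default_rng(0)
flat = np.full((64, 64, 3), 128, dtype=np.int64)    # one uniform color
noisy = rng.integers(0, 256, size=(64, 64, 3))      # many random colors
print(color_variety(flat))        # 0.0 for a single-color image
print(color_variety(noisy) > color_variety(flat))   # random image has higher variety
print(brightness_roughness(flat)) # 0.0 for a perfectly smooth image
```

A real analysis along the paper's lines would estimate the roughness exponent from how brightness differences scale with pixel separation, rather than at a single lag.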
Safety Testing of Ammonium Nitrate Based Mixtures
NASA Astrophysics Data System (ADS)
Phillips, Jason; Lappo, Karmen; Phelan, James; Peterson, Nathan; Gilbert, Don
2013-06-01
Ammonium nitrate (AN) and AN-based explosives have a lengthy documented history of use by adversaries in acts of terror. While historical research has been conducted on AN-based explosive mixtures, it has primarily focused on detonation performance while varying the oxygen balance between the oxidizer and fuel components. Similarly, historical safety data on these materials are often lacking in pertinent details such as specific fuel type, particle size parameters, and oxidizer form. A variety of AN-based fuel-oxidizer mixtures were tested for small-scale sensitivity in preparation for large-scale testing. Current efforts focus on maintaining a zero oxygen balance (a stoichiometric ratio for active chemical participants) while varying factors such as charge geometry, oxidizer form, particle size, and inert diluent ratios. Small-scale safety testing was conducted on various mixtures and fuels. It was found that ESD sensitivity is significantly affected by particle size, whereas impact and friction sensitivity are less so. Thermal testing is in progress to evaluate hazards that may be experienced during large-scale testing.
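For context on the zero oxygen-balance condition mentioned above: the standard oxygen-balance formula for a compound CxHyOzNw is OB% = 1600(z - 2x - y/2)/MW, and the fuel fraction that zeroes the mixture balance follows by linear mixing. A minimal sketch, assuming the common CH2 approximation for fuel oil and textbook molecular weights (neither is stated in the abstract):

```python
def oxygen_balance(x, y, z, mw):
    """Oxygen balance (%) of CxHyOzNw: 1600 * (z - 2x - y/2) / molecular weight."""
    return 1600.0 * (z - 2 * x - y / 2) / mw

ob_an = oxygen_balance(0, 4, 3, 80.04)   # ammonium nitrate, NH4NO3: about +20%
ob_fo = oxygen_balance(1, 2, 0, 14.03)   # fuel oil approximated as a CH2 unit (strongly negative)
f = ob_an / (ob_an - ob_fo)              # fuel mass fraction giving zero oxygen balance
print(round(ob_an, 1))                   # ~20.0
print(round(f * 100, 1))                 # ~5.5 (the classic ~94.5/5.5 ANFO ratio)
```

This simple linear-mixing estimate reproduces the well-known ANFO formulation, which is why zero oxygen balance is the natural baseline when varying the other factors listed in the abstract.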
Velasco, Veronica; Griffin, Kenneth W; Antichi, Mariella; Celata, Corrado
2015-10-01
Across developed countries, experimentation with alcohol, tobacco, and other drugs often begins in the early adolescent years. Several evidence-based programs have been developed to prevent adolescent substance use. Many of the most rigorously tested and empirically supported prevention programs were initially developed and tested in the United States. Increasingly, these interventions are being adopted for use in Europe and throughout the world. This paper reports on a large-scale comprehensive initiative designed to select, adapt, implement, and sustain an evidence-based drug abuse prevention program in Italy. As part of a large-scale regionally funded collaboration in the Lombardy region of Italy, we report on processes through which a team of stakeholders selected, translated and culturally adapted, planned, implemented and evaluated the Life Skills Training (LST) school-based drug abuse prevention program, an evidence-based intervention developed in the United States. We discuss several challenges and lessons learned and implications for prevention practitioners and researchers attempting to undertake similar international dissemination projects. We review several published conceptual models designed to promote the replication and widespread dissemination of effective programs, and discuss their strengths and limitations in the context of planning and implementing a complex, large-scale real-world dissemination effort. Copyright © 2015 Elsevier Ltd. All rights reserved.
Geller, Ruth; Bear, Todd M.; Foulds, Abigail L.; Duell, Jessica; Sharma, Ravi
2015-01-01
Background. Emerging research highlights the promise of community- and policy-level strategies in preventing youth violence. Large-scale economic developments, such as sports and entertainment arenas and casinos, may improve the living conditions, economics, public health, and overall wellbeing of area residents, and may influence rates of violence within communities. Objective. To assess the effect of community economic development efforts on neighborhood residents' perceptions of violence, safety, and economic benefits. Methods. Telephone survey in 2011 using a listed sample of randomly selected numbers in six Pittsburgh neighborhoods. Descriptive analyses examined measures of perceived violence, safety, and economic benefit. Responses were compared across neighborhoods using chi-square tests for multiple comparisons. Survey results were compared to census and police data. Results. Residents in neighborhoods with the large-scale economic developments reported more casino-specific and arena-specific economic benefits. However, 42% of participants in the neighborhood with the entertainment arena felt there was an increase in crime, and 29% of respondents from the neighborhood with the casino felt there was an increase. In contrast, recorded crime decreased in both neighborhoods. Conclusions. Large-scale economic developments have a direct influence on the perception of violence, regardless of actual violence rates. PMID:26273310
Inflation physics from the cosmic microwave background and large scale structure
NASA Astrophysics Data System (ADS)
Abazajian, K. N.; Arnold, K.; Austermann, J.; Benson, B. A.; Bischoff, C.; Bock, J.; Bond, J. R.; Borrill, J.; Buder, I.; Burke, D. L.; Calabrese, E.; Carlstrom, J. E.; Carvalho, C. S.; Chang, C. L.; Chiang, H. C.; Church, S.; Cooray, A.; Crawford, T. M.; Crill, B. P.; Dawson, K. S.; Das, S.; Devlin, M. J.; Dobbs, M.; Dodelson, S.; Doré, O.; Dunkley, J.; Feng, J. L.; Fraisse, A.; Gallicchio, J.; Giddings, S. B.; Green, D.; Halverson, N. W.; Hanany, S.; Hanson, D.; Hildebrandt, S. R.; Hincks, A.; Hlozek, R.; Holder, G.; Holzapfel, W. L.; Honscheid, K.; Horowitz, G.; Hu, W.; Hubmayr, J.; Irwin, K.; Jackson, M.; Jones, W. C.; Kallosh, R.; Kamionkowski, M.; Keating, B.; Keisler, R.; Kinney, W.; Knox, L.; Komatsu, E.; Kovac, J.; Kuo, C.-L.; Kusaka, A.; Lawrence, C.; Lee, A. T.; Leitch, E.; Linde, A.; Linder, E.; Lubin, P.; Maldacena, J.; Martinec, E.; McMahon, J.; Miller, A.; Mukhanov, V.; Newburgh, L.; Niemack, M. D.; Nguyen, H.; Nguyen, H. T.; Page, L.; Pryke, C.; Reichardt, C. L.; Ruhl, J. E.; Sehgal, N.; Seljak, U.; Senatore, L.; Sievers, J.; Silverstein, E.; Slosar, A.; Smith, K. M.; Spergel, D.; Staggs, S. T.; Stark, A.; Stompor, R.; Vieregg, A. G.; Wang, G.; Watson, S.; Wollack, E. J.; Wu, W. L. K.; Yoon, K. W.; Zahn, O.; Zaldarriaga, M.
2015-03-01
Fluctuations in the intensity and polarization of the cosmic microwave background (CMB) and the large-scale distribution of matter in the universe each contain clues about the nature of the earliest moments of time. The next generation of CMB and large-scale structure (LSS) experiments are poised to test the leading paradigm for these earliest moments-the theory of cosmic inflation-and to detect the imprints of the inflationary epoch, thereby dramatically increasing our understanding of fundamental physics and the early universe. A future CMB experiment with sufficient angular resolution and frequency coverage that surveys at least 1% of the sky to a depth of 1 uK-arcmin can deliver a constraint on the tensor-to-scalar ratio that will either result in a 5 σ measurement of the energy scale of inflation or rule out all large-field inflation models, even in the presence of foregrounds and the gravitational lensing B-mode signal. LSS experiments, particularly spectroscopic surveys such as the Dark Energy Spectroscopic Instrument, will complement the CMB effort by improving current constraints on running of the spectral index by up to a factor of four, improving constraints on curvature by a factor of ten, and providing non-Gaussianity constraints that are competitive with the current CMB bounds.
Inflation Physics from the Cosmic Microwave Background and Large Scale Structure
NASA Technical Reports Server (NTRS)
Abazajian, K.N.; Arnold,K.; Austermann, J.; Benson, B.A.; Bischoff, C.; Bock, J.; Bond, J.R.; Borrill, J.; Buder, I.; Burke, D.L.;
2013-01-01
Fluctuations in the intensity and polarization of the cosmic microwave background (CMB) and the large-scale distribution of matter in the universe each contain clues about the nature of the earliest moments of time. The next generation of CMB and large-scale structure (LSS) experiments are poised to test the leading paradigm for these earliest moments---the theory of cosmic inflation---and to detect the imprints of the inflationary epoch, thereby dramatically increasing our understanding of fundamental physics and the early universe. A future CMB experiment with sufficient angular resolution and frequency coverage that surveys at least 1% of the sky to a depth of 1 uK-arcmin can deliver a constraint on the tensor-to-scalar ratio that will either result in a 5-sigma measurement of the energy scale of inflation or rule out all large-field inflation models, even in the presence of foregrounds and the gravitational lensing B-mode signal. LSS experiments, particularly spectroscopic surveys such as the Dark Energy Spectroscopic Instrument, will complement the CMB effort by improving current constraints on running of the spectral index by up to a factor of four, improving constraints on curvature by a factor of ten, and providing non-Gaussianity constraints that are competitive with the current CMB bounds.
Inflation physics from the cosmic microwave background and large scale structure
Abazajian, K. N.; Arnold, K.; Austermann, J.; ...
2014-06-26
Here, fluctuations in the intensity and polarization of the cosmic microwave background (CMB) and the large-scale distribution of matter in the universe each contain clues about the nature of the earliest moments of time. The next generation of CMB and large-scale structure (LSS) experiments are poised to test the leading paradigm for these earliest moments—the theory of cosmic inflation—and to detect the imprints of the inflationary epoch, thereby dramatically increasing our understanding of fundamental physics and the early universe. A future CMB experiment with sufficient angular resolution and frequency coverage that surveys at least 1% of the sky to a depth of 1 uK-arcmin can deliver a constraint on the tensor-to-scalar ratio that will either result in a 5σ measurement of the energy scale of inflation or rule out all large-field inflation models, even in the presence of foregrounds and the gravitational lensing B-mode signal. LSS experiments, particularly spectroscopic surveys such as the Dark Energy Spectroscopic Instrument, will complement the CMB effort by improving current constraints on running of the spectral index by up to a factor of four, improving constraints on curvature by a factor of ten, and providing non-Gaussianity constraints that are competitive with the current CMB bounds.
High throughput platforms for structural genomics of integral membrane proteins.
Mancia, Filippo; Love, James
2011-08-01
Structural genomics approaches on integral membrane proteins have been postulated for over a decade, yet specific efforts lag years behind their soluble counterparts. Indeed, high throughput methodologies for production and characterization of prokaryotic integral membrane proteins are only now emerging, while large-scale efforts for eukaryotic ones are still in their infancy. Presented here is a review of recent literature on actively ongoing structural genomics of membrane protein initiatives, with a focus on those implementing techniques aimed at increasing the rate of success for this class of macromolecules. Copyright © 2011 Elsevier Ltd. All rights reserved.
Evaluating ecosystem-based management options: Effects of trawling in Torres Strait, Australia
NASA Astrophysics Data System (ADS)
Ellis, Nick; Pantus, Francis; Welna, Andrzej; Butler, Alan
2008-09-01
A suite of management options for a prawn trawl fishery in Torres Strait, Australia was assessed for impacts on the benthic fauna using a dynamic management strategy evaluation approach. The specification of the management options was gained through consultation with stakeholders. Data for the model was drawn from several sources: the fleet data from fishery logbooks and satellite vessel monitoring systems, benthic depletion rates from trawl-down experiments, benthic recovery rates from post-experiment recovery monitoring studies, and benthic distribution from large-scale benthic surveys. Although there were large uncertainties in the resulting indicators, robust measures relevant to management were obtained by taking ratios relative to the status quo. The management control with the biggest effect was total effort; reducing trawl effort always led to increases in benthic faunal density of up to 10%. Spatial closures had a smaller benefit of up to 2%. The effect of closing a set of buffer zones around reefs to trawling was indistinguishable from the status quo option. Closing a larger area, however, was largely beneficial especially for sea cucumbers. When the spatial distributions of fauna prior to fishing were accounted for, fauna with distributions positively correlated with effort improved relative to those negatively correlated. The reduction in prawn catch under effort reduction scenarios could be ameliorated by introducing temporal closures over the full-moon period.
Terwilliger, Thomas C; Bricogne, Gerard
2014-10-01
Accurate crystal structures of macromolecules are of high importance in the biological and biomedical fields. Models of crystal structures in the Protein Data Bank (PDB) are in general of very high quality as deposited. However, methods for obtaining the best model of a macromolecular structure from a given set of experimental X-ray data continue to progress at a rapid pace, making it possible to improve most PDB entries after their deposition by re-analyzing the original deposited data with more recent software. This possibility represents a very significant departure from the situation that prevailed when the PDB was created, when it was envisioned as a cumulative repository of static contents. A radical paradigm shift for the PDB is therefore proposed, away from the static archive model towards a much more dynamic body of continuously improving results in symbiosis with continuously improving methods and software. These simultaneous improvements in methods and final results are made possible by the current deposition of processed crystallographic data (structure-factor amplitudes) and will be supported further by the deposition of raw data (diffraction images). It is argued that it is both desirable and feasible to carry out small-scale and large-scale efforts to make this paradigm shift a reality. Small-scale efforts would focus on optimizing structures that are of interest to specific investigators. Large-scale efforts would undertake a systematic re-optimization of all of the structures in the PDB, or alternatively the redetermination of groups of structures that are either related to or focused on specific questions. All of the resulting structures should be made generally available, along with the precursor entries, with various views of the structures being made available depending on the types of questions that users are interested in answering.
Terwilliger, Thomas C.; Bricogne, Gerard
2014-09-30
Accurate crystal structures of macromolecules are of high importance in the biological and biomedical fields. Models of crystal structures in the Protein Data Bank (PDB) are in general of very high quality as deposited. However, methods for obtaining the best model of a macromolecular structure from a given set of experimental X-ray data continue to progress at a rapid pace, making it possible to improve most PDB entries after their deposition by re-analyzing the original deposited data with more recent software. This possibility represents a very significant departure from the situation that prevailed when the PDB was created, when it was envisioned as a cumulative repository of static contents. A radical paradigm shift for the PDB is therefore proposed, away from the static archive model towards a much more dynamic body of continuously improving results in symbiosis with continuously improving methods and software. These simultaneous improvements in methods and final results are made possible by the current deposition of processed crystallographic data (structure-factor amplitudes) and will be supported further by the deposition of raw data (diffraction images). It is argued that it is both desirable and feasible to carry out small-scale and large-scale efforts to make this paradigm shift a reality. Small-scale efforts would focus on optimizing structures that are of interest to specific investigators. Large-scale efforts would undertake a systematic re-optimization of all of the structures in the PDB, or alternatively the redetermination of groups of structures that are either related to or focused on specific questions. All of the resulting structures should be made generally available, along with the precursor entries, with various views of the structures being made available depending on the types of questions that users are interested in answering.
Terwilliger, Thomas C.; Bricogne, Gerard
2014-01-01
Accurate crystal structures of macromolecules are of high importance in the biological and biomedical fields. Models of crystal structures in the Protein Data Bank (PDB) are in general of very high quality as deposited. However, methods for obtaining the best model of a macromolecular structure from a given set of experimental X-ray data continue to progress at a rapid pace, making it possible to improve most PDB entries after their deposition by re-analyzing the original deposited data with more recent software. This possibility represents a very significant departure from the situation that prevailed when the PDB was created, when it was envisioned as a cumulative repository of static contents. A radical paradigm shift for the PDB is therefore proposed, away from the static archive model towards a much more dynamic body of continuously improving results in symbiosis with continuously improving methods and software. These simultaneous improvements in methods and final results are made possible by the current deposition of processed crystallographic data (structure-factor amplitudes) and will be supported further by the deposition of raw data (diffraction images). It is argued that it is both desirable and feasible to carry out small-scale and large-scale efforts to make this paradigm shift a reality. Small-scale efforts would focus on optimizing structures that are of interest to specific investigators. Large-scale efforts would undertake a systematic re-optimization of all of the structures in the PDB, or alternatively the redetermination of groups of structures that are either related to or focused on specific questions. All of the resulting structures should be made generally available, along with the precursor entries, with various views of the structures being made available depending on the types of questions that users are interested in answering. PMID:25286839
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terwilliger, Thomas C.; Bricogne, Gerard
Accurate crystal structures of macromolecules are of high importance in the biological and biomedical fields. Models of crystal structures in the Protein Data Bank (PDB) are in general of very high quality as deposited. However, methods for obtaining the best model of a macromolecular structure from a given set of experimental X-ray data continue to progress at a rapid pace, making it possible to improve most PDB entries after their deposition by re-analyzing the original deposited data with more recent software. This possibility represents a very significant departure from the situation that prevailed when the PDB was created, when it was envisioned as a cumulative repository of static contents. A radical paradigm shift for the PDB is therefore proposed, away from the static archive model towards a much more dynamic body of continuously improving results in symbiosis with continuously improving methods and software. These simultaneous improvements in methods and final results are made possible by the current deposition of processed crystallographic data (structure-factor amplitudes) and will be supported further by the deposition of raw data (diffraction images). It is argued that it is both desirable and feasible to carry out small-scale and large-scale efforts to make this paradigm shift a reality. Small-scale efforts would focus on optimizing structures that are of interest to specific investigators. Large-scale efforts would undertake a systematic re-optimization of all of the structures in the PDB, or alternatively the redetermination of groups of structures that are either related to or focused on specific questions. All of the resulting structures should be made generally available, along with the precursor entries, with various views of the structures being made available depending on the types of questions that users are interested in answering.
Ma, Ben; Lei, Shuo; Qing, Qin; Wen, Yali
2018-05-03
The International Union for Conservation of Nature (IUCN) reduced the threat status of the giant panda from “endangered” to “vulnerable” in September 2016. In this study, we analyzed current practices for giant panda conservation at regional and local environmental scales, based on recent reports of giant panda protection efforts in Sichuan Province, China, combined with the survey results from 927 households within and adjacent to the giant panda reserves in this area. The results showed that household attitudes were very positive regarding giant panda protection efforts. Over the last 10 years, farmers’ dependence on the natural resources provided by giant panda reserves significantly decreased. However, socio-economic development increased resource consumption, and led to climate change, habitat fragmentation, environmental pollution, and other issues that placed increased pressure on giant panda populations. This difference between local and regional scales must be considered when evaluating the IUCN status of giant pandas. While the status of this species has improved in the short-term due to positive local attitudes, large-scale socio-economic development pressure could have long-term negative impacts. Consequently, the IUCN assessment leading to the classification of giant panda as “vulnerable” instead of “endangered”, should not affect its conservation intensity and effort, as such actions could negatively impact population recovery efforts, leading to the extinction of this charismatic species.
NASA Astrophysics Data System (ADS)
Henkel, Daniela; Eisenhauer, Anton
2017-04-01
During the last decades, the number of large research projects has increased, and with it the requirement for multidisciplinary, multisectoral collaboration. Such complex, large-scale projects demand new competencies in forming, managing, and using large, diverse teams as a competitive advantage. For complex projects the effort is magnified: multiple large international research consortia involving academic and non-academic partners, including big industries, NGOs, and private and public bodies, all with cultural differences, discrepant expectations of teamwork, and differences in collaboration between national and multi-national administrations and research organisations, challenge the organisation and management of such multi-partner research consortia. How many partners are needed to establish and conduct collaboration with a multidisciplinary and multisectoral approach? How much personnel effort and what kinds of management techniques are required for such projects? This presentation identifies advantages and challenges of large research projects based on the experiences made in the context of an Innovative Training Network (ITN) project within the Marie Skłodowska-Curie Actions of the European HORIZON 2020 program. Possible strategies are discussed to circumvent and avoid conflicts already at the beginning of the project.
(abstract) Scaling Nominal Solar Cell Impedances for Array Design
NASA Technical Reports Server (NTRS)
Mueller, Robert L; Wallace, Matthew T.; Iles, Peter
1994-01-01
This paper discusses a task whose objective is to characterize solar cell array AC impedance and develop scaling rules for impedance characterization of large arrays by testing single solar cells and small arrays. This effort is aimed at formulating a methodology for estimating the AC impedance of the Mars Pathfinder (MPF) cruise and lander solar arrays based upon testing single cells and small solar cell arrays, and at creating a basis for the design of a single shunt limiter for MPF power control of flight solar arrays having very different impedances.
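The kind of scaling rule this task aims to develop can be illustrated with the idealized lumped-network baseline, in which Ns cells in series and Np parallel strings scale the single-cell impedance by Ns/Np. This sketch is a hypothetical baseline only, not the MPF methodology; the cell impedance value is invented for illustration:

```python
def scale_impedance(z_cell, n_series, n_parallel):
    """Ideal lumped scaling: series cells add impedance, parallel strings divide it."""
    return z_cell * n_series / n_parallel

z_cell = complex(2.0, -0.5)                # hypothetical single-cell AC impedance, ohms
z_array = scale_impedance(z_cell, 20, 4)   # 20 cells per string, 4 strings in parallel
print(z_array)                             # (10-2.5j)
```

A measured scaling law for real arrays would depart from this ideal through wiring parasitics and cell-to-cell variation, which is presumably why the task validates against tests on single cells and small arrays.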
Luo, Zhong-Cheng; Liu, Jian-Meng; Fraser, William D
2010-02-01
The adverse health effects of environmental contaminants (ECs) are a rising public health concern and a major threat to sustainable socioeconomic development. Developing fetuses and growing children are particularly vulnerable to the adverse effects of ECs. However, assessing the health impact of ECs presents a major challenge: multiple outcomes may arise from one exposure, multiple exposures may result in one outcome, interactions between ECs and between ECs, nutrients and genetic factors are complex, and EC exposures change dynamically over the life course. Large-scale prospective birth cohort studies collecting extensive data and specimens starting from the prenatal or pre-conception period, although costly, hold promise as a means to more clearly quantify the health effects of ECs and to unravel the complex interactions between ECs, nutrients and genotypes. A number of such large-scale studies have been launched in developed countries. We present an overview of the "why", "what" and "how" behind these efforts, with the objective of uncovering major unidentified limitations and needs. Three major limitations were identified: (1) limited data and bio-specimens for early-life EC exposure assessment in some birth cohort studies; (2) heavy participant burdens in some birth cohort studies, which may bias recruitment, risk substantial loss to follow-up, and invite protocol deviations limiting the quality of data and specimen collection, with an overall potential bias towards the null effect; (3) a lack of concerted efforts to build comparable birth cohorts across countries to take advantage of natural "experiments" (large differences in EC exposure levels between countries) for more in-depth assessment of dose-response relationships, threshold exposure levels, and positive and negative effect modifiers.
Addressing these concerns in current or future large-scale birth cohort studies may help to produce better evidence on the health effects of ECs.
Multi-scale approaches for high-speed imaging and analysis of large neural populations
Ahrens, Misha B.; Yuste, Rafael; Peterka, Darcy S.; Paninski, Liam
2017-01-01
Progress in modern neuroscience critically depends on our ability to observe the activity of large neuronal populations with cellular spatial and high temporal resolution. However, two bottlenecks constrain efforts towards fast imaging of large populations. First, the resulting large video data is challenging to analyze. Second, there is an explicit tradeoff between imaging speed, signal-to-noise, and field of view: with current recording technology we cannot image very large neuronal populations with simultaneously high spatial and temporal resolution. Here we describe multi-scale approaches for alleviating both of these bottlenecks. First, we show that spatial and temporal decimation techniques based on simple local averaging provide order-of-magnitude speedups in spatiotemporally demixing calcium video data into estimates of single-cell neural activity. Second, once the shapes of individual neurons have been identified at fine scale (e.g., after an initial phase of conventional imaging with standard temporal and spatial resolution), we find that the spatial/temporal resolution tradeoff shifts dramatically: after demixing we can accurately recover denoised fluorescence traces and deconvolved neural activity of each individual neuron from coarse scale data that has been spatially decimated by an order of magnitude. This offers a cheap method for compressing this large video data, and also implies that it is possible to either speed up imaging significantly, or to “zoom out” by a corresponding factor to image order-of-magnitude larger neuronal populations with minimal loss in accuracy or temporal resolution. PMID:28771570
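The spatial decimation the abstract describes (simple local averaging of pixels) can be sketched as follows. This is a minimal illustration, not the authors' pipeline; the function name and the toy movie are assumptions for the example.

```python
import numpy as np

def decimate_spatial(video, k):
    """Spatially decimate a (T, H, W) video by k-fold local averaging.

    Crops H and W down to multiples of k, then averages each k x k block,
    reducing the per-frame pixel count (and downstream demixing cost) by k**2.
    """
    T, H, W = video.shape
    H2, W2 = H - H % k, W - W % k
    v = video[:, :H2, :W2]
    # Reshape so each k x k spatial block occupies its own axes, then average.
    return v.reshape(T, H2 // k, k, W2 // k, k).mean(axis=(2, 4))

# Toy example: 100 frames of 64 x 64 pixels, decimated 4-fold to 16 x 16.
movie = np.random.rand(100, 64, 64)
small = decimate_spatial(movie, 4)
print(small.shape)  # (100, 16, 16)
```

Because averaging is linear, fluorescence traces demixed from the decimated movie remain (noisy) linear combinations of the same underlying single-cell signals, which is why the order-of-magnitude speedup need not cost much accuracy.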
Hollis, Geoff; Westbury, Chris
2018-02-01
Large-scale semantic norms have become both prevalent and influential in recent psycholinguistic research. However, little attention has been directed towards understanding the methodological best practices of such norm collection efforts. We compared the quality of semantic norms obtained through rating scales, numeric estimation, and a less commonly used judgment format called best-worst scaling. We found that best-worst scaling usually produces norms with higher predictive validities than other response formats, and does so requiring less data to be collected overall. We also found evidence that the various response formats may be producing qualitatively, rather than just quantitatively, different data. This raises the issue of potential response format bias, which has not been addressed by previous efforts to collect semantic norms, likely because of previous reliance on a single type of response format for a single type of semantic judgment. We have made available software for creating best-worst stimuli and scoring best-worst data. We also made available new norms for age of acquisition, valence, arousal, and concreteness collected using best-worst scaling. These norms include entries for 1,040 words, of which 1,034 are also contained in the ANEW norms (Bradley & Lang, Affective Norms for English Words (ANEW): Instruction manual and affective ratings (pp. 1-45), Technical Report C-1, Center for Research in Psychophysiology, University of Florida, 1999).
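In best-worst scaling, raters see a small tuple of items and pick the best and worst on some dimension. A common baseline scoring rule (a sketch only; the authors' released software uses its own scoring, and the words below are made up for illustration) assigns each item (times chosen best − times chosen worst) / times shown, yielding a score in [−1, 1]:

```python
from collections import defaultdict

def bws_scores(trials):
    """Score items from best-worst trials.

    Each trial is (items_shown, best, worst). The baseline score is
    (#times best - #times worst) / #times shown, bounded in [-1, 1].
    """
    best, worst, shown = defaultdict(int), defaultdict(int), defaultdict(int)
    for items, b, w in trials:
        for item in items:
            shown[item] += 1
        best[b] += 1
        worst[w] += 1
    return {item: (best[item] - worst[item]) / shown[item] for item in shown}

# Hypothetical valence trials: 4 words shown, one picked best, one worst.
trials = [
    (("joy", "table", "war", "rain"), "joy", "war"),
    (("joy", "rain", "war", "idea"), "joy", "war"),
    (("table", "idea", "rain", "war"), "rain", "war"),
]
scores = bws_scores(trials)
print(scores["joy"], scores["war"])  # 1.0 -1.0
```

Each trial yields several implicit pairwise comparisons (best beats everything shown, everything shown beats worst), which is why best-worst data can reach a given precision with fewer trials than one-item rating scales.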
Bolinger, Elizabeth; Reese, Caitlin; Suhr, Julie; Larrabee, Glenn J
2014-02-01
We examined the effect of simulated head injury on scores on the Neurological Complaints (NUC) and Cognitive Complaints (COG) scales of the Minnesota Multiphasic Personality Inventory-2 Restructured Form (MMPI-2-RF). Young adults with a history of mild head injury were randomly assigned to simulate head injury or give their best effort on a battery of neuropsychological tests, including the MMPI-2-RF. Simulators who also showed poor effort on performance validity tests (PVTs) were compared with controls who showed valid performance on PVTs. Results showed that both scales, but especially NUC, are elevated in individuals simulating head injury, with medium to large effect sizes. Although both scales were highly correlated with all MMPI-2-RF over-reporting validity scales, the relationship of Response Bias Scale to both NUC and COG was much stronger in the simulators than controls. Even accounting for over-reporting on the MMPI-2-RF, NUC was related to general somatic complaints regardless of group membership, whereas COG was related to both psychological distress and somatic complaints in the control group only. Neither scale was related to actual neuropsychological performance, regardless of group membership. Overall, results provide further evidence that self-reported cognitive symptoms can be due to many causes, not necessarily cognitive impairment, and can be exaggerated in a non-credible manner.
Assessing sufficiency of thermal riverscapes for resilient ...
Resilient salmon populations require river networks that provide water temperature regimes sufficient to support a diversity of salmonid life histories across space and time. Efforts to protect, enhance and restore watershed thermal regimes for salmon may target specific locations and features within stream networks hypothesized to provide disproportionately high-value functional resilience to salmon populations. These include relatively small-scale features such as thermal refuges, and larger-scale features such as entire watersheds or aquifers that support thermal regimes buffered from local climatic conditions. Quantifying the value of both small and large scale thermal features to salmon populations has been challenged by both the difficulty of mapping thermal regimes at sufficient spatial and temporal resolutions, and integrating thermal regimes into population models. We attempt to address these challenges by using newly-available datasets and modeling approaches to link thermal regimes to salmon populations across scales. We will describe an individual-based modeling approach for assessing sufficiency of thermal refuges for migrating salmon and steelhead in large rivers, as well as a population modeling approach for assessing large-scale climate refugia for salmon in the Pacific Northwest. Many rivers and streams in the Pacific Northwest are currently listed as impaired under the Clean Water Act as a result of high summer water temperatures. Adverse effec
Aguilera, Stacy E; Cole, Jennifer; Finkbeiner, Elena M; Le Cornu, Elodie; Ban, Natalie C; Carr, Mark H; Cinner, Joshua E; Crowder, Larry B; Gelcich, Stefan; Hicks, Christina C; Kittinger, John N; Martone, Rebecca; Malone, Daniel; Pomeroy, Carrie; Starr, Richard M; Seram, Sanah; Zuercher, Rachel; Broad, Kenneth
2015-01-01
Globally, small-scale fisheries are influenced by dynamic climate, governance, and market drivers, which present social and ecological challenges and opportunities. It is difficult to manage fisheries adaptively for fluctuating drivers, except to allow participants to shift effort among multiple fisheries. Adapting to changing conditions allows small-scale fishery participants to survive economic and environmental disturbances and benefit from optimal conditions. This study explores the relative influence of large-scale drivers on shifts in effort and outcomes among three closely linked fisheries in Monterey Bay since the Magnuson-Stevens Fisheries Conservation and Management Act of 1976. In this region, Pacific sardine (Sardinops sagax), northern anchovy (Engraulis mordax), and market squid (Loligo opalescens) fisheries comprise a tightly linked system where shifting focus among fisheries is a key element to adaptive capacity and reduced social and ecological vulnerability. Using a cluster analysis of landings, we identify four modes from 1974 to 2012 that are dominated (i.e., a given species accounting for the plurality of landings) by squid, sardine, anchovy, or lack any dominance, and seven points of transition among these periods. This approach enables us to determine which drivers are associated with each mode and each transition. Overall, we show that market and climate drivers are predominantly attributed to dominance transitions. Model selection of external drivers indicates that governance phases, reflected as perceived abundance, dictate long-term outcomes. Our findings suggest that globally, small-scale fishery managers should consider enabling shifts in effort among fisheries and retaining existing flexibility, as adaptive capacity is a critical determinant for social and ecological resilience.
NASA Astrophysics Data System (ADS)
Warneke, C.; Schwarz, J. P.; Yokelson, R. J.; Roberts, J. M.; Koss, A.; Coggon, M.; Yuan, B.; Sekimoto, K.
2017-12-01
A combination of a warmer, drier climate with fire-control practices over the last century has produced a situation in which we can expect more frequent fires and fires of larger magnitude in the western U.S. and Canada. There are urgent needs to better understand the impacts of wildfire and biomass burning (BB) on the atmosphere and climate system, and for policy-relevant science to aid in the process of managing fires. The FIREX (Fire Influence on Regional and Global Environment Experiment) research effort is a multi-year, multi-agency measurement campaign focused on the impact of BB on climate and air quality from western North American wildfires, where research takes place on scales ranging from the flame front to the global atmosphere. FIREX includes methods development and small- and large-scale laboratory and field experiments. FIREX will include: emission factor measurements from typical North American fuels in the fire science laboratory in Missoula, Montana; mobile laboratory deployments; and ground site measurements at sites influenced by BB from several western states. The main FIREX effort will be a large field study with multiple aircraft and mobile labs in the fire season of 2019. One of the main advances of FIREX is the availability of various new measurement techniques that allow for smoke evaluation in unprecedented detail. The first major effort of FIREX was the fire science laboratory measurements in October 2016, in which a large number of previously understudied nitrogen-containing volatile organic compounds (NVOCs) were measured using H3O+ CIMS and I- CIMS instruments. The contribution of NVOCs to the total reactive nitrogen budget and the relationship to the nitrogen content of the fuel are investigated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terwilliger, Thomas C., E-mail: terwilliger@lanl.gov; Bricogne, Gerard; Los Alamos National Laboratory, Mail Stop M888, Los Alamos, NM 87507
Macromolecular structures deposited in the PDB can and should be continually reinterpreted and improved on the basis of their accompanying experimental X-ray data, exploiting the steady progress in methods and software that the deposition of such data into the PDB on a massive scale has made possible. Accurate crystal structures of macromolecules are of high importance in the biological and biomedical fields. Models of crystal structures in the Protein Data Bank (PDB) are in general of very high quality as deposited. However, methods for obtaining the best model of a macromolecular structure from a given set of experimental X-ray data continue to progress at a rapid pace, making it possible to improve most PDB entries after their deposition by re-analyzing the original deposited data with more recent software. This possibility represents a very significant departure from the situation that prevailed when the PDB was created, when it was envisioned as a cumulative repository of static contents. A radical paradigm shift for the PDB is therefore proposed, away from the static archive model towards a much more dynamic body of continuously improving results in symbiosis with continuously improving methods and software. These simultaneous improvements in methods and final results are made possible by the current deposition of processed crystallographic data (structure-factor amplitudes) and will be supported further by the deposition of raw data (diffraction images). It is argued that it is both desirable and feasible to carry out small-scale and large-scale efforts to make this paradigm shift a reality. Small-scale efforts would focus on optimizing structures that are of interest to specific investigators. Large-scale efforts would undertake a systematic re-optimization of all of the structures in the PDB, or alternatively the redetermination of groups of structures that are either related to or focused on specific questions.
All of the resulting structures should be made generally available, along with the precursor entries, with various views of the structures being made available depending on the types of questions that users are interested in answering.
Operational development of small plant growth systems
NASA Technical Reports Server (NTRS)
Scheld, H. W.; Magnuson, J. W.; Sauer, R. L.
1986-01-01
The results of a study undertaken in the first phase of an empirical effort in the development of small plant growth chambers for production of salad-type vegetables on the space shuttle or space station are discussed. The overall effort is visualized as providing the underpinning of practical experience in handling plant systems in space, which will provide major support for future efforts in planning, design, and construction of plant-based (phytomechanical) systems for support of human habitation in space. The assumptions underlying the effort hold that large-scale phytomechanical habitability support systems for future space stations must evolve from the simple to the complex. The highly complex final systems will be developed from the accumulated experience and data gathered from repetitive tests and trials of fragments or subsystems of the whole in an operational mode. These developing system components will, meanwhile, serve a useful operational function in providing psychological support and diversion for the crews.
NASA Astrophysics Data System (ADS)
Beichner, Robert
2015-03-01
The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students to learn and succeed not only in acquiring content, but also to practice important 21st century skills like problem solving, communication, and teamwork. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicate highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).
A global probabilistic tsunami hazard assessment from earthquake sources
Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana
2017-01-01
Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.
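The exceedance rates at the core of a PTHA can be illustrated with a toy calculation. Under the common assumption that each earthquake scenario recurs as an independent Poisson process with a known annual rate, the annual rate of run-up exceeding a height h is the sum of the rates of all scenarios whose modelled run-up exceeds h. The scenario list below is entirely hypothetical, not from this study:

```python
import math

def exceedance_rate(scenarios, h):
    """Annual rate of tsunami run-up exceeding height h (metres),
    assuming independent Poissonian scenario occurrence."""
    return sum(rate for runup, rate in scenarios if runup > h)

def exceedance_prob(scenarios, h, years):
    """Probability of at least one exceedance of h within `years`."""
    return 1.0 - math.exp(-exceedance_rate(scenarios, h) * years)

# Hypothetical scenarios: (modelled run-up in metres, annual rate).
scenarios = [(0.5, 1e-1), (2.0, 1e-2), (5.0, 1e-3), (10.0, 1e-4)]
print(exceedance_rate(scenarios, 1.0))   # ~0.0111 per year
print(exceedance_prob(scenarios, 1.0, 50))
```

Sweeping h over a range of heights turns this into a hazard curve; epistemic uncertainty in the scenario rates, as the abstract notes, propagates directly into uncertainty in these exceedance estimates.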
ERIC Educational Resources Information Center
Aßmann, Christian; Würbach, Ariane; Goßmann, Solange; Geissler, Ferdinand; Bela, Anika
2017-01-01
Large-scale surveys typically exhibit data structures characterized by rich mutual dependencies between surveyed variables and individual-specific skip patterns. Despite high efforts in fieldwork and questionnaire design, missing values inevitably occur. One approach for handling missing values is to provide multiply imputed data sets, thus…
ERIC Educational Resources Information Center
Volk, Robert J.; Lewis, Robert A.
Causal models of adolescent substance abuse from a family systems perspective are developed using data from a large-scale family therapy efficacy grant. It is argued that the literature on families of adolescent substance abusers is scattered in its theoretical and empirical efforts, tends to not account for individual and family developmental…
2012-10-01
DEFENSE ACQUISITIONS: Future Aerostat and Airship Investment Decisions Drive Oversight and Coordination Needs. ...earlier, LEMV experienced schedule delays of at least 10 months, largely rooted in technical, design, and engineering problems in scaling up the airship. ...had informal coordination with the Blue Devil Block 2 effort in the past. For example, originally both airships had several diesel engine...
Bamdad Barari; Thomas K. Ellingham; Issam I. Ghamhia; Krishna M. Pillai; Rani El-Hajjar; Lih-Sheng Turng; Ronald Sabo
2016-01-01
Plant derived cellulose nano-fibers (CNF) are a material with remarkable mechanical properties compared to other natural fibers. However, efforts to produce nano-composites on a large scale using CNF have yet to be investigated. In this study, scalable CNF nano-composites were made from isotropically porous CNF preforms using a freeze drying process. An improvised...
From CIRCUS to EL CIRCO: Issues in Instrument Development for Young Spanish-speaking Children.
ERIC Educational Resources Information Center
Hardy, Roy
The CIRCO project is a large scale effort to design a series of diagnostic instruments, based on the CIRCUS tests, for Spanish-speaking children in preschool, kindergarten, and first grade classrooms in the United States. The goal is to develop measures with the following characteristics: (1) is suitable for use with Spanish-speaking children from…
ERIC Educational Resources Information Center
Jasinski, Barbara Plummer
2013-01-01
Change is often expected as the logical outcome of large scale investments in professional development, yet research studies (e.g., Tyack & Cuban, 1995; Lipson, Mosenthal & Woodside-Jiron, 2000; Schraw & Olafson, 2002) note wide variations in instructional practice despite such efforts. This qualitative inquiry was designed to…
Adriana Sulak; Lynn Huntsinger
2002-01-01
The interlinkage of privately owned foothill oak woodlands and federal grazing permits in the central Sierra Nevada is examined. Knowledge of the viability of the range livestock industry is important to large-scale conservation of hardwood rangelands in the Sierran foothills. Because ranches in the Sierra often use USDA Forest Service grazing allotments, efforts at...
ERIC Educational Resources Information Center
United Nations Food and Agriculture Organization, Rome (Italy).
Focus of this 1976 journal on agricultural and rural development education is how to deal with the shortage of trained manpower which is an obstacle to large-scale rural development efforts. The journal's theme is that a broader approach must be made to generate adequate numbers of trained manpower--all types of nonformal education (agricultural…
A review of challenges to determining and demonstrating efficiency of large fire management
Matthew P. Thompson; Francisco Rodriguez y Silva; David E. Calkin; Michael S. Hand
2017-01-01
Characterising the impacts of wildland fire and fire suppression is critical information for fire management decision-making. Here, we focus on decisions related to the rare larger and longer-duration fire events, where the scope and scale of decision-making can be far broader than initial response efforts, and where determining and demonstrating efficiency of...
ERIC Educational Resources Information Center
Rushton, Gregory T.; Rosengrant, David; Dewar, Andrew; Shah, Lisa; Ray, Herman E.; Sheppard, Keith; Watanabe, Lynn
2017-01-01
Efforts to improve the number and quality of the high school physics teaching workforce have taken several forms, including those sponsored by professional organizations. Using a series of large-scale teacher demographic data sets from the National Center for Education Statistics (NCES), this study sought to investigate trends in teacher quality…
ERIC Educational Resources Information Center
Eklof, Hanna; Nyroos, Mikaela
2013-01-01
Although large-scale national tests have been used for many years in Swedish compulsory schools, very little is known about how pupils actually react to these tests. The question is relevant, however, as pupil reactions in the test situation may affect test performance as well as future attitudes towards assessment. The question is relevant also…
Nancy Rivard. The color of love.
2002-12-01
Nancy Rivard's personal journey to be a part of something bigger than herself resulted in the founding of Airline Ambassadors International (AAI). As a flight attendant for American Airlines, she enlisted the industry in a large-scale volunteer effort and established Airline Ambassadors, whose programs have delivered school supplies, food, medicine, and medical supplies to upwards of 100,000 children in 25 nations.
Petrovan, Silviu O; Schmidt, Benedikt R
2016-01-01
Rare and threatened species are the most frequent focus of conservation science and action. With the ongoing shift from single-species conservation towards the preservation of ecosystem services, there is a greater need to understand abundance trends of common species because declines in common species can disproportionately impact ecosystems function. We used volunteer-collected data in two European countries, the United Kingdom (UK) and Switzerland, since the 1970s to assess national and regional trends for one of Europe's most abundant amphibian species, the common toad (Bufo bufo). Millions of toads were moved by volunteers across roads during this period in an effort to protect them from road traffic. For Switzerland, we additionally estimated trends for the common frog (Rana temporaria), a similarly widespread and common amphibian species. We used state-space models to account for variability in detection and effort and included only populations with at least 5 years of data; 153 populations for the UK and 141 for Switzerland. Common toads declined continuously in each decade in both countries since the 1980s. Given the declines, this common species almost qualifies for International Union for the Conservation of Nature (IUCN) red-listing over this period despite volunteer conservation efforts. Reasons for the declines and wider impacts remain unknown. By contrast, common frog populations were stable or increasing in Switzerland, although there was evidence of declines after 2003. "Toads on Roads" schemes are vital citizen conservation action projects, and the data from such projects can be used for large scale trend estimations of widespread amphibians. We highlight the need for increased research into the status of common amphibian species in addition to conservation efforts focusing on rare and threatened species.
NASA Astrophysics Data System (ADS)
Sweeney, C.; Kort, E. A.; Rella, C.; Conley, S. A.; Karion, A.; Lauvaux, T.; Frankenberg, C.
2015-12-01
Along with a boom in oil and natural gas production in the US, there has been a substantial effort to understand the true environmental impact of these operations on air and water quality, as well as net radiation balance. This multi-institution effort, funded by both governmental and non-governmental agencies, has provided a case study for identification and verification of emissions using a multi-scale, top-down approach. This approach leverages a combination of remote sensing to identify areas that need specific focus and airborne in-situ measurements to quantify both regional and large- to mid-size single-point emitters. Ground-based networks of mobile and stationary measurements provide the bottom tier of measurements from which process-level information can be gathered to better understand the specific sources and temporal distribution of the emitters. The motivation for this type of approach is largely driven by recent work in the Barnett Shale region in Texas as well as the San Juan Basin in New Mexico and Colorado; these studies suggest that relatively few single-point emitters dominate the regional emissions of CH4.
A Scoping Review to Address the Culture of Concussion in Youth and High School Sports.
Sarmiento, Kelly; Donnell, Zoe; Hoffman, Rosanne
2017-10-01
In 2013, the National Academy of Sciences emphasized the need to develop, implement, and evaluate effective large-scale educational strategies to improve the culture of concussion in youth and high school sports. In support of this recommendation, in this article we summarize research on factors that contribute to the culture of concussion. We conducted the literature search using 7 electronic databases. We used a scoping review method to identify studies that addressed knowledge, attitudes, behaviors, use of educational resources, and interventions related to concussion among young athletes, coaches, and parents. Of the 33 articles identified, most focused on concussion education (N = 15), followed by knowledge (N = 13), behaviors (N = 13), and attitudes (N = 5). Three studies addressed multiple study populations. The rapid spread of concussion education and awareness efforts has outpaced research on effective strategies to improve knowledge, attitudes, and behaviors that contribute to the culture of concussion. Further research is critical to inform the development and implementation of large-scale educational efforts. This research should incorporate rigorous study designs; be inclusive of diverse ages, socioeconomic status, and racial/ethnic groups; and examine opportunities to improve behavioral outcomes around concussion prevention, reporting, and management. © 2017, American School Health Association.
Fast and fuel efficient? Optimal use of wind by flying albatrosses.
Weimerskirch, H; Guionnet, T; Martin, J; Shaffer, S A; Costa, D P
2000-09-22
The influence of wind patterns on behaviour and effort of free-ranging male wandering albatrosses (Diomedea exulans) was studied with miniaturized external heart-rate recorders in conjunction with satellite transmitters and activity recorders. Heart rate was used as an instantaneous index of energy expenditure. When cruising with favourable tail or side winds, wandering albatrosses can achieve high flight speeds while expending little more energy than birds resting on land. In contrast, heart rate increases concomitantly with increasing head winds, and flight speeds decrease. Our results show that effort is greatest when albatrosses take off from or land on the water. On a larger scale, we show that in order for birds to have the highest probability of experiencing favourable winds, wandering albatrosses use predictable weather systems to engage in a stereotypical flight pattern of large looping tracks. When heading north, albatrosses fly in anticlockwise loops, and to the south, movements are in a clockwise direction. Thus, the capacity to integrate instantaneous eco-physiological measures with records of large-scale flight and wind patterns allows us to understand better the complex interplay between the evolution of morphological, physiological and behavioural adaptations of albatrosses in the windiest place on earth.
Computing at the speed limit (supercomputers)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernhard, R.
1982-07-01
The author discusses how unheralded efforts in the United States, mainly in universities, have removed major stumbling blocks to building cost-effective superfast computers for scientific and engineering applications within five years. These computers would have sustained speeds of billions of floating-point operations per second (flops), whereas with the fastest machines today the top sustained speed is only 25 million flops, with bursts to 160 megaflops. Cost-effective superfast machines can be built because of advances in very large-scale integration and the special software needed to program the new machines. VLSI greatly reduces the cost per unit of computing power. The development of such computers would come at an opportune time. Although the US leads the world in large-scale computer technology, its supremacy is now threatened, not surprisingly, by the Japanese. Publicized reports indicate that the Japanese government is funding a cooperative effort by commercial computer manufacturers to develop superfast computers - about 1000 times faster than modern supercomputers. The US computer industry, by contrast, has balked at attempting to boost computer power so sharply because of the uncertain market for the machines and the failure of similar projects in the past to show significant results.
Opportunities and challenges in industrial plantation mapping in big data era
NASA Astrophysics Data System (ADS)
Dong, J.; Xiao, X.; Qin, Y.; Chen, B.; Wang, J.; Kou, W.; Zhai, D.
2017-12-01
With the increasing demand for timber, rubber, and palm oil in the world market, industrial plantations have expanded dramatically, especially in Southeast Asia, affecting ecosystem services and human wellbeing. However, existing efforts at plantation mapping remain limited and have constrained our understanding of the magnitude of plantation expansion and its potential environmental effects. Here we present a literature review of existing efforts at plantation mapping based on one or multiple remote sensing sources, covering rubber, oil palm, and eucalyptus plantations. The biophysical features and spectral characteristics of plantations are introduced first, followed by a comparison of existing algorithms across plantation types. On this basis, we propose potential improvements in large-scale plantation mapping based on virtual constellations of multiple sensors, citizen science tools, and cloud computing technology. Based on the literature review, we discuss a series of issues for future large-scale operational plantation mapping.
Piwoz, Ellen G; Huffman, Sandra L; Quinn, Victoria J
2003-03-01
Although many successes have been achieved in promoting breastfeeding, this has not been the case for complementary feeding. Some successes in promoting complementary feeding at the community level have been documented, but few of these efforts have expanded to a larger scale and become sustained. To discover the reasons for this difference, the key factors for the successful promotion of breastfeeding on a large scale were examined and compared with the efforts made in complementary feeding. These factors include definition and rationale, policy support, funding, advocacy, private-sector involvement, availability and use of monitoring data, integration of research into action, and the existence of a well-articulated series of steps for successful implementation. The lessons learned from the promotion of breastfeeding should be applied to complementary feeding, and the new Global Strategy for Infant and Young Child Feeding provides an excellent first step in this process.
An Overview of the NASA FAP Hypersonics Project Airbreathing Propulsion Research
NASA Technical Reports Server (NTRS)
Auslender, A. H.; Suder, Kenneth L.; Thomas, Scott R.
2009-01-01
The propulsion research portfolio of the National Aeronautics and Space Administration Fundamental Aeronautics Program Hypersonics Project encompasses a significant number of technical tasks that are aligned to achieve mastery and intellectual stewardship of the core competencies in the hypersonic-flight regime. An overall coordinated programmatic and technical effort has been structured to advance the state-of-the-art, via both experimental and analytical efforts. A subset of the entire hypersonics propulsion research portfolio is presented in this overview paper. To this end, two programmatic research disciplines are discussed; namely, (1) the Propulsion Discipline, including three associated research elements: the X-51A partnership, the HIFiRE-2 partnership, and the Durable Combustor Rig, and (2) the Turbine-Based Combined Cycle Discipline, including three associated research elements: the Combined Cycle Engine Large Scale Inlet Mode Transition Experiment, the small-scale Inlet Mode Transition Experiment, and the High-Mach Fan Rig.
Cartographic experiment for Latin America. [Santa Cruz, Bolivia and Concepcion, Paraguay]
NASA Technical Reports Server (NTRS)
Staples, J. E. (Principal Investigator)
1974-01-01
The author has identified the following significant results. The two experiments clearly demonstrate the practical application of the Skylab photography to update existing maps at an optimum scale of 1:100,000. The photography can even be used, by employing first order photogrammetric instruments, for updating the cultural features in 1:50,000 scale mapping. The S190A imagery has also shown itself to be most economical in preparing new photomap products over previously unmapped areas, such as Concepcion, Paraguay. These maps indicate that Skylab quality imagery is invaluable to the Latin American cartographers in their efforts to provide the mapping products required to develop their countries. In Latin America, where over 5,000 people are employed in map production and where the Latin American governments are expending over $20 million in this effort, the use of such systems to maintain existing mapping and publish new mapping over previously unmapped areas, is of great economic value and could release the conventional Latin American mapping resources to be utilized to produce large scale 1:25,000 and 1:1,000 scale mapping that is needed for specific development projects.
Unmanned Vehicle Material Flammability Test
NASA Technical Reports Server (NTRS)
Urban, David; Ruff, Gary A.; Fernandez-Pello, A. Carlos; T’ien, James S.; Torero, Jose L.; Cowlard, Adam; Rouvreau, Sebastian; Minster, Olivier; Toth, Balazs; Legros, Guillaume;
2013-01-01
Microgravity combustion phenomena have been an active area of research for the past three decades; however, there have been very few experiments directly studying spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample and environment sizes typical of those expected in a spacecraft fire: all previous experiments have been limited to samples on the order of 10 cm in length and width or smaller. Terrestrial fire safety standards for all other habitable volumes on Earth (e.g., mines, buildings, airplanes, ships) are based upon testing conducted with full-scale fires. Given the large differences between fire behavior in normal and reduced gravity, this lack of an experimental database at relevant length scales forces spacecraft designers to base their designs on 1-g understanding. To address this question, a large-scale spacecraft fire experiment has been proposed by an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project to examine spacecraft material flammability at realistic scales. The concept is to utilize an unmanned spacecraft, such as the Orbital Cygnus vehicle, after it has completed its delivery of cargo to the ISS and has begun its return journey to Earth. The experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. A computer modeling effort will complement the experimental effort. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew removes the need for strict containment of combustion products. This will facilitate the examination of fire behavior on a scale relevant to spacecraft fire safety, will provide unique data for fire model validation, and will offer the first opportunity to examine microgravity flame behavior at scales approximating a spacecraft fire.
Evaluating large-scale health programmes at a district level in resource-limited countries.
Svoronos, Theodore; Mate, Kedar S
2011-11-01
Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a tool traditionally used in implementation work, called a driver diagram, that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory for how, when and why a widely used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts.
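The driver diagram described above can be pictured as a small tree: an aim linked to primary drivers, each with secondary drivers and a per-site status. The sketch below is a hypothetical data-structure rendering of that idea; the aim, driver names, and site flags are illustrative inventions, not the paper's South African example.

```python
# A driver diagram reduced to a toy data structure: an aim, its primary
# drivers, secondary drivers, and a per-site on-track flag. All names are
# hypothetical; the point is that the structure makes cross-site
# contextual variation easy to monitor.
driver_diagram = {
    "aim": "increase ART coverage",
    "primary_drivers": {
        "staff availability": {"secondary": ["task shifting", "training"],
                               "sites_on_track": {"A": True, "B": False}},
        "drug supply chain":  {"secondary": ["stock monitoring"],
                               "sites_on_track": {"A": True, "B": True}},
    },
}

def lagging_sites(diagram):
    """Sites where any primary driver is off track, i.e. contextual variation."""
    return sorted({site
                   for d in diagram["primary_drivers"].values()
                   for site, ok in d["sites_on_track"].items() if not ok})

print(lagging_sites(driver_diagram))  # ['B']
```

Pooling such structures across projects is what would let evaluators compare where, and under which drivers, an intervention succeeds.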
A Ground-Based Research Vehicle for Base Drag Studies at Subsonic Speeds
NASA Technical Reports Server (NTRS)
Diebler, Corey; Smith, Mark
2002-01-01
A ground research vehicle (GRV) has been developed to study the base drag on large-scale vehicles at subsonic speeds. Existing models suggest that base drag is dependent upon vehicle forebody drag, and for certain configurations, the total drag of a vehicle can be reduced by increasing its forebody drag. Although these models work well for small projectile shapes, studies have shown that they do not provide accurate predictions when applied to large-scale vehicles. Experiments are underway at the NASA Dryden Flight Research Center to collect data at Reynolds numbers up to 3 × 10^7, and to formulate a new model for predicting the base drag of trucks, buses, motor homes, reentry vehicles, and other large-scale vehicles. Preliminary tests have shown errors as great as 70 percent compared to Hoerner's two-dimensional base drag prediction. This report describes the GRV and its capabilities, details the studies currently underway at NASA Dryden, and presents preliminary results of both the effort to formulate a new base drag model and the investigation into a method of reducing total drag by manipulating forebody drag.
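The inverse coupling between forebody and base drag mentioned above can be made concrete with a commonly cited two-dimensional form attributed to Hoerner, C_Db ≈ 0.135 / C_Df^(1/3). Treat the constant and exponent here as assumptions for illustration; the abstract itself does not state the formula, only that the prediction showed large errors at vehicle scale.

```python
# Hedged sketch of a Hoerner-style 2-D base-drag estimate:
#   C_Db = 0.135 / C_Df**(1/3)
# The constant 0.135 and the cube-root dependence are the assumed form;
# the code only illustrates the inverse forebody/base-drag coupling.

def hoerner_base_drag_2d(c_f_forebody: float) -> float:
    """Estimated base-drag coefficient from a forebody drag coefficient."""
    if c_f_forebody <= 0:
        raise ValueError("forebody drag coefficient must be positive")
    return 0.135 / c_f_forebody ** (1.0 / 3.0)

# Increasing forebody drag lowers the predicted base drag, which is why
# total drag can sometimes be reduced by *raising* forebody drag.
low_forebody = hoerner_base_drag_2d(0.01)
high_forebody = hoerner_base_drag_2d(0.08)
print(low_forebody > high_forebody)  # prints True
```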
Effort-Reward Imbalance for Learning Is Associated with Fatigue in School Children
ERIC Educational Resources Information Center
Fukuda, Sanae; Yamano, Emi; Joudoi, Takako; Mizuno, Kei; Tanaka, Masaaki; Kawatani, Junko; Takano, Miyuki; Tomoda, Akemi; Imai-Matsumura, Kyoko; Miike, Teruhisa; Watanabe, Yasuyoshi
2010-01-01
We examined relationships among fatigue, sleep quality, and effort-reward imbalance for learning in school children. We developed an effort-reward for learning scale for school students and examined its reliability and validity. Self-administered surveys, including the effort-reward for learning scale and a fatigue scale, were completed by 1,023…
Colvin, Christopher J.
2014-01-01
The HIV epidemic is widely recognised as having prompted one of the most remarkable intersections ever of illness, science and activism. The production, circulation, use and evaluation of empirical scientific ‘evidence’ played a central part in activists’ engagement with AIDS science. Previous activist engagement with evidence focused on the social and biomedical responses to HIV in the global North as well as challenges around ensuring antiretroviral treatment (ART) was available in the global South. More recently, however, with the roll-out and scale-up of large public-sector ART programmes and new multi-dimensional prevention efforts, the relationships between evidence and activism have been changing. Scale-up of these large-scale treatment and prevention programmes represents an exciting new opportunity while bringing with it a host of new challenges. This paper examines what new forms of evidence and activism will be required to address the challenges of the scaling-up era of HIV treatment and prevention. It reviews some recent controversies around evidence and HIV scale-up and describes the different forms of evidence and activist strategies that will be necessary for a robust response to these new challenges. PMID:24498918
Scherer, Laura; Venkatesh, Aranya; Karuppiah, Ramkumar; Pfister, Stephan
2015-04-21
Physical water scarcities can be described by water stress indices. These are often determined at an annual scale and a watershed level; however, such scales mask seasonal fluctuations and spatial heterogeneity within a watershed. In order to account for this level of detail, first and foremost, water availability estimates must be improved and refined. State-of-the-art global hydrological models such as WaterGAP and UNH/GRDC have previously been unable to reliably reflect water availability at the subbasin scale. In this study, the Soil and Water Assessment Tool (SWAT) was tested as an alternative to global models, using the case study of the Mississippi watershed. While SWAT clearly outperformed the global models at the scale of a large watershed, it was judged to be unsuitable for global scale simulations due to the high calibration efforts required. The results obtained in this study show that global assessments miss out on key aspects related to upstream/downstream relations and monthly fluctuations, which are important both for the characterization of water scarcity in the Mississippi watershed and for water footprints. Especially in arid regions, where scarcity is high, these models provide unsatisfactory results.
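The masking effect of annual aggregation described above is easy to demonstrate with a withdrawal-to-availability (WTA) stress ratio. The numbers below are invented for illustration; the ~0.4 severe-stress threshold is a commonly used convention, not a value from this study.

```python
# Monthly withdrawal-to-availability (WTA) stress vs. its annual aggregate.
# Illustrative numbers only; a WTA above roughly 0.4 is often read as
# severe water stress.
monthly_availability = [9, 8, 7, 5, 3, 2, 2, 2, 3, 5, 7, 9]   # km^3 per month
monthly_withdrawal   = [1] * 12                                # km^3 per month

annual_wta = sum(monthly_withdrawal) / sum(monthly_availability)
monthly_wta = [w / a for w, a in zip(monthly_withdrawal, monthly_availability)]

print(round(annual_wta, 2))        # annual index looks moderate (0.19)
print(max(monthly_wta))            # dry-season months reach 0.5: severe
```

The annual index stays well under the severe-stress threshold even though several dry-season months exceed it, which is exactly the detail a monthly, subbasin-scale assessment preserves.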
Chen, Yang; Ren, Xiaofeng; Zhang, Guo-Qiang; Xu, Rong
2013-01-01
Visual information is a crucial aspect of medical knowledge. Building a comprehensive medical image base, in the spirit of the Unified Medical Language System (UMLS), would greatly benefit patient education and self-care. However, collection and annotation of such a large-scale image base is challenging. To combine visual object detection techniques with medical ontology to automatically mine web photos and retrieve a large number of disease manifestation images with minimal manual labeling effort. As a proof of concept, we first learnt five organ detectors on three detection scales for eyes, ears, lips, hands, and feet. Given a disease, we used information from the UMLS to select affected body parts, ran the pretrained organ detectors on web images, and combined the detection outputs to retrieve disease images. Compared with a supervised image retrieval approach that requires training images for every disease, our ontology-guided approach exploits shared visual information of body parts across diseases. In retrieving 2220 web images of 32 diseases, we reduced manual labeling effort to 15.6% while improving the average precision by 3.9% from 77.7% to 81.6%. For 40.6% of the diseases, we improved the precision by 10%. The results confirm the concept that the web is a feasible source for automatic disease image retrieval for health image database construction. Our approach requires a small amount of manual effort to collect complex disease images, and to annotate them by standard medical ontology terms.
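The ontology-guided retrieval pipeline described above (disease → affected body parts → pretrained organ detectors → combined detections) can be sketched as follows. The disease-to-body-part table, the toy "detectors", and all image data below are hypothetical stand-ins for the UMLS relations and learned organ detectors the study actually used.

```python
# Hypothetical sketch of ontology-guided disease-image retrieval:
# map a disease to affected body parts (stand-in for UMLS relations),
# run a per-part detector on each web image, and keep images where any
# relevant detector fires. Tag-matching stands in for visual detection.

DISEASE_TO_BODY_PARTS = {
    "conjunctivitis": ["eye"],
    "athlete's foot": ["foot"],
}

def eye_detector(image):   return "eye" in image["tags"]
def foot_detector(image):  return "foot" in image["tags"]

DETECTORS = {"eye": eye_detector, "foot": foot_detector}

def retrieve(disease, web_images):
    """Return ids of images where a detector for an affected part fires."""
    parts = DISEASE_TO_BODY_PARTS.get(disease, [])
    return [img["id"] for img in web_images
            if any(DETECTORS[p](img) for p in parts)]

images = [{"id": 1, "tags": ["eye", "face"]},
          {"id": 2, "tags": ["hand"]},
          {"id": 3, "tags": ["foot"]}]
print(retrieve("conjunctivitis", images))  # [1]
```

Because the detectors are per body part rather than per disease, they are shared across every disease affecting that part, which is the source of the reduced labeling effort reported in the abstract.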
Not Just About the Science: Cold War Politics and the International Indian Ocean Expedition
NASA Astrophysics Data System (ADS)
Harper, K.
2016-12-01
The International Indian Ocean Expedition broke ground for a series of multi-national oceanographic expeditions starting in the late 1950s. In and of itself, it would have been historically significant—like the International Geophysical Year (1957-58)—for pulling together the international scientific community during the Cold War. However, US support for this and follow-on Indian Ocean expeditions was not just about the science; it was also about diplomacy, specifically efforts to bring non-aligned India into the US political orbit and out of the clutches of its Cold War enemy, the Soviet Union. This paper examines the behind-the-scenes efforts at the highest reaches of the US government to extract international political gain out of a large-scale scientific effort.
Genetics of Resistant Hypertension: the Missing Heritability and Opportunities.
Teixeira, Samantha K; Pereira, Alexandre C; Krieger, Jose E
2018-05-19
Blood pressure regulation in humans has long been known to be a genetically determined trait. The identification of causal genetic modulators for this trait has, to date, been largely unsuccessful. Despite recent advances in genome-wide genetic studies, loci associated with hypertension or blood pressure still explain a very low percentage of the overall variation of blood pressure in the general population. This has precluded the translation of discoveries in the genetics of human hypertension to clinical use. Here, we propose the combined use of resistant hypertension as a trait for mapping genetic determinants in humans and the integration of new large-scale technologies to approach, in model systems, the multidimensional nature of the problem. New large-scale efforts in the genetic and genomic arenas are paving the way for an increased and granular understanding of genetic determinants of hypertension. New technologies for whole-genome sequencing and large-scale forward genetic screens can help prioritize genes and gene pathways for downstream characterization and large-scale population studies, and guided pharmacological design can be used to drive discoveries toward translational application through better risk stratification and new therapeutic approaches. Although significant challenges remain in the mapping and identification of genetic determinants of hypertension, new large-scale technological approaches have been proposed to surpass some of the shortcomings that have limited progress in the area for the last three decades. The incorporation of these technologies into hypertension research may significantly help in the understanding of inter-individual blood pressure variation and the deployment of new phenotyping and treatment approaches for the condition.
Evaluating the Large-Scale Environment of Extreme Events Using Reanalyses
NASA Astrophysics Data System (ADS)
Bosilovich, M. G.; Schubert, S. D.; Koster, R. D.; da Silva, A. M., Jr.; Eichmann, A.
2014-12-01
Extreme conditions and events have always been a long-standing concern in weather forecasting and national security. While some evidence indicates that extreme weather will increase under global change scenarios, extremes are often related to the large-scale atmospheric circulation and, by definition, occur infrequently. Reanalyses assimilate substantial amounts of weather data, and a primary strength of reanalysis data is the representation of the large-scale atmospheric environment. In this effort, we link the occurrences of extreme events or climate indicators to the underlying regional and global weather patterns. Now, with more than 30 years of data, reanalyses can include multiple cases of extreme events, and thereby identify commonality among the weather to better characterize the large-scale to global environment linked to the indicator or extreme event. Since these features are certainly regionally dependent, and also, the indicators of climate are continually being developed, we outline various methods to analyze the reanalysis data and the development of tools to support regional evaluation of the data. Here, we provide some examples of both individual case studies and composite studies of similar events. For example, we will compare the large-scale environment for Northeastern US extreme precipitation with that of the highest mean precipitation seasons. Likewise, southerly winds can be shown to be a major contributor to very warm days in the Northeast winter. While most of our development has involved NASA's MERRA reanalysis, we are also looking forward to MERRA-2, which includes several new features that greatly improve the representation of weather and climate, especially for the regions and sectors involved in the National Climate Assessment.
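The composite-study approach mentioned above, averaging the large-scale field over event dates and comparing against the full-period climatology, is a few lines of array arithmetic. The sketch below uses synthetic data with an implanted signal rather than actual MERRA fields.

```python
import numpy as np

# Minimal composite analysis: average a (time, lat, lon) reanalysis-like
# field over dates flagged as extreme, then subtract the full-period mean
# to get the anomaly pattern associated with the events. The field and
# the implanted 2.0-unit signal are synthetic, for illustration only.
rng = np.random.default_rng(0)
field = rng.normal(0.0, 1.0, size=(365, 10, 10))  # e.g. daily geopotential height
extreme_days = [5, 40, 200, 310]                  # dates flagged by an indicator
field[extreme_days] += 2.0                        # common large-scale signal

composite_anomaly = field[extreme_days].mean(axis=0) - field.mean(axis=0)
print(composite_anomaly.shape)                    # (10, 10) anomaly map
print(bool(composite_anomaly.mean() > 1.0))       # compositing recovers the signal
```

With 30+ years of reanalysis, the same operation over many events averages out day-to-day noise and isolates the circulation pattern common to the indicator.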
NASA Astrophysics Data System (ADS)
Monks, P. S.; Archibald, A. T.; Colette, A.; Cooper, O.; Coyle, M.; Derwent, R.; Fowler, D.; Granier, C.; Law, K. S.; Mills, G. E.; Stevenson, D. S.; Tarasova, O.; Thouret, V.; von Schneidemesser, E.; Sommariva, R.; Wild, O.; Williams, M. L.
2015-08-01
Ozone holds a certain fascination in atmospheric science. It is ubiquitous in the atmosphere, central to tropospheric oxidation chemistry, yet harmful to human and ecosystem health as well as being an important greenhouse gas. It is not emitted into the atmosphere but is a byproduct of the very oxidation chemistry it largely initiates. Much effort is focused on the reduction of surface levels of ozone owing to its health and vegetation impacts, but recent efforts to achieve reductions in exposure at a country scale have proved difficult to achieve owing to increases in background ozone at the zonal hemispheric scale. There is also a growing realisation that the role of ozone as a short-lived climate pollutant could be important in integrated air quality climate change mitigation. This review examines current understanding of the processes regulating tropospheric ozone at global to local scales from both measurements and models. It takes the view that knowledge across the scales is important for dealing with air quality and climate change in a synergistic manner. The review shows that there remain a number of clear challenges for ozone such as explaining surface trends, incorporating new chemical understanding, ozone-climate coupling, and a better assessment of impacts. There is a clear and present need to treat ozone across the range of scales, a transboundary issue, but with an emphasis on the hemispheric scales. New observational opportunities are offered both by satellites and small sensors that bridge the scales.
Large-Scale Distributed Computational Fluid Dynamics on the Information Power Grid Using Globus
NASA Technical Reports Server (NTRS)
Barnard, Stephen; Biswas, Rupak; Saini, Subhash; VanderWijngaart, Robertus; Yarrow, Maurice; Zechtzer, Lou; Foster, Ian; Larsson, Olle
1999-01-01
This paper describes an experiment in which a large-scale scientific application developed for tightly-coupled parallel machines is adapted to the distributed execution environment of the Information Power Grid (IPG). A brief overview of the IPG and a description of the computational fluid dynamics (CFD) algorithm are given. The Globus metacomputing toolkit is used as the enabling device for the geographically-distributed computation. Modifications related to latency hiding and load balancing were required for an efficient implementation of the CFD application in the IPG environment. Performance results on a pair of SGI Origin 2000 machines indicate that real scientific applications can be effectively implemented on the IPG; however, a significant amount of continued effort is required to make such an environment useful and accessible to scientists and engineers.
Chemical Warfare and Medical Response During World War I
Fitzgerald, Gerard J.
2008-01-01
The first large-scale use of a traditional weapon of mass destruction (chemical, biological, or nuclear) involved the successful deployment of chemical weapons during World War I (1914–1918). Historians now refer to the Great War as the chemist’s war because of the scientific and engineering mobilization efforts by the major belligerents. The development, production, and deployment of war gases such as chlorine, phosgene, and mustard created a new and complex public health threat that endangered not only soldiers and civilians on the battlefield but also chemical workers on the home front involved in the large-scale manufacturing processes. The story of chemical weapons research and development during that war provides useful insights for current public health practitioners faced with a possible chemical weapons attack against civilian or military populations. PMID:18356568
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
Large-Scale Quantitative Analysis of Painting Arts
Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong
2014-01-01
Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly lower color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances. PMID:25501877
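Two of the measures named above can be sketched on a toy "image" of (R, G, B) pixels: color variety as the Shannon entropy of quantized colors, and a simple mean-absolute-step proxy for brightness roughness. Both definitions here are illustrative assumptions; the paper's actual roughness measure is a scaling exponent, which this one-line proxy only gestures at.

```python
import math
from collections import Counter

# Toy versions of two painting measures. color_entropy: variety of colors
# as Shannon entropy over coarsely quantized RGB bins. brightness_roughness:
# mean absolute brightness step between neighboring pixels, a crude
# stand-in for the paper's roughness exponent.

def color_entropy(pixels, levels=4):
    quantized = [(r * levels // 256, g * levels // 256, b * levels // 256)
                 for r, g, b in pixels]
    counts, n = Counter(quantized), len(quantized)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def brightness_roughness(row):
    bright = [(r + g + b) / 3 for r, g, b in row]
    steps = [abs(b2 - b1) for b1, b2 in zip(bright, bright[1:])]
    return sum(steps) / len(steps)

flat = [(100, 100, 100)] * 16                                  # one flat color
varied = [(i * 16, 255 - i * 16, (i * 40) % 256) for i in range(16)]

print(color_entropy(flat) < color_entropy(varied))             # prints True
print(brightness_roughness(flat) < brightness_roughness(varied))  # prints True
```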
Development of piezoelectric composites for transducers
NASA Astrophysics Data System (ADS)
Safari, A.
1994-07-01
For the past decade and a half, many different types of piezoelectric ceramic-polymer composites have been developed for transducer applications. These diphasic composites are prepared from a non-active polymer, such as epoxy, and a piezoelectric ceramic, such as PZT, in the form of filler powders, elongated fibers, multilayers, and more complex three-dimensional structures. For the last four years, most efforts have been devoted to producing large-area and fine-scale PZT fiber composites. In this paper, the processing of piezoelectric ceramic-polymer composites with various connectivity patterns is reviewed. Development of fine-scale piezoelectric composites by the lost mold, injection molding, and relic methods is described. Research activities of different groups preparing large-area piezocomposites for hydrophone and actuator applications are briefly reviewed. Initial development of electrostrictive ceramics and composites are also
Large-Scale Optimization for Bayesian Inference in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willcox, Karen; Marzouk, Youssef
2013-11-12
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
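The "reduce then sample" idea above can be shown in miniature: run Metropolis sampling against a cheap surrogate log-posterior instead of an expensive full model. The quadratic surrogate with a small perturbation below merely stands in for a reduced-order model; none of this reflects the project's actual algorithms or problem sizes.

```python
import math
import random

# "Reduce then sample" in miniature: Metropolis sampling against a cheap
# surrogate log-posterior. The true posterior here is N(1, 1); the
# surrogate adds a tiny model-error term, mimicking a reduced-order model
# that is faithful over the parameter range of interest.

def surrogate_log_post(x):
    return -0.5 * (x - 1.0) ** 2 + 0.001 * math.sin(x)  # small model error

def metropolis(log_post, n=20000, x0=0.0, step=1.0, seed=42):
    """Plain random-walk Metropolis; returns the chain of samples."""
    random.seed(seed)
    x, lp, samples = x0, log_post(x0), []
    for _ in range(n):
        xp = x + random.gauss(0.0, step)
        lpp = log_post(xp)
        if math.log(random.random()) < lpp - lp:   # accept/reject
            x, lp = xp, lpp
        samples.append(x)
    return samples

chain = metropolis(surrogate_log_post)
posterior_mean = sum(chain) / len(chain)
print(abs(posterior_mean - 1.0) < 0.15)  # surrogate chain recovers the mean
```

Because every chain step evaluates only the surrogate, the cost of the expensive forward model is paid during model reduction, not during sampling, which is the source of the "very large computational savings" the report describes.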
Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghattas, Omar
2013-10-15
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
Development of a Two-Stage Microalgae Dewatering Process – A Life Cycle Assessment Approach
Soomro, Rizwan R.; Zeng, Xianhai; Lu, Yinghua; Lin, Lu; Danquah, Michael K.
2016-01-01
Even though microalgal biomass is leading the third generation biofuel research, significant effort is required to establish an economically viable commercial-scale microalgal biofuel production system. Whilst a significant amount of work has been reported on large-scale cultivation of microalgae using photo-bioreactors and pond systems, research focus on establishing high performance downstream dewatering operations for large-scale processing under optimal economy is limited. The enormous amount of energy and associated cost required for dewatering large-volume microalgal cultures has been the primary hindrance to the development of the needed biomass quantity for industrial-scale microalgal biofuels production. The extremely dilute nature of large-volume microalgal suspension and the small size of microalgae cells in suspension create a significant processing cost during dewatering, and this has raised major concerns towards the economic success of commercial-scale microalgal biofuel production as an alternative to conventional petroleum fuels. This article reports an effective framework to assess the performance of different dewatering technologies as the basis to establish an effective two-stage dewatering system. Bioflocculation coupled with tangential flow filtration (TFF) emerged as a promising technique with total energy input of 0.041 kWh, 0.05 kg CO2 emissions and a cost of $0.0043 for producing 1 kg of microalgae biomass. A streamlined process for operational analysis of two-stage microalgae dewatering techniques, encompassing energy input, carbon dioxide emission, and process cost, is presented. PMID:26904075
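The assessment framework above rolls energy, emissions, and cost per kg of biomass up across the two stages. The sketch below shows that accounting shape; only the totals (0.041 kWh, 0.05 kg CO2, $0.0043 per kg) come from the study, while the per-stage split between bioflocculation and TFF is a hypothetical allocation for illustration.

```python
# Per-stage accounting sketch for a two-stage dewatering assessment.
# Stage-level numbers are hypothetical; only the totals match the
# figures reported in the abstract (per 1 kg of microalgae biomass).
stages = {
    "bioflocculation":            {"kwh": 0.006, "kg_co2": 0.008, "usd": 0.0008},
    "tangential_flow_filtration": {"kwh": 0.035, "kg_co2": 0.042, "usd": 0.0035},
}

totals = {metric: round(sum(s[metric] for s in stages.values()), 4)
          for metric in ("kwh", "kg_co2", "usd")}
print(totals)  # {'kwh': 0.041, 'kg_co2': 0.05, 'usd': 0.0043}
```

Keeping the three metrics per stage is what lets the framework rank candidate stage-one/stage-two pairings rather than only finished pipelines.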
High Performance Computing for Modeling Wind Farms and Their Impact
NASA Astrophysics Data System (ADS)
Mavriplis, D.; Naughton, J. W.; Stoellinger, M. K.
2016-12-01
As energy generated by wind penetrates further into our electrical system, modeling of power production, power distribution, and the economic impact of wind-generated electricity is growing in importance. The models used for this work can range in fidelity from simple codes that run on a single computer to those that require high performance computing capabilities. Over the past several years, high fidelity models have been developed and deployed on the NCAR-Wyoming Supercomputing Center's Yellowstone machine. One of the primary modeling efforts focuses on developing the capability to compute the behavior of a wind farm in complex terrain under realistic atmospheric conditions. Fully modeling this system requires simulating everything from continental flows down to the flow over a wind turbine blade, including the blade boundary layer, spanning fully 10 orders of magnitude in scale. To accomplish this, the simulations are broken up by scale, with information from the larger scales being passed to the smaller-scale models. In the code being developed, four scale levels are included: the continental weather scale, the local atmospheric flow in complex terrain, the wind plant scale, and the turbine scale. The current state of the models in the latter three scales will be discussed. These simulations are based on a high-order accurate dynamic overset and adaptive mesh approach, which runs at large scale on the NWSC Yellowstone machine. A second effort on modeling the economic impact of new wind development, as well as improvements in wind plant performance and enhancements to the transmission infrastructure, will also be discussed.
A new resource for developing and strengthening large-scale community health worker programs.
Perry, Henry; Crigler, Lauren; Lewin, Simon; Glenton, Claire; LeBan, Karen; Hodgins, Steve
2017-01-12
Large-scale community health worker programs are now growing in importance around the world in response to the resurgence of interest in, and growing evidence of the importance of, community-based primary health care for improving the health of populations in resource-constrained, high-mortality settings. These programs, because of their scale and operational challenges, merit special consideration by the global health community, national policy-makers, and program implementers. A new online resource is now available to assist in that effort: Developing and Strengthening Community Health Worker Programs at Scale: A Reference Guide and Case Studies for Program Managers and Policymakers (http://www.mchip.net/CHWReferenceGuide). This CHW Reference Guide is the product of 27 different collaborators who, collectively, have a formidable breadth and depth of experience and knowledge about CHW programming around the world. It provides a thoughtful discussion of the many operational issues that large-scale CHW programs need to address as they undergo the process of development, expansion, or strengthening. Detailed case studies of 12 national CHW programs are included in the Appendix: the most current and complete set of case studies currently available. Future articles in this journal will highlight many of the themes in the CHW Reference Guide and provide an update of recent advances and experiences. These articles will serve, we hope, to (1) increase awareness of the CHW Reference Guide and its usefulness and (2) connect a broader audience to the critical importance of strengthening large-scale CHW programs for the health benefits that they can bring to underserved populations around the world.
NASA Technical Reports Server (NTRS)
Han, Qingyuan; Rossow, William B.; Chou, Joyce; Welch, Ronald M.
1997-01-01
Cloud microphysical parameterizations have attracted a great deal of attention in recent years due to their effect on cloud radiative properties and cloud-related hydrological processes in large-scale models. The parameterization of cirrus particle size has been demonstrated to be an indispensable component of climate feedback analysis. Therefore, global-scale, long-term observations of cirrus particle sizes are required both as a basis for and as a validation of parameterizations for climate models. While there is a global-scale, long-term survey of water cloud droplet sizes (Han et al.), there is no comparable study for cirrus ice crystals. This study is an effort to supply such a data set.
NASA Technical Reports Server (NTRS)
Morton, Douglas; Souza, Carlos, Jr.; Keller, Michael
2012-01-01
Large-scale tropical forest monitoring efforts in support of REDD+ (Reducing Emissions from Deforestation and forest Degradation plus enhancing forest carbon stocks) confront a range of challenges. REDD+ activities typically have short reporting time scales, diverse data needs, and low tolerance for uncertainties. Meeting these challenges will require innovative use of remote sensing data, including integrating data at different spatial and temporal resolutions. The global scientific community is engaged in developing, evaluating, and applying new methods for regional to global scale forest monitoring. Pilot REDD+ activities are underway across the tropics with support from a range of national and international groups, including SilvaCarbon, an interagency effort to coordinate US expertise on forest monitoring and resource management. Early actions on REDD+ have exposed some of the inherent tradeoffs that arise from the use of incomplete or inaccurate data to quantify forest area changes and related carbon emissions. Here, we summarize recent advances in forest monitoring to identify and target the main sources of uncertainty in estimates of forest area changes, aboveground carbon stocks, and Amazon forest carbon emissions.
Facilitating large-scale clinical trials: in Asia.
Choi, Han Yong; Ko, Jae-Wook
2010-01-01
The number of clinical trials conducted in Asian countries has started to increase as a result of the expansion of the pharmaceutical market in this area. There is a growing opportunity for large-scale clinical trials because of the large number of patients, significant market potential, good quality of data, and cost-effective, qualified medical infrastructure. However, for carrying out large-scale clinical trials in Asia, several major challenges need to be considered, including the quality control of data, budget control, laboratory validation, monitoring capacity, authorship, staff training, and nonstandard treatment. There are also several difficulties in collaborating on international trials in Asia because Asia is an extremely diverse continent. The major challenges are language differences, diversity of disease patterns and current treatments, a large gap in experience with performing multinational trials, and regulatory differences among the Asian countries. In addition, there are also differences in the understanding of global clinical trials, medical facilities, indemnity assurance, and culture, including food and religion. Considerable effort is required to standardize these differences so that regional and local data can provide evidence of efficacy. At this time, there are no large clinical trials led by urologists in Asia, but it is anticipated that the role of urologists in clinical trials will continue to increase. Copyright © 2010 Elsevier Inc. All rights reserved.
Community-based native seed production for restoration in Brazil - the role of science and policy.
Schmidt, I B; de Urzedo, D I; Piña-Rodrigues, F C M; Vieira, D L M; de Rezende, G M; Sampaio, A B; Junqueira, R G P
2018-05-20
Large-scale restoration programmes in the tropics require large volumes of high-quality, genetically diverse, and locally adapted seeds from a large number of species. However, scarcity of native seeds is a critical restriction on achieving restoration targets. In this paper, we analyse three successful community-based networks that supply native seeds and seedlings for Brazilian Amazon and Cerrado restoration projects. In addition, we propose directions to promote local participation and address legal, technical, and commercialisation issues for up-scaling the market of native seeds for restoration with high quality and social justice. We argue that effective community-based restoration arrangements should follow some principles: (i) seed production must be based on real market demand; (ii) non-governmental and governmental organisations have a key role in supporting local organisation, legal requirements, and selling processes; (iii) local ecological knowledge and labour should be valued, enabling local communities to promote large-scale seed production; (iv) applied research can help develop appropriate techniques and solve technical issues. The case studies and principles from Brazil presented here can be useful for up-scaling restoration ecology efforts in many other parts of the world, especially in tropical countries where improving rural community income is a strategy for biodiversity conservation and restoration. © 2018 German Society for Plant Sciences and The Royal Botanical Society of the Netherlands.
White, David T; Eroglu, Arife Unal; Wang, Guohua; Zhang, Liyun; Sengupta, Sumitra; Ding, Ding; Rajpurohit, Surendra K; Walker, Steven L; Ji, Hongkai; Qian, Jiang; Mumm, Jeff S
2017-01-01
The zebrafish has emerged as an important model for whole-organism small-molecule screening. However, most zebrafish-based chemical screens have achieved only mid-throughput rates. Here we describe a versatile whole-organism drug discovery platform that can achieve true high-throughput screening (HTS) capacities. This system combines our automated reporter quantification in vivo (ARQiv) system with customized robotics, and is termed ‘ARQiv-HTS’. We detail the process of establishing and implementing ARQiv-HTS: (i) assay design and optimization, (ii) calculation of sample size and hit criteria, (iii) large-scale egg production, (iv) automated compound titration, (v) dispensing of embryos into microtiter plates, and (vi) reporter quantification. We also outline what we see as best practice strategies for leveraging the power of ARQiv-HTS for zebrafish-based drug discovery, and address technical challenges of applying zebrafish to large-scale chemical screens. Finally, we provide a detailed protocol for a recently completed inaugural ARQiv-HTS effort, which involved the identification of compounds that elevate insulin reporter activity. Compounds that increased the number of insulin-producing pancreatic beta cells represent potential new therapeutics for diabetic patients. For this effort, individual screening sessions took 1 week to conclude, and sessions were performed iteratively approximately every other day to increase throughput. At the conclusion of the screen, more than a half million drug-treated larvae had been evaluated. Beyond this initial example, however, the ARQiv-HTS platform is adaptable to almost any reporter-based assay designed to evaluate the effects of chemical compounds in living small-animal models. ARQiv-HTS thus enables large-scale whole-organism drug discovery for a variety of model species and from numerous disease-oriented perspectives. PMID:27831568
Static Schedulers for Embedded Real-Time Systems
1989-12-01
Because of the need for efficient scheduling algorithms in large-scale real-time systems, software engineers put a lot of effort into developing...provide static schedulers for the Embedded Real-Time Systems with a single processor using the Ada programming language. The independent nonpreemptable...support the Computer Aided Rapid Prototyping for Embedded Real-Time Systems so that we can determine whether the system, as designed, meets the required
Tornado Recovery Ongoing at NASA’s Michoud Assembly Facility, New Orleans LA
2017-02-07
Teams at NASA’s Michoud Assembly Facility in New Orleans are continuing with recovery efforts following a tornado strike at the facility Tuesday, Feb. 7. Michoud remains closed to all but security and emergency operations crews. For more than half a century, Michoud has been the space agency’s premier site for manufacturing and assembly of large-scale space structures and systems.
Findings from the First & Only National Data Base on Elemiddle & Middle Schools (Executive Summary)
ERIC Educational Resources Information Center
Hough, David L.
2009-01-01
The study presented here is the first large scale effort on a national level to examine the relationship between K-8 Elemiddle Schools and 6-8 Middle Schools. From a population of more than 2,000 middle grades schools in 49 public school districts across 26 states, a sample of 542 Elemiddle and 506 Middle Schools was drawn. Both regression and…
Christopher Woodall; James Westfall
2009-01-01
Live tree size-density relationships in forests have long provided a framework for understanding stand dynamics. There has been little examination of the relationship between the size-density attributes of live and standing/down dead trees (e.g., number and mean tree size per unit area); such information could help in large-scale efforts to estimate dead wood resources...
Larry E. Laing; David Gori; James T. Jones
2005-01-01
The multi-partner Greater Huachuca Mountains fire planning effort involves over 500,000 acres of public and private lands. This large area supports distinct landscapes that have evolved with fire. Utilizing GIS as a tool, the United States Forest Service (USFS), General Ecosystem Survey (GES), and Natural Resources Conservation Service (NRCS) State Soil Geographic...
Intergovernmental Unity of Effort in Support of Biological Threat Prevention
2013-09-01
jurisdictional barriers (such as time delays in developing decisions and implementing large-scale action) are tangible. Connecting the “dots” of awareness...Intelligence sources from around the globe report that terrorist groups are developing the capability and the intention to deliver biological weapons of mass destruction. Four coalitions of governments were studied...
ERIC Educational Resources Information Center
Information Dynamics Corp., Reading, MA.
A five-year development program plan was drawn up for the Defense Documentation Center (DDC). This report presents in summary form the results of various surveys and reviews performed in selected areas of micrographics to support the efforts of the program's planners. Exhibits of supporting documentation are presented, together with a discussion…
Iraq: Recent Developments in Reconstruction Assistance
2005-01-27
Developments in Reconstruction Assistance Summary: Large-scale reconstruction assistance programs are being undertaken by the United States following the war...in grant aid and as much as $13.3 billion in possible loans. On June 28, 2004, the entity implementing assistance programs, the Coalition Provisional...programs are being undertaken by the United States in Iraq. This report describes recent developments in this assistance effort. The report will be updated
Iraq: Recent Developments in Reconstruction Assistance
2004-12-20
Developments in Reconstruction Assistance Summary: Large-scale reconstruction assistance programs are being undertaken by the United States following the war...in grant aid and as much as $13.3 billion in possible loans. On June 28, 2004, the entity implementing assistance programs, the Coalition...programs are being undertaken by the United States in Iraq. This report describes recent developments in this assistance effort. The report will be
Status of liquid metal fast breeder reactor fuel development in Japan
NASA Astrophysics Data System (ADS)
Katsuragawa, M.; Kashihara, H.; Akebi, M.
1993-09-01
The mixed-oxide fuel technology for a liquid metal fast breeder reactor (LMFBR) in Japan is progressing toward commercial deployment of LMFBR. Based on accumulated experience in Joyo and Monju fuel development, efforts for large scale LMFBR fuel development are devoted to improved irradiation performance, reliability and economy. This paper summarizes accomplishments, current activities and future plans for LMFBR fuel development in Japan.
RELIABILITY, AVAILABILITY, AND SERVICEABILITY FOR PETASCALE HIGH-END COMPUTING AND BEYOND
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chokchai "Box" Leangsuksun
2011-05-31
Our project is a multi-institutional research effort that adopts the interplay of RELIABILITY, AVAILABILITY, and SERVICEABILITY (RAS) aspects for solving resilience issues in high-end scientific computing on the next generation of supercomputers. Results lie in the following tracks: failure prediction in large-scale HPC systems; investigation of reliability issues and mitigation techniques, including in GPGPU-based HPC systems; and HPC resilience runtime and tools.
ERIC Educational Resources Information Center
King, Gary; Gakidou, Emmanuela; Ravishankar, Nirmala; Moore, Ryan T.; Lakin, Jason; Vargas, Manett; Tellez-Rojo, Martha Maria; Avila, Juan Eugenio Hernandez; Avila, Mauricio Hernandez; Llamas, Hector Hernandez
2007-01-01
We develop an approach to conducting large-scale randomized public policy experiments intended to be more robust to the political interventions that have ruined some or all parts of many similar previous efforts. Our proposed design is insulated from selection bias in some circumstances even if we lose observations; our inferences can still be…
Troy E. Hall; Jennifer O. Farnum; Terry C. Slider; Kathy Ludlow
2009-01-01
This report chronicles a large-scale effort to map place values across the Pacific Northwest Region (Washington and Oregon) of the U.S. Forest Service. Through workshops held with Forest Service staff, 485 socioculturally meaningful places were identified. Staff also generated corresponding descriptions of the places' unique social and biophysical elements; in other...
Soldier Data Tag Study Effort.
1985-06-10
interested in protecting it. The tag itself is difficult--though not impossible--to counterfeit. Also, it potentially improves the data...attacks during the design, manufacture, and distribution processes, counterfeiting, unauthorized access/alteration of tag data, and use of the tag to...3.3.2 Hijacking of SOT System Shipments, or Large-Scale Counterfeit of SOT Systems; 3.3.3 Unauthorized Alteration
Dawn M. Lawson; Jesse A. Giessow; Jason H. Giessow
2005-01-01
A large-scale effort to control the aggressively invasive exotic species Arundo donax in the Santa Margarita River watershed in California's south coast ecoregion was initiated in 1997. The project was prompted by the need for Marine Corps Base Camp Pendleton to address impacts to habitat for federally-listed endangered species and wetlands regulated...
ERIC Educational Resources Information Center
Seitsinger, Anne M.; Felner, Robert D.; Brand, Stephen; Burns, Amy
2008-01-01
As schools move forward with comprehensive school reform, parents' roles have shifted and been redefined. Parent-teacher communication is critical to student success, yet how schools and teachers contact parents is the subject of few studies. Evaluations of school-change efforts require reliable and useful measures of teachers' practices in…
ERIC Educational Resources Information Center
Gogia, Laura Park
2016-01-01
Virginia Commonwealth University (VCU) is implementing a large scale exploration of digital pedagogies, including connected learning and open education, in an effort to promote digital fluency and integrative thinking among students. The purpose of this study was to develop a classroom assessment toolkit for faculty who wish to document student…
WikiPEATia - a web based platform for assembling peatland data through ‘crowd sourcing’
NASA Astrophysics Data System (ADS)
Wisser, D.; Glidden, S.; Fieseher, C.; Treat, C. C.; Routhier, M.; Frolking, S. E.
2009-12-01
The Earth System Science community is realizing that peatlands are an important and unique terrestrial ecosystem that has not yet been well-integrated into large-scale earth system analyses. A major hurdle is the lack of accessible, geospatial data on peatland distribution, coupled with data on peatland properties (e.g., vegetation composition, peat depth, basal dates, soil chemistry, peatland class) at the global scale. These data, however, are available at the local scale. Although a comprehensive global database on peatlands probably lags similar data on more economically important ecosystems such as forests, grasslands, and croplands, a large amount of field data has been collected over the past several decades. A few efforts have been made to map peatlands at large scales, but existing data have not been assembled into a single geospatial database that is publicly accessible, or do not depict data with the level of detail needed by the Earth System Science community. A global peatland database would contribute to advances in a number of research fields such as hydrology, vegetation and ecosystem modeling, permafrost modeling, and earth system modeling. We present a Web 2.0 approach that uses state-of-the-art web server and innovative online mapping technologies and is designed to create such a global database through ‘crowd-sourcing’. Primary functions of the online system include form-driven textual user input of peatland research metadata, spatial data input of peatland areas via a mapping interface, database editing and querying capabilities, and advanced visualization and data analysis tools. WikiPEATia provides an integrated information technology platform for assembling, integrating, and posting peatland-related geospatial datasets, and facilitates and encourages research community involvement. A successful effort will make existing peatland data much more useful to the research community and will help to identify significant data gaps.
Schwartz, Mark D.; Beaubien, Elisabeth G.; Crimmins, Theresa M.; Weltzin, Jake F.; Edited by Schwartz, Mark D.
2013-01-01
Plant phenological observations and networks in North America have been largely local and regional in extent until recent decades. In the USA, cloned plant monitoring networks were the exception to this pattern, with data collection spanning the late 1950s until approximately the early 1990s. Animal observation networks, especially for birds, have been more extensive. The USA National Phenology Network (USA-NPN), established in the mid-2000s, is a recent effort to operate a comprehensive national-scale network in the United States. In Canada, PlantWatch, as part of Nature Watch, is the current national-scale plant phenology program.
NASA advanced turboprop research and concept validation program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitlow, J.B. Jr.; Sievers, G.K.
1988-01-01
NASA has determined by experimental and analytical effort that use of advanced turboprop propulsion instead of the conventional turbofans in the older narrow-body airline fleet could reduce fuel consumption for this type of aircraft by up to 50 percent. In cooperation with industry, NASA has defined and implemented an Advanced Turboprop (ATP) program to develop and validate the technology required for these new high-speed, multibladed, thin, swept propeller concepts. This paper presents an overview of the analysis, model-scale test, and large-scale flight test elements of the program together with preliminary test results, as available.
Aguilera, Stacy E.; Cole, Jennifer; Finkbeiner, Elena M.; Le Cornu, Elodie; Ban, Natalie C.; Carr, Mark H.; Cinner, Joshua E.; Crowder, Larry B.; Gelcich, Stefan; Hicks, Christina C.; Kittinger, John N.; Martone, Rebecca; Malone, Daniel; Pomeroy, Carrie; Starr, Richard M.; Seram, Sanah; Zuercher, Rachel; Broad, Kenneth
2015-01-01
Globally, small-scale fisheries are influenced by dynamic climate, governance, and market drivers, which present social and ecological challenges and opportunities. It is difficult to manage fisheries adaptively for fluctuating drivers, except to allow participants to shift effort among multiple fisheries. Adapting to changing conditions allows small-scale fishery participants to survive economic and environmental disturbances and benefit from optimal conditions. This study explores the relative influence of large-scale drivers on shifts in effort and outcomes among three closely linked fisheries in Monterey Bay since the Magnuson-Stevens Fisheries Conservation and Management Act of 1976. In this region, Pacific sardine (Sardinops sagax), northern anchovy (Engraulis mordax), and market squid (Loligo opalescens) fisheries comprise a tightly linked system where shifting focus among fisheries is a key element to adaptive capacity and reduced social and ecological vulnerability. Using a cluster analysis of landings, we identify four modes from 1974 to 2012 that are dominated (i.e., a given species accounting for the plurality of landings) by squid, sardine, anchovy, or lack any dominance, and seven points of transition among these periods. This approach enables us to determine which drivers are associated with each mode and each transition. Overall, we show that market and climate drivers are predominantly attributed to dominance transitions. Model selection of external drivers indicates that governance phases, reflected as perceived abundance, dictate long-term outcomes. Our findings suggest that globally, small-scale fishery managers should consider enabling shifts in effort among fisheries and retaining existing flexibility, as adaptive capacity is a critical determinant for social and ecological resilience. PMID:25790464
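The mode-and-transition framing in the abstract above can be mimicked with a minimal sketch. The species names follow the abstract, but the landings figures, the plurality threshold, and the function names are invented here for illustration; this is not the study's actual cluster analysis or data.

```python
# Hypothetical landings (arbitrary units) by year for the three fisheries.
landings = {
    1976: {"sardine": 5, "anchovy": 110, "squid": 40},
    1990: {"sardine": 10, "anchovy": 20, "squid": 55},
    2000: {"sardine": 70, "anchovy": 15, "squid": 60},
}

def dominant(year_landings, threshold=0.4):
    """Label a year by the species holding a plurality of landings.

    Returns "none" when no species reaches the (assumed) threshold share,
    mirroring the abstract's periods that lack any dominance.
    """
    total = sum(year_landings.values())
    species, peak = max(year_landings.items(), key=lambda kv: kv[1])
    return species if peak / total >= threshold else "none"

# Classify each year, then mark transition points where dominance shifts.
modes = {yr: dominant(v) for yr, v in sorted(landings.items())}
years = list(modes)
transitions = [(a, b) for a, b in zip(years[:-1], years[1:])
               if modes[a] != modes[b]]
print(modes, transitions)
```

A driver analysis like the study's would then ask which climate, market, or governance changes coincide with each transition pair.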
Peoples, Shelagh M; O'Dwyer, Laura M; Shields, Katherine A; Wang, Yang
2013-01-01
This research describes the development process, psychometric analyses, and partial validation study of a theoretically grounded Rasch-based instrument, the Nature of Science Instrument-Elementary (NOSI-E). The NOSI-E was designed to measure elementary students' understanding of the Nature of Science (NOS). Evidence is provided for three of the six validity aspects (content, substantive, and generalizability) needed to support the construct validity of the NOSI-E. A future article will examine the structural and external validity aspects. Rasch modeling proved especially productive in scale improvement efforts. The instrument, designed for large-scale assessment use, is conceptualized using five construct domains. Data from 741 elementary students were used to pilot the Rasch scale, with continuous improvements made over three successive administrations. The psychometric properties of the NOSI-E instrument are consistent with the basic assumptions of Rasch measurement, namely that the items are well-fitting and invariant. Items from each of the five domains (Empirical, Theory-Laden, Certainty, Inventive, and Socially and Culturally Embedded) are spread along the scale's continuum and appear to overlap well. Most importantly, the scale seems appropriately calibrated and responsive for elementary school-aged children, the target age group. As a result, the NOSI-E should prove beneficial for science education research. As science education reform efforts in the United States move toward students learning science through engaging in authentic scientific practices (NRC, 2011), it will be important to assess whether this new approach to teaching science is effective. The NOSI-E can be used as one measure of whether this reform effort has an impact.
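As a brief illustration of the measurement model behind such instruments, the dichotomous Rasch model gives the probability of a correct response from person ability and item difficulty alone. The numbers below are invented for the sketch; the study itself would have used dedicated psychometric software, not this code.

```python
import math

def rasch_p(theta, b):
    """Dichotomous Rasch model: P(correct) = 1 / (1 + exp(-(theta - b))),
    where theta is person ability and b is item difficulty (same logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A well-calibrated scale spreads item difficulties along the continuum so
# that students across the ability range are all well measured.
p_easy = rasch_p(theta=0.0, b=-1.5)  # easy item: success is likely
p_hard = rasch_p(theta=0.0, b=1.5)   # hard item: success is unlikely
print(round(p_easy, 3), round(p_hard, 3))
```

Item "fit" and "invariance" checks, as reported for the NOSI-E, compare observed response patterns against these model-predicted probabilities.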
Topical report on sources and systems for aquatic plant biomass as an energy resource
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldman, J.C.; Ryther, J.H.; Waaland, R.
1977-10-21
Background information is documented on the mass cultivation of aquatic plants and systems design that is available from the literature and through consultation with active research scientists and engineers. The biology of microalgae, macroalgae, and aquatic angiosperms is discussed in terms of morphology, life history, mode of existence, and ecological significance, as they relate to cultivation. The requirements for growth of these plants, which are outlined in the text, suggest that productivity rates are dependent primarily on the availability of light and nutrients. It is concluded that the systems should be run with an excess of nutrients and with light as the limiting factor. A historical review of the mass cultivation of aquatic plants describes the techniques used in commercial large-scale operations throughout the world and recent small-scale research efforts. This review presents information on the biomass yields that have been attained to date in various geographical locations with different plant species and culture conditions, emphasizing the contrast between high yields in small-scale operations and lower yields in large-scale operations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crater, Jason; Galleher, Connor; Lievense, Jeff
NREL is developing an advanced aerobic bubble column model using Aspen Custom Modeler (ACM). The objective of this work is to integrate the new fermentor model with existing techno-economic models in Aspen Plus and Excel to establish a new methodology for guiding process design. To assist this effort, NREL has contracted Genomatica to critique and make recommendations for improving NREL's bioreactor model and large-scale aerobic bioreactor design for biologically producing lipids at commercial scale. Genomatica has highlighted a few areas for improving the functionality and effectiveness of the model. Genomatica recommends using a compartment model approach with an integrated black-box kinetic model of the production microbe. We also suggest including calculations for stirred tank reactors to extend the model's functionality and adaptability for future process designs. Genomatica also suggests making several modifications to NREL's large-scale lipid production process design. The recommended process modifications are based on Genomatica's internal techno-economic assessment experience and are focused primarily on minimizing capital and operating costs. These recommendations include selecting/engineering a thermotolerant yeast strain with lipid excretion; using bubble column fermentors; increasing the size of production fermentors; reducing the number of vessels; employing semi-continuous operation; and recycling cell mass.
Tracking a head-mounted display in a room-sized environment with head-mounted cameras
NASA Astrophysics Data System (ADS)
Wang, Jih-Fang; Azuma, Ronald T.; Bishop, Gary; Chi, Vernon; Eyles, John; Fuchs, Henry
1990-10-01
This paper presents our efforts to accurately track a Head-Mounted Display (HMD) in a large environment. We review our current benchtop prototype (introduced in [WCF90]), then describe our plans for building the full-scale system. Both systems use an inside-out optical tracking scheme, where lateral-effect photodiodes mounted on the user's helmet view flashing infrared beacons placed in the environment. Church's method uses the measured 2D image positions and the known 3D beacon locations to recover the 3D position and orientation of the helmet in real time. We discuss the implementation and performance of the benchtop prototype. The full-scale system design includes ceiling panels that hold the infrared beacons and a new sensor arrangement of two photodiodes with holographic lenses. In the full-scale system, the user can walk almost anywhere under the grid of ceiling panels, making the working volume nearly as large as the room.
Preparing Laboratory and Real-World EEG Data for Large-Scale Analysis: A Containerized Approach
Bigdely-Shamlo, Nima; Makeig, Scott; Robbins, Kay A.
2016-01-01
Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain–computer interface models. However, the absence of standardized vocabularies for annotating events in a machine-understandable manner, the welter of collection-specific data organizations, the difficulty in moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a “containerized” approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining, and (meta-)analysis. The EEG Study Schema (ESS) comprises three data “Levels,” each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. ESS schema and tools are freely available at www.eegstudy.org and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org). PMID:27014048
Using large-scale genome variation cohorts to decipher the molecular mechanism of cancer.
Habermann, Nina; Mardin, Balca R; Yakneen, Sergei; Korbel, Jan O
2016-01-01
Characterizing genomic structural variations (SVs) in the human genome remains challenging, and there is growing interest in understanding somatic SVs occurring in cancer, a disease of the genome. A havoc-causing SV process known as chromothripsis scars the genome when localized chromosome shattering and repair occur in a one-off catastrophe. Recent efforts led to the development of a set of conceptual criteria for inferring chromothripsis events in cancer genomes and to the development of experimental model systems for studying this striking DNA alteration process in vitro. We discuss these approaches and additionally touch upon current "Big Data" efforts that employ hybrid cloud computing to enable studies of numerous cancer genomes in a search for commonalities and differences in molecular DNA alteration processes in cancer. Copyright © 2016. Published by Elsevier SAS.
VME rollback hardware for time warp multiprocessor systems
NASA Technical Reports Server (NTRS)
Robb, Michael J.; Buzzell, Calvin A.
1992-01-01
The purpose of the research effort is to develop and demonstrate innovative hardware to implement specific rollback and timing functions required for efficient queue management and precision timekeeping in multiprocessor discrete event simulations. The previously completed phase 1 effort demonstrated the technical feasibility of building hardware modules which eliminate the state saving overhead of the Time Warp paradigm used in distributed simulations on multiprocessor systems. The current phase 2 effort will build multiple pre-production rollback hardware modules integrated with a network of Sun workstations, and the integrated system will be tested by executing a Time Warp simulation. The rollback hardware will be designed to interface with the greatest number of multiprocessor systems possible. The authors believe that the rollback hardware will provide for significant speedup of large scale discrete event simulation problems and allow multiprocessors using Time Warp to dramatically increase performance.
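The state-saving overhead that the rollback hardware is meant to eliminate can be illustrated with a minimal software sketch of a Time Warp logical process; the event logic and data structures here are hypothetical, not the hardware design described in the report:

```python
import copy

class TimeWarpProcess:
    """Toy Time Warp logical process. Before executing each event it saves a
    state snapshot so that a straggler message (one timestamped in its past)
    triggers a rollback. Maintaining these snapshots in software is exactly
    the overhead the dedicated rollback hardware aims to remove."""

    def __init__(self):
        self.lvt = 0                 # local virtual time
        self.state = {"count": 0}    # toy simulation state
        self.saved = []              # (event timestamp, pre-event state)

    def rollback(self, timestamp):
        # Undo every event executed at or after the straggler's timestamp.
        while self.saved and self.saved[-1][0] >= timestamp:
            _, self.state = self.saved.pop()
        self.lvt = self.saved[-1][0] if self.saved else 0

    def process(self, timestamp):
        if timestamp < self.lvt:     # straggler detected
            self.rollback(timestamp)
        self.saved.append((timestamp, copy.deepcopy(self.state)))
        self.lvt = timestamp
        self.state["count"] += 1     # the "event": increment a counter

p = TimeWarpProcess()
p.process(10)
p.process(20)
p.process(15)  # straggler: rolls back the event at time 20, then executes
```

After the straggler at virtual time 15, the event executed at time 20 has been undone and must later be re-executed, which is why rollback cost dominates Time Warp performance.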
Nagel, Corey L; Kirby, Miles A; Zambrano, Laura D; Rosa, Ghislane; Barstow, Christina K; Thomas, Evan A; Clasen, Thomas F
2016-12-15
In Rwanda, pneumonia and diarrhea are the first and second leading causes of death, respectively, among children under five. Household air pollution (HAP) resulting from cooking indoors with biomass fuels on traditional stoves is a significant risk factor for pneumonia, while consumption of contaminated drinking water is a primary cause of diarrheal disease. To date, there have been no large-scale effectiveness trials of programmatic efforts to provide either improved cookstoves or household water filters at scale in a low-income country. In this paper we describe the design of a cluster-randomized trial to evaluate the impact of a national-level program to distribute and promote the use of improved cookstoves and advanced water filters to the poorest quarter of households in Rwanda. We randomly allocated 72 sectors (administratively defined units) in Western Province to the intervention, with the remaining 24 sectors in the province serving as controls. In the intervention sectors, roughly 100,000 households received improved cookstoves and household water filters through a government-sponsored program targeting the poorest quarter of households nationally. The primary outcome measures are the incidence of acute respiratory infection (ARI) and diarrhea among children under five years of age. Over a one-year surveillance period, all cases of ARI and diarrhea identified by health workers in the study area will be extracted from records maintained at health facilities and by community health workers (CHW). In addition, we are conducting intensive, longitudinal data collection among a random sample of households in the study area for in-depth assessment of coverage, use, environmental exposures, and additional health measures.
Although previous research has examined the impact of providing household water treatment and improved cookstoves on child health, there have been no studies of national-level programs to deliver these interventions at scale in a developing country. The results of this study, the first RCT of a large-scale programmatic cookstove or household water filter intervention, will inform global efforts to reduce childhood morbidity and mortality from diarrheal disease and pneumonia. This trial is registered at Clinicaltrials.gov (NCT02239250).
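The allocation described above (72 of 96 sectors assigned to intervention) can be sketched as a simple cluster randomization; the sector identifiers and random seed below are hypothetical, not the trial's actual procedure:

```python
import random

def allocate_sectors(sector_ids, n_intervention, seed=42):
    """Randomly assign n_intervention clusters (sectors) to the
    intervention arm; the remaining clusters serve as controls.
    Randomizing whole sectors, rather than households, is what makes
    this a cluster-randomized design."""
    rng = random.Random(seed)  # fixed seed for a reproducible allocation
    intervention = set(rng.sample(sector_ids, n_intervention))
    control = [s for s in sector_ids if s not in intervention]
    return sorted(intervention), control

sectors = [f"sector_{i:02d}" for i in range(96)]
arm_intervention, arm_control = allocate_sectors(sectors, 72)
```

Randomization at the sector level means outcome analyses must account for within-cluster correlation rather than treating households as independent units.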
Study of Travelling Interplanetary Phenomena Report
NASA Astrophysics Data System (ADS)
Dryer, Murray
1987-09-01
Scientific progress on the topic of energy, mass, and momentum transport from the Sun into the heliosphere is contingent upon interdisciplinary and international cooperative efforts on the part of many workers. Summarized here is a report of some highlights of research carried out during the SMY/SMA by the STIP (Study of Travelling Interplanetary Phenomena) Project that included solar and interplanetary scientists around the world. These highlights are concerned with coronal mass ejections from solar flares or erupting prominences (sometimes together); their large-scale consequences in interplanetary space (such as shocks and magnetic 'bubbles'); and energetic particles and their relationship to these large-scale structures. It is concluded that future progress is contingent upon similar international programs assisted by real-time (or near-real-time) warnings of solar activity by cooperating agencies along the lines experienced during the SMY/SMA.
NASA/FAA general aviation crash dynamics program - An update
NASA Technical Reports Server (NTRS)
Hayduk, R. J.; Thomson, R. G.; Carden, H. D.
1979-01-01
Work in progress in the NASA/FAA General Aviation Crash Dynamics Program for the development of technology for increased crash-worthiness and occupant survivability of general aviation aircraft is presented. Full-scale crash testing facilities and procedures are outlined, and a chronological summary of full-scale tests conducted and planned is presented. The Plastic and Large Deflection Analysis of Nonlinear Structures and Modified Seat Occupant Model for Light Aircraft computer programs which form part of the effort to predict nonlinear geometric and material behavior of sheet-stringer aircraft structures subjected to large deformations are described, and excellent agreement between simulations and experiments is noted. The development of structural concepts to attenuate the load transmitted to the passenger through the seats and subfloor structure is discussed, and an apparatus built to test emergency locator transmitters in a realistic environment is presented.
Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA)
NASA Technical Reports Server (NTRS)
Lichtwardt, Jonathan; Paciano, Eric; Jameson, Tina; Fong, Robert; Marshall, David
2012-01-01
With the very recent advent of NASA's Environmentally Responsible Aviation Project (ERA), which is dedicated to designing aircraft that will reduce the impact of aviation on the environment, there is a need for research and development of methodologies to minimize fuel burn, emissions, and reduce community noise produced by regional airliners. ERA tackles airframe technology, propulsion technology, and vehicle systems integration to meet performance objectives in the time frame for the aircraft to be at a Technology Readiness Level (TRL) of 4-6 by the year 2020 (deemed N+2). The preceding project with goals similar to ERA's was NASA's Subsonic Fixed Wing (SFW). SFW focused on conducting research to improve prediction methods and technologies that will produce lower noise, lower emissions, and higher performing subsonic aircraft for the Next Generation Air Transportation System. The work in this investigation was performed under NASA Research Announcement (NRA) contract #NNL07AA55C, funded by Subsonic Fixed Wing. The project started in 2007 with a specific goal of conducting a large-scale wind tunnel test along with the development of new and improved predictive codes for the advanced powered-lift concepts. Many of the predictive codes were incorporated to refine the wind tunnel model outer mold line design. The goal of the large-scale wind tunnel test was to investigate powered-lift technologies and provide an experimental database to validate current and future modeling techniques. The powered-lift concept investigated was a Circulation Control (CC) wing in conjunction with over-the-wing mounted engines to entrain the exhaust and further increase the lift generated by CC technologies alone. The NRA was a five-year effort; during the first year the objective was to select and refine CESTOL concepts and then to complete a preliminary design of a large-scale wind tunnel model for the large-scale test.
During the second, third, and fourth years, the large-scale wind tunnel model was designed, manufactured, and calibrated. During the fifth year the large-scale wind tunnel test was conducted. This technical memo describes all phases of the Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA) project and provides a brief summary of the background and modeling efforts involved in the NRA. The conceptual designs considered for this project and the decision process for the configuration adapted for a wind tunnel model are briefly discussed, as are the internal configuration of AMELIA and the internal measurements chosen to satisfy the requirement of obtaining a database of experimental data for future computational model validations. The external experimental techniques employed during the test, along with the large-scale wind tunnel test facility, are covered in detail. Experimental measurements in the database include forces and moments, surface pressure distributions, local skin friction measurements, boundary and shear layer velocity profiles, far-field acoustic data, and noise signatures from turbofan propulsion simulators. Results for the circulation control performance, the over-the-wing mounted engines, and the combined configuration are also discussed in detail.
VIDAL, OMAR; LÓPEZ-GARCÍA, JOSÉ; RENDÓN-SALINAS, EDUARDO
2014-01-01
We used aerial photographs, satellite images, and field surveys to monitor forest cover in the core zones of the Monarch Butterfly Biosphere Reserve in Mexico from 2001 to 2012. We used our data to assess the effectiveness of conservation actions that involved local, state, and federal authorities and community members (e.g., local landowners and private and civil organizations) in one of the world’s most iconic protected areas. From 2001 through 2012, 1254 ha were deforested (i.e., cleared areas had <10% canopy cover), 925 ha were degraded (i.e., areas for which canopy forest decreased), and 122 ha were affected by climatic conditions. Of the total 2179 ha of affected area, 2057 ha were affected by illegal logging: 1503 ha by large-scale logging and 554 ha by small-scale logging. Mexican authorities effectively enforced efforts to protect the monarch reserve, particularly from 2007 to 2012. Those efforts, together with the decade-long financial support from Mexican and international philanthropists and businesses to create local alternative-income generation and employment, resulted in the decrease of large-scale illegal logging from 731 ha affected in 2005–2007 to none affected in 2012, although small-scale logging is of growing concern. However, dire regional social and economic problems remain, and they must be addressed to ensure the reserve’s long-term conservation. The monarch butterfly (Danaus plexippus) overwintering colonies in Mexico—which engage in one of the longest known insect migrations—are threatened by deforestation, and a multistakeholder, regional, sustainable-development strategy is needed to protect the reserve. PMID:24001209
Aerodynamic Effects of Simulated Ice Accretion on a Generic Transport Model
NASA Technical Reports Server (NTRS)
Broeren, Andy P.; Lee, Sam; Shah, Gautam H.; Murphy, Patrick C.
2012-01-01
An experimental research effort was begun to develop a database of airplane aerodynamic characteristics with simulated ice accretion over a large range of incidence and sideslip angles. Wind-tunnel testing was performed at the NASA Langley 12-ft Low-Speed Wind Tunnel using a 3.5 percent scale model of the NASA Langley Generic Transport Model (GTM). Aerodynamic data were acquired from a six-component force and moment balance in static-model sweeps from alpha = -5 deg to 85 deg and beta = -45 deg to 45 deg at a Reynolds number of 0.24 x 10^6 and a Mach number of 0.06. The 3.5 percent scale GTM was tested in both the clean configuration and with full-span artificial ice shapes attached to the leading edges of the wing and the horizontal and vertical tails. Aerodynamic results for the clean airplane configuration compared favorably with similar experiments carried out on a 5.5 percent scale GTM. The addition of the large, glaze-horn type ice shapes increased the airplane drag coefficient but had little effect on the lift and pitching moment. The lateral-directional characteristics showed mixed results, with a small effect of the ice shapes observed in some cases. The flow visualization images revealed the presence and evolution of a spanwise-running vortex on the wing that was the dominant feature of the flowfield for both clean and iced configurations. The lack of ice-induced performance and flowfield effects observed in this effort was likely due to Reynolds number effects for the clean configuration. Estimates of full-scale baseline performance were included in this analysis to illustrate the potential icing effects.
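As a rough illustration of how balance forces become the nondimensional coefficients such a database reports, here is a sketch of the standard conversion; the air density, velocity, reference area, and force readings are illustrative values, not the test's actual conditions:

```python
def aero_coefficients(lift_N, drag_N, rho=1.225, V=20.4, S=0.35):
    """Convert balance lift/drag forces (newtons) to nondimensional
    coefficients CL and CD. Sea-level density rho (kg/m^3) and a
    velocity of ~20 m/s (roughly Mach 0.06) are assumed; the reference
    wing area S (m^2) is a made-up placeholder, since the model's
    actual reference area is not given in the abstract."""
    q = 0.5 * rho * V**2              # dynamic pressure, Pa
    return lift_N / (q * S), drag_N / (q * S)

# Hypothetical balance readings from one sweep point.
CL, CD = aero_coefficients(lift_N=35.0, drag_N=5.0)
```

Nondimensionalizing this way is what allows the 3.5 percent scale results to be compared directly with the 5.5 percent scale GTM experiments, subject to the Reynolds number caveats noted above.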
Conner, William H.; Krauss, Ken W.; Shaffer, Gary P.; Stanturf, John A.; Madsen, Palle; Lamb, David
2012-01-01
Freshwater forested wetlands commonly occur in the lower Coastal Plain of the southeastern US, with baldcypress (Taxodium distichum [L.] L.C. Rich.) and water tupelo (Nyssa aquatica L.) often being the dominant trees. Extensive anthropogenic activities combined with eustatic sea-level rise and land subsidence have caused widespread hydrological changes in many of these forests. In addition, hurricanes (a common, although aperiodic, occurrence) cause widespread damage from wind and storm surge events, with impacts exacerbated by human-mediated coastal modifications (e.g., dredging, navigation channels, etc.). Restoration of forested wetlands in coastal areas is important because emergent canopies can greatly diminish wind penetration, thereby reducing the wind stress available to generate surface waves and storm surge that are the major cause of damage to coastal ecosystems and their surrounding communities. While there is an overall paucity of large-scale restoration efforts within coastal forested wetlands of the southeastern US, we have determined important characteristics that should drive future efforts. Restoration efforts may be enhanced considerably if coupled with hydrological enhancement, such as freshwater, sediment, or sewage wastewater diversions. Large-scale restoration of coastal forests should be attempted to create a landscape capable of minimizing storm impacts and maximizing wetland sustainability in the face of climate change. Planting is the preferred regeneration method in many forested wetland sites because hydrological alterations have increased flooding, and planted seedlings must be protected from herbivory to enhance establishment. Programs identifying salt tolerance in coastal forest tree species need to be continued to help increase resilience to repetitive storm surge events.
Development of a Large Scale, High Speed Wheel Test Facility
NASA Technical Reports Server (NTRS)
Kondoleon, Anthony; Seltzer, Donald; Thornton, Richard; Thompson, Marc
1996-01-01
Draper Laboratory, with its internal research and development budget, has for the past two years been funding a joint effort with the Massachusetts Institute of Technology (MIT) for the development of a large-scale, high-speed wheel test facility. This facility was developed to perform experiments and carry out evaluations on levitation and propulsion designs for MagLev systems currently under consideration. The facility rotates a large (2-meter) wheel at peripheral speeds of greater than 100 meters/second. The rim of the wheel was constructed of a non-magnetic, non-conductive composite material to avoid the generation of errors from spurious forces. A sensor package containing a multi-axis force and torque sensor, mounted to the base of the station, provides a signal of the lift and drag forces on the package being tested. Position tables mounted on the station allow for the introduction of errors in real time. A computer-controlled data acquisition system was developed around a Macintosh IIfx to record the test data and control the speed of the wheel. This paper describes the development of this test facility. A detailed description of the major components is presented. Recently completed tests, carried out on a novel electrodynamic (EDS) suspension system developed by MIT as part of this joint effort, are described and presented. Adaptation of this facility for linear motor and other propulsion and levitation testing is described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hiszpanski, Anna M.
Metamaterials are composites with patterned subwavelength features, where the choice of materials and subwavelength structuring bestows upon the metamaterials unique optical properties not found in nature, thereby enabling optical applications previously considered impossible. However, because the structure of optical metamaterials must be subwavelength, metamaterials operating at visible wavelengths require features on the order of 100 nm or smaller, and such resolution typically requires top-down lithographic fabrication techniques that are not easily scaled to device-relevant areas of square centimeters. In this project, we developed a new fabrication route using block copolymers to make optical metamaterials that operate at visible wavelengths over large, device-relevant areas. Our structures are smaller in size (sub-100 nm) and cover a larger area (cm^2) than what has been achieved with traditional nanofabrication routes. To guide our experimental efforts, we developed an algorithm to calculate the expected optical properties (specifically the index of refraction) of such metamaterials, which predicts that surprisingly large changes in optical properties can be achieved with small changes in the metamaterials' structure. In the course of our work, we also found that the ordered metal nanowire meshes produced by our scalable fabrication route for making optical metamaterials may also act as transparent electrodes, which are needed in displays and solar cells. We explored the ordered metal nanowire meshes' utility for this application and developed design guidelines to aid our experimental efforts.
C.W. Woodall; J.A. Westfall
2009-01-01
There has been little examination of the relationship between the stocking of live trees in forests and the associated attributes of dead tree resources which could inform large-scale efforts to estimate and manage deadwood resources. The goal of this study was to examine the relationships between the stocking of standing live trees and attributes of standing dead and...
Colin M. Beier; Amy Lauren Lovecraft; F. Stuart Chapin
2009-01-01
Large-scale government efforts to develop resources for societal benefit have often experienced cycles of growth and decline that leave behind difficult social and ecological legacies. To understand the origins and outcomes of these failures of resource governance, scholars have applied the framework of the adaptive cycle. In this study, we used the adaptive cycle as a...
ASEAN and Indochina: A Strategy for Regional Stability in the 1980’s.
1984-12-01
regional powers. The resultant regional balance of power is precarious, unstable, and ever threatens to deteriorate into armed conflict. The unequal... an armed force that is capable of large-scale defense. Ironically, while ostensibly defensively motivated, these efforts have resulted in a war machine... resulted in the atmosphere of tense uncertainty in Southeast Asia today. In contrast to Vietnamese motivations for their force structure, the other
Computational Omics Pre-Awardees | Office of Cancer Clinical Proteomics Research
The National Cancer Institute's Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the pre-awardees of the Computational Omics solicitation. Working with NVIDIA Foundation's Compute the Cure initiative and Leidos Biomedical Research Inc., the NCI, through this solicitation, seeks to leverage computational efforts to provide tools for the mining and interpretation of large-scale publicly available ‘omics’ datasets.
ERIC Educational Resources Information Center
Baharev, Zulejka
2016-01-01
At the start of the 21st century, large-scale educational initiatives reshaped the landscape of general education, setting rigorous academic expectations for all students. Despite the legal efforts to improve K-12 education, an abundance of research indicates that students entering college often lack basic learning and study skills. For adolescents…
ERIC Educational Resources Information Center
Griffiths, Rebecca; Mulhern, Christine; Spies, Richard; Chingos, Matthew
2015-01-01
To address the paucity of data on the use of MOOCs in "traditional" postsecondary institutions, Ithaka S+R and the University System of Maryland studied the feasibility of repurposing MOOCs for use in hybrid, credit-bearing courses. In this paper we will describe the design of a large-scale study undertaken to examine the use of MOOCs in…
Peace Operations in Mali: Theory into Practice Then Measuring Effectiveness
2017-06-09
community’s response along two broad lines of effort (LOE): Creating a Safe and Secure Environment and promoting Stable Governance. When seeking to achieve a Safe and Secure Environment, two objectives were measured. Objective #1 sought the Cessation of Large Scale Violence. Success was attained, as...
Brian G. Tavernia; Mark D. Nelson; Michael E. Goerndt; Brian F. Walters; Chris Toney
2013-01-01
Large-scale and long-term habitat management plans are needed to maintain the diversity of habitat classes required by wildlife species. Planning efforts would benefit from assessments of potential climate and land-use change effects on habitats. We assessed climate and land-use driven changes in areas of closed- and open-canopy forest across the Northeast and Midwest...
Numerical Simulations of Subscale Wind Turbine Rotor Inboard Airfoils at Low Reynolds Number
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, Myra L.; Maniaci, David Charles; Resor, Brian R.
2015-04-01
New blade designs are planned to support future research campaigns at the SWiFT facility in Lubbock, Texas. The sub-scale blades will reproduce specific aerodynamic characteristics of utility-scale rotors. Reynolds numbers for megawatt-class, utility-scale rotors are generally above 2-8 million. The thickness of inboard airfoils for these large rotors is typically as high as 35-40%. The thickness and the proximity to three-dimensional flow of these airfoils present design and analysis challenges, even at full scale. However, more than a decade of experience with the airfoils in numerical simulation, in the wind tunnel, and in the field has generated confidence in their performance. Reynolds number regimes for the sub-scale rotor are significantly lower for the inboard blade, ranging from 0.7 to 1 million. Performance of the thick airfoils in this regime is uncertain because of the lack of wind tunnel data and the inherent challenge associated with numerical simulations. This report documents efforts to determine the most capable analysis tools to support these simulations in an effort to improve understanding of the aerodynamic properties of thick airfoils in this Reynolds number regime. Numerical results from various codes for four airfoils are verified against previously published wind tunnel results where data at those Reynolds numbers are available. Results are then computed for other Reynolds numbers of interest.
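The Reynolds number regimes quoted above follow from the standard chord-based definition; the speeds and chord lengths in this sketch are illustrative values only, chosen to land in the cited ranges, not figures from the report:

```python
def reynolds_number(V, chord, nu=1.46e-5):
    """Chord-based Reynolds number Re = V * c / nu, with nu the kinematic
    viscosity of air at sea level (~1.46e-5 m^2/s). The inputs below are
    hypothetical: a sub-scale inboard section versus a utility-scale one."""
    return V * chord / nu

re_subscale = reynolds_number(V=35.0, chord=0.35)   # ~0.84 million
re_fullscale = reynolds_number(V=70.0, chord=1.5)   # ~7.2 million
```

Because Re scales with the product of local speed and chord, the sub-scale inboard blade sits nearly an order of magnitude below the utility-scale regime, which is why wind tunnel validation data for these thick sections is scarce.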
Corporate funding and ideological polarization about climate change
Farrell, Justin
2016-01-01
Drawing on large-scale computational data and methods, this research demonstrates how polarization efforts are influenced by a patterned network of political and financial actors. These dynamics, which have been notoriously difficult to quantify, are illustrated here with a computational analysis of climate change politics in the United States. The comprehensive data include all individual and organizational actors in the climate change countermovement (164 organizations), as well as all written and verbal texts produced by this network between 1993–2013 (40,785 texts, more than 39 million words). Two main findings emerge. First, that organizations with corporate funding were more likely to have written and disseminated texts meant to polarize the climate change issue. Second, and more importantly, that corporate funding influences the actual thematic content of these polarization efforts, and the discursive prevalence of that thematic content over time. These findings provide new, and comprehensive, confirmation of dynamics long thought to be at the root of climate change politics and discourse. Beyond the specifics of climate change, this paper has important implications for understanding ideological polarization more generally, and the increasing role of private funding in determining why certain polarizing themes are created and amplified. Lastly, the paper suggests that future studies build on the novel approach taken here that integrates large-scale textual analysis with social networks. PMID:26598653
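The kind of comparison described, thematic term prevalence in texts grouped by funding status, can be sketched minimally; the toy corpus, group labels, and terms below are hypothetical stand-ins for the study's 40,785 texts:

```python
import re
from collections import Counter

# Hypothetical mini-corpus: (group, text) pairs standing in for texts
# produced by corporate-funded versus non-funded organizations.
texts = [
    ("funded",   "climate science is uncertain and models are unreliable"),
    ("funded",   "uncertain science should not drive policy"),
    ("unfunded", "warming is driven by emissions"),
]

def term_rates(texts, group, terms):
    """Per-1000-word frequency of each term within one group's texts."""
    words = []
    for g, t in texts:
        if g == group:
            words.extend(re.findall(r"[a-z]+", t.lower()))
    counts = Counter(words)
    total = len(words)
    return {term: 1000 * counts[term] / total for term in terms}

rates = term_rates(texts, "funded", ["uncertain"])
```

Comparing such rates across groups and over time is the simplest version of measuring how funding relates to the prevalence of particular polarizing themes; the actual study combines this with network structure.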
Helping organizations help others: organization development as a facilitator of social change.
Boyd, Neil M
2011-01-01
This article explores organization development (OD) interventions and their likelihood of increasing social change outcomes in public agencies. The central argument of this work is that public and nonprofit organizations can deliver better social outcomes by systematically engaging in OD interventions. An in-depth survey was conducted in 3 agencies of the Commonwealth of Pennsylvania at the end of the gubernatorial administration of Tom Ridge (1995-2002). During his administration, Governor Ridge led the agencies of Pennsylvania government through a large-scale change effort to improve the efficiency and effectiveness of service delivery to the citizens of the Commonwealth of Pennsylvania. The change effort was a remarkable event for the Commonwealth because no other governor in the history of the state had attempted to conceptualize and deliver a comprehensive large-scale change management initiative. The successes and setbacks served as a fertile context to shed light on the following research question: Do OD interventions increase the likelihood that public organizations will deliver better social outcomes? This question is important in that public organizations may need to engage in organization development activities to improve their internal operations, which in turn may help them provide exemplary social outcomes to those whom they serve. In short, organization development interventions might allow public organizations to help themselves to help others.
Gaining and maintaining commitment to large-scale change in healthcare organizations.
Narine, L; Persaud, D D
2003-08-01
Healthcare administrators have sought to improve the quality of healthcare services by using organizational change as a lever. Unfortunately, evaluations of organizational change efforts in areas such as total quality management (TQM), continuous quality improvement (CQI), and organizational restructuring have indicated that these change programmes have not fulfilled their promise in improving service delivery. Furthermore, there are no easy answers as to why so many large-scale change programmes are unsuccessful. The aim of this analysis is to provide insights into practices that may be utilized to improve the chances of successful change management. It is proposed that in order to effect change, implementers must first gain commitment to the change. This is done by ensuring organizational readiness for change, surfacing dissatisfaction with the present state, communicating a clear vision of the proposed change, promoting participation in the change effort, and developing a clear and consistent communication plan. However, gaining commitment is not enough. Many change programmes have been initially perceived as being successful, but long-term success has been elusive. Therefore, maintaining commitment during the uncertainty associated with the transition period is imperative. This can be done by successfully managing the transition using action steps such as consolidating change using feedback mechanisms and making the change a permanent part of the organization's culture.
Impact resistant boron/aluminum composites for large fan blades
NASA Technical Reports Server (NTRS)
Oller, T. L.; Salemme, C. T.; Bowden, J. H.; Doble, G. S.; Melnyk, P.
1977-01-01
Blade-like specimens were subjected to static ballistic impact testing to determine their relative FOD impact resistance levels. It was determined that a plus or minus 15 deg layup exhibited good impact resistance. The design of a large solid boron/aluminum fan blade was conducted based on the FOD test results. The CF6 fan blade was used as a baseline for these design studies. The solid boron/aluminum fan blade design was used to fabricate two blades. This effort enabled the assessment of the scale up of existing blade manufacturing details for the fabrication of a large B/Al fan blade. Existing CF6 fan blade tooling was modified for use in fabricating these blades.
NASA Astrophysics Data System (ADS)
Dechant, B.; Ryu, Y.; Jiang, C.; Yang, K.
2017-12-01
Solar-induced chlorophyll fluorescence (SIF) is rapidly becoming an important tool to remotely estimate terrestrial gross primary productivity (GPP) at large spatial scales. Many findings, however, are based on empirical relationships between SIF and GPP that have been found to be dependent on plant functional types. Therefore, combining model-based analysis with observations is crucial to improve our understanding of SIF-GPP relationships. So far, most model-based results were based on SCOPE, a complex ecophysiological model with explicit description of canopy layers and a large number of parameters that may not be easily obtained reliably on large scales. Here, we report on our efforts to incorporate SIF into BESS, a two-big-leaf (sun and shade) process-based model that is suitable for obtaining its inputs entirely from satellite products. We examine if the SIF-GPP relationships are consistent with the findings from SCOPE simulations and investigate if incorporation of the SIF signal into BESS can help improve GPP estimation. A case study in a rice paddy is presented.
Development of the Large-Scale Forcing Data to Support MC3E Cloud Modeling Studies
NASA Astrophysics Data System (ADS)
Xie, S.; Zhang, Y.
2011-12-01
The large-scale forcing fields (e.g., vertical velocity and advective tendencies) are required to run single-column and cloud-resolving models (SCMs/CRMs), which are the two key modeling frameworks widely used to link field data to climate model developments. In this study, we use an advanced objective analysis approach to derive the required forcing data from the soundings collected by the Midlatitude Continental Convective Cloud Experiment (MC3E) in support of its cloud modeling studies. MC3E is the latest major field campaign conducted during the period 22 April 2011 to 06 June 2011 in south-central Oklahoma through a joint effort between the DOE ARM program and the NASA Global Precipitation Measurement Program. One of its primary goals is to provide a comprehensive dataset that can be used to describe the large-scale environment of convective cloud systems and evaluate model cumulus parameterizations. The objective analysis used in this study is the constrained variational analysis method. A unique feature of this approach is the use of domain-averaged surface and top-of-the-atmosphere (TOA) observations (e.g., precipitation and radiative and turbulent fluxes) as constraints to adjust atmospheric state variables from soundings by the smallest possible amount to conserve column-integrated mass, moisture, and static energy so that the final analysis data is dynamically and thermodynamically consistent. To address potential uncertainties in the surface observations, an ensemble forcing dataset will be developed. Multi-scale forcing will also be created for simulating convective systems at various scales. At the meeting, we will provide more details about the forcing development and present some preliminary analysis of the characteristics of the large-scale forcing structures for several selected convective systems observed during MC3E.
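The "smallest possible adjustment" at the heart of constrained variational analysis can be sketched with a single linear budget constraint. The real analysis adjusts full atmospheric profiles against several simultaneous column budgets; this toy, with hypothetical numbers, shows only the Lagrange-multiplier core.

```python
def constrained_adjust(x0, a, b):
    """Minimally adjust state vector x0 (in the least-squares sense) so the
    linear budget constraint sum_i a[i]*x[i] == b holds exactly.
    Closed-form Lagrange-multiplier solution: x = x0 + a*(b - a.x0)/(a.a)."""
    ax0 = sum(ai * xi for ai, xi in zip(a, x0))
    aa = sum(ai * ai for ai in a)
    lam = (b - ax0) / aa
    return [xi + lam * ai for ai, xi in zip(a, x0)]

# Illustrative: three sounding-derived values whose (equally weighted)
# column integral must match an observed budget of 15.0.
x = constrained_adjust([3.0, 4.0, 5.0], [1.0, 1.0, 1.0], 15.0)
```

Because the correction is proportional to the constraint weights, the adjustment is spread over the profile rather than dumped into one level, which is what keeps the final analysis dynamically and thermodynamically consistent.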
A Pilot Study: Testing of the Psychological Conditions Scale Among Hospital Nurses.
Fountain, Donna M; Thomas-Hawkins, Charlotte
2016-11-01
The aim of this study was to test the reliability and validity of the Psychological Conditions Scale (PCS), a measure of drivers of engagement in hospital-based nurses. Research suggests that drivers of engagement are positively linked to patient, employee, and hospital outcomes. Although this scale has been used in other occupations, it has not been tested in nursing. A cross-sectional, methodological study was conducted using a convenience sample of 200 nurses in a large Magnet® hospital in New Jersey. Cronbach's α values ranged from .64 to .95. Principal components exploratory factor analysis with oblique rotation revealed that 13 items loaded unambiguously in 3 domains and explained 76% of the variance. Mean PCS scores ranged from 3.62 to 4.68 on a 5-point Likert scale. The scale is an adequate measure of drivers of engagement in hospital-based nurses. Leadership efforts to promote the facilitators of engagement are recommended.
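The reliability figures quoted above are Cronbach's α values, which are straightforward to compute from item scores. A minimal sketch with illustrative data (not the study's):

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a k-item scale.
    `items` is a list of k lists, each holding one item's scores
    across the same respondents (at least two respondents)."""
    k = len(items)
    total = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    item_var = sum(variance(s) for s in items)       # sum of item variances
    return k / (k - 1) * (1 - item_var / variance(total))

# Two perfectly correlated items yield alpha = 1.0.
alpha = cronbach_alpha([[1, 2, 3], [1, 2, 3]])
```

Values near 1 indicate high internal consistency; the .64 reported for one PCS domain would conventionally be considered marginal.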
NASA Astrophysics Data System (ADS)
Zhang, Daili
Increasing societal demand for automation has led to considerable efforts to control large-scale complex systems, especially in the area of autonomous intelligent control methods. The control system of a large-scale complex system needs to satisfy four system level requirements: robustness, flexibility, reusability, and scalability. Corresponding to the four system level requirements, there arise four major challenges. First, it is difficult to get accurate and complete information. Second, the system may be physically highly distributed. Third, the system evolves very quickly. Fourth, emergent global behaviors of the system can be caused by small disturbances at the component level. The Multi-Agent Based Control (MABC) method as an implementation of distributed intelligent control has been the focus of research since the 1970s, in an effort to solve the above-mentioned problems in controlling large-scale complex systems. However, to the author's best knowledge, all MABC systems for large-scale complex systems with significant uncertainties are problem-specific and thus difficult to extend to other domains or larger systems. This situation is partly due to the control architecture of multiple agents being determined by agent to agent coupling and interaction mechanisms. Therefore, the research objective of this dissertation is to develop a comprehensive, generalized framework for the control system design of general large-scale complex systems with significant uncertainties, with the focus on distributed control architecture design and distributed inference engine design. A Hybrid Multi-Agent Based Control (HyMABC) architecture is proposed by combining hierarchical control architecture and module control architecture with logical replication rings. 
First, it decomposes a complex system hierarchically; second, it combines the components at the same level into a module, and then designs common interfaces for all of the components in the same module; third, replications are made for critical agents and are organized into logical rings. This architecture maintains clear guidelines for complexity decomposition and also increases the robustness of the whole system. Multiple Sectioned Dynamic Bayesian Networks (MSDBNs), as a distributed dynamic probabilistic inference engine, can be embedded into the control architecture to handle uncertainties of general large-scale complex systems. MSDBNs decompose a large knowledge-based system into many agents. Each agent holds its partial perspective of a large problem domain by representing its knowledge as a Dynamic Bayesian Network (DBN). Each agent accesses local evidence from its corresponding local sensors and communicates with other agents through finite message passing. If the distributed agents can be organized into a tree structure, satisfying the running intersection property and d-sep set requirements, globally consistent inferences are achievable in a distributed way. By using different frequencies for local DBN agent belief updating and global system belief updating, the approach balances the communication cost with the global consistency of inferences. In this dissertation, a fully factorized Boyen-Koller (BK) approximation algorithm is used for local DBN agent belief updating, and the static Junction Forest Linkage Tree (JFLT) algorithm is used for global system belief updating. MSDBNs assume a static structure and a stable communication network for the whole system. However, for a real system, sub-Bayesian networks as nodes could be lost, and the communication network could be shut down due to partial damage in the system.
Therefore, on-line and automatic MSDBNs structure formation is necessary for making robust state estimations and increasing survivability of the whole system. A Distributed Spanning Tree Optimization (DSTO) algorithm, a Distributed D-Sep Set Satisfaction (DDSSS) algorithm, and a Distributed Running Intersection Satisfaction (DRIS) algorithm are proposed in this dissertation. Combining these three distributed algorithms and a Distributed Belief Propagation (DBP) algorithm in MSDBNs makes state estimations robust to partial damage in the whole system. Combining the distributed control architecture design and the distributed inference engine design leads to a process of control system design for a general large-scale complex system. As applications of the proposed methodology, the control system design of a simplified ship chilled water system and a notional ship chilled water system have been demonstrated step by step. Simulation results not only show that the proposed methodology gives a clear guideline for control system design for general large-scale complex systems with dynamic and uncertain environment, but also indicate that the combination of MSDBNs and HyMABC can provide excellent performance for controlling general large-scale complex systems.
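The fully factorized Boyen-Koller projection used for local belief updating can be sketched for a toy two-variable DBN. The binary variables and the transition tables below are illustrative assumptions, not the dissertation's ship-system networks: the joint belief is approximated by the product of its marginals before each propagation step.

```python
def bk_step(bx, by, tx, ty):
    """One fully factorized Boyen-Koller update for a DBN with two binary
    variables X and Y. `bx`, `by` are the current marginal beliefs P(X=1)
    and P(Y=1); `tx[x][y]` = P(X'=1 | X=x, Y=y), and similarly `ty`.
    The BK projection: rebuild the joint as the product of marginals,
    propagate it one step, then keep only the new marginals."""
    joint = {(x, y): (bx if x else 1 - bx) * (by if y else 1 - by)
             for x in (0, 1) for y in (0, 1)}
    new_bx = sum(p * tx[x][y] for (x, y), p in joint.items())
    new_by = sum(p * ty[x][y] for (x, y), p in joint.items())
    return new_bx, new_by
```

Discarding the correlations between X and Y at every step is exactly the approximation that trades a bounded loss of accuracy for a belief state whose size stays linear, rather than exponential, in the number of variables.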
Applications of species accumulation curves in large-scale biological data analysis.
Deng, Chao; Daley, Timothy; Smith, Andrew D
2015-09-01
The species accumulation curve, or collector's curve, of a population gives the expected number of observed species or distinct classes as a function of sampling effort. Species accumulation curves allow researchers to assess and compare diversity across populations or to evaluate the benefits of additional sampling. Traditional applications have focused on ecological populations but emerging large-scale applications, for example in DNA sequencing, are orders of magnitude larger and present new challenges. We developed a method to estimate accumulation curves for predicting the complexity of DNA sequencing libraries. This method uses rational function approximations to a classical non-parametric empirical Bayes estimator due to Good and Toulmin [Biometrika, 1956, 43, 45-63]. Here we demonstrate how the same approach can be highly effective in other large-scale applications involving biological data sets. These include estimating microbial species richness, immune repertoire size, and k -mer diversity for genome assembly applications. We show how the method can be modified to address populations containing an effectively infinite number of species where saturation cannot practically be attained. We also introduce a flexible suite of tools implemented as an R package that make these methods broadly accessible.
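The Good-Toulmin estimator underlying this work can be sketched in a few lines. The version below is the raw alternating series; the paper's contribution is stabilizing that series with rational-function approximations, which is not shown here. The histogram and numbers are illustrative.

```python
def good_toulmin(counts_hist, t):
    """Good-Toulmin estimate of the number of NEW species that would be
    observed if sampling effort were extrapolated by a factor (1 + t).
    `counts_hist[j]` = number of species seen exactly j times so far.
    The raw alternating series is only well behaved for t <= 1, i.e. up
    to doubling the initial effort, which is what motivates the paper's
    rational-function approximations for larger t."""
    return sum((-1) ** (j + 1) * (t ** j) * nj
               for j, nj in counts_hist.items())

# Illustrative: 5 singletons, 3 doubletons, 1 tripleton; doubling the
# effort (t = 1) gives an expected 5 - 3 + 1 = 3 new species.
new_species = good_toulmin({1: 5, 2: 3, 3: 1}, 1.0)
```

The same histogram-of-counts input is what makes the method portable across the applications listed above, from sequencing libraries to k-mer diversity.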
Applications of species accumulation curves in large-scale biological data analysis
Deng, Chao; Daley, Timothy; Smith, Andrew D
2016-01-01
The species accumulation curve, or collector’s curve, of a population gives the expected number of observed species or distinct classes as a function of sampling effort. Species accumulation curves allow researchers to assess and compare diversity across populations or to evaluate the benefits of additional sampling. Traditional applications have focused on ecological populations but emerging large-scale applications, for example in DNA sequencing, are orders of magnitude larger and present new challenges. We developed a method to estimate accumulation curves for predicting the complexity of DNA sequencing libraries. This method uses rational function approximations to a classical non-parametric empirical Bayes estimator due to Good and Toulmin [Biometrika, 1956, 43, 45–63]. Here we demonstrate how the same approach can be highly effective in other large-scale applications involving biological data sets. These include estimating microbial species richness, immune repertoire size, and k-mer diversity for genome assembly applications. We show how the method can be modified to address populations containing an effectively infinite number of species where saturation cannot practically be attained. We also introduce a flexible suite of tools implemented as an R package that make these methods broadly accessible. PMID:27252899
Observing strategies for future solar facilities: the ATST test case
NASA Astrophysics Data System (ADS)
Uitenbroek, H.; Tritschler, A.
2012-12-01
Traditionally, solar observations have been scheduled and performed very differently from night-time efforts, in particular because we have been observing the Sun for a long time, so that progress requires new combinations of observables, and because solar physics observations are often event driven on time scales of hours to days. With the proposal pressure that is expected for new large-aperture facilities, we can no longer afford the time spent on custom setups, and will have to rethink our scheduling and operations. We will discuss our efforts at Sac Peak in preparing for this new era, and outline the planned scheduling and operations approach for the ATST in particular.
Rocket Propulsion (RP) 21 Steering Committee Meeting - NASA Spacecraft Propulsion Update
NASA Technical Reports Server (NTRS)
Klem, Mark
2016-01-01
Lander Tech comprises three separate but synergistic efforts. Lunar CATALYST (Lunar Cargo Transportation and Landing by Soft Touchdown): support U.S. industry-led robotic lunar lander development via three public-private partnerships, and infuse or transfer landing technologies into those partnerships. Advanced Exploration Systems, Automated Propellant Loading (APL) - Integrated Ground Operations: demonstrate LH2 zero-loss storage, loading, and transfer operations via testing on a large scale in a relevant launch vehicle servicing environment (KSC, GRC). Game Changing Technology, 20 Kelvin/20 Watt Cryocooler: development of a reverse turbo-Brayton cryocooler operating at 20 Kelvin with 20 Watts of refrigeration lift.
NASA Astrophysics Data System (ADS)
Lele, Sanjiva K.
2002-08-01
Funds were received in April 2001 under the Department of Defense DURIP program for construction of a 48 processor high performance computing cluster. This report details the hardware which was purchased and how it has been used to enable and enhance research activities directly supported by, and of interest to, the Air Force Office of Scientific Research and the Department of Defense. The report is divided into two major sections. The first section after this summary describes the computer cluster, its setup, and some cluster performance benchmark results. The second section explains ongoing research efforts which have benefited from the cluster hardware, and presents highlights of those efforts since installation of the cluster.
Antibacterial Drug Discovery: Some Assembly Required.
Tommasi, Rubén; Iyer, Ramkumar; Miller, Alita A
2018-05-11
Our limited understanding of the molecular basis for compound entry into and efflux out of Gram-negative bacteria is now recognized as a key bottleneck for the rational discovery of novel antibacterial compounds. Traditional, large-scale biochemical or target-agnostic phenotypic antibacterial screening efforts have, as a result, not been very fruitful. A main driver of this knowledge gap has been the historical lack of predictive cellular assays, tools, and models that provide structure-activity relationships to inform optimization of compound accumulation. A variety of recent approaches has recently been described to address this conundrum. This Perspective explores these approaches and considers ways in which their integration could successfully redirect antibacterial drug discovery efforts.
The life cycles of intense cyclonic and anticyclonic circulation systems observed over oceans
NASA Technical Reports Server (NTRS)
Smith, Phillip J.
1993-01-01
Full attention was now directed to the blocking case studies mentioned in previous reports. Coding and initial computational tests were completed for a North Atlantic blocking case that occurred in late October/early November 1985 and an upstream cyclone that developed rapidly 24 hours before block onset. This work is the subject of two papers accepted for presentation at the International Symposium on the Lifecycles of Extratropical Cyclones in Bergen, Norway, 27 June - 1 July 1994. This effort is currently highlighted by two features. The first is the extension of the Zwack-Okossi equation, originally formulated for the diagnosis of surface wave development, for application at any pressure level. The second is the separation of the basic large-scale analysis fields into synoptic-scale and planetary-scale components, using a two-dimensional Shapiro filter, and the corresponding partitioning of the Zwack-Okossi equation into synoptic-scale, planetary-scale, and synoptic/planetary-scale interaction terms. Preliminary tests suggest substantial contribution from the synoptic-scale and interaction terms.
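The scale separation described above can be illustrated with a minimal sketch: a second-order (1-2-1) Shapiro filter applied repeatedly along both grid dimensions, with the residual taken as the synoptic-scale component. The filter order, pass count, and periodic boundary handling here are illustrative assumptions, not the study's exact configuration.

```python
def shapiro_1d(row):
    """One pass of a second-order Shapiro (1-2-1) filter, periodic ends."""
    n = len(row)
    return [0.25 * row[(i - 1) % n] + 0.5 * row[i] + 0.25 * row[(i + 1) % n]
            for i in range(n)]

def scale_separate(field, passes=20):
    """Split a 2D field into a planetary-scale (heavily smoothed) component
    and a synoptic-scale residual, so that the two components sum back to
    the original field by construction."""
    smooth = [row[:] for row in field]
    for _ in range(passes):
        smooth = [shapiro_1d(r) for r in smooth]            # filter rows
        cols = [shapiro_1d(list(c)) for c in zip(*smooth)]  # filter columns
        smooth = [list(r) for r in zip(*cols)]              # transpose back
    synoptic = [[f - s for f, s in zip(fr, sr)]
                for fr, sr in zip(field, smooth)]
    return smooth, synoptic
```

Because the decomposition is exact (planetary + synoptic = original), each term of the diagnostic equation can be partitioned the same way, which is what yields the synoptic-scale, planetary-scale, and interaction terms.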
Saura, Santiago; Rondinini, Carlo
2016-01-01
One of the biggest challenges in large-scale conservation is quantifying connectivity at broad geographic scales and for a large set of species. Because connectivity analyses can be computationally intensive, and the planning process quite complex when multiple taxa are involved, assessing connectivity over large spatial extents for many species often turns out to be intractable. As a result, assessments are often partial, focusing on only a few key species, or generic, considering a range of dispersal distances and a fixed set of areas to connect that are not directly linked to the actual spatial distribution or mobility of particular species. By using a graph theory framework, here we propose an approach to reduce computational effort and effectively consider large assemblages of species in obtaining multi-species connectivity priorities. We demonstrate the potential of the approach by identifying defragmentation priorities in the Italian road network, focusing on medium and large terrestrial mammals. We show that by combining probabilistic species graphs prior to conducting the network analysis (i) it is possible to analyse connectivity once for all species simultaneously, obtaining conservation or restoration priorities that apply to the entire species assemblage; and (ii) those priorities are well aligned with the ones that would be obtained by aggregating the results of separate connectivity analyses for each of the individual species. This approach offers great opportunities to extend connectivity assessments to large assemblages of species and broad geographic scales. PMID:27768718
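The idea of merging probabilistic species graphs before running a single network analysis can be sketched as follows. The weighted-average combination rule, species names, and probabilities are illustrative assumptions; the paper's exact combination method may differ.

```python
def combine_species_graphs(graphs, weights=None):
    """Combine per-species edge-probability graphs into one assemblage-level
    graph by (optionally weighted) averaging of each edge's dispersal
    probability across species. `graphs` maps species name -> {edge: prob};
    edges absent from a species' graph count as probability 0."""
    species = list(graphs)
    if weights is None:
        weights = {s: 1.0 for s in species}
    total_w = sum(weights.values())
    edges = {e for g in graphs.values() for e in g}
    return {e: sum(weights[s] * graphs[s].get(e, 0.0) for s in species) / total_w
            for e in edges}

# Two hypothetical species sharing one road-crossing link:
combined = combine_species_graphs({
    "roe_deer": {("A", "B"): 0.8, ("B", "C"): 0.2},
    "wild_boar": {("A", "B"): 0.4},
})
```

Once the graphs are merged, any single-species connectivity metric can be computed once on the combined graph instead of once per species, which is the source of the computational saving the paper describes.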
Job stress and temperaments in female nurses.
Kikuchi, Y; Nakaya, M; Ikeda, M; Takeda, M; Nishi, M
2013-03-01
According to previous studies, temperament predicts a large share of the variance in job stress. It may be necessary for mental health practitioners to offer intervention strategies in accordance with individual temperament. To investigate the relationship between job stress and temperament among nurses in a general hospital and to provide insight into personality traits influencing their mental or physical health. A questionnaire survey of nurses in a general hospital. Work stress was measured using the Japanese version of the Effort-Reward Imbalance (ERI) scale. Temperament was assessed by a Japanese version of Temperament Evaluation of Memphis, Pisa, Paris and San Diego-Autoquestionnaire (TEMPS-A). Hierarchical multiple regression analysis was used to determine the independent contribution of temperament to effort-reward ratio and over-commitment. Response rate was 48% (326/685). Temperament predicted part of the variance of the four ERI ratios (effort-reward ratio 26%; effort-esteem ratio 27%; effort-promotion ratio 26%; and effort-security ratio 18%) and also of over-commitment (38%). Depressive temperament influenced all four ERI ratios and over-commitment. Anxious temperament influenced only over-commitment. Nurses with depressive or anxious temperaments should be identified, monitored for signs of job stress and offered interventions to prevent adverse physical and mental effects.
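The ERI ratios analyzed above follow the effort/(reward × correction) form of the Effort-Reward Imbalance model. A minimal sketch, with the correction factor for unequal item counts computed from parameters and all scores illustrative:

```python
def eri_ratio(effort_sum, reward_sum, n_effort, n_reward):
    """Effort-reward imbalance ratio: summed effort score over summed
    reward score, with a correction factor c = n_effort / n_reward for
    the unequal numbers of effort and reward items. Values above 1
    indicate high effort relative to reward."""
    c = n_effort / n_reward
    return effort_sum / (reward_sum * c)

# Illustrative: effort 15 over 6 items, reward 33 over 11 items.
ratio = eri_ratio(15, 33, 6, 11)
```

In the study's regressions, this ratio (and the parallel effort-esteem, effort-promotion, and effort-security ratios) is the dependent variable that temperament scores partially predict.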
Native fish conservation areas: A vision for large-scale conservation of native fish communities
Williams, Jack E.; Williams, Richard N.; Thurow, Russell F.; Elwell, Leah; Philipp, David P.; Harris, Fred A.; Kershner, Jeffrey L.; Martinez, Patrick J.; Miller, Dirk; Reeves, Gordon H.; Frissell, Christopher A.; Sedell, James R.
2011-01-01
The status of freshwater fishes continues to decline despite substantial conservation efforts to reverse this trend and recover threatened and endangered aquatic species. Lack of success is partially due to working at smaller spatial scales and focusing on habitats and species that are already degraded. Protecting entire watersheds and aquatic communities, which we term "native fish conservation areas" (NFCAs), would complement existing conservation efforts by protecting intact aquatic communities while allowing compatible uses. Four critical elements need to be met within a NFCA: (1) maintain processes that create habitat complexity, diversity, and connectivity; (2) nurture all of the life history stages of the fishes being protected; (3) include a large enough watershed to provide long-term persistence of native fish populations; and (4) provide management that is sustainable over time. We describe how a network of protected watersheds could be created that would anchor aquatic conservation needs in river basins across the country.
The Ramifications of Meddling with Systems Governed by Self-organized Critical Dynamics
NASA Astrophysics Data System (ADS)
Carreras, B. A.; Newman, D. E.; Dobson, I.
2002-12-01
Complex natural, as well as man-made, systems often exhibit characteristics similar to those seen in self-organized critical (SOC) systems. The concept of self-organized criticality brings together ideas of self-organization of nonlinear dynamical systems with the often-observed near critical behavior of many natural phenomena. These phenomena exhibit self-similarities over extended ranges of spatial and temporal scales. In those systems, scale lengths may be described by fractal geometry and time scales that lead to 1/f-like power spectra. Natural applications include modeling the motion of tectonic plates, forest fires, magnetospheric dynamics, spin glass systems, and turbulent transport. In man-made systems, applications have included traffic dynamics, power and communications networks, and financial markets among many others. Simple cellular automata models such as the running sandpile model have been very useful in reproducing the complexity and characteristics of these systems. One characteristic property of SOC systems is that they relax through what we call events. These events can happen over all scales of the system. Examples of these events are: earthquakes in the case of plate tectonics; fires in the evolution of forests; extinctions in the co-evolution of biological species; and blackouts in power transmission systems. In a time-averaged sense, these systems are subcritical (that is, they lie in an average state that should not trigger any events) and the relaxation events happen intermittently. The time spent in a subcritical state relative to the time of the events varies from one system to another. For instance, the chance of finding a forest on fire is very low, with the frequency of fires being on the order of one fire every few years and with many of these fires small and inconsequential. Very large fires happen over time periods of decades or even centuries.
However, because of their consequences, these large but infrequent events are the important ones to understand, control and minimize. The main thrust of this research is to understand how and when global events occur in such systems when we apply mitigation techniques and how this impacts risk assessment. As sample systems we investigate both forest fire models and electrical power transmission network models, though the results are probably applicable to a wide variety of systems. It is found, perhaps counter intuitively, that apparently sensible attempts to mitigate failures in such complex systems can have adverse effects and therefore must be approached with care. The success of mitigation efforts in SOC systems is strongly influenced by the dynamics of the system. Unless the mitigation efforts alter the self-organization forces driving the system, the system will in general be pushed toward criticality. To alter those forces with mitigation efforts may be quite difficult because the forces are an intrinsic part of the system. Moreover, in many cases, efforts to mitigate small disruptions will increase the frequency of large disruptions. This occurs because the large and small disruptions are not independent but are strongly coupled by the dynamics. Before discussing this in the more complicated case of power systems, we will illustrate this phenomenon with a forest fire model.
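The running sandpile model invoked above can be sketched as a small 1D cellular automaton: grains arrive at random cells, and wherever the local slope exceeds a threshold, grains topple downhill. The thresholds, drive rate, and open-boundary rule below are illustrative choices, not tuned to any of the cited applications.

```python
import random

def running_sandpile(cells=50, steps=5000, z_crit=8, n_fall=3,
                     p_add=0.1, seed=1):
    """1D running sandpile. Each time step, every cell receives a grain
    with probability p_add; any cell whose slope to its neighbor exceeds
    z_crit topples n_fall grains downhill (grains leave at the open end).
    Returns the sizes (total topplings) of the relaxation events."""
    rng = random.Random(seed)
    h = [0] * cells
    sizes = []
    for _ in range(steps):
        for i in range(cells):          # random drive
            if rng.random() < p_add:
                h[i] += 1
        toppled = 0
        unstable = True
        while unstable:                 # relax until stable
            unstable = False
            for i in range(cells - 1):
                if h[i] - h[i + 1] > z_crit:
                    h[i] -= n_fall
                    h[i + 1] += n_fall
                    toppled += 1
                    unstable = True
            if h[-1] > z_crit:          # open boundary: grains exit
                h[-1] -= n_fall
                toppled += 1
                unstable = True
        if toppled:
            sizes.append(toppled)
    return sizes
```

Collecting the event-size distribution from such a run is the standard way to exhibit the heavy-tailed avalanche statistics that make "mitigating the small events" interact with the frequency of the large ones.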
Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J
2017-01-01
Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. 
We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales.
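The hierarchical analyses above build on occupancy modeling, whose core can be illustrated with a minimal single-season sketch (constant occupancy psi and detection p, grid-search maximum likelihood). This is only the basic likelihood idea, not the IMBCR program's hierarchical models, and the detection histories are hypothetical.

```python
import math
from itertools import product

def occupancy_mle(histories):
    """Grid-search MLE for the basic single-season occupancy model:
    psi = P(site occupied), p = P(detection per survey | occupied).
    `histories` is a list of per-site 0/1 detection sequences."""
    def loglik(psi, p):
        ll = 0.0
        for hist in histories:
            k, d = len(hist), sum(hist)
            if d > 0:
                # at least one detection: the site is certainly occupied
                ll += math.log(psi) + d * math.log(p) + (k - d) * math.log(1 - p)
            else:
                # never detected: occupied-but-missed, or truly unoccupied
                ll += math.log(psi * (1 - p) ** k + 1 - psi)
        return ll
    grid = [i / 100 for i in range(1, 100)]
    return max(product(grid, grid), key=lambda t: loglik(*t))
```

Separating psi from p is what lets programs like IMBCR correct raw count data for imperfect detection before estimating habitat relationships.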
Hahn, Beth A.; Dreitz, Victoria J.; George, T. Luke
2017-01-01
Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer’s sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer’s sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. 
We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales. PMID:29065128
Draut, Amy; Ritchie, Andrew C.
2015-01-01
Removal of two dams 32 m and 64 m high on the Elwha River, Washington, USA, provided the first opportunity to examine river response to a dam removal and controlled sediment influx on such a large scale. Although many recent river-restoration efforts have included dam removal, large dam removals have been rare enough that their physical and ecological effects remain poorly understood. New sedimentary deposits that formed during this multi-stage dam removal result from a unique, artificially created imbalance between fluvial sediment supply and transport capacity. River flows during dam removal were essentially natural and included no large floods in the first two years, while draining of the two reservoirs greatly increased the sediment supply available for fluvial transport. The resulting sedimentary deposits exhibited substantial spatial heterogeneity in thickness, stratal-formation patterns, grain size and organic content. Initial mud deposition in the first year of dam removal filled pore spaces in the pre-dam-removal cobble bed, potentially causing ecological disturbance but not aggrading the bed substantially at first. During the second winter of dam removal, thicker and in some cases coarser deposits replaced the early mud deposits. By 18 months into dam removal, channel-margin and floodplain deposits were commonly >0.5 m thick and, contrary to pre-dam-removal predictions that silt and clay would bypass the river system, included average mud content around 20%. Large wood and lenses of smaller organic particles were common in the new deposits, presumably contributing additional carbon and nutrients to the ecosystem downstream of the dam sites. 
Understanding initial sedimentary response to the Elwha River dam removals will inform subsequent analyses of longer-term sedimentary, geomorphic and ecosystem changes in this fluvial and coastal system, and will provide important lessons for other river-restoration efforts where large dam removal is planned or proposed.
Kalies, E L; Dickson, B G; Chambers, C L; Covington, W W
2012-01-01
In western North American conifer forests, wildfires are increasing in frequency and severity due to heavy fuel loads that have accumulated after a century of fire suppression. Forest restoration treatments (e.g., thinning and/or burning) are being designed and implemented at large spatial and temporal scales in an effort to reduce fire risk and restore forest structure and function. In ponderosa pine (Pinus ponderosa) forests, predominantly open forest structure and a frequent, low-severity fire regime constituted the evolutionary environment for wildlife that persisted for thousands of years. Small mammals are important in forest ecosystems as prey and in affecting primary production and decomposition. During 2006-2009, we trapped eight species of small mammals at 294 sites in northern Arizona and used occupancy modeling to determine community responses to thinning and habitat features. The most important covariates in predicting small mammal occupancy were understory vegetation cover, large snags, and treatment. Our analysis identified two generalist species found at relatively high occupancy rates across all sites, four open-forest species that responded positively to treatment, and two dense-forest species that responded negatively to treatment unless specific habitat features were retained. Our results indicate that all eight small mammal species can benefit from restoration treatments, particularly if aspects of their evolutionary environment (e.g., large trees, snags, woody debris) are restored. The occupancy modeling approach we used resulted in precise species-level estimates of occupancy in response to habitat attributes for a greater number of small mammal species than in other comparable studies. We recommend our approach for other studies faced with high variability and broad spatial and temporal scales in assessing impacts of treatments or habitat alteration on wildlife species. 
Moreover, since forest planning efforts are increasingly focusing on progressively larger treatment implementation, better and more efficiently obtained ecological information is needed to inform these efforts.
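The single-season occupancy framework the authors used can be sketched in a few lines. The example below uses tiny, invented detection histories (the actual study fit covariate models for eight species across 294 sites); it jointly estimates the probability a site is occupied (psi) and the probability of detection given occupancy (p), which is what lets occupancy models separate "absent" from "present but missed":

```python
import numpy as np
from scipy.optimize import minimize

# Detection histories: rows = sites, columns = repeat visits (1 = detected).
# Hypothetical data for illustration only.
histories = np.array([
    [1, 0, 1],
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
    [0, 1, 1],
])

def neg_log_lik(params, y):
    """Single-season occupancy model: psi = P(occupied), p = P(detect | occupied)."""
    psi = 1 / (1 + np.exp(-params[0]))   # inverse-logit keeps probabilities in (0, 1)
    p = 1 / (1 + np.exp(-params[1]))
    det = y.sum(axis=1)                   # detections per site
    k = y.shape[1]                        # visits per site
    # Sites with >=1 detection are certainly occupied; all-zero sites are a
    # mixture of "occupied but never detected" and "truly unoccupied".
    lik = np.where(
        det > 0,
        psi * p**det * (1 - p)**(k - det),
        psi * (1 - p)**k + (1 - psi),
    )
    return -np.log(lik).sum()

fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(histories,))
psi_hat = 1 / (1 + np.exp(-fit.x[0]))
p_hat = 1 / (1 + np.exp(-fit.x[1]))
print(f"estimated occupancy psi = {psi_hat:.2f}, detection p = {p_hat:.2f}")
```

In a real analysis, psi and p would each be modeled as logistic functions of habitat covariates (e.g., understory cover, snag density), which is how the study obtained species-level responses to treatment.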
Large-scale water projects in the developing world: Revisiting the past and looking to the future
NASA Astrophysics Data System (ADS)
Sivakumar, Bellie; Chen, Ji
2014-05-01
Over the past half century or so, the developing world has witnessed a significant increase in freshwater demand due to a combination of factors, including population growth, increased food demand, improved living standards, and water quality degradation. Since rainfall and river flow vary significantly in both space and time, large-scale storage and distribution of water has become a key means of meeting these increasing demands, and large dams and water transfer schemes (including river-linking schemes and virtual water trades) have played a key role. While the benefits of such large-scale projects in supplying water for domestic, irrigation, industrial, hydropower, recreational, and other uses, both in the countries of their development and in other countries, are undeniable, concerns about their negative impacts, such as high initial costs and damage to ecosystems (e.g. river environments and species) and the socio-economic fabric (e.g. relocation and socio-economic changes of affected people), have also grown in recent years. These concerns have led to serious debates on the role of large-scale water projects in the developing world and on their future, but the often one-sided nature of such debates has failed to yield fruitful outcomes thus far. The present study aims to offer a far more balanced perspective on this issue. First, it recognizes and emphasizes the need for additional large-scale water structures in the developing world, due to the continuing increase in water demands, inefficiency in water use (especially in the agricultural sector), and the absence of equivalent and reliable alternatives. Next, it reviews a few important success and failure stories of large-scale water projects in the developing world (and in the developed world), in an effort to arrive at a balanced view on the future role of such projects. 
Then, it discusses some major challenges in future water planning and management, with proper consideration to potential technological developments and new options. Finally, it highlights the urgent need for a broader framework that integrates the physical science-related aspects ("hard sciences") and the human science-related aspects ("soft sciences").
Kim, Nancy; Boone, Kyle B; Victor, Tara; Lu, Po; Keatinge, Carolyn; Mitchell, Cary
2010-08-01
Recently published practice standards recommend that multiple effort indicators be interspersed throughout neuropsychological evaluations to assess for response bias, which is most efficiently accomplished through use of effort indicators from standard cognitive tests already included in test batteries. The present study examined the utility of a timed recognition trial added to standard administration of the WAIS-III Digit Symbol subtest in a large sample of "real world" noncredible patients (n=82) as compared with credible neuropsychology clinic patients (n=89). Scores from the recognition trial were more sensitive in identifying poor effort than were standard Digit Symbol scores, and use of an equation incorporating Digit Symbol Age-Corrected Scaled Scores plus accuracy and time scores from the recognition trial was associated with nearly 80% sensitivity at 88.7% specificity. Thus, inclusion of a brief recognition trial to Digit Symbol administration has the potential to provide accurate assessment of response bias.
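The classification logic behind the reported sensitivity and specificity figures can be sketched as follows. All scores and the cutoff below are invented for illustration; the study's actual equation combined Digit Symbol Age-Corrected Scaled Scores with recognition-trial accuracy and time scores:

```python
# Hypothetical composite effort scores (lower = poorer effort); not the study's data.
noncredible = [12, 15, 18, 27, 22, 25, 14, 19]   # known poor-effort group
credible    = [30, 28, 35, 27, 24, 33, 31, 29]   # credible clinic patients
cutoff = 26  # scores below the cutoff are flagged as poor effort (assumed value)

# Sensitivity: fraction of noncredible patients correctly flagged.
sensitivity = sum(s < cutoff for s in noncredible) / len(noncredible)
# Specificity: fraction of credible patients correctly passed.
specificity = sum(s >= cutoff for s in credible) / len(credible)
print(f"sensitivity = {sensitivity:.3f}, specificity = {specificity:.3f}")
```

In practice the cutoff is chosen to hold specificity high (the study reports 88.7%) so that credible patients are rarely misclassified, even at some cost to sensitivity.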
The role of the Department of Defense in PACS and telemedicine research and development.
Mogel, Greg T
2003-01-01
The United States Department of Defense (DOD) has played a leading role in the movement of digital imaging, picture archiving and communications systems, and more recently telemedicine with its associated technologies into the mainstream of healthcare. Beginning in the 1980s with domestic implementations, and followed in the 1990s by both small and large-scale military deployments, these technologies have been put into action with varying degrees of success. These efforts however, have always served as a guidepost for similar civilian efforts and the establishment of a marketplace for the technologies. This paper examines the history of the DOD's role in these areas, the projects and programs established, assessing their current state of development and identifying the future direction of the DOD's research and implementation efforts in telemedicine and advanced medical technologies. Copyright 2002 Published by Elsevier Science Ltd.
Yang, Dan; Fan, Da Yong; Xie, Zong Qiang; Zhang, Ai Ying; Xiong, Gao Ming; Zhao, Chang Ming; Xu, Wen Ting
2016-03-01
The riparian zone, the ecological transition buffer between terrestrial and aquatic ecosystems (rivers, lakes, reservoirs, wetlands, and other water bodies) with unique eco-hydrological and biogeochemical processes, is the last ecological barrier preventing ammonium, nitrate, and other non-point nitrogen pollutants from reaching adjacent water bodies. Based on a summary of current progress in related studies, we found two major mechanisms underpinning nitrogen retention/removal by riparian ecosystems: 1) riparian vegetation can alter the location of nitrogen within the soil-plant-atmosphere continuum; 2) microorganisms in riparian soil can denitrify nitrogen and thereby remove it permanently. However, which process is more critical for nitrogen removal remains elusive. Because hydro-dynamic, vegetation, microbial, and soil substrate properties vary widely among watersheds, it is difficult to identify the most important factor driving the nitrogen cycle in riparian ecosystems. The limitations of study methods, the paucity of data at large spatial and temporal scales, and the lack of consensus on riparian width are three major reasons for the large variance in results among studies. We therefore suggest that further efforts focus on: 1) detailed, long-term analysis of successive environmental factors; 2) the application of comprehensive methods combining mathematical models, geographic information systems, remote sensing, and quantitative techniques (such as coupled isotopic tracer and gas exchange measurements); and 3) studies at large temporal and spatial scales. These efforts will help optimize nitrogen removal pathways in riparian ecosystems and provide a scientific basis for ecosystem management.
NASA Astrophysics Data System (ADS)
Zorita, E.
2009-12-01
One of the objectives when comparing simulations of past climates to proxy-based climate reconstructions is to assess the skill of climate models in simulating climate change. This comparison may be accomplished at large spatial scales, for instance the evolution of simulated and reconstructed Northern Hemisphere annual temperature, or at regional or point scales. In both approaches a 'fair' comparison has to take into account different aspects that affect the inevitable uncertainties and biases in the simulations and in the reconstructions. These efforts face a trade-off: climate models are believed to be more skillful at large hemispheric scales, but climate reconstructions at these scales are burdened by the spatial distribution of available proxies and by methodological issues surrounding the statistical method used to translate the proxy information into large-scale spatial averages. Furthermore, the internal climatic noise at large hemispheric scales is low, so that the sampling uncertainty tends also to be low. On the other hand, the skill of climate models at regional scales is limited by their coarse spatial resolution, which hinders a faithful representation of aspects important for the regional climate. At small spatial scales, the reconstruction of past climate probably faces fewer methodological problems if information from different proxies is available; the internal climatic variability at regional scales is, however, high. In this contribution some examples of the different issues faced when comparing simulations and reconstructions at small spatial scales in the past millennium are discussed. These examples comprise reconstructions from dendrochronological data and from historical documentary data in Europe, and climate simulations with global and regional models. They indicate that centennial climate variations can offer a reasonable target for assessing the skill of global climate models and of proxy-based reconstructions, even at small spatial scales. 
However, as the focus shifts towards higher-frequency (decadal or multidecadal) variability, the need for larger simulation ensembles becomes more evident. Nevertheless, the comparison at these time scales may open some lines of research on the origin of multidecadal regional climate variability.
Ambroggio, Xavier I; Dommer, Jennifer; Gopalan, Vivek; Dunham, Eleca J; Taubenberger, Jeffery K; Hurt, Darrell E
2013-06-18
Influenza A viruses possess RNA genomes that mutate frequently in response to immune pressures. The mutations in the hemagglutinin genes are particularly significant, as the hemagglutinin proteins mediate attachment and fusion to host cells, thereby influencing viral pathogenicity and species specificity. Large-scale influenza A genome sequencing efforts have been ongoing to understand past epidemics and pandemics and anticipate future outbreaks. Sequencing efforts thus far have generated nearly 9,000 distinct hemagglutinin amino acid sequences. Comparative models for all publicly available influenza A hemagglutinin protein sequences (8,769 to date) were generated using the Rosetta modeling suite. The C-alpha root mean square deviations between a randomly chosen test set of models and their crystallographic templates were less than 2 Å, suggesting that the modeling protocols yielded high-quality results. The models were compiled into an online resource, the Hemagglutinin Structure Prediction (HASP) server. The HASP server was designed as a scientific tool for researchers to visualize hemagglutinin protein sequences of interest in a three-dimensional context. With a built-in molecular viewer, hemagglutinin models can be compared side-by-side and navigated by a corresponding sequence alignment. The models and alignments can be downloaded for offline use and further analysis. The modeling protocols used in the HASP server scale well for large amounts of sequences and will keep pace with expanded sequencing efforts. The conservative approach to modeling and the intuitive search and visualization interfaces allow researchers to quickly analyze hemagglutinin sequences of interest in the context of the most highly related experimental structures, and allow them to directly compare hemagglutinin sequences to each other simultaneously in their two- and three-dimensional contexts. 
The models and methodology have shown utility in current research efforts and the ongoing aim of the HASP server is to continue to accelerate influenza A research and have a positive impact on global public health.
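The model-quality check described above rests on the C-alpha root mean square deviation between model and template. A minimal sketch with toy coordinates follows; a real validation would first optimally superpose the two structures (e.g., via the Kabsch algorithm) and repeat the comparison over thousands of models:

```python
import numpy as np

def rmsd(a, b):
    """Root mean square deviation between two matched coordinate sets (N x 3).
    Assumes the structures are already superposed."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.sqrt(((a - b) ** 2).sum(axis=1).mean()))

# Toy C-alpha coordinates (angstroms) for a model and its template; invented values.
model    = [[0.0, 0.0, 0.0], [3.8, 0.0, 0.0], [7.6, 0.0, 0.0]]
template = [[0.1, 0.0, 0.0], [3.9, 0.1, 0.0], [7.5, -0.1, 0.0]]
print(f"C-alpha RMSD = {rmsd(model, template):.3f} A")  # well under the 2 A threshold
```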
Assessing a Science Graduate School Recruitment Symposium.
González-Espada, Wilson; Díaz-Muñoz, Greetchen; Feliú-Mójer, Mónica; Flores-Otero, Jacqueline; Fortis-Santiago, Yaihara; Guerrero-Medina, Giovanna; López-Casillas, Marcos; Colón-Ramos, Daniel A; Fernández-Repollet, Emma
2015-12-01
Ciencia Puerto Rico, a non-profit organization dedicated to promoting science, research and scientific education among Latinos, organized an educational symposium to provide college science majors the tools, opportunities and advice to pursue graduate degrees and succeed in the STEM disciplines. In this article we share our experiences and lessons learned, for others interested in developing large-scale events to recruit underrepresented minorities to STEM and in evaluating the effectiveness of these efforts.
Large-scale sequencing efforts are uncovering the complexity of cancer genomes, which are composed of causal "driver" mutations that promote tumor progression along with many more pathologically neutral "passenger" events. The majority of mutations, both in known cancer drivers and uncharacterized genes, are generally of low occurrence, highlighting the need to functionally annotate the long tail of infrequent mutations present in heterogeneous cancers.
U.S. Strategic Interest in the Middle East and Implications for the Army
2017-01-01
three of which evolved into civil wars, have altered the landscape. These have resulted in occasional opportunities, such as the effort to build a...the Arab world, though experience indicates that large-scale intervention in such conflicts is likely to produce disappointing results. Army...the Iranian regime to rally people around the flag. As a result, the United States should be cautious in demonstrating support for such progress
Cosmology: A research briefing
NASA Technical Reports Server (NTRS)
1995-01-01
As part of its effort to update topics dealt with in the 1986 decadal physics survey, the Board on Physics and Astronomy of the National Research Council (NRC) formed a Panel on Cosmology. The Panel produced this report, intended to be accessible to science policymakers and nonscientists. The chapters include an overview ('What Is Cosmology?'), a discussion of cosmic microwave background radiation, the large-scale structure of the universe, the distant universe, and physics of the early universe.
Nonproliferation and Threat Reduction Assistance: U.S. Programs in the Former Soviet Union
2009-07-31
seeks to help Russia reconfigure its large-scale former BW-related facilities so that they can perform peaceful research issues such as infectious...opting instead for the construction of fast breeder reactors that could burn plutonium directly for energy production. The United States might not fund...this effort, as many in the United States argue that breeder reactors, which produce more plutonium than they consume, would undermine
Nonproliferation and Threat Reduction Assistance: U.S. Programs in the Former Soviet Union
2011-04-26
large-scale former BW-related facilities so that they can perform peaceful research issues such as infectious diseases. The Global Threat Reduction...indicated that it may not pursue the MOX program to eliminate its plutonium, opting instead for the construction of fast breeder reactors that could...burn plutonium directly for energy production. The United States might not fund this effort, as many in the United States argue that breeder reactors
Open source tools for large-scale neuroscience.
Freeman, Jeremy
2015-06-01
New technologies for monitoring and manipulating the nervous system promise exciting biology but pose challenges for analysis and computation. Solutions can be found in the form of modern approaches to distributed computing, machine learning, and interactive visualization. But embracing these new technologies will require a cultural shift: away from independent efforts and proprietary methods and toward an open source and collaborative neuroscience. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.
Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER.
Chatzigeorgiou, Giorgos; Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos
2016-01-01
Citizen Science (CS) as a term encompasses a wide range of approaches and scopes across many different fields of science, and the number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. The pilot CS project COMBER was created to provide evidence to answer this question for coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation, and data set validation can be a valuable tool for detecting large-scale, long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues.
Testing Einstein's Gravity on Large Scales
NASA Technical Reports Server (NTRS)
Prescod-Weinstein, Chandra
2011-01-01
A little over a decade has passed since two teams studying high redshift Type Ia supernovae announced the discovery that the expansion of the universe was accelerating. After all this time, we're still not sure how cosmic acceleration fits into the theory that tells us about the large-scale universe: General Relativity (GR). As part of our search for answers, we have been forced to question GR itself. But how will we test our ideas? We are fortunate enough to be entering the era of precision cosmology, where the standard model of gravity can be subjected to more rigorous testing. Various techniques will be employed over the next decade or two in the effort to better understand cosmic acceleration and the theory behind it. In this talk, I will describe cosmic acceleration, current proposals to explain it, and weak gravitational lensing, an observational effect that allows us to do the necessary precision cosmology.
NASA Astrophysics Data System (ADS)
Desai, Darshak A.; Kotadiya, Parth; Makwana, Nikheel; Patel, Sonalinkumar
2015-03-01
Indian industries need overall operational excellence for sustainable profitability and growth in the present age of global competitiveness. Among different quality and productivity improvement techniques, Six Sigma has emerged as one of the most effective breakthrough improvement strategies. Though Indian industries are exploring this improvement methodology to their advantage and reaping the benefits, little has been published regarding experience with Six Sigma in the food-processing industries. This paper exemplifies the application of a Six Sigma quality improvement drive in one of the large-scale food-processing sectors in India. The paper discusses the phase-wise implementation of define, measure, analyze, improve, and control (DMAIC) on one chronic problem: variation in the weight of milk powder pouches. The paper wraps up with the improvements achieved and the projected bottom-line gain to the unit from the application of the Six Sigma methodology.
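In the measure and analyze phases of DMAIC, weight variation of this kind is typically quantified with process capability indices. The sketch below uses invented pouch weights and assumed specification limits, not the plant's data:

```python
import statistics

# Hypothetical pouch-weight sample (grams); spec limits are assumed, not the plant's.
weights = [500.2, 499.8, 500.5, 499.6, 500.1, 500.3, 499.9, 500.0, 500.4, 499.7]
lsl, usl = 498.5, 501.5   # lower/upper specification limits

mean = statistics.mean(weights)
s = statistics.stdev(weights)                 # sample standard deviation
cp = (usl - lsl) / (6 * s)                    # potential capability (ignores centering)
cpk = min(usl - mean, mean - lsl) / (3 * s)   # actual capability (penalizes off-center)
print(f"mean = {mean:.2f} g, s = {s:.3f} g, Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```

A Cpk below roughly 1.33 is commonly read as an incapable process; the improve phase then targets both centering (mean versus nominal) and spread (s).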
Experimental Investigation of a Large-Scale Low-Boom Inlet Concept
NASA Technical Reports Server (NTRS)
Hirt, Stefanie M.; Chima, Rodrick V.; Vyas, Manan A.; Wayman, Thomas R.; Conners, Timothy R.; Reger, Robert W.
2011-01-01
A large-scale low-boom inlet concept was tested in the NASA Glenn Research Center 8- by 6-Foot Supersonic Wind Tunnel. The purpose of this test was to assess inlet performance, stability, and operability at various Mach numbers and angles of attack. During this effort, two models were tested: a dual-stream inlet designed to mimic potential aircraft flight hardware integrating a high-flow bypass stream, and a single-stream inlet designed to study a configuration with a zero-degree external cowl angle and to permit surface visualization of the vortex generator flow on the internal centerbody surface. During the course of the test, the low-boom inlet concept was demonstrated to have high recovery, excellent buzz margin, and high operability. This paper provides an overview of the setup, a brief comparison of the dual-stream and single-stream inlet results, and an examination of the dual-stream inlet characteristics.
Exploring The Relation Between Upper Tropospheric (UT) Clouds and Convection
NASA Astrophysics Data System (ADS)
Stephens, G. L.; Stubenrauch, C.
2017-12-01
The importance of knowing the vertical transports of water vapor and condensate by atmospheric moist convection cannot be overstated. Vertical convective transports have wide-ranging influences on the Earth system, shaping weather, climate, the hydrological cycle, and the composition of the atmosphere. These transports also influence the upper tropospheric cloudiness that exerts profound effects on climate. Although there are presently no direct observations to quantify these transports on the large scale, and no observations to constrain model assumptions about them, it might be possible to derive useful observational proxies of these transports and their influence. This talk will present results derived from a large community effort that has developed important observational data records linking clouds and convection. Steps to use these observational metrics to examine the relation between convection and UT clouds in both cloud-scale and global-scale models are exemplified, and important feedbacks between high clouds, radiation, and convection will be elucidated.
Carbon and Carbon Hybrid Materials as Anodes for Sodium-Ion Batteries.
Zhong, Xiongwu; Wu, Ying; Zeng, Sifan; Yu, Yan
2018-02-12
Sodium-ion batteries (SIBs) have attracted much attention for application in large-scale grid energy storage owing to the abundance and low cost of sodium sources. However, low energy density and poor cycling life hinder practical application of SIBs. Recently, substantial efforts have been made to develop electrode materials to push forward large-scale practical applications. Carbon materials can be directly used as anode materials, and they show excellent sodium storage performance. Additionally, designing and constructing carbon hybrid materials is an effective strategy to obtain high-performance anodes for SIBs. In this review, we summarize recent research progress on carbon and carbon hybrid materials as anodes for SIBs. Nanostructural design to enhance the sodium storage performance of anode materials is discussed, and we offer some insight into the potential directions of and future high-performance anode materials for SIBs. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
System design and integration of the large-scale advanced prop-fan
NASA Technical Reports Server (NTRS)
Huth, B. P.
1986-01-01
In recent years, considerable attention has been directed toward improving aircraft fuel consumption. Studies have shown that blades with thin airfoils and aerodynamic sweep extend the inherent efficiency advantage that turboprop propulsion systems have demonstrated to the higher speeds of today's aircraft. Hamilton Standard has designed a 9-foot diameter single-rotation Prop-Fan and will test the hardware on a static test stand, in low-speed and high-speed wind tunnels, and on a research aircraft. The major objective of this testing is to establish the structural integrity of large-scale Prop-Fans of advanced construction, in addition to evaluating aerodynamic performance and the aeroacoustic design. The coordination efforts performed to ensure smooth operation and assembly of the Prop-Fan are summarized, along with the loads used to size the system components, the methodology used to establish material allowables, and a review of the key analytical results.
2012-01-01
Isolation of polyhydroxyalkanoates (PHAs) from bacterial cell matter is a critical step in achieving profitable production of the polymer; an extraction method must therefore deliver high recovery of a pure product at low cost. This study presents a simplified method for large-scale poly(3-hydroxybutyrate), poly(3HB), extraction using sodium hypochlorite. Poly(3HB) was extracted from cells of Ralstonia eutropha H16 at almost 96% purity. At different extraction volumes, a maximum recovery rate of 91.32% was obtained. At the largest extraction volume of 50 L, poly(3HB) with an average purity of 93.32% ± 4.62% was extracted, with a maximum recovery of 87.03% of the initial poly(3HB) content. This process is easy to handle and requires less effort than previously described processes. PMID:23164136
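Recovery and purity combine as a simple mass balance: the polymer recovered is the crude extract mass times its purity, divided by the polymer initially present in the cells. The numbers below are hypothetical, chosen only to mirror the magnitudes reported in the abstract:

```python
# Illustrative mass balance for a polymer extraction; all values are assumptions.
initial_p3hb_g = 100.0      # poly(3HB) contained in the cell mass before extraction
extracted_crude_g = 93.3    # mass of recovered crude product
purity = 0.933              # fraction of the crude product that is poly(3HB)

recovered_p3hb_g = extracted_crude_g * purity
recovery = recovered_p3hb_g / initial_p3hb_g
print(f"recovery = {recovery:.1%} of initial poly(3HB)")
```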
Large-Eddy Simulation of Aeroacoustic Applications
NASA Technical Reports Server (NTRS)
Pruett, C. David; Sochacki, James S.
1999-01-01
This report summarizes work accomplished under a one-year NASA grant from NASA Langley Research Center (LaRC). The effort culminates three years of NASA-supported research under three consecutive one-year grants. The period of support was April 6, 1998, through April 5, 1999; by request, the grant period was extended at no cost until October 6, 1999. This grant and its predecessors were directed toward adapting the numerical tool of large-eddy simulation (LES) to aeroacoustic applications, with particular focus on noise suppression in subsonic round jets. In LES, the filtered Navier-Stokes equations are solved numerically on a relatively coarse computational grid. Residual stresses, generated by scales of motion too small to be resolved on the coarse grid, are modeled. Although most LES incorporate spatial filtering, time-domain filtering affords certain conceptual and computational advantages, particularly for aeroacoustic applications. Consequently, this work has focused on the development of subgrid-scale (SGS) models that incorporate time-domain filters.
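A causal time-domain filter of the kind discussed can be sketched as a first-order exponential average, which damps fluctuations faster than the filter time scale tau while passing slower resolved motions. This is a simplified illustration of temporal filtering, not the SGS models developed in the grant work:

```python
import math

def exp_filter(signal, dt, tau):
    """Causal exponential time-domain filter: a discrete form of
    d(u_bar)/dt = (u - u_bar) / tau, damping scales shorter than tau."""
    alpha = dt / (tau + dt)
    u_bar = [signal[0]]
    for u in signal[1:]:
        u_bar.append(alpha * u + (1 - alpha) * u_bar[-1])
    return u_bar

# A slow 1 Hz oscillation plus fast 40 Hz "subgrid" noise; the filter
# retains the slow part and strongly attenuates the fast part.
dt, tau = 0.01, 0.1
t = [i * dt for i in range(1000)]
u = [math.sin(2 * math.pi * 1.0 * ti) + 0.3 * math.sin(2 * math.pi * 40.0 * ti)
     for ti in t]
u_bar = exp_filter(u, dt, tau)
```

Note that a first-order filter also slightly attenuates and lags the resolved signal; the trade-off between dt, tau, and the resolved frequencies is part of what makes time-domain SGS modeling subtle.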
Space and time scales in human-landscape systems.
Kondolf, G Mathias; Podolak, Kristen
2014-01-01
Exploring spatial and temporal scales provides a way to understand human alteration of landscape processes and human responses to these processes. We address three topics relevant to human-landscape systems: (1) scales of human impacts on geomorphic processes, (2) spatial and temporal scales in river restoration, and (3) time scales of natural disasters and behavioral and institutional responses. Studies showing dramatic recent change in sediment yields from uplands to the ocean via rivers illustrate the increasingly vast spatial extent and quick rate of human landscape change over the last two millennia, and especially in the second half of the twentieth century. Recent river restoration efforts are typically small in spatial and temporal scale compared to the historical human changes to ecosystem processes, and the cumulative effectiveness of multiple small restoration projects in achieving large ecosystem goals has yet to be demonstrated. The mismatch between infrequent natural disasters on the one hand and individual risk perception, media coverage, and institutional response on the other results in unpreparedness and in unsustainable land use and building practices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Middleton, Richard S.; Levine, Jonathan S.; Bielicki, Jeffrey M.
2015-04-27
CO2 capture, utilization, and storage (CCUS) technology has yet to be widely deployed at a commercial scale despite multiple high-profile demonstration projects. We suggest that developing a large-scale, visible, and financially viable CCUS network could potentially overcome many barriers to deployment and jumpstart commercial-scale CCUS. To date, substantial effort has focused on technology development to reduce the costs of CO2 capture from coal-fired power plants. Here, we propose that near-term investment could focus on implementing CO2 capture on facilities that produce high-value chemicals/products. These facilities can absorb the expected impact of the marginal increase in the cost of production on the price of their product, due to the addition of CO2 capture, more than coal-fired power plants can. A financially viable demonstration of a large-scale CCUS network requires offsetting the costs of CO2 capture by using the CO2 as an input to the production of market-viable products. As a result, we demonstrate this alternative development path with the example of an integrated CCUS system where CO2 is captured from ethylene producers and used for enhanced oil recovery in the U.S. Gulf Coast region.
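The economic argument, that high-value producers can absorb capture costs partially offset by revenue from selling CO2 for enhanced oil recovery, reduces to simple arithmetic. All figures below are illustrative assumptions, not values from the study:

```python
# Back-of-envelope capture-cost offset via EOR; every number is a hypothetical input.
capture_cost_per_t = 60.0    # $/tCO2 captured at a high-value chemicals facility
eor_price_per_t = 35.0       # $/tCO2 paid by an EOR operator for delivered CO2
tonnes_per_year = 1_000_000  # annual captured volume

net_cost = (capture_cost_per_t - eor_price_per_t) * tonnes_per_year
print(f"net annual capture cost after EOR revenue: ${net_cost:,.0f}")
```

The smaller this net cost is relative to the facility's product revenue, the more easily the price impact is absorbed, which is the paper's argument for starting with high-value producers rather than coal-fired power plants.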
Test of an argon cusp plasma for tin LPP power scaling
NASA Astrophysics Data System (ADS)
McGeoch, Malcolm W.
2015-03-01
Scaling the power of the tin droplet laser-produced-plasma (LPP) extreme ultraviolet (EUV) source to 500W has eluded the industry after a decade of effort. In 2014 we proposed [2] a solution: placing the laser-plasma interaction region within an argon plasma in a magnetic cusp. This would serve to ionize tin atoms and guide them to a large area annular beam dump. We have since demonstrated the feasibility of this approach. We present first results from a full-scale test plasma at power levels relevant to the generation of at least 200W, showing both that the argon cusp plasma is very stable, and that its geometrical properties are ideal for the transport of exhaust power and tin to the beam dump.
Ionita-Laza, Iuliana; Ottman, Ruth
2011-11-01
The recent progress in sequencing technologies makes possible large-scale medical sequencing efforts to assess the importance of rare variants in complex diseases. The results of such efforts depend heavily on the use of efficient study designs and analytical methods. We introduce here a unified framework for association testing of rare variants in family-based designs or designs based on unselected affected individuals. This framework allows us to quantify the enrichment in rare disease variants in families containing multiple affected individuals and to investigate the optimal design of studies aiming to identify rare disease variants in complex traits. We show that for many complex diseases with small values for the overall sibling recurrence risk ratio, such as Alzheimer's disease and most cancers, sequencing affected individuals with a positive family history of the disease can be extremely advantageous for identifying rare disease variants. In contrast, for complex diseases with large values of the sibling recurrence risk ratio, sequencing unselected affected individuals may be preferable.
Leveraging annotation-based modeling with Jump.
Bergmayr, Alexander; Grossniklaus, Michael; Wimmer, Manuel; Kappel, Gerti
2018-01-01
The capability of UML profiles to serve as an annotation mechanism has been recognized in both research and industry. Today's modeling tools offer profiles specific to platforms, such as Java, as they facilitate model-based engineering approaches. However, considering the large number of possible annotations in Java, manually developing the corresponding profiles would demand enormous development and maintenance effort. Thus, leveraging annotation-based modeling requires an automated approach capable of generating platform-specific profiles from Java libraries. To address this challenge, we present the fully automated transformation chain realized by Jump, extending existing mapping efforts between Java and UML with an emphasis on annotations and profiles. The evaluation of Jump shows that it scales to large Java libraries and generates profiles of equal or even improved quality compared to profiles currently used in practice. Furthermore, we demonstrate the practical value of Jump by contributing profiles that facilitate reverse engineering and forward engineering processes for the Java platform, applying it to a modernization scenario.
2010-01-01
Despite increasing efforts and support for anti-malarial drug R&D, globally anti-malarial drug discovery and development remains largely uncoordinated and fragmented. The current window of opportunity for large scale funding of R&D into malaria is likely to narrow in the coming decade due to a contraction in available resources caused by the current economic difficulties and new priorities (e.g. climate change). It is, therefore, essential that stakeholders are given well-articulated action plans and priorities to guide judgments on where resources can be best targeted. The CRIMALDDI Consortium (a European Union funded initiative) has been set up to develop, through a process of stakeholder and expert consultations, such priorities and recommendations to address them. It is hoped that the recommendations will help to guide the priorities of the European anti-malarial research as well as the wider global discovery agenda in the coming decade. PMID:20626844
Environmental DNA illuminates the dark diversity of sharks
Boussarie, Germain; Bonnin, Lucas; Kulbicki, Michel; Vigliola, Laurent
2018-01-01
In the era of “Anthropocene defaunation,” large species are often no longer detected in habitats where they formerly occurred. However, it is unclear whether this apparent missing, or “dark,” diversity of megafauna results from local species extirpations or from failure to detect elusive remaining individuals. We find that despite two orders of magnitude less sampling effort, environmental DNA (eDNA) detects 44% more shark species than traditional underwater visual censuses and baited videos across the New Caledonian archipelago (south-western Pacific). Furthermore, eDNA analysis reveals the presence of previously unobserved shark species in human-impacted areas. Overall, our results highlight a greater prevalence of sharks than described by traditional survey methods in both impacted and wilderness areas. This indicates an urgent need for large-scale eDNA assessments to improve monitoring of threatened and elusive megafauna. Finally, our findings emphasize the need for conservation efforts specifically geared toward the protection of elusive, residual populations. PMID:29732403
Safety modelling and testing of lithium-ion batteries in electrified vehicles
NASA Astrophysics Data System (ADS)
Deng, Jie; Bae, Chulheung; Marcicki, James; Masias, Alvaro; Miller, Theodore
2018-04-01
To optimize the safety of batteries, it is important to understand their behaviours when subjected to abuse conditions. Most early efforts in battery safety modelling focused on either one battery cell or a single field of interest such as mechanical or thermal failure. These efforts may not completely reflect the failure of batteries in automotive applications, where various physical processes can take place in a large number of cells simultaneously. In this Perspective, we review modelling and testing approaches for battery safety under abuse conditions. We then propose a general framework for large-scale multi-physics modelling and experimental work to address safety issues of automotive batteries in real-world applications. In particular, we consider modelling coupled mechanical, electrical, electrochemical and thermal behaviours of batteries, and explore strategies to extend simulations to the battery module and pack level. Moreover, we evaluate safety test approaches for an entire range of automotive hardware sets from cell to pack. We also discuss challenges in building this framework and directions for its future development.
Extreme Urban Stargazing: Outreach in New York City
NASA Astrophysics Data System (ADS)
Kendall, Jason S.
2013-01-01
There is a fundamental need for the professional community to cultivate and nurture active relationships with amateur organizations. Such work is highly beneficial to general public education and town-gown relations, but its rewards are time-consuming and hard-won. New York City and the surrounding area is both ideally suited and unambiguously ill-suited for astronomy public outreach. I will detail the results of three major outreach efforts in coordination with the Amateur Astronomers Association of New York. I will highlight large public-space observing in the context of the Transit of Venus and star parties at other locations. I will also outline outreach efforts at William Paterson University, where two public nights and a Curiosity EDL event created a clear impact in Northern New Jersey. I will detail methods for encouraging and bringing out amateur observers to events, urban crowd management, publicity issues, and the benefits and pitfalls of social media in the promotion and execution of large-scale and moderate events.
Job strain, effort-reward imbalance and employee well-being: a large-scale cross-sectional study.
de Jonge, J; Bosma, H; Peter, R; Siegrist, J
2000-05-01
This study investigated the effects of the Job Demand-Control (JD-C) Model and the Effort-Reward Imbalance (ERI) Model on employee well-being. A cross-sectional survey was conducted comprising a large representative sample of 11,636 employed Dutch men and women. Logistic regression analyses were used. Controlling for job sector, demographic characteristics (including educational level) and managerial position, employees reporting high job demands (i.e. psychological and physical demands) and low job control had elevated risks of emotional exhaustion, psychosomatic and physical health complaints and job dissatisfaction (odds ratios ranged from 2.89 to 10.94). Odds ratios were generally higher in employees reporting both high (psychological and physical) efforts and low rewards (i.e. poor salary, job insecurity and low work support): they ranged from 3.23 to 15.43. Furthermore, overcommitted people had higher risks of poor well-being due to a high effort-low reward mismatch (ORs: 3.57-20.81) than their less committed counterparts (ORs: 3.01-12.71). Finally, high efforts and low occupational rewards were stronger predictors of poor well-being than low job control when both job stress models were simultaneously adjusted. In conclusion, our findings show independent cumulative effects of both the JD-C Model and the ERI Model on employee well-being; these effects did not differ significantly between men and women, or between younger and older employees. In particular, high (psychological and physical) efforts and low rewards adversely affected employee well-being. Preliminary findings also indicate excess risks of poor well-being in overcommitted persons suffering from high-cost/low-gain conditions at work.
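As a hedged illustration of the odds-ratio effect measures reported above, the cross-product odds ratio and its 95% confidence interval for a single binary exposure can be computed from a 2x2 table. All counts below are hypothetical, not taken from the study:

```python
import math

# Hypothetical 2x2 table (counts are illustrative, NOT from the study):
# rows = exposure (high effort / low reward vs. other), columns = outcome.
a, b = 120, 380   # exposed: cases, non-cases
c, d = 60, 940    # unexposed: cases, non-cases

odds_ratio = (a / b) / (c / d)                 # cross-product odds ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

In a full analysis such as the study's, the odds ratio is instead obtained as exp(beta) from a logistic regression coefficient, which allows adjusting for the covariates listed above.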
Development of a large scale Chimera grid system for the Space Shuttle Launch Vehicle
NASA Technical Reports Server (NTRS)
Pearce, Daniel G.; Stanley, Scott A.; Martin, Fred W., Jr.; Gomez, Ray J.; Le Beau, Gerald J.; Buning, Pieter G.; Chan, William M.; Chiu, Ing-Tsau; Wulf, Armin; Akdag, Vedat
1993-01-01
The application of CFD techniques to large problems has dictated the need for large team efforts. This paper offers an opportunity to examine the motivations, goals, needs, problems, as well as the methods, tools, and constraints that defined NASA's development of a 111 grid/16 million point grid system model for the Space Shuttle Launch Vehicle. The Chimera approach used for domain decomposition encouraged separation of the complex geometry into several major components, each of which was modeled by an autonomous team. ICEM-CFD, a CAD-based grid generation package, simplified the geometry and grid topology definition by providing mature CAD tools and patch-independent meshing. The resulting grid system has, on average, a four inch resolution along the surface.
NASA Astrophysics Data System (ADS)
Kurucz, Charles N.; Waite, Thomas D.; Otaño, Suzana E.; Cooper, William J.; Nickelsen, Michael G.
2002-11-01
The effectiveness of using high energy electron beam irradiation for the removal of toxic organic chemicals from water and wastewater has been demonstrated by commercial-scale experiments conducted at the Electron Beam Research Facility (EBRF) located in Miami, Florida and elsewhere. The EBRF treats various waste and water streams up to 450 l min -1 (120 gal min -1) with doses up to 8 kilogray (kGy). Many experiments have been conducted by injecting toxic organic compounds into various plant feed streams and measuring the concentrations of compound(s) before and after exposure to the electron beam at various doses. Extensive experimentation has also been performed by dissolving selected chemicals in 22,700 l (6000 gal) tank trucks of potable water to simulate contaminated groundwater, and pumping the resulting solutions through the electron beam. These large-scale experiments, although necessary to demonstrate the commercial viability of the process, require a great deal of time and effort. This paper compares the results of large-scale electron beam irradiations to those obtained from bench-scale irradiations using gamma rays generated by a 60Co source. Dose constants from exponential contaminant removal models are found to depend on the source of radiation and initial contaminant concentration. Possible reasons for observed differences such as a dose rate effect are discussed. Models for estimating electron beam dose constants from bench-scale gamma experiments are presented. Data used to compare the removal of organic compounds using gamma irradiation and electron beam irradiation are taken from the literature and a series of experiments designed to examine the effects of pH, the presence of turbidity, and initial concentration on the removal of various organic compounds (benzene, toluene, phenol, PCE, TCE and chloroform) from simulated groundwater.
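The exponential contaminant removal model mentioned above has the form C(D) = C0 * exp(-k * D). As a minimal sketch with hypothetical numbers (not data from the EBRF experiments), the dose constant k can be backed out from a single before/after measurement:

```python
import math

# First-order removal model: C(D) = C0 * exp(-k * D), with absorbed dose D
# in kGy and dose constant k in 1/kGy. All numbers are hypothetical.
C0 = 100.0    # initial contaminant concentration (ug/L)
C_D = 10.0    # concentration measured after dose D (ug/L)
D = 5.0       # absorbed dose (kGy)

k = math.log(C0 / C_D) / D                 # back out the dose constant
removal_8kGy = 1.0 - math.exp(-k * 8.0)    # predicted fractional removal at 8 kGy
```

Comparing dose constants fitted this way for gamma versus electron beam irradiation is exactly the kind of comparison the paper reports, including the dependence of k on radiation source and initial concentration.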
Detection of suboptimal effort with symbol span: development of a new embedded index.
Young, J Christopher; Caron, Joshua E; Baughman, Brandon C; Sawyer, R John
2012-03-01
Developing embedded indicators of suboptimal effort on objective neurocognitive testing is essential for detecting increasingly sophisticated forms of symptom feigning. The current study explored whether Symbol Span, a novel Wechsler Memory Scale-fourth edition measure of supraspan visual attention, could be used to discriminate adequate effort from suboptimal effort. Archival data were collected from 136 veterans classified into Poor Effort (n = 42) and Good Effort (n = 94) groups based on symptom validity test (SVT) performance. The Poor Effort group had significantly lower raw scores (p < .001) and age-corrected scaled scores (p < .001) than the Good Effort group on the Symbol Span test. A raw score cutoff of <14 produced 83% specificity and 50% sensitivity for detection of Poor Effort. Similarly, sensitivity was 52% and specificity was 84% when employing a cutoff of <7 for Age-Corrected Scale Score. Collectively, present results suggest that Symbol Span can effectively differentiate veterans with multiple failures on established free-standing and embedded SVTs.
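The sensitivity and specificity figures above come from applying a raw-score cutoff to two criterion groups. A minimal sketch of that computation, with hypothetical scores and effort labels rather than the study's data:

```python
def cutoff_rates(scores, poor_effort, cutoff):
    # Flag scores below `cutoff` as suggesting poor effort, then compare the
    # flags against criterion labels (e.g., symptom validity test failure).
    # Hypothetical helper, not the study's analysis code.
    flagged = [s < cutoff for s in scores]
    tp = sum(f and p for f, p in zip(flagged, poor_effort))
    fn = sum((not f) and p for f, p in zip(flagged, poor_effort))
    tn = sum((not f) and (not p) for f, p in zip(flagged, poor_effort))
    fp = sum(f and (not p) for f, p in zip(flagged, poor_effort))
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity
```

Scanning candidate cutoffs with a function like this is the usual way a threshold such as <14 is selected to balance sensitivity against the desired specificity.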
Numerical Simulation of a High Mach Number Jet Flow
NASA Technical Reports Server (NTRS)
Hayder, M. Ehtesham; Turkel, Eli; Mankbadi, Reda R.
1993-01-01
The recent efforts to develop accurate numerical schemes for transition and turbulent flows are motivated, among other factors, by the need for accurate prediction of flow noise. The success of developing a high-speed civil transport (HSCT) plane is contingent upon our understanding and suppression of the jet exhaust noise. The radiated sound can be obtained directly by solving the full (time-dependent) compressible Navier-Stokes equations; however, this requires computational storage that is beyond currently available machines. This difficulty can be overcome by limiting the solution domain to the near field, where the jet is nonlinear, and then using an acoustic analogy (e.g., Lighthill) to relate the far-field noise to the near-field sources. The latter requires obtaining the time-dependent flow field. The other difficulty in aeroacoustics computations is that at high Reynolds numbers the turbulent flow has a large range of scales. Direct numerical simulations (DNS) cannot obtain all the scales of motion at the high Reynolds numbers of technological interest. However, it is believed that the large-scale structure is more efficient than the small-scale structure in radiating noise. Thus, one can model the small scales and calculate the acoustically active scales. Because the large-scale structure in the noise-producing initial region of the jet is wavelike in nature, the net radiated sound is the small residual left after cancellation upon integration over space. As such, aeroacoustics computations are highly sensitive to errors in computing the sound sources, and it is therefore essential to use a high-order numerical scheme to predict the flow field. The present paper presents the first step in an ongoing effort to predict jet noise. The emphasis here is on accurate prediction of the unsteady flow field. We solve the full time-dependent Navier-Stokes equations by a high-order finite difference method. Time-accurate spatial simulations of both plane and axisymmetric jets are presented. Jet Mach numbers of 1.5 and 2.1 are considered. The Reynolds number in the simulations was about one million. Our numerical model is based on the 2-4 scheme of Gottlieb and Turkel. Bayliss et al. applied the 2-4 scheme in boundary layer computations, and it was also used by Ragab and Sheen to study the nonlinear development of supersonic instability waves in a mixing layer. In this study, we present two-dimensional direct simulation results for both plane and axisymmetric jets and compare them with linear theory predictions. These computations were made for the near-nozzle-exit region, and the velocity in the spanwise/azimuthal direction was assumed to be zero.
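The 2-4 scheme referenced above pairs second-order time stepping with fourth-order spatial differencing. As an illustration of fourth-order spatial accuracy only (this is the standard central stencil, not necessarily the exact one-sided stencils of the Gottlieb-Turkel scheme):

```python
import math

def ddx_central4(f, i, h):
    # Standard fourth-order central difference for df/dx on a uniform grid.
    # Shown only to illustrate fourth-order spatial accuracy; not necessarily
    # the exact stencils used in the Gottlieb-Turkel 2-4 scheme.
    return (-f[i + 2] + 8 * f[i + 1] - 8 * f[i - 1] + f[i - 2]) / (12 * h)

# Quick check against a known derivative: d/dx sin(x) = cos(x).
h = 0.01
x = [h * j for j in range(400)]
f = [math.sin(v) for v in x]
approx = ddx_central4(f, 200, h)   # derivative near x = 2.0
```

The truncation error of this stencil scales as h^4, which is why such schemes keep the dispersive errors in the computed sound sources small.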
Large-Scale Spatial Distribution Patterns of Gastropod Assemblages in Rocky Shores
Miloslavich, Patricia; Cruz-Motta, Juan José; Klein, Eduardo; Iken, Katrin; Weinberger, Vanessa; Konar, Brenda; Trott, Tom; Pohle, Gerhard; Bigatti, Gregorio; Benedetti-Cecchi, Lisandro; Shirayama, Yoshihisa; Mead, Angela; Palomo, Gabriela; Ortiz, Manuel; Gobin, Judith; Sardi, Adriana; Díaz, Juan Manuel; Knowlton, Ann; Wong, Melisa; Peralta, Ana C.
2013-01-01
Gastropod assemblages from nearshore rocky habitats were studied over large spatial scales to (1) describe broad-scale patterns in assemblage composition, including patterns by feeding modes, (2) identify latitudinal patterns of biodiversity, i.e., richness and abundance of gastropods and/or regional hotspots, and (3) identify potential environmental and anthropogenic drivers of these assemblages. Gastropods were sampled from 45 sites distributed within 12 Large Marine Ecosystem regions (LME) following the NaGISA (Natural Geography in Shore Areas) standard protocol (www.nagisa.coml.org). A total of 393 gastropod taxa from 87 families were collected. Eight of these families (9.2%) appeared in four or more different LMEs. Among these, the Littorinidae was the most widely distributed (8 LMEs) followed by the Trochidae and the Columbellidae (6 LMEs). In all regions, assemblages were dominated by few species, the most diverse and abundant of which were herbivores. No latitudinal gradients were evident in relation to species richness or densities among sampling sites. Highest diversity was found in the Mediterranean and in the Gulf of Alaska, while highest densities were found at different latitudes and represented by few species within one genus (e.g. Afrolittorina in the Agulhas Current, Littorina in the Scotian Shelf, and Lacuna in the Gulf of Alaska). No significant correlation was found between species composition and environmental variables (r≤0.355, p>0.05). Variables contributing to this low correlation included invasive species, inorganic pollution, SST anomalies, and chlorophyll-a anomalies. Despite data limitations in this study that restrict conclusions in a global context, this work represents the first effort to sample gastropod biodiversity on rocky shores using a standardized protocol across a wide scale. Our results should motivate further work to build global databases, allowing for large-scale diversity comparisons of rocky intertidal assemblages.
PMID:23967204
Youngjohn, James R; Wershba, Rebecca; Stevenson, Matthew; Sturgeon, John; Thomas, Michael L
2011-04-01
The MMPI-2 Restructured Form (MMPI-2-RF; Ben-Porath & Tellegen, 2008) is replacing the MMPI-2 as the most widely used personality test in neuropsychological assessment, but additional validation studies are needed. Our study examines MMPI-2-RF Validity scales and the newly created Somatic/Cognitive scales in a recently reported sample of 82 traumatic brain injury (TBI) litigants who either passed or failed effort tests (Thomas & Youngjohn, 2009). The restructured Validity scales FBS-r (restructured symptom validity), F-r (restructured infrequent responses), and the newly created Fs (infrequent somatic responses) were not significant predictors of TBI severity. FBS-r was significantly related to passing or failing effort tests, and Fs and F-r showed non-significant trends in the same direction. Elevations on the Somatic/Cognitive scales profile (MLS-malaise, GIC-gastrointestinal complaints, HPC-head pain complaints, NUC-neurological complaints, and COG-cognitive complaints) were significant predictors of effort test failure. Additionally, HPC had the anticipated paradoxical inverse relationship with head injury severity. The Somatic/Cognitive scales as a group were better predictors of effort test failure than the RF Validity scales, which was an unexpected finding. MLS arose as the single best predictor of effort test failure of all RF Validity and Somatic/Cognitive scales. Item overlap analysis revealed that all MLS items are included in the original MMPI-2 Hy scale, making MLS essentially a subscale of Hy. This study validates the MMPI-2-RF as an effective tool for use in neuropsychological assessment of TBI litigants.
Osmundson, Todd W.; Robert, Vincent A.; Schoch, Conrad L.; Baker, Lydia J.; Smith, Amy; Robich, Giovanni; Mizzan, Luca; Garbelotto, Matteo M.
2013-01-01
Despite recent advances spearheaded by molecular approaches and novel technologies, species description and DNA sequence information are significantly lagging for fungi compared to many other groups of organisms. Large scale sequencing of vouchered herbarium material can aid in closing this gap. Here, we describe an effort to obtain broad ITS sequence coverage of the approximately 6000 macrofungal-species-rich herbarium of the Museum of Natural History in Venice, Italy. Our goals were to investigate issues related to large sequencing projects, develop heuristic methods for assessing the overall performance of such a project, and evaluate the prospects of such efforts to reduce the current gap in fungal biodiversity knowledge. The effort generated 1107 sequences submitted to GenBank, including 416 previously unrepresented taxa and 398 sequences exhibiting a best BLAST match to an unidentified environmental sequence. Specimen age and taxon affected sequencing success, and subsequent work on failed specimens showed that an ITS1 mini-barcode greatly increased sequencing success without greatly reducing the discriminating power of the barcode. Similarity comparisons and nonmetric multidimensional scaling ordinations based on pairwise distance matrices proved to be useful heuristic tools for validating the overall accuracy of specimen identifications, flagging potential misidentifications, and identifying taxa in need of additional species-level revision. Comparison of within- and among-species nucleotide variation showed a strong increase in species discriminating power at 1–2% dissimilarity, and identified potential barcoding issues (same sequence for different species and vice-versa). All sequences are linked to a vouchered specimen, and results from this study have already prompted revisions of species-sequence assignments in several taxa. PMID:23638077
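The 1-2% dissimilarity range discussed above is typically evaluated on uncorrected pairwise distances between aligned barcode sequences. A minimal sketch of such a p-distance computation (an assumed helper, not the study's actual pipeline):

```python
def p_distance(seq1, seq2):
    # Uncorrected pairwise dissimilarity (p-distance) between two aligned
    # sequences, ignoring gap positions. An assumed helper for illustration;
    # the study's actual distance computation is not specified at this level.
    pairs = [(a, b) for a, b in zip(seq1, seq2) if a != '-' and b != '-']
    return sum(a != b for a, b in pairs) / len(pairs)
```

Under this definition, the 1-2% species-discrimination range corresponds to p-distances of 0.01-0.02, and a matrix of such distances is the input to the nonmetric multidimensional scaling ordinations described above.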
Seel, Ronald T.; Corrigan, John D.; Dijkers, Marcel P.; Barrett, Ryan S.; Bogner, Jennifer; Smout, Randall J.; Garmoe, William; Horn, Susan D.
2016-01-01
Objective To describe patients' level of effort in occupational, physical, and speech therapy sessions during traumatic brain injury (TBI) inpatient rehabilitation and to evaluate how age, injury severity, cognitive impairment, and time are associated with effort. Design Prospective, multicenter, longitudinal cohort study. Setting Acute TBI rehabilitation programs. Participants Patients (N=1946) receiving 138,555 therapy sessions. Interventions Not applicable. Main Outcome Measures Effort in rehabilitation sessions rated on the Rehabilitation Intensity of Therapy Scale, FIM, Comprehensive Severity Index brain injury severity score, posttraumatic amnesia (PTA), and Agitated Behavior Scale (ABS). Results The Rehabilitation Intensity of Therapy Scale effort ratings in individual therapy sessions closely conformed to a normative distribution for all 3 disciplines. Mean Rehabilitation Intensity of Therapy Scale ratings for patients' therapy sessions were higher in the discharge week than in the admission week (P<.001). For patients who completed 2, 3, or 4 weeks of rehabilitation, differences in effort ratings (P<.001) were observed between 5 subgroups stratified by admission FIM cognitive scores and over time. In linear mixed-effects modeling, age and Comprehensive Severity Index brain injury severity score at admission, days from injury to rehabilitation admission, days from admission, and daily ratings of PTA and ABS score were predictors of level of effort (P<.0001). Conclusions Patients' level of effort can be observed and reliably rated in the TBI inpatient rehabilitation setting using the Rehabilitation Intensity of Therapy Scale. Patients who sustain TBI show varying levels of effort in rehabilitation therapy sessions, with effort tending to increase over the stay. PTA and agitated behavior are primary risk factors that substantially reduce patient effort in therapies. PMID:26212400
The Center for Multiscale Plasma Dynamics, Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gombosi, Tamas I.
The University of Michigan participated in the joint UCLA/Maryland fusion science center focused on plasma physics problems for which the traditional separation of the dynamics into microscale and macroscale processes breaks down. These processes involve large-scale flows and magnetic fields tightly coupled to the small-scale, kinetic dynamics of turbulence, particle acceleration and energy cascade. The interaction between these vastly disparate scales controls the evolution of the system. The enormous range of temporal and spatial scales associated with these problems renders direct simulation intractable, even in computations that use the largest existing parallel computers. Our efforts focused on two main problems: the development of Hall MHD solvers on solution-adaptive grids, and the development of solution-adaptive grids using generalized coordinates so that the proper geometry of inertial confinement can be taken into account and efficient refinement strategies can be obtained.
LAMMPS strong scaling performance optimization on Blue Gene/Q
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coffman, Paul; Jiang, Wei; Romero, Nichols A.
2014-11-12
LAMMPS ("Large-scale Atomic/Molecular Massively Parallel Simulator") is an open-source molecular dynamics package from Sandia National Laboratories. Significant performance improvements in strong scaling and time-to-solution for this application on IBM's Blue Gene/Q have been achieved through computational optimizations of the OpenMP versions of the short-range Lennard-Jones term of the CHARMM force field and the long-range Coulombic interaction implemented with the PPPM (particle-particle-particle mesh) algorithm, enhanced by runtime parameter settings controlling thread utilization. Additionally, MPI communication performance improvements were made to the PPPM calculation by re-engineering the parallel 3D FFT to use MPICH collectives instead of point-to-point messaging. Performance testing was done using an 8.4-million-atom simulation scaling up to 16 racks on the Mira system at the Argonne Leadership Computing Facility (ALCF). Speedups resulting from this effort were in some cases over 2x.
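Strong-scaling speedup and parallel efficiency of the kind reported above are computed from wall times for a fixed problem size at increasing node counts. A hedged sketch with hypothetical timings (not the Mira/LAMMPS measurements):

```python
def strong_scaling(baseline_nodes, timings):
    # Speedup and parallel efficiency for a fixed-size (strong-scaling) run.
    # `timings` maps node count -> wall time (s) for the same problem size.
    # The example numbers below are hypothetical, not the Mira results.
    t0 = timings[baseline_nodes]
    return {n: {"speedup": t0 / t,
                "efficiency": (t0 / t) * (baseline_nodes / n)}
            for n, t in timings.items()}

results = strong_scaling(1024, {1024: 800.0, 2048: 420.0, 4096: 230.0})
```

Efficiency below 1.0 at the largest node counts is typical in strong scaling, since the per-node work shrinks while communication (e.g., the parallel 3D FFT in PPPM) does not shrink proportionally.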
NASA Technical Reports Server (NTRS)
Groesbeck, D. E.; Huff, R. G.; Vonglahn, U. H.
1977-01-01
Small-scale circular, noncircular, single- and multi-element nozzles with flow areas as large as 122 sq cm were tested with cold airflow at exit Mach numbers from 0.28 to 1.15. The effects of multi-element nozzle shape and element spacing on jet Mach number decay were studied in an effort to reduce the noise caused by jet impingement on externally blown flap (EBF) STOL aircraft. The jet Mach number decay data are well represented by empirical relations. Jet spreading and Mach number decay contours are presented for all configurations tested.
Subsurface Monitoring of CO2 Sequestration - A Review and Look Forward
NASA Astrophysics Data System (ADS)
Daley, T. M.
2012-12-01
The injection of CO2 into subsurface formations is at least 50 years old with large-scale utilization of CO2 for enhanced oil recovery (CO2-EOR) beginning in the 1970s. Early monitoring efforts had limited measurements in available boreholes. With growing interest in CO2 sequestration beginning in the 1990's, along with growth in geophysical reservoir monitoring, small to mid-size sequestration monitoring projects began to appear. The overall goals of a subsurface monitoring plan are to provide measurement of CO2 induced changes in subsurface properties at a range of spatial and temporal scales. The range of spatial scales allows tracking of the location and saturation of the plume with varying detail, while finer temporal sampling (up to continuous) allows better understanding of dynamic processes (e.g. multi-phase flow) and constraining of reservoir models. Early monitoring of small scale pilots associated with CO2-EOR (e.g., the McElroy field and the Lost Hills field), developed many of the methodologies including tomographic imaging and multi-physics measurements. Large (reservoir) scale sequestration monitoring began with the Sleipner and Weyburn projects. Typically, large scale monitoring, such as 4D surface seismic, has limited temporal sampling due to costs. Smaller scale pilots can allow more frequent measurements as either individual time-lapse 'snapshots' or as continuous monitoring. Pilot monitoring examples include the Frio, Nagaoka and Otway pilots using repeated well logging, crosswell imaging, vertical seismic profiles and CASSM (continuous active-source seismic monitoring). For saline reservoir sequestration projects, there is typically integration of characterization and monitoring, since the sites are not pre-characterized resource developments (oil or gas), which reinforces the need for multi-scale measurements. As we move beyond pilot sites, we need to quantify CO2 plume and reservoir properties (e.g. 
pressure) over large scales, while still obtaining high resolution. Typically the high-resolution (spatial and temporal) tools are deployed in permanent or semi-permanent borehole installations, where special well design may be necessary, such as non-conductive casing for electrical surveys. Effective utilization of monitoring wells requires an approach of modular borehole monitoring (MBM), where multiple measurements can be made. An example is recent work at the Citronelle pilot injection site, where an MBM package with seismic, fluid sampling and distributed fiber sensing was deployed. For future large-scale sequestration monitoring, an adaptive borehole-monitoring program is proposed.
Ross, Victoria; Kõlves, Kairi; De Leo, Diego
2017-07-03
Given the important role teachers play as gatekeepers in school suicide prevention, this study explored teachers' perspectives on what should be done to improve current suicide prevention efforts. The study, in Queensland, Australia, was part of a large-scale survey examining teachers' knowledge, attitudes and experience of suicidality. One hundred and fifteen teachers responded to an online survey question regarding their views on the requirements for school suicide prevention. Qualitative analysis identified five themes from teachers' responses: awareness and stigma reduction, support services for students, education and training, bullying and the role of social media. The results of this study provide some profound insights into teachers' perspectives on suicide and highlight the critical need for improved suicide prevention efforts in schools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oubeidillah, Abdoul A; Kao, Shih-Chieh; Ashfaq, Moetasim
2014-01-01
To extend geographical coverage, refine spatial resolution, and improve modeling efficiency, a computation- and data-intensive effort was conducted to organize a comprehensive hydrologic dataset with post-calibrated model parameters for hydro-climate impact assessment. Several key inputs for hydrologic simulation, including meteorologic forcings, soil, land class, vegetation, and elevation, were collected from multiple best-available data sources and organized for 2107 hydrologic subbasins (8-digit hydrologic units, HUC8s) in the conterminous United States at refined 1/24° (~4 km) spatial resolution. Using high-performance computing for intensive model calibration, a high-resolution parameter dataset was prepared for the macro-scale Variable Infiltration Capacity (VIC) hydrologic model. The VIC simulation was driven by DAYMET daily meteorological forcing and was calibrated against USGS WaterWatch monthly runoff observations for each HUC8. The results showed that this new parameter dataset may help reasonably simulate runoff at most US HUC8 subbasins. Based on this exhaustive calibration effort, it is now possible to accurately estimate the resources required for further model improvement across the entire conterminous United States. We anticipate that through this hydrologic parameter dataset, the repeated effort of fundamental data processing can be lessened, so that research efforts can emphasize the more challenging task of assessing climate change impacts. The pre-organized model parameter dataset will be provided to interested parties to support further hydro-climate impact assessment.
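The calibration described above fits VIC parameters against observed monthly runoff for each HUC8. The abstract does not name its objective function; the Nash-Sutcliffe efficiency (NSE) sketched below is a common choice for runoff calibration and is purely illustrative, with made-up runoff values:

```python
def nse(simulated, observed):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit; 0.0 means the model
    predicts no better than the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    sq_err = sum((s - o) ** 2 for s, o in zip(simulated, observed))
    sq_dev = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sq_err / sq_dev

# monthly runoff in mm (illustrative values, not actual HUC8 data)
obs = [30.0, 45.0, 80.0, 60.0, 25.0]
sim = [28.0, 50.0, 75.0, 62.0, 30.0]
print(nse(sim, obs))
```

A calibration loop would adjust VIC parameters to push this score toward 1.0 for each subbasin.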
Guttery, Michael; Ribic, Christine; Sample, David W.; Paulios, Andy; Trosen, Chris; Dadisman, John D.; Schneider, Daniel; Horton, Josephine
2017-01-01
Context: Beyond the recognized importance of protecting large areas of contiguous habitat, conservation efforts for many species are complicated by the fact that patch suitability may also be affected by characteristics of the landscape within which the patch is located. Currently, little is known about the spatial scales at which species respond to different aspects of the landscape surrounding an occupied patch. Objectives: Using grassland bird point count data, we describe an approach to evaluating scale-specific effects of landscape composition on patch occupancy. Methods: We used data from 793 point count surveys conducted in idle and grazed grasslands across Wisconsin, USA from 2012 to 2014 to evaluate scale-dependencies in the response of grassland birds to landscape composition. Patch occupancy models were used to evaluate the relationship between occupancy and landscape composition at scales from 100 to 3000 m. Results: Bobolink (Dolichonyx oryzivorus) exhibited a pattern indicating selection for grassland habitats in the surrounding landscape at all spatial scales while selecting against other habitats. Eastern Meadowlark (Sturnella magna) displayed evidence of scale sensitivity for all habitat types. Grasshopper Sparrow (Ammodramus savannarum) showed a strong positive response to pasture and idle grass at all scales and a negative response to cropland at large scales. Unlike other species, patch occupancy by Henslow’s Sparrow (A. henslowii) was primarily influenced by patch area. Conclusions: Our results suggest that both working grasslands (pasture) and idle conservation grasslands can play an important role in grassland bird conservation but also highlight the importance of considering species-specific patch and landscape characteristics for effective conservation.
NASA Astrophysics Data System (ADS)
Vrieling, Anton; Hoedjes, Joost C. B.; van der Velde, Marijn
2015-04-01
Efforts to map and monitor soil erosion need to account for the erratic nature of the soil erosion process. Soil erosion by water occurs on sloped terrain when erosive rainfall and consequent surface runoff impact soils that are not well protected by vegetation or other soil protective measures. Both rainfall erosivity and vegetation cover are highly variable through space and time. Due to data paucity and the relative ease of spatially overlaying geographical data layers into existing models like the USLE (Universal Soil Loss Equation), many studies and mapping efforts merely use average annual values for erosivity and vegetation cover as input. We first show that rainfall erosivity can be estimated from satellite precipitation data. We obtained average annual erosivity estimates from 15 years of 3-hourly TRMM Multi-satellite Precipitation Analysis (TMPA) data (1998-2012) using intensity-erosivity relationships. Our estimates showed a positive correlation (r = 0.84) with long-term annual erosivity values of 37 stations obtained from the literature. Using these TMPA erosivity retrievals, we demonstrate the large interannual variability, with maximum annual erosivity often exceeding two to three times the mean value, especially in semi-arid areas. We then calculate erosivity at a 10-day time step and combine this with vegetation cover development for selected locations in Africa using NDVI (normalized difference vegetation index) time series from SPOT VEGETATION. Although we do not integrate the data at this point, the joint analysis of both variables underscores the need to account jointly for erosivity and vegetation cover in large-scale erosion assessment and monitoring.
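Intensity-erosivity relationships of the kind the abstract alludes to map rainfall intensity to the kinetic energy of rain and accumulate an EI30-style event erosivity. The sketch below uses the Brown-Foster unit-energy form; using the peak 3-hourly intensity as a stand-in for the unresolvable 30-minute peak is an assumption of this sketch, which is why such retrievals must be calibrated against station erosivity, as the authors do:

```python
import math

def unit_energy(i_mm_h):
    """Unit kinetic energy of rainfall (MJ ha^-1 mm^-1), Brown-Foster form."""
    return 0.29 * (1.0 - 0.72 * math.exp(-0.05 * i_mm_h))

def event_erosivity(intensities_mm_h, dt_h=3.0):
    """EI30-style erosivity (MJ mm ha^-1 h^-1) for one storm observed as
    fixed 3-hourly intensities. The peak interval intensity serves as a
    proxy for the true 30-min peak, which 3-hourly data cannot resolve."""
    energy = sum(unit_energy(i) * i * dt_h for i in intensities_mm_h)
    return energy * max(intensities_mm_h)

storm = [2.0, 11.0, 5.0]  # mm/h over three consecutive 3-h intervals
print(event_erosivity(storm))
```

Summing event erosivities over a year and averaging over the record yields the mean annual erosivity compared against the 37 stations.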
The “Wireless Sensor Networks for City-Wide Ambient Intelligence (WISE-WAI)” Project
Casari, Paolo; Castellani, Angelo P.; Cenedese, Angelo; Lora, Claudio; Rossi, Michele; Schenato, Luca; Zorzi, Michele
2009-01-01
This paper gives a detailed technical overview of some of the activities carried out in the context of the “Wireless Sensor networks for city-Wide Ambient Intelligence (WISE-WAI)” project, funded by the Cassa di Risparmio di Padova e Rovigo Foundation, Italy. The main aim of the project is to demonstrate the feasibility of large-scale wireless sensor network deployments, whereby tiny objects integrating one or more environmental sensors (humidity, temperature, light intensity), a microcontroller and a wireless transceiver are deployed over a large area, which in this case involves the buildings of the Department of Information Engineering at the University of Padova. We will describe how the network is organized to provide full-scale automated functions, and which services and applications it is configured to provide. These applications include long-term environmental monitoring, alarm event detection and propagation, single-sensor interrogation, localization and tracking of objects, assisted navigation, as well as fast data dissemination services to be used, e.g., to rapidly re-program all sensors over-the-air. The organization of such a large testbed requires notable efforts in terms of communication protocols and strategies, whose design must pursue scalability, energy efficiency (while sensors are connected through USB cables for logging and debugging purposes, most of them will be battery-operated), as well as the capability to support applications with diverse requirements. These efforts, the description of a subset of the results obtained so far, and of the final objectives to be met are the scope of the present paper. PMID:22408513
Global Instrumental Seismic Catalog: earthquake relocations for 1900-present
NASA Astrophysics Data System (ADS)
Villasenor, A.; Engdahl, E.; Storchak, D. A.; Bondar, I.
2010-12-01
We present the current status of our efforts to produce a set of homogeneous earthquake locations and improved focal depths towards the compilation of a Global Catalog of instrumentally recorded earthquakes that will be complete down to the lowest magnitude threshold possible on a global scale and for the time period considered. This project is currently being carried out under the auspices of GEM (Global Earthquake Model). The resulting earthquake catalog will be a fundamental dataset not only for earthquake risk modeling and assessment on a global scale, but also for a large number of studies such as global and regional seismotectonics; the rupture zones and return times of large, damaging earthquakes; and the spatial-temporal pattern of moment release along seismic zones and faults. Our current goal is to relocate all earthquakes with available station arrival data using the following magnitude thresholds: M5.5 for 1964-present, M6.25 for 1918-1963, and M7.5 (complemented with significant events in continental regions) for 1900-1917. Phase arrival time data for earthquakes after 1963 are available in digital form from the International Seismological Centre (ISC). For earthquakes in the time period 1918-1963, phase data are obtained by scanning the printed International Seismological Summary (ISS) bulletins and applying optical character recognition routines. For earlier earthquakes we will collect phase data from individual station bulletins. We will illustrate some of the most significant results of this relocation effort, including aftershock distributions for large earthquakes, systematic differences in epicenter and depth with respect to previous locations, and examples of grossly mislocated events.
König, Stephan; Wubet, Tesfaye; Dormann, Carsten F.; Hempel, Stefan; Renker, Carsten; Buscot, François
2010-01-01
Large-scale (temporal and/or spatial) molecular investigations of the diversity and distribution of arbuscular mycorrhizal fungi (AMF) require considerable sampling efforts and high-throughput analysis. To facilitate such efforts, we have developed a TaqMan real-time PCR assay to detect and identify AMF in environmental samples. First, we screened the diversity in clone libraries, generated by nested PCR, of the nuclear ribosomal DNA internal transcribed spacer (ITS) of AMF in environmental samples. We then generated probes and forward primers based on the detected sequences, enabling AMF sequence type-specific detection in TaqMan multiplex real-time PCR assays. In comparisons to conventional clone library screening and Sanger sequencing, the TaqMan assay approach provided similar accuracy but higher sensitivity with cost and time savings. The TaqMan assays were applied to analyze the AMF community composition within plots of a large-scale plant biodiversity manipulation experiment, the Jena Experiment, primarily designed to investigate the interactive effects of plant biodiversity on element cycling and trophic interactions. The results show that environmental variables hierarchically shape AMF communities and that the sequence type spectrum is strongly affected by previous land use and disturbance, which appears to favor disturbance-tolerant members of the genus Glomus. The AMF species richness of disturbance-associated communities can be largely explained by richness of plant species and plant functional groups, while plant productivity and soil parameters appear to have only weak effects on the AMF community. PMID:20418424
Silva, Déborah R O; Ligeiro, Raphael; Hughes, Robert M; Callisto, Marcos
2016-06-01
Taxonomic richness is one of the most important measures of biological diversity in ecological studies, including those with stream macroinvertebrates. However, it is impractical to measure the true richness of any site directly by sampling. Our objective was to evaluate the effect of sampling effort on estimates of macroinvertebrate family and Ephemeroptera, Plecoptera, and Trichoptera (EPT) genera richness at two scales: basin and stream site. In addition, we tried to determine which environmental factors at the site scale most influenced the amount of sampling effort needed. We sampled 39 sites in the Cerrado biome (neotropical savanna). In each site, we obtained 11 equidistant samples of the benthic assemblage and multiple physical habitat measurements. The observed basin-scale richness was consistent with estimates from the Chao 1, Jack 1, and Jack 2 richness estimators. However, at the site scale, the observed number of taxa increased steadily with the number of samples. Models that best explained the slope of site-scale sampling curves (representing the need for greater sampling effort) included metrics that describe habitat heterogeneity, habitat structure, anthropogenic disturbance, and water quality, for both macroinvertebrate family and EPT genera richness. Our results demonstrate the importance of considering basin- and site-scale sampling effort in ecological surveys and show that taxa accumulation curves and richness estimators are good tools for assessing sampling efficiency. Physical habitat explained a significant amount of the sampling effort needed. Therefore, future studies should explore the possible implications of physical habitat characteristics when developing sampling objectives and study designs, and when calculating the needed sampling effort.
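Of the estimators named above, Chao 1 has a particularly compact form: it extrapolates observed richness using the counts of taxa seen exactly once (singletons) and exactly twice (doubletons). A minimal sketch with illustrative abundances:

```python
def chao1(sample_counts):
    """Chao 1 asymptotic richness estimate from per-taxon abundances."""
    counts = [c for c in sample_counts if c > 0]
    s_obs = len(counts)                    # observed richness
    f1 = sum(1 for c in counts if c == 1)  # singletons
    f2 = sum(1 for c in counts if c == 2)  # doubletons
    if f2 > 0:
        return s_obs + f1 * f1 / (2 * f2)
    # bias-corrected form when no doubletons are present
    return s_obs + f1 * (f1 - 1) / 2

# e.g. 6 observed taxa, two singletons, one doubleton
abundances = [10, 7, 4, 2, 1, 1]
print(chao1(abundances))  # 6 + 2*2/(2*1) = 8.0
```

Agreement between such asymptotic estimates and the observed count, as the authors found at the basin scale, suggests sampling was near-exhaustive at that scale.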
Advanced Distillation Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maddalena Fanelli; Ravi Arora; Annalee Tonkovich
2010-03-24
The Advanced Distillation project was concluded on December 31, 2009. This U.S. Department of Energy (DOE) funded project was completed successfully and within budget during a timeline approved by DOE project managers, which included a one year extension to the initial ending date. The subject technology, Microchannel Process Technology (MPT) distillation, was expected to provide both capital and operating cost savings compared to conventional distillation technology. With efforts from Velocys and its project partners, MPT distillation was successfully demonstrated at a laboratory scale and its energy savings potential was calculated. While many objectives established at the beginning of the project were met, the project was only partially successful. At the conclusion, it appears that MPT distillation is not a good fit for the targeted separation of ethane and ethylene in large-scale ethylene production facilities, as greater advantages were seen for smaller scale distillations. Early in the project, work involved flowsheet analyses to discern the economic viability of ethane-ethylene MPT distillation and develop strategies for maximizing its impact on the economics of the process. This study confirmed that through modification to standard operating processes, MPT can enable net energy savings in excess of 20%. This advantage was used by ABB Lumus to determine the potential impact of MPT distillation on the ethane-ethylene market. The study indicated that a substantial market exists if the energy saving could be realized and if installed capital cost of MPT distillation was on par or less than conventional technology. Unfortunately, it was determined that the large number of MPT distillation units needed to perform ethane-ethylene separation for world-scale ethylene facilities makes the targeted separation a poor fit for the technology in this application at the current state of manufacturing costs.
Over the course of the project, distillation experiments were performed with the targeted mixture, ethane-ethylene, as well as with analogous low relative volatility systems: cyclohexane-hexane and cyclopentane-pentane. Devices and test stands were specifically designed for these efforts. Development progressed from experiments and models considering sections of a full-scale device to the design, fabrication, and operation of a single-channel distillation unit with integrated heat transfer. Throughout the project, analytical and numerical models and Computational Fluid Dynamics (CFD) simulations were validated with experiments in the process of developing this platform technology. Experimental trials demonstrated steady and controllable distillation for a variety of process conditions. Values of Height Equivalent to a Theoretical Plate (HETP) ranging from less than 0.5 inch to a few inches were experimentally proven, demonstrating a ten-fold performance enhancement relative to conventional distillation. This improvement, while substantial, is not sufficient for MPT distillation to displace very large scale distillation trains. Fortunately, parallel efforts in the area of business development have yielded other applications for MPT distillation, including smaller scale separations that benefit from the flowsheet flexibility offered by the technology. Talks with multiple potential partners are underway. Their outcome will also help determine the path ahead for MPT distillation.
Alecu, I M; Zheng, Jingjing; Zhao, Yan; Truhlar, Donald G
2010-09-14
Optimized scale factors for calculating vibrational harmonic and fundamental frequencies and zero-point energies have been determined for 145 electronic model chemistries, including 119 based on approximate functionals depending on occupied orbitals, 19 based on single-level wave function theory, three based on the neglect-of-diatomic-differential-overlap, two based on doubly hybrid density functional theory, and two based on multicoefficient correlation methods. Forty of the scale factors are obtained from large databases, which are also used to derive two universal scale factor ratios that can be used to interconvert between scale factors optimized for various properties, enabling the derivation of three key scale factors at the effort of optimizing only one of them. A reduced scale factor optimization model is formulated in order to further reduce the cost of optimizing scale factors, and the reduced model is illustrated by using it to obtain 105 additional scale factors. Using root-mean-square errors from the values in the large databases, we find that scaling reduces errors in zero-point energies by a factor of 2.3 and errors in fundamental vibrational frequencies by a factor of 3.0, but it reduces errors in harmonic vibrational frequencies by only a factor of 1.3. It is shown that, upon scaling, the balanced multicoefficient correlation method based on coupled cluster theory with single and double excitations (BMC-CCSD) can lead to very accurate predictions of vibrational frequencies. With a polarized, minimally augmented basis set, the density functionals with zero-point energy scale factors closest to unity are MPWLYP1M (1.009), τHCTHhyb (0.989), BB95 (1.012), BLYP (1.013), BP86 (1.014), B3LYP (0.986), MPW3LYP (0.986), and VSXC (0.986).
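The "universal scale factor ratios" mentioned above mean that one optimized factor yields the other two by simple multiplication. A minimal sketch; the ratio values here are illustrative placeholders, not the fitted constants from the paper:

```python
def derived_scale_factors(lambda_zpe, r_fund=0.974, r_harm=1.014):
    """Given one optimized zero-point-energy (ZPE) scale factor, derive the
    fundamental- and harmonic-frequency scale factors via universal ratios.
    The default ratio values are placeholders for illustration only."""
    return {
        "zpe": lambda_zpe,
        "fundamental": lambda_zpe * r_fund,
        "harmonic": lambda_zpe * r_harm,
    }

# e.g. starting from the B3LYP ZPE scale factor quoted in the abstract
print(derived_scale_factors(0.986))
```

This is how optimizing a single scale factor per model chemistry can stand in for three separate optimizations.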
Tracing Large Scale Structure with a Redshift Survey of Rich Clusters of Galaxies
NASA Astrophysics Data System (ADS)
Batuski, D.; Slinglend, K.; Haase, S.; Hill, J. M.
1993-12-01
Rich clusters of galaxies from Abell's catalog show evidence of structure on scales of 100 Mpc and hold promise of confirming the existence of structure in the more immediate universe on scales corresponding to COBE results (i.e., on the order of 10% or more of the horizon size of the universe). However, most Abell clusters do not as yet have measured redshifts (or, in the case of most low redshift clusters, have only one or two galaxies measured), so present knowledge of their three dimensional distribution has quite large uncertainties. The shortage of measured redshifts for these clusters may also mask a problem of projection effects corrupting the membership counts for the clusters, perhaps even to the point of spurious identifications of some of the clusters themselves. Our approach in this effort has been to use the MX multifiber spectrometer to measure redshifts of at least ten galaxies in each of about 80 Abell cluster fields with richness class R>= 1 and mag10 <= 16.8. This work will result in a somewhat deeper, much more complete (and reliable) sample of positions of rich clusters. Our primary use for the sample is for two-point correlation and other studies of the large scale structure traced by these clusters. We are also obtaining enough redshifts per cluster so that a much better sample of reliable cluster velocity dispersions will be available for other studies of cluster properties. To date, we have collected such data for 40 clusters, and for most of them, we have seven or more cluster members with redshifts, allowing for reliable velocity dispersion calculations. Velocity histograms for several interesting cluster fields are presented, along with summary tables of cluster redshift results. 
Also, with 10 or more redshifts in most of our cluster fields (30′ square, just about an 'Abell diameter' at z ~ 0.1), we have investigated the extent of projection effects within the Abell catalog in an effort to quantify and understand how this may affect the Abell sample.
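Cluster samples like this feed two-point correlation studies; the simplest pair-count estimator, the Peebles-Hauser "natural" estimator, compares data-data pair counts against those of an unclustered random catalog with the same geometry. A minimal sketch of the idea, not the authors' actual pipeline:

```python
import itertools
import math

def pair_counts(points, edges):
    """Histogram of pairwise separations into distance bins."""
    counts = [0] * (len(edges) - 1)
    for p, q in itertools.combinations(points, 2):
        r = math.dist(p, q)
        for k in range(len(counts)):
            if edges[k] <= r < edges[k + 1]:
                counts[k] += 1
                break
    return counts

def xi_natural(data, randoms, edges):
    """Peebles-Hauser estimator xi(r) = norm * DD/RR - 1, where norm
    corrects for the different sizes of the data and random catalogs."""
    dd = pair_counts(data, edges)
    rr = pair_counts(randoms, edges)
    nd, nr = len(data), len(randoms)
    norm = (nr * (nr - 1)) / (nd * (nd - 1))
    return [norm * d / r - 1.0 if r else float("nan")
            for d, r in zip(dd, rr)]
```

An unclustered data set drawn from the same geometry as the randoms gives xi near 0 in every bin; an excess of close pairs, as rich clusters show on ~100 Mpc scales, drives xi above 0 at small separations.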
NASA Astrophysics Data System (ADS)
Unal, Caner; Peloso, Marco; Sorbo, Lorenzo; Garcia-Bellido, Juan
2017-01-01
A strong experimental effort is ongoing to detect the primordial gravitational waves (GW) generated during inflation from their impact on the Cosmic Microwave Background (CMB). This effort is motivated by the direct relation between the amplitude of the GW signal and the energy scale of inflation, in the standard case of GW production from vacuum. I will discuss the robustness of this relation and the conditions under which particle production mechanisms during inflation can generate a stronger GW signal than the vacuum one. I will present a concrete model employing a coupling between a rolling axion and a gauge field, that can produce a detectable GW signal for an arbitrarily small inflation scale, respecting bounds from back-reaction, perturbativity, and the Gaussianity of the measured density perturbations. I will show how the GW produced by this mechanism can be distinguished from the vacuum ones by their spectral dependence and statistical properties. I will finally discuss the possibility of detecting an inflationary GW signal at terrestrial (AdvLIGO) and space (LISA) interferometers. Such experiments are sensitive to modes much smaller than the ones corresponding to the CMB and Large Scale Structure, presenting a unique observational window on the final stages of inflation. The work of C.U. is supported by a Doctoral Dissertation Fellowship from the Graduate School of the University of Minnesota.
NASA Astrophysics Data System (ADS)
Arndt, Roger; Chamorro, Leonardo; Sotiropoulos, Fotis
2010-11-01
Skin friction drag reduction through the use of riblets has been a topic of intensive research over recent decades. Efforts have focused on both numerical (mainly DNS) and experimental approaches. In spite of these valuable efforts, the fundamental mechanisms that induce drag reduction are not well established. In this study, wind tunnel experiments were performed to quantify the drag reduction in a wind turbine airfoil using different V-groove riblet structures. A full-scale 2.5 MW Clipper wind turbine airfoil section (of 1 meter chord length, typical of the 88% blade span) was placed in the freestream flow of the wind tunnel at the Saint Anthony Falls Laboratory, University of Minnesota. Four different sizes of V-groove riblets were tested at different angles of attack at a full-scale Reynolds number of Re = 2.67×10^6 (based on the airfoil chord length). Force sensors were used to measure lift and drag. A combination of single and cross-wire anemometers was also used to study the turbulent scale-to-scale interaction in the near-wall region to better understand the physical mechanisms of drag reduction and flow characteristics in that region. The measurements will be used to develop and test the performance of near-wall boundary conditions in the context of RANS and hybrid RANS/LES models.
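As a sanity check on the quoted chord Reynolds number, one can back out the implied freestream speed; the kinematic viscosity used below is an assumed nominal value for air, not a figure from the abstract:

```python
nu_air = 1.5e-5   # m^2/s, kinematic viscosity of air near 20 C (assumed)
chord = 1.0       # m, airfoil chord length from the abstract
Re = 2.67e6       # chord Reynolds number from the abstract

# Re = U * chord / nu  =>  U = Re * nu / chord
U = Re * nu_air / chord  # freestream speed, m/s (about 40 m/s)
print(U)
```

A roughly 40 m/s freestream is consistent with matching full-scale blade-section conditions in a large wind tunnel.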
ERIC Educational Resources Information Center
Bilgin, Aysegül; Balbag, Mustafa Zafer
2016-01-01
This study has developed "Personal Professional Development Efforts Scale for Science and Technology Teachers Regarding Their Fields". Exploratory factor analysis of the scale has been conducted based on the data collected from 200 science and technology teachers across Turkey. The scale has been observed through varimax rotation method,…
The Structural Heat Intercept-Insulation-Vibration Evaluation Rig (SHIVER)
NASA Technical Reports Server (NTRS)
Johnson, W. L.; Zoeckler, J. G.; Best-Ameen, L. M.
2015-01-01
NASA is currently investigating methods to reduce the boil-off rate on large cryogenic upper stages. Two such methods to reduce the total heat load on existing upper stages are vapor cooling of the cryogenic tank support structure and integration of thick multilayer insulation (MLI) systems into the upper stage of a launch vehicle. Previous efforts have flown a 2-layer MLI blanket and shown improved thermal performance, and other efforts have ground-tested blankets up to 70 layers thick on tanks with diameters of 2-3 meters. However, thick multilayer insulation installation and testing in both thermal and structural modes has not been completed on a large-scale tank. Similarly, multiple vapor-cooled shields are commonplace on science payload helium dewars; however, minimal effort has gone into intercepting heat on large structural surfaces associated with rocket stages. A majority of the vapor cooling effort focuses on metallic cylinders called skirts, which are the most common structural components for launch vehicles. In order to provide test data for comparison with analytical models, a representative test tank is currently being designed to include skirt structural systems with integral vapor cooling. The tank is 4 m in diameter and 6.8 m tall to contain 5000 kg of liquid hydrogen. A multilayer insulation system will be designed to insulate the tank and structure while being installed in a representative manner that can be extended to tanks up to 10 meters in diameter. In order to prove that the insulation system and vapor cooling attachment methods are structurally sound, acoustic testing will also be performed on the system. The test tank with insulation and vapor-cooled shield installed will be tested thermally in the B-2 test facility at NASA's Plum Brook Station both before and after being vibration tested at Plum Brook's Space Power Facility.
An Engineering Methodology for Implementing and Testing VLSI (Very Large Scale Integrated) Circuits
1989-03-01
the pad frame and associated routing, conducted additional testing, and submitted the finished design effort to MOSIS for manufacturing. Throughout...register bank. TSTCON: allows the XNOR circuitry to enter the TEST register bank. PADIN: test signal to check operation of the input pad. VCC: power connection...MOSSIM II simulation program, but the design offered little observability within the circuit. The initial design used 35 pins of a 40-pin pad frame
Evaluating the Combined UUV Efforts in a Large-Scale Mine Warfare Environment
2015-03-01
...completed in 2015. Until the RMMV and AQS-20B issues are resolved, the RMS operational test cannot be completed onboard LCS (Seligman, 2015). These poor...available in the foreseeable future. Therefore, the Avenger-class ships will remain the main MCM platform (Seligman, 2014). The Avenger-class ships are
Lyman L. McDonald; Robert Bilby; Peter A. Bisson; Charles C. Coutant; John M. Epifanio; Daniel Goodman; Susan Hanna; Nancy Huntly; Erik Merrill; Brian Riddell; William Liss; Eric J. Loudenslager; David P. Philipp; William Smoker; Richard R. Whitney; Richard N. Williams
2007-01-01
The year 2006 marked two milestones in the Columbia River Basin and the Pacific Northwest region's efforts to rebuild its once great salmon and steelhead runs: the 25th anniversary of the creation of the Northwest Power and Conservation Council and the 10th anniversary of an amendment to the Northwest Power Act that formalized scientific peer review of the council...
Ammunition Suite for the FCS Multi-role Armament and Ammunition System (MRAAS)
2001-06-20
Cards Large Scale Gap Test (LSGT) Exploding Foil Initiator (EFI) Effort 19 Slow Burning Layer Fast Burning Layer FASTCORE Nitramines ETPEs RDX CL20...Center Burst Charge 48 M80 Grenades With Center Burst Charge • Trade off performance with size, weight, etc. • Develop initial space claim for...submunition • Dynamic analysis of projectile for different submunitions. MRAAS trades underway • Accomplishments: initial meetings with TRADOC, Ft Knox and Ft
Eleventh Street and Bronx frontier: urban pioneering with wind power
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurwood, D.L.
1981-01-01
Wind energy is being applied to electricity generation at two locations in New York City. These small-scale systems (2 kW and 40 kW) are pioneering efforts contrasting with large wind turbines (such as the 2 MW experimental DOE-NASA unit in the Blue Ridge Mountains near Boone, N.C.) in that they are located in an urban setting and represent initiatives by neighborhood associations and community groups rather than by government or utilities. 54 refs.
Nonproliferation and Threat Reduction Assistance: U.S, Programs in the Former Soviet Union
2008-03-26
reconfigure its large-scale former BW-related facilities so that they can perform peaceful research on issues such as infectious diseases. For FY2004, the Bush...program to eliminate its plutonium, opting instead for the construction of fast breeder reactors that could burn plutonium directly for energy production...The United States might not fund this effort, as many in the United States argue that breeder reactors, which produce more plutonium than they
A role for shellfish aquaculture in coastal nitrogen management.
Rose, Julie M; Bricker, Suzanne B; Tedesco, Mark A; Wikfors, Gary H
2014-01-01
Excess nutrients in the coastal environment have been linked to a host of environmental problems, and nitrogen reduction efforts have been a top priority of resource managers for decades. The use of shellfish for coastal nitrogen remediation has been proposed, but formal incorporation into nitrogen management programs is lagging. Including shellfish aquaculture in existing nitrogen management programs makes sense from environmental, economic, and social perspectives, but challenges must be overcome for large-scale implementation to be possible.
ERIC Educational Resources Information Center
Huang, Denise; Cho, Jamie; Mostafavi, Sima; Nam, Hannah H.; Oh, Christine; Harven, Aletha; Leon, Seth
2010-01-01
In an effort to identify and incorporate exemplary practices into existing and future afterschool programs, the U.S. Department of Education commissioned a large-scale evaluation of the 21st Century Community Learning Center (CCLC) program. The purpose of this evaluation project was to develop resources and professional development that addresses…
Quantification of fossil fuel CO2 emissions on the building/street scale for a large U.S. city.
Gurney, Kevin R; Razlivanov, Igor; Song, Yang; Zhou, Yuyu; Benes, Bedrich; Abdul-Massih, Michel
2012-11-06
In order to advance the scientific understanding of carbon exchange with the land surface, build an effective carbon monitoring system, and contribute to quantitatively based U.S. climate change policy interests, fine spatial and temporal quantification of fossil fuel CO2 emissions, the primary greenhouse gas, is essential. Called the "Hestia Project", this research effort is the first to use bottom-up methods to quantify all fossil fuel CO2 emissions down to the scale of individual buildings, road segments, and industrial/electricity production facilities on an hourly basis for an entire urban landscape. Here, we describe the methods used to quantify the on-site fossil fuel CO2 emissions across the city of Indianapolis, IN. This effort combines a series of data sets and simulation tools such as a building energy simulation model, traffic data, power production reporting, and local air pollution reporting. The system is general enough to be applied to any large U.S. city and holds tremendous potential as a key component of a carbon-monitoring system in addition to enabling efficient greenhouse gas mitigation and planning. We compare the natural gas component of our fossil fuel CO2 emissions estimate to consumption data provided by the local gas utility. At the zip code level, we achieve a bias-adjusted Pearson r correlation value of 0.92 (p < 0.001).
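The abstract's validation step (rolling bottom-up, per-source estimates into zip-code totals and correlating them with utility data) can be sketched in a few lines. This is an illustrative sketch only, not Hestia code; the zip codes, sector labels, and tonnage values below are made-up placeholders.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical bottom-up estimates (tCO2), tagged by zip code and sector.
estimates = [
    ("46201", "buildings", 120.0), ("46201", "traffic", 80.0),
    ("46202", "buildings", 200.0), ("46202", "traffic", 50.0),
    ("46203", "buildings", 90.0),  ("46203", "traffic", 30.0),
]

# Aggregate the per-source estimates into zip-code totals.
totals = {}
for zip_code, _sector, tco2 in estimates:
    totals[zip_code] = totals.get(zip_code, 0.0) + tco2

# Hypothetical utility-reported consumption, converted to tCO2 per zip code.
utility = {"46201": 190.0, "46202": 260.0, "46203": 110.0}

zips = sorted(totals)
r = pearson_r([totals[z] for z in zips], [utility[z] for z in zips])
```

The same aggregate-then-correlate pattern applies at any spatial unit for which independent consumption data exist.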
Using standardized fishery data to inform rehabilitation efforts
Spurgeon, Jonathan J.; Stewart, Nathaniel T.; Pegg, Mark A.; Pope, Kevin L.; Porath, Mark T.
2016-01-01
Lakes and reservoirs progress through an aging process often accelerated by human activities, resulting in degradation or loss of ecosystem services. Resource managers thus attempt to slow or reverse the negative effects of aging using a myriad of rehabilitation strategies. Sustained monitoring programs to assess the efficacy of rehabilitation strategies are often limited; however, long-term standardized fishery surveys may be a valuable data source from which to begin evaluation. We present 3 case studies using standardized fishery survey data to assess rehabilitation efforts stemming from the Nebraska Aquatic Habitat Plan, a large-scale program with the mission to rehabilitate waterbodies within the state. The case studies highlight that biotic responses to rehabilitation efforts can be assessed, to an extent, using standardized fishery data; however, there were specific areas where minor increases in effort would clarify the effectiveness of rehabilitation techniques. Management of lakes and reservoirs can be streamlined by maximizing the utility of such datasets to work smarter, not harder. To facilitate such efforts, we stress collecting both biotic (e.g., fish lengths and weight) and abiotic (e.g., dissolved oxygen, pH, and turbidity) data during standardized fishery surveys and designing rehabilitation actions with an appropriate experimental design.
Informatics for Peru in the new millennium.
Karras, B T; Kimball, A M; Gonzales, V; Pautler, N A; Alarcón, J; Garcia, P J; Fuller, S
2001-01-01
As efforts continue to narrow the digital divide between the North and South, a new biomedical and health informatics training effort has been launched in Peru. This report describes the first year of work on this collaborative effort between the University of Washington (Seattle), Universidad Peruana Cayetano Heredia, and Universidad Nacional de San Marcos (Peru). The objective was to describe activities in the first year of a new International Research and Training Program in Biomedical and Health Informatics; the method was a descriptive analysis of key activities, including an assessment of the electronic environment through observation and survey, an in-country short course with quantitative evaluation, and a first round of recruitment of Peruvian scholars for long-term training in Seattle. A two-week short course on informatics was held in country, and participants' success in learning was demonstrated through pretest/posttest. A systematic assessment of the electronic environment in Peru was carried out, and two scholars were enrolled for long-term training at the University of Washington, Seattle. Initial activity in the collaborative training effort has been high. Of particular importance in this environment are the orchestration of efforts among interested parties with similar goals in Peru and the integration of informatics skills into ongoing large-scale research projects in country.
García-Grajales, Julián A.; Rucabado, Gabriel; García-Dopico, Antonio; Peña, José-María; Jérusalem, Antoine
2015-01-01
With the growing body of research on traumatic brain injury and spinal cord injury, computational neuroscience has recently focused its modeling efforts on neuronal functional deficits following mechanical loading. However, in most of these efforts, cell damage is generally only characterized by purely mechanistic criteria, functions of quantities such as stress, strain, or their corresponding rates. The modeling of functional deficits in neurites as a consequence of macroscopic mechanical insults has rarely been explored. In particular, a quantitative mechanically based model of electrophysiological impairment in neuronal cells, Neurite, has only very recently been proposed. In this paper, we present the implementation details of this model: a finite difference parallel program for simulating electrical signal propagation along neurites under mechanical loading. Following the application of a macroscopic strain at a given strain rate produced by a mechanical insult, Neurite is able to simulate the resulting neuronal electrical signal propagation, and thus the corresponding functional deficits. The simulation of the coupled mechanical and electrophysiological behaviors requires computationally expensive calculations that increase in complexity as the network of the simulated cells grows. The solvers implemented in Neurite (explicit and implicit) were therefore parallelized using graphics processing units in order to reduce the burden of the simulation costs of large-scale scenarios. Cable Theory and Hodgkin-Huxley models were implemented to account for the electrophysiological passive and active regions of a neurite, respectively, whereas a coupled mechanical model accounting for the neurite mechanical behavior within its surrounding medium was adopted as a link between electrophysiology and mechanics.
This paper provides the details of the parallel implementation of Neurite, along with three different application examples: a long myelinated axon, a segmented dendritic tree, and a damaged axon. The capabilities of the program to deal with large scale scenarios, segmented neuronal structures, and functional deficits under mechanical loading are specifically highlighted. PMID:25680098
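The passive (Cable Theory) part of such a solver reduces, in dimensionless form, to the cable equation dV/dt = d²V/dx² - V, which can be sketched with a simple explicit finite-difference scheme. This is a minimal illustration of the numerics the abstract describes, not Neurite's implementation; the grid size, time step, and clamped boundary conditions are arbitrary choices.

```python
# Explicit finite-difference solver for the dimensionless passive cable
# equation dV/dt = d2V/dx2 - V (membrane length and time constants set to 1).
# Illustrative sketch only, not Neurite's actual solver.
nx, dx, dt = 101, 0.1, 0.001       # dt/dx**2 = 0.1 <= 0.5, so the scheme is stable
v = [0.0] * nx
steps = 2000                       # integrate to t = 2 time constants

for _ in range(steps):
    v[0] = 1.0                     # voltage clamp at the injection site
    v[-1] = 0.0                    # far end approximated as grounded
    # Jacobi update: compute the discrete Laplacian from the old state first.
    lap = [(v[i + 1] - 2 * v[i] + v[i - 1]) / dx ** 2 for i in range(1, nx - 1)]
    for i in range(1, nx - 1):
        v[i] += dt * (lap[i - 1] - v[i])

# Near steady state the voltage decays roughly exponentially with distance,
# as expected for passive propagation along a leaky cable.
```

An implicit (backward Euler) variant of the same update removes the time-step stability restriction at the cost of a tridiagonal solve per step, which is one reason a production solver would offer both.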
2013-01-01
The Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial is a large-scale research effort conducted by the National Cancer Institute. PLCO offers an example of coordinated research by both the extramural and intramural communities of the National Institutes of Health. The purpose of this article is to describe the PLCO research resource and how it is managed and to assess the productivity and the costs associated with this resource. Such an in-depth analysis of a single large-scale project can shed light on questions such as how large-scale projects should be managed, what metrics should be used to assess productivity, and how costs can be compared with productivity metrics. A comprehensive publication analysis identified 335 primary research publications resulting from research using PLCO data and biospecimens from 2000 to 2012. By the end of 2012, a total of 9679 citations (excluding self-citations) have resulted from this body of research publications, with an average of 29.7 citations per article, and an h index of 45, which is comparable with other large-scale studies, such as the Nurses’ Health Study. In terms of impact on public health, PLCO trial results have been used by the US Preventive Services Task Force in making recommendations concerning prostate and ovarian cancer screening. The overall cost of PLCO was $454 million over 20 years, adjusted to 2011 dollars, with approximately $37 million for the collection, processing, and storage of biospecimens, including blood samples, buccal cells, and pathology tissues. PMID:24115361
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jager, Yetta; Forsythe, Patrick S.; McLaughlin, Robert L.
The majority of large North American rivers are fragmented by dams that interrupt migrations of wide-ranging fishes like sturgeons. Reconnecting habitat is viewed as an important means of protecting sturgeon species in U.S. rivers because these species have lost between 5% and 60% of their historical ranges. Unfortunately, facilities designed to pass other fishes have rarely worked well for sturgeons. The most successful passage facilities were sized appropriately for sturgeons and accommodated bottom-oriented species. For upstream passage, facilities with large entrances, full-depth guidance systems, large lifts, or wide fishways without obstructions or tight turns worked well. However, facilitating upstream migration is only half the battle. Broader recovery for linked sturgeon populations requires safe round-trip passage involving multiple dams. The most successful downstream passage facilities included nature-like fishways, large canal bypasses, and bottom-draw sluice gates. We outline an adaptive approach to implementing passage that begins with temporary programs and structures and monitors success both at the scale of individual fish at individual dams and the scale of metapopulations in a river basin. The challenge will be to learn from past efforts and reconnect North American sturgeon populations in a way that promotes range expansion and facilitates population recovery.
Reconnecting fragmented sturgeon populations in North American rivers
Jager, Henriette; Parsley, Michael J.; Cech, Joseph J. Jr.; McLaughlin, R.L.; Forsythe, Patrick S.; Elliott, Robert S.
2016-01-01
The majority of large North American rivers are fragmented by dams that interrupt migrations of wide-ranging fishes like sturgeons. Reconnecting habitat is viewed as an important means of protecting sturgeon species in U.S. rivers because these species have lost between 5% and 60% of their historical ranges. Unfortunately, facilities designed to pass other fishes have rarely worked well for sturgeons. The most successful passage facilities were sized appropriately for sturgeons and accommodated bottom-oriented species. For upstream passage, facilities with large entrances, full-depth guidance systems, large lifts, or wide fishways without obstructions or tight turns worked well. However, facilitating upstream migration is only half the battle. Broader recovery for linked sturgeon populations requires safe “round-trip” passage involving multiple dams. The most successful downstream passage facilities included nature-like fishways, large canal bypasses, and bottom-draw sluice gates. We outline an adaptive approach to implementing passage that begins with temporary programs and structures and monitors success both at the scale of individual fish at individual dams and the scale of metapopulations in a river basin. The challenge will be to learn from past efforts and reconnect North American sturgeon populations in a way that promotes range expansion and facilitates population recovery.
Accelerating large-scale protein structure alignments with graphics processing units
2012-01-01
Background Large-scale protein structure alignment, an indispensable tool to structural bioinformatics, poses a tremendous challenge to computational resources. To ensure structure alignment accuracy and efficiency, efforts have been made to parallelize traditional alignment algorithms in grid environments. However, these solutions are costly and of limited accessibility. Others trade alignment quality for speedup by using high-level characteristics of structure fragments for structure comparisons. Findings We present ppsAlign, a parallel protein structure Alignment framework designed and optimized to exploit the parallelism of Graphics Processing Units (GPUs). As a general-purpose GPU platform, ppsAlign could take many concurrent methods, such as TM-align and Fr-TM-align, into the parallelized algorithm design. We evaluated ppsAlign on an NVIDIA Tesla C2050 GPU card, and compared it with existing software solutions running on an AMD dual-core CPU. We observed a 36-fold speedup over TM-align, a 65-fold speedup over Fr-TM-align, and a 40-fold speedup over MAMMOTH. Conclusions ppsAlign is a high-performance protein structure alignment tool designed to tackle the computational complexity issues from protein structural data. The solution presented in this paper allows large-scale structure comparisons to be performed using the massively parallel computing power of GPUs. PMID:22357132
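For context, the score that TM-align-family methods optimize (and that a GPU port must evaluate many times in parallel) is compact enough to state directly. The sketch below uses the published Zhang-Skolnick normalization; the residue-pair distances are hypothetical inputs, and this is not ppsAlign code.

```python
def tm_score(distances, l_target):
    """TM-score of an alignment, given the aligned residue-pair distances
    (in angstroms) and the target-protein length used for normalization
    (Zhang & Skolnick normalization; illustrative sketch, not ppsAlign)."""
    d0 = max(1.24 * (l_target - 15) ** (1.0 / 3.0) - 1.8, 0.5)
    return sum(1.0 / (1.0 + (d / d0) ** 2) for d in distances) / l_target

# A perfect superposition of all residues scores exactly 1; fewer aligned
# pairs or larger distances lower the score.
perfect = tm_score([0.0] * 100, 100)
noisy = tm_score([2.0] * 80, 100)   # 80 of 100 residues aligned at 2 A error
```

Because each residue pair contributes independently to the sum, the inner loop is embarrassingly parallel, which is what makes the GPU mapping attractive.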
The Price of Precision: Large-Scale Mapping of Forest Structure and Biomass Using Airborne Lidar
NASA Astrophysics Data System (ADS)
Dubayah, R.
2015-12-01
Lidar remote sensing provides one of the best means for acquiring detailed information on forest structure. However, its application over large areas has been limited largely because of its expense. Nonetheless, extant data exist over many states in the U.S., funded largely by state and federal consortia and mainly for infrastructure, emergency response, flood plain and coastal mapping. These lidar data are almost always acquired in leaf-off seasons, and until recently, usually with low point count densities. Even with these limitations, they provide unprecedented wall-to-wall mappings that enable development of appropriate methodologies for large-scale deployment of lidar. In this talk we summarize our research and lessons learned in deriving forest structure over regional areas as part of NASA's Carbon Monitoring System (CMS). We focus on two areas: the entire state of Maryland and Sonoma County, California. The Maryland effort used low density, leaf-off data acquired by each county in varying epochs, while the on-going Sonoma work employs state-of-the-art, high density, wall-to-wall, leaf-on lidar data. In each area we combine these lidar coverages with high-resolution multispectral imagery from the National Agricultural Imagery Program (NAIP) and in situ plot data to produce maps of canopy height, tree cover and biomass, and compare our results against FIA plot data and national biomass maps. Our work demonstrates that large-scale mapping of forest structure at high spatial resolution is achievable but products may be complex to produce and validate over large areas. Furthermore, fundamental issues involving statistical approaches, plot types and sizes, geolocation, modeling scales, allometry, and even the definitions of "forest" and "non-forest" must be approached carefully. 
Ultimately, determining the "price of precision", that is, whether the value of wall-to-wall forest structure data justifies its expense, should consider not only carbon market applications but also the other ways the underlying lidar data may be used.
Using Unplanned Fires to Help Suppressing Future Large Fires in Mediterranean Forests
Regos, Adrián; Aquilué, Núria; Retana, Javier; De Cáceres, Miquel; Brotons, Lluís
2014-01-01
Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire–succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000–2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18–22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale. 
We do however suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire regimes are to be achieved, especially in the wider context of climate change. PMID:24727853
Using unplanned fires to help suppressing future large fires in Mediterranean forests.
Regos, Adrián; Aquilué, Núria; Retana, Javier; De Cáceres, Miquel; Brotons, Lluís
2014-01-01
Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire-succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000-2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18-22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale. 
We do however suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire regimes are to be achieved, especially in the wider context of climate change.
Genomic encyclopedia of bacteria and archaea: sequencing a myriad of type strains.
Kyrpides, Nikos C; Hugenholtz, Philip; Eisen, Jonathan A; Woyke, Tanja; Göker, Markus; Parker, Charles T; Amann, Rudolf; Beck, Brian J; Chain, Patrick S G; Chun, Jongsik; Colwell, Rita R; Danchin, Antoine; Dawyndt, Peter; Dedeurwaerdere, Tom; DeLong, Edward F; Detter, John C; De Vos, Paul; Donohue, Timothy J; Dong, Xiu-Zhu; Ehrlich, Dusko S; Fraser, Claire; Gibbs, Richard; Gilbert, Jack; Gilna, Paul; Glöckner, Frank Oliver; Jansson, Janet K; Keasling, Jay D; Knight, Rob; Labeda, David; Lapidus, Alla; Lee, Jung-Sook; Li, Wen-Jun; Ma, Juncai; Markowitz, Victor; Moore, Edward R B; Morrison, Mark; Meyer, Folker; Nelson, Karen E; Ohkuma, Moriya; Ouzounis, Christos A; Pace, Norman; Parkhill, Julian; Qin, Nan; Rossello-Mora, Ramon; Sikorski, Johannes; Smith, David; Sogin, Mitch; Stevens, Rick; Stingl, Uli; Suzuki, Ken-Ichiro; Taylor, Dorothea; Tiedje, Jim M; Tindall, Brian; Wagner, Michael; Weinstock, George; Weissenbach, Jean; White, Owen; Wang, Jun; Zhang, Lixin; Zhou, Yu-Guang; Field, Dawn; Whitman, William B; Garrity, George M; Klenk, Hans-Peter
2014-08-01
Microbes hold the key to life. They hold the secrets to our past (as the descendants of the earliest forms of life) and the prospects for our future (as we mine their genes for solutions to some of the planet's most pressing problems, from global warming to antibiotic resistance). However, the piecemeal approach that has defined efforts to study microbial genetic diversity for over 20 years and in over 30,000 genome projects risks squandering that promise. These efforts have covered less than 20% of the diversity of the cultured archaeal and bacterial species, which represent just 15% of the overall known prokaryotic diversity. Here we call for the funding of a systematic effort to produce a comprehensive genomic catalog of all cultured Bacteria and Archaea by sequencing, where available, the type strain of each species with a validly published name (currently ~11,000). This effort will provide an unprecedented level of coverage of our planet's genetic diversity, allow for the large-scale discovery of novel genes and functions, and lead to an improved understanding of microbial evolution and function in the environment.
Genomic Encyclopedia of Bacteria and Archaea: Sequencing a Myriad of Type Strains
Kyrpides, Nikos C.; Hugenholtz, Philip; Eisen, Jonathan A.; Woyke, Tanja; Göker, Markus; Parker, Charles T.; Amann, Rudolf; Beck, Brian J.; Chain, Patrick S. G.; Chun, Jongsik; Colwell, Rita R.; Danchin, Antoine; Dawyndt, Peter; Dedeurwaerdere, Tom; DeLong, Edward F.; Detter, John C.; De Vos, Paul; Donohue, Timothy J.; Dong, Xiu-Zhu; Ehrlich, Dusko S.; Fraser, Claire; Gibbs, Richard; Gilbert, Jack; Gilna, Paul; Glöckner, Frank Oliver; Jansson, Janet K.; Keasling, Jay D.; Knight, Rob; Labeda, David; Lapidus, Alla; Lee, Jung-Sook; Li, Wen-Jun; MA, Juncai; Markowitz, Victor; Moore, Edward R. B.; Morrison, Mark; Meyer, Folker; Nelson, Karen E.; Ohkuma, Moriya; Ouzounis, Christos A.; Pace, Norman; Parkhill, Julian; Qin, Nan; Rossello-Mora, Ramon; Sikorski, Johannes; Smith, David; Sogin, Mitch; Stevens, Rick; Stingl, Uli; Suzuki, Ken-ichiro; Taylor, Dorothea; Tiedje, Jim M.; Tindall, Brian; Wagner, Michael; Weinstock, George; Weissenbach, Jean; White, Owen; Wang, Jun; Zhang, Lixin; Zhou, Yu-Guang; Field, Dawn; Whitman, William B.; Garrity, George M.; Klenk, Hans-Peter
2014-01-01
Microbes hold the key to life. They hold the secrets to our past (as the descendants of the earliest forms of life) and the prospects for our future (as we mine their genes for solutions to some of the planet's most pressing problems, from global warming to antibiotic resistance). However, the piecemeal approach that has defined efforts to study microbial genetic diversity for over 20 years and in over 30,000 genome projects risks squandering that promise. These efforts have covered less than 20% of the diversity of the cultured archaeal and bacterial species, which represent just 15% of the overall known prokaryotic diversity. Here we call for the funding of a systematic effort to produce a comprehensive genomic catalog of all cultured Bacteria and Archaea by sequencing, where available, the type strain of each species with a validly published name (currently ~11,000). This effort will provide an unprecedented level of coverage of our planet's genetic diversity, allow for the large-scale discovery of novel genes and functions, and lead to an improved understanding of microbial evolution and function in the environment. PMID:25093819
Co-Optimization of Fuels and Engines (Co-Optima) -- Introduction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrell, John T; Wagner, Robert; Holladay, John
The Co-Optimization of Fuels and Engines (Co-Optima) initiative is a U.S. Department of Energy (DOE) effort funded by both the Vehicle and Bioenergy Technology Offices. The overall goal of the effort is to identify the combinations of fuel properties and engine characteristics that maximize efficiency, independent of production pathway or fuel composition, and accelerate commercialization of these technologies. Multiple research efforts are underway focused on both spark-ignition and compression-ignition strategies applicable across the entire light, medium, and heavy-duty fleet. A key objective of Co-Optima's research is to identify new blendstocks that enhance current petroleum blending components, increase blendstock diversity, and provide refiners with increased flexibility to blend fuels with the key properties required to optimize advanced internal combustion engines. In addition to fuels and engines R&D, the initiative is guided by analyses assessing the near-term commercial feasibility of new blendstocks based on economics, environmental performance, compatibility, and large-scale production viability. This talk will provide an overview of the Co-Optima effort.
Hakkenberg, C R; Zhu, K; Peet, R K; Song, C
2018-02-01
The central role of floristic diversity in maintaining habitat integrity and ecosystem function has propelled efforts to map and monitor its distribution across forest landscapes. While biodiversity studies have traditionally relied largely on ground-based observations, the immensity of the task of generating accurate, repeatable, and spatially-continuous data on biodiversity patterns at large scales has stimulated the development of remote-sensing methods for scaling up from field plot measurements. One such approach is through integrated LiDAR and hyperspectral remote-sensing. However, despite their efficiencies in cost and effort, LiDAR-hyperspectral sensors are still highly constrained in structurally- and taxonomically-heterogeneous forests - especially when species' cover is smaller than the image resolution, intertwined with neighboring taxa, or otherwise obscured by overlapping canopy strata. In light of these challenges, this study goes beyond the remote characterization of upper canopy diversity to instead model total vascular plant species richness in a continuous-cover North Carolina Piedmont forest landscape. We focus on two related, but parallel, tasks. First, we demonstrate an application of predictive biodiversity mapping, using nonparametric models trained with spatially-nested field plots and aerial LiDAR-hyperspectral data, to predict spatially-explicit landscape patterns in floristic diversity across seven spatial scales between 0.01 and 900 m². Second, we employ bivariate parametric models to test the significance of individual, remotely-sensed predictors of plant richness to determine how parameter estimates vary with scale. Cross-validated results indicate that predictive models were able to account for 15-70% of variance in plant richness, with LiDAR-derived estimates of topography and forest structural complexity, as well as spectral variance in hyperspectral imagery, explaining the largest portion of variance in diversity levels.
Importantly, bivariate tests provide evidence of scale-dependence among predictors, such that remotely-sensed variables significantly predict plant richness only at spatial scales that sufficiently subsume geolocational imprecision between remotely-sensed and field data, and best align with stand components including plant size and density, as well as canopy gaps and understory growth patterns. Beyond their insights into the scale-dependent patterns and drivers of plant diversity in Piedmont forests, these results highlight the potential of remotely-sensible essential biodiversity variables for mapping and monitoring landscape floristic diversity from air- and space-borne platforms. © 2017 by the Ecological Society of America.
Development of a Hybrid RANS/LES Method for Turbulent Mixing Layers
NASA Technical Reports Server (NTRS)
Georgiadis, Nicholas J.; Alexander, J. Iwan D.; Reshotko, Eli
2001-01-01
Significant research has been underway for several years in NASA Glenn Research Center's nozzle branch to develop advanced computational methods for simulating turbulent flows in exhaust nozzles. The primary efforts of this research have concentrated on improving our ability to calculate the turbulent mixing layers that dominate flows both in the exhaust systems of modern-day aircraft and in those of hypersonic vehicles under development. As part of these efforts, a hybrid numerical method was recently developed to simulate such turbulent mixing layers. The method developed here is intended for configurations in which a dominant structural feature provides an unsteady mechanism to drive the turbulent development in the mixing layer. Interest in Large Eddy Simulation (LES) methods has increased in recent years, but applying an LES method to calculate the wide range of turbulent scales from small eddies in the wall-bounded regions to large eddies in the mixing region is not yet possible with current computers. As a result, the hybrid method developed here uses a Reynolds-averaged Navier-Stokes (RANS) procedure to calculate wall-bounded regions entering a mixing section and uses a LES procedure to calculate the mixing-dominated regions. A numerical technique was developed to enable the use of the hybrid RANS-LES method on stretched, non-Cartesian grids. With this technique, closure for the RANS equations is obtained by using the Cebeci-Smith algebraic turbulence model in conjunction with the wall-function approach of Ota and Goldberg. The LES equations are closed using the Smagorinsky subgrid scale model. Although the function of the Cebeci-Smith model to replace all of the turbulent stresses is quite different from that of the Smagorinsky subgrid model, which only replaces the small subgrid turbulent stresses, both are eddy viscosity models and both are derived at least in part from mixing-length theory.
The similar formulation of these two models enables the RANS and LES equations to be solved with a single solution scheme and computational grid. The hybrid RANS-LES method has been applied to a benchmark compressible mixing layer experiment in which two isolated supersonic streams, separated by a splitter plate, provide the flows to a constant-area mixing section. Although the configuration is largely two dimensional in nature, three-dimensional calculations were found to be necessary to enable disturbances to develop in three spatial directions and to transition to turbulence. The flow in the initial part of the mixing section consists of a periodic vortex shedding downstream of the splitter plate trailing edge. This organized vortex shedding then rapidly transitions to a turbulent structure, which is very similar to the flow development observed in the experiments. Although the qualitative nature of the large-scale turbulent development in the entire mixing section is captured well by the LES part of the current hybrid method, further efforts are planned to directly calculate a greater portion of the turbulence spectrum and to limit the subgrid scale modeling to only the very small scales. This will be accomplished by the use of higher accuracy solution schemes and more powerful computers, measured both in speed and memory capabilities.
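The Smagorinsky closure mentioned above is easy to illustrate: the subgrid eddy viscosity is nu_t = (Cs * Delta)^2 * |S|, where |S| is the resolved strain-rate magnitude and Delta is the filter width. The sketch below applies it to a 1-D shear profile; the Smagorinsky constant, grid spacing, and velocity values are illustrative choices, not the configuration used in the paper.

```python
def smagorinsky_nu_t(u, dy, cs=0.17):
    """Smagorinsky eddy viscosity nu_t = (cs * delta)**2 * |du/dy| at the
    interior points of a 1-D velocity profile, taking the filter width
    delta equal to the grid spacing dy. Illustrative sketch only."""
    delta = dy
    return [
        (cs * delta) ** 2 * abs((u[i + 1] - u[i - 1]) / (2 * dy))
        for i in range(1, len(u) - 1)
    ]

# Uniform shear du/dy = 1 yields a constant eddy viscosity of (0.17)**2
# at every interior point.
nu_t = smagorinsky_nu_t([0.0, 1.0, 2.0, 3.0, 4.0], dy=1.0)
```

Because nu_t scales with the grid spacing squared, refining the mesh shrinks the modeled subgrid stresses, which is consistent with the paper's stated plan to resolve a greater portion of the turbulence spectrum directly as computing power grows.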
Species composition and morphologic variation of Porites in the Gulf of California
NASA Astrophysics Data System (ADS)
López-Pérez, R. A.
2013-09-01
Morphometric analysis of corallite calices confirmed that from the late Miocene to the Recent, four species of Porites have inhabited the Gulf of California: the extinct Porites carrizensis, the locally extirpated Porites lobata and the extant Porites sverdrupi and Porites panamensis. Furthermore, large-scale spatial and temporal phenotypic plasticity was observed in the dominant species P. panamensis. Canonical discriminant analysis and ANOVA demonstrated that the calice structures of P. panamensis experienced size reduction between the late Pleistocene and Recent. Similarly, PERMANOVA, regression and correlation analyses demonstrated that across the 800 km north to south in the gulf, P. panamensis populations displayed a similar reduction in calice structures. Based on correlation analysis with environmental data, these large spatial changes are likely related to changes in nutrient concentration and sea surface temperature. As such, the large-scale spatial and temporal phenotypic variation recorded in populations of P. panamensis in the Gulf of California is likely related to optimization of corallite performance (energy acquisition) within various environmental scenarios. These findings may have relevance to modern conservation efforts within this ecological dominant genus.