PHOSPHORUS RECOVERY FROM SEWAGE
Phosphorus is a growth-limiting nutrient that is mined from rock ore, refined, used in fertilizers, and discharged to the environment through municipal sewage. The impacts of phosphorus discharge include severe eutrophication of fresh water bodies. The future sustainable use of...
On the impact of a refined stochastic model for airborne LiDAR measurements
NASA Astrophysics Data System (ADS)
Bolkas, Dimitrios; Fotopoulos, Georgia; Glennie, Craig
2016-09-01
Accurate topographic information is critical for a number of applications in science and engineering. In recent years, airborne light detection and ranging (LiDAR) has become a standard tool for acquiring high quality topographic information. The assessment of airborne LiDAR derived DEMs is typically based on (i) independent ground control points and (ii) forward error propagation utilizing the LiDAR geo-referencing equation. The latter approach is dependent on the stochastic model information of the LiDAR observation components. In this paper, the well-known statistical tool of variance component estimation (VCE) is implemented for a dataset in Houston, Texas, in order to refine the initial stochastic information. Simulations demonstrate the impact of stochastic-model refinement for two practical applications, namely coastal inundation mapping and surface displacement estimation. Results highlight scenarios where erroneous stochastic information is detrimental. Furthermore, the refined stochastic information provides insights on the effect of each LiDAR measurement in the airborne LiDAR error budget. The latter is important for targeting future advancements in order to improve point cloud accuracy.
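A minimal sketch of the forward error propagation step referred to above (not the authors' implementation; the simplified 2D geometry and all variance values are assumptions for illustration) propagates a priori observation variances through a georeferencing function via a numerically estimated Jacobian:

    import numpy as np

    def georeference(obs):
        """Simplified 2D geo-referencing: sensor position (x, z),
        scan angle (rad), and range (m) -> ground point (X, Z)."""
        x, z, theta, rng = obs
        X = x + rng * np.sin(theta)
        Z = z - rng * np.cos(theta)
        return np.array([X, Z])

    def propagate(obs, sigma_obs):
        """First-order variance propagation: Sigma_point = J Sigma_obs J^T."""
        eps = 1e-6
        J = np.zeros((2, len(obs)))
        for i in range(len(obs)):
            d = np.zeros(len(obs)); d[i] = eps
            J[:, i] = (georeference(obs + d) - georeference(obs - d)) / (2 * eps)
        return J @ np.diag(sigma_obs**2) @ J.T

    obs = np.array([0.0, 1500.0, np.deg2rad(15.0), 1550.0])   # illustrative flight geometry
    sigma = np.array([0.05, 0.08, np.deg2rad(0.005), 0.03])   # assumed a priori standard deviations
    cov = propagate(obs, sigma)
    print("ground-point std. dev. (m):", np.sqrt(np.diag(cov)))

In the study, variance component estimation refines the assumed sigma values against the Houston dataset; the propagation above then shows how each refined component feeds into the point cloud error budget.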
Downy brome control and impacts on perennial grass abundance: a systematic review spanning 64 years
USDA-ARS's Scientific Manuscript database
Given the high cost of restoration and the underlying assumption that reducing annual grass abundance is a necessary precursor to rangeland restoration in the Intermountain West, USA, we sought to identify limitations and strengths of annual grass and woody plant reduction methods and refine future ...
The future of cerebral surgery: a kaleidoscope of opportunities.
Elder, James B; Hoh, Daniel J; Oh, Bryan C; Heller, A Chris; Liu, Charles Y; Apuzzo, Michael L J
2008-06-01
The emerging future of cerebral surgery will witness the refined evolution of current techniques, as well as the introduction of numerous novel concepts. Clinical practice and basic science research will benefit greatly from their application. The sum of these efforts will result in continued minimalism and improved accuracy and efficiency of neurosurgical diagnostic and therapeutic methodologies. Initially, the refinement of current technologies will further enhance various aspects of cerebral surgery. Advances in computing power and information technology will speed data acquisition, storage, and transfer. Miniaturization of current devices will impact diverse areas, such as modulation of endoscopy and endovascular techniques. The increased penetrance of surgical technologies such as stereotactic radiosurgery, neuronavigation, intraoperative imaging, and implantable electrodes for neurodegenerative disorders and epilepsy will enhance the knowledge and experience in these areas and facilitate refinements and advances in these technologies. Further into the future, technologies that are currently relatively remote to surgical events will fundamentally alter the complexity and scale at which a neurological disease may be treated or investigated. Seemingly futuristic concepts will become ubiquitous in the daily experience of the neurosurgeon. These include diverse fields such as nanotechnology, virtual reality, and robotics. Ultimately, combining advances in multiple fields will yield progress in diverse realms such as brain tumor therapy, neuromodulation for psychiatric diseases, and neuroprosthetics. Operating room equipment and design will benefit from each of the aforementioned advances. In this work, we discuss new developments in three parts. In Part I, concepts in minimalism important for future cerebral surgery are discussed. These include concrete and abstract ideas in miniaturization, as well as recent and future work in microelectromechanical systems and nanotechnology. Part II presents advances in computational sciences and technological fields dependent on these developments. Future breakthroughs in the components of the "computer," including data storage, electrical circuitry, and computing hardware and techniques, are discussed. Additionally, important concepts in the refinement of virtual environments and the brain-machine interface are presented, as their incorporation into cerebral surgery is closely linked to advances in computing and electronics. Finally, Part III offers insights into the future evolution of surgical and nonsurgical diagnostic and therapeutic modalities that are important for the future cerebral surgeon. A number of topics relevant to cerebral surgery are discussed, including the operative environment, imaging technologies, endoscopy, robotics, neuromodulation, stem cell therapy, radiosurgery, and technical methods of restoration of neural function. Cerebral surgery in the near and distant future will reflect the application of these emerging technologies. As this article indicates, the key to maximizing the impact of these advancements in the clinical arena is continued collaboration between scientists and neurosurgeons, as well as the emergence of a neurosurgeon whose scientific grounding and technical focus are far removed from those of his predecessors.
Xue, Mianqiang; Kendall, Alissa; Xu, Zhenming; Schoenung, Julie M
2015-01-20
For economic and societal reasons, informal activities including open burning, backyard recycling, and landfilling are still the prevailing methods used for electronic waste treatment in developing countries. Great efforts have been made, especially in China, to promote formal approaches for electronic waste management by enacting laws, developing green recycling technologies, initiating pilot programs, etc. The formal recycling process can, however, engender environmental impact and resource consumption, although information on the environmental loads and resource consumption is currently limited. To quantitatively assess the environmental impact of the processes in a formal printed wiring board (PWB) recycling chain, life cycle assessment (LCA) was applied to a formal recycling chain that includes the steps from waste liberation through materials refining. The metal leaching in the refining stage was identified as a critical process, contributing most of the environmental impact in the recycling chain. Global warming potential was the most significant environmental impact category after normalization and weighting, followed by fossil abiotic depletion potential, and marine aquatic eco-toxicity potential. Scenario modeling results showed that variations in the power source and chemical reagents consumption had the greatest influence on the environmental performance. The environmental impact from transportation used for PWB collection was also evaluated. The results were further compared to conventional primary metals production processes, highlighting the environmental benefit of metal recycling from waste PWBs. Optimizing the collection mode, increasing the precious metals recovery efficiency in the beneficiation stage and decreasing the chemical reagents consumption in the refining stage by effective materials liberation and separation are proposed as potential improvement strategies to make the recycling chain more environmentally friendly. The LCA results provide environmental information for the improvement of future integrated technologies and electronic waste management.
Developments in the Disposal of Residue from the Alumina Refining Industry
NASA Astrophysics Data System (ADS)
Cooling, D. J.
The disposal of residue forms an integral part of the alumina refining process. The refining of Western Australian bauxite, which is a low-grade ore by world standards, results in 2 dry tonnes of residue for every 1 tonne of alumina produced. The disposal of this residue contributes a significant proportion of the overall cost of producing alumina. The residue is also highly alkaline and, if not contained in sealed impoundment areas, can adversely affect the local environment. It has been these two considerations, the cost of disposal and the potential impact of disposal on the environment, which have been the main driving forces behind changes to the way residue is stored. This paper traces the various residue disposal techniques adopted by Alcoa of Australia Limited from containment in large settling ponds, to splitting the coarse and fine fractions for separate disposal, to the storage of the fine mud fraction in base-drained ponds, to the more recent pre-thickening of the fine mud fraction for disposal in solar drying ponds. The reasons for change and the problems encountered are reviewed, and possible future developments are discussed.
Romig, Barbara D; Tucker, Ann W; Hewitt, Anne M; O'Sullivan Maillet, Julie
2017-01-01
There is limited information and consensus on the future of clinical education and the key factors impacting allied health (AH) clinical training. AH deans identified both opportunities and challenges impacting clinical education based on a proposed educational model. From July 2013 to March 2014, 61 deans whose institutions were 2013 members of the Association of Schools of Allied Health Professions (ASAHP) participated in a three-round Delphi survey. Agreement on the relative importance of and the ability to impact the key factors was analyzed. Impact was evaluated for three groups: individual, collective, and both individual and collective deans. AH deans' responses were summarized and refined; individual items were accepted or rerated until agreement was achieved or the study concluded. Based on the deans' ratings of importance and impact, 159 key factors within 13 clinical education categories emerged as important for the future of clinical education. Agreement was achieved on 107 opportunities and 52 challenges. The Delphi technique generated new information where little existed specific to AH deans' perspectives on AH clinical education. This research supports the Key Factors Impacting Allied Health Clinical Education conceptual model proposed earlier and provides a foundation for AH deans to evaluate opportunities and challenges impacting AH clinical education and to design action plans based on this research.
Impact of Environmental Compliance Costs on U.S. Refining Profitability 1995-2001
2003-01-01
This report assesses the effects of pollution abatement requirements on the financial performance of U.S. petroleum refining and marketing operations during the 1995 to 2001 period. This study is a follow-up to the October 1997 publication entitled The Impact of Environmental Compliance Costs on U.S. Refining Profitability, which focused on the financial impacts of U.S. refining pollution abatement investment requirements in the 1988 to 1995 period.
Are Cellulose Nanofibers a Solution for a More Circular Economy of Paper Products?
Delgado-Aguilar, Marc; Tarrés, Quim; Pèlach, M Àngels; Mutjé, Pere; Fullana-I-Palmer, Pere
2015-10-20
This paper studies the feasibility of incorporating lignocellulosic nanofibers (LCNF) into paper in order to maintain the relevant physical properties and increase the number of cycles that paper can be recycled in the technosphere in a more circular economy. For that purpose, the effect of mechanical refining in recycling processes was compared with that of the novel LCNF addition. In this sense, the behavior of a bleached kraft hardwood pulp when recycled was investigated, as well as the effects of each methodology. Since there are many issues to be considered when trying to replace a technology, the present paper analyses its feasibility from a technical and environmental point of view. Technically, LCNF present greater advantages over mechanical refining, such as higher mechanical properties and longer fiber durability. A preliminary life cycle assessment showed that the environmental impacts of both systems are very similar; however, changing the boundary conditions to several feasible future scenarios demonstrated that CNF technology may significantly improve those impacts.
Crystallization in lactose refining-a review.
Wong, Shin Yee; Hartel, Richard W
2014-03-01
In the dairy industry, crystallization is an important separation process used in the refining of lactose from whey solutions. In the refining operation, lactose crystals are separated from the whey solution through nucleation, growth, and/or aggregation. The rate of crystallization is determined by the combined effect of crystallizer design, processing parameters, and impurities on the kinetics of the process. This review summarizes studies on lactose crystallization, including the mechanism, theory of crystallization, and the impact of various factors affecting the crystallization kinetics. In addition, an overview of the industrial crystallization operation highlights the problems faced by the lactose manufacturer. The approaches that are beneficial to the lactose manufacturer for process optimization or improvement are summarized in this review. Over the years, much knowledge has been acquired through extensive research. However, the industrial crystallization process is still far from optimized. Therefore, future effort should focus on transferring the new knowledge and technology to the dairy industry. © 2014 Institute of Food Technologists®
No smoke without fire: The impact of future friends on adolescent smoking behaviour.
Mercken, L; Candel, M; van Osch, L; de Vries, H
2011-02-01
This study examined the impact of future friends and the contribution of different social influence and selection processes in predicting adolescents' smoking behaviour by extending the theory of planned behaviour (TPB). We investigated the impact of previous smoking, direct pressure from friends, descriptive norms of present and future friends, smoking-based selection of future friends, and distinguished between reciprocal and desired friends. A longitudinal design with three measurements was used. METHODS: The sample consisted of 1,475 Dutch high school students (mean age = 12.7 years) who participated as a control group in the European Smoking prevention Framework Approach study at three measurements. Structural equation modelling revealed that adolescent smoking was influenced by intention, previous smoking, descriptive norms of parents and siblings, and that desired as well as reciprocal friends were selected based on similar smoking behaviour. Future friends indirectly influenced adolescent smoking through intention, as did attitude, subjective norms of parents and siblings, previous smoking, and descriptive norms of reciprocal friends and siblings. The present results suggest that descriptive norms and selection of friends need to be considered as major factors explaining smoking behaviour among adolescents besides the TPB components. These insights contribute to the further refinement of smoking prevention strategies. ©2010 The British Psychological Society.
Using scenarios to assess possible future impacts of invasive species in the Laurentian Great Lakes
Lauber, T. Bruce; Stedman, Richard C.; Connelly, Nancy A; Rudstam, Lars G.; Ready, Richard C; Poe, Gregory L; Bunnell, David B.; Hook, Tomas O.; Koops, Marten A.; Ludsin, Stuart A.; Rutherford, Edward S; Wittmann, Marion E.
2016-01-01
The expected impacts of invasive species are key considerations in selecting policy responses to potential invasions. But predicting the impacts of invasive species is daunting, particularly in large systems threatened by multiple invasive species, such as North America’s Laurentian Great Lakes. We developed and evaluated a scenario-building process that relied on an expert panel to assess possible future impacts of aquatic invasive species on recreational fishing in the Great Lakes. To maximize its usefulness to policy makers, this process was designed to be implemented relatively rapidly and consider a range of species. The expert panel developed plausible, internally-consistent invasion scenarios for 5 aquatic invasive species, along with subjective probabilities of those scenarios. We describe these scenarios and evaluate this approach for assessing future invasive species impacts. The panel held diverse opinions about the likelihood of the scenarios, and only one scenario with impacts on sportfish species was considered likely by most of the experts. These outcomes are consistent with the literature on scenario building, which advocates for developing a range of plausible scenarios in decision making because the uncertainty of future conditions makes the likelihood of any particular scenario low. We believe that this scenario-building approach could contribute to policy decisions about whether and how to address the possible impacts of invasive species. In this case, scenarios could allow policy makers to narrow the range of possible impacts on Great Lakes fisheries they consider and help set a research agenda for further refining invasive species predictions.
NASA Astrophysics Data System (ADS)
Buotte, P.; Law, B. E.; Hicke, J. A.; Hudiburg, T. W.; Levis, S.; Kent, J.
2017-12-01
Fire and beetle outbreaks can have substantial impacts on forest structure, composition, and function, and these types of disturbances are expected to increase in the future. Therefore, understanding the future ecological impacts of these disturbances is important. We used ecosystem process modeling to estimate the future occurrence of fire and beetle outbreaks and their impacts on forest resilience and carbon sequestration. We modified the Community Land Model (CLM4.5) to better represent forest growth and mortality in the western US through multiple avenues: 1) we increased the ecological resolution to recognize 14 forest types common to the region; 2) we improved CLM4.5's ability to handle drought stress by adding forest type-specific controls on stomatal conductance and increased rates of leaf shed during periods of low soil moisture; 3) we developed and implemented a mechanistic model of beetle population growth and subsequent tree mortality; 4) we modified the current fire module to account for more refined forest types; and 5) we developed multiple scenarios of harvest based on past harvest rates and proposed changes in land management policies. We ran CLM4.5 in offline mode with climate forcing data. We compare future forest growth rates and carbon sequestration with historical metrics to estimate the combined influence of future disturbances on forest composition and carbon sequestration in the western US.
Chen, Yushun; Viadero, Roger C; Wei, Xinchao; Fortney, Ronald; Hedrick, Lara B; Welsh, Stuart A; Anderson, James T; Lin, Lian-Shin
2009-01-01
Refining best management practices (BMPs) for future highway construction depends on a comprehensive understanding of environmental impacts from current construction methods. Based on a before-after-control impact (BACI) experimental design, long-term stream monitoring (1997-2006) was conducted at upstream (as control, n = 3) and downstream (as impact, n = 6) sites in the Lost River watershed of the Mid-Atlantic Highlands region, West Virginia. Monitoring data were analyzed to assess impacts during and after highway construction on 15 water quality parameters and macroinvertebrate condition using the West Virginia stream condition index (WVSCI). Principal components analysis (PCA) identified regional primary water quality variances, and paired t tests and time series analysis detected seven highway construction-impacted water quality parameters which were mainly associated with the second principal component. In particular, impacts on turbidity, total suspended solids, and total iron during construction, impacts on chloride and sulfate during and after construction, and impacts on acidity and nitrate after construction were observed at the downstream sites. The construction had statistically significant impacts on macroinvertebrate index scores (i.e., WVSCI) after construction, but did not change the overall good biological condition. Implementing BMPs that address those construction-impacted water quality parameters can be an effective mitigation strategy for future highway construction in this highlands region.
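The paired upstream-downstream comparison at the heart of such a BACI-style analysis can be illustrated with a toy example (hypothetical numbers, not the study's monitoring data):

    import numpy as np
    from scipy import stats

    # Hypothetical paired monthly means for one parameter (e.g., total suspended solids, mg/L)
    upstream   = np.array([12.1, 10.4, 15.3, 9.8, 11.0, 14.2, 13.5, 10.9])
    downstream = np.array([14.0, 12.2, 18.1, 11.5, 12.4, 16.9, 15.0, 12.8])

    # Paired t-test: does the downstream site differ from its paired upstream control?
    t_stat, p_value = stats.ttest_rel(downstream, upstream)
    print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")

The actual study applies this kind of test, together with PCA and time series analysis, across 15 parameters and multiple construction phases.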
Spatially Refined Aerosol Direct Radiative Forcing Efficiencies
NASA Technical Reports Server (NTRS)
Henze, Daven K.; Shindell, Drew Todd; Akhtar, Farhan; Spurr, Robert J. D.; Pinder, Robert W.; Loughlin, Dan; Kopacz, Monika; Singh, Kumaresh; Shim, Changsub
2012-01-01
Global aerosol direct radiative forcing (DRF) is an important metric for assessing potential climate impacts of future emissions changes. However, the radiative consequences of emissions perturbations are not readily quantified nor well understood at the level of detail necessary to assess realistic policy options. To address this challenge, here we show how adjoint model sensitivities can be used to provide highly spatially resolved estimates of the DRF from emissions of black carbon (BC), primary organic carbon (OC), sulfur dioxide (SO2), and ammonia (NH3), using the example of emissions from each sector and country following multiple Representative Concentration Pathways (RCPs). The radiative forcing efficiencies of many individual emissions are found to differ considerably from regional or sectoral averages for NH3, SO2 from the power sector, and BC from domestic, industrial, transportation and biomass burning sources. Consequently, the amount of emissions controls required to attain a specific DRF varies at intracontinental scales by up to a factor of 4. These results thus demonstrate both a need and means for incorporating spatially refined aerosol DRF into the analysis of future emissions scenarios and the design of air quality and climate change mitigation policies.
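The bookkeeping behind adjoint-based forcing efficiencies can be sketched as follows (illustrative only; the gridded values are random placeholders, not adjoint model output): the DRF response to an emissions perturbation is the cell-by-cell product of the adjoint sensitivity field and the perturbation field, and the efficiency is forcing per unit emission.

    import numpy as np

    rng = np.random.default_rng(0)
    nlat, nlon = 46, 72                                # coarse illustrative grid

    dF_dE = rng.normal(-5e-10, 1e-10, (nlat, nlon))    # adjoint sensitivity, W m-2 per kg emitted (assumed)
    dE    = rng.uniform(0, 1e6, (nlat, nlon))          # emissions perturbation per cell, kg (assumed)

    dDRF_per_cell = dF_dE * dE                         # forcing contribution of each cell's perturbation
    total_dDRF = dDRF_per_cell.sum()
    efficiency = total_dDRF / dE.sum()                 # aggregate forcing efficiency, W m-2 per kg

    print(f"total DRF change: {total_dDRF:.3e} W m-2")
    print(f"mean forcing efficiency: {efficiency:.3e} W m-2 kg-1")

The study's point is that the per-cell efficiencies (dF_dE here) differ markedly from the regional or sectoral mean, which is why spatially refined values change the apparent cost of attaining a given DRF target.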
A top-down assessment of energy, water and land use in uranium mining, milling, and refining
DOE Office of Scientific and Technical Information (OSTI.GOV)
E. Schneider; B. Carlsen; E. Tavrides
2013-11-01
Land, water and energy use are key measures of the sustainability of uranium production into the future. As the most attractive, accessible deposits are mined out, future discoveries may prove to be significantly, perhaps unsustainably, more intensive consumers of environmental resources. A number of previous attempts have been made to provide empirical relationships connecting these environmental impact metrics to process variables such as stripping ratio and ore grade. These earlier attempts were often constrained by a lack of real world data and perform poorly when compared against data from modern operations. This paper conditions new empirical models of energy, water and land use in uranium mining, milling, and refining on contemporary data reported by operating mines. It shows that, at present, direct energy use from uranium production represents less than 1% of the electrical energy produced by the once-through fuel cycle. Projections of future energy intensity from uranium production are also possible by coupling the empirical models with estimates of uranium crustal abundance, characteristics of new discoveries, and demand. The projections show that even for the most pessimistic of scenarios considered, by 2100, the direct energy use from uranium production represents less than 3% of the electrical energy produced by the contemporary once-through fuel cycle.
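The style of empirical model described above can be sketched with a hypothetical power-law relation (the functional form, coefficients, and electricity-yield figure below are illustrative assumptions, not the paper's fitted model): specific energy for mining and milling rises as ore grade falls, and can be compared with the electricity ultimately generated per kg of natural uranium in a once-through cycle.

    # Hypothetical power-law relation: specific energy rises as ore grade falls.
    def mining_milling_energy_GJ_per_kgU(ore_grade_percent, k=0.2, alpha=0.8):
        """Assumed form: E = k * grade^(-alpha), in GJ per kg U (illustrative coefficients)."""
        return k * ore_grade_percent ** (-alpha)

    # Order-of-magnitude assumption: ~45 MWh of electricity per kg natural U in a
    # once-through light-water reactor cycle (45 MWh = 162 GJ).
    ELEC_PER_KG_U_GJ = 45 * 3.6

    for grade in [0.5, 0.1, 0.02]:   # percent U3O8
        e = mining_milling_energy_GJ_per_kgU(grade)
        print(f"grade {grade:>4}%: {e:5.1f} GJ/kgU  "
              f"({100 * e / ELEC_PER_KG_U_GJ:.1f}% of electricity produced)")

With these assumed coefficients the direct energy share stays near 1% at current grades and only approaches a few percent at very low grades, which is the qualitative pattern the paper reports.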
NASA Capabilities That Could Impact Terrestrial Smart Grids of the Future
NASA Technical Reports Server (NTRS)
Beach, Raymond F.
2015-01-01
Incremental steps to steadily build, test, refine, and qualify capabilities that lead to affordable flight elements and a deep space capability. Potential Deep Space Vehicle Power system characteristics: power 10 kilowatts average; two independent power channels with multi-level cross-strapping; solar array power 24 plus kilowatts; multi-junction arrays; lithium-ion battery storage 200 plus ampere-hours; sized for deep space or low lunar orbit operation; distribution 120 volts secondary (SAE AS 5698); 2 kilowatt power transfer between vehicles.
The development of ecological impact assessment in China.
Liu, Xuehua; Li, Zhouyuan; Liao, Chenghao; Wang, Qing; Zhu, Annah; Li, Dong; Li, Yajun; Tang, Zhuo
2015-12-01
The balance between economic development and ecological conservation in China has become a critical issue in recent decades. Ecological impact assessment (EcoIA) was established beginning in the 1980s as a component of environmental impact assessment (EIA) that focuses specifically on human-related changes in ecosystem structure and function. EcoIA has since been widely applied throughout the country with continuous refinements in theory and practice. As compared to EIA, EcoIA is often performed at a larger scale in the long-term, and thus requires more advanced tools and techniques to quantify and assess. This paper reviews the development of EcoIA over the past 30 years in China, with specific consideration given to refinements in legislation and methodology. Three stages in the development of EcoIA in China are identified, along with their achievements and limitations. Supplementing this qualitative analysis, the paper also provides a quantitative bibliometrics review of academic publications concerning EcoIA in China over the three identified stages. Lastly, general trends in the development of EcoIA are summarized with the aim of conveying potential future trajectories. This review is intended to introduce the EcoIA system to scholars interested in the growing field of environmental management in China. Copyright © 2015 Elsevier Ltd. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-16
...-82385] Notice of Availability of the Final Environmental Impact Statement for the UNEV Refined Liquid...) has prepared a Proposed Resource Management Plan Amendment (RMPA)/Final Environmental Impact Statement..., Tooele, Juab, Millard, Beaver, Iron, and Washington Counties in Utah; and Lincoln and Clark Counties in...
Absar, Syeda Mariya; Preston, Benjamin L.
2015-05-25
The exploration of alternative socioeconomic futures is an important aspect of understanding the potential consequences of climate change. While socioeconomic scenarios are common and, at times essential, tools for the impact, adaptation and vulnerability and integrated assessment modeling research communities, their approaches to scenario development have historically been quite distinct. However, increasing convergence of impact, adaptation and vulnerability and integrated assessment modeling research in terms of scales of analysis suggests there may be value in the development of a common framework for socioeconomic scenarios. The Shared Socioeconomic Pathways represent an opportunity for the development of such a common framework. However, the scales at which these global storylines have been developed are largely incommensurate with the sub-national scales at which impact, adaptation and vulnerability, and increasingly integrated assessment modeling, studies are conducted. Our objective for this study was to develop sub-national and sectoral extensions of the global SSP storylines in order to identify future socioeconomic challenges for adaptation for the U.S. Southeast. A set of nested qualitative socioeconomic storyline elements, integrated storylines, and accompanying quantitative indicators were developed through an application of the Factor-Actor-Sector framework. Finally, in addition to revealing challenges and opportunities associated with the use of the SSPs as a basis for more refined scenario development, this study generated sub-national storyline elements and storylines that can subsequently be used to explore the implications of alternative subnational socioeconomic futures for the assessment of climate change impacts and adaptation.
The Influence of Grain Refiners on the Efficiency of Ceramic Foam Filters
NASA Astrophysics Data System (ADS)
Towsey, Nicholas; Schneider, Wolfgang; Krug, Hans-Peter; Hardman, Angela; Keegan, Neil J.
An extensive program of work has been carried out to evaluate the efficiency of ceramic foam filters under carefully controlled conditions. Work reported at previous TMS meetings showed that in the absence of grain refiners, ceramic foam filters have the capacity for high filtration efficiency and consistent, reliable performance. The current phase of the investigation focuses on the impact grain refiner additions have on filter performance. The high filtration efficiencies obtained using 50 or 80 ppi CFFs in the absence of grain refiners diminish when Al-3%Ti-1%B grain refiners are added. This, together with the impact of incoming inclusion loading on filter performance and the level of grain refiner addition, is considered in detail. The new generation Al-3%Ti-0.15%C grain refiner has also been included. At typical addition levels (1 kg/tonne) the effect on filter efficiency is similar to that for TiB2-based grain refiners. The work was again conducted on a production scale using AA1050 alloy. Metal quality was determined using LiMCA and PoDFA. Spent filters were also analysed.
Concept Development for Future Domains: A New Method of Knowledge Elicitation
2005-06-01
Procedure: U.S. Army Research Institute for the Behavioral and Social Sciences (ARI) examined methods to generate, refine, test, and validate new...generate, elaborate, refine, describe, test, and validate new Future Force concepts relating to doctrine, tactics, techniques, procedures, unit and team...System (Harvey, 1993), and the Job Element Method (Primoff & Eyde, 1988). Figure 1 provides a more comprehensive list of task analytic methods. Please see
Impact assessment of risk management interventions.
Shryock, T R
2012-04-01
Much effort has been invested in the development and implementation of international recommendations to manage the risk of foodborne antimicrobial resistance, and monitoring programmes to measure bacterial antimicrobial resistance and antimicrobial product volumes. A variety of approaches have been recommended for various stakeholders in the food animal and food production sectors. Interestingly, much less consideration has been given to the establishment of success criteria for the individual interventions and even less for the cumulative effects, when all interventions are considered together as consecutive 'hurdles' along the food chain. The author explores the outcome and unforeseen consequences of these various interventions and appropriate methods that could provide data to assess their impact, as well as key learning experiences that should lead to refinements of such interventions in the future.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-09
... wind turbine generators; a substation; administration, operations and maintenance facilities... Action (the ``Refined Project''). Under the Refined Project configuration, only 112 wind turbines... Report for the Pattern Energy Group's Ocotillo Express Wind Energy Project and Proposed California Desert...
Anderson, Deverick J.; Cochran, Ronda L.; Hicks, Lauri A.; Srinivasan, Arjun; Dodds Ashley, Elizabeth S.
2017-01-01
Antimicrobial stewardship programs (ASPs) positively impact patient care, but metrics to assess ASP impact are poorly defined. We used a modified Delphi approach to select relevant metrics for assessing patient-level interventions in acute-care settings for the purposes of internal program decision making. An expert panel rated 90 candidate metrics on a 9-point Likert scale for association with 4 criteria: improved antimicrobial prescribing, improved patient care, utility in targeting stewardship efforts, and feasibility in hospitals with electronic health records. Experts further refined, added, or removed metrics during structured teleconferences and re-rated the retained metrics. Six metrics were rated >6 in all criteria: 2 measures of Clostridium difficile incidence, incidence of drug-resistant pathogens, days of therapy over admissions, days of therapy over patient days, and redundant therapy events. Fourteen metrics rated >6 in all criteria except feasibility were identified as targets for future development. PMID:27927866
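For context, two of the retained utilization metrics are simple ratios; a toy calculation with hypothetical numbers (not data from the study) looks like this:

    days_of_therapy = 4200      # antimicrobial days of therapy in the period (hypothetical)
    patient_days    = 9500      # total patient-days in the same period (hypothetical)
    admissions      = 1100      # total admissions (hypothetical)

    dot_per_1000_pd   = 1000 * days_of_therapy / patient_days   # days of therapy per 1,000 patient-days
    dot_per_admission = days_of_therapy / admissions             # days of therapy per admission

    print(f"DOT per 1,000 patient-days: {dot_per_1000_pd:.1f}")
    print(f"DOT per admission:          {dot_per_admission:.2f}")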
Maes, Wouter H; Heuvelmans, Griet; Muys, Bart
2009-10-01
Although the importance of green (evaporative) water flows in delivering ecosystem services has been recognized, most operational impact assessment methods still focus only on blue water flows. In this paper, we present a new model to evaluate the effect of land use occupation and transformation on water quantity. Conceptually based on the supply of ecosystem services by terrestrial and aquatic ecosystems, the model is developed for, but not limited to, land use impact assessment in life cycle assessment (LCA) and requires a minimum amount of input data. Impact is minimal when evapotranspiration is equal to that of the potential natural vegetation, and maximal when evapotranspiration is zero or when it exceeds a threshold value derived from the concept of environmental water requirement. Three refinements to the model, requiring more input data, are proposed. The first refinement considers a minimal impact over a certain range based on the boundary evapotranspiration of the potential natural vegetation. In the second refinement the effects of evaporation and transpiration are accounted for separately, and in the third refinement a more correct estimate of evaporation from a fully sealed surface is incorporated. The simplicity and user friendliness of the proposed impact assessment method are illustrated with two examples.
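A minimal sketch of the scoring logic described above (my own reading of the concept, with illustrative numbers and a simple linear interpolation; not the authors' published equations): impact is zero when evapotranspiration of the occupied land equals that of the potential natural vegetation, and maximal when evapotranspiration is zero or exceeds a threshold derived from the environmental water requirement.

    def water_use_impact(et_landuse, et_pnv, et_max):
        """Illustrative impact score in [0, 1].

        et_landuse : actual evapotranspiration of the occupied land (mm/yr)
        et_pnv     : evapotranspiration of the potential natural vegetation (mm/yr)
        et_max     : threshold derived from the environmental water requirement (mm/yr)
        """
        if et_landuse <= 0 or et_landuse >= et_max:
            return 1.0                                     # maximal impact
        if et_landuse <= et_pnv:
            return (et_pnv - et_landuse) / et_pnv           # deficit relative to the PNV
        return (et_landuse - et_pnv) / (et_max - et_pnv)    # excess toward the EWR threshold

    # Example: PNV evapotranspires 600 mm/yr, EWR-derived threshold at 850 mm/yr (assumed values)
    for et in [0, 300, 600, 720, 900]:
        print(f"ET = {et:>3} mm/yr -> impact {water_use_impact(et, 600, 850):.2f}")

The paper's refinements (a zero-impact range around the PNV value, separate treatment of evaporation and transpiration, and sealed-surface evaporation) would modify this basic shape rather than replace it.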
Funding emergency care: Australian style.
Bell, Anthony; Crilly, Julia; Williams, Ged; Wylie, Kate; Toloo, Ghasem Sam; Burke, John; FitzGerald, Gerry
2014-08-01
The ongoing challenge for ED leaders is to remain abreast of system-wide changes that impact on the day-to-day management of their departments. Changes to the funding model create another layer of complexity, and this introductory paper serves as the beginning of a discussion about the way in which EDs are funded and how this can and will impact on business decisions, models of care and resource allocation within Australian EDs. Furthermore, it is evident that any funding model today will mature and change with time, and moves are afoot to refine and contextualise ED funding over the medium term. This perspective seeks to provide a basis of understanding for our current and future funding arrangements in Australian EDs. © 2014 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
NASA Technical Reports Server (NTRS)
Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter, III; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina;
2016-01-01
This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify the end-to-end system performance for high-contrast starlight suppression. This pathfinder system will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beiter, Philipp; Stehly, Tyler
The potential for cost reduction and economic viability for offshore wind varies considerably within the United States. This analysis models the cost impact of a range of offshore wind locational cost variables across more than 7,000 potential coastal sites in the United States' offshore wind resource area. It also assesses the impact of over 50 technology innovations on potential future costs between 2015 and 2027 (Commercial Operation Date) for both fixed-bottom and floating wind systems. Comparing these costs to an initial assessment of local avoided generating costs, this analysis provides a framework for estimating the economic potential for offshore wind. Analyzing economic potential within this framework can help establish a refined understanding across industries of the technology and site-specific risks and opportunities associated with future offshore wind development. The findings from the original report indicate that under the modeled scenario, offshore wind can be expected to achieve significant cost reductions and may approach economic viability in some parts of the United States within the next 15 years.
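The core screening step in that framework, comparing a site's cost of energy against its local avoided generating cost, can be illustrated with a toy example (site names and all dollar figures below are hypothetical, not values from the analysis):

    sites = [
        # (name, levelized cost of energy $/MWh, local avoided cost $/MWh) -- hypothetical values
        ("Site A (fixed-bottom, shallow)", 95, 110),
        ("Site B (fixed-bottom, deep)",   130, 105),
        ("Site C (floating)",             150, 120),
    ]

    for name, lcoe, avoided in sites:
        viable = lcoe <= avoided
        print(f"{name:32s} LCOE ${lcoe:>3}/MWh vs avoided ${avoided:>3}/MWh -> "
              f"{'economically viable' if viable else 'not yet viable'}")

Repeating this comparison across thousands of sites, with cost trajectories that fall over time as innovations land, yields the economic-potential estimates the report describes.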
[Socio professional impact of surgical release of carpal tunnel syndrome].
Kraiem, Aouatef Mahfoudh; Hnia, Hajer; Bouzgarrou, Lamia; Henchi, Mohamed Adnène; Khalfallah, Taoufik
2016-01-01
The objective was to study the socio-professional impact of release surgery for carpal tunnel syndrome (CTS). We conducted a cross-sectional study of patients operated on for work-related CTS; data were collected in the Occupational Health Department at the University Hospital Tahar Sfar in Mahdia, Tunisia over a period of 8 years, from 1 January 2006 to December 2013. Data collection was performed using a survey form focusing on participants' socio-professional and medical characteristics and on their professional future. We used Karasek's questionnaire to study psychosocial constraints at work. The duration of a work stoppage following release surgery for CTS was significantly related to the existence of musculoskeletal disorders other than CTS, to a statement that the carpal tunnel syndrome was work related and to job seniority. As regards the professional future of operated employees, 50.7% remained in the same position, 15.3% were given a customized workstation and 33.8% were offered a different position within the same company. The professional future of these employees was related to their occupational qualifications and to the type of sensory and/or motor impairment of the median nerve detected during EMG testing. A number of nonlesional factors determine the duration of the work stoppage, while the professional future of patients operated on for CTS essentially depends on their professional qualifications and on EMG data. Certainly much broader studies would allow these results to be refined.
Symstad, Amy J.; Fisichelli, Nicholas A.; Miller, Brian W.; Rowland, Erika; Schuurman, Gregor W.
2017-01-01
Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekolf, W.D.
1988-03-01
How the HPI and government react to new directions will not only set the course for the future of refining and marketing, it will have profound implications for the entire energy industry. Strategies developed by individual refiners and marketers in response to this changing environment will determine their future in the industry. In developing scenarios for the downstream, Cambridge Energy Research Associates (CERA) has identified three forces that will determine the downstream playing field in the nineties: 1. Imbalances between market demands and refinery capacity will continue to promote intense competition and to depress margins, 2. Product and crude price volatility will be at least as great in the future as it has been in the last three years and 3. Renewed environmental concerns will add new capital investment burdens to the industry. The implications of these three forces on refiners are clear - being in the downstream business is likely to become increasingly expensive, competitive and risky. The author shares CERA's perspective on why these forces have evolved and, in turn, led to new strategies and developments in the industry. Then he outlines how we think these new themes may affect players in the industry. Finally, he summarizes some key uncertainties the future holds.
Lactation and neonatal nutrition: Defining and refining the critical questions
USDA-ARS's Scientific Manuscript database
This paper resulted from a conference entitled "Lactation and Milk: Defining and Refining the Critical Questions" held at the University of Colorado School of Medicine from January 18-20, 2012. The mission of the conference was to identify unresolved questions and set future goals for research into ...
NASA Astrophysics Data System (ADS)
Wootten, A.; Dixon, K. W.; Lanzante, J. R.; Mcpherson, R. A.
2017-12-01
Empirical statistical downscaling (ESD) approaches attempt to refine global climate model (GCM) information via statistical relationships between observations and GCM simulations. The aim of such downscaling efforts is to create added-value climate projections by adding finer spatial detail and reducing biases. The results of statistical downscaling exercises are often used in impact assessments under the assumption that past performance provides an indicator of future results. Given prior research describing the danger of this assumption with regard to temperature, this study expands the perfect model experimental design from previous case studies to test the stationarity assumption with respect to precipitation. Assuming stationarity implies that the performance of ESD methods is similar between the future projections and the historical training period. Case study results from four quantile-mapping based ESD methods demonstrate violations of the stationarity assumption for both central tendency and extremes of precipitation. These violations vary geographically and seasonally. For the four ESD methods tested, the greatest challenges for downscaling of daily total precipitation projections occur in regions with limited precipitation and for extremes of precipitation along Southeast coastal regions. We conclude with a discussion of future expansion of the perfect model experimental design and the implications for improving ESD methods and providing guidance on the use of ESD techniques for impact assessments and decision-support.
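Quantile mapping, the family of ESD methods examined here, can be illustrated with a bare-bones empirical version (a generic sketch on synthetic data, not any of the four methods evaluated in the study): each model value is mapped to the observed value at the same empirical quantile of the training period.

    import numpy as np

    rng = np.random.default_rng(42)
    obs_train  = rng.gamma(shape=2.0, scale=3.0, size=5000)   # synthetic "observed" daily precip (mm)
    gcm_train  = rng.gamma(shape=2.0, scale=4.0, size=5000)   # synthetic biased GCM precip, training period
    gcm_future = rng.gamma(shape=2.0, scale=4.8, size=5000)   # synthetic GCM precip, future period

    def quantile_map(x, model_train, obs_train):
        """Empirical quantile mapping: x -> observed value at x's rank in the model training CDF."""
        q = np.searchsorted(np.sort(model_train), x) / len(model_train)
        q = np.clip(q, 0.0, 1.0)
        return np.quantile(obs_train, q)

    downscaled_future = quantile_map(gcm_future, gcm_train, obs_train)
    print("raw future mean:        ", gcm_future.mean().round(2))
    print("downscaled future mean: ", downscaled_future.mean().round(2))

The stationarity question the study probes is whether the training-period mapping (gcm_train versus obs_train here) remains valid when applied to a future period whose distribution has shifted.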
Reductions in Northeast Refining Activity: Potential Implications for Petroleum Product Markets
2011-01-01
This report is the Energy Information Administration's (EIA) initial effort to provide information and analysis on the potential impacts on petroleum product markets from reductions in Northeast petroleum refining activity.
ERIC Educational Resources Information Center
Bureau of Labor Statistics (DOL), Washington, DC.
This bulletin appraises major technological changes emerging in five American industries (coal mining, oil and gas extraction, petroleum refining, petroleum pipeline transportation, and electric and gas utilities) and discusses the impact of these changes on productivity and occupations over the next five to ten years. Its separate reports on each…
Grain refinement of high strength steels to improve cryogenic toughness
NASA Technical Reports Server (NTRS)
Rush, H. F.
1985-01-01
Grain-refining techniques using multistep heat treatments to reduce the grain size of five commercial high-strength steels were investigated. The goal of this investigation was to improve the low-temperature toughness as measured by Charpy V-notch impact test without a significant loss in tensile strength. The grain size of four of five alloys investigated was successfully reduced up to 1/10 of original size or smaller with increases in Charpy impact energy of 50 to 180 percent at -320 F. Tensile properties were reduced from 0 to 25 percent for the various alloys tested. An unexpected but highly beneficial side effect from grain refining was improved machinability.
NASA Astrophysics Data System (ADS)
Serafin, K.; Ruggiero, P.; Stockdon, H. F.; Barnard, P.; Long, J.
2014-12-01
Many coastal communities worldwide are vulnerable to flooding and erosion driven by extreme total water levels (TWL), potentially dangerous events produced by the combination of large waves, high tides, and high non-tidal residuals. The West coast of the United States provides an especially challenging environment to model these processes due to its complex geological setting combined with uncertain forecasts for sea level rise (SLR), changes in storminess, and possible changes in the frequency of major El Niños. Our research therefore aims to develop an appropriate methodology to assess present-day and future storm-induced coastal hazards along the entire U.S. West coast, filling this information gap. We present the application of this framework in a pilot study at Ocean Beach, California, a National Park site within the Golden Gate National Recreation Area where existing event-scale coastal change data can be used for model calibration and verification. We use a probabilistic, full simulation TWL model (TWL-FSM; Serafin and Ruggiero, in press) that captures the seasonal and interannual climatic variability in extremes using functions of regional climate indices, such as the Multivariate ENSO index (MEI), to represent atmospheric patterns related to the El Niño-Southern Oscillation (ENSO). In order to characterize the effect of climate variability on TWL components, we refine the TWL-FSM by splitting non-tidal residuals into low (monthly mean sea level anomalies) and high frequency (storm surge) components. We also develop synthetic climate indices using Markov sequences to reproduce the autocorrelated nature of ENSO behavior. With the refined TWL-FSM, we simulate each TWL component, resulting in synthetic TWL records providing robust estimates of extreme return level events (e.g., the 100-yr event) and the ability to examine the relative contribution of each TWL component to these extreme events. Extreme return levels are then used to drive storm impact models to examine the probability of coastal change (Stockdon et al., 2013) and thus, the vulnerability to storm-induced coastal hazards that Ocean Beach faces. Future climate variability is easily incorporated into this framework, allowing us to quantify how an evolving climate will alter future extreme TWLs and their related coastal impacts.
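The basic bookkeeping behind such a full-simulation TWL model can be sketched as follows (synthetic components and a purely empirical return-level estimate; the actual TWL-FSM of Serafin and Ruggiero uses fitted seasonal and climate-index-dependent distributions for each term):

    import numpy as np

    rng = np.random.default_rng(1)
    n_days = 100 * 365                                              # 100 synthetic years of daily maxima

    tide    = 1.0 * np.sin(2 * np.pi * np.arange(n_days) / 14.77)   # crude spring-neap proxy (m)
    mslanom = rng.normal(0.0, 0.08, n_days)                         # monthly mean sea level anomaly proxy (m)
    surge   = rng.gumbel(0.05, 0.10, n_days)                        # storm surge (m)
    runup   = rng.gamma(2.0, 0.25, n_days)                          # wave runup term (m)

    twl = tide + mslanom + surge + runup                            # daily maximum total water level (m)

    annual_max = twl.reshape(100, 365).max(axis=1)
    # Empirical return levels from ranked annual maxima (Weibull plotting positions)
    ranked = np.sort(annual_max)[::-1]
    return_period = (len(ranked) + 1) / (np.arange(len(ranked)) + 1)
    for T in (10, 50, 100):
        level = ranked[np.argmin(np.abs(return_period - T))]
        print(f"~{T:>3}-yr TWL: {level:.2f} m")

Because the synthetic record can be made arbitrarily long, the simulation approach yields far more stable estimates of rare return levels, and of each component's contribution to them, than fitting extremes to a short observed record.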
Continuous Calibration Improvement in Solar Reflective Bands: Landsat 5 Through Landsat 8
NASA Technical Reports Server (NTRS)
Mishra, Nischal; Helder, Dennis; Barsi, Julia; Markham, Brian
2016-01-01
Launched in February 2013, the Operational Land Imager (OLI) on-board Landsat 8 continues to perform exceedingly well and provides high science quality data globally. Several design enhancements have been made in the OLI instrument relative to prior Landsat instruments: pushbroom imaging which provides substantially improved Signal-to-Noise Ratio (SNR), spectral bandpasses refinement to avoid atmospheric absorption features, 12 bit data resolution to provide a larger dynamic range that limits the saturation level, a set of well-designed onboard calibrators to monitor the stability of the sensor. Some of these changes such as refinements in spectral bandpasses compared to earlier Landsats and well-designed on-board calibrator have a direct impact on the improved radiometric calibration performance of the instrument from both the stability of the response and the ability to track the changes. The on-board calibrator lamps and diffusers indicate that the instrument drift is generally less than 0.1% per year across the bands. The refined bandpasses of the OLI indicate that temporal uncertainty of better than 0.5% is possible when the instrument is trended over vicarious targets such as Pseudo Invariant Calibration Sites (PICS), a level of precision that was never achieved with the earlier Landsat instruments. The stability measurements indicated by on-board calibrators and PICS agree much better compared to the earlier Landsats, which is very encouraging and bodes well for the future Landsat missions too.
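The drift figures quoted above come from trending calibrator or PICS time series; a minimal version of such a trend fit, on synthetic data with an assumed drift, is shown below (not the actual OLI calibration pipeline):

    import numpy as np

    rng = np.random.default_rng(7)
    years = np.linspace(0, 4, 60)                        # 4 years of acquisitions (synthetic)
    true_drift = -0.0008                                 # assumed -0.08 %/yr drift in normalized response
    response = 1.0 + true_drift * years + rng.normal(0, 0.002, years.size)  # lifetime-normalized gain

    slope, intercept = np.polyfit(years, response, 1)
    print(f"fitted drift: {100 * slope / intercept:+.3f} % per year")

The smaller the scatter of the time series (the 0.002 noise term here), the smaller the drift that can be detected, which is why the OLI's improved SNR and refined bandpasses translate directly into tighter stability estimates.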
NASA Astrophysics Data System (ADS)
Schleussner, C. F.
2016-12-01
Robust appraisals of climate impacts at different levels of global-mean temperature increase are vital to guide assessments of dangerous anthropogenic interference with the climate system. By establishing 1.5°C as the long-term limit for global average temperature increase and inviting a special report of the IPCC on the impacts of 1.5°C, the Paris Agreement has put such assessments high on the post-Paris science agenda. Here I will present recent findings of climate impacts at 1.5°C, including extreme weather events, water availability, agricultural yields, sea-level rise and risk of coral reef loss. In particular, I will present findings from a recent study that attempts to differentiate between such impacts at warming levels of 1.5°C and 2°C above pre-industrial (Schleussner et al., 2016). By analyzing changes in indicators for 26 world regions as applicable, the study found region-dependent differences between a 1.5°C and 2°C warming. Regional hot-spots of change emerge, with tropical regions bearing the brunt of the impacts of an additional 0.5°C warming. These findings highlight the importance of regional differentiation to assess both future climate risks and different vulnerabilities to incremental increases in global-mean temperature. Building on that analysis, I will discuss limitations of existing approaches to differentiate between warming levels and outline opportunities for future work on refining our understanding of the difference between impacts at 1.5°C and 2°C warming. References: Schleussner, C.-F., et al. Differential climate impacts for policy relevant limits to global warming: the case of 1.5°C and 2°C. Earth Syst. Dyn. 7, 327-351 (2016).
Optimization of Refining Craft for Vegetable Insulating Oil
NASA Astrophysics Data System (ADS)
Zhou, Zhu-Jun; Hu, Ting; Cheng, Lin; Tian, Kai; Wang, Xuan; Yang, Jun; Kong, Hai-Yang; Fang, Fu-Xin; Qian, Hang; Fu, Guang-Pan
2016-05-01
Vegetable insulating oil, because of its environmental friendliness, is considered an ideal substitute for mineral oil in the insulation and cooling of transformers. The main steps of the traditional refining process are alkali refining, bleaching and distillation. This process gives satisfactory results when refining small batches of insulating oil, but it cannot be applied directly to a large-capacity reaction kettle. In this paper, rapeseed oil is used as the crude oil, and the refining process is optimized for a large-capacity reaction kettle. The optimized process adds an acid degumming step. Sodium silicate is added to the alkali compound used in the alkali refining step, and the ratio of each component is optimized. In the decolorization step, activated clay and activated carbon are added in a 10:1 ratio, which effectively reduces the acid value and dielectric loss of the oil. Using vacuum degassing instead of distillation further reduces the acid value. Compared with mineral insulating oil, the dielectric loss of the refined vegetable oil is still high, and further optimization measures will be needed in the future.
Human factors measurement for future air traffic control systems.
Langan-Fox, Janice; Sankey, Michael J; Canty, James M
2009-10-01
This article provides a critical review of research pertaining to the measurement of human factors (HF) issues in current and future air traffic control (ATC). Growing worldwide air traffic demands call for a radical departure from current ATC systems. Future systems will have a fundamental impact on the roles and responsibilities of ATC officers (ATCOs). Valid and reliable methods of assessing HF issues associated with these changes, such as a potential increase (or decrease) in workload, are of utmost importance for advancing theory and for designing systems, procedures, and training. We outline major aviation changes and how these relate to five key HF issues in ATC. Measures are outlined, compared, and evaluated and are followed by guidelines for assessing these issues in the ATC domain. Recommendations for future research are presented. A review of the literature suggests that situational awareness and workload have been widely researched and assessed using a variety of measures, but researchers have neglected the areas of trust, stress, and boredom. We make recommendations for use of particular measures and the construction of new measures. It is predicted that, given the changing role of ATCOs and profound future airspace requirements and configurations, issues of stress, trust, and boredom will become more significant. Researchers should develop and/or refine existing measures of all five key HF issues to assess their impact on ATCO performance. Furthermore, these issues should be considered in a holistic manner. The current article provides an evaluation of research and measures used in HF research on ATC that will aid research and ATC measurement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burke, B.F.
The US refining industry has experienced an extended period of change covering the past 20 years. Growing regulatory requirements, combined with shifting market characteristics, have resulted in massive investments and significant and ongoing structural change. Despite excellent capacity utilization, recent profitability has been poor. Industry psychology can be described as depressed, with honest concern about the long-term attractiveness of domestic refining as an area for continued participation and investment. This paper provides an overview of how the industry arrived at these levels of poor profitability, examines the current situation and future drivers, and presents Chem Systems' views on the outlook for domestic refining.
Back to the Future: Anticipating and Preparing for Change.
ERIC Educational Resources Information Center
Lapin, Joel D.
1992-01-01
Explains how colleges can take control of their futures by anticipating needs and demands. Describes environmental scanning, a way of identifying future concerns based on current trends and emerging issues. Provides examples of colleges that used forecasting and scanning to develop new courses and refine mission statements. (DMM)
Contracting for nurse education: nurse leader experiences and future visions.
Moule, P
1999-02-01
The integration of nurse education into higher education establishments following Working for Patients, Working Paper 10 (DOH 1989a) has seen changes to the funding and delivery of nurse education. The introduction of contracting for education initiated a business culture which subsumed previous relationships, affecting collaborative partnerships and shared understanding. Discourse between the providers and purchasers of nurse education is vital to achieve proactive curriculum planning, which supports the development of nursing practitioners who are fit for award and fit for purpose. Research employed philosophical hermeneutics to guide the interviewing of seven nurse leaders within one region. Data analysis occurred within a hermeneutic circle and was refined using NUDIST. Two key themes were seen as impacting on the development of an effective educational strategy. Firstly, the development of collaborative working was thought to have been impeded by communication difficulties between the Trusts and higher education provider. Secondly, there was concern that curriculum developments would support the future evolution of nursing, acknowledging the professional issues impacting on nursing roles. The research findings suggest purchasers and providers of nurse education must move towards achieving mutual understanding and collaborate in developing a curriculum which will prepare nurses for practice and for award.
Maust, Donovan T.; Oslin, David W.; Thase, Michael E.
2012-01-01
Many older adults with Major Depressive Disorder (MDD) do not respond to antidepressant monotherapy. While there are evidence-based treatment options to support treatment beyond monotherapy for adults, the evidence for such strategies specifically in late-life MDD is relatively scarce. This review examines the published data describing strategies for antidepressant augmentation or acceleration studied specifically in older adults, including lithium, stimulants, and second-generation antipsychotics. In addition, the authors suggest strategies for future research, such as study of specific agents, refining understanding of the impact of medical or cognitive comorbidity in late-life depression, and comparative effectiveness to examine methods already used in clinical practice. PMID:23567381
Future of phytonematode taxonomy
USDA-ARS?s Scientific Manuscript database
The future of nematode taxonomy will be characterized by rapid accumulation of new and revised taxa examined in light of a refined understanding of phenotypic plasticity and cryptic speciation, multiple molecular markers and appropriate phylogenetic analyses. The inevitable result will be improved t...
NASA Technical Reports Server (NTRS)
Owens, Andrew; De Weck, Olivier L.; Stromgren, Chel; Goodliff, Kandyce; Cirillo, William
2017-01-01
Future crewed missions to Mars present a maintenance logistics challenge that is unprecedented in human spaceflight. Mission endurance – defined as the time between resupply opportunities – will be significantly longer than previous missions, and therefore logistics planning horizons are longer and the impact of uncertainty is magnified. Maintenance logistics forecasting typically assumes that component failure rates are deterministically known and uses them to represent aleatory uncertainty, or uncertainty that is inherent to the process being examined. However, failure rates cannot be directly measured; rather, they are estimated based on similarity to other components or statistical analysis of observed failures. As a result, epistemic uncertainty – that is, uncertainty in knowledge of the process – exists in failure rate estimates that must be accounted for. Analyses that neglect epistemic uncertainty tend to significantly underestimate risk. Epistemic uncertainty can be reduced via operational experience; for example, the International Space Station (ISS) failure rate estimates are refined using a Bayesian update process. However, design changes may re-introduce epistemic uncertainty. Thus, there is a tradeoff between changing a design to reduce failure rates and operating a fixed design to reduce uncertainty. This paper examines the impact of epistemic uncertainty on maintenance logistics requirements for future Mars missions, using data from the ISS Environmental Control and Life Support System (ECLS) as a baseline for a case study. Sensitivity analyses are performed to investigate the impact of variations in failure rate estimates and epistemic uncertainty on spares mass. The results of these analyses and their implications for future system design and mission planning are discussed.
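The Bayesian refinement of failure rate estimates mentioned above can be illustrated with a standard Gamma-Poisson conjugate update: a Gamma prior on the failure rate is combined with the number of failures observed over an operating time to give a Gamma posterior. The prior parameters and operating history below are invented for illustration, and the conjugate model is an assumption rather than the actual ISS update procedure.

```python
from dataclasses import dataclass

@dataclass
class GammaRate:
    """Gamma distribution over a failure rate (failures per unit operating time)."""
    alpha: float  # shape parameter
    beta: float   # rate parameter, in the same time units as the exposure

    @property
    def mean(self) -> float:
        return self.alpha / self.beta

    def update(self, failures: int, exposure_time: float) -> "GammaRate":
        """Conjugate update for Poisson-distributed failure counts."""
        return GammaRate(self.alpha + failures, self.beta + exposure_time)

# Hypothetical prior from similarity analysis: mean rate 0.5 failures/year, weakly held.
prior = GammaRate(alpha=1.0, beta=2.0)

# Hypothetical operational experience: 2 failures observed over 10 years on orbit.
posterior = prior.update(failures=2, exposure_time=10.0)
print(f"Prior mean rate: {prior.mean:.3f}/yr, posterior mean rate: {posterior.mean:.3f}/yr")
```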
Assessing vulnerability of marine bird populations to offshore wind farms.
Furness, Robert W; Wade, Helen M; Masden, Elizabeth A
2013-04-15
Offshore wind farms may affect bird populations through collision mortality and displacement. Given the pressures to develop offshore wind farms, there is an urgent need to assess population-level impacts on protected marine birds. Here we refine an approach to assess aspects of their ecology that influence population vulnerability to wind farm impacts, also taking into account the conservation importance of each species. Flight height appears to be a key factor influencing collision mortality risk but improved data on flight heights of marine birds are needed. Collision index calculations identify populations of gulls, white-tailed eagles, northern gannets and skuas as of particularly high concern in Scottish waters. Displacement index calculations identify populations of divers and common scoters as most vulnerable to population-level impacts of displacement, but these are likely to be less evident than impacts of collision mortality. The collision and displacement indices developed here for Scottish marine bird populations could be applied to populations elsewhere, and this approach will help in identifying likely impacts of future offshore wind farms on marine birds and prioritising monitoring programmes, at least until data on macro-avoidance rates become available. Copyright © 2013 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Idaho State Library, Boise.
In 1998, Idahoans gathered in a series of six Regional Futures Conferences to identify what they thought was probable during the next ten years, what was possible for libraries to do and be, and what a preferred future of Idaho libraries might be. Participants from the regional conferences then convened to refine and focus descriptions of the…
Forward Contamination of the Moon and Mars: Implications for Future Life Detection Missions
NASA Technical Reports Server (NTRS)
Glavin, Daniel P.; Dworkin, Jason P.; Lupisella, Mark; Kminek, Gerhard; Rummel, John D.
2004-01-01
NASA and ESA have outlined new visions for solar system exploration that will include a series of lunar robotic missions to prepare for, and support a human return to the Moon, and future human exploration of Mars and other destinations. One of the guiding principles for exploration is to pursue compelling scientific questions about the origin and evolution of life. The search for life on objects such as Mars will require that all spacecraft and instrumentation be sufficiently cleaned and sterilized prior to launch to ensure that the scientific integrity of extraterrestrial samples is not jeopardized by terrestrial organic contamination. Under COSPAR's current planetary protection policy for the Moon, no sterilization procedures are required for outbound lunar spacecraft. Nonetheless, future in situ investigations of a variety of locations on the Moon by highly sensitive instruments designed to search for biologically derived organic compounds would help assess the contamination of the Moon by lunar spacecraft. These studies could also provide valuable "ground truth" data for Mars sample return missions and help define planetary protection requirements for future Mars bound spacecraft carrying life detection experiments. In addition, studies of the impact of terrestrial contamination of the lunar surface by the Apollo astronauts could provide valuable data to help refine future Mars surface exploration plans for a human mission to Mars.
Baker, Sandra E.; Sharp, Trudy M.; Macdonald, David W.
2016-01-01
Human-wildlife conflict is a global issue. Attempts to manage this conflict impact upon wild animal welfare, an issue receiving little attention until relatively recently. Where human activities harm animal welfare these effects should be minimised where possible. However, little is known about the welfare impacts of different wildlife management interventions, and opinions on impacts vary widely. Welfare impacts therefore need to be assessed objectively. Our objectives were to: 1) establish whether an existing welfare assessment model could differentiate and rank the impacts of different wildlife management interventions (for decision-making purposes); 2) identify and evaluate any additional benefits of making formal welfare assessments; and 3) illustrate issues raised by application of the model. We applied the welfare assessment model to interventions commonly used with rabbits (Oryctolagus cuniculus), moles (Talpa europaea) and crows (Corvus corone) in the UK. The model ranked interventions for rabbits (least impact first: fencing, head shot, chest shot) and crows (shooting, scaring, live trapping with cervical dislocation). For moles, managing molehills and tunnels scored least impact. Both spring trapping, and live trapping followed by translocation, scored greater impacts, but these could not be compared directly as they scored on different axes of the model. Some rankings appeared counter-intuitive, highlighting the need for objective formal welfare assessments. As well as ranking the humaneness of interventions, the model highlighted future research needs and how Standard Operating Procedures might be improved. The model is a milestone in assessing wildlife management welfare impacts, but our research revealed some limitations of the model and we discuss likely challenges in resolving these. In future, the model might be developed to improve its utility, e.g. by refining the time-scales. It might also be used to reach consensus among stakeholders about relative welfare impacts or to identify ways of improving wildlife management practice in the field. PMID:26726808
The Torino Impact Hazard Scale
NASA Astrophysics Data System (ADS)
Binzel, Richard P.
2000-04-01
Newly discovered asteroids and comets have inherent uncertainties in their orbit determinations owing to the natural limits of positional measurement precision and the finite lengths of orbital arcs over which determinations are made. For some objects making predictable future close approaches to the Earth, orbital uncertainties may be such that a collision with the Earth cannot be ruled out. Careful and responsible communication between astronomers and the public is required for reporting these predictions and a 0-10 point hazard scale, reported inseparably with the date of close encounter, is recommended as a simple and efficient tool for this purpose. The goal of this scale, endorsed as the Torino Impact Hazard Scale, is to place into context the level of public concern that is warranted for any close encounter event within the next century. Concomitant reporting of the close encounter date further conveys the sense of urgency that is warranted. The Torino Scale value for a close approach event is based upon both collision probability and the estimated kinetic energy (collision consequence), where the scale value can change as probability and energy estimates are refined by further data. On the scale, Category 1 corresponds to collision probabilities that are comparable to the current annual chance for any given size impactor. Categories 8-10 correspond to certain (probability >99%) collisions having increasingly dire consequences. While close approaches falling in Category 0 may be no cause for noteworthy public concern, there remains a professional responsibility to further refine orbital parameters for such objects, and a figure of merit is suggested for evaluating them. Because impact predictions represent a multi-dimensional problem, there is no unique or perfect translation into a one-dimensional system such as the Torino Scale. These limitations are discussed.
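The kinetic-energy input to the scale follows directly from an impactor's size, density, and encounter velocity via E = 0.5 m v^2. The sketch below evaluates that formula for a hypothetical object and converts the result to megatons of TNT; the diameter, density, and velocity are illustrative assumptions, and the actual Torino category also depends on the estimated collision probability.

```python
import math

def impact_kinetic_energy_megatons(diameter_m: float, density_kg_m3: float,
                                   velocity_km_s: float) -> float:
    """Kinetic energy E = 0.5 * m * v**2 of a spherical impactor, in megatons of TNT."""
    radius_m = diameter_m / 2.0
    mass_kg = density_kg_m3 * (4.0 / 3.0) * math.pi * radius_m**3
    energy_joules = 0.5 * mass_kg * (velocity_km_s * 1000.0)**2
    return energy_joules / 4.184e15  # 1 megaton TNT = 4.184e15 J

# Hypothetical 300 m stony asteroid encountering Earth at 20 km/s.
print(f"{impact_kinetic_energy_megatons(300.0, 3000.0, 20.0):.0f} Mt TNT equivalent")
```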
Satisfaction of the Automotive Fleet Fuel Demand and Its Impact on the Oil Refining Industry
DOT National Transportation Integrated Search
1980-12-01
Because virtually all transportation fuels are based on petroleum, it is essential to include petroleum refining in any assessment of potential changes in the transportation system. A number of changes in the automotive fleet have been proposed to im...
Danieli, Yael; Norris, Fran H; Lindert, Jutta; Paisner, Vera; Kronenberg, Sefi; Engdahl, Brian; Richter, Julia
2015-05-01
The impacts of the Holocaust on children of survivors have been widely investigated. However, consensus is limited, and no validated measures have been tailored with or to them. We aimed to develop and validate a scale that measures these specific impacts (Part II of the Danieli Inventory of Multigenerational Legacies of Trauma). We studied 484 adult children of survivors who participated in a cross-sectional web-based survey in English or Hebrew; of these, 191 participated in a clinical interview. Exploratory factor analyses of 58 items to reduce and refine the measure yielded a 36-item scale, Reparative Adaptational Impacts, that had excellent internal consistency (α = .91) and congruence between English and Hebrew versions (φ ≥ .95). Associations between impacts and SCID-based diagnoses of major depressive episode, posttraumatic stress disorder, and generalized anxiety disorder were moderate to strong (ds = 0.48-0.89). Strong associations also emerged between severity of offspring's reparative adaptational impacts and intensity of their parents' posttrauma adaptational styles (Multiple R = .72), with intensity of victim style, especially the mother's, having the strongest effect (β = .31-.33). The scale has both research and clinical relevance for assessing Holocaust survivors' offspring; future studies might investigate its generalizability to other populations affected by mass trauma. (c) 2015 APA, all rights reserved.
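The internal consistency figure reported for the 36-item scale (Cronbach's α = .91) has a simple closed form, α = k/(k-1) * (1 - sum of item variances / variance of total scores). The sketch below computes it for a small synthetic response matrix; the data are invented purely to show the calculation and have no connection to the study.

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of scores."""
    n_items = responses.shape[1]
    item_variances = responses.var(axis=0, ddof=1)
    total_variance = responses.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1.0 - item_variances.sum() / total_variance)

# Synthetic responses: 6 respondents, 4 items scored 1-5 (illustrative only).
scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [3, 3, 3, 4],
    [5, 5, 4, 5],
    [1, 2, 2, 1],
    [4, 4, 5, 4],
])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```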
Developing a methodology to assess the impact of research grant funding: a mixed methods approach.
Bloch, Carter; Sørensen, Mads P; Graversen, Ebbe K; Schneider, Jesper W; Schmidt, Evanthia Kalpazidou; Aagaard, Kaare; Mejlgaard, Niels
2014-04-01
This paper discusses the development of a mixed methods approach to analyse research funding. Research policy has taken on an increasingly prominent role in the broader political scene, where research is seen as a critical factor in maintaining and improving growth, welfare and international competitiveness. This has motivated growing emphasis on the impacts of science funding, and how funding can best be designed to promote socio-economic progress. Meeting these demands for impact assessment involves a number of complex issues that are difficult to fully address in a single study or in the design of a single methodology. However, they point to some general principles that can be explored in methodological design. We draw on a recent evaluation of the impacts of research grant funding, discussing both key issues in developing a methodology for the analysis and subsequent results. The case of research grant funding, involving a complex mix of direct and intermediate effects that contribute to the overall impact of funding on research performance, illustrates the value of a mixed methods approach to provide a more robust and complete analysis of policy impacts. Reflections on the strengths and weaknesses of the methodology are used to examine refinements for future work. Copyright © 2014 Elsevier Ltd. All rights reserved.
COMPUTATIONAL METHODOLOGIES for REAL-SPACE STRUCTURAL REFINEMENT of LARGE MACROMOLECULAR COMPLEXES
Goh, Boon Chong; Hadden, Jodi A.; Bernardi, Rafael C.; Singharoy, Abhishek; McGreevy, Ryan; Rudack, Till; Cassidy, C. Keith; Schulten, Klaus
2017-01-01
The rise of the computer as a powerful tool for model building and refinement has revolutionized the field of structure determination for large biomolecular systems. Despite the wide availability of robust experimental methods capable of resolving structural details across a range of spatiotemporal resolutions, computational hybrid methods have the unique ability to integrate the diverse data from multimodal techniques such as X-ray crystallography and electron microscopy into consistent, fully atomistic structures. Here, commonly employed strategies for computational real-space structural refinement are reviewed, and their specific applications are illustrated for several large macromolecular complexes: ribosome, virus capsids, chemosensory array, and photosynthetic chromatophore. The increasingly important role of computational methods in large-scale structural refinement, along with current and future challenges, is discussed. PMID:27145875
Challenges of Representing Sub-Grid Physics in an Adaptive Mesh Refinement Atmospheric Model
NASA Astrophysics Data System (ADS)
O'Brien, T. A.; Johansen, H.; Johnson, J. N.; Rosa, D.; Benedict, J. J.; Keen, N. D.; Collins, W.; Goodfriend, E.
2015-12-01
Some of the greatest potential impacts from future climate change are tied to extreme atmospheric phenomena that are inherently multiscale, including tropical cyclones and atmospheric rivers. Extremes are challenging to simulate in conventional climate models due to existing models' coarse resolutions relative to the native length-scales of these phenomena. Studying the weather systems of interest requires an atmospheric model with sufficient local resolution, and sufficient performance for long-duration climate-change simulations. To this end, we have developed a new global climate code with adaptive spatial and temporal resolution. The dynamics are formulated using a block-structured conservative finite volume approach suitable for moist non-hydrostatic atmospheric dynamics. By using both space- and time-adaptive mesh refinement, the solver focuses computational resources only where greater accuracy is needed to resolve critical phenomena. We explore different methods for parameterizing sub-grid physics, such as microphysics, macrophysics, turbulence, and radiative transfer. In particular, we contrast the simplified physics representation of Reed and Jablonowski (2012) with the more complex physics representation used in the System for Atmospheric Modeling of Khairoutdinov and Randall (2003). We also explore the use of a novel macrophysics parameterization that is designed to be explicitly scale-aware.
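Adaptive mesh refinement solvers of this kind decide where to refine by tagging cells that exceed an error or gradient indicator and then grouping the tagged cells into finer blocks. The sketch below shows a generic gradient-threshold tagging criterion applied to a synthetic field; it is an illustration of the idea only, and the refinement criteria and block generation in the model described here are more involved.

```python
import numpy as np

def tag_cells_for_refinement(field: np.ndarray, threshold: float) -> np.ndarray:
    """Flag cells whose local gradient magnitude exceeds a threshold (2-D, cell-centered)."""
    grad_y, grad_x = np.gradient(field)
    return np.hypot(grad_x, grad_y) > threshold

# Synthetic coarse-grid field with a sharp front near x = 0.5 (illustrative only).
x = np.linspace(0.0, 1.0, 64)
y = np.linspace(0.0, 1.0, 64)
xx, yy = np.meshgrid(x, y)
field = np.tanh((xx - 0.5) * 40.0)

tags = tag_cells_for_refinement(field, threshold=0.5)
print(f"{int(tags.sum())} of {tags.size} cells tagged for refinement")
```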
Testing the prospective evaluation of a new healthcare system
Planitz, Birgit; Sanderson, Penelope; Freeman, Clinton; Xiao, Tania; Botea, Adi; Orihuela, Cristina Beltran
2012-01-01
Research into health ICT adoption suggests that the failure to understand the clinical workplace has been a major contributing factor to the failure of many computer-based clinical systems. We suggest that clinicians and administrators need methods for envisioning future use when adopting new ICT. This paper presents and evaluates a six-stage “prospective evaluation” model that clinicians can use when assessing the impact of a new electronic patient information system on a Specialist Outpatients Department (SOPD). The prospective evaluation model encompasses normative, descriptive, formative and projective approaches. We show that this combination helped health informaticians to make reasonably accurate predictions for technology adoption at the SOPD. We suggest some refinements, however, to improve the scope and accuracy of predictions. PMID:23304347
A Novel in Vivo Model for Assessing the Impact of Geophagic Earth on Iron Status
Seim, Gretchen L.; Tako, Elad; Ahn, Cedric; Glahn, Raymond P.; Young, Sera L.
2016-01-01
The causes and consequences of geophagy, the craving and consumption of earth, remain enigmatic, despite its recognition as a behavior with public health implications. Iron deficiency has been proposed as both a cause and consequence of geophagy, but methodological limitations have precluded a decisive investigation into this relationship. Here we present a novel in vivo model for assessing the impact of geophagic earth on iron status: Gallus gallus (broiler chicken). For four weeks, animals were gavaged daily with varying dosages of geophagic material or pure clay mineral. Differences in haemoglobin (Hb) across treatment groups were assessed weekly and differences in liver ferritin, liver iron, and gene expression of the iron transporters divalent metal transporter 1 (DMT1), duodenal cytochrome B (DcytB) and ferroportin were assessed at the end of the study. Minimal impact on iron status indicators was observed in all non-control groups, suggesting dosing of geophagic materials may need refining in future studies. However, this model shows clear advantages over prior methods used both in vitro and in humans, and represents an important step in explaining the public health impact of geophagy on iron status. PMID:27304966
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Chun; Leung, L. Ruby; Park, Sang-Hun
Advances in computing resources are gradually moving regional and global numerical forecasting simulations towards sub-10 km resolution, but global high resolution climate simulations remain a challenge. The non-hydrostatic Model for Prediction Across Scales (MPAS) provides a global framework to achieve very high resolution using regional mesh refinement. Previous studies using the hydrostatic version of MPAS (H-MPAS) with the physics parameterizations of Community Atmosphere Model version 4 (CAM4) found notable resolution-dependent behaviors. This study revisits the resolution sensitivity using the non-hydrostatic version of MPAS (NH-MPAS) with both CAM4 and CAM5 physics. A series of aqua-planet simulations at global quasi-uniform resolutions ranging from 240 km to 30 km and global variable resolution simulations with a regional mesh refinement of 30 km resolution over the tropics are analyzed, with a primary focus on the distinct characteristics of NH-MPAS in simulating precipitation, clouds, and large-scale circulation features compared to H-MPAS-CAM4. The resolution sensitivity of total precipitation and column integrated moisture in NH-MPAS is smaller than that in H-MPAS-CAM4. This contributes importantly to the reduced resolution sensitivity of large-scale circulation features such as the inter-tropical convergence zone and Hadley circulation in NH-MPAS compared to H-MPAS. In addition, NH-MPAS shows almost no resolution sensitivity in the simulated westerly jet, in contrast to the obvious poleward shift in H-MPAS with increasing resolution, which is partly explained by differences in the hyperdiffusion coefficients used in the two models that influence wave activity. With the reduced resolution sensitivity, simulations in the refined region of the NH-MPAS global variable resolution configuration exhibit zonally symmetric features that are more comparable to the quasi-uniform high-resolution simulations than those from H-MPAS that displays zonal asymmetry in simulations inside the refined region. Overall, NH-MPAS with CAM5 physics shows less resolution sensitivity compared to CAM4. These results provide a reference for future studies to further explore the use of NH-MPAS for high-resolution climate simulations in idealized and realistic configurations.
Salter, H; Holland, R
2014-09-01
In the last decade, there have been intensive efforts to invent, qualify and use novel biomarkers as a means to improve success rates in drug discovery and development. The biomarkers field is maturing and this article considers whether these research efforts have brought about the expected benefits. The characteristics of a clinically useful biomarker are described and the impact this area of research has had is evaluated by reviewing a few, key examples of emerging biomarkers. There is evidence that the impact has been genuine and is increasing in both the drug and the diagnostic discovery and development processes. Beneficial impact on patient health outcomes seems relatively limited thus far, with the greatest impact in oncology (again, both in terms of novel drugs and in terms of more refined diagnoses and therefore more individualized treatment). However, the momentum of research would indicate that patient benefits are likely to increase substantially and to broaden across multiple therapeutic areas. Even though this research was originally driven by a desire to improve the drug discovery and development process, and was therefore funded with this aim in mind, it seems likely that the largest impact may actually come from more refined diagnosis. Refined diagnosis will facilitate both better allocation of healthcare resources and the use of treatment regimens which are optimized for the individual patient. This article also briefly reviews emerging technological approaches and how they relate to the challenges inherent in biomarker discovery and validation, and discusses the role of public/private partnerships in innovative biomarker research. © 2014 The Association for the Publication of the Journal of Internal Medicine.
Simulated Impacts of Climate Change on Water Use and Yield of Irrigated Sugarcane in South Africa
NASA Technical Reports Server (NTRS)
Jones, M.R; Singels, A.; Ruane, A. C.
2015-01-01
Reliable predictions of climate change impacts on water use, irrigation requirements and yields of irrigated sugarcane in South Africa (a water-scarce country) are necessary to plan adaptation strategies. Although previous work has been done in this regard, methodologies and results vary considerably. The objectives were (1) to estimate likely impacts of climate change on sugarcane yields, water use and irrigation demand at three irrigated sugarcane production sites in South Africa (Malelane, Pongola and La Mercy) for current (1980-2010) and future (2070-2100) climate scenarios, using an approach based on the Agricultural Model Inter-comparison and Improvement Project (AgMIP) protocols; and (2) to assess the suitability of this methodology for investigating climate change impacts on sugarcane production. Future climate datasets were generated using the Delta downscaling method and three Global Circulation Models (GCMs) assuming atmospheric CO2 concentration [CO2] of 734 ppm(A2 emissions scenario). Yield and water use were simulated using the DSSAT-Canegro v4.5 model. Irrigated cane yields are expected to increase at all three sites (between 11 and 14%), primarily due to increased interception of radiation as a result of accelerated canopy development. Evapotranspiration and irrigation requirements increased by 11% due to increased canopy cover and evaporative demand. Sucrose yields are expected to decline because of increased consumption of photo-assimilate for structural growth and maintenance respiration. Crop responses in canopy development and yield formation differed markedly between the crop cycles investigated. Possible agronomic implications of these results include reduced weed control costs due to shortened periods of partial canopy, a need for improved efficiency of irrigation to counter increased demands, and adjustments to ripening and harvest practices to counter decreased cane quality and optimize productivity. Although the Delta climate data downscaling method is considered robust, accurate and easily-understood, it does not change the future number of rain-days per month. The impacts of this and other climate data simplifications ought to be explored in future work. Shortcomings of the DSSAT-Canegro model include the simulated responses of phenological development, photosynthesis and respiration processes to high temperatures, and the disconnect between simulated biomass accumulation and expansive growth. Proposed methodology refinements should improve the reliability of predicted climate change impacts on sugarcane yield.
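The Delta downscaling method referred to above adjusts an observed baseline climatology by the change simulated between a GCM's baseline and future periods, typically additively for temperature and multiplicatively for precipitation. The sketch below applies monthly delta factors to placeholder station values; it is a minimal illustration of the method's arithmetic, not the AgMIP implementation used in the study, which among other details leaves the number of rain-days per month unchanged, as noted above.

```python
import numpy as np

def delta_downscale_temperature(obs_monthly, gcm_baseline_monthly, gcm_future_monthly):
    """Additive delta method: observed monthly means shifted by the GCM-simulated change."""
    return np.asarray(obs_monthly) + (np.asarray(gcm_future_monthly) - np.asarray(gcm_baseline_monthly))

def delta_downscale_precipitation(obs_monthly, gcm_baseline_monthly, gcm_future_monthly):
    """Multiplicative delta method: observed monthly totals scaled by the GCM-simulated ratio."""
    return np.asarray(obs_monthly) * (np.asarray(gcm_future_monthly) / np.asarray(gcm_baseline_monthly))

# Placeholder January-March values for one site (illustrative only).
obs_tmax = [31.0, 30.5, 29.8]          # observed 1980-2010 monthly means, degrees C
gcm_tmax_base = [29.0, 28.8, 28.1]     # GCM 1980-2010 monthly means
gcm_tmax_future = [32.1, 31.7, 30.9]   # GCM 2070-2100 monthly means

print(delta_downscale_temperature(obs_tmax, gcm_tmax_base, gcm_tmax_future))
```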
EVALUATION OF THE DISPOSAL OF FLUE GAS CLEANING WASTES IN COAL MINES AND AT SEA: REFINED ASSESSMENT
The report gives a refined assessment of the feasibility of disposing of flue gas cleaning (FGC) wastes in coal mines and at sea. Its focus is on specific impact areas identified in an earlier assessment. These areas were further investigated through laboratory studies as well as...
Wingfield, Tom; Boccia, Delia; Tovar, Marco A; Huff, Doug; Montoya, Rosario; Lewis, James J; Gilman, Robert H; Evans, Carlton A
2015-08-21
Cash transfers are key interventions in the World Health Organisation's post-2015 global TB policy. However, evidence guiding TB-specific cash transfer implementation is limited. We designed, implemented and refined a novel TB-specific socioeconomic intervention that included cash transfers, which aimed to support TB prevention and cure in resource-constrained shantytowns in Lima, Peru for: the Community Randomized Evaluation of a Socioeconomic Intervention to Prevent TB (CRESIPT) project. Newly-diagnosed TB patients from study-site healthposts were eligible to receive the intervention consisting of economic and social support. Economic support was provided to patient households through cash transfers on meeting the following conditions: screening for TB in household contacts and MDR TB in patients; adhering to TB treatment and chemoprophylaxis; and engaging with CRESIPT social support (household visits and community meetings). To evaluate project acceptability, quantitative and qualitative feedback was collected using a mixed-methods approach during formative activities. Formative activities included consultations, focus group discussions and questionnaires conducted with the project team, project participants, civil society and stakeholders. Over 7 months, 135 randomly-selected patients and their 647 household contacts were recruited from 32 impoverished shantytown communities. Of 1299 potential cash transfers, 964 (74 %) were achieved, 259 (19 %) were not achieved, and 76 (7 %) were yet to be achieved. Of those achieved, 885/964 (92 %) were achieved optimally and 79/964 (8 %) sub-optimally. Key project successes were identified during 135 formative activities and included: strong multi-sectorial collaboration; generation of new evidence for TB-specific cash transfer; and the project being perceived as patient-centred and empowering. Challenges included: participant confidence being eroded through cash transfer delays, hidden account-charges and stigma; access to the initial bank-provider being limited; and conditions requiring participation of all TB-affected household members (e.g. community meetings) being hard to achieve. Refinements were made to improve project acceptability and future impact: the initial bank-provider was changed; conditional and unconditional cash transfers were combined; cash transfer sums were increased to a locally-appropriate, evidence-based amount; and cash transfer size varied according to patient household size to maximally reduce mitigation of TB-related costs and be more responsive to household needs. A novel TB-specific socioeconomic intervention including conditional cash transfers has been designed, implemented, refined and is ready for impact assessment, including by the CRESIPT project. The lessons learnt during this research will inform policy-makers and decision-makers for future implementation of related interventions.
A Man-Machine System for Contemporary Counseling Practice: Diagnosis and Prediction.
ERIC Educational Resources Information Center
Roach, Arthur J.
This paper looks at present and future capabilities for diagnosis and prediction in computer-based guidance efforts and reviews the problems and potentials which will accompany the implementation of such capabilities. In addition to necessary procedural refinement in prediction, future developments in computer-based educational and career…
40 CFR 35.3525 - Authorized types of assistance from the Fund.
Code of Federal Regulations, 2010 CFR
2010-07-01
... requirement in future years. (6) A State may provide incremental assistance for a project (e.g., for a... allowance from future capitalization grants. In addition, a State must: (i) Indicate in the Intended Use... State may buy or refinance local debt obligations of municipal, intermunicipal, or interstate agencies...
NASA Technical Reports Server (NTRS)
Jackson, Dan
2017-01-01
The ISS is an outstanding platform for developing, testing and refining laser communications systems for future exploration. A recent ISS project which improved ISS communications satellite acquisition performance proves the platform’s utility as a laser communications systems testbed.
Sugarman, Jeremy; Barnes, Mark; Rose, Scott; Dumchev, Kostyantyn; Sarasvita, Riza; Viet, Ha Tran; Zeziulin, Oleksandr; Susami, Hepa; Go, Vivian; Hoffman, Irving; Miller, William C
2018-06-22
People who inject drugs with high-risk sharing practices have high rates of HIV transmission and face barriers to HIV care. Interventions to overcome these barriers are needed; however, stigmatisation of drug use and HIV infection leads to safety concerns during the planning and conduct of research on such interventions. In preparing to address concerns about safety and wellbeing of participants in an international research study, HIV Prevention Trials Network 074, we developed participant safety plans (PSPs) at each site to supplement local research ethics committee oversight, community engagement, and usual clinical trial procedures. The PSPs were informed by systematic local legal and policy assessments, and interviews with key stakeholders. After PSP refinement and implementation, we assessed social impacts at each study visit to ensure continued safety. Throughout the study, five participants reported a negative social impact, with three resulting from study participation. Future research with stigmatised populations should consider using and assessing this approach to enhance safety and welfare. Copyright © 2018 Elsevier Ltd. All rights reserved.
Color changing photonic crystals detect blast exposure
Cullen, D. Kacy; Xu, Yongan; Reneer, Dexter V.; Browne, Kevin D.; Geddes, James W.; Yang, Shu; Smith, Douglas H.
2010-01-01
Blast-induced traumatic brain injury (bTBI) is the “signature wound” of the current wars in Iraq and Afghanistan. However, with no objective information of relative blast exposure, warfighters with bTBI may not receive appropriate medical care and are at risk of being returned to the battlefield. Accordingly, we have created a colorimetric blast injury dosimeter (BID) that exploits material failure of photonic crystals to detect blast exposure. Appearing like a colored sticker, the BID is fabricated in photosensitive polymers via multi-beam interference lithography. Although very stable in the presence of heat, cold or physical impact, sculpted micro- and nano-structures of the BID are physically altered in a precise manner by blast exposure, resulting in color changes that correspond with blast intensity. This approach offers a lightweight, power-free sensor that can be readily interpreted by the naked eye. Importantly, with future refinement this technology may be deployed to identify soldiers exposed to blast at levels suggested to be supra-threshold for non-impact blast-induced mild TBI. PMID:21040795
The Uncertain Nature of Cometary Motions
NASA Technical Reports Server (NTRS)
Yeomans, Donald K.
1997-01-01
The number of active short- and long-periodic comets crossing the Earth's orbit each year is less than 10 percent of the corresponding number of asteroids crossing the Earth's orbit. However, the higher relative velocities of comets with respect to the Earth and the uncertainties associated with accurately computing their future trajectories can cause considerable problems when assessing the risks of Earth-crossing objects. Unlike asteroids, the motions of active comets are often affected by so-called nongravitational (outgassing) forces that are imperfectly modeled. In addition, the astrometric optical observations that are used to refine a comet's orbit are often imprecise because a comet's center of mass can be hidden by atmospheric gas and dust. For long-period comets, there is the additional problem of having to base orbital solutions on relatively short observational data intervals. Long-term numerical integrations extending two centuries into the future have been carried out to investigate upcoming Earth-close approaches by known periodic comets. Error analyses and impact probabilities have been computed for those comets that will pass closest to the Earth. Although there are no known comets that will make dangerously close Earth approaches in the next two centuries, there are a few objects that warrant future monitoring.
NASA Astrophysics Data System (ADS)
Dixon, K. W.; Lanzante, J. R.; Adams-Smith, D.
2017-12-01
Several challenges exist when seeking to use future climate model projections in a climate impacts study. A not uncommon approach is to utilize climate projection data sets derived from more than one future emissions scenario and from multiple global climate models (GCMs). The range of future climate responses represented in the set is sometimes taken to be indicative of levels of uncertainty in the projections. Yet, GCM outputs are deemed to be unsuitable for direct use in many climate impacts applications. GCM grids typically are viewed as being too coarse. Additionally, regional or local-scale biases in a GCM's simulation of the contemporary climate that may not be problematic from a global climate modeling perspective may be unacceptably large for a climate impacts application. Statistical downscaling (SD) of climate projections - a type of post-processing that uses observations to inform the refinement of GCM projections - is often used in an attempt to account for GCM biases and to provide additional spatial detail. "What downscaled climate projection is the best one to use" is a frequently asked question, but one that is not always easy to answer, as it can be dependent on stakeholder needs and expectations. Here we present results from a perfect model experimental design illustrating how SD method performance can vary not only by SD method, but how performance can also vary by location, season, climate variable of interest, amount of projected climate change, SD configuration choices, and whether one is interested in central tendencies or the tails of the distribution. Awareness of these factors can be helpful when seeking to determine the suitability of downscaled climate projections for specific climate impacts applications. It also points to the potential value of considering more than one SD data product in a study, so as to acknowledge uncertainties associated with the strengths and weaknesses of different downscaling methods.
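One widely used family of statistical downscaling corrections is empirical quantile mapping, in which model values are adjusted by mapping them through the model's and the observations' cumulative distributions for a training period. The sketch below is a deliberately minimal version of that idea, using synthetic data, no seasonal stratification, and no treatment of values outside the training range; it is offered only to make the concept concrete and is not one of the specific SD methods evaluated in the perfect model experiment described here.

```python
import numpy as np

def quantile_map(model_values, model_train, obs_train, n_quantiles=101):
    """Empirical quantile mapping: correct model values using training-period CDFs."""
    quantiles = np.linspace(0.0, 1.0, n_quantiles)
    model_q = np.quantile(model_train, quantiles)
    obs_q = np.quantile(obs_train, quantiles)
    # Map each model value to its quantile in the model training distribution,
    # then read off the observed value at that same quantile.
    return np.interp(model_values, model_q, obs_q)

rng = np.random.default_rng(0)
obs_train = rng.normal(15.0, 3.0, size=1000)    # synthetic observed temperatures
model_train = rng.normal(17.0, 4.0, size=1000)  # synthetic biased model temperatures
model_future = rng.normal(19.0, 4.0, size=5)    # synthetic future model values

print(quantile_map(model_future, model_train, obs_train))
```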
Handbook of Petroleum Processing
NASA Astrophysics Data System (ADS)
Jones, David S. J.; Pujado, Peter P.
This handbook describes and discusses the features that make up the petroleum refining industry. It begins with a description of the crude oils and their nature, and continues with the saleable products from the refining processes, with a review of the environmental impact. There is a complete overview of the processes that make up the refinery with a brief history of those processes.
Space station needs, attributes, and architectural options: Technology development
NASA Technical Reports Server (NTRS)
Robert, A. C.
1983-01-01
The technology development of the space station is examined as it relates to space station growth and equipment requirements for future missions. Future mission topics are refined and used to establish a systems data base. Technology for human factors engineering, space maintenance, satellite design, and laser communications and tracking is discussed.
Presidential Address: Culture and the Future of Education Research
ERIC Educational Resources Information Center
Halse, Christine
2013-01-01
Recent changes in higher education have confronted education research with a conundrum: how our traditionally multidisciplinary field can refine itself as a unified discipline. In this address I sketch out what this conundrum may mean for education research, both substantively and methodologically, in the future. I propose that one starting point…
NASA Technical Reports Server (NTRS)
Lunsford, Myrtis Leigh
1998-01-01
The Army-NASA Virtual Innovations Laboratory (ANVIL) was recently created to provide virtual reality tools for performing Human Engineering and operations analysis for both NASA and the Army. The author's summer research project consisted of developing and refining these tools for NASA's Reusable Launch Vehicle (RLV) program. Several general simulations were developed for use by the ANVIL for the evaluation of the X34 Engine Changeout procedure. These simulations were developed with the software tool dVISE 4.0.0 produced by Division Inc. All software was run on an SGI Indigo2 High Impact. This paper describes the simulations, various problems encountered with the simulations, other summer activities, and possible work for the future. We first begin with a brief description of virtual reality systems.
Porretta, Daniele; Mastrantonio, Valentina; Amendolia, Sara; Gaiarsa, Stefano; Epis, Sara; Genchi, Claudio; Bandi, Claudio; Otranto, Domenico; Urbanelli, Sandra
2013-09-19
Global climate change can seriously impact the epidemiological dynamics of vector-borne diseases. In this study we investigated how future climatic changes could affect the climatic niche of Ixodes ricinus (Acari, Ixodida), among the most important vectors of pathogens of medical and veterinary concern in Europe. Species Distribution Modelling (SDM) was used to reconstruct the climatic niche of I. ricinus, and to project it into the future conditions for 2050 and 2080, under two scenarios: a continuous human demographic growth and a severe increase of gas emissions (scenario A2), and a scenario that proposes lower human demographic growth than A2 and more sustainable gas emissions (scenario B2). Models were reconstructed using the algorithm of "maximum entropy", as implemented in the software Maxent 3.3.3e; 4,544 occurrence points and 15 bioclimatic variables were used. In both scenarios the climatic niche was predicted to expand to roughly twice the current area, with higher climatic suitability under scenario B2 than under A2. The expansion occurred both latitudinally and longitudinally, extending into northern Eurasian regions (e.g. Sweden and Russia) that were previously unsuitable for the species. Our models are congruent with the predictions of range expansion already observed in I. ricinus at a regional scale and provide a qualitative and quantitative assessment of the future climatically suitable areas for I. ricinus at a continental scale. Although higher-resolution SDM should be complemented by a more refined analysis of further abiotic and biotic data, the results presented here suggest that under future climatic scenarios most of the current distribution area of I. ricinus could remain suitable and could significantly increase at a continental geographic scale. Therefore disease outbreaks of pathogens transmitted by this tick species could emerge in previously non-endemic geographic areas. Further studies will extend and refine the present data toward a better understanding of the risk that I. ricinus poses to human health.
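Reconstructing and projecting a climatic niche can be pictured, in simplified form, as fitting a presence/background classifier on bioclimatic predictors and evaluating it on future predictor values. The sketch below uses a logistic-regression stand-in with synthetic data; it is not the Maxent 3.3.3e software or the variable set used in the study, only an illustration of the projection step.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Synthetic stand-in predictors: annual mean temperature (deg C) and annual precipitation (mm).
presence = rng.normal([9.0, 800.0], [2.0, 150.0], size=(200, 2))      # occurrence points
background = rng.normal([5.0, 600.0], [6.0, 300.0], size=(1000, 2))   # background sample

X = np.vstack([presence, background])
y = np.concatenate([np.ones(len(presence)), np.zeros(len(background))])

# Fit a presence/background suitability model (a simple proxy for a niche model).
model = LogisticRegression(max_iter=1000).fit(X, y)

# Project suitability under hypothetical future bioclimatic conditions at two sites.
future_climate = np.array([[11.5, 850.0], [7.0, 700.0]])
print(model.predict_proba(future_climate)[:, 1])
```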
Susitna Hydroelectric Project: terrestrial environmental workshop and preliminary simulation model
Everitt, Robert R.; Sonntag, Nicholas C.; Auble, Gregory T.; Roelle, James E.; Gazey, William
1982-01-01
The technical feasibility, economic viability, and environmental impacts of a hydroelectric development project in the Susitna River Basin are being studied by Acres American, Inc. on behalf of the Alaska Power Authority. As part of these studies, Acres American recently contracted LGL Alaska Research Associates, Inc. to coordinate the terrestrial environmental studies being performed by the Alaska Department of Fish and Game and, as subcontractors to LGL, several University of Alaska research groups. LGL is responsible for further quantifying the potential impacts of the project on terrestrial wildlife and vegetation, and for developing a plan to mitigate adverse impacts on the terrestrial environment. The impact assessment and mitigation plan will be included as part of a license application to the Federal Energy Regulatory Commission (FERC) scheduled for the first quarter of 1983. The quantification of impacts, mitigation planning, and design of future research is being organized using a computer simulation modelling approach. Through a series of workshops attended by researchers, resource managers, and policy-makers, a computer model is being developed and refined for use in the quantification of impacts on terrestrial wildlife and vegetation, and for evaluating different mitigation measures such as habitat enhancement and the designation of replacement lands to be managed by wildlife habitat. This report describes the preliminary model developed at the first workshop held August 23 -27, 1982 in Anchorage.
Side impact test and analyses of a DOT-111 tank car : final report.
DOT National Transportation Integrated Search
2015-10-01
Transportation Technology Center, Inc. conducted a side impact test on a DOT-111 tank car to evaluate the performance of the : tank car under dynamic impact conditions and to provide data for the verification and refinement of a computational model. ...
Refining lunar impact chronology through high spatial resolution 40Ar/39Ar dating of impact melts
Mercer, Cameron M.; Young, Kelsey E.; Weirich, John R.; Hodges, Kip V.; Jolliff, Bradley L.; Wartho, Jo-Anne; van Soest, Matthijs C.
2015-01-01
Quantitative constraints on the ages of melt-forming impact events on the Moon are based primarily on isotope geochronology of returned samples. However, interpreting the results of such studies can often be difficult because the provenance region of any sample returned from the lunar surface may have experienced multiple impact events over the course of billions of years of bombardment. We illustrate this problem with new laser microprobe 40Ar/39Ar data for two Apollo 17 impact melt breccias. Whereas one sample yields a straightforward result, indicating a single melt-forming event at ca. 3.83 Ga, data from the other sample document multiple impact melt–forming events between ca. 3.81 Ga and at least as young as ca. 3.27 Ga. Notably, published zircon U/Pb data indicate the existence of even older melt products in the same sample. The revelation of multiple impact events through 40Ar/39Ar geochronology is likely not to have been possible using standard incremental heating methods alone, demonstrating the complementarity of the laser microprobe technique. Evidence for 3.83 Ga to 3.81 Ga melt components in these samples reinforces emerging interpretations that Apollo 17 impact breccia samples include a significant component of ejecta from the Imbrium basin impact. Collectively, our results underscore the need to quantitatively resolve the ages of different melt generations from multiple samples to improve our current understanding of the lunar impact record, and to establish the absolute ages of important impact structures encountered during future exploration missions in the inner Solar System. PMID:26601128
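The apparent ages quoted here come from the standard 40Ar/39Ar age equation, t = (1/λ) ln(1 + J R), where R is the measured radiogenic 40Ar*/39Ar(K) ratio, J is the irradiation parameter determined from a co-irradiated fluence monitor, and λ is the total 40K decay constant. The sketch below evaluates that equation for hypothetical J and R values; it shows only the arithmetic, not the full data reduction applied to the laser microprobe analyses.

```python
import math

TOTAL_K40_DECAY_CONSTANT = 5.543e-10  # per year (Steiger and Jaeger, 1977)

def ar_ar_age_ga(j_value: float, ar40_star_over_ar39: float) -> float:
    """Apparent 40Ar/39Ar age in Ga from t = (1/lambda) * ln(1 + J * R)."""
    age_years = math.log1p(j_value * ar40_star_over_ar39) / TOTAL_K40_DECAY_CONSTANT
    return age_years / 1.0e9

# Hypothetical inputs: J from a fluence monitor, R measured on an impact-melt spot.
print(f"Apparent age: {ar_ar_age_ga(0.01, 770.0):.2f} Ga")
```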
NASA Astrophysics Data System (ADS)
Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter; Ballard, Marlin; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina; Shiri, Ron
2016-07-01
This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify end-to-end high-contrast starlight suppression performance. This pathfinder testbed will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.
NASA Technical Reports Server (NTRS)
Denney, Ewen; Power, John
2003-01-01
We introduce a hierarchical notion of formal proof, useful in the implementation of theorem provers, which we call hiproofs. Two alternative definitions are given, motivated by existing notations used in theorem proving research. We define transformations between these two forms of hiproof, develop notions of underlying proof, and give a suitable definition of refinement in order to model incremental proof development. We show that our transformations preserve both underlying proofs and refinement. The relationship of our theory to existing theorem proving systems is discussed, as is its future extension.
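As a rough illustration of the hierarchical idea, here is a minimal sketch of a tree-shaped proof with a flattening to its underlying sequence of rule applications; the class and function names are invented for illustration and do not come from the paper, which works with formal definitions rather than a data structure.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class HiProof:
    """Illustrative hierarchical proof node: a labelled box containing sub-proofs."""
    rule: str                                   # name of the rule or tactic applied
    children: List["HiProof"] = field(default_factory=list)

def underlying(p: HiProof) -> List[str]:
    """Flatten a hierarchical proof into its sequence of primitive rule applications."""
    steps = [p.rule]
    for child in p.children:
        steps.extend(underlying(child))
    return steps

# Example: a two-level proof where 'induction' hides two sub-proofs
proof = HiProof("induction", [HiProof("base_case"), HiProof("step_case")])
print(underlying(proof))  # ['induction', 'base_case', 'step_case']
```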
Beekhuijzen, Manon
2017-09-01
Since the adoption of the first globally implemented guidelines for developmental and reproductive toxicity (DART) testing for pharmaceuticals, industrial chemicals and agrochemicals, many years passed without major updates. However, in recent years, significant changes in these guidelines have been made or are being implemented. These changes have been guided by the ethical drive to reduce, refine and replace (3R) animal testing, as well as by the addition of endocrine-disruptor-relevant endpoints. Recent applied improvements have focused on reduction and refinement. Ongoing scientific and technical innovations will provide the means for replacement of animal testing in the future and will improve predictivity in humans. The aim of this review is to provide an overview of ongoing global DART endeavors with respect to the 3Rs, with an outlook towards future advances in DART testing that aspire to reduce animal testing to a minimum, with the ultimate ambition of animal-free hazard and risk assessment. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Hernes, P.; Tzortziou, M.; Salisbury, J.; Mannino, A.; Matrai, P.; Friedrichs, M. A.; Del Castillo, C. E.
2014-12-01
The Arctic region is warming faster than anywhere else on the planet, triggering rapid social and economic changes and impacting both terrestrial and marine ecosystems. Yet our understanding of critical processes and interactions along the Arctic land-ocean interface is limited. Arctic-COLORS is a Field Campaign Scoping Study funded by NASA's Ocean Biology and Biogeochemistry Program that aims to improve understanding and prediction of land-ocean interactions in a rapidly changing Arctic coastal zone, and assess vulnerability, response, feedbacks and resilience of coastal ecosystems, communities and natural resources to current and future pressures. Specific science objectives include: - Quantify lateral fluxes to the Arctic inner shelf from (i) rivers and (ii) the outer shelf/basin that affect biology, biodiversity, biogeochemistry (i.e. organic matter, nutrients, suspended sediment), and the processing rates of these constituents in coastal waters. - Evaluate the impact of the thawing of Arctic permafrost within the river basins on coastal biology, biodiversity and biogeochemistry, including various rates of community production and the role these may play in the health of regional economies. - Assess the impact of changing Arctic landfast ice and coastal sea ice dynamics. - Establish a baseline for comparison to future change, and use state-of-the-art models to assess impacts of environmental change on coastal biology, biodiversity and biogeochemistry. A key component of Arctic-COLORS will be the integration of satellite and field observations with coupled physical-biogeochemical models for predicting impacts of future pressures on Arctic coastal ocean biological processes and biogeochemical cycles. Through interagency and international collaborations, and through the organization of dedicated workshops, town hall meetings and presentations at international conferences, the scoping study engages the broader scientific community and invites participation of experts from a wide range of disciplines, to refine our science objectives and outline detailed research strategies needed to attain these objectives. The deliverable will be a comprehensive report to NASA outlining the major scientific questions, and developing the initial study design and implementation concept.
Yé, Yazoume; Eisele, Thomas P; Eckert, Erin; Korenromp, Eline; Shah, Jui A; Hershey, Christine L; Ivanovich, Elizabeth; Newby, Holly; Carvajal-Velez, Liliana; Lynch, Michael; Komatsu, Ryuichi; Cibulskis, Richard E; Moore, Zhuzhi; Bhattarai, Achuyt
2017-09-01
Concerted efforts from national and international partners have scaled up malaria control interventions, including insecticide-treated nets, indoor residual spraying, diagnostics, prompt and effective treatment of malaria cases, and intermittent preventive treatment during pregnancy in sub-Saharan Africa (SSA). This scale-up warrants an assessment of its health impact to guide future efforts and investments; however, measuring malaria-specific mortality and the overall impact of malaria control interventions remains challenging. In 2007, Roll Back Malaria's Monitoring and Evaluation Reference Group proposed a theoretical framework for evaluating the impact of full-coverage malaria control interventions on morbidity and mortality in high-burden SSA countries. Recently, several evaluations have contributed new ideas and lessons to strengthen this plausibility design. This paper harnesses that new evaluation experience to expand the framework, with additional features, such as stratification, to examine subgroups most likely to experience improvement if control programs are working; the use of a national platform framework; and analysis of complete birth histories from national household surveys. The refined framework has shown that, despite persisting data challenges, combining multiple sources of data, considering potential contributions from both fundamental and proximate contextual factors, and conducting subnational analyses allows identification of the plausible contributions of malaria control interventions on malaria morbidity and mortality.
OCEAN THERMAL ENERGY CONVERSION (OTEC) PROGRAMMATIC ENVIRONMENTAL ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sands, M. D.
1980-01-01
This programmatic environmental analysis is an initial assessment of OTEC technology considering development, demonstration and commercialization; it is concluded that the OTEC development program should continue because the development, demonstration, and commercialization on a single-plant deployment basis should not present significant environmental impacts. However, several areas within the OTEC program require further investigation in order to assess the potential for environmental impacts from OTEC operation, particularly in large-scale deployments and in defining alternatives to closed-cycle biofouling control: (1) Larger-scale deployments of OTEC clusters or parks require further investigations in order to assess optimal platform siting distances necessary to minimize adverse environmental impacts. (2) The deployment and operation of the preoperational platform (OTEC-1) and future demonstration platforms must be carefully monitored to refine environmental assessment predictions, and to provide design modifications which may mitigate or reduce environmental impacts for larger-scale operations. These platforms will provide a valuable opportunity to fully evaluate the intake and discharge configurations, biofouling control methods, and both short-term and long-term environmental effects associated with platform operations. (3) Successful development of OTEC technology to use the maximal resource capabilities and to minimize environmental effects will require a concerted environmental management program, encompassing many different disciplines and environmental specialties.
Chen, Wendy Y; Aertsens, Joris; Liekens, Inge; Broekx, Steven; De Nocker, Leo
2014-08-01
The strategic importance of ecosystem service valuation as an operational basis for policy decisions on natural restoration has been increasingly recognized in order to align the provision of ecosystem services with the expectation of human society. The contingent valuation method (CVM) is widely used to quantify various ecosystem services. However, two areas of concern arise: (1) whether people value specific functional ecosystem services and overlook some intrinsic aspects of natural restoration, and (2) whether people understand the temporal dimension of ecosystem services and payment schedules given in the contingent scenarios. Using a peri-urban riparian meadow restoration project in Flanders, Belgium as a case, we explored the impacts of residents' perceived importance of various ecosystem services and stated financial constraints on their willingness-to-pay for the proposed restoration project employing the CVM. The results indicated that people tended to value all the benefits of riparian ecosystem restoration concurrently, although they accorded different importances to each individual category of ecosystem services. A longer payment scheme can help the respondents to think more about the flow of ecosystem services into future generations. A weak temporal embedding effect can be detected, which might be attributed to respondents' concern about current financial constraints, rather than financial bindings associated with their income and perceived future financial constraints. This demonstrates the multidimensionality of respondents' financial concerns in CV. This study sheds light on refining future CV studies, especially with regard to public expectation of ecosystem services and the temporal dimension of ecosystem services and payment schedules.
Refining a methodology for determining the economic impacts of transportation improvements.
DOT National Transportation Integrated Search
2012-07-01
Estimating the economic impact of transportation improvements has previously proven to be a difficult task. After an exhaustive literature review, it was clear that the transportation profession lacked standards and methodologies for determining econ...
This paper looks at the impact of enforcement activity on facility-level behavior and derives quantitative estimates of the impact. We measure facility-level behavior as the levels of Biological Oxygen Demand (BOD) and Total Suspended Solids (TSS) pollutant discharges generated b...
Copper: Its Environmental Impacts. AIO Red Paper #22.
ERIC Educational Resources Information Center
Boutis, Elizabeth; Jantzen, Jonathan Landis, Ed.
Although copper is a widespread and useful metal, the process of mining and refining copper can have severe detrimental impacts on humans, plants, and animals. The most serious impacts from copper production are the release of sulphur dioxide and other air pollutants and the poisoning of water supplies. These impacts occur in both the mining and…
Refining the site conceptual model at a former uranium mill site in Riverton, Wyoming, USA
Dam, William; Campbell, Sam; Johnson, Ray; ...
2015-07-07
Milling activities at a former uranium mill site near Riverton, Wyoming, USA, contaminated the shallow groundwater beneath and downgradient of the site. Although the mill operated for <6 years (1958-1963), its impact remains an environmental liability. Groundwater modeling predicted that contaminant concentrations were declining steadily, which confirmed the conceptual site model (CSM). However, local flooding in 2010 mobilized contaminants that migrated downgradient from the Riverton site and resulted in a dramatic increase in groundwater contaminant concentrations. This observation indicated that the original CSM was inadequate to explain site conditions and needed to be refined. In response to the new observations after the flood, a collaborative investigation to better understand site conditions and processes commenced. This investigation included installing 103 boreholes to collect soil and groundwater samples, sampling and analysis of evaporite minerals along the bank of the Little Wind River, an analysis of evapotranspiration in the shallow aquifer, and sampling naturally organic-rich sediments near groundwater discharge areas. The enhanced characterization revealed that the existing CSM did not account for high uranium concentrations in groundwater remaining on the former mill site and groundwater plume stagnation near the Little Wind River. Observations from the flood and subsequent investigations indicate that additional characterization is still needed to continue refining the CSM and determine the viability of the natural flushing compliance strategy. Additional sampling, analysis, and testing of soil and groundwater are necessary to investigate secondary contaminant sources, mobilization of contaminants during floods, geochemical processes, contaminant plume stagnation, distribution of evaporite minerals and organic-rich sediments, and mechanisms and rates of contaminant transfer from soil to groundwater. Future data collection will be used to continually revise the CSM and evaluate the compliance strategy at the site.
NASA Technical Reports Server (NTRS)
Glavin, Daniel P.; Dworkin, Jason P.; Lupisella, Mark; Kminek, Gerhard; Rummel, John D.
2010-01-01
NASA and ESA have outlined visions for solar system exploration that will include a series of lunar robotic precursor missions to prepare for, and support a human return to the Moon, and future human exploration of Mars and other destinations. One of the guiding principles for exploration is to pursue compelling scientific questions about the origin and evolution of life. The search for life on objects such as Mars will require that all spacecraft and instrumentation be sufficiently cleaned and sterilized prior to launch to ensure that the scientific integrity of extraterrestrial samples is not jeopardized by terrestrial organic contamination. Under the Committee on Space Research's (COSPAR's) current planetary protection policy for the Moon, no sterilization procedures are required for outbound lunar spacecraft, nor is there yet a planetary protection category for human missions. Future in situ investigations of a variety of locations on the Moon by highly sensitive instruments designed to search for biologically derived organic compounds would help assess the contamination of the Moon by lunar spacecraft. These studies could also provide valuable "ground truth" data for Mars sample return missions and help define planetary protection requirements for future Mars bound spacecraft carrying life detection experiments. In addition, studies of the impact of terrestrial contamination of the lunar surface by the Apollo astronauts could provide valuable data to help refine future Mars surface exploration plans for a human mission to Mars.
Ocean Thermal Energy Conversion (OTEC) Programmatic Environmental Analysis--Appendices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Authors, Various
1980-01-01
The programmatic environmental analysis is an initial assessment of Ocean Thermal Energy Conversion (OTEC) technology considering development, demonstration and commercialization. It is concluded that the OTEC development program should continue because the development, demonstration, and commercialization on a single-plant deployment basis should not present significant environmental impacts. However, several areas within the OTEC program require further investigation in order to assess the potential for environmental impacts from OTEC operation, particularly in large-scale deployments and in defining alternatives to closed-cycle biofouling control: (1) Larger-scale deployments of OTEC clusters or parks require further investigations in order to assess optimal platform siting distances necessary to minimize adverse environmental impacts. (2) The deployment and operation of the preoperational platform (OTEC-1) and future demonstration platforms must be carefully monitored to refine environmental assessment predictions, and to provide design modifications which may mitigate or reduce environmental impacts for larger-scale operations. These platforms will provide a valuable opportunity to fully evaluate the intake and discharge configurations, biofouling control methods, and both short-term and long-term environmental effects associated with platform operations. (3) Successful development of OTEC technology to use the maximal resource capabilities and to minimize environmental effects will require a concerted environmental management program, encompassing many different disciplines and environmental specialties. This volume contains these appendices: Appendix A -- Deployment Scenario; Appendix B -- OTEC Regional Characterization; and Appendix C -- Impact and Related Calculations.
Developing empirically supported theories of change for housing investment and health
Thomson, Hilary; Thomas, Sian
2015-01-01
The assumption that improving housing conditions can lead to improved health may seem a self-evident hypothesis. Yet evidence from intervention studies suggests small or unclear health improvements, indicating that further thought is required to refine this hypothesis. Articulation of a theory can help avoid a black box approach to research and practice and has been advocated as especially valuable for those evaluating complex social interventions like housing. This paper presents a preliminary theory of housing improvement and health based on a systematic review conducted by the authors. Following extraction of health outcomes, data on all socio-economic impacts were extracted by two independent reviewers from both qualitative and quantitative studies. Health and socio-economic outcome data from the better quality studies (n = 23/34) were mapped onto one-page logic models by two independent reviewers, and a final model reflecting reviewer agreement was prepared. Where there was supporting evidence of links between outcomes, these were indicated in the model. Two models of specific improvements (warmth & energy efficiency; and housing-led renewal), and a final overall model were prepared. The models provide a visual map of the best available evidence on the health and socio-economic impacts of housing improvement. The use of a logic model design helps to elucidate the possible pathways between housing improvement and health and as such might be described as an empirically based theory. Changes in housing factors were linked to changes in socio-economic determinants of health. This points to the potential for longer term health impacts which could not be detected within the lifespan of the evaluations. The developed theories are limited by the available data and need to be tested and refined. However, in addition to providing one-page summaries for evidence users, the theory may usefully inform future research on housing and health. PMID:25461878
A refined definition improves the measurement reliability of the tip-apex distance.
Sakagoshi, Daigo; Sawaguchi, Takeshi; Shima, Yosuke; Inoue, Daisuke; Oshima, Takeshi; Goldhahn, Sabine
2016-07-01
Tip-apex distance (TAD) is reported as a good predictor for cut-outs of lag screws and spiral blades in the treatment of intertrochanteric fractures, and surgeons are advised to strive for TAD within 20 mm. However, the femoral neck axis and the position of the lower limb in the lateral radiograph are not clearly defined and can lead to measurement errors. We propose a refined TAD by defining these factors. The objective of this study was to analyze the reliability of this refined TAD. The radiographs of 130 prospective cases with unstable trochanteric fractures were used for the analysis of the refined TAD. The refined TAD was independently measured by 2 raters with clinical experience of more than 10 years (raters 1 and 2) and 2 raters with much less clinical experience (raters 3 and 4) after they received training on the new measurement method. The intraclass correlation coefficient (ICC [2,4]) was calculated to assess the interrater reliability. The mean refined TADs were 18.2, 18.4, 18.2, and 18.2 mm for raters 1-4, respectively. There was a strong correlation among all four raters (ICC = 0.998; 95% CI: 0.998-0.999). Regardless of the clinical experience of raters, the refined TAD is a reliable tool and can be used to develop new TAD recommendations for predicting failure of fixation. Future studies with larger samples are needed to evaluate the predictive value of the refined TAD. Copyright © 2016 The Japanese Orthopaedic Association. Published by Elsevier B.V. All rights reserved.
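As a sketch of the reliability statistic reported above, the following computes a two-way random-effects, average-measures intraclass correlation, ICC(2,k), from an n-subjects-by-k-raters matrix (Shrout and Fleiss formulation). The measurement values shown are hypothetical and are not the study's data.

```python
import numpy as np

def icc_2k(ratings):
    """ICC(2,k): two-way random effects, average of k raters (Shrout & Fleiss)."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_total = ((x - grand) ** 2).sum()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between raters
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (ms_cols - ms_err) / n)

# Hypothetical TAD measurements (mm) for 5 fractures rated by 4 raters
ratings = [[18.1, 18.3, 18.0, 18.2],
           [21.5, 21.9, 21.4, 21.6],
           [15.0, 15.2, 15.1, 15.0],
           [19.8, 20.1, 19.7, 19.9],
           [17.2, 17.4, 17.1, 17.3]]
print(round(icc_2k(ratings), 3))
```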
Sloand, Elizabeth; Groves, Sara; Brager, Rosemarie
2004-01-01
The importance of cultural competency in all areas of American society is well accepted. Indeed, the evolving demographics of the country make it imperative. A wide range of educational and work settings has addressed the concept, from business and government to education and health. Cultural competency is particularly critical in the realm of healthcare, as the potential impact on quality of health and life is at stake. Nursing is a leader in this field, with a long theoretical and practice history of attention to, and respect for, individual differences. This article reviews cultural competency education in nursing and its respective educational settings. Common threads and different models are discussed. The program components of cultural competency education in one School of Nursing are highlighted. Future directions towards refining cultural competency education are presented.
Levin-Rector, Alison; Wilson, Elisha L; Fine, Annie D; Greene, Sharon K
2015-02-01
Since the early 2000s, the Bureau of Communicable Disease of the New York City Department of Health and Mental Hygiene has analyzed reportable infectious disease data weekly by using the historical limits method to detect unusual clusters that could represent outbreaks. This method typically produced too many signals for each to be investigated with available resources while possibly failing to signal during true disease outbreaks. We made method refinements that improved the consistency of case inclusion criteria and accounted for data lags and trends and aberrations in historical data. During a 12-week period in 2013, we prospectively assessed these refinements using actual surveillance data. The refined method yielded 74 signals, a 45% decrease from what the original method would have produced. Fewer and less biased signals included a true citywide increase in legionellosis and a localized campylobacteriosis cluster subsequently linked to live-poultry markets. Future evaluations using simulated data could complement this descriptive assessment.
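A simplified sketch of the aberration-detection idea behind the historical limits approach follows; the Bureau's actual refinements for reporting lags, trends, and aberrant historical data are not reproduced here, and the counts used are hypothetical.

```python
import numpy as np

def historical_limits_signal(current_count, historical_counts, z=2.0):
    """
    Flag an unusual cluster when the current count exceeds the historical mean
    by more than z standard deviations (a simplified form of the historical
    limits method).
    """
    hist = np.asarray(historical_counts, dtype=float)
    mean, sd = hist.mean(), hist.std(ddof=1)
    threshold = mean + z * sd
    return current_count > threshold, threshold

# Example with hypothetical case counts from comparable historical periods
signal, threshold = historical_limits_signal(
    14, [3, 5, 4, 6, 2, 5, 4, 3, 6, 5, 4, 3, 5, 4, 6])
print(signal, round(threshold, 1))
```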
Major challenges for correlational ecological niche model projections to future climate conditions.
Peterson, A Townsend; Cobos, Marlon E; Jiménez-García, Daniel
2018-06-20
Species-level forecasts of distributional potential and likely distributional shifts, in the face of changing climates, have become popular in the literature in the past 20 years. Many refinements have been made to the methodology over the years, and the result has been an approach that considers multiple sources of variation in geographic predictions, and how that variation translates into both specific predictions and uncertainty in those predictions. Although numerous previous reviews and overviews of this field have pointed out a series of assumptions and caveats associated with the methodology, three aspects of the methodology have important impacts but have not been treated previously in detail. Here, we assess those three aspects: (1) effects of niche truncation on model transfers to future climate conditions, (2) effects of model selection procedures on future-climate transfers of ecological niche models, and (3) relative contributions of several factors (replicate samples of point data, general circulation models, representative concentration pathways, and alternative model parameterizations) to overall variance in model outcomes. Overall, the view is one of caution: although resulting predictions are fascinating and attractive, this paradigm has pitfalls that may bias and limit confidence in niche model outputs as regards the implications of climate change for species' geographic distributions. © 2018 New York Academy of Sciences.
ERIC Educational Resources Information Center
Mann, Dale
IMPACT II is a teacher-to-teacher networking program designed to improve teaching in New York City schools. Teachers who have been working on new ideas that need more refinement are eligible for $300 grants offered to program developers. Teachers who would like to adopt ideas previously developed by the program may receive $200 as replicator…
Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B
NASA Technical Reports Server (NTRS)
Yeganefard, Sanaz; Butler, Michael; Rezazadeh, Abdolbaghi
2010-01-01
Recently, a set of guidelines, or cookbook, has been developed for modelling and refinement of control problems in Event-B. The Event-B formal method is used for system-level modelling by defining states of a system and events which act on these states. It also supports refinement of models. This cookbook is intended to systematize the process of modelling and refining a control problem system by distinguishing environment, controller and command phenomena. Our main objective in this paper is to investigate and evaluate the usefulness and effectiveness of this cookbook by following it throughout the formal modelling of a cruise control system found in cars. The outcomes are identification of the benefits of the cookbook and guidance for its future users.
Implementation of Implicit Adaptive Mesh Refinement in an Unstructured Finite-Volume Flow Solver
NASA Technical Reports Server (NTRS)
Schwing, Alan M.; Nompelis, Ioannis; Candler, Graham V.
2013-01-01
This paper explores the implementation of adaptive mesh refinement in an unstructured, finite-volume solver. Unsteady and steady problems are considered. The effect on the recovery of high-order numerics is explored and the results are favorable. Important to this work is the ability to provide a path for efficient, implicit time advancement. A method using a simple refinement sensor based on undivided differences is discussed and applied to a practical problem: a shock-shock interaction on a hypersonic, inviscid double-wedge. Cases are compared to uniform grids without the use of adapted meshes in order to assess error and computational expense. Discussion of difficulties, advances, and future work prepare this method for additional research. The potential for this method in more complicated flows is described.
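A minimal, one-dimensional sketch of a refinement sensor based on undivided differences follows; the paper applies the idea within an unstructured, implicit finite-volume solver, none of which is reproduced here, and the threshold value is illustrative.

```python
import numpy as np

def undivided_difference_sensor(q, threshold=0.05):
    """
    Flag cells for refinement where the undivided (not divided by mesh spacing)
    difference of a flow quantity q between neighbouring cells is large.
    """
    q = np.asarray(q, dtype=float)
    diff = np.abs(np.diff(q))                 # undivided first differences
    flags = np.zeros_like(q, dtype=bool)
    flags[:-1] |= diff > threshold
    flags[1:] |= diff > threshold             # refine both cells sharing a large jump
    return flags

# Example: a smooth profile with an embedded shock-like jump
q = np.concatenate([np.linspace(1.0, 1.1, 50), np.linspace(2.0, 2.1, 50)])
print(np.where(undivided_difference_sensor(q))[0])  # indices of cells around the jump
```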
Safirova, Elena; Barry, James J.; Hastorun, Sinan; Matos, Grecia R.; Perez, Alberto Alexander; Bedinger, George M.; Bray, E. Lee; Jasinski, Stephen M.; Kuck, Peter H.; Loferski, Patricia J.
2017-05-18
The potential immediate effects of a hypothetical shock to Russia’s supply of selected mineral commodities on the world market and on individual countries were determined and monetized (in 2014 U.S. dollars). The mineral commodities considered were aluminum (refined primary), nickel (refined primary), palladium (refined) and platinum (refined), potash, and titanium (mill products), and the regions and countries of primary interest were the United States, the European Union (EU–28), and China. The shock is assumed to have infinite duration, but only the immediate effects, those limited by a 1-year period, are considered. A methodology for computing and monetizing the potential impacts was developed. Then the data pertaining to all six mineral commodities were collected and the most likely effects were computed. Because of the uncertainties associated with some of the data, sensitivity analyses were conducted to confirm the validity of the results. Results indicate that the impact on the United States arising from a shock to Russia’s supply, in terms of the value of net exports, would range from a gain of $336 million for titanium mill products to a loss of $237 million for potash; thus, the overall effect of a supply shock is likely to be quite modest. The study also demonstrates that, taken alone, Russia’s share in the world production of a particular commodity is not necessarily indicative of the size of potential impacts resulting from a supply shock; other factors, such as prices, domestic production, and the structure of international commodity flows were found to be important as well.
Designing an Agent-Based Model for Childhood Obesity Interventions: A Case Study of ChildObesity180.
Hennessy, Erin; Ornstein, Joseph T; Economos, Christina D; Herzog, Julia Bloom; Lynskey, Vanessa; Coffield, Edward; Hammond, Ross A
2016-01-07
Complex systems modeling can provide useful insights when designing and anticipating the impact of public health interventions. We developed an agent-based (or individual-based) computational model (ABM) to aid in evaluating and refining implementation of behavior change interventions designed to increase physical activity and healthy eating and reduce unnecessary weight gain among school-aged children. The potential benefits of applying an ABM approach include estimating outcomes despite data gaps, anticipating impact among different populations or scenarios, and exploring how to expand or modify an intervention. The practical challenges inherent in implementing such an approach include data resources, data availability, and the skills and knowledge of ABM among the public health obesity intervention community. The aim of this article was to provide a step-by-step guide on how to develop an ABM to evaluate multifaceted interventions on childhood obesity prevention in multiple settings. We used data from 2 obesity prevention initiatives and public-use resources. The details and goals of the interventions, an overview of the model design process, and the generalizability of this approach for future interventions are discussed.
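To make the modelling approach concrete, here is a deliberately toy agent-based loop (agents, a peer-influence rule, and an intervention effect). It illustrates the general style of model described above, not the ChildObesity180 model itself; all names and parameter values are invented.

```python
import random

class Child:
    """Toy agent with a single physical-activity score in [0, 1]."""
    def __init__(self, activity):
        self.activity = activity

def step(agents, intervention_boost=0.01, peer_weight=0.1):
    """One weekly tick: drift toward the peer average plus a small intervention effect."""
    mean_activity = sum(a.activity for a in agents) / len(agents)
    for a in agents:
        a.activity = min(1.0, a.activity
                         + peer_weight * (mean_activity - a.activity)
                         + intervention_boost)

random.seed(0)
population = [Child(random.uniform(0.2, 0.8)) for _ in range(100)]
for _ in range(52):                      # simulate roughly one year of weekly ticks
    step(population)
print(round(sum(a.activity for a in population) / len(population), 2))
```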
Impacts of Climate Policy on Regional Air Quality, Health, and Air Quality Regulatory Procedures
NASA Astrophysics Data System (ADS)
Thompson, T. M.; Selin, N. E.
2011-12-01
Both the changing climate, and the policy implemented to address climate change can impact regional air quality. We evaluate the impacts of potential selected climate policies on modeled regional air quality with respect to national pollution standards, human health and the sensitivity of health uncertainty ranges. To assess changes in air quality due to climate policy, we couple output from a regional computable general equilibrium economic model (the US Regional Energy Policy [USREP] model), with a regional air quality model (the Comprehensive Air Quality Model with Extensions [CAMx]). USREP uses economic variables to determine how potential future U.S. climate policy would change emissions of regional pollutants (CO, VOC, NOx, SO2, NH3, black carbon, and organic carbon) from ten emissions-heavy sectors of the economy (electricity, coal, gas, crude oil, refined oil, energy intensive industry, other industry, service, agriculture, and transportation [light duty and heavy duty]). Changes in emissions are then modeled using CAMx to determine the impact on air quality in several cities in the Northeast US. We first calculate the impact of climate policy by using regulatory procedures used to show attainment with National Ambient Air Quality Standards (NAAQS) for ozone and particulate matter. Building on previous work, we compare those results with the calculated results and uncertainties associated with human health impacts due to climate policy. This work addresses a potential disconnect between NAAQS regulatory procedures and the cost/benefit analysis required for and by the Clean Air Act.
Pollitt, Alexandra; Potoglou, Dimitris; Patil, Sunil; Burge, Peter; Guthrie, Susan; King, Suzanne; Wooding, Steven; Grant, Jonathan
2016-01-01
Objectives: (1) To test the use of best–worst scaling (BWS) experiments in valuing different types of biomedical and health research impact, and (2) to explore how different types of research impact are valued by different stakeholder groups. Design: Survey-based BWS experiment and discrete choice modelling. Setting: The UK. Participants: Current and recent UK Medical Research Council grant holders and a representative sample of the general public recruited from an online panel. Results: In relation to the study's 2 objectives: (1) we demonstrate the application of BWS methodology in the quantitative assessment and valuation of research impact. (2) The general public and researchers provided similar valuations for research impacts such as improved life expectancy, job creation and reduced health costs, but there was less agreement between the groups on other impacts, including commercial capacity development, training and dissemination. Conclusions: This is the second time that a discrete choice experiment has been used to assess how the general public and researchers value different types of research impact, and the first time that BWS has been used to elicit these choices. While the 2 groups value different research impacts in different ways, we note that where they agree, this is generally about matters that are seemingly more important and associated with wider social benefit, rather than impacts occurring within the research system. These findings are a first step in exploring how the beneficiaries and producers of research value different kinds of impact, an important consideration given the growing emphasis on funding and assessing research on the basis of (potential) impact. Future research should refine and replicate both the current study and that of Miller et al. in other countries and disciplines. PMID:27540096
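For orientation, the simplest way to summarize best–worst scaling data is a best-minus-worst count score per item; a minimal sketch follows. The study itself fits discrete choice (logit-type) models, and the item labels below are illustrative only.

```python
from collections import Counter

def bws_count_scores(choices):
    """
    Best-minus-worst count scores for best-worst scaling data.
    `choices` is a list of (best_item, worst_item) picks across choice tasks.
    """
    best = Counter(b for b, _ in choices)
    worst = Counter(w for _, w in choices)
    items = set(best) | set(worst)
    return {item: best[item] - worst[item] for item in items}

print(bws_count_scores([("life_expectancy", "dissemination"),
                        ("job_creation", "training"),
                        ("life_expectancy", "training")]))
```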
NASA Technical Reports Server (NTRS)
Hague, M. J.; Ferrari, M. R.; Miller, J. R.; Patterson, D. A.; Russell, G. L.; Farrell, A.P.; Hinch, S. G.
2010-01-01
Short episodic high temperature events can be lethal for migrating adult Pacific salmon (Oncorhynchus spp.). We downscaled temperatures for the Fraser River, British Columbia to evaluate the impact of climate warming on the frequency of exceeding thermal thresholds associated with salmon migratory success. Alarmingly, a modest 1.0 °C increase in average summer water temperature over 100 years (1981-2000 to 2081-2100) tripled the number of days per year exceeding critical salmonid thermal thresholds (i.e., 19.0 °C). Refined thresholds for two populations (Gates Creek and Weaver Creek) of sockeye salmon (Oncorhynchus nerka) were defined using physiological constraint models based on aerobic scope. While extreme temperatures leading to complete aerobic collapse remained unlikely under our warming scenario, both populations were increasingly forced to migrate upriver at reduced levels of aerobic performance (e.g., in 80% of future simulations, ≥90% of salmon encountered temperatures exceeding population-specific thermal optima for maximum aerobic scope; T_opt = 16.3 °C for Gates Creek and T_opt = 14.5 °C for Weaver Creek). Assuming recent changes to river entry timing persist, we also predicted dramatic increases in the probability of freshwater mortality for Weaver Creek salmon due to reductions in aerobic, and general physiological, performance (e.g., in 42% of future simulations, ≥50% of Weaver Creek fish exceeded temperature thresholds associated with 0-60% of maximum aerobic scope). Potential for adaptation via directional selection on run-timing was more evident for the Weaver Creek population. Early entry Weaver Creek fish experienced 25% (range: 15-31%) more suboptimal temperatures than late entrants, compared with an 8% difference (range: 0-17%) between early and late Gates Creek fish. Our results emphasize the need to consider daily temperature variability in association with population-specific differences in behaviour and physiological constraints when forecasting impacts of climate change on migratory survival of aquatic species.
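The exceedance statistic at the heart of the analysis is simple to state; the sketch below counts days above a thermal threshold, using a hypothetical temperature series rather than the downscaled Fraser River data.

```python
import numpy as np

def days_exceeding(daily_temps_c, threshold_c=19.0):
    """Count days above a thermal threshold (e.g., the 19.0 °C value cited above)."""
    return int(np.sum(np.asarray(daily_temps_c, dtype=float) > threshold_c))

# Hypothetical summer series under a baseline and a +1.0 °C warming scenario
rng = np.random.default_rng(1)
baseline = 15 + 4 * np.sin(np.linspace(0, np.pi, 120)) + rng.normal(0, 1.5, 120)
print(days_exceeding(baseline), days_exceeding(baseline + 1.0))
```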
Assessment of Alternative Aircraft Fuels
NASA Technical Reports Server (NTRS)
1984-01-01
The purpose of this symposium is to provide representatives from industry, government, and academia concerned with the availability and quality of future aviation turbine fuels with recent technical results and a status review of DOD and NASA sponsored fuels research projects. The symposium has included presentations on the potential crude sources, refining methods, and characteristics of future fuels; the effects of changing fuel characteristics on the performance and durability of jet aircraft components and systems; and the prospects for evolving suitable technology to produce and use future fuels.
NASA Astrophysics Data System (ADS)
Evans, Martin; Allott, Tim; Worrall, Fred; Rowson, James; Maskill, Rachael
2014-05-01
Water table is arguably the dominant control on biogeochemical cycling in peatland systems. Local water tables are controlled by the peat surface water balance, and lateral transfer of water driven by slope can be a significant component of this balance. In particular, blanket peatlands typically have relatively high surface slope compared to other peatland types, so that there is the potential for water table to be significantly controlled by topographic context. UK blanket peatlands are also significantly eroded, so that there is the potential for additional topographic drainage of the peatland surface. This paper presents a topographically driven model of blanket peat water table. An initial model presented in Allott et al. (2009) has been refined and tested against further water table data collected across the Bleaklow and Kinderscout plateaux of the English Peak District. The water table model quantifies the impact of peat erosion on water table throughout this dramatically dissected landscape, demonstrating that almost 50% of the landscape has suffered significant water table drawdown. The model calibrates the impact of slope and degree of dissection on local water tables but does not incorporate any effects of surface cover on water table conditions. Consequently, significant outliers in the test data are potentially indicative of important impacts of surface cover on water table conditions. In the test data presented here, sites associated with regular moorland burning are significant outliers. The data currently available do not allow us to draw conclusions around the impact of land cover, but they indicate an important potential application of the validated model in controlling for topographic position in further testing of the impact of land cover on peatland water tables. Allott, T.E.H., Evans, M.G., Lindsay, J.B., Agnew, C.T., Freer, J.E., Jones, A. & Parnell, M. Water tables in Peak District blanket peatlands. Moors for the Future Report No. 17. Moors for the Future Partnership, Edale, 47pp.
NASA Technical Reports Server (NTRS)
Costogue, E. N.; Ferber, R.; Lutwack, R.; Lorenz, J. H.; Pellin, R.
1984-01-01
Photovoltaic arrays that convert solar energy into electrical energy can become a cost effective bulk energy generation alternative, provided that an adequate supply of low cost materials is available. One of the key requirements for economic photovoltaic cells is reasonably priced silicon. At present, the photovoltaic industry is dependent upon polycrystalline silicon refined by the Siemens process primarily for integrated circuits, power devices, and discrete semiconductor devices. This dependency is expected to continue until the DOE sponsored low cost silicon refining technology developments have matured to the point where they are in commercial use. The photovoltaic industry can then develop its own source of supply. Silicon material availability and market pricing projections through 1988 are updated based on data collected early in 1984. The silicon refining industry plans to meet the increasing demands of the semiconductor device and photovoltaic product industries are overviewed. In addition, the DOE sponsored technology research for producing low cost polycrystalline silicon, probabilistic cost analysis for the two most promising production processes for achieving the DOE cost goals, and the impacts of the DOE photovoltaics program silicon refining research upon the commercial polycrystalline silicon refining industry are addressed.
Effect of refining variables on the properties and composition of JP-5
NASA Technical Reports Server (NTRS)
Lieberman, M.; Taylor, W. F.
1980-01-01
Potential future problem areas that could arise from changes in the composition, properties, and potential availability of JP-5 produced in the near future are identified. Potential fuel problems concerning thermal stability, lubricity, low temperature flow, combustion, and the effect of the use of specific additives on fuel properties and performance are discussed. An assessment of available crudes and refinery capabilities is given.
NASA Astrophysics Data System (ADS)
StJohn, D. H.; Easton, M. A.; Qian, M.; Taylor, J. A.
2013-07-01
This paper builds on the "Grain Refinement of Mg Alloys" published in 2005 and reviews the grain refinement research on Mg alloys that has been undertaken since then with an emphasis on the theoretical and analytical methods that have been developed. Consideration of recent research results and current theoretical knowledge has highlighted two important factors that affect an alloy's as-cast grain size. The first factor applies to commercial Mg-Al alloys where it is concluded that impurity and minor elements such as Fe and Mn have a substantially negative impact on grain size because, in combination with Al, intermetallic phases can be formed that tend to poison the more potent native or deliberately added nucleant particles present in the melt. This factor appears to explain the contradictory experimental outcomes reported in the literature and suggests that the search for a more potent and reliable grain refining technology may need to take a different approach. The second factor applies to all alloys and is related to the role of constitutional supercooling which, on the one hand, promotes grain nucleation and, on the other hand, forms a nucleation-free zone preventing further nucleation within this zone, consequently limiting the grain refinement achievable, particularly in low solute-containing alloys. Strategies to reduce the negative impact of these two factors are discussed. Further, the Interdependence model has been shown to apply to a broad range of casting methods from slow cooling gravity die casting to fast cooling high pressure die casting and dynamic methods such as ultrasonic treatment.
Validating neural-network refinements of nuclear mass models
NASA Astrophysics Data System (ADS)
Utama, R.; Piekarewicz, J.
2018-01-01
Background: Nuclear astrophysics centers on the role of nuclear physics in the cosmos. In particular, nuclear masses at the limits of stability are critical in the development of stellar structure and the origin of the elements. Purpose: We aim to test and validate the predictions of recently refined nuclear mass models against the newly published AME2016 compilation. Methods: The basic paradigm underlining the recently refined nuclear mass models is based on existing state-of-the-art models that are subsequently refined through the training of an artificial neural network. Bayesian inference is used to determine the parameters of the neural network so that statistical uncertainties are provided for all model predictions. Results: We observe a significant improvement in the Bayesian neural network (BNN) predictions relative to the corresponding "bare" models when compared to the nearly 50 new masses reported in the AME2016 compilation. Further, AME2016 estimates for the handful of impactful isotopes in the determination of r -process abundances are found to be in fairly good agreement with our theoretical predictions. Indeed, the BNN-improved Duflo-Zuker model predicts a root-mean-square deviation relative to experiment of σrms≃400 keV. Conclusions: Given the excellent performance of the BNN refinement in confronting the recently published AME2016 compilation, we are confident of its critical role in our quest for mass models of the highest quality. Moreover, as uncertainty quantification is at the core of the BNN approach, the improved mass models are in a unique position to identify those nuclei that will have the strongest impact in resolving some of the outstanding questions in nuclear astrophysics.
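A minimal sketch of the residual-refinement idea described above: train a network on the differences between experimental and bare-model masses, then add the learned correction to the bare prediction. A plain scikit-learn MLP is substituted here for the paper's Bayesian neural network, so no uncertainty quantification is produced, and the data (e.g., from AME2016 and a chosen bare model) would need to be supplied by the user.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def refine_mass_model(Z, N, m_exp, m_bare):
    """Learn the residual m_exp - m_bare as a function of (Z, N) and return a corrector."""
    X = np.column_stack([Z, N])
    net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
    net.fit(X, np.asarray(m_exp) - np.asarray(m_bare))
    def corrected(Z_new, N_new, m_bare_new):
        return np.asarray(m_bare_new) + net.predict(np.column_stack([Z_new, N_new]))
    return corrected

def rms_deviation(pred, exp):
    """Root-mean-square deviation used to score a mass model against experiment."""
    return np.sqrt(np.mean((np.asarray(pred) - np.asarray(exp)) ** 2))
```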
High Strain Rate Response of 7055 Aluminum Alloy Subject to Square-spot Laser Shock Peening
NASA Astrophysics Data System (ADS)
Sun, Rujian; Zhu, Ying; Li, Liuhe; Guo, Wei; Peng, Peng
2017-12-01
The influences of laser pulse energy and impact time on the high strain rate response of 7055 aluminum alloy subjected to square-spot laser shock peening (SLSP) were investigated. Microstructural evolution was characterized by OM, SEM and TEM. Microhardness distributions and in-depth residual stresses were analyzed for 15 J and 25 J pulse energies, each with one and two impacts. Results show that the original rolling structures were significantly refined due to laser-shock-induced recrystallization. A high density of microdefects was generated, such as dislocation tangles, dislocation walls and stacking faults. Subgrains and nanograins were induced in the surface layer, resulting in grain refinement in the near-surface layer after SLSP. Compressive residual stresses with a maximum magnitude of more than -200 MPa and affected depths of more than 1 mm can be generated after SLSP. Impact time is more effective than laser pulse energy in increasing the magnitude of residual stress and achieving a thicker hardening layer.
From the benefits of micro to the threats of nano for the ore-mining and ore-refining sectors
NASA Astrophysics Data System (ADS)
van Loon, A. J.
2002-07-01
Nanotechnology is developing fast, with much impact on a wide variety of industries. It is likely that current and future developments will result in the possibility to economically manipulate materials on a nanoscale. This will bring advantages on the one hand, as a further step forward after the development of technologies on a microscale, but it may also result in developments that make several now economically important activities largely, if not entirely, superfluous. The present-day progress in the field of recycling waste, in combination with developments that may make energy available in sufficient quantities at an acceptable price level, might result in technologies that isolate valuable compounds from waste at a nanoscale, thus, taking over the role of the mining industry as a provider of raw materials. It is suggested that the mining industry becomes strongly involved in nanoscale research, in order to combine their knowledge of ore properties and extraction methods with the knowledge of nanotechnological engineers about how to manipulate individual compounds. This may provide a chance for the present-day ore-mining and ore-refining companies to survive in a world that would otherwise probably not manage to supply sufficient raw materials for the Earth's growing population, which also will strive for a rise in the average standard of living.
Human Laboratory Paradigms in Alcohol Research
Plebani, Jennifer G.; Ray, Lara A.; Morean, Meghan E.; Corbin, William R.; Mackillop, James; Amlung, Michael; King, Andrea C.
2014-01-01
Human laboratory studies have a long and rich history in the field of alcoholism. Human laboratory studies have allowed for advances in alcohol research in a variety of ways, including elucidating the neurobehavioral mechanisms of risk, identifying phenotypically distinct sub-types of alcohol users, investigating candidate genes underlying experimental phenotypes for alcoholism, and testing mechanisms of action of alcoholism pharmacotherapies on clinically relevant translational phenotypes, such as persons exhibiting positive-like alcohol effects or alcohol craving. Importantly, the field of human laboratory studies in addiction has progressed rapidly over the past decade and has built upon earlier findings of alcohol's neuropharmacological effects to advance translational research on alcoholism etiology and treatment. To that end, the new generation of human laboratory studies has focused on applying new methodologies, further refining alcoholism phenotypes, and translating these findings to studies of alcoholism genetics, medication development, and pharmacogenetics. The combination of experimental laboratory approaches with recent developments in neuroscience and pharmacology has been particularly fruitful in furthering our understanding of the impact of individual differences in alcoholism risk and in treatment response. This review of the literature focuses on human laboratory studies of subjective intoxication, alcohol craving, anxiety, and behavioral economics. Each section discusses opportunities for phenotype refinement under laboratory conditions, as well as its application to translational science of alcoholism. A summary and recommendations for future research are also provided. PMID:22309888
Franco, N H; Olsson, I A S
2014-01-01
The 3Rs principle of replacement, reduction, and refinement has increasingly been endorsed by legislators and regulatory bodies as the best approach to tackle the ethical dilemma presented by animal experimentation in which the potential benefits for humans stand against the costs borne by the animals. Even when animal use is tightly regulated and supervised, the individual researcher's responsibility is still decisive in the implementation of the 3Rs. Training in laboratory animal science (LAS) aims to raise researchers' awareness and increase their knowledge, but its effect on scientists' attitudes and practice has not so far been systematically assessed. Participants (n = 206) in eight LAS courses (following the Federation of European Laboratory Animal Science Associations category C recommendations) in Portugal were surveyed in a self-administered questionnaire during the course. Questions were related mainly to the 3Rs and their application, attitudes to animal use and the ethical review of animal experiments. One year later, all the respondents were asked to answer a similar questionnaire (57% response rate) with added self-evaluation questions on the impact of training. Our results suggest that the course is effective in promoting awareness and increasing knowledge of the 3Rs, particularly with regard to refinement. However, participation in the course did not change perceptions on the current and future needs for animal use in research.
Comte, Adrien; Pendleton, Linwood H
2018-03-01
Coral reef ecosystems and the people who depend on them are increasingly exposed to the adverse effects of global environmental change (GEC), including increases in sea-surface temperature and ocean acidification. Managers and decision-makers need a better understanding of the options available for action in the face of these changes. We refine a typology of actions developed by Gattuso et al. (2015) that could serve in prioritizing strategies to deal with the impacts of GEC on reefs and people. Using the typology we refined, we investigate the scientific effort devoted to four types of management strategies (mitigate, protect, repair, and adapt), which we tie to the component of the chain of impact they affect: ecological vulnerability or social vulnerability. A systematic literature review is used to investigate quantitatively how scientific effort over the past 25 years is responding to the challenge posed by GEC on coral reefs and to identify gaps in research. A growing literature has focused on these impacts and on management strategies to sustain coral reef social-ecological systems. We identify 767 peer-reviewed articles published between 1990 and 2016 that address coral reef management in the context of GEC. The rate of publication of such studies has increased over the years, following the general trend in climate research. The literature focuses on protect strategies the most, followed by mitigate and adapt strategies, and finally repair strategies. Developed countries, particularly Australia and the United States, are over-represented as authors and locations of case studies across all types of management strategies. Authors affiliated with institutions in developed countries play a major role in investigating case studies across the globe. The majority of articles focus on only one of the four categories of actions. A gap analysis reveals three directions for future research: (1) more research is needed in South-East Asia and other developing countries where the impacts of GEC on coral reefs will be the greatest, (2) more scholarly effort should be devoted to understanding how adapt and repair strategies can deal with the impacts of GEC, and (3) the simultaneous assessment of multiple strategies is needed to understand trade-offs and synergies between actions. Copyright © 2018 Elsevier Ltd. All rights reserved.
Dinosaurs can fly -- High performance refining
DOE Office of Scientific and Technical Information (OSTI.GOV)
Treat, J.E.
1995-09-01
High performance refining requires that one develop a winning strategy based on a clear understanding of one's position in one's company's value chain; one's competitive position in the products markets one serves; and the most likely drivers and direction of future market forces. The author discussed all three points, then described measuring performance of the company. To become a true high performance refiner often involves redesigning the organization as well as the business processes. The author discusses such redesigning. The paper summarizes ten rules to follow to achieve high performance: listen to the market; optimize; organize around asset or area teams; trust the operators; stay flexible; source strategically; all maintenance is not equal; energy is not free; build project discipline; and measure and reward performance. The paper then discusses the constraints to the implementation of change.
Development of a foundation for a case definition of post-treatment Lyme disease syndrome.
Aucott, John N; Crowder, Lauren A; Kortte, Kathleen B
2013-06-01
The study objective is to demonstrate the clinical and research utility of an operationalized definition of post-treatment Lyme disease syndrome (PTLDS), as proposed by the Infectious Diseases Society of America. Seventy-four patients with confirmed erythema migrans and 14 controls were enrolled. Patient-reported symptoms and health function (SF-36) were collected pre-treatment and at follow-up visits over 6 months post-treatment. Eight (11%) patients met our operationalized definition of PTLDS, which included self-reported symptoms of fatigue, widespread musculoskeletal pain or cognitive complaints, and functional impact as measured by a T score of <45 on the composite SF-36. No controls met the functional impact criteria. Forty-three (60%) patients returned to their previous health status when measured at 6 months post-treatment. Twenty (28%) patients had either residual symptoms or functional impact, but not both, and did not meet criteria for PTLDS. This operationalized definition of PTLDS allows for identification of those patients who are treated for early Lyme disease and have significant post-treatment illness, as they have both residual symptoms and impact on daily life functioning. With further refinement and improvement of this operationalized definition, the true incidence of PTLDS can be determined and future studies can be designed to examine its pathophysiology and treatment. Copyright © 2013 International Society for Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
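A minimal sketch of how this operationalized case definition could be applied in analysis code, assuming hypothetical field names for the two criteria (persistent symptoms and the SF-36 composite T score); the T < 45 threshold is taken from the abstract above, and the function is illustrative, not the authors' instrument.

    def classify_ptlds(has_residual_symptoms, sf36_composite_t):
        # has_residual_symptoms: fatigue, widespread musculoskeletal pain,
        # or cognitive complaints persisting after treatment (boolean)
        # sf36_composite_t: composite SF-36 T score; < 45 indicates functional impact
        functional_impact = sf36_composite_t < 45
        if has_residual_symptoms and functional_impact:
            return "PTLDS"                              # both criteria met
        if has_residual_symptoms or functional_impact:
            return "residual symptoms or impact only"   # does not meet PTLDS criteria
        return "returned to previous health"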
Yé, Yazoume; Eisele, Thomas P.; Eckert, Erin; Korenromp, Eline; Shah, Jui A.; Hershey, Christine L.; Ivanovich, Elizabeth; Newby, Holly; Carvajal-Velez, Liliana; Lynch, Michael; Komatsu, Ryuichi; Cibulskis, Richard E.; Moore, Zhuzhi; Bhattarai, Achuyt
2017-01-01
Abstract. Concerted efforts from national and international partners have scaled up malaria control interventions, including insecticide-treated nets, indoor residual spraying, diagnostics, prompt and effective treatment of malaria cases, and intermittent preventive treatment during pregnancy in sub-Saharan Africa (SSA). This scale-up warrants an assessment of its health impact to guide future efforts and investments; however, measuring malaria-specific mortality and the overall impact of malaria control interventions remains challenging. In 2007, Roll Back Malaria's Monitoring and Evaluation Reference Group proposed a theoretical framework for evaluating the impact of full-coverage malaria control interventions on morbidity and mortality in high-burden SSA countries. Recently, several evaluations have contributed new ideas and lessons to strengthen this plausibility design. This paper harnesses that new evaluation experience to expand the framework, with additional features, such as stratification, to examine subgroups most likely to experience improvement if control programs are working; the use of a national platform framework; and analysis of complete birth histories from national household surveys. The refined framework has shown that, despite persisting data challenges, combining multiple sources of data, considering potential contributions from both fundamental and proximate contextual factors, and conducting subnational analyses allows identification of the plausible contributions of malaria control interventions on malaria morbidity and mortality. PMID:28990923
Unified Synthesis Product (USP) Recommendations
NASA Astrophysics Data System (ADS)
Peterson, T. C.
2009-05-01
The USP identifies a number of areas in which inadequate information or understanding hampers our ability to estimate likely future climate change and its impacts. For example, our knowledge of changes in tornadoes, hail, and ice storms is quite limited, making it difficult to know if and how such events have changed as climate has warmed, and how they might change in the future. Research on ecological responses to climate change also is limited, as is our understanding of social responses. The Report identifies the five most important gaps in knowledge and offers some thoughts on how to address those gaps: 1. Expand our understanding of climate change impacts. There is a clear need to increase understanding of how ecosystems, social and economic systems, human health, and the built environment will be affected by climate change in the context of other stresses. This includes ecosystems as well as economic systems, human health, and the built environment. 2. Refine ability to project climate change at local scales. One of the main messages to emerge from the past decade of synthesis and assessments is that while climate change is a global issue, it has a great deal of regional variability. There is an indisputable need to improve understanding of climate system effects at these smaller scales, because these are often the scales of decision-making in society. 3. Expand capacity to provide decision makers and the public with relevant information on climate change and its impacts. The United States has tremendous potential to create more comprehensive measurement, archive, and data-access systems that could provide great benefit to society. 4. Improve understanding of and ability to identify thresholds likely to lead to abrupt changes in the climate system. Paleoclimatic data shows that climate can and has changed quite abruptly when certain thresholds are crossed. Similarly, there is evidence that ecological and human systems can undergo abrupt change when tipping points are reached. 5. Enhance understanding of how society can adapt to climate change in the context of multiple stresses. There is currently limited knowledge about the ability of communities, regions, and sectors to adapt to future climate change. It is essential to improve understanding of how the capacity to adapt to a changing climate might be exercised, and the vulnerabilities to climate change and other environmental stresses that might remain. Results from these efforts would inform future assessments that continue building our understanding of humanity's impacts on climate, and climate's impacts on us. Such assessments will continue to play a role in helping the U.S. respond to changing conditions. A vision for future climate change assessments includes both sustained extensive practitioner and stakeholder involvement, and periodic, targeted, scientifically rigorous reports similar to the CCSP Synthesis and Assessment Products.
Chukwu, L O; Nwachukwu, S C U
2005-07-01
Water quality characteristics, benthic macro-invertebrates and microbial communities of three first order streams in South West Nigeria were investigated to assess the effects of refined petroleum five months after spillage. All physical and chemical conditions except temperature and pH were significantly different (P < 0.01) between the upstream control stations and impacted stations, reflecting the perturbational stress. The benthic macro-invertebrate fauna were dominated by arthropods, but the faunal spectrum was dissimilar at all the stations studied. Sampling stations at the epicentre of the spill showed considerable reduction in faunal composition and relative abundance. Generally, microbial density and diversity were higher in both soil and water samples from impacted sites than in control sites. There was a significantly higher proportion (P < 0.05) of hydrocarbon utilizers in soil than in water samples at all stations except in samples from stations (P < 0.05).
Implementing System-Level Graduation Standards
ERIC Educational Resources Information Center
Moore, Carol A.; Wilks, Karrin E.
2011-01-01
Driven by external pressure for increased accountability and internal pressure for improved learning outcomes, colleges across the country have been developing and refining assessment systems for several decades. In some cases, assessment results have significant positive impact. In other cases, the results have little impact, are not seen as…
Good, Jean-Marc; Mahoney, Michael; Miyazaki, Taisuke; Tanaka, Kenji F; Sakimura, Kenji; Watanabe, Masahiko; Kitamura, Kazuo; Kano, Masanobu
2017-11-21
Neural circuits undergo massive refinements during postnatal development. In the developing cerebellum, the climbing fiber (CF) to Purkinje cell (PC) network is drastically reshaped by eliminating early-formed redundant CF to PC synapses. To investigate the impact of CF network refinement on PC population activity during postnatal development, we monitored spontaneous CF responses in neighboring PCs and the activity of populations of nearby CF terminals using in vivo two-photon calcium imaging. Population activity is highly synchronized in newborn mice, and the degree of synchrony gradually declines during the first postnatal week in PCs and, to a lesser extent, in CF terminals. Knockout mice lacking P/Q-type voltage-gated calcium channel or glutamate receptor δ2, in which CF network refinement is severely impaired, exhibit an abnormally high level of synchrony in PC population activity. These results suggest that CF network refinement is a structural basis for developmental desynchronization and maturation of PC population activity. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
Development of Encapsulated Dye for Surface Impact Damage Indicator System.
1987-09-01
Keywords: composites, ultrasonics, dye, impact, microcapsules, NDE, polyurethane, encapsulation, paint. The report covers dye encapsulation, microcapsule incorporation into the USAF polyurethane paint, and an initial correlation study of impact damage to impact-coating indication. The objectives of the project were to: (1) refine the microcapsule formulation to be compatible with MIL-C-83286 paint; (2) fabricate composite panels from isotropic graphite...
Wild, Verina; Carina, Fourie; Frouzakis, Regula; Clarinval, Caroline; Fässler, Margrit; Elger, Bernice; Gächter, Thomas; Leu, Agnes; Spirig, Rebecca; Kleinknecht, Michael; Radovanovic, Dragana; Mouton Dorey, Corine; Burnand, Bernard; Vader, John-Paul; Januel, Jean-Marie; Biller-Andorno, Nikola; The IDoC Group
2015-01-01
The starting point of the interdisciplinary project "Assessing the impact of diagnosis related groups (DRGs) on patient care and professional practice" (IDoC) was the lack of a systematic ethical assessment for the introduction of cost containment measures in healthcare. Our aim was to contribute to the methodological and empirical basis of such an assessment. Five sub-groups conducted separate but related research within the fields of biomedical ethics, law, nursing sciences and health services, applying a number of complementary methodological approaches. The individual research projects were framed within an overall ethical matrix. Workshops and bilateral meetings were held to identify and elaborate joint research themes. Four common, ethically relevant themes emerged in the results of the studies across sub-groups: (1.) the quality and safety of patient care, (2.) the state of professional practice of physicians and nurses, (3.) changes in incentives structure, (4.) vulnerable groups and access to healthcare services. Furthermore, much-needed data for future comparative research has been collected and some early insights into the potential impact of DRGs are outlined. Based on the joint results we developed preliminary recommendations related to conceptual analysis, methodological refinement, monitoring and implementation.
Climate change and children's health--a call for research on what works to protect children.
Xu, Zhiwei; Sheffield, Perry E; Hu, Wenbiao; Su, Hong; Yu, Weiwei; Qi, Xin; Tong, Shilu
2012-09-10
Climate change is affecting and will increasingly influence human health and wellbeing. Children are particularly vulnerable to the impact of climate change. An extensive literature review regarding the impact of climate change on children's health was conducted in April 2012 by searching electronic databases PubMed, Scopus, ProQuest, ScienceDirect, and Web of Science, as well as relevant websites, such as IPCC and WHO. Climate change affects children's health through increased air pollution, more weather-related disasters, more frequent and intense heat waves, decreased water quality and quantity, food shortage and greater exposure to toxicants. As a result, children experience greater risk of mental disorders, malnutrition, infectious diseases, allergic diseases and respiratory diseases. Mitigation measures like reducing carbon pollution emissions, and adaptation measures such as early warning systems and post-disaster counseling are strongly needed. Future health research directions should focus on: (1) identifying whether climate change impacts on children will be modified by gender, age and socioeconomic status; (2) refining outcome measures of children's vulnerability to climate change; (3) projecting children's disease burden under climate change scenarios; (4) exploring children's disease burden related to climate change in low-income countries; and (5) identifying the most cost-effective mitigation and adaptation actions from a children's health perspective.
A new national mosaic of state landcover data
Thomas, I.; Handley, Lawrence R.; D'Erchia, Frank J.; Charron, Tammy M.
2000-01-01
This presentation reviewed current landcover mapping efforts and presented a new preliminary, national mosaic of Gap Analysis Program (GAP) and Multi-Resolution Land Characteristics Consortium (MRLC) landcover data with a discussion of techniques, problems faced, and future refinements.
Cheyenne/Laramie County MX Impact Human Service System Refinements Project. Refinements Manual
1986-01-01
following are but four of many possible examples of these types of questions. A. Assume that your agency has decided to address the problem of hunger. Should...they do not represent a long-term solution to problems. Conversely, community problem solving and attempts to bring about fundamental changes may be...are victims of acts of violence in the home... Problem solving approaches include education, the provision of food and temporary shelter, counseling
Impact of scaffold rigidity on the design and evolution of an artificial Diels-Alderase
Preiswerk, Nathalie; Beck, Tobias; Schulz, Jessica D.; Milovník, Peter; Mayer, Clemens; Siegel, Justin B.; Baker, David; Hilvert, Donald
2014-01-01
By combining targeted mutagenesis, computational refinement, and directed evolution, a modestly active, computationally designed Diels-Alderase was converted into the most proficient biocatalyst for [4+2] cycloadditions known. The high stereoselectivity and minimal product inhibition of the evolved enzyme enabled preparative scale synthesis of a single product diastereomer. X-ray crystallography of the enzyme–product complex shows that the molecular changes introduced over the course of optimization, including addition of a lid structure, gradually reshaped the pocket for more effective substrate preorganization and transition state stabilization. The good overall agreement between the experimental structure and the original design model with respect to the orientations of both the bound product and the catalytic side chains contrasts with other computationally designed enzymes. Because design accuracy appears to correlate with scaffold rigidity, improved control over backbone conformation will likely be the key to future efforts to design more efficient enzymes for diverse chemical reactions. PMID:24847076
Emerging approaches in predictive toxicology.
Zhang, Luoping; McHale, Cliona M; Greene, Nigel; Snyder, Ronald D; Rich, Ivan N; Aardema, Marilyn J; Roy, Shambhu; Pfuhler, Stefan; Venkatactahalam, Sundaresan
2014-12-01
Predictive toxicology plays an important role in the assessment of toxicity of chemicals and the drug development process. While there are several well-established in vitro and in vivo assays that are suitable for predictive toxicology, recent advances in high-throughput analytical technologies and model systems are expected to have a major impact on the field of predictive toxicology. This commentary provides an overview of the state of the current science and a brief discussion on future perspectives for the field of predictive toxicology for human toxicity. Computational models for predictive toxicology, needs for further refinement and obstacles to expand computational models to include additional classes of chemical compounds are highlighted. Functional and comparative genomics approaches in predictive toxicology are discussed with an emphasis on successful utilization of recently developed model systems for high-throughput analysis. The advantages of three-dimensional model systems and stem cells and their use in predictive toxicology testing are also described. © 2014 Wiley Periodicals, Inc.
A novel adjuvant to the resident selection process: the hartman value profile.
Cone, Jeffrey D; Byrum, C Stephen; Payne, Wyatt G; Smith, David J
2012-01-01
The goal of resident selection is twofold: (1) select candidates who will be successful residents and eventually successful practitioners and (2) avoid selecting candidates who will be unsuccessful residents and/or eventually unsuccessful practitioners. Traditional tools used to select residents have well-known limitations. The Hartman Value Profile (HVP) is a proven adjuvant tool to predicting future performance in candidates for advanced positions in the corporate setting. No literature exists to indicate use of the HVP for resident selection. The HVP evaluates the structure and the dynamics of an individual value system. Given the potential impact, we implemented its use beginning in 2007 as an adjuvant tool to the traditional selection process. Experience gained from incorporating the HVP into the residency selection process suggests that it may add objectivity and refinement in predicting resident performance. Further evaluation is warranted with longer follow-up times.
A Novel Adjuvant to the Resident Selection Process: the Hartman Value Profile
Cone, Jeffrey D.; Byrum, C. Stephen; Payne, Wyatt G.; Smith, David J.
2012-01-01
Objectives: The goal of resident selection is twofold: (1) select candidates who will be successful residents and eventually successful practitioners and (2) avoid selecting candidates who will be unsuccessful residents and/or eventually unsuccessful practitioners. Traditional tools used to select residents have well-known limitations. The Hartman Value Profile (HVP) is a proven adjuvant tool to predicting future performance in candidates for advanced positions in the corporate setting. Methods: No literature exists to indicate use of the HVP for resident selection. Results: The HVP evaluates the structure and the dynamics of an individual value system. Given the potential impact, we implemented its use beginning in 2007 as an adjuvant tool to the traditional selection process. Conclusions: Experience gained from incorporating the HVP into the residency selection process suggests that it may add objectivity and refinement in predicting resident performance. Further evaluation is warranted with longer follow-up times. PMID:22720114
Emerging Approaches in Predictive Toxicology
Zhang, Luoping; McHale, Cliona M.; Greene, Nigel; Snyder, Ronald D.; Rich, Ivan N.; Aardema, Marilyn J.; Roy, Shambhu; Pfuhler, Stefan; Venkatactahalam, Sundaresan
2016-01-01
Predictive toxicology plays an important role in the assessment of toxicity of chemicals and the drug development process. While there are several well-established in vitro and in vivo assays that are suitable for predictive toxicology, recent advances in high-throughput analytical technologies and model systems are expected to have a major impact on the field of predictive toxicology. This commentary provides an overview of the state of the current science and a brief discussion on future perspectives for the field of predictive toxicology for human toxicity. Computational models for predictive toxicology, needs for further refinement and obstacles to expand computational models to include additional classes of chemical compounds are highlighted. Functional and comparative genomics approaches in predictive toxicology are discussed with an emphasis on successful utilization of recently developed model systems for high-throughput analysis. The advantages of three-dimensional model systems and stem cells and their use in predictive toxicology testing are also described. PMID:25044351
Quality of life before and after cosmetic surgery.
Bensoussan, Jean-Charles; Bolton, Michael A; Pi, Sarah; Powell-Hicks, Allycin L; Postolova, Anna; Razani, Bahram; Reyes, Kevin; IsHak, Waguih William
2014-08-01
This article reviews the literature regarding the impact of cosmetic surgery on health-related quality of life (QOL). Studies were identified through PubMed/Medline and PsycINFO searches from January 1960 to December 2011. Twenty-eight studies were included in this review, according to specific selection criteria. The procedures and tools employed in cosmetic surgery research studies were remarkably diverse, thus yielding difficulties with data analysis. However, data indicate that individuals undergoing cosmetic surgery began with lower values on aspects of QOL than control subjects, and experienced significant QOL improvement post-procedurally, an effect that appeared to plateau with time. Despite the complexity of measuring QOL in cosmetic surgery patients, most studies showed an improvement in QOL after cosmetic surgery procedures. However, this finding was clouded by measurement precision as well as heterogeneity of procedures and study populations. Future research needs to focus on refining measurement techniques, including developing cosmetic surgery-specific QOL measures.
Use of hydrophilic polymer coatings for control of electroosmosis and protein adsorption
NASA Technical Reports Server (NTRS)
Harris, J. Milton
1987-01-01
The purpose of this project was to examine the utility of polyethylene glycol (PEG) and dextran coatings for control of electroosmosis and protein adsorption; electroosmosis is an important, deleterious process affecting electrophoretic separations, and protein adsorption is a factor which needs to be controlled during protein crystal growth to avoid multiple nucleation sites. Performance of the project required use of X-ray photoelectron spectroscopy to refine previously developed synthetic methods. The results of this spectroscopic examination are reported. Measurements of electroosmotic mobility of charged particles in appropriately coated capillaries reveal that a new, one-step route to coating capillaries gives a surface on which electroosmosis is dramatically reduced. Similarly, both PEG and dextran coatings were shown by protein adsorption measurements to be highly effective at reducing protein adsorption on solid surfaces. These results should have impact on future low-g electrophoretic and protein crystal growth experiments.
Economic impact and market analysis of a special event: The Great New England Air Show
Rodney B. Warnick; David C. Bojanic; Atul Sheel; Apurv Mather; Deepak Ninan
2010-01-01
We conducted a post-event evaluation for the Great New England Air Show to assess its general economic impact and to refine economic estimates where possible. In addition to the standard economic impact variables, we examined travel distance, purchase decision involvement, event satisfaction, and frequency of attendance. Graphic mapping of event visitors' home ZIP...
M and D SIG progress report: Laboratory simulations of LDEF impact features
NASA Technical Reports Server (NTRS)
Horz, Friedrich; Bernhard, R. P.; See, Thomas H.; Atkinson, Dale R.; Allbrooks, Martha K.
1991-01-01
Reported here are impact simulations into pure Teflon and aluminum targets. These experiments will allow first order interpretations of impact features on the Long Duration Exposure Facility (LDEF), and they will serve as guides for dedicated experiments that employ the real LDEF blankets, both unexposed and exposed, for a refined understanding of the Long Duration Exposure Facility's collisional environment.
NASA Astrophysics Data System (ADS)
Schwing, Alan Michael
For computational fluid dynamics, the governing equations are solved on a discretized domain of nodes, faces, and cells. The quality of the grid or mesh can be a driving source for error in the results. While refinement studies can help guide the creation of a mesh, grid quality is largely determined by user expertise and understanding of the flow physics. Adaptive mesh refinement is a technique for enriching the mesh during a simulation based on metrics for error, impact on important parameters, or location of important flow features. This can offload from the user some of the difficult and ambiguous decisions necessary when discretizing the domain. This work explores the implementation of adaptive mesh refinement in an implicit, unstructured, finite-volume solver. Consideration is made for applying modern computational techniques in the presence of hanging nodes and refined cells. The approach is developed to be independent of the flow solver in order to provide a path for augmenting existing codes. It is designed to be applicable for unsteady simulations and refinement and coarsening of the grid does not impact the conservatism of the underlying numerics. The effect on high-order numerical fluxes of fourth- and sixth-order are explored. Provided the criteria for refinement is appropriately selected, solutions obtained using adapted meshes have no additional error when compared to results obtained on traditional, unadapted meshes. In order to leverage large-scale computational resources common today, the methods are parallelized using MPI. Parallel performance is considered for several test problems in order to assess scalability of both adapted and unadapted grids. Dynamic repartitioning of the mesh during refinement is crucial for load balancing an evolving grid. Development of the methods outlined here depend on a dual-memory approach that is described in detail. Validation of the solver developed here against a number of motivating problems shows favorable comparisons across a range of regimes. Unsteady and steady applications are considered in both subsonic and supersonic flows. Inviscid and viscous simulations achieve similar results at a much reduced cost when employing dynamic mesh adaptation. Several techniques for guiding adaptation are compared. Detailed analysis of statistics from the instrumented solver enable understanding of the costs associated with adaptation. Adaptive mesh refinement shows promise for the test cases presented here. It can be considerably faster than using conventional grids and provides accurate results. The procedures for adapting the grid are light-weight enough to not require significant computational time and yield significant reductions in grid size.
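As a rough illustration of the kind of refinement criterion such a solver might use (the specific metrics of this work are not reproduced here), the sketch below flags cells for refinement or coarsening from a simple solution-jump indicator; all names, fractions, and thresholds are illustrative assumptions.

    import numpy as np

    def flag_cells(cell_values, neighbor_ids, refine_frac=0.1, coarsen_frac=0.3):
        # Error indicator: largest jump in the solution between a cell and its neighbors.
        indicator = np.array([
            max(abs(cell_values[i] - cell_values[j]) for j in nbrs)
            for i, nbrs in enumerate(neighbor_ids)
        ])
        refine = indicator >= np.quantile(indicator, 1.0 - refine_frac)   # enrich high-error cells
        coarsen = indicator <= np.quantile(indicator, coarsen_frac)       # merge low-error cells
        return refine, coarsen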
Refinement of NMR structures using implicit solvent and advanced sampling techniques.
Chen, Jianhan; Im, Wonpil; Brooks, Charles L
2004-12-15
NMR biomolecular structure calculations exploit simulated annealing methods for conformational sampling and require a relatively high level of redundancy in the experimental restraints to determine quality three-dimensional structures. Recent advances in generalized Born (GB) implicit solvent models should make it possible to combine information from both experimental measurements and accurate empirical force fields to improve the quality of NMR-derived structures. In this paper, we study the influence of implicit solvent on the refinement of protein NMR structures and identify an optimal protocol of utilizing these improved force fields. To do so, we carry out structure refinement experiments for model proteins with published NMR structures using full NMR restraints and subsets of them. We also investigate the application of advanced sampling techniques to NMR structure refinement. Similar to the observations of Xia et al. (J.Biomol. NMR 2002, 22, 317-331), we find that the impact of implicit solvent is rather small when there is a sufficient number of experimental restraints (such as in the final stage of NMR structure determination), whether implicit solvent is used throughout the calculation or only in the final refinement step. The application of advanced sampling techniques also seems to have minimal impact in this case. However, when the experimental data are limited, we demonstrate that refinement with implicit solvent can substantially improve the quality of the structures. In particular, when combined with an advanced sampling technique, the replica exchange (REX) method, near-native structures can be rapidly moved toward the native basin. The REX method provides both enhanced sampling and automatic selection of the most native-like (lowest energy) structures. An optimal protocol based on our studies first generates an ensemble of initial structures that maximally satisfy the available experimental data with conventional NMR software using a simplified force field and then refines these structures with implicit solvent using the REX method. We systematically examine the reliability and efficacy of this protocol using four proteins of various sizes ranging from the 56-residue B1 domain of Streptococcal protein G to the 370-residue Maltose-binding protein. Significant improvement in the structures was observed in all cases when refinement was based on low-redundancy restraint data. The proposed protocol is anticipated to be particularly useful in early stages of NMR structure determination where a reliable estimate of the native fold from limited data can significantly expedite the overall process. This refinement procedure is also expected to be useful when redundant experimental data are not readily available, such as for large multidomain biomolecules and in solid-state NMR structure determination.
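For readers unfamiliar with the replica exchange (REX) component of this protocol, the following is a minimal sketch of the standard Metropolis swap criterion between temperature replicas; it is illustrative only and not the restrained-refinement code used in the study.

    import math, random

    def attempt_swap(energy_i, energy_j, temp_i, temp_j, k_b=0.0019872041):
        # Energies in kcal/mol, temperatures in K (k_b in kcal/mol/K).
        beta_i, beta_j = 1.0 / (k_b * temp_i), 1.0 / (k_b * temp_j)
        delta = (beta_i - beta_j) * (energy_j - energy_i)
        # Accept the exchange with probability min(1, exp(-delta)).
        return delta <= 0.0 or random.random() < math.exp(-delta)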
NASA Astrophysics Data System (ADS)
Carlstrom, John E.
2016-06-01
The now standard model of cosmology has been tested and refined by the analysis of increasingly sensitive, large astronomical surveys, especially with statistically significant millimeter-wave surveys of the cosmic microwave background and optical surveys of the distribution of galaxies. This talk will offer a glimpse of the future, which promises an acceleration of this trend with cosmological information coming from new surveys across the electromagnetic spectrum as well as particles and even gravitational waves.
Aircraft Research and Technology for Future Fuels
NASA Technical Reports Server (NTRS)
1980-01-01
The potential characteristics of future aviation turbine fuels and the property effects of these fuels on propulsion system components are examined. The topics that are discussed include jet fuel supply and demand trends, the effects of refining variables on fuel properties, shale oil processing, the characteristics of broadened property fuels, the effects of fuel property variations on combustor and fuel system performance, and combustor and fuel system technology for broadened property fuels.
NASA Astrophysics Data System (ADS)
Koutroulis, A. G.; Tsanis, I. K.; Jacob, D.
2012-04-01
A robust signal of a warmer and drier climate over the western Mediterranean region is projected from the majority of climate models. This effect appears more pronounced during warm periods, when the seasonal decrease of precipitation can exceed control climatology by 25-30%. The rapid development of Crete in the last 30 years has exerted strong pressures on the natural resources of the region. Urbanization and growth of agriculture, tourism and industry have had a strong impact on the water resources of the island by substantially increasing water demand. The objective of this study is to analyze and assess the impact of global change on the water resources status of the island of Crete for a range of 24 different scenarios of projected hydro-climatological regime, demand and supply potential. Water resources application issues were analyzed within this study, focusing on a refinement of the future water demands of the island and comparing them with "state of the art" global climate model (GCM) results and an ensemble of regional climate models (RCMs) under three different emission scenarios to estimate water resources availability during the 21st century. A robust signal of water scarcity is projected for all the combinations of emission (A2, A1B and B1), demand and infrastructure scenarios. Despite the uncertainty of the assessments, the quantitative impact of the projected changes on water availability indicates that climate change plays a role as important as water use and management in controlling future water status on a Mediterranean island like Crete. The outcome of this analysis will assist in short- and long-term strategic water resources planning by prioritizing water-related infrastructure development.
Status report on education in the economics of animal health: results from a European survey.
Waret-Szkuta, Agnès; Raboisson, Didier; Niemi, Jarkko; Aragrande, Maurizio; Gethmann, Jörn; Martins, Sara Babo; Hans, Lucie; Höreth-Böntgen, Detlef; Sans, Pierre; Stärk, Katharina D; Rushton, Jonathan; Häsler, Barbara
2015-01-01
Education on the use of economics applied to animal health (EAH) has been offered since the 1980s. However, it has never been institutionalized within veterinary curricula, and there is no systematic information on current teaching and education activities in Europe. Nevertheless, the need for economic skills in animal health has never been greater. Economics can add value to disease impact assessments; improve understanding of people's incentives to participate in animal health measures; and help refine resource allocation for public animal health budgets. The use of economics should improve animal health decision making. An online questionnaire was conducted in European countries to assess current and future needs and expectations of people using EAH. The main conclusion from the survey is that education in economics appears to be offered inconsistently in Europe, and information about the availability of training opportunities in this field is scarce. There is a lack of harmonization of EAH education and significant gaps exist in the veterinary curricula of many countries. Depending on whether respondents belonged to educational institutions, public bodies, or private organizations, they expressed concerns regarding the limited education on decision making and impact assessment for animal diseases or on the use of economics for general management. Both public and private organizations recognized the increasing importance of EAH in the future. This should motivate the development of teaching methods and materials that aim at developing the understanding of animal health problems for the benefit of students and professional veterinarians.
The future of anesthesiology: implications of the changing healthcare environment.
Prielipp, Richard C; Cohen, Neal H
2016-04-01
Anesthesiology is at a crossroad, particularly in the USA. We explore the changing and future roles for anesthesiologists, including the implication of new models of care such as the perioperative surgical home, changes in payment methodology, and the impact other refinements in healthcare delivery will have on practice opportunities and training requirements for anesthesiologists. The advances in the practice of anesthesiology are having a significant impact on patient care, allowing a more diverse and complex patient population to benefit from the knowledge, skills and expertise of anesthesiologists. Expanded clinical opportunities, increased utilization of technology and expansion in telemedicine will provide the foundation to care for more patients in diverse settings and to better monitor patients remotely while ensuring immediate intervention as needed. Although the roles of anesthesiologists have been diverse, the scope of practice varies from one country to another. The changing healthcare needs in the USA in particular are creating new opportunities for American anesthesiologists to define expanded roles in healthcare delivery. To fulfill these evolving needs of patients and health systems, resident training, ongoing education and methods to ensure continued competency must incorporate new approaches of education and continued certification to ensure that each anesthesiologist has the full breadth and depth of clinical skills needed to support patient and health system needs. The scope of anesthesia practice has expanded globally, providing anesthesiologists, particularly those in the USA, with unique new opportunities to assume a broader role in perioperative care of surgical patients.
Disaggregation and Refinement of System Dynamics Models via Agent-based Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nutaro, James J; Ozmen, Ozgur; Schryver, Jack C
System dynamics models are usually used to investigate aggregate level behavior, but these models can be decomposed into agents that have more realistic individual behaviors. Here we develop a simple model of the STEM workforce to illuminate the impacts that arise from the disaggregation and refinement of system dynamics models via agent-based modeling. Particularly, alteration of Poisson assumptions, adding heterogeneity to decision-making processes of agents, and discrete-time formulation are investigated and their impacts are illustrated. The goal is to demonstrate both the promise and danger of agent-based modeling in the context of a relatively simple model and to delineate the importance of modeling decisions that are often overlooked.
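To illustrate the kind of disaggregation discussed above, the sketch below contrasts an aggregate first-order outflow from a stock with an agent-level version in which each agent carries its own exit propensity, relaxing the homogeneity (Poisson) assumption; the function and parameter names are illustrative, not those of the STEM workforce model.

    import random

    def aggregate_step(stock, exit_rate, dt):
        # System-dynamics view: one stock drained at a single mean rate.
        return stock - exit_rate * stock * dt

    def agent_step(exit_propensities, dt):
        # Agent-based view: each agent decides independently, with its own rate,
        # so heterogeneity across agents can be represented directly.
        return [p for p in exit_propensities if random.random() >= p * dt]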
Assessing the Impact of Financial Aid Offers on Enrollment Decisions.
ERIC Educational Resources Information Center
Somers, Patricia A.; St. John, Edward P.
1993-01-01
A study tested a model for assessing the impact of financial aid offers on 2,558 accepted students' college enrollment decisions. The analysis demonstrates that financial aid strategies have a substantial influence on enrollment and the systematic analysis of student enrollment decisions can help institutional administrators refine their financing…
Recent advances in automated protein design and its future challenges.
Setiawan, Dani; Brender, Jeffrey; Zhang, Yang
2018-04-25
Protein function is determined by protein structure which is in turn determined by the corresponding protein sequence. If the rules that cause a protein to adopt a particular structure are understood, it should be possible to refine or even redefine the function of a protein by working backwards from the desired structure to the sequence. Automated protein design attempts to calculate the effects of mutations computationally with the goal of more radical or complex transformations than are accessible by experimental techniques. Areas covered: The authors give a brief overview of the recent methodological advances in computer-aided protein design, showing how methodological choices affect final design and how automated protein design can be used to address problems considered beyond traditional protein engineering, including the creation of novel protein scaffolds for drug development. Also, the authors address specifically the future challenges in the development of automated protein design. Expert opinion: Automated protein design holds potential as a protein engineering technique, particularly in cases where screening by combinatorial mutagenesis is problematic. Considering solubility and immunogenicity issues, automated protein design is initially more likely to make an impact as a research tool for exploring basic biology in drug discovery than in the design of protein biologics.
Nickerson, Philip E B; Ortin-Martinez, Arturo; Wallace, Valerie A
2018-01-01
Considerable research effort has been invested into the transplantation of mammalian photoreceptors into healthy and degenerating mouse eyes. Several platforms of rod and cone fluorescent reporting have been central to refining the isolation, purification and transplantation of photoreceptors. The tracking of engrafted cells, including identifying the position, morphology and degree of donor cell integration post-transplant is highly dependent on the use of fluorescent protein reporters. Improvements in imaging and analysis of transplant recipients have revealed that donor cell fluorescent reporters can transfer into host tissue though a process termed material exchange (ME). This recent discovery has chaperoned a new era of interpretation when reviewing the field's use of dissociated donor cell preparations, and has prompted scientists to re-examine how we use and interpret the information derived from fluorescence-based tracking tools. In this review, we describe the status of our understanding of ME in photoreceptor transplantation. In addition, we discuss the impact of this discovery on several aspects of historical rod and cone transplantation data, and provide insight into future standards and approaches to advance the field of cell engraftment.
Nickerson, Philip E. B.; Ortin-Martinez, Arturo; Wallace, Valerie A.
2018-01-01
Considerable research effort has been invested into the transplantation of mammalian photoreceptors into healthy and degenerating mouse eyes. Several platforms of rod and cone fluorescent reporting have been central to refining the isolation, purification and transplantation of photoreceptors. The tracking of engrafted cells, including identifying the position, morphology and degree of donor cell integration post-transplant is highly dependent on the use of fluorescent protein reporters. Improvements in imaging and analysis of transplant recipients have revealed that donor cell fluorescent reporters can transfer into host tissue though a process termed material exchange (ME). This recent discovery has chaperoned a new era of interpretation when reviewing the field’s use of dissociated donor cell preparations, and has prompted scientists to re-examine how we use and interpret the information derived from fluorescence-based tracking tools. In this review, we describe the status of our understanding of ME in photoreceptor transplantation. In addition, we discuss the impact of this discovery on several aspects of historical rod and cone transplantation data, and provide insight into future standards and approaches to advance the field of cell engraftment. PMID:29559897
Wang, Jianbo; Xu, Zhenming
2015-01-20
Over the past decades, China has been suffering from negative environmental impacts of improper e-waste recycling activities. After a decade of effort, environmentally friendly disassembly and raw-materials recycling of e-waste have been realized in specialized companies in China, and law enforcement against illegal e-waste recycling activities has also become increasingly strict. E-waste recycling in China should therefore now be developed toward greater depth and refinement to promote industrial production of e-waste resource recovery. Waste printed circuit boards (WPCBs), which are the most complex, hazardous, and valuable components of e-waste, are selected as one typical example in this article, which reviews the status of related regulations and technologies of WPCB recycling and then optimizes and integrates the proper approaches in existence, while the bottlenecks in the WPCB recycling system are analyzed and some preliminary experiments of pinch technologies are also conducted. Finally, in order to provide directional guidance for future development of WPCB recycling, some key points in the WPCB recycling system are proposed to point towards a future trend in the e-waste recycling industry.
Development of a Non-Contact, Inductive Depth Sensor for Free-Surface, Liquid-Metal Flows
NASA Astrophysics Data System (ADS)
Bruhaug, Gerrit; Kolemen, Egemen; Fischer, Adam; Hvasta, Mike
2017-10-01
This paper details a non-contact, inductive depth measurement system that can sit behind a layer of steel and measure the depth of the liquid metal flowing over the steel. Free-surface liquid metal depth measurement is usually done with invasive sensors that impact the flow of the liquid metal, or complex external sensors that require lasers and precise alignment. Neither of these methods is suitable for the extreme environment encountered in the divertor region of a nuclear fusion reactor, where liquid metal open channel flows are being investigated for future use. A sensor was developed that uses the inductive coupling of a coil to the liquid metal to measure the height of the liquid metal present. The sensor was built and tested experimentally, and modeled with finite element modeling software to further understand the physics involved. Future work will attempt to integrate the sensor into the Liquid Metal eXperiment (LMX) at the Princeton Plasma Physics Laboratory for more refined testing. This work was made possible by funding from the Department of Energy for the Summer Undergraduate Laboratory Internship (SULI) program. This work is supported by US DOE Contract No. DE-AC02-09CH11466.
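In practice such a sensor is typically read out through a calibration curve; the sketch below shows one simple way to invert a hypothetical, monotonic coil-response-versus-depth calibration by interpolation. The calibration numbers are placeholders, not measurements from this work.

    import numpy as np

    cal_depth_mm = np.array([0.0, 5.0, 10.0, 15.0, 20.0])   # known liquid-metal depths
    cal_response = np.array([0.0, 0.8, 1.5, 2.0, 2.3])      # coil response (arbitrary units)

    def depth_from_response(measured_response):
        # Invert the monotonic calibration curve by linear interpolation.
        return np.interp(measured_response, cal_response, cal_depth_mm)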
Refined Simulation of Satellite Laser Altimeter Full Echo Waveform
NASA Astrophysics Data System (ADS)
Men, H.; Xing, Y.; Li, G.; Gao, X.; Zhao, Y.; Gao, X.
2018-04-01
The return waveform of a satellite laser altimeter plays a vital role in satellite parameter design, data processing and application. In this paper, a method of refined full waveform simulation is proposed based on the reflectivity of the ground target, the true emission waveform and the Laser Profile Array (LPA). The ICESat/GLAS data are used as the validation data. We evaluated the simulation accuracy with the correlation coefficient. It was found that the accuracy of echo simulation could be significantly improved by considering the reflectivity of the ground target and the emission waveform. However, the laser intensity distribution recorded by the LPA has little effect on the echo simulation accuracy when compared with the distribution of the simulated laser energy. Finally, we propose a refinement based on analysis of the experimental results, in the hope of providing a reference for the waveform data simulation and processing of the GF-7 satellite in the future.
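A minimal sketch of the general idea behind full-waveform simulation, assuming per-return reflectivities and a recorded transmit pulse: bin the footprint's surface heights into range bins, weight each bin by reflectivity, and convolve with the emitted pulse. Bin size and variable names are illustrative; the paper's treatment of the laser profile array is not reproduced here.

    import numpy as np

    def simulate_return(tx_pulse, surface_heights, reflectivities, bin_size=0.15):
        # Reflectivity-weighted histogram of heights acts as the target response.
        edges = np.arange(surface_heights.min(),
                          surface_heights.max() + 2 * bin_size, bin_size)
        response, _ = np.histogram(surface_heights, bins=edges, weights=reflectivities)
        # The simulated waveform is the emitted pulse convolved with that response.
        return np.convolve(response, tx_pulse, mode="full")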
Plasma Vehicle Charging Analysis for Orion Flight Test 1
NASA Technical Reports Server (NTRS)
Lallement, L.; McDonald, T.; Norgard, J.; Scully, B.
2014-01-01
In preparation for the upcoming experimental test flight for the Orion crew module, considerable interest was raised over the possibility of exposure to elevated levels of plasma activity and vehicle charging both externally on surfaces and internally on dielectrics during the flight test orbital operations. Initial analysis using NASCAP-2K indicated very high levels of exposure, and this generated additional interest in refining/defining the plasma and spacecraft models used in the analysis. This refinement was pursued, resulting in the use of specific AE8 and AP8 models, rather than SCATHA models, as well as consideration of flight trajectory, time duration, and other parameters possibly affecting the levels of exposure and the magnitude of charge deposition. Analysis using these refined models strongly indicated that, for flight test operations, no special surface coatings were necessary for the thermal protection system, but would definitely be required for future GEO, trans-lunar, and extra-lunar missions...
Plasma Vehicle Charging Analysis for Orion Flight Test 1
NASA Technical Reports Server (NTRS)
Scully, B.; Norgard, J.
2015-01-01
In preparation for the upcoming experimental test flight for the Orion crew module, considerable interest was raised over the possibility of exposure to elevated levels of plasma activity and vehicle charging both externally on surfaces and internally on dielectrics during the flight test orbital operations. Initial analysis using NASCAP-2K indicated very high levels of exposure, and this generated additional interest in refining/defining the plasma and spacecraft models used in the analysis. This refinement was pursued, resulting in the use of specific AE8 and AP8 models, rather than SCATHA models, as well as consideration of flight trajectory, time duration, and other parameters possibly affecting the levels of exposure and the magnitude of charge deposition. Analysis using these refined models strongly indicated that, for flight test operations, no special surface coatings were necessary for the Thermal Protection System (TPS), but would definitely be required for future GEO, trans-lunar, and extra-lunar missions.
Zhang, Yinyin; Brodusch, Nicolas; Descartes, Sylvie; Chromik, Richard R; Gauvin, Raynald
2014-10-01
The electron channeling contrast imaging technique was used to investigate the microstructure of copper coatings fabricated by cold gas dynamic spray. The high velocity impact characteristics for cold spray led to the formation of many substructures, such as high density dislocation walls, dislocation cells, deformation twins, and ultrafine equiaxed subgrains/grains. A schematic model is proposed to explain structure refinement of Cu during cold spray, where an emphasis is placed on the role of dislocation configurations and twinning.
Enzymatic approaches in paper industry for pulp refining and biofilm control.
Torres, C E; Negro, C; Fuente, E; Blanco, A
2012-10-01
The use of enzymes has a high potential in the pulp and paper industry to improve the economics of the paper production process and to achieve, at the same time, a reduced environmental impact. Specific enzymes contribute to reducing the amount of chemicals and energy required for the modification of fibers and help to prevent the formation or development of biofilms. This review is aimed at presenting the latest progress made in the application of enzymes as refining aids and biofilm control agents.
An Agenda for Climate Impacts Science
NASA Astrophysics Data System (ADS)
Kaye, J. A.
2009-12-01
The report Global Change Impacts in the United States released by the US Global Change Research Program in June 2009 identifies a number of areas in which inadequate information or understanding hampers our ability to estimate likely future climate change and its impacts. In this section of the report, the focus is on those areas of climate science that could contribute most towards advancing our knowledge of climate change impacts and those aspects of climate change responsible for these impacts in order to continue to guide decision making. The Report identifies the six most important gaps in knowledge and offers some thoughts on how to address those gaps: 1. Expand our understanding of climate change impacts. There is a clear need to increase understanding of how ecosystems, social and economic systems, human health, and the built environment will be affected by climate change in the context of other stresses. 2. Refine ability to project climate change, including extreme events, at local scales. While climate change is a global issue, it has a great deal of regional variability. There is an indisputable need to improve understanding of climate system effects at these smaller scales, because these are often the scales of decision-making in society. This includes advances in modeling capability and observations needed to address local scales and high-impact extreme events. 3. Expand capacity to provide decision makers and the public with relevant information on climate change and its impacts. Significant potential exists in the US to create more comprehensive measurement, archive, and data-access systems that could provide great benefit to society, which requires defining needed information, gathering it, expanding capacity to deliver it, and improving tools by which decision makers use it to best advantage. 4. Improve understanding of thresholds likely to lead to abrupt changes in climate or ecosystems. Potential areas of research include thresholds that could lead to rapid changes in ice-sheet dynamics that could impact future sea-level rise and tipping points in biological systems (including those that may be associated with ocean acidification). 5. Improve understanding of the most effective ways to reduce the rate and magnitude of climate change, as well as unintended consequences of such actions. Research will help to identify the desired mix of mitigation options necessary to control the rate and magnitude of climate change, and to examine possible unintended consequences of mitigation options. 6. Enhance understanding of how society can adapt to climate change. There is currently limited knowledge about the ability of communities, regions, and sectors to adapt to future climate change. It is important to improve understanding of how to enhance society’s capacity to adapt to a changing climate in the context of other environmental stresses.
Developing Our Water Resources
ERIC Educational Resources Information Center
Volker, Adriaan
1977-01-01
Only very recently developed as a refined scientific discipline, hydrology has to cope with a complexity of problems concerning the present and future management of a vital natural resource, water. This article examines available water supplies and the problems and prospects of water resource development. (Author/MA)
Climate change, ecosystem impacts, and management for Pacific salmon
D.E. Schindler; X. Augerot; E. Fleishman; N.J. Mantua; B. Riddell; M. Ruckelshaus; J. Seeb; M. Webster
2008-01-01
As climate change intensifies, there is increasing interest in developing models that reduce uncertainties in projections of global climate and refine these projections to finer spatial scales. Forecasts of climate impacts on ecosystems are far more challenging and their uncertainties even larger because of a limited understanding of physical controls on biological...
Sustainable Practices in Medicinal Chemistry Part 2: Green by Design.
Aliagas, Ignacio; Berger, Raphaëlle; Goldberg, Kristin; Nishimura, Rachel T; Reilly, John; Richardson, Paul; Richter, Daniel; Sherer, Edward C; Sparling, Brian A; Bryan, Marian C
2017-07-27
With the development of ever-expanding synthetic methodologies, a medicinal chemist's toolkit continues to swell. However, with finite time and resources as well as a growing understanding of our field's environmental impact, it is critical to refine what can be made to what should be made. This review seeks to highlight multiple cheminformatic approaches in drug discovery that can influence and triage design and execution, impacting the likelihood of rapidly generating high-value molecules in a more sustainable manner. This strategy gives chemists the tools to design and refine vast libraries, stress "druglikeness", and rapidly identify SAR trends. Project success, i.e., identification of a clinical candidate, is then reached faster and with fewer molecules, with the farther-reaching ramification of using fewer resources and generating less waste, thereby helping "green" our field.
Scott, Thomas F
2017-04-15
Recent studies suggest a need for refinement of the traditional two phase model of relapse onset multiple sclerosis (RMS) to include dynamically changing subgroups within the broad category of secondary progressive MS (SPMS). These studies challenge the traditional notion that relapses play a minor role in comparison to a secondary progressive (perhaps degenerative) process. Patients fulfilling the broad definition for SPMS may take several courses, including variable rates and patterns of overall worsening. New paradigms or models for mapping the trajectory of disability in RMS and SPMS (clinical phenotyping), including periods of remission, may impact our understanding of the underlying pathology, and will be important in assessing treatments. Copyright © 2017 Elsevier B.V. All rights reserved.
Chang, Y -F; Huang, C -F; Hwang, J -S; Kuo, J -F; Lin, K -M; Huang, H -C; Bagga, S; Kumar, A; Chen, F -P; Wu, C -H
2018-04-01
The analysis aimed to identify the treatment gaps in current fracture liaison services (FLS) and to provide recommendations for best practice establishment of future FLS across the Asia-Pacific region. The findings emphasize the unmet need for the implementation of new programs and provide recommendations for the refinement of existing ones. The study's objectives were to evaluate fracture liaison service (FLS) programs in the Asia-Pacific region and provide recommendations for establishment of future FLS programs. A systematic literature review (SLR) of Medline, PubMed, EMBASE, and Cochrane Library (2000-2017 inclusive) was performed using the following keywords: osteoporosis, fractures, liaison, and service. Inclusion criteria included the following: patients ≥ 50 years with osteoporosis-related fractures; randomized controlled trials or observational studies with control groups (prospective or retrospective), pre-post, cross-sectional and economic evaluation studies. Success of direct or indirect interventions was assessed based on patients' understanding of risk, bone mineral density assessment, calcium intake, osteoporosis treatment, re-fracture rates, adherence, and mortality, in addition to cost-effectiveness. Overall, 5663 unique citations were identified and the SLR identified 159 publications, reporting 37 studies in Asia-Pacific. These studies revealed the unmet need for public health education, adequate funding, and staff resourcing, along with greater cooperation between departments and physicians. These actions can help to overcome therapeutic inertia with sufficient follow-up to ensure adherence to recommendations and compliance with treatment. The findings also emphasize the importance of primary care physicians continuing to prescribe treatment and ensure service remains convenient. These findings highlight the limited evidence supporting FLS across the Asia-Pacific region, emphasizing the unmet need for new programs and/or refinement of existing ones to improve outcomes. With the continued increase in burden of fractures in Asia-Pacific, establishment of new FLS and assessment of existing services are warranted to determine the impact of FLS for healthcare professionals, patients, family/caregivers, and society.
Refining mass formulas for astrophysical applications: A Bayesian neural network approach
NASA Astrophysics Data System (ADS)
Utama, R.; Piekarewicz, J.
2017-10-01
Background: Exotic nuclei, particularly those near the drip lines, are at the core of one of the fundamental questions driving nuclear structure and astrophysics today: What are the limits of nuclear binding? Exotic nuclei play a critical role both in informing theoretical models and in our understanding of the origin of the heavy elements. Purpose: Our aim is to refine existing mass models through the training of an artificial neural network that will mitigate the large model discrepancies far away from stability. Methods: The basic paradigm of our two-pronged approach is an existing mass model that captures as much as possible of the underlying physics, followed by the implementation of a Bayesian neural network (BNN) refinement to account for the missing physics. Bayesian inference is employed to determine the parameters of the neural network so that model predictions may be accompanied by theoretical uncertainties. Results: Despite the undeniable quality of the mass models adopted in this work, we observe a significant improvement (of about 40%) after the BNN refinement is implemented. Indeed, in the specific case of the Duflo-Zuker mass formula, we find that the rms deviation relative to experiment is reduced from σrms=0.503 MeV to σrms=0.286 MeV. These newly refined mass tables are used to map the neutron drip lines (or rather "drip bands") and to study a few critical r-process nuclei. Conclusions: The BNN approach is highly successful in refining the predictions of existing mass models. In particular, the large discrepancy displayed by the original "bare" models in regions where experimental data are unavailable is considerably quenched after the BNN refinement. This lends credence to our approach and has motivated us to publish refined mass tables that we trust will be helpful for future astrophysical applications.
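To make the two-pronged strategy above concrete, the sketch below fits the residuals of a mass model with a small ensemble of neural networks; the ensemble spread stands in only as a crude proxy for the posterior uncertainty a true Bayesian neural network would provide. The residual data, network size and hyperparameters are all invented for illustration and are not those of the paper, which trains on measured masses and mass formulas such as Duflo-Zuker.

# Illustrative sketch (not the authors' code): learn the residuals
# delta = M_exp - M_model of a mass formula with a small ensemble of
# neural networks. The paper uses a Bayesian neural network; an ensemble
# is used here only as a crude stand-in that still gives a mean
# prediction with an uncertainty estimate. The residuals are synthetic.
import numpy as np

rng = np.random.default_rng(0)
Z = rng.integers(20, 100, size=400)
N = rng.integers(20, 160, size=400)
delta = 0.5 * np.sin(Z / 9.0) * np.cos(N / 11.0) + 0.05 * rng.normal(size=Z.size)

X = np.column_stack([Z, N]).astype(float)
X = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize inputs
y = delta

def train_net(X, y, hidden=16, epochs=3000, lr=0.05, seed=0):
    """One small tanh network trained by full-batch gradient descent."""
    r = np.random.default_rng(seed)
    W1 = r.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = r.normal(0, 0.5, (hidden, 1));          b2 = np.zeros(1)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)                  # forward pass
        pred = (H @ W2 + b2).ravel()
        err = pred - y
        gW2 = H.T @ err[:, None] / len(y); gb2 = err.mean(keepdims=True)
        dH = (err[:, None] @ W2.T) * (1 - H**2)   # backpropagate through tanh
        gW1 = X.T @ dH / len(y); gb1 = dH.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
    return lambda Xq: (np.tanh(Xq @ W1 + b1) @ W2 + b2).ravel()

nets = [train_net(X, y, seed=s) for s in range(10)]
preds = np.stack([net(X) for net in nets])        # ensemble predictions
refined_rms = np.sqrt(np.mean((preds.mean(axis=0) - y) ** 2))
print(f"rms residual after refinement: {refined_rms:.3f} MeV "
      f"(mean predictive spread {preds.std(axis=0).mean():.3f} MeV)")

In practice the refined mass would be the original mass-formula value plus the learned residual, with the predictive spread quoted as the theoretical uncertainty.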
NASA Astrophysics Data System (ADS)
Arnold, Nicholas; Loch, Stuart; Ballance, Connor; Thomas, Ed
2017-10-01
Low temperature plasmas (Te < 10 eV) are ubiquitous in the medical, industrial, basic, and dusty plasma communities, and offer an opportunity for researchers to gain a better understanding of atomic processes in plasmas. Here, we report on a new atomic dataset for neutral and low charge states of argon, from which rate coefficients and cross-sections for the electron-impact excitation of neutral argon are determined. We benchmark our results against electron-impact excitation cross-sections available in the literature, finding very good agreement. We have used the Atomic Data and Analysis Structure (ADAS) code suite to calculate a level-resolved, generalized collisional-radiative (GCR) model for line emission in low temperature argon plasmas. By combining our theoretical model with experimental electron temperature, density, and spectral measurements from the Auburn Linear eXperiment for Instability Studies (ALEXIS), we have developed diagnostic techniques to measure metastable fraction, electron temperature, and electron density. In the future we hope to refine our methods, and extend our model to plasmas other than ALEXIS. Supported by the U.S. Department of Energy. Grant Number: DE-FG02-00ER54476.
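The basic link between a cross-section and the rate coefficients mentioned above is the Maxwellian average ⟨σv⟩; a minimal numerical sketch is given below. The cross-section used here is an invented analytic stand-in (its threshold, magnitude and shape are assumptions), not data from the argon calculation, and a real GCR model would of course solve level-resolved population equations on top of such rates.

# Illustrative sketch only: convert a hypothetical electron-impact
# excitation cross section sigma(E) into a Maxwellian-averaged rate
# coefficient q(Te) = <sigma * v>.
import numpy as np

ME = 9.109e-31          # electron mass, kg
EV = 1.602e-19          # J per eV

def sigma(E_eV, E_thr=11.5, sigma0=2.0e-21):
    """Toy excitation cross section (m^2): zero below a threshold of
    roughly 11.5 eV, then a slowly decaying tail. Purely illustrative."""
    E = np.asarray(E_eV, dtype=float)
    return np.where(E > E_thr, sigma0 * np.log(E / E_thr) / (E / E_thr), 0.0)

def rate_coefficient(Te_eV, n=4000):
    """<sigma v> (m^3/s) for a Maxwellian electron energy distribution."""
    E = np.linspace(1e-3, 60.0 * Te_eV, n)                    # energy grid, eV
    f = (2.0 / np.sqrt(np.pi)) * np.sqrt(E) * Te_eV ** -1.5 * np.exp(-E / Te_eV)
    v = np.sqrt(2.0 * E * EV / ME)                            # electron speed, m/s
    return float(np.sum(sigma(E) * v * f) * (E[1] - E[0]))    # trapezoid-like sum

for Te in (2.0, 5.0, 10.0):                                   # eV
    print(f"Te = {Te:4.1f} eV  ->  q = {rate_coefficient(Te):.3e} m^3/s")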
Cost/Effort Drivers and Decision Analysis
NASA Technical Reports Server (NTRS)
Seidel, Jonathan
2010-01-01
Engineering trade study analyses demand consideration of performance, cost and schedule impacts across the spectrum of alternative concepts and in direct reference to product requirements. Prior to detailed design, requirements are too often ill-defined (only goals) and prone to creep, extending well beyond the Systems Requirements Review. Though the lack of engineering design and definitive requirements inhibits the ability to perform detailed cost analyses, affordability trades still comprise the foundation of these future product decisions and must evolve in concert. This presentation excerpts results of the recent NASA subsonic Engine Concept Study for an Advanced Single Aisle Transport to demonstrate an affordability evaluation of performance characteristics and the subsequent impacts on engine architecture decisions. Applying the Process Based Economic Analysis Tool (PBEAT), development cost, production cost, as well as operation and support costs were considered in a traditional weighted ranking of the following system-level figures of merit: mission fuel burn, take-off noise, NOx emissions, and cruise speed. Weighting factors were varied to ascertain the architecture ranking sensitivities to these performance figures of merit with companion cost considerations. A more detailed examination of supersonic variable cycle engine cost is also briefly presented, with observations and recommendations for further refinements.
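A weighted figure-of-merit ranking with a weight-sensitivity check, of the kind described above, can be illustrated in a few lines. The candidate architectures, their normalized scores and the weight sets below are hypothetical; PBEAT and the actual study data are not modelled.

# Minimal sketch of a weighted figure-of-merit ranking with varied weights.
# All concepts, scores and weights are invented for illustration.
import numpy as np

concepts = ["geared turbofan", "open rotor", "advanced direct drive"]
criteria = ["fuel burn", "takeoff noise", "NOx", "cruise speed", "cost"]

# Normalized scores in [0, 1], higher is better (hypothetical values).
scores = np.array([
    [0.80, 0.70, 0.75, 0.60, 0.55],
    [0.90, 0.40, 0.70, 0.50, 0.45],
    [0.65, 0.80, 0.60, 0.70, 0.70],
])

def rank(weights):
    w = np.asarray(weights, float)
    w = w / w.sum()                      # normalize the weight vector
    totals = scores @ w                  # weighted sum per concept
    order = np.argsort(totals)[::-1]
    return [(concepts[i], round(float(totals[i]), 3)) for i in order]

print("baseline weights:", rank([0.3, 0.2, 0.2, 0.1, 0.2]))
print("noise-heavy     :", rank([0.2, 0.4, 0.2, 0.1, 0.1]))  # sensitivity to weights

Re-running the ranking under several weight sets is the simplest way to see whether an architecture decision is robust to the inevitably soft requirements noted above.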
Final Reports of the Stardust ISPE: Seven Probable Interstellar Dust Particles
NASA Technical Reports Server (NTRS)
Allen, Carlton; Sans Tresseras, Juan-Angel; Westphal, Andrew J.; Stroud, Rhonda M.; Bechtel, Hans A.; Brenker, Frank E.; Butterworth, Anna L.; Flynn, George J.; Frank, David R.; Gainsforth, Zack;
2014-01-01
The Stardust spacecraft carried the first spaceborne collector specifically designed to capture and return a sample of contemporary interstellar dust to terrestrial laboratories for analysis [1]. The collector was exposed to the interstellar dust stream in two periods in 2000 and 2002, with a total exposure of approximately 1.8 × 10^6 m^2·s. Approximately 85% of the collector consisted of aerogel, and the remainder consisted of Al foils. The Stardust Interstellar Preliminary Examination (ISPE) was a consortium-based effort to characterize the collection in sufficient detail to enable future investigators to make informed sample requests. Among the questions to be answered were these: How many impacts are consistent in their characteristics with interstellar dust, with interplanetary dust, and with secondary ejecta from impacts on the spacecraft? Are the materials amorphous or crystalline? Are organics detectable? An additional goal of the ISPE was to develop or refine the techniques for preparation, analysis, and curation of these tiny samples, expected to be approximately 1 picogram or smaller, roughly three orders of magnitude smaller in mass than the samples in NASA's other small particle collections - the cometary samples returned by Stardust, and the collection of Interplanetary Dust Particles collected in the stratosphere.
Guieysse, Benoit; Norvill, Zane N
2014-02-28
When direct wastewater biological treatment is unfeasible, a cost- and resource-efficient alternative to direct chemical treatment consists of combining biological treatment with a chemical pre-treatment aiming to convert the hazardous pollutants into more biodegradable compounds. Whereas the principles and advantages of sequential treatment have been demonstrated for a broad range of pollutants and process configurations, recent progress (2011-present) in the field provides the basis for refining assessment of feasibility, costs, and environmental impacts. This paper thus reviews recent real wastewater demonstrations at pilot and full scale as well as new process configurations. It also discusses new insights on the potential impacts of microbial community dynamics on process feasibility, design and operation. Finally, it sheds light on a critical issue that has not yet been properly addressed in the field: integration requires complex and tailored optimization and, of paramount importance to full-scale application, is sensitive to uncertainty and variability in the inputs used for process design and operation. Future research is therefore critically needed to improve process control and better assess the real potential of sequential chemical-biological processes for industrial wastewater treatment.
Sensitivity of Asteroid Impact Risk to Uncertainty in Asteroid Properties and Entry Parameters
NASA Astrophysics Data System (ADS)
Wheeler, Lorien; Mathias, Donovan; Dotson, Jessie L.; NASA Asteroid Threat Assessment Project
2017-10-01
A central challenge in assessing the threat posed by asteroids striking Earth is the large amount of uncertainty inherent throughout all aspects of the problem. Many asteroid properties are not well characterized and can range widely from strong, dense, monolithic irons to loosely bound, highly porous rubble piles. Even for an object of known properties, the specific entry velocity, angle, and impact location can swing the potential consequence from no damage to causing millions of casualties. Due to the extreme rarity of large asteroid strikes, there are also large uncertainties in how different types of asteroids will interact with the atmosphere during entry, how readily they may break up or ablate, and how much surface damage will be caused by the resulting airbursts or impacts. In this work, we use our Probabilistic Asteroid Impact Risk (PAIR) model to investigate the sensitivity of asteroid impact damage to uncertainties in key asteroid properties, entry parameters, or modeling assumptions. The PAIR model combines physics-based analytic models of asteroid entry and damage in a probabilistic Monte Carlo framework to assess the risk posed by a wide range of potential impacts. The model samples from uncertainty distributions of asteroid properties and entry parameters to generate millions of specific impact cases, and models the atmospheric entry and damage for each case, including blast overpressure, thermal radiation, tsunami inundation, and global effects. To assess the risk sensitivity, we alternately fix and vary the different input parameters and compare the effect on the resulting range of damage produced. The goal of these studies is to help guide future efforts in asteroid characterization and model refinement by determining which properties most significantly affect the potential risk.
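A conceptual sketch of this kind of Monte Carlo sensitivity study is shown below: sample asteroid properties and entry parameters, evaluate a damage proxy for each case, and compare the spread of outcomes when one input is fixed versus varied. The distributions and the damage proxy (impact kinetic energy scaled by entry angle) are invented for illustration and are far simpler than the physics-based entry and damage models in PAIR.

# Conceptual sketch (not the PAIR model): Monte Carlo sensitivity of a crude
# damage proxy to asteroid property and entry-parameter uncertainty.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

def sample(fixed=None):
    p = {
        "diameter_m": rng.lognormal(mean=np.log(80), sigma=0.6, size=N),
        "density":    rng.uniform(1500, 3500, size=N),        # kg/m^3
        "velocity":   rng.uniform(11_000, 30_000, size=N),    # m/s
        # standard sin(2*theta) entry-angle distribution for isotropic flux
        "angle_deg":  np.degrees(np.arcsin(np.sqrt(rng.uniform(0, 1, size=N)))),
    }
    if fixed:
        for k, v in fixed.items():
            p[k] = np.full(N, v)                              # freeze one input
    return p

def damage_proxy(p):
    """Kinetic energy (Mt TNT) scaled by a crude entry-angle factor."""
    mass = p["density"] * (np.pi / 6.0) * p["diameter_m"] ** 3
    e_mt = 0.5 * mass * p["velocity"] ** 2 / 4.184e15
    return e_mt * np.sin(np.radians(p["angle_deg"]))

base = damage_proxy(sample())
fixed_rho = damage_proxy(sample(fixed={"density": 2500.0}))

for name, d in [("all inputs varied", base), ("density fixed", fixed_rho)]:
    q5, q95 = np.percentile(d, [5, 95])
    print(f"{name:18s}: median {np.median(d):8.2f} Mt, 5-95% range {q5:.2f}-{q95:.2f} Mt")

Comparing the quantile ranges of the two runs mirrors the fix-and-vary strategy described in the abstract: the inputs whose freezing shrinks the outcome spread the most are the ones worth characterizing first.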
Microstructural Modeling of Dynamic Intergranular and Transgranular Fracture Modes in Zircaloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohammed, I.; Zikry, M.A.; Ziaei, S.
2017-04-01
In this time period, we have continued to focus on (i) refining the thermo-mechanical fracture model for zirconium (Zr) alloys subjected to large deformations and high temperatures that accounts for the cracking of ZrH and ZrH2 hydrides, and (ii) formulating a framework to account for intergranular fracture due to iodine diffusion and pit formation in grain boundaries (GBs). Our future objectives are focused on extending the model to combined ZrH and ZrH2 hydride populations and understanding how thermo-mechanical behavior affects hydride reorientation and cracking. We will also refine the intergranular failure mechanisms for grain boundaries with pits.
High Speed Solid State Circuit Breaker
NASA Technical Reports Server (NTRS)
Podlesak, Thomas F.
1993-01-01
The U.S. Army Research Laboratory, Fort Monmouth, NJ, has developed and is installing two 3.3 MW high speed solid state circuit breakers at the Army's Pulse Power Center. These circuit breakers will interrupt 4160V three phase power mains in no more than 300 microseconds, two orders of magnitude faster than conventional mechanical contact type circuit breakers. These circuit breakers utilize Gate Turnoff Thyristors (GTO's) and are currently utility type devices using air cooling in an air conditioned enclosure. Future refinements include liquid cooling, either water or two phase organic coolant, and more advanced semiconductors. Each of these refinements promises a more compact, more reliable unit.
Using supercritical fluids to refine hydrocarbons
Yarbro, Stephen Lee
2014-11-25
This is a method to reactively refine hydrocarbons, such as heavy oils with API gravities of less than 20° and bitumen-like hydrocarbons with viscosities greater than 1000 cp at standard temperature and pressure, using a selected fluid at supercritical conditions. The reaction portion of the method delivers lighter weight, more volatile hydrocarbons to an attached contacting device that operates in mixed subcritical or supercritical modes. This separates the reaction products into portions that are viable for use or sale without further conventional refining and hydro-processing techniques. This method produces valuable products with fewer processing steps and lower costs, increases worker safety due to less processing and handling, allows greater opportunity for new oil field development and subsequent positive economic impact, and reduces the carbon dioxide emissions and wastes typical of conventional refineries.
Reformulated Gasoline Foreign Refinery Rules (Short-Term Energy Outlook Supplement January 1998)
1998-01-01
On August 27, 1997, the Environmental Protection Agency (EPA) promulgated revised rules that allow foreign refiners to establish and use individual baselines, though doing so is not mandatory (the optional use of an individual refinery baseline is not available to domestic refiners). If a foreign refiner does not establish and use an individual baseline, the gasoline it exports to the United States is regulated through the importer and subject to the importer's baseline (most likely the statutory baseline). Specific regulatory provisions are implemented to ensure that the option to use an individual baseline does not lead to adverse environmental impacts. This involves monitoring the average quality of imported gasoline; if a specified benchmark is exceeded, remedial action would be taken by adjusting the requirements applicable to imported gasoline.
Brooks, Simon; Ebenezer, Neil; Poopalasundaram, Subathra; Maher, Eamonn; Francis, Peter; Moore, Anthony; Hardcastle, Alison
2004-06-01
The X-linked congenital cataract (CXN) locus has been mapped to a 3-cM (approximately 3.5 Mb) interval on chromosome Xp22.13, which is syntenic to the mouse cataract disease locus Xcat and encompasses the recently refined Nance-Horan syndrome (NHS) locus. A positional cloning strategy has been adopted to identify the causative gene. In an attempt to refine the CXN locus, seven microsatellites were analysed within 21 individuals of a CXN family. Haplotypes were reconstructed, confirming disease segregation with markers on Xp22.13. In addition, a proximal cross-over was observed between markers S3 and S4, thereby refining the CXN disease interval by approximately 400 kb to 3.2 Mb, flanked by markers DXS9902 and S4. Two known genes (RAI2 and RBBP7) and a novel gene (TL1) were screened for mutations within an affected male from the CXN family and an NHS family by direct sequencing of coding exons and intron-exon splice sites. No mutations or polymorphisms were identified, therefore excluding them as disease-causative in CXN and NHS. In conclusion, the CXN locus has been successfully refined, and the refined interval excludes PPEF1 as a candidate gene. A further three candidates were excluded based on sequence analysis. Future positional cloning efforts will focus on the region of overlap between CXN, Xcat, and NHS.
Sim, Biow Ing; Muhamad, Halimah; Lai, Oi Ming; Abas, Faridah; Yeoh, Chee Beng; Nehdi, Imededdine Arbi; Khor, Yih Phing; Tan, Chin Ping
2018-04-01
This paper examines the interactions of the degumming and bleaching processes as well as their influence on the formation of 3-monochloropropane-1,2-diol esters (3-MCPDE) and glycidyl esters in refined, bleached and deodorized palm oil by using a D-optimal design. Water degumming effectively reduced the 3-MCPDE content by up to 50%. Acid activated bleaching earth had a greater effect on 3-MCPDE reduction compared to natural bleaching earth and acid activated bleaching earth with neutral pH, indicating that the performance and adsorption capacity of the bleaching earth are the predominant factors in the removal of esters, rather than its acidity profile. The combination of a high dosage of phosphoric acid during degumming with the use of acid activated bleaching earth eliminated almost all glycidyl esters during refining. In addition, the effect of crude palm oil quality was assessed, and it was found that the quality of the crude palm oil determines the level of formation of 3-MCPDE and glycidyl esters during the high temperature deodorization step of the physical refining process. Poor-quality crude palm oil has a strong impact on 3-MCPDE and glycidyl ester formation due to the intrinsic components present within it. The findings are useful to the palm oil refining industry in choosing raw materials as input to the refining process.
An Advanced Neutron Spectrometer for Future Manned Exploration Missions
NASA Technical Reports Server (NTRS)
Christl, Mark; Apple, Jeffrey A.; Cox, Mark D.; Dietz, Kurtis L.; Dobson, Christopher C.; Gibson, Brian F.; Howard, David E.; Jackson, Amanda C.; Kayatin, Mathew J.; Kuznetsov, Evgeny N.;
2014-01-01
An Advanced Neutron Spectrometer (ANS) is being developed to support future manned exploration missions. This new instrument uses a refined gate and capture technique that significantly improves the identification of neutrons in the mixed radiation fields found in spacecraft, habitats, and on planetary surfaces. The new instrument is a composite scintillator composed of PVT loaded with lithium-6 glass scintillators. We will describe the detection concept and show preliminary results from laboratory tests and exposures at particle accelerators.
NASA Astrophysics Data System (ADS)
Zhou, Cheng; Ye, Qibin; Yan, Ling
The effects of ultra-fast cooling (UFC) and conventional accelerated cooling (AcC) on the mechanical properties and microstructure of controlled-rolled AH32 grade steel plates produced on an industrial scale were compared using tensile tests, Charpy impact tests, welding thermal simulation, and microscopic analysis. The results show that the properties of the plate produced by UFC are improved considerably compared to that produced by AcC. The yield strength is increased by 54 MPa without deterioration in ductility, and the impact energy is improved to more than 260 J at -60 °C with a much lower ductile-to-brittle transition temperature (DBTT). The ferrite grain size is refined to ASTM No. 11.5 in the UFC steel with a uniform microstructure throughout the thickness direction, while that of the AcC steel is ASTM No. 9.5. The analysis of the nucleation kinetics of α-ferrite indicates that the microstructure is refined due to the increased nucleation rate of α-ferrite caused by the much lower γ→α transition temperature in the UFC process. The Hall-Petch effect is quantified for the improvement of the strength and toughness of the UFC steel attributed to the grain refinement.
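For readers unfamiliar with the grain-size argument, the Hall-Petch relation makes the strengthening estimate explicit:

\[
  \sigma_y = \sigma_0 + k_y\, d^{-1/2},
  \qquad
  \Delta\sigma_y = k_y\left(d_{\mathrm{UFC}}^{-1/2} - d_{\mathrm{AcC}}^{-1/2}\right).
\]

Taking d_AcC ≈ 13 μm (ASTM No. 9.5), d_UFC ≈ 7 μm (ASTM No. 11.5) and a literature-typical ferritic-steel slope k_y ≈ 0.5-0.6 MPa·m^{1/2} gives Δσ_y of roughly 50-60 MPa, the same order as the 54 MPa gain reported above. These grain diameters and the slope are assumed typical values for illustration, not measurements from this study.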
Assessment of the radiological impact of oil refining industry.
Bakr, W F
2010-03-01
The field of radiation protection and corresponding national and international regulations has evolved to ensure safety in the use of radioactive materials. Oil and gas production and processing operations have been known to cause naturally occurring radioactive materials (NORMs) to accumulate at elevated concentrations in by-product waste streams. A comprehensive radiological study of the oil refining industry in Egypt was carried out to assess the radiological impact of this industry on workers. Scale, sludge, water and crude oil samples were collected at each stage of the refining process. The activity concentrations of (226)Ra, (232)Th and (40)K were determined using high-resolution gamma spectrometry. The average activity concentrations of the determined isotopes are lower than the IAEA exempt activity levels for NORM isotopes. Different exposure scenarios were studied. The average annual effective doses for workers due to direct exposure to gamma radiation and to dust inhalation were found to be 0.6 microSv and 3.2 mSv, respectively. Based on the ALARA principle, the results indicate that special care must be taken during cleaning operations in order to reduce personnel exposure during maintenance as well as to avoid contamination of the environment.
Models of emergency departments for reducing patient waiting times.
Laskowski, Marek; McLeod, Robert D; Friesen, Marcia R; Podaima, Blake W; Alfa, Attahiru S
2009-07-02
In this paper, we apply both agent-based models and queuing models to investigate patient access and patient flow through emergency departments. The objective of this work is to gain insights into the comparative contributions and limitations of these complementary techniques, in their ability to contribute empirical input into healthcare policy and practice guidelines. The models were developed independently, with a view to compare their suitability to emergency department simulation. The current models implement relatively simple general scenarios, and rely on a combination of simulated and real data to simulate patient flow in a single emergency department or in multiple interacting emergency departments. In addition, several concepts from telecommunications engineering are translated into this modeling context. The framework of multiple-priority queue systems and the genetic programming paradigm of evolutionary machine learning are applied as a means of forecasting patient wait times and as a means of evolving healthcare policy, respectively. The models' utility lies in their ability to provide qualitative insights into the relative sensitivities and impacts of model input parameters, to illuminate scenarios worthy of more complex investigation, and to iteratively validate the models as they continue to be refined and extended. The paper discusses future efforts to refine, extend, and validate the models with more data and real data relative to physical (spatial-topographical) and social inputs (staffing, patient care models, etc.). Real data obtained through proximity location and tracking system technologies is one example discussed.
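To give a flavour of the queuing side of such models, the toy discrete-event simulation below tracks waiting times in a multi-priority queue served by a fixed number of physicians. The arrival rate, triage mix, service times and staffing level are invented for illustration and are unrelated to the data behind the study.

# Toy discrete-event sketch of a multi-priority emergency-department queue,
# in the spirit of the queuing models described above. All parameters are
# invented for illustration only.
import heapq
import itertools
import random

N_DOCTORS = 3
ARRIVAL_RATE = 6 / 60.0                       # patients per minute (6 per hour)
MEAN_SERVICE = {1: 45.0, 2: 30.0, 3: 15.0}    # minutes, by triage priority
PRIORITY_MIX = [0.1, 0.4, 0.5]                # share of priorities 1..3

def simulate(horizon_min=7 * 24 * 60, seed=1):
    rng = random.Random(seed)
    seq = itertools.count()                   # tie-breaker for the event heap
    events = []                               # (time, seq, kind, priority)
    t = 0.0
    while True:                               # Poisson arrival process
        t += rng.expovariate(ARRIVAL_RATE)
        if t > horizon_min:
            break
        pr = rng.choices([1, 2, 3], weights=PRIORITY_MIX)[0]
        heapq.heappush(events, (t, next(seq), "arrival", pr))

    idle, waiting = N_DOCTORS, []             # waiting: (priority, arrival time)
    waits = {1: [], 2: [], 3: []}
    while events:
        now, _, kind, pr = heapq.heappop(events)
        if kind == "arrival":
            heapq.heappush(waiting, (pr, now))
        else:                                 # a physician becomes free
            idle += 1
        while idle and waiting:               # most urgent waiting patient first
            p, arrived = heapq.heappop(waiting)
            waits[p].append(now - arrived)
            idle -= 1
            done = now + rng.expovariate(1.0 / MEAN_SERVICE[p])
            heapq.heappush(events, (done, next(seq), "done", None))
    return {p: sum(w) / len(w) for p, w in waits.items() if w}

print({p: round(m, 1) for p, m in simulate().items()})  # mean wait (min) by priority

Lower priority numbers are treated as more urgent and are always served first; within a priority level, patients are served in arrival order, which is the simplest of the priority disciplines such models explore.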
Advancing the 3Rs in regulatory ecotoxicology: A pragmatic cross-sector approach.
Burden, Natalie; Benstead, Rachel; Clook, Mark; Doyle, Ian; Edwards, Peter; Maynard, Samuel K; Ryder, Kathryn; Sheahan, Dave; Whale, Graham; van Egmond, Roger; Wheeler, James R; Hutchinson, Thomas H
2016-07-01
The ecotoxicity testing of chemicals for prospective environmental safety assessment is an area in which a high number of vertebrates are used across a variety of industry sectors. Refining, reducing, and replacing the use of animals such as fish, birds, and amphibians for this purpose addresses the ethical concerns and the increasing legislative requirements to consider alternative test methods. Members of the UK-based National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs) Ecotoxicology Working Group, consisting of representatives from academia, government organizations, and industry, have worked together over the past 6 y to provide evidence bases to support and advance the application of the 3Rs in regulatory ecotoxicity testing. The group recently held a workshop to identify the areas of testing, demands, and drivers that will have an impact on the future of animal use in regulatory ecotoxicology. As a result of these discussions, we have developed a pragmatic approach to prioritize and realistically address key opportunity areas, to enable progress toward the vision of a reduced reliance on the use of animals in this area of testing. This paper summarizes the findings of this exercise and proposes a pragmatic strategy toward our key long-term goals-the incorporation of reliable alternatives to whole-organism testing into regulations and guidance, and a culture shift toward reduced reliance on vertebrate toxicity testing in routine environmental safety assessment. Integr Environ Assess Manag 2016;12:417-421. © 2015 SETAC.
NASA Astrophysics Data System (ADS)
Lewis, S.; Dunbar, R. B.; Mucciarone, D.; Barkdull, M.
2017-12-01
Scientific tools assessing impacts to watershed and coastal ecosystem services, like those from land-use land conversion (LULC), are critical for sustainable land management strategies. Small island nations are particularly vulnerable to LULC threats, especially sediment delivery, given their small spatial size and reliance on natural resources. In the Republic of Palau, a small Pacific island country, three major land-use activities—construction, fires, and agriculture—have increased sediment delivery to important estuarine and coastal habitats (i.e., rivers, mangroves, coral reefs) over the past 30 years. This project examines the predictive capacity of an ecosystem services model, the Natural Capital Project's InVEST, for sediment delivery using historic land-use and coral geochemical analysis. These refined model projections are used to assess ecosystem services tradeoffs under different future land development and management scenarios. Coral cores (20-41 cm in length) were sampled along a high-to-low sedimentation gradient (i.e., near major rivers (high-impact) and the ocean (low-impact)) in Micronesia's largest estuary, Ngeremeduu Bay. Isotopic indicators of seasonality (δ18O and δ13C values, ‰ VPDB) were used to construct the age model for each core. Barium, manganese, and yttrium were used as trace metal proxies for sedimentation and measured in each core using laser ablation ICP-MS. Finally, the Natural Capital Project's InVEST sediment delivery model was paired with geospatial data to examine the drivers of sediment delivery (i.e., construction, farms and fires) within these watersheds. A thirty-year record of trace metal to calcium ratios in coral skeletons shows a peak in sedimentation during 2006 and 2007, and in 2012. These results suggest historic peaks in sediment delivery correlate with large-scale road construction and support previous findings that Ngeremeduu Bay has reached a tipping point in retaining sediment. The InVEST sediment delivery model results suggest fires increase sediment export by an order of magnitude compared with the other major land-use activities. A refined measure of LULC from a novel database (earth-moving permits) will be used to develop a more accurate depiction of sediment delivery to estuarine and coastal habitats.
NASA Astrophysics Data System (ADS)
Shabani, Farzin; Kumar, Lalit; Taylor, Subhashni
2014-11-01
This study set out to model potential date palm distribution under current and future climate scenarios using an emission scenario in conjunction with two different global climate models (GCMs), CSIRO-Mk3.0 (CS) and MIROC-H (MR), and to refine the results based on suitability under four nonclimatic parameters. Areas containing suitable physicochemical soil properties and suitable soil taxonomy, together with land slopes of less than 10° and suitable land uses for date palm (Phoenix dactylifera), were selected as appropriate refining tools to ensure the CLIMEX results were accurate and robust. Results showed that large regions of Iran are projected as likely to become climatically suitable for date palm cultivation based on the projected scenarios for the years 2030, 2050, 2070, and 2100. The study also showed CLIMEX outputs merit refinement by nonclimatic parameters and that the incremental introduction of each additional parameter decreased the disagreement between GCMs. Furthermore, the study indicated that the least amount of disagreement in terms of areas conducive to date palm cultivation resulted from the CS and MR GCMs when the locations of suitable physicochemical soil properties and soil taxonomy were used as refinement tools.
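The refinement step described above amounts to masking a climate-suitability surface with nonclimatic layers. The toy sketch below shows that operation on small random rasters; the CLIMEX Ecoclimatic Index values, the slope, soil and land-use layers, and the suitability threshold are all invented for illustration.

# Schematic sketch of refining a climate-suitability surface with nonclimatic
# masks (slope, soil, land use). Tiny toy rasters stand in for real GIS layers.
import numpy as np

rng = np.random.default_rng(3)
shape = (5, 5)

climex_suitability = rng.uniform(0, 100, shape)     # stand-in for a CLIMEX EI score
slope_deg          = rng.uniform(0, 25, shape)
soil_suitable      = rng.random(shape) > 0.3        # boolean soil-taxonomy mask
landuse_suitable   = rng.random(shape) > 0.2        # boolean land-use mask

refined = np.where(
    (slope_deg < 10) & soil_suitable & landuse_suitable,
    climex_suitability,
    0.0,                                            # excluded by nonclimatic criteria
)

print("cells climatically suitable (EI > 30):", int((climex_suitability > 30).sum()))
print("cells suitable after refinement      :", int((refined > 30).sum()))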
Prescott, Mark J; Brown, Verity J; Flecknell, Paul A; Gaffan, David; Garrod, Kate; Lemon, Roger N; Parker, Andrew J; Ryder, Kathy; Schultz, Wolfram; Scott, Leah; Watson, Jayne; Whitfield, Lucy
2010-11-30
This report provides practical guidance on refinement of the use of food and fluid control as motivational tools for macaques used in behavioural neuroscience research. The guidance is based on consideration of the scientific literature and, where data are lacking, expert opinion and professional experience, including that of the members of a Working Group convened by the United Kingdom National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs). The report should be useful to researchers, veterinarians and animal care staff responsible for the welfare of macaques used in food and fluid control protocols, as well as those involved with designing, performing and analysing studies that use these protocols. It should also assist regulatory authorities and members of local ethical review processes or institutional animal care and use committees concerned with evaluating such protocols. The report provides a framework for refinement that can be tailored to meet local requirements. It also identifies data gaps and areas for future research and sets out the Working Group's recommendations on contemporary best practice.
Socio-economic Impact Analysis for Near Real-Time Flood Detection in the Lower Mekong River Basin
NASA Astrophysics Data System (ADS)
Oddo, P.; Ahamed, A.; Bolten, J. D.
2017-12-01
Flood events pose a severe threat to communities in the Lower Mekong River Basin. The combination of population growth, urbanization, and economic development exacerbate the impacts of these flood events. Flood damage assessments are frequently used to quantify the economic losses in the wake of storms. These assessments are critical for understanding the effects of flooding on the local population, and for informing decision-makers about future risks. Remote sensing systems provide a valuable tool for monitoring flood conditions and assessing their severity more rapidly than traditional post-event evaluations. The frequency and severity of extreme flood events are projected to increase, further illustrating the need for improved flood monitoring and impact analysis. In this study we implement a socio-economic damage model into a decision support tool with near real-time flood detection capabilities (NASA's Project Mekong). Surface water extent for current and historical floods is found using multispectral Moderate-resolution Imaging Spectroradiometer (MODIS) 250-meter imagery and the spectral Normalized Difference Vegetation Index (NDVI) signatures of permanent water bodies (MOD44W). Direct and indirect damages to populations, infrastructure, and agriculture are assessed using the 2011 Southeast Asian flood as a case study. Improved land cover and flood depth assessments result in a more refined understanding of losses throughout the Mekong River Basin. Results suggest that rapid initial estimates of flood impacts can provide valuable information to governments, international agencies, and disaster responders in the wake of extreme flood events.
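A much-simplified sketch of NDVI-based surface-water detection of the kind used in such near real-time flood mapping is shown below. The arrays stand in for 250 m red and near-infrared reflectance grids and a permanent-water mask (in the spirit of MOD44W); the NDVI threshold is illustrative, not the operational value.

# Simplified sketch of NDVI-based flood-extent detection. The toy arrays
# stand in for MODIS 250 m reflectance grids and a permanent-water mask.
import numpy as np

rng = np.random.default_rng(7)
shape = (4, 6)

red = rng.uniform(0.02, 0.4, shape)          # surface reflectance, red band
nir = rng.uniform(0.02, 0.6, shape)          # surface reflectance, NIR band
permanent_water = rng.random(shape) < 0.15   # stand-in for a MOD44W-style mask

ndvi = (nir - red) / (nir + red + 1e-9)
water = ndvi < 0.1                           # low NDVI -> likely open water
flood = water & ~permanent_water             # flood = water outside permanent water bodies

print(f"water fraction {water.mean():.2f}, flooded fraction {flood.mean():.2f}")

Overlaying such a flood mask on population, infrastructure and land-cover layers is what turns the detection step into the damage estimates discussed in the abstract.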
NASA Technical Reports Server (NTRS)
Glavin, Daniel P.; Dworkin, Jason P.; Lupisella, Mark; Williams, David R.; Kminek, Gerhard; Rummel, John D.
2010-01-01
NASA and ESA have outlined visions for solar system exploration that will include a series of lunar robotic precursor missions to prepare for, and support, a human return to the Moon and future human exploration of Mars and other destinations, including possibly asteroids. One of the guiding principles for exploration is to pursue compelling scientific questions about the origin and evolution of life. The search for life on objects such as Mars will require careful operations, and that all systems be sufficiently cleaned and sterilized prior to launch to ensure that the scientific integrity of extraterrestrial samples is not jeopardized by terrestrial organic contamination. Under the Committee on Space Research's (COSPAR's) current planetary protection policy for the Moon, no sterilization procedures are required for outbound lunar spacecraft, nor is there a different planetary protection category for human missions, although preliminary COSPAR policy guidelines for human missions to Mars have been developed. Future in situ investigations of a variety of locations on the Moon by highly sensitive instruments designed to search for biologically derived organic compounds would help assess the contamination of the Moon by lunar spacecraft. These studies could also provide valuable "ground truth" data for Mars sample return missions and help define planetary protection requirements for future Mars-bound spacecraft carrying life detection experiments. In addition, studies of the impact of terrestrial contamination of the lunar surface by the Apollo astronauts could provide valuable data to help refine future Mars surface exploration plans for a human mission to Mars.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-01
... Plan Amendment; UNEV Refined Liquid Petroleum Products Pipeline Environmental Impact Statement AGENCY... Environmental Impact Statement (EIS) published on April 16, 2010, is the same as that selected in the ROD. The..., Tooele, Juab, Millard, Beaver, Iron, and Washington Counties in Utah, and in Lincoln and Clark Counties...
NASA Astrophysics Data System (ADS)
Alderman, Rachael; Hobday, Alistair J.
2017-06-01
Conservation of marine species typically focuses on monitoring and mitigating demonstrated stressors where possible. Evidence is accumulating that some species will be negatively affected in the future by climate change and that reduction of existing stressors may not be sufficient to offset these impacts. Recent work suggests the shy albatross (Thalassarche cauta) will be adversely affected by projected changes in environmental conditions under plausible climate change scenarios. Furthermore, modelling shows that elimination of the principal present-day threat to albatrosses, fisheries bycatch, an achievable and critical priority, may not be sufficient to reverse projected population declines due to climate impacts, which cannot be directly eliminated. Here, a case study is presented in which a range of intervention options, in preparation for predicted climate change impacts, are identified and evaluated. A suite of 24 plausible climate adaptation options is first assessed using a semi-quantitative cost-benefit-risk tool, leading to a relative ranking of actions. Of these options, increasing chick survival via reduction of disease prevalence through control of vectors was selected for field trials. Avian insecticide was applied to chicks mid-way through their development and the effect on subsequent survival was evaluated. Survival of treated chicks after six weeks was significantly higher (92.7%) than that of chicks in control areas (82.1%). This approach shows that options to enhance albatross populations exist, and we argue that testing interventions prior to serious impacts can formalise institutional processes and allow refinement of actions that offer some chance of mitigating the impacts of climate change on iconic marine species.
ERIC Educational Resources Information Center
Crank, Ron
This instructional unit is one of 10 developed by students on various energy-related areas that deals specifically with fossil fuels. Some topics covered are historic facts, development of fuels, history of oil production, current and future trends of the oil industry, refining fossil fuels, and environmental problems. Material in each unit may…
Critical Measurement Issues in Translational Research
ERIC Educational Resources Information Center
Glasgow, Russell E.
2009-01-01
This article summarizes critical evaluation needs, challenges, and lessons learned in translational research. Evaluation can play a key role in enhancing successful application of research-based programs and tools as well as informing program refinement and future research. Discussion centers on what is unique about evaluating programs and…
Olsson, I Anna S; Hansen, Axel K; Sandøe, Peter
2008-07-01
The use of animals in biomedical and other research presents an ethical dilemma: we do not want to lose scientific benefits, nor do we want to cause laboratory animals to suffer. Scientists often refer to the potential human benefits of animal models to justify their use. However, even if this is accepted, it still needs to be argued that the same benefits could not have been achieved with a mitigated impact on animal welfare. Reducing the adverse effects of scientific protocols ('refinement') is therefore crucial in animal-based research. It is especially important that researchers share knowledge on how to avoid causing unnecessary suffering. We have previously demonstrated that even in studies in which animal use leads to spontaneous death, scientists often fail to report measures to minimize animal distress (Olsson et al. 2007). In this paper, we present the full results of a case study examining reports, published in peer-reviewed journals between 2003 and 2004, of experiments employing animal models to study the neurodegenerative disorder Huntington's disease. In 51 references, experiments were conducted in which animals were expected to develop motor deficits so severe that they would have difficulty eating and drinking normally, yet only three references were made to housing adaptation to facilitate food and water intake. Experiments including end-stages of the disease were reported in 14 papers, yet of these only six referred to the euthanasia of moribund animals. If the reporting in scientific publications reflects the actual application of refinement, then researchers are not following the 3Rs (replacement, reduction, refinement) principle. While in some cases it is clear that less-than-optimal techniques were used, we recognize that scientists may apply refinement without referring to it; however, if they do not include such information in publications, it suggests they find it less relevant. Journal publishing policy could play an important role: first, in ensuring that referees seriously consider whether submitted studies were indeed carried out with the smallest achievable negative impact on the animals and, secondly, in encouraging scientists to share refinements through the inclusion of a 3Rs section in papers publishing the results of animal-based research.
Osier, Nicole D.; Dixon, C. Edward
2016-01-01
Controlled cortical impact (CCI) is a mechanical model of traumatic brain injury (TBI) that was developed nearly 30 years ago with the goal of creating a testing platform to determine the biomechanical properties of brain tissue exposed to direct mechanical deformation. Initially used to model TBIs produced by automotive crashes, the CCI model rapidly transformed into a standardized technique to study TBI mechanisms and evaluate therapies. CCI is most commonly produced using a device that rapidly accelerates a rod to impact the surgically exposed cortical dural surface. The tip of the rod can be varied in size and geometry to accommodate scalability to different species. Typically, the rod is actuated by a pneumatic piston or electromagnetic actuator. Within limits, CCI devices can control the velocity, depth, duration, and site of impact. The CCI model produces morphologic and cerebrovascular injury responses that resemble certain aspects of human TBI. Commonly observed are graded histologic and axonal derangements, disruption of the blood–brain barrier, subdural and intra-parenchymal hematoma, edema, inflammation, and alterations in cerebral blood flow. The CCI model also produces neurobehavioral and cognitive impairments similar to those observed clinically. In contrast to other TBI models, the CCI device induces a significantly pronounced cortical contusion, but is limited in the extent to which it models the diffuse effects of TBI; a related limitation is that not all clinical TBI cases are characterized by a contusion. Another perceived limitation is that a non-clinically relevant craniotomy is performed. Biomechanically, this is irrelevant at the tissue level. However, craniotomies are not atraumatic, and the effects of surgery should be controlled by including surgical sham control groups. CCI devices have also been successfully used to impact closed skulls to study mild and repetitive TBI. Future directions for CCI research center on continued refinements to the model through technical improvements in the devices (e.g., minimizing mechanical sources of variation). Like all TBI models, publications should report key injury parameters as outlined in the NIH common data elements (CDEs) for pre-clinical TBI. PMID:27582726
Gilson, Nicholas D; Pavey, Toby G; Wright, Olivia Rl; Vandelanotte, Corneel; Duncan, Mitch J; Gomersall, Sjaan; Trost, Stewart G; Brown, Wendy J
2017-05-18
Chronic disease rates are high in truck drivers and have been linked to work routines that promote inactivity and poor diets. This feasibility study examined the extent to which an m-Health financial incentives program facilitated physical activity and healthy dietary choices in Australian truck drivers. Nineteen men (mean [SD] age = 47.5 [9.8] years; BMI = 31.2 [4.6] kg/m²) completed the 20-week program, and used an activity tracker and smartphone application (Jawbone UP™) to regulate small positive changes in occupational physical activity, and fruit, vegetable, saturated fat and processed/refined sugar food/beverage choices. Measures (baseline, end-program, 2-month follow-up; April-December 2014) were accelerometer-determined proportions of work time spent physically active, and a workday dietary questionnaire. Statistical (repeated measures ANOVA) and thematic (interviews) analyses assessed program impact. Non-significant increases in the mean proportions of work time spent physically active were found at end-program and follow-up (+1%; 7 mins/day). Fruit (p = 0.023) and vegetable (p = 0.024) consumption significantly increased by one serve/day at end-program. Non-significant improvements in saturated fat (5%) and processed/refined sugar (1%) food/beverage choices were found at end-program and follow-up. Overall, 65% (n = 11) of drivers demonstrated positive changes in physical activity and at least one dietary choice (e.g. saturated fat) at follow-up. Drivers found the financial incentives component of the program to be a less effective facilitator of change than the activity tracker and smartphone application, although this technology was easier to use for monitoring physical activity than healthy dietary choices. Not all drivers benefitted from the program. However, positive changes in different health behaviours were observed in the majority of participants. Outcomes from this feasibility study inform future intervention development for studies with larger samples. ANZCTR12616001513404. Registered November 2nd, 2016 (retrospectively registered).
Simulation in Metallurgical Processing: Recent Developments and Future Perspectives
NASA Astrophysics Data System (ADS)
Ludwig, Andreas; Wu, Menghuai; Kharicha, Abdellah
2016-08-01
This article briefly addresses the most important topics concerning the numerical simulation of metallurgical processes, namely, multiphase issues (particle and bubble motion and flotation/sedimentation of equiaxed crystals during solidification), multiphysics issues (electromagnetic stirring, electro-slag remelting, Cu electro-refining, fluid-structure interaction, and mushy zone deformation), process simulations on graphics processing units, integrated computational materials engineering, and automatic optimization via simulation. The present state of the art as well as requirements for future developments are presented and briefly discussed.
Greenforce Initiative: Advancing Greener Careers
ERIC Educational Resources Information Center
Mwase, Gloria; Keniry, Julian
2011-01-01
With support from the Bank of America Charitable Foundation and the Charles Stewart Mott Foundation, the National Wildlife Federation (NWF) and Jobs for the Future (JFF) formed the Greenforce Initiative--a two-year venture that will work with community colleges across the nation to strengthen their capacity to implement or refine quality pathways…
An Analysis of the Dimensionality of the Pupil Control Ideology Scale.
ERIC Educational Resources Information Center
Graham, Steve; And Others
1985-01-01
The present study replicated an earlier investigation using the Pupil Control Ideology (PCI) scale. The findings were congruent with earlier results. Consequently, it was recommended that the PCI should be refined and that the 10-item, unidimensional scale should be used in future investigations. (Author/LMO)
My Academic Plan: Helping Students Map Their Future
ERIC Educational Resources Information Center
Williams, John; Mathur, Raghu; Gaston, Jim
2009-01-01
What more important problem could we solve than helping students make intelligent decisions in their course selections? The South Orange County Community College District created a new award-winning system dedicated to helping students define, refine, and implement their personal academic goals. The user-centered design is apparent in the…
LSSA (Low-cost Silicon Solar Array) project
NASA Technical Reports Server (NTRS)
1976-01-01
The Photovoltaic Conversion Program was established to find methods of economically generating enough electrical power to meet future requirements. Activities and progress in the following areas are discussed: silicon-refinement processes; silicon-sheet-growth techniques; encapsulants; manufacturing of off-the-shelf solar arrays; and procurement of semistandardized solar arrays.
...Toward complete use of eastern Oregon's forest resources.
Donald R. Gedney
1963-01-01
Eastern Oregon's economy is definitely timber oriented. Few other segments of its economy have contributed as much to its development or promise more for future growth. The greatest opportunity for expansion through use of its forest resources lies in the direction of greater product diversification and product refinement.
Development of a Culturally Valid Counselor Burnout Inventory for Korean Counselors
ERIC Educational Resources Information Center
Yu, Kumlan; Lee, Sang Min; Nesbit, Elisabeth A.
2008-01-01
This article describes the development of the culturally valid Counselor Burnout Inventory. A multistage approach including item translation; item refinement; and evaluation of factorial validity, reliability, and score validity was used to test constructs and validation. Implications for practice and future research are discussed. (Contains 3…
We previously described our collective judgment methods to engage expert stakeholders in the Comprehensive Environmental Assessment (CEA) workshop process applied to nano-TiO2 and nano-Ag research planning. We identified several lessons learned in engaging stakeholders to identif...
BRODY, DAVID L.; DONALD, CHRISTINE Mac; KESSENS, CHAD C.; YUEDE, CARLA; PARSADANIAN, MAIA; SPINNER, MIKE; KIM, EDDIE; SCHWETYE, KATHERINE E.; HOLTZMAN, DAVID M.; BAYLY, PHILIP V.
2008-01-01
Genetically modified mice represent useful tools for traumatic brain injury (TBI) research and attractive preclinical models for the development of novel therapeutics. Experimental methods that minimize the number of mice needed may increase the pace of discovery. With this in mind, we developed and characterized a prototype electromagnetic (EM) controlled cortical impact device along with refined surgical and behavioral testing techniques. By varying the depth of impact between 1.0 and 3.0 mm, we found that the EM device was capable of producing a broad range of injury severities. Histologically, 2.0-mm impact depth injuries produced by the EM device were similar to 1.0-mm impact depth injuries produced by a commercially available pneumatic device. Behaviorally, 2.0-, 2.5-, and 3.0-mm impacts impaired hidden platform and probe trial water maze performance, whereas 1.5-mm impacts did not. Rotorod and visible platform water maze deficits were also found following 2.5- and 3.0-mm impacts. No impairment of conditioned fear performance was detected. No differences were found between sexes of mice. Inter-operator reliability was very good. Behaviorally, we found that we could statistically distinguish between injury depths differing by 0.5 mm using 12 mice per group and between injury depths differing by 1.0 mm with 7-8 mice per group. Thus, the EM impactor and refined surgical and behavioral testing techniques may offer a reliable and convenient framework for preclinical TBI research involving mice. PMID:17439349
Beyond Texas City: the state of process safety in the unionized U.S. oil refining industry.
McQuiston, Thomas H; Lippin, Tobi Mae; Bradley-Bull, Kristin; Anderson, Joseph; Beach, Josie; Beevers, Gary; Frederick, Randy J; Frederick, James; Greene, Tammy; Hoffman, Thomas; Lefton, James; Nibarger, Kim; Renner, Paul; Ricks, Brian; Seymour, Thomas; Taylor, Ren; Wright, Mike
2009-01-01
The March 2005 British Petroleum (BP) Texas City Refinery disaster provided a stimulus to examine the state of process safety in the U.S. refining industry. Participatory action researchers conducted a nation-wide mail-back survey of United Steelworkers local unions and collected data from 51 unionized refineries. The study examined the prevalence of highly hazardous conditions key to the Texas City disaster, refinery actions to address those conditions, emergency preparedness and response, process safety systems, and worker training. Findings indicate that the key highly hazardous conditions were pervasive and often resulted in incidents or near-misses. Respondents reported worker training was insufficient and less than a third characterized their refineries as very prepared to respond safely to a hazardous materials emergency. The authors conclude that the potential for future disasters plagues the refining industry. In response, they call for effective proactive OSHA regulation and outline ten urgent and critical actions to improve refinery process safety.
Mineral commodity profiles: Cadmium
Butterman, W.C.; Plachy, Jozef
2004-01-01
Overview -- Cadmium is a soft, low-melting-point metal that has many uses. It is similar in abundance to antimony and bismuth and is the 63rd element in order of crustal abundance. Cadmium is associated in nature with zinc (and, less closely, with lead and copper) and is extracted mainly as a byproduct of the mining and processing of zinc. In 2000, it was refined in 27 countries, of which the 8 largest accounted for two-thirds of world production. The United States was the third largest refiner after Japan and China. World production in 2000 was 19,700 metric tons (t) and U.S. production was 1,890 t. In the United States, one company in Illinois and another in Tennessee refined primary cadmium. A Pennsylvania company recovered cadmium from scrap, mainly spent nickel-cadmium (NiCd) batteries. The supply of cadmium in the world and in the United States appears to be adequate to meet future industrial needs; the United States has about 23 percent of the world reserve base.
Robust stereo matching with trinary cross color census and triple image-based refinements
NASA Astrophysics Data System (ADS)
Chang, Ting-An; Lu, Xiao; Yang, Jar-Ferr
2017-12-01
For future 3D TV broadcasting systems and navigation applications, it is necessary to have accurate stereo matching that can precisely estimate the depth map from two spatially separated cameras. In this paper, we first suggest a trinary cross color (TCC) census transform, which can help to achieve an accurate disparity raw matching cost with low computational cost. A two-pass cost aggregation (TPCA) is formed to compute the aggregation cost, and the disparity map can then be obtained by a range winner-take-all (RWTA) process and a white hole filling procedure. To further enhance the accuracy performance, a range left-right checking (RLRC) method is proposed to classify the results as correct, mismatched, or occluded pixels. Then, image-based refinements for the mismatched and occluded pixels are proposed to refine the classified errors. Finally, image-based cross voting and a median filter are employed to complete the fine depth estimation. Experimental results show that the proposed semi-global stereo matching system achieves considerably accurate disparity maps with reasonable computation cost.
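As a simplified stand-in for the matching pipeline described above, the sketch below uses a plain binary census transform, a Hamming-distance matching cost, and winner-take-all disparity selection; the paper's trinary cross color census, two-pass aggregation, RWTA and the image-based refinements are not reproduced here.

# Simplified census-transform stereo matching: binary census codes,
# Hamming-distance cost and winner-take-all disparity selection.
import numpy as np

def census(img, w=3):
    """Binary census transform of a grayscale image with a w x w window."""
    r = w // 2
    pad = np.pad(img, r, mode="edge")
    H, W = img.shape
    codes = np.zeros((H, W), dtype=np.uint32)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            nb = pad[r + dy:r + dy + H, r + dx:r + dx + W]
            codes = (codes << 1) | (nb < img).astype(np.uint32)  # intensity ordering bit
    return codes

def popcount(x):
    """Number of set bits per element (Hamming weight)."""
    x = x.copy(); c = np.zeros_like(x)
    while x.any():
        c += x & 1
        x >>= 1
    return c

def disparity_wta(left, right, max_disp=16):
    """Winner-take-all disparity from Hamming distance of census codes."""
    cl, cr = census(left), census(right)
    H, W = left.shape
    cost = np.full((H, W, max_disp), 255, dtype=np.uint32)
    for d in range(max_disp):
        cost[:, d:, d] = popcount(cl[:, d:] ^ cr[:, : W - d])
    return cost.argmin(axis=2)

# Toy usage: the right view is the left view shifted by 4 pixels.
rng = np.random.default_rng(0)
left = rng.random((32, 64))
right = np.roll(left, -4, axis=1)
est = disparity_wta(left, right, max_disp=8)
print("median estimated disparity:", np.median(est[:, 8:-8]))   # ~4 away from borders

Census-based costs depend only on local intensity orderings, which is what makes them robust to radiometric differences between the two views and a common starting point for the more elaborate transforms the paper develops.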
Topological quantum computation of the Dold-Thom functor
NASA Astrophysics Data System (ADS)
Ospina, Juan
2014-05-01
A possible topological quantum computation of the Dold-Thom functor is presented. The method that will be used is the following: a) certain 1+1-topological quantum field theories valued in symmetric bimonoidal categories are converted into stable homotopical data, using a machinery recently introduced by Elmendorf and Mandell; b) we exploit, in this framework, two recent results (independent of each other) on refinements of Khovanov homology: our refinement into a module over the connective K-theory spectrum and a stronger result by Lipshitz and Sarkar refining Khovanov homology into a stable homotopy type; c) starting from the Khovanov homotopy, the Dold-Thom functor is constructed; d) the full construction is formulated as a topological quantum algorithm. It is conjectured that the Jones polynomial can be described as the analytical index of a certain Dirac operator defined in the context of the Khovanov homotopy using the Dold-Thom functor. As a line for future research, it is interesting to study the corresponding supersymmetric model for which the Khovanov-Dirac operator plays the role of a supercharge.
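For context, the classical Dold-Thom theorem behind the functor named above identifies the homotopy of the infinite symmetric product SP(X) with reduced singular homology; the display below is that standard statement (background only, not part of the construction in this abstract):

\[
  \pi_n\bigl(\mathrm{SP}(X)\bigr) \;\cong\; \widetilde{H}_n(X;\mathbb{Z}), \qquad n \ge 1,
\]

for a connected CW complex X, where SP(X) is the colimit of the finite symmetric products SP^n(X).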
NASA Astrophysics Data System (ADS)
Næss, Mari K.; Kero, Ida; Tranell, Gabriella
2013-08-01
In the production of metallurgical grade silicon (MG-Si), fugitive emissions are a serious concern due to the health risks associated with the fumes formed in different parts of the production process. The fumes are also a potential environmental hazard. Yet, the chemical composition of the fumes from most process steps in the silicon plant, such as the oxidative refining ladle, remains unknown. This in turn constitutes a problem with respect to the correct assessment of the environmental impact and working conditions. A comprehensive industrial measurement campaign was performed at the Elkem Salten MG-Si production plant in Norway. Samples of the ingoing and outgoing mass flows were analyzed by high-resolution inductively coupled plasma mass spectrometry, with respect to 62 elements. In every step of the sampling and sample treatment processes, possible sources of error have been identified and quantified, including process variation, mass measurement accuracy, and contamination risk. Total measurement errors for all elements in all phases are established. The method is applied to estimate the order of magnitude of the elemental emissions via the fumes from the tapping and refining processes, with respect to production mass and year. The elements with higher concentrations in the fume than in the slag and refined silicon include Ag, Bi, Cd, Cu, In, K, Mg, Na, Pb, Rb, Se, Sn, Tl, and Zn, all present in the ppm range. This work constitutes new and vital information to enable the correct assessment of the environmental impact and working conditions at an MG-Si plant.
Two-polarity magnetization in the Manson impact breccia
NASA Technical Reports Server (NTRS)
Steiner, M. B.; Shoemaker, E. M.
1993-01-01
A preliminary paleomagnetic study of the impact breccia matrix and clasts has produced surprising results--nearly antipodal normal and reversed polarity magnetic vectors are observed in different portions of the core. Near-antipodal magnetizations within a segment of matrix and within individual samples rule out core inversion as the explanation of the dual polarity. In both the dense and the sandy matrix breccias, the magnetizations of clasts and matrix within the same core segment are identical; this negative 'conglomerate test' indicates that magnetization originated after impact. Paleomagnetic study of the Manson Impact Structure is an attempt to refine the Ar-40/Ar-39 age (65.7 +/- 1 m.y.) that suggests Manson to be a Cretaceous-Tertiary boundary impact. Refinement is possible because the boundary occurs within a reversed polarity interval (29R) of only 0.5 m.y. duration. The two breccia types in the Manson structure were both examined: one of a very dense matrix and apparently partially melted, and the breccia stratigraphically below it of granular or 'sandy' chloritic matrix. Samples were taken from the matrixes and a wide variety of clast compositions, including granite, diabase, gneiss, amphibolite, and melted granite. Currently, measurements have been made on 22 samples, using 30-35 steps of either alternating field (AF) or thermal demagnetization.
DIGITAL CARTOGRAPHY OF THE PLANETS: NEW METHODS, ITS STATUS, AND ITS FUTURE.
Batson, R.M.
1987-01-01
A system has been developed that establishes a standardized cartographic database for each of the 19 planets and major satellites that have been explored to date. Compilation of the databases involves both traditional and newly developed digital image processing and mosaicking techniques, including radiometric and geometric corrections of the images. Each database, or digital image model (DIM), is a digital mosaic of spacecraft images that have been radiometrically and geometrically corrected and photometrically modeled. During compilation, ancillary data files such as radiometric calibrations and refined photometric values for all camera lens and filter combinations and refined camera-orientation matrices for all images used in the mapping are produced.
Planning for Downtown Circulation Systems. Volume 2. Analysis Techniques.
DOT National Transportation Integrated Search
1983-10-01
This volume contains the analysis and refinement stages of downtown circulator planning. Included are sections on methods for estimating patronage, costs, revenues, and impacts, and a section on methods for performing micro-level analyses.
ERIC Educational Resources Information Center
Martin, Nancy K.; Yin, Zenong; Mayall, Hayley
2006-01-01
This study represents a continuation of research efforts to further refine the Attitudes and Beliefs on Classroom Control (ABCC) Inventory. The purposes of this study were to investigate: (1) the impact of classroom management training on classroom management style; (2) the differences in attitudes toward classroom management between novice and…
Harris, Sharon
2013-01-01
Abstract Appropriately constructed health promotions can improve population health. The authors developed a practical model for designing, evaluating, and improving initiatives to provide optimal value. Three independent model dimensions (impact, engagement, and sustainability) and the resultant three-dimensional paradigm were described using hypothetical case studies, including a walking challenge, a health risk assessment survey, and an individual condition management program. The 3-dimensional model is illustrated and the dimensions are defined. Calculation of a 3-dimensional score for program comparisons, refinements, and measurement is explained. Program 1, the walking challenge, had high engagement and impact, but limited sustainability. Program 2, the health risk assessment survey, had high engagement and sustainability but limited impact. Program 3, the on-site condition management program, had measurable impact and sustainability but limited engagement, because of a lack of program capacity. Each initiative, though successful in 2 dimensions, lacked sufficient evolution along the third axis for optimal value. Calculation of a 3-dimensional score is useful for health promotion program development comparison and refinements, and overall measurement of program success. (Population Health Management 2013;16:291–295) PMID:23869538
Effect of alignment perturbations in a trans-tibial prosthesis user: A pilot study.
Courtney, Anna; Orendurff, Michael S; Buis, Arjan
2016-04-01
A recurring complication in trans-tibial prosthetic limb users is "poor socket fit" with painful residuum-socket interfaces, a consequence of excess pressure. This is due to both poor socket fit and poor socket alignment; however, the interaction of these factors has not been quantified. Through evaluation of kinetic data, this study aimed to articulate an interaction uniting socket design, alignment, and interface pressures. The results will help to refine future studies and should help determine whether sockets can be designed, fitted, and aligned to maximize mobility whilst minimizing injurious forces. Interface pressures were recorded throughout ambulation in one user with "optimal (reference) alignment" followed by 5 malalignments in a patellar tendon-bearing and a hydrocast socket. Marked differences in pressure distribution were discovered when comparing the patellar tendon-bearing with the hydrocast socket and when comparing interface pressures from reference with offset alignment. Patellar tendon-bearing sockets were found to be more sensitive to alignment perturbations than hydrocast sockets. A complex interaction was found, with the most prominent finding demonstrating the requirement for attainment of optimal alignment: a translational alignment error of 10 mm can increase maximum peak pressures by 227% (mean 17.5%). Refinements for future trials are described, and the necessity for future research into socket design, alignment, and interface pressures has been established.
Thrust Vector Control for Nuclear Thermal Rockets
NASA Technical Reports Server (NTRS)
Ensworth, Clinton B. F.
2013-01-01
Future space missions may use Nuclear Thermal Rocket (NTR) stages for human and cargo missions to Mars and other destinations. The vehicles are likely to require engine thrust vector control (TVC) to maintain desired flight trajectories. This paper explores requirements and concepts for TVC systems for representative NTR missions. Requirements for TVC systems were derived using 6 degree-of-freedom models of NTR vehicles. Various flight scenarios were evaluated to determine vehicle attitude control needs and to determine the applicability of TVC. Outputs from the models yielded key characteristics including engine gimbal angles, gimbal rates and gimbal actuator power. Additional factors such as engine thrust variability and engine thrust alignment errors were examined for impacts to gimbal requirements. Various technologies are surveyed for TVC systems for the NTR applications. A key factor in technology selection is the unique radiation environment present in NTR stages. Other considerations including mission duration and thermal environments influence the selection of optimal TVC technologies. Candidate technologies are compared to see which technologies, or combinations of technologies best fit the requirements for selected NTR missions. Representative TVC systems are proposed and key properties such as mass and power requirements are defined. The outputs from this effort can be used to refine NTR system sizing models, providing higher fidelity definition for TVC systems for future studies.
Unconventional Pretreatment of Lignocellulose with Low-Temperature Plasma.
Vanneste, Jens; Ennaert, Thijs; Vanhulsel, Annick; Sels, Bert
2017-01-10
Lignocellulose represents a potential supply of sustainable feedstock for the production of biofuels and chemicals. There is, however, an important cost and efficiency challenge associated with the conversion of such lignocellulosics. Because their structure is complex and does not readily undergo chemical reactions, chemical and mechanical pretreatments are usually necessary to refine them into the compositional building blocks (carbohydrates and lignin) from which value-added platform molecules, such as glucose, ethylene glycol, 5-hydroxymethylfurfural, and levulinic acid, and biofuels, such as bioderived naphtha, kerosene, and diesel fractions, are produced. Conventional (wet) methods are usually polluting, aggressive, and highly energy consuming, so any alternative activation procedure for lignocellulose is highly recommended and anticipated in recent and future biomass research. Plasma activation of lignocellulose has emerged as an interesting (dry) treatment technique. In the long run, particularly in times of readily accessible renewable electricity, plasma may be considered an alternative to conventional pretreatment methods, but current knowledge is too limited and examples too few to guarantee that statement. This review therefore highlights recent knowledge, advancements, and shortcomings in the field of plasma treatment of cellulose and lignocellulose with regard to its (structural and chemical) effects and its impact on the future of pretreatment methods. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Gater, Adam; Kitchen, Helen; Heron, Louise; Pollard, Catherine; Håkan-Bloch, Jonas; Højbjerre, Lise; Hansen, Brian Bekker; Strandberg-Larsen, Martin
2015-01-01
The primary objective of this review is to develop a conceptual model for Crohn's disease (CD) outlining the disease burden for patients, healthcare systems and wider society, as reported in the scientific literature. A search was conducted using MEDLINE, PsycINFO, EconLit, Health Economic Evaluation Database and Centre for Reviews and Dissemination databases. Patient-reported outcome (PRO) measures widely used in CD were reviewed according to the US FDA PRO Guidance for Industry. The resulting conceptual model highlights the characterization of CD by gastrointestinal disturbances, extra-intestinal and systemic symptoms. These symptoms impact physical functioning, ability to complete daily activities, emotional wellbeing, social functioning, sexual functioning and ability to work. Gaps in conceptual coverage and evidence of reliability and validity for some PRO measures were noted. Review findings also highlight the substantial direct and indirect costs associated with CD. Evidence from the literature confirms the substantial burden of CD to patients and wider society; however, future research is still needed to further understand burden from the perspective of patients and to accurately understand the economic burden of disease. Challenges with existing PRO measures also suggest the need for future research to refine or develop new measures.
Overview: Application of heterogeneous nucleation in grain-refining of metals.
Greer, A L
2016-12-07
In all of metallurgical processing, probably the most prominent example of nucleation control is the "inoculation" of melts to suppress columnar solidification and to obtain fine equiaxed grain structures in the as-cast solid. In inoculation, a master alloy is added to the melt to increase its solute content and to add stable particles that can act as nucleants for solid grains. This is important for alloys of many metals, and in other cases such as ice nucleation in living systems, but inoculation of aluminum alloys using Al-5Ti-1B (wt.%) master alloy is the exemplar. The key elements are (i) that the chemical interactions between nucleant TiB2 particles and the melt ensure that the solid phase (α-Al) exists on the surface of the particles even above the liquidus temperature of the melt, (ii) that these perfect nucleants can initiate grains only when the barrier for free growth of α-Al is surmounted, and (iii) that (depending on whether the melt is spatially isothermal or not) the release of latent heat, or the limited extent of constitutional supercooling, can act to limit the number of grains that is initiated and therefore the degree of grain refinement that can be achieved. We review recent studies that contribute to better understanding, and improvement, of grain refinement in general. We also identify priorities for future research. These include the study of the effects of nanophase dispersions in melts. Preliminary studies show that such dispersions may be especially effective in achieving grain refinement, and raise many questions about the underlying mechanisms. The stimulation of icosahedral short-range ordering in the liquid has been shown to lead to grain refinement, and is a further priority for study, especially as the refinement can be achieved with only minor additions of solute.
Upgrading and Refining of Crude Oils and Petroleum Products by Ionizing Irradiation.
Zaikin, Yuriy A; Zaikina, Raissa F
2016-06-01
A general trend in the oil industry is a decrease in the proven reserves of light crude oils, so any increase in future oil exploration will be associated with high-viscosity, high-sulfur oils and bitumen. Although the world reserves of heavy oil are much greater than those of sweet light oils, their exploitation at present accounts for less than 12% of total oil recovery. One of the main constraints is the very high cost of existing technologies for heavy oil recovery, upgrading, transportation, and refining. Heavy oil processing by conventional methods is difficult and requires high power inputs and capital investments. Effective and economic processing of high-viscosity oil and oil residues needs not only improvements of existing methods, such as thermal, catalytic, and hydro-cracking, but also the development of new technological approaches for upgrading and refining any type of problem oil feedstock. One promising approach to this problem is the application of ionizing irradiation to high-viscosity oil processing. Radiation methods for upgrading and refining high-viscosity crude oils and petroleum products over a wide temperature range, oil desulfurization, radiation technology for refining used oil products, and a prospective method for gasoline radiation isomerization are discussed in this paper. The advantages of radiation technology are the simple configuration of radiation facilities, low capital and operational costs, processing at lowered temperatures and near-atmospheric pressure without the use of any catalysts, high production rates, relatively low energy consumption, and flexibility with respect to the type of oil feedstock.
Potential Impacts of Reductions in Refinery Activity on Northeast Petroleum Product Markets
2012-01-01
Potential Impacts of Reductions in Refinery Activity on Northeast Petroleum Product Markets is an update to a previous Energy Information Administration (EIA) report, Reductions in Northeast Refining Activity: Potential Implications for Petroleum Product Markets, released in December 2011. This update analyzes possible market responses and impacts in the event Sunoco's Philadelphia refinery closes this summer, in addition to the recently idled refineries on the East Coast and in the U.S. Virgin Islands.
High-throughput Crystallography for Structural Genomics
Joachimiak, Andrzej
2009-01-01
Protein X-ray crystallography recently celebrated its 50th anniversary. The structures of myoglobin and hemoglobin determined by Kendrew and Perutz provided the first glimpses into the complex protein architecture and chemistry. Since then, the field of structural molecular biology has experienced extraordinary progress, and now over 53,000 protein structures have been deposited into the Protein Data Bank. In the past decade many advances in macromolecular crystallography have been driven by world-wide structural genomics efforts. This was made possible because of third-generation synchrotron sources, structure phasing approaches using anomalous signal and cryo-crystallography. Complementary progress in molecular biology, proteomics, hardware and software for crystallographic data collection, structure determination and refinement, computer science, databases, robotics and automation improved and accelerated many processes. These advancements provide the robust foundation for structural molecular biology and assure a strong contribution to science in the future. In this report we focus mainly on reviewing structural genomics high-throughput X-ray crystallography technologies and their impact. PMID:19765976
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sisman, S. Lara
2015-07-20
Hexavalent chromium, Cr(VI), is present in the environment as a byproduct of industrial processes. Due to its mobility and toxicity, it is crucial to attenuate or remove Cr(VI) from the environment. The objective of this investigation was to quantify potential natural attenuation, or reduction capacity, of reactive minerals and aquifer sediments. Samples of reduced-iron-containing minerals such as ilmenite, as well as Puye Formation sediments representing a contaminated aquifer in New Mexico, were reacted with chromate. The change in Cr(VI) during the reaction was used to calculate reduction capacity. This study found that minerals that contain reduced iron, such as ilmenite, have high reducing capacities. The data indicated that sample history may impact reduction capacity tests due to surface passivation. Further, this investigation identified areas for future research including: a) refining the relationships between iron content, magnetic susceptibility and reduction capacity, and b) long term kinetic testing using fresh aquifer sediments.
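As an illustration of the kind of mass balance behind a batch reduction-capacity test, the sketch below expresses capacity as micromoles of Cr(VI) reduced per gram of solid. The study's exact normalization and the example concentrations are not given in the abstract, so the values and function name here are assumptions for illustration only.

```python
# Illustrative batch mass balance for reduction capacity (not the study's exact method).
def reduction_capacity_umol_per_g(c0_mg_L, c_final_mg_L, volume_L, solid_mass_g):
    MW_CR = 51.996                                      # g/mol, molar mass of chromium
    reduced_mg = (c0_mg_L - c_final_mg_L) * volume_L    # mass of Cr(VI) removed from solution
    reduced_umol = reduced_mg / MW_CR * 1000.0          # convert mg to micromoles
    return reduced_umol / solid_mass_g                  # normalize by mass of reactive solid

# Hypothetical example: 50 mL of 20 mg/L chromate reacted with 1 g of ilmenite,
# with 12 mg/L Cr(VI) remaining at the end of the test.
print(reduction_capacity_umol_per_g(20.0, 12.0, 0.05, 1.0))  # ~7.7 umol/g
```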
Liquid belt radiator design study
NASA Technical Reports Server (NTRS)
Teagan, W. P.; Fitzgerald, K. F.
1986-01-01
The Liquid Belt Radiator (LBR) is an advanced concept developed to meet the needs of anticipated future space missions. A previous study documented the advantages of this concept as a lightweight, easily deployable alternative to present day space heat rejection systems. The technical efforts associated with this study concentrate on refining the concept of the LBR as well as examining the issues of belt dynamics and potential application of the LBR to intermediate and high temperature heat rejection applications. A low temperature point design developed in previous work is updated assuming the use of diffusion pump oil, Santovac-6, as the heat transfer medium. Additional analytical and design effort is directed toward determining the impact of interface heat exchanger, fluid bath sealing, and belt drive mechanism designs on system performance and mass. The updated design supports the earlier result by indicating a significant reduction in specific system mass as compared to heat pipe or pumped fluid radiator concepts currently under consideration (1.3 kg/sq m versus 5 kg/sq m).
Glutamate and dopamine in schizophrenia: an update for the 21st century
Howes, Oliver; McCutcheon, Rob; Stone, James
2016-01-01
The glutamate and dopamine hypotheses are leading theories of the pathoaetiology of schizophrenia. Both were initially based on indirect evidence from pharmacological studies supported by post-mortem findings, but have since been substantially advanced by new lines of evidence from in vivo imaging studies. This review provides an update on the latest findings on dopamine and glutamate abnormalities in schizophrenia, focusing on the in vivo neuroimaging studies in patients and clinical high risk groups, and considers their implications for understanding the biology and treatment of schizophrenia. These findings have refined both the dopamine and glutamate hypotheses, enabling greater anatomical and functional specificity, and have been complemented by preclinical evidence showing how the risk factors for schizophrenia impact on the dopamine and glutamate systems. The implications of this new evidence for understanding the development and treatment of schizophrenia are considered, and the gaps in current knowledge highlighted. Finally the evidence for an integrated model of the interactions between the glutamate and dopamine systems is reviewed, and future directions discussed. PMID:25586400
NASA Astrophysics Data System (ADS)
Byars-Winston, Angela M.; Branchaw, Janet; Pfund, Christine; Leverett, Patrice; Newton, Joseph
2015-10-01
Few studies have empirically investigated the specific factors in mentoring relationships between undergraduate researchers (mentees) and their mentors in the biological and life sciences that account for mentees' positive academic and career outcomes. Using archival evaluation data from more than 400 mentees gathered over a multi-year period (2005-2011) from several undergraduate biology research programs at a large, Midwestern research university, we validated existing evaluation measures of the mentored research experience and the mentor-mentee relationship. We used a subset of data from mentees (77% underrepresented racial/ethnic minorities) to test a hypothesized social cognitive career theory model of associations between mentees' academic outcomes and perceptions of their research mentoring relationships. Results from path analysis indicate that perceived mentor effectiveness indirectly predicted post-baccalaureate outcomes via research self-efficacy beliefs. Findings are discussed with implications for developing new and refining existing tools to measure this impact, programmatic interventions to increase the success of culturally diverse research mentees and future directions for research.
Middleton, Beth A.
2014-01-01
A cornerstone of ecosystem ecology, decomposition was recognized as a fundamental process driving the exchange of energy in ecosystems by early ecologists such as Lindeman (1942) and Odum (1960). In the history of ecology, studies of decomposition were incorporated into the International Biological Program in the 1960s to compare the nature of organic matter breakdown in various ecosystem types. Such studies still have an important role in ecological studies today. More recent refinements have brought debates on the relative roles of microbes, invertebrates, and environment in the breakdown and release of carbon into the atmosphere, as well as on how nutrient cycling, production, and other ecosystem processes regulated by decomposition may shift with climate change. Therefore, this bibliography examines the primary literature related to organic matter breakdown, but it also explores topics in which decomposition plays a key supporting role, including vegetation composition, latitudinal gradients, altered ecosystems, anthropogenic impacts, carbon storage, and climate change models. Knowledge of these topics is relevant both to the study of ecosystem ecology and to projections of future conditions for human societies.
Promoting a skills-based agenda in Olympic sports: the role of skill-acquisition specialists.
Williams, A Mark; Ford, Paul R
2009-11-01
We highlight the importance of promoting a skills-based agenda in the development and preparation of Olympic athletes. The role that specialists with a background in skill acquisition can play is illustrated and the need to move towards a culture where evidence-based practice permeates all aspects of this process reiterated. We provide examples from contemporary research to illustrate how skill-acquisition theory and practice can help inform and guide practitioners, coaches, and administrators in their quest to develop Olympic athletes. Although the acquisition and refinement of skills are essential to performance in most Olympic sports, paradoxically the area of skill acquisition has not impacted in a concerted and meaningful way on this agenda. Skill-acquisition specialists need to be more proactive in forging links with elite sport, whereas practitioners, coaches, and administrators need to appreciate the important role that sports scientists with a background in this area can play in helping to develop future generations of podium athletes.
NASA Astrophysics Data System (ADS)
Wada, Y.
2017-12-01
Increased occurrence of extreme climate events is one of the most damaging consequences of global climate change today and in the future. Estimating the impacts of such extreme events on global and regional water resources is therefore crucial for quantifying increasing risks from climate change. The quest for water security has been a struggle throughout human history. Only in recent years has the scale of this quest moved beyond the local, to the national and regional scales and to the planet itself. Absent or unreliable water supply, sanitation and irrigation services, unmitigated floods and droughts, and degraded water environments severely impact half of the planet's population. The scale and complexity of the water challenges faced by society, particularly but not only in the world's poorest regions, are now recognized, as is the imperative of overcoming these challenges for a stable and equitable world. IIASA's Water Futures and Solutions Initiative (WFaS) is an unprecedented inter-disciplinary scientific initiative to identify robust and adaptive portfolios of optional solutions across different economic sectors, including agriculture, energy and industry, and to test these solution-portfolios with multi-model ensembles of hydrologic and sector models to obtain a clearer picture of the trade-offs, risks, and opportunities. The results of WFaS scenarios and models provide a basis for long-term strategic planning of water resource development under changing environments and increasing climate extremes. And given the complexity of the water system, WFaS uniquely provides policy makers with optional sets of solutions that work together and that can be easily adapted as circumstances change in the future. As WFaS progresses, it will establish a network involving information exchange, mutual learning and horizontal cooperation across teams of researchers, public and private decision makers and practitioners exploring solutions at regional, national and local scales. The initiative includes a major stakeholder consultation component, to inform and guide the science and to test and refine policy and business outcomes.
The impact of future sea-level rise on the global tides
NASA Astrophysics Data System (ADS)
Pickering, M. D.; Horsburgh, K. J.; Blundell, J. R.; Hirschi, J. J.-M.; Nicholls, R. J.; Verlaan, M.; Wells, N. C.
2017-06-01
Tides are a key component in coastal extreme water levels. Possible changes in the tides caused by mean sea-level rise (SLR) are therefore of importance in the analysis of coastal flooding, as well as many other applications. We investigate the effect of future SLR on the tides globally using a fully global forward tidal model: OTISmpi. Statistical comparisons of the modelled and observed tidal solutions demonstrate the skill of the refined model setup with no reliance on data assimilation. We simulate the response of the four primary tidal constituents to various SLR scenarios. Particular attention is paid to future changes at the largest 136 coastal cities, where changes in water level would have the greatest impact. Spatially uniform SLR scenarios ranging from 0.5 to 10 m with fixed coastlines show that the tidal amplitudes in shelf seas globally respond strongly to SLR with spatially coherent areas of increase and decrease. Changes in the M2 and S2 constituents occur globally in most shelf seas, whereas changes in K1 and O1 are confined to Asian shelves. With higher SLR tidal changes are often not proportional to the SLR imposed and larger portions of mean high water (MHW) changes are above proportional. Changes in MHW exceed ±10% of the SLR at 10% of coastal cities. SLR scenarios allowing for coastal recession tend increasingly to result in a reduction in tidal range. The fact that the fixed and recession shoreline scenarios result mainly in changes of opposing sign is explained by the effect of the perturbations on the natural period of oscillation of the basin. Our results suggest that coastal management strategies could influence the sign of the tidal amplitude change. The effect of a spatially varying SLR, in this case fingerprints of the initial elastic response to ice mass loss, modestly alters the tidal response with the largest differences at high latitudes.
Research and Development Trend of Shape Control for Cold Rolling Strip
NASA Astrophysics Data System (ADS)
Wang, Dong-Cheng; Liu, Hong-Min; Liu, Jun
2017-09-01
Shape is an important quality index of cold rolling strip. Up to now, many problems in the shape control domain have not been solved satisfactorily, and a review of research progress in this domain can help identify new breakthrough directions. In the past 10 years, research on and applications of shape control models, shape control means, shape detection technology, and shape control systems have achieved significant progress. Regarding shape control models, past research has improved the accuracy, speed, and robustness of the models; the intelligentization of shape control models should be strengthened in the future. Regarding shape control means, past research has focused on roll optimization, mill type selection, process optimization, local strip shape control, edge drop control, and so on; in the future, more attention should be paid to the coordinated control of strip shape and other quality indexes, and the refinement of control objectives should be strengthened. Regarding shape detection technology and shape control systems, some new types of shape detection meters and shape control systems have been developed and successfully applied in industry; in the future, the standardization of shape detection technology and shape control systems should be promoted to solve the problem of compatibility. In general, the four expected development trends of shape control for cold rolling strip are intelligentization, coordination, refinement, and standardization. The proposed research provides new breakthrough directions for improving shape quality.
Liu, Dunyi; Liu, Yumin; Zhang, Wei; Chen, Xinping; Zou, Chunqin
2017-01-01
Zinc (Zn) deficiency is a common disorder of humans in developing countries. The effect of Zn biofortification (via application of six rates of Zn fertilizer to soil) on Zn bioavailability in wheat grain and flour and its impacts on human health was evaluated. Zn bioavailability was estimated with a trivariate model that included Zn homeostasis in the human intestine. As the rate of Zn fertilization increased, the Zn concentration increased in all flour fractions, but the percentages of Zn in standard flour (25%) and bran (75%) relative to total grain Zn were constant. Phytic acid (PA) concentrations in grain and flours were unaffected by Zn biofortification. Zn bioavailability and the health impact, as indicated by disability-adjusted life years (DALYs) saved, increased with the Zn application rate and were greater in standard and refined flour than in whole grain and coarse flour. The biofortified standard and refined flour obtained with application of 50 kg/ha ZnSO4·7H2O met the health requirement (3 mg of Zn obtained from 300 g of wheat flour) and reduced DALYs by >20%. Although Zn biofortification increased Zn bioavailability in standard and refined flour, it did not reduce the bioavailability of iron, manganese, or copper in wheat flour. PMID:28481273
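As a back-of-envelope illustration of the requirement quoted above (3 mg of Zn from a 300 g daily portion of wheat flour), the sketch below checks whether a given flour Zn concentration meets that target. The trivariate absorption model and the DALY calculation used in the study are not reproduced here, and the example concentrations are hypothetical.

```python
# Hypothetical check of total Zn delivered by a daily flour portion (not the study's model).
def zn_from_flour_mg(zn_conc_mg_per_kg, daily_flour_g=300.0):
    """Total Zn (mg) in a daily flour portion, given the flour Zn concentration."""
    return zn_conc_mg_per_kg * daily_flour_g / 1000.0

for conc in (6.0, 10.0, 14.0):  # illustrative flour Zn concentrations, mg/kg
    intake = zn_from_flour_mg(conc)
    status = "meets" if intake >= 3.0 else "falls short of"
    print(f"{conc:4.1f} mg/kg flour -> {intake:.2f} mg/day, {status} the 3 mg target")
```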
O'Brien-Pallas, Linda; Griffin, Pat; Shamian, Judith; Buchan, James; Duffield, Christine; Hughes, Frances; Spence Laschinger, Heather K; North, Nicola; Stone, Patricia W
2006-08-01
Research about the economic impact of nurse turnover has been compromised by a lack of consistent definitions and measurement. This article describes a study that was designed to refine a methodology to examine the costs associated with nurse turnover. Nursing unit managers responded to a survey that contained items relating to budgeted full-time equivalents, new hires, and turnover, as well as direct and indirect costs. The highest mean direct cost was incurred through temporary replacements, whereas the highest indirect cost was decreased initial productivity of the new hire. The study allowed the identification of the availability of data and where further refinement of data definition of variables is needed. The results provided significant evidence to justify increased emphasis on nurse retention strategies and the creation of healthy work environments for nurses.
The state of animal welfare in the context of refinement.
Zurlo, Joanne; Hutchinson, Eric
2014-01-01
The ultimate goal of the Three Rs is the full replacement of animals used in biomedical research and testing. However, replacement is unlikely to occur in the near future; therefore the scientific community as a whole must continue to devote considerable effort to ensure optimal animal welfare for the benefit of the science and the animals, i.e., the R of refinement. Laws governing the care and use of laboratory animals have recently been revised in Europe and the US and these place greater emphasis on promoting the well-being of the animals in addition to minimizing pain and distress. Social housing for social species is now the default condition, which can present a challenge in certain experimental settings and for certain species. The practice of positive reinforcement training of laboratory animals, particularly non-human primates, is gathering momentum but is not yet universally employed. Enhanced consideration of refinement extends to rodents, particularly mice, whose use is still increasing as more genetically modified models are generated. The wastage of extraneous mice and the method of their euthanasia are refinement issues that still need to be addressed. An international, concerted effort into defining the needs of laboratory animals is still necessary to improve the quality of the animal models used as well as their welfare.
NASA Astrophysics Data System (ADS)
Zakwan; Raja, PM; Giyanto
2018-02-01
Indonesia is one of the crude palm oil (CPO) producing countries in the world. As many products are derived from CPO, its quality must be improved continuously. One of the factors that influence the quality of palm oil is the Fe and Cu content. The objective of this research was to reduce the Fe and Cu content in Refined Bleached Palm Oil (RBPO). During processing, CPO or RBPO may be contaminated by Fe and Cu from metal tanks and pipes in the factory. The zeolite and bentonite were activated by a maceration method using hydrochloric acid (0.1 N). Four batch reactions consisting of refined palm oil (RPO) and activated natural zeolite-bentonite (ANZB) were bleached by heating and stirring at about 105°C and 1200 rpm for 30 minutes. The results showed that all combinations of ANZB can reduce the Fe content. The optimal combinations of ANZB were obtained with K1, K2, and K4, giving a Cu content of 0.02 ppm. In the future, further study is needed on reducing the Fe and Cu content in palm oil with other adsorbents.
Hashemi, Sepehr; Armand, Mehran; Gordon, Chad R
2016-10-01
To describe the development and refinement of the computer-assisted planning and execution (CAPE) system for use in face-jaw-teeth transplants (FJTTs). Although successful, some maxillofacial transplants result in suboptimal hybrid occlusion and may require subsequent surgical orthognathic revisions. Unfortunately, the use of traditional dental casts and splints pose several compromising shortcomings in the context of FJTT and hybrid occlusion. Computer-assisted surgery may overcome these challenges. Therefore, the use of computer-assisted orthognathic techniques and functional planning may prevent the need for such revisions and improve facial-skeletal outcomes. A comprehensive CAPE system for use in FJTT was developed through a multicenter collaboration and refined using plastic models, live miniature swine surgery, and human cadaver models. The system marries preoperative surgical planning and intraoperative execution by allowing on-table navigation of the donor fragment relative to recipient cranium, and real-time reporting of patient's cephalometric measurements relative to a desired dental-skeletal outcome. FJTTs using live-animal and cadaveric models demonstrate the CAPE system to be accurate in navigation and beneficial in improving hybrid occlusion and other craniofacial outcomes. Future refinement of the CAPE system includes integration of more commonly performed orthognathic/maxillofacial procedures.
Effects of NaBF4 + NaF on the Tensile and Impact Properties of Al-Si-Mg-Fe Alloys
NASA Astrophysics Data System (ADS)
Chen, Zongning; Wang, Tongmin; Zhao, Yufei; Zheng, Yuanping; Kang, Huijun
2015-05-01
NaBF4 + NaF were found to play three roles, i.e., Fe-eliminator, grain refiner, and eutectic modifier, in treating A356 alloy with a high Fe content. The joint effects led to significant improvement in both tensile and impact properties of the thus-treated alloy. The multiple reactions between the NaBF4 + NaF and the Al-Si-Mg-Fe system are suggested to form Fe2B, AlB2, and Na in the melt, as per thermodynamic analysis. The three are responsible for Fe removal, grain refinement, and eutectic modification, respectively. When NaBF4 and NaF are mixed in a weight ratio of 1:1, an optimum addition rate is in the range between 1.0 and 2.0 wt pct for treating AlSi7Mg0.3Fe0.65 alloy, based on the results of tensile and impact tests. Excessive addition of the salt may deteriorate the mechanical properties of the alloy, basically owing to overmodification of Si and contamination by salt inclusions.
NASA Astrophysics Data System (ADS)
Karthikeyan, T.; Thomas Paul, V.; Saroja, S.; Moitra, A.; Sasikala, G.; Vijayalakshmi, M.
2011-12-01
This paper presents the results of an experimental investigation where an enhancement in Charpy impact toughness and decrease in DBTT was obtained through grain refinement in 9Cr-1Mo steel. The steel in the normalized and tempered condition (1323 K/air cool + 1023 K/2 h/air cool) had an average prior-austenite grain size of 26 μm. By designing a two-stage normalizing (1323 K/2 h/water quench + 1223 K/2 h/air cool) and tempering treatment (1023 K/2 h/air cool), a homogeneous tempered martensite microstructure with a lesser prior-austenite grain size of 12 μm could be obtained. An improvement trend in impact properties of standard sized Charpy specimens was obtained in fine-grained steel: upper shelf energy increased from 175 J to 210 J, and DBTT reduced from 243 K to 228 K. This heat treatment is unique since an attempt to carry out a single-stage low temperature normalizing treatment (1223 K/2 h/air cool) did not give a complete martensite structure, due to the incomplete dissolution of carbides during austenitization.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-01
...-Loss Provision 3. Future Refinements VIII. Secretary's Recommendations IX. Waiver of Notice and Comment... corresponding meanings in alphabetical order below: BBRA Medicare, Medicaid and SCHIP [State Children's Health... 1886(s)(2)(A)(ii) of the Social Security Act (the Act) and a 0.5 percentage point reduction for economy...
Tom Leuschen; Dale Wade; Paula Seamon
2001-01-01
The success of a fire use program is in large part dependent on a solid foundation set in clear and concise planning. The planning process results in specific goals and measurable objectives for fire application, provides a means of setting priorities, and establishes a mechanism for evaluating and refining the process to meet the desired future condition. It is an...
Solar Current Output as a Function of Sun Elevation: Students as Toolmakers
ERIC Educational Resources Information Center
Igoe, D. P.; Parisi, A. V.
2015-01-01
Solar current is an increasingly important aspect of modern life and will be even more crucial in the students' future. Encouraging students to be the "toolmakers" allows students to take ownership of scientific investigations, as well as forcing them to refine their research questions and hypotheses, including the design and…
The End of Flat Earth Economics & the Transition to Renewable Resource Societies.
ERIC Educational Resources Information Center
Henderson, Hazel
1978-01-01
A post-industrial revolution is predicted for the future with an accompanying shift of focus from simple, brute force technologies, based on cheap, accessible resources and energy, to a second generation of more subtle, refined technologies grounded in a much deeper understanding of biological and ecological realities. (Author/BB)
Points of Interest: What Determines Interest Rates?
ERIC Educational Resources Information Center
Schilling, Tim
Interest rates can significantly influence people's behavior. When rates decline, homeowners rush to buy new homes and refinance old mortgages; automobile buyers scramble to buy new cars; the stock market soars, and people tend to feel more optimistic about the future. But even though individuals respond to changes in rates, they may not fully…
Critical Research and the Future of Literacy Education
ERIC Educational Resources Information Center
Morrell, Ernest
2009-01-01
This commentary argues for a specific conception of research, what the author and others call "critical research." The author asserts that critical research can help educators to identify quality teaching in literacy classrooms even as it helps to refine (or even redefine) their notions of curricula, pedagogy, literacy, and achievement. Here, the…
Developing a Rubric to Support the Evaluation of Professional Development School Partnerships
ERIC Educational Resources Information Center
Polly, Drew; Smaldino, Sharon; Brynteson, Kristin
2015-01-01
This article describes the synthesis of the NCATE PDS Standards, the NAPDS Nine Essentials and the CAEP Standards to create a rubric that can be used to help PDS stakeholders develop, refine, and evaluate their partnerships. Implications and future directions on how to use the rubric are also shared.
An Agency Theory Perspective on Student Performance Evaluation
ERIC Educational Resources Information Center
Smith, Michael E.; Zsidisin, George A.; Adams, Laural L.
2005-01-01
The emphasis in recent research on the responsibility of college and university business instructors to prepare students for future employment underscores a need to refine the evaluation of student performance. In this article, an agency theory framework is used to understand the trade-offs that may be involved in the selection of various…
Recent advances and future technologies for baled silages
USDA-ARS?s Scientific Manuscript database
Although the concept of ensiling large-round or large-square bales dates back to the late 1970s, there have been many refinements to both equipment and management since that time, resulting in much greater acceptance by small or mid-sized dairy or beef producers. There are several reasons this sila...
Application of historical mobility testing to sensor-based robotic performance
NASA Astrophysics Data System (ADS)
Willoughby, William E.; Jones, Randolph A.; Mason, George L.; Shoop, Sally A.; Lever, James H.
2006-05-01
The U.S. Army Engineer Research and Development Center (ERDC) has conducted on-/off-road experimental field testing with full-sized and scale-model military vehicles for more than fifty years. Some 4000 acres of local terrain are available for tailored field evaluations or verification/validation of future robotic designs in a variety of climatic regimes. Field testing and data collection procedures, as well as techniques for quantifying terrain in engineering terms, have been developed and refined into algorithms and models for predicting vehicle-terrain interactions and the resulting forces or speeds of military-sized vehicles. Based on recent experiments with Matilda, Talon, and Pacbot, these predictive capabilities appear to be relevant to most robotic systems currently in development. Utilization of current testing capabilities with sensor-based vehicle drivers, or use of the procedures for terrain quantification from sensor data, would immediately apply some fifty years of historical knowledge to the development, refinement, and implementation of future robotic systems. Additionally, translation of sensor-collected terrain data into engineering terms would allow assessment of robotic performance prior to deployment of the actual system and ensure maximum system performance in the theater of operation.
An investigation of the relationship between innovation and cultural diversity.
Kandler, Anne; Laland, Kevin N
2009-08-01
In this paper we apply reaction-diffusion models to explore the relationship between the rate of behavioural innovation and the level of cultural diversity. We investigate how both independent invention and the modification and refinement of established innovations impact on cultural dynamics and diversity. Further, we analyse these relationships in the presence of biases in cultural learning and find that the introduction of new variants typically increases cultural diversity substantially in the short term, but may decrease long-term diversity. Independent invention generally supports higher levels of cultural diversity than refinement. Repeated patterns of innovation through refinement generate characteristic oscillating trends in diversity, with increasing trends towards greater average diversity observed for medium but not low innovation rates. Conformity weakens the relationship between innovation and diversity. The level of cultural diversity, and pattern of temporal dynamics, potentially provide clues as to the underlying process, which can be used to interpret empirical data.
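To make the innovation-diversity relationship concrete, the toy simulation below tracks a finite population that copies cultural variants (optionally with a conformist bias) and innovates at rate mu, summarizing diversity as 1 minus Simpson's index. This is not the authors' reaction-diffusion model; population size, rates, and step counts are illustrative assumptions.

```python
# Toy agent-based sketch of innovation, (conformist) copying, and cultural diversity.
import random
from collections import Counter

def simulate_diversity(n=200, mu=0.01, conformity=0.0, steps=500, seed=1):
    rng = random.Random(seed)
    pop = [0] * n                 # everyone starts with the same variant
    next_variant = 1
    for _ in range(steps):
        counts = Counter(pop)
        variants = list(counts)
        # conformist bias: copying weight proportional to frequency^(1 + conformity)
        weights = [counts[v] ** (1.0 + conformity) for v in variants]
        for i in range(n):
            if rng.random() < mu:      # independent invention of a new variant
                pop[i] = next_variant
                next_variant += 1
            else:                       # (biased) copying from the population
                pop[i] = rng.choices(variants, weights=weights)[0]
    freqs = [c / n for c in Counter(pop).values()]
    return 1.0 - sum(f * f for f in freqs)   # 1 - Simpson's index

print(simulate_diversity(conformity=0.0))  # unbiased copying
print(simulate_diversity(conformity=1.0))  # conformist copying, typically lower diversity
```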
The Impact of Cooperative Quizzes in a Large Introductory Astronomy Course for Non-Science Majors
ERIC Educational Resources Information Center
Zeilik, Michael; Morris, Vicky J.
2004-01-01
In Astronomy 101 at the University of New Mexico, we carried out a repeated-items experiment on quizzes and tests to investigate the impact of cooperative testing. This trial was the only change in a reformed course format that had been refined over previous semesters. Our research questions were: (1) Did cooperative quizzes result in gains for…
NASA Technical Reports Server (NTRS)
Kogut, J.
1981-01-01
The NIMBUS 7 Scanning Multichannel Microwave Radiometer (SMMR) data are analyzed. The impact of cross polarization and Faraday rotation on SMMR derived brightness temperatures is evaluated. The algorithms used to retrieve the geophysical parameters are tested, refined, and compared with values derived by other techniques. The technical approach taken is described and the results presented.
NASA Astrophysics Data System (ADS)
Im, Eun-Soon; Coppola, Erika; Giorgi, Felippo
2010-05-01
Given the discernible evidence of climate change due to human activity, there is a growing demand for reliable climate change scenarios in response to future emission forcing. One of the most significant impacts of climate change is on the hydrological cycle. Changes in seasonality and increases in low and high rainfall extremes can severely influence the water balance of a river basin, with serious consequences for societies and ecosystems. In fact, recent studies have reported that East Asia, including the Korean peninsula, is regarded as a highly vulnerable region under global warming, in particular for water resources. To accurately assess the impact of climate change over Korea, we performed a downscaling of the ECHAM5/MPI-OM global projection under the A1B emission scenario for the period 1971-2100 using the RegCM3 one-way double-nested system. Physically based, long-term (130 years), fine-scale (20 km) climate information is appropriate for analyzing the detailed structure of the hydroclimatological response to climate change. Changes in temperature and precipitation translate into hydrological conditions directly or indirectly. The change in precipitation shows distinct seasonal variations and a complicated spatial pattern. While changes in total precipitation do not show any relevant trend, the change patterns in daily precipitation clearly show an enhancement of high-intensity precipitation and a reduction of weak-intensity precipitation. The increase in temperature enhances evapotranspiration, and hence the actual water stress becomes more pronounced in the future climate. Precipitation, snow, and runoff changes show relevant topographical modulation under global warming. This study clearly demonstrates the importance of a refined topography for improving the accuracy of the local climatology. Improved accuracy of regional climate projections could lead to enhanced reliability in interpreting the warming effect, especially when linking climate change information with impact assessment studies.
Roch, Geneviève; Borgès Da Silva, Roxane; de Montigny, Francine; Witteman, Holly O; Pierce, Tamarha; Semenic, Sonia; Poissant, Julie; Parent, André-Anne; White, Deena; Chaillet, Nils; Dubois, Carl-Ardy; Ouimet, Mathieu; Lapointe, Geneviève; Turcotte, Stéphane; Prud'homme, Alexandre; Painchaud Guérard, Geneviève; Gagnon, Marie-Pierre
2018-05-29
Prenatal education is a core component of perinatal care and services provided by health institutions. Whereas group prenatal education is the most common educational model, some health institutions have opted to implement online prenatal education to address accessibility issues as well as the evolving needs of future parents. Various studies have shown that prenatal education can be effective in acquisition of knowledge on labour and delivery, reducing psychological distress and maximising father's involvement. However, these results may depend on educational material, organization, format and content. Furthermore, the effectiveness of online prenatal education compared to group prenatal education remains unclear in the literature. This project aims to evaluate the impacts of group prenatal education and online prenatal education on health determinants and users' health status, as well as on networks of perinatal educational services maintained with community-based partners. This multipronged mixed methods study uses a collaborative research approach to integrate and mobilize knowledge throughout the process. It consists of: 1) a prospective cohort study with quantitative data collection and qualitative interviews with future and new parents; and 2) a multiple case study integrating documentary sources and interviews with stakeholders involved in the implementation of perinatal information service networks and collaborations with community partners. Perinatal health indicators and determinants will be compared between prenatal education groups (group prenatal education and online prenatal education) and standard care without these prenatal education services (control group). This study will provide knowledge about the impact of online prenatal education as a new technological service delivery model compared to traditional group prenatal education. Indicators related to the complementarity of these interventions and those available in community settings will refine our understanding of regional perinatal services networks. Results will assist decision-making regarding service organization and delivery models of prenatal education services. Version 1 (February 9 2018).
Survey of Ambient Air Pollution Health Risk Assessment Tools.
Anenberg, Susan C; Belova, Anna; Brandt, Jørgen; Fann, Neal; Greco, Sue; Guttikunda, Sarath; Heroux, Marie-Eve; Hurley, Fintan; Krzyzanowski, Michal; Medina, Sylvia; Miller, Brian; Pandey, Kiran; Roos, Joachim; Van Dingenen, Rita
2016-09-01
Designing air quality policies that improve public health can benefit from information about air pollution health risks and impacts, which include respiratory and cardiovascular diseases and premature death. Several computer-based tools help automate air pollution health impact assessments and are being used in a variety of contexts. Expanding information gathered for a May 2014 World Health Organization expert meeting, we survey 12 multinational air pollution health impact assessment tools, categorize them according to key technical and operational characteristics, and identify limitations and challenges. Key characteristics include spatial resolution, pollutants and health effect outcomes evaluated, and method for characterizing population exposure, as well as tool format, accessibility, complexity, and degree of peer review and application in policy contexts. While many of the tools use common data sources for concentration-response associations, population, and baseline mortality rates, they vary in the exposure information source, format, and degree of technical complexity. We find that there is an important tradeoff between technical refinement and accessibility for a broad range of applications. Analysts should apply tools that provide the appropriate geographic scope, resolution, and maximum degree of technical rigor for the intended assessment, within resource constraints. A systematic intercomparison of the tools' inputs, assumptions, calculations, and results would be helpful to determine the appropriateness of each for different types of assessment. Future work would benefit from accounting for multiple uncertainty sources and integrating ambient air pollution health impact assessment tools with those addressing other related health risks (e.g., smoking, indoor pollution, climate change, vehicle accidents, physical activity). © 2016 Society for Risk Analysis.
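The tools surveyed above generally share the same underlying health impact function; the sketch below illustrates the common log-linear form, with the concentration change, concentration-response coefficient, baseline mortality rate, and population all chosen purely for illustration rather than taken from any of the 12 tools.

```python
import math

def excess_deaths(delta_conc, beta, baseline_rate, population):
    """Log-linear health impact function commonly used in air pollution
    risk assessment: attributable fraction = 1 - exp(-beta * delta_C)."""
    attributable_fraction = 1.0 - math.exp(-beta * delta_conc)
    return attributable_fraction * baseline_rate * population

# Illustrative (hypothetical) inputs: a 5 ug/m3 PM2.5 reduction, a
# concentration-response coefficient, a baseline mortality rate per
# person-year, and an exposed population.
print(round(excess_deaths(delta_conc=5.0, beta=0.0058,
                          baseline_rate=0.008, population=1_000_000)))
```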
Simoneau, Teresa L; Kilbourn, Kristin; Spradley, Janet; Laudenslager, Mark L
2017-08-01
Caregivers of cancer patients face challenges impacting their physical, psychological and social well-being that need attention in the form of well-designed and tested interventions. We created an eight-session individual stress management intervention for caregivers of allogeneic hematopoietic stem cell transplant (Allo-HSCT) recipients. This intervention, tested by randomized control trial, proved effective in decreasing distress. Herein, we describe the intervention including theoretical framework, development, and elements of fidelity. Implementation challenges along with recommendations for refinement in future studies are discussed with the goal of replication and dissemination. Seventy-four of 148 caregivers received stress management training following randomization. The intervention occurred during the 100-day post-transplant period when caregivers are required. The training provided integrated cognitive behavioral strategies, psychoeducation, and problem-solving skills building as well as use of a biofeedback device. Seventy percent of caregivers completed all eight sessions indicating good acceptability for the in-person intervention; however, most caregivers did not reliably use the biofeedback device. The most common reason for drop-out was their patient becoming gravely ill or patient death. Few caregivers dropped out because of study demands. The need for flexibility in providing intervention sessions was key to retention. Our evidence-based stress management intervention for Allo-HSCT caregivers was feasible. Variability in acceptability and challenges in implementation are discussed and suggestions for refinement of the intervention are outlined. Dissemination efforts could improve by using alternative methods for providing caregiver support such as telephone or video chat to accommodate caregivers who are unable to attend in-person sessions.
Case Study of Urban Residential Remediation and Restoration in Port Hope, Canada - 13250
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geddes, Brian; DeJong, John; Owen, Michael
2013-07-01
The Canadian Municipality of Port Hope, Ontario, is located some 100 km east of Toronto and has been the location of radium and/or uranium refining since the 1930s. Historically, these activities involved materials containing radium-226, uranium, arsenic and other contaminants generated by the refining process. In years past, properties and sites in Port Hope became contaminated from spillage during transportation, unrecorded, un-monitored or unauthorized diversion of contaminated fill and materials, wind and water erosion and spread from residue storage areas. Residential properties in Port Hope impacted by radioactive materials are being addressed by the Canadian federal government under programs administered by the Low-Level Radioactive Waste Management Office (LLRWMO) and the Port Hope Area Initiative Management Office (PHAIMO). Issues that currently arise at these properties are addressed by the LLRWMO's Interim Waste Management Program (IWM). In the future, these sites will be included in the PHAIMO's Small Scale Sites (SSS) remedial program. The LLRWMO has recently completed a remediation and restoration program at a residential property in Port Hope that has provided learnings that will be applicable to the PHAIMO's upcoming SSS remedial effort. The work scope at this property involved remediating contaminated refinery materials that had been re-used in the original construction of the residence. Following removal of the contaminated materials, the property was restored for continued residential use. This kind of property represents a relatively small, but potentially challenging subset of the portfolio of sites that will eventually be addressed by the SSS program. (authors)
ERIC Educational Resources Information Center
Sinfelt, John H.
1985-01-01
Chemical reaction rates can be controlled by varying the composition of minuscule clusters of metal atoms. These bimetallic catalysts have had a major impact on petroleum refining, where work has involved heterogeneous catalysis (reacting molecules in a phase separate from the catalyst). Experimentation involving hydrocarbon reactions, catalytic…
Brusatte, Stephen L; Candeiro, Carlos R A; Simbras, Felipe M
2017-01-01
The non-avian dinosaurs died out at the end of the Cretaceous, ~66 million years ago, after an asteroid impact. The prevailing hypothesis is that the effects of the impact suddenly killed the dinosaurs, but the poor fossil record of latest Cretaceous (Campanian-Maastrichtian) dinosaurs from outside Laurasia (and even more particularly, North America) makes it difficult to test specific extinction scenarios. Over the past few decades, a wealth of new discoveries from the Bauru Group of Brazil has revealed a unique window into the evolution of terminal Cretaceous dinosaurs from the southern continents. We review this record and demonstrate that there was a diversity of dinosaurs, of varying body sizes, diets, and ecological roles, that survived to the very end of the Cretaceous (Maastrichtian: 72-66 million years ago) in Brazil, including a core fauna of titanosaurian sauropods and abelisaurid and carcharodontosaurid theropods, along with a variety of small-to-mid-sized theropods. We argue that this pattern best fits the hypothesis that southern dinosaurs, like their northern counterparts, were still diversifying and occupying prominent roles in their ecosystems before the asteroid suddenly caused their extinction. However, this hypothesis remains to be tested with more refined paleontological and geochronological data, and we give suggestions for future work.
Future Lunar Sampling Missions: Big Returns on Small Samples
NASA Astrophysics Data System (ADS)
Shearer, C. K.; Borg, L.
2002-01-01
The next sampling missions to the Moon will result in the return of sample masses (100 g to 1 kg) substantially smaller than that returned by the Apollo missions (380 kg). Lunar samples to be returned by these missions are vital for: (1) calibrating the late impact history of the inner solar system that can then be extended to other planetary surfaces; (2) deciphering the effects of catastrophic impacts on a planetary body (i.e. Aitken crater); (3) understanding the very late-stage thermal and magmatic evolution of a cooling planet; (4) exploring the interior of a planet; and (5) examining volatile reservoirs and transport on an airless planetary body. Can small lunar samples be used to answer these and other pressing questions concerning important solar system processes? Two potential problems with small, robotically collected samples are placing them in a geologic context and extracting robust planetary information. Although geologic context will always be a potential problem with any planetary sample, new lunar samples can be placed within the context of the important Apollo-Luna collections and the burgeoning planet-scale data sets for the lunar surface and interior. Here we illustrate the usefulness of applying new or refined analytical approaches in deciphering information locked in small lunar samples.
NASA Technical Reports Server (NTRS)
Talpe Matthieu; Zuber, Maria T.; Yang, Di; Neumann, Gregory A.; Solomon, Sean C.; Mazarico, Erwan; Vilas, Faith
2012-01-01
Earth-based radar images of Mercury show radar-bright material inside impact craters near the planet's poles. A previous study indicated that the polar-deposit-hosting craters (PDCs) at Mercury's north pole are shallower than craters that lack such deposits. We use data acquired by the Mercury Laser Altimeter on the MESSENGER spacecraft during 11 months of orbital observations to revisit the depths of craters at high northern latitudes on Mercury. We measured the depth and diameter of 537 craters located poleward of 45° N, evaluated the slopes of the northern and southern walls of 30 PDCs, and assessed the floor roughness of 94 craters, including nine PDCs. We find that the PDCs appear to have a fresher crater morphology than the non-PDCs and that the radar-bright material has no detectable influence on crater depths, wall slopes, or floor roughness. The statistical similarity of crater depth-diameter relations for the PDC and non-PDC populations places an upper limit on the thickness of the radar-bright material (< 170 m for a crater 11 km in diameter) that can be refined by future detailed analysis. Results of the current study are consistent with the view that the radar-bright material constitutes a relatively thin layer emplaced preferentially in comparatively young craters.
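The depth-diameter comparison described above amounts to fitting a power-law relation to each crater population; a minimal sketch of such a fit is shown below, using synthetic depth and diameter values rather than the MLA measurements reported in the study.

```python
import numpy as np

# Synthetic (hypothetical) depth/diameter pairs in km; a real analysis
# would use the altimeter-derived measurements for the 537 craters.
diameters = np.array([5.0, 8.0, 11.0, 15.0, 20.0, 30.0])
depths = np.array([0.9, 1.3, 1.6, 1.9, 2.3, 2.9])

# Fit d = k * D**b by linear regression in log-log space.
b, log_k = np.polyfit(np.log(diameters), np.log(depths), 1)
k = np.exp(log_k)
print(f"d = {k:.2f} * D^{b:.2f}")

# Residuals from such a fit are what allow the PDC and non-PDC
# populations to be compared statistically.
```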
De Marchis, Emilia H; Doekhie, Kirti; Willard-Grace, Rachel; Olayiwola, J Nwando
2018-06-19
Over the past decade, the Patient-Centered Medical Home (PCMH) has become a preeminent model for primary care delivery. Simultaneously, health care disparities have gained increasing attention. There has been limited research on whether and how the PCMH can or should affect health care disparities. The authors conducted qualitative interviews with key stakeholders and experts on the PCMH model and health care disparities, including grant and policy makers, accreditors, researchers, patient advocates, primary care practices, practice transformation organizations, and payers, to assess perspectives on the role of the PCMH in addressing health care disparities. The application of grounded theory and thematic analysis elucidated best practice recommendations for the PCMH model's role in addressing health care disparities. Most stakeholders view the current PCMH model as having minimal or indirect influence on health care disparities, but the majority support greater integration of efforts to reduce health care disparities into the PCMH model. As the PCMH model continues to be refined, and as the health care system strives toward improving population health, there must be reflection on the policies and delivery systems that impact health care disparities.
NASA Technical Reports Server (NTRS)
Gillespie, V. G.; Kelly, R. O.
1974-01-01
The problems encountered and special techniques and procedures developed on the Skylab program are described along with the experiences and practical benefits obtained for dissemination and use on future programs. Three major topics are discussed: electrical problems, mechanical problems, and special techniques. Special techniques and procedures are identified that were either developed or refined during the Skylab program. These techniques and procedures came from all manufacturing and test phases of the Skylab program and include both flight and GSE items from component level to sophisticated spaceflight systems.
Coping with changing controlled vocabularies.
Cimino, J. J.; Clayton, P. D.
1994-01-01
For the foreseeable future, controlled medical vocabularies will be in a constant state of development, expansion and refinement. Changes in controlled vocabularies must be reconciled with historical patient information which is coded using those vocabularies and stored in clinical databases. This paper explores the kinds of changes that can occur in controlled vocabularies, including adding terms (simple additions, refinements, redundancy and disambiguation), deleting terms, changing terms (major and minor name changes), and other special situations (obsolescence, discovering redundancy, and precoordination). Examples are drawn from actual changes appearing in the 1993 update to the International Classification of Diseases (ICD9-CM). The methods being used at Columbia-Presbyterian Medical Center to reconcile its Medical Entities Dictionary and its clinical database are discussed. PMID:7949906
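The kinds of change enumerated above can be captured in a simple change-log data structure so that historical records remain mappable to current terms; the sketch below is a generic illustration of that idea, not the actual schema of the Medical Entities Dictionary.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class ChangeKind(Enum):
    ADDITION = auto()          # simple addition, refinement, or disambiguation
    DELETION = auto()
    NAME_CHANGE = auto()       # major or minor name change
    OBSOLESCENCE = auto()
    REDUNDANCY_MERGE = auto()  # newly discovered redundancy folded into one term
    PRECOORDINATION = auto()

@dataclass
class VocabularyChange:
    term_code: str             # identifier of the affected term
    kind: ChangeKind
    successor: Optional[str]   # replacement code, if any, so historical
                               # patient records can be mapped forward

# Hypothetical change log entries for two codes.
changelog = [
    VocabularyChange("1234", ChangeKind.NAME_CHANGE, "1234"),
    VocabularyChange("5678", ChangeKind.OBSOLESCENCE, None),
]
for change in changelog:
    print(change)
```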
Preclinical Alzheimer's Disease: Implications for Refinement of the Concept.
Vos, Stephanie J B; Visser, Pieter Jelle
2018-05-23
Increasing interest in clinical trials and clinical research settings to identify Alzheimer's disease (AD) in the earliest stages of the disease has led to the concept of preclinical AD. Individuals with preclinical AD have AD pathology without clinical symptoms yet. Accumulating evidence has shown that biomarkers can identify preclinical AD and that preclinical AD is associated with a poor clinical outcome. Little is known yet about the role of vascular and lifestyle risk factors in the development of preclinical AD. In order to better understand preclinical AD pathology and clinical progression rates, there is a need to refine the concept of preclinical AD. This will be of great value for advancements in future research, clinical trials, and eventually clinical practice.
The Long-Term Conditions Questionnaire: conceptual framework and item development.
Peters, Michele; Potter, Caroline M; Kelly, Laura; Hunter, Cheryl; Gibbons, Elizabeth; Jenkinson, Crispin; Coulter, Angela; Forder, Julien; Towers, Ann-Marie; A'Court, Christine; Fitzpatrick, Ray
2016-01-01
To identify the main issues of importance when living with long-term conditions to refine a conceptual framework for informing the item development of a patient-reported outcome measure for long-term conditions. Semi-structured qualitative interviews (n=48) were conducted with people living with at least one long-term condition. Participants were recruited through primary care. The interviews were transcribed verbatim and analyzed by thematic analysis. The analysis served to refine the conceptual framework, based on reviews of the literature and stakeholder consultations, for developing candidate items for a new measure for long-term conditions. Three main organizing concepts were identified: impact of long-term conditions, experience of services and support, and self-care. The findings helped to refine a conceptual framework, leading to the development of 23 items that represent issues of importance in long-term conditions. The 23 candidate items formed the first draft of the measure, currently named the Long-Term Conditions Questionnaire. The aim of this study was to refine the conceptual framework and develop items for a patient-reported outcome measure for long-term conditions, including single and multiple morbidities and physical and mental health conditions. Qualitative interviews identified the key themes for assessing outcomes in long-term conditions, and these underpinned the development of the initial draft of the measure. These initial items will undergo cognitive testing to refine the items prior to further validation in a survey.
Eliyas, S; Briggs, P; Gallagher, J E
2017-02-24
Objective To explore the experiences of primary care dentists following training to enhance endodontic skills and their views on the implications for the NHS. Design Qualitative study using anonymised free text questionnaires. Setting Primary care general dental services within the National Health Service (NHS) in London, United Kingdom. Subjects and methods Eight primary care dentists who completed this training were asked about factors affecting participant experience of the course, perceived impact on themselves, their organisation, their patients and barriers/facilitators to providing endodontic treatment in NHS primary care. Data were transferred verbatim to a spreadsheet and thematically analysed. Intervention 24-month part-time educational and service initiative to provide endodontics within the NHS, using a combination of training in simulation lab and treatment of patients in primary care. Results Positive impacts were identified at individual (gains in knowledge, skills, confidence, personal development), patient (more teeth saved, quality of care improved) and system levels (access, value for money). Suggested developments for future courses included more case discussions, teaching of practical skills earlier in the course and refinement of the triaging processes. Barriers to using the acquired skills in providing endodontic treatment in primary care within the NHS were perceived to be resources (remuneration, time, skills) and accountability. Facilitators included appropriately remunerated contracts, necessary equipment and time. Conclusion This novel pilot training programme in endodontics combining general practice experience with education/training, hands-on experience and a portfolio was perceived by participants as beneficial for extending skills and service innovation in primary dental care. The findings provide insight into primary dental care practitioners' experience with education/training and have implications for future educational initiatives in support of systems innovation within the NHS.
Anderson, Daniel M; Benson, James D; Kearsley, Anthony J
2014-12-01
Mathematical modeling plays an enormously important role in understanding the behavior of cells, tissues, and organs undergoing cryopreservation. Uses of these models range from explanation of phenomena, exploration of potential theories of damage or success, development of equipment, and refinement of optimal cryopreservation/cryoablation strategies. Over the last half century there has been a considerable amount of work in bio-heat and mass-transport, and these models and theories have been readily and repeatedly applied to cryobiology with much success. However, there are significant gaps between experimental and theoretical results that suggest missing links in models. One source for these potential gaps is that cryobiology is at the intersection of several very challenging aspects of transport theory: it couples multi-component, moving boundary, multiphase solutions that interact through a semipermeable elastic membrane with multicomponent solutions in a second time-varying domain, during a two-hundred Kelvin temperature change with multi-molar concentration gradients and multi-atmosphere pressure changes. In order to better identify potential sources of error, and to point to future directions in modeling and experimental research, we present a three part series to build from first principles a theory of coupled heat and mass transport in cryobiological systems accounting for all of these effects. The hope of this series is that by presenting and justifying all steps, conclusions may be made about the importance of key assumptions, perhaps pointing to areas of future research or model development, but importantly, lending weight to standard simplification arguments that are often made in heat and mass transport. In this first part, we review concentration variable relationships, their impact on choices for Gibbs energy models, and their impact on chemical potentials. Copyright © 2014 Elsevier Inc. All rights reserved.
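As a minimal illustration of the concentration-variable relationships mentioned above, the sketch below converts molality to mole fraction for a single solute and evaluates the ideal-dilute chemical potential of water; the paper itself treats far more general multi-component, non-ideal cases, so this is only a toy example.

```python
import math

R = 8.314           # J/(mol K)
M_WATER = 0.018015  # kg/mol

def mole_fractions_from_molality(m_solute):
    """Convert molality (mol solute per kg water) to mole fractions
    for a single-solute aqueous solution."""
    n_water = 1.0 / M_WATER  # moles of water per kg of water
    x_solute = m_solute / (m_solute + n_water)
    return x_solute, 1.0 - x_solute

def ideal_water_chemical_potential(mu0, m_solute, T):
    """Ideal (dilute) approximation: mu_w = mu_w0 + R*T*ln(x_w)."""
    _, x_water = mole_fractions_from_molality(m_solute)
    return mu0 + R * T * math.log(x_water)

# Example: 1 molal solution at 273.15 K, relative to pure water (mu0 = 0).
print(ideal_water_chemical_potential(0.0, 1.0, 273.15))
```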
NASA Astrophysics Data System (ADS)
Inoue, Tadanobu; Yin, Fuxing; Kimura, Yuuji; Tsuzaki, Kaneaki; Ochiai, Shojiro
2010-02-01
Bulk ultrafine-grained (UFG) low-carbon steel bars were produced by caliber rolling, and the impact and tensile properties were investigated. Initial samples with two different microstructures, ferrite-pearlite and martensite (or bainite), were prepared and then caliber rolling was conducted at 500 °C. The microstructures in the rolled bars consisted of an elongated UFG structure with a strong α-fiber texture. The rolled bar consisting of spheroidal cementite particles that distributed uniformly in the elongated ferrite matrix of transverse grain sizes 0.8 to 1.0 μm exhibited the best strength-ductility balance and impact properties. Although the yield strength in the rolled bar increased 2.4 times by grain refinement, the upper-shelf energy did not change, and its value was maintained from 100 °C to -40 °C. In the rolled bars, cracks during an impact test branched parallel to the longitudinal direction of the test samples as temperatures decreased. Delamination caused by such crack branching appeared, remarkably, near the ductile-to-brittle transition temperature (DBTT). The effect of delamination on the impact properties was associated with crack propagation on the basis of the microstructural features in the rolled bars. In conclusion, the strength-toughness balance is improved by refining crystal grains and controlling their shape and orientation; in addition, delamination effectively enhances the low-temperature toughness.
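The strengthening from grain refinement reported above is conventionally rationalized with the Hall-Petch relation; the sketch below evaluates that relation with illustrative constants for a ferritic steel, which are assumptions rather than values fitted to the rolled bars in this study.

```python
import math

def hall_petch(sigma0_mpa, k_mpa_sqrt_m, grain_size_m):
    """Hall-Petch relation: sigma_y = sigma_0 + k * d**(-1/2)."""
    return sigma0_mpa + k_mpa_sqrt_m / math.sqrt(grain_size_m)

# Illustrative constants for ferritic steel (assumed, not from the paper).
sigma0 = 70.0  # MPa, friction stress
k = 0.6        # MPa*m^0.5, Hall-Petch coefficient

for d_um in (10.0, 1.0, 0.8):
    sigma_y = hall_petch(sigma0, k, d_um * 1e-6)
    print(f"d = {d_um:5.1f} um -> sigma_y ~ {sigma_y:6.0f} MPa")
```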
NASA Astrophysics Data System (ADS)
Huson, S. A.; Foit, F. F.; Watkinson, A. J.; Pope, M. C.
2009-12-01
Previous X-ray powder diffraction (XRD) studies revealed that shock deformed carbonates and quartz have broader XRD patterns than those of unshocked samples. Entire XRD patterns, single peak profiles and Rietveld refined parameters of carbonate samples from the Sierra Madera impact crater, west Texas, unshocked equivalent samples from 95 miles north of the crater and the Mission Canyon Formation of southwest Montana and western Wyoming were used to evaluate the use of X-ray powder diffraction as a potential tool for distinguishing impact deformed rocks from unshocked and tectonically deformed rocks. At Sierra Madera dolostone and limestone samples were collected from the crater rim (lower shock intensity) and the central uplift (higher shock intensity). Unshocked equivalent dolostone samples were collected from well cores drilled outside of the impact crater. Carbonate rocks of the Mission Canyon Formation were sampled along a transect across the tectonic front of the Sevier and Laramide orogenic belts. Whereas calcite subjected to significant shock intensities at the Sierra Madera impact crater can be differentiated from tectonically deformed calcite from the Mission Canyon Formation using Rietveld refined peak profiles, weakly shocked calcite from the crater rim appears to be indistinguishable from the tectonically deformed calcite. In contrast, Rietveld analysis readily distinguishes shocked Sierra Madera dolomite from unshocked equivalent dolostone samples from outside the crater and tectonically deformed Mission Canyon Formation dolomite.
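Peak broadening of the kind exploited above is often summarized with the Scherrer relation between peak width and coherent domain size; the sketch below applies it to assumed FWHM values (not Sierra Madera data), and a full analysis would also subtract instrumental broadening and separate strain contributions, as Rietveld refinement does.

```python
import math

def scherrer_size(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    """Scherrer estimate of coherent domain size (nm) from peak FWHM:
    L = K * lambda / (beta * cos(theta)), with beta in radians."""
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * math.cos(theta))

# Illustrative comparison with assumed FWHM values (Cu K-alpha radiation);
# broader peaks in the shocked sample imply smaller coherent domains.
for label, fwhm in [("unshocked", 0.10), ("shocked", 0.25)]:
    print(label, round(scherrer_size(fwhm, two_theta_deg=30.9), 1), "nm")
```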
Free-Mass and Interface Configurations of Hammering Mechanisms
NASA Technical Reports Server (NTRS)
Bao, Xiaoqi (Inventor); Sherrit, Stewart (Inventor); Badescu, Mircea (Inventor); Bar-Cohen, Yoseph (Inventor); Askins, Steve (Inventor); Ostlund, Patrick (Inventor)
2015-01-01
The design of the free-mass in an ultrasonic driller/corer (USDC) has been refined in order to improve the performance and operational reliability of the system. In one embodiment, the improvements in performance and operational reliability include decreasing the impact surface area of the free-mass to increase the transfer of impact energy from the piezoelectric transducer and reductions in the likelihood that the system will jam.
Powers, Christina M; Grieger, Khara; Meacham, Connie A; Gooding, Meredith Lassiter; Gift, Jeffrey S; Lehmann, Geniece M; Hendren, Christine O; Davis, J Michael; Burgoon, Lyle
2016-01-01
Risk assessments and risk management efforts to protect human health and the environment can benefit from early, coordinated research planning by researchers, risk assessors, and risk managers. However, approaches for engaging these and other stakeholders in research planning have not received much attention in the environmental scientific literature. The Comprehensive Environmental Assessment (CEA) approach under development by the United States Environmental Protection Agency (USEPA) is a means to manage complex information and input from diverse stakeholder perspectives on research planning that will ultimately support environmental and human health decision making. The objectives of this article are to 1) describe the outcomes of applying lessons learned from previous CEA applications to planning research on engineered nanomaterial, multiwalled carbon nanotubes (MWCNTs) and 2) discuss new insights and refinements for future efforts to engage stakeholders in research planning for risk assessment and risk management of environmental issues. Although framed in terms of MWCNTs, this discussion is intended to enhance research planning to support assessments for other environmental issues as well. Key insights for research planning include the potential benefits of 1) ensuring that participants have research, risk assessment, and risk management expertise in addition to diverse disciplinary backgrounds; 2) including an early scoping step before rounds of formal ratings; 3) using a familiar numeric scale (e.g., US dollars) versus ordinal rating scales of "importance"; 4) applying virtual communication tools to supplement face-to-face interaction between participants; and 5) refining criteria to guide development of specific, actionable research questions. © 2015 SETAC.
The MCMI-III: present and future directions.
Millon, T; Davis, R D
1997-02-01
Both the original Millon Clinical Multiaxial Inventory (MCMI-I; Millon, 1977) and the Millon Clinical Multiaxial Inventory-II (MCMI-II; Millon, 1987) were refined and strengthened on a regular basis by both theoretic logic and research data. This aspiration has continued. The new Millon Clinical Multiaxial Inventory-III (MCMI-III; Millon, 1994) has been further coordinated with the most recent official diagnostic schema, the Diagnostic and Statistical Manual of Mental Disorders (4th ed., [DSM-IV]; American Psychiatric Association [APA], 1994) in an even more explicit way than before. Although the publication of the first version of the MCMI preceded the publication of the DSM-IV, its author played a major role in formulating the official manual's personality disorders, contributing thereby to their conceptual correspondence. The DSM-III-R (APA, 1987) was subsequently published in the same year as the MCMI-II; the inventory was modified in its final stages to make it as consonant as possible with the conceptual changes introduced in the then forthcoming official classification. The present version of the MCMI, the MCMI-III, strengthens these correspondences further by drawing on many of the diagnostic criteria of the DSM-IV to serve as the basis for drafting the inventory's items. This article reports on a select set of theoretical and empirical developments that are being carefully weighed for possible inclusion in future MCMIs, or as a guide in the refinement process of future MCMIs.
The U.S. Army Chemical Corps and a Future Within AFRICOM
2009-03-01
pollution of surface and coastal waters; poaching of elephants for ivory • Namibia: diamonds, copper, uranium, gold, silver, lead, tin, lithium ... desertification; wildlife populations (such as elephant, hippopotamus, giraffe, and lion) threatened because of poaching and habitat destruction ... extraction and refining region; chemical runoff into watersheds; poaching seriously threatens rhinoceros, elephant, antelope, and large cat populations
Status of growth and yield information for northern forest types
Dale S. Solomon
1977-01-01
Existing regional growth-and-yield information for most of the northern forest types is summarized by species. Present research is concentrated on growth-simulation models, constructed by either aggregating available information or through individual tree growth studies. A uniformity of more refined measurements is needed so that future growth models can be tried for...
NASA Technical Reports Server (NTRS)
Momenthy, A. M.
1980-01-01
Options for satisfying the future demand for commercial jet fuels are analyzed. It is concluded that the most effective means to this end are to attract more refiners to the jet fuel market and to encourage the development of processes to convert oil shale and coal to transportation fuels. Furthermore, changing U.S. refinery fuel specifications would not significantly alter jet fuel availability.
Refining the Assessment of Hopelessness: An Improved Way to Look to the Future
ERIC Educational Resources Information Center
Fisher, Lauren B.; Overholser, James C.
2013-01-01
Despite its high sensitivity, the Beck Hopelessness Scale (BHS) has demonstrated low specificity, has an ambiguous factor structure, and includes inadequate items. The current study examined the psychometric properties of a modified BHS (mBHS) using a Likert scale format that would allow for improved reliability, validity, and clinical utility.…
From Designing to Organizing New Social Futures: Multiliteracies Pedagogies for Today
ERIC Educational Resources Information Center
Penuel, William R.; O'Connor, Kevin
2018-01-01
More than 20 years ago, literacy pedagogies were proposed that were informed by an emerging networked world defined by local diversity and global connectedness, new digital media, and fast capitalism. Modern people now fully inhabit the world those pedagogies described, but the contours of that world's racial dynamics and growing inequality call for a refinement of pedagogies that…
Historical Roots and Future Perspectives Related to Nursing Ethics.
ERIC Educational Resources Information Center
Freitas, Lorraine
1990-01-01
This article traces the evolution of the development and refinement of the professional code from concerns about the ethical conduct of nurses to its present state as a professional code for all nurses. The relationship of the Ethics Committee of the American Nurses' Association to the development of the code is also discussed. (Author/MLW)
Growth of a 45-year-old ponderosa pine plantation: An Arizona case study
Peter F. Ffolliott; Gerald J. Gottfried; Cody L. Stropki; L. J. Heidmann
2008-01-01
Information on the growth of forest plantations is necessary for planning of ecosystem-based management of the plantations. This information is also useful in validating or refining computer simulators that estimate plantation growth into the future. Such growth information has been obtained from a 45-year-old ponderosa pine (Pinus ponderosa)...
Mobil lube dewaxing technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, C.L.; McGuiness, M.P.
1995-09-01
Currently, the lube refining industry is in a period of transition, with both hydroprocessing and catalytic dewaxing gathering momentum as replacements for solvent extraction and solvent dewaxing. In addition, lube product quality requirements have been increasing, both in the US and abroad. Mobil has developed a broad array of dewaxing catalytic technologies which can serve refiners throughout the stages of this transition. In the future, lube feedstocks which vary in source and wax content will become increasingly important, requiring an optimized system for highest performance. The Mobil Lube Dewaxing (MLDW) process is the work-horse of the catalytic dewaxing technologies, being a robust, low cost technology suitable for both solvent extracted and hydrocracked feeds. The Mobil Selective Dewaxing (MSDW) process has been recently introduced in response to the growth of hydroprocessing. MSDW requires either severely hydrotreated or hydrocracked feeds and provides improved lube yields and VI. For refiners with hydrocrackers and solvent dewaxing units, Mobil Wax Isomerization (MWI) technology can make higher VI base stocks to meet the growing demand for very high quality lube products. A review of these three technologies is presented in this paper.
Refinement of the Long-Term Conditions Questionnaire (LTCQ): patient and expert stakeholder opinion.
Kelly, Laura; Potter, Caroline M; Hunter, Cheryl; Gibbons, Elizabeth; Fitzpatrick, Ray; Jenkinson, Crispin; Peters, Michele
2016-01-01
It is a key UK government priority to assess and improve outcomes in people with long-term conditions (LTCs). We are developing a new patient-reported outcome measure, the Long-Term Conditions Questionnaire (LTCQ), for use among people with single or multiple LTCs. This study aimed to refine candidate LTCQ items that had previously been informed through literature reviews, interviews with professional stakeholders, and interviews with people with LTCs. Cognitive interviews (n=32) with people living with LTCs and consultations with professional stakeholders (n=13) and public representatives (n=5) were conducted to assess the suitability of 23 candidate items. Items were tested for content and comprehensibility and underwent a translatability assessment. Four rounds of revisions took place, due to amendments to item structure, improvements to item clarity, item duplication, and recommendations for future translations. Twenty items were confirmed as relevant to living with LTCs and understandable to patients and professionals. This study supports the content validity of the LTCQ items among people with LTCs and professional stakeholders. The final items are suitable to enter the next stage of psychometric refinement.
The nexus between urbanization and PM2.5 related mortality in China.
Liu, Miaomiao; Huang, Yining; Jin, Zhou; Ma, Zongwei; Liu, Xingyu; Zhang, Bing; Liu, Yang; Yu, Yang; Wang, Jinnan; Bi, Jun; Kinney, Patrick L
2017-08-01
The launch of China's new national urbanization plan, coupled with increasing concerns about air pollution, calls for a better understanding of the nexus between urbanization and air pollution-related health. Based on refined estimates of PM2.5-related mortality in China, we developed an Urbanization-Excess Deaths Elasticity (U-EDE) indicator to measure the marginal PM2.5-related mortality caused by urbanization. We then applied statistical models to estimate U-EDE and examined the modification effects of income on U-EDE. Urbanization in China between 2004 and 2012 led to increased PM2.5-related mortality. A 1% increase in urbanization was associated with a 0.32%, 0.14%, and 0.50% increase in PM2.5-related mortality from lung cancer, stroke, and ischemic heart disease, respectively. U-EDEs were modified by income with an inverted U curve, i.e., lower marginal impacts at the lowest and highest income levels. In addition, we projected the future U-EDE trend of China as a whole and found that China had experienced the peak of U-EDE and entered the second half of the inverted U-shaped curve. In the near future, the national average U-EDE in China will decline along with the improvement of income level if no dramatic changes happen. However, the decreased U-EDE only implies that the marginal PM2.5-related mortality brought by urbanization would decrease in China. The total health damage of urbanization will keep going up in the foreseeable future because the U-EDE is always positive. Copyright © 2017 Elsevier Ltd. All rights reserved.
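The U-EDE is an elasticity, i.e., the percent change in PM2.5-related deaths per percent change in urbanization; one standard way to estimate such a quantity is a log-log regression, sketched below on synthetic data. This is only an illustration of the arithmetic, not the study's panel specification, which also models income interactions.

```python
import numpy as np

# Synthetic (hypothetical) panel: urbanization rate (%) and
# PM2.5-related excess deaths for a set of province-years.
rng = np.random.default_rng(0)
urbanization = rng.uniform(30, 70, size=200)
deaths = 500.0 * urbanization ** 0.32 * rng.lognormal(0.0, 0.05, size=200)

# Elasticity = slope of ln(deaths) regressed on ln(urbanization).
slope, intercept = np.polyfit(np.log(urbanization), np.log(deaths), 1)
print(f"estimated U-EDE ~ {slope:.2f}")  # recovers ~0.32 by construction
```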
Biokinetics of Nanomaterials: the Role of Biopersistence.
Laux, Peter; Riebeling, Christian; Booth, Andy M; Brain, Joseph D; Brunner, Josephine; Cerrillo, Cristina; Creutzenberg, Otto; Estrela-Lopis, Irina; Gebel, Thomas; Johanson, Gunnar; Jungnickel, Harald; Kock, Heiko; Tentschert, Jutta; Tlili, Ahmed; Schäffer, Andreas; Sips, Adriënne J A M; Yokel, Robert A; Luch, Andreas
2017-04-01
Nanotechnology risk management strategies and environmental regulations continue to rely on hazard and exposure assessment protocols developed for bulk materials, including larger size particles, while commercial application of nanomaterials (NMs) increases. In order to support and corroborate risk assessment of NMs for workers, consumers, and the environment it is crucial to establish the impact of biopersistence of NMs at realistic doses. In the future, such data will allow a more refined future categorization of NMs. Despite many experiments on NM characterization and numerous in vitro and in vivo studies, several questions remain unanswered including the influence of biopersistence on the toxicity of NMs. It is unclear which criteria to apply to characterize a NM as biopersistent. Detection and quantification of NMs, especially determination of their state, i.e., dissolution, aggregation, and agglomeration within biological matrices and other environments are still challenging tasks; moreover mechanisms of nanoparticle (NP) translocation and persistence remain critical gaps. This review summarizes the current understanding of NM biokinetics focusing on determinants of biopersistence. Thorough particle characterization in different exposure scenarios and biological matrices requires use of suitable analytical methods and is a prerequisite to understand biopersistence and for the development of appropriate dosimetry. Analytical tools that potentially can facilitate elucidation of key NM characteristics, such as ion beam microscopy (IBM) and time-of-flight secondary ion mass spectrometry (ToF-SIMS), are discussed in relation to their potential to advance the understanding of biopersistent NM kinetics. We conclude that a major requirement for future nanosafety research is the development and application of analytical tools to characterize NPs in different exposure scenarios and biological matrices.
Sahlean, Tiberiu C; Gherghel, Iulian; Papeş, Monica; Strugariu, Alexandru; Zamfirescu, Ştefan R
2014-01-01
Climate warming is one of the most important threats to biodiversity. Ectothermic organisms such as amphibians and reptiles are especially vulnerable as climatic conditions affect them directly. Ecological niche models (ENMs) are increasingly popular in ecological studies, but several drawbacks exist, including the limited ability to account for the dispersal potential of the species. In this study, we use ENMs to explore the impact of global climate change on the Caspian whip snake (Dolichophis caspius) as a model for organisms with low dispersal abilities and to quantify dispersal to novel areas using GIS techniques. Models generated using Maxent 3.3.3k and GARP for the current distribution were projected onto future climatic scenarios. A cost-distance analysis was run in ArcGIS 10 using geomorphological features, ecological conditions, and human footprint as "costs" to dispersal of the species to obtain a Maximum Dispersal Range (MDR) estimate. All models developed were statistically significant (P<0.05) and recovered the currently known distribution of D. caspius. Models projected onto future climatic conditions using Maxent predicted a doubling of suitable climatic area, while GARP predicted a more conservative expansion. Both models agreed on an expansion of suitable area northwards, with minor decreases at the southern distribution limit. The MDR area calculated using the Maxent model represented a third of the total area of the projected model. The MDR based on GARP models recovered only about 20% of the total area of the projected model. Thus, incorporating measures of species' dispersal abilities greatly reduced the estimated area of potential future distributions.
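The Maximum Dispersal Range estimate rests on a cost-distance calculation over a resistance surface; the sketch below uses scikit-image's MCP_Geometric as a stand-in for the ArcGIS 10 cost-distance tool, with a synthetic cost raster and hypothetical source cells, so the numbers are illustrative only.

```python
import numpy as np
from skimage.graph import MCP_Geometric

# Synthetic resistance surface: low cost = permeable habitat,
# high cost = barriers (relief, unsuitable land cover, human footprint).
costs = np.ones((100, 100))
costs[:, 40:45] = 25.0  # a hypothetical barrier band

# Hypothetical occurrence cells used as dispersal sources.
sources = [(50, 10), (80, 5)]

mcp = MCP_Geometric(costs)
cumulative_cost, _ = mcp.find_costs(sources)

# Cells reachable within a dispersal "budget" approximate the MDR.
budget = 60.0
mdr_mask = cumulative_cost <= budget
print("reachable cells:", int(mdr_mask.sum()))
```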
NASA Astrophysics Data System (ADS)
Schleussner, C.-F.; Lissner, T. K.; Fischer, E. M.; Wohland, J.; Perrette, M.; Golly, A.; Rogelj, J.; Childers, K.; Schewe, J.; Frieler, K.; Mengel, M.; Hare, W.; Schaeffer, M.
2015-11-01
Robust appraisals of climate impacts at different levels of global-mean temperature increase are vital to guide assessments of dangerous anthropogenic interference with the climate system. Currently, two such levels are discussed in the context of the international climate negotiations as long-term global temperature goals: a below 2 °C and a 1.5 °C limit in global-mean temperature rise above pre-industrial levels. Despite the prominence of these two temperature limits, a comprehensive assessment of the differences in climate impacts at these levels is still missing. Here we provide an assessment of key impacts of climate change at warming levels of 1.5 °C and 2 °C, including extreme weather events, water availability, agricultural yields, sea-level rise and risk of coral reef loss. Our results reveal substantial differences in impacts between 1.5 °C and 2 °C. For heat-related extremes, the additional 0.5 °C increase in global-mean temperature marks the difference between events at the upper limit of present-day natural variability and a new climate regime, particularly in tropical regions. Similarly, this warming difference is likely to be decisive for the future of tropical coral reefs. In a scenario with an end-of-century warming of 2 °C, virtually all tropical coral reefs are projected to be at risk of severe degradation due to temperature induced bleaching from 2050 onwards. This fraction is reduced to about 90 % in 2050 and projected to decline to 70 % by 2100 for a 1.5 °C scenario. Analyses of precipitation-related impacts reveal distinct regional differences and several hot-spots of change emerge. Regional reduction in median water availability for the Mediterranean is found to nearly double from 9 to 17 % between 1.5 °C and 2 °C, and the projected lengthening of regional dry spells increases from 7 % longer to 11 %. Projections for agricultural yields differ between crop types as well as world regions. While some (in particular high-latitude) regions may benefit, tropical regions like West Africa, South-East Asia, as well as Central and Northern South America are projected to face local yield reductions, particularly for wheat and maize. Best estimate sea-level rise projections based on two illustrative scenarios indicate a 50 cm rise by 2100 relative to year 2000-levels under a 2 °C warming, which is about 10 cm lower for a 1.5 °C scenario. Our findings highlight the importance of regional differentiation to assess future climate risks as well as different vulnerabilities to incremental increases in global-mean temperature. The article provides a consistent and comprehensive assessment of existing projections and a solid foundation for future work on refining our understanding of warming-level dependent climate impacts.
The Portsmouth-based glaucoma refinement scheme: a role for virtual clinics in the future?
Trikha, S; Macgregor, C; Jeffery, M; Kirwan, J
2012-10-01
Glaucoma referrals continue to place a significant burden on Hospital Eye Services (HES), a large proportion of which are false positives. To evaluate the Portsmouth glaucoma scheme, utilising virtual clinics, digital technology, and community optometrists to streamline glaucoma referrals. The stages of the patient trail were mapped and, at each step of the process, 100 consecutive patient decisions were identified. The diagnostic outcomes of 50 consecutive patients referred from the refinement scheme to the HES were identified. A total of 76% of 'glaucoma' referrals were suitable for the refinement scheme. Overall, 94% of disc images were gradeable in the virtual clinic. In all, 11% of patients 'attending' the virtual clinic were accepted into HES, with 89% being discharged for community follow-up. Of referrals accepted into HES, the positive predictive value (glaucoma/ocular hypertension/suspect) was 0.78 vs 0.37 in the preceding 'unrefined' scheme (95% CI 0.65-0.87). The scheme has released 1400 clinic slots/year for HES, and has produced a £244 200/year cost saving for Portsmouth Hospitals' Trust. The refinement scheme is streamlining referrals and increasing the positive predictive rate in the diagnosis of glaucoma, glaucoma suspect or ocular hypertension. This consultant-led practice-based commissioning scheme, if adopted widely, is likely to yield a significant cost saving while maintaining high quality of care within the NHS.
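The reported positive predictive value and its confidence interval can be reproduced from simple counts; the sketch below computes a PPV with a Wilson score interval. The count of 39 confirmed cases out of the 50 audited referrals is an assumption consistent with the reported PPV of 0.78, not a figure stated in the abstract.

```python
import math

def ppv_wilson(true_pos, referrals, z=1.96):
    """Positive predictive value with a Wilson score 95% CI."""
    p = true_pos / referrals
    denom = 1 + z**2 / referrals
    centre = (p + z**2 / (2 * referrals)) / denom
    half = z * math.sqrt(p * (1 - p) / referrals
                         + z**2 / (4 * referrals**2)) / denom
    return p, centre - half, centre + half

# Illustrative counts (assumed): 39 of 50 refined referrals confirmed as
# glaucoma/ocular hypertension/suspect gives PPV 0.78 with CI ~0.65-0.87.
print(ppv_wilson(39, 50))
```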
Commentary: Epidemiology in the era of big data.
Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M
2015-05-01
Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called "three V's": variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field's future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future.
Estimating Impacts of Diesel Fuel Reformulation with Vector-based Blending
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hadder, G.R.
2003-01-23
The Oak Ridge National Laboratory Refinery Yield Model has been used to study the refining cost, investment, and operating impacts of specifications for reformulated diesel fuel (RFD) produced in refineries of the U.S. Midwest in summer of year 2010. The study evaluates different diesel fuel reformulation investment pathways. The study also determines whether there are refinery economic benefits for producing an emissions reduction RFD (with flexibility for individual property values) compared to a vehicle performance RFD (with inflexible recipe values for individual properties). Results show that refining costs are lower with early notice of requirements for RFD. While advanced desulfurization technologies (with low hydrogen consumption and little effect on cetane quality and aromatics content) reduce the cost of ultra low sulfur diesel fuel, these technologies contribute to the increased costs of a delayed notice investment pathway compared to an early notice investment pathway for diesel fuel reformulation. With challenging RFD specifications, there is little refining benefit from producing emissions reduction RFD compared to vehicle performance RFD. As specifications become tighter, processing becomes more difficult, blendstock choices become more limited, and refinery benefits vanish for emissions reduction relative to vehicle performance specifications. Conversely, the emissions reduction specifications show increasing refinery benefits over vehicle performance specifications as specifications are relaxed, and alternative processing routes and blendstocks become available. In sensitivity cases, the refinery model is also used to examine the impact of RFD specifications on the economics of using Canadian synthetic crude oil. There is a sizeable increase in synthetic crude demand as ultra low sulfur diesel fuel displaces low sulfur diesel fuel, but this demand increase would be reversed by requirements for diesel fuel reformulation.
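Refinery yield models of the kind used here are, at their core, large linear programs that blend streams to meet product specifications at minimum cost; the toy sketch below shows a two-blendstock diesel blend solved with scipy.optimize.linprog, with all stream properties, costs, and specification limits invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical blendstocks: [hydrotreated distillate, cracked stock]
cost = np.array([1.05, 0.90])    # $/gallon
sulfur = np.array([8.0, 40.0])   # ppm
cetane = np.array([52.0, 38.0])  # cetane number

# Illustrative blend specs: <= 15 ppm sulfur, >= 48 cetane, fractions sum to 1.
A_ub = np.array([sulfur, -cetane])
b_ub = np.array([15.0, -48.0])
A_eq = np.array([[1.0, 1.0]])
b_eq = np.array([1.0])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1), (0, 1)], method="highs")
print(res.x, res.fun)  # optimal blend fractions and blend cost in $/gallon
```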
Impacts of the Venezuelan Crude Oil Production Loss
2003-01-01
This assessment of the Venezuelan petroleum loss examines two areas. The first part of the analysis focuses on the impact of the loss of Venezuelan crude production on crude oil supply for U.S. refiners who normally run a significant fraction of Venezuelan crude oil. The second part of the analysis looks at the impact of the Venezuelan production loss on crude markets in general, with particular emphasis on crude oil imports, refinery crude oil throughput levels, stock levels, and the changes in price differences between light and heavy crude oils.
Bulk Nanolaminated Nickel: Preparation, Microstructure, Mechanical Property, and Thermal Stability
NASA Astrophysics Data System (ADS)
Liu, Fan; Yuan, Hao; Goel, Sunkulp; Liu, Ying; Wang, Jing Tao
2018-02-01
A bulk nanolaminated (NL) structure with distinctive fractions of low- and high-angle grain boundaries (f_LAGB and f_HAGB) is produced in pure nickel, through a two-step process of primary grain refinement by equal-channel angular pressing (ECAP), followed by a secondary geometrical refinement via liquid nitrogen rolling (LNR). The lamellar boundary spacings of 2N and 4N nickel are refined to 40 and 70 nm, respectively, and the yield strength of the NL structure in 2N nickel reaches 1.5 GPa. The impacts of the deformation path, material purity, grain boundary (GB) misorientation, and energy on the microstructure, refinement ability, mechanical strength, and thermal stability are investigated to understand the inherent governing mechanisms. GB migration is the main restoration mechanism limiting the refinement of an NL structure in 4N nickel, while in 2N nickel, shear banding occurs and mediates one-fifth of the total true normal rolling strain at the mesoscale, restricting further refinement. Three typical structures [ultrafine grained (UFG), NL with low f_LAGB, and NL with high f_LAGB] obtained through three different combinations of ECAP and LNR were studied by isochronal annealing for 1 hour at temperatures ranging from 433 K to 973 K (160 °C to 700 °C). Higher thermal stability in the NL structure with high f_LAGB is shown by a 50 K (50 °C) delay in the initiation temperature of recrystallization. Based on calculations and analyses of the stored energies of deformed structures from strain distribution, as characterized by kernel average misorientation (KAM), and from GB misorientations, higher thermal stability is attributed to high f_LAGB in this type of NL structure. This is confirmed by a slower change in the microstructure, as revealed by characterizing its annealing kinetics using KAM maps.
2012-01-01
Background Our companion paper discussed the yield benefits achieved by integrating deacetylation, mechanical refining, and washing with low acid and low temperature pretreatment. To evaluate the impact of the modified process on the economic feasibility, a techno-economic analysis (TEA) was performed based on the experimental data presented in the companion paper. Results The cost benefits of dilute acid pretreatment technology combined with the process alternatives of deacetylation, mechanical refining, and pretreated solids washing were evaluated using cost benefit analysis within a conceptual modeling framework. Control cases were pretreated at much lower acid loadings and temperatures than those used in the NREL 2011 design case, resulting in much lower annual ethanol production. Therefore, the minimum ethanol selling prices (MESP) of the control cases were $0.41-$0.77 higher than the $2.15/gallon MESP of the design case. This increment is highly dependent on the carbohydrate content in the corn stover. However, if pretreatment was employed with either deacetylation or mechanical refining, the MESPs were reduced by $0.23-$0.30/gallon. Combining both steps could lower the MESP further by $0.44-$0.54/gallon. Washing of the pretreated solids could also greatly improve the final ethanol yields. However, the large capital cost of the solid–liquid separation unit negatively influences the process economics. Finally, sensitivity analysis was performed to study the effect of the cost of the pretreatment reactor and the energy input for mechanical refining. A 50% reduction in the pretreatment reactor cost reduced the MESP of the entire conversion process by $0.11-$0.14/gallon, while a 10-fold increase in energy input for mechanical refining will increase the MESP by $0.07/gallon. Conclusion Deacetylation and mechanical refining process options combined with low acid, low severity pretreatments show improvements in ethanol yields and calculated MESP for cellulosic ethanol production. PMID:22967479
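The MESP figures above come from a full discounted cash-flow TEA; the sketch below shows only the simplified levelized-cost arithmetic behind such numbers, with every input (capital, capital recovery factor, operating cost, production volume, reactor cost share) invented for illustration.

```python
def minimum_selling_price(capital, capital_recovery_factor,
                          annual_opex, annual_gallons):
    """Simplified levelized cost: annualized capital plus operating cost
    divided by annual production (a full TEA uses discounted cash-flow
    rate-of-return analysis instead)."""
    return (capital * capital_recovery_factor + annual_opex) / annual_gallons

# Hypothetical plant: $450M capital, 10% capital recovery factor,
# $90M/yr operating cost, 60M gal/yr ethanol.
base = minimum_selling_price(450e6, 0.10, 90e6, 60e6)
# Effect of halving a hypothetical $30M pretreatment reactor cost.
cheaper = minimum_selling_price(450e6 - 15e6, 0.10, 90e6, 60e6)
print(round(base, 2), round(base - cheaper, 3))  # $/gallon and the delta
```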
Franco, Nuno Henrique; Correia-Neves, Margarida; Olsson, I. Anna S.
2012-01-01
There is growing concern over the welfare of animals used in research, in particular when these animals develop pathology. The present study aims to identify the main sources of animal distress and to assess the possible implementation of refinement measures in experimental infection research, using mouse models of tuberculosis (TB) as a case study. This choice is based on the historical relevance of mouse studies in understanding the disease and the present and long-standing impact of TB on a global scale. Literature published between 1997 and 2009 was analysed, focusing on the welfare impact on the animals used and the implementation of refinement measures to reduce this impact. In this 12-year period, we observed a rise in reports of ethical approval of experiments. The proportion of studies classified into the most severe category did however not change significantly over the studied period. Information on important research parameters, such as method for euthanasia or sex of the animals, were absent in a substantial number of papers. Overall, this study shows that progress has been made in the application of humane endpoints in TB research, but that a considerable potential for improvement remains. PMID:23110093
A Review of the Water and Energy Sectors and the Use of a Nexus Approach in Abu Dhabi.
Paul, Parneet; Al Tenaiji, Ameena Kulaib; Braimah, Nuhu
2016-03-25
Rapid population increase coupled with urbanization and industrialization has resulted in shortages of water in the Middle East. This situation is further exacerbated by global climate change due to greenhouse gas emissions. Recent research advocates that solutions to the global water security and scarcity crisis must involve water-energy nexus approaches. This means adopting policies and strategies that harmonize these inter-related sectors to minimize environmental impact while maximizing human benefit. In the case of Abu Dhabi, when designing and locating oil/gas refineries and associated power generation facilities, previous relevant decisions were based on simple economic and geographical grounds, such as nearness to oil rigs, pipelines, existing industries and port facilities, etc. The subsequent design and location of water abstraction and treatment works operated by the waste heat from these refining and/or power generation processes was catered for as an afterthought, meaning that there is now a mismatch between the water and energy supplies and demands. This review study was carried out to show how Abu Dhabi is trying now to integrate its water-energy sectors using a nexus approach so that future water/power infrastructure is designed optimally and operated in harmony, especially in regard to future demand. Based upon this review work, some recommendations are made for designers and policy makers alike to bolster the nexus approach that Abu Dhabi is pursuing.
Astronaut medical selection during the shuttle era: 1981-2011.
Johnston, Smith L; Blue, Rebecca S; Jennings, Richard T; Tarver, William J; Gray, Gary W
2014-08-01
U.S. astronauts undergo extensive job-related screening and medical examinations prior to selection in order to identify candidates optimally suited for careers in spaceflight. Screening medical standards evolved over many years and after extensive spaceflight experience. These standards assess health-related risks for each astronaut candidate, minimizing the potential for medical impact on future mission success. This document discusses the evolution of the Shuttle-era medical selection standards and the most common reasons for medical disqualification of applicants. Data for astronaut candidate finalists were compiled from medical records and NASA archives from the period of 1978 to 2004 and were retrospectively reviewed for medically disqualifying conditions. During Shuttle selection cycles, a total of 372 applicants were disqualified due to 425 medical concerns. The most common disqualifying conditions included visual, cardiovascular, psychiatric, and behavioral disorders. During this time period, three major expert panel reviews resulted in refinements and alterations to selection standards for future cycles. Shuttle-era screening, testing, and specialist evaluations evolved through periodic expert reviews, evidence-based medicine, and astronaut medical care experience. The Shuttle medical program contributed to the development and implementation of NASA and international standards, longitudinal data collection, improved medical care, and occupational surveillance models. The lessons learned from the Shuttle program serve as the basis for medical selection for the ISS, exploration-class missions, and for those expected to participate in commercial spaceflight.
A Review of the Water and Energy Sectors and the Use of a Nexus Approach in Abu Dhabi
Paul, Parneet; Al Tenaiji, Ameena Kulaib; Braimah, Nuhu
2016-01-01
Rapid population increase coupled with urbanization and industrialization has resulted in shortages of water in the Middle East. This situation is further exacerbated by global climate change due to greenhouse gas emissions. Recent research advocates that solutions to the global water security and scarcity crisis must involve water–energy nexus approaches. This means adopting policies and strategies that harmonize these inter-related sectors to minimize environmental impact while maximizing human benefit. In the case of Abu Dhabi, when designing and locating oil/gas refineries and associated power generation facilities, previous relevant decisions were based on simple economic and geographical grounds, such as nearness to oil rigs, pipelines, existing industries and port facilities, etc. The subsequent design and location of water abstraction and treatment works operated by the waste heat from these refining and/or power generation processes was catered for as an afterthought, meaning that there is now a mismatch between the water and energy supplies and demands. This review study was carried out to show how Abu Dhabi is trying now to integrate its water–energy sectors using a nexus approach so that future water/power infrastructure is designed optimally and operated in harmony, especially in regard to future demand. Based upon this review work, some recommendations are made for designers and policy makers alike to bolster the nexus approach that Abu Dhabi is pursuing. PMID:27023583
A narrative review of alcohol consumption as a risk factor for global burden of disease.
Rehm, Jürgen; Imtiaz, Sameer
2016-10-28
Since the original Comparative Risk Assessment (CRA) for alcohol consumption as part of the Global Burden of Disease Study for 1990, there have been regular updates of CRAs for alcohol from the World Health Organization and/or the Institute for Health Metrics and Evaluation. These studies have become more and more refined with respect to establishing causality between dimensions of alcohol consumption and different disease and mortality (cause of death) outcomes, refining risk relations, and improving the methodology for estimating exposure and alcohol-attributable burden. The present review will give an overview of the main results of the CRAs with respect to alcohol consumption as a risk factor, sketch out new trends and developments, and draw implications for future research and policy.
Crawford, Eean R; Lepine, Jeffery A; Rich, Bruce Louis
2010-09-01
We refine and extend the job demands-resources model with theory regarding appraisal of stressors to account for inconsistencies in relationships between demands and engagement, and we test the revised theory using meta-analytic structural modeling. Results indicate support for the refined and updated theory. First, demands and burnout were positively associated, whereas resources and burnout were negatively associated. Second, whereas relationships among resources and engagement were consistently positive, relationships among demands and engagement were highly dependent on the nature of the demand. Demands that employees tend to appraise as hindrances were negatively associated with engagement, and demands that employees tend to appraise as challenges were positively associated with engagement. Implications for future research are discussed. Copyright 2010 APA, all rights reserved
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-08
... proposal area. Alternative transmission line corridors will be refined as part of the EIS scoping process... aspects of the proposal will be considered in the EIS. RUS is the lead Federal agency, as defined at 40...
Small Changes to Catalysts; Big Impacts for our Nation
Bullock, Morris; Shaw, Wendy; O'Hagan, Molly
2018-01-16
Hydrogen is at the heart of producing ammonia. It's what we need to make the fertilizers that grow our food, and to refine crude oil into fuels that meet clean air standards. But it is produced from non-renewable resources, typically natural gas.
Small Changes to Catalysts; Big Impacts for our Nation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bullock, Morris; Shaw, Wendy; O'Hagan, Molly
2013-08-29
Hydrogen is at the heart of producing ammonia. It's what we need to make the fertilizers that grow our food, and to refine crude oil into fuels that meet clean air standards. But it is produced from non-renewable resources, typically natural gas.
Testing hydrodynamics schemes in galaxy disc simulations
NASA Astrophysics Data System (ADS)
Few, C. G.; Dobbs, C.; Pettitt, A.; Konstandin, L.
2016-08-01
We examine how three fundamentally different numerical hydrodynamics codes follow the evolution of an isothermal galactic disc with an external spiral potential. We compare an adaptive mesh refinement code (RAMSES), a smoothed particle hydrodynamics code (SPHNG), and a volume-discretized mesh-less code (GIZMO). Using standard refinement criteria, we find that RAMSES produces a disc that is less vertically concentrated and does not reach such high densities as the SPHNG or GIZMO runs. The gas surface density in the spiral arms increases at a lower rate for the RAMSES simulations compared to the other codes. There is also a greater degree of substructure in the SPHNG and GIZMO runs and secondary spiral arms are more pronounced. By resolving the Jeans length with a greater number of grid cells, we achieve more similar results to the Lagrangian codes used in this study. Other alterations to the refinement scheme (adding extra levels of refinement and refining based on local density gradients) are less successful in reducing the disparity between RAMSES and SPHNG/GIZMO. Although more similar, SPHNG displays different density distributions and vertical mass profiles to all modes of GIZMO (including the smoothed particle hydrodynamics version). This suggests differences also arise which are not intrinsic to the particular method but rather due to its implementation. The discrepancies between codes (in particular, the densities reached in the spiral arms) could potentially result in differences in the locations and time-scales for gravitational collapse, and therefore impact star formation activity in more complex galaxy disc simulations.
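As a concrete illustration of the Jeans-length refinement criterion discussed above, the sketch below (Python) flags a cell for refinement when its size under-resolves the local Jeans length; the four-cells-per-Jeans-length threshold is a commonly used Truelove-type value and is an assumption here, not necessarily the setting used in the paper's RAMSES runs.

    import numpy as np

    G = 6.674e-8  # gravitational constant in cgs units

    def needs_refinement(cell_size_cm, density_g_cm3, sound_speed_cm_s, cells_per_jeans=4):
        """Return True if the cell under-resolves the local Jeans length.

        lambda_J = c_s * sqrt(pi / (G * rho)); requiring several cells per
        Jeans length is a standard refinement criterion for self-gravitating gas.
        """
        jeans_length = sound_speed_cm_s * np.sqrt(np.pi / (G * density_g_cm3))
        return cell_size_cm > jeans_length / cells_per_jeans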
The Long-Term Conditions Questionnaire: conceptual framework and item development
Peters, Michele; Potter, Caroline M; Kelly, Laura; Hunter, Cheryl; Gibbons, Elizabeth; Jenkinson, Crispin; Coulter, Angela; Forder, Julien; Towers, Ann-Marie; A’Court, Christine; Fitzpatrick, Ray
2016-01-01
Purpose To identify the main issues of importance when living with long-term conditions to refine a conceptual framework for informing the item development of a patient-reported outcome measure for long-term conditions. Materials and methods Semi-structured qualitative interviews (n=48) were conducted with people living with at least one long-term condition. Participants were recruited through primary care. The interviews were transcribed verbatim and analyzed by thematic analysis. The analysis served to refine the conceptual framework, based on reviews of the literature and stakeholder consultations, for developing candidate items for a new measure for long-term conditions. Results Three main organizing concepts were identified: impact of long-term conditions, experience of services and support, and self-care. The findings helped to refine a conceptual framework, leading to the development of 23 items that represent issues of importance in long-term conditions. The 23 candidate items formed the first draft of the measure, currently named the Long-Term Conditions Questionnaire. Conclusion The aim of this study was to refine the conceptual framework and develop items for a patient-reported outcome measure for long-term conditions, including single and multiple morbidities and physical and mental health conditions. Qualitative interviews identified the key themes for assessing outcomes in long-term conditions, and these underpinned the development of the initial draft of the measure. These initial items will undergo cognitive testing to refine the items prior to further validation in a survey. PMID:27621678
NASA technology program for future civil air transports
NASA Technical Reports Server (NTRS)
Wright, H. T.
1983-01-01
An assessment is undertaken of the development status of technology, applicable to future civil air transport design, which is currently undergoing conceptual study or testing at NASA facilities. The NASA civil air transport effort emphasizes advanced aerodynamic computational capabilities, fuel-efficient engines, advanced turboprops, composite primary structure materials, advanced aerodynamic concepts in boundary layer laminarization and aircraft configuration, refined control, guidance and flight management systems, and the integration of all these design elements into optimal systems. Attention is given to such novel transport aircraft design concepts as forward swept wings, twin fuselages, sandwich composite structures, and swept blade propfans.
On the horizon: new options for contraception.
Reifsnider, E
1997-01-01
Future contraceptives include refinements of existing contraceptives and totally new methods. New formulations of oral contraceptives, subdermal hormonal implants, injectable hormones, vaginal spermicides, and intrauterine devices (IUDs) are being tested around the world. New methods that are not yet available include the use of vaginal preparations containing sperm-immobilizing agents, gonadotrophin releasing hormone agonists and antagonists, vaccines against ova and sperm, and endogenous hormones. Male contraceptive methods use hormones to suppress testosterone and vaccines to immobilize sperm. The availability of all future contraceptives is dependent on ample funds for research, development, and testing, and such funds are in jeopardy.
[Natural orifice translumenal endoscopic surgery: historical and future perspectives].
Yasuda, Kazuhiro; Shiroshita, Hidefumi; Inomata, Masafumi; Kitano, Seigo
2013-11-01
Natural orifice translumenal endoscopic surgery (NOTES) has gained much attention worldwide since the first report of transgastric peritoneoscopy in a porcine model in 2004. In this review, we summarize and highlight the current status and future directions of NOTES. Thousands of human NOTES procedures have been performed. The most common procedures are cholecystectomy and appendectomy, mainly performed through transvaginal access in a hybrid fashion with laparoscopic assistance, and the general complication rate is acceptable. Although much work is still needed to refine the techniques for NOTES, the development of NOTES has the potential to create a paradigm shift in minimally invasive surgery.
The National Inventory of Down Woody Materials: Methods, Outputs, and Future Directions
Christopher W. Woodall
2003-01-01
The Forest Inventory and Analysis Program (FIA) of the USDA Forest Service conducts a national inventory of forests of the United States. A subset of FIA permanent inventory plots is sampled every year for numerous forest health indicators ranging from soils to understory vegetation. Down woody material (DWM) is an FIA indicator that refines estimation of forest...
Future Software Sizing Metrics and Estimation Challenges
2011-07-01
systems 4. Ultrahigh software system assurance 5. Legacy maintenance and Brownfield development 6. Agile and Lean/ Kanban development. This paper...refined as the design of the maintenance modifications or Brownfield re-engineering is determined. VII. 6. AGILE AND LEAN/ KANBAN DEVELOPMENT The...difficulties of software maintenance estimation can often be mitigated by using lean workflow management techniques such as Kanban [25]. In Kanban
Identification of polar bear den habitat in northern Alaska
Amstrup, Steven C.; Garner, Gerald W.; Derocher, Andrew E.; Garner, Gerald W.; Lunn, Nicholas J.; Wiig, Øystein; Derocher, Andrew E.; Garner, Gerald W.; Lunn, Nicholas J.; Wiig, Øystein
1998-01-01
The goal of this project is to refine the information previously collected on maternal denning into digital maps that show where polar bears are likely to create future dens in northern Alaska. Such maps will allow a priori recommendations regarding timing and geographic locations of proposed human developments and hence provide managers with an important mitigation and management tool.
A lumber grading system for the future: an update evaluation
D. Earl Kline; Chris Surak; Philip A. Araman
2000-01-01
Virginia Tech and the Southern Research Station of the USDA Forest Service have jointly developed and refined a multiple-sensor lumber-scanning prototype to demonstrate and test applicable scanning technologies (Conners et al. 1997, Kline et al. 1997, Kline et al. 1998). This R&D effort has led to a patented wood color and grain sorting system (Conners and Lu 1998...
ERIC Educational Resources Information Center
McDonald, Ronald H.
In this paper, a comparison of the Latin American and the North American society is presented as a preliminary to future refinement of the concepts into instructional devices for secondary students. Following discussion of the distinctions between the two general societal types (Latin America as organic-centripetal and North America as…
YogaHome: teaching and research challenges in a yoga program with homeless adults.
Davis-Berman, Jennifer; Farkas, Jean
2012-01-01
YogaHome is a therapeutic yoga program for homeless women. Developing and refining YogaHome provided a unique opportunity to explore the process of teaching yoga to women faced with the physical and emotional stress of living in a homeless shelter. Unique teaching and research challenges are presented and recommendations for future programs are discussed.
[Financial Aid to Independent Students at the Post Secondary Level: The Federal Government's Role
ERIC Educational Resources Information Center
Dellenback, John
One of the new and complex issues related to student aid is the independent student controversy. The author wishes to increase the Basic Opportunity Grant (BOG) and substantially increase work study to help the independent student. For the immediate future the author would like to see: (1) The BOG refined as a major Federal grant program committed…
Growth of a 45-year-old ponderosa pine plantation: An Arizona case study (P-53)
Peter F. Ffolliott; Gerald J. Gottfried; Cody L. Stropki; L. J. Heidmann
2008-01-01
Information on the growth of forest plantations is necessary for planning of ecosystem-based management of the plantations. This information is also useful in validating or refining computer simulators that estimate plantation growth into the future. Such growth information has been obtained from a 45-year-old ponderosa pine (Pinus ponderosa) plantation in the Hart...
Defining Gas Turbine Engine Performance Requirements for the Large Civil TiltRotor (LCTR2)
NASA Technical Reports Server (NTRS)
Snyder, Christopher A.
2013-01-01
Defining specific engine requirements is a critical part of identifying technologies and operational models for potential future rotary wing vehicles. NASA's Fundamental Aeronautics Program, Subsonic Rotary Wing Project has identified the Large Civil TiltRotor (LCTR) as the configuration to best meet technology goals. This notional vehicle concept has evolved with more clearly defined mission and operational requirements to the LCTR-iteration 2 (LCTR2). This paper reports on efforts to further review and refine the LCTR2 analyses to ascertain specific engine requirements and propulsion sizing criteria. The baseline mission and other design or operational requirements are reviewed. Analysis tools are described to help understand their interactions and underlying assumptions. Various design and operational conditions are presented and explained for their contribution to defining operational and engine requirements. These identified engine requirements are discussed to suggest which are most critical to the engine sizing and operation. The most-critical engine requirements are compared to in-house NASA engine simulations to try to ascertain which operational requirements define engine requirements versus points within the available engine operational capability. Finally, results are summarized with suggestions for future efforts to improve analysis capabilities, and better define and refine mission and operational requirements.
Grain-refining heat treatments to improve cryogenic toughness of high-strength steels
NASA Technical Reports Server (NTRS)
Rush, H. F.
1984-01-01
The development of two high Reynolds number wind tunnels at NASA Langley Research Center which operate at cryogenic temperatures with high dynamic pressures has imposed severe requirements on materials for model construction. Existing commercial high strength steels lack sufficient toughness to permit their safe use at temperatures approaching that of liquid nitrogen (-320 F). Therefore, a program to improve the cryogenic toughness of commercial high strength steels was conducted. Significant improvement in the cryogenic toughness of commercial high strength martensitic and maraging steels was demonstrated through the use of grain refining heat treatments. Charpy impact strength at -320 F was increased by 50 to 180 percent for the various alloys without significant loss in tensile strength. The grain sizes of the 9 percent Ni-Co alloys and 200 grade maraging steels were reduced to 1/10 of the original size or smaller, with the added benefit of improved machinability. This grain refining technique should permit these alloys with ultimate strengths of 220 to 270 ksi to receive consideration for cryogenic service.
Aarons, Gregory A; Green, Amy E; Willging, Cathleen E; Ehrhart, Mark G; Roesch, Scott C; Hecht, Debra B; Chaffin, Mark J
2014-12-10
This study examines sustainment of an EBI implemented in 11 United States service systems across two states, and delivered in 87 counties. The aims are to 1) determine the impact of state and county policies and contracting on EBI provision and sustainment; 2) investigate the role of public, private, and academic relationships and collaboration in long-term EBI sustainment; 3) assess organizational and provider factors that affect EBI reach/penetration, fidelity, and organizational sustainment climate; and 4) integrate findings through a collaborative process involving the investigative team, consultants, and system and community-based organization (CBO) stakeholders in order to further develop and refine a conceptual model of sustainment to guide future research and provide a resource for service systems to prepare for sustainment as the ultimate goal of the implementation process. A mixed-method prospective and retrospective design will be used. Semi-structured individual and group interviews will be used to collect information regarding influences on EBI sustainment including policies, attitudes, and practices; organizational factors and external policies affecting model implementation; involvement of or collaboration with other stakeholders; and outer- and inner-contextual supports that facilitate ongoing EBI sustainment. Document review (e.g., legislation, executive orders, regulations, monitoring data, annual reports, agendas and meeting minutes) will be used to examine the roles of state, county, and local policies in EBI sustainment. Quantitative measures will be collected via administrative data and web surveys to assess EBI reach/penetration, staff turnover, EBI model fidelity, organizational culture and climate, work attitudes, implementation leadership, sustainment climate, attitudes toward EBIs, program sustainment, and level of institutionalization. Hierarchical linear modeling will be used for quantitative analyses. Qualitative analyses will be tailored to each of the qualitative methods (e.g., document review, interviews). Qualitative and quantitative approaches will be integrated through an inclusive process that values stakeholder perspectives. The study of sustainment is critical to capitalizing on and benefiting from the time and fiscal investments in EBI implementation. Sustainment is also critical to realizing broad public health impact of EBI implementation. The present study takes a comprehensive mixed-method approach to understanding sustainment and refining a conceptual model of sustainment.
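As a reference point for the hierarchical linear modeling named in the protocol, a generic two-level model has the form below; the nesting (providers within organizations) and the choice of predictors are illustrative assumptions, not the study's specified analysis.

\[
\text{Level 1: } Y_{ij} = \beta_{0j} + \beta_{1j} X_{ij} + r_{ij}, \qquad
\text{Level 2: } \beta_{0j} = \gamma_{00} + \gamma_{01} W_j + u_{0j}, \quad \beta_{1j} = \gamma_{10} + u_{1j},
\]

where Y_ij is an outcome (for example, fidelity) for provider i in organization j, X_ij is a provider-level predictor, and W_j is an organization-level predictor such as sustainment climate.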
Assessing cumulative impacts to elk and mule deer in the Salmon River Basin, Idaho
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Neil, T.A.; Witmer, G.W.
1988-01-01
In this paper, we illustrate the method, using the potential for cumulative impacts to elk and mule deer from multiple hydroelectric development in the Salmon River Basin of Idaho. We attempted to incorporate knowledge of elk and mule deer habitat needs into a paradigm to assess cumulative impacts and aid in the regulatory decision making process. Undoubtedly, other methods could be developed based on different needs or constraints, but we offer this technique as a means to further refine cumulative impact assessment. Our approach is divided into three phases: analysis, evaluation, and documentation. 36 refs., 2 figs., 3 tabs.
76 FR 41555 - Tupelo, Mississippi Railroad Relocation Project
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-14
... Parts 1500-1508, and the FRA NEPA procedures, 64 FR 28545 (May 26, 1999). FRA is the lead Federal agency and the Mississippi Department of Transportation (MDOT) is the lead State agency. The Environmental... minimize potential impacts. Such strategies would be further refined in subsequent environmental review...
Political Socialization: A Topical Bibliography
ERIC Educational Resources Information Center
Brauen, Marsha; Harmon, Kathryn Newcomer
1977-01-01
Identifies four major areas of recent investigations: cross-cultural studies of political socialization, the focus on the interactive nature of the individual in the process of learning about politics, the need to examine the comparative impacts of the various agencies of political socialization, and methodological and conceptual refinements.…
The GOAT Effect's Impact upon Educational R and D.
ERIC Educational Resources Information Center
Kean, Michael H.; McNamara, Thomas C.
1979-01-01
The "Goodbye To All That" (GOAT) Effect is introduced as a special research and evaluation "outcome" effect characterizing decision making unduly influenced by abandoning "write-off" tendencies. The "gradual refinement" approach offers an antidote to the GOAT Effect because it does not use the systems…
78 FR 46677 - Environmental Impact Statement; Calcasieu Parish, LA
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-01
... between the I-210 interchanges including the Calcasieu River Bridge. A feasibility and environmental study... project. The feasibility study involved four phases: (1) Information and Data Gathering; (2) Preliminary Study; (3) Refined Alternatives; and (4) Preparation and Submission of a Final Report. Based on the...
NASA Astrophysics Data System (ADS)
Simon, Patrick; Hilbert, Stefan
2018-05-01
Galaxies are biased tracers of the matter density on cosmological scales. For future tests of galaxy models, we refine and assess a method to measure galaxy biasing as a function of physical scale k with weak gravitational lensing. This method enables us to reconstruct the galaxy bias factor b(k) as well as the galaxy-matter correlation r(k) on spatial scales between 0.01 h Mpc-1 ≲ k ≲ 10 h Mpc-1 for redshift-binned lens galaxies below redshift z ≲ 0.6. In the refinement, we account for an intrinsic alignment of source ellipticities, and we correct for the magnification bias of the lens galaxies, relevant for the galaxy-galaxy lensing signal, to improve the accuracy of the reconstructed r(k). For simulated data, the reconstructions achieve an accuracy of 3-7% (68% confidence level) over the above k-range for a survey area and a typical depth of contemporary ground-based surveys. Realistically the accuracy is, however, probably reduced to about 10-15%, mainly by systematic uncertainties in the assumed intrinsic source alignment, the fiducial cosmology, and the redshift distributions of lens and source galaxies (in that order). Furthermore, our reconstruction technique employs physical templates for b(k) and r(k) that elucidate the impact of central galaxies and the halo-occupation statistics of satellite galaxies on the scale-dependence of galaxy bias, which we discuss in the paper. In a first demonstration, we apply this method to previous measurements in the Garching-Bonn Deep Survey and give a physical interpretation of the lens population.
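For reference, the scale-dependent bias factor and galaxy-matter correlation reconstructed in this work have the standard definitions in terms of the galaxy, matter, and galaxy-matter power spectra:

\[
b(k) = \sqrt{\frac{P_{\mathrm{g}}(k)}{P_{\mathrm{m}}(k)}}, \qquad
r(k) = \frac{P_{\mathrm{gm}}(k)}{\sqrt{P_{\mathrm{g}}(k)\,P_{\mathrm{m}}(k)}},
\]

so that r(k) = 1 corresponds to deterministic biasing; these are the quantities the weak-lensing reconstruction targets as functions of spatial scale k.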
NASA Systems Analysis and Concepts Directorate Mission and Trade Study Analysis
NASA Technical Reports Server (NTRS)
Ricks, Wendell; Guynn, Mark; Hahn, Andrew; Lepsch, Roger; Mazanek, Dan; Dollyhigh, Sam
2006-01-01
Mission analysis, as practiced by the NASA Langley Research Center's Systems Analysis and Concepts Directorate (SACD), consists of activities used to define, assess, and evaluate a wide spectrum of aerospace systems for given requirements. The missions for these systems encompass a broad range from aviation to space exploration. The customer, who is usually another NASA organization or another government agency, often predefines the mission. Once a mission is defined, the goals and objectives that the system will need to meet are delineated and quantified. A number of alternative systems are then typically developed and assessed relative to these goals and objectives. This is done in order to determine the most favorable design approaches for further refinement. Trade studies are performed in order to understand the impact of a requirement on each system and to select among competing design options. Items varied in trade studies typically include: design variables or design constraints; technology and subsystem options; and operational approaches. The results of trade studies are often used to refine the mission and system requirements. SACD studies have been integral to the decision processes of many organizations for decades. Many recent examples of SACD mission and trade study analyses illustrate their excellence and influence. The SACD-led, Agency-wide effort to analyze a broad range of future human lunar exploration scenarios for NASA's Exploration Systems Mission Directorate (ESMD) and the Mars airplane design study in support of the Aerial Regional-scale Environment Survey of Mars (ARES) mission are two such examples. This paper describes SACD's mission and trade study analysis activities in general and presents the lunar exploration and Mars airplane studies as examples of the type of work performed by SACD.
Framing international trade and chronic disease
2011-01-01
There is an emerging evidence base that global trade is linked with the rise of chronic disease in many low and middle-income countries (LMICs). This linkage is associated, in part, with the global diffusion of unhealthy lifestyles and health damaging products posing a particular challenge to countries still facing high burdens of communicable disease. We developed a generic framework which depicts the determinants and pathways connecting global trade with chronic disease. We then applied this framework to three key risk factors for chronic disease: unhealthy diets, alcohol, and tobacco. This led to specific 'product pathways', which can be further refined and used by health policy-makers to engage with their country's trade policy-makers around health impacts of ongoing trade treaty negotiations, and by researchers to continue refining an evidence base on how global trade is affecting patterns of chronic disease. The prevention and treatment of chronic diseases is now rising on global policy agendas, highlighted by the UN Summit on Noncommunicable Diseases (September 2011). Briefs and declarations leading up to this Summit reference the role of globalization and trade in the spread of risk factors for these diseases, but emphasis is placed on interventions to change health behaviours and on voluntary corporate responsibility. The findings summarized in this article imply the need for a more concerted approach to regulate trade-related risk factors and thus more engagement between health and trade policy sectors within and between nations. An explicit recognition of the role of trade policies in the spread of noncommunicable disease risk factors should be a minimum outcome of the September 2011 Summit, with a commitment to ensure that future trade treaties do not increase such risks. PMID:21726434
Framing international trade and chronic disease.
Labonté, Ronald; Mohindra, Katia S; Lencucha, Raphael
2011-07-04
There is an emerging evidence base that global trade is linked with the rise of chronic disease in many low and middle-income countries (LMICs). This linkage is associated, in part, with the global diffusion of unhealthy lifestyles and health damaging products posing a particular challenge to countries still facing high burdens of communicable disease. We developed a generic framework which depicts the determinants and pathways connecting global trade with chronic disease. We then applied this framework to three key risk factors for chronic disease: unhealthy diets, alcohol, and tobacco. This led to specific 'product pathways', which can be further refined and used by health policy-makers to engage with their country's trade policy-makers around health impacts of ongoing trade treaty negotiations, and by researchers to continue refining an evidence base on how global trade is affecting patterns of chronic disease. The prevention and treatment of chronic diseases is now rising on global policy agendas, highlighted by the UN Summit on Noncommunicable Diseases (September 2011). Briefs and declarations leading up to this Summit reference the role of globalization and trade in the spread of risk factors for these diseases, but emphasis is placed on interventions to change health behaviours and on voluntary corporate responsibility. The findings summarized in this article imply the need for a more concerted approach to regulate trade-related risk factors and thus more engagement between health and trade policy sectors within and between nations. An explicit recognition of the role of trade policies in the spread of noncommunicable disease risk factors should be a minimum outcome of the September 2011 Summit, with a commitment to ensure that future trade treaties do not increase such risks.
Functional Fault Model Development Process to Support Design Analysis and Operational Assessment
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.
2016-01-01
A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
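A minimal sketch (Python) of the core FFM idea described above: failure effects propagate along directed paths from a failure mode toward points capable of observing them. The graph, node names, and traversal below are hypothetical illustrations, not NASA's FFM tooling or model content.

    from collections import deque

    # Hypothetical propagation graph: edges point from a cause to the effects it produces.
    propagation = {
        "valve_stuck_closed": ["low_downstream_pressure"],
        "low_downstream_pressure": ["pressure_sensor_P1", "pump_cavitation"],
        "pump_cavitation": ["vibration_sensor_V2"],
    }

    def observable_effects(failure_mode, graph, observation_points):
        """Return the observation points reachable from a given failure mode."""
        seen, queue, hits = set(), deque([failure_mode]), set()
        while queue:
            node = queue.popleft()
            if node in seen:
                continue
            seen.add(node)
            if node in observation_points:
                hits.add(node)
            queue.extend(graph.get(node, []))
        return hits

    print(observable_effects("valve_stuck_closed", propagation,
                             {"pressure_sensor_P1", "vibration_sensor_V2"}))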
Identifying functional cancer-specific miRNA-mRNA interactions in testicular germ cell tumor.
Sedaghat, Nafiseh; Fathy, Mahmood; Modarressi, Mohammad Hossein; Shojaie, Ali
2016-09-07
Testicular cancer is the most common cancer in men aged between 15 and 35 and more than 90% of testicular neoplasms are originated at germ cells. Recent research has shown the impact of microRNAs (miRNAs) in different types of cancer, including testicular germ cell tumor (TGCT). MicroRNAs are small non-coding RNAs which affect the development and progression of cancer cells by binding to mRNAs and regulating their expressions. The identification of functional miRNA-mRNA interactions in cancers, i.e. those that alter the expression of genes in cancer cells, can help delineate post-regulatory mechanisms and may lead to new treatments to control the progression of cancer. A number of sequence-based methods have been developed to predict miRNA-mRNA interactions based on the complementarity of sequences. While necessary, sequence complementarity is, however, not sufficient for presence of functional interactions. Alternative methods have thus been developed to refine the sequence-based interactions using concurrent expression profiles of miRNAs and mRNAs. This study aims to find functional cancer-specific miRNA-mRNA interactions in TGCT. To this end, the sequence-based predicted interactions are first refined using an ensemble learning method, based on two well-known methods of learning miRNA-mRNA interactions, namely, TaLasso and GenMiR++. Additional functional analyses were then used to identify a subset of interactions to be most likely functional and specific to TGCT. The final list of 13 miRNA-mRNA interactions can be potential targets for identifying TGCT-specific interactions and future laboratory experiments to develop new therapies. Copyright © 2016 Elsevier Ltd. All rights reserved.
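The abstract states that sequence-based predictions were refined with an ensemble built on TaLasso and GenMiR++ scores but does not give the combination rule; the sketch below (Python) shows one simple possibility, averaging min-max-normalized scores and keeping high-scoring pairs. The rule, threshold, and names are assumptions for illustration only.

    # Hypothetical ensemble rule: average of min-max normalized scores from two methods.
    def normalize(scores):
        lo, hi = min(scores.values()), max(scores.values())
        return {k: (v - lo) / (hi - lo) if hi > lo else 0.0 for k, v in scores.items()}

    def ensemble_interactions(talasso_scores, genmir_scores, threshold=0.5):
        """Keep miRNA-mRNA pairs whose averaged normalized score exceeds the threshold."""
        common = talasso_scores.keys() & genmir_scores.keys()
        t, g = normalize(talasso_scores), normalize(genmir_scores)
        return {pair for pair in common if (t[pair] + g[pair]) / 2 >= threshold}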
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dam, William; Campbell, Sam; Johnson, Ray
Milling activities at a former uranium mill site near Riverton, Wyoming, USA, contaminated the shallow groundwater beneath and downgradient of the site. Although the mill operated for <6 years (1958-1963), its impact remains an environmental liability. Groundwater modeling predicted that contaminant concentrations were declining steadily, which confirmed the conceptual site model (CSM). However, local flooding in 2010 mobilized contaminants that migrated downgradient from the Riverton site and resulted in a dramatic increase in groundwater contaminant concentrations. This observation indicated that the original CSM was inadequate to explain site conditions and needed to be refined. In response to the new observations after the flood, a collaborative investigation to better understand site conditions and processes commenced. This investigation included installing 103 boreholes to collect soil and groundwater samples, sampling and analysis of evaporite minerals along the bank of the Little Wind River, an analysis of evapotranspiration in the shallow aquifer, and sampling naturally organic-rich sediments near groundwater discharge areas. The enhanced characterization revealed that the existing CSM did not account for high uranium concentrations in groundwater remaining on the former mill site and groundwater plume stagnation near the Little Wind River. Observations from the flood and subsequent investigations indicate that additional characterization is still needed to continue refining the CSM and determine the viability of the natural flushing compliance strategy. Additional sampling, analysis, and testing of soil and groundwater are necessary to investigate secondary contaminant sources, mobilization of contaminants during floods, geochemical processes, contaminant plume stagnation, distribution of evaporite minerals and organic-rich sediments, and mechanisms and rates of contaminant transfer from soil to groundwater. Future data collection will be used to continually revise the CSM and evaluate the compliance strategy at the site.
NASA's Advanced Exploration Systems Mars Transit Habitat Refinement Point of Departure Design
NASA Technical Reports Server (NTRS)
Simon, Matthew; Latorella, Kara; Martin, John; Cerro, Jeff; Lepsch, Roger; Jefferies, Sharon; Goodliff, Kandyce; McCleskey, Carey; Smitherman, David; Stromgren, Chel
2017-01-01
This paper describes the recently developed point of departure design for a long duration, reusable Mars Transit Habitat, which was established during a 2016 NASA habitat design refinement activity supporting the definition of NASA's Evolvable Mars Campaign. As part of its development of sustainable human Mars mission concepts achievable in the 2030s, the Evolvable Mars Campaign has identified desired durations and mass/dimensional limits for long duration Mars habitat designs to enable the currently assumed solar electric and chemical transportation architectures. The Advanced Exploration Systems Mars Transit Habitat Refinement Activity brought together habitat subsystem design expertise from across NASA to develop an increased fidelity, consensus design for a transit habitat within these constraints. The resulting design and data (including a mass equipment list) contained in this paper are intended to help teams across the agency and potential commercial, academic, or international partners understand: 1) the current architecture/habitat guidelines and assumptions, 2) performance targets of such a habitat (particularly in mass, volume, and power), 3) the driving technology/capability developments and architectural solutions which are necessary for achieving these targets, and 4) mass reduction opportunities and research/design needs to inform the development of future research and proposals. Data presented includes: an overview of the habitat refinement activity including motivation and process when informative; full documentation of the baseline design guidelines and assumptions; detailed mass and volume breakdowns; a moderately detailed concept of operations; a preliminary interior layout design with rationale; a list of the required capabilities necessary to enable the desired mass; and identification of any worthwhile trades/analyses which could inform future habitat design efforts. As a whole, the data in the paper show that a transit habitat meeting the 43 metric tons launch mass/trans-Mars injection burn limits specified by the Evolvable Mars Campaign is achievable near the desired timeframe with moderate strategic investments including maintainable life support systems, repurposable structures and packaging, and lightweight exercise modalities. It also identifies operational and technological options to reduce this mass to less than 41 metric tons including staging of launch structure/packaging and alternate structural materials.
Craft, Brian D; Nagy, Kornél; Seefelder, Walburga; Dubois, Mathieu; Destaillats, Frédéric
2012-05-01
In a previous work, it was shown that at high temperatures (up to 280°C) glycidyl esters (GE) are formed from diacylglycerols (DAG) via elimination of free fatty acid (FFA). In the present study, the impact of DAG content and temperature on the formation of GE using a model vacuum system mimicking industrial edible oil deodorization is investigated. These deodorization experiments confirmed that the formation of GE from DAG is extensive at temperatures above 230-240°C, and therefore, this value should be considered as an upper limit for refining operations. Furthermore, experimental data suggest that the formation of GE accelerates in particular when the DAG levels in refined oils exceed 3-4% of total lipids. Analysis of the lipid composition of crude palm oil (CPO) samples allowed the estimation that this critical DAG content corresponds to about 1.9-2.5% of FFA, which is the conventional quality marker of CPO. Moreover, high levels (>100ppm) of GE were also found in palm fatty acid distillate samples, which may indicate that the level of GE in fully refined palm oils also depends on the elimination rate of GE into the fatty acid distillate. Copyright © 2011 Elsevier Ltd. All rights reserved.
Lincoln Advanced Science and Engineering Reinforcement (LASER) program
NASA Technical Reports Server (NTRS)
Williams, Willie E.
1989-01-01
Lincoln University, under the Lincoln Advanced Science and Engineering Reinforcement (LASER) Program, has identified and successfully recruited over 100 students for majors in technical fields. To date, over 70 percent of these students have completed or will complete technical degrees in engineering, physics, chemistry, and computer science. Of those completing the undergraduate degree, over 40 percent have gone on to graduate and professional schools. This success is attributable to well planned approaches to student recruitment, training, personal motivation, retention, and program staff. Very closely coupled to the above factors is a focus designed to achieve excellence in program services and student performance. Future contributions by the LASER Program to the pool of technical minority graduates will have a significant impact. This is already evident from the success of the students that began the first year of the program. With program plans to refine many of the already successful techniques, follow-on activities are expected to make even greater contributions to the availability of technically trained minorities. For example, undergraduate research exposure, broadened summer, and co-op work experiences will be enhanced.
Protein-based materials, toward a new level of structural control.
van Hest, J C; Tirrell, D A
2001-10-07
Through billions of years of evolution nature has created and refined structural proteins for a wide variety of specific purposes. Amino acid sequences and their associated folding patterns combine to create elastic, rigid or tough materials. In many respects, nature's intricately designed products provide challenging examples for materials scientists, but translation of natural structural concepts into bio-inspired materials requires a level of control of macromolecular architecture far higher than that afforded by conventional polymerization processes. An increasingly important approach to this problem has been to use biological systems for production of materials. Through protein engineering, artificial genes can be developed that encode protein-based materials with desired features. Structural elements found in nature, such as beta-sheets and alpha-helices, can be combined with great flexibility, and can be outfitted with functional elements such as cell binding sites or enzymatic domains. The possibility of incorporating non-natural amino acids increases the versatility of protein engineering still further. It is expected that such methods will have large impact in the field of materials science, and especially in biomedical materials science, in the future.
The ethical dimension in published animal research in critical care: the dark side of our moon.
Huet, Olivier; de Haan, Judy B
2014-03-13
The replacement, refinement, and reduction (3Rs) guidelines are the cornerstone of animal welfare practice for medical research. Nowadays, no animal research can be performed without being approved by an animal ethics committee. Therefore, we should expect that any published article would respect and promote the highest standard of animal welfare. However, in the previous issue of Critical Care, Bara and Joffe reported an unexpected finding: animal welfare is extremely poorly reported in critical care research publications involving animal models. This may have a significant negative impact on the reliability of the results and on future funding for our research. The ability of septic shock animal models to translate into clinical studies has been a challenge. Therefore, every means to improve the quality of these models should be pursued. Animal welfare issues should be seen as an additional benefit to achieve this goal. It is therefore critical to draw conclusions from this study to improve the standard of animal welfare in critical care research. This has already been achieved in other fields of research, and we should follow their example.
Leppin, Aaron L.; Montori, Victor M.; Gionfriddo, Michael R.
2015-01-01
An increasing proportion of healthcare resources in the United States are directed toward an expanding group of complex and multimorbid patients. Federal stakeholders have called for new models of care to meet the needs of these patients. Minimally Disruptive Medicine (MDM) is a theory-based, patient-centered, and context-sensitive approach to care that focuses on achieving patient goals for life and health while imposing the smallest possible treatment burden on patients’ lives. The MDM Care Model is designed to be pragmatically comprehensive, meaning that it aims to address any and all factors that impact the implementation and effectiveness of care for patients with multiple chronic conditions. It comprises core activities that map to an underlying and testable theoretical framework. This encourages refinement and future study. Here, we present the conceptual rationale for and a practical approach to minimally disruptive care for patients with multiple chronic conditions. We introduce some of the specific tools and strategies that can be used to identify the right care for these patients and to put it into practice. PMID:27417747
The evolution of global disaster risk assessments: from hazard to global change
NASA Astrophysics Data System (ADS)
Peduzzi, Pascal
2013-04-01
The perception of disaster risk as a dynamic process interlinked with global change is a fairly recent concept. It gradually emerged as an evolution from new scientific theories, currents of thinking and lessons learned from large disasters since the 1970s. The interest was further heightened, in the mid-1980s, by the Chernobyl nuclear accident and the discovery of the ozone layer hole, both bringing awareness that dangerous hazards can generate global impacts. The creation of the UN International Decade for Natural Disaster Reduction (IDNDR) and the publication of the first IPCC report in 1990 reinforced interest in global risk assessment. The first global risk models, incorporating hazard, exposure, and vulnerability components, became available in the mid-2000s. Since then, increased computing power and finer dataset resolutions have led to more numerous and sophisticated global risk models. This article presents a recent history of global disaster risk models, the current status of research for the Global Assessment Report on Disaster Risk Reduction (GAR 2013), and future challenges and limitations for the development of next generation global disaster risk models.
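The hazard, exposure, and vulnerability components mentioned above are commonly combined in global risk indexing through the schematic relation

\[
\text{Risk} \;\approx\; \text{Hazard} \times \text{Exposure} \times \text{Vulnerability},
\]

where hazard captures the frequency and severity of the event, exposure the population or assets in harm's way, and vulnerability their susceptibility to loss; the exact functional form varies between models, and this expression is given only for orientation.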
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrell, John T; Holladay, John; Wagner, Robert
The U.S. Department of Energy's (DOE's) Co-Optimization of Fuels & Engines (Co-Optima) initiative is conducting the early-stage research needed to accelerate the market introduction of advanced fuel and engine technologies. The research includes both spark-ignition (SI) and compression-ignition (CI) combustion approaches, targeting applications that impact the entire on-road fleet (light-, medium-, and heavy-duty vehicles). The initiative's major goals include significant improvements in vehicle fuel economy, lower-cost pathways to reduce emissions, and leveraging diverse U.S. fuel resources. A key objective of Co-Optima's research is to identify new blendstocks that enhance current petroleum blending components, increase blendstock diversity, and provide refiners with increased flexibility to blend fuels with the key properties required to optimize advanced internal combustion engines. This report identifies eight representative blendstocks from five chemical families that have demonstrated the potential to increase boosted SI engine efficiency, meet key fuel quality requirements, and be viable for production at commercial scale by 2025-2030.
Factors shaping the evolution of electronic documentation systems
NASA Technical Reports Server (NTRS)
Dede, Christopher J.; Sullivan, Tim R.; Scace, Jacque R.
1990-01-01
The main goal is to prepare the space station technical and managerial structure for likely changes in the creation, capture, transfer, and utilization of knowledge. By anticipating advances, the design of Space Station Project (SSP) information systems can be tailored to facilitate a progression of increasingly sophisticated strategies as the space station evolves. Future generations of advanced information systems will use increases in power to deliver environmentally meaningful, contextually targeted, interconnected data (knowledge). The concept of a Knowledge Base Management System emerges when the problem is framed as how information systems can perform such a conversion of raw data. Such a system would include traditional management functions for large space databases. Added artificial intelligence features might encompass co-existing knowledge representation schemes; effective control structures for deductive, plausible, and inductive reasoning; means for knowledge acquisition, refinement, and validation; explanation facilities; and dynamic human intervention. The major areas covered include: alternative knowledge representation approaches; advanced user interface capabilities; computer-supported cooperative work; the evolution of information system hardware; standardization, compatibility, and connectivity; and organizational impacts of information intensive environments.
Nuclear Energy and Synthetic Liquid Transportation Fuels
NASA Astrophysics Data System (ADS)
McDonald, Richard
2012-10-01
This talk will propose a plan to combine nuclear reactors with the Fischer-Tropsch (F-T) process to produce synthetic carbon-neutral liquid transportation fuels from sea water. These fuels can be formed from the hydrogen and carbon dioxide in sea water and will burn to water and carbon dioxide in a cycle powered by nuclear reactors. The F-T process was developed nearly 100 years ago as a method of synthesizing liquid fuels from coal. This process presently provides commercial liquid fuels in South Africa, Malaysia, and Qatar, mainly using natural gas as a feedstock. Nuclear energy can be used to separate water into hydrogen and oxygen as well as to extract carbon dioxide from sea water using ion exchange technology. The carbon dioxide and hydrogen react to form synthesis gas, the mixture needed at the beginning of the F-T process. Following further refining, the products, typically diesel and Jet-A, can use existing infrastructure and can power conventional engines with little or no modification. We can then use these carbon-neutral liquid fuels conveniently long into the future with few adverse environmental impacts.
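The overall chemistry described above can be summarized with standard stoichiometry: water splitting to supply hydrogen, reverse water-gas shift to convert captured carbon dioxide into synthesis gas, and Fischer-Tropsch chain growth to hydrocarbons. These generic equations are given for orientation only; they do not specify the catalysts or operating conditions of the proposed plant.

\[
2\,\mathrm{H_2O} \rightarrow 2\,\mathrm{H_2} + \mathrm{O_2}, \qquad
\mathrm{CO_2} + \mathrm{H_2} \rightleftharpoons \mathrm{CO} + \mathrm{H_2O}, \qquad
n\,\mathrm{CO} + (2n{+}1)\,\mathrm{H_2} \rightarrow \mathrm{C}_n\mathrm{H}_{2n+2} + n\,\mathrm{H_2O}.
\]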
Dickey, David M; Jagiela, Steven; Fetters, Dennis
2003-01-01
In order to assess the current performance and to identify future growth opportunities of an in-house biomedical engineering (BME) program, senior management of Lehigh Valley Hospital (Allentown, Penn) engaged (in July 2001) the services of a clinical engineering consultant. Although the current in-house program was both functionally and financially sound, an independent audit had not been performed in over 4 years, and there were growing concerns by the BME staff related to the department's future leadership and long-term support from senior management. After an initial 2-month audit of the existing program, the consultant presented 41 separate recommendations for management's consideration. In order to refine and implement these recommendations, 5 separate committees were established to further evaluate a consolidated version of them, with the consultant acting as the facilitator for each group. Outcomes from each of the committees were used in the development of a formal business plan, which, upon full implementation, would not only strengthen and refine the current in-house service model but could also result in a substantial 3-year cost savings for the organization ($1,100,000 from existing operations, $500,000 in cost avoidance by in-sourcing postwarranty support of future capital equipment acquisitions). Another key outcome of the project was related to the development of a new master policy, titled the "Medical Equipment Management Program," complete with a newly defined state-of-the-art equipment scheduled inspection frequency model.
Zhang, Cindy; Ball, Jonathan; Panzica-Kelly, Julie; Augustine-Rauch, Karen
2016-04-18
There has been increasing focus on generation and assessment of in vitro developmental toxicology models for assessing teratogenic liability of chemicals. The driver for this focus has been to find reliable in vitro assays that will reduce or replace the use of in vivo tests for assessing teratogenicity. Such efforts may be eventually applied in testing pharmaceutical agents where a developmental toxicology assay or battery of assays may be incorporated into regulatory testing to replace one of the two species currently used in teratogenic assessment. Such assays may be eventually applied in testing a broader spectrum of chemicals, supporting efforts aligned with Tox21 strategies and responding to REACH legislation. This review describes the developmental toxicology assays that are of focus in these assessments: rodent whole embryo culture, zebrafish embryo assays, and embryonic stem cell assays. Progress on assay development as well as future directions of how these assays are envisioned to be applied for broader safety testing of chemicals are discussed. Altogether, the developmental model systems described in this review provide rich biological systems that can be utilized in better understanding teratogenic mechanisms of action of chemotypes and are promising in providing proactive safety assessment related to developmental toxicity. Continual advancements in refining/optimizing these in vitro assays are anticipated to provide a robust data set to provide thoughtful assessment of how whole animal teratogenicity evaluations can be reduced/refined in the future.
Reed, Susan G.; Adibi, Shawn S.; Coover, Mullen; Gellin, Robert G.; Wahlquist, Amy E.; AbdulRahiman, Anitha; Hamil, Lindsey H.; Walji, Muhammad F.; O’Neill, Paula; Kalenderian, Elsbeth
2015-01-01
The Consortium for Oral Health Research and Informatics (COHRI) is leading the way in use of the Dental Diagnostic System (DDS) terminology in the axiUm electronic health record (EHR). This collaborative pilot study had two aims: 1) to investigate whether use of the DDS terms positively impacted predoctoral dental students’ critical thinking skills measured by the Health Sciences Reasoning Test (HSRT), and 2) to refine study protocols. The study design was a natural experiment with cross-sectional data collection using the HSRT for 15 classes (2013–17) of students at three dental schools. Characteristics of students who had been exposed to the DDS terms were compared with students who had not, and the differences were tested by t-tests or chi-square tests. Generalized linear models were used to evaluate the relationship between exposure and outcome on the overall critical thinking score. The results showed that exposure was significantly related to overall score (p=0.01), with not-exposed students having lower mean overall scores. This study thus demonstrated a positive impact of using the DDS terminology in an EHR on the critical thinking skills of predoctoral dental students in three COHRI schools as measured by their overall score on the HSRT. These preliminary findings support future research to further evaluate a proposed model of critical thinking in clinical dentistry. PMID:26034034
Reed, Susan G; Adibi, Shawn S; Coover, Mullen; Gellin, Robert G; Wahlquist, Amy E; AbdulRahiman, Anitha; Hamil, Lindsey H; Walji, Muhammad F; O'Neill, Paula; Kalenderian, Elsbeth
2015-06-01
The Consortium for Oral Health Research and Informatics (COHRI) is leading the way in use of the Dental Diagnostic System (DDS) terminology in the axiUm electronic health record (EHR). This collaborative pilot study had two aims: 1) to investigate whether use of the DDS terms positively impacted predoctoral dental students' critical thinking skills measured by the Health Sciences Reasoning Test (HSRT), and 2) to refine study protocols. The study design was a natural experiment with cross-sectional data collection using the HSRT for 15 classes (2013-17) of students at three dental schools. Characteristics of students who had been exposed to the DDS terms were compared with students who had not, and the differences were tested by t-tests or chi-square tests. Generalized linear models were used to evaluate the relationship between exposure and outcome on the overall critical thinking score. The results showed that exposure was significantly related to overall score (p=0.01), with not-exposed students having lower mean overall scores. This study thus demonstrated a positive impact of using the DDS terminology in an EHR on the critical thinking skills of predoctoral dental students in three COHRI schools as measured by their overall score on the HSRT. These preliminary findings support future research to further evaluate a proposed model of critical thinking in clinical dentistry.
An adaptive discontinuous Galerkin solver for aerodynamic flows
NASA Astrophysics Data System (ADS)
Burgess, Nicholas K.
This work considers the accuracy, efficiency, and robustness of an unstructured high-order accurate discontinuous Galerkin (DG) solver for computational fluid dynamics (CFD). Recently, there has been a drive to reduce the discretization error of CFD simulations using high-order methods on unstructured grids. However, high-order methods are often criticized for lacking robustness and having high computational cost. The goal of this work is to investigate methods that enhance the robustness of high-order discontinuous Galerkin (DG) methods on unstructured meshes, while maintaining low computational cost and high accuracy of the numerical solutions. This work investigates robustness enhancement of high-order methods by examining effective non-linear solvers, shock capturing methods, turbulence model discretizations and adaptive refinement techniques. The goal is to develop an all-encompassing solver that can simulate a large range of physical phenomena, where all aspects of the solver work together to achieve a robust, efficient and accurate solution strategy. The components and framework for a robust high-order accurate solver that is capable of solving viscous, Reynolds Averaged Navier-Stokes (RANS) and shocked flows are presented. In particular, this work discusses robust discretizations of the turbulence model equation used to close the RANS equations, as well as stable shock capturing strategies that are applicable across a wide range of discretization orders and applicable to very strong shock waves. Furthermore, refinement techniques are considered as both efficiency and robustness enhancement strategies. Additionally, efficient non-linear solvers based on multigrid and Krylov subspace methods are presented. The accuracy, efficiency, and robustness of the solver are demonstrated using a variety of challenging aerodynamic test problems, which include turbulent high-lift and viscous hypersonic flows. Adaptive mesh refinement was found to play a critical role in obtaining a robust and efficient high-order accurate flow solver. A goal-oriented error estimation technique has been developed to estimate the discretization error of simulation outputs. For high-order discretizations, it is shown that functional output error super-convergence can be obtained, provided the discretization satisfies a property known as dual consistency. The dual consistency of the DG methods developed in this work is shown via mathematical analysis and numerical experimentation. Goal-oriented error estimation is also used to drive an hp-adaptive mesh refinement strategy, where a combination of mesh or h-refinement, and order or p-enrichment, is employed based on the smoothness of the solution. The results demonstrate that the combination of goal-oriented error estimation and hp-adaptation yields superior accuracy, as well as enhanced robustness and efficiency for a variety of aerodynamic flows including flows with strong shock waves. This work demonstrates that DG discretizations can be the basis of an accurate, efficient, and robust CFD solver. Furthermore, enhancing the robustness of DG methods does not adversely impact the accuracy or efficiency of the solver for challenging and complex flow problems. In particular, when considering the computation of shocked flows, this work demonstrates that the available shock capturing techniques are sufficiently accurate and robust, particularly when used in conjunction with adaptive mesh refinement.
This work also demonstrates that robust solutions of the Reynolds Averaged Navier-Stokes (RANS) and turbulence model equations can be obtained for complex and challenging aerodynamic flows. In this context, the most robust strategy was determined to be a low-order turbulence model discretization coupled to a high-order discretization of the RANS equations. Although RANS solutions using high-order accurate discretizations of the turbulence model were obtained, the behavior of current-day RANS turbulence models discretized to high-order was found to be problematic, leading to solver robustness issues. This suggests that future work is warranted in the area of turbulence model formulation for use with high-order discretizations. Alternately, the use of Large-Eddy Simulation (LES) subgrid scale models with high-order DG methods offers the potential to leverage the high accuracy of these methods for very high fidelity turbulent simulations. This thesis has developed the algorithmic improvements that will lay the foundation for the development of a three-dimensional high-order flow solution strategy that can be used as the basis for future LES simulations.
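To make the hp-decision concrete, the following is a minimal illustrative sketch in Python of the kind of rule described above: elements whose goal-oriented error estimate exceeds a tolerance are p-enriched where the solution is smooth and h-refined where it is not. The attribute names, thresholds, and smoothness indicator are assumptions for illustration, not the thesis' actual implementation.

    from collections import namedtuple

    Element = namedtuple("Element", "id error smoothness")

    def hp_adapt(elements, error_tol=1e-4, smoothness_tol=1.0):
        """Assign an adaptation action to each element.

        error      : goal-oriented (adjoint-weighted) error estimate for the element
        smoothness : indicator such as the decay rate of modal coefficients
        Thresholds and indicators are illustrative assumptions.
        """
        actions = {}
        for e in elements:
            if e.error <= error_tol:
                actions[e.id] = "keep"        # output error already acceptable
            elif e.smoothness >= smoothness_tol:
                actions[e.id] = "p-enrich"    # smooth solution: raise polynomial order
            else:
                actions[e.id] = "h-refine"    # non-smooth (e.g. near a shock): subdivide
        return actions

    # Example: a smooth element and a shock-containing element, both above tolerance
    print(hp_adapt([Element(0, 1e-2, 2.5), Element(1, 1e-2, 0.3), Element(2, 1e-6, 2.0)]))
    # {0: 'p-enrich', 1: 'h-refine', 2: 'keep'}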
Update of Forest Service Research Data
Terence L. Wagner
2002-01-01
The U.S. Forest Service undertakes research to improve the protection of wood products against subterranean termite damage, define the role of termites in forest ecosystems, and understand their impact on forest health. Specifically, the Wood Products Insect Research Unit concentrates efforts on developing, refining, and assessing new and alternative compounds,...
Impact of Culture on Breast Cancer Screening in Chinese American Women
2006-09-01
developed and refined based on previous findings of cultural and language barriers to breast cancer screening in Chinese women. In Year 2, two hundred and fifty Chinese women aged 50 and older in the Washington, DC area completed a telephone interview regarding their previous screening experience
An Approach to Poiseuille's Law in an Undergraduate Laboratory Experiment
ERIC Educational Resources Information Center
Sianoudis, I. A.; Drakaki, E.
2008-01-01
The continuous growth of computer and sensor technology allows many researchers to develop simple modifications and/or refinements to standard educational experiments, making them more attractive and comprehensible to students and thus increasing their educational impact. In the framework of this approach, the present study proposes an alternative…
Gagnier, Joel J; Derosier, Joseph M; Maratt, Joseph D; Hake, Mark E; Bagian, James P
2016-06-01
To develop, implement and test the effect of a handoff tool for orthopaedic trauma residents that reduces adverse events associated with the omission of critical information and the transfer of erroneous information. Components of this project included a literature review, resident surveys and observations, checklist development and refinement, implementation and evaluation of impact on adverse events through a chart review of a prospective cohort compared with a historical control group. Large teaching hospital. Findings of a literature review were presented to orthopaedic residents, epidemiologists, orthopaedic surgeons and patient safety experts in face-to-face meetings, during which we developed and refined the contents of a resident handoff tool. The tool was tested in an orthopaedic trauma service and its impact on adverse events was evaluated through a chart review. The handoff tool was developed and refined during the face-to-face meetings and a pilot implementation. Adverse event data were collected on 127 patients (n = 67 baseline period; n = 60 test period). A handoff tool for use by orthopaedic residents. Adverse events in patients handed off by orthopaedic trauma residents. After controlling for age, gender and comorbidities, testing resulted in fewer events per person (25-27% reduction; P < 0.10). Preliminary evidence suggests that our resident handoff tool may contribute to a decrease in adverse events in orthopaedic patients. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
Estimating changes in urban land and urban population using refined areal interpolation techniques
NASA Astrophysics Data System (ADS)
Zoraghein, Hamidreza; Leyk, Stefan
2018-05-01
The analysis of changes in urban land and population is important because the majority of future population growth will take place in urban areas. U.S. Census historically classifies urban land using population density and various land-use criteria. This study analyzes the reliability of census-defined urban lands for delineating the spatial distribution of urban population and estimating its changes over time. To overcome the problem of incompatible enumeration units between censuses, regular areal interpolation methods including Areal Weighting (AW) and Target Density Weighting (TDW), with and without spatial refinement, are implemented. The goal in this study is to estimate urban population in Massachusetts in 1990 and 2000 (source zones), within tract boundaries of the 2010 census (target zones), respectively, to create a consistent time series of comparable urban population estimates from 1990 to 2010. Spatial refinement is done using ancillary variables such as census-defined urban areas, the National Land Cover Database (NLCD) and the Global Human Settlement Layer (GHSL) as well as different combinations of them. The study results suggest that census-defined urban areas alone are not necessarily the most meaningful delineation of urban land. Instead, it appears that alternative combinations of the above-mentioned ancillary variables can better depict the spatial distribution of urban land, and thus make it possible to reduce the estimation error in transferring the urban population from source zones to target zones when running spatially-refined temporal areal interpolation.
NASA Astrophysics Data System (ADS)
Zoraghein, H.; Leyk, S.; Balk, D.
2017-12-01
The analysis of changes in urban land and population is important because the majority of future population growth will take place in urban areas. The U.S. Census historically classifies urban land using population density and various land-use criteria. This study analyzes the reliability of census-defined urban lands for delineating the spatial distribution of urban population and estimating its changes over time. To overcome the problem of incompatible enumeration units between censuses, regular areal interpolation methods including Areal Weighting (AW) and Target Density Weighting (TDW), with and without spatial refinement, are implemented. The goal in this study is to estimate urban population in Massachusetts in 1990 and 2000 (source zones), within tract boundaries of the 2010 census (target zones), respectively, to create a consistent time series of comparable urban population estimates from 1990 to 2010. Spatial refinement is done using ancillary variables such as census-defined urban areas, the National Land Cover Database (NLCD) and the Global Human Settlement Layer (GHSL) as well as different combinations of them. The study results suggest that census-defined urban areas alone are not necessarily the most meaningful delineation of urban land. Instead it appears that alternative combinations of the above-mentioned ancillary variables can better depict the spatial distribution of urban land, and thus make it possible to reduce the estimation error in transferring the urban population from source zones to target zones when running spatially-refined temporal areal interpolation.
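A minimal sketch (in Python) of the simple areal weighting step described above: each source-zone count is apportioned to target zones in proportion to overlap area. The data structures are assumptions for illustration; in the study the overlaps come from census tract geometries, and the spatially refined variants weight the overlaps by ancillary layers (census urban areas, NLCD, GHSL) rather than by raw area.

    def areal_weighting(source_counts, source_areas, intersection_areas):
        """Apportion source-zone counts (e.g. 1990/2000 urban population) to target zones.

        source_counts[s]         : count in source zone s
        source_areas[s]          : area of source zone s
        intersection_areas[s][t] : overlap area of source zone s with target zone t
        Plain dicts are used here purely for illustration.
        """
        target_counts = {}
        for s, count in source_counts.items():
            for t, overlap in intersection_areas.get(s, {}).items():
                share = overlap / source_areas[s]            # areal weight
                target_counts[t] = target_counts.get(t, 0.0) + count * share
        return target_counts

    # Example: one 1990 tract split 30/70 between two 2010 tracts
    print(areal_weighting({"A": 1000}, {"A": 10.0}, {"A": {"T1": 3.0, "T2": 7.0}}))
    # {'T1': 300.0, 'T2': 700.0}

Target Density Weighting follows the same apportionment pattern but weights the shares by the density of an ancillary variable in the target zones rather than by area alone.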
Beckley, Lynnath E.; Kobryn, Halina T.; Lombard, Amanda T.; Radford, Ben; Heyward, Andrew
2016-01-01
Marine protected area (MPA) designs are likely to require iterative refinement as new knowledge is gained. In particular, there is an increasing need to consider the effects of climate change, especially the ability of ecosystems to resist and/or recover from climate-related disturbances, within the MPA planning process. However, there has been limited research addressing the incorporation of climate change resilience into MPA design. This study used Marxan conservation planning software with fine-scale shallow water (<20 m) bathymetry and habitat maps, models of major benthic communities for deeper water, and comprehensive human use information from Ningaloo Marine Park in Western Australia to identify climate change resilience features to integrate into the incremental refinement of the marine park. The study assessed the representation of benthic habitats within the current marine park zones, identified priority areas of high resilience for inclusion within no-take zones and examined if any iterative refinements to the current no-take zones are necessary. Of the 65 habitat classes, 16 did not meet representation targets within the current no-take zones, most of which were in deeper offshore waters. These deeper areas also demonstrated the highest resilience values and, as such, Marxan outputs suggested minor increases to the current no-take zones in the deeper offshore areas. This work demonstrates that inclusion of fine-scale climate change resilience features within the design process for MPAs is feasible, and can be applied to future marine spatial planning practices globally. PMID:27529820
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanratty, M.P.; Liber, K.
1994-12-31
The Littoral Ecosystem Risk Assessment Model (LERAM) is a bioenergetic ecosystem effects model. It links single species toxicity data to a bioenergetic model of the trophic structure of an ecosystem in order to simulate community and ecosystem level effects of chemical stressors. LERAM was used in 1992 to simulate the ecological effects of diflubenzuron. When compared to the results from a littoral enclosure study, the model exaggerated the cascading of effects through the trophic levels of the littoral ecosystem. It was hypothesized that this could be corrected by making minor changes in the representation of the littoral food web. Two refinements of the model were therefore performed: (1) the plankton and macroinvertebrate model populations [e.g., predatory Copepoda, herbivorous Insecta, green phytoplankton, etc.] were changed to better represent the habitat and feeding preferences of the endemic taxa; and (2) the method for modeling the microbial degradation of detritus (and the resulting nutrient remineralization) was changed from simulating bacterial populations to simulating bacterial function. Model predictions of the ecological effects of 4-nonylphenol were made before and after these refinements. Both sets of predictions were then compared to the results from a littoral enclosure study of the ecological effects of 4-nonylphenol. The changes in the LERAM predictions were then used to determine the success of the refinements, to guide future research, and to further define LERAM's domain of application.
The Portsmouth-based glaucoma refinement scheme: a role for virtual clinics in the future?
Trikha, S; Macgregor, C; Jeffery, M; Kirwan, J
2012-01-01
Background: Glaucoma referrals continue to impose a significant burden on Hospital Eye Services (HES), and a large proportion of these are false positives. Aims: To evaluate the Portsmouth glaucoma scheme, utilising virtual clinics, digital technology, and community optometrists to streamline glaucoma referrals. Method: The stages of the patient trail were mapped and, at each step of the process, 100 consecutive patient decisions were identified. The diagnostic outcomes of 50 consecutive patients referred from the refinement scheme to the HES were identified. Results: A total of 76% of 'glaucoma' referrals were suitable for the refinement scheme. Overall, 94% of disc images were gradeable in the virtual clinic. In all, 11% of patients 'attending' the virtual clinic were accepted into HES, with 89% being discharged for community follow-up. Of referrals accepted into HES, the positive predictive value (glaucoma/ocular hypertension/suspect) was 0.78 vs 0.37 in the predating 'unrefined' scheme (95% CI 0.65–0.87). The scheme has released 1400 clinic slots/year for HES and has produced a £244,200/year cost saving for Portsmouth Hospitals' Trust. Conclusion: The refinement scheme is streamlining referrals and increasing the positive predictive rate in the diagnosis of glaucoma, glaucoma suspect or ocular hypertension. This consultant-led, practice-based commissioning scheme, if adopted widely, is likely to yield a significant cost saving while maintaining high quality of care within the NHS. PMID:22766539
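For reference, the positive predictive value quoted above is simply the share of accepted referrals that receive a confirmed diagnosis; a minimal worked example in Python (the counts are hypothetical but consistent with the 50 referred patients and PPV of 0.78 reported above):

    def ppv(true_positives, false_positives):
        # Positive predictive value = confirmed diagnoses / all accepted referrals
        return true_positives / (true_positives + false_positives)

    print(ppv(39, 11))   # 39 of 50 accepted referrals confirmed -> 0.78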
Molt, Robert W; Bartlett, Rodney J; Watson, Thomas; Bazanté, Alexandre P
2012-12-13
We have identified the major conformers of CL-20 explosive, otherwise known as 2,4,6,8,10,12-hexanitrohexaazaisowurtzitane, more formally known as 2,4,6,8,10,12-hexanitrohexaazatetracyclo[5.5.0.0]-dodecane, via Monte Carlo search in conformational space through molecular mechanics and subsequent quantum mechanical refinement using perturbation theory. Our search produced enough conformers to account for all of the various forms of CL-20 found in crystals. This suggests that our methodology will be useful in studying the conformational landscape of other nitramines. The energy levels of the conformers found are all within 0.25 eV of one another based on MBPT(2)/6-311G(d,p); consequently, without further refinement from a method such as coupled cluster theory, all conformers may reasonably be populated at STP in the gas phase. We also report the harmonic vibrational frequencies of conformers, including the implications on the mechanism of detonation. In particular, we establish that the weakest N-N nitramine of CL-20 is the cyclohexane equatorial nitramine. This preliminary mapping of the conformers of CL-20 makes it possible to study the mechanism of detonation of this explosive rigorously in future work.
Jacobson, Robert B.; Parsley, Michael J.; Annis, Mandy L.; Colvin, Michael E.; Welker, Timothy L.; James, Daniel A.
2015-01-01
This report documents the process of developing and refining conceptual ecological models (CEMs) for linking river management to pallid sturgeon (Scaphirhynchus albus) population dynamics in the Missouri River. The refined CEMs are being used in the Missouri River Pallid Sturgeon Effects Analysis to organize, document, and formalize an understanding of pallid sturgeon population responses to past and future management alternatives. The general form of the CEMs, represented by a population-level model and component life-stage models, was determined in workshops held in the summer of 2013. Subsequently, the Missouri River Pallid Sturgeon Effects Analysis team designed a general hierarchical structure for the component models, refined the graphical structure, and reconciled variation among the components and between models developed for the upper river (Upper Missouri & Yellowstone Rivers) and the lower river (Missouri River downstream from Gavins Point Dam). Importance scores attributed to the relations between primary biotic characteristics and survival were used to define a candidate set of working dominant hypotheses about pallid sturgeon population dynamics. These CEMs are intended to guide research and adaptive-management actions to benefit pallid sturgeon populations in the Missouri River.
Costing for the Future: Exploring Cost Estimation With Unmanned Autonomous Systems
2016-04-30
account for how cost estimating for autonomy differs from current methodologies and to suggest ways it can be addressed through the integration and...The Development stage involves refining the system requirements, creating a solution description, and building a system. 3. The Operational Test...parameter describes the extent to which efficient fabrication methodologies and processes are used, and the automation of labor-intensive operations
Panel: The Future of Research in Modeling & Simulation
2014-12-01
in industry has not yet materialized. Massive, multiplayer on-line game systems represent one area where the technology has seen extensive commercial...deployment of experiment scripts, aggregation and analysis of data, and refinement and online adaptation of experiment designs through feedback as...findings in a specified context. Practitioners often consult books, blogs, forums, online video tutorials, and brief one-to-two-page
Shallow Water Reverberation Measurement and Prediction
1994-06-01
tool. The temporal signal processing consisted of a short-time Fourier transform spectral estimation method applied to data from a single hydrophone...The three-dimensional Hamiltonian Acoustic Ray-tracing Program for the Ocean (HARPO) was used as the primary propagation modeling tool. The temporal...summarizes the work completed and discusses lessons learned. Advice regarding future work to refine the present study will be provided.
Students as Designers of Their Own Life Curricula: The Reconstruction of Experience in Education
ERIC Educational Resources Information Center
Izuegbu, Vincent
2011-01-01
The idea of life curriculum came as a result of looking back at the author's past in relation to his studies in curriculum. He learned by reconstructing his past in the present to influence his future, and students, indeed everyone, can as well do so. Constructing a curriculum of life is also a continuous process of building, renewing, refining,…
Development of CPR security using impact analysis.
Salazar-Kish, J.; Tate, D.; Hall, P. D.; Homa, K.
2000-01-01
The HIPAA regulations will require that institutions ensure the prevention of unauthorized access to electronically stored or transmitted patient records. This paper discusses a process for analyzing the impact of security mechanisms on users of computerized patient records through "behind the scenes" electronic access audits. In this way, those impacts can be assessed and refined to an acceptable standard prior to implementation. Through an iterative process of design and evaluation, we develop security algorithms that will protect electronic health information from improper access, alteration or loss, while minimally affecting the flow of work of the user population as a whole. PMID:11079984
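As an illustration of the kind of "behind the scenes" audit analysis described above, the sketch below tallies how often a candidate access rule would have blocked accesses that users actually performed, broken down by role; the event fields and the rule interface are assumptions for illustration, not the paper's implementation.

    from collections import Counter

    def blocked_by_role(audit_events, would_block):
        """Count, per user role, the recorded accesses a candidate security rule would deny.

        audit_events : iterable of dicts, e.g. {"user": "jdoe", "role": "nurse", "patient": "123"}
        would_block  : callable(event) -> bool implementing the proposed rule
        High counts for a role indicate workflow disruption and a rule needing refinement.
        """
        counts = Counter()
        for event in audit_events:
            if would_block(event):
                counts[event["role"]] += 1
        return counts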
NASA Technical Reports Server (NTRS)
Tiemsin, Pacita I.; Wohl, Christopher J.
2012-01-01
Flow visualization using polystyrene microspheres (PSLs) has enabled researchers to learn a tremendous amount of information via particle-based diagnostic techniques. To better accommodate wind tunnel researchers' needs, PSL synthesis via dispersion polymerization has been carried out at NASA Langley Research Center since the late 1980s. When utilizing seed material for flow visualization, size and size distribution are of paramount importance. Therefore, the work described here focused on further refinement of PSL synthesis and characterization. Through controlled variation of synthetic conditions (chemical concentrations, solution stirring speed, temperature, etc.), a robust, controllable procedure was developed. The relationship between particle size and salt (MgSO4) concentration was identified, enabling the determination of PSL diameters a priori. Suggestions for future topics related to PSL synthesis, stability, and size variation are also described.
Prioritization in comparative effectiveness research: the CANCERGEN Experience.
Thariani, Rahber; Wong, William; Carlson, Josh J; Garrison, Louis; Ramsey, Scott; Deverka, Patricia A; Esmail, Laura; Rangarao, Sneha; Hoban, Carolyn J; Baker, Laurence H; Veenstra, David L
2012-05-01
Systematic approaches to stakeholder-informed research prioritization are a central focus of comparative effectiveness research. Genomic testing in cancer is an ideal area to refine such approaches given rapid innovation and potentially significant impacts on patient outcomes. To develop and pilot test a stakeholder-informed approach to prioritizing genomic tests for future study in collaboration with the cancer clinical trials consortium SWOG. We conducted a landscape analysis to identify genomic tests in oncology using a systematic search of published and unpublished studies, and expert consultation. Clinically valid tests suitable for evaluation in a comparative study were presented to an external stakeholder group. Domains to guide the prioritization process were identified with stakeholder input, and stakeholders ranked tests using multiple voting rounds. A stakeholder group was created including representatives from patient-advocacy groups, payers, test developers, regulators, policy makers, and community-based oncologists. We identified 9 domains for research prioritization with stakeholder feedback: population impact; current standard of care, strength of association; potential clinical benefits, potential clinical harms, economic impacts, evidence of need, trial feasibility, and market factors. The landscape analysis identified 635 studies; of 9 tests deemed to have sufficient clinical validity, 6 were presented to stakeholders. Two tests in lung cancer (ERCC1 and EGFR) and 1 test in breast cancer (CEA/CA15-3/CA27.29) were identified as top research priorities. Use of a diverse stakeholder group to inform research prioritization is feasible in a pragmatic and timely manner. Additional research is needed to optimize search strategies, stakeholder group composition, and integration with existing prioritization mechanisms.
Prioritization in Comparative Effectiveness Research: The CANCERGEN Experience in Cancer Genomics
Thariani, Rahber; Wong, William; Carlson, Josh J; Garrison, Louis; Ramsey, Scott; Deverka, Patricia A; Esmail, Laura; Rangarao, Sneha; Hoban, Carolyn J; Baker, Laurence H; Veenstra, David L
2012-01-01
Background Systematic approaches to stakeholder-informed research prioritization are a central focus of comparative effectiveness research. Genomic testing in cancer is an ideal area to refine such approaches given rapid innovation and potentially significant impacts on patient outcomes. Objective To develop and pilot-test a stakeholder-informed approach to prioritizing genomic tests for future study in collaboration with the cancer clinical trials consortium SWOG. Methods We conducted a landscape-analysis to identify genomic tests in oncology using a systematic search of published and unpublished studies, and expert consultation. Clinically valid tests suitable for evaluation in a comparative study were presented to an external stakeholder group. Domains to guide the prioritization process were identified with stakeholder input, and stakeholders ranked tests using multiple voting rounds. Results A stakeholder group was created including representatives from patient-advocacy groups, payers, test developers, regulators, policy-makers, and community-based oncologists. We identified nine domains for research prioritization with stakeholder feedback: population impact; current standard of care, strength of association; potential clinical benefits, potential clinical harms, economic impacts, evidence of need, trial feasibility, and market factors. The landscape-analysis identified 635 studies; of 9 tests deemed to have sufficient clinical validity, 6 were presented to stakeholders. Two tests in lung cancer (ERCC1 and EGFR) and one test in breast cancer (CEA/CA15-3/CA27.29) were identified as top research priorities. Conclusions Use of a diverse stakeholder group to inform research prioritization is feasible in a pragmatic and timely manner. Additional research is needed to optimize search strategies, stakeholder group composition and integration with existing prioritization mechanisms. PMID:22274803
Volcanic risk assessment: Quantifying physical vulnerability in the built environment
NASA Astrophysics Data System (ADS)
Jenkins, S. F.; Spence, R. J. S.; Fonseca, J. F. B. D.; Solidum, R. U.; Wilson, T. M.
2014-04-01
This paper presents structured and cost-effective methods for assessing the physical vulnerability of at-risk communities to the range of volcanic hazards, developed as part of the MIA-VITA project (2009-2012). An initial assessment of building and infrastructure vulnerability has been carried out for a set of broadly defined building types and infrastructure categories, with the likelihood of damage considered separately for projectile impact, ash fall loading, pyroclastic density current dynamic pressure and earthquake ground shaking intensities. In refining these estimates for two case study areas: Kanlaon volcano in the Philippines and Fogo volcano in Cape Verde, we have developed guidelines and methodologies for carrying out physical vulnerability assessments in the field. These include identifying primary building characteristics, such as construction material and method, as well as subsidiary characteristics, for example the size and prevalence of openings, that may be important in assessing eruption impacts. At-risk buildings around Kanlaon were found to be dominated by timber frame buildings that exhibit a high vulnerability to pyroclastic density currents, but a low vulnerability to failure from seismic shaking. Around Fogo, the predominance of unreinforced masonry buildings with reinforced concrete slab roofs suggests a high vulnerability to volcanic earthquake but a low vulnerability to ash fall loading. Given the importance of agriculture for local livelihoods around Kanlaon and Fogo, we discuss the potential impact of infrastructure vulnerability for local agricultural economies, with implications for volcanic areas worldwide. These methodologies and tools go some way towards offering a standardised approach to carrying out future vulnerability assessments for populated volcanic areas.
A refined methodology for modeling volume quantification performance in CT
NASA Astrophysics Data System (ADS)
Chen, Baiyu; Wilson, Joshua; Samei, Ehsan
2014-03-01
The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.
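A minimal sketch (in Python/NumPy) of the repeated-scan noise measurement described above: subtracting two repeated acquisitions of the same phantom cancels the fixed anatomy and texture, and the residual standard deviation divided by sqrt(2) estimates the single-image quantum noise. The array names and the region of interest are illustrative assumptions.

    import numpy as np

    def quantum_noise(scan_a, scan_b, roi):
        """Estimate image quantum noise from two repeated, co-registered scans.

        scan_a, scan_b : 2D arrays of the same phantom slice
        roi            : tuple of slices selecting the measurement region
        The difference image contains only (uncorrelated) noise, whose variance
        is twice the single-image variance, hence the division by sqrt(2).
        """
        diff = scan_a[roi].astype(float) - scan_b[roi].astype(float)
        return diff.std() / np.sqrt(2.0)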
Do Massive Oil Sands Developments in a Northern Watershed Lead to an Impending Crisis?
NASA Astrophysics Data System (ADS)
Kienzle, S. W.; Byrne, J.; Schindler, D.; Komers, P.
2005-12-01
Oil sands developments in northern Alberta are land disruptions of massive proportions, with potentially major impacts on watersheds. Alberta has one of the largest known oil reserves in the world; developments cover about 25,000 square kilometres of lease areas and have approvals for plants to develop over half a million ha (or 54 townships). This is 91% the size of Lake Erie covered mainly with tailings dams, open-pit mines and associated massive removal of forests, wetlands, and soils. With rising oil prices and declining conventional reserves, the current production of about 900,000 barrels per day will dramatically increase. There is considerable confusion over how much water is needed to extract and refine the oil. The best estimates by oil companies are 6 to 10 barrels of water for each barrel of oil. Shell Oil is aiming to bring the water to oil ratio down to 3; however, this has not yet been achieved. Trend analysis of the Athabasca streamflow shows that the streamflow is declining, particularly the low flow during winter. In order to sustain a minimum flow that ensures a relatively healthy aquatic environment, the only option the oil sands companies have to ensure uninterrupted production during winter is to build large water reservoirs, which would be filled during the high flow period in spring or summer. A disturbing fact is that this need for reservoirs was never considered until a science panel initiated by the Mikesew Cree First Nation participated in two hearings in the fall of 2003, when two major oil companies applied for licenses of a massive scale each. In the Environmental Impact Assessments (EIAs), water was to be extracted throughout the year, consequently threatening in-stream flow needs at some point in the future. Less than 1% has been reclaimed so far, with questionable success, as the new landscape will be a relatively sterile one with minimal biological diversity. Reclamation liabilities need to be included in mining leases. The release of naphthenic acids into water bodies through oil sands refining and potential tailings pond leaks could have huge impacts on the water quality for a large region. The shortcomings in the EIAs submitted during the past two and a half years are manifold: a) the hydrological science of the EIAs is extremely sparse, with hardly any references to peer-reviewed journals; b) uncertainty analysis was not included until the 2003 hearings, and today uncertainty analysis is carried out inadequately; c) climate change impacts on streamflow and the water cycle were totally ignored until the 2003 hearings, and today climate change impact analyses are totally inadequate; d) impacts quantification calculations are based on comparing impacted areas, such as the change of open water areas, to the total study area, instead of the associated pre-development areas, which results in substantially understated impacts; e) the regions of potential impact from oil sands operations are defined to end at the inflow into Lake Athabasca, which is insufficient as substances carried with the water will flow into Lake Athabasca; f) frequency analyses are based on the wrong frequency distribution, subsequently resulting in inadequate predictions of streamflow extremes.
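A back-of-envelope illustration (in Python) of the water volumes implied by the production rate and water-to-oil ratios cited above; the barrel-to-cubic-metre conversion is an assumption added for the calculation, and the figures are illustrative only.

    barrels_oil_per_day = 900_000      # current production cited above
    barrel_m3 = 0.159                  # approximate volume of one barrel in cubic metres (assumption)

    for ratio in (6, 10):              # barrels of water per barrel of oil, as estimated above
        water_m3_per_day = barrels_oil_per_day * ratio * barrel_m3
        print(f"water-to-oil ratio {ratio}: ~{water_m3_per_day / 1e6:.2f} million m^3 of water per day")
    # ratio 6  -> ~0.86 million m^3/day; ratio 10 -> ~1.43 million m^3/day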
An ESA roadmap for geobiology in space exploration
NASA Astrophysics Data System (ADS)
Cousins, Claire R.; Cockell, Charles S.
2016-01-01
Geobiology, and in particular mineral-microbe interactions, has a significant role to play in current and future space exploration. This includes the search for biosignatures in extraterrestrial environments, and the human exploration of space. Microorganisms can be exploited to advance such exploration, such as through biomining, maintenance of life-support systems, and testing of life-detection instrumentation. In view of these potential applications, a European Space Agency (ESA) Topical Team "Geobiology in Space Exploration" was developed to explore these applications, and identify research avenues to be investigated to support this endeavour. Through community workshops, a roadmap was produced, with which to define future research directions via a set of 15 recommendations spanning three key areas: Science, Technology, and Community. These roadmap recommendations identify the need for research into: (1) new terrestrial space-analogue environments; (2) community level microbial-mineral interactions; (3) response of biofilms to the space environment; (4) enzymatic and biochemical mineral interaction; (5) technical refinement of instrumentation for space-based microbiology experiments, including precursor flight tests; (6) integration of existing ground-based planetary simulation facilities; (7) integration of fieldsite biogeography with laboratory- and field-based research; (8) modification of existing planetary instruments for new geobiological investigations; (9) development of in situ sample preparation techniques; (10) miniaturisation of existing analytical methods, such as DNA sequencing technology; (11) new sensor technology to analyse chemical interaction in small volume samples; (12) development of reusable Lunar and Near Earth Object experimental platforms; (13) utility of Earth-based research to enable the realistic pursuit of extraterrestrial biosignatures; (14) terrestrial benefits and technological spin-off from existing and future space-based geobiology investigations; and (15) new communication avenues between space agencies and terrestrial research organisations to enable this impact to be developed.
The UT 7/8 February 2013 Sila-Nunam Mutual Event and Future Predictions
NASA Technical Reports Server (NTRS)
Benecchi, S. D.; Noll, K. S.; Thirouin, A.; Ryan, E.; Grundy, W. M.; Verbiscer, A.; Doressoundiram, A.; Hestroffer, D.; Beaton, R.; Rabinowitz, D.;
2013-01-01
A superior mutual event of the Kuiper Belt binary system (79360) Sila-Nunam was observed over 15.47 h on UT 7/8 February 2013 through a coordinated effort at four different telescope facilities; it started approximately 1.5 h earlier than anticipated, lasted approximately 9.5 h (about 10% longer than predicted), and was slightly less deep than predicted. It is the first full event observed for a comparably sized binary Kuiper Belt object. We provide predictions for future events, refined by this and other partial mutual event observations obtained since the mutual event season began.
NASA Technical Reports Server (NTRS)
Stanturf, J. A.; Heimbuch, D. G.
1980-01-01
A refinement to the matrix approach to environmental impact assessment is to use landscape units in place of separate environmental elements in the analysis. Landscape units can be delineated by integrating remotely sensed data and available single-factor data. A remote sensing approach to landscape stratification is described and the conditions under which it is superior to other approaches that require single-factor maps are indicated. Flowcharts show the steps necessary to develop classification criteria, delineate units and a map legend, and use the landscape units in impact assessment. Application of the approach to assessing impacts of a transmission line in Montana is presented to illustrate the method.
Chen, Xiaowen; Shekiro, Joseph; Pschorn, Thomas; ...
2015-10-29
A novel, highly efficient deacetylation and disk refining (DDR) process to liberate fermentable sugars from biomass was recently developed at the National Renewable Energy Laboratory (NREL). The DDR process consists of a mild, dilute alkaline deacetylation step followed by low-energy-consumption disk refining. The DDR corn stover substrates achieved high process sugar conversion yields, at low to modest enzyme loadings, and also produced high sugar concentration syrups at high initial insoluble solid loadings. The sugar syrups derived from corn stover are highly fermentable due to low concentrations of fermentation inhibitors. The objective of this work is to evaluate the economic feasibility of the DDR process through a techno-economic analysis (TEA). A large array of experiments designed using a response surface methodology was carried out to investigate the two major cost-driven operational parameters of the novel DDR process: refining energy and enzyme loadings. The boundary conditions for refining energy (128–468 kWh/ODMT), cellulase (Novozyme’s CTec3) loading (11.6–28.4 mg total protein/g of cellulose), and hemicellulase (Novozyme’s HTec3) loading (0–5 mg total protein/g of cellulose) were chosen to cover the most commercially practical operating conditions. The sugar and ethanol yields were modeled with good adequacy, showing a positive linear correlation between those yields and refining energy and enzyme loadings. The ethanol yields ranged from 77 to 89 gallons/ODMT of corn stover. The minimum sugar selling price (MSSP) ranged from $0.191 to $0.212 per lb of 50 % concentrated monomeric sugars, while the minimum ethanol selling price (MESP) ranged from $2.24 to $2.54 per gallon of ethanol. The DDR process concept is evaluated for economic feasibility through TEA. The MSSP and MESP of the DDR process falls within a range similar to that found with the deacetylation/dilute acid pretreatment process modeled in NREL’s 2011 design report. The DDR process is a much simpler process that requires less capital and maintenance costs when compared to conventional chemical pretreatments with pressure vessels. As a result, we feel the DDR process should be considered as an option for future biorefineries with great potential to be more cost-effective.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Xiaowen; Shekiro, Joseph; Pschorn, Thomas
A novel, highly efficient deacetylation and disk refining (DDR) process to liberate fermentable sugars from biomass was recently developed at the National Renewable Energy Laboratory (NREL). The DDR process consists of a mild, dilute alkaline deacetylation step followed by low-energy-consumption disk refining. The DDR corn stover substrates achieved high process sugar conversion yields, at low to modest enzyme loadings, and also produced high sugar concentration syrups at high initial insoluble solid loadings. The sugar syrups derived from corn stover are highly fermentable due to low concentrations of fermentation inhibitors. The objective of this work is to evaluate the economic feasibility of the DDR process through a techno-economic analysis (TEA). A large array of experiments designed using a response surface methodology was carried out to investigate the two major cost-driven operational parameters of the novel DDR process: refining energy and enzyme loadings. The boundary conditions for refining energy (128–468 kWh/ODMT), cellulase (Novozyme’s CTec3) loading (11.6–28.4 mg total protein/g of cellulose), and hemicellulase (Novozyme’s HTec3) loading (0–5 mg total protein/g of cellulose) were chosen to cover the most commercially practical operating conditions. The sugar and ethanol yields were modeled with good adequacy, showing a positive linear correlation between those yields and refining energy and enzyme loadings. The ethanol yields ranged from 77 to 89 gallons/ODMT of corn stover. The minimum sugar selling price (MSSP) ranged from $0.191 to $0.212 per lb of 50 % concentrated monomeric sugars, while the minimum ethanol selling price (MESP) ranged from $2.24 to $2.54 per gallon of ethanol. The DDR process concept is evaluated for economic feasibility through TEA. The MSSP and MESP of the DDR process falls within a range similar to that found with the deacetylation/dilute acid pretreatment process modeled in NREL’s 2011 design report. The DDR process is a much simpler process that requires less capital and maintenance costs when compared to conventional chemical pretreatments with pressure vessels. As a result, we feel the DDR process should be considered as an option for future biorefineries with great potential to be more cost-effective.
Chen, Xiaowen; Shekiro, Joseph; Pschorn, Thomas; Sabourin, Marc; Tucker, Melvin P; Tao, Ling
2015-01-01
A novel, highly efficient deacetylation and disk refining (DDR) process to liberate fermentable sugars from biomass was recently developed at the National Renewable Energy Laboratory (NREL). The DDR process consists of a mild, dilute alkaline deacetylation step followed by low-energy-consumption disk refining. The DDR corn stover substrates achieved high process sugar conversion yields, at low to modest enzyme loadings, and also produced high sugar concentration syrups at high initial insoluble solid loadings. The sugar syrups derived from corn stover are highly fermentable due to low concentrations of fermentation inhibitors. The objective of this work is to evaluate the economic feasibility of the DDR process through a techno-economic analysis (TEA). A large array of experiments designed using a response surface methodology was carried out to investigate the two major cost-driven operational parameters of the novel DDR process: refining energy and enzyme loadings. The boundary conditions for refining energy (128-468 kWh/ODMT), cellulase (Novozyme's CTec3) loading (11.6-28.4 mg total protein/g of cellulose), and hemicellulase (Novozyme's HTec3) loading (0-5 mg total protein/g of cellulose) were chosen to cover the most commercially practical operating conditions. The sugar and ethanol yields were modeled with good adequacy, showing a positive linear correlation between those yields and refining energy and enzyme loadings. The ethanol yields ranged from 77 to 89 gallons/ODMT of corn stover. The minimum sugar selling price (MSSP) ranged from $0.191 to $0.212 per lb of 50 % concentrated monomeric sugars, while the minimum ethanol selling price (MESP) ranged from $2.24 to $2.54 per gallon of ethanol. The DDR process concept is evaluated for economic feasibility through TEA. The MSSP and MESP of the DDR process falls within a range similar to that found with the deacetylation/dilute acid pretreatment process modeled in NREL's 2011 design report. The DDR process is a much simpler process that requires less capital and maintenance costs when compared to conventional chemical pretreatments with pressure vessels. As a result, we feel the DDR process should be considered as an option for future biorefineries with great potential to be more cost-effective.
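A minimal sketch (in Python/NumPy) of fitting the kind of linear response surface reported above, with ethanol yield as a linear function of refining energy and enzyme loading; the design points and yields below are hypothetical placeholders within the reported ranges, not the NREL data.

    import numpy as np

    # Hypothetical points: (refining energy kWh/ODMT, CTec3 loading mg/g cellulose) -> yield gal/ODMT
    X_raw = np.array([[128, 11.6], [128, 28.4], [468, 11.6], [468, 28.4], [300, 20.0]])
    y = np.array([77.0, 83.0, 84.0, 89.0, 83.5])

    X = np.column_stack([np.ones(len(X_raw)), X_raw])    # intercept + two linear terms
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)          # least-squares response surface fit
    print(coef)   # [intercept, d(yield)/d(refining energy), d(yield)/d(enzyme loading)]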
NASA Astrophysics Data System (ADS)
Schleussner, Carl-Friedrich; Lissner, Tabea K.; Fischer, Erich M.; Wohland, Jan; Perrette, Mahé; Golly, Antonius; Rogelj, Joeri; Childers, Katelin; Schewe, Jacob; Frieler, Katja; Mengel, Matthias; Hare, William; Schaeffer, Michiel
2016-04-01
Robust appraisals of climate impacts at different levels of global-mean temperature increase are vital to guide assessments of dangerous anthropogenic interference with the climate system. The 2015 Paris Agreement includes a two-headed temperature goal: "holding the increase in the global average temperature to well below 2 °C above pre-industrial levels and pursuing efforts to limit the temperature increase to 1.5 °C". Despite the prominence of these two temperature limits, a comprehensive overview of the differences in climate impacts at these levels is still missing. Here we provide an assessment of key impacts of climate change at warming levels of 1.5 °C and 2 °C, including extreme weather events, water availability, agricultural yields, sea-level rise and risk of coral reef loss. Our results reveal substantial differences in impacts between a 1.5 °C and 2 °C warming that are highly relevant for the assessment of dangerous anthropogenic interference with the climate system. For heat-related extremes, the additional 0.5 °C increase in global-mean temperature marks the difference between events at the upper limit of present-day natural variability and a new climate regime, particularly in tropical regions. Similarly, this warming difference is likely to be decisive for the future of tropical coral reefs. In a scenario with an end-of-century warming of 2 °C, virtually all tropical coral reefs are projected to be at risk of severe degradation due to temperature-induced bleaching from 2050 onwards. This fraction is reduced to about 90 % in 2050 and projected to decline to 70 % by 2100 for a 1.5 °C scenario. Analyses of precipitation-related impacts reveal distinct regional differences and hot-spots of change emerge. Regional reduction in median water availability for the Mediterranean is found to nearly double from 9 % to 17 % between 1.5 °C and 2 °C, and the projected lengthening of regional dry spells increases from 7 to 11 %. Projections for agricultural yields differ between crop types as well as world regions. While some (in particular high-latitude) regions may benefit, tropical regions like West Africa, South-East Asia, as well as Central and northern South America are projected to face substantial local yield reductions, particularly for wheat and maize. Best estimate sea-level rise projections based on two illustrative scenarios indicate a 50 cm rise by 2100 relative to year 2000-levels for a 2 °C scenario, and about 10 cm lower levels for a 1.5 °C scenario. In a 1.5 °C scenario, the rate of sea-level rise in 2100 would be reduced by about 30 % compared to a 2 °C scenario. Our findings highlight the importance of regional differentiation to assess both future climate risks and different vulnerabilities to incremental increases in global-mean temperature. The article provides a consistent and comprehensive assessment of existing projections and a good basis for future work on refining our understanding of the difference between impacts at 1.5 °C and 2 °C warming.
NASA Astrophysics Data System (ADS)
Arendt, A. A.; Houser, P.; Kapnick, S. B.; Kargel, J. S.; Kirschbaum, D.; Kumar, S.; Margulis, S. A.; McDonald, K. C.; Osmanoglu, B.; Painter, T. H.; Raup, B. H.; Rupper, S.; Tsay, S. C.; Velicogna, I.
2017-12-01
The High Mountain Asia Team (HiMAT) is an assembly of 13 research groups funded by NASA to improve understanding of cryospheric and hydrological changes in High Mountain Asia (HMA). Our project goals are to quantify historical and future variability in weather and climate over the HMA, partition the components of the water budget across HMA watersheds, explore physical processes driving changes, and predict couplings and feedbacks between physical and human systems through assessment of hazards and downstream impacts. These objectives are being addressed through analysis of remote sensing datasets combined with modeling and assimilation methods to enable data integration across multiple spatial and temporal scales. Our work to date has focused on developing improved high resolution precipitation, snow cover and snow water equivalence products through a variety of statistical uncertainty analysis, dynamical downscaling and assimilation techniques. These and other high resolution climate products are being used as input and validation for an assembly of land surface and General Circulation Models. To quantify glacier change in the region we have calculated multidecadal mass balances of a subset of HMA glaciers by comparing commercial satellite imagery with earlier elevation datasets. HiMAT is using these tools and datasets to explore the impact of atmospheric aerosols and surface impurities on surface energy exchanges, to determine drivers of glacier and snowpack melt rates, and to improve our capacity to predict future hydrological variability. Outputs from the climate and land surface assessments are being combined with landslide and glacier lake inventories to refine our ability to predict hazards in the region. Economic valuation models are also being used to assess impacts on water resources and hydropower. Field data of atmospheric aerosol, radiative flux and glacier lake conditions are being collected to provide ground validation for models and remote sensing products. In this presentation we will discuss initial results and outline plans for a scheduled release of our datasets and findings to the broader community. We will also describe our methods for cross-team collaboration through the adoption of cloud computing and data integration tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dirras, G., E-mail: dirras@univ-paris13.fr; Ouarem, A.; Couque, H.
2011-05-15
Polycrystalline Zn with an average grain size of about 300 μm was deformed by a direct-impact Hopkinson pressure bar at a velocity of 29 m/s. An inhomogeneous grain structure was found, consisting of a center region with a large average grain size of 20 μm surrounded by a fine-grained rim with an average grain size of 6 μm. Transmission electron microscopy investigations showed a significant dislocation density in the large-grained area, while in the fine-grained rim the dislocation density was negligible. Most probably, the higher strain yielded recrystallization in the outer ring while in the center only recovery occurred. The hardening effect of dislocations overwhelms the smaller grain size strengthening in the center part, resulting in higher nanohardness in this region than in the outer ring. Graphical Abstract: (a) EBSD micrograph showing the initial microstructure of polycrystalline Zn that was subsequently submitted to high strain rate impact; (b) an inhomogeneous grain size refinement was obtained, consisting of a central coarse-grained area surrounded by a fine-grained recrystallized rim; the black arrow points to the disc center. Research Highlights: A polycrystalline Zn specimen was submitted to high strain rate impact loading. Inhomogeneous grain refinement occurred due to the strain gradient in the impacted sample. A fine-grained recrystallized rim surrounded the coarse-grained center of the specimen. The coarse-grained center exhibited higher hardness than the fine-grained rim. The higher hardness of the center was caused by the higher dislocation density.
High-rate squeezing process of bulk metallic glasses
Fan, Jitang
2017-01-01
High-rate squeezing of bulk metallic glasses from a cylinder into an intact sheet, achieved by impact loading, is investigated. Such a large deformation is caused by plastic flow, accompanied by geometrical confinement, shear banding/slipping, thermal softening, melting and joining. The temperature rise during the high-rate squeezing process plays a major role. The inherent mechanisms are illustrated. Like high-pressure torsion (HPT), equal channel angular pressing (ECAP) and surface mechanical attrition treatment (SMAT) for refining the grains of metals, High-Rate Squeezing (HRS), as a multi-function technique, not only opens a new route for processing metallic glasses and other metallic alloys to develop advanced materials, but also points toward a novel technology for processing, grain refining, coating, welding and related treatments of materials. PMID:28338092
Randell, Rebecca; Greenhalgh, Joanne; Hindmarsh, Jon; Dowding, Dawn; Jayne, David; Pearman, Alan; Gardner, Peter; Croft, Julie; Kotze, Alwyn
2014-05-02
Robotic surgery offers many potential benefits for patients. While an increasing number of healthcare providers are purchasing surgical robots, there are reports that the technology is failing to be introduced into routine practice. Additionally, in robotic surgery, the surgeon is physically separated from the patient and the rest of the team, with the potential to negatively impact teamwork in the operating theatre. The aim of this study is to ascertain: how and under what circumstances robotic surgery is effectively introduced into routine practice; and how and under what circumstances robotic surgery impacts teamwork, communication and decision making, and subsequent patient outcomes. We will undertake a process evaluation alongside a randomised controlled trial comparing laparoscopic and robotic surgery for the curative treatment of rectal cancer. Realist evaluation provides an overall framework for the study. The study will be in three phases. In Phase I, grey literature will be reviewed to identify stakeholders' theories concerning how robotic surgery becomes embedded into surgical practice and its impacts. These theories will be refined and added to through interviews conducted across English hospitals that are using robotic surgery for rectal cancer resection with staff at different levels of the organisation, along with a review of documentation associated with the introduction of robotic surgery. In Phase II, a multi-site case study will be conducted across four English hospitals to test and refine the candidate theories. Data will be collected using multiple methods: the structured observation tool OTAS (Observational Teamwork Assessment for Surgery); video recordings of operations; ethnographic observation; and interviews. In Phase III, interviews will be conducted at the four case sites with staff representing a range of surgical disciplines, to assess the extent to which the results of Phase II are generalisable and to refine the resulting theories to reflect the experience of a broader range of surgical disciplines. The study will provide (i) guidance to healthcare organisations on factors likely to facilitate successful implementation and integration of robotic surgery, and (ii) guidance on how to ensure effective communication and teamwork when undertaking robotic surgery.
ERIC Educational Resources Information Center
Faust, Kyle A.; Faust, David; Baker, Aaron M.; Meyer, Joseph F.
2012-01-01
Even when relatively infrequent, deviant response sets, such as defensive and careless responding, can have remarkably robust effects on individual and group data and thereby distort clinical evaluations and research outcomes. Given such potential adverse impacts and the widespread use of self-report measures when appraising addictions and…
ERIC Educational Resources Information Center
Angus, Rebecca; Hughes, Thomas
2017-01-01
Schools regularly implement numerous programs to satisfy widespread expectations. Often, implementation is carried out with little follow-up examining data that could help refine or determine the ultimate worth of the intervention. Through utilization of both descriptive and empirical methods, this study delved into the long-term effectiveness of…
40 CFR Appendix W to Part 51 - Guideline on Air Quality Models
Code of Federal Regulations, 2011 CFR
2011-07-01
... sufficient spatial and temporal coverage are available. c. It would be advantageous to categorize the various... control strategies. These are referred to as refined models. c. The use of screening techniques followed... location of the source in question and its expected impacts. c. In all regulatory analyses, especially if...
The Impact of Education on Income Distribution.
ERIC Educational Resources Information Center
Tinbergen, Jan
The author's previously developed theory on income distribution, in which two of the explanatory variables are the average level and the distribution of education, is refined and tested on data selected and processed by the author and data from three studies by Americans. The material consists of data on subdivisions of three countries, the United…
Challenges and approaches in planning fuel treatments across fire-excluded forested landscapes
B.M. Collins; S.L. Stephens; J.J. Moghaddas; J. Battles
2010-01-01
Placing fuel reduction treatments across entire landscapes such that impacts associated with high-intensity fire are lessened is a difficult goal to achieve, largely because of the immense area needing treatment. As such, fire scientists and managers have conceptually developed and are refining methodologies for strategic placement of fuel treatments that...
Due to its presence in water as a volatile disinfection byproduct, BDCM, which is mutagenic and a rodent carcinogen, poses a risk for exposure via multiple routes. We developed a refined human PBPK model for BDCM (including new chemical-specific human parameters) to evaluate the...
Professional Development, Teacher Efficacy, and Collaboration in Title I Middle Schools
ERIC Educational Resources Information Center
Rostan, MaryMargret
2009-01-01
A problem exists in the U.S. education system regarding the efforts to refine professional development and gain a deeper understanding of content knowledge to impact teachers' abilities to meet students' needs. Many teachers have not had the professional development opportunities that support the improvement of teaching skills and knowledge. The…
Proteopedia Entry: The Large Ribosomal Subunit of "Haloarcula Marismortui"
ERIC Educational Resources Information Center
Decatur, Wayne A.
2010-01-01
This article presents a "Proteopedia" page that shows the refined version of the structure of the "Haloarcula" large ribosomal subunit as solved by the laboratories of Thomas Steitz and Peter Moore. The landmark structure is of great impact as it is the first atomic-resolution structure of the highly conserved ribosomal subunit which harbors…
Threshold Concepts: Impacts on Teaching and Learning at Tertiary Level
ERIC Educational Resources Information Center
Peter, Mira; Harlow, Ann
2014-01-01
This project explored teaching and learning of hard-to-learn threshold concepts in first-year English, an electrical engineering course, leadership courses, and in doctoral writing. The project was envisioned to produce disciplinary case studies that lecturers could use to reflect on and refine their curriculum and pedagogy, thereby contributing…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Jeongwoo; Elgowainy, Amgad; Wang, Michael
2015-07-14
In this study, we evaluated the impacts of producing high-octane fuel (HOF) with a research octane number (RON) of 100, using a range of ethanol blending levels (E10, E25, and E40), vehicle efficiency gains, and HOF market penetration scenarios (3.4% to 70%), on well-to-wheels (WTW) petroleum use and greenhouse gas (GHG) emissions. In particular, we conducted linear programming (LP) modeling of petroleum refineries to examine the impacts of different HOF production scenarios on petroleum refining energy use and GHG emissions. We compared two cases of HOF vehicle fuel economy gains of 5% and 10% in terms of miles per gallon gasoline equivalent (MPGGE) to baseline regular gasoline vehicles. We incorporated three key factors in GREET — (1) refining energy intensities of gasoline components for the various ethanol blending options and market shares, (2) vehicle efficiency gains, and (3) upstream energy use and emissions associated with the production of different crude types and ethanol — to compare the WTW GHG emissions of various HOF/vehicle scenarios with the business-as-usual baseline regular gasoline (87 AKI E10) pathway.
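As a rough illustration of how the three factors above combine in a well-to-wheels accounting, the sketch below multiplies a per-mile energy demand (adjusted for an assumed vehicle efficiency gain) by a blend-weighted carbon intensity. The function name, shares, and intensity values are hypothetical placeholders, not GREET parameters or results from the study.

```python
# Illustrative (assumed) numbers only; not GREET inputs or outputs.
def wtw_per_mile(baseline_mj_per_mile, efficiency_gain,
                 gasoline_share, gasoline_gco2e_per_mj,
                 ethanol_share, ethanol_gco2e_per_mj):
    """Very simplified WTW GHG estimate for an ethanol/gasoline blend vehicle."""
    mj_per_mile = baseline_mj_per_mile / (1.0 + efficiency_gain)
    blend_intensity = (gasoline_share * gasoline_gco2e_per_mj
                       + ethanol_share * ethanol_gco2e_per_mj)
    return mj_per_mile * blend_intensity

base = wtw_per_mile(4.2, 0.00, 0.90, 92.0, 0.10, 55.0)   # baseline E10, assumed shares
e25  = wtw_per_mile(4.2, 0.05, 0.75, 92.0, 0.25, 55.0)   # HOF E25 with 5% efficiency gain
print(f"baseline ~ {base:.0f} g CO2e/mi, E25 HOF ~ {e25:.0f} g CO2e/mi")
```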
Olive Oil and Vitamin D Synergistically Prevent Bone Loss in Mice
Tagliaferri, Camille; Davicco, Marie-Jeanne; Lebecque, Patrice; Georgé, Stéphane; Amiot, Marie-Jo; Mercier, Sylvie; Dhaussy, Amélie; Huertas, Alain; Walrand, Stéphane; Wittrant, Yohann; Coxam, Véronique
2014-01-01
As the Mediterranean diet (and particularly olive oil) has been associated with bone health, we investigated the impact of extra virgin oil as a source of polyphenols on bone metabolism. In that purpose sham-operated (SH) or ovariectomized (OVX) mice were subjected to refined or virgin olive oil. Two supplementary OVX groups were given either refined or virgin olive oil fortified with vitamin D3, to assess the possible synergistic effects with another liposoluble nutrient. After 30 days of exposure, bone mineral density and gene expression were evaluated. Consistent with previous data, ovariectomy was associated with increased bone turnover and led to impaired bone mass and micro-architecture. The expression of oxidative stress markers were enhanced as well. Virgin olive oil fortified with vitamin D3 prevented such changes in terms of both bone remodeling and bone mineral density. The expression of inflammation and oxidative stress mRNA was also lower in this group. Overall, our data suggest a protective impact of virgin olive oil as a source of polyphenols in addition to vitamin D3 on bone metabolism through improvement of oxidative stress and inflammation. PMID:25551374
Poag, C. Wylie; Hutchinson, Deborah R.; Colman, Steve M.; Lee, Myung W.; Dressler, B.O.; Sharpton, V.L.
1999-01-01
This work refines previous interpretations of the structure and morphology of the Chesapeake Bay impact crater on the basis of more than 1,200 km of multichannel and single-channel seismic reflection profiles collected in the bay and on the adjacent continental shelf. The outer rim, formed in sedimentary rocks, is irregularly circular, with an average diameter of ~85 km. A 20–25-km-wide annular trough separates the outer rim from an ovate, crystalline peak ring of ~200 m of maximum relief. The inner basin is 35–40 km in diameter, and at least 1.26 km deep. A crystalline(?) central peak, approximately 1 km high, is faintly imaged on three profiles, and also is indicated by a small positive Bouguer gravity anomaly. These features classify the crater as a complex peak-ring/central peak crater. Chesapeake Bay Crater is most comparable to the Ries and Popigai Craters on Earth; to protobasins on Mars, Mercury, and the Moon; and to type D craters on Venus.
Recommendations for Exploration Space Medicine from the Apollo Medical Operations Project
NASA Technical Reports Server (NTRS)
Scheuring, R. a.; Davis, J. R.; Duncan, J. M.; Polk, J. D.; Jones, J. A.; Gillis, D. B.
2007-01-01
Introduction: A study was requested in December, 2005 by the Space Medicine Division at the NASA-Johnson Space Center (JSC) to identify Apollo mission issues relevant to medical operations that had impact to crew health and/or performance. The objective was to use this new information to develop medical requirements for the future Crew Exploration Vehicle (CEV), Lunar Surface Access Module (LSAM), Lunar Habitat, and Advanced Extravehicular Activity (EVA) suits that are currently being developed within the exploration architecture. Methods: Available resources pertaining to medical operations on the Apollo 7 through 17 missions were reviewed. Ten categories of hardware, systems, or crew factors were identified in the background research, generating 655 data records in a database. A review of the records resulted in 280 questions that were then posed to surviving Apollo crewmembers by mail, face-to-face meetings, or online interaction. Response analysis to these questions formed the basis of recommendations to items in each of the categories. Results: Thirteen of 22 surviving Apollo astronauts (59%) participated in the project. Approximately 236 pages of responses to the questions were captured, resulting in 107 recommendations offered for medical consideration in the design of future vehicles and EVA suits based on the Apollo experience. Discussion: The goals of this project included: 1) Develop or modify medical requirements for new vehicles; 2) create a centralized database for future access; and 3) take this new knowledge and educate the various directorates at NASA-JSC who are participating in the exploration effort. To date, the Apollo Medical Operations recommendations are being incorporated into the exploration mission architecture at various levels and a centralized database has been developed. The Apollo crewmembers input has proved to be an invaluable resource, prompting ongoing collaboration as the requirements for the future exploration missions continue to evolve and be refined.
MODFLOW-LGR: Practical application to a large regional dataset
NASA Astrophysics Data System (ADS)
Barnes, D.; Coulibaly, K. M.
2011-12-01
In many areas of the US, including southwest Florida, large regional-scale groundwater models have been developed to aid in decision making and water resources management. These models are subsequently used as a basis for site-specific investigations. Because the large scale of these regional models is not appropriate for local application, refinement is necessary to analyze the local effects of pumping wells and groundwater related projects at specific sites. The most commonly used approach to date is Telescopic Mesh Refinement or TMR. It allows the extraction of a subset of the large regional model with boundary conditions derived from the regional model results. The extracted model is then updated and refined for local use using a variable sized grid focused on the area of interest. MODFLOW-LGR, local grid refinement, is an alternative approach which allows model discretization at a finer resolution in areas of interest and provides coupling between the larger "parent" model and the locally refined "child." In the present work, these two approaches are tested on a mining impact assessment case in southwest Florida using a large regional dataset (The Lower West Coast Surficial Aquifer System Model). Various metrics for performance are considered. They include: computation time, water balance (as compared to the variable sized grid), calibration, implementation effort, and application advantages and limitations. The results indicate that MODFLOW-LGR is a useful tool to improve local resolution of regional scale models. While performance metrics, such as computation time, are case-dependent (model size, refinement level, stresses involved), implementation effort, particularly when regional models of suitable scale are available, can be minimized. The creation of multiple child models within a larger scale parent model makes it possible to reuse the same calibrated regional dataset with minimal modification. In cases similar to the Lower West Coast model, where a model is larger than optimal for direct application as a parent grid, a combination of TMR and LGR approaches should be used to develop a suitable parent grid.
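A minimal sketch of the TMR step described above (deriving child-model boundary conditions from regional-model output) is given below, assuming a hypothetical parent grid, a synthetic head field, and a 10x refined child window. It interpolates parent heads onto the child-grid perimeter, which is conceptually what a TMR extraction does before the local model is refined and recalibrated; it is not MODFLOW or MODFLOW-LGR code.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical parent grid: 100 x 100 cells, 1000 m spacing, synthetic head field (m).
nx = ny = 100
dx = 1000.0
xc = (np.arange(nx) + 0.5) * dx          # parent cell-center coordinates
yc = (np.arange(ny) + 0.5) * dx
X, Y = np.meshgrid(xc, yc, indexing="ij")
heads = 50.0 - 0.0002 * X + 0.0001 * Y   # synthetic regional head surface

interp = RegularGridInterpolator((xc, yc), heads)

# Child (local) grid: a 10 km x 10 km window refined 10x (100 m cells).
x0, y0, ratio = 40000.0, 40000.0, 10
child_dx = dx / ratio
cx = x0 + (np.arange(100) + 0.5) * child_dx
cy = y0 + (np.arange(100) + 0.5) * child_dx

# Specified-head boundary for the child model: interpolate parent heads onto the
# child perimeter cells (west, east, south, north edges).
edges = ([(cx[0], y) for y in cy] + [(cx[-1], y) for y in cy]
         + [(x, cy[0]) for x in cx] + [(x, cy[-1]) for x in cx])
boundary_heads = interp(np.array(edges))
print(boundary_heads[:5])
```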
The Amber Biomolecular Simulation Programs
CASE, DAVID A.; CHEATHAM, THOMAS E.; DARDEN, TOM; GOHLKE, HOLGER; LUO, RAY; MERZ, KENNETH M.; ONUFRIEV, ALEXEY; SIMMERLING, CARLOS; WANG, BING; WOODS, ROBERT J.
2006-01-01
We describe the development, current features, and some directions for future development of the Amber package of computer programs. This package evolved from a program that was constructed in the late 1970s to do Assisted Model Building with Energy Refinement, and now contains a group of programs embodying a number of powerful tools of modern computational chemistry, focused on molecular dynamics and free energy calculations of proteins, nucleic acids, and carbohydrates. PMID:16200636
LSSA (Low-cost Silicon Solar Array) project
NASA Technical Reports Server (NTRS)
1976-01-01
Methods are explored for economically generating electrical power to meet future requirements. The Low-Cost Silicon Solar Array Project (LSSA) was established to reduce the price of solar arrays by improving manufacturing technology, adapting mass production techniques, and promoting user acceptance. The new manufacturing technology includes the consideration of new silicon refinement processes, silicon sheet growth techniques, encapsulants, and automated assembly production being developed under contract by industries and universities.
Laser Imaging Video Camera Sees Through Fire, Fog, Smoke
NASA Technical Reports Server (NTRS)
2015-01-01
Under a series of SBIR contracts with Langley Research Center, inventor Richard Billmers refined a prototype for a laser imaging camera capable of seeing through fire, fog, smoke, and other obscurants. Now, Canton, Ohio-based Laser Imaging through Obscurants (LITO) Technologies Inc. is demonstrating the technology as a perimeter security system at Glenn Research Center and planning its future use in aviation, shipping, emergency response, and other fields.
Proposals for the Future of JCAS Doctrine
2008-01-01
close proximity to friendly forces. GPS-equipped aircraft and munitions, laser range finders/designators and digital system capabilities are... to be refined. An over-reliance on technology in an evolving Joint Close Air Support (JCAS) dogma increases airpower’s risk of fratricide to friendly... (Report contents: Executive Summary; Introduction; History of Close Air Support; Current Views on JCAS; JCAS Doctrine; Conclusion; Bibliography.)
Stenehjem, Jo S; Friesen, Melissa C; Eggen, Tone; Kjærheim, Kristina; Bråtveit, Magne; Grimsrud, Tom K
2016-01-01
The objective of this study was to examine self-reported frequency of occupational exposure reported by 28,000 Norwegian offshore oil workers in a 1998 survey. Predictors of self-reported exposure frequency were identified to aid future refinements of an expert-based job-exposure-time matrix (JEM). We focus here on reported frequencies for skin contact with oil and diesel, exposure to oil vapor from shaker, to exhaust fumes, vapor from mixing chemicals used for drilling, natural gas, chemicals used for water injection and processing, and to solvent vapor. Exposure frequency was reported by participants as the exposed proportion of the work shift, defined by six categories, in their current or last position offshore (between 1965 and 1999). Binary Poisson regression models with robust variance were used to examine the probabilities of reporting frequent exposure (≥¼ vs. <¼ of work shift) according to main activity, time period, supervisory position, type of company, type of installation, work schedule, and education. Holding a non-supervisory position, working shifts, being employed in the early period of the offshore industry, and having only compulsory education increased the probability of reporting frequent exposure. The identified predictors and group-level patterns may aid future refinement of the JEM previously developed for the present cohort. PMID:25671393
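For readers unfamiliar with the statistical approach mentioned above, the following is a minimal sketch of a "modified" Poisson regression for a binary outcome (log link with a robust sandwich variance), which yields prevalence ratios rather than odds ratios. The data are synthetic stand-ins and the variable names are hypothetical, not the survey's coding.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in data (not the survey data): binary outcome = frequent exposure.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "supervisor": rng.integers(0, 2, n),
    "shift_work": rng.integers(0, 2, n),
    "early_period": rng.integers(0, 2, n),
})
logit = -1.0 - 0.5 * df["supervisor"] + 0.4 * df["shift_work"] + 0.6 * df["early_period"]
df["frequent_exposure"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Poisson regression on a binary outcome with robust (sandwich) variance.
X = sm.add_constant(df[["supervisor", "shift_work", "early_period"]])
model = sm.GLM(df["frequent_exposure"], X, family=sm.families.Poisson())
result = model.fit(cov_type="HC0")

print(np.exp(result.params))        # prevalence ratios
print(np.exp(result.conf_int()))    # 95% confidence intervals
```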
Shang, Yizi; Lu, Shibao; Gong, Jiaguo; Shang, Ling; Li, Xiaofei; Wei, Yongping; Shi, Hongwang
2017-12-01
A recent study decomposed the changes in industrial water use into three hierarchies (output, technology, and structure) using a refined Laspeyres decomposition model, and found monotonic and exclusive trends in the output and technology hierarchies. Based on that research, this study proposes a hierarchical prediction approach to forecast future industrial water demand. Three water demand scenarios (high, medium, and low) were then established based on potential future industrial structural adjustments, and used to predict water demand for the structural hierarchy. The predictive results of this approach were compared with results from a grey prediction model (GPM(1,1)). The comparison shows that the results of the two approaches were basically identical, differing by less than 10%. Taking Tianjin, China, as a case, and using data from 2003-2012, this study predicts that industrial water demand will continuously increase, reaching 580 million m³, 776.4 million m³, and approximately 1.09 billion m³ by the years 2015, 2020 and 2025, respectively. It is concluded that Tianjin will soon face another water crisis if no immediate measures are taken. This study recommends that Tianjin adjust its industrial structure with water savings as the main objective, and actively seek new sources of water to increase its supply.
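For context, a minimal GM(1,1)-style grey forecasting routine of the kind referred to above can be sketched as follows. The water-use series is hypothetical and the implementation is a generic textbook version, not the model used in the study.

```python
import numpy as np

def gm11_forecast(x0, n_ahead):
    """Minimal GM(1,1) grey model: fit on series x0 and forecast n_ahead steps."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                  # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # develop / grey input coefficients
    k = np.arange(len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # fitted accumulated series
    x0_hat = np.diff(x1_hat, prepend=x1_hat[0])         # inverse AGO
    x0_hat[0] = x0[0]
    return x0_hat[len(x0):]

# Hypothetical annual industrial water use (million m^3); not the Tianjin data.
history = [455, 470, 482, 500, 515, 528, 540, 553]
print(gm11_forecast(history, n_ahead=3))
```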
The value of continuity: Refined isogeometric analysis and fast direct solvers
Garcia, Daniel; Pardo, David; Dalcin, Lisandro; ...
2016-08-24
Here, we propose the use of highly continuous finite element spaces interconnected with low continuity hyperplanes to maximize the performance of direct solvers. Starting from a highly continuous Isogeometric Analysis (IGA) discretization, we introduce C⁰-separators to reduce the interconnection between degrees of freedom in the mesh. By doing so, both the solution time and best approximation errors are simultaneously improved. We call the resulting method “refined Isogeometric Analysis (rIGA)”. To illustrate the impact of the continuity reduction, we analyze the number of Floating Point Operations (FLOPs), computational times, and memory required to solve the linear system obtained by discretizing the Laplace problem with structured meshes and uniform polynomial orders. Theoretical estimates demonstrate that an optimal continuity reduction may decrease the total computational time by a factor between p² and p³, with p being the polynomial order of the discretization. Numerical results indicate that our proposed refined isogeometric analysis delivers a speed-up factor proportional to p². In a 2D mesh with four million elements and p=5, the linear system resulting from rIGA is solved 22 times faster than the one from highly continuous IGA. In a 3D mesh with one million elements and p=3, the linear system is solved 15 times faster for the refined than the maximum continuity isogeometric analysis.
NASA Astrophysics Data System (ADS)
Apel, M.; Eiken, J.; Hecht, U.
2014-02-01
This paper aims at briefly reviewing phase field models applied to the simulation of heterogeneous nucleation and subsequent growth, with special emphasis on grain refinement by inoculation. The spherical cap and free growth model (e.g. A.L. Greer, et al., Acta Mater. 48, 2823 (2000)) has proven its applicability for different metallic systems, e.g. Al or Mg based alloys, by computing the grain refinement effect achieved by inoculation of the melt with inert seeding particles. However, recent experiments with peritectic Ti-Al-B alloys revealed that the grain refinement by TiB2 is less effective than predicted by the model. Phase field simulations can be applied to validate the approximations of the spherical cap and free growth model, e.g. by computing explicitly the latent heat release associated with different nucleation and growth scenarios. Here, simulation results for point-shaped nucleation, as well as for partially and completely wetted plate-like seed particles will be discussed with respect to recalescence and impact on grain refinement. It will be shown that particularly for large seeding particles (up to 30 μm), the free growth morphology clearly deviates from the assumed spherical cap and the initial growth - until the free growth barrier is reached - significantly contributes to the latent heat release and determines the recalescence temperature.
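The free-growth criterion of the cited Greer et al. model can be written as ΔT_fg = 4γ/(ΔS_v·d): a seed particle of diameter d only initiates free growth once the melt undercooling exceeds that barrier. The sketch below evaluates it with illustrative property values loosely appropriate for an Al melt; the numbers are assumptions, not values from the paper.

```python
# Free-growth criterion of the Greer et al. model cited above:
# a particle of diameter d can only seed free growth once the melt
# undercooling exceeds dT_fg = 4*gamma / (dS_v * d).
gamma = 0.158      # solid-liquid interfacial energy, J/m^2 (illustrative value for Al)
dS_v = 1.0e6       # entropy of fusion per unit volume, J/(m^3 K) (illustrative)

def free_growth_undercooling(d_particle_m):
    return 4.0 * gamma / (dS_v * d_particle_m)

for d_um in (0.5, 1.0, 5.0, 30.0):
    dT = free_growth_undercooling(d_um * 1e-6)
    print(f"d = {d_um:5.1f} um  ->  dT_fg ~ {dT:.2f} K")
```

The trend (large seed particles becoming active at very small undercoolings) is what makes the latent-heat release and recalescence behaviour of large plate-like particles, discussed above, worth checking against full phase field simulations.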
Negotiating plausibility: intervening in the future of nanotechnology.
Selin, Cynthia
2011-12-01
The national-level scenarios project NanoFutures focuses on the social, political, economic, and ethical implications of nanotechnology, and is initiated by the Center for Nanotechnology in Society at Arizona State University (CNS-ASU). The project involves novel methods for the development of plausible visions of nanotechnology-enabled futures, elucidates public preferences for various alternatives, and, using such preferences, helps refine future visions for research and outreach. In doing so, the NanoFutures project aims to address a central question: how to deliberate the social implications of an emergent technology whose outcomes are not known. The solution pursued by the NanoFutures project is twofold. First, NanoFutures limits speculation about the technology to plausible visions. This ambition introduces a host of concerns about the limits of prediction, the nature of plausibility, and how to establish plausibility. Second, it subjects these visions to democratic assessment by a range of stakeholders, thus raising methodological questions as to who are relevant stakeholders and how to activate different communities so as to engage the far future. This article makes the dilemmas posed by decisions about such methodological issues transparent and therefore articulates the role of plausibility in anticipatory governance.
Romig, Barbara D; Tucker, Ann W; Hewitt, Anne M; O'Sullivan Maillet, Julie
2016-01-01
Allied health (AH) clinical education provides future health professionals with the experiences necessary to develop the healthcare competencies required for success in their individual fields. There is limited information and consensus on the purposes of clinical education, including its definition and goals, and its comprehensive role in AH clinical training. This study explored whether consensus could be achieved in the definition, goals, and factors impacting AH clinical education. An expert panel consisting of 61 AH deans (54.9% of the population) whose institutions were 2013 members of the Association of Schools of Allied Health Professions (ASAHP) participated in a three-round Delphi study. From July 2013 to March 2014, the deans expressed opinions about clinical education and its purposes. Responses were collected, summarized, and refined, and responses were accepted and re-rated until agreement was achieved or the study concluded. The hypothesis that AH deans would agree upon the definition and goals of clinical education was supported by this study's findings. Over 90% of deans "strongly agreed" or "agreed" on the definition of clinical education. A majority (90.2% to 92.7%) agreed with the goals. High agreement was achieved on the purposes of clinical education, resulting in a comprehensive definition of and goals for AH clinical education. The definition and goals of clinical education can be added in the healthcare literature and used in support of AH education.
Linder, Greg L.; Brumbaugh, William G.; Neitlich, Peter; Little, Edward
2013-01-01
To protect important resources under their bureau’s purview, the United States National Park Service’s (NPS) Arctic Network (ARCN) has developed a series of “vital signs” that are to be periodically monitored. One of these vital signs focuses on wet and dry deposition of atmospheric chemicals and, further, the establishment of critical load (CL) values (thresholds for ecological effects based on cumulative depositional loadings) for nitrogen (N), sulfur, and metals. As part of the ARCN terrestrial monitoring programs, samples of the feather moss Hylocomium splendens are being collected and analyzed as a cost-effective means to monitor atmospheric pollutant deposition in this region. Ultimately, moss data combined with refined CL values might be used to help guide future regulation of atmospheric contaminant sources potentially impacting Arctic Alaska. But first, additional long-term studies are needed to determine patterns of contaminant deposition as measured by moss biomonitors and to quantify ecosystem responses at particular loadings/ranges of contaminants within Arctic Alaska. Herein we briefly summarize 1) current regulatory guidance related to CL values, 2) derivation of CL models for N and metals, 3) use of mosses as biomonitors of atmospheric deposition and loadings, 4) preliminary analysis of vulnerabilities and risks associated with CL estimates for N, 5) preliminary analysis of existing data for characterization of CL values for N for interior Alaska, and 6) implications for managers and future research needs.
Climatic changes, bioclimatic stages and flooding durations in relation with public health
NASA Astrophysics Data System (ADS)
Sandoz, A.; Roumieux, C.; Trouillet, A.
2009-12-01
Climatic change, and more generally global change, plays a major role in environmental modifications related to public health. Changes in temperature and precipitation influence ecological habitats, which can become suitable for animal species involved in emerging pandemics. For some of these pandemics, mosquitoes and birds are the essential elements of virus transmission, and the abundance of mosquito and bird species is heavily conditioned by favorable ecological habitats and by the extent and variation of flooded areas. The study presented here was carried out in the south of France. We show the present status of ecological habitats and flooding durations under the current climate, and we refine spatial knowledge of the Mediterranean basin with current data. We show the evolution of the climate and its consequences for bioclimatic stages, using WorldClim data and IPCC scenarios. We then assess the environmental impact for certain viruses such as West Nile virus, which affects birds and horses and spreads to humans (West Nile virus appeared in the USA in 1999; between 1999 and 2007 there were about 27,000 human cases, including 1,050 deaths). The presence of the virus is conditioned by different factors, primarily the distribution of its mosquito vectors. Using the latest remote sensing and spatial analysis techniques, we show how favorable areas for the virus can be located and its future expansion areas predicted, and we present maps of projected variations in bioclimatic stages. These maps may serve as valuable tools to help decision makers faced with mosquito-related problems.
Spitzer Transits of New TESS Planets
NASA Astrophysics Data System (ADS)
Crossfield, Ian; Werner, Michael; Dragomir, Diana; Kreidberg, Laura; Benneke, Bjoern; Deming, Drake; Gorjian, Varoujan; Guo, Xueying; Dressing, Courtney; Yu, Liang; Kane, Stephen; Christiansen, Jessie; Berardo, David; Morales, Farisa
2018-05-01
TESS will soon begin searching the sky for new transiting planets around the nearest, brightest stars, and JWST will become the world-leading facility in exoplanet atmospheric characterization. A key TESS goal is to provide the best atmospheric targets to JWST. However, many new TESS planets will exhibit just a few transits each, so their transit ephemerides will be only weakly constrained; without additional constraints on the planet orbit, the transits will be quickly "lost" long before JWST transit spectroscopy can commence. Some TESS planets will also be good targets for JWST secondary eclipses observations, but these eclipses will be even harder to pin down from TESS data alone. Spitzer's IR sensitivity and photometric stability can identify the transits and eclipses of the most favorable TESS planets and set the stage for JWST atmospheric characterization on a large scale. We request 550 hr to use Spitzer to measure precise transits and eclipses of new planets from the first year of TESS, refining their properties and ensuring their transits and eclipses can be recovered for many years to come. We will focus on the smaller planets for which ground-based observations are impractical and for which JWST spectroscopy will have a high impact. The time baseline provided by Spitzer will pin down the ephemerides far into the future. Thus our proposed program will secure these planets for future JWST spectroscopy to reveal their atmospheric makeup, chemistry, cloud properties, and formation history in unprecedented detail.
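The underlying timing problem can be made concrete with the standard ephemeris error propagation, σ_Tc(E) = sqrt(σ_T0² + (E·σ_P)²), where E is the number of elapsed orbits. The sketch below uses hypothetical TESS-like uncertainties (not values from the proposal) to show how quickly a transit window degrades and how a single later-epoch observation tightens the period.

```python
import numpy as np

def mid_transit_uncertainty(sigma_t0_min, sigma_p_min, period_days, years_ahead):
    """Propagated 1-sigma mid-transit timing uncertainty after a given number of years."""
    epochs = years_ahead * 365.25 / period_days
    return np.sqrt(sigma_t0_min**2 + (epochs * sigma_p_min)**2)

# Hypothetical TESS-only ephemeris: sigma_T0 = 5 min, sigma_P = 1 min, P = 10 d.
print(mid_transit_uncertainty(5.0, 1.0, 10.0, years_ahead=3))   # ~110 min: transit "lost"

# One additional transit observed ~2 years later shrinks sigma_P to roughly the
# combined timing error divided by the number of elapsed epochs (illustrative only).
sigma_p_refined = np.sqrt(2) * 5.0 / (2 * 365.25 / 10.0)
print(mid_transit_uncertainty(5.0, sigma_p_refined, 10.0, years_ahead=3))
```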
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oubeidillah, Abdoul A; Kao, Shih-Chieh; Ashfaq, Moetasim
2014-01-01
To extend geographical coverage, refine spatial resolution, and improve modeling efficiency, a computation- and data-intensive effort was conducted to organize a comprehensive hydrologic dataset with post-calibrated model parameters for hydro-climate impact assessment. Several key inputs for hydrologic simulation including meteorologic forcings, soil, land class, vegetation, and elevation were collected from multiple best-available data sources and organized for 2107 hydrologic subbasins (8-digit hydrologic units, HUC8s) in the conterminous United States at a refined 1/24° (~4 km) spatial resolution. Using high-performance computing for intensive model calibration, a high-resolution parameter dataset was prepared for the macro-scale Variable Infiltration Capacity (VIC) hydrologic model. The VIC simulation was driven by DAYMET daily meteorological forcing and was calibrated against USGS WaterWatch monthly runoff observations for each HUC8. The results showed that this new parameter dataset may help reasonably simulate runoff at most US HUC8 subbasins. Based on this exhaustive calibration effort, it is now possible to accurately estimate the resources required for further model improvement across the entire conterminous United States. We anticipate that through this hydrologic parameter dataset, the repeated effort of fundamental data processing can be lessened, so that research efforts can emphasize the more challenging task of assessing climate change impacts. The pre-organized model parameter dataset will be provided to interested parties to support further hydro-climate impact assessment.
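Calibration against monthly runoff of the kind described above is typically scored with a goodness-of-fit metric such as the Nash-Sutcliffe efficiency. The abstract does not state which objective function was used, so the sketch below is only a generic illustration on synthetic monthly runoff, not VIC or WaterWatch data.

```python
import numpy as np

def nash_sutcliffe(simulated, observed):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, values below 0 are worse than the mean."""
    simulated = np.asarray(simulated, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Synthetic monthly runoff (mm) for one subbasin; illustrative only.
obs = np.array([12.0, 18.0, 35.0, 60.0, 80.0, 55.0, 30.0, 20.0, 15.0, 14.0, 13.0, 12.0])
sim = obs * 0.95 + np.random.default_rng(1).normal(0.0, 3.0, obs.size)
print(f"NSE = {nash_sutcliffe(sim, obs):.2f}")
```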
The influence of facility and home pen design on the welfare of the laboratory-housed dog.
Scullion Hall, Laura E M; Robinson, Sally; Finch, John; Buchanan-Smith, Hannah M
We have an ethical and scientific obligation to Refine all aspects of the life of the laboratory-housed dog. Across industry there are many differences amongst facilities, home pen design and husbandry, as well as differences in features of the dogs such as strain, sex and scientific protocols. Understanding how these influence welfare, and hence scientific output is therefore critical. A significant proportion of dogs' lives are spent in the home pen and as such, the design can have a considerable impact on welfare. Although best practice guidelines exist, there is a paucity of empirical evidence to support the recommended Refinements and uptake varies across industry. In this study, we examine the effect of modern and traditional home pen design, overall facility design, husbandry, history of regulated procedures, strain and sex on welfare-indicating behaviours and mechanical pressure threshold. Six groups of dogs from two facilities (total n=46) were observed in the home pen and tested for mechanical pressure threshold. Dogs which were housed in a purpose-built modern facility or in a modern design home pen showed the fewest behavioural indicators of negative welfare (such as alert or pacing behaviours) and more indicators of positive welfare (such as resting) compared to those in a traditional home pen design or traditional facility. Welfare indicating behaviours did not vary consistently with strain, but male dogs showed more negative welfare indicating behaviours and had greater variation in these behaviours than females. Our findings showed more positive welfare indicating behaviours in dogs with higher mechanical pressure thresholds. We conclude that factors relating to the design of home pens and implementation of Refinements at the facility level have a significant positive impact on the welfare of laboratory-housed dogs, with a potential concomitant impact on scientific endpoints. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Clinical Adoption of Prognostic Biomarkers: The Case for Heart Failure
Kalogeropoulos, Andreas P.; Georgiopoulou, Vasiliki V.; Butler, Javed
2013-01-01
The recent explosion of scientific knowledge and technological progress has led to the discovery of a large array of circulating molecules commonly referred to as biomarkers. Biomarkers in heart failure research have been used to provide pathophysiological insights, aid in establishing the diagnosis, refine prognosis, guide management, and target treatment. However, beyond diagnostic applications of natriuretic peptides, there are currently few widely recognized applications for biomarkers in heart failure. This represents a remarkable discordance considering the number of molecules that have been shown to correlate with outcomes, refine risk prediction, or track disease severity in heart failure in the past decade. In this article, we use a broad framework proposed for cardiovascular risk markers to summarize the current state of biomarker development for heart failure patients. We utilize this framework to identify the challenges of biomarker adoption for risk prediction, disease management, and treatment selection for heart failure and suggest considerations for future research. PMID:22824105
Solar breeder: Energy payback time for silicon photovoltaic systems
NASA Technical Reports Server (NTRS)
Lindmayer, J.
1977-01-01
The energy expenditures of the prevailing manufacturing technology of terrestrial photovoltaic cells and panels were evaluated, including silicon reduction, silicon refinement, crystal growth, cell processing and panel building. Energy expenditures include direct energy, indirect energy, and energy in the form of equipment and overhead expenses. Payback times were developed using a conventional solar cell as a test vehicle, which allows for the comparison of its energy generating capability with the energies expended during the production process. It was found that the energy payback time for a typical solar panel produced by the prevailing technology is 6.4 years. Furthermore, this value drops to 3.8 years under more favorable conditions. Moreover, since the major energy use reductions in terrestrial manufacturing have occurred in cell processing, this payback time directly illustrates the areas where major future energy reductions can be made: silicon refinement, crystal growth, and panel building.
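The payback arithmetic is straightforward: embodied manufacturing energy divided by the energy a panel delivers per year. The sketch below reproduces the idea with round illustrative numbers (not the report's accounting), which is why it lands near, but not exactly at, the quoted 6.4-year figure.

```python
# Energy payback time = total manufacturing energy / annual energy delivered by the panel.
# All numbers below are illustrative assumptions, not values from the LSSA report.
embodied_energy_kwh_per_m2 = 1200.0      # refinement + crystal growth + processing + panel
module_efficiency = 0.10
insolation_kwh_per_m2_year = 1900.0      # sunny site
performance_ratio = 0.8                  # system losses

annual_output = module_efficiency * insolation_kwh_per_m2_year * performance_ratio
payback_years = embodied_energy_kwh_per_m2 / annual_output
print(f"energy payback time ~ {payback_years:.1f} years")
```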
Radiation Resistant Electrical Insulation Materials for Nuclear Reactors: Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duckworth, Robert C.; Aytug, Tolga; Paranthaman, M. Parans
The instrument and control cables in future nuclear reactors will be exposed to temperatures, dose rates, and accumulated doses exceeding those originally anticipated for the 40-year operational life of the nuclear power plant fleet. The use of nanocomposite dielectrics as insulating material for such cables has been considered a route to performance improvement. In this project, nanoparticles were developed and successfully included in three separate material systems [cross-linked polyvinyl alcohol (PVA/XLPVA), cross-linked polyethylene (PE/XLPE), and polyimide (PI)], and the chemical, electrical, and mechanical performance of each was analyzed as a function of environmental exposure and composition. Improvements were found in each material system; however, refinement of each processing pathway is needed, and the consequences of these refinements in the context of thermal, radiation, and moisture exposures should be evaluated before transferring knowledge to industry.
Lipkin, W. Ian
2010-01-01
Summary: Platforms for pathogen discovery have improved since the days of Koch and Pasteur; nonetheless, the challenges of proving causation are at least as daunting as they were in the late 1800s. Although we will almost certainly continue to accumulate low-hanging fruit, where simple relationships will be found between the presence of a cultivatable agent and a disease, these successes will be increasingly infrequent. The future of the field rests instead in our ability to follow footprints of infectious agents that cannot be characterized using classical microbiological techniques and to develop the laboratory and computational infrastructure required to dissect complex host-microbe interactions. I have tried to refine the criteria used by Koch and successors to prove linkage to disease. These refinements are working constructs that will continue to evolve in light of new technologies, new models, and new insights. What will endure is the excitement of the chase. Happy hunting! PMID:20805403
Refining the ideas of "ethnic" skin.
Torres, Vicente; Herane, Maria Isabel; Costa, Adilson; Martin, Jaime Piquero; Troielli, Patricia
2017-01-01
Skin diseases occur worldwide, affecting people of all nationalities and all skin types. These diseases may have a genetic component and may manifest differently in specific population groups; however, there has been little study on this aspect. If population-based differences exist, it is reasonable to assume that understanding these differences may optimize treatment. While there is a relative paucity of information about similarities and differences in skin diseases around the world, the knowledge-base is expanding. One challenge in understanding population-based variations is posed by the terminology used in the literature, including ethnic skin, Hispanic skin, Asian skin, and skin of color. As will be discussed in this article, we recommend that the first three descriptors no longer be used in dermatology because they refer to nonspecific groups of people. In contrast, "skin of color" may be used - perhaps with further refinements in the future - as a term that relates to skin biology and provides relevant information to dermatologists.
NASA Astrophysics Data System (ADS)
McCall, N.; Gulick, S. P. S.; Morgan, J. V.; Hall, B. J.; Jones, L.; Expedition 364 Science Party, I. I.
2017-12-01
During Expedition 364, IODP/ICDP drilled the peak ring of the Chicxulub impact crater at Site M0077, recovering core from 505.7 to 1334.7 mbsf. The core has been imaged via X-ray Computer Tomography (CT) as a noninvasive method to create a 3-dimensional model of the core, providing information on the density and internal structure at a 0.3 mm resolution. Results from the expedition show that from 748 mbsf and deeper the peak ring is largely composed of uplifted and fractured granitic basement rocks originally sourced from approximately 8-10 km depth. Impact crater modeling suggests the peak ring was formed through dynamic collapse of a rebounding central peak within 10 minutes of impact, requiring the target rocks to temporarily behave as a viscous fluid. The newly recovered core provides a rare opportunity to investigate the cratering process, specifically how the granite was weakened, as well as the extent of the hydrothermal system created after the impact. Based on the CT data, we identify four classes of fractures based on their CT facies deforming the granitoids: pervasive fine fractures, discrete fine fractures, discrete filled fractures, and discrete open fractures. Pervasive fine fractures were most commonly found proximal to dikes and impact melt rock. Discrete filled fractures often displayed a cataclastic texture. We present density trends for the different facies and compare these to petrophysical properties (density, NGR, P-wave seismic velocity). Fractured areas have a lower density than the surrounding granite, as do most filled fractures. This reduction suggests that fluid migrating through the peak ring in the wake of the impact either deposited lower density minerals within the fractures and/or altered the original fracture fill. The extent and duration of fluid flow recorded in these fractures will assist in the characterization of the post-impact hydrothermal system. Future work includes combining information from CT images with thin sections and plug samples at similar depths, refinement of CT facies characterization, examining cross-cutting relationships to determine timing constraints of deformation processes, and measurement of the orientation of the fractures.
Hester, Katy L M; Newton, Julia; Rapley, Tim; De Soyza, Anthony
2016-04-23
There is currently little patient information on bronchiectasis, a chronic lung disease with rising prevalence. Previous work shows that patients and their families want more information, which could potentially improve their understanding and self-management. Using interviews and focus groups, we have co-developed a novel patient and carer information resource, aiming to meet their identified needs. The aims and objectives are: 1. To assess the potential impact of the information resource 2. To evaluate and refine the intervention 3. To establish the feasibility of carrying out a multi-centre randomised controlled trial to determine its effect on understanding, self-management and health outcomes This is a feasibility study, with a single-centre, randomised controlled trial design, comparing use of a novel patient information resource to usual care in bronchiectasis. Additionally, patients and carers will be invited to focus groups to discuss their views on both the intervention itself and the trial process. The study duration for each participant will be 3 months from the study entry date. A total of 70 patients will be recruited to the study, and a minimum of 30 will be randomised to each arm. Ten participants (and their carers if applicable) will be invited to attend focus groups on completion of the study visits. Participants will be adults with bronchiectasis diagnosed as per national bronchiectasis guidelines. Once consented, participants will be randomised to the intervention or control arm using random permuted blocks to ensure treatment group numbers are evenly balanced. Randomisation will be web-based. Those randomised to the intervention will receive the information resource (website and booklet) and instructions on its use. Outcome measures (resource satisfaction, resource use and alternative information seeking, quality of life questionnaires, unscheduled healthcare visits, exacerbation frequency, bronchiectasis knowledge questionnaire and lung function tests) will be recorded at baseline, 2 weeks and 3 months. All outcome measures will be used in assessing feasibility and acceptability of a future definitive trial. Feasibility outcomes include recruitment, retention and study scale form completion rates. Focus groups will strengthen qualitative data for resource refinement and to identify participant views on the trial process, which will also inform feasibility assessments. Questionnaires will also be used to evaluate and refine the resource. ISRCTN84229105.
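The randomisation step described above (random permuted blocks keeping arm sizes balanced) can be sketched as follows; the block size, arm labels, and seed are hypothetical choices, not details of the trial's web-based system.

```python
import random

def permuted_block_allocations(n_participants, block_size=4,
                               arms=("intervention", "control"), seed=42):
    """Allocation list built from random permuted blocks so arms stay close to balanced."""
    assert block_size % len(arms) == 0
    rng = random.Random(seed)
    allocations = []
    while len(allocations) < n_participants:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)                 # permute each block independently
        allocations.extend(block)
    return allocations[:n_participants]

schedule = permuted_block_allocations(70)
print(schedule[:8], schedule.count("intervention"), schedule.count("control"))
```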
Andrejasic, Miha; Praaenikar, Jure; Turk, Dusan
2008-11-01
The number and variety of macromolecular structures in complex with 'hetero' ligands are growing. The need for rapid delivery of correct geometric parameters for their refinement, which is often crucial for understanding the biological relevance of the structure, is growing correspondingly. The current standard for describing protein structures is the Engh-Huber parameter set. It is an expert data set resulting from selection and analysis of the crystal structures gathered in the Cambridge Structural Database (CSD). Clearly, such a manual approach cannot be applied to the vast and ever-growing number of chemical compounds. Therefore, a database, named PURY, of geometric parameters of chemical compounds has been developed, together with a server that accesses it. PURY is a compilation of the whole CSD. It contains lists of atom classes and bonds connecting them, as well as angle, chirality, planarity and conformation parameters. The current compilation is based on CSD 5.28 and contains 1978 atom classes and 32,702 bonding, 237,068 angle, 201,860 dihedral and 64,193 improper geometric restraints. Analysis has confirmed that the restraints from the PURY database are suitable for use in macromolecular crystal structure refinement and should be of value to the crystallographic community. The database can be accessed through the web server http://pury.ijs.si/, which creates topology and parameter files from deposited coordinates in suitable forms for the refinement programs MAIN, CNS and REFMAC. In the near future, the server will move to the CSD website http://pury.ccdc.cam.ac.uk/.
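Conceptually, a geometry-parameter database of this kind reduces many CSD observations of one atom-class pair to a restraint target and standard deviation. A minimal sketch of that reduction on synthetic bond-length samples (not PURY or CSD data) is given below.

```python
import numpy as np

def bond_restraint(observed_lengths_A):
    """Turn observations of one atom-class pair into a (target, sigma) restraint."""
    lengths = np.asarray(observed_lengths_A, dtype=float)
    return lengths.mean(), lengths.std(ddof=1)

# Synthetic stand-in for database hits of a single bond class (angstroms).
rng = np.random.default_rng(7)
samples = rng.normal(1.523, 0.012, size=350)      # roughly Csp3-Csp3-like, illustrative
target, sigma = bond_restraint(samples)
print(f"restraint: target = {target:.3f} A, sigma = {sigma:.3f} A")
```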
Blaisdell, Aaron P; Lau, Yan Lam Matthew; Telminova, Ekatherina; Lim, Hwee Cheei; Fan, Boyang; Fast, Cynthia D; Garlick, Dennis; Pendergrass, David C
2014-04-10
Purified high-fat diet (HFD) feeding causes deleterious metabolic and cognitive effects when compared with unrefined low-fat diets in rodent models. These effects are often attributed to the diet's high content of fat, while less attention has been paid to other mechanisms associated with the diet's highly refined state. Although the effects of HFD feeding on cognition have been explored, little is known about the impact of refined vs. unrefined food on cognition. We tested the hypothesis that a refined low-fat diet (LFD) increases body weight and adversely affects cognition relative to an unrefined diet. Rats were allowed ad libitum access to unrefined rodent chow (CON, Lab Diets 5001) or a purified low-fat diet (REF, Research Diets D12450B) for 6 months, and body weight and performance on an instrumental lever pressing task were recorded. After six months on their respective diets, group REF gained significantly more weight than group CON. REF rats made significantly fewer lever presses and exhibited dramatically lower breaking points than CON rats for sucrose and water reinforcement, indicating a chronic reduction of motivation for instrumental performance. Switching the rats' diet for 9 days had no effect on these measures. Diet-induced obesity produces a substantial deficit in motivated behavior in rats, independent of dietary fat content. This holds implications for an association between obesity and motivation. Specifically, behavioral traits comorbid with obesity, such as depression and fatigue, may be effects of obesity rather than contributing causes. To the degree that refined foods contribute to obesity, as demonstrated in our study, they may play a significant contributing role to other behavioral and cognitive disorders. Copyright © 2014 Elsevier Inc. All rights reserved.
Maillot, N; Guenancia, C; Yameogo, N V; Gudjoncik, A; Garnier, F; Lorgis, L; Chagué, F; Cottin, Y
2018-02-01
To interpret the electrocardiogram (ECG) of athletes, the recommendations of the ESC and the Seattle criteria define type 1 peculiarities, those induced by training, and type 2, those not induced by training, to rule out cardiomyopathy. The specificity of the screening was improved by Sheikh who defined "Refined Criteria," which includes a group of intermediate peculiarities. The aim of our study was to investigate the influence of static and dynamic components on the prevalence of different types of abnormalities. The ECGs of 1030 athletes performed during preparticipation screening were interpreted using these three classifications. Our work revealed 62/16%, 69/13%, and 71/7% of type 1 peculiarities and type 2 abnormalities for the ESC, Seattle, and Refined Criteria algorithms, respectively(P<.001). For type 2 abnormalities, three independent factors were found for the ESC and Seattle criteria: age, Afro-Caribbean origin, and the dynamic component with, for the latter, an OR[95% CI] of 2.35[1.28-4.33] (P=.006) and 1.90[1.03-3.51] (P=.041), respectively. In contrast, only the Afro-Caribbean origin was associated with type 2 abnormalities using the Refined Criteria: OR[95% CI] 2.67[1.60-4.46] (P<.0001). The Refined Criteria classified more athletes in the type 1 category and fewer in the type 2 category compared with the ESC and Seattle algorithms. Contrary to previous studies, a high dynamic component was not associated with type 2 abnormalities when the Refined Criteria were used; only the Afro-Caribbean origin remained associated. Further research is necessary to better understand adaptations with regard to duration and thus improve the modern criteria for ECG screening in athletes. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Rout, Bapin Kumar; Brooks, Geoff; Rhamdhani, M. Akbar; Li, Zushu; Schrama, Frank N. H.; Sun, Jianjun
2018-04-01
A multi-zone kinetic model coupled with a dynamic slag generation model was developed for the simulation of hot metal and slag composition during basic oxygen furnace (BOF) operation. Three reaction zones, (i) the jet impact zone, (ii) the slag-bulk metal zone, and (iii) the slag-metal-gas emulsion zone, were considered in the calculation of overall refining kinetics. In the rate equations, the transient rate parameters were mathematically described as functions of process variables. Micro- and macroscopic rate calculation methodologies (micro-kinetics and macro-kinetics) were developed to estimate the total refining contributed by the recirculating metal droplets passing through the slag-metal emulsion zone. The micro-kinetics involves developing the rate equation for individual droplets in the emulsion. Mathematical models for the size distribution of initial droplets, the kinetics of simultaneous refining of elements, the residence time in the emulsion, and the dynamic change in interfacial area were established in the micro-kinetic model. In the macro-kinetic calculation, a droplet generation model was employed and the total amount of refining by the emulsion was calculated by summing the refining from the entire population of returning droplets. A dynamic FetO generation model based on an oxygen mass balance was developed and coupled with the multi-zone kinetic model. The effect of post-combustion on the evolution of slag and metal composition was investigated. The model was applied to a 200-ton top-blowing converter, and the simulated metal and slag compositions were found to be in good agreement with measured data. The post-combustion ratio was found to be an important factor controlling the FetO content of the slag and the kinetics of Mn and P refining in the BOF process.
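For readers unfamiliar with the micro/macro-kinetic bookkeeping described above, the sketch below illustrates the general idea of summing refining contributions over a population of ejected droplets. It is a toy illustration only: the droplet size and residence-time distributions, the first-order mass-transfer coefficient, and all compositions are invented placeholders, not the rate formulations used by the authors.

```python
# Toy sketch of the macro-kinetic summation idea: total refining by the
# emulsion is obtained by summing the contribution of every returning droplet.
# All distributions, rate constants and compositions are assumed placeholders.
import numpy as np

rng = np.random.default_rng(0)

n_droplets = 10_000                                       # droplets ejected in one time step (assumed)
d = rng.lognormal(mean=-7.0, sigma=0.5, size=n_droplets)  # droplet diameters [m]
tau = rng.gamma(shape=2.0, scale=2.0, size=n_droplets)    # residence times in the emulsion [s]

k = 0.05       # assumed first-order mass-transfer coefficient [1/s]
c_bulk = 0.10  # bulk solute (e.g., P) content of hot metal [wt%] (assumed)
c_eq = 0.01    # assumed slag/metal equilibrium content [wt%]

# "Micro-kinetics": composition of a droplet when it returns to the bath,
# modelled as a first-order approach to equilibrium over its residence time.
c_return = c_eq + (c_bulk - c_eq) * np.exp(-k * tau)

# "Macro-kinetics": total solute mass removed by the whole droplet population.
rho = 7000.0                                # liquid metal density [kg/m^3]
m_droplet = rho * np.pi / 6.0 * d**3        # droplet masses [kg]
removed_kg = np.sum(m_droplet * (c_bulk - c_return) / 100.0)
print(f"solute removed by the emulsion in this step: {removed_kg:.3e} kg")
```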
Private initiatives and policy options: recent health system experience in India.
Purohit, B C
2001-03-01
In the recent past the impact of structural adjustment in the Indian health care sector has been felt in the reduction in central grants to States for public health and disease control programmes. This falling share of central grants has had a more pronounced impact on the poorer states, which have found it more difficult to raise local resources to compensate for this loss of revenue. With the continued pace of reforms, the likelihood of increasing State expenditure on the health care sector is limited in the future. As a result, a number of notable trends are appearing in the Indian health care sector. These include an increasing investment by non-resident Indians (NRIs) in the hospital industry, leading to a spurt in corporatization in the States of their original domicile, and an increasing participation by multinational companies in diagnostics aiming to capture the potential of the Indian health insurance market. The policy responses to these private initiatives are reflected in measures comprising strategies to attract private sector participation and management inputs into primary health care centres (PHCs), privatization or semi-privatization of public health facilities such as non-clinical services in public hospitals, innovative ways of financing public health facilities through non-budgetary measures, and tax incentives by the State governments to encourage private sector investment in the health sector. Bearing in mind the vital importance of such market forces and policy responses in shaping the future health care scenario in India, this paper examines in detail both of these aspects and their implications for the Indian health care sector. The analysis indicates that despite the promising newly emerging atmosphere, there are limits to market forces; appropriate refinement in the role of government should be attempted to avoid undesirable consequences of rising costs, increasing inequity and consumer exploitation. This may require opening the health insurance market to multinational companies, the proper channelling of tax incentives to set up medical institutions in backward areas, and reinforcing appropriate regulatory mechanisms.
Post-wildfire soil erosion in the Mediterranean: Review and future research directions
NASA Astrophysics Data System (ADS)
Shakesby, R. A.
2011-04-01
Wildfires increased dramatically in frequency and extent in the European Mediterranean region from the 1960s, aided by a general warming and drying trend, but driven primarily by socio-economic changes, including rural depopulation, land abandonment and afforestation with flammable species. Published research into post-wildfire hydrology and soil erosion, beginning during the 1980s in Spain, has been followed by studies in other European Mediterranean countries together with Israel and has now attained a sufficiently large critical mass to warrant a major review. Although variations in climate, vegetation, soil, topography and fire severity cause differences in Mediterranean post-wildfire erosion, the long history of human landscape impact up to the present day is responsible for some of its distinctive characteristics. This paper highlights these characteristics in reviewing wildfire impacts on hydrology, soil properties and soil erosion by water. The 'mosaic' nature of many Mediterranean landscapes (e.g. an intricate land-use pattern, abandoned terraces and tracks interrupting slopes) may explain sometimes conflicting post-fire hydrological and erosional responses at different sites and spatial scales. First-year post-wildfire soil losses at the point scale (average 45-56 t ha^-1) and plot scale (many < 1 t ha^-1 and the majority < 10 t ha^-1) are similar to or even lower than those reported for fire-affected land elsewhere or other disturbed (e.g. cultivated) and natural poorly-vegetated (e.g. badlands, rangeland) land in the Mediterranean. The few published losses at larger scales (hillslope and catchment) are variable. Thin soil and high stone content can explain supply-limited erosion preceding significant protection by recovering vegetation. Peak erosion can sometimes be delayed for years, largely through slow vegetation recovery and temporal variability of erosive storms. Preferential removal of organic matter and nutrients in the commonly thin, degraded soils is arguably just as important as, if not more important than, the total soil loss. Aspect is important, with more erosion reported for south- than north-facing slopes, which is attributed to greater fire frequency and slower vegetation recovery on south-facing slopes, together with soil characteristics more prone to erosion (e.g. lower aggregate stability). Post-fire wind erosion is a potentially important but largely neglected process. Gauging the degradational significance of wildfires has relied on comparison with unburnt land, but the focus for comparison should be switched to other agents of soil disturbance and/or currently poorly understood soil renewal rates. Human impact on land use and vegetation may alter expected effects (increased fire activity and post-wildfire erosion) arising from future climatic change. Different future wildfire mitigation responses and likely erosional consequences are outlined.
Research gaps are identified, and more research effort is suggested to: (1) improve assessment of post-wildfire erosion impact on soil fertility, through further quantification of soil nutrient depletion resulting from single and multiple fire cycles, and on soil longevity; (2) investigate prescribed fire impacts on carbon release, air pollution and nutrient losses as well as on soil loss; (3) isolate hillslope- and catchment-scale impacts of soil water repellency under Mediterranean post-wildfire conditions; (4) test and refine application of cosmogenic radionuclides to post-wildfire hillslope-scale soil redistribution at different temporal scales; (5) use better temporal resolution of sedimentary sequences to understand palaeofire-erosion-sedimentation links; (6) quantify post-wildfire wind erosion; (7) improve the integration of wildfire into an overall assessment of the processes and impacts of land degradation in the Mediterranean; and (8) raise public awareness of wildfire impact on soil degradation.
Refining Housing, Husbandry and Care for Animals Used in Studies Involving Biotelemetry
Hawkins, Penny
2014-01-01
Simple Summary Biotelemetry, the remote detection and measurement of an animal function or activity, is widely used in animal research. Biotelemetry devices transmit physiological or behavioural data and may be surgically implanted into animals, or externally attached. This can help to reduce animal numbers and improve welfare, e.g., if animals can be group housed and move freely instead of being tethered to a recording device. However, biotelemetry can also cause pain and distress to animals due to surgery, attachment, single housing and long term laboratory housing. This article explains how welfare and science can be improved by avoiding or minimising these harms. Abstract Biotelemetry can contribute towards reducing animal numbers and suffering in disciplines including physiology, pharmacology and behavioural research. However, the technique can also cause harm to animals, making biotelemetry a ‘refinement that needs refining’. Current welfare issues relating to the housing and husbandry of animals used in biotelemetry studies are single vs. group housing, provision of environmental enrichment, long term laboratory housing and use of telemetered data to help assess welfare. Animals may be singly housed because more than one device transmits on the same wavelength; due to concerns regarding damage to surgical sites; because they are wearing exteriorised jackets; or if monitoring systems can only record from individually housed animals. Much of this can be overcome by thoughtful experimental design and surgery refinements. Similarly, if biotelemetry studies preclude certain enrichment items, husbandry refinement protocols can be adapted to permit some environmental stimulation. Nevertheless, long-term laboratory housing raises welfare concerns and maximum durations should be defined. Telemetered data can be used to help assess welfare, helping to determine endpoints and refine future studies. The above measures will help to improve data quality as well as welfare, because experimental confounds due to physiological and psychological stress will be minimised. PMID:26480045
Identifying alternate pathways for climate change to impact inland recreational fishers
Hunt, Len M.; Fenichel, Eli P.; Fulton, David C.; Mendelsohn, Robert; Smith, Jordan W.; Tunney, Tyler D.; Lynch, Abigail J.; Paukert, Craig P.; Whitney, James E.
2016-01-01
Fisheries and human dimensions literature suggests that climate change influences inland recreational fishers in North America through three major pathways. The most widely recognized pathway suggests that climate change impacts habitat and fish populations (e.g., water temperature impacting fish survival) and cascades to impact fishers. Climate change also impacts recreational fishers by influencing environmental conditions that directly affect fishers (e.g., increased temperatures in northern climates resulting in extended open water fishing seasons and increased fishing effort). The final pathway occurs from climate change mitigation and adaptation efforts (e.g., refined energy policies result in higher fuel costs, making distant trips more expensive). To address limitations of past research (e.g., assessing climate change impacts for only one pathway at a time and not accounting for climate variability, extreme weather events, or heterogeneity among fishers), we encourage researchers to refocus their efforts to understand and document climate change impacts to inland fishers.
NASA Technical Reports Server (NTRS)
Mei, Chuh; Jaunky, Navin
1999-01-01
The goal of this research project is to develop a modelling and analysis strategy for the penetration of aluminium plates impacted by titanium impactors. Finite element analysis is used to study this penetration in order to assess the effect of such uncontained engine debris impacts on aircraft-like skin panels. LS-DYNA3D is used in the simulations to model the impactor, test fixture frame and target barrier plate. The effects of mesh refinement, contact modelling, and impactor initial velocity and orientation were studied. The research project also includes development of a design tool for optimum design of grid-stiffened non-circular shells or panels subjected to buckling.
Naturally occurring 32Si and low-background silicon dark matter detectors
Orrell, John L.; Arnquist, Isaac J.; Bliss, Mary; ...
2018-02-10
Here, the naturally occurring radioisotope 32Si represents a potentially limiting background in future dark matter direct-detection experiments. We investigate sources of 32Si and the vectors by which it comes to reside in silicon crystals used for fabrication of radiation detectors. We infer that the 32Si concentration in commercial single-crystal silicon is likely variable, dependent upon the specific geologic and hydrologic history of the source (or sources) of silicon “ore” and the details of the silicon-refinement process. The silicon production industry is large, highly segmented by refining step, and multifaceted in terms of final product type, from which we conclude that production of 32Si-mitigated crystals requires both targeted silicon material selection and a dedicated refinement-through-crystal-production process. We review options for source material selection, including quartz from an underground source and silicon isotopically reduced in 32Si. To quantitatively evaluate the 32Si content in silicon metal and precursor materials, we propose analytic methods employing chemical processing and radiometric measurements. Ultimately, it appears feasible to produce silicon detectors with low levels of 32Si, though significant assay method development is required to validate this claim and thereby enable a quality assurance program during an actual controlled silicon-detector production cycle.
Naturally occurring 32Si and low-background silicon dark matter detectors
NASA Astrophysics Data System (ADS)
Orrell, John L.; Arnquist, Isaac J.; Bliss, Mary; Bunker, Raymond; Finch, Zachary S.
2018-05-01
The naturally occurring radioisotope 32Si represents a potentially limiting background in future dark matter direct-detection experiments. We investigate sources of 32Si and the vectors by which it comes to reside in silicon crystals used for fabrication of radiation detectors. We infer that the 32Si concentration in commercial single-crystal silicon is likely variable, dependent upon the specific geologic and hydrologic history of the source (or sources) of silicon "ore" and the details of the silicon-refinement process. The silicon production industry is large, highly segmented by refining step, and multifaceted in terms of final product type, from which we conclude that production of 32Si-mitigated crystals requires both targeted silicon material selection and a dedicated refinement-through-crystal-production process. We review options for source material selection, including quartz from an underground source and silicon isotopically reduced in 32Si. To quantitatively evaluate the 32Si content in silicon metal and precursor materials, we propose analytic methods employing chemical processing and radiometric measurements. Ultimately, it appears feasible to produce silicon detectors with low levels of 32Si, though significant assay method development is required to validate this claim and thereby enable a quality assurance program during an actual controlled silicon-detector production cycle.
Dickinson, Jesse; Hanson, R.T.; Mehl, Steffen W.; Hill, Mary C.
2011-01-01
The computer program described in this report, MODPATH-LGR, is designed to allow simulation of particle tracking in locally refined grids. The locally refined grids are simulated by using MODFLOW-LGR, which is based on MODFLOW-2005, the three-dimensional groundwater-flow model published by the U.S. Geological Survey. The documentation includes brief descriptions of the methods used and detailed descriptions of the required input files and how the output files are typically used. The code for this model is available for downloading from the World Wide Web from a U.S. Geological Survey software repository. The repository is accessible from the U.S. Geological Survey Water Resources Information Web page at http://water.usgs.gov/software/ground_water.html. The performance of the MODPATH-LGR program has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program by using the email address available on the Web site. Updates might occasionally be made to this document and to the MODPATH-LGR program, and users should check the Web site periodically.
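As background, MODPATH-family codes compute pathlines with Pollock's semi-analytical scheme, which assumes a linear variation of each velocity component across a cell so that the exit face and travel time have closed-form expressions. The sketch below is a simplified two-dimensional, positive-flow-only illustration of that scheme under assumed cell geometry and face velocities; it is not code from MODPATH-LGR, and the handling of locally refined grids (passing particles between parent and child grids) is omitted.

```python
# Simplified 2-D illustration of Pollock's semi-analytical tracking step inside
# one rectangular cell, assuming flow in the +x and +y directions only.
import numpy as np

def pollock_step(xp, yp, cell, face_v):
    """Move a particle from (xp, yp) to the face where it exits the cell.

    cell   = (x0, x1, y0, y1): cell boundaries
    face_v = (vx0, vx1, vy0, vy1): face-normal velocities (all assumed > 0)
    """
    x0, x1, y0, y1 = cell
    vx0, vx1, vy0, vy1 = face_v

    # Linear variation of each velocity component across the cell
    Ax = (vx1 - vx0) / (x1 - x0)
    Ay = (vy1 - vy0) / (y1 - y0)
    vxp = vx0 + Ax * (xp - x0)
    vyp = vy0 + Ay * (yp - y0)

    def t_exit(A, v_out, v_p, dist):
        # Closed-form travel time to the outflow face in one direction
        return dist / v_p if abs(A) < 1e-12 else np.log(v_out / v_p) / A

    tx = t_exit(Ax, vx1, vxp, x1 - xp)
    ty = t_exit(Ay, vy1, vyp, y1 - yp)
    dt = min(tx, ty)  # particle leaves through the nearer face

    def advance(A, v0, v_p, p0, p, t):
        # Analytical position update for linearly varying (or uniform) velocity
        return p + v_p * t if abs(A) < 1e-12 else p0 + (v_p * np.exp(A * t) - v0) / A

    return advance(Ax, vx0, vxp, x0, xp, dt), advance(Ay, vy0, vyp, y0, yp, dt), dt

# Example: 10 m x 10 m cell with a mild velocity gradient across it
x_new, y_new, dt = pollock_step(2.0, 3.0, (0.0, 10.0, 0.0, 10.0), (1.0, 1.5, 0.5, 0.6))
print(f"exit point ({x_new:.2f}, {y_new:.2f}) after {dt:.2f} time units")
```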
Naturally occurring 32Si and low-background silicon dark matter detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orrell, John L.; Arnquist, Isaac J.; Bliss, Mary
Here, the naturally occurring radioisotope 32Si represents a potentially limiting background in future dark matter direct-detection experiments. We investigate sources of 32Si and the vectors by which it comes to reside in silicon crystals used for fabrication of radiation detectors. We infer that the 32Si concentration in commercial single-crystal silicon is likely variable, dependent upon the specific geologic and hydrologic history of the source (or sources) of silicon “ore” and the details of the silicon-refinement process. The silicon production industry is large, highly segmented by refining step, and multifaceted in terms of final product type, from which we conclude that production of 32Si-mitigated crystals requires both targeted silicon material selection and a dedicated refinement-through-crystal-production process. We review options for source material selection, including quartz from an underground source and silicon isotopically reduced in 32Si. To quantitatively evaluate the 32Si content in silicon metal and precursor materials, we propose analytic methods employing chemical processing and radiometric measurements. Ultimately, it appears feasible to produce silicon detectors with low levels of 32Si, though significant assay method development is required to validate this claim and thereby enable a quality assurance program during an actual controlled silicon-detector production cycle.
Naturally occurring 32Si and low-background silicon dark matter detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orrell, John L.; Arnquist, Isaac J.; Bliss, Mary
The naturally occurring radioisotope Si-32 represents a potentially limiting background in future dark matter direct-detection experiments. We investigate sources of Si-32 and the vectors by which it comes to reside in silicon crystals used for fabrication of radiation detectors. We infer that the Si-32 concentration in commercial single-crystal silicon is likely variable, dependent upon the specific geologic and hydrologic history of the source (or sources) of silicon “ore” and the details of the silicon-refinement process. The silicon production industry is large, highly segmented by refining step, and multifaceted in terms of final product type, from which we conclude that production of Si-32-mitigated crystals requires both targeted silicon material selection and a dedicated refinement-through-crystal-production process. We review options for source material selection, including quartz from an underground source and silicon isotopically reduced in Si-32. To quantitatively evaluate the Si-32 content in silicon metal and precursor materials, we propose analytic methods employing chemical processing and radiometric measurements. Ultimately, it appears feasible to produce silicon-based detectors with low levels of Si-32, though significant assay method development is required to validate this claim and thereby enable a quality assurance program during an actual controlled silicon-detector production cycle.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-06
... areas in the energy industry, including coal, oil, natural gas, and nuclear energy, as well as in... higher power ratings. 12. In processing and refining crude oil into petroleum products, oil refineries... energy industry, including coal, oil, natural gas, and nuclear energy, as well as in renewable resources...
Using EMAP data from the NE Wadeable Stream Survey and state datasets (CT, ME), assessment tools were developed to predict diffuse NPS effects from watershed development and distinguish these from local impacts (point sources, contaminated sediments). Classification schemes were...
An Encounter with Fleeting Moments through Transitional Space
ERIC Educational Resources Information Center
Ryoo, Anna
2016-01-01
This paper is based on a phenomenologically oriented exploratory case study. It focuses on Bea, one of the many fascinating individuals the author met at a unique educational site who had an invaluable impact not only on the refinement of the initial guiding question of inquiry, but also on the author as an educator and educational researcher.…
Assessment tools are being developed to predict diffuse NPS effects from watershed development and distinguish these from local impacts (point sources, contaminated sediments). Using EMAP data from the New England Wadeable Stream Survey and two state datasets (CT, ME), we are de...
ERIC Educational Resources Information Center
Kearney, W. Sean; Webb, Michael; Goldhorn, Jeff; Peters, Michelle L.
2013-01-01
This article presents a quantitative study utilizing HLM to analyze classroom walkthrough data completed by principals within 87 secondary mathematics classrooms across 9 public schools in Texas. This research is based on the theoretical framework of learner engagement as established by Argyris & Schon (1996), and refined by Marks (2000). It…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-29
... impact on hospitals across the Nation. At a time when the demand for health care services is on the rise... capital to help hospitals refinance debt was sufficiently available, and that the demand for this type of... nursing home, existing assisted living facility, existing intermediate care facility, existing board and...
The Impact of a Professional Development Programme on the Practices and Beliefs of Numeracy Teachers
ERIC Educational Resources Information Center
Swan, Malcolm; Swain, Jon
2010-01-01
This article describes some outcomes of a nine-month design-based research study into the professional development of 24 numeracy teachers with post-16 learners. Teachers analysed research-based principles for teaching, and engaged in a design-research process by testing and refining teaching activities to embody these principles. Data from…
A strategic endeavor in business planning--an oncology perspective.
Eck, C
2000-06-01
Planning is imperative to provide direction for future growth. The purpose of writing a business plan is to cultivate, analyze, and refine ideas. Planning for academic health centers has become increasingly important because of the changes in financing and delivery of health care. Gathering data related to the current patient population as well as the projected future trends is necessary to establish a framework. Identifying the market and financial data and formulating the strategies needed to move forward are key elements of a business plan. The ultimate outcome of the process is to convince others that the vision is achievable and to ensure allocation of resources to carry out the plan.
ICASE/LaRC Workshop on Adaptive Grid Methods
NASA Technical Reports Server (NTRS)
South, Jerry C., Jr. (Editor); Thomas, James L. (Editor); Vanrosendale, John (Editor)
1995-01-01
Solution-adaptive grid techniques are essential to the attainment of practical, user friendly, computational fluid dynamics (CFD) applications. In this three-day workshop, experts gathered together to describe state-of-the-art methods in solution-adaptive grid refinement, analysis, and implementation; to assess the current practice; and to discuss future needs and directions for research. This was accomplished through a series of invited and contributed papers. The workshop focused on a set of two-dimensional test cases designed by the organizers to aid in assessing the current state of development of adaptive grid technology. In addition, a panel of experts from universities, industry, and government research laboratories discussed their views of needs and future directions in this field.
Satellite Ocean Biology: Past, Present, Future
NASA Technical Reports Server (NTRS)
McClain, Charles R.
2012-01-01
Since 1978 when the first satellite ocean color proof-of-concept sensor, the Nimbus-7 Coastal Zone Color Scanner, was launched, much progress has been made in refining the basic measurement concept and expanding the research applications of global satellite time series of biological and optical properties such as chlorophyll-a concentrations. The seminar will review the fundamentals of satellite ocean color measurements (sensor design considerations, on-orbit calibration, atmospheric corrections, and bio-optical algorithms), scientific results from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) and Moderate resolution Imaging Spectroradiometer (MODIS) missions, and the goals of future NASA missions such as PACE, the Aerosol, Cloud, Ecology (ACE), and Geostationary Coastal and Air Pollution Events (GeoCAPE) missions.
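As a concrete example of the bio-optical algorithms referred to above, empirical band-ratio algorithms of the OC family estimate chlorophyll-a from a polynomial in the log of a blue-to-green remote-sensing reflectance ratio. The sketch below shows the general form only; the coefficients and band choices are illustrative placeholders rather than the operational SeaWiFS or MODIS values.

```python
# General form of an OCx-style empirical chlorophyll algorithm:
#   log10(chl) = a0 + a1*R + a2*R^2 + a3*R^3 + a4*R^4,  R = log10(max(Rrs_blue) / Rrs_green)
# Coefficients below are placeholders of plausible magnitude, not operational values.
import numpy as np

def chl_band_ratio(rrs_blue, rrs_green, coeffs=(0.3, -3.0, 1.9, 0.6, -1.5)):
    """rrs_blue: blue-band remote-sensing reflectances [sr^-1]; rrs_green: scalar."""
    r = np.log10(max(rrs_blue) / rrs_green)          # maximum band ratio
    log_chl = sum(a * r**i for i, a in enumerate(coeffs))
    return 10.0 ** log_chl                            # chlorophyll-a [mg m^-3]

# Example with made-up reflectances for clear ocean water
print(chl_band_ratio(rrs_blue=[0.0090, 0.0075, 0.0060], rrs_green=0.0025))
```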
Williams, Mark D.; USA, Richland Washington; Vermuel, Vince R.; ...
2014-12-31
The FutureGen 2.0 Project will design and build a first-of-its-kind, near-zero emissions coal-fueled power plant with carbon capture and storage (CCS). To assess storage site performance and meet the regulatory requirements of the Class VI Underground Injection Control (UIC) Program for CO2 Geologic Sequestration, the FutureGen 2.0 project will implement a suite of monitoring technologies designed to evaluate CO2 mass balance and detect any unforeseen loss in CO2 containment. The monitoring program will include direct monitoring of the reservoir, and early-leak-detection monitoring directly above the primary confining zone. The preliminary modeling study described here focuses on hypothetical leakage scenarios into the first permeable unit above the primary confining zone (Ironton Sandstone) and is used to support assessment of early-leak detection capabilities. Future updates of the model will be used to assess potential impacts on the lowermost underground source of drinking water (Saint Peter Sandstone) for a range of theoretical leakage scenarios. This preliminary modeling evaluation considers both pressure response and geochemical signals in the overlying Ironton Sandstone. This model is independent of the FutureGen 2.0 reservoir model in that it does not simulate caprock discontinuities, faults, or failure scenarios. Instead, this modeling effort is based on theoretical, volumetric-rate based leakage scenarios. The scenarios include leakage of 1% of the total injected CO2 mass, but spread out over different time periods (20, 100, and 500 years) with each case yielding a different mass flux (i.e., smaller mass fluxes for longer duration leakage cases). A brine leakage scenario using a volumetric leakage similar to the 20 year 1% CO2 case was also considered. A framework for the comparison of the various cases was developed based on the exceedance of selected pressure and geochemical thresholds at different distances from the point of leakage and at different vertical positions within the Ironton Sandstone. These preliminary results, and results from updated models that incorporate additional site-specific characterization data, support development/refinement of the monitoring system design.
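To make the scenario framing above concrete, the following back-of-envelope calculation shows how spreading a fixed 1% leaked fraction over different durations changes the average annual leakage rate. The total injected CO2 mass used here is an assumed round number for illustration, not the FutureGen 2.0 design value.

```python
# Illustrative leakage-rate calculation for the scenarios described above:
# 1% of the total injected CO2 released over 20, 100 or 500 years.
total_injected_t = 20e6                      # assumed total injected CO2 [tonnes]
leaked_t = 0.01 * total_injected_t           # 1% leakage scenario
for years in (20, 100, 500):
    rate = leaked_t / years                  # average leakage rate [t/yr]
    print(f"{years:>4} yr case: {rate:,.0f} t CO2 per year")
```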
Future Food Production System Development Pulling From Space Biology Crop Growth Testing in Veggie
NASA Technical Reports Server (NTRS)
Massa, Gioia; Romeyn, Matt; Fritsche, Ralph
2017-01-01
Preliminary crop testing using Veggie indicates the environmental conditions provided by the ISS are generally suitable for food crop production. When plant samples were returned to Earth for analysis, their levels of nutrients were comparable to Earth-grown ground controls. Veggie-grown produce food safety microbiology analysis indicated that space-grown crops are safe to consume. Produce sanitizing wipes were used on-orbit to further reduce risk of foodborne illness. Validation growth tests indicated abiotic challenges of insufficient or excess fluid delivery, potentially reduced air flow leading to excess water, elevated CO2 leading to physiological responses, and microorganisms that became opportunistic pathogens. As NASA works to develop future space food production, several areas of research to define these systems pull from the Veggie technology validation tests. Research into effective, reusable water delivery and water recovery methods for future food production systems arises from abiotic challenges observed. Additionally, impacts of elevated CO2 and refinement of fertilizer and light recipes for crops needs to be assessed. Biotic pulls include methods or technologies to effectively sanitize produce with few consumables and low inputs; work to understand the phytomicrobiome and potentially use it to protect crops or enhance growth; selection of crops with high harvest index and desirable flavors for supplemental nutrition; crops that provide psychosocial benefits, and custom space crop development. Planning for future food production in a deep space gateway or a deep space transit vehicle requires methods of handling and storing seeds, and ensuring space seeds are free of contaminants and long-lived. Space food production systems may require mechanization and autonomous operation, with preliminary testing initiated to identify operations and capabilities that are candidates for automation. Food production design is also pulling from Veggie logistics lessons, as we learn about growing at different scales and move toward developing systems that require less launch mass. Veggie will be used as a test bed for novel food production technologies. Veggie is a relatively simple precursor food production system but the knowledge gained from space biology validation tests in Veggie will have far reaching repercussions on future exploration food production. This work is supported by NASA.
Future Food Production System Development Pulling from Space Biology Crop Growth Testing in Veggie
NASA Technical Reports Server (NTRS)
Massa, G. D.; Romeyn, M. W.; Fritsche, R. F.
2017-01-01
Preliminary crop testing using Veggie indicates the environmental conditions provided by the ISS are generally suitable for food crop production. When plant samples were returned to Earth for analysis, their levels of nutrients were comparable to Earth-grown ground controls. Veggie-grown produce food safety microbiology analysis indicated that space-grown crops are safe to consume. Produce sanitizing wipes were used on-orbit to further reduce risk of foodborne illness. Validation growth tests indicated abiotic challenges of insufficient or excess fluid delivery, potentially reduced air flow leading to excess water, elevated CO2 leading to physiological responses, and microorganisms that became opportunistic pathogens. As NASA works to develop future space food production, several areas of research to define these systems pull from the Veggie technology validation tests. Research into effective, reusable water delivery and water recovery methods for future food production systems arises from abiotic challenges observed. Additionally, impacts of elevated CO2 and refinement of fertilizer and light recipes for crops needs to be assessed. Biotic pulls include methods or technologies to effectively sanitize produce with few consumables and low inputs; work to understand the phytomicrobiome and potentially use it to protect crops or enhance growth; selection of crops with high harvest index and desirable flavors for supplemental nutrition; crops that provide psychosocial benefits, and custom space crop development. Planning for future food production in a deep space gateway or a deep space transit vehicle requires methods of handling and storing seeds, and ensuring space seeds are free of contaminants and long-lived. Space food production systems may require mechanization and autonomous operation, with preliminary testing initiated to identify operations and capabilities that are candidates for automation. Food production design is also pulling from Veggie logistics lessons, as we learn about growing at different scales and move toward developing systems that require less launch mass. Veggie will be used as a test bed for novel food production technologies. Veggie is a relatively simple precursor food production system but the knowledge gained from space biology validation tests in Veggie will have far reaching repercussions on future exploration food production.
ERIC Educational Resources Information Center
Lee, Kyungmee; Brett, Clare
2013-01-01
This qualitative case study is the first phase of a large-scale design-based research project to implement a theoretically derived double-layered CoP model within real-world teacher development practices. The main goal of this first iteration is to evaluate the courses and test and refine the CoP model for future implementations. This paper…
System Integration Issues in Digital Photogrammetric Mapping
1992-01-01
elevation models, and/or rectified imagery/orthophotos. Imagery exported from the DSPW can be either in a tiled image format or standard raster format...data. In the near future, correlation using "window shaping" operations along with an iterative orthophoto refinement methodology (Norvelle, 1992) is...components of TIES. The IDS passes tiled image data and ASCII header data to the DSPW. The tiled image file contains only image data. The ASCII header
Navy Combatives: Adjusting Course for the Future
2010-12-01
Judo and Japanese Ju Jitsu as a young boy. Gracie developed a system in BJJ that allowed him to use his small frame against much larger adversaries...Defendu. Fairbairn was an instructor for the Shanghai Police teaching the styles of Chinese Boxing and Japanese Ju Jitsu. Mixing these two...refinement of Japanese Ju Jitsu and was created as a non-lethal way to subdue an attacker. Using an opponent’s enemy against it, the strength of
Creating an Effective Regional Alignment Strategy for the U.S. Army
2014-11-01
profession of arms, and by serving as a crucible for educating future leaders in the analysis, evaluation, and refinement of professional expertise in war ...Carlisle Barracks, PA: United States Army War College Press, U.S. Army War College, Strategic Studies Institute, 47 Ashburn Drive, Carlisle, PA 17013-5010
SRB attrition rate study of the aft skirt due to water impact cavity collapse loading
NASA Technical Reports Server (NTRS)
Crockett, C. D.
1976-01-01
A methodology was presented so that realistic attrition prediction could aid in selecting an optimum design option for minimizing the effects of updated loads on the Space Shuttle Solid Rocket Booster (SRB) aft skirt. The updated loads resulted in water impact attrition rates greater than 10 percent for the aft skirt structure. Adding weight to reinforce the aft skirt was undesirable. The refined method treats the occurrences of the load distribution probabilistically, radially and longitudinally, with respect to the critical structural response.
On the Predictability of Future Impact in Science
Penner, Orion; Pan, Raj K.; Petersen, Alexander M.; Kaski, Kimmo; Fortunato, Santo
2013-01-01
Correctly assessing a scientist's past research impact and potential for future impact is key in recruitment decisions and other evaluation processes. While a candidate's future impact is the main concern for these decisions, most measures only quantify the impact of previous work. Recently, it has been argued that linear regression models are capable of predicting a scientist's future impact. By applying that future impact model to 762 careers drawn from three disciplines (physics, biology, and mathematics), we identify a number of subtle, but critical, flaws in current models. Specifically, cumulative non-decreasing measures like the h-index contain intrinsic autocorrelation, resulting in significant overestimation of their “predictive power”. Moreover, the predictive power of these models depends heavily upon scientists' career age, producing the least accurate estimates for young researchers. Our results place in doubt the suitability of such models, and indicate that further investigation is required before they can be used in recruiting decisions. PMID:24165898
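The autocorrelation issue flagged above can be demonstrated with a small simulation: even when yearly citation gains are pure noise, regressing a later h-index on an earlier one yields a high R^2 simply because the measure is cumulative and non-decreasing. The sketch below is an illustrative toy with invented parameters, not the analysis performed in the paper.

```python
# Toy demonstration of intrinsic autocorrelation in cumulative, non-decreasing
# measures: purely random "careers" still show strong year-10 to year-20
# h-index correlation. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def h_index(cites):
    c = np.sort(cites)[::-1]
    return int(np.sum(c >= np.arange(1, len(c) + 1)))

n_scientists, n_years = 500, 20
h_early, h_late = [], []
for _ in range(n_scientists):
    papers = []
    for year in range(n_years):
        papers.extend([0] * rng.integers(1, 5))          # a few new papers each year
        papers = [c + rng.poisson(2) for c in papers]    # random citation gains
        if year == 9:
            h_early.append(h_index(np.array(papers)))    # h-index after 10 years
    h_late.append(h_index(np.array(papers)))             # h-index after 20 years

r = np.corrcoef(h_early, h_late)[0, 1]
print(f"R^2 between year-10 and year-20 h-index (random careers): {r**2:.2f}")
```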
Fortini, Lucas B.; Vorsino, Adam E.; Amidon, Fred A.; Paxton, Eben H.; Jacobi, James D.
2015-01-01
Hawaiian forest birds serve as an ideal group to explore the extent of climate change impacts on at-risk species. Avian malaria constrains many remaining Hawaiian forest bird species to high elevations where temperatures are too cool for malaria's life cycle and its principal mosquito vector. The impact of climate change on Hawaiian forest birds has been a recent focus of Hawaiian conservation biology, and has centered on the links between climate and avian malaria. To elucidate the differential impacts of projected climate shifts on species with known varying niches, disease resistance and tolerance, we use a comprehensive database of species sightings, regional climate projections and ensemble distribution models to project distribution shifts for all Hawaiian forest bird species. We illustrate that, under a likely scenario of continued disease-driven distribution limitation, all 10 species with highly reliable models (mostly narrow-ranged, single-island endemics) are expected to lose >50% of their range by 2100. Of those, three are expected to lose all range and three others are expected to lose >90% of their range. Projected range loss was smaller for several of the more widespread species; however improved data and models are necessary to refine future projections. Like other at-risk species, Hawaiian forest birds have specific habitat requirements that limit the possibility of range expansion for most species, as projected expansion is frequently in areas where forest habitat is presently not available (such as recent lava flows). Given the large projected range losses for all species, protecting high elevation forest alone is not an adequate long-term strategy for many species under climate change. We describe the types of additional conservation actions practitioners will likely need to consider, while providing results to help with such considerations.
Fortini, Lucas B; Vorsino, Adam E; Amidon, Fred A; Paxton, Eben H; Jacobi, James D
2015-01-01
Hawaiian forest birds serve as an ideal group to explore the extent of climate change impacts on at-risk species. Avian malaria constrains many remaining Hawaiian forest bird species to high elevations where temperatures are too cool for malaria's life cycle and its principal mosquito vector. The impact of climate change on Hawaiian forest birds has been a recent focus of Hawaiian conservation biology, and has centered on the links between climate and avian malaria. To elucidate the differential impacts of projected climate shifts on species with known varying niches, disease resistance and tolerance, we use a comprehensive database of species sightings, regional climate projections and ensemble distribution models to project distribution shifts for all Hawaiian forest bird species. We illustrate that, under a likely scenario of continued disease-driven distribution limitation, all 10 species with highly reliable models (mostly narrow-ranged, single-island endemics) are expected to lose >50% of their range by 2100. Of those, three are expected to lose all range and three others are expected to lose >90% of their range. Projected range loss was smaller for several of the more widespread species; however improved data and models are necessary to refine future projections. Like other at-risk species, Hawaiian forest birds have specific habitat requirements that limit the possibility of range expansion for most species, as projected expansion is frequently in areas where forest habitat is presently not available (such as recent lava flows). Given the large projected range losses for all species, protecting high elevation forest alone is not an adequate long-term strategy for many species under climate change. We describe the types of additional conservation actions practitioners will likely need to consider, while providing results to help with such considerations.
Fortini, Lucas B.; Vorsino, Adam E.; Amidon, Fred A.; Paxton, Eben H.; Jacobi, James D.
2015-01-01
Hawaiian forest birds serve as an ideal group to explore the extent of climate change impacts on at-risk species. Avian malaria constrains many remaining Hawaiian forest bird species to high elevations where temperatures are too cool for malaria’s life cycle and its principal mosquito vector. The impact of climate change on Hawaiian forest birds has been a recent focus of Hawaiian conservation biology, and has centered on the links between climate and avian malaria. To elucidate the differential impacts of projected climate shifts on species with known varying niches, disease resistance and tolerance, we use a comprehensive database of species sightings, regional climate projections and ensemble distribution models to project distribution shifts for all Hawaiian forest bird species. We illustrate that, under a likely scenario of continued disease-driven distribution limitation, all 10 species with highly reliable models (mostly narrow-ranged, single-island endemics) are expected to lose >50% of their range by 2100. Of those, three are expected to lose all range and three others are expected to lose >90% of their range. Projected range loss was smaller for several of the more widespread species; however improved data and models are necessary to refine future projections. Like other at-risk species, Hawaiian forest birds have specific habitat requirements that limit the possibility of range expansion for most species, as projected expansion is frequently in areas where forest habitat is presently not available (such as recent lava flows). Given the large projected range losses for all species, protecting high elevation forest alone is not an adequate long-term strategy for many species under climate change. We describe the types of additional conservation actions practitioners will likely need to consider, while providing results to help with such considerations. PMID:26509270
Fox, Mary A.; Kaye, Charlotte; Resnick, Beth
2017-01-01
Summary: Public health has potential to serve as a frame to convey the urgency of behavior change needed to adapt to a changing climate and reduce greenhouse gas emissions. Local governments form the backbone of climate-related public health preparedness. Yet local health agencies are often inadequately prepared and poorly integrated into climate change assessments and plans. We reviewed the climate health profiles of 16 states and two cities participating in the U.S. Centers for Disease Control and Prevention (CDC)’s Climate-Ready States and Cities Initiative (CRSCI) that aims to build local capacity to assess and respond to the health impacts of climate change. Following recommendations from a recent expert panel strategic review, we present illustrations of emerging promising practice and future directions. We found that CRSCI has strengthened climate preparedness and response in local public health agencies by identifying critical climate-health impacts and vulnerable populations, and has helped integrate health more fully into broader climate planning. Promising practice was found in all three recommendation areas identified by the expert panel (leveraging partnerships, refining assessment methodologies and enhancing communications), particularly with regard to health impacts of extreme heat. Vast needs remain, however, suggesting the need to disseminate CRSCI experience to non-grantees. In conclusion, the CRSCI program approach and selected activities illustrate a way forward toward robust, targeted local preparedness and response that may serve as a useful example for public health departments in the United States and internationally, particularly at a time of uncertain commitment to climate change agreements at the national level. https://doi.org/10.1289/EHP1838 PMID:28934724
NASA Astrophysics Data System (ADS)
Rao, M.
2014-12-01
Drought is a natural disaster with serious implications for environmental, social and economic well-being at local, regional and global scales. In its third year, California's drought condition has seriously impacted not just the agricultural sector, but also the natural resources sector including forestry, wildlife, and fisheries. As of July 15, 2014, the National Weather Service drought monitor shows 81% of California in the category of extreme drought. As future predictions of drought and fire severity become more real in California, there is increased awareness of the need to pursue innovative and cost-effective solutions that are based on silvicultural treatments and controlled burns to improve forest health and reduce the risk of high-severity wildfires. The main goal of this study is to develop a GIS map of the drought-impacted region of northern and central California using remote sensing data. Specifically, based on a geospatial database for the study region, Landsat imagery in conjunction with field and ancillary data will be analyzed using a combination of supervised and unsupervised classification techniques in addition to spectral indices such as the Modified Perpendicular Drought Index (MPDI). This index measures the distance perpendicular to the soil line defined in the Red-NIR feature space, adjusted by the fractional vegetation cover derived from NDVI. The image processing will be conducted for two time periods (2001 and 2014) to characterize the severity of the drought. In addition to field data, data collected by state agencies including calforests.org will be used in the classification and accuracy assessment procedures. Visual assessment using high-resolution imagery such as NAIP will be used to further refine the spatial maps. The drought severity maps produced will greatly facilitate site-specific planning efforts aimed at implementing resource management decisions.
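For reference, the sketch below shows one common formulation of the MPDI from red and near-infrared reflectance, in which the perpendicular distance to the soil line is corrected by the fractional vegetation cover estimated from NDVI. The soil-line slope, vegetation endmember reflectances and NDVI scaling are placeholder values for illustration and are not the parameters used in this study.

```python
# Hedged sketch of a Modified Perpendicular Drought Index (MPDI) calculation
# from red/NIR reflectance. Soil-line slope m, vegetation endmembers and the
# NDVI-based cover scaling are assumed placeholders (cover fraction < 1 assumed).
import numpy as np

def mpdi(red, nir, m=1.5, red_veg=0.05, nir_veg=0.50, ndvi_min=0.05, ndvi_max=0.85):
    ndvi = (nir - red) / (nir + red)
    # fractional vegetation cover scaled from NDVI (assumed quadratic scaling)
    fv = np.clip((ndvi - ndvi_min) / (ndvi_max - ndvi_min), 0.0, 1.0) ** 2
    pdi_term = red + m * nir                      # perpendicular-distance term
    veg_term = fv * (red_veg + m * nir_veg)       # remove the vegetation contribution
    return (pdi_term - veg_term) / ((1.0 - fv) * np.sqrt(m**2 + 1.0))

# Example: a dry, sparsely vegetated pixel vs. a greener pixel (made-up reflectances)
print(mpdi(np.array([0.25, 0.08]), np.array([0.30, 0.45])))
```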
Exposing the impact of Citizens Advice Bureau services on health: a realist evaluation protocol.
Forster, N; Dalkin, S M; Lhussier, M; Hodgson, P; Carr, S M
2016-01-20
Welfare advice services can be used to address health inequalities, for example, through Citizens Advice Bureau (CAB). Recent reviews highlight evidence for the impact of advice services in improving people's financial position and improving mental health and well-being, daily living and social relationships. There is also some evidence for the impact of advice services in increasing accessibility of health services, and reducing general practitioner appointments and prescriptions. However, direct evidence for the impact of advice services on lifestyle behaviour and physical health is currently much less well established. There is a need for greater empirical testing of theories around the specific mechanisms through which advice services and associated financial or non-financial benefits may generate health improvements. A realist evaluation will be conducted, operationalised in 5 phases: building the explanatory framework; refining the explanatory framework; testing the explanatory framework through empirical data (mixed methods); development of a bespoke data recording template to capture longer term impact; and verification of findings with a range of CAB services. This research will therefore aim to build, refine and test an explanatory framework about how CAB services can be optimally implemented to achieve health improvement. The study was approved by the ethics committee at Northumbria University, UK. Project-related ethical issues are described and quality control aspects of the study are considered. A stakeholder mapping exercise will inform the dissemination of results in order to ensure all relevant institutions and organisations are targeted. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Modic, Mary Beth; Canfield, Christina; Kaser, Nancy; Sauvey, Rebecca; Kukla, Aniko
2012-01-01
The purpose of this project was to enhance the knowledge of the bedside nurse in diabetes management. A forum for ongoing support and exploration of clinical problems, along with the distribution of educational tools were the components of this program. Diabetes accounts for 30% of patients admitted to the hospital. It has become more challenging to manage as the treatment choices have increased. There are a number of researchers who have identified nurse and physician knowledge of diabetes management principles as suboptimal. DESCRIPTION OF THE INNOVATION: Staff nurses are educated for a role as a Diabetes Management Mentor and are expected to educate/dialogue with peers monthly, model advocacy and diabetes patient education skills, facilitate referrals for diabetes education, and direct staff to resources for diabetes management. Diabetes Management Mentors feel more confident in their knowledge of diabetes and their ability to resolve clinical issues as they arise. The Diabetes Management Mentor role is another avenue for nurses to refine their clinical knowledge base and acquire skills to share with colleagues while remaining at the bedside. The clinical nurse specialist is expertly prepared to foster the professional development of bedside nurses while simultaneously making a positive impact on disease management. Opportunity for future investigation includes efficacy of teaching tools on diabetes mastery, the effect of clinical nurse specialist mentoring on a select group of bedside nurses, and the Diabetes Management Mentor's impact on prevention of near-miss events.
The future of fish passage science, engineering, and practice
Silva, Ana T.; Lucas, Martyn C.; Castro-Santos, Theodore R.; Katopodis, Christos; Baumgartner, Lee J.; Thiem, Jason D.; Aarestrup, Kim; Pompeu, Paulo S.; O'Brien, Gordon C.; Braun, Douglas C.; Burnett, Nicholas J.; Zhu, David Z.; Fjeldstad, Hans-Petter; Forseth, Torbjorn; Rajarathnam, Nallamuthu; Williams, John G.; Cooke, Steven J.
2018-01-01
Much effort has been devoted to developing, constructing and refining fish passage facilities to enable target species to pass barriers on fluvial systems, and yet, fishway science, engineering and practice remain imperfect. In this review, 17 experts from different fish passage research fields (i.e., biology, ecology, physiology, ecohydraulics, engineering) and from different continents (i.e., North and South America, Europe, Africa, Australia) identified knowledge gaps and provided a roadmap for research priorities and technical developments. Once dominated by an engineering‐focused approach, fishway science today involves a wide range of disciplines from fish behaviour to socioeconomics to complex modelling of passage prioritization options in river networks. River barrier impacts on fish migration and dispersal are currently better understood than historically, but basic ecological knowledge underpinning the need for effective fish passage in many regions of the world, including in biodiversity hotspots (e.g., equatorial Africa, South‐East Asia), remains largely unknown. Designing efficient fishways, with minimal passage delay and post‐passage impacts, requires adaptive management and continued innovation. While the use of fishways in river restoration demands a transition towards fish passage at the community scale, advances in selective fishways are also needed to manage invasive fish colonization. Because of the erroneous view in some literature and communities of practice that fish passage is largely a proven technology, improved international collaboration, information sharing, method standardization and multidisciplinary training are needed. Further development of regional expertise is needed in South America, Asia and Africa where hydropower dams are currently being planned and constructed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duey, R.
1997-07-01
"Integrated solutions" has become such a buzzword in the business world that it's beginning to lose its semantic impact. But when the business is oil and gas and the integrated solutions revolve around information technology, the impact is very great indeed. A recent Cambridge Energy Research Associates study called the use of information technology within the petroleum industry a "quiet revolution." The technology has been "a powerful, enabling factor in the survival and rebirth" of the industry since the price collapse of 1986, the report states, adding, "It will be even more important to the industry's future, shaping business strategy and competitive advantage-- and company structure." Like many other industries, oil and gas companies are moving away from hierarchical structures, relying on asset teams and other interdisciplinary approaches to maximize profits in an era of downsizing, volatile prices and stiff competition. "I think in the past the energy industry had a commodity mindset," says Robert Shaw, senior vice president and head of the worldwide sector for Oracle Energy. "The industry was fairly controlled, stable and regulated, and there was a feeling that there wasn't much to do except find oil, own reserves, refine as much as they can, run the pipe full, dispose of as much as they can and hopefully make a lot of money".
Redley, Bernice; Botti, Mari; Wood, Beverley; Bucknall, Tracey
2017-08-01
Poor interprofessional communication poses a risk to patient safety at change-of-shift in emergency departments (EDs). The purpose of this study was to identify and describe patterns and processes of interprofessional communication impacting quality of ED change-of-shift handovers. Observation of 66 change-of-shift handovers at two acute hospital EDs in Victoria, Australia. Focus groups with 34 nurse participants complemented the observations. Qualitative data analysis involved content and thematic methods. Four structural components of ED handover processes emerged represented by (ABCD): (1) Antecedents; (2) Behaviours and interactions; (3) Content; and (4) Delegation of ongoing care. Infrequent and ad hoc interprofessional communication and discipline-specific handover content and processes emerged as specific risks to patient safety at change-of-shift handovers. Three themes related to risky and effective practices to support interprofessional communications across the four stages of ED handovers emerged: 1) standard processes and practices, 2) teamwork and interactions and 3) communication activities and practices. Unreliable interprofessional communication can impact the quality of change-of-shift handovers in EDs and poses risk to patient safety. Structured reflective analysis of existing practices can identify opportunities for standardisation, enhanced team practices and effective communication across four stages of the handover process to support clinicians to enhance local handover practices. Future research should test and refine models to support analysis of practice, and identify and test strategies to enhance ED interprofessional communication to support clinical handovers. Copyright © 2017 College of Emergency Nursing Australasia. Published by Elsevier Ltd. All rights reserved.
Climate and dengue transmission: evidence and implications.
Morin, Cory W; Comrie, Andrew C; Ernst, Kacey
2013-01-01
Climate influences dengue ecology by affecting vector dynamics, agent development, and mosquito/human interactions. Although these relationships are known, the impact climate change will have on transmission is unclear. Climate-driven statistical and process-based models are being used to refine our knowledge of these relationships and predict the effects of projected climate change on dengue fever occurrence, but results have been inconsistent. We sought to identify major climatic influences on dengue virus ecology and to evaluate the ability of climate-based dengue models to describe associations between climate and dengue, simulate outbreaks, and project the impacts of climate change. We reviewed the evidence for direct and indirect relationships between climate and dengue generated from laboratory studies, field studies, and statistical analyses of associations between vectors, dengue fever incidence, and climate conditions. We assessed the potential contribution of climate-driven, process-based dengue models and provide suggestions to improve their performance. Relationships between climate variables and factors that influence dengue transmission are complex. A climate variable may increase dengue transmission potential through one aspect of the system while simultaneously decreasing transmission potential through another. This complexity may at least partly explain inconsistencies in statistical associations between dengue and climate. Process-based models can account for the complex dynamics but often omit important aspects of dengue ecology, notably virus development and host-species interactions. Synthesizing and applying current knowledge of climatic effects on all aspects of dengue virus ecology will help direct future research and enable better projections of climate change effects on dengue incidence.
Benson, Helen E; Sharman, Joanna L; Mpamhanga, Chido P; Parton, Andrew; Southan, Christopher; Harmar, Anthony J; Ghazal, Peter
2017-01-01
Background and Purpose An ever‐growing wealth of information on current drugs and their pharmacological effects is available from online databases. As our understanding of systems biology increases, we have the opportunity to predict, model and quantify how drug combinations can be introduced that outperform conventional single‐drug therapies. Here, we explore the feasibility of such systems pharmacology approaches with an analysis of the mevalonate branch of the cholesterol biosynthesis pathway. Experimental Approach Using open online resources, we assembled a computational model of the mevalonate pathway and compiled a set of inhibitors directed against targets in this pathway. We used computational optimization to identify combination and dose options that show not only maximal efficacy of inhibition on the cholesterol producing branch but also minimal impact on the geranylation branch, known to mediate the side effects of pharmaceutical treatment. Key Results We describe serious impediments to systems pharmacology studies arising from limitations in the data, incomplete coverage and inconsistent reporting. By curating a more complete dataset, we demonstrate the utility of computational optimization for identifying multi‐drug treatments with high efficacy and minimal off‐target effects. Conclusion and Implications We suggest solutions that facilitate systems pharmacology studies, based on the introduction of standards for data capture that increase the power of experimental data. We propose a systems pharmacology workflow for the refinement of data and the generation of future therapeutic hypotheses. PMID:28910500
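To make the dose-optimization idea above concrete, the following minimal Python sketch optimizes two hypothetical inhibitor doses to maximize inhibition of a "cholesterol" branch while penalizing inhibition of a "geranylation" branch; the Hill-type dose-response curves, IC50 values, penalty weight and dose bounds are all invented for illustration and are not taken from the study.

# Illustrative sketch only: hypothetical two-drug dose optimization that favours
# inhibition of a "cholesterol" branch while sparing a "geranylation" branch.
# IC50 values, Hill slopes, branch weights, and dose bounds are all assumptions.
import numpy as np
from scipy.optimize import minimize

def hill_inhibition(dose, ic50, n=1.0):
    """Fractional inhibition (0..1) of a target by a single inhibitor."""
    return dose**n / (dose**n + ic50**n)

def objective(doses):
    d1, d2 = doses
    # Hypothetical potencies of the two inhibitors on each branch
    cholesterol_inhib = 1 - (1 - hill_inhibition(d1, ic50=0.5)) * (1 - hill_inhibition(d2, ic50=2.0))
    geranylation_inhib = 1 - (1 - hill_inhibition(d1, ic50=5.0)) * (1 - hill_inhibition(d2, ic50=0.8))
    # Maximize on-branch inhibition, penalize off-branch (side-effect) inhibition
    return -(cholesterol_inhib - 2.0 * geranylation_inhib)

result = minimize(objective, x0=[0.1, 0.1], bounds=[(0.0, 10.0), (0.0, 10.0)])
print("doses:", result.x, "objective value:", -result.fun)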
Impact of parameterization choices on the restitution of ozone deposition over vegetation
NASA Astrophysics Data System (ADS)
Le Morvan-Quéméner, Aurélie; Coll, Isabelle; Kammer, Julien; Lamaud, Eric; Loubet, Benjamin; Personne, Erwan; Stella, Patrick
2018-04-01
Ozone is a potentially phyto-toxic air pollutant, which can cause leaf damage and drastically alter crop yields, causing serious economic losses around the world. The VULNOZ (VULNerability to OZone in Anthropised Ecosystems) project is a biology and modeling project that aims to understand how plants respond to the stress of high ozone concentrations, and then to use a set of models to (i) predict the impact of ozone on plant growth, (ii) represent ozone deposition fluxes to vegetation, and finally (iii) estimate the economic consequences of an increasing ozone background in the future. In this work, as part of the VULNOZ project, an innovative representation of ozone deposition to vegetation was developed and implemented in the CHIMERE regional chemistry-transport model. This type of model calculates the average amount of ozone deposited on a parcel each hour, as well as the integrated amount of ozone deposited to the surface at the regional or country level. Our new approach was based on a refinement of the representation of crop types in the model and the use of empirical parameters specific to each crop category. The results obtained were compared with a conventional ozone deposition modeling approach, and evaluated against observations from several agricultural areas in France. They showed that a better representation of the distribution between stomatal and non-stomatal ozone fluxes was obtained in the empirical approach, and they allowed us to produce a new estimate of the total amount of ozone deposited on the subtypes of vegetation at the national level.
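The stomatal/non-stomatal partition discussed above is commonly expressed through a resistance analogy for the dry-deposition velocity; the sketch below is a generic, textbook-style illustration of that bookkeeping (not the CHIMERE or VULNOZ parameterization), with all resistance values assumed.

# Generic resistance-analogy sketch for ozone dry deposition (not the scheme
# used in CHIMERE or VULNOZ): the deposition velocity is the inverse of the
# sum of aerodynamic, quasi-laminar and surface resistances, and the surface
# resistance combines stomatal and non-stomatal pathways in parallel.
# All resistance values (s m^-1) are illustrative assumptions.

def deposition_velocity(r_a, r_b, r_stom, r_non_stom):
    r_surface = 1.0 / (1.0 / r_stom + 1.0 / r_non_stom)   # parallel pathways
    v_d = 1.0 / (r_a + r_b + r_surface)                   # m s^-1
    stomatal_fraction = (1.0 / r_stom) / (1.0 / r_stom + 1.0 / r_non_stom)
    return v_d, stomatal_fraction

# Daytime crop-like values (assumed): open stomata dominate the surface sink
v_d, f_stom = deposition_velocity(r_a=30.0, r_b=20.0, r_stom=100.0, r_non_stom=400.0)
print(f"deposition velocity = {v_d * 100:.2f} cm/s, stomatal fraction = {f_stom:.2f}")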
Preliminary Investigation of Civil Tiltrotor in NextGen Airspace
NASA Technical Reports Server (NTRS)
Young, Larry A.; Salvano, Dan; Wright, Ken; Chung, William; Young, Ray; Miller, David; Paris, Alfanso; Gao, Huina; Cheng, Victor
2010-01-01
Presentation intro: Tiltrotor aircraft have long been envisioned as being a potentially viable means of commercial aviation transport. Preliminary results from an ongoing study into the operational and technological considerations of Civil Tiltrotor (CTR) operation in the Next Generation airspace, circa the 2025 time-frame, are presented and discussed. In particular, a fleet of CTR aircraft has been conceptually designed. The performance characteristics of this CTR fleet were subsequently translated into BADA (Base of Aircraft DAta) models that could be used as input to emulate CTR aircraft operations in the ACES and AvTerminal airspace and terminal area simulation tools. A network of nine North-Eastern corridor airports is the focus of the airspace simulation effort; the results from this airport network will then be extrapolated to provide insights into the systemic impact of CTRs on the National Airspace System (NAS). Future work will also be detailed as to attempts to model the systemic effects of noise and emissions from this fleet of new aircraft, as well as to assess their leveraged impact on public service missions, in time of need, such as major regional/national disaster relief efforts. The ideal outcome of this study is a set of results whereby Next Gen airspace CONOPs can be refined to reflect potential CTR capabilities and, conversely, CTR technology development efforts can be better informed as to key performance requirement thresholds needed to be met in order to successfully introduce these aircraft into civilian aviation operation.
NASA Astrophysics Data System (ADS)
Bernhard, J. M.; Wit, J. C.
2015-12-01
The geochemistry recorded in carbonate foraminiferal tests (shells) is often used as proxy for past oceanographic events and environments. By understanding past oceanic and climatic conditions, we can better predict future climate scenarios, a relevant ability in these times of global change. The fact that foraminifera are biological entities can be pivotal for understanding their geochemical records. Thus, growing foraminifera under known physicochemical conditions and analyzing the geochemistry of their cultured carbonate can yield insightful perspectives for proxy refinement and development. Because parameters often co-vary in nature, proper proxy calibration can only be done with materials grown in strictly controlled and known environments. This presentation will review the various crucial aspects of foraminiferal maintenance and culturing, especially from the perspective of proxy development. These fundamentals were used to design a long-term multi-stressor experiment with oxygen, pCO2 (pH), and temperature as variables to test the single, double or triple threats of deoxygenation, ocean acidification, and oceanic warming. Results on assemblage composition, survivorship and growth of a continental shelf benthic foraminiferal community will be presented. Although one agglutinated morphospecies grew in each of the five treatments, growth of individual calcareous species was more restricted. Initial results indicate that pCO2 was not the factor that impacted communities most. Supported in part by NSF OCE-1219948.
High Resolution Visualization Applied to Future Heavy Airlift Concept Development and Evaluation
NASA Technical Reports Server (NTRS)
FordCook, A. B.; King, T.
2012-01-01
This paper explores the use of high resolution 3D visualization tools for exploring the feasibility and advantages of future military cargo airlift concepts and evaluating compatibility with existing and future payload requirements. Realistic 3D graphic representations of future airlifters are immersed in rich, supporting environments to demonstrate concepts of operations to key personnel for evaluation, feedback, and development of critical joint support. Accurate concept visualizations are reviewed by commanders, platform developers, loadmasters, soldiers, scientists, engineers, and key principal decision makers at various stages of development. The insight gained through the review of these physically and operationally realistic visualizations is essential to refining design concepts to meet competing requirements in a fiscally conservative defense finance environment. In addition, highly accurate 3D geometric models of existing and evolving large military vehicles are loaded into existing and proposed aircraft cargo bays. In this virtual aircraft test-loading environment, materiel developers, engineers, managers, and soldiers can realistically evaluate the compatibility of current and next-generation airlifters with proposed cargo.
Development of Non-Optimum Factors for Launch Vehicle Propellant Tank Bulkhead Weight Estimation
NASA Technical Reports Server (NTRS)
Wu, K. Chauncey; Wallace, Matthew L.; Cerro, Jeffrey A.
2012-01-01
Non-optimum factors are used during aerospace conceptual and preliminary design to account for the increased weights of as-built structures due to future manufacturing and design details. Use of higher-fidelity non-optimum factors in these early stages of vehicle design can result in more accurate predictions of a concept's actual weights and performance. To help achieve this objective, non-optimum factors are calculated for the aluminum-alloy gores that compose the ogive and ellipsoidal bulkheads of the Space Shuttle Super-Lightweight Tank propellant tanks. Minimum values for actual gore skin thicknesses and weld land dimensions are extracted from selected production drawings, and are used to predict reference gore weights. These actual skin thicknesses are also compared to skin thicknesses predicted using classical structural mechanics and tank proof-test pressures. Both coarse and refined weights models are developed for the gores. The coarse model is based on the proof pressure-sized skin thicknesses, and the refined model uses the actual gore skin thicknesses and design detail dimensions. To determine the gore non-optimum factors, these reference weights are then compared to flight hardware weights reported in a mass properties database. When manufacturing tolerance weight estimates are taken into account, the gore non-optimum factors computed using the coarse weights model range from 1.28 to 2.76, with an average non-optimum factor of 1.90. Application of the refined weights model yields non-optimum factors between 1.00 and 1.50, with an average non-optimum factor of 1.14. To demonstrate their use, these calculated non-optimum factors are used to predict heavier, more realistic gore weights for a proposed heavy-lift launch vehicle's propellant tank bulkheads. These results indicate that relatively simple models can be developed to better estimate the actual weights of large structures for future launch vehicles.
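The non-optimum factor itself is simply the ratio of an as-built (flight hardware) weight to an idealized reference weight, applied as a multiplier to new weight estimates; a minimal sketch of that bookkeeping follows, with all gore weights invented rather than taken from the Super-Lightweight Tank data.

# Minimal sketch of non-optimum factor bookkeeping: NOF = as-built weight /
# reference (analysis-based) weight. All weights below are invented examples,
# not Super-Lightweight Tank values.
gores = {
    # name: (flight hardware weight [lb], reference weight from model [lb])
    "ogive_gore_1":     (62.0, 41.5),
    "ogive_gore_2":     (58.3, 44.1),
    "ellipsoid_gore_1": (49.7, 38.0),
}

factors = {name: actual / reference for name, (actual, reference) in gores.items()}
average_nof = sum(factors.values()) / len(factors)

for name, nof in factors.items():
    print(f"{name}: NOF = {nof:.2f}")
print(f"average NOF = {average_nof:.2f}")

# A predicted bulkhead weight for a new vehicle would then be scaled up:
predicted_reference_weight = 1200.0   # lb, hypothetical coarse-model estimate
print(f"adjusted estimate = {predicted_reference_weight * average_nof:.0f} lb")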
Bohlen, Martin; Hayes, Erika R.; Bohlen, Benjamin; Bailoo, Jeremy; Crabbe, John C.; Wahlsten, Douglas
2016-01-01
Eight standard inbred mouse strains were evaluated for ethanol effects on a refined battery of behavioral tests in a study that was originally designed to assess the influence of rat odors in the colony on mouse behaviors. As part of the design of the study, two experimenters conducted the tests, and the study was carefully balanced so that equal numbers of mice in all groups and times of day were tested by each experimenter. A defect in airflow in the facility compromised the odor manipulation, and in fact the different odor exposure groups did not differ in their behaviors. The two experimenters, however, obtained markedly different results for three of the tests. Certain of the experimenter effects arose from the way they judged behaviors that were not automated and had to be rated by the experimenter, such as slips on the balance beam. Others were not evident prior to ethanol injection but had a major influence after the injection. For several measures, the experimenter effects were notably different for different inbred strains. Methods to evaluate and reduce the impact of experimenter effects in future research are discussed. PMID:24933191
Sensitivity Analysis of Hybrid Propulsion Transportation System for Human Mars Expeditions
NASA Technical Reports Server (NTRS)
Chai, Patrick R.; Joyce, Ryan T.; Kessler, Paul D.; Merrill, Raymond G.; Qu, Min
2017-01-01
The National Aeronautics and Space Administration continues to develop and refine various transportation options to successfully field a human Mars campaign. One of these transportation options is the Hybrid Transportation System which utilizes both solar electric propulsion and chemical propulsion. The Hybrid propulsion system utilizes chemical propulsion to perform high thrust maneuvers, where the delta-V is most optimal when applied to save time and to leverage the Oberth effect. It then utilizes solar electric propulsion to augment the chemical burns throughout the interplanetary trajectory. This eliminates the need for the development of two separate vehicles for crew and cargo missions. Previous studies considered single point designs of the architecture, with fixed payload mass and propulsion system performance parameters. As the architecture matures, it is inevitable that the payload mass and the performance of the propulsion system will change. It is desirable to understand how these changes will impact the in-space transportation system's mass and power requirements. This study presents an in-depth sensitivity analysis of the Hybrid crew transportation system to payload mass growth and solar electric propulsion performance. This analysis is used to identify the breakpoints of the current architecture and to inform future architecture and campaign design decisions.
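A rough feel for how payload mass growth drives propellant requirements can be conveyed with the ideal rocket equation; the sketch below is a back-of-the-envelope illustration only, not the study's trajectory or vehicle model, and the delta-V, specific impulse and masses are placeholders.

# Back-of-the-envelope sensitivity sketch using the ideal rocket equation,
# m_prop = m_final * (exp(dv / (Isp*g0)) - 1). This is NOT the Hybrid
# transportation system model; delta-V, Isp and masses are placeholders.
import math

G0 = 9.80665  # m/s^2

def propellant_mass(dry_mass_kg, payload_kg, delta_v_ms, isp_s):
    m_final = dry_mass_kg + payload_kg
    return m_final * (math.exp(delta_v_ms / (isp_s * G0)) - 1.0)

baseline = propellant_mass(dry_mass_kg=25_000, payload_kg=40_000,
                           delta_v_ms=2_000, isp_s=360)
for growth in (0.0, 0.05, 0.10, 0.20):   # payload mass growth fractions
    m_prop = propellant_mass(25_000, 40_000 * (1 + growth), 2_000, 360)
    print(f"payload growth {growth:4.0%}: propellant {m_prop / 1000:6.1f} t "
          f"({(m_prop / baseline - 1):+.1%} vs baseline)")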
NASA Astrophysics Data System (ADS)
Nurmaini, Siti; Firsandaya Malik, Reza; Stiawan, Deris; Firdaus; Saparudin; Tutuko, Bambang
2017-04-01
The information framework aims to holistically address the problems and issues posed by unwanted peat and land fires within the context of the natural environment and socio-economic systems. Informed decisions on planning and allocation of resources can only be made by understanding the landscape. Therefore, information on fire history and air quality impacts must be collected for future analysis. This paper proposes a strategic framework, based on a technology approach with a data fusion strategy, to produce data analyses of peat land fires and air quality management in South Sumatera. The research framework should use the knowledge, experience and data from previous fire seasons to review, improve and refine the strategies and monitor their effectiveness for the next fire season. Communicating effectively with communities and the public and private sectors in remote and rural landscapes is important, for example by using smartphones and mobile applications. Tools such as one-stop, web-based information services, which provide early warning and allow fire alerts to be sent and received, could be developed and promoted so that all stakeholders can share important information with each other.
Health facilities safety in natural disasters: experiences and challenges from South East Europe.
Radovic, Vesela; Vitale, Ksenija; Tchounwou, Paul B
2012-05-01
The United Nations named 2010 as a year of natural disasters, and launched a worldwide campaign to improve the safety of schools and hospitals from natural disasters. In the region of South East Europe, Croatia and Serbia have suffered the greatest impacts of natural disasters on their communities and health facilities. In this paper the disaster management approaches of the two countries are compared, with a special emphasis on the existing technological and legislative systems for safety and protection of health facilities and people. Strategic measures that should be taken in the future to provide better safety for health facilities and populations, based on best practices and positive experiences in other countries, are recommended. Due to the expected consequences of global climate change in the region and a range of increased environmental risks, both countries need to refine their disaster preparedness strategies. Also, in South East Europe, the effects of a natural disaster are amplified in the health sector due to its critical medical infrastructure. Therefore, the principles of environmental security should be implemented in public health policies in the described region, along with principles of disaster management through regional collaborations.
Computational Systems Analysis of Dopamine Metabolism
Qi, Zhen; Miller, Gary W.; Voit, Eberhard O.
2008-01-01
A prominent feature of Parkinson's disease (PD) is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment of PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease. PMID:18568086
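The flavor of a biochemical-systems-theory (power-law) formulation can be illustrated with a deliberately toy two-pool model; the sketch below is not the published dopamine model, and its variables, rate constants and kinetic orders are invented purely to show the modeling style.

# Toy illustration of a power-law (biochemical systems theory style) model,
# NOT the published dopamine model: x1 = cytosolic pool, x2 = vesicular pool.
# Synthesis, packaging, release and degradation terms use invented rate
# constants and kinetic orders purely to show the modelling style.
from scipy.integrate import solve_ivp

def s_system(t, x, synthesis=1.0, k_pack=0.8, k_release=0.3, k_degrade=0.2):
    x1, x2 = x
    dx1 = synthesis - k_pack * x1**0.75 - k_degrade * x1   # cytosolic pool
    dx2 = k_pack * x1**0.75 - k_release * x2               # vesicular pool
    return [dx1, dx2]

sol = solve_ivp(s_system, t_span=(0, 50), y0=[0.5, 0.5])
print("approximate steady-state pools:", sol.y[:, -1])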
Feo, Rebecca; Conroy, Tiffany; Marshall, Rhianon J; Rasmussen, Philippa; Wiechula, Richard; Kitson, Alison L
2017-04-01
Nursing policy and healthcare reform are focusing on two, interconnected areas: person-centred care and fundamental care. Each initiative emphasises a positive nurse-patient relationship. For these initiatives to work, nurses require guidance for how they can best develop and maintain relationships with their patients in practice. Although empirical evidence on the nurse-patient relationship is increasing, findings derived from this research are not readily or easily transferable to the complexities and diversities of nursing practice. This study describes a novel methodological approach, called holistic interpretive synthesis (HIS), for interpreting empirical research findings to create practice-relevant recommendations for nurses. Using HIS, umbrella review findings on the nurse-patient relationship are interpreted through the lens of the Fundamentals of Care Framework. The recommendations for the nurse-patient relationship created through this approach can be used by nurses to establish, maintain and evaluate therapeutic relationships with patients to deliver person-centred fundamental care. Future research should evaluate the validity and impact of these recommendations and test the feasibility of using HIS for other areas of nursing practice and further refine the approach. © 2016 John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pasha, M. Fayzul K.; Yang, Majntxov; Yeasmin, Dilruba
Benefiting from the rapid development of multiple geospatial data sets on topography, hydrology, and existing energy-water infrastructures, the reconnaissance level hydropower resource assessment can now be conducted using geospatial models in all regions of the US. Furthermore, the updated techniques can be used to estimate the total undeveloped hydropower potential across all regions, and may eventually help identify further hydropower opportunities that were previously overlooked. To enhance the characterization of higher energy density stream-reaches, this paper explored the sensitivity of hydropower stream-reach identification to geospatial resolution using the geospatial merit matrix based hydropower resource assessment (GMM-HRA) model. GMM-HRA model simulations were conducted with eight different spatial resolutions on six U.S. Geological Survey (USGS) 8-digit hydrologic units (HUC8) located in three different terrains: Flat, Mild, and Steep. The results showed that more hydropower potential from higher energy density stream-reaches can be identified with increasing spatial resolution. Both Flat and Mild terrains exhibited lower impacts compared to the Steep terrain. Consequently, greater attention should be applied when selecting the discretization resolution for hydropower resource assessments in future studies.
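The underlying reach-level quantity in such assessments is gross hydropower potential, commonly estimated as P = rho * g * Q * H; the sketch below (not the GMM-HRA code, with invented flows and head drops) illustrates how a coarser discretization that lumps head and flow over longer reaches can mask a short, high-energy-density segment.

# Illustrative sketch (not the GMM-HRA model): gross hydropower potential of a
# reach from P = rho * g * Q * H, and how discretization length changes which
# segments stand out. Flows (m^3/s) and head drops (m) below are invented.
RHO, G = 1000.0, 9.81

def potential_kw(flow_m3s, head_m):
    return RHO * G * flow_m3s * head_m / 1000.0

# Fine discretization: four 1-km segments along a hypothetical river
fine_segments = [
    {"flow": 12.0, "head": 1.0},
    {"flow": 12.5, "head": 6.0},   # short, steep, high-energy-density segment
    {"flow": 13.0, "head": 1.5},
    {"flow": 13.5, "head": 1.0},
]
fine = [potential_kw(s["flow"], s["head"]) for s in fine_segments]

# Coarse discretization: the same river treated as two 2-km reaches,
# which averages flow and lumps head, diluting the steep segment's density
coarse = [
    potential_kw((12.0 + 12.5) / 2, 1.0 + 6.0),
    potential_kw((13.0 + 13.5) / 2, 1.5 + 1.0),
]

print("fine   (kW per 1-km segment):", [round(p) for p in fine])
print("coarse (kW per 2-km reach):  ", [round(p) for p in coarse])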
The Aerosol/Cloud/Ecosystems Mission (ACE)
NASA Technical Reports Server (NTRS)
Schoeberl, Mark
2008-01-01
The goals and measurement strategy of the Aerosol/Cloud/Ecosystems Mission (ACE) are described. ACE will help to answer fundamental science questions associated with aerosols, clouds, air quality and global ocean ecosystems. Specifically, the goals of ACE are: 1) to quantify aerosol-cloud interactions and to assess the impact of aerosols on the hydrological cycle and 2) to determine ocean carbon cycling and other ocean biological processes. It is expected that ACE will: narrow the uncertainty in aerosol-cloud-precipitation interaction and quantify the role of aerosols in climate change; measure the ocean ecosystem changes and precisely quantify ocean carbon uptake; and improve air quality forecasting by determining the height and type of aerosols being transported long distances. Overviews are provided of the aerosol-cloud community measurement strategy, aerosol and cloud observations over South Asia, and ocean biology research goals. Instruments used in the measurement strategy of the ACE mission are also highlighted, including: multi-beam lidar, multiwavelength high spectral resolution lidar, the ocean color instrument (ORCA)--a spectroradiometer for ocean remote sensing, dual-frequency cloud radar and high- and low-frequency microwave radiometers. Future steps for the ACE mission include refining measurement requirements and carrying out additional instrument and payload studies.
XV-15 Tiltrotor Aircraft: 1997 Acoustic Testing
NASA Technical Reports Server (NTRS)
Edwards, Bryan D.; Conner, David A.
2003-01-01
The XV-15 acoustic test is discussed, and measured results are presented. The test was conducted by NASA Langley and Bell Helicopter Textron, Inc., during June - July 1997, at the BHTI test site near Waxahachie, Texas. This was the second in a series of three XV-15 tests to document the acoustic signature of the XV-15 tiltrotor aircraft for a variety of flight conditions and to minimize the noise signature during approach. Tradeoffs between flight procedures and the measured noise are presented to illustrate the noise abatement flight procedures. The test objectives were to: (1) support operation of future tiltrotors by further developing and demonstrating low-noise flight profiles, while maintaining acceptable handling and ride qualities, and (2) refine approach profiles, selected from previous (1995) tiltrotor testing, to incorporate Instrument Flight Rules (IFR), handling qualities constraints, operations and tradeoffs with sound. Primary emphasis was given to the approach flight conditions where blade-vortex interaction (BVI) noise dominates, because this condition influences community noise impact more than any other. An understanding of this part of the noise generating process could guide the development of low noise flight operations and increase the tiltrotor's acceptance in the community.
Impacts of Climate Change on the Collapse of Lowland Maya Civilization
NASA Astrophysics Data System (ADS)
Douglas, Peter M. J.; Demarest, Arthur A.; Brenner, Mark; Canuto, Marcello A.
2016-06-01
Paleoclimatologists have discovered abundant evidence that droughts coincided with the collapse of the Lowland Classic Maya civilization, and some argue that climate change contributed to societal disintegration. Many archaeologists, however, maintain that drought cannot explain the timing or complex nature of societal changes at the end of the Classic Period, between the eighth and eleventh centuries CE. This review presents a compilation of climate proxy data indicating that droughts in the ninth to eleventh centuries were the most severe and frequent in Maya prehistory. Comparison with recent archaeological evidence, however, indicates that the complex economic and political processes that led to the disintegration of states in the southern region of the Maya Lowlands began earlier, preceding the major droughts. Nonetheless, drought clearly contributed to the unusual severity of the Classic Maya collapse, and helped to inhibit the type of recovery seen in earlier periods of Maya prehistory. In the drier northern Maya Lowlands, a later political collapse at ca. 1000 CE appears to be related to ongoing extreme drought. Future interdisciplinary research should use more refined climatological and archaeological data to examine the relationship between climate and social processes throughout the entirety of Maya prehistory.
Psychological responses to the proximity of climate change
NASA Astrophysics Data System (ADS)
Brügger, Adrian; Dessai, Suraje; Devine-Wright, Patrick; Morton, Thomas A.; Pidgeon, Nicholas F.
2015-12-01
A frequent suggestion to increase individuals' willingness to take action on climate change and to support relevant policies is to highlight its proximal consequences, that is, those that are close in space and time. But previous studies that have tested this proximizing approach have not revealed the expected positive effects on individual action and support for addressing climate change. We present three lines of psychological reasoning that provide compelling arguments as to why highlighting proximal impacts of climate change might not be as effective a way to increase individual mitigation and adaptation efforts as is often assumed. Our contextualization of the proximizing approach within established psychological research suggests that, depending on the particular theoretical perspective one takes on this issue, and on specific individual characteristics suggested by these perspectives, proximizing can bring about the intended positive effects, can have no (visible) effect or can even backfire. Thus, the effects of proximizing are much more complex than is commonly assumed. Revealing this complexity contributes to a refined theoretical understanding of the role that psychological distance plays in the context of climate change and opens up further avenues for future research and for interventions.
NASA Astrophysics Data System (ADS)
Cooper, L. A.; Ballantyne, A.
2017-12-01
Forest disturbances are critical components of ecosystems. Knowledge of their prevalence and impacts is necessary to accurately describe forest health and ecosystem services through time. While there are currently several methods available to identify and describe forest disturbances, especially those which occur in North America, the process remains inefficient and inaccessible in many parts of the world. Here, we introduce a preliminary approach to streamline and automate both the detection and attribution of forest disturbances. We use a combination of the Breaks for Additive Season and Trend (BFAST) detection algorithm to detect disturbances in combination with supervised and unsupervised classification algorithms to attribute the detections to disturbance classes. Both spatial and temporal disturbance characteristics are derived and utilized for the goal of automating the disturbance attribution process. The resulting preliminary algorithm is applied to up-scaled (100m) Landsat data for several different ecosystems in North America, with varying success. Our results indicate that supervised classification is more reliable than unsupervised classification, but that limited training data are required for a region. Future work will improve the algorithm through refining and validating at sites within North America before applying this approach globally.
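The detect-then-attribute idea can be sketched schematically as follows; this is not the BFAST-based pipeline itself, but a naive single-break test on synthetic NDVI series followed by a random-forest attribution step, with the disturbance classes and break features invented for illustration.

# Schematic detect-then-attribute sketch (NOT the BFAST pipeline used in the
# study): a naive single-break test on synthetic NDVI series, then a
# random-forest classifier that attributes breaks to invented disturbance
# classes using simple features (break magnitude and post-break slope).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def detect_break(series):
    """Return (index, magnitude) of the split that maximizes the mean shift."""
    best = max(range(4, len(series) - 4),
               key=lambda i: abs(series[:i].mean() - series[i:].mean()))
    return best, series[best:].mean() - series[:best].mean()

def break_features(series):
    idx, magnitude = detect_break(series)
    post_break_slope = np.polyfit(np.arange(len(series) - idx), series[idx:], 1)[0]
    return [magnitude, post_break_slope]

# Invented training data: abrupt "harvest" breaks vs slow "insect" declines
def synthetic(kind):
    s = 0.7 + 0.02 * rng.standard_normal(40)
    if kind == "harvest":
        s[20:] -= 0.3
        s[20:] += 0.01 * np.arange(20)   # regrowth after the abrupt drop
    else:                                # "insect": gradual decline
        s[20:] -= np.linspace(0, 0.2, 20)
    return s

X = [break_features(synthetic(k)) for k in ["harvest"] * 50 + ["insect"] * 50]
y = ["harvest"] * 50 + ["insect"] * 50
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict([break_features(synthetic("harvest"))]))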
Hayes, Joseph; Schimel, Jeff; Arndt, Jamie; Faucher, Erik H
2010-09-01
Terror management theory (TMT) highlights the motivational impact of thoughts of death in various aspects of everyday life. Since its inception in 1986, research on TMT has undergone a slight but significant shift from an almost exclusive focus on the manipulation of thoughts of death to a marked increase in studies that measure the accessibility of death-related cognition. Indeed, the number of death-thought accessibility (DTA) studies in the published literature has grown substantially in recent years. In light of this increasing reliance on the DTA concept, the present article is meant to provide a comprehensive theoretical and empirical review of the literature employing this concept. After discussing the roots of DTA, the authors outline the theoretical refinements to TMT that have accompanied significant research findings associated with the DTA concept. Four distinct categories (mortality salience, death association, anxiety-buffer threat, and dispositional) are derived to organize the reviewed DTA studies, and the theoretical implications of each category are discussed. Finally, a number of lingering empirical and theoretical issues in the DTA literature are discussed with the aim of stimulating and focusing future research on DTA specifically and TMT in general.
Using comparative genome analysis to identify problems in annotated microbial genomes.
Poptsova, Maria S; Gogarten, J Peter
2010-07-01
Genome annotation is a tedious task that is mostly done by automated methods; however, the accuracy of these approaches has been questioned since the beginning of the sequencing era. Genome annotation is a multilevel process, and errors can emerge at different stages: during sequencing, as a result of gene-calling procedures, and in the process of assigning gene functions. Missed or wrongly annotated genes differentially impact different types of analyses. Here we discuss and demonstrate how the methods of comparative genome analysis can refine annotations by locating missing orthologues. We also discuss possible reasons for errors and show that the second-generation annotation systems, which combine multiple gene-calling programs with similarity-based methods, perform much better than the first annotation tools. Since old errors may propagate to the newly sequenced genomes, we emphasize that the problem of continuously updating popular public databases is an urgent and unresolved one. Due to the progress in genome-sequencing technologies, automated annotation techniques will remain the main approach in the future. Researchers need to be aware of the existing errors in the annotation of even well-studied genomes, such as Escherichia coli, and consider additional quality control for their results.
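The missing-orthologue check described above reduces to flagging gene families that are annotated in most related genomes but absent from the target annotation; the toy sketch below illustrates that logic with invented gene families and genomes, not the authors' pipeline.

# Toy sketch of the "missing orthologue" idea (not the authors' pipeline):
# a gene family present in most related genomes but absent from the target
# annotation is flagged as a candidate missed gene call. Data are invented.
ortholog_table = {
    # family: set of genomes in which an orthologue is annotated
    "dnaA":   {"genomeA", "genomeB", "genomeC", "target"},
    "recX":   {"genomeA", "genomeB", "genomeC"},   # missing in target
    "hypo42": {"genomeB"},                          # too sparse to be suspicious
}
relatives = {"genomeA", "genomeB", "genomeC"}

candidates = [family for family, genomes in ortholog_table.items()
              if "target" not in genomes
              and len(genomes & relatives) >= 0.8 * len(relatives)]
print("candidate missed annotations in target:", candidates)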
Rethinking the extrinsic incubation period of malaria parasites.
Ohm, Johanna R; Baldini, Francesco; Barreaux, Priscille; Lefevre, Thierry; Lynch, Penelope A; Suh, Eunho; Whitehead, Shelley A; Thomas, Matthew B
2018-03-12
The time it takes for malaria parasites to develop within a mosquito, and become transmissible, is known as the extrinsic incubation period, or EIP. EIP is a key parameter influencing transmission intensity as it combines with mosquito mortality rate and competence to determine the number of mosquitoes that ultimately become infectious. In spite of its epidemiological significance, data on EIP are scant. Current approaches to estimate EIP are largely based on temperature-dependent models developed from data collected on parasite development within a single mosquito species in the 1930s. These models assume that the only factor affecting EIP is mean environmental temperature. Here, we review evidence to suggest that in addition to mean temperature, EIP is likely influenced by genetic diversity of the vector, diversity of the parasite, and variation in a range of biotic and abiotic factors that affect mosquito condition. We further demonstrate that the classic approach of measuring EIP as the time at which mosquitoes first become infectious likely misrepresents EIP for a mosquito population. We argue for a better understanding of EIP to improve models of transmission, refine predictions of the possible impacts of climate change, and determine the potential evolutionary responses of malaria parasites to current and future mosquito control tools.
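The temperature-only EIP models referred to above are typically degree-day formulations of the form EIP = DD / (T - Tmin); the sketch below encodes that form with the commonly cited approximate constants for Plasmodium falciparum (about 111 degree-days above roughly 16 degrees C), which should be treated as illustrative rather than definitive.

# Classic degree-day form of the extrinsic incubation period (EIP):
#   EIP (days) = DD / (T - T_min)
# The constants below (DD ~ 111 degree-days, T_min ~ 16 C for P. falciparum)
# are commonly cited approximations; treat them as illustrative.
def eip_days(mean_temp_c, degree_days=111.0, t_min_c=16.0):
    if mean_temp_c <= t_min_c:
        return float("inf")   # development effectively does not complete
    return degree_days / (mean_temp_c - t_min_c)

for t in (18, 21, 24, 27, 30):
    print(f"{t} C -> EIP ~ {eip_days(t):4.1f} days")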
Bohlen, Martin; Hayes, Erika R; Bohlen, Benjamin; Bailoo, Jeremy D; Crabbe, John C; Wahlsten, Douglas
2014-10-01
Eight standard inbred mouse strains were evaluated for ethanol effects on a refined battery of behavioral tests in a study that was originally designed to assess the influence of rat odors in the colony on mouse behaviors. As part of the design of the study, two experimenters conducted the tests, and the study was carefully balanced so that equal numbers of mice in all groups and times of day were tested by each experimenter. A defect in airflow in the facility compromised the odor manipulation, and in fact the different odor exposure groups did not differ in their behaviors. The two experimenters, however, obtained markedly different results for three of the tests. Certain of the experimenter effects arose from the way they judged behaviors that were not automated and had to be rated by the experimenter, such as slips on the balance beam. Others were not evident prior to ethanol injection but had a major influence after the injection. For several measures, the experimenter effects were notably different for different inbred strains. Methods to evaluate and reduce the impact of experimenter effects in future research are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.
Development and validation of a new survey: Perceptions of Teaching as a Profession (PTaP)
NASA Astrophysics Data System (ADS)
Adams, Wendy
2017-01-01
To better understand the impact of efforts to train more science teachers such as the PhysTEC Project and to help with early identification of future teachers, we are developing the survey of Perceptions of Teaching as a Profession (PTaP) to measure students' views of teaching as a career, their interest in teaching and the perceived climate of physics departments towards teaching as a profession. The instrument consists of a series of statements which require a response using a 5-point Likert-scale and can be easily administered online. The survey items were drafted by a team of researchers and physics teacher candidates and then reviewed by an advisory committee of 20 physics teacher educators and practicing teachers. We conducted 27 interviews with both teacher candidates and non-teaching STEM majors. The survey was refined through an iterative process of student interviews and item clarification until all items were interpreted consistently and answered for consistent reasons. In this presentation the preliminary results from the student interviews as well as the results of item analysis and a factor analysis on 900 student responses will be shared.
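The factor analysis mentioned above can be sketched on synthetic Likert-scale data as follows; the item count, number of factors and simulated responses are placeholders, not the PTaP instrument or its 900 actual responses.

# Sketch of an exploratory factor analysis on synthetic 5-point Likert data
# (placeholders, not the PTaP responses): 900 respondents x 20 items, with
# two latent factors built in, recovered with scikit-learn's FactorAnalysis.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n_respondents, n_items = 900, 20

latent = rng.standard_normal((n_respondents, 2))   # two latent traits
loadings = np.zeros((2, n_items))
loadings[0, :10] = 0.8                              # items 1-10 load on factor 1
loadings[1, 10:] = 0.8                              # items 11-20 load on factor 2
raw = latent @ loadings + 0.5 * rng.standard_normal((n_respondents, n_items))
likert = np.clip(np.round(3 + raw), 1, 5)           # map onto a 1-5 scale

fa = FactorAnalysis(n_components=2, random_state=0).fit(likert)
print("items loading most strongly on each recovered factor:")
for k, comp in enumerate(fa.components_):
    print(f"  factor {k + 1}:", np.argsort(-np.abs(comp))[:5] + 1)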
Pasha, M. Fayzul K.; Yang, Majntxov; Yeasmin, Dilruba; ...
2016-01-07
Benefiting from the rapid development of multiple geospatial data sets on topography, hydrology, and existing energy-water infrastructures, the reconnaissance level hydropower resource assessment can now be conducted using geospatial models in all regions of the US. Furthermore, the updated techniques can be used to estimate the total undeveloped hydropower potential across all regions, and may eventually help identify further hydropower opportunities that were previously overlooked. To enhance the characterization of higher energy density stream-reaches, this paper explored the sensitivity of hydropower stream-reach identification to geospatial resolution using the geospatial merit matrix based hydropower resource assessment (GMM-HRA) model. GMM-HRA model simulations were conducted with eight different spatial resolutions on six U.S. Geological Survey (USGS) 8-digit hydrologic units (HUC8) located in three different terrains: Flat, Mild, and Steep. The results showed that more hydropower potential from higher energy density stream-reaches can be identified with increasing spatial resolution. Both Flat and Mild terrains exhibited lower impacts compared to the Steep terrain. Consequently, greater attention should be applied when selecting the discretization resolution for hydropower resource assessments in future studies.
Baldwin, Austin K.; Robertson, Dale M.; Saad, David A.; Magruder, Christopher
2013-01-01
In 2008, the U.S. Geological Survey and the Milwaukee Metropolitan Sewerage District initiated a study to develop regression models to estimate real-time concentrations and loads of chloride, suspended solids, phosphorus, and bacteria in streams near Milwaukee, Wisconsin. To collect monitoring data for calibration of models, water-quality sensors and automated samplers were installed at six sites in the Menomonee River drainage basin. The sensors continuously measured four potential explanatory variables: water temperature, specific conductance, dissolved oxygen, and turbidity. Discrete water-quality samples were collected and analyzed for five response variables: chloride, total suspended solids, total phosphorus, Escherichia coli bacteria, and fecal coliform bacteria. Using the first year of data, regression models were developed to continuously estimate the response variables on the basis of the continuously measured explanatory variables. Those models were published in a previous report. In this report, those models are refined using 2 years of additional data, and the relative improvement in model predictability is discussed. In addition, a set of regression models is presented for a new site in the Menomonee River Basin, Underwood Creek at Wauwatosa. The refined models use the same explanatory variables as the original models. The chloride models all used specific conductance as the explanatory variable, except for the model for the Little Menomonee River near Freistadt, which used both specific conductance and turbidity. Total suspended solids and total phosphorus models used turbidity as the only explanatory variable, and bacteria models used water temperature and turbidity as explanatory variables. An analysis of covariance (ANCOVA), used to compare the coefficients in the original models to those in the refined models calibrated using all of the data, showed that only 3 of the 25 original models changed significantly. Root-mean-squared errors (RMSEs) calculated for both the original and refined models using the entire dataset showed a median improvement in RMSE of 2.1 percent, with a range of 0.0–13.9 percent. Therefore, most of the original models did almost as well at estimating concentrations during the validation period (October 2009–September 2011) as the refined models, which were calibrated using those data. Application of these refined models can produce continuously estimated concentrations of chloride, total suspended solids, total phosphorus, E. coli bacteria, and fecal coliform bacteria that may assist managers in quantifying the effects of land-use changes and improvement projects, establishing total maximum daily loads, and making better informed decisions in the future.
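The surrogate-regression approach described here reduces to fitting and validating a simple regression of a lab-analyzed constituent on a continuously measured one; the sketch below fits log-transformed chloride against specific conductance on synthetic data and reports an RMSE, with every number invented rather than drawn from the Menomonee River study.

# Sketch of a surrogate regression of the type described (not the USGS models):
# chloride concentration estimated from continuously measured specific
# conductance, fit on log-transformed values, with RMSE as the skill metric.
# All data below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(2)
spec_cond = rng.uniform(200, 2000, size=120)                          # uS/cm
true_chloride = 0.25 * spec_cond ** 1.05                               # invented relation
chloride = true_chloride * np.exp(0.1 * rng.standard_normal(120))      # mg/L, with noise

# Fit log10(chloride) = b0 + b1 * log10(specific conductance)
b1, b0 = np.polyfit(np.log10(spec_cond), np.log10(chloride), 1)
estimated = 10 ** (b0 + b1 * np.log10(spec_cond))

rmse = np.sqrt(np.mean((estimated - chloride) ** 2))
print(f"log-log fit: log10(Cl) = {b0:.2f} + {b1:.2f} * log10(SC)")
print(f"RMSE = {rmse:.1f} mg/L over {len(chloride)} samples")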
NASA Astrophysics Data System (ADS)
Wang, Jinxiang; Yang, Rui; Jiang, Li; Wang, Xiaoxu; Zhou, Nan
2013-11-01
Nanocrystalline (NC) copper was fabricated by severe plastic deformation of coarse-grained copper at a high strain rate under explosive loading. The feasibility of grain refinement under different explosive loadings and the influence of the overall temperature rise on grain refinement under impact compression were studied in this paper. The calculation model for the macroscopic temperature rise was established according to adiabatic shock compression theory. The calculation model for coarse-grained copper was built with the Voronoi method, and the microscopic temperature rise resulting from severe plastic deformation of the grains was calculated with the ANSYS/LS-DYNA finite element software. The results show that it is feasible to fabricate NC copper by explosive dynamic deformation of coarse-grained copper, and that the average grain size of the NC copper can be controlled between 200 and 400 nm. The overall temperature rise increases with increasing explosive thickness. An ammonium nitrate fuel oil explosive was used, and five explosive thicknesses of 20, 25, 30, 35 and 45 mm, with the same diameter, and a 20 mm fly plate were adopted. The maximum macroscopic and microscopic temperature rises were 532.4 K and 143.4 K, respectively, which has little effect on grain refinement because the overall temperature rise remains below the grain growth temperature according to high-pressure melting theory.
Impact of Variable-Resolution Meshes on Regional Climate Simulations
NASA Astrophysics Data System (ADS)
Fowler, L. D.; Skamarock, W. C.; Bruyere, C. L.
2014-12-01
The Model for Prediction Across Scales (MPAS) is currently being used for seasonal-scale simulations on globally-uniform and regionally-refined meshes. Our ongoing research aims at analyzing simulations of tropical convective activity and tropical cyclone development during one hurricane season over the North Atlantic Ocean, contrasting statistics obtained with a variable-resolution mesh against those obtained with a quasi-uniform mesh. Analyses focus on the spatial distribution, frequency, and intensity of convective and grid-scale precipitations, and their relative contributions to the total precipitation as a function of the horizontal scale. Multi-month simulations initialized on May 1st 2005 using ERA-Interim re-analyses indicate that MPAS performs satisfactorily as a regional climate model for different combinations of horizontal resolutions and transitions between the coarse and refined meshes. Results highlight seamless transitions for convection, cloud microphysics, radiation, and land-surface processes between the quasi-uniform and locally-refined meshes, despite the fact that the physics parameterizations were not developed for variable resolution meshes. Our goal of analyzing the performance of MPAS is twofold. First, we want to establish that MPAS can be successfully used as a regional climate model, bypassing the need for nesting and nudging techniques at the edges of the computational domain as done in traditional regional climate modeling. Second, we want to assess the performance of our convective and cloud microphysics parameterizations as the horizontal resolution varies between the lower-resolution quasi-uniform and higher-resolution locally-refined areas of the global domain.
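The precipitation-partitioning diagnostic described above amounts to comparing convective and grid-scale (explicit) precipitation as a function of mesh spacing; the sketch below shows that bookkeeping on synthetic cell-level output (not MPAS data), with the cell spacings and precipitation values invented.

# Minimal bookkeeping sketch (not MPAS output processing): partition of total
# precipitation into convective and grid-scale (explicit) parts, binned by
# nominal cell spacing of a variable-resolution mesh. All values are invented.
import numpy as np

rng = np.random.default_rng(3)
n_cells = 10_000
cell_spacing_km = rng.choice([15.0, 60.0], size=n_cells, p=[0.3, 0.7])

# Invented behaviour: coarser cells carry relatively more convective precip
conv = rng.gamma(2.0, 1.0, n_cells) * (cell_spacing_km / 60.0)
grid_scale = rng.gamma(2.0, 1.0, n_cells) * (15.0 / cell_spacing_km) ** 0.5

for dx in (15.0, 60.0):
    mask = cell_spacing_km == dx
    conv_frac = conv[mask].sum() / (conv[mask] + grid_scale[mask]).sum()
    print(f"dx = {dx:4.0f} km: convective fraction of total precip = {conv_frac:.2f}")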
Impact of Variable-Resolution Meshes on Regional Climate Simulations
NASA Astrophysics Data System (ADS)
Fowler, L. D.; Skamarock, W. C.; Bruyere, C. L.
2013-12-01
The Model for Prediction Across Scales (MPAS) is currently being used for seasonal-scale simulations on globally-uniform and regionally-refined meshes. Our ongoing research aims at analyzing simulations of tropical convective activity and tropical cyclone development during one hurricane season over the North Atlantic Ocean, contrasting statistics obtained with a variable-resolution mesh against those obtained with a quasi-uniform mesh. Analyses focus on the spatial distribution, frequency, and intensity of convective and grid-scale precipitations, and their relative contributions to the total precipitation as a function of the horizontal scale. Multi-month simulations initialized on May 1st 2005 using NCEP/NCAR re-analyses indicate that MPAS performs satisfactorily as a regional climate model for different combinations of horizontal resolutions and transitions between the coarse and refined meshes. Results highlight seamless transitions for convection, cloud microphysics, radiation, and land-surface processes between the quasi-uniform and locally-refined meshes, despite the fact that the physics parameterizations were not developed for variable resolution meshes. Our goal of analyzing the performance of MPAS is twofold. First, we want to establish that MPAS can be successfully used as a regional climate model, bypassing the need for nesting and nudging techniques at the edges of the computational domain as done in traditional regional climate modeling. Second, we want to assess the performance of our convective and cloud microphysics parameterizations as the horizontal resolution varies between the lower-resolution quasi-uniform and higher-resolution locally-refined areas of the global domain.
You owe it to yourself: Boosting retirement saving with a responsibility-based appeal
Bryan, Christopher J.; Hershfield, Hal E.
2011-01-01
Americans are not saving enough for retirement. Previous research suggests this is due, in part, to people’s tendency to think of the future self as more like another person than like the present self, making saving feel like giving money away rather than like investing in oneself. Using objective employer saving data, a field experiment capitalized on this phenomenon to increase saving. It compared the effectiveness of a novel message—one appealing to people’s sense of “social” responsibility to their future selves—with a more traditional appeal to people’s sense of rational self-interest. The social-responsibility-to-the-future-self message resulted in larger increases in saving than the self-interest message, but only to the extent that people felt a strong “social” connection to their future selves. These results broaden our understanding of the psychology of moral responsibility and refine our understanding of the role of future-self continuity in fostering intertemporal patience. They further demonstrate how understanding conceptions of the self over time can suggest solutions to important and challenging policy problems. PMID:22103720
You owe it to yourself: boosting retirement saving with a responsibility-based appeal.
Bryan, Christopher J; Hershfield, Hal E
2012-08-01
Americans are not saving enough for retirement. Previous research suggests that this is due, in part, to people's tendency to think of the future self as more like another person than like the present self, making saving feel like giving money away rather than like investing in oneself. Using objective employer saving data, a field experiment capitalized on this phenomenon to increase saving. It compared the effectiveness of a novel message--one appealing to people's sense of "social" responsibility to their future selves--with a more traditional appeal to people's sense of rational self-interest. The social-responsibility-to-the-future-self message resulted in larger increases in saving than the self-interest message, but only to the extent that people felt a strong "social" connection to their future selves. These results broaden our understanding of the psychology of moral responsibility and refine our understanding of the role of future-self continuity in fostering intertemporal patience. They further demonstrate how understanding conceptions of the self over time can suggest solutions to important and challenging policy problems. (PsycINFO Database Record (c) 2012 APA, all rights reserved).