2011-01-01
Background: Insecticide-treated mosquito nets (ITNs) and indoor-residual spraying have been scaled-up across sub-Saharan Africa as part of international efforts to control malaria. These interventions have the potential to significantly impact child survival. The Lives Saved Tool (LiST) was developed to provide national and regional estimates of cause-specific mortality based on the extent of intervention coverage scale-up. We compared the percent reduction in all-cause child mortality estimated by LiST against measured reductions in all-cause child mortality from studies assessing the impact of vector control interventions in Africa. Methods: We performed a literature search for appropriate studies and compared reductions in all-cause child mortality estimated by LiST to 4 studies that estimated changes in all-cause child mortality following the scale-up of vector control interventions. The following key parameters measured by each study were applied to available country projections: baseline all-cause child mortality rate, proportion of mortality due to malaria, and population coverage of vector control interventions at baseline and follow-up years. Results: The percent reduction in all-cause child mortality estimated by the LiST model fell within the confidence intervals around the measured mortality reductions for all 4 studies. Two of the LiST estimates overestimated the mortality reductions by 6.1 and 4.2 percentage points (33% and 35% relative to the measured estimates), while two underestimated the mortality reductions by 4.7 and 6.2 percentage points (22% and 25% relative to the measured estimates). Conclusions: The LiST model did not systematically under- or overestimate the impact of ITNs on all-cause child mortality. These results show the LiST model to perform reasonably well at estimating the effect of vector control scale-up on child mortality when compared against measured data from studies across a range of malaria transmission settings. The LiST model appears to be a useful tool in estimating the potential mortality reduction achieved from scaling-up malaria control interventions. PMID:21501453
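The LiST comparison above hinges on a simple chain of inputs: baseline all-cause mortality, the malaria-attributable fraction, and the change in ITN coverage. The sketch below illustrates that chain in Python; it is not the LiST implementation, and the ITN protective-efficacy value is an illustrative assumption rather than a value taken from the study.

```python
# Illustrative sketch (not the LiST implementation) of how a percent reduction in
# all-cause child mortality can be derived from vector-control scale-up inputs.
# The efficacy value below is an assumed protective efficacy of ITNs against
# malaria-specific child deaths, used purely for illustration.

def mortality_reduction(all_cause_u5mr, malaria_fraction,
                        coverage_baseline, coverage_followup,
                        itn_efficacy=0.55):
    """Return the percent reduction in all-cause child mortality.

    all_cause_u5mr   : baseline all-cause under-5 mortality rate (deaths/1000)
    malaria_fraction : proportion of under-5 deaths attributable to malaria
    coverage_*       : ITN population coverage at baseline and follow-up (0-1)
    itn_efficacy     : assumed protective efficacy against malaria deaths
    """
    malaria_u5mr = all_cause_u5mr * malaria_fraction
    deaths_averted = malaria_u5mr * itn_efficacy * (coverage_followup - coverage_baseline)
    return 100.0 * deaths_averted / all_cause_u5mr

# Example: 150/1000 baseline mortality, 20% of deaths malaria-attributable,
# ITN coverage scaled up from 5% to 60%.
print(mortality_reduction(150, 0.20, 0.05, 0.60))  # roughly a 6% reduction
```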
Frasher, Sarah K; Woodruff, Tracy M; Bouldin, Jennifer L
2016-06-01
In efforts to reduce nonpoint source runoff and improve water quality, Best Management Practices (BMPs) were implemented in the Outlet Larkin Creek Watershed. Farmers need to make scientifically informed decisions concerning BMPs addressing contaminants from agricultural fields. The BMP Tool was developed from previous studies to estimate BMP effectiveness at reducing nonpoint source contaminants. The purpose of this study was to compare the measured percent reduction of dissolved phosphorus (DP) and total suspended solids to the reported percent reductions from the BMP Tool for validation. Similarities were observed between the BMP Tool predictions and the measured water quality parameters. Construction of a sedimentation pond resulted in a 74%-76% reduction in DP, compared to the 80% reduction predicted by the BMP Tool. However, further research is needed to validate the tool for additional water quality parameters. The BMP Tool is recommended for future BMP implementation as a useful predictor for farmers.
DOT National Transportation Integrated Search
2010-09-01
Tools are proposed for carbon footprint estimation of transportation construction projects and decision support for construction firms that must make equipment choice and usage decisions that affect profits, project duration and greenhouse gas em...
Vehicle Technology Simulation and Analysis Tools | Transportation Research
NREL Vehicle Technology Simulation and Analysis Tools address vehicle technologies with the potential to achieve significant fuel savings and emission reductions. The Automotive Deployment Options Projection Tool (ADOPT) modeling tool estimates vehicle technology
Tools for estimating VMT reductions from built environment changes.
DOT National Transportation Integrated Search
2013-06-01
Built environment characteristics are associated with walking, bicycling, transit use, and vehicle miles traveled (VMT). Developing built environments supportive of walking, bicycling, and transit use can help meet state VMT reduction goals. But ...
Lloyd-Jones, Donald M.; Huffman, Mark D.; Karmali, Kunal N.; Sanghavi, Darshak M.; Wright, Janet S.; Pelser, Colleen; Gulati, Martha; Masoudi, Frederick A.; Goff, David C.
2016-01-01
The Million Hearts Initiative has a goal of preventing 1 million heart attacks and strokes—the leading causes of mortality—through several public health and healthcare strategies by 2017. The American Heart Association and American College of Cardiology support the program. The Cardiovascular Risk Reduction Model was developed by Million Hearts and the Center for Medicare & Medicaid Services as a strategy to assess a value-based payment approach toward reduction in 10-year predicted risk of atherosclerotic cardiovascular disease (ASCVD) by implementing cardiovascular preventive strategies to manage the “ABCS” (aspirin therapy in appropriate patients, blood pressure control, cholesterol management, and smoking cessation). The purpose of this special report is to describe the development and intended use of the Million Hearts Longitudinal ASCVD Risk Assessment Tool. The Million Hearts Tool reinforces and builds on the “2013 ACC/AHA Guideline on the Assessment of Cardiovascular Risk” by allowing clinicians to estimate baseline and updated 10-year ASCVD risk estimates for primary prevention patients adhering to the appropriate ABCS over time, alone or in combination. The tool provides updated risk estimates based on evidence from high-quality systematic reviews and meta-analyses of the ABCS therapies. This novel approach to personalized estimation of benefits from risk-reducing therapies in primary prevention may help target therapies to those in whom they will provide the greatest benefit, and serves as the basis for a Center for Medicare & Medicaid Services program designed to evaluate the Million Hearts Cardiovascular Risk Reduction Model. PMID:27825770
Lloyd-Jones, Donald M; Huffman, Mark D; Karmali, Kunal N; Sanghavi, Darshak M; Wright, Janet S; Pelser, Colleen; Gulati, Martha; Masoudi, Frederick A; Goff, David C
2017-03-28
The Million Hearts Initiative has a goal of preventing 1 million heart attacks and strokes-the leading causes of mortality-through several public health and healthcare strategies by 2017. The American Heart Association and American College of Cardiology support the program. The Cardiovascular Risk Reduction Model was developed by Million Hearts and the Center for Medicare & Medicaid Services as a strategy to assess a value-based payment approach toward reduction in 10-year predicted risk of atherosclerotic cardiovascular disease (ASCVD) by implementing cardiovascular preventive strategies to manage the "ABCS" (aspirin therapy in appropriate patients, blood pressure control, cholesterol management, and smoking cessation). The purpose of this special report is to describe the development and intended use of the Million Hearts Longitudinal ASCVD Risk Assessment Tool. The Million Hearts Tool reinforces and builds on the "2013 ACC/AHA Guideline on the Assessment of Cardiovascular Risk" by allowing clinicians to estimate baseline and updated 10-year ASCVD risk estimates for primary prevention patients adhering to the appropriate ABCS over time, alone or in combination. The tool provides updated risk estimates based on evidence from high-quality systematic reviews and meta-analyses of the ABCS therapies. This novel approach to personalized estimation of benefits from risk-reducing therapies in primary prevention may help target therapies to those in whom they will provide the greatest benefit, and serves as the basis for a Center for Medicare & Medicaid Services program designed to evaluate the Million Hearts Cardiovascular Risk Reduction Model. Copyright © 2017 American Heart Association, Inc., and the American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Framework to parameterize and validate APEX to support deployment of the nutrient tracking tool
USDA-ARS?s Scientific Manuscript database
The Agricultural Policy Environmental eXtender (APEX) model is the scientific basis for the Nutrient Tracking Tool (NTT). NTT is an enhanced version of the Nitrogen Trading Tool, a user-friendly web-based computer program originally developed by the USDA. NTT was developed to estimate reductions in...
L'Italien, G; Ford, I; Norrie, J; LaPuerta, P; Ehreth, J; Jackson, J; Shepherd, J
2000-03-15
The clinical decision to treat hypercholesterolemia is premised on an awareness of patient risk, and cardiac risk prediction models offer a practical means of determining such risk. However, these models are based on observational cohorts where estimates of the treatment benefit are largely inferred. The West of Scotland Coronary Prevention Study (WOSCOPS) provides an opportunity to develop a risk-benefit prediction model from the actual observed primary event reduction seen in the trial. Five-year Cox model risk estimates were derived from all WOSCOPS subjects (n = 6,595 men, aged 45 to 64 years old at baseline) using factors previously shown to be predictive of definite fatal coronary heart disease or nonfatal myocardial infarction. Model risk factors included age, diastolic blood pressure, total cholesterol/high-density lipoprotein ratio (TC/HDL), current smoking, diabetes, family history of fatal coronary heart disease, nitrate use or angina, and treatment (placebo/40-mg pravastatin). All risk factors were expressed as categorical variables to facilitate risk assessment. Risk estimates were incorporated into a simple, hand-held slide rule or risk tool. Risk estimates were identified for 5-year age bands (45 to 65 years), 4 categories of TC/HDL ratio (<5.5, 5.5 to <6.5, 6.5 to <7.5, ≥7.5), 2 levels of diastolic blood pressure (<90, ≥90 mm Hg), from 0 to 3 additional risk factors (current smoking, diabetes, family history of premature fatal coronary heart disease, nitrate use or angina), and pravastatin treatment. Five-year risk estimates ranged from 2% in very low-risk subjects to 61% in the very high-risk subjects. Risk reduction due to pravastatin treatment averaged 31%. Thus, the Cardiovascular Event Reduction Tool (CERT) is a risk prediction model derived from the WOSCOPS trial. Its use will help physicians identify patients who will benefit from cholesterol reduction.
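For readers unfamiliar with how a Cox model becomes a slide-rule risk tool, the sketch below shows the generic mechanics: a linear predictor built from categorical risk factors is exponentiated and applied to a baseline survival probability. The coefficients and baseline survival are invented placeholders, not the published WOSCOPS/CERT values.

```python
import math

# Hypothetical coefficients and baseline survival -- NOT the published WOSCOPS
# values -- shown only to illustrate how a Cox model turns categorical risk
# factors into a 5-year event probability of the kind encoded in CERT.
BETA = {
    "age_band":          0.25,   # per 5-year band above 45-49
    "tc_hdl_band":       0.30,   # per category of TC/HDL ratio
    "dbp_ge_90":         0.20,
    "extra_risk_factor": 0.35,   # smoking, diabetes, family history, angina/nitrates
    "pravastatin":      -0.37,   # treatment effect (hazard ratio exp(-0.37) ~ 0.69)
}
S0_5YR = 0.97  # assumed baseline 5-year event-free survival

def five_year_risk(age_band, tc_hdl_band, dbp_ge_90, n_extra, on_pravastatin):
    lp = (BETA["age_band"] * age_band
          + BETA["tc_hdl_band"] * tc_hdl_band
          + BETA["dbp_ge_90"] * int(dbp_ge_90)
          + BETA["extra_risk_factor"] * n_extra
          + BETA["pravastatin"] * int(on_pravastatin))
    return 1.0 - S0_5YR ** math.exp(lp)

# 60-64 year old (band 3), TC/HDL in the top category (band 3), DBP >= 90,
# smoker + diabetic, without and with pravastatin treatment:
print(f"{five_year_risk(3, 3, True, 2, False):.1%}")
print(f"{five_year_risk(3, 3, True, 2, True):.1%}")
```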
Tool-specific performance of vibration-reducing gloves for attenuating fingers-transmitted vibration
Welcome, Daniel E.; Dong, Ren G.; Xu, Xueyan S.; Warren, Christopher; McDowell, Thomas W.
2016-01-01
BACKGROUND: Fingers-transmitted vibration can cause vibration-induced white finger. The effectiveness of vibration-reducing (VR) gloves for reducing hand transmitted vibration to the fingers has not been sufficiently examined. OBJECTIVE: The objective of this study is to examine tool-specific performance of VR gloves for reducing finger-transmitted vibrations in three orthogonal directions (3D) from powered hand tools. METHODS: A transfer function method was used to estimate the tool-specific effectiveness of four typical VR gloves. The transfer functions of the VR glove fingers in three directions were either measured in this study or during a previous study using a 3D laser vibrometer. More than seventy vibration spectra of various tools or machines were used in the estimations. RESULTS: When assessed based on frequency-weighted acceleration, the gloves provided little vibration reduction. In some cases, the gloves amplified the vibration by more than 10%, especially the neoprene glove. However, the neoprene glove did the best when the assessment was based on unweighted acceleration. The neoprene glove was able to reduce the vibration by 10% or more of the unweighted vibration for 27 out of the 79 tools. If the dominant vibration of a tool handle or workpiece was in the shear direction relative to the fingers, as observed in the operation of needle scalers, hammer chisels, and bucking bars, the gloves did not reduce the vibration but increased it. CONCLUSIONS: This study confirmed that the effectiveness for reducing vibration varied with the gloves and the vibration reduction of each glove depended on tool, vibration direction to the fingers, and finger location. VR gloves, including certified anti-vibration gloves, do not provide much vibration reduction when judged based on frequency-weighted acceleration. However, some of the VR gloves can provide more than 10% reduction of the unweighted vibration for some tools or workpieces. Tools and gloves can be matched for better effectiveness for protecting the fingers. PMID:27867313
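The transfer-function method described above amounts to multiplying a tool's vibration spectrum by a glove's transmissibility spectrum band by band and comparing the resulting totals. A minimal sketch, with invented band values and a crude stand-in for the ISO 5349-1 Wh weighting, is shown below.

```python
import numpy as np

# Simplified sketch of the transfer-function method: the vibration reduction of a
# glove for a given tool is estimated by multiplying the tool's acceleration
# spectrum by the glove's transmissibility spectrum, band by band, and comparing
# the frequency-weighted total accelerations. The band values below are
# illustrative placeholders, not measured glove or tool data, and the weighting
# is a crude stand-in for the ISO 5349-1 Wh filter.

bands_hz   = np.array([16, 31.5, 63, 125, 250, 500, 1000])             # octave-band centres
tool_accel = np.array([3.0, 6.0, 9.0, 7.0, 4.0, 2.0, 1.0])             # m/s^2 per band (assumed)
glove_T    = np.array([1.02, 1.00, 0.95, 0.85, 0.70, 0.55, 0.45])      # transmissibility (assumed)
weighting  = np.array([1.0, 0.9, 0.5, 0.25, 0.12, 0.06, 0.03])         # stand-in for Wh

def weighted_total(accel):
    # root-sum-of-squares of the weighted band accelerations
    return np.sqrt(np.sum((weighting * accel) ** 2))

bare   = weighted_total(tool_accel)
gloved = weighted_total(tool_accel * glove_T)
print(f"vibration reduction: {100 * (1 - gloved / bare):.1f} %")
```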
NASA Technical Reports Server (NTRS)
Kirsch, Paul J.; Hayes, Jane; Zelinski, Lillian
2000-01-01
This special case study report presents the Science and Engineering Technical Assessments (SETA) team's findings from exploring how the underlying models of the Advanced Risk Reduction Tool (ARRT) identify, estimate, and integrate Independent Verification & Validation (IV&V) activities. The special case study was conducted under the provisions of SETA Contract Task Order (CTO) 15 and the approved technical approach documented in the CTO-15 Modification #1 Task Project Plan.
Automated Estimation Of Software-Development Costs
NASA Technical Reports Server (NTRS)
Roush, George B.; Reini, William
1993-01-01
COSTMODL is an automated software-development cost-estimation tool. It yields a significant reduction in the risk of cost overruns and failed projects. It accepts a description of the software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of a defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
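COSTMODL itself supports several estimation models; the sketch below uses the published basic-COCOMO organic-mode coefficients as a stand-in to show the kind of effort/schedule/staffing estimate such a tool produces from a size description.

```python
# A minimal COCOMO-style sketch of the kind of estimate a tool like COSTMODL
# produces from a product description. The basic-COCOMO "organic mode"
# coefficients below are the published Boehm values, used here only for
# illustration; they are not COSTMODL's internal model.

def basic_cocomo_organic(ksloc):
    effort_pm  = 2.4 * ksloc ** 1.05      # effort, person-months
    schedule_m = 2.5 * effort_pm ** 0.38  # calendar schedule, months
    avg_staff  = effort_pm / schedule_m
    return effort_pm, schedule_m, avg_staff

effort, schedule, staff = basic_cocomo_organic(32)  # a 32-KSLOC product
print(f"effort   ~ {effort:.0f} person-months")
print(f"schedule ~ {schedule:.1f} months")
print(f"staffing ~ {staff:.1f} people on average")
```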
A Lagrangian stochastic model is proposed as a tool that can be utilized in forecasting remedial performance and estimating the benefits (in terms of flux and mass reduction) derived from a source zone remedial effort. The stochastic functional relationships that describe the hyd...
USDA-ARS?s Scientific Manuscript database
Streams throughout the North Canadian River watershed in northwest Oklahoma, USA have elevated levels of nutrients and sediment. SWAT (Soil and Water Assessment Tool) was used to identify areas that likely contributed disproportionate amounts of phosphorus (P) and sediment to Lake Overholser, the re...
Atmospheric Delay Reduction Using KARAT for GPS Analysis and Implications for VLBI
NASA Technical Reports Server (NTRS)
Ichikawa, Ryuichi; Hobiger, Thomas; Koyama, Yasuhiro; Kondo, Tetsuro
2010-01-01
We have been developing a state-of-the-art tool to estimate the atmospheric path delays by raytracing through mesoscale analysis (MANAL) data, which is operationally used for numerical weather prediction by the Japan Meteorological Agency (JMA). The tools, which we have named KAshima RAytracing Tools (KARAT), are capable of calculating total slant delays and ray-bending angles considering real atmospheric phenomena. KARAT can estimate atmospheric slant delays by an analytical 2-D ray-propagation model by Thayer and a 3-D Eikonal solver. We compared PPP solutions using KARAT with those using the Global Mapping Function (GMF) and Vienna Mapping Function 1 (VMF1) for GPS sites of the GEONET (GPS Earth Observation Network System) operated by the Geographical Survey Institute (GSI). In our comparison 57 stations of GEONET during the year of 2008 were processed. The KARAT solutions are slightly better than the solutions using VMF1 and GMF with a linear gradient model for horizontal and height positions. Our results imply that KARAT is a useful tool for an efficient reduction of atmospheric path delays in radio-based space geodetic techniques such as GNSS and VLBI.
Kenneth L. Clark; Nicholas Skowronski; John Hom; Matthew Duveneck; Yude Pan; Stephen Van Tuyl; Jason Cole; Matthew Patterson; Stephen Maurer
2009-01-01
Our goal is to assist the New Jersey Forest Fire Service and federal wildland fire managers in the New Jersey Pine Barrens in evaluating where and when to conduct hazardous fuel reduction treatments. We used remotely sensed LIDAR (Light Detection and Ranging System) data and field sampling to estimate fuel loads and consumption during prescribed fire treatments. This...
Diesel Emissions Quantifier (DEQ)
The Diesel Emissions Quantifier (Quantifier) is an interactive tool to estimate emission reductions and cost effectiveness. Publications: EPA-420-F-13-008a (420f13008a), EPA-420-B-10-035 (420b10023), EPA-420-B-10-034 (420b10034)
Estimating rare events in biochemical systems using conditional sampling.
Sundar, V S
2017-01-28
The paper focuses on development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining this probability using brute force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
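The core of subset simulation is easy to state: express a small failure probability as a product of conditional probabilities near 0.1 and estimate each with an MCMC chain confined to the current failure domain. The toy sketch below does this in standard-normal space for a rare event whose exact probability is known, so the estimate can be checked; it is an illustration of the technique, not the authors' biochemical implementation.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy subset simulation in standard-normal space (the viewpoint used in the
# paper: map the stochastic simulation onto standard normal variables). Here the
# "rare event" is simply g(u) = sum(u) exceeding a high threshold, so the result
# can be compared with the exact Gaussian tail probability.

def g(u):
    return u.sum()

def modified_metropolis(seed, level, n_steps, spread=1.0):
    """Component-wise Metropolis chain conditioned on g(u) >= level."""
    chain, u = [], seed.copy()
    for _ in range(n_steps):
        cand = u.copy()
        for i in range(u.size):
            prop = u[i] + spread * rng.normal()
            # accept each component with the ratio of standard-normal densities
            if rng.random() < np.exp(0.5 * (u[i] ** 2 - prop ** 2)):
                cand[i] = prop
        if g(cand) >= level:        # reject moves that leave the failure domain
            u = cand
        chain.append(u.copy())
    return chain

def subset_simulation(b, dim=10, n=1000, p0=0.1):
    samples = [rng.normal(size=dim) for _ in range(n)]
    prob = 1.0
    while True:
        vals = np.array([g(u) for u in samples])
        level = np.quantile(vals, 1.0 - p0)
        if level >= b:                          # final level reached
            return prob * np.mean(vals >= b)
        prob *= p0
        seeds = [u for u, v in zip(samples, vals) if v >= level]
        samples = []
        for s in seeds:
            samples += modified_metropolis(s, level, int(1 / p0))

b = 12.0
print("subset simulation:", subset_simulation(b))
print("exact tail       :", norm.sf(b / np.sqrt(10)))
```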
Combat Service Support (CSS) Enabler Functional Assessment (CEFA)
1998-07-01
...Combat Service Support (CSS) enablers/initiatives (E/I), thereby providing the Commander (CDR), Combined Arms Support Command (CASCOM) with a tool to aid decision making related to mitigating E/I peacetime (programmatic) and wartime risks... not be fielded by Fiscal Year (FY) 10. Based on their estimates, any decisions, especially reductions in manpower, which rely on the existence of the E...
Experimental evaluation of tool run-out in micro milling
NASA Astrophysics Data System (ADS)
Attanasio, Aldo; Ceretti, Elisabetta
2018-05-01
This paper deals with the micro milling cutting process, focusing on tool run-out measurement. In fact, among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research is aimed at developing an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges. The cutting edge phase measurement is based on the force signal analysis. The developed procedure has been tested on data coming from micro milling experimental tests performed on a Ti6Al4V sample. The results showed that the developed procedure can be successfully used for tool run-out estimation.
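The geometric idea, reduced to its simplest form, is sketched below: the run-out offset is inferred from how much the machined slot exceeds the nominal tool diameter, and that offset shifts the phase between the two cutting-edge force peaks. The relations and numbers are illustrative assumptions, not the analytical model of the paper.

```python
import math

# Simplified geometric sketch (not the exact model in the paper): for a two-flute
# micro end mill, a run-out offset displaces the tool axis so that one flute
# sweeps a larger radius; the slot width then exceeds the nominal diameter, and
# the phase angle between the two cutting-edge force peaks deviates from 180
# degrees. All relations below are illustrative assumptions.

def runout_offset_from_slot(tool_diameter, slot_width):
    """Run-out offset estimated from the extra slot width swept by the longer flute."""
    return (slot_width - tool_diameter) / 2.0

def edge_phase_angle(offset, tool_radius, nominal_deg=180.0):
    """Phase between flute engagements, shifted by the offset (small-angle assumption)."""
    return nominal_deg + math.degrees(offset / tool_radius)

d = 0.8          # nominal tool diameter, mm (assumed)
w = 0.806        # measured channel width, mm (assumed)
rho = runout_offset_from_slot(d, w)
print(f"estimated run-out offset : {rho * 1000:.1f} um")
print(f"expected edge phase angle: {edge_phase_angle(rho, d / 2):.2f} deg")
```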
Dong, Ren G.; Welcome, Daniel E.; Peterson, Donald R.; Xu, Xueyan S.; McDowell, Thomas W.; Warren, Christopher; Asaki, Takafumi; Kudernatsch, Simon; Brammer, Antony
2015-01-01
Vibration-reducing (VR) gloves have been increasingly used to help reduce vibration exposure, but it remains unclear how effective these gloves are. The purpose of this study was to estimate tool-specific performances of VR gloves for reducing the vibrations transmitted to the palm of the hand in three orthogonal directions (3-D) in an attempt to assess glove effectiveness and aid in the appropriate selection of these gloves. Four typical VR gloves were considered in this study, two of which can be classified as anti-vibration (AV) gloves according to the current AV glove test standard. The average transmissibility spectrum of each glove in each direction was synthesized based on spectra measured in this study and other spectra collected from reported studies. More than seventy vibration spectra of various tools or machines were considered in the estimations, which were also measured in this study or collected from reported studies. The glove performance assessments were based on the percent reduction of frequency-weighted acceleration as is required in the current standard for assessing the risk of vibration exposures. The estimated tool-specific vibration reductions of the gloves indicate that the VR gloves could slightly reduce (<5%) or marginally amplify (<10%) the vibrations generated from low-frequency (<25 Hz) tools or those vibrating primarily along the axis of the tool handle. With other tools, the VR gloves could reduce palm-transmitted vibrations in the range of 5%–58%, primarily depending on the specific tool and its vibration spectra in the three directions. The two AV gloves were not more effective than the other gloves with some of the tools considered in this study. The implications of the results are discussed. Relevance to industry: Hand-transmitted vibration exposure may cause hand-arm vibration syndrome. Vibration-reducing gloves are considered as an alternative approach to reduce the vibration exposure. This study provides useful information on the effectiveness of the gloves when used with many tools for reducing the vibration transmitted to the palm in three directions. The results can aid in the appropriate selection and use of these gloves. PMID:26726275
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. John J. Moore; Dr. Jianliang Lin
2012-07-31
The main objective of this research program was to design and develop an optimal coating system that extends die life by minimizing premature die failure. In high-pressure aluminum die-casting, the die, core pins and inserts must withstand severe processing conditions. Many of the dies and tools in the industry are being coated to improve wear-resistance and decrease down-time for maintenance. However, thermal fatigue in metal itself can still be a major problem, especially since it often leads to catastrophic failure (i.e. die breakage) as opposed to a wear-based failure (parts begin to go out of tolerance). Tooling costs remain the largest portion of production costs for many of these parts, so the ability to prevent catastrophic failures would be transformative for the manufacturing industry. The technology offers energy savings through reduced energy use in the die casting process from several factors, including increased life of the tools and dies, reuse of the dies and die components, reduction/elimination of lubricants, reduced machine down time, and reduction of Al solder sticking on the die. The use of the optimized die coating system will also reduce environmental wastes and scrap parts. Current (2012) annual energy savings, based on initial dissemination to the casting industry in 2010 and market penetration of 80% by 2020, are estimated at 3.1 trillion BTU/year. The average annual estimate of CO2 reduction per year through 2020 is 0.63 Million Metric Tons of Carbon Equivalent (MM TCE).
Predictive tool for estimating the potential effect of water fluoridation on dental caries.
Foster, G R K; Downer, M C; Lunt, M; Aggarwal, V; Tickle, M
2009-03-01
To provide a tool for public health planners to estimate the potential improvement in dental caries in children that might be expected in a region if its water supply were to be fluoridated. Recent BASCD (British Association for the Study of Community Dentistry) dental epidemiological data for caries in 5- and 11-year-old children in English primary care trusts in fluoridated and non-fluoridated areas were analysed to estimate absolute and relative improvement in dmft/DMFT and caries-free measures observed in England. Where data were sufficient for testing significance this analysis included the effect of different levels of deprivation. A table of observed improvements was produced, together with an example of how that table can be used as a tool for estimating the expected improvement in caries in any specific region of England. Observed absolute improvements and 95% confidence intervals were: for 5-year-olds, reduction in mean dmft 0.56 (0.38, 0.74) for IMD 12, 0.73 (0.60, 0.85) for IMD 20, and 0.94 (0.76, 1.12) for IMD 30, with 12% (9%, 14%) more children free of caries; for 11-year-olds, reduction in mean DMFT 0.12 (0.04, 0.20) for IMD 12, 0.19 (0.13, 0.26) for IMD 20, and 0.29 (0.18, 0.40) for IMD 30, with 8% (5%, 11%) more children free from caries. The BASCD data taken together with a deprivation measure are capable of yielding an age-specific, 'intention to treat' model of water fluoridation that can be used to estimate the potential effect on caries levels of a notional new fluoridation scheme in an English region.
Block, Robert C; Abdolahi, Amir; Niemiec, Christopher P; Rigby, C Scott; Williams, Geoffrey C
2016-12-01
There is a lack of research on the use of electronic tools that guide patients toward reducing their cardiovascular disease risk. We conducted a 9-month clinical trial in which participants who were at low (n = 100) and moderate (n = 23) cardiovascular disease risk, based on the National Cholesterol Education Program III's 10-year risk estimator, were randomized to usual care or to usual care plus use of an Interactive Cholesterol Advisory Tool during the first 8 weeks of the study. In the moderate-risk category, an interaction between treatment condition and Framingham risk estimate on low-density lipoprotein and non-high-density lipoprotein cholesterol was observed, such that participants in the virtual clinician treatment condition had a larger reduction in low-density lipoprotein and non-high-density lipoprotein cholesterol as their Framingham risk estimate increased. Perceptions of the Interactive Cholesterol Advisory Tool were positive. Evidence-based information about cardiovascular disease risk and its management was accessible to participants without major technical challenges. © The Author(s) 2015.
Nutrient mitigation in a temporary river basin.
Tzoraki, Ourania; Nikolaidis, Nikolaos P; Cooper, David; Kassotaki, Elissavet
2014-04-01
We estimate the nutrient budget in a temporary Mediterranean river basin. We use field monitoring and modelling tools to estimate nutrient sources and transfer in both high and low flow conditions. Inverse modelling with the PHREEQC model validated the hypothesis of a losing stream during the dry period. The Soil and Water Assessment Tool (SWAT) model captured the water quality of the basin. The 'total maximum daily load' approach is used to estimate the nutrient flux status by flow class, indicating that almost 60% of the river network fails to meet nitrogen criteria and 50% phosphate criteria. We recommend that existing well-documented remediation measures such as reforestation of the riparian area or composting of food process biosolids should be implemented to achieve load reduction in close conjunction with social needs.
Neurodynamic evaluation of hearing aid features using EEG correlates of listening effort.
Bernarding, Corinna; Strauss, Daniel J; Hannemann, Ronny; Seidler, Harald; Corona-Strauss, Farah I
2017-06-01
In this study, we propose a novel estimate of listening effort using electroencephalographic data. This method is a translation of our past findings, gained from the evoked electroencephalographic activity, to the oscillatory EEG activity. To test this technique, electroencephalographic data from experienced hearing aid users with moderate hearing loss were recorded, wearing hearing aids. The investigated hearing aid settings were: a directional microphone combined with a noise reduction algorithm in a medium and a strong setting, the noise reduction setting turned off, and a setting using omnidirectional microphones without any noise reduction. The results suggest that the electroencephalographic estimate of listening effort seems to be a useful tool to map the exerted effort of the participants. In addition, the results indicate that a directional processing mode can reduce the listening effort in multitalker listening situations.
Park, Youn Shik; Engel, Bernie A; Kim, Jonggun; Theller, Larry; Chaubey, Indrajeet; Merwade, Venkatesh; Lim, Kyoung Jae
2015-03-01
Total Maximum Daily Load (TMDL) is a water quality standard used to regulate the water quality of streams, rivers and lakes. A wide range of approaches is currently used to develop TMDLs for impaired streams and rivers. Flow and load duration curves (FDC and LDC) have been used in many states to evaluate the relationship between flow and pollutant loading along with other models and approaches. A web-based LDC Tool was developed to facilitate development of FDC and LDC as well as to support other hydrologic analyses. In this study, the FDC and LDC tool was enhanced to allow collection of water quality data via the web and to assist in establishing cost-effective Best Management Practice (BMP) implementations. The enhanced web-based tool provides use of water quality data not only from the US Geological Survey but also from the Water Quality Portal for the U.S. via web access. Moreover, the web-based tool identifies required pollutant reductions to meet standard loads and suggests a BMP scenario based on the ability of BMPs to reduce pollutant loads and on BMP establishment and maintenance costs. In the study, flow and water quality data were collected via web access to develop the LDC and to identify the required reduction. The suggested BMP scenario from the web-based tool was evaluated using the EPA Spreadsheet Tool for the Estimation of Pollutant Load model to attain the required pollutant reduction at least cost. Copyright © 2014 Elsevier Ltd. All rights reserved.
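A minimal sketch of the FDC/LDC mechanics is shown below: flows are ranked to get exceedance probabilities, the allowable load curve is flow times the concentration standard, and the gap between observed and allowable loads gives the required reduction by flow class. The flow, concentration and unit-conversion values are assumed example numbers, not data from the cited tool.

```python
import numpy as np

# Minimal sketch of how a flow duration curve (FDC) and load duration curve (LDC)
# are built and used to identify the reduction needed to meet a concentration
# standard. Flow and concentration values are made up; the conversion factor
# assumes flow in cfs, concentration in cfu/100 mL and loads in cfu/day.

flow_cfs = np.array([1200, 800, 450, 300, 150, 90, 60, 35, 20, 10], float)        # daily flows
e_coli   = np.array([900, 700, 500, 420, 380, 260, 200, 150, 130, 110], float)    # cfu/100 mL
standard = 126.0                          # water quality criterion, cfu/100 mL
CF = 2.45e7                               # cfs * (cfu/100 mL) -> cfu/day (approximate)

order = np.argsort(flow_cfs)[::-1]                              # high flow -> low flow
exceedance = (np.arange(len(flow_cfs)) + 1) / (len(flow_cfs) + 1) * 100

allowable = flow_cfs[order] * standard * CF                     # LDC: allowable load
observed  = flow_cfs[order] * e_coli[order] * CF                # observed load at same flows

for p, a, o in zip(exceedance, allowable, observed):
    excess = max(0.0, (o - a) / o) * 100
    print(f"{p:5.1f}% exceedance: required reduction {excess:4.1f}%")
```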
Measuring the Benefits of Clean Air and Water.
ERIC Educational Resources Information Center
Kneese, Allen V.
This book examines the current state of the art regarding benefits assessment, including such tools as bidding games, surveys, property-value studies, wage differentials, risk reduction evaluation, and mortality and morbidity cost estimation. It is based on research, sponsored by the United States Environmental Protection Agency, related to the…
USDA-ARS?s Scientific Manuscript database
Demographic matrix modeling of invasive plant populations can be a powerful tool to identify key life stage transitions for targeted disruption in order to cause population decline. This approach can provide quantitative estimates of reductions in select vital rates needed to reduce population growt...
Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.
2016-01-01
The use of a channelization mechanism on model observers not only makes mimicking human visual behavior possible, but also reduces the amount of image data needed to estimate the model observer parameters. The channelized Hotelling observer (CHO) and channelized scanning linear observer (CSLO) have recently been used to assess CT image quality for detection tasks and combined detection/estimation tasks, respectively. Although the use of channels substantially reduces the amount of data required to compute image quality, the number of scans required for CT imaging is still not practical for routine use. It is our desire to further reduce the number of scans required to make CHO or CSLO an image quality tool for routine and frequent system validations and evaluations. This work explores different data-reduction schemes and designs an approach that requires only a few CT scans. Three different kinds of approaches are included in this study: a conventional CHO/CSLO technique with a large sample size, a conventional CHO/CSLO technique with fewer samples, and an approach that we will show requires fewer samples to mimic conventional performance with a large sample size. The mean value and standard deviation of areas under ROC/EROC curve were estimated using the well-validated shuffle approach. The results indicate that an 80% data reduction can be achieved without loss of accuracy. This substantial data reduction is a step toward a practical tool for routine-task-based QA/QC CT system assessment. PMID:27493982
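For readers who want the CHO mechanics in concrete form, the toy sketch below builds channel outputs, the Hotelling template and an AUC on simulated images. The Gaussian channels and noise model are simplifications chosen for brevity, not the channels or phantom data used in the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Toy channelized Hotelling observer (CHO) on simulated 32x32 images: a faint
# Gaussian signal in noise with a crude correlated component, and a handful of
# Gaussian channels standing in for the Gabor/Laguerre-Gauss channels used in
# practice. All values are illustrative only.

npix, nimg = 32, 400
x = np.arange(npix) - npix / 2
X, Y = np.meshgrid(x, x)
R2 = X ** 2 + Y ** 2

signal = 0.4 * np.exp(-R2 / (2 * 3.0 ** 2))                  # faint known signal

def make_images(with_signal):
    imgs = rng.normal(size=(nimg, npix, npix))
    # crude correlated noise: a random-amplitude smooth background per image
    imgs += 0.5 * rng.normal(size=(nimg, 1, 1)) * np.exp(-R2 / (2 * 8.0 ** 2))
    if with_signal:
        imgs += signal
    return imgs.reshape(nimg, -1)

# channels: radially symmetric Gaussians of increasing width
widths = [1.5, 3.0, 6.0, 12.0]
T = np.stack([np.exp(-R2 / (2 * w ** 2)).ravel() for w in widths], axis=1)

v1 = make_images(True) @ T        # channel outputs, signal present
v0 = make_images(False) @ T       # channel outputs, signal absent

S = 0.5 * (np.cov(v1.T) + np.cov(v0.T))                       # intra-class covariance
delta = v1.mean(axis=0) - v0.mean(axis=0)
w = np.linalg.solve(S, delta)                                  # Hotelling template
d_a = np.sqrt(w @ delta)                                       # CHO detectability
print(f"AUC ~ {norm.cdf(d_a / np.sqrt(2)):.3f}")
```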
Least-cost control of agricultural nutrient contributions to the Gulf of Mexico hypoxic zone.
Rabotyagov, Sergey; Campbell, Todd; Jha, Manoj; Gassman, Philip W; Arnold, Jeffrey; Kurkalova, Lyubov; Secchi, Silvia; Feng, Hongli; Kling, Catherine L
2010-09-01
In 2008, the hypoxic zone in the Gulf of Mexico, measuring 20,720 km², was one of the two largest reported since measurement of the zone began in 1985. The extent of the hypoxic zone is related to nitrogen and phosphorous loadings originating on agricultural fields in the upper Midwest. This study combines the tools of evolutionary computation with a water quality model and cost data to develop a trade-off frontier for the Upper Mississippi River Basin specifying the least cost of achieving nutrient reductions and the location of the agricultural conservation practices needed. The frontier allows policymakers and stakeholders to explicitly see the trade-offs between cost and nutrient reductions. For example, the cost of reducing annual nitrate-N loadings by 30% is estimated to be US$1.4 billion/year, with a concomitant 36% reduction in P, and the cost of reducing annual P loadings by 30% is estimated to be US$370 million/year, with a concomitant 9% reduction in nitrate-N.
Development of a Comprehensive Community Nitrogen Oxide Emissions Reduction Toolkit (CCNERT)
NASA Astrophysics Data System (ADS)
Sung, Yong Hoon
The main objective of this study is to research and develop a simplified tool to estimate energy use in a community and its associated effects on air pollution. This tool is intended to predict the impacts of selected energy conservation options and efficiency programs on emission reduction. It is intended to help local government and their residents understand and manage information collection and the procedures to be used. This study presents a broad overview of the community-wide energy use and NOx emissions inventory process. It also presents various simplified procedures to estimate each sector's energy use. In an effort to better understand community-wide energy use and its associated NOx emissions, the City of College Station, Texas, was selected as a case study community for this research. While one community might successfully reduce the production of NOx emissions by adopting electricity efficiency programs in its buildings, another community might be equally successful by changing the mix of fuel sources used to generate electricity, which is consumed by the community. In yet a third community low NOx automobiles may be mandated. Unfortunately, the impact and cost of one strategy over another changes over time as major sources of pollution are reduced. Therefore, this research proposes to help community planners answer these questions and to assist local communities with their NOx emission reduction plans by developing a Comprehensive Community NOx Emissions Reduction Toolkit (CCNERT). The proposed simplified tool could have a substantial impact on reducing NOx emission by providing decision-makers with a preliminary understanding about the impacts of various energy efficiency programs on emissions reductions. To help decision makers, this study has addressed these issues by providing a general framework for examining how a community's non-renewable energy use leads to NOx emissions, by quantifying each end-user's energy usage and its associated NOx emissions, and by evaluating the environmental benefits of various types of energy saving options.
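The inventory arithmetic underlying such a toolkit is straightforward: sector energy use multiplied by fuel-specific emission factors. The sketch below uses placeholder factors and made-up community totals purely to illustrate the calculation; CCNERT's actual factors and data are not reproduced here.

```python
# Back-of-the-envelope sketch of the community NOx inventory idea: each sector's
# annual energy use is multiplied by an emission factor for the fuel that
# supplies it. The factors below are placeholders, not the factors used in
# CCNERT or any official inventory; real studies would use utility- and
# region-specific values.

EF_NOX = {                       # assumed emission factors
    "electricity_lb_per_MWh":   1.1,
    "natural_gas_lb_per_MMBtu": 0.10,
    "gasoline_lb_per_gal":      0.015,
}

sectors = {                      # assumed community energy use for one year
    "residential": {"electricity_MWh": 220_000, "natural_gas_MMBtu": 450_000},
    "commercial":  {"electricity_MWh": 310_000, "natural_gas_MMBtu": 280_000},
    "transport":   {"gasoline_gal": 28_000_000},
}

def sector_nox_tons(use):
    lbs = (use.get("electricity_MWh", 0) * EF_NOX["electricity_lb_per_MWh"]
           + use.get("natural_gas_MMBtu", 0) * EF_NOX["natural_gas_lb_per_MMBtu"]
           + use.get("gasoline_gal", 0) * EF_NOX["gasoline_lb_per_gal"])
    return lbs / 2000.0

total = 0.0
for name, use in sectors.items():
    tons = sector_nox_tons(use)
    total += tons
    print(f"{name:12s}: {tons:8.1f} tons NOx/yr")
print(f"{'community':12s}: {total:8.1f} tons NOx/yr")

# A 10% electricity-efficiency program then avoids roughly
# 0.10 * electricity_MWh * EF / 2000 tons of NOx per year.
```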
NASA Astrophysics Data System (ADS)
Ranatunga, T.
2016-12-01
Modeling of fate and transport of fecal bacteria in a watershed is generally a processed based approach that considers releases from manure, point sources, and septic systems. Overland transport with water and sediments, infiltration into soils, transport in the vadose zone and groundwater, die-off and growth processes, and in-stream transport are considered as the other major processes in bacteria simulation. This presentation will discuss a simulation of fecal indicator bacteria (E.coli) source loading and in-stream conditions of a non-tidal watershed (Cedar Bayou Watershed) in South Central Texas using two models; Spatially Explicit Load Enrichment Calculation Tool (SELECT) and Soil and Water Assessment Tool (SWAT). Furthermore, it will discuss a probable approach of bacteria source load reduction in order to meet the water quality standards in the streams. The selected watershed is listed as having levels of fecal indicator bacteria that posed a risk for contact recreation and wading by the Texas Commission of Environmental Quality (TCEQ). The SELECT modeling approach was used in estimating the bacteria source loading from land categories. Major bacteria sources considered were, failing septic systems, discharges from wastewater treatment facilities, excreta from livestock (Cattle, Horses, Sheep and Goat), excreta from Wildlife (Feral Hogs, and Deer), Pet waste (mainly from Dogs), and runoff from urban surfaces. The estimated source loads were input to the SWAT model in order to simulate the transport through the land and in-stream conditions. The calibrated SWAT model was then used to estimate the indicator bacteria in-stream concentrations for future years based on H-GAC's regional land use, population and household projections (up to 2040). Based on the in-stream reductions required to meet the water quality standards, the corresponding required source load reductions were estimated.
NASA Astrophysics Data System (ADS)
Ranatunga, T.
2017-12-01
Modeling of fate and transport of fecal bacteria in a watershed is a process-based approach that considers releases from manure, point sources, and septic systems. Overland transport with water and sediments, infiltration into soils, transport in the vadose zone and groundwater, die-off and growth processes, and in-stream transport are considered as the other major processes in bacteria simulation. This presentation will discuss a simulation of fecal indicator bacteria source loading and in-stream conditions of a non-tidal watershed (Cedar Bayou Watershed) in South Central Texas using two models: Spatially Explicit Load Enrichment Calculation Tool (SELECT) and Soil and Water Assessment Tool (SWAT). Furthermore, it will discuss a probable approach to bacteria source load reduction in order to meet the water quality standards in the streams. The selected watershed is listed as having levels of fecal indicator bacteria that posed a risk for contact recreation and wading by the Texas Commission on Environmental Quality (TCEQ). The SELECT modeling approach was used in estimating the bacteria source loading from land categories. Major bacteria sources considered were failing septic systems, discharges from wastewater treatment facilities, excreta from livestock (cattle, horses, sheep and goats), excreta from wildlife (feral hogs and deer), pet waste (mainly from dogs), and runoff from urban surfaces. The estimated source loads from the SELECT model were input to the SWAT model to simulate bacteria transport through the land and in-stream. The calibrated SWAT model was then used to estimate the indicator bacteria in-stream concentrations for future years based on regional land use, population and household forecasts (up to 2040). Based on the reductions required to meet the water quality standards in-stream, the corresponding required source load reductions were estimated.
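A SELECT-style potential load calculation is, at its core, counts of sources times per-source production factors summed by land category. The sketch below illustrates that bookkeeping with placeholder production factors and counts; it does not reproduce the values used for the Cedar Bayou Watershed.

```python
# Sketch of a SELECT-style potential E. coli load calculation: animal counts or
# household numbers multiplied by per-source daily production factors, summed by
# land category. The production factors below are placeholders for illustration,
# not the values used in the cited study.

DAILY_ECOLI = {           # assumed cfu/day per unit source
    "cattle":         1.0e10,
    "feral_hog":      4.0e9,
    "deer":           3.5e8,
    "dog":            2.5e9,
    "failing_septic": 1.0e9,   # per failing system
}

subwatershed = {          # assumed counts in one subwatershed
    "cattle": 1200, "feral_hog": 300, "deer": 450, "dog": 800, "failing_septic": 60,
}

loads = {src: n * DAILY_ECOLI[src] for src, n in subwatershed.items()}
total = sum(loads.values())
for src, load in sorted(loads.items(), key=lambda kv: -kv[1]):
    print(f"{src:14s}: {load:10.2e} cfu/day ({100 * load / total:4.1f}%)")
print(f"{'total':14s}: {total:10.2e} cfu/day")

# If the in-stream standard requires, say, a 40% reduction at the outlet, the
# ranking above indicates which sources to target first with BMPs.
```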
Population genetics of autopolyploids under a mixed mating model and the estimation of selfing rate.
Hardy, Olivier J
2016-01-01
Nowadays, the population genetics analysis of autopolyploid species faces many difficulties due to (i) limited development of population genetics tools under polysomic inheritance, (ii) difficulties to assess allelic dosage when genotyping individuals and (iii) a form of inbreeding resulting from the mechanism of 'double reduction'. Consequently, few data analysis computer programs are applicable to autopolyploids. To contribute bridging this gap, this article first derives theoretical expectations for the inbreeding and identity disequilibrium coefficients under polysomic inheritance in a mixed mating model. Moment estimators of these coefficients are proposed when exact genotypes or just markers phenotypes (i.e. allelic dosage unknown) are available. This led to the development of estimators of the selfing rate based on adult genotypes or phenotypes and applicable to any even-ploidy level. Their statistical performances and robustness were assessed by numerical simulations. Contrary to inbreeding-based estimators, the identity disequilibrium-based estimator using phenotypes is robust (absolute bias generally < 0.05), even in the presence of double reduction, null alleles or biparental inbreeding due to isolation by distance. A fairly good precision of the selfing rate estimates (root mean squared error < 0.1) is already achievable using a sample of 30-50 individuals phenotyped at 10 loci bearing 5-10 alleles each, conditions reachable using microsatellite markers. Diallelic markers (e.g. SNP) can also perform satisfactorily in diploids and tetraploids but more polymorphic markers are necessary for higher ploidy levels. The method is implemented in the software SPAGeDi and should contribute to reduce the lack of population genetics tools applicable to autopolyploids. © 2015 John Wiley & Sons Ltd.
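As a point of orientation, the diploid mixed-mating relation between the equilibrium inbreeding coefficient and the selfing rate, F = s/(2 − s), is sketched below with made-up genotype counts. Hardy's polysomic estimators generalize this idea; the snippet is only the familiar diploid special case, not the method of the paper.

```python
# Minimal diploid illustration (not Hardy's polysomic estimators): under a mixed
# mating model at inbreeding equilibrium, F = s / (2 - s), so the selfing rate
# can be recovered from an estimate of the inbreeding coefficient as
# s = 2F / (1 + F). Genotype counts below are made up; real autopolyploid data
# need the polysomic-inheritance machinery described in the paper (SPAGeDi).

def selfing_rate_from_F(F):
    return 2.0 * F / (1.0 + F)

# toy biallelic locus: counts of AA, Aa, aa genotypes
n_AA, n_Aa, n_aa = 30, 28, 42
n = n_AA + n_Aa + n_aa
p = (2 * n_AA + n_Aa) / (2 * n)          # allele frequency of A
He = 2 * p * (1 - p)                     # expected heterozygosity
Ho = n_Aa / n                            # observed heterozygosity
F = 1.0 - Ho / He                        # inbreeding coefficient estimate

print(f"F ~ {F:.3f}, implied selfing rate s ~ {selfing_rate_from_F(F):.3f}")
```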
Bailey, E A; Dutton, A W; Mattingly, M; Devasia, S; Roemer, R B
1998-01-01
Reduced-order modelling techniques can make important contributions in the control and state estimation of large systems. In hyperthermia, reduced-order modelling can provide a useful tool by which a large thermal model can be reduced to the most significant subset of its full-order modes, making real-time control and estimation possible. Two such reduction methods, one based on modal decomposition and the other on balanced realization, are compared in the context of simulated hyperthermia heat transfer problems. The results show that the modal decomposition reduction method has three significant advantages over that of balanced realization. First, modal decomposition reduced models result in less error, when compared to the full-order model, than balanced realization reduced models of similar order in problems with low or moderate advective heat transfer. Second, because the balanced realization based methods require a priori knowledge of the sensor and actuator placements, the reduced-order model is not robust to changes in sensor or actuator locations, a limitation not present in modal decomposition. Third, the modal decomposition transformation is less demanding computationally. On the other hand, in thermal problems dominated by advective heat transfer, numerical instabilities make modal decomposition based reduction problematic. Modal decomposition methods are therefore recommended for reduction of models in which advection is not dominant and research continues into methods to render balanced realization based reduction more suitable for real-time clinical hyperthermia control and estimation.
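A compact illustration of modal-decomposition reduction is given below: a stable linear system is projected onto its slowest modes and the reduced model is compared with the full one at steady state. The matrices are random stand-ins, not a bioheat model, so the numbers only demonstrate the mechanics.

```python
import numpy as np

# Sketch of modal-decomposition model reduction for a linear thermal model
# x' = A x + B u. The matrices here are small random stand-ins, not a
# hyperthermia model. The slowest modes (eigenvalues closest to zero) are kept,
# since they dominate the long-term thermal response.

rng = np.random.default_rng(2)
n, m, k = 20, 2, 4                       # full order, inputs, retained modes

# build a stable, symmetric "conduction-like" system matrix
M = rng.normal(size=(n, n))
A = -(M @ M.T) / n - 0.1 * np.eye(n)
B = rng.normal(size=(n, m))

eigvals, V = np.linalg.eigh(A)           # symmetric A: real modes
idx = np.argsort(-eigvals)[:k]           # eigenvalues closest to zero = slowest
Vk = V[:, idx]

Ar = Vk.T @ A @ Vk                       # reduced system matrices
Br = Vk.T @ B

# compare steady-state response to a constant input in full and reduced models
u = np.ones(m)
x_full = -np.linalg.solve(A, B @ u)
x_red  = Vk @ (-np.linalg.solve(Ar, Br @ u))
err = np.linalg.norm(x_full - x_red) / np.linalg.norm(x_full)
print(f"relative steady-state error with {k}/{n} modes: {err:.2%}")
```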
In the mid-1990s the Tampa Bay Estuary Program proposed a nutrient reduction strategy focused on improving water clarity to promote seagrass expansion within Tampa Bay. A System Dynamics Model is being developed to evaluate spatially and temporally explicit impacts of nutrient r...
Cost of areal reduction of gulf hypoxia through agricultural practice
USDA-ARS?s Scientific Manuscript database
A major share of the area of hypoxic growth in the Northern Gulf of Mexico has been attributed to nutrient run-off from agricultural fields, but no estimate is available for the cost of reducing Gulf hypoxic area using agricultural conservation practices. We apply the Soil and Water Assessment Tool ...
Dinitz, Laura B.
2008-01-01
With costs of natural disasters skyrocketing and populations increasingly settling in areas vulnerable to natural hazards, society is challenged to better allocate its limited risk-reduction resources. In 2000, Congress passed the Disaster Mitigation Act, amending the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Robert T. Stafford Disaster Relief and Emergency Assistance Act, Pub. L. 93-288, 1988; Federal Emergency Management Agency, 2002, 2008b; Disaster Mitigation Act, 2000), mandating that State, local, and tribal communities prepare natural-hazard mitigation plans to qualify for pre-disaster mitigation grants and post-disaster aid. The Federal Emergency Management Agency (FEMA) was assigned to coordinate and implement hazard-mitigation programs, and it published information about specific mitigation-plan requirements and the mechanisms (through the Hazard Mitigation Grant Program-HMGP) for distributing funds (Federal Emergency Management Agency, 2002). FEMA requires that each community develop a mitigation strategy outlining long-term goals to reduce natural-hazard vulnerability, mitigation objectives and specific actions to reduce the impacts of natural hazards, and an implementation plan for those actions. The implementation plan should explain methods for prioritizing, implementing, and administering the actions, along with a 'cost-benefit review' justifying the prioritization. FEMA, along with the National Institute of Building Sciences (NIBS), supported the development of HAZUS ('Hazards U.S.'), a geospatial natural-hazards loss-estimation tool, to help communities quantify potential losses and to aid in the selection and prioritization of mitigation actions. HAZUS was expanded to a multiple-hazard version, HAZUS-MH, that combines population, building, and natural-hazard science and economic data and models to estimate physical damages, replacement costs, and business interruption for specific natural-hazard scenarios. HAZUS-MH currently performs analyses for earthquakes, floods, and hurricane wind. HAZUS-MH loss estimates, however, do not account for some uncertainties associated with the specific natural-hazard scenarios, such as the likelihood of occurrence within a particular time horizon or the effectiveness of alternative risk-reduction options. Because of the uncertainties involved, it is challenging to make informative decisions about how to cost-effectively reduce risk from natural-hazard events. Risk analysis is one approach that decision-makers can use to evaluate alternative risk-reduction choices when outcomes are unknown. The Land Use Portfolio Model (LUPM), developed by the U.S. Geological Survey (USGS), is a geospatial scenario-based tool that incorporates hazard-event uncertainties to support risk analysis. The LUPM offers an approach to estimate and compare risks and returns from investments in risk-reduction measures. This paper describes and demonstrates a hypothetical application of the LUPM for Ventura County, California, and examines the challenges involved in developing decision tools that provide quantitative methods to estimate losses and analyze risk from natural hazards.
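The decision logic such tools support can be illustrated with a bare-bones expected-loss comparison, shown below. The scenario probability, losses and mitigation costs are invented, and the snippet is not the Land Use Portfolio Model algorithm; it only shows how hazard-event uncertainty enters a cost comparison of risk-reduction options.

```python
# Simplified expected-loss comparison of risk-reduction options, in the spirit of
# scenario-based tools such as the Land Use Portfolio Model (this is not the LUPM
# algorithm; probabilities, losses and costs are invented for illustration).

p_event = 0.02                  # assumed annual probability of the hazard scenario
horizon = 30                    # planning horizon, years

options = {                     # mitigation cost and assumed loss if the event occurs ($M)
    "do nothing":                   {"cost": 0.0,  "loss_if_event": 500.0},
    "retrofit buildings":           {"cost": 40.0, "loss_if_event": 200.0},
    "relocate critical facilities": {"cost": 90.0, "loss_if_event": 120.0},
}

p_horizon = 1.0 - (1.0 - p_event) ** horizon     # chance of at least one event in horizon
for name, o in options.items():
    expected_total = o["cost"] + p_horizon * o["loss_if_event"]
    print(f"{name:30s}: expected cost over {horizon} yr ~ ${expected_total:6.1f}M")
```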
NASA Technical Reports Server (NTRS)
Dean, Edwin B.; Unal, Resit
1991-01-01
Designing for cost is a state of mind. Of course, a lot of technical knowledge is required and the use of appropriate tools will improve the process. Unfortunately, the extensive use of weight based cost estimating relationships has generated a perception in the aerospace community that the primary way to reduce cost is to reduce weight. Wrong! Based upon an approximation of an industry accepted formula, the PRICE H (tm) production-production equation, Dean demonstrated theoretically that the optimal trajectory for cost reduction is predominantly in the direction of system complexity reduction, not system weight reduction. Thus the phrase "keep it simple" is a primary state of mind required for reducing cost throughout the design process.
Small Engine Technology (SET). Task 33: Airframe, Integration, and Community Noise Study
NASA Technical Reports Server (NTRS)
Lieber, Lys S.; Elkins, Daniel; Golub, Robert A. (Technical Monitor)
2002-01-01
Task Order 33 had four primary objectives as follows: (1) Identify and prioritize the airframe noise reduction technologies needed to accomplish the NASA Pillar goals for business and regional aircraft. (2) Develop a model to estimate the effect of jet shear layer refraction and attenuation of internally generated source noise of a turbofan engine on the aircraft system noise. (3) Determine the effect on community noise of source noise changes of a generic turbofan engine operating from sea level to 15,000 feet. (4) Support lateral attenuation experiments conducted by NASA Langley at Wallops Island, VA, by coordinating opportunities for Contractor Aircraft to participate as a noise source during the noise measurements. Noise data and noise prediction tools, including airframe noise codes, from the NASA Advanced Subsonic Technology (AST) program were applied to assess the current status of noise reduction technologies relative to the NASA pillar goals for regional and small business jet aircraft. In addition, the noise prediction tools were applied to evaluate the effectiveness of airframe-related noise reduction concepts developed in the AST program on reducing the aircraft system noise. The AST noise data and acoustic prediction tools used in this study were furnished by NASA.
van Mantgem, Phillip J.; Lalemand, Laura; Keifer, MaryBeth; Kane, Jeffrey M.
2016-01-01
Prescribed fire is a widely used forest management tool, yet the long-term effectiveness of prescribed fire in reducing fuels and fire hazards in many vegetation types is not well documented. We assessed the magnitude and duration of reductions in surface fuels and modeled fire hazards in coniferous forests across nine U.S. national parks in California and the Colorado Plateau. We used observations from a prescribed fire effects monitoring program that feature standard forest and surface fuels inventories conducted pre-fire, immediately following an initial (first-entry) prescribed fire and at varying intervals up to >20 years post-fire. A subset of these plots was subjected to prescribed fire again (second-entry) with continued monitoring. Prescribed fire effects were highly variable among plots, but we found on average first-entry fires resulted in a significant post-fire reduction in surface fuels, with litter and duff fuels not returning to pre-fire levels over the length of our observations. Fine and coarse woody fuels often took a decade or longer to return to pre-fire levels. For second-entry fires we found continued fuels reductions, without strong evidence of fuel loads returning to levels observed immediately prior to second-entry fire. Following both first- and second-entry fire there were increases in estimated canopy base heights, along with reductions in estimated canopy bulk density and modeled flame lengths. We did not find evidence of return to pre-fire conditions during our observation intervals for these measures of fire hazard. Our results show that prescribed fire can be a valuable tool to reduce fire hazards and, depending on forest conditions and the measurement used, reductions in fire hazard can last for decades. Second-entry prescribed fire appeared to reinforce the reduction in fuels and fire hazard from first-entry fires.
Evaluating the potential for secondary mass savings in vehicle lightweighting.
Alonso, Elisa; Lee, Theresa M; Bjelkengren, Catarina; Roth, Richard; Kirchain, Randolph E
2012-03-06
Secondary mass savings are mass reductions that may be achieved in supporting (load-bearing) vehicle parts when the gross vehicle mass (GVM) is reduced. Mass decompounding is the process by which it is possible to identify further reductions when secondary mass savings result in further reduction of GVM. Maximizing secondary mass savings (SMS) is a key tool for maximizing vehicle fuel economy. In today's industry, the most complex parts, which require significant design detail (and cost), are designed first and frozen while the rest of the development process progresses. This paper presents a tool for estimating SMS potential early in the design process and shows how use of the tool to set SMS targets early, before subsystems become locked in, maximizes mass savings. The potential for SMS in current passenger vehicles is estimated with an empirical model using engineering analysis of vehicle components to determine mass-dependency. Identified mass-dependent components are grouped into subsystems, and linear regression is performed on subsystem mass as a function of GVM. A Monte Carlo simulation is performed to determine the mean and 5th and 95th percentiles for the SMS potential per kilogram of primary mass saved. The model projects that the mean theoretical secondary mass savings potential is 0.95 kg for every 1 kg of primary mass saved, with the 5th percentile at 0.77 kg/kg when all components are available for redesign. The model was used to explore an alternative scenario where realistic manufacturing and design limitations were implemented. In this case study, four key subsystems (of 13 total) were locked-in and this reduced the SMS potential to a mean of 0.12 kg/kg with a 5th percentile of 0.1 kg/kg. Clearly, to maximize the impact of mass reduction, targets need to be established before subsystems become locked in.
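The empirical approach described above can be sketched in a few lines: fit subsystem mass against GVM, sum the slopes, apply the decompounding series m/(1 − m), and propagate slope uncertainty with Monte Carlo. The fleet data below are synthetic, so the output illustrates the method rather than reproducing the paper's 0.95 kg/kg estimate.

```python
import numpy as np

rng = np.random.default_rng(3)

# Sketch of the empirical approach: regress each mass-dependent subsystem's mass
# on gross vehicle mass (GVM) across a fleet, sum the slopes, then propagate
# slope uncertainty with Monte Carlo. The "fleet" below is synthetic; the paper
# fits 13 subsystems on real vehicle data.

n_veh = 60
gvm = rng.uniform(1100, 2300, n_veh)                      # kg

# synthetic subsystem masses with roughly linear GVM dependence plus scatter
true_slopes = [0.18, 0.12, 0.08, 0.05]
subsystems = [s * gvm + rng.normal(0, 15, n_veh) for s in true_slopes]

slopes, ses = [], []
for y in subsystems:
    Xd = np.column_stack([np.ones(n_veh), gvm])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    var_b = resid.var(ddof=2) * np.linalg.inv(Xd.T @ Xd)[1, 1]
    slopes.append(beta[1])
    ses.append(np.sqrt(var_b))

# Monte Carlo on the summed slope m; with decompounding, each kg of primary
# saving yields m + m^2 + ... = m / (1 - m) kg of secondary saving.
draws = np.array([rng.normal(s, se, 20000) for s, se in zip(slopes, ses)]).sum(axis=0)
sms = draws / (1.0 - draws)
print(f"mean SMS potential: {sms.mean():.2f} kg/kg")
print(f"5th / 95th pct    : {np.percentile(sms, 5):.2f} / {np.percentile(sms, 95):.2f} kg/kg")
```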
Design of a practical model-observer-based image quality assessment method for CT imaging systems
NASA Astrophysics Data System (ADS)
Tseng, Hsin-Wu; Fan, Jiahua; Cao, Guangzhi; Kupinski, Matthew A.; Sainath, Paavana
2014-03-01
The channelized Hotelling observer (CHO) is a powerful method for quantitative image quality evaluations of CT systems and their image reconstruction algorithms. It has recently been used to validate the dose reduction capability of iterative image-reconstruction algorithms implemented on CT imaging systems. The use of the CHO for routine and frequent system evaluations is desirable both for quality assurance evaluations as well as further system optimizations. The use of channels substantially reduces the amount of data required to achieve accurate estimates of observer performance. However, the number of scans required is still large even with the use of channels. This work explores different data reduction schemes and designs a new approach that requires only a few CT scans of a phantom. For this work, the leave-one-out likelihood (LOOL) method developed by Hoffbeck and Landgrebe is studied as an efficient method of estimating the covariance matrices needed to compute CHO performance. Three different kinds of approaches are included in the study: a conventional CHO estimation technique with a large sample size, a conventional technique with fewer samples, and the new LOOL-based approach with fewer samples. The mean value and standard deviation of the area under the ROC curve (AUC) are estimated by the shuffle method. Both simulation and real data results indicate that an 80% data reduction can be achieved without loss of accuracy. This data reduction makes the proposed approach a practical tool for routine CT system assessment.
HI data reduction for the Arecibo Pisces-Perseus Supercluster Survey
NASA Astrophysics Data System (ADS)
Davis, Cory; Johnson, Cory; Craig, David W.; Haynes, Martha P.; Jones, Michael G.; Koopmann, Rebecca A.; Hallenbeck, Gregory L.; Undergraduate ALFALFA Team
2017-01-01
The Undergraduate ALFALFA team is currently focusing on the analysis of the Pisces-Perseus Supercluster to test current supercluster formation models. The primary goal of our research is to reduce L-band HI data from the Arecibo telescope. We use IDL programs written by our collaborators to reduce the data and find potential sources whose masses can be estimated with the baryonic Tully-Fisher relation, which relates the baryonic mass of a spiral galaxy to its rotational velocity profile. Thus far we have reduced data and estimated HI masses for several galaxies in the supercluster region. We will give examples of data reduction and preliminary results for both the fall 2015 and 2016 observing seasons. We will also describe the data reduction process and the process of learning the associated software, and the use of virtual observatory tools such as the SDSS databases, Aladin, TOPCAT and others. This research was supported by the NSF grant AST-1211005.
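As a rough illustration of the mass-estimation step (not the team's IDL pipeline), a baryonic Tully-Fisher estimate can be sketched as follows; the coefficient and exponent are one literature calibration and serve only as placeholders here.

```python
# Rough illustration: baryonic mass from an HI velocity width via a baryonic
# Tully-Fisher calibration.  The coefficient 47 and exponent 4 (solar masses,
# V in km/s) are placeholders taken from one published calibration.
import numpy as np

def btfr_mass(w50_kms, inclination_deg, a=47.0, exponent=4.0):
    """Baryonic mass from an observed HI line width W50, corrected for inclination."""
    v_rot = w50_kms / (2.0 * np.sin(np.radians(inclination_deg)))
    return a * v_rot ** exponent

print(f"M_b ≈ {btfr_mass(w50_kms=300.0, inclination_deg=60.0):.2e} solar masses")
```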
[Economic evaluation and rationale for human health risk management decisions].
Fokin, S G; Bobkova, T E
2011-01-01
The priority task of human health maintenance and improvement is risk management using new economic concepts based on the assessment of potential and real human risks from exposure to adverse environmental factors and on the estimation of cost-benefit and cost-effectiveness ratios. The application of economic tools to manage human health risks makes it possible to assess various measures both as a whole and in their individual priority areas, to rank different scenarios in terms of their effectiveness, and to estimate costs per unit of risk reduction and benefit increase (damage decrease).
Lo, Yuan-Chieh; Hu, Yuh-Chung; Chang, Pei-Zen
2018-01-01
Thermal characteristic analysis is essential for machine tool spindles because sudden failures may occur due to unexpected thermal issues. This article presents a lumped-parameter Thermal Network Model (TNM) and its parameter estimation scheme, including hardware and software, in order to characterize both the steady-state and transient thermal behavior of machine tool spindles. For the hardware, the authors develop a Bluetooth Temperature Sensor Module (BTSM) that accompanies three types of temperature-sensing probes (magnetic, screw, and probe). Experimental tests show that its specifications reach a precision of ±(0.1 + 0.0029|t|) °C, a resolution of 0.00489 °C, a power consumption of 7 mW, and a size of Ø40 mm × 27 mm. For the software, the heat transfer characteristics of the machine tool spindle as a function of rotating speed are derived from heat transfer theory and empirical formulas. The predictive TNM of spindles was developed by grey-box estimation from experimental results. Even under such complicated operating conditions as various speeds and different initial conditions, the experiments validate that the present modeling methodology provides a robust and reliable tool for temperature prediction, with a normalized-mean-square-error agreement of 99.5%, and that the approach is transferable to other spindles with a similar structure. For realizing edge computing in smart manufacturing, a reduced-order TNM is constructed by a Model Order Reduction (MOR) technique and implemented in a real-time embedded system. PMID:29473877
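A minimal two-node lumped-parameter thermal network, integrated with SciPy, gives the flavor of such a TNM; the heat input, conductances, and capacitances below are placeholder values, not the grey-box parameters identified in the article.

```python
# Minimal two-node lumped-parameter thermal network sketch (bearing node and
# housing node).  All parameter values are placeholders for illustration.
import numpy as np
from scipy.integrate import solve_ivp

C = np.array([900.0, 2500.0])        # J/K, thermal capacitances of the two nodes
G12 = 15.0                           # W/K, conductance between the nodes
G_amb = np.array([3.0, 8.0])         # W/K, conductance of each node to ambient
T_amb = 25.0                         # °C
Q = np.array([120.0, 0.0])           # W, friction heat generated at the bearing node

def dTdt(t, T):
    q12 = G12 * (T[0] - T[1])
    return np.array([
        (Q[0] - q12 - G_amb[0] * (T[0] - T_amb)) / C[0],
        (Q[1] + q12 - G_amb[1] * (T[1] - T_amb)) / C[1],
    ])

sol = solve_ivp(dTdt, (0.0, 3600.0), [T_amb, T_amb], max_step=5.0)
print("steady-state estimate (°C):", sol.y[:, -1])
```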
Nosyk, Bohdan; Zang, Xiao; Min, Jeong E; Krebs, Emanuel; Lima, Viviane D; Milloy, M-J; Shoveller, Jean; Barrios, Rolando; Harrigan, P Richard; Kerr, Thomas; Wood, Evan; Montaner, Julio S G
2017-07-01
Antiretroviral therapy (ART) and harm reduction services have been cited as key contributors to control of HIV epidemics; however, the specific contribution of ART has been questioned due to uncertainty of its true efficacy on HIV transmission through needle sharing. We aimed to isolate the independent effects of harm reduction services (opioid agonist treatment uptake and needle distribution volumes) and ART on HIV transmission via needle sharing in British Columbia, Canada, from 1996 to 2013. We used comprehensive linked individual health administrative and registry data for the population of diagnosed people living with HIV in British Columbia to populate a dynamic, compartmental transmission model to simulate the HIV/AIDS epidemic in British Columbia from 1996 to 2013. We estimated HIV incidence, mortality, and quality-adjusted life-years (QALYs). We also estimated scenarios designed to isolate the independent effects of harm reduction services and ART, assuming 50% (10-90%) efficacy, in reducing HIV incidence through needle sharing, and we investigated structural and parameter uncertainty. We estimate that 3204 (upper bound-lower bound 2402-4589) incident HIV cases were averted between 1996 and 2013 as a result of the combined effect of the expansion of harm reduction services and ART coverage on HIV transmission via needle sharing. In a hypothetical scenario assuming ART had zero effect on transmission through needle sharing, we estimated harm reduction services alone would have accounted for 77% (upper bound-lower bound 62-95%) of averted HIV incidence. In a separate hypothetical scenario where harm reduction services remained at 1996 levels, we estimated ART alone would have accounted for 44% (10-67%) of averted HIV incidence. As a result of high distribution volumes, needle distribution predominantly accounted for incidence reductions attributable to harm reduction but opioid agonist treatment provided substantially greater QALY gains. If the true efficacy of ART in preventing HIV transmission through needle sharing is closer to its efficacy in sexual transmission, ART's effect on incident cases averted could be greater than that of harm reduction. Nonetheless, harm reduction services had a vital role in reducing HIV incidence in British Columbia, and should be viewed as essential and cost-effective tools in combination implementation strategies to reduce the public health and economic burden of HIV/AIDS. BC Ministry of Health; National Institutes of Health (R01DA041747); Genome Canada (142HIV). Copyright © 2017 Elsevier Ltd. All rights reserved.
Ernst, Christian; Szczesny, Andrea; Soderstrom, Naomi; Siegmund, Frank; Schleppers, Alexander
2012-09-01
One of the declared objectives of surgical suite management in Germany is to increase operating room (OR) efficiency by reducing tardiness of first case of the day starts. We analyzed whether the introduction of OR management tools by German hospitals in response to increasing economic pressure was successful in achieving this objective. The OR management tools we considered were the appointment of an OR manager and the development and adoption of a surgical suite governance document (OR charter). We hypothesized that tardiness of first case starts was lower in ORs that had adopted one or both of these tools. Drawing on representative 2005 survey data from 107 German anesthesiology departments, we used a Tobit model to estimate the effect of the introduction of an OR manager or OR charter on tardiness of first case starts, while controlling for hospital size and surgical suite complexity. Adoption reduced tardiness of first case starts by at least 7 minutes (mean reduction 15 minutes, 95% confidence interval (CI): 7-22 minutes, P < 0.001). Reductions in tardiness of first case starts figure prominently among the objectives of surgical suite management in Germany. Our results suggest that the appointment of an OR manager or the adoption of an OR charter supports this objective. For short-term decision making on the day of surgery, this reduction in tardiness may have economic implications, because it reduced overutilized OR time.
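A Tobit (left-censored at zero) regression of tardiness on an adoption indicator can be sketched as follows with synthetic data; the covariates and coefficients are hypothetical, and this is not the study's estimation code.

```python
# Minimal Tobit (left-censored at zero) regression fitted by maximum likelihood,
# with synthetic data standing in for tardiness of first case starts.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 300
has_or_manager = rng.integers(0, 2, n)             # hypothetical adoption indicator
hospital_size = rng.normal(0.0, 1.0, n)            # hypothetical standardized control
X = np.column_stack([np.ones(n), has_or_manager, hospital_size])
latent = X @ np.array([12.0, -8.0, 3.0]) + rng.normal(0, 10.0, n)
y = np.maximum(latent, 0.0)                        # tardiness cannot be negative

def negloglik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    cens = y <= 0
    ll = np.where(
        cens,
        norm.logcdf(-xb / sigma),                  # probability mass piled up at zero
        norm.logpdf((y - xb) / sigma) - log_sigma, # density of the observed tardiness
    )
    return -ll.sum()

res = minimize(negloglik, x0=np.zeros(X.shape[1] + 1), method="BFGS")
print("beta:", res.x[:-1], "sigma:", np.exp(res.x[-1]))
```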
DOE Office of Scientific and Technical Information (OSTI.GOV)
Littleton, Harry; Griffin, John
2011-07-31
This project was a subtask of the Energy Saving Melting and Revert Reduction Technology (Energy SMARRT) Program. Through this project, technologies such as computer modeling, pattern quality control, casting quality control and marketing tools were developed to advance the Lost Foam Casting process application and provide greater energy savings. These technologies have improved (1) production efficiency, (2) mechanical properties, and (3) marketability of lost foam castings. All three reduce energy consumption in the metals casting industry. This report summarizes the work done on all tasks in the period of January 1, 2004 through June 30, 2011. The current (2011) annual energy saving estimate, based on commercial introduction in 2011 and a market penetration of 97% by 2020, is 5.02 trillion BTU/year, and 6.46 trillion BTU/year with 100% market penetration by 2023. Along with these energy savings, reduction of scrap and improvement in casting yield will result in a reduction of the environmental emissions associated with the melting and pouring of the metal which will be saved as a result of this technology. The average annual estimate of CO2 reduction per year through 2020 is 0.03 Million Metric Tons of Carbon Equivalent (MM TCE).
Co-control of urban air pollutants and greenhouse gases in Mexico City.
West, J Jason; Osnaya, Patricia; Laguna, Israel; Martínez, Julia; Fernández, Adrián
2004-07-01
This study addresses the synergies of mitigation measures to control urban air pollutant and greenhouse gas (GHG) emissions, in developing integrated "co-control" strategies for Mexico City. First, existing studies of emissions reduction measures--PROAIRE (the air quality plan for Mexico City) and separate GHG studies--are used to construct a harmonized database of options. Second, linear programming (LP) is developed and applied as a decision-support tool to analyze least-cost strategies for meeting co-control targets for multiple pollutants. We estimate that implementing PROAIRE measures as planned will reduce 3.1% of the 2010 metropolitan CO2 emissions, in addition to substantial local air pollutant reductions. Applying the LP, PROAIRE emissions reductions can be met at a 20% lower cost, using only the PROAIRE measures, by adjusting investments toward the more cost-effective measures; lower net costs are possible by including cost-saving GHG mitigation measures, but with increased investment. When CO2 emission reduction targets are added to PROAIRE targets, the most cost-effective solutions use PROAIRE measures for the majority of local pollutant reductions, and GHG measures for additional CO2 control. Because of synergies, the integrated planning of urban-global co-control can be beneficial, but we estimate that for Mexico City these benefits are often small.
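A toy version of such a least-cost co-control linear program, with made-up measures, costs, and reduction coefficients, can be written with SciPy as follows.

```python
# Toy least-cost "co-control" linear program: choose implementation levels (0..1)
# of candidate measures to meet reduction targets for a local pollutant and CO2
# at minimum total cost.  All numbers are made up for illustration.
import numpy as np
from scipy.optimize import linprog

cost = np.array([4.0, 7.0, 2.0, 9.0])          # M$ per fully implemented measure
nox_red = np.array([10.0, 25.0, 2.0, 5.0])     # kt NOx reduced per measure
co2_red = np.array([50.0, 20.0, 300.0, 400.0]) # kt CO2 reduced per measure
targets = {"nox": 20.0, "co2": 350.0}          # required reductions (kt)

# linprog minimizes c@x subject to A_ub @ x <= b_ub; each target becomes -reduction <= -target.
res = linprog(
    c=cost,
    A_ub=-np.vstack([nox_red, co2_red]),
    b_ub=-np.array([targets["nox"], targets["co2"]]),
    bounds=[(0.0, 1.0)] * len(cost),
    method="highs",
)
print("implementation levels:", np.round(res.x, 2), "total cost (M$):", round(res.fun, 2))
```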
Garcés-Vega, Francisco; Marks, Bradley P
2014-08-01
In the last 20 years, the use of microbial reduction models has expanded significantly, including inactivation (linear and nonlinear), survival, and transfer models. However, a major constraint for model development is the impossibility to directly quantify the number of viable microorganisms below the limit of detection (LOD) for a given study. Different approaches have been used to manage this challenge, including ignoring negative plate counts, using statistical estimations, or applying data transformations. Our objective was to illustrate and quantify the effect of negative plate count data management approaches on parameter estimation for microbial reduction models. Because it is impossible to obtain accurate plate counts below the LOD, we performed simulated experiments to generate synthetic data for both log-linear and Weibull-type microbial reductions. We then applied five different, previously reported data management practices and fit log-linear and Weibull models to the resulting data. The results indicated a significant effect (α = 0.05) of the data management practices on the estimated model parameters and performance indicators. For example, when the negative plate counts were replaced by the LOD for log-linear data sets, the slope of the subsequent log-linear model was, on average, 22% smaller than for the original data, the resulting model underpredicted lethality by up to 2.0 log, and the Weibull model was erroneously selected as the most likely correct model for those data. The results demonstrate that it is important to explicitly report LODs and related data management protocols, which can significantly affect model results, interpretation, and utility. Ultimately, we recommend using only the positive plate counts to estimate model parameters for microbial reduction curves and avoiding any data value substitutions or transformations when managing negative plate counts to yield the most accurate model parameters.
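The core issue can be reproduced in a few lines: fit a log-linear reduction curve using only plate counts above the LOD versus substituting the LOD for negative plates. The data below are synthetic and the parameter values are placeholders; with this setup the LOD substitution visibly flattens the estimated slope.

```python
# Sketch of the data-management issue: fit log10 N = log10 N0 - k*t using
# (a) only counts above the limit of detection vs (b) LOD substitution.
import numpy as np

rng = np.random.default_rng(3)
true_k, log_n0, lod_log = 0.8, 7.0, 1.0           # log10 CFU/mL scale, placeholders
t = np.repeat(np.arange(0, 11), 3).astype(float)
log_n = log_n0 - true_k * t + rng.normal(0, 0.2, t.size)

detected = log_n >= lod_log                        # plates below the LOD come back negative

def fit_slope(tt, yy):
    return np.polyfit(tt, yy, 1)[0]                # slope of log10 N vs time

k_positive_only = -fit_slope(t[detected], log_n[detected])
k_lod_substitute = -fit_slope(t, np.where(detected, log_n, lod_log))

print(f"true k = {true_k}, positives only = {k_positive_only:.2f}, "
      f"LOD substitution = {k_lod_substitute:.2f}")
```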
Fornito, A; Yücel, M; Patti, J; Wood, S J; Pantelis, C
2009-03-01
Voxel-based morphometry (VBM) is a popular tool for mapping neuroanatomical changes in schizophrenia patients. Several recent meta-analyses have identified the brain regions in which patients most consistently show grey matter reductions, although they have not examined whether such changes reflect differences in grey matter concentration (GMC) or grey matter volume (GMV). These measures assess different aspects of grey matter integrity, and may therefore reflect different pathological processes. In this study, we used the Anatomical Likelihood Estimation procedure to analyse significant differences reported in 37 VBM studies of schizophrenia patients, incorporating data from 1646 patients and 1690 controls, and compared the findings of studies using either GMC or GMV to index grey matter differences. Analysis of all studies combined indicated that grey matter reductions in a network of frontal, temporal, thalamic and striatal regions are among the most frequently reported in literature. GMC reductions were generally larger and more consistent than GMV reductions, and were more frequent in the insula, medial prefrontal, medial temporal and striatal regions. GMV reductions were more frequent in dorso-medial frontal cortex, and lateral and orbital frontal areas. These findings support the primacy of frontal, limbic, and subcortical dysfunction in the pathophysiology of schizophrenia, and suggest that the grey matter changes observed with MRI may not necessarily result from a unitary pathological process.
Willis, Henry H; LaTourrette, Tom
2008-04-01
This article presents a framework for using probabilistic terrorism risk modeling in regulatory analysis. We demonstrate the framework with an example application involving a regulation under consideration, the Western Hemisphere Travel Initiative for the Land Environment (WHTI-L). First, we estimate annualized loss from terrorist attacks with the Risk Management Solutions (RMS) Probabilistic Terrorism Model. We then estimate the critical risk reduction, which is the risk-reducing effectiveness of WHTI-L needed for its benefit, in terms of reduced terrorism loss in the United States, to exceed its cost. Our analysis indicates that the critical risk reduction depends strongly not only on uncertainties in the terrorism risk level, but also on uncertainty in the cost of regulation and how casualties are monetized. For a terrorism risk level based on the RMS standard risk estimate, the baseline regulatory cost estimate for WHTI-L, and a range of casualty cost estimates based on the willingness-to-pay approach, our estimate for the expected annualized loss from terrorism ranges from $2.7 billion to $5.2 billion. For this range in annualized loss, the critical risk reduction for WHTI-L ranges from 7% to 13%. Basing results on a lower risk level that results in halving the annualized terrorism loss would double the critical risk reduction (14-26%), and basing the results on a higher risk level that results in a doubling of the annualized terrorism loss would cut the critical risk reduction in half (3.5-6.6%). Ideally, decisions about terrorism security regulations and policies would be informed by true benefit-cost analyses in which the estimated benefits are compared to costs. Such analyses for terrorism security efforts face substantial impediments stemming from the great uncertainty in the terrorist threat and the very low recurrence interval for large attacks. Several approaches can be used to estimate how a terrorism security program or regulation reduces the distribution of risks it is intended to manage. But continued research to develop additional tools and data is necessary to support application of these approaches. These include refinement of models and simulations, engagement of subject matter experts, implementation of program evaluation, and estimation of the costs of casualties from terrorism events.
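The critical-risk-reduction arithmetic can be illustrated directly from the quoted ranges; the annualized regulatory cost below is a placeholder chosen only to be consistent with the figures in the abstract.

```python
# Worked arithmetic for the "critical risk reduction" idea: the fraction of
# annualized terrorism loss a regulation must avert for its benefit to exceed
# its cost (critical reduction = annualized regulatory cost / annualized loss).
annualized_cost_busd = 0.35                      # hypothetical annualized cost, $ billions
for loss_busd in (2.7, 5.2):                     # annualized loss range from the abstract
    critical = annualized_cost_busd / loss_busd
    print(f"loss ${loss_busd}B/yr -> critical risk reduction ≈ {critical:.0%}")
```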
DOE Office of Scientific and Technical Information (OSTI.GOV)
Therkelsen, Peter L.; Rao, Prakash; Aghajanzadeh, Arian
ISO 50001, Energy management systems – Requirements with guidance for use, is an internationally developed standard that provides organizations with a flexible framework for implementing an energy management system (EnMS) with the goal of continual energy performance improvement. The ISO 50001 standard was first published in 2011 and has since seen growth in the number of certificates issued around the world, primarily in the industrial (agriculture, manufacturing, and mining) and service (commercial) sectors. Policy makers in many regions and countries are looking to or are already using ISO 50001 as a basis for energy efficiency, carbon reduction, and other energy performance improvement schemes. The Impact Estimator Tool 50001 (IET 50001 Tool) is a computational model developed to assist researchers and policy makers in determining the potential impact of ISO 50001 implementation in the industrial and service (commercial) sectors for a given region or country. The IET 50001 Tool is based upon a methodology initially developed by the Lawrence Berkeley National Laboratory that has been improved upon and vetted by a group of international researchers. By using a commonly accepted and transparent methodology, users of the IET 50001 Tool can easily and clearly communicate the potential impact of ISO 50001 for a region or country.
CEREBRA: a 3-D visualization tool for brain network extracted from fMRI data.
Nasir, Baris; Yarman Vural, Fatos T
2016-08-01
In this paper, we introduce a new tool, CEREBRA, to visualize the 3D network of the human brain extracted from fMRI data. The tool aims to analyze brain connectivity by representing the selected voxels as the nodes of the network. The edge weights among the voxels are estimated by considering the relationships among the voxel time series. The tool enables researchers to observe the active brain regions and the interactions among them by using graph theoretic measures, such as the edge weight and node degree distributions. CEREBRA provides an interactive interface with basic display and editing options for researchers to study their hypotheses about the connectivity of the brain network. CEREBRA interactively simplifies the network by selecting the active voxels and the most correlated edge weights. Researchers may remove voxels and edges by using local and global thresholds selected on the window. The built-in graph reduction algorithms then eliminate the irrelevant regions, voxels, and edges and display various properties of the network. The toolbox can represent voxel time series and estimated arc weights in space and time using animated heat maps.
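The network-construction step described above can be sketched as follows, with random arrays standing in for voxel time series; the threshold value is arbitrary here and is chosen interactively in the actual tool.

```python
# Sketch: edge weights from pairwise correlations of voxel time series,
# followed by a global threshold that keeps only the strongest edges.
import numpy as np

rng = np.random.default_rng(4)
n_voxels, n_timepoints = 50, 200
ts = rng.normal(size=(n_voxels, n_timepoints))    # stand-in voxel time series

weights = np.corrcoef(ts)                         # edge weight = correlation coefficient
np.fill_diagonal(weights, 0.0)

threshold = 0.2                                   # global threshold (placeholder)
adjacency = np.abs(weights) >= threshold

degree = adjacency.sum(axis=1)                    # node degree distribution of the pruned network
print("edges kept:", int(adjacency.sum() // 2), "max node degree:", int(degree.max()))
```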
Genome-Enabled Molecular Tools for Reductive Dehalogenation
2011-11-01
Genome-Enabled Molecular Tools for Reductive Dehalogenation – A Shift in Paradigm for Bioremediation. Alfred M. Spormann, Stanford University (Applications Technical Session No. 3D, C-77).
Wear-Induced Changes in FSW Tool Pin Profile: Effect of Process Parameters
NASA Astrophysics Data System (ADS)
Sahlot, Pankaj; Jha, Kaushal; Dey, G. K.; Arora, Amit
2018-06-01
Friction stir welding (FSW) of high melting point metallic (HMPM) materials has limited application due to tool wear and relatively short tool life. Tool wear changes the profile of the tool pin and adversely affects weld properties. A quantitative understanding of tool wear and tool pin profile is crucial to developing the process for joining HMPM materials. Here we present a quantitative wear study of the H13 steel tool pin profile for FSW of a CuCrZr alloy. The tool pin profile is analyzed at multiple traverse distances for welding with various tool rotational and traverse speeds. The results indicate that the measured wear depth is small near the pin root and increases significantly towards the tip. Near the pin tip, wear depth increases with increasing tool rotational speed, whereas the change in wear depth near the pin root is minimal. Wear depth also increases with decreasing tool traverse speed. Tool pin wear from the bottom results in pin length reduction, which is greater for higher tool rotational speeds and longer traverse distances. The wear-induced changes in pin profile result in a root defect at long traverse distances. This quantitative understanding would be helpful for estimating tool wear and optimizing process parameters and tool pin shape during FSW of HMPM materials.
Reduction in child mortality in Ethiopia: analysis of data from demographic and health surveys.
Doherty, Tanya; Rohde, Sarah; Besada, Donela; Kerber, Kate; Manda, Samuel; Loveday, Marian; Nsibande, Duduzile; Daviaud, Emmanuelle; Kinney, Mary; Zembe, Wanga; Leon, Natalie; Rudan, Igor; Degefie, Tedbabe; Sanders, David
2016-12-01
To examine changes in under-5 mortality, coverage of child survival interventions and nutritional status of children in Ethiopia between 2000 and 2011. Using the Lives Saved Tool, the impact of changes in coverage of child survival interventions on under-5 lives saved was estimated. Estimates of child mortality were generated using three Ethiopia Demographic and Health Surveys undertaken between 2000 and 2011. Coverage indicators for high impact child health interventions were calculated and the Lives Saved Tool (LiST) was used to estimate child lives saved in 2011. The mortality rate in children younger than 5 years decreased rapidly from 218 child deaths per 1000 live births (95% confidence interval 183 to 252) in the period 1987-1991 to 88 child deaths per 1000 live births in the period 2007-2011 (78 to 98). The prevalence of moderate or severe stunting in children aged 6-35 months also declined significantly. Improvements in the coverage of interventions relevant to child survival in rural areas of Ethiopia between 2000 and 2011 were found for tetanus toxoid, DPT3 and measles vaccination, oral rehydration solution (ORS) and care-seeking for suspected pneumonia. The LiST analysis estimates that there were 60 700 child deaths averted in 2011, primarily attributable to decreases in wasting rates (18%), stunting rates (13%) and water, sanitation and hygiene (WASH) interventions (13%). Improvements in the nutritional status of children and increases in coverage of high impact interventions most notably WASH and ORS have contributed to the decline in under-5 mortality in Ethiopia. These proximal determinants however do not fully explain the mortality reduction which is plausibly also due to the synergistic effect of major child health and nutrition policies and delivery strategies.
Theodoratou, Evropi; Johnson, Sue; Jhass, Arnoupe; Madhi, Shabir A; Clark, Andrew; Boschi-Pinto, Cynthia; Bhopal, Sunil; Rudan, Igor; Campbell, Harry
2010-04-01
With the aim of populating the Lives Saved Tool (LiST) with parameters of effectiveness of existing interventions, we conducted a systematic review of the literature assessing the effect of Haemophilus influenzae type b (Hib) and pneumococcal (PC) conjugate vaccines on incidence, severe morbidity and mortality from childhood pneumonia. We summarized cluster randomized controlled trials (cRCTs) and case-control studies of Hib conjugate vaccines and RCTs of 9- and 11-valent PC conjugate vaccines conducted in developing countries across outcome measures using standard meta-analysis methods. We used a set of standardized rules developed for the purpose of populating the LiST tool with required parameters to promote comparability across reviews of interventions against the major causes of childhood mortality. The estimates could be adjusted further to account for factors such as PC vaccine serotype content, PC serotype distribution and human immunodeficiency virus prevalence but this was not included as part of the LiST model approach. The available evidence from published data points to a summary effect of the Hib conjugate vaccine on clinical pneumonia of 4%, on clinical severe pneumonia of 6% and on radiologically confirmed pneumonia of 18%. Respective effectiveness estimates for PC vaccines (all valent) on clinical pneumonia is 7%, clinical severe pneumonia is 7% and radiologically confirmed pneumonia is 26%. The findings indicated that radiologically confirmed pneumonia, as a severe morbidity proxy for mortality, provided better estimates for the LiST model of effect of interventions on mortality reduction than did other outcomes evaluated. The LiST model will use this to estimate the pneumonia mortality reduction which might be observed when scaling up Hib and PC conjugate vaccination in the context of an overall package of child health interventions.
Model diagnostics in reduced-rank estimation
Chen, Kun
2016-01-01
Reduced-rank methods are very popular in high-dimensional multivariate analysis for conducting simultaneous dimension reduction and model estimation. However, the commonly-used reduced-rank methods are not robust, as the underlying reduced-rank structure can be easily distorted by only a few data outliers. Anomalies are bound to exist in big data problems, and in some applications they themselves could be of the primary interest. While naive residual analysis is often inadequate for outlier detection due to potential masking and swamping, robust reduced-rank estimation approaches could be computationally demanding. Under Stein's unbiased risk estimation framework, we propose a set of tools, including leverage score and generalized information score, to perform model diagnostics and outlier detection in large-scale reduced-rank estimation. The leverage scores give an exact decomposition of the so-called model degrees of freedom to the observation level, which lead to exact decomposition of many commonly-used information criteria; the resulting quantities are thus named information scores of the observations. The proposed information score approach provides a principled way of combining the residuals and leverage scores for anomaly detection. Simulation studies confirm that the proposed diagnostic tools work well. A pattern recognition example with hand-writing digital images and a time series analysis example with monthly U.S. macroeconomic data further demonstrate the efficacy of the proposed approaches. PMID:28003860
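For orientation, a minimal reduced-rank regression fit with naive residual screening looks as follows; this reproduces only the naive residual analysis the paper improves upon, not the proposed leverage or information scores.

```python
# Minimal reduced-rank regression sketch: fit Y ≈ X B with rank(B) = r by
# truncating the SVD of the least-squares fit, then screen observations by
# residual norm.  Data are synthetic, with a few planted outliers.
import numpy as np

rng = np.random.default_rng(5)
n, p, q, r = 200, 10, 8, 2
X = rng.normal(size=(n, p))
B_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, q))   # true low-rank coefficient
Y = X @ B_true + 0.5 * rng.normal(size=(n, q))
Y[:5] += 6.0                                                 # planted outliers in the first rows

B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
U, s, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
fitted_r = (U[:, :r] * s[:r]) @ Vt[:r]                       # rank-r fitted values
resid_norm = np.linalg.norm(Y - fitted_r, axis=1)

print("largest residual norms at rows:", np.argsort(resid_norm)[-5:])
```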
NASA Technical Reports Server (NTRS)
Cowderoy, A. J. C.; Jenkins, John O.; Poulymenakou, A
1992-01-01
The tendency for software development projects to be completed over schedule and over budget has been documented extensively. Additionally, many projects are completed within budgetary and schedule targets only as a result of the customer agreeing to accept reduced functionality. In his classic book, The Mythical Man Month, Fred Brooks exposes the fallacy that effort and schedule are freely interchangeable. All current cost models are produced on the assumption that there is very limited scope for schedule compression unless there is a corresponding reduction in delivered functionality. The Metrication and Resources Modeling Aid (MERMAID) project, partially financed by the Commission of the European Communities (CEC) as Project 2046, began in October 1988, and its goals were as follows: (1) to improve understanding of the relationships between software development productivity and product and process metrics; (2) to facilitate widespread technology transfer from the Consortium to the European Software Industry; and (3) to facilitate widespread uptake of cost estimation techniques by the provision of prototype cost estimation tools. MERMAID developed a family of methods for cost estimation, many of which have had tools implemented in prototypes. These prototypes are best considered as toolkits or workbenches.
van der Waal, Daniëlle; Broeders, Mireille J M; Verbeek, André L M; Duffy, Stephen W; Moss, Sue M
2015-07-01
Ongoing breast cancer screening programs can only be evaluated using observational study designs. Most studies have observed a reduction in breast cancer mortality, but design differences appear to have resulted in different estimates. Direct comparison of case-control and trial analyses gives more insight into this variation. Here, we performed case-control analyses within the randomized UK Age Trial. The Age Trial assessed the effect of screening on breast cancer mortality in women ages 40-49 years. In our approach, case subjects were defined as breast cancer deaths between trial entry (1991-1997) and 2004. Women were ages 39-41 years at entry. For every case subject, five control subjects were selected. All case subjects were included in analyses of screening invitation (356 case subjects, 1,780 controls), whereas analyses of attendance were restricted to women invited to screening (105 case subjects, 525 age-matched controls). Odds ratios (OR) were estimated with conditional logistic regression. We used and compared two methods to correct for self-selection bias. Screening invitation resulted in a breast cancer mortality reduction of 17% (95% confidence interval [CI]: -36%, +6%), similar to trial results. Different exposure definitions and self-selection adjustments influenced the observed breast cancer mortality reduction. Depending on the method, "ever screened" appeared to be associated with a small reduction (OR: 0.86, 95% CI: 0.40, 1.89) or no reduction (OR: 1.02, 95% CI: 0.48, 2.14) using the two methods of correction. Recent attendance resulted in an adjusted mortality reduction of 36% (95% CI: -69%, +31%) or 45% (95% CI: -71%, +5%). Observational studies, and particularly case-control studies, are an important monitoring tool for breast cancer screening programs. The focus should be on diminishing bias in observational studies and gaining a better understanding of the influence of study design on estimates of mortality reduction.
Rebolledo-Leiva, Ricardo; Angulo-Meza, Lidia; Iriarte, Alfredo; González-Araya, Marcela C
2017-09-01
Operations management tools are critical for evaluating and implementing action towards low-carbon production. Currently, sustainable production implies both efficient resource use and the obligation to meet targets for reducing greenhouse gas (GHG) emissions. The carbon footprint (CF) tool allows estimation of the overall amount of GHG emissions associated with a product or activity throughout its life cycle. In this paper, we propose a four-step method for the joint use of CF assessment and Data Envelopment Analysis (DEA). Following the eco-efficiency definition, which is the delivery of goods using fewer resources and with decreasing environmental impact, we use an output-oriented DEA model to maximize production and reduce CF, taking the economic and ecological perspectives into account simultaneously. In another step, we establish targets for the contributing CF factors in order to achieve CF reduction. The proposed method was applied to assess the eco-efficiency of five organic blueberry orchards throughout three growing seasons. The results show that this method is a practical tool for determining eco-efficiency and reducing GHG emissions. Copyright © 2017 Elsevier B.V. All rights reserved.
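A compact output-oriented DEA sketch for such an assessment, solved with SciPy on made-up orchard data, is shown below; treating the carbon footprint as an input to be economized is one common simplification and not necessarily the exact formulation used in the paper.

```python
# Output-oriented CCR DEA sketch: for each orchard, maximize the output
# expansion factor phi subject to the envelopment constraints.
import numpy as np
from scipy.optimize import linprog

# rows = orchards; inputs: [labour, fertilizer, carbon footprint]; output: [yield]
X = np.array([[5.0, 2.0, 3.0],
              [6.0, 2.5, 2.8],
              [4.5, 1.8, 3.5],
              [7.0, 3.0, 4.0],
              [5.5, 2.2, 2.5]])
Y = np.array([[10.0], [12.0], [9.0], [11.0], [13.0]])
n = X.shape[0]

for o in range(n):
    # Variables: [phi, lambda_1..lambda_n]; maximize phi (minimize -phi).
    c = np.r_[-1.0, np.zeros(n)]
    A_in = np.hstack([np.zeros((X.shape[1], 1)), X.T])        # sum_j lambda_j x_ij <= x_io
    A_out = np.hstack([Y[o].reshape(-1, 1), -Y.T])            # phi*y_ro - sum_j lambda_j y_rj <= 0
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[X[o], np.zeros(Y.shape[1])],
                  bounds=[(0, None)] * (n + 1), method="highs")
    print(f"orchard {o}: phi = {res.x[0]:.2f}  (efficient if phi ≈ 1)")
```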
2011-01-01
Background There is a growing body of evidence that integrated packages of community-based interventions, a form of programming often implemented by NGOs, can have substantial child mortality impact. More countries may be able to meet Millennium Development Goal (MDG) 4 targets by leveraging such programming. Analysis of the mortality effect of this type of programming is hampered by the cost and complexity of direct mortality measurement. The Lives Saved Tool (LiST) produces an estimate of mortality reduction by modelling the mortality effect of changes in population coverage of individual child health interventions. However, few studies to date have compared the LiST estimates of mortality reduction with those produced by direct measurement. Methods Using results of a recent review of evidence for community-based child health programming, a search was conducted for NGO child health projects implementing community-based interventions that had independently verified child mortality reduction estimates, as well as population coverage data for modelling in LiST. One child survival project fit inclusion criteria. Subsequent searches of the USAID Development Experience Clearinghouse and Child Survival Grants databases and interviews of staff from NGOs identified no additional projects. Eight coverage indicators, covering all the project’s technical interventions were modelled in LiST, along with indicator values for most other non-project interventions in LiST, mainly from DHS data from 1997 and 2003. Results The project studied was implemented by World Relief from 1999 to 2003 in Gaza Province, Mozambique. An independent evaluation collecting pregnancy history data estimated that under-five mortality declined 37% and infant mortality 48%. Using project-collected coverage data, LiST produced estimates of 39% and 34% decline, respectively. Conclusions LiST gives reasonably accurate estimates of infant and child mortality decline in an area where a package of community-based interventions was implemented. This and other validation exercises support use of LiST as an aid for program planning to tailor packages of community-based interventions to the epidemiological context and for project evaluation. Such targeted planning and assessments will be useful to accelerate progress in reaching MDG4 targets. PMID:21501454
Woodcock, James; Givoni, Moshe; Morgan, Andrei Scott
2013-01-01
Background Achieving health benefits while reducing greenhouse gas emissions from transport offers a potential policy win-win; the magnitude of potential benefits, however, is likely to vary. This study uses an Integrated Transport and Health Impact Modelling tool (ITHIM) to evaluate the health and environmental impacts of high walking and cycling transport scenarios for English and Welsh urban areas outside London. Methods Three scenarios with increased walking and cycling and lower car use were generated based upon the Visions 2030 Walking and Cycling project. Changes to carbon dioxide emissions were estimated by environmental modelling. Health impact assessment modelling was used to estimate changes in Disability Adjusted Life Years (DALYs) resulting from changes in exposure to air pollution, road traffic injury risk, and physical activity. We compare the findings of the model with results generated using the World Health Organization's Health Economic Assessment of Transport (HEAT) tools. Results This study found considerable reductions in disease burden under all three scenarios, with the largest health benefits attributed to reductions in ischemic heart disease. The pathways that produced the largest benefits were, in order, physical activity, road traffic injuries, and air pollution. The choice of dose response relationship for physical activity had a large impact on the size of the benefits. Modelling the impact on all-cause mortality rather than through individual diseases suggested larger benefits. Using the best available evidence we found fewer road traffic injuries for all scenarios compared with baseline but alternative assumptions suggested potential increases. Conclusions Methods to estimate the health impacts from transport related physical activity and injury risk are in their infancy; this study has demonstrated an integration of transport and health impact modelling approaches. The findings add to the case for a move from car transport to walking and cycling, and have implications for empirical and modelling research. PMID:23326315
NASA Astrophysics Data System (ADS)
Graham, Thomas; Wheeler, Raymond
2016-06-01
The objective of this study was to evaluate root restriction as a tool to increase volume utilization efficiency in spaceflight crop production systems. Bell pepper plants (Capsicum annuum cv. California Wonder) were grown under restricted rooting volume conditions in controlled environment chambers. The rooting volume was restricted to 500 ml and 60 ml in a preliminary trial, and 1500 ml (large), 500 ml (medium), and 250 ml (small) for a full fruiting trial. To reduce the possible confounding effects of water and nutrient restrictions, care was taken to ensure an even and consistent soil moisture throughout the study, with plants being watered/fertilized several times daily with a low concentration soluble fertilizer solution. Root restriction resulted in a general reduction in biomass production, height, leaf area, and transpiration rate; however, the fruit production was not significantly reduced in the root restricted plants under the employed environmental and horticultural conditions. There was a 21% reduction in total height and a 23% reduction in overall crown diameter between the large and small pot size in the fruiting study. Data from the fruiting trial were used to estimate potential volume utilization efficiency improvements for edible biomass in a fixed production volume. For fixed lighting and rooting hardware situations, the majority of improvement from root restriction was in the reduction of canopy area per plant, while height reductions could also improve volume utilization efficiency in high stacked or vertical agricultural systems.
Aeroshell Design Techniques for Aerocapture Entry Vehicles
NASA Technical Reports Server (NTRS)
Dyke, R. Eric; Hrinda, Glenn A.
2004-01-01
A major goal of NASA's In-Space Propulsion Program is to shorten trip times for scientific planetary missions. To meet this challenge, arrival speeds will increase, requiring significant braking for orbit insertion, and thus increased deceleration propellant mass that may exceed launch lift capabilities. A technology called aerocapture has been developed to expand the mission potential of exploratory probes destined for planets with suitable atmospheres. Aerocapture inserts a probe into planetary orbit via a single pass through the atmosphere, using the probe's aeroshell drag to reduce velocity. The benefit of an aerocapture maneuver is a large reduction in propellant mass that may result in smaller, less costly missions and reduced mission cruise times. The methodology used to design rigid aerocapture aeroshells will be presented with an emphasis on a new systems tool under development. Current methods for fast, efficient evaluations of structural systems for exploratory vehicles to planets and moons within our solar system have been under development within NASA with limited success. Many of the systems tools attempted have applied structural mass estimation techniques based on historical data and curve fitting, which are difficult and cumbersome to apply to new vehicle concepts and missions. The resulting vehicle aeroshell mass may be incorrectly estimated or carry high margins to account for uncertainty. This new tool will reduce the guesswork previously found in conceptual aeroshell mass estimations.
CERES: A Set of Automated Routines for Echelle Spectra
NASA Astrophysics Data System (ADS)
Brahm, Rafael; Jordán, Andrés; Espinoza, Néstor
2017-03-01
We present the Collection of Elemental Routines for Echelle Spectra (CERES). These routines were developed for the construction of automated pipelines for the reduction, extraction, and analysis of spectra acquired with different instruments, allowing homogeneous and standardized results to be obtained. This modular code includes tools for handling the different steps of the processing: CCD image reductions; identification and tracing of the echelle orders; optimal and rectangular extraction; computation of the wavelength solution; estimation of radial velocities; and rough and fast estimation of the atmospheric parameters. Currently, CERES has been used to develop automated pipelines for 13 different spectrographs, namely CORALIE, FEROS, HARPS, ESPaDOnS, FIES, PUCHEROS, FIDEOS, CAFE, DuPont/Echelle, Magellan/Mike, Keck/HIRES, Magellan/PFS, and APO/ARCES, but the routines can easily be used to deal with data coming from other spectrographs. We show the high radial velocity precision that CERES achieves for some of these instruments, and we briefly summarize some results that have already been obtained using the CERES pipelines.
Leach, A W; Mumford, J D
2008-01-01
The Pesticide Environmental Accounting (PEA) tool provides a monetary estimate of environmental and health impacts per hectare-application for any pesticide. The model combines the Environmental Impact Quotient method and a methodology for absolute estimates of external pesticide costs in the UK, USA, and Germany. For many countries, resources are not available for intensive assessments of external pesticide costs. The model therefore converts external costs of a pesticide in the UK, USA, and Germany to Mediterranean countries. Economic and policy applications include estimating impacts of pesticide reduction policies or benefits from technologies replacing pesticides, such as the sterile insect technique. The system integrates disparate data and approaches into a single logical method. The assumptions in the system provide transparency and consistency but at the cost of some specificity and precision, a reasonable trade-off for a method that provides both comparative estimates of pesticide impacts and area-based assessments of absolute impacts.
Cost-effectiveness in fall prevention for older women.
Hektoen, Liv F; Aas, Eline; Lurås, Hilde
2009-08-01
The aim of this study was to estimate the cost-effectiveness of implementing an exercise-based fall prevention programme for home-dwelling women aged ≥ 80 years in Norway. The impact of the home-based individual exercise programme on the number of falls is based on a New Zealand study. On the basis of the cost estimates and the estimated reduction in the number of falls obtained with the chosen programme, we calculated the incremental costs and the incremental effect of the exercise programme as compared with no prevention. The calculation of the average healthcare cost of falling was based on assumptions regarding the distribution of fall injuries reported in the literature, four constructed representative case histories, assumptions regarding healthcare provision associated with the treatment of the specified cases, and estimated unit costs from Norwegian cost data. We calculated the average healthcare costs per fall for the first year. We found that the reduction in healthcare costs per individual for treating fall-related injuries was 1.85 times higher than the cost of implementing a fall prevention programme. The reduction in healthcare costs more than offset the cost of the prevention programme for women aged ≥ 80 years living at home, which indicates that health authorities should increase their focus on prevention. The main intention of this article is to present the costs connected to falls among the elderly in a transparent way and to visualize the whole cost picture. Cost-effectiveness analysis is a health policy tool that makes politicians and other makers of health policy conscious of this complexity.
Whole farm quantification of GHG emissions within smallholder farms in developing countries
NASA Astrophysics Data System (ADS)
Seebauer, Matthias
2014-03-01
The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas emissions and emission removals from the land-use sector. In order to evaluate how well existing GHG quantification tools comprehensively quantify GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After conducting a cluster analysis to identify different farm typologies, GHG quantification was exercised using the VCS SALM methodology complemented with IPCC livestock emission factors and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation in both the magnitude of the estimated GHG emissions per ha between different smallholder farm typologies and the emissions estimated by applying two different accounting tools. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals, and the mitigation benefits range between 4 and 6.5 tCO2 ha-1 yr-1, with significantly different mitigation benefits depending on the typologies of the crop-livestock systems, their different agricultural practices, and the adoption rates of improved practices. However, the inherent uncertainty related to the emission factors applied by accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty related to activity data, the assessment confirms the high variability within different farm types as well as between different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms.
ISRU System Model Tool: From Excavation to Oxygen Production
NASA Technical Reports Server (NTRS)
Santiago-Maldonado, Edgardo; Linne, Diane L.
2007-01-01
In the late 80's, conceptual designs for an in situ oxygen production plant were documented in a study by Eagle Engineering [1]. In the "Summary of Findings" of this study, it is clearly pointed out that: "reported process mass and power estimates lack a consistent basis to allow comparison." The study goes on to say: "A study to produce a set of process mass, power, and volume requirements on a consistent basis is recommended." Today, approximately twenty years later, as humans plan to return to the moon and venture beyond, the need for flexible up-to-date models of the oxygen extraction production process has become even more clear. Multiple processes for the production of oxygen from lunar regolith are being investigated by NASA, academia, and industry. Three processes that have shown technical merit are molten regolith electrolysis, hydrogen reduction, and carbothermal reduction. These processes have been selected by NASA as the basis for the development of the ISRU System Model Tool (ISMT). In working to develop up-to-date system models for these processes NASA hopes to accomplish the following: (1) help in the evaluation process to select the most cost-effective and efficient process for further prototype development, (2) identify key parameters, (3) optimize the excavation and oxygen production processes, and (4) provide estimates on energy and power requirements, mass and volume of the system, oxygen production rate, mass of regolith required, mass of consumables, and other important parameters. Also, as confidence and high fidelity is achieved with each component's model, new techniques and processes can be introduced and analyzed at a fraction of the cost of traditional hardware development and test approaches. A first generation ISRU System Model Tool has been used to provide inputs to the Lunar Architecture Team studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Fifield, Leonard S.; Gandhi, Umesh N.
This project proposed to integrate, optimize and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk Simulation Moldflow Insight (ASMI) package for injection-molded long-carbon-fiber thermoplastic composites into a cohesive prediction capability. The current effort focused on rendering the developed models more robust and efficient for automotive industry part design to enable weight savings and cost reduction. The project goal has been achieved by optimizing the developed models, improving and integrating their implementations in ASMI, and validating them for a complex 3D LCF thermoplastic automotive part (Figure 1). Both PP and PA66 were used as resin matrices. After validating ASMI predictions for fiber orientation and fiber length for this complex part against the corresponding measured data, in collaboration with Toyota and Magna, PNNL developed a method using the predictive engineering tool to assess LCF/PA66 complex part design in terms of stiffness performance. Structural three-point bending analyses of the complex part and similar parts in steel were then performed for this purpose, and the team then demonstrated the use of stiffness-based complex part design assessment to evaluate weight savings relative to the body system target (≥ 35%) set in Table 2 of DE-FOA-0000648 (AOI #1). In addition, starting from the part-to-part analysis, the PE tools enabled an estimated weight reduction for the vehicle body system using 50 wt% LCF/PA66 parts relative to the current steel system. Also, from this analysis an estimate of the manufacturing cost, including the material cost for making the equivalent part in steel, has been determined and compared to the costs for making the LCF/PA66 part to determine the cost per “saved” pound.
New Tools for Managing Agricultural P
NASA Astrophysics Data System (ADS)
Nieber, J. L.; Baker, L. A.; Peterson, H. M.; Ulrich, J.
2014-12-01
Best management practices (BMPs) generally focus on retaining nutrients (especially P) after they enter the watershed. This approach is expensive, unsustainable, and has not led to reductions of P pollution at large scales (e.g., Mississippi River). Although source reduction, which results in reducing inputs of nutrients to a watershed, has long been cited as a preferred approach, we have not had tools to guide source reduction efforts at the watershed level. To augment conventional TMDL tools, we developed an "actionable" watershed P balance approach, based largely on watershed-specific information, yet simple enough to be utilized as a practical tool. Interviews with farmers were used to obtain detailed farm management data, data from livestock permits were adjusted based on site visits, stream P fluxes were calculated from 3 years of monitoring data, and expert knowledge was used to model P fluxes through animal operations. The overall P use efficiency, Puse, was calculated as the sum of deliberate exports (P in animals, milk, eggs, and crops) divided by deliberate inputs (P in fertilizer, feed, and nursery animals), multiplied by 100. The crop P use efficiency was 1.7, meaning that more P was exported as products than was deliberately imported; we estimate that this mining would have resulted in a loss of 6 mg P/kg across the watershed. Despite the negative P balance, the equivalent of 5% of watershed input was lost via stream export. Tile drainage, the presence of buffer strips, and relatively flat topography result in dominance of P loads by ortho-P (66%) and low particulate P. This, together with geochemical analysis (ongoing), suggests that biological processes may be at least as important as sediment transport in controlling P loads. We have developed a P balance calculator tool to enable watershed management organizations to develop watershed P balances and identify opportunities for improving the efficiency of P utilization.
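A minimal sketch of the Puse calculation described above; the flux values are placeholders, not the study's watershed data.

```python
# Sketch of the watershed P-use-efficiency calculation described above.
# Flux values are placeholders (kg P/yr); the formula follows the abstract.

deliberate_exports = {"animals": 12_000, "milk": 8_000, "eggs": 500, "crops": 30_000}
deliberate_imports = {"fertilizer": 20_000, "feed": 9_000, "nursery_animals": 1_000}

p_use_efficiency = 100 * sum(deliberate_exports.values()) / sum(deliberate_imports.values())
print(f"watershed P use efficiency: {p_use_efficiency:.0f}%")  # >100% implies soil P mining
```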
Rahne, T; Buthut, F; Plößl, S; Plontke, S K
2016-03-01
Selecting subjects for clinical trials on hearing loss therapies relies on the patient meeting the audiological inclusion criteria. In studies on the treatment of idiopathic sudden sensorineural hearing loss, the patient's acute audiogram is usually compared with a previous audiogram, the audiogram of the non-affected ear, or a normal audiogram according to an ISO standard. Generally, many more patients are screened than actually fulfill the particular inclusion criteria. The inclusion criteria often require a calculation of pure-tone averages, selection of the most affected frequencies, and calculation of hearing loss differences. A software tool was developed to simplify and accelerate this inclusion procedure for investigators to estimate the possible recruitment rate during the planning phase of a clinical trial and during the actual study. This tool is Microsoft Excel-based and easy to modify to meet the particular inclusion criteria of a specific clinical trial. The tool was retrospectively evaluated on 100 patients with acute hearing loss comparing the times for classifying automatically and manually. The study sample comprised 100 patients with idiopathic sudden sensorineural hearing loss. The age- and sex-related normative audiogram was calculated automatically by the tool and the hearing impairment was graded. The estimated recruitment rate of our sample was quickly calculated. Information about meeting the inclusion criteria was provided instantaneously. A significant reduction of 30 % in the time required for classifying (30 s per patient) was observed.
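As an illustration of the kind of calculation such a tool automates, the sketch below computes a four-frequency pure-tone average and applies an example asymmetry criterion; the frequencies, threshold values and criterion are hypothetical, not the tool's actual rules.

```python
# Hypothetical inclusion check: the pure-tone average (PTA) of the affected ear must
# exceed the contralateral ear by a stated margin. All values are examples only.

def pta(audiogram, freqs=(500, 1000, 2000, 4000)):
    """Mean hearing level (dB HL) over the frequencies used for the pure-tone average."""
    return sum(audiogram[f] for f in freqs) / len(freqs)

affected = {500: 55, 1000: 60, 2000: 65, 4000: 70}        # dB HL per frequency
contralateral = {500: 10, 1000: 15, 2000: 15, 4000: 20}

meets_criteria = pta(affected) - pta(contralateral) >= 30  # example: >= 30 dB asymmetry
print(f"PTA affected ear: {pta(affected):.1f} dB HL, included: {meets_criteria}")
```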
BioReD: Biomarkers and Tools for Reductive Dechlorination Site Assessment, Monitoring and Management
2013-11-01
Whitfield, Geoffrey P; Meehan, Leslie A; Maizlish, Neil; Wendel, Arthur M
2016-01-01
The Integrated Transport and Health Impact Model (ITHIM) is a comprehensive tool that estimates the hypothetical health effects of transportation mode shifts through changes to physical activity, air pollution, and injuries. The purpose of this paper is to describe the implementation of ITHIM in greater Nashville, Tennessee (USA), describe important lessons learned, and serve as an implementation guide for other practitioners and researchers interested in running ITHIM. As might be expected in other metropolitan areas in the US, not all the required calibration data was available locally. We utilized data from local, state, and federal sources to fulfill the 14 ITHIM calibration items, which include disease burdens, travel habits, physical activity participation, air pollution levels, and traffic injuries and fatalities. Three scenarios were developed that modeled stepwise increases in walking and bicycling, and one that modeled reductions in car travel. Cost savings estimates were calculated by scaling national-level, disease-specific direct treatment costs and indirect lost productivity costs to the greater Nashville population of approximately 1.5 million. Implementation required approximately one year of intermittent, part-time work. Across the range of scenarios, results suggested that 24 to 123 deaths per year could be averted in the region through a 1%–5% reduction in the burden of several chronic diseases. This translated into $10–$63 million in estimated direct and indirect cost savings per year. Implementing ITHIM in greater Nashville has provided local decision makers with important information on the potential health effects of transportation choices. Other jurisdictions interested in ITHIM might find the Nashville example as a useful guide to streamline the effort required to calibrate and run the model. PMID:27595067
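The population scaling of national cost estimates described above can be sketched as follows; the national disease costs and the burden reduction are illustrative placeholders, not the Nashville inputs.

```python
# Sketch of scaling national disease-cost estimates to a metro population, as described
# for the Nashville ITHIM implementation. All figures are illustrative.

US_POPULATION = 320_000_000
NASHVILLE_POPULATION = 1_500_000

national_costs = {                 # hypothetical annual national costs (direct + indirect), USD
    "ischemic_heart_disease": 200e9,
    "stroke": 70e9,
    "diabetes": 245e9,
}

scale = NASHVILLE_POPULATION / US_POPULATION
local_costs = {disease: cost * scale for disease, cost in national_costs.items()}

burden_reduction = 0.03            # e.g., 3% reduction in disease burden from a mode-shift scenario
savings = sum(local_costs.values()) * burden_reduction
print(f"estimated annual savings: ${savings / 1e6:.1f} million")
```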
Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof
2009-04-01
Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
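A minimal Monte Carlo sketch of how an uncertain failure rate and downtime can be propagated into a CML distribution; the distributions and figures are assumed for illustration, not the studied system's data.

```python
# Monte Carlo sketch: propagate an uncertain failure rate and downtime into a
# distribution of Customer Minutes Lost (CML). Distributions and figures are assumed.

import numpy as np

rng = np.random.default_rng(1)
n = 100_000
total_customers = 500_000
customers_affected = 50_000                                              # customers hit per event

failures_per_year = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n)   # uncertain rate
downtime_minutes = rng.gamma(shape=2.0, scale=120.0, size=n)             # uncertain duration

cml = failures_per_year * downtime_minutes * customers_affected / total_customers
print(f"mean CML: {cml.mean():.1f} min/yr, 95th percentile: {np.percentile(cml, 95):.1f} min/yr")
```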
Estimating parameters from rotating ring disc electrode measurements
Santhanagopalan, Shriram; White, Ralph E.
2017-10-21
Rotating ring disc electrode (RRDE) experiments are a classic tool for investigating the kinetics of electrochemical reactions. Several standardized methods exist for extracting transport parameters and reaction rate constants from RRDE measurements. In this work, we compare some approximate solutions to the convective diffusion problem used popularly in the literature to a rigorous numerical solution of the Nernst-Planck equations coupled to the three-dimensional flow problem. In light of these computational advancements, we explore design aspects of the RRDE that will help improve the sensitivity of our parameter estimation procedure to experimental data. We use the oxygen reduction in acidic media, involving three charge transfer reactions and a chemical reaction, as an example, and identify ways to isolate reaction currents for the individual processes in order to accurately estimate the exchange current densities.
Reaching the healthy people goals for reducing childhood obesity: closing the energy gap.
Wang, Y Claire; Orleans, C Tracy; Gortmaker, Steven L
2012-05-01
The federal government has set measurable goals for reducing childhood obesity to 5% by 2010 (Healthy People 2010), and 10% lower than 2005-2008 levels by 2020 (Healthy People 2020). However, population-level estimates of the changes in daily energy balance needed to reach these goals are lacking. To estimate needed per capita reductions in youths' daily "energy gap" (calories consumed over calories expended) to achieve Healthy People goals by 2020. Analyses were conducted in 2010 to fit multivariate models using National Health and Nutrition Examination Surveys 1971-2008 (N=46,164) to extrapolate past trends in obesity prevalence, weight, and BMI among youth aged 2-19 years. Differences in average daily energy requirements between the extrapolated 2020 levels and Healthy People scenarios were estimated. During 1971-2008, mean BMI and weight among U.S. youth increased by 0.55 kg/m(2) and by 1.54 kg per decade, respectively. Extrapolating from these trends to 2020, the average weight among youth in 2020 would increase by ∼1.8 kg from 2007-2008 levels. Averting this increase will require an average reduction of 41 kcal/day in youth's daily energy gap. An additional reduction of 120 kcal/day and 23 kcal/day would be needed to reach Healthy People 2010 and Healthy People 2020 goals, respectively. Larger reductions are needed among adolescents and racial/ethnic minority youth. Aggressive efforts are needed to reverse the positive energy imbalance underlying the childhood obesity epidemic. The energy-gap metric provides a useful tool for goal setting, intervention planning, and charting progress. Copyright © 2012 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
Budget impact analysis of trastuzumab in early breast cancer: a hospital district perspective.
Purmonen, Timo T; Auvinen, Päivi K; Martikainen, Janne A
2010-04-01
Adjuvant trastuzumab is widely used in HER2-positive (HER2+) early breast cancer, and despite its cost-effectiveness, it causes substantial costs for health care. The purpose of the study was to develop a tool for estimating the budget impact of new cancer treatments. With this tool, we were able to estimate the budget impact of adjuvant trastuzumab, as well as the probability of staying within a given budget constraint. The created model-based evaluation tool was used to explore the budget impact of trastuzumab in early breast cancer in a single Finnish hospital district with 250,000 inhabitants. The used model took into account the number of patients, HER2+ prevalence, length and cost of treatment, and the effectiveness of the therapy. Probabilistic sensitivity analysis and alternative case scenarios were performed to ensure the robustness of the results. Introduction of adjuvant trastuzumab caused substantial costs for a relatively small hospital district. In base-case analysis the 4-year net budget impact was 1.3 million euro. The trastuzumab acquisition costs were partially offset by the reduction in costs associated with the treatment of cancer recurrence and metastatic disease. Budget impact analyses provide important information about the overall economic impact of new treatments, and thus offer complementary information to cost-effectiveness analyses. Inclusion of treatment outcomes and probabilistic sensitivity analysis provides more realistic estimates of the net budget impact. The length of trastuzumab treatment has a strong effect on the budget impact.
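The core budget-impact arithmetic can be sketched as below; all inputs are illustrative assumptions rather than the study's Finnish figures.

```python
# Simplified annual budget-impact sketch for a new adjuvant therapy in a hospital district.
# All inputs are illustrative assumptions, not the study's actual figures.

population = 250_000
breast_cancer_incidence = 0.0012      # new cases per inhabitant per year (assumed)
her2_positive_share = 0.15            # share of cases eligible for trastuzumab (assumed)
treatment_cost_per_patient = 35_000   # drug + administration, EUR (assumed)
avoided_recurrence_cost = 8_000       # average offset per treated patient, EUR (assumed)

patients = population * breast_cancer_incidence * her2_positive_share
net_budget_impact = patients * (treatment_cost_per_patient - avoided_recurrence_cost)
print(f"{patients:.0f} patients/yr, net budget impact: {net_budget_impact:,.0f} EUR/yr")
```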
NASA Astrophysics Data System (ADS)
Reimann, S.; Vollmer, M. K.; Henne, S.; Brunner, D.; Emmenegger, L.; Manning, A.; Fraser, P. J.; Krummel, P. B.; Dunse, B. L.; DeCola, P.; Tarasova, O. A.
2016-12-01
In the recently adopted Paris Agreement the community of signatory states has agreed to limit the future global temperature increase between +1.5 °C and +2.0 °C, compared to pre-industrial times. To achieve this goal, emission reduction targets have been submitted by individual nations (called Intended Nationally Determined Contributions, INDCs). Inventories will be used for checking progress towards these envisaged goals. These inventories are calculated by combining information on specific activities (e.g. passenger cars, agriculture) with activity-related, typically IPCC-sanctioned, emission factors - the so-called bottom-up method. These calculated emissions are reported on an annual basis and are checked by external bodies by using the same method. A second independent method estimates emissions by translating greenhouse gas measurements made at regionally representative stations into regional/global emissions using meteorologically-based transport models. In recent years this so-called top-down approach has been substantially advanced into a powerful tool and emission estimates at the national/regional level have become possible. This method is already used in Switzerland, in the United Kingdom and in Australia to estimate greenhouse gas emissions and independently support the national bottom-up emission inventories within the UNFCCC framework. Examples of the comparison of the two independent methods will be presented and the added-value will be discussed. The World Meteorological Organization (WMO) and partner organizations are currently developing a plan to expand this top-down approach and to expand the globally representative GAW network of ground-based stations and remote-sensing platforms and integrate their information with atmospheric transport models. This Integrated Global Greenhouse Gas Information System (IG3IS) initiative will help nations to improve the accuracy of their country-based emissions inventories and their ability to evaluate the success of emission reductions strategies. This could foster trans-national collaboration on methodologies for estimation of emissions. Furthermore, more accurate emission knowledge will clarify the value of emission reduction efforts and could encourage countries to strengthen their reduction pledges.
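The bottom-up method amounts to multiplying activity data by emission factors and summing over source categories; a toy sketch with placeholder values follows.

```python
# Bottom-up inventory sketch: emissions = activity data x emission factor, summed over
# source categories. Activity levels and factors below are placeholders, not IPCC values.

activities = {                      # annual activity level per source category
    "passenger_cars": 50e9,         # vehicle-km
    "dairy_cattle": 1.2e6,          # head
    "natural_gas_heating": 80e9,    # kWh
}
emission_factors = {                # kg CO2-equivalent per activity unit
    "passenger_cars": 0.18,
    "dairy_cattle": 2900.0,         # enteric CH4 expressed as CO2-eq per head per year
    "natural_gas_heating": 0.20,
}

total_mt = sum(activities[k] * emission_factors[k] for k in activities) / 1e9  # kg -> Mt
print(f"bottom-up inventory total: {total_mt:.1f} Mt CO2-eq per year")
```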
Lancelot, Christiane; Thieu, Vincent; Polard, Audrey; Garnier, Josette; Billen, Gilles; Hecq, Walter; Gypens, Nathalie
2011-05-01
Nutrient reduction measures have been already taken by wealthier countries to decrease nutrient loads to coastal waters, in most cases however, prior to having properly assessed their ecological effectiveness and their economic costs. In this paper we describe an original integrated impact assessment methodology to estimate the direct cost and the ecological performance of realistic nutrient reduction options to be applied in the Southern North Sea watershed to decrease eutrophication, visible as Phaeocystis blooms and foam deposits on the beaches. The mathematical tool couples the idealized biogeochemical GIS-based model of the river system (SENEQUE-RIVERSTRAHLER) implemented in the Eastern Channel/Southern North Sea watershed to the biogeochemical MIRO model describing Phaeocystis blooms in the marine domain. Model simulations explore how nutrient reduction options regarding diffuse and/or point sources in the watershed would affect the Phaeocystis colony spreading in the coastal area. The reference and prospective simulations are performed for the year 2000 characterized by mean meteorological conditions, and nutrient reduction scenarios include and compare upgrading of wastewater treatment plants and changes in agricultural practices including an idealized shift towards organic farming. A direct cost assessment is performed for each realistic nutrient reduction scenario. Further the reduction obtained for Phaeocystis blooms is assessed by comparison with ecological indicators (bloom magnitude and duration) and the cost for reducing foam events on the beaches is estimated. Uncertainty brought by the added effect of meteorological conditions (rainfall) on coastal eutrophication is discussed. It is concluded that the reduction obtained by implementing realistic environmental measures on the short-term is costly and insufficient to restore well-balanced nutrient conditions in the coastal area while the replacement of conventional agriculture by organic farming might be an option to consider in the nearby future. Copyright © 2011 Elsevier B.V. All rights reserved.
Modifying high-order aeroelastic math model of a jet transport using maximum likelihood estimation
NASA Technical Reports Server (NTRS)
Anissipour, Amir A.; Benson, Russell A.
1989-01-01
The design of control laws to damp flexible structural modes requires accurate math models. Unlike the design of control laws for rigid body motion (e.g., where robust control is used to compensate for modeling inaccuracies), structural mode damping usually employs narrow-band notch filters. In order to obtain the required accuracy, a maximum likelihood estimation technique is employed to improve the math model using flight data. Presented here are all phases of this methodology: (1) pre-flight analysis (i.e., optimal input signal design for flight test, sensor location determination, model reduction technique, etc.), (2) data collection and preprocessing, and (3) post-flight analysis (i.e., estimation technique and model verification). In addition, a discussion is presented of the software tools used and the need for future study in this field.
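Under the usual additive Gaussian noise assumption, maximum likelihood estimation of modal parameters reduces to nonlinear least squares; the sketch below fits a hypothetical single lightly damped mode to simulated free-decay data, not the jet transport model itself.

```python
# Under additive Gaussian noise, ML estimation of modal parameters reduces to nonlinear
# least squares. Hypothetical single-mode free-decay fit with assumed parameter values.

import numpy as np
from scipy.optimize import curve_fit

def mode_response(t, amp, zeta, f_n, phase):
    """Free decay of a single lightly damped structural mode."""
    w_n = 2.0 * np.pi * f_n
    w_d = w_n * np.sqrt(1.0 - zeta**2)
    return amp * np.exp(-zeta * w_n * t) * np.cos(w_d * t + phase)

t = np.linspace(0.0, 3.0, 600)
true_params = (1.0, 0.02, 2.5, 0.3)          # amplitude, damping ratio, frequency (Hz), phase
y = mode_response(t, *true_params) + 0.05 * np.random.default_rng(0).normal(size=t.size)

p0 = (0.8, 0.05, 2.45, 0.0)                  # initial guess, e.g. from a preliminary FFT
estimate, covariance = curve_fit(mode_response, t, y, p0=p0)
print("estimated [amp, zeta, f_n, phase]:", np.round(estimate, 3))
```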
Developments in seismic monitoring for risk reduction
Celebi, M.
2007-01-01
This paper presents recent state-of-the-art developments to obtain displacements and drift ratios for seismic monitoring and damage assessment of buildings. In most cases, decisions on safety of buildings following seismic events are based on visual inspections of the structures. Real-time instrumental measurements using GPS or double integration of accelerations, however, offer a viable alternative. Relevant parameters, such as the type of connections and structural characteristics (including storey geometry), can be estimated to compute drifts corresponding to several pre-selected threshold stages of damage. Drift ratios determined from real-time monitoring can then be compared to these thresholds in order to estimate damage conditions drift ratios. This approach is demonstrated in three steel frame buildings in San Francisco, California. Recently recorded data of strong shaking from these buildings indicate that the monitoring system can be a useful tool in rapid assessment of buildings and other structures following an earthquake. Such systems can also be used for risk monitoring, as a method to assess performance-based design and analysis procedures, for long-term assessment of structural characteristics of a building, and as a possible long-term damage detection tool.
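A minimal sketch of the drift-ratio comparison described above, with illustrative displacements and damage thresholds (the threshold values are assumptions, not the paper's).

```python
# Drift-ratio check: relative displacement between adjacent floors divided by storey height,
# compared against pre-selected damage thresholds. All numbers are illustrative.

# peak floor displacements (cm), e.g. from GPS or doubly integrated accelerations
displacements_cm = {1: 2.0, 2: 6.5, 3: 9.0}
storey_height_cm = 400
# illustrative drift-ratio thresholds for increasing damage states
thresholds = [(0.005, "light"), (0.010, "moderate"), (0.020, "extensive")]

for storey in (2, 3):
    drift = (displacements_cm[storey] - displacements_cm[storey - 1]) / storey_height_cm
    label = next((name for limit, name in reversed(thresholds) if drift >= limit), "negligible")
    print(f"storey {storey}: drift ratio {drift:.4f} -> {label}")
```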
TBC costing. [test bed concentrator
NASA Technical Reports Server (NTRS)
Kaminski, H. L.
1980-01-01
Procedures to be used in determining the cost of producing and installing a parabolic dish collector in annual production volumes of 10,000, 50,000, 100,000, and 1,000,000 units include (1) evaluating each individual part for material cost and for the type and number of operations required to work the raw material into the finished part; (2) costing labor, burden, tooling, gaging, machinery, and equipment; (3) estimating facilities requirements for each production volume; and (4) considering suggestions for design and material alterations that could result in cost reduction.
Uddin, Muhammad Shahin; Halder, Kalyan Kumar; Tahtali, Murat; Lambert, Andrew J; Pickering, Mark R; Marchese, Margaret; Stuart, Iain
2016-11-01
Ultrasound (US) imaging is a widely used clinical diagnostic tool among medical imaging techniques. It is a comparatively safe, economical, painless, portable, and noninvasive real-time tool compared to the other imaging modalities. However, the image quality of US imaging is severely affected by the presence of speckle noise and blur during the acquisition process. In order to ensure a high-quality clinical diagnosis, US images must be restored by reducing their speckle noise and blur. In general, speckle noise is modeled as a multiplicative noise following a Rayleigh distribution and blur as a Gaussian function. To this end, we propose an intelligent estimator based on artificial neural networks (ANNs) to estimate the variances of noise and blur, which, in turn, are used to obtain an image without discernible distortions. A set of statistical features computed from the image and its complex wavelet sub-bands are used as input to the ANN. In the proposed method, we solve the inverse Rayleigh function numerically for speckle reduction and use the Richardson-Lucy algorithm for de-blurring. The performance of this method is compared with that of the traditional methods by applying them to a synthetic, physical phantom and clinical data, which confirms better restoration results by the proposed method.
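The de-blurring step can be sketched with an off-the-shelf Richardson-Lucy implementation; the PSF, speckle model and parameters below are assumptions, and the paper's numerical inversion of the Rayleigh speckle model is not reproduced here.

```python
# De-blurring step only: Richardson-Lucy deconvolution with an assumed Gaussian PSF.
# The paper's numerical inversion of the Rayleigh speckle model is not reproduced here.

import numpy as np
from scipy.signal import fftconvolve
from skimage import data, img_as_float
from skimage.restoration import richardson_lucy

def gaussian_psf(size=9, sigma=1.5):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

image = img_as_float(data.camera())                 # stand-in for a B-mode frame
psf = gaussian_psf()
blurred = fftconvolve(image, psf, mode="same")
# multiplicative Rayleigh speckle, as in the degradation model described above
speckled = blurred * np.random.default_rng(0).rayleigh(scale=1.0, size=blurred.shape)

restored = richardson_lucy(speckled, psf, 30)       # 30 iterations
```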
Hamilton, Matthew; Mahiane, Guy; Werst, Elric; Sanders, Rachel; Briët, Olivier; Smith, Thomas; Cibulskis, Richard; Cameron, Ewan; Bhatt, Samir; Weiss, Daniel J; Gething, Peter W; Pretorius, Carel; Korenromp, Eline L
2017-02-10
Scale-up of malaria prevention and treatment needs to continue but national strategies and budget allocations are not always evidence-based. This article presents a new modelling tool projecting malaria infection, cases and deaths to support impact evaluation, target setting and strategic planning. Nested in the Spectrum suite of programme planning tools, the model includes historic estimates of case incidence and deaths in groups aged up to 4, 5-14, and 15+ years, and prevalence of Plasmodium falciparum infection (PfPR) among children 2-9 years, for 43 sub-Saharan African countries and their 602 provinces, from the WHO and malaria atlas project. Impacts over 2016-2030 are projected for insecticide-treated nets (ITNs), indoor residual spraying (IRS), seasonal malaria chemoprevention (SMC), and effective management of uncomplicated cases (CMU) and severe cases (CMS), using statistical functions fitted to proportional burden reductions simulated in the P. falciparum dynamic transmission model OpenMalaria. In projections for Nigeria, ITNs, IRS, CMU, and CMS scale-up reduced health burdens in all age groups, with largest proportional and especially absolute reductions in children up to 4 years old. Impacts increased from 8 to 10 years following scale-up, reflecting dynamic effects. For scale-up of each intervention to 80% effective coverage, CMU had the largest impacts across all health outcomes, followed by ITNs and IRS; CMS and SMC conferred additional small but rapid mortality impacts. Spectrum-Malaria's user-friendly interface and intuitive display of baseline data and scenario projections holds promise to facilitate capacity building and policy dialogue in malaria programme prioritization. The module's linking to the OneHealth Tool for costing will support use of the software for strategic budget allocation. In settings with moderately low coverage levels, such as Nigeria, improving case management and achieving universal coverage with ITNs could achieve considerable burden reductions. Projections remain to be refined and validated with local expert input data and actual policy scenarios.
Gray, David R
2016-05-01
Reducing the risk of introduction to North America of the invasive Asian gypsy moth (Lymantria dispar asiatica Vnukovskij and L. d. japonica [Motschulsky]) on international maritime vessels involves two tactics: (1) vessels that wish to arrive in Canada or the United States and have visited any Asian port that is subject to regulation during designated times must obtain a predeparture inspection certificate from an approved entity; and (2) vessels with a certificate may be subjected to an additional inspection upon arrival. A decision support tool is described here with which the allocation of inspection resources at North American ports can be partitioned among multiple vessels according to estimates of the potential onboard Asian gypsy moth population and estimates of the onboard larval emergence pattern. The decision support tool assumes that port inspection is uniformly imperfect at the Asian ports and that each visit to a regulated port has potential for the vessel to be contaminated with gypsy moth egg masses. The decision support tool uses a multigenerational phenology model to estimate the potential onboard population of egg masses by calculating the temporal intersection between the dates of port visits to regulated ports and the simulated oviposition pattern in each port. The phenological development of the onboard population is simulated each day of the vessel log until the vessel arrives at the port being protected from introduction. Multiple independent simulations are used to create a probability distribution of the size and timing of larval emergence. © 2015 Society for Risk Analysis.
Herron, Natasha; Davis, Richard; Jones, Roger
2002-08-01
Widespread afforestation has been proposed as one means of addressing the increasing dryland and stream salinity problem in Australia. However, modelling results presented here suggest that large-scale tree planting will substantially reduce river flows and impose costs on downstream water users if planted in areas of high runoff yield. Streamflow reductions in the Macquarie River, NSW, Australia are estimated for a number of tree planting scenarios and global warming forecasts. The modelling framework includes the Sacramento rainfall-runoff model and IQQM, a streamflow routing tool, as well as various global climate model outputs from which daily rainfall and potential evaporation data files have been generated in OzClim, a climate scenario generator. For a 10% increase in tree cover in the headwaters of the Macquarie, we estimate a 17% reduction in inflows to Burrendong Dam. The drying trend for a mid-range scenario of regional rainfall and potential evaporation caused by a global warming of 0.5 degree C may cause an additional 5% reduction in 2030. These flow reductions will decrease the frequency of bird-breeding events in Macquarie Marshes (a RAMSAR protected wetland) and reduce the security of supply to irrigation areas downstream. Inter-decadal climate variability is predicted to have a very significant influence on catchment hydrologic behaviour. A further 20% reduction in flows from the long-term historical mean is possible, should we move into an extended period of below average rainfall years, such as occurred in eastern Australia between 1890 and 1948. Because current consumptive water use is largely adapted to the wetter conditions of post 1949, a return to prolonged dry periods would cause significant environmental stress given the agricultural and domestic water developments that have been instituted.
Least-squares dual characterization for ROI assessment in emission tomography
NASA Astrophysics Data System (ADS)
Ben Bouallègue, F.; Crouzet, J. F.; Dubois, A.; Buvat, I.; Mariano-Goulart, D.
2013-06-01
Our aim is to describe an original method for estimating the statistical properties of regions of interest (ROIs) in emission tomography. Drawing upon the works of Louis on the approximate inverse, we propose a dual formulation of the ROI estimation problem to derive the ROI activity and variance directly from the measured data without any image reconstruction. The method requires the definition of an ROI characteristic function that can be extracted from a co-registered morphological image. This characteristic function can be smoothed to optimize the resolution-variance tradeoff. An iterative procedure is detailed for the solution of the dual problem in the least-squares sense (least-squares dual (LSD) characterization), and a linear extrapolation scheme is described to compensate for sampling partial volume effect and reduce the estimation bias (LSD-ex). LSD and LSD-ex are compared with classical ROI estimation using pixel summation after image reconstruction and with Huesman's method. For this comparison, we used Monte Carlo simulations (GATE simulation tool) of 2D PET data of a Hoffman brain phantom containing three small uniform high-contrast ROIs and a large non-uniform low-contrast ROI. Our results show that the performances of LSD characterization are at least as good as those of the classical methods in terms of root mean square (RMS) error. For the three small tumor regions, LSD-ex allows a reduction in the estimation bias by up to 14%, resulting in a reduction in the RMS error of up to 8.5%, compared with the optimal classical estimation. For the large non-specific region, LSD using appropriate smoothing could intuitively and efficiently handle the resolution-variance tradeoff.
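In generic approximate-inverse notation (assumed here, not necessarily the paper's exact symbols), the dual formulation can be summarized as follows: rather than reconstructing the image, one solves a least-squares problem for a dual function and applies it directly to the data.

```latex
% Generic approximate-inverse sketch (assumed notation, not necessarily the paper's):
% data g = A f, ROI characteristic function \chi, data covariance \Sigma_g.
\begin{align*}
  a_{\chi} &= \langle \chi, f \rangle
      &&\text{(ROI activity to be estimated)}\\
  \psi^{\ast} &= \arg\min_{\psi} \lVert A^{\top}\psi - \chi \rVert_2^2
      &&\text{(least-squares dual problem)}\\
  \hat{a}_{\chi} &= \langle \psi^{\ast}, g \rangle,
  \qquad
  \operatorname{Var}\!\left(\hat{a}_{\chi}\right) = {\psi^{\ast}}^{\top} \Sigma_{g}\, \psi^{\ast}
      &&\text{(estimate and variance directly from the data)}
\end{align*}
```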
Evolution of seismic risk management for insurance over the past 30 years
NASA Astrophysics Data System (ADS)
Shah, Haresh C.; Dong, Weimin; Stojanovski, Pane; Chen, Alex
2018-01-01
During the past 30 years, there has been spectacular growth in the use of risk analysis and risk management tools developed by engineers in the financial and insurance sectors. The insurance, the reinsurance, and the investment banking sectors have enthusiastically adopted loss estimation tools developed by engineers in developing their business strategies and for managing their financial risks. As a result, insurance/reinsurance strategy has evolved as a major risk mitigation tool in managing catastrophe risk at the individual, corporate, and government level. This is particularly true in developed countries such as US, Western Europe, and Japan. Unfortunately, it has not received the needed attention in developing countries, where such a strategy for risk management is most needed. Fortunately, in the last five years, there has been excellent focus in developing "InsurTech" tools to address the much needed "Insurance for the Masses", especially for the Asian Markets. In the earlier years of catastrophe model development, risk analysts were mainly concerned with risk reduction options through engineering strategies, and relatively little attention was given to financial and economic strategies. Such state-of-affairs still exists in many developing countries. The new developments in the science and technologies of loss estimation due to natural catastrophes have made it possible for financial sectors to model their business strategies such as peril and geographic diversification, premium calculations, reserve strategies, reinsurance contracts, and other underwriting tools. These developments have not only changed the way in which financial sectors assess and manage their risks, but have also changed the domain of opportunities for engineers and scientists. This paper will address the issues related to developing insurance/reinsurance strategies to mitigate catastrophe risks and describe the role catastrophe risk insurance and reinsurance has played in managing financial risk due to natural catastrophes. Historical losses and the share of those losses covered by insurance will be presented. How such risk sharing can help the nation share the burden of losses between tax paying public, the "at risk" property owners, the insurers and the reinsurers will be discussed. The paper will summarize the tools that are used by the insurance and reinsurance companies for estimating their future losses due to catastrophic natural events. The paper will also show how the results of loss estimation technologies developed by engineers are communicated to the business flow of insurance/reinsurance companies. Finally, to make it possible to grow "Insurance for the Masses-IFM", the role played by parametric insurance products and InsurTech tools will be discussed.
Satellite vulnerability to space debris - an improved 3D risk assessment methodology
NASA Astrophysics Data System (ADS)
Grassi, Lilith; Tiboldo, Francesca; Destefanis, Roberto; Donath, Thérèse; Winterboer, Arne; Evans, Leanne; Janovsky, Rolf; Kempf, Scott; Rudolph, Martin; Schäfer, Frank; Gelhaus, Johannes
2014-06-01
The work described in the present paper, performed as a part of the P2 project, presents an enhanced method to evaluate satellite vulnerability to micrometeoroids and orbital debris (MMOD), using the ESABASE2/Debris tool (developed under ESA contract). Starting from the estimation of induced failures on spacecraft (S/C) components and from the computation of lethal impacts (with an energy leading to the loss of the satellite), and considering the equipment redundancies and interactions between components, the debris-induced S/C functional impairment is assessed. The developed methodology, illustrated through its application to a case study satellite, includes the capability to estimate the number of failures on internal components, overcoming the limitations of current tools which do not allow propagating the debris cloud inside the S/C. The ballistic limit of internal equipment behind a sandwich panel structure is evaluated through the implementation of the Schäfer Ryan Lambert (SRL) Ballistic Limit Equation (BLE). The analysis conducted on the case study satellite shows the S/C vulnerability index to be in the range of about 4% over the complete mission, with a significant reduction with respect to the results typically obtained with the traditional analysis, which considers as a failure the structural penetration of the satellite structural panels. The methodology has then been applied to select design strategies (additional local shielding, relocation of components) to improve S/C protection with respect to MMOD. The results of the analyses conducted on the improved design show a reduction of the vulnerability index of about 18%.
Insights into early lithic technologies from ethnography
Hayden, Brian
2015-01-01
Oldowan lithic assemblages are often portrayed as a product of the need to obtain sharp flakes for cutting into animal carcases. However, ethnographic and experimental research indicates that the optimal way to produce flakes for such butchering purposes is via bipolar reduction of small cryptocrystalline pebbles rather than from larger crystalline cores resembling choppers. Ethnographic observations of stone tool-using hunter-gatherers in environments comparable with early hominins indicate that most stone tools (particularly chopper forms and flake tools) were used for making simple shaft tools including spears, digging sticks and throwing sticks. These tools bear strong resemblances to Oldowan stone tools. Bipolar reduction for butchering probably preceded chopper-like core reduction and provides a key link between primate nut-cracking technologies and the emergence of more sophisticated lithic technologies leading to the Oldowan. PMID:26483534
Nikolic, D.; Richter, S. S.; Asamoto, K.; Wyllie, R.; Tuttle, R.
2017-01-01
There is substantial evidence that stool culture and parasitological examinations are of minimal to no value after 3 days of hospitalization. We implemented and studied the impact of a clinical decision support tool (CDST) to decrease the number of unnecessary stool cultures (STCUL), ova/parasite (O&P) examinations, and Giardia/Cryptosporidium enzyme immunoassay screens (GC-EIA) performed for patients hospitalized >3 days. We studied the frequency of stool studies ordered before or on day 3 and after day 3 of hospitalization (i.e., categorical orders/total number of orders) before and after this intervention and denoted the numbers and types of microorganisms detected within those time frames. This intervention, which corresponded to a custom-programmed hard-stop alert tool in the Epic hospital information system, allowed providers to override the intervention by calling the laboratory, if testing was deemed medically necessary. Comparative statistics were employed to determine significance, and cost savings were estimated based on our internal costs. Before the intervention, 129/670 (19.25%) O&P examinations, 47/204 (23.04%) GC-EIA, and 249/1,229 (20.26%) STCUL were ordered after 3 days of hospitalization. After the intervention, 46/521 (8.83%) O&P examinations, 27/157 (17.20%) GC-EIA, and 106/1,028 (10.31%) STCUL were ordered after 3 days of hospitalization. The proportions of reductions in the number of tests performed after 3 days and the associated P values were 54.1% for O&P examinations (P < 0.0001), 22.58% for GC-EIA (P = 0.2807), and 49.1% for STCUL (P < 0.0001). This was estimated to have resulted in $8,108.84 of cost savings. The electronic CDST resulted in a substantial reduction in the number of evaluations of stool cultures and the number of parasitological examinations for patients hospitalized for more than 3 days and in a cost savings while retaining the ability of the clinician to obtain these tests if clinically indicated. PMID:28954902
Speckle reduction during all-fiber common-path optical coherence tomography of the cavernous nerves
NASA Astrophysics Data System (ADS)
Chitchian, Shahab; Fiddy, Michael; Fried, Nathaniel M.
2009-02-01
Improvements in identification, imaging, and visualization of the cavernous nerves during prostate cancer surgery, which are responsible for erectile function, may improve nerve preservation and postoperative sexual potency. In this study, we use a rat prostate, ex vivo, to evaluate the feasibility of optical coherence tomography (OCT) as a diagnostic tool for real-time imaging and identification of the cavernous nerves. A novel OCT system based on an all single-mode fiber common-path interferometer-based scanning system is used for this purpose. A wavelet shrinkage denoising technique using Stein's unbiased risk estimator (SURE) algorithm to calculate a data-adaptive threshold is implemented for speckle noise reduction in the OCT image. The signal-to-noise ratio (SNR) was improved by 9 dB and the image quality metrics of the cavernous nerves also improved significantly.
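A sketch of wavelet shrinkage denoising using PyWavelets; for simplicity it applies a fixed universal threshold in place of the data-adaptive SURE threshold described in the study, and uses a synthetic image rather than OCT data.

```python
# Wavelet shrinkage sketch with PyWavelets; a fixed universal threshold stands in for
# the data-adaptive SURE threshold used in the study.

import numpy as np
import pywt

def wavelet_denoise(image, wavelet="db4", levels=3):
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    # robust noise estimate from the finest diagonal sub-band
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(image.size))          # universal threshold
    shrunk = [coeffs[0]] + [
        tuple(pywt.threshold(band, thr, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(shrunk, wavelet)

rng = np.random.default_rng(0)
clean = np.zeros((256, 256))
clean[64:192, 64:192] = 1.0                                   # simple synthetic target
noisy = clean + rng.normal(scale=0.2, size=clean.shape)
denoised = wavelet_denoise(noisy)
```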
NASA Astrophysics Data System (ADS)
Valchev, Nikolay; Eftimova, Petya; Andreeva, Nataliya; Prodanov, Bogdan
2017-04-01
The coastal zone is among the fastest evolving areas worldwide. The ever-increasing population inhabiting coastal settlements develops often conflicting economic and societal activities. The existing imbalance between the expansion of these activities, on one hand, and the potential to accommodate them in a sustainable manner, on the other, becomes a critical problem. Concurrently, coasts are affected by various hydro-meteorological phenomena such as storm surges, heavy seas, strong winds and flash floods, whose intensities and occurrence frequencies are likely to increase due to climate change. This implies the elaboration of tools capable of quickly predicting the impact of those phenomena on the coast and providing solutions in terms of disaster risk reduction measures. One such tool is the Bayesian network. This paper describes the set-up of such a network for Varna Bay (Bulgaria, Western Black Sea). It relates near-shore storm conditions to their onshore flood potential and ultimately to the relevant impact, expressed as relative damage to the coastal and man-made environment. The methodology for set-up and training of the Bayesian network was developed within the RISC-KIT project (Resilience-Increasing Strategies for Coasts - toolKIT). The proposed BN reflects the interaction between boundary conditions, receptors, hazard, and consequences. Storm boundary conditions - maximum significant wave height and peak surge level - were determined on the basis of their historical and projected occurrence. The only hazard considered in this study is flooding, characterized by maximum inundation depth. The BN was trained with synthetic events created by combining estimated boundary conditions. Flood impact was modeled with the process-based morphodynamic model XBeach. Restaurants, sport and leisure facilities, administrative buildings, and car parks were introduced in the network as receptors. Consequences (impact) are estimated in terms of relative damage caused by a given inundation depth. National depth-damage (susceptibility) curves were used to define the percentage of damage, ranked as low, moderate, high and very high. Besides the previously described components, the BN also includes two hazard-influencing disaster risk reduction (DRR) measures: re-enforced embankment of the Varna Port wall and beach nourishment. As a result of the training process, the network is able to evaluate spatially varying hazards and damages for specific storm conditions. Moreover, it is able to predict where on the site the highest impact would occur and to quantify the mitigation capacity of the proposed DRR measures. For example, it is estimated that storm impact would be considerably reduced under present conditions, but vulnerability would remain high in a climate change perspective.
Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)
2005-02-01
The program covers Model Order Reduction (MOR) tools, system-level mixed-signal circuit synthesis and optimization tools, and parasitic extraction tools, along with fast time-domain mixed-signal circuit simulation (HAARSPICE algorithms).
Tago, Damian; Sall, Baba; Lancelot, Renaud; Pradel, Jennifer
2017-09-01
Vaccination is one of the main tools currently available to control animal diseases. In eradication campaigns, vaccination plays a crucial role by reducing the number of susceptible hosts with the ultimate goal of interrupting disease transmission. Nevertheless, mass vaccination campaigns may be very expensive and in some cases unprofitable. VacciCost is a tool designed to help decision-makers in the estimation of the resources required to implement mass livestock vaccination campaigns against regulated diseases. The tool focuses on the operational or running costs of the campaign, so acquisition of new equipment or vehicles is not considered. It takes into account different types of production systems to differentiate the vaccination productivity (number of animals vaccinated per day) in systems where animals are concentrated and easy to reach, from those characterized by small herds that are scattered and less accessible. The resource requirements are classified in eight categories: vaccines, injection supplies, personnel, transport, maintenance and overhead, training, social mobilization, and surveillance and monitoring. This categorization allows identifying the most expensive components of a vaccination campaign, which is crucial to design cost-reduction strategies. The use of the tool is illustrated using data collected in collaboration with Senegalese Veterinary Services regarding vaccination against peste des petits ruminants. The average daily number of animals vaccinated per vaccination team was found to be crucial for the costs of the campaign so significant savings can be obtained by implementing training to improve the performance of vaccination teams. Copyright © 2017 Centre de cooperation internationale en recherche agronomique pour le developpement (CIRAD). Published by Elsevier B.V. All rights reserved.
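The running-cost roll-up can be sketched as below; the eight categories follow the abstract, while the unit costs, herd size and productivity figure are placeholders rather than VacciCost defaults or Senegalese data.

```python
# Sketch of the running-cost roll-up logic described for VacciCost. Categories follow the
# abstract; unit costs, productivity and campaign size are placeholders.

animals_to_vaccinate = 2_000_000
animals_per_team_per_day = 400          # key driver highlighted in the abstract
cost_per_team_day = 120.0               # personnel + per-diem, USD (assumed)

team_days = animals_to_vaccinate / animals_per_team_per_day
costs = {
    "vaccines": animals_to_vaccinate * 0.10,
    "injection_supplies": animals_to_vaccinate * 0.03,
    "personnel": team_days * cost_per_team_day,
    "transport": team_days * 25.0,
    "maintenance_overhead": 50_000,
    "training": 20_000,
    "social_mobilization": 30_000,
    "surveillance_monitoring": 40_000,
}
total = sum(costs.values())
print(f"team-days: {team_days:.0f}, total running cost: {total:,.0f} USD")
```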
The EPA's human exposure research program for assessing cumulative risk in communities.
Zartarian, Valerie G; Schultz, Bradley D
2010-06-01
Communities are faced with challenges in identifying and prioritizing environmental issues, taking actions to reduce their exposures, and determining their effectiveness for reducing human health risks. Additional challenges include determining what scientific tools are available and most relevant, and understanding how to use those tools; given these barriers, community groups tend to rely more on risk perception than science. The U.S. Environmental Protection Agency's Office of Research and Development, National Exposure Research Laboratory (NERL) and collaborators are developing and applying tools (models, data, methods) for enhancing cumulative risk assessments. The NERL's "Cumulative Communities Research Program" focuses on key science questions: (1) How to systematically identify and prioritize key chemical stressors within a given community?; (2) How to develop estimates of exposure to multiple stressors for individuals in epidemiologic studies?; and (3) What tools can be used to assess community-level distributions of exposures for the development and evaluation of the effectiveness of risk reduction strategies? This paper provides community partners and scientific researchers with an understanding of the NERL research program and other efforts to address cumulative community risks; and key research needs and opportunities. Some initial findings include the following: (1) Many useful tools exist for components of risk assessment, but need to be developed collaboratively with end users and made more comprehensive and user-friendly for practical application; (2) Tools for quantifying cumulative risks and impact of community risk reduction activities are also needed; (3) More data are needed to assess community- and individual-level exposures, and to link exposure-related information with health effects; and (4) Additional research is needed to incorporate risk-modifying factors ("non-chemical stressors") into cumulative risk assessments. The products of this research program will advance the science for cumulative risk assessments and empower communities with information so that they can make informed, cost-effective decisions to improve public health.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Patrick Gonzalez; Sandra Brown
2005-10-01
The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st, 2005 and June 30th, 2005. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.
COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL
NASA Technical Reports Server (NTRS)
Roush, G. B.
1994-01-01
The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo Professional 5.0 for recompilation. An executable is provided on the distribution diskettes. COSTMODL requires 512K RAM. The standard distribution medium for COSTMODL is three 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. COSTMODL was developed in 1991. IBM PC is a registered trademark of International Business Machines. Borland and Turbo Pascal are registered trademarks of Borland International, Inc. Turbo Professional is a trademark of TurboPower Software. MS-DOS is a registered trademark of Microsoft Corporation.
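For reference, the Basic COCOMO organic-mode equations, one of the model families COSTMODL incorporates, take the following form; the 32 KLOC input is only an example.

```python
# Basic COCOMO effort and schedule equations (organic mode), one of the model families
# bundled in COSTMODL. Coefficients are the published Basic COCOMO organic-mode values.

def basic_cocomo_organic(kloc):
    effort_pm = 2.4 * kloc ** 1.05         # effort in person-months
    schedule_mo = 2.5 * effort_pm ** 0.38  # development schedule in months
    staff = effort_pm / schedule_mo        # average staffing level
    return effort_pm, schedule_mo, staff

effort, schedule, staff = basic_cocomo_organic(32)   # a 32 KLOC project
print(f"{effort:.1f} person-months over {schedule:.1f} months (~{staff:.1f} people)")
```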
MEMO2 - MEthane goes MObile - MEasurements and Modelling - Part 2
NASA Astrophysics Data System (ADS)
Röckmann, Thomas; Walter, Sylvia
2017-04-01
As mitigation of climate change is a key scientific and societal challenge, the 2015 United Nations Climate Change Conference in Paris (COP21) agreed to limit global warming "well below" 2 °C and, if possible, below 1.5 °C. Reaching this target requires massive reductions of greenhouse gas emissions, and achieving significant reductions of greenhouse gas emissions is a logical headline target of EU climate action and of the H2020 strategy. CH4 emissions are a major contributor to Europe's global warming impact and are not yet well quantified. There are significant discrepancies between official inventories of emissions and estimates derived from direct atmospheric measurement. Effective emission reduction can only be achieved if sources are properly quantified and mitigation efforts are verified. New advanced combinations of measurement and modelling are needed to achieve such quantification. MEMO2 will contribute to the targets of the EU with a focus on methane (CH4). The project will bridge the gap between large-scale scientific estimates from in situ monitoring programs and the 'bottom-up' estimates of emissions from local sources that are used in national reporting by I) developing new and advanced mobile methane (CH4) measurement tools and networks, II) isotopic source identification, and III) modelling at different scales. Within the project, qualified scientists will be educated in the use and implementation of interdisciplinary knowledge and techniques that are essential to meet and verify emission reduction goals. MEMO2 will facilitate intensive collaboration between the largely academic greenhouse gas monitoring community and non-academic partners who are responsible for evaluating and reporting greenhouse gas emissions to policy makers. MEMO2 is a European Training Network with more than 20 collaborators from 7 countries. It is a four-year project, and we will present the project and its objectives to the scientific community to foster collaboration and scientific exchange from the beginning.
Estimating the strength of bone using linear response
NASA Astrophysics Data System (ADS)
Gunaratne, Gemunu H.
2002-12-01
Accurate diagnostic tools are required for effective management of osteoporosis; one method to identify additional diagnostics is to search for observable consequences of bone loss. An analysis of a model system is used to show that weakening of a bone is accompanied by a reduction of the fraction of the bone that participates in load transmission. On the basis of this observation, it is argued that the ratio Γ of linear responses of a network to dc and high-frequency ac driving can be used as a surrogate for its strength. Protocols needed to obtain Γ for bone samples are discussed.
Approach and case-study of green infrastructure screening analysis for urban stormwater control.
Eaton, Timothy T
2018-03-01
Urban stormwater control is an urgent concern in megacities where increased impervious surface has disrupted natural hydrology. Water managers are increasingly turning to more environmentally friendly ways of capturing stormwater, called Green Infrastructure (GI), to mitigate combined sewer overflow (CSO) that degrades local water quality. A rapid screening approach is described to evaluate how GI strategies can reduce the amount of stormwater runoff in a low-density residential watershed in New York City. Among multiple possible tools, the L-THIA LID online software package, using the SCS-CN method, was selected to estimate relative runoff reductions expected with different strategies in areas of different land uses in the watershed. Results are sensitive to the relative areas of different land uses, and show that bioretention and raingardens provide the maximum reduction (∼12%) in this largely residential watershed. Although commercial, industrial and high-density residential areas in the watershed are minor, larger runoff reductions from disconnection strategies and porous pavement in parking lots are also possible. Total stormwater reductions from various combinations of these strategies can reach 35-55% for individual land uses, and between 23% and 42% for the entire watershed. Copyright © 2017. Published by Elsevier Ltd.
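The runoff arithmetic behind the L-THIA/SCS-CN comparison above is compact enough to sketch. The following Python snippet implements the standard SCS Curve Number equation; the curve numbers in the example are illustrative assumptions, not values taken from the study or from L-THIA defaults.

```python
def scs_runoff_inches(rainfall_in: float, cn: float, ia_ratio: float = 0.2) -> float:
    """SCS Curve Number runoff depth (inches) for a given rainfall depth (inches)."""
    s = 1000.0 / cn - 10.0        # potential maximum retention after runoff begins
    ia = ia_ratio * s             # initial abstraction
    if rainfall_in <= ia:
        return 0.0
    return (rainfall_in - ia) ** 2 / (rainfall_in - ia + s)

# Illustrative comparison for a 2-inch storm: baseline residential cover vs. the
# same area treated with bioretention (curve numbers are assumptions).
for label, cn in [("baseline residential", 83), ("with bioretention", 70)]:
    print(f"{label}: {scs_runoff_inches(2.0, cn):.2f} in of runoff")
```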
Determination of Flood Reduction Alternatives for Climate Change Adaptation in Gyeongancheon basin
NASA Astrophysics Data System (ADS)
Han, D.; Joo, H. J.; Jung, J.; Kim, H. S.
2017-12-01
Recently, the frequency of extreme rainfall events has increased due to climate change, and the impervious area in urban watersheds has also increased due to rapid urbanization. Therefore, flood risk is increasing, and countermeasures for flood damage reduction must be prepared. To determine appropriate measures or alternatives, this study first estimated frequency-based rainfall considering climate change for each target period (reference: 1971-2010, Target period Ⅰ: 2011-2040, Target period Ⅱ: 2041-2070, Target period Ⅲ: 2071-2100). The future flood discharge was then computed using the HEC-HMS model. We set five sizes each of drainage pumps and detention ponds as flood reduction alternatives, and the flood level in the river was obtained for each alternative through the HEC-RAS model. The flood inundation map was constructed using topographical data and flood water levels in the river, and the economic analysis of flood damage reduction was conducted using the Multi Dimensional Flood Damage Analysis (MD-FDA) tool. The effectiveness analysis of the flood reduction alternatives showed that the flood level was reduced by 0.06 m to 0.44 m by the drainage pumps and by 0.01 m to 1.86 m by the detention ponds. The flooded area was reduced by 0.3% to 32.64%, and inundation depth also dropped. A comparison of the benefit/cost ratios estimated by the economic analysis indicated that detention pond E in target period Ⅰ and pump D in periods Ⅱ and Ⅲ were the appropriate alternatives for flood damage reduction under climate change. Acknowledgements: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (2017R1A2B3005695).
Phylogenetic Tools for Generalized HIV-1 Epidemics: Findings from the PANGEA-HIV Methods Comparison
Ratmann, Oliver; Hodcroft, Emma B.; Pickles, Michael; Cori, Anne; Hall, Matthew; Lycett, Samantha; Colijn, Caroline; Dearlove, Bethany; Didelot, Xavier; Frost, Simon; Hossain, A.S. Md Mukarram; Joy, Jeffrey B.; Kendall, Michelle; Kühnert, Denise; Leventhal, Gabriel E.; Liang, Richard; Plazzotta, Giacomo; Poon, Art F.Y.; Rasmussen, David A.; Stadler, Tanja; Volz, Erik; Weis, Caroline; Leigh Brown, Andrew J.; Fraser, Christophe
2017-01-01
Viral phylogenetic methods contribute to understanding how HIV spreads in populations, and thereby help guide the design of prevention interventions. So far, most analyses have been applied to well-sampled concentrated HIV-1 epidemics in wealthy countries. To direct the use of phylogenetic tools to where the impact of HIV-1 is greatest, the Phylogenetics And Networks for Generalized HIV Epidemics in Africa (PANGEA-HIV) consortium generates full-genome viral sequences from across sub-Saharan Africa. Analyzing these data presents new challenges, since epidemics are principally driven by heterosexual transmission and a smaller fraction of cases is sampled. Here, we show that viral phylogenetic tools can be adapted and used to estimate epidemiological quantities of central importance to HIV-1 prevention in sub-Saharan Africa. We used a community-wide methods comparison exercise on simulated data, where participants were blinded to the true dynamics they were inferring. Two distinct simulations captured generalized HIV-1 epidemics, before and after a large community-level intervention that reduced infection levels. Five research groups participated. Structured coalescent modeling approaches were most successful: phylogenetic estimates of HIV-1 incidence, incidence reductions, and the proportion of transmissions from individuals in their first 3 months of infection correlated with the true values (Pearson correlation > 90%), with small bias. However, on some simulations, true values were markedly outside reported confidence or credibility intervals. The blinded comparison revealed current limits and strengths in using HIV phylogenetics in challenging settings, provided benchmarks for future methods’ development, and supports using the latest generation of phylogenetic tools to advance HIV surveillance and prevention. PMID:28053012
Petterson, S R
2016-02-01
The aim of this study was to develop a modified quantitative microbial risk assessment (QMRA) framework that could be applied as a decision support tool to choose between alternative drinking water interventions in the developing context. The impact of different household water treatment (HWT) interventions on the overall incidence of diarrheal disease and disability adjusted life years (DALYs) was estimated, without relying on source water pathogen concentration as the starting point for the analysis. A framework was developed and a software tool constructed and then implemented for an illustrative case study for Nepal based on published scientific data. Coagulation combined with free chlorine disinfection provided the greatest estimated health gains in the short term; however, when long-term compliance was incorporated into the calculations, the preferred intervention was porous ceramic filtration. The model demonstrates how the QMRA framework can be used to integrate evidence from different studies to inform management decisions, and in particular to prioritize the next best intervention with respect to estimated reduction in diarrheal incidence. This study only considered HWT interventions; it is recognized that a systematic consideration of sanitation, recreation, and drinking water pathways is important for effective management of waterborne transmission of pathogens, and the approach could be expanded to consider the broader water-related context. © 2015 Society for Risk Analysis.
Lessons learned in deploying software estimation technology and tools
NASA Technical Reports Server (NTRS)
Panlilio-Yap, Nikki; Ho, Danny
1994-01-01
Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.
Chapman, Benjamin; Eversley, Tiffany; Fillion, Katie; Maclaurin, Tanya; Powell, Douglas
2010-06-01
Globally, foodborne illness affects an estimated 30% of individuals annually. Meals prepared outside of the home are a risk factor for acquiring foodborne illness and have been implicated in up to 70% of traced outbreaks. The Centers for Disease Control and Prevention has called on food safety communicators to design new methods and messages aimed at increasing food safety risk-reduction practices from farm to fork. Food safety infosheets, a novel communication tool designed to appeal to food handlers and compel behavior change, were evaluated. Food safety infosheets were provided weekly to food handlers in working food service operations for 7 weeks. It was hypothesized that through the posting of food safety infosheets in highly visible locations, such as kitchen work areas and hand washing stations, the safe food handling behaviors of food service staff could be positively influenced. Using video observation, food handlers (n = 47) in eight food service operations were observed for a total of 348 h (pre- and postintervention combined). After the food safety infosheets were introduced, food handlers demonstrated a significant increase (6.7%, P < 0.05, 95% confidence interval) in mean hand washing attempts, and a significant reduction in indirect cross-contamination events (19.6%, P < 0.05, 95% confidence interval). Results of the research demonstrate that posting food safety infosheets is an effective intervention tool that positively influences the food safety behaviors of food handlers.
Performance and Weight Estimates for an Advanced Open Rotor Engine
NASA Technical Reports Server (NTRS)
Hendricks, Eric S.; Tong, Michael T.
2012-01-01
NASA's Environmentally Responsible Aviation Project and Subsonic Fixed Wing Project are focused on developing concepts and technologies which may enable dramatic reductions to the environmental impact of future generation subsonic aircraft. The open rotor concept (also historically referred to as an unducted fan or advanced turboprop) may allow for the achievement of this objective by reducing engine fuel consumption. To evaluate the potential impact of open rotor engines, cycle modeling and engine weight estimation capabilities have been developed. The initial development of the cycle modeling capabilities in the Numerical Propulsion System Simulation (NPSS) tool was presented in a previous paper. Following that initial development, further advancements have been made to the cycle modeling and weight estimation capabilities for open rotor engines and are presented in this paper. The developed modeling capabilities are used to predict the performance of an advanced open rotor concept using modern counter-rotating propeller designs. Finally, performance and weight estimates for this engine are presented and compared to results from a previous NASA study of advanced geared and direct-drive turbofans.
2010-12-01
processes. Novice estimators must often make use of these complicated cost estimation tools (e.g., ACEIT, SEER-H, SEER-S, PRICE-H, PRICE-S, etc.) until... However, the thesis will leverage the processes embedded in cost estimation tools such as the Automated Cost Estimating Integrated Tool (ACEIT) and the
Costing behavioral interventions: a practical guide to enhance translation.
Ritzwoller, Debra P; Sukhanova, Anna; Gaglio, Bridget; Glasgow, Russell E
2009-04-01
Cost and cost effectiveness of behavioral interventions are critical parts of dissemination and implementation into non-academic settings. Due to the lack of indicative data and policy makers' increasing demands for both program effectiveness and efficiency, cost analyses can serve as valuable tools in the evaluation process. To stimulate and promote broader use of practical techniques that can be used to efficiently estimate the implementation costs of behavioral interventions, we propose a set of analytic steps that can be employed across a broad range of interventions. Intervention costs must be distinguished from research, development, and recruitment costs. The inclusion of sensitivity analyses is recommended to understand the implications of implementation of the intervention into different settings using different intervention resources. To illustrate these procedures, we use data from a smoking reduction practical clinical trial to describe the techniques and methods used to estimate and evaluate the costs associated with the intervention. Estimated intervention costs per participant were $419, with a range of $276 to $703, depending on the number of participants.
Salgado, Diana; Torres, J Antonio; Welti-Chanes, Jorge; Velazquez, Gonzalo
2011-08-01
Consumer demand for food safety and quality improvements, combined with new regulations, requires determining the processor's confidence level that processes lowering safety risks while retaining quality will meet consumer expectations and regulatory requirements. Monte Carlo calculation procedures incorporate input data variability to obtain the statistical distribution of the output of prediction models. This advantage was used to analyze the survival risk of Mycobacterium avium subspecies paratuberculosis (M. paratuberculosis) and Clostridium botulinum spores in high-temperature short-time (HTST) milk and canned mushrooms, respectively. The results showed an estimated 68.4% probability that the 15 sec HTST process would not achieve at least 5 decimal reductions in M. paratuberculosis counts. Although estimates of the raw milk load of this pathogen are not available to estimate the probability of finding it in pasteurized milk, the wide range of the estimated decimal reductions, reflecting the variability of the experimental data available, should be a concern to dairy processors. Knowledge of the C. botulinum initial load and decimal thermal time variability was used to estimate an 8.5 min thermal process time at 110 °C for canned mushrooms reducing the risk to 10⁻⁹ spores/container with a 95% confidence. This value was substantially higher than the one estimated using average values (6.0 min) with an unacceptable 68.6% probability of missing the desired processing objective. Finally, the benefit of reducing the variability in initial load and decimal thermal time was confirmed, achieving a 26.3% reduction in processing time when standard deviation values were lowered by 90%. In spite of novel technologies, commercialized or under development, thermal processing continues to be the most reliable and cost-effective alternative to deliver safe foods. However, the severity of the process should be assessed to avoid under- and over-processing and determine opportunities for improvement. This should include a systematic approach to consider variability in the parameters for the models used by food process engineers when designing a thermal process. The Monte Carlo procedure here presented is a tool to facilitate this task for the determination of process time at a constant lethal temperature. © 2011 Institute of Food Technologists®
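A minimal sketch of the Monte Carlo procedure described above, for process time at a constant lethal temperature, is given below. It uses the first-order relationship t = D x (log10 N0 - log10 Nf); the input distributions are illustrative assumptions, not the study's data for C. botulinum in canned mushrooms.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Illustrative input distributions (assumptions, not the study's data):
log_n0 = rng.normal(loc=2.0, scale=0.5, size=n)             # log10 initial spores/container
d_value = rng.normal(loc=1.5, scale=0.3, size=n).clip(0.5)  # decimal reduction time at 110 °C (min)

target_log_nf = -9.0                                  # 10^-9 spores/container
process_time = d_value * (log_n0 - target_log_nf)     # t = D * (log N0 - log Nf)

print("process time at 95% confidence:", round(np.percentile(process_time, 95), 1), "min")
print("process time from mean inputs: ",
      round(d_value.mean() * (log_n0.mean() - target_log_nf), 1), "min")
```

The comparison of the two printed values mirrors the point made in the abstract: designing to average inputs understates the time needed to meet the processing objective with high confidence.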
Cost of areal reduction of gulf hypoxia through agricultural practice.
Whittaker, Gerald; Barnhart, Bradley L; Srinivasan, Raghavan; Arnold, Jeffrey G
2015-02-01
A major share of the area of hypoxic growth in the Northern Gulf of Mexico has been attributed to nutrient run-off from agricultural fields, but no estimate is available for the cost of reducing Gulf hypoxic area using agricultural conservation practices. We apply the Soil and Water Assessment Tool using observed daily weather to simulate the reduction in nitrogen loading in the Upper Mississippi River Basin (UMRB) that would result from enrolling all row crop acreage in the Conservation Reserve Program (CRP). Nitrogen loadings at the outlet of the UMRB are used to predict Gulf hypoxic area, and net cash farm rent is used as the price for participation in the CRP. Over the course of the 42 year simulation, direct CRP costs total more than $388 billion, and the Inter-Governmental Task Force goal of hypoxic area less than 5000 square kilometers is met in only two years. Published by Elsevier B.V.
SAND: an automated VLBI imaging and analysing pipeline - I. Stripping component trajectories
NASA Astrophysics Data System (ADS)
Zhang, M.; Collioud, A.; Charlot, P.
2018-02-01
We present our implementation of an automated very long baseline interferometry (VLBI) data-reduction pipeline that is dedicated to interferometric data imaging and analysis. The pipeline can handle massive VLBI data efficiently, which makes it an appropriate tool to investigate multi-epoch multiband VLBI data. Compared to traditional manual data reduction, our pipeline provides more objective results as less human interference is involved. The source extraction is carried out in the image plane, while deconvolution and model fitting are performed in both the image plane and the uv plane for parallel comparison. The output from the pipeline includes catalogues of CLEANed images and reconstructed models, polarization maps, proper motion estimates, core light curves and multiband spectra. We have developed a regression STRIP algorithm to automatically detect linear or non-linear patterns in the jet component trajectories. This algorithm offers an objective method to match jet components at different epochs and to determine their proper motions.
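One of the steps the pipeline automates, estimating a component's proper motion from its positions at successive epochs, reduces to a linear fit. A minimal sketch is shown below with invented positions; the actual STRIP algorithm also handles non-linear trajectories and component matching across epochs, which are not reproduced here.

```python
import numpy as np

# Invented component positions: observation epoch (years) and radial
# separation from the core (milliarcseconds).
epoch = np.array([2010.2, 2011.1, 2012.3, 2013.0, 2014.4])
sep_mas = np.array([0.42, 0.55, 0.71, 0.80, 0.99])

# Linear model sep = mu * (t - t_mean) + b; mu is the apparent proper motion.
design = np.vstack([epoch - epoch.mean(), np.ones_like(epoch)]).T
(mu, b), *_ = np.linalg.lstsq(design, sep_mas, rcond=None)

print(f"proper motion: {mu:.3f} mas/yr")
```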
Challenges to the global control of tuberculosis.
Chiang, Chen-Yuan; Van Weezenbeek, Catharina; Mori, Toru; Enarson, Donald A
2013-05-01
Diagnosis and treatment of tuberculosis (TB) will likely navigate a historical turning point in the 2010s with a new management paradigm emerging. However, global control of TB remains a formidable challenge for the decades to come. The estimated case detection rate of TB globally was 66%, and there were 310 000 estimated multidrug-resistant TB (MDR-TB) cases among the 6.2 million TB patients notified in 2011. Although new tools are being introduced for the diagnosis of MDR-TB, there are operational and cost issues related to their use that require urgent attention, so that the poor and vulnerable can benefit. World Health Organization (WHO) estimated that globally, 3.7% of new cases and 20% of previously treated cases have MDR-TB. However, the scale-up of programmatic management of drug-resistant TB is slow, with only 60 000 MDR-TB cases notified to WHO in 2011. The overall proportion of treatment success of MDR-TB notified globally in 2009 was 48%, far below the global target of 75% success rate. Although new tools and drugs have the potential to significantly improve both case detection and treatment outcome, adequate health systems and human resources are needed for rapid uptake and proper implementation to have the impact required to eliminate TB. Hence, the global TB community should broaden its scope, seek intersectoral collaboration and advocate for cost reduction of new tools, while ensuring that the basics of TB control are implemented to reduce the TB burden through the current 'prevention through case management' paradigm. Respirology © 2013 Asian Pacific Society of Respirology. The World Health Organization retains copyright and all other rights in the manuscript as submitted for publication and has granted the Publisher permission for the reproduction of this article.
Innovated Conceptual Design of Loading Unloading Tool for Livestock at the Port
NASA Astrophysics Data System (ADS)
Mustakim, Achmad; Hadi, Firmanto
2018-03-01
The condition of the loading and unloading process of livestock in a number of Indonesian ports does not meet the principles of animal welfare, which causes cattle to lose weight and suffer injury when unloaded. Livestock loading and unloading is done by throwing cattle into the sea one by one, tying cattle so they hang from a sling strap, or pushing the cattle directly onto the berth. This process violates PP No. 82 of 2000, Articles 47 and 55, on animal welfare. The proposed innovation is a loading and unloading design with a garbarata (boarding bridge). The garbarata design applies the concept of a semi-horizontal hydraulic ladder that connects the ship and truck directly. This livestock unloading equipment design innovation is a combination of a fire extinguisher truck design and a bridge equipped with weightlifting equipment. Over a 10-year planning horizon, the garbarata requires a total cost of IDR 321,142,921, yields benefits of IDR 923,352,333, and has a Benefit-Cost Ratio (BCR) of 2.88. A BCR value >1 means the tool is feasible to apply. The proposed loading and unloading tool is estimated to be up to 1 hour faster than the existing method. It can also significantly reduce risks such as injury and livestock weight loss.
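The feasibility test quoted above is a benefit-cost ratio. The short sketch below reproduces the reported figure from the study's 10-year totals and adds a discounted variant; the 10% discount rate in the variant is an assumption, not a value from the study.

```python
def bcr(total_benefit: float, total_cost: float) -> float:
    """Benefit-cost ratio; values > 1 indicate a feasible investment."""
    return total_benefit / total_cost

# Reported 10-year totals for the garbarata design (IDR).
print(round(bcr(923_352_333, 321_142_921), 2))   # -> 2.88

def discounted_bcr(annual_benefits, annual_costs, rate=0.10):
    """Same test on annual cash flows discounted to present value
    (the 10% rate is an assumption, not taken from the study)."""
    pv = lambda flows: sum(x / (1 + rate) ** t for t, x in enumerate(flows))
    return pv(annual_benefits) / pv(annual_costs)
```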
Molecular Analysis of the In Situ Growth Rates of Subsurface Geobacter Species
Giloteaux, Ludovic; Barlett, Melissa; Chavan, Milind A.; Smith, Jessica A.; Williams, Kenneth H.; Wilkins, Michael; Long, Philip; Lovley, Derek R.
2013-01-01
Molecular tools that can provide an estimate of the in situ growth rate of Geobacter species could improve understanding of dissimilatory metal reduction in a diversity of environments. Whole-genome microarray analyses of a subsurface isolate of Geobacter uraniireducens, grown under a variety of conditions, identified a number of genes that are differentially expressed at different specific growth rates. Expression of two genes encoding ribosomal proteins, rpsC and rplL, was further evaluated with quantitative reverse transcription-PCR (qRT-PCR) in cells with doubling times ranging from 6.56 h to 89.28 h. Transcript abundance of rpsC correlated best (r2 = 0.90) with specific growth rates. Therefore, expression patterns of rpsC were used to estimate specific growth rates of Geobacter species during an in situ uranium bioremediation field experiment in which acetate was added to the groundwater to promote dissimilatory metal reduction. Initially, increased availability of acetate in the groundwater resulted in higher expression of Geobacter rpsC, and the increase in the number of Geobacter cells estimated with fluorescent in situ hybridization compared well with specific growth rates estimated from levels of in situ rpsC expression. However, in later phases, cell number increases were substantially lower than predicted from rpsC transcript abundance. This change coincided with a bloom of protozoa and increased attachment of Geobacter species to solid phases. These results suggest that monitoring rpsC expression may better reflect the actual rate that Geobacter species are metabolizing and growing during in situ uranium bioremediation than changes in cell abundance. PMID:23275510
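The calibration step described above rests on two simple relationships: the specific growth rate implied by a doubling time, mu = ln(2) / t_d, and a regression of transcript abundance against that rate. The sketch below illustrates both; the rpsC abundances and the resulting calibration are invented, not the study's microarray or qRT-PCR values.

```python
import numpy as np

def specific_growth_rate(doubling_time_h: float) -> float:
    """mu (per hour) from doubling time: mu = ln(2) / t_d."""
    return np.log(2) / doubling_time_h

# Doubling times spanning the range reported for the pure-culture experiments.
t_d = np.array([6.56, 12.0, 24.0, 48.0, 89.28])
mu = specific_growth_rate(t_d)

# Invented rpsC transcript abundances (arbitrary units) used to build a
# log-linear calibration of abundance against growth rate.
rpsC = np.array([8.1, 5.9, 4.2, 2.9, 2.1])
slope, intercept = np.polyfit(mu, np.log(rpsC), 1)

def mu_from_rpsC(abundance: float) -> float:
    """Invert the calibration to estimate in situ growth rate from rpsC abundance."""
    return (np.log(abundance) - intercept) / slope

print(f"estimated in situ growth rate: {mu_from_rpsC(5.0):.3f} per hour")
```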
NASA Astrophysics Data System (ADS)
Albano, R.; Sole, A.; Adamowski, J.; Mancusi, L.
2014-11-01
Efficient decision-making regarding flood risk reduction has become a priority for authorities and stakeholders in many European countries. Risk analysis methods and techniques are a useful tool for evaluating costs and benefits of possible interventions. Within this context, a methodology to estimate flood consequences was developed in this paper that is based on GIS, and integrated with a model that estimates the degree of accessibility and operability of strategic emergency response structures in an urban area. The majority of the currently available approaches do not properly analyse road network connections and dependencies within systems, and as such a loss of roads could cause significant damages and problems to emergency services in cases of flooding. The proposed model is unique in that it provides a maximum-impact estimation of flood consequences on the basis of the operability of the strategic emergency structures in an urban area, their accessibility, and connection within the urban system of a city (i.e. connection between aid centres and buildings at risk), in the emergency phase. The results of a case study in the Puglia region in southern Italy are described to illustrate the practical applications of this newly proposed approach. The main advantage of the proposed approach is that it allows for defining a hierarchy between different infrastructure in the urban area through the identification of particular components whose operation and efficiency are critical for emergency management. This information can be used by decision-makers to prioritize risk reduction interventions in flood emergencies in urban areas, given limited financial resources.
No, Yeon A; Ahn, Byeong Heon; Kim, Beom Joon; Kim, Myeung Nam; Hong, Chang Kwon
2016-01-01
For correction of this asymmetrical hypertrophy, botulinum toxin type A (BTxA) injection is a convenient treatment modality. Unfortunately, physical examination of the masseter muscle is not sufficient to estimate the exact difference in hypertrophied muscle volume. Two Koreans, one male and one female, with asymmetrical bilateral masseter hypertrophy were evaluated. BTxA (NABOTA(®), Daewoong, Co. Ltd., Seoul, Korea) was injected at the masseter muscle site with a total of 50 U (25 U on each side), and volume change was evaluated with three-dimensional (3D) CT image analysis. Maximum reduction of masseter hypertrophy was recognized at the 2-month follow-up, and the reduced muscle size started to restore after 3 months. Mean reduction of masseter muscle volume was 36% compared with baseline. The more hypertrophied side of the masseter muscle showed a 42% volume reduction at the 2-month follow-up, whereas the less hypertrophied side showed a 30% volume reduction. In conclusion, 3D CT image analysis may be an accurate evaluation tool for correction of asymmetrical masseter hypertrophy by botulinum toxin injection.
Juan-Blasco, M; Sabater-Muñoz, B; Pla, I; Argilés, R; Castañera, P; Jacas, J A; Ibáñez-Gual, M V; Urbaneja, A
2014-04-01
Area-wide sterile insect technique (SIT) programs assume that offspring reduction of the target population correlates with the mating success of the sterile males released. However, there is a lack of monitoring tools to prove the success of these programs in real-time. Field-cage tests were conducted under the environmental conditions of the Mediterranean coast of Spain to estimate: (a) the mating success of sterile Vienna-8 (V8) Ceratitis capitata males using molecular markers and (b) their efficacy to reduce C. capitata populations under six release ratios of wild females to wild males to V8 males (1:0:0, 1:1:0, 1:1:1, 1:1:5, 1:1:10, and 1:1:20). Statistical models were developed to predict: (a) the number of females captured in traps, (b) sperm ID (sterile or not) in spermathecae of the trapped females, and (c) the viable offspring produced, using release ratio and temperature as predictors. The number of females captured was affected by relative humidity. However, its influence in the model was low. Female captures were significantly higher in ratios 1:0:0 compared to ratios where V8 males were released. The proportion of V8 sperm in spermathecae increased with temperature and with the number of V8 males released, but leveled off between ratios 1:1:10 and 1:1:20. In all seasons, except winter (no offspring), viable offspring increased with temperature and was lowest for ratio 1:1:20. For the first time, a strong negative relationship between proportion of V8 sperm detected by molecular tools and C. capitata offspring was established. The models obtained should contribute to enhance the efficacy of SIT programs against this pest.
Garcia, Ana Maria
2009-01-01
A study of the Currituck Sound was initiated in 2005 to evaluate the water chemistry of the Sound and assess the effectiveness of management strategies. As part of this study, the Soil and Water Assessment Tool (SWAT) model was used to simulate current sediment and nutrient loadings for two distinct watersheds in the Currituck Sound basin and to determine the consequences of different water-quality management scenarios. The watersheds studied were (1) Tull Creek watershed, which has extensive row-crop cultivation and artificial drainage, and (2) West Neck Creek watershed, which drains urban areas in and around Virginia Beach, Virginia. The model simulated monthly streamflows with Nash-Sutcliffe model efficiency coefficients of 0.83 and 0.76 for Tull Creek and West Neck Creek, respectively. The daily sediment concentration coefficient of determination was 0.19 for Tull Creek and 0.36 for West Neck Creek. The coefficient of determination for total nitrogen was 0.26 for both watersheds and for dissolved phosphorus was 0.4 for Tull Creek and 0.03 for West Neck Creek. The model was used to estimate current (2006-2007) sediment and nutrient yields for the two watersheds. Total suspended-solids yield was 56 percent lower in the urban watershed than in the agricultural watershed. Total nitrogen export was 45 percent lower, and total phosphorus was 43 percent lower in the urban watershed than in the agricultural watershed. A management scenario with filter strips bordering the main channels was simulated for Tull Creek. The Soil and Water Assessment Tool model estimated a total suspended-solids yield reduction of 54 percent and total nitrogen and total phosphorus reductions of 21 percent and 29 percent, respectively, for the Tull Creek watershed.
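The calibration statistic quoted above is the Nash-Sutcliffe model efficiency. A minimal implementation is shown below; the observed and simulated streamflows in the example are invented and serve only to exercise the function.

```python
import numpy as np

def nash_sutcliffe(observed, simulated) -> float:
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Invented monthly streamflows (cubic feet per second).
obs = [12.0, 30.0, 55.0, 41.0, 22.0, 15.0]
sim = [10.0, 34.0, 50.0, 44.0, 25.0, 13.0]
print(round(nash_sutcliffe(obs, sim), 2))
```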
Unruh, Mark Aaron; Jung, Hye-Young; Kaushal, Rainu; Vest, Joshua R
2017-04-01
Follow-up with a primary care provider after hospital discharge has been associated with a reduced likelihood of readmission. However, primary care providers are frequently unaware of their patients' hospitalizations. Event notification may be an effective tool for reducing readmissions by notifying primary care providers when their patients have been admitted to and discharged from a hospital. We examined the effect of an event notification system on 30-day readmissions in the Bronx, New York. The Bronx has among the highest readmission rates in the country and is a particularly challenging setting to improve care due to the low socioeconomic status of the county and high rates of poor health behaviors among its residents. The study cohort included 2559 Medicare fee-for-service beneficiaries associated with 14 141 hospital admissions over the period January 2010 through June 2014. Linear regression models with beneficiary-level fixed-effects were used to estimate the impact of event notifications on readmissions by comparing the likelihood of rehospitalization for a beneficiary before and after event notifications were active. The unadjusted 30-day readmission rate when event notifications were not active was 29.5% compared to 26.5% when alerts were active. Regression estimates indicated that active hospitalization alert services were associated with a 2.9 percentage point reduction in the likelihood of readmission (95% confidence interval: -5.5, -0.4). Alerting providers through event notifications may be an effective tool for improving the quality and efficiency of care among high-risk populations. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Forensic surface metrology: tool mark evidence.
Gambino, Carol; McLaughlin, Patrick; Kuo, Loretta; Kammerman, Frani; Shenkin, Peter; Diaczuk, Peter; Petraco, Nicholas; Hamby, James; Petraco, Nicholas D K
2011-01-01
Over the last several decades, forensic examiners of impression evidence have come under scrutiny in the courtroom due to analysis methods that rely heavily on subjective morphological comparisons. Currently, there is no universally accepted system that generates numerical data to independently corroborate visual comparisons. Our research attempts to develop such a system for tool mark evidence, proposing a methodology that objectively evaluates the association of striated tool marks with the tools that generated them. In our study, 58 primer shear marks on 9 mm cartridge cases, fired from four Glock model 19 pistols, were collected using high-resolution white light confocal microscopy. The resulting three-dimensional surface topographies were filtered to extract all "waviness surfaces"-the essential "line" information that firearm and tool mark examiners view under a microscope. Extracted waviness profiles were processed with principal component analysis (PCA) for dimension reduction. Support vector machines (SVM) were used to make the profile-gun associations, and conformal prediction theory (CPT) for establishing confidence levels. At the 95% confidence level, CPT coupled with PCA-SVM yielded an empirical error rate of 3.5%. Complementary, bootstrap-based computations for estimated error rates were 0%, indicating that the error rate for the algorithmic procedure is likely to remain low on larger data sets. Finally, suggestions are made for practical courtroom application of CPT for assigning levels of confidence to SVM identifications of tool marks recorded with confocal microscopy. Copyright © 2011 Wiley Periodicals, Inc.
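The dimension-reduction and classification stages described above (PCA followed by an SVM) can be sketched with scikit-learn as below. The profiles are synthetic stand-ins for extracted waviness data, and the confocal acquisition, waviness filtering, and conformal prediction steps are not reproduced.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for extracted waviness profiles: 58 profiles x 500 points,
# labelled by the gun (0-3) that produced them.
n_profiles, n_points, n_guns = 58, 500, 4
labels = rng.integers(0, n_guns, size=n_profiles)
signatures = rng.normal(size=(n_guns, n_points))      # one "true" mark per gun
profiles = signatures[labels] + 0.5 * rng.normal(size=(n_profiles, n_points))

clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="linear"))
scores = cross_val_score(clf, profiles, labels, cv=5)
print("cross-validated accuracy:", scores.mean().round(2))
```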
Development of a tool to improve the quality of decision making in atrial fibrillation
2011-01-01
Background Decision-making about appropriate therapy to reduce the stroke risk associated with non-valvular atrial fibrillation (NVAF) involves the consideration of trade-offs among the benefits, risks, and inconveniences of different treatment options. The objective of this paper is to describe the development of a decision support tool for NVAF based on the provision of individualized risk estimates for stroke and bleeding and on preparing patients to communicate with their physicians about their values and potential treatment options. Methods We developed a tool based on the principles of the International Patient Decision Aids Standards. The tool focuses on the patient-physician dyad as the decision-making unit and emphasizes improving the interaction between the two. It is built on the recognition that the application of patient values to a specific treatment decision is complex and that the final treatment choice is best made through a process of patient-clinician communication. Results The tool provides education incorporating patients' illness perceptions to explain the relationship between NVAF and stroke, and then presents individualized risk estimates, derived using separate risk calculators for stroke and bleeding over a clinically meaningful time period (5 years) associated with no treatment, aspirin, and warfarin. Sequelae of both stroke and bleeding outcomes are also described. Patients are encouraged to verbalize how they value the incremental risks and benefits associated with each option and write down specific concerns to address with their physician. A physician prompt to encourage patients to discuss their opinions is included as part of the decision support tool. In pilot testing with 11 participants (mean age 78 ± 9 years, 64% with ≤ high-school education), 8 (72%) rated ease of completion as "very easy," and 9 (81%) rated amount of information as "just right." Conclusions The risks and benefits of different treatment options for reduction of stroke in NVAF vary widely according to patients' comorbidities. This tool facilitates the provision of individualized outcome data and encourages patients to communicate with their physicians about these risks and benefits. Future studies will examine whether use of the tool is associated with improved quality of decision making. PMID:21977943
van Tongeren, Martie; Lamb, Judith; Cherrie, John W; MacCalman, Laura; Basinas, Ioannis; Hesse, Susanne
2017-10-01
Tier 1 exposure tools recommended for use under REACH are designed to easily identify situations that may pose a risk to health through conservative exposure predictions. However, no comprehensive evaluation of the performance of the lower tier tools has previously been carried out. The ETEAM project aimed to evaluate several lower tier exposure tools (ECETOC TRA, MEASE, and EMKG-EXPO-TOOL) as well as one higher tier tool (STOFFENMANAGER®). This paper describes the results of the external validation of tool estimates using measurement data. Measurement data were collected from a range of providers, both in Europe and United States, together with contextual information. Individual measurement and aggregated measurement data were obtained. The contextual information was coded into the tools to obtain exposure estimates. Results were expressed as percentage of measurements exceeding the tool estimates and presented by exposure category (non-volatile liquid, volatile liquid, metal abrasion, metal processing, and powder handling). We also explored tool performance for different process activities as well as different scenario conditions and exposure levels. In total, results from nearly 4000 measurements were obtained, with the majority for the use of volatile liquids and powder handling. The comparisons of measurement results with tool estimates suggest that the tools are generally conservative. However, the tools were more conservative when estimating exposure from powder handling compared to volatile liquids and other exposure categories. In addition, results suggested that tool performance varies between process activities and scenario conditions. For example, tools were less conservative when estimating exposure during activities involving tabletting, compression, extrusion, pelletisation, granulation (common process activity PROC14) and transfer of substance or mixture (charging and discharging) at non-dedicated facilities (PROC8a; powder handling only). With the exception of STOFFENMANAGER® (for estimating exposure during powder handling), the tools were less conservative for scenarios with lower estimated exposure levels. This is the most comprehensive evaluation of the performance of REACH exposure tools carried out to date. The results show that, although generally conservative, the tools may not always achieve the performance specified in the REACH guidance, i.e. using the 75th or 90th percentile of the exposure distribution for the risk characterisation. Ongoing development, adjustment, and recalibration of the tools with new measurement data are essential to ensure adequate characterisation and control of worker exposure to hazardous substances. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
Code of Federal Regulations, 2010 CFR
2010-07-01
§ 102-80.50: Are Federal agencies responsible for identifying/estimating safety and environmental management risks and for appropriate risk reduction strategies? (Safety and Environmental Management Risks and Risk Reduction Strategies.)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lapota, D.; Moskowitz, G.; Grovhoug, J.
1993-03-01
Phytoplankton bioassays have been used as biological tools in assessing environmental contamination. In our laboratory, a simple bioassay has been developed which measures the light output from bioluminescent dinoflagellates for assessment of toxic effects when exposed to a single toxicant or mixture. Successful use of this type of bioassay has provided data on the acute response and has demonstrated the chronic effects, from hours up to 11 days, on dinoflagellate cells of Pyrocystis lunula and Gonyaulax polyedra upon exposure to several metals and storm drain effluent. Dinoflagellate cells were exposed to various concentrations of tributyltin chloride (TBTCl), copper (II) sulfate (CuSO4), zinc sulfate (ZnSO4), or storm drain effluent. Stimulable bioluminescence was measured at each test period (3 or 4 h, 24 h, 48 h, 72 h, etc.) following setup for all assays. Cells were kept in the dark for 3 or 4 h prior to testing. Stirring the cells within the chamber stimulated maximum bioluminescence from the dinoflagellates. An IC50 (an estimated concentration that is likely to cause a 50% reduction in light output) was estimated for all assays. The trend of light reduction in response to increasing dose level of the test article was observed in all assays. A reduction in light output was measured from cells exposed to 1.6, 4.2, and 12.8 ug/L TBTCl. The IC50 decreased from 8.5 ug/L at 120 h to 3.0 ug/L at 264 h. The cells exposed to 6.25%, 12.5%, and 25.0% storm drain effluent exhibited a statistically significant (P=0.05) reduction in light output in as little as 3 h of exposure. Keywords: Plankton, Oceanography, Bioluminescence.
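An IC50 of the kind reported above is typically obtained by fitting a log-logistic (Hill-type) inhibition curve to light output versus concentration. The sketch below does this with scipy; the dose-response points are invented, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def inhibition(conc, ic50, hill):
    """Fraction of control light output remaining at a given concentration."""
    return 1.0 / (1.0 + (conc / ic50) ** hill)

# Invented dose-response data: TBTCl concentration (ug/L) vs. fraction of
# control bioluminescence (not the study's measurements).
conc = np.array([0.5, 1.6, 4.2, 12.8, 25.0])
light = np.array([0.95, 0.80, 0.55, 0.20, 0.08])

(ic50, hill), _ = curve_fit(inhibition, conc, light, p0=[5.0, 1.0])
print(f"estimated IC50: {ic50:.1f} ug/L (Hill slope {hill:.2f})")
```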
2011-01-01
Background Globally syphilis is an important yet preventable cause of stillbirth, neonatal mortality and morbidity. Objectives This review sought to estimate the effect of detection and treatment of active syphilis in pregnancy with at least 2.4MU benzathine penicillin (or equivalent) on syphilis-related stillbirths and neonatal mortality. Methods We conducted a systematic literature review of multiple databases to identify relevant studies. Data were abstracted into standardised tables and the quality of evidence was assessed using adapted GRADE criteria. Where appropriate, meta-analyses were undertaken. Results Moderate quality evidence (3 studies) supports a reduction in the incidence of clinical congenital syphilis of 97% (95% c.i 93 – 98%) with detection and treatment of women with active syphilis in pregnancy with at least 2.4MU penicillin. The results of meta-analyses suggest that treatment with penicillin is associated with an 82% reduction in stillbirth (95% c.i. 67 – 90%) (8 studies), a 64% reduction in preterm delivery (95% c.i. 53 – 73%) (7 studies) and an 80% reduction in neonatal deaths (95% c.i. 68 – 87%) (5 studies). Although these effect estimates were large and remarkably consistent across studies, few of the studies adjusted for potential confounding factors and thus the overall quality of the evidence was considered low. However, given these large observed effects and a clear biological mechanism for effectiveness the GRADE recommendation is strong. Conclusion Detection and appropriate, timely penicillin treatment is a highly effective intervention to reduce adverse syphilis-related pregnancy outcomes. More research is required to identify the most cost-effective strategies for achieving maximum coverage of screening for all pregnant women, and access to treatment if required. PMID:21501460
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morrow, III, William R.; Hasanbeigi, Ali; Xu, Tengfang
2012-12-03
India’s cement industry is the second largest in the world behind China, with annual cement production of 168 Mt in 2010, which accounted for slightly greater than six percent of the world’s annual cement production in the same year. To produce that amount of cement, the industry consumed roughly 700 PJ of fuel and 14.7 TWh of electricity. We identified and analyzed 22 energy efficiency technologies and measures applicable to the processes in the Indian cement industry. The Conservation Supply Curve (CSC) used in this study is an analytical tool that captures both the engineering and the economic perspectives of energy conservation. Using a bottom-up electricity CSC model and compared to an electricity price forecast, the cumulative cost-effective plant-level electricity savings potential for the Indian cement industry for 2010-2030 is estimated to be 83 TWh, and the cumulative plant-level technical electricity saving potential is 89 TWh during the same period. The grid-level CO2 emissions reduction associated with cost-effective electricity savings is 82 Mt CO2 and the electric grid-level CO2 emission reduction associated with technical electricity saving potential is 88 Mt CO2. Compared to a fuel price forecast, an estimated cumulative cost-effective fuel savings potential of 1,029 PJ with associated CO2 emission reduction of 97 Mt CO2 during 2010-2030 is possible. In addition, a sensitivity analysis with respect to the discount rate used is conducted to assess the effect of changes in this parameter on the results. The result of this study gives a comprehensive and easy to understand perspective to the Indian cement industry and policy makers about the energy efficiency potential and its associated cost over the next twenty years.
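The quantity that orders measures along a conservation supply curve is the cost of conserved energy: the annualized cost of a measure divided by its annual energy savings, compared against the energy price. The sketch below illustrates the calculation; the two measures, their costs and savings, the 10% discount rate, and the electricity price are all invented for illustration and are not taken from the study.

```python
def annualization_factor(discount_rate: float, lifetime_yr: int) -> float:
    """Capital recovery factor: d / (1 - (1 + d)^-n)."""
    d = discount_rate
    return d / (1.0 - (1.0 + d) ** -lifetime_yr)

def cost_of_conserved_energy(capital_cost, annual_om_cost, annual_savings_kwh,
                             discount_rate=0.10, lifetime_yr=10):
    """Levelized cost (currency per kWh saved) of an efficiency measure."""
    crf = annualization_factor(discount_rate, lifetime_yr)
    return (capital_cost * crf + annual_om_cost) / annual_savings_kwh

# Invented measures (not the study's 22): (name, capital, O&M per year, kWh saved per year)
measures = [("high-efficiency classifier", 400_000, 5_000, 1_200_000),
            ("VFD on fan system",          250_000, 2_000,   500_000)]
electricity_price = 0.07  # currency units per kWh, an assumption

for name, cap, om, kwh in sorted(
        measures, key=lambda m: cost_of_conserved_energy(m[1], m[2], m[3])):
    cce = cost_of_conserved_energy(cap, om, kwh)
    flag = "(cost-effective)" if cce < electricity_price else "(not cost-effective)"
    print(f"{name}: {cce:.3f}/kWh {flag}")
```

Measures whose cost of conserved energy falls below the forecast energy price make up the cost-effective potential; the full curve is simply this list sorted by cost and cumulated over savings.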
Integrating economic and biophysical data in assessing cost-effectiveness of buffer strip placement.
Balana, Bedru Babulo; Lago, Manuel; Baggaley, Nikki; Castellazzi, Marie; Sample, James; Stutter, Marc; Slee, Bill; Vinten, Andy
2012-01-01
The European Union Water Framework Directive (WFD) requires Member States to set water quality objectives and identify cost-effective mitigation measures to achieve "good status" in all waters. However, costs and effectiveness of measures vary both within and between catchments, depending on factors such as land use and topography. The aim of this study was to develop a cost-effectiveness analysis framework for integrating estimates of phosphorus (P) losses from land-based sources, potential abatement using riparian buffers, and the economic implications of buffers. Estimates of field-by-field P exports and routing were based on crop risk and field slope classes. Buffer P trapping efficiencies were based on literature metadata analysis. Costs of placing buffers were based on foregone farm gross margins. An integrated optimization model of cost minimization was developed and solved for different P reduction targets for the Rescobie Loch catchment in eastern Scotland. A target mean annual P load reduction of 376 kg to the loch to achieve good status was identified. Assuming all the riparian fields initially have the 2-m buffer strip required by the General Binding Rules (part of the WFD in Scotland), the model gave good predictions of P loads (345-481 kg P). The modeling results show that riparian buffers alone cannot achieve the required P load reduction (up to 54% of P can be removed). In the medium P input scenario, average costs vary from £38 to £176 per kg of P at 10% and 54% P reduction, respectively. The framework demonstrates a useful tool for exploring cost-effective targeting of environmental measures. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
Foster, E; Matthews, J N S; Lloyd, J; Marshall, L; Mathers, J C; Nelson, M; Barton, K L; Wrieden, W L; Cornelissen, P; Harris, J; Adamson, A J
2008-01-01
A number of methods have been developed to assist subjects in providing an estimate of portion size but their application in improving portion size estimation by children has not been investigated systematically. The aim was to develop portion size assessment tools for use with children and to assess the accuracy of children's estimates of portion size using the tools. The tools were food photographs, food models and an interactive portion size assessment system (IPSAS). Children (n = 201), aged 4-16 years, were supplied with known quantities of food to eat, in school. Food leftovers were weighed. Children estimated the amount of each food using each tool, 24 h after consuming the food. The age-specific portion sizes represented were based on portion sizes consumed by children in a national survey. Significant differences were found between the accuracy of estimates using the three tools. Children of all ages performed well using the IPSAS and food photographs. The accuracy and precision of estimates made using the food models were poor. For all tools, estimates of the amount of food served were more accurate than estimates of the amount consumed. Issues relating to reporting of foods left over which impact on estimates of the amounts of foods actually consumed require further study. The IPSAS has shown potential for assessment of dietary intake with children. Before practical application in assessment of dietary intake of children the tool would need to be expanded to cover a wider range of foods and to be validated in a 'real-life' situation.
Method to monitor HC-SCR catalyst NOx reduction performance for lean exhaust applications
Viola, Michael B [Macomb Township, MI; Schmieg, Steven J [Troy, MI; Sloane, Thompson M [Oxford, MI; Hilden, David L [Shelby Township, MI; Mulawa, Patricia A [Clinton Township, MI; Lee, Jong H [Rochester Hills, MI; Cheng, Shi-Wai S [Troy, MI
2012-05-29
A method for initiating a regeneration mode in selective catalytic reduction device utilizing hydrocarbons as a reductant includes monitoring a temperature within the aftertreatment system, monitoring a fuel dosing rate to the selective catalytic reduction device, monitoring an initial conversion efficiency, selecting a determined equation to estimate changes in a conversion efficiency of the selective catalytic reduction device based upon the monitored temperature and the monitored fuel dosing rate, estimating changes in the conversion efficiency based upon the determined equation and the initial conversion efficiency, and initiating a regeneration mode for the selective catalytic reduction device based upon the estimated changes in conversion efficiency.
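A hedged sketch of the monitoring logic described in this claim is given below. It tracks conversion efficiency as (NOx in - NOx out) / NOx in, selects a degradation equation from temperature and dosing rate, and triggers regeneration when the estimated efficiency falls below a threshold; the temperature and dosing bins, decay rates, and threshold are illustrative assumptions, not values from the patent.

```python
def conversion_efficiency(nox_in_gps: float, nox_out_gps: float) -> float:
    """Fraction of inlet NOx converted across the SCR device."""
    return (nox_in_gps - nox_out_gps) / nox_in_gps

def select_decay_rate(temp_c: float, dosing_rate_gps: float) -> float:
    """Pick a determined efficiency-decay rate (per hour) from operating conditions.
    Bin boundaries and rates are illustrative assumptions."""
    if temp_c < 300 and dosing_rate_gps > 0.5:
        return 0.020   # cooler bed with heavy HC dosing: faster degradation (assumption)
    return 0.005       # otherwise: slow degradation (assumption)

def should_regenerate(initial_eff: float, temp_c: float, dosing_rate_gps: float,
                      hours: float, threshold: float = 0.70) -> bool:
    """Estimate the decayed efficiency and compare against a regeneration threshold."""
    decay = select_decay_rate(temp_c, dosing_rate_gps)
    estimated_eff = initial_eff * (1.0 - decay) ** hours
    return estimated_eff < threshold

# Example: after 20 hours of cool, heavily dosed operation the estimated
# efficiency falls below the threshold and regeneration is requested.
print(should_regenerate(initial_eff=0.90, temp_c=280, dosing_rate_gps=0.8, hours=20))
```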
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Patrick Gonzalez; Sandra Brown
2006-06-30
The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st and July 30th 2006. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool. Work is being carried out in Brazil, Belize, Chile, Peru and the USA.
Implications of Overdiagnosis: Impact on Screening Mammography Practices
Morris, Elizabeth; Feig, Stephen A.; Drexler, Madeline
2015-01-01
This review article explores the issue of overdiagnosis in screening mammography. Overdiagnosis is the screen detection of a breast cancer, histologically confirmed, that might not otherwise become clinically apparent during the lifetime of the patient. While screening mammography is an imperfect tool, it remains the best tool we have to diagnose breast cancer early, before a patient is symptomatic and at a time when chances of survival and options for treatment are most favorable. In 2015, an estimated 231,840 new cases of breast cancer (excluding ductal carcinoma in situ) will be diagnosed in the United States, and some 40,290 women will die. Despite these data, screening mammography for women ages 40–69 has contributed to a substantial reduction in breast cancer mortality, and organized screening programs have led to a shift from late-stage diagnosis to early-stage detection. Current estimates of overdiagnosis in screening mammography vary widely, from 0% to upwards of 30% of diagnosed cancers. This range reflects the fact that measuring overdiagnosis is not a straightforward calculation, but usually one based on different sets of assumptions and often biased by methodological flaws. The recent development of tomosynthesis, which creates high-resolution, three-dimensional images, has increased breast cancer detection while reducing false recalls. Because the greatest harm of overdiagnosis is overtreatment, the key goal should not be less diagnosis but better treatment decision tools. (Population Health Management 2015;18:S3–S11) PMID:26414384
A Methodology for Developing Army Acquisition Strategies for an Uncertain Future
2007-01-01
manuscript for publication. Acronyms: ABP (Assumption-Based Planning), ACEIT (Automated Cost Estimating Integrated Tool), ACR (Armored Cavalry Regiment), ACTD... decisions. For example, they employ the Automated Cost Estimating Integrated Tools (ACEIT) to simplify life cycle cost estimates; other tools are
Tool for the Reduction and Assessment of Chemical and other Environmental Impacts
TRACI, the Tool for the Reduction and Assessment of Chemical and other environmental Impacts, has been developed by the US Environmental Protection Agency’s National Risk Management Research Laboratory to facilitate the characterization of stressors that have potential effects, ...
Bhalla, Kavi; Harrison, James E
2016-04-01
Burden of disease and injury methods can be used to summarise and compare the effects of conditions in terms of disability-adjusted life years (DALYs). Burden estimation methods are not inherently complex. However, as commonly implemented, the methods include complex modelling and estimation. To provide a simple and open-source software tool that allows estimation of incidence-DALYs due to injury, given data on incidence of deaths and non-fatal injuries. The tool includes a default set of estimation parameters, which can be replaced by users. The tool was written in Microsoft Excel. All calculations and values can be seen and altered by users. The parameter sets currently used in the tool are based on published sources. The tool is available without charge online at http://calculator.globalburdenofinjuries.org. To use the tool with the supplied parameter sets, users need to only paste a table of population and injury case data organised by age, sex and external cause of injury into a specified location in the tool. Estimated DALYs can be read or copied from tables and figures in another part of the tool. In some contexts, a simple and user-modifiable burden calculator may be preferable to undertaking a more complex study to estimate the burden of disease. The tool and the parameter sets required for its use can be improved by user innovation, by studies comparing DALYs estimates calculated in this way and in other ways, and by shared experience of its use. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
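The arithmetic such a calculator implements is, in simplified form (without discounting or age weighting), DALYs = YLL + YLD, where YLL = deaths x remaining life expectancy at the age of death and YLD = incident cases x disability weight x average duration. A minimal sketch follows; the example numbers are invented, and the parameter sets supplied with the actual spreadsheet tool are more detailed.

```python
def dalys(deaths, life_expectancy_at_death,
          incident_cases, disability_weight, avg_duration_yr):
    """Simplified incidence DALYs = YLL + YLD (no discounting or age weights)."""
    yll = deaths * life_expectancy_at_death
    yld = incident_cases * disability_weight * avg_duration_yr
    return yll + yld, yll, yld

# Invented example: road-traffic injuries in one age-sex group.
total, yll, yld = dalys(deaths=120, life_expectancy_at_death=42.0,
                        incident_cases=8_000, disability_weight=0.07,
                        avg_duration_yr=0.5)
print(f"YLL={yll:.0f}, YLD={yld:.0f}, DALYs={total:.0f}")
```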
Reliability Assessment Of Conceptual Launch Vehicles
NASA Technical Reports Server (NTRS)
Bloomer, Lisa A.
2005-01-01
Planning is underway for new NASA missions to the Moon and to Mars. These missions carry a great deal of risk, as the Challenger and Columbia accidents demonstrate. In order to minimize the risks to the crew and the mission, risk reduction must be done at every stage, not only in quality manufacturing, but also in design. It is necessary, therefore, to be able to compare the risks posed in different launch vehicle designs. Further, these designs have not yet been implemented, so it is necessary to compare these risks without being able to test the vehicles themselves. This paper will discuss some of the issues involved in this type of comparison. It will start with a general discussion of reliability estimation. It will continue with a short look at some software designed to make this estimation easier and faster. It will conclude with a few recommendations for future tools.
Adaptive Optics Images of the Galactic Center: Using Empirical Noise-maps to Optimize Image Analysis
NASA Astrophysics Data System (ADS)
Albers, Saundra; Witzel, Gunther; Meyer, Leo; Sitarski, Breann; Boehle, Anna; Ghez, Andrea M.
2015-01-01
Adaptive Optics images are one of the most important tools in studying our Galactic Center. In-depth knowledge of the noise characteristics is crucial to optimally analyze this data. Empirical noise estimates - often represented by a constant value for the entire image - can be greatly improved by computing the local detector properties and photon noise contributions pixel by pixel. To comprehensively determine the noise, we create a noise model for each image using the three main contributors—photon noise of stellar sources, sky noise, and dark noise. We propagate the uncertainties through all reduction steps and analyze the resulting map using Starfinder. The estimation of local noise properties helps to eliminate fake detections while improving the detection limit of fainter sources. We predict that a rigorous understanding of noise allows a more robust investigation of the stellar dynamics in the center of our Galaxy.
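As an illustration of the kind of per-pixel noise model described (source photon noise, sky noise, and detector noise combined in quadrature), here is a minimal sketch in Python; the gain and detector-noise values are placeholders, not the instrument values used by the authors, and the real pipeline propagates these terms through every reduction step before running Starfinder.

```python
import numpy as np

def noise_map(image_adu, sky_adu, gain_e_per_adu=4.0, dark_plus_read_e=15.0, n_frames=1):
    """Per-pixel 1-sigma noise estimate (in ADU) for a stacked AO frame.

    A minimal three-term sketch (stellar photon noise, sky noise, detector
    dark/read noise); the gain and detector values are illustrative only.
    """
    source_e = np.clip(image_adu - sky_adu, 0, None) * gain_e_per_adu  # photons from stars
    sky_e = np.clip(sky_adu, 0, None) * gain_e_per_adu                 # photons from sky
    var_e = source_e + sky_e + dark_plus_read_e ** 2                   # Poisson + detector terms
    return np.sqrt(var_e / n_frames) / gain_e_per_adu                  # back to ADU, frame-averaged

# Example: a flat sky background with one bright star
img = np.full((64, 64), 100.0)
img[32, 32] = 5000.0
sigma = noise_map(img, sky_adu=np.full_like(img, 100.0))
print(sigma[32, 32], sigma[0, 0])  # brighter pixels carry larger absolute noise
```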
Carrà, Giuseppe; Crocamo, Cristina; Humphris, Gerald; Tabacchi, Tommaso; Bartoli, Francesco; Neufeind, Julia; Scherbaum, Norbert; Baldacchino, Alexander
2017-12-01
Increasing awareness of, and information about, overdose risk is an appropriate approach in risk reduction. e-Health technology in substance use disorders is an opportunity to support behavioral changes related to public health concerns. The present study aimed to evaluate the short-term impact of an innovative e-health psychoeducational software, the Overdose RIsk InfOrmatioN (ORION) tool. The ORION programme provided relevant information to opioid-dependent individuals about the risk of suffering a drug overdose as a result of high-risk and dysfunctional behaviors. Seven aggregate risk factors were identified through a systematic review and their outputs included in a risk estimation model. We recruited 194 opioid-dependent treatment-seeking individuals from the United Kingdom, Germany, Italy, and Denmark. All participants completed the General Self-Efficacy (GSE) Scale at study entry and after using the software. We found comparable pre- and post-ORION administration mean GSE scores (SD), 28.49 (5.50) and 28.32 (5.90), respectively (p = 0.297). However, there was an inverse correlation between the number of risk factors and reported levels of self-efficacy (p < 0.001). ORION was able to identify individuals who are most in need of reducing their modifiable risk factors with appropriate interventions. However, a one-shot e-health tool cannot influence complex domains such as self-efficacy unless it is used with other effective interventions. Nonetheless, the ORION tool is unique in its style and content of delivery, that is, translating a combination of risks into a clear estimate, and will need further development such as (a) integration in smartphone-based e-health apps and (b) testing in other high-risk populations.
Phylogenetic Tools for Generalized HIV-1 Epidemics: Findings from the PANGEA-HIV Methods Comparison.
Ratmann, Oliver; Hodcroft, Emma B; Pickles, Michael; Cori, Anne; Hall, Matthew; Lycett, Samantha; Colijn, Caroline; Dearlove, Bethany; Didelot, Xavier; Frost, Simon; Hossain, A S Md Mukarram; Joy, Jeffrey B; Kendall, Michelle; Kühnert, Denise; Leventhal, Gabriel E; Liang, Richard; Plazzotta, Giacomo; Poon, Art F Y; Rasmussen, David A; Stadler, Tanja; Volz, Erik; Weis, Caroline; Leigh Brown, Andrew J; Fraser, Christophe
2017-01-01
Viral phylogenetic methods contribute to understanding how HIV spreads in populations, and thereby help guide the design of prevention interventions. So far, most analyses have been applied to well-sampled concentrated HIV-1 epidemics in wealthy countries. To direct the use of phylogenetic tools to where the impact of HIV-1 is greatest, the Phylogenetics And Networks for Generalized HIV Epidemics in Africa (PANGEA-HIV) consortium generates full-genome viral sequences from across sub-Saharan Africa. Analyzing these data presents new challenges, since epidemics are principally driven by heterosexual transmission and a smaller fraction of cases is sampled. Here, we show that viral phylogenetic tools can be adapted and used to estimate epidemiological quantities of central importance to HIV-1 prevention in sub-Saharan Africa. We used a community-wide methods comparison exercise on simulated data, where participants were blinded to the true dynamics they were inferring. Two distinct simulations captured generalized HIV-1 epidemics, before and after a large community-level intervention that reduced infection levels. Five research groups participated. Structured coalescent modeling approaches were most successful: phylogenetic estimates of HIV-1 incidence, incidence reductions, and the proportion of transmissions from individuals in their first 3 months of infection correlated with the true values (Pearson correlation > 90%), with small bias. However, on some simulations, true values were markedly outside reported confidence or credibility intervals. The blinded comparison revealed current limits and strengths in using HIV phylogenetics in challenging settings, provided benchmarks for future methods' development, and supports using the latest generation of phylogenetic tools to advance HIV surveillance and prevention. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
2011-01-01
Background Annually over 520,000 newborns die from neonatal sepsis, and 60,000 more from tetanus. Estimates of the effect of clean birth and postnatal care practices are required for evidence-based program planning. Objective To review the evidence for clean birth and postnatal care practices and estimate the effect on neonatal mortality from sepsis and tetanus for the Lives Saved Tool (LiST). Methods We conducted a systematic review of multiple databases. Data were abstracted into standard tables and assessed by GRADE criteria. Where appropriate, meta-analyses were undertaken. For interventions with low quality evidence but a strong GRADE recommendation, a Delphi process was conducted. Results Low quality evidence supports a reduction in all-cause neonatal mortality (19% (95% c.i. 1–34%)), cord infection (30% (95% c.i. 20–39%)) and neonatal tetanus (49% (95% c.i. 35–62%)) with birth attendant handwashing. Very low quality evidence supports a reduction in neonatal tetanus mortality with a clean birth surface (93% (95% c.i. 77–100%)) and no relationship between a clean perineum and tetanus. Low quality evidence supports a reduction of neonatal tetanus with facility birth (68% (95% c.i. 47–88%)). No relationship was found between birth place and cord infections or sepsis mortality. For postnatal clean practices, all-cause mortality is reduced with chlorhexidine cord applications in the first 24 hours of life (34% (95% c.i. 5–54%), moderate quality evidence) and antimicrobial cord applications (63% (95% c.i. 41–86%), low quality evidence). One study of postnatal maternal handwashing reported reductions in all-cause mortality (44% (95% c.i. 18–62%)) and cord infection (24% (95% c.i. 5–40%)). Given the low quality of evidence, a Delphi expert opinion process was undertaken. Thirty experts reached consensus regarding reduction of neonatal sepsis deaths by clean birth practices at home (15% (IQR 10–20)) or in a facility (27% (IQR 24–36)), and by clean postnatal care practices (40% (IQR 25–50)). The panel estimated that neonatal tetanus mortality was reduced by clean birth practices at home (30% (IQR 20–30)), or in a facility (38% (IQR 34–40)), and by clean postnatal care practices (40% (IQR 30–50)). Conclusion According to expert opinion, clean birth and particularly postnatal care practices are effective in reducing neonatal mortality from sepsis and tetanus. Further research is required regarding optimal implementation strategies. PMID:21501428
Isma’eel, Hussain A.; Sakr, George E.; Almedawar, Mohamad M.; Fathallah, Jihan; Garabedian, Torkom; Eddine, Savo Bou Zein
2015-01-01
Background High dietary salt intake is directly linked to hypertension and cardiovascular diseases (CVDs). Predicting behaviors regarding salt intake habits is vital to guide interventions and increase their effectiveness. We aim to compare the accuracy of an artificial neural network (ANN)-based tool that predicts behavior from key knowledge questions along with clinical data in a high cardiovascular risk cohort relative to the least-squares models (LSM) method. Methods We collected knowledge, attitude and behavior data on 115 patients. A behavior score was calculated to classify patients’ behavior towards reducing salt intake. Accuracy comparison between ANN and regression analysis was calculated using the bootstrap technique with 200 iterations. Results Starting from a 69-item questionnaire, a reduced model was developed and included eight knowledge items found to result in the highest accuracy of 62% (CI 58–67%). The best prediction accuracy in the full and reduced models was attained by ANN at 66% and 62%, respectively, compared to full and reduced LSM at 40% and 34%, respectively. The average relative increase in accuracy overall in the full and reduced models was 82% and 102%, respectively. Conclusions Using ANN modeling, we can predict salt reduction behaviors with 66% accuracy. The statistical model has been implemented in an online calculator and can be used in clinics to estimate the patient’s behavior. This will support future research to further establish the clinical utility of this tool to guide therapeutic salt reduction interventions in high cardiovascular risk individuals. PMID:26090333
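A minimal sketch of the bootstrap accuracy comparison described above, using scikit-learn as a stand-in (the original tool was a purpose-built ANN and least-squares model; the data below are synthetic, and the network size, iteration count, and scoring rule are assumptions):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.linear_model import LinearRegression
from sklearn.utils import resample

rng = np.random.default_rng(0)
# Synthetic stand-in for the 115-patient dataset with 8 reduced knowledge items.
X = rng.normal(size=(115, 8))
y = (X[:, :3].sum(axis=1) + rng.normal(scale=1.0, size=115) > 0).astype(int)  # behaviour class

def bootstrap_accuracy(make_model, n_iter=200):
    accs = []
    idx = np.arange(len(y))
    for i in range(n_iter):
        train = resample(idx, random_state=i)      # bootstrap sample (with replacement)
        test = np.setdiff1d(idx, train)            # out-of-bag cases used for testing
        model = make_model().fit(X[train], y[train])
        pred = np.round(model.predict(X[test])).astype(int).clip(0, 1)
        accs.append((pred == y[test]).mean())
    return np.mean(accs)

ann_acc = bootstrap_accuracy(lambda: MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000))
lsm_acc = bootstrap_accuracy(lambda: LinearRegression())  # least-squares stand-in
print(f"ANN {ann_acc:.2f} vs least-squares {lsm_acc:.2f}")
```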
Modeling the Impact of Nutrition Interventions on Birth Outcomes in the Lives Saved Tool (LiST).
Heidkamp, Rebecca; Clermont, Adrienne; Phillips, Erica
2017-11-01
Background: Negative birth outcomes [small-for-gestational age (SGA) and preterm birth (PTB)] are common in low- and middle-income countries and have important subsequent health and developmental impacts on children. There are numerous nutritional and non-nutritional interventions that can decrease the risk of negative birth outcomes and reduce subsequent risk of mortality and growth faltering. Objective: The objective of this article was to review the current evidence for the impact of nutritional interventions in pregnancy [calcium supplementation, iron and folic acid supplementation, multiple micronutrient (MMN) supplementation, and balanced energy supplementation (BES)] and risk factors (maternal anemia) on birth outcomes, with the specific goal of determining which intervention-outcome linkages should be included in the Lives Saved Tool (LiST) software. Methods: A literature search was conducted by using the WHO e-Library of Evidence for Nutrition Actions as the starting point. Recent studies, meta-analyses, and systematic reviews were reviewed for inclusion on the basis of their relevance to LiST. Results: On the basis of the available scientific evidence, the following linkages were found to be supported for inclusion in LiST: calcium supplementation on PTB (12% reduction), MMN supplementation on SGA (9% reduction), and BES on SGA (21% reduction among food-insecure women). Conclusions: The inclusion of these linkages in LiST will improve the utility of the model for users who seek to estimate the impact of antenatal nutrition interventions on birth outcomes. Scaling up these interventions should lead to downstream impacts in reducing stunting and child mortality. © 2017 American Society for Nutrition.
TRACI - THE TOOL FOR THE REDUCTION AND ASSESSMENT OF CHEMICAL AND OTHER ENVIRONMENTAL IMPACTS
TRACI, The Tool for the Reduction and Assessment of Chemical and other environmental Impacts, is described along with its history, the underlying research, methodologies, and insights within individual impact categories. TRACI facilitates the characterization of stressors that ma...
A-posteriori error estimation for second order mechanical systems
NASA Astrophysics Data System (ADS)
Ruiner, Thomas; Fehr, Jörg; Haasdonk, Bernard; Eberhard, Peter
2012-06-01
One important issue for the simulation of flexible multibody systems is the reduction of the flexible bodies' degrees of freedom. As far as safety questions are concerned, knowledge about the error introduced by the reduction of the flexible degrees of freedom is helpful and very important. In this work, an a-posteriori error estimator for linear first order systems is extended for error estimation of mechanical second order systems. Due to the special second order structure of mechanical systems, an improvement of the a-posteriori error estimator is achieved. A major advantage of the a-posteriori error estimator is that the estimator is independent of the reduction technique used. Therefore, it can be used for moment-matching-based, Gramian-matrix-based or modal model reduction techniques. The capability of the proposed technique is demonstrated by the a-posteriori error estimation of a mechanical system, and a sensitivity analysis of the parameters involved in the error estimation process is conducted.
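To make the setting concrete, a generic projection-based reduction of a second-order mechanical system can be written as follows (notation introduced here for illustration; the paper's estimator itself is not reproduced):

```latex
M\ddot{q}(t) + D\dot{q}(t) + K q(t) = B u(t), \qquad
q(t) \approx V q_r(t), \quad V \in \mathbb{R}^{N \times n},\ n \ll N,
```

with reduced matrices $M_r = V^\top M V$, $D_r = V^\top D V$, $K_r = V^\top K V$, and $B_r = V^\top B$. An a-posteriori estimator of the kind described bounds the reduction error $e(t) = q(t) - V q_r(t)$ from the residual $r(t) = B u - M V \ddot{q}_r - D V \dot{q}_r - K V q_r$, which can be evaluated without re-solving the full model and regardless of how $V$ was constructed (moment matching, Gramian-based, or modal truncation).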
A validation study of public health knowledge, skills, social responsibility and applied learning.
Vackova, Dana; Chen, Coco K; Lui, Juliana N M; Johnston, Janice M
2018-06-22
To design and validate a questionnaire to measure medical students' Public Health (PH) knowledge, skills, social responsibility and applied learning as indicated in the four domains recommended by the Association of Schools & Programmes of Public Health (ASPPH). A cross-sectional study was conducted to develop an evaluation tool for PH undergraduate education through item generation, reduction, refinement and validation. The 74 preliminary items derived from the existing literature were reduced to 55 items based on expert panel review which included those with expertise in PH, psychometrics and medical education, as well as medical students. Psychometric properties of the preliminary questionnaire were assessed as follows: frequency of endorsement for item variance; principal component analysis (PCA) with varimax rotation for item reduction and factor estimation; Cronbach's Alpha, item-total correlation and test-retest validity for internal consistency and reliability. PCA yielded five factors: PH Learning Experience (6 items); PH Risk Assessment and Communication (5 items); Future Use of Evidence in Practice (6 items); Recognition of PH as a Scientific Discipline (4 items); and PH Skills Development (3 items), explaining 72.05% variance. Internal consistency and reliability tests were satisfactory (Cronbach's Alpha ranged from 0.87 to 0.90; item-total correlation > 0.59). Lower paired test-retest correlations reflected instability in a social science environment. An evaluation tool for community-centred PH education has been developed and validated. The tool measures PH knowledge, skills, social responsibilities and applied learning as recommended by the internationally recognised Association of Schools & Programmes of Public Health (ASPPH).
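For readers unfamiliar with the psychometric steps mentioned (factor extraction with varimax rotation and Cronbach's alpha), a minimal sketch on synthetic Likert-type data is shown below; scikit-learn's FactorAnalysis with varimax rotation is used here as a stand-in for the PCA-with-varimax procedure reported, and all sizes and thresholds are illustrative.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

def cronbach_alpha(items):
    """items: (n_respondents, n_items) Likert-type matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(1)
# Synthetic stand-in for responses to 55 preliminary items driven by 5 latent factors.
latent = rng.normal(size=(300, 5))
loadings = rng.normal(size=(5, 55))
responses = latent @ loadings + rng.normal(scale=2.0, size=(300, 55))

fa = FactorAnalysis(n_components=5, rotation="varimax").fit(responses)
print("rotated loadings shape:", fa.components_.shape)           # (5, 55)
print("alpha for items 0-5:", round(cronbach_alpha(responses[:, :6]), 2))
```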
NASA Astrophysics Data System (ADS)
Leta, O. T.; Dulai, H.; El-Kadi, A. I.
2017-12-01
Upland soil erosion and sedimentation are the main threats to riparian and coastal reef ecosystems in Pacific islands. Here, due to the small size of the watersheds and their steep slopes, the residence time of rainfall runoff and its suspended load is short. Fagaalu bay, located on the island of Tutuila (American Samoa), has been identified as a priority watershed due to degraded coral reef condition and reduced stream water quality from heavy anthropogenic activity yielding high nutrient and sediment loads to the receiving water bodies. This study aimed to estimate the sediment yield to the Fagaalu stream and assess the impact of Best Management Practices (BMPs) on sediment yield reduction. For this, the Soil and Water Assessment Tool (SWAT) model was applied, calibrated, and validated for both daily streamflow and sediment load simulation. The model also estimated the sediment yield contributions from existing land use types of Fagaalu and identified erosion-prone areas for introducing BMP scenarios in the watershed. Three BMP scenarios (stone bund, retention pond, and filter strip) were then applied to the bare (quarry area), agricultural, and shrub land use types. It was found that the bare land with quarry activity yielded the highest annual average sediment yield of 133 tons per hectare (t ha-1), followed by agriculture (26.1 t ha-1), while the lowest sediment yield of 0.2 t ha-1 was estimated for the forested part of the watershed. Additionally, the bare land area (2 ha) contributed approximately 65% of the sediment yield of the 207 ha watershed, which averages 4.0 t ha-1. This signifies the disproportionate contribution of anthropogenic activity to sediment yield. The different BMP scenarios generally reduced the sediment yield to the coastal reef of the Fagaalu watershed; however, treating the quarry area with a stone bund showed the highest sediment yield reduction compared to the other two BMP scenarios. This study provides an estimate of the impact that each BMP has on specific land uses and on Fagaalu's reef. It also offers information that may be useful for coastal water resource management and for mitigation measures to reduce sediment yield at the study site and similar areas.
Nikolic, D; Richter, S S; Asamoto, K; Wyllie, R; Tuttle, R; Procop, G W
2017-12-01
There is substantial evidence that stool culture and parasitological examinations are of minimal to no value after 3 days of hospitalization. We implemented and studied the impact of a clinical decision support tool (CDST) to decrease the number of unnecessary stool cultures (STCUL), ova/parasite (O&P) examinations, and Giardia/Cryptosporidium enzyme immunoassay screens (GC-EIA) performed for patients hospitalized >3 days. We studied the frequency of stool studies ordered before or on day 3 and after day 3 of hospitalization (i.e., categorical orders/total number of orders) before and after this intervention and denoted the numbers and types of microorganisms detected within those time frames. This intervention, which corresponded to a custom-programmed hard-stop alert tool in the Epic hospital information system, allowed providers to override the intervention by calling the laboratory if testing was deemed medically necessary. Comparative statistics were employed to determine significance, and cost savings were estimated based on our internal costs. Before the intervention, 129/670 (19.25%) O&P examinations, 47/204 (23.04%) GC-EIA, and 249/1,229 (20.26%) STCUL were ordered after 3 days of hospitalization. After the intervention, 46/521 (8.83%) O&P examinations, 27/157 (17.20%) GC-EIA, and 106/1,028 (10.31%) STCUL were ordered after 3 days of hospitalization. The proportional reductions in tests performed after 3 days, with associated P values, were 54.1% for O&P examinations (P < 0.0001), 22.58% for GC-EIA (P = 0.2807), and 49.1% for STCUL (P < 0.0001). This was estimated to have resulted in $8,108.84 in cost savings. The electronic CDST resulted in a substantial reduction in the number of stool cultures and parasitological examinations for patients hospitalized for more than 3 days and in a cost savings, while retaining the ability of the clinician to obtain these tests if clinically indicated. Copyright © 2017 American Society for Microbiology.
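A minimal sketch of the before/after comparison reported above, using the published order counts; the abstract does not state which comparative statistic was used, so the chi-square p-values and the relative-reduction convention here are illustrative and need not reproduce the reported figures exactly.

```python
from scipy.stats import chi2_contingency

def late_order_reduction(before, after):
    """before/after: (late_orders, total_orders); returns % relative reduction and p-value."""
    p_before = before[0] / before[1]
    p_after = after[0] / after[1]
    table = [[before[0], before[1] - before[0]],
             [after[0], after[1] - after[0]]]
    chi2, p, _, _ = chi2_contingency(table)   # 2x2 test of the late-order proportions
    return 100 * (p_before - p_after) / p_before, p

for name, pre, post in [("O&P", (129, 670), (46, 521)),
                        ("GC-EIA", (47, 204), (27, 157)),
                        ("STCUL", (249, 1229), (106, 1028))]:
    red, p = late_order_reduction(pre, post)
    print(f"{name}: {red:.1f}% reduction, p = {p:.4f}")
```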
Adaptive reduction of constitutive model-form error using a posteriori error estimation techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bishop, Joseph E.; Brown, Judith Alice
2018-06-15
In engineering practice, models are typically kept as simple as possible for ease of setup and use, computational efficiency, maintenance, and overall reduced complexity to achieve robustness. In solid mechanics, a simple and efficient constitutive model may be favored over one that is more predictive, but is difficult to parameterize, is computationally expensive, or is simply not available within a simulation tool. In order to quantify the modeling error due to the choice of a relatively simple and less predictive constitutive model, we adopt the use of a posteriori model-form error-estimation techniques. Based on local error indicators in the energy norm, an algorithm is developed for reducing the modeling error by spatially adapting the material parameters in the simpler constitutive model. The resulting material parameters are not material properties per se, but depend on the given boundary-value problem. As a first step to the more general nonlinear case, we focus here on linear elasticity in which the “complex” constitutive model is general anisotropic elasticity and the chosen simpler model is isotropic elasticity. The algorithm for adaptive error reduction is demonstrated using two examples: (1) a transversely-isotropic plate with hole subjected to tension, and (2) a transversely-isotropic tube with two side holes subjected to torsion.
GIS Tools to Estimate Average Annual Daily Traffic
DOT National Transportation Integrated Search
2012-06-01
This project presents five tools that were created for a geographical information system to estimate Annual Average Daily : Traffic using linear regression. Three of the tools can be used to prepare spatial data for linear regression. One tool can be...
Reed, Shelby D; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L; Bowers, Margaret T; Samsa, Gregory P; Paul, Sara; Schulman, Kevin A; Whellan, David J; Riegel, Barbara J
2012-01-01
Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers and health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, D; Badano, A; Sempau, J
Purpose: Variance reduction techniques (VRTs) are employed in Monte Carlo simulations to obtain estimates with reduced statistical uncertainty for a given simulation time. In this work, we study the bias and efficiency of a VRT for estimating the response of imaging detectors. Methods: We implemented Directed Sampling (DS), preferentially directing a fraction of emitted optical photons directly towards the detector by altering the isotropic model. The weight of each optical photon is appropriately modified to maintain simulation estimates unbiased. We use a Monte Carlo tool called fastDETECT2 (part of the hybridMANTIS open-source package) for optical transport, modified for VRT. The weight of each photon is calculated as the ratio of original probability (no VRT) and the new probability for a particular direction. For our analysis of bias and efficiency, we use pulse height spectra, point response functions, and Swank factors. We obtain results for a variety of cases including analog (no VRT, isotropic distribution), and DS with 0.2 and 0.8 of optical photons directed towards the sensor plane. We used 10,000 25-keV primaries. Results: The Swank factor for all cases in our simplified model converged fast (within the first 100 primaries) to a stable value of 0.9. The root mean square error per pixel for DS VRT for the point response function between analog and VRT cases was approximately 5e-4. Conclusion: Our preliminary results suggest that DS VRT does not affect the estimate of the mean for the Swank factor. Our findings indicate that it may be possible to design VRTs for imaging detector simulations to increase computational efficiency without introducing bias.
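The weight correction described (weight = analog emission probability divided by the biased emission probability for the sampled direction) can be illustrated with a one-dimensional toy problem rather than the full fastDETECT2 optical transport; the detector geometry and directed fractions below are invented for the sketch, which simply checks that the weighted estimate of the detection probability stays unbiased.

```python
import numpy as np

rng = np.random.default_rng(42)
COS_MIN = 0.9                      # toy detector subtends directions with cos(theta) > 0.9
P_HIT = (1 - COS_MIN) / 2.0        # analytic hit probability for isotropic emission

def sample_mu(n, frac_directed):
    """Sample cos(theta); a fraction of photons is drawn uniformly inside the detector cone."""
    directed = rng.random(n) < frac_directed
    mu = np.where(directed,
                  rng.uniform(COS_MIN, 1.0, n),   # biased draw towards the detector
                  rng.uniform(-1.0, 1.0, n))      # analog isotropic draw
    # mixture pdf in mu; the analog (isotropic) pdf in mu is 1/2
    g = frac_directed * (mu > COS_MIN) / (1 - COS_MIN) + (1 - frac_directed) * 0.5
    w = 0.5 / g                                   # weight = analog pdf / biased pdf
    return mu, w

for frac in (0.0, 0.2, 0.8):                      # analog, then two DS settings
    mu, w = sample_mu(10_000, frac)
    est = np.mean(w * (mu > COS_MIN))             # weighted detection probability
    print(f"fraction directed {frac}: estimate {est:.4f} (true {P_HIT:.4f})")
```

With the weights applied, all three settings target the same expectation; the directed settings simply place more samples where the detector is, which is the variance-reduction payoff.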
Doherty, Tanya; Zembe, Wanga; Ngandu, Nobubelo; Kinney, Mary; Manda, Samuel; Besada, Donela; Jackson, Debra; Daniels, Karen; Rohde, Sarah; van Damme, Wim; Kerber, Kate; Daviaud, Emmanuelle; Rudan, Igor; Muniz, Maria; Oliphant, Nicholas P; Zamasiya, Texas; Rohde, Jon; Sanders, David
2015-12-01
Malawi is estimated to have achieved its Millennium Development Goal (MDG) 4 target. This paper explores factors influencing progress in child survival in Malawi including coverage of interventions and the role of key national policies. We performed a retrospective evaluation of the Catalytic Initiative (CI) programme of support (2007-2013). We developed estimates of child mortality using four population household surveys undertaken between 2000 and 2010. We recalculated coverage indicators for high impact child health interventions and documented child health programmes and policies. The Lives Saved Tool (LiST) was used to estimate child lives saved in 2013. The mortality rate in children under 5 years decreased rapidly in the 10 CI districts from 219 deaths per 1000 live births (95% confidence interval (CI) 189 to 249) in the period 1991-1995 to 119 deaths (95% CI 105 to 132) in the period 2006-2010. Coverage for all indicators except vitamin A supplementation increased in the 10 CI districts across the time period 2000 to 2013. The LiST analysis estimates that there were 10 800 child deaths averted in the 10 CI districts in 2013, primarily attributable to the introduction of the pneumococcal vaccine (24%) and increased household coverage of insecticide-treated bednets (19%). These improvements have taken place within a context of investment in child health policies and scale up of integrated community case management of childhood illnesses. Malawi provides a strong example for countries in sub-Saharan Africa of how high impact child health interventions implemented within a decentralised health system with an established community-based delivery platform, can lead to significant reductions in child mortality.
Doherty, Tanya; Zembe, Wanga; Ngandu, Nobubelo; Kinney, Mary; Manda, Samuel; Besada, Donela; Jackson, Debra; Daniels, Karen; Rohde, Sarah; van Damme, Wim; Kerber, Kate; Daviaud, Emmanuelle; Rudan, Igor; Muniz, Maria; Oliphant, Nicholas P; Zamasiya, Texas; Rohde, Jon; Sanders, David
2015-01-01
Background Malawi is estimated to have achieved its Millennium Development Goal (MDG) 4 target. This paper explores factors influencing progress in child survival in Malawi including coverage of interventions and the role of key national policies. Methods We performed a retrospective evaluation of the Catalytic Initiative (CI) programme of support (2007–2013). We developed estimates of child mortality using four population household surveys undertaken between 2000 and 2010. We recalculated coverage indicators for high impact child health interventions and documented child health programmes and policies. The Lives Saved Tool (LiST) was used to estimate child lives saved in 2013. Results The mortality rate in children under 5 years decreased rapidly in the 10 CI districts from 219 deaths per 1000 live births (95% confidence interval (CI) 189 to 249) in the period 1991–1995 to 119 deaths (95% CI 105 to 132) in the period 2006–2010. Coverage for all indicators except vitamin A supplementation increased in the 10 CI districts across the time period 2000 to 2013. The LiST analysis estimates that there were 10 800 child deaths averted in the 10 CI districts in 2013, primarily attributable to the introduction of the pneumococcal vaccine (24%) and increased household coverage of insecticide–treated bednets (19%). These improvements have taken place within a context of investment in child health policies and scale up of integrated community case management of childhood illnesses. Conclusions Malawi provides a strong example for countries in sub–Saharan Africa of how high impact child health interventions implemented within a decentralised health system with an established community–based delivery platform, can lead to significant reductions in child mortality. PMID:26649176
Fischer, Florian; Kraemer, Alexander
2016-02-05
Evidence of the adverse health effects attributable to second-hand smoke (SHS) exposure is available. This study aims to quantify the impact of SHS exposure on ischaemic heart diseases (IHD), chronic obstructive pulmonary diseases (COPD), and stroke in Germany. Therefore, this study estimated and forecasted the morbidity for the three outcomes in the German population. Furthermore, a health impact assessment was performed using DYNAMO-HIA, which is a generic software tool applying a Markov model. Overall 687,254 IHD cases, 231,973 COPD cases, and 288,015 stroke cases were estimated to be attributable to SHS exposure in Germany for 2014. Under the assumption that the population prevalence of these diseases and the prevalence of SHS exposure remain constant, the total number of cases will increase due to demographic aging. Assuming a total eradication of SHS exposure beginning in 2014 leads to an estimated reduction of 50% in cases, compared to the reference scenario in 2040 for all three diseases. The results highlight the relevance of SHS exposure because it affects several chronic disease conditions and has a major impact on the population's health. Therefore, public health campaigns to protect non-smokers are urgently needed.
Di Nardo, Francesco; Mengoni, Michele; Morettini, Micaela
2013-05-01
The present study provides a novel MATLAB-based parameter estimation procedure for individual assessment of the hepatic insulin degradation (HID) process from standard frequently-sampled intravenous glucose tolerance test (FSIGTT) data. Direct access to the source code, offered by MATLAB, enabled us to design an optimization procedure based on the alternating use of Gauss-Newton's and Levenberg-Marquardt's algorithms, which assures the full convergence of the process and the containment of computational time. Reliability was tested by direct comparison with the application, in eighteen non-diabetic subjects, of the well-known kinetic analysis software package SAAM II, and by application on different data. Agreement between MATLAB and SAAM II was supported by intraclass correlation coefficients ≥0.73; no significant differences between corresponding mean parameter estimates and prediction of HID rate; and consistent residual analysis. Moreover, the MATLAB optimization procedure resulted in a significant 51% reduction of CV% for the parameter worst-estimated by SAAM II and in maintaining all model-parameter CV% <20%. In conclusion, our MATLAB-based procedure was suggested as a suitable tool for the individual assessment of the HID process. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
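The authors' MATLAB code is not reproduced in the abstract; as a rough illustration of Levenberg-Marquardt-style fitting of FSIGTT-like data with linearised CV% reporting, here is a sketch in Python using SciPy with a toy mono-exponential model (the model, starting values, and noise level are assumptions, not the HID model of the paper).

```python
import numpy as np
from scipy.optimize import least_squares

# Toy stand-in for insulin kinetics: I(t) = I0 * exp(-k*t) + Ib.
t = np.arange(0, 180, 10.0)                         # minutes
true = dict(I0=60.0, k=0.05, Ib=5.0)
rng = np.random.default_rng(3)
data = true["I0"] * np.exp(-true["k"] * t) + true["Ib"] + rng.normal(0, 1.0, t.size)

def residuals(p):
    I0, k, Ib = p
    return I0 * np.exp(-k * t) + Ib - data

# Levenberg-Marquardt step (SciPy wraps MINPACK's implementation).
fit = least_squares(residuals, x0=[40.0, 0.1, 1.0], method="lm")
I0, k, Ib = fit.x
print(f"I0={I0:.1f}, k={k:.3f} 1/min, Ib={Ib:.1f}")

# Approximate CV% per parameter from the Jacobian at the solution (linearised covariance).
dof = t.size - fit.x.size
sigma2 = 2 * fit.cost / dof                          # fit.cost = 0.5 * sum(residuals**2)
cov = sigma2 * np.linalg.inv(fit.jac.T @ fit.jac)
cv = 100 * np.sqrt(np.diag(cov)) / np.abs(fit.x)
print("CV% per parameter:", np.round(cv, 1))
```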
NASA Astrophysics Data System (ADS)
Schämann, M.; Bücker, M.; Hessel, S.; Langmann, U.
2008-05-01
High data rates combined with high mobility represent a challenge for the design of cellular devices. Advanced algorithms are required which result in higher complexity, more chip area and increased power consumption. However, this contrasts to the limited power supply of mobile devices. This presentation discusses the application of an HSDPA receiver which has been optimized regarding power consumption with the focus on the algorithmic and architectural level. On algorithmic level the Rake combiner, Prefilter-Rake equalizer and MMSE equalizer are compared regarding their BER performance. Both equalizer approaches provide a significant increase of performance for high data rates compared to the Rake combiner which is commonly used for lower data rates. For both equalizer approaches several adaptive algorithms are available which differ in complexity and convergence properties. To identify the algorithm which achieves the required performance with the lowest power consumption the algorithms have been investigated using SystemC models regarding their performance and arithmetic complexity. Additionally, for the Prefilter Rake equalizer the power estimations of a modified Griffith (LMS) and a Levinson (RLS) algorithm have been compared with the tool ORINOCO supplied by ChipVision. The accuracy of this tool has been verified with a scalable architecture of the UMTS channel estimation described both in SystemC and VHDL targeting a 130 nm CMOS standard cell library. An architecture combining all three approaches combined with an adaptive control unit is presented. The control unit monitors the current condition of the propagation channel and adjusts parameters for the receiver like filter size and oversampling ratio to minimize the power consumption while maintaining the required performance. The optimization strategies result in a reduction of the number of arithmetic operations up to 70% for single components which leads to an estimated power reduction of up to 40% while the BER performance is not affected. This work utilizes SystemC and ORINOCO for the first estimation of power consumption in an early step of the design flow. Thereby algorithms can be compared in different operating modes including the effects of control units. Here an algorithm having higher peak complexity and power consumption but providing more flexibility showed less consumption for normal operating modes compared to the algorithm which is optimized for peak performance.
TRACI 2.0 - The Tool for the Reduction and Assessment of Chemical and other environmental Impacts
TRACI 2.0, the Tool for the Reduction and Assessment of Chemical and other environmental Impacts 2.0, has been expanded and developed for sustainability metrics, life cycle impact assessment, industrial ecology, and process design impact assessment for developing increasingly sus...
TRACI 2.1 (the Tool for the Reduction and Assessment of Chemical and other environmental Impacts) has been developed for sustainability metrics, life cycle impact assessment, industrial ecology, and process design impact assessment for developing increasingly sustainable products...
Energy-Saving Melting and Revert Reduction Technology (E-SMARRT): Final Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Thornton C
2014-03-31
Energy-Saving Melting and Revert Reduction Technology (E-SMARRT) is a balanced portfolio of R&D tasks that address energy-saving opportunities in the metalcasting industry. E-SMARRT was created to: • Improve important capabilities of castings • Reduce carbon footprint of the foundry industry • Develop new job opportunities in manufacturing • Significantly reduce metalcasting process energy consumption and includes R&D in the areas of: • Improvements in Melting Efficiency • Innovative Casting Processes for Yield Improvement/Revert Reduction • Instrumentation and Control Improvement • Material properties for Casting or Tooling Design Improvement The energy savings and process improvements developed under E-SMARRT have been made possible through the unique collaborative structure of the E-SMARRT partnership. The E-SMARRT team consisted of DOE's Office of Industrial Technology; the three leading metalcasting technical associations in the U.S.: the American Foundry Society, the North American Die Casting Association, and the Steel Founders' Society of America; and SCRA Applied R&D, doing business as the Advanced Technology Institute (ATI), a recognized leader in distributed technology management. This team provided collaborative leadership to a complex industry composed of approximately 2,000 companies, 80% of which employ fewer than 100 people, and only 4% of which employ more than 250 people. Without collaboration, these new processes and technologies that enable energy efficiencies and environment-friendly improvements would have been slow to develop and to achieve broad application. The E-SMARRT R&D tasks featured low-threshold energy efficiency improvements that are attractive to the domestic industry because they do not require major capital investment. The results of this portfolio of projects are significantly reducing metalcasting process energy consumption while improving the important capabilities of metalcastings. Through June 2014, the E-SMARRT program predicts average annual savings of 59 trillion BTU over a 10-year period through Advanced Melting Efficiencies and Innovative Casting Processes. Along with these energy savings, the estimated average annual CO2 reduction over a ten-year period is 3.56 million metric tons of carbon equivalent (MM TCE).
CALCULATIONAL TOOL FOR SKIN CONTAMINATION DOSE ESTIMATE
DOE Office of Scientific and Technical Information (OSTI.GOV)
HILL, R.L.
2005-03-31
A spreadsheet calculational tool was developed to automate the calculations performed for estimating dose from skin contamination. This document reports on the design and testing of the spreadsheet calculational tool.
CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 4
2005-04-01
… older automated cost-estimating tools are no longer being actively marketed but are still in use, such as CheckPoint, COCOMO, ESTIMACS, REVIC, and SPQR … estimation tools: SPQR/20, Checkpoint, and KnowledgePlan. These software estimation tools pioneered the use of function point metrics for sizing and …
National Stormwater Calculator: Low Impact Development ...
The National Stormwater Calculator (NSC) makes it easy to estimate runoff reduction when planning a new development or redevelopment site with low impact development (LID) stormwater controls. The Calculator is currently deployed as a Windows desktop application. The Calculator is organized as a wizard-style application that walks the user through the steps necessary to perform runoff calculations on a single urban sub-catchment of 10 acres or less in size. Using an interactive map, the user can select the sub-catchment location and the Calculator automatically acquires hydrologic data for the site. A new LID cost estimation module has been developed for the Calculator. This project involved programming cost curves into the existing Calculator desktop application. The integration of cost components of LID controls into the Calculator increases functionality and will promote greater use of the Calculator as a stormwater management and evaluation tool. The addition of the cost estimation module allows planners and managers to evaluate LID controls based on comparison of project cost estimates and predicted LID control performance. Cost estimation is accomplished based on user-identified size (or auto-sizing based on achieving volume control or treatment of a defined design storm), configuration of the LID control infrastructure, and other key project and site-specific variables, including whether the project is being applied as part of new development or redevelopment.
Vezzaro, L; Sharma, A K; Ledin, A; Mikkelsen, P S
2015-03-15
The estimation of micropollutant (MP) fluxes in stormwater systems is a fundamental prerequisite when preparing strategies to reduce stormwater MP discharges to natural waters. Dynamic integrated models can be important tools in this step, as they can be used to integrate the limited data provided by monitoring campaigns and to evaluate the performance of different strategies based on model simulation results. This study presents an example where six different control strategies, including both source-control and end-of-pipe treatment, were compared. The comparison focused on fluxes of heavy metals (copper, zinc) and organic compounds (fluoranthene). MP fluxes were estimated by using an integrated dynamic model, in combination with stormwater quality measurements. MP sources were identified by using GIS land usage data, runoff quality was simulated by using a conceptual accumulation/washoff model, and a stormwater retention pond was simulated by using a dynamic treatment model based on MP inherent properties. Uncertainty in the results was estimated with a pseudo-Bayesian method. Despite the great uncertainty in the MP fluxes estimated by the runoff quality model, it was possible to compare the six scenarios in terms of discharged MP fluxes, compliance with water quality criteria, and sediment accumulation. Source-control strategies obtained better results in terms of reduction of MP emissions, but all the simulated strategies failed in fulfilling the criteria based on emission limit values. The results presented in this study show how the efficiency of MP pollution control strategies can be quantified by combining advanced modeling tools (integrated stormwater quality model, uncertainty calibration). Copyright © 2014 Elsevier Ltd. All rights reserved.
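The "conceptual accumulation/washoff model" is not specified in the abstract; a common exponential buildup/washoff formulation of that general type looks like the following sketch (all rate parameters are illustrative, not the calibrated values of the study).

```python
import numpy as np

def buildup_washoff(rain_mm_per_hr, dt_hr=1.0, b_max=2.0, k_build=0.08, k_wash=0.19, n_exp=1.1):
    """Conceptual accumulation/washoff of a pollutant on a catchment surface.

    b_max [kg/ha]   maximum surface build-up
    k_build [1/hr]  dry-weather accumulation rate towards b_max
    k_wash, n_exp   washoff coefficient/exponent applied to rainfall intensity
    Returns the washed-off load per time step [kg/ha]; parameters are illustrative only.
    """
    b = 0.5 * b_max                                   # initial surface load
    loads = []
    for q in rain_mm_per_hr:
        b += k_build * (b_max - b) * dt_hr            # accumulation during dry steps too
        wash = min(k_wash * (q ** n_exp) * b * dt_hr, b)   # washoff proportional to current load
        b -= wash
        loads.append(wash)
    return np.array(loads)

storm = np.concatenate([np.zeros(48), np.full(6, 8.0), np.zeros(24)])   # 8 mm/h for 6 h
print("event load [kg/ha]:", buildup_washoff(storm).sum().round(3))
```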
Brown, Zachary S; Kramer, Randall A; Ocan, David; Oryema, Christine
2016-10-06
Insecticide-based tools remain critical for controlling vector-borne diseases in Uganda. Securing public support from targeted populations for such tools is an important component in sustaining their long-run effectiveness. Yet little quantitative evidence is available on the perceived benefits and costs of vector control programmes among targeted households. A survey was administered to a clustered random sample of 612 households in Gulu and Oyam districts of northern Uganda during a period of very high malaria transmission and following a pilot indoor residual spray (IRS) programme. A discrete choice experiment was conducted within the survey, in which respondents indicated their preferences for different IRS programmes relative to money compensation in a series of experimentally controlled, hypothetical choice sets. The data were analysed using conditional logit regression models to estimate respondents' willingness to accept (WTA) some amount of money compensation in lieu of foregone malaria risk reductions. Latent class models were used to analyse whether respondent characteristics predicted WTA. Average WTA is estimated at $8.94 annually for a 10 % reduction in malaria risk, and additional co-benefits of IRS were estimated to be worth on average $54-$56 (depending on insecticide type) per round of IRS. Significant heterogeneity is observed: Four in five household heads in northern Uganda have high valuations for IRS programmes, while the remaining 20 % experience costly side effects of IRS (valued at between $2 and $3 per round). Statistically significant predictors of belonging to the high-value group include respondent gender, mean age of household members, participation in previous IRS, basic knowledge of mosquito reproduction, and the number of mosquito nets owned. Proxies for household income and wealth are not found to be statistically significant predictors of WTA. This study suggests that the majority of people in areas of high malaria transmission like northern Uganda place a high value on vector control programmes using IRS. However, there is significant heterogeneity in terms of the perceived side effects (positive and negative). This has implications for sustaining public support for these programmes in the long-term.
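For context, in a conditional logit discrete choice model with utility linear in money, willingness-to-accept estimates of the kind reported are typically ratios of coefficients; with notation introduced here for illustration,

```latex
U_{ij} = \beta_m\, m_{ij} + \beta_r\, \Delta \mathrm{risk}_{ij} + \mathbf{x}_{ij}'\boldsymbol{\gamma} + \varepsilon_{ij},
\qquad
\mathrm{WTA}_{\Delta \mathrm{risk}} = -\,\frac{\beta_r}{\beta_m}\,\Delta \mathrm{risk},
```

so that, for example, the $8.94 figure corresponds to the compensation that leaves utility unchanged after a 10% reduction in malaria risk. The latent class extension lets the coefficients differ across unobserved respondent groups, which is how the high-value and side-effect groups can be distinguished.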
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holttinen, Hannele; Kiviluoma, Juha; McCann, John
2015-10-05
This paper presents ways of estimating CO2 reductions of wind power using different methodologies. Estimates based on historical data have more pitfalls in methodology than estimates based on dispatch simulations. Taking into account exchange of electricity with neighboring regions is challenging for all methods. Results for CO2 emission reductions are shown from several countries. Wind power will reduce emissions by about 0.3–0.4 tCO2/MWh when replacing mainly gas and up to 0.7 tCO2/MWh when replacing mainly coal-powered generation. The paper focuses on CO2 emissions from the power system operation phase, but long-term impacts are briefly discussed.
NASA Instrument Cost/Schedule Model
NASA Technical Reports Server (NTRS)
Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George
2011-01-01
NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system-level cost estimation tool; a subsystem-level cost estimation tool; a database of cost and technical parameters of over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross validation), the estimating equations themselves and a demonstration of the NICM tool suite.
Characterizing health plan price estimator tools: findings from a national survey.
Higgins, Aparna; Brainard, Nicole; Veselovskiy, German
2016-02-01
Policy makers have growing interest in price transparency and in the kinds of tools available to consumers. Health plans have implemented price estimator tools that make provider pricing information available to members; however, systematic data on prevalence and characteristics of such tools are limited. The purpose of this study was to describe the characteristics of price estimator tools offered by health plans to their members and to identify potential trends, challenges, and opportunities for advancing the utility of these tools. National Web-based survey. Between 2014 and 2015, we conducted a national Web-based survey of health plans with commercial enrollment (100 plans, 43% response rate). Descriptive analyses were conducted using survey data. Health plan members have access to a variety of price estimator tool capabilities for commonly used procedures. These tools take into account member characteristics, including member zip code and benefit design. Despite outreach to members, however, challenges remain with respect to member uptake of such tools. Our study found that health plans share price and provider performance data with their members.
Joffres, Michel R; Campbell, Norm R C; Manns, Braden; Tu, Karen
2007-05-01
Hypertension is the leading risk factor for mortality worldwide. One-quarter of the adult Canadian population has hypertension, and more than 90% of the population is estimated to develop hypertension if they live an average lifespan. Reductions in dietary sodium additives significantly lower systolic and diastolic blood pressure, and population reductions in dietary sodium are recommended by major scientific and public health organizations. To estimate the reduction in hypertension prevalence and specific hypertension management cost savings associated with a population-wide reduction in dietary sodium additives. Based on data from clinical trials, reducing dietary sodium additives by 1840 mg/day would result in a decrease of 5.06 mmHg (systolic) and 2.7 mmHg (diastolic) blood pressures. Using Canadian Heart Health Survey data, the resulting reduction in hypertension was estimated. Costs of laboratory testing and physician visits were based on 2001 to 2003 Ontario Health Insurance Plan data, and the number of physician visits and costs of medications for patients with hypertension were taken from 2003 IMS Canada. To estimate the reduction in total physician visits and laboratory costs, current estimates of aware hypertensive patients in Canada were used from the Canadian Community Health Survey. Reducing dietary sodium additives may decrease hypertension prevalence by 30%, resulting in one million fewer hypertensive patients in Canada, and almost double the treatment and control rate. Direct cost savings related to fewer physician visits, laboratory tests and lower medication use are estimated to be approximately $430 million per year. Physician visits and laboratory costs would decrease by 6.5%, and 23% fewer treated hypertensive patients would require medications for control of blood pressure. Based on these estimates, lowering dietary sodium additives would lead to a large reduction in hypertension prevalence and result in health care cost savings in Canada.
Joffres, Michel R; Campbell, Norm RC; Manns, Braden; Tu, Karen
2007-01-01
BACKGROUND: Hypertension is the leading risk factor for mortality worldwide. One-quarter of the adult Canadian population has hypertension, and more than 90% of the population is estimated to develop hypertension if they live an average lifespan. Reductions in dietary sodium additives significantly lower systolic and diastolic blood pressure, and population reductions in dietary sodium are recommended by major scientific and public health organizations. OBJECTIVES: To estimate the reduction in hypertension prevalence and specific hypertension management cost savings associated with a population-wide reduction in dietary sodium additives. METHODS: Based on data from clinical trials, reducing dietary sodium additives by 1840 mg/day would result in a decrease of 5.06 mmHg (systolic) and 2.7 mmHg (diastolic) blood pressures. Using Canadian Heart Health Survey data, the resulting reduction in hypertension was estimated. Costs of laboratory testing and physician visits were based on 2001 to 2003 Ontario Health Insurance Plan data, and the number of physician visits and costs of medications for patients with hypertension were taken from 2003 IMS Canada. To estimate the reduction in total physician visits and laboratory costs, current estimates of aware hypertensive patients in Canada were used from the Canadian Community Health Survey. RESULTS: Reducing dietary sodium additives may decrease hypertension prevalence by 30%, resulting in one million fewer hypertensive patients in Canada, and almost double the treatment and control rate. Direct cost savings related to fewer physician visits, laboratory tests and lower medication use are estimated to be approximately $430 million per year. Physician visits and laboratory costs would decrease by 6.5%, and 23% fewer treated hypertensive patients would require medications for control of blood pressure. CONCLUSIONS: Based on these estimates, lowering dietary sodium additives would lead to a large reduction in hypertension prevalence and result in health care cost savings in Canada. PMID:17487286
The Tool for the Reduction and Assessment of Chemical and other environmental Impacts (TRACI) was developed to allow the quantification of environmental impacts for a variety of impact categories which are necessary for a comprehensive impact assessment. See Figure 1. TRACI is c...
Reed, Shelby D.; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L.; Bowers, Margaret T.; Samsa, Gregory P.; Paul, Sara; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara J.
2011-01-01
Background Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Methods and Results Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers or health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. Conclusions The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions. PMID:22147884
Dong, Xing; Zhang, Kevin; Ren, Yuan; Wilson, Reda; O'Neil, Mary Elizabeth
2016-01-01
Studying population-based cancer survival by leveraging the high-quality cancer incidence data collected by the Centers for Disease Control and Prevention's National Program of Cancer Registries (NPCR) can offer valuable insight into the cancer burden and impact in the United States. We describe the development and validation of a SAS macro tool that calculates population-based, cancer site-specific relative survival estimates comparable to those obtained through SEER*Stat. The NPCR relative survival analysis SAS tool (NPCR SAS tool) was developed based on the relative survival method and SAS macros developed by Paul Dickman. NPCR cancer incidence data from 25 states submitted in November 2012 were used, specifically cases diagnosed from 2003 to 2010 with follow-up through 2010. Decennial and annual complete life tables published by the National Center for Health Statistics (NCHS) for 2000 through 2009 were used. To assess comparability between the 2 tools, 5-year relative survival rates were calculated for 25 cancer sites by sex, race, and age group using the NPCR SAS tool and the National Cancer Institute's SEER*Stat 8.1.5 software. A module to create data files for SEER*Stat was also developed for the NPCR SAS tool. Comparison of the results produced by both SAS and SEER*Stat showed comparable and reliable relative survival estimates for NPCR data. For a majority of the sites, the net differences between the NPCR SAS tool and SEER*Stat-produced relative survival estimates ranged from -0.1% to 0.1%. The estimated standard errors were highly comparable between the 2 tools as well. The NPCR SAS tool will allow researchers to accurately produce 5-year cancer relative survival estimates that are comparable to those produced by SEER*Stat for NPCR data. Comparison of output from the NPCR SAS tool and SEER*Stat provided additional quality control capabilities for evaluating data prior to producing NPCR relative survival estimates.
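For reference, the relative survival quantity computed by such tools (following the Dickman approach of interval-based estimation against expected survival from life tables) has the general form, with notation chosen here for illustration:

```latex
\hat{R}(t_k) \;=\; \frac{\hat{S}_{\mathrm{obs}}(t_k)}{\hat{S}_{\mathrm{exp}}(t_k)} \;=\; \prod_{i=1}^{k} \frac{p_i}{p_i^{*}},
```

where $p_i$ is the observed survival proportion of the cancer cohort in interval $i$ and $p_i^{*}$ is the expected interval survival of a demographically matched general population drawn from the NCHS life tables.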
MURMoT. Design and Application of Microbial Uranium Reduction Monitoring Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loeffler, Frank E.
2014-12-31
Uranium (U) contamination in the subsurface is a major remediation challenge at many DOE sites. Traditional site remedies present enormous costs to DOE; hence, enhanced bioremediation technologies (i.e., biostimulation and bioaugmentation) combined with monitoring efforts are being considered as cost-effective corrective actions to address subsurface contamination. This research effort improved understanding of the microbial U reduction process and developed new tools for monitoring microbial activities. Application of these tools will promote science-based site management decisions that achieve contaminant detoxification, plume control, and long-term stewardship in the most efficient manner. The overarching hypothesis was that the design, validation and application of a suite of new molecular and biogeochemical tools advance process understanding, and improve environmental monitoring regimes to assess and predict in situ U immobilization. Accomplishments: This project (i) advanced nucleic acid-based approaches to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-detoxifying bacteria; (ii) developed proteomics workflows for detection of metal reduction biomarker proteins in laboratory cultures and contaminated site groundwater; (iii) developed and demonstrated the utility of U isotopic fractionation using high precision mass spectrometry to quantify U(VI) reduction for a range of reduction mechanisms and environmental conditions; and (iv) validated the new tools using field samples from U-contaminated IFRC sites, and demonstrated their prognostic and diagnostic capabilities in guiding decision making for environmental remediation and long-term site stewardship.
Hade, Erinn M; Murray, David M; Pennell, Michael L; Rhoda, Dale; Paskett, Electra D; Champion, Victoria L; Crabtree, Benjamin F; Dietrich, Allen; Dignan, Mark B; Farmer, Melissa; Fenton, Joshua J; Flocke, Susan; Hiatt, Robert A; Hudson, Shawna V; Mitchell, Michael; Monahan, Patrick; Shariff-Marco, Salma; Slone, Stacey L; Stange, Kurt; Stewart, Susan L; Strickland, Pamela A Ohman
2010-01-01
Screening has become one of our best tools for early detection and prevention of cancer. The group-randomized trial is the most rigorous experimental design for evaluating multilevel interventions. However, identifying the proper sample size for a group-randomized trial requires reliable estimates of intraclass correlation (ICC) for screening outcomes, which are not available to researchers. We present crude and adjusted ICC estimates for cancer screening outcomes for various levels of aggregation (physician, clinic, and county) and provide an example of how these ICC estimates may be used in the design of a future trial. Investigators working in the area of cancer screening were contacted and asked to provide crude and adjusted ICC estimates using the analysis of variance method estimator. Of the 29 investigators identified, estimates were obtained from 10 investigators who had relevant data. ICC estimates were calculated from 13 different studies, with more than half of the studies collecting information on colorectal screening. In the majority of cases, ICC estimates could be adjusted for age, education, and other demographic characteristics, leading to a reduction in the ICC. ICC estimates varied considerably by cancer site and level of aggregation of the groups. Previously, only two articles had published ICCs for cancer screening outcomes. We have compiled more than 130 crude and adjusted ICC estimates covering breast, cervical, colon, and prostate screening and have detailed them by level of aggregation, screening measure, and study characteristics. We have also demonstrated their use in planning a future trial and the need for the evaluation of the proposed interval estimator for binary outcomes under conditions typically seen in GRTs.
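One common way such ICC estimates feed into trial planning is through the design effect, 1 + (m - 1) x ICC, which inflates the sample size required under individual randomization; the sketch below uses hypothetical values only.

```python
import math

# Illustrative use of an ICC estimate when planning a group-randomized trial.
icc = 0.02            # adjusted ICC for a screening outcome (hypothetical)
m = 50                # average number of patients per clinic (hypothetical)
n_individual = 800    # total n required if individuals were randomized (hypothetical)

# Design effect: variance inflation due to clustering.
deff = 1 + (m - 1) * icc
n_clustered = n_individual * deff
clinics_per_arm = math.ceil(n_clustered / (2 * m))

print(f"Design effect: {deff:.2f}")
print(f"Total n under cluster randomization: {n_clustered:.0f}")
print(f"Clinics per arm: {clinics_per_arm}")
```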
Northern Hemisphere observations of ICRF sources on the USNO stellar catalogue frame
NASA Astrophysics Data System (ADS)
Fienga, A.; Andrei, A. H.
2004-06-01
The most recent USNO stellar catalogue, the USNO B1.0 (Monet et al. 2003), provides positions for 1 042 618 261 objects, with a published astrometric accuracy of 200 mas and five-band magnitudes with a 0.3 mag accuracy. Its completeness is believed to extend to magnitude 21 in the V band. Such a catalogue would be a very good tool for astrometric reduction. This work investigates the accuracy of the USNO B1.0 link to the ICRF and gives an estimate of its internal and external accuracies by comparison with different catalogues, and by computation of ICRF source positions using USNO B1.0 star positions.
International Space Station Noise Constraints Flight Rule Process
NASA Technical Reports Server (NTRS)
Limardo, Jose G.; Allen, Christopher S.; Danielson, Richard W.
2014-01-01
Crewmembers onboard the International Space Station (ISS) live in a unique workplace environment for as long as 6-12 months. During these long-duration ISS missions, noise exposures from onboard equipment pose concerns for human factors and crewmember health, including possible reductions in hearing sensitivity, disruptions of crew sleep, interference with speech intelligibility and voice communications, interference with crew task performance, and reduced alarm audibility. The purpose of this poster is to describe how a recently updated noise constraints flight rule is being used to implement a NASA-created Noise Exposure Estimation Tool and Noise Hazard Inventory to predict crew noise exposures and recommend when hearing protection devices are needed.
System Modeling of Lunar Oxygen Production: Mass and Power Requirements
NASA Technical Reports Server (NTRS)
Steffen, Christopher J.; Freeh, Joshua E.; Linne, Diane L.; Faykus, Eric W.; Gallo, Christopher A.; Green, Robert D.
2007-01-01
A systems analysis tool for estimating the mass and power requirements for a lunar oxygen production facility is introduced. The individual modeling components involve the chemical processing and cryogenic storage subsystems needed to process a beneficiated regolith stream into liquid oxygen via ilmenite reduction. The power can be supplied from one of six different fission reactor-converter systems. A baseline system analysis, capable of producing 15 metric tons of oxygen per annum, is presented. The choice of reactor-converter was seen to have a small but measurable impact on the system configuration and performance. Finally, the mission concept of operations can have a substantial impact upon individual component size and power requirements.
Evaluation of the cost-effectiveness of evolocumab in the FOURIER study: a Canadian analysis.
Lee, Todd C; Kaouache, Mohammed; Grover, Steven A
2018-04-03
Evolocumab, a proprotein convertase subtilisin-kexin type 9 (PCSK9) inhibitor, has been shown to reduce low-density lipoprotein levels by up to 60%. Despite the absence of a reduction in overall or cardiovascular mortality in the Further Cardiovascular Outcomes Research With PCSK9 Inhibition in Subjects With Elevated Risk (FOURIER) trial, some believe that, with longer treatment, such a benefit might eventually be realized. Our aim was to estimate the potential mortality benefit over a patient's lifetime and the cost per year of life saved (YOLS) for an average Canadian with established coronary artery disease. We also sought to estimate the price threshold at which evolocumab might be considered cost-effective for secondary prevention in Canada. We calibrated the Cardio-metabolic Model, a well-validated tool for predicting cardiovascular events and life expectancy, to the reduction in nonfatal events seen in the FOURIER trial. Assuming that long-term treatment will eventually result in mortality benefits, we estimated YOLSs and cost per YOLS with evolocumab treatment plus a statin compared to a statin alone. We then estimated the annual drug costs that would provide a 50% chance of being cost-effective at willingness-to-pay values of $50 000 and $100 000. In secondary prevention in patients similar to those in the FOURIER study, evolocumab treatment would save an average of 0.34 (95% confidence interval [CI] 0.27-0.41) life-years at a cost of $101 899 (95% CI $97 325-$106 473), yielding a cost per YOLS of $299 482. We estimate that to have a 50% probability of achieving a cost per YOLS below $50 000 and $100 000 would require annual drug costs below $1200 and $2300, respectively. At current pricing, the use of evolocumab for secondary prevention is unlikely to be cost-effective in Canada. Copyright 2018, Joule Inc. or its licensors.
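The headline cost-effectiveness arithmetic can be approximately reproduced from the figures quoted in the abstract; the sketch below does so and then rescales an assumed current annual drug price to the willingness-to-pay thresholds, ignoring discounting and uncertainty, so it is purely illustrative.

```python
# Back-of-the-envelope cost-effectiveness arithmetic: cost per year of life
# saved (YOLS) = incremental lifetime cost / incremental life-years gained.
incremental_cost = 101_899      # lifetime treatment cost from the abstract ($)
life_years_saved = 0.34         # incremental life-years from the abstract

cost_per_yols = incremental_cost / life_years_saved
print(f"Cost per YOLS: ${cost_per_yols:,.0f}")   # ~$299,700 vs. $299,482 reported

# Annual price that would hit a willingness-to-pay threshold, assuming lifetime
# cost scales linearly with the annual drug price (a strong simplification).
current_annual_price = 7_000    # hypothetical current annual cost ($)
for wtp in (50_000, 100_000):
    scale = wtp / cost_per_yols
    print(f"Approx. annual price for ${wtp:,}/YOLS: ${current_annual_price * scale:,.0f}")
```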
NASA Astrophysics Data System (ADS)
Giama, E.; Papadopoulos, A. M.
2018-01-01
The reduction of carbon emissions has become a top priority in the decision-making process for governments and companies, the strict European legislation framework being a major driving force behind this effort. On the other hand, many companies face difficulties in estimating their footprint and in linking the results derived from environmental evaluation processes with an integrated energy management strategy, which will eventually lead to energy-efficient and cost-effective solutions. The paper highlights the need for companies to establish integrated environmental management practices, with tools such as carbon footprint analysis to monitor the energy performance of production processes. Concepts and methods are analysed, and selected indicators are presented by means of benchmarking, monitoring and reporting of the results so that they can be used effectively by the companies. The study is based on data from more than 90 Greek small and medium enterprises, followed by a comprehensive discussion of cost-effective and realistic energy-saving measures.
Probabilistic cost-benefit analysis of disaster risk management in a development context.
Kull, Daniel; Mechler, Reinhard; Hochrainer-Stigler, Stefan
2013-07-01
Limited studies have shown that disaster risk management (DRM) can be cost-efficient in a development context. Cost-benefit analysis (CBA) is an evaluation tool to analyse economic efficiency. This research introduces quantitative, stochastic CBA frameworks and applies them in case studies of flood and drought risk reduction in India and Pakistan, while also incorporating projected climate change impacts. DRM interventions are shown to be economically efficient, with integrated approaches more cost-effective and robust than singular interventions. The paper highlights that CBA can be a useful tool if certain issues are considered properly, including: complexities in estimating risk; data dependency of results; negative effects of interventions; and distributional aspects. The design and process of CBA must take into account specific objectives, available information, resources, and the perceptions and needs of stakeholders as transparently as possible. Intervention design and uncertainties should be qualified through dialogue, indicating that process is as important as numerical results. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.
Center for Corporate Climate Leadership Goal Setting
EPA provides tools and recognition for companies setting aggressive GHG reduction goals, which can galvanize reduction efforts at a company and often leads to the identification of many additional reduction opportunities.
Distribution of new HIV infections among key risk population groups in Togo
Landoh, Dadja Essoya; Maboudou, Angèle Akouavi; Deku, Kodzo; Pitche, Palokinam Vincent
2014-01-01
Introduction Good data on the epidemiology of modes of transmission of HIV among populations at risk are important for development of prevention strategies, and resource allocation for the implementation of the interventions. We sought to estimate new HIV infections among key risk groups in Togo. Methods We conducted a systematic review of epidemiological data on HIV and AIDS as part of the HIV control strategies in Togo from 2001 to 2012 following the PRISMA guidelines. We used the Mode of Transmission (MoT) modelling tool to estimate the incidence of new HIV infections in high risk groups. The MoT tool was developed and validated by UNAIDS and implemented by several countries using data on the HIV epidemic to estimate new HIV infections that will appear in the core groups. We used the Epi-MoT tool to assess the availability and the quality of data. A data availability score over 50% and a quality score over 1.5 were required to proceed to the MoT analysis. Uncertainty analysis to assess the reliability of the results was performed. Results The incidence of new HIV infections was estimated at 6,643 (95% CI = 5274, 9005), corresponding to an incidence rate of 203 per 1,000,000 inhabitants. The proportion of new HIV infections was 61.9% (95% CI = 46.2 to 71.7) in stable heterosexual couples compared with 14.01% (95% CI = 7.2 to 23.3) in people having casual sex. In high-risk groups, new HIV infections accounted for 2.4% among sex workers (SWs) (95% CI = 1.2 - 4.1), 7.9% among clients of SWs (95% CI = 3.9-14.1) and 6.9% among men who have sex with men (MSM) (95% CI = 3.1 to 13.1). Conclusion We describe the predicted distribution of the HIV epidemic, with a large contribution of stable heterosexual couples to the occurrence of new infections. However, HIV incidence remains high in key risk populations. Innovative strategies for risk reduction should be strengthened to reduce transmission, especially in stable heterosexual couples. PMID:25922630
Estimation of tool wear during CNC milling using neural network-based sensor fusion
NASA Astrophysics Data System (ADS)
Ghosh, N.; Ravi, Y. B.; Patra, A.; Mukhopadhyay, S.; Paul, S.; Mohanty, A. R.; Chattopadhyay, A. B.
2007-01-01
Cutting tool wear degrades the product quality in manufacturing processes. Monitoring the tool wear value online is therefore needed to prevent degradation in machining quality. Unfortunately there is no direct way of measuring the tool wear online. Therefore one has to adopt an indirect method wherein the tool wear is estimated from several sensors measuring related process variables. In this work, a neural network-based sensor fusion model has been developed for tool condition monitoring (TCM). Features extracted from a number of machining zone signals, namely cutting forces, spindle vibration, spindle current, and sound pressure level, have been fused to estimate the average flank wear of the main cutting edge. Novel strategies such as signal-level segmentation for temporal registration, feature-space filtering, outlier removal, and estimation-space filtering have been proposed. The proposed approach has been validated by both laboratory and industrial implementations.
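A minimal sketch of the sensor fusion idea, assuming synthetic feature data and a small feed-forward network (scikit-learn's MLPRegressor) rather than the authors' architecture and preprocessing pipeline:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for features extracted from several machining signals
# (cutting force, vibration, spindle current, sound pressure), mapped to
# average flank wear. Data and coefficients are invented.
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 4))                                   # fused features from 4 channels
wear = 0.1 + 0.05 * X[:, 0] + 0.03 * X[:, 2] + 0.01 * rng.normal(size=n)  # flank wear, mm

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
)
model.fit(X[:150], wear[:150])                                # train on the first 150 samples
print("Predicted flank wear (mm):", model.predict(X[150:155]).round(3))
```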
Abramov, Vladimir O; Abramova, Anna V; Bayazitov, Vadim M; Mullakaev, Marat S; Marnosov, Alexandr V; Ildiyakov, Alexandr V
2017-03-01
Reduction of oil viscosity is of great importance for the petroleum industry since it greatly facilitates pipeline transportation of oil. This study analyzes the capability of acoustic waves to decrease the viscosity of oil during its commercial production. Three types of equipment were tested: an ultrasonic emitter that is located directly in the well and affects oil during its production, and two types of acoustic machines to be located at the wellhead and perform acoustic treatment after oil extraction: a setup for ultrasonic hydrodynamic treatment and a flow-through ultrasonic reactor. In our case, the two acoustic machines were rebuilt and tested in the laboratory. The viscosity of oil was measured before and after both types of acoustic treatment, and 2, 24 and 48 h after ultrasonic treatment and 1 and 4 h after hydrodynamic treatment, in order to estimate the persistence of the viscosity reduction. The viscosity reduction achieved by acoustic waves alone was compared to the viscosity reduction achieved by acoustic waves jointly with solvents. It was shown that, regardless of the form of powerful acoustic impact, a long-lasting decrease in viscosity can be obtained only if sonochemical treatment is used. Using sonochemical treatment based on ultrasonic hydrodynamic treatment, a viscosity reduction of 72.46% was achieved. However, the reduction in viscosity by 16%, which was demonstrated using the ultrasonic downhole tool in the well without addition of chemicals, is high enough to facilitate the production of viscous hydrocarbons. Copyright © 2016 Elsevier B.V. All rights reserved.
Health Gain by Salt Reduction in Europe: A Modelling Study
Hendriksen, Marieke A. H.; van Raaij, Joop M. A.; Geleijnse, Johanna M.; Breda, Joao; Boshuizen, Hendriek C.
2015-01-01
Excessive salt intake is associated with hypertension and cardiovascular diseases. Salt intake exceeds the World Health Organization population nutrition goal of 5 grams per day in the European region. We assessed the health impact of salt reduction in nine European countries (Finland, France, Ireland, Italy, Netherlands, Poland, Spain, Sweden and United Kingdom). Through literature research we obtained current salt intake and systolic blood pressure levels for the nine countries. The population health modeling tool DYNAMO-HIA, including country-specific disease data, was used to predict the changes in prevalence of ischemic heart disease and stroke for each country, estimating the effect of salt reduction through its effect on blood pressure levels. A 30% salt reduction would reduce the prevalence of stroke by 6.4% in Finland to 13.5% in Poland. Ischemic heart disease would be decreased by 4.1% in Finland to 8.9% in Poland. When salt intake is reduced to the WHO population nutrient goal, the prevalence of stroke would be reduced by 10.1% in Finland to 23.1% in Poland. Ischemic heart disease would decrease by 6.6% in Finland to 15.5% in Poland. The number of postponed deaths would be 102,100 (0.9%) in France, and 191,300 (2.3%) in Poland. A reduction of salt intake to 5 grams per day is expected to substantially reduce the burden of cardiovascular disease and mortality in several European countries. PMID:25826317
The Toxicity Estimation Software Tool (T.E.S.T.)
The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to estimate toxicological values for aquatic and mammalian species considering acute and chronic endpoints for screening purposes within TSCA and REACH programs.
Are electronic nicotine delivery systems an effective smoking cessation tool?
Lam, Christine; West, Andrew
2015-01-01
Recent studies have estimated that 21% of all deaths over the past decade are due to smoking, making it the leading cause of premature death in Canada. To date, many steps have been taken to eradicate the global epidemic of tobacco smoking. Most recently, electronic nicotine delivery systems (ENDS) have become a popular smoking cessation tool. ENDS do not burn or use tobacco leaves, but instead vapourize a solution the user then inhales. The main constituents of the solution, in addition to nicotine when nicotine is present, are propylene glycol, with or without glycerol and flavouring agents. Currently, ENDS are not regulated, and have become a controversial topic. To determine whether ENDS are an effective smoking cessation tool. A systematic literature search was conducted in February 2015 using the following databases: PubMed, Scopus and Web of Science Core Collection. Randomized controlled trials were the only publications included in the search. A secondary search was conducted by reviewing the references of relevant publications. After conducting the primary and secondary search, 109 publications were identified. After applying all inclusion and exclusion criteria through abstract and full-text review, four publications were included in the present literature review. A low risk of bias was established for each included study using the Cochrane Collaboration risk of bias evaluation framework. The primary outcome measured in all studies was self-reported abstinence or reduction from smoking. In three of the four studies, self-reported abstinence or reduction from smoking was verified by measuring exhaled carbon monoxide. In the remaining study, the primary outcome measured was self-reported and measured desire to smoke. All four studies showed promise that ENDS are an effective smoking cessation tool. While all publications included in the present review revealed that ENDS are an effective smoking cessation aid, further evaluation of the potential health effects of long-term ENDS use remains vital.
ELER software - a new tool for urban earthquake loss assessment
NASA Astrophysics Data System (ADS)
Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.
2010-12-01
Rapid loss estimation after potentially damaging earthquakes is critical for effective emergency response and public information. A methodology and software package, ELER-Earthquake Loss Estimation Routine, for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region was developed under the Joint Research Activity-3 (JRA3) of the EC FP6 Project entitled "Network of Research Infrastructures for European Seismology-NERIES". Recently, a new version (v2.0) of ELER software has been released. The multi-level methodology developed is capable of incorporating regional variability and uncertainty originating from ground motion predictions, fault finiteness, site modifications, inventory of physical and social elements subjected to earthquake hazard and the associated vulnerability relationships. Although primarily intended for quasi real-time estimation of earthquake shaking and losses, the routine is also equally capable of incorporating scenario-based earthquake loss assessments. This paper introduces the urban earthquake loss assessment module (Level 2) of the ELER software which makes use of the most detailed inventory databases of physical and social elements at risk in combination with the analytical vulnerability relationships and building damage-related casualty vulnerability models for the estimation of building damage and casualty distributions, respectively. Spectral capacity-based loss assessment methodology and its vital components are presented. The analysis methods of the Level 2 module, i.e. Capacity Spectrum Method (ATC-40, 1996), Modified Acceleration-Displacement Response Spectrum Method (FEMA 440, 2005), Reduction Factor Method (Fajfar, 2000) and Coefficient Method (ASCE 41-06, 2006), are applied to the selected building types for validation and verification purposes. The damage estimates are compared to the results obtained from the other studies available in the literature, i.e. SELENA v4.0 (Molina et al., 2008) and ATC-55 (Yang, 2005). An urban loss assessment exercise for a scenario earthquake for the city of Istanbul is conducted and physical and social losses are presented. Damage to the urban environment is compared to the results obtained from similar software, i.e. KOERILoss (KOERI, 2002) and DBELA (Crowley et al., 2004). The European rapid loss estimation tool is expected to help enable effective emergency response, on both local and global level, as well as public information.
Costs and benefits of tool-use on the perception of reachable space.
Bourgeois, Jérémy; Farnè, Alessandro; Coello, Yann
2014-05-01
Previous studies have shown that using a tool modifies in a short time-scale both near-body space perception and arm-length representation in the body schema. However, to date no research has specifically investigated the effect of tool-use on an action-related perceptual task. We report here a study assessing the effect of tool-use on the perception of reachable space for perceptual estimates made in reference to either the tool or the hand. Using the tool on distal objects resulted in an extension of perceived reachable space with the tool and reduced the variability of reachability estimates. Tool use also extended perceived reachable space with the hand, but with a concomitant increase of the variability of reachability estimates. These findings suggest that tool incorporation into the represented arm following tool-use improves the anticipation of action possibilities with the tool, while hand representation becomes less accurate. Copyright © 2014 Elsevier B.V. All rights reserved.
TEST (Toxicity Estimation Software Tool) Ver 4.1
The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to allow users to easily estimate toxicity and physical properties using a variety of QSAR methodologies. T.E.S.T allows a user to estimate toxicity without requiring any external programs. Users can input a chem...
Cooper, Jennifer N; Lodwick, Daniel L; Adler, Brent; Lee, Choonsik; Minneci, Peter C; Deans, Katherine J
2017-06-01
Computed tomography (CT) is a widely used diagnostic tool in pediatric medicine. However, due to concerns regarding radiation exposure, it is essential to identify patient characteristics associated with higher radiation burden from CT imaging, in order to more effectively target efforts towards dose reduction. Our objective was to identify the effects of various demographic and clinical patient characteristics on radiation exposure from single abdomen/pelvis CT scans in children. CT scans performed at our institution between January 2013 and August 2015 in patients under 16 years of age were processed using a software tool that estimates patient-specific organ and effective doses and merges these estimates with data from the electronic health record and billing record. Quantile regression models at the 50th, 75th, and 90th percentiles were used to estimate the effects of patients' demographic and clinical characteristics on effective dose. 2390 abdomen/pelvis CT scans (median effective dose 1.52 mSv) were included. Of all characteristics examined, only older age, female gender, higher BMI, and whether the scan was a multiphase exam or an exam that required repeating for movement were significant predictors of higher effective dose at each quantile examined (all p < 0.05). The effects of obesity and multiphase or repeat scanning on effective dose were magnified in higher dose scans. Older age, female gender, obesity, and multiphase or repeat scanning are all associated with increased effective dose from abdomen/pelvis CT. Targeted efforts to reduce dose from abdominal CT in these groups should be undertaken. Copyright © 2017 Elsevier Ltd. All rights reserved.
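The core analysis idea, quantile regression of effective dose on patient characteristics at several percentiles, can be sketched as follows with simulated data; variable names and effect sizes are illustrative and not the study's.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data: effective dose as a function of age, sex, BMI, and whether
# the scan was multiphase. All coefficients and distributions are invented.
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "age": rng.uniform(0, 16, n),
    "female": rng.integers(0, 2, n),
    "bmi": rng.normal(18, 4, n),
    "multiphase": rng.integers(0, 2, n),
})
df["dose_msv"] = (0.5 + 0.05 * df.age + 0.1 * df.female + 0.04 * df.bmi
                  + 0.8 * df.multiphase + rng.gamma(2, 0.3, n))

# Quantile regression at the 50th, 75th, and 90th percentiles of dose.
for q in (0.50, 0.75, 0.90):
    fit = smf.quantreg("dose_msv ~ age + female + bmi + multiphase", df).fit(q=q)
    print(f"q={q:.2f}  multiphase effect = {fit.params['multiphase']:.2f} mSv")
```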
Toxicity Estimation Software Tool (TEST)
The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...
NASA Astrophysics Data System (ADS)
Kiamehr, Saeed; Ahmed, Hesham; Viswanathan, Nurni; Seetharaman, Seshadri
2017-06-01
Knowledge of the changes in effective thermal diffusivity of systems undergoing reactions in which heat transfer plays an important role in the reaction kinetics is essential for process understanding and control. The carbothermic reduction of magnetite-containing composites is a typical example of such a system. The reduction process in this case is highly endothermic and hence the overall rate of the reaction is greatly influenced by heat transfer through the composite compact. Using the Laser-Flash method, the change of effective thermal diffusivity of a magnetite-graphite composite pellet was monitored in dynamic mode over a pre-defined thermal cycle (heating at a rate of 7 K/min to 1423 K (1150 °C), holding the sample for 270 minutes at this temperature and then cooling it to room temperature at the same rate). These measurements were supplemented by Thermogravimetric Analysis under comparable experimental conditions as well as quenching tests of the samples, in order to combine the impact of various factors such as sample dilatations and changes in apparent density on the progress of the reaction. The present results show that monitoring thermal diffusivity changes during the course of reduction is a very useful tool for a complete understanding of the underlying physicochemical phenomena. Finally, an effort is made to estimate the apparent thermal conductivity values based on the measured thermal diffusivity and dilatations.
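The final step mentioned above typically uses the relation k = alpha * rho * c_p; the sketch below applies it with placeholder values, since the study's measured densities and heat capacities are not given in the abstract.

```python
# Apparent thermal conductivity from measured thermal diffusivity:
# k = alpha * rho * c_p. Values below are placeholders, not the study's
# measurements; in practice rho changes with swelling/shrinkage of the compact
# and c_p with composition and temperature.
alpha = 4.0e-7      # thermal diffusivity, m^2/s (illustrative)
rho = 3200.0        # apparent density of the composite pellet, kg/m^3 (illustrative)
cp = 900.0          # specific heat capacity, J/(kg*K) (illustrative)

k = alpha * rho * cp
print(f"Apparent thermal conductivity: {k:.2f} W/(m*K)")
```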
ExMC Work Prioritization Process
NASA Technical Reports Server (NTRS)
Simon, Matthew
2015-01-01
Last year, NASA's Human Research Program (HRP) introduced the concept of a "Path to Risk Reduction" (PRR), which will provide a roadmap that shows how the work being done within each HRP element can be mapped to reducing or closing exploration risks. Efforts are currently underway within the Exploration Medical Capability (ExMC) Element to develop a structured, repeatable process for prioritizing work utilizing decision analysis techniques and risk estimation tools. The goal of this effort is to ensure that the work done within the element maximizes risk reduction for future exploration missions in a quantifiable way and better aligns with the intent and content of the Path to Risk Reduction. The Integrated Medical Model (IMM) will be used to identify those conditions that are major contributors of medical risk for a given design reference mission. For each of these conditions, potential prevention, screening, diagnosis, and treatment methods will be identified. ExMC will then aim to prioritize its potential investments in these mitigation methods based upon their potential for risk reduction and other factors such as vehicle performance impacts, near term schedule needs, duplication with external efforts, and cost. This presentation will describe the process developed to perform this prioritization and inform investment discussions in future element planning efforts. It will also provide an overview of the required input information, types of process participants, figures of merit, and the expected outputs of the process.
Targeting Forest Management through Fire and Erosion Modeling
NASA Astrophysics Data System (ADS)
Elliot, William J.; Miller, Mary Ellen; MacDonald, Lee H.
2013-04-01
Forests deliver a number of ecosystem services, including clean water. When forests are disturbed by wildfire, the timing and quantity of runoff can be altered, and the quality can be severely degraded. A modeling study for about 1500 km2 in the Upper Mokelumne River Watershed in California was conducted to determine the risk of wildfire and the associated potential sediment delivery should a wildfire occur, and to calculate the potential reduction in sediment delivery that might result from fuel reduction treatments. The first step was to predict wildfire severity and probability of occurrence under current vegetation conditions with FlamMap fire prediction tool. FlamMap uses current vegetation, topography, and wind characteristics to predict the speed, flame length, and direction of a simulated flame front for each 30-m pixel. As the first step in the erosion modeling, a geospatial interface for the WEPP model (GeoWEPP) was used to delineate approximately 6-ha hillslope polygons for the study area. The flame length values from FlamMap were then aggregated for each hillslope polygon to yield a predicted fire intensity. Fire intensity and pre-fire vegetation conditions were used to estimate fire severity (either unburned, low, moderate or high). The fire severity was combined with soil properties from the STATSGO database to build the vegetation and soil files needed to run WEPP for each polygon. Eight different stochastic climates were generated to account for the weather variability within the basin. A modified batching version of GeoWEPP was used to predict the first-year post-fire sediment yield from each hillslope and subwatershed. Estimated sediment yields ranged from 0 to more than 100 Mg/ha, and were typical of observed values. The polygons that generated the greatest amount of sediment or that were critical for reducing fire spread were identified, and these were "treated" by reducing the amount of fuel available for a wildfire. The erosion associated with these fuel treatments was estimated using WEPP. FlamMap and WEPP were run a second time to determine the extent to which the imposed treatments reduced fire intensity, fire severity, and the predicted sediment yields. The results allowed managers to quantify the net reduction in sediment delivery due to the prescribed treatments. The modeling also identified those polygons with the greatest net decline in sediment delivery, with the expectation that these polygons would have the highest priority for fuel reduction treatments. An economic value can be assigned to the predicted net change in sediment delivered to a reservoir or a specified decline in water quality. The estimated avoided costs due to the reduction in sediment delivery can help justify the optimized fuel treatments.
Optimal Measurement Interval for Emergency Department Crowding Estimation Tools.
Wang, Hao; Ojha, Rohit P; Robinson, Richard D; Jackson, Bradford E; Shaikh, Sajid A; Cowden, Chad D; Shyamanand, Rath; Leuck, JoAnna; Schrader, Chet D; Zenarosa, Nestor R
2017-11-01
Emergency department (ED) crowding is a barrier to timely care. Several crowding estimation tools have been developed to facilitate early identification of and intervention for crowding. Nevertheless, the ideal frequency for measuring ED crowding with these tools is unclear. Short intervals may be resource intensive, whereas long ones may not be suitable for early identification. Therefore, we aim to assess whether outcomes vary by measurement interval for 4 crowding estimation tools. Our eligible population included all patients between July 1, 2015, and June 30, 2016, who were admitted to the JPS Health Network ED, which serves an urban population. We generated 1-, 2-, 3-, and 4-hour ED crowding scores for each patient, using 4 crowding estimation tools (National Emergency Department Overcrowding Scale [NEDOCS], Severely Overcrowded, Overcrowded, and Not Overcrowded Estimation Tool [SONET], Emergency Department Work Index [EDWIN], and ED Occupancy Rate). Our outcomes of interest included ED length of stay (minutes) and left without being seen or eloped within 4 hours. We used accelerated failure time models to estimate interval-specific time ratios and corresponding 95% confidence limits for length of stay, in which the 1-hour interval was the reference. In addition, we used binomial regression with a log link to estimate risk ratios (RRs) and corresponding confidence limits for left without being seen. Our study population comprised 117,442 patients. The time ratios for length of stay were similar across intervals for each crowding estimation tool (time ratio=1.37 to 1.30 for NEDOCS, 1.44 to 1.37 for SONET, 1.32 to 1.27 for EDWIN, and 1.28 to 1.23 for ED Occupancy Rate). The RRs for left without being seen were also similar across intervals for each tool (RR=2.92 to 2.56 for NEDOCS, 3.61 to 3.36 for SONET, 2.65 to 2.40 for EDWIN, and 2.44 to 2.14 for ED Occupancy Rate). Our findings suggest limited variation in length of stay or left without being seen between intervals (1 to 4 hours), regardless of which of the 4 crowding estimation tools was used. Consequently, 4 hours may be a reasonable interval for assessing crowding with these tools, which could substantially reduce the burden on ED personnel by requiring less frequent assessment of crowding. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
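A simplified version of the time-ratio analysis, assuming a log-normal accelerated failure time model with no censoring (so it reduces to ordinary least squares on log length of stay) and simulated data rather than the study's crowding scores:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Under a log-normal AFT model without censoring, regressing log(length of
# stay) on a crowding indicator gives a coefficient whose exponential is the
# time ratio. Data below are simulated; the coefficient 0.30 is invented.
rng = np.random.default_rng(2)
n = 2000
crowded = rng.integers(0, 2, n)                                   # 1 = crowded at arrival
los_min = np.exp(5.0 + 0.30 * crowded + rng.normal(0, 0.5, n))    # length of stay, minutes

df = pd.DataFrame({"crowded": crowded, "log_los": np.log(los_min)})
fit = smf.ols("log_los ~ crowded", df).fit()
time_ratio = np.exp(fit.params["crowded"])
print(f"Estimated time ratio for crowding: {time_ratio:.2f}")     # ~1.35
```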
Mathematical Modeling to Reduce Waste of Compounded Sterile Products in Hospital Pharmacies
Dobson, Gregory; Haas, Curtis E.; Tilson, David
2014-01-01
In recent years, many US hospitals embarked on “lean” projects to reduce waste. One advantage of the lean operational improvement methodology is that it relies on process observation by those engaged in the work and requires relatively little data. However, the thoughtful analysis of the data captured by operational systems allows the modeling of many potential process options. Such models permit the evaluation of likely waste reductions and financial savings before actual process changes are made. Thus the most promising options can be identified prospectively, change efforts targeted accordingly, and realistic targets set. This article provides one example of such a data-driven process redesign project focusing on waste reduction in an in-hospital pharmacy. A mathematical model of the medication prepared and delivered by the pharmacy is used to estimate the savings from several potential redesign options (rescheduling the start of production, scheduling multiple batches, or reordering production within a batch) as well as the impact of information system enhancements. The key finding is that mathematical modeling can indeed be a useful tool. In one hospital setting, it estimated that waste could realistically be reduced by around 50% by using several process changes and that the greatest benefit would be gained by rescheduling the start of production (for a single batch) away from the period when most order cancellations are made. PMID:25477580
Estimation of the climate change impact on a catchment water balance using an ensemble of GCMs
NASA Astrophysics Data System (ADS)
Reshmidevi, T. V.; Nagesh Kumar, D.; Mehrotra, R.; Sharma, A.
2018-01-01
This work evaluates the impact of climate change on the water balance of a catchment in India. Rainfall and hydro-meteorological variables for the current period (20C3M scenario, 1981-2000) and two future time periods, the middle of the 21st century (2046-2065) and the end of the century (2081-2100), are simulated using Modified Markov Model-Kernel Density Estimation (MMM-KDE) and k-nearest neighbor downscaling models. Climate projections from an ensemble of 5 GCMs (MPI-ECHAM5, BCCR-BCM2.0, CSIRO-mk3.5, IPSL-CM4, and MRI-CGCM2) are used in this study. Hydrologic simulations for the current as well as future climate scenarios are carried out using the Soil and Water Assessment Tool (SWAT) integrated with ArcGIS (ArcSWAT v.2009). The results show a marginal reduction in runoff ratio, annual streamflow and groundwater recharge towards the end of the century. Increases in temperature and evapotranspiration project an increase in irrigation demand towards the end of the century. Rainfall projections for the future show a marginal increase in annual average rainfall. Short and moderate wet spells are projected to decrease, whereas short and moderate dry spells are projected to increase in the future. The projected reduction in streamflow and groundwater recharge, along with the increase in irrigation demand, is likely to aggravate water stress in the region under the future scenario.
InaSAFE applications in disaster preparedness
NASA Astrophysics Data System (ADS)
Pranantyo, Ignatius Ryan; Fadmastuti, Mahardika; Chandra, Fredy
2015-04-01
Disaster preparedness activities aim to reduce the impact of disasters by being better prepared to respond when a disaster occurs. In order to better anticipate requirements during a disaster, contingency planning activities can be undertaken prior to a disaster based on a realistic disaster scenario. InaSAFE is a tool that can inform this process. InaSAFE is free and open source software that estimates the impact to people and infrastructure from potential hazard scenarios. By using InaSAFE, disaster managers can develop scenarios of disaster impacts (people and infrastructure affected) to inform their contingency plan and emergency response operation plan. While InaSAFE provides the software framework, exposure data and hazard data are needed as inputs to run the software. InaSAFE can then be used to forecast the impact of the hazard scenario on the exposure data. InaSAFE outputs include estimates of the number of people, buildings and roads affected, a list of minimum needs (rice and clean water), and a response checklist. InaSAFE is developed by Indonesia's National Disaster Management Agency (BNPB) and the Australian Government, through the Australia-Indonesia Facility for Disaster Reduction (AIFDR), in partnership with the World Bank - Global Facility for Disaster Reduction and Recovery (GFDRR). This software has been used in many parts of Indonesia, including Padang, Maumere, Jakarta, and Slamet Mountain, for emergency response and contingency planning.
Lives saved from malaria prevention in Africa--evidence to sustain cost-effective gains.
Korenromp, Eline L
2012-03-28
Lives saved have become a standard metric to express health benefits across interventions and diseases. Recent estimates of malaria-attributable under-five deaths prevented using the Lives Saved Tool (LiST), extrapolating effectiveness estimates from community-randomized trials of scale-up of insecticide-treated nets (ITNs) in the 1990s, confirm the substantial impact and good cost-effectiveness that ITNs have achieved in high-endemic sub-Saharan Africa. An even higher cost-effectiveness would likely have been found if the modelling had included the additional indirect mortality impact of ITNs on preventing deaths from other common child illnesses, to which malaria contributes as a risk factor. As conventional ITNs are being replaced by long-lasting insecticidal nets and scale-up is expanded to target universal coverage for full, all-age populations at risk, enhanced transmission reduction may, above certain thresholds, enhance the mortality impact beyond that observed in the trials of the 1990s. On the other hand, lives saved by ITNs might fall if improved malaria case management with artemisinin-based combination therapy averts the deaths that ITNs would otherwise prevent. Validation and updating of LiST's simple assumption of a universal, fixed coverage-to-mortality-reduction ratio will require enhanced national programme and impact monitoring and evaluation. Key indicators for time trend analysis include malaria-related mortality from population-based surveys and vital registration, vector control and treatment coverage from surveys, and parasitologically-confirmed malaria cases and deaths recorded in health facilities. Indispensable is triangulation with dynamic transmission models, fitted to long-term trend data on vector, parasite and human populations over successive phases of malaria control and elimination. Sound, locally optimized budget allocation, including for monitoring and evaluation priorities, will benefit greatly if policy makers and programme planners use planning tools such as LiST - even when predictions are less certain than often understood. The ultimate success of LiST for supporting malaria prevention may be to prove its linear predictions less and less relevant.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenzel, Tom P.
This report recalculates the estimated relationship between vehicle mass and societal fatality risk, using alternative groupings by vehicle weight, to test whether the trend of decreasing fatality risk from mass reduction as case vehicle mass increases holds over smaller increments of the range in case vehicle masses. The NHTSA baseline regression model estimates the relationship using two weight groups for cars and light trucks; we re-estimated the mass reduction coefficients using four, six, and eight bins of vehicle mass. The estimated effect of mass reduction on societal fatality risk was not consistent over the range in vehicle masses in these weight bins. These results suggest that the relationship indicated by the NHTSA baseline model is a result of other, unmeasured attributes of the mix of vehicles in the lighter vs. heavier weight bins, and not necessarily the result of a correlation between mass reduction and societal fatality risk. An analysis of the average vehicle, driver, and crash characteristics across the various weight groupings did not reveal any strong trends that might explain the lack of a consistent trend of decreasing fatality risk from mass reduction in heavier vehicles.
Consequent use of IT tools as a driver for cost reduction and quality improvements
NASA Astrophysics Data System (ADS)
Hein, Stefan; Rapp, Roberto; Feustel, Andreas
2013-10-01
The semiconductor industry puts a lot of effort into cost reductions and quality improvements. The consequent use of IT tools is one way to support these goals. With the extension of its 150 mm fab to 200 mm, Robert Bosch increased the systematic use of data analysis and Advanced Process Control (APC).
2001-07-21
APPENDIX A. ACRONYMS: ACCES, Attenuating Custom Communication Earpiece System; ACEIT, Automated Cost Estimating Integrated Tools; AFSC, Air Force...documented in the ACEIT cost estimating tool developed by Tecolote, Inc. The factor used was 14 percent of PMP. 1.3 System Engineering/Program...The data source is the ASC Aeronautical Engineering Products Cost Factor Handbook, which is documented in the ACEIT cost estimating tool developed
An approach to and web-based tool for infectious disease outbreak intervention analysis
NASA Astrophysics Data System (ADS)
Daughton, Ashlynn R.; Generous, Nicholas; Priedhorsky, Reid; Deshpande, Alina
2017-04-01
Infectious diseases are a leading cause of death globally. Decisions surrounding how to control an infectious disease outbreak currently rely on a subjective process involving surveillance and expert opinion. However, there are many situations where neither may be available. Modeling can fill gaps in the decision making process by using available data to provide quantitative estimates of outbreak trajectories. Effective reduction of the spread of infectious diseases can be achieved through collaboration between the modeling community and public health policy community. However, such collaboration is rare, resulting in a lack of models that meet the needs of the public health community. Here we show a Susceptible-Infectious-Recovered (SIR) model modified to include control measures that allows parameter ranges, rather than parameter point estimates, and includes a web user interface for broad adoption. We apply the model to three diseases, measles, norovirus and influenza, to show the feasibility of its use and describe a research agenda to further promote interactions between decision makers and the modeling community.
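A minimal sketch of the modeling idea described above: a standard SIR model in which the transmission rate is reduced by a control measure after a chosen start day. Parameter values are illustrative and this is not the authors' web tool.

```python
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta0, gamma, t_start, reduction):
    """SIR dynamics with the transmission rate cut by `reduction` after t_start."""
    S, I, R = y
    beta = beta0 * (1 - reduction) if t >= t_start else beta0
    N = S + I + R
    dS = -beta * S * I / N
    dI = beta * S * I / N - gamma * I
    dR = gamma * I
    return dS, dI, dR

N = 10_000
y0 = (N - 10, 10, 0)                    # initial S, I, R (illustrative)
t = np.linspace(0, 120, 121)            # days

# Run with and without a 40% transmission reduction starting on day 30.
no_ctrl = odeint(sir, y0, t, args=(0.3, 0.1, np.inf, 0.0))
ctrl = odeint(sir, y0, t, args=(0.3, 0.1, 30.0, 0.4))

print("Peak infections without control:", int(no_ctrl[:, 1].max()))
print("Peak infections with control:   ", int(ctrl[:, 1].max()))
```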
NASA Technical Reports Server (NTRS)
Jack, John; Kwan, Eric; Wood, Milana
2011-01-01
PRICE H was introduced into the JPL cost estimation tool set circa 2003. It became more widely available at JPL when IPAO funded the NASA-wide site license for all NASA centers. PRICE H was mainly used as one of the cost tools to validate proposal grassroots cost estimates. Program offices at JPL view PRICE H as an additional crosscheck to Team X (JPL Concurrent Engineering Design Center) estimates. PRICE H became widely accepted ca. 2007 at JPL when the program offices moved away from grassroots cost estimation for Step 1 proposals. PRICE H is now one of the key cost tools used for cost validation, cost trades, and independent cost estimates.
Efficient computation of parameter sensitivities of discrete stochastic chemical reaction networks.
Rathinam, Muruhan; Sheppard, Patrick W; Khammash, Mustafa
2010-01-21
Parametric sensitivity of biochemical networks is an indispensable tool for studying system robustness properties, estimating network parameters, and identifying targets for drug therapy. For discrete stochastic representations of biochemical networks where Monte Carlo methods are commonly used, sensitivity analysis can be particularly challenging, as accurate finite difference computations of sensitivity require a large number of simulations for both nominal and perturbed values of the parameters. In this paper we introduce the common random number (CRN) method in conjunction with Gillespie's stochastic simulation algorithm, which exploits positive correlations obtained by using CRNs for nominal and perturbed parameters. We also propose a new method called the common reaction path (CRP) method, which uses CRNs together with the random time change representation of discrete state Markov processes due to Kurtz to estimate the sensitivity via a finite difference approximation applied to coupled reaction paths that emerge naturally in this representation. While both methods reduce the variance of the estimator significantly compared to independent random number finite difference implementations, numerical evidence suggests that the CRP method achieves a greater variance reduction. We also provide some theoretical basis for the superior performance of CRP. The improved accuracy of these methods allows for much more efficient sensitivity estimation. In two example systems reported in this work, speedup factors greater than 300 and 10,000 are demonstrated.
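The common random number idea can be illustrated on a toy birth-death process: nominal and perturbed Gillespie simulations reuse the same random stream, which sharply reduces the variance of the finite-difference sensitivity estimate. This sketch is a stand-in for the paper's networks and does not implement the common reaction path method.

```python
import numpy as np

def gillespie_count(k, g, T, rng):
    """Molecule count at time T for a birth-death process (birth k, decay g*x) via SSA."""
    t, x = 0.0, 0
    while True:
        a1, a2 = k, g * x
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)                 # time to next reaction
        if t > T:
            return x
        x += 1 if rng.random() < a1 / a0 else -1       # choose birth or death

def fd_sensitivity(k, g, T, dk, n, common_rng):
    """Finite-difference estimate of d E[X(T)]/dk from n simulation pairs."""
    diffs = np.empty(n)
    for i in range(n):
        # With common random numbers, both trajectories reuse the same stream (same seed).
        rng_nom = np.random.default_rng(i)
        rng_per = np.random.default_rng(i) if common_rng else np.random.default_rng(n + i)
        diffs[i] = (gillespie_count(k + dk, g, T, rng_per)
                    - gillespie_count(k, g, T, rng_nom)) / dk
    return diffs.mean(), diffs.std(ddof=1) / np.sqrt(n)

# True sensitivity of the mean at T=5 is (1 - exp(-g*T))/g ~ 0.99 for g = 1.
for common in (False, True):
    est, se = fd_sensitivity(k=10.0, g=1.0, T=5.0, dk=0.5, n=500, common_rng=common)
    print(f"common random numbers={common}: sensitivity ~ {est:.2f} (std. err. {se:.2f})")
```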
Extracting galactic structure parameters from multivariated density estimation
NASA Technical Reports Server (NTRS)
Chen, B.; Creze, M.; Robin, A.; Bienayme, O.
1992-01-01
Multivariate statistical analysis, including cluster analysis (unsupervised classification), discriminant analysis (supervised classification) and principal component analysis (a dimensionality reduction method), together with nonparametric density estimation, has been successfully used to search for meaningful associations in the 5-dimensional space of observables between observed points and sets of simulated points generated from a synthetic approach to galaxy modelling. These methodologies can be applied as new tools to obtain information about hidden structure that is otherwise unrecognizable, and to place important constraints on the space distribution of various stellar populations in the Milky Way. In this paper, we concentrate on illustrating how to use nonparametric density estimation to substitute for the true densities of both the simulated sample and the real sample in the five-dimensional space. In order to fit model-predicted densities to reality, we derive a set of n equations (where n is the total number of observed points) in m unknown parameters (where m is the number of predefined groups). A least-squares estimation then allows us to determine the density laws of the different groups and components in the Galaxy. The output from our software, which can be used in many research fields, also reports the systematic error between the model and the observations using a Bayes rule.
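A one-dimensional toy version of the fitting scheme, assuming kernel density estimates for each simulated population group and non-negative least squares for the group weights (the paper works in the 5-dimensional observable space and with its own estimators):

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.optimize import nnls

rng = np.random.default_rng(3)

# Simulated samples for two hypothetical population components.
pop_a = rng.normal(0.0, 1.0, 2000)     # e.g. a "thin disc"-like component (invented)
pop_b = rng.normal(3.0, 1.5, 2000)     # e.g. a "thick disc"-like component (invented)

# "Observed" sample: a 70/30 mixture of the two components.
obs = np.concatenate([rng.normal(0.0, 1.0, 700), rng.normal(3.0, 1.5, 300)])

# Kernel density estimates evaluated on a common grid.
grid = np.linspace(-4, 8, 200)
dens_obs = gaussian_kde(obs)(grid)
design = np.column_stack([gaussian_kde(pop_a)(grid), gaussian_kde(pop_b)(grid)])

# Least-squares fit of the group weights (constrained to be non-negative).
weights, _ = nnls(design, dens_obs)
weights /= weights.sum()
print("Recovered mixture weights:", weights.round(2))   # ~ [0.70, 0.30]
```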
Power-Production Diagnostic Tools for Low-Density Wind Farms with Applications to Wake Steering
NASA Astrophysics Data System (ADS)
Takle, E. S.; Herzmann, D.; Rajewski, D. A.; Lundquist, J. K.; Rhodes, M. E.
2016-12-01
Hansen (2011) provided guidelines for wind farm wake analysis with applications to "high-density" wind farms (where the average distance between turbines is less than ten times the rotor diameter). For "low-density" wind farms (average distance greater than fifteen times the rotor diameter), or sections of wind farms, we demonstrate simpler sorting and visualization tools that reveal wake interactions and opportunities for wind farm power prediction and wake steering. SCADA data from a segment of a large mid-continent wind farm, together with surface flux measurements and lidar data, are subjected to analysis and visualization of wake interactions. A time-history animated visualization of a plan view of the power level of individual turbines provides a quick analysis of wake interaction dynamics. Yaw-based sectoral histograms of the enhancement or decline of wind speed and power from wind farm reference levels reveal the angular width of wake interactions and identify the turbine(s) responsible for the power reduction. Concurrent surface flux measurements within the wind farm allowed us to evaluate the stability influence on wake loss. A one-season climatology is used to identify high-priority candidates for wake steering based on estimated power recovery. Typical clearing prices on the day-ahead market are used to estimate the added value of wake steering. Current research is exploring options for identifying candidate locations for wind farm "build-in" in existing low-density wind farms.
Umbrello, Michele; Mistraletti, Giovanni; Corbella, Davide; Cigada, Marco; Salini, Silvia; Morabito, Alberto; Iapichino, Gaetano
2012-12-01
Within the evidence-based medicine paradigm, randomized controlled trials represent the "gold standard" for producing reliable evidence. However, planning and implementing randomized controlled trials in critical care medicine is limited by intrinsic and structural problems. As a consequence, observational studies still occur frequently. In these cases, the propensity score (PS), the probability of receiving a treatment conditional on observed covariates, is an increasingly used technique to adjust the results. Few studies have addressed the specific issue of a PS correction in repeated-measures designs. Three techniques for correcting the analysis of nonrandomized designs (matching, stratification, regression adjustment) are presented in tutorial form and applied to a real case study: the comparison between intravenous and enteral sedative therapy in the intensive care unit setting. After showing the results before and after the use of the PS, we suggest that such a tool partially overcomes the bias associated with the observational nature of the study. It permits correction of the estimates for any observed covariate, whereas unobserved confounders cannot be controlled for. The propensity score represents a useful additional tool for estimating the effects of treatments in nonrandomized studies. In the case study, an enteral sedation approach was as effective as an intravenous regimen, allowing a lighter level of sedation and sparing resources. Copyright © 2012 Elsevier Inc. All rights reserved.
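A minimal sketch of one of the three techniques, PS stratification, using simulated data: the PS is estimated by logistic regression, subjects are grouped into PS quintiles, and within-stratum treatment-control differences are averaged. Covariates, outcome, and effect size are invented and unrelated to the sedation study.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Simulated observational data with confounded treatment assignment.
rng = np.random.default_rng(4)
n = 3000
X = pd.DataFrame({"age": rng.normal(60, 12, n), "saps": rng.normal(40, 10, n)})
treated = rng.binomial(1, 1 / (1 + np.exp(-(0.03 * (X.age - 60) + 0.05 * (X.saps - 40)))))
outcome = 5 + 0.05 * X.age + 0.08 * X.saps - 0.5 * treated + rng.normal(0, 2, n)

# Propensity score: P(treatment | covariates), estimated by logistic regression.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
df = pd.DataFrame({"ps": ps, "treated": treated, "y": outcome})
df["stratum"] = pd.qcut(df.ps, 5, labels=False)        # PS quintiles

# Average of the within-stratum treated-minus-control outcome differences.
strata = df.groupby("stratum").apply(
    lambda g: g.loc[g.treated == 1, "y"].mean() - g.loc[g.treated == 0, "y"].mean()
)
print(f"PS-stratified treatment effect: {strata.mean():.2f} (true effect -0.5)")
```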
Labor estimation by informational objective assessment (LEIOA) for preterm delivery prediction.
Malaina, Iker; Aranburu, Larraitz; Martínez, Luis; Fernández-Llebrez, Luis; Bringas, Carlos; De la Fuente, Ildefonso M; Pérez, Martín Blás; González, Leire; Arana, Itziar; Matorras, Roberto
2018-05-01
To introduce LEIOA, a new screening method to forecast which patients admitted to the hospital because of suspected threatened premature delivery will give birth in < 7 days, so that it can be used to assist in prognosis and treatment jointly with other clinical tools. From 2010 to 2013, 286 tocographies from women with gestational ages between 24 and 37 weeks were collected and studied. We then developed a new predictive model based on uterine contractions, which combines the Generalized Hurst Exponent and the Approximate Entropy by logistic regression (the LEIOA model). We compared it with a model using exclusively obstetric variables, and afterwards joined the two to evaluate the gain. Finally, a cross-validation was performed. The combination of LEIOA with the medical model resulted in an increase (on average) of 12% in predictive values with respect to the medical model alone, giving a sensitivity of 0.937, a specificity of 0.747, a positive predictive value of 0.907 and a negative predictive value of 0.819. In addition, adding LEIOA reduced the percentage of cases incorrectly classified by the medical model by almost 50%. Given the significant increase in predictive parameters and the reduction in incorrectly classified cases when LEIOA was combined with the medical variables, we conclude that it could be a very useful tool to improve the estimation of the immediacy of preterm delivery.
Uncertainty estimation and multi sensor fusion for kinematic laser tracker measurements
NASA Astrophysics Data System (ADS)
Ulrich, Thomas
2013-08-01
Laser trackers are widely used to measure kinematic tasks such as tracking robot movements. Common methods to evaluate the uncertainty in the kinematic measurement include approximations specified by the manufacturers, various analytical adjustment methods and the Kalman filter. In this paper a new, real-time technique is proposed, which estimates the 4D-path (3D-position + time) uncertainty of an arbitrary path in space. Here a hybrid system estimator is applied in conjunction with the kinematic measurement model. This method can be applied to processes that include various types of kinematic behaviour: constant velocity, variable acceleration or variable turn rates. The new approach is compared with the Kalman filter and a manufacturer's approximations. The comparison was made using data obtained by tracking an industrial robot's tool centre point with a Leica laser tracker AT901 and a Leica laser tracker LTD500. It shows that the new approach is more appropriate than the Kalman filter for analysing kinematic processes, as it reduces overshoots and decreases the estimated variance. In comparison with the manufacturer's approximations, the new approach takes account of kinematic behaviour with an improved description of the real measurement process and a reduction in estimated variance. This approach is therefore well suited to the analysis of kinematic processes with unknown changes in kinematic behaviour, as well as to the fusion of measurements from multiple laser trackers.
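For context, the Kalman-filter baseline against which the new hybrid estimator is compared can be sketched as a constant-velocity filter on one tracked coordinate; the process and measurement noise levels below are illustrative assumptions, and this is not the proposed 4D-path estimator.

```python
# Minimal sketch: constant-velocity Kalman filter for one coordinate of a
# tracked point (the baseline the paper compares against, not the proposed
# hybrid estimator).  Noise levels dt, q, r are illustrative assumptions.
import numpy as np

def kalman_cv(z, dt=0.01, q=1e-3, r=1e-4):
    """Filter a sequence of position measurements z; return smoothed positions."""
    F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])                 # only position is measured
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    R = np.array([[r]])
    x, P, out = np.zeros(2), np.eye(2), []
    for zk in z:
        x, P = F @ x, F @ P @ F.T + Q                   # predict
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
        x = x + K @ (np.array([zk]) - H @ x)            # update state
        P = (np.eye(2) - K @ H) @ P                     # update covariance
        out.append(x[0])
    return np.array(out)
```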
Opti-Tool: EPA Region 1's Stormwater Management Optimization Tool
Opti-Tool assists stormwater managers and consulting engineers in preparing technically sound and cost-effective watershed stormwater management plans to achieve needed pollutant and volume reductions more affordably from developed landscapes throughout the New England Region.
Le Menach, Arnaud; Takala, Shannon; McKenzie, F Ellis; Perisse, Andre; Harris, Anthony; Flahault, Antoine; Smith, David L
2007-01-25
Insecticide Treated Nets (ITNs) are an important tool for malaria control. ITNs are effective because they work on several parts of the mosquito feeding cycle, including both adult killing and repelling effects. Using an elaborated description of the classic feeding cycle model, simple formulas have been derived to describe how ITNs change mosquito behaviour and the intensity of malaria transmission, as summarized by vectorial capacity and EIR. The predicted changes are illustrated as a function of the frequency of ITN use for four different vector populations using parameter estimates from the literature. The model demonstrates that ITNs simultaneously reduce mosquitoes' lifespans, lengthen the feeding cycle, and, by discouraging human biting, divert more bites onto non-human hosts. ITNs can substantially reduce vectorial capacity through small changes to all of these quantities. The total reductions in vectorial capacity differ, moreover, depending on baseline behavior in the absence of ITNs. Reductions in lifespan and vectorial capacity are strongest for vector species with high baseline survival. Anthropophilic and zoophilic species are affected differently by ITNs; the feeding cycle is lengthened more for anthropophilic species, and the proportion of bites that are diverted onto non-human hosts is higher for zoophilic species. This model suggests that the efficacy of ITNs should be measured as a total reduction in transmission intensity, and that the quantitative effects will differ by species and by transmission intensity. At very high rates of use, ITNs can generate very large reductions in transmission intensity, and effective malaria control in some areas, especially when used in combination with other control measures. At high EIR, ITNs will probably not substantially reduce the parasite rate, but when transmission intensity is low, reductions in vectorial capacity combine with reductions in the parasite rate to generate very large reductions in EIR.
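For reference, the classic feeding-cycle quantity that the elaborated model modifies is the Garrett-Jones vectorial capacity; the expression below is the standard textbook form, shown as a sketch rather than the paper's derived ITN-dependent formulas.

```latex
% Classic feeding-cycle vectorial capacity (Garrett-Jones); the paper's
% elaborated model re-expresses these quantities as functions of ITN coverage.
C = \frac{m a^{2} p^{n}}{-\ln p}
% m: mosquitoes per human, a: human-biting rate per mosquito per day,
% p: daily survival probability, n: extrinsic incubation period (days).
% ITNs lower a and p (shorter lifespan, diverted bites) and lengthen the
% feeding cycle, so C falls multiplicatively through each term.
```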
ToxPredictor: a Toxicity Estimation Software Tool
The Computational Toxicology Team within the National Risk Management Research Laboratory has developed a software tool that will allow the user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be ac...
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Shaughnessy, Eric; Heeter, Jenny; Keyser, David
Cities are increasingly taking actions such as building code enforcement, urban planning, and public transit expansion to reduce emissions of carbon dioxide in their communities and municipal operations. However, many cities lack the quantitative information needed to estimate policy impacts and prioritize city actions in terms of carbon abatement potential and cost effectiveness. This report fills this research gap by providing methodologies to assess the carbon abatement potential of a variety of city actions. The methodologies are applied to an energy use data set of 23,458 cities compiled for the U.S. Department of Energy’s City Energy Profile tool. The analysis estimates the national carbon abatement potential of the most commonly implemented actions in six specific policy areas. The results of this analysis suggest that, in aggregate, cities could reduce nationwide carbon emissions by about 210 million metric tons of carbon dioxide (MMT CO2) per year in a "moderate abatement scenario" by 2035 and 480 MMT CO2/year in a "high abatement scenario" by 2035 through these common actions typically within a city’s control in the six policy areas. The aggregate carbon abatement potential of these specific areas equates to a reduction of 3%-7% relative to 2013 U.S. emissions. At the city level, the results suggest the average city could reduce carbon emissions by 7% (moderate) to 19% (high) relative to current city-level emissions. City carbon abatement potential is sensitive to national and state policies that affect the carbon intensity of electricity and transportation. Specifically, the U.S. Clean Power Plan and further renewable energy cost reductions could reduce city carbon emissions overall, helping cities achieve their carbon reduction goals.
Ruhago, George M; Ngalesoni, Frida N; Norheim, Ole F
2012-12-27
Inequity in access to and use of child and maternal health interventions is impeding progress towards the maternal and child health Millennium Development Goals. This study explores the potential health gains and equity impact if a set of priority interventions for mothers and under-fives were scaled up to reach national universal coverage targets for MDGs in Tanzania. We used the Lives Saved Tool (LiST) to estimate potential reductions in maternal and child mortality and the number of lives saved across wealth quintiles and between rural and urban settings. High impact maternal and child health interventions were modelled for a five-year scale up, by linking intervention coverage, effectiveness and cause of mortality using data from Tanzania. Concentration curves were drawn and the concentration index estimated to measure the equity impact of the scale up. In the poorest population quintiles in Tanzania, the lives of more than twice as many mothers and under-fives were likely to be saved, compared to the richest quintile. Scaling up coverage to equal levels across quintiles would reduce inequality in maternal and child mortality from a pro-rich concentration index of -0.11 (maternal) and -0.12 (children) to a more equitable concentration index of -0.03 and -0.03, respectively. In rural areas, there would likely be an eight times greater reduction in maternal deaths and a five times greater reduction in child deaths than in urban areas. Scaling up priority maternal and child health interventions to equal levels would potentially save far more lives in the poorest populations, and would accelerate equitable progress towards maternal and child health MDGs.
2012-01-01
Background Inequity in access to and use of child and maternal health interventions is impeding progress towards the maternal and child health Millennium Development Goals. This study explores the potential health gains and equity impact if a set of priority interventions for mothers and under-fives were scaled up to reach national universal coverage targets for MDGs in Tanzania. Methods We used the Lives Saved Tool (LiST) to estimate potential reductions in maternal and child mortality and the number of lives saved across wealth quintiles and between rural and urban settings. High impact maternal and child health interventions were modelled for a five-year scale up, by linking intervention coverage, effectiveness and cause of mortality using data from Tanzania. Concentration curves were drawn and the concentration index estimated to measure the equity impact of the scale up. Results In the poorest population quintiles in Tanzania, the lives of more than twice as many mothers and under-fives were likely to be saved, compared to the richest quintile. Scaling up coverage to equal levels across quintiles would reduce inequality in maternal and child mortality from a pro-rich concentration index of −0.11 (maternal) and −0.12 (children) to a more equitable concentration index of −0.03 and −0.03, respectively. In rural areas, there would likely be an eight times greater reduction in maternal deaths and a five times greater reduction in child deaths than in urban areas. Conclusions Scaling up priority maternal and child health interventions to equal levels would potentially save far more lives in the poorest populations, and would accelerate equitable progress towards maternal and child health MDGs. PMID:23270489
Urban and Transport Planning Related Exposures and Mortality: A Health Impact Assessment for Cities.
Mueller, Natalie; Rojas-Rueda, David; Basagaña, Xavier; Cirach, Marta; Cole-Hunter, Tom; Dadvand, Payam; Donaire-Gonzalez, David; Foraster, Maria; Gascon, Mireia; Martinez, David; Tonne, Cathryn; Triguero-Mas, Margarita; Valentín, Antònia; Nieuwenhuijsen, Mark
2017-01-01
By 2050, nearly 70% of the global population is projected to live in urban areas. Because the environments we inhabit affect our health, urban and transport designs that promote healthy living are needed. We estimated the number of premature deaths preventable under compliance with international exposure recommendations for physical activity (PA), air pollution, noise, heat, and access to green spaces. We developed and applied the Urban and TranspOrt Planning Health Impact Assessment (UTOPHIA) tool to Barcelona, Spain. Exposure estimates and mortality data were available for 1,357,361 residents. We compared recommended with current exposure levels. We quantified the associations between exposures and mortality and calculated population attributable fractions to estimate the number of premature deaths preventable. We also modeled life-expectancy and economic impacts. We estimated that annually, nearly 20% of mortality could be prevented if international recommendations for performance of PA; exposure to air pollution, noise, and heat; and access to green space were followed. Estimations showed that the greatest portion of preventable deaths was attributable to increases in PA, followed by reductions of exposure to air pollution, traffic noise, and heat. Access to green spaces had smaller effects on mortality. Compliance was estimated to increase the average life expectancy by 360 (95% CI: 219, 493) days and result in economic savings of 9.3 (95% CI: 4.9, 13.2) billion EUR/year. PA factors and environmental exposures can be modified by changes in urban and transport planning. We emphasize the need for a) the reduction of motorized traffic through the promotion of active and public transport and b) the provision of green infrastructure, both of which are suggested to provide opportunities for PA and for mitigation of air pollution, noise, and heat. Citation: Mueller N, Rojas-Rueda D, Basagaña X, Cirach M, Cole-Hunter T, Dadvand P, Donaire-Gonzalez D, Foraster M, Gascon M, Martinez D, Tonne C, Triguero-Mas M, Valentín A, Nieuwenhuijsen M. 2017. Urban and transport planning related exposures and mortality: a health impact assessment for cities. Environ Health Perspect 125:89-96; http://dx.doi.org/10.1289/EHP220.
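The population-attributable-fraction step referred to above follows the standard comparative-risk form; the expressions below are a generic sketch (P_e, RR, and observed deaths D_obs are placeholders), not necessarily the exact UTOPHIA formulation.

```latex
% Standard population attributable fraction used in health impact assessments
% of this kind (a sketch, not necessarily the exact UTOPHIA formulation):
\mathrm{PAF} = \frac{P_e\,(\mathrm{RR}-1)}{1 + P_e\,(\mathrm{RR}-1)},
\qquad
\Delta D = \mathrm{PAF} \times D_{\mathrm{obs}}
% P_e: share of the population exposed beyond the recommended level,
% RR: relative risk of mortality for that exposure,
% D_obs: observed annual deaths; \Delta D: estimated preventable deaths.
```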
Handler, Steven M.; Sharkey, Siobhan S.; Hudak, Sandra; Ouslander, Joseph G.
2012-01-01
A substantial reduction in hospitalization rates has been associated with the implementation of the Interventions to Reduce Acute Care Transfers (INTERACT) quality improvement intervention using the accompanying paper-based clinical practice tools (INTERACT II). There is significant potential to further increase the impact of INTERACT by integrating INTERACT II tools into nursing home (NH) health information technology (HIT) via standalone or integrated clinical decision support (CDS) systems. This article highlights the process of translating INTERACT II tools from paper to NH HIT. The authors believe that widespread dissemination and integration of INTERACT II CDS tools into various NH HIT products could lead to sustainable improvement in resident and clinician process and outcome measures, including enhanced interclinician communication and a reduction in potentially avoidable hospitalizations. PMID:22267955
Doubly Robust and Efficient Estimation of Marginal Structural Models for the Hazard Function
Zheng, Wenjing; Petersen, Maya; van der Laan, Mark
2016-01-01
In social and health sciences, many research questions involve understanding the causal effect of a longitudinal treatment on mortality (or time-to-event outcomes in general). Often, treatment status may change in response to past covariates that are risk factors for mortality, and in turn, treatment status may also affect such subsequent covariates. In these situations, Marginal Structural Models (MSMs), introduced by Robins (1997), are well-established and widely used tools to account for time-varying confounding. In particular, a MSM can be used to specify the intervention-specific counterfactual hazard function, i.e. the hazard for the outcome of a subject in an ideal experiment where he/she was assigned to follow a given intervention on their treatment variables. The parameters of this hazard MSM are traditionally estimated using the Inverse Probability Weighted estimation (IPTW, van der Laan and Petersen (2007), Robins et al. (2000b), Robins (1999), Robins et al. (2008)). This estimator is easy to implement and admits Wald-type confidence intervals. However, its consistency hinges on the correct specification of the treatment allocation probabilities, and the estimates are generally sensitive to large treatment weights (especially in the presence of strong confounding), which are difficult to stabilize for dynamic treatment regimes. In this paper, we present a pooled targeted maximum likelihood estimator (TMLE, van der Laan and Rubin (2006)) for MSM for the hazard function under longitudinal dynamic treatment regimes. The proposed estimator is semiparametric efficient and doubly robust, hence offers bias reduction and efficiency gain over the incumbent IPTW estimator. Moreover, the substitution principle rooted in the TMLE potentially mitigates the sensitivity to large treatment weights in IPTW. We compare the performance of the proposed estimator with the IPTW and a non-targeted substitution estimator in a simulation study. PMID:27227723
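Schematically, the hazard MSM and the inverse-probability weights discussed above take the standard forms shown below; this is a sketch of the generic specification, not the authors' exact model.

```latex
% Schematic hazard MSM and stabilized IPTW weight (generic forms, not the
% authors' exact specification):
\lambda_{\bar a}(t) = \lambda_0(t)\,\exp\!\big(\beta^{\top} \bar a(t)\big)
% counterfactual hazard under treatment history \bar a(t)
w_i(t) = \prod_{k \le t}
  \frac{g^{*}\!\big(A_i(k)\mid \bar A_i(k-1)\big)}
       {g\big(A_i(k)\mid \bar A_i(k-1), \bar L_i(k)\big)}
% stabilized weight: the numerator ignores time-varying covariates \bar L,
% the denominator is the estimated treatment mechanism; misspecifying g
% biases IPTW, which motivates the doubly robust TMLE described above.
```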
Estimating Tool-Tissue Forces Using a 3-Degree-of-Freedom Robotic Surgical Tool.
Zhao, Baoliang; Nelson, Carl A
2016-10-01
Robot-assisted minimally invasive surgery (MIS) has gained popularity due to its high dexterity and reduced invasiveness to the patient; however, due to the loss of direct touch of the surgical site, surgeons may be prone to exert larger forces and cause tissue damage. To quantify tool-tissue interaction forces, researchers have tried to attach different kinds of sensors on the surgical tools. This sensor attachment generally makes the tools bulky and/or unduly expensive and may hinder the normal function of the tools; it is also unlikely that these sensors can survive harsh sterilization processes. This paper investigates an alternative method by estimating tool-tissue interaction forces using driving motors' current, and validates this sensorless force estimation method on a 3-degree-of-freedom (DOF) robotic surgical grasper prototype. The results show that the performance of this method is acceptable with regard to latency and accuracy. With this tool-tissue interaction force estimation method, it is possible to implement force feedback on existing robotic surgical systems without any sensors. This may allow a haptic surgical robot which is compatible with existing sterilization methods and surgical procedures, so that the surgeon can obtain tool-tissue interaction forces in real time, thereby increasing surgical efficiency and safety.
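As a rough illustration of the current-based idea (not the paper's identified 3-DOF model), a static motor-and-transmission relation maps drive current to an estimated tip force; the torque constant, gear ratio, efficiency, friction and lever arm below are invented placeholder values.

```python
# Minimal sketch: sensorless grasp-force estimate from motor current for one
# DOF, using a static motor/transmission model.  All constants are
# illustrative assumptions, not values from the paper, which identifies the
# full 3-DOF current-to-force relationship experimentally.
def estimate_grasp_force(current_A, kt=0.031, gear_ratio=64.0,
                         efficiency=0.7, friction_Nm=0.002, lever_arm_m=0.012):
    """Map measured motor current (A) to an estimated tip force (N)."""
    motor_torque = kt * current_A                        # tau = k_t * i
    joint_torque = motor_torque * gear_ratio * efficiency - friction_Nm
    return max(joint_torque, 0.0) / lever_arm_m          # F = tau / r

print(estimate_grasp_force(0.25))   # example: ~0.25 A of drive current
```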
Nagata, Tomohisa; Mori, Koji; Aratake, Yutaka; Ide, Hiroshi; Ishida, Hiromi; Nobori, Junichiro; Kojima, Reiko; Odagami, Kiminori; Kato, Anna; Tsutsumi, Akizumi; Matsuda, Shinya
2014-01-01
The aim of the present study was to develop standardized cost estimation tools that provide information to employers about occupational safety and health (OSH) activities for effective and efficient decision making in Japanese companies. We interviewed OSH staff members, including full-time professional occupational physicians, to list all OSH activities. Using activity-based costing, cost data were obtained from retrospective 1-year analyses of occupational safety and health costs in three manufacturing workplaces and of occupational health services costs in four manufacturing workplaces. We additionally verified the tools in four workplaces, including service businesses. We created the OSH and occupational health standardized cost estimation tools. OSH costs consisted of personnel costs, expenses, outsourcing costs and investments for 15 OSH activities. The tools provided accurate, relevant information on OSH activities and occupational health services. The standardized information obtained from our OSH and occupational health cost estimation tools can be used to manage OSH costs, make comparisons of OSH costs between companies and organizations, and help occupational health physicians and employers to determine the best course of action.
O'Connor, Elodie; Hatherly, Chris
2014-01-01
Background Encouraging middle-aged adults to maintain their physical and cognitive health may have a significant impact on reducing the prevalence of dementia in the future. Mobile phone apps and interactive websites may be one effective way to target this age group. However, to date there has been little research investigating the user experience of dementia risk reduction tools delivered in this way. Objective The aim of this study was to explore participant engagement and evaluations of three different targeted smartphone and Web-based dementia risk reduction tools following a four-week intervention. Methods Participants completed a Web-based screening questionnaire to collect eligibility information. Eligible participants were asked to complete a Web-based baseline questionnaire and were then randomly assigned to use one of the three dementia risk reduction tools for a period of four weeks: (1) a mobile phone application; (2) an information-based website; and (3) an interactive website. User evaluations were obtained via a Web-based follow-up questionnaire after completion of the intervention. Results Of 415 eligible participants, 370 (89.16%) completed the baseline questionnaire and were assigned to an intervention group; 200 (54.05%) completed the post-intervention questionnaire. The average age of participants was 52 years, and 149 (75%) were female. Findings indicated that participants from all three intervention groups reported a generally positive impression of the tools across a range of domains. Participants using the information-based website reported higher ratings of their overall impression of the tool, F(2,191)=4.12, P=.02; how interesting the information was, F(2,189)=3.53, P=.03; how helpful the information was, F(2,192)=4.15, P=.02; and how much they learned, F(2,188)=3.86, P=.02. Group differences were significant between the mobile phone app and information-based website users, but not between the interactive website users and the other two groups. Additionally, participants using the information-based website reported significantly higher scores on their ratings of the ease of navigation, F(2,190)=4.20, P=.02, than those using the mobile phone app and the interactive website. There were no significant differences between groups on ratings of ease of understanding the information, F(2,188)=0.27, P=.76. Most participants from each of the three intervention groups indicated that they intended to keep using the dementia risk reduction eHealth tool. Conclusions Overall, results indicated that while participants across all three intervention groups reported a generally positive experience with the targeted dementia risk reduction tools, participants using the information-based website provided a more favorable evaluation across a range of areas than participants using the mobile phone app. Further research is required to investigate whether targeted dementia risk reduction tools, in the form of interactive websites and mobile apps, can be improved to provide benefits above those gained by providing static information alone. PMID:26543904
O'Connor, Elodie; Farrow, Maree; Hatherly, Chris
2014-01-01
Encouraging middle-aged adults to maintain their physical and cognitive health may have a significant impact on reducing the prevalence of dementia in the future. Mobile phone apps and interactive websites may be one effective way to target this age group. However, to date there has been little research investigating the user experience of dementia risk reduction tools delivered in this way. The aim of this study was to explore participant engagement and evaluations of three different targeted smartphone and Web-based dementia risk reduction tools following a four-week intervention. Participants completed a Web-based screening questionnaire to collect eligibility information. Eligible participants were asked to complete a Web-based baseline questionnaire and were then randomly assigned to use one of the three dementia risk reduction tools for a period of four weeks: (1) a mobile phone application; (2) an information-based website; and (3) an interactive website. User evaluations were obtained via a Web-based follow-up questionnaire after completion of the intervention. Of 415 eligible participants, 370 (89.16%) completed the baseline questionnaire and were assigned to an intervention group; 200 (54.05%) completed the post-intervention questionnaire. The average age of participants was 52 years, and 149 (75%) were female. Findings indicated that participants from all three intervention groups reported a generally positive impression of the tools across a range of domains. Participants using the information-based website reported higher ratings of their overall impression of the tool, F(2,191)=4.12, P=.02; how interesting the information was, F(2,189)=3.53, P=.03; how helpful the information was, F(2,192)=4.15, P=.02; and how much they learned, F(2,188)=3.86, P=.02. Group differences were significant between the mobile phone app and information-based website users, but not between the interactive website users and the other two groups. Additionally, participants using the information-based website reported significantly higher scores on their ratings of the ease of navigation, F(2,190)=4.20, P=.02, than those using the mobile phone app and the interactive website. There were no significant differences between groups on ratings of ease of understanding the information, F(2,188)=0.27, P=.76. Most participants from each of the three intervention groups indicated that they intended to keep using the dementia risk reduction eHealth tool. Overall, results indicated that while participants across all three intervention groups reported a generally positive experience with the targeted dementia risk reduction tools, participants using the information-based website provided a more favorable evaluation across a range of areas than participants using the mobile phone app. Further research is required to investigate whether targeted dementia risk reduction tools, in the form of interactive websites and mobile apps, can be improved to provide benefits above those gained by providing static information alone.
Sang-Kyun Han; Han-Sup Han; William J. Elliot; Edward M. Bilek
2017-01-01
We developed a spreadsheet-based model, named ThinTool, to evaluate the cost of mechanical fuel reduction thinning including biomass removal, to predict net energy output, and to assess nutrient impacts from thinning treatments in northern California and southern Oregon. A combination of literature reviews, field-based studies, and contractor surveys was used to...
Machine Learning Based Diagnosis of Lithium Batteries
NASA Astrophysics Data System (ADS)
Ibe-Ekeocha, Chinemerem Christopher
The depletion of the world's current petroleum reserve, coupled with the negative effects of carbon monoxide and other harmful petrochemical by-products on the environment, is the driving force behind the movement towards renewable and sustainable energy sources. Furthermore, the growing transportation sector consumes a significant portion of the total energy used in the United States. A complete electrification of this sector would require a significant development in electric vehicles (EVs) and hybrid electric vehicles (HEVs), thus translating to a reduction in the carbon footprint. As the market for EVs and HEVs grows, their battery management systems (BMS) need to be improved accordingly. The BMS is responsible not only for optimally charging and discharging the battery, but also for monitoring the battery's state of charge (SOC) and state of health (SOH). SOC, similar to an energy gauge, is a representation of a battery's remaining charge level as a percentage of its total possible charge at full capacity. Similarly, SOH is a measure of the deterioration of a battery; thus it is a representation of the battery's age. Neither SOC nor SOH is directly measurable, so it is important that these quantities are estimated accurately. An inaccurate estimation could not only be inconvenient for EV consumers, but also potentially detrimental to the battery's performance and life. Such estimations could be implemented either online, while the battery is in use, or offline, when the battery is at rest. This thesis presents intelligent online SOC and SOH estimation methods using machine learning tools such as artificial neural networks (ANNs). ANNs are a powerful generalization tool if programmed and trained effectively. Unlike other estimation strategies, the techniques used require no battery modeling or knowledge of battery internal parameters but rather use the battery's voltage, charge/discharge current, and ambient temperature measurements to accurately estimate its SOC and SOH. The developed algorithms are evaluated experimentally using two different batteries, namely lithium iron phosphate (LiFePO4) and lithium titanate (LTO), both subjected to constant and dynamic current profiles. Results highlight the robustness of these algorithms to the battery's nonlinear dynamic nature, hysteresis, aging, dynamic current profiles, and parametric uncertainties. Consequently, these methods are suitable and effective if incorporated into the BMS of EVs, HEVs, and other battery-powered devices.
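As a minimal sketch of the data-driven mapping described above (not the thesis implementation), a small feed-forward network can regress SOC from instantaneous voltage, current and temperature; the synthetic training data, network size, and hyperparameters below are assumptions.

```python
# Minimal sketch (assumed data, architecture and hyperparameters): a small
# neural network mapping voltage, current and ambient temperature to SOC,
# in the spirit of the model-free estimation described above.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# X: rows of [voltage_V, current_A, temperature_C]; y: SOC in [0, 1].
# Real training data would come from logged charge/discharge cycles;
# here a synthetic stand-in is generated purely to make the sketch runnable.
rng = np.random.default_rng(0)
X = rng.uniform([2.5, -30.0, 0.0], [3.6, 30.0, 45.0], size=(5000, 3))
y = np.clip((X[:, 0] - 2.5) / 1.1 + 0.01 * rng.standard_normal(5000), 0, 1)

soc_net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(32, 32),
                                     max_iter=2000, random_state=0))
soc_net.fit(X, y)
print(soc_net.predict([[3.3, 5.0, 25.0]]))   # estimated SOC for one sample
```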
Spatial Representations in Older Adults are Not Modified by Action: Evidence from Tool Use
Costello, Matthew C.; Bloesch, Emily K.; Davoli, Christopher C.; Panting, Nicholas D.; Abrams, Richard A.; Brockmole, James R.
2015-01-01
Theories of embodied perception hold that the visual system is calibrated by both the body schema and the action system, allowing for adaptive action-perception responses. One example of embodied perception involves the effects of tool-use on distance perception, in which wielding a tool with the intention to act upon a target appears to bring that object closer. This tool-based spatial compression (i.e., tool-use effect) has been studied exclusively with younger adults, but it is unknown whether the phenomenon exists with older adults. In this study, we examined the effects of tool use on distance perception in younger and older adults in two experiments. In Experiment 1, younger and older adults estimated the distances of targets just beyond peripersonal space while either wielding a tool or pointing with the hand. Younger adults, but not older adults, estimated targets to be closer after reaching with a tool. In Experiment 2, younger and older adults estimated the distance to remote targets while using either a baton or laser pointer. Younger adults displayed spatial compression with the laser pointer compared to the baton, although older adults did not. Taken together, these findings indicate a generalized absence of the tool-use effect in older adults during distance estimation suggesting that the visuomotor system of older adults does not remap from peripersonal to extrapersonal spatial representations during tool use. PMID:26052886
Forecasting the remaining reservoir capacity in the Laurentian Great Lakes watershed
NASA Astrophysics Data System (ADS)
Alighalehbabakhani, Fatemeh; Miller, Carol J.; Baskaran, Mark; Selegean, James P.; Barkach, John H.; Dahl, Travis; Abkenar, Seyed Mohsen Sadatiyan
2017-12-01
Sediment accumulation behind a dam is a significant factor in reservoir operation and watershed management. There are many dams located within the Laurentian Great Lakes watershed whose operations have been adversely affected by excessive reservoir sedimentation. Reservoir sedimentation effects include reduction of flood control capability and limitations to both water supply withdrawals and power generation due to reduced reservoir storage. In this research, the sediment accumulation rates of twelve reservoirs within the Great Lakes watershed were evaluated using the Soil and Water Assessment Tool (SWAT). The sediment accumulation rates estimated by SWAT were compared to estimates relying on radionuclide dating of sediment cores and bathymetric survey methods. Based on the sediment accumulation rate, the remaining reservoir capacity for each study site was estimated. Anthropogenic impacts on sediment yield, including land use change and dam construction, were also assessed in this research. A regression analysis was performed on the current and pre-European-settlement sediment yields of the modeled watersheds in order to predict the current and natural sediment yields of un-modeled watersheds. These eleven watersheds are in the states of Indiana, Michigan, Ohio, New York, and Wisconsin.
Lower Risk of Cancer in the Areas Inhabited by the German Minority in the Region of Opole, Poland.
Chawińska, Ewa; Tukiendorf, Andrzej; Miszczyk, Leszek
2015-01-01
The lower risk of cancer in the areas inhabited by the German minority in the region of Opole, Poland, at the turn of the 1980s and 1990s has already been reported. A reanalysis of the present-day data was conducted. All the cancer cases (at all sites combined) registered within the years 2008-2012, with data collected by the Regional Cancer Registry in Opole, were analyzed in this study. To estimate the risk of cancer in different spatial contexts, such as trends, clusters, and levels, modern geostatistical tools were applied. A statistically significant reduction of the cancer risk was reported in administrative units with ≥ 10% of the German minority. Average decreases in relative risk of 13% in men and 16% in women were estimated. The geographical patterns of the estimates are illustrated. The observed differences in the risk of cancer between the ethnic groups (Germans and repatriates) confirm a historical trend of the disease in the region of Opole, Poland. Some genetic, nutritional, or cultural aspects together with economic issues may play a role in the specified spatial disease patterns. © 2015 S. Karger GmbH, Freiburg.
Giesen, Daniel; van Gestel, Cornelis A M
2013-03-01
Quantitative structure-activity relationships (QSARs) are an established tool in environmental risk assessment and a valuable alternative to the exhaustive use of test animals under REACH. In this study a QSAR was developed for the toxicity of a series of six chloroanilines to the soil-dwelling collembolan Folsomia candida in standardized natural LUFA2.2 soil. Toxicity endpoints incorporated in the QSAR were the concentrations causing 10% (EC10) and 50% (EC50) reduction in reproduction of F. candida. Toxicity was based on concentrations in interstitial water estimated from nominal concentrations in the soil and published soil-water partition coefficients. Estimated effect concentrations were negatively correlated with the lipophilicity of the compounds. Interstitial water concentrations for both the EC10 and EC50 for four compounds were determined by using solid-phase microextraction (SPME). Measured and estimated concentrations were comparable only for tetra- and pentachloroaniline. With decreasing chlorination the disparity between modelled and actual concentrations increased. Optimisation of the QSAR therefore could not be accomplished, showing the necessity to move from total soil to (bio)available concentration measurements. Copyright © 2012 Elsevier Ltd. All rights reserved.
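The negative correlation with lipophilicity noted above corresponds to the classic one-descriptor QSAR form shown below; the coefficients a and b are fitted values not reported here, so the expression is a sketch of the model type rather than the published equation.

```latex
% Typical one-descriptor QSAR relating toxicity to lipophilicity
% (a and b are fitted, hypothetical coefficients, not the paper's values):
\log\!\left(\frac{1}{EC50_{\mathrm{pore\,water}}}\right) = a\,\log K_{ow} + b
```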
Cappuyns, Valérie; Kessen, Bram
2012-01-01
The choice between different options for the remediation of a contaminated site traditionally relies on economic, technical and regulatory criteria without consideration of the environmental impact of the soil remediation process itself. In the present study, the environmental impact assessment of two potential soil remediation techniques (excavation with off-site cleaning, and in situ steam extraction) was performed using two life cycle assessment (LCA)-based evaluation tools, namely the REC (risk reduction, environmental merit and cost) method and the ReCiPe method. The comparison and evaluation of the different tools used to estimate the environmental impact of Brownfield remediation was based on a case study consisting of the remediation of a former oil and fat processing plant. For the environmental impact assessment, both the REC and ReCiPe methods result in a single score for the environmental impact of the soil remediation process and allow the same conclusion to be drawn: excavation and off-site cleaning has a more pronounced environmental impact than in situ soil remediation by means of steam extraction. The ReCiPe method takes into account more impact categories, but is also more complex to work with and needs more input data. Within the routine evaluation of soil remediation alternatives, a detailed LCA evaluation will often be too time-consuming and costly, and estimation of the environmental impact with the REC method will in most cases be sufficient. The case study worked out in this paper aims to provide a basis for a sounder selection of soil remediation technologies, based on a more detailed assessment of the secondary impact of soil remediation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenzel, Tom P.
In its 2012 report NHTSA simulated the effect that four fleetwide mass reduction scenarios would have on the change in annual fatalities. NHTSA estimated that the most aggressive of these scenarios (reducing mass 5.2% in heavier light trucks and 2.6% in all other vehicle types except lighter cars) would result in a small reduction in societal fatalities. LBNL replicated the methodology NHTSA used to simulate six mass reduction scenarios, including the mass reductions recommended in the 2015 NRC committee report, and estimated in 2021 and 2025 by EPA in the TAR, using the updated data through 2012. The analysis indicates that the estimated change in fatalities under each scenario based on the updated analysis is comparable to that in the 2012 analysis, but less beneficial or more detrimental than that in the 2016 analysis. For example, an across-the-board 100-lb reduction in mass would result in an estimated 157 additional annual fatalities based on the 2012 analysis, but would result in only an estimated 91 additional annual fatalities based on the 2016 analysis, and an additional 87 fatalities based on the current analysis. The mass reductions recommended by the 2015 NRC committee report would result in an increase of 224 annual fatalities in the 2012 analysis, a decrease of 344 annual fatalities in the 2016 analysis, and an increase of 141 fatalities in the current analysis. The mass reductions EPA estimated for 2025 in the TAR would result in a decrease of 203 fatalities based on the 2016 analysis, but an increase of 39 fatalities based on the current analysis. These results support NHTSA’s conclusion from its 2012 study that, when footprint is held fixed, “no judicious combination of mass reductions in the various classes of vehicles results in a statistically significant fatality increase and many potential combinations are safety-neutral as point estimates.” Like the previous NHTSA studies, this updated report concludes that the estimated effect of mass reduction while maintaining footprint on societal U.S. fatality risk is small, and not statistically significant at the 95% or 90% confidence level for all vehicle types based on the jack-knife method NHTSA used. This report also finds that the estimated effects of other control variables, such as vehicle type, specific safety technologies, and crash conditions such as whether the crash occurred at night, in a rural county, or on a high-speed road, on risk are much larger, in some cases two orders of magnitude larger, than the estimated effect of mass or footprint reduction on risk. Finally, this report shows that after accounting for the many vehicle, driver, and crash variables NHTSA used in its regression analyses, there remains a wide variation in risk by vehicle make and model, and this variation is unrelated to vehicle mass. Although the purpose of the NHTSA and LBNL reports is to estimate the effect of vehicle mass reduction on societal risk, this is not exactly what the regression models are estimating. Rather, they are estimating the recent historical relationship between mass and risk, after accounting for most measurable differences between vehicles, drivers, and crash times and locations. In essence, the regression models are comparing the risk of a 2600-lb Dodge Neon with that of a 2500-lb Honda Civic, after attempting to account for all other differences between the two vehicles. The models are not estimating the effect of literally removing 100 pounds from the Neon, leaving everything else unchanged.
In addition, the analyses are based on the relationship of vehicle mass and footprint to risk for recent vehicle designs (model year 2004 to 2011). These relationships may or may not continue into the future as manufacturers utilize new vehicle designs and incorporate new technologies, such as more extensive use of strong lightweight materials and specific safety technologies. Therefore, throughout this report we use the phrase “the estimated effect of mass (or footprint) reduction on risk” as shorthand for “the estimated change in risk as a function of its relationship to mass (or footprint) for vehicle models of recent design.”
Estimation of toxicity using a Java based software tool
A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...
SpecTracer: A Python-Based Interactive Solution for Echelle Spectra Reduction
NASA Astrophysics Data System (ADS)
Romero Matamala, Oscar Fernando; Petit, Véronique; Caballero-Nieves, Saida Maria
2018-01-01
SpecTracer is a newly developed interactive solution to reduce cross-dispersed echelle spectra. The use of widgets saves the user the steep learning curves of currently available reduction software. SpecTracer uses well-established image processing techniques based on IRAF to successfully extract the stellar spectra. Comparisons with other reduction software, like IRAF, show comparable results, with the added advantages of ease of use, platform independence and portability. This tool can obtain meaningful scientific data and also serve as a training tool in the procedures of spectroscopic analysis, especially for undergraduates doing research.
Phase-amplitude reduction of transient dynamics far from attractors for limit-cycling systems
NASA Astrophysics Data System (ADS)
Shirasaka, Sho; Kurebayashi, Wataru; Nakao, Hiroya
2017-02-01
The phase reduction framework for limit-cycling systems, based on isochrons, has been used as a powerful tool for analyzing rhythmic phenomena. Recently, the notion of isostables, which complements the isochrons by characterizing amplitudes of the system state, i.e., deviations from the limit-cycle attractor, has been introduced to describe the transient dynamics around the limit cycle [Wilson and Moehlis, Phys. Rev. E 94, 052213 (2016)]. In this study, we introduce a framework for a reduced phase-amplitude description of transient dynamics of stable limit-cycling systems. In contrast to the preceding study, the isostables are treated in a way fully consistent with the Koopman operator analysis, which enables us to avoid discontinuities of the isostables and to apply the framework to system states far from the limit cycle. We also propose a new, convenient bi-orthogonalization method to obtain the response functions of the amplitudes, which can be interpreted as an extension of the adjoint covariant Lyapunov vector to transient dynamics in limit-cycling systems. We illustrate the utility of the proposed reduction framework by estimating the optimal injection timing of external input that efficiently suppresses deviations of the system state from the limit cycle in a model of a biochemical oscillator.
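Schematically, the reduced equations referred to above take the standard phase-amplitude form shown below; this is a sketch of the generic weakly perturbed case, not the paper's Koopman-based construction, and the symbols are the usual ones (natural frequency, Floquet exponent, and phase/amplitude response functions).

```latex
% Reduced phase-amplitude equations of the standard form (a sketch; the
% paper's Koopman-based construction defines the response functions globally):
\dot{\theta} = \omega + Z(\theta)\cdot p(t)        % phase, via isochrons
\dot{r}      = \kappa\, r + I(\theta)\cdot p(t)    % amplitude, via isostables
% \omega: natural frequency, \kappa: Floquet exponent of the limit cycle,
% Z, I: phase and amplitude response functions to the weak input p(t).
```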
NASA Technical Reports Server (NTRS)
Nickol, Craig L.; Haller, William J.
2016-01-01
NASA's Environmentally Responsible Aviation (ERA) project has matured technologies to enable simultaneous reductions in fuel burn, noise, and nitrogen oxide (NOx) emissions for future subsonic commercial transport aircraft. The fuel burn reduction target was a 50% reduction in block fuel burn (relative to a 2005 best-in-class baseline aircraft), utilizing technologies with an estimated Technology Readiness Level (TRL) of 4-6 by 2020. Progress towards this fuel burn reduction target was measured through the conceptual design and analysis of advanced subsonic commercial transport concepts spanning vehicle size classes from regional jet (98 passengers) to very large twin aisle size (400 passengers). Both conventional tube-and-wing (T+W) concepts and unconventional (over-wing-nacelle (OWN), hybrid wing body (HWB), mid-fuselage nacelle (MFN)) concepts were developed. A set of propulsion and airframe technologies were defined and integrated onto these advanced concepts which were then sized to meet the baseline mission requirements. Block fuel burn performance was then estimated, resulting in reductions relative to the 2005 best-in-class baseline performance ranging from 39% to 49%. The advanced single-aisle and large twin aisle T+W concepts had reductions of 43% and 41%, respectively, relative to the 737-800 and 777-200LR aircraft. The single-aisle OWN concept and the large twin aisle class HWB concept had reductions of 45% and 47%, respectively. In addition to their estimated fuel burn reduction performance, these unconventional concepts have the potential to provide significant noise reductions due, in part, to engine shielding provided by the airframe. Finally, all of the advanced concepts also have the potential for significant NOx emissions reductions due to the use of advanced combustor technology. Noise and NOx emissions reduction estimates were also generated for these concepts as part of the ERA project.
Tools and Metrics for Environmental Sustainability
Within the U.S. Environmental Protection Agency’s Office of Research and Development the National Risk Management Research Laboratory has been developing tools to help design and evaluate chemical processes with a life cycle perspective. These tools include the Waste Reduction (...
Effect of sampling rate and record length on the determination of stability and control derivatives
NASA Technical Reports Server (NTRS)
Brenner, M. J.; Iliff, K. W.; Whitman, R. K.
1978-01-01
Flight data from five aircraft were used to assess the effects of sampling rate and record length reductions on estimates of stability and control derivatives produced by a maximum likelihood estimation method. Derivatives could be extracted from flight data with the maximum likelihood estimation method even if there were considerable reductions in sampling rate and/or record length. Small amplitude pulse maneuvers showed greater degradation of the derivative estimates than large amplitude pulse maneuvers when these reductions were made. Reducing the sampling rate was found to be more desirable than reducing the record length as a method of lessening the total computation time required without greatly degrading the quality of the estimates.
Green, Christopher T.; Jurgens, Bryant; Zhang, Yong; Starn, Jeffrey; Singleton, Michael J.; Esser, Bradley K.
2016-01-01
Rates of oxygen and nitrate reduction are key factors in determining the chemical evolution of groundwater. Little is known about how these rates vary and covary in regional groundwater settings, as few studies have focused on regional datasets with multiple tracers and methods of analysis that account for effects of mixed residence times on apparent reaction rates. This study provides insight into the characteristics of residence times and rates of O2 reduction and denitrification (NO3− reduction) by comparing reaction rates using multi-model analytical residence time distributions (RTDs) applied to a data set of atmospheric tracers of groundwater age and geochemical data from 141 well samples in the Central Eastern San Joaquin Valley, CA. The RTD approach accounts for mixtures of residence times in a single sample to provide estimates of in-situ rates. Tracers included SF6, CFCs, 3H, He from 3H (tritiogenic He), 14C, and terrigenic He. Parameter estimation and multi-model averaging were used to establish RTDs with lower error variances than those produced by individual RTD models. The set of multi-model RTDs was used in combination with NO3− and dissolved gas data to estimate zero order and first order rates of O2 reduction and denitrification. Results indicated that O2 reduction and denitrification rates followed approximately log-normal distributions. Rates of O2 and NO3− reduction were correlated and, on an electron milliequivalent basis, denitrification rates tended to exceed O2 reduction rates. Estimated historical NO3− trends were similar to historical measurements. Results show that the multi-model approach can improve estimation of age distributions, and that relatively easily measured O2 rates can provide information about trends in denitrification rates, which are more difficult to estimate.
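As a sketch of how zero- and first-order rates enter an RTD-based analysis of this kind (generic forms, not the exact equations of the study), the concentration along a flow path and the age-mixed concentration at a well can be written as:

```latex
% Zero- and first-order reduction of a dissolved species along a flow path,
% and the mixing of ages at the well through the residence time distribution
% g(\tau) (schematic of the approach described above):
C(\tau) = C_0 - k_0\,\tau                       % zero-order reduction
C(\tau) = C_0\,e^{-k_1 \tau}                    % first-order reduction
C_{\mathrm{well}}(t) = \int_0^{\infty} g(\tau)\, C_{\mathrm{in}}(t-\tau)\,
                       e^{-k_1 \tau}\, d\tau    % sample = age-weighted mixture
```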
Green, Christopher T.; Jurgens, Bryant C.; Zhang, Yong; ...
2016-05-14
Rates of oxygen and nitrate reduction are key factors in determining the chemical evolution of groundwater. Little is known about how these rates vary and covary in regional groundwater settings, as few studies have focused on regional datasets with multiple tracers and methods of analysis that account for effects of mixed residence times on apparent reaction rates. This study provides insight into the characteristics of residence times and rates of O2 reduction and denitrification (NO3− reduction) by comparing reaction rates using multi-model analytical residence time distributions (RTDs) applied to a data set of atmospheric tracers of groundwater age and geochemical data from 141 well samples in the Central Eastern San Joaquin Valley, CA. The RTD approach accounts for mixtures of residence times in a single sample to provide estimates of in-situ rates. Tracers included SF6, CFCs, 3H, He from 3H (tritiogenic He), 14C, and terrigenic He. Parameter estimation and multi-model averaging were used to establish RTDs with lower error variances than those produced by individual RTD models. The set of multi-model RTDs was used in combination with NO3− and dissolved gas data to estimate zero order and first order rates of O2 reduction and denitrification. Results indicated that O2 reduction and denitrification rates followed approximately log-normal distributions. Rates of O2 and NO3− reduction were correlated and, on an electron milliequivalent basis, denitrification rates tended to exceed O2 reduction rates. Estimated historical NO3− trends were similar to historical measurements. Here, results show that the multi-model approach can improve estimation of age distributions, and that relatively easily measured O2 rates can provide information about trends in denitrification rates, which are more difficult to estimate.
Sub-Second Parallel State Estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Rice, Mark J.; Glaesemann, Kurt R.
This report describes the performance of the Pacific Northwest National Laboratory (PNNL) sub-second parallel state estimation (PSE) tool using utility data from the Bonneville Power Administration (BPA) and discusses the benefits of the fast computational speed for power system applications. The test data were provided by BPA. They are two days’ worth of hourly snapshots that include power system data and measurement sets in a commercial tool format. These data are extracted from the commercial tool and fed into the PSE tool. With the help of advanced solvers, the PSE tool is able to solve each BPA hourly state estimation problem within one second, which is more than 10 times faster than today’s commercial tool. This improved computational performance can help increase the reliability value of state estimation in many aspects: (1) the shorter the time required for execution of state estimation, the more time remains for operators to take appropriate actions, and/or to apply automatic or manual corrective control actions. This increases the chances of arresting or mitigating the impact of cascading failures; (2) the SE can be executed multiple times within the time allowance. Therefore, the robustness of SE can be enhanced by repeating the execution of the SE with adaptive adjustments, including removing bad data and/or adjusting different initial conditions to compute a better estimate within the same time as a traditional state estimator’s single estimate. There are other benefits of the sub-second SE: the PSE results can potentially be used in local and/or wide-area automatic corrective control actions that are currently dependent on raw measurements, minimizing the impact of bad measurements and providing opportunities to enhance power grid reliability and efficiency. PSE also can enable other advanced tools that rely on SE outputs and could be used to further improve operators’ actions and automated controls to mitigate effects of severe events on the grid. The power grid continues to grow and the number of measurements is increasing at an accelerated rate due to the variety of smart grid devices being introduced. A parallel state estimation implementation will have better performance than traditional, sequential state estimation by utilizing the power of high performance computing (HPC). This increased performance positions parallel state estimators as valuable tools for operating the increasingly more complex power grid.
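For background, the textbook formulation that state estimators of this kind solve is a weighted least-squares problem; the sketch below shows one Gauss-Newton iteration in generic form and is not the PNNL parallel implementation.

```python
# Minimal sketch: one Gauss-Newton step of weighted-least-squares power-system
# state estimation (the textbook formulation; not the PNNL parallel tool).
import numpy as np

def wls_step(x, z, h, H, R_diag):
    """x: current state guess; z: measurement vector; h(x): predicted
    measurements; H(x): measurement Jacobian; R_diag: error variances."""
    r = z - h(x)                                  # measurement residuals
    W = np.diag(1.0 / R_diag)                     # weights = inverse variances
    Hx = H(x)
    G = Hx.T @ W @ Hx                             # gain matrix
    dx = np.linalg.solve(G, Hx.T @ W @ r)         # solve the normal equations
    return x + dx                                 # updated state estimate
```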
NASA Astrophysics Data System (ADS)
Sokolova, Ekaterina; Pettersson, Thomas J. R.; Bergstedt, Olof; Hermansson, Malte
2013-08-01
To mitigate the faecal contamination of drinking water sources and, consequently, to prevent waterborne disease outbreaks, an estimation of the contribution from different sources to the total faecal contamination at the raw water intake of a drinking water treatment plant is needed. The aim of this article was to estimate how much different sources contributed to the faecal contamination at the water intake in a drinking water source, Lake Rådasjön in Sweden. For this purpose, the fate and transport of the faecal indicator Escherichia coli within Lake Rådasjön were simulated by a three-dimensional hydrodynamic model. The calibrated hydrodynamic model reproduced the measured vertical temperature distribution in the lake well (the Pearson correlation coefficient was 0.99). Data on the E. coli load from the identified contamination sources were gathered, and the fate and transport of E. coli released from these sources within the lake were simulated using the developed hydrodynamic model, taking the decay of the E. coli into account. The obtained modelling results were compared to the observed E. coli concentrations at the water intake. The results illustrated that the sources that contributed the most to the faecal contamination at the water intake in Lake Rådasjön were the discharges from the on-site sewers and the main inflow to the lake, the river Mölndalsån. Based on the modelling results, recommendations for water producers were formulated. The study demonstrated that this modelling approach is a useful tool for estimating the contribution from different sources to the faecal contamination at the water intake of a drinking water treatment plant and provided decision-support information for the reduction of risks posed to the drinking water source.
Kulhánová, Ivana; Hoffmann, Rasmus; Judge, Ken; Looman, Caspar W N; Eikemo, Terje A; Bopp, Matthias; Deboosere, Patrick; Leinsalu, Mall; Martikainen, Pekka; Rychtaříková, Jitka; Wojtyniak, Bogdan; Menvielle, Gwenn; Mackenbach, Johan P
2014-09-01
Although higher education has been associated with lower mortality rates in many studies, the effect of potential improvements in educational distribution on future mortality levels is unknown. We therefore estimated the impact of projected increases in higher education on mortality in European populations. We used mortality and population data according to educational level from 21 European populations and developed counterfactual scenarios. The first scenario represented the improvement in the future distribution of educational attainment as expected on the basis of an assumption of cohort replacement. We estimated the effect of this counterfactual scenario on mortality with a 10-15-year time horizon among men and women aged 30-79 years using a specially developed tool based on population attributable fractions (PAF). We compared this with a second, upward levelling scenario in which everyone has obtained tertiary education. The reduction of mortality in the cohort replacement scenario ranged from 1.9 to 10.1% for men and from 1.7 to 9.0% for women. The reduction of mortality in the upward levelling scenario ranged from 22.0 to 57.0% for men and from 9.6 to 50.0% for women. The cohort replacement scenario was estimated to achieve only part (4-25% for men and 10-31% for women) of the potential mortality decrease seen in the upward levelling scenario. We concluded that the effect of ongoing improvements in educational attainment on average mortality in the population differs across Europe, and can be substantial. Further investments in education may have important positive side-effects on population health. Copyright © 2014 Elsevier Ltd. All rights reserved.
Forlani, Lucas; Pedrini, Nicolás; Girotti, Juan R.; Mijailovsky, Sergio J.; Cardozo, Rubén M.; Gentile, Alberto G.; Hernández-Suárez, Carlos M.; Rabinovich, Jorge E.; Juárez, M. Patricia
2015-01-01
Background Current Chagas disease vector control strategies, based on chemical insecticide spraying, are increasingly threatened by the emergence of pyrethroid-resistant Triatoma infestans populations in the Gran Chaco region of South America. Methodology and findings We have already shown that the entomopathogenic fungus Beauveria bassiana has the ability to breach the insect cuticle and is effective against both pyrethroid-susceptible and pyrethroid-resistant T. infestans, in laboratory as well as field assays. It is also known that T. infestans cuticle lipids play a major role as contact aggregation pheromones. We estimated the effectiveness of pheromone-based infection boxes containing B. bassiana spores in killing indoor bugs, and their effect on the vector population dynamics. Laboratory assays were performed to estimate the effect of fungal infection on female reproductive parameters. The effect of insect exuviae as an aggregation signal on the performance of the infection boxes was estimated both in the laboratory and in the field. We developed a stage-specific matrix model of T. infestans to describe the effects of fungal infection on insect population dynamics and to analyze the performance of the biopesticide device in vector biological control. Conclusions The pheromone-containing infective box is a promising new tool against indoor populations of this Chagas disease vector, with the number of boxes per house being the main driver of the reduction of the total domestic bug population. This ecologically safe approach is the first proven alternative to chemical insecticides in the control of T. infestans. Mathematical modeling supports the advantage of reducing the vector population with delayed-action fungal biopesticides in a contained environment. PMID:25969989
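For readers unfamiliar with stage-specific matrix models, the sketch below projects a hypothetical three-stage (egg, nymph, adult) population with and without an assumed biopesticide-induced reduction in adult survival and fecundity; the matrices are invented for illustration and are not the calibrated T. infestans model from the study.

```python
import numpy as np

def project(A, n0, steps):
    """Project a stage-structured population n0 forward with matrix A."""
    n = np.asarray(n0, dtype=float)
    for _ in range(steps):
        n = A @ n
    return n

# Stages: eggs, nymphs, adults (hypothetical Lefkovitch-type matrix)
A_control = np.array([
    [0.00, 0.00, 20.0],   # adult fecundity (eggs per adult per step)
    [0.30, 0.60, 0.00],   # egg hatch and nymph survival/stasis
    [0.00, 0.25, 0.80],   # maturation and adult survival
])

# Assumed biopesticide effect: halved fecundity, lower adult survival
A_biopesticide = A_control.copy()
A_biopesticide[0, 2] *= 0.5
A_biopesticide[2, 2] = 0.50

n0 = [100, 50, 20]
print("Control after 12 steps:      ", project(A_control, n0, 12).round(1))
print("Biopesticide after 12 steps: ", project(A_biopesticide, n0, 12).round(1))
print("Dominant eigenvalue, control:",
      round(max(abs(np.linalg.eigvals(A_control))), 3))
```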
NASA Astrophysics Data System (ADS)
Maringanti, Chetan; Chaubey, Indrajeet; Popp, Jennie
2009-06-01
Best management practices (BMPs) are effective in reducing the transport of agricultural nonpoint source pollutants to receiving water bodies. However, selection of BMPs for placement in a watershed requires optimization of the available resources to obtain the maximum possible pollution reduction. In this study, an optimization methodology is developed to select and place BMPs in a watershed to provide solutions that are both economically and ecologically effective. This novel approach develops and utilizes a BMP tool, a database that stores the pollution reduction and cost information of the different BMPs under consideration. The BMP tool replaces the dynamic linkage of the distributed-parameter watershed model during optimization and therefore reduces the computation time considerably. Total pollutant load from the watershed and net cost increase from the baseline were the two objective functions minimized during the optimization process. The optimization model, consisting of a multiobjective genetic algorithm (NSGA-II) in combination with a watershed simulation tool (Soil and Water Assessment Tool, SWAT), was developed and tested for nonpoint source pollution control in the L'Anguille River watershed located in eastern Arkansas. The optimized solutions provided a trade-off between the two objective functions for sediment, phosphorus, and nitrogen reduction. The results indicated that buffer strips were very effective in controlling the nonpoint source pollutants leaving the croplands. The optimized BMP plans resulted in potential reductions of 33%, 32%, and 13% in sediment, phosphorus, and nitrogen loads, respectively, from the watershed.
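At the core of such a multiobjective search is non-domination between candidate BMP plans scored on the two objectives (pollutant load, cost). The sketch below is a plain Pareto filter over hypothetical plan scores, not the NSGA-II/SWAT implementation used in the study.

```python
def pareto_front(solutions):
    """Return non-dominated (load, cost) pairs: a plan is dominated if another
    plan is no worse on both objectives and strictly better on at least one."""
    front = []
    for i, a in enumerate(solutions):
        dominated = any(
            b[0] <= a[0] and b[1] <= a[1] and (b[0] < a[0] or b[1] < a[1])
            for j, b in enumerate(solutions) if j != i
        )
        if not dominated:
            front.append(a)
    return sorted(front)

# Hypothetical candidate BMP plans: (pollutant load, t/yr; annual cost, $1000)
plans = [(120, 50), (100, 80), (95, 140), (130, 40), (100, 70), (90, 200)]
print(pareto_front(plans))   # the trade-off curve between load and cost
```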
An estimating equation approach to dimension reduction for longitudinal data
Xu, Kelin; Guo, Wensheng; Xiong, Momiao; Zhu, Liping; Jin, Li
2016-01-01
Sufficient dimension reduction has been extensively explored in the context of independent and identically distributed data. In this article we generalize sufficient dimension reduction to longitudinal data and propose an estimating equation approach to estimating the central mean subspace. The proposed method accounts for the covariance structure within each subject and improves estimation efficiency when the covariance structure is correctly specified. Even if the covariance structure is misspecified, our estimator remains consistent. In addition, our method relaxes distributional assumptions on the covariates and is doubly robust. To determine the structural dimension of the central mean subspace, we propose a Bayesian-type information criterion. We show that the estimated structural dimension is consistent and that the estimated basis directions are root-n consistent, asymptotically normal and locally efficient. Simulations and an analysis of the Framingham Heart Study data confirm the effectiveness of our approach. PMID:27017956
REDUCTIVE DEHALOGENATION OF HALOMETHANES IN NATURAL AND MODEL SYSTEMS: QSAR ANALYSIS
Reductive dehalogenation is a dominant reaction pathway for halogenated organics in anoxic environments. Towards the goal of developing predictive tools for this reaction process, the reduction kinetics for a series of halomethanes were measured in batch studies with both natural...
Water Misting and Injection of Commercial Aircraft Engines to Reduce Airport NOx
NASA Technical Reports Server (NTRS)
Daggett, David L.; Hendricks, Robert C. (Technical Monitor)
2004-01-01
This report provides the first high-level look at the system design, airplane performance, maintenance, and cost implications of using water misting and water injection technology in aircraft engines for takeoff and climb-out NOx emissions reduction. With an engine compressor inlet water misting rate of 2.2 percent water-to-air ratio, a 47 percent NOx reduction was calculated. Combustor water injection could achieve greater reductions of about 85 percent, but with some performance penalties. For the water misting system on days above 59 °F, a fuel efficiency benefit of about 3.5 percent would be experienced. Reductions of up to 436 °F in turbine inlet temperature were also estimated, which could lead to increased hot section life. A 0.61 dB noise reduction would also occur. A nominal airplane weight penalty of less than 360 lb (no water) was estimated for a 305-passenger airplane. The airplane system cost is initially estimated at $40.92 per takeoff, giving an attractive NOx emissions reduction cost/benefit ratio of about $1,663/ton.
Multiple Imputation of Cognitive Performance as a Repeatedly Measured Outcome
Rawlings, Andreea M.; Sang, Yingying; Sharrett, A. Richey; Coresh, Josef; Griswold, Michael; Kucharska-Newton, Anna M.; Palta, Priya; Wruck, Lisa M.; Gross, Alden L.; Deal, Jennifer A.; Power, Melinda C.; Bandeen-Roche, Karen
2016-01-01
Background Longitudinal studies of cognitive performance are sensitive to dropout, as participants experiencing cognitive deficits are less likely to attend study visits, which may bias estimated associations between exposures of interest and cognitive decline. Multiple imputation is a powerful tool for handling missing data; however, its use for missing cognitive outcome measures in longitudinal analyses remains limited. Methods We use multiple imputation by chained equations (MICE) to impute cognitive performance scores of participants who did not attend the 2011-2013 exam of the Atherosclerosis Risk in Communities Study. We examined the validity of imputed scores using observed and simulated data under varying assumptions. We examined differences in the estimated association between diabetes at baseline and 20-year cognitive decline with and without imputed values. Lastly, we discuss how different analytic methods (mixed models and models fit using generalized estimating equations) and the choice of for whom to impute result in different estimands. Results Validation using observed data showed MICE produced unbiased imputations. Simulations showed a substantial reduction in the bias of the 20-year association between diabetes and cognitive decline comparing MICE (3-4% bias) to analyses of available data only (16-23% bias) in a construct where missingness was strongly informative but realistic. Associations between diabetes and 20-year cognitive decline were substantially stronger with MICE than in available-case analyses. Conclusions Our study suggests that when informative data are available for non-examined participants, MICE can be an effective tool for imputing cognitive performance and improving assessment of cognitive decline, though careful thought should be given to the target imputation population and the analytic model chosen, as they may yield different estimands. PMID:27619926
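The sketch below illustrates the general idea on synthetic data with informative missingness, using scikit-learn's IterativeImputer (a round-robin imputer similar in spirit to chained equations); the data-generating assumptions are invented and this is not the ARIC analysis.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
n = 200
age = rng.normal(60, 8, n)
score = 30 - 0.2 * (age - 60) + rng.normal(0, 2, n)        # cognitive score
# Informative missingness: lower scorers are more likely to miss the visit
missing = rng.random(n) < 1 / (1 + np.exp(score - 28))
score_obs = np.where(missing, np.nan, score)

X = np.column_stack([age, score_obs])
imputed_means = []
for m in range(5):                 # 5 imputations, crudely pooled by averaging
    imp = IterativeImputer(sample_posterior=True, random_state=m)
    imputed_means.append(imp.fit_transform(X)[:, 1].mean())

print("Available-case mean score:", round(float(np.nanmean(score_obs)), 2))
print("Pooled imputed mean score:", round(float(np.mean(imputed_means)), 2))
print("True mean score:          ", round(float(score.mean()), 2))
```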
NASA Astrophysics Data System (ADS)
Bearden, David A.; Duclos, Donald P.; Barrera, Mark J.; Mosher, Todd J.; Lao, Norman Y.
1997-12-01
Emerging technologies and micro-instrumentation are changing the way remote sensing spacecraft missions are developed and implemented. Government agencies responsible for procuring space systems are increasingly requesting analyses to estimate cost, performance and design impacts of advanced technology insertion for state-of-the-art systems as well as systems to be built 5 to 10 years in the future. Numerous spacecraft technology development programs are being sponsored by Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) agencies with the goal of enhancing spacecraft performance, reducing mass, and reducing cost. However, it is often the case that technology studies, in the interest of maximizing subsystem-level performance and/or mass reduction, do not anticipate synergistic system-level effects. Furthermore, even though technical risks are often identified as one of the largest cost drivers for space systems, many cost/design processes and models ignore effects of cost risk in the interest of quick estimates. To address these issues, the Aerospace Corporation developed a concept analysis methodology and associated software tools. These tools, collectively referred to as the concept analysis and design evaluation toolkit (CADET), facilitate system architecture studies and space system conceptual designs focusing on design heritage, technology selection, and associated effects on cost, risk and performance at the system and subsystem level. CADET allows: (1) quick response to technical design and cost questions; (2) assessment of the cost and performance impacts of existing and new designs/technologies; and (3) estimation of cost uncertainties and risks. These capabilities aid mission designers in determining the configuration of remote sensing missions that meet essential requirements in a cost-effective manner. This paper discusses the development of CADET modules and their application to several remote sensing satellite mission concepts.
NASA Astrophysics Data System (ADS)
González-Riancho, P.; Aguirre-Ayerbe, I.; Aniel-Quiroga, I.; Abad, S.; González, M.; Larreynaga, J.; Gavidia, F.; Gutiérrez, O. Q.; Álvarez-Gómez, J. A.; Medina, R.
2013-12-01
Advances in the understanding and prediction of tsunami impacts allow the development of risk reduction strategies for tsunami-prone areas. This paper presents an integral framework for the formulation of tsunami evacuation plans based on tsunami vulnerability assessment and evacuation modelling. This framework considers (i) the hazard aspects (tsunami flooding characteristics and arrival time), (ii) the characteristics of the exposed area (people, shelters and road network), (iii) the current tsunami warning procedures and timing, (iv) the time needed to evacuate the population, and (v) the identification of measures to improve the evacuation process. The proposed methodological framework aims to bridge risk assessment and risk management in terms of tsunami evacuation, as it allows for an estimation of the degree of evacuation success of specific management options, as well as for the classification and prioritization of the gathered information, in order to formulate an optimal evacuation plan. The framework has been applied to the El Salvador case study, demonstrating its applicability to site-specific response times and population characteristics.
Changes in the interaction of resting-state neural networks from adolescence to adulthood.
Stevens, Michael C; Pearlson, Godfrey D; Calhoun, Vince D
2009-08-01
This study examined how the mutual interactions of functionally integrated neural networks during resting-state fMRI differed between adolescence and adulthood. Independent component analysis (ICA) was used to identify functionally connected neural networks in 100 healthy participants aged 12-30 years. Hemodynamic timecourses that represented integrated neural network activity were analyzed with tools that quantified system "causal density" estimates, which indexed the proportion of significant Granger causality relationships among system nodes. Mutual influences among networks decreased with age, likely reflecting stronger within-network connectivity and more efficient between-network influences with greater development. Supplemental tests showed that this normative age-related reduction in causal density was accompanied by fewer significant connections to and from each network, regional increases in the strength of functional integration within networks, and age-related reductions in the strength of numerous specific system interactions. The latter included paths between lateral prefrontal-parietal circuits and "default mode" networks. These results contribute to an emerging understanding that activity in widely distributed networks thought to underlie complex cognition influences activity in other networks. (c) 2009 Wiley-Liss, Inc.
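As a toy illustration of a causal density estimate (the proportion of significant pairwise Granger-causal relationships among network timecourses), the sketch below uses synthetic series and statsmodels; the lag, threshold, and data are assumptions, and this is not the study's ICA-based pipeline.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
T, k = 300, 4
ts = rng.normal(size=(T, k))
ts[1:, 1] += 0.5 * ts[:-1, 0]            # network 0 drives network 1 at lag 1

significant, total = 0, 0
for i in range(k):
    for j in range(k):
        if i == j:
            continue
        # Does series j Granger-cause series i?
        res = grangercausalitytests(ts[:, [i, j]], maxlag=1, verbose=False)
        p_value = res[1][0]["ssr_ftest"][1]
        significant += p_value < 0.05
        total += 1

print(f"Causal density: {significant / total:.2f}")
```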
Kasaie, Parastu; Mathema, Barun; Kelton, W. David; Azman, Andrew S.; Pennington, Jeff; Dowdy, David W.
2015-01-01
In any setting, a proportion of incident active tuberculosis (TB) reflects recent transmission (“recent transmission proportion”), whereas the remainder represents reactivation. Appropriately estimating the recent transmission proportion has important implications for local TB control, but existing approaches have known biases, especially where data are incomplete. We constructed a stochastic individual-based model of a TB epidemic and designed a set of simulations (derivation set) to develop two regression-based tools for estimating the recent transmission proportion from five inputs: underlying TB incidence, sampling coverage, study duration, clustered proportion of observed cases, and proportion of observed clusters in the sample. We tested these tools on a set of unrelated simulations (validation set), and compared their performance against that of the traditional ‘n-1’ approach. In the validation set, the regression tools reduced the absolute estimation bias (difference between estimated and true recent transmission proportion) in the ‘n-1’ technique by a median [interquartile range] of 60% [9%, 82%] and 69% [30%, 87%]. The bias in the ‘n-1’ model was highly sensitive to underlying levels of study coverage and duration, and substantially underestimated the recent transmission proportion in settings of incomplete data coverage. By contrast, the regression models’ performance was more consistent across different epidemiological settings and study characteristics. We provide one of these regression models as a user-friendly, web-based tool. Novel tools can improve our ability to estimate the recent TB transmission proportion from data that are observable (or estimable) by public health practitioners with limited available molecular data. PMID:26679499
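The traditional 'n-1' estimate mentioned above can be computed directly from genotyping cluster counts; the cluster sizes below are hypothetical.

```python
def n_minus_one_rtp(cluster_sizes, n_unclustered):
    """'n-1' estimate of the recent transmission proportion:
    (clustered cases - number of clusters) / total cases."""
    clustered = sum(cluster_sizes)
    total = clustered + n_unclustered
    return (clustered - len(cluster_sizes)) / total

# Hypothetical sample: clusters of sizes 5, 3, 2, 2 plus 38 unclustered isolates
print(f"Recent transmission proportion: {n_minus_one_rtp([5, 3, 2, 2], 38):.2f}")
```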
SBML-PET-MPI: a parallel parameter estimation tool for Systems Biology Markup Language based models.
Zi, Zhike
2011-04-01
Parameter estimation is crucial for the modeling and dynamic analysis of biological systems. However, implementing parameter estimation is time consuming and computationally demanding. Here, we introduced a parallel parameter estimation tool for Systems Biology Markup Language (SBML)-based models (SBML-PET-MPI). SBML-PET-MPI allows the user to perform parameter estimation and parameter uncertainty analysis by collectively fitting multiple experimental datasets. The tool is developed and parallelized using the message passing interface (MPI) protocol, which provides good scalability with the number of processors. SBML-PET-MPI is freely available for non-commercial use at http://www.bioss.uni-freiburg.de/cms/sbml-pet-mpi.html or http://sites.google.com/site/sbmlpetmpi/.
NASA Astrophysics Data System (ADS)
Lançon, F.
2011-06-01
The anti-ship missile (ASM) threat faced by ships will become more diverse and more difficult to counter. Intelligence, rules-of-engagement constraints, and the fast reaction time needed for an effective softkill solution require specific tools to design Electronic Warfare (EW) systems and to integrate them onboard ship. SAGEM Company provides a decoy launcher system [1] and its associated Naval Electronic Warfare Simulation tool (NEWS) to permit softkill effectiveness analysis for anti-ship missile defence. NEWS generates a virtual environment for missile-ship engagement and counter-measure simulation over a wide spectrum: RF, IR, EO. It integrates the EW Command & Control (EWC2) process implemented in the decoy launcher system and performs Monte Carlo batch processing to evaluate softkill effectiveness in different engagement situations. NEWS is designed to allow immediate EWC2 process integration from simulation to the real decoy launcher system. By design, it allows the final operator to program, test and integrate its own EWC2 module and EW library onboard, so the intelligence of each user is protected and the evolution of threats can be taken into account through EW library updates. The objectives of the NEWS tool are also to define a methodology for trial definition and trial data reduction. Growth potential would permit the design of new concepts for EWC2 programmability and real-time effectiveness estimation in EW systems. This tool can also be used for operator training purposes. This paper presents the architecture design, the softkill programmability facility concept and the flexibility for onboard integration on ship. The concept of this operationally focused simulation, which is to use only one tool for design, development, trial validation and operational use, will be demonstrated.
Yousefzadeh, Samira; Matin, Atiyeh Rajabi; Ahmadi, Ehsan; Sabeti, Zahra; Alimohammadi, Mahmood; Aslani, Hassan; Nabizadeh, Ramin
2018-04-01
One of the most important aspects of environmental issues is the demand for clean and safe water. Meanwhile, the disinfection process is one of the most important steps in safe water production. The present study aims at estimating the performance of UV, nano zero-valent iron particles (nZVI, nano-Fe(0)), and UV treatment with the addition of nZVI (combined process) for Bacillus subtilis spore inactivation. The effects of different factors on inactivation, including contact time, initial nZVI concentration, UV irradiance and various aeration conditions, were investigated. Response surface methodology, based on a five-level, two-variable central composite design, was used to optimize target microorganism reduction and the experimental parameters. The results indicated that disinfection time had the greatest positive impact on disinfection ability among the different selected independent variables. According to the results, it can be concluded that microbial reduction by UV alone was more effective than by nZVI, while the combined UV/nZVI process demonstrated the maximum log reduction. The optimum reduction of about 4 logs was observed at 491 mg/L of nZVI and 60 min of contact time when spores were exposed to UV radiation under deaerated conditions. Therefore, the UV/nZVI process can be suggested as a reliable method for Bacillus subtilis spore inactivation. Copyright © 2018. Published by Elsevier Ltd.
Power fluctuation reduction methodology for the grid-connected renewable power systems
NASA Astrophysics Data System (ADS)
Aula, Fadhil T.; Lee, Samuel C.
2013-04-01
This paper presents a new methodology for eliminating the influence of the power fluctuations of renewable power systems. Renewable energy, which is an uncertain and uncontrollable resource, can only provide irregular electrical power to the power grid. This irregularity creates fluctuations in the power generated by renewable power systems. These fluctuations cause instability in the power system and influence the operation of conventional power plants. Overall, the power system is vulnerable to collapse if necessary actions are not taken to reduce the impact of these fluctuations. This methodology aims at reducing these fluctuations and at making the generated power capable of covering the power consumption. This requires a prediction tool for estimating the generated power in advance to provide the range and the time of occurrence of the fluctuations. Since most renewable energies are weather based, a weather forecasting technique is used for predicting the generated power. The reduction of the fluctuations also requires stabilizing facilities to maintain the output power at a desired level. In this study, a wind farm and a photovoltaic array are used as renewable power systems, and a pumped-storage plant and batteries are used as stabilizing facilities, since they are best suited to compensating for the fluctuations of these types of power suppliers. As an illustrative example, a model of wind and photovoltaic power systems with battery energy and pumped hydro storage facilities for power fluctuation reduction is included, and its power fluctuation reduction is verified through simulation.
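A minimal sketch of the smoothing idea, with a storage device absorbing deviations of a synthetic renewable profile from a moving-average dispatch target; the profile, storage size, and window are assumptions, not the paper's forecast-based design.

```python
import numpy as np

rng = np.random.default_rng(2)
raw = 50 + 15 * np.sin(np.linspace(0, 6, 48)) + rng.normal(0, 8, 48)  # MW, half-hourly
target = np.convolve(raw, np.ones(6) / 6, mode="same")                # dispatch target (MW)

capacity = 40.0            # storage capacity, MWh
soc = capacity / 2         # state of charge, MWh
dt = 0.5                   # hours per step
delivered = np.empty_like(raw)
for t, (p_raw, p_tgt) in enumerate(zip(raw, target)):
    if p_raw >= p_tgt:                                  # surplus: charge storage
        charge = min((p_raw - p_tgt) * dt, capacity - soc)
        soc += charge
        delivered[t] = p_raw - charge / dt
    else:                                               # deficit: discharge storage
        discharge = min((p_tgt - p_raw) * dt, soc)
        soc -= discharge
        delivered[t] = p_raw + discharge / dt

print("Std of raw output (MW):      ", round(float(raw.std()), 2))
print("Std of delivered output (MW):", round(float(delivered.std()), 2))
```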
Do All Roads Lead to Rome? ("or" Reductions for Dummy Travelers)
ERIC Educational Resources Information Center
Kilpelainen, Pekka
2010-01-01
Reduction is a central ingredient of computational thinking, and an important tool in algorithm design, in computability theory, and in complexity theory. Reduction has been recognized to be a difficult topic for students to learn. Previous studies on teaching reduction have concentrated on its use in special courses on the theory of computing. As…
Mennini, Francesco Saverio; Marcellusi, Andrea; Gitto, Lara; Iannone, Florenzo
2017-04-01
Rheumatoid arthritis (RA) is an autoimmune disease with a substantial medical and economic burden. In Italy, it affects approximately 280,000 people, therefore representing the musculoskeletal disease with the highest economic impact in terms of costs for the National Health Service and the social security system. The aim of this study was to estimate the annual economic burden of RA in Italy and determine the potential cost reduction considering the most effective biologic treatment for early rapidly progressing RA (ERPRA) patients. The model developed considers both direct costs, which are mainly due to pharmacological treatments, and indirect costs, which also include the productivity lost because of the disease. A systematic literature review provided the epidemiological and economic data used to inform the model. A one-way probabilistic sensitivity analysis based on 5000 Monte Carlo simulations was performed. Furthermore, specific scenario analyses were developed for patients presenting with ERPRA, with the aim of evaluating the effectiveness of different biologic treatments for this subgroup of patients and estimating the potential cost reduction. The total economic burden associated with RA was estimated to be €2.0 billion per year (95% confidence interval [CI] €1.8-2.3 billion). Forty-five percent of the expenditure was due to indirect costs (95% CI €0.8-1.0 billion); 45% depended on direct medical costs (95% CI €0.7-1.1 billion), and the residual 10% was determined by direct non-medical costs (95% CI €0.16-0.25 billion). In particular, the costs estimated for ERPRA patients totalled €76,171,181, of which approximately €18 million was associated with patients with a high level of anti-citrullinated protein antibodies (ACPA). The results of the analysis outline how it is possible to obtain a cost reduction for ERPRA patients of between €1 and €3 million by varying the number of patients with a high level of immunoglobulin G treated with the most effective biologic drug. In fact, the latter may achieve higher efficacy outcomes, especially for poor-prognosis ERPRA patients, resulting in higher levels of productivity. This study presents a pioneering approach to estimating the direct and indirect costs of RA. The model developed is a useful tool for policy makers as it allows them to understand the economic implications of RA treatment in Italy, identify the most effective allocation of resources, and select the most appropriate treatment for ERPRA patients.
Between-User Reliability of Tier 1 Exposure Assessment Tools Used Under REACH.
Lamb, Judith; Galea, Karen S; Miller, Brian G; Hesse, Susanne; Van Tongeren, Martie
2017-10-01
When applying simple screening (Tier 1) tools to estimate exposure to chemicals in a given exposure situation under the Registration, Evaluation, Authorisation and restriction of CHemicals Regulation 2006 (REACH), users must select from several possible input parameters. Previous studies have suggested that results from exposure assessments using expert judgement and from the use of modelling tools can vary considerably between assessors. This study aimed to investigate the between-user reliability of Tier 1 tools. A remote-completion exercise and an in-person workshop were used to identify and evaluate tool parameters and factors, such as user demographics, that may be associated with between-user variability. Participants (N = 146) generated dermal and inhalation exposure estimates (N = 4066) from specified workplace descriptions ('exposure situations') and Tier 1 tool combinations (N = 20). Interactions between users, tools, and situations were investigated and described. Systematic variation associated with individual users was minor compared with random between-user variation. Although variation was observed in the choices made for the majority of input parameters, differing choices of Process Category ('PROC') code/activity descriptor and dustiness level had the greatest impact on the resultant exposure estimates. Exposure estimates ranging over several orders of magnitude were generated for the same exposure situation by different tool users. Such unpredictable between-user variation will reduce consistency within REACH processes and could result in underestimation or overestimation of exposure, risking worker ill-health or the implementation of unnecessary risk controls, respectively. Implementation of additional support and quality control systems for all tool users is needed to reduce between-assessor variation and so ensure both the protection of worker health and the avoidance of unnecessary business risk management expenditure. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
A Tool for Estimating Variability in Wood Preservative Treatment Retention
Patricia K. Lebow; Adam M. Taylor; Timothy M. Young
2015-01-01
Composite sampling is standard practice for evaluation of preservative retention levels in preservative-treated wood. Current protocols provide an average retention value but no estimate of uncertainty. Here we describe a statistical method for calculating uncertainty estimates using the standard sampling regime with minimal additional chemical analysis. This tool can...
A Cost Estimation Tool for Charter Schools
ERIC Educational Resources Information Center
Hayes, Cheryl D.; Keller, Eric
2009-01-01
To align their financing strategies and fundraising efforts with their fiscal needs, charter school leaders need to know how much funding they need and what that funding will support. This cost estimation tool offers a simple set of worksheets to help start-up charter school operators identify and estimate the range of costs and timing of…
The National Academy of Science (NAS) recently recommended exploration of predictive tools, such as interspecies correlation estimation (ICE), to estimate acute toxicity values for listed species and support development of species sensitivity distributions (SSDs). We explored the...
FIESTA—An R estimation tool for FIA analysts
Tracey S. Frescino; Paul L. Patterson; Gretchen G. Moisen; Elizabeth A. Freeman
2015-01-01
FIESTA (Forest Inventory ESTimation for Analysis) is a user-friendly R package that was originally developed to support the production of estimates consistent with current tools available for the Forest Inventory and Analysis (FIA) National Program, such as FIDO (Forest Inventory Data Online) and EVALIDator. FIESTA provides an alternative data retrieval and reporting...
Ion beam deposition system for depositing low defect density extreme ultraviolet mask blanks
NASA Astrophysics Data System (ADS)
Jindal, V.; Kearney, P.; Sohn, J.; Harris-Jones, J.; John, A.; Godwin, M.; Antohe, A.; Teki, R.; Ma, A.; Goodwin, F.; Weaver, A.; Teora, P.
2012-03-01
Extreme ultraviolet lithography (EUVL) is the leading next-generation lithography (NGL) technology to succeed optical lithography at the 22 nm node and beyond. EUVL requires a low defect density reflective mask blank, which is considered one of the top two critical technology gaps for commercialization of the technology. At the SEMATECH Mask Blank Development Center (MBDC), research on defect reduction in EUV mask blanks is being pursued using the Veeco Nexus deposition tool. The defect performance of this tool is one of the factors limiting the availability of defect-free EUVL mask blanks. SEMATECH identified the key components in the ion beam deposition system that are currently impeding the reduction of defect density and the yield of EUV mask blanks. SEMATECH's current research is focused on in-house tool components to reduce their contributions to mask blank defects. SEMATECH is also working closely with the supplier to incorporate this learning into a next-generation deposition tool. This paper will describe the requirements for the next-generation tool that are essential to realizing low defect density EUV mask blanks. The goal of our work is to enable model-based predictions of defect performance and defect improvement for targeted process improvement and component learning to feed into the new deposition tool design. This paper will also highlight the defect reduction resulting from process improvements, and it will outline the restrictions inherent in the current tool geometry and components that are an impediment to producing HVM-quality EUV mask blanks.
Comparative analysis of old-age mortality estimations in Africa.
Bendavid, Eran; Seligman, Benjamin; Kubo, Jessica
2011-01-01
Survival to old ages is increasing in many African countries. While demographic tools for estimating mortality up to age 60 have improved greatly, mortality patterns above age 60 rely on models based on little or no demographic data. These estimates are important for social planning and demographic projections. We provide direct estimations of older-age mortality using survey data. Since 2005, nationally representative household surveys in ten sub-Saharan countries have recorded counts of living and recently deceased household members: Burkina Faso, Côte d'Ivoire, Ethiopia, Namibia, Nigeria, Swaziland, Tanzania, Uganda, Zambia, and Zimbabwe. After accounting for age heaping using multiple imputation, we use this information to estimate the probability of death in 5-year intervals (5qx). We then compare our 5qx estimates to those provided by the World Health Organization (WHO) and the United Nations Population Division (UNPD) to estimate the differences in mortality estimates, especially among individuals older than 60 years. We obtained information on 505,827 individuals (18.4% over age 60, 1.64% deceased). WHO and UNPD mortality models match our estimates closely up to age 60 (mean difference in probability of death -1.1%). However, mortality probabilities above age 60 are lower in our estimations than in either WHO or UNPD models. The mean difference between our sample and the WHO is 5.9% (95% CI 3.8-7.9%), and between our sample and UNPD it is 13.5% (95% CI 11.6-15.5%). Regardless of the comparator, the difference in mortality estimations rises monotonically above age 60. Mortality estimations above age 60 in ten African countries exhibit large variations depending on the method of estimation. The observed patterns suggest the possibility that survival among adults older than age 60 in some African countries is better than previously thought. Improving the quality and coverage of vital information in developing countries will become increasingly important with future reductions in mortality.
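For context, an age-specific death rate derived from such household roster counts can be converted into a 5-year probability of death (5qx) with the standard actuarial approximation; the counts below are assumed values, not the survey data.

```python
def prob_death_5yr(deaths, person_years):
    """Convert an age-specific death rate (5Mx) into the probability of dying
    within a 5-year interval (5qx): 5qx = 5*m / (1 + 2.5*m)."""
    m = deaths / person_years
    return 5 * m / (1 + 2.5 * m)

# Assumed counts for a 70-74 age group pooled across household rosters
print(f"5q70 = {prob_death_5yr(deaths=180, person_years=12000):.3f}")
```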
Fargo, Kelly L.; Johnston, Jessica; Stevenson, Kurt B.; Deutscher, Meredith
2015-01-01
Background: Studies evaluating the impact of passive cost visibility tools on antibiotic prescribing are lacking. Objective: The objective of this study was to evaluate whether the implementation of a passive antibiotic cost visibility tool would impact antibiotic prescribing and decrease antibiotic spending. Methods: An efficiency and effectiveness initiative (EEI) was implemented in October 2012. To support the EEI, an antibiotic cost visibility tool was created in June 2013 displaying the relative cost of antibiotics. Using an observational study of interrupted time series design, 3 time frames were studied: pre EEI, post EEI, and post cost visibility tool implementation. The primary outcome was antibiotic cost per 1,000 patient days. Secondary outcomes included case mix index (CMI)–adjusted antibiotic cost per 1,000 patient days and utilization of the cost visibility tool. Results: Initiation of the EEI was associated with a $4,675 decrease in antibiotic cost per 1,000 patient days (P = .003), and costs continued to decrease in the months following EEI (P = .009). After implementation of the cost visibility tool, costs remained stable (P = .844). Despite CMI increasing over time, adjustment for CMI had no impact on the directionality or statistical significance of the results. Conclusion: Our study demonstrated a significant and sustained decrease in antibiotic cost per 1,000 patient days when focused medication cost reduction efforts were implemented, but passive cost visibility tool implementation was not associated with additional cost reduction. Antibiotic cost visibility tools may be of most benefit when prior medication cost reduction efforts are lacking or when an active intervention is incorporated. PMID:26405341
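A minimal segmented-regression sketch of the interrupted time series idea, fitted with statsmodels on synthetic monthly costs; the intervention month, effect sizes, and noise level are assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
months = np.arange(36)
post = (months >= 18).astype(int)                 # intervention begins at month 18
cost = (9000 - 20 * months - 1500 * post
        - 60 * post * (months - 18) + rng.normal(0, 300, 36))

df = pd.DataFrame({
    "cost": cost,
    "time": months,
    "post": post,
    "time_after": post * (months - 18),
})
fit = smf.ols("cost ~ time + post + time_after", data=df).fit()
print(fit.params.round(1))   # level change = 'post', trend change = 'time_after'
```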
NASA Astrophysics Data System (ADS)
Realmuto, Vincent J.; Berk, Alexander
2016-11-01
We describe the development of Plume Tracker, an interactive toolkit for the analysis of multispectral thermal infrared observations of volcanic plumes and clouds. Plume Tracker is the successor to MAP_SO2, and together these flexible and comprehensive tools have enabled investigators to map sulfur dioxide (SO2) emissions from a number of volcanoes with TIR data from a variety of airborne and satellite instruments. Our objective for the development of Plume Tracker was to improve the computational performance of the retrieval procedures while retaining the accuracy of the retrievals. We have achieved a 300 × improvement in the benchmark performance of the retrieval procedures through the introduction of innovative data binning and signal reconstruction strategies, and improved the accuracy of the retrievals with a new method for evaluating the misfit between model and observed radiance spectra. We evaluated the accuracy of Plume Tracker retrievals with case studies based on MODIS and AIRS data acquired over Sarychev Peak Volcano, and ASTER data acquired over Kilauea and Turrialba Volcanoes. In the Sarychev Peak study, the AIRS-based estimate of total SO2 mass was 40% lower than the MODIS-based estimate. This result was consistent with a 45% reduction in the AIRS-based estimate of plume area relative to the corresponding MODIS-based estimate. In addition, we found that our AIRS-based estimate agreed with an independent estimate, based on a competing retrieval technique, within a margin of ± 20%. In the Kilauea study, the ASTER-based concentration estimates from 21 May 2012 were within ± 50% of concurrent ground-level concentration measurements. In the Turrialba study, the ASTER-based concentration estimates on 21 January 2012 were in exact agreement with SO2 concentrations measured at plume altitude on 1 February 2012.
NASA Astrophysics Data System (ADS)
Di Vittorio, A. V.; Simmonds, M.; Nico, P. S.
2017-12-01
Land-based carbon sequestration and GreenHouse Gas (GHG) reduction strategies are often implemented in small patches and evaluated independently from each other, which poses several challenges to determining their potential benefits at the regional scales at which carbon/GHG targets are defined. These challenges include inconsistent methods, uncertain scalability to larger areas, and lack of constraints such as land ownership and competition among multiple strategies. To address such challenges we have developed an integrated carbon and GHG budget model of California's entire landscape, delineated by geographic region, land type, and ownership. This empirical model has annual time steps and includes net ecosystem carbon exchange, wildfire, multiple forest management practices including wood and bioenergy production, cropland and rangeland soil management, various land type restoration activities, and land cover change. While the absolute estimates vary considerably due to uncertainties in initial carbon densities and ecosystem carbon exchange rates, the estimated effects of particular management activities with respect to baseline are robust across these uncertainties. Uncertainty in land use/cover change data is also critical, as different rates of shrubland to grassland conversion can switch the system from a carbon source to a sink. The results indicate that reducing urban area expansion has substantial and consistent benefits, while the effects of direct land management practices vary and depend largely on the available management area. Increasing forest fuel reduction extent over the baseline contributes to annual GHG costs during increased management, and annual benefits after increased management ceases. Cumulatively, it could take decades to recover the cost of 14 years of increased fuel reduction. However, forest carbon losses can be completely offset within 20 years through increases in urban forest fraction and marsh restoration. Additionally, highly uncertain black carbon estimates dominate the overall GHG budget due to wildfire, forest management, and bioenergy production. Overall, this tool is well suited for exploring suites of management options and extents throughout California in order to quantify potential regional carbon sequestration and GHG emission benefits.
An urban runoff model designed to inform stormwater management decisions.
Beck, Nicole G; Conley, Gary; Kanner, Lisa; Mathias, Margaret
2017-05-15
We present an urban runoff model designed for stormwater managers to quantify the runoff reduction benefits of mitigation actions, with lower input data and user expertise requirements than most commonly used models. The stormwater tool to estimate load reductions (TELR) employs a semi-distributed approach, where landscape characteristics and process representation are spatially lumped within urban catchments on the order of 100 acres (40 ha). Hydrologic computations use a set of metrics that describe a 30-year rainfall distribution, combined with well-tested algorithms for rainfall-runoff transformation and routing, to generate average annual runoff estimates for each catchment. User inputs include the locations and specifications for a range of structural best management practice (BMP) types. The model was tested in a set of urban catchments within the Lake Tahoe Basin of California, USA, where modeled annual flows matched observed flows within 18% relative error for 5 of the 6 catchments and showed good regional performance for a suite of performance metrics. Comparisons with continuous simulation models showed an average of 3% difference from TELR-predicted runoff for a range of hypothetical urban catchments. The model usually identified the dominant BMP outflow components within 5% relative error of event-based measured flow data and simulated the correct proportionality between outflow components. TELR has been implemented as a web-based platform for use by municipal stormwater managers to inform prioritization, report program benefits and meet regulatory reporting requirements (www.swtelr.com). Copyright © 2017. Published by Elsevier Ltd.
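To give a flavour of the rainfall-runoff transformations that such screening tools rely on, the sketch below applies the widely used SCS curve-number method to an assumed storm and land cover. This is a generic illustration only, not TELR's documented algorithm, and the curve numbers are invented.

```python
def scs_runoff_depth(p_in, cn):
    """SCS curve-number runoff depth (inches) for storm depth p_in (inches)."""
    s = 1000.0 / cn - 10.0          # potential maximum retention
    ia = 0.2 * s                    # initial abstraction
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in + 0.8 * s)

# Assumed 1.5-inch storm over impervious-dominated urban cover (CN = 90)
# versus the same storm after a BMP lowers the effective CN to 75
print(f"Runoff before BMP: {scs_runoff_depth(1.5, 90):.2f} in")
print(f"Runoff after BMP:  {scs_runoff_depth(1.5, 75):.2f} in")
```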
Advancing the research agenda for diagnostic error reduction.
Zwaan, Laura; Schiff, Gordon D; Singh, Hardeep
2013-10-01
Diagnostic errors remain an underemphasised and understudied area of patient safety research. We briefly summarise the methods that have been used to conduct research on the epidemiology of, contributing factors to, and interventions for diagnostic error, and outline directions for future research. Research methods that have studied the epidemiology of diagnostic error provide some estimates of diagnostic error rates. However, there appears to be large variability in the reported rates due to the heterogeneity of definitions and study methods used. Thus, future methods should focus on obtaining more precise estimates in different settings of care. This would lay the foundation for measuring error rates over time to evaluate improvements. Research methods have studied contributing factors for diagnostic error in both naturalistic and experimental settings. Both approaches have revealed important and complementary information. Newer conceptual models from outside healthcare are needed to advance the depth and rigour of analysis of systems and cognitive insights into causes of error. While the literature has suggested many potentially fruitful interventions for reducing diagnostic errors, most have not been systematically evaluated and/or widely implemented in practice. Research is needed to study promising intervention areas such as enhanced patient involvement in diagnosis, improving diagnosis through the use of electronic tools, and identification and reduction of specific diagnostic process 'pitfalls' (eg, failure to conduct appropriate diagnostic evaluation of a breast lump after a 'normal' mammogram). The last decade of research on diagnostic error has made promising steps and laid a foundation for more rigorous methods to advance the field.
Eby, Elizabeth L; Smolen, Lee J; Pitts, Amber C; Krueger, Linda A; Andrews, Jeffrey Scott
2014-12-01
Objective: To estimate the budgetary impact for a skilled nursing facility converting from individual patient supply (IPS) delivery of rapid-acting insulin analog (RAIA) 10-mL vials or 3-mL prefilled pens to 3-mL vials. Design: A budget-impact model used insulin volume purchased and assumptions of length of stay (LOS), daily RAIA dose, and delivery protocol to estimate the cost impact of using 3-mL vials. Setting: Skilled nursing facility. Patients: Medicare Part A patients. Methods: Simulations were conducted using 12-month current and future scenarios, with comparisons of RAIA use for 13- and 28-day LOS. Main outcome measures: RAIA costs and savings, and waste reduction. Results: For patients with a 13-day LOS using 20 units/day of IPS insulin, the model estimated a 70% reduction in RAIA costs and units purchased and a 95% waste reduction for the 3-mL vial compared with the 10-mL vial. The estimated costs for prefilled pen use were 58% lower than for use of 10-mL vials. The incremental savings associated with 3-mL vial use instead of prefilled pens was 28%, attributable to differences in the per-unit cost of insulin in vials versus prefilled pens. Using a more conservative scenario of a 28-day LOS at 20 units/day, the model estimated a 40% reduction in RAIA costs and units purchased, resulting in a 91% reduction in RAIA waste for the 3-mL vial compared with the 10-mL vial. Conclusion: Budget-impact analysis of conversion from RAIA 10-mL vials or 3-mL prefilled pens to 3-mL vials estimated reductions in both insulin costs and waste across multiple scenarios of varying LOS and patient daily doses for skilled nursing facility stays.
vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments.
Ma, Jingming; Dykes, Carrie; Wu, Tao; Huang, Yangxin; Demeter, Lisa; Wu, Hulin
2010-05-18
The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
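As a rough illustration of the regression idea (not the vFitness implementation), relative fitness in a growth competition assay can be estimated from the slope of the log ratio of two variants over time; the counts below are invented and dilution between passages is ignored.

```python
import numpy as np

days = np.array([0, 2, 4, 6, 8])
mutant = np.array([1.0e4, 2.5e4, 6.8e4, 1.6e5, 4.1e5])      # copies/mL (assumed)
wildtype = np.array([1.0e4, 4.0e4, 1.7e5, 6.5e5, 2.7e6])

log_ratio = np.log(mutant / wildtype)
slope, intercept = np.polyfit(days, log_ratio, 1)            # fit all time points
print(f"Relative fitness difference: {slope:.3f} per day")   # negative: mutant less fit
```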
Influence of model reduction on uncertainty of flood inundation predictions
NASA Astrophysics Data System (ADS)
Romanowicz, R. J.; Kiczko, A.; Osuch, M.
2012-04-01
Derivation of flood risk maps requires an estimation of the maximum inundation extent for a flood with an assumed probability of exceedance, e.g. a 100- or 500-year flood. The results of numerical simulations of flood wave propagation are used to overcome the lack of relevant observations. In practice, deterministic 1-D models are used for flow routing, giving a simplified image of the flood wave propagation process. The solution of a 1-D model depends on the simplifications to the model structure, the initial and boundary conditions, and the estimates of model parameters, which are usually identified by solving the inverse problem based on the available noisy observations. Therefore, there is a large uncertainty involved in the derivation of flood risk maps. In this study we examine the influence of model structure simplifications on estimates of flood extent for an urban river reach. As the study area we chose the Warsaw reach of the River Vistula, where nine bridges and several dikes are located. The aim of the study is to examine the influence of water structures on the derived model roughness parameters, with all the bridges and dikes taken into account, with a reduced number, and without any water infrastructure. The results indicate that the roughness parameter values of a 1-D HEC-RAS model can be adjusted to compensate for the reduction in model structure. However, the price we pay is reduced model robustness. Apart from the relatively simple question of reducing model structure, we also try to answer more fundamental questions regarding the relative importance of input, model structure simplification, parametric and rating curve uncertainty to the uncertainty of flood extent estimates. We apply pseudo-Bayesian methods of uncertainty estimation and Global Sensitivity Analysis as the main methodological tools. The results indicate that these uncertainties have a substantial influence on flood risk assessment. In the paper we present a simplified methodology allowing the influence of that uncertainty to be assessed. This work was supported by the National Science Centre of Poland (grant 2011/01/B/ST10/06866).
NASA Astrophysics Data System (ADS)
Akhavan Niaki, Farbod
The objective of this research is first to investigate the applicability and advantage of statistical state estimation methods for predicting tool wear in machining nickel-based superalloys over deterministic methods, and second to study the effects of cutting tool wear on the quality of the part. Nickel-based superalloys are among those classes of materials that are known as hard-to-machine alloys. These materials exhibit a unique combination of maintaining their strength at high temperature and have high resistance to corrosion and creep. These unique characteristics make them an ideal candidate for harsh environments like combustion chambers of gas turbines. However, the same characteristics that make nickel-based alloys suitable for aggressive conditions introduce difficulties when machining them. High strength and low thermal conductivity accelerate the cutting tool wear and increase the possibility of the in-process tool breakage. A blunt tool nominally deteriorates the surface integrity and damages quality of the machined part by inducing high tensile residual stresses, generating micro-cracks, altering the microstructure or leaving a poor roughness profile behind. As a consequence in this case, the expensive superalloy would have to be scrapped. The current dominant solution for industry is to sacrifice the productivity rate by replacing the tool in the early stages of its life or to choose conservative cutting conditions in order to lower the wear rate and preserve workpiece quality. Thus, monitoring the state of the cutting tool and estimating its effects on part quality is a critical task for increasing productivity and profitability in machining superalloys. This work aims to first introduce a probabilistic-based framework for estimating tool wear in milling and turning of superalloys and second to study the detrimental effects of functional state of the cutting tool in terms of wear and wear rate on part quality. In the milling operation, the mechanisms of tool failure were first identified and, based on the rapid catastrophic failure of the tool, a Bayesian inference method (i.e., Markov Chain Monte Carlo, MCMC) was used for parameter calibration of tool wear using a power mechanistic model. The calibrated model was then used in the state space probabilistic framework of a Kalman filter to estimate the tool flank wear. Furthermore, an on-machine laser measuring system was utilized and fused into the Kalman filter to improve the estimation accuracy. In the turning operation the behavior of progressive wear was investigated as well. Due to the nonlinear nature of wear in turning, an extended Kalman filter was designed for tracking progressive wear, and the results of the probabilistic-based method were compared with a deterministic technique, where significant improvement (more than 60% increase in estimation accuracy) was achieved. To fulfill the second objective of this research in understanding the underlying effects of wear on part quality in cutting nickel-based superalloys, a comprehensive study on surface roughness, dimensional integrity and residual stress was conducted. The estimated results derived from a probabilistic filter were used for finding the proper correlations between wear, surface roughness and dimensional integrity, along with a finite element simulation for predicting the residual stress profile for sharp and worn cutting tool conditions. 
The output of this research provides the essential information on condition monitoring of the tool and its effects on product quality. The low-cost Hall effect sensor used in this work to capture spindle power in the context of the stochastic filter can effectively estimate tool wear in both milling and turning operations, while the estimated wear can be used to generate knowledge of the state of workpiece surface integrity. Therefore the true functionality and efficiency of the tool in superalloy machining can be evaluated without additional high-cost sensing.
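A generic linear Kalman filter for tracking flank wear and wear rate from noisy indirect measurements is sketched below; the process model, noise covariances, and synthetic measurements are assumptions for illustration, not the calibrated mechanistic model or the sensor fusion developed in this research.

```python
import numpy as np

dt = 1.0                                   # minutes between measurements
F = np.array([[1.0, dt], [0.0, 1.0]])      # state: [wear (mm), wear rate (mm/min)]
H = np.array([[1.0, 0.0]])                 # only (noisy) wear is observed
Q = np.diag([1e-6, 1e-7])                  # process noise covariance
R = np.array([[4e-4]])                     # measurement noise covariance

rng = np.random.default_rng(4)
true_wear = 0.002 * np.arange(60) + 0.0002 * np.arange(60) ** 1.3
z = true_wear + rng.normal(0, 0.02, 60)    # wear inferred from spindle power (synthetic)

x = np.array([0.0, 0.001])                 # initial state estimate
P = np.eye(2) * 0.01
for zk in z:
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    y = zk - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P

print("Final true wear (mm):     ", round(float(true_wear[-1]), 3))
print("Final estimated wear (mm):", round(float(x[0]), 3))
```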
Predicting muscle forces during the propulsion phase of single leg triple hop test.
Alvim, Felipe Costa; Lucareli, Paulo Roberto Garcia; Menegaldo, Luciano Luporini
2018-01-01
Functional biomechanical tests allow the assessment of musculoskeletal system impairments in a simple way. Muscle force synergies associated with movement can provide additional information for diagnosis. However, such forces cannot be directly measured noninvasively. This study aims to estimate muscle activations and forces exerted during the preparation phase of the single leg triple hop test. Two different approaches were tested: static optimization (SO) and computed muscle control (CMC). As an indirect validation, model-estimated muscle activations were compared with surface electromyography (EMG) of selected hip and thigh muscles. Ten physically healthy active women performed a series of jumps, and ground reaction forces, kinematics and EMG data were recorded. An existing OpenSim model with 92 musculotendon actuators was used to estimate muscle forces. Reflective markers data were processed using the OpenSim Inverse Kinematics tool. Residual Reduction Algorithm (RRA) was applied recursively before running the SO and CMC. For both, the same adjusted kinematics were used as inputs. Both approaches presented similar residuals amplitudes. SO showed a closer agreement between the estimated activations and the EMGs of some muscles. Due to inherent EMG methodological limitations, the superiority of SO in relation to CMC can be only hypothesized. It should be confirmed by conducting further studies comparing joint contact forces. The workflow presented in this study can be used to estimate muscle forces during the preparation phase of the single leg triple hop test and allows investigating muscle activation and coordination. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Legeay, Pierre-Louis; Moatar, Florentina; Dupas, Rémi; Gascuel-Odoux, Chantal
2016-04-01
The Nutting-N and Nutting-P models (Dupas et al., 2013, 2015) were developed to estimate nitrogen and phosphorus nonpoint-source emissions to surface water using readily available data. These models were inspired by the US model SPARROW (Smith et al., 1997) and the European model GREEN (Grizzetti et al., 2008), i.e. statistical approaches that link the nitrogen and phosphorus surplus to catchment land and river characteristics in order to determine relative catchment retention capacities. The nutrient load (L) at the outlet of each catchment is expressed as: L = R*(B*DS + PS) [1], where DS is the diffuse-source term (i.e. surplus in kg.ha-1.yr-1 for N, soil P storage for P), PS is the point-source term of domestic and industrial origin (kg.ha-1.yr-1), and R and B are the river-system and basin reduction factors, respectively; both combine observed variables and calibrated parameters. The model was calibrated on independent catchments for the 2005-2009 and 2008-2012 periods. Variables were selected according to the Bayesian Information Criterion (BIC) in order to optimize the predictive performance of the models. Starting from these basic models, several improvements were made to build a framework and a set of tools: 1) a routing module was added to improve estimates for 4th- and 5th-order streams, i.e. upscaling the basic Nutting approach; 2) a territorial module was added to test the models at local scale (from 500 to 5000 km²); and 3) seasonal estimation was investigated. The basic approach as well as the territorial application will be illustrated. These tools allow water managers to identify at-risk areas where high nutrient loads are estimated, as well as areas where retention is potentially high and can buffer large nutrient sources. References Dupas R., Curie F., Gascuel-Odoux C., Moatar F., Delmas M., Parnaudeau V., Durand P., 2013. Assessing N emissions in surface water at the national level: Comparison of country-wide vs. regionalized models. Science of the Total Environment 443, 152-162. Dupas R., Delmas M., Dorioz J.M., Garnier J., Moatar F., Gascuel-Odoux C., 2015. Assessing the impact of agricultural pressures on N and P loads and eutrophication risk. Ecological Indicators 48, 396-407. Grizzetti B., Bouraoui F., De Marsily G., 2008. Assessing nitrogen pressures on European surface water. Global Biogeochemical Cycles 22. Smith R.A., Schwarz G.E., Alexander R.B., 1997. Regional interpretation of water-quality monitoring data. Water Resources Research 33: 2781-2798.
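Equation [1] is simple enough to evaluate directly; the sketch below applies it to an illustrative catchment. In the Nutting models the reduction factors B and R are functions of observed catchment variables and calibrated parameters, so the constants used here are placeholders only.

```python
def nutrient_load(ds, ps, basin_reduction, river_reduction):
    """Equation [1]: L = R * (B * DS + PS).

    ds  -- diffuse sources (e.g. N surplus, kg/ha/yr)
    ps  -- point sources (kg/ha/yr)
    basin_reduction (B), river_reduction (R) -- retention factors in [0, 1];
        placeholder constants here, calibrated functions in the real models.
    """
    return river_reduction * (basin_reduction * ds + ps)

# Illustrative values only
print(nutrient_load(ds=35.0, ps=2.0, basin_reduction=0.3, river_reduction=0.8))
# -> 0.8 * (0.3*35 + 2) = 10.0 kg/ha/yr at the catchment outlet
```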
Caçola, Priscila; Gabbard, Carl
2012-04-01
This study examined age-related characteristics associated with tool use in the perception and modulation of peripersonal and extrapersonal space. Seventy-six children, representing age groups of 7, 9, and 11 years, and 36 adults were presented with two experiments using an estimation-of-reach paradigm involving arm and tool conditions and a switch block of the opposite condition. Experiment 1 tested arm and tool (20 cm length) estimation and found a significant effect for Age, Space, and an Age × Space interaction (ps < 0.05). Both children and adults were less accurate in extrapersonal space, indicating an overestimation bias. Interestingly, the adjustment period during the switch-block condition was immediate and similar across ages. Experiment 2 was similar to Experiment 1 with the exception of using a 40-cm-long tool. Results also revealed an age effect and a difference in Space (ps < 0.05); however, participants underestimated. Speculatively, participants were less confident when presented with a longer tool, even though the adjustment period with both tool lengths was similar. Considered together, these results hint that: (1) children as young as 7 years of age can be as accurate when estimating reach with a tool as they are with their arm, (2) the adjustment period associated with extending and retracting spaces is immediate rather than gradual, and (3) tool length influences estimations of reach.
NASA Technical Reports Server (NTRS)
Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)
2002-01-01
This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert-opinion methods. No single standard methodology or technique is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools; frequently used approaches include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix whose axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of the methodology that correspond to these three axes.
A tool to estimate the Fermi Large Area Telescope background for short-duration observations
Vasileiou, Vlasios
2013-07-25
Here, the proper estimation of the background is a crucial component of data analyses in astrophysics, such as source detection, temporal studies, spectroscopy, and localization. For the case of the Large Area Telescope (LAT) on board the Fermi spacecraft, approaches to estimate the background for short (≲1000 s duration) observations fail if they ignore the strong dependence of the LAT background on the continuously changing observational conditions. We present a (to be) publicly available background-estimation tool created and used by the LAT Collaboration in several analyses of Gamma Ray Bursts. This tool can accurately estimate the expected LAT background for any observational conditions, including, for example, observations with rapid variations of the Fermi spacecraft's orientation occurring during automatic repointings.
NASA Astrophysics Data System (ADS)
Zhao, Fei; Zhang, Chi; Yang, Guilin; Chen, Chinyin
2016-12-01
This paper presents an online method for estimating cutting error by analyzing internal sensor readings. The internal sensors of the numerical control (NC) machine tool are used so that no additional sensors need to be installed. A mathematical model for cutting-error estimation was proposed to compute the relative position of the cutting point and the tool center point (TCP) from internal sensor readings, based on gear cutting theory. To verify the effectiveness of the proposed model, it was simulated and tested experimentally in a gear generating grinding process. The cutting error of the gear was estimated and the factors that induce cutting error were analyzed. The simulation and experiments verify that the proposed approach is an efficient way to estimate the cutting error of the workpiece during the machining process.
NASA Astrophysics Data System (ADS)
Luo, X.; Heck, B.; Awange, J. L.
2013-12-01
Global Navigation Satellite Systems (GNSS) are emerging as possible tools for remote sensing of high-resolution atmospheric water vapour, which improves weather forecasting through numerical weather prediction models. Nowadays, the GNSS-derived tropospheric zenith total delay (ZTD), comprising the zenith dry delay (ZDD) and zenith wet delay (ZWD), is achievable with sub-centimetre accuracy. However, if no representative near-site meteorological information is available, the quality of the ZDD derived from tropospheric models is degraded, leading to inaccurate estimation of the water vapour component ZWD as the difference between ZTD and ZDD. On the basis of freely accessible regional surface meteorological data, this paper proposes a height-dependent linear correction model for the a priori ZDD. By applying ordinary least-squares estimation (OLSE), bootstrapping (BOOT), and leave-one-out cross-validation (CROS) methods, the model parameters are estimated and analysed with respect to outlier detection. The model validation is carried out using GNSS stations with near-site meteorological measurements. The results verify the efficiency of the proposed ZDD correction model, showing a significant reduction in the mean bias from several centimetres to about 5 mm. The OLSE method enables fast computation, while the CROS procedure allows for outlier detection. All three methods produce consistent results after outlier elimination, which improves the regression quality by about 20% and the model accuracy by up to 30%.
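The correction is a height-dependent linear model for the a priori ZDD, fitted by OLS (with bootstrap and cross-validation variants). A minimal OLS sketch under assumed variable names and synthetic data is shown below; the actual regressors and data handling in the study may differ.

```python
import numpy as np

# Minimal ordinary least-squares fit of a height-dependent linear correction
# for a priori zenith dry delay (ZDD). Variable names and synthetic data are
# assumptions for illustration, not the study's dataset.

rng = np.random.default_rng(1)
station_height_m = rng.uniform(0, 2000, 50)                     # station heights
# Synthetic ZDD error (metres) that shrinks with height, plus noise
zdd_error_m = 0.03 - 1.0e-5 * station_height_m + rng.normal(0, 0.003, 50)

# Design matrix for delta_ZDD = beta0 + beta1 * height
A = np.column_stack([np.ones_like(station_height_m), station_height_m])
beta, *_ = np.linalg.lstsq(A, zdd_error_m, rcond=None)

def corrected_zdd(a_priori_zdd_m, height_m):
    """Apply the fitted height-dependent correction to an a priori ZDD."""
    return a_priori_zdd_m - (beta[0] + beta[1] * height_m)

print("intercept (m), slope (m per m of height):", beta)
```

Bootstrapping and leave-one-out cross-validation, as used in the paper, would simply repeat this fit on resampled or held-out station subsets to assess parameter stability and flag outliers.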
SKYNET: an efficient and robust neural network training tool for machine learning in astronomy
NASA Astrophysics Data System (ADS)
Graff, Philip; Feroz, Farhan; Hobson, Michael P.; Lasenby, Anthony
2014-06-01
We present the first public release of our generic neural network training algorithm, called SKYNET. This efficient and robust machine learning tool is able to train large and deep feed-forward neural networks, including autoencoders, for use in a wide range of supervised and unsupervised learning applications, such as regression, classification, density estimation, clustering and dimensionality reduction. SKYNET uses a `pre-training' method to obtain a set of network parameters that has empirically been shown to be close to a good solution, followed by further optimization using a regularized variant of Newton's method, where the level of regularization is determined and adjusted automatically; the latter uses second-order derivative information to improve convergence, but without the need to evaluate or store the full Hessian matrix, by using a fast approximate method to calculate Hessian-vector products. This combination of methods allows for the training of complicated networks that are difficult to optimize using standard backpropagation techniques. SKYNET employs convergence criteria that naturally prevent overfitting, and also includes a fast algorithm for estimating the accuracy of network outputs. The utility and flexibility of SKYNET are demonstrated by application to a number of toy problems, and to astronomical problems focusing on the recovery of structure from blurred and noisy images, the identification of gamma-ray bursters, and the compression and denoising of galaxy images. The SKYNET software, which is implemented in standard ANSI C and fully parallelized using MPI, is available at http://www.mrao.cam.ac.uk/software/skynet/.
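The abstract notes that second-order steps are taken without forming or storing the Hessian, via fast approximate Hessian-vector products. One standard way to do this (not necessarily SKYNET's exact scheme) is a finite difference of gradients, sketched below.

```python
import numpy as np

def hessian_vector_product(grad_fn, x, v, eps=1e-6):
    """Approximate H(x) @ v as (grad(x + eps*v) - grad(x)) / eps.

    Never forms the full Hessian; one extra gradient evaluation per product.
    Generic technique shown only to illustrate the idea in the abstract.
    """
    return (grad_fn(x + eps * v) - grad_fn(x)) / eps

# Toy check on f(x) = 0.5 * x^T A x, whose Hessian is exactly A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: A @ x
x0 = np.array([0.5, -1.0])
v = np.array([1.0, 2.0])
print(hessian_vector_product(grad, x0, v))   # ~ A @ v = [5., 5.]
```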
SBML-PET: a Systems Biology Markup Language-based parameter estimation tool.
Zi, Zhike; Klipp, Edda
2006-11-01
The estimation of model parameters from experimental data remains a bottleneck for a major breakthrough in systems biology. We present a Systems Biology Markup Language (SBML) based Parameter Estimation Tool (SBML-PET). The tool is designed to enable parameter estimation for biological models including signaling pathways, gene regulation networks and metabolic pathways. SBML-PET supports import and export of models in the SBML format. It can estimate the parameters by fitting a variety of experimental data from different experimental conditions. SBML-PET has the unique feature of supporting event definitions in the SBML model. SBML models can also be simulated in SBML-PET. The Stochastic Ranking Evolution Strategy (SRES) is incorporated in SBML-PET for parameter estimation jobs, and the classic ODE solver ODEPACK is used to solve the Ordinary Differential Equation (ODE) system. The tool is available at http://sysbio.molgen.mpg.de/SBML-PET/; the website also contains detailed documentation for SBML-PET.
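SBML-PET couples an evolutionary search (SRES) with an ODE solver to fit model parameters to data. The sketch below illustrates the same fit-an-ODE-to-data idea with a generic SciPy least-squares fit on a toy decay model; it is not SBML-PET's algorithm or API.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Toy ODE parameter estimation: fit the decay rate k in dx/dt = -k*x to noisy
# observations. Generic least squares, used only to illustrate the problem
# class; SBML-PET itself uses SRES and ODEPACK.

t_obs = np.linspace(0, 5, 20)
k_true = 0.8
rng = np.random.default_rng(2)
x_obs = np.exp(-k_true * t_obs) + rng.normal(0, 0.02, t_obs.size)

def residuals(params):
    (k,) = params
    sol = solve_ivp(lambda t, x: -k * x, (0, 5), [1.0], t_eval=t_obs)
    return sol.y[0] - x_obs

fit = least_squares(residuals, x0=[0.3], bounds=(0.0, 10.0))
print("estimated k:", float(fit.x[0]))
```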
Simons, Emily; Ferrari, Matthew; Fricks, John; Wannemuehler, Kathleen; Anand, Abhijeet; Burton, Anthony; Strebel, Peter
2012-06-09
In 2008 all WHO member states endorsed a target of 90% reduction in measles mortality by 2010 over 2000 levels. We developed a model to estimate progress made towards this goal. We constructed a state-space model with population and immunisation coverage estimates and reported surveillance data to estimate annual national measles cases, distributed across age classes. We estimated deaths by applying age-specific and country-specific case-fatality ratios to estimated cases in each age-country class. Estimated global measles mortality decreased 74% from 535,300 deaths (95% CI 347,200-976,400) in 2000 to 139,300 (71,200-447,800) in 2010. Measles mortality was reduced by more than three-quarters in all WHO regions except the WHO southeast Asia region. India accounted for 47% of estimated measles mortality in 2010, and the WHO African region accounted for 36%. Despite rapid progress in measles control from 2000 to 2007, delayed implementation of accelerated disease control in India and continued outbreaks in Africa stalled momentum towards the 2010 global measles mortality reduction goal. Intensified control measures and renewed political and financial commitment are needed to achieve mortality reduction targets and lay the foundation for future global eradication of measles. US Centers for Disease Control and Prevention (PMS 5U66/IP000161). Copyright © 2012 Elsevier Ltd. All rights reserved.
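As described above, the mortality estimate is the sum, over age-country classes, of estimated cases multiplied by an age- and country-specific case-fatality ratio. A minimal sketch of that final step, with entirely made-up numbers:

```python
# Deaths = sum over age-country classes of (estimated cases * case-fatality ratio).
# All numbers below are invented for illustration; they are not the study's inputs.

estimated_cases = {            # (country, age class) -> estimated cases
    ("A", "<1y"): 12_000,
    ("A", "1-4y"): 30_000,
    ("B", "<1y"): 5_000,
    ("B", "1-4y"): 9_000,
}
case_fatality_ratio = {        # (country, age class) -> CFR
    ("A", "<1y"): 0.03,
    ("A", "1-4y"): 0.015,
    ("B", "<1y"): 0.01,
    ("B", "1-4y"): 0.005,
}

deaths = sum(cases * case_fatality_ratio[key]
             for key, cases in estimated_cases.items())
print(f"estimated measles deaths: {deaths:,.0f}")
```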
provide a general overview of the upstream oil and gas exploration and production processes and emissions covered by the tool; a discussion of EPA’s plans for the 2014 NEI pertaining to oil and gas; use of the tool to compile emissions estimates
An Economic Evaluation of Food Safety Education Interventions: Estimates and Critical Data Gaps.
Zan, Hua; Lambea, Maria; McDowell, Joyce; Scharff, Robert L
2017-08-01
The economic evaluation of food safety interventions is an important tool that practitioners and policy makers use to assess the efficacy of their efforts. These evaluations are built on models that are dependent on accurate estimation of numerous input variables. In many cases, however, there is no data available to determine input values and expert opinion is used to generate estimates. This study uses a benefit-cost analysis of the food safety component of the adult Expanded Food and Nutrition Education Program (EFNEP) in Ohio as a vehicle for demonstrating how results based on variable values that are not objectively determined may be sensitive to alternative assumptions. In particular, the focus here is on how reported behavioral change is translated into economic benefits. Current gaps in the literature make it impossible to know with certainty how many people are protected by the education (what are the spillover effects?), the length of time education remains effective, and the level of risk reduction from change in behavior. Based on EFNEP survey data, food safety education led 37.4% of participants to improve their food safety behaviors. Under reasonable default assumptions, benefits from this improvement significantly outweigh costs, yielding a benefit-cost ratio of between 6.2 and 10.0. Incorporation of a sensitivity analysis using alternative estimates yields a greater range of estimates (0.2 to 56.3), which highlights the importance of future research aimed at filling these research gaps. Nevertheless, most reasonable assumptions lead to estimates of benefits that justify their costs.
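The benefit-cost ratio is simply the monetized benefit of averted illness divided by program cost, and the wide reported range comes from varying the uncertain inputs (spillover, duration of effect, risk reduction). A toy sensitivity sketch with invented numbers, not the EFNEP study's actual inputs or results:

```python
# Toy benefit-cost sensitivity sketch; all values are illustrative assumptions.

def benefit_cost_ratio(participants, share_improving, people_protected_per_participant,
                       risk_reduction, cost_per_illness, baseline_risk, program_cost):
    illnesses_averted = (participants * share_improving
                         * people_protected_per_participant
                         * baseline_risk * risk_reduction)
    return illnesses_averted * cost_per_illness / program_cost

low = benefit_cost_ratio(5_000, 0.374, 1.0, 0.10, 1_500, 0.25, 400_000)
high = benefit_cost_ratio(5_000, 0.374, 3.0, 0.40, 1_500, 0.25, 400_000)
print(f"BCR range under these assumptions: {low:.1f} to {high:.1f}")
```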
NASA Astrophysics Data System (ADS)
Aviles, Angelica I.; Widlak, Thomas; Casals, Alicia; Nillesen, Maartje M.; Ammari, Habib
2017-06-01
Cardiac motion estimation is an important diagnostic tool for detecting heart diseases and it has been explored with modalities such as MRI and conventional ultrasound (US) sequences. US cardiac motion estimation still presents challenges because of complex motion patterns and the presence of noise. In this work, we propose a novel approach to estimate cardiac motion using ultrafast ultrasound data. Our solution is based on a variational formulation characterized by the L2-regularized class. Displacement is represented by a lattice of b-splines and we ensure robustness, in the sense of eliminating outliers, by applying a maximum likelihood type estimator. While this is an important part of our solution, the main object of this work is to combine low-rank data representation with topology preservation. Low-rank data representation (achieved by finding the k-dominant singular values of a Casorati matrix arranged from the data sequence) speeds up the global solution and achieves noise reduction. On the other hand, topology preservation (achieved by monitoring the Jacobian determinant) allows one to radically rule out distortions while carefully controlling the size of allowed expansions and contractions. Our variational approach is carried out on a realistic dataset as well as on a simulated one. We demonstrate how our proposed variational solution deals with complex deformations through careful numerical experiments. The low-rank constraint speeds up the convergence of the optimization problem while topology preservation ensures a more accurate displacement. Beyond cardiac motion estimation, our approach is promising for the analysis of other organs that exhibit motion.
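The low-rank representation keeps the k dominant singular values of a Casorati matrix built from the image sequence (each frame flattened into a column). A minimal numpy sketch of that step is below; the choice of k and the synthetic data are purely illustrative.

```python
import numpy as np

def low_rank_casorati(frames, k):
    """Truncated-SVD approximation of an image sequence.

    frames: array of shape (n_frames, ny, nx)
    Returns the rank-k reconstruction of the Casorati matrix, reshaped back
    to an image sequence. k is a free, illustrative choice here.
    """
    n_frames, ny, nx = frames.shape
    casorati = frames.reshape(n_frames, ny * nx).T          # pixels x frames
    U, s, Vt = np.linalg.svd(casorati, full_matrices=False)
    approx = (U[:, :k] * s[:k]) @ Vt[:k, :]
    return approx.T.reshape(n_frames, ny, nx)

# Synthetic example: a noisy, slowly varying sequence
rng = np.random.default_rng(3)
t = np.linspace(0, 2 * np.pi, 40)
base = np.outer(np.sin(t), rng.standard_normal(32 * 32)).reshape(40, 32, 32)
noisy = base + 0.3 * rng.standard_normal((40, 32, 32))
denoised = low_rank_casorati(noisy, k=2)
print("residual norm after denoising:", float(np.linalg.norm(denoised - base)))
```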
Strehl-constrained reconstruction of post-adaptive optics data and the Software Package AIRY, v. 6.1
NASA Astrophysics Data System (ADS)
Carbillet, Marcel; La Camera, Andrea; Deguignet, Jérémy; Prato, Marco; Bertero, Mario; Aristidi, Éric; Boccacci, Patrizia
2014-08-01
We first briefly present the last version of the Software Package AIRY, version 6.1, a CAOS-based tool which includes various deconvolution methods, accelerations, regularizations, super-resolution, boundary effects reduction, point-spread function extraction/extrapolation, stopping rules, and constraints in the case of iterative blind deconvolution (IBD). Then, we focus on a new formulation of our Strehl-constrained IBD, here quantitatively compared to the original formulation for simulated near-infrared data of an 8-m class telescope equipped with adaptive optics (AO), showing their equivalence. Next, we extend the application of the original method to the visible domain with simulated data of an AO-equipped 1.5-m telescope, testing also the robustness of the method with respect to the Strehl ratio estimation.
Dust control effectiveness of drywall sanding tools.
Young-Corbett, Deborah E; Nussbaum, Maury A
2009-07-01
In this laboratory study, four drywall sanding tools were evaluated in terms of dust generation rates in the respirable and thoracic size classes. In a repeated measures study design, 16 participants performed simulated drywall finishing tasks with each of four tools: (1) ventilated sander, (2) pole sander, (3) block sander, and (4) wet sponge. Dependent variables of interest were thoracic and respirable breathing zone dust concentrations. Analysis by Friedman's Test revealed that the ventilated drywall sanding tool produced significantly less dust, of both size classes, than did the other three tools. The pole and wet sanders produced significantly less dust of both size classes than did the block sander. The block sander, the most commonly used tool in drywall finishing operations, produced significantly more dust of both size classes than did the other three tools. When compared with the block sander, the other tools offer substantial dust reduction. The ventilated tool reduced respirable concentrations by 88% and thoracic concentrations by 85%. The pole sander reduced respirable concentrations by 58% and thoracic by 50%. The wet sander produced reductions of 60% and 47% in the respirable and thoracic classes, respectively. Wet sponge sanders and pole sanders are effective at reducing breathing-zone dust concentrations; however, based on its superior dust control effectiveness, the ventilated sander is the recommended tool for drywall finishing operations.
Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project
NASA Astrophysics Data System (ADS)
van Eck, T.; Giardini, D.
2010-12-01
The EC Research Infrastructure (RI) project, Network of Research Infrastructures for European Seismology (NERIES), implemented a comprehensive, scalable, and sustainable European integrated RI for earthquake seismological data. NERIES opened a significant amount of additional seismological data, integrated different distributed data archives, and implemented and produced advanced analysis tools and software packages. A seismic data portal provides a single access point and overview of the European seismological data available to the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include, among others: - The expanded Virtual European Broadband Seismic Network (VEBSN) with real-time access to more than 500 stations from > 53 observatories. These data are continuously monitored, quality controlled and archived in the European Integrated Distributed waveform Archive (EIDA). - A unique integration of acceleration datasets from seven networks in seven European or associated countries, centrally accessible in a homogeneous format, thus forming the core comprehensive European acceleration database. Standardized parameter analysis and the associated software are included in the database. - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000 - 1963, M ≥ 5.8), including analysis tools. - Data from three one-year OBS deployments at three sites (Atlantic, Ionian and Ligurian Sea), stored in the general SEED format, thus creating the core integrated database for ocean-, sea- and land-based seismological observatories. Tools to facilitate analysis and data mining of the RI datasets include: - A comprehensive set of European seismological velocity reference models, including a standardized model description with several visualisation tools, currently being adapted to a global scale. - An integrated approach to seismic hazard modelling and forecasting, a community-accepted forecast testing and model validation approach, and the core hazard portal developed with the same technologies as the NERIES data portal. - Homogeneous shakemap estimation tools implemented at several large European observatories and a complementary new loss estimation software tool. - A comprehensive set of new techniques for geotechnical site characterization, with the relevant software packages documented and maintained (www.geopsy.org). - A set of software packages for data mining, data reduction, data exchange and information management in seismology, serving as research and observatory analysis tools. NERIES has a long-term impact and is coordinated with the related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010 - 2014), is funded and will integrate the seismological and earthquake engineering infrastructures. NERIES further provided the proof of concept for the ESFRI2008 initiative, the European Plate Observing System (EPOS), whose preparatory phase (2010 - 2014) is also funded by the EC.
i-Tree: Tools to assess and manage structure, function, and value of community forests
NASA Astrophysics Data System (ADS)
Hirabayashi, S.; Nowak, D.; Endreny, T. A.; Kroll, C.; Maco, S.
2011-12-01
Trees in urban communities can mitigate many adverse effects associated with anthropogenic activities and climate change (e.g. urban heat island, greenhouse gas, air pollution, and floods). To protect environmental and human health, managers need to make informed decisions regarding urban forest management practices. Here we present the i-Tree suite of software tools (www.itreetools.org) developed by the USDA Forest Service and their cooperators. This software suite can help urban forest managers assess and manage the structure, function, and value of urban tree populations regardless of community size or technical capacity. i-Tree is a state-of-the-art, peer-reviewed Windows GUI- or Web-based software that is freely available, supported, and continuously refined by the USDA Forest Service and their cooperators. Two major features of i-Tree are 1) to analyze current canopy structures and identify potential planting spots, and 2) to estimate the environmental benefits provided by the trees, such as carbon storage and sequestration, energy conservation, air pollution removal, and storm water reduction. To cover diverse forest topologies, various tools were developed within the i-Tree suite: i-Tree Design for points (individual trees), i-Tree Streets for lines (street trees), and i-Tree Eco, Vue, and Canopy (in the order of complexity) for areas (community trees). Once the forest structure is identified with these tools, ecosystem services provided by trees can be estimated with common models and protocols, and reports in the form of texts, charts, and figures are then created for users. Since i-Tree was developed with a client/server architecture, nationwide data in the US such as location-related parameters, weather, streamflow, and air pollution data are stored in the server and retrieved to a user's computer at run-time. Freely available remote-sensed images (e.g. NLCD and Google maps) are also employed to estimate tree canopy characteristics. As the demand for i-Tree grows internationally, environmental databases from more countries will be coupled with the software suite. Two more i-Tree applications, i-Tree Forecast and i-Tree Landscape are now under development. i-Tree Forecast simulates canopy structures for up to 100 years based on planting and mortality rates and adds capabilities for other i-Tree applications to estimate the benefits of future canopy scenarios. While most i-Tree applications employ a spatially lumped approach, i-Tree landscape employs a spatially distributed approach that allows users to map changes in canopy cover and ecosystem services through time and space. These new i-Tree tools provide an advanced platform for urban managers to assess the impact of current and future urban forests. i-Tree allows managers to promote effective urban forest management and sound arboricultural practices by providing information for advocacy and planning, baseline data for making informed decisions, and standardization for comparisons with other communities.
Maternal morbidity measurement tool pilot: study protocol.
Say, Lale; Barreix, Maria; Chou, Doris; Tunçalp, Özge; Cottler, Sara; McCaw-Binns, Affette; Gichuhi, Gathari Ndirangu; Taulo, Frank; Hindin, Michelle
2016-06-09
While it is estimated that for every maternal death, 20-30 women suffer morbidity, these estimates are not based on standardized methods and measures. Lack of an agreed-upon definition, identification criteria, standardized assessment tools, and indicators has limited valid, routine, and comparable measurements of maternal morbidity. The World Health Organization (WHO) convened the Maternal Morbidity Working Group (MMWG) to develop standardized methods to improve estimates of maternal morbidity. To date, the MMWG has developed a definition and provided input into the development of a set of measurement tools. This protocol outlines the pilot test for measuring maternal morbidity in antenatal and postnatal clinical populations using these new tools. In each setting, the tools will be piloted on approximately 250 women receiving antenatal care (ANC) (at least 28 weeks pregnant) and 250 women receiving postpartum care (PPC) (at least 6 weeks postpartum). The tools will be administered by trained health care workers. Each tool has three modules as follows: 1. personal history - socio-economic information, and risk-factors (such as violence and substance abuse) 2. patient symptoms - WHO Disability Assessment Schedule (WHODAS) 12-item, and mental health questionnaires, General Anxiety Disorder, 7-item (GAD-7) and Personal Health Questionnaire, 9-item (PHQ-9) 3. physical examination - signs, laboratory tests and results. This pilot (planned for Jamaica, Kenya and Malawi) will allow for comparing the types of morbidities women experience between and across settings, and determine the feasibility, acceptability and utility of using a modified, streamlined tool for routine measurement and summary estimates of morbidity to inform resource allocation and service provision. As part of the post-2015 Sustainable Development Goals (SDGs) estimating and measuring maternal morbidity will be essential to ensure appropriate resources are allocated to address its impact and improve well-being.
The Community-Focused Exposure and Risk Screening Tool (C-FERST) is an online tool which provides access to resources that can help communities learn more about their environmental issues, and explore exposure and risk reduction options.
Pathogen reduction co-benefits of nutrient best management practices
Wainger, Lisa A.; Barber, Mary C.
2016-01-01
Background Many of the practices currently underway to reduce nitrogen, phosphorus, and sediment loads entering the Chesapeake Bay have also been observed to support reduction of disease-causing pathogen loadings. We quantify how implementation of these practices, proposed to meet the nutrient and sediment caps prescribed by the Total Maximum Daily Load (TMDL), could reduce pathogen loadings and provide public health co-benefits within the Chesapeake Bay system. Methods We used published data on the pathogen reduction potential of management practices and baseline fecal coliform loadings estimated as part of prior modeling to estimate the reduction in pathogen loadings to the mainstem Potomac River and Chesapeake Bay attributable to practices implemented as part of the TMDL. We then compare the estimates with the baseline loadings of fecal coliform loadings to estimate the total pathogen reduction potential of the TMDL. Results We estimate that the TMDL practices have the potential to decrease disease-causing pathogen loads from all point and non-point sources to the mainstem Potomac River and the entire Chesapeake Bay watershed by 19% and 27%, respectively. These numbers are likely to be underestimates due to data limitations that forced us to omit some practices from analysis. Discussion Based on known impairments and disease incidence rates, we conclude that efforts to reduce nutrients may create substantial health co-benefits by improving the safety of water-contact recreation and seafood consumption. PMID:27904807
Pathogen reduction co-benefits of nutrient best management practices.
Richkus, Jennifer; Wainger, Lisa A; Barber, Mary C
2016-01-01
Many of the practices currently underway to reduce nitrogen, phosphorus, and sediment loads entering the Chesapeake Bay have also been observed to support reduction of disease-causing pathogen loadings. We quantify how implementation of these practices, proposed to meet the nutrient and sediment caps prescribed by the Total Maximum Daily Load (TMDL), could reduce pathogen loadings and provide public health co-benefits within the Chesapeake Bay system. We used published data on the pathogen reduction potential of management practices and baseline fecal coliform loadings estimated as part of prior modeling to estimate the reduction in pathogen loadings to the mainstem Potomac River and Chesapeake Bay attributable to practices implemented as part of the TMDL. We then compare the estimates with the baseline loadings of fecal coliform loadings to estimate the total pathogen reduction potential of the TMDL. We estimate that the TMDL practices have the potential to decrease disease-causing pathogen loads from all point and non-point sources to the mainstem Potomac River and the entire Chesapeake Bay watershed by 19% and 27%, respectively. These numbers are likely to be underestimates due to data limitations that forced us to omit some practices from analysis. Based on known impairments and disease incidence rates, we conclude that efforts to reduce nutrients may create substantial health co-benefits by improving the safety of water-contact recreation and seafood consumption.
Impact of Paint Color on Rest Period Climate Control Loads in Long-Haul Trucks: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lustbader, J.; Kreutzer, C.; Jeffers, M.
Cab climate conditioning is one of the primary reasons for operating the main engine in a long-haul truck during driver rest periods. In the United States, sleeper cab trucks use approximately 667 million gallons of fuel annually for rest period idling. The U.S. Department of Energy's National Renewable Energy Laboratory's (NREL) CoolCab Project works closely with industry to design efficient thermal management systems for long-haul trucks that minimize engine idling and fuel use while maintaining occupant comfort. Heat transfer to the vehicle interior from opaque exterior surfaces is one of the major heat pathways that contribute to air conditioning loads during long-haul truck daytime rest period idling. To quantify the impact of paint color and the opportunity for advanced paints, NREL collaborated with Volvo Group North America, PPG Industries, and Dometic Environmental Corporation. Initial screening simulations using CoolCalc, NREL's rapid HVAC load estimation tool, showed promising air-conditioning load reductions due to paint color selection. Tests conducted at NREL's Vehicle Testing and Integration Facility using long-haul truck cab sections, 'test bucks,' showed 31.1% of the maximum possible reduction in rise over ambient temperature and a 20.8% reduction in daily electric air conditioning energy use by switching from black to white paint. Additionally, changing from blue to an advanced color-matched solar reflective blue paint resulted in a 7.3% reduction in daily electric air conditioning energy use for weather conditions tested in Colorado. National-level modeling results using weather data from major U.S. cities indicated that the increase in heating loads due to lighter paint colors is much smaller than the reduction in cooling loads.
Keita, Youssouf; Sangho, Hamadoun; Roberton, Timothy; Vignola, Emilia; Traoré, Mariam; Munos, Melinda
2017-11-07
Mali is one of four countries implementing a National Evaluation Platform (NEP) to build local capacity to answer evaluation questions for maternal, newborn, child health and nutrition (MNCH&N). In 2014-15, NEP-Mali addressed questions about the potential impact of Mali's MNCH&N plans and strategies, and identified priority interventions to achieve targeted mortality reductions. The NEP-Mali team modeled the potential impact of three intervention packages in the Lives Saved Tool (LiST) from 2014 to 2023. One projection included the interventions and targets from Mali's ten-year health strategy (PDDSS) for 2014-2023, and two others modeled intervention packages that included scale up of antenatal, intrapartum, and curative interventions, as well as reductions in stunting and wasting. We modeled the change in maternal, newborn and under-five mortality rates under these three projections, as well as the number of lives saved, overall and by intervention. If Mali were to achieve the MNCH&N coverage targets from its health strategy, under-5 mortality would be reduced from 121 per 1000 live births to 93 per 1000, far from the target of 69 deaths per 1000. Projections 1 and 2 produced estimated mortality reductions from 121 deaths per 1000 to 70 and 68 deaths per 1000, respectively. With respect to neonatal mortality, the mortality rate would be reduced from 39 to 32 deaths per 1000 live births under the current health strategy, and to 25 per 1000 under projections 1 and 2. This study revealed that achieving the coverage targets for the MNCH&N interventions in the 2014-23 PDDSS would likely not allow Mali to achieve its mortality targets. The NEP-Mali team was able to identify two packages of MNCH&N interventions (and targets) that achieved under-5 and neonatal mortality rates at, or very near, the PDDSS targets. The Malian Ministry of Health and Public Hygiene is using these results to revise its plans and strategies.
Predicting tool life in turning operations using neural networks and image processing
NASA Astrophysics Data System (ADS)
Mikołajczyk, T.; Nowicki, K.; Bustillo, A.; Yu Pimenov, D.
2018-05-01
A two-step method is presented for the automatic prediction of tool life in turning operations. First, experimental data are collected for three cutting edges under the same constant processing conditions. In these experiments, the parameter of tool wear, VB, is measured with conventional methods and the same parameter is estimated using Neural Wear, a customized software package that combines flank wear image recognition and Artificial Neural Networks (ANNs). Second, an ANN model of tool life is trained with the data collected from the first two cutting edges and the subsequent model is evaluated on two different subsets for the third cutting edge: the first subset is obtained from the direct measurement of tool wear and the second is obtained from the Neural Wear software that estimates tool wear using edge images. Although the complete-automated solution, Neural Wear software for tool wear recognition plus the ANN model of tool life prediction, presented a slightly higher error than the direct measurements, it was within the same range and can meet all industrial requirements. These results confirm that the combination of image recognition software and ANN modelling could potentially be developed into a useful industrial tool for low-cost estimation of tool life in turning operations.
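The second step trains an ANN to predict tool life from wear data collected on the first two cutting edges. A minimal scikit-learn regression sketch of that kind of model is below; the architecture, features, and synthetic data are assumptions for illustration, not the authors' Neural Wear configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Minimal ANN regression sketch: predict remaining tool life (min) from
# cutting time and measured flank wear VB (mm). All data are synthetic.

rng = np.random.default_rng(4)
cutting_time = rng.uniform(0, 30, 200)                       # minutes
vb = 0.01 * cutting_time + rng.normal(0, 0.01, 200)          # synthetic wear
X = np.column_stack([cutting_time, vb])
tool_life_remaining = np.clip(30 - cutting_time, 0, None) + rng.normal(0, 0.5, 200)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8),
                                   max_iter=5000, random_state=0))
model.fit(X, tool_life_remaining)
print("predicted remaining life at t=10 min, VB=0.1 mm:",
      float(model.predict([[10.0, 0.1]])[0]))
```

In the paper's fully automated variant, the VB feature would come from the image-recognition estimate rather than direct measurement, which is what introduces the slightly higher, but still acceptable, error.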
NASA Astrophysics Data System (ADS)
Digiovanni, K. A.; Montalto, F. A.; Gaffin, S.; Rosenzweig, C.
2010-12-01
Green roofs and other urban green spaces can provide a variety of valuable benefits including reduction of the urban heat island effect, reduction of stormwater runoff, carbon sequestration, oxygen generation, air pollution mitigation, etc. As many of these benefits are directly linked to the processes of evaporation and transpiration, accurate and representative estimation of urban evapotranspiration (ET) is necessary for predicting and quantifying such benefits. However, many common ET estimation procedures were developed for agricultural applications, and thus carry inherent assumptions that may only rarely be applicable to urban green spaces. Various researchers have identified the estimation of expected urban ET rates as a critical, yet poorly studied, component of urban green space performance prediction and cite that further evaluation is needed to reconcile differences in predictions from varying ET modeling approaches. A small-scale green roof lysimeter setup on the green roof of the Ethical Culture Fieldston School in the Bronx, NY has been the focus of ongoing monitoring initiated in June 2009. The experimental setup includes a 0.6 m by 1.2 m lysimeter replicating the anatomy of the building's 500 m2 green roof, with a roof membrane, drainage layer, and 10 cm media depth, planted with a variety of Sedum species. Soil moisture and qualitative runoff measurements are also recorded in the lysimeter, while a weather station on the rooftop records climatological data. Direct quantification of actual evapotranspiration (AET) from the green roof weighing lysimeter was achieved through a mass balance approach during periods without precipitation or drainage. A comparison of AET to estimates of potential evapotranspiration (PET) calculated from empirically and physically based ET models was performed in order to evaluate the applicability of conventional ET equations for estimating ET from green roofs. Results show that the empirically based Thornthwaite approach for estimating monthly average PET underestimates AET by 54% over the course of a one-year period, and performs similarly on a monthly basis. Estimates of PET from the Northeast Regional Climate Center MORECS model, based on a variation of the Penman-Monteith model, overestimate AET by only 2% over a one-year period. However, monthly and daily estimates were not accurate, with the model overestimating during warm summer months by as much as 206% and underestimating during winter months by as much as 58%, which would have significant implications if such estimates were used to evaluate the potential benefits of green roofs. Thus, further evaluation and improvement of these and other methodologies are needed and will be pursued for estimation of ET from green roofs and other urban green spaces, including NYC Greenstreets and urban parks.
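For reference, the empirical Thornthwaite estimate mentioned above uses only monthly mean air temperature. The sketch below follows the commonly cited textbook form of the formula with a day-length/month-length correction; the coefficients and the illustrative inputs should be checked against the formulation a given study actually used.

```python
def thornthwaite_pet(monthly_mean_temp_c, month_days, daylight_hours):
    """Monthly potential ET (mm) via the classic Thornthwaite (1948) formula.

    monthly_mean_temp_c: 12 monthly mean air temperatures (deg C)
    month_days, daylight_hours: per-month day counts and mean daylight hours,
        used for the standard (N/12)*(d/30) correction.
    Coefficients follow the commonly cited form; verify against the source
    formulation before relying on the numbers.
    """
    heat_index = sum((max(t, 0.0) / 5.0) ** 1.514 for t in monthly_mean_temp_c)
    a = (6.75e-7 * heat_index**3 - 7.71e-5 * heat_index**2
         + 1.792e-2 * heat_index + 0.49239)
    pet = []
    for t, d, n in zip(monthly_mean_temp_c, month_days, daylight_hours):
        if t <= 0.0:
            pet.append(0.0)
            continue
        unadjusted = 16.0 * (10.0 * t / heat_index) ** a      # mm/month
        pet.append(unadjusted * (n / 12.0) * (d / 30.0))
    return pet

# Illustrative mid-latitude inputs (not site data from the study)
temps = [0, 1, 5, 11, 17, 22, 25, 24, 20, 14, 8, 2]
days = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
sun = [9.5, 10.5, 12.0, 13.5, 14.5, 15.0, 14.8, 13.8, 12.3, 10.8, 9.7, 9.2]
print([round(p, 1) for p in thornthwaite_pet(temps, days, sun)])
```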
A Comparative Analysis of Life-Cycle Assessment Tools for End-of-Life Materials Management Systems
We identified and evaluated five life-cycle assessment tools that community decision makers can use to assess the environmental and economic impacts of end-of-life (EOL) materials management options. The tools evaluated in this report are the Waste Reduction Model (WARM), municipal s...
Arterial waveguide model for shear wave elastography: implementation and in vitro validation
NASA Astrophysics Data System (ADS)
Vaziri Astaneh, Ali; Urban, Matthew W.; Aquino, Wilkins; Greenleaf, James F.; Guddati, Murthy N.
2017-07-01
Arterial stiffness is found to be an early indicator of many cardiovascular diseases. Among various techniques, shear wave elastography has emerged as a promising tool for estimating local arterial stiffness through the observed dispersion of guided waves. In this paper, we develop efficient models for the computational simulation of guided wave dispersion in arterial walls. The models are capable of considering fluid-loaded tubes, immersed in fluid or embedded in a solid, which are encountered in in vitro/ex vivo, and in vivo experiments. The proposed methods are based on judiciously combining Fourier transformation and finite element discretization, leading to a significant reduction in computational cost while fully capturing complex 3D wave propagation. The developed methods are implemented in open-source code, and verified by comparing them with significantly more expensive, fully 3D finite element models. We also validate the models using the shear wave elastography of tissue-mimicking phantoms. The computational efficiency of the developed methods indicates the possibility of being able to estimate arterial stiffness in real time, which would be beneficial in clinical settings.
Assessing fossil fuel CO2 emissions in California using atmospheric observations and models
NASA Astrophysics Data System (ADS)
Graven, H.; Fischer, M. L.; Lueker, T.; Jeong, S.; Guilderson, T. P.; Keeling, R. F.; Bambha, R.; Brophy, K.; Callahan, W.; Cui, X.; Frankenberg, C.; Gurney, K. R.; LaFranchi, B. W.; Lehman, S. J.; Michelsen, H.; Miller, J. B.; Newman, S.; Paplawsky, W.; Parazoo, N. C.; Sloop, C.; Walker, S. J.
2018-06-01
Analysis systems incorporating atmospheric observations could provide a powerful tool for validating fossil fuel CO2 (ffCO2) emissions reported for individual regions, provided that fossil fuel sources can be separated from other CO2 sources or sinks and atmospheric transport can be accurately accounted for. We quantified ffCO2 by measuring radiocarbon (14C) in CO2, an accurate fossil-carbon tracer, at nine observation sites in California for three months in 2014–15. There is strong agreement between the measurements and ffCO2 simulated using a high-resolution atmospheric model and a spatiotemporally-resolved fossil fuel flux estimate. Inverse estimates of total in-state ffCO2 emissions are consistent with the California Air Resources Board’s reported ffCO2 emissions, providing tentative validation of California’s reported ffCO2 emissions in 2014–15. Continuing this prototype analysis system could provide critical independent evaluation of reported ffCO2 emissions and emissions reductions in California, and the system could be expanded to other, more data-poor regions.
Dydrogesterone does not reverse the cardiovascular benefits of percutaneous estradiol.
Kuba, V M; Teixeira, M A M; Meirelles, R M R; Assumpção, C R L; Costa, O S
2013-02-01
To evaluate the influence of dydrogesterone on the estimated cardiovascular risk of users of hormone replacement therapy (HRT) (percutaneous 17β-estradiol alone or in combination with dydrogesterone) and HRT non-users, using the Framingham score over a period of 2 years. Framingham scores were calculated from the medical records of patients treated for at least 2 years with 17β-estradiol alone or in combination with dydrogesterone, and of HRT non-users, followed for at least 2 years at Instituto Estadual de Diabetes e Endocrinologia Luiz Capriglione. Improvements in lipid profile, glucose and blood pressure levels, which reduced the estimated cardiovascular risk, were observed in the 17β-estradiol group. Similar changes were observed in the users of 17β-estradiol + dydrogesterone, suggesting that this progestogen does not attenuate the effects of 17β-estradiol. Both HRT groups showed a reduction in their Framingham score. In contrast to data from other HRT investigations of cardiovascular risk, these formulations proved to be safe, even in the first year of use.
An approach to and web-based tool for infectious disease outbreak intervention analysis
Daughton, Ashlynn R.; Generous, Nicholas; Priedhorsky, Reid; ...
2017-04-18
Infectious diseases are a leading cause of death globally. Decisions surrounding how to control an infectious disease outbreak currently rely on a subjective process involving surveillance and expert opinion. However, there are many situations where neither may be available. Modeling can fill gaps in the decision making process by using available data to provide quantitative estimates of outbreak trajectories. Effective reduction of the spread of infectious diseases can be achieved through collaboration between the modeling community and public health policy community. However, such collaboration is rare, resulting in a lack of models that meet the needs of the public health community. Here we show a Susceptible-Infectious-Recovered (SIR) model modified to include control measures that allows parameter ranges, rather than parameter point estimates, and includes a web user interface for broad adoption. We apply the model to three diseases, measles, norovirus and influenza, to show the feasibility of its use and describe a research agenda to further promote interactions between decision makers and the modeling community.
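The core of the web tool is an SIR model whose transmission term is modified by control measures and which is run over parameter ranges rather than point estimates. A minimal sketch of that idea (a control that scales the transmission rate after an intervention day, swept over several R0 values) is below; it is not the authors' code or interface, and all parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import odeint

# Minimal SIR model with a control measure that reduces transmission after an
# intervention day, evaluated over a range of R0 values. Illustrative only.

def sir_with_control(y, t, beta, gamma, control_start, control_effect):
    s, i, r = y
    b = beta * (1.0 - control_effect) if t >= control_start else beta
    ds = -b * s * i
    di = b * s * i - gamma * i
    dr = gamma * i
    return [ds, di, dr]

t = np.linspace(0, 120, 241)            # days
gamma = 1.0 / 7.0                       # ~7-day infectious period (assumed)
y0 = [0.999, 0.001, 0.0]                # fractions of the population

for r0 in (1.5, 2.0, 2.5):              # a parameter range, not a point estimate
    sol = odeint(sir_with_control, y0, t,
                 args=(r0 * gamma, gamma, 30.0, 0.4))   # 40% transmission cut at day 30
    print(f"R0={r0}: final attack rate = {sol[-1, 2]:.2f}")
```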
US forest carbon calculation tool: forest-land carbon stocks and net annual stock change
James E. Smith; Linda S. Heath; Michael C. Nichols
2007-01-01
The Carbon Calculation Tool 4.0, CCTv40.exe, is a computer application that reads publicly available forest inventory data collected by the U.S. Forest Service's Forest Inventory and Analysis Program (FIA) and generates state-level annualized estimates of carbon stocks on forest land based on FORCARB2 estimators. Estimates can be recalculated as...
Using FIESTA , an R-based tool for analysts, to look at temporal trends in forest estimates
Tracey S. Frescino; Paul L. Patterson; Elizabeth A. Freeman; Gretchen G. Moisen
2012-01-01
FIESTA (Forest Inventory Estimation for Analysis) is a user-friendly R package that supports the production of estimates for forest resources based on procedures from Bechtold and Patterson (2005). The package produces output consistent with current tools available for the Forest Inventory and Analysis National Program, such as FIDO (Forest Inventory Data Online) and...
Mychek-Londer, Justin G.; Bunnell, David B.
2013-01-01
Accurate estimates of fish consumption are required to understand trophic interactions and facilitate ecosystem-based fishery management. Despite their importance within the food-web, no method currently exists to estimate daily consumption for Great Lakes slimy (Cottus cognatus) and deepwater sculpin (Myoxocephalus thompsonii). We conducted experiments to estimate gastric evacuation (GEVAC) and collected field data from Lake Michigan to estimate the index of fullness [(g prey/g fish weight) × 100%] to determine daily ration for water temperatures ranging from 2 to 5 °C, coinciding with the winter and early spring season. Exponential GEVAC rates equaled 0.0115/h for slimy sculpin and 0.0147/h for deepwater sculpin, and did not vary between 2.7 °C and 5.1 °C for either species or between prey types (Mysis relicta and fish eggs) for slimy sculpin. Index of fullness varied with fish size, and averaged 1.93% and 1.85% for slimy and deepwater sculpins, respectively. Maximum index of fullness was generally higher (except for the smallest sizes) for both species in 2009–2010 than in 1976 despite reductions in a primary prey, Diporeia spp. Predictive daily ration equations were derived as a function of fish dry weight. Estimates of daily consumption ranged from 0.2 to 0.8% of body weight, which was within the low range of estimates from other species at comparably low water temperatures. These results provide a tool to estimate the consumptive demand of sculpins, which will improve our understanding of benthic offshore food webs and aid in the management and restoration of these native species in the Great Lakes.
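Daily ration estimates of this kind typically combine the exponential evacuation rate with mean stomach fullness, e.g. the Eggers/Elliott-Persson form C24 ≈ 24 · S̄ · R. Whether this exact formulation underlies the study's predictive equations is an assumption, but plugging in the reported values reproduces consumption within the reported range, as the sketch shows.

```python
def daily_ration_percent_bw(mean_fullness_percent, evacuation_rate_per_h, hours=24.0):
    """Eggers-type daily ration: C = hours * mean stomach fullness * evacuation rate.

    mean_fullness_percent: mean index of fullness, % of body weight
    evacuation_rate_per_h: exponential gastric evacuation rate (1/h)
    Returns daily consumption as % of body weight. Use of this exact form for
    the study above is an assumption; the input values are from the abstract.
    """
    return hours * mean_fullness_percent * evacuation_rate_per_h

print("slimy sculpin:     %.2f %% bw/day" % daily_ration_percent_bw(1.93, 0.0115))
print("deepwater sculpin: %.2f %% bw/day" % daily_ration_percent_bw(1.85, 0.0147))
# Both fall inside the 0.2-0.8% of body weight range reported above.
```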
DOT National Transportation Integrated Search
2014-01-01
A flashing LED stop sign is essentially a normal octagonal stop sign with light-emitting diodes (LEDs) on the stop sign's corners. A hierarchical Bayes observational before/after study found an estimated reduction of about 41.5% in right-angle cr...
Ash reduction system using electrically heated particulate matter filter
Gonze, Eugene V [Pinckney, MI; Paratore, Jr., Michael J; He, Yongsheng [Sterling Heights, MI
2011-08-16
A control system for reducing ash comprises a temperature estimator module that estimates a temperature of an electrically heated particulate matter (PM) filter. A temperature and position estimator module estimates a position and temperature of an oxidation wave within the electrically heated PM filter. An ash reduction control module adjusts at least one of exhaust flow, fuel and oxygen levels in the electrically heated PM filter to adjust a position of the oxidation wave within the electrically heated PM filter based on the oxidation wave temperature and position.
NASA Astrophysics Data System (ADS)
Don, Steven; Whiting, Bruce R.; Hildebolt, Charles F.; Sehnert, W. James; Ellinwood, Jacquelyn S.; Töpfer, Karin; Masoumzadeh, Parinaz; Kraus, Richard A.; Kronemer, Keith A.; Herman, Thomas; McAlister, William H.
2006-03-01
The risk of radiation exposure is greatest for pediatric patients and, thus, there is a great incentive to reduce the radiation dose used in diagnostic procedures for children to "as low as reasonably achievable" (ALARA). Testing of low-dose protocols presents a dilemma, as it is unethical to repeatedly expose patients to ionizing radiation in order to determine optimum protocols. To overcome this problem, we have developed a computed-radiography (CR) dose-reduction simulation tool that takes existing images and adds synthetic noise to create realistic images that correspond to images generated with lower doses. The objective of our study was to determine the extent to which simulated, low-dose images corresponded with original (non-simulated) low-dose images. To make this determination, we created pneumothoraces of known volumes in five neonate cadavers and obtained images of the neonates at 10 mR, 1 mR and 0.1 mR (as measured at the cassette plate). The 10-mR exposures were considered "relatively-noise-free" images. We used these 10 mR-images and our simulation tool to create simulated 0.1- and 1-mR images. For the simulated and original images, we identified regions of interest (ROI) of the entire chest, free-in-air region, and liver. We compared the means and standard deviations of the ROI grey-scale values of the simulated and original images with paired t tests. We also had observers rate simulated and original images for image quality and for the presence or absence of pneumothoraces. There was no statistically significant difference in grey-scale-value means nor standard deviations between simulated and original entire chest ROI regions. The observer performance suggests that an exposure >=0.2 mR is required to detect the presence or absence of pneumothoraces. These preliminary results indicate that the use of the simulation tool is promising for achieving ALARA exposures in children.
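One common way to build such a dose-reduction simulation (not necessarily the authors' exact method) is to treat the high-exposure image as nearly noise-free and add zero-mean noise whose variance makes up the difference expected at the lower exposure, assuming quantum-limited noise whose variance scales inversely with exposure in the linearized signal domain. A hedged numpy sketch:

```python
import numpy as np

def simulate_lower_exposure(image_hi, exposure_hi_mR, exposure_lo_mR,
                            noise_var_at_1mR=25.0, rng=None):
    """Add synthetic noise to a high-exposure image to mimic a lower exposure.

    Assumes quantum-limited noise with variance proportional to 1/exposure in
    the (linearized) signal domain, so the variance to add is the difference
    between the low- and high-exposure variances. The scaling constant and
    this simple model are illustrative assumptions, not the study's method.
    """
    rng = np.random.default_rng() if rng is None else rng
    var_hi = noise_var_at_1mR / exposure_hi_mR
    var_lo = noise_var_at_1mR / exposure_lo_mR
    add_sigma = np.sqrt(max(var_lo - var_hi, 0.0))
    return image_hi + rng.normal(0.0, add_sigma, image_hi.shape)

# Example: simulate a 0.1 mR image from a 10 mR ("relatively noise-free") image
hi = np.full((256, 256), 1000.0)
lo_sim = simulate_lower_exposure(hi, exposure_hi_mR=10.0, exposure_lo_mR=0.1,
                                 rng=np.random.default_rng(5))
print("added noise std:", round(float(lo_sim.std()), 1))
```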
An open tool for input function estimation and quantification of dynamic PET FDG brain scans.
Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro
2016-08-01
Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. The main contribution of this article is the development of an open-source, free to use tool that encapsulates several well-known methods for the estimation of the input function and the quantification of dynamic PET FDG studies. Some alternative strategies are also proposed and implemented in the tool for the segmentation of blood pools and parameter estimation. The tool was tested on phantoms with encouraging results that suggest that even bloodless estimators could provide a viable alternative to blood sampling for quantification using graphical analysis. The open tool is a promising opportunity for collaboration among investigators and further validation on real studies.
Graeve, Catherine; McGovern, Patricia; Nachreiner, Nancy M; Ayers, Lynn
2014-01-01
Occupational health nurses use their knowledge and skills to improve the health and safety of the working population; however, companies increasingly face budget constraints and may eliminate health and safety programs. Occupational health nurses must be prepared to document their services and outcomes, and use quantitative tools to demonstrate their value to employers. The aim of this project was to create and pilot test a quantitative tool for occupational health nurses to track their activities and potential cost savings for on-site occupational health nursing services. Tool development included a pilot test in which semi-structured interviews with occupational health and safety leaders were conducted to identify current issues and products used for estimating the value of occupational health nursing services. The outcome was the creation of a tool that estimates the economic value of occupational health nursing services. The feasibility and potential value of this tool are described.
Castañeda-Orjuela, Carlos; Romero, Martin; Arce, Patricia; Resch, Stephen; Janusz, Cara B; Toscano, Cristiana M; De la Hoz-Restrepo, Fernando
2013-07-02
The cost of Expanded Programs on Immunization (EPI) is an important aspect of the economic and financial analysis needed for planning purposes. Costs also are needed for cost-effectiveness analysis of introducing new vaccines. We describe a costing tool that improves the speed, accuracy, and availability of EPI costs and that was piloted in Colombia. The ProVac CostVac Tool is a spreadsheet-based tool that estimates overall EPI costs considering program inputs (personnel, cold chain, vaccines, supplies, etc.) at three administrative levels (central, departmental, and municipal) and one service delivery level (health facilities). It uses various costing methods. The tool was evaluated through a pilot exercise in Colombia. In addition to the costs obtained from the central and intermediate administrative levels, a survey of 112 local health facilities was conducted to collect vaccination costs. Total cost of the EPI, cost per dose of vaccine delivered, and cost per fully vaccinated child with the recommended immunization schedule in Colombia in 2009 were estimated. The ProVac CostVac Tool is a novel, user-friendly tool, which allows users to conduct an EPI costing study following guidelines for cost studies. The total costs of the Colombian EPI were estimated at US$ 107.8 million in 2009. The cost for a fully immunized child with the recommended schedule was estimated at US$ 153.62. Vaccines and vaccination supplies accounted for 58% of total costs, personnel for 21%, cold chain for 18%, and transportation for 2%. Most EPI costs are incurred at the central level (62%). The major cost driver at the department and municipal levels is personnel costs. The ProVac CostVac Tool proved to be a comprehensive and useful tool that will allow researchers and health officials to estimate the actual cost for national immunization programs. The present analysis shows that personnel, cold chain, and transportation are important components of EPI and should be carefully estimated in the cost analysis, particularly when evaluating new vaccine introduction. Copyright © 2013 Elsevier Ltd. All rights reserved.
Brady, Samuel L; Moore, Bria M; Yee, Brian S; Kaufman, Robert A
2014-01-01
To determine a comprehensive method for the implementation of adaptive statistical iterative reconstruction (ASIR) for maximal radiation dose reduction in pediatric computed tomography (CT) without changing the magnitude of noise in the reconstructed image or the contrast-to-noise ratio (CNR) in the patient. The institutional review board waived the need to obtain informed consent for this HIPAA-compliant quality analysis. Chest and abdominopelvic CT images obtained before ASIR implementation (183 patient examinations; mean patient age, 8.8 years ± 6.2 [standard deviation]; range, 1 month to 27 years) were analyzed for image noise and CNR. These measurements were used in conjunction with noise models derived from anthropomorphic phantoms to establish new beam current-modulated CT parameters to implement 40% ASIR at 120 and 100 kVp without changing noise texture or magnitude. Image noise was assessed in images obtained after ASIR implementation (492 patient examinations; mean patient age, 7.6 years ± 5.4; range, 2 months to 28 years) the same way it was assessed in the pre-ASIR analysis. Dose reduction was determined by comparing size-specific dose estimates in the pre- and post-ASIR patient cohorts. Data were analyzed with paired t tests. With 40% ASIR implementation, the average relative dose reduction for chest CT was 39% (2.7/4.4 mGy), with a maximum reduction of 72% (5.3/18.8 mGy). The average relative dose reduction for abdominopelvic CT was 29% (4.8/6.8 mGy), with a maximum reduction of 64% (7.6/20.9 mGy). Beam current modulation was unnecessary for patients weighing 40 kg or less. The difference between 0% and 40% ASIR noise magnitude was less than 1 HU, with statistically nonsignificant increases in patient CNR at 100 kVp of 8% (15.3/14.2; P = .41) for chest CT and 13% (7.8/6.8; P = .40) for abdominopelvic CT. Radiation dose reduction at pediatric CT was achieved when 40% ASIR was implemented as a dose reduction tool only; no net change to the magnitude of noise in the reconstructed image or the patient CNR occurred. © RSNA, 2013.
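The reported relative reductions follow directly from the pre- and post-ASIR size-specific dose estimates quoted in the abstract; a quick arithmetic check using those values:

```python
# Relative dose reduction = (pre - post) / pre, using the mGy values quoted above.
cases = {
    "chest, average":            (4.4, 2.7),   # -> ~39%
    "chest, maximum":            (18.8, 5.3),  # -> ~72%
    "abdominopelvic, average":   (6.8, 4.8),   # -> ~29%
    "abdominopelvic, maximum":   (20.9, 7.6),  # -> ~64%
}
for name, (pre, post) in cases.items():
    print(f"{name}: {100 * (pre - post) / pre:.0f}% reduction")
```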
Estimation of Broadband Shock Noise Reduction in Turbulent Jets by Water Injection
NASA Technical Reports Server (NTRS)
Kandula, Max; Lonerjan, Michael J.
2008-01-01
The concept of effective jet properties introduced by the authors (AIAA-2007-3645) has been extended to the estimation of broadband shock noise reduction by water injection in supersonic jets. Comparison of the predictions with the test data for cold underexpanded supersonic nozzles shows a satisfactory agreement. The results also reveal the range of water mass flow rates over which saturation of mixing noise reduction and existence of parasitic noise are manifest.
Multi-category micro-milling tool wear monitoring with continuous hidden Markov models
NASA Astrophysics Data System (ADS)
Zhu, Kunpeng; Wong, Yoke San; Hong, Geok Soon
2009-02-01
In-process monitoring of tool conditions is important in micro-machining due to the high precision requirement and high tool wear rate. Tool condition monitoring in micro-machining poses new challenges compared to conventional machining. In this paper, a multi-category classification approach is proposed for tool flank wear state identification in micro-milling. Continuous hidden Markov models (HMMs) are adapted for modeling the tool wear process in micro-milling and for estimating the tool wear state from cutting force features. For a noise-robust approach, the HMM outputs are passed through a median filter to suppress spurious state transitions caused by the high noise level. A detailed study on the selection of HMM structures for tool condition monitoring (TCM) is presented. Case studies on tool state estimation in the micro-milling of pure copper and steel demonstrate the effectiveness and potential of these methods.
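A minimal sketch of the general approach (not the authors' exact models): a continuous-observation Gaussian HMM is fit to a cutting-force feature, the most likely wear state is decoded per time step, and a median filter smooths the decoded sequence to suppress noise-induced state flicker. The synthetic feature values and the hmmlearn-based implementation are illustrative assumptions.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM
from scipy.signal import medfilt

rng = np.random.default_rng(0)
# Synthetic 1-D force feature drifting through three wear states (fresh -> worn -> severe).
X = np.concatenate([rng.normal(m, 0.4, 200) for m in (1.0, 2.0, 3.5)]).reshape(-1, 1)

model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=100, random_state=0)
model.fit(X)
raw_states = model.predict(X)                                        # per-sample wear-state estimate
smooth_states = medfilt(raw_states.astype(float), kernel_size=9).astype(int)  # median-filtered states

print("raw transitions:     ", int(np.sum(np.diff(raw_states) != 0)))
print("filtered transitions:", int(np.sum(np.diff(smooth_states) != 0)))
```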
Giga-voxel computational morphogenesis for structural design
NASA Astrophysics Data System (ADS)
Aage, Niels; Andreassen, Erik; Lazarov, Boyan S.; Sigmund, Ole
2017-10-01
In the design of industrial products ranging from hearing aids to automobiles and aeroplanes, material is distributed so as to maximize the performance and minimize the cost. Historically, human intuition and insight have driven the evolution of mechanical design, recently assisted by computer-aided design approaches. The computer-aided approach known as topology optimization enables unrestricted design freedom and shows great promise with regard to weight savings, but its applicability has so far been limited to the design of single components or simple structures, owing to the resolution limits of current optimization methods. Here we report a computational morphogenesis tool, implemented on a supercomputer, that produces designs with giga-voxel resolution—more than two orders of magnitude higher than previously reported. Such resolution provides insights into the optimal distribution of material within a structure that were hitherto unachievable owing to the challenges of scaling up existing modelling and optimization frameworks. As an example, we apply the tool to the design of the internal structure of a full-scale aeroplane wing. The optimized full-wing design has unprecedented structural detail at length scales ranging from tens of metres to millimetres and, intriguingly, shows remarkable similarity to naturally occurring bone structures in, for example, bird beaks. We estimate that our optimized design corresponds to a reduction in mass of 2-5 per cent compared to currently used aeroplane wing designs, which translates into a reduction in fuel consumption of about 40-200 tonnes per year per aeroplane. Our morphogenesis process is generally applicable, not only to mechanical design, but also to flow systems, antennas, nano-optics and micro-systems.
Martyna, Agnieszka; Michalska, Aleksandra; Zadora, Grzegorz
2015-05-01
The problem of interpretation of common provenance of the samples within the infrared spectra database of polypropylene samples from car body parts and plastic containers as well as Raman spectra databases of blue solid and metallic automotive paints was under investigation. The research involved statistical tools such as likelihood ratio (LR) approach for expressing the evidential value of observed similarities and differences in the recorded spectra. Since the LR models can be easily proposed for databases described by a few variables, research focused on the problem of spectra dimensionality reduction characterised by more than a thousand variables. The objective of the studies was to combine the chemometric tools easily dealing with multidimensionality with an LR approach. The final variables used for LR models' construction were derived from the discrete wavelet transform (DWT) as a data dimensionality reduction technique supported by methods for variance analysis and corresponded with chemical information, i.e. typical absorption bands for polypropylene and peaks associated with pigments present in the car paints. Univariate and multivariate LR models were proposed, aiming at obtaining more information about the chemical structure of the samples. Their performance was controlled by estimating the levels of false positive and false negative answers and using the empirical cross entropy approach. The results for most of the LR models were satisfactory and enabled solving the stated comparison problems. The results prove that the variables generated from DWT preserve signal characteristic, being a sparse representation of the original signal by keeping its shape and relevant chemical information.
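A minimal sketch of the dimensionality-reduction step described above, using PyWavelets to replace a spectrum of over a thousand points with a much shorter set of wavelet approximation coefficients. The wavelet family, decomposition level, and synthetic spectrum are illustrative assumptions, and the subsequent likelihood ratio modelling is not shown.

```python
import numpy as np
import pywt

rng = np.random.default_rng(1)
wavenumbers = np.linspace(400, 4000, 1800)
# Synthetic IR-like spectrum: a few Gaussian bands plus noise.
spectrum = sum(a * np.exp(-((wavenumbers - c) / w) ** 2)
               for a, c, w in [(1.0, 1376, 15), (0.8, 1460, 20), (0.6, 2950, 40)])
spectrum += rng.normal(0, 0.02, wavenumbers.size)

# Discrete wavelet transform; keep only the coarse approximation coefficients
# as a compact representation that preserves the overall band shapes.
coeffs = pywt.wavedec(spectrum, "db4", level=5)
reduced = coeffs[0]
print(f"original length: {spectrum.size}, reduced length: {reduced.size}")
```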
Giga-voxel computational morphogenesis for structural design.
Aage, Niels; Andreassen, Erik; Lazarov, Boyan S; Sigmund, Ole
2017-10-04
In the design of industrial products ranging from hearing aids to automobiles and aeroplanes, material is distributed so as to maximize the performance and minimize the cost. Historically, human intuition and insight have driven the evolution of mechanical design, recently assisted by computer-aided design approaches. The computer-aided approach known as topology optimization enables unrestricted design freedom and shows great promise with regard to weight savings, but its applicability has so far been limited to the design of single components or simple structures, owing to the resolution limits of current optimization methods. Here we report a computational morphogenesis tool, implemented on a supercomputer, that produces designs with giga-voxel resolution-more than two orders of magnitude higher than previously reported. Such resolution provides insights into the optimal distribution of material within a structure that were hitherto unachievable owing to the challenges of scaling up existing modelling and optimization frameworks. As an example, we apply the tool to the design of the internal structure of a full-scale aeroplane wing. The optimized full-wing design has unprecedented structural detail at length scales ranging from tens of metres to millimetres and, intriguingly, shows remarkable similarity to naturally occurring bone structures in, for example, bird beaks. We estimate that our optimized design corresponds to a reduction in mass of 2-5 per cent compared to currently used aeroplane wing designs, which translates into a reduction in fuel consumption of about 40-200 tonnes per year per aeroplane. Our morphogenesis process is generally applicable, not only to mechanical design, but also to flow systems, antennas, nano-optics and micro-systems.
1998-03-01
benefit estimation techniques used to monetize the value of flood hazard reduction in the City of Roanoke. Each method was then used to estimate...behavior. This framework justifies interpreting people’s choices to infer and then monetize their preferences. If individuals have well-ordered and...Journal of Agricultural Economics. 68 (1986) 2: 280-290. Soule, Don M. and Claude M. Vaughn, "Flood Protection Benefits as Reflected in Property
Managing and Transforming Waste Streams – A Tool for Communities
The Managing and Transforming Waste Streams Tool features 100 policy and program options communities can pursue to increase rates of recycling, composting, waste reduction, and materials reuse across waste stream generators.
Modern CACSD using the Robust-Control Toolbox
NASA Technical Reports Server (NTRS)
Chiang, Richard Y.; Safonov, Michael G.
1989-01-01
The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools such as singular values and structured singular values; robust synthesis tools such as continuous/discrete H(exp 2)/H infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods; and a variety of robust model reduction tools such as Hankel approximation, balanced truncation and balanced stochastic truncation. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H infinity loop-shaping and large space structure model reduction.
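As an illustration of one of the model-reduction methods listed (balanced truncation), here is a small, library-agnostic Python sketch using SciPy. It is not a port of the toolbox M-files; the square-root algorithm shown is a standard textbook formulation, and the example system is arbitrary.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Square-root balanced truncation of a stable LTI system to order r."""
    Wc = solve_continuous_lyapunov(A, -B @ B.T)      # controllability Gramian
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)    # observability Gramian
    Lc = cholesky(Wc, lower=True)
    Lo = cholesky(Wo, lower=True)
    U, s, Vt = svd(Lo.T @ Lc)                        # s holds the Hankel singular values
    T = Lc @ Vt.T[:, :r] / np.sqrt(s[:r])            # balancing/truncating transformation
    Ti = (U[:, :r] / np.sqrt(s[:r])).T @ Lo.T        # its left inverse
    return Ti @ A @ T, Ti @ B, C @ T, s

# Stable 4th-order example reduced to 2 states.
A = np.diag([-1.0, -2.0, -10.0, -20.0]); A[0, 1] = 0.5
B = np.array([[1.0], [1.0], [0.1], [0.1]])
C = np.array([[1.0, 0.5, 0.1, 0.1]])
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
print("Hankel singular values:", np.round(hsv, 4))
```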
McCormack, M. Luke; Dickie, Ian A.; Eissenstat, David M.; ...
2015-03-10
Fine roots acquire essential soil resources and mediate biogeochemical cycling in terrestrial ecosystems. Estimates of carbon and nutrient allocation to build and maintain these structures remain uncertain due to challenges in consistent measurement and interpretation of fine-root systems. We define fine roots as all roots less than or equal to 2 mm in diameter, yet it is now recognized that this approach fails to capture the diversity of form and function observed among fine-root orders. We demonstrate how order-based and functional classification frameworks improve our understanding of dynamic root processes in ecosystems dominated by perennial plants. In these frameworks, fine roots are separated into either individual root orders or functionally defined into a shorter-lived absorptive pool and a longer-lived transport fine-root pool. Furthermore, using these frameworks, we estimate that fine-root production and turnover represent 22% of terrestrial net primary production globally, a ca. 30% reduction from previous estimates assuming a single fine-root pool. In the future, the development of tools to rapidly differentiate functional fine-root classes, explicit incorporation of mycorrhizal fungi in fine-root studies, and wider adoption of a two-pool approach to model fine roots provide opportunities to better understand belowground processes in the terrestrial biosphere.
Consultant management estimating tool : users' manual.
DOT National Transportation Integrated Search
2012-04-01
The Switchboard is the opening form displayed to users. Use the Switchboard to access the main functions of the estimating tool. Double-click on a box to select the desired function. From the Switchboard a user can initiate a search for project...
PyCoTools: A Python Toolbox for COPASI.
Welsh, Ciaran M; Fullard, Nicola; Proctor, Carole J; Martinez-Guimera, Alvaro; Isfort, Robert J; Bascom, Charles C; Tasseff, Ryan; Przyborski, Stefan A; Shanley, Daryl P
2018-05-22
COPASI is an open source software package for constructing, simulating and analysing dynamic models of biochemical networks. COPASI is primarily intended to be used with a graphical user interface, but often it is desirable to be able to access COPASI features programmatically, with a high-level interface. PyCoTools is a Python package aimed at providing a high-level interface to COPASI tasks with an emphasis on model calibration. PyCoTools enables the construction of COPASI models and the execution of a subset of COPASI tasks including time courses, parameter scans and parameter estimations. Additional 'composite' tasks which use COPASI tasks as building blocks are available for increasing parameter estimation throughput, performing identifiability analysis and performing model selection. PyCoTools supports exploratory data analysis on parameter estimation data to assist with troubleshooting model calibrations. We demonstrate PyCoTools by posing a model selection problem designed to showcase PyCoTools within a realistic scenario. The aim of the model selection problem is to test the feasibility of three alternative hypotheses in explaining experimental data derived from neonatal dermal fibroblasts in response to TGF-β over time. PyCoTools is used to critically analyse the parameter estimations and propose strategies for model improvement. PyCoTools can be downloaded from the Python Package Index (PyPI) using the command 'pip install pycotools' or directly from GitHub (https://github.com/CiaranWelsh/pycotools). Documentation at http://pycotools.readthedocs.io. Supplementary data are available at Bioinformatics.
Urban and Transport Planning Related Exposures and Mortality: A Health Impact Assessment for Cities
Mueller, Natalie; Rojas-Rueda, David; Basagaña, Xavier; Cirach, Marta; Cole-Hunter, Tom; Dadvand, Payam; Donaire-Gonzalez, David; Foraster, Maria; Gascon, Mireia; Martinez, David; Tonne, Cathryn; Triguero-Mas, Margarita; Valentín, Antònia; Nieuwenhuijsen, Mark
2016-01-01
Background: By 2050, nearly 70% of the global population is projected to live in urban areas. Because the environments we inhabit affect our health, urban and transport designs that promote healthy living are needed. Objective: We estimated the number of premature deaths preventable under compliance with international exposure recommendations for physical activity (PA), air pollution, noise, heat, and access to green spaces. Methods: We developed and applied the Urban and TranspOrt Planning Health Impact Assessment (UTOPHIA) tool to Barcelona, Spain. Exposure estimates and mortality data were available for 1,357,361 residents. We compared recommended with current exposure levels. We quantified the associations between exposures and mortality and calculated population attributable fractions to estimate the number of premature deaths preventable. We also modeled life-expectancy and economic impacts. Results: We estimated that annually, nearly 20% of mortality could be prevented if international recommendations for performance of PA; exposure to air pollution, noise, and heat; and access to green space were followed. Estimations showed that the greatest portion of preventable deaths was attributable to increases in PA, followed by reductions of exposure to air pollution, traffic noise, and heat. Access to green spaces had smaller effects on mortality. Compliance was estimated to increase the average life expectancy by 360 (95% CI: 219, 493) days and result in economic savings of 9.3 (95% CI: 4.9, 13.2) billion EUR/year. Conclusions: PA factors and environmental exposures can be modified by changes in urban and transport planning. We emphasize the need for a) the reduction of motorized traffic through the promotion of active and public transport and b) the provision of green infrastructure, both of which are suggested to provide opportunities for PA and for mitigation of air pollution, noise, and heat. Citation: Mueller N, Rojas-Rueda D, Basagaña X, Cirach M, Cole-Hunter T, Dadvand P, Donaire-Gonzalez D, Foraster M, Gascon M, Martinez D, Tonne C, Triguero-Mas M, Valentín A, Nieuwenhuijsen M. 2017. Urban and transport planning related exposures and mortality: a health impact assessment for cities. Environ Health Perspect 125:89–96; http://dx.doi.org/10.1289/EHP220 PMID:27346385
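The core calculation behind the preventable-deaths estimates is a population attributable fraction (PAF). The sketch below shows the standard PAF formula for a single binary exposure; the relative risk, exposure prevalence, and death count are made-up illustrative numbers, not the UTOPHIA inputs, and the study combined several exposures rather than one.

```python
def attributable_deaths(prevalence, relative_risk, total_deaths):
    """Deaths attributable to an exposure via the population attributable fraction.

    PAF = p*(RR - 1) / (p*(RR - 1) + 1) for a binary exposure with prevalence p.
    """
    paf = prevalence * (relative_risk - 1) / (prevalence * (relative_risk - 1) + 1)
    return paf, paf * total_deaths

# Illustrative only: 60% of residents exposed above a guideline, RR = 1.07, 10,000 annual deaths.
paf, deaths = attributable_deaths(prevalence=0.60, relative_risk=1.07, total_deaths=10_000)
print(f"PAF = {paf:.3f}, preventable deaths ~ {deaths:.0f}")
```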
Spreadsheet Assessment Tool v. 2.4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, David J.; Martinez, Ruben
2016-03-03
The Spreadsheet Assessment Tool (SAT) is an easy-to-use blast assessment tool that is intended to estimate the potential risk due to an explosive attack on a blood irradiator. The estimation of risk is based on the methodology, assumptions, and results of a detailed blast effects assessment study that is summarized in Sandia National Laboratories Technical Report SAND2015-6166. Risk as defined in the report and as used in the SAT is: "The potential risk of creating an air blast-induced vent opening at a building's envelope surface". Vent openings can be created at a building's envelope through the failure of an exterior building component (such as a wall, window, or door) due to explosive sabotage of an irradiator within the building. To estimate risk, the tool requires that users obtain and input information pertaining to the building's characteristics and the irradiator location. The tool also suggests several prescriptive mitigation strategies that can be considered to reduce risk. Given the variability in civilian building construction practices, the input parameters used by this tool may not apply to all buildings being assessed. The tool should not be used as a substitute for engineering judgment. The tool is intended for assessment purposes only.
Setton, Eleanor M; Veerman, Basil; Erickson, Anders; Deschenes, Steeve; Cheasley, Roz; Poplawski, Karla; Demers, Paul A; Keller, C Peter
2015-08-22
Emissions inventories aid in understanding the sources of hazardous air pollutants and how these vary regionally, supporting targeted reduction actions. Integrating information on the relative toxicity of emitted pollutants with respect to cancer in humans helps to further refine reduction actions or recommendations, but few national programs exist in North America that use emissions estimates in this way. The CAREX Canada Emissions Mapping Project provides key regional indicators of emissions (total annual and total annual toxic equivalent, circa 2011) of 21 selected known and suspected carcinogens. The indicators were calculated from industrial emissions reported to the National Pollutant Release Inventory (NPRI) and estimates of emissions from transportation (airports, trains, and car and truck traffic) and residential heating (oil, gas and wood), in conjunction with human toxicity potential factors. We also include substance-specific annual emissions in toxic equivalent kilograms and annual emissions in kilograms, to allow for ranking substances within any region. For provinces and territories in Canada, the indicators suggest the top five substances contributing to the total toxic equivalent emissions in any region could be prioritized for further investigation. Residents of Quebec and New Brunswick may be more at risk of exposure to industrial emissions than those in other regions, suggesting that a more detailed study of exposure to industrial emissions in these provinces is warranted. Residential wood smoke may be an important emission to control, particularly in the north and eastern regions of Canada. Residential oil and gas heating, along with rail emissions contribute little to regional emissions and therefore may not be an immediate regional priority. The developed indicators support the identification of pollutants and sources for additional investigation when planning exposure reduction actions among Canadian provinces and territories, but have important limitations similar to other emissions inventory-based tools. Additional research is required to evaluate how the Emissions Mapping Project is used by different groups and organizations with respect to informing actions aimed at reducing Canadians' potential exposure to harmful air pollutants.
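The indicator described above weights each substance's annual emissions by a human toxicity potential factor and then ranks substances within a region. A minimal sketch of that bookkeeping, with entirely fictitious emissions and toxicity weights (the CAREX factors and substances are not reproduced here):

```python
# kg emitted per year by region and substance (fictitious numbers for illustration).
emissions_kg = {
    ("Region A", "benzene"): 12_000,
    ("Region A", "formaldehyde"): 3_000,
    ("Region B", "benzene"): 800,
    ("Region B", "cadmium"): 40,
}
# Relative human toxicity potential per kg (fictitious weights).
toxicity_weight = {"benzene": 1.0, "formaldehyde": 0.5, "cadmium": 150.0}

# Toxic-equivalent emissions = emitted mass x toxicity weight.
toxic_equivalent = {key: kg * toxicity_weight[key[1]] for key, kg in emissions_kg.items()}

# Rank substances within each region by toxic-equivalent emissions.
for region in {r for r, _ in toxic_equivalent}:
    ranked = sorted(((teq, s) for (r, s), teq in toxic_equivalent.items() if r == region),
                    reverse=True)
    print(region, [(s, round(teq)) for teq, s in ranked])
```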
Tensor integrand reduction via Laurent expansion
Hirschi, Valentin; Peraro, Tiziano
2016-06-09
We introduce a new method for the application of one-loop integrand reduction via the Laurent expansion algorithm, as implemented in the public C++ library Ninja. We show how the coefficients of the Laurent expansion can be computed by suitable contractions of the loop numerator tensor with cut-dependent projectors, making it possible to interface Ninja to any one-loop matrix element generator that can provide the components of this tensor. We implemented this technique in the Ninja library and interfaced it to MadLoop, which is part of the public MadGraph5_aMC@NLO framework. We performed a detailed performance study, comparing against other public reductionmore » tools, namely CutTools, Samurai, IREGI, PJFry++ and Golem95. We find that Ninja out-performs traditional integrand reduction in both speed and numerical stability, the latter being on par with that of the tensor integral reduction tool Golem95 which is however more limited and slower than Ninja. Lastly, we considered many benchmark multi-scale processes of increasing complexity, involving QCD and electro-weak corrections as well as effective non-renormalizable couplings, showing that Ninja’s performance scales well with both the rank and multiplicity of the considered process.« less
Machinability of titanium metal matrix composites (Ti-MMCs)
NASA Astrophysics Data System (ADS)
Aramesh, Maryam
Titanium metal matrix composites (Ti-MMCs), as a new generation of materials, have various potential applications in aerospace and automotive industries. The presence of ceramic particles enhances the physical and mechanical properties of the alloy matrix. However, the hard and abrasive nature of these particles causes various issues in the field of their machinability. Severe tool wear and short tool life are the most important drawbacks of machining this class of materials. There is very limited work in the literature regarding the machinability of this class of materials especially in the area of tool life estimation and tool wear. By far, polycrystalline diamond (PCD) tools appear to be the best choice for machining MMCs from researchers' point of view. However, due to their high cost, economical alternatives are sought. Cubic boron nitride (CBN) inserts, as the second hardest available tools, show superior characteristics such as great wear resistance, high hardness at elevated temperatures, a low coefficient of friction and a high melting point. Yet, so far CBN tools have not been studied during machining of Ti-MMCs. In this study, a comprehensive study has been performed to explore the tool wear mechanisms of CBN inserts during turning of Ti-MMCs. The unique morphology of the worn faces of the tools was investigated for the first time, which led to new insights in the identification of chemical wear mechanisms during machining of Ti-MMCs. Utilizing the full tool life capacity of cutting tools is also very crucial, due to the considerable costs associated with suboptimal replacement of tools. This strongly motivates development of a reliable model for tool life estimation under any cutting conditions. In this study, a novel model based on the survival analysis methodology is developed to estimate the progressive states of tool wear under any cutting conditions during machining of Ti-MMCs. This statistical model takes into account the machining time in addition to the effect of cutting parameters. Thus, promising results were obtained which showed a very good agreement with the experimental results. Moreover, a more advanced model was constructed, by adding the tool wear as another variable to the previous model. Therefore, a new model was proposed for estimating the remaining life of worn inserts under different cutting conditions, using the current tool wear data as an input. The results of this model were validated with the experimental results. The estimated results were well consistent with the results obtained from the experiments.
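The abstract's survival-analysis model is not specified in detail; as a simplified stand-in, the sketch below fits a Weibull distribution to tool-life data and uses the conditional reliability R(t+Δ)/R(t) to gauge the remaining life of a partially used insert. The data and the choice of a plain Weibull fit (rather than a regression on cutting parameters and wear level) are illustrative assumptions.

```python
import numpy as np
from scipy.stats import weibull_min

# Fictitious tool lives (minutes of cutting) observed at one cutting condition.
lives = np.array([11.2, 13.5, 9.8, 15.1, 12.3, 10.7, 14.0, 12.9])

# Fit a two-parameter Weibull (location fixed at zero).
shape, loc, scale = weibull_min.fit(lives, floc=0)

def prob_surviving_additional(t_used, dt, shape=shape, scale=scale):
    """P(tool survives t_used + dt | it has already survived t_used)."""
    return weibull_min.sf(t_used + dt, shape, scale=scale) / weibull_min.sf(t_used, shape, scale=scale)

print(f"shape={shape:.2f}, scale={scale:.1f} min")
print("P(2 more minutes after 10 min of use) =", round(prob_surviving_additional(10, 2), 3))
```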
This is the first phase of a potentially multi-phase project aimed at identifying scientific methodologies that will lead to the development of innnovative analytical tools supporting the analysis of control strategy effectiveness, namely. accountabilty. Significant reductions i...
NASA Astrophysics Data System (ADS)
Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.
2014-04-01
Myocardial blood flow (MBF) can be estimated from dynamic contrast enhanced (DCE) cardiac CT acquisitions, leading to quantitative assessment of regional perfusion. The need for low radiation dose and the lack of consensus on MBF estimation methods motivate this study to refine the selection of acquisition protocols and models for CT-derived MBF. DCE cardiac CT acquisitions were simulated for a range of flow states (MBF = 0.5, 1, 2, 3 ml (min g)-1; cardiac output = 3, 5, 8 L min-1). Patient kinetics were generated by a mathematical model of iodine exchange incorporating numerous physiological features including heterogeneous microvascular flow, permeability and capillary contrast gradients. CT acquisitions were simulated for multiple realizations of realistic x-ray flux levels. CT acquisitions that reduce radiation exposure were implemented by varying both temporal sampling (1, 2, and 3 s sampling intervals) and tube currents (140, 70, and 25 mAs). For all acquisitions, we compared three quantitative MBF estimation methods (two-compartment model, an axially-distributed model, and the adiabatic approximation to the tissue homogeneity model) and a qualitative slope-based method. In total, over 11 000 time attenuation curves were used to evaluate MBF estimation in multiple patient and imaging scenarios. After iodine-based beam hardening correction, the slope method consistently underestimated flow by on average 47.5%, and the quantitative models provided estimates with less than 6.5% average bias and increasing variance with increasing dose reductions. The three quantitative models performed equally well, offering estimates with essentially identical root mean squared error (RMSE) for matched acquisitions. MBF estimates using the qualitative slope method were inferior in terms of bias and RMSE compared to the quantitative methods. MBF estimate error was equal at matched dose reductions for all quantitative methods across the range of techniques evaluated. This suggests that there is no particular advantage among the quantitative estimation methods, nor any advantage to performing dose reduction via tube current reduction compared to temporal sampling reduction. These data are important for optimizing implementation of cardiac dynamic CT in clinical practice and in prospective CT MBF trials.
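For context on the qualitative method compared above, the sketch below implements the textbook upslope estimate, roughly the maximum rate of tissue enhancement divided by the peak arterial enhancement, on synthetic curves. This is a generic formulation rather than the study's implementation, and the curves, scaling, and units are invented for illustration.

```python
import numpy as np

t = np.arange(0, 40, 1.0)                                      # seconds
aif = 400 * np.exp(-0.5 * ((t - 12) / 3.0) ** 2)               # synthetic arterial input function (HU)
tissue = 60 * (1 - np.exp(-np.clip(t - 14, 0, None) / 8.0))    # synthetic myocardial enhancement (HU)

# Upslope (qualitative) estimate: max tissue upslope / peak arterial enhancement.
upslope = np.max(np.gradient(tissue, t))                       # HU per second
flow_index = upslope / np.max(aif)                             # per second, per unit blood enhancement
flow_index_per_min = flow_index * 60                           # rescale to a per-minute index
print(f"upslope-based flow index ~ {flow_index_per_min:.2f} (illustrative units)")
```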
NASA Astrophysics Data System (ADS)
Keen, A. S.; Lynett, P. J.; Ayca, A.
2016-12-01
Because of the damage resulting from the 2010 Chile and 2011 Japanese tele-tsunamis, the tsunami risk to the small craft marinas in California has become an important concern. The talk will outline an assessment tool that can be used to assess the tsunami hazard to small craft harbors. The methodology is based on the demand and structural capacity of the floating dock system, composed of floating docks/fingers and moored vessels. The structural demand is determined using a Monte Carlo methodology. Monte Carlo methodology is a probabilistic computational tool in which the governing equations might be well known, but the independent variables of the input (demand) as well as the resisting structural components (capacity) may not be completely known. The Monte Carlo approach uses a distribution of each variable, and then uses that random variable within the described parameters to generate a single computation. The process then repeats hundreds or thousands of times. The numerical model "Method of Splitting Tsunamis" (MOST) has been used to determine the inputs for the small craft harbors within California. Hydrodynamic model results of current speed, direction and surface elevation were incorporated via the drag equations to provide the basis of the demand term. To determine the capacities, an inspection program was developed to identify common features of structural components. A total of six harbors have been inspected, ranging from Crescent City in Northern California to Oceanside Harbor in Southern California. Results from the inspection program were used to develop component capacity tables which incorporated the basic specifications of each component (e.g. bolt size and configuration) and a reduction factor (which accounts for the reduction in component capacity with age) to estimate in situ capacities. Like the demand term, these capacities are incorporated probabilistically into the model. To date the model has been applied to Santa Cruz Harbor as well as Noyo River. Once calibrated, the model was able to hindcast the damage produced in Santa Cruz Harbor during the 2010 Chile and 2011 Japan events. Results of the Santa Cruz analysis will be presented and discussed.
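A minimal sketch of the Monte Carlo demand-versus-capacity idea described above: current speed is sampled, converted to a drag-type load on a dock connection, and compared with a sampled component capacity to estimate a failure probability. All distributions, the drag coefficient, and the projected area are invented for illustration and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
rho = 1025.0                                             # seawater density, kg/m^3

# Demand: drag load F = 0.5 * rho * Cd * A * v^2 with uncertain current speed and geometry.
v = rng.lognormal(mean=np.log(1.5), sigma=0.4, size=n)   # current speed, m/s
Cd = rng.normal(1.0, 0.1, size=n)                        # drag coefficient
A = rng.normal(8.0, 1.0, size=n)                         # projected area of vessel + dock, m^2
demand = 0.5 * rho * Cd * A * v**2                       # Newtons

# Capacity: aged connection strength (illustrative lognormal distribution).
capacity = rng.lognormal(mean=np.log(25_000), sigma=0.3, size=n)

p_fail = np.mean(demand > capacity)
print(f"estimated failure probability per event: {p_fail:.3f}")
```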
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-01-01
Clean Cities Alternative Fuels and Advanced Vehicles Data Center (AFDC) features a wide range of Web-based tools to help vehicle fleets and individual consumers reduce their petroleum use. This brochure lists and describes Clean Cities online tools related to vehicles, alternative fueling stations, electric vehicle charging stations, fuel conservation, emissions reduction, fuel economy, and more.
Annualized earthquake loss estimates for California and their sensitivity to site amplification
Chen, Rui; Jaiswal, Kishor; Bausch, D; Seligson, H; Wills, C.J.
2016-01-01
Input datasets for annualized earthquake loss (AEL) estimation for California were updated recently by the scientific community, and include the National Seismic Hazard Model (NSHM), site‐response model, and estimates of shear‐wave velocity. Additionally, the Federal Emergency Management Agency’s loss estimation tool, Hazus, was updated to include the most recent census and economic exposure data. These enhancements necessitated a revisit to our previous AEL estimates and a study of the sensitivity of AEL estimates subjected to alternate inputs for site amplification. The NSHM ground motions for a uniform site condition are modified to account for the effect of local near‐surface geology. The site conditions are approximated in three ways: (1) by VS30 (time‐averaged shear‐wave velocity in the upper 30 m) value obtained from a geology‐ and topography‐based map consisting of 15 VS30 groups, (2) by site classes categorized according to National Earthquake Hazards Reduction Program (NEHRP) site classification, and (3) by a uniform NEHRP site class D. In case 1, ground motions are amplified using the Seyhan and Stewart (2014) semiempirical nonlinear amplification model. In cases 2 and 3, ground motions are amplified using the 2014 version of the NEHRP site amplification factors, which are also based on the Seyhan and Stewart model but are approximated to facilitate their use for building code applications. Estimated AELs are presented at multiple resolutions, starting with the state level assessment and followed by detailed assessments for counties, metropolitan statistical areas (MSAs), and cities. AEL estimate at the state level is ∼$3.7 billion, 70% of which is contributed from Los Angeles–Long Beach–Santa Ana, San Francisco–Oakland–Fremont, and Riverside–San Bernardino–Ontario MSAs. The statewide AEL estimate is insensitive to alternate assumptions of site amplification. However, we note significant differences in AEL estimates among the three sensitivity cases for smaller geographic units.
Reduced order modeling and active flow control of an inlet duct
NASA Astrophysics Data System (ADS)
Ge, Xiaoqing
Many aerodynamic applications require the modeling of compressible flows in or around a body, e.g., the design of aircraft, inlet or exhaust duct, wind turbines, or tall buildings. Traditional methods use wind tunnel experiments and computational fluid dynamics (CFD) to investigate the spatial and temporal distribution of the flows. Although they provide a great deal of insight into the essential characteristics of the flow field, they are not suitable for control analysis and design due to the high physical/computational cost. Many model reduction methods have been studied to reduce the complexity of the flow model. There are two main approaches: linearization based input/output modeling and proper orthogonal decomposition (POD) based model reduction. The former captures mostly the local behavior near a steady state, which is suitable to model laminar flow dynamics. The latter obtains a reduced order model by projecting the governing equation onto an "optimal" subspace and is able to model complex nonlinear flow phenomena. In this research we investigate various model reduction approaches and compare them in flow modeling and control design. We propose an integrated model-based control methodology and apply it to the reduced order modeling and active flow control of compressible flows within a very aggressive (length to exit diameter ratio, L/D, of 1.5) inlet duct and its upstream contraction section. The approach systematically applies reduced order modeling, estimator design, sensor placement and control design to improve the aerodynamic performance. The main contribution of this work is the development of a hybrid model reduction approach that attempts to combine the best features of input/output model identification and POD method. We first identify a linear input/output model by using a subspace algorithm. We next project the difference between CFD response and the identified model response onto a set of POD basis. This trajectory is fit to a nonlinear dynamical model to augment the linear input/output model. Thus, the full system is decomposed into a dominant linear subsystem and a low order nonlinear subsystem. The hybrid model is then used for control design and compared with other modeling methods in CFD simulations. Numerical results indicate that the hybrid model accurately predicts the nonlinear behavior of the flow for a 2D diffuser contraction section model. It also performs best in terms of feedback control design and learning control. Since some outputs of interest (e.g., the AIP pressure recovery) are not observable during normal operations, static and dynamic estimators are designed to recreate the information from available sensor measurements. The latter also provides a state estimation for feedback controller. Based on the reduced order models and estimators, different controllers are designed to improve the aerodynamic performance of the contraction section and inlet duct. The integrated control methodology is evaluated with CFD simulations. Numerical results demonstrate the feasibility and efficacy of the active flow control based on reduced order models. Our reduced order models not only generate a good approximation of the nonlinear flow dynamics over a wide input range, but also help to design controllers that significantly improve the flow response. The tools developed for model reduction, estimator and control design can also be applied to wind tunnel experiment.
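The POD step described above is, at its core, a singular value decomposition of a snapshot matrix. The sketch below builds a toy snapshot set, extracts the leading spatial modes, and projects the fluctuating field onto them; the synthetic snapshots and the retained mode count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
t = np.linspace(0, 1, 60)
# Toy snapshot matrix: two coherent structures plus noise, one column per time step.
snapshots = (np.outer(np.sin(2 * np.pi * x), np.cos(4 * np.pi * t))
             + 0.3 * np.outer(np.sin(6 * np.pi * x), np.sin(10 * np.pi * t))
             + 0.01 * rng.normal(size=(200, 60)))

mean_flow = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean_flow, full_matrices=False)

r = 2                                        # number of POD modes to retain
modes = U[:, :r]                             # spatial modes
coeffs = modes.T @ (snapshots - mean_flow)   # temporal coefficients of the reduced model
energy = (s[:r] ** 2).sum() / (s ** 2).sum()
print(f"{r} modes capture {100 * energy:.1f}% of the fluctuation energy")
```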
Cost-benefit analysis of using sewage sludge as alternative fuel in a cement plant: a case study.
Nadal, Martí; Schuhmacher, Marta; Domingo, José L
2009-05-01
To enforce the implementation of the Kyoto Protocol targets, a number of governmental/international institutions have launched emission trade schemes as an approach to specify CO(2) caps and to regulate the emission trade in recent years. These schemes have been basically applied for large industrial sectors, including energy producers and energy-intensive users. Among them, cement plants are included among the big greenhouse gas (GHG) emitters. The use of waste as secondary fuel in clinker kilns is currently an intensive practice worldwide. However, people living in the vicinity of cement plants, where alternative fuels are being used, are frequently concerned about the potential increase in health risks. In the present study, a cost-benefit analysis was applied after substituting classical fuel for sewage sludge as an alternative fuel in a clinker kiln in Catalonia, Spain. The economical benefits resulting in the reduction of CO(2) emissions were compared with the changes in human health risks due to exposure to polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) and carcinogenic metals (As, Cd, Co, and Cr) before and after using sewage sludge to generate 20% of the thermal energy needed for pyro-processing. The exposure to PCDD/Fs and metals through air inhalation, soil ingestion and dermal absorption was calculated according to the environmental levels in soil. The carcinogenic risks were assessed, and the associated cost for the population was estimated by considering the DG Environment's recommended value for preventing a statistical fatality (VPF). In turn, the amount of CO(2) emitted was calculated, and the economical saving, according to the market prices, was evaluated. The use of sewage sludge as a substitute of conventional energy meant a probability cancer decrease of 4.60 for metals and a cancer risk increase of 0.04 for PCDD/Fs. Overall, a net reduction of 4.56 cancers for one million people can be estimated. The associated economical evaluation due to the decreasing cancer for 60,000 people, the current population living near the cement plant, would be of 0.56 million euros (US$ 0.83 million). In turn, a reduction of 144,000 tons of CO(2) emitted between 2003 and 2006 was estimated. Considering a cost of 20 euros per ton of CO(2), the global saving would be 2.88 million euros (US$ 4.26 million). After the partial substitution of the fuel, the current environmental exposure to metals and PCDD/Fs would even mean a potential decrease of health risks for the individuals living in the vicinity of the cement plant. The total benefit of using sewage sludge as an alternative fuel was calculated in 3.44 million euros (US$ 5.09 million). Environmental economics is becoming an interesting research field to convert environmental benefits (i.e., reduction of health risks, emission of pollutants, etc.) into economical value. The results show, that while the use of sewage sludge as secondary fuel is beneficial for the reduction in GHG emissions, no additional health risks for the population derived from PCDD/F and metal emissions are estimated. Cost-benefit analysis seems to be a suitable tool to estimate the environmental damage and benefit associated to industrial processes. Therefore, this should become a generalized practice, mainly for those more impacting sectors such as power industries. 
On the other hand, the scope of the study could be broadened considerably by taking into account other potentially emitted GHGs, such as CH(4) and N(2)O, as well as other carcinogenic and non-carcinogenic micropollutants.
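The economic totals quoted in the abstract follow from simple arithmetic on the figures given there; a quick check:

```python
# Figures taken from the abstract above.
co2_avoided_t = 144_000          # tonnes of CO2 avoided, 2003-2006
co2_price_eur = 20               # euros per tonne of CO2
health_benefit_meur = 0.56       # million euros from reduced cancer risk (VPF-based)

co2_benefit_meur = co2_avoided_t * co2_price_eur / 1e6
total_meur = co2_benefit_meur + health_benefit_meur
print(f"CO2 benefit: {co2_benefit_meur:.2f} M EUR, total: {total_meur:.2f} M EUR")
# -> CO2 benefit: 2.88 M EUR, total: 3.44 M EUR, matching the reported values.
```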
Transit Boardings Estimation and Simulation Tool (TBEST) calibration for guideway and BRT modes.
DOT National Transportation Integrated Search
2013-06-01
This research initiative was motivated by a desire of the Florida Department of Transportation and the : Transit Boardings Estimation and Simulation Tool (TBEST) project team to enhance the value of TBEST to : the planning community by improving its ...
Overview of T.E.S.T. (Toxicity Estimation Software Tool)
This talk provides an overview of T.E.S.T. (Toxicity Estimation Software Tool). T.E.S.T. predicts toxicity values and physical properties using a variety of different QSAR (quantitative structure activity relationship) approaches including hierarchical clustering, group contribut...
How freight moves : estimating mileage and routes using an innovative GIS tool
DOT National Transportation Integrated Search
2007-06-01
The Bureau of Transportation Statistics (BTS) has developed an innovative software tool, called GeoMiler, that is helping researchers better estimate freight travel. GeoMiler is being used to compute mileages along likely routes for the nearly 6 mill...
Guimarães, Juliana L B; Brito, Maria A V P; Lange, Carla C; Silva, Márcio R; Ribeiro, João B; Mendonça, Letícia C; Mendonça, Juliana F M; Souza, Guilherme N
2017-07-01
The aim of this study was to estimate the economic impact of mastitis at the herd level and the weight (percent) of the components of this impact in a Holstein dairy herd under tropical conditions. Three estimates of the economic impact of mastitis were performed. In estimates 1 and 2 the real production and economic indices from February 2011 to January 2012 were considered. In the estimate 1, indices for mastitis classified as ideal were considered, whereas in the estimate 2, the mastitis indices used were those recorded at the farm and at Holstein Cattle Association of Minas Gerais State database (real indices). Ideal mastitis indices were bulk milk somatic cell counts less than 250,000 cells/mL, incidence of clinical mastitis less than 25 cases/100 cows/year, number of culls due to udder health problems less than 5% and the percentage of cows with somatic cell counts greater than 200,000 cells/mL less than 20%. Considering the ideal indices of mastitis, the economic impact was US$19,132.35. The three main components of the economic impact were culling cows (39.4%) and the reduction in milk production due to subclinical and clinical mastitis (32.3% and 18.2%, respectively). Estimate 2 using real mastitis indices showed an economic impact of US$61,623.13 and the reduction in milk production due to mastitis (77.7%) and milk disposal (14.0%) were the most relevant components. The real impact of culling cows was approximately 16 times less than the weight that was considered ideal, indicating that this procedure could have been more frequently adopted. The reduction in milk production was 27.2% higher than the reduction in Estimate 1, indicating a need to control and prevent mastitis. The estimate 3 considered the same indices as estimate 2, but for the period from February 2012 to January 2013. Its economic impact was US$91,552.69. During this period, 161 treatments of cows with an intramammary antibiotic were performed to eliminate Streptococcus agalactiae, and eight cows chronically infected with Staphylococcus aureus were culled. The reduction in milk production due to mastitis was the main component of the economic impact (54.9%). The culling of cows with chronic infection was associated with an increase in the economic impact of mastitis and a reduction in the average productivity per cow. At the herd level reduction in milk production was the component that presented the largest weight in the economic impact of the disease. Copyright © 2017 Elsevier B.V. All rights reserved.
Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.
Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P
2010-12-22
Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
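The abstract describes estimating each job's cloud runtime from genome size and complexity and then choosing a submission order that keeps the cluster busy. The sketch below uses a toy runtime predictor and the classic longest-processing-time-first heuristic to balance jobs across workers; the predictor, the heuristic, and all numbers are illustrative assumptions rather than the paper's actual model.

```python
import heapq
import itertools

def predicted_runtime(genome_a_mb, genome_b_mb, k=0.02):
    """Toy runtime model: proportional to the product of genome sizes (hours)."""
    return k * genome_a_mb * genome_b_mb

# Fictitious genome-to-genome comparison jobs (sizes in megabases).
genomes = {"g1": 3.0, "g2": 12.0, "g3": 40.0, "g4": 5.0, "g5": 25.0}
jobs = [(predicted_runtime(genomes[a], genomes[b]), f"{a}-{b}")
        for a, b in itertools.combinations(genomes, 2)]

# Longest-processing-time-first assignment to a fixed pool of workers.
workers = [(0.0, i, []) for i in range(4)]     # (load in hours, worker id, assigned jobs)
heapq.heapify(workers)
for runtime, name in sorted(jobs, reverse=True):
    load, wid, assigned = heapq.heappop(workers)
    heapq.heappush(workers, (load + runtime, wid, assigned + [name]))

for load, wid, assigned in sorted(workers, key=lambda w: w[1]):
    print(f"worker {wid}: {load:.2f} h -> {assigned}")
```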
NASA Astrophysics Data System (ADS)
Falinski, K. A.; Oleson, K.; Htun, H.; Kappel, C.; Lecky, J.; Rowe, C.; Selkoe, K.; White, C.
2016-12-01
Faced with anthropogenic stressors and declining coral reef states, managers concerned with restoration and resilience of coral reefs are increasingly recognizing the need to take a ridge-to-reef, ecosystem-based approach. An ecosystem services framing can help managers move towards these goals, helping to illustrate trade-offs and opportunities of management actions in terms of their impacts on society. We describe a research program that is building a spatial ecosystem services-based decision-support tool and applying it to guide ridge-to-reef management in a NOAA priority site in West Maui. We use multiple modeling methods to link biophysical processes to ecosystem services and their spatial flows and social values in an integrating platform. Modeled services include water availability, sediment retention, nutrient retention and carbon sequestration on land. A coral reef ecosystem service model is under development to capture the linkages between terrestrial and coastal ecosystem services. Valuation studies are underway to quantify the implications for human well-being. The tool integrates techniques from decision science to facilitate decision making. We use the sediment retention model to illustrate the types of analyses the tool can support. The case study explores the trade-offs between road rehabilitation costs and the sediment export avoided. We couple the sediment and cost models with trade-off analysis to identify optimal distributed solutions that are most cost-effective in reducing erosion, and then use those models to estimate sediment exposure to coral reefs. We find that cooperation between landowners reveals opportunities for maximizing the benefits of fixing roads while minimizing costs. This research forms the building blocks of an ecosystem service decision support tool that we intend to continue to test and apply in other Pacific Island settings.
Land-Use Portfolio Modeler, Version 1.0
Taketa, Richard; Hong, Makiko
2010-01-01
Natural hazards pose significant threats to the public safety and economic health of many communities throughout the world. Community leaders and decision-makers continually face the challenges of planning and allocating limited resources to invest in protecting their communities against catastrophic losses from natural-hazard events. Public efforts to assess community vulnerability and encourage loss-reduction measures through mitigation often focused on either aggregating site-specific estimates or adopting standards based upon broad assumptions about regional risks. The site-specific method usually provided the most accurate estimates, but was prohibitively expensive, whereas regional risk assessments were often too general to be of practical use. Policy makers lacked a systematic and quantitative method for conducting a regional-scale risk assessment of natural hazards. In response, Bernknopf and others developed the portfolio model, an intermediate-scale approach to assessing natural-hazard risks and mitigation policy alternatives. The basis for the portfolio-model approach was inspired by financial portfolio theory, which prescribes a method of optimizing return on investment while reducing risk by diversifying investments in different security types. In this context, a security type represents a unique combination of features and hazard-risk level, while financial return is defined as the reduction in losses resulting from an investment in mitigation of chosen securities. Features are selected for mitigation and are modeled like investment portfolios. Earth-science and economic data for the features are combined and processed in order to analyze each of the portfolios, which are then used to evaluate the benefits of mitigating the risk in selected locations. Ultimately, the decision maker seeks to choose a portfolio representing a mitigation policy that maximizes the expected return-on-investment, while minimizing the uncertainty associated with that return-on-investment. The portfolio model, now known as the Land-Use Portfolio Model (LUPM), provided the framework for the development of the Land-Use Portfolio Modeler, Version 1.0 software (LUPM v1.0). The software provides a geographic information system (GIS)-based modeling tool for evaluating alternative risk-reduction mitigation strategies for specific natural-hazard events. The modeler uses information about a specific natural-hazard event and the features exposed to that event within the targeted study region to derive a measure of a given mitigation strategy's effectiveness. Harnessing the spatial capabilities of a GIS enables the tool to provide a rich, interactive mapping environment in which users can create, analyze, visualize, and compare different
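To make the portfolio analogy concrete, the sketch below scores candidate mitigation portfolios by expected loss reduction (return) and its variance (uncertainty), assuming independent hazard outcomes per location. The event probabilities, avoided losses, costs, and budget are invented, and the independence assumption and exhaustive search are simplifications relative to the LUPM itself.

```python
import itertools

# Fictitious candidates: (probability of a damaging event over the planning horizon,
#                         loss avoided if mitigated, mitigation cost), all in dollars.
locations = {
    "site A": (0.20, 5_000_000, 400_000),
    "site B": (0.40, 1_000_000, 150_000),
    "site C": (0.30, 2_500_000, 300_000),
    "site D": (0.10, 8_000_000, 500_000),
}
budget = 800_000

best = None
for r in range(1, len(locations) + 1):
    for combo in itertools.combinations(locations, r):
        cost = sum(locations[s][2] for s in combo)
        if cost > budget:
            continue
        # Expected net return and variance of avoided losses (independent Bernoulli events).
        exp_return = sum(p * L for p, L, _ in (locations[s] for s in combo)) - cost
        variance = sum(p * (1 - p) * L**2 for p, L, _ in (locations[s] for s in combo))
        score = (exp_return, -variance)          # prefer high return, then low uncertainty
        if best is None or score > best[0]:
            best = (score, combo, cost)

score, combo, cost = best
print(f"chosen portfolio: {combo}, cost {cost}, expected net return {score[0]:.0f}")
```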
Giovannelli, Justin; Curran, Emily
2017-02-01
Issue: Policymakers have sought to improve the shopping experience on the Affordable Care Act’s marketplaces by offering decision support tools that help consumers better understand and compare their health plan options. Cost estimators are one such tool. They are designed to provide consumers a personalized estimate of the total cost--premium, minus subsidy, plus cost-sharing--of their coverage options. Cost estimators were available in most states by the start of the fourth open enrollment period. Goal: To understand the experiences of marketplaces that offer a total cost estimator and the interests and concerns of policymakers from states that are not using them. Methods: Structured interviews with marketplace officials, consumer enrollment assisters, technology vendors, and subject matter experts; analysis of the total cost estimators available on the marketplaces as of October 2016. Key findings and conclusions: Informants strongly supported marketplace adoption of a total cost estimator. Marketplaces that offer an estimator faced a range of design choices and varied significantly in their approaches to resolving them. Interviews suggested a clear need for additional consumer testing and data analysis of tool usage and for sustained outreach to enrollment assisters to encourage greater use of the estimators.
Estimation of portion size in children's dietary assessment: lessons learnt.
Foster, E; Adamson, A J; Anderson, A S; Barton, K L; Wrieden, W L
2009-02-01
Assessing the dietary intake of young children is challenging. In any 1 day, children may have several carers responsible for providing them with their dietary requirements, and once children reach school age, traditional methods such as weighing all items consumed become impractical. As an alternative to weighed records, food portion size assessment tools are available to assist subjects in estimating the amounts of foods consumed. Existing food photographs designed for use with adults and based on adult portion sizes have been found to be inappropriate for use with children. This article presents a review and summary of a body of work carried out to improve the estimation of portion sizes consumed by children. Feasibility work was undertaken to determine the accuracy and precision of three portion size assessment tools; food photographs, food models and a computer-based Interactive Portion Size Assessment System (IPSAS). These tools were based on portion sizes served to children during the National Diet and Nutrition Survey. As children often do not consume all of the food served to them, smaller portions were included in each tool for estimation of leftovers. The tools covered 22 foods, which children commonly consume. Children were served known amounts of each food and leftovers were recorded. They were then asked to estimate both the amount of food that they were served and the amount of any food leftover. Children were found to estimate food portion size with an accuracy approaching that of adults using both the food photographs and IPSAS. Further development is underway to increase the number of food photographs and to develop IPSAS to cover a much wider range of foods and to validate the use of these tools in a 'real life' setting.
Assessing the cost of fuel reduction treatments: a critical review
Bob Rummer
2008-01-01
The basic costs of the operations for implementing fuel reduction treatments are used to evaluate treatment effectiveness, select among alternatives, estimate total project costs, and build national program strategies. However, a review of the literature indicates that there is questionable basis for many of the general estimates used to date. Different approaches to...
COMPARISON OF WEST GERMAN AND U.S. FLUE GAS DESULFURIZATION AND SELECTIVE CATALYTIC REDUCTION COSTS
The report documents a comparison of the actual cost retrofitting flue gas desulfurization (FGD) and selective catalytic reduction (SCR) on Federal Republic of German (FRG) boilers to cost estimating procedures used in the U.S. to estimate the retrofit of these controls on U.S. b...
Reduction of nitrogen inputs to estuaries can be achieved by the control of agricultural, atmospheric, and urban sources. We use the USGS MRB1 SPARROW model to estimate reductions necessary to decrease nitrogen loads to estuaries by 10%. As a first approximation we looked at s...
Spot and Runway Departure Advisor
NASA Technical Reports Server (NTRS)
Jung, Yoon Chul
2013-01-01
The Spot and Runway Departure Advisor (SARDA) is a research prototype of a decision support tool for ATC tower controllers to assist in manging and controlling traffic on the surface of an airport. SARDA employs a scheduler to generate an optimal runway schedule and gate push-back - spot release sequence and schedule that improves efficiency of surface operations. The advisories for ATC tower controllers are displayed on an Electronic Flight Strip (EFS) system. The human-in-the-loop simulation of the SARDA tool was conducted for east operations of Dallas-Ft. Worth International Airport (DFW) to evaluate performance of the SARDA tool and human factors, such as situational awareness and workload. The results indicates noticeable taxi delay reduction and fuel savings by using the SARDA tool. Reduction in controller workload were also observed throughout the scenario runs. The future plan includes modeling and simulation of the ramp operations of the Charlotte International Airport, and develop a decision support tool for the ramp controllers.
França, Elisabeth Barboza; Lansky, Sônia; Rego, Maria Albertina Santiago; Malta, Deborah Carvalho; França, Julia Santiago; Teixeira, Renato; Porto, Denise; Almeida, Marcia Furquim de; Souza, Maria de Fatima Marinho de; Szwarcwald, Célia Landman; Mooney, Meghan; Naghavi, Mohsen; Vasconcelos, Ana Maria Nogales
2017-05-01
To analyze under-5 mortality rates and leading causes in Brazil and states in 1990 and 2015, using the Global Burden of Disease Study (GBD) 2015 estimates. The main sources of data for all-causes under-5 mortality and live births estimates were the mortality information system, surveys, and censuses. Proportions and rates per 1,000 live births (LB) were calculated for total deaths and leading causes. Estimates of under-5 deaths in Brazil were 191,505 in 1990, and 51,226 in 2015, 90% of which were infant deaths. The rates per 1,000 LB showed a reduction of 67.6% from 1990 to 2015, achieving the proposed target established by the Millennium Development Goals (MDGs). The reduction generally was more than 60% in states, with a faster reduction in the poorest Northeast region. The ratio of the highest and lowest rates in the states decreased from 4.9 in 1990 to 2.3 in 2015, indicating a reduction in socioeconomic regional disparities. Although prematurity showed a 72% reduction, it still remains as the leading cause of death (COD), followed by diarrheal diseases in 1990, and congenital anomalies, birth asphyxia and septicemia neonatal in 2015. Under-5 mortality has decreased over the past 25 years, with reduction of regional disparities. However, pregnancy and childbirth-related causes remain as major causes of death, together with congenital anomalies. Intersectoral and specific public health policies must be continued to improve living conditions and health care in order to achieve further reduction of under-5 mortality rates in Brazil.
Enhancing a rainfall-runoff model to assess the impacts of BMPs and LID practices on storm runoff.
Liu, Yaoze; Ahiablame, Laurent M; Bralts, Vincent F; Engel, Bernard A
2015-01-01
Best management practices (BMPs) and low impact development (LID) practices are increasingly being used as stormwater management techniques to reduce the impacts of urban development on hydrology and water quality. To assist planners and decision-makers at various stages of development projects (planning, implementation, and evaluation), user-friendly tools are needed to assess the effectiveness of BMPs and LID practices. This study describes a simple tool, the Long-Term Hydrologic Impact Assessment-LID (L-THIA-LID), which is enhanced with additional BMPs and LID practices, improved approaches to estimate hydrology and water quality, and representation of practices in series (meaning combined implementation). The tool was used to evaluate the performance of BMPs and LID practices individually and in series with 30 years of daily rainfall data in four types of idealized land use units and watersheds (low density residential, high density residential, industrial, and commercial). Simulation results were compared with the results of other published studies. The simulated results showed that reductions in runoff volume and pollutant loads after implementing BMPs and LID practices, both individually and in series, were comparable with the observed impacts of these practices. The L-THIA-LID 2.0 model is capable of assisting decision makers in evaluating environmental impacts of BMPs and LID practices, thereby improving the effectiveness of stormwater management decisions. Copyright © 2014 Elsevier Ltd. All rights reserved.
Incident Waste Decision Support Tool - Waste Materials ...
Report This is the technical documentation to the waste materials estimator module of I-WASTE. This document outlines the methodology and data used to develop the Waste Materials Estimator (WME) contained in the Incident Waste Decision Support Tool (I-WASTE DST). Specifically, this document reflects version 6.4 of the I-WASTE DST. The WME is one of four primary features of the I-WASTE DST. The WME is both a standalone calculator that generates waste estimates in terms of broad waste categories, and is also integrated into the Incident Planning and Response section of the tool where default inventories of specific waste items are provided in addition to the estimates for the broader waste categories. The WME can generate waste estimates for both common materials found in open spaces (soil, vegetation, concrete, and asphalt) and for a vast array of items and materials found in common structures.
Estimating risk reduction required to break even in a health promotion program.
Ozminkowski, Ronald J; Goetzel, Ron Z; Santoro, Jan; Saenz, Betty-Jo; Eley, Christine; Gorsky, Bob
2004-01-01
To illustrate a formula to estimate the amount of risk reduction required to break even on a corporate health promotion program. A case study design was implemented. Base year (2001) health risk and medical expenditure data from the company, along with published information on the relationships between employee demographics, health risks, and medical expenditures, were used to forecast demographics, risks, and expenditures for 2002 through 2011 and estimate the required amount of risk reduction. Motorola. 52,124 domestic employees. Demographics included age, gender, race, and job type. Health risks for 2001 were measured via health risk appraisal. Risks were noted as either high or low and related to exercise/eating habits, body weight, blood pressure, blood sugar levels, cholesterol levels, depression, stress, smoking/drinking habits, and seat belt use. Medical claims for 2001 were used to calculate medical expenditures per employee. Assuming a dollar 282 per employee program cost, Motorola employees would need to reduce their lifestyle-related health risks by 1.08% to 1.42% per year to break even on health promotion programming, depending upon the discount rate. Higher or lower program investments would change the risk reduction percentages. Employers can use information from published studies, along with their own data, to estimate the amount of risk reduction required to break even on their health promotion programs.
Wilcox, Meredith L; Mason, Helen; Fouad, Fouad M; Rastam, Samer; al Ali, Radwan; Page, Timothy F; Capewell, Simon; O'Flaherty, Martin; Maziak, Wasim
2015-01-01
This study presents a cost-effectiveness analysis of salt reduction policies to lower coronary heart disease in Syria. Costs and benefits of a health promotion campaign about salt reduction (HP); labeling of salt content on packaged foods (L); reformulation of salt content within packaged foods (R); and combinations of the three were estimated over a 10-year time frame. Policies were deemed cost-effective if their cost-effectiveness ratios were below the region's established threshold of $38,997 purchasing power parity (PPP). Sensitivity analysis was conducted to account for the uncertainty in the reduction of salt intake. HP, L, and R+HP+L were cost-saving using the best estimates. The remaining policies were cost-effective (CERs: R=$5,453 PPP/LYG; R+HP=$2,201 PPP/LYG; R+L=$2,125 PPP/LYG). R+HP+L provided the largest benefit with net savings using the best and maximum estimates, while R+L was cost-effective with the lowest marginal cost using the minimum estimates. This study demonstrated that all policies were cost-saving or cost effective, with the combination of reformulation plus labeling and a comprehensive policy involving all three approaches being the most promising salt reduction strategies to reduce CHD mortality in Syria.
Williams, Michael S; Ebel, Eric D
2012-01-01
A common approach to reducing microbial contamination has been the implementation of a Hazard Analysis and Critical Control Point (HACCP) program to prevent or reduce contamination during production. One example is the Pathogen Reduction HACCP program implemented by the U.S. Department of Agriculture's Food Safety and Inspection Service (FSIS). This program consisted of a staged implementation between 1996 and 2000 to reduce microbial contamination on meat and poultry products. Of the commodities regulated by FSIS, one of the largest observed reductions was for Salmonella contamination on broiler chicken carcasses. Nevertheless, how this reduction might have influenced the total number of salmonellosis cases in the United States has not been assessed. This study incorporates information from public health surveillance and surveys of the poultry slaughter industry into a model that estimates the number of broiler-related salmonellosis cases through time. The model estimates that-following the 56% reduction in the proportion of contaminated broiler carcasses observed between 1995 and 2000-approximately 190,000 fewer annual salmonellosis cases (attributed to broilers) occurred in 2000 compared with 1995. The uncertainty bounds for this estimate range from approximately 37,000 to 500,000 illnesses. Estimated illnesses prevented, due to the more modest reduction in contamination of 13% between 2000 and 2007, were not statistically significant. An analysis relating the necessary magnitude of change in contamination required for detection via human surveillance also is provided.
Weber-Spickschen, T S; Oszwald, M; Westphal, R; Krettek, C; Wahl, F; Gosling, T
2010-01-01
Robot assisted fracture reduction of femoral shaft fractures provides precise alignment while reducing the amount of intraoperative imaging. The connection between the robot and the fracture fragment should allow conventional intramedullary nailing, be minimally invasive and provide interim fracture stability. In our study we tested three different reduction tools: a conventional External Fixator, a Reposition-Plate and a Three-Point-Device with two variations (a 40 degrees and a 90 degrees version). We measured relative movements between the tools and the bone fragments in all translation and rotation planes. The Three-Point-Device 90 degrees showed the smallest average relative displacement and was the only device able to withstand the maximum applied load of 70 Nm without failure of any bone fragment. The Three-Point-Device 90 degrees complies with all the stipulated requirements and is a suitable interface for robot assisted fracture reduction of femoral shaft fractures.
Milosevic, Matija; McConville, Kristiina M Valter
2012-01-01
Operation of handheld power tools results in exposure to hand-arm vibrations, which over time lead to numerous health complications. The objective of this study was to evaluate protective equipment and working techniques for the reduction of vibration exposure. Vibration transmissions were recorded during different work techniques: with one- and two-handed grip, while wearing protective gloves (standard, air and anti-vibration gloves) and while holding a foam-covered tool handle. The effect was examined by analyzing the reduction of transmitted vibrations at the wrist. The vibration transmission was recorded with a portable device using a triaxial accelerometer. The results suggest large and significant reductions of vibration with appropriate safety equipment. Reductions of 85.6% were achieved when anti-vibration gloves were used. Our results indicated that transmitted vibrations were affected by several factors and could be measured and significantly reduced.
NASA Astrophysics Data System (ADS)
Sateesh Kumar, Ch; Patel, Saroj Kumar; Das, Anshuman
2018-03-01
Temperature generation in cutting tools is one of the major causes of tool failure especially during hard machining where machining forces are quite high resulting in elevated temperatures. Thus, the present work investigates the temperature generation during hard machining of AISI 52100 steel (62 HRC hardness) with uncoated and PVD AlTiN coated Al2O3/TiCN mixed ceramic cutting tools. The experiments were performed on a heavy duty lathe machine with both coated and uncoated cutting tools under dry cutting environment. The temperature of the cutting zone was measured using an infrared thermometer and a finite element model has been adopted to predict the temperature distribution in cutting tools during machining for comparative assessment with the measured temperature. The experimental and numerical results revealed a significant reduction of cutting zone temperature during machining with PVD AlTiN coated cutting tools when compared to uncoated cutting tools during each experimental run. The main reason for decrease in temperature for AlTiN coated tools is the lower coefficient of friction offered by the coating material which allows the free flow of the chips on the rake surface when compared with uncoated cutting tools. Further, the superior wear behaviour of AlTiN coating resulted in reduction of cutting temperature.
In this study, we evaluate the suitability of a three-dimensional chemical transport model (CTM) as a tool for assessing ammonia emission inventories, calculate the improvement in CTM performance owing to recent advances in temporally-varying ammonia emission estimates, and ident...
For Third Enrollment Period, Marketplaces Expand Decision Support Tools To Assist Consumers.
Wong, Charlene A; Polsky, Daniel E; Jones, Arthur T; Weiner, Janet; Town, Robert J; Baker, Tom
2016-04-01
The design of the Affordable Care Act's online health insurance Marketplaces can improve how consumers make complex health plan choices. We examined the choice environment on the state-based Marketplaces and HealthCare.gov in the third open enrollment period. Compared to previous enrollment periods, we found greater adoption of some decision support tools, such as total cost estimators and integrated provider lookups. Total cost estimators differed in how they generated estimates: In some Marketplaces, consumers categorized their own utilization, while in others, consumers answered detailed questions and were assigned a utilization profile. The tools available before creating an account (in the window-shopping period) and afterward (in the real-shopping period) differed in several Marketplaces. For example, five Marketplaces provided total cost estimators to window shoppers, but only two provided them to real shoppers. Further research is needed on the impact of different choice environments and on which tools are most effective in helping consumers pick optimal plans. Project HOPE—The People-to-People Health Foundation, Inc.
NASA Astrophysics Data System (ADS)
Jin, T.
The effect and degree of smoke concentration on the reduction in thinking and memory was studied. The change in memory and thinking power was estimated by the change in correction ratio for mental calculation in various smoke concentrations. The reduction of memory in a smokey environment estimated by the correction ratio for remembering the order of four color panels which were shown before smoke exposure. It is found that it is difficult to evaluate the relationship between the reduction in thinking power against smoke concentration, and relationship between the reductions of memory against smoke concentration.
Spot Urine-guided Salt Reduction in Chronic Kidney Disease Patients.
Uchiyama, Kiyotaka; Yanai, Akane; Ishibashi, Yoshitaka
2017-09-01
Dietary salt restriction is important in patients with chronic kidney disease (CKD) to reduce hypertension, cardiovascular events, progression of CKD, and mortality. However, recommending salt reduction for patients is difficult without knowing their actual sodium intake. This study evaluated the effectiveness of spot urine-guided salt reduction in CKD outpatients. A prospective cohort study was used. This study included a total of 127 adult outpatients (aged 60 ± 18 years, 80 males) with CKD. Their baseline estimated glomerular filtration rate was 51.4 ± 25.1 (mL/minute/1.73 m 2 ), and 64 (50%) of them were with CKD stage 3a or 3b (both 32 [25%]). We informed the patients of their individual spot urine-estimated salt intake every time they visited the outpatient clinic. Based on the data, the nephrologist encouraged the patients to achieve their salt restriction goal. The primary outcome was the estimated salt excretion, and the secondary outcome was the urinary protein-to-Cr ratio (UPCR). Multiple regression analyses were performed to clarify the contributing factors of changes in both outcomes. Over a follow-up of 12 months, the median number of patients' visits was 7 (5-8). The estimated salt intake was significantly reduced from 7.98 ± 2.49 g/day to 6.77 ± 1.77 g/day (P < .0001). The median UPCR was also reduced from 0.20 (0.10-0.80) to 0.10 (0.10-0.48) (P < .0001). On multiple regression analysis, a reduction in UPCR was positively associated with the baseline UPCR and a reduction in systolic blood pressure significantly (P < .0001 and P < .01, respectively) as well as positively correlated with a reduction in the estimated salt intake, with borderline significance (P = .08). Providing spot urine-estimated salt intake feedback effectively motivated CKD patients to reduce their salt intake. Spot urine-guided salt reduction may slow CKD progression through decreased urinary protein excretion. Copyright © 2017 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Nguyen, Nhan; Ting, Eric; Lebofsky, Sonia
2015-01-01
This paper presents data analysis of a flexible wing wind tunnel model with a variable camber continuous trailing edge flap (VCCTEF) design for drag minimization tested at the University of Washington Aeronautical Laboratory (UWAL). The wind tunnel test was designed to explore the relative merit of the VCCTEF concept for improved cruise efficiency through the use of low-cost aeroelastic model test techniques. The flexible wing model is a 10%-scale model of a typical transport wing and is constructed of woven fabric composites and foam core. The wing structural stiffness in bending is tailored to be half of the stiffness of a Boeing 757-era transport wing while the torsional stiffness is about the same. This stiffness reduction results in a wing tip deflection of about 10% of the wing semi-span. The VCCTEF is a multi-segment flap design having three chordwise camber segments and five spanwise flap sections for a total of 15 individual flap elements. The three chordwise camber segments can be positioned appropriately to create a desired trailing edge camber. Elastomeric material is used to cover the gaps in between the spanwise flap sections, thereby creating a continuous trailing edge. Wind tunnel data analysis conducted previously shows that the VCCTEF can achieve a drag reduction of up to 6.31% and an improvement in the lift-to-drag ratio (L=D) of up to 4.85%. A method for estimating the bending and torsional stiffnesses of the flexible wingUWAL wind tunnel model from static load test data is presented. The resulting estimation indicates that the stiffness of the flexible wing is significantly stiffer in torsion than in bending by as much as 9 to 1. The lift prediction for the flexible wing is computed by a coupled aerodynamic-structural model. The coupled model is developed by coupling a conceptual aerodynamic tool Vorlax with a finite-element model of the flexible wing via an automated geometry deformation tool. Based on the comparison of the lift curve slope, the lift prediction for the rigid wing is in good agreement with the estimated lift coefficients derived from the wind tunnel test data. Due to the movement of the VCCTEF during the wind tunnel test, uncertainty in the lift prediction due to the indicated variations of the VCCTEF deflection is studied. The results show a significant spread in the lift prediction which contradicts the consistency in the aerodynamic measurements, thus suggesting that the indicated variations as measured by the VICON system may not be reliable. The lift prediction of the flexible wing agrees very well with the measured lift curve for the baseline configuration. The computed bending deflection and wash-out twist of the flexible wing also match reasonably well with the aeroelastic deflection measurements. The results demonstrate the validity of the aerodynamic-structural tool for use to analyze aerodynamic performance of flexible wings.
Barnett, M C; McFarlane, J R; Hegarty, R S
2015-06-01
Ruminant methane yield (MY) is positively correlated with mean retention time (MRT) of digesta. The hormone triiodothyronine (T3 ), which is negatively correlated with ambient temperature, is known to influence MRT. It was hypothesised that exposing sheep to low ambient temperatures would increase plasma T3 concentration and decrease MRT of digesta within the rumen of sheep, resulting in a reduction of MY. To test this hypothesis, six Merino sheep were exposed to two different ambient temperatures (cold treatment, 9 ± 1 °C; warm control 26 ± 1 °C). The effects on MY, digesta MRT, plasma T3 concentration, CO2 production, DM intake, DM digestibility, change in body weight (BW), rumen volatile fatty acid (VFA) concentrations, estimated microbial protein output, protozoa abundance, wool growth, water intake, urine output and rectal temperature were studied. Cold treatment resulted in a reduction in MY (p < 0.01); digesta MRT in rumen (p < 0.01), hindgut (p = 0.01) and total digestive tract (p < 0.01); protozoa abundance (p < 0.05); and water intake (p < 0.001). Exposure to cold temperature increased plasma T3 concentration (p < 0.05), CO2 production (p = 0.01), total VFA concentrations (p = 0.03) and estimated microbial output from the rumen (p = 0.03). The rate of wool growth increased (p < 0.01) due to cold treatment, but DM intake, DM digestibility and BW change were not affected. The results suggest that exposure of sheep to cold ambient temperatures reduces digesta retention time in the gastrointestinal tract, leading to a reduction in enteric methane yield. Further research is warranted to determine whether T3 could be used as an indirect selection tool for genetic selection of low enteric methane-producing ruminants. Journal of Animal Physiology and Animal Nutrition © 2014 Blackwell Verlag GmbH.
Nakamura, Y.; Tucker, B. E.
1988-01-01
Today, Japanese society is well aware of the prediction of the Tokai earthquake. It is estimated by the Tokyo earthquake. It is estimated by the Tokyo muncipal government that this predicted earthquake could kill 30,000 people. (this estimate is viewed by many as conservative; other Japanese government agencies have made estimates but they have not been published.) Reduction in the number deaths from 120,000 to 30,000 between the Kanto earthquake and the predicted Tokai earthquake is due in large part to the reduction in the proportion of wooden construction (houses).
Sun, MIn; Perry, Kevin L.
2015-11-20
A system according to the principles of the present disclosure includes a storage estimation module and an air/fuel ratio control module. The storage estimation module estimates a first amount of ammonia stored in a first selective catalytic reduction (SCR) catalyst and estimates a second amount of ammonia stored in a second SCR catalyst. The air/fuel ratio control module controls an air/fuel ratio of an engine based on the first amount, the second amount, and a temperature of a substrate disposed in the second SCR catalyst.
A tool for the estimation of the distribution of landslide area in R
NASA Astrophysics Data System (ADS)
Rossi, M.; Cardinali, M.; Fiorucci, F.; Marchesini, I.; Mondini, A. C.; Santangelo, M.; Ghosh, S.; Riguer, D. E. L.; Lahousse, T.; Chang, K. T.; Guzzetti, F.
2012-04-01
We have developed a tool in R (the free software environment for statistical computing, http://www.r-project.org/) to estimate the probability density and the frequency density of landslide area. The tool implements parametric and non-parametric approaches to the estimation of the probability density and the frequency density of landslide area, including: (i) Histogram Density Estimation (HDE), (ii) Kernel Density Estimation (KDE), and (iii) Maximum Likelihood Estimation (MLE). The tool is available as a standard Open Geospatial Consortium (OGC) Web Processing Service (WPS), and is accessible through the web using different GIS software clients. We tested the tool to compare Double Pareto and Inverse Gamma models for the probability density of landslide area in different geological, morphological and climatological settings, and to compare landslides shown in inventory maps prepared using different mapping techniques, including (i) field mapping, (ii) visual interpretation of monoscopic and stereoscopic aerial photographs, (iii) visual interpretation of monoscopic and stereoscopic VHR satellite images and (iv) semi-automatic detection and mapping from VHR satellite images. Results show that both models are applicable in different geomorphological settings. In most cases the two models provided very similar results. Non-parametric estimation methods (i.e., HDE and KDE) provided reasonable results for all the tested landslide datasets. For some of the datasets, MLE failed to provide a result, for convergence problems. The two tested models (Double Pareto and Inverse Gamma) resulted in very similar results for large and very large datasets (> 150 samples). Differences in the modeling results were observed for small datasets affected by systematic biases. A distinct rollover was observed in all analyzed landslide datasets, except for a few datasets obtained from landslide inventories prepared through field mapping or by semi-automatic mapping from VHR satellite imagery. The tool can also be used to evaluate the probability density and the frequency density of landslide volume.
Vapor Intrusion Estimation Tool for Unsaturated Zone Contaminant Sources. User’s Guide
2016-08-30
324449 Page Intentionally Left Blank iii Executive Summary Soil vapor extraction (SVE) is a prevalent remediation approach for volatile contaminants...strength and location, vadose zone transport, and a model for estimating movement of soil -gas vapor contamination into buildings. The tool may be...framework for estimating the impact of a vadose zone contaminant source on soil gas concentrations and vapor intrusion into a building
Archfield, Stacey A.; Steeves, Peter A.; Guthrie, John D.; Ries, Kernell G.
2013-01-01
Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.
Update on Supersonic Jet Noise Research at NASA
NASA Technical Reports Server (NTRS)
Henderson, Brenda
2010-01-01
An update on jet noise research conducted in the Fundamental Aeronautics and Integrated Systems Research Programs was presented. Highlighted research projects included those focused on the development of prediction tools, diagnostic tools, and noise reduction concepts.
WASTE REDUCTION USING COMPUTER-AIDED DESIGN TOOLS
Growing environmental concerns have spurred considerable interest in pollution prevention. In most instances, pollution prevention involves introducing radical changes to the design of processes so that waste generation is minimized.
Process simulators can be effective tools i...
Computational tool for optimizing the essential oils utilization in inhibiting the bacterial growth
El-Attar, Noha E; Awad, Wael A
2017-01-01
Day after day, the importance of relying on nature in many fields such as food, medical, pharmaceutical industries, and others is increasing. Essential oils (EOs) are considered as one of the most significant natural products for use as antimicrobials, antioxidants, antitumorals, and anti-inflammatories. Optimizing the usage of EOs is a big challenge faced by the scientific researchers because of the complexity of chemical composition of every EO, in addition to the difficulties to determine the best in inhibiting the bacterial activity. The goal of this article is to present a new computational tool based on two methodologies: reduction by using rough sets and optimization with particle swarm optimization. The developed tool dubbed as Essential Oil Reduction and Optimization Tool is applied on 24 types of EOs that have been tested toward 17 different species of bacteria. PMID:28919787
NASA Technical Reports Server (NTRS)
Lee, David; Long, Dou; Etheridge, Mel; Plugge, Joana; Johnson, Jesse; Kostiuk, Peter
1998-01-01
We present a general method for making cross comparable estimates of the benefits of NASA-developed decision support technologies for air traffic management, and we apply a specific implementation of the method to estimate benefits of three decision support tools (DSTs) under development in NASA's advanced Air Transportation Technologies Program: Active Final Approach Spacing Tool (A-FAST), Expedite Departure Path (EDP), and Conflict Probe and Trial Planning Tool (CPTP). The report also reviews data about the present operation of the national airspace system (NAS) to identify opportunities for DST's to reduce delays and inefficiencies.
2014-01-01
Background Interest in the impact of burnout on physicians has been growing because of the possible burden this may have on health care systems. The objective of this study is to estimate the cost of burnout on early retirement and reduction in clinical hours of practicing physicians in Canada. Methods Using an economic model, the costs related to early retirement and reduction in clinical hours of physicians were compared for those who were experiencing burnout against a scenario in which they did not experience burnout. The January 2012 Canadian Medical Association Masterfile was used to determine the number of practicing physicians. Transition probabilities were estimated using 2007–2008 Canadian Physician Health Survey and 2007 National Physician Survey data. Adjustments were also applied to outcome estimates based on ratio of actual to planned retirement and reduction in clinical hours. Results The total cost of burnout for all physicians practicing in Canada is estimated to be $213.1 million ($185.2 million due to early retirement and $27.9 million due to reduced clinical hours). Family physicians accounted for 58.8% of the burnout costs, followed by surgeons for 24.6% and other specialists for 16.6%. Conclusion The cost of burnout associated with early retirement and reduction in clinical hours is substantial and a significant proportion of practicing physicians experience symptoms of burnout. As health systems struggle with human resource shortages and expanding waiting times, this estimate sheds light on the extent to which the burden could be potentially decreased through prevention and promotion activities to address burnout among physicians. PMID:24927847
NASA Astrophysics Data System (ADS)
Sun, Alexander Y.; Morris, Alan P.; Mohanty, Sitakanta
2009-07-01
Estimated parameter distributions in groundwater models may contain significant uncertainties because of data insufficiency. Therefore, adaptive uncertainty reduction strategies are needed to continuously improve model accuracy by fusing new observations. In recent years, various ensemble Kalman filters have been introduced as viable tools for updating high-dimensional model parameters. However, their usefulness is largely limited by the inherent assumption of Gaussian error statistics. Hydraulic conductivity distributions in alluvial aquifers, for example, are usually non-Gaussian as a result of complex depositional and diagenetic processes. In this study, we combine an ensemble Kalman filter with grid-based localization and a Gaussian mixture model (GMM) clustering techniques for updating high-dimensional, multimodal parameter distributions via dynamic data assimilation. We introduce innovative strategies (e.g., block updating and dimension reduction) to effectively reduce the computational costs associated with these modified ensemble Kalman filter schemes. The developed data assimilation schemes are demonstrated numerically for identifying the multimodal heterogeneous hydraulic conductivity distributions in a binary facies alluvial aquifer. Our results show that localization and GMM clustering are very promising techniques for assimilating high-dimensional, multimodal parameter distributions, and they outperform the corresponding global ensemble Kalman filter analysis scheme in all scenarios considered.
Hobbs, Brian P.; Carlin, Bradley P.; Mandrekar, Sumithra J.; Sargent, Daniel J.
2011-01-01
Summary Bayesian clinical trial designs offer the possibility of a substantially reduced sample size, increased statistical power, and reductions in cost and ethical hazard. However when prior and current information conflict, Bayesian methods can lead to higher than expected Type I error, as well as the possibility of a costlier and lengthier trial. This motivates an investigation of the feasibility of hierarchical Bayesian methods for incorporating historical data that are adaptively robust to prior information that reveals itself to be inconsistent with the accumulating experimental data. In this paper, we present several models that allow for the commensurability of the information in the historical and current data to determine how much historical information is used. A primary tool is elaborating the traditional power prior approach based upon a measure of commensurability for Gaussian data. We compare the frequentist performance of several methods using simulations, and close with an example of a colon cancer trial that illustrates a linear models extension of our adaptive borrowing approach. Our proposed methods produce more precise estimates of the model parameters, in particular conferring statistical significance to the observed reduction in tumor size for the experimental regimen as compared to the control regimen. PMID:21361892
An adaptive observer for on-line tool wear estimation in turning, Part I: Theory
NASA Astrophysics Data System (ADS)
Danai, Kourosh; Ulsoy, A. Galip
1987-04-01
On-line sensing of tool wear has been a long-standing goal of the manufacturing engineering community. In the absence of any reliable on-line tool wear sensors, a new model-based approach for tool wear estimation has been proposed. This approach is an adaptive observer, based on force measurement, which uses both parameter and state estimation techniques. The design of the adaptive observer is based upon a dynamic state model of tool wear in turning. This paper (Part I) presents the model, and explains its use as the basis for the adaptive observer design. This model uses flank wear and crater wear as state variables, feed as the input, and the cutting force as the output. The suitability of the model as the basis for adaptive observation is also verified. The implementation of the adaptive observer requires the design of a state observer and a parameter estimator. To obtain the model parameters for tuning the adaptive observer procedures for linearisation of the non-linear model are specified. The implementation of the adaptive observer in turning and experimental results are presented in a companion paper (Part II).
The Benefits of Internalizing Air Quality and Greenhouse Gas Externalities in the US Energy System
NASA Astrophysics Data System (ADS)
Brown, Kristen E.
The emission of pollutants from energy use has effects on both local air quality and the global climate, but the price of energy does not reflect these externalities. This study aims to analyze the effect that internalizing these externalities in the cost of energy would have on the US energy system, emissions, and human health. In this study, we model different policy scenarios in which fees are added to emissions related to generation and use of energy. The fees are based on values of damages estimated in the literature and are applied to upstream and combustion emissions related to electricity generation, industrial energy use, transportation energy use, residential energy use, and commercial energy use. The energy sources and emissions are modeled through 2055 in five-year time steps. The emissions in 2045 are incorporated into a continental-scale atmospheric chemistry and transport model, CMAQ, to determine the change in air quality due to different emissions reduction scenarios. A benefit analysis tool, BenMAP, is used with the air quality results to determine the monetary benefit of emissions reductions related to the improved air quality. We apply fees to emissions associated with health impacts, climate change, and a combination of both. We find that the fees we consider lead to reductions in targeted emissions as well as co-reducing non-targeted emissions. For fees on the electric sector alone, health impacting pollutant (HIP) emissions reductions are achieved mainly through control devices while Greenhouse Gas (GHG) fees are addressed through changes in generation technologies. When sector specific fees are added, reductions come mainly from the industrial and electricity generation sectors, and are achieved through a mix of energy efficiency, increased use of renewables, and control devices. Air quality is improved in almost all areas of the country with fees, including when only GHG fees are applied. Air quality tends to improve more in regions with larger emissions reductions, especially for PM2.5.
Developing index maps of water-harvest potential in Africa
Senay, G.B.; Verdin, J.P.
2004-01-01
The food security problem in Africa is tied to the small farmer, whose subsistence farming relies heavily on rain-fed agriculture. A dry spell lasting two to three weeks can cause a significant yield reduction. A small-scale irrigation scheme from small-capacity ponds can alleviate this problem. This solution would require a water harvest mechanism at a farm level. In this study, we looked at the feasibility of implementing such a water harvest mechanism in drought prone parts of Africa. A water balance study was conducted at different watershed levels. Runoff (watershed yield) was estimated using the SCS curve number technique and satellite derived rainfall estimates (RFE). Watersheds were delineated from the Africa-wide HYDRO-1K digital elevation model (DEM) data set in a GIS environment. Annual runoff volumes that can potentially be stored in a pond during storm events were estimated as the product of the watershed area and runoff excess estimated from the SCS Curve Number method. Estimates were made for seepage and net evaporation losses. A series of water harvest index maps were developed based on a combination of factors that took into account the availability of runoff, evaporation losses, population density, and the required watershed size needed to fill a small storage reservoir that can be used to alleviate water stress during a crop growing season. This study presents Africa-wide water-harvest index maps that could be used for conducting feasibility studies at a regional scale in assessing the relative differences in runoff potential between regions for the possibility of using ponds as a water management tool. ?? 2004 American Society of Agricultural Engineers.
Chesson, Harrell W; Ekwueme, Donatus U; Saraiya, Mona; Dunne, Eileen F; Markowitz, Lauri E
2013-08-20
The objective of this study was to estimate the number of years after onset of a quadrivalent HPV vaccination program before notable reductions in genital warts and cervical intraepithelial neoplasia (CIN) will occur in teenagers and young adults in the United States. We applied a previously published model of HPV vaccination in the United States and focused on the timing of reductions in genital warts among both sexes and reductions in CIN 2/3 among females. Using different coverage scenarios, the lowest being consistent with current 3-dose coverage in the United States, we estimated the number of years before reductions of 10%, 25%, and 50% would be observed after onset of an HPV vaccination program for ages 12-26 years. The model suggested female-only HPV vaccination in the intermediate coverage scenario will result in a 10% reduction in genital warts within 2-4 years for females aged 15-19 years and a 10% reduction in CIN 2/3 among females aged 20-29 years within 7-11 years. Coverage had a major impact on when reductions would be observed. For example, in the higher coverage scenario a 25% reduction in CIN2/3 would be observed with 8 years compared with 15 years in the lower coverage scenario. Our model provides estimates of the potential timing and magnitude of the impact of HPV vaccination on genital warts and CIN 2/3 at the population level in the United States. Notable, population-level impacts of HPV vaccination on genital warts and CIN 2/3 can occur within a few years after onset of vaccination, particularly among younger age groups. Our results are generally consistent with early reports of declines in genital warts among youth. Published by Elsevier Ltd.
Screening Tools to Estimate Mold Burdens in Homes
Objective: The objective of this study was to develop screening tools that could be used to estimate the mold burden in a home which would indicate whether more detailed testing might be useful. Methods: Previously, in the American Healthy Home Survey, a DNA-based method of an...
Omitted variable bias in crash reduction factors.
DOT National Transportation Integrated Search
2015-09-01
Transportation planners and traffic engineers are increasingly turning to crash reduction factors to evaluate changes in road : geometric and design features in order to reduce crashes. Crash reduction factors are typically estimated based on segment...
Sleeter, Benjamin M.; Wood, Nathan J.; Soulard, Christopher E.; Wilson, Tamara
2017-01-01
Tsunamis have the potential to cause considerable damage to communities along the U.S. Pacific Northwest coastline. As coastal communities expand over time, the potential societal impact of tsunami inundation changes. To understand how community exposure to tsunami hazards may change in coming decades, we projected future development (i.e. urban, residential, and rural), households, and residents over a 50-year period (2011–2061) along the Washington, Oregon, and northern California coasts. We created a spatially explicit, land use/land cover, state-and-transition simulation model to project future developed land use based on historical development trends. We then compared our development projection results to tsunami-hazard zones associated with a Cascadia subduction zone (CSZ) earthquake. Changes in tsunami-hazard exposure by 2061 were estimated for 50 incorporated cities, 7 tribal reservations, and 17 counties relative to current (2011) estimates. Across the region, 2061 population exposure in tsunami-hazard zones was projected to increase by 3880 households and 6940 residents. The top ten communities with highest population exposure to CSZ-related tsunamis in 2011 are projected to remain the areas with the highest population exposure by 2061. The largest net population increases in tsunami-hazard zones were projected in the unincorporated portions of several counties, including Skagit, Coos, and Humboldt. Land-change simulation modeling of projected future development serves as an exploratory tool aimed at helping local governments understand the hazard-exposure implications of community growth and to include this knowledge in risk-reduction planning.
Byrne, Abbey; Hodge, Andrew; Jimenez-Soto, Eliana
2015-11-01
Many priority countries in the countdown to the millennium development goals deadline are lagging in progress towards maternal and child health (MCH) targets. Papua New Guinea (PNG) is one such country beset by challenges of geographical inaccessibility, inequity and health system weakness. Several countries, however, have made progress through focused initiatives which align with the burden of disease and overcome specific inequities. This study identifies the potential impact on maternal and child mortality through increased coverage of prioritised interventions within the PNG health system. The burden of disease and health system environment of PNG was documented to inform prioritised MCH interventions at community, outreach, and clinical levels. Potential reductions in maternal and child mortality through increased intervention coverage to close the geographical equity gap were estimated with the lives saved tool. A set community-level interventions, with highest feasibility, would yield significant reductions in newborn and child mortality. Adding the outreach group delivers gains for maternal mortality, particularly through family planning. The clinical services group of interventions demands greater investment but are essential to reach MCH targets. Cumulatively, the increased coverage is estimated to reduce the rates of under-five mortality by 19 %, neonatal mortality by 26 %, maternal mortality ratio by 10 % and maternal mortality by 33 %. Modest investments in health systems focused on disadvantaged populations can accelerate progress in maternal and child survival even in fragile health systems like PNG. The critical approach may be to target interventions and implementation appropriately to the sensitive context of lagging countries.
Hoffmann, Rasmus; Eikemo, Terje Andreas; Kulhánová, Ivana; Dahl, Espen; Deboosere, Patrick; Dzúrová, Dagmar; van Oyen, Herman; Rychtaríková, Jitka; Strand, Bjørn Heine; Mackenbach, Johan P
2013-01-01
Socioeconomic differences in health are a major challenge for public health. However, realistic estimates to what extent they are modifiable are scarce. This problem can be met through the systematic application of the population attributable fraction (PAF) to socioeconomic health inequalities. The authors used cause-specific mortality data by educational level from Belgium, Norway and Czech Republic and data on the prevalence of smoking, alcohol, lack of physical activity and high body mass index from national health surveys. Information on the impact of these risk factors on mortality comes from the epidemiological literature. The authors calculated PAFs to quantify the impact on socioeconomic health inequalities of a social redistribution of risk factors. The authors developed an Excel tool covering a wide range of possible scenarios and the authors compare the results of the PAF approach with a conventional regression. In a scenario where the whole population gets the risk factor prevalence currently seen among the highly educated inequalities in mortality can be reduced substantially. According to the illustrative results, the reduction of inequality for all risk factors combined varies between 26% among Czech men and 94% among Norwegian men. Smoking has the highest impact for both genders, and physical activity has more impact among women. After discussing the underlying assumptions of the PAF, the authors concluded that the approach is promising for estimating the extent to which health inequalities can be potentially reduced by interventions on specific risk factors. This reduction is likely to differ substantially between countries, risk factors and genders.
Quantitative Microbial Risk Assessment for Escherichia coli O157:H7 in Fresh-Cut Lettuce.
Pang, Hao; Lambertini, Elisabetta; Buchanan, Robert L; Schaffner, Donald W; Pradhan, Abani K
2017-02-01
Leafy green vegetables, including lettuce, are recognized as potential vehicles for foodborne pathogens such as Escherichia coli O157:H7. Fresh-cut lettuce is potentially at high risk of causing foodborne illnesses, as it is generally consumed without cooking. Quantitative microbial risk assessments (QMRAs) are gaining more attention as an effective tool to assess and control potential risks associated with foodborne pathogens. This study developed a QMRA model for E. coli O157:H7 in fresh-cut lettuce and evaluated the effects of different potential intervention strategies on the reduction of public health risks. The fresh-cut lettuce production and supply chain was modeled from field production, with both irrigation water and soil as initial contamination sources, to consumption at home. The baseline model (with no interventions) predicted a mean probability of 1 illness per 10 million servings and a mean of 2,160 illness cases per year in the United States. All intervention strategies evaluated (chlorine, ultrasound and organic acid, irradiation, bacteriophage, and consumer washing) significantly reduced the estimated mean number of illness cases when compared with the baseline model prediction (from 11.4- to 17.9-fold reduction). Sensitivity analyses indicated that retail and home storage temperature were the most important factors affecting the predicted number of illness cases. The developed QMRA model provided a framework for estimating risk associated with consumption of E. coli O157:H7-contaminated fresh-cut lettuce and can guide the evaluation and development of intervention strategies aimed at reducing such risk.
A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions
NASA Technical Reports Server (NTRS)
Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver
2016-01-01
Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.
A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions
NASA Technical Reports Server (NTRS)
Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver
2016-01-01
Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the limitations of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-theart in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority shortcomings within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.
Development of demand forecasting tool for natural resources recouping from municipal solid waste.
Zaman, Atiq Uz; Lehmann, Steffen
2013-10-01
Sustainable waste management requires an integrated planning and design strategy for reliable forecasting of waste generation, collection, recycling, treatment and disposal for the successful development of future residential precincts. The success of the future development and management of waste relies to a high extent on the accuracy of the prediction and on a comprehensive understanding of the overall waste management systems. This study defies the traditional concepts of waste, in which waste was considered as the last phase of production and services, by putting forward the new concept of waste as an intermediate phase of production and services. The study aims to develop a demand forecasting tool called 'zero waste index' (ZWI) for measuring the natural resources recouped from municipal solid waste. The ZWI (ZWI demand forecasting tool) quantifies the amount of virgin materials recovered from solid waste and subsequently reduces extraction of natural resources. In addition, the tool estimates the potential amount of energy, water and emissions avoided or saved by the improved waste management system. The ZWI is tested in a case study of waste management systems in two developed cities: Adelaide (Australia) and Stockholm (Sweden). The ZWI of waste management systems in Adelaide and Stockholm is 0.33 and 0.17 respectively. The study also enumerates per capita energy savings of 2.9 GJ and 2.83 GJ, greenhouse gas emissions reductions of 0.39 tonnes (CO2e) and 0.33 tonnes (CO2e), as well as water savings of 2.8 kL and 0.92 kL in Adelaide and Stockholm respectively.
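The abstract describes the ZWI as a measure of virgin material recouped per unit of waste generated. The snippet below is a minimal sketch of that ratio, assuming illustrative waste streams and substitution factors; the published tool also accounts for energy, water and emission offsets, which are omitted here.

```python
# Minimal sketch of a zero waste index (ZWI) style calculation, assuming ZWI is
# the substitution-weighted fraction of generated waste that displaces virgin
# material once recovered. Tonnages and substitution factors are illustrative.

def zero_waste_index(streams):
    """streams: list of (tonnes_managed, virgin_substitution_factor) per waste stream."""
    total_generated = sum(t for t, _ in streams)
    virgin_substituted = sum(t * f for t, f in streams)
    return virgin_substituted / total_generated

city_like = [(400_000, 0.8),   # recycled paper/cardboard (hypothetical)
             (150_000, 0.9),   # recycled metals (hypothetical)
             (450_000, 0.0)]   # landfilled residual
print(round(zero_waste_index(city_like), 2))
```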
Use of RecA protein to enrich for homologous genes in a genomic library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taidi-Laskowski, B.; Grumet, F.C.; Tyan, D.
1988-08-25
RecA protein-coated probe has been utilized to enrich genomic digests for desired genes in order to facilitate cloning from genomic libraries. Using a previously cloned HLA-B27 gene as the recA-coated enrichment probe, the authors obtained a mean 108x increase in the ratio of specific to nonspecific plaques in lambda libraries screened for B27 variant alleles of estimated 99% homology to the probe. Class I genes of lesser homology were less enriched. Loss of genomic DNA during the enrichment procedure can, however, restrict application of this technique whenever starting genomic DNA is very limited. Nevertheless, the impressive reduction in cloning effort and material makes recA enrichment a useful new tool for cloning homologous genes from genomic DNA.
Reduction of variance in spectral estimates for correction of ultrasonic aberration.
Astheimer, Jeffrey P; Pilkington, Wayne C; Waag, Robert C
2006-01-01
A variance reduction factor is defined to describe the rate of convergence and accuracy of spectra estimated from overlapping ultrasonic scattering volumes when the scattering is from a spatially uncorrelated medium. Assuming that the individual volumes are localized by a spherically symmetric Gaussian window and that centers of the volumes are located on orbits of an icosahedral rotation group, the factor is minimized by adjusting the weight and radius of each orbit. Conditions necessary for the application of the variance reduction method, particularly for statistical estimation of aberration, are examined. The smallest possible value of the factor is found by allowing an unlimited number of centers constrained only to be within a ball rather than on icosahedral orbits. Computations using orbits formed by icosahedral vertices, face centers, and edge midpoints with a constraint radius limited to a small multiple of the Gaussian width show that a significant reduction of variance can be achieved from a small number of centers in the confined volume and that this reduction is nearly the maximum obtainable from an unlimited number of centers in the same volume.
Li, Xiaoyan; Rymer, William Zev; Zhou, Ping
2013-01-01
Motor unit number index (MUNIX) measurement has recently received increasing attention as a tool to evaluate the progression of motoneuron diseases. In our current study, the sensitivity of the MUNIX technique to changes in motoneuron and muscle properties was explored by a simulation approach utilizing variations on published motoneuron pool and surface electromyogram (EMG) models. Our simulation results indicate that, when motoneuron pool and muscle parameters are kept unchanged and the input motor unit numbers to the model are varied, MUNIX estimates appropriately characterize changes in motor unit numbers. Such MUNIX estimates are not sensitive to the different motor unit recruitment and rate coding strategies used in the model. Furthermore, alterations in motor unit control properties do not have a significant effect on the MUNIX estimates. Neither adjustment of the motor unit recruitment range nor reduction of the motor unit firing rates jeopardizes the MUNIX estimates. The MUNIX estimates closely correlate with the maximum M wave amplitude. However, if we reduce the amplitude of each motor unit action potential rather than simply reduce the motor unit number, then MUNIX estimates substantially underestimate the motor unit numbers in the muscle. These findings suggest that the current MUNIX definition is most suitable for motoneuron diseases that demonstrate secondary evidence of muscle fiber reinnervation. In this regard, when MUNIX is applied, it is important to examine a parallel measurement of the motor unit size index (MUSIX), defined as the ratio of the maximum M wave amplitude to the MUNIX. However, there are potential limitations in the application of the MUNIX methods in atrophied muscle, where it is unclear whether the atrophy is accompanied by loss of motor units or loss of muscle fiber size. PMID:22514208
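For reference, the parallel MUSIX measurement suggested above is simply the maximum M-wave (CMAP) amplitude divided by the MUNIX estimate; a minimal sketch with hypothetical values follows.

```python
# Minimal sketch of the MUSIX check suggested above: MUSIX is defined in the
# text as the maximum M-wave amplitude divided by the MUNIX estimate.
# The CMAP amplitude and MUNIX value below are hypothetical.

def musix_uv(cmap_max_mv, munix):
    return cmap_max_mv * 1000.0 / munix  # convert mV to microvolts per estimated unit

print(round(musix_uv(cmap_max_mv=8.0, munix=160), 1))  # ~50 uV "size index"
```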
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-22
... collection of information unless it displays a currently valid OMB Control Number. No person shall be subject... Reduction Act (PRA) that does not display a valid OMB Control Number. DATES: Written Paperwork Reduction Act... estimate(s); ways to enhance the quality, utility, and clarity of the information collected; ways to...
Practical Applications for Earthquake Scenarios Using ShakeMap
NASA Astrophysics Data System (ADS)
Wald, D. J.; Worden, B.; Quitoriano, V.; Goltz, J.
2001-12-01
In planning and coordinating emergency response, utilities, local government, and other organizations are best served by conducting training exercises based on realistic earthquake situations-ones that they are most likely to face. Scenario earthquakes can fill this role; they can be generated for any geologically plausible earthquake or for actual historic earthquakes. ShakeMap Web pages now display selected earthquake scenarios (www.trinet.org/shake/archive/scenario/html) and more events will be added as they are requested and produced. We will discuss the methodology and provide practical examples where these scenarios are used directly for risk reduction. Given a selected event, we have developed tools to make it relatively easy to generate a ShakeMap earthquake scenario using the following steps: 1) Assume a particular fault or fault segment will (or did) rupture over a certain length, 2) Determine the magnitude of the earthquake based on assumed rupture dimensions, 3) Estimate the ground shaking at all locations in the chosen area around the fault, and 4) Represent these motions visually by producing ShakeMaps and generating ground motion input for loss estimation modeling (e.g., FEMA's HAZUS). At present, ground motions are estimated using empirical attenuation relationships to estimate peak ground motions on rock conditions. We then correct the amplitude at that location based on the local site soil (NEHRP) conditions as we do in the general ShakeMap interpolation scheme. Finiteness is included explicitly, but directivity enters only through the empirical relations. Although current ShakeMap earthquake scenarios are empirically based, substantial improvements in numerical ground motion modeling have been made in recent years. However, loss estimation tools, HAZUS for example, typically require relatively high frequency (3 Hz) input for predicting losses, above the range of frequencies successfully modeled to date. Achieving full-synthetic ground motion estimates that will substantially improve over empirical relations at these frequencies will require developing cost-effective numerical tools for proper theoretical inclusion of known complex ground motion effects. Current efforts underway must continue in order to obtain site, basin, and deeper crustal structure, and to characterize and test 3D earth models (including attenuation and nonlinearity). In contrast, longer period synthetics (>2 sec) are currently being generated in a deterministic fashion to include 3D and shallow site effects, an improvement on empirical estimates alone. As progress is made, we will naturally incorporate such advances into the ShakeMap scenario earthquake and processing methodology. Our scenarios are currently used heavily in emergency response planning and loss estimation. Primary users include city, county, state and federal government agencies (e.g., the California Office of Emergency Services, FEMA, the County of Los Angeles) as well as emergency response planners and managers for utilities, businesses, and other large organizations. We have found the scenarios are also of fundamental interest to many in the media and the general community interested in the nature of the ground shaking likely experienced in past earthquakes as well as effects of rupture on known faults in the future.
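As a rough illustration of steps 2 and 3 of the scenario workflow above (magnitude from rupture dimensions, then ground motion versus distance with a site correction), the sketch below uses a generic log-area magnitude scaling and a generic attenuation form with placeholder coefficients. It is not the empirical relations or the NEHRP site-correction scheme actually used in ShakeMap.

```python
import numpy as np

# Illustrative scenario ground-motion sketch with generic (hypothetical)
# functional forms and placeholder coefficients -- not ShakeMap's relations.

def magnitude_from_rupture(length_km, width_km, a=4.0, b=1.0):
    """Magnitude from rupture area via a generic log-area scaling (illustrative)."""
    return a + b * np.log10(length_km * width_km)

def rock_pga_g(magnitude, dist_km, c0=-2.5, c1=0.5, c2=-1.2, h=6.0):
    """Peak ground acceleration on rock from a generic attenuation form."""
    r = np.sqrt(dist_km**2 + h**2)
    return 10 ** (c0 + c1 * magnitude + c2 * np.log10(r))

def site_corrected_pga(pga_rock, amplification=1.4):
    """Apply a single soil amplification factor (illustrative)."""
    return pga_rock * amplification

m = magnitude_from_rupture(40.0, 15.0)          # hypothetical 40 km x 15 km rupture
print(round(m, 2), round(site_corrected_pga(rock_pga_g(m, dist_km=10.0)), 2))
```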
Posthuma, Leo; Wahlstrom, Emilia; Nijenhuis, René; Dijkens, Chris; de Zwart, Dick; van de Meent, Dik; Hollander, Anne; Brand, Ellen; den Hollander, Henri A; van Middelaar, Johan; van Dijk, Sander; Hall, E F; Hoffer, Sally
2014-11-01
The United Nations response mechanism to environmental emergencies requested a tool to support disaster assessment and coordination actions by United Nations Disaster Assessment and Coordination (UNDAC) teams. The tool should support on-site decision making when substantial chemical emissions affect human health directly or via the environment and should be suitable for prioritizing impact reduction management options under challenging conditions worldwide. To answer this need, the Flash Environmental Assessment Tool (FEAT) was developed and the scientific and practical underpinning and application of this tool are described in this paper. FEAT consists of a printed decision framework and lookup tables, generated by combining the scientific data on chemicals, exposure pathways and vulnerabilities with the pragmatic needs of emergency field teams. Application of the tool yields information that can help prioritize impact reduction measures. The first years of use illustrated the usefulness of the tool as well as suggesting additional uses and improvements. An additional use is application of the back-office tool (Hazard Identification Tool, HIT), the results of which aid decision-making by the authorities of affected countries and the preparation of field teams for on-site deployment. Another extra use is in disaster pro action and prevention. In this case, the application of the tool supports safe land-use planning and improved technical design of chemical facilities. UNDAC teams are trained to use the tool after large-scale sudden onset natural disasters. Copyright © 2014 Elsevier Ltd. All rights reserved.
Alternative Fuels Data Center: Biodiesel Vehicle Emissions
Ahn, SangNam; Smith, Matthew Lee; Altpeter, Mary; Post, Lindsey; Ory, Marcia G
2015-01-01
Chronic disease self-management education (CDSME) programs have been delivered to more than 100,000 older Americans with chronic conditions. As one of the Stanford suite of evidence-based CDSME programs, the chronic disease self-management program (CDSMP) has been disseminated in diverse populations and settings. The objective of this paper is to introduce a practical, universally applicable tool to assist program administrators and decision makers plan implementation efforts and make the case for continued program delivery. This tool was developed utilizing data from a recent National Study of CDSMP to estimate national savings associated with program participation. Potential annual healthcare savings per CDSMP participant were calculated based on averted emergency room visits and hospitalizations. While national data can be utilized to estimate cost savings, the tool has built-in features allowing users to tailor calculations based on their site-specific data. Building upon the National Study of CDSMP's documented potential savings of $3.3 billion in healthcare costs by reaching 5% of adults with one or more chronic conditions, two heuristic case examples were also explored based on different population projections. The case examples show how a small county and large metropolitan city were not only able to estimate healthcare savings ($38,803 for the small county; $732,290 for the large metropolitan city) for their existing participant populations but also to project significant healthcare savings if they plan to reach higher proportions of middle-aged and older adults. Having a tool to demonstrate the monetary value of CDSMP can contribute to the ongoing dissemination and sustainability of such community-based interventions. Next steps will be creating a user-friendly, internet-based version of Healthcare Cost Savings Estimator Tool: CDSMP, followed by broadening the tool to consider cost savings for other evidence-based programs.
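The savings arithmetic such a tool performs can be sketched as averted utilization times unit cost, net of program cost. The snippet below is a minimal illustration with hypothetical per-event costs and averted-utilization rates that a user would replace with national-study or site-specific values.

```python
# Minimal sketch of a healthcare-savings estimator in the spirit of the tool
# described above. All rates and unit costs are hypothetical placeholders.

def annual_savings(participants, averted_er_per_person, er_cost,
                   averted_hosp_per_person, hosp_cost, program_cost_per_person=0.0):
    gross = participants * (averted_er_per_person * er_cost +
                            averted_hosp_per_person * hosp_cost)
    return gross - participants * program_cost_per_person

# Hypothetical small-county scenario
print(annual_savings(participants=250, averted_er_per_person=0.1, er_cost=1500,
                     averted_hosp_per_person=0.05, hosp_cost=12000,
                     program_cost_per_person=400))
```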
Gazoorian, Christopher L.
2015-01-01
A graphical user interface, with an integrated spreadsheet summary report, has been developed to estimate and display the daily mean streamflows and statistics and to evaluate different water management or water withdrawal scenarios with the estimated monthly data. This package of regression equations, U.S. Geological Survey streamgage data, and spreadsheet application produces an interactive tool to estimate an unaltered daily streamflow hydrograph and streamflow statistics at ungaged sites in New York. Among other uses, the New York Streamflow Estimation Tool can assist water managers with permitting water withdrawals, implementing habitat protection, estimating contaminant loads, or determining the potential effect of chemical spills.
NASA Technical Reports Server (NTRS)
Seshadri, Banavara R.; Smith, Stephen W.
2007-01-01
Variation in constraint through the thickness of a specimen affects the cyclic crack-tip-opening displacement (ΔCTOD). ΔCTOD is a valuable measure of crack growth behavior, indicating closure development, constraint variations and load history effects. Fatigue loading with a continual load reduction was used to simulate the load history associated with fatigue crack growth threshold measurements. The constraint effect on the estimated ΔCTOD is studied by carrying out three-dimensional elastic-plastic finite element simulations. The analysis involves numerical simulation of different standard fatigue threshold test schemes to determine how each test scheme affects ΔCTOD. The American Society for Testing and Materials (ASTM) prescribes standard load reduction procedures for threshold testing using either the constant stress ratio (R) or constant maximum stress intensity (Kmax) methods. Different specimen types defined in the standard, namely the compact tension, C(T), and middle cracked tension, M(T), specimens were used in this simulation. The threshold simulations were conducted with different initial Kmax values to study their effect on the estimated ΔCTOD. During each simulation, the ΔCTOD was estimated at every load increment during the load reduction procedure. Previous numerical simulation results indicate that the constant R load reduction method generates a plastic wake resulting in remote crack closure during unloading. Upon reloading, this remote contact location was observed to remain in contact well after the crack tip was fully open. The final region to open is located at the point at which the load reduction was initiated and at the free surface of the specimen. However, simulations carried out using the constant Kmax load reduction procedure did not indicate remote crack closure. Previous analysis results using various starting Kmax values and different load reduction rates have indicated ΔCTOD is independent of specimen size. A study of the effect of specimen thickness and geometry on the measured ΔCTOD for various load reduction procedures and its implication in the estimation of fatigue crack growth threshold values is discussed.
NASA Astrophysics Data System (ADS)
Haddag, B.; Kagnaya, T.; Nouari, M.; Cutard, T.
2013-01-01
Modelling machining operations allows estimation of cutting parameters which are difficult to obtain experimentally, in particular quantities characterizing the tool-workpiece interface. Temperature is one of these quantities; it has an impact on tool wear, so its estimation is important. This study deals with a new modelling strategy, based on two steps of calculation, for analysis of the heat transfer into the cutting tool. Unlike classical methods, which consider only the cutting tool and apply an approximate heat flux at the cutting face estimated from experimental data (e.g. measured cutting force, cutting power), the proposed approach consists of two successive 3D finite element calculations and is fully independent of experimental measurements; only the definition of the behaviour of the tool-workpiece couple is necessary. The first is a 3D thermomechanical modelling of the chip formation process, which allows estimation of cutting forces, chip morphology and its flow direction. The second calculation is a 3D thermal modelling of the heat diffusion into the cutting tool, using an adequate thermal loading (applied uniform or non-uniform heat flux). This loading is estimated using quantities obtained from the first-step calculation, such as contact pressure, sliding velocity distributions and contact area. Comparisons, on the one hand, between experimental data and the first calculation and, on the other hand, between temperatures measured with embedded thermocouples and the second calculation show good agreement in terms of cutting forces, chip morphology and cutting temperature.
PREMER: a Tool to Infer Biological Networks.
Villaverde, Alejandro F; Becker, Kolja; Banga, Julio R
2017-10-04
Inferring the structure of unknown cellular networks is a main challenge in computational biology. Data-driven approaches based on information theory can determine the existence of interactions among network nodes automatically. However, the elucidation of certain features - such as distinguishing between direct and indirect interactions or determining the direction of a causal link - requires estimating information-theoretic quantities in a multidimensional space. This can be a computationally demanding task, which acts as a bottleneck for the application of elaborate algorithms to large-scale network inference problems. The computational cost of such calculations can be alleviated by the use of compiled programs and parallelization. To this end we have developed PREMER (Parallel Reverse Engineering with Mutual information & Entropy Reduction), a software toolbox that can run in parallel and sequential environments. It uses information theoretic criteria to recover network topology and determine the strength and causality of interactions, and allows incorporating prior knowledge, imputing missing data, and correcting outliers. PREMER is a free, open source software tool that does not require any commercial software. Its core algorithms are programmed in FORTRAN 90 and implement OpenMP directives. It has user interfaces in Python and MATLAB/Octave, and runs on Windows, Linux and OSX (https://sites.google.com/site/premertoolbox/).
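As a simplified illustration of the information-theoretic scoring that underlies such tools, the sketch below computes pairwise mutual information from binned data with NumPy. PREMER's actual algorithm goes further (entropy reduction, multidimensional measures, causality and missing-data handling), so this is only a conceptual example.

```python
import numpy as np

# Histogram-based pairwise mutual information scoring for network inference.
# Conceptual sketch only; not PREMER's algorithm.

def mutual_information(x, y, bins=10):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                                   # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def mi_matrix(data):
    """data: samples x variables array; returns the symmetric MI score matrix."""
    n = data.shape[1]
    m = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            m[i, j] = m[j, i] = mutual_information(data[:, i], data[:, j])
    return m

rng = np.random.default_rng(0)
x = rng.normal(size=500)
data = np.column_stack([x, x + 0.1 * rng.normal(size=500), rng.normal(size=500)])
print(np.round(mi_matrix(data), 2))   # strong score only for the correlated pair
```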
Power, Christopher; Gerhard, Jason I; Karaoulis, Marios; Tsourlos, Panagiotis; Giannopoulos, Antonios
2014-07-01
Practical, non-invasive tools do not currently exist for mapping the remediation of dense non-aqueous phase liquids (DNAPLs). Electrical resistivity tomography (ERT) exhibits significant potential but has not yet become a practitioner's tool due to challenges in interpreting the survey results at real sites. This study explores the effectiveness of recently developed four-dimensional (4D, i.e., 3D space plus time) time-lapse surface ERT to monitor DNAPL source zone remediation. A laboratory experiment demonstrated the approach for mapping a changing NAPL distribution over time. A recently developed DNAPL-ERT numerical model was then employed to independently simulate the experiment, providing confidence that the DNAPL-ERT model is a reliable tool for simulating real systems. The numerical model was then used to evaluate the potential for this approach at the field scale. Four DNAPL source zones, exhibiting a range of complexity, were initially simulated, followed by modeled time-lapse ERT monitoring of complete DNAPL remediation by enhanced dissolution. 4D ERT inversion provided estimates of the regions of the source zone experiencing mass reduction with time. Results show that 4D time-lapse ERT has significant potential to map both the outline and the center of mass of the evolving treated portion of the source zone to within a few meters in each direction. In addition, the technique can provide a reasonable, albeit conservative, estimate of the DNAPL volume remediated with time: 25% underestimation in the upper 2m and up to 50% underestimation at late time between 2 and 4m depth. The technique is less reliable for identifying cleanup of DNAPL stringers outside the main DNAPL body. Overall, this study demonstrates that 4D time-lapse ERT has potential for mapping where and how quickly DNAPL mass changes in real time during site remediation. Copyright © 2014 Elsevier B.V. All rights reserved.
Janhsen, B.; Daniliuc, C. G.
2017-01-01
In this paper, the application of the double radical nucleophilic aromatic substitution (SRN1) in various dihalogenated, mostly diiodinated, π-conjugated systems as a tool for qualitatively estimating their π-conjugation is described. This approach uses electron delocalisation as a measure of π-conjugation. Electron injection into the π-system is achieved via reaction of an intermediate aryl radical, itself generated from a dihalogenated π-system via SET-reduction of the C–I bond and subsequent reaction with a thiolate anion. The generated arene radical anion can then further react with the second aryl-halogen moiety within the π-system via an intramolecular electron transfer process. The efficiency of this intramolecular electron transfer is related to the π-conjugation of the radical anion. If the π-conjugation within the aromatic unit is weak, the arene radical anion reacts via an intermolecular ET with the starting dihalide. The intramolecular ET process delivers a product of a double SRN1 substitution whereas the intermolecular ET pathway provides a product of a mono- SRN1 substitution. By simple product analysis of mono- versus double substitution, π-conjugation can be qualitatively evaluated. This mechanistic tool is applied to various dihalogenated π-conjugated systems and the results are discussed within the context of π-conjugation. The conjugation mode within the π-system and the length of the aromatic system are varied, and the effect of relative positioning of the two halides within small π-systems is also addressed. PMID:28580099
Contingent valuation of fuel hazard reduction treatments
John B. Loomis; Armando Gonzalez-Caban
2008-01-01
This chapter presents a stated preference technique for estimating the public benefits of reducing wildfires to residents of California, Florida, and Montana from two alternative fuel reduction programs: prescribed burning, and mechanical fuels reduction. The two fuel reduction programs under study are quite relevant to people living in California, Florida, and...
NASA Astrophysics Data System (ADS)
Rincón, A.; Jorba, O.; Baldasano, J. M.
2010-09-01
The increased contribution of solar energy in power generation sources requires an accurate estimation of surface solar irradiance, conditioned by geographical, temporal and meteorological conditions. Knowledge of the variability of these factors is essential to estimate the expected energy production, and therefore helps stabilize the electricity grid and increases the reliability of available solar energy. The use of numerical meteorological models in combination with statistical post-processing tools may have the potential to satisfy the requirements for short-term forecasting of solar irradiance for up to several days ahead and its application in solar devices. In this contribution, we present an assessment of a short-term irradiance prediction system based on the WRF-ARW mesoscale meteorological model (Skamarock et al., 2005) and several post-processing tools in order to improve the overall skill of the system in an annual simulation of the year 2004 in Spain. The WRF-ARW model is applied with 4 km x 4 km horizontal resolution and 38 vertical layers over the Iberian Peninsula. The hourly model irradiance is evaluated against more than 90 surface stations. The stations are used to assess the temporal and spatial fluctuations and trends of the system, evaluating three different post-processing methods: the Model Output Statistics technique (MOS; Glahn and Lowry, 1972), a recursive statistical method (REC; Boi, 2004) and a Kalman Filter Predictor (KFP; Bozic, 1994; Roeger et al., 2003). A first evaluation of the system without post-processing tools shows an overestimation of the surface irradiance, because attenuation by atmospheric absorbers other than clouds is not included in the meteorological model. This produces an annual BIAS of 16 W m-2 h-1, annual RMSE of 106 W m-2 h-1 and annual NMAE of 42%. The largest errors are observed in spring and summer, reaching RMSE of 350 W m-2 h-1. Results using the Kalman Filter Predictor show a reduction of 8% in RMSE and 83% in BIAS, and NMAE decreases to 32%. The REC method shows a reduction of 6% in RMSE and 79% in BIAS, and NMAE decreases to 28%. When comparing stations at different altitudes, the overestimation is enhanced at coastal stations (less than 200 m), up to 900 W m-2 h-1. The results allow us to analyze the strengths and drawbacks of the irradiance prediction system and its application in the estimation of energy production from photovoltaic cells. References Boi, P.: A statistical method for forecasting extreme daily temperatures using ECMWF 2-m temperatures and ground station measurements, Meteorol. Appl., 11, 245-251, 2004. Bozic, S.: Digital and Kalman filtering, John Wiley, Hoboken, New Jersey, 2nd edn., 1994. Glahn, H. and Lowry, D.: The use of Model Output Statistics (MOS) in Objective Weather Forecasting, Applied Meteorology, 11, 1203-1211, 1972. Roeger, C., Stull, R., McClung, D., Hacker, J., Deng, X., and Modzelewski, H.: Verification of Mesoscale Numerical Weather Forecasts in Mountainous Terrain for Application to Avalanche Prediction, Weather and Forecasting, 18, 1140-1160, 2003. Skamarock, W., Klemp, J., Dudhia, J., Gill, D., Barker, D. M., Wang, W., and Powers, J. G.: A Description of the Advanced Research WRF Version 2, Tech. Rep. NCAR/TN-468+STR, NCAR Technical Note, 2005.
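As an illustration of the Kalman-filter style post-processing evaluated above, the sketch below runs a scalar Kalman filter that tracks a slowly drifting model bias and subtracts it from each forecast. The process and measurement noise variances are illustrative tuning values, not those of the study.

```python
# Scalar Kalman-filter bias predictor applied to model irradiance output.
# Conceptual sketch; noise variances q and r are illustrative tuning values.

def kalman_bias_correct(forecasts, observations, q=5.0, r=50.0):
    """Recursively estimate the model bias and return bias-corrected forecasts."""
    bias, p = 0.0, 1.0
    corrected = []
    for f, obs in zip(forecasts, observations):
        corrected.append(f - bias)          # correct with the current bias estimate
        p += q                              # predict: bias modelled as a random walk
        k = p / (p + r)                     # Kalman gain
        bias += k * ((f - obs) - bias)      # update with the latest observed error
        p *= (1 - k)
    return corrected

fc  = [620, 640, 600, 580, 610]   # modelled irradiance (W m-2), hypothetical
obs = [600, 615, 585, 560, 590]   # measured irradiance, hypothetical
print([round(c) for c in kalman_bias_correct(fc, obs)])
```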
Shen, Angela K; Warnock, Rob; Brereton, Stephaeno; McKean, Stephen; Wernecke, Michael; Chu, Steve; Kelman, Jeffrey A
2018-04-11
Older adults are at great risk of developing serious complications from seasonal influenza. We explore vaccination coverage estimates in the Medicare population through the use of administrative claims data and describe a tool designed to help shape outreach efforts and inform strategies to help raise influenza vaccination rates. This interactive mapping tool uses claims data to compare vaccination levels between geographic (i.e., state, county, zip code) and demographic (i.e., race, age) groups at different points in a season. Trends can also be compared across seasons. Utilization of this tool can assist key actors interested in prevention - medical groups, health plans, hospitals, and state and local public health authorities - in supporting strategies for reaching pools of unvaccinated beneficiaries where general national population estimates of coverage are less informative. Implementing evidence-based tools can be used to address persistent racial and ethnic disparities and prevent a substantial number of influenza cases and hospitalizations.
Hans-Erik Andersen; Jacob Strunk; Hailemariam Temesgen
2011-01-01
Airborne laser scanning, collected in a sampling mode, has the potential to be a valuable tool for estimating the biomass resources available to support bioenergy production in rural communities of interior Alaska. In this study, we present a methodology for estimating forest biomass over a 201,226-ha area (of which 163,913 ha are forested) in the upper Tanana valley...
Estimating the deposition of urban atmospheric NO2 to the urban forest in Portland-Vancouver USA
NASA Astrophysics Data System (ADS)
Rao, M.; Gonzalez Abraham, R.; George, L. A.
2016-12-01
Cities are hotspots of atmospheric emissions of reactive nitrogen oxides, including nitrogen dioxide (NO2), a US EPA criteria pollutant that affects both human and environmental health. A fraction of this anthropogenic, atmospheric NO2 is deposited onto the urban forest, potentially mitigating the impact of NO2 on respiratory health within cities. However, the role of the urban forest in removal of atmospheric NO2 through deposition has not been well studied. Here, using an observationally-based statistical model, we first estimate the reduction of NO2 associated with the urban forest in Portland-Vancouver, USA, and the health benefits accruing from this reduction. In order to assess if this statistically observed reduction in NO2 associated with the urban forest is consistent with deposition, we then compare the amount of NO2 removed through deposition to the urban forest as estimated using a 4 km CMAQ simulation. We further undertake a sensitivity analysis in CMAQ to estimate the range of NO2 removed as a function of bulk stomatal resistance. We find that NO2 deposition estimated by CMAQ accounts for roughly one-third of the reduction in NO2 shown by the observationally-based statistical model (Figure). Our sensitivity analysis shows that a 3-10 fold increase in the bulk stomatal resistance parameter in CMAQ would align CMAQ-estimated deposition with the statistical model. The reduction of NO2 by the urban forest in the Portland-Vancouver area may yield a health benefit of at least $1.5 million USD annually, providing strong motivation to better understand the mechanism through which the urban forest may be removing air pollutants such as NO2 and thus helping create healthier urban atmospheres. Figure: Comparing the amount of NO2 deposition as estimated by CMAQ and the observationally-based statistical model (LURF). Each point corresponds to a single 4 x 4 km CMAQ grid cell.
A simple model of carbon in the soil profile for agricultural soils in Northwestern Europe
NASA Astrophysics Data System (ADS)
Taghizadeh-Toosi, Arezoo; Hutchings, Nicholas J.; Vejlin, Jonas; Christensen, Bent T.; Olesen, Jørgen E.
2014-05-01
World soil carbon (C) stocks are second to those in the ocean, and represent three times as much C as currently present in the atmosphere. The amount of C in soil may play a significant role in carbon exchanges between the atmosphere and the terrestrial environment. The C-TOOL model is a three-pool linked soil organic carbon (SOC) model in well-drained mineral soils under agricultural land management to allow generalized parameterization for estimating effects of management measures at medium to long time scales for the entire soil profile (0-100 cm). C-TOOL has been developed to enable simulations of SOC turnover in soil using temperature dependent first order kinetics for describing decomposition. Compared with many other SOC models, C-TOOL applies a less complicated structure, which facilitates easier calibration, and it requires only few inputs (i.e., average monthly air temperature, soil clay content,soil carbon-to-nitrogen ratio, and C inputs to the soil from plants and other sources). C-TOOL was parameterized using SOC and radiocarbon data from selected long-term field treatments in United Kingdom, Sweden and Denmark. However, less data were available for evaluation of subsoil C (25-100 cm) from the long-term experiments applied. In Denmark a national 7×7 km grid net was established in 1986 for soil C monitoring down to 100 cm depth. The results of SOC showed a significant decline from 1997 to 2009 in the 0-50 cm soil layer. This was mainly attributed to changes in the 25-50 cm layer, where a decline in SOC was found for all soil texture types. Across the period 1986 to 2009 there was clear tendency for increasing SOC on the sandy soils and reductions on the loamy soils. This effect is linked to land use, since grasslands and dairy farms are more abundant in the western parts of Denmark, where most of the sandy soils are located. The results and the data from soil monitoring have been used to validate the C-TOOL modelling approach used for accounting of changes in SOC of Danish agricultural soils and for verification of the national inventories of SOC changes in agricultural soils. Future work will focus on further evaluating effects on subsoil C as well as improving the estimation of C inputs, particularly root C input at different soil depth. Key words: Soil organic carbon, modelling, C-TOOL, agriculture, management, grassland
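The sketch below illustrates the kind of temperature-dependent, first-order, multi-pool turnover step that C-TOOL is described as using. The rate constants, transfer fractions and Q10-style temperature response are illustrative placeholders, not C-TOOL's calibrated parameters.

```python
import math

# Pooled first-order SOC turnover step with a temperature modifier.
# Illustrative parameters only; not C-TOOL's calibrated values.

def temperature_factor(t_air_c, q10=2.0, t_ref=10.0):
    return q10 ** ((t_air_c - t_ref) / 10.0)

def monthly_step(pools, k_month, transfer, c_input, t_air_c):
    """pools: [fresh, humified, resistant] C stocks (t C/ha); returns updated pools."""
    f = temperature_factor(t_air_c)
    decomposed = [c * (1 - math.exp(-k * f)) for c, k in zip(pools, k_month)]
    fresh, humified, resistant = [c - d for c, d in zip(pools, decomposed)]
    fresh += c_input                               # plant / manure C input
    humified += transfer[0] * decomposed[0]        # part of decomposed fresh C is humified
    resistant += transfer[1] * decomposed[1]       # part of decomposed humified C stabilises
    return [fresh, humified, resistant]

pools = [5.0, 60.0, 20.0]
for month_temp in [2, 4, 8, 12, 16, 18, 17, 14, 10, 7, 4, 2]:   # monthly air temps (deg C)
    pools = monthly_step(pools, k_month=[0.08, 0.003, 0.0002],
                         transfer=[0.3, 0.05], c_input=0.3, t_air_c=month_temp)
print([round(p, 2) for p in pools])
```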
Improvements in Spectrum's fit to program data tool.
Mahiane, Severin G; Marsh, Kimberly; Grantham, Kelsey; Crichlow, Shawna; Caceres, Karen; Stover, John
2017-04-01
The Joint United Nations Program on HIV/AIDS-supported Spectrum software package (Glastonbury, Connecticut, USA) is used by most countries worldwide to monitor the HIV epidemic. In Spectrum, HIV incidence trends among adults (aged 15-49 years) are derived by either fitting to seroprevalence surveillance and survey data or generating curves consistent with program and vital registration data, such as historical trends in the number of newly diagnosed infections or people living with HIV and AIDS related deaths. This article describes development and application of the fit to program data (FPD) tool in Joint United Nations Program on HIV/AIDS' 2016 estimates round. In the FPD tool, HIV incidence trends are described as a simple or double logistic function. Function parameters are estimated from historical program data on newly reported HIV cases, people living with HIV or AIDS-related deaths. Inputs can be adjusted for proportions undiagnosed or misclassified deaths. Maximum likelihood estimation or minimum chi-squared distance methods are used to identify the best fitting curve. Asymptotic properties of the estimators from these fits are used to estimate uncertainty. The FPD tool was used to fit incidence for 62 countries in 2016. Maximum likelihood and minimum chi-squared distance methods gave similar results. A double logistic curve adequately described observed trends in all but four countries where a simple logistic curve performed better. Robust HIV-related program and vital registration data are routinely available in many middle-income and high-income countries, whereas HIV seroprevalence surveillance and survey data may be scarce. In these countries, the FPD tool offers a simpler, improved approach to estimating HIV incidence trends.
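As a conceptual illustration of describing an incidence trend with a double logistic function, the sketch below fits such a curve to synthetic case counts by least squares. The FPD tool itself uses maximum likelihood or minimum chi-squared fitting and adjusts for undiagnosed or misclassified cases; the functional form and parameters here are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a double-logistic incidence curve to synthetic reported-case data.
# Conceptual sketch; parameters and the exact functional form are illustrative.

def double_logistic(t, a, alpha1, t1, alpha2, t2):
    rise = a / (1 + np.exp(-alpha1 * (t - t1)))     # epidemic growth phase
    fall = 1 / (1 + np.exp(-alpha2 * (t - t2)))     # later decline (alpha2 < 0)
    return rise * fall

years = np.arange(1990, 2016)
cases = (double_logistic(years, 2.0, 0.6, 1997, -0.3, 2008)
         + 0.05 * np.random.default_rng(1).normal(size=years.size))  # synthetic data
p0 = [1.5, 0.5, 1995, -0.2, 2005]                   # starting guess for the fit
params, _ = curve_fit(double_logistic, years, cases, p0=p0, maxfev=20000)
print(np.round(params, 2))
```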
Fiedler, John L; Macdonald, Barbara
2009-12-01
Food fortification is a promising strategy for combating micronutrient deficiencies, which plague one-third of the world's population. Which foods to fortify, with which micronutrients, and in which countries remain essential questions that to date have not been addressed at the global level. To provide a tool for international agencies to identify and organize the next phase of the unfinished global fortification agenda by prioritizing roughly 250 potential interventions in 48 priority countries. By explicitly defining the structure and operations of the fortification interventions in a detailed and transparent manner, and incorporating a substantial amount of country-specific data, the study also provides a potentially useful starting point for policy discussions in each of the 48 countries, which--it is hoped--will help to catalyze the development of public-private partnerships and accelerate the introduction of fortification and reduction of micronutrient deficiencies. Forty-eight high-priority countries were identified, and the feasibility of fortifying vegetable oil and sugar with vitamin A and fortifying wheat flour and maize flour with two alternative multiple micronutrient formulations was assessed. One hundred twenty-two country-, food-, and fortification formulation-specific interventions were assessed to be feasible, and the costs of each intervention were estimated. Assuming a 30% reduction in the micronutrient deficiencies of the persons consuming the food, the number of disability-adjusted life years (DALYs) saved by each of the programs was estimated. The cost per DALY saved was calculated for each of the 122 interventions, and the interventions were rank-ordered by cost-effectiveness. It is estimated that the 60 most cost-effective interventions would carry a 10-year price tag of US$1 billion and have costs per DALY saved ranging from US$1 to US$134. The single "best bet" intervention--i.e., the most cost-effective intervention--in each of the 48 countries was identified. This study provides a detailed, transparent, evidence-based approach to defining and estimating the costs and cost-effectiveness of the unfinished global fortification agenda in the 48 priority countries. Other considerations in designing a strategic approach to the unfinished global fortification agenda are also discussed.
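The ranking step described above reduces to computing cost per DALY saved and sorting; a minimal sketch with entirely hypothetical interventions and figures follows.

```python
# Rank fortification interventions by cost per DALY saved.
# Names, costs and DALY estimates are hypothetical placeholders.

interventions = [
    {"name": "Country A: oil + vitamin A",   "cost_10yr_usd": 4.0e6, "dalys_saved": 9.0e5},
    {"name": "Country B: wheat flour + MMN", "cost_10yr_usd": 1.2e7, "dalys_saved": 6.0e5},
    {"name": "Country C: sugar + vitamin A", "cost_10yr_usd": 2.5e6, "dalys_saved": 3.0e4},
]
for item in interventions:
    item["usd_per_daly"] = item["cost_10yr_usd"] / item["dalys_saved"]

for item in sorted(interventions, key=lambda d: d["usd_per_daly"]):
    print(f'{item["name"]}: ${item["usd_per_daly"]:.0f} per DALY saved')
```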
Multi-model assessment of health impacts of air pollution in Europe and the U.S.
NASA Astrophysics Data System (ADS)
Im, Ulas; Brandt, Jørgen; Christensen, Jesper H.; Geels, Camilla; Hansen, Kaj M.; Andersen, Mikael S.; Solazzo, Efisio; Hogrefe, Christian; Galmarini, Stefano
2017-04-01
According to the World Health Organization (WHO), air pollution is now the world's largest single environmental health risk. Assessments of health impacts and the associated external costs related to air pollution are estimated based on observed and/or modelled air pollutant levels. Chemistry and transport models (CTMs) are useful tools to calculate the concentrations of health-related pollutants, taking into account the non-linearities in the chemistry and the complex interactions between meteorology and chemistry. However, the CTMs include different chemical and aerosol schemes that introduce differences in the representation of the processes. Likewise, differences in the emissions and boundary conditions used in the models add to the overall uncertainties. These uncertainties are also propagated into health impact estimates that use output from the CTMs. Multi-model (MM) ensembles can be useful to minimize the uncertainties introduced by the individual CTMs. In the present study, the simulated surface concentrations of health-related air pollutants for the year 2010 from fifteen modelling groups participating in the AQMEII exercise serve as input to the Economic Valuation of Air Pollution model (EVA), in order to calculate the impacts of these pollutants on human health and the associated external costs in Europe and the U.S. In addition, the impacts of a 20% global emission reduction scenario on human health and the associated costs have been calculated. Preliminary results show that in Europe and the U.S., the MM mean number of premature deaths due to air pollution is calculated to be 400 000 and 160 000, respectively. Estimated health impacts among different models can vary by up to a factor of 3 and 1.2 in Europe and the U.S., respectively. PM is calculated to be the major pollutant affecting the health impacts, and differences among models in the treatment of aerosol composition, physics and dynamics are a key factor. The total MM mean costs due to health impacts of air pollution are estimated to be 400 and 170 billion € in Europe and the U.S., respectively. Finally, the scenario with a 20% reduction in global anthropogenic emissions leads to a decrease of 18% in all health outcomes.
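Health impact functions of this kind typically apply a log-linear concentration-response relationship to modelled concentrations, exposed population and baseline mortality rates. The sketch below is a minimal illustration with a hypothetical relative-risk slope, baseline rate, exposures and populations (not the EVA model's parameters), including a 20% concentration-reduction comparison.

```python
import numpy as np

# Log-linear concentration-response calculation of attributable premature deaths.
# All parameters (beta, baseline rate, PM2.5 fields, populations) are illustrative.

def premature_deaths(pm25, population, baseline_rate=0.009, beta=0.006, c0=0.0):
    """Attributable deaths per grid cell from RR = exp(beta * (C - C0))."""
    rr = np.exp(beta * np.maximum(pm25 - c0, 0.0))
    attributable_fraction = 1.0 - 1.0 / rr
    return baseline_rate * population * attributable_fraction

pm25 = np.array([8.0, 15.0, 22.0])        # annual-mean PM2.5, ug/m3 (hypothetical)
pop  = np.array([2.0e5, 8.0e5, 1.5e6])    # exposed population per cell (hypothetical)
print(premature_deaths(pm25, pop).round(0))          # baseline scenario
print(premature_deaths(0.8 * pm25, pop).round(0))    # 20% lower concentrations
```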
Using bayesian model to estimate the cost of traffic injuries in Iran in 2013
Ainy, Elaheh; Soori, Hamid; Ganjali, Mojtaba; Bahadorimonfared, Ayad
2017-01-01
Background and Aim: Road traffic injuries (RTIs) inflict a significant social and economic burden. We aimed to use a Bayesian model to present a precise method and to estimate the cost of RTIs in Iran in 2013. Materials and Methods: In a cross-sectional study on costs resulting from traffic injuries, 846 people per road user were randomly selected and investigated during 3 months (1st September–1st December) in 2013. The research questionnaire was prepared based on the standard willingness to pay (WTP) method, considering perceived risks, especially in Iran. Data were collected along with four scenarios for occupants, pedestrians, vehicle drivers, and motorcyclists. Inclusion criteria were having at least a high school education and being in the age range of 18–65 years; risk perception was an important factor in the study and was measured with a visual tool. Participants who did not have risk perception were excluded from the study. The main outcome measure was cost estimation of traffic injuries using the WTP method. Results: Mean WTP was 2,612,050 Iranian rials (IRR) among these road users. Based on 20,408 death cases, the statistical value of life was estimated at 402,314,106,073,648 IRR, equivalent to $13,410,470,202 at the free-market rate of 30,000 IRR per dollar (purchasing power parity). In sum, injury and death cases came to 1,171,450,232,238,648 IRR, equivalent to $39,048,341,074. Moreover, in 2013, costs of traffic accidents constituted 6.46% of gross national income, which was $604,300,000,000. WTP had a significant relationship with age, middle and high income, daily payment for injury reduction, greater payment for time reduction, trip mileage, private car drivers, bus and minibus vehicles, and occupants (P < 0.01). Conclusion: Costs of traffic injuries constitute a noticeable portion of gross national income. If policy-making and resource allocation are based on scientific evidence, an enormous amount of capital can be saved by reducing death and injury rates. PMID:28971031
Hakama, Matti; Moss, Sue M; Stenman, Ulf-Hakan; Roobol, Monique J; Zappa, Marco; Carlsson, Sigrid; Randazzo, Marco; Nelen, Vera; Hugosson, Jonas
2017-06-01
Objectives To calculate design-corrected estimates of the effect of screening on prostate cancer mortality by centre in the European Randomised Study of Screening for Prostate Cancer (ERSPC). Setting The ERSPC has shown a 21% reduction in prostate cancer mortality in men invited to screening with follow-up truncated at 13 years. Centres either used pre-consent randomisation (effectiveness design) or post-consent randomisation (efficacy design). Methods In six centres (three effectiveness design, three efficacy design) with follow-up until the end of 2010, or maximum 13 years, the effect of screening was estimated as both effectiveness (mortality reduction in the target population) and efficacy (reduction in those actually screened). Results The overall crude prostate cancer mortality risk ratio in the intervention arm vs control arm for the six centres was 0.79 ranging from a 14% increase to a 38% reduction. The risk ratio was 0.85 in centres with effectiveness design and 0.73 in those with efficacy design. After correcting for design, overall efficacy was 27%, 24% in pre-consent and 29% in post-consent centres, ranging between a 12% increase and a 52% reduction. Conclusion The estimated overall effect of screening in attenders (efficacy) was a 27% reduction in prostate cancer mortality at 13 years' follow-up. The variation in efficacy between centres was greater than the range in risk ratio without correction for design. The centre-specific variation in the mortality reduction could not be accounted for by the randomisation method.
Health benefit modelling and optimization of vehicular pollution control strategies
NASA Astrophysics Data System (ADS)
Sonawane, Nayan V.; Patil, Rashmi S.; Sethi, Virendra
2012-12-01
This study asserts that the evaluation of pollution reduction strategies should be approached on the basis of health benefits. The framework presented could be used for decision making on the basis of cost effectiveness when the strategies are applied concurrently. Several vehicular pollution control strategies have been proposed in literature for effective management of urban air pollution. The effectiveness of these strategies has been mostly studied as a one at a time approach on the basis of change in pollution concentration. The adequacy and practicality of such an approach is studied in the present work. Also, the assessment of respective benefits of these strategies has been carried out when they are implemented simultaneously. An integrated model has been developed which can be used as a tool for optimal prioritization of various pollution management strategies. The model estimates health benefits associated with specific control strategies. ISC-AERMOD View has been used to provide the cause-effect relation between control options and change in ambient air quality. BenMAP, developed by U.S. EPA, has been applied for estimation of health and economic benefits associated with various management strategies. Valuation of health benefits has been done for impact indicators of premature mortality, hospital admissions and respiratory syndrome. An optimization model has been developed to maximize overall social benefits with determination of optimized percentage implementations for multiple strategies. The model has been applied for sub-urban region of Mumbai city for vehicular sector. Several control scenarios have been considered like revised emission standards, electric, CNG, LPG and hybrid vehicles. Reduction in concentration and resultant health benefits for the pollutants CO, NOx and particulate matter are estimated for different control scenarios. Finally, an optimization model has been applied to determine optimized percentage implementation of specific control strategies with maximization of social benefits, when these strategies are applied simultaneously.
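The final optimization step described above can be framed as choosing fractional implementation levels that maximize total benefit subject to a budget. The sketch below is a minimal linear-programming illustration with hypothetical benefit and cost figures, assuming benefits and costs scale linearly with the implementation fraction; the paper's model maximizes social benefits with a more detailed formulation.

```python
import numpy as np
from scipy.optimize import linprog

# Choose fractional implementation levels x_i in [0, 1] for control strategies
# to maximize total health benefit under a budget. Figures are hypothetical.

benefit_musd = np.array([120.0, 80.0, 45.0, 30.0])   # benefit if fully implemented ($M)
cost_musd    = np.array([60.0, 25.0, 30.0, 10.0])    # implementation cost ($M)
budget_musd  = 70.0

# linprog minimizes, so negate the benefits; one budget constraint, box bounds on x.
res = linprog(c=-benefit_musd, A_ub=[cost_musd], b_ub=[budget_musd],
              bounds=[(0, 1)] * len(cost_musd), method="highs")
print(np.round(res.x, 2), round(-res.fun, 1))        # optimal fractions, total benefit
```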
Singh, Rajiv; Sinha, Saurabh; Bill, Alan; Turner-Stokes, Lynne
2017-04-01
To identify the needs for specialised rehabilitation provision in a cohort of neurosurgical patients; to determine if these were met, and to estimate the potential cost implications and cost-benefits of meeting any unmet rehabilitation needs. A prospective study of in-patient admissions to a regional neurosurgical ward. Assessment of needs for specialised rehabilitation (Category A or B needs) was made with the Patient Categorisation Tool. The number of patients who were referred and admitted for specialised rehabilitation was calculated. Data from the unit's submission to the UK Rehabilitation Outcomes Collaborative (UKROC) national clinical database 2012-2015 were used to estimate the potential mean lifetime savings generated through reduction in the costs of on-going care in the community. Of 223 neurosurgical in-patients over 3 months, 156 (70%) had Category A or B needs. Out of the 105 patients who were eligible for admission to the local specialised rehabilitation service, only 20 (19%) were referred and just 11 (10%) were actually admitted. The mean transfer time was 70.2 (range 28-127) days, compared with the national standard of 42 days. In the 3-year sample, mean savings in the cost of on-going care were £568 per week. Assuming a 10-year reduction in life expectancy, the approximate net lifetime saving for post-neurosurgical patients was estimated as at least £600K per patient. We calculated that provision of additional bed capacity in the specialist rehabilitation unit could generate net savings of £3.6M/bed-year. This preliminary single-centre study identified a considerable gap in provision of specialised rehabilitation for neurosurgical patients, which must be addressed if patients are to fulfil their potential for recovery. A 5-fold increase in bed capacity would cost £9.3m/year, but could lead to potential net savings of £24m/year. Our findings now require confirmation on a wider scale through prospective multi-centre studies.
Lee, Chan Ho; Park, Young Joo; Ku, Ja Yoon; Ha, Hong Koo
2017-06-01
To evaluate the clinical application of computed tomography-based measurement of renal cortical volume and split renal volume as a single tool to assess the anatomy and renal function in patients with renal tumors before and after partial nephrectomy, and to compare the findings with technetium-99m dimercaptosuccinic acid renal scan. The data of 51 patients with a unilateral renal tumor managed by partial nephrectomy were retrospectively analyzed. The renal cortical volume of tumor-bearing and contralateral kidneys was measured using ImageJ software. Split estimated glomerular filtration rate and split renal volume calculated using this renal cortical volume were compared with the split renal function measured with technetium-99m dimercaptosuccinic acid renal scan. A strong correlation between split renal function and split renal volume of the tumor-bearing kidney was observed before and after surgery (r = 0.89, P < 0.001 and r = 0.94, P < 0.001). The preoperative and postoperative split estimated glomerular filtration rate of the operated kidney showed a moderate correlation with split renal function (r = 0.39, P = 0.004 and r = 0.49, P < 0.001). The correlation between reductions in split renal function and split renal volume of the operated kidney (r = 0.87, P < 0.001) was stronger than that between split renal function and percent reduction in split estimated glomerular filtration rate (r = 0.64, P < 0.001). The split renal volume calculated using computed tomography-based renal volumetry had a strong correlation with the split renal function measured using technetium-99m dimercaptosuccinic acid renal scan. Computed tomography-based split renal volume measurement before and after partial nephrectomy can be used as a single modality for anatomical and functional assessment of the tumor-bearing kidney. © 2017 The Japanese Urological Association.
Attribution of declining Western U.S. Snowpack to human effects
Pierce, D.W.; Barnett, T.P.; Hidalgo, H.G.; Das, T.; Bonfils, Celine; Santer, B.D.; Bala, G.; Dettinger, M.D.; Cayan, D.R.; Mirin, A.; Wood, A.W.; Nozawa, T.
2008-01-01
Observations show snowpack has declined across much of the western United States over the period 1950-99. This reduction has important social and economic implications, as water retained in the snowpack from winter storms forms an important part of the hydrological cycle and water supply in the region. A formal model-based detection and attribution (D-A) study of these reductions is performed. The detection variable is the ratio of 1 April snow water equivalent (SWE) to water-year-to-date precipitation (P), chosen to reduce the effect of P variability on the results. Estimates of natural internal climate variability are obtained from 1600 years of two control simulations performed with fully coupled ocean-atmosphere climate models. Estimates of the SWE/P response to anthropogenic greenhouse gases, ozone, and some aerosols are taken from multiple-member ensembles of perturbation experiments run with two models. The D-A shows the observations and anthropogenically forced models have greater SWE/P reductions than can be explained by natural internal climate variability alone. Model-estimated effects of changes in solar and volcanic forcing likewise do not explain the SWE/P reductions. The mean model estimate is that about half of the SWE/P reductions observed in the west from 1950 to 1999 are the result of climate changes forced by anthropogenic greenhouse gases, ozone, and aerosols. ?? 2008 American Meteorological Society.
NASA Astrophysics Data System (ADS)
Rachmawati, Vimala; Khusnul Arif, Didik; Adzkiya, Dieky
2018-03-01
Real-world systems often have a large order, so the mathematical model has many state variables and the computation time grows accordingly. In addition, not all variables are generally known, so estimation is needed for quantities of the system that cannot be measured directly. In this paper, we discuss model reduction and estimation of state variables in a river system to measure the water level. Model reduction approximates a system by one of lower order that, without significant error, has dynamic behaviour similar to the original system. The Singular Perturbation Approximation method is a model reduction method in which all state variables of the system are partitioned into fast and slow modes. The Kalman filter algorithm is then used to estimate the state variables of the stochastic dynamic system, with estimates computed by predicting the state variables from the system dynamics and updating them with measurement data. Kalman filters are applied to both the original and the reduced system. Finally, we compare the state estimation results and the computation times of the original and reduced systems.
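A minimal sketch of the singular perturbation approximation step follows, assuming the state vector is already ordered (e.g., via a balanced realization) so that the trailing block contains the fast modes; those states are set to quasi-steady state and eliminated. The matrices below are illustrative, not a river model.

```python
import numpy as np

# Singular perturbation approximation of a state-space model x' = A x + B u, y = C x,
# assuming the trailing states are the fast modes. Matrices are illustrative.

def spa_reduce(A, B, C, n_slow):
    A11, A12 = A[:n_slow, :n_slow], A[:n_slow, n_slow:]
    A21, A22 = A[n_slow:, :n_slow], A[n_slow:, n_slow:]
    B1, B2 = B[:n_slow], B[n_slow:]
    C1, C2 = C[:, :n_slow], C[:, n_slow:]
    A22_inv = np.linalg.inv(A22)
    Ar = A11 - A12 @ A22_inv @ A21       # set x_fast' = 0 and eliminate x_fast
    Br = B1 - A12 @ A22_inv @ B2
    Cr = C1 - C2 @ A22_inv @ A21
    Dr = -C2 @ A22_inv @ B2              # feedthrough created by the elimination
    return Ar, Br, Cr, Dr

A = np.array([[-0.1, 0.05, 0.0],
              [0.02, -0.2, 0.1],
              [0.0,  0.3, -5.0]])        # third state decays much faster ("fast" mode)
B = np.array([[0.0], [0.1], [1.0]])
C = np.array([[1.0, 0.0, 0.0]])
Ar, Br, Cr, Dr = spa_reduce(A, B, C, n_slow=2)
print(Ar.round(3), Br.round(3), Cr.round(3), Dr.round(3), sep="\n")
```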
A holistic approach to age estimation in refugee children.
Sypek, Scott A; Benson, Jill; Spanner, Kate A; Williams, Jan L
2016-06-01
Many refugee children arriving in Australia have an inaccurately documented date of birth (DOB). A medical assessment of a child's age is often requested when there is a concern that their documented DOB is incorrect. This study's aim was to assess the accuracy of a holistic age assessment tool (AAT) in estimating the age of refugee children newly settled in Australia. A holistic AAT that combines medical and non-medical approaches was used to estimate the ages of 60 refugee children with a known DOB. The tool used four components to assess age: an oral narrative, developmental assessment, anthropometric measures and pubertal assessment. Assessors were blinded to the true age of the child. Correlation coefficients for the actual and estimated age were calculated for the tool overall and for individual components. The correlation coefficient between the actual and estimated age from the AAT was very strong at 0.9802 (boys 0.9748, girls 0.9876). The oral narrative component of the tool performed best (R = 0.9603). Overall, 86.7% of age estimates were within 1 year of the true age. The range of differences was -1.43 to 3.92 years with a standard deviation of 0.77 years (9.24 months). The AAT is a holistic, simple and safe instrument that can be used to estimate age in refugee children, with results comparable with the radiological methods currently used. © 2016 Paediatrics and Child Health Division (The Royal Australasian College of Physicians).
Lafeber, Melvin; Webster, Ruth; Visseren, Frank Lj; Bots, Michiel L; Grobbee, Diederick E; Spiering, W; Rodgers, Anthony
2016-08-01
Recent data indicate that fixed-dose combination (FDC) pills, polypills, can produce sizeable risk factor reductions. There are very few published data on the consistency of the effects of a polypill in different patient populations. It is unclear for example whether the effects of the polypill are mainly driven by the individuals with high individual risk factor levels. The aim of the present study is to examine whether baseline risk factor levels modify the effect of polypill treatment on low-density lipoprotein (LDL)-cholesterol, blood pressure (BP), calculated cardiovascular relative risk reduction and adverse events. This paper describes a post-hoc analysis of a randomised, placebo-controlled trial of a polypill (containing aspirin 75 mg, simvastatin 20 mg, lisinopril 10 mg and hydrochlorothiazide 12.5 mg) in 378 individuals without an indication for any component of the polypill, but who had an estimated five-year risk for cardiovascular disease ≥7.5%. The outcomes considered were effect modification by baseline risk factor levels on change in LDL-cholesterol, systolic BP, calculated cardiovascular relative risk reduction and adverse events. The mean LDL-cholesterol in the polypill group was 0.9 mmol/l (95% confidence interval (CI): 0.8-1.0) lower compared with the placebo group during follow-up. Those with a baseline LDL-cholesterol >3.6 mmol/l achieved a greater absolute LDL-cholesterol reduction with the polypill compared with placebo, than patients with an LDL-cholesterol ≤3.6 mmol/l (-1.1 versus -0.6 mmol/l, respectively). The mean systolic BP was 10 mm Hg (95% CI: 8-12) lower in the polypill group. In participants with a baseline systolic BP >135 mm Hg the polypill resulted in a greater absolute systolic BP reduction with the polypill compared with placebo, than participants with a systolic BP ≤ 135 mm Hg (-12 versus -7 mm Hg, respectively). Calculated from individual risk factor reductions, the mean cardiovascular relative risk reduction was 48% (95% CI: 43-52) in the polypill group. Both baseline LDL-cholesterol and estimated cardiovascular risk were significant modifiers of the estimated cardiovascular relative risk reduction caused by the polypill. Adverse events did not appear to be related to baseline risk factor levels or the estimated cardiovascular risk. This study demonstrated that the effect of a cardiovascular polypill on risk factor levels is modified by the level of these risk factors. Groups defined by baseline LDL-cholesterol or systolic BP had large differences in risk factor reductions but only moderate differences in estimated cardiovascular relative risk reduction, suggesting also that patients with mildly increased risk factor levels but an overall raised cardiovascular risk benefit from being treated with a polypill. © The European Society of Cardiology 2016.
NASA Astrophysics Data System (ADS)
Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.
2012-08-01
The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation at the system level, particularly for early mission phases during which specifications and requirements are not yet crystallised and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. A first step in any program budget is a representative cost estimate, which usually hinges on a particular estimation approach, or methodology. Appropriate selection of specific cost models, methods and tools is therefore paramount, a difficult task given the highly variable nature, scope, and scientific and technical requirements applicable to each program. Numerous methods, models and tools exist. However, new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase, when system specifications are limited but the available research budget needs to be established and defined. Due to their specificity, vehicles such as reusable launchers with a manned capability lack historical data, which means that both the classic heuristic approach, such as parametric cost estimation based on underlying CERs, and the analogy approach are by definition of limited use. This review identifies prominent cost estimation models applied to the space sector, and their underlying cost-driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted. Current approaches which strategically amalgamate various cost estimation strategies, both for formulation and validation of an estimate, and techniques and/or methods to attain representative and justifiable cost estimates are consequently discussed. Ultimately, the aim of the paper is to establish a baseline for development of a non-commercial, low-cost, transparent cost estimation methodology to be applied during very early program research phases at a complete vehicle system level, for largely unprecedented manned launch vehicles in the future. This paper takes the first step to achieving this through the identification, analysis and understanding of established, existing techniques, models, tools and resources relevant within the space sector.
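As a concrete illustration of the parametric approach mentioned above, a cost estimating relationship (CER) is often a power law of a technical driver such as dry mass, fitted to historical data points; the sketch below uses invented data and is not drawn from any model covered in the review.

```python
import numpy as np

# Invented historical data points: system dry mass (kg) vs. development cost (M$).
mass = np.array([150.0, 420.0, 900.0, 2300.0, 5100.0])
cost = np.array([45.0, 95.0, 160.0, 310.0, 540.0])

# Fit a power-law CER, cost = a * mass**b, by least squares in log-log space.
b, log_a = np.polyfit(np.log(mass), np.log(cost), 1)
a = np.exp(log_a)

new_mass = 1200.0  # hypothetical new system
print(f"CER: cost ~= {a:.2f} * mass^{b:.2f}")
print(f"Estimated development cost for {new_mass:.0f} kg: {a * new_mass ** b:.0f} M$")
```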
Benefit-cost estimation for alternative drinking water maximum contaminant levels
NASA Astrophysics Data System (ADS)
Gurian, Patrick L.; Small, Mitchell J.; Lockwood, John R.; Schervish, Mark J.
2001-08-01
A simulation model for estimating compliance behavior and resulting costs at U.S. Community Water Suppliers is developed and applied to the evaluation of a more stringent maximum contaminant level (MCL) for arsenic. Probability distributions of source water arsenic concentrations are simulated using a statistical model conditioned on system location (state) and source water type (surface water or groundwater). This model is fit to two recent national surveys of source waters, then applied with the model explanatory variables for the population of U.S. Community Water Suppliers. Existing treatment types and arsenic removal efficiencies are also simulated. Utilities with finished water arsenic concentrations above the proposed MCL are assumed to select the least cost option compatible with their existing treatment from among 21 available compliance strategies and processes for meeting the standard. Estimated costs and arsenic exposure reductions at individual suppliers are aggregated to estimate the national compliance cost, arsenic exposure reduction, and resulting bladder cancer risk reduction. Uncertainties in the estimates are characterized based on uncertainties in the occurrence model parameters, existing treatment types, treatment removal efficiencies, costs, and the bladder cancer dose-response function for arsenic.
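The overall structure of such a national compliance simulation can be illustrated with a drastically simplified Monte Carlo sketch: source-water concentrations are drawn from an assumed distribution, systems above the MCL are assigned a treatment cost and removal efficiency, and results are aggregated. All distribution parameters, costs and removal rates below are invented placeholders, not values from the model described above.

```python
import numpy as np

rng = np.random.default_rng(0)
n_systems = 10_000
mcl = 10.0  # candidate maximum contaminant level, ug/L

# Placeholder lognormal source-water arsenic concentrations (ug/L).
concentration = rng.lognormal(mean=1.0, sigma=1.2, size=n_systems)

# Systems exceeding the MCL adopt treatment (placeholder cost and 90% removal).
exceeds = concentration > mcl
annual_cost = np.where(exceeds, 50_000.0, 0.0)                      # $/system-year
finished = np.where(exceeds, concentration * 0.10, concentration)   # after treatment

print(f"systems exceeding MCL: {exceeds.mean():.1%}")
print(f"national compliance cost: ${annual_cost.sum() / 1e6:.1f} M/year")
print(f"mean exposure reduction: {(concentration - finished).mean():.2f} ug/L")
```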
Sonko, Bakary J; Miller, Leland V; Jones, Richard H; Donnelly, Joseph E; Jacobsen, Dennis J; Hill, James O; Fennessey, Paul V
2003-12-15
Reducing water to hydrogen gas with zinc or uranium metal for determining the D/H ratio is both tedious and time consuming. This has forced most energy metabolism investigators to use the "two-point" technique instead of the "multi-point" technique for estimating total energy expenditure (TEE). Recently, we purchased a new platinum (Pt)-equilibration system that significantly reduces both the time and labor required for D/H ratio determination. In this study, we compared TEE obtained from nine overweight but healthy subjects, estimated using the traditional Zn-reduction method, to that obtained from the new Pt-equilibration system. Rate constants, pool spaces, and CO2 production rates obtained from the two methodologies were not significantly different. Correlation analysis demonstrated that TEEs estimated using the two methods were significantly correlated (r=0.925, p=0.0001). Sample equilibration time was reduced by 66% compared to that of similar methods. The data demonstrated that the Zn-reduction method could be replaced by the Pt-equilibration method when TEE was estimated using the "multi-point" technique. Furthermore, D equilibration time was significantly reduced.
Identification of differences in health impact modelling of salt reduction
Geleijnse, Johanna M.; van Raaij, Joop M. A.; Cappuccio, Francesco P.; Cobiac, Linda C.; Scarborough, Peter; Nusselder, Wilma J.; Jaccard, Abbygail; Boshuizen, Hendriek C.
2017-01-01
We examined whether specific input data and assumptions explain outcome differences in otherwise comparable health impact assessment models. Seven population health models estimating the impact of salt reduction on morbidity and mortality in western populations were compared on four sets of key features, their underlying assumptions and input data. Next, assumptions and input data were varied one by one in a default approach (the DYNAMO-HIA model) to examine how they influence the estimated health impact. Major differences in outcome were related to the size and shape of the dose-response relation between salt and blood pressure and between blood pressure and disease. Modifying the effect sizes in the salt-to-health association resulted in the largest change in health impact estimates (33% lower), whereas other changes had less influence. Differences in health impact assessment model structure and input data may affect the health impact estimate. Therefore, clearly defined assumptions and transparent reporting for the different models are crucial. However, the estimated impact of salt reduction was substantial in all of the models used, emphasizing the need for public health actions. PMID:29182636
Hans-Erik Andersen; Jacob Strunk; Hailemariam Temesgen
2011-01-01
Airborne laser scanning, collected in a sampling mode, has the potential to be a valuable tool for estimating the biomass resources available to support bioenergy production in rural communities of interior Alaska. In this study, we present a methodology for estimating forest biomass over a 201,226-ha area (of which 163,913 ha are forested) in the upper Tanana valley...
The Acquisition Cost-Estimating Workforce. Census and Characteristics
2009-01-01
Abbreviations: AAC, Air Armament Center; ACAT, acquisition category; ACEIT, Automated Cost Estimating Integrated Tools; AF, Air Force; AFB, Air Force Base; AFCAA, Air... [A table of training sources (ACEIT, Tecolote training, other, no training) appears here in the original, but its columns are not recoverable from the extracted text.] Respondents reported training from other sources, including AFIT, ACEIT, or the contracting agency that employed them. The remaining 29 percent reported having received no training.
König, S; Tsehay, F; Sitzenstock, F; von Borstel, U U; Schmutz, M; Preisinger, R; Simianer, H
2010-04-01
Due to consistent increases of inbreeding of on average 0.95% per generation in layer populations, selection tools should consider both genetic gain and genetic relationships in the long term. The optimum genetic contribution theory using official estimated breeding values for egg production was applied for 3 different lines of a layer breeding program to find the optimal allocations of hens and sires. Constraints in different scenarios encompassed restrictions related to additive genetic relationships, the increase of inbreeding, the number of selected sires and hens, and the number of selected offspring per mating. All these constraints enabled higher genetic gain up to 10.9% at the same level of additive genetic relationships or in lower relationships at the same gain when compared with conventional selection schemes ignoring relationships. Increases of inbreeding and genetic gain were associated with the number of selected sires. For the lowest level of the allowed average relationship at 10%, the optimal number of sires was 70 and the estimated breeding value for egg production of the selected group was 127.9. At the highest relationship constraint (16%), the optimal number of sires decreased to 15, and the average genetic value increased to 139.7. Contributions from selected sires and hens were used to develop specific mating plans to minimize inbreeding in the following generation by applying a simulated annealing algorithm. The additional reduction of average additive genetic relationships for matings was up to 44.9%. An innovative deterministic approach to estimate kinship coefficients between and within defined selection groups based on gene flow theory was applied to compare increases of inbreeding from random matings with layer populations undergoing selection. Large differences in rates of inbreeding were found, and they underline the necessity to establish selection tools controlling long-term relationships. Furthermore, it was suggested to use optimum genetic contribution theory for conservation schemes or, for example, the experimental line in our study.
Influence of the watermark in immersion lithography process
NASA Astrophysics Data System (ADS)
Kawamura, Daisuke; Takeishi, Tomoyuki; Sho, Koutarou; Matsunaga, Kentarou; Shibata, Naofumi; Ozawa, Kaoru; Shimura, Satoru; Kyoda, Hideharu; Kawasaki, Tetsu; Ishida, Seiki; Toshima, Takayuki; Oonishi, Yasunobu; Ito, Shinichi
2005-05-01
In liquid immersion lithography, cover material (C/M) films have been discussed as a way to reduce elution of resist components into the fluid. With fluctuations in the exposure tool or resist process, a water droplet can remain on the wafer and form a watermark (W/M). This paper discusses the influence of the W/M on resist patterns, the W/M formation process, and the reduction of pattern defects caused by the W/M. Resist patterns within and around intentionally formed W/Ms were observed in three cases: without C/M, with TOK TSP-3A, and with an alkali-soluble C/M. In all C/M cases, the pattern defects were T-topped shapes. The reduction of pattern defects caused by water droplets was examined. Remaining droplets were found to create defects, so droplets should be removed before they dry, and/or the resulting defects should be removed; however, new drying techniques and/or units will be needed to avoid W/M formation entirely. To understand the W/M formation mechanism, droplets were observed through the drying step and the experiment was reproduced by simulation. If the maximum drying time of a droplet in an immersion exposure tool is estimated at 90 seconds, a watermark whose volume and diameter are less than 0.02 uL and 350 um will dry and create a pattern defect; this threshold becomes larger as the wafer speed increases. From the results and speculations in this work, it is considered difficult to develop a single C/M film that produces no pattern defects from remaining water droplets.
Waterhammer Transient Simulation and Model Anchoring for the Robotic Lunar Lander Propulsion System
NASA Technical Reports Server (NTRS)
Stein, William B.; Trinh, Huu P.; Reynolds, Michael E.; Sharp, David J.
2011-01-01
Waterhammer transients have the potential to adversely impact propulsion system design if not properly addressed. Waterhammer can potentially lead to damage of system plumbing and components. Multi-thruster propulsion systems also develop constructive/destructive wave interference, which becomes difficult to predict without detailed models. Therefore, it is important to sufficiently characterize propulsion system waterhammer in order to develop a robust design with minimal impact to other systems. A risk reduction activity was performed at Marshall Space Flight Center to develop a tool for estimating waterhammer through the use of anchored simulation for the Robotic Lunar Lander (RLL) propulsion system design. Testing was performed to simulate waterhammer surges due to rapid valve closure and consisted of twenty-two series of waterhammer tests, resulting in more than 300 valve actuations. These tests were performed using different valve actuation schemes and three system pressures. Data from the valve characterization tests were used to anchor the models, which employed MSC.Software EASY5 v.2010 to model transient fluid phenomena by using transient forms of mass and energy conservation. The anchoring process was performed by comparing initial model results to experimental data and then iterating the model input to match the simulation results with the experimental data. The models provide good correlation with experimental results, supporting the use of EASY5 as a tool to model fluid transients and providing a baseline for future RLL system modeling. This paper addresses tasks performed during the waterhammer risk reduction activity for the RLL propulsion system. The problem of waterhammer simulation anchoring as applied to the RLL system is discussed, with results from the corresponding experimental valve tests. Important factors for waterhammer mitigation are discussed along with potential design impacts to the RLL propulsion system.
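A quick first-order check on surge magnitudes of the kind these anchored simulations resolve in detail is the Joukowsky relation for sudden valve closure, delta-P = rho * a * delta-v; the fluid properties and velocity change below are placeholders, not RLL system values.

```python
# Joukowsky estimate of the peak pressure surge for an instantaneous valve closure.
rho = 870.0   # fluid density, kg/m^3 (placeholder value)
a = 1200.0    # pressure-wave speed in the feed line, m/s (placeholder value)
dv = 3.0      # flow velocity change at the valve, m/s (placeholder value)

delta_p = rho * a * dv   # Pa
print(f"Joukowsky surge: {delta_p / 1e5:.1f} bar")
```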
Comparison of Predictive Modeling Methods of Aircraft Landing Speed
NASA Technical Reports Server (NTRS)
Diallo, Ousmane H.
2012-01-01
Expected increases in air traffic demand have stimulated the development of air traffic control tools intended to assist the air traffic controller in accurately and precisely spacing aircraft landing at congested airports. Such tools will require an accurate landing-speed prediction to increase throughput while decreasing the controller interventions necessary to avoid separation violations. There are many practical challenges to developing an accurate landing-speed model that has acceptable prediction errors. This paper discusses the development of a near-term implementation, using readily available information, to estimate/model final approach speed from the top of the descent phase of flight to the landing runway. As a first approach, all variables found to contribute directly to the landing-speed prediction model are used to build a multi-regression technique of the response surface equation (RSE). Data obtained from operations of a major airline's passenger transport aircraft type at Dallas/Fort Worth International Airport are used to predict the landing speed. The approach was promising because it decreased the standard deviation of the landing-speed error prediction by at least 18% from the standard deviation of the baseline error, depending on the gust condition at the airport. However, when the number of variables is reduced to those most likely obtainable at other major airports, the RSE model shows little improvement over the existing methods. Consequently, a neural network that relies on a nonlinear regression technique is utilized as an alternative modeling approach. For the reduced-variable cases, the standard deviation of the neural network model's errors represents over a 5% reduction compared to the RSE model errors, and at least a 10% reduction over the baseline predicted landing-speed error standard deviation. Overall, the constructed models predict the landing speed more accurately and precisely than the current state-of-the-art.
RSA and registries: the quest for phased introduction of new implants.
Nelissen, Rob G H H; Pijls, Bart G; Kärrholm, Johan; Malchau, Henrik; Nieuwenhuijse, Marc J; Valstar, Edward R
2011-12-21
Although the overall survival of knee and hip prostheses at ten years averages 90%, recent problems with several hip and knee prostheses have illustrated that the orthopaedic community, industry, and regulators can still further improve patient safety. Given the early predictive properties of roentgen stereophotogrammetric analysis (RSA) and the meticulous follow-up of national joint registries, these two methods are ideal tools for such a phased clinical introduction. In this paper, we elaborate on the predictive power of RSA within a two-year follow-up after arthroplasty and its relationship to national joint registries. The association between RSA prosthesis-migration data and registry data is evaluated. The five-year rate of revision of RSA-tested total knee replacements was compared with that of non-RSA-tested total knee replacements. Data were extracted from the published results of the national joint registries of Sweden, Australia, and New Zealand. There was a 22% to 35% reduction in the number of revisions of RSA-tested total knee replacements as compared with non-RSA-tested total knee replacements in the national joint registries. Assuming that the total cost of total knee arthroplasty is $37,000 in the United States, a 22% to 35% reduction in the number of revisions (currently close to 55,000 annually) could lead to an estimated annual savings of over $400 million to the health-care system. The phased clinical introduction of new prostheses with two-year RSA results as a qualitative tool could lead to better patient care and could reduce the costs associated with revision total knee arthroplasty. Follow-up in registries is necessary to substantiate these results and to improve post-market surveillance.
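The savings figure quoted above follows directly from the abstract's own inputs (55,000 annual revisions, a $37,000 cost per arthroplasty, and a 22% to 35% reduction); a quick check:

```python
revisions_per_year = 55_000
cost_per_procedure = 37_000   # USD, total cost of a total knee arthroplasty (from the abstract)

for reduction in (0.22, 0.35):
    savings = revisions_per_year * reduction * cost_per_procedure
    print(f"{reduction:.0%} fewer revisions -> ${savings / 1e6:.0f} M/year saved")
```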
Association Between Connecticut's Permit-to-Purchase Handgun Law and Homicides.
Rudolph, Kara E; Stuart, Elizabeth A; Vernick, Jon S; Webster, Daniel W
2015-08-01
We sought to estimate the effect of Connecticut's implementation of a handgun permit-to-purchase law in October 1995 on subsequent homicides. Using the synthetic control method, we compared Connecticut's homicide rates after the law's implementation to rates we would have expected had the law not been implemented. To estimate the counterfactual, we used longitudinal data from a weighted combination of comparison states identified based on the ability of their prelaw homicide trends and covariates to predict prelaw homicide trends in Connecticut. We estimated that the law was associated with a 40% reduction in Connecticut's firearm homicide rates during the first 10 years that the law was in place. By contrast, there was no evidence for a reduction in nonfirearm homicides. Consistent with prior research, this study demonstrated that Connecticut's handgun permit-to-purchase law was associated with a subsequent reduction in homicide rates. As would be expected if the law drove the reduction, the policy's effects were only evident for homicides committed with firearms.
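A minimal sketch of the synthetic control idea used above: nonnegative weights over comparison ("donor") states, summing to one, are chosen so the weighted combination reproduces the treated state's pre-law homicide trend, and the weighted post-law series serves as the counterfactual. The data below are random placeholders, not the study's homicide rates.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
pre_years, post_years, n_donors = 10, 10, 8

donors_pre = rng.normal(5.0, 1.0, size=(pre_years, n_donors))        # placeholder rates
treated_pre = donors_pre[:, :3].mean(axis=1) + rng.normal(0, 0.1, pre_years)

def prelaw_mismatch(w):
    return np.sum((treated_pre - donors_pre @ w) ** 2)

w0 = np.full(n_donors, 1.0 / n_donors)
fit = minimize(prelaw_mismatch, w0, bounds=[(0.0, 1.0)] * n_donors,
               constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
weights = fit.x

donors_post = rng.normal(5.0, 1.0, size=(post_years, n_donors))       # placeholder rates
synthetic_post = donors_post @ weights    # counterfactual trajectory for the treated state
print(weights.round(2), synthetic_post.round(2))
```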
NASA Astrophysics Data System (ADS)
Bradley, Larry; Sipocz, Brigitta; Robitaille, Thomas; Tollerud, Erik; Deil, Christoph; Vinícius, Zè; Barbary, Kyle; Günther, Hans Moritz; Bostroem, Azalee; Droettboom, Michael; Bray, Erik; Bratholm, Lars Andersen; Pickering, T. E.; Craig, Matt; Pascual, Sergio; Greco, Johnny; Donath, Axel; Kerzendorf, Wolfgang; Littlefair, Stuart; Barentsen, Geert; D'Eugenio, Francesco; Weaver, Benjamin Alan
2016-09-01
Photutils provides tools for detecting and performing photometry of astronomical sources. It can estimate the background and background rms in astronomical images, detect sources in astronomical images, estimate morphological parameters of those sources (e.g., centroid and shape parameters), and perform aperture and PSF photometry. Written in Python, it is an affiliated package of Astropy (ascl:1304.002).
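A short usage sketch in the spirit of the Photutils documentation (assuming a recent Photutils/Astropy installation); the image below is synthetic, with one artificial source.

```python
import numpy as np
from astropy.stats import sigma_clipped_stats
from photutils.detection import DAOStarFinder
from photutils.aperture import CircularAperture, aperture_photometry

# Synthetic image: flat noisy background plus one Gaussian-shaped source.
rng = np.random.default_rng(0)
image = rng.normal(100.0, 5.0, (200, 200))
yy, xx = np.mgrid[0:200, 0:200]
image += 300.0 * np.exp(-((xx - 60.0) ** 2 + (yy - 50.0) ** 2) / (2 * 2.0 ** 2))

mean, median, std = sigma_clipped_stats(image, sigma=3.0)        # background estimate
sources = DAOStarFinder(fwhm=3.0, threshold=5.0 * std)(image - median)

if sources is not None:
    positions = np.transpose((sources["xcentroid"], sources["ycentroid"]))
    apertures = CircularAperture(positions, r=4.0)
    print(aperture_photometry(image - median, apertures))
```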
Lubow, Bruce C; Ransom, Jason I
2016-01-01
Reliably estimating wildlife abundance is fundamental to effective management. Aerial surveys are one of the only spatially robust tools for estimating large mammal populations, but statistical sampling methods are required to address detection biases that affect accuracy and precision of the estimates. Although various methods for correcting aerial survey bias are employed on large mammal species around the world, these have rarely been rigorously validated. Several populations of feral horses (Equus caballus) in the western United States have been intensively studied, resulting in identification of all unique individuals. This provided a rare opportunity to test aerial survey bias correction on populations of known abundance. We hypothesized that a hybrid method combining simultaneous double-observer and sightability bias correction techniques would accurately estimate abundance. We validated this integrated technique on populations of known size and also on a pair of surveys before and after a known number was removed. Our analysis identified several covariates across the surveys that explained and corrected biases in the estimates. All six tests on known populations produced estimates with deviations from the known value ranging from -8.5% to +13.7% and <0.7 standard errors. Precision varied widely, from 6.1% CV to 25.0% CV. In contrast, the pair of surveys conducted around a known management removal produced an estimated change in population between the surveys that was significantly larger than the known reduction. Although the deviation was only 9.1%, the precision estimate (CV = 1.6%) may have been artificially low. It was apparent that use of a helicopter in those surveys perturbed the horses, introducing detection error and heterogeneity in a manner that could not be corrected by our statistical models. Our results validate the hybrid method, highlight its potentially broad applicability, identify some limitations, and provide insight and guidance for improving survey designs.
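The double-observer component of the hybrid method above can be illustrated, in simplified form and without the sightability covariates used in the study, with a Chapman-corrected Lincoln-Petersen estimate of the number of horse groups present; the counts below are hypothetical.

```python
# Hypothetical double-observer counts of horse groups from one survey flight.
seen_by_1 = 42      # groups detected by observer 1
seen_by_2 = 38      # groups detected by observer 2
seen_by_both = 30   # groups detected by both observers

# Chapman's bias-corrected Lincoln-Petersen estimator of the total groups present.
n_hat = (seen_by_1 + 1) * (seen_by_2 + 1) / (seen_by_both + 1) - 1

p1 = seen_by_both / seen_by_2   # estimated detection probability of observer 1
p2 = seen_by_both / seen_by_1   # estimated detection probability of observer 2
print(f"estimated groups present: {n_hat:.1f}; p1 = {p1:.2f}, p2 = {p2:.2f}")
```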
Determining Level of Service for Multilane Median Opening Zone
NASA Astrophysics Data System (ADS)
Ali, Paydar; Johnnie, Ben-Edigbe
2017-08-01
The road system is a capital-intensive investment, requiring a thorough schematic framework and funding. Roads are built to provide an intrinsic quality of service that satisfies road users. Roads that provide good service are expected to deliver operational performance consistent with their design specifications. Level-of-service and cumulative percentile speed distribution methods have been used in previous studies to estimate the quality of multilane highway service. Whilst the level-of-service approach relies on a speed/flow curve, the cumulative percentile speed distribution is based solely on speed. These estimation methods were used in studies carried out in Johor, Malaysia. The aim of the studies was to ascertain the extent of speed reduction caused by midblock U-turn facilities and to verify which estimation method is more reliable. At selected sites, road segments for both directional flows were divided into free-flow and midblock zones. Traffic volume, speed and vehicle type data for each zone were collected continuously for six weeks. Both estimation methods confirmed that midblock U-turn facilities cause speed reduction. However, the level-of-service method suggested that the quality of service would improve from level F to E or D in the midblock zone in spite of the speed reduction: level of service was responding to the traffic volume reduction at the midblock U-turn facility, not to the travel speed reduction. The studies concluded that since level of service is more responsive to traffic volume reduction than to travel speed, it cannot be solely relied upon when assessing the quality of multilane highway service.
Predicting Operator Execution Times Using CogTool
NASA Technical Reports Server (NTRS)
Santiago-Espada, Yamira; Latorella, Kara A.
2013-01-01
Researchers and developers of NextGen systems can use predictive human performance modeling tools as an initial approach to obtain skilled user performance times analytically, before system testing with users. This paper describes CogTool models for a two-pilot crew executing two different types of datalink clearance acceptance tasks on two different simulation platforms. The CogTool time estimates for accepting and executing Required Time of Arrival and Interval Management clearances were compared to empirical data observed in videotapes and registered in simulation files. Results indicate no statistically significant difference between the empirical data and the CogTool predictions. A population comparison test found no significant differences between the CogTool estimates and the empirical execution times for any of the four test conditions. We discuss modeling caveats and considerations for applying CogTool to crew performance modeling in advanced cockpit environments.
Mammographic compression after breast conserving therapy: Controlling pressure instead of force
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groot, J. E. de, E-mail: jerry.degroot@sigmascreening.com; Branderhorst, W.; Grimbergen, C. A.
Purpose: X-ray mammography is the primary tool for early detection of breast cancer and for follow-up after breast conserving therapy (BCT). BCT-treated breasts are smaller, less elastic, and more sensitive to pain. Instead of the current force-controlled approach of applying the same force to each breast, pressure-controlled protocols aim to improve standardization in terms of physiology by taking breast contact area and inelasticity into account. The purpose of this study is to estimate the potential for pressure protocols to reduce discomfort and pain, particularly the number of severe pain complaints for BCT-treated breasts. Methods: A prospective observational study including 58 women having one BCT-treated breast and one untreated nonsymptomatic breast, following our hospital's 18 decanewton (daN) compression protocol was performed. Breast thickness, applied force, contact area, mean pressure, breast volume, and inelasticity (mean E-modulus) were statistically compared between the within-women breast pairs, and data were used as predictors for severe pain, i.e., scores 7 and higher on an 11-point Numerical Rating Scale. Curve-fitting models were used to estimate how pressure-controlled protocols affect breast thickness, compression force, and pain experience. Results: BCT-treated breasts had on average 27% smaller contact areas, 30% lower elasticity, and 30% higher pain scores than untreated breasts (all p < 0.001). Contact area was the strongest predictor for severe pain (p < 0.01). Since BCT treatment is associated with an average 0.36 dm² decrease in contact area, as well as increased pain sensitivity, BCT breasts had on average 5.3 times higher odds for severe pain than untreated breasts. Model estimations for a pressure-controlled protocol with a 10 kPa target pressure, which is below normal arterial pressure, suggest an average 26% (range 10%–36%) reduction in pain score, and an average 77% (range 46%–95%) reduction of the odds for severe pain. The estimated increase in thickness is +6.4% for BCT breasts. Conclusions: After BCT, women have hardly any choice in avoiding an annual follow-up mammogram. Model estimations show that a 10 kPa pressure-controlled protocol has the potential to reduce pain, and severe pain in particular, for these women. The results highly motivate conducting further research in larger subject groups.
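The arithmetic behind a pressure-controlled protocol is simply pressure = force / contact area, so a fixed 18 daN force produces a higher pressure on the smaller contact area of a BCT-treated breast; the contact areas below are illustrative values chosen only to show the unit conversions.

```python
def mean_pressure_kpa(force_daN, contact_area_dm2):
    """Mean compression pressure (kPa) from applied force (daN) and contact area (dm^2)."""
    return (force_daN * 10.0) / (contact_area_dm2 * 0.01) / 1000.0

def force_for_target_daN(target_kpa, contact_area_dm2):
    """Force (daN) needed to reach a target pressure (kPa) on a given contact area."""
    return target_kpa * 1000.0 * (contact_area_dm2 * 0.01) / 10.0

# Illustrative contact areas (dm^2): untreated breast vs. a BCT-treated breast
# whose contact area is 0.36 dm^2 smaller, as reported above.
for label, area in [("untreated", 1.30), ("BCT-treated", 1.30 - 0.36)]:
    print(f"{label}: 18 daN -> {mean_pressure_kpa(18.0, area):.1f} kPa; "
          f"10 kPa target needs {force_for_target_daN(10.0, area):.1f} daN")
```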
Takeuchi, Hiroyoshi; Suzuki, Takefumi; Bies, Robert R; Remington, Gary; Watanabe, Koichiro; Mimura, Masaru; Uchida, Hiroyuki
2014-11-01
While acute-phase antipsychotic response has been attributed to 65%-80% dopamine D₂ receptor blockade, the degree of occupancy for relapse prevention in the maintenance treatment of schizophrenia remains unknown. In this secondary study of an open-label, 28-week, randomized, controlled trial conducted between April 2009 and August 2011, clinically stable patients with schizophrenia (DSM-IV) treated with risperidone or olanzapine were randomly assigned to the reduction group (dose reduced by 50%) or maintenance group (dose kept constant). Plasma antipsychotic concentrations at peak and trough before and after dose reduction were estimated with population pharmacokinetic techniques, using 2 collected plasma samples. Corresponding dopamine D₂ occupancy levels were then estimated using the model we developed. Relapse was defined as worsening in 4 Positive and Negative Syndrome Scale-Positive subscale items: delusion, conceptual disorganization, hallucinatory behavior, and suspiciousness. Plasma antipsychotic concentrations were available for 16 and 15 patients in the reduction and maintenance groups, respectively. Estimated dopamine D₂ occupancy (mean ± SD) decreased following dose reduction from 75.6% ± 4.9% to 66.8% ± 6.4% at peak and 72.3% ± 5.7% to 62.0% ± 6.8% at trough. In the reduction group, 10 patients (62.5%) did not demonstrate continuous D₂ receptor blockade above 65% (ie, < 65% at trough) after dose reduction; furthermore, 7 patients (43.8%) did not achieve a threshold of 65% occupancy even at peak. Nonetheless, only 1 patient met our relapse criteria after dose reduction during the 6 months of the study. The results suggest that the therapeutic threshold regarding dopamine D₂ occupancy may be lower for those who are stable in antipsychotic maintenance versus acute-phase treatment. Positron emission tomography studies are warranted to further test our preliminary findings. UMIN Clinical Trials Registry identifier: UMIN000001834. © Copyright 2014 Physicians Postgraduate Press, Inc.
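The link between plasma antipsychotic concentration and D₂ occupancy is commonly described with a hyperbolic (Emax-type) relationship, occupancy = C / (C + EC50); the sketch below uses an invented EC50 and is not the population pharmacokinetic model developed by the authors.

```python
def d2_occupancy(concentration, ec50):
    """Hyperbolic (Emax) model: fraction of dopamine D2 receptors occupied."""
    return concentration / (concentration + ec50)

ec50 = 10.0  # ng/mL, invented value for illustration only
for label, conc in [("peak, full dose", 35.0), ("trough, half dose", 15.0)]:
    print(f"{label}: {100.0 * d2_occupancy(conc, ec50):.1f}% occupancy")
```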
Wenzel, Tom
2013-07-01
The National Highway Traffic Safety Administration (NHTSA) recently updated its 2003 and 2010 logistic regression analyses of the effect of a reduction in light-duty vehicle mass on US fatality risk per vehicle mile traveled (VMT). The current NHTSA analysis is the most thorough investigation of this issue to date. LBNL's assessment of the analysis indicates that the estimated effect of mass reduction on risk is smaller than in the previous studies, and statistically non-significant for all but the lightest cars. The effects three recent trends in vehicle designs and technologies have on societal fatality risk per VMT are estimated, and whether these changes might affect the relationship between vehicle mass and fatality risk in the future. Side airbags are found to reduce fatality risk in cars, but not necessarily light trucks or CUVs/minivans, struck in the side by another light-duty vehicle; reducing the number of fatalities in cars struck in the side is predicted to reduce the estimated detrimental effect of footprint reduction, but increase the detrimental effect of mass reduction, in cars on societal fatality risk. Better alignment of light truck bumpers with those of other vehicles appears to result in a statistically significant reduction in risk imposed on car occupants; however, reducing this type of fatality will likely have little impact on the estimated effect of mass or footprint reduction on risk. Finally, shifting light truck drivers into safer, car-based vehicles, such as sedans, CUVs, and minivans, would result in larger reductions in societal fatalities than expected from even substantial reductions in the masses of light trucks. A strategy of shifting drivers from truck-based to car-based vehicles would reduce fuel use and greenhouse gas emissions, while improving societal safety. Copyright © 2013 Elsevier Ltd. All rights reserved.
Granato, Gregory E.
2014-01-01
The U.S. Geological Survey (USGS) developed the Stochastic Empirical Loading and Dilution Model (SELDM) in cooperation with the Federal Highway Administration (FHWA) to indicate the risk for stormwater concentrations, flows, and loads to be above user-selected water-quality goals and the potential effectiveness of mitigation measures to reduce such risks. SELDM models the potential effect of mitigation measures by using Monte Carlo methods with statistics that approximate the net effects of structural and nonstructural best management practices (BMPs). In this report, structural BMPs are defined as the components of the drainage pathway between the source of runoff and a stormwater discharge location that affect the volume, timing, or quality of runoff. SELDM uses a simple stochastic statistical model of BMP performance to develop planning-level estimates of runoff-event characteristics. This statistical approach can be used to represent a single BMP or an assemblage of BMPs. The SELDM BMP-treatment module has provisions for stochastic modeling of three stormwater treatments: volume reduction, hydrograph extension, and water-quality treatment. In SELDM, these three treatment variables are modeled by using the trapezoidal distribution and the rank correlation with the associated highway-runoff variables. This report describes methods for calculating the trapezoidal-distribution statistics and rank correlation coefficients for stochastic modeling of volume reduction, hydrograph extension, and water-quality treatment by structural stormwater BMPs and provides the calculated values for these variables. This report also provides robust methods for estimating the minimum irreducible concentration (MIC), which is the lowest expected effluent concentration from a particular BMP site or a class of BMPs. These statistics are different from the statistics commonly used to characterize or compare BMPs. They are designed to provide a stochastic transfer function to approximate the quantity, duration, and quality of BMP effluent given the associated inflow values for a population of storm events. A database application and several spreadsheet tools are included in the digital media accompanying this report for further documentation of methods and for future use. In this study, analyses were done with data extracted from a modified copy of the January 2012 version of International Stormwater Best Management Practices Database, designated herein as the January 2012a version. Statistics for volume reduction, hydrograph extension, and water-quality treatment were developed with selected data. Sufficient data were available to estimate statistics for 5 to 10 BMP categories by using data from 40 to more than 165 monitoring sites. Water-quality treatment statistics were developed for 13 runoff-quality constituents commonly measured in highway and urban runoff studies including turbidity, sediment and solids; nutrients; total metals; organic carbon; and fecal coliforms. The medians of the best-fit statistics for each category were selected to construct generalized cumulative distribution functions for the three treatment variables. For volume reduction and hydrograph extension, interpretation of available data indicates that selection of a Spearman’s rho value that is the average of the median and maximum values for the BMP category may help generate realistic simulation results in SELDM. The median rho value may be selected to help generate realistic simulation results for water-quality treatment variables. 
MIC statistics were developed for 12 runoff-quality constituents commonly measured in highway and urban runoff studies by using data from 11 BMP categories and more than 167 monitoring sites. Four statistical techniques were applied for estimating MIC values with monitoring data from each site. These techniques produce a range of lower-bound estimates for each site. Four MIC estimators are proposed as alternatives for selecting a value from among the estimates from multiple sites. Correlation analysis indicates that the MIC estimates from multiple sites were weakly correlated with the geometric mean of inflow values, which indicates that there may be a qualitative or semiquantitative link between the inflow quality and the MIC. Correlations probably are weak because the MIC is influenced by the inflow water quality and the capability of each individual BMP site to reduce inflow concentrations.
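The stochastic transfer function described above can be illustrated with a small sketch that draws BMP volume-reduction fractions from a trapezoidal distribution (assuming a recent SciPy, which provides scipy.stats.trapezoid); the bounds and modes below are hypothetical, not the report's fitted statistics.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical trapezoidal statistics for the fraction of inflow volume retained by a BMP:
# lower bound, lower mode, upper mode, upper bound.
lo, mode_lo, mode_hi, hi = 0.0, 0.2, 0.5, 0.9
scale = hi - lo

volume_reduction = stats.trapezoid.rvs(
    c=(mode_lo - lo) / scale, d=(mode_hi - lo) / scale,
    loc=lo, scale=scale, size=10_000, random_state=rng)

print(f"median volume reduction: {np.median(volume_reduction):.2f}")
print(f"90th percentile: {np.percentile(volume_reduction, 90):.2f}")
```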
Stated Preference Survey Estimating the Willingness to Pay ...
A national stated preference survey designed to elicit household willingness to pay for reductions in impinged and entrained fish at cooling water intake structures. The survey was conducted to improve the estimation of environmental benefits.
The EPA Control Strategy Tool (CoST) is a software tool for projecting potential future control scenarios, their effects on emissions and estimated costs. This tool uses the NEI and the Control Measures Dataset as key inputs. CoST outputs are projections of future control scenarios.
ERIC Educational Resources Information Center
Ravia, Silvana; Gamenara, Daniela; Schapiro, Valeria; Bellomo, Ana; Adum, Jorge; Seoane, Gustavo; Gonzalez, David
2006-01-01
The use of biocatalysis and biotransformations are important tools in green chemistry. The enantioselective reduction of a ketone by crude plant parts, using carrot ("Daucus carota") as the reducing agent is presented. The experiment introduces an example of a green chemistry procedure that can be tailored to fit in a regular laboratory session.…
Community-Focused Exposure and Risk Screening Tool (C-FERST): Introduction and Demonstration
Public Need: Communities and decision makers are concerned about where they live, work, and play. C-FERST is a user-friendly tool that helps: Identify environmental issues in communities; Learn about these issues; Explore exposure and risk reduction options.
Mechanism-Based FE Simulation of Tool Wear in Diamond Drilling of SiCp/Al Composites.
Xiang, Junfeng; Pang, Siqin; Xie, Lijing; Gao, Feinong; Hu, Xin; Yi, Jie; Hu, Fang
2018-02-07
The aim of this work is to analyze the micro mechanisms underlying the wear of macroscale tools during diamond machining of SiCp/Al6063 composites and to develop a mechanism-based diamond wear model in relation to the dominant wear behaviors. During drilling of high volume fraction SiCp/Al6063 composites containing Cu, the dominant wear mechanisms of the diamond tool involve thermodynamically activated physicochemical wear, due to diamond-graphite transformation catalyzed by Cu in an air atmosphere, and mechanically driven abrasive wear, due to high-frequency scraping of hard SiC reinforcement on the tool surface. An analytical diamond wear model, coupling the Usui abrasive wear model and an Arrhenius extended graphitization wear model, was proposed and implemented through a user-defined subroutine for tool wear estimates. Tool wear estimation in diamond drilling of SiCp/Al6063 composites was achieved by incorporating the combined abrasive-chemical tool wear subroutine into a coupled thermomechanical FE model of 3D drilling. The developed drilling FE model for reproducing diamond tool wear was validated for feasibility and reliability by comparing numerically simulated tool wear morphology with experimentally observed results after drilling a hole using brazed polycrystalline diamond (PCD) and chemical vapor deposition (CVD) diamond coated tools. A fairly good agreement of experimental and simulated results in cutting forces, chip and tool wear morphologies demonstrates that the developed 3D drilling FE model, combined with a subroutine for diamond tool wear estimation, can provide a more accurate analysis not only of cutting forces and chip shape but also of tool wear behavior during drilling of SiCp/Al6063 composites. Once validated and calibrated, the developed diamond tool wear model, in conjunction with other machining FE models, can be easily extended to the investigation of tool wear evolution with various diamond tool geometries and other machining processes in cutting different workpiece materials.
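The combined formulation described above can be sketched as the sum of a mechanically driven Usui-type abrasive term and a thermally activated Arrhenius-type graphitization term; the coefficients below are placeholders for illustration, not the calibrated values from this work.

```python
import math

def diamond_wear_rate(sigma_n, v_s, temperature_k,
                      a_usui=1.0e-9, b_usui=3000.0,             # placeholder Usui constants
                      a_graph=5.0e-3, e_a=2.0e5, r_gas=8.314):  # placeholder Arrhenius constants
    """Combined abrasive (Usui) + thermochemical (Arrhenius) wear rate.

    sigma_n: normal contact stress (Pa); v_s: sliding speed (m/s);
    temperature_k: contact temperature (K).
    """
    abrasive = a_usui * sigma_n * v_s * math.exp(-b_usui / temperature_k)
    graphitization = a_graph * math.exp(-e_a / (r_gas * temperature_k))
    return abrasive + graphitization

print(diamond_wear_rate(sigma_n=800e6, v_s=2.0, temperature_k=900.0))
```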
Esteve, E; Rathleff, M S; Bagur-Calafat, C; Urrútia, G; Thorborg, K
2015-06-01
Groin injuries are common in football and ice hockey, and previous groin injury is a strong risk factor for future groin injuries, which calls for primary prevention. The aim of this systematic review was to evaluate the effect of specific groin-injury prevention programmes in sports. A comprehensive search was performed in May 2014, yielding 1747 potentially relevant references. Two independent assessors evaluated randomised controlled trials for inclusion, extracted data and performed quality assessments using Cochrane's risk of bias tool. Quantitative analyses were performed in Review Manager 5.3. Seven trials were included: six on football players (four male and two female populations) and one on male handball players. In total there were 4191 participants with a total of 157 injuries. The primary analysis, including all participants, did not show a significant reduction in the number of groin injuries after completing a groin injury prevention programme (relative risk (RR) 0.81; 95% CI 0.60 to 1.09). Subgroup analyses based on type of sport, gender and type of prevention programme showed similar non-significant estimates, with RR ranging from 0.48 to 0.81. Meta-analysis revealed a potentially clinically meaningful groin injury reduction of 19%, even though no statistically significant reduction in sport-related groin injuries could be documented. PROSPERO registration ID CRD42014009614. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Effects of tools inserted through snake-like surgical manipulators.
Murphy, Ryan J; Otake, Yoshito; Wolfe, Kevin C; Taylor, Russell H; Armand, Mehran
2014-01-01
Snake-like manipulators with a large, open lumen can offer improved treatment alternatives for minimally- and less-invasive surgeries. In these procedures, surgeons use the manipulator to introduce and control flexible tools in the surgical environment. This paper describes a predictive algorithm for estimating manipulator configuration given tip position for nonconstant-curvature, cable-driven manipulators using energy minimization. During experimental bending of the manipulator with and without a tool inserted in its lumen, images were recorded from an overhead camera in conjunction with actuation cable tension and length. To investigate the accuracy, the estimated manipulator configuration from the model and the ground-truth configuration measured from the image were compared. Additional analysis focused on the response differences for the manipulator with and without a tool inserted through the lumen. Results indicate that the energy minimization model predicts manipulator configuration with an error of 0.24 ± 0.22mm without tools in the lumen and 0.24 ± 0.19mm with tools in the lumen (no significant difference, p = 0.81). Moreover, tools did not introduce noticeable perturbations in the manipulator trajectory; however, there was an increase in the force required to reach a configuration. These results support the use of the proposed estimation method for calculating the shape of the manipulator with a tool inserted in its lumen when an accuracy range of at least 1mm is required.
Eyles, Helen; Shields, Emma; Webster, Jacqui; Ni Mhurchu, Cliona
2016-08-01
Excess sodium intake is one of the top 2 dietary risk factors contributing to the global burden of disease. As such, many countries are now developing national sodium reduction strategies, a key component of which is a sodium reduction model that includes sodium targets for packaged foods and other sources of dietary sodium. We sought to develop a sodium reduction model to determine the reductions required in the sodium content of packaged foods and other dietary sources of sodium to reduce adult population salt intake by ∼30% toward the optimal WHO target of 5 g/d. Nationally representative household food-purchasing data for New Zealand were linked with branded food composition information to determine the mean contribution of major packaged food categories to total population sodium consumption. Discretionary salt use and the contribution of sodium from fresh foods and foods consumed away from the home were estimated with the use of national nutrition survey data. Reductions required in the sodium content of packaged foods and other dietary sources of sodium to achieve a 30% reduction in dietary sodium intakes were estimated. A 36% reduction (1.6 g salt or 628 mg Na) in the sodium content of packaged foods in conjunction with a 40% reduction in discretionary salt use and the sodium content of foods consumed away from the home would reduce total population salt intake in New Zealand by 35% (from 8.4 to 5.5 g/d) and thus meet the WHO 2025 30% relative reduction target. Key reductions required include a decrease of 21% in the sodium content of white bread, 27% for hard cheese, 42% for sausages, and 54% for ready-to-eat breakfast cereals. Achieving the WHO sodium target in New Zealand will take considerable efforts by both food manufacturers and consumers and will likely require a national government-led sodium reduction strategy. © 2016 American Society for Nutrition.
NASA Technical Reports Server (NTRS)
Stewart, R. D.
1979-01-01
The Price and Cost Estimating Program (PACE II) was developed to prepare man-hour and material cost estimates. This versatile and flexible tool significantly reduces computation time and errors, as well as the typing and reproduction time involved in the preparation of cost estimates.
Ogawa, Takahiro; Haseyama, Miki
2013-03-01
A missing texture reconstruction method based on an error reduction (ER) algorithm, including a novel estimation scheme for Fourier transform magnitudes, is presented in this brief. In our method, the Fourier transform magnitude is estimated for a target patch including missing areas, and the missing intensities are estimated by retrieving its phase based on the ER algorithm. Specifically, by monitoring the errors converged in the ER algorithm, known patches whose Fourier transform magnitudes are similar to that of the target patch are selected from the target image. The Fourier transform magnitude of the target patch is then estimated from those of the selected known patches and their corresponding errors. Consequently, by using the ER algorithm, we can estimate both the Fourier transform magnitudes and phases to reconstruct the missing areas.
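Once a target Fourier magnitude has been chosen, the core error-reduction iteration alternates between enforcing that magnitude in the frequency domain and re-imposing the known pixels in the image domain; the compact sketch below illustrates this generic ER loop, not the authors' full magnitude-selection scheme.

```python
import numpy as np

def error_reduction_inpaint(patch, known_mask, target_magnitude, n_iter=200):
    """Estimate missing pixels by alternating magnitude and known-pixel constraints."""
    estimate = patch.copy().astype(float)
    estimate[~known_mask] = patch[known_mask].mean()        # simple initial fill
    for _ in range(n_iter):
        spectrum = np.fft.fft2(estimate)
        phase = np.angle(spectrum)
        estimate = np.real(np.fft.ifft2(target_magnitude * np.exp(1j * phase)))
        estimate[known_mask] = patch[known_mask]            # re-impose known intensities
    return estimate
```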
Quantitative PET Imaging in Drug Development: Estimation of Target Occupancy.
Naganawa, Mika; Gallezot, Jean-Dominique; Rossano, Samantha; Carson, Richard E
2017-12-11
Positron emission tomography, an imaging tool using radiolabeled tracers in humans and preclinical species, has been widely used in recent years in drug development, particularly in the central nervous system. One important goal of PET in drug development is assessing the occupancy of various molecular targets (e.g., receptors, transporters, enzymes) by exogenous drugs. The current linear mathematical approaches used to determine occupancy using PET imaging experiments are presented. These algorithms use results from multiple regions with different target content in two scans, a baseline (pre-drug) scan and a post-drug scan. New mathematical estimation approaches to determine target occupancy, using maximum likelihood, are presented. A major challenge in these methods is the proper definition of the covariance matrix of the regional binding measures, accounting for different variance of the individual regional measures and their nonzero covariance, factors that have been ignored by conventional methods. The novel methods are compared to standard methods using simulation and real human occupancy data. The simulation data showed the expected reduction in variance and bias using the proper maximum likelihood methods, when the assumptions of the estimation method matched those in simulation. Between-method differences for data from human occupancy studies were less obvious, in part due to small dataset sizes. These maximum likelihood methods form the basis for development of improved PET covariance models, in order to minimize bias and variance in PET occupancy studies.
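The standard linear approach referred to above is the occupancy (Lassen) plot: across regions, the drug-induced reduction in total distribution volume is regressed on the baseline value, the slope giving occupancy and the x-intercept the nondisplaceable distribution volume. The regional values below are invented for illustration.

```python
import numpy as np

# Invented regional total distribution volumes (VT) at baseline and after drug.
vt_baseline = np.array([3.2, 4.1, 5.5, 6.8, 8.0, 9.4])
vt_postdrug = np.array([2.1, 2.6, 3.3, 3.9, 4.5, 5.2])

# Lassen plot: regress delta-VT on baseline VT across regions.
delta_vt = vt_baseline - vt_postdrug
slope, intercept = np.polyfit(vt_baseline, delta_vt, 1)

occupancy = slope            # fractional target occupancy
v_nd = -intercept / slope    # nondisplaceable distribution volume
print(f"occupancy = {occupancy:.2f}, VND = {v_nd:.2f}")
```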
Seng, Bunrith; Kaneko, Hidehiro; Hirayama, Kimiaki; Katayama-Hirayama, Keiko
2012-01-01
This paper presents a mathematical model of vertical water movement and a performance evaluation of the model in static pile composting operated with neither air supply nor turning. The vertical moisture content (MC) model was developed with consideration of evaporation (internal and external evaporation), diffusion (liquid and vapour diffusion) and percolation, whereas additional water from substrate decomposition and irrigation was not taken into account. The evaporation term in the model was established on the basis of reference evaporation of the materials at known temperature, MC and relative humidity of the air. Diffusion of water vapour was estimated as functions of relative humidity and temperature, whereas diffusion of liquid water was empirically obtained from experiment by adopting Fick's law. Percolation was estimated by following Darcy's law. The model was applied to a column of composting wood chips with an initial MC of 60%. The simulation program was run for four weeks with calculation span of 1 s. The simulated results were in reasonably good agreement with the experimental results. Only a top layer (less than 20 cm) had a considerable MC reduction; the deeper layers were comparable to the initial MC, and the bottom layer was higher than the initial MC. This model is a useful tool to estimate the MC profile throughout the composting period, and could be incorporated into biodegradation kinetic simulation of composting.
A reduced order model based on Kalman filtering for sequential data assimilation of turbulent flows
NASA Astrophysics Data System (ADS)
Meldi, M.; Poux, A.
2017-10-01
A Kalman filter based sequential estimator is presented in this work. The estimator is integrated in the structure of segregated solvers for the analysis of incompressible flows. This technique provides an augmented flow state integrating available observations into the CFD model, naturally preserving a zero-divergence condition for the velocity field. Because of the prohibitive costs associated with a complete Kalman filter application, two model reduction strategies have been proposed and assessed. These strategies dramatically reduce the increase in computational costs of the model, which can be quantified as an increase of 10%-15% with respect to the classical numerical simulation. In addition, an extended analysis of the behavior of the numerical model covariance Q has been performed. Optimized values are strongly linked to the truncation error of the discretization procedure. The estimator has been applied to the analysis of a number of test cases exhibiting increasing complexity, including turbulent flow configurations. The results show that the augmented flow successfully improves the prediction of the physical quantities investigated, even when the observation is provided only in a limited region of the physical domain. In addition, the present work suggests that these data assimilation techniques, which are at an embryonic stage of development in CFD, may have the potential to be pushed even further, using the augmented prediction as a powerful tool for the optimization of the free parameters in the numerical simulation.
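At each observation time the sequential estimator reduces to the standard Kalman analysis step, blending the model forecast with the observation according to their covariances; the generic sketch below is not tied to the segregated-solver implementation described above.

```python
import numpy as np

def kalman_update(x_forecast, p_forecast, y_obs, h, r):
    """Standard Kalman analysis step for a linear observation operator h."""
    s = h @ p_forecast @ h.T + r                    # innovation covariance
    k = p_forecast @ h.T @ np.linalg.inv(s)         # Kalman gain
    x_analysis = x_forecast + k @ (y_obs - h @ x_forecast)
    p_analysis = (np.eye(len(x_forecast)) - k @ h) @ p_forecast
    return x_analysis, p_analysis

# Tiny example: a two-component state observed through its first component only.
x_a, p_a = kalman_update(np.array([1.0, 0.5]), 0.2 * np.eye(2),
                         np.array([1.3]), np.array([[1.0, 0.0]]), np.array([[0.05]]))
print(x_a, p_a)
```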
Bergander, Tryggve; Nilsson-Välimaa, Kristina; Oberg, Katarina; Lacki, Karol M
2008-01-01
Steadily increasing demand for more efficient and more affordable biomolecule-based therapies puts a significant burden on biopharma companies to reduce the cost of the R&D activities associated with introducing a new drug to the market. Reducing the time required to develop a purification process is one option for addressing the high-cost issue. The reduction in time can be accomplished if more efficient methods and tools are available for process development work, including high-throughput techniques. This paper addresses the transition from traditional column-based process development to a modern high-throughput approach utilizing microtiter filter plates filled with a well-defined volume of chromatography resin. The approach is based on implementing the well-known batch uptake principle in microtiter plate geometry. Two variants of the proposed approach, allowing for either qualitative or quantitative estimation of dynamic binding capacity as a function of residence time, are described. Examples are given of the quantitative estimation of the dynamic binding capacity of human polyclonal IgG on MabSelect SuRe and the qualitative estimation of the dynamic binding capacity of amyloglucosidase on a prototype of the Capto DEAE weak ion exchanger. The proposed high-throughput method for determination of dynamic binding capacity significantly reduces time and sample consumption as compared to a traditional method utilizing packed chromatography columns, without sacrificing the accuracy of the data obtained.
Multivariate Strategies in Functional Magnetic Resonance Imaging
ERIC Educational Resources Information Center
Hansen, Lars Kai
2007-01-01
We discuss aspects of multivariate fMRI modeling, including the statistical evaluation of multivariate models and means for dimensional reduction. In a case study we analyze linear and non-linear dimensional reduction tools in the context of a "mind reading" predictive multivariate fMRI model.
OBJECTIVE REDUCTION OF THE SPACE-TIME DOMAIN DIMENSIONALITY FOR EVALUATING MODEL PERFORMANCE
In the United States, photochemical air quality models are the principal tools used by governmental agencies to develop emission reduction strategies aimed at achieving National Ambient Air Quality Standards (NAAQS). Before they can be applied with confidence in a regulatory sett...
Collentine, Dennis; Johnsson, Holger; Larsson, Peter; Markensten, Hampus; Persson, Kristian
2015-03-01
Riparian buffer zones are the only measure which has been used extensively in Sweden to reduce phosphorus losses from agricultural land. This paper describes how the FyrisSKZ web tool can be used to evaluate allocation scenarios using data from the Svärta River, an agricultural catchment located in central Sweden. Three scenarios are evaluated: a baseline, a uniform 6-m-wide buffer zone in each sub-catchment, and an allocation of areas of buffer zones to sub-catchments based on the average cost of reduction. The total P reduction increases by 30 % in the second scenario compared to the baseline scenario, and the average reduction per hectare increases by 90 % while total costs of the program fall by 32 %. In the third scenario, the average cost per unit of reduction (
The Speckle Toolbox: A Powerful Data Reduction Tool for CCD Astrometry
NASA Astrophysics Data System (ADS)
Harshaw, Richard; Rowe, David; Genet, Russell
2017-01-01
Recent advances in high-speed, low-noise CCD and CMOS cameras, coupled with breakthroughs in data reduction software that runs on desktop PCs, have opened the domain of speckle interferometry and high-accuracy CCD measurements of double stars to amateurs, allowing them to do useful science of high quality. This paper describes how to use a speckle interferometry reduction program, the Speckle Tool Box (STB), to achieve this level of result. For over a year the author (Harshaw) has been using STB (and its predecessor, Plate Solve 3) to obtain CCD-based measurements of double stars for pairs that are either too wide (the stars not sharing the same isoplanatic patch, roughly 5 arc-seconds in diameter) or too faint to image within the coherence time required for speckle (usually under 40 ms). This same approach - using speckle reduction software to measure CCD pairs with greater accuracy than is possible with lucky imaging - has, it turns out, been used for several years by the U.S. Naval Observatory.
A reduction package for cross-dispersed echelle spectrograph data in IDL
NASA Astrophysics Data System (ADS)
Hall, Jeffrey C.; Neff, James E.
1992-12-01
We have written in IDL a data reduction package that performs reduction and extraction of cross-dispersed echelle spectrograph data. The present package includes a complete set of tools for extracting data from any number of spectral orders with arbitrary tilt and curvature. Essential elements include debiasing and flatfielding of the raw CCD image, removal of scattered light background, either nonoptimal or optimal extraction of data, and wavelength calibration and continuum normalization of the extracted orders. A growing set of support routines permits examination of the frame being processed to provide continuing checks on the statistical properties of the data and on the accuracy of the extraction. We will display some sample reductions and discuss the algorithms used. The inherent simplicity and user-friendliness of the IDL interface make this package a useful tool for spectroscopists. We will provide an email distribution list for those interested in receiving the package, and further documentation will be distributed at the meeting.
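The processing order described above generalizes to most cross-dispersed echelle data. The sketch below strings those steps together in Python on a tiny synthetic frame; the functions, the constant-per-column scattered-light model and the polynomial wavelength solution are simplified stand-ins for the package's IDL routines, not a port of them.

```python
import numpy as np

# Schematic of the reduction steps listed in the abstract: debias, flat-field,
# scattered-light removal, (non-optimal) extraction, wavelength calibration.

def debias(frame, bias):
    return frame - bias

def flatfield(frame, flat):
    return frame / np.where(flat > 0, flat / np.median(flat), 1.0)

def subtract_scattered_light(frame, order_mask):
    # Estimate the inter-order background from pixels outside the orders and
    # subtract a simple constant-per-column model of it.
    background = np.array([np.median(col[~m]) if (~m).any() else 0.0
                           for col, m in zip(frame.T, order_mask.T)])
    return frame - background[np.newaxis, :]

def extract_order(frame, order_rows):
    # Non-optimal (simple sum) extraction across the order's spatial profile.
    return frame[order_rows, :].sum(axis=0)

def wavelength_calibrate(pixels, poly_coeffs):
    # Pixel -> wavelength mapping from an arc-lamp polynomial fit (coefficients assumed).
    return np.polyval(poly_coeffs, pixels)

# Tiny synthetic example: one flat "order" on a 64x128 frame with a 300 ADU bias.
rng = np.random.default_rng(1)
raw = rng.poisson(50, size=(64, 128)).astype(float) + 300.0
bias = np.full_like(raw, 300.0)
flat = np.ones_like(raw)
mask = np.zeros_like(raw, dtype=bool)
mask[30:34, :] = True                                          # order location

frame = subtract_scattered_light(flatfield(debias(raw, bias), flat), mask)
spectrum = extract_order(frame, slice(30, 34))
wavelength = wavelength_calibrate(np.arange(spectrum.size), [0.05, 5000.0])
```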
Climate Action Planning Tool | NREL
NREL's Climate Action Planning Tool provides a quick, basic estimate of how various technology options can contribute to an overall climate action plan for your research campus. Use the tool to
The Global Earthquake Model and Disaster Risk Reduction
NASA Astrophysics Data System (ADS)
Smolka, A. J.
2015-12-01
Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world, while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples of how GEM is bridging the gap between science and disaster risk reduction are: - Several countries, including Switzerland, Turkey, Italy, Ecuador, Papua New Guinea and Taiwan (with more to follow), are computing national seismic hazard using the OpenQuake engine. In some cases these results are used to define actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Current case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito, Ecuador. In agreement with GEM's collaborative approach, all projects are undertaken with strong involvement of local scientific and risk reduction communities. Open-source software and careful documentation of the methodologies create full transparency of the modelling process, so that results can be reproduced at any time by third parties.
Uddin, Muhammad Shahin; Tahtali, Murat; Lambert, Andrew J; Pickering, Mark R; Marchese, Margaret; Stuart, Iain
2016-05-20
Compared with other medical-imaging modalities, ultrasound (US) imaging is a valuable way to examine the body's internal organs, and two-dimensional (2D) imaging is currently the most common technique used in clinical diagnoses. Conventional 2D US imaging systems are highly flexible, cost-effective imaging tools that permit operators to observe and record images of a large variety of thin anatomical sections in real time. Recently, 3D US imaging has also been gaining popularity due to its considerable advantages over 2D US imaging. It reduces dependency on the operator and provides better qualitative and quantitative information for an effective diagnosis. Furthermore, it provides a 3D view, which allows the observation of volume information. The major shortcoming of any type of US imaging is the presence of speckle noise; hence, speckle reduction is vital for providing a better clinical diagnosis. The key objective of any speckle-reduction algorithm is to attain a speckle-free image while preserving the important anatomical features. In this paper we introduce a nonlinear, multi-scale, complex wavelet-diffusion-based algorithm for speckle reduction and sharp-edge preservation in 2D and 3D US images. In the proposed method we use Rayleigh and Maxwell mixture models for 2D and 3D US images, respectively, where a genetic algorithm is used in combination with an expectation-maximization method to estimate the mixture parameters. Experimental results using both 2D and 3D synthetic, physical phantom, and clinical data demonstrate that the proposed algorithm significantly reduces speckle noise while preserving sharp edges without discernible distortions. The proposed approach performs better than state-of-the-art approaches in both qualitative and quantitative measures.
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Alan
2011-01-01
Current aircraft departure release times are based on manual estimates of aircraft takeoff times. Uncertainty in takeoff time estimates may result in missed opportunities to merge into constrained en route streams and lead to lost throughput. However, technology exists to improve takeoff time estimates by using the aircraft surface trajectory predictions that enable air traffic control tower (ATCT) decision support tools. NASA's Precision Departure Release Capability (PDRC) is designed to use automated surface trajectory-based takeoff time estimates to improve en route tactical departure scheduling. This is accomplished by integrating an ATCT decision support tool with an en route tactical departure scheduling decision support tool. The PDRC concept and prototype software have been developed, and an initial test was completed at air traffic control facilities in Dallas/Fort Worth. This paper describes the PDRC operational concept, system design, and initial observations.
Dallas, Lorna J; Devos, Alexandre; Fievet, Bruno; Turner, Andrew; Lyons, Brett P; Jha, Awadhesh N
2016-05-01
Accurate dosimetry is critically important for ecotoxicological and radioecological studies on the potential effects of environmentally relevant radionuclides, such as tritium ((3)H). Previous studies have used basic dosimetric equations to estimate dose from (3)H exposure in ecologically important organisms, such as marine mussels. This study compares four different methods of estimating dose to adult mussels exposed to 1 or 15 MBq L(-1) tritiated water (HTO) under laboratory conditions. These methods were (1) an equation converting seawater activity concentrations to dose rate with fixed parameters; (2) input into the ERICA tool of seawater activity concentrations only; (3) input into the ERICA tool of estimated whole-organism activity concentrations (woTACs), comprising dry activity plus estimated tissue free water tritium (TFWT) activity (TFWT volume × seawater activity concentration); and (4) input into the ERICA tool of measured whole-organism activity concentrations, comprising dry activity plus measured TFWT activity (TFWT volume × TFWT activity concentration). Methods 3 and 4 are recommended for future ecotoxicological experiments because they produce values for individual animals and do not rely on transfer predictions (estimation of a concentration ratio). Method 1 may be suitable if measured whole-organism concentrations are not available, as it produced results between those of methods 3 and 4. Because there are technical complications in accurately measuring TFWT, we recommend that future radiotoxicological studies on mussels or other aquatic invertebrates measure whole-organism activity in non-dried tissues (i.e. incorporating TFWT and dry activity as one fraction rather than as separate fractions) and input these data into the ERICA tool.
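The difference between methods 3 and 4 comes down to how the tissue-free-water term is filled in. A small worked example, with invented activity values, is sketched below; the ERICA dose-rate calculation itself is not reproduced.

```python
# Whole-organism activity bookkeeping behind methods 3 and 4.
# All values are illustrative placeholders, not the study's measurements.

seawater_activity = 15e6        # HTO exposure concentration [Bq/L] (15 MBq/L treatment)
dry_activity = 2.0e5            # activity in dried tissue [Bq per kg wet weight] (assumption)
tfwt_volume = 0.8               # tissue free water [L per kg wet weight] (assumption)
tfwt_activity_measured = 9e6    # measured TFWT concentration [Bq/L] (assumption)

# Method 3: TFWT fraction estimated from the seawater activity concentration.
wo_activity_est = dry_activity + tfwt_volume * seawater_activity

# Method 4: TFWT fraction from the measured TFWT activity concentration.
wo_activity_meas = dry_activity + tfwt_volume * tfwt_activity_measured

print(f"method 3 whole-organism activity: {wo_activity_est:.3g} Bq/kg")
print(f"method 4 whole-organism activity: {wo_activity_meas:.3g} Bq/kg")
# Either value would then be entered into the ERICA tool as the whole-organism
# activity concentration from which the internal dose rate is computed.
```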
Valuing preferences over stormwater management outcomes including improved hydrologic function
NASA Astrophysics Data System (ADS)
LondoñO Cadavid, Catalina; Ando, Amy W.
2013-07-01
Stormwater runoff causes environmental problems such as flooding, soil erosion, and water pollution. Conventional stormwater management has focused primarily on flood reduction, while a new generation of decentralized stormwater solutions yields ancillary benefits such as healthier aquatic habitat, improved surface water quality, and increased water table recharge. Previous research has estimated values for flood reduction from stormwater management, but no estimates exist for the willingness to pay (WTP) for some of the other environmental benefits of alternative approaches to stormwater control. This paper uses a choice experiment survey of households in Champaign-Urbana, Illinois, to estimate the values of several attributes of stormwater management outcomes. We analyzed data from 131 surveyed households in randomly selected neighborhoods. We find that people value reduced basement flooding more than reductions in yard or street flooding, but WTP for basement flood reduction in the area only exists if individuals are currently experiencing significant flooding themselves. Citizens value both improved water quality and improved hydrologic function and aquatic habitat from runoff reduction. Thus, widespread investment in low impact development stormwater solutions could have very large total benefits, and stormwater managers should be wary of policies and infrastructure plans that reduce flooding at the expense of water quality and aquatic habitat.
Reducing CO2 Emissions through Lightweight Design and Manufacturing
NASA Astrophysics Data System (ADS)
Carruth, Mark A.; Allwood, Julian M.; Milford, Rachel L.
2011-05-01
To meet targeted 50% reductions in industrial CO2 emissions by 2050, demand for steel and aluminium must be cut. Many steel and aluminium products include redundant material, and the manufacturing routes to produce them use more material than is necessary. Lightweight design and optimized manufacturing processes offer a means of demand reduction, whilst creating products to perform the same service as existing ones. This paper examines two strategies for demand reduction: lightweight product design; and minimizing yield losses through the product supply chain. Possible mass savings are estimated for specific case-studies on metal-intensive products, such as I-beams and food cans. These estimates are then extrapolated to other sectors to produce a global estimate for possible demand reductions. Results show that lightweight product design may offer potential mass savings of up to 30% for some products, whilst yield in the production of others could be improved by over 20%. If these two strategies could be combined for all products, global demand for steel and aluminium would be reduced by nearly 50%. The impact of demand reduction on CO2 emissions is presented, and barriers to the adoption of new, lightweight technologies are discussed.
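A one-line calculation shows how savings of that magnitude compound; the figures below are round illustrative numbers rather than the paper's case-study results.

```python
# Illustrative compounding of the two demand-reduction levers described in the abstract.

lightweight_saving = 0.30      # mass removed from the final product by redesign
yield_improvement = 0.20       # reduction of scrap generated along the supply chain

# Metal demanded per unit of service, relative to today (= 1.0):
relative_demand = (1 - lightweight_saving) * (1 - yield_improvement)
print(f"relative metal demand: {relative_demand:.2f}")          # 0.56
print(f"combined demand reduction: {1 - relative_demand:.0%}")  # 44%
# Applied across all products, reductions of this order approach the ~50%
# aggregate figure quoted in the abstract.
```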
Liu, Ya L; Liu, Kui; Yuan, Li Y; Chai, Zhi F; Shi, Wei Q
2016-08-15
In this work, the compositions of Ce-Al, Er-Al and La-Bi intermetallic compounds were estimated by the cyclic voltammetry (CV) technique. First, CV measurements were carried out at different reverse potentials to study the co-reduction processes of the Ce-Al, Er-Al and La-Bi systems. The CV curves obtained were then re-plotted with the current as a function of time, and the coulomb number of each peak was calculated. By comparing the coulomb numbers of the related peaks, the compositions of the Ce-Al, Er-Al and La-Bi intermetallic compounds formed in the co-reduction process could be estimated. The results showed that Al11Ce3, Al3Ce, Al2Ce and AlCe could be formed by the co-reduction of Ce(III) and Al(III). For the co-reduction of Er(III) and Al(III), Al3Er2, Al2Er and AlEr were formed. In a system where La(III) and Bi(III) co-exist in LiCl-KCl melts, LaBi2, LaBi and Li3Bi were the major products of co-reduction.
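The charge-integration step at the heart of this analysis can be sketched in a few lines of Python; the synthetic peaks, sweep rate and integration windows below are invented for illustration and do not correspond to the measured Ce-Al, Er-Al or La-Bi voltammograms.

```python
import numpy as np

# Re-plot a CV scan as current vs. time and integrate the charge under each
# reduction peak (trapezoid rule) to obtain its coulomb number.

scan_rate = 0.1                                        # V/s (assumed)
potential = np.linspace(-1.0, -2.2, 1201)              # cathodic sweep [V vs. ref]
time = np.abs(potential - potential[0]) / scan_rate    # potential -> time conversion

# Two synthetic Gaussian reduction peaks standing in for co-reduction waves.
current = (-2e-3 * np.exp(-((potential + 1.5) ** 2) / (2 * 0.03 ** 2))
           - 3e-3 * np.exp(-((potential + 1.9) ** 2) / (2 * 0.03 ** 2)))

def peak_charge(v_lo, v_hi):
    """Integrate |i| dt over the potential window [v_lo, v_hi]."""
    sel = (potential <= v_lo) & (potential >= v_hi)
    return np.trapz(np.abs(current[sel]), time[sel])

q1 = peak_charge(-1.3, -1.7)             # coulomb number of the first peak [C]
q2 = peak_charge(-1.7, -2.1)             # coulomb number of the second peak [C]
print(f"Q1 = {q1:.3e} C, Q2 = {q2:.3e} C, ratio Q2/Q1 = {q2 / q1:.2f}")
# Comparing such coulomb-number ratios across peaks is how relative stoichiometries
# of intermetallics (e.g. Al2Ce vs. Al3Ce) are inferred from co-reduction waves.
```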
Valuation of irrigation water in South-western Iran using a hedonic pricing model
NASA Astrophysics Data System (ADS)
Esmaeili, Abdoulkarim; Shahsavari, Zahra
2011-12-01
Population growth, improved socioeconomic conditions, increased demand for various types of water use, and a reduction in water supply have created more competition for scarce water supplies in many countries. Efficient allocation of water supplies between different economic sectors is therefore very important. Water valuation is a useful tool for determining water prices, and water pricing can play a major part in improving water allocation by encouraging users to conserve scarce water resources and by promoting improvements in productivity. We used a hedonic pricing method to reveal the implicit value of irrigation water by analyzing agricultural land values in farms under the Doroodzan dam in South-western Iran. The method was applied to farms in which irrigation water came from wells and canals. The availability of irrigation water was one of the most important factors influencing land prices. The value of irrigation water in the farms investigated was estimated to be 0.046 per cubic meter. The estimated price for water was clearly higher than the price farmers currently pay for water in the study area. Efficient water pricing could help the sustainability of the water resources. Farmers must therefore be informed of the real value of irrigation water used on their land.
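The mechanics of reading an implicit water price off a hedonic regression can be illustrated with synthetic data, as in the sketch below; the covariates, sample size and coefficients are assumptions, not the study's specification or estimates.

```python
import numpy as np

# Hedonic pricing sketch: regress land value on farm attributes and read the
# implicit value of irrigation water off the fitted coefficient on water volume.

rng = np.random.default_rng(2)
n = 120
water_m3 = rng.uniform(2000, 20000, n)        # irrigation water available per farm
soil_quality = rng.uniform(0, 1, n)           # soil quality index
distance_km = rng.uniform(0, 30, n)           # distance to market

# Synthetic land values with an assumed "true" implicit water price of 0.05 per m^3.
land_value = (5000 + 0.05 * water_m3 + 3000 * soil_quality
              - 40 * distance_km + rng.normal(0, 300, n))

X = np.column_stack([np.ones(n), water_m3, soil_quality, distance_km])
beta, *_ = np.linalg.lstsq(X, land_value, rcond=None)
print(f"estimated implicit price of irrigation water: {beta[1]:.3f} per m^3")
```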
Carbon and energy saving markets in compressed air
NASA Astrophysics Data System (ADS)
Cipollone, R.
2015-08-01
CO2 reduction and fossil fuel saving represent two of the cornerstones of the environmental commitments of all the countries of the world. The first commitment is of a medium- to long-term nature and unequivocally calls for a new energy era; the second extends the lifetime of fossil-fuel technologies in order to ease the energy transition. To sustain the two efforts, new immaterial markets have been established in almost all countries of the world, whose exchanges (purchases and sales) concern CO2 emissions and equivalent fossil fuels that have not been emitted or burned. This paper examines in depth two aspects not yet exploited: specific CO2 emissions and equivalent fossil fuel burned, as a function of compressed air produced. Reference is made to current compressor technology, carefully analysing data from CAGI (the Compressed Air and Gas Institute) and integrating it with the contribution of PNEUROP (the European association of manufacturers of compressors, vacuum pumps, pneumatic tools and allied equipment) on the European compressor market. On the basis of the energy savings that could be put in place, this article also estimates the financial value of the CO2 emissions and fossil fuels avoided.
An open-source model and solution method to predict co-contraction in the finger.
MacIntosh, Alexander R; Keir, Peter J
2017-10-01
A novel open-source biomechanical model of the index finger, together with an electromyography (EMG)-constrained static optimization solution method, is developed with the goal of improving co-contraction estimates and providing a means to assess tendon tension distribution through the finger. The Intrinsic model has four degrees of freedom and seven muscles (with a 14-component extensor mechanism). A novel plugin developed for the OpenSim modelling software applied the EMG-constrained static optimization solution method. Ten participants performed static pressing in three finger postures and five dynamic free-motion tasks. Index finger 3D kinematics, force (5, 15, 30 N), and EMG (four extrinsic muscles and the first dorsal interosseous) were used in the analysis. The Intrinsic model predicted 29% more co-contraction during static pressing than the existing model. Further, tendon tension distribution patterns and forces, known to be essential for producing finger action, were determined by the model across all postures. The Intrinsic model and custom solution method improved co-contraction estimates to facilitate force propagation through the finger. These tools improve our interpretation of loads in the finger so that better rehabilitation and workplace injury risk reduction strategies can be developed.
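The kind of optimization problem such a solution method solves can be sketched compactly: minimize an activation cost subject to a joint-moment balance, with measured EMG levels acting as lower bounds on the corresponding activations. The toy problem below (hypothetical moment arms, strengths and EMG values, solved with SciPy's SLSQP) illustrates that structure; it is not the paper's OpenSim plugin or finger model.

```python
import numpy as np
from scipy.optimize import minimize

# EMG-constrained static optimization on a toy single-joint system: find activations
# satisfying a moment balance while staying at or above normalized EMG levels for the
# measured muscles, minimizing summed squared activation. All values are invented.

moment_arms = np.array([0.010, 0.008, -0.006, 0.005])    # [m] flexors (+) / extensor (-)
max_forces = np.array([120.0, 90.0, 60.0, 45.0])          # [N] muscle strengths
target_moment = 0.9                                        # [N*m] required joint moment
emg_floor = np.array([0.20, 0.10, 0.05, np.nan])           # measured EMG; NaN = unmeasured

def objective(a):
    return np.sum(a ** 2)                                  # activation-squared cost

constraints = [{"type": "eq",
                "fun": lambda a: moment_arms @ (a * max_forces) - target_moment}]
bounds = [(floor if np.isfinite(floor) else 0.0, 1.0) for floor in emg_floor]

res = minimize(objective, x0=np.full(4, 0.3), bounds=bounds,
               constraints=constraints, method="SLSQP")
activations = res.x
print("activations:", activations.round(3))
# The antagonist (negative moment arm) is forced to stay at its EMG floor, which is
# exactly the co-contraction that an unconstrained optimizer would otherwise zero out.
print("co-contraction (extensor force):", (activations[2] * max_forces[2]).round(1), "N")
```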
LQG control of a deformable mirror adaptive optics system with time-delayed measurements
NASA Astrophysics Data System (ADS)
Anderson, David J.
1991-12-01
This thesis proposes a linear quadratic Gaussian (LQG) control law for a ground-based deformable mirror adaptive optics system. The incoming image wavefront is distorted, primarily in phase, by the turbulent effects of the earth's atmosphere. The adaptive optics system attempts to compensate for the distortion with a deformable mirror. A Hartmann wavefront sensor measures the degree of distortion in the image wavefront. The measurements are input to a Kalman filter, which estimates the system states. The state estimates are processed by a linear quadratic regulator, which generates the appropriate control voltages to apply to the deformable mirror actuators. The dynamics model for the atmospheric phase distortion consists of 14 Zernike coefficient states, each modeled as a first-order linear time-invariant shaping filter driven by zero-mean white Gaussian noise. The dynamics of the deformable mirror are also modeled as 14 Zernike coefficients with first-order deterministic dynamics. A significant reduction in total wavefront phase distortion is achieved in the presence of time-delayed measurements. The wavefront sensor sampling rate is the major factor limiting system performance. The Multimode Simulation for Optimal Filter Evaluation (MSOFE) software is the performance evaluation tool of choice for this research.
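The separation between estimation and regulation that defines an LQG design can be sketched in a few dozen lines. The Python toy below builds steady-state Kalman and LQR gains for a handful of first-order modes and closes the loop on synthetic measurements; the dimensions, noise levels and observer form are illustrative and do not reproduce the thesis' 14-mode Zernike model or the MSOFE analysis.

```python
import numpy as np

# Discrete-time LQG loop for a few first-order "Zernike" modes: a Kalman filter
# estimates the phase states from noisy sensor readings, and an LQR maps the
# estimate to mirror commands. All gains and dimensions are placeholders.

n = 4                                   # number of modes kept for the sketch
A = 0.95 * np.eye(n)                    # first-order decay of each mode
B = np.eye(n)                           # mirror actuators act mode-by-mode
C = np.eye(n)                           # sensor observes every mode
Qw = 0.01 * np.eye(n)                   # process (atmosphere) noise covariance
Rv = 0.05 * np.eye(n)                   # sensor noise covariance
Qc, Rc = np.eye(n), 0.1 * np.eye(n)     # LQR state and control weights

def dare(A, B, Q, R, iters=500):
    """Solve the discrete algebraic Riccati equation by fixed-point iteration."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return P, K

_, K_lqr = dare(A, B, Qc, Rc)           # regulator gain
P_kf, _ = dare(A.T, C.T, Qw, Rv)        # dual problem gives the filter covariance
L_kf = P_kf @ C.T @ np.linalg.inv(C @ P_kf @ C.T + Rv)   # Kalman gain

rng = np.random.default_rng(3)
x = rng.normal(size=n)                  # true atmospheric phase modes
x_hat = np.zeros(n)                     # filter estimate
for _ in range(200):
    u = -K_lqr @ x_hat                                      # LQR command to the mirror
    y = C @ x + rng.multivariate_normal(np.zeros(n), Rv)    # noisy wavefront measurement
    x_hat = A @ x_hat + B @ u + L_kf @ (y - C @ x_hat)      # observer/filter update
    x = A @ x + B @ u + rng.multivariate_normal(np.zeros(n), Qw)
print("residual phase RMS:", np.sqrt(np.mean(x ** 2)))
```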
NASA Astrophysics Data System (ADS)
Tsang, Sik-Ho; Chan, Yui-Lam; Siu, Wan-Chi
2017-01-01
Weighted prediction (WP) is an efficient video coding tool, introduced with the H.264/AVC video coding standard, that compensates for temporal illumination changes in motion estimation and compensation. WP parameters, comprising a multiplicative weight and an additive offset for each reference frame, must be estimated and transmitted to the decoder in the slice header. These parameters add extra bits to the coded video bitstream. High Efficiency Video Coding (HEVC) provides WP parameter prediction to reduce this overhead, so WP parameter prediction is crucial to research and applications related to WP. Prior work has sought to further improve WP parameter prediction through implicit prediction of image characteristics and derivation of the parameters. By exploiting both temporal and interlayer redundancies, we propose three WP parameter prediction algorithms - enhanced implicit WP parameters, enhanced direct WP parameter derivation, and interlayer WP parameters - to further improve the coding efficiency of HEVC. Results show that the proposed algorithms achieve up to 5.83% and 5.23% bitrate reduction compared with conventional scalable HEVC in the base layer for SNR scalability and 2× spatial scalability, respectively.
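For context, one common way to obtain explicit WP parameters is to match the first- and second-order statistics of the current and reference frames and quantize the result to the precision signalled in the bitstream. The sketch below shows that baseline estimation on synthetic frames; it is a generic illustration, not the prediction algorithms proposed in the paper, and the chosen weight denominator is an assumption.

```python
import numpy as np

# Baseline explicit weighted-prediction parameter estimation: derive the weight from
# the std-dev ratio and the offset from the means, then quantize. Illustrative only.

rng = np.random.default_rng(4)
ref = rng.integers(0, 256, size=(64, 64)).astype(float)               # reference frame
cur = np.clip(0.8 * ref - 10 + rng.normal(0, 2, ref.shape), 0, 255)   # fading/darkening

w = cur.std() / ref.std()              # multiplicative weight from variance matching
o = cur.mean() - w * ref.mean()        # additive offset from mean matching

log2_wd = 6                            # assumed weight denominator (1/64 steps)
w_q = int(round(w * (1 << log2_wd)))
o_q = int(round(o))

pred = np.clip((w_q * ref) / (1 << log2_wd) + o_q, 0, 255)
print(f"weight={w_q}/{1 << log2_wd}, offset={o_q}, "
      f"mean abs error={np.abs(pred - cur).mean():.2f}")
```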
Effects of subsampling of passive acoustic recordings on acoustic metrics.
Thomisch, Karolin; Boebel, Olaf; Zitterbart, Daniel P; Samaran, Flore; Van Parijs, Sofie; Van Opzeeland, Ilse
2015-07-01
Passive acoustic monitoring is an important tool in marine mammal studies. However, logistics and finances frequently constrain the number and servicing schedules of acoustic recorders, requiring a trade-off between deployment periods and sampling continuity, i.e., the implementation of a subsampling scheme. Optimizing such schemes to each project's specific research questions is desirable. This study investigates the impact of subsampling on the accuracy of two common metrics, acoustic presence and call rate, for different vocalization patterns (regimes) of baleen whales: (1) variable vocal activity, (2) vocalizations organized in song bouts, and (3) vocal activity with diel patterns. To this end, above metrics are compared for continuous and subsampled data subject to different sampling strategies, covering duty cycles between 50% and 2%. The results show that a reduction of the duty cycle impacts negatively on the accuracy of both acoustic presence and call rate estimates. For a given duty cycle, frequent short listening periods improve accuracy of daily acoustic presence estimates over few long listening periods. Overall, subsampling effects are most pronounced for low and/or temporally clustered vocal activity. These findings illustrate the importance of informed decisions when applying subsampling strategies to passive acoustic recordings or analyses for a given target species.
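The trade-off the study quantifies can be explored directly in simulation: generate a synthetic day of calls with a diel pattern, impose a duty-cycled listening schedule, and compare the resulting presence and call-rate estimates with the continuous record. The sketch below does exactly that with an invented call-rate model; it is not the study's data or analysis pipeline.

```python
import numpy as np

# Toy subsampling experiment: minute-resolution call counts with a diel pattern,
# resampled under different duty cycles and compared with the continuous "truth".

rng = np.random.default_rng(5)
minutes = np.arange(24 * 60)
diel_rate = 0.4 * (1 + np.sin(2 * np.pi * (minutes / 1440 - 0.25)))  # calls/min, diel peak
calls = rng.poisson(diel_rate)                        # "true" continuous recording

def duty_cycle_estimate(calls, on_min, off_min):
    """Listen for on_min, pause for off_min, repeatedly; return presence and call rate."""
    cycle = on_min + off_min
    listening = (np.arange(calls.size) % cycle) < on_min
    sampled = calls[listening]
    presence = sampled.sum() > 0                      # daily acoustic presence
    call_rate = sampled.mean()                        # calls per listened minute
    return presence, call_rate

print("continuous: rate =", calls.mean().round(3))
for on, off in [(30, 30), (10, 90), (5, 235)]:        # 50%, 10%, ~2% duty cycles
    p, r = duty_cycle_estimate(calls, on, off)
    print(f"{on}/{on + off} min duty cycle: presence={p}, rate={r:.3f}")
```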
Fisher, Jason C.
2013-01-01
Long-term groundwater monitoring networks can provide essential information for the planning and management of water resources. Budget constraints in water resource management agencies often mean a reduction in the number of observation wells included in a monitoring network. A network design tool, distributed as an R package, was developed to determine which wells to exclude from a monitoring network because they add little or no beneficial information. A kriging-based genetic algorithm method was used to optimize the monitoring network. The algorithm was used to find the set of wells whose removal leads to the smallest increase in the weighted sum of the (1) mean standard error at all nodes in the kriging grid where the water table is estimated, (2) root-mean-squared-error between the measured and estimated water-level elevation at the removed sites, (3) mean standard deviation of measurements across time at the removed sites, and (4) mean measurement error of wells in the reduced network. The solution to the optimization problem (the best wells to retain in the monitoring network) depends on the total number of wells removed; this number is a management decision. The network design tool was applied to optimize two observation well networks monitoring the water table of the eastern Snake River Plain aquifer, Idaho; these networks include the 2008 Federal-State Cooperative water-level monitoring network (Co-op network) with 166 observation wells, and the 2008 U.S. Geological Survey-Idaho National Laboratory water-level monitoring network (USGS-INL network) with 171 wells. Each water-level monitoring network was optimized five times: by removing (1) 10, (2) 20, (3) 40, (4) 60, and (5) 80 observation wells from the original network. An examination of the trade-offs associated with changes in the number of wells to remove indicates that 20 wells can be removed from the Co-op network with a relatively small degradation of the estimated water table map, and 40 wells can be removed from the USGS-INL network before the water table map degradation accelerates. The optimal network designs indicate the robustness of the network design tool. Observation wells were removed from high well-density areas of the network while retaining the spatial pattern of the existing water-table map.
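The overall search strategy, a genetic algorithm over subsets of wells scored by how much the interpolated water-table map would degrade, can be mimicked on a toy problem. In the sketch below the kriging-based, four-term weighted objective is replaced by a simple nearest-retained-well distance surrogate, and all coordinates and GA settings are invented; it illustrates the optimization loop, not the published R package.

```python
import numpy as np

# Toy genetic algorithm for network reduction: choose which wells to drop so that a
# surrogate for the kriging standard error (mean distance from grid nodes to the
# nearest retained well) grows as little as possible. Everything here is synthetic.

rng = np.random.default_rng(6)
n_wells, n_remove, pop_size, n_gen = 40, 10, 60, 80
wells = rng.uniform(0, 100, size=(n_wells, 2))             # well coordinates [km]
gx, gy = np.meshgrid(np.linspace(0, 100, 25), np.linspace(0, 100, 25))
grid = np.column_stack([gx.ravel(), gy.ravel()])           # estimation grid nodes

def fitness(keep_mask):
    """Lower is better: mean distance from every grid node to its nearest kept well."""
    d = np.linalg.norm(grid[:, None, :] - wells[keep_mask][None, :, :], axis=2)
    return d.min(axis=1).mean()

def random_individual():
    keep = np.ones(n_wells, dtype=bool)
    keep[rng.choice(n_wells, n_remove, replace=False)] = False
    return keep

def crossover_and_mutate(a, b):
    """Draw the child's removal set from the union of its parents', with a swap mutation."""
    pool = list(set(np.flatnonzero(~a)) | set(np.flatnonzero(~b)))
    removed = set(rng.choice(pool, n_remove, replace=False))
    if rng.random() < 0.3:                                  # mutation: swap one well
        removed.discard(rng.choice(list(removed)))
        removed.add(int(rng.choice(np.setdiff1d(np.arange(n_wells), list(removed)))))
    keep = np.ones(n_wells, dtype=bool)
    keep[list(removed)] = False
    return keep

population = [random_individual() for _ in range(pop_size)]
for _ in range(n_gen):
    population.sort(key=fitness)                            # elitist selection
    parents = population[: pop_size // 2]
    children = [crossover_and_mutate(parents[rng.integers(len(parents))],
                                     parents[rng.integers(len(parents))])
                for _ in range(pop_size - len(parents))]
    population = parents + children

best = min(population, key=fitness)
print("wells suggested for removal:", np.flatnonzero(~best))
```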
NASA Astrophysics Data System (ADS)
Muthukrishnan, A.; Sangaranarayanan, M. V.
2007-10-01
The reduction of the carbon-fluorine bond in 4-fluorobenzonitrile, with acetonitrile as the solvent, is analyzed using convolution potential sweep voltammetry, and the dependence of the transfer coefficient on potential is investigated within the framework of the Marcus-Hush quadratic activation-driving force theory. The validity of a stepwise mechanism is inferred from solvent reorganization energy estimates as well as bond-length calculations using the B3LYP/6-31G(d) method. A novel method of estimating the standard reduction potential of 4-fluorobenzonitrile in acetonitrile is proposed.
NASA Astrophysics Data System (ADS)
Bergquist, B. A.; Blum, J. D.
2007-12-01
Mercury is a globally distributed and highly toxic pollutant, the mobility and bioaccumulation of which depend on its redox cycling. Hg isotope analysis is an important new tool for identifying Hg sources and tracking Hg transformations in the environment. Most natural samples analyzed for Hg isotopes display mass-dependent isotope fractionation (MDF), but a small body of data suggests that some natural samples also display mass-independent isotope fractionation (MIF) of the odd Hg isotopes. Here we document MIF of Hg isotopes during an important natural process, constrain the potential mechanism of isotope fractionation, and apply the MIF observed in natural samples to quantify the photochemical reduction of Hg species in the environment. Reduction of Hg species to Hg0 vapor is an important pathway for removal of Hg from aqueous systems into the atmosphere and occurs by abiotic and biotic mechanisms. In laboratory experiments, we find that photochemical reduction of Hg species by natural sunlight leads to large MIF of the odd isotopes. Moreover, the relationship between MIF of the two odd isotopes of Hg differs significantly between photo-reduction pathways. In contrast, both biological reduction (Kritee et al., 2006) and dark abiotic organically mediated reduction follow MDF. Natural samples from aquatic ecosystems preserve both MDF and MIF. In fish, MDF increases with fish size and Hg concentration, suggesting that MDF may be useful in understanding Hg bioaccumulation. Fish also display a large range in MIF (4‰), and the relationship between the MIF of the two odd isotopes in fish has a slope similar to that found for photo-reduction of CH3Hg+. Since fish bioaccumulate CH3Hg+, they may record the extent to which CH3Hg+ is lost via photochemical reduction in an aquatic ecosystem. Fish populations from different locations have different MIF values, but mostly display similar MIF within a given locale. This suggests that MIF is preserved in the food web and could be used to quantify photo-reduction of CH3Hg+ in ecosystems. Both MDF and MIF of Hg isotopes will be useful for quantifying and understanding Hg biogeochemical cycling in the environment.
Brady, Samuel L.; Moore, Bria M.; Yee, Brian S.; Kaufman, Robert A.
2015-01-01
Purpose To determine a comprehensive method for the implementation of adaptive statistical iterative reconstruction (ASIR) for maximal radiation dose reduction in pediatric computed tomography (CT) without changing the magnitude of noise in the reconstructed image or the contrast-to-noise ratio (CNR) in the patient. Materials and Methods The institutional review board waived the need to obtain informed consent for this HIPAA-compliant quality analysis. Chest and abdominopelvic CT images obtained before ASIR implementation (183 patient examinations; mean patient age, 8.8 years ± 6.2 [standard deviation]; range, 1 month to 27 years) were analyzed for image noise and CNR. These measurements were used in conjunction with noise models derived from anthropomorphic phantoms to establish new beam current–modulated CT parameters to implement 40% ASIR at 120 and 100 kVp without changing noise texture or magnitude. Image noise was assessed in images obtained after ASIR implementation (492 patient examinations; mean patient age, 7.6 years ± 5.4; range, 2 months to 28 years) the same way it was assessed in the pre-ASIR analysis. Dose reduction was determined by comparing size-specific dose estimates in the pre- and post-ASIR patient cohorts. Data were analyzed with paired t tests. Results With 40% ASIR implementation, the average relative dose reduction for chest CT was 39% (2.7/4.4 mGy), with a maximum reduction of 72% (5.3/18.8 mGy). The average relative dose reduction for abdominopelvic CT was 29% (4.8/6.8 mGy), with a maximum reduction of 64% (7.6/20.9 mGy). Beam current modulation was unnecessary for patients weighing 40 kg or less. The difference between 0% and 40% ASIR noise magnitude was less than 1 HU, with statistically nonsignificant increases in patient CNR at 100 kVp of 8% (15.3/14.2; P = .41) for chest CT and 13% (7.8/6.8; P = .40) for abdominopelvic CT. Conclusion Radiation dose reduction at pediatric CT was achieved when 40% ASIR was implemented as a dose reduction tool only; no net change to the magnitude of noise in the reconstructed image or the patient CNR occurred. PMID:23901128