Sample records for cost estimation techniques

  1. Estimating the cost of major ongoing cost plus hardware development programs

    NASA Technical Reports Server (NTRS)

    Bush, J. C.

    1990-01-01

    Approaches are developed for forecasting the cost of major hardware development programs while these programs are in the design and development (C/D) phase. Three approaches are presented: a schedule assessment technique for bottom-line summary cost estimation, a detailed cost estimation approach, and an intermediate cost element analysis procedure. The schedule assessment technique was developed using historical cost/schedule performance data.

  2. Estimating Environmental Compliance Costs for Industry (1981)

    EPA Pesticide Factsheets

    The paper discusses the pros and cons of existing approaches to compliance cost estimation, such as ex post survey estimation and ex ante estimation techniques (input cost accounting methods, engineering process models, and econometric models).

  3. Software Development Cost Estimation Executive Summary

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus M.; Menzies, Tim

    2006-01-01

    Identify simple, fully validated cost models that provide estimation uncertainty along with the cost estimate, based on the COCOMO variable set. Use machine learning techniques to determine: (a) the minimum number of cost drivers required for NASA domain-based cost models; (b) the minimum number of data records required; and (c) estimation uncertainty. Build a repository of software cost estimation information. Coordinate tool development and data collection with: (a) tasks funded by PA&E Cost Analysis; (b) the IV&V Effort Estimation Task; and (c) NASA SEPG activities.

  4. Techniques for estimating health care costs with censored data: an overview for the health services researcher

    PubMed Central

    Wijeysundera, Harindra C; Wang, Xuesong; Tomlinson, George; Ko, Dennis T; Krahn, Murray D

    2012-01-01

    Objective: The aim of this study was to review statistical techniques for estimating the mean population cost from health care cost data that are right censored because complete follow-up until death cannot be achieved. The target audience is health services researchers without an advanced statistical background. Methods: Longitudinal heart failure costs from Ontario, Canada, derived from administrative databases, were used as the motivating example. The dataset consisted of 43,888 patients, with follow-up periods ranging from 1 to 1538 days (mean 576 days). Mean health care costs over 1080 days of follow-up were calculated using naïve estimators, such as the full-sample and uncensored-case estimators, as well as reweighted estimators (specifically, the inverse probability weighted estimator) and phase-based costing. Costs were adjusted to 2008 Canadian dollars using the Bank of Canada consumer price index (http://www.bankofcanada.ca/en/cpi.html). Results: Over the restricted follow-up of 1080 days, 32% of patients were censored. The full-sample estimator underestimated mean cost ($30,420) compared with the reweighted estimators ($36,490). The phase-based costing estimate of $37,237 was similar to that of the simple reweighted estimator. Conclusion: The authors recommend against the use of full-sample or uncensored-case estimators when censored data are present. In the presence of heavy censoring, phase-based costing is an attractive alternative approach. PMID:22719214
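
The inverse probability weighting idea above can be sketched numerically. Below is a minimal simulation (not the Ontario cohort; the distributions, the $50/day accrual, and the sample size are all invented): each subject whose cost is fully observed within the 1080-day horizon is reweighted by 1/K(T), the probability of remaining uncensored at the event time. K is known here because the data are simulated; in practice it would be estimated with a Kaplan-Meier curve of the censoring times.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
death = rng.exponential(800.0, n)         # days until death (invented)
cens = rng.uniform(1.0, 1538.0, n)        # censoring time (invented)
event = np.minimum(death, 1080.0)         # event time under restricted follow-up
complete = event <= cens                  # is the cost fully observed?
t_obs = np.minimum(event, cens)
cost = 50.0 * t_obs                       # toy model: $50 accrued per day

# K(t) = P(censoring time > t); known here since C ~ Uniform(1, 1538).
K = np.minimum((1538.0 - event) / 1537.0, 1.0)

ipw_mean = np.sum(cost[complete] / K[complete]) / n   # simple IPW estimator
naive_mean = cost[complete].mean()        # uncensored-case estimator
true_mean = (50.0 * event).mean()         # attainable only in simulation
print(naive_mean < ipw_mean)              # the naive estimator is biased low
```

The gap between the two estimates mirrors the underestimation the study reports for the naïve estimators.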

  5. Proposed Reliability/Cost Model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1982-01-01

    New technique estimates cost of improvement in reliability for complex system. Model format/approach is dependent upon use of subsystem cost-estimating relationships (CERs) in devising cost-effective policy. Proposed methodology should have application in broad range of engineering management decisions.

  6. Cost collection and analysis for health economic evaluation.

    PubMed

    Smith, Kristine A; Rudmik, Luke

    2013-08-01

    To improve the understanding of common health care cost collection, estimation, analysis, and reporting methodologies. Ovid MEDLINE (1947 to December 2012), Cochrane Central register of Controlled Trials, Database of Systematic Reviews, Health Technology Assessment, and National Health Service Economic Evaluation Database. This article discusses the following cost collection methods: defining relevant resources, quantification of consumed resources, and resource valuation. It outlines the recommendations for cost reporting in economic evaluations and reviews the techniques on how to handle cost data uncertainty. Last, it discusses the controversial topics of future costs and patient productivity losses. Health care cost collection and estimation can be challenging, and an organized approach is required to optimize accuracy of economic evaluation outcomes. Understanding health care cost collection and estimation techniques will improve both critical appraisal and development of future economic evaluations.

  7. A Framework of Combining Case-Based Reasoning with a Work Breakdown Structure for Estimating the Cost of Online Course Production Projects

    ERIC Educational Resources Information Center

    He, Wu

    2014-01-01

    Currently, a work breakdown structure (WBS) approach is used as the most common cost estimation approach for online course production projects. To improve the practice of cost estimation, this paper proposes a novel framework to estimate the cost for online course production projects using a case-based reasoning (CBR) technique and a WBS. A…

  8. A Cost Analysis of Colonoscopy using Microcosting and Time-and-motion Techniques

    PubMed Central

    Ness, Reid M.; Stiles, Renée A.; Shintani, Ayumi K.; Dittus, Robert S.

    2007-01-01

    Background: The cost of an individual colonoscopy is an important determinant of the overall cost and cost-effectiveness of colorectal cancer screening. Published cost estimates vary widely and typically report institutional costs derived from gross-costing methods. Objective: To perform a cost analysis of colonoscopy using micro-costing and time-and-motion techniques to determine the total societal cost of colonoscopy, which includes direct health care costs as well as direct non-health care costs and costs related to patients’ time. The design was a prospective cohort. The participants were 276 contacted, eligible patients who underwent colonoscopy between July 2001 and June 2002, at either a Veterans’ Affairs Medical Center or a University Hospital in the Southeastern United States. Major results: The median direct health care cost for colonoscopy was $379 (25%, 75%; $343, $433). The median direct non-health care and patient time costs were $226 (25%, 75%; $187, $323) and $274 (25%, 75%; $186, $368), respectively. The median total societal cost of colonoscopy was $923 (25%, 75%; $805, $1047). The median direct health care, direct non-health care, patient time costs, and total costs at the VA were $391, $288, $274, and $958, respectively; analogous costs at the University Hospital were $376, $189, $368, and $905, respectively. Conclusion: Microcosting techniques and time-and-motion studies can produce accurate, detailed cost estimates for complex medical interventions. Cost estimates that inform health policy decisions or cost-effectiveness analyses should use total costs from the societal perspective. Societal cost estimates, which include patient and caregiver time costs, may affect colonoscopy screening rates. PMID:17665271

  9. Modular space station phase B extension program cost and schedules. Volume 1: Cost and schedule estimating process and results

    NASA Technical Reports Server (NTRS)

    Frassinelli, G. J.

    1972-01-01

    Cost estimates and funding schedules are presented for a given configuration and costing ground rules. Cost methodology is described and the cost evolution from a baseline configuration to a selected configuration is given, emphasizing cases in which cost was a design driver. Programmatic cost avoidance techniques are discussed.

  10. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    Parametric cost estimating methods for space systems in the conceptual design phase are developed. The approach is to identify variables that drive cost such as weight, quantity, development culture, design inheritance, and time. The relationship between weight and cost is examined in detail. A theoretical model of cost is developed and tested statistically against a historical data base of major research and development programs. It is concluded that the technique presented is sound, but that it must be refined in order to produce acceptable cost estimates.
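
The weight-to-cost relationship the abstract examines is typically captured as a power-law cost estimating relationship (CER), cost = a * weight^b, fit by least squares in log space. A minimal sketch, with invented historical data points:

```python
import numpy as np

# invented historical programs: (dry weight in kg, development cost in $M)
weight = np.array([500.0, 1200.0, 3000.0, 7500.0, 15000.0])
cost = np.array([40.0, 75.0, 150.0, 290.0, 480.0])

# fit log(cost) = log(a) + b * log(weight) by ordinary least squares
b, log_a = np.polyfit(np.log(weight), np.log(cost), 1)
a = np.exp(log_a)

def cer(w):
    """Predicted development cost ($M) for a concept weighing w kg."""
    return a * w ** b

print(f"cost ≈ {a:.2f} * weight^{b:.2f}")
print(f"5000 kg concept -> ${cer(5000.0):.0f}M")
```

Statistical testing against a real historical database, as the abstract describes, would add residual diagnostics and confidence bounds on a and b.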

  11. 77 FR 15004 - Updating of Employer Identification Numbers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-14

    ... the IRS, including whether the information will have practical utility; The accuracy of the estimated... techniques or other forms of information technology; and Estimates of capital or start-up costs and costs of... respondents are persons that have an EIN. Estimated total annual reporting burden: 403,177 hours. Estimated...

  12. Comparative evaluation of features and techniques for identifying activity type and estimating energy cost from accelerometer data

    PubMed Central

    Kate, Rohit J.; Swartz, Ann M.; Welch, Whitney A.; Strath, Scott J.

    2016-01-01

    Wearable accelerometers can be used to objectively assess physical activity. However, the accuracy of this assessment depends on the underlying method used to process the time series data obtained from accelerometers. Several methods have been proposed that use this data to identify the type of physical activity and estimate its energy cost. Most of the newer methods employ some machine learning technique along with suitable features to represent the time series data. This paper experimentally compares several of these techniques and features on a large dataset of 146 subjects performing eight different physical activities while wearing an accelerometer on the hip. Besides features based on statistics, distance-based features and simple discrete features taken straight from the time series were also evaluated. On the physical activity type identification task, the results show that using more features significantly improves results. Choice of machine learning technique was also found to be important. However, on the energy cost estimation task, choice of features and machine learning technique were found to be less influential. On that task, separate energy cost estimation models trained specifically for each type of physical activity were found to be more accurate than a single model trained for all types of physical activities. PMID:26862679
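
As a toy illustration of the pipeline the paper compares (statistical features computed per accelerometer window, then a classifier for activity type), here is a minimal sketch. The two simulated "activities", the 30 Hz sampling rate, and the nearest-centroid classifier are stand-ins, not the paper's data or methods:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(freq_hz, amp, n_windows, win=100, fs=30.0):
    """Invented accelerometer windows: a sinusoid plus sensor noise."""
    t = np.arange(win) / fs
    return amp * np.sin(2 * np.pi * freq_hz * t) + rng.normal(0.0, 0.2, (n_windows, win))

def features(windows):
    # per-window statistics: mean, std, 10th and 90th percentiles
    return np.column_stack([
        windows.mean(axis=1),
        windows.std(axis=1),
        np.percentile(windows, 10, axis=1),
        np.percentile(windows, 90, axis=1),
    ])

walk = features(simulate(2.0, 1.0, 200))   # "walking": brisk 2 Hz motion
sit = features(simulate(0.5, 0.1, 200))    # "sitting": little movement
X = np.vstack([walk, sit])
y = np.array([0] * 200 + [1] * 200)

# nearest-centroid classification of activity type from the features
centroids = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((X[:, None, :] - centroids) ** 2).sum(axis=2), axis=1)
accuracy = (pred == y).mean()
print(accuracy)
```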

  13. Systems engineering and integration: Cost estimation and benefits analysis

    NASA Technical Reports Server (NTRS)

    Dean, ED; Fridge, Ernie; Hamaker, Joe

    1990-01-01

    Space Transportation Avionics hardware and software cost has traditionally been estimated in Phase A and B using cost techniques which predict cost as a function of various cost predictive variables such as weight, lines of code, functions to be performed, quantities of test hardware, quantities of flight hardware, design and development heritage, complexity, etc. The output of such analyses has been life cycle costs, economic benefits and related data. The major objectives of Cost Estimation and Benefits analysis are twofold: (1) to play a role in the evaluation of potential new space transportation avionics technologies, and (2) to benefit from emerging technological innovations. Both aspects of cost estimation and technology are discussed here. The role of cost analysis in the evaluation of potential technologies should be one of offering additional quantitative and qualitative information to aid decision-making. The cost analysis process needs to be fully integrated into the design process in such a way that cost trades, optimizations and sensitivities are understood. Current hardware cost models tend to primarily use weights, functional specifications, quantities, design heritage and complexity as metrics to predict cost. Software models mostly use functionality, volume of code, heritage and complexity as cost descriptive variables. Basic research needs to be initiated to develop metrics more responsive to the trades which are required for future launch vehicle avionics systems. These would include cost estimating capabilities that are sensitive to technological innovations such as improved materials and fabrication processes, computer aided design and manufacturing, self checkout and many others. In addition to basic cost estimating improvements, the process must be sensitive to the fact that no cost estimate can be quoted without also quoting a confidence associated with the estimate.
In order to achieve this, better cost risk evaluation techniques are needed as well as improved usage of risk data by decision-makers. More and better ways to display and communicate cost and cost risk to management are required.

  14. Costing behavioral interventions: a practical guide to enhance translation.

    PubMed

    Ritzwoller, Debra P; Sukhanova, Anna; Gaglio, Bridget; Glasgow, Russell E

    2009-04-01

    Cost and cost effectiveness of behavioral interventions are critical parts of dissemination and implementation into non-academic settings. Due to the lack of indicative data and policy makers' increasing demands for both program effectiveness and efficiency, cost analyses can serve as valuable tools in the evaluation process. To stimulate and promote broader use of practical techniques that can be used to efficiently estimate the implementation costs of behavioral interventions, we propose a set of analytic steps that can be employed across a broad range of interventions. Intervention costs must be distinguished from research, development, and recruitment costs. The inclusion of sensitivity analyses is recommended to understand the implications of implementation of the intervention into different settings using different intervention resources. To illustrate these procedures, we use data from a smoking reduction practical clinical trial to describe the techniques and methods used to estimate and evaluate the costs associated with the intervention. Estimated intervention costs per participant were $419, with a range of $276 to $703, depending on the number of participants.

  15. Manned Mars mission cost estimate

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph; Smith, Keith

    1986-01-01

    The potential costs of several options of a manned Mars mission are examined. A cost estimating methodology based primarily on existing Marshall Space Flight Center (MSFC) parametric cost models is summarized. These models include the MSFC Space Station Cost Model and the MSFC Launch Vehicle Cost Model as well as other models and techniques. The ground rules and assumptions of the cost estimating methodology are discussed and cost estimates presented for six potential mission options which were studied. The estimated manned Mars mission costs are compared to the cost of the somewhat analogous Apollo Program after normalizing the Apollo cost to the environment and ground rules of the manned Mars missions. It is concluded that a manned Mars mission, as currently defined, could be accomplished for under $30 billion in 1985 dollars excluding launch vehicle development and mission operations.

  16. Readings in program control

    NASA Technical Reports Server (NTRS)

    Hoban, Francis T. (Editor); Lawbaugh, William M. (Editor); Hoffman, Edward J. (Editor)

    1994-01-01

    Under the heading of Program Control, a number of related topics are discussed: cost estimating methods; planning and scheduling; cost overruns in the defense industry; the history of estimating; the advantages of cost plus award fee contracts; and how program control techniques led to the success of a NASA development project.

  17. Selected Tether Applications Cost Model

    NASA Technical Reports Server (NTRS)

    Keeley, Michael G.

    1988-01-01

    Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.

  18. A novel methodology for estimating upper limits of major cost drivers for profitable conceptual launch system architectures

    NASA Astrophysics Data System (ADS)

    Rhodes, Russel E.; Byrd, Raymond J.

    1998-01-01

    This paper presents a "back of the envelope" technique for fast, timely, on-the-spot assessment of affordability (profitability) of commercial space transportation architectural concepts. The tool presented here is not intended to replace conventional, detailed costing methodology. The process described enables "quick look" estimations and assumptions to effectively determine whether an initial concept (with its attendant cost estimating line items) provides focus for major leapfrog improvement. The Cost Charts Users Guide provides a generic sample tutorial, building an approximate understanding of the basic launch system cost factors and their representative magnitudes. This process will enable the user to develop a net "cost (and price) per payload-mass unit to orbit" incorporating a variety of significant cost drivers, supplemental to basic vehicle cost estimates. If acquisition cost and recurring cost factors (as a function of cost per payload-mass unit to orbit) do not meet the predetermined system-profitability goal, the concept in question will be clearly seen as non-competitive. Multiple analytical approaches, and applications of a variety of interrelated assumptions, can be examined in a quick, on-the-spot cost approximation analysis as this tool has inherent flexibility. The technique will allow determination of concept conformance to system objectives.

  19. Solid rocket motor cost model

    NASA Technical Reports Server (NTRS)

    Harney, A. G.; Raphael, L.; Warren, S.; Yakura, J. K.

    1972-01-01

    A systematic and standardized procedure for estimating life cycle costs of solid rocket motor booster configurations is presented. The model consists of clearly defined cost categories and appropriate cost equations in which cost is related to program and hardware parameters. Cost estimating relationships are generally based on analogous experience. In this model the experience drawn on is from estimates prepared by the study contractors. Contractors' estimates are derived by means of engineering estimates for some predetermined level of detail of the SRM hardware and program functions of the system life cycle. This method is frequently referred to as bottom-up. A parametric cost analysis is a useful technique when rapid estimates are required. This is particularly true during the planning stages of a system when hardware designs and program definition are conceptual and constantly changing as the selection process, which includes cost comparisons or trade-offs, is performed. The use of cost estimating relationships also facilitates the performance of cost sensitivity studies in which relative and comparable cost comparisons are significant.

  20. Cost estimation: An expert-opinion approach. [cost analysis of research projects using the Delphi method (forecasting)

    NASA Technical Reports Server (NTRS)

    Buffalano, C.; Fogleman, S.; Gielecki, M.

    1976-01-01

    A methodology is outlined which can be used to estimate the costs of research and development projects. The approach uses the Delphi technique, a method developed by the RAND Corporation for systematically eliciting and evaluating group judgments in an objective manner. The use of the Delphi technique allows for the integration of expert opinion into the cost-estimating process in a consistent and rigorous fashion. This approach can also signal potential cost-problem areas. This result can be a useful tool in planning additional cost analysis or in estimating contingency funds. A Monte Carlo approach is also examined.
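
The Delphi-plus-Monte-Carlo combination can be sketched as follows: pooled expert (low, most likely, high) judgments per task feed triangular distributions, and simulation yields a total-cost distribution from which a contingency figure can be read off. The task names and numbers below are invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# consensus (low, most likely, high) cost judgments in $K after Delphi rounds
tasks = {
    "instrument design": (120, 180, 320),
    "prototype build":   (200, 260, 500),
    "test campaign":     (80, 110, 260),
}

draws = 100_000
total = sum(rng.triangular(lo, mode, hi, draws) for lo, mode, hi in tasks.values())

p50, p80 = np.percentile(total, [50, 80])
contingency = p80 - p50            # funds to hold beyond the median estimate
print(f"median ${p50:.0f}K, 80th pct ${p80:.0f}K, contingency ${contingency:.0f}K")
```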

  1. An analysis of production and costs in high-lead yarding.

    Treesearch

    Magnus E. Tennas; Robert H. Ruth; Carl M. Berntsen

    1955-01-01

    In recent years loggers and timber owners have needed better information for estimating logging costs in the Douglas-fir region. Brandstrom's comprehensive study, published in 1933 (1), has long been used as a guide in making cost estimates. But the use of new equipment and techniques and an overall increase in logging costs have made it increasingly difficult to...

  2. Evaluation of techniques for handling missing cost-to-charge ratios in the USA Nationwide Inpatient Sample: a simulation study.

    PubMed

    Yu, Tzy-Chyi; Zhou, Huanxue

    2015-09-01

    Evaluate performance of techniques used to handle missing cost-to-charge ratio (CCR) data in the USA Healthcare Cost and Utilization Project's Nationwide Inpatient Sample. Four techniques to replace missing CCR data were evaluated: deleting discharges with missing CCRs (complete case analysis), reweighting as recommended by Healthcare Cost and Utilization Project, reweighting by adjustment cells and hot deck imputation by adjustment cells. Bias and root mean squared error of these techniques on hospital cost were evaluated in five disease cohorts. Similar mean cost estimates would be obtained with any of the four techniques when the percentage of missing data is low (<10%). When total cost is the outcome of interest, a reweighting technique to avoid underestimation from dropping observations with missing data should be adopted.
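
Hot deck imputation by adjustment cells, one of the four techniques evaluated above, can be sketched in a few lines: each discharge with a missing cost-to-charge ratio (CCR) receives the CCR of a randomly drawn donor hospital from the same cell. The cells and values below are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

cells = np.array([0, 0, 0, 1, 1, 1, 1, 2, 2, 2])   # bed-size class per hospital
ccr = np.array([0.45, 0.50, np.nan, 0.60, np.nan, 0.62, 0.58, np.nan, 0.30, 0.34])

imputed = ccr.copy()
for c in np.unique(cells):
    in_cell = cells == c
    donors = ccr[in_cell & ~np.isnan(ccr)]          # observed CCRs in the cell
    missing = in_cell & np.isnan(ccr)
    # hot deck: draw a donor value for each missing CCR in the same cell
    imputed[missing] = rng.choice(donors, size=int(missing.sum()))

charges = np.full(len(ccr), 20_000.0)               # toy charge data
est_cost = charges * imputed                        # cost = charge * CCR
print(imputed)
```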

  3. Automated array assembly

    NASA Technical Reports Server (NTRS)

    Williams, B. F.

    1976-01-01

    Manufacturing techniques are evaluated using experience-based expense data and by studying the basic cost factors of each step from a first-principles point of view. A formal cost accounting procedure is developed and used throughout the study for cost comparisons. The first test of this procedure is a comparison of its predicted costs for array module manufacturing with costs from a study based on experience factors. A manufacturing cost estimate for array modules of $10/W is based on present-day manufacturing techniques, expenses, and materials costs.

  4. Review of hardware cost estimation methods, models and tools applied to early phases of space mission planning

    NASA Astrophysics Data System (ADS)

    Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.

    2012-08-01

    The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation on a system level, particularly for early mission phases during which specifications and requirements are not yet crystallised, and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. A first step to any program budget is a representative cost estimate which usually hinges on a particular estimation approach, or methodology. Therefore appropriate selection of specific cost models, methods and tools is paramount, a difficult task given the highly variable nature, scope as well as scientific and technical requirements applicable to each program. Numerous methods, models and tools exist. However new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase when system specifications are limited, but the available research budget needs to be established and defined. Due to their specificity, for vehicles such as reusable launchers with a manned capability, a lack of historical data implies that using either the classic heuristic approach such as parametric cost estimation based on underlying CERs, or the analogy approach, is therefore, by definition, limited. This review identifies prominent cost estimation models applied to the space sector, and their underlying cost driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted. 
Current approaches which strategically amalgamate various cost estimation strategies both for formulation and validation of an estimate, and techniques and/or methods to attain representative and justifiable cost estimates are consequently discussed. Ultimately, the aim of the paper is to establish a baseline for development of a non-commercial, low cost, transparent cost estimation methodology to be applied during very early program research phases at a complete vehicle system level, for largely unprecedented manned launch vehicles in the future. This paper takes the first step to achieving this through the identification, analysis and understanding of established, existing techniques, models, tools and resources relevant within the space sector.

  5. Cotton yield estimation using very high-resolution digital images acquired on a low-cost small unmanned aerial vehicle

    USDA-ARS?s Scientific Manuscript database

    Yield estimation is a critical task in crop management. A number of traditional methods are available for crop yield estimation but they are costly, time-consuming and difficult to expand to a relatively large field. Remote sensing provides techniques to develop quick coverage over a field at any sc...

  6. Estimating plant biomass in early-successional subtropical vegetation using a visual obstruction technique

    Treesearch

    Genie M. Fleming; Joseph M. Wunderle; David N. Ewert; Joseph O' Brien

    2014-01-01

    Aim: Non-destructive methods for quantifying above-ground plant biomass are important tools in many ecological studies and management endeavours, but estimation methods can be labour intensive and particularly difficult in structurally diverse vegetation types. We aimed to develop a low-cost, but reasonably accurate, estimation technique within early-successional...

  7. Selected Logistics Models and Techniques.

    DTIC Science & Technology

    1984-09-01

    Table of contents excerpt: TI-59 Programmable Calculator LCC Program (p. 27); TI-59 Programmable Calculator LCC Model (p. 30); Unmanned Spacecraft Cost Model (p. 31). LOGISTICS ANALYSIS MODEL/TECHNIQUE DATA. MODEL/TECHNIQUE NAME: TI-59 Programmable Calculator LCC Model. TYPE MODEL: Cost Estimating. DEVELOPED BY:

  8. Evaluating cost-efficiency and accuracy of hunter harvest survey designs

    USGS Publications Warehouse

    Lukacs, P.M.; Gude, J.A.; Russell, R.E.; Ackerman, B.B.

    2011-01-01

    Effective management of harvested wildlife often requires accurate estimates of the number of animals harvested annually by hunters. A variety of techniques exist to obtain harvest data, such as hunter surveys, check stations, mandatory reporting requirements, and voluntary reporting of harvest. Agencies responsible for managing harvested wildlife such as deer (Odocoileus spp.), elk (Cervus elaphus), and pronghorn (Antilocapra americana) are challenged with balancing the cost of data collection versus the value of the information obtained. We compared precision, bias, and relative cost of several common strategies, including hunter self-reporting and random sampling, for estimating hunter harvest using a realistic set of simulations. Self-reporting with a follow-up survey of hunters who did not report produces the best estimate of harvest in terms of precision and bias, but it is also, by far, the most expensive technique. Self-reporting with no follow-up survey risks very large bias in harvest estimates, and the cost increases with increased response rate. Probability-based sampling provides a substantial cost savings, though accuracy can be affected by nonresponse bias. We recommend stratified random sampling with a calibration estimator used to reweight the sample based on the proportions of hunters responding in each covariate category as the best option for balancing cost and accuracy. © 2011 The Wildlife Society.
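
The recommended calibration estimator can be sketched as poststratification: respondent weights are scaled so each covariate category recovers its known population share, correcting differential nonresponse. The population size, response rates, and harvest rates below are all invented:

```python
import numpy as np

rng = np.random.default_rng(5)
N = 50_000                                     # licensed hunters (invented)
resident = rng.random(N) < 0.8                 # known covariate category
# residents harvest at 20%, nonresidents at 60% (invented rates)
harvest = np.where(resident, rng.random(N) < 0.2, rng.random(N) < 0.6)

sample = rng.choice(N, 5_000, replace=False)   # random sample of hunters
# nonresidents respond far less often -> naive expansion is biased
respond_p = np.where(resident[sample], 0.8, 0.2)
resp = sample[rng.random(5_000) < respond_p]   # indices of respondents

naive = harvest[resp].mean() * N               # ignores differential nonresponse

# calibration: weight respondents so each category totals its known size
w = np.empty(len(resp))
for cat in (True, False):
    in_cat = resident[resp] == cat
    w[in_cat] = (resident == cat).sum() / in_cat.sum()
est = (w * harvest[resp]).sum()                # calibrated harvest total

true_total = harvest.sum()
print(int(naive), int(est), int(true_total))
```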

  9. Estimating Software-Development Costs With Greater Accuracy

    NASA Technical Reports Server (NTRS)

    Baker, Dan; Hihn, Jairus; Lum, Karen

    2008-01-01

    COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their project. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development-effort-estimation techniques is used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performances of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
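
The exhaustive search over model parameters that the abstract describes can be illustrated with a much smaller stand-in: tuning a COCOMO-form effort model, effort = a * KLOC^b, by grid search against historical projects, scoring each candidate by mean magnitude of relative error (MMRE). The project data are invented, and this is not the 2cee methodology itself:

```python
import numpy as np

# invented historical projects: size (KLOC) and actual effort (person-months)
kloc = np.array([10.0, 25.0, 60.0, 120.0, 300.0])
effort = np.array([35.0, 95.0, 250.0, 530.0, 1400.0])

# exhaustive grid search over (a, b), scored by MMRE
best = (np.inf, None)
for a in np.arange(1.0, 6.0, 0.05):
    for b in np.arange(0.8, 1.4, 0.01):
        pred = a * kloc ** b
        mmre = np.mean(np.abs(pred - effort) / effort)
        if mmre < best[0]:
            best = (mmre, (a, b))

mmre, (a, b) = best
print(f"effort ≈ {a:.2f} * KLOC^{b:.2f}, MMRE = {mmre:.3f}")
```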

  10. Standard cost elements for technology programs

    NASA Technical Reports Server (NTRS)

    Christensen, Carisa B.; Wagenfuehrer, Carl

    1992-01-01

    The suitable structure for an effective and accurate cost estimate for general purposes is discussed in the context of a NASA technology program. Cost elements are defined for research, management, and facility-construction portions of technology programs. Attention is given to the mechanisms for ensuring the viability of spending programs, and the need for program managers to effect timely fund disbursement is established. Formal, structured, and intuitive techniques are discussed for cost-estimate development, and cost-estimate defensibility can be improved with increased documentation. NASA policies for cash management are examined to demonstrate the importance of the ability to obligate funds and the ability to cost contracted funds. The NASA approach to consistent cost justification is set forth with a list of standard cost-element definitions. The cost elements reflect the three primary concerns of cost estimates: the identification of major assumptions, the specification of secondary analytic assumptions, and the status of program factors.

  11. An activity-based methodology for operations cost analysis

    NASA Technical Reports Server (NTRS)

    Korsmeyer, David; Bilby, Curt; Frizzell, R. A.

    1991-01-01

    This report describes an activity-based cost estimation method, proposed for the Space Exploration Initiative (SEI), as an alternative to NASA's traditional mass-based cost estimation method. A case study demonstrates how the activity-based cost estimation technique can be used to identify the operations that have a significant impact on costs over the life cycle of the SEI. The case study yielded an operations cost of $101 billion for the 20-year span of the lunar surface operations for the Option 5a program architecture. In addition, the results indicated that the support and training costs for the missions were the greatest contributors to the annual cost estimates. A cost-sensitivity analysis of the cultural and architectural drivers determined that the length of training and the amount of support associated with the ground support personnel for mission activities are the most significant cost contributors.
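
An activity-based estimate rolls costs up from priced activities rather than from hardware mass. A minimal sketch, with invented activities, hours, and rates (not the SEI case-study figures):

```python
# invented activities: (name, labor hours, hourly rate $, materials cost $)
activities = [
    ("crew training",      120_000,  85.0, 2.0e6),
    ("mission support",     90_000,  95.0, 1.5e6),
    ("surface operations",  60_000, 110.0, 4.0e6),
]

def activity_cost(hours, rate, materials):
    # cost is driven by what is done, not by hardware mass
    return hours * rate + materials

by_activity = {name: activity_cost(h, r, m) for name, h, r, m in activities}
total = sum(by_activity.values())
driver = max(by_activity, key=by_activity.get)   # largest cost contributor
print(f"total ${total:,.0f}; biggest driver: {driver}")
```

Ranking activities by contribution, as in the last line, is what lets this style of analysis identify cost drivers such as the training and support costs the abstract highlights.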

  12. Productivity and cost estimators for conventional ground-based skidding on steep terrain using preplanned skid roads

    Treesearch

    Michael D. Erickson; Curt C. Hassler; Chris B. LeDoux

    1991-01-01

    Continuous time and motion study techniques were used to develop productivity and cost estimators for the skidding component of ground-based logging systems operating on steep terrain using preplanned skid roads. Productivity and costs were compared for an overland random-access skidding method versus a skidding method utilizing a network of preplanned...

  13. Low-complexity DOA estimation from short data snapshots for ULA systems using the annihilating filter technique

    NASA Astrophysics Data System (ADS)

    Bellili, Faouzi; Amor, Souheib Ben; Affes, Sofiène; Ghrayeb, Ali

    2017-12-01

    This paper addresses the problem of DOA estimation using uniform linear array (ULA) antenna configurations. We propose a new low-cost method for estimating multiple DOAs from very short data snapshots. The new estimator is based on the annihilating filter (AF) technique. It is non-data-aided (NDA) and therefore does not impinge on the overall throughput of the system. The noise components are assumed temporally and spatially white across the receiving antenna elements. The transmitted signals are also assumed temporally and spatially white across the transmitting sources. The new method is compared in performance to the Cramér-Rao lower bound (CRLB), the root-MUSIC algorithm, the deterministic maximum likelihood estimator, and another Bayesian method developed specifically for the single-snapshot case. Simulations show that the new estimator performs well over a wide SNR range. Most notably, the main advantage of the new AF-based method is that it accurately estimates the DOAs from short data snapshots, and even from a single snapshot, outperforming state-of-the-art techniques in both DOA estimation accuracy and computational cost.
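    The annihilating-filter idea can be illustrated for a noiseless single snapshot: the ULA response is a sum of complex exponentials, so a short FIR filter whose roots sit at the sources' spatial frequencies annihilates it. This is a Prony-type sketch under invented conditions (array size, angles, amplitudes), not the paper's full estimator.

```python
import numpy as np

# Annihilating-filter DOA sketch from ONE noiseless snapshot of an
# N-element half-wavelength ULA. All sizes and angles are illustrative.
N, K = 8, 2                        # antennas, sources
theta = np.deg2rad([10.0, -25.0])  # true DOAs
mu = np.pi * np.sin(theta)         # spatial frequencies
a = np.array([1.0, 0.8])           # source amplitudes
n = np.arange(N)
x = np.exp(1j * np.outer(n, mu)) @ a   # single snapshot, no noise

# The filter h (length K+1) annihilates x: sum_l h[l]*x[m-l] = 0
# for m = K..N-1. Stack those equations row by row.
rows = np.array([x[m - np.arange(K + 1)] for m in range(K, N)])
# Fix h[0] = 1 and solve least squares for the remaining coefficients.
h_rest, *_ = np.linalg.lstsq(rows[:, 1:], -rows[:, 0], rcond=None)
h = np.concatenate(([1.0], h_rest))

roots = np.roots(h)                # roots lie at exp(j*mu_k)
est = np.sort(np.rad2deg(np.arcsin(np.angle(roots) / np.pi)))
print(est)                         # approx [-25., 10.]
```

    With noise, the least-squares step would be replaced by a more robust fit (e.g., total least squares), but the root-finding structure is the same.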

  14. Evaluation of Methodology for Estimating the Cost of Air Force On-The-Job Training. Final Report.

    ERIC Educational Resources Information Center

    Samers, Bernard N.; And Others

    Described is the final phase of a study directed at the development of an on-the-job training (OJT) costing methodology. Utilizing a modification of survey techniques tested and evaluated during the previous phase, estimates were obtained for the cost of OJT for airman training from the 1-level (unskilled) to the 3-level (semiskilled) in five…

  15. The Role of Inflation and Price Escalation Adjustments in Properly Estimating Program Costs: F-35 Case Study

    DTIC Science & Technology

    2016-03-01

    ... regression models that yield hedonic price indexes is closely related to standard techniques for developing cost estimating relationships (CERs). ... derives a price index from the coefficients on variables reflecting the year of purchase. In CER development, the ... index. The relevant cost metric in both cases is unit recurring flyaway (URF) cost. For the current project, we develop a "Baseline" CER model, taking ...

  16. Wavelet Analyses of Oil Prices, USD Variations and Impact on Logistics

    NASA Astrophysics Data System (ADS)

    Melek, M.; Tokgozlu, A.; Aslan, Z.

    2009-07-01

    This paper examines temporal variations in historical oil prices and in US dollar and euro exchange rates in Turkey. Daily data from OECD and Central Bank of Turkey records, beginning in 1946, are considered. 1D continuous wavelet and wavelet packet analysis techniques were applied to the data. Wavelet techniques help detect abrupt changes and increasing and decreasing trends in data. Estimates of the variables are presented using linear regression techniques. The results of this study are compared with small- and large-scale effects. Truck transportation costs show a variation similar to that of fuel prices. The second part of the paper estimates imports, exports, costs, the total number of vehicles, and their annual variations by considering the temporal variation of oil prices and the dollar exchange rate in Turkey. Wavelet techniques offer a user-friendly methodology for interpreting some local effects on the increasing trend of import and export data.
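    The change-detection role of wavelets described above can be illustrated with a single-level Haar transform, a simple stand-in for the 1D analyses used in the paper: large detail coefficients flag abrupt jumps in a series such as daily prices. The price series below is invented.

```python
import numpy as np

# Single-level Haar transform: the approximation channel smooths the
# series, the detail channel isolates local changes. A price jump shows
# up as a large-magnitude detail coefficient.
def haar_step(x):
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # smoothed trend
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # local changes
    return approx, detail

prices = np.array([10, 10, 10, 14, 14, 14, 14, 14], dtype=float)
_, d = haar_step(prices)
print(np.argmax(np.abs(d)))  # 1 -> the jump falls inside the second pair
```

    Repeating the step on the approximation channel gives the multi-level decomposition that separates small-scale from large-scale effects.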

  17. Estimating acreage by double sampling using LANDSAT data

    NASA Technical Reports Server (NTRS)

    Pont, F.; Horwitz, H.; Kauth, R. (Principal Investigator)

    1982-01-01

    Double sampling techniques employing LANDSAT data for estimating the acreage of corn and soybeans were investigated and evaluated. The evaluation was based on estimated costs and correlations between two existing procedures having differing cost/variance characteristics, and included consideration of their individual merits when coupled with a fictional 'perfect' procedure of zero bias and variance. Two features of the analysis are: (1) the simultaneous estimation of two or more crops; and (2) the imposition of linear cost constraints among two or more types of resource. A reasonably realistic operational scenario was postulated. The costs were estimated from current experience with the measurement procedures involved, and the correlations were estimated from a set of 39 LACIE-type sample segments located in the U.S. Corn Belt. For a fixed variance of the estimate, double sampling with the two existing LANDSAT measurement procedures can result in a 25% or 50% cost reduction. Double sampling that included the fictional perfect procedure resulted in a more cost-effective combination when it was used with the lower-cost/higher-variance representative of the existing procedures.
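    The double-sampling idea can be sketched with the standard regression estimator: a cheap proxy (here standing in for a LANDSAT-derived measurement) is observed on a large sample, the expensive ground truth only on a subsample, and the regression estimator corrects the subsample mean of the truth using the full-sample mean of the proxy. The data-generating process and parameters below are synthetic.

```python
import numpy as np

# Double sampling with a regression estimator (illustrative).
rng = np.random.default_rng(0)
n_big, n_sub = 2000, 200
x_big = rng.normal(50.0, 10.0, n_big)          # cheap proxy, all segments
idx = rng.choice(n_big, n_sub, replace=False)  # expensive ground-truth subsample
x_sub = x_big[idx]
y_sub = 1.2 * x_sub + rng.normal(0.0, 2.0, n_sub)  # "true" acreage

# Slope of y on x from the subsample, then shift the subsample mean of y
# by the gap between the full-sample and subsample proxy means.
b = np.cov(x_sub, y_sub)[0, 1] / np.var(x_sub, ddof=1)
y_hat = y_sub.mean() + b * (x_big.mean() - x_sub.mean())
print(round(y_hat, 2))
```

    The cost/variance trade-off the paper analyzes corresponds to choosing n_big and n_sub under a budget constraint: the higher the proxy-truth correlation, the smaller the expensive subsample can be for a fixed variance.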

  18. On a Formal Tool for Reasoning About Flight Software Cost Analysis

    NASA Technical Reports Server (NTRS)

    Spagnuolo, John N., Jr.; Stukes, Sherry A.

    2013-01-01

    A report focuses on the development of flight software (FSW) cost estimates for 16 Discovery-class missions at JPL. The techniques and procedures developed enabled streamlining of the FSW analysis process, and provided instantaneous confirmation that the data and processes used for these estimates were consistent across all missions. The research provides direction as to how to build a prototype rule-based system for FSW cost estimation that would provide (1) FSW cost estimates, (2) explanation of how the estimates were arrived at, (3) mapping of costs, (4) mathematical trend charts with explanations of why the trends are what they are, (5) tables with ancillary FSW data of interest to analysts, (6) a facility for expert modification/enhancement of the rules, and (7) a basis for conceptually convenient expansion into more complex, useful, and general rule-based systems.

  19. Space transfer vehicle concepts and requirements study. Volume 3, book 1: Program cost estimates

    NASA Technical Reports Server (NTRS)

    Peffley, Al F.

    1991-01-01

    The Space Transfer Vehicle (STV) Concepts and Requirements Study cost estimate and program planning analysis is presented. The cost estimating technique used to support STV system, subsystem, and component cost analysis is a mixture of parametric cost estimating and selective cost analogy approaches. The parametric cost analysis is aimed at developing cost-effective aerobrake, crew module, tank module, and lander designs with the parametric cost estimates data. This is accomplished using cost as a design parameter in an iterative process with conceptual design input information. The parametric estimating approach segregates costs by major program life cycle phase (development, production, integration, and launch support). These phases are further broken out into major hardware subsystems, software functions, and tasks according to the STV preliminary program work breakdown structure (WBS). The WBS is defined to a low enough level of detail by the study team to highlight STV system cost drivers. This level of cost visibility provided the basis for cost sensitivity analysis against various design approaches aimed at achieving a cost-effective design. The cost approach, methodology, and rationale are described. A chronological record of the interim review material relating to cost analysis is included along with a brief summary of the study contract tasks accomplished during that period of review and the key conclusions or observations identified that relate to STV program cost estimates. The STV life cycle costs are estimated on the proprietary parametric cost model (PCM) with inputs organized by a project WBS. Preliminary life cycle schedules are also included.
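    Parametric approaches of the kind described typically rest on cost estimating relationships (CERs) of power-law form, with costs rolled up along the WBS. The coefficients below are invented for illustration, not values from the proprietary PCM.

```python
# A generic power-law CER: cost = a * (mass ** b). The coefficients are
# purely illustrative; real CERs are fit to historical program data.
def cer_cost(mass_kg, a=0.5, b=0.7):
    """First-unit cost in $M for a subsystem of the given mass."""
    return a * mass_kg ** b

# Hypothetical WBS roll-up across subsystems (masses in kg, invented).
wbs = {"aerobrake": 900.0, "crew_module": 2500.0, "tank_module": 1200.0}
total = sum(cer_cost(m) for m in wbs.values())
print(round(cer_cost(100.0), 2))  # 0.5 * 100**0.7 = 12.56
```

    Using cost as a design parameter, as the study describes, amounts to iterating this roll-up as conceptual masses change and feeding the deltas back to the designers.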

  20. Estimating the costs of tsetse control options: an example for Uganda.

    PubMed

    Shaw, A P M; Torr, S J; Waiswa, C; Cecchi, G; Wint, G R W; Mattioli, R C; Robinson, T P

    2013-07-01

    Decision-making and financial planning for tsetse control is complex, with a particularly wide range of choices to be made on location, timing, strategy and methods. This paper presents full cost estimates for eliminating or continuously controlling tsetse in a hypothetical area of 10,000km(2) located in south-eastern Uganda. Four tsetse control techniques were analysed: (i) artificial baits (insecticide-treated traps/targets), (ii) insecticide-treated cattle (ITC), (iii) aerial spraying using the sequential aerosol technique (SAT) and (iv) the addition of the sterile insect technique (SIT) to the insecticide-based methods (i-iii). For the creation of fly-free zones and using a 10% discount rate, the field costs per km(2) came to US$283 for traps (4 traps per km(2)), US$30 for ITC (5 treated cattle per km(2) using restricted application), US$380 for SAT and US$758 for adding SIT. The inclusion of entomological and other preliminary studies plus administrative overheads adds substantially to the overall cost, so that the total costs become US$482 for traps, US$220 for ITC, US$552 for SAT and US$993 - 1365 if SIT is added following suppression using another method. These basic costs would apply to trouble-free operations dealing with isolated tsetse populations. Estimates were also made for non-isolated populations, allowing for a barrier covering 10% of the intervention area, maintained for 3 years. Where traps were used as a barrier, the total cost of elimination increased by between 29% and 57% and for ITC barriers the increase was between 12% and 30%. In the case of continuous tsetse control operations, costs were estimated over a 20-year period and discounted at 10%. Total costs per km(2) came to US$368 for ITC, US$2114 for traps, all deployed continuously, and US$2442 for SAT applied at 3-year intervals. 
The lower costs compared favourably with the regular treatment of cattle with prophylactic trypanocides (US$3862 per km(2) assuming four doses per annum at 45 cattle per km(2)). Throughout the study, sensitivity analyses were conducted to explore the impact on cost estimates of different densities of ITC and traps, costs of baseline studies and discount rates. The present analysis highlights the cost differentials between the different intervention techniques, whilst attesting to the significant progress made over the years in reducing field costs. Results indicate that continuous control activities can be cost-effective in reducing tsetse populations, especially where the creation of fly-free zones is challenging and reinvasion pressure high. Copyright © 2013 Food and Agriculture Organization of the United Nations. Published by Elsevier B.V. All rights reserved.
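    The 20-year discounted figures above rest on standard present-value arithmetic. A minimal sketch, using the paper's 10% discount rate but a made-up annual cost:

```python
# Net present value of a constant recurring annual cost, discounted at
# rate r over a horizon of `years`. The annual cost is illustrative.
def npv(annual_cost, years, rate=0.10):
    return sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))

print(round(npv(100.0, 20), 1))  # 100 * 20-year annuity factor = 851.4
```

    The same arithmetic underlies the comparison with prophylactic trypanocides: two recurring-cost streams are only comparable once both are discounted over the same horizon.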

  1. Alternative Strategies for Pricing Home Work Time.

    ERIC Educational Resources Information Center

    Zick, Cathleen D.; Bryant, W. Keith

    1983-01-01

    Discusses techniques for measuring the value of home work time. Estimates obtained using the reservation wage technique are contrasted with market alternative estimates derived with the same data set. Findings suggest that the market alternative cost method understates the true value of a woman's home time to the household. (JOW)

  2. School Cost Functions: A Meta-Regression Analysis

    ERIC Educational Resources Information Center

    Colegrave, Andrew D.; Giles, Margaret J.

    2008-01-01

    The education cost literature includes econometric studies attempting to determine economies of scale, or estimate an optimal school or district size. Not only do their results differ, but the studies use dissimilar data, techniques, and models. To derive value from these studies requires that the estimates be made comparable. One method to do…

  3. Using the Delphi technique in economic evaluation: time to revisit the oracle?

    PubMed

    Simoens, S

    2006-12-01

    Although the Delphi technique has been commonly used as a data source in medical and health services research, its application in economic evaluation of medicines has been more limited. The aim of this study was to describe the methodology of the Delphi technique, to present a case for using the technique in economic evaluation, and to provide recommendations to improve such use. The literature was accessed through MEDLINE focusing on studies discussing the methodology of the Delphi technique and economic evaluations of medicines using the Delphi technique. The Delphi technique can be used to provide estimates of health care resources required and to modify such estimates when making inter-country comparisons. The Delphi technique can also contribute to mapping the treatment process under investigation, to identifying the appropriate comparator to be used, and to ensuring that the economic evaluation estimates cost-effectiveness rather than cost-efficacy. Ideally, economic evaluations of medicines should be based on real-patient data. In the absence of such data, evaluations need to incorporate the best evidence available by employing approaches such as the Delphi technique. Evaluations based on this approach should state the limitations, and explore the impact of the associated uncertainty in the results.
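    A toy sketch of one Delphi round as described: panel estimates of a resource quantity are summarized (median and interquartile range), fed back, and revised until the spread narrows. The panel values below are invented.

```python
import statistics

# Summarize a Delphi round by median and interquartile range.
def summarize(estimates):
    qs = statistics.quantiles(estimates, n=4)
    return statistics.median(estimates), (qs[0], qs[2])

round1 = [10, 14, 8, 30, 12, 11]    # e.g., GP visits per patient-year
med, iqr = summarize(round1)        # fed back to the panel
round2 = [10, 13, 10, 15, 12, 11]   # revised estimates after feedback
print(summarize(round2))            # tighter IQR after feedback
```

    The uncertainty that remains in the final round is exactly what the abstract recommends propagating into the economic evaluation's sensitivity analysis.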

  4. Assessment of transparency of cost estimates in economic evaluations of patient safety programmes.

    PubMed

    Fukuda, Haruhisa; Imanaka, Yuichi

    2009-06-01

    Transparency of costing is essential for decision-makers who require information on the efficiency of a health care programme, because effective decisions depend largely on applicability to their settings. The main objectives of this study were to assess published studies for transparency of cost estimates. We first developed criteria with two axes by reviewing publications dealing with economic evaluations and cost accounting studies: clarification of the scope of costing and accuracy of method evaluating costs. We then performed systematic searches of the literature for studies which estimated prevention costs and assessed the transparency and accuracy of costing based on our criteria. Forty studies met the inclusion criteria. Half of the studies reported data for both the quantity and unit price of programmes in regard to prevention costs. Although 30 studies estimated costs of adverse events, 19 of these described the scope of costing only, and just five studies used a micro-costing method. Among 30 studies that estimated 'gross cost savings' and 'net cost savings', there was a huge discrepancy in labels. Even if a cost study was conducted in accordance with existing techniques of economic evaluation which mostly paid attention to internal validity of cost estimates, without adequate explanation of the process of costing, reproducibility cannot be assured and the study may lose its value as scientific information. This study found that there is tremendous room for improvement.

  5. A study of methods for lowering aerial environmental survey cost

    NASA Technical Reports Server (NTRS)

    Stansberry, J. R.

    1973-01-01

    The results are presented of a study of methods for lowering the cost of environmental aerial surveys. A wide range of low cost techniques were investigated for possible application to current pressing urban and rural problems. The objective of the study is to establish a definition of the technical problems associated with conducting aerial surveys using various low cost techniques, to conduct a survey of equipment which may be used in low cost systems, and to establish preliminary estimates of cost. A set of candidate systems were selected and described for the environmental survey tasks.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, L.T.; Hickey, M.

    This paper summarizes the progress to date by CH2M HILL and the UKAEA in development of a parametric modelling capability for estimating the costs of large nuclear decommissioning projects in the United Kingdom (UK) and Europe. The ability to successfully apply parametric cost estimating techniques will be a key factor in commercial success in the UK and European multi-billion dollar waste management, decommissioning and environmental restoration markets. The most useful parametric models will be those that incorporate individual components representing major elements of work: reactor decommissioning, fuel cycle facility decommissioning, waste management facility decommissioning and environmental restoration. Models must be sufficiently robust to estimate indirect costs and overheads, permit pricing analysis and adjustment, and accommodate the intricacies of international monetary exchange, currency fluctuations and contingency. The development of a parametric cost estimating capability is also a key component in building a forward estimating strategy. The forward estimating strategy will enable the preparation of accurate and cost-effective out-year estimates, even when work scope is poorly defined or as yet indeterminate. Preparation of cost estimates for work outside the organization's current sites, for which detailed measurement is not possible and historical cost data do not exist, will also be facilitated. (authors)

  7. The use of cluster analysis techniques in spaceflight project cost risk estimation

    NASA Technical Reports Server (NTRS)

    Fox, G.; Ebbeler, D.; Jorgensen, E.

    2003-01-01

    Project cost risk is the uncertainty in final project cost, contingent on initial budget, requirements and schedule. For a proposed mission, a dynamic simulation model relying for some of its input on a simple risk elicitation is used to identify and quantify systemic cost risk.
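    A dynamic cost-risk simulation of the general kind mentioned can be sketched as a Monte Carlo roll-up of elicited ranges: each WBS element's cost is drawn from a triangular (low, likely, high) distribution and the totals form an empirical distribution of final project cost. Element names and ranges below are invented.

```python
import numpy as np

# Monte Carlo cost-risk roll-up from elicited triangular ranges.
rng = np.random.default_rng(42)
elements = {                       # (low, mode, high) in $M -- invented
    "payload":    (40.0, 50.0, 80.0),
    "spacecraft": (60.0, 70.0, 110.0),
    "operations": (20.0, 25.0, 45.0),
}
n_trials = 100_000
total = sum(rng.triangular(lo, mode, hi, n_trials)
            for lo, mode, hi in elements.values())
p50, p70 = np.percentile(total, [50, 70])
print(round(p50, 1), round(p70, 1))  # budget at 50% vs 70% confidence
```

    The gap between the 50th- and 70th-percentile totals is one common way to express the contingency implied by the elicitation.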

  8. Transit Project Planning Guidance : Estimation of Transit Supply Parameters

    DOT National Transportation Integrated Search

    1984-04-01

    This report discusses techniques applicable to the estimation of transit vehicle fleet requirements, vehicle-hours and vehicle-miles, and other related transit supply parameters. These parameters are used for estimating operating costs and certain ca...

  9. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver L.

    2016-01-01

    Satellite constellations and Distributed Spacecraft Mission (DSM) architectures offer unique benefits to Earth observation scientists and unique challenges to cost estimators. The Cost and Risk (CR) module of the Tradespace Analysis Tool for Constellations (TAT-C) being developed by NASA Goddard seeks to address some of these challenges by providing a new approach to cost modeling, which aggregates existing Cost Estimating Relationships (CER) from respected sources, cost estimating best practices, and data from existing and proposed satellite designs. Cost estimation through this tool is approached from two perspectives: parametric cost estimating relationships and analogous cost estimation techniques. The dual approach utilized within the TAT-C CR module is intended to address prevailing concerns regarding early design stage cost estimates, and offer increased transparency and fidelity by offering two preliminary perspectives on mission cost. This work outlines the existing cost model, details assumptions built into the model, and explains what measures have been taken to address the particular challenges of constellation cost estimating. The risk estimation portion of the TAT-C CR module is still in development and will be presented in future work. The cost estimate produced by the CR module is not intended to be an exact mission valuation, but rather a comparative tool to assist in the exploration of the constellation design tradespace. Previous work has noted that estimating the cost of satellite constellations is difficult given that no comprehensive model for constellation cost estimation has yet been developed, and as such, quantitative assessment of multiple spacecraft missions has many remaining areas of uncertainty. 
By incorporating well-established CERs with preliminary approaches to these uncertainties, the CR module offers a more complete approach to constellation costing than has previously been available to mission architects or Earth scientists seeking to leverage the capabilities of multiple spacecraft working in support of a common goal.

  10. The costs of turnover in nursing homes.

    PubMed

    Mukamel, Dana B; Spector, William D; Limcangco, Rhona; Wang, Ying; Feng, Zhanlian; Mor, Vincent

    2009-10-01

    Turnover rates in nursing homes have been persistently high for decades, ranging upwards of 100%. The objective was to estimate the net costs associated with turnover of direct care staff in nursing homes. DATA AND SAMPLE: Nine hundred two nursing homes in California in 2005. Data included Medicaid cost reports, the Minimum Data Set, Medicare enrollment files, Census data, and the Area Resource File. We estimated total cost functions, which included the facility turnover rate in addition to exogenous outputs and wages. Instrumental variable limited information maximum likelihood techniques were used for estimation to deal with the endogeneity of turnover and costs. The cost functions exhibited the expected behavior, with initially increasing and then decreasing returns to scale. The ordinary least squares estimate did not show a significant association between costs and turnover. The instrumental variable estimate of turnover costs was negative and significant (P = 0.039). The marginal cost savings associated with a 10% point increase in turnover for an average facility was $167,063 or 2.9% of annual total costs. The net savings associated with turnover offer an explanation for the persistence of this phenomenon over the last decades, despite the many policy initiatives to reduce it. Future policy efforts need to recognize the complex relationship between turnover and costs.
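    The instrumental-variable logic used in the study can be illustrated with a toy two-stage least squares on synthetic data. The variables and data-generating process below are invented; this is a sketch of the general IV idea, not the authors' limited-information maximum likelihood model.

```python
import numpy as np

# Toy 2SLS: turnover is endogenous because an unobserved confounder u
# drives both turnover and costs; an instrument z shifts turnover but
# affects costs only through it.
rng = np.random.default_rng(1)
n = 5000
z = rng.normal(size=n)             # instrument (e.g., local labour market)
u = rng.normal(size=n)             # unobserved confounder
turnover = 0.8 * z + u + rng.normal(size=n)
cost = 2.0 * turnover - 3.0 * u + rng.normal(size=n)  # true effect = 2.0

# Stage 1: project turnover on the instrument.
X1 = np.column_stack([np.ones(n), z])
fitted = X1 @ np.linalg.lstsq(X1, turnover, rcond=None)[0]
# Stage 2: regress cost on the fitted values, purged of u.
X2 = np.column_stack([np.ones(n), fitted])
beta_iv = np.linalg.lstsq(X2, cost, rcond=None)[0][1]
print(round(beta_iv, 2))           # near 2.0; naive OLS is biased downward
```

    This mirrors the abstract's pattern: the naive (OLS) association is distorted by the confounder, while the instrumented estimate recovers the structural effect.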

  11. Process-based Cost Estimation for Ramjet/Scramjet Engines

    NASA Technical Reports Server (NTRS)

    Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John

    2003-01-01

    Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As product design progresses and matures, the lower-level, more detailed cost drivers can be reassessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
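    One decision-analysis ingredient named above, the analytic hierarchy process, reduces to computing the principal eigenvector of a pairwise-comparison matrix and reading off normalized criterion weights. The judgments below are invented for illustration.

```python
import numpy as np

# Minimal AHP weight calculation: pairwise-comparison matrix ->
# principal eigenvector (via power iteration) -> criterion weights.
A = np.array([            # cost vs performance vs schedule (invented)
    [1.0, 3.0, 5.0],      # cost moderately preferred to performance,
    [1/3, 1.0, 3.0],      # strongly preferred to schedule, etc.
    [1/5, 1/3, 1.0],
])
w = np.ones(3)
for _ in range(50):       # power iteration converges to the eigenvector
    w = A @ w
    w /= w.sum()
print(np.round(w, 2))     # roughly [0.64, 0.26, 0.10]
```

    In a trade study like the one described, these weights score each design alternative across cost-risk, performance, and schedule criteria.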

  12. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. 
To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.

  13. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the limitations of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority shortcomings within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. 
To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.

  14. Early‐Stage Capital Cost Estimation of Biorefinery Processes: A Comparative Study of Heuristic Techniques

    PubMed Central

    Couturier, Jean‐Luc; Kokossis, Antonis; Dubois, Jean‐Luc

    2016-01-01

    Abstract Biorefineries offer a promising alternative to fossil‐based processing industries and have undergone rapid development in recent years. Limited financial resources and stringent company budgets necessitate quick capital estimation of pioneering biorefinery projects at the early stages of their conception to screen process alternatives, decide on project viability, and allocate resources to the most promising cases. Biorefineries are capital‐intensive projects that involve state‐of‐the‐art technologies for which there is no prior experience or sufficient historical data. This work reviews existing rapid cost estimation practices, which can be used by researchers with no previous cost estimating experience. It also comprises a comparative study of six cost methods on three well‐documented biorefinery processes to evaluate their accuracy and precision. The results illustrate discrepancies among the methods because their extrapolation on biorefinery data often violates inherent assumptions. This study recommends the most appropriate rapid cost methods and urges the development of an improved early‐stage capital cost estimation tool suitable for biorefinery processes. PMID:27484398
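    One classic rapid-estimation heuristic of the kind such comparative studies review is the capacity-ratio ("six-tenths") rule, which scales a known plant cost to a new capacity. The figures below are illustrative, not from the study.

```python
# Capacity-ratio ("six-tenths") rule: cost scales with the capacity
# ratio raised to an empirical exponent, commonly near 0.6.
def scaled_cost(base_cost, base_cap, new_cap, exponent=0.6):
    return base_cost * (new_cap / base_cap) ** exponent

# Known: a $50M plant at 100 kt/yr -> estimate for a 200 kt/yr plant.
print(round(scaled_cost(50.0, 100.0, 200.0), 1))  # 50 * 2**0.6 = 75.8
```

    The discrepancies the study reports arise in part because such exponents were fit to conventional chemical plants, an inherent assumption that extrapolation to novel biorefinery processes can violate.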

  15. Streamlining the Design Tradespace for Earth Imaging Constellations

    NASA Technical Reports Server (NTRS)

    Nag, Sreeja; Hughes, Steven P.; Le Moigne, Jacqueline J.

    2016-01-01

    Satellite constellations and Distributed Spacecraft Mission (DSM) architectures offer unique benefits to Earth observation scientists and unique challenges to cost estimators. The Cost and Risk (CR) module of the Tradespace Analysis Tool for Constellations (TAT-C) being developed by NASA Goddard seeks to address some of these challenges by providing a new approach to cost modeling, which aggregates existing Cost Estimating Relationships (CERs) from respected sources, cost estimating best practices, and data from existing and proposed satellite designs. Cost estimation through this tool is approached from two perspectives: parametric cost estimating relationships and analogous cost estimation techniques. The dual approach used within the TAT-C CR module is intended to address prevailing concerns regarding early design stage cost estimates, and to offer increased transparency and fidelity by providing two preliminary perspectives on mission cost. This work outlines the existing cost model, details the assumptions built into the model, and explains what measures have been taken to address the particular challenges of constellation cost estimating. The risk estimation portion of the TAT-C CR module is still in development and will be presented in future work. The cost estimate produced by the CR module is not intended to be an exact mission valuation, but rather a comparative tool to assist in exploring the constellation design tradespace. Previous work has noted that estimating the cost of satellite constellations is difficult because no comprehensive model for constellation cost estimation has yet been developed; as a result, quantitative assessment of multiple-spacecraft missions retains many areas of uncertainty. By combining well-established CERs with preliminary approaches to these uncertainties, the CR module offers a more complete approach to constellation costing than has previously been available to mission architects or Earth scientists seeking to leverage the capabilities of multiple spacecraft working in support of a common goal.
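The parametric half of this dual approach can be sketched as a power-law cost estimating relationship fit in log space. The single-driver form and the mass/cost pairs below are invented for illustration; they are not TAT-C's CERs or data:

```python
import numpy as np

# Hypothetical analogous-mission data points (dry mass vs. development cost).
masses = np.array([150.0, 300.0, 600.0, 1200.0])   # spacecraft mass, kg
costs = np.array([20.0, 34.0, 58.0, 98.0])         # development cost, $M

# Fit cost = a * mass^b by linear regression in log-log space.
b, log_a = np.polyfit(np.log(masses), np.log(costs), 1)
a = float(np.exp(log_a))

def cer_estimate(mass_kg):
    """First-order cost estimate ($M) from the fitted power law a * m^b."""
    return a * mass_kg ** b
```

An analogous estimate would instead anchor on the single most similar past mission and adjust for differences; running both, as the CR module does, exposes how sensitive the result is to the chosen method.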

  16. Advanced space power requirements and techniques. Task 1: Mission projections and requirements. Volume 3: Appendices. [cost estimates and computer programs

    NASA Technical Reports Server (NTRS)

    Wolfe, M. G.

    1978-01-01

    Contents: (1) general study guidelines and assumptions; (2) launch vehicle performance and cost assumptions; (3) satellite programs 1959 to 1979; (4) initiative mission and design characteristics; (5) satellite listing; (6) spacecraft design model; (7) spacecraft cost model; (8) mission cost model; and (9) nominal and optimistic budget program cost summaries.

  17. Cost-of-illness studies: concepts, scopes, and methods

    PubMed Central

    2014-01-01

    Liver diseases are one of the main causes of death, and their ever-increasing prevalence is threatening to cause significant damage both to individuals and society as a whole. This damage is especially serious for the economically active population in Korea. From the societal perspective, it is therefore necessary to consider the economic impacts associated with liver diseases, and identify interventions that can reduce the burden of these diseases. The cost-of-illness study is considered to be an essential evaluation technique in health care. By measuring and comparing the economic burdens of diseases to society, such studies can help health-care decision-makers to set up and prioritize health-care policies and interventions. Using economic theories, this paper introduces various study methods that are generally applicable to most disease cases for estimating the costs of illness associated with mortality, morbidity, disability, and other disease characteristics. It also presents concepts and scopes of costs along with different cost categories from different research perspectives in cost estimations. By discussing the epidemiological and economic grounds of the cost-of-illness study, the reported results represent useful information about several evaluation techniques at an advanced level, such as cost-benefit analysis, cost-effectiveness analysis, and cost-utility analysis. PMID:25548737

  18. Cost-of-illness studies: concepts, scopes, and methods.

    PubMed

    Jo, Changik

    2014-12-01

    Liver diseases are one of the main causes of death, and their ever-increasing prevalence is threatening to cause significant damage both to individuals and society as a whole. This damage is especially serious for the economically active population in Korea. From the societal perspective, it is therefore necessary to consider the economic impacts associated with liver diseases, and identify interventions that can reduce the burden of these diseases. The cost-of-illness study is considered to be an essential evaluation technique in health care. By measuring and comparing the economic burdens of diseases to society, such studies can help health-care decision-makers to set up and prioritize health-care policies and interventions. Using economic theories, this paper introduces various study methods that are generally applicable to most disease cases for estimating the costs of illness associated with mortality, morbidity, disability, and other disease characteristics. It also presents concepts and scopes of costs along with different cost categories from different research perspectives in cost estimations. By discussing the epidemiological and economic grounds of the cost-of-illness study, the reported results represent useful information about several evaluation techniques at an advanced level, such as cost-benefit analysis, cost-effectiveness analysis, and cost-utility analysis.

  19. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  20. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  1. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  2. Using computerised patient-level costing data for setting DRG weights: the Victorian (Australia) cost weight studies.

    PubMed

    Jackson, T

    2001-05-01

    Casemix-funding systems for hospital inpatient care require a set of resource weights which will not inadvertently distort patterns of patient care. Few health systems have very good sources of cost information, and specific studies to derive empirical cost relativities are themselves costly. This paper reports a 5 year program of research into the use of data from hospital management information systems (clinical costing systems) to estimate resource relativities for inpatient hospital care used in Victoria's DRG-based payment system. The paper briefly describes international approaches to cost weight estimation. It describes the architecture of clinical costing systems, and contrasts process and job costing approaches to cost estimation. Techniques of data validation and reliability testing developed in the conduct of four of the first five of the Victorian Cost Weight Studies (1993-1998) are described. Improvements in sampling, data validity, and reliability are documented over the course of the research program, and the advantages of patient-level data are highlighted. The usefulness of these byproduct data for estimation of relative resource weights and other policy applications may be an important factor in hospital and health system decisions to invest in clinical costing technology.
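The core computation behind such cost-weight studies can be sketched as follows: the relative weight of each DRG is its mean episode cost divided by the overall mean cost per episode, so the casemix-average weight is 1.0. The DRG codes and costs below are invented, not Victorian data:

```python
from collections import defaultdict

# Illustrative patient-level episodes: (DRG code, episode cost).
episodes = [("A01", 4200.0), ("A01", 3800.0), ("B02", 9500.0),
            ("B02", 10500.0), ("C03", 2100.0), ("C03", 1900.0)]

def drg_cost_weights(records):
    """Relative resource weights: mean cost per DRG divided by the
    overall mean cost per episode."""
    sums, counts = defaultdict(float), defaultdict(int)
    for drg, cost in records:
        sums[drg] += cost
        counts[drg] += 1
    overall_mean = sum(c for _, c in records) / len(records)
    return {d: (sums[d] / counts[d]) / overall_mean for d in sums}

weights = drg_cost_weights(episodes)
```

Patient-level data make this a direct average; without it, weights must be imputed from charge ratios or service statistics, which is where the distortions the paper warns about creep in.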

  3. The costs of turnover in nursing homes

    PubMed Central

    Mukamel, Dana B.; Spector, William D.; Limcangco, Rhona; Wang, Ying; Feng, Zhanlian; Mor, Vincent

    2009-01-01

    Background Turnover rates in nursing homes have been persistently high for decades, ranging upwards of 100%. Objectives To estimate the net costs associated with turnover of direct care staff in nursing homes. Data and sample 902 nursing homes in California in 2005. Data included Medicaid cost reports, the Minimum Data Set (MDS), Medicare enrollment files, Census and Area Resource File (ARF). Research Design We estimated total cost functions, which included in addition to exogenous outputs and wages, the facility turnover rate. Instrumental variable (IV) limited information maximum likelihood techniques were used for estimation to deal with the endogeneity of turnover and costs. Results The cost functions exhibited the expected behavior, with initially increasing and then decreasing returns to scale. The ordinary least square estimate did not show a significant association between costs and turnover. The IV estimate of turnover costs was negative and significant (p=0.039). The marginal cost savings associated with a 10 percentage point increase in turnover for an average facility was $167,063 or 2.9% of annual total costs. Conclusion The net savings associated with turnover offer an explanation for the persistence of this phenomenon over the last decades, despite the many policy initiatives to reduce it. Future policy efforts need to recognize the complex relationship between turnover and costs. PMID:19648834
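The endogeneity problem this study addresses with instrumental variables can be illustrated with a toy just-identified two-stage least squares estimator (the paper itself used limited information maximum likelihood, a related estimator). All data below are synthetic; the instrument and confounder are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)                         # instrument: shifts turnover only
u = rng.normal(size=n)                         # unobserved factor driving both
turnover = 0.8 * z + u + rng.normal(size=n)    # endogenous regressor
costs = 2.0 * turnover - 3.0 * u + rng.normal(size=n)  # true effect = 2.0

X = np.column_stack([np.ones(n), turnover])
Z = np.column_stack([np.ones(n), z])

beta_ols = np.linalg.lstsq(X, costs, rcond=None)[0]   # biased by u
beta_iv = np.linalg.solve(Z.T @ X, Z.T @ costs)       # just-identified 2SLS
```

Because the confounder raises turnover while lowering costs, OLS understates the true coefficient, while the IV estimate recovers it; this is the same logic that lets the paper reverse the naive OLS finding.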

  4. Pesticide Environmental Accounting: a method for assessing the external costs of individual pesticide applications.

    PubMed

    Leach, A W; Mumford, J D

    2008-01-01

    The Pesticide Environmental Accounting (PEA) tool provides a monetary estimate of the environmental and health impacts per hectare-application for any pesticide. The model combines the Environmental Impact Quotient method with a methodology for absolute estimates of external pesticide costs in the UK, USA, and Germany. For many countries, resources are not available for intensive assessments of external pesticide costs, so the model converts the external costs of a pesticide in the UK, USA, and Germany to Mediterranean countries. Economic and policy applications include estimating the impacts of pesticide reduction policies or the benefits from technologies replacing pesticides, such as the sterile insect technique. The system integrates disparate data and approaches into a single logical method. The assumptions in the system provide transparency and consistency, but at the cost of some specificity and precision, a reasonable trade-off for a method that provides both comparative estimates of pesticide impacts and area-based assessments of absolute impacts.

  5. Cost Analysis of Instructional Technology.

    ERIC Educational Resources Information Center

    Johnson, F. Craig; Dietrich, John E.

    Although some serious limitations in the cost analysis technique do exist, the need for cost data in decision making is so great that every effort should be made to obtain accurate estimates. This paper discusses several issues that arise when an attempt is made to make quality, trade-off, or scope decisions based on cost data. Three methods…

  6. Forestry BMP Implementation Costs for Virginia

    Treesearch

    R.M. Shaffer; H.L. Haney; E.G. Worrell; W.M. Aust

    1998-01-01

    Forestry Best Management Practices (BMPs) are operational techniques used to protect water quality during timber harvesting operations. The implementation cost of BMPs is important to loggers, forest landowners, and the forest industry. This study provides an estimate of BMP implementation cost on a per harvested acre basis for the coastal plain, Piedmont, and...

  7. 76 FR 33023 - Agency Information Collection Activity; Proposed Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-07

    ... amount of withholding tax under the provisions of an income tax treaty must disclose its reliance upon a... techniques or other forms of information technology; and (e) estimates of capital or start-up costs and costs...

  8. Evaluation of a cost-effective loads approach. [shock spectra/impedance method for Viking Orbiter

    NASA Technical Reports Server (NTRS)

    Garba, J. A.; Wada, B. K.; Bamford, R.; Trubert, M. R.

    1976-01-01

    A shock spectra/impedance method for loads predictions is used to estimate member loads for the Viking Orbiter, a 7800-lb interplanetary spacecraft that has been designed using transient loads analysis techniques. The transient loads analysis approach leads to a lightweight structure but requires complex and costly analyses. To reduce complexity and cost, a shock spectra/impedance method is currently being used to design the Mariner Jupiter Saturn spacecraft. This method has the advantage of using low-cost in-house loads analysis techniques and typically results in more conservative structural loads. The method is evaluated by comparing the increase in Viking member loads to the loads obtained by the transient loads analysis approach. An estimate of the weight penalty incurred by using this method is presented. The paper also compares the calculated flight loads from the transient loads analyses and the shock spectra/impedance method to measured flight data.

  9. Evaluation of a cost-effective loads approach. [for Viking Orbiter light weight structural design

    NASA Technical Reports Server (NTRS)

    Garba, J. A.; Wada, B. K.; Bamford, R.; Trubert, M. R.

    1976-01-01

    A shock spectra/impedance method for loads prediction is used to estimate member loads for the Viking Orbiter, a 7800-lb interplanetary spacecraft that has been designed using transient loads analysis techniques. The transient loads analysis approach leads to a lightweight structure but requires complex and costly analyses. To reduce complexity and cost a shock spectra/impedance method is currently being used to design the Mariner Jupiter Saturn spacecraft. This method has the advantage of using low-cost in-house loads analysis techniques and typically results in more conservative structural loads. The method is evaluated by comparing the increase in Viking member loads to the loads obtained by the transient loads analysis approach. An estimate of the weight penalty incurred by using this method is presented. The paper also compares the calculated flight loads from the transient loads analyses and the shock spectra/impedance method to measured flight data.

  10. Contractor Accounting, Reporting and Estimating (CARE).

    DTIC Science & Technology

    Contractor Accounting, Reporting and Estimating (CARE) provides check lists that may be used as guides in evaluating the accounting system, financial reporting, and cost estimating capabilities of the contractor. Experience gained from the Management Review Technique was used as a basis for the check lists. (Author)

  11. Cost characteristics of hospitals.

    PubMed

    Smet, Mike

    2002-09-01

    Modern hospitals are complex multi-product organisations. The analysis of a hospital's production and/or cost structure should therefore use the appropriate techniques. Flexible functional forms based on the neo-classical theory of the firm seem to be most suitable. Using neo-classical cost functions implicitly assumes minimisation of (variable) costs given that input prices and outputs are exogenous. Local and global properties of flexible functional forms and short-run versus long-run equilibrium are further issues that require thorough investigation. In order to put the results based on econometric estimations of cost functions in the right perspective, it is important to keep these considerations in mind when using flexible functional forms. The more recent studies seem to agree that hospitals generally do not operate in their long-run equilibrium (they tend to over-invest in capital (capacity and equipment)) and that it is therefore appropriate to estimate a short-run variable cost function. However, few studies explicitly take into account the implicit assumptions and restrictions embedded in the models they use. An alternative method to explain differences in costs uses management accounting techniques to identify the cost drivers of overhead costs. Related issues such as cost-shifting and cost-adjusting behaviour of hospitals and the influence of market structure on competition, prices and costs are also discussed briefly.

  12. Early-Stage Capital Cost Estimation of Biorefinery Processes: A Comparative Study of Heuristic Techniques.

    PubMed

    Tsagkari, Mirela; Couturier, Jean-Luc; Kokossis, Antonis; Dubois, Jean-Luc

    2016-09-08

    Biorefineries offer a promising alternative to fossil-based processing industries and have undergone rapid development in recent years. Limited financial resources and stringent company budgets necessitate quick capital estimation of pioneering biorefinery projects at the early stages of their conception to screen process alternatives, decide on project viability, and allocate resources to the most promising cases. Biorefineries are capital-intensive projects that involve state-of-the-art technologies for which there is no prior experience or sufficient historical data. This work reviews existing rapid cost estimation practices, which can be used by researchers with no previous cost estimating experience. It also comprises a comparative study of six cost methods on three well-documented biorefinery processes to evaluate their accuracy and precision. The results illustrate discrepancies among the methods because their extrapolation on biorefinery data often violates inherent assumptions. This study recommends the most appropriate rapid cost methods and urges the development of an improved early-stage capital cost estimation tool suitable for biorefinery processes. © 2015 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
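One of the classic rapid heuristics reviewed in this literature is capacity-exponent scaling, often called the "six-tenths rule". A minimal sketch, with the caveat that the exponent is process-specific and its misapplication to novel biorefinery equipment is exactly the kind of violated assumption the study flags:

```python
def scale_capital_cost(base_cost, base_capacity, new_capacity, exponent=0.6):
    """Capacity-exponent scaling, C2 = C1 * (S2 / S1) ** n.
    exponent=0.6 is the traditional default; real processes can
    deviate substantially from it."""
    return base_cost * (new_capacity / base_capacity) ** exponent

# Doubling capacity raises cost by 2 ** 0.6, i.e. roughly 52%, not 100%:
# the economies-of-scale assumption baked into the exponent.
doubled = scale_capital_cost(100.0, 50.0, 100.0)
```

Because the exponent is estimated from historical plants, extrapolating it to first-of-a-kind biorefinery units is a primary source of the discrepancies the comparative study reports.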

  13. Performance of US teaching hospitals: a panel analysis of cost inefficiency.

    PubMed

    Rosko, Michael D

    2004-02-01

    This research summarizes an analysis of the impact of environmental pressures on hospital inefficiency during the period 1990-1999. The panel design included 616 hospitals. Of these, 211 were academic medical centers and 415 were hospitals with smaller teaching programs. The primary sources of data were the American Hospital Association's Annual Survey of Hospitals and Medicare Cost Reports. Hospital inefficiency was estimated by a regression technique called stochastic frontier analysis. This technique estimates a "best practice cost frontier" for each hospital that is based on the hospital's outputs and input prices. The cost efficiency of each hospital was defined as the ratio of the stochastic frontier total costs to observed total costs. Average inefficiency declined from 14.35% in 1990 to 11.42% in 1998. It increased to 11.78% in 1999. Decreases in inefficiency were associated with the HMO penetration rate and time. Increases in inefficiency were associated with for-profit ownership status and Medicare share of admissions. The implementation of the provisions of the Balanced Budget Act of 1997 was followed by a small decrease in average hospital inefficiency. Analysis found that the SFA results were moderately sensitive to the specification of the teaching output variable. Thus, although the SFA technique can be useful for detecting differences in inefficiency between groups of hospitals (i.e., those with high versus those with low Medicare shares or for-profit versus not-for-profit hospitals), its relatively low precision indicates it should not be used for exact estimates of the magnitude of differences associated with inefficiency-effects variables.

  14. Application of Boosting Regression Trees to Preliminary Cost Estimation in Building Construction Projects

    PubMed Central

    2015-01-01

    Among the recent data mining techniques available, the boosting approach has attracted a great deal of attention because of its effective learning algorithm and strong boundaries in terms of its generalization performance. However, the boosting approach has yet to be used in regression problems within the construction domain, including cost estimations, but has been actively utilized in other domains. Therefore, a boosting regression tree (BRT) is applied to cost estimations at the early stage of a construction project to examine the applicability of the boosting approach to a regression problem within the construction domain. To evaluate the performance of the BRT model, its performance was compared with that of a neural network (NN) model, which has been proven to have a high performance in cost estimation domains. The BRT model showed results similar to those of the NN model using 234 actual cost datasets from a building construction project. In addition, the BRT model can provide additional information such as the importance plot and structure model, which can support estimators in comprehending the decision-making process. Consequently, the boosting approach has potential applicability in preliminary cost estimations in a building construction project. PMID:26339227

  15. Application of Boosting Regression Trees to Preliminary Cost Estimation in Building Construction Projects.

    PubMed

    Shin, Yoonseok

    2015-01-01

    Among the recent data mining techniques available, the boosting approach has attracted a great deal of attention because of its effective learning algorithm and strong boundaries in terms of its generalization performance. However, the boosting approach has yet to be used in regression problems within the construction domain, including cost estimations, but has been actively utilized in other domains. Therefore, a boosting regression tree (BRT) is applied to cost estimations at the early stage of a construction project to examine the applicability of the boosting approach to a regression problem within the construction domain. To evaluate the performance of the BRT model, its performance was compared with that of a neural network (NN) model, which has been proven to have a high performance in cost estimation domains. The BRT model showed results similar to those of the NN model using 234 actual cost datasets from a building construction project. In addition, the BRT model can provide additional information such as the importance plot and structure model, which can support estimators in comprehending the decision-making process. Consequently, the boosting approach has potential applicability in preliminary cost estimations in a building construction project.
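The boosting idea behind a BRT can be sketched with a toy single-feature gradient booster over one-split regression stumps: each round fits a stump to the current residuals and adds it with a small learning rate. This is illustrative only, not the paper's model:

```python
import numpy as np

def fit_stump(x, residual):
    """Best single-split stump (threshold, left mean, right mean)
    minimizing squared error on the current residuals."""
    best = None
    for threshold in np.unique(x):
        left = residual[x <= threshold]
        right = residual[x > threshold]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, threshold, left.mean(), right.mean())
    return best[1], best[2], best[3]

def boost(x, y, n_rounds=200, lr=0.1):
    """Gradient boosting for squared loss: each stump fits the residuals
    of the current ensemble and is shrunk by the learning rate."""
    pred = np.full_like(y, y.mean())
    stumps = []
    for _ in range(n_rounds):
        t, lv, rv = fit_stump(x, y - pred)
        pred = pred + lr * np.where(x <= t, lv, rv)
        stumps.append((t, lv, rv))
    return y.mean(), lr, stumps

def predict(model, x):
    base, lr, stumps = model
    pred = np.full_like(x, base, dtype=float)
    for t, lv, rv in stumps:
        pred += lr * np.where(x <= t, lv, rv)
    return pred
```

A real BRT uses deeper trees over many cost drivers, but the residual-fitting loop, and the per-feature split statistics that yield the importance plot the paper mentions, are the same mechanism.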

  16. Optical rangefinding applications using communications modulation technique

    NASA Astrophysics Data System (ADS)

    Caplan, William D.; Morcom, Christopher John

    2010-10-01

    A novel range detection technique combines optical pulse modulation patterns with signal cross-correlation to produce an accurate range estimate from low power signals. The cross-correlation peak is analyzed by a post-processing algorithm such that the phase delay is proportional to the range to target. This technique produces a stable range estimate from noisy signals. The advantage is higher accuracy obtained with relatively low optical power transmitted. The technique is useful for low cost, low power and low mass sensors suitable for tactical use. The signal coding technique allows applications including IFF and battlefield identification systems.
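The cross-correlation ranging technique can be sketched as follows: correlate the received signal against the transmitted modulation pattern, take the lag of the correlation peak as the round-trip delay, and convert to range. The sample rate, pattern length, attenuation, and noise level are invented for illustration:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def estimate_range(tx, rx, sample_rate_hz):
    """Range from the lag of the cross-correlation peak:
    range = c * round_trip_delay / 2."""
    corr = np.correlate(rx, tx, mode="full")
    lag = np.argmax(corr) - (len(tx) - 1)
    return C * (lag / sample_rate_hz) / 2.0

# Synthetic echo: a pseudo-random +/-1 pattern delayed by 40 samples,
# heavily attenuated, and buried in noise.
rng = np.random.default_rng(1)
pattern = rng.integers(0, 2, 256).astype(float) * 2 - 1
echo = np.zeros(512)
echo[40:40 + 256] = 0.2 * pattern
echo += rng.normal(scale=0.5, size=512)

range_m = estimate_range(pattern, echo, sample_rate_hz=1e9)  # 40 ns delay
```

The processing gain of correlating over the whole pattern is what lets a low-power transmitter pull a stable peak out of per-sample noise larger than the echo itself, which is the abstract's central claim.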

  17. Location estimation in wireless sensor networks using spring-relaxation technique.

    PubMed

    Zhang, Qing; Foh, Chuan Heng; Seet, Boon-Chong; Fong, A C M

    2010-01-01

    Accurate and low-cost autonomous self-localization is a critical requirement for various applications of a large-scale distributed wireless sensor network (WSN). Because of the massive deployment of sensors, explicit measurements based on specialized localization hardware such as the Global Positioning System (GPS) are not practical. In this paper, we propose a low-cost WSN localization solution. Our design uses received signal strength indicators for ranging, lightweight distributed algorithms based on the spring-relaxation technique for location computation, and a cooperative approach to achieve a given location estimation accuracy with a low number of nodes with known locations. We provide analysis to show the suitability of the spring-relaxation technique for WSN localization with the cooperative approach, and perform simulation experiments to illustrate its accuracy in localization.
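A minimal sketch of the spring-relaxation idea, assuming noise-free ranges, three fixed anchor nodes, and a centralized loop (the paper's version is distributed and uses noisy RSSI ranges; the geometry below is invented):

```python
import numpy as np

def relax(positions, anchors, distances, iters=500, step=0.1):
    """Spring-relaxation localization: every measured inter-node distance
    acts as a spring at its rest length; free nodes move along the net
    spring force each iteration until the layout settles."""
    pos = {k: v.astype(float).copy() for k, v in positions.items()}
    for _ in range(iters):
        forces = {k: np.zeros(2) for k in pos}
        for (i, j), d in distances.items():
            diff = pos[j] - pos[i]
            cur = np.linalg.norm(diff)
            if cur == 0.0:
                continue
            f = (cur - d) * diff / cur   # pulls together if stretched
            forces[i] += f
            forces[j] -= f
        for k in pos:
            if k not in anchors:         # anchors stay fixed
                pos[k] += step * forces[k]
    return pos

# Three anchors at known positions; node 3 is actually at (1, 1) but
# starts from a poor initial guess.
start = {0: np.array([0.0, 0.0]), 1: np.array([2.0, 0.0]),
         2: np.array([0.0, 2.0]), 3: np.array([0.5, 0.2])}
dists = {(0, 3): 2 ** 0.5, (1, 3): 2 ** 0.5, (2, 3): 2 ** 0.5}
est = relax(start, anchors={0, 1, 2}, distances=dists)
```

Each node only needs its neighbors' current positions and pairwise ranges, which is why the technique distributes naturally across a WSN.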

  18. Liquid electrolyte informatics using an exhaustive search with linear regression.

    PubMed

    Sodeyama, Keitaro; Igarashi, Yasuhiko; Nakayama, Tomofumi; Tateyama, Yoshitaka; Okada, Masato

    2018-06-14

    Exploring new liquid electrolyte materials is a fundamental target for developing new high-performance lithium-ion batteries. In contrast to solid materials, the properties of disordered liquid solutions have been less studied with data-driven informatics techniques. Here, we examined the estimation accuracy and efficiency of three such techniques, multiple linear regression (MLR), least absolute shrinkage and selection operator (LASSO), and exhaustive search with linear regression (ES-LiR), using coordination energy and melting point as test liquid properties. We confirmed that ES-LiR gives the most accurate estimation among the techniques. We also found that ES-LiR can reveal the relationship between the "prediction accuracy" and "calculation cost" of a property via a weight diagram of descriptors. This makes it possible to choose the balance of "accuracy" and "cost" when a search over a huge number of new materials is carried out.
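A schematic version of ES-LiR, assuming exhaustive enumeration of small descriptor subsets ranked by leave-one-out cross-validation error; the synthetic data (true signal in columns 0 and 2) stands in for real descriptors:

```python
import numpy as np
from itertools import combinations

def es_lir(X, y, max_size=3):
    """Exhaustive search with linear regression, schematically: fit
    ordinary least squares on every descriptor subset up to max_size
    and rank subsets by leave-one-out cross-validation error."""
    n, p = X.shape
    results = []
    for size in range(1, max_size + 1):
        for subset in combinations(range(p), size):
            errs = []
            for i in range(n):
                mask = np.arange(n) != i
                A = np.column_stack([np.ones(n - 1), X[mask][:, subset]])
                beta = np.linalg.lstsq(A, y[mask], rcond=None)[0]
                pred = np.concatenate(([1.0], X[i, subset])) @ beta
                errs.append((pred - y[i]) ** 2)
            results.append((float(np.mean(errs)), subset))
    return sorted(results)

# Synthetic descriptors: the target depends only on columns 0 and 2.
rng = np.random.default_rng(42)
X = rng.normal(size=(40, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + 0.1 * rng.normal(size=40)
ranking = es_lir(X, y)
best_err, best_subset = ranking[0]
```

The full ranking, not just the winner, is what supports the accuracy-versus-cost trade-off the abstract describes: cheaper descriptor subsets can be chosen with a known accuracy penalty.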

  19. A preliminary benefit-cost study of a Sandia wind farm.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ehlen, Mark Andrew; Griffin, Taylor; Loose, Verne W.

    In response to federal mandates and incentives for renewable energy, Sandia National Laboratories conducted a feasibility study of installing an on-site wind farm on Sandia National Laboratories and Kirtland Air Force Base property. This report describes a preliminary analysis of the costs and benefits of installing and operating a 15-turbine, 30-MW-capacity wind farm that delivers an estimated 16 percent of 2010 onsite demand. The report first describes market and non-market economic costs and benefits associated with operating a wind farm, and then uses a standard life-cycle costing and benefit-cost framework to estimate the costs and benefits of a wind farm. Based on these best estimates of costs and benefits and on factor, uncertainty, and sensitivity analysis, the results suggest that the benefits of a Sandia wind farm are greater than its costs. The analysis techniques used herein are applicable to the economic assessment of most if not all forms of renewable energy.
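The life-cycle benefit-cost framework reduces to discounting both streams to present value and comparing them. A minimal sketch with invented cash flows and discount rate, not the Sandia figures:

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def benefit_cost_ratio(benefits, costs, rate=0.03):
    """Discounted benefit-cost ratio over the life cycle; a ratio > 1
    means benefits outweigh costs at the chosen discount rate."""
    return npv(benefits, rate) / npv(costs, rate)

# Hypothetical 5-year project: heavy year-0 capital outlay, then steady
# O&M costs and energy-savings benefits (all figures invented, in $M).
costs = [50.0, 2.0, 2.0, 2.0, 2.0]
benefits = [0.0, 16.0, 16.0, 16.0, 16.0]
ratio = benefit_cost_ratio(benefits, costs)
```

Sensitivity analysis then reruns the ratio while perturbing individual inputs (capacity factor, discount rate, O&M cost) to see which assumptions the go/no-go conclusion actually hinges on.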

  20. Cost minimization analysis for combinations of sampling techniques in bronchoscopy of endobronchial lesions.

    PubMed

    Roth, Kjetil; Hardie, Jon Andrew; Andreassen, Alf Henrik; Leh, Friedemann; Eagan, Tomas Mikal Lind

    2009-06-01

    The choice of sampling techniques in bronchoscopy with sampling from a visible lesion will depend on the expected diagnostic yields and the costs of the sampling techniques. The aim of this study was to determine the most economical combination of sampling techniques when approaching endobronchial visible lesions. A cost minimization analysis was performed. All bronchoscopies from 2003 and 2004 at Haukeland University Hospital, Bergen, Norway, were reviewed retrospectively for diagnostic yields. 162 patients with endobronchial disease were included. Potential sampling techniques used were biopsy, brushing, endobronchial needle aspiration (EBNA) and washings. Costs were estimated based on registration of equipment costs and personnel costs. Sensitivity analyses were performed to determine threshold values. The combination of biopsy, brushing and EBNA was the most economical strategy with an average cost of Euro 893 (95% CI: 657, 1336). The cost of brushing had to be below Euro 83 and it had to increase the diagnostic yield more than 2.2%, for biopsy and brushing to be more economical than biopsy alone. The combination of biopsy, brushing and EBNA was more economical than biopsy and brushing when the cost of EBNA was below Euro 205 and the increase in diagnostic yield was above 5.2%. In the current study setting, biopsy, brushing and EBNA was the most economical combination of sampling techniques for endobronchial visible lesions.
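The comparison at the heart of such a cost minimization analysis reduces to expected cost per established diagnosis: total cost of a combination divided by its diagnostic yield. The costs and yields below are invented for illustration, not the study's figures:

```python
def rank_by_cost_per_diagnosis(strategies):
    """Rank sampling strategies by expected cost per established
    diagnosis: total cost of the combination / diagnostic yield."""
    return sorted((cost / yield_, name) for name, cost, yield_ in strategies)

# Illustrative per-procedure costs (euros) and diagnostic yields.
strategies = [
    ("biopsy", 600.0, 0.70),
    ("biopsy+brushing", 640.0, 0.78),
    ("biopsy+brushing+EBNA", 700.0, 0.88),
]
ranked = rank_by_cost_per_diagnosis(strategies)
```

The threshold values the abstract reports come from asking, for each added technique, how cheap it must be and how much yield it must add before it lowers this ratio.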

  1. Cost estimation and analysis using the Sherpa Automated Mine Cost Engineering System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stebbins, P.E.

    1993-09-01

    The Sherpa Automated Mine Cost Engineering System is a menu-driven software package designed to estimate capital and operating costs for proposed surface mining operations. The program is engineering-based (as opposed to statistical), meaning that all equipment, manpower, and supply requirements are determined from deposit geology, project design, and mine production information using standard engineering techniques. These requirements are used in conjunction with equipment, supply, and labor cost databases internal to the program to estimate all associated costs. Because virtually all on-site cost parameters are interrelated within the program, Sherpa provides an efficient means of examining the impact of changes in the equipment mix on total capital and operating costs. If any aspect of the operation is changed, Sherpa immediately adjusts all related aspects as necessary. For instance, if the user wishes to examine the cost ramifications of selecting larger trucks, the program not only considers truck purchase and operation costs, it also automatically and immediately adjusts excavator requirements, operator and mechanic needs, repair facility size, haul road construction and maintenance costs, and ancillary equipment specifications.

  2. Basinwide Estimation of Habitat and Fish Populations in Streams

    Treesearch

    C. Andrew Dolloff; David G. Hankin; Gordon H. Reeves

    1993-01-01

    Basinwide visual estimation techniques (BVET) are statistically reliable and cost effective for estimating habitat and fish populations across entire watersheds. Survey teams visit habitats in every reach of the study area to record visual observations. At preselected intervals, teams also record actual measurements. These observations and measurements are used to...
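The BVET idea of calibrating cheap visual counts against a carefully measured subsample can be sketched as a ratio estimator; the counts below are hypothetical:

```python
def bvet_estimate(visual_counts, calibration_pairs):
    """Basinwide visual estimation sketch: quick visual counts are made
    in every habitat unit, and actual measurements in a subsample
    calibrate them via a ratio estimator (measured total / visual
    total within the subsample)."""
    visual_sub = sum(v for v, _ in calibration_pairs)
    measured_sub = sum(m for _, m in calibration_pairs)
    return (measured_sub / visual_sub) * sum(visual_counts)

# Hypothetical snorkel counts for six stream units; two of the units
# are also counted precisely for calibration.
visual = [40, 55, 30, 70, 25, 60]
calibration = [(40, 52), (70, 84)]   # (visual count, measured count)
total = bvet_estimate(visual, calibration)
```

The cost saving comes from measuring only the subsample precisely; the ratio correction removes the systematic undercount typical of visual surveys.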

  3. Estimating the costs of intensity-modulated and 3-dimensional conformal radiotherapy in Ontario.

    PubMed

    Yong, J H E; McGowan, T; Redmond-Misner, R; Beca, J; Warde, P; Gutierrez, E; Hoch, J S

    2016-06-01

    Radiotherapy is a common treatment for many cancers, but up-to-date estimates of the costs of radiotherapy are lacking. In the present study, we estimated the unit costs of intensity-modulated radiotherapy (IMRT) and 3-dimensional conformal radiotherapy (3D-CRT) in Ontario. An activity-based costing model was developed to estimate the costs of IMRT and 3D-CRT in prostate cancer. It included the costs of equipment, staff, and supporting infrastructure. The framework was subsequently adapted to estimate the costs of radiotherapy in breast cancer and head-and-neck cancer. We also tested various scenarios by varying the program maturity and the use of volumetric modulated arc therapy (VMAT) alongside IMRT. From the perspective of the health care system, treating prostate cancer with IMRT and 3D-CRT respectively cost $12,834 and $12,453 per patient. The cost of radiotherapy ranged from $5,270 to $14,155 and was sensitive to analytic perspective, radiation technique, and disease site. Cases of head-and-neck cancer were the most costly, being driven by treatment complexity and fractions per treatment. Although IMRT was more costly than 3D-CRT, its cost will likely decline over time as programs mature and VMAT is incorporated. Our costing model can be modified to estimate the costs of 3D-CRT and IMRT for various disease sites and settings. The results demonstrate the important role of capital costs in studies of radiotherapy cost from a health system perspective, which our model can accommodate. In addition, our study established the need for future analyses of IMRT cost to consider how VMAT affects time consumption.

  4. A methodology to evaluate differential costs of full field digital as compared to conventional screen film mammography in a clinical setting.

    PubMed

    Ciatto, S; Brancato, B; Baglioni, R; Turci, M

    2006-01-01

    The use of full field digital mammography (FFDM) as an alternative to conventional screen film mammography (SFM) in current practice is delayed by the high costs of FFDM. The present study, performed at the Centro per lo Studio e la Prevenzione Oncologica of Florence, using both FFDM and SFM, was aimed at estimating the impact of introducing the new FFDM technique on overall mammography costs. We estimated the differential costs of both methods, based on real expenditures, as provided by the administrative department, and on the working time of radiologists, radiographers, and other staff. Two different workload scenarios (5000 and 10,000 tests/year per mammography equipment) were considered. Costs common to both techniques were excluded for study purposes. Besides the higher purchase and hire/leasing costs of equipment, FFDM implies a greater workload for radiologists (reading time almost doubled). SFM implies a greater workload for the administrative staff to run the archive and for loading/unloading films of the roller viewer, whereas no difference in workload was observed for radiographers. Overall, FFDM costs 24.22-26.46 more per examination than SFM in the 5000 tests scenario and 9.91-12.15 more in the 10,000 tests scenario. Although the present study's estimates cannot easily be generalised to other local settings, the model for cost calculation is easy to export to another scenario by applying different local parameters. The advantages made available by FFDM (computerised data recording, tele-transmission, tele-reporting, tele-consulting, automatic display of previous exams on the monitor, and use of CAD) may justify the higher cost, but a limited reduction in purchase and assistance costs could easily allow a turnover, with FFDM becoming more convenient than SFM even on the cost side.

  5. Congestion and recreation site demand: a model of demand-induced quality effects

    USGS Publications Warehouse

    Douglas, Aaron J.; Johnson, Richard L.

    1993-01-01

    This analysis focuses on problems of estimating the site-specific dollar benefits conferred by outdoor recreation sites in the face of congestion costs. Encounters, crowding effects and congestion costs have often been treated by natural resource economists in a piecemeal fashion; in the current paper, encounters and crowding effects are treated systematically. We emphasize the quantitative impact of congestion costs on site-specific estimates of the benefits conferred by improvements to outdoor recreation sites. The principal analytic conclusion is that techniques that economize on data requirements produce biased estimates of the benefits conferred by site improvements at facilities with significant crowding effects. The principal policy recommendation is that Federal and state agencies should collect and store information on visitation rates, encounter levels and congestion costs at outdoor recreation sites.

  6. Sensitivity analysis of add-on price estimate for select silicon wafering technologies

    NASA Technical Reports Server (NTRS)

    Mokashi, A. R.

    1982-01-01

    The cost of producing wafers from silicon ingots is a major component of the add-on price of silicon sheet. Economic analyses of the add-on price estimates, and of their sensitivity, for internal-diameter (ID) sawing, multiblade slurry (MBS) sawing and the fixed-abrasive slicing technique (FAST) are presented. Interim price estimation guidelines (IPEG) are used to estimate each process add-on price. Sensitivity analysis of the price is performed with respect to cost parameters such as equipment, space, direct labor, materials (blade life) and utilities, and to production parameters such as slicing rate, slices per centimeter and process yield, using a computer program specifically developed to perform sensitivity analysis with IPEG. The results help identify the important cost parameters and assist in deciding the direction of technology development efforts.
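    A minimal sketch of this kind of sensitivity analysis, using a generic linear add-on price model rather than the actual IPEG formula (the component names and figures below are illustrative assumptions, not the published coefficients):

```python
def add_on_price(params: dict) -> float:
    """Illustrative add-on price model (NOT the actual IPEG formula):
    annualized cost components divided by annual wafer output."""
    annual_cost = (params["equipment"] + params["space"] + params["labor"]
                   + params["materials"] + params["utilities"])
    wafers = params["slicing_rate"] * params["slices_per_cm"] * params["yield"]
    return annual_cost / wafers

def sensitivity(base: dict, key: str, delta: float = 0.10) -> float:
    """Percent change in add-on price for a +10% change in one parameter."""
    p0 = add_on_price(base)
    perturbed = dict(base, **{key: base[key] * (1 + delta)})
    return (add_on_price(perturbed) - p0) / p0 * 100

# Hypothetical base case: annual costs in dollars, production parameters scaled.
base = {"equipment": 50_000, "space": 5_000, "labor": 30_000,
        "materials": 20_000, "utilities": 5_000,
        "slicing_rate": 1_000, "slices_per_cm": 20, "yield": 0.9}
```

    Cost parameters raise the price when perturbed upward; production parameters (in the denominator) lower it, which is how such an analysis ranks the parameters worth attacking first.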

  7. Software risk estimation and management techniques at JPL

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Lum, K.

    2002-01-01

    In this talk we discuss how uncertainty has been incorporated into the JPL software cost model through probabilistic estimates, how risk is addressed, and how cost risk is currently being explored through a variety of approaches, from traditional risk lists to detailed WBS-based risk estimates to the Defect Detection and Prevention (DDP) tool.

  8. The effect of bovine somatotropin on the cost of producing milk: Estimates using propensity scores.

    PubMed

    Tauer, Loren W

    2016-04-01

    Annual farm-level data from New York dairy farms from the years 1994 through 2013 were used to estimate the cost effect of bovine somatotropin (bST) using propensity score matching. Cost of production was computed using the whole-farm method, which subtracts sales of crops and animals from total costs under the assumption that the cost of producing those products equals their sales value. For a farm to be included in the data set, milk receipts must have comprised 85% or more of total receipts, indicating that these farms were primarily milk producers. Farm use of bST, defined as 25% or more of the herd being treated, ranged annually from 25 to 47% of the farms. The average cost effect of bST use was estimated to be a reduction of $2.67 per 100 kg of milk produced in 2013 dollars, although annual cost-reduction estimates ranged from statistically zero to $3.42 in nominal dollars. Nearest-neighbor matching techniques generated a similar estimate of $2.78 in 2013 dollars. These estimated cost reductions represent a savings of 5.5% per kilogram of milk produced. The herd-level production increase per cow from the use of bST over 20 yr averaged 1,160 kg. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
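    The matching step can be sketched as follows, assuming propensity scores have already been estimated (e.g., by a logistic regression of bST adoption on farm covariates); the numbers in the test below are hypothetical, not the study's data:

```python
import numpy as np

def att_by_matching(scores_t, costs_t, scores_c, costs_c):
    """Average treatment effect on the treated via 1-nearest-neighbor
    propensity-score matching with replacement: each treated farm is paired
    with the control farm whose score is closest, and the mean cost
    difference across pairs is the estimated effect."""
    scores_c = np.asarray(scores_c, dtype=float)
    effects = []
    for s, y in zip(scores_t, costs_t):
        j = int(np.argmin(np.abs(scores_c - s)))  # nearest control by score
        effects.append(y - costs_c[j])
    return float(np.mean(effects))
```

    A negative result indicates a cost reduction for adopters, as the study reports.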

  9. Can global navigation satellite system signals reveal the ecological attributes of forests?

    NASA Astrophysics Data System (ADS)

    Liu, Jingbin; Hyyppä, Juha; Yu, Xiaowei; Jaakkola, Anttoni; Liang, Xinlian; Kaartinen, Harri; Kukko, Antero; Zhu, Lingli; Wang, Yunsheng; Hyyppä, Hannu

    2016-08-01

    Forests have important impacts on the global carbon cycle and climate, and they are also relevant to a wide range of industrial sectors. Currently, one of the biggest challenges in forestry research is measuring and monitoring forest variables effectively and accurately, as the exploitation potential of forest inventory products largely depends on the accuracy of the estimates and on the cost of data collection. A low-cost crowdsourcing solution is needed for collecting forest inventory variables. Here, we propose global navigation satellite system (GNSS) signals as a novel type of observable for predicting forest attributes and show the feasibility of utilizing GNSS signals for estimating important attributes of forest plots, including mean tree height, mean diameter at breast height, basal area, stem volume and tree biomass. In boreal forest conditions, the prediction accuracies of the proposed technique were better than those of conventional 2D remote sensing techniques. More importantly, this technique provides a novel, cost-effective way of collecting large-scale forest measurements in a crowdsourcing context. It can be applied by, for example, harvesters or people hiking or working in forests, because GNSS devices are widely used and the field operation of the technique is simple, requiring no professional forestry skills.

  10. An extended Kalman filter approach to non-stationary Bayesian estimation of reduced-order vocal fold model parameters.

    PubMed

    Hadwin, Paul J; Peterson, Sean D

    2017-04-01

    The Bayesian framework for parameter inference provides a basis from which subject-specific reduced-order vocal fold models can be generated. Previously, it has been shown that a particle filter technique is capable of producing estimates, and associated credibility intervals, of time-varying reduced-order vocal fold model parameters. However, the particle filter approach is difficult to implement and has a high computational cost, which can be barriers to clinical adoption. This work presents an alternative estimation strategy based upon Kalman filtering, aimed at reducing the computational cost of subject-specific model development. The robustness of this approach to Gaussian and non-Gaussian noise is discussed. The extended Kalman filter (EKF) approach is found to perform very well in comparison with the particle filter technique, at dramatically lower computational cost. On the test cases explored, the EKF is comparable in accuracy to the particle filter when more than 6000 particles are employed; with fewer particles, the EKF actually performs better. For comparable levels of accuracy, the solution time is reduced by two orders of magnitude when employing the EKF. By virtue of the approximations used in the EKF, however, the credibility intervals tend to be slightly underpredicted.
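    A minimal sketch of the EKF idea for a scalar, slowly varying parameter observed through a nonlinear measurement (a toy model, not the reduced-order vocal fold model itself):

```python
import numpy as np

def ekf_scalar(measurements, h, h_prime, x0=1.0, P0=1.0, Q=1e-4, R=1e-2):
    """Extended Kalman filter for a scalar parameter x modelled as a random
    walk, with nonlinear measurement y = h(x) + noise. The measurement model
    is linearized about the current estimate at each step."""
    x, P = x0, P0
    for y in measurements:
        P = P + Q                      # predict (random-walk parameter model)
        H = h_prime(x)                 # Jacobian of the measurement model
        K = P * H / (H * P * H + R)    # Kalman gain
        x = x + K * (y - h(x))         # update estimate with the innovation
        P = (1 - K * H) * P            # update covariance
    return x

# Toy example: recover theta from noisy observations of theta**2.
rng = np.random.default_rng(0)
true_theta = 3.0
ys = true_theta**2 + 0.1 * rng.standard_normal(200)
est = ekf_scalar(ys, h=lambda x: x**2, h_prime=lambda x: 2 * x)
```

    Each filter step costs a handful of scalar operations, which is the source of the speedup over a particle filter that must propagate thousands of samples per step.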

  11. Cost projections for Redox Energy storage systems

    NASA Technical Reports Server (NTRS)

    Michaels, K.; Hall, G.

    1980-01-01

    A preliminary design and system cost analysis was performed for the redox energy storage system. A conceptual design and cost estimate was prepared for each of two energy storage applications: (1) a 100-MWh electric utility requirement (10 MW for ten hours) for utility load leveling, and (2) a 500-kWh requirement (10 kW for 50 hours) for a variety of residential or commercial applications, including stand-alone solar photovoltaic systems. The conceptual designs were based on cell performance levels, system design parameters, and special material costs. These data were combined with thermodynamic and hydraulic analyses to produce preliminary system designs. Results indicate that the redox cell stack is amenable to mass-production techniques with relatively low material cost.

  12. Economic and workflow analysis of a blood bank automated system.

    PubMed

    Shin, Kyung-Hwa; Kim, Hyung Hoi; Chang, Chulhun L; Lee, Eun Yup

    2013-07-01

    This study compared the estimated costs and times required for ABO/Rh(D) typing and unexpected antibody screening using an automated system and manual methods. The total cost included direct and labor costs. Labor costs were calculated on the basis of average operator salaries and unit values (minutes), that is, the hands-on time required to test one sample. To estimate unit values, workflows were recorded on video, and the time required for each process was analyzed separately. The unit values of ABO/Rh(D) typing using the manual method were 5.65 and 8.1 min during regular and unsocial working hours, respectively; the unit value was less than 3.5 min when several samples were tested simultaneously. The unit value for unexpected antibody screening was 2.6 min. The unit values using the automated method for ABO/Rh(D) typing, unexpected antibody screening, and both tests simultaneously were all 1.5 min. The total cost of ABO/Rh(D) typing of a single sample was lower on the automated analyzer than with the manual technique, but higher than manual testing of several samples simultaneously. The total cost of unexpected antibody screening using the automated analyzer was less than that using the manual method. In summary, ABO/Rh(D) typing using an automated analyzer incurs a lower unit value and cost than the manual technique when only one sample is tested at a time, and unexpected antibody screening using an automated analyzer always incurs a lower unit value and cost than the manual technique.

  13. Parametric Cost Models for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Henrichs, Todd; Dollinger, Courtney

    2010-01-01

    Multivariable parametric cost models for space telescopes provide several benefits to designers and space system project managers. They identify major architectural cost drivers and allow high-level design trades. They enable cost-benefit analysis for technology development investment. And they provide a basis for estimating total project cost. A survey of historical models found that there is no definitive space telescope cost model; in fact, published models vary greatly [1]. Thus, there is a need for parametric space telescope cost models. An effort is underway to develop single-variable [2] and multi-variable [3] parametric space telescope cost models based on the latest available data, applying rigorous analytical techniques. Specific cost estimating relationships (CERs) have been developed which show that aperture diameter is the primary cost driver for large space telescopes; that technology development as a function of time reduces cost at a rate of 50% per 17 years; that it costs less per square meter of collecting aperture to build a large telescope than a small one; and that increasing mass reduces cost.
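    The form of such a CER can be sketched as a power law in aperture diameter with a technology-learning factor that halves cost every 17 years; the coefficient and exponent below are hypothetical placeholders, not the published fits:

```python
def telescope_cost(aperture_m: float, year: int,
                   coeff: float = 100.0, exponent: float = 1.3,
                   base_year: int = 2000) -> float:
    """Illustrative single-variable CER: cost scales as a power of aperture
    diameter, with cost halving every 17 years of technology development.
    `coeff` and `exponent` are assumed values for illustration only."""
    learning = 0.5 ** ((year - base_year) / 17.0)
    return coeff * aperture_m ** exponent * learning
```

    Any exponent below 2 reproduces the abstract's observation that cost per square meter of collecting aperture falls as the telescope gets larger, since collecting area grows as the diameter squared.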

  14. The cost of asthma in Kuwait.

    PubMed

    Khadadah, Mousa

    2013-01-01

    To evaluate the direct costs of treating asthma in Kuwait. Population figures were obtained from the 2005 census and projected to 2008. Treatment profiles were obtained from the Asthma Insights and Reality for the Gulf and Near East (AIRGNE) study. Asthma prevalence and unit cost estimates were based on results from a Delphi technique. These estimates were applied to the total Kuwaiti population aged 5 years and over to obtain the number of people diagnosed with asthma. The estimates from the Delphi exercise and the AIRGNE results were used to determine the number of asthma patients managed in government facilities. Direct drug costs were provided by the Ministry of Health. Treatment costs (Kuwaiti dinars, KD) were also calculated using the Delphi exercise and the AIRGNE data. The prevalence of asthma was estimated to be 15% of adults and 18% of children (93,923 adults; 70,158 children). Of these, 84,530 (90%) adults and 58,932 (84.0%) children were estimated to be using government healthcare facilities. Inpatient visits accounted for the largest portion of total direct costs (43%), followed by emergency room visits (29%), outpatient visits (21%) and medications (7%). The annual cost of treatment, excluding medications, was KD 29,946,776 (USD 107,076,063) for adults and KD 24,295,439 (USD 86,869,450) for children. Including medications, the total annual direct cost of asthma treatment was estimated to be over KD 58 million (USD 207 million). Asthma costs Kuwait a huge sum of money, though the estimates were conservative because only Kuwaiti nationals were included. Given the high medical expenditures associated with emergency room and inpatient visits, relative to lower medication costs, efforts should be focused on improving asthma control rather than reducing expenditure on procurement of medication. Copyright © 2012 S. Karger AG, Basel.

  15. Estimating the cost of cervical cancer screening in five developing countries

    PubMed Central

    Goldhaber-Fiebert, Jeremy D; Goldie, Sue J

    2006-01-01

    Background Cost-effectiveness analyses (CEAs) can provide useful information to policymakers concerned with the broad allocation of resources as well as to local decision makers choosing between different options for reducing the burden from a single disease. For the latter, it is important to use country-specific data when possible and to represent cost differences between countries that might make one strategy more or less attractive than another strategy locally. As part of a CEA of cervical cancer screening in five developing countries, we supplemented limited primary cost data by developing other estimation techniques for direct medical and non-medical costs associated with alternative screening approaches using one of three initial screening tests: simple visual screening, HPV DNA testing, and cervical cytology. Here, we report estimation methods and results for three cost areas in which data were lacking. Methods To supplement direct medical costs, including staff, supplies, and equipment depreciation using country-specific data, we used alternative techniques to quantify cervical cytology and HPV DNA laboratory sample processing costs. We used a detailed quantity and price approach whose face validity was compared to an adaptation of a US laboratory estimation methodology. This methodology was also used to project annual sample processing capacities for each laboratory type. The cost of sample transport from the clinic to the laboratory was estimated using spatial models. A plausible range of the cost of patient time spent seeking and receiving screening was estimated using only formal sector employment and wages as well as using both formal and informal sector participation and country-specific minimum wages. Data sources included primary data from country-specific studies, international databases, international prices, and expert opinion. Costs were standardized to year 2000 international dollars using inflation adjustment and purchasing power parity. 
Results Cervical cytology laboratory processing costs were I$1.57–3.37 using the quantity and price method compared to I$1.58–3.02 from the face validation method. HPV DNA processing costs were I$6.07–6.59. Rural laboratory transport costs for cytology were I$0.12–0.64 and I$0.14–0.74 for HPV DNA laboratories. Under assumptions of lower resource efficiency, these estimates increased to I$0.42–0.83 and I$0.54–1.06. Estimates of the value of an hour of patient time using only formal sector participation were I$0.07–4.16, increasing to I$0.30–4.80 when informal and unpaid labor was also included. The value of patient time for traveling, waiting, and attending a screening visit was I$0.68–17.74. With the total cost of screening for cytology and HPV DNA testing ranging from I$4.85–40.54 and I$11.30–48.77 respectively, the cost of the laboratory transport, processing, and patient time accounted for 26–66% and 33–65% of the total costs. From a payer perspective, laboratory transport and processing accounted for 18–48% and 25–60% of total direct medical costs of I$4.11–19.96 and I$10.57–28.18 respectively. Conclusion Cost estimates of laboratory processing, sample transport, and patient time account for a significant proportion of total cervical cancer screening costs in five developing countries and provide important inputs for CEAs of alternative screening modalities. PMID:16887041

  16. Probabilistic distance-based quantizer design for distributed estimation

    NASA Astrophysics Data System (ADS)

    Kim, Yoon Hak

    2016-12-01

    We consider the iterative design of independently operating local quantizers at nodes that must cooperate without interaction to achieve the application objectives of distributed estimation systems. As a new cost function we suggest a probabilistic distance between the posterior distribution and its quantized counterpart, expressed as the Kullback-Leibler (KL) divergence. We first show that minimizing the KL divergence in the cyclic generalized Lloyd design framework is equivalent to maximizing the logarithm of the quantized posterior distribution on average, which can be further simplified computationally in our iterative design. We propose an iterative design algorithm that seeks to maximize this simplified form of the quantized posterior distribution, and we argue that the algorithm converges to a global optimum, owing to the convexity of the cost function, and generates the most informative quantized measurements. We also provide an independent encoding technique that enables minimization of the cost function and can be efficiently simplified for practical use in power-constrained nodes. Finally, extensive experiments demonstrate a clear advantage in estimation performance over typical designs and previously published design techniques.
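    The cyclic Lloyd framework the paper builds on can be illustrated with a standard squared-error scalar quantizer; note the paper substitutes a KL-divergence cost for the distortion measure used in this sketch:

```python
import numpy as np

def lloyd_quantizer(samples, levels, iters=50):
    """Cyclic Lloyd design of a scalar quantizer under squared error:
    alternate a partition step (nearest code point) and a centroid step
    (cell mean) until the codebook settles."""
    x = np.asarray(samples, dtype=float)
    # initialize the codebook from quantiles of the data
    codebook = np.quantile(x, (np.arange(levels) + 0.5) / levels)
    for _ in range(iters):
        # partition step: assign each sample to its nearest code point
        idx = np.argmin(np.abs(x[:, None] - codebook[None, :]), axis=1)
        # centroid step: move each code point to the mean of its cell
        for k in range(levels):
            if np.any(idx == k):
                codebook[k] = x[idx == k].mean()
    return codebook

# Demo: two well-separated clusters should yield code points near 0 and 10.
rng = np.random.default_rng(2)
samples = np.concatenate([rng.normal(0, 0.1, 100), rng.normal(10, 0.1, 100)])
cb = lloyd_quantizer(samples, 2)
```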

  17. Process-based costing.

    PubMed

    Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee

    2003-01-01

    Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
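    Steps 2 through 4 of the technique reduce to a simple computation once the flowchart has yielded per-step resource use; the staff roles, minutes and rates below are illustrative, not the study's figures:

```python
def direct_cost(resource_minutes: dict, hourly_rates: dict) -> float:
    """Value resources and calculate the direct cost of one pass through a
    process: per-role minutes (estimated from the flowchart) times hourly
    rates, summed across roles."""
    return sum(minutes / 60.0 * hourly_rates[role]
               for role, minutes in resource_minutes.items())

# Hypothetical care-planning process: minutes per care plan and dollar rates.
care_planning = {"RN": 45, "aide": 20, "clerk": 10}
rates = {"RN": 30.0, "aide": 12.0, "clerk": 10.0}
cost = direct_cost(care_planning, rates)
```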

  18. SAMICS Validation. SAMICS Support Study, Phase 3

    NASA Technical Reports Server (NTRS)

    1979-01-01

    SAMICS provides a consistent basis for estimating array costs and comparing production technology costs. A review and validation of the SAMICS model are reported. The review had the following purposes: (1) to test the computational validity of the computer model by comparison with preliminary hand calculations based on conventional cost estimating techniques; (2) to review and improve the accuracy of the cost relationships used by the model; and (3) to provide independent verification, for users of the model, of its value in decision making for the allocation of research and development funds and for investment in manufacturing capacity. It is concluded that the SAMICS model is a flexible, accurate, and useful tool for managerial decision making.

  19. Precision Parameter Estimation and Machine Learning

    NASA Astrophysics Data System (ADS)

    Wandelt, Benjamin D.

    2008-12-01

    I discuss the strategy of "Acceleration by Parallel Precomputation and Learning" (APPLe), which can vastly accelerate parameter estimation in high-dimensional parameter spaces with costly likelihood functions, using trivially parallel computing to speed up the sequential exploration of parameter space. This strategy efficiently combines the power of distributed computing with machine learning and Markov chain Monte Carlo techniques to explore a likelihood function, posterior distribution or χ2-surface. It is particularly successful in cases where computing the likelihood is costly and the number of parameters is moderate or large. We apply this technique to two central problems in cosmology: the solution of the cosmological parameter estimation problem, with sufficient accuracy for the Planck data, using PICo; and the detailed calculation of cosmological helium and hydrogen recombination with RICO. Since the APPLe approach is designed to use massively parallel resources to speed up problems that are inherently serial, we can bring the power of distributed computing to bear on parameter estimation problems. We have demonstrated this with the Cosmology@Home project.

  20. Multi-level methods and approximating distribution functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, D., E-mail: daniel.wilson@dtc.ox.ac.uk; Baker, R. E.

    2016-07-15

    Biochemical reaction networks are often modelled using discrete-state, continuous-time Markov chains. System statistics of these Markov chains usually cannot be calculated analytically and therefore estimates must be generated via simulation techniques. There is a well documented class of simulation techniques known as exact stochastic simulation algorithms, an example of which is Gillespie's direct method. These algorithms often come with high computational costs, therefore approximate stochastic simulation algorithms such as the tau-leap method are used. However, in order to minimise the bias in the estimates generated using them, a relatively small value of tau is needed, rendering the computational costs comparable to Gillespie's direct method. The multi-level Monte Carlo method (Anderson and Higham, Multiscale Model. Simul. 10:146-179, 2012) provides a reduction in computational costs whilst minimising or even eliminating the bias in the estimates of system statistics. This is achieved by first crudely approximating required statistics with many sample paths of low accuracy. Then correction terms are added until a required level of accuracy is reached. Recent literature has primarily focussed on implementing the multi-level method efficiently to estimate a single system statistic. However, it is clearly also of interest to be able to approximate entire probability distributions of species counts. We present two novel methods that combine known techniques for distribution reconstruction with the multi-level method. We demonstrate the potential of our methods using a number of examples.
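    A minimal multi-level sketch, applied for simplicity to Euler discretization of geometric Brownian motion rather than to a reaction network: a crude base-level estimate plus coupled fine/coarse correction terms at each finer level.

```python
import numpy as np

def mlmc_gbm_mean(L=4, N=20_000, S0=1.0, r=0.05, sigma=0.2, T=1.0, seed=1):
    """Multi-level Monte Carlo estimate of E[S_T] for geometric Brownian
    motion under Euler discretization. Level l uses 2**l time steps; each
    correction term couples a fine path and a coarse path driven by the
    same Brownian increments, so the terms telescope to the finest level."""
    rng = np.random.default_rng(seed)
    est = 0.0
    for l in range(L + 1):
        nf = 2 ** l                       # fine steps at this level
        dt = T / nf
        dW = rng.standard_normal((N, nf)) * np.sqrt(dt)
        Sf = np.full(N, S0)
        for k in range(nf):               # fine Euler path
            Sf = Sf * (1 + r * dt + sigma * dW[:, k])
        if l == 0:
            est += Sf.mean()              # crude base-level estimate
        else:
            Sc = np.full(N, S0)
            for k in range(nf // 2):      # coarse path: summed increments
                dWc = dW[:, 2 * k] + dW[:, 2 * k + 1]
                Sc = Sc * (1 + r * 2 * dt + sigma * dWc)
            est += (Sf - Sc).mean()       # correction term for this level
    return est

estimate = mlmc_gbm_mean()  # analytic mean is S0 * exp(r * T)
```

    Because the fine and coarse paths share randomness, each correction term has small variance and needs few samples, which is where the cost reduction over single-level simulation comes from.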

  1. Laboratory Workflow Analysis of Culture of Periprosthetic Tissues in Blood Culture Bottles.

    PubMed

    Peel, Trisha N; Sedarski, John A; Dylla, Brenda L; Shannon, Samantha K; Amirahmadi, Fazlollaah; Hughes, John G; Cheng, Allen C; Patel, Robin

    2017-09-01

    Culture of periprosthetic tissue specimens in blood culture bottles is more sensitive than conventional techniques, but the impact on laboratory workflow has yet to be addressed. Herein, we examined the impact of culturing periprosthetic tissues in blood culture bottles on laboratory workflow and cost. The workflow was process mapped, decision tree models were constructed using probabilities of positive and negative cultures drawn from our published study (T. N. Peel, B. L. Dylla, J. G. Hughes, D. T. Lynch, K. E. Greenwood-Quaintance, A. C. Cheng, J. N. Mandrekar, and R. Patel, mBio 7:e01776-15, 2016, https://doi.org/10.1128/mBio.01776-15), and processing times and resource costs, from the viewpoint of laboratory staff time, were used to compare processing of periprosthetic tissue cultures by conventional techniques with culture in blood culture bottles. Annualized labor savings were estimated based on salary costs for laboratory staff from the U.S. Bureau of Labor Statistics. The model demonstrated a 60.1% reduction in mean total staff time with the adoption of tissue inoculation into blood culture bottles compared to conventional techniques (mean ± standard deviation, 30.7 ± 27.6 versus 77.0 ± 35.3 h per month, respectively; P < 0.001). The estimated annualized labor cost savings of culture using blood culture bottles was $10,876.83 (±$337.16). Sensitivity analysis across culture-positivity rates of 5 to 50% showed that culture in blood culture bottles remained cost-effective, with estimated labor cost savings of $2,132.71 for each percent increase in test accuracy. In conclusion, culture of periprosthetic tissue in blood culture bottles is not only more accurate but also cost-saving compared to conventional culture methods. Copyright © 2017 American Society for Microbiology.

  2. 75 FR 30905 - Proposed Collection; Comment Request for Regulation Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-02

    ... enables taxpayers to take advantage of various benefits provided by the Internal Revenue Code. Current.... Affected Public: Individuals or households, business or other for- profit organizations, not-for-profit... techniques or other forms of information technology; and (e) estimates of capital or start-up costs and costs...

  3. 76 FR 33811 - Proposed Information Collections; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-09

    ... techniques or other forms of information technology; and (e) estimates of capital or start-up costs and costs... Businesses. OMB Control Number: 1513-XXXX (To be assigned). TTB Form Numbers: 5600.17 and 5600.18....18 is used to collect financial information from businesses. When an industry member cannot pay their...

  4. 75 FR 73165 - Proposed Information Collections; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-29

    ...-2686 (facsimile); or [email protected] (e-mail). Please send separate comments for each specific... techniques or other forms of information technology; and (e) estimates of capital or start-up costs and costs... cigarette papers and tubes produced. However, these items can be removed without the payment of tax if they...

  5. 48 CFR 9905.501-50 - Techniques for application.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... of the date of final agreement on price, as shown on the signed certificate of current cost or... situation. Accordingly, it is neither appropriate nor practical to prescribe a single set of accounting practices which would be consistent in all situations with the practices of estimating costs. Therefore, the...

  6. Near-Field Source Localization by Using Focusing Technique

    NASA Astrophysics Data System (ADS)

    He, Hongyang; Wang, Yide; Saillard, Joseph

    2008-12-01

    We discuss two fast algorithms for localizing multiple sources in the near field. The symmetry-based method proposed by Zhi and Chia (2007) is first improved by implementing a search-free procedure that reduces computation cost. We then present a focusing-based method that does not require a symmetric array configuration. Using the focusing technique, the near-field signal model is transformed into a model with the same structure as in the far-field situation, which allows bearing estimation with well-studied far-field methods. With the estimated bearing, the range of each source is then obtained using the 1D MUSIC method, without parameter pairing. The performance of the improved symmetry-based method and the proposed focusing-based method is compared by Monte Carlo simulations and against the Cramér-Rao bound. Unlike other near-field algorithms, these two approaches require neither high computation cost nor high-order statistics.

  7. Mobile multiple access study

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Multiple access techniques (FDMA, CDMA, TDMA) for the mobile user are discussed, with an attempt to identify the current best technique. Traffic loading is considered, as well as voice and data modulation and spacecraft and system design. Emphasis is placed on developing mobile terminal cost estimates for the selected design. In addition, design examples are presented for the alternative multiple access techniques for comparison with the selected technique.

  8. Estimating the opportunity costs of bed‐days

    PubMed Central

    Robotham, Julie V.; Deeny, Sarah R.; Edmunds, W. John; Jit, Mark

    2017-01-01

    Abstract Opportunity costs of bed‐days are fundamental to understanding the value of healthcare systems. They greatly influence burden of disease estimations and economic evaluations involving stays in healthcare facilities. However, different estimation techniques employ assumptions that differ crucially in whether to consider the value of the second‐best alternative use forgone, of any available alternative use, or the value of the actually chosen alternative. Informed by economic theory, this paper provides a taxonomic framework of methodologies for estimating the opportunity costs of resources. This taxonomy is then applied to bed‐days by classifying existing approaches accordingly. We highlight differences in valuation between approaches and the perspective adopted, and we use our framework to appraise the assumptions and biases underlying the standard approaches that have been widely adopted mostly unquestioned in the past, such as the conventional use of reference costs and administrative accounting data. Drawing on these findings, we present a novel approach for estimating the opportunity costs of bed‐days in terms of health forgone for the second‐best patient, but expressed monetarily. This alternative approach effectively re‐connects to the concept of choice and explicitly considers net benefits. It is broadly applicable across settings and for other resources besides bed‐days. PMID:29105894

  9. 48 CFR 9904.409-50 - Techniques for application.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... the appropriate depreciation charges involves estimates both of service life and of the likely pattern of consumption of services in the cost accounting periods included in such life. In selecting service life estimates and in selecting depreciation methods, many of the same physical and economic factors...

  10. 48 CFR 9904.409-50 - Techniques for application.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... the appropriate depreciation charges involves estimates both of service life and of the likely pattern of consumption of services in the cost accounting periods included in such life. In selecting service life estimates and in selecting depreciation methods, many of the same physical and economic factors...

  11. Household water use and conservation models using Monte Carlo techniques

    NASA Astrophysics Data System (ADS)

    Cahill, R.; Lund, J. R.; DeOreo, B.; Medellín-Azuara, J.

    2013-04-01

    The increased availability of water end-use measurement studies allows more mechanistic and detailed approaches to estimating household water demand and conservation potential. This study uses probability distributions for the parameters affecting water use, estimated from end-use studies and randomly sampled in Monte Carlo iterations, to simulate water use in a single-family residential neighborhood. The model represents existing conditions and is calibrated to metered data. A two-stage mixed-integer optimization model is then developed to estimate the least-cost combination of long- and short-term conservation actions for each household. This least-cost conservation model provides an estimate of the upper bound of reasonable conservation potential under varying pricing and rebate conditions. The models were adapted from previous work in Jordan and are applied to a neighborhood in San Ramon, California, in the eastern San Francisco Bay Area. The existing-conditions model produces seasonal use results very close to the metered data. The least-cost conservation model suggests that clothes washer rebates are among the most cost-effective rebate programs for indoor uses. Retrofit of faucets and toilets is also cost-effective and holds the highest potential for water savings among indoor uses. This mechanistic modeling approach can improve understanding of water demand and estimate the cost-effectiveness of water conservation programs.
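    The existing-conditions approach can be sketched as follows, with assumed (illustrative) distributions standing in for those estimated from end-use studies:

```python
import numpy as np

def simulate_household_use(n_iter=10_000, seed=0):
    """Monte Carlo simulation of daily single-family water use (liters):
    parameters for each end use are drawn from assumed distributions.
    The distribution shapes and values are illustrative only, not those
    estimated from the cited end-use studies."""
    rng = np.random.default_rng(seed)
    occupants = rng.integers(1, 6, n_iter)                  # people per household
    shower = occupants * rng.normal(65, 10, n_iter)         # liters/person/day
    toilet = occupants * rng.normal(45, 8, n_iter)
    washer = rng.normal(100, 20, n_iter) * rng.random(n_iter)  # varying load frequency
    outdoor = rng.exponential(150, n_iter)                  # irrigation, heavy-tailed
    return shower + toilet + washer + outdoor

use = simulate_household_use()
```

    Repeating the draw over many iterations yields a distribution of household use, against which metered data can be compared for calibration.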

  12. Air quality mapping using GIS and economic evaluation of health impact for Mumbai City, India.

    PubMed

    Kumar, Awkash; Gupta, Indrani; Brandt, Jørgen; Kumar, Rakesh; Dikshit, Anil Kumar; Patil, Rashmi S

    2016-05-01

    Mumbai, a highly populated city in India, has been selected for air quality mapping and assessment of health impact using monitored air quality data. Air quality monitoring networks in Mumbai are operated by the National Environmental Engineering Research Institute (NEERI), the Maharashtra Pollution Control Board (MPCB), and the Brihanmumbai Municipal Corporation (BMC). A monitoring station represents air quality at a particular location, whereas air quality management requires spatial variation. Here, air quality data monitored by NEERI and BMC were spatially interpolated using various built-in interpolation techniques of ArcGIS. Inverse distance weighting (IDW), Kriging (spherical and Gaussian), and spline techniques were applied for spatial interpolation in this study. The interpolated results for the air pollutants sulfur dioxide (SO2), nitrogen dioxide (NO2), and suspended particulate matter (SPM) were compared with air quality data from MPCB in the same region. The comparison showed good agreement between values predicted using IDW and Kriging and the observed data. Subsequently, a health impact assessment of a ward was carried out based on the total population of the ward and the air quality monitored within it. Finally, the health cost within a ward was estimated on the basis of the exposed population. This study helps estimate the economic value of health damage due to air pollution. Operating more air quality monitoring stations is highly resource intensive in terms of time and cost; appropriate spatial interpolation techniques can be used to estimate concentrations where monitoring stations are not available. Further, health impact assessment for the population of the city and estimation of the economic cost of health damage due to ambient air quality can help in making rational control strategies for environmental management. The total health cost for Mumbai city for the year 2012, with a population of 12.4 million, was estimated at USD 8,000 million.
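
    For readers unfamiliar with the interpolation methods named above, a minimal inverse distance weighting (IDW) sketch follows. ArcGIS's implementation adds options (power, search radius, barriers) not modeled here, and the station coordinates and values below are invented for illustration.

```python
def idw(sample_points, query, power=2.0):
    """Inverse distance weighting: sample_points is [(x, y, value), ...].
    Each station's value is weighted by 1 / distance**power."""
    num = den = 0.0
    for x, y, v in sample_points:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0:
            return v  # query coincides with a station
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

# Three hypothetical monitoring stations (x, y, concentration)
stations = [(0, 0, 40.0), (1, 0, 60.0), (0, 1, 50.0)]
estimate = idw(stations, (0.5, 0.5))  # equidistant from all three -> 50.0
```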

  13. Budesonide + formoterol delivered via Spiromax® for the management of asthma and COPD: The potential impact on unscheduled healthcare costs of improving inhalation technique compared with Turbuhaler®.

    PubMed

    Lewis, A; Torvinen, S; Dekhuijzen, P N R; Chrystyn, H; Melani, A; Zöllner, Y; Kolbe, K; Watson, A T; Blackney, M; Plich, A

    2017-08-01

    Fixed-dose combinations of inhaled corticosteroids and long-acting β2 agonists are commonly used for the treatment of asthma and COPD. However, the most frequently prescribed dry powder inhaler delivering this medicine - Symbicort® (budesonide and formoterol, BF) Turbuhaler® - is associated with poor inhalation technique, which can lead to poor disease control and high disease management costs. A recent study showed that patients make fewer inhaler errors when using the novel DuoResp® (BF) Spiromax® inhaler compared with the BF Turbuhaler®. Therefore, switching patients from BF Turbuhaler® to BF Spiromax® could improve inhalation technique, and potentially lead to better disease control and healthcare cost savings. A model was developed to estimate the budget impact of reducing poor inhalation technique by switching asthma and COPD patients from BF Turbuhaler® to BF Spiromax® over three years in Germany, Italy, Sweden and the UK. The model estimated changes to the number, and associated cost, of unscheduled healthcare events. The model considered two scenarios: in Scenario 1, all patients were immediately switched from BF Turbuhaler® to BF Spiromax®; in Scenario 2, 4%, 8% and 12% of patients were switched in years 1, 2 and 3 of the model, respectively. In Scenario 1, per patient cost savings amounted to €60.10, €49.67, €94.14 and €38.20 in Germany, Italy, Sweden and the UK, respectively. Total cost savings in each country were €100.86 million, €19.42 million, €36.65 million and €15.44 million over three years, respectively, with an estimated 597,754, 151,480, 228,986 and 122,368 healthcare events avoided. In Scenario 2, cost savings totalled €8.07 million, €1.55 million, €2.93 million and €1.23 million over three years, respectively, with 47,850, 12,118, 18,319, and 9789 healthcare events avoided. Savings per patient were €4.81, €3.97, €7.53 and €3.06. We demonstrated that reducing poor inhalation technique by switching patients from BF Turbuhaler® to BF Spiromax® is likely to improve patients' disease control and generate considerable cost savings through healthcare events avoided. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Cost-effective sampling of ¹³⁷Cs-derived net soil redistribution: part 1--estimating the spatial mean across scales of variation.

    PubMed

    Li, Y; Chappell, A; Nyamdavaa, B; Yu, H; Davaasuren, D; Zoljargal, K

    2015-03-01

    The ¹³⁷Cs technique for estimating net time-integrated soil redistribution is valuable for understanding the factors controlling soil redistribution by all processes. The literature on this technique is dominated by studies of individual fields and describes its typically time-consuming nature. We contend that the community making these studies has inappropriately assumed that many ¹³⁷Cs measurements are required and hence that estimates of net soil redistribution can only be made at the field scale. Here, we support future studies of ¹³⁷Cs-derived net soil redistribution in applying their often limited resources across scales of variation (field, catchment, region etc.) without compromising the quality of the estimates at any scale. We describe a hybrid, design-based and model-based, stratified random sampling design with composites to estimate the sampling variance, and a cost model for fieldwork and laboratory measurements. Geostatistical mapping of net (1954-2012) soil redistribution as a case study on the Chinese Loess Plateau is compared with estimates for several other sampling designs popular in the literature. We demonstrate the cost-effectiveness of the hybrid design for spatial estimation of net soil redistribution. To demonstrate the limitations of current sampling approaches to cut across scales of variation, we extrapolate our estimate of net soil redistribution across the region and show that, for the same resources, estimates from many fields could have been provided that would elucidate the cause of differences within and between regional estimates. We recommend that future studies evaluate the sampling design carefully to consider the opportunity to investigate ¹³⁷Cs-derived net soil redistribution across scales of variation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Fuzzy/Neural Software Estimates Costs of Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Douglas, Freddie; Bourgeois, Edit Kaminsky

    2005-01-01

    The Highly Accurate Cost Estimating Model (HACEM) is a software system for estimating the costs of testing rocket engines and components at Stennis Space Center. HACEM is built on a foundation of adaptive-network-based fuzzy inference systems (ANFIS), a hybrid software concept that combines the adaptive capabilities of neural networks with the ease of development and additional benefits of fuzzy-logic-based systems. In ANFIS, fuzzy inference systems are trained by use of neural networks. HACEM includes selectable subsystems that utilize various numbers and types of inputs, various numbers of fuzzy membership functions, and various input-preprocessing techniques. The inputs to HACEM are parameters of specific tests or series of tests. These parameters include test type (component or engine test), number and duration of tests, and thrust level(s) (in the case of engine tests). The ANFIS in HACEM are trained by use of sets of these parameters, along with costs of past tests. Thereafter, the user feeds HACEM a simple input text file that contains the parameters of a planned test or series of tests, selects the desired HACEM subsystem, and the subsystem processes the parameters into an estimate of cost(s).
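
    The forward pass that an ANFIS tunes is Sugeno-type fuzzy inference: rule firing strengths, computed from membership functions, weight linear rule consequents. A minimal one-input sketch follows; the rule parameters are entirely hypothetical and are not HACEM's.

```python
import math

def gauss_mf(x, center, spread):
    """Gaussian membership function."""
    return math.exp(-((x - center) / spread) ** 2)

def sugeno_cost_estimate(duration_hr, rules):
    """Forward pass of a one-input, first-order Sugeno fuzzy system (the
    kind of inference an ANFIS tunes): each rule is
    (mf_center, mf_spread, slope, intercept); firing strengths weight the
    linear consequents and the result is their normalized average."""
    w = [gauss_mf(duration_hr, c, s) for c, s, _, _ in rules]
    z = [a * duration_hr + b for _, _, a, b in rules]
    return sum(wi * zi for wi, zi in zip(w, z)) / sum(w)

rules = [  # (center, spread, slope, intercept) - hypothetical values
    (10.0, 5.0, 2.0, 50.0),   # "short test" rule
    (40.0, 10.0, 3.5, 80.0),  # "long test" rule
]
est = sugeno_cost_estimate(25.0, rules)
```

    In ANFIS, both the membership-function parameters and the consequent coefficients would be fitted to past test costs by the neural-network training step.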

  16. A model for calculating the costs of in vivo dosimetry and portal imaging in radiotherapy departments.

    PubMed

    Kesteloot, K; Dutreix, A; van der Schueren, E

    1993-08-01

    The costs of in vivo dosimetry and portal imaging in radiotherapy are estimated on the basis of a detailed overview of the activities involved in both quality assurance techniques. These activities require the availability of equipment, the use of material, and workload. The cost calculations lead to the conclusion that for most departments in vivo dosimetry with diodes will be a cheaper alternative than in vivo dosimetry with TLD meters. Whether TLD measurements can be performed more cheaply with an automatic reader (higher equipment cost, but lower workload) or with a semi-automatic reader (lower equipment cost, but higher workload) depends on the number of checks in the department. LSP systems (with a very high equipment cost) as well as on-line imaging systems will be cheaper portal imaging techniques than conventional port films (with high material costs) for large departments, or for smaller departments that perform frequent volume checks.

  17. Some insight on censored cost estimators.

    PubMed

    Zhao, H; Cheng, Y; Bang, H

    2011-08-30

    Censored survival data analysis has been studied for many years. Yet the analysis of censored mark variables, such as medical cost, quality-adjusted lifetime, and repeated events, faces a unique challenge that makes standard survival analysis techniques invalid. Because of the 'informative' censorship embedded in censored mark variables, the use of the Kaplan-Meier (Journal of the American Statistical Association 1958; 53:457-481) estimator, as an example, will produce biased estimates. Innovative estimators have been developed in the past decade to handle this issue. Even though consistent estimators have been proposed, the formulations and interpretations of some estimators are less intuitive to practitioners. On the other hand, more intuitive estimators have been proposed, but their mathematical properties have not been established. In this paper, we prove the analytic identity between some estimators (a statistically motivated estimator and an intuitive estimator) for censored cost data. Efron (1967) made a similar investigation for censored survival data (between the Kaplan-Meier estimator and the redistribute-to-the-right algorithm). Therefore, we view our study as an extension of Efron's work to informatively censored data, so that our findings can be applied to other mark variables. Copyright © 2011 John Wiley & Sons, Ltd.
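
    One of the statistically motivated estimators in this literature is the inverse-probability-of-censoring weighted ("simple weighted") estimator of mean cost. The sketch below assumes the Bang-Tsiatis form, with a left-continuous Kaplan-Meier estimate of the censoring distribution; it is an illustration, not the paper's exact formulation.

```python
def km_survival(times, events):
    """Kaplan-Meier survival steps [(t, S(t))]; events[i] == 1 marks the
    event of interest at follow-up time times[i]."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, s, curve, i = len(times), 1.0, [], 0
    while i < len(order):
        t, n, d = times[order[i]], at_risk, 0
        while i < len(order) and times[order[i]] == t:
            d += events[order[i]]
            at_risk -= 1
            i += 1
        if d:
            s *= 1.0 - d / n
            curve.append((t, s))
    return curve

def surv_before(curve, t):
    """Left-continuous evaluation S(t-): the KM value just before t."""
    s = 1.0
    for ti, si in curve:
        if ti < t:
            s = si
    return s

def ipw_mean_cost(times, deaths, costs):
    """Simple weighted estimator: average over all n subjects, counting
    only uncensored (deaths[i] == 1) costs, each upweighted by
    1 / K(T_i-), where K is the KM curve of the *censoring* times."""
    censored = [1 - d for d in deaths]
    kc = km_survival(times, censored)
    n = len(times)
    return sum(c / surv_before(kc, t)
               for t, d, c in zip(times, deaths, costs) if d == 1) / n
```

    With no censoring the weights are all 1 and the estimator reduces to the plain sample mean, which is the intuition behind the identity results discussed above.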

  18. Cost Risk Analysis Based on Perception of the Engineering Process

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of the expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This introduces errors from at least three sources. The historical cost data may be in error by some unknown amount. The manager's evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall, and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed.
The engineering process parameters are elicited from the engineer/expert on the project and are based on that expert's technical knowledge. These are converted by a parametric cost model into a cost estimate. The method discussed makes no assumptions about the distribution underlying the distribution of possible costs, and is not tied to the analysis of previous projects, except through the expert calibrations performed by the parametric cost analyst.
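
    A common way to realize such a cost risk curve is Monte Carlo sampling over per-element cost distributions. The sketch below uses triangular (low/mode/high) elements with invented values and reads off the cost not exceeded at selected confidence levels; the record above derives its element estimates from expert-elicited engineering parameters instead of assumed triangles.

```python
import random

def cost_risk_curve(cost_elements, n=20000, seed=1):
    """Monte Carlo cost risk sketch: each element is (low, mode, high);
    draw a triangular sample per element, sum to a total project cost,
    and report the cost not exceeded at selected probability levels."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in cost_elements)
        for _ in range(n)
    )
    return {p: totals[int(p * n) - 1] for p in (0.1, 0.5, 0.9)}

elements = [(80, 100, 150), (40, 55, 90), (20, 25, 45)]  # hypothetical $K items
curve = cost_risk_curve(elements)
```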

  19. Analysis of space tug operating techniques (study 2.4). Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The costs of tug refurbishment were studied, using existing cost estimating relationships, to establish the cost of maintaining the reusable third stage of the space transportation system. Refurbishment operations sheets, which describe the actual tasks necessary to keep the equipment functioning properly and contain all of the pertinent descriptive information for each of the major vehicle areas, were used. Tug refurbishment costs per mission are tabulated.

  20. Taking ART to Scale: Determinants of the Cost and Cost-Effectiveness of Antiretroviral Therapy in 45 Clinical Sites in Zambia

    PubMed Central

    Marseille, Elliot; Giganti, Mark J.; Mwango, Albert; Chisembele-Taylor, Angela; Mulenga, Lloyd; Over, Mead; Kahn, James G.; Stringer, Jeffrey S. A.

    2012-01-01

    Background We estimated the unit costs and cost-effectiveness of a government ART program in 45 sites in Zambia supported by the Centre for Infectious Disease Research Zambia (CIDRZ). Methods We estimated per person-year costs at the facility level and support costs incurred above the facility level, and used multiple regression to estimate variation in these costs. To estimate ART effectiveness, we compared mortality in this Zambian population to that of a cohort of rural Ugandan HIV patients receiving co-trimoxazole (CTX) prophylaxis. We used micro-costing techniques to estimate incremental unit costs, and calculated cost-effectiveness ratios with a computer model which projected results to 10 years. Results The program cost $69.7 million for 125,436 person-years of ART, or $556 per ART-year. Compared to CTX prophylaxis alone, the program averted 33.3 deaths or 244.5 disability-adjusted life-years (DALYs) per 100 person-years of ART. In the base-case analysis, the net cost per DALY averted was $833 compared to CTX alone. More than two-thirds of the variation in average incremental total and on-site cost per patient-year of treatment is explained by eight determinants, including the complexity of the patient case load, the degree of adherence among the patients, and institutional characteristics including experience, scale, scope, setting and sector. Conclusions and Significance The 45 sites exhibited substantial variation in unit costs and cost-effectiveness and are in the mid-range of cost-effectiveness when compared to other ART programs studied in southern Africa. Early treatment initiation, large scale, and hospital setting are associated with statistically significantly lower costs, while other factors (rural location, private sector) are associated with shifting cost from on-site to off-site.
This study shows that ART programs can be significantly less costly or more cost-effective when they exploit economies of scale and scope, and initiate patients at higher CD4 counts. PMID:23284843

  2. Experiments, conceptual design, preliminary cost estimates and schedules for an underground research facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korbin, G.; Wollenberg, H.; Wilson, C.

    Plans for an underground research facility are presented, incorporating techniques to assess the hydrological and thermomechanical response of a rock mass to the introduction and long-term isolation of radioactive waste, and to assess the effects of excavation on the hydrologic integrity of a repository and its subsequent backfill, plugging, and sealing. The project is designed to utilize existing mine or civil works for access to experimental areas and is estimated to last 8 years at a total cost for construction and operation of $39.0 million (1981 dollars). Performing the same experiments in an existing underground research facility would reduce the duration to 7-1/2 years and cost $27.7 million as a lower-bound estimate. These preliminary plans and estimates should be revised after specific sites are identified which would accommodate the facility.

  3. Investigation of spectral analysis techniques for randomly sampled velocimetry data

    NASA Technical Reports Server (NTRS)

    Sree, Dave

    1993-01-01

    It is well known that laser velocimetry (LV) generates individual-realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scales information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'direct transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, as to how high in frequency an accurate estimate can be made for a given mean sampling rate. These high frequency estimates are important in obtaining the microscale information of turbulence structure. It was found from previous studies that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e., up to the Nyquist frequency); otherwise, an aliasing problem would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of the low frequency estimates. The prefiltering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson-sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far?
During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be extended up to about 4-5 times the mean sampling frequency by using a suitable prefiltering technique. This increased bandwidth, however, comes at the cost of the lower frequency estimates. The studies further showed that large data sets of the order of 100,000 points or more, high data rates, and Poisson sampling are crucial for obtaining reliable spectral estimates from randomly sampled data, such as LV data. Some of the results of the current study are presented.
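
    The slotting technique referred to above can be sketched as follows: every pair of samples is binned ("slotted") by time lag, and products of mean-removed values are averaged per bin to estimate the autocorrelation, from which a spectrum can later be obtained by cosine transform. This is a bare-bones illustration without the local-normalization or prefiltering refinements discussed in the literature.

```python
import math
import random

def slotted_autocorrelation(t, u, dt, n_slots):
    """Slotting sketch: bin sample-pair products by time lag to estimate
    the autocorrelation R(k*dt) from randomly sampled data u(t).
    t must be sorted ascending."""
    mean = sum(u) / len(u)
    f = [x - mean for x in u]
    sums = [0.0] * n_slots
    counts = [0] * n_slots
    for i in range(len(t)):
        for j in range(i, len(t)):
            k = int((t[j] - t[i]) / dt + 0.5)  # nearest slot index
            if k >= n_slots:
                break  # lags only grow with j since t is sorted
            sums[k] += f[i] * f[j]
            counts[k] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# Randomly sampled 0.5 Hz cosine over 50 s (synthetic stand-in for LV data)
rng = random.Random(0)
t = sorted(rng.uniform(0.0, 50.0) for _ in range(400))
u = [math.cos(2 * math.pi * 0.5 * x) for x in t]
r = slotted_autocorrelation(t, u, dt=0.1, n_slots=20)
```

    For this 2 s period signal, r[0] approximates the signal variance and the slot near a 1 s lag (half a period) comes out negative, as expected.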

  4. Economic implications of current systems

    NASA Technical Reports Server (NTRS)

    Daniel, R. E.; Aster, R. W.

    1983-01-01

    The primary goals of this study are to estimate the value of R&D to photovoltaic (PV) metallization systems cost, and to provide a method for selecting an optimal metallization method for any given PV system. The value-added cost and relative electrical performance of 25 state-of-the-art (SOA) and advanced metallization system techniques are compared.

  5. 75 FR 44848 - Proposed Collection; Comment Request for Announcement 2004-43

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-29

    ... Benefit Guaranty Corporation within 30 days of making an election to take advantage of the alternative... this time. Type of Review: Extension of a currently approved collection. Affected Public: Business or... techniques or other forms of information technology; and (e) estimates of capital or start-up costs and costs...

  6. 78 FR 16916 - Proposed Collection; Comment Request for Regulation Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-19

    ... guidance to a passive foreign investment company (PFIC) shareholder that makes the election under Code... be received on or before May 20, 2013 to be assured of consideration. ADDRESSES: Direct all written... techniques or other forms of information technology; and (e) estimates of capital or start-up costs and costs...

  7. 48 CFR 2015.304 - Evaluation factors.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 2015.304... numerically weighted are conflict of interest, estimated cost, and “go/no go” evaluation factors. ...

  8. Using decision modeling to determine pricing of new pharmaceuticals: the case of neurokinin-1 receptor antagonist antiemetics for cancer chemotherapy.

    PubMed

    Dranitsaris, George; Leung, Pauline

    2004-01-01

    Decision analysis is commonly used to perform economic evaluations of new pharmaceuticals. The outcomes of such studies are often reported as an incremental cost per quality-adjusted life year (QALY) gained with the new agent. Decision analysis can also be used in the context of estimating drug cost before market entry. The current study used neurokinin-1 (NK-1) receptor antagonists, a new class of antiemetics for cancer patients, as an example to illustrate the process, using an incremental cost of Can$20,000 per QALY gained as the target threshold. A decision model was developed to simulate the control of acute and delayed emesis after cisplatin-based chemotherapy. The model compared standard therapy with granisetron and dexamethasone to the same protocol with the addition of an NK-1 before chemotherapy, continued twice daily for five days. The rates of complete emesis control were abstracted from a double-blind randomized trial. Costs of standard antiemetics and therapy for breakthrough vomiting were obtained from hospital sources. Utility estimates, characterized as quality-adjusted emesis-free days, were determined by interviewing twenty-five oncology nurses and pharmacists using the Time Trade-Off technique. These data were then used to estimate the unit cost of the new antiemetic using a target threshold of Can$20,000 per QALY gained. A cost of Can$6.60 per NK-1 dose would generate an incremental cost of Can$20,000 per QALY. The sensitivity analysis on the unit cost identified a range from Can$4.80 to Can$10.00 per dose. For the recommended five days of therapy, the total cost should be Can$66.00 (Can$48.00-Can$100.00) for optimal economic efficiency relative to Canada's publicly funded health-care system. The use of decision modeling for estimating drug cost before product launch is a powerful technique to ensure value for money.
Such information can be of value to both drug manufacturers and formulary committees, because it would facilitate negotiations for optimal pricing in a given jurisdiction.
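
    The core of this pricing exercise is algebraic: fix the willingness-to-pay threshold and solve the incremental cost-effectiveness ratio for the drug price. The sketch below uses a deliberately simplified model (the published decision model has many more branches), and all numeric inputs are hypothetical rather than the study's values.

```python
def threshold_price_per_dose(qaly_gain, other_cost_change, doses_per_course,
                             wtp_per_qaly=20000.0):
    """Solve (doses * price + other_cost_change) / qaly_gain = wtp for the
    per-dose drug price that exactly meets the cost-per-QALY threshold.
    other_cost_change is the change in non-drug costs (negative when the
    new agent avoids breakthrough-treatment costs)."""
    return (wtp_per_qaly * qaly_gain - other_cost_change) / doses_per_course

# Hypothetical inputs: 0.005 QALYs gained per patient, Can$10 of other
# costs avoided, 10 doses per course
price = threshold_price_per_dose(0.005, -10.0, 10)  # Can$11.00 per dose
```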

  9. Economic analysis of electronic waste recycling: modeling the cost and revenue of a materials recovery facility in California.

    PubMed

    Kang, Hai-Yong; Schoenung, Julie M

    2006-03-01

    The objectives of this study are to identify the various techniques used for treating electronic waste (e-waste) at material recovery facilities (MRFs) in the state of California and to investigate the costs and revenue drivers for these techniques. The economics of a representative e-waste MRF are evaluated by using technical cost modeling (TCM). MRFs are a critical element in the infrastructure being developed within the e-waste recycling industry. At an MRF, collected e-waste can become marketable output products including resalable systems/components and recyclable materials such as plastics, metals, and glass. TCM has two main constituents, inputs and outputs. Inputs are process-related and economic variables, which are directly specified in each model. Inputs can be divided into two parts: inputs for cost estimation and for revenue estimation. Outputs are the results of modeling and consist of costs and revenues, distributed by unit operation, cost element, and revenue source. The results of the present analysis indicate that the largest cost driver for the operation of the defined California e-waste MRF is the materials cost (37% of total cost), which includes the cost to outsource the recycling of the cathode ray tubes (CRTs) ($0.33/kg); the second largest cost driver is labor cost (28% of total cost, without accounting for overhead). The other cost drivers are transportation, building, and equipment costs. The most costly unit operation is cathode ray tube glass recycling, and the next are sorting, collecting, and dismantling. The largest revenue source is the fee charged to the customer; metal recovery is the second largest revenue source.
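
    The TCM input/output structure described above can be illustrated with a toy roll-up in which annualized fixed costs are spread over facility throughput and variable costs are carried per kilogram. All figures below are hypothetical and are not the study's data.

```python
def unit_cost_breakdown(throughput_kg_per_yr, cost_elements):
    """TCM-style roll-up: annualized fixed costs are spread over
    throughput; variable costs are already expressed per kg.
    Returns $/kg by cost element."""
    out = {}
    for name, spec in cost_elements.items():
        if spec["type"] == "fixed":
            out[name] = spec["annual_cost"] / throughput_kg_per_yr
        else:
            out[name] = spec["per_kg"]
    return out

elements = {  # hypothetical figures for illustration only
    "materials (incl. CRT outsourcing)": {"type": "variable", "per_kg": 0.33},
    "labor": {"type": "fixed", "annual_cost": 500_000.0},
    "building": {"type": "fixed", "annual_cost": 120_000.0},
    "equipment": {"type": "fixed", "annual_cost": 90_000.0},
    "transportation": {"type": "variable", "per_kg": 0.05},
}
per_kg = unit_cost_breakdown(2_000_000.0, elements)
total_per_kg = sum(per_kg.values())
```

    A full TCM would pair this cost side with a revenue model (customer fees, metal recovery) distributed by unit operation, as the abstract describes.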

  10. Simple and accurate methods for quantifying deformation, disruption, and development in biological tissues

    PubMed Central

    Boyle, John J.; Kume, Maiko; Wyczalkowski, Matthew A.; Taber, Larry A.; Pless, Robert B.; Xia, Younan; Genin, Guy M.; Thomopoulos, Stavros

    2014-01-01

    When mechanical factors underlie growth, development, disease or healing, they often function through local regions of tissue where deformation is highly concentrated. Current optical techniques to estimate deformation can lack precision and accuracy in such regions due to challenges in distinguishing a region of concentrated deformation from an error in displacement tracking. Here, we present a simple and general technique for improving the accuracy and precision of strain estimation and an associated technique for distinguishing a concentrated deformation from a tracking error. The strain estimation technique improves accuracy relative to other state-of-the-art algorithms by directly estimating strain fields without first estimating displacements, resulting in a very simple method and low computational cost. The technique for identifying local elevation of strain enables for the first time the successful identification of the onset and consequences of local strain concentrating features such as cracks and tears in a highly strained tissue. We apply these new techniques to demonstrate a novel hypothesis in prenatal wound healing. More generally, the analytical methods we have developed provide a simple tool for quantifying the appearance and magnitude of localized deformation from a series of digital images across a broad range of disciplines. PMID:25165601

  11. Maximizing mitigation benefits : project summary.

    DOT National Transportation Integrated Search

    2016-04-30

    The research team: : - Reviewed methods, techniques, and : processes at select state DOTs for estimating : mitigations costs for the following states: : Arizona, California, Colorado, Florida, New : York, North Carolina, Ohio, Oregon, : Pennsylvania,...

  12. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.

  13. The Human Resource Management Information Network (HRMIN): A Cost Comparison in Accordance with Office of Management and Budget (OMB) Circular No. A-76, of 5 April 1979.

    DTIC Science & Technology

    1984-12-01

    documentation details to support this presentation can be obtained from NMPC and NPRDC RMS accounting records. Cost estimates and their underlying ... Economics, Application, Science Research Associates, 1974. Horngren, C.T., Cost Accounting: A Managerial Emphasis, Prentice-Hall, 1977. Krauss, L.I. ... that will be encountered in the course of this work. Techniques and terms used in Managerial and Cost Accounting, Economics, the Behavioral Sciences

  14. Direct costs of osteoporosis and hip fracture: an analysis for the Mexican healthcare system.

    PubMed

    Clark, P; Carlos, F; Barrera, C; Guzman, J; Maetzel, A; Lavielle, P; Ramirez, E; Robinson, V; Rodriguez-Cabrera, R; Tamayo, J; Tugwell, P

    2008-03-01

    This study reports the direct costs related to osteoporosis and hip fractures paid by governmental and private institutions in the Mexican health system and estimates the economic impact of these conditions in Mexico. We conclude that the economic burden due to the direct costs of hip fracture justifies wide-scale prevention programs for osteoporosis (OP). To estimate the total direct costs of OP and hip fractures in the Mexican healthcare system, a sample of governmental and private institutions was studied. Information was gathered through direct questionnaires in 275 OP patients and 218 hip fracture cases. Additionally, a chart review was conducted and experts' opinions were obtained to establish accurate protocol scenarios for the diagnosis and treatment of OP with no fracture. Micro-costing and activity-based costing techniques were used to yield unit costs. The total direct costs for OP and hip fracture were estimated for 2006 based on the projected annual incidence of hip fractures in Mexico. A total of 22,233 hip fracture cases were estimated for 2006, with a total cost to the healthcare system of US$97,058,159 for the acute treatment alone ($4,365.50 per case). We found considerable differences in costs and in the way patients were treated across the different health sectors within the country. Costs of the acute treatment of hip fractures in Mexico are high and are expected to increase with the predicted increase in life expectancy and the number of elderly in our population.

  15. The threshold rate of oral atypical anti-psychotic adherence at which paliperidone palmitate is cost saving.

    PubMed

    Edwards, Natalie C; Muser, Erik; Doshi, Dilesh; Fastenau, John

    2012-01-01

    To identify, estimate, and compare 'real world' costs and outcomes associated with paliperidone palmitate compared with branded oral atypical anti-psychotics, and to estimate the threshold rate of oral atypical adherence at which paliperidone palmitate is cost saving. Decision analytic modeling techniques developed by Glazer and Ereshefsky have previously been used to estimate the cost-effectiveness of depot haloperidol, LAI risperidone, and, more recently, LAI olanzapine. This study used those same techniques, along with updated comparative published clinical data, to evaluate paliperidone palmitate. Adherence rates were based on strict Medication Event Monitoring System (MEMS) criteria. The evaluation was conducted from the perspective of US healthcare payers. Paliperidone palmitate patients had fewer mean annual days of relapse (8.7 days; 6.0 requiring hospitalization, 2.7 not requiring hospitalization vs 17.8 days; 12.4 requiring hospitalization, 5.4 not requiring hospitalization), and lower annual total cost ($20,995) compared to oral atypicals (mean $22,481). Because paliperidone palmitate was both more effective and less costly, it is considered economically dominant. Paliperidone palmitate saved costs when the rate of adherence of oral atypical anti-psychotics was below 44.9% using strict MEMS criteria. Sensitivity analyses showed results were robust to changes in parameter values. For patients receiving 156 mg paliperidone palmitate, the annual incremental cost was $1216 per patient (ICER = $191 per day of relapse averted). Inclusion of generic risperidone (market share 18.6%) also resulted in net incremental cost for paliperidone palmitate ($120; ICER = $13). Limitations of this evaluation include use of simplifying assumptions, data from multiple sources, and generalizability of results. 
Although uptake of LAIs in the US has not been as rapid as elsewhere, many thought leaders emphasize their importance in optimizing outcomes in patients with adherence problems. The findings of this analysis support the cost-effectiveness of paliperidone palmitate in these patients.
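
    The threshold logic described above can be sketched as a small break-even calculation. This is a hedged illustration of a Glazer/Ereshefsky-style decision model, not the study's actual model: every parameter value below, and the linear relapse-vs-adherence relationship, is an illustrative assumption.

```python
# Hedged sketch of a break-even adherence calculation for a long-acting
# injectable (LAI) vs. an oral drug. All numbers are illustrative placeholders,
# not the study's calibrated inputs; the linear model form is also an assumption.

def oral_relapse_days(adherence, days_if_adherent, days_if_nonadherent):
    """Expected annual relapse days, interpolating linearly between
    full non-adherence and full adherence."""
    return adherence * days_if_adherent + (1 - adherence) * days_if_nonadherent

def breakeven_adherence(lai_total_cost, oral_drug_cost,
                        days_if_adherent, days_if_nonadherent,
                        cost_per_relapse_day):
    """Adherence rate at which the oral arm's total cost equals the LAI's.
    Below this rate, the LAI is cost saving."""
    # Solve: oral_drug_cost + relapse_days(adh) * cost_per_relapse_day = lai_total_cost
    target_days = (lai_total_cost - oral_drug_cost) / cost_per_relapse_day
    return (target_days - days_if_nonadherent) / (days_if_adherent - days_if_nonadherent)

# Illustrative inputs: LAI arm costs $21,000/yr all-in; the oral drug alone
# costs $6,000/yr; a relapse day costs $1,000; relapse days run from 30/yr
# (fully non-adherent) down to 6/yr (fully adherent).
threshold = breakeven_adherence(21_000, 6_000, 6, 30, 1_000)
```

    Below the returned adherence rate, the oral arm's expected relapse costs exceed the LAI's cost premium, which is how a threshold such as the reported 44.9% arises.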

  16. VALIDATION OF A METHOD FOR ESTIMATING POLLUTION EMISSION RATES FROM AREA SOURCES USING OPEN-PATH FTIR SPECTROSCOPY AND DISPERSION MODELING TECHNIQUES

    EPA Science Inventory

    The paper describes a methodology developed to estimate emission factors for a variety of different area sources in a rapid, accurate, and cost-effective manner. The methodology involves using an open-path Fourier transform infrared (FTIR) spectrometer to measure concentrations o...

  17. Estimating the opportunity costs of bed-days.

    PubMed

    Sandmann, Frank G; Robotham, Julie V; Deeny, Sarah R; Edmunds, W John; Jit, Mark

    2018-03-01

    Opportunity costs of bed-days are fundamental to understanding the value of healthcare systems. They greatly influence burden of disease estimations and economic evaluations involving stays in healthcare facilities. However, different estimation techniques employ assumptions that differ crucially in whether to consider the value of the second-best alternative use forgone, of any available alternative use, or the value of the actually chosen alternative. Informed by economic theory, this paper provides a taxonomic framework of methodologies for estimating the opportunity costs of resources. This taxonomy is then applied to bed-days by classifying existing approaches accordingly. We highlight differences in valuation between approaches and the perspective adopted, and we use our framework to appraise the assumptions and biases underlying the standard approaches that have been widely adopted mostly unquestioned in the past, such as the conventional use of reference costs and administrative accounting data. Drawing on these findings, we present a novel approach for estimating the opportunity costs of bed-days in terms of health forgone for the second-best patient, but expressed monetarily. This alternative approach effectively re-connects to the concept of choice and explicitly considers net benefits. It is broadly applicable across settings and for other resources besides bed-days. © 2017 The Authors Health Economics published by John Wiley & Sons Ltd.

  18. Techniques for assessing extramarket values

    Treesearch

    Donald F. Dennis

    1995-01-01

    Central to effective policy development and management of natural resources is an understanding of the trade-offs stakeholders are willing to accept and the values they hold. Although market prices reflect society's preferences to some degree, they clearly do not encompass all values or costs. Conjoint techniques offer a means to estimate and analyze stakeholder...

  19. Econometrics of inventory holding and shortage costs: the case of refined gasoline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krane, S.D.

    1985-01-01

    This thesis estimates a model of a firm's optimal inventory and production behavior in order to investigate the link between the role of inventories in the business cycle and the microeconomic incentives for holding stocks of finished goods. The goal is to estimate a set of structural cost function parameters that can be used to infer the optimal cyclical response of inventories and production to shocks in demand. To avoid problems associated with the use of value-based aggregate inventory data, an industry-level physical-unit data set for refined motor gasoline is examined. The Euler equations for a refiner's multiperiod decision problem are estimated using restrictions imposed by the rational expectations hypothesis. The model also embodies the fact that, in most periods, the level of shortages will be zero, and even when positive, the shortages are not directly observable in the data set. These two concerns lead us to use a generalized method of moments estimation technique on a functional form that resembles the formulation of a Tobit problem. The estimation results are disappointing; the model and data yield coefficient estimates incongruous with the cost function interpretations of the structural parameters. There is only some superficial evidence that production smoothing is significant and that marginal inventory shortage costs increase at a faster rate than do marginal holding costs.

  20. Household water use and conservation models using Monte Carlo techniques

    NASA Astrophysics Data System (ADS)

    Cahill, R.; Lund, J. R.; DeOreo, B.; Medellín-Azuara, J.

    2013-10-01

    The increased availability of end-use measurement studies allows for mechanistic and detailed approaches to estimating household water demand and conservation potential. This study simulates water use in a single-family residential neighborhood using Monte Carlo sampling from end-use parameter probability distributions. The model represents existing water use conditions in 2010 and is calibrated to 2006-2011 metered data. A two-stage mixed-integer optimization model is then developed to estimate the least-cost combination of long- and short-term conservation actions for each household. This least-cost conservation model provides an estimate of the upper bound of reasonable conservation potential for varying pricing and rebate conditions. The models were adapted from previous work in Jordan and are applied to a neighborhood in San Ramon, California, in the eastern San Francisco Bay Area. The existing conditions model produces seasonal use results very close to the metered data. The least-cost conservation model suggests clothes washer rebates are among the most cost-effective rebate programs for indoor uses. Retrofit of faucets and toilets is also cost-effective and holds the highest potential for water savings from indoor uses. This mechanistic modeling approach can improve understanding of water demand and estimate the cost-effectiveness of water conservation programs.
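
    The core Monte Carlo step (sampling end-use parameters from probability distributions and aggregating to household use) can be sketched as follows. The end uses chosen and their distribution parameters are illustrative assumptions, not the calibrated San Ramon values.

```python
import random

# Minimal Monte Carlo sketch of the end-use approach: draw end-use parameters
# from probability distributions, aggregate to daily household use, and
# repeat to estimate the distribution of use. Means/SDs are placeholders.

def simulate_daily_indoor_use(rng):
    """One Monte Carlo draw of daily indoor use (gallons per household)."""
    showers_per_day = max(0.0, rng.gauss(2.0, 0.5))
    gal_per_shower  = max(0.0, rng.gauss(17.0, 4.0))
    flushes_per_day = max(0.0, rng.gauss(10.0, 2.0))
    gal_per_flush   = max(0.0, rng.gauss(2.5, 0.8))
    faucet_gal      = max(0.0, rng.gauss(25.0, 6.0))
    return (showers_per_day * gal_per_shower
            + flushes_per_day * gal_per_flush
            + faucet_gal)

rng = random.Random(42)  # seeded for reproducibility
draws = [simulate_daily_indoor_use(rng) for _ in range(10_000)]
mean_use = sum(draws) / len(draws)
```

    A conservation action (e.g., a low-flow toilet retrofit) would enter such a model by shifting the relevant parameter distribution, and the resulting change in mean use feeds the least-cost optimization.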

  1. Glucose-6-phosphate dehydrogenase deficiency and the use of primaquine: top-down and bottom-up estimation of professional costs.

    PubMed

    Peixoto, Henry Maia; Brito, Marcelo Augusto Mota; Romero, Gustavo Adolfo Sierra; Monteiro, Wuelton Marcelo; Lacerda, Marcus Vinícius Guimarães de; Oliveira, Maria Regina Fernandes de

    2017-10-05

    The aim of this study was to assess whether the top-down method, based on the average value identified in the Brazilian Hospitalization System (SIH/SUS), is a good estimator of the cost of health professionals per patient, using the bottom-up method for comparison. The study was developed in the context of hospital care offered to a patient with glucose-6-phosphate dehydrogenase (G6PD) deficiency experiencing a severe adverse effect from the use of primaquine, in the Brazilian Amazon. The top-down method, based on spending with SIH/SUS professional services as a proxy for this cost, yielded R$60.71, while the bottom-up method, based on the salaries of the physician (R$30.43), nurse (R$16.33), and nursing technician (R$5.93), estimated a total cost of R$52.68. The difference was only R$8.03, which shows that the amounts paid through the Hospital Inpatient Authorization (AIH) are estimates close to those obtained by the bottom-up technique for the professionals directly involved in the care.
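
    The bottom-up figure above is simply the sum of the per-professional salary costs; redoing the arithmetic makes the comparison explicit (the small discrepancy against the published R$52.68 and R$8.03 presumably reflects rounding of the published inputs):

```python
# Reproducing the top-down vs. bottom-up comparison from the reported figures.
# Note the published totals (R$52.68, R$8.03) differ in the last digit,
# presumably because they were computed from unrounded inputs.
physician, nurse, nursing_technician = 30.43, 16.33, 5.93

bottom_up = physician + nurse + nursing_technician   # sums to ~R$52.69
top_down = 60.71                                     # SIH/SUS average (proxy)
difference = top_down - bottom_up                    # ~R$8.02
```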

  2. A Leo Satellite Navigation Algorithm Based on GPS and Magnetometer Data

    NASA Technical Reports Server (NTRS)

    Deutschmann, Julie; Harman, Rick; Bar-Itzhack, Itzhack

    2001-01-01

    The Global Positioning System (GPS) has become a standard method for low cost onboard satellite orbit determination. The use of a GPS receiver as an attitude and rate sensor has also been developed in the recent past. Additionally, focus has been given to attitude and orbit estimation using the magnetometer, a low cost, reliable sensor. Combining measurements from both GPS and a magnetometer can provide a robust navigation system that takes advantage of the estimation qualities of both measurements. Ultimately, a low cost, accurate navigation system can result, potentially eliminating the need for more costly sensors, including gyroscopes. This work presents the development of a technique to eliminate numerical differentiation of the GPS phase measurements and also compares the use of one versus two GPS satellites.

  3. Design of surface-water data networks for regional information

    USGS Publications Warehouse

    Moss, Marshall E.; Gilroy, E.J.; Tasker, Gary D.; Karlinger, M.R.

    1982-01-01

    This report describes a technique, Network Analysis of Regional Information (NARI), and the existing computer procedures that have been developed for the specification of the regional information-cost relation for several statistical parameters of streamflow. The measure of information used is the true standard error of estimate of a regional logarithmic regression. The cost is a function of the number of stations at which hydrologic data are collected and the number of years for which the data are collected. The technique can be used to obtain either (1) a minimum cost network that will attain a prespecified accuracy and reliability or (2) a network that maximizes information given a set of budgetary and time constraints.

  4. The economic implications of a multimodal analgesic regimen for patients undergoing major orthopedic surgery: a comparative study of direct costs.

    PubMed

    Duncan, Christopher M; Hall Long, Kirsten; Warner, David O; Hebl, James R

    2009-01-01

    Total knee and total hip arthroplasty (THA) are 2 of the most common surgical procedures performed in the United States and represent the greatest single Medicare procedural expenditure. This study was designed to evaluate the economic impact of implementing a multimodal analgesic regimen (Total Joint Regional Anesthesia [TJRA] Clinical Pathway) on the estimated direct medical costs of patients undergoing lower extremity joint replacement surgery. An economic cost comparison was performed on Mayo Clinic patients (n = 100) undergoing traditional total knee or total hip arthroplasty using the TJRA Clinical Pathway. Study patients were matched 1:1 with historical controls undergoing similar procedures using traditional anesthetic (non-TJRA) techniques. Matching criteria included age, sex, surgeon, type of procedure, and American Society of Anesthesiologists (ASA) physical status (PS) classification. Hospital-based direct costs were collected for each patient and analyzed in standardized inflation-adjusted constant dollars using cost-to-charge ratios, wage indexes, and physician services valued using Medicare reimbursement rates. The estimated mean direct hospital costs were compared between groups, and a subgroup analysis was performed based on ASA PS classification. The estimated mean direct hospital costs were significantly reduced among TJRA patients when compared with controls (cost difference, 1999 dollars; 95% confidence interval, 584-3231 dollars; P = 0.0004). A significant reduction in hospital-based (Medicare Part A) costs accounted for the majority of the total cost savings. Use of a comprehensive, multimodal analgesic regimen (TJRA Clinical Pathway) in patients undergoing lower extremity joint replacement surgery provides a significant reduction in the estimated total direct medical costs. 
The reduction in mean cost is primarily associated with lower hospital-based (Medicare Part A) costs, with the greatest overall cost difference appearing among patients with significant comorbidities (ASA PS III-IV patients).

  5. Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?

    NASA Technical Reports Server (NTRS)

    Lum, Karen; Hihn, Jairus; Menzies, Tim

    2006-01-01

    While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models, both because of the large variance problem inherent in cost data and because far more effort multipliers are included than the data support. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records and use multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem, a leading cause of cost model brittleness or instability.
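
    The regression-based calibration being critiqued can be sketched as a log-log least-squares fit of a COCOMO-style power law, effort = a * KSLOC**b. The data below are synthetic, with deliberately heavy multiplicative noise to mimic the large variance problem; the model form is the standard one, but all numbers are illustrative.

```python
import numpy as np

# Hedged sketch of standard regression-based effort-model calibration:
# fit effort = a * KSLOC**b by ordinary least squares in log space.
rng = np.random.default_rng(0)
ksloc = rng.uniform(10, 500, size=30)            # synthetic project sizes
true_a, true_b = 2.8, 1.05
noise = np.exp(rng.normal(0.0, 0.4, size=30))    # heavy log-normal scatter
effort = true_a * ksloc**true_b * noise          # person-months

# OLS on log(effort) = log(a) + b * log(KSLOC)
X = np.column_stack([np.ones_like(ksloc), np.log(ksloc)])
coef, *_ = np.linalg.lstsq(X, np.log(effort), rcond=None)
a_hat, b_hat = float(np.exp(coef[0])), float(coef[1])
```

    With only 30 noisy records, the recovered coefficients scatter widely from run to run, which is the instability the paper attributes to variance and over-parameterization.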

  6. Sizing and Lifecycle Cost Analysis of an Ares V Composite Interstage

    NASA Technical Reports Server (NTRS)

    Mann, Troy; Smeltzer, Stan; Grenoble, Ray; Mason, Brian; Rosario, Sev; Fairbairn, Bob

    2012-01-01

    The Interstage Element of the Ares V launch vehicle was sized using a commercially available structural sizing software tool. Two different concepts were considered, a metallic design and a composite design. Both concepts were sized using similar levels of analysis fidelity, and the influence of design details on each concept was included. Additionally, the impact of the different manufacturing techniques and failure mechanisms for composite and metallic construction was considered. Significant details were included in the analysis models of each concept, including penetrations for human access, joint connections, and secondary loading effects. The designs and results of the analysis were used to determine lifecycle cost estimates for the two Interstage designs. Lifecycle cost estimates were based on industry-provided cost data for similar launch vehicle components. The results indicated that significant mass and cost savings are attainable for the chosen composite concept as compared with a metallic option.

  7. Development of hybrid lifecycle cost estimating tool (HLCET) for manufacturing influenced design tradeoff

    NASA Astrophysics Data System (ADS)

    Sirirojvisuth, Apinut

    In complex aerospace system design, making an effective design decision requires multidisciplinary knowledge from both product and process perspectives. Integrating manufacturing considerations into the design process is most valuable during the early design stages since designers have more freedom to integrate new ideas when changes are relatively inexpensive in terms of time and effort. Several metrics related to manufacturability are cost, time, and manufacturing readiness level (MRL). Yet, there is a lack of structured methodology that quantifies how changes in the design decisions impact these metrics. As a result, a new set of integrated cost analysis tools are proposed in this study to quantify the impacts. Equally important is the capability to integrate this new cost tool into the existing design methodologies without sacrificing agility and flexibility required during the early design phases. To demonstrate the applicability of this concept, a ModelCenter environment is used to develop software architecture that represents Integrated Product and Process Development (IPPD) methodology used in several aerospace systems designs. The environment seamlessly integrates product and process analysis tools and makes effective transition from one design phase to the other while retaining knowledge gained a priori. Then, an advanced cost estimating tool called Hybrid Lifecycle Cost Estimating Tool (HLCET), a hybrid combination of weight-, process-, and activity-based estimating techniques, is integrated with the design framework. A new weight-based lifecycle cost model is created based on Tailored Cost Model (TCM) equations [3]. This lifecycle cost tool estimates the program cost based on vehicle component weights and programmatic assumptions. Additional high fidelity cost tools like process-based and activity-based cost analysis methods can be used to modify the baseline TCM result as more knowledge is accumulated over design iterations. 
Therefore, with this concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher-fidelity tradeoffs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM result for composite part production cost. The reason for the replacement is that TCM estimates production costs from part weights on the assumption of subtractive manufacturing of metallic origin, such as casting, forging, and machining processes. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique, the Maynard Operation Sequence Technique (MOST), used similarly to the Activity-Based Costing (ABC) approach to estimate the manufacturing time of a part by breaking down the operations performed during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to the traditional method of time estimation by analogy or by response surface equations fit to historical process data. The MOST concept provides a tailored study of an individual process, which is typically required for a new, innovative design. Nevertheless, MOST has some challenges, one of which is its requirement to build a new process from the ground up. The process development requires a Subject Matter Expert (SME) in the manufacturing method of the particular design. The SME must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside the design discipline and prior training in MOST. 
To relieve this constraint, the study includes an entirely new sub-system architecture comprising 1) a knowledge-based system that provides the required knowledge during process selection, and 2) a new user interface that guides parameter selection when building the process using MOST. Also included is a demonstration of how HLCET and its constituents can be integrated with the Georgia Tech Integrated Product and Process Development (IPPD) methodology. The applicability of this work is shown through a complex aerospace design example to gain insight into how manufacturing knowledge helps make better design decisions during the early stages. The setup process is explained, and its utility is demonstrated in a hypothetical fighter aircraft wing redesign. An evaluation of the system's effectiveness against existing methodologies concludes the thesis.
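
    A MOST-style roll-up of operation times into a labor cost can be sketched as follows. The operation names, index values, and labor rate are hypothetical; only the TMU scaling (each index value worth 10 TMU in Basic MOST, 1 TMU = 0.036 s) reflects the published technique.

```python
# Toy MOST-style roll-up: sum sequence-model index values per operation,
# scale to TMU, convert to time, and cost at a labor rate. All operation
# names, index values, and the rate are hypothetical illustrations.
TMU_SECONDS = 0.036   # 1 TMU = 0.00001 h = 0.036 s (Basic MOST convention)

operations = {
    # name: sequence-model index values (each index is worth 10 TMU)
    "place_ply": [6, 6, 1, 1, 0, 3, 0],
    "debulk":    [3, 1, 1, 10, 1, 3, 0],
    "trim_edge": [3, 3, 1, 1, 0, 1, 0],
}

def operation_seconds(index_values):
    """Time for one operation: sum of indices x 10 TMU x 0.036 s/TMU."""
    return sum(index_values) * 10 * TMU_SECONDS

total_seconds = sum(operation_seconds(v) for v in operations.values())
labor_rate_per_hour = 95.0   # hypothetical fully burdened rate, $/h
labor_cost = total_seconds / 3600.0 * labor_rate_per_hour
```

    In an HLCET-like tool, such per-operation costs would replace or adjust the weight-based TCM estimate for the affected parts.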

  8. Valuing morbidity from wildfire smoke exposure: A comparison of revealed and stated preference techniques

    Treesearch

    Leslie Richardson; John B. Loomis; Patricia A. Champ

    2013-01-01

    Estimating the economic benefits of reduced health damages due to improvements in environmental quality continues to challenge economists. We review welfare measures associated with reduced wildfire smoke exposure, and a unique dataset from California's Station Fire of 2009 allows for a comparison of cost of illness (COI) estimates with willingness to pay (WTP)...

  9. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  10. Good Manufacturing Practices (GMP) manufacturing of advanced therapy medicinal products: a novel tailored model for optimizing performance and estimating costs.

    PubMed

    Abou-El-Enein, Mohamed; Römhild, Andy; Kaiser, Daniel; Beier, Carola; Bauer, Gerhard; Volk, Hans-Dieter; Reinke, Petra

    2013-03-01

    Advanced therapy medicinal products (ATMP) have gained considerable attention in academia due to their therapeutic potential. Good Manufacturing Practice (GMP) principles ensure the quality and sterility of these products during manufacturing. We developed a model for estimating the manufacturing costs of cell therapy products and optimizing the performance of academic GMP facilities. The "Clean-Room Technology Assessment Technique" (CTAT) was tested prospectively in the GMP facility of the BCRT, Berlin, Germany, then retrospectively in the GMP facility of the University of California-Davis, California, USA. CTAT is a two-level model: level one identifies operational (core) processes and measures their fixed costs; level two identifies production (supporting) processes and measures their variable costs. The model comprises several tools to measure and optimize the performance of these processes. Manufacturing costs were itemized using an adjusted micro-costing system. CTAT identified GMP activities with a strong correlation to the manufacturing process of cell-based products. Building best-practice standards allowed for performance improvement and the elimination of human errors. The model also demonstrated the unidirectional dependencies that may exist among the core GMP activities. When compared to traditional business models, the CTAT assessment resulted in a more accurate allocation of annual expenses. The estimated expenses were used to set a fee structure for both GMP facilities. A mathematical equation was also developed to provide the final product cost. CTAT can be a useful tool in estimating accurate costs for the ATMPs manufactured in an optimized GMP process. These estimates are useful when analyzing the cost-effectiveness of these novel interventions. Copyright © 2013 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
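
    The two-level structure of such a model can be illustrated schematically: fixed annual costs of core processes are allocated across annual batches, and per-batch variable production costs are added. The process names and figures below are invented for illustration and are not CTAT's published parameters or equation.

```python
# Schematic two-level cost roll-up in the spirit of a fixed/variable GMP
# costing model. All process names and figures are illustrative assumptions.
fixed_annual = {
    "facility_maintenance": 120_000,   # level one: core processes, fixed $/yr
    "qc_and_release":        80_000,
    "staff_training":        40_000,
}
batches_per_year = 60

variable_per_batch = {
    "reagents":          2_500,        # level two: production processes, $/batch
    "consumables":         900,
    "sterility_testing":   600,
}

fixed_per_batch = sum(fixed_annual.values()) / batches_per_year
product_cost = fixed_per_batch + sum(variable_per_batch.values())
```

    The allocation step makes the dependence of product cost on facility utilization explicit: halving `batches_per_year` doubles the fixed cost carried by each batch.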

  11. Alternative Models for Small Samples in Psychological Research: Applying Linear Mixed Effects Models and Generalized Estimating Equations to Repeated Measures Data

    ERIC Educational Resources Information Center

    Muth, Chelsea; Bales, Karen L.; Hinde, Katie; Maninger, Nicole; Mendoza, Sally P.; Ferrer, Emilio

    2016-01-01

    Unavoidable sample size issues beset psychological research that involves scarce populations or costly laboratory procedures. When incorporating longitudinal designs these samples are further reduced by traditional modeling techniques, which perform listwise deletion for any instance of missing data. Moreover, these techniques are limited in their…

  12. Using Earned Value Data to Forecast the Duration of Department of Defense (DoD) Space Acquisition Programs

    DTIC Science & Technology

    2015-03-26

    acquisition programs’ cost and schedule. Many prior studies have focused on the overall cost of programs (the cost estimate at completion (EAC)) (Smoker ...regression (Smoker, 2011), the Kalman Filter Forecasting Method (Kim, 2007), and analysis of the Integrated Master Schedule (IMS). All of the...A study by Smoker demonstrated this technique by first regressing the BCWP against months and the same approach for BAC (2011). In that study

  13. Theory and Techniques for Assessing the Demand and Supply of Outdoor Recreation in the United States

    Treesearch

    H. Ken Cordell; John C. Bergstrom

    1989-01-01

    As the central analysis for the 1989 Renewable Resources Planning Act Assessment, a household market model covering 37 recreational activities was computed for the United States. Equilibrium consumption and costs were estimated, as were likely future changes in consumption and costs in response to expected demand growth and alternative development and access policies...

  14. Something old, something new, something borrowed, something blue: a framework for the marriage of health econometrics and cost-effectiveness analysis.

    PubMed

    Hoch, Jeffrey S; Briggs, Andrew H; Willan, Andrew R

    2002-07-01

    Economic evaluation is often seen as a branch of health economics divorced from mainstream econometric techniques. Instead, it is perceived as relying on statistical methods for clinical trials. Furthermore, the statistic of interest in cost-effectiveness analysis, the incremental cost-effectiveness ratio, is not amenable to regression-based methods, hence the traditional reliance on comparing aggregate measures across the arms of a clinical trial. In this paper, we explore the potential for health economists undertaking cost-effectiveness analysis to exploit the plethora of established econometric techniques through the use of the net-benefit framework - a recently suggested reformulation of the cost-effectiveness problem that avoids the reliance on cost-effectiveness ratios and their associated statistical problems. This allows the formulation of the cost-effectiveness problem within a standard regression-type framework. We provide an example with empirical data to illustrate how a regression-type framework can enhance the net-benefit method. We go on to suggest that practical advantages of the net-benefit regression approach include being able to use established econometric techniques, adjust for imperfect randomisation, and identify important subgroups in order to estimate the marginal cost-effectiveness of an intervention. Copyright 2002 John Wiley & Sons, Ltd.
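
    The net-benefit reformulation is straightforward to sketch: compute each patient's net benefit at a chosen willingness-to-pay and regress it on a treatment indicator, so the slope estimates the incremental net benefit. The data below are synthetic and all parameter values illustrative.

```python
import numpy as np

# Hedged sketch of net-benefit regression on synthetic trial data:
# NB_i = lambda * effect_i - cost_i, regressed on a treatment indicator.
rng = np.random.default_rng(1)
n = 500
treat = rng.integers(0, 2, size=n)                      # 1 = new intervention
cost = 5_000 + 1_500 * treat + rng.normal(0, 800, n)    # treatment adds cost
effect = 0.70 + 0.05 * treat + rng.normal(0, 0.10, n)   # e.g., QALYs

lam = 50_000                       # willingness to pay per unit of effect
nb = lam * effect - cost           # individual net benefit

X = np.column_stack([np.ones(n), treat])
beta, *_ = np.linalg.lstsq(X, nb, rcond=None)
inb = float(beta[1])               # incremental net benefit; true value here
                                   # is 50000 * 0.05 - 1500 = 1000
```

    Adjusting for imperfect randomisation or examining subgroups amounts to appending covariate or interaction columns to X, which is exactly the access to standard econometric machinery the paper advocates.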

  15. [Unit cost variation in a social security company in Querétaro, México].

    PubMed

    Villarreal-Ríos, Enrique; Campos-Esparza, Maribel; Garza-Elizondo, María E; Martínez-González, Lidia; Núñez-Rocha, Georgina M; Romero-Islas, Nestor R

    2006-01-01

    Comparing unit cost variation between departments and reasons for consultation in outpatient health services provided by a social security company in Querétaro, Mexico. A study of costs (in US dollars) was carried out in outpatient health service units during 2004. Fixed unit costs were estimated per department and adjusted for one year's productivity. Material, physical, and consumer resources were included. Weighting was assigned to the resources invested in each department. Unit cost was estimated using the micro-costing technique; medicaments, materials used during treatment, and reagents were considered consumer items. The unit cost resulted from adding the fixed unit cost to the variable unit cost corresponding to the reason for consultation. Unit costs were then compared between the medical units. The unit cost per month of diabetic treatment varied across the units at US$34.8, US$32.2, and US$34.0; the pap smear screening test cost US$7.2, US$8.7, and US$7.3; and dental treatment cost US$27.0, US$33.6, and US$28.7. Unit cost variation was most pronounced in the emergency room and the dental service.

  16. Micro-costing studies in the health and medical literature: protocol for a systematic review

    PubMed Central

    2014-01-01

    Background Micro-costing is a cost estimation method that allows for precise assessment of the economic costs of health interventions. It has been demonstrated to be particularly useful for estimating the costs of new interventions, for interventions with large variability across providers, and for estimating the true costs to the health system and to society. However, existing guidelines for economic evaluations do not provide sufficient detail of the methods and techniques to use when conducting micro-costing analyses. Therefore, the purpose of this study is to review the current literature on micro-costing studies of health and medical interventions, strategies, and programs to assess the variation in micro-costing methodology and the quality of existing studies. This will inform current practice in conducting and reporting micro-costing studies and lead to greater standardization in methodology in the future. Methods/Design We will perform a systematic review of the current literature on micro-costing studies of health and medical interventions, strategies, and programs. Using rigorously designed search strategies, we will search Ovid MEDLINE, EconLit, BIOSIS Previews, Embase, Scopus, and the National Health Service Economic Evaluation Database (NHS EED) to identify relevant English-language articles. These searches will be supplemented by a review of the references of relevant articles identified. Two members of the review team will independently extract detailed information on the design and characteristics of each included article using a standardized data collection form. A third reviewer will be consulted to resolve discrepancies. We will use checklists that have been developed for critical appraisal of health economics studies to evaluate the quality and potential risk of bias of included studies. 
Discussion This systematic review will provide useful information to help standardize the methods and techniques for conducting and reporting micro-costing studies in research, which can improve the quality and transparency of future studies and enhance comparability and interpretation of findings. In the long run, these efforts will facilitate clinical and health policy decision-making about resource allocation. Trial registration Systematic review registration: PROSPERO CRD42014007453. PMID:24887208

  17. Comprehensive investigation into historical pipeline construction costs and engineering economic analysis of Alaska in-state gas pipeline

    NASA Astrophysics Data System (ADS)

    Rui, Zhenhua

    This study analyzes historical cost data of 412 pipelines and 220 compressor stations. On the basis of this analysis, the study also evaluates the feasibility of an Alaska in-state gas pipeline using Monte Carlo simulation techniques. Analysis of pipeline construction costs shows that component costs, shares of cost components, and learning rates for material and labor costs vary by diameter, length, volume, year, and location. Overall average learning rates for pipeline material and labor costs are 6.1% and 12.4%, respectively. Overall average cost shares for pipeline material, labor, miscellaneous, and right of way (ROW) are 31%, 40%, 23%, and 7%, respectively. Regression models are developed to estimate pipeline component costs for different lengths, cross-sectional areas, and locations. An analysis of inaccuracy in pipeline cost estimation demonstrates that the estimation of pipeline cost components is biased except in the case of total costs. Overall overrun rates for pipeline material, labor, miscellaneous, ROW, and total costs are 4.9%, 22.4%, -0.9%, 9.1%, and 6.5%, respectively, and project size, capacity, diameter, location, and year of completion have differing degrees of impact on cost overruns of pipeline cost components. Analysis of compressor station costs shows that component costs, shares of cost components, and learning rates for material and labor costs vary with capacity, year, and location. Average learning rates for compressor station material and labor costs are 12.1% and 7.48%, respectively. Overall average cost shares of material, labor, miscellaneous, and ROW are 50.6%, 27.2%, 21.5%, and 0.8%, respectively. Regression models are developed to estimate compressor station component costs for different capacities and locations. An investigation into inaccuracies in compressor station cost estimation demonstrates that the cost estimation for compressor stations is biased except in the case of material costs.
Overall average overrun rates for compressor station material, labor, miscellaneous, land, and total costs are 3%, 60%, 2%, -14%, and 11%, respectively, and cost overruns for cost components are influenced by location and year of completion to different degrees. Monte Carlo models are developed and simulated to evaluate the feasibility of an Alaska in-state gas pipeline by assigning triangular distributions to the values of economic parameters. Simulated results show that the construction of an Alaska in-state natural gas pipeline is feasible under three scenarios: 500 million cubic feet per day (mmcfd), 750 mmcfd, and 1000 mmcfd.
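The triangular-distribution Monte Carlo step can be sketched with NumPy's triangular sampler. The distributions, tariff, and discount assumptions below are invented for illustration of the method only; they are not the study's calibrated inputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical triangular (min, mode, max) inputs for a 500 mmcfd scenario.
capex_bn = rng.triangular(2.5, 3.5, 4.5, n)      # construction cost, $bn
tariff   = rng.triangular(3.0, 4.0, 5.0, n)      # transport tariff, $/mcf
opex_bn  = rng.triangular(0.15, 0.20, 0.30, n)   # annual operating cost, $bn

throughput_mcf = 500e3 * 365                     # 500 mmcfd in mcf per year
revenue_bn = tariff * throughput_mcf / 1e9       # annual revenue, $bn

# 20-year NPV at a 10% discount rate; the project is "feasible" when NPV > 0.
r, years = 0.10, 20
annuity = (1 - (1 + r) ** -years) / r
npv_bn = (revenue_bn - opex_bn) * annuity - capex_bn
p_feasible = (npv_bn > 0).mean()
```

Repeating the simulation for each throughput scenario (500, 750, 1000 mmcfd) with its own cost distributions yields a feasibility probability per scenario rather than a single point estimate.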

  18. Life-Cycle Cost/Benefit Assessment of Expedite Departure Path (EDP)

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong Jay; Chang, Paul; Datta, Koushik

    2005-01-01

    This report presents a life-cycle cost/benefit assessment (LCCBA) of Expedite Departure Path (EDP), an air traffic control Decision Support Tool (DST) currently under development at NASA. This assessment is an update of a previous study performed by bd Systems, Inc. (bd) during FY01, with the following revisions: the life-cycle cost assessment methodology developed by bd for the previous study was refined and calibrated using Free Flight Phase 1 (FFP1) cost information for Traffic Management Advisor (TMA, or TMA-SC in the FAA's terminology). Adjustments were also made to the site selection and deployment scheduling methodology to include airspace complexity as a factor. This technique was also applied to the benefit extrapolation methodology to better estimate potential benefits for other years and at other sites. This study employed a new benefit estimation methodology because bd's previous single-year potential benefit assessment of EDP used unrealistic assumptions that resulted in optimistic estimates. The new methodology uses an air traffic simulation approach to reasonably predict the impacts of implementing EDP. The results of the cost and benefit analyses were then integrated into a life-cycle cost/benefit assessment.

  19. Microdiamond grade as a regionalised variable - some basic requirements for successful local microdiamond resource estimation of kimberlites

    NASA Astrophysics Data System (ADS)

    Stiefenhofer, Johann; Thurston, Malcolm L.; Bush, David E.

    2018-04-01

    Microdiamonds offer several advantages as a resource estimation tool, such as access to deeper parts of a deposit which may be beyond the reach of large diameter drilling (LDD) techniques, recovery of the total diamond content of the kimberlite, and a cost benefit due to cheaper treatment compared with large diameter samples. In this paper we take the first step towards local estimation by showing that microdiamond samples can be treated as a regionalised variable suitable for use in geostatistical applications, and we show examples of such output. Examples of microdiamond variograms are presented, the variance-support relationship for microdiamonds is demonstrated, and consistency of the diamond size frequency distribution (SFD) is shown with the aid of real datasets. The focus therefore is on why local microdiamond estimation should be possible, not how to generate such estimates. Data from our case studies and examples demonstrate a positive correlation between micro- and macrodiamond sample grades as well as block estimates. This relationship can be demonstrated repeatedly across multiple mining operations. The smaller sample support size for microdiamond samples is a key difference between micro- and macrodiamond estimates, and this aspect must be taken into account during the estimation process. We discuss three methods which can be used to validate or reconcile the estimates against macrodiamond data, either as estimates or in the form of production grades: (i) reconciliation using production data, (ii) comparing LDD-based grade estimates against microdiamond-based estimates, and (iii) using simulation techniques.

  20. Planning for Downtown Circulation Systems. Volume 2. Analysis Techniques.

    DOT National Transportation Integrated Search

    1983-10-01

    This volume contains the analysis and refinement stages of downtown circulator planning. Included are sections on methods for estimating patronage, costs, revenues, and impacts, and a section on methods for performing micro-level analyses.

  1. [Direct costs of medical care for patients with type 2 diabetes mellitus in Mexico micro-costing analysis].

    PubMed

    Rodríguez Bolaños, Rosibel de Los Ángeles; Reynales Shigematsu, Luz Myriam; Jiménez Ruíz, Jorge Alberto; Juárez Márquezy, Sergio Arturo; Hernández Ávila, Mauricio

    2010-12-01

    Estimate the direct cost of medical care incurred by the Mexican Social Security Institute (IMSS, Instituto Mexicano del Seguro Social) for patients with type 2 diabetes mellitus (DM2). The clinical files of 497 patients who were treated in secondary and tertiary medical care units in 2002-2004 were reviewed. Costs were quantified using a disease costing approach (DCA) from the provider's perspective, a micro-costing technique, and a bottom-up methodology. Average annual costs by diagnosis, complication, and total cost were estimated. Total IMSS DM2 annual costs were US$452 064 988, or 3.1% of operating expenses. The annual average cost per patient was US$3 193.75, with US$2 740.34 per patient without complications and US$3 550.17 per patient with complications. Hospitalization and intensive care bed-days generated the greatest expenses. The high cost of providing medical care to patients with DM2 and its complications represents an economic burden that health institutions should consider in their budgets to enable them to offer quality service that is both adequate and timely. Using the micro-costing methodology allows an approximation to real data on utilization and management of the disease.

  2. Improved Battery State Estimation Using Novel Sensing Techniques

    NASA Astrophysics Data System (ADS)

    Abdul Samad, Nassim

    Lithium-ion batteries have been considered a great complement or substitute for gasoline engines due to their high energy and power density capabilities, among other advantages. However, these types of energy storage devices are still not widespread, mainly because of their relatively high cost and safety issues, especially at elevated temperatures. This thesis extends existing methods of estimating critical battery states using model-based techniques augmented by real-time measurements from novel temperature and force sensors. Typically, temperature sensors are located near the edge of the battery, away from the hottest core cell regions, which leads to slower response times and increased errors in the prediction of core temperatures. New sensor technology allows for flexible sensor placement at the cell surface between cells in a pack. This raises questions about the optimal locations of these sensors for best observability and temperature estimation. Using a validated model, which is developed and verified using experiments in laboratory fixtures that replicate vehicle pack conditions, it is shown that optimal sensor placement can lead to better and faster temperature estimation. Another equally important state is the state of health, or the capacity fading of the cell. This thesis introduces a novel method of using force measurements for capacity fade estimation. Monitoring capacity is important for defining the range of electric vehicles (EVs) and plug-in hybrid electric vehicles (PHEVs). Current capacity estimation techniques require a full discharge to monitor capacity. The proposed method can complement or replace current methods because it only requires a shallow discharge, which is especially useful in EVs and PHEVs. Using the accurate state estimation accomplished earlier, a method for downsizing a battery pack is shown to effectively reduce the number of cells in a pack without compromising safety.
The influence on the battery performance (e.g. temperature, utilization, capacity fade, and cost) while downsizing and shifting the nominal operating SOC is demonstrated via simulations. The contributions in this thesis aim to make EVs, HEVs and PHEVs less costly while maintaining safety and reliability as more people are transitioning towards more environmentally friendly means of transportation.

  3. Estimation of under-reporting in epidemics using approximations.

    PubMed

    Gamado, Kokouvi; Streftaris, George; Zachary, Stan

    2017-06-01

    Under-reporting in epidemics, when it is ignored, leads to under-estimation of the infection rate and therefore of the reproduction number. In the case of stochastic models with temporal data, a usual approach for dealing with such issues is to apply data augmentation techniques through Bayesian methodology. Departing from earlier literature approaches implemented using reversible jump Markov chain Monte Carlo (RJMCMC) techniques, we make use of approximations to obtain faster estimation with simple MCMC. Comparisons among the methods developed here, and with the RJMCMC approach, are carried out and highlight that approximation-based methodology offers useful alternative inference tools for large epidemics, with a good trade-off between time cost and accuracy.

  4. Modeling rheumatoid arthritis using different techniques - a review of model construction and results.

    PubMed

    Scholz, Stefan; Mittendorf, Thomas

    2014-12-01

    Rheumatoid arthritis (RA) is a chronic, inflammatory disease with severe effects on the functional ability of patients. Due to its prevalence of 0.5 to 1.0 percent in western countries, new treatment options are a major concern for decision makers with regard to their budget impact. In this context, cost-effectiveness analyses are a helpful tool for evaluating new treatment options for reimbursement schemes. The objective was to analyze and compare decision-analytic modeling techniques and to explore their use in RA with regard to their advantages and shortcomings. A systematic literature review was conducted in PubMed, and 58 studies reporting health economic decision models were analyzed with regard to the modeling technique used. Of the 58 reviewed publications, 13 reported decision-tree analyses, 25 (cohort) Markov models, 13 individual sampling methods (ISM), and seven discrete event simulations (DES); 26 studies presented independently developed models and 32 were adaptations. The modeling techniques used were found to differ in their complexity and in the number of treatment options compared. Methodological features are presented in the article and a comprehensive overview of the cost-effectiveness estimates is given in Additional files 1 and 2. Compared to the other modeling techniques, ISM and DES have advantages in covering patient heterogeneity, and DES can additionally model more complex treatment sequences and competing risks in RA patients. Nevertheless, sufficient data must be available to avoid relying on assumptions in ISM and DES exercises, which would otherwise bias the results. Due to the different settings, time frames, and interventions in the reviewed publications, no direct comparison of modeling techniques was possible.
The results from other indications suggest that incremental cost-effective ratios (ICERs) do not differ significantly between Markov and DES models, but DES is able to report more outcome parameters. Given a sufficient data supply, DES is the modeling technique of choice when modeling cost-effectiveness in RA. Otherwise transparency on the data inputs is crucial for valid results and to inform decision makers about possible biases. With regard to ICERs, Markov models might provide similar estimates as more advanced modeling techniques.

  5. Evaluation of the reference unit method for herbaceous biomass estimation in native grasslands of southwestern South Dakota

    Treesearch

    Eric D. Boyda

    2013-01-01

    The high costs associated with physically harvesting plant biomass may prevent sufficient data collection, which is necessary to account for the natural variability of vegetation at a landscape scale. A biomass estimation technique was previously developed using representative samples or "reference units", which eliminated the need to harvest biomass from all...

  6. Estimating the Latent Number of Types in Growing Corpora with Reduced Cost-Accuracy Trade-Off

    ERIC Educational Resources Information Center

    Hidaka, Shohei

    2016-01-01

    The number of unique words in children's speech is one of most basic statistics indicating their language development. We may, however, face difficulties when trying to accurately evaluate the number of unique words in a child's growing corpus over time with a limited sample size. This study proposes a novel technique to estimate the latent number…

  7. The Design-To-Cost Manifold

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1990-01-01

    Design-to-cost is a popular technique for controlling costs. Although qualitative techniques exist for implementing design-to-cost, quantitative methods are sparse. In the launch vehicle and spacecraft engineering process, the question of whether to minimize mass is usually an issue, and the lack of quantification leads to arguments on both sides. This paper presents a mathematical technique which quantifies both the design-to-cost process and the mass/complexity issue. Parametric cost analysis generates and applies mathematical formulas called cost estimating relationships. In their most common forms, they are continuous and differentiable. This property permits the application of the mathematics of differentiable manifolds. Although the terminology sounds formidable, applying the techniques requires only a knowledge of linear algebra and ordinary differential equations, common subjects in undergraduate scientific and engineering curricula. When the cost c is expressed as a differentiable function of n system metrics, setting the cost c to a constant generates an (n-1)-dimensional subspace of the space of system metrics such that any set of metric values in that space satisfies the constant design-to-cost criterion. This space is a differentiable manifold to which all mathematical properties of a differentiable manifold apply. One important property is that an easily implemented system of ordinary differential equations exists which permits optimization of any function of the system metrics, mass for example, over the design-to-cost manifold. A dual set of equations defines the directions of maximum and minimum cost change. A simplified approximation of the PRICE H(TM) production cost model is used to generate this set of differential equations over [mass, complexity] space. The equations are solved in closed form to obtain the one-dimensional design-to-cost trade and design-for-cost spaces.
Preliminary results indicate that cost is relatively insensitive to changes in mass and that the reduction of complexity, both in the manufacturing process and of the spacecraft, is dominant in reducing cost.
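The constant-cost construction above can be made concrete with a toy power-law cost estimating relationship. The CER form and exponents below are hypothetical stand-ins, not the PRICE H model; they merely illustrate how a constant-cost constraint defines a one-dimensional manifold on which mass can be traded against complexity:

```python
# Toy CER: c(m, x) = a * m**alpha * x**beta, with mass m and complexity x.
# Hypothetical exponents: cost grows slowly with mass, steeply with complexity.
a, alpha, beta = 2.0, 0.3, 1.8

def cost(m, x):
    return a * m**alpha * x**beta

def mass_on_manifold(x, c0):
    """Solve c(m, x) = c0 for m: the 1-D constant design-to-cost manifold."""
    return (c0 / (a * x**beta)) ** (1.0 / alpha)

c0 = cost(1000.0, 5.0)                # fix the cost at a chosen design point
m_traded = mass_on_manifold(4.5, c0)  # mass available after a 10% complexity cut
```

With these illustrative exponents, a 10% reduction in complexity frees up roughly 88% more mass at constant cost, mirroring the abstract's finding that complexity, not mass, dominates cost.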

  8. POSTPROCESSING MIXED FINITE ELEMENT METHODS FOR SOLVING CAHN-HILLIARD EQUATION: METHODS AND ERROR ANALYSIS

    PubMed Central

    Wang, Wansheng; Chen, Long; Zhou, Jie

    2015-01-01

    A postprocessing technique for mixed finite element methods for the Cahn-Hilliard equation is developed and analyzed. Once the mixed finite element approximations have been computed at a fixed time on the coarser mesh, the approximations are postprocessed by solving two decoupled Poisson equations in an enriched finite element space (either on a finer grid or a higher-order space) for which many fast Poisson solvers can be applied. The nonlinear iteration is only applied to a much smaller problem, and the computational cost of using Newton and direct solvers is negligible compared with the cost of the linear problem. The analysis presented here shows that this technique retains the optimal rate of convergence for both the concentration and the chemical potential approximations. The corresponding error estimates obtained in our paper, especially the negative-norm error estimates, are non-trivial and differ from existing results in the literature. PMID:27110063

  9. Induction motor broken rotor bar fault location detection through envelope analysis of start-up current using Hilbert transform

    NASA Astrophysics Data System (ADS)

    Abd-el-Malek, Mina; Abdelsalam, Ahmed K.; Hassan, Ola E.

    2017-09-01

    Robustness, low running cost, and reduced maintenance have made induction motors (IMs) the dominant choice in industrial drive systems. Broken rotor bars (BRBs) are an important fault that needs to be assessed early to minimize maintenance cost and labor time. The majority of recent BRB fault diagnostic techniques focus on differentiating between a healthy and a faulty rotor cage. In this paper, a new technique is proposed for detecting the location of the broken bar in the rotor. The proposed technique relies on monitoring certain statistical parameters estimated from the analysis of the start-up stator current envelope. The envelope of the signal is obtained using the Hilbert transform (HT). The proposed technique offers a non-invasive, computationally fast, and accurate location-diagnosis process. Various simulation scenarios are presented that validate the effectiveness of the proposed technique.
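The envelope-extraction step can be sketched with an FFT-based analytic signal (a NumPy-only stand-in for routines such as scipy.signal.hilbert); the decaying 50 Hz "start-up current" below is synthetic, not motor data:

```python
import numpy as np

def envelope(x):
    """Amplitude envelope |x + i*H(x)| via the FFT-based analytic signal."""
    n = len(x)
    spec = np.fft.fft(x)
    h = np.zeros(n)          # spectral weights building the analytic signal
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(spec * h))

# Synthetic decaying 50 Hz "start-up current": the envelope recovers the decay,
# whose statistics a fault-location scheme could then monitor.
t = np.linspace(0.0, 1.0, 2000, endpoint=False)
i_start = np.exp(-3.0 * t) * np.sin(2 * np.pi * 50.0 * t)
env = envelope(i_start)
```

Away from the record edges, the computed envelope tracks the exp(-3t) modulation; a BRB signature would appear as structure superimposed on this envelope rather than on the raw oscillating current.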

  10. A comparative cost analysis of robot-assisted versus traditional laparoscopic partial nephrectomy.

    PubMed

    Hyams, Elias; Pierorazio, Philip; Mullins, Jeffrey K; Ward, Maryann; Allaf, Mohamad

    2012-07-01

    Robot-assisted laparoscopic partial nephrectomy (RALPN) is supplanting traditional laparoscopic partial nephrectomy (LPN) as the technique of choice for minimally invasive nephron-sparing surgery. This evolution has resulted from potential clinical benefits, as well as proliferation of robotic systems and patient demand for robot-assisted surgery. We sought to quantify the costs associated with the use of robotics for minimally invasive partial nephrectomy. A cost analysis was performed for 20 consecutive robot-assisted partial nephrectomy (RPN) and LPN patients at our institution from 2009 to 2010. Data included actual perioperative and hospitalization costs as well as professional fees. Capital costs were estimated using purchase costs and amortization of two robotic systems from 2001 to 2009, as well as maintenance contract costs. The estimated cost/case was obtained using total robotic surgical volume during this period. Total estimated costs were compared between groups. A separate analysis was performed assuming "ideal" robotic utilization during a comparable period. RALPN had a cost premium of +$1066/case compared with LPN, assuming actual robot utilization from 2001 to 2009. Assuming "ideal" utilization during a comparable period, this premium decreased to +$334; capital costs per case decreased from $1907 to $1175. Tumor size, operative time, and length of stay were comparable between groups. RALPN is associated with a small to moderate cost premium depending on assumptions regarding robotic surgical volume. Saturated utilization of robotic systems decreases attributable capital costs and makes comparison with laparoscopy more favorable. Purported clinical benefits of RPN (eg, decreased warm ischemia time, increased utilization of nephron-sparing surgery) need further study, because these may have cost implications.
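The utilization effect reported above (capital cost per case falling from $1907 to $1175 under "ideal" utilization) comes down to amortization arithmetic. The sketch below uses hypothetical purchase, maintenance, and volume figures, not the paper's actual inputs:

```python
# Why utilization drives the robotic capital cost per case: straight-line
# amortization of purchase price plus annual maintenance, spread over volume.
# All inputs are hypothetical illustrations, not the paper's figures.

def capital_cost_per_case(purchase, maintenance_per_yr, amort_years, cases_per_yr):
    annual_capital = purchase / amort_years + maintenance_per_yr
    return annual_capital / cases_per_yr

actual = capital_cost_per_case(1_500_000, 140_000, 7, 180)  # modest utilization
ideal  = capital_cost_per_case(1_500_000, 140_000, 7, 300)  # saturated utilization
```

Because purchase and maintenance are fixed, the per-case premium shrinks in direct proportion to case volume, which is why saturated robot utilization narrows the gap with laparoscopy.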

  11. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    PubMed

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  12. Billing and insurance-related administrative costs in United States' health care: synthesis of micro-costing evidence.

    PubMed

    Jiwani, Aliya; Himmelstein, David; Woolhandler, Steffie; Kahn, James G

    2014-11-13

    The United States' multiple-payer health care system requires substantial effort and costs for administration, with billing and insurance-related (BIR) activities comprising a large but incompletely characterized proportion. A number of studies have quantified BIR costs for specific health care sectors, using micro-costing techniques. However, variation in the types of payers, providers, and BIR activities across studies complicates estimation of system-wide costs. Using a consistent and comprehensive definition of BIR (including both public and private payers, all providers, and all types of BIR activities), we synthesized and updated available micro-costing evidence in order to estimate total and added BIR costs for the U.S. health care system in 2012. We reviewed BIR micro-costing studies across healthcare sectors. For physician practices, hospitals, and insurers, we estimated the % BIR using existing research and publicly reported data, re-calculated to a standard and comprehensive definition of BIR where necessary. We found no data on % BIR in other health services or supplies settings, so extrapolated from known sectors. We calculated total BIR costs in each sector as the product of 2012 U.S. national health expenditures and the percentage of revenue used for BIR. We estimated "added" BIR costs by comparing total BIR costs in each sector to those observed in existing, simplified financing systems (Canada's single payer system for providers, and U.S. Medicare for insurers). Due to uncertainty in inputs, we performed sensitivity analyses. BIR costs in the U.S. health care system totaled approximately $471 ($330 - $597) billion in 2012. This includes $70 ($54 - $76) billion in physician practices, $74 ($58 - $94) billion in hospitals, an estimated $94 ($47 - $141) billion in settings providing other health services and supplies, $198 ($154 - $233) billion in private insurers, and $35 ($17 - $52) billion in public insurers. 
Compared to simplified financing, $375 ($254 - $507) billion, or 80%, represents the added BIR costs of the current multi-payer system. A simplified financing system in the U.S. could result in cost savings exceeding $350 billion annually, nearly 15% of health care spending.
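The sector figures reported above sum to the headline estimates; a quick arithmetic check on the point estimates (2012, $bn):

```python
# Point estimates of 2012 BIR costs by sector, in $bn, as reported above.
bir_bn = {
    "physician practices": 70,
    "hospitals": 74,
    "other health services and supplies": 94,
    "private insurers": 198,
    "public insurers": 35,
}
total_bn = sum(bir_bn.values())      # 471
added_bn = 375                       # excess over simplified-financing benchmarks
added_share = added_bn / total_bn    # ~0.80, i.e. the "80% added" figure
```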

  13. Cost-effectiveness analysis of low-molecular-weight heparin versus aspirin thromboprophylaxis in patients newly diagnosed with multiple myeloma.

    PubMed

    Chalayer, Emilie; Bourmaud, Aurélie; Tinquaut, Fabien; Chauvin, Franck; Tardy, Bernard

    2016-09-01

    The aim of this study was to assess the cost-effectiveness of low-molecular-weight heparin versus aspirin as primary thromboprophylaxis throughout chemotherapy for newly diagnosed multiple myeloma patients treated with protocols including thalidomide, from the perspective of French health care providers. We used a modeling approach combining data from the only randomized trial evaluating the efficacy of the two treatments with secondary sources for costs and utility values. We performed a decision-tree analysis, and our base case was a hypothetical cohort of 10,000 patients. A bootstrap resampling technique was used. Incremental costs and effectiveness were estimated for each strategy, and the incremental cost-effectiveness ratio (ICER) was calculated using estimated quality-adjusted life years as the efficacy outcome. One-way sensitivity analyses were performed. The number of quality-adjusted life years was estimated to be 0.300 with aspirin and 0.299 with heparin; the estimated gain with aspirin was therefore approximately one day. Over 6 months, the mean total cost was €1518 (SD=601) per patient in the heparin arm and €273 (SD=1019) in the aspirin arm. This resulted in an incremental cost of €1245 per patient treated with heparin. The ICER for the aspirin versus heparin strategy was calculated to be -687,398 € (95% CI, -13,457,369 to -225,385). Aspirin rather than heparin thromboprophylaxis, during the first six months of chemotherapy for myeloma, is associated with significant cost savings per patient and also with an unexpected slight increase in quality of life. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Feasibility study for automatic reduction of phase change imagery

    NASA Technical Reports Server (NTRS)

    Nossaman, G. O.

    1971-01-01

    The feasibility of automatically reducing a form of pictorial aerodynamic heating data is discussed. The imagery, depicting the melting history of a thin coat of fusible temperature indicator painted on an aerodynamically heated model, was previously reduced by manual methods. Careful examination of various lighting theories and approaches led to an experimentally verified illumination concept capable of yielding high-quality imagery. Both digital and video image processing techniques were applied to reduction of the data, and it was demonstrated that either method can be used to develop superimposed contours. Mathematical techniques were developed to find the model-to-image and the inverse image-to-model transformation using six conjugate points, and methods were developed using these transformations to determine heating rates on the model surface. A video system was designed which is able to reduce the imagery rapidly, economically and accurately. Costs for this system were estimated. A study plan was outlined whereby the mathematical transformation techniques developed to produce model coordinate heating data could be applied to operational software, and methods were discussed and costs estimated for obtaining the digital information necessary for this software.

  15. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    NASA Astrophysics Data System (ADS)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and thus enables their application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, along with the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character, respectively. Eventually, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.
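The Green-Kubo formalism referred to above obtains κ from equilibrium heat-flux fluctuations. In its standard textbook form (the general relation, not the authors' accelerated variant), with V the cell volume, T the temperature, k_B the Boltzmann constant, J the heat flux density, and ⟨·⟩ an equilibrium ensemble average:

```latex
\kappa \;=\; \frac{V}{3\,k_{\mathrm{B}}\,T^{2}}
\int_{0}^{\infty} \bigl\langle \mathbf{J}(0)\cdot\mathbf{J}(t) \bigr\rangle \, dt
```

The cost problem the abstract addresses is that this autocorrelation integral converges slowly in both simulation time and cell size, which is what the effective-harmonic extrapolation is designed to accelerate.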

  16. Economic burden of torture for a refugee host country: development of a model and presentation of a country case study.

    PubMed

    Mpinga, Emmanuel Kabengele; Frey, Conrad; Chastonay, Philippe

    2014-01-01

    Torture is an important social and political problem worldwide that affects millions of people. Many host countries grant victims of torture refugee status and provide for their basic needs: health care, professional reinsertion, and education. Little is known about the costs of torture, yet this knowledge could serve as an additional argument in prevention and social mobilization against torture, and provide a powerful basis for advocacy on behalf of rehabilitation programs and judiciary claims. The objective was the development of a model for estimating the economic costs of torture and the application of the model to a specific country. The estimation of the probable prevalence of victims of torture was based on a review of the literature. The identification of the socioeconomic factors to be considered was done by analogy with various health problems. The loss of productivity and the economic burden of disease related to torture were estimated through the human capital approach and the component technique. The model was applied to the estimated number of torture victims Switzerland is confronted with. In the case study, direct costs such as housing, food, and clothing represent roughly 130 million Swiss francs (CHF) per year, whereas health care costs amount to 16 million CHF per year and the costs related to the education of young people to 34 million CHF per year. Indirect costs, namely those related to the loss of productivity of direct survivors of torture, were estimated at one-third of a billion CHF per year; this rises to 10,073,419,200 CHF in lost productivity if one considers 30 years of loss per survivor. Our study shows that a rough estimation of the costs related to torture is possible given some prerequisites, such as access to social and economic indicators at the country level.

  17. How economics can further the success of ecological restoration.

    PubMed

    Iftekhar, Md Sayed; Polyakov, Maksym; Ansell, Dean; Gibson, Fiona; Kay, Geoffrey M

    2017-04-01

    Restoration scientists and practitioners have recently begun to include economic and social aspects in the design and investment decisions for restoration projects. With few exceptions, ecological restoration studies that include economics focus solely on evaluating the costs of restoration projects. However, economic principles, tools, and instruments can be applied to a range of other factors that affect project success. We considered the relevance of applying economics to address 4 key challenges of ecological restoration: assessing social and economic benefits, estimating overall costs, project prioritization and selection, and long-term financing of restoration programs. We found it is uncommon to consider all types of benefits (such as nonmarket values) and costs (such as transaction costs) in restoration programs. The total benefit of a restoration project can be estimated using market prices and various nonmarket valuation techniques. The total cost of a project can be estimated using methods based on property or land-sale prices, such as the hedonic pricing method, or through organizational surveys. Securing continuous (or long-term) funding is also vital to accomplishing restoration goals and can be achieved by establishing synergy with existing programs, public-private partnerships, and financing through taxation. © 2016 Society for Conservation Biology.

  18. The economic burden of skin disease in the United States.

    PubMed

    Dehkharghani, Seena; Bible, Jason; Chen, John G; Feldman, Steven R; Fleischer, Alan B

    2003-04-01

    Skin diseases and their complications are a significant burden on the nation, both in terms of acute and chronic morbidities and their related expenditures for care. Because accurately calculating the cost of skin disease has proven difficult in the past, we present here multiple comparative techniques allowing a more expanded approach to estimating the overall economic burden. Our aims were to (1) determine the economic burden of primary diseases falling within the realm of skin disease, as defined by modern clinical disease classification schemes and (2) identify the specific contribution of each component of costs to the overall expense. Costs were taken as the sum of several factors, divided into direct and indirect health care costs. The direct costs included inpatient hospital costs, ambulatory visit costs (further divided into physician's office visits, outpatient department visits, and emergency department visits), prescription drug costs, and self-care/over-the-counter drug costs. Indirect costs were calculated as the outlay of days of work lost because of skin diseases. The economic burden of skin disease in the United States is large, estimated at approximately $35.9 billion for 1997, including $19.8 billion (54%) in ambulatory care costs; $7.2 billion (20.2%) in hospital inpatient charges; $3.0 billion (8.2%) in prescription drug costs; $4.3 billion (11.7%) in over-the-counter preparations; and $1.6 billion (6.0%) in indirect costs attributable to lost workdays. Our determination of the economic burden of skin care in the United States surpasses past estimates several-fold, and the model presented for calculating cost of illness allows for tracking changes in national expenses for skin care in future studies. 
The estimated resources devoted to skin disease management far exceed those required to treat conditions such as urinary incontinence ($16 billion) and hypertension ($23 billion), but fall far short of those required to treat musculoskeletal conditions ($193 billion).

  19. Adaptive neuro fuzzy inference system-based power estimation method for CMOS VLSI circuits

    NASA Astrophysics Data System (ADS)

    Vellingiri, Govindaraj; Jayabalan, Ramesh

    2018-03-01

    Recent advancements in very large scale integration (VLSI) technologies have made it feasible to integrate millions of transistors on a single chip. This greatly increases the circuit complexity, and hence there is a growing need for less-tedious and low-cost power estimation techniques. The proposed work employs a Back-Propagation Neural Network (BPNN) and an Adaptive Neuro Fuzzy Inference System (ANFIS), which are capable of estimating the power precisely for complementary metal oxide semiconductor (CMOS) VLSI circuits without requiring any knowledge of circuit structure and interconnections. The application of ANFIS to power estimation is relatively new. Power estimation using ANFIS is carried out by creating initial FIS models using hybrid optimisation and back-propagation (BP) techniques employing constant and linear methods. It is inferred that ANFIS with the hybrid optimisation technique employing the linear method produces better results in terms of testing error, which varies from 0% to 0.86%, when compared to BPNN, as it takes the initial fuzzy model and tunes it by means of a hybrid technique combining gradient descent BP and mean least-squares optimisation algorithms. ANFIS is best suited to the power estimation application, with a low RMSE of 0.0002075 and a high coefficient of determination (R) of 0.99961.
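The paper scores its estimators by RMSE and the correlation-based coefficient R. A self-contained sketch of those two figures of merit (the measured/estimated power values below are hypothetical sample data, not from the paper):

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error between measured and estimated power."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def corr(y_true, y_pred):
    """Pearson correlation coefficient R between measured and estimated power."""
    n = len(y_true)
    mt, mp = sum(y_true) / n, sum(y_pred) / n
    cov = sum((t - mt) * (p - mp) for t, p in zip(y_true, y_pred))
    st = math.sqrt(sum((t - mt) ** 2 for t in y_true))
    sp = math.sqrt(sum((p - mp) ** 2 for p in y_pred))
    return cov / (st * sp)

# Hypothetical measured vs. estimated power values (mW)
measured = [1.00, 1.50, 2.00, 2.50]
estimated = [1.02, 1.47, 2.05, 2.49]
print(rmse(measured, estimated), corr(measured, estimated))
```

A low RMSE together with R near 1, as the abstract reports for ANFIS, indicates both small absolute errors and a tight linear fit between estimates and measurements.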

  20. The MERMAID project

    NASA Technical Reports Server (NTRS)

    Cowderoy, A. J. C.; Jenkins, John O.; Poulymenakou, A.

    1992-01-01

    The tendency for software development projects to be completed over schedule and over budget has been documented extensively. Additionally, many projects are completed within budgetary and schedule targets only as a result of the customer agreeing to accept reduced functionality. In his classic book, The Mythical Man-Month, Fred Brooks exposes the fallacy that effort and schedule are freely interchangeable. All current cost models are produced on the assumption that there is very limited scope for schedule compression unless there is a corresponding reduction in delivered functionality. The Metrication and Resources Modeling Aid (MERMAID) project, partially financed by the Commission of the European Communities (CEC) as Project 2046, began in Oct. 1988, and its goals were as follows: (1) to improve understanding of the relationships between software development productivity and product and process metrics; (2) to facilitate widespread technology transfer from the Consortium to the European software industry; and (3) to facilitate the widespread uptake of cost estimation techniques by providing prototype cost estimation tools. MERMAID developed a family of methods for cost estimation, many of which have had tools implemented in prototypes. These prototypes are best considered as toolkits or workbenches.

  1. Evaluating uncertainties in multi-layer soil moisture estimation with support vector machines and ensemble Kalman filtering

    NASA Astrophysics Data System (ADS)

    Liu, Di; Mishra, Ashok K.; Yu, Zhongbo

    2016-07-01

    This paper examines the combination of support vector machines (SVM) and the dual ensemble Kalman filter (EnKF) technique to estimate root zone soil moisture at different soil layers up to 100 cm depth. Multiple experiments are conducted in a data-rich environment to construct and validate the SVM model and to explore the effectiveness and robustness of the EnKF technique. It was observed that the performance of SVM relies more on the initial length of the training set than on other factors (e.g., cost function, regularization parameter, and kernel parameters). The dual EnKF technique proved efficient in improving the SVM estimates with observed data, either at each time step or at flexible time steps. The EnKF technique reaches its maximum efficiency when the updating ensemble size approaches a certain threshold. It was observed that the SVM model performance for multi-layer soil moisture estimation can be influenced by the rainfall magnitude (e.g., dry and wet spells).
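The EnKF update at the heart of this scheme can be sketched as a single stochastic analysis step. This is a generic textbook EnKF with perturbed observations, not the paper's exact dual formulation, and the observation operator and state layout below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, obs, obs_var, H):
    """One stochastic EnKF analysis step with perturbed observations.

    ensemble : (n_members, n_state) array of forecast soil moisture states
    obs      : scalar observation of the linear functional H @ state
    obs_var  : observation error variance
    H        : (n_state,) observation operator
    """
    n = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)          # ensemble anomalies
    P = X.T @ X / (n - 1)                         # sample state covariance
    S = H @ P @ H + obs_var                       # innovation variance (scalar)
    K = (P @ H) / S                               # Kalman gain, shape (n_state,)
    perturbed_obs = obs + rng.normal(0.0, np.sqrt(obs_var), size=n)
    innovations = perturbed_obs - ensemble @ H    # one innovation per member
    return ensemble + np.outer(innovations, K)    # updated (analysis) ensemble
```

In a dual setup like the paper's, a step of this kind would correct the SVM-predicted moisture states whenever an observation arrives, which is why the update frequency and ensemble size matter to overall efficiency.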

  2. The trade-off between hospital cost and quality of care. An exploratory empirical analysis.

    PubMed

    Morey, R C; Fine, D J; Loree, S W; Retzlaff-Roberts, D L; Tsubakitani, S

    1992-08-01

    The debate concerning quality of care in hospitals, its "value" and affordability, is increasingly of concern to providers, consumers, and purchasers in the United States and elsewhere. We undertook an exploratory study to estimate the impact on hospital-wide costs if quality-of-care levels were varied. To do so, we obtained costs and service output data regarding 300 U.S. hospitals, representing approximately a 5% cross section of all hospitals operating in 1983; both inpatient and outpatient services were included. The quality-of-care measure used for the exploratory analysis was the ratio of actual deaths in the hospital for the year in question to the forecasted number of deaths for the hospital; the hospital mortality forecaster had earlier (and elsewhere) been built from analyses of 6 million discharge abstracts, and took into account each hospital's actual individual admissions, including key patient descriptors for each admission. Such adjusted death rates have increasingly been used as potential indicators of quality, with recent research lending support for the viability of that linkage. The authors then utilized the economic construct of allocative efficiency relying on "best practices" concepts and peer groupings, built using the "envelopment" philosophy of Data Envelopment Analysis and Pareto efficiency. These analytical techniques estimated the efficiently delivered costs required to meet prespecified levels of quality of care. The marginal additional cost per each death deferred in 1983 was estimated to be approximately $29,000 (in 1990 dollars) for the average efficient hospital. Also, over a feasible range, a 1% increase in the level of quality of care delivered was estimated to increase hospital cost by an average of 1.34%. This estimated elasticity of quality on cost also increased with the number of beds in the hospital.

  3. Asymptotic Analysis of the Total Least Squares ESPRIT Algorithm

    NASA Astrophysics Data System (ADS)

    Ottersten, B. E.; Viberg, M.; Kailath, T.

    1989-11-01

    This paper considers the problem of estimating the parameters of multiple narrowband signals arriving at an array of sensors. Modern approaches to this problem often involve costly procedures for calculating the estimates. The ESPRIT (Estimation of Signal Parameters via Rotational Invariance Techniques) algorithm was recently proposed as a means for obtaining accurate estimates without requiring a costly search of the parameter space. This method utilizes an array invariance to arrive at a computationally efficient multidimensional estimation procedure. Herein, the asymptotic distribution of the estimation error is derived for the Total Least Squares (TLS) version of ESPRIT. The Cramer-Rao Bound (CRB) for the ESPRIT problem formulation is also derived and found to coincide with the variance of the asymptotic distribution through numerical examples. The method is also compared to least squares ESPRIT and MUSIC as well as to the CRB for a calibrated array. Simulations indicate that the theoretic expressions can be used to accurately predict the performance of the algorithm.
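The TLS-ESPRIT estimator analyzed here admits a compact implementation. A sketch for a half-wavelength uniform linear array, following the standard formulation (signal subspace from the sample covariance, shift-invariant subarrays, TLS solution from the SVD block partition); the array size, angles, and noise level in any usage are assumptions, not the paper's simulation setup:

```python
import numpy as np

def tls_esprit(X, n_sources):
    """TLS-ESPRIT angle estimation for a half-wavelength uniform linear array.

    X : (n_sensors, n_snapshots) complex snapshot matrix
    Returns the estimated arrival angles in radians.
    """
    M, N = X.shape
    R = X @ X.conj().T / N                        # sample covariance matrix
    _, eigvecs = np.linalg.eigh(R)                # eigenvalues in ascending order
    Es = eigvecs[:, -n_sources:]                  # signal subspace
    E1, E2 = Es[:-1, :], Es[1:, :]                # overlapping subarrays (shift invariance)
    # Total least squares: SVD of the stacked subspaces; the block partition
    # of the right singular matrix gives psi = -V12 @ inv(V22).
    _, _, Vh = np.linalg.svd(np.hstack([E1, E2]))
    V = Vh.conj().T
    V12 = V[:n_sources, n_sources:]
    V22 = V[n_sources:, n_sources:]
    psi = -V12 @ np.linalg.inv(V22)
    phases = np.angle(np.linalg.eigvals(psi))     # approx exp(j*pi*sin(theta_k))
    return np.arcsin(phases / np.pi)
```

The eigenvalues of `psi` approximate the rotation factors exp(j·pi·sin(theta_k)), so no search over the parameter space is needed, which is the computational advantage the abstract highlights.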

  4. Fast Conceptual Cost Estimating of Aerospace Projects Using Historical Information

    NASA Technical Reports Server (NTRS)

    Butts, Glenn

    2007-01-01

    Accurate estimates can be created in less than a minute by applying powerful techniques and algorithms to create an Excel-based parametric cost model. In five easy steps you will learn how to normalize your company's historical cost data to the new project parameters. This paper provides a complete, easy-to-understand, step-by-step how-to guide. Such a guide does not seem to currently exist. Over 2,000 hours of research, data collection, and trial and error, and thousands of lines of Excel Visual Basic for Applications (VBA) code were invested in developing these methods. While VBA is not required to use this information, it increases the power and aesthetics of the model. Implementing all of the steps described, while not required, will increase the accuracy of the results.
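One common way to normalize historical cost data to new project parameters is a weight-based cost estimating relationship (CER) fitted in log-log space. A minimal sketch under assumed data; the function name and the sample masses/costs are illustrative, not from the paper, which builds its model in Excel/VBA:

```python
import math

def fit_cer(weights, costs):
    """Fit the cost-estimating relationship cost = a * weight**b
    by ordinary least squares in log-log space."""
    lx = [math.log(w) for w in weights]
    ly = [math.log(c) for c in costs]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / \
        sum((x - mx) ** 2 for x in lx)
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical historical projects: dry mass (kg) vs. cost ($M),
# already normalized to a common year's dollars.
masses = [150.0, 400.0, 900.0, 2200.0]
hist_costs = [2.1, 4.6, 8.3, 16.9]
a, b = fit_cer(masses, hist_costs)
new_project_cost = a * 1200.0 ** b  # scale to a new project's mass
```

The fitted exponent b captures the economy of scale in the historical data; applying the curve at the new project's parameter value is the "normalization" step the paper automates.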

  5. Dynamic Programming and Error Estimates for Stochastic Control Problems with Maximum Cost

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bokanowski, Olivier, E-mail: boka@math.jussieu.fr; Picarelli, Athena, E-mail: athena.picarelli@inria.fr; Zidani, Hasnaa, E-mail: hasnaa.zidani@ensta.fr

    2015-02-15

    This work is concerned with stochastic optimal control for a running maximum cost. A direct approach based on dynamic programming techniques is studied, leading to the characterization of the value function as the unique viscosity solution of a second order Hamilton–Jacobi–Bellman (HJB) equation with an oblique derivative boundary condition. A general numerical scheme is proposed and a convergence result is provided. Error estimates are obtained for the semi-Lagrangian scheme. These results can apply to the case of lookback options in finance. Moreover, optimal control problems with maximum cost arise in the characterization of the reachable sets for a system of controlled stochastic differential equations. Some numerical simulations on examples of reachability analysis are included to illustrate our approach.

  6. Earth Sciences Data and Information System (ESDIS) program planning and evaluation methodology development

    NASA Technical Reports Server (NTRS)

    Dickinson, William B.

    1995-01-01

    An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements developments, management and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and use of risk information in management decision-making are addressed.

  7. Design techniques for modular integrated utility systems. [energy production and conversion efficiency

    NASA Technical Reports Server (NTRS)

    Wolfer, B. M.

    1977-01-01

    Features basic to the integrated utility system, such as solid waste incineration, heat recovery and usage, and water recycling/treatment, are compared in terms of cost, fuel conservation, and efficiency to conventional utility systems in the same mean-climatic area of Washington, D. C. The larger of the two apartment complexes selected for the test showed the more favorable results in the three areas of comparison. Restrictions concerning the sole use of currently available technology are hypothetically removed to consider the introduction and possible advantages of certain advanced techniques in an integrated utility system; recommendations are made and costs are estimated for each type of system.

  8. Cost-Benefit Analysis for the Advanced Near Net Shape Technology (ANNST) Method for Fabricating Stiffened Cylinders

    NASA Technical Reports Server (NTRS)

    Ivanco, Marie L.; Domack, Marcia S.; Stoner, Mary Cecilia; Hehir, Austin R.

    2016-01-01

    Low Technology Readiness Levels (TRLs) and high levels of uncertainty make it challenging to develop cost estimates of new technologies in the R&D phase. It is however essential for NASA to understand the costs and benefits associated with novel concepts, in order to prioritize research investments and evaluate the potential for technology transfer and commercialization. This paper proposes a framework to perform a cost-benefit analysis of a technology in the R&D phase. This framework was developed and used to assess the Advanced Near Net Shape Technology (ANNST) manufacturing process for fabricating integrally stiffened cylinders. The ANNST method was compared with the conventional multi-piece metallic construction and composite processes for fabricating integrally stiffened cylinders. Following the definition of a case study for a cryogenic tank cylinder of specified geometry, data was gathered through interviews with Subject Matter Experts (SMEs), with particular focus placed on production costs and process complexity. This data served as the basis to produce process flowcharts and timelines, mass estimates, and rough order-of-magnitude cost and schedule estimates. The scalability of the results was subsequently investigated to understand the variability of the results based on tank size. Lastly, once costs and benefits were identified, the Analytic Hierarchy Process (AHP) was used to assess the relative value of these achieved benefits for potential stakeholders. These preliminary, rough order-of-magnitude results predict a 46 to 58 percent reduction in production costs and a 7-percent reduction in weight over the conventional metallic manufacturing technique used in this study for comparison. Compared to the composite manufacturing technique, these results predict cost savings of 35 to 58 percent; however, the ANNST concept was heavier. 
    In this study, the equipment required for the ANNST method was predicted to reach return on investment after ten cryogenic tank barrels, when compared with conventional metallic manufacturing. The AHP study results revealed that decreased final cylinder mass and improved quality assurance were the most valued benefits of cylinder manufacturing methods, therefore emphasizing the relevance of the benefits achieved with the ANNST process for future projects.
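The AHP step used to weigh the benefits reduces to computing a priority vector from a pairwise-comparison matrix. A minimal sketch of the standard principal-eigenvector computation; the comparison matrix below is hypothetical, not the study's elicited stakeholder judgments:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix:
    the principal right eigenvector, normalized to sum to one."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    principal = vecs[:, np.argmax(vals.real)]     # Perron eigenvector
    w = np.abs(principal.real)
    return w / w.sum()

# Hypothetical judgments over three benefits: cost reduction,
# mass reduction, quality assurance (entry [i][j] = how much
# criterion i is preferred over criterion j).
matrix = [[1.0, 0.5, 0.25],
          [2.0, 1.0, 0.5],
          [4.0, 2.0, 1.0]]
print(ahp_weights(matrix))  # weights proportional to 1:2:4
```

For a perfectly consistent matrix like this one, the weights are exact ratios; with real elicited judgments, the deviation of the principal eigenvalue from the matrix size gives AHP's consistency check.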

  9. Component costs of foodborne illness: a scoping review

    PubMed Central

    2014-01-01

    Background Governments require high-quality scientific evidence to prioritize resource allocation and the cost-of-illness (COI) methodology is one technique used to estimate the economic burden of a disease. However, variable cost inventories make it difficult to interpret and compare costs across multiple studies. Methods A scoping review was conducted to identify the component costs and the respective data sources used for estimating the cost of foodborne illnesses in a population. This review was accomplished by: (1) identifying the research question and relevant literature, (2) selecting the literature, and (3) charting, collating, and summarizing the results. All pertinent data were extracted at the level of detail reported in a study, and the component cost and source data were subsequently grouped into themes. Results Eighty-four studies were identified that described the cost of foodborne illness in humans. Most studies (80%) were published in the last two decades (1992–2012) in North America and Europe. The 10 most frequently estimated costs were due to illnesses caused by bacterial foodborne pathogens, with non-typhoidal Salmonella spp. being the most commonly studied. Forty studies described both individual (direct and indirect) and societal level costs. The direct individual level component costs most often included were hospital services, physician personnel, and drug costs. The most commonly reported indirect individual level component cost was productivity losses due to sick leave from work. Prior estimates published in the literature were the most commonly used source of component cost data. Data sources were not provided or specifically linked to component costs in several studies. Conclusions The results illustrated a highly variable depth and breadth of individual and societal level component costs, and a wide range of data sources being used. 
This scoping review can be used as evidence that there is a lack of standardization in cost inventories in the cost of foodborne illness literature, and to promote greater transparency and detail of data source reporting. By conforming to a more standardized cost inventory, and by reporting data sources in more detail, there will be an increase in cost of foodborne illness research that can be interpreted and compared in a meaningful way. PMID:24885154

  10. Calculation of the Average Cost per Case of Dengue Fever in Mexico Using a Micro-Costing Approach

    PubMed Central

    2016-01-01

    Introduction The increasing burden of dengue fever (DF) in the Americas, and the current epidemic in previously unaffected countries, generate major costs for national healthcare systems. There is a need to quantify the average cost per DF case. In Mexico, few data are available on costs, despite DF being endemic in some areas. Extrapolations from studies in other countries may prove unreliable and are complicated by the two main Mexican healthcare systems (the Secretariat of Health [SS] and the Mexican Social Security Institute [IMSS]). The present study aimed to generate specific average DF cost-per-case data for Mexico using a micro-costing approach. Methods Expected medical costs associated with an ideal management protocol for DF (denoted 'ideal costs') were compared with the medical costs of current treatment practice (denoted 'real costs') in 2012. Real cost data were derived from chart review of DF cases and interviews with patients and key personnel from 64 selected hospitals and ambulatory care units in 16 states for IMSS and SS. In both institutions, ideal and real costs were estimated using the program, actions, activities, tasks, inputs (PAATI) approach, a micro-costing technique developed by us. Results Clinical pathways were obtained for 1,168 patients following review of 1,293 charts. Ideal and real costs for SS patients were US$165.72 and US$32.60, respectively, in the outpatient setting, and US$587.77 and US$490.93, respectively, in the hospital setting. For IMSS patients, ideal and real costs were US$337.50 and US$92.03, respectively, in the outpatient setting, and US$2,042.54 and US$1,644.69, respectively, in the hospital setting. Conclusions The markedly higher ideal versus real costs may indicate deficiencies in the actual care of patients with DF. It may be necessary to derive better estimates with micro-costing techniques and compare the ideal protocol with current practice when calculating these costs, as patients do not always receive optimal care. 
PMID:27501146

  11. Cost savings associated with prevention of recurrent lumbar disc herniation with a novel annular closure device: a multicenter prospective cohort study.

    PubMed

    Parker, Scott L; Grahovac, Gordan; Vukas, Duje; Ledic, Darko; Vilendecic, Milorad; McGirt, Matthew J

    2013-09-01

    Same-level recurrent disc herniation is a well-defined complication following lumbar discectomy. Reherniation results in increased morbidity and health care costs. Techniques to reduce these consequences may improve outcomes and reduce cost after lumbar discectomy. In a prospective cohort study, we set out to evaluate the cost associated with surgical management of recurrent, same-level lumbar disc herniation following primary discectomy. Forty-six consecutive European patients undergoing lumbar discectomy for a single-level herniated disc at two institutions were prospectively followed with clinical and radiographic evaluations. A second consecutive cohort of 30 patients undergoing 31 lumbar discectomies with implantation of an annular closure device was followed at the same hospitals and same follow-up intervals. Cost estimates for reherniation were modeled on Medicare national allowable payment amounts (direct cost) and patient work-day losses (indirect cost). Annular closure and control cohorts were matched at baseline. By 2 years follow-up, symptomatic recurrent same-level disc herniation occurred in three (6.5%) patients in the control cohort versus zero (0%) patients in the annular closure cohort. For patients experiencing recurrent disc herniation, mean estimated direct and indirect cost of management of recurrent disc herniation was $34,242 and $3,778, respectively. Use of an annular closure device potentially results in a cost savings of $222,573 per 100 primary discectomy procedures performed (or $2,226 per discectomy), based solely on the reduction of reoperated reherniations when modeled on U.S. Medicare costs. Recurrent disc herniation did not occur in any patients after annular closure within the 12-month follow-up. The reduction in the incidence of reherniation was associated with potentially significant cost savings. 
Development of novel techniques to prevent recurrent lumbar disc herniation is warranted to decrease the associated morbidity and health care costs associated with this complication. Georg Thieme Verlag KG Stuttgart · New York.

  12. A cost-effectiveness analysis of propofol versus midazolam for procedural sedation in the emergency department.

    PubMed

    Hohl, Corinne Michèle; Nosyk, Bohdan; Sadatsafavi, Mohsen; Anis, Aslam Hayat

    2008-01-01

    To determine the incremental cost-effectiveness of using propofol versus midazolam for procedural sedation (PS) in adults in the emergency department (ED). The authors conducted a cost-effectiveness analysis from the perspective of the health care provider. The primary outcome was the incremental cost (or savings) to achieve one additional successful sedation with propofol compared to midazolam. A decision model was developed in which the clinical effectiveness and cost of a PS strategy using either agent was estimated. The authors derived estimates of clinical effectiveness and risk of adverse events (AEs) from a systematic review. The cost of each clinical outcome was determined by incorporating the baseline cost of the ED visit, the cost of the drug, the cost of labor of physicians and nurses, the cost and probability of an AE, and the cost and probability of a PS failure. A standard meta-analytic technique was used to calculate the weighted mean difference in recovery times and obtain mean drug doses from patient-level data from a randomized controlled trial. Probabilistic sensitivity analyses were conducted to examine the uncertainty around the estimated incremental cost-effectiveness ratio using Monte Carlo simulation. Choosing a sedation strategy with propofol resulted in average savings of $17.33 (95% confidence interval [CI] = $24.13 to $10.44) per sedation performed. This resulted in an incremental cost-effectiveness ratio of -$597.03 (95% credibility interval -$6,434.03 to $6,113.57) indicating savings of $597.03 per additional successful sedation performed with propofol. This result was driven by shorter recovery times and was robust to all sensitivity analyses performed. These results indicate that using propofol for PS in the ED is a cost-saving strategy.
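The ICER computation and the Monte Carlo probabilistic sensitivity analysis described here can be sketched compactly. The mean saving of $17.33 per sedation is the study's figure; the effectiveness gain of roughly 0.029 additional successful sedations per patient is back-calculated from the reported ratio (17.33 / 597.03), and both spread parameters are invented for illustration:

```python
import random

def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per extra success."""
    return delta_cost / delta_effect

def probabilistic_sa(n_draws=10_000, seed=42):
    """Monte Carlo probabilistic sensitivity analysis for the ICER.

    Input distributions are illustrative assumptions, not the study's
    fitted distributions; each draw resamples the incremental cost and
    incremental effectiveness to propagate uncertainty into the ICER.
    """
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        d_cost = rng.gauss(-17.33, 3.5)    # incremental cost (negative = savings)
        d_effect = rng.gauss(0.029, 0.01)  # incremental successful sedations
        if abs(d_effect) > 1e-6:           # avoid division blow-ups near zero
            draws.append(icer(d_cost, d_effect))
    draws.sort()
    return draws[len(draws) // 2]          # median ICER across draws
```

A negative median ICER with savings in the numerator, as here, corresponds to the study's conclusion that propofol is a dominant (cheaper, at least as effective) strategy.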

  13. Machine learning and microsimulation techniques on the prognosis of dementia: A systematic literature review.

    PubMed

    Dallora, Ana Luiza; Eivazzadeh, Shahryar; Mendes, Emilia; Berglund, Johan; Anderberg, Peter

    2017-01-01

    Dementia is a complex disorder characterized by poor outcomes for the patients and high costs of care. After decades of research, little is known about its mechanisms. Having prognostic estimates about dementia can help researchers, patients and public entities in dealing with this disorder. Thus, health data, machine learning and microsimulation techniques could be employed in developing prognostic estimates for dementia. The goal of this paper is to present evidence on the state of the art of studies investigating the prognosis of dementia using machine learning and microsimulation techniques. To achieve our goal we carried out a systematic literature review, in which three large databases (PubMed, Scopus and Web of Science) were searched to select studies that employed machine learning or microsimulation techniques for the prognosis of dementia. A single backward snowballing step was done to identify further studies. A quality checklist was also employed to assess the quality of the evidence presented by the selected studies, and low quality studies were removed. Finally, data from the final set of studies were extracted into summary tables. In total 37 papers were included. The data summary results showed that the current research is focused on the investigation of patients with mild cognitive impairment that will evolve to Alzheimer's disease, using machine learning techniques. Microsimulation studies were concerned with cost estimation and had a populational focus. Neuroimaging was the most commonly used variable. Prediction of conversion from MCI to AD is the dominant theme in the selected studies. Most studies used ML techniques on neuroimaging data. Only a few data sources have been recruited by most studies, and the ADNI database is the one most commonly used. Only two studies have investigated the prediction of epidemiological aspects of dementia using either ML or MS techniques. 
Finally, care should be taken when interpreting the reported accuracy of ML techniques, given studies' different contexts.

  14. Machine learning and microsimulation techniques on the prognosis of dementia: A systematic literature review

    PubMed Central

    Mendes, Emilia; Berglund, Johan; Anderberg, Peter

    2017-01-01

    Background Dementia is a complex disorder characterized by poor outcomes for the patients and high costs of care. After decades of research, little is known about its mechanisms. Having prognostic estimates about dementia can help researchers, patients and public entities in dealing with this disorder. Thus, health data, machine learning and microsimulation techniques could be employed in developing prognostic estimates for dementia. Objective The goal of this paper is to present evidence on the state of the art of studies investigating the prognosis of dementia using machine learning and microsimulation techniques. Method To achieve our goal we carried out a systematic literature review, in which three large databases (PubMed, Scopus and Web of Science) were searched to select studies that employed machine learning or microsimulation techniques for the prognosis of dementia. A single backward snowballing step was done to identify further studies. A quality checklist was also employed to assess the quality of the evidence presented by the selected studies, and low quality studies were removed. Finally, data from the final set of studies were extracted into summary tables. Results In total 37 papers were included. The data summary results showed that the current research is focused on the investigation of patients with mild cognitive impairment that will evolve to Alzheimer's disease, using machine learning techniques. Microsimulation studies were concerned with cost estimation and had a populational focus. Neuroimaging was the most commonly used variable. Conclusions Prediction of conversion from MCI to AD is the dominant theme in the selected studies. Most studies used ML techniques on neuroimaging data. Only a few data sources have been recruited by most studies, and the ADNI database is the one most commonly used. Only two studies have investigated the prediction of epidemiological aspects of dementia using either ML or MS techniques. 
Finally, care should be taken when interpreting the reported accuracy of ML techniques, given studies’ different contexts. PMID:28662070

  15. Launch vehicle operations cost reduction through artificial intelligence techniques

    NASA Technical Reports Server (NTRS)

    Davis, Tom C., Jr.

    1988-01-01

    NASA's Kennedy Space Center has attempted to develop AI methods in order to reduce the cost of launch vehicle ground operations as well as to improve the reliability and safety of such operations. Attention is presently given to cost savings estimates for systems involving launch vehicle firing-room software and hardware real-time diagnostics, as well as the nature of configuration control and the real-time autonomous diagnostics of launch-processing systems by these means. Intelligent launch decisions and intelligent weather forecasting are additional applications of AI being considered.

  16. Valuing morbidity from wildfire smoke exposure: a comparison of revealed and stated preference techniques

    USGS Publications Warehouse

    Richardson, Leslie; Loomis, John B.; Champ, Patricia A.

    2013-01-01

    Estimating the economic benefits of reduced health damages due to improvements in environmental quality continues to challenge economists. We review welfare measures associated with reduced wildfire smoke exposure, and a unique dataset from California’s Station Fire of 2009 allows for a comparison of cost of illness (COI) estimates with willingness to pay (WTP) measures. The WTP for one less symptom day is estimated to be $87 and $95, using the defensive behavior and contingent valuation methods, respectively. These WTP estimates are not statistically different but do differ from a $3 traditional daily COI estimate and $17 comprehensive daily COI estimate.

  17. The contribution of Raman spectroscopy to the analytical quality control of cytotoxic drugs in a hospital environment: eliminating the exposure risks for staff members and their work environment.

    PubMed

    Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette

    2014-08-15

    The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effect of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three analytical techniques. Statistical inter-method correlation analysis was performed using non-parametric correlation rank tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. For all three models, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques vs. HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior on technical, economic and environmental grounds, compared with the invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Cost-effectiveness of aortic valve replacement in the elderly: an introductory study.

    PubMed

    Wu, YingXing; Jin, Ruyun; Gao, Guangqiang; Grunkemeier, Gary L; Starr, Albert

    2007-03-01

    With increased life expectancy and improved technology, valve replacement is being offered to increasing numbers of elderly patients with satisfactory clinical results. By using standard econometric techniques, we estimated the relative cost-effectiveness of aortic valve replacement by drawing on a large prospective database at our institution. By using aortic valve replacement as an example, this introductory report paves the way to more definitive studies of these issues in the future. From 1961 to 2003, 4617 adult patients underwent aortic valve replacement at our institution. These patients were provided with a prospective lifetime follow-up. As of 2005, these patients had accumulated 31,671 patient-years of follow-up (maximum 41 years) and had returned 22,396 yearly questionnaires. A statistical model was used to estimate the future life years of patients who are currently alive. In the absence of direct estimates of utility, quality-adjusted life years were estimated from New York Heart Association class. The cost-effectiveness ratio was calculated by the patient's age at surgery. The overall cost-effectiveness ratio was approximately 13,528 dollars per quality-adjusted life year gained. The cost-effectiveness ratio increased according to age at surgery, up to 19,826 dollars per quality-adjusted life year for octogenarians and 27,182 dollars per quality-adjusted life year for nonagenarians. Within the limited scope of this introductory study, aortic valve replacement appears cost-effective for all age groups and very cost-effective for all but the most elderly, according to standard econometric rules of thumb.
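The cost-effectiveness ratio described above (cost per quality-adjusted life year gained) is simple arithmetic, sketched below. All numbers are hypothetical, not the study's data; the utility weight stands in for the NYHA-class-derived utilities the study uses.

```python
# Illustrative cost-effectiveness ratio (CER) calculation:
# CER = incremental cost / QALYs gained, with QALYs = life years x utility.
# All numbers are hypothetical, not taken from the study.

def cer(incremental_cost, life_years_gained, utility_weight):
    """Cost per quality-adjusted life year (QALY) gained."""
    qalys = life_years_gained * utility_weight
    return incremental_cost / qalys

# e.g. a $100,000 procedure yielding 10 extra life years at utility 0.74
# (utility approximated from functional class, as the study does from NYHA)
print(round(cer(100_000, 10, 0.74)))  # dollars per QALY gained
```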

  19. Surveying drainage culvert use by carnivores: sampling design and cost-benefit analyses of track-pads vs. video-surveillance methods.

    PubMed

    Mateus, Ana Rita A; Grilo, Clara; Santos-Reis, Margarida

    2011-10-01

    Environmental assessment studies often evaluate the effectiveness of drainage culverts as habitat linkages for species; however, the efficiency of the sampling designs and survey methods is not known. Our main goal was to determine the most cost-effective monitoring method for sampling carnivore culvert use, using track-pads and video-surveillance. We estimated the most efficient (lower costs and high detection success) interval between visits (days) when using track-pads and also determined the advantages of using each method. In 2006, we selected two highways in southern Portugal and sampled 15 culverts over two 10-day sampling periods (spring and summer). Using the track-pad method, 90% of the animal tracks were detected using a 2-day interval between visits. We recorded a higher number of crossings for most species using video-surveillance (n = 129) when compared with the track-pad technique (n = 102); however, the detection ability of the video-surveillance method varied with type of structure and species. More crossings were detected in circular culverts (1 m and 1.5 m diameter) than in box culverts (2 m to 4 m width), likely because video cameras had a reduced coverage area. On the other hand, carnivore species with small feet, such as the common genet Genetta genetta, were detected less often using the track-pad surveying method. The cost-benefit analyses show that the track-pad technique is the most appropriate technique, but video-surveillance allows year-round surveys as well as behavioral response analyses of species using crossing structures.

  20. Rapid, cost-effective and accurate quantification of Yucca schidigera Roezl. steroidal saponins using HPLC-ELSD method.

    PubMed

    Tenon, Mathieu; Feuillère, Nicolas; Roller, Marc; Birtić, Simona

    2017-04-15

    Yucca GRAS-labelled saponins have been and are increasingly used in food/feed, pharmaceutical or cosmetic industries. Existing techniques presently used for Yucca steroidal saponin quantification remain either inaccurate and misleading or accurate but time consuming and cost prohibitive. The method reported here addresses all of the above challenges. The HPLC/ELSD technique is an accurate and reliable method that yields results of appropriate repeatability and reproducibility. This method does not over- or under-estimate levels of steroidal saponins. The HPLC/ELSD method does not require each and every pure standard of saponins to quantify the group of steroidal saponins. The method is a time- and cost-effective technique that is suitable for routine industrial analyses. The HPLC/ELSD method yields saponin fingerprints specific to the plant species. As the method is capable of distinguishing saponin profiles from taxonomically distant species, it can unravel plant adulteration issues. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  1. Ecological economics of soil erosion: a review of the current state of knowledge.

    PubMed

    Adhikari, Bhim; Nadella, Karthik

    2011-02-01

    The economics of land degradation has received relatively little attention until recent years. Although a number of studies have undertaken valuation of ecosystem services ranging from the global to the micro level, and quite a few studies have attempted to quantify the costs of soil erosion, studies that address the full costs of land degradation are still scarce. In this review, we attempt to analyze different land resource modeling and valuation techniques applied in earlier research and the type of data used in these analyses, and to assess their utility for different forms of land resource and management appraisal. We also report on the strengths and weaknesses of different valuation techniques used in studies on the economics of soil erosion, and the relevance of these valuation techniques. We make a case for the need for more appropriate models that can make the analysis more robust in estimating the economic costs of land degradation while recognizing the spatial heterogeneity in biophysical and economic conditions. © 2011 New York Academy of Sciences.

  2. Evaluating noninvasive genetic sampling techniques to estimate large carnivore abundance.

    PubMed

    Mumma, Matthew A; Zieminski, Chris; Fuller, Todd K; Mahoney, Shane P; Waits, Lisette P

    2015-09-01

    Monitoring large carnivores is difficult because of intrinsically low densities and can be dangerous if physical capture is required. Noninvasive genetic sampling (NGS) is a safe and cost-effective alternative to physical capture. We evaluated the utility of two NGS methods (scat detection dogs and hair sampling) to obtain genetic samples for abundance estimation of coyotes, black bears and Canada lynx in three areas of Newfoundland, Canada. We calculated abundance estimates using program capwire, compared sampling costs and the cost per sample for each method relative to species and study site, and performed simulations to determine the sampling intensity necessary to achieve abundance estimates with coefficients of variation (CV) of <10%. Scat sampling was effective for both coyotes and bears, and hair snags effectively sampled bears in two of three study sites. Rub pads were ineffective in sampling coyotes and lynx. The precision of abundance estimates was dependent upon the number of captures/individual. Our simulations suggested that ~3.4 captures/individual will result in a <10% CV for abundance estimates when populations are small (23-39), but fewer captures/individual may be sufficient for larger populations. We found scat sampling was more cost-effective for sampling multiple species, but suggest that hair sampling may be less expensive at study sites with limited road access for bears. Given the dependence of sampling scheme on species and study site, the optimal sampling scheme is likely to be study-specific, warranting pilot studies in most circumstances. © 2015 John Wiley & Sons Ltd.

  3. Risk Assessment Techniques. A Handbook for Program Management Personnel

    DTIC Science & Technology

    1983-07-01

    tion; not directly usable without further development. 37. Lieber, R.S., "New Approaches for Quantifying Risk and Determining Sharing Arrangements...must be provided. Prediction intervals around cost estimating relationships (CERs) or Monte Carlo simulations will be used as proper in quantifying ... risk." [emphasis supplied] Para 9.d. "The ISR will address the potential risk in the program office estimate by identifying 'risk' areas and their

  4. Analysis of Gopher Tortoise Population Estimation Techniques

    DTIC Science & Technology

    2005-10-01

    land use practices on the gopher tortoise, Gopherus polyphemus.” Biological Conservation. 108: 289-298. Horngren, C.T. and G. Foster. 1991. Cost ...with flagging to ensure complete coverage. A South Carolina census was conducted with a team of 60 to 70 volunteers walking abreast (S. Bennett...best method in terms of cost and accuracy. Burke and Cox (1988) tested if the direction of tortoise tracks in the burrow was re- liable in

  5. A Cost Estimation Analysis of U.S. Navy Ship Fuel-Savings Techniques and Technologies

    DTIC Science & Technology

    2009-09-01

    readings to the boiler operator. The PLC will provide constant automatic trimming of the excess oxygen based upon real time SGA readings. An SCD...the author): The Aegis Combat System is controlled by an advanced, automatic detect-and-track, multi-function three-dimensional passive...subsequently offloaded. An Online Wash System would reduce these maintenance costs and improve fuel efficiency of these engines by keeping the engines

  6. GREAT I: A Study of the Upper Mississippi River. Volume 3. Material and Equipment Needs, Commercial Transportation.

    DTIC Science & Technology

    1980-09-01

    Needs 75 DEVELOPMENT OF DREDGING COST ESTIMATES 76 IMPLEMENTATION OF SELECTED PLAN 77 EVALUATION OF SELECTED PLAN 78 NATIONAL ECONOMIC DEVELOPMENT EFFECTS  78 ENVIRONMENTAL QUALITY EFFECTS 78 TABLE OF CONTENTS (CONT) ITEM PAGE RECOMMENDATIONS 79 THE DREDGE WILLIAM A. THOMPSON 79 MECHANICAL DREDGING...and cost effective for implementing a recommended channel maintenance plan. 3. Suggesting which types of equipment and techniques are best suited for

  7. Non-parametric methods for cost-effectiveness analysis: the central limit theorem and the bootstrap compared.

    PubMed

    Nixon, Richard M; Wonderling, David; Grieve, Richard D

    2010-03-01

    Cost-effectiveness analyses (CEA) alongside randomised controlled trials commonly estimate incremental net benefits (INB), with 95% confidence intervals, and compute cost-effectiveness acceptability curves and confidence ellipses. Two alternative non-parametric methods for estimating INB are to apply the central limit theorem (CLT) or to use the non-parametric bootstrap method, although it is unclear which method is preferable. This paper describes the statistical rationale underlying each of these methods and illustrates their application with a trial-based CEA. It compares the sampling uncertainty from using either technique in a Monte Carlo simulation. The experiments are repeated varying the sample size and the skewness of costs in the population. The results showed that, even when data were highly skewed, both methods accurately estimated the true standard errors (SEs) when sample sizes were moderate to large (n>50), and also gave good estimates for small data sets with low skewness. However, when sample sizes were relatively small and the data highly skewed, using the CLT rather than the bootstrap led to slightly more accurate SEs. We conclude that while in general using either method is appropriate, the CLT is easier to implement, and provides SEs that are at least as accurate as the bootstrap. (c) 2009 John Wiley & Sons, Ltd.
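The two standard-error approaches compared in this abstract can be illustrated with a small simulation. The skewed per-patient "net benefit" data below are synthetic (lognormal, to mimic skewed cost data), not the trial's; arm sizes and parameters are invented.

```python
import random
import statistics

random.seed(1)

# Synthetic, right-skewed per-patient net monetary benefits for two arms.
# Values are illustrative only, chosen to mimic skewed cost data.
arm_a = [random.lognormvariate(8.0, 1.0) for _ in range(120)]
arm_b = [random.lognormvariate(7.8, 1.0) for _ in range(120)]

# Incremental net benefit (INB) = difference in arm means
inb = statistics.mean(arm_a) - statistics.mean(arm_b)

# CLT-based standard error: SE = sqrt(var_a/n_a + var_b/n_b)
se_clt = (statistics.variance(arm_a) / len(arm_a)
          + statistics.variance(arm_b) / len(arm_b)) ** 0.5

# Non-parametric bootstrap: resample each arm with replacement, recompute INB
boot = []
for _ in range(2000):
    a = random.choices(arm_a, k=len(arm_a))
    b = random.choices(arm_b, k=len(arm_b))
    boot.append(statistics.mean(a) - statistics.mean(b))
se_boot = statistics.stdev(boot)

print(f"INB={inb:.0f}, SE(CLT)={se_clt:.0f}, SE(bootstrap)={se_boot:.0f}")
```

Consistent with the paper's finding, the two standard errors agree closely at moderate sample sizes even for highly skewed data.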

  8. Analyzing costs of space debris mitigation methods

    NASA Astrophysics Data System (ADS)

    Wiedemann, C.; Krag, H.; Bendisch, J.; Sdunnus, H.

    2004-01-01

    The steadily increasing number of space objects poses a considerable hazard to all kinds of spacecraft. To reduce the risks to future space missions, different debris mitigation measures and spacecraft protection techniques have been investigated during the last years. However, the economic efficiency has not been considered yet in this context. Current studies aim to evaluate mission costs due to space debris in a business-as-usual (no mitigation) scenario, compared with mission costs when debris mitigation is applied. The aim is an estimation of the time until the investment in debris mitigation will lead to an effective reduction of mission costs. This paper presents the results of investigations on the key issues of cost estimation for spacecraft and the influence of debris mitigation and shielding on cost. Mitigation strategies like the reduction of orbital lifetime and de- or re-orbit of non-operational satellites are methods to control the space debris environment. These methods result in an increase of costs. In a first step, the overall costs of different types of unmanned satellites are analyzed. A selected cost model is simplified and generalized for an application on all operational satellites. In a next step, the influence on cost of implementing debris mitigation strategies is treated.

  9. Marginal cost curves for water footprint reduction in irrigated agriculture: guiding a cost-effective reduction of crop water consumption to a permit or benchmark level

    NASA Astrophysics Data System (ADS)

    Chukalla, Abebe D.; Krol, Maarten S.; Hoekstra, Arjen Y.

    2017-07-01

    Reducing the water footprint (WF) of the process of growing irrigated crops is an indispensable element in water management, particularly in water-scarce areas. To achieve this, information on marginal cost curves (MCCs), which rank management packages according to their cost-effectiveness in reducing the WF, is needed to support decision making. MCCs enable the estimation of the cost associated with a certain WF reduction target, e.g. towards a given WF permit (expressed in m3 ha-1 per season) or a certain WF benchmark (expressed in m3 t-1 of crop). This paper aims to develop MCCs for WF reduction for a range of selected cases. AquaCrop, a soil-water-balance and crop-growth model, is used to estimate the effect of different management packages on evapotranspiration and crop yield and thus the WF of crop production. A management package is defined as a specific combination of management practices: irrigation technique (furrow, sprinkler, drip or subsurface drip); irrigation strategy (full or deficit irrigation); and mulching practice (no, organic or synthetic mulching). The annual average cost for each management package is estimated as the annualized capital cost plus the annual costs of maintenance and operations (i.e. costs of water, energy and labour). Different cases are considered, including three crops (maize, tomato and potato); four types of environment (humid in UK, sub-humid in Italy, semi-arid in Spain and arid in Israel); three hydrologic years (wet, normal and dry years) and three soil types (loam, silty clay loam and sandy loam). For each crop, alternative WF reduction pathways were developed, after which the most cost-effective pathway was selected to develop the MCC for WF reduction. When aiming at WF reduction one can best improve the irrigation strategy first, next the mulching practice and finally the irrigation technique. 
Moving from a full to deficit irrigation strategy is found to be a no-regret measure: it reduces the WF by reducing water consumption at negligible yield reduction while reducing the cost for irrigation water and the associated costs for energy and labour. Next, moving from no to organic mulching has a high cost-effectiveness, reducing the WF significantly at low cost. Finally, changing from sprinkler or furrow to drip or subsurface drip irrigation reduces the WF, but at a significant cost.
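The ranking logic behind a marginal cost curve can be sketched as follows: sort measures by cost per unit of WF reduction, then accumulate reduction and cost along the cheapest-first pathway. The package names mirror the abstract, but the reduction and cost figures are invented for illustration and do not come from the paper.

```python
# Hypothetical management packages: (name, WF reduction in m3/ha per season,
# extra annual cost in $/ha). All numbers are illustrative only.
packages = [
    ("drip irrigation", 400, 300),
    ("deficit irrigation", 800, 10),
    ("organic mulching", 500, 60),
]

# A marginal cost curve ranks measures by cost per unit of WF reduction,
# then accumulates reduction and cost along the cheapest-first pathway.
ranked = sorted(packages, key=lambda p: p[2] / p[1])

curve, cum_reduction, cum_cost = [], 0.0, 0.0
for name, reduction, cost in ranked:
    cum_reduction += reduction
    cum_cost += cost
    curve.append((name, cost / reduction, cum_reduction, cum_cost))

for name, marginal_cost, total_red, total_cost in curve:
    print(f"{name}: ${marginal_cost:.3f}/m3, cumulative {total_red:.0f} m3/ha")
```

With these illustrative numbers the pathway reproduces the paper's ordering: strategy first, then mulching, then irrigation technique.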

  10. Rapid estimation of nutritional elements on citrus leaves by near infrared reflectance spectroscopy.

    PubMed

    Galvez-Sola, Luis; García-Sánchez, Francisco; Pérez-Pérez, Juan G; Gimeno, Vicente; Navarro, Josefa M; Moral, Raul; Martínez-Nicolás, Juan J; Nieves, Manuel

    2015-01-01

    Sufficient nutrient application is one of the most important factors in producing quality citrus fruits. One of the main guides in planning citrus fertilizer programs is by directly monitoring the plant nutrient content. However, this requires analysis of a large number of leaf samples using expensive and time-consuming chemical techniques. Over the last 5 years, it has been demonstrated that it is possible to quantitatively estimate certain nutritional elements in citrus leaves by using the spectral reflectance values, obtained by using near infrared reflectance spectroscopy (NIRS). This technique is rapid, non-destructive, cost-effective and environmentally friendly. Therefore, the estimation of macro and micronutrients in citrus leaves by this method would be beneficial in identifying the mineral status of the trees. However, to be used effectively NIRS must be evaluated against the standard techniques across different cultivars. In this study, NIRS spectral analysis, and subsequent nutrient estimations for N, K, Ca, Mg, B, Fe, Cu, Mn, and Zn concentration, were performed using 217 leaf samples from different citrus tree species. Partial least square regression and different pre-processing signal treatments were used to generate the best estimation against the current best practice techniques. High proficiency was verified in the estimation of N (Rv = 0.99) and Ca (Rv = 0.98), and acceptable estimation was achieved for K, Mg, Fe, and Zn. However, no successful calibrations were obtained for the estimation of B, Cu, and Mn.

  12. Home care for the disabled elderly: predictors and expected costs.

    PubMed Central

    Coughlin, T A; McBride, T D; Perozek, M; Liu, K

    1992-01-01

    While interest in publicly funded home care for the disabled elderly is keen, basic policy issues need to be addressed before an appropriate program can be adopted and financed. This article presents findings from a study in which the cost implications of anticipated behavioral responses (for example, caregiver substitution) are estimated. Using simulation techniques, the results demonstrate that anticipated behavioral responses would likely add between $1.8 and $2.7 billion (1990 dollars) to the costs of a public home care program. Results from a variety of cost simulations are presented. The data base for the study was the 1982 National Long-Term Care Survey. PMID:1399652
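The simulation approach described above can be sketched as a toy microsimulation in which some informal caregivers substitute paid care once it becomes publicly funded, adding to program cost. All parameters (population size, substitution probability, hours, hourly cost) are invented for illustration and are not the study's.

```python
import random

random.seed(7)

# Toy microsimulation of a public home care program: a fraction of informal
# caregivers substitute paid care once it is publicly funded, raising program
# cost. All parameters are invented for illustration.
N = 100_000              # disabled elderly persons simulated
P_SUBSTITUTE = 0.15      # chance an informal caregiver substitutes paid care
BASE_HOURS = 10.0        # weekly paid hours without behavioral response
EXTRA_HOURS = 6.0        # added weekly paid hours when substitution occurs
HOURLY_COST = 12.0       # $ per paid hour

base_cost = extra_cost = 0.0
for _ in range(N):
    base_cost += BASE_HOURS * HOURLY_COST * 52          # annual baseline cost
    if random.random() < P_SUBSTITUTE:                  # behavioral response
        extra_cost += EXTRA_HOURS * HOURLY_COST * 52

print(f"baseline ${base_cost/1e6:.0f}M, behavioral add-on ${extra_cost/1e6:.1f}M")
```

The point of such a simulation, as in the study, is that the behavioral add-on is a substantial fraction of baseline program cost.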

  13. In-hospital cost comparison between percutaneous pulmonary valve implantation and surgery

    PubMed Central

    Mishra, Vinod; Lewandowska, Milena; Andersen, Jack Gunnar; Andersen, Marit Helen; Lindberg, Harald; Døhlen, Gaute; Fosse, Erik

    2017-01-01

    Abstract OBJECTIVES: Today, both surgical and percutaneous techniques are available for pulmonary valve implantation in patients with right ventricle outflow tract obstruction or insufficiency. In this controlled, non-randomized study the hospital costs per patient of the two treatment options were identified and compared. METHODS: From June 2011 until October 2014, cost data in 20 patients treated with the percutaneous technique and 14 patients treated with open surgery were consecutively included. Two methods for cost analysis were used: a retrospective average cost estimate (overhead costs) and a direct prospective detailed cost acquisition related to each individual patient (patient-specific costs). RESULTS: The equipment cost, particularly the stents and valve itself, was by far the main cost-driving factor in the percutaneous pulmonary valve group, representing 96% of the direct costs, whereas in the open surgery group the main costs derived from the postoperative care and particularly the stay in the intensive care department. The device-related cost in this group represented 13.5% of the direct costs. Length-of-stay-related costs were mean $3885 (1618) in the percutaneous group and mean $17,848 (5060) in the open surgery group. The difference in postoperative stay between the groups was statistically significant (P ≤ 0.001). CONCLUSIONS: Given the high postoperative cost in open surgery, the percutaneous procedure could be cost saving even with a device cost of more than five times the cost of the surgical device. PMID:28007875

  14. The costs of introducing new technologies into space systems

    NASA Technical Reports Server (NTRS)

    Dodson, E. N.; Partma, H.; Ruhland, W.

    1992-01-01

    A review is conducted of cost-research studies intended to provide guidelines for cost estimates of integrating new technologies into existing satellite systems. Quantitative methods are described for determining the technological state-of-the-art so that proposed programs can be evaluated accurately in terms of their contribution to technological development. The R&D costs associated with the proposed programs are then assessed, with attention given to the technological advances. Any reductions in the costs of production, operations, and support afforded by the advanced technologies are also quantified. The proposed model is employed in relation to a satellite sizing and cost study in which a tradeoff between increased R&D costs and reduced production costs is examined. The technology/cost model provides a consistent yardstick for assessing the true relative economic impact of introducing novel techniques and technologies.

  15. A timber inventory based upon manual and automated analysis of ERTS-1 and supporting aircraft data using multistage probability sampling. [Plumas National Forest, California

    NASA Technical Reports Server (NTRS)

    Nichols, J. D.; Gialdini, M.; Jaakkola, S.

    1974-01-01

    A quasi-operational study demonstrated that a timber inventory based on manual and automated analysis of ERTS-1 data, supporting aircraft data, and ground data could be made using multistage sampling techniques. The inventory proved to be a timely, cost-effective alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. The cost of the inventory procedure, at 1.1 cents/acre, compared favorably with the cost of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low altitude photo plots indicated no significant differences in the overall classification accuracies.

  16. 75 FR 16527 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-01

    ... applies to all of the approximately 5,178 firms registered with the Commission that effect transactions on... confirmations than larger firms because they effect fewer transactions. The Commission staff estimates the costs... of automated collection techniques or other forms of information technology. Consideration will be...

  17. Time-driven Activity-based Costing More Accurately Reflects Costs in Arthroplasty Surgery.

    PubMed

    Akhavan, Sina; Ward, Lorrayne; Bozic, Kevin J

    2016-01-01

    Cost estimates derived from traditional hospital cost accounting systems have inherent limitations that restrict their usefulness for measuring process and quality improvement. Newer approaches such as time-driven activity-based costing (TDABC) may offer more precise estimates of true cost, but to our knowledge, the differences between TDABC and more traditional approaches have not been explored systematically in arthroplasty surgery. The purposes of this study were to compare the costs associated with (1) primary total hip arthroplasty (THA); (2) primary total knee arthroplasty (TKA); and (3) three surgeons performing these total joint arthroplasties (TJAs) as measured using TDABC versus traditional hospital accounting (TA). Process maps were developed for each phase of care (preoperative, intraoperative, and postoperative) for patients undergoing primary TJA performed by one of three surgeons at a tertiary care medical center. Personnel costs for each phase of care were measured using TDABC based on fully loaded labor rates, including physician compensation. Costs associated with consumables (including implants) were calculated based on direct purchase price. Total costs for 677 primary TJAs were aggregated over 17 months (January 2012 to May 2013) and organized into cost categories (room and board, implant, operating room services, drugs, supplies, other services). Costs derived using TDABC, based on actual time and intensity of resources used, were compared with costs derived using TA techniques based on activity-based costing and indirect costs calculated as a percentage of direct costs from the hospital decision support system. Substantial differences between cost estimates using TDABC and TA were found for primary THA (USD 12,982 TDABC versus USD 23,915 TA), primary TKA (USD 13,661 TDABC versus USD 24,796 TA), and individually across all three surgeons for both (THA: TDABC = 49%-55% of TA total cost; TKA: TDABC = 53%-55% of TA total cost). 
Cost categories with the most variability between TA and TDABC estimates were operating room services and room and board. Traditional hospital cost accounting systems overestimate the costs associated with many surgical procedures, including primary TJA. TDABC provides a more accurate measure of true resource use associated with TJAs and can be used to identify high-cost/high-variability processes that can be targeted for process/quality improvement. Level III, therapeutic study.
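The TDABC logic the abstract describes (personnel cost as capacity cost rate times time consumed per phase of care, plus consumables at direct purchase price) can be sketched in a few lines. The rates, times, resource names and implant price below are hypothetical, not the study's figures.

```python
# Time-driven activity-based costing sketch: cost = sum over resources of
# (capacity cost rate per minute x minutes used), plus direct consumables.
# All rates, times and the implant price are hypothetical.

rates = {"surgeon": 12.0, "nurse": 1.5, "or_suite": 4.0}  # $ per minute

phases = {  # minutes of each resource consumed per phase of care
    "preoperative": {"nurse": 30},
    "intraoperative": {"surgeon": 90, "nurse": 90, "or_suite": 100},
    "postoperative": {"nurse": 120},
}

IMPLANT = 5000.0  # direct purchase price of implant/consumables

def tdabc_total(phases, rates, consumables):
    """Total episode cost: personnel time at capacity rates + consumables."""
    personnel = sum(rates[resource] * minutes
                    for phase in phases.values()
                    for resource, minutes in phase.items())
    return personnel + consumables

print(tdabc_total(phases, rates, IMPLANT))  # → 6840.0
```

Because every dollar is tied to measured minutes rather than allocated overhead percentages, this style of estimate tends to come in well below traditional accounting figures, as the study reports.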

  18. Metal surface corrosion grade estimation from single image

    NASA Astrophysics Data System (ADS)

    Chen, Yijun; Qi, Lin; Sun, Huyuan; Fan, Hao; Dong, Junyu

    2018-04-01

    Metal corrosion can cause many problems, so assessing the grade of metal corrosion quickly and effectively, and remediating it in time, is a very important issue. Typically, this is done by trained surveyors at great cost. Assisting them in the inspection process with computer vision and artificial intelligence would decrease the inspection cost. In this paper, we propose a dataset of metal surface corrosion for computer vision detection and present a comparison on this dataset between standard computer vision techniques using OpenCV and a deep learning method for automatic metal surface corrosion grade estimation from a single image. The test was performed by classifying images and calculating the accuracy of the two different approaches.
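The evaluation the abstract describes reduces to classifying labeled images and comparing accuracy. The grades and predictions below are invented placeholders for the two approaches (a classical vision pipeline vs. a learned model); only the accuracy comparison itself is illustrated.

```python
# Comparing two corrosion-grade classifiers by accuracy on a labeled test set.
# Grades and predictions are invented placeholders, not results from the paper.

def accuracy(y_true, y_pred):
    """Fraction of predictions matching the true corrosion grade."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

y_true    = [0, 1, 2, 2, 3, 1, 0, 3, 2, 1]   # true corrosion grades
cv_pred   = [0, 1, 1, 2, 3, 0, 0, 3, 2, 2]   # classical vision pipeline
deep_pred = [0, 1, 2, 2, 3, 1, 0, 2, 2, 1]   # learned model

print(accuracy(y_true, cv_pred), accuracy(y_true, deep_pred))  # 0.7 0.9
```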

  19. The Effect of State Regulatory Stringency on Nursing Home Quality

    PubMed Central

    Mukamel, Dana B; Weimer, David L; Harrington, Charlene; Spector, William D; Ladd, Heather; Li, Yue

    2012-01-01

    Objective To test the hypothesis that more stringent quality regulations contribute to better quality nursing home care and to assess their cost-effectiveness. Data Sources/Setting Primary and secondary data from all states and U.S. nursing homes between 2005 and 2006. Study Design We estimated seven models, regressing quality measures on the Harrington Regulation Stringency Index and control variables. To account for endogeneity between regulation and quality, we used instrumental variables techniques. Quality was measured by staffing hours by type per case-mix adjusted day, hotel expenditures, and risk-adjusted decline in activities of daily living, high-risk pressure sores, and urinary incontinence. Data Collection All states' licensing and certification offices were surveyed to obtain data about deficiencies. Secondary data included the Minimum Data Set, Medicare Cost Reports, and the Economic Freedom Index. Principal Findings Regulatory stringency was significantly associated with better quality for four of the seven measures studied. The cost-effectiveness for the activities-of-daily-living measure was estimated at about $72,000 (2011 dollars) per quality-adjusted life year. Conclusions Quality regulations lead to better quality in nursing homes along some dimensions, but not all. Our estimates of cost-effectiveness suggest that increased regulatory stringency is in the ballpark of other acceptable cost-effective practices. PMID:22946859
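The endogeneity problem the study addresses (regulation and quality influencing each other through unobserved factors) can be illustrated with a simulated example of the simplest instrumental-variables estimator, the IV ratio for one regressor and one instrument. All data and coefficients below are synthetic; the study itself uses multi-equation IV models, not this toy.

```python
import random
import statistics

random.seed(3)

# Simulate endogeneity: regressor x and outcome y share an unobserved
# confounder u, while instrument z shifts x but does not affect y directly.
# The true causal effect of x on y is 2.0. All values are synthetic.
n = 5000
z = [random.gauss(0, 1) for _ in range(n)]
u = [random.gauss(0, 1) for _ in range(n)]
x = [0.8 * zi + ui + random.gauss(0, 0.5) for zi, ui in zip(z, u)]
y = [2.0 * xi + 3.0 * ui + random.gauss(0, 0.5) for xi, ui in zip(x, u)]

def cov(a, b):
    """Sample covariance of two equal-length sequences."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / (len(a) - 1)

beta_ols = cov(x, y) / cov(x, x)   # biased upward: absorbs the confounder
beta_iv = cov(z, y) / cov(z, x)    # IV ratio estimator: consistent

print(f"OLS {beta_ols:.2f} vs IV {beta_iv:.2f} (truth 2.0)")
```

The naive regression overstates the effect because the confounder moves x and y together; the instrument recovers the causal coefficient.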

  20. How Much Can Non-industry Standard Measurement Methodologies Benefit Methane Reduction Programs?

    NASA Astrophysics Data System (ADS)

    Risk, D. A.; O'Connell, L.; Atherton, E.

    2017-12-01

    In recent years, energy sector methane emissions have been recorded in large part by applying modern non-industry-standard techniques. Industry may lack the regulatory flexibility to use such techniques, or in some cases may not understand the possible associated economic advantage. As progressive jurisdictions move from estimation and towards routine measurement, the research community should provide guidance to help regulators and companies measure more effectively, and economically if possible. In this study, we outline a modelling experiment in which we explore the integration of non-industry-standard measurement techniques as part of a generalized compliance measurement program. The study was not intended to be exhaustive, or to recommend particular combinations, but instead to explore the inter-relationships between methodologies, development types, and compliance practices. We first defined the role, applicable scale, detection limits, working distances, and approximate deployment cost of several measurement methodologies. We then considered a variety of development types differing mainly in footprint, density, and emissions "profile". Using a Monte Carlo approach, we evaluated the effect of these various factors on the cost and confidence of the compliance measurement program. We found that when added individually, some of the research techniques were indeed able to deliver an improvement in cost and/or confidence when used alongside industry-standard Optical Gas Imaging. When applied in combination, the ideal fraction of each measurement technique depended on development type, emission profile, and whether confidence or cost was more important. Results suggest that measurement cost and confidence could be improved if energy companies exploited a wider range of measurement techniques, and in a manner tailored to each development.
In the short-term, combining clear scientific guidance with economic information could benefit immediate mitigation efforts over developing new super sensors.
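    A minimal sketch of the kind of Monte Carlo comparison described, assuming invented per-site costs and leak-detection probabilities for each measurement method (none of these numbers come from the study):

```python
import random

random.seed(42)

# Hypothetical per-site survey methods: (cost per site, probability a leak is found)
METHODS = {
    "OGI": (600.0, 0.70),          # industry-standard Optical Gas Imaging
    "vehicle": (150.0, 0.50),      # mobile survey (illustrative numbers)
}

def simulate_program(mix, n_sites=200, leak_rate=0.05, n_trials=2000):
    """Return (mean program cost, mean fraction of leaks found) for a method mix.

    mix maps method name -> fraction of sites surveyed with that method.
    """
    costs, confidences = [], []
    for _ in range(n_trials):
        cost, leaks, found = 0.0, 0, 0
        for _site in range(n_sites):
            # pick a method for this site according to the mix
            r, acc = random.random(), 0.0
            for name, frac in mix.items():
                acc += frac
                if r <= acc:
                    break
            per_site_cost, p_detect = METHODS[name]
            cost += per_site_cost
            if random.random() < leak_rate:        # site has a leak
                leaks += 1
                if random.random() < p_detect:     # method finds it
                    found += 1
        costs.append(cost)
        confidences.append(found / leaks if leaks else 1.0)
    return sum(costs) / n_trials, sum(confidences) / n_trials

cost_ogi, conf_ogi = simulate_program({"OGI": 1.0})
cost_mix, conf_mix = simulate_program({"OGI": 0.5, "vehicle": 0.5})
print(f"all-OGI:   cost={cost_ogi:.0f}  detection={conf_ogi:.2f}")
print(f"50/50 mix: cost={cost_mix:.0f}  detection={conf_mix:.2f}")
```

Sweeping the mix fractions over a grid and recording both outputs reproduces, in miniature, the cost-versus-confidence trade-off the study explores.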

  1. Optimal pipe size design for looped irrigation water supply system using harmony search: Saemangeum project area.

    PubMed

    Yoo, Do Guen; Lee, Ho Min; Sadollah, Ali; Kim, Joong Hoon

    2015-01-01

    Water supply systems are mainly classified into branched and looped network systems. The main difference between these two systems is that, in a branched network system, the flow within each pipe is a known value, whereas in a looped network system, the flow in each pipe is considered an unknown value. Therefore, an analysis of a looped network system is a more complex task. This study aims to develop a technique for estimating the optimal pipe diameter for a looped agricultural irrigation water supply system using a harmony search algorithm, which is an optimization technique. This study mainly serves two purposes. The first is to develop an algorithm and a program for estimating a cost-effective pipe diameter for agricultural irrigation water supply systems using optimization techniques. The second is to validate the developed program by applying the proposed optimized cost-effective pipe diameter to an actual study region (Saemangeum project area, zone 6). The results suggest that the optimal design program, which applies an optimization theory and enhances user convenience, can be effectively applied to real looped agricultural irrigation water supply systems.

  2. Optimal Pipe Size Design for Looped Irrigation Water Supply System Using Harmony Search: Saemangeum Project Area

    PubMed Central

    Lee, Ho Min; Sadollah, Ali

    2015-01-01

    Water supply systems are mainly classified into branched and looped network systems. The main difference between these two systems is that, in a branched network system, the flow within each pipe is a known value, whereas in a looped network system, the flow in each pipe is considered an unknown value. Therefore, an analysis of a looped network system is a more complex task. This study aims to develop a technique for estimating the optimal pipe diameter for a looped agricultural irrigation water supply system using a harmony search algorithm, which is an optimization technique. This study mainly serves two purposes. The first is to develop an algorithm and a program for estimating a cost-effective pipe diameter for agricultural irrigation water supply systems using optimization techniques. The second is to validate the developed program by applying the proposed optimized cost-effective pipe diameter to an actual study region (Saemangeum project area, zone 6). The results suggest that the optimal design program, which applies an optimization theory and enhances user convenience, can be effectively applied to real looped agricultural irrigation water supply systems. PMID:25874252
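    A minimal harmony search sketch for discrete pipe-diameter selection, with an invented cost function and capacity penalty standing in for a real hydraulic network solver (all diameters, unit costs, and the demand figure are hypothetical):

```python
import random

random.seed(1)

# Choose a discrete diameter (mm) for each of 5 pipes.  Cost grows with
# diameter; a penalty is charged when total carrying capacity (proportional
# to d^2.63, a Hazen-Williams-style exponent) falls below demand.
DIAMETERS = [100, 150, 200, 250, 300, 400]
UNIT_COST = {100: 50, 150: 70, 200: 95, 250: 125, 300: 160, 400: 250}
N_PIPES, DEMAND = 5, 15_000.0

def cost(solution):
    capacity = sum(d ** 2.63 for d in solution) / 100.0
    penalty = max(0.0, DEMAND - capacity) * 10.0
    return sum(UNIT_COST[d] for d in solution) + penalty

def harmony_search(hms=10, hmcr=0.9, par=0.3, iters=2000):
    # harmony memory: a pool of random initial solutions
    memory = [[random.choice(DIAMETERS) for _ in range(N_PIPES)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for i in range(N_PIPES):
            if random.random() < hmcr:              # memory consideration
                d = random.choice(memory)[i]
                if random.random() < par:           # pitch adjustment to a neighbour
                    j = DIAMETERS.index(d) + random.choice((-1, 1))
                    d = DIAMETERS[min(max(j, 0), len(DIAMETERS) - 1)]
            else:                                   # random selection
                d = random.choice(DIAMETERS)
            new.append(d)
        worst = max(memory, key=cost)
        if cost(new) < cost(worst):                 # replace worst harmony
            memory[memory.index(worst)] = new
    best = min(memory, key=cost)
    return best, cost(best)

best, best_cost = harmony_search()
print(best, best_cost)
```

In the actual design problem the `cost` function would be replaced by a hydraulic simulation of the looped network plus real pipe prices; the harmony memory mechanics stay the same.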

  3. [Variability and opportunity costs among the surgical alternatives for breast cancer].

    PubMed

    Angulo-Pueyo, Ester; Ridao-López, Manuel; Martínez-Lizaga, Natalia; García-Armesto, Sandra; Bernal-Delgado, Enrique

    2014-01-01

    To analyze medical practice variation in breast cancer surgery (either inpatient-based or day-case surgery), by comparing conservative surgery (CS) plus radiotherapy vs. non-conservative surgery (NCS). We also analyzed the opportunity costs associated with CS and NCS. We performed an observational study of age- and sex-standardized rates of CS and NCS, performed in 199 Spanish healthcare areas in 2008-2009. Costs were calculated by using two techniques: indirectly, by using All-Patients Diagnosis Related Groups (AP-DRG) based on hospital admissions, and directly by using full costing from the Spanish Network of Hospital Costs (SNHC) data. Standardized surgery rates for CS and NCS were 6.84 and 4.35 per 10,000 women, with variation across areas ranging from 2.95 to 3.11 per 10,000 inhabitants. In 2009, 9% of CS was performed as day-case surgery, although a third of the health care areas did not perform this type of surgery. Taking the SNHC as a reference, the cost of CS was estimated at 7,078 € and that of NCS was 6,161 €. Using AP-DRG, costs amounted to 9,036 € and 8,526 €, respectively. However, CS had lower opportunity costs than NCS when day-case surgery was performed frequently-more than 46% of cases (following SNHC estimates) or 23% of cases (following AP-DRG estimates). Day-case CS for breast cancer was found to be the best option in terms of opportunity-costs beyond a specific threshold, when both CS and NCS are elective. Copyright © 2013 SESPAS. Published by Elsevier Espana. All rights reserved.

  4. Estimation of spatial-temporal gait parameters using a low-cost ultrasonic motion analysis system.

    PubMed

    Qi, Yongbin; Soh, Cheong Boon; Gunawan, Erry; Low, Kay-Soon; Thomas, Rijil

    2014-08-20

    In this paper, a low-cost motion analysis system using a wireless ultrasonic sensor network is proposed and investigated. A methodology has been developed to extract spatial-temporal gait parameters including stride length, stride duration, stride velocity, stride cadence, and stride symmetry from 3D foot displacements estimated by the combination of spherical positioning technique and unscented Kalman filter. The performance of this system is validated against a camera-based system in the laboratory with 10 healthy volunteers. Numerical results show the feasibility of the proposed system with average error of 2.7% for all the estimated gait parameters. The influence of walking speed on the measurement accuracy of proposed system is also evaluated. Statistical analysis demonstrates its capability of being used as a gait assessment tool for some medical applications.
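    The extraction of spatial-temporal stride parameters from estimated foot displacements might be sketched as follows, assuming heel-strike sample indices and forward foot positions are already available (all values hypothetical, not the paper's data):

```python
# Hypothetical 100-Hz-sampled forward foot displacement (metres); heel strikes
# of the same foot assumed detected at known sample indices.
SAMPLE_RATE_HZ = 100.0
heel_strike_samples = [0, 110, 225, 335]                 # successive strikes
forward_position_m = {0: 0.0, 110: 1.32, 225: 2.60, 335: 3.95}

def stride_parameters(strikes, pos, fs):
    """Per-stride length (m), duration (s), velocity (m/s), cadence (strides/min)."""
    lengths = [pos[b] - pos[a] for a, b in zip(strikes, strikes[1:])]
    durations = [(b - a) / fs for a, b in zip(strikes, strikes[1:])]
    velocities = [l / d for l, d in zip(lengths, durations)]
    cadence = [60.0 / d for d in durations]
    return lengths, durations, velocities, cadence

lengths, durations, velocities, cadence = stride_parameters(
    heel_strike_samples, forward_position_m, SAMPLE_RATE_HZ)
print(lengths, durations, velocities)
```

Stride symmetry can then be computed by comparing these per-stride values between the left and right feet.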

  5. Running Neuroimaging Applications on Amazon Web Services: How, When, and at What Cost?

    PubMed

    Madhyastha, Tara M; Koh, Natalie; Day, Trevor K M; Hernández-Fernández, Moises; Kelley, Austin; Peterson, Daniel J; Rajan, Sabreena; Woelfer, Karl A; Wolf, Jonathan; Grabowski, Thomas J

    2017-01-01

    The contribution of this paper is to identify and describe current best practices for using Amazon Web Services (AWS) to execute neuroimaging workflows "in the cloud." Neuroimaging offers a vast set of techniques by which to interrogate the structure and function of the living brain. However, many of the scientists for whom neuroimaging is an extremely important tool have limited training in parallel computation. At the same time, the field is experiencing a surge in computational demands, driven by a combination of data-sharing efforts, improvements in scanner technology that allow acquisition of images with higher image resolution, and by the desire to use statistical techniques that stress processing requirements. Most neuroimaging workflows can be executed as independent parallel jobs and are therefore excellent candidates for running on AWS, but the overhead of learning to do so and determining whether it is worth the cost can be prohibitive. In this paper we describe how to identify neuroimaging workloads that are appropriate for running on AWS, how to benchmark execution time, and how to estimate cost of running on AWS. By benchmarking common neuroimaging applications, we show that cloud computing can be a viable alternative to on-premises hardware. We present guidelines that neuroimaging labs can use to provide a cluster-on-demand type of service that should be familiar to users, and scripts to estimate cost and create such a cluster.
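    A back-of-the-envelope cloud cost estimate of the kind described can be sketched as below; the instance and storage prices are placeholders for illustration, not current AWS rates:

```python
def estimate_cost(n_subjects, hours_per_subject, jobs_per_instance,
                  price_per_instance_hour, storage_gb, price_per_gb_month,
                  months=1):
    """Rough cost: instance-hours needed for all jobs, priced hourly,
    plus storage held for the duration of the analysis."""
    compute_hours = n_subjects * hours_per_subject / jobs_per_instance
    compute = compute_hours * price_per_instance_hour
    storage = storage_gb * price_per_gb_month * months
    return compute + storage

# Hypothetical workload: 100 subjects, 8 h each, 4 jobs packed per instance
cost = estimate_cost(n_subjects=100, hours_per_subject=8, jobs_per_instance=4,
                     price_per_instance_hour=0.68, storage_gb=500,
                     price_per_gb_month=0.023)
print(f"estimated: ${cost:.2f}")
```

Benchmarking a single subject's pipeline gives `hours_per_subject`; the rest of the inputs come from the chosen instance type and pricing page.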

  6. Cost Estimation Techniques for C3I System Software.

    DTIC Science & Technology

    1984-07-01

    opment manmonth have been determined for maxi, midi, and mini type computers. Small to median size timeshared developments used 0.2 to 1.5 hours...development schedule 1.23 1.00 1.10 2.1.3 Detailed Model The final codification of the COCOMO regressions was the development of separate effort...regardless of the software structure level being estimated: D8VC -- the expected development computer (maxi, midi, mini, micro) MODE -- the expected

  7. Health economics in drug development: efficient research to inform healthcare funding decisions.

    PubMed

    Hall, Peter S; McCabe, Christopher; Brown, Julia M; Cameron, David A

    2010-10-01

    In order to decide whether a new treatment should be used in patients, a robust estimate of efficacy and toxicity is no longer sufficient. As a result of increasing healthcare costs across the globe healthcare payers and providers now seek estimates of cost-effectiveness as well. Most trials currently being designed still only consider the need for prospective efficacy and toxicity data during the development life-cycle of a new intervention. Hence the cost-effectiveness estimates are inevitably less precise than the clinical data on which they are based. Methods based on decision theory are being developed by health economists that can contribute to the design of clinical trials in such a way that they can more effectively lead to better informed drug funding decisions on the basis of cost-effectiveness in addition to clinical outcomes. There is an opportunity to apply these techniques prospectively in the design of future clinical trials. This article describes the problems encountered by those responsible for drug reimbursement decisions as a consequence of the current drug development pathway. The potential for decision theoretic methods to help overcome these problems is introduced and potential obstacles in implementation are highlighted. Copyright © 2010 Elsevier Ltd. All rights reserved.

  8. The economics of treatment for infants with respiratory distress syndrome.

    PubMed

    Neil, N; Sullivan, S D; Lessler, D S

    1998-01-01

    To define clinical outcomes and prevailing patterns of care for the initial hospitalization of infants at greatest risk for respiratory distress syndrome (RDS); to estimate direct medical care costs associated with the initial hospitalization; and to introduce and demonstrate a simulation technique for the economic evaluation of health care technologies. Clinical outcomes and usual-care algorithms were determined for infants with RDS in three birthweight categories (500-1,000g; >1,000-1,500g; and >1,500g) using literature- and expert-panel-based data. The experts were practitioners from major U.S. hospitals who were directly involved in the clinical care of such infants. Using the framework derived from the usual care patterns and outcomes, the authors developed an itemized "micro-costing" economic model to simulate the costs associated with the initial hospitalization of a hypothetical RDS patient. The model is computerized and dynamic; unit costs, frequencies, number of days, probabilities and population multipliers are all variable and can be modified on the basis of new information or local conditions. Aggregated unit costs are used to estimate the expected medical costs of treatment per patient. Expected costs of initial hospitalization per uncomplicated surviving infant with RDS were estimated to be $101,867 for 500-1,000g infants; $64,524 for >1,000-1,500g infants; and $27,224 for >1,500g infants. Incremental costs of complications among survivors were estimated to be $22,155 (500-1,000g); $11,041 (>1,000-1,500g); and $2,448 (>1,500 g). Expected costs of initial hospitalization per case (including non-survivors) were $100,603; $72,353; and $28,756, respectively. An itemized model such as the one developed here serves as a benchmark for the economic assessment of treatment costs and utilization. 
Moreover, it offers a powerful tool for the prospective evaluation of new technologies or procedures designed to reduce the incidence of, severity of, or total hospital resource use ascribed to RDS.
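    An itemized micro-costing model of this kind reduces to probability-weighted unit costs; here is a sketch with invented line items and probabilities (not the study's actual cost data):

```python
# Hypothetical line items: (name, unit cost USD, units per stay, probability used)
ITEMS = [
    ("NICU bed day",       2_000.0, 30, 1.00),
    ("surfactant dose",      900.0,  2, 0.85),
    ("ventilator day",       400.0, 12, 0.70),
    ("cranial ultrasound",   250.0,  2, 0.60),
]

def expected_cost(items, complication_prob=0.0, complication_cost=0.0):
    """Expected per-patient cost: sum(unit * quantity * probability), plus the
    probability-weighted incremental cost of complications."""
    base = sum(unit * qty * prob for _name, unit, qty, prob in items)
    return base + complication_prob * complication_cost

base = expected_cost(ITEMS)
with_compl = expected_cost(ITEMS, complication_prob=0.25,
                           complication_cost=22_155.0)
print(f"expected base cost: ${base:,.0f}")
print(f"with complications: ${with_compl:,.0f}")
```

Because every unit cost, frequency, and probability is a named input, the model can be re-run for new information or local conditions, as the abstract describes.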

  9. Directly coupled vs conventional time domain reflectometry in soils

    USDA-ARS?s Scientific Manuscript database

    Time domain reflectometry (TDR), a technique for estimation of soil water, measures the travel time of an electromagnetic pulse on electrodes embedded in the soil, but has limited application in commercial agriculture due to costs, labor, and sensing depth. Conventional TDR systems have employed ana...

  10. Policy analysis for prenatal genetic diagnosis.

    PubMed

    Thompson, M; Milunsky, A

    1979-01-01

    Consideration of the analytic difficulties faced in estimating the benefits and costs of prenatal genetic diagnosis, coupled with a brief review of existing benefit-cost studies, leads to the conclusion that public subsidy of prenatal testing can yield benefits substantially in excess of costs. The practical obstacles to such programs include the attitudes of prospective parents, a lack of knowledge, monetary barriers, inadequately organized medical resources, and the political issue of abortion. Policy analysis can now nevertheless formulate principles and guide immediate actions to improve present utilization of prenatal testing and to facilitate possible future expansion of these diagnostic techniques.

  11. Pre-hospitalization, hospitalization, and post-hospitalization costs of patients with neurocysticercosis treated at the Instituto Nacional de Neurologia y Neurocirugia (INNN) in Mexico City, Mexico.

    PubMed

    Bhattarai, Rachana; Carabin, Hélène; Flores-Rivera, Jose; Corona, Teresa; Proaño, Jefferson V; Flisser, Ana; Budke, Christine M

    2018-01-01

    The objective of this study was to estimate the direct costs associated with the diagnosis and treatment of neurocysticercosis (NCC) during pre-hospitalization, hospitalization, and post-hospitalization periods for 108 NCC patients treated at the Instituto Nacional de Neurologia y Neurocirugia (INNN) in Mexico City, Mexico. Information on clinical manifestations, diagnostic tests, hospitalizations, surgical procedures, prescription medication, and other treatments was collected via medical chart reviews. Uncertain values for costs and frequency of treatments were imputed using bootstrap techniques. The average per-patient pre-hospitalization and hospitalization costs were US$ 257 (95% CI: 185 - 329) and US$ 2,576 (95% CI: 2,244 - 2,908), respectively. Post-hospitalization costs tended to decrease over time, with estimates for the first five years post-hospitalization of US$ 475 (95% CI: 423 - 527), US$ 228 (95% CI: 167 - 288), US$ 157 (95% CI: 111 - 202), US$ 150 (95% CI: 106 - 204), and US$ 91 (95% CI: 27 - 154), respectively. NCC results in a significant economic burden for patients requiring hospitalization, with this burden continuing years post-hospitalization.
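    The percentile bootstrap used here to handle uncertain values can be sketched as follows, on invented per-patient costs rather than the study's data:

```python
import random
import statistics

random.seed(0)

# Hypothetical observed per-patient hospitalization costs (US$)
observed = [1800, 2200, 2500, 2750, 2100, 3900, 2600, 2300, 3100, 2400]

def bootstrap_mean_ci(data, n_boot=5000, alpha=0.05):
    """Percentile-bootstrap confidence interval for the mean cost."""
    means = sorted(
        statistics.mean(random.choices(data, k=len(data)))  # resample with replacement
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return statistics.mean(data), lo, hi

mean, lo, hi = bootstrap_mean_ci(observed)
print(f"mean US$ {mean:.0f} (95% CI: {lo:.0f} - {hi:.0f})")
```

The same resampling loop yields intervals for any other statistic (medians, per-period totals) by swapping the function applied to each resample.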

  12. Perioperative Cost Analysis of Minimally Invasive vs Open Resection of Intradural Extramedullary Spinal Cord Tumors.

    PubMed

    Fontes, Ricardo B V; Wewel, Joshua T; O'Toole, John E

    2016-04-01

    Minimally invasive spinal surgery (MIS) has emerged as a clinically effective tool but its cost-effectiveness remains unclear. No studies have compared MIS vs open surgical techniques for the treatment of intradural extramedullary (IDEM) tumors. To analyze and compare open and MIS techniques for resection of IDEM tumors, with focus on perioperative costs. Retrospective analysis of a prospectively collected database including 35 IDEM patients (18 open, 17 MIS). Perioperative data, hospital costs, and hospital and physician charges for in-hospital services associated with the index surgical procedure and readmissions within 90 days were compared. Mean estimated blood loss, operative time, preoperative hospital charges, and physician fees were similar between open and MIS techniques. Patient and tumor characteristics were similar between groups. MIS cases were associated with shorter intensive care unit and floor stay. There were 3 complications in the open group, requiring 2 readmissions and 1 reoperation. Hospital costs ($21,307.80 open, $15,015.20 MIS, P < .01), and postoperative ($75,383.48 open, $56,006.88 MIS, P < .01) and total charges ($100,779.38 open, $76,100.92 MIS, P < .01) were significantly lower in the MIS group. There were no tumor recurrences in either group. All patients except for one in the open group maintained or improved their Nurick score. Both MIS and open techniques were able to adequately treat IDEM tumors. Reductions in complication rate and intensive care unit and hospital stay led to a decrease in hospital costs of almost 30% in the MIS group. MIS resection of IDEM tumors is not only an effective and safe option, but allows faster hospital discharge and significant cost savings.

  13. Estimator banks: a new tool for direction-of-arrival estimation

    NASA Astrophysics Data System (ADS)

    Gershman, Alex B.; Boehme, Johann F.

    1997-10-01

    A new powerful tool for improving the threshold performance of direction-of-arrival (DOA) estimation is considered. The essence of our approach is to reduce the number of outliers in the threshold domain using the so-called estimator bank containing multiple 'parallel' underlying DOA estimators which are based on pseudorandom resampling of the MUSIC spatial spectrum for given data batch or sample covariance matrix. To improve the threshold performance relative to conventional MUSIC, evolutionary principles are used, i.e., only 'successful' underlying estimators (having no failure in the preliminary estimated source localization sectors) are exploited in the final estimate. An efficient beamspace root implementation of the estimator bank approach is developed, combined with the array interpolation technique which enables the application to arbitrary arrays. A higher-order extension of our approach is also presented, where the cumulant-based MUSIC estimator is exploited as a basic technique for spatial spectrum resampling. Simulations and experimental data processing show that our algorithm performs well below the MUSIC threshold, namely, has the threshold performance similar to that of the stochastic ML method. At the same time, the computational cost of our algorithm is much lower than that of stochastic ML because no multidimensional optimization is involved.

  14. Optimal control of nonlinear continuous-time systems in strict-feedback form.

    PubMed

    Zargarzadeh, Hassan; Dierks, Travis; Jagannathan, Sarangapani

    2015-10-01

    This paper proposes a novel optimal tracking control scheme for nonlinear continuous-time systems in strict-feedback form with uncertain dynamics. The optimal tracking problem is transformed into an equivalent optimal regulation problem through a feedforward adaptive control input that is generated by modifying the standard backstepping technique. Subsequently, a neural network-based optimal control scheme is introduced to estimate the cost, or value function, over an infinite horizon for the resulting nonlinear continuous-time systems in affine form when the internal dynamics are unknown. The estimated cost function is then used to obtain the optimal feedback control input; therefore, the overall optimal control input for the nonlinear continuous-time system in strict-feedback form includes the feedforward plus the optimal feedback terms. It is shown that the estimated cost function minimizes the Hamilton-Jacobi-Bellman estimation error in a forward-in-time manner without using any value or policy iterations. Finally, optimal output feedback control is introduced through the design of a suitable observer. Lyapunov theory is utilized to show the overall stability of the proposed schemes without requiring an initial admissible controller. Simulation examples are provided to validate the theoretical results.

  15. Applications of non-standard maximum likelihood techniques in energy and resource economics

    NASA Astrophysics Data System (ADS)

    Moeltner, Klaus

    Two important types of non-standard maximum likelihood techniques, Simulated Maximum Likelihood (SML) and Pseudo-Maximum Likelihood (PML), have only recently found consideration in the applied economic literature. The objective of this thesis is to demonstrate how these methods can be successfully employed in the analysis of energy and resource models. Chapter I focuses on SML. It constitutes the first application of this technique in the field of energy economics. The framework is as follows: Surveys on the cost of power outages to commercial and industrial customers usually capture multiple observations on the dependent variable for a given firm. The resulting pooled data set is censored and exhibits cross-sectional heterogeneity. We propose a model that addresses these issues by allowing regression coefficients to vary randomly across respondents and by using the Geweke-Hajivassiliou-Keane simulator and Halton sequences to estimate high-order cumulative distribution terms. This adjustment requires the use of SML in the estimation process. Our framework allows for a more comprehensive analysis of outage costs than existing models, which rely on the assumptions of parameter constancy and cross-sectional homogeneity. Our results strongly reject both of these restrictions. The central topic of the second Chapter is the use of PML, a robust estimation technique, in count data analysis of visitor demand for a system of recreation sites. PML has been popular with researchers in this context, since it guards against many types of mis-specification errors. We demonstrate, however, that estimation results will generally be biased even if derived through PML if the recreation model is based on aggregate, or zonal data. To countervail this problem, we propose a zonal model of recreation that captures some of the underlying heterogeneity of individual visitors by incorporating distributional information on per-capita income into the aggregate demand function. 
This adjustment eliminates the unrealistic constraint of constant income across zonal residents, and thus reduces the risk of aggregation bias in estimated macro-parameters. The corrected aggregate specification reinstates the applicability of PML. It also increases model efficiency, and allows for the generation of welfare estimates for population subgroups.

  16. On the feasibility of benefit-cost analysis applied to remote sensing projects. [California water resources

    NASA Technical Reports Server (NTRS)

    Merewitz, L.

    1973-01-01

    The following step-wise procedure for making a benefit-cost analysis of using remote sensing techniques could be used either in the limited context of California water resources, or a context as broad as the making of integrated resource surveys of the entire earth resource complex on a statewide, regional, national, or global basis. (1) Survey all data collection efforts which can be accomplished by remote sensing techniques. (2) Carefully inspect the State of California budget and the Budget of the United States Government to find annual cost of data collection efforts. (3) Decide the extent to which remote sensing can obviate each of the collection efforts. (4) Sum the annual costs of all data collection which can be equivalently accomplished through remote sensing. (5) Decide what additional data could and would be collected through remote sensing. (6) Estimate the value of this information. It is not harmful to do a benefit-cost analysis so long as its severe limitations are recalled and it is supplemented with socio-economic impact studies.
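    The six-step procedure above condenses to a simple benefit-cost ratio; a sketch with hypothetical figures (not taken from the report):

```python
# Step 4: annual costs of data-collection efforts that remote sensing could
# equivalently replace; steps 5-6: the estimated value of additional data it
# would make available; all figures invented for illustration.
replaced_annual_costs = [1.2e6, 0.8e6, 2.5e6]
extra_data_value = 1.0e6
remote_sensing_cost = 4.0e6     # annual cost of the remote sensing program

def benefit_cost_ratio(replaced, extra_value, program_cost):
    """Total annual benefits (replaced costs + new data value) over program cost."""
    return (sum(replaced) + extra_value) / program_cost

ratio = benefit_cost_ratio(replaced_annual_costs, extra_data_value,
                           remote_sensing_cost)
print(f"benefit-cost ratio: {ratio:.2f}")
```

A ratio above 1.0 indicates benefits exceed costs, subject to the severe limitations the abstract itself notes.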

  17. A critical review of accounting and economic methods for estimating the costs of addiction treatment.

    PubMed

    Cartwright, William S

    2008-04-01

    Researchers have been at the forefront of applying new costing methods to drug abuse treatment programs and innovations. The motivation for such work has been to improve costing accuracy. Recent work has seen applications initiated in establishing charts of account and cost accounting for service delivery. As a result, researchers now have available five methods to apply to the costing of drug abuse treatment programs. In all areas of costing, there is room for more research on costing concepts and measurement applications. Additional work would be useful in establishing studies with activity-based costing for both research and managerial purposes. Studies of economies of scope are particularly relevant because of the integration of social services and criminal justice in drug abuse treatment. In the long run, managerial initiatives to improve the administration and quality of drug abuse treatment will benefit directly from research with new information on costing techniques.

  18. An automation of design and modelling tasks in NX Siemens environment with original software - cost module

    NASA Astrophysics Data System (ADS)

    Zbiciak, R.; Grabowik, C.; Janik, W.

    2015-11-01

    The design-constructional process is a creative activity that strives to fulfil, as well as possible at a given moment in time, all demands and needs formulated by a user, taking into account social, technical and technological advances. An engineer's knowledge, skills and innate abilities have the greatest influence on final product quality and cost; they also have a deciding influence on the product's technical and economic value. Given the above, it seems advisable to build software tools that support an engineer in manufacturing cost estimation. The Cost module is built with analytical procedures used for relative manufacturing cost estimation. As in the case of the Generator module, the Cost module was written in the object-oriented programming language C# in the Visual Studio environment. During the research, the following eight factors with the greatest influence on overall manufacturing cost were distinguished and defined: (i) the gear wheel teeth type, i.e. straight or helical; (ii) the gear wheel design shape, A or B, with or without a wheel hub; (iii) the gear tooth module; (iv) the number of teeth; (v) the gear rim width; (vi) the gear wheel material; (vii) heat treatment or thermochemical treatment; (viii) the accuracy class. Knowledge of parameters (i) to (v) is indispensable for proper modelling of 3D gear wheel models in a CAD system environment; these parameters are also processed in the Cost module. The last three parameters, (vi) to (viii), are used exclusively in the Cost module. The estimation of relative manufacturing cost is based on indexes calculated for each parameter, and the resulting estimate gives an overview of the influence of the design parameters on the final gear wheel manufacturing cost. The relative manufacturing cost takes values in the range 0.00 to 1.00; the larger the index value, the higher the relative manufacturing cost. Whether the proposed algorithm for relative manufacturing cost estimation was designed properly was verified by comparing its results with those obtained from industry. This verification indicated that in most cases both groups of results are similar. It can therefore be concluded that the Cost module might play a significant role in the design-constructional process by aiding an engineer in the selection among alternative gear wheel designs. It should be remembered that real manufacturing costs can differ significantly according to the manufacturing techniques and stock of machine tools available in a factory.
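    The index-based relative-cost calculation described might look like the following sketch; the per-parameter indexes and weights are invented for illustration and are not the Cost module's actual values:

```python
# Hypothetical per-parameter cost indexes in [0, 1] and weights summing to 1.
INDEX = {
    "teeth_type":   {"straight": 0.2, "helical": 0.6},
    "tooth_module": {2: 0.3, 4: 0.6, 6: 0.9},
    "material":     {"C45": 0.3, "18CrMo4": 0.7},
    "treatment":    {"none": 0.0, "hardening": 0.5, "carburizing": 0.9},
    "accuracy":     {8: 0.2, 7: 0.5, 6: 0.9},
}
WEIGHT = {"teeth_type": 0.15, "tooth_module": 0.30, "material": 0.20,
          "treatment": 0.20, "accuracy": 0.15}

def relative_cost(params):
    """Weighted sum of per-parameter indexes -> relative cost in [0, 1]."""
    return sum(WEIGHT[k] * INDEX[k][v] for k, v in params.items())

cheap = relative_cost({"teeth_type": "straight", "tooth_module": 2,
                       "material": "C45", "treatment": "none", "accuracy": 8})
costly = relative_cost({"teeth_type": "helical", "tooth_module": 6,
                        "material": "18CrMo4", "treatment": "carburizing",
                        "accuracy": 6})
print(cheap, costly)
```

Comparing the two index values shows the relative (not absolute) cost impact of design choices, which is exactly the comparison role the abstract assigns to the module.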

  19. An approach to estimate body dimensions through constant body ratio benchmarks.

    PubMed

    Chao, Wei-Cheng; Wang, Eric Min-Yang

    2010-12-01

    Building a new anthropometric database is a difficult and costly job that requires considerable manpower and time. However, most designers and engineers do not know how to convert old anthropometric data into applicable new data with minimal errors and costs (Wang et al., 1999). To simplify the process of converting old anthropometric data into useful new data, this study analyzed the available data in paired body dimensions in an attempt to determine constant body ratio (CBR) benchmarks that are independent of gender and age. In total, 483 CBR benchmarks were identified and verified from 35,245 ratios analyzed. Additionally, 197 estimation formulae, taking as inputs 19 easily measured body dimensions, were built using 483 CBR benchmarks. Based on the results for 30 recruited participants, this study determined that the described approach is more accurate and cost-effective than alternative techniques. Copyright © 2010 Elsevier Ltd. All rights reserved.
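    Estimation via CBR benchmarks reduces to multiplying an easily measured dimension by a constant ratio; a sketch with invented ratios (the paper's 483 benchmarks are not reproduced here):

```python
# Hypothetical constant-body-ratio (CBR) benchmarks: target dimension is
# approximated as ratio x easily measured reference dimension.  The ratio
# values below are invented for illustration.
CBR = {
    ("forearm_length", "stature"): 0.146,
    ("shoulder_width", "stature"): 0.245,
}

def estimate(target, reference_name, reference_value):
    """Estimate a body dimension from a CBR benchmark and a reference measure."""
    return CBR[(target, reference_name)] * reference_value

forearm = estimate("forearm_length", "stature", 170.0)  # stature in cm
print(f"estimated forearm length: {forearm:.1f} cm")
```

Because the benchmarks are assumed independent of gender and age, one small set of easily measured dimensions drives all the estimation formulae.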

  20. Autonomous Sun-Direction Estimation Using Partially Underdetermined Coarse Sun Sensor Configurations

    NASA Astrophysics Data System (ADS)

    O'Keefe, Stephen A.

    In recent years there has been a significant increase in interest in smaller satellites as lower cost alternatives to traditional satellites, particularly with the rise in popularity of the CubeSat. Due to stringent mass, size, and often budget constraints, these small satellites rely on making the most of inexpensive hardware components and sensors, such as coarse sun sensors (CSS) and magnetometers. More expensive high-accuracy sun sensors often combine multiple measurements, and use specialized electronics, to deterministically solve for the direction of the Sun. Alternatively, cosine-type CSS output a voltage relative to the input light and are attractive due to their very low cost, simplicity to manufacture, small size, and minimal power consumption. This research investigates using coarse sun sensors for performing robust attitude estimation in order to point a spacecraft at the Sun after deployment from a launch vehicle, or following a system fault. As an alternative to using a large number of sensors, this thesis explores sun-direction estimation techniques with low computational costs that function well with underdetermined sets of CSS. Single-point estimators are coupled with simultaneous nonlinear control to achieve sun-pointing within a small percentage of a single orbit despite the partially underdetermined nature of the sensor suite. Leveraging an extensive analysis of the sensor models involved, sequential filtering techniques are shown to be capable of estimating the sun-direction to within a few degrees, with no a priori attitude information and using only CSS, despite the significant noise and biases present in the system. Detailed numerical simulations are used to compare and contrast the performance of the five different estimation techniques, with and without rate gyro measurements, their sensitivity to rate gyro accuracy, and their computation time. One of the key concerns with reducing the number of CSS is sensor degradation and failure. 
In this thesis, a Modified Rodrigues Parameter based CSS calibration filter suitable for autonomous on-board operation is developed. The sensitivity of this method's accuracy to the available Earth albedo data is evaluated and compared to the required computational effort. The calibration filter is expanded to perform sensor fault detection, and promising results are shown for reduced-resolution albedo models. All of the methods discussed provide alternative attitude determination and control system algorithms for small satellite missions looking to use inexpensive, small sensors due to size, power, or budget limitations.

  1. Cost-revenue analysis in the surgical treatment of the obstructed defecation syndrome.

    PubMed

    Schiano di Visconte, Michele; Piccin, Alessandra; Di Bella, Raimondo; Giomo, Priscilla; Pederiva, Vania; Cina, Livio Dal; Munegato, Gabriele

    2006-01-01

    The obstructed defecation syndrome is a frequent condition in the female population. Rectocele and rectal intussusception may cause symptoms of obstructed defecation. The aim of this study is to carry out an economic cost-revenue analysis comparing the rectocele and rectal intussusception repair technique using a double trans-anal circular stapler (Stapled Trans-Anal Rectal Resection - STARR) with other techniques used to repair the same defects. The analysis involved the systematic calculation of the costs incurred during hospitalisation. The revenue estimate was obtained according to the rate quantification of the Diagnosis Related Group (DRG) associated with each hospitalisation. Our analysis confirmed that the global expenditure for the STARR technique amounts to 3,579.09 Euro as against 5,401.15 Euro for rectocele abdominal repair and 3,469.32 Euro for perineal repair. The intussusception repair cost according to Delorme's procedure amounts to 5,877.41 Euro as against 3,579.09 Euro for the STARR technique. The revenue analysis revealed a substantial gain for the Health Authority as regards the treatment of rectocele and rectal intussusception for obstructed defecation syndrome. The highest revenue, 6,168.52 Euro, was obtained with intussusception repair with STARR, as compared to Delorme's procedure, which presented revenue amounting to 2,359.04 Euro. Lower revenues are recorded if the STARR technique is intended for rectocele repair; in this case the revenue amounts to 1,778.12 Euro as against 869.67 Euro and 1,887.89 Euro for abdominal and perineal repair, respectively.

  2. Carpal tunnel syndrome, the search for a cost-effective surgical intervention: a randomised controlled trial.

    PubMed Central

    Lorgelly, Paula K.; Dias, Joseph J.; Bradley, Mary J.; Burke, Frank D.

    2005-01-01

    OBJECTIVE: There is insufficient evidence regarding the clinical and cost-effectiveness of surgical interventions for carpal tunnel syndrome. This study evaluates the cost, effectiveness and cost-effectiveness of minimally invasive surgery compared with conventional open surgery. PATIENTS AND METHODS: 194 sufferers (208 hands) of carpal tunnel syndrome were randomly assigned to each treatment option. A self-administered questionnaire assessed the severity of patients' symptoms and functional status pre- and postoperatively. Treatment costs were estimated from resource use and hospital financial data. RESULTS: Minimally invasive carpal tunnel decompression is marginally more effective than open surgery in terms of functional status, but not significantly so. Little improvement in symptom severity was recorded for either intervention. Minimally invasive surgery was found to be significantly more costly than open surgery. The incremental cost effectiveness ratio for functional status was estimated to be 197 UK pounds, such that a one percentage point improvement in functioning costs 197 UK pounds when using the minimally invasive technique. CONCLUSIONS: Minimally invasive carpal tunnel decompression appears to be more effective but more costly. Initial analysis suggests that the additional expense for such a small improvement in function and no improvement in symptoms would not be regarded as value-for-money, such that minimally invasive carpal tunnel release is unlikely to be considered a cost-effective alternative to the traditional open surgery procedure. PMID:15720906

  3. Considerations in Duplex Investment.

    ERIC Educational Resources Information Center

    Wright, Arthur; Goen, Tom

    Problems of duplex investment are noted in the introduction to this booklet designed to provide a technique by which the investment decision can be approached, develop estimates of typical costs and returns under differing conditions, and encourage investors to analyze objectives and conditions before the decision to buy or build is made. A…

  4. Crop area estimation using high and medium resolution satellite imagery in areas with complex topography

    USGS Publications Warehouse

    Husak, G.J.; Marshall, M. T.; Michaelsen, J.; Pedreros, Diego; Funk, Christopher C.; Galu, G.

    2008-01-01

    Reliable estimates of cropped area (CA) in developing countries with chronic food shortages are essential for emergency relief and the design of appropriate market-based food security programs. Satellite interpretation of CA is an effective alternative to extensive and costly field surveys, which fail to represent the spatial heterogeneity at the country level. Bias-corrected, texture-based classifications show little deviation from actual crop inventories when estimates derived from aerial photographs or field measurements are used to remove systematic errors in medium-resolution estimates. In this paper, we demonstrate a hybrid high-medium resolution technique for Central Ethiopia that combines spatially limited unbiased estimates from IKONOS images with spatially extensive Landsat ETM+ interpretations, land-cover, and SRTM-based topography. Logistic regression is used to derive the probability of a location being crop. These individual points are then aggregated to produce regional estimates of CA. District-level analysis of Landsat-based estimates showed CA totals which supported the estimates of the Bureau of Agriculture and Rural Development. Continued work will evaluate the technique in other parts of Africa, while segmentation algorithms will be evaluated in order to automate classification of medium-resolution imagery for routine CA estimation in the future.

  5. Crop area estimation using high and medium resolution satellite imagery in areas with complex topography

    NASA Astrophysics Data System (ADS)

    Husak, G. J.; Marshall, M. T.; Michaelsen, J.; Pedreros, D.; Funk, C.; Galu, G.

    2008-07-01

    Reliable estimates of cropped area (CA) in developing countries with chronic food shortages are essential for emergency relief and the design of appropriate market-based food security programs. Satellite interpretation of CA is an effective alternative to extensive and costly field surveys, which fail to represent the spatial heterogeneity at the country level. Bias-corrected, texture-based classifications show little deviation from actual crop inventories when estimates derived from aerial photographs or field measurements are used to remove systematic errors in medium-resolution estimates. In this paper, we demonstrate a hybrid high-medium resolution technique for Central Ethiopia that combines spatially limited unbiased estimates from IKONOS images with spatially extensive Landsat ETM+ interpretations, land-cover, and SRTM-based topography. Logistic regression is used to derive the probability of a location being crop. These individual points are then aggregated to produce regional estimates of CA. District-level analysis of Landsat-based estimates showed CA totals which supported the estimates of the Bureau of Agriculture and Rural Development. Continued work will evaluate the technique in other parts of Africa, while segmentation algorithms will be evaluated in order to automate classification of medium-resolution imagery for routine CA estimation in the future.

  6. Allocation of Future Federal Airport and Airway Costs.

    DTIC Science & Technology

    1986-12-01

    attributable to users are allocated among them based upon Ramsey pricing, which minimizes the distortion in aviation markets resulting from the allocation of... the years following 1992, the producer price index projections made by Wharton Econometric Forecasting Associates were employed. This latter set... and on econometric cost estimation techniques. These are Volumes 3 and 5, respectively.

  7. An investigation into the cost, coverage and activities of Helicopter Emergency Medical Services in the state of New South Wales, Australia.

    PubMed

    Taylor, Colman B; Stevenson, Mark; Jan, Stephen; Liu, Bette; Tall, Gary; Middleton, Paul M; Fitzharris, Michael; Myburgh, John

    2011-10-01

    Helicopter Emergency Medical Services (HEMS) have been incorporated into modern health systems for their speed and coverage. In the state of New South Wales (NSW), nine HEMS operate from various locations around the state and currently there is no clear picture of their resource implications. The aim of this study was to assess the cost of HEMS in NSW and investigate the factors linked with the variation in the costs, coverage and activities of HEMS. We undertook a survey of HEMS costs, structures and operations in NSW for the 2008/2009 financial year. Costs were estimated from annual reports and contractual agreements. Data related to the structure and operation of services was obtained by face-to-face interviews, from operational data extracted from individual HEMS, from the NSW Ambulance Computer Aided Despatch system and from the Aeromedical Operations Centre database. In order to estimate population coverage for each HEMS, we used GIS mapping techniques with Australian Bureau of Statistics census information. Across HEMS, cost per mission estimates ranged between $9300 and $19,000 and cost per engine hour estimates ranged between $5343 and $15,743. Regarding structural aspects, six HEMS were run by charities or not-for-profit companies (with partial government funding) and three HEMS were run (and fully funded) by the state government through NSW Ambulance. Two HEMS operated as 'hub' services in conjunction with three associated 'satellite' services and in contrast, four services operated independently. Variation also existed between the HEMS in the type of helicopter used, the clinical staffing and the hours of operation. The majority of services undertook both primary scene responses and secondary inter-facility transfers, although the proportion of each type of transport contributing to total operations varied across the services. This investigation highlighted the cost of HEMS operations in NSW which in total equated to over $50 million per annum. 
Across services, we found large variation in the cost estimates which was underscored by variation in the structure and operations of HEMS.

  8. A New Approach to Sap Flow Measurement Using 3D Printed Gauges and Open-source Electronics

    NASA Astrophysics Data System (ADS)

    Ham, J. M.; Miner, G. L.; Kluitenberg, G. J.

    2015-12-01

    A new type of sap flow gauge was developed to measure transpiration from herbaceous plants using a modified heat pulse technique. Gauges were fabricated using 3D-printing technology and low-cost electronics to keep the materials cost under $20 (U.S.) per sensor. Each gauge consisted of small-diameter needle probes fastened to a 3D-printed frame. One needle contained a resistance heater to provide a 6 to 8 second heat pulse while the other probes measured the resultant temperature increase at two distances from the heat source. The data acquisition system for the gauges was built from a low-cost Arduino microcontroller. The system read the gauges every 10 minutes and stored the results on a SD card. Different numerical techniques were evaluated for estimating sap velocity from the heat pulse data, including analytical solutions and parameter estimation approaches. Prototype gauges were tested in the greenhouse on containerized corn and sunflower. Sap velocities measured by the gauges were compared to independent gravimetric measurements of whole-plant transpiration. Results showed the system could measure daily transpiration to within 3% of the gravimetric measurements. Excellent agreement was observed when two gauges were attached to the same stem. Accuracy was not affected by rapidly changing transpiration rates observed under partly cloudy conditions. The gauge-based estimates of stem thermal properties suggested the system may also detect the onset of water stress. A field study showed the gauges could run for 1 to 2 weeks on a small battery pack. Sap flow measurements on multiple corn stems were scaled up by population to estimate field-scale transpiration. During full canopy cover, excellent agreement was observed between the scaled-up sap flow measurements and reference crop evapotranspiration calculated from weather data. Data also showed promise as a way to estimate real-time canopy resistance required for model verification and development.
Given the low-cost, low-power, and open-source characteristics of the system, the technology is well suited for applications requiring large numbers of gauges (spatial scaling or treatment comparisons). While early work was done with agricultural crops, the approach is also well suited for other species such as riverine shrubs.
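Among the analytical solutions mentioned for heat-pulse data, one classical option is the heat ratio method, which infers heat-pulse velocity from the ratio of temperature rises at two probes equidistant from the heater. The sketch below is illustrative only (this may not be the solution used in this work), and the thermal diffusivity and probe spacing are assumed values, not the gauge's actual calibration:

```python
import math

def heat_pulse_velocity(dT_down, dT_up, k=0.0025, x=0.6):
    """Heat ratio method: Vh = (k / x) * ln(dT_down / dT_up), in cm/s.

    dT_down, dT_up -- temperature rises at probes downstream and
                      upstream of the heater (deg C)
    k -- thermal diffusivity of the stem (cm^2/s), assumed here
    x -- heater-to-probe spacing (cm), assumed here
    """
    return (k / x) * math.log(dT_down / dT_up)

# Convert to cm/h for one pair of measured temperature rises.
vh_cm_per_h = heat_pulse_velocity(0.8, 0.5) * 3600.0  # roughly 7 cm/h
```

The log-ratio form maps equal temperature rises (zero flow) to zero velocity, which is one reason this family of methods handles low and reverse flows gracefully.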

  9. Photographic and photometric enhancement of Lunar Orbiter products, projects A, B and C

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A detailed discussion is presented of the framelet joining, photometric data improvement, and statistical error analysis. The Lunar Orbiter film handling system, readout system, and the digitization are described, along with the technique of joining adjacent framelets by using a digital computer. Time and cost estimates are given. The problems and techniques involved in improving the digitized data are discussed. It was found that spectacular improvements are possible. Program documentation is included.

  10. Analyzing costs of space debris mitigation methods

    NASA Astrophysics Data System (ADS)

    Wiedemann, C.; Krag, H.; Bendisch, J.; Sdunnus, H.

    The steadily increasing number of space objects poses a considerable hazard to all kinds of spacecraft. To reduce the risks to future space missions, different debris mitigation measures and spacecraft protection techniques have been investigated during the last years. However, their economic efficiency has not yet been considered in this context, and the economic background is not always clear to satellite operators and the space industry. Current studies aim to evaluate the mission costs due to space debris in a business-as-usual (no mitigation) scenario compared to the mission costs when debris mitigation is considered. The aim is an estimation of the time until the investment in debris mitigation will lead to an effective reduction of mission costs. This paper presents the results of investigations on the key problems of cost estimation for spacecraft and the influence of debris mitigation and shielding on cost. The shielding of a satellite can be an effective method to protect the spacecraft against debris impact. Mitigation strategies such as the reduction of orbital lifetime and the de- or re-orbiting of non-operational satellites are methods to control the space debris environment; these methods result in an increase in costs. In a first step, the overall costs of different types of unmanned satellites are analyzed. The key problem is that it is not possible to provide a simple cost model that can be applied to all types of satellites: unmanned spacecraft differ greatly in mission, complexity of design, payload, and operational lifetime. It is therefore important to classify the relevant cost parameters and investigate their influence on the respective mission. The theory of empirical cost estimation and existing cost models are discussed. A selected cost model is simplified and generalized for application to all operational satellites. In a next step, the influence of space debris on cost is treated, if the implementation of mitigation strategies is considered.
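As a sketch of the kind of simplified empirical cost model discussed here, a single-variable cost estimating relationship (CER) of the form cost = a * mass^b can be fitted by ordinary least squares in log space; the satellite mass/cost pairs below are purely hypothetical:

```python
import math

# Hypothetical (mass in kg, cost in M$) pairs for unmanned satellites.
data = [(150, 45.0), (400, 90.0), (900, 160.0), (1500, 230.0)]

# Fit log(cost) = log(a) + b * log(mass) by ordinary least squares.
xs = [math.log(m) for m, _ in data]
ys = [math.log(c) for _, c in data]
n = len(data)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
a = math.exp(ybar - b * xbar)

def cer_cost(mass_kg):
    """Estimated cost (M$) from the fitted power-law CER."""
    return a * mass_kg ** b
```

An exponent b below 1 reflects the economy of scale typical of such relationships: doubling the mass less than doubles the estimated cost.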

  11. Two phase sampling for wheat acreage estimation. [large area crop inventory experiment

    NASA Technical Reports Server (NTRS)

    Thomas, R. W.; Hay, C. M.

    1977-01-01

    A two-phase LANDSAT-based sample allocation and wheat proportion estimation method was developed. This technique employs manual, LANDSAT full-frame-based wheat or cultivated-land proportion estimates from a large number of segments comprising a first sample phase to optimally allocate a smaller phase-two sample of computer- or manually-processed segments. Application to the Kansas Southwest CRD for 1974 produced a wheat acreage estimate for that CRD within 2.42 percent of the USDA SRS-based estimate, using a lower CRD inventory budget than for a simulated reference LACIE system. Cost or precision improvements of a factor of 2 or greater relative to the reference system were obtained.
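The two-phase logic — many inexpensive phase-one proportion estimates, a small accurately measured phase-two subsample — can be sketched with the standard double-sampling regression estimator; all numbers below are simulated, not LACIE data:

```python
import random

random.seed(2)

# Phase 1: cheap, manual full-frame wheat-proportion estimates (x)
# for a large number of segments (values hypothetical).
phase1_x = [min(max(random.gauss(0.35, 0.10), 0.0), 1.0) for _ in range(400)]

# Phase 2: a small subsample is also measured accurately (y),
# simulated here as x plus a bias and noise.
sub = random.sample(range(len(phase1_x)), 40)
pairs = [(phase1_x[i], phase1_x[i] * 0.9 + 0.02 + random.gauss(0, 0.02))
         for i in sub]

# Double-sampling regression estimator of the mean wheat proportion.
n = len(pairs)
xbar = sum(x for x, _ in pairs) / n
ybar = sum(y for _, y in pairs) / n
beta = (sum((x - xbar) * (y - ybar) for x, y in pairs)
        / sum((x - xbar) ** 2 for x, _ in pairs))
xbar1 = sum(phase1_x) / len(phase1_x)      # phase-1 mean (cheap, precise)
wheat_mean = ybar + beta * (xbar1 - xbar)  # regression-adjusted estimate
```

The adjustment term beta * (xbar1 - xbar) is what lets the large cheap sample tighten the small accurate one, which is the source of the cost/precision gains the abstract reports.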

  12. Direct costs associated with the management of progressive early onset scoliosis: estimations based on gold standard technique or with magnetically controlled growing rods.

    PubMed

    Charroin, C; Abelin-Genevois, K; Cunin, V; Berthiller, J; Constant, H; Kohler, R; Aulagner, G; Serrier, H; Armoiry, X

    2014-09-01

    The main disadvantage of the surgical management of early onset scoliosis (EOS) using conventional growing rods is the need for iterative surgical procedures during childhood. The emergence of an innovative device using distraction-based magnetically controlled growing rods (MCGR) provides the opportunity to avoid such surgeries and therefore to improve the patient's quality of life. Despite the high cost of MCGR, and considering its potential impact in reducing hospital stays, the use of MCGR could reduce medical resource consumption over the long term in comparison to traditional growing rods (TGR). A cost-simulation model was constructed to assess the incremental cost between the two strategies. The cost of each strategy was estimated from the probability of medical resource consumption determined from a literature search as well as from data on EOS patients treated in our centre. Some medical expenses were also estimated from expert interviews. The time horizon chosen was 4 years as from first surgical implantation. Costs were calculated from the perspective of the French sickness fund (using rates from year 2013) and were discounted at an annual rate of 4%. Sensitivity analyses were conducted to test the strength of the model against various parameters. With a time horizon of 4 years, the estimated direct costs of the TGR and MCGR strategies were 49,067 € and 42,752 €, respectively, leading to an incremental cost of 6,315 € in favour of the MCGR strategy. In the first case, costs were mainly represented by hospital-stay expenses (83.9%), whereas in the other the cost of the MCGR contributed 59.5% of the total amount. In the univariate sensitivity analysis, the tariffs of hospital stays, the tariff of the MCGR, and the frequency of distraction surgeries were the parameters with the most important impact on the incremental cost. MCGR is a recent and promising innovation in the management of severe EOS.
Besides improving quality of life, its higher device cost is likely to be offset by lower costs of hospital stays. Level IV, economic and decision analyses, retrospective study.

  13. A participatory approach for selecting cost-effective measures in the WFD context: the Mar Menor (SE Spain).

    PubMed

    Perni, Angel; Martínez-Paz, José M

    2013-08-01

    Achieving a good ecological status in water bodies by 2015 is one of the objectives established in the European Water Framework Directive. Cost-effectiveness analysis (CEA) has been applied for selecting measures to achieve this goal, but this appraisal technique requires technical and economic information that is not always available. In addition, there are often local insights that can only be identified by engaging multiple stakeholders in a participatory process. This paper proposes to combine CEA with the active involvement of stakeholders for selecting cost-effective measures. This approach has been applied to the case study of one of the main coastal lagoons in the European Mediterranean Sea, the Mar Menor, which presents eutrophication problems. Firstly, face-to-face interviews were conducted to estimate the relative effectiveness and relative impacts of a set of measures by means of the pairwise comparison technique. Secondly, relative effectiveness was used to estimate cost-effectiveness ratios. The most cost-effective measures were the restoration of watercourses that drain into the lagoon and the treatment of polluted groundwater. Although in general the stakeholders approved the former, most of them stated that the latter involved some uncertainties, which must be addressed before implementing it. Stakeholders pointed out that the programme of measures (PoM) would have a positive impact not only on water quality, but also on fishing, agriculture and tourism in the area. This approach can be useful to evaluate other programmes, plans or projects related to other European environmental strategies.
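The pairwise comparison step can be sketched with the geometric-mean prioritisation commonly applied to Saaty-style judgment matrices, with the resulting relative effectiveness feeding cost-effectiveness ratios; the judgments and costs below are invented for illustration, not the study's data:

```python
# Hypothetical 3x3 pairwise comparison matrix for the relative
# effectiveness of three candidate measures (Saaty-style judgments:
# A[i][j] = how much more effective measure i is than measure j).
A = [[1.0,     3.0,     5.0],
     [1 / 3.0, 1.0,     2.0],
     [1 / 5.0, 1 / 2.0, 1.0]]

# Geometric-mean method: row geometric means, normalised to sum to 1.
gm = [(r[0] * r[1] * r[2]) ** (1.0 / 3.0) for r in A]
total = sum(gm)
effectiveness = [g / total for g in gm]

# Cost-effectiveness ratio = cost / relative effectiveness
# (annualised costs hypothetical, in thousand EUR).
costs = [900.0, 400.0, 650.0]
ce_ratio = [c / e for c, e in zip(costs, effectiveness)]
best = min(range(3), key=lambda i: ce_ratio[i])  # most cost-effective measure
```

A lower ratio means more effectiveness bought per euro, so ranking by ce_ratio gives the selection order a CEA would report.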

  14. A cost-effective line-based light-balancing technique using adaptive processing.

    PubMed

    Hsia, Shih-Chang; Chen, Ming-Huei; Chen, Yu-Min

    2006-09-01

    The camera imaging system has been widely used; however, the displayed image often exhibits an uneven light distribution. This paper presents novel light-balancing techniques to compensate for uneven illumination based on adaptive signal processing. For text image processing, we first estimate the background level and then process each pixel with a nonuniform gain. This algorithm can balance the light distribution while keeping a high contrast in the image. For graph image processing, adaptive section control using a piecewise nonlinear gain is proposed to equalize the histogram. Simulations show that the performance of the light balancing is better than that of other methods. Moreover, we employ line-based processing to efficiently reduce the memory requirement and the computational cost, making the technique applicable in real-time systems.
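A much-simplified sketch of the text-image idea: estimate a slowly varying background along each scan line, then apply a nonuniform per-pixel gain that pushes the background toward a common target level. The window size and target level here are assumed, not the paper's parameters:

```python
def balance_line(line, window=5, target=200.0):
    """Return a light-balanced copy of one scan line (0-255 values)."""
    n = len(line)
    out = []
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        background = sum(line[lo:hi]) / (hi - lo)  # local background level
        gain = target / max(background, 1.0)       # nonuniform per-pixel gain
        out.append(min(255.0, line[i] * gain))
    return out

# Dark text (low values) on an unevenly lit background (high values).
balanced = balance_line([180, 175, 60, 170, 120, 50, 110])
```

Processing one line at a time, as here, only ever needs a single line in memory, which is the essence of the paper's line-based argument for real-time use.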

  15. Fatigue life estimation of a 1D aluminum beam under mode-I loading using the electromechanical impedance technique

    NASA Astrophysics Data System (ADS)

    Lim, Yee Yan; Kiong Soh, Chee

    2011-12-01

    Structures in service are often subjected to fatigue loads. Cracks would develop and lead to failure if left unnoticed after a large number of cyclic loadings. Monitoring the process of fatigue crack propagation as well as estimating the remaining useful life of a structure is thus essential to prevent catastrophe while minimizing earlier-than-required replacement. The advent of smart materials such as piezo-impedance transducers (lead zirconate titanate, PZT) has ushered in a new era of structural health monitoring (SHM) based on non-destructive evaluation (NDE). This paper presents a series of investigative studies to evaluate the feasibility of fatigue crack monitoring and estimation of remaining useful life using the electromechanical impedance (EMI) technique employing a PZT transducer. Experimental tests were conducted to study the ability of the EMI technique in monitoring fatigue crack in 1D lab-sized aluminum beams. The experimental results prove that the EMI technique is very sensitive to fatigue crack propagation. A proof-of-concept semi-analytical damage model for fatigue life estimation has been developed by incorporating the linear elastic fracture mechanics (LEFM) theory into the finite element (FE) model. The prediction of the model matches closely with the experiment, suggesting the possibility of replacing costly experiments in future.
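A generic LEFM-style remaining-life estimate of the kind the semi-analytical model builds on can be sketched by integrating the Paris crack-growth law; the material constants, geometry factor, and stress range below are assumed, not values from the study:

```python
import math

# Paris law: da/dN = C * (dK)^m, with dK = Y * dS * sqrt(pi * a)
# for an edge crack under mode-I loading. All values assumed.
C, m = 1.5e-11, 3.0   # Paris constants (illustrative, SI-like units)
Y = 1.12              # geometry factor for an edge crack
dS = 60.0             # applied stress range (MPa)

def cycles_to_grow(a0, af, steps=20000):
    """Numerically integrate N = integral of da / (C * dK^m) from a0 to af."""
    da = (af - a0) / steps
    N, a = 0.0, a0
    for _ in range(steps):
        dK = Y * dS * math.sqrt(math.pi * a)  # stress intensity range
        N += da / (C * dK ** m)               # cycles for this increment
        a += da
    return N

remaining_cycles = cycles_to_grow(0.001, 0.02)  # grow from 1 mm to 20 mm
```

Because dK grows with crack length, most of the predicted life is spent while the crack is still short, which is why early detection (as with the EMI technique) matters.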

  16. IVF cycle cost estimation using Activity Based Costing and Monte Carlo simulation.

    PubMed

    Cassettari, Lucia; Mosca, Marco; Mosca, Roberto; Rolando, Fabio; Costa, Mauro; Pisaturo, Valerio

    2016-03-01

    The authors present a new methodological approach, in a stochastic regime, to determine the actual costs of a healthcare process. The paper specifically shows the application of the methodology to the determination of the cost of an Assisted Reproductive Technology (ART) treatment in Italy. The motivation for this research is that a deterministic approach is inadequate for an accurate estimate of the cost of this particular treatment: the durations of the different activities involved are not fixed and are described by means of frequency distributions. Hence the need to determine, in addition to the mean value of the cost, the interval within which it varies with a known confidence level. Consequently, the cost obtained for each type of cycle investigated (in vitro fertilization and embryo transfer, with or without intracytoplasmic sperm injection) shows tolerance intervals around the mean value that are sufficiently narrow to make the data statistically robust and therefore usable as a reference for benchmarking with other countries. Methodologically, the approach is rigorous: Activity Based Costing was used to determine the cost of the individual activities of the process, and Monte Carlo simulation, with control of the experimental error, was used to construct the tolerance intervals on the final result.
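The coupling of Activity Based Costing with Monte Carlo simulation can be sketched as follows; the activities, hourly rates, and duration distributions are invented for illustration and are not the study's data:

```python
import random
import statistics

random.seed(7)

# Activity Based Costing inputs: each activity has an hourly rate and a
# stochastic duration (mean, sd in hours). All values hypothetical.
activities = [
    ("consultation",     80.0, 1.0, 0.2),
    ("lab procedures",  120.0, 2.5, 0.5),
    ("embryo transfer", 150.0, 1.5, 0.3),
]

def one_cycle_cost():
    """Draw one Monte Carlo realisation of the total cycle cost (EUR)."""
    total = 0.0
    for _, rate, mu, sd in activities:
        duration = max(0.0, random.gauss(mu, sd))  # no negative durations
        total += rate * duration
    return total

samples = [one_cycle_cost() for _ in range(10000)]
mean_cost = statistics.mean(samples)
sd_cost = statistics.stdev(samples)
# Approximate 95% interval on the mean estimate (normal approximation).
ci95 = (mean_cost - 1.96 * sd_cost / len(samples) ** 0.5,
        mean_cost + 1.96 * sd_cost / len(samples) ** 0.5)
```

Reporting the interval alongside the mean is exactly the stochastic-regime output the abstract argues for: a cost figure plus the range within which it varies at a known confidence level.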

  17. Cost-utility of cognitive behavioral therapy for low back pain from the commercial payer perspective.

    PubMed

    Norton, Giulia; McDonough, Christine M; Cabral, Howard; Shwartz, Michael; Burgess, James F

    2015-05-15

    Markov cost-utility model. To evaluate the cost-utility of cognitive behavioral therapy (CBT) for the treatment of persistent nonspecific low back pain (LBP) from the perspective of US commercial payers. CBT is widely deemed clinically effective for LBP treatment, and the evidence is suggestive of cost-effectiveness. We constructed and validated a Markov intention-to-treat model to estimate the cost-utility of CBT, with 1-year and 10-year time horizons. We applied likelihood of improvement and utilities from a randomized controlled trial assessing CBT to treat LBP. The trial randomized subjects to treatment, but subjects freely sought health care services. We derived the cost of equivalent rates and types of services from US commercial claims for LBP for a similar population. For the 10-year estimates, we derived recurrence rates from the literature. The base case included medical and pharmaceutical services and assumed gradual loss of skill in applying CBT techniques. Sensitivity analyses assessed the distribution of service utilization, utility values, and rate of LBP recurrence. We compared health plan designs. Results are based on 5000 iterations of each model and expressed as an incremental cost per quality-adjusted life-year. The incremental cost-utility of CBT was $7197 per quality-adjusted life-year in the first year and $5855 per quality-adjusted life-year over 10 years. The results are robust across numerous sensitivity analyses. No change of parameter estimate resulted in a difference of more than 7% from the base case for either time horizon. Including chiropractic and/or acupuncture care did not substantively affect cost-effectiveness. The model with medical but no pharmaceutical costs was more cost-effective ($5238 for 1 yr and $3849 for 10 yr). CBT is a cost-effective approach to manage chronic LBP among commercial health plan members. Cost-effectiveness is demonstrated for multiple plan designs. Level of Evidence: 2.
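A toy two-state Markov cohort shows how such a model yields an incremental cost per QALY; every probability, cost, and utility below is assumed rather than taken from the study:

```python
# Two-state cohort (improved / not improved) comparing a CBT-like
# strategy with usual care. All parameters are illustrative only.
def run_cohort(p_improve, annual_cost, years=10, disc=0.03,
               u_improved=0.80, u_not=0.60):
    """Return (discounted cost, discounted QALYs) for one strategy."""
    improved, cost, qaly = 0.0, 0.0, 0.0
    for t in range(years):
        d = 1.0 / (1.0 + disc) ** t                   # discount factor
        improved += (1.0 - improved) * p_improve      # state transition
        cost += annual_cost * d
        qaly += (improved * u_improved + (1 - improved) * u_not) * d
    return cost, qaly

c_cbt, q_cbt = run_cohort(p_improve=0.35, annual_cost=1200.0)
c_uc,  q_uc  = run_cohort(p_improve=0.20, annual_cost=900.0)
icer = (c_cbt - c_uc) / (q_cbt - q_uc)  # incremental cost per QALY gained
```

The ICER is the quantity compared against a willingness-to-pay threshold; lowering the comparator's cost or raising its effectiveness moves the ratio accordingly, which is what the paper's sensitivity analyses probe.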

  18. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries.

    PubMed

    Rivera-Rodriguez, Claudia L; Resch, Stephen; Haneuse, Sebastien

    2018-01-01

    In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. 
This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty.
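The resampling approach described can be sketched with a nonparametric bootstrap of a scaled-up total-cost estimator; the facility count and cost distribution are hypothetical:

```python
import random
import statistics

random.seed(11)

# A per-facility cost is measured on a subsample and scaled to the
# whole program (facility count and costs are hypothetical).
N_FACILITIES = 120
sample_costs = [random.gauss(5000, 1500) for _ in range(30)]  # subsample

def total_cost(costs):
    """Estimated program-wide total: facility count times sample mean."""
    return N_FACILITIES * statistics.mean(costs)

point = total_cost(sample_costs)

# Bootstrap: resample facilities with replacement and re-estimate.
boot = []
for _ in range(2000):
    resample = [random.choice(sample_costs) for _ in sample_costs]
    boot.append(total_cost(resample))

se = statistics.stdev(boot)             # bootstrap standard error
boot.sort()
ci95 = (boot[49], boot[1949])           # 2.5th and 97.5th percentiles
```

The standard error and percentile interval are exactly the measures of statistical uncertainty the abstract argues should accompany any reported total cost; calibration to known program totals would then shrink them further.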

  19. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries

    PubMed Central

    Resch, Stephen

    2018-01-01

    Objectives: In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. Methods: We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. Results: A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. 
This method can be used not only when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Conclusion: Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty. PMID:29636964
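
The resampling approach described above can be sketched for a simple random sample of facilities. The expansion estimator and percentile bootstrap below are a generic illustration with invented facility counts and costs, not the Honduras study's stratified, calibrated design:

```python
import random
import statistics

def bootstrap_total_cost(sample_costs, n_facilities, n_boot=2000, seed=42):
    """Expansion estimate of total program cost from a simple random
    sample of facility costs, with a bootstrap SE and percentile 95% CI.
    Assumes equal-probability sampling; a real costing survey would be
    stratified and weighted."""
    rng = random.Random(seed)
    n = len(sample_costs)
    point = n_facilities * statistics.mean(sample_costs)
    boots = sorted(
        n_facilities * statistics.mean(rng.choice(sample_costs) for _ in range(n))
        for _ in range(n_boot)
    )
    se = statistics.stdev(boots)
    ci = (boots[int(0.025 * n_boot)], boots[int(0.975 * n_boot)])
    return point, se, ci

# Invented annual delivery costs (USD) at 12 of 200 program facilities
costs = [4200, 3900, 5100, 4800, 4400, 6100,
         3700, 5300, 4000, 4600, 5800, 4900]
total, se, (lo, hi) = bootstrap_total_cost(costs, n_facilities=200)
```

Calibration would then adjust the facility weights so that known program totals (e.g., doses administered) are reproduced exactly, shrinking the interval further.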

  20. Costs of examinations performed in a hospital laboratory in Chile.

    PubMed

    Andrade, Germán Lobos; Palma, Carolina Salas

    2018-01-01

To determine the total average costs related to laboratory examinations performed in a hospital laboratory in Chile. Retrospective study with data from July 2014 to June 2015. Ninety-two examinations classified into ten groups were selected according to the analysis methodology. The costs were estimated as the sum of direct and indirect laboratory costs and indirect institutional factors. The average values obtained for the costs according to examination group (in USD) were: 1.79 (clinical chemistry), 10.21 (immunoassay techniques), 13.27 (coagulation), 26.06 (high-performance liquid chromatography), 21.2 (immunological), 3.85 (gases and electrolytes), 156.48 (cytogenetic), 1.38 (urine), 4.02 (automated hematological), 4.93 (manual hematological). The value, or service fee, returned to public institutions that perform laboratory services does not adequately reflect the true total average production costs of examinations.

  1. Continued investigation of solid propulsion economics. Task 1B: Large solid rocket motor case fabrication methods - Supplement process complexity factor cost technique

    NASA Technical Reports Server (NTRS)

    Baird, J.

    1967-01-01

This supplement to Task 1B, Large Solid Rocket Motor Case Fabrication Methods, supplies additional supporting cost data and discusses in detail the methodology that was applied to the task. For the case elements studied, the cost was found to be directly proportional to the Process Complexity Factor (PCF). The PCF was obtained for each element by identifying unit processes that are common to the elements and their alternative manufacturing routes, by assigning a weight to each unit process, and by summing the weighted counts. In three instances of actual manufacture, the actual cost per pound equaled the cost estimate based on PCF per pound, but this supplement recognizes that the methodology is of limited, rather than general, application.
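
The weighted-count construction of the PCF can be sketched as follows. The unit processes, weights, and counts here are hypothetical, since the report's actual weights are not reproduced in this abstract:

```python
# Hypothetical unit processes and weights; the actual Task 1B weights
# are not given in the abstract.
UNIT_PROCESS_WEIGHTS = {
    "forming": 3.0, "welding": 5.0, "heat_treat": 4.0,
    "machining": 2.0, "inspection": 1.0,
}

def process_complexity_factor(counts):
    """PCF = sum over unit processes of weight x count."""
    return sum(UNIT_PROCESS_WEIGHTS[p] * n for p, n in counts.items())

def estimated_cost(counts, weight_lb, cost_per_pcf_per_lb):
    """Cost model from the study: cost directly proportional to PCF,
    scaled per pound of case weight."""
    return process_complexity_factor(counts) * cost_per_pcf_per_lb * weight_lb
```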

  2. Age and sex pattern of cardiovascular mortality, hospitalisation and associated cost in India.

    PubMed

    Srivastava, Akanksha; Mohanty, Sanjay K

    2013-01-01

Though cardiovascular diseases are the leading cause of mortality in India, little is known about the human and economic loss attributable to them. The aim of this paper is to account for the age and sex pattern of mortality, hospitalisation and the cost of hospitalisation for cardiovascular diseases in India. Data for the present study have been drawn from multiple sources: the 52nd and 60th rounds of the National Sample Survey, the Special Survey of Death, 2001-03, and the Sample Registration System 2004-2010. Under the changing demographics and constant assumptions of mortality, hospitalisation and cost of hospitalisation, we have estimated the deaths, hospitalisations and cost of hospitalisation for cardiovascular diseases in India during 2004 to 2021. Descriptive analyses and multivariate techniques were used to understand the socio-economic differentials in the cost of hospitalisation for cardiovascular diseases in India. In India, cardiovascular diseases accounted for an estimated 1.4 million deaths in 2004, and this figure is likely to reach 2.1 million in 2021. An estimated 6.7 million people were hospitalised for cardiovascular diseases in 2004, projected to reach 10.9 million by 2021. Unlike mortality, the majority of hospitalisations due to cardiovascular diseases will be in the prime working age group (25-59). The estimated cost of hospitalisation for cardiovascular diseases was 94 billion rupees in 2004 and is expected to be 152 billion rupees by 2021, at 2004 prices. The cost of hospitalisation for cardiovascular diseases was significantly higher in private health centres, high-fertility states and among high socio-economic groups. Cardiovascular mortality and hospitalisation will be largely concentrated in the prime working age group, and the cost of hospitalisation is expected to increase substantially in coming years.
This calls for mobilising resources, increasing access to health insurance and devising strategies for the prevention, control and treatment of cardiovascular diseases in India.

  3. Multistage variable probability forest volume inventory. [the Defiance Unit of the Navajo Nation

    NASA Technical Reports Server (NTRS)

    Anderson, J. E. (Principal Investigator)

    1979-01-01

An inventory scheme based on the use of computer processed LANDSAT MSS data was developed. Output from the inventory scheme provides an estimate of the standing net saw timber volume of a major timber species on a selected forested area of the Navajo Nation. Such estimates are based on the values of parameters currently used for scaled sawlog conversion to mill output. The multistage variable probability sampling appears capable of producing estimates which compare favorably with those produced using conventional techniques. In addition, the reduction in time, manpower, and overall costs lends the technique to numerous applications.

  4. Estimation of Spatiotemporal Sensitivity Using Band-limited Signals with No Additional Acquisitions for k-t Parallel Imaging.

    PubMed

    Takeshima, Hidenori; Saitoh, Kanako; Nitta, Shuhei; Shiodera, Taichiro; Takeguchi, Tomoyuki; Bannae, Shuhei; Kuhara, Shigehide

    2018-03-13

Dynamic MR techniques, such as cardiac cine imaging, benefit from shorter acquisition times. The goal of the present study was to develop a method that achieves short acquisition times, while maintaining a cost-effective reconstruction, for dynamic MRI. k-t sensitivity encoding (SENSE) was identified as the base method to be enhanced to meet these two requirements. The proposed method achieves a reduction in acquisition time by estimating the spatiotemporal (x-f) sensitivity without requiring the acquisition of the alias-free signals typical of the k-t SENSE technique. The cost-effective reconstruction, in turn, is achieved by a computationally efficient estimation of the x-f sensitivity from the band-limited signals of the aliased inputs. Such band-limited signals are suitable for sensitivity estimation because the strongly aliased signals have been removed. For the same nominal reduction factor of 4, the net reduction factor of 4 achieved by the proposed method was significantly higher than the factor of 2.29 achieved by k-t SENSE. The processing time is reduced from 4.1 s for k-t SENSE to 1.7 s for the proposed method. The image quality obtained using the proposed method proved to be superior (mean squared error [MSE] ± standard deviation [SD] = 6.85 ± 2.73) compared to the k-t SENSE case (MSE ± SD = 12.73 ± 3.60) for the vertical long-axis (VLA) view, as well as other views. In the present study, k-t SENSE was identified as a suitable base method to be improved to achieve both short acquisition times and a cost-effective reconstruction. To enhance these characteristics of the base method, a novel implementation is proposed, estimating the x-f sensitivity without the need for an explicit scan of the reference signals. Experimental results showed that the acquisition and computational times and the image quality of the proposed method were improved compared to the standard k-t SENSE method.

  5. Operations and support cost modeling using Markov chains

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1989-01-01

Systems for future missions will be selected with life cycle cost (LCC) as a primary evaluation criterion. This reflects the current realization that only systems which are considered affordable will be built in the future, due to national budget constraints. Such an environment calls for innovative cost modeling techniques which address all of the phases a space system goes through during its life cycle, namely: design and development, fabrication, operations and support, and retirement. A significant portion of the LCC for reusable systems is generated during the operations and support (OS) phase. Typically, OS costs can account for 60 to 80 percent of the total LCC. Clearly, OS costs are wholly determined, or at least strongly influenced, by decisions made during the design and development phases of the project. As a result, OS costs need to be considered and estimated early in the conceptual phase. To be effective, an OS cost estimating model needs to account for actual instead of ideal processes by associating cost elements with probabilities. One approach that may be suitable for OS cost modeling is the use of the Markov chain process. Markov chains are an important method of probabilistic analysis for operations research analysts, but they are rarely used for life cycle cost analysis. This research effort evaluates the use of Markov chains in LCC analysis by developing an OS cost model for a hypothetical reusable space transportation vehicle (HSTV) and suggests further uses of the Markov chain process as a design-aid tool.
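
As a sketch of the Markov chain idea, with invented states, transition probabilities, and per-visit costs rather than the HSTV model itself, the expected OS cost of one turnaround cycle can be read off the fundamental matrix of an absorbing chain:

```python
import numpy as np

# Hypothetical transient states for one turnaround cycle of a reusable
# vehicle: 0 = routine inspection, 1 = scheduled maintenance,
# 2 = unscheduled repair; the absorbing state is "ready for launch".
# Probabilities and per-visit costs ($K) are invented for illustration.
Q = np.array([[0.0, 0.6, 0.1],
              [0.1, 0.0, 0.2],
              [0.3, 0.2, 0.0]])   # transitions among transient states
cost_per_visit = np.array([50.0, 200.0, 800.0])

# Fundamental matrix N = (I - Q)^-1: N[i, j] is the expected number of
# visits to state j before absorption, starting in state i.
N = np.linalg.inv(np.eye(3) - Q)
expected_os_cost = float(N[0] @ cost_per_visit)  # start at inspection
```

Associating costs with probabilities in this way captures rework loops (e.g., inspection leading back to repair) that an ideal-process cost model would miss.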

  6. Using multilevel models for assessing the variability of multinational resource use and cost data.

    PubMed

    Grieve, Richard; Nixon, Richard; Thompson, Simon G; Normand, Charles

    2005-02-01

    Multinational economic evaluations often calculate a single measure of cost-effectiveness using cost data pooled across several countries. To assess the validity of pooling international cost data the reasons for cost variation across countries need to be assessed. Previously, ordinary least-squares (OLS) regression models have been used to identify factors associated with variability in resource use and total costs. However, multilevel models (MLMs), which accommodate the hierarchical structure of the data, may be more appropriate. This paper compares these different techniques using a multinational dataset comprising case-mix, resource use and cost data on 1300 stroke admissions from 13 centres in 11 European countries. OLS and MLMs were used to estimate the effect of patient and centre-level covariates on the total length of hospital stay (LOS) and total cost. MLMs with normal and gamma distributions for the data within centres were compared. The results from the OLS model showed that both patient and centre-level covariates were associated with LOS and total cost. The estimates from the MLMs showed that none of the centre-level characteristics were associated with LOS, and the level of spending on health was the centre-level variable most highly associated with total cost. We conclude that using OLS models for assessing international variation can lead to incorrect inferences, and that MLMs are more appropriate for assessing why resource use and costs vary across centres. Copyright (c) 2004 John Wiley & Sons, Ltd.
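
The hierarchical structure the paper exploits can be illustrated with a simple one-way variance-components calculation. This is a moment estimator for a balanced design, not the MLMs fitted in the paper: when the intraclass correlation is non-negligible, OLS inference that ignores the centre level understates uncertainty.

```python
import statistics

def variance_components(groups):
    """One-way random-effects moment estimators for a balanced design.
    groups: equal-sized lists of per-patient costs, one list per centre.
    Returns (within-centre variance, between-centre variance, ICC)."""
    n = len(groups[0])  # patients per centre
    msw = statistics.mean(statistics.variance(g) for g in groups)
    msb = n * statistics.variance([statistics.mean(g) for g in groups])
    sigma2_between = max((msb - msw) / n, 0.0)
    denom = sigma2_between + msw
    icc = sigma2_between / denom if denom else 0.0
    return msw, sigma2_between, icc
```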

  7. Running Neuroimaging Applications on Amazon Web Services: How, When, and at What Cost?

    PubMed Central

    Madhyastha, Tara M.; Koh, Natalie; Day, Trevor K. M.; Hernández-Fernández, Moises; Kelley, Austin; Peterson, Daniel J.; Rajan, Sabreena; Woelfer, Karl A.; Wolf, Jonathan; Grabowski, Thomas J.

    2017-01-01

The contribution of this paper is to identify and describe current best practices for using Amazon Web Services (AWS) to execute neuroimaging workflows “in the cloud.” Neuroimaging offers a vast set of techniques by which to interrogate the structure and function of the living brain. However, many of the scientists for whom neuroimaging is an extremely important tool have limited training in parallel computation. At the same time, the field is experiencing a surge in computational demands, driven by a combination of data-sharing efforts, improvements in scanner technology that allow acquisition of images with higher resolution, and the desire to use statistical techniques that stress processing requirements. Most neuroimaging workflows can be executed as independent parallel jobs and are therefore excellent candidates for running on AWS, but the overhead of learning to do so and determining whether it is worth the cost can be prohibitive. In this paper we describe how to identify neuroimaging workloads that are appropriate for running on AWS, how to benchmark execution time, and how to estimate cost of running on AWS. By benchmarking common neuroimaging applications, we show that cloud computing can be a viable alternative to on-premises hardware. We present guidelines that neuroimaging labs can use to provide a cluster-on-demand type of service that should be familiar to users, and scripts to estimate cost and create such a cluster. PMID:29163119

  8. A Highly Efficient Design Strategy for Regression with Outcome Pooling

    PubMed Central

    Mitchell, Emily M.; Lyles, Robert H.; Manatunga, Amita K.; Perkins, Neil J.; Schisterman, Enrique F.

    2014-01-01

    The potential for research involving biospecimens can be hindered by the prohibitive cost of performing laboratory assays on individual samples. To mitigate this cost, strategies such as randomly selecting a portion of specimens for analysis or randomly pooling specimens prior to performing laboratory assays may be employed. These techniques, while effective in reducing cost, are often accompanied by a considerable loss of statistical efficiency. We propose a novel pooling strategy based on the k-means clustering algorithm to reduce laboratory costs while maintaining a high level of statistical efficiency when predictor variables are measured on all subjects, but the outcome of interest is assessed in pools. We perform simulations motivated by the BioCycle study to compare this k-means pooling strategy with current pooling and selection techniques under simple and multiple linear regression models. While all of the methods considered produce unbiased estimates and confidence intervals with appropriate coverage, pooling under k-means clustering provides the most precise estimates, closely approximating results from the full data and losing minimal precision as the total number of pools decreases. The benefits of k-means clustering evident in the simulation study are then applied to an analysis of the BioCycle dataset. In conclusion, when the number of lab tests is limited by budget, pooling specimens based on k-means clustering prior to performing lab assays can be an effective way to save money with minimal information loss in a regression setting. PMID:25220822
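
A minimal sketch of the idea, using a single predictor and a hypothetical dataset rather than the BioCycle analysis code: cluster subjects on the fully observed predictor, treat the within-cluster outcome means as the only assayed values, and fit the regression on the pool means.

```python
import random
import statistics

def kmeans_1d(xs, k, iters=50, seed=0):
    """Tiny 1-D Lloyd's algorithm; returns a cluster label per point."""
    rng = random.Random(seed)
    centers = rng.sample(xs, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in xs:
            clusters[min(range(k), key=lambda j: abs(x - centers[j]))].append(x)
        centers = [statistics.mean(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return [min(range(k), key=lambda j: abs(x - centers[j])) for x in xs]

def pooled_slope(xs, ys, k):
    """Pool subjects with similar predictor values, assay only the pooled
    outcome mean per pool, then fit the slope by ordinary least squares."""
    labels = kmeans_1d(xs, k)
    px, py = [], []
    for j in range(k):
        idx = [i for i, lab in enumerate(labels) if lab == j]
        if idx:
            px.append(statistics.mean(xs[i] for i in idx))
            py.append(statistics.mean(ys[i] for i in idx))
    mx, my = statistics.mean(px), statistics.mean(py)
    return (sum((x - mx) * (y - my) for x, y in zip(px, py))
            / sum((x - mx) ** 2 for x in px))

# With an exactly linear outcome, pool means stay on the line, so the
# pooled fit recovers the slope using k assays instead of 20.
xs = list(range(20))
ys = [2 * x + 1 for x in xs]
slope = pooled_slope(xs, ys, k=5)
```

Clustering on the predictor keeps the pooled design points spread out, which is why this loses far less precision than random pooling.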

  10. An Evaluation of CPRA (Cost Performance Report Analysis) Estimate at Completion Techniques Based Upon AFWAL (Air Force Wright Aeronautical Laboratories) Cost/Schedule Control System Criteria Data

    DTIC Science & Technology

    1985-09-01

  11. Accurate low-cost methods for performance evaluation of cache memory systems

    NASA Technical Reports Server (NTRS)

    Laha, Subhasis; Patel, Janak H.; Iyer, Ravishankar K.

    1988-01-01

Methods of simulation based on statistical techniques are proposed to decrease the need for large trace measurements and to predict true program behavior. Sampling techniques are applied while the address trace is collected from a workload. This drastically reduces the space and time needed to collect the trace. Simulation techniques are developed to use the sampled data not only to predict the mean miss rate of the cache, but also to provide an empirical estimate of its actual distribution. Finally, a concept of primed cache is introduced to simulate large caches by the sampling-based method.

  12. Through the Looking Glass: Estimating Effects of Medical Homes for People with Severe Mental Illness.

    PubMed

    Domino, Marisa Elena; Kilany, Mona; Wells, Rebecca; Morrissey, Joseph P

    2017-10-01

To examine whether medical homes have heterogeneous effects in different subpopulations, leveraging the interpretations from a variety of statistical techniques. Secondary claims data from the NC Medicaid program for 2004-2007. The sample included all adults with diagnoses of schizophrenia, bipolar disorder, or major depression who were not dually enrolled in Medicare or in a nursing facility. We modeled a number of monthly service use, adherence, and expenditure outcomes using fixed effects, generalized estimating equation with and without inverse probability of treatment weights, and instrumental variables analyses. Data were received from the Carolina Cost and Quality Initiative. The four estimation techniques consistently revealed generally positive associations between medical homes and access to primary care, specialty mental health care, greater medication adherence, slightly lower emergency room use, and greater expenditures. These findings were consistent across all three major severe mental illness diagnostic groups. Some heterogeneity in effects was noted, especially in preventive screening. Expanding access to primary care-based medical homes for people with severe mental illness may not save money for insurance providers, due to greater access for important outpatient services with little cost offset. Health services research examining more of the treatment heterogeneity may contribute to more realistic projections about medical homes outcomes. © Health Research and Educational Trust.

  13. Manufacture of conical springs with elastic medium technology improvement

    NASA Astrophysics Data System (ADS)

    Kurguzov, S. A.; Mikhailova, U. V.; Kalugina, O. B.

    2018-01-01

This article considers improving the manufacturing technology by using an elastic medium in the forming space of the stamping tool, so as to improve the performance characteristics of conical springs and reduce the costs of their production. An estimation technique for the operational properties of a disc spring is developed by mathematically modeling the compression process during operation of the spring. A technique for optimizing the design parameters of a conical spring is also developed, which ensures a minimum stress value at the edge of the spring opening during operation.

  14. THINEX - an expert system for estimating forest harvesting productivity and cost

    Treesearch

    C. B. LeDoux; B. Gopalakrishnan; R. S. Pabba

    1998-01-01

    As the emphasis of forest stand management shifts towards implementing ecosystem management, managers are examining alternative methods to harvesting stands in order to accomplish multiple objectives by using techniques such as shelterwood harvests, thinnings, and group selection methods, thus leaving more residual trees to improve the visual quality of the harvested...

  15. An Overview of a Hybrid Fraud Scoring and Spike Detection Technique for Fraud Detection in Streaming Data

    NASA Astrophysics Data System (ADS)

    Laleh, Naeimeh; Azgomi, Mohammad Abdollahi

    Credit card and personal loan applications have increased significantly. Application fraud is present when the application forms contain plausible and synthetic identity information or real stolen identity information. The monetary cost of application fraud is often estimated to be in the billions of dollars [1].

  16. An Analysis of Variance Framework for Matrix Sampling.

    ERIC Educational Resources Information Center

    Sirotnik, Kenneth

    Significant cost savings can be achieved with the use of matrix sampling in estimating population parameters from psychometric data. The statistical design is intuitively simple, using the framework of the two-way classification analysis of variance technique. For example, the mean and variance are derived from the performance of a certain grade…

  17. Statistical methods for efficient design of community surveys of response to noise: Random coefficients regression models

    NASA Technical Reports Server (NTRS)

    Tomberlin, T. J.

    1985-01-01

Research studies of residents' responses to noise consist of interviews with samples of individuals who are drawn from a number of different compact study areas. The statistical techniques developed here provide a basis for the associated sample design decisions. These techniques are suitable for a wide range of sample survey applications. A sample may consist of a random sample of residents selected from a sample of compact study areas, or in a more complex design, of a sample of residents selected from a sample of larger areas (e.g., cities). The techniques may be applied to estimates of the effects on annoyance of noise level, numbers of noise events, the time-of-day of the events, ambient noise levels, or other factors. Methods are provided for determining, in advance, how accurately these effects can be estimated for different sample sizes and study designs. Using a simple cost function, they also provide for optimum allocation of the sample across the stages of the design for estimating these effects. These techniques are developed via a regression model in which the regression coefficients are assumed to be random, with components of variance associated with the various stages of a multi-stage sample design.
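
The optimum-allocation idea can be illustrated with the textbook two-stage result, a generic sketch under a simple cost function rather than the paper's random-coefficients formulas: with total cost C = c_area·m + c_resident·m·n, the variance-minimizing number of interviews per study area depends only on the cost ratio and the variance components.

```python
import math

def optimal_interviews_per_area(c_area, c_resident, var_within, var_between):
    """Classic two-stage optimum for cost C = c_area*m + c_resident*m*n:
    n* = sqrt((c_area / c_resident) * (within-area var / between-area var))."""
    return math.sqrt((c_area / c_resident) * (var_within / var_between))

def areas_within_budget(budget, c_area, c_resident, n_per_area):
    """How many study areas the budget buys at n interviews per area."""
    return int(budget // (c_area + c_resident * n_per_area))

# Illustrative numbers: setting up an area costs 400, an interview 25,
# within-area variance 9, between-area variance 1.
n_opt = optimal_interviews_per_area(c_area=400, c_resident=25,
                                    var_within=9.0, var_between=1.0)
m = areas_within_budget(budget=20000, c_area=400, c_resident=25,
                        n_per_area=round(n_opt))
```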

  18. Comparison Study of MS-HRM and Pyrosequencing Techniques for Quantification of APC and CDKN2A Gene Methylation

    PubMed Central

    Migheli, Francesca; Stoccoro, Andrea; Coppedè, Fabio; Wan Omar, Wan Adnan; Failli, Alessandra; Consolini, Rita; Seccia, Massimo; Spisni, Roberto; Miccoli, Paolo; Mathers, John C.; Migliore, Lucia

    2013-01-01

    There is increasing interest in the development of cost-effective techniques for the quantification of DNA methylation biomarkers. We analyzed 90 samples of surgically resected colorectal cancer tissues for APC and CDKN2A promoter methylation using methylation sensitive-high resolution melting (MS-HRM) and pyrosequencing. MS-HRM is a less expensive technique compared with pyrosequencing but is usually more limited because it gives a range of methylation estimates rather than a single value. Here, we developed a method for deriving single estimates, rather than a range, of methylation using MS-HRM and compared the values obtained in this way with those obtained using the gold standard quantitative method of pyrosequencing. We derived an interpolation curve using standards of known methylated/unmethylated ratio (0%, 12.5%, 25%, 50%, 75%, and 100% of methylation) to obtain the best estimate of the extent of methylation for each of our samples. We observed similar profiles of methylation and a high correlation coefficient between the two techniques. Overall, our new approach allows MS-HRM to be used as a quantitative assay which provides results which are comparable with those obtained by pyrosequencing. PMID:23326336
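
The interpolation-curve step can be sketched as follows. The melting-signal values assigned to the standards below are invented for illustration; a laboratory would fit its own curve from instrument output for the 0-100% standards.

```python
# Calibration standards of known methylated/unmethylated ratio; the
# melting-signal values here are hypothetical.
standard_pct    = [0.0, 12.5, 25.0, 50.0, 75.0, 100.0]
standard_signal = [0.02, 0.11, 0.23, 0.48, 0.74, 1.00]

def interpolate_methylation(signal):
    """Map a sample's melting signal to a single %-methylation estimate
    by linear interpolation between the bracketing standards."""
    if signal <= standard_signal[0]:
        return standard_pct[0]
    if signal >= standard_signal[-1]:
        return standard_pct[-1]
    for i in range(len(standard_signal) - 1):
        s0, s1 = standard_signal[i], standard_signal[i + 1]
        if s0 <= signal <= s1:
            p0, p1 = standard_pct[i], standard_pct[i + 1]
            return p0 + (p1 - p0) * (signal - s0) / (s1 - s0)
```

This converts the usual MS-HRM range readout into the single value needed for comparison against pyrosequencing.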

  19. Estimating TCP Packet Loss Ratio from Sampled ACK Packets

    NASA Astrophysics Data System (ADS)

    Yamasaki, Yasuhiro; Shimonishi, Hideyuki; Murase, Tutomu

    The advent of various quality-sensitive applications has greatly changed the requirements for IP network management and made the monitoring of individual traffic flows more important. Since the processing costs of per-flow quality monitoring are high, especially in high-speed backbone links, packet sampling techniques have been attracting considerable attention. Existing sampling techniques, such as those used in Sampled NetFlow and sFlow, however, focus on the monitoring of traffic volume, and there has been little discussion of the monitoring of such quality indexes as packet loss ratio. In this paper we propose a method for estimating, from sampled packets, packet loss ratios in individual TCP sessions. It detects packet loss events by monitoring duplicate ACK events raised by each TCP receiver. Because sampling reveals only a portion of the actual packet loss, the actual packet loss ratio is estimated statistically. Simulation results show that the proposed method can estimate the TCP packet loss ratio accurately from a 10% sampling of packets.
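
The counting-and-scaling idea can be sketched as below. This is an illustration in the spirit of the paper, not its exact estimator: repeated cumulative ACK numbers in the sampled stream are treated as loss signals, and counts are scaled up by the sampling rate.

```python
def estimate_losses(sampled_acks, sample_rate):
    """sampled_acks: cumulative ACK numbers observed in the sample.
    sample_rate: fraction of packets sampled (e.g. 0.1 for 10%).
    Returns (estimated loss events, estimated loss ratio)."""
    dup_events, run, prev = 0, 0, None
    for ack in sampled_acks:
        if ack == prev:
            run += 1
            if run == 1:  # first duplicate in a run = one loss signal
                dup_events += 1
        else:
            run = 0
        prev = ack
    est_losses = dup_events / sample_rate
    est_packets = len(sampled_acks) / sample_rate
    return est_losses, (est_losses / est_packets if est_packets else 0.0)
```

In practice sampling also misses some duplicate-ACK runs entirely, so the paper's statistical correction for undetected events matters at low sampling rates.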

  20. Exploring multi-scale forest above ground biomass estimation with optical remote sensing imageries

    NASA Astrophysics Data System (ADS)

    Koju, U.; Zhang, J.; Gilani, H.

    2017-02-01

Forests account for 80% of the total exchange of carbon between the atmosphere and the terrestrial ecosystem. Because of this, monitoring forest above-ground biomass (carbon can be calculated as 0.47 of total biomass) has become very important. Above-ground biomass, being the major portion of total forest biomass, should be given very careful consideration in its estimation. Such monitoring is hoped to be useful in addressing the ongoing problems of deforestation and degradation and in gaining carbon mitigation benefits through mechanisms like Reducing Emissions from Deforestation and Forest Degradation (REDD+). Many methods of above-ground biomass estimation are in use, ranging from optical remote sensing imageries of very high to very low resolution to SAR data and LiDAR. This paper describes a multi-scale approach for assessing forest above-ground biomass, and ultimately carbon stocks, using very high resolution imageries, open-source medium-resolution satellite datasets and a very limited number of field plots. We found this to be one of the most promising methods for forest above-ground biomass estimation, offering higher accuracy at a low cost. A pilot study on biomass estimation using this technique was conducted in the Chitwan district of Nepal. GeoEye-1 (0.5 m), Landsat (30 m) and Google Earth (GE) images were used as the remote sensing imageries. An object-based image analysis (OBIA) classification technique was applied to the GeoEye imagery for tree crown delineation at the watershed level. A crown projection area (CPA) vs. biomass model was then developed and validated at the watershed level. Open-source GE imageries were used to calculate the CPA and biomass from virtual plots at the district level. Using data mining techniques, different parameters from Landsat imageries, along with the virtual sample biomass, were used to upscale the biomass estimation to the district level. 
We found that this approach can considerably reduce field data requirements for the estimation of biomass and carbon in comparison with inventory methods based on enumeration of all trees in a plot. The proposed methodology is very cost effective and can be replicated with limited resources and time.

  1. Detecting declines in the abundance of a bull trout (Salvelinus confluentus) population: Understanding the accuracy, precision, and costs of our efforts

    USGS Publications Warehouse

    Al-Chokhachy, R.; Budy, P.; Conner, M.

    2009-01-01

    Using empirical field data for bull trout (Salvelinus confluentus), we evaluated the trade-off between power and sampling effort-cost using Monte Carlo simulations of commonly collected mark-recapture-resight and count data, and we estimated the power to detect changes in abundance across different time intervals. We also evaluated the effects of monitoring different components of a population and stratification methods on the precision of each method. Our results illustrate substantial variability in the relative precision, cost, and information gained from each approach. While grouping estimates by age or stage class substantially increased the precision of estimates, spatial stratification of sampling units resulted in limited increases in precision. Although mark-resight methods allowed for estimates of abundance versus indices of abundance, our results suggest snorkel surveys may be a more affordable monitoring approach across large spatial scales. Detecting a 25% decline in abundance after 5 years was not possible, regardless of technique (power = 0.80), without high sampling effort (48% of study site). Detecting a 25% decline was possible after 15 years, but still required high sampling efforts. Our results suggest detecting moderate changes in abundance of freshwater salmonids requires considerable resource and temporal commitments and highlight the difficulties of using abundance measures for monitoring bull trout populations.
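
The Monte Carlo power calculation can be sketched generically. The abundance, decline, and observation-error values below are hypothetical, and a normal test on a log-linear trend slope stands in for the paper's mark-resight and count-data simulations:

```python
import math
import random

def power_to_detect_decline(n0, annual_decline, years, obs_cv,
                            n_sims=2000, alpha_z=-1.6449, seed=1):
    """Monte Carlo power sketch: simulate lognormal abundance estimates
    around a declining trend, fit a log-linear slope by OLS, and count
    how often a one-sided normal test flags a significant decline."""
    rng = random.Random(seed)
    t = list(range(years + 1))
    mt = sum(t) / len(t)
    sxx = sum((x - mt) ** 2 for x in t)
    hits = 0
    for _ in range(n_sims):
        # observation error: CV on the abundance scale ~ SD on log scale
        y = [math.log(n0 * (1 - annual_decline) ** x) + rng.gauss(0, obs_cv)
             for x in t]
        my = sum(y) / len(y)
        slope = sum((x - mt) * (yy - my) for x, yy in zip(t, y)) / sxx
        resid = [yy - my - slope * (x - mt) for x, yy in zip(t, y)]
        se = math.sqrt(sum(r * r for r in resid) / (len(t) - 2) / sxx)
        if se > 0 and slope / se < alpha_z:
            hits += 1
    return hits / n_sims

# ~25% total decline (about 5.6% per year) is hard to detect over
# 5 years with CV = 0.3, and much easier over 15 years.
p5 = power_to_detect_decline(1000, 0.056, 5, 0.3)
p15 = power_to_detect_decline(1000, 0.056, 15, 0.3)
```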

  2. Direction of Arrival Estimation with a Novel Single-Port Smart Antenna

    NASA Astrophysics Data System (ADS)

    Sun, Chen; Karmakar, Nemai C.

    2004-12-01

A novel direction of arrival (DOA) estimation technique that uses the conventional multiple-signal classification (MUSIC) algorithm with periodic signals is applied to a single-port smart antenna. Results show that the proposed method gives a high-resolution (1 degree) DOA estimation in an uncorrelated signal environment. The novelty lies in applying the MUSIC algorithm to a simplified antenna configuration. Only one analogue-to-digital converter (ADC) is used in this antenna, which features low power consumption, low cost, and ease of fabrication. Modifications to the conventional MUSIC algorithm do not bring much additional complexity. The proposed technique is also free from the negative influence of mutual coupling among antenna elements. Therefore, it offers an economical way to implement smart antennas extensively in existing wireless mobile communication systems, especially at power-consumption-limited mobile terminals such as laptops in wireless networks.
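
For reference, the conventional MUSIC pseudospectrum the paper builds on can be sketched for a generic uniform linear array. This is the textbook multi-element algorithm with invented simulation parameters, not the paper's single-port modification:

```python
import numpy as np

# Textbook MUSIC for an 8-element half-wavelength ULA.
def steering(theta_deg, n_elem, d=0.5):
    """Array response for a plane wave arriving from theta (degrees)."""
    phase = 2 * np.pi * d * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * phase * np.arange(n_elem))

def music_spectrum(R, n_src, grid):
    """MUSIC pseudospectrum: large where steering vectors are nearly
    orthogonal to the noise subspace of the covariance matrix R."""
    n_elem = R.shape[0]
    _, v = np.linalg.eigh(R)          # eigenvalues in ascending order
    En = v[:, : n_elem - n_src]       # noise-subspace eigenvectors
    return np.array([1.0 / np.abs(steering(th, n_elem).conj()
                                  @ En @ En.conj().T @ steering(th, n_elem))
                     for th in grid])

# Simulate one source at 20 degrees and scan the pseudospectrum.
rng = np.random.default_rng(0)
n_elem, n_snap, true_doa = 8, 200, 20.0
A = steering(true_doa, n_elem)[:, None]
s = rng.standard_normal((1, n_snap)) + 1j * rng.standard_normal((1, n_snap))
noise = 0.05 * (rng.standard_normal((n_elem, n_snap))
                + 1j * rng.standard_normal((n_elem, n_snap)))
x = A @ s + noise
R = x @ x.conj().T / n_snap
grid = np.arange(-90.0, 90.5, 0.5)
est = grid[int(np.argmax(music_spectrum(R, 1, grid)))]
```

The single-port design replaces the per-element ADC chain assumed here with one receiver port, which is where its cost saving comes from.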

  3. Methods for cost estimation in software project management

    NASA Astrophysics Data System (ADS)

    Briciu, C. V.; Filip, I.; Indries, I. I.

    2016-02-01

The speed at which the processes used in software development have changed makes forecasting the overall costs of a software project very difficult. Many researchers have considered this task unachievable, but a group of scientists holds that it can be solved using well-known mathematical methods (e.g., multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building a cost estimation model for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. The first part of the paper summarizes the major achievements in the research area of estimating overall project costs, together with a description of existing software development process models. The last part proposes a basic mathematical model of genetic programming, including the chosen fitness function and chromosome representation. The perspective of the described model is linked with the current reality of software development, taking as its basis the software product life cycle and the current challenges and innovations in the software development area. Based on the authors' experience and an analysis of the existing models and product life cycle, it was concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
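A minimal sketch of fitting a COCOMO-style effort model with a genetic algorithm follows. It is not the paper's model: the data are synthetic, generated from the basic COCOMO form effort = a * KLOC^b, and the fitness function (mean magnitude of relative error), selection, crossover, and mutation settings are all illustrative choices.

```python
import random

# Toy project data: (size in KLOC, actual effort in person-months),
# generated from the basic COCOMO form effort = a * size**b with a=2.8, b=1.05.
DATA = [(s, 2.8 * s ** 1.05) for s in (2, 5, 10, 20, 50, 100, 200)]

def mmre(ind):
    """Mean magnitude of relative error of effort = a * size**b on DATA."""
    a, b = ind
    return sum(abs(a * s ** b - e) / e for s, e in DATA) / len(DATA)

def evolve(pop_size=40, gens=80, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(0.5, 6.0), rng.uniform(0.8, 1.4)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=mmre)
        survivors = pop[: pop_size // 2]             # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            w = rng.random()                          # blend crossover
            child = (w * p1[0] + (1 - w) * p2[0],
                     w * p1[1] + (1 - w) * p2[1])
            child = (child[0] + rng.gauss(0, 0.1),    # Gaussian mutation
                     child[1] + rng.gauss(0, 0.02))
            children.append(child)
        pop = survivors + children
    return min(pop, key=mmre)

best = evolve()
```

Because the top half of each generation survives unchanged, the best individual never regresses, and the two parameters converge toward the generating values.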

  4. Techniques and methods for estimating abundance of larval and metamorphosed sea lampreys in Great Lakes tributaries, 1995 to 2001

    USGS Publications Warehouse

    Slade, Jeffrey W.; Adams, Jean V.; Christie, Gavin C.; Cuddy, Douglas W.; Fodale, Michael F.; Heinrich, John W.; Quinlan, Henry R.; Weise, Jerry G.; Weisser, John W.; Young, Robert J.

    2003-01-01

    Before 1995, Great Lakes streams were selected for lampricide treatment based primarily on qualitative measures of the relative abundance of larval sea lampreys, Petromyzon marinus. New integrated pest management approaches required standardized quantitative measures of sea lamprey. This paper evaluates historical larval assessment techniques and data and describes how new standardized methods for estimating abundance of larval and metamorphosed sea lampreys were developed and implemented. These new methods have been used to estimate larval and metamorphosed sea lamprey abundance in about 100 Great Lakes streams annually and to rank them for lampricide treatment since 1995. Implementation of these methods has provided a quantitative means of selecting streams for treatment based on treatment cost and estimated production of metamorphosed sea lampreys, provided managers with a tool to estimate potential recruitment of sea lampreys to the Great Lakes and the ability to measure the potential consequences of not treating streams, resulting in a more justifiable allocation of resources. The empirical data produced can also be used to simulate the impacts of various control scenarios.

  5. Humans, 'things' and space: costing hospital infection control interventions.

    PubMed

    Page, K; Graves, N; Halton, K; Barnett, A G

    2013-07-01

Previous attempts at costing infection control programmes have tended to focus on accounting costs rather than economic costs. For studies using economic costs, estimates tend to be quite crude and probably underestimate the true cost. One of the largest costs of any intervention is staff time, but this cost is difficult to quantify and has been largely ignored in previous attempts. The aim of this article is to show how to design and evaluate the costs of hospital-based infection control interventions or programmes. The article also discusses several issues to consider when costing interventions, and suggests strategies for overcoming them. Previous literature and techniques in both health economics and psychology are reviewed and synthesized. The article provides a set of generic, transferable costing guidelines. Key principles, such as defining the study scope and focusing on large costs, as well as pitfalls (e.g. overconfidence and uncertainty), are discussed. These guidelines can be used by hospital staff and other researchers to cost their infection control programmes and interventions more accurately. Copyright © 2013 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  6. How do high cost-sharing policies for physician care affect total care costs among people with chronic disease?

    PubMed

    Xin, Haichang; Harman, Jeffrey S; Yang, Zhou

    2014-01-01

This study examines whether high cost-sharing for physician care has a differential impact on total care costs by health status. Total care includes physician care, emergency room (ER) visits, and inpatient care. Since high cost-sharing policies can reduce needed care as well as unneeded care, they raise the question of whether such policies are a good strategy for controlling costs among chronically ill patients. This study used the 2007 Medical Expenditure Panel Survey data with a cross-sectional study design. Difference-in-differences (DID), instrumental variable, two-part model, and bootstrap techniques were employed to analyze cost data. Chronically ill individuals' probability of reducing any overall care costs was significantly lower than that of healthier individuals (beta = 2.18, p = 0.04), while the integrated DID estimator from split results indicated that going from low cost-sharing to high cost-sharing significantly reduced costs by $12,853.23 more for sick people than for healthy people (95% CI: -$17,582.86, -$8,123.60). This greater cost reduction in total care among sick people likely resulted from greater cost reduction in physician care, and may have come at the expense of jeopardizing health outcomes by depriving patients of needed care. Thus, these policies would be inappropriate in the short run, and unlikely in the long run to control health plan costs among chronically ill individuals. A generous benefit design with low cost-sharing for physician or primary care is recommended for both health plans and chronically ill individuals, to save costs and protect enrollees' health status.

  7. Estimation of velocity structure around a natural gas reservoir at Yufutsu, Japan, by microtremor survey

    NASA Astrophysics Data System (ADS)

    Shiraishi, H.; Asanuma, H.; Tezuka, K.

    2010-12-01

Seismic reflection surveys have been commonly used for exploration and time-lapse monitoring of oil/gas resources. Seismic reflection images typically have reasonable reliability and resolution for commercial production. However, cost considerations sometimes preclude deployment of a widely distributed array or a repeated survey for time-lapse monitoring or exploration of a small-scale reservoir. Hence, technologies to estimate structures and physical properties around the reservoir at limited cost would be useful. The microtremor survey method (MSM) can realize long-term, low-cost monitoring of a reservoir, because the technique is passive and the minimum number of monitoring stations is four. MSM has mainly been used for earthquake disaster prevention, because the S-wave velocity structure is directly estimated from the velocity dispersion of Rayleigh waves. The authors experimentally investigated the feasibility of MSM for exploration of an oil/gas reservoir. The field measurement was carried out around a natural gas reservoir at Yufutsu, Hokkaido, Japan. Four types of arrays with radii of 30 m, 100 m, 300 m, and 600 m were deployed in each area. Dispersion curves of the Rayleigh-wave velocity were estimated from the observed microtremors, and S-wave velocity structures were estimated by inverse analysis of the dispersion curves with a genetic algorithm (GA). The estimated velocity structures showed good consistency with the one-dimensional velocity structure from previous reflection surveys down to 4-5 km. We also found from the field experiment that 40 min of data is sufficient to estimate the velocity structure even when the seismometers are deployed along roads with heavy traffic.

  8. Fast algorithm for spectral processing with application to on-line welding quality assurance

    NASA Astrophysics Data System (ADS)

    Mirapeix, J.; Cobo, A.; Jaúregui, C.; López-Higuera, J. M.

    2006-10-01

    A new technique is presented in this paper for the analysis of welding process emission spectra to accurately estimate in real-time the plasma electronic temperature. The estimation of the electronic temperature of the plasma, through the analysis of the emission lines from multiple atomic species, may be used to monitor possible perturbations during the welding process. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, sub-pixel algorithms are used to more accurately estimate the central wavelength of the peaks. Three different sub-pixel algorithms will be analysed and compared, and it will be shown that the LPO (linear phase operator) sub-pixel algorithm is a better solution within the proposed system. Experimental tests during TIG-welding using a fibre optic to capture the arc light, together with a low cost CCD-based spectrometer, show that some typical defects associated with perturbations in the electron temperature can be easily detected and identified with this technique. A typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.

9. Using pharmacoeconomic modelling to determine value-based pricing for new pharmaceuticals in Malaysia.

    PubMed

    Dranitsaris, George; Truter, Ilse; Lubbe, Martie S; Sriramanakoppa, Nitin N; Mendonca, Vivian M; Mahagaonkar, Sangameshwar B

    2011-10-01

Decision analysis (DA) is commonly used to perform economic evaluations of new pharmaceuticals. Using multiples of Malaysia's per capita 2010 gross domestic product (GDP) as the threshold for economic value, as suggested by the World Health Organization (WHO), DA was used to estimate a price per dose for bevacizumab, a drug that provides a 1.4-month survival benefit in patients with metastatic colorectal cancer (mCRC). A decision model was developed to simulate progression-free and overall survival in mCRC patients receiving chemotherapy with and without bevacizumab. Costs for chemotherapy and management of side effects were obtained from public and private hospitals in Malaysia. Utility estimates, measured as quality-adjusted life years (QALYs), were determined by interviewing 24 oncology nurses using the time trade-off technique. The price per dose was then estimated using a target threshold of US$44 400 per QALY gained, which is 3 times the Malaysian per capita GDP. A cost-effective price for bevacizumab could not be determined because the survival benefit provided was insufficient. According to the WHO criteria, if the drug were able to improve survival from 1.4 to 3 or 6 months, the price per dose would be $567 and $1258, respectively. The use of decision modelling for estimating drug pricing is a powerful technique to ensure value for money. Such information is of value to drug manufacturers and formulary committees because it facilitates negotiations for value-based pricing in a given jurisdiction.
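The back-calculation of a value-based price from a cost-per-QALY threshold reduces to a one-line formula: the price per dose is whatever makes the incremental cost per QALY equal the threshold. The inputs below (incremental QALYs, other incremental costs, number of doses) are hypothetical, not the study's actual model inputs.

```python
def value_based_price(delta_qaly, other_costs, n_doses, threshold=44400.0):
    """Price per dose at which incremental cost per QALY hits the threshold:
    threshold = (n_doses * price + other_costs) / delta_qaly, solved for price."""
    return (threshold * delta_qaly - other_costs) / n_doses

# Hypothetical inputs: 0.25 incremental QALYs, $2,000 of other incremental
# costs, 12 doses per course, WHO-style threshold of 3x per capita GDP.
price = value_based_price(delta_qaly=0.25, other_costs=2000.0, n_doses=12)
print(round(price, 2))  # 758.33
```

If the survival benefit (and hence delta_qaly) is too small, the formula returns a non-positive price, which matches the study's finding that no cost-effective price existed for a 1.4-month gain.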

  10. [Monetary value of the human costs of road traffic injuries in Spain].

    PubMed

    Martínez Pérez, Jorge Eduardo; Sánchez Martínez, Fernando Ignacio; Abellán Perpiñán, José María; Pinto Prades, José Luis

    2015-09-01

    Cost-benefit analyses in the field of road safety compute human costs as a key component of total costs. The present article presents two studies promoted by the Directorate-General for Traffic aimed at obtaining official values for the costs associated with fatal and non-fatal traffic injuries in Spain. We combined the contingent valuation approach and the (modified) standard gamble technique in two surveys administered to large representative samples (n1=2,020, n2=2,000) of the Spanish population. The monetary value of preventing a fatality was estimated to be 1.4 million euros. Values of 219,000 and 6,100 euros were obtained for minor and severe non-fatal injuries, respectively. These figures are comparable to those observed in neighboring countries. Copyright © 2014 SESPAS. Published by Elsevier Espana. All rights reserved.

  11. The cost of starting and maintaining a large home hemodialysis program.

    PubMed

    Komenda, Paul; Copland, Michael; Makwana, Jay; Djurdjev, Ogdjenka; Sood, Manish M; Levin, Adeera

    2010-06-01

Home extended-hours hemodialysis improves some measurable biological and quality-of-life parameters over conventional renal replacement therapies in patients with end-stage renal disease. Small published studies evaluating costs have shown savings in ongoing operating costs with this modality. However, all estimates need to include the total costs, including infrastructure, patient training, and maintenance; patient attrition by death, transplantation, or technique failure; and the necessity of in-center dialysis. We describe a comprehensive funding model for a large, centrally administered but locally delivered home hemodialysis program in British Columbia, Canada that covered 122 patients, of whom 113 were still in the program at study end. The majority of patients performed home nocturnal hemodialysis in this 2-year retrospective study. All training periods, in-center and in-home dialysis, medications, hospitalizations, and deaths were captured using our provincial renal database and vital statistics. Comparative data from the provincial database and pricing models were used for costing purposes. The total comprehensive cost per patient, incorporating startup, home, and in-center dialysis; medications; home remodeling; and consumables, was $59,179 for years 2004-2005 and $48,648 for 2005-2006. The home dialysis patients required multiple in-center dialysis runs, significantly contributing to the overall costs. Our study describes a valid, comprehensive funding model delineating reliable cost estimates of starting and maintaining a large home-based hemodialysis program. Consideration of hidden costs is important for administrators and planners to take into account when designing budgets for home hemodialysis.

  12. Cost-benefit analysis of biopsy methods for suspicious mammographic lesions; discussion 994-5.

    PubMed

    Fahy, B N; Bold, R J; Schneider, P D; Khatri, V; Goodnight, J E

    2001-09-01

We hypothesized that stereotactic core biopsy (SCB) is more cost-effective than needle-localized biopsy (NLB) for the evaluation and treatment of mammographic lesions. A computer-generated mathematical model was developed, based on clinical outcome modeling, to estimate costs accrued during evaluation and treatment of suspicious mammographic lesions. Total costs were determined for evaluation and subsequent treatment of cancer when either SCB or NLB was used as the initial biopsy method. Cost was estimated by the cumulative work relative value units accrued. The risk of malignancy based on the Breast Imaging Reporting and Data System (BIRADS) score and mammographic suspicion of ductal carcinoma in situ were varied to simulate common clinical scenarios. The outcome measure was the total cost accumulated during evaluation and subsequent surgical therapy (if required). Evaluation of BIRADS 5 lesions (highly suggestive, risk of malignancy = 90%) resulted in equivalent relative value units for both techniques (SCB, 15.54; NLB, 15.47). Evaluation of lesions highly suspicious for ductal carcinoma in situ yielded similar total treatment relative value units (SCB, 11.49; NLB, 10.17). Only for evaluation of BIRADS 4 lesions (suspicious abnormality, risk of malignancy = 34%) was SCB more cost-effective than NLB (SCB, 7.65 vs. NLB, 15.66). No difference in cost-benefit was found when lesions highly suggestive of malignancy (BIRADS 5) or those suspicious for ductal carcinoma in situ were evaluated initially with SCB vs. NLB, thereby disproving the hypothesis. Only for intermediate-risk lesions (BIRADS 4) did initial evaluation with SCB yield a greater cost savings than with NLB.

  13. Operations Research techniques in the management of large-scale reforestation programs

    Treesearch

    Joseph Buongiorno; D.E. Teeguarden

    1978-01-01

    A reforestation planning system for the Douglas-fir region of the Western United States is described. Part of the system is a simulation model to predict plantation growth and to determine economic thinning regimes and rotation ages as a function of site characteristics, initial density, reforestation costs, and management constraints. A second model estimates the...

  14. 75 FR 5452 - Regulations Under the Paul Wellstone and Pete Domenici Mental Health Parity and Addiction Equity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-02

    ... collection techniques or other forms of information technology; and Estimates of capital or start-up costs... participant or beneficiary in accordance with regulations. The likely respondents are business or other for... Advocacy of the Small Business Administration for comment on its impact on small business. Comments and...

  15. Examination of the Financial Costs of Teacher Turnover in Mid-Sized Urban School Districts

    ERIC Educational Resources Information Center

    Synar, Edwyna Anne

    2010-01-01

    It is estimated that 50% of beginning teachers leave the profession within the first five years on the job (Murnane, Singer, Willett, Kemple, & Olsen, 1991; Colbert & Wolff, 1992; Ingersoll, 2003b; Schlechty & Vance, 1981). When teachers depart, they take with them their knowledge of instructional techniques, students' learning styles, and…

  16. A model for the cost of doing a cost estimate

    NASA Technical Reports Server (NTRS)

    Remer, D. S.; Buchanan, H. R.

    1992-01-01

A model for estimating the cost required to do a cost estimate for Deep Space Network (DSN) projects that range from $0.1 to $100 million is presented. The cost of the cost estimate in thousands of dollars, C_E, is found to be approximately given by C_E = K * (C_P)^0.35, where C_P is the cost of the project being estimated in millions of dollars and K is a constant depending on the accuracy of the estimate. For an order-of-magnitude estimate, K = 24; for a budget estimate, K = 60; and for a definitive estimate, K = 115. That is, for a specific project, the cost of doing a budget estimate is about 2.5 times as much as that for an order-of-magnitude estimate, and a definitive estimate costs about twice as much as a budget estimate. Use of this model should help provide the level of resources required for doing cost estimates and, as a result, provide insights towards more accurate estimates with less potential for cost overruns.
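The model reduces to a one-line function; a sketch follows, with the three K values taken directly from the abstract.

```python
def cost_of_estimate(project_cost_millions, estimate_type):
    """Cost of preparing the estimate, in thousands of dollars:
    C_E = K * C_P**0.35, with K depending on estimate accuracy."""
    K = {"order-of-magnitude": 24, "budget": 60, "definitive": 115}
    return K[estimate_type] * project_cost_millions ** 0.35

# For a $10 million project, a budget estimate costs roughly $134k to produce,
# exactly 2.5x the order-of-magnitude estimate (the K ratio 60/24).
budget_10m = cost_of_estimate(10, "budget")
```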

  17. A low-cost three-dimensional laser surface scanning approach for defining body segment parameters.

    PubMed

    Pandis, Petros; Bull, Anthony Mj

    2017-11-01

    Body segment parameters are used in many different applications in ergonomics as well as in dynamic modelling of the musculoskeletal system. Body segment parameters can be defined using different methods, including techniques that involve time-consuming manual measurements of the human body, used in conjunction with models or equations. In this study, a scanning technique for measuring subject-specific body segment parameters in an easy, fast, accurate and low-cost way was developed and validated. The scanner can obtain the body segment parameters in a single scanning operation, which takes between 8 and 10 s. The results obtained with the system show a standard deviation of 2.5% in volumetric measurements of the upper limb of a mannequin and 3.1% difference between scanning volume and actual volume. Finally, the maximum mean error for the moment of inertia by scanning a standard-sized homogeneous object was 2.2%. This study shows that a low-cost system can provide quick and accurate subject-specific body segment parameter estimates.

  18. Cost utility of maintenance treatment of recurrent depression with sertraline versus episodic treatment with dothiepin.

    PubMed

    Hatziandreu, E J; Brown, R E; Revicki, D A; Turner, R; Martindale, J; Levine, S; Siegel, J E

    1994-03-01

The objective of this study was to model, for patients at risk of recurrent depression, the cost-utility of maintenance therapy with sertraline compared with treatment of acute episodes with dothiepin ('episodic treatment'). Using clinical decision analysis techniques, a Markov state-transition model was constructed to estimate the lifetime costs and quality-adjusted life-years (QALYs) of the 2 therapeutic strategies. The model follows 2 cohorts of 35-year-old women at high risk for recurrent depression over their lifetimes. Model construction and the relevant data (probabilities) for performing the analysis were based on existing clinical knowledge. Two physician panels were used to obtain estimates of recurrence probabilities not available in the literature, health utilities, and resource consumption. Costs were obtained from published sources. The baseline analysis showed that it costs 2172 British pounds sterling ($US3692, 1991 currency) to save an additional QALY with sertraline maintenance treatment. Sensitivity analysis showed that the incremental cost-utility ratio ranged from 557 British pounds sterling to 5260 British pounds sterling per QALY. Overall, the resulting ratios are well within the range of cost-utility ratios that support the adoption and appropriate utilisation of a technology. Based on the study assumptions, long-term maintenance treatment with sertraline appears to be a clinically and economically justified choice for patients at high risk of recurrent depression.
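A Markov state-transition cohort model of this kind can be sketched in a few lines. The states, transition probabilities, utilities, costs, and discount rate below are hypothetical placeholders, not the study's inputs; the sketch only shows the mechanics of accumulating discounted costs and QALYs and forming an incremental cost-utility ratio.

```python
def run_markov(p_relapse, drug_cost_per_year, years=30, disc=0.035):
    """Two-state cohort model (0 = well, 1 = depressed), hypothetical inputs."""
    u = (0.86, 0.60)                  # annual utility per state
    care_cost = (0.0, 400.0)          # annual care cost per state
    p_recover = 0.7                   # chance of leaving the depressed state
    dist = [1.0, 0.0]                 # cohort starts well
    cost = qaly = 0.0
    for t in range(years):
        df = (1 + disc) ** -t         # discount factor for cycle t
        cost += df * (dist[0] * (care_cost[0] + drug_cost_per_year)
                      + dist[1] * (care_cost[1] + drug_cost_per_year))
        qaly += df * (dist[0] * u[0] + dist[1] * u[1])
        dist = [dist[0] * (1 - p_relapse) + dist[1] * p_recover,
                dist[0] * p_relapse + dist[1] * (1 - p_recover)]
    return cost, qaly

# Maintenance therapy: lower relapse probability, but a drug cost every year.
cost_epi, qaly_epi = run_markov(p_relapse=0.30, drug_cost_per_year=0.0)
cost_mnt, qaly_mnt = run_markov(p_relapse=0.15, drug_cost_per_year=300.0)
icer = (cost_mnt - cost_epi) / (qaly_mnt - qaly_epi)
```

The incremental cost-utility ratio (`icer`) is the extra cost of maintenance divided by the QALYs it gains, the same quantity the study reports as pounds sterling per QALY.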

  19. NASA/BLM Applications Pilot Test (APT), phase 2. Volume 1: Executive summary. [vegetation mapping and production estimation in northwestern Arizona

    NASA Technical Reports Server (NTRS)

    1981-01-01

Data from LANDSAT, low altitude color aerial photography, and ground visits were combined and used to produce vegetation cover maps and to estimate productivity of range, woodland, and forest resources in northwestern Arizona. A planning session, two workshops, and four status reviews were held to assist technology transfer from NASA. Computer-aided digital classification of LANDSAT data was selected as a major source of input data. An overview is presented of the data processing, data collection, productivity estimation, and map verification techniques used. Cost analysis and LANDSAT digital products are also considered.

  20. Financial Impact of the Robotic Approach in Liver Surgery: A Comparative Study of Clinical Outcomes and Costs Between the Robotic and Open Technique in a Single Institution.

    PubMed

    Daskalaki, Despoina; Gonzalez-Heredia, Raquel; Brown, Marc; Bianco, Francesco M; Tzvetanov, Ivo; Davis, Myriam; Kim, Jihun; Benedetti, Enrico; Giulianotti, Pier C

    2017-04-01

    One of the perceived major drawbacks of minimally invasive techniques has always been its cost. This is especially true for the robotic approach and is one of the main reasons that has prevented its wider acceptance among hospitals and surgeons. The aim of our study was to evaluate the clinical outcomes and economic impact of robotic and open liver surgery in a single institution. Sixty-eight robotic and 55 open hepatectomies were performed at our institution between January 1, 2009 and December 31, 2013. Demographics, perioperative data, and postoperative outcomes were collected and compared between the two groups. An independent company performed the financial analysis. The economic parameters comprised direct variable costs, direct fixed costs, and indirect costs. Mean estimated blood loss was significantly less in the robotic group (438 versus 727.8 mL; P = .038). Overall morbidity was significantly lower in the robotic group (22% versus 40%; P = .047). Clavien III/IV complications were also lower, with 4.4% in the robotic versus 16.3% in the open group (P = .043). The length of stay in the intensive care unit (ICU) was shorter for patients who underwent a robotic procedure (2.1 versus 3.3 days; P = .004). The average total cost, including readmissions, was $37,518 for robotic surgery and $41,948 for open technique. Robotic liver resections had less overall morbidity, ICU, and hospital stay. This translates into decreased average costs for robotic surgery. These procedures are financially comparable to open resections and do not represent a financial burden to the hospital.

  2. Resource utilization pattern and cost of tuberculosis treatment from the provider and patient perspectives in the state of Penang, Malaysia.

    PubMed

    Atif, Muhammad; Sulaiman, Syed Azhar Syed; Shafie, Asrul Akmal; Asif, Muhammad; Babar, Zaheer-Ud-Din

    2014-08-19

Studies from both developed and developing countries have demonstrated a considerable fluctuation in the average cost of TB treatment. The objective of this study was to analyze the medical resource utilization among new smear positive pulmonary tuberculosis patients. We also estimated the cost of tuberculosis treatment from the provider and patient perspectives, and identified the significant cost driving factors. All new smear positive pulmonary tuberculosis patients who were registered at the chest clinic of the Penang General Hospital, between March 2010 and February 2011, were invited to participate in the study. Provider sector costs were estimated using a bottom-up, micro-costing technique. For the calculation of costs from the patients' perspective, all eligible patients who agreed to participate in the study were interviewed after the intensive phase and subsequently at the end of the treatment by a trained nurse. PASW was used to analyze the data (Predictive Analysis SoftWare, version 19.0, Armonk, NY: IBM Corp.). During the study period, 226 patients completed the treatment. However, complete costing data were available for 212 patients. The most highly utilized resources were chest X-ray followed by sputum smear examination. Only a small proportion of the patients were hospitalized. The average provider sector cost was MYR 992.34 (i.e., USD 325.35 per patient) whereby the average patient sector cost was MYR 1225.80 (i.e., USD 401.90 per patient). The average patient sector cost of our study population accounted for 5.7% of their annual family income. In multiple linear regression analysis, prolonged treatment duration (i.e., > 6 months) was the only predictor of higher provider sector costs, whereby higher patient sector costs were determined by greater household income and persistent cough at the end of the intensive phase of the treatment. In relation to average provider sector cost, our estimates are substantially higher than the budget allocated by the Ministry of Health for the treatment of a tuberculosis case in Malaysia. The expenses borne by the patients and their families on the treatment of the current episode of tuberculosis were not catastrophic for them.
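Bottom-up micro-costing, as used for the provider sector costs here, multiplies each resource's utilization count by its unit cost and sums over all resources a patient consumed. A sketch with hypothetical unit costs and utilization follows; none of these figures come from the study.

```python
# Hypothetical unit costs (per use) for illustration only.
UNIT_COSTS = {"chest_xray": 15.0, "sputum_smear": 8.0,
              "bed_day": 120.0, "dot_visit": 4.0}

def provider_cost(utilization):
    """Bottom-up micro-costing: sum of quantity x unit cost per resource."""
    return sum(UNIT_COSTS[item] * qty for item, qty in utilization.items())

# One hypothetical patient's resource utilization over a treatment course.
patient = {"chest_xray": 3, "sputum_smear": 6, "bed_day": 2, "dot_visit": 40}
print(provider_cost(patient))  # 45 + 48 + 240 + 160 = 493.0
```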

  3. Efficient implementation of a real-time estimation system for thalamocortical hidden Parkinsonian properties

    NASA Astrophysics Data System (ADS)

    Yang, Shuangming; Deng, Bin; Wang, Jiang; Li, Huiyan; Liu, Chen; Fietkiewicz, Chris; Loparo, Kenneth A.

    2017-01-01

Real-time estimation of the dynamical characteristics of thalamocortical (TC) cells, such as the dynamics of ion channels and membrane potentials, is useful and essential in the study of the thalamus in the Parkinsonian state. However, measuring the dynamical properties of ion channels is extremely challenging experimentally and even impossible in clinical applications. This paper presents and evaluates a real-time estimation system for thalamocortical hidden properties. For the sake of efficiency, we use a field programmable gate array (FPGA) for strictly hardware-based computation and algorithm optimization. In the proposed system, an FPGA-based unscented Kalman filter is implemented into a conductance-based TC neuron model. Since the complexity of the TC neuron model constrains its hardware implementation in a parallel structure, a cost-efficient model is proposed to reduce the resource cost while retaining the relevant ionic dynamics. Experimental results demonstrate the real-time capability to estimate thalamocortical hidden properties with high precision under both normal and Parkinsonian states. Although the method is applied here to estimate the hidden properties of the thalamus and explore the mechanism of the Parkinsonian state, it can also be useful in the dynamic clamp technique of electrophysiological experiments, neural control engineering, and brain-machine interface studies.

  4. Solution to the Problem of Calibration of Low-Cost Air Quality Measurement Sensors in Networks.

    PubMed

    Miskell, Georgia; Salmond, Jennifer A; Williams, David E

    2018-04-27

    We provide a simple, remote, continuous calibration technique suitable for application in a hierarchical network featuring a few well-maintained, high-quality instruments ("proxies") and a larger number of low-cost devices. The ideas are grounded in a clear definition of the purpose of a low-cost network, defined here as providing reliable information on air quality at small spatiotemporal scales. The technique assumes linearity of the sensor signal. It derives running slope and offset estimates by matching mean and standard deviations of the sensor data to values derived from proxies over the same time. The idea is extremely simple: choose an appropriate proxy and an averaging-time that is sufficiently long to remove the influence of short-term fluctuations but sufficiently short that it preserves the regular diurnal variations. The use of running statistical measures rather than cross-correlation of sites means that the method is robust against periods of missing data. Ideas are first developed using simulated data and then demonstrated using field data, at hourly and 1 min time-scales, from a real network of low-cost semiconductor-based sensors. Despite the almost naïve simplicity of the method, it was robust for both drift detection and calibration correction applications. We discuss the use of generally available geographic and environmental data as well as microscale land-use regression as means to enhance the proxy estimates and to generalize the ideas to other pollutants with high spatial variability, such as nitrogen dioxide and particulates. These improvements can also be used to minimize the required number of proxy sites.
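The core of the calibration is a two-moment match: the gain is the ratio of the proxy's standard deviation to the sensor's over the same window, and the offset aligns the means. A minimal sketch follows (single window, no running update), with made-up readings.

```python
import statistics

def fit_calibration(sensor, proxy):
    """Gain and offset that map sensor readings onto the proxy's mean and
    standard deviation over the same time window (the two-moment match)."""
    gain = statistics.pstdev(proxy) / statistics.pstdev(sensor)
    offset = statistics.mean(proxy) - gain * statistics.mean(sensor)
    return gain, offset

# A drifted sensor reporting half the true value minus 1.5 units:
proxy = [10.0, 12.0, 14.0, 16.0, 18.0, 20.0]
sensor = [0.5 * p - 1.5 for p in proxy]
gain, offset = fit_calibration(sensor, proxy)
corrected = [gain * s + offset for s in sensor]   # recovers the proxy values
```

In the paper's running form, the same two statistics are computed over a sliding window long enough to average out short-term fluctuations but short enough to preserve diurnal variation.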

  5. Tuning support vector machines for minimax and Neyman-Pearson classification.

    PubMed

    Davenport, Mark A; Baraniuk, Richard G; Scott, Clayton D

    2010-10-01

    This paper studies the training of support vector machine (SVM) classifiers with respect to the minimax and Neyman-Pearson criteria. In principle, these criteria can be optimized in a straightforward way using a cost-sensitive SVM. In practice, however, because these criteria require especially accurate error estimation, standard techniques for tuning SVM parameters, such as cross-validation, can lead to poor classifier performance. To address this issue, we first prove that the usual cost-sensitive SVM, here called the 2C-SVM, is equivalent to another formulation called the 2nu-SVM. We then exploit a characterization of the 2nu-SVM parameter space to develop a simple yet powerful approach to error estimation based on smoothing. In an extensive experimental study, we demonstrate that smoothing significantly improves the accuracy of cross-validation error estimates, leading to dramatic performance gains. Furthermore, we propose coordinate descent strategies that offer significant gains in computational efficiency, with little to no loss in performance.
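
    The smoothing idea can be illustrated without training any SVM: cross-validation error estimates over a parameter grid are noisy, so the raw minimizer can land far from the true optimum, while averaging each grid point with its neighbours gives a more reliable choice. The "true" error curve and noise level below are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(2)
grid = np.linspace(0.0, 1.0, 101)                   # e.g. a nu-style parameter
true_err = 0.2 + (grid - 0.6) ** 2                  # smooth underlying error
cv_err = true_err + rng.normal(0, 0.05, grid.size)  # noisy CV estimates

kernel = np.ones(9) / 9                             # simple moving average
smooth = np.convolve(cv_err, kernel, mode="valid")  # drop the 4-point edges
pick = grid[4:-4][smooth.argmin()]                  # smoothed parameter choice
print(round(float(pick), 2))                        # near the true optimum at 0.6
```

    The paper's approach exploits the structure of the 2nu-SVM parameter space rather than a fixed moving average, but the stabilizing effect of smoothing noisy error estimates is the same.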

  6. An angle-dependent estimation of CT x-ray spectrum from rotational transmission measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Yuan, E-mail: yuan.lin@duke.edu; Samei, Ehsan; Ramirez-Giraldo, Juan Carlos

    2014-06-15

    Purpose: Computed tomography (CT) performance, as well as dose and image quality, is directly affected by the x-ray spectrum. However, current assessment approaches for the CT x-ray spectrum require costly measurement equipment and complicated operational procedures, and are often limited to the spectrum corresponding to the center of rotation. To address these limitations, the authors propose an angle-dependent estimation technique, in which the incident spectra across a wide range of angular trajectories can be estimated accurately with only a single phantom and a single axial scan, without knowledge of the bowtie filter. Methods: The proposed technique uses a uniform cylindrical phantom, made of ultra-high-molecular-weight polyethylene and positioned in an off-centered geometry. The projection data acquired with an axial scan serve a twofold purpose. First, they provide transmission measurements across different angular trajectories. Second, they are used to reconstruct the cross-sectional image of the phantom, which is then utilized to compute the intersection length of each transmission measurement. With each CT detector element recording a range of transmission measurements for a single angular trajectory, the spectrum is estimated for that trajectory. A data conditioning procedure combines information from hundreds of collected transmission measurements to accelerate the estimation, reduce noise, and improve estimation stability. The proposed spectral estimation technique was validated experimentally using a clinical scanner (Somatom Definition Flash, Siemens Healthcare, Germany), with spectra provided by the manufacturer serving as the comparison standard. Results obtained with the proposed technique were compared against those obtained from a conventional transmission measurement technique using two materials (i.e., Cu and Al). After validation, the proposed technique was applied to measure spectra from the clinical system across a range of angular trajectories [−15°, 15°] and spectrum settings (80, 100, 120, 140 kVp). Results: At 140 kVp, the proposed technique was comparable to the conventional technique, with a mean energy difference (MED) of −0.29 keV and a normalized root mean square difference (NRMSD) of 0.84% from the comparison standard, compared with 0.64 keV and 1.56%, respectively, for the conventional technique. The average absolute MEDs and NRMSDs across kVp settings and angular trajectories were less than 0.61 keV and 3.41%, respectively, indicating a high level of estimation accuracy and stability. Conclusions: An angle-dependent technique for estimating CT x-ray spectra from rotational transmission measurements was proposed. Compared with the conventional technique, the proposed method simplifies the measurement procedures and enables incident spectral estimation for a wide range of angular trajectories. The proposed technique is suitable for rigorous research objectives as well as routine clinical quality control procedures.

  7. Fast and robust estimation of ophthalmic wavefront aberrations

    NASA Astrophysics Data System (ADS)

    Dillon, Keith

    2016-12-01

    Rapidly rising levels of myopia, particularly in the developing world, have led to an increased need for inexpensive and automated approaches to optometry. A simple and robust technique is provided for estimating major ophthalmic aberrations using a gradient-based wavefront sensor. The approach is based on numerical calculations to produce diverse combinations of phase components, followed by Fourier transforms to calculate the coefficients. The approach requires neither phase unwrapping nor iterative solution of inverse problems. This makes the method very fast and tolerant of image artifacts, which do not need to be detected and masked or interpolated as in other techniques. These features make it a promising algorithm on which to base low-cost devices for applications that may have limited access to expert maintenance and operation.

  8. Comparison of digital signal-signal beat interference compensation techniques in direct-detection subcarrier modulation systems.

    PubMed

    Li, Zhe; Erkilinc, M Sezer; Galdino, Lidia; Shi, Kai; Thomsen, Benn C; Bayvel, Polina; Killey, Robert I

    2016-12-12

    Single-polarization direct-detection transceivers may offer advantages over digital coherent technology for some metro, back-haul, access and inter-data-center applications, since they provide low-cost, low-complexity solutions. However, a direct-detection receiver, being a square-law device, introduces nonlinearity upon photodetection, which results in signal distortion due to signal-signal beat interference (SSBI). Consequently, it is desirable to develop effective and low-cost SSBI compensation techniques to improve the performance of such transceivers. In this paper, we compare the performance of a number of recently proposed digital signal processing-based SSBI compensation schemes, including single- and two-stage linearization filters, an iterative linearization filter, and an SSBI estimation and cancellation technique. Their performance is assessed experimentally using a 7 × 25 Gb/s wavelength division multiplexed (WDM) single-sideband 16-QAM Nyquist-subcarrier modulation system operating at a net information spectral density of 2.3 (b/s)/Hz.

  9. A novel approach to estimate emissions from large transportation networks: Hierarchical clustering-based link-driving-schedules for EPA-MOVES using dynamic time warping measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aziz, H. M. Abdul; Ukkusuri, Satish V.

    EPA-MOVES (Motor Vehicle Emission Simulator) is often integrated with traffic simulators to assess emission levels of large-scale urban networks with signalized intersections. High variation in speed profiles exists in congested urban networks with signalized intersections. The traditional average-speed-based emission estimation technique for EPA-MOVES executes quickly but underestimates emissions in most cases because it ignores this speed variation. In contrast, the atomic second-by-second speed-profile technique (i.e., using the trajectory of each vehicle) provides accurate emissions at the cost of excessive computational power and time. We addressed this issue by developing a novel method to determine the link-driving-schedules (LDSs) for the EPA-MOVES tool. Our research developed a hierarchical clustering technique with dynamic time warping similarity measures (HC-DTW) to find LDSs for EPA-MOVES that can produce emission estimates better than the average-speed-based technique with execution time faster than the atomic speed-profile approach. We applied HC-DTW to sample data from a signalized corridor and found that it can significantly reduce computational time without compromising accuracy. The technique developed in this research can substantially contribute to the EPA-MOVES-based emission estimation process for large-scale urban transportation networks by reducing computational time while producing reasonably accurate estimates. The method is most appropriate for transportation networks with high variation in speed, such as those with signalized intersections. Experimental results show error differences ranging from 2% to 8% for most pollutants except PM10.
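
    The core of the HC-DTW idea, a dynamic-time-warping distance between speed profiles followed by agglomerative grouping, can be sketched as follows. The three toy speed profiles and the naive single-linkage merge are illustrative; they are not EPA-MOVES link-driving-schedules:

```python
import numpy as np

def dtw(a, b):
    """Dynamic-time-warping distance with absolute-difference local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = abs(a[i - 1] - b[j - 1])
            D[i, j] = c + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

t = np.linspace(0.0, 1.0, 50)
profiles = [
    30 + 10 * np.sin(2 * np.pi * t),                 # cruising with mild waves
    30 + 10 * np.sin(2 * np.pi * (t - 0.05)),        # same pattern, time-shifted
    np.where(t < 0.5, 60 * t, 30 - 55 * (t - 0.5)),  # accelerate then brake
]
d = np.array([[dtw(p, q) for q in profiles] for p in profiles])

clusters = [{0}, {1}, {2}]     # naive single-linkage agglomeration to 2 groups
while len(clusters) > 2:
    i, j = min(
        ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
        key=lambda ij: min(d[a, b] for a in clusters[ij[0]] for b in clusters[ij[1]]),
    )
    clusters[i] |= clusters.pop(j)
print(clusters)                # the two time-shifted profiles group together
```

    DTW absorbs the time shift between the two sinusoidal profiles, so they merge first; an average-speed comparison would treat all three profiles as nearly identical.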

  11. Cost-effectiveness of a distance lifestyle counselling programme among overweight employees from a company perspective, ALIFE@Work: a randomized controlled trial.

    PubMed

    Gussenhoven, A H M; van Wier, M F; Bosmans, J E; Dekkers, J C; van Mechelen, W

    2013-01-01

    The objective of this study was to determine whether a lifestyle intervention with individual counselling was cost-effective for reducing body weight compared with usual care, from a company perspective. Overweight employees were recruited and randomly assigned to one of the intervention groups, either phone or Internet, or to the control group. The intervention was based on a cognitive-behavioural approach and addressed physical activity and diet. Self-reported body weight was collected at baseline and at 12 months follow-up. Intervention costs and costs of sick-leave days, based on gross and net lost productivity days (GLPDs/NLPDs) obtained from the participating companies, were calculated. Missing data were imputed using multiple imputation techniques. Uncertainty surrounding the differences in costs and the incremental cost-effectiveness ratios (ICERs) was estimated by bootstrapping and presented on cost-effectiveness planes and cost-effectiveness acceptability curves. No statistically significant differences in total costs were found between the intervention groups and the control group, though mean total costs in both intervention groups tended to be higher than those in the control group. The ICER of the Internet group compared with the control group was €59 per kilogram of weight loss based on GLPD costs. The probability that the Internet intervention was cost-effective was 45% at a willingness to pay of €0 per extra kilogram of weight loss and 75% at a willingness to pay of €1500 per extra kilogram of weight loss. Comparable results were found for the phone intervention. The intervention was not cost-effective in comparison with usual care from the company perspective. Due to the large amount of missing data, it is not possible to draw firm conclusions.

  12. A reduced order model based on Kalman filtering for sequential data assimilation of turbulent flows

    NASA Astrophysics Data System (ADS)

    Meldi, M.; Poux, A.

    2017-10-01

    A Kalman-filter-based sequential estimator is presented in this work. The estimator is integrated into the structure of segregated solvers for the analysis of incompressible flows. This technique provides an augmented flow state that integrates available observations into the CFD model, naturally preserving a zero-divergence condition for the velocity field. Because of the prohibitive costs associated with a complete Kalman filter application, two model reduction strategies are proposed and assessed. These strategies dramatically reduce the increase in computational cost of the model, which can be quantified as an increase of 10%-15% with respect to the classical numerical simulation. In addition, an extended analysis of the behavior of the numerical model covariance Q has been performed. Optimized values are strongly linked to the truncation error of the discretization procedure. The estimator has been applied to the analysis of a number of test cases of increasing complexity, including turbulent flow configurations. The results show that the augmented flow successfully improves the prediction of the physical quantities investigated, even when the observation is provided in a limited region of the physical domain. In addition, the present work suggests that these data assimilation techniques, which are at an embryonic stage of development in CFD, may have the potential to be pushed even further, using the augmented prediction as a powerful tool for the optimization of free parameters in the numerical simulation.
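
    A drastically simplified sketch of the sequential-estimation idea: a 1D diffusion model started from the wrong initial state is nudged toward sparse, noise-free observations of a reference run using a scalar Kalman gain per observed node (a diagonal-covariance reduction). Nothing here is the paper's CFD solver; the grid and noise parameters are invented.

```python
import numpy as np

N, dt, nu = 50, 0.1, 0.05
x = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
dx = x[1] - x[0]

def step(u):                  # explicit diffusion, periodic boundaries
    return u + dt * nu * (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2

truth = np.sin(x)             # reference run standing in for the "real" flow
free = np.zeros(N)            # model started from the wrong initial state
assim = np.zeros(N)           # same wrong start, but assimilating observations
obs_idx = np.arange(0, N, 5)  # observe every 5th node only
p = np.full(obs_idx.size, 1.0)           # one scalar variance per observed node
r, q = 0.01, 1e-3                        # obs noise variance, inflation term

for _ in range(400):
    truth, free, assim = step(truth), step(free), step(assim)
    K = p / (p + r)                                  # scalar Kalman gains
    assim[obs_idx] += K * (truth[obs_idx] - assim[obs_idx])
    p = (1.0 - K) * p + q

err_free = float(np.abs(free - truth).mean())
err_assim = float(np.abs(assim - truth).mean())
print(err_assim < err_free)   # True: the augmented state tracks the reference
```

    Even with observations at only a fifth of the nodes, diffusion spreads the corrections so the whole augmented field improves, mirroring the paper's finding that observations in a limited region improve the global prediction.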

  13. Estimation of Theaflavins (TF) and Thearubigins (TR) Ratio in Black Tea Liquor Using Electronic Vision System

    NASA Astrophysics Data System (ADS)

    Akuli, Amitava; Pal, Abhra; Ghosh, Arunangshu; Bhattacharyya, Nabarun; Bandhopadhyya, Rajib; Tamuly, Pradip; Gogoi, Nagen

    2011-09-01

    The quality of black tea is generally assessed by professional tea tasters using organoleptic tests. They determine the quality of black tea based on its appearance (in dry condition and during liquor formation), aroma and taste. Variation in these parameters is contributed by a number of chemical compounds such as Theaflavins (TF), Thearubigins (TR), caffeine, linalool and geraniol. Among these, TF and TR are the most important compounds, contributing to the taste, colour and brightness of tea liquor. Estimation of TF and TR in black tea is generally done using a spectrophotometer, but this analysis requires rigorous and time-consuming sample preparation, and operating a costly spectrophotometer requires expert manpower. To overcome these problems, an Electronic Vision System based on digital image processing has been developed. The system is faster, low-cost and repeatable, and can accurately estimate the TF and TR ratio of black tea liquor. The data analysis is done using Principal Component Analysis (PCA), Multiple Linear Regression (MLR) and Multiple Discriminant Analysis (MDA). A correlation has been established between the colour of tea liquor images and the TF:TR ratio. This paper describes the newly developed E-Vision system, the experimental methods, the data analysis algorithms and, finally, the performance of the E-Vision System compared with the results of a traditional spectrophotometer.
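
    The multiple-linear-regression step, predicting a TF:TR-style ratio from mean image colour channels, can be sketched with ordinary least squares. The colour-to-ratio relation below is synthetic; a real system would use calibrated liquor images and spectrophotometric reference values.

```python
import numpy as np

rng = np.random.default_rng(3)
rgb = rng.uniform(0.2, 0.8, size=(40, 3))                # mean R, G, B per liquor image
true_w = np.array([0.9, -0.4, 0.1])                      # hypothetical colour-ratio relation
ratio = rgb @ true_w + 0.05 + rng.normal(0.0, 0.01, 40)  # "measured" TF:TR ratio

X = np.hstack([rgb, np.ones((40, 1))])                   # design matrix with intercept
w, *_ = np.linalg.lstsq(X, ratio, rcond=None)            # ordinary least squares
pred = X @ w
r2 = 1 - ((ratio - pred) ** 2).sum() / ((ratio - ratio.mean()) ** 2).sum()
print(round(float(r2), 3))   # close to 1: colour explains the simulated ratio
```

    In practice the colour features would first be reduced with PCA, and MDA would handle the discrete quality-grade classification described in the abstract.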

  14. Metrics in method engineering

    NASA Astrophysics Data System (ADS)

    Brinkkemper, S.; Rossi, M.

    1994-12-01

    As customizable computer aided software engineering (CASE) tools, or CASE shells, have been introduced in academia and industry, there has been a growing interest into the systematic construction of methods and their support environments, i.e. method engineering. To aid the method developers and method selectors in their tasks, we propose two sets of metrics, which measure the complexity of diagrammatic specification techniques on the one hand, and of complete systems development methods on the other hand. Proposed metrics provide a relatively fast and simple way to analyze the technique (or method) properties, and when accompanied with other selection criteria, can be used for estimating the cost of learning the technique and the relative complexity of a technique compared to others. To demonstrate the applicability of the proposed metrics, we have applied them to 34 techniques and 15 methods.

  15. Epidemiological and financial indicators of hypertension in older adults in Mexico: challenges for health planning and management in Latin America.

    PubMed

    Arredondo, Armando; Duarte, Maria Beatriz; Cuadra, Silvia Magali

    2017-04-01

    This study estimated the epidemiological and financial indicators of hypertension in order to identify challenges in strategic planning and management for health systems in Latin America. This is a longitudinal study with a population base of 187 326 reported cases of older adults with hypertension, diagnosed at public health institutions in Mexico. The cost-evaluation method was based on instrumentation and consensus techniques. To estimate the epidemiological changes and financial consequences for 2015-2017, time-series analyses and probabilistic models were constructed according to the Box-Jenkins technique. Regarding epidemiological changes for 2015 versus 2017, an increase of 8-12% is expected (p < 0.001). Comparing the economic impact in 2015 versus 2017 (p < 0.001), there is a 22% increase in financial requirements. The total amount estimated for hypertension in 2015 (in US dollars) was $1 575 671 330, comprising $747 527 259 in direct costs and $829 144 071 in indirect costs. If the risk factors and the different healthcare services for older adults remain as they are currently, the financial consequences of epidemiological changes in older adults will have a major impact on users' pockets and, in order of importance, on social security providers and on public assistance providers. The challenges and implications of our findings in the context of universal coverage reforms in Latin America reinforce the urgent need to develop more and better strategic planning for the prevention of chronic diseases. Copyright © 2016 John Wiley & Sons, Ltd.

  16. Economic costs of hospitalized diarrheal disease in Bangladesh: a societal perspective.

    PubMed

    Sarker, Abdur Razzaque; Sultana, Marufa; Mahumud, Rashidul Alam; Ali, Nausad; Huda, Tanvir M; Salim Uzzaman, M; Haider, Sabbir; Rahman, Hafizur; Islam, Ziaul; Khan, Jahangir A M; Van Der Meer, Robert; Morton, Alec

    2018-01-01

    Diarrheal diseases are a major threat to human health and still represent a leading cause of morbidity and mortality worldwide. Although the burden of diarrheal disease is much lower in developed countries, it is a significant public health problem in low- and middle-income countries like Bangladesh. Though diarrhea is preventable and manageable with low-cost interventions, it remains the leading cause of morbidity among patients seeking care from public hospitals in Bangladesh, indicating that significant resources are consumed in treating those patients. The aim of the study is to capture the inpatient and outpatient treatment costs of diarrheal disease and to measure the cost burden and coping mechanisms associated with diarrheal illness. This study was conducted in six randomly selected district hospitals from six divisions (larger administrative units) in Bangladesh. The study was performed from the societal perspective, meaning that all types of costs were identified, measured and valued no matter who incurred them. Costs were analyzed using the guideline proposed by the World Health Organization for estimating the economic burden of diarrheal diseases. The study adopted quantitative techniques to collect household- and hospital-level data, including structured and semi-structured questionnaires, observation checklists, analysis of hospital databases, telephone interviews and compilation of service statistics. The average total societal cost of illness per episode was BDT 5274.02 (US $67.18), whereas the average inpatient and outpatient costs were BDT 8675.09 (US $110.51) and BDT 1853.96 (US $23.62), respectively. The cost burden was significantly highest for the poorest households, at 21.45% of household income, compared with 4.21% for the richest quintile. Diarrheal diseases continue to be an overwhelming problem in Bangladesh. The economic impact of any public health intervention (preventive or promotive) that reduces the prevalence of diarrheal disease can be estimated from the data generated by this study.

  17. Remote sensing for grassland management in the arid Southwest

    USGS Publications Warehouse

    Marsett, R.C.; Qi, J.; Heilman, P.; Biedenbender, S.H.; Watson, M.C.; Amer, S.; Weltz, M.; Goodrich, D.; Marsett, R.

    2006-01-01

    We surveyed a group of rangeland managers in the Southwest about vegetation monitoring needs on grassland. Based on their responses, the objective of the RANGES (Rangeland Analysis Utilizing Geospatial Information Science) project was defined to be the accurate conversion of remotely sensed data (satellite imagery) to quantitative estimates of total (green and senescent) standing cover and biomass on grasslands and semidesert grasslands. Although remote sensing has been used to estimate green vegetation cover, in arid grasslands herbaceous vegetation is senescent much of the year and is not detected by current remote sensing techniques. We developed a ground truth protocol compatible with both range management requirements and Landsat's 30 m resolution imagery. The resulting ground-truth data were then used to develop image processing algorithms that quantified total herbaceous vegetation cover, height, and biomass. Cover was calculated based on a newly developed Soil Adjusted Total Vegetation Index (SATVI), and height and biomass were estimated based on reflectance in the near infrared (NIR) band. Comparison of the remotely sensed estimates with independent ground measurements produced r² values of 0.80, 0.85, and 0.77 and Nash-Sutcliffe values of 0.78, 0.70, and 0.77 for the cover, plant height, and biomass, respectively. The approach for estimating plant height and biomass did not work for sites where forbs comprised more than 30% of total vegetative cover. The ground reconnaissance protocol and image processing techniques together offer land managers accurate and timely methods for monitoring extensive grasslands. The time-consuming requirement to collect concurrent data in the field for each image implies a need to share the high fixed costs of processing an image across multiple users to reduce the costs for individual rangeland managers.
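
    The SATVI itself is a simple band computation. The form below is the commonly cited one with a soil adjustment factor L (typically 0.5); the reflectance values are illustrative, and the original paper should be consulted for exact band definitions before operational use.

```python
def satvi(red, swir1, swir2, L=0.5):
    """Soil Adjusted Total Vegetation Index (commonly cited form)."""
    return (swir1 - red) / (swir1 + red + L) * (1 + L) - swir2 / 2

# Illustrative Landsat-style surface reflectances for a grassland pixel
print(round(satvi(red=0.10, swir1=0.30, swir2=0.20), 3))   # 0.233
```

    The shortwave-infrared bands respond to senescent as well as green material, which is why the index captures total standing cover where NDVI-style green indices fail.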

  18. Marginal cost curves for water footprint reduction in irrigated agriculture: a policy and decision making guide for efficient water use in crop production

    NASA Astrophysics Data System (ADS)

    Chukalla, Abebe; Krol, Maarten; Hoekstra, Arjen

    2016-04-01

    Reducing water footprints (WF) in irrigated crop production is an essential element of water management, particularly in water-scarce areas. To achieve this, policy and decision making need to be supported with information from marginal cost curves that rank measures to reduce the WF according to their cost-effectiveness and enable estimation of the cost associated with a certain WF reduction target, e.g. towards a certain reasonable WF benchmark. This paper aims to develop marginal cost curves (MCCs) for WF reduction. The AquaCrop model is used to explore the effect of different measures on evapotranspiration and crop yield, and thus on the WF used as input to the MCC. Measures relate to three dimensions of management practice: irrigation technique (furrow, sprinkler, drip and subsurface drip); irrigation strategy (full and deficit irrigation); and mulching practice (no mulching, organic and synthetic mulching). A WF benchmark per crop is calculated as resulting from the best-available production technology. The marginal cost curve is plotted using the ratio of marginal cost to WF reduction of each measure as ordinate, ranked so that marginal costs rise as the reduction effort increases. For each measure, the marginal cost of reducing the WF is estimated by comparing the associated WF and net present value (NPV) to the reference case (furrow irrigation, full irrigation, no mulching). The NPV for each measure is based on its capital costs, operation and maintenance (O&M) costs and revenues. A range of cases is considered, including different crops, soil types and environments. Key words: marginal cost curve, water footprint benchmark, soil water balance, crop growth, AquaCrop
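
    Once each measure's WF reduction and marginal cost are known, assembling the curve is a matter of ranking and accumulating. The measures, costs and reductions below are invented placeholders (real values would come from AquaCrop runs, and measures interact rather than add independently):

```python
measures = [  # (name, WF reduction in m3/ha, extra annualized cost in $/ha) -- invented
    ("organic mulching",   300,  30),
    ("drip irrigation",    500, 200),
    ("deficit irrigation", 400,  60),
    ("subsurface drip",    550, 320),
]
ranked = sorted(measures, key=lambda m: m[2] / m[1])  # $ per m3 saved, ascending
target, saved, cost, chosen = 1000, 0, 0, []
for name, reduction, extra_cost in ranked:
    if saved >= target:
        break
    saved += reduction
    cost += extra_cost
    chosen.append(name)
print(chosen, saved, cost)   # cheapest set of measures reaching the target
```

    Plotting cumulative WF reduction against the per-measure cost ratio in this ranked order yields the step-shaped marginal cost curve the abstract describes.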

  19. A model-based 3D template matching technique for pose acquisition of an uncooperative space object.

    PubMed

    Opromolla, Roberto; Fasano, Giancarmine; Rufino, Giancarlo; Grassi, Michele

    2015-03-16

    This paper presents a customized three-dimensional template matching technique for autonomous pose determination of uncooperative targets. This topic is relevant to advanced space applications, like active debris removal and on-orbit servicing. The proposed technique is model-based and produces estimates of the target pose without any prior pose information, by processing three-dimensional point clouds provided by a LIDAR. These estimates are then used to initialize a pose tracking algorithm. Peculiar features of the proposed approach are the use of a reduced number of templates and the idea of building the database of templates on-line, thus significantly reducing the amount of on-board stored data with respect to traditional techniques. An algorithm variant is also introduced, aimed at further accelerating pose acquisition and reducing the computational cost. Technique performance is investigated within a realistic numerical simulation environment comprising a target model, LIDAR operation and various target-chaser relative dynamics scenarios relevant to close-proximity flight operations. Specifically, the capability of the proposed techniques to provide a pose solution suitable for initializing the tracking algorithm is demonstrated, as well as their robustness against the highly variable pose conditions determined by the relative dynamics. Finally, a criterion for autonomous failure detection of the presented techniques is introduced.

  20. An economic evaluation of adaptive e-learning devices to promote weight loss via dietary change for people with obesity.

    PubMed

    Miners, Alec; Harris, Jody; Felix, Lambert; Murray, Elizabeth; Michie, Susan; Edwards, Phil

    2012-07-07

    The prevalence of obesity is over 25% in many developed countries. Obesity is strongly associated with an increased risk of fatal and chronic conditions such as cardiovascular disease and type 2 diabetes, and has therefore become a major public health concern for many economies. E-learning devices are a relatively novel approach to promoting dietary change. The new generation of devices are 'adaptive' and use interactive electronic media to facilitate teaching and learning. E-learning has grown out of recent developments in information and communication technology, such as the Internet, interactive computer programmes, interactive television and mobile phones. The aim of this study is to assess the cost-effectiveness of e-learning devices as a method of promoting weight loss via dietary change. An economic evaluation was performed using decision modelling techniques. Outcomes were expressed in terms of Quality-Adjusted Life-Years (QALYs) and costs were estimated from a health services perspective. All parameter estimates were derived from the literature, and a systematic review was undertaken to derive the estimate of relative treatment effect. The base case results from the e-Learning Economic Evaluation Model (e-LEEM) suggested that the incremental cost-effectiveness ratio was approximately £102,000 per QALY compared with conventional care. This finding was robust to most alternative assumptions, except a much lower fixed cost of providing e-learning devices. Expected value of perfect information (EVPI) analysis showed that while the individual-level EVPI was arguably negligible, the population-level value was between £37 million and £170 million at a willingness to pay between £20,000 and £30,000 per additional QALY. The current economic evidence base suggests that e-learning devices for managing the weight of obese individuals are unlikely to be cost-effective unless their fixed costs are much lower than estimated or future devices prove much more effective.
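
    The ICER reported above is simply the cost difference divided by the QALY difference. A minimal sketch with invented numbers of the same order of magnitude as the study's result:

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return delta_cost / delta_qaly

# e.g. an intervention costing GBP 1020 more and yielding 0.01 extra QALYs
print(round(icer(1020.0, 0.01)))   # 102000
```

    An intervention is conventionally judged cost-effective when its ICER falls below the decision-maker's willingness-to-pay threshold, here £20,000 to £30,000 per QALY.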

  1. Cost-effectiveness analysis of neonatal hearing screening program in China: should universal screening be prioritized?

    PubMed

    Huang, Li-Hui; Zhang, Luo; Tobe, Ruo-Yan Gai; Qi, Fang-Hua; Sun, Long; Teng, Yue; Ke, Qing-Lin; Mai, Fei; Zhang, Xue-Feng; Zhang, Mei; Yang, Ru-Lan; Tu, Lin; Li, Hong-Hui; Gu, Yan-Qing; Xu, Sai-Nan; Yue, Xiao-Yan; Li, Xiao-Dong; Qi, Bei-Er; Cheng, Xiao-Huan; Tang, Wei; Xu, Ling-Zhong; Han, De-Min

    2012-04-17

    Neonatal hearing screening (NHS) is routinely offered as a vital component of early childhood care in developed countries, whereas such screening programs are still at the pilot or preliminary stage of nationwide implementation in developing countries. To provide evidence for health policy making in China, this study aims to determine the cost-effectiveness of NHS program implementation in eight provinces of China. A cost-effectiveness model was constructed, and all neonates born annually from 2007 to 2009 in the eight provinces were simulated in this model. The model parameters were estimated from established databases in the general hospitals or maternal and child health hospitals of these eight provinces, supplemented by the published literature. The model estimated changes in program implementation costs, disability-adjusted life years (DALYs), the average cost-effectiveness ratio (ACER), and the incremental cost-effectiveness ratio (ICER) for universal screening compared with targeted screening in the eight provinces. A multivariate sensitivity analysis was performed to determine uncertainty in health-effect estimates and cost-effectiveness ratios using a probabilistic modeling technique. The targeted strategy tended to be cost-effective in Guangxi, Jiangxi, Henan, Guangdong, Zhejiang, Hebei, Shandong, and Beijing from levels of 9%, 9%, 8%, 4%, 3%, 7%, 5%, and 2%, respectively, while the universal strategy tended to be cost-effective in those provinces from levels of 70%, 70%, 48%, 10%, 8%, 28%, 15%, and 4%, respectively. Although there was a huge disparity in the implementation of the NHS program across the surveyed provinces, both strategies showed cost-effectiveness in the relatively developed provinces, while neither screening strategy was cost-effective in the relatively developing provinces. Both strategies, and especially universal screening, achieve good economic effects in terms of long-term costs. Universal screening might be considered the prioritized implementation goal in the relatively developed provinces of China, as it provides the best health and economic effects, while targeted screening might temporarily be more realistic than universal screening in the relatively developing provinces.

  2. Arterial Mechanical Motion Estimation Based on a Semi-Rigid Body Deformation Approach

    PubMed Central

    Guzman, Pablo; Hamarneh, Ghassan; Ros, Rafael; Ros, Eduardo

    2014-01-01

    Arterial motion estimation in ultrasound (US) sequences is a hard task due to noise and discontinuities in the signal derived from US artifacts. Characterizing the mechanical properties of the artery is a promising novel imaging technique to diagnose various cardiovascular pathologies and a new way of obtaining relevant clinical information, such as determining the absence of the dicrotic peak and estimating the Augmentation Index (AIx), the arterial pressure, or the arterial stiffness. One of the advantages of US imaging is its non-invasive nature, unlike invasive techniques such as intravascular ultrasound (IVUS) or angiography, plus the relatively low cost of US units. In this paper, we propose a semi-rigid deformable method based on soft-body dynamics, realized by a hybrid motion approach combining cross-correlation and optical flow methods, to quantify the elasticity of the artery. We evaluate and compare the different techniques (for instance, optical flow methods) on which our approach is based. The goal of this comparative study is to identify the best model to use and the impact of the accuracy of these different stages on the proposed method. To this end, an exhaustive assessment was conducted to decide which model is the most appropriate for registering the variation of the arterial diameter over time. Our experiments involved a total of 1620 evaluations within nine simulated sequences of 84 frames each and the estimation of four error metrics. We conclude that our proposed approach obtains approximately 2.5 times higher accuracy than conventional state-of-the-art techniques. PMID:24871987
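    The cross-correlation stage of such a hybrid motion estimator can be illustrated with a minimal 1D displacement search; the synthetic signals below are illustrative only, not the authors' implementation:

    ```python
    import numpy as np

    def estimate_shift(reference, frame):
        """Estimate the integer displacement between two 1D signals
        by locating the peak of their cross-correlation."""
        corr = np.correlate(frame, reference, mode="full")
        # Re-center the peak index so that 0 means no displacement.
        return int(np.argmax(corr)) - (len(reference) - 1)

    # Synthetic example: a Gaussian "wall echo" shifted by 5 samples.
    x = np.arange(200)
    reference = np.exp(-0.5 * ((x - 100) / 4.0) ** 2)
    frame = np.exp(-0.5 * ((x - 105) / 4.0) ** 2)

    print(estimate_shift(reference, frame))  # → 5
    ```

    In a 2D US sequence the same idea is applied block-wise, and the coarse cross-correlation displacement is then refined by optical flow.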

  3. Mental Models of Software Forecasting

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Griesel, A.; Bruno, K.; Fouser, T.; Tausworthe, R.

    1993-01-01

    The majority of software engineers resist the use of currently available cost models. One problem is that the mathematical and statistical models currently available do not correspond with the mental models of the software engineers. An earlier JPL-funded study (Hihn and Habib-agahi, 1991) found that software engineers prefer to use analogical or analogy-like techniques to derive size and cost estimates, whereas current cost estimation relationships (CERs) hide any analogy inside the regression equations. In addition, the currently available models depend upon information that is not available during early planning, when the most important forecasts must be made.

  4. FOCIS: A forest classification and inventory system using LANDSAT and digital terrain data

    NASA Technical Reports Server (NTRS)

    Strahler, A. H.; Franklin, J.; Woodcook, C. E.; Logan, T. L.

    1981-01-01

    Accurate, cost-effective stratification of forest vegetation and timber inventory is the primary goal of the Forest Classification and Inventory System (FOCIS). Conventional timber stratification using photointerpretation can be time-consuming, costly, and inconsistent from analyst to analyst. FOCIS was designed to overcome these problems by using machine processing techniques to extract and process tonal, textural, and terrain information from registered LANDSAT multispectral and digital terrain data. Comparison of samples from timber strata identified by FOCIS and by conventional procedures showed that both have about the same potential to reduce the variance of timber volume estimates relative to simple random sampling.

  5. Carbon footprint and cost-effectiveness of cataract surgery.

    PubMed

    Venkatesh, Rengaraj; van Landingham, Suzanne W; Khodifad, Ashish M; Haripriya, Aravind; Thiel, Cassandra L; Ramulu, Pradeep; Robin, Alan L

    2016-01-01

    This article raises awareness about the cost-effectiveness and carbon footprint of various cataract surgery techniques, comparing the relative carbon emissions and expenses of manual small-incision cataract surgery (MSICS), phacoemulsification, and femtosecond laser-assisted cataract surgery. As the most commonly performed surgical procedure worldwide, cataract surgery contributes significantly to global climate change. The carbon footprint of a single phacoemulsification cataract surgery is estimated to be comparable to that of a typical person's daily life over one week. Phacoemulsification has been estimated to be between 1.4 and 4.7 times more expensive than MSICS; however, given the lower degree of postoperative astigmatism and other potential complications, phacoemulsification may still be preferable to MSICS in relatively resource-rich settings requiring high levels of visual function. Limited data are currently available regarding the environmental and financial impact of femtosecond laser-assisted cataract surgery; however, in its current form, it appears to be the least cost-effective option. Cataract surgery has high value to patients. The relative environmental impact and cost of different types of cataract surgery should be considered as this treatment becomes even more broadly available globally and as new technologies are developed and implemented.

  6. Technology management: a perspective on system support, procurement, and replacement planning.

    PubMed

    Dickerson, M L; Jackson, M E

    1992-01-01

    The escalating costs associated with medical technology present a host of challenges for the hospital clinical engineering department. As service and support costs comprise ever larger portions of a system's life cycle cost, innovative management of service provider mix and mechanisms can provide substantial savings in operating expenses. In addition to full-service contracts, the use of demand service and independents has become commonplace. Medical equipment maintenance insurance programs provide yet another service alternative, combining the flexibility of demand service with the safety of a capped budget. These programs have gained acceptance among hospitals as their providers have become more focused on the healthcare market and its many needs. In view of the long-term cost impact surrounding technology procurement, the authors recommend that hospitals refine system evaluation methodologies and develop more comprehensive techniques directed at capital equipment replacement planning. One replacement planning approach, based on an estimation of system value changes, is described and illustrated using data collected through client consultations. Although the validity of this method has not been demonstrated, it represents a simplified approach to life cycle cost analysis and is intended to provide a standard method by which system replacement planning may be quantified. As a departure from system devaluation based solely on depreciation, this method estimates prospective system values derived from anticipated operations and maintenance costs, projected revenue, and the availability of new technology.

  7. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. 
Next, a quick review of cost estimation techniques is made with the intention of highlighting their inappropriateness for what is really needed at the conceptual phase of the design process. The First-Order Process Velocity Cost Model (FOPV) is discussed at length in the next section. This is followed by an application of the FOPV cost model to a generic wing. For designs that have no precedent as far as acquisition costs are concerned, cost data derived from the FOPV cost model may not be accurate enough because of new requirements for shape complexity, material, equipment, and precision/tolerance. The concept of a cost modulus is introduced at this point to compensate for these new burdens on the basic processes. This is treated in section 5. The cost of a design must be conveniently linked to its CAD representation. The interfacing of CAD models and spreadsheets containing the cost equations is the subject of section 6. The last section of the report summarizes the progress made so far and the research work anticipated in the future.
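The elemental cost roll-up with cost moduli described above can be sketched as follows; the process rates, element masses, labor rate, and modulus values are hypothetical placeholders, not the FOPV model's actual calibration:

```python
# Sketch of a process-based cost roll-up with complexity "cost moduli".
# All rates and moduli below are hypothetical illustrations; the actual
# FOPV calibration is not reproduced here.

def element_cost(base_hours_per_kg, mass_kg, cost_moduli, labor_rate=100.0):
    """Cost of one structural element: base process hours scaled by
    multiplicative moduli for material, shape, and tolerance."""
    modulus = 1.0
    for factor in cost_moduli.values():
        modulus *= factor
    return base_hours_per_kg * mass_kg * modulus * labor_rate

wing_elements = [
    # (base hours/kg, mass kg, moduli)
    (0.8, 450, {"material": 1.3, "shape": 1.1, "tolerance": 1.0}),  # skins
    (1.2, 220, {"material": 1.3, "shape": 1.4, "tolerance": 1.2}),  # spars
    (1.0, 130, {"material": 1.0, "shape": 1.2, "tolerance": 1.1}),  # ribs
]

total = sum(element_cost(h, m, mod) for h, m, mod in wing_elements)
print(round(total))  # total production cost estimate for this configuration
```

Because each element is costed from its own geometry and moduli, changing one design parameter reprices only the affected elements, which is what makes the seamless spreadsheet implementation described above possible.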

  8. Using Pharmacoeconomic Modelling to Determine Value-Based Pricing for New Pharmaceuticals in Malaysia

    PubMed Central

    Dranitsaris, George; Truter, Ilse; Lubbe, Martie S; Sriramanakoppa, Nitin N; Mendonca, Vivian M; Mahagaonkar, Sangameshwar B

    2011-01-01

    Background: Decision analysis (DA) is commonly used to perform economic evaluations of new pharmaceuticals. Using multiples of Malaysia’s per capita 2010 gross domestic product (GDP) as the threshold for economic value as suggested by the World Health Organization (WHO), DA was used to estimate a price per dose for bevacizumab, a drug that provides a 1.4-month survival benefit in patients with metastatic colorectal cancer (mCRC). Methods: A decision model was developed to simulate progression-free and overall survival in mCRC patients receiving chemotherapy with and without bevacizumab. Costs for chemotherapy and management of side effects were obtained from public and private hospitals in Malaysia. Utility estimates, measured as quality-adjusted life years (QALYs), were determined by interviewing 24 oncology nurses using the time trade-off technique. The price per dose was then estimated using a target threshold of US$44 400 per QALY gained, which is 3 times the Malaysian per capita GDP. Results: A cost-effective price for bevacizumab could not be determined because the survival benefit provided was insufficient. According to the WHO criteria, if the drug were able to improve survival from 1.4 to 3 or 6 months, the price per dose would be $567 and $1258, respectively. Conclusion: The use of decision modelling for estimating drug pricing is a powerful technique to ensure value for money. Such information is of value to drug manufacturers and formulary committees because it facilitates negotiations for value-based pricing in a given jurisdiction. PMID:22589671
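    The value-based pricing arithmetic can be sketched as a back-calculation from the WHO-style threshold: the QALY gain valued at the threshold, minus other incremental costs, spread over the doses given. The US$44,400 threshold comes from the study; the QALY gain, ancillary cost, and dose count below are hypothetical placeholders, not outputs of the paper's decision model:

    ```python
    # Simplified value-based price back-calculation. This is NOT the paper's
    # decision model: qaly_gain, other_incremental_cost, and n_doses are
    # hypothetical values used only to illustrate the arithmetic.
    def max_price_per_dose(threshold, qaly_gain, other_incremental_cost, n_doses):
        """Highest per-dose price at which the ICER stays at or below the threshold."""
        headroom = threshold * qaly_gain - other_incremental_cost
        return max(headroom, 0.0) / n_doses

    # WHO-style threshold of 3x per capita GDP (US$44,400, from the study).
    price = max_price_per_dose(threshold=44_400,
                               qaly_gain=0.25,                 # hypothetical
                               other_incremental_cost=2_000,   # hypothetical
                               n_doses=12)                     # hypothetical
    print(round(price, 2))  # → 758.33
    ```

    When the valued health gain does not cover the ancillary costs, the headroom is negative and no positive price is cost-effective, which mirrors the study's finding for the 1.4-month benefit.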

  9. Risk aversion and uncertainty in cost-effectiveness analysis: the expected-utility, moment-generating function approach.

    PubMed

    Elbasha, Elamin H

    2005-05-01

    The availability of patient-level data from clinical trials has spurred considerable interest in developing methods for quantifying and presenting uncertainty in cost-effectiveness analysis (CEA). Although the majority of this work has focused on methods for using sample data to estimate a confidence interval for an incremental cost-effectiveness ratio (ICER), a small strand of the literature has emphasized the importance of incorporating risk preferences and the trade-off between the mean and the variance of returns to investment in health and medicine (mean-variance analysis). This paper shows how the exponential utility-moment-generating function approach is a natural extension to this branch of the literature for modelling choices among healthcare interventions with uncertain costs and effects. The paper assumes an exponential utility function, which implies constant absolute risk aversion, and is based on the fact that the expected value of this function results in a convenient expression that depends only on the moment-generating function of the random variables. The mean-variance approach is shown to be a special case of this more general framework. The paper characterizes the solution to the resource allocation problem using standard optimization techniques and derives the summary measure researchers need to estimate for each programme when the assumption of risk neutrality does not hold, and compares it to the standard incremental cost-effectiveness ratio. The importance of choosing the correct distribution of costs and effects and the issues related to estimation of the parameters of the distribution are also discussed. An empirical example to illustrate the methods and concepts is provided. Copyright 2004 John Wiley & Sons, Ltd.
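    For a normally distributed net benefit, this machinery reduces to a closed form: with u(x) = -exp(-rx), E[u(X)] = -M_X(-r), and inverting the utility gives a certainty equivalent of mu - r*sigma^2/2, recovering mean-variance analysis as a special case. A short numerical check with illustrative values:

    ```python
    import math

    def expected_exp_utility_normal(mu, sigma, r):
        """E[-exp(-r X)] for X ~ N(mu, sigma^2), via the MGF
        M_X(t) = exp(mu*t + sigma^2 * t^2 / 2) evaluated at t = -r."""
        return -math.exp(-r * mu + 0.5 * (sigma ** 2) * (r ** 2))

    def certainty_equivalent(mu, sigma, r):
        """Invert the utility: CE = -ln(-E[u(X)]) / r."""
        return -math.log(-expected_exp_utility_normal(mu, sigma, r)) / r

    mu, sigma, r = 1000.0, 300.0, 0.002  # illustrative net benefit and risk aversion
    ce = certainty_equivalent(mu, sigma, r)
    print(round(ce, 2))                  # → 910.0  (= mu - r*sigma**2/2)
    ```

    The risk-averse decision maker thus values the uncertain programme at 910 rather than its mean of 1000; setting r to 0 recovers risk neutrality.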

  10. Estimated cost of overactive bladder in Thailand.

    PubMed

    Prasopsanti, Kriangsak; Santi-Ngamkun, Apirak; Pornprasit, Kanokwan

    2007-11-01

    To estimate the annual direct and indirect costs of overactive bladder (OAB) in indigenous Thai people aged 18 years and over in the year 2005. Economically based models using diagnostic and treatment algorithms from clinical practice guidelines and current disease prevalence data were used to estimate the direct and indirect costs of OAB. Prevalence and event probability estimates were obtained from the literature, national data sets, and expert opinion. Costs were estimated from a small survey using a cost questionnaire and from unit costs of King Chulalongkorn Memorial Hospital. The annual cost of OAB in Thailand is estimated at 1.9 billion USD, consuming an estimated 1.14% of national GDP. The cost includes 0.33 billion USD for direct medical costs, 1.3 billion USD for direct nonmedical costs, and 0.29 billion USD for indirect costs of lost productivity. The largest cost category was direct treatment costs of comorbidities associated with OAB. Costs of OAB medication accounted for 14% of the total costs of OAB.
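    The reported cost components can be tallied directly from the abstract's figures (billions of USD):

    ```python
    # Tallying the cost-of-illness components reported in the abstract (billion USD).
    direct_medical = 0.33
    direct_nonmedical = 1.30
    indirect = 0.29

    total = direct_medical + direct_nonmedical + indirect
    medication = 0.14 * total  # OAB medication: 14% of the total cost

    print(round(total, 2))       # → 1.92  (reported rounded to 1.9 billion USD)
    print(round(medication, 2))  # → 0.27
    ```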

  11. Implications of sampling design and sample size for national carbon accounting systems.

    PubMed

    Köhl, Michael; Lister, Andrew; Scott, Charles T; Baldauf, Thomas; Plugge, Daniel

    2011-11-08

    Countries willing to adopt a REDD regime need to establish a national Measurement, Reporting and Verification (MRV) system that provides information on forest carbon stocks and carbon stock changes. Due to the extensive areas covered by forests, this information is generally obtained by sample-based surveys. Most operational sampling approaches utilize a combination of earth-observation data and in-situ field assessments as data sources. We compared the cost-efficiency of four sampling design alternatives (simple random sampling, regression estimators, stratified sampling, and 2-phase sampling with regression estimators) that have been proposed in the scope of REDD. Three of the design alternatives combine in-situ and earth-observation data. The percent standard error per total survey cost was calculated under different settings of remote sensing coverage, cost per field plot, cost of remote sensing imagery, correlation between attributes quantified in remote sensing and field data, and population variability. The cost-efficiency of forest carbon stock assessments is driven by the sampling design chosen. Our results indicate that the cost of remote sensing imagery is decisive for the cost-efficiency of a sampling design. The variability of the sample population impairs cost-efficiency but does not reverse the pattern of cost-efficiency among the individual design alternatives. Our results clearly indicate that it is important to consider cost-efficiency in the development of forest carbon stock assessments and in the selection of remote sensing techniques. The development of MRV systems for REDD needs to be based on a sound optimization process that compares different data sources and sampling designs with respect to their cost-efficiency. This helps to reduce the uncertainties related to the quantification of carbon stocks and to increase the financial benefits of adopting a REDD regime.
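    The kind of comparison reported above can be illustrated with textbook survey-sampling variance formulas; the population parameters, costs, and correlation below are hypothetical, and finite-population corrections are ignored:

    ```python
    import math

    def pct_standard_error(var_of_mean, mean):
        """Percent standard error of an estimated mean."""
        return 100.0 * math.sqrt(var_of_mean) / mean

    def srs_design(S2, mean, n, cost_per_plot):
        """Simple random sampling: V(ybar) = S^2 / n (no fpc).
        Returns (percent SE, total survey cost)."""
        return pct_standard_error(S2 / n, mean), n * cost_per_plot

    def regression_design(S2, mean, n, cost_per_plot, rho, imagery_cost):
        """Regression estimator with remote-sensing auxiliary data:
        V(ybar_reg) ~ S^2 (1 - rho^2) / n, plus a fixed imagery cost."""
        se = pct_standard_error(S2 * (1.0 - rho ** 2) / n, mean)
        return se, n * cost_per_plot + imagery_cost

    # Hypothetical population: mean carbon stock 120 t/ha, variance S^2 = 1600.
    se_srs, cost_srs = srs_design(1600.0, 120.0, n=400, cost_per_plot=500)
    se_reg, cost_reg = regression_design(1600.0, 120.0, n=400, cost_per_plot=500,
                                         rho=0.8, imagery_cost=50_000)
    print(round(se_srs, 2), cost_srs)  # → 1.67 200000
    print(round(se_reg, 2), cost_reg)  # → 1.0 250000
    ```

    With these made-up inputs, the imagery buys a lower standard error at a higher cost; whether the trade pays off depends on exactly the drivers named in the abstract: imagery cost, field plot cost, and the correlation rho.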

  12. The potential of imaging techniques as a screening tool for colorectal cancer: a cost-effectiveness analysis.

    PubMed

    Greuter, Marjolein J E; Berkhof, Johannes; Fijneman, Remond J A; Demirel, Erhan; Lew, Jie-Bin; Meijer, Gerrit A; Stoker, Jaap; Coupé, Veerle M H

    2016-07-01

    Imaging may be promising for colorectal cancer (CRC) screening, since it has test characteristics comparable with colonoscopy but is less invasive. We aimed to assess the potential of CT colonography (CTC) and MR colonography (MRC) in terms of (cost-)effectiveness using the Adenoma and Serrated pathway to Colorectal CAncer model. We compared several CTC and MRC strategies with 5- or 10-yearly screening intervals against no screening, 10-yearly colonoscopy screening, and biennial faecal immunochemical test (FIT) screening. We assumed trial-based participation rates in the base-case analyses and varied the rates in sensitivity analyses. Incremental lifetime costs and health effects were estimated from a healthcare perspective. The health gain of CTC and MRC was similar and ranged from 0.031 to 0.048 life-years gained (LYG) compared with no screening, for 2-5 screening rounds. Lifetime costs per person for MRC strategies were €60-110 higher than those for CTC strategies with an equal number of screening rounds. All imaging-based strategies were cost-effective compared with no screening. FIT screening was the dominant screening strategy, leading to the most LYG and the highest cost-savings. Compared with three rounds of colonoscopy screening, CTC with five rounds was found to be cost-effective in an incremental analysis of imaging strategies. Assumptions on screening participation have a major influence on the ordering of strategies in terms of costs and effects. CTC and MRC have potential for CRC screening, compared with no screening and with three rounds of 10-yearly colonoscopy screening. When taking FIT screening as the reference, imaging is not cost-effective. Participation is an important driver of effectiveness and cost estimates. This is the first study to assess the cost-effectiveness of MRC screening for CRC.
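    An incremental analysis of this kind typically removes dominated strategies and computes ICERs along the remaining efficiency frontier. A generic sketch follows; the cost/effect pairs are hypothetical stand-ins (chosen so that FIT dominates, as in the study), and extended dominance is omitted for brevity:

    ```python
    def efficiency_frontier(strategies):
        """Keep only non-dominated strategies: a strategy is dominated when some
        other option is no more costly and at least as effective. Extended
        dominance (against blends of strategies) is omitted for brevity.
        strategies: list of (name, cost, effect) tuples."""
        keep = []
        for s in strategies:
            dominated = any(o is not s and o[1] <= s[1] and o[2] >= s[2]
                            for o in strategies)
            if not dominated:
                keep.append(s)
        return sorted(keep, key=lambda s: s[2])

    def icers(frontier):
        """ICER of each frontier strategy versus the next-less-effective one."""
        return {n1: (c1 - c0) / (e1 - e0)
                for (n0, c0, e0), (n1, c1, e1) in zip(frontier, frontier[1:])}

    # Hypothetical per-person lifetime costs (EUR) and life-years gained.
    strategies = [
        ("no screening",  0,    0.000),
        ("CTC 5 rounds",  400,  0.045),
        ("MRC 5 rounds",  500,  0.045),   # costlier than CTC, equal effect: dominated
        ("FIT biennial", -100,  0.050),   # cost-saving and most effective: dominant
    ]

    print([s[0] for s in efficiency_frontier(strategies)])  # → ['FIT biennial']

    # Among imaging strategies alone, CTC's ICER versus no screening:
    imaging_only = [s for s in strategies if s[0] != "FIT biennial"]
    print(icers(efficiency_frontier(imaging_only)))  # ≈ 8889 EUR per life-year gained
    ```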

  13. Joint OSNR monitoring and modulation format identification in digital coherent receivers using deep neural networks.

    PubMed

    Khan, Faisal Nadeem; Zhong, Kangping; Zhou, Xian; Al-Arashi, Waled Hussein; Yu, Changyuan; Lu, Chao; Lau, Alan Pak Tao

    2017-07-24

    We experimentally demonstrate the use of deep neural networks (DNNs) in combination with signals' amplitude histograms (AHs) for simultaneous optical signal-to-noise ratio (OSNR) monitoring and modulation format identification (MFI) in digital coherent receivers. The proposed technique automatically extracts OSNR and modulation format dependent features of AHs, obtained after constant modulus algorithm (CMA) equalization, and exploits them for the joint estimation of these parameters. Experimental results for 112 Gbps polarization-multiplexed (PM) quadrature phase-shift keying (QPSK), 112 Gbps PM 16 quadrature amplitude modulation (16-QAM), and 240 Gbps PM 64-QAM signals demonstrate OSNR monitoring with mean estimation errors of 1.2 dB, 0.4 dB, and 1 dB, respectively. Similarly, the results for MFI show 100% identification accuracy for all three modulation formats. The proposed technique applies deep machine learning algorithms inside a standard digital coherent receiver and does not require any additional hardware. Therefore, it is attractive for cost-effective multi-parameter estimation in next-generation elastic optical networks (EONs).
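    The amplitude-histogram features that feed such a DNN can be computed in a few lines; the bin count and the synthetic noisy QPSK signal below are illustrative assumptions, not the paper's configuration:

    ```python
    import numpy as np

    def amplitude_histogram(symbols, n_bins=80):
        """Normalized histogram of signal amplitudes, usable as a DNN input vector."""
        amplitudes = np.abs(symbols)
        hist, _ = np.histogram(amplitudes, bins=n_bins,
                               range=(0.0, amplitudes.max()))
        return hist / hist.sum()

    # Synthetic noisy QPSK: unit-amplitude symbols plus complex Gaussian noise.
    rng = np.random.default_rng(0)
    bits = rng.integers(0, 4, size=10_000)
    symbols = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))
    noisy = symbols + 0.05 * (rng.standard_normal(10_000)
                              + 1j * rng.standard_normal(10_000))

    features = amplitude_histogram(noisy)
    print(features.shape, round(float(features.sum()), 6))  # → (80,) 1.0
    ```

    The histogram's spread widens as OSNR degrades and its mode structure changes with the modulation format, which is what lets a single network estimate both quantities from the same feature vector.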

  14. Economics of polysilicon processes

    NASA Technical Reports Server (NTRS)

    Yaws, C. L.; Li, K. Y.; Chou, S. M.

    1986-01-01

    Techniques are being developed to provide lower-cost polysilicon material for solar cells. Existing technology, which normally provides semiconductor-grade polysilicon, is undergoing changes and is also being used to provide polysilicon material for solar cells. The economics of new and existing technologies for producing polysilicon are presented. The economics are primarily based on the preliminary process design of a plant producing 1,000 metric tons/year of silicon. The polysilicon processes include: the Siemens process (hydrogen reduction of trichlorosilane); the Union Carbide process (silane decomposition); and the Hemlock Semiconductor process (hydrogen reduction of dichlorosilane). The economics include cost estimates of capital investment and product cost to produce polysilicon via each technology. Sensitivity analysis results are also presented to disclose the effect of major parameters such as utilities, labor, raw materials, and capital investment.

  15. Watching Grass - a Pilot Study on the Suitability of Photogrammetric Techniques for Quantifying Change in Aboveground Biomass in Grassland Experiments

    NASA Astrophysics Data System (ADS)

    Kröhnert, M.; Anderson, R.; Bumberger, J.; Dietrich, P.; Harpole, W. S.; Maas, H.-G.

    2018-05-01

    Grassland ecology experiments in remote locations requiring quantitative analysis of the biomass in defined plots are becoming increasingly widespread but are still limited by manual sampling methodologies. To provide a cost-effective automated solution for biomass determination, several photogrammetric techniques are examined for generating 3D point cloud representations of plots as a basis for estimating aboveground biomass, a key ecosystem variable used in many experiments. Methods investigated include Structure from Motion (SfM) techniques for camera pose estimation with subsequent dense matching, as well as the use of a Time-of-Flight (TOF) 3D camera, a laser light sheet triangulation system, and a coded light projection system. In this context, plants at small (herbage) and medium scales are observed. In the first pilot study presented here, the best results are obtained by applying dense matching after SfM, which is ideal for integration into distributed experiment networks.

  16. A perioperative cost analysis comparing single-level minimally invasive and open transforaminal lumbar interbody fusion.

    PubMed

    Singh, Kern; Nandyala, Sreeharsha V; Marquez-Lara, Alejandro; Fineberg, Steven J; Oglesby, Mathew; Pelton, Miguel A; Andersson, Gunnar B; Isayeva, Darya; Jegier, Briana J; Phillips, Frank M

    2014-08-01

    Emerging literature suggests superior clinical short- and long-term outcomes of MIS (minimally invasive surgery) TLIFs (transforaminal lumbar interbody fusion) versus open fusions. Few studies to date have analyzed the cost differences between the two techniques and their relationship to acute clinical outcomes. The purpose of the study was to determine the differences in hospitalization costs and payments for patients treated with primary single-level MIS versus open TLIF. The impact of clinical outcomes and their contribution to financial differences was explored as well. This study was a nonrandomized, nonblinded prospective review. Sixty-six consecutive patients undergoing a single-level TLIF (open/MIS) were analyzed (33 open, 33 MIS). Patients in either cohort (MIS/open) were matched based on race, sex, age, smoking status, medical comorbidities (Charlson Comorbidity Index), payer, and diagnosis. Every patient in the study had a diagnosis of either degenerative disc disease or spondylolisthesis and stenosis. Operative time (minutes), length of stay (LOS, days), estimated blood loss (EBL, mL), anesthesia time (minutes), Visual Analog Scale (VAS) scores, and hospital cost/payment amount were assessed. The MIS and open TLIF groups were compared based on clinical outcome measures and hospital cost/payment data using SPSS version 20.0 for statistical analysis. The two groups were compared using bivariate chi-squared analysis. Mann-Whitney tests were used for non-normally distributed data. Effect sizes were estimated with the Cohen d statistic and the r statistic, with a 95% confidence interval. Average surgical time was shorter for the MIS than the open TLIF group (115.8 minutes vs. 186.0 minutes, respectively; p=.001). Length of stay was also reduced for the MIS versus the open group (2.3 days vs. 2.9 days, respectively; p=.018). Average anesthesia time and EBL were also lower in the MIS group (p<.001).
VAS scores decreased for both groups, although these scores were significantly lower for the MIS group (p<.001). Financial analysis demonstrated lower total hospital direct costs (blood, imaging, implant, laboratory, pharmacy, physical therapy/occupational therapy/speech, room and board) in the MIS versus the open group ($19,512 vs. $23,550, p<.001). Implant costs were similar (p=.686) in both groups, although these accounted for about two-thirds of the hospital direct costs in the MIS cohort ($13,764) and half of these costs ($13,778) in the open group. Hospital payments were $6,248 higher for open TLIF patients compared with the MIS group (p=.267). MIS TLIF technique demonstrated significant reductions of operative time, LOS, anesthesia time, VAS scores, and EBL compared with the open technique. This reduction in perioperative parameters translated into lower total hospital costs over a 60-day perioperative period. Although hospital reimbursements appear higher in the open group over the MIS group, shorter surgical times and LOS days in the MIS technique provide opportunities for hospitals to reduce utilization of resources and to increase surgical case volume. Copyright © 2014 Elsevier Inc. All rights reserved.
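The Cohen d effect size used in this comparison is the difference in group means scaled by the pooled standard deviation; a minimal sketch with hypothetical surgical-time samples (the study reports only group means, so the data below are illustrative):

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d effect size using the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a = sum(group_a) / n_a
    mean_b = sum(group_b) / n_b
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (n_a - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (n_b - 1)
    pooled_sd = math.sqrt(((n_a - 1) * var_a + (n_b - 1) * var_b)
                          / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical surgical times (minutes) for two small cohorts.
open_times = [180, 190, 175, 200, 185]
mis_times = [115, 120, 110, 118, 112]
print(round(cohens_d(open_times, mis_times), 2))
```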

  17. Optimal estimation and scheduling in aquifer management using the rapid feedback control method

    NASA Astrophysics Data System (ADS)

    Ghorbanidehno, Hojat; Kokkinaki, Amalia; Kitanidis, Peter K.; Darve, Eric

    2017-12-01

    Management of water resources systems often involves a large number of parameters, as in the case of large, spatially heterogeneous aquifers, and a large number of "noisy" observations, as in the case of pressure observations in wells. Optimizing the operation of such systems requires both searching among many possible solutions and utilizing new information as it becomes available. However, the computational cost of this task increases rapidly with the size of the problem, to the extent that textbook optimization methods are practically impossible to apply. In this paper, we present a new computationally efficient technique as a practical alternative for optimally operating large-scale dynamical systems. The proposed method, which we term the Rapid Feedback Controller (RFC), provides a practical approach for combined monitoring, parameter estimation, uncertainty quantification, and optimal control for linear and nonlinear systems with a quadratic cost function. For illustration, we consider the case of a weakly nonlinear uncertain dynamical system with a quadratic objective function, specifically a two-dimensional heterogeneous aquifer management problem. To validate our method, we compare our results with the linear quadratic Gaussian (LQG) method, which is the basic approach for feedback control. We show that the computational cost of the RFC scales only linearly with the number of unknowns, a great improvement over the basic LQG control, whose computational cost scales quadratically. We demonstrate that the RFC method can obtain the optimal control values at a greatly reduced computational cost compared to the conventional LQG algorithm, with small and controllable losses in the accuracy of the state and parameter estimation.

  18. Benchmarking the expected stack manufacturing cost of next generation, intermediate-temperature protonic ceramic fuel cells with solid oxide fuel cell technology

    NASA Astrophysics Data System (ADS)

    Dubois, Alexis; Ricote, Sandrine; Braun, Robert J.

    2017-11-01

    Recent progress in the performance of intermediate temperature (500-600 °C) protonic ceramic fuel cells (PCFCs) has demonstrated both fuel flexibility and increasing power density that approach commercial application requirements. These developments may eventually position the technology as a viable alternative to solid oxide fuel cells (SOFCs) and molten carbonate fuel cells (MCFCs). The PCFCs investigated in this work are based on a BaZr0.8Y0.2O3-δ (BZY20) thin electrolyte supported by BZY20/Ni porous anodes, and a triple conducting cathode material comprised of BaCo0.4Fe0.4Zr0.1Y0.1O3-δ (BCFZY0.1). These cells are prepared using a low-cost solid-state reactive sintering (SSRS) process and are capable of power densities of 0.156 W cm-2 at 500 °C operating directly on methane fuel. We develop a manufacturing cost model to estimate the Nth-generation production costs of PCFC stack technology using high-volume manufacturing processes and compare them to the state of the art in SOFC technology. The low-cost cell manufacturing enabled by the SSRS technique compensates for the lower PCFC power density, and the trade-off between operating temperature and efficiency enables the use of lower-cost stainless steel materials. PCFC stack production cost estimates are found to be as much as 27-37% lower at 550 °C than SOFCs operating at 800 °C.

  19. Evanescent Field Based Photoacoustics: Optical Property Evaluation at Surfaces

    PubMed Central

    Goldschmidt, Benjamin S.; Rudy, Anna M.; Nowak, Charissa A.; Tsay, Yowting; Whiteside, Paul J. D.; Hunt, Heather K.

    2016-01-01

    Here, we present a protocol to estimate material and surface optical properties using the photoacoustic effect combined with total internal reflection. Optical property evaluation of thin films and the surfaces of bulk materials is an important step in understanding new optical material systems and their applications. The method presented can estimate thickness and refractive index, and can exploit the absorptive properties of materials for detection. This metrology system uses evanescent-field-based photoacoustics (EFPA), a field of research based upon the interaction of an evanescent field with the photoacoustic effect. This interaction and its resulting family of techniques allow the probing of optical properties within a few hundred nanometers of the sample surface. This optical near field allows for highly accurate estimation of material properties on the same scale as the field itself, such as refractive index and film thickness. With the use of EFPA and its sub-techniques, such as total internal reflection photoacoustic spectroscopy (TIRPAS) and optical tunneling photoacoustic spectroscopy (OTPAS), it is possible to evaluate a material at the nanoscale in a consolidated instrument, without the need for many instruments and experiments that may be cost prohibitive. PMID:27500652

  1. A Cost-Effectiveness Analysis of the Swedish Universal Parenting Program All Children in Focus

    PubMed Central

    Ulfsdotter, Malin; Lindberg, Lene; Månsdotter, Anna

    2015-01-01

    Objective There are few health economic evaluations of parenting programs with quality-adjusted life-years (QALYs) as the outcome measure. The objective of this study was, therefore, to conduct a cost-effectiveness analysis of the universal parenting program All Children in Focus (ABC). The goals were to estimate the costs of program implementation, investigate the health effects of the program, and examine its cost-effectiveness. Methods A cost-effectiveness analysis was conducted. Costs included setup costs and operating costs. A parent proxy Visual Analog Scale was used to measure QALYs in children, whereas the General Health Questionnaire-12 was used for parents. A societal perspective was adopted, and the incremental cost-effectiveness ratio was calculated. To account for uncertainty in the estimate, the probability of cost-effectiveness was investigated, and sensitivity analyses were used to account for the uncertainty in cost data. Results The cost was €326.3 per parent, of which €53.7 represented setup costs under the assumption that group leaders on average run 10 groups, and €272.6 was the operating costs. For health effects, the QALY gain was 0.0042 per child and 0.0027 per parent. These gains resulted in an incremental cost-effectiveness ratio for the base case of €47 290 per gained QALY. The sensitivity analyses resulted in ratios from €41 739 to €55 072. With the common Swedish threshold value of €55 000 per QALY, the probability of the ABC program being cost-effective was 50.8 percent. Conclusion Our analysis of the ABC program demonstrates cost-effectiveness ratios below or just above the QALY threshold in Sweden. However, due to great uncertainty about the data, the health economic rationale for implementation should be further studied considering a longer time perspective, effects on siblings, and validated measuring techniques, before full scale implementation. PMID:26681349
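
    The base-case ratio reported above can be reproduced directly from the stated figures. The only assumption in the sketch is that the child and parent QALY gains are summed per family, which is what the study's societal perspective implies.

```python
# Reproduce the base-case ICER from the figures reported in the abstract.
cost_per_parent = 326.3        # EUR: setup (53.7) + operating (272.6)
qaly_gain = 0.0042 + 0.0027    # child + parent QALY gain (summed: assumption)
icer = cost_per_parent / qaly_gain
print(round(icer))             # ~47 290 EUR per QALY, matching the paper
```

    Since the Swedish threshold cited is €55 000 per QALY, the base case falls below the threshold, while the sensitivity range (€41 739 to €55 072) straddles it, which is why the probability of cost-effectiveness is close to 50 percent.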

  2. Hospitalization costs of severe bacterial pneumonia in children: comparative analysis considering different costing methods

    PubMed Central

    Nunes, Sheila Elke Araujo; Minamisava, Ruth; Vieira, Maria Aparecida da Silva; Itria, Alexander; Pessoa, Vicente Porfirio; de Andrade, Ana Lúcia Sampaio Sgambatti; Toscano, Cristiana Maria

    2017-01-01

    ABSTRACT Objective To determine and compare hospitalization costs of bacterial community-acquired pneumonia cases via different costing methods under the Brazilian Public Unified Health System perspective. Methods Cost-of-illness study based on primary data collected from a sample of 59 children aged between 28 days and 35 months and hospitalized due to bacterial pneumonia. Direct medical and non-medical costs were considered and three costing methods employed: micro-costing based on medical record review, micro-costing based on therapeutic guidelines, and gross-costing based on the Brazilian Public Unified Health System reimbursement rates. Cost estimates obtained via the different methods were compared using the Friedman test. Results Cost estimates of inpatient cases of severe pneumonia amounted to R$ 780,70/$Int. 858.70 (medical record review), R$ 641,90/$Int. 706.90 (therapeutic guidelines) and R$ 594,80/$Int. 654.28 (Brazilian Public Unified Health System reimbursement rates). Costs estimated via micro-costing (medical record review or therapeutic guidelines) did not differ significantly (p=0.405), while estimates based on reimbursement rates were significantly lower compared to estimates based on therapeutic guidelines (p<0.001) or record review (p=0.006). Conclusion Brazilian Public Unified Health System costs estimated via different costing methods differ significantly, with gross-costing yielding lower cost estimates. Given that cost estimates from different micro-costing methods are similar, and that costing based on therapeutic guidelines is easier to apply and less expensive, the guideline-based method may be a valuable alternative for estimating hospitalization costs of bacterial community-acquired pneumonia in children. PMID:28767921
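
    The paired comparison used above can be sketched with a hand-rolled Friedman statistic: each patient's cost is ranked across the three costing methods, and the statistic tests whether the methods differ systematically. The cost vectors below are synthetic illustrations (five patients, no ties), not the study's data of 59 children.

```python
# Minimal Friedman test sketch on synthetic per-patient costs (BRL).
# Ties are not handled; the made-up data below has none.
def friedman_statistic(*groups):
    n = len(groups[0])          # patients (blocks)
    k = len(groups)             # costing methods (treatments)
    rank_sums = [0.0] * k
    for i in range(n):
        order = sorted(range(k), key=lambda j: groups[j][i])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    q = 12.0 * sum(r * r for r in rank_sums) / (n * k * (k + 1))
    return q - 3.0 * n * (k + 1)

record_review = [812.0, 760.5, 901.2, 655.0, 780.3]
guidelines    = [640.1, 652.8, 700.9, 601.4, 644.0]
reimbursement = [590.0, 610.2, 602.7, 580.1, 595.9]
print(friedman_statistic(record_review, guidelines, reimbursement))  # 10.0
```

    A large statistic (compared against a chi-squared distribution with k-1 degrees of freedom) indicates the costing methods produce systematically different estimates, which is the pattern the study reports for gross-costing versus micro-costing.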

  3. Application of a predictive Bayesian model to environmental accounting.

    PubMed

    Anex, R P; Englehardt, J D

    2001-03-30

    Environmental accounting techniques are intended to capture important environmental costs and benefits that are often overlooked in standard accounting practices. Environmental accounting methods themselves often ignore or inadequately represent large but highly uncertain environmental costs and costs conditioned by specific prior events. Use of a predictive Bayesian model is demonstrated for the assessment of such highly uncertain environmental and contingent costs. The predictive Bayesian approach presented generates probability distributions for the quantity of interest itself (rather than for parameters thereof). A spreadsheet implementation of a previously proposed predictive Bayesian model, extended to represent contingent costs, is described and used to evaluate whether a firm should undertake an accelerated phase-out of its PCB-containing transformers. Variability and uncertainty (due to lack of information) in transformer accident frequency and severity are assessed simultaneously using a combination of historical accident data, engineering model-based cost estimates, and subjective judgment. Model results are compared using several different risk measures. Use of the model for incorporating environmental risk management into a company's overall risk management strategy is discussed.

  5. Are cooler surfaces a cost-effective mitigation of urban heat islands?

    DOE PAGES

    Pomerantz, Melvin

    2017-04-20

    Much research has gone into technologies to mitigate urban heat islands by making urban surfaces cooler by increasing their albedos. To be practical, the benefit of the technology must be greater than its cost. Here, this report provides simple methods for quantifying the maxima of some benefits that albedo increases may provide. The method used is an extension of an earlier paper that estimated the maximum possible electrical energy saving achievable in an entire city in a year by a change of albedo of its surfaces. The present report estimates the maximum amounts and monetary savings of avoided CO2 emissions and the decreases in peak power demands. As examples, for several warm cities in California, a 0.2 increase in albedo of pavements is found to reduce CO2 emissions by less than 1 kg per m2 per year. At the current price of CO2 reduction in California, the monetary saving is less than US$ 0.01 per year per m2 modified. The resulting maximum peak-power reductions are estimated to be less than 7% of the base power of the city. In conclusion, the magnitudes of the savings are such that decision-makers should choose carefully which urban heat island mitigation techniques are cost effective.
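
    The monetary figure above is a one-line calculation: avoided emissions per square metre times the carbon price. Both inputs below are assumptions chosen to be consistent with the reported bounds (avoided emissions just under 1 kg/m2/yr, a carbon price near California's allowance price at the time), not values taken from the report.

```python
# Back-of-envelope check of the "< $0.01 per m2 per year" figure.
avoided_co2_kg_per_m2 = 0.7          # assumption: below the 1 kg/m2/yr bound
carbon_price_usd_per_tonne = 13.0    # assumption: ~2017 CA allowance price
saving = avoided_co2_kg_per_m2 / 1000.0 * carbon_price_usd_per_tonne
print(saving)  # fractions of a cent per m2 per year
```

    Even generous inputs keep the saving at pennies per square metre, which is the report's point about weighing mitigation benefits against installation costs.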

  6. Estimating the Cost to do a Cost Estimate

    NASA Technical Reports Server (NTRS)

    Remer, D. S.; Buchanan, H. R.

    1998-01-01

    This article provides a model for estimating the cost required to do a cost estimate. Overruns may lead to cancellation of a project. In 1991, we completed a study on the cost of doing cost estimates for the class of projects normally encountered in the development and implementation of equipment at the network of tracking stations operated by the Jet Propulsion Laboratory (JPL) for NASA.

  7. 48 CFR 1852.216-74 - Estimated cost and fixed fee.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and Clauses 1852.216-74 Estimated cost and fixed fee. As prescribed in 1816.307-70(b), insert the following clause: Estimated Cost and Fixed Fee (DEC 1991) The estimated cost of this contract is ______ exclusive of the fixed fee of ______. The total estimated cost and fixed fee is ______. (End of clause) [62...

  8. Marginal estimator for the aberrations of a space telescope by phase diversity

    NASA Astrophysics Data System (ADS)

    Blanc, Amandine; Mugnier, Laurent; Idier, Jérôme

    2017-11-01

    In this communication, we propose a novel method for estimating the aberrations of a space telescope from phase diversity data. The images recorded by such a telescope can be degraded by optical aberrations due to design, fabrication or misalignments. Phase diversity is a technique that allows the estimation of aberrations. The only estimator found in the relevant literature is based on a joint estimation of the aberrated phase and the observed object. We recall this approach and study the behavior of this joint estimator by means of simulations. We then propose a novel marginal estimator of the sole phase. It is obtained by integrating the observed object out of the problem; indeed, this object is a nuisance parameter in our problem. This drastically reduces the number of unknowns and provides better asymptotic properties. This estimator is implemented and its properties are validated by simulation. Its performance is equal to or even better than that of the joint estimator, for the same computing cost.

  9. Mass balance for on-line alphakLa estimation in activated sludge oxidation ditch.

    PubMed

    Chatellier, P; Audic, J M

    2001-01-01

    The capacity of an aeration system to transfer oxygen to a given activated sludge oxidation ditch is characterised by the alphakLa parameter. This parameter is difficult to measure under normal plant working conditions; usually the measurement involves off-gas techniques or a static mass balance. An on-line technique has therefore been developed and tested to evaluate alphakLa. This technique deduces alphakLa from an analysis of low-cost sensor measurements: two flow meters and one oxygen probe. It involves a dynamic mass balance applied to aeration cycles selected according to given criteria. The technique has been applied to a wastewater treatment plant for four years; significant variations in the alphakLa values were detected as the number of blowers in operation changed. It has also been applied to another plant for two months.
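
    The dynamic mass balance behind such an estimate is dC/dt = kLa*(Cs - C) - OUR, where C is dissolved oxygen, Cs the saturation concentration and OUR the oxygen uptake rate. The sketch below is not the plant algorithm from the paper: it simulates a clean DO transient with assumed values of Cs, OUR and kLa, then recovers kLa by a least-squares slope fit.

```python
# Recover alpha*kLa from a dissolved-oxygen transient via the dynamic
# mass balance dC/dt = kLa*(Cs - C) - OUR. All parameters are assumed.
Cs, OUR, kla_true = 9.0, 0.5, 0.12   # mg/L, mg/L/min, 1/min (assumptions)
dt, C = 0.1, [2.0]
for _ in range(200):                  # simulate the DO probe signal (Euler)
    C.append(C[-1] + dt * (kla_true * (Cs - C[-1]) - OUR))

# Least-squares slope of (dC/dt + OUR) versus (Cs - C) is the kLa estimate
num = den = 0.0
for i in range(len(C) - 1):
    x = Cs - C[i]
    y = (C[i + 1] - C[i]) / dt + OUR
    num += x * y
    den += x * x
kla_est = num / den
print(round(kla_est, 3))  # recovers the assumed 0.12
```

    In practice OUR is itself uncertain, which is why the paper restricts the fit to aeration cycles selected according to given criteria.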

  10. Cost-sensitive AdaBoost algorithm for ordinal regression based on extreme learning machine.

    PubMed

    Riccardi, Annalisa; Fernández-Navarro, Francisco; Carloni, Sante

    2014-10-01

    In this paper, the well-known stagewise additive modeling using a multiclass exponential (SAMME) boosting algorithm is extended to address problems where there exists a natural order in the targets, using a cost-sensitive approach. The proposed ensemble model uses an extreme learning machine (ELM) model as a base classifier (with the Gaussian kernel and the additional regularization parameter). The closed form of the derived weighted least squares problem is provided, and it is employed to estimate analytically the parameters connecting the hidden layer to the output layer at each iteration of the boosting algorithm. Compared to the state-of-the-art boosting algorithms, in particular those using ELM as the base classifier, the suggested technique does not require the generation of a new training dataset at each iteration. The weighted least squares formulation of the problem is presented as an unbiased alternative to the existing ELM boosting techniques. Moreover, the addition of a cost model that weights the patterns according to the order of the targets further enables the classifier to tackle ordinal regression problems. The proposed method is validated in an experimental study comparing it with existing ensemble methods and ELM techniques for ordinal regression, showing competitive results.
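
    The closed-form step the abstract refers to is a weighted, regularized least squares solve for the ELM output weights, of the form beta = (H'WH + I/C)^(-1) H'WT, where H holds hidden-layer activations and W the per-sample boosting weights. The sketch below uses a tiny random-feature ELM with made-up sizes, data and weights; it illustrates the algebra, not the paper's full cost-sensitive SAMME loop.

```python
# Weighted least squares solve for ELM output weights (illustrative sizes).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))            # 20 samples, 3 input features
T = rng.normal(size=(20, 2))            # targets for 2 output nodes
w = rng.uniform(0.5, 2.0, size=20)      # boosting-style sample weights

A = rng.normal(size=(3, 10))            # random input->hidden weights (ELM)
b = rng.normal(size=10)
H = np.tanh(X @ A + b)                  # hidden-layer activations
W = np.diag(w)
C = 10.0                                # regularization parameter

# beta solves (H'WH + I/C) beta = H'WT in closed form
beta = np.linalg.solve(H.T @ W @ H + np.eye(10) / C, H.T @ W @ T)
print(beta.shape)  # (10, 2): hidden-to-output weights
```

    Because only the diagonal weight matrix W changes between boosting iterations, the same training matrix H can be reused each round, which is the efficiency advantage the abstract highlights.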

  11. Online Detection of Broken Rotor Bar Fault in Induction Motors by Combining Estimation of Signal Parameters via Min-norm Algorithm and Least Square Method

    NASA Astrophysics Data System (ADS)

    Wang, Pan-Pan; Yu, Qiang; Hu, Yong-Jun; Miao, Chang-Xin

    2017-11-01

    Current research in broken rotor bar (BRB) fault detection in induction motors is primarily focused on a high-frequency-resolution analysis of the stator current. Compared with a discrete Fourier transformation, the parametric spectrum estimation technique has a higher frequency accuracy and resolution. However, the existing detection methods based on parametric spectrum estimation cannot realize online detection, owing to the large computational cost. To improve the efficiency of BRB fault detection, a new detection method based on the min-norm algorithm and least squares estimation is proposed in this paper. First, the stator current is filtered using a band-pass filter and divided into short overlapped data windows. The min-norm algorithm is then applied to determine the frequencies of the fundamental and fault characteristic components within each overlapped data window. Next, based on the frequency values obtained, a model of the fault current signal is constructed. Subsequently, a linear least squares problem solved through singular value decomposition is designed to estimate the amplitudes and phases of the related components. Finally, the proposed method is applied to a simulated current and an actual motor; the results indicate that the method retains the frequency accuracy and resolution of parametric spectrum estimation while reducing the computational cost enough to make online detection practical.
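
    The second stage described above is standard linear algebra: once the fundamental and fault-characteristic frequencies are known, a sin/cos design matrix is built at those frequencies and the amplitudes and phases fall out of a least-squares solve (NumPy's lstsq uses SVD internally). The 50 Hz fundamental, the 48 Hz sideband and the amplitudes below are illustrative values, not measurements from the paper.

```python
# Amplitude/phase recovery at known frequencies via linear least squares.
import numpy as np

fs, n = 1000.0, 1000
t = np.arange(n) / fs
freqs = [50.0, 48.0]    # fundamental and (assumed) fault sideband, Hz
x = (10.0 * np.cos(2 * np.pi * 50 * t - 0.3)
     + 0.2 * np.cos(2 * np.pi * 48 * t + 1.1))   # synthetic stator current

# Design matrix: [cos(2*pi*f*t), sin(2*pi*f*t)] for each known frequency
M = np.column_stack([g(2 * np.pi * fk * t)
                     for fk in freqs for g in (np.cos, np.sin)])
coef, *_ = np.linalg.lstsq(M, x, rcond=None)     # SVD-based solve

amps = [np.hypot(coef[2 * i], coef[2 * i + 1]) for i in range(len(freqs))]
print([round(a, 3) for a in amps])  # [10.0, 0.2]
```

    The ratio of the sideband amplitude to the fundamental is the usual BRB severity indicator; the least-squares step is cheap because the design matrix is small once the frequencies are fixed.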

  12. Parametric Cost Modeling of Space Missions Using the Develop New Projects (DMP) Implementation Process

    NASA Technical Reports Server (NTRS)

    Rosenberg, Leigh; Hihn, Jairus; Roust, Kevin; Warfield, Keith

    2000-01-01

    This paper presents an overview of a parametric cost model that has been built at JPL to estimate costs of future, deep space, robotic science missions. Due to the recent dramatic changes in JPL business practices brought about by an internal reengineering effort known as develop new products (DNP), high-level historic cost data is no longer considered analogous to future missions. Therefore, the historic data is of little value in forecasting costs for projects developed using the DNP process. This has led to the development of an approach for obtaining expert opinion and for combining actual data with expert opinion to provide a cost database for future missions. In addition, the DNP cost model uses a maximum of objective cost drivers, which reduces the likelihood of model input error. Version 2 is now under development; it expands the model capabilities, links them more tightly with key design technical parameters, and is grounded in more rigorous statistical techniques. The challenges faced in building this model are discussed, as well as its background, development approach, status, validation, and future plans.

  13. Medicare changes create accounting, reporting, and auditing problems. Task Force on Federal Health Care Legislation, American Institute of Certified Public Accountants.

    PubMed

    1984-11-01

    Hospital auditors and financial officers must adjust and react to the changing financial healthcare environment brought about by PPS. A close review of accounting systems, reporting methods, auditing procedures, and internal control systems should be made to determine that assets are safeguarded and financial information is presented in conformity with GAAP. This article identifies new problems and suggests solutions. Old tasks may no longer be necessary. For example, retroactive adjustments are not as important as they used to be. Estimates for capital and outpatient costs may continue to be required, but elaborate cost-finding techniques may no longer be necessary to estimate retroactive adjustments for reimbursable items. We recommend that, prior to beginning an audit of a hospital's financial statements, each hospital's financial officers and its auditors discuss the possible accounting, reporting, and auditing implications of PPS.

  14. LIBRA: An inexpensive geodetic network densification system

    NASA Technical Reports Server (NTRS)

    Fliegel, H. F.; Gantsweg, M.; Callahan, P. S.

    1975-01-01

    A description is given of the Libra (Locations Interposed by Ranging Aircraft) system, by which geodesy and earth strain measurements can be performed rapidly and inexpensively at several hundred auxiliary points with respect to a few fundamental control points established by any other technique, such as radio interferometry or satellite ranging. This low-cost means of extending the accuracy of space-age geodesy to local surveys provides speed and spatial resolution useful, for example, for earthquake hazards estimation. Libra may be combined with an existing system, Aries (Astronomical Radio Interferometric Earth Surveying), to provide a balanced system adequate to meet geophysical needs and applicable to conventional surveying. The basic hardware design was outlined, specifications were defined, and the need for network densification was described. The following activities required to implement the proposed Libra system are also described: hardware development, data reduction, tropospheric calibrations, schedule of development, and estimated costs.

  16. Estimation of fatigue life using electromechanical impedance technique

    NASA Astrophysics Data System (ADS)

    Lim, Yee Yan; Soh, Chee Kiong

    2010-04-01

    Fatigue-induced damage is often progressive and gradual in nature. Structures subjected to a large number of fatigue load cycles undergo progressive crack initiation, propagation and, finally, fracture. Monitoring of structural health, especially for critical components, is therefore essential for early detection of potentially harmful cracks. Smart materials such as piezo-impedance transducers, used with the electromechanical impedance (EMI) technique and the wave propagation technique, are well proven to be effective in incipient damage detection and characterization. Advantages such as autonomous, real-time, online and remote monitoring may make them a cost-effective alternative to conventional structural health monitoring (SHM) techniques. In this study, the main focus is to investigate the feasibility of characterizing a propagating fatigue crack in a structure using the EMI technique, as well as estimating the structure's remaining fatigue life using the linear elastic fracture mechanics (LEFM) approach. Uniaxial cyclic tensile load is applied to a lab-sized aluminum beam up to failure. A progressive shift in the admittance signatures measured by the piezo-impedance transducer (PZT patch) with increasing load cycles reflects the effectiveness of the EMI technique in tracing the process of fatigue damage progression. With the use of LEFM, prediction of the remaining life of the structure at different numbers of loading cycles is possible.
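
    The LEFM remaining-life step usually means integrating a crack-growth law such as the Paris law, da/dN = C*(dK)^m, from the current (detected) crack length to a critical length. The sketch below uses generic aluminium-like Paris constants, stress range and geometry factor as assumptions; they are not the study's values.

```python
# Remaining fatigue life by numerically integrating the Paris law.
# All material and loading values below are assumed, illustrative numbers.
import math

Cp, m = 1.0e-11, 3.0    # Paris constants (assumed; a in metres, dK in MPa*sqrt(m))
dsigma = 100.0          # applied stress range, MPa (assumed)
Y = 1.12                # geometry factor for an edge crack (assumed)

def cycles_to_grow(a0, af, steps=100_000):
    """Integrate dN = da / (Cp * dK^m) from crack length a0 to af (metres)."""
    n, a = 0.0, a0
    da = (af - a0) / steps
    for _ in range(steps):
        dK = Y * dsigma * math.sqrt(math.pi * a)   # stress intensity range
        n += da / (Cp * dK ** m)
        a += da
    return n

print(f"{cycles_to_grow(0.001, 0.02):.3g} remaining cycles")
```

    Because dK grows with crack length, most of the life is consumed while the crack is still short, which is why early EMI-based detection buys a disproportionately large remaining-life margin.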

  17. Weighted Least Squares Techniques for Improved Received Signal Strength Based Localization

    PubMed Central

    Tarrío, Paula; Bernardos, Ana M.; Casar, José R.

    2011-01-01

    The practical deployment of wireless positioning systems requires minimizing the calibration procedures while improving the location estimation accuracy. Received Signal Strength localization techniques using propagation channel models are the simplest alternative, but they are usually designed under the assumption that the radio propagation model is to be perfectly characterized a priori. In practice, this assumption does not hold and the localization results are affected by the inaccuracies of the theoretical, roughly calibrated or just imperfect channel models used to compute location. In this paper, we propose the use of weighted multilateration techniques to gain robustness with respect to these inaccuracies, reducing the dependency of having an optimal channel model. In particular, we propose two weighted least squares techniques based on the standard hyperbolic and circular positioning algorithms that specifically consider the accuracies of the different measurements to obtain a better estimation of the position. These techniques are compared to the standard hyperbolic and circular positioning techniques through both numerical simulations and an exhaustive set of real experiments on different types of wireless networks (a wireless sensor network, a WiFi network and a Bluetooth network). The algorithms not only produce better localization results with a very limited overhead in terms of computational cost but also achieve a greater robustness to inaccuracies in channel modeling. PMID:22164092
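
    The circular variant described above can be sketched in a few lines: RSS readings are inverted through a log-distance path-loss model to get range estimates, the circle equations are linearized by subtracting a reference anchor, and the position is solved by weighted least squares. The anchor layout, path-loss parameters and the heuristic weighting are all assumptions for illustration; the measurements are kept noiseless so the solve recovers the true position exactly.

```python
# Weighted least squares circular positioning from RSS (illustrative setup).
import numpy as np

P0, n_pl = -40.0, 2.0                      # path loss at 1 m (dBm), exponent
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_pos = np.array([3.0, 4.0])

d_true = np.linalg.norm(anchors - true_pos, axis=1)
rss = P0 - 10 * n_pl * np.log10(d_true)    # noiseless RSS for the sketch
d = 10 ** ((P0 - rss) / (10 * n_pl))       # invert the channel model
w = 1.0 / d**2                             # heuristic: nearer anchor, higher weight

# Linearize by subtracting the first circle equation from the others
ref, others = 0, [1, 2, 3]
A = 2 * (anchors[others] - anchors[ref])
b = (d[ref]**2 - d[others]**2
     + np.sum(anchors[others]**2, axis=1) - np.sum(anchors[ref]**2))
W = np.diag(w[others])
pos = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
print(pos)  # [3. 4.] with noiseless measurements
```

    With real RSS noise the weights matter: range errors grow with distance under the log-distance model, so down-weighting far anchors is what gives the weighted scheme its robustness over the unweighted solve.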

  18. Retrospective Assessment of Cost Savings From Prevention

    PubMed Central

    Grosse, Scott D.; Berry, Robert J.; Tilford, J. Mick; Kucik, James E.; Waitzman, Norman J.

    2016-01-01

    Introduction Although fortification of food with folic acid has been calculated to be cost saving in the U.S., updated estimates are needed. This analysis calculates new estimates from the societal perspective of net cost savings per year associated with mandatory folic acid fortification of enriched cereal grain products in the U.S. that was implemented during 1997–1998. Methods Estimates of annual numbers of live-born spina bifida cases in 1995–1996 relative to 1999–2011 based on birth defects surveillance data were combined during 2015 with published estimates of the present value of lifetime direct costs updated in 2014 U.S. dollars for a live-born infant with spina bifida to estimate avoided direct costs and net cost savings. Results The fortification mandate is estimated to have reduced the annual number of U.S. live-born spina bifida cases by 767, with a lower-bound estimate of 614. The present value of mean direct lifetime cost per infant with spina bifida is estimated to be $791,900, or $577,000 excluding caregiving costs. Using a best estimate of numbers of avoided live-born spina bifida cases, fortification is estimated to reduce the present value of total direct costs for each year's birth cohort by $603 million more than the cost of fortification. A lower-bound estimate of cost savings using conservative assumptions, including the upper-bound estimate of fortification cost, is $299 million. Conclusions The estimates of cost savings are larger than previously reported, even using conservative assumptions. The analysis can also inform assessments of folic acid fortification in other countries. PMID:26790341
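
    The headline numbers above combine with simple arithmetic. Note that the annual fortification cost is backed out from the reported figures here, so treat it as an inference from this abstract rather than a figure the study states directly.

```python
# Arithmetic behind the best-estimate net savings reported above.
avoided_cases = 767
cost_per_case = 791_900      # present value of lifetime direct costs, USD
avoided_costs = avoided_cases * cost_per_case
print(f"avoided direct costs: ${avoided_costs / 1e6:.0f}M per birth cohort")
# Reported net savings of ~$603M imply an annual fortification cost of ~$4M.
```

    The lower-bound case works the same way with 614 avoided cases, the $577,000 cost excluding caregiving, and the upper-bound fortification cost, yielding the reported $299 million.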

  19. A Fast Goal Recognition Technique Based on Interaction Estimates

    NASA Technical Reports Server (NTRS)

    E-Martin, Yolanda; R-Moreno, Maria D.; Smith, David E.

    2015-01-01

    Goal Recognition is the task of inferring an actor's goals given some or all of the actor's observed actions. There is considerable interest in Goal Recognition for use in intelligent personal assistants, smart environments, intelligent tutoring systems, and monitoring of users' needs. In much of this work, the actor's observed actions are compared against a generated library of plans. Recent work by Ramirez and Geffner makes use of AI planning to determine how closely a sequence of observed actions matches plans for each possible goal. For each goal, this is done by comparing the cost of a plan for that goal with the cost of a plan for that goal that also includes the observed actions. This approach yields useful rankings, but is impractical for real-time goal recognition in large domains because of the computational expense of constructing plans for each possible goal. In this paper, we introduce an approach that propagates cost and interaction information in a plan graph, and uses this information to estimate goal probabilities. We show that this approach is much faster, but still yields high-quality results.
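
    The cost-difference ranking described above can be sketched with a simplified Boltzmann-style reading of the Ramirez and Geffner scheme (not their exact likelihood): a goal whose cheapest plan already embeds the observations incurs no cost penalty and scores highest. The plan costs below are made-up numbers; c_g is the cost of an optimal plan for the goal, c_go the cost of an optimal plan that also embeds the observed actions.

```python
# Goal posterior from plan-cost differences (simplified, illustrative).
import math

def goal_posterior(costs, beta=1.0):
    """costs: {goal: (c_g, c_go)}; returns normalized P(goal | observations)."""
    scores = {g: math.exp(-beta * (c_go - c_g))
              for g, (c_g, c_go) in costs.items()}
    z = sum(scores.values())
    return {g: s / z for g, s in scores.items()}

post = goal_posterior({"make_coffee": (4, 4),   # observations fit perfectly
                       "make_tea":    (4, 7),   # detour needed -> less likely
                       "wash_cup":    (2, 8)})  # large detour -> unlikely
print(max(post, key=post.get))  # make_coffee
```

    The paper's contribution is avoiding the two expensive planner calls per goal: cost and interaction estimates propagated through a plan graph stand in for the optimal plan costs, trading a little accuracy for real-time speed.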

  20. Design of DNA pooling to allow incorporation of covariates in rare variants analysis.

    PubMed

    Guan, Weihua; Li, Chun

    2014-01-01

    Rapid advances in next-generation sequencing technologies facilitate genetic association studies of an increasingly wide array of rare variants. To capture rare or less common variants, a large number of individuals will be needed. However, the cost of a large-scale study using whole-genome or exome sequencing is still high. DNA pooling can serve as a cost-effective approach, but with a potential limitation: the identity of individual genomes is lost, so individual characteristics and environmental factors cannot be adjusted for in the association analysis, which may result in power loss and a biased estimate of the genetic effect. For case-control studies, we propose a design strategy for pool creation and an analysis strategy that allows covariate adjustment, using a multiple imputation technique. Simulations show that our approach can obtain reasonable estimates of genotypic effects with only a slight loss of power compared with the much more expensive approach of sequencing individual genomes. Our design and analysis strategies enable more powerful and cost-effective sequencing studies of complex diseases, while allowing incorporation of covariate adjustment.

  1. Variational stereo imaging of oceanic waves with statistical constraints.

    PubMed

    Gallego, Guillermo; Yezzi, Anthony; Fedele, Francesco; Benetazzo, Alvise

    2013-11-01

    An image processing observational technique for the stereoscopic reconstruction of the waveform of oceanic sea states is developed. The technique incorporates the enforcement of any given statistical wave law modeling the quasi-Gaussianity of oceanic waves observed in nature. The problem is posed in a variational optimization framework, where the desired waveform is obtained as the minimizer of a cost functional that combines image observations, smoothness priors and a weak statistical constraint. The minimizer is obtained by combining gradient descent and multigrid methods on the necessary optimality equations of the cost functional. Robust photometric error criteria and a spatial intensity compensation model are also developed to improve the performance of the presented image matching strategy. The weak statistical constraint is thoroughly evaluated, in combination with the other elements presented, on experimental stereo data, demonstrating the improvement it brings to the estimation of the observed ocean surface.

  2. Simultaneous narrowband ultrasonic strain-flow imaging

    NASA Astrophysics Data System (ADS)

    Tsou, Jean K.; Mai, Jerome J.; Lupotti, Fermin A.; Insana, Michael F.

    2004-04-01

    We summarize new research aimed at forming spatially and temporally registered combinations of strain and color-flow images using echo data recorded from a commercial ultrasound system. Applications include diagnosis of vascular diseases and tumor malignancies. The challenge is to meet the diverse needs of each measurement. The approach is to first apply eigenfilters that separate echo components from moving tissues and blood flow, and then estimate blood velocity and tissue displacement from the phase modulations of the filtered IQ signal. At the cost of a lower acquisition frame rate, we find that the autocorrelation strain estimator yields higher-resolution strain estimates than the cross-correlator, since estimates are made from ensembles at a single point in space. The technique is applied to in vivo carotid imaging to demonstrate its sensitivity for strain-flow vascular imaging.

  3. Inverse sampling regression for pooled data.

    PubMed

    Montesinos-López, Osval A; Montesinos-López, Abelardo; Eskridge, Kent; Crossa, José

    2017-06-01

    Because pools are tested instead of individuals in group testing, this technique is helpful for estimating prevalence in a population or for classifying a large number of individuals into two groups at low cost. For this reason, group testing is a well-known means of saving costs and producing precise estimates. In this paper, we developed a mixed-effect group testing regression that is useful when the data-collecting process is performed using inverse sampling. This model allows covariate information to be included at the individual level, both to incorporate heterogeneity among individuals and to identify which covariates are associated with positive individuals. We present a maximum likelihood approach for fitting this model and report a simulation study evaluating the quality of the estimates. Based on the simulation study, we found that the proposed regression method for inverse sampling with group testing produces parameter estimates with low bias when the pre-specified number of positive pools (r) at which the sampling process stops is at least 10 and the number of clusters in the sample is also at least 10. We illustrate the method with an application to real data and provide NLMIXED code that researchers can use to implement it.
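    The basic idea of estimating prevalence from pooled tests can be sketched for the simplest case: a homogeneous population and perfect tests, without the covariates and clustering that the paper's mixed-effect regression handles. Under inverse sampling, pools of size k are tested until r positive pools are observed; with n pools tested in total, r/n estimates the pool-positivity probability, and the individual-level prevalence follows from the complement rule. The numbers in the example are invented.

    ```python
    # Minimal sketch: prevalence from pooled tests under inverse sampling,
    # assuming a homogeneous population and a perfect assay. The paper's
    # mixed-effect regression is considerably richer than this.
    def prevalence_estimate(r, n, k):
        """Individual prevalence from r positive pools out of n, pool size k.

        A pool is negative only if all k members are negative, so the
        pool-negativity probability is (1 - p)**k; inverting gives p.
        """
        pool_positivity = r / n
        return 1 - (1 - pool_positivity) ** (1 / k)

    # Example: sampling stopped after r=10 positive pools of size k=5,
    # having tested n=40 pools in total (invented figures).
    p_hat = prevalence_estimate(10, 40, 5)
    print(f"estimated prevalence: {p_hat:.3f}")
    ```

    With pool size k=1 the estimator reduces to the ordinary sample proportion, which is a quick sanity check on the formula.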

  4. Cost-Effectiveness Analysis of Bariatric Surgery for Morbid Obesity.

    PubMed

    Alsumali, Adnan; Eguale, Tewodros; Bairdain, Sigrid; Samnaliev, Mihail

    2018-01-15

    In the USA, three types of bariatric surgery are widely performed: laparoscopic sleeve gastrectomy (LSG), laparoscopic Roux-en-Y gastric bypass (LRYGB), and laparoscopic adjustable gastric banding (LAGB). However, few economic evaluations of bariatric surgery have been published, and studies focusing on LSG alone are scarce. This study therefore evaluates the cost-effectiveness of bariatric surgery using LRYGB, LAGB, and LSG as treatment for morbid obesity. A microsimulation model was developed over a lifetime horizon to simulate weight change, health consequences, and costs of bariatric surgery for morbid obesity, from a US health care perspective. The model was populated based on the first report of the American College of Surgeons. Incremental cost-effectiveness ratios (ICERs) in terms of cost per quality-adjusted life-year (QALY) gained were used in the model, with model parameters estimated from publicly available databases and published literature. LRYGB was cost-effective, yielding more QALYs (17.07) at a cost of $138,632, compared with LSG (16.56 QALYs; $138,925), LAGB (16.10 QALYs; $135,923), and no surgery (15.17 QALYs; $128,284). Sensitivity analysis showed that the model was most sensitive to the initial cost of surgery and the weight-regain assumption. Across patient groups, LRYGB remained the optimal bariatric technique, except for patients with morbid obesity 1 (BMI 35-39.9 kg/m²), for whom LSG was the optimal choice. LRYGB is the optimal bariatric technique, being the most cost-effective compared to LSG, LAGB, and no surgery for most subgroups. However, LSG was the most cost-effective choice when initial BMI ranged between 35 and 39.9 kg/m².
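    The dominance structure behind these results can be reproduced directly from the QALY and cost figures quoted in the abstract. A minimal sketch of the ICER arithmetic:

    ```python
    # Sketch: ICER computation from the lifetime QALYs and costs quoted
    # in the abstract (US$).
    strategies = {  # name: (lifetime QALYs, lifetime cost in US$)
        "no surgery": (15.17, 128_284),
        "LAGB":       (16.10, 135_923),
        "LSG":        (16.56, 138_925),
        "LRYGB":      (17.07, 138_632),
    }

    def icer(reference, alternative):
        """Incremental cost per QALY of `alternative` vs `reference`."""
        q0, c0 = strategies[reference]
        q1, c1 = strategies[alternative]
        return (c1 - c0) / (q1 - q0)

    print(f"LRYGB vs no surgery: ${icer('no surgery', 'LRYGB'):,.0f} per QALY")

    # LRYGB costs less than LSG while yielding more QALYs, so LSG is
    # dominated by LRYGB in this comparison.
    lsg_dominated = (strategies["LRYGB"][1] < strategies["LSG"][1]
                     and strategies["LRYGB"][0] > strategies["LSG"][0])
    print("LSG dominated by LRYGB:", lsg_dominated)
    ```

    The comparison makes visible why the abstract can call LRYGB cost-effective: relative to LSG it is both cheaper and more effective in the base case.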

  5. Office of the Secretary of Defense Research, Development Test and Evaluation, Development and Test Evaluation, Defense, Director of Operational Test and Evaluation Defense, FY 1994 Budget Estimates, Justification of Estimates Submitted to Congress April 1993

    DTIC Science & Technology

    1993-04-01

    separation capability. o Demonstrate advanced KKVs in the 6-20 kg weight class. o Test planning for SRAM/LEAP and PATRIOT/LEAP integrated technology... packaging techniques to reduce satellite size, weight, power, and total system costs. Further development of these technologies is absolutely 4... 1993 o Developed a master plan with a delivery schedule for each light-weight subassembly in the sensor integration payload. o Finalized a contract for

  6. Enabling Incremental Query Re-Optimization.

    PubMed

    Liu, Mengmeng; Ives, Zachary G; Loo, Boon Thau

    2016-01-01

    As declarative query processing techniques expand to the Web, data streams, network routers, and cloud platforms, there is an increasing need to re-plan execution in the presence of unanticipated performance changes. New runtime information may affect which query plan we prefer to run. Adaptive techniques require innovation both in terms of the algorithms used to estimate costs, and in terms of the search algorithm that finds the best plan. We investigate how to build a cost-based optimizer that recomputes the optimal plan incrementally given new cost information, much as a stream engine constantly updates its outputs given new data. Our implementation especially shows benefits for stream processing workloads. It lays the foundations upon which a variety of novel adaptive optimization algorithms can be built. We start by leveraging the recently proposed approach of formulating query plan enumeration as a set of recursive datalog queries; we develop a variety of novel optimization approaches to ensure effective pruning in both static and incremental cases. We further show that the lessons learned in the declarative implementation can be equally applied to more traditional optimizer implementations.
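    The core idea, re-running only the parts of the plan search affected by new cost information, can be sketched with a toy memoized join-order enumerator. When a cardinality estimate changes, only the cached sub-plans that mention the affected table are invalidated. The relations, cardinalities, and additive cost model below are invented for illustration and are far simpler than a real optimizer's.

    ```python
    # Toy sketch of incremental re-optimization: memoized join-order
    # search with selective cache invalidation. Tables, cardinalities,
    # and the cost model are invented.
    from itertools import combinations
    from math import prod

    card = {"R": 1000, "S": 100, "T": 10}  # base-table cardinality estimates
    memo = {}                              # frozenset of tables -> (cost, plan)

    def best_plan(tables):
        """Best (cost, plan) for joining `tables`, with memoization."""
        tables = frozenset(tables)
        if tables in memo:
            return memo[tables]
        if len(tables) == 1:
            (t,) = tables
            result = (card[t], t)                     # scan cost ~ cardinality
        else:
            join_cost = prod(card[t] for t in tables)  # toy join-cost model
            candidates = []
            for i in range(1, len(tables)):
                for left in combinations(sorted(tables), i):
                    left = frozenset(left)
                    lc, lp = best_plan(left)
                    rc, rp = best_plan(tables - left)
                    candidates.append((lc + rc + join_cost, (lp, rp)))
            result = min(candidates, key=lambda c: c[0])
        memo[tables] = result
        return result

    def update_cardinality(table, new_card):
        """New runtime statistic: invalidate only affected memo entries."""
        card[table] = new_card
        for key in [k for k in memo if table in k]:
            del memo[key]

    cost1, plan1 = best_plan({"R", "S", "T"})
    update_cardinality("T", 100_000)      # runtime feedback arrives
    cost2, plan2 = best_plan({"R", "S", "T"})
    print(plan1, "->", plan2)             # the preferred join order changes
    ```

    After the update, the sub-plans over {R, S} are reused from the cache; only sets containing T are re-costed, which is the incremental behaviour the abstract describes at a much larger scale.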

  7. Enabling Incremental Query Re-Optimization

    PubMed Central

    Liu, Mengmeng; Ives, Zachary G.; Loo, Boon Thau

    2017-01-01

    As declarative query processing techniques expand to the Web, data streams, network routers, and cloud platforms, there is an increasing need to re-plan execution in the presence of unanticipated performance changes. New runtime information may affect which query plan we prefer to run. Adaptive techniques require innovation both in terms of the algorithms used to estimate costs, and in terms of the search algorithm that finds the best plan. We investigate how to build a cost-based optimizer that recomputes the optimal plan incrementally given new cost information, much as a stream engine constantly updates its outputs given new data. Our implementation especially shows benefits for stream processing workloads. It lays the foundations upon which a variety of novel adaptive optimization algorithms can be built. We start by leveraging the recently proposed approach of formulating query plan enumeration as a set of recursive datalog queries; we develop a variety of novel optimization approaches to ensure effective pruning in both static and incremental cases. We further show that the lessons learned in the declarative implementation can be equally applied to more traditional optimizer implementations. PMID:28659658

  8. A comparison of time-shared vs. batch development of space software

    NASA Technical Reports Server (NTRS)

    Forthofer, M.

    1977-01-01

    In connection with a study regarding the ground support software development for the Space Shuttle, an investigation was conducted concerning the most suitable software development techniques to be employed. A time-sharing 'trial period' was used to determine whether or not time-sharing would be a cost-effective software development technique for the Ground Based Shuttle system. It was found that time-sharing substantially improved job turnaround and programmer access to the computer for the representative group of ground support programmers. Moreover, this improvement resulted in an estimated saving of over fifty programmer days during the trial period.

  9. Cost-effectiveness of electroconvulsive therapy compared to repetitive transcranial magnetic stimulation for treatment-resistant severe depression: a decision model.

    PubMed

    Vallejo-Torres, L; Castilla, I; González, N; Hunter, R; Serrano-Pérez, P; Perestelo-Pérez, L

    2015-05-01

    Electroconvulsive therapy (ECT) is widely applied to treat severe depression resistant to standard treatment. Results from previous studies comparing the cost-effectiveness of this technique with treatment alternatives such as repetitive transcranial magnetic stimulation (rTMS) are conflicting. We conducted a cost-effectiveness analysis comparing ECT alone, rTMS alone, and rTMS followed by ECT when rTMS fails, from the perspective of the Spanish National Health Service. The analysis is based on a Markov model which simulates the costs and health outcomes of individuals treated under these alternatives over a 12-month period. Data to populate this model were extracted and synthesized from a series of randomized controlled trials and other studies that have compared these techniques in the patient group of interest. We measure effectiveness using quality-adjusted life years (QALYs) and characterize the uncertainty using probabilistic sensitivity analyses. ECT alone was found to be less costly and more effective than rTMS alone, while the strategy of providing rTMS followed by ECT when rTMS fails is the most expensive and most effective option. The incremental cost per QALY gained of this latter strategy was found to be above the reference willingness-to-pay threshold used in these types of studies in Spain and other countries. The probability that ECT alone is the most cost-effective alternative was estimated to be around 70%. ECT is likely to be the most cost-effective option in the treatment of resistant severe depression for a willingness to pay of €30,000 per QALY.
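    The probabilistic sensitivity analysis the abstract refers to can be sketched as a Monte Carlo loop: draw parameter values, compute each strategy's net monetary benefit (NMB) at the willingness-to-pay threshold, and count how often each strategy wins. The cost and QALY distributions below are entirely invented and are not the study's inputs; the sketch only illustrates the mechanics behind a statement like "the probability that ECT alone is most cost-effective is around 70%".

    ```python
    # Sketch of a probabilistic sensitivity analysis at a willingness to
    # pay of EUR 30,000 per QALY. All distributions are invented.
    import random

    random.seed(0)
    WTP = 30_000  # willingness to pay, EUR per QALY
    N = 10_000    # Monte Carlo draws

    def draw():
        # (cost in EUR, QALYs) per strategy over 12 months; illustrative only
        return {
            "ECT":      (random.gauss(6_000, 1_000),  random.gauss(0.70, 0.05)),
            "rTMS":     (random.gauss(7_000, 1_000),  random.gauss(0.65, 0.05)),
            "rTMS+ECT": (random.gauss(11_000, 1_500), random.gauss(0.72, 0.05)),
        }

    wins = {"ECT": 0, "rTMS": 0, "rTMS+ECT": 0}
    for _ in range(N):
        params = draw()
        # Net monetary benefit: WTP * QALYs - cost; highest NMB wins the draw
        nmb = {s: WTP * q - c for s, (c, q) in params.items()}
        wins[max(nmb, key=nmb.get)] += 1

    for s, w in wins.items():
        print(f"P({s} most cost-effective) ~ {w / N:.2f}")
    ```

    The fraction of draws a strategy wins traces out one point of its cost-effectiveness acceptability curve at the chosen threshold.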

  10. Prioritizing investments in health technology assessment. Can we assess potential value for money?

    PubMed

    Davies, L; Drummond, M; Papanikolaou, P

    2000-01-01

    The objective was to develop an economic prioritization model to assist those involved in the selection and prioritization of health technology assessment (HTA) topics and the commissioning of HTA projects. The model used decision analytic techniques to estimate the expected costs and benefits of the health care interventions that were the focus of the HTA question(s) considered by the NHS Health Technology Assessment Programme in England. Initial estimation of the value for money of HTA was conducted for several topics considered in 1997 and 1998. The results indicate that, using information routinely available in the literature and from the vignettes, it was not possible to estimate the absolute value of HTA with any certainty at this stage of the prioritization process. Overall, the results were uncertain for 65% of the HTA questions or topics analyzed. The relative costs of the interventions or technologies compared to existing costs of care, and the likely levels of utilization, were critical factors in most of the analyses. The probability that the technology was effective and the impact of the HTA on utilization rates were also key determinants of expected costs and benefits. The main conclusion was that it is feasible to conduct ex ante assessments of the value for money of HTA for specific topics. However, substantial work is required to ensure that the methods used are valid, reliable, consistent, and an efficient use of valuable research time.

  11. A Model Framework to Estimate Impact and Cost of Genetics-Based Sterile Insect Methods for Dengue Vector Control

    PubMed Central

    Alphey, Nina; Alphey, Luke; Bonsall, Michael B.

    2011-01-01

    Vector-borne diseases impose enormous health and economic burdens and additional methods to control vector populations are clearly needed. The Sterile Insect Technique (SIT) has been successful against agricultural pests, but is not in large-scale use for suppressing or eliminating mosquito populations. Genetic RIDL technology (Release of Insects carrying a Dominant Lethal) is a proposed modification that involves releasing insects that are homozygous for a repressible dominant lethal genetic construct rather than being sterilized by irradiation, and could potentially overcome some technical difficulties with the conventional SIT technology. Using the arboviral disease dengue as an example, we combine vector population dynamics and epidemiological models to explore the effect of a program of RIDL releases on disease transmission. We use these to derive a preliminary estimate of the potential cost-effectiveness of vector control by applying estimates of the costs of SIT. We predict that this genetic control strategy could eliminate dengue rapidly from a human community, and at lower expense (approximately US$2–30 per case averted) than the direct and indirect costs of disease (mean US$86–190 per case of dengue). The theoretical framework has wider potential use; by appropriately adapting or replacing each component of the framework (entomological, epidemiological, vector control bio-economics and health economics), it could be applied to other vector-borne diseases or vector control strategies and extended to include other health interventions. PMID:21998654

  12. Behavioural Interventions for Urinary Incontinence in Community-Dwelling Seniors

    PubMed Central

    2008-01-01

    Executive Summary In early August 2007, the Medical Advisory Secretariat began work on the Aging in the Community project, an evidence-based review of the literature surrounding healthy aging in the community. The Health System Strategy Division at the Ministry of Health and Long-Term Care subsequently asked the secretariat to provide an evidentiary platform for the ministry’s newly released Aging at Home Strategy. After a broad literature review and consultation with experts, the secretariat identified 4 key areas that strongly predict an elderly person’s transition from independent community living to a long-term care home. Evidence-based analyses have been prepared for each of these 4 areas: falls and fall-related injuries, urinary incontinence, dementia, and social isolation. For the first area, falls and fall-related injuries, an economic model is described in a separate report. Please visit the Medical Advisory Secretariat Web site, http://www.health.gov.on.ca/english/providers/program/mas/mas_about.html, to review these titles within the Aging in the Community series. Aging in the Community: Summary of Evidence-Based Analyses Prevention of Falls and Fall-Related Injuries in Community-Dwelling Seniors: An Evidence-Based Analysis Behavioural Interventions for Urinary Incontinence in Community-Dwelling Seniors: An Evidence-Based Analysis Caregiver- and Patient-Directed Interventions for Dementia: An Evidence-Based Analysis Social Isolation in Community-Dwelling Seniors: An Evidence-Based Analysis The Falls/Fractures Economic Model in Ontario Residents Aged 65 Years and Over (FEMOR) Objective To assess the effectiveness of behavioural interventions for the treatment and management of urinary incontinence (UI) in community-dwelling seniors. 
Clinical Need: Target Population and Condition Urinary incontinence defined as “the complaint of any involuntary leakage of urine” was identified as 1 of the key predictors in a senior’s transition from independent community living to admission to a long-term care (LTC) home. Urinary incontinence is a health problem that affects a substantial proportion of Ontario’s community-dwelling seniors (and indirectly affects caregivers), impacting their health, functioning, well-being and quality of life. Based on Canadian studies, prevalence estimates range from 9% to 30% for senior men and nearly double from 19% to 55% for senior women. The direct and indirect costs associated with UI are substantial. It is estimated that the total annual costs in Canada are $1.5 billion (Cdn), and that each year a senior living at home will spend $1,000 to $1,500 on incontinence supplies. Interventions to treat and manage UI can be classified into broad categories which include lifestyle modification, behavioural techniques, medications, devices (e.g., continence pessaries), surgical interventions and adjunctive measures (e.g., absorbent products). The focus of this review is behavioural interventions, since they are commonly the first line of treatment considered in seniors given that they are the least invasive options with no reported side effects, do not limit future treatment options, and can be applied in combination with other therapies. In addition, many seniors would not be ideal candidates for other types of interventions involving more risk, such as surgical measures. Note: It is recognized that the terms “senior” and “elderly” carry a range of meanings for different audiences; this report generally uses the former, but the terms are treated here as essentially interchangeable. Description of Technology/Therapy Behavioural interventions can be divided into 2 categories according to the target population: caregiver-dependent techniques and patient-directed techniques. 
Caregiver-dependent techniques (also known as toileting assistance) are targeted at medically complex, frail individuals living at home with the assistance of a caregiver, who tends to be a family member. These seniors may also have cognitive deficits and/or motor deficits. A health care professional trains the senior’s caregiver to deliver an intervention such as prompted voiding, habit retraining, or timed voiding. The health care professional who trains the caregiver is commonly a nurse or a nurse with advanced training in the management of UI, such as a nurse continence advisor (NCA) or a clinical nurse specialist (CNS). The second category of behavioural interventions consists of patient-directed techniques targeted towards mobile, motivated seniors. Seniors in this population are cognitively able, free from any major physical deficits, and motivated to regain and/or improve their continence. A nurse or a nurse with advanced training in UI management, such as an NCA or CNS, delivers the patient-directed techniques. These are often provided as multicomponent interventions including a combination of bladder training techniques, pelvic floor muscle training (PFMT), education on bladder control strategies, and self-monitoring. Pelvic floor muscle training, defined as a program of repeated pelvic floor muscle contractions taught and supervised by a health care professional, may be employed as part of a multicomponent intervention or in isolation. Education is a large component of both caregiver-dependent and patient-directed behavioural interventions, and patient and/or caregiver involvement as well as continued practice strongly affect the success of treatment. Incontinence products, which include a large variety of pads and devices for effective containment of urine, may be used in conjunction with behavioural techniques at any point in the patient’s management. 
    Evidence-Based Analysis Methods
    A comprehensive search strategy was used to identify systematic reviews and randomized controlled trials that examined the effectiveness, safety, and cost-effectiveness of caregiver-dependent and patient-directed behavioural interventions for the treatment of UI in community-dwelling seniors (see Appendix 1).
    Research Questions
    Are caregiver-dependent behavioural interventions effective in improving UI in medically complex, frail community-dwelling seniors with/without cognitive deficits and/or motor deficits?
    Are patient-directed behavioural interventions effective in improving UI in mobile, motivated community-dwelling seniors?
    Are behavioural interventions delivered by NCAs or CNSs in a clinic setting effective in improving incontinence outcomes in community-dwelling seniors?
    Assessment of Quality of Evidence
    The quality of the evidence was assessed as high, moderate, low, or very low according to the GRADE methodology of the GRADE Working Group. As per GRADE, the following definitions apply:
    High: Further research is very unlikely to change confidence in the estimate of effect.
    Moderate: Further research is likely to have an important impact on confidence in the estimate of effect and may change the estimate.
    Low: Further research is very likely to have an important impact on confidence in the estimate of effect and is likely to change the estimate.
    Very Low: Any estimate of effect is very uncertain.
    Summary of Findings
    Executive Summary Table 1 summarizes the results of the analysis. The available evidence was limited by considerable variation in study populations and in the type and severity of UI for studies examining both caregiver-directed and patient-directed interventions. The UI literature is frequently limited to reporting subjective outcome measures such as patient observations and symptoms. The primary outcome of interest, admission to a LTC home, was not reported in the UI literature.
    The number of eligible studies was low, and there were limited data on long-term follow-up.
    Executive Summary Table 1: Summary of Evidence on Behavioural Interventions for the Treatment of Urinary Incontinence in Community-Dwelling Seniors
    1. Caregiver-dependent techniques (toileting assistance)
       Target population: medically complex, frail individuals at home with/without cognitive deficits and/or motor deficits; delivered by informal caregivers who are trained by a nurse or a nurse with specialized UI training (NCA/CNS).
       Interventions: prompted voiding; habit retraining; timed voiding.
       Conclusions: There is no evidence of effectiveness for habit retraining (n=1 study) or timed voiding (n=1 study). Prompted voiding may be effective, but effectiveness is difficult to substantiate because of an inadequately powered study (n=1 study). Resource implications and caregiver burden (usually on an informal caregiver) should be considered.
       GRADE quality of the evidence: Low
    2. Patient-directed techniques
       Target population: mobile, motivated seniors; delivered by a nurse or a nurse with specialized UI training (NCA/CNS).
       Interventions: multicomponent behavioural interventions, including a combination of bladder training, PFMT (with or without biofeedback), bladder control strategies, education, and self-monitoring.
       Conclusions: Significant reduction in the mean number of incontinent episodes per week (n=5 studies, WMD 3.63, 95% CI 2.07–5.19); significant improvement in patients' perception of UI (n=3 studies, OR 4.15, 95% CI 2.70–6.37); suggestive beneficial impact on patients' health-related quality of life.
       GRADE quality of the evidence: Moderate
       Interventions: PFMT alone.
       Conclusions: Significant reduction in the mean number of incontinent episodes per week (n=1 study, WMD 10.50, 95% CI 4.30–16.70).
       GRADE quality of the evidence: Moderate
    3. Behavioural interventions led by an NCA/CNS in a clinic setting
       Target population: community-dwelling seniors.
       Interventions: behavioural interventions led by an NCA/CNS.
       Conclusions: Overall, effective in improving incontinence outcomes (n=3 RCTs + 1 Ontario-based before/after study).
       GRADE quality of the evidence: Moderate
    * CI refers to confidence interval; CNS, clinical nurse specialist; NCA, nurse continence advisor; PFMT, pelvic floor muscle training; RCT, randomized controlled trial; WMD, weighted mean difference; UI, urinary incontinence.
    Economic Analysis
    A budget impact analysis was conducted to forecast costs for caregiver-dependent and patient-directed multicomponent behavioural techniques delivered by NCAs, and for PFMT alone delivered by physiotherapists. All costs are reported in 2008 Canadian dollars. Based on epidemiological data, published medical literature, and clinical expert opinion, the annual cost of caregiver-dependent behavioural techniques was estimated to be $9.2 M, while the annual costs of patient-directed behavioural techniques delivered by either an NCA or a physiotherapist were estimated to be $25.5 M and $36.1 M, respectively. Estimates will vary if the underlying assumptions are changed. Currently, the province of Ontario absorbs the cost of NCAs (available through the 42 Community Care Access Centres across the province) in the home setting. The 2007 Incontinence Care in the Community Report estimated that the total cost absorbed by the public system of providing continence care in the home is $19.5 M in Ontario. This cost estimate included resources such as personnel, communication with physicians, record keeping, and product costs. Clinic costs were not included in this estimation because currently these come out of the global budget of the respective hospital and very few continence clinics actually exist in the province. The budget impact analysis factored in a cost for the clinic setting, assuming that the public system would absorb the cost with this new model of community care.
    Considerations for the Ontario Health System
    An expert panel on aging in the community met on 3 occasions from January to May 2008 and, in part, discussed treatment of UI in seniors in Ontario with a focus on caregiver-dependent and patient-directed behavioural interventions. In particular, the panel discussed how treatment for UI is made available to seniors in Ontario and who provides the service. Some of the major themes arising from the discussions included: services/interventions that currently exist in Ontario offering behavioural interventions to treat UI are not consistent; there is a lack of consistency in how seniors access services for treatment of UI, who manages patients, and what treatment patients receive; help-seeking behaviours are important to consider when designing optimal service delivery methods; there is considerable social stigma associated with UI and therefore a need for public education and an awareness campaign; and the cost of incontinence supplies and the availability of NCAs were highlighted.
    Conclusions
    There is moderate-quality evidence that the following interventions are effective in improving UI in mobile, motivated seniors: multicomponent behavioural interventions, including a combination of bladder training techniques, PFMT (with or without biofeedback), education on bladder control strategies and self-monitoring techniques; and pelvic floor muscle training alone. There is moderate-quality evidence that behavioural interventions led by NCAs or CNSs in a clinic setting are effective in improving UI in seniors. There is limited low-quality evidence that prompted voiding may be effective in medically complex, frail seniors with motivated caregivers. There is insufficient evidence for the following interventions in medically complex, frail seniors with motivated caregivers: habit retraining and timed voiding. PMID:23074508

  13. NASA Instrument Cost/Schedule Model

    NASA Technical Reports Server (NTRS)

    Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George

    2011-01-01

    NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system-level cost estimation tool; a subsystem-level cost estimation tool; a database of cost and technical parameters for over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phase (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, the data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross validation), the estimating equations themselves, and a demonstration of the NICM tool suite.

  14. Estimating Physical Activity Energy Expenditure with the Kinect Sensor in an Exergaming Environment

    PubMed Central

    Nathan, David; Huynh, Du Q.; Rubenson, Jonas; Rosenberg, Michael

    2015-01-01

    Active video games that require physical exertion during game play have been shown to confer health benefits. Typically, energy expended during game play is measured using devices attached to players, such as accelerometers, or portable gas analyzers. Since 2010, active video gaming technology has incorporated marker-less motion capture devices to bring human movement into game play. Using the Kinect sensor and the Microsoft SDK, this research aimed to estimate the mechanical work performed by the human body and the subsequent metabolic energy using predictive algorithmic models. Nineteen university students participated in a repeated measures experiment performing four fundamental movements (arm swings, standing jumps, body-weight squats, and jumping jacks). Metabolic energy was captured using a Cortex Metamax 3B automated gas analysis system, with mechanical movement captured by the combined motion data from two Kinect cameras. Estimates of the body segment properties, such as segment mass, length, centre of mass position, and radius of gyration, were calculated from de Leva's adjustment of the Zatsiorsky-Seluyanov parameters, with an additional adjustment made for posture cost. The GPML toolbox implementation of Gaussian process regression, a locally weighted k-nearest-neighbour regression, and a linear regression technique were evaluated for their performance in predicting metabolic cost from new feature vectors. The experimental results show that Gaussian process regression outperformed the other two techniques by a small margin. This study demonstrated that physical activity energy expenditure during exercise, using the Kinect camera as a motion capture system, can be estimated from segmental mechanical work. Estimates for high-energy activities, such as standing jumps and jumping jacks, can be made accurately, but for low-energy activities, such as squatting, the posture of static poses should be considered as a contributing factor.
When translated into the active video gaming environment, the results could be incorporated into game play to more accurately control the energy expenditure requirements. PMID:26000460
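    The segmental mechanical-work idea can be sketched numerically: given a segment's mass and its centre-of-mass trajectory from motion capture, positive increments of kinetic plus potential energy approximate the external mechanical work. The segment mass, frame rate, and synthetic vertical-jump trajectory below are illustrative assumptions, not study data.

    ```python
    # Sketch: external mechanical work of one body segment from positive
    # increments of (kinetic + potential) energy. Trajectory is synthetic.
    import numpy as np

    G = 9.81
    dt = 1 / 30.0  # assumed camera frame interval, s

    def segment_work(mass, com_z, com_xyz):
        """Sum of positive increments of (kinetic + potential) energy, J."""
        v = np.diff(com_xyz, axis=0) / dt          # frame-to-frame velocity
        ke = 0.5 * mass * (v ** 2).sum(axis=1)     # kinetic energy
        pe = mass * G * com_z[1:]                  # potential energy
        e = ke + pe
        return np.clip(np.diff(e), 0, None).sum()  # only energy increases

    # Synthetic vertical rise-and-fall of a 5 kg "segment" over 1 s
    t = np.linspace(0, 1, 31)
    z = 1.0 + 0.3 * np.sin(np.pi * t)              # height above floor, m
    xyz = np.stack([np.zeros_like(t), np.zeros_like(t), z], axis=1)
    print(f"mechanical work ~ {segment_work(5.0, z, xyz):.1f} J")
    ```

    Summing such per-segment work terms, with segment masses from anthropometric tables, gives the mechanical-work feature that the study's regression models map to metabolic cost.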

  15. Joint estimation of vertical total electron content (VTEC) and satellite differential code biases (SDCBs) using low-cost receivers

    NASA Astrophysics Data System (ADS)

    Zhang, Baocheng; Teunissen, Peter J. G.; Yuan, Yunbin; Zhang, Hongxing; Li, Min

    2018-04-01

    Vertical total electron content (VTEC) parameters estimated using global navigation satellite system (GNSS) data are of great interest for ionosphere sensing. Satellite differential code biases (SDCBs) account for one source of error which, if left uncorrected, can deteriorate the performance of positioning, timing and other applications. The customary approach to estimating VTEC along with SDCBs from dual-frequency GNSS data, hereinafter referred to as the DF approach, consists of two sequential steps. The first step seeks to retrieve the ionospheric observable through the carrier-to-code leveling technique. This observable, related to the slant total electron content (STEC) along the satellite-receiver line of sight, is biased also by the SDCBs and the receiver differential code biases (RDCBs). By means of a thin-layer ionospheric model, in the second step one is able to isolate the VTEC, the SDCBs and the RDCBs from the ionospheric observables. In this work, we present a single-frequency (SF) approach enabling the joint estimation of VTEC and SDCBs using low-cost receivers; this approach is also based on two steps and differs from the DF approach only in the first step, where we turn to the precise point positioning technique to retrieve from the single-frequency GNSS data the ionospheric observables, interpreted as the combination of the STEC, the SDCBs and the biased receiver clocks at the pivot epoch. Our numerical analyses clarify how the SF approach performs when applied to GPS L1 data collected by a single receiver under both calm and disturbed ionospheric conditions. The daily time series of zenith VTEC estimates has an accuracy ranging from a few tenths of a TEC unit (TECU) to approximately 2 TECU. For 73-96% of GPS satellites in view, the daily estimates of SDCBs do not deviate, in absolute value, by more than 1 ns from the ground truth values published by the Centre for Orbit Determination in Europe.
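    The thin-layer mapping used in the second step of both approaches can be sketched as follows: the ionosphere is collapsed onto a shell at an assumed height, and an obliquity factor converts slant TEC to vertical TEC at the pierce point. The shell height, example elevation, and example slant value below are illustrative assumptions.

    ```python
    # Sketch of the single-layer (thin-shell) mapping STEC = M * VTEC.
    # Shell height and example numbers are illustrative assumptions.
    import math

    R_E = 6371.0   # mean Earth radius, km
    H = 450.0      # assumed ionospheric shell height, km

    def mapping_function(elevation_deg):
        """Obliquity factor M such that STEC = M * VTEC."""
        z = math.radians(90.0 - elevation_deg)       # zenith angle at receiver
        sin_zp = (R_E / (R_E + H)) * math.sin(z)     # zenith angle at the shell
        return 1.0 / math.sqrt(1.0 - sin_zp ** 2)

    # Only after the SDCB and RDCB terms have been separated in the
    # adjustment does the slant observable reduce to STEC (here assumed
    # bias-free, 30 TECU at 30 degrees elevation):
    stec = 30.0
    vtec = stec / mapping_function(30.0)
    print(f"VTEC ~ {vtec:.1f} TECU")
    ```

    At zenith the factor is exactly 1, and it grows toward low elevations, which is why low-elevation observations carry the largest mapping error in this model.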

  16. A PLL-based resampling technique for vibration analysis in variable-speed wind turbines with PMSG: A bearing fault case

    NASA Astrophysics Data System (ADS)

    Pezzani, Carlos M.; Bossio, José M.; Castellino, Ariel M.; Bossio, Guillermo R.; De Angelo, Cristian H.

    2017-02-01

    Condition monitoring in permanent magnet synchronous machines has gained interest due to their increasing use in applications such as electric traction and power generation. Particularly in wind power generation, non-invasive condition monitoring techniques are of great importance. Usually, in such applications the access to the generator is complex and costly, while unexpected breakdowns result in high repair costs. This paper presents a technique which allows using vibration analysis for bearing fault detection in permanent magnet synchronous generators used in wind turbines. Given that in wind power applications the generator rotational speed may vary during normal operation, special sampling techniques are necessary to apply spectral analysis of mechanical vibrations. In this work, a resampling technique based on order tracking without measuring the rotor position is proposed. To synchronize sampling with rotor position, the rotor position is estimated from the angle of the voltage vector, obtained from a phase-locked loop synchronized with the generator voltages. The proposed strategy is validated by laboratory experimental results obtained from a permanent magnet synchronous generator. Results with single-point defects in the outer race of a bearing under variable speed and load conditions are presented.
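
    The order-tracking resampling idea can be sketched as follows; here the rotor angle is given analytically for a synthetic speed ramp, whereas the paper estimates it from a PLL locked to the generator voltages.

```python
import numpy as np

def angular_resample(t, x, theta, samples_per_rev=64):
    """Resample signal x(t) at uniform shaft-angle increments (order tracking).

    theta is the monotonically increasing rotor angle in revolutions at each
    time t; in the paper it would come from a PLL on the generator voltages
    rather than from a position sensor.
    """
    n_rev = int(np.floor(theta[-1] - theta[0]))
    theta_u = theta[0] + np.arange(n_rev * samples_per_rev) / samples_per_rev
    t_u = np.interp(theta_u, theta, t)   # times at which the shaft reaches each angle
    return np.interp(t_u, t, x)          # signal values at those times

# Synthetic example: speed ramps from 10 to 20 rev/s while a vibration
# component stays locked to shaft order 5.
t = np.linspace(0.0, 2.0, 20000)
theta = 10.0 * t + 2.5 * t**2            # integral of speed, in revolutions
x = np.sin(2 * np.pi * 5 * theta)        # 5th-order component
xr = angular_resample(t, x, theta)

# After resampling, the component sits at exactly 5 cycles per revolution.
spec = np.abs(np.fft.rfft(xr))
order = np.argmax(spec[1:]) + 1
print(order / (len(xr) / 64))            # -> 5.0 cycles/revolution
```

    Resampling into the angle domain turns speed-varying shaft harmonics into fixed "orders," so a bearing fault signature stays in one spectral bin despite speed changes.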

  17. Serological testing versus other strategies for diagnosis of active tuberculosis in India: a cost-effectiveness analysis.

    PubMed

    Dowdy, David W; Steingart, Karen R; Pai, Madhukar

    2011-08-01

    Undiagnosed and misdiagnosed tuberculosis (TB) drives the epidemic in India. Serological (antibody detection) TB tests are not recommended by any agency, but widely used in many countries, including the Indian private sector. The cost and impact of using serology compared with other diagnostic techniques is unknown. Taking a patient cohort conservatively equal to the annual number of serological tests done in India (1.5 million adults suspected of having active TB), we used decision analysis to estimate costs and effectiveness of sputum smear microscopy (US$3.62 for two smears), microscopy plus automated liquid culture (mycobacterium growth indicator tube [MGIT], US$20/test), and serological testing (anda-tb ELISA, US$20/test). Data on test accuracy and costs were obtained from published literature. We adopted the perspective of the Indian TB control sector and an analysis frame of 1 year. Our primary outcome was the incremental cost per disability-adjusted life year (DALY) averted. We performed one-way sensitivity analysis on all model parameters, with multiway sensitivity analysis on variables to which the model was most sensitive. If used instead of sputum microscopy, serology generated an estimated 14,000 more TB diagnoses, but also 121,000 more false-positive diagnoses, 102,000 fewer DALYs averted, and 32,000 more secondary TB cases than microscopy, at approximately four times the incremental cost (US$47.5 million versus US$11.9 million). When added to high-quality sputum smears, MGIT culture was estimated to avert 130,000 incremental DALYs at an incremental cost of US$213 per DALY averted. Serology was dominated by (i.e., more costly and less effective than) MGIT culture and remained less economically favorable than sputum smear or TB culture in one-way and multiway sensitivity analyses. 
In India, sputum smear microscopy remains the most cost-effective diagnostic test available for active TB; efforts to increase access to quality-assured microscopy should take priority. In areas where high-quality microscopy exists and resources are sufficient, MGIT culture is more cost-effective than serology as an additional diagnostic test for TB. These data informed a recently published World Health Organization policy statement against serological tests.
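
    The study's primary outcome, incremental cost per DALY averted, is a simple ratio of cost and effect differences; a minimal sketch with illustrative figures loosely matching the abstract's MGIT result (not the paper's full decision model).

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Illustrative numbers only: a strategy that costs US$27.7M more and averts
# 130,000 more DALYs than its comparator.
print(round(icer(27.7e6, 0.0, 130_000, 0)))  # -> 213 (US$ per DALY averted)
```

    Serology being "dominated" means it loses on both axes at once (more costly and less effective), so no willingness-to-pay threshold can make it the preferred option.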

  18. Enteral Feeding Set Handling Techniques.

    PubMed

    Lyman, Beth; Williams, Maria; Sollazzo, Janet; Hayden, Ashley; Hensley, Pam; Dai, Hongying; Roberts, Cristine

    2017-04-01

    Enteral nutrition therapy is common practice in pediatric clinical settings. Often patients will receive a pump-assisted bolus feeding over 30 minutes several times per day using the same enteral feeding set (EFS). This study aims to determine the safest and most efficacious way to handle the EFS between feedings. Three EFS handling techniques were compared through simulation for bacterial growth, nursing time, and supply costs: (1) rinsing the EFS with sterile water after each feeding, (2) refrigerating the EFS between feedings, and (3) using a ready-to-hang (RTH) product maintained at room temperature. Cultures were obtained at baseline, hour 12, and hour 21 of the 24-hour cycle. A time-in-motion analysis was conducted and reported in average number of seconds to complete each procedure. Supply costs were inventoried for 1 month comparing the actual usage to our estimated usage. Of 1080 cultures obtained, the overall bacterial growth rate was 8.7%. The rinse and refrigeration techniques displayed similar bacterial growth (11.4% vs 10.3%, P = .63). The RTH technique displayed the least bacterial growth of any method (4.4%, P = .002). The time analysis in minutes showed the rinse method was the most time-consuming (44.8 ± 2.7) vs refrigeration (35.8 ± 2.6) and RTH (31.08 ± 0.6) ( P < .0001). All 3 EFS handling techniques displayed low bacterial growth. RTH was superior in bacterial growth, nursing time, and supply costs. Since not all pediatric formulas are available in RTH, we conclude that refrigerating the EFS between uses is the next most efficacious method for handling the EFS between bolus feeds.

  19. The Economic Burden of Child Maltreatment in the United States And Implications for Prevention

    PubMed Central

    Fang, Xiangming; Brown, Derek S.; Florence, Curtis; Mercy, James A.

    2013-01-01

    Objectives To present new estimates of the average lifetime cost per child maltreatment victim and the aggregate lifetime cost of all new child maltreatment cases incurred in 2008, using an incidence-based approach. Methods This study used the best available secondary data to develop cost-per-case estimates. For each cost category, the paper used attributable costs whenever possible. For categories where attributable cost data were not available, costs were estimated as the product of the incremental effect of child maltreatment on a specific outcome and the estimated cost associated with that outcome. The estimate of the aggregate lifetime cost of child maltreatment in 2008 was obtained by multiplying per-victim lifetime cost estimates by the estimated number of new child maltreatment cases in 2008. Results The estimated average lifetime cost per victim of nonfatal child maltreatment is $210,012 in 2010 dollars, including $32,648 in childhood health care costs; $10,530 in adult medical costs; $144,360 in productivity losses; $7,728 in child welfare costs; $6,747 in criminal justice costs; and $7,999 in special education costs. The estimated average lifetime cost per death is $1,272,900, including $14,100 in medical costs and $1,258,800 in productivity losses. The total lifetime economic burden resulting from new cases of fatal and nonfatal child maltreatment in the United States in 2008 is approximately $124 billion. In sensitivity analysis, the total burden is estimated to be as large as $585 billion. Conclusions Compared with other health problems, the burden of child maltreatment is substantial, indicating the importance of prevention efforts to address the high prevalence of child maltreatment. PMID:22300910
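
    The per-victim totals in the abstract are sums of the reported component costs, and the incidence-based aggregate is the per-victim cost multiplied by the number of new cases; a quick arithmetic check:

```python
# Per-victim lifetime cost components for nonfatal maltreatment (2010 US$),
# exactly as reported in the abstract.
nonfatal = {
    "childhood health care": 32_648,
    "adult medical": 10_530,
    "productivity losses": 144_360,
    "child welfare": 7_728,
    "criminal justice": 6_747,
    "special education": 7_999,
}
print(sum(nonfatal.values()))  # -> 210012, matching the reported per-victim total

fatal = {"medical": 14_100, "productivity losses": 1_258_800}
print(sum(fatal.values()))     # -> 1272900, matching the reported cost per death
```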

  20. Performance assessment of fire-sat monitoring system based on satellite time series for fire danger estimation : the experience of the pre-operative application in the Basilicata Region (Italy)

    NASA Astrophysics Data System (ADS)

    Lanorte, Antonio; Desantis, Fortunato; Aromando, Angelo; Lasaponara, Rosa

    2013-04-01

    This paper presents the results obtained in the context of the FIRE-SAT project during the 2012 operative application of satellite-based tools for fire monitoring. The FIRE-SAT project was funded by the Civil Protection of the Basilicata Region in order to set up a low-cost methodology for fire danger monitoring and fire effect estimation based on satellite Earth Observation techniques. To this aim, NASA Moderate Resolution Imaging Spectroradiometer (MODIS), ASTER and Landsat TM data were used. Novel data processing techniques have been developed by researchers of the ARGON Laboratory of the CNR-IMAA for the operative monitoring of fire. In this paper we focus only on the danger estimation model, which was used fruitfully from 2008 to 2012 as a reliable operative tool to support and optimize fire-fighting strategies, from the alert phase to the management of resources, including fire attacks. The daily updating of fire danger is carried out using satellite MODIS images, selected for their spectral capability and their free availability from the NASA web site. This makes these data sets very suitable for effective, systematic (daily) and sustainable low-cost monitoring of large areas. The pre-operative use of the integrated model showed that the system properly monitors spatial and temporal variations of fire susceptibility and provides useful information on both fire severity and post-fire regeneration capability.

  1. New Technologies to Reclaim Arid Lands User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    W. K. Ostler

    2002-10-01

    Approximately 70 percent of all U.S. military training lands are located in arid and semi-arid areas. Training activities in such areas frequently adversely affect vegetation, damaging plants and reducing the ability of vegetation to recover once disturbed. Fugitive dust resulting from a loss of vegetation creates additional problems for human health, increases accidents due to decreased visibility, and increases maintenance costs for roads, vehicles, and equipment. Under conventional technologies to mitigate these impacts, it is estimated that up to 35 percent of revegetation projects in arid areas will fail due to unpredictable natural environmental conditions, such as drought, and reclamation techniques that were inadequate to restore vegetative cover in a timely and cost-effective manner. New reclamation and restoration techniques are needed in desert ranges to help mitigate the adverse effects of military training and other activities on arid-land environments. In 1999, a cooperative effort between the U.S. Department of Energy (DOE), the U.S. Department of Defense (DoD), and selected university scientists was undertaken to focus on mitigating military impacts in arid lands. As arid lands are impacted by DoD and DOE activities, biological and soil resources are gradually lost and the habitat is altered. A conceptual model of that change in habitat quality is described for varying levels of disturbance in the Mojave Desert. As the habitat quality degrades and more biological and physical resources are lost from training areas, greater costs are required to return the land to sustainable levels. The purpose of this manual is to assist land managers in recognizing thresholds associated with habitat degradation and to provide reclamation planning and techniques that can reduce the costs of mitigation for these impacted lands and ensure their sustainable use.
The importance of reclamation planning is described in this manual, with suggestions about establishing project objectives, scheduling, budgeting, and selecting cost-effective techniques. Reclamation techniques are covered in sections describing: (1) erosion control (physical, chemical, and biological), (2) site preparation, (3) soil amendments, (4) seeding, (5) planting, (6) grazing and weed control, (7) mulching, (8) irrigation, and (9) site protection. Each section states the objectives of the technique, the principles, an in-depth look at the techniques, and any special considerations as they relate to DoD or DOE lands. The need for monitoring and remediation is described to guide users in monitoring reclamation efforts and evaluating their cost-effectiveness. Costs are provided for the proposed techniques for the major deserts of the southwestern U.S., showing the average and range of costs. A set of decision tools is provided in the form of a flow diagram and a table to guide users in selecting effective reclamation techniques to achieve mitigation objectives. Recommendations are provided to summarize key reclamation principles and to assist users in developing a successful program that contributes to sustainable uses of DoD and DOE lands. The user's manual helps managers communicate to installation management the needs and consequences of training decisions and the costs required to achieve successful levels of sustainable use. This user's manual focuses on new reclamation techniques that have been implemented at the National Training Center at Fort Irwin, California, and are applicable to most arid land reclamation efforts.

  2. Taking a statistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wild, M.; Rouhani, S.

    1995-02-01

    A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. The use of geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and help characterize complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random, and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs and long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
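
    The spatial structure described here (nearby samples more alike than distant ones) is conventionally quantified with an experimental semivariogram; a minimal sketch, not tied to any specific dataset:

```python
import numpy as np

def semivariogram(coords, values, lags, tol):
    """Classical (Matheron) experimental semivariogram.

    gamma(h) is the mean of 0.5*(z_i - z_j)^2 over point pairs whose
    separation distance falls within tol of lag h.  Small gamma at short
    lags is the spatial structure that geostatistics exploits.
    """
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(len(values), k=1)            # count each pair once
    dist, sq = d[iu], 0.5 * (values[iu[0]] - values[iu[1]]) ** 2
    return [sq[np.abs(dist - h) <= tol].mean() for h in lags]

# Tiny illustration on a 1-D transect with smooth variation: dissimilarity
# grows with separation distance, as the passage describes.
x = np.arange(10.0)
z = np.sin(x / 3.0)
g = semivariogram(np.c_[x, np.zeros(10)], z, lags=[1.0, 4.0], tol=0.5)
print(g[0] < g[1])  # -> True
```

    A model fitted to this curve is what kriging then uses to weight neighbors when estimating values at unsampled locations.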

  3. Retrospective Assessment of Cost Savings From Prevention: Folic Acid Fortification and Spina Bifida in the U.S.

    PubMed

    Grosse, Scott D; Berry, Robert J; Mick Tilford, J; Kucik, James E; Waitzman, Norman J

    2016-05-01

    Although fortification of food with folic acid has been calculated to be cost saving in the U.S., updated estimates are needed. This analysis calculates new estimates from the societal perspective of net cost savings per year associated with mandatory folic acid fortification of enriched cereal grain products in the U.S. that was implemented during 1997-1998. Estimates of annual numbers of live-born spina bifida cases in 1995-1996 relative to 1999-2011 based on birth defects surveillance data were combined during 2015 with published estimates of the present value of lifetime direct costs updated in 2014 U.S. dollars for a live-born infant with spina bifida to estimate avoided direct costs and net cost savings. The fortification mandate is estimated to have reduced the annual number of U.S. live-born spina bifida cases by 767, with a lower-bound estimate of 614. The present value of mean direct lifetime cost per infant with spina bifida is estimated to be $791,900, or $577,000 excluding caregiving costs. Using a best estimate of numbers of avoided live-born spina bifida cases, fortification is estimated to reduce the present value of total direct costs for each year's birth cohort by $603 million more than the cost of fortification. A lower-bound estimate of cost savings using conservative assumptions, including the upper-bound estimate of fortification cost, is $299 million. The estimates of cost savings are larger than previously reported, even using conservative assumptions. The analysis can also inform assessments of folic acid fortification in other countries. Published by Elsevier Inc.
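
    The headline figure is simple arithmetic: avoided cases times per-case lifetime cost, minus the cost of fortification. The annual fortification cost below is a hypothetical placeholder (the abstract does not report it), chosen only to make the arithmetic visible.

```python
def net_savings(cases_avoided, cost_per_case, fortification_cost):
    """Avoided lifetime direct costs minus the cost of the intervention."""
    return cases_avoided * cost_per_case - fortification_cost

# Best-estimate inputs from the abstract (767 cases at $791,900 each);
# the $4.4M fortification cost is an assumed illustration, not study data.
net = net_savings(767, 791_900, fortification_cost=4.4e6)
print(round(net / 1e6))  # -> 603 (millions of US$)
```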

  4. Price and cost estimation

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.

    1979-01-01

    The Price and Cost Estimating Program (PACE II) was developed to prepare man-hour and material cost estimates. This versatile and flexible tool significantly reduces computation time and errors, as well as the typing and reproduction time involved in preparing cost estimates.

  5. Costs of Chronic Diseases at the State Level: The Chronic Disease Cost Calculator

    PubMed Central

    Murphy, Louise B.; Khavjou, Olga A.; Li, Rui; Maylahn, Christopher M.; Tangka, Florence K.; Nurmagambetov, Tursynbek A.; Ekwueme, Donatus U.; Nwaise, Isaac; Chapman, Daniel P.; Orenstein, Diane

    2015-01-01

    Introduction Many studies have estimated national chronic disease costs, but state-level estimates are limited. The Centers for Disease Control and Prevention developed the Chronic Disease Cost Calculator (CDCC), which estimates state-level costs for arthritis, asthma, cancer, congestive heart failure, coronary heart disease, hypertension, stroke, other heart diseases, depression, and diabetes. Methods Using publicly available and restricted secondary data from multiple national data sets from 2004 through 2008, disease-attributable annual per-person medical and absenteeism costs were estimated. Total state medical and absenteeism costs were derived by multiplying per person costs from regressions by the number of people in the state treated for each disease. Medical costs were estimated for all payers and separately for Medicaid, Medicare, and private insurers. Projected medical costs for all payers (2010 through 2020) were calculated using medical costs and projected state population counts. Results Median state-specific medical costs ranged from $410 million (asthma) to $1.8 billion (diabetes); median absenteeism costs ranged from $5 million (congestive heart failure) to $217 million (arthritis). Conclusion CDCC provides methodologically rigorous chronic disease cost estimates. These estimates highlight possible areas of cost savings achievable through targeted prevention efforts or research into new interventions and treatments. PMID:26334712

  6. Cognitive and Emotion Regulation Change Processes in Cognitive Behavioural Therapy for Social Anxiety Disorder.

    PubMed

    O'Toole, Mia S; Mennin, Douglas S; Hougaard, Esben; Zachariae, Robert; Rosenberg, Nicole K

    2015-01-01

    The objective of the study was to investigate variables, derived from both cognitive and emotion regulation conceptualizations of social anxiety disorder (SAD), as possible change processes in cognitive behaviour therapy (CBT) for SAD. Several proposed change processes were investigated: estimated probability, estimated cost, safety behaviours, acceptance of emotions, cognitive reappraisal and expressive suppression. Participants were 50 patients with SAD, receiving a standard manualized CBT program, conducted in groups or individually. All variables were measured pre-therapy, mid-therapy and post-therapy. Lower level mediation models revealed that while a change in most process measures significantly predicted clinical improvement, only changes in estimated probability and cost and acceptance of emotions showed significant indirect effects of CBT for SAD. The results are in accordance with previous studies supporting the mediating role of changes in cognitive distortions in CBT for SAD. In addition, acceptance of emotions may also be a critical component to clinical improvement in SAD during CBT, although more research is needed on which elements of acceptance are most helpful for individuals with SAD. The study's lack of a control condition limits any conclusion regarding the specificity of the findings to CBT. Change in estimated probability and cost, and acceptance of emotions showed an indirect effect of CBT for SAD. Cognitive distortions appear relevant to target with cognitive restructuring techniques. Finding acceptance to have an indirect effect could be interpreted as support for contemporary CBT approaches that include acceptance-based strategies. Copyright © 2014 John Wiley & Sons, Ltd.

  7. Cost-effective sampling of (137)Cs-derived net soil redistribution: part 2 - estimating the spatial mean change over time.

    PubMed

    Chappell, A; Li, Y; Yu, H Q; Zhang, Y Z; Li, X Y

    2015-06-01

    The caesium-137 ((137)Cs) technique for estimating net, time-integrated soil redistribution by the processes of wind, water and tillage is increasingly being used with repeated sampling to form a baseline to evaluate change over small (years to decades) timeframes. This interest stems from knowledge that since the 1950s soil redistribution has responded dynamically to different phases of land use change and management. Currently, there is no standard approach to detect change in (137)Cs-derived net soil redistribution and thereby identify the driving forces responsible for change. We outline recent advances in space-time sampling in the soil monitoring literature which provide a rigorous statistical and pragmatic approach to estimating the change over time in the spatial mean of environmental properties. We apply the space-time sampling framework, estimate the minimum detectable change of net soil redistribution and consider the information content and cost implications of different sampling designs for a study area in the Chinese Loess Plateau. Three phases (1954-1996, 1954-2012 and 1996-2012) of net soil erosion were detectable and attributed to well-documented historical change in land use and management practices in the study area and across the region. We recommend that the design for space-time sampling is considered carefully alongside cost-effective use of the spatial mean to detect and correctly attribute cause of change over time particularly across spatial scales of variation. Copyright © 2015 Elsevier Ltd. All rights reserved.
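
    A minimum detectable change calculation of the general kind used in the soil-monitoring literature can be sketched as follows; the normal (z) approximation and the input values are illustrative assumptions, not the paper's actual space-time design.

```python
import math

def minimum_detectable_change(sd_diff, n, z=1.96):
    """Approximate minimum detectable change in a spatial mean from n
    revisited sample points, given the standard deviation of the
    point-wise differences between the two sampling campaigns.

    Uses a normal approximation at the 5% level; a t-quantile would be
    used for small n.
    """
    return z * sd_diff / math.sqrt(n)

# The cost trade-off: halving the detectable change requires four times
# as many sample points.
print(minimum_detectable_change(8.0, 25) / minimum_detectable_change(8.0, 100))  # -> 2.0
```

    This inverse-square-root relationship is why the authors weigh information content against sampling cost when choosing a design.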

  8. Cost Validation Using PRICE H

    NASA Technical Reports Server (NTRS)

    Jack, John; Kwan, Eric; Wood, Milana

    2011-01-01

    PRICE H was introduced into the JPL cost estimation tool set circa 2003. It became more available at JPL when IPAO funded the NASA-wide site license for all NASA centers. PRICE H was mainly used as one of the cost tools to validate proposal grassroots cost estimates. Program offices at JPL view PRICE H as an additional crosscheck to Team X (JPL Concurrent Engineering Design Center) estimates. PRICE H became widely accepted ca, 2007 at JPL when the program offices moved away from grassroots cost estimation for Step 1 proposals. PRICE H is now one of the key cost tools used for cost validation, cost trades, and independent cost estimates.

  9. Blind Compensation of I/Q Impairments in Wireless Transceivers

    PubMed Central

    Aziz, Mohsin; Ghannouchi, Fadhel M.; Helaoui, Mohamed

    2017-01-01

    The majority of techniques that deal with the mitigation of in-phase and quadrature-phase (I/Q) imbalance at the transmitter (pre-compensation) require long training sequences, reducing the throughput of the system. These techniques also require a feedback path, which adds more complexity and cost to the transmitter architecture. Blind estimation techniques are attractive for avoiding the use of long training sequences. In this paper, we propose a blind frequency-independent I/Q imbalance compensation method based on the maximum likelihood (ML) estimation of the imbalance parameters of a transceiver. A closed-form joint probability density function (PDF) for the imbalanced I and Q signals is derived and validated. ML estimation is then used to estimate the imbalance parameters using the derived joint PDF of the output I and Q signals. Various figures of merit have been used to evaluate the efficacy of the proposed approach using extensive computer simulations and measurements. Additionally, the bit error rate curves show the effectiveness of the proposed method in the presence of the wireless channel and Additive White Gaussian Noise. Real-world experimental results show an image rejection of greater than 30 dB as compared to the uncompensated system. This method has also been found to be robust in the presence of practical system impairments, such as time and phase delay mismatches. PMID:29257081
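
    The widely used frequency-independent imbalance model, and a blind compensator for it, can be sketched as below. Note this uses a simple moment-based circularity-restoring estimator rather than the paper's ML formulation, and the imbalance values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# A proper (circular) complex baseband signal.
x = (rng.standard_normal(200_000) + 1j * rng.standard_normal(200_000)) / np.sqrt(2)

# Frequency-independent I/Q imbalance: y = mu*x + nu*conj(x), with gain
# mismatch g and phase mismatch phi (standard model, values arbitrary).
g, phi = 1.1, np.deg2rad(5.0)
mu = 0.5 * (1 + g * np.exp(1j * phi))
nu = 0.5 * (1 - g * np.exp(-1j * phi))
y = mu * x + nu * np.conj(x)

# Blind compensation: choose w so that E[z^2] = 0 for z = y - w*conj(y),
# i.e. restore circularity.  Solving the quadratic in w gives:
c2, p = np.mean(y**2), np.mean(np.abs(y)**2)
w = c2 / (p + np.sqrt(p**2 - np.abs(c2)**2))
z = y - w * np.conj(y)

def image_rejection_db(s):
    """Image rejection ratio: power on x versus power on conj(x)."""
    a = np.vdot(x, s) / np.vdot(x, x)           # coefficient on x
    b = np.vdot(np.conj(x), s) / np.vdot(x, x)  # coefficient on conj(x)
    return 10 * np.log10(abs(a)**2 / abs(b)**2)

print(image_rejection_db(y) > 30)  # -> False (about 24 dB uncompensated)
print(image_rejection_db(z) > 30)  # -> True  (image strongly suppressed)
```

    No training sequence is involved: only second-order sample moments of the received signal are used, which is the appeal of blind estimation the abstract highlights.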

  10. Cost Analysis of MRI Services in Iran: An Application of Activity Based Costing Technique

    PubMed Central

    Bayati, Mohsen; Mahboub Ahari, Alireza; Badakhshan, Abbas; Gholipour, Mahin; Joulaei, Hassan

    2015-01-01

    Background: Considerable development of MRI technology in diagnostic imaging, the high cost of MRI technology and controversial issues concerning official charges (tariffs) have been the main motivations for this study. Objectives: The present study aimed to calculate the unit cost of MRI services using activity-based costing (ABC) as a modern cost accounting system and to compare the calculated unit costs fairly with official charges (tariffs). Materials and Methods: We included both direct and indirect costs of MRI services delivered in fiscal year 2011 in Shiraz Shahid Faghihi hospital. The direct allocation method was used for distribution of overhead costs. We used a micro-costing approach to calculate the unit cost of each MRI service. Clinical cost data were retrieved from the hospital registration system. The straight-line method was used for depreciation cost estimation. To cope with uncertainty and to increase the robustness of the results, unit costs of 33 MRI services were calculated under two scenarios. Results: Total annual cost of the MRI activity center (AC) was calculated at USD 400,746 and USD 532,104 under the first and second scenarios, respectively. Ten percent of the total cost was allocated from supportive departments. The annual variable costs of the MRI center were calculated at USD 295,904. Capital costs were USD 104,842 and USD 236,200 under the first and second scenarios, respectively. Existing tariffs for more than half of MRI services were above the calculated costs. Conclusion: As a public hospital, Shahid Faghihi hospital has considerable limitations in both its financial and administrative databases. Labor cost has the greatest share of the hospital's total annual cost. The gap between unit costs and tariffs implies that the claim for extra budget from health providers may not be relevant for all services delivered by the studied MRI center.
With some adjustments, ABC could be implemented in MRI centers. With the establishment of a reliable cost accounting system such as the ABC technique, hospitals would be able to generate robust evidence for financial management of their overhead, intermediate and final ACs. PMID:26715979
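
    Two ingredients the abstract names, straight-line depreciation and driver-based ABC allocation, reduce to short formulas; the figures below are hypothetical illustrations, not the study's data.

```python
def straight_line_depreciation(purchase_cost, salvage_value, useful_life_years):
    """Annual capital (depreciation) cost under the straight-line method."""
    return (purchase_cost - salvage_value) / useful_life_years

def abc_unit_cost(center_annual_cost, minutes_per_service, total_minutes):
    """Allocate an activity center's annual cost to one service in
    proportion to the time (the cost driver) that service consumes."""
    return center_annual_cost * minutes_per_service / total_minutes

# Hypothetical: a scanner bought for USD 1.5M with 10% salvage value
# over a 10-year useful life.
print(straight_line_depreciation(1_500_000, 150_000, 10))  # -> 135000.0
# Hypothetical: a 30-minute scan in a center running 100,000 scanner-minutes/year.
print(round(abc_unit_cost(450_000, 30, 100_000), 2))       # -> 135.0
```

    Comparing such unit costs against the official tariff per service is exactly the gap analysis the study performs.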

  12. Applying spectral data analysis techniques to aquifer monitoring data in Belvoir Ranch, Wyoming

    NASA Astrophysics Data System (ADS)

    Gao, F.; He, S.; Zhang, Y.

    2017-12-01

    This study uses spectral data analysis techniques to estimate hydraulic parameters from water level fluctuations due to tidal and barometric effects. All water level data used in this study were collected in Belvoir Ranch, Wyoming. The tide effect can be observed not only in coastal areas but also in inland confined aquifers. The force caused by the changing positions of the sun and moon affects not only the ocean but also the solid earth. The tide effect applies an oscillatory pumping or injection sequence to the aquifer and can be observed with sufficiently dense water level monitoring. The Belvoir Ranch data are collected once per hour, and thus are dense enough to capture the tide effect. First, the de-trended data are transformed from the temporal domain to the frequency domain with the Fourier transform. Then, the storage coefficient is estimated using the Bredehoeft-Jacob model. Next, the gain function, which expresses the amplification and attenuation of the output signal, is analyzed to derive the barometric efficiency. Effective porosity is then found from the storage coefficient and barometric efficiency with Jacob's model. Finally, aquifer transmissivity and hydraulic conductivity are estimated using Paul Hsieh's method. The estimated hydraulic parameters are compared with those from traditional pumping-test estimation. This study shows that hydraulic parameters can be estimated by analyzing water level data in the frequency domain alone. The approach has the advantages of being low cost and environmentally friendly, and thus should be considered for future hydraulic parameter estimation.
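
    The first step, moving de-trended water levels into the frequency domain to expose a tidal constituent, can be sketched on synthetic data; the aquifer-parameter models in the later steps are omitted, and the signal amplitudes are invented.

```python
import numpy as np

# Synthetic hourly water-level record: linear trend plus the semidiurnal
# lunar (M2) constituent at about 1.93 cycles/day, plus noise.
rng = np.random.default_rng(1)
days = 64
t = np.arange(days * 24) / 24.0                 # time in days, hourly sampling
f_m2 = 1.9323                                   # M2 frequency, cycles per day
level = (0.05 * t
         + 0.02 * np.sin(2 * np.pi * f_m2 * t)
         + 0.005 * rng.standard_normal(t.size))

# De-trend, then transform to the frequency domain.
detrended = level - np.polyval(np.polyfit(t, level, 1), t)
spec = np.abs(np.fft.rfft(detrended))
freqs = np.fft.rfftfreq(t.size, d=1.0 / 24.0)   # cycles per day

peak = freqs[np.argmax(spec[1:]) + 1]           # skip the DC bin
print(round(peak, 2))  # -> 1.94, the tidal constituent stands out
```

    The amplitude and phase of this peak relative to the theoretical tide are what the Bredehoeft-Jacob and gain-function steps then turn into storage and barometric-efficiency estimates.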

  13. A technique for estimating dry deposition velocities based on similarity with latent heat flux

    NASA Astrophysics Data System (ADS)

    Pleim, Jonathan E.; Finkelstein, Peter L.; Clarke, John F.; Ellestad, Thomas G.

    Field measurements of chemical dry deposition are needed to assess impacts and trends of airborne contaminants on the exposure of crops and unmanaged ecosystems as well as for the development and evaluation of air quality models. However, accurate measurements of dry deposition velocities require expensive eddy correlation measurements and can only be practically made for a few chemical species such as O3 and CO2. On the other hand, operational dry deposition measurements such as those used in large area networks involve relatively inexpensive standard meteorological and chemical measurements but rely on less accurate deposition velocity models. This paper describes an intermediate technique which can give accurate estimates of dry deposition velocity for chemical species which are dominated by stomatal uptake such as O3 and SO2. This method can give results that are nearly the quality of eddy correlation measurements of trace gas fluxes at much lower cost. The concept is that bulk stomatal conductance can be accurately estimated from measurements of latent heat flux combined with standard meteorological measurements of humidity, temperature, and wind speed. The technique is tested using data from a field experiment where high quality eddy correlation measurements were made over soybeans. Over a four-month period, which covered the entire growth cycle, this technique showed very good agreement with eddy correlation measurements for O3 deposition velocity.
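    A minimal sketch of the resistance-analogy idea behind such a technique: bulk stomatal conductance is inverted from the latent heat flux with a simplified big-leaf relation, rescaled for ozone diffusivity, and combined in series with aerodynamic resistances. The constants and input values are illustrative assumptions, not the paper's measurements or its exact formulation.

```python
# Hedged sketch of estimating an O3 deposition velocity from latent heat flux.
# Simplified big-leaf inversion; constants and inputs are illustrative.
LAMBDA = 2.45e6     # latent heat of vaporization, J/kg
RHO_AIR = 1.2       # air density, kg/m^3
D_RATIO_O3 = 1.6    # approximate diffusivity ratio, water vapour to ozone

def stomatal_resistance(latent_heat_flux, q_sat, q_air):
    """Bulk stomatal resistance (s/m) from LE (W/m^2) and specific humidities."""
    evap = latent_heat_flux / LAMBDA                  # kg m^-2 s^-1
    conductance = evap / (RHO_AIR * (q_sat - q_air))  # m/s
    return 1.0 / conductance

def deposition_velocity(r_a, r_b, r_stom_water):
    """Series-resistance deposition velocity for a stomatally dominated gas."""
    r_c = r_stom_water * D_RATIO_O3   # rescale water-vapour resistance for O3
    return 1.0 / (r_a + r_b + r_c)

r_s = stomatal_resistance(latent_heat_flux=300.0, q_sat=0.018, q_air=0.010)
v_d = deposition_velocity(r_a=30.0, r_b=15.0, r_stom_water=r_s)  # m/s
```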

  14. IDC Reengineering Phase 2 & 3 Rough Order of Magnitude (ROM) Cost Estimate Summary (Leveraged NDC Case).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, James M.; Prescott, Ryan; Dawson, Jericah M.

    2014-11-01

    Sandia National Laboratories has prepared a ROM cost estimate for budgetary planning for the IDC Reengineering Phase 2 & 3 effort, based on leveraging a fully funded, Sandia executed NDC Modernization project. This report provides the ROM cost estimate and describes the methodology, assumptions, and cost model details used to create the ROM cost estimate. ROM Cost Estimate Disclaimer Contained herein is a Rough Order of Magnitude (ROM) cost estimate that has been provided to enable initial planning for this proposed project. This ROM cost estimate is submitted to facilitate informal discussions in relation to this project and is NOT intended to commit Sandia National Laboratories (Sandia) or its resources. Furthermore, as a Federally Funded Research and Development Center (FFRDC), Sandia must be compliant with the Anti-Deficiency Act and operate on a full-cost recovery basis. Therefore, while Sandia, in conjunction with the Sponsor, will use best judgment to execute work and to address the highest risks and most important issues in order to effectively manage within cost constraints, this ROM estimate and any subsequent approved cost estimates are on a 'full-cost recovery' basis. Thus, work can neither commence nor continue unless adequate funding has been accepted and certified by DOE.

  15. A Total Cost of Ownership Model for Low Temperature PEM Fuel Cells in Combined Heat and Power and Backup Power Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    University of California, Berkeley; Wei, Max; Lipman, Timothy

    2014-06-23

    A total cost of ownership model is described for low temperature proton exchange membrane stationary fuel cell systems for combined heat and power (CHP) applications from 1-250kW and backup power applications from 1-50kW. System designs and functional specifications for these two applications were developed across the range of system power levels. Bottom-up cost estimates were made for balance of plant costs, and detailed direct cost estimates for key fuel cell stack components were derived using design-for-manufacturing-and-assembly techniques. The development of high throughput, automated processes achieving high yield is projected to reduce the cost for fuel cell stacks to the $300/kW level at an annual production volume of 100 MW. Several promising combinations of building types and geographical location in the U.S. were identified for installation of fuel cell CHP systems based on the LBNL modelling tool DER CAM. Life-cycle modelling and externality assessment were done for hotels and hospitals. Reduced electricity demand charges, heating credits and carbon credits can reduce the effective cost of electricity ($/kWhe) by 26-44 percent in locations such as Minneapolis, where high carbon intensity electricity from the grid is displaced by a fuel cell system operating on reformate fuel. This project extends the scope of existing cost studies to include externalities and ancillary financial benefits and thus provides a more comprehensive picture of fuel cell system benefits, consistent with a policy and incentive environment that increasingly values these ancillary benefits. The project provides a critical, new modelling capacity and should aid a broad range of policy makers in assessing the integrated costs and benefits of fuel cell systems versus other distributed generation technologies.

  16. Defining the Costs of Reusable Flexible Ureteroscope Reprocessing Using Time-Driven Activity-Based Costing.

    PubMed

    Isaacson, Dylan; Ahmad, Tessnim; Metzler, Ian; Tzou, David T; Taguchi, Kazumi; Usawachintachit, Manint; Zetumer, Samuel; Sherer, Benjamin; Stoller, Marshall; Chi, Thomas

    2017-10-01

    Careful decontamination and sterilization of reusable flexible ureteroscopes used in ureterorenoscopy cases prevent the spread of infectious pathogens to patients and technicians. However, inefficient reprocessing and unavailability of ureteroscopes sent out for repair can contribute to expensive operating room (OR) delays. Time-driven activity-based costing (TDABC) was applied to describe the time and costs involved in reprocessing. Direct observation and timing were performed for all steps in reprocessing of reusable flexible ureteroscopes following operative procedures. The times needed for each step by which damaged ureteroscopes identified during reprocessing are sent for repair were estimated through interviews with purchasing analyst staff. Process maps were created for reprocessing and repair detailing individual step times and their variances. Cost data for labor and disposables used were applied to calculate per-minute and average step costs. Ten ureteroscopes were followed through reprocessing. Process mapping for ureteroscope reprocessing averaged 229.0 ± 74.4 minutes, whereas sending a ureteroscope for repair required an estimated 143 minutes per repair. Most steps demonstrated low variance between timed observations. Ureteroscope drying was the longest and highest-variance step at 126.5 ± 55.7 minutes and was highly dependent on manual air flushing through the ureteroscope working channel and ureteroscope positioning in the drying cabinet. Reprocessing costs totaled $96.13 per episode, including the cost of labor and disposable items. Utilizing TDABC delineates the full spectrum of costs associated with ureteroscope reprocessing and identifies areas for process improvement to drive value-based care. At our institution, ureteroscope drying was one clearly identified target area. Implementing training in ureteroscope drying technique could save up to 2 hours per reprocessing event, potentially preventing expensive OR delays.
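    The TDABC arithmetic amounts to summing observed step times and applying a per-minute labor rate plus disposables. A sketch with hypothetical rates follows; the step names echo the abstract, but the dollar figures and per-step times are invented, and treating every minute as attended labor is a simplification.

```python
# Illustrative TDABC roll-up for one reprocessing episode; step times and
# cost rates are hypothetical, not the study's observed values.
steps_minutes = {
    "bedside precleaning": 5.0,
    "manual cleaning": 30.0,
    "automated reprocessing": 60.0,
    "drying": 126.5,                 # longest, highest-variance step
    "storage and handling": 7.5,
}
labor_rate_per_minute = 0.35   # USD/min, hypothetical technician rate
disposables_cost = 18.00       # USD per episode, hypothetical

episode_minutes = sum(steps_minutes.values())
# Simplification: every minute is billed as attended labor.
episode_cost = episode_minutes * labor_rate_per_minute + disposables_cost
```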

  17. Estimation of Land Surface Fluxes and Their Uncertainty via Variational Data Assimilation Approach

    NASA Astrophysics Data System (ADS)

    Abdolghafoorian, A.; Farhadi, L.

    2016-12-01

    Accurate estimation of land surface heat and moisture fluxes as well as root zone soil moisture is crucial in various hydrological, meteorological, and agricultural applications. "In situ" measurements of these fluxes are costly and cannot be readily scaled to large areas relevant to weather and climate studies. Therefore, there is a need for techniques to make quantitative estimates of heat and moisture fluxes using land surface state variables. In this work, we applied a novel approach based on the variational data assimilation (VDA) methodology to estimate land surface fluxes and the soil moisture profile from the land surface states. This study accounts for the strong linkage between terrestrial water and energy cycles by coupling the dual source energy balance equation with the water balance equation through the mass flux of evapotranspiration (ET). Heat diffusion and moisture diffusion into the column of soil are adjoined to the cost function as constraints. This coupling results in more accurate prediction of land surface heat and moisture fluxes and consequently soil moisture at multiple depths with high temporal frequency as required in many hydrological, environmental and agricultural applications. One of the key limitations of the VDA technique is its tendency to be ill-posed, meaning that a continuum of possibilities exists for different parameters that produce essentially identical measurement-model misfit errors. On the other hand, the value of heat and moisture flux estimation to decision-making processes is limited if reasonable estimates of the corresponding uncertainty are not provided. To address these issues, an uncertainty analysis is performed in this research to estimate the uncertainty of the retrieved fluxes and root zone soil moisture. The assimilation algorithm is tested with a series of experiments using a synthetic data set generated by the simultaneous heat and water (SHAW) model.
We demonstrate the VDA performance by comparing the (synthetic) true measurements (including profile of soil moisture and temperature, land surface water and heat fluxes, and root water uptake) with VDA estimates. In addition, the feasibility of extending the proposed approach to use remote sensing observations is tested by limiting the number of LST observations and soil moisture observations.

  18. Retrofit photovoltaic systems for intermediate sized applications - A design and market study

    NASA Astrophysics Data System (ADS)

    Noel, G. T.; Hagely, J. R.

    An assessment of the technical and economic feasibility of retrofitting a significant portion of the existing intermediate sector building/application inventory with photovoltaic systems is presented. The assessment includes the development of detailed engineering and architectural designs as well as cost estimates for 12 representative installations. Promising applications include retail stores, warehouses, office buildings, religious buildings, shopping centers, education buildings, hospitals, and industrial sites. A market study indicates that there is a national inventory of 1.5 to 2.0 million feasible intermediate sector applications, with the majority being in the 20 to 400 kW size range. The present cost of the major system components and the cost of necessary building modifications are the primary current barriers to the realization of a large retrofit photovoltaic system market. The development of standardized modular system designs and installation techniques is a feasible way to minimize costs.

  19. Governance and performance: the performance of Dutch hospitals explained by governance characteristics.

    PubMed

    Blank, Jos L T; van Hulst, Bart Laurents

    2011-10-01

    This paper describes the efficiency of Dutch hospitals using the Data Envelopment Analysis (DEA) method with bootstrapping. In particular, the analysis focuses on accounting for cost inefficiency measures on the part of hospital corporate governance. We use bootstrap techniques, as introduced by Simar and Wilson (J. Econom. 136(1):31-64, 2007), in order to obtain more efficient estimates of the effects of governance on efficiency. The results show that part of the cost efficiency can be explained by governance. In particular, we find that higher remuneration of the board, as well as higher remuneration of the supervisory board, does not imply better performance.

  20. Statistical Cost Estimation in Higher Education: Some Alternatives.

    ERIC Educational Resources Information Center

    Brinkman, Paul T.; Niwa, Shelley

    Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs are also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…

  1. Economic costs of obesity in Thailand: a retrospective cost-of-illness study

    PubMed Central

    2014-01-01

    Background Over the last decade, the prevalence of obesity (BMI ≥ 25 kg/m2) in Thailand has been rising rapidly and consistently. Estimating the cost of obesity to society is an essential step in setting priorities for research and resource use and helping improve public awareness of the negative economic impacts of obesity. This prevalence-based, cost-of-illness study aims to estimate the economic costs of obesity in Thailand. Methods The estimated costs in this study included health care cost, cost of productivity loss due to premature mortality, and cost of productivity loss due to hospital-related absenteeism. The Obesity-Attributable Fraction (OAF) was used to estimate the extent to which the co-morbidities were attributable to obesity. The health care cost of obesity was further estimated by multiplying the number of patients in each disease category attributable to obesity by the unit cost of treatment. The cost of productivity loss was calculated using the human capital approach. Results The health care cost attributable to obesity was estimated at 5,584 million baht or 1.5% of national health expenditure. The cost of productivity loss attributable to obesity was estimated at 6,558 million baht - accounting for 54% of the total cost of obesity. The cost of hospital-related absenteeism was estimated at 694 million baht, while the cost of premature mortality was estimated at 5,864 million baht. The total cost of obesity was then estimated at 12,142 million baht (725.3 million US$PPP, 16.74 baht = 1 US$PPP), accounting for 0.13% of Thailand’s Gross Domestic Product (GDP). Conclusions Obesity imposes a substantial economic burden on Thai society, especially in terms of health care costs. Large-scale comprehensive interventions focused on improving public awareness of the cost of and problems associated with obesity and promoting a healthy lifestyle should be regarded as a public health priority. PMID:24690106
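    The OAF calculation follows Levin's attributable-fraction formula; a sketch with made-up prevalence, relative risks, patient counts, and unit costs (only the formula, not the figures, comes from this kind of study):

```python
# Sketch of obesity-attributable health care cost estimation; all inputs
# below are hypothetical illustrative numbers.
def obesity_attributable_fraction(prevalence, relative_risk):
    """Levin's attributable fraction: P(RR - 1) / (P(RR - 1) + 1)."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

diseases = {
    # disease: (obesity prevalence, relative risk, patients, unit cost in baht)
    "type 2 diabetes": (0.32, 3.0, 100_000, 12_000),
    "hypertension":    (0.32, 1.8, 150_000, 6_000),
}

# Attributable cost = OAF x patients x unit treatment cost, summed over diseases.
health_care_cost = sum(
    obesity_attributable_fraction(p, rr) * patients * unit_cost
    for p, rr, patients, unit_cost in diseases.values()
)
```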

  2. Economic costs of obesity in Thailand: a retrospective cost-of-illness study.

    PubMed

    Pitayatienanan, Paiboon; Butchon, Rukmanee; Yothasamut, Jomkwan; Aekplakorn, Wichai; Teerawattananon, Yot; Suksomboon, Naeti; Thavorncharoensap, Montarat

    2014-04-02

    Over the last decade, the prevalence of obesity (BMI ≥ 25 kg/m2) in Thailand has been rising rapidly and consistently. Estimating the cost of obesity to society is an essential step in setting priorities for research and resource use and helping improve public awareness of the negative economic impacts of obesity. This prevalence-based, cost-of-illness study aims to estimate the economic costs of obesity in Thailand. The estimated costs in this study included health care cost, cost of productivity loss due to premature mortality, and cost of productivity loss due to hospital-related absenteeism. The Obesity-Attributable Fraction (OAF) was used to estimate the extent to which the co-morbidities were attributable to obesity. The health care cost of obesity was further estimated by multiplying the number of patients in each disease category attributable to obesity by the unit cost of treatment. The cost of productivity loss was calculated using the human capital approach. The health care cost attributable to obesity was estimated at 5,584 million baht or 1.5% of national health expenditure. The cost of productivity loss attributable to obesity was estimated at 6,558 million baht - accounting for 54% of the total cost of obesity. The cost of hospital-related absenteeism was estimated at 694 million baht, while the cost of premature mortality was estimated at 5,864 million baht. The total cost of obesity was then estimated at 12,142 million baht (725.3 million US$PPP, 16.74 baht = 1 US$PPP), accounting for 0.13% of Thailand's Gross Domestic Product (GDP). Obesity imposes a substantial economic burden on Thai society, especially in terms of health care costs. Large-scale comprehensive interventions focused on improving public awareness of the cost of and problems associated with obesity and promoting a healthy lifestyle should be regarded as a public health priority.

  3. Product line cost estimation: a standard cost approach.

    PubMed

    Cooper, J C; Suver, J D

    1988-04-01

    Product line managers often must make decisions based on inaccurate cost information. A method is needed to determine costs more accurately. By using a standard costing model, product line managers can better estimate the cost of intermediate and end products, and hence better estimate the costs of the product line.
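    A minimal illustration of the standard-costing roll-up the abstract describes: each end product accumulates standard quantities times standard unit costs of its intermediate products. The product names and rates below are hypothetical.

```python
# Standard-costing sketch for a product line; names and figures are made up.
standard_costs = {           # intermediate product -> standard unit cost (USD)
    "lab panel": 25.0,
    "nursing hour": 40.0,
    "imaging study": 120.0,
}

end_products = {             # end product -> standard quantities consumed
    "cardiac admission": {"lab panel": 4, "nursing hour": 18, "imaging study": 2},
}

def standard_cost(product):
    """Roll up the standard cost of an end product from its intermediates."""
    recipe = end_products[product]
    return sum(qty * standard_costs[item] for item, qty in recipe.items())
```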

  4. Storage capacity: how big should it be

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malina, M.A.

    1980-01-28

    A mathematical model was developed for determining the economically optimal storage capacity of a given material or product at a manufacturing plant. The optimum was defined as a trade-off between the inventory-holding costs and the cost of customer-service failures caused by insufficient stocks for a peak-demand period. The order-arrival, production, storage, and shipment process was simulated by Monte Carlo techniques to calculate the probability of order delays for various lengths of time as a function of storage capacity. Example calculations for the storage of a bulk liquid chemical in tanks showed that the conclusions arrived at, via this model, are comparatively insensitive to errors made in estimating the capital cost of storage or the risk of losing an order because of a late delivery.
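    The Monte Carlo trade-off can be sketched as follows; the demand distribution, production rate, and capacities are illustrative assumptions, not the values from the study.

```python
import random

# Monte Carlo sketch: simulate random daily demand against a fixed
# production rate and storage capacity, and estimate the probability
# that an order is delayed for lack of stock. All rates are hypothetical.
def delay_probability(capacity, production_rate=10.0, mean_demand=10.0,
                      days=10_000, seed=1):
    rng = random.Random(seed)
    stock, delayed = capacity / 2.0, 0
    for _ in range(days):
        demand = rng.expovariate(1.0 / mean_demand)   # random daily orders
        stock = min(capacity, stock + production_rate) # refill, capped by tank size
        if demand > stock:
            delayed += 1                               # insufficient stock
            stock = 0.0
        else:
            stock -= demand
    return delayed / days

# Larger tanks should lower the chance of a late delivery:
p_small, p_large = delay_probability(20.0), delay_probability(80.0)
```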

  5. Epilepsy in Sweden: health care costs and loss of productivity--a register-based approach.

    PubMed

    Bolin, Kristian; Lundgren, Anders; Berggren, Fredrik; Källén, Kristina

    2012-12-01

    The objective was to estimate health care costs and productivity losses due to epilepsy in Sweden and to compare these estimates to previously published estimates. Register data on health care utilisation, pharmaceutical sales, permanent disability and mortality were used to calculate health care costs and costs that accrue due to productivity losses. By linkage of register information, we were able to distinguish pharmaceuticals prescribed against epilepsy from prescriptions that were prompted by other indications. The estimated total cost of epilepsy in Sweden in 2009 was 441 million, which corresponds to an annual per-patient cost of 8,275. Health care accounted for about 16% of the estimated total cost, and drug costs accounted for about 7% of the total cost. The estimated health care cost corresponded to about 0.2% of the total health care cost in Sweden in 2009. Indirect costs were estimated at 370 million, 84% of which was due to sickness absenteeism. Costs resulting from epilepsy-attributable premature deaths or permanent disability to work accounted for about 1% of the total indirect cost in Sweden in 2009. The per-patient cost of epilepsy is substantial. Thus, even though the prevalence of the illness is relatively small, the aggregated cost that epilepsy incurs on society is significant.

  6. 48 CFR 1852.216-85 - Estimated cost and award fee.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and Clauses 1852.216-85 Estimated cost and award fee. As prescribed in 1816.406-70(e), insert the following clause: Estimated Cost and Award Fee (SEP 1993) The estimated cost of this contract is $___. The... cost, base fee, and maximum award fee are $___. (End of clause) Alternate I (SEP 1993). As prescribed...

  7. Weight and cost forecasting for advanced manned space vehicles

    NASA Technical Reports Server (NTRS)

    Williams, Raymond

    1989-01-01

    A mass and cost estimating computerized methodology for predicting advanced manned space vehicle weights and costs was developed. The user-friendly methodology, designated MERCER (Mass Estimating Relationship/Cost Estimating Relationship), organizes the predictive process according to major vehicle subsystem levels. Design, development, test, evaluation, and flight hardware cost forecasting is treated by the study. This methodology consists of a complete set of mass estimating relationships (MERs) which serve as the control components for the model and cost estimating relationships (CERs) which use MER output as input. To develop this model, numerous MER and CER studies were surveyed and modified where required. Additionally, relationships were regressed from raw data to accommodate the methodology. The models and formulations which estimated the cost of historical vehicles to within 20 percent of the actual cost were selected. The results of the research, along with components of the MERCER Program, are reported. On the basis of the analysis, the following conclusions were established: (1) The cost of a spacecraft is best estimated by summing the cost of individual subsystems; (2) No one cost equation can be used for forecasting the cost of all spacecraft; (3) Spacecraft cost is highly correlated with its mass; (4) No study surveyed contained sufficient formulations to autonomously forecast the cost and weight of the entire advanced manned vehicle spacecraft program; (5) No user-friendly program was found that linked MERs with CERs to produce spacecraft cost; and (6) The group accumulation weight estimation method (summing the estimated weights of the various subsystems) proved to be a useful method for finding the total weight and cost of a spacecraft.
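    The MER-to-CER chaining and the subsystem-summing conclusions can be sketched as follows; both relationships and all coefficients below are hypothetical stand-ins, not MERCER's actual equations.

```python
# Sketch of chaining a mass estimating relationship (MER) into a power-law
# cost estimating relationship (CER). Coefficients are hypothetical.
def mer_structure_weight(dry_mass_kg):
    """Hypothetical MER: structure weight as a fraction of vehicle dry mass."""
    return 0.25 * dry_mass_kg

def cer_cost(weight_kg, a=150.0, b=0.8):
    """Hypothetical power-law CER: cost (in $K) = a * weight^b."""
    return a * weight_kg ** b

# Group accumulation: estimate each subsystem's weight, cost it, and sum.
subsystem_weights = {"structure": mer_structure_weight(8_000.0),
                     "thermal": 400.0,
                     "avionics": 650.0}
total_cost = sum(cer_cost(w) for w in subsystem_weights.values())
```

The sub-linear exponent (b < 1) reflects the economies of scale that power-law CERs typically assume: doubling weight less than doubles cost.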

  8. On-line Model Structure Selection for Estimation of Plasma Boundary in a Tokamak

    NASA Astrophysics Data System (ADS)

    Škvára, Vít; Šmídl, Václav; Urban, Jakub

    2015-11-01

    Control of the plasma field in the tokamak requires reliable estimation of the plasma boundary. The plasma boundary is given by a complex mathematical model and the only available measurements are responses of induction coils around the plasma. For the purpose of boundary estimation the model can be reduced to simple linear regression with potentially infinitely many elements. The number of elements must be selected manually and this choice significantly influences the resulting shape. In this paper, we investigate the use of formal model structure estimation techniques for the problem. Specifically, we formulate a sparse least squares estimator using the automatic relevance principle. The resulting algorithm is a repetitive evaluation of the least squares problem which could be computed in real time. Performance of the resulting algorithm is illustrated on simulated data and evaluated with respect to a more detailed and computationally costly model FREEBIE.

  9. H∞ state estimation of stochastic memristor-based neural networks with time-varying delays.

    PubMed

    Bao, Haibo; Cao, Jinde; Kurths, Jürgen; Alsaedi, Ahmed; Ahmad, Bashir

    2018-03-01

    This paper addresses the problem of H∞ state estimation for a class of stochastic memristor-based neural networks with time-varying delays. Under the framework of the Filippov solution, the stochastic memristor-based neural networks are transformed into systems with interval parameters. The present paper is the first to investigate the H∞ state estimation problem for continuous-time Itô-type stochastic memristor-based neural networks. By means of Lyapunov functionals and stochastic techniques, sufficient conditions are derived to ensure that the estimation error system is asymptotically stable in the mean square with a prescribed H∞ performance. An explicit expression of the state estimator gain is given in terms of linear matrix inequalities (LMIs). Compared with other results, our results reduce control gain and control cost effectively. Finally, numerical simulations are provided to demonstrate the efficiency of the theoretical results. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Cost-engineering modeling to support rapid concept development of an advanced infrared satellite system

    NASA Astrophysics Data System (ADS)

    Bell, Kevin D.; Dafesh, Philip A.; Hsu, L. A.; Tsuda, A. S.

    1995-12-01

    Current architectural and design trade techniques often carry unaffordable alternatives late into the decision process. Early decisions made during the concept exploration and development (CE&D) phase will drive the cost of a program more than any other phase of development; thus, designers must be able to assess both the performance and cost impacts of their early choices. The Space Based Infrared System (SBIRS) cost engineering model (CEM) described in this paper is an end-to-end process integrating engineering and cost expertise through commonly available spreadsheet software, allowing for concurrent design engineering and cost estimation to identify and balance system drivers to reduce acquisition costs. The automated interconnectivity between subsystem models using spreadsheet software allows for the quick and consistent assessment of the system design impacts and relative cost impacts due to requirement changes. It is different from most CEM efforts attempted in the past as it incorporates more detailed spacecraft and sensor payload models, and has been applied to determine the cost drivers for an advanced infrared satellite system acquisition. The CEM is comprised of integrated detailed engineering and cost estimating relationships describing performance, design, and cost parameters. Detailed models have been developed to evaluate design parameters for the spacecraft bus and sensor; both step-stare and scanner sensor types incorporate models of focal plane array, optics, processing, thermal, communications, and mission performance. The current CEM effort has provided visibility to requirements, design, and cost drivers for system architects and decision makers to determine the configuration of an infrared satellite architecture that meets essential requirements cost effectively. In general, the methodology described in this paper consists of process building blocks that can be tailored to the needs of many applications.
Descriptions of the spacecraft and payload subsystem models provide insight into The Aerospace Corporation expertise and scope of the SBIRS concept development effort.

  11. Online approximation of the multichannel Wiener filter with preservation of interaural level difference for binaural hearing-aids.

    PubMed

    Marques do Carmo, Diego; Costa, Márcio Holsbach

    2018-04-01

    This work presents an online approximation method for the multichannel Wiener filter (MWF) noise reduction technique with preservation of the noise interaural level difference (ILD) for binaural hearing-aids. The steepest descent method is applied to a previously proposed MWF-ILD cost function to both approximate the optimal linear estimator of the desired speech and keep the subjective perception of the original acoustic scenario. The computational cost of the resulting algorithm is estimated in terms of multiply and accumulate operations, whose number can be controlled by setting the number of iterations at each time frame. Simulation results for the particular case of one speech and one directional noise source show that the proposed method increases the signal-to-noise ratio (SNR) of the originally acquired speech by up to 16.9 dB in the assessed scenarios. As compared to the online implementation of the conventional MWF technique, the proposed technique provides a reduction of up to 7 dB in the noise ILD error at the price of a reduction of up to 3 dB in the output SNR. Subjective experiments with volunteers complement these objective measures with psychoacoustic results, which corroborate the expected spatial preservation of the original acoustic scenario. The proposed method allows practical online implementation of the MWF-ILD noise reduction technique under constrained computational resources. Predicted SNR improvements from 12 dB to 16.9 dB can be obtained in application-specific integrated circuits for hearing-aids and state-of-the-art digital signal processors. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Estimating Airline Operating Costs

    NASA Technical Reports Server (NTRS)

    Maddalon, D. V.

    1978-01-01

    The factors affecting commercial aircraft operating and delay costs were used to develop an airline operating cost model which includes a method for estimating the labor and material costs of individual airframe maintenance systems. The model permits estimates of aircraft related costs, i.e., aircraft service, landing fees, flight attendants, and control fees. A method for estimating the costs of certain types of airline delay is also described.
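    A sketch of the kind of roll-up such a model performs: per-block-hour rates for flying costs plus per-flight fees. The cost categories follow the abstract, but every rate below is invented for illustration.

```python
# Illustrative aircraft-related operating cost roll-up; category names echo
# the abstract, the dollar rates are hypothetical.
rates_per_block_hour = {       # USD per block hour
    "fuel": 2_800.0,
    "flight crew": 900.0,
    "flight attendants": 350.0,
    "airframe maintenance labor": 260.0,
    "airframe maintenance material": 190.0,
}
per_flight_fees = {            # USD per flight
    "landing fee": 450.0,
    "control fee": 120.0,
    "aircraft service": 300.0,
}

def trip_cost(block_hours):
    """Operating cost of one flight: hourly rates times block time, plus fees."""
    hourly = sum(rates_per_block_hour.values()) * block_hours
    return hourly + sum(per_flight_fees.values())
```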

  13. Using multi-level remote sensing and ground data to estimate forest biomass resources in remote regions: a case study in the boreal forests of interior Alaska

    Treesearch

    Hans-Erik Andersen; Strunk Jacob; Hailemariam Temesgen; Donald Atwood; Ken Winterberger

    2012-01-01

    The emergence of a new generation of remote sensing and geopositioning technologies, as well as increased capabilities in image processing, computing, and inferential techniques, have enabled the development and implementation of increasingly efficient and cost-effective multilevel sampling designs for forest inventory. In this paper, we (i) describe the conceptual...

  14. RF Bearing Estimation in Wireless Sensor Networks

    DTIC Science & Technology

    2010-01-01

    are the main design drivers. Techniques based on ultrasonic and infrared signal modalities have short range and require line-of-sight. Clearly, RF...generating a Doppler shifted RF signal . The small frequency change can be measured even on low cost resource constrained nodes using a radio...is already included in the power budget and RF range is superior to most other signals . Radio signal strength (RSS) based approaches are the most

  15. Quantifying environmental DNA signals for aquatic invasive species across multiple detection platforms.

    PubMed

    Nathan, Lucas M; Simmons, Megan; Wegleitner, Benjamin J; Jerde, Christopher L; Mahon, Andrew R

    2014-11-04

    The use of molecular surveillance techniques has become popular among aquatic researchers and managers due to the improved sensitivity and efficiency compared to traditional sampling methods. Rapid expansion in the use of environmental DNA (eDNA), paired with the advancement of molecular technologies, has resulted in new detection platforms and techniques. In this study we present a comparison of three eDNA surveillance platforms: traditional polymerase chain reaction (PCR), quantitative PCR (qPCR), and digital droplet PCR (ddPCR), in which water samples were collected over a 24 h time period from mesocosm experiments containing a population gradient of invasive species densities. All platforms reliably detected the presence of DNA, even at low target organism densities, within the first hour. The two quantitative platforms (qPCR and ddPCR) produced similar estimates of DNA concentrations. Analysis with ddPCR was faster from sample collection through analysis and cost approximately half as much as qPCR. Although ddPCR is a newer platform for eDNA surveillance of aquatic species, it was consistent with the more commonly used qPCR and is a cost-effective means of estimating DNA concentrations. Use of ddPCR by researchers and managers should be considered in future eDNA surveillance applications.
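    ddPCR concentration estimates rest on Poisson statistics over end-point droplet counts; a sketch, assuming a typical (not study-reported) droplet volume and hypothetical counts:

```python
import math

# Poisson back-calculation used by digital droplet PCR: the fraction of
# positive droplets is converted into a target concentration. The droplet
# volume below is a typical value, not one reported in the study.
def ddpcr_concentration(positive, total, droplet_volume_ul=0.00085):
    """Copies per microliter from an end-point positive/negative droplet count."""
    p = positive / total
    lam = -math.log(1.0 - p)          # mean target copies per droplet (Poisson)
    return lam / droplet_volume_ul

copies_per_ul = ddpcr_concentration(positive=2_000, total=15_000)
```

Because partitioning makes the count an absolute measurement, no standard curve is needed, unlike qPCR.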

  16. Dating human skeletal remains: investigating the viability of measuring the equilibrium between 210Po and 210Pb as a means of estimating the post-mortem interval.

    PubMed

    Swift, B

    1998-11-30

    Estimating the post-mortem interval in skeletal remains is a notoriously difficult task; forensic pathologists often rely heavily upon experience in recognising morphological appearances. Previous techniques have involved measuring physical or chemical changes within the hydroxyapatite matrix, radiocarbon dating and 90Sr dating, though no individual test has been advocated. Within this paper it is proposed that measuring the equilibrium between two naturally occurring radio-isotopes, 210Po and 210Pb, and comparison with post-mortem examination samples would produce a new method of dating human skeletal remains. Possible limitations exist, notably the effect of diagenesis, time limitations and relative cost, though this technique could provide a relatively accurate means of determining the post-mortem interval. It is therefore proposed that a large study be undertaken to provide a calibration scale against which bones uncovered can be dated.
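
The proposed dating principle can be made concrete: if a bone starts with no unsupported 210Po at death, the 210Po/210Pb activity ratio grows toward equilibrium at a known rate and can be inverted for elapsed time. A sketch under that simplifying assumption (which the diagenesis and time-limitation caveats noted above would complicate):

```python
import math

PO210_HALF_LIFE_DAYS = 138.4          # 210Po half-life
LAMBDA_PO = math.log(2) / PO210_HALF_LIFE_DAYS

def pmi_days(activity_ratio: float) -> float:
    """Post-mortem interval from the 210Po/210Pb activity ratio R.

    Assumes R(t) = 1 - exp(-lambda_Po * t), i.e. no unsupported 210Po
    at death and negligible 210Pb decay over the interval (valid only
    while t is much shorter than the 22.3-year 210Pb half-life).
    """
    if not 0.0 < activity_ratio < 1.0:
        raise ValueError("ratio must lie strictly between 0 and 1")
    return -math.log(1.0 - activity_ratio) / LAMBDA_PO

# A ratio of 0.5 corresponds to one 210Po half-life post mortem
print(round(pmi_days(0.5), 1))   # ~138.4 days
```

Because 210Po equilibrates within a few half-lives, this inversion is only informative over roughly the first one to two years, which is one of the time limitations the abstract mentions.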

  17. Weighted minimum-norm source estimation of magnetoencephalography utilizing the temporal information of the measured data

    NASA Astrophysics Data System (ADS)

    Iwaki, Sunao; Ueno, Shoogo

    1998-06-01

    The weighted minimum-norm estimation (wMNE) is a popular method to obtain the source distribution in the human brain from magneto- and electroencephalographic measurements when detailed information about the generator profile is not available. We propose a method to reconstruct current distributions in the human brain based on the wMNE technique, with the weighting factors defined by a simplified multiple signal classification (MUSIC) prescanning. In this method, in addition to the conventional depth normalization technique, the weighting factors of the wMNE are determined by cost values previously calculated by a simplified MUSIC scan, which incorporates the temporal information of the measured data. We performed computer simulations of this method and compared it with the conventional wMNE method. The results show that the proposed method is effective for the reconstruction of current distributions from noisy data.
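
The wMNE solution has a closed form in which the weighting matrix enters as a diagonal prior; a NumPy sketch (the lead field, weights, and regularization value are illustrative, and in the paper's method MUSIC-derived cost values would replace the uniform weights used here):

```python
import numpy as np

def weighted_mne(L, b, weights, lam=1e-2):
    """Weighted minimum-norm current estimate.

    Solves min ||W^(-1/2) j||^2 subject to L j ~ b via the closed form
        j = W L^T (L W L^T + lam*I)^(-1) b,
    where W = diag(weights) encodes depth normalization (and, in the
    paper's method, the MUSIC-derived weighting factors).
    """
    W = np.diag(weights)
    G = L @ W @ L.T + lam * np.eye(L.shape[0])
    return W @ L.T @ np.linalg.solve(G, b)

# Toy example: 4 sensors, 10 candidate sources, uniform weights
rng = np.random.default_rng(0)
L = rng.standard_normal((4, 10))       # illustrative lead-field matrix
j_true = np.zeros(10)
j_true[3] = 1.0
b = L @ j_true                         # noiseless measurement
j_hat = weighted_mne(L, b, np.ones(10))
print(j_hat.shape)
```

The estimate reproduces the measurements while spreading current across sources, which is why the choice of weights matters so much for localization.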

  18. Estimating the Costs of Preventive Interventions

    ERIC Educational Resources Information Center

    Foster, E. Michael; Porter, Michele M.; Ayers, Tim S.; Kaplan, Debra L.; Sandler, Irwin

    2007-01-01

    The goal of this article is to improve the practice and reporting of cost estimates of prevention programs. It reviews the steps in estimating the costs of an intervention and the principles that should guide estimation. The authors then review prior efforts to estimate intervention costs using a sample of well-known but diverse studies. Finally,…

  19. Cost effectiveness of bisphosphonates in the management of breast cancer patients with bone metastases.

    PubMed

    Botteman, M; Barghout, V; Stephens, J; Hay, J; Brandman, J; Aapro, M

    2006-07-01

    Bisphosphonates are recommended to prevent skeletal related events (SREs) in patients with breast cancer and bone metastases (BCBM). However, their clinical and economic profiles vary from one agent to another. Using modeling techniques, we simulated, from the perspective of the UK's National Health Service (NHS), the cost and quality-adjusted survival (QALY) associated with five commonly used bisphosphonates or no therapy in this patient population. The simulation followed patients through several health states (i.e. alive or dead, experiencing an SRE or no SRE, and receiving first or second line therapy). Drug costs, infusion costs, SRE costs, and utility values were estimated from published sources. Utilities were applied to time with and without SREs to capture the impact on quality of life. Compared to no therapy, all bisphosphonates are either cost saving or highly cost-effective (with a cost per QALY ≤ £6126). Within this evaluation, zoledronic acid was more effective and less expensive than all other options. Based on our model, the use of bisphosphonates in breast cancer patients with bone metastases should lead to improved patient outcomes and cost savings to the NHS and possibly other similar entities.

  20. Estimating pharmacy level prescription drug acquisition costs for third-party reimbursement.

    PubMed

    Kreling, D H; Kirk, K W

    1986-07-01

    Accurate payment for the acquisition costs of drug products dispensed is an important consideration in a third-party prescription drug program. Two alternative methods of estimating these costs among pharmacies were derived and compared. First, pharmacists were surveyed to determine the purchase discounts offered to them by wholesalers. A 10.00% modal and 11.35% mean discount resulted for 73 responding pharmacists. Second, cost-plus percents derived from gross profit margins of wholesalers were calculated and applied to wholesaler product costs to estimate pharmacy level acquisition costs. Cost-plus percents derived from National Median and Southwestern Region wholesaler figures were 9.27% and 10.10%, respectively. A comparison showed the two methods of estimating acquisition costs would result in similar acquisition cost estimates. Adopting a cost-plus estimating approach is recommended because it avoids potential pricing manipulations by wholesalers and manufacturers that would negate improvements in drug product reimbursement accuracy.

  1. A health economic outcome evaluation of an internet-based mobile-supported stress management intervention for employees.

    PubMed

    Ebert, David Daniel; Kählke, Fanny; Buntrock, Claudia; Berking, Matthias; Smit, Filip; Heber, Elena; Baumeister, Harald; Funk, Burkhardt; Riper, Heleen; Lehr, Dirk

    2018-03-01

    Objective This study aimed to estimate and evaluate the cost-effectiveness and cost-benefit of a guided internet- and mobile-supported occupational stress-management intervention (iSMI) for employees from the employer's perspective alongside a randomized controlled trial. Methods A sample of 264 employees with elevated symptoms of perceived stress (Perceived Stress Scale, PSS-10 ≥22) was randomly assigned either to the iSMI or a waitlist control (WLC) group with unrestricted access to treatment as usual. The iSMI consisted of seven sessions of problem-solving and emotion-regulation techniques and one booster session. Self-report data on symptoms of perceived stress and economic data were assessed at baseline and at six months following randomization. A cost-benefit analysis (CBA) and a cost-effectiveness analysis (CEA), with symptom-free status as the main outcome, were carried out from the employer's perspective. Statistical uncertainty was estimated using bootstrapping (N=5000). Results The CBA yielded a net benefit of EUR181 [95% confidence interval (CI) -6043-1042] per participant within the first six months following randomization. The CEA showed that willingness-to-pay ceilings of EUR0, EUR1000, and EUR2000 for one additional symptom-free employee yielded 67%, 90%, and 98% probabilities, respectively, of the intervention being cost-effective compared to the WLC. Conclusion The iSMI was cost-effective when compared to WLC and even led to cost savings within the first six months after randomization. Offering stress-management interventions can present good value for money in occupational healthcare.
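
The probability-of-cost-effectiveness figures come from evaluating the net monetary benefit, NMB = wtp * d_effect - d_cost, across bootstrap replicates at each willingness-to-pay ceiling. A simplified sketch on simulated per-participant differences (real trial data would resample the two arms separately; all numbers below are made up, not the trial's):

```python
import random

def prob_cost_effective(delta_costs, delta_effects, wtp, n_boot=2000, seed=1):
    """Fraction of bootstrap replicates where NMB = wtp * d_effect - d_cost > 0."""
    rng = random.Random(seed)
    n = len(delta_costs)
    wins = 0
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]      # resample with replacement
        d_cost = sum(delta_costs[i] for i in idx) / n
        d_eff = sum(delta_effects[i] for i in idx) / n
        if wtp * d_eff - d_cost > 0:
            wins += 1
    return wins / n_boot

# Made-up per-participant incremental costs (EUR) and symptom-free gains
sim = random.Random(42)
d_costs = [sim.gauss(-100, 800) for _ in range(264)]
d_effects = [sim.gauss(0.15, 0.4) for _ in range(264)]
for ceiling in (0, 1000, 2000):
    print(ceiling, prob_cost_effective(d_costs, d_effects, ceiling))
```

Plotting these probabilities against the ceiling gives the familiar cost-effectiveness acceptability curve.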

  2. Cost savings from peritoneal dialysis therapy time extension using icodextrin.

    PubMed

    Johnson, David W; Vincent, Kaia; Blizzard, Sophie; Rumpsfeld, Markus; Just, Paul

    2003-01-01

    Previous retrospective studies have reported that icodextrin may prolong peritoneal dialysis (PD) treatment time in patients with refractory fluid overload (RFO). Because the annual cost of PD therapy is lower than that of hemodialysis (HD) therapy in Australia, we prospectively investigated the ability of icodextrin to prolong PD technique survival in patients with RFO. We used a computer model to estimate the savings associated with that therapeutic strategy, based on annual therapy costs determined in a regional PD and HD costing exercise. Patients who met standard criteria for RFO, and who were otherwise to be converted immediately to HD, were asked to consent to an open-label assessment of the ability of icodextrin to delay the need to start HD. Time to conversion to HD was measured. The study enrolled 39 patients who were followed for a mean period of 1.1 years. Icodextrin significantly increased peritoneal ultrafiltration by a median value of 368 mL daily. It prolonged technique survival by a mean period of 1.21 years [95% confidence interval (CI): 0.80-1.62 years]. Extension of PD treatment time by icodextrin was particularly marked for patients who had ultrafiltration failure (UFF, n = 20), defined as net peritoneal ultrafiltration < 1 L daily (mean extension time: 1.70 years; 95% CI: 1.16-2.25 years). Overall, annualized savings were US$3,683 per patient per year. If just the patients with UFF were considered, the savings increased to US$4,893 per year. Icodextrin prolongs PD technique survival in patients with RFO, permitting them to continue on their preferred therapy. In Australia, that practice is highly cost-effective, particularly in individuals with UFF.

  3. End-user perspective of low-cost sensors for outdoor air pollution monitoring.

    PubMed

    Rai, Aakash C; Kumar, Prashant; Pilla, Francesco; Skouloudis, Andreas N; Di Sabatino, Silvana; Ratti, Carlo; Yasar, Ansar; Rickerby, David

    2017-12-31

    Low-cost sensor technology can potentially revolutionise the area of air pollution monitoring by providing high-density spatiotemporal pollution data. Such data can be utilised for supplementing traditional pollution monitoring, improving exposure estimates, and raising community awareness about air pollution. However, data quality remains a major concern that hinders the widespread adoption of low-cost sensor technology. Unreliable data may mislead unsuspecting users and potentially lead to alarming consequences such as reporting acceptable air pollutant levels when they are above the limits deemed safe for human health. This article provides scientific guidance to the end-users for effectively deploying low-cost sensors for monitoring air pollution and people's exposure, while ensuring reasonable data quality. We review the performance characteristics of several low-cost particle and gas monitoring sensors and provide recommendations to end-users for making proper sensor selection by summarizing the capabilities and limitations of such sensors. The challenges, best practices, and future outlook for effectively deploying low-cost sensors, and maintaining data quality are also discussed. For data quality assurance, a two-stage sensor calibration process is recommended, which includes laboratory calibration under controlled conditions by the manufacturer supplemented with routine calibration checks performed by the end-user under final deployment conditions. For large sensor networks where routine calibration checks are impractical, statistical techniques for data quality assurance should be utilised. Further advancements and adoption of sophisticated mathematical and statistical techniques for sensor calibration, fault detection, and data quality assurance can indeed help to realise the promised benefits of a low-cost air pollution sensor network.

  4. Surgical management for displaced pediatric proximal humeral fractures: a cost analysis.

    PubMed

    Shore, Benjamin J; Hedequist, Daniel J; Miller, Patricia E; Waters, Peter M; Bae, Donald S

    2015-02-01

    The purpose of this investigation was to determine which of the following methods of fixation, percutaneous pinning (PP) or intramedullary nailing (IMN), was more cost-effective in the treatment of displaced pediatric proximal humeral fractures (PPHF). A retrospective cohort study of surgically treated PPHF over a 12-year period at a single institution was performed. A decision analysis model was constructed to compare three surgical strategies: IMN versus percutaneous pinning leaving the pins exposed (PPE) versus leaving the pins buried (PPB). Finally, sensitivity analyses were performed, assessing the cost-effectiveness of each technique when infection rates and the cost of deep infections were varied. A total of 84 patients with displaced PPHF underwent surgical stabilization: 35 cases were treated with IMN, 32 with PPE, and 17 with PPB. Age, sex, and preoperative fracture angulation were similar across all groups. A greater percentage of open reduction was seen in the IMN and PPB groups (p = 0.03), while a higher proportion of physeal injury was seen in the PPE group (p = 0.02). Surgical time and estimated blood loss were higher in the IMN group (p < 0.001 and p = 0.01, respectively). The decision analysis revealed that the PPE technique resulted in an average cost saving of $4,502 per patient compared to IMN and $2,066 compared to PPB. This strategy remained cost-effective even when the complication rate with exposed implants approached 55%. Leaving pins exposed after surgical fixation of PPHF is more cost-effective than either burying pins or using intramedullary fixation.
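
The decision-analysis comparison reduces to expected-cost arithmetic over complication branches; a sketch with hypothetical inputs (none of the dollar figures or probabilities below are from the study), including the kind of one-way sensitivity analysis the authors describe:

```python
def expected_cost(base_cost: float, p_complication: float, complication_cost: float) -> float:
    """Expected per-patient cost of one arm of a simple decision tree."""
    return base_cost + p_complication * complication_cost

# Hypothetical inputs: operative cost, deep-infection probability,
# and deep-infection treatment cost for each strategy.
ppe = expected_cost(5000.0, 0.10, 8000.0)   # pins left exposed
ppb = expected_cost(7000.0, 0.05, 8000.0)   # pins buried (adds removal surgery)
imn = expected_cost(9500.0, 0.05, 8000.0)   # intramedullary nailing

# One-way sensitivity: complication rate at which exposed pins stop being cheapest
p_break_even = (ppb - 5000.0) / 8000.0
print(ppe < ppb < imn, round(p_break_even, 2))
```

Varying one input at a time until the ranking flips is exactly how a threshold such as the study's 55% complication rate is obtained.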

  5. Snowball Vs. House-to-House Technique for Measuring Annual Incidence of Kala-azar in the Higher Endemic Blocks of Bihar, India: A Comparison

    PubMed Central

    Siddiqui, Niyamat A.; Rabidas, Vidya N.; Sinha, Sanjay K.; Verma, Rakesh B.; Pandey, Krishna; Singh, Vijay P.; Ranjan, Alok; Topno, Roshan K.; Lal, Chandra S.; Kumar, Vijay; Sahoo, Ganesh C.; Sridhar, Srikantaih; Pandey, Arvind; Das, Pradeep

    2016-01-01

    Background Visceral Leishmaniasis, commonly known as kala-azar, is widely prevalent in Bihar. The National Kala-azar Control Program has applied the house-to-house survey approach several times for estimating kala-azar incidence in the past. However, this approach involves huge logistics and operational cost, as the occurrence of kala-azar is clustered in nature. The present study aims to compare the efficiency, cost and feasibility of the snowball sampling approach to the house-to-house survey approach in capturing kala-azar cases in two endemic districts of Bihar, India. Methodology/Principal findings A community based cross-sectional study was conducted in two highly endemic Primary Health Centre (PHC) areas, each from two endemic districts of Bihar, India. The snowball technique (used to locate potential subjects with the help of key informants where subjects are hard to locate) and the house-to-house survey technique were applied to detect all new cases of kala-azar during a defined reference period of one year, i.e. June 2010 to May 2011. The study covered a total of 105,035 households with a total population of 537,153. Out of the 561 cases and 17 deaths probably due to kala-azar identified by the study, the snowball sampling approach captured only 221 cases and 13 deaths, whereas 489 cases and 17 deaths were detected by the house-to-house survey approach. The higher value of the McNemar's χ² statistic (64; p<0.0001) for the house-to-house survey approach than for snowball sampling, and a relative difference >1, indicate that most of the kala-azar cases missed by snowball sampling were captured by the house-to-house approach, with 13% omission. Conclusion/Significance Snowball sampling was not found sensitive enough, as it captured only about 50% of VL cases. However, it captured about 77% of the deaths probably due to kala-azar and was found more cost-effective than the house-to-house approach. Standardization of the snowball approach with improved procedure, training and logistics may enhance the sensitivity of snowball sampling and its application in the national kala-azar elimination programme as a cost-effective approach for estimating kala-azar burden. PMID:27681709

  7. Managing design excellence tools during the development of new orthopaedic implants.

    PubMed

    Défossez, Henri J P; Serhan, Hassan

    2013-11-01

    Design excellence (DEX) tools have been widely used for years in some industries for their potential to facilitate new product development. The medical sector, targeted by cost pressures, has therefore started adopting them. Numerous tools are available; however, only appropriate deployment during the new product development stages can optimize the overall process. The primary study objectives were to describe generic tools, illustrate their implementation and management during the development of new orthopaedic implants, and compile a reference package. Secondary objectives were to present the DEX tool investment costs and savings, since the method can require significant resources for which companies must carefully plan. The publicly available DEX method "Define Measure Analyze Design Verify Validate" was adopted and implemented during the development of a new spinal implant. Several tools proved most successful at developing the correct product, addressing clinical needs, and increasing market penetration potential, while reducing design iterations and manufacturing validations. Cost analysis and the Pugh Matrix, coupled with multi-generational planning, enabled the development of a strong rationale to activate the project and set the vision and goals. Improved risk management and product mapping established a robust technical verification-validation program. Design of experiments and process quantification facilitated design for manufacturing of critical features as early as the concept phase. Biomechanical testing with analysis of variance provided a validation model with a recognized statistical performance baseline. Within those tools, only certain ones required minimal resources (i.e., business case, multi-generational plan, project value proposition, Pugh Matrix, critical-to-quality process validation techniques), while others required significant investments (i.e., voice of customer, product usage map, improved risk management, design of experiments, biomechanical testing techniques). All techniques used provided savings exceeding their investment costs. Some other tools were considered and found less relevant. A matrix summarized the investment costs and the estimated savings generated. Globally, all companies can benefit from using DEX by smartly selecting those tools with the best return on investment at the start of the project. For this, a good understanding of the available company resources, background and development strategy is needed. In conclusion, it was possible to illustrate that appropriate management of design excellence tools can greatly facilitate the development of new orthopaedic implant systems.

  8. 48 CFR 2452.216-70 - Estimated cost, base fee and award fee.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Estimated cost, base fee... Provisions and Clauses 2452.216-70 Estimated cost, base fee and award fee. As prescribed in 2416.406(e)(1), insert the following clause in all cost-plus-award-fee contracts: Estimated Cost, Base Fee and Award Fee...

  9. 48 CFR 2452.216-70 - Estimated cost, base fee and award fee.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Estimated cost, base fee... Provisions and Clauses 2452.216-70 Estimated cost, base fee and award fee. As prescribed in 2416.406(e)(1), insert the following clause in all cost-plus-award-fee contracts: Estimated Cost, Base Fee and Award Fee...

  10. Wind energy prospecting: socio-economic value of a new wind resource assessment technique based on a NASA Earth science dataset

    NASA Astrophysics Data System (ADS)

    Vanvyve, E.; Magontier, P.; Vandenberghe, F. C.; Delle Monache, L.; Dickinson, K.

    2012-12-01

    Wind energy is amongst the fastest growing sources of renewable energy in the U.S. and could supply up to 20% of U.S. power production by 2030. An accurate and reliable wind resource assessment for prospective wind farm sites is a challenging task, yet is crucial for evaluating the long-term profitability and feasibility of a potential development. We have developed an accurate and computationally efficient wind resource assessment technique for prospective wind farm sites, which incorporates innovative statistical techniques and the new NASA Earth science dataset MERRA. This technique produces a wind resource estimate that is more accurate than that obtained by the wind energy industry's standard technique, while providing a reliable quantification of its uncertainty. The focus now is on evaluating the socio-economic value of this new technique relative to the industry's standard technique. Would it yield lower financing costs? Could it result in lower electricity prices? Are there further down-the-line positive consequences, e.g. job creation, time saved, greenhouse gas reductions? Ultimately, we expect our results will inform efforts to refine and disseminate the new technique to support the development of the U.S. renewable energy infrastructure. In order to address the above questions, we are carrying out a cost-benefit analysis based on the net present worth of the technique. We will describe this approach, including the cash-flow process of wind farm financing and how the wind resource assessment factors in, and will present current results for various hypothetical candidate wind farm sites.

  11. An Error-Reduction Algorithm to Improve Lidar Turbulence Estimates for Wind Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Jennifer F.; Clifton, Andrew

    2016-08-01

    Currently, cup anemometers on meteorological (met) towers are used to measure wind speeds and turbulence intensity to make decisions about wind turbine class and site suitability. However, as modern turbine hub heights increase and wind energy expands to complex and remote sites, it becomes more difficult and costly to install met towers at potential sites. As a result, remote sensing devices (e.g., lidars) are now commonly used by wind farm managers and researchers to estimate the flow field at heights spanned by a turbine. While lidars can accurately estimate mean wind speeds and wind directions, there is still a large amount of uncertainty surrounding the measurement of turbulence with lidars. This uncertainty in lidar turbulence measurements is one of the key roadblocks that must be overcome in order to replace met towers with lidars for wind energy applications. In this talk, a model for reducing errors in lidar turbulence estimates is presented. Techniques for reducing errors from instrument noise, volume averaging, and variance contamination are combined in the model to produce a corrected value of the turbulence intensity (TI), a commonly used parameter in wind energy. In the next step of the model, machine learning techniques are used to further decrease the error in lidar TI estimates.
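
The error-reduction idea can be illustrated at the level of variance bookkeeping: instrument noise inflates the measured velocity variance, while volume averaging removes small-scale variance. A toy sketch (the correction factors and values are illustrative and stand in for the model's actual noise, averaging, and machine-learning steps):

```python
import math

def corrected_ti(u_mean, var_measured, var_noise, volume_avg_factor=1.0):
    """Turbulence intensity after simple error corrections.

    TI = sigma_u / U. Instrument noise adds variance, while volume
    averaging removes small-scale variance, so a corrected estimate is
        var_corr = (var_measured - var_noise) * volume_avg_factor,
    with volume_avg_factor >= 1 restoring the averaged-out variance.
    """
    var_corr = max(var_measured - var_noise, 0.0) * volume_avg_factor
    return math.sqrt(var_corr) / u_mean

# 8 m/s mean wind, measured variance 1.0 m^2/s^2, noise variance 0.1,
# and 10% of the true variance assumed lost to volume averaging
print(round(corrected_ti(8.0, 1.0, 0.1, 1.0 / 0.9), 3))
```

In the paper's model the two corrections are estimated rather than assumed, and a learned component then removes the residual error.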

  12. Estimation of both optical and nonoptical surface water quality parameters using Landsat 8 OLI imagery and statistical techniques

    NASA Astrophysics Data System (ADS)

    Sharaf El Din, Essam; Zhang, Yun

    2017-10-01

    Traditional surface water quality assessment is costly, labor intensive, and time consuming; however, remote sensing has the potential to assess surface water quality because of its spatiotemporal consistency. Therefore, estimating concentrations of surface water quality parameters (SWQPs) from satellite imagery is essential. Remote sensing estimation of nonoptical SWQPs, such as chemical oxygen demand (COD), biochemical oxygen demand (BOD), and dissolved oxygen (DO), has not yet been performed because they are less likely to affect signals measured by satellite sensors. However, concentrations of nonoptical variables may be correlated with optical variables, such as turbidity and total suspended sediments, which do affect the reflected radiation. In this context, an indirect relationship between satellite multispectral data and COD, BOD, and DO can be assumed. Therefore, this research attempts to develop an approach integrating Landsat 8 band ratios and stepwise regression to estimate concentrations of both optical and nonoptical SWQPs. Compared with previous studies, a significant correlation between Landsat 8 surface reflectance and concentrations of SWQPs was achieved, with a coefficient of determination (R²) > 0.85. These findings demonstrate the possibility of using our technique to develop models that estimate concentrations of SWQPs and to generate spatiotemporal maps of SWQPs from Landsat 8 imagery.
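
The band-ratio plus stepwise-regression approach can be sketched with a simple greedy forward selection over candidate predictors; the synthetic reflectances and the turbidity-like response below are made up for illustration and are not the study's data:

```python
import numpy as np

def forward_stepwise(X, y, max_features=3):
    """Greedy forward selection: repeatedly add the predictor that most improves R^2."""
    n, p = X.shape
    chosen, best_r2 = [], -np.inf
    for _ in range(max_features):
        best_j = None
        for j in range(p):
            if j in chosen:
                continue
            A = np.column_stack([np.ones(n), X[:, chosen + [j]]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
            if r2 > best_r2:
                best_r2, best_j = r2, j
        if best_j is None:
            break
        chosen.append(best_j)
    return chosen, best_r2

# Candidate predictors: raw band reflectances plus two band ratios (synthetic)
rng = np.random.default_rng(3)
bands = rng.uniform(0.01, 0.3, size=(60, 4))                       # four bands
ratios = np.column_stack([bands[:, 0] / bands[:, 1], bands[:, 2] / bands[:, 3]])
X = np.column_stack([bands, ratios])                               # columns 4-5 are ratios
y = 5.0 * ratios[:, 0] + rng.normal(0, 0.1, 60)                    # turbidity-like response
sel, r2 = forward_stepwise(X, y)
print(sel, round(r2, 3))
```

Because the response was built from the first band ratio, the selection picks that ratio first, which mirrors the paper's finding that ratio predictors outperform raw reflectances.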

  13. Probabilistic estimation of numbers and costs of future landslides in the San Francisco Bay region

    USGS Publications Warehouse

    Crovelli, R.A.; Coe, J.A.

    2009-01-01

    We used historical records of damaging landslides triggered by rainstorms and a newly developed Probabilistic Landslide Assessment Cost Estimation System (PLACES) to estimate the numbers and direct costs of future landslides in the 10-county San Francisco Bay region. Historical records of damaging landslides in the region are incomplete. Therefore, our estimates of numbers and costs of future landslides are minimal estimates. The estimated mean annual number of future damaging landslides for the entire 10-county region is about 65. Santa Cruz County has the highest estimated mean annual number of damaging future landslides (about 18), whereas Napa, San Francisco, and Solano Counties have the lowest estimated mean numbers of damaging landslides (about 1 each). The estimated mean annual cost of future landslides in the entire region is about US $14.80 million (year 2000 $). The estimated mean annual cost is highest for San Mateo County ($3.24 million) and lowest for Solano County ($0.18 million). The annual per capita cost for the entire region will be about $2.10. Santa Cruz County will have the highest annual per capita cost at $8.45, whereas San Francisco County will have the lowest per capita cost at $0.31. Normalising costs by dividing by the percentage of land area with slopes equal to or greater than 17% indicates that San Francisco County will have the highest cost per square km ($7,101), whereas Santa Clara County will have the lowest cost per square km ($229). These results indicate that the San Francisco Bay region has one of the highest levels of landslide risk in the United States. Compared with landslide cost estimates from the rest of the world, the risk level in the Bay region seems high, but not exceptionally high.

  14. Spatial aliasing for efficient direction-of-arrival estimation based on steering vector reconstruction

    NASA Astrophysics Data System (ADS)

    Yan, Feng-Gang; Cao, Bin; Rong, Jia-Jia; Shen, Yi; Jin, Ming

    2016-12-01

    A new technique is proposed to reduce the computational complexity of the multiple signal classification (MUSIC) algorithm for direction-of-arrival (DOA) estimation using a uniform linear array (ULA). The steering vector of the ULA is reconstructed as the Kronecker product of two other steering vectors, and a new cost function with spatial aliasing at hand is derived. Thanks to the estimation ambiguity of this spatial aliasing, mirror angles mathematically related to the true DOAs are generated, based on which the full spectral search involved in the MUSIC algorithm is compressed into a limited angular sector. Further complexity analysis and performance studies are conducted by computer simulations, which demonstrate that the proposed estimator requires a greatly reduced computational burden while showing accuracy similar to the standard MUSIC.
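
For reference, the baseline that the paper compresses is the standard MUSIC full spectral search; a NumPy sketch of that search over a ULA (the array geometry, noise level, and source angle here are illustrative):

```python
import numpy as np

def music_spectrum(R, n_sources, angles_deg, d=0.5):
    """Classic MUSIC pseudospectrum for a uniform linear array.

    R: sensor covariance matrix; d: element spacing in wavelengths.
    This is the full search over every grid angle, i.e. the cost the
    paper's Kronecker-factored steering vector reduces.
    """
    m = R.shape[0]
    w, v = np.linalg.eigh(R)              # eigenvalues ascending
    En = v[:, : m - n_sources]            # noise-subspace eigenvectors
    spec = []
    for th in np.deg2rad(angles_deg):
        a = np.exp(-2j * np.pi * d * np.arange(m) * np.sin(th))
        spec.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return np.array(spec)

# One source at 20 degrees, 8-element half-wavelength ULA
m = 8
theta = np.deg2rad(20.0)
a = np.exp(-2j * np.pi * 0.5 * np.arange(m) * np.sin(theta))
R = np.outer(a, a.conj()) + 0.01 * np.eye(m)     # idealised covariance
grid = np.arange(-90, 90.5, 0.5)
spec = music_spectrum(R, 1, grid)
print(grid[np.argmax(spec)])
```

The inner loop touches every grid angle; restricting it to the sector implied by the mirror angles is where the proposed method's savings come from.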

  15. Cost Estimation and Control for Flight Systems

    NASA Technical Reports Server (NTRS)

    Hammond, Walter E.; Vanhook, Michael E. (Technical Monitor)

    2002-01-01

    Good program management practices, cost analysis, cost estimation, and cost control for aerospace flight systems are interrelated and depend upon each other. The best cost control process cannot overcome poor design or poor systems trades that lead to the wrong approach. The project needs robust Technical, Schedule, Cost, Risk, and Cost Risk practices before it can incorporate adequate Cost Control. Cost analysis both precedes and follows cost estimation -- the two are closely coupled with each other and with Risk analysis. Parametric cost estimating relationships and computerized models are most often used. NASA has learned some valuable lessons in controlling cost problems, and recommends use of a summary Project Manager's checklist as shown here.

  16. Design and Implementation of an Intelligent Cost Estimation Model for Decision Support System Software

    DTIC Science & Technology

    1990-09-01

    The COCOMO model, which stands for COnstructive COst MOdel, was developed by Barry Boehm and is...estimation model which uses an expert system to automate the Intermediate COnstructive Cost Estimation MOdel (COCOMO), developed by Barry W. Boehm and...
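
The Intermediate COCOMO relationship that such a tool automates has a simple closed form, E = a * KLOC^b * EAF; a sketch using Boehm's published nominal coefficients (the example project size is made up, and a full implementation would compute EAF as the product of the 15 cost-driver multipliers):

```python
# Intermediate COCOMO nominal-effort coefficients (Boehm, 1981)
COEFFS = {
    "organic":       (3.2, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (2.8, 1.20),
}

def cocomo_effort(kloc: float, mode: str = "organic", eaf: float = 1.0) -> float:
    """Effort in person-months: E = a * KLOC^b * EAF.

    EAF is the effort adjustment factor, the product of the 15
    cost-driver effort multipliers (1.0 = all drivers nominal).
    """
    a, b = COEFFS[mode]
    return a * kloc ** b * eaf

# A nominal 32-KLOC organic-mode project
print(round(cocomo_effort(32), 1))
```

The expert-system layer described in the record would sit in front of this formula, eliciting the cost-driver ratings that determine EAF.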

  17. Estimating the costs of VA ambulatory care.

    PubMed

    Phibbs, Ciaran S; Bhandari, Aman; Yu, Wei; Barnett, Paul G

    2003-09-01

    This article reports how we matched Current Procedural Terminology (CPT) codes with Medicare payment rates and aggregate Veterans Affairs (VA) budget data to estimate the costs of every VA ambulatory encounter. Converting CPT codes to encounter-level costs was more complex than a simple match of Medicare reimbursements to CPT codes. About 40 percent of the CPT codes used in VA, representing about 20 percent of procedures, did not have a Medicare payment rate and required other cost estimates. Reconciling aggregated estimated costs to the VA budget allocations for outpatient care produced final VA cost estimates that were lower than projected Medicare reimbursements. The methods used to estimate costs for encounters could be replicated for other settings. They are potentially useful for any system that does not generate billing data, when CPT codes are simpler to collect than billing data, or when there is a need to standardize cost estimates across data sources.

  18. Cost and Return on Investment of a Work-Family Intervention in the Extended Care Industry: Evidence From the Work, Family, and Health Network.

    PubMed

    Dowd, William N; Bray, Jeremy W; Barbosa, Carolina; Brockwood, Krista; Kaiser, David J; Mills, Michael J; Hurtado, David A; Wipfli, Brad

    2017-10-01

    To estimate the cost and return on investment (ROI) of an intervention targeting work-family conflict (WFC) in the extended care industry. Costs to deliver the intervention during a group-randomized controlled trial were estimated, and data on organizational costs (presenteeism, health care costs, voluntary termination, and sick time) were collected from interviews and administrative data. Generalized linear models were used to estimate the intervention's impact on organizational costs. Combined, these results produced ROI estimates. A cluster-robust confidence interval (CI) was estimated around the ROI estimate. The per-participant cost of the intervention was $767. The ROI was -1.54 (95% CI: -4.31 to 2.18). The intervention was associated with a $668 reduction in health care costs (P < 0.05). This paper builds upon and expands prior ROI estimation methods to a new setting.
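
    The ROI statistic combines intervention delivery cost with estimated changes in organizational costs. A generic sketch of the calculation (the savings figure below is hypothetical, not from the study):

```python
def roi(organizational_savings, intervention_cost):
    """Return on investment: net savings per dollar spent.
    An ROI below zero means the intervention did not pay for itself
    over the evaluation window."""
    return (organizational_savings - intervention_cost) / intervention_cost
```

    With the study's per-participant cost of $767, an ROI of zero would require exactly $767 in per-participant organizational savings; anything less yields a negative ROI.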

  19. Development of a Cost Estimation Process for Human Systems Integration Practitioners During the Analysis of Alternatives

    DTIC Science & Technology

    2010-12-01

    processes. Novice estimators must often use complicated cost estimation tools (e.g., ACEIT, SEER-H, SEER-S, PRICE-H, PRICE-S, etc.). ... The thesis will leverage the processes embedded in cost estimation tools such as the Automated Cost Estimating Integration Tool (ACEIT) and the ...

  20. The Estimated Annual Cost of Uterine Leiomyomata in the United States

    PubMed Central

    CARDOZO, Eden R.; CLARK, Andrew D.; BANKS, Nicole K.; HENNE, Melinda B.; STEGMANN, Barbara J.; SEGARS, James H.

    2011-01-01

    Objective To estimate the total annual societal cost of uterine fibroids in the United States, based on direct and indirect costs, including associated obstetric complications. Study Design A systematic review of the literature was conducted to estimate the number of women seeking treatment for symptomatic fibroids annually, the costs of medical and surgical treatment, work lost and obstetric complications attributable to fibroids. Total annual costs were converted to 2010 U.S. dollars. A sensitivity analysis was performed. Results The estimated annual direct costs (surgery, hospital admissions, outpatient visits, medications) were $4.1 to $9.4 billion. Estimated lost work costs ranged from $1.55 to $17.2 billion annually. Obstetric outcomes attributed to fibroids resulted in a cost of $238 million to $7.76 billion annually. Uterine fibroids were estimated to cost the US $5.9 to $34.4 billion annually. Conclusions Obstetric complications associated with fibroids contributed significantly to their economic burden. Lost work costs may account for the largest proportion of societal costs due to fibroids. PMID:22244472

  1. Innovation in the pharmaceutical industry: New estimates of R&D costs.

    PubMed

    DiMasi, Joseph A; Grabowski, Henry G; Hansen, Ronald W

    2016-05-01

    The research and development costs of 106 randomly selected new drugs were obtained from a survey of 10 pharmaceutical firms. These data were used to estimate the average pre-tax cost of new drug and biologics development. The costs of compounds abandoned during testing were linked to the costs of compounds that obtained marketing approval. The estimated average out-of-pocket cost per approved new compound is $1395 million (2013 dollars). Capitalizing out-of-pocket costs to the point of marketing approval at a real discount rate of 10.5% yields a total pre-approval cost estimate of $2558 million (2013 dollars). When compared to the results of the previous study in this series, total capitalized costs were shown to have increased at an annual rate of 8.5% above general price inflation. Adding an estimate of post-approval R&D costs increases the cost estimate to $2870 million (2013 dollars). Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Exploring Gigabyte Datasets in Real Time: Architectures, Interfaces and Time-Critical Design

    NASA Technical Reports Server (NTRS)

    Bryson, Steve; Gerald-Yamasaki, Michael (Technical Monitor)

    1998-01-01

    Architectures and Interfaces: The implications of real-time interaction on software architecture design: decoupling of interaction/graphics and computation into asynchronous processes. The performance requirements of graphics and computation for interaction. Time management in such an architecture. Examples of how visualization algorithms must be modified for high performance. Brief survey of interaction techniques and design, including direct manipulation and manipulation via widgets. This talk discusses how human factors considerations drove the design and implementation of the virtual wind tunnel. Time-Critical Design: A survey of time-critical techniques for both computation and rendering. Emphasis on the assignment of a time budget to both the overall visualization environment and to each individual visualization technique in the environment. The estimation of the benefit and cost of an individual technique. Examples of the modification of visualization algorithms to allow time-critical control.

  3. How to estimate the cost of point-of-care CD4 testing in program settings: an example using the Alere Pima Analyzer in South Africa.

    PubMed

    Larson, Bruce; Schnippel, Kathryn; Ndibongo, Buyiswa; Long, Lawrence; Fox, Matthew P; Rosen, Sydney

    2012-01-01

    Integrating POC CD4 testing technologies into HIV counseling and testing (HCT) programs may improve post-HIV testing linkage to care and treatment. As evaluations of these technologies in program settings continue, estimates of the costs of POC CD4 tests to the service provider will be needed, and estimates have begun to be reported. Without a consistent and transparent methodology, estimates of the cost per CD4 test using POC technologies are likely to be difficult to compare and may lead to erroneous conclusions about costs and cost-effectiveness. This paper provides a step-by-step approach for estimating the cost per CD4 test from a provider's perspective. As an example, the approach is applied to one specific POC technology, the Pima Analyzer. The costing approach is illustrated with data from a mobile HCT program in Gauteng Province of South Africa. For this program, the cost per test in 2010 was estimated at $23.76 (material costs = $8.70; labor cost per test = $7.33; and equipment, insurance, and daily quality control = $7.72). Labor and equipment costs can vary widely depending on how the program operates and the number of CD4 tests completed over time. Additional costs not included in the above analysis, for on-going training, supervision, and quality control, are likely to further increase the cost per test. The main contribution of this paper is to outline a methodology for estimating the costs of incorporating POC CD4 testing technologies into an HCT program. The details of the program setting matter significantly for the cost estimate, so such details should be clearly documented to improve the consistency, transparency, and comparability of cost estimates.
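
    The per-test decomposition in the abstract (materials, labor, equipment) can be sketched as a simple provider-perspective calculation. The straight-line amortization scheme and the input values below are illustrative assumptions, not the paper's actual figures:

```python
def cost_per_test(material, wage_per_hour, minutes_per_test,
                  equipment_price, lifetime_tests):
    """Provider cost of one POC CD4 test: consumables, technician time,
    and straight-line amortization of the analyzer over its useful life
    (expressed as a total number of tests)."""
    labor = wage_per_hour * minutes_per_test / 60.0
    equipment = equipment_price / lifetime_tests
    return material + labor + equipment
```

    As the abstract notes, the labor and equipment terms are the volatile ones: both fall as test volume rises, so the same analyzer can imply very different per-test costs across programs.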

  4. Goal-Directed Fluid Therapy Guided by Cardiac Monitoring During High-Risk Abdominal Surgery in Adult Patients: Cost-Effectiveness Analysis of Esophageal Doppler and Arterial Pulse Pressure Waveform Analysis.

    PubMed

    Legrand, Guillaume; Ruscio, Laura; Benhamou, Dan; Pelletier-Fleury, Nathalie

    2015-07-01

    Several minimally invasive techniques for cardiac output monitoring such as the esophageal Doppler (ED) and arterial pulse pressure waveform analysis (APPWA) have been shown to improve surgical outcomes compared with conventional clinical assessment (CCA). To evaluate the cost-effectiveness of these techniques in high-risk abdominal surgery from the perspective of the French public health insurance fund. An analytical decision model was constructed to compare the cost-effectiveness of ED, APPWA, and CCA. Effectiveness data were defined from meta-analyses of randomized clinical trials. The clinical end points were avoidance of hospital mortality and avoidance of major complications. Hospital costs were estimated by the cost of corresponding diagnosis-related groups. Both goal-directed therapy strategies evaluated were more effective and less costly than CCA. Perioperative mortality and the rate of major complications were reduced by the use of ED and APPWA. Cost reduction was mainly due to the decrease in the rate of major complications. APPWA was dominant compared with ED in 71.6% and 27.6% and dominated in 23.8% and 20.8% of the cases when the end point considered was "major complications avoided" and "death avoided," respectively. Regarding cost per death avoided, APPWA was more likely to be cost-effective than ED in a wide range of willingness to pay. Cardiac output monitoring during high-risk abdominal surgery is cost-effective and is associated with a reduced rate of hospital mortality and major complications, whatever the device used. The two devices evaluated had negligible costs compared with the observed reduction in hospital costs. Our comparative studies suggest a larger effect with APPWA that needs to be confirmed by further studies. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  5. Estimating age-based antiretroviral therapy costs for HIV-infected children in resource-limited settings based on World Health Organization weight-based dosing recommendations.

    PubMed

    Doherty, Kathleen; Essajee, Shaffiq; Penazzato, Martina; Holmes, Charles; Resch, Stephen; Ciaranello, Andrea

    2014-05-02

    Pediatric antiretroviral therapy (ART) has been shown to substantially reduce morbidity and mortality in HIV-infected infants and children. To accurately project program costs, analysts need accurate estimations of antiretroviral drug (ARV) costs for children. However, the costing of pediatric antiretroviral therapy is complicated by weight-based dosing recommendations which change as children grow. We developed a step-by-step methodology for estimating the cost of pediatric ARV regimens for children ages 0-13 years old. The costing approach incorporates weight-based dosing recommendations to provide estimated ARV doses throughout childhood development. Published unit drug costs are then used to calculate average monthly drug costs. We compared our derived monthly ARV costs to published estimates to assess the accuracy of our methodology. The estimates of monthly ARV costs are provided for six commonly used first-line pediatric ARV regimens, considering three possible care scenarios. The costs derived in our analysis for children were fairly comparable to or slightly higher than available published ARV drug or regimen estimates. The methodology described here can be used to provide an accurate estimation of pediatric ARV regimen costs for cost-effectiveness analysts to project the optimum packages of care for HIV-infected children, as well as for program administrators and budget analysts who wish to assess the feasibility of increasing pediatric ART availability in constrained budget environments.
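
    The weight-band logic described above can be sketched as a lookup from child weight to daily dose units, multiplied by a unit price; the bands and price below are hypothetical placeholders for illustration, not WHO values:

```python
# Hypothetical weight bands: (low kg, high kg, units per day).
EXAMPLE_BANDS = [
    (3.0, 5.9, 1.0),
    (6.0, 9.9, 1.5),
    (10.0, 13.9, 2.0),
]

def monthly_arv_cost(weight_kg, bands, unit_price, days_per_month=30):
    """Average monthly drug cost for a child of the given weight,
    using weight-band dosing (bands and prices are placeholders)."""
    for low, high, units_per_day in bands:
        if low <= weight_kg <= high:
            return units_per_day * unit_price * days_per_month
    raise ValueError("weight outside the dosing bands")
```

    Chaining this over a growth curve (expected weight by age) yields the age-based annual cost profile the authors describe.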

  6. Deflection-Based Aircraft Structural Loads Estimation with Comparison to Flight

    NASA Technical Reports Server (NTRS)

    Lizotte, Andrew M.; Lokos, William A.

    2005-01-01

    Traditional techniques in structural load measurement entail the correlation of a known load with strain-gage output from the individual components of a structure or machine. The use of strain gages has proved successful and is considered the standard approach for load measurement. However, remotely measuring aerodynamic loads using deflection measurement systems to determine aeroelastic deformation as a substitute for strain gages may yield lower testing costs while improving aircraft performance through reduced instrumentation weight. This technique was examined using a reliable strain and structural deformation measurement system. The objective of this study was to explore the utility of deflection-based load estimation, using the active aeroelastic wing F/A-18 aircraft. Calibration data from ground tests performed on the aircraft were used to derive left wing-root and wing-fold bending-moment and torque load equations based on strain gages; for this study, however, point deflections were used to derive deflection-based load equations. Comparisons between the strain-gage and deflection-based methods are presented. Flight data from the phase-1 active aeroelastic wing flight program were used to validate the deflection-based load estimation method. Flight validation revealed a strong bending-moment correlation and slightly weaker torque correlation. Development of the current techniques and future studies are discussed.

  7. Deflection-Based Structural Loads Estimation From the Active Aeroelastic Wing F/A-18 Aircraft

    NASA Technical Reports Server (NTRS)

    Lizotte, Andrew M.; Lokos, William A.

    2005-01-01

    Traditional techniques in structural load measurement entail the correlation of a known load with strain-gage output from the individual components of a structure or machine. The use of strain gages has proved successful and is considered the standard approach for load measurement. However, remotely measuring aerodynamic loads using deflection measurement systems to determine aeroelastic deformation as a substitute for strain gages may yield lower testing costs while improving aircraft performance through reduced instrumentation weight. This technique was examined using a reliable strain and structural deformation measurement system. The objective of this study was to explore the utility of deflection-based load estimation, using the active aeroelastic wing F/A-18 aircraft. Calibration data from ground tests performed on the aircraft were used to derive left wing-root and wing-fold bending-moment and torque load equations based on strain gages; for this study, however, point deflections were used to derive deflection-based load equations. Comparisons between the strain-gage and deflection-based methods are presented. Flight data from the phase-1 active aeroelastic wing flight program were used to validate the deflection-based load estimation method. Flight validation revealed a strong bending-moment correlation and slightly weaker torque correlation. Development of the current techniques and future studies are discussed.

  8. Generalized Redistribute-to-the-Right Algorithm: Application to the Analysis of Censored Cost Data

    PubMed Central

    CHEN, SHUAI; ZHAO, HONGWEI

    2013-01-01

    Medical cost estimation is a challenging task when censoring of data is present. Although researchers have proposed methods for estimating mean costs, these are often derived from theory and are not always easy to understand. We provide an alternative method, based on a replace-from-the-right algorithm, for estimating mean costs more efficiently. We show that our estimator is equivalent to an existing one that is based on the inverse probability weighting principle and semiparametric efficiency theory. We also propose an alternative method for estimating the survival function of costs, based on the redistribute-to-the-right algorithm, that was originally used for explaining the Kaplan–Meier estimator. We show that this second proposed estimator is equivalent to a simple weighted survival estimator of costs. Finally, we develop a more efficient survival estimator of costs, using the same redistribute-to-the-right principle. This estimator is naturally monotone, more efficient than some existing survival estimators, and has a quite small bias in many realistic settings. We conduct numerical studies to examine the finite sample property of the survival estimators for costs, and show that our new estimator has small mean squared errors when the sample size is not too large. We apply both existing and new estimators to a data example from a randomized cardiovascular clinical trial. PMID:24403869
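
    The simple weighted estimator that the authors relate their algorithm to divides each complete observation's cost by the Kaplan-Meier probability of remaining uncensored at that time. A minimal sketch, assuming distinct follow-up times (a simplification not made in the paper):

```python
import numpy as np

def ipw_mean_cost(costs, followup, complete):
    """Simple inverse-probability-weighted estimator of mean cost.

    costs:    observed accumulated cost per subject
    followup: observed follow-up time per subject
    complete: 1 if the cost history is complete (e.g., death observed),
              0 if the subject was censored
    Assumes distinct follow-up times for simplicity.
    """
    order = np.argsort(followup)
    t, d, m = followup[order], complete[order], costs[order]
    n = len(t)
    at_risk = n - np.arange(n)
    # Kaplan-Meier survival of the *censoring* time: censorings are the events
    factors = np.where(d == 0, (at_risk - 1) / at_risk, 1.0)
    k_right = np.cumprod(factors)
    k_left = np.concatenate(([1.0], k_right[:-1]))  # K(t_i-) per subject
    # Weight complete observations by 1 / probability of being uncensored
    return float(np.sum(d * m / k_left) / n)
```

    With no censoring all weights equal 1 and the estimator reduces to the sample mean, which is a convenient sanity check on any implementation.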

  9. Healthcare costs of the progression of chronic kidney disease and different dialysis techniques estimated through administrative database analysis.

    PubMed

    Roggeri, Alessandro; Roggeri, Daniela Paola; Zocchetti, Carlo; Bersani, Maurizio; Conte, Ferruccio

    2017-04-01

    Chronic kidney disease (CKD) progression is associated with significant comorbidities and costs. In Italy, limited evidence of healthcare resource consumption and costs is available. We therefore aimed to investigate the direct healthcare costs borne by the Lombardy Regional Health Service (RHS) for the treatment of CKD patients in the first year after starting hemodialysis and in the 2 years prior to dialysis. Citizens resident in the Lombardy Region (Italy) who initiated dialysis in 2011 (Jan 1 to Dec 31) were selected, and data on their direct healthcare costs in the first year after starting dialysis and in the 2 years prior to it were extracted from the Lombardy Regional databases and analyzed. Costs of drugs, hospitalizations, diagnostic procedures and outpatient care covered by the RHS were estimated. Patients treated for acute kidney injury, or who died or stopped dialysis during the observational period, were excluded. From the regional population (>9,700,000 inhabitants), 1067 patients (34.3 % females) initiating dialysis were identified, of whom 82 % underwent only hemodialysis (HD), 13 % only peritoneal dialysis (PD) and the remaining 5 % both treatments. Direct healthcare costs per patient were € 5239, € 12,303 and € 38,821 (€ 40,132 for HD vs. € 30,444 for PD patients) for the periods 24-12 months pre-dialysis, 12-0 months pre-dialysis, and the first year of dialysis, respectively. This study highlights a significant economic burden related to CKD and an increase in direct healthcare costs associated with the start of dialysis, pointing to the importance of prevention programs and early diagnosis.

  10. Built-in self-test (BIST) techniques for millimeter wave CMOS transceivers

    NASA Astrophysics Data System (ADS)

    Mahzabeen, Tabassum

    The seamless integration of complementary metal oxide semiconductor (CMOS) transceivers with a digital CMOS process enhances on-chip testability, thus reducing production and testing costs. Built-in self-testability also improves yield by offering on-chip compensation. This work focuses on built-in self-test (BIST) techniques for CMOS-based millimeter wave (mm-wave) transceivers. BIST using the loopback method is one cost-effective method for testing these transceivers. Since the loopback switch is always present during the normal operation of the transceiver, its requirements differ from those of a conventional switch: it needs high isolation and high impedance during its OFF period. Two 80 GHz single pole single throw (SPST) switches have been designed, fabricated in a standard CMOS process, and measured to connect the loopback path for BIST applications. The loopback switches in this work meet the required criteria for loopback BIST. A stand-alone 80 GHz low noise amplifier (LNA) and the same LNA integrated with one of the loopback switches have been fabricated and measured to observe the difference in performance when the loopback switch is present. Besides the loopback switch, substrate leakage also forms a path between the transmitter and receiver; it has been characterized as a function of the distance between them for consideration in using the BIST method. A BIST algorithm has been developed to estimate process variation in device sizes by probing a low-frequency ring oscillator and mapping the estimated variation to the 80 GHz LNA. Probing a low-frequency circuit is cheaper than probing a millimeter-wave circuit and reduces testing costs. The performance of the LNA degrades due to variation in device size. Once the shift in device size has been estimated (from the ring oscillator's shifted frequency), the LNA's performance can be recovered by several methods, for example by using tunable transmission-line lengths in the amplifier or a variable supply voltage. This concept of estimating process variation has been demonstrated in Agilent's Advanced Design System (ADS).

  11. Efficient high-dimensional characterization of conductivity in a sand box using massive MRI-imaged concentration data

    NASA Astrophysics Data System (ADS)

    Lee, J. H.; Yoon, H.; Kitanidis, P. K.; Werth, C. J.; Valocchi, A. J.

    2015-12-01

    Characterizing subsurface properties, particularly hydraulic conductivity, is crucial for reliable and cost-effective groundwater supply management, contaminant remediation, and emerging deep subsurface activities such as geologic carbon storage and unconventional resources recovery. With recent advances in sensor technology, a large volume of hydro-geophysical and chemical data can be obtained to achieve high-resolution images of subsurface properties, which can be used for accurate subsurface flow and reactive transport predictions. However, subsurface characterization with a plethora of information requires high, often prohibitive, computational costs associated with "big data" processing and large-scale numerical simulations. As a result, traditional inversion techniques are not well-suited for problems that require coupled multi-physics simulation models with massive data. In this work, we apply a scalable inversion method called the Principal Component Geostatistical Approach (PCGA) to characterize the heterogeneous hydraulic conductivity (K) distribution in a 3-D sand box. The PCGA is a Jacobian-free geostatistical inversion approach that uses the leading principal components of the prior information to reduce computational costs, sometimes dramatically, and can be easily linked with any simulation software. Sequential images of transient tracer concentrations in the sand box were obtained using a magnetic resonance imaging (MRI) technique, resulting in 6 million tracer-concentration data [Yoon et al., 2008]. Since each individual tracer observation carries little information on the K distribution, the dimension of the data was reduced using temporal moments and the discrete cosine transform (DCT). Consequently, 100,000 unknown K values consistent with the scale of the MRI data (0.25^3 cm^3) were estimated by matching temporal moments and DCT coefficients of the original tracer data. The estimated K fields are close to the true K field, and even small-scale variability of the sand box was captured, highlighting high-K connectivity and the contrasts between low- and high-K zones. A total of 1,000 MODFLOW and MT3DMS simulations was required to obtain the final estimates and the corresponding estimation uncertainty, demonstrating the efficiency and effectiveness of our method.

  12. Mitigating climate change through afforestation: new cost estimates for the United States

    Treesearch

    Anne Sofie Elberg Nielsen; Andrew J. Plantinga; Ralph J. Alig

    2014-01-01

    We provide new cost estimates for carbon sequestration through afforestation in the U.S. We extend existing studies of carbon sequestration costs in several important ways, while ensuring the transparency of our approach. Our cost estimates have five distinguishing features: (1) we estimate costs for each county in the contiguous U.S., (2) we include afforestation of...

  13. Assuring Software Cost Estimates: Is it an Oxymoron?

    NASA Technical Reports Server (NTRS)

    Hihn, Jarius; Tregre, Grant

    2013-01-01

    The software industry repeatedly observes cost growth of well over 100% even after decades of cost estimation research and well-known best practices, so "What's the problem?" In this paper we will provide an overview of the current state of software cost estimation best practice. We then explore whether applying some of the methods used in software assurance might improve the quality of software cost estimates. This paper especially focuses on issues associated with model calibration, estimate review, and the development and documentation of estimates as part of an integrated plan.

  14. Estimating the cost of a smoking employee.

    PubMed

    Berman, Micah; Crane, Rob; Seiber, Eric; Munur, Mehmet

    2014-09-01

    We attempted to estimate the excess annual costs that a US private employer may attribute to employing an individual who smokes tobacco, as compared to a non-smoking employee. Reviewing and synthesising previous literature estimating certain discrete costs associated with smoking employees, we developed a cost estimation approach that approximates the total of such costs for U.S. employers. We examined absenteeism, presenteeism, smoking breaks, healthcare costs and pension benefits for smokers. Our best estimate of the annual excess cost to employ a smoker is $5816. This estimate should be taken as a general indicator of the extent of excess costs, not as a predictive point value. Employees who smoke impose significant excess costs on private employers. The results of this study may help inform employer decisions about tobacco-related policies. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  15. Cone penetrometer testing and discrete-depth ground water sampling techniques: A cost-effective method of site characterization in a multiple-aquifer setting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zemo, D.A.; Pierce, Y.G.; Gallinatti, J.D.

    Cone penetrometer testing (CPT), combined with discrete-depth ground water sampling methods, can significantly reduce the time and expense required to characterize large sites that have multiple aquifers. Results from the screening site characterization can then be used to design and install a cost-effective monitoring well network. At a site in northern California, it was necessary to characterize the stratigraphy and the distribution of volatile organic compounds (VOCs). To expedite characterization, a five-week field screening program was implemented that consisted of a shallow ground water survey, CPT soundings and pore-pressure measurements, and discrete-depth ground water sampling. Based on continuous lithologic information provided by the CPT soundings, four predominantly coarse-grained, water-yielding stratigraphic packages were identified. Seventy-nine discrete-depth ground water samples were collected using either shallow ground water survey techniques, the BAT Enviroprobe, or the QED HydroPunch I, depending on subsurface conditions. Using results from these efforts, a 20-well monitoring network was designed and installed to monitor critical points within each stratigraphic package. Good correlation was found between the hydraulic head and chemical results of the discrete-depth screening data and those of the monitoring well data. Understanding the vertical VOC distribution and concentrations produced substantial time and cost savings by minimizing the number of permanent monitoring wells and reducing the number of costly conductor casings that had to be installed. Additionally, significant long-term cost savings will result from reduced sampling costs, because fewer wells comprise the monitoring network. The authors estimate these savings to be 50% for site characterization costs, 65% for site characterization time, and 60% for long-term monitoring costs.

  16. Hybrid-Based Dense Stereo Matching

    NASA Astrophysics Data System (ADS)

    Chuang, T. Y.; Ting, H. W.; Jaw, J. J.

    2016-06-01

    Stereo matching that generates accurate and dense disparity maps is an indispensable technique for 3D exploitation of imagery in the fields of computer vision and photogrammetry. Although numerous solutions and advances have been proposed in the literature, occlusions, disparity discontinuities, sparse texture, image distortion, and illumination changes still lead to problematic issues and await better treatment. In this paper, a hybrid method based on semi-global matching (SGM) is presented to tackle the challenges of dense stereo matching. To ease the sensitivity of SGM cost aggregation to its penalty parameters, a formal way of providing proper penalty estimates is proposed. To this end, the study employs shape-adaptive cross-based matching with an edge constraint to generate an initial disparity map for penalty estimation. Image edges, indicating the potential locations of occlusions as well as disparity discontinuities, are identified by the edge drawing algorithm to ensure that the local support regions do not cover significant disparity changes. Besides, an additional penalty parameter Pe is imposed on the energy function of the SGM cost aggregation to specifically handle edge pixels. Furthermore, the final disparities of edge pixels are found by weighting the values derived from the SGM cost aggregation and from U-SURF matching, providing more reliable estimates in disparity discontinuity areas. Evaluations on the Middlebury stereo benchmarks demonstrate satisfactory performance and reveal the potential of the hybrid dense stereo matching method.

  17. Sector-Based Detection for Hands-Free Speech Enhancement in Cars

    NASA Astrophysics Data System (ADS)

    Lathoud, Guillaume; Bourgeois, Julien; Freudenberger, Jürgen

    2006-12-01

    Adaptation control of beamforming interference cancellation techniques is investigated for in-car speech acquisition. Two efficient adaptation control methods are proposed that avoid target cancellation. The "implicit" method varies the step-size continuously, based on the filtered output signal. The "explicit" method decides in a binary manner whether to adapt or not, based on a novel estimate of target and interference energies. It estimates the average delay-sum power within a volume of space, for the same cost as the classical delay-sum. Experiments on real in-car data validate both methods, including a case with[InlineEquation not available: see fulltext.] km/h background road noise.

  18. Landsat for practical forest type mapping - A test case

    NASA Technical Reports Server (NTRS)

    Bryant, E.; Dodge, A. G., Jr.; Warren, S. D.

    1980-01-01

    Computer-classified Landsat maps are compared with a recent conventional inventory of forest lands in northern Maine. Over the 196,000 hectare area mapped, estimates of the areas of softwood, mixed wood and hardwood forest obtained by a supervised classification of the Landsat data and a standard inventory based on aerial photointerpretation, probability proportional to prediction, field sampling and a standard forest measurement program are found to agree to within 5%. The cost of the Landsat maps is estimated to be $0.065/hectare. It is concluded that satellite techniques are worth developing for forest inventories, although they are not yet refined enough to be incorporated into current practical inventories.

  19. Pros, Cons, and Alternatives to Weight Based Cost Estimating

    NASA Technical Reports Server (NTRS)

    Joyner, Claude R.; Lauriem, Jonathan R.; Levack, Daniel H.; Zapata, Edgar

    2011-01-01

    Many cost estimating tools use weight as a major parameter in projecting the cost. This is often combined with modifying factors such as complexity, technical maturity of design, environment of operation, etc. to increase the fidelity of the estimate. For a set of conceptual designs, all meeting the same requirements, increased weight can be a major driver of increased cost. However, once a design is fixed, increased weight generally decreases cost, while decreased weight generally increases cost - and the relationship is not linear. Alternative approaches to estimating cost without using weight (except perhaps for materials costs) have been attempted to try to produce a tool usable throughout the design process - from concept studies through development. This paper will address the pros and cons of using weight-based models for cost estimating, using liquid rocket engines as the example. It will then examine approaches that minimize the impact of weight-based cost estimating. The Rocket Engine Cost Model (RECM) is an attribute-based model developed internally by Pratt & Whitney Rocketdyne for NASA. RECM will be presented primarily to show a successful method to use design and programmatic parameters instead of weight to estimate both design and development costs and production costs. An operations model developed by KSC, the Launch and Landing Effects Ground Operations model (LLEGO), will also be discussed.
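
    A weight-based cost estimating relationship of the kind the paper critiques is typically a power law, cost = a * weight^b, fitted to historical data in log space. The sketch below uses hypothetical weight/cost pairs; RECM itself is attribute-based and is not reproduced here.

```python
import numpy as np

def fit_cer(weights, costs):
    """Fit cost = a * weight^b by linear least squares in log space."""
    b, ln_a = np.polyfit(np.log(weights), np.log(costs), 1)
    return np.exp(ln_a), b

# Hypothetical historical data points (weight in kg, cost in $M):
# each doubling of weight multiplies cost by 1.6, i.e. b = log2(1.6).
w = np.array([100.0, 200.0, 400.0, 800.0])
c = np.array([50.0, 80.0, 128.0, 204.8])
a, b = fit_cer(w, c)
estimate = a * 300.0 ** b      # projected cost of a 300 kg design, ~ $105M
```

    The exponent b < 1 encodes the usual economy of scale; the paper's point is that such a relationship, calibrated across designs, gives misleading guidance once a single design is being iterated.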

  20. Estimating source parameters from deformation data, with an application to the March 1997 earthquake swarm off the Izu Peninsula, Japan

    NASA Astrophysics Data System (ADS)

    Cervelli, P.; Murray, M. H.; Segall, P.; Aoki, Y.; Kato, T.

    2001-06-01

    We have applied two Monte Carlo optimization techniques, simulated annealing and random cost, to the inversion of deformation data for fault and magma chamber geometry. These techniques involve an element of randomness that permits them to escape local minima and ultimately converge to the global minimum of misfit space. We have tested the Monte Carlo algorithms on two synthetic data sets. We have also compared them to one another in terms of their efficiency and reliability. We have applied the bootstrap method to estimate confidence intervals for the source parameters, including the correlations inherent in the data. Additionally, we present methods that use the information from the bootstrapping procedure to visualize the correlations between the different model parameters. We have applied these techniques to GPS, tilt, and leveling data from the March 1997 earthquake swarm off the Izu Peninsula, Japan. Using the two Monte Carlo algorithms, we have inferred two sources, a dike and a fault, that fit the deformation data and the patterns of seismicity and that are consistent with the regional stress field.
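
    The simulated-annealing idea — accepting some uphill moves so the search can escape local minima of the misfit — can be sketched on a one-dimensional toy misfit with two minima. This is a generic illustration, not the authors' geodetic inversion code.

```python
import math, random

def simulated_annealing(f, x0, t0=2.0, cooling=0.999, steps=5000, seed=0):
    """Minimize f by simulated annealing: uphill moves are accepted with
    probability exp(-delta/T), and T decays so the search settles down."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(steps):
        cand = x + rng.gauss(0.0, 0.5)
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling
    return best_x, best_f

# Toy misfit with a local minimum near x = +1 and the global one near x = -1.
misfit = lambda x: (x * x - 1.0) ** 2 + 0.3 * x
best_x, best_f = simulated_annealing(misfit, x0=2.0)
```

    A pure downhill search started at x0 = 2 would settle in the shallow well near +1; the random acceptance rule lets the annealer cross the barrier at x = 0 while the temperature is still high.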

  1. Staining-free malaria diagnostics by multispectral and multimodality light-emitting-diode microscopy

    NASA Astrophysics Data System (ADS)

    Merdasa, Aboma; Brydegaard, Mikkel; Svanberg, Sune; Zoueu, Jeremie T.

    2013-03-01

    We report an accurate optical differentiation technique between healthy and malaria-infected erythrocytes by quasi-simultaneous measurements of transmittance, reflectance, and scattering properties of unstained blood smears using a multispectral and multimode light-emitting diode microscope. We propose a technique for automated imaging, identification, and counting of malaria-infected erythrocytes for real-time and cost-effective parasitaemia diagnosis as an effective alternative to the manual screening of stained blood smears, now considered to be the gold standard in malaria diagnosis. We evaluate the performance of our algorithm against manual estimations of an expert and show a spectrally resolved increased scattering from malaria-infected blood cells.

  2. IUS/TUG orbital operations and mission support study. Volume 5: Cost estimates

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The costing approach, methodology, and rationale utilized for generating cost data for composite IUS and space tug orbital operations are discussed. Summary cost estimates are given along with cost data initially derived for the IUS program and space tug program individually, and cost estimates for each work breakdown structure element.

  3. A cost-effectiveness comparison of the use of antimicrobial agents for treatment or prophylaxis of travelers' diarrhea.

    PubMed

    Reves, R R; Johnson, P C; Ericsson, C D; DuPont, H L

    1988-11-01

    We conducted a decision analysis to compare the cost-effectiveness of antimicrobial agents used for treatment with their use for prophylaxis of travelers' diarrhea. Estimates of the likelihood and the cost of various outcomes were obtained from a panel of experts using the Delphi group opinion technique. Treatment with sulfamethoxazole-trimethoprim for three days was compared with daily prophylaxis with sulfamethoxazole-trimethoprim or doxycycline. The cost-effectiveness of prophylaxis with either agent (75% to 83%) was greater than that of treatment (38%). Treatment would become more cost-effective than prophylaxis when the cumulative risk of acquiring travelers' diarrhea was less than 0.05 episodes per person per week or if the effectiveness of prophylaxis fell below 35% for doxycycline and 46% for sulfamethoxazole-trimethoprim. The most important contributor to the mean cost of travelers' diarrhea in this analysis was the cost associated with a day of incapacitation due to illness. On the basis of the results of this decision analysis, we conclude that prophylaxis of travelers' diarrhea is an option that should be considered for individual situations and recommend further studies of its cost-effectiveness.
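
    The expected-cost comparison behind such a decision analysis can be sketched with hypothetical numbers (the study's actual parameters came from a Delphi panel, not these values). Each strategy's expected cost is its drug cost plus the residual illness cost it fails to prevent; equating the two yields a break-even attack risk.

```python
def expected_cost(p_illness, c_illness_day, days_ill, c_drug, effectiveness):
    """Expected trip cost of a strategy: drug cost plus residual illness cost.

    effectiveness is the fraction of illness burden the strategy prevents.
    All numbers used below are illustrative placeholders.
    """
    residual = p_illness * (1.0 - effectiveness)
    return c_drug + residual * c_illness_day * days_ill

# Hypothetical trip with a 30% attack risk and $150/day incapacitation cost.
treat = expected_cost(0.30, c_illness_day=150.0, days_ill=1.0,
                      c_drug=10.0, effectiveness=0.60)
proph = expected_cost(0.30, c_illness_day=150.0, days_ill=1.0,
                      c_drug=25.0, effectiveness=0.80)

# Attack risk at which the two strategies cost the same:
# (c_drug_p - c_drug_t) / ((e_p - e_t) * c_illness_day * days_ill)
break_even_risk = (25.0 - 10.0) / ((0.80 - 0.60) * 150.0)
```

    At this 30% risk (below the 0.5 break-even), treatment has the lower expected cost; above the break-even, prophylaxis wins — the same threshold logic the study applies to its Delphi-derived inputs.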

  4. 14 CFR 151.24 - Procedures: Application; information on estimated project costs.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... estimated project costs. 151.24 Section 151.24 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Development Projects § 151.24 Procedures: Application; information on estimated project costs. (a) If any part of the estimated project costs consists of the value of donated land, labor, materials, or equipment...

  5. 14 CFR 151.24 - Procedures: Application; information on estimated project costs.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... estimated project costs. 151.24 Section 151.24 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Development Projects § 151.24 Procedures: Application; information on estimated project costs. (a) If any part of the estimated project costs consists of the value of donated land, labor, materials, or equipment...

  6. 14 CFR 151.24 - Procedures: Application; information on estimated project costs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... estimated project costs. 151.24 Section 151.24 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Development Projects § 151.24 Procedures: Application; information on estimated project costs. (a) If any part of the estimated project costs consists of the value of donated land, labor, materials, or equipment...

  7. 14 CFR 151.24 - Procedures: Application; information on estimated project costs.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... estimated project costs. 151.24 Section 151.24 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Development Projects § 151.24 Procedures: Application; information on estimated project costs. (a) If any part of the estimated project costs consists of the value of donated land, labor, materials, or equipment...

  8. 14 CFR 151.24 - Procedures: Application; information on estimated project costs.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... estimated project costs. 151.24 Section 151.24 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Development Projects § 151.24 Procedures: Application; information on estimated project costs. (a) If any part of the estimated project costs consists of the value of donated land, labor, materials, or equipment...

  9. The Sensitivity of Adverse Event Cost Estimates to Diagnostic Coding Error

    PubMed Central

    Wardle, Gavin; Wodchis, Walter P; Laporte, Audrey; Anderson, Geoffrey M; Baker, Ross G

    2012-01-01

    Objective To examine the impact of diagnostic coding error on estimates of hospital costs attributable to adverse events. Data Sources Original and reabstracted medical records of 9,670 complex medical and surgical admissions at 11 hospital corporations in Ontario from 2002 to 2004. Patient specific costs, not including physician payments, were retrieved from the Ontario Case Costing Initiative database. Study Design Adverse events were identified among the original and reabstracted records using ICD10-CA (Canadian adaptation of ICD10) codes flagged as postadmission complications. Propensity score matching and multivariate regression analysis were used to estimate the cost of the adverse events and to determine the sensitivity of cost estimates to diagnostic coding error. Principal Findings Estimates of the cost of the adverse events ranged from $16,008 (metabolic derangement) to $30,176 (upper gastrointestinal bleeding). Coding errors caused the total cost attributable to the adverse events to be underestimated by 16 percent. The impact of coding error on adverse event cost estimates was highly variable at the organizational level. Conclusions Estimates of adverse event costs are highly sensitive to coding error. Adverse event costs may be significantly underestimated if the likelihood of error is ignored. PMID:22091908

  10. 77 FR 45636 - Food Safety Modernization Act Domestic and Foreign Facility Reinspection, Recall, and Importer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-01

    ... of a Supported Direct FDA Work Hour for FY 2013 FDA is required to estimate 100 percent of its costs... operating costs. A. Estimating the Full Cost per Direct Work Hour in FY 2011 In general, the starting point for estimating the full cost per direct work hour is to estimate the cost of a full-time-equivalent...

  11. Using average cost methods to estimate encounter-level costs for medical-surgical stays in the VA.

    PubMed

    Wagner, Todd H; Chen, Shuo; Barnett, Paul G

    2003-09-01

    The U.S. Department of Veterans Affairs (VA) maintains discharge abstracts, but these do not include cost information. This article describes the methods the authors used to estimate the costs of VA medical-surgical hospitalizations in fiscal years 1998 to 2000. They estimated a cost regression with 1996 Medicare data restricted to veterans receiving VA care in an earlier year. The regression accounted for approximately 74 percent of the variance in cost-adjusted charges, and it proved to be robust to outliers and the year of input data. The beta coefficients from the cost regression were used to impute costs of VA medical-surgical hospital discharges. The estimated aggregate costs were reconciled with VA budget allocations. In addition to the direct medical costs, their cost estimates include indirect costs and physician services; both of these were allocated in proportion to direct costs. They discuss the method's limitations and application in other health care systems.
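
    The imputation approach — fit a regression on a dataset that has costs, then apply the coefficients to discharges that lack them — can be sketched as follows. The covariates, coefficients, and data here are invented; the study used a much richer Medicare-based cost regression.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "training" discharges with known costs:
# length of stay (days) and a surgical-stay flag drive log cost.
n = 500
los = rng.integers(1, 30, n).astype(float)
surg = rng.integers(0, 2, n).astype(float)
log_cost = 7.0 + 0.05 * los + 0.40 * surg + rng.normal(0.0, 0.2, n)

X = np.column_stack([np.ones(n), los, surg])
beta, *_ = np.linalg.lstsq(X, log_cost, rcond=None)

# Impute costs for discharges that carry no cost field.
new = np.array([[1.0, 10.0, 1.0],    # 10-day surgical stay
                [1.0, 3.0, 0.0]])    # 3-day medical stay
imputed = np.exp(new @ beta)
```

    The fitted coefficients recover the simulated surgical premium, and applying them to cost-less records yields plausible dollar values — the same mechanism, at toy scale, as imputing VA discharge costs from a Medicare-fitted regression.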

  12. Estimates and implications of the costs of compliance with biosafety regulations in developing countries.

    PubMed

    Falck-Zepeda, Jose; Yorobe, Jose; Husin, Bahagiawati Amir; Manalo, Abraham; Lokollo, Erna; Ramon, Godfrey; Zambrano, Patricia; Sutrisno

    2012-01-01

    Estimating the cost of compliance with biosafety regulations is important, as it helps developers focus their investments in product development. We provide estimates of the cost of compliance for a set of technologies in Indonesia, the Philippines, and other countries. These costs vary from US $100,000 to 1.7 million. These are estimates of regulatory costs only and do not include product development or deployment costs. Cost estimates need to be weighed against the potential gains when the technology is introduced in these countries and against the knowledge gained during the biosafety assessment process. Although the cost of compliance is important, time delays and uncertainty are even more important and may have an adverse impact on innovations reaching farmers.

  13. Costing the satellite power system

    NASA Technical Reports Server (NTRS)

    Hazelrigg, G. A., Jr.

    1978-01-01

    The paper presents a methodology for satellite power system costing, places approximate limits on the accuracy possible in cost estimates made at this time, and outlines the use of probabilistic cost information in support of the decision-making process. Reasons for using probabilistic costing or risk analysis procedures instead of standard deterministic costing procedures are considered. Components of cost, cost estimating relationships, grass-roots costing, and risk analysis are discussed. Risk analysis using a Monte Carlo simulation model is used to estimate future costs.
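
    A minimal Monte Carlo cost-risk sketch in the spirit described: each cost element is drawn from a distribution (here triangular, with hypothetical low/mode/high values) and the draws are summed, yielding a probability distribution over total cost rather than a single deterministic point estimate.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical cost elements as (low, mode, high) triangular distributions, $M.
elements = {"solar array": (40, 55, 90),
            "transmitter": (20, 30, 60),
            "assembly":    (10, 15, 35)}

# One total-cost sample per trial: sum of independent element draws.
total = sum(rng.triangular(lo, mode, hi, N) for lo, mode, hi in elements.values())

mean = total.mean()
p50, p90 = np.percentile(total, [50, 90])
```

    Decision-makers then quote, say, the 50th and 90th percentiles instead of a single number; the gap between them is a direct measure of cost risk.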

  14. Estimating the Deep Space Network modification costs to prepare for future space missions by using major cost drivers

    NASA Technical Reports Server (NTRS)

    Remer, Donald S.; Sherif, Josef; Buchanan, Harry R.

    1993-01-01

    This paper develops a cost model to do long range planning cost estimates for Deep Space Network (DSN) support of future space missions. The paper focuses on the costs required to modify and/or enhance the DSN to prepare for future space missions. The model is a function of eight major mission cost drivers and estimates both the total cost and the annual costs of a similar future space mission. The model is derived from actual cost data from three space missions: Voyager (Uranus), Voyager (Neptune), and Magellan. Estimates derived from the model are tested against actual cost data for two independent missions, Viking and Mariner Jupiter/Saturn (MJS).

  15. Outer planet probe cost estimates: First impressions

    NASA Technical Reports Server (NTRS)

    Niehoff, J.

    1974-01-01

    An examination was made of early estimates of outer planetary atmospheric probe cost by comparing the estimates with past planetary projects. Of particular interest is identification of project elements which are likely cost drivers for future probe missions. Data are divided into two parts: first, the description of a cost model developed by SAI for the Planetary Programs Office of NASA, and second, use of this model and its data base to evaluate estimates of probe costs. Several observations are offered in conclusion regarding the credibility of current estimates and specific areas of the outer planet probe concept most vulnerable to cost escalation.

  16. Venture Evaluation and Review Technique (VERT). Users’/Analysts’ Manual

    DTIC Science & Technology

    1979-10-01

    real world. Additionally, activity processing times could be entered as a normal, uniform or triangular distribution. Activity times can also be...work or tasks, or if the unit activities are such abstractions of the real world that the estimation of the time, cost and performance parameters for...utilized in that constraining capacity. 7444 The network being processed has passed all the previous error checks. It currently has a real time

  17. Wrong Signs in Regression Coefficients

    NASA Technical Reports Server (NTRS)

    McGee, Holly

    1999-01-01

    When using parametric cost estimation, it is important to note the possibility of the regression coefficients having the wrong sign. A wrong sign is defined as a sign on the regression coefficient opposite to the researcher's intuition and experience. Some possible causes for the wrong sign discussed in this paper are a small range of x's, leverage points, missing variables, multicollinearity, and computational error. Additionally, techniques for determining the cause of the wrong sign are given.
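
    The multicollinearity cause is easy to reproduce: when two predictors are nearly identical, least squares pins down their sum well but can assign the individual coefficients wildly, sometimes with opposite signs. A small synthetic illustration (not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50

# x2 is nearly a copy of x1 (severe multicollinearity).
x1 = rng.normal(0.0, 1.0, n)
x2 = x1 + rng.normal(0.0, 0.01, n)
y = 2.0 * x1 + 2.0 * x2 + rng.normal(0.0, 1.0, n)   # both true effects are +2

X = np.column_stack([np.ones(n), x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
combined = coef[1] + coef[2]   # the sum is well determined (near 4) ...
# ... but the individual coefficients have huge standard errors and one may
# come out negative, i.e. with the "wrong" sign.
```

    Checking the correlation matrix of the predictors, as the paper suggests, immediately flags the problem here.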

  18. Three essays in energy economics

    NASA Astrophysics Data System (ADS)

    Kim, Dae-Wook

    Deregulation of electricity and natural gas markets, in an attempt to alleviate the market power of privately owned utilities, is widespread throughout the United States. Beginning with Gollop and Roberts (1979), a number of empirical studies have allowed the data to identify industry competition and marginal cost levels by estimating the firms' first-order condition within a conjectural variations framework. The first chapter of my dissertation uses direct measures of marginal cost for the California electricity market to measure the extent to which estimated mark-ups and marginal costs are biased. My results suggest that the New Empirical Industrial Organization technique poorly estimates the level of mark-ups and the sensitivity of marginal cost to cost shifters. The second chapter takes advantage of the fact that the market structure of electricity and natural gas varies across the United States. The goal of the chapter is to analyze whether residential households that receive a combined bill for electricity and natural gas face information costs in determining the portion of their monthly energy bill attributable to natural gas consumption and the portion attributable to electricity consumption. If households are unable to determine whether an increase in their energy bill is the result of an increase in the price of electricity or an increase in the price of natural gas, they act as if electricity and natural gas were complements. I find that own-price elasticities are smaller in absolute terms in combined-billed markets, while cross-price elasticities are more positive, compared with separate-billed markets; both of these results are consistent with the presence of information costs. In chapter 3, I provide empirical evidence of the impact of variations in ownership, regulation, and market structure on the electricity and natural gas markets in the United States. My results suggest that private firms in electricity markets are associated with higher prices than public firms. I further find that dual-product firms in the natural gas industry tend to charge less than single-product firms. Finally, my results suggest that merger activities in natural gas markets are associated with higher rates after controlling for cost and demand.

  19. Variational optical flow estimation based on stick tensor voting.

    PubMed

    Rashwan, Hatem A; Garcia, Miguel A; Puig, Domenec

    2013-07-01

    Variational optical flow techniques allow the estimation of flow fields from spatio-temporal derivatives. They are based on minimizing a functional that contains a data term and a regularization term. Recently, numerous approaches have been presented for improving the accuracy of the estimated flow fields. Among them, tensor voting has been shown to be particularly effective in the preservation of flow discontinuities. This paper presents an adaptation of the data term by using anisotropic stick tensor voting in order to gain robustness against noise and outliers with significantly lower computational cost than (full) tensor voting. In addition, an anisotropic complementary smoothness term depending on directional information estimated through stick tensor voting is utilized in order to preserve discontinuity capabilities of the estimated flow fields. Finally, a weighted non-local term that depends on both the estimated directional information and the occlusion state of pixels is integrated during the optimization process in order to denoise the final flow field. The proposed approach yields state-of-the-art results on the Middlebury benchmark.

  20. Costs of Alcohol-Involved Crashes, United States, 2010

    PubMed Central

    Zaloshnja, Eduard; Miller, Ted R.; Blincoe, Lawrence J.

    2013-01-01

    This paper estimates the total and unit costs of alcohol-involved crashes in the U.S. in 2010. With methods from earlier studies, we estimated costs per crash survivor by MAIS, body part, and fracture/dislocation involvement. We multiplied them by 2010 crash incidence estimates from NHTSA data sets, with adjustments for underreporting of crashes and their alcohol involvement. The unit costs are lifetime costs discounted at 3%. To develop medical costs, we combined 2008 Health Care Utilization Program national data for hospitalizations and ED visits of crash survivors with prior estimates of post-discharge costs. Productivity losses drew on Current Population Survey and American Time Use Survey data. Quality-of-life losses came from a 2011 AAAM paper and property damage from insurance data. We built a hybrid incidence file composed of 2008–2010 and 1984–86 NHTSA crash surveillance data, weighted with 2010 General Estimates System weights. Fatality data came from the 2010 FARS. An estimated 12% of 2010 crashes but only 0.9% of miles driven were alcohol-involved (BAC > .05). Alcohol-involved crashes cost an estimated $125 billion, 22.5% of the societal cost of all crashes. Alcohol-attributable crashes accounted for an estimated 22.5% of US auto liability insurance payments. Alcohol-involved crashes cost $0.86 per drink. Above the US BAC limit of .08, crash costs were $8.37 per mile driven; 1 in 788 trips resulted in a crash and 1 in 1,016 trips in an arrest. Unit costs for crash survivors by severity are higher for impaired driving than for other crashes, which suggests that national aggregate impaired-driving cost estimates in other countries are substantial underestimates if they are based on all-crash unit costs. PMID:24406941

  1. Estimating age-based antiretroviral therapy costs for HIV-infected children in resource-limited settings based on World Health Organization weight-based dosing recommendations

    PubMed Central

    2014-01-01

    Background Pediatric antiretroviral therapy (ART) has been shown to substantially reduce morbidity and mortality in HIV-infected infants and children. To accurately project program costs, analysts need accurate estimates of antiretroviral drug (ARV) costs for children. However, the costing of pediatric antiretroviral therapy is complicated by weight-based dosing recommendations, which change as children grow. Methods We developed a step-by-step methodology for estimating the cost of pediatric ARV regimens for children ages 0–13 years. The costing approach incorporates weight-based dosing recommendations to provide estimated ARV doses throughout childhood development. Published unit drug costs are then used to calculate average monthly drug costs. We compared our derived monthly ARV costs to published estimates to assess the accuracy of our methodology. Results Estimates of monthly ARV costs are provided for six commonly used first-line pediatric ARV regimens, considering three possible care scenarios. The costs derived in our analysis for children were fairly comparable to or slightly higher than available published ARV drug or regimen estimates. Conclusions The methodology described here can be used to provide an accurate estimate of pediatric ARV regimen costs for cost-effectiveness analysts projecting the optimum packages of care for HIV-infected children, as well as for program administrators and budget analysts who wish to assess the feasibility of increasing pediatric ART availability in constrained budget environments. PMID:24885453
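
    The weight-band lookup at the core of such a costing method can be sketched as follows. The bands, tablet counts, and unit price below are illustrative placeholders, not the WHO dosing tables or published prices used in the study.

```python
# Hypothetical weight bands (kg) mapped to daily tablet counts; real analyses
# use the WHO weight-band dosing tables for the specific regimen.
bands = [(3.0, 5.9, 1.0),     # (min kg, max kg, tablets per day)
         (6.0, 9.9, 1.5),
         (10.0, 13.9, 2.0),
         (14.0, 19.9, 2.5)]
PRICE_PER_TABLET = 0.08       # USD, illustrative

def monthly_arv_cost(weight_kg, days=30.4):
    """Monthly drug cost for a child: look up the weight band, then multiply
    daily tablet count by unit price and average days per month."""
    for lo, hi, tabs in bands:
        if lo <= weight_kg <= hi:
            return tabs * PRICE_PER_TABLET * days
    raise ValueError("weight outside pediatric banding table")

cost_8kg = monthly_arv_cost(8.0)     # falls in the 1.5 tablets/day band
cost_12kg = monthly_arv_cost(12.0)   # falls in the 2.0 tablets/day band
```

    Stepping a child's projected weight through the bands over time, and summing the monthly costs, gives the age-based regimen cost trajectory the paper describes.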

  2. A non-stationary cost-benefit analysis approach for extreme flood estimation to explore the nexus of 'Risk, Cost and Non-stationarity'

    NASA Astrophysics Data System (ADS)

    Qi, Wei

    2017-11-01

    Cost-benefit analysis is commonly used for engineering planning and design problems in practice. However, previous cost-benefit based design flood estimation has assumed stationarity. This study develops a non-stationary cost-benefit based design flood estimation approach. The approach integrates a non-stationary probability distribution function into cost-benefit analysis, so that the influence of non-stationarity on the expected total cost (flood damage plus construction cost) and on design flood estimation can be quantified. To facilitate design flood selection, a 'Risk-Cost' analysis approach is developed, which reveals the nexus of extreme flood risk, expected total cost, and design life period. Two basins, with 54 years and 104 years of flood data respectively, are used to illustrate the application. The developed approach effectively reveals changes in expected total cost and extreme floods over different design life periods. In addition, trade-offs are found between extreme flood risk and expected total cost, reflecting the increase in cost required to mitigate risk. Compared with stationary approaches, which generate only one expected total cost curve and therefore a single design flood estimate, the proposed approach generates design flood estimation intervals, and the 'Risk-Cost' approach selects a design flood value from those intervals based on the trade-off between extreme flood risk and expected total cost. This study provides a new approach towards a better understanding of the influence of non-stationarity on expected total cost and design floods, and could benefit cost-benefit based non-stationary design flood estimation across the world.
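
    The trade-off the paper explores can be illustrated with a toy expected-total-cost calculation: construction cost rises with the design flood level, expected damage falls with it, and letting the exceedance probability drift upward over the design life (a crude stand-in for non-stationarity) shifts the cost-minimizing design upward. All functional forms and numbers below are invented for illustration, not the paper's model.

```python
import math

def expected_total_cost(design_q, years, trend=0.0):
    """Expected total cost = construction cost + expected flood damage over
    the design life, with the annual exceedance probability of the design
    flood allowed to drift upward by a factor (1 + trend * t)."""
    construction = 2.0 * design_q                       # grows with capacity
    damage = 0.0
    for t in range(years):
        p = min(1.0, math.exp(-design_q / 50.0) * (1.0 + trend * t))
        damage += 500.0 * p                             # damage if exceeded, $M
    return construction + damage

levels = [q * 10.0 for q in range(5, 31)]               # candidate design floods
best_stationary = min(levels, key=lambda q: expected_total_cost(q, 50, 0.0))
best_nonstat = min(levels, key=lambda q: expected_total_cost(q, 50, 0.01))
```

    Under the drifting exceedance probability the cost-minimizing design flood moves to a higher level than under stationarity — the qualitative effect the paper quantifies with a fitted non-stationary distribution.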

  3. Integrated pest management and allocation of control efforts for vector-borne diseases

    USGS Publications Warehouse

    Ginsberg, H.S.

    2001-01-01

    Applications of various control methods were evaluated to determine how to integrate methods so as to minimize the number of human cases of vector-borne diseases. These diseases can be controlled by lowering the number of vector-human contacts (e.g., by pesticide applications or use of repellents), or by lowering the proportion of vectors infected with pathogens (e.g., by lowering or vaccinating reservoir host populations). Control methods should be combined in such a way as to most efficiently lower the probability of human encounter with an infected vector. Simulations using a simple probabilistic model of pathogen transmission suggest that the most efficient way to integrate different control methods is to combine methods that have the same effect (e.g., combine treatments that lower the vector population; or combine treatments that lower pathogen prevalence in vectors). Combining techniques that have different effects (e.g., a technique that lowers vector populations with a technique that lowers pathogen prevalence in vectors) will be less efficient than combining two techniques that both lower vector populations or combining two techniques that both lower pathogen prevalence, costs being the same. Costs of alternative control methods generally differ, so the efficiency of various combinations at lowering human contact with infected vectors should be estimated at available funding levels. Data should be collected from initial trials to improve the effects of subsequent interventions on the number of human cases.

  4. Final Report: Wireless Instrument for Automated Measurement of Clean Cookstove Usage and Black Carbon Emissions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lukac, Martin; Ramanathan, Nithya; Graham, Eric

    2013-09-10

    Black carbon (BC) emissions from traditional cooking fires and other sources are significant anthropogenic drivers of radiative forcing. Clean cookstoves present a more energy-efficient and cleaner-burning vehicle for cooking than traditional wood-burning stoves, yet many existing cookstoves reduce emissions by only modest amounts. Further research into cookstove use, fuel types, and verification of emissions is needed as adoption rates for such stoves remain low. Accelerated innovation requires techniques for measuring and verifying such cookstove performance. The overarching goal of the proposed program was to develop a low-cost, wireless instrument to provide a high-resolution profile of cookstove BC emissions and usage in the field. We proposed transferring the complexity of analysis away from the sampling hardware at the measurement site and to software at a centrally located server to easily analyze data from thousands of sampling instruments. We were able to build a low-cost field-based instrument that produces repeatable, low-cost estimates of cookstove usage, fuel estimates, and emission values with low variability. Emission values from our instrument were consistent with published ranges of emissions for similar stove and fuel types.

  5. Cost effective stream-gaging strategies for the Lower Colorado River basin; the Blythe field office operations

    USGS Publications Warehouse

    Moss, Marshall E.; Gilroy, Edward J.

    1980-01-01

    This report describes the theoretical developments and illustrates the applications of techniques that recently have been assembled to analyze the cost-effectiveness of federally funded stream-gaging activities in support of the Colorado River compact and subsequent adjudications. The cost effectiveness of 19 stream gages in terms of minimizing the sum of the variances of the errors of estimation of annual mean discharge is explored by means of a sequential-search optimization scheme. The search is conducted over a set of decision variables that describes the number of times that each gaging route is traveled in a year. A gage route is defined as the most expeditious circuit that is made from a field office to visit one or more stream gages and return to the office. The error variance is defined as a function of the frequency of visits to a gage by using optimal estimation theory. Currently a minimum of 12 visits per year is made to any gage. By changing to a six-visit minimum, the same total error variance can be attained for the 19 stations with a budget of 10% less than the current one. Other strategies are also explored. (USGS)
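
    The sequential-search idea — trading visit frequency against error variance under a travel budget — can be sketched with a toy model in which a gage's error variance falls as a_i/n_i with n_i visits per year. The gages, route costs, and variance coefficients are hypothetical; the report's actual error-variance model comes from optimal estimation theory.

```python
# Hypothetical gages: (name, variance coefficient a_i, cost per route visit $).
gages = [("A", 400.0, 120.0), ("B", 900.0, 90.0), ("C", 250.0, 150.0)]
MIN_VISITS = 6          # relaxed minimum, as in the six-visit strategy
BUDGET = 4000.0

visits = {name: MIN_VISITS for name, _, _ in gages}
spent = sum(MIN_VISITS * cost for _, _, cost in gages)

# Greedy sequential search: repeatedly buy the visit with the largest
# variance reduction per dollar until nothing more fits in the budget.
while True:
    best = None
    for name, a, cost in gages:
        n = visits[name]
        gain = (a / n - a / (n + 1)) / cost      # variance drop per dollar
        if spent + cost <= BUDGET and (best is None or gain > best[1]):
            best = (name, gain, cost)
    if best is None:
        break
    visits[best[0]] += 1
    spent += best[2]

total_var = sum(a / visits[name] for name, a, _ in gages)
```

    The search naturally concentrates visits on the gage whose variance responds most per dollar (here "B"), which is the mechanism behind getting the same total error variance from a smaller budget.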

  6. Particle swarm optimization algorithm based low cost magnetometer calibration

    NASA Astrophysics Data System (ADS)

    Ali, A. S.; Siddharth, S.; Syed, Z.; El-Sheimy, N.

    2011-12-01

    Inertial Navigation Systems (INS) consist of accelerometers, gyroscopes, and a microprocessor that provide inertial digital data from which position and orientation are obtained by integrating the specific forces and rotation rates. In addition to the accelerometers and gyroscopes, magnetometers can be used to derive the absolute user heading based on Earth's magnetic field. Unfortunately, the measurements of the magnetic field obtained with low cost sensors are corrupted by several errors, including manufacturing defects and external electro-magnetic fields. Consequently, proper calibration of the magnetometer is required to achieve high accuracy heading measurements. In this paper, a Particle Swarm Optimization (PSO) based calibration algorithm is presented to estimate the bias and scale factor values of a low cost magnetometer. The main advantage of this technique is the use of artificial intelligence, which does not need any error modeling or awareness of the nonlinearity. The estimated bias and scale factor errors from the proposed algorithm improve the heading accuracy, and the results are statistically significant. The technique can also help in the development of Pedestrian Navigation Devices (PNDs) when combined with INS and GPS/Wi-Fi, especially in indoor environments.
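
    A generic particle swarm optimizer applied to a single-axis bias/scale calibration illustrates the approach. This is a sketch with invented parameter values and a simplified one-axis error model, not the paper's algorithm or a full three-axis calibration.

```python
import random

def pso(cost, bounds, n_particles=30, iters=200, seed=0):
    """Minimal particle swarm optimizer: particles track personal bests and
    are pulled toward the swarm's global best (inertia w, weights c1, c2)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pcost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], c
                if c < gcost:
                    gbest, gcost = pos[i][:], c
    return gbest, gcost

# Synthetic single-axis magnetometer: reading = scale * field + bias.
true_scale, true_bias = 1.2, 35.0
field = [-50.0, -20.0, 0.0, 20.0, 50.0]        # known reference field values
reading = [true_scale * h + true_bias for h in field]
sse = lambda p: sum((r - (p[0] * h + p[1])) ** 2 for h, r in zip(field, reading))
(est_scale, est_bias), err = pso(sse, [(0.5, 2.0), (-100.0, 100.0)])
```

    No gradient or error model is supplied — the swarm only evaluates the cost function — which is the property the paper highlights for calibration without explicit error modeling.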

  7. 40 CFR 144.62 - Cost estimate for plugging and abandonment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Cost estimate for plugging and... Waste Injection Wells § 144.62 Cost estimate for plugging and abandonment. (a) The owner or operator must prepare a written estimate, in current dollars, of the cost of plugging the injection well in...

  8. Sensitivity analysis of pars-tensa Young's modulus estimation using inverse finite-element modeling

    NASA Astrophysics Data System (ADS)

    Rohani, S. Alireza; Elfarnawany, Mai; Agrawal, Sumit K.; Ladak, Hanif M.

    2018-05-01

    Accurate estimates of the pars-tensa (PT) Young's modulus (EPT) are required in finite-element (FE) modeling studies of the middle ear. Previously, we introduced an in-situ EPT estimation technique that optimizes a sample-specific FE model to match experimental eardrum pressurization data. This optimization process requires choosing modeling assumptions such as PT thickness and boundary conditions. These assumptions are reported with a wide range of variation in the literature, which affects the reliability of the models. In addition, the sensitivity of the estimated EPT to FE modeling assumptions has not been studied. Therefore, the objective of this study is to identify the most influential modeling assumption on EPT estimates. The middle-ear cavity extracted from a cadaveric temporal bone was pressurized to 500 Pa. The deformed shape of the eardrum after pressurization was measured using a Fourier transform profilometer (FTP). A baseline FE model of the unpressurized middle ear was created. The EPT was estimated using the golden-section optimization method, which minimizes a cost function comparing the deformed FE-model shape to the measured shape after pressurization. The effects of varying the modeling assumptions on EPT estimates were investigated, including changes in PT thickness, pars flaccida Young's modulus, and possible FTP measurement error. The most influential parameter on EPT estimation was PT thickness, and the least influential was pars flaccida Young's modulus. The results of this study provide insight into how different parameters affect the results of EPT optimization and which parameters' uncertainties require further investigation to develop robust estimation techniques.
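The golden-section optimizer named above is a simple 1-D bracketing search; a minimal version follows. The real cost function compares the deformed FE-model shape with the FTP-measured shape, so a quadratic stand-in with an invented "true" modulus of 2.3 (arbitrary units) plays that role here.

```python
import math

def golden_section_min(f, a, b, tol=1e-6):
    # shrink the bracket [a, b] by the golden ratio until it is narrower than tol
    invphi = (math.sqrt(5) - 1) / 2          # 1/phi ~ 0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c                      # minimum lies in [a, d]
            c = b - invphi * (b - a)
        else:
            a, c = c, d                      # minimum lies in [c, b]
            d = a + invphi * (b - a)
    return (a + b) / 2

# stand-in for the shape-mismatch cost as a function of the Young's modulus
cost = lambda E: (E - 2.3) ** 2
E_est = golden_section_min(cost, 0.1, 10.0)
```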

  9. Xenia Spacecraft Study Addendum: Spacecraft Cost Estimate

    NASA Technical Reports Server (NTRS)

    Hill, Spencer; Hopkins, Randall

    2009-01-01

    This slide presentation reviews the Xenia spacecraft cost estimates as an addendum for the Xenia Spacecraft study. The NASA/Air Force Cost Model (NAFCOM) was used to derive the cost estimates, which are expressed in 2009 dollars.

  10. Introduction of the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Costing Tool: a user-friendly spreadsheet program to estimate costs of providing patient-centered interventions.

    PubMed

    Reed, Shelby D; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L; Bowers, Margaret T; Samsa, Gregory P; Paul, Sara; Schulman, Kevin A; Whellan, David J; Riegel, Barbara J

    2012-01-01

    Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers and health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions.

  11. Introduction of the TEAM-HF Costing Tool: A User-Friendly Spreadsheet Program to Estimate Costs of Providing Patient-Centered Interventions

    PubMed Central

    Reed, Shelby D.; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L.; Bowers, Margaret T.; Samsa, Gregory P.; Paul, Sara; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara J.

    2011-01-01

    Background Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Methods and Results Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers or health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. Conclusions The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions. PMID:22147884

  12. Parametric cost estimation for space science missions

    NASA Astrophysics Data System (ADS)

    Lillie, Charles F.; Thompson, Bruce E.

    2008-07-01

    Cost estimation for space science missions is critically important in budgeting for successful missions. The process requires consideration of a number of parameters, where many of the values are only known to a limited accuracy. The results of cost estimation are not perfect, but must be calculated and compared with the estimates that the government uses for budgeting purposes. Uncertainties in the input parameters result from evolving requirements for missions that are typically the "first of a kind" with "state-of-the-art" instruments and new spacecraft and payload technologies that make it difficult to base estimates on the cost histories of previous missions. Even the cost of heritage avionics is uncertain due to parts obsolescence and the resulting redesign work. Through experience and use of industry best practices developed in participation with the Aerospace Industries Association (AIA), Northrop Grumman has developed a parametric modeling approach that can provide a reasonably accurate cost range and most probable cost for future space missions. During the initial mission phases, the approach uses mass- and power-based cost estimating relationships (CERs) developed with historical data from previous missions. In later mission phases, when the mission requirements are better defined, these estimates are updated with vendors' bids and "bottoms-up", "grass-roots" material and labor cost estimates based on detailed schedules and assigned tasks. In this paper we describe how we develop our CERs for parametric cost estimation and how they can be applied to estimate the costs of future space science missions like those presented to the Astronomy & Astrophysics Decadal Survey Study Committees.
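A sketch of a mass-based CER of the kind described: fit cost = a * mass^b in log space to historical (mass, cost) pairs by ordinary least squares, then evaluate it for a new mission. The "historical" missions and the resulting coefficients are invented for illustration, not Northrop Grumman's actual model.

```python
import math

# invented historical missions: (dry mass in kg, cost in $M)
history = [(250, 90.0), (500, 160.0), (1200, 340.0), (2000, 520.0)]

# least-squares fit of log(cost) = log(a) + b * log(mass)
xs = [math.log(m) for m, _ in history]
ys = [math.log(c) for _, c in history]
n = len(history)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
a = math.exp(ybar - b * xbar)

def cer(mass_kg):
    # power-law cost estimating relationship
    return a * mass_kg ** b

estimate = cer(800)   # most-probable cost for a hypothetical 800 kg spacecraft
```

A real parametric model would carry many more drivers (power, complexity, heritage) and report a cost range, not a point value.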

  13. Data Service Provider Cost Estimation Tool

    NASA Technical Reports Server (NTRS)

    Fontaine, Kathy; Hunolt, Greg; Booth, Arthur L.; Banks, Mel

    2011-01-01

    The Data Service Provider Cost Estimation Tool (CET) and Comparables Database (CDB) package provides to NASA's Earth Science Enterprise (ESE) the ability to estimate the full range of year-by-year lifecycle cost estimates for the implementation and operation of data service providers required by ESE to support its science and applications programs. The CET can make estimates dealing with staffing costs, supplies, facility costs, network services, hardware and maintenance, commercial off-the-shelf (COTS) software licenses, software development and sustaining engineering, and the changes in costs that result from changes in workload. Data service providers may be stand-alone or embedded in flight projects, field campaigns, research or applications projects, or other activities. The CET and CDB package employs a cost-estimation-by-analogy approach. It is based on a new, general data service provider reference model that provides a framework for construction of a database describing existing data service providers that are analogs (comparables) to planned, new ESE data service providers. The CET implements the staff effort and cost estimation algorithms that access the CDB and generates the lifecycle cost estimate for a new data service provider. These data give ESE proposal evaluators a common basis for considering projected data service provider costs.

  14. Challenges in measuring and valuing productivity costs, and their relevance in mood disorders

    PubMed Central

    Lensberg, Benedikte R; Drummond, Michael F; Danchenko, Natalya; Despiégel, Nicolas; François, Clément

    2013-01-01

    Lost productivity is often excluded from economic evaluations, which may lead to an underestimation of the societal benefits of treatment. However, there are multiple challenges in reliably estimating and reporting productivity losses. This article explores the main challenges, ie, selecting an appropriate valuation method (ie, human capital, friction cost, or multiplier), avoiding double counting, and accounting for equity. It also discusses the use of presenteeism instruments and their application in clinical trials, with a specific focus on their relevance in individuals with mood disorders. Further research and discussion is required on the development of reliable techniques for measuring and valuing productivity changes due to presenteeism. PMID:24273412

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennink, Ryan S.; Ferragut, Erik M.; Humble, Travis S.

    Modeling and simulation are essential for predicting and verifying the behavior of fabricated quantum circuits, but existing simulation methods are either impractically costly or require an unrealistic simplification of error processes. In this paper, we present a method of simulating noisy Clifford circuits that is both accurate and practical in experimentally relevant regimes. In particular, the cost is weakly exponential in the size and the degree of non-Cliffordness of the circuit. Our approach is based on the construction of exact representations of quantum channels as quasiprobability distributions over stabilizer operations, which are then sampled, simulated, and weighted to yield unbiased statistical estimates of circuit outputs and other observables. As a demonstration of these techniques, we simulate a Steane [[7,1,3]] code.
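The quasiprobability-sampling idea can be sketched on a toy scalar problem: a target quantity is written as sum_i q_i * v_i with possibly negative weights q_i, and sampling index i with probability |q_i|/M (M the one-norm of q) and averaging M * sign(q_i) * v_i gives an unbiased estimate whose variance grows with M. The weights and values below are invented, not a real channel decomposition over stabilizer operations.

```python
import random

random.seed(0)
q = [0.9, 0.4, -0.3]            # quasiprobability weights (sum to 1.0)
v = [1.0, -2.0, 0.5]            # invented "circuit outputs" per term
exact = sum(qi * vi for qi, vi in zip(q, v))

M = sum(abs(qi) for qi in q)    # one-norm; sampling cost grows with M**2
probs = [abs(qi) / M for qi in q]

def estimate(n_samples):
    # Monte Carlo average of the sign- and norm-corrected samples
    total = 0.0
    for _ in range(n_samples):
        r, i = random.random(), 0
        while r > probs[i]:
            r -= probs[i]
            i += 1
        sign = 1.0 if q[i] >= 0 else -1.0
        total += M * sign * v[i]
    return total / n_samples

est = estimate(200000)          # should be close to `exact`
```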

  16. Efforts to Support Consumer Enrollment Decisions Using Total Cost Estimators: Lessons from the Affordable Care Act’s Marketplaces.

    PubMed

    Giovannelli, Justin; Curran, Emily

    2017-02-01

    Issue: Policymakers have sought to improve the shopping experience on the Affordable Care Act’s marketplaces by offering decision support tools that help consumers better understand and compare their health plan options. Cost estimators are one such tool. They are designed to provide consumers a personalized estimate of the total cost (premium, minus subsidy, plus cost-sharing) of their coverage options. Cost estimators were available in most states by the start of the fourth open enrollment period. Goal: To understand the experiences of marketplaces that offer a total cost estimator and the interests and concerns of policymakers from states that are not using them. Methods: Structured interviews with marketplace officials, consumer enrollment assisters, technology vendors, and subject matter experts; analysis of the total cost estimators available on the marketplaces as of October 2016. Key findings and conclusions: Informants strongly supported marketplace adoption of a total cost estimator. Marketplaces that offer an estimator faced a range of design choices and varied significantly in their approaches to resolving them. Interviews suggested a clear need for additional consumer testing and data analysis of tool usage, and for sustained outreach to enrollment assisters to encourage greater use of the estimators.
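The total-cost arithmetic these tools personalize is premium minus subsidy plus expected cost-sharing; a sketch with invented plan figures:

```python
# Hypothetical annual figures for two plans; real estimators personalize the
# cost-sharing term from the consumer's expected utilization.
plans = {
    "bronze": {"premium": 4800.0, "subsidy": 3000.0, "cost_sharing": 2500.0},
    "silver": {"premium": 6000.0, "subsidy": 3000.0, "cost_sharing": 1400.0},
}

def total_cost(p):
    # total cost = premium - subsidy + expected cost-sharing
    return p["premium"] - p["subsidy"] + p["cost_sharing"]

cheapest = min(plans, key=lambda name: total_cost(plans[name]))
```

With these invented figures the bronze plan's lower premium outweighs its higher cost-sharing, which is exactly the comparison a premium-only display would get wrong.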

  17. A novel aliasing-free subband information fusion approach for wideband sparse spectral estimation

    NASA Astrophysics Data System (ADS)

    Luo, Ji-An; Zhang, Xiao-Ping; Wang, Zhi

    2017-12-01

    Wideband sparse spectral estimation is generally formulated as a multi-dictionary/multi-measurement (MD/MM) problem that can be solved using group sparsity techniques. In this paper, the MD/MM problem is reformulated as a single sparse indicative vector (SIV) recovery problem at the cost of introducing an additional system error; the number of unknowns is thereby reduced greatly. We show that the system error can be neglected under certain conditions. We then present a new subband information fusion (SIF) method to estimate the SIV by jointly utilizing all the frequency bins. With orthogonal matching pursuit (OMP) leveraging the binary property of the SIV's components, we develop a SIF-OMP algorithm to reconstruct the SIV. Numerical simulations demonstrate the performance of the proposed method.
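A compact plain-Python orthogonal matching pursuit, the greedy recovery step named above: repeatedly select the dictionary atom most correlated with the residual, re-fit by least squares on the current support, and update the residual. The small dictionary and 2-sparse signal are invented; the actual SIF-OMP operates on fused subband dictionaries and exploits the binary SIV structure.

```python
def solve(G, rhs):
    # Gauss-Jordan elimination for the small normal-equation system
    n = len(G)
    M = [row[:] + [rhs[i]] for i, row in enumerate(G)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[c][c] != 0:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def omp(A, y, k):
    m, n = len(A), len(A[0])
    support, x, resid = [], [0.0] * n, y[:]
    for _ in range(k):
        # atom most correlated with the current residual
        corr = [abs(sum(A[i][j] * resid[i] for i in range(m))) for j in range(n)]
        j = max((j for j in range(n) if j not in support), key=lambda j: corr[j])
        support.append(j)
        # least squares on the support via normal equations G coef = rhs
        G = [[sum(A[i][s1] * A[i][s2] for i in range(m)) for s2 in support]
             for s1 in support]
        rhs = [sum(A[i][s1] * y[i] for i in range(m)) for s1 in support]
        coef = solve(G, rhs)
        resid = [y[i] - sum(coef[t] * A[i][support[t]] for t in range(len(support)))
                 for i in range(m)]
    for t, j in enumerate(support):
        x[j] = coef[t]
    return x

# 4 measurements, 6 atoms, true signal active on atoms 1 and 3
A = [[1, 0, 0, 0, 0.6, 0],
     [0, 1, 0, 0, 0.8, 0],
     [0, 0, 1, 0, 0, 0.6],
     [0, 0, 0, 1, 0, 0.8]]
x_true = [0.0, 2.0, 0.0, 3.0, 0.0, 0.0]
y = [sum(A[i][j] * x_true[j] for j in range(6)) for i in range(4)]
x_hat = omp(A, y, k=2)
```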

  18. Calibration of a COTS Integration Cost Model Using Local Project Data

    NASA Technical Reports Server (NTRS)

    Boland, Dillard; Coon, Richard; Byers, Kathryn; Levitt, David

    1997-01-01

    The software measures and estimation techniques appropriate to a Commercial Off the Shelf (COTS) integration project differ from those commonly used for custom software development. Labor and schedule estimation tools that model COTS integration are available. Like all estimation tools, they must be calibrated with the organization's local project data. This paper describes the calibration of a commercial model using data collected by the Flight Dynamics Division (FDD) of the NASA Goddard Spaceflight Center (GSFC). The model calibrated is SLIM Release 4.0 from Quantitative Software Management (QSM). By adopting the SLIM reuse model and by treating configuration parameters as lines of code, we were able to establish a consistent calibration for COTS integration projects. The paper summarizes the metrics, the calibration process and results, and the validation of the calibration.

  19. Estimation of optimal educational cost per medical student.

    PubMed

    Yang, Eunbae B; Lee, Seunghee

    2009-09-01

    This study aims to estimate the optimal educational cost per medical student. A private medical college in Seoul was targeted by the study, and its 2006 learning environment and data from the 2003-2006 budgets and settlements were carefully analyzed. Through interviews with 3 medical professors and 2 experts in the economics of education, the study attempted to establish an educational cost estimation model, which yields an empirically computed estimate of the optimal cost per student in a medical college. The estimation model was based primarily upon the educational cost, which consisted of direct educational costs (47.25%), support costs (36.44%), fixed asset purchases (11.18%), and costs for student affairs (5.14%). These results indicate that the optimal cost per student is approximately 20,367,000 won each semester; thus, training a doctor costs 162,936,000 won over 4 years. Consequently, we inferred that the tuition levels of a local medical college or professional medical graduate school cover one-quarter or one-half of the per-student cost. The findings of this study do not necessarily imply an increase in medical college tuition; the estimation of the per-student cost of training a doctor is one matter, and the issue of who should bear this burden is another. For further study, we should consider the college type and its location for general application of the estimation method, in addition to living expenses and opportunity costs.
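The abstract's figures can be checked directly: eight semesters of the per-semester estimate reproduce the quoted 4-year total, and the four cost shares sum to almost exactly 100% (the 0.01 excess is rounding in the published percentages).

```python
# Reproducing the arithmetic reported in the abstract.
shares = {
    "direct": 47.25,           # direct educational costs (%)
    "support": 36.44,          # support costs (%)
    "fixed_asset": 11.18,      # fixed asset purchases (%)
    "student_affairs": 5.14,   # costs for student affairs (%)
}
per_semester = 20_367_000      # won per student per semester
total_4yr = per_semester * 8   # 4 years = 8 semesters
share_total = sum(shares.values())
```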

  20. Probabilistic Methodology for Estimation of Number and Economic Loss (Cost) of Future Landslides in the San Francisco Bay Region, California

    USGS Publications Warehouse

    Crovelli, Robert A.; Coe, Jeffrey A.

    2008-01-01

    The Probabilistic Landslide Assessment Cost Estimation System (PLACES) presented in this report estimates the number and economic loss (cost) of landslides during a specified future time in individual areas, and then calculates the sum of those estimates. The analytic probabilistic methodology is based upon conditional probability theory and laws of expectation and variance. The probabilistic methodology is expressed in the form of a Microsoft Excel computer spreadsheet program. Using historical records, the PLACES spreadsheet is used to estimate the number of future damaging landslides and total damage, as economic loss, from future landslides caused by rainstorms in 10 counties of the San Francisco Bay region in California. Estimates are made for any future 5-year period of time. The estimated total number of future damaging landslides for the entire 10-county region during any future 5-year period of time is about 330. Santa Cruz County has the highest estimated number of damaging landslides (about 90), whereas Napa, San Francisco, and Solano Counties have the lowest estimated number of damaging landslides (5-6 each). Estimated direct costs from future damaging landslides for the entire 10-county region for any future 5-year period are about US $76 million (year 2000 dollars). San Mateo County has the highest estimated costs ($16.62 million), and Solano County has the lowest estimated costs (about $0.90 million). Estimated direct costs are also subdivided into public and private costs.
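A toy version of the expectation arithmetic behind an estimate like this: model each county's damaging-landslide count over a future 5-year window as Poisson with its historical rate, so the expected count is rate times horizon and the expected cost is that count times an average cost per landslide. County rates and costs below are invented, not the report's inputs.

```python
counties = {
    # name: (historical damaging landslides per year, mean cost per landslide in $M)
    "A": (12.0, 0.20),
    "B": (4.0, 0.35),
    "C": (1.1, 0.15),
}
horizon = 5  # years in the future period

def county_estimate(rate, mean_cost):
    n = rate * horizon        # E[N] for a Poisson count over the horizon
    cost = n * mean_cost      # E[total cost] = E[N] * E[cost per event]
    var_n = n                 # Poisson property: Var[N] = E[N]
    return n, cost, var_n

total_n = sum(county_estimate(r, c)[0] for r, c in counties.values())
total_cost = sum(county_estimate(r, c)[1] for r, c in counties.values())
```

Summing the per-county expectations, as PLACES does, is valid regardless of dependence between counties; combining the variances would additionally require an independence assumption.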
