Science.gov

Sample records for accurate cost estimates

  1. Preparing Rapid, Accurate Construction Cost Estimates with a Personal Computer.

    ERIC Educational Resources Information Center

    Gerstel, Sanford M.

    1986-01-01

    An inexpensive and rapid method for preparing accurate cost estimates of construction projects in a university setting, using a personal computer, purchased software, and one estimator, is described. The case against defined estimates, the rapid estimating system, and adjusting standard unit costs are discussed. (MLW)

  2. How utilities can achieve more accurate decommissioning cost estimates

    SciTech Connect

    Knight, R.

    1999-07-01

    The number of commercial nuclear power plants that are undergoing decommissioning, coupled with the economic pressure of deregulation, has increased the focus on adequate funding for decommissioning. The introduction of spent-fuel storage and disposal of low-level radioactive waste into the cost analysis raises even greater concern about the accuracy of the fund calculation basis. The size and adequacy of the decommissioning fund have also played a major part in the negotiations for transfer of plant ownership. For all of these reasons, it is important that the operating plant owner reduce the margin of error in the preparation of decommissioning cost estimates. To date, all of these estimates have been prepared via the building block method. That is, numerous individual calculations defining the planning, engineering, removal, and disposal of plant systems and structures are performed. These activity costs are supplemented by the period-dependent costs reflecting the administration, control, licensing, and permitting of the program. This method will continue to be used in the foreseeable future until adequate performance data are available. The accuracy of the activity cost calculation is directly related to the accuracy of the inventory of plant system components, piping and equipment, and plant structural composition. Typically, it is left up to the cost-estimating contractor to develop this plant inventory. The data are generated by searching and analyzing property asset records, plant databases, piping and instrumentation drawings, piping system isometric drawings, and component assembly drawings. However, experience has shown that these sources may not be up to date, discrepancies may exist, there may be missing data, and the level of detail may not be sufficient. Again, typically, the time constraints associated with the development of the cost estimate preclude perfect resolution of the inventory questions. Another problem area in achieving accurate cost…

  3. How accurately can we estimate energetic costs in a marine top predator, the king penguin?

    PubMed

    Halsey, Lewis G; Fahlman, Andreas; Handrich, Yves; Schmidt, Alexander; Woakes, Anthony J; Butler, Patrick J

    2007-01-01

    King penguins (Aptenodytes patagonicus) are one of the greatest consumers of marine resources. However, while their influence on the marine ecosystem is likely to be significant, only an accurate knowledge of their energy demands will indicate their true food requirements. Energy consumption has been estimated for many marine species using the heart rate-rate of oxygen consumption (fH-VO2) technique, and the technique has been applied successfully to answer eco-physiological questions. However, previous studies on the energetics of king penguins, based on developing or applying this technique, have raised a number of issues about the degree of validity of the technique for this species. These include the predictive validity of the present fH-VO2 equations across different seasons and individuals and during different modes of locomotion. In many cases, these issues also apply to other species for which the fH-VO2 technique has been applied. In the present study, the accuracy of three prediction equations for king penguins was investigated based on validity studies and on estimates of VO2 from published, field fH data. The major conclusions from the present study are: (1) in contrast to that for walking, the fH-VO2 relationship for swimming king penguins is not affected by body mass; (2) prediction equation (1), log(VO2) = -0.279 + 1.24 log(fH) + 0.0237t - 0.0157 log(fH)t, derived in a previous study, is the most suitable equation presently available for estimating VO2 in king penguins for all locomotory and nutritional states. A number of possible problems associated with producing an fH-VO2 relationship are discussed in the present study. Finally, a statistical method to include easy-to-measure morphometric characteristics, which may improve the accuracy of fH-VO2 prediction equations, is explained.
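
    For illustration, a minimal sketch of applying prediction equation (1) above; base-10 logarithms are assumed, the abstract does not define t or the units of any variable, and the function name and sample values are purely illustrative:

        import math

        def predict_vo2(f_h, t):
            # Prediction equation (1), transcribed from the abstract:
            #   log(VO2) = -0.279 + 1.24*log(fH) + 0.0237*t - 0.0157*log(fH)*t
            # Base-10 logarithms are assumed; the units of fH, t, and VO2 are
            # as defined in the study (the abstract does not specify them).
            log_vo2 = (-0.279 + 1.24 * math.log10(f_h)
                       + 0.0237 * t - 0.0157 * math.log10(f_h) * t)
            return 10 ** log_vo2

        print(predict_vo2(120.0, 5.0))  # hypothetical fH and t values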

  4. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1983-01-01

    Alternatives to sampling-theory stratified and regression estimators of crop production and timber biomass were examined. An alternative estimator which is viewed as especially promising is the errors-in-variables regression estimator. Investigations established the need for caution with this estimator when the ratio of the two error variances is not precisely known.
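
    For context, a minimal sketch of the errors-in-variables (Deming) regression slope, which depends explicitly on the assumed ratio of the two error variances; the data and the variance-ratio value are hypothetical, and this illustrates the general estimator, not the report's specific method:

        import numpy as np

        def deming_slope(x, y, delta=1.0):
            # Errors-in-variables (Deming) slope estimate.
            # delta = var(errors in y) / var(errors in x); a misspecified
            # delta biases the slope, hence the caution noted above.
            sxx = np.var(x, ddof=1)
            syy = np.var(y, ddof=1)
            sxy = np.cov(x, y, ddof=1)[0, 1]
            d = syy - delta * sxx
            return (d + np.sqrt(d * d + 4.0 * delta * sxy * sxy)) / (2.0 * sxy)

        ground = np.array([10.2, 13.5, 9.8, 15.1, 12.4])  # accurate subsample (hypothetical)
        remote = np.array([9.0, 14.2, 10.5, 16.0, 11.8])  # remotely sensed (hypothetical)
        print(deming_slope(remote, ground, delta=1.0))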

  5. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1985-01-01

    Research activities conducted under the auspices of National Aeronautics and Space Administration Cooperative Agreement NCC 9-9 are discussed. During this contract period, research efforts were concentrated in two primary areas. The first area is an investigation of the use of measurement error models as alternatives to least squares regression estimators of crop production or timber biomass. The second primary area of investigation is the estimation of the mixing proportion of two-component mixture models. This report lists publications, technical reports, submitted manuscripts, and oral presentations generated by these research efforts. Possible areas of future research are mentioned.

  6. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE ...

    EPA Pesticide Factsheets

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33% to 63%, with a mean of about 50%. Treatment of two of the soils with P significantly reduced the bioavailability of Pb. The bioaccessibility of the Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. They were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test”, and the “Waterfowl Physiologically Based Extraction Test”. All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%)…

  7. Price and cost estimation

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.

    1979-01-01

    The Price and Cost Estimating Program (PACE II) was developed to prepare man-hour and material cost estimates. This versatile and flexible tool significantly reduces computation time and errors, as well as the typing and reproduction time involved in preparing cost estimates.

  8. Estimating Airline Operating Costs

    NASA Technical Reports Server (NTRS)

    Maddalon, D. V.

    1978-01-01

    The factors affecting commercial aircraft operating and delay costs were used to develop an airline operating cost model which includes a method for estimating the labor and material costs of individual airframe maintenance systems. The model permits estimates of aircraft-related costs, e.g., aircraft service, landing fees, flight attendants, and control fees. A method for estimating the costs of certain types of airline delay is also described.

  9. Estimating airline operating costs

    NASA Technical Reports Server (NTRS)

    Maddalon, D. V.

    1978-01-01

    A review was made of the factors affecting commercial aircraft operating and delay costs. From this work, an airline operating cost model was developed which includes a method for estimating the labor and material costs of individual airframe maintenance systems. The model, similar in some respects to the standard Air Transport Association of America (ATA) Direct Operating Cost Model, permits estimates of aircraft-related costs not now included in the standard ATA model (e.g., aircraft service, landing fees, flight attendants, and control fees). A study of the cost of aircraft delay was also made and a method for estimating the cost of certain types of airline delay is described.

  10. Spacecraft platform cost estimating relationships

    NASA Technical Reports Server (NTRS)

    Gruhl, W. M.

    1972-01-01

    The three main cost areas of unmanned satellite development are discussed. The areas are identified as: (1) the spacecraft platform (SCP), (2) the payload or experiments, and (3) the postlaunch ground equipment and operations. The SCP normally accounts for over half of the total project cost, and accurate estimates of SCP costs are required early in project planning as a basis for determining total project budget requirements. The development of single-formula SCP cost estimating relationships (CERs) from readily available data by statistical linear regression analysis is described. The advantages of single-formula CERs are presented.
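
    A minimal sketch of deriving a single-formula CER by log-linear least squares, in the spirit described above; the driver variable, data, and resulting coefficients are hypothetical, not values from the report:

        import numpy as np

        # Hypothetical historical data: platform dry weight (kg) vs. development cost ($M).
        weight = np.array([800.0, 1200.0, 2000.0, 3500.0, 5000.0])
        cost = np.array([45.0, 62.0, 90.0, 140.0, 185.0])

        # Fit cost = a * weight**b by regressing log(cost) on log(weight).
        b, log_a = np.polyfit(np.log(weight), np.log(cost), 1)
        a = np.exp(log_a)
        print(f"CER: cost ($M) = {a:.3f} * weight^{b:.3f}")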

  11. The Psychology of Cost Estimating

    NASA Technical Reports Server (NTRS)

    Price, Andy

    2016-01-01

    Cost estimation for large (and even not so large) government programs is a challenge. The number and magnitude of cost overruns associated with large Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) programs highlight the difficulties in developing and promulgating accurate cost estimates. These overruns can be the result of inadequate technology readiness or requirements definition, the whims of politicians or government bureaucrats, or even failures of the cost estimating profession itself. However, there may be another reason for cost overruns that is right in front of us, but only recently have we begun to grasp it: the fact that cost estimators and their customers are human. The last 70+ years of research into human psychology and behavioral economics have yielded amazing findings into how we humans process and use information to make judgments and decisions. What these scientists have uncovered is surprising: humans are often irrational and illogical beings, making decisions based on factors such as emotion and perception rather than facts and data. These built-in biases in our thinking directly affect how we develop our cost estimates and how those cost estimates are used. We cost estimators can use this knowledge of biases to improve our cost estimates and also to improve how we communicate and work with our customers. By understanding how our customers think, and more importantly, why they think the way they do, we can have more productive relationships and greater influence. By using psychology to our advantage, we can more effectively help the decision maker and our organizations make fact-based decisions.

  12. A better approach to cost estimation.

    PubMed

    Richmond, Russ

    2013-03-01

    Using ratios of costs to charges (RCCs) to estimate costs can cause hospitals to significantly over- or under-invest in service lines. A focus on improving cost estimation in cost centers where physicians have significant control over operating expenses, such as drugs or implants, can strengthen decision making and strategic planning. Connecting patient file information to purchasing data can lead to more accurate reflections of actual costs and help hospitals gain better visibility across service lines.
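
    For illustration, a minimal sketch of the RCC method the article critiques, with hypothetical department-level figures; the distortion arises whenever a patient's actual resource use deviates from the department-average ratio:

        # Ratio-of-costs-to-charges (RCC): estimated cost = billed charge * RCC.
        dept_costs = {"pharmacy": 2.0e6, "implants": 3.5e6}    # annual costs ($, hypothetical)
        dept_charges = {"pharmacy": 8.0e6, "implants": 5.0e6}  # annual charges ($, hypothetical)
        rcc = {d: dept_costs[d] / dept_charges[d] for d in dept_costs}

        patient_charges = {"pharmacy": 1200.0, "implants": 15000.0}
        estimate = sum(patient_charges[d] * rcc[d] for d in patient_charges)
        print(f"RCC cost estimate: ${estimate:,.2f}")
        # If this patient's implant actually cost far more (or less) than the
        # department average, the RCC estimate misstates the true cost -- the
        # mismatch the article proposes to fix with purchasing-level data.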

  13. A model for the cost of doing a cost estimate

    NASA Technical Reports Server (NTRS)

    Remer, D. S.; Buchanan, H. R.

    1992-01-01

    A model for estimating the cost required to do a cost estimate for Deep Space Network (DSN) projects that range from $0.1 to $100 million is presented. The cost of the cost estimate in thousands of dollars, C_E, is found to be approximately given by C_E = K(C_p)^0.35, where C_p is the cost of the project being estimated in millions of dollars and K is a constant depending on the accuracy of the estimate. For an order-of-magnitude estimate, K = 24; for a budget estimate, K = 60; and for a definitive estimate, K = 115. That is, for a specific project, the cost of doing a budget estimate is about 2.5 times as much as that for an order-of-magnitude estimate, and a definitive estimate costs about twice as much as a budget estimate. Use of this model should help provide the level of resources required for doing cost estimates and, as a result, provide insights towards more accurate estimates with less potential for cost overruns.
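
    As a worked illustration of the published relationship (the constants come from the abstract; the function name and sample project cost are ours):

        def cost_of_estimate(project_cost_millions, estimate_class):
            # C_E = K * C_p**0.35: cost of preparing the estimate, in $K,
            # for a project costing C_p $M, with K set by the accuracy class.
            K = {"order-of-magnitude": 24, "budget": 60, "definitive": 115}[estimate_class]
            return K * project_cost_millions ** 0.35

        # For a $10M project, a budget estimate costs ~2.5x an order-of-magnitude
        # estimate, and a definitive estimate about twice a budget estimate.
        for cls in ("order-of-magnitude", "budget", "definitive"):
            print(f"{cls}: ${cost_of_estimate(10.0, cls):.0f}K")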

  14. Accurate pose estimation for forensic identification

    NASA Astrophysics Data System (ADS)

    Merckx, Gert; Hermans, Jeroen; Vandermeulen, Dirk

    2010-04-01

    In forensic authentication, one aims to identify the perpetrator among a series of suspects or distractors. A fundamental problem in any recognition system that aims for identification of subjects in a natural scene is the lack of constraints on viewing and imaging conditions. In forensic applications, identification proves even more challenging, since most surveillance footage is of abysmal quality. In this context, robust methods for pose estimation are paramount. In this paper we will therefore present a new pose estimation strategy for very low quality footage. Our approach uses 3D-2D registration of a textured 3D face model with the surveillance image to obtain accurate far-field pose alignment. Starting from an inaccurate initial estimate, the technique uses novel similarity measures based on the monogenic signal to guide a pose optimization process. We will illustrate the descriptive strength of the introduced similarity measures by using them directly as a recognition metric. Through validation, using both real and synthetic surveillance footage, our pose estimation method is shown to be accurate, and robust to lighting changes and image degradation.

  15. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Astrophysics Data System (ADS)

    Wheeler, K.; Knuth, K.; Castle, P.

    2005-12-01

    …and IKONOS imagery and the 3-D volume estimates. The combination of these then allows for a rapid and, hopefully, very accurate estimation of biomass.

  16. You Can Accurately Predict Land Acquisition Costs.

    ERIC Educational Resources Information Center

    Garrigan, Richard

    1967-01-01

    Land acquisition costs were tested for predictability based upon the 1962 assessed valuations of privately held land acquired for campus expansion by the University of Wisconsin from 1963-1965. By correlating the land acquisition costs of 108 properties acquired during the 3-year period with--(1) the assessed value of the land, (2) the assessed…

  17. Equipment Cost Estimator

    SciTech Connect

    2016-08-24

    The ECE application forecasts annual costs of preventive and corrective maintenance for budgeting purposes. Features within the application enable users to change the specifications of the model to customize the forecast to best fit their needs and support “what if” analysis. Based on the user's selections, the ECE model forecasts annual maintenance costs. Preventive maintenance costs include the cost of labor to perform preventive maintenance activities at the specified frequency and labor rate. Corrective maintenance costs include the cost of labor and the cost of replacement parts. The application presents forecasted maintenance costs for the next five years in two tables: costs by year and costs by site.
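
    A minimal sketch of the kind of forecast described above; the equipment list, frequencies, labor rate, and parts costs are hypothetical placeholders, not data or structure from the ECE application itself:

        LABOR_RATE = 85.0  # $/hour (hypothetical)

        equipment = [
            # (name, PM events/yr, PM hrs/event, CM events/yr, CM hrs/event, parts $/event)
            ("pump", 4, 2.0, 0.5, 6.0, 1200.0),
            ("chiller", 2, 8.0, 0.3, 16.0, 5000.0),
        ]

        for name, pm_n, pm_h, cm_n, cm_h, parts in equipment:
            pm_cost = pm_n * pm_h * LABOR_RATE            # preventive: labor only
            cm_cost = cm_n * (cm_h * LABOR_RATE + parts)  # corrective: labor + parts
            print(f"{name}: PM ${pm_cost:,.0f}/yr, CM ${cm_cost:,.0f}/yr")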

  18. Estimating the cost of extremely large telescopes

    NASA Astrophysics Data System (ADS)

    Stepp, Larry M.; Daggert, Larry G.; Gillett, Paul E.

    2003-01-01

    For future giant telescopes, control of construction and operation costs will be the key factor in their success. The best way to accomplish this cost control, while maximizing the performance of the telescope, will be through design-to-cost methods that use value engineering techniques to develop the most cost-effective design in terms of performance per dollar. This will require quantifiable measures of performance and cost, including: (1) a way of quantifying science value with scientific merit functions; (2) a way of predicting telescope performance in the presence of real-world disturbances by means of integrated modeling; and (3) a way of predicting the cost of multiple design configurations. Design-to-cost methods should be applied as early as possible in the project, since the majority of the life-cycle costs for the observatory will be locked in by choices made during the conceptual design phase. However, there is a dilemma: how can costs be accurately estimated for systems that have not yet been designed? This paper discusses cost estimating methods and describes their application to estimating the cost of ELTs, showing that the best method to use during the conceptual design phase is parametric cost estimating. Examples of parametric estimating techniques are described, based on experience gained from instrument development programs at NOAO. We then describe efforts underway to collect historical cost information and develop cost estimating relationships in preparation for the conceptual design phase of the Giant Segmented Mirror Telescope.

  19. Manned Mars mission cost estimate

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph; Smith, Keith

    1986-01-01

    The potential costs of several options of a manned Mars mission are examined. A cost estimating methodology based primarily on existing Marshall Space Flight Center (MSFC) parametric cost models is summarized. These models include the MSFC Space Station Cost Model and the MSFC Launch Vehicle Cost Model as well as other models and techniques. The ground rules and assumptions of the cost estimating methodology are discussed and cost estimates are presented for six potential mission options which were studied. The estimated manned Mars mission costs are compared to the cost of the somewhat analogous Apollo Program after normalizing the Apollo cost to the environment and ground rules of the manned Mars missions. It is concluded that a manned Mars mission, as currently defined, could be accomplished for under $30 billion in 1985 dollars, excluding launch vehicle development and mission operations.

  20. Cost-estimating relationships for space programs

    NASA Technical Reports Server (NTRS)

    Mandell, Humboldt C., Jr.

    1992-01-01

    Cost-estimating relationships (CERs) are defined and discussed as they relate to the estimation of theoretical costs for space programs. The paper primarily addresses CERs based on analogous relationships between physical and performance parameters to estimate future costs. Analytical estimation principles are reviewed, examining the sources of errors in cost models, and the use of CERs is shown to be affected by organizational culture. Two paradigms for cost estimation are set forth: (1) the Rand paradigm for single-culture single-system methods; and (2) the Price paradigms that incorporate a set of cultural variables. For space programs that are potentially subject to even small cultural changes, the Price paradigms are argued to be more effective. The derivation and use of accurate CERs is important for developing effective cost models to analyze the potential of a given space program.

  21. The high cost of accurate knowledge.

    PubMed

    Sutcliffe, Kathleen M; Weber, Klaus

    2003-05-01

    Many business thinkers believe it's the role of senior managers to scan the external environment to monitor contingencies and constraints, and to use that precise knowledge to modify the company's strategy and design. As these thinkers see it, managers need accurate and abundant information to carry out that role. According to that logic, it makes sense to invest heavily in systems for collecting and organizing competitive information. Another school of pundits contends that, since today's complex information often isn't precise anyway, it's not worth going overboard with such investments. In other words, it's not the accuracy and abundance of information that should matter most to top executives--rather, it's how that information is interpreted. After all, the role of senior managers isn't just to make decisions; it's to set direction and motivate others in the face of ambiguities and conflicting demands. Top executives must interpret information and communicate those interpretations--they must manage meaning more than they must manage information. So which of these competing views is the right one? Research conducted by academics Sutcliffe and Weber found that how accurate senior executives are about their competitive environments is indeed less important for strategy and corresponding organizational changes than the way in which they interpret information about their environments. Investments in shaping those interpretations, therefore, may create a more durable competitive advantage than investments in obtaining and organizing more information. And what kinds of interpretations are most closely linked with high performance? Their research suggests that high performers respond positively to opportunities, yet they aren't overconfident in their abilities to take advantage of those opportunities.

  22. ARSENIC REMOVAL COST ESTIMATING PROGRAM

    EPA Science Inventory

    The Arsenic Removal Cost Estimating program (Excel) calculates the costs for using adsorptive media and anion exchange treatment systems to remove arsenic from drinking water. The program is an easy-to-use tool to estimate capital and operating costs for three types of arsenic re...

  23. ADP (Automated Data Processing) cost estimating heuristics

    SciTech Connect

    Sadlowe, A.R.; Arrowood, L.F.; Jones, K.A.; Emrich, M.L.; Watson, B.D.

    1987-09-11

    Artificial Intelligence, in particular expert systems methodologies, is being applied to the US Navy's Automated Data Processing estimating tasks. Skilled Navy project leaders are nearing retirement; replacements may not yet possess the many years of experience required to make accurate decisions regarding time, cost, equipment, and personnel needs. The potential departure of expertise resulted in the development of a system to capture organizational expertise. The prototype allows inexperienced project leaders to interactively generate cost estimates. 5 refs.

  24. Estimating the Cost of Doing a Cost Estimate

    NASA Technical Reports Server (NTRS)

    Remer, D. S.; Buchanan, H. R.

    1996-01-01

    This article provides a model for estimating the cost required to do a cost estimate... Our earlier work provided data for high technology projects. This article adds data from the construction industry, which validates the model over a wider range of technology.

  25. Cost Estimating Handbook for Environmental Restoration

    SciTech Connect

    1990-09-01

    Environmental restoration (ER) projects have presented the DOE and cost estimators with a number of properties that are not comparable to the normal estimating climate within DOE. These properties include: an entirely new set of specialized expressions and terminology; a higher than normal exposure to cost and schedule risk, as compared to most other DOE projects, due to changing regulations, public involvement, resource shortages, and scope of work; a higher than normal percentage of indirect costs to the total estimated cost due primarily to record keeping, special training, liability, and indemnification; and more than one estimate for a project, particularly in the assessment phase, in order to provide input into the evaluation of alternatives for the cleanup action. While some aspects of existing guidance for cost estimators will be applicable to environmental restoration projects, some components of the present guidelines will have to be modified to reflect the unique elements of these projects. The purpose of this Handbook is to assist cost estimators in the preparation of environmental restoration estimates for Environmental Restoration and Waste Management (EM) projects undertaken by DOE. The DOE has, in recent years, seen a significant increase in the number, size, and frequency of environmental restoration projects that must be costed by the various DOE offices. The coming years will show the EM program to be the largest non-weapons program undertaken by DOE. These projects create new and unique estimating requirements since historical cost and estimating precedents are meager at best. It is anticipated that this Handbook will enhance the quality of cost data within DOE in several ways by providing: the basis for accurate, consistent, and traceable baselines; sound methodologies, guidelines, and estimating formats; and sources of cost data/databases and estimating tools and techniques available to DOE cost professionals.

  26. Airframe RDT&E Cost Estimating: A Justification for and Development of Unique Cost Estimating Relationships According to Aircraft Type.

    DTIC Science & Technology

    1982-09-01

    Airframe RDT&E costs are invariably predicted by utilizing one general cost estimating relationship (CER) regardless of… and in funding decisions demanding reliable, internally consistent estimates of absolute cost (10:1). Cost estimating capability is only as accurate as… pertaining to the RDT&E phase appear to be limited in their ability to accurately predict weapon system development costs. This thesis focuses on a…

  27. Solar power satellite cost estimate

    NASA Technical Reports Server (NTRS)

    Harron, R. J.; Wadle, R. C.

    1981-01-01

    The solar power configuration costed is the 5 GW silicon solar cell reference system. The subsystems are identified by work breakdown structure elements to the lowest level for which cost information was generated. This breakdown divides into five sections: the satellite, construction, transportation, the ground receiving station, and maintenance. For each work breakdown structure element, a definition, design description, and cost estimate were included. An effort was made to include for each element a reference that more thoroughly describes the element and the method of costing used. All costs are in 1977 dollars.

  28. Estimating 'costs' for cost-effectiveness analysis.

    PubMed

    Miners, Alec

    2008-01-01

    Since 1999, the National Institute for Health and Clinical Excellence (NICE) Technology Appraisal Programme has been charged with producing guidance for the NHS in England and Wales on the appropriate use of new and existing healthcare programmes. Guidance is based on an assessment of a number of factors, including cost effectiveness. The identification, measurement and valuation of costs are important components of any cost-effectiveness analysis. However, working through these steps raises a number of important methodological questions. For example, how should 'future' resource use be estimated, and is there a need to consider all 'future' costs? Given that NICE produces national guidance, should national unit cost data be used to value resources or should local variations in negotiated prices be taken into account? This paper was initially prepared as a briefing paper as part of the process of updating NICE's 2004 Guide to the Methods of Technology Appraisal for a workshop on 'costs'. It outlines the issues that were raised in the original briefing paper and the subsequent questions that were discussed at the workshop.

  29. Acquisition Cost/Price Estimating

    DTIC Science & Technology

    1981-01-01

    …review and validation, (3) research and methodology, and (4) data analysis. These functional thrusts are in turn focused to estimating and analysis… relationships (CERs) is largely a function of the quantity and quality of data that is available at the time of formulation. In order to ensure that such cost… have been discussed previously as good sources of data for cost estimating. Their primary function, however, is to provide the Army with early…

  30. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay

    2005-01-01

    The following concepts were introduced: (a) Bayesian adaptive sampling for solving biomass estimation; (b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; (c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and (d) a unique U.S. asset for science product validation and verification.

  31. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1994-01-01

    NASA is responsible for developing much of the nation's future space technology. Cost estimates for new programs are required early in the planning process so that decisions can be made accurately. Because of the long lead times required to develop space hardware, the cost estimates are frequently required 10 to 15 years before the program delivers hardware. The system design in conceptual phases of a program is usually only vaguely defined, and the technology used is often state-of-the-art or beyond. These factors combine to make cost estimating for conceptual programs very challenging. This paper describes an effort to develop parametric cost estimating methods for space systems in the conceptual design phase. The approach is to identify variables that drive cost, such as weight, quantity, development culture, design inheritance, and time. The nature of the relationships between the driver variables and cost will be discussed. In particular, the relationship between weight and cost will be examined in detail. A theoretical model of cost will be developed and tested statistically against a historical database of major research and development projects.

  32. Software Development Cost Estimating Handbook

    DTIC Science & Technology

    2009-04-21

    [Briefing outline fragments: researching, blueprinting, technical writing, internal reviewing/editing (Hill AFB, UT; Naval Center for Cost Analysis (NCCA), Arlington, VA); topics include software development processes, software estimating models, the Defense Acquisition Framework, data collection, acronyms, terminology, and references; designed for readability and comprehension, with a large right margin for notes. Systems & Software Technology Conference, 21 April 2009, Part I - Basics.]

  33. Cost Estimation and Control for Flight Systems

    NASA Technical Reports Server (NTRS)

    Hammond, Walter E.; Vanhook, Michael E. (Technical Monitor)

    2002-01-01

    Good program management practices, cost analysis, cost estimation, and cost control for aerospace flight systems are interrelated and depend upon each other. The best cost control process cannot overcome poor design or poor systems trades that lead to the wrong approach. The project needs robust Technical, Schedule, Cost, Risk, and Cost Risk practices before it can incorporate adequate Cost Control. Cost analysis both precedes and follows cost estimation -- the two are closely coupled with each other and with Risk analysis. Parametric cost estimating relationships and computerized models are most often used. NASA has learned some valuable lessons in controlling cost problems, and recommends use of a summary Project Manager's checklist as shown here.

  34. Quantifying Accurate Calorie Estimation Using the "Think Aloud" Method

    ERIC Educational Resources Information Center

    Holmstrup, Michael E.; Stearns-Bruening, Kay; Rozelle, Jeffrey

    2013-01-01

    Objective: Clients often have limited time in a nutrition education setting. An improved understanding of the strategies used to accurately estimate calories may help to identify areas of focused instruction to improve nutrition knowledge. Methods: A "Think Aloud" exercise was recorded during the estimation of calories in a standard dinner meal…

  35. Supplemental report on cost estimates

    SciTech Connect

    1992-04-29

    The Office of Management and Budget (OMB) and the U.S. Army Corps of Engineers have completed an analysis of the Department of Energy's (DOE) Fiscal Year (FY) 1993 budget request for its Environmental Restoration and Waste Management (ERWM) program. The results were presented to an interagency review group (IAG) of senior Administration officials for their consideration in the budget process. This analysis included evaluations of the underlying legal requirements and cost estimates on which the ERWM budget request was based. The major conclusions are contained in a separate report entitled "Interagency Review of the Department of Energy Environmental Restoration and Waste Management Program." This Corps supplemental report provides greater detail on the cost analysis.

  36. Estimating decommissioning costs: The 1994 YNPS decommissioning cost study

    SciTech Connect

    Szymczak, W.J.

    1994-12-31

    Early this year, Yankee Atomic Electric Company began developing a revised decommissioning cost estimate for the Yankee Nuclear Power Station (YNPS) to provide a basis for detailed decommissioning planning and to reflect slow progress in siting low-level waste (LLW) and spent-nuclear-fuel disposal facilities. The revision also reflects the need to change from a cost estimate that focuses on overall costs to a cost estimate that is sufficiently detailed to implement decommissioning and identify the final cost of decommissioning.

  37. Estimating the Costs of Preventive Interventions

    ERIC Educational Resources Information Center

    Foster, E. Michael; Porter, Michele M.; Ayers, Tim S.; Kaplan, Debra L.; Sandler, Irwin

    2007-01-01

    The goal of this article is to improve the practice and reporting of cost estimates of prevention programs. It reviews the steps in estimating the costs of an intervention and the principles that should guide estimation. The authors then review prior efforts to estimate intervention costs using a sample of well-known but diverse studies. Finally,…

  38. Accurate genome relative abundance estimation based on shotgun metagenomic reads.

    PubMed

    Xia, Li C; Cram, Jacob A; Chen, Ting; Fuhrman, Jed A; Sun, Fengzhu

    2011-01-01

    Accurate estimation of microbial community composition based on metagenomic sequencing data is fundamental for subsequent metagenomics analysis. Prevalent estimation methods are mainly based on directly summarizing alignment results or their variants, and often result in biased and/or unstable estimates. We have developed a unified probabilistic framework (named GRAMMy) by explicitly modeling read assignment ambiguities, genome size biases, and read distributions along the genomes. A maximum likelihood method is employed to compute Genome Relative Abundance of microbial communities using the Mixture Model theory (GRAMMy). GRAMMy has been demonstrated to give estimates that are accurate and robust across both simulated and real read benchmark datasets. We applied GRAMMy to a collection of 34 metagenomic read sets from four metagenomics projects and identified 99 frequent species (minimally 0.5% abundant in at least 50% of the datasets) in the human gut samples. Our results show substantial improvements over previous studies, such as adjusting the over-estimated abundance for Bacteroides species in human gut samples, by providing a new reference-based strategy for metagenomic sample comparisons. GRAMMy can be used flexibly with many read assignment tools (mapping, alignment or composition-based) even with low-sensitivity mapping results from huge short-read datasets. It will be increasingly useful as an accurate and robust tool for abundance estimation with the growing size of read sets and the expanding database of reference genomes.
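
    A minimal sketch of the mixture-model/EM framework the abstract describes, assuming a precomputed matrix of read-to-genome assignment likelihoods and known genome lengths; this is our illustration of the general approach, not the GRAMMy implementation:

        import numpy as np

        def em_abundance(p, genome_lengths, n_iter=200):
            # p[i, j]: likelihood that read i originated from genome j
            # (ambiguous assignments allowed; each row assumed nonzero somewhere).
            # Returns genome relative abundances, corrected for genome size bias.
            n_reads, n_genomes = p.shape
            pi = np.full(n_genomes, 1.0 / n_genomes)    # mixing proportions
            for _ in range(n_iter):
                r = p * pi                              # E-step: responsibilities
                r /= r.sum(axis=1, keepdims=True)
                pi = r.sum(axis=0) / n_reads            # M-step: update proportions
            # Longer genomes yield more reads, so divide by length before normalizing.
            ab = pi / np.asarray(genome_lengths, dtype=float)
            return ab / ab.sum()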

  39. Hydrogen Station Cost Estimates: Comparing Hydrogen Station Cost Calculator Results with other Recent Estimates

    SciTech Connect

    Melaina, M.; Penev, M.

    2013-09-01

    This report compares hydrogen station cost estimates conveyed by expert stakeholders through the Hydrogen Station Cost Calculation (HSCC) to a select number of other cost estimates. These other cost estimates include projections based upon cost models and costs associated with recently funded stations.

  40. Accurate absolute GPS positioning through satellite clock error estimation

    NASA Astrophysics Data System (ADS)

    Han, S.-C.; Kwon, J. H.; Jekeli, C.

    2001-05-01

    An algorithm for very accurate absolute positioning through Global Positioning System (GPS) satellite clock estimation has been developed. Using International GPS Service (IGS) precise orbits and measurements, GPS clock errors were estimated at 30-s intervals. Compared to values determined by the Jet Propulsion Laboratory, the agreement was at the level of about 0.1 ns (3 cm). The clock error estimates were then applied to an absolute positioning algorithm in both static and kinematic modes. For the static case, an IGS station was selected and the coordinates were estimated every 30 s. The estimated absolute position coordinates and the known values had a mean difference of up to 18 cm with standard deviation less than 2 cm. For the kinematic case, data obtained every second from a GPS buoy were tested and the result from the absolute positioning was compared to a differential GPS (DGPS) solution. The mean differences between the coordinates estimated by the two methods are less than 40 cm and the standard deviations are less than 25 cm. It was verified that this poorer standard deviation on 1-s position results is due to the clock error interpolation from 30-s estimates with Selective Availability (SA). After SA was turned off, higher-rate clock error estimates (such as 1 s) could be obtained by a simple interpolation with negligible corruption. Therefore, the proposed absolute positioning technique can be used to within a few centimeters' precision at any rate by estimating 30-s satellite clock errors and interpolating them.
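
    A minimal sketch of the final interpolation step, densifying hypothetical 30-s clock error estimates to 1-s epochs; per the abstract, with SA off this simple interpolation corrupts the estimates negligibly:

        import numpy as np

        t30 = np.arange(0, 301, 30)                  # 30-s estimation epochs (s)
        clk30 = np.array([1.2, 1.5, 1.1, 0.9, 1.3, 1.6,
                          1.4, 1.0, 0.8, 1.1, 1.2])  # clock errors (ns, hypothetical)

        t1 = np.arange(0, 301, 1)                    # 1-s epochs for kinematic use
        clk1 = np.interp(t1, t30, clk30)             # linear densification
        print(clk1[:5])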

  41. An Accurate Link Correlation Estimator for Improving Wireless Protocol Performance

    PubMed Central

    Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun

    2015-01-01

    Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation. PMID:25686314
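
    For context, a minimal sketch of a packet-level link correlation measure (the conditional reception probability between two receivers of the same broadcasts) with a simple blend of long- and short-term behavior; the blending weight is a hypothetical stand-in, not LACE's actual estimator:

        def link_correlation(rx_a, rx_b):
            # P(B received | A received), from 0/1 traces of the same broadcasts.
            both = sum(1 for a, b in zip(rx_a, rx_b) if a and b)
            a_received = sum(rx_a)
            return both / a_received if a_received else 0.0

        def blended(long_term, short_term, alpha=0.3):
            # Weight recent behavior against the long-term average (hypothetical).
            return alpha * short_term + (1.0 - alpha) * long_term

        print(link_correlation([1, 1, 0, 1, 1], [1, 0, 0, 1, 1]))  # -> 0.75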

  42. Virtualness of the Cost Estimating Community

    DTIC Science & Technology

    2011-03-01

    [Thesis front-matter fragments: "Virtualness of the Cost Estimating Community," Whiticar S. Darvill, Capt, USAF, AFIT/GCA/ENV/11-M01, Department of the Air…; …of Master of Science in Cost Analysis, March 2011; approved for release, distribution unlimited.]

  43. How accurate are physical property estimation programs for organosilicon compounds?

    PubMed

    Boethling, Robert; Meylan, William

    2013-11-01

    Organosilicon compounds are important in chemistry and commerce, and nearly 10% of new chemical substances for which premanufacture notifications are processed by the US Environmental Protection Agency (USEPA) contain silicon (Si). Yet, remarkably few measured values are submitted for key physical properties, and the accuracy of estimation programs such as the Estimation Programs Interface (EPI) Suite and the SPARC Performs Automated Reasoning in Chemistry (SPARC) system is largely unknown. To address this issue, the authors developed an extensive database of measured property values for organic compounds containing Si and evaluated the performance of no-cost estimation programs for several properties of importance in environmental assessment. These included melting point (mp), boiling point (bp), vapor pressure (vp), water solubility, n-octanol/water partition coefficient (log KOW ), and Henry's law constant. For bp and the larger of 2 vp datasets, SPARC, MPBPWIN, and the USEPA's Toxicity Estimation Software Tool (TEST) had similar accuracy. For log KOW and water solubility, the authors tested 11 and 6 no-cost estimators, respectively. The best performers were Molinspiration and WSKOWWIN, respectively. The TEST's consensus mp method outperformed that of MPBPWIN by a considerable margin. Generally, the best programs estimated the listed properties of diverse organosilicon compounds with accuracy sufficient for chemical screening. The results also highlight areas where improvement is most needed.

  44. Fast and accurate estimation for astrophysical problems in large databases

    NASA Astrophysics Data System (ADS)

    Richards, Joseph W.

    2010-10-01

    A recent flood of astronomical data has created much demand for sophisticated statistical and machine learning tools that can rapidly draw accurate inferences from large databases of high-dimensional data. In this Ph.D. thesis, methods for statistical inference in such databases will be proposed, studied, and applied to real data. I use methods for low-dimensional parametrization of complex, high-dimensional data that are based on the notion of preserving the connectivity of data points in the context of a Markov random walk over the data set. I show how this simple parametrization of data can be exploited to: define appropriate prototypes for use in complex mixture models, determine data-driven eigenfunctions for accurate nonparametric regression, and find a set of suitable features to use in a statistical classifier. In this thesis, methods for each of these tasks are built up from simple principles, compared to existing methods in the literature, and applied to data from astronomical all-sky surveys. I examine several important problems in astrophysics, such as estimation of star formation history parameters for galaxies, prediction of redshifts of galaxies using photometric data, and classification of different types of supernovae based on their photometric light curves. Fast methods for high-dimensional data analysis are crucial in each of these problems because they all involve the analysis of complicated high-dimensional data in large, all-sky surveys. Specifically, I estimate the star formation history parameters for the nearly 800,000 galaxies in the Sloan Digital Sky Survey (SDSS) Data Release 7 spectroscopic catalog, determine redshifts for over 300,000 galaxies in the SDSS photometric catalog, and estimate the types of 20,000 supernovae as part of the Supernova Photometric Classification Challenge. Accurate predictions and classifications are imperative in each of these examples because these estimates are utilized in broader inference problems.
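
    A minimal sketch of the connectivity-preserving parametrization the thesis builds on: a diffusion-map-style embedding derived from a Markov random walk over the data set. The bandwidth and data are hypothetical, and this is a generic illustration rather than the thesis code:

        import numpy as np

        def diffusion_embed(X, eps=1.0, n_components=2):
            # Embed rows of X with eigenvectors of the random-walk transition matrix.
            d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distances
            K = np.exp(-d2 / eps)                                # Gaussian affinities
            P = K / K.sum(axis=1, keepdims=True)                 # row-stochastic walk
            vals, vecs = np.linalg.eig(P)
            order = np.argsort(-vals.real)
            # Drop the trivial constant eigenvector (eigenvalue 1).
            return vecs.real[:, order[1:n_components + 1]]

        X = np.random.default_rng(0).normal(size=(50, 5))
        print(diffusion_embed(X).shape)  # (50, 2)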

  45. Accurate estimation of sigma^0 using AIRSAR data

    NASA Technical Reports Server (NTRS)

    Holecz, Francesco; Rignot, Eric

    1995-01-01

    During recent years signature analysis, classification, and modeling of Synthetic Aperture Radar (SAR) data as well as estimation of geophysical parameters from SAR data have received a great deal of interest. An important requirement for the quantitative use of SAR data is the accurate estimation of the backscattering coefficient sigma^0. In terrain with relief variations, radar signals are distorted due to the projection of the scene topography into the slant range-Doppler plane. The effect of these variations is to change the physical size of the scattering area, leading to errors in the radar backscatter values and incidence angle. For this reason the local incidence angle, derived from sensor position and Digital Elevation Model (DEM) data, must always be considered. Especially in the airborne case, the antenna gain pattern can be an additional source of radiometric error, because the radar look angle is not known precisely as a result of the aircraft motions and the local surface topography. Consequently, radiometric distortions due to the antenna gain pattern must also be corrected for each resolution cell, by taking into account aircraft displacements (position and attitude) and the position of the backscatter element, defined by the DEM data. In this paper, a method to derive an accurate estimation of the backscattering coefficient using NASA/JPL AIRSAR data is presented. The results are evaluated in terms of geometric accuracy, radiometric variations of sigma^0, and precision of the estimated forest biomass.
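
    A minimal sketch of a per-cell radiometric correction of the kind described, assuming the simplified area relation sigma^0 = beta^0 * sin(theta_local) and a known two-way antenna gain; the relation and the sample values are illustrative simplifications, not the paper's algorithm:

        import numpy as np

        def sigma0_db(beta0_db, local_incidence_deg, antenna_gain_db):
            # Convert radar brightness (beta^0, dB) to sigma^0 (dB) using the local
            # incidence angle from a DEM, and remove the two-way antenna gain (dB).
            beta0 = 10.0 ** (beta0_db / 10.0)
            s0 = beta0 * np.sin(np.radians(local_incidence_deg))
            return 10.0 * np.log10(s0) - antenna_gain_db

        print(sigma0_db(-3.0, 35.0, 1.5))  # hypothetical resolution cell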

  46. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    The development of parametric cost estimating methods for advanced space systems in the conceptual design phase is discussed. The process of identifying variables which drive cost and the relationship between weight and cost are discussed. A theoretical model of cost is developed and tested using a historical data base of research and development projects.

  47. Process-based Cost Estimation for Ramjet/Scramjet Engines

    NASA Technical Reports Server (NTRS)

    Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John

    2003-01-01

    Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology, and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Rather than depending on weight as an input, process-based estimation considers the materials, resources, and processes in establishing cost-risk, and actually estimates weight along with cost and schedule.

  48. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the limitations of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority shortcomings within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities…

  49. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities…

  50. Accurate Satellite-Derived Estimates of Tropospheric Ozone Radiative Forcing

    NASA Technical Reports Server (NTRS)

    Joiner, Joanna; Schoeberl, Mark R.; Vasilkov, Alexander P.; Oreopoulos, Lazaros; Platnick, Steven; Livesey, Nathaniel J.; Levelt, Pieternel F.

    2008-01-01

    Estimates of the radiative forcing due to anthropogenically-produced tropospheric O3 are derived primarily from models. Here, we use tropospheric ozone and cloud data from several instruments in the A-train constellation of satellites as well as information from the GEOS-5 Data Assimilation System to accurately estimate the instantaneous radiative forcing from tropospheric O3 for January and July 2005. We improve upon previous estimates of tropospheric ozone mixing ratios from a residual approach using the NASA Earth Observing System (EOS) Aura Ozone Monitoring Instrument (OMI) and Microwave Limb Sounder (MLS) by incorporating cloud pressure information from OMI. Since we cannot distinguish between natural and anthropogenic sources with the satellite data, our estimates reflect the total forcing due to tropospheric O3. We focus specifically on the magnitude and spatial structure of the cloud effect on both the short- and long-wave radiative forcing. The estimates presented here can be used to validate present-day O3 radiative forcing produced by models.

  51. Towards accurate and precise estimates of lion density.

    PubMed

    Elliot, Nicholas B; Gopalaswamy, Arjun M

    2016-12-13

    Reliable estimates of animal density are fundamental to our understanding of ecological processes and population dynamics. Furthermore, their accuracy is vital to conservation biology since wildlife authorities rely on these figures to make decisions. However, it is notoriously difficult to accurately estimate density for wide-ranging species such as carnivores that occur at low densities. In recent years, significant progress has been made in density estimation of Asian carnivores, but the methods have not been widely adapted to African carnivores. African lions (Panthera leo) provide an excellent example, as although abundance indices have been shown to produce poor inferences, they continue to be used to estimate lion density and inform management and policy. In this study we adapt a Bayesian spatially explicit capture-recapture model to estimate lion density in the Maasai Mara National Reserve (MMNR) and surrounding conservancies in Kenya. We utilize sightings data from a three-month survey period to produce statistically rigorous spatial density estimates. Overall posterior mean lion density was estimated to be 16.85 (posterior standard deviation = 1.30) lions over one year of age per 100 km^2, with a sex ratio of 2.2♀:1♂. We argue that such methods should be developed, improved, and favored over less reliable methods such as track and call-up surveys. We caution against trend analyses based on surveys of differing reliability and call for a unified framework to assess lion numbers across their range in order for better informed management and policy decisions to be made.

  12. Conceptual framework for estimating the social cost of drug abuse.

    PubMed

    French, M T; Rachal, J V; Hubbard, R L

    1991-01-01

    Drug abuse imposes costs on individuals and society. Researchers have produced several studies on a subset of tangible costs of drug abuse and other illnesses, but key tangible costs sometimes have been overlooked and, even when recognized, rarely have been estimated. An assortment of intangible costs also has received very little research attention. This study outlines a comprehensive conceptual framework for estimating the social cost of drug abuse. We address both the tangible and intangible costs for the drug-abusing and non-drug-abusing population. Our conceptual framework is based on critical reviews of new and traditional methods for estimating the costs of illness and disease, including cost-of-illness methods, averting behavior methods, and utility valuation techniques. We show how the proposed methods can be combined with existing data to estimate the total social cost of drug abuse. Using social cost estimates will enable policymakers to more accurately assess the total burden of drug abuse and related problems on society.

  13. Data Service Provider Cost Estimation Tool

    NASA Technical Reports Server (NTRS)

    Fontaine, Kathy; Hunolt, Greg; Booth, Arthur L.; Banks, Mel

    2011-01-01

    The Data Service Provider Cost Estimation Tool (CET) and Comparables Database (CDB) package provides NASA's Earth Science Enterprise (ESE) with the ability to produce year-by-year lifecycle cost estimates for the implementation and operation of data service providers required by ESE to support its science and applications programs. The CET can make estimates dealing with staffing costs, supplies, facility costs, network services, hardware and maintenance, commercial off-the-shelf (COTS) software licenses, software development and sustaining engineering, and the changes in costs that result from changes in workload. Data Service Providers may be stand-alone or embedded in flight projects, field campaigns, research or applications projects, or other activities. The CET and CDB package employs a cost-estimation-by-analogy approach. It is based on a new, general data service provider reference model that provides a framework for construction of a database by describing existing data service providers that are analogs (or comparables) to planned, new ESE data service providers. The CET implements the staff effort and cost estimation algorithms that access the CDB and generates the lifecycle cost estimate for a new data services provider. These estimates create a common basis on which ESE proposal evaluators can consider projected data service provider costs.
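
    As a hedged illustration of the cost-estimation-by-analogy approach described above, the sketch below scales the observed costs of comparable providers by a workload ratio and averages the results. The class names, fields, and scaling rule are illustrative assumptions, not the actual CET/CDB algorithm.

      # Minimal sketch of cost-estimation-by-analogy (assumed scaling rule,
      # not the actual CET algorithm).
      from dataclasses import dataclass

      @dataclass
      class Comparable:
          name: str
          annual_workload: float   # e.g., data volume served, in TB/year
          annual_cost: float       # observed cost, in dollars/year

      def estimate_by_analogy(comparables, planned_workload):
          """Scale each comparable's cost by the workload ratio, then average."""
          scaled = [c.annual_cost * (planned_workload / c.annual_workload)
                    for c in comparables]
          return sum(scaled) / len(scaled)

      cdb = [Comparable("provider_a", 120.0, 2.4e6),
             Comparable("provider_b", 300.0, 4.1e6)]
      print(f"Estimated cost: ${estimate_by_analogy(cdb, 200.0):,.0f}/year")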

  14. Estimating nursing costs--a methodological review.

    PubMed

    Chiang, Bea

    2009-05-01

    A critical cost accounting issue relating to nursing costs is that nursing costs are currently averaged into the daily room rate in hospitals. This accounting practice treats nursing units as cost centers, as nursing costs are bundled into a per diem rate instead of billed separately. As a result, all patients in a given care unit of the hospital are presumed to consume the same amount of nursing care resources. This costing and billing system creates a mismatch between resource consumption and billed charges. The objective of this paper is to (1) demonstrate current practices for estimating nursing costs, (2) classify nursing costs into direct and indirect costs in order to refine the existing approaches, and (3) argue that a system of billing nursing costs separately better reflects the costs of patient care.

  15. A Cost Estimation Tool for Charter Schools

    ERIC Educational Resources Information Center

    Hayes, Cheryl D.; Keller, Eric

    2009-01-01

    To align their financing strategies and fundraising efforts with their fiscal needs, charter school leaders need to know how much funding they need and what that funding will support. This cost estimation tool offers a simple set of worksheets to help start-up charter school operators identify and estimate the range of costs and timing of…

  16. Accurate estimators of correlation functions in Fourier space

    NASA Astrophysics Data System (ADS)

    Sefusatti, E.; Crocce, M.; Scoccimarro, R.; Couchman, H. M. P.

    2016-08-01

    Efficient estimators of Fourier-space statistics for large numbers of objects rely on fast Fourier transforms (FFTs), which are affected by aliasing from unresolved small-scale modes due to the finite FFT grid. Aliasing takes the form of a sum over images, each of them corresponding to the Fourier content displaced by increasing multiples of the sampling frequency of the grid. These spurious contributions limit the accuracy in the estimation of Fourier-space statistics, and are typically ameliorated by simultaneously increasing grid size and discarding high-frequency modes. This results in inefficient estimates for, e.g., the power spectrum when the desired systematic bias is well below the per cent level. We show that using interlaced grids removes odd images, which include the dominant contribution to aliasing. In addition, we discuss the choice of interpolation kernel used to define density perturbations on the FFT grid and demonstrate that using higher-order interpolation kernels than the standard Cloud-In-Cell algorithm results in a significant reduction of the remaining images. We show that combining fourth-order interpolation with interlacing gives very accurate Fourier amplitudes and phases of density perturbations. This results in power spectrum and bispectrum estimates that have systematic biases below 0.01 per cent all the way to the Nyquist frequency of the grid, thus maximizing the use of unbiased Fourier coefficients for a given grid size and greatly reducing systematics for applications to large cosmological data sets.
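
    The interlacing trick is compact enough to sketch: two assignment grids offset by half a cell are combined in Fourier space with a phase shift, which cancels the odd aliased images. The one-dimensional toy below (in Python with NumPy, using plain Cloud-In-Cell assignment rather than the paper's higher-order kernels) illustrates the mechanics only.

      # Toy 1-D sketch of interlacing: density is assigned to two grids
      # offset by half a cell and their Fourier modes are combined with a
      # phase shift, cancelling the odd aliased images.
      import numpy as np

      def cic_assign(x, n_grid, box):
          """Cloud-In-Cell mass assignment on a periodic 1-D grid."""
          h = box / n_grid
          grid = np.zeros(n_grid)
          cell = np.floor(x / h).astype(int)
          frac = x / h - cell
          np.add.at(grid, cell % n_grid, 1.0 - frac)
          np.add.at(grid, (cell + 1) % n_grid, frac)
          return grid

      def interlaced_density_k(x, n_grid, box):
          h = box / n_grid
          d1 = np.fft.rfft(cic_assign(x, n_grid, box))
          d2 = np.fft.rfft(cic_assign((x + 0.5 * h) % box, n_grid, box))
          k = 2.0 * np.pi * np.fft.rfftfreq(n_grid, d=h)
          # Phase-shift the interlaced grid back, then average: odd images cancel.
          return 0.5 * (d1 + d2 * np.exp(1j * k * 0.5 * h))

      rng = np.random.default_rng(0)
      delta_k = interlaced_density_k(rng.uniform(0.0, 1.0, 10_000), 64, 1.0)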

  17. Demystifying the Cost Estimation Process

    ERIC Educational Resources Information Center

    Obi, Samuel C.

    2010-01-01

    In manufacturing today, nothing is more important than giving a customer a clear and straight-forward accounting of what their money has purchased. Many potentially promising return business orders are lost because of unclear, ambiguous, or improper billing. One of the best ways of resolving cost bargaining conflicts is by providing a…

  18. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    Parametric cost estimating methods for space systems in the conceptual design phase are developed. The approach is to identify variables that drive cost such as weight, quantity, development culture, design inheritance, and time. The relationship between weight and cost is examined in detail. A theoretical model of cost is developed and tested statistically against a historical data base of major research and development programs. It is concluded that the technique presented is sound, but that it must be refined in order to produce acceptable cost estimates.
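
    The weight-cost relationship the abstract examines is typically captured by a power-law CER, cost = a * weight^b, fit in log-log space. A minimal sketch with invented data points follows; the coefficients and units are illustrative assumptions, not the report's results.

      # Hedged sketch of a weight-driven CER: fit log(cost) = log(a) + b*log(weight)
      # to historical programs, then predict a new system's cost.
      import numpy as np

      hist_weight = np.array([450.0, 900.0, 2200.0, 5100.0])   # kg (notional)
      hist_cost = np.array([38.0, 61.0, 120.0, 210.0])         # $M (notional)

      b, log_a = np.polyfit(np.log(hist_weight), np.log(hist_cost), 1)
      cer = lambda w: np.exp(log_a) * w ** b
      print(f"cost(1500 kg) ≈ ${cer(1500.0):.0f}M  (b = {b:.2f})")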

  19. Cross Service Fixed-Wing Cost Estimation

    DTIC Science & Technology

    2016-05-17

    and Navy and used historical data from the costing databases maintained by each service: AFTOC for the Air Force and VAMOSC for the Navy.12 Using the...project was the procurement of data. We used the Air Force and Navy historical costing databases to collect all of our data. The databases had the same... historical data for each of the service weapon systems operational costs that can be used as a basis for future cost estimation stored in databases. It is

  20. A Framework for Automating Cost Estimates in Assembly Processes

    SciTech Connect

    Calton, T.L.; Peters, R.R.

    1998-12-09

    When a product concept emerges, the manufacturing engineer is asked to sketch out a production strategy and estimate its cost. The engineer is given an initial product design, along with a schedule of expected production volumes. The engineer then determines the best approach to manufacturing the product, comparing a variey of alternative production strategies. The engineer must consider capital cost, operating cost, lead-time, and other issues in an attempt to maximize pro$ts. After making these basic choices and sketching the design of overall production, the engineer produces estimates of the required capital, operating costs, and production capacity. 177is process may iterate as the product design is refined in order to improve its pe~ormance or manufacturability. The focus of this paper is on the development of computer tools to aid manufacturing engineers in their decision-making processes. This computer sof~are tool provides aj?amework in which accurate cost estimates can be seamlessly derivedfiom design requirements at the start of any engineering project. Z+e result is faster cycle times through first-pass success; lower ll~e cycie cost due to requirements-driven design and accurate cost estimates derived early in the process.

  1. Estimating the standardized mean difference with minimum risk: Maximizing accuracy and minimizing cost with sequential estimation.

    PubMed

    Chattopadhyay, Bhargab; Kelley, Ken

    2017-03-01

    The standardized mean difference is a widely used effect size measure. In this article, we develop a general theory for estimating the population standardized mean difference by minimizing both the mean square error of the estimator and the total sampling cost. Fixed sample size methods, when sample size is planned before the start of a study, cannot simultaneously minimize both the mean square error of the estimator and the total sampling cost. To overcome this limitation of the current state of affairs, this article develops a purely sequential sampling procedure, which provides an estimate of the sample size required to achieve a sufficiently accurate estimate with minimum expected sampling cost. Performance of the purely sequential procedure is examined via a simulation study to show that our analytic developments are highly accurate. Additionally, we provide freely available functions in R to implement the algorithm of the purely sequential procedure.
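
    To convey the flavor of a purely sequential minimum-risk procedure, the sketch below applies the classic rule for a plain mean: risk ≈ σ²/n + c·n is minimized near n* = σ/√c, and sampling stops once n exceeds the plug-in estimate s/√c. This is a simplified stand-in, not the authors' rule for the standardized mean difference or their R functions.

      # Schematic purely sequential minimum-risk estimation of a mean.
      import numpy as np

      def sequential_mean(draw, cost_per_obs, pilot=10):
          xs = list(draw(pilot))                      # pilot sample
          while True:
              n, s = len(xs), np.std(xs, ddof=1)
              # Stop once n exceeds the plug-in optimum s / sqrt(c).
              if n >= max(pilot, s / np.sqrt(cost_per_obs)):
                  return np.mean(xs), n
              xs.append(draw(1)[0])                   # otherwise sample one more

      rng = np.random.default_rng(1)
      est, n_used = sequential_mean(lambda k: rng.normal(5.0, 2.0, k), 1e-4)
      print(f"estimate = {est:.3f} after n = {n_used} observations")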

  2. An approach to software cost estimation

    NASA Technical Reports Server (NTRS)

    Mcgarry, F.; Page, J.; Card, D.; Rohleder, M.; Church, V.

    1984-01-01

    A general procedure for software cost estimation in any environment is outlined. The basic concepts of work and effort estimation are explained, some popular resource estimation models are reviewed, and the accuracy of resource estimates is discussed. A software cost prediction procedure based on the experiences of the Software Engineering Laboratory in the flight dynamics area and incorporating management expertise, cost models, and historical data is described. The sources of information and relevant parameters available during each phase of the software life cycle are identified. The methodology suggested incorporates these elements into a customized management tool for software cost prediction. Detailed guidelines for estimation in the flight dynamics environment developed using this methodology are presented.

  3. Statistical methods of estimating mining costs

    USGS Publications Warehouse

    Long, K.R.

    2011-01-01

    Until it was defunded in 1995, the U.S. Bureau of Mines maintained a Cost Estimating System (CES) for prefeasibility-type economic evaluations of mineral deposits and estimating costs at producing and non-producing mines. This system had a significant role in mineral resource assessments to estimate costs of developing and operating known mineral deposits and predicted undiscovered deposits. For legal reasons, the U.S. Geological Survey cannot update and maintain CES. Instead, statistical tools are under development to estimate mining costs from basic properties of mineral deposits such as tonnage, grade, mineralogy, depth, strip ratio, distance from infrastructure, rock strength, and work index. The first step was to reestimate "Taylor's Rule" which relates operating rate to available ore tonnage. The second step was to estimate statistical models of capital and operating costs for open pit porphyry copper mines with flotation concentrators. For a sample of 27 proposed porphyry copper projects, capital costs can be estimated from three variables: mineral processing rate, strip ratio, and distance from nearest railroad before mine construction began. Of all the variables tested, operating costs were found to be significantly correlated only with strip ratio.
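
    The first step mentioned, Taylor's Rule, is easy to state: mine life in years scales roughly as the fourth root of ore tonnage. The helper below uses the traditional textbook coefficients, which are assumptions here, not the USGS re-estimates.

      # Taylor's Rule in its classic textbook form: life (years) ~ 0.2 * T^0.25.
      def taylor_rule(ore_tonnes, days_per_year=350):
          life_years = 0.2 * ore_tonnes ** 0.25
          tonnes_per_day = ore_tonnes / (life_years * days_per_year)
          return life_years, tonnes_per_day

      life, rate = taylor_rule(50e6)   # a notional 50 Mt deposit
      print(f"life ≈ {life:.1f} y, operating rate ≈ {rate:,.0f} t/d")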

  4. Estimating archiving costs for engineering records

    SciTech Connect

    Stutz, R.A.; Lamartine, B.C.

    1997-02-01

    Information technology has completely changed the concept of record keeping for engineering projects -- the advent of digital records was a momentous discovery, as significant as the invention of the printing press. Digital records allowed huge amounts of information to be stored in a very small space and to be examined quickly. However, digital documents are much more vulnerable to the passage of time than printed documents because the media on which they are stored are easily affected by physical phenomena, such as magnetic fields, oxidation, material decay, and by various environmental factors that may erase the information. Even more important, digital information becomes obsolete because, even if future generations may be able to read it, they may not necessarily be able to interpret it. Engineering projects of all sizes are becoming more dependent on digital records. These records are created on computers used in design, estimating, construction management, and construction. The necessity for the accurate and accessible storage of these documents, generated by computer software systems, is increasing for a number of reasons including legal and environment issues. This paper will discuss media life considerations and life cycle costs associated with several methods of storing engineering records.

  5. COST ESTIMATING EQUATIONS FOR BEST MANAGEMENT PRACTICES

    EPA Science Inventory

    This paper describes the development of an interactive internet-based cost-estimating tool for commonly used urban storm runoff best management practices (BMP), including: retention and detention ponds, grassed swales, and constructed wetlands. The paper presents the cost data, c...

  6. Decommissioning Cost Estimating - The "PRICE" Approach

    SciTech Connect

    Manning, R.; Gilmour, J.

    2002-02-26

    Over the past 9 years UKAEA has developed a formalized approach to decommissioning cost estimating. The estimating methodology and computer-based application are known collectively as the PRICE system. At the heart of the system is a database (the knowledge base) which holds resource demand data on a comprehensive range of decommissioning activities. This data is used in conjunction with project specific information (the quantities of specific components) to produce decommissioning cost estimates. PRICE is a dynamic cost-estimating tool, which can satisfy both strategic planning and project management needs. With a relatively limited analysis a basic PRICE estimate can be produced and used for the purposes of strategic planning. This same estimate can be enhanced and improved, primarily by the improvement of detail, to support sanction expenditure proposals, and also as a tender assessment and project management tool. The paper will: describe the principles of the PRICE estimating system; report on the experiences of applying the system to a wide range of projects from contaminated car parks to nuclear reactors; provide information on the performance of the system in relation to historic estimates, tender bids, and outturn costs.
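
    The PRICE calculation pattern, unit resource demands from a knowledge base multiplied by project-specific quantities plus period-dependent costs, can be sketched as follows. All activities, rates, and quantities are invented placeholders, not UKAEA data.

      # Minimal sketch of a knowledge-base-driven decommissioning estimate.
      knowledge_base = {            # activity -> (person-days, disposal $) per unit
          "dismantle_pipework_m": (0.4, 25.0),
          "decontaminate_cell_m2": (1.2, 180.0),
      }
      day_rate = 600.0              # $ per person-day (assumed)

      def price_style_estimate(quantities, duration_years, annual_overhead):
          activity = sum(quantities[a] * (pd * day_rate + disp)
                         for a, (pd, disp) in knowledge_base.items())
          return activity + duration_years * annual_overhead

      cost = price_style_estimate({"dismantle_pipework_m": 800,
                                   "decontaminate_cell_m2": 350},
                                  duration_years=3, annual_overhead=2.5e5)
      print(f"estimate: ${cost:,.0f}")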

  7. Cost estimation model for advanced planetary programs, fourth edition

    NASA Technical Reports Server (NTRS)

    Spadoni, D. J.

    1983-01-01

    The development of the planetary program cost model is discussed. The Model was updated to incorporate cost data from the most recent US planetary flight projects and extensively revised to more accurately capture the information in the historical cost data base. This data base comprises the historical cost data for 13 unmanned lunar and planetary flight programs. The revision was made with a twofold objective: to increase the flexibility of the model in its ability to deal with the broad scope of scenarios under consideration for future missions, and to maintain and possibly improve upon the confidence in the model's capabilities with an expected accuracy of 20%. The Model development included a labor/cost proxy analysis, selection of the functional forms of the estimating relationships, and test statistics. An analysis of the Model is discussed and two sample applications of the cost model are presented.

  8. Cost Estimating Cases: Educational Tools for Cost Analysts

    DTIC Science & Technology

    1993-09-01

    only appropriate documentation should be provided. In other words, students should not submit all of the documentation possible using ACEIT, only that...case was their lack of understanding of the ACEIT software used to conduct the estimate. Specifically, many students misinterpreted the cost...estimating relationships (CERs) embedded in the software. Additionally, few of the students were able to properly organize the ACEIT documentation output

  9. Cost Estimation in Engineer-to-Order Manufacturing

    NASA Astrophysics Data System (ADS)

    Hooshmand, Yousef; Köhler, Peter; Korff-Krumm, Andrea

    2016-02-01

    In Engineer-to-Order (ETO) manufacturing the price of products must be defined during early stages of product design and during the bidding process; thus an overestimation of product development (PD) costs may lead to the loss of orders, while an underestimation causes a profit loss. What many ETO systems have in common is that the products have to be developed based on different customer requirements, so that each order usually results in a new variant. Furthermore, many customer requirement change-requests may arise in different phases of the PD, which must be handled properly. Thus it is of utmost importance for ETO systems to have accurate cost estimation in the first stages of product design and to be able to determine the cost of customer requirement changes in different phases of PD. This paper aims to present a cost estimation methodology as well as a cost estimation model, which estimate the cost of products by relative comparison of the attributes of new product variants with the attributes of standard product variants. In addition, as a necessity in ETO manufacturing, the cost calculation of customer requirement changes in different phases of PD is integrated in the presented method.

  10. Estimation of bone permeability using accurate microstructural measurements.

    PubMed

    Beno, Thoma; Yoon, Young-June; Cowin, Stephen C; Fritton, Susannah P

    2006-01-01

    While interstitial fluid flow is necessary for the viability of osteocytes, it is also believed to play a role in bone's mechanosensory system by shearing bone cell membranes or causing cytoskeleton deformation and thus activating biochemical responses that lead to the process of bone adaptation. However, the fluid flow properties that regulate bone's adaptive response are poorly understood. In this paper, we present an analytical approach to determine the degree of anisotropy of the permeability of the lacunar-canalicular porosity in bone. First, we estimate the total number of canaliculi emanating from each osteocyte lacuna based on published measurements from parallel-fibered shaft bones of several species (chick, rabbit, bovine, horse, dog, and human). Next, we determine the local three-dimensional permeability of the lacunar-canalicular porosity for these species using recent microstructural measurements and adapting a previously developed model. Results demonstrated that the number of canaliculi per osteocyte lacuna ranged from 41 for human to 115 for horse. Permeability coefficients were found to be different in three local principal directions, indicating local orthotropic symmetry of bone permeability in parallel-fibered cortical bone for all species examined. For the range of parameters investigated, the local lacunar-canalicular permeability varied more than three orders of magnitude, with the osteocyte lacunar shape and size along with the 3-D canalicular distribution determining the degree of anisotropy of the local permeability. This two-step theoretical approach to determine the degree of anisotropy of the permeability of the lacunar-canalicular porosity will be useful for accurate quantification of interstitial fluid movement in bone.

  11. Accurate estimates of age at maturity from the growth trajectories of fishes and other ectotherms.

    PubMed

    Honsey, Andrew E; Staples, David F; Venturelli, Paul A

    2017-01-01

    Age at maturity (AAM) is a key life history trait that provides insight into ecology, evolution, and population dynamics. However, maturity data can be costly to collect or may not be available. Life history theory suggests that growth is biphasic for many organisms, with a change-point in growth occurring at maturity. If so, then it should be possible to use a biphasic growth model to estimate AAM from growth data. To test this prediction, we used the Lester biphasic growth model in a likelihood profiling framework to estimate AAM from length at age data. We fit our model to simulated growth trajectories to determine minimum data requirements (in terms of sample size, precision in length at age, and the cost to somatic growth of maturity) for accurate AAM estimates. We then applied our method to a large walleye Sander vitreus data set and show that our AAM estimates are in close agreement with conventional estimates when our model fits well. Finally, we highlight the potential of our method by applying it to length at age data for a variety of ectotherms. Our method shows promise as a tool for estimating AAM and other life history traits from contemporary and historical samples.
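
    The profiling idea is straightforward to sketch: step through candidate ages at maturity, fit a separate growth phase on each side, and keep the change-point with the best fit. The toy below uses piecewise-linear phases for brevity; the actual analysis fits the Lester biphasic model in a likelihood framework.

      # Simplified change-point profiling for age at maturity (piecewise-linear
      # stand-in for the Lester biphasic model).
      import numpy as np

      def profile_aam(age, length, candidates):
          best_T, best_sse = None, np.inf
          for T in candidates:
              pre, post = age <= T, age > T
              if pre.sum() < 3 or post.sum() < 3:
                  continue
              sse = 0.0
              for m in (pre, post):                     # fit each phase separately
                  coef = np.polyfit(age[m], length[m], 1)
                  sse += np.sum((length[m] - np.polyval(coef, age[m])) ** 2)
              if sse < best_sse:
                  best_T, best_sse = T, sse
          return best_T

      rng = np.random.default_rng(2)
      age = rng.uniform(1, 12, 200)
      true_len = np.where(age < 5, 60 * age, 300 + 15 * (age - 5))  # kink at age 5
      length = true_len + rng.normal(0, 10, age.size)
      print("estimated AAM:", profile_aam(age, length, np.arange(2, 10, 0.25)))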

  12. Support to LANL: Cost estimation. Final report

    SciTech Connect

    Not Available

    1993-10-04

    This report summarizes the activities and progress by ICF Kaiser Engineers conducted on behalf of Los Alamos National Laboratory (LANL) for the US Department of Energy, Office of Waste Management (EM-33) in the area of improving methods for Cost Estimation. This work was conducted between October 1, 1992 and September 30, 1993. ICF Kaiser Engineers supported LANL in providing the Office of Waste Management with planning and document preparation services for a Cost and Schedule Estimating Guide (Guide). The intent of the Guide was to use Activity-Based Cost (ABC) estimation as a basic method in preparing cost estimates for DOE planning and budgeting documents, including Activity Data Sheets (ADSs), which form the basis for the Five Year Plan document. Prior to the initiation of the present contract with LANL, ICF Kaiser Engineers was tasked to initiate planning efforts directed toward a Guide. This work, accomplished from June to September, 1992, included visits to eight DOE field offices and consultation with DOE Headquarters staff to determine the need for a Guide, the desired contents of a Guide, and the types of ABC estimation methods and documentation requirements that would be compatible with current or potential practices and expertise in existence at DOE field offices and their contractors.

  13. Cost estimating procedure for unmanned satellites

    NASA Astrophysics Data System (ADS)

    Greer, H.; Campbell, H. G.

    1980-11-01

    Historical costs from 11 unmanned satellite programs were analyzed. From these data, total satellite cost estimating relationships (CERs) were developed for use during preliminary design studies. A time-related factor, believed to account for differences in technology, was observed in the data. Stratification of the data by type of payload was also found to be necessary. Cost differences that stem from production quantity variations were accounted for by adjustment factors developed from standard learning curve theory. An example to illustrate use of the CERs is provided.
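
    Learning-curve quantity adjustments of the kind mentioned follow standard arithmetic: with learning slope s, unit N costs T1 · N^b where b = log2(s). A small sketch (the slope and prices below are illustrative, not the report's factors):

      # Standard unit learning-curve arithmetic for a production lot.
      import math

      def lot_cost(t1, quantity, slope=0.9):
          b = math.log2(slope)                 # e.g., 90% slope -> b ~ -0.152
          return t1 * sum(n ** b for n in range(1, quantity + 1))

      # First unit $20M, 90% learning curve, buy of 8 satellites:
      print(f"lot cost ≈ ${lot_cost(20.0, 8):.1f}M")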

  14. PACE 2: Pricing and Cost Estimating Handbook

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.; Shepherd, T.

    1977-01-01

    An automatic data processing system to be used for the preparation of industrial engineering type manhour and material cost estimates has been established. This computer system has evolved into a highly versatile and highly flexible tool which significantly reduces computation time, eliminates computational errors, and reduces typing and reproduction time for estimators and pricers since all mathematical and clerical functions are automatic once basic inputs are derived.

  15. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.

  16. Cost estimate of initial SSC experimental equipment

    SciTech Connect

    1986-06-01

    The cost of the initial detector complement at recently constructed colliding beam facilities (or at those under construction) has been a significant fraction of the cost of the accelerator complex. Because of the complexity of large modern-day detectors, the time-scale for their design and construction is comparable to the time-scale needed for accelerator design and construction. For these reasons it is appropriate to estimate the cost of the anticipated detector complement in parallel with the cost estimates of the collider itself. The fundamental difficulty with this procedure is that, whereas a firm conceptual design of the collider does exist, comparable information is unavailable for the detectors. Traditionally, these have been built by the high energy physics user community according to their perception of the key scientific problems that need to be addressed. The role of the accelerator laboratory in that process has involved technical and managerial coordination and the allocation of running time and local facilities among the proposed experiments. It seems proper that the basic spirit of experimentation reflecting the scientific judgment of the community should be preserved at the SSC. Furthermore, the formal process of initiation of detector proposals can only start once the SSC has been approved as a construction project and a formal laboratory administration put in place. Thus an ad hoc mechanism had to be created to estimate the range of potential detector needs, potential detector costs, and associated computing equipment.

  17. PROCEDURE FOR ESTIMATING PERMANENT TOTAL ENCLOSURE COSTS

    EPA Science Inventory

    The paper discusses a procedure for estimating permanent total enclosure (PTE) costs. (NOTE: Industries that use add-on control devices must adequately capture emissions before delivering them to the control device. One way to capture emissions is to use PTEs, enclosures that mee...

  18. Hydrogen from coal cost estimation guidebook

    NASA Technical Reports Server (NTRS)

    Billings, R. E.

    1981-01-01

    In an effort to establish baseline information whereby specific projects can be evaluated, a current set of parameters which are typical of coal gasification applications was developed. Using these parameters a computer model allows researchers to interrelate cost components in a sensitivity analysis. The results make possible an approximate estimation of hydrogen energy economics from coal, under a variety of circumstances.

  19. Unmanned Aerial Vehicles unique cost estimating requirements

    NASA Astrophysics Data System (ADS)

    Malone, P.; Apgar, H.; Stukes, S.; Sterk, S.

    Unmanned Aerial Vehicles (UAVs), also referred to as drones, are aerial platforms that fly without a human pilot onboard. UAVs are controlled autonomously by a computer in the vehicle or under the remote control of a pilot stationed at a fixed ground location. There are a wide variety of drone shapes, sizes, configurations, complexities, and characteristics. Use of these devices by the Department of Defense (DoD), NASA, civil and commercial organizations continues to grow. UAVs are commonly used for intelligence, surveillance, and reconnaissance (ISR). They are also used for combat operations and civil applications, such as firefighting, non-military security work, and surveillance of infrastructure (e.g. pipelines, power lines and country borders). UAVs are often preferred for missions that require sustained persistence (over 4 hours in duration), or are "too dangerous, dull or dirty" for manned aircraft. Moreover, they can offer significant acquisition and operations cost savings over traditional manned aircraft. Because of these unique characteristics and missions, UAV estimates require some unique estimating methods. This paper describes a framework for estimating UAV systems total ownership cost including hardware components, software design, and operations. The challenge of collecting data, testing the sensitivities of cost drivers, and creating cost estimating relationships (CERs) for each key work breakdown structure (WBS) element is discussed. The autonomous operation of UAVs is especially challenging from a software perspective.
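
    A total-ownership-cost estimate of the kind the framework describes ultimately rolls CER outputs up over the WBS. The toy below is purely notional; the WBS elements and CER forms are invented for illustration.

      # Notional WBS roll-up for UAV total ownership cost (invented CERs).
      def uav_toc(air_vehicle_kg, sw_ksloc, flight_hours_per_year, years):
          dev = 0.08 * air_vehicle_kg ** 0.9 + 0.3 * sw_ksloc   # $M, notional CERs
          ops = 0.002 * flight_hours_per_year * years           # $M, notional
          return dev + ops

      print(f"TOC ≈ ${uav_toc(1200, 150, 1000, 10):.1f}M")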

  20. Estimating Teacher Turnover Costs: A Case Study

    ERIC Educational Resources Information Center

    Levy, Abigail Jurist; Joy, Lois; Ellis, Pamela; Jablonski, Erica; Karelitz, Tzur M.

    2012-01-01

    High teacher turnover in large U.S. cities is a critical issue for schools and districts, and the students they serve; but surprisingly little work has been done to develop methodologies and standards that districts and schools can use to make reliable estimates of turnover costs. Even less is known about how to detect variations in turnover costs…

  1. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Deep Space Network (DSN) Data Systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit DSN software life cycle statistics. The estimation model output scales a standard DSN Work Breakdown Structure skeleton, which is then input into a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.

  2. Fast and Accurate Learning When Making Discrete Numerical Estimates

    PubMed Central

    Sanborn, Adam N.; Beierholm, Ulrik R.

    2016-01-01

    Many everyday estimation tasks have an inherently discrete nature, whether the task is counting objects (e.g., a number of paint buckets) or estimating discretized continuous variables (e.g., the number of paint buckets needed to paint a room). While Bayesian inference is often used for modeling estimates made along continuous scales, discrete numerical estimates have not received as much attention, despite their common everyday occurrence. Using two tasks, a numerosity task and an area estimation task, we invoke Bayesian decision theory to characterize how people learn discrete numerical distributions and make numerical estimates. Across three experiments with novel stimulus distributions we found that participants fell between two common decision functions for converting their uncertain representation into a response: drawing a sample from their posterior distribution and taking the maximum of their posterior distribution. While this was consistent with the decision function found in previous work using continuous estimation tasks, surprisingly the prior distributions learned by participants in our experiments were much more adaptive: When making continuous estimates, participants have required thousands of trials to learn bimodal priors, but in our tasks participants learned discrete bimodal and even discrete quadrimodal priors within a few hundred trials. This makes discrete numerical estimation tasks good testbeds for investigating how people learn and make estimates. PMID:27070155
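
    The two decision functions compared in the study are easy to contrast in code: given a discrete posterior over counts, respond with either a draw from the posterior or its maximum. The prior and likelihood below are placeholders.

      # Contrast of the two decision functions: posterior sampling vs. posterior maximum.
      import numpy as np

      values = np.arange(1, 11)                          # possible counts
      prior = np.ones(10) / 10                           # learned prior (placeholder)
      likelihood = np.exp(-0.5 * (values - 6.3) ** 2)    # noisy observation of ~6
      posterior = prior * likelihood
      posterior /= posterior.sum()

      rng = np.random.default_rng(3)
      sample_response = rng.choice(values, p=posterior)  # draw from the posterior
      map_response = values[np.argmax(posterior)]        # take the posterior maximum
      print(sample_response, map_response)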

  3. Software Estimates Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Smith, C. L.

    2003-01-01

    Simulation-Based Cost Model (SiCM), a discrete event simulation developed in Extend, simulates pertinent aspects of the testing of rocket propulsion test articles for the purpose of estimating the costs of such testing during time intervals specified by its users. A user enters input data for control of simulations; information on the nature of, and activity in, a given testing project; and information on resources. Simulation objects are created on the basis of this input. Costs of the engineering-design, construction, and testing phases of a given project are estimated from numbers and labor rates of engineers and technicians employed in each phase; the duration of each phase; costs of materials used in each phase; and, for the testing phase, the rate of maintenance of the testing facility. The three main outputs of SiCM are (1) a curve, updated at each iteration of the simulation, that shows overall expenditures vs. time during the interval specified by the user; (2) a histogram of the total costs from all iterations of the simulation; and (3) a table displaying means and variances of cumulative costs for each phase from all iterations. Other outputs include spending curves for each phase.

  4. Software Estimates Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Simulation-Based Cost Model (SiCM) is a computer program that simulates pertinent aspects of the testing of rocket engines for the purpose of estimating the costs of such testing during time intervals specified by its users. A user enters input data for control of simulations; information on the nature of, and activity in, a given testing project; and information on resources. Simulation objects are created on the basis of this input. Costs of the engineering-design, construction, and testing phases of a given project are estimated from numbers and labor rates of engineers and technicians employed in each phase; the duration of each phase; costs of materials used in each phase; and, for the testing phase, the rate of maintenance of the testing facility. The three main outputs of SiCM are (1) a curve, updated at each iteration of the simulation, that shows overall expenditures vs. time during the interval specified by the user; (2) a histogram of the total costs from all iterations of the simulation; and (3) a table displaying means and variances of cumulative costs for each phase from all iterations. Other outputs include spending curves for each phase.

  5. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE BIOAVAILABILITY OF LEAD TO QUAIL

    EPA Science Inventory

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contami...

  6. Bioaccessibility tests accurately estimate bioavailability of lead to quail

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb, we incorporated Pb-contaminated soils or Pb acetate into diets for Japanese quail (Coturnix japonica), fed the quail for 15 days, and ...

  7. Novel serologic biomarkers provide accurate estimates of recent Plasmodium falciparum exposure for individuals and communities.

    PubMed

    Helb, Danica A; Tetteh, Kevin K A; Felgner, Philip L; Skinner, Jeff; Hubbard, Alan; Arinaitwe, Emmanuel; Mayanja-Kizza, Harriet; Ssewanyana, Isaac; Kamya, Moses R; Beeson, James G; Tappero, Jordan; Smith, David L; Crompton, Peter D; Rosenthal, Philip J; Dorsey, Grant; Drakeley, Christopher J; Greenhouse, Bryan

    2015-08-11

    Tools to reliably measure Plasmodium falciparum (Pf) exposure in individuals and communities are needed to guide and evaluate malaria control interventions. Serologic assays can potentially produce precise exposure estimates at low cost; however, current approaches based on responses to a few characterized antigens are not designed to estimate exposure in individuals. Pf-specific antibody responses differ by antigen, suggesting that selection of antigens with defined kinetic profiles will improve estimates of Pf exposure. To identify novel serologic biomarkers of malaria exposure, we evaluated responses to 856 Pf antigens by protein microarray in 186 Ugandan children, for whom detailed Pf exposure data were available. Using data-adaptive statistical methods, we identified combinations of antibody responses that maximized information on an individual's recent exposure. Responses to three novel Pf antigens accurately classified whether an individual had been infected within the last 30, 90, or 365 d (cross-validated area under the curve = 0.86-0.93), whereas responses to six antigens accurately estimated an individual's malaria incidence in the prior year. Cross-validated incidence predictions for individuals in different communities provided accurate stratification of exposure between populations and suggest that precise estimates of community exposure can be obtained from sampling a small subset of that community. In addition, serologic incidence predictions from cross-sectional samples characterized heterogeneity within a community similarly to 1 y of continuous passive surveillance. Development of simple ELISA-based assays derived from the successful selection strategy outlined here offers the potential to generate rich epidemiologic surveillance data that will be widely accessible to malaria control programs.

  8. Probabilistic cost estimates for climate change mitigation.

    PubMed

    Rogelj, Joeri; McCollum, David L; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-01-03

    For more than a decade, the target of keeping global warming below 2 °C has been a key focus of the international climate debate. In response, the scientific community has published a number of scenario studies that estimate the costs of achieving such a target. Producing these estimates remains a challenge, particularly because of relatively well known, but poorly quantified, uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on the one hand, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other hand, has spent years improving its understanding of the geophysical response of the Earth system to emissions of greenhouse gases. This geophysical response remains a key uncertainty in the cost of mitigation scenarios but has been integrated with assessments of other uncertainties in only a rudimentary manner, that is, for equilibrium conditions. Here we bridge this gap between the two research communities by generating distributions of the costs associated with limiting transient global temperature increase to below specific values, taking into account uncertainties in four factors: geophysical, technological, social and political. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by geophysical uncertainties, social factors influencing future energy demand and, lastly, technological uncertainties surrounding the availability of greenhouse gas mitigation options. Our information on temperature risk and mitigation costs provides crucial information for policy-making, because it clarifies the relative importance of mitigation costs, energy demand and the timing of global action in reducing the risk of exceeding a global temperature increase of 2 °C, or other limits such as 3 °C or 1.5 °C.

  9. Essays on the Bayesian estimation of stochastic cost frontier

    NASA Astrophysics Data System (ADS)

    Zhao, Xia

    This dissertation consists of three essays that focus on a Bayesian estimation of stochastic cost frontiers for electric generation plants. This research gives insight into the changing development of the electric generation market and could serve to inform both private investment and public policy decisions. The main contributions to the growing literature on stochastic cost frontier analysis are to (1) Empirically estimate the possible efficiency gain of power plants due to deregulation. (2) Estimate the cost of electric power generating plants using coal as a fuel taking into account both regularity restrictions and sulfur dioxide emissions. (3) Compare costs of plants using coal to those who use natural gas. (4) Apply the Bayesian stochastic frontier model to estimate a single cost frontier and allow firm type to vary across regulated and deregulated plants. The average group efficiency for two different types of plants is estimated. (5) Use a fixed effects and random effects model on an unbalanced panel to estimated group efficiency for regulated and deregulated plants. The first essay focuses on the possible efficiency gain of 136 U.S. electric power generation coal-fired plants in 1996. Results favor the constrained model over the unconstrained model. SO2 is also included in the model to provide more accurate estimates of plant efficiency and returns to scale. The second essay focuses on the predicted costs and returns to scale of coal generation to natural gas generation at plants where the cost of both fuels could be obtained. It is found that, for power plants switching fuel from natural gas to coal in 1996, on average, the expected fuel cost would fall and returns to scale would increase. The third essay first uses pooled unbalanced panel data to analyze the differences in plant efficiency across plant types---regulated and deregulated. The application of a Bayesian stochastic frontier model enables us to apply different mean plant inefficiency terms by
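
    The underlying frontier structure is ln(cost) = frontier + v + u, with symmetric noise v and non-negative inefficiency u. The sketch below simulates that structure and recovers inefficiency with corrected OLS (COLS), a deliberately simpler estimator than the Bayesian machinery used in the dissertation.

      # Simulate a stochastic cost frontier and recover inefficiency via COLS.
      import numpy as np

      rng = np.random.default_rng(4)
      n = 300
      ln_y = rng.uniform(4, 8, n)                  # log output
      v = rng.normal(0, 0.05, n)                   # symmetric noise
      u = np.abs(rng.normal(0, 0.15, n))           # inefficiency (>= 0)
      ln_c = 1.0 + 0.8 * ln_y + v + u              # log cost

      beta, alpha = np.polyfit(ln_y, ln_c, 1)      # plain OLS fit
      resid = ln_c - (alpha + beta * ln_y)
      u_hat = resid - resid.min()                  # COLS: shift residuals to the frontier
      print(f"slope ≈ {beta:.2f}, mean inefficiency ≈ {u_hat.mean():.3f}")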

  10. An Evaluation of Software Cost Estimating Models.

    DTIC Science & Technology

    1981-06-01

    [Report documentation page residue: "An Evaluation of Software Cost Estimating Models," Sep 73 - Oct 79; author: Robert Thibodeau; contract F30602-...] ...review of the draft DCP begins, the program can be terminated with the approval of the highest command level which authorized it. Once DSARC review begins...concert with many other elements. Initially, we might speak of the navigation subsystem and its functions. Later, we would describe the alignment element

  11. Estimating the social costs of nitrogen pollution

    NASA Astrophysics Data System (ADS)

    Gourevitch, J.; Keeler, B.; Polasky, S.

    2014-12-01

    Agricultural expansion can degrade water quality and related ecosystem services through increased export of nutrients. Such damages to water quality can negatively affect recreation, property values, and human health. While the relationship between agricultural production and nitrogen export is well-studied, the economic costs of nitrogen loss are less well understood. We present a comprehensive assessment of the full costs associated with nitrate pollution from agricultural sources in Minnesota. We found that the most significant economic costs are likely from groundwater contamination of nitrate in public and private wells. For example, we estimated that loss of grassland to corn cultivation in Minnesota between 2007 and 2012 is expected to increase the future number of domestic wells exceeding nitrate concentrations of 10 ppm by 31%. This increase in contamination is estimated to cost well owners $1.4 to 19 million (present values over a 20 year horizon) through remediation, avoidance, and replacement. Our findings demonstrate linkages between changes in land use, water quality, and human well-being.
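
    The cost figure quoted is, at heart, present-value arithmetic over newly contaminated wells. A sketch with placeholder inputs (not the study's parameters):

      # Present value of recurring well remediation/replacement costs.
      def pv_of_well_costs(new_wells_per_year, cost_per_well, years=20, rate=0.03):
          return sum(new_wells_per_year * cost_per_well / (1 + rate) ** t
                     for t in range(1, years + 1))

      print(f"PV ≈ ${pv_of_well_costs(50, 8_000.0):,.0f}")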

  12. Accurate feature detection and estimation using nonlinear and multiresolution analysis

    NASA Astrophysics Data System (ADS)

    Rudin, Leonid; Osher, Stanley

    1994-11-01

    A program for feature detection and estimation using nonlinear and multiscale analysis was completed. The state-of-the-art edge detection was combined with multiscale restoration (as suggested by the first author) and robust results in the presence of noise were obtained. Successful applications to numerous images of interest to DOD were made. Also, a new market in the criminal justice field was developed, based in part on this work.

  13. The use of parametric cost estimating relationships for transport aircraft systems in establishing initial Design to Cost Targets

    NASA Technical Reports Server (NTRS)

    Beltramo, M. N.; Anderson, J. L.

    1977-01-01

    This paper provides a brief overview of Design to Cost (DTC). Problems inherent in attempting to estimate costs are discussed, along with techniques and types of models that have been developed to estimate aircraft costs. A set of cost estimating relationships that estimate the total production cost of commercial and military transport aircraft at the systems level is presented and the manner in which these equations might be used effectively in developing initial DTC targets is indicated. The principal point made in this paper is that, by using a disaggregated set of equations to estimate transport aircraft costs at the systems level, reasonably accurate preliminary cost estimates may be achieved. These estimates may serve directly as initial DTC targets, or adjustments may be made to the estimates obtained for some of the systems to estimate the production cost impact of alternative designs or manufacturing technologies. The relative ease by which estimates may be made with this model, the flexibility it provides by being disaggregated, and the accuracy of the estimates it provides make it a unique and useful tool in establishing initial DTC targets.

  14. Accurate tempo estimation based on harmonic + noise decomposition

    NASA Astrophysics Data System (ADS)

    Alonso, Miguel; Richard, Gael; David, Bertrand

    2006-12-01

    We present an innovative tempo estimation system that processes acoustic audio signals and does not use any high-level musical knowledge. Our proposal relies on a harmonic + noise decomposition of the audio signal by means of a subspace analysis method. Then, a technique to measure the degree of musical accentuation as a function of time is developed and separately applied to the harmonic and noise parts of the input signal. This is followed by a periodicity estimation block that calculates the salience of musical accents for a large number of potential periods. Next, a multipath dynamic programming searches among all the potential periodicities for the most consistent prospects through time, and finally the most energetic candidate is selected as tempo. Our proposal is validated using a manually annotated test-base containing 961 music signals from various musical genres. In addition, the performance of the algorithm under different configurations is compared. The robustness of the algorithm when processing signals of degraded quality is also measured.
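
    A heavily simplified stand-in for the periodicity-estimation stage is shown below: pick the autocorrelation peak of an accentuation signal inside a plausible tempo range. The full system adds the harmonic + noise decomposition and multipath dynamic programming that this sketch omits.

      # Toy tempo estimate: autocorrelation peak of an onset-strength signal.
      import numpy as np

      def tempo_from_accent(accent, frame_rate, bpm_range=(60, 180)):
          ac = np.correlate(accent - accent.mean(), accent - accent.mean(), "full")
          ac = ac[ac.size // 2:]                         # keep non-negative lags
          lo = int(round(60.0 * frame_rate / bpm_range[1]))
          hi = int(round(60.0 * frame_rate / bpm_range[0]))
          lag = lo + int(np.argmax(ac[lo:hi]))           # best lag in tempo range
          return 60.0 * frame_rate / lag

      fr = 100.0                                         # accent frames per second
      t = np.arange(0, 30, 1 / fr)
      accent = (np.sin(2 * np.pi * 2.0 * t) > 0.99).astype(float)  # ~120 bpm pulse
      print(f"tempo ≈ {tempo_from_accent(accent, fr):.1f} bpm")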

  15. Fast and Accurate Estimates of Divergence Times from Big Data.

    PubMed

    Mello, Beatriz; Tao, Qiqing; Tamura, Koichiro; Kumar, Sudhir

    2017-01-01

    Ongoing advances in sequencing technology have led to an explosive expansion in the molecular data available for building increasingly larger and more comprehensive timetrees. However, Bayesian relaxed-clock approaches frequently used to infer these timetrees impose a large computational burden and discourage critical assessment of the robustness of inferred times to model assumptions, influence of calibrations, and selection of optimal data subsets. We analyzed eight large, recently published, empirical datasets to compare time estimates produced by RelTime (a non-Bayesian method) with those reported by using Bayesian approaches. We find that RelTime estimates are very similar to Bayesian approaches, yet RelTime requires orders of magnitude less computational time. This means that the use of RelTime will enable greater rigor in molecular dating, because faster computational speeds encourage more extensive testing of the robustness of inferred timetrees to prior assumptions (models and calibrations) and data subsets. Thus, RelTime provides a reliable and computationally thrifty approach for dating the tree of life using large-scale molecular datasets.

  16. Bioaccessibility tests accurately estimate bioavailability of lead to quail

    USGS Publications Warehouse

    Beyer, W. Nelson; Basta, Nicholas T; Chaney, Rufus L.; Henry, Paula F.; Mosby, David; Rattner, Barnett A.; Scheckel, Kirk G.; Sprague, Dan; Weber, John

    2016-01-01

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33% to 63%, with a mean of about 50%. Treatment of two of the soils with phosphorus significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. They were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test” and the “Waterfowl Physiologically Based Extraction Test.” All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite and tertiary Pb phosphate), and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb.

  17. Bioaccessibility tests accurately estimate bioavailability of lead to quail.

    PubMed

    Beyer, W Nelson; Basta, Nicholas T; Chaney, Rufus L; Henry, Paula F P; Mosby, David E; Rattner, Barnett A; Scheckel, Kirk G; Sprague, Daniel T; Weber, John S

    2016-09-01

    Hazards of soil-borne lead (Pb) to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, the authors measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from 5 Pb-contaminated Superfund sites had relative bioavailabilities from 33% to 63%, with a mean of approximately 50%. Treatment of 2 of the soils with phosphorus (P) significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in 6 in vitro tests and regressed on bioavailability: the relative bioavailability leaching procedure at pH 1.5, the same test conducted at pH 2.5, the Ohio State University in vitro gastrointestinal method, the urban soil bioaccessible lead test, the modified physiologically based extraction test, and the waterfowl physiologically based extraction test. All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the relative bioavailability leaching procedure at pH 2.5 and Ohio State University in vitro gastrointestinal tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite, and tertiary Pb phosphate) and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb, and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb. Environ Toxicol Chem 2016;35:2311-2319. Published 2016 Wiley Periodicals Inc. on behalf of SETAC.

  18. 48 CFR 252.215-7002 - Cost estimating system requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... historical costs, and other analyses used to generate cost estimates. (b) General. The Contractor shall... 48 Federal Acquisition Regulations System... of Provisions And Clauses 252.215-7002 Cost estimating system requirements. As prescribed in...

  19. Statistical Cost Estimation in Higher Education: Some Alternatives.

    ERIC Educational Resources Information Center

    Brinkman, Paul T.; Niwa, Shelley

    Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs are also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…

  20. Counting the cost of not costing HIV health facilities accurately: pay now, or pay more later.

    PubMed

    Beck, Eduard J; Avila, Carlos; Gerbase, Sofia; Harling, Guy; De Lay, Paul

    2012-10-01

    The HIV pandemic continues to be one of our greatest contemporary public health threats. Policy makers in many middle- and low-income countries are in the process of scaling up HIV prevention, treatment and care services in the context of a reduction in international HIV funding due to the global economic downturn. In order to scale up services that are sustainable in the long term, policy makers and implementers need to have access to robust and contemporary strategic information, including financial information on expenditure and cost, in order to be able to plan, implement, monitor and evaluate HIV services. A major problem in middle- and low-income countries continues to be a lack of basic information on the use of services, their cost, outcome and impact, while those few costing studies that have been performed were often not done in a standardized fashion. Some researchers handle this by transposing information from one country to another, developing mathematical or statistical models that rest on assumptions or information that may not be applicable, or using top-down costing methods that only provide global financial costs rather than using bottom-up ingredients-based costing. While these methods provide answers in the short term, countries should develop systematic data collection systems to store, transfer and produce robust and contemporary strategic financial information for stakeholders at local, sub-national and national levels. National aggregated information should act as the main source of financial data for international donors, agencies or other organizations involved with the global HIV response. This paper describes the financial information required by policy makers and other stakeholders to enable them to make evidence-informed decisions and reviews the quantity and quality of the financial information available, as indicated by cost studies published between 1981 and 2008. Among the lessons learned from reviewing these studies, a need was

  1. Estimated generic prices of cancer medicines deemed cost-ineffective in England: a cost estimation analysis

    PubMed Central

    Hill, Andrew; Redd, Christopher; Gotham, Dzintars; Erbacher, Isabelle; Meldrum, Jonathan; Harada, Ryo

    2017-01-01

    Objectives The aim of this study was to estimate lowest possible treatment costs for four novel cancer drugs, hypothesising that generic manufacturing could significantly reduce treatment costs. Setting This research was carried out in a non-clinical research setting using secondary data. Participants There were no human participants in the study. Four drugs were selected for the study: bortezomib, dasatinib, everolimus and gefitinib. These medications were selected according to their clinical importance, novel pharmaceutical actions and the availability of generic price data. Primary and secondary outcome measures Target costs for treatment were to be generated for each indication for each treatment. The primary outcome measure was the target cost according to a production cost calculation algorithm. The secondary outcome measure was the target cost as the lowest available generic price; this was necessary where export data were not available to generate an estimate from our cost calculation algorithm. Other outcomes included patent expiry dates and total eligible treatment populations. Results Target prices were £411 per cycle for bortezomib, £9 per month for dasatinib, £852 per month for everolimus and £10 per month for gefitinib. Compared with current list prices in England, these target prices would represent reductions of 74–99.6%. Patent expiry dates were bortezomib 2014–22, dasatinib 2020–26, everolimus 2019–25 and gefitinib 2017. The total global eligible treatment population in 1 year is 769 736. Conclusions Our findings demonstrate that affordable drug treatment costs are possible for novel cancer drugs, suggesting that new therapeutic options can be made available to patients and doctors worldwide. Assessing treatment cost estimations alongside cost-effectiveness evaluations is an important area of future research. PMID:28110283

  2. Accurate and Robust Attitude Estimation Using MEMS Gyroscopes and a Monocular Camera

    NASA Astrophysics Data System (ADS)

    Kobori, Norimasa; Deguchi, Daisuke; Takahashi, Tomokazu; Ide, Ichiro; Murase, Hiroshi

    In order to estimate accurate rotations of mobile robots and vehicles, we propose a hybrid system which combines a low-cost monocular camera with gyro sensors. Gyro sensors have drift errors that accumulate over time. A camera, on the other hand, cannot obtain the rotation continuously in cases where feature points cannot be extracted from images, although its accuracy is better than that of gyro sensors. To solve these problems we propose a method for combining these sensors based on an Extended Kalman Filter. The errors of the gyro sensors are corrected by referring to the rotations obtained from the camera. In addition, by judging the reliability of the camera rotations and devising the state value of the Extended Kalman Filter, the proposed method performs well even when the rotation is not continuously observable from the camera. Experimental results showed the effectiveness of the proposed method.
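
    The fusion scheme described above can be illustrated with a deliberately simplified, one-dimensional Kalman filter (linear rather than the paper's Extended Kalman Filter, since the 1-D case has no nonlinearity): the state holds the heading angle and the gyro bias, the gyro drives the prediction, and an occasional camera heading corrects the drift. All noise values, rates, and the measurement schedule below are invented for illustration.

```python
import numpy as np

# Minimal 1-D sketch of the gyro/camera fusion idea (hypothetical values):
# a gyro integrates angular rate but drifts; an occasional camera heading
# measurement corrects the drift via a Kalman filter, state = [angle, bias].
dt = 0.01
F = np.array([[1.0, -dt],   # angle_k+1 = angle_k + (rate - bias)*dt
              [0.0, 1.0]])  # bias modeled as a slowly varying random walk
H = np.array([[1.0, 0.0]])  # camera observes the angle directly

Q = np.diag([1e-5, 1e-7])   # process noise (assumed)
R = np.array([[1e-3]])      # camera measurement noise (assumed)

x = np.zeros(2)             # [angle, bias]
P = np.eye(2)

def predict(x, P, gyro_rate):
    x = F @ x + np.array([gyro_rate * dt, 0.0])
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z_camera):
    y = z_camera - H @ x                  # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Fuse: always predict from the gyro; update only when the camera rotation
# is available and judged reliable (e.g., enough tracked feature points).
for k in range(1000):
    gyro_rate = 0.5 + 0.02                # true rate plus a constant bias
    x, P = predict(x, P, gyro_rate)
    if k % 50 == 0:                       # camera fix every 50 steps
        z = 0.5 * dt * (k + 1)            # noiseless heading, for illustration
        x, P = update(x, P, np.array([z]))
```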

  3. Rapid, cost-effective and accurate quantification of Yucca schidigera Roezl. steroidal saponins using HPLC-ELSD method.

    PubMed

    Tenon, Mathieu; Feuillère, Nicolas; Roller, Marc; Birtić, Simona

    2017-04-15

    Yucca GRAS-labelled saponins have been and are increasingly used in the food/feed, pharmaceutical and cosmetic industries. Existing techniques presently used for Yucca steroidal saponin quantification remain either inaccurate and misleading, or accurate but time-consuming and cost-prohibitive. The method reported here addresses all of the above challenges. The HPLC/ELSD technique is an accurate and reliable method that yields results of appropriate repeatability and reproducibility. This method does not over- or under-estimate levels of steroidal saponins. The HPLC/ELSD method does not require every pure saponin standard to quantify the group of steroidal saponins. The method is a time- and cost-effective technique that is suitable for routine industrial analyses. The HPLC/ELSD method yields saponin fingerprints specific to the plant species. As the method is capable of distinguishing saponin profiles of taxonomically distant species, it can unravel plant adulteration issues.

  4. IVF cycle cost estimation using Activity Based Costing and Monte Carlo simulation.

    PubMed

    Cassettari, Lucia; Mosca, Marco; Mosca, Roberto; Rolando, Fabio; Costa, Mauro; Pisaturo, Valerio

    2016-03-01

    The authors present a new methodological approach, in a stochastic regime, to determine the actual costs of a healthcare process. The paper specifically shows the application of the methodology to determining the cost of an Assisted Reproductive Technology (ART) treatment in Italy. The motivation for this research is that a deterministic approach is inadequate for an accurate estimate of the cost of this particular treatment: the durations of the different activities involved are not fixed and are described by frequency distributions. Hence the need to determine, in addition to the mean value of the cost, the interval within which it varies with a known confidence level. The cost obtained for each type of cycle investigated (in vitro fertilization and embryo transfer with or without intracytoplasmic sperm injection) consequently shows tolerance intervals around the mean value sufficiently narrow to make the data statistically robust and therefore usable as a reference for benchmarking with other countries. From a methodological point of view the approach is rigorous: Activity Based Costing is used to determine the cost of the individual activities of the process, and Monte Carlo simulation, with control of experimental error, is used to construct the tolerance intervals on the final result.
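
    The abstract's combination of Activity Based Costing with Monte Carlo simulation can be sketched in a few lines: sample each activity's duration from a distribution, convert duration to cost through a resource rate, and read the mean and a tolerance interval off the simulated totals. The activities, distributions and rates below are invented placeholders, not the paper's data.

```python
import numpy as np

# Sketch of ABC + Monte Carlo: each activity has a stochastic duration
# (minutes) and an hourly resource rate. All values are illustrative.
rng = np.random.default_rng(0)

activities = {
    # name: (duration sampler, cost rate in EUR per hour)
    "consultation":    (lambda n: rng.triangular(20, 30, 60, n), 120.0),
    "lab_procedure":   (lambda n: rng.normal(90, 15, n),         200.0),
    "embryo_transfer": (lambda n: rng.triangular(30, 45, 90, n), 250.0),
}

n_runs = 100_000
total = np.zeros(n_runs)
for name, (sample, rate) in activities.items():
    # convert sampled minutes to hours, then to cost; clip negatives
    total += np.clip(sample(n_runs), 0, None) / 60.0 * rate

mean = total.mean()
lo, hi = np.percentile(total, [2.5, 97.5])
print(f"mean cost {mean:.0f} EUR, 95% interval [{lo:.0f}, {hi:.0f}] EUR")
```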

  5. ICPP tank farm closure study. Volume 3: Cost estimates, planning schedules, yearly cost flowcharts, and life-cycle cost estimates

    SciTech Connect

    1998-02-01

    This volume contains information on cost estimates, planning schedules, yearly cost flowcharts, and life-cycle costs for the six options described in Volume 1, Section 2: Option 1 -- Total removal clean closure; No subsequent use; Option 2 -- Risk-based clean closure; LLW fill; Option 3 -- Risk-based clean closure; CERCLA fill; Option 4 -- Close to RCRA landfill standards; LLW fill; Option 5 -- Close to RCRA landfill standards; CERCLA fill; and Option 6 -- Close to RCRA landfill standards; Clean fill. This volume is divided into two portions. The first portion contains the cost and planning schedule estimates while the second portion contains life-cycle costs and yearly cash flow information for each option.

  6. Fast Conceptual Cost Estimating of Aerospace Projects Using Historical Information

    NASA Technical Reports Server (NTRS)

    Butts, Glenn

    2007-01-01

    Accurate estimates can be created in less than a minute by applying powerful techniques and algorithms to create an Excel-based parametric cost model. In five easy steps you will learn how to normalize your company's historical cost data to the new project parameters. This paper provides a complete, easy-to-understand, step-by-step how-to guide. Such a guide does not seem to currently exist. Over 2,000 hours of research, data collection, and trial and error, and thousands of lines of Excel Visual Basic for Applications (VBA) code were invested in developing these methods. While VBA is not required to use this information, it increases the power and aesthetics of the model. Implementing all of the steps described, while not required, will increase the accuracy of the results.
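
    A minimal sketch of the normalization idea described above, under stated assumptions: historical costs are escalated to a common year with an inflation index and then fitted to a weight-based power-law cost estimating relationship. The index values, project data and resulting coefficients are hypothetical, not the paper's.

```python
import numpy as np

# Normalize historical costs to a common year, then fit cost = a * mass^b.
inflation_index = {2000: 0.78, 2004: 0.86, 2007: 1.00}  # assumed indices

# (year, dry mass in kg, then-year cost in $M) -- invented example data
history = [(2000, 150, 42.0), (2004, 420, 95.0), (2007, 900, 180.0)]

masses = np.array([m for _, m, _ in history], dtype=float)
costs_2007 = np.array([c / inflation_index[y] for y, _, c in history])

# Log-log least squares gives the parametric coefficients a and b.
b, log_a = np.polyfit(np.log(masses), np.log(costs_2007), 1)
a = np.exp(log_a)

new_mass = 600.0
print(f"estimated cost: {a * new_mass**b:.1f} $M (2007 dollars)")
```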

  7. Accurate and cost-effective natural resource data from super large scale aerial photography

    NASA Astrophysics Data System (ADS)

    Grotefendt, Richard Alan

    Increasing amounts and types of timely and accurate data are required for monitoring to ensure compliance with natural resource regulatory requirements. This study developed a cost-effective method to partially fulfill these data requirements using super large scale aerial photography (Scale: greater than 1:2,000). Two synchronized, metric, Rolleiflex 70mm (2.76in) cameras mounted 12m (40ft) apart on a rigid platform and carried at 5.6 km/hr (3 knots) by a helicopter collected this high resolution, 3D imagery from Alaska and Washington. The overlapping photo pairs provided 3D views of natural resource objects as fine as twigs. The 12m (40ft) inter-camera distance improved ground visibility between tree crowns of dense old growth forests. Analytical stereoplotters and the application of photogrammetric principles enabled measurement and interpretation of photo objects such as trees and their height in a cost-effective way. Horizontal and vertical measurement accuracy was within 2% and 3% of field measurement, respectively. Forest inventory and riparian buffer monitoring applications were used to test this method. Although field work is still required to develop photo-field relationships unique to each ecosystem and for quality assurance, the photo estimates of individual tree height, volume, diameter, type, and location, as well as down tree decay class and landing spot, plot timber volume, and area were comparable to and may replace approximately 95% of field effort. For example, the average of the absolute differences between field and photo estimates for tree height was 2.4m (7.8ft) (s.d. = 2.1m (6.8ft), n = 376), diameter at breast height (1.4m (4.5ft) above ground on uphill tree side) was 5.8cm (2.3in) (s.d. = 5.6cm (2.2in), n = 109), and plot volume in gross board feet was within 10.9% to 13.4% (n = 10) depending on the estimator used. Forest type was correctly classified 99.4% (n = 180) of the time. Timber inventory, species identification, sample

  8. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver L.

    2016-01-01

    Satellite constellations and Distributed Spacecraft Mission (DSM) architectures offer unique benefits to Earth observation scientists and unique challenges to cost estimators. The Cost and Risk (CR) module of the Tradespace Analysis Tool for Constellations (TAT-C) being developed by NASA Goddard seeks to address some of these challenges by providing a new approach to cost modeling, which aggregates existing Cost Estimating Relationships (CERs) from respected sources, cost estimating best practices, and data from existing and proposed satellite designs. Cost estimation in this tool is approached from two perspectives: parametric cost estimating relationships and analogous cost estimation techniques. The dual approach utilized within the TAT-C CR module is intended to address prevailing concerns regarding early design stage cost estimates, and to offer increased transparency and fidelity by providing two preliminary perspectives on mission cost. This work outlines the existing cost model, details assumptions built into the model, and explains what measures have been taken to address the particular challenges of constellation cost estimating. The risk estimation portion of the TAT-C CR module is still in development and will be presented in future work. The cost estimate produced by the CR module is not intended to be an exact mission valuation, but rather a comparative tool to assist in the exploration of the constellation design tradespace. Previous work has noted that estimating the cost of satellite constellations is difficult given that no comprehensive model for constellation cost estimation has yet been developed, and as such, quantitative assessment of multiple-spacecraft missions has many remaining areas of uncertainty. By combining well-established CERs with preliminary approaches to these uncertainties, the CR module offers a more complete approach to constellation costing than has previously been available to mission architects or Earth
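
    One well-established ingredient of constellation costing that a parametric perspective like this typically draws on is the learning-curve adjustment for producing many near-identical spacecraft. The sketch below shows the standard cost-improvement-curve arithmetic with invented CER coefficients and a 90% learning rate; it is a generic illustration, not TAT-C code.

```python
import math

# Single-unit CER plus a standard learning-curve adjustment for N units.
# Coefficients and the 90% learning rate are illustrative assumptions.

def unit_cost(mass_kg, a=1.2, b=0.65):
    """Hypothetical CER: theoretical first unit cost in $M from dry mass."""
    return a * mass_kg ** b

def constellation_cost(mass_kg, n_units, learning=0.90):
    """Total recurring cost of n_units with a cost-improvement curve."""
    exponent = 1.0 + math.log(learning) / math.log(2.0)
    return unit_cost(mass_kg) * n_units ** exponent

print(f"1 unit  : {constellation_cost(200, 1):7.1f} $M")
print(f"24 units: {constellation_cost(200, 24):7.1f} $M")
```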

  9. AN OVERVIEW OF TOOL FOR RESPONSE ACTION COST ESTIMATING (TRACE)

    SciTech Connect

    FERRIES SR; KLINK KL; OSTAPKOWICZ B

    2012-01-30

    Tools and techniques that provide improved performance and reduced costs are important to government programs, particularly in current times. An opportunity for improvement was identified for the preparation of cost estimates used to support the evaluation of response action alternatives. As a result, CH2M HILL Plateau Remediation Company has developed the Tool for Response Action Cost Estimating (TRACE). TRACE is a multi-page Microsoft Excel® workbook developed to introduce efficiencies into the timely and consistent production of cost estimates for response action alternatives. This tool combines costs derived from extensive site-specific runs of commercially available remediation cost models with site-specific, estimator-researched and derived costs, providing the best estimating sources available. TRACE also provides common quantity and key parameter links across multiple alternatives, maximizing the ease of updating estimates and performing sensitivity analyses, and ensuring consistency.

  10. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from carriers... if, as determined by the FBI, all cost data reasonably available to the carrier are either submitted... explain the estimating process are required by the FBI and the carrier refuses to provide necessary...

  11. Closing the Credibility Gap in Construction Cost Estimating.

    ERIC Educational Resources Information Center

    Picardi, E. Alfred

    The construction cost estimate, often expressed as an absolute cost, leads to misunderstanding between client, designers, and builders. If estimates are to be used as adequate cost indicators, their probabilistic nature must be recognized and they must be expressed not as absolute numbers but in terms of a number with some indication of the…

  12. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from carriers... if, as determined by the FBI, all cost data reasonably available to the carrier are either submitted... explain the estimating process are required by the FBI and the carrier refuses to provide necessary...

  13. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from carriers... if, as determined by the FBI, all cost data reasonably available to the carrier are either submitted... explain the estimating process are required by the FBI and the carrier refuses to provide necessary...

  14. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from carriers... if, as determined by the FBI, all cost data reasonably available to the carrier are either submitted... explain the estimating process are required by the FBI and the carrier refuses to provide necessary...

  15. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from carriers... if, as determined by the FBI, all cost data reasonably available to the carrier are either submitted... explain the estimating process are required by the FBI and the carrier refuses to provide necessary...

  16. Assuring Software Cost Estimates: Is it an Oxymoron?

    NASA Technical Reports Server (NTRS)

    Hihn, Jarius; Tregre, Grant

    2013-01-01

    The software industry repeatedly observes cost growth of well over 100% even after decades of cost estimation research and well-known best practices, so "What's the problem?" In this paper we provide an overview of the current state of software cost estimation best practice. We then explore whether applying some of the methods used in software assurance might improve the quality of software cost estimates. This paper focuses especially on issues associated with model calibration, estimate review, and the development and documentation of estimates as part of an integrated plan.

  17. Systems engineering and integration: Cost estimation and benefits analysis

    NASA Technical Reports Server (NTRS)

    Dean, ED; Fridge, Ernie; Hamaker, Joe

    1990-01-01

    Space Transportation Avionics hardware and software costs have traditionally been estimated in Phases A and B using techniques which predict cost as a function of various predictive variables such as weight, lines of code, functions to be performed, quantities of test hardware, quantities of flight hardware, design and development heritage, complexity, etc. The output of such analyses has been life cycle costs, economic benefits and related data. The major objectives of cost estimation and benefits analysis are twofold: (1) to play a role in the evaluation of potential new space transportation avionics technologies, and (2) to benefit from emerging technological innovations. Both aspects of cost estimation and technology are discussed here. The role of cost analysis in the evaluation of potential technologies should be one of offering additional quantitative and qualitative information to aid decision-making. The cost analysis process needs to be fully integrated into the design process in such a way that cost trades, optimizations and sensitivities are understood. Current hardware cost models tend to use primarily weights, functional specifications, quantities, design heritage and complexity as metrics to predict cost. Software models mostly use functionality, volume of code, heritage and complexity as cost descriptive variables. Basic research needs to be initiated to develop metrics more responsive to the trades which are required for future launch vehicle avionics systems. These would include cost estimating capabilities that are sensitive to technological innovations such as improved materials and fabrication processes, computer-aided design and manufacturing, self checkout and many others. In addition to basic cost estimating improvements, the process must be sensitive to the fact that no cost estimate can be quoted without also quoting a confidence associated with the estimate. In order to achieve this, better cost risk evaluation techniques are

  18. Mars Rover/Sample Return - Phase A cost estimation

    NASA Technical Reports Server (NTRS)

    Stancati, Michael L.; Spadoni, Daniel J.

    1990-01-01

    This paper presents a preliminary cost estimate for the design and development of the Mars Rover/Sample Return (MRSR) mission. The estimate was generated using a modeling tool specifically built to provide useful cost estimates from design parameters of the type and fidelity usually available during early phases of mission design. The model approach and its application to MRSR are described.

  19. Pros, Cons, and Alternatives to Weight Based Cost Estimating

    NASA Technical Reports Server (NTRS)

    Joyner, Claude R.; Lauriem, Jonathan R.; Levack, Daniel H.; Zapata, Edgar

    2011-01-01

    Many cost estimating tools use weight as a major parameter in projecting cost. This is often combined with modifying factors such as complexity, technical maturity of the design, operating environment, etc. to increase the fidelity of the estimate. For a set of conceptual designs, all meeting the same requirements, increased weight can be a major driver of increased cost. However, once a design is fixed, increased weight generally decreases cost, while decreased weight generally increases cost, and the relationship is not linear. Alternative approaches to estimating cost without using weight (except perhaps for materials costs) have been attempted in order to produce a tool usable throughout the design process, from concept studies through development. This paper addresses the pros and cons of using weight-based models for cost estimating, using liquid rocket engines as the example. It then examines approaches that minimize the impact of weight-based cost estimating. The Rocket Engine Cost Model (RECM), an attribute-based model developed internally by Pratt & Whitney Rocketdyne for NASA, is presented primarily to show a successful method of using design and programmatic parameters instead of weight to estimate both design and development costs and production costs. An operations model developed by KSC, the Launch and Landing Effects Ground Operations model (LLEGO), is also discussed.

  20. Estimating the cost of major ongoing cost plus hardware development programs

    NASA Technical Reports Server (NTRS)

    Bush, J. C.

    1990-01-01

    Approaches are developed for forecasting the cost of major hardware development programs while these programs are in the design and development C/D phase. Three approaches are developed: a schedule assessment technique for bottom-line summary cost estimation, a detailed cost estimation approach, and an intermediate cost element analysis procedure. The schedule assessment technique was developed using historical cost/schedule performance data.

  1. Accurate Non-parametric Estimation of Recent Effective Population Size from Segments of Identity by Descent.

    PubMed

    Browning, Sharon R; Browning, Brian L

    2015-09-03

    Existing methods for estimating historical effective population size from genetic data have been unable to accurately estimate effective population size during the most recent past. We present a non-parametric method for accurately estimating recent effective population size by using inferred long segments of identity by descent (IBD). We found that inferred segments of IBD contain information about effective population size from around 4 generations to around 50 generations ago for SNP array data and to over 200 generations ago for sequence data. In human populations that we examined, the estimates of effective size were approximately one-third of the census size. We estimate the effective population size of European-ancestry individuals in the UK four generations ago to be eight million and the effective population size of Finland four generations ago to be 0.7 million. Our method is implemented in the open-source IBDNe software package.

  2. Accurate Non-parametric Estimation of Recent Effective Population Size from Segments of Identity by Descent

    PubMed Central

    Browning, Sharon R.; Browning, Brian L.

    2015-01-01

    Existing methods for estimating historical effective population size from genetic data have been unable to accurately estimate effective population size during the most recent past. We present a non-parametric method for accurately estimating recent effective population size by using inferred long segments of identity by descent (IBD). We found that inferred segments of IBD contain information about effective population size from around 4 generations to around 50 generations ago for SNP array data and to over 200 generations ago for sequence data. In human populations that we examined, the estimates of effective size were approximately one-third of the census size. We estimate the effective population size of European-ancestry individuals in the UK four generations ago to be eight million and the effective population size of Finland four generations ago to be 0.7 million. Our method is implemented in the open-source IBDNe software package. PMID:26299365

  3. Estimating the cost of health care-associated infections: mind your p's and q's.

    PubMed

    Graves, Nicholas; Harbarth, Stephan; Beyersmann, Jan; Barnett, Adrian; Halton, Kate; Cooper, Ben

    2010-04-01

    Monetary valuations of the economic cost of health care-associated infections (HAIs) are important for decision making and should be estimated accurately. Erroneously high estimates of costs, designed to jolt decision makers into action, may do more harm than good in the struggle to attract funding for infection control. Expectations among policy makers might be raised, and then they are disappointed when the reduction in the number of HAIs does not yield the anticipated cost saving. For this article, we critically review the field and discuss 3 questions. Why measure the cost of an HAI? What outcome should be used to measure the cost of an HAI? What is the best method for making this measurement? The aim is to encourage researchers to collect and then disseminate information that accurately guides decisions about the economic value of expanding or changing current infection control activities.

  4. Bi-fluorescence imaging for estimating accurately the nuclear condition of Rhizoctonia spp.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Aims: To simplify the determination of the nuclear condition of pathogenic Rhizoctonia, which currently needs to be performed either with two fluorescent dyes, which is more costly and time-consuming, or with only one fluorescent dye, which is less accurate. Methods and Results: A red primary ...

  5. Adaptive arrival cost update for improving Moving Horizon Estimation performance.

    PubMed

    Sánchez, G; Murillo, M; Giovanini, L

    2017-03-01

    Moving horizon estimation is an efficient technique to estimate the states and parameters of constrained dynamical systems. It relies on the solution of a finite horizon optimization problem to compute the estimates, providing a natural framework to handle bounds and constraints on estimates, noises and parameters. However, the approximation of the arrival cost and its updating mechanism are an active research topic. The arrival cost is very important because it provides a means to incorporate information from previous measurements into the current estimates, and it is difficult to estimate its true value. In this work, we exploit the features of adaptive estimation methods to update the parameters of the arrival cost. We show that, with a better approximation of the arrival cost, the size of the optimization problem can be significantly reduced while guaranteeing the stability and convergence of the estimates. These properties are illustrated through simulation studies.

  6. LSimpute: accurate estimation of missing values in microarray data with least squares methods.

    PubMed

    Bø, Trond Hellem; Dysvik, Bjarte; Jonassen, Inge

    2004-02-20

    Microarray experiments generate data sets with information on the expression levels of thousands of genes in a set of biological samples. Unfortunately, such experiments often produce multiple missing expression values, normally due to various experimental problems. As many algorithms for gene expression analysis require a complete data matrix as input, the missing values have to be estimated in order to analyze the available data. Alternatively, genes and arrays can be removed until no missing values remain. However, for genes or arrays with only a small number of missing values, it is desirable to impute those values. For the subsequent analysis to be as informative as possible, it is essential that the estimates for the missing gene expression values are accurate. A small amount of badly estimated missing values in the data might be enough for clustering methods, such as hierarchical clustering or K-means clustering, to produce misleading results. Thus, accurate methods for missing value estimation are needed. We present novel methods for estimation of missing values in microarray data sets that are based on the least squares principle, and that utilize correlations between both genes and arrays. For this set of methods, we use the common reference name LSimpute. We compare the estimation accuracy of our methods with the widely used KNNimpute on three complete data matrices from public data sets by randomly knocking out data (labeling as missing). From these tests, we conclude that our LSimpute methods produce estimates that consistently are more accurate than those obtained using KNNimpute. Additionally, we examine a more classic approach to missing value estimation based on expectation maximization (EM). We refer to our EM implementations as EMimpute, and the estimate errors using the EMimpute methods are compared with those our novel methods produce. The results indicate that on average, the estimates from our best performing LSimpute method are at least as
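
    The core least-squares idea can be condensed into a toy sketch: to fill a missing entry, regress the target gene on its most correlated gene over the samples where both are observed, then predict. This is only the single-regression kernel of such an approach, not the full LSimpute algorithm, which combines several such estimates.

```python
import numpy as np

# Toy least-squares imputation: fill X[g, j] by regressing gene g on its
# most correlated gene. Assumes at least one correlated gene is observed.

def impute(X, g, j):
    obs = ~np.isnan(X[g])
    obs[j] = False                       # exclude the target column
    best, best_r = None, 0.0
    for k in range(X.shape[0]):
        if k == g or np.isnan(X[k, j]):
            continue
        both = obs & ~np.isnan(X[k])     # samples where both genes observed
        if both.sum() < 3:
            continue
        r = np.corrcoef(X[g, both], X[k, both])[0, 1]
        if abs(r) > abs(best_r):
            best, best_r = k, r
    both = obs & ~np.isnan(X[best])
    slope, intercept = np.polyfit(X[best, both], X[g, both], 1)
    return slope * X[best, j] + intercept

X = np.array([[1.0, 2.0, 3.0, np.nan],
              [1.1, 2.1, 2.9, 4.2],
              [5.0, 1.0, 0.5, 2.0]])
print(impute(X, g=0, j=3))   # predicted from the highly correlated gene 1
```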

  7. Handbook for cost estimating. A method for developing estimates of costs for generic actions for nuclear power plants

    SciTech Connect

    Ball, J.R.; Cohen, S.; Ziegler, E.Z.

    1984-10-01

    This document provides overall guidance to assist the NRC in preparing the types of cost estimates required by the Regulatory Analysis Guidelines and to assist in the assignment of priorities in resolving generic safety issues. The Handbook presents an overall cost model that allows the cost analyst to develop a chronological series of activities needed to implement a specific regulatory requirement throughout all applicable commercial LWR power plants and to identify the significant cost elements for each activity. References to available cost data are provided along with rules of thumb and cost factors to assist in evaluating each cost element. A suitable code-of-accounts data base is presented to assist in organizing and aggregating costs. Rudimentary cost analysis methods are described to allow the analyst to produce a constant-dollar, lifetime cost for the requirement. A step-by-step example cost estimate is included to demonstrate the overall use of the Handbook.

  8. 40 CFR 261.142 - Cost estimate.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) IDENTIFICATION AND LISTING OF HAZARDOUS WASTE Financial Requirements for Management of Excluded Hazardous Secondary... hazardous waste, and the potential cost of closing the facility as a treatment, storage, and...

  9. Parametric Equations for Estimating Aircraft Airframe Costs

    DTIC Science & Technology

    1976-02-01

    the following cost elements: engineering, tooling, nonrecurring manufacturing labor, recurring manufacturing labor, nonrecurring manufacturing... [table-of-contents fragments: Tooling; Manufacturing Labor; Manufacturing Materials; Flight Test; Quality Control; Total Cost; Other...] ...man-hours are higher than they would be under normal conditions. In anticipating a high production rate a contractor may expend many more tooling

  10. How to estimate productivity costs in economic evaluations.

    PubMed

    Krol, Marieke; Brouwer, Werner

    2014-04-01

    Productivity costs are frequently omitted from economic evaluations, despite their often strong impact on cost-effectiveness outcomes. This neglect may be partly explained by the lack of standardization regarding the methodology of estimating productivity costs. This paper aims to contribute to standardization of productivity cost methodology by offering practical guidance on how to estimate productivity costs in economic evaluations. The paper discusses the identification, measurement and valuation of productivity losses. It is recommended to include not only productivity losses related to absenteeism from and reduced productivity at paid work, but also those related to unpaid work. Hence, it is recommended to use a measurement instrument including questions about both paid and unpaid productivity, such as the iMTA Productivity Cost Questionnaire (iPCQ) or the Valuation of Lost Productivity (VOLP). We indicate how to apply the friction cost and the human capital approach and give practical guidance on deriving final cost estimates.

  11. A fast and accurate frequency estimation algorithm for sinusoidal signal with harmonic components

    NASA Astrophysics Data System (ADS)

    Hu, Jinghua; Pan, Mengchun; Zeng, Zhidun; Hu, Jiafei; Chen, Dixiang; Tian, Wugang; Zhao, Jianqiang; Du, Qingfa

    2016-10-01

    Frequency estimation is a fundamental problem in many applications, such as traditional vibration measurement, power system supervision, and microelectromechanical system sensor control. In this paper, a fast and accurate frequency estimation algorithm is proposed to deal with the low efficiency of traditional methods. The proposed algorithm consists of coarse and fine frequency estimation steps, and we demonstrate that applying a modified zero-crossing technique is more efficient than conventional searching methods for the coarse frequency estimation (locating the peak of the FFT amplitude spectrum). Thus, the proposed estimation algorithm requires fewer hardware and software resources and achieves even higher efficiency as the experimental data increase. Experimental results with a modulated magnetic signal show that the root mean square error of frequency estimation is below 0.032 Hz with the proposed algorithm, which has lower computational complexity and better global performance than conventional frequency estimation methods.
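
    The coarse-then-fine structure described above can be sketched as follows: a coarse estimate from the FFT peak, then a fine estimate from linearly interpolated zero crossings. This is a generic illustration of the two steps, not the authors' modified zero-crossing technique; the signal parameters are invented.

```python
import numpy as np

# Two-step frequency estimate: FFT peak (coarse), zero crossings (fine).
fs, t_end = 1000.0, 2.0
t = np.arange(0.0, t_end, 1.0 / fs)
f_true = 37.31
x = (np.sin(2 * np.pi * f_true * t)
     + 0.05 * np.random.default_rng(1).normal(size=t.size))

# Coarse step: peak of the FFT amplitude spectrum (0.5 Hz resolution here).
spec = np.abs(np.fft.rfft(x))
f_coarse = np.fft.rfftfreq(x.size, 1.0 / fs)[np.argmax(spec)]

# Fine step: linearly interpolated upward zero crossings give the period.
idx = np.where((x[:-1] < 0) & (x[1:] >= 0))[0]
t_cross = t[idx] - x[idx] * (t[idx + 1] - t[idx]) / (x[idx + 1] - x[idx])
f_fine = 1.0 / np.mean(np.diff(t_cross))

print(f"coarse {f_coarse:.2f} Hz, fine {f_fine:.3f} Hz (true {f_true} Hz)")
```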

  12. Fuel Cost Estimation for Sumatra Grid System

    NASA Astrophysics Data System (ADS)

    Liun, Edwaren

    2010-06-01

    Sumatra has had a high growth rate of electricity demand since the first decade of this century; in the middle of that decade the growth was 11% per annum. On the other hand, the capability of the Government of Indonesia cq. the PLN authority is limited, while many of the existing power plants are old and will be retired. The electricity demand growth of Sumatra will increase fuel consumption over the next several decades. Several cases with varying growth scenarios and economic parameters show that some kinds of fossil fuel will continue to be required for several decades to come. Although Sumatra has abundant coal resources, other fuel types such as fuel oil, diesel, gas and nuclear are needed. In the Base Scenario with a discount rate of 10%, the Sumatra system will require 11.6 million tonnes of coal until 2030, producing 866 TWh at a cost of US$10,558 million. Nuclear plants produce about 501 TWh, or 32%, at a cost of US$3.1 billion. In the High Scenario with a discount rate of 10%, coal consumption becomes 486.6 million tonnes, with a fuel cost of US$12.7 billion, producing 1033 TWh of electricity. The nuclear fuel cost required in this scenario is US$7.06 billion. The other fuel consumed in large amounts is natural gas for combined cycle plants, at a cost of US$1.38 billion, producing 11.7 TWh of electricity in the Base Scenario with a discount rate of 10%. In the High Scenario with a discount rate of 10%, coal plants take the leading role in power generation in Sumatra, producing about 866 TWh, or 54%, of the electricity. Coal consumption will be highest in the Base Scenario with a discount rate of 12%, producing 756 TWh at a cost of US$17.1 billion. Nuclear plants are not applicable in this scenario due to their lack of competitiveness. The fuel cost will depend on the role of nuclear power in the Sumatra system; fuel cost will increase in correspondence with increasing coal consumption in the cases where nuclear power plants do not appear.

  13. Cost estimate guidelines for advanced nuclear power technologies

    SciTech Connect

    Delene, J.G.; Hudson, C.R. II.

    1990-03-01

    To make comparative assessments of competing technologies, consistent ground rules must be applied when developing cost estimates. This document provides a uniform set of assumptions, ground rules, and requirements that can be used in developing cost estimates for advanced nuclear power technologies. 10 refs., 8 figs., 32 tabs.

  14. Cost estimate guidelines for advanced nuclear power technologies

    SciTech Connect

    Hudson, C.R. II

    1986-07-01

    To make comparative assessments of competing technologies, consistent ground rules must be applied when developing cost estimates. This document provides a uniform set of assumptions, ground rules, and requirements that can be used in developing cost estimates for advanced nuclear power technologies.

  15. Estimating Instantaneous Energetic Cost During Gait Adaptation

    DTIC Science & Technology

    2014-08-31

    Energetic cost, in this context, refers to the input energy required to power the cellular processes underlying the body's movement. This energy is... entering the body is allowed to reach equilibrium with the rate at which cellular processes are consuming it. By averaging over minutes of data

  16. Review of storage battery system cost estimates

    SciTech Connect

    Brown, D.R.; Russell, J.A.

    1986-04-01

    Cost analyses for zinc bromine, sodium sulfur, and lead acid batteries were reviewed. Zinc bromine and sodium sulfur batteries were selected because of their advanced design nature and the high level of interest in these two technologies. Lead acid batteries were included to establish a baseline representative of a more mature technology.

  17. Retrofit FGD cost-estimating guidelines. Final report. [6 processes

    SciTech Connect

    Shattuck, D.M.; Ireland, P.A.; Keeth, R.J.; Mora, R.R.; Scheck, R.W.; Archambeault, J.A.; Rathbun, G.R.

    1984-10-01

    This report presents a method to estimate specific plant FGD retrofit costs. The basis of the estimate is a new plant's FGD system cost, as provided in EPRI's Economic Evaluation of FGD Systems CS-3342, or any other generalized cost estimate. The methodology adjusts the capital cost for the sulfur content of the coal, sulfur removal required, unit size, geographic location variables, and retrofit considerations. The methodology also allows the user to calculate first year operating and maintenance (O and M) costs based on site-specific variables. Finally, the report provides a means to adjust for remaining unit life in determining the levelized busbar cost. Levelized cost is presented in mills/kWh and $/t SO2 removed.
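
    The adjustment methodology lends itself to a simple multiplicative sketch: start from a generalized new-plant FGD capital cost and scale it by site-specific factors. The factor values below are placeholders for illustration, not the report's actual adjustment tables.

```python
# Factor-adjustment sketch: base cost times site-specific multipliers.
# All factor values are placeholder assumptions.

def retrofit_capital_cost(base_cost_musd,
                          sulfur_factor=1.10,    # higher-sulfur coal
                          removal_factor=1.05,   # required SO2 removal
                          size_factor=0.95,      # unit-size economies
                          location_factor=1.08,  # regional labor/materials
                          retrofit_factor=1.30): # congestion, tie-ins
    f = (sulfur_factor * removal_factor * size_factor
         * location_factor * retrofit_factor)
    return base_cost_musd * f

print(f"adjusted capital cost: {retrofit_capital_cost(100.0):.1f} M$")
```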

  18. Sample Size Requirements for Accurate Estimation of Squared Semi-Partial Correlation Coefficients.

    ERIC Educational Resources Information Center

    Algina, James; Moulder, Bradley C.; Moser, Barry K.

    2002-01-01

    Studied the sample size requirements for accurate estimation of squared semi-partial correlation coefficients through simulation studies. Results show that the sample size necessary for adequate accuracy depends on: (1) the population squared multiple correlation coefficient (ρ²); (2) the population increase in ρ²; and (3) the…

  19. Cost Estimation of Naval Ship Acquisition.

    DTIC Science & Technology

    1983-12-01

    tradeoffs in the design effort. 2. Provide a base for cost/effectiveness review of performance specifications. 3. Provide information useful in the... development setbacks such as engineering and design specification changes and other items that are not identifiable at the time of design. Industrial... and recording the specific information that is of value to the analyst." [Ref. 3] There are two basic categories of data that must be collected

  20. Estimates of costs by DRG in Sydney teaching hospitals: an application of the Yale cost model.

    PubMed

    Palmer, G; Aisbett, C; Fetter, R; Winchester, L; Reid, B; Rigby, E

    1991-01-01

    The results of a first round of costing by DRG at seven major teaching hospital sites in Sydney using the Yale cost model are reported. These results, when compared between the hospitals and with values of relative costs by DRG from the United States, indicate that the cost modelling procedure has produced credible and potentially useful estimates of casemix costs. The rationale and underlying theory of cost modelling are explained, and the need for further work to improve the method of allocating costs to DRGs, and to improve the cost centre definitions currently used by the hospitals, is emphasised.

  1. Estimating Software-Development Costs With Greater Accuracy

    NASA Technical Reports Server (NTRS)

    Baker, Dan; Hihn, Jairus; Lum, Karen

    2008-01-01

    COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve their estimates of the costs of developing software for their projects. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development-effort-estimation techniques is used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performance of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
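
    The exhaustive parameter search mentioned above can be illustrated with a COCOMO-style effort model, effort = a * KSLOC^b: grid-search (a, b) against historical projects, then report both the estimate and its calibration error. The model form, grid ranges and data are assumptions for illustration, not the '2cee' methodology itself.

```python
import itertools
import numpy as np

# Grid-search calibration of a COCOMO-style model against history;
# history pairs are (KSLOC, actual person-months), invented for the sketch.
history = np.array([(12, 40.0), (45, 170.0), (90, 410.0), (8, 25.0)])

def mean_rel_error(a, b):
    """In-sample mean relative error of effort = a * KSLOC^b."""
    errs = [abs(a * k ** b - actual) / actual for k, actual in history]
    return np.mean(errs)

grid = itertools.product(np.linspace(1.0, 5.0, 41), np.linspace(0.8, 1.3, 51))
a_best, b_best = min(grid, key=lambda ab: mean_rel_error(*ab))

new_ksloc = 60
print(f"a={a_best:.2f} b={b_best:.3f}; "
      f"estimate {a_best * new_ksloc ** b_best:.0f} person-months, "
      f"calibration error {mean_rel_error(a_best, b_best):.1%}")
```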

  2. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.

  3. Do We Know Whether Researchers and Reviewers are Estimating Risk and Benefit Accurately?

    PubMed

    Hey, Spencer Phillips; Kimmelman, Jonathan

    2016-10-01

    Accurate estimation of risk and benefit is integral to good clinical research planning, ethical review, and study implementation. Some commentators have argued that various actors in clinical research systems are prone to biased or arbitrary risk/benefit estimation. In this commentary, we suggest the evidence supporting such claims is very limited. Most prior work has imputed risk/benefit beliefs based on past behavior or goals, rather than directly measuring them. We describe an approach, forecast analysis, that would enable a direct and effective measure of the quality of risk/benefit estimation. We then consider some objections and limitations to the forecasting approach.

  4. Towards an accurate estimation of the isosteric heat of adsorption - A correlation with the potential theory.

    PubMed

    Askalany, Ahmed A; Saha, Bidyut B

    2017-03-15

    Accurate estimation of the isosteric heat of adsorption is mandatory for good modeling of adsorption processes. In this paper a thermodynamic formalism for the adsorbed phase volume, which is a function of adsorption pressure and temperature, is proposed for precise estimation of the isosteric heat of adsorption. The isosteric heat of adsorption estimated using the new correlation has been compared with measured values for several prudently selected adsorbent-refrigerant pairs from the open literature. Results showed that the proposed isosteric heat of adsorption correlation fits the experimentally measured values better than the Clausius-Clapeyron equation.
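
    For reference, the classical baseline the paper compares against is the Clausius-Clapeyron form of the isosteric heat, evaluated at constant uptake w (the paper's own correlation, which corrects this form with an adsorbed-phase-volume term, is not reproduced here):

```latex
% Clausius-Clapeyron form of the isosteric heat at constant uptake w.
\[
  Q_{st} = -R \left( \frac{\partial \ln P}{\partial (1/T)} \right)_{\!w}
         = R\,T^{2} \left( \frac{\partial \ln P}{\partial T} \right)_{\!w}
\]
```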

  5. Estimating lifetime healthcare costs with morbidity data

    PubMed Central

    2013-01-01

    Background In many developed countries, the economic crisis that started in 2008 produced a serious contraction of the financial resources spent on healthcare. Identifying which individuals will require more resources, and the moment in their lives these resources have to be allocated, becomes essential. It is well known that a small number of individuals with complex healthcare needs consume a high percentage of health expenditures. Conversely, little is known about how morbidity evolves throughout life. The aim of this study is to introduce a longitudinal perspective to chronic disease management. Methods Data used relate to the population of the county of Baix Empordà in Catalonia for the period 2004–2007 (average population was N = 88,858). The database included individual information on morbidity, resource consumption, costs and activity records. The population was classified using the Clinical Risk Groups (CRG) model. Future morbidity evolution was simulated under different assumptions using a stationary Markov chain. We obtained morbidity patterns for the lifetime and the distribution function of the random variable lifetime costs. Individual information on acute episodes, chronic conditions and multimorbidity patterns was included in the model. Results The probability of having a specific health status in the future (healthy, acute process or different combinations of chronic illness) and the distribution function of healthcare costs over the individual lifetime were obtained for the sample population. The mean lifetime cost for women was €111,936, a third higher than for men, at €81,566 (all amounts calculated in 2007 Euros). Healthy life expectancy at birth for females was 46.99 years, lower than for males (50.22). Females also spent 28.41 years of life suffering from some type of chronic disease, a longer period than men (21.9). Conclusions Future morbidity and whole-population costs can be reasonably predicted, combining stochastic microsimulation with a
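
    The stationary-Markov-chain simulation described in the Methods can be sketched directly: individuals move yearly between morbidity states according to a transition matrix, each state accrues an annual cost, and lifetime cost is the sum until absorption in death. The states, transition probabilities and costs below are invented and far simpler than the CRG classification.

```python
import numpy as np

# Stationary Markov chain over morbidity states with per-state annual
# costs; all numbers are illustrative placeholders.
rng = np.random.default_rng(7)

states = ["healthy", "acute", "chronic", "dead"]
P = np.array([[0.90, 0.06, 0.03, 0.01],
              [0.70, 0.15, 0.13, 0.02],
              [0.00, 0.05, 0.90, 0.05],
              [0.00, 0.00, 0.00, 1.00]])   # "dead" is absorbing
annual_cost = np.array([200.0, 1500.0, 4000.0, 0.0])  # EUR, assumed

def lifetime_cost(max_years=100):
    s, total = 0, 0.0
    for _ in range(max_years):
        total += annual_cost[s]
        s = rng.choice(4, p=P[s])
        if states[s] == "dead":
            break
    return total

sims = np.array([lifetime_cost() for _ in range(20_000)])
print(f"mean lifetime cost {sims.mean():,.0f} EUR; "
      f"90th percentile {np.percentile(sims, 90):,.0f} EUR")
```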

  6. Fuzzy case based reasoning in sports facilities unit cost estimating

    NASA Astrophysics Data System (ADS)

    Zima, Krzysztof

    2016-06-01

    This article presents an example of estimating costs in the early phase of a project using fuzzy case-based reasoning. A fragment of a database containing descriptions and unit costs of sports facilities is shown, together with the formulas used in the Case-Based Reasoning method. The article presents similarity measurement using several formulas, including fuzzy similarity. The outcome of cost calculations based on the CBR method is presented as a fuzzy number for the unit cost of construction work.
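
    A minimal sketch of fuzzy case-based retrieval in this spirit: crisp attributes contribute a 0/1 match, numeric attributes contribute a triangular fuzzy membership, and the weighted sum ranks the case base. Attributes, weights and cases are invented; the article's own formulas and database are not reproduced.

```python
# Fuzzy CBR retrieval sketch with one crisp and one fuzzy attribute.

def tri_membership(x, lo, peak, hi):
    """Triangular fuzzy similarity: 1 at peak, falling to 0 at lo/hi."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (peak - lo) if x < peak else (hi - x) / (hi - peak)

def similarity(query, case, weights):
    s = weights["type"] * (1.0 if query["type"] == case["type"] else 0.0)
    s += weights["area"] * tri_membership(case["area_m2"],
                                          0.7 * query["area_m2"],
                                          query["area_m2"],
                                          1.3 * query["area_m2"])
    return s / sum(weights.values())

case_base = [
    {"type": "sports hall", "area_m2": 1200, "unit_cost": 1450.0},
    {"type": "sports hall", "area_m2": 2400, "unit_cost": 1300.0},
    {"type": "pool",        "area_m2": 1100, "unit_cost": 2100.0},
]
query = {"type": "sports hall", "area_m2": 1350}
weights = {"type": 0.5, "area": 0.5}

best = max(case_base, key=lambda c: similarity(query, c, weights))
print(f"most similar case unit cost: {best['unit_cost']} per m2")
```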

  7. The unit cost factors and calculation methods for decommissioning - Cost estimation of nuclear research facilities

    SciTech Connect

    Kwan-Seong Jeong; Dong-Gyu Lee; Chong-Hun Jung; Kune-Woo Lee

    2007-07-01

    Available in abstract form only. Full text of publication follows: The uncertainties in decommissioning costs are high due to several conditions. Decommissioning cost estimation depends on the complexity of the nuclear installations and their site-specific physical and radiological inventories. Therefore, the decommissioning costs of nuclear research facilities must be estimated in accordance with the detailed sub-tasks and resources of the decommissioning activities. By selecting the classified activities and resources, costs are calculated item by item, and the total costs of all decommissioning activities are then reshuffled to match their usage and objectives. The decommissioning cost of nuclear research facilities is calculated by applying a unit cost factor method based on a classification of decommissioning work fitted to the features and specifications of the decommissioning objects and on established composition factors. Decommissioning costs of nuclear research facilities are composed of labor costs, equipment costs and materials costs. Of these three categories, the calculation of labor costs is the most important because decommissioning activities depend mainly on the labor force. Labor costs are calculated on the basis of the working time consumed on decommissioning objects and tasks; the working times are figured from unit cost factors and work difficulty factors, and labor costs are then computed using these factors as calculation parameters. The accuracy of the decommissioning cost estimation results is much higher when compared against real decommissioning work. (authors)

  8. Estimating the Life Cycle Cost of Space Systems

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    A space system's Life Cycle Cost (LCC) includes design and development, launch and emplacement, and operations and maintenance. Each of these cost factors is usually estimated separately. NASA uses three different parametric models for the design and development cost of crewed space systems; the commercial PRICE-H space hardware cost model, the NASA-Air Force Cost Model (NAFCOM), and the Advanced Missions Cost Model (AMCM). System mass is an important parameter in all three models. System mass also determines the launch and emplacement cost, which directly depends on the cost per kilogram to launch mass to Low Earth Orbit (LEO). The launch and emplacement cost is the cost to launch to LEO the system itself and also the rockets, propellant, and lander needed to emplace it. The ratio of the total launch mass to payload mass depends on the mission scenario and destination. The operations and maintenance costs include any material and spares provided, the ground control crew, and sustaining engineering. The Mission Operations Cost Model (MOCM) estimates these costs as a percentage of the system development cost per year.
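
    The three-part sum described above can be put into a toy formula, keeping system mass as the driving parameter. Every number below (development $/kg, launch price, gear ratio, operations fraction) is a placeholder assumption, not a value from PRICE-H, NAFCOM, AMCM or MOCM.

```python
# LCC = development + launch/emplacement + operations, driven by mass.
# All coefficients are illustrative placeholders.

def life_cycle_cost(system_mass_kg,
                    dev_cost_per_kg=1.0e6,     # development CER proxy, $/kg
                    launch_price_per_kg=5000.0,# $/kg to LEO (assumed)
                    gear_ratio=3.0,            # launch mass per payload mass
                    ops_fraction_per_year=0.05,# ops as share of dev cost
                    years=10):
    development = dev_cost_per_kg * system_mass_kg
    launch = launch_price_per_kg * system_mass_kg * gear_ratio
    operations = ops_fraction_per_year * development * years
    return development + launch + operations

print(f"LCC for a 2000 kg system: ${life_cycle_cost(2000):,.0f}")
```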

  9. On the accurate estimation of gap fraction during daytime with digital cover photography

    NASA Astrophysics Data System (ADS)

    Hwang, Y. R.; Ryu, Y.; Kimm, H.; Macfarlane, C.; Lang, M.; Sonnentag, O.

    2015-12-01

    Digital cover photography (DCP) has emerged as an indirect method to obtain gap fraction accurately. Thus far, however, the intervention of subjectivity, such as determining the camera relative exposure value (REV) and the threshold in the histogram, has hindered the computation of accurate gap fraction. Here we propose a novel method that enables us to measure gap fraction accurately during daytime under various sky conditions by DCP. The novel method computes gap fraction using a single unsaturated DCP raw image which is corrected for scattering effects by canopies, together with a sky image reconstructed from the raw-format image. To test the sensitivity of the gap fraction derived by the novel method to diverse REVs, solar zenith angles and canopy structures, we took photos at one-hour intervals between sunrise and midday under dense and sparse canopies with REV 0 to -5. The novel method showed little variation of gap fraction across different REVs in both dense and sparse canopies across a diverse range of solar zenith angles. The perforated panel experiment, which was used to test the accuracy of the estimated gap fraction, confirmed that the novel method produced accurate and consistent gap fractions across different hole sizes, gap fractions and solar zenith angles. These findings highlight that the novel method opens new opportunities to estimate gap fraction accurately during daytime from sparse to dense canopies, which will be useful for monitoring LAI precisely and validating satellite remote sensing LAI products efficiently.

  10. 48 CFR 252.215-7002 - Cost estimating system requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... the estimating methods and rationale used in developing cost estimates and budgets; (v) Provide for... management systems; and (4) Is subject to applicable financial control systems. Estimating system means the Contractor's policies, procedures, and practices for budgeting and planning controls, and...

  11. Chromatography paper as a low-cost medium for accurate spectrophotometric assessment of blood hemoglobin concentration.

    PubMed

    Bond, Meaghan; Elguea, Carlos; Yan, Jasper S; Pawlowski, Michal; Williams, Jessica; Wahed, Amer; Oden, Maria; Tkaczyk, Tomasz S; Richards-Kortum, Rebecca

    2013-06-21

    Anemia affects a quarter of the world's population, and a lack of appropriate diagnostic tools often prevents treatment in low-resource settings. Though the HemoCue 201+ is an appropriate device for diagnosing anemia in low-resource settings, the high cost of disposables ($0.99 per test in Malawi) limits its availability. We investigated using spectrophotometric measurement of blood spotted on chromatography paper as a low-cost (<$0.01 per test) alternative to HemoCue cuvettes. For this evaluation, donor blood was diluted with plasma to simulate anemia, a micropipette spotted blood on paper, and a bench-top spectrophotometer validated the approach before the development of a low-cost reader. We optimized impregnating paper with chemicals to lyse red blood cells, paper type, drying time, wavelengths measured, and sensitivity to variations in volume of blood, and we validated our approach using patient samples. Lysing the blood cells with sodium deoxycholate dried in Whatman Chr4 chromatography paper gave repeatable results, and the absorbance difference between 528 nm and 656 nm was stable over time in measurements taken up to 10 min after sample preparation. The method was insensitive to the amount of blood spotted on the paper over the range of 5 μL to 25 μL. We created a low-cost, handheld reader to measure the transmission of paper cuvettes at these optimal wavelengths. Training and validating our method with patient samples on both the spectrometer and the handheld reader showed that both devices are accurate to within 2 g dL(-1) of the HemoCue device for 98% and 95% of samples, respectively.

  12. Accurate Estimation of the Entropy of Rotation-Translation Probability Distributions.

    PubMed

    Fogolari, Federico; Dongmo Foumthuim, Cedrix Jurgal; Fortuna, Sara; Soler, Miguel Angel; Corazza, Alessandra; Esposito, Gennaro

    2016-01-12

    The estimation of rotational and translational entropies in the context of ligand binding has been the subject of long-time investigations. The high dimensionality (six) of the problem and the limited amount of sampling often prevent the required resolution to provide accurate estimates by the histogram method. Recently, the nearest-neighbor distance method has been applied to the problem, but the solutions provided either address rotation and translation separately, therefore lacking correlations, or use a heuristic approach. Here we address rotational-translational entropy estimation in the context of nearest-neighbor-based entropy estimation, solve the problem numerically, and provide an exact and an approximate method to estimate the full rotational-translational entropy.

  13. [Guidelines for Accurate and Transparent Health Estimates Reporting: the GATHER Statement].

    PubMed

    Stevens, Gretchen A; Alkema, Leontine; Black, Robert E; Boerma, J Ties; Collins, Gary S; Ezzati, Majid; Grove, John T; Hogan, Daniel R; Hogan, Margaret C; Horton, Richard; Lawn, Joy E; Marušic, Ana; Mathers, Colin D; Murray, Christopher J L; Rudan, Igor; Salomon, Joshua A; Simpson, Paul J; Vos, Theo; Welch, Vivian

    2017-01-01

    Measurements of health indicators are rarely available for every population and period of interest, and available data may not be comparable. The Guidelines for Accurate and Transparent Health Estimates Reporting (GATHER) define best reporting practices for studies that calculate health estimates for multiple populations (in time or space) using multiple information sources. Health estimates that fall within the scope of GATHER include all quantitative population-level estimates (including global, regional, national, or subnational estimates) of health indicators, including indicators of health status, incidence and prevalence of diseases, injuries, and disability and functioning; and indicators of health determinants, including health behaviours and health exposures. GATHER comprises a checklist of 18 items that are essential for best reporting practice. A more detailed explanation and elaboration document, describing the interpretation and rationale of each reporting item along with examples of good reporting, is available on the GATHER website (http://gather-statement.org).

  14. 16 CFR 305.5 - Determinations of estimated annual energy consumption, estimated annual operating cost, and...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... consumption, estimated annual operating cost, and energy efficiency rating, and of water use rate. 305.5... energy efficiency rating, and of water use rate. (a) Procedures for determining the estimated annual energy consumption, the estimated annual operating costs, the energy efficiency ratings, and the...

  15. A cost estimation model for high power FELs

    SciTech Connect

    Neil, G.R.

    1995-12-31

    A cost estimation model has been developed for assessing the impact of system-level design choices when scaling high-average-power superconducting-accelerator-based FELs. The model consists of a number of modules that develop subsystem costs and derive, as an economic criterion, the cost per kilojoule of light produced. The model does not include design engineering or development costs, but represents the second through nth device. The paper presents the relative sensitivity of designs to power and linac frequency while allowing the operating temperature of the superconducting cavities to be optimized.

  16. Polynomial fitting of DT-MRI fiber tracts allows accurate estimation of muscle architectural parameters.

    PubMed

    Damon, Bruce M; Heemskerk, Anneriet M; Ding, Zhaohua

    2012-06-01

    Fiber curvature is a functionally significant muscle structural property, but its estimation from diffusion-tensor magnetic resonance imaging fiber tracking data may be confounded by noise. The purpose of this study was to investigate the use of polynomial fitting of fiber tracts for improving the accuracy and precision of fiber curvature (κ) measurements. Simulated image data sets were created in order to provide data with known values for κ and pennation angle (θ). Simulations were designed to test the effects of increasing inherent fiber curvature (3.8, 7.9, 11.8 and 15.3 m(-1)), signal-to-noise ratio (50, 75, 100 and 150) and voxel geometry (13.8- and 27.0-mm(3) voxel volume with isotropic resolution; 13.5-mm(3) volume with an aspect ratio of 4.0) on κ and θ measurements. In the originally reconstructed tracts, θ was estimated accurately under most curvature and all imaging conditions studied; however, the estimates of κ were imprecise and inaccurate. Fitting the tracts to second-order polynomial functions provided accurate and precise estimates of κ for all conditions except very high curvature (κ=15.3 m(-1)), while preserving the accuracy of the θ estimates. Similarly, polynomial fitting of in vivo fiber tracking data reduced the κ values of fitted tracts from those of unfitted tracts and did not change the θ values. Polynomial fitting of fiber tracts allows accurate estimation of physiologically reasonable values of κ, while preserving the accuracy of θ estimation.
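
    A minimal sketch of the smoothing step described above: fit each coordinate of a noisy 3D tract to a second-order polynomial in a path parameter, then evaluate κ from the fitted derivatives via κ = |r' × r''| / |r'|³. The synthetic arc and noise level below are invented stand-ins for DT-MRI tracking output.

        import numpy as np

        def fitted_curvature(points, order=2):
            # Parameterize the tract by cumulative chord length
            seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
            t = np.concatenate([[0.0], np.cumsum(seg)])
            polys = [np.polynomial.Polynomial.fit(t, points[:, i], order)
                     for i in range(3)]
            d1 = np.array([p.deriv(1)(t) for p in polys]).T   # r'(t)
            d2 = np.array([p.deriv(2)(t) for p in polys]).T   # r''(t)
            return (np.linalg.norm(np.cross(d1, d2), axis=1)
                    / np.linalg.norm(d1, axis=1) ** 3)

        # Noisy circular arc of radius 0.125 m (true kappa is about 8 m^-1)
        theta = np.linspace(0.0, 0.6, 40)
        arc = np.column_stack([0.125 * np.cos(theta), 0.125 * np.sin(theta),
                               0.01 * theta])
        arc += np.random.default_rng(1).normal(scale=2e-4, size=arc.shape)
        print(fitted_curvature(arc).mean())   # close to 8 despite the noise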

  17. Polynomial Fitting of DT-MRI Fiber Tracts Allows Accurate Estimation of Muscle Architectural Parameters

    PubMed Central

    Damon, Bruce M.; Heemskerk, Anneriet M.; Ding, Zhaohua

    2012-01-01

    Fiber curvature is a functionally significant muscle structural property, but its estimation from diffusion-tensor MRI fiber tracking data may be confounded by noise. The purpose of this study was to investigate the use of polynomial fitting of fiber tracts for improving the accuracy and precision of fiber curvature (κ) measurements. Simulated image datasets were created in order to provide data with known values for κ and pennation angle (θ). Simulations were designed to test the effects of increasing inherent fiber curvature (3.8, 7.9, 11.8, and 15.3 m−1), signal-to-noise ratio (50, 75, 100, and 150), and voxel geometry (13.8 and 27.0 mm3 voxel volume with isotropic resolution; 13.5 mm3 volume with an aspect ratio of 4.0) on κ and θ measurements. In the originally reconstructed tracts, θ was estimated accurately under most curvature and all imaging conditions studied; however, the estimates of κ were imprecise and inaccurate. Fitting the tracts to 2nd order polynomial functions provided accurate and precise estimates of κ for all conditions except very high curvature (κ=15.3 m−1), while preserving the accuracy of the θ estimates. Similarly, polynomial fitting of in vivo fiber tracking data reduced the κ values of fitted tracts from those of unfitted tracts and did not change the θ values. Polynomial fitting of fiber tracts allows accurate estimation of physiologically reasonable values of κ, while preserving the accuracy of θ estimation. PMID:22503094

  18. Measuring nonlinear oscillations using a very accurate and low-cost linear optical position transducer

    NASA Astrophysics Data System (ADS)

    Donoso, Guillermo; Ladera, Celso L.

    2016-09-01

    An accurate linear optical displacement transducer with about 0.2 mm resolution over a range of ∼40 mm is presented. This device consists of a stack of thin cellulose acetate strips, each strip slid longitudinally ∼0.5 mm over the preceding one, so that one end of the stack becomes a stepped wedge of constant step. A narrow light beam from a white LED, orthogonally incident, crosses the wedge at a known point, and the transmitted intensity is detected with a phototransistor whose emitter is connected to a diode. We present an analytical proof that the voltage across the diode depends linearly on the ordinate of the point where the light beam falls on the wedge, together with the experimental validation of this result. Applications to nonlinear oscillations are then presented, including the case of a body moving under dry friction and the more advanced case of an oscillator in a quartic energy potential, whose time-varying positions were accurately measured with our transducer. Our sensing device can resolve the dynamics of an object attached to it with great accuracy and precision, at a cost considerably less than that of a linear neutral-density wedge. The technique used to assemble the wedge of acetate strips is described.

  19. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate

    PubMed Central

    Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul

    2015-01-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics, and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out, participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost per dog of $0.47, which is the price threshold below which the cost of the incentive must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold-standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage, respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives, the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere. PMID:26633821

  20. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate.

    PubMed

    Minyoo, Abel B; Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul; Lankester, Felix

    2015-12-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics, and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out, participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost per dog of $0.47, which is the price threshold below which the cost of the incentive must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold-standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage, respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives, the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere.

  1. Improving The Discipline of Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Piland, William M.; Pine, David J.; Wilson, Delano M.

    2000-01-01

    The need to improve the quality and accuracy of cost estimates of proposed new aerospace systems has been widely recognized. The industry has done the best job of maintaining related capability with improvements in estimation methods and giving appropriate priority to the hiring and training of qualified analysts. Some parts of Government, and National Aeronautics and Space Administration (NASA) in particular, continue to need major improvements in this area. Recently, NASA recognized that its cost estimation and analysis capabilities had eroded to the point that the ability to provide timely, reliable estimates was impacting the confidence in planning many program activities. As a result, this year the Agency established a lead role for cost estimation and analysis. The Independent Program Assessment Office located at the Langley Research Center was given this responsibility. This paper presents the plans for the newly established role. Described is how the Independent Program Assessment Office, working with all NASA Centers, NASA Headquarters, other Government agencies, and industry, is focused on creating cost estimation and analysis as a professional discipline that will be recognized equally with the technical disciplines needed to design new space and aeronautics activities. Investments in selected, new analysis tools, creating advanced training opportunities for analysts, and developing career paths for future analysts engaged in the discipline are all elements of the plan. Plans also include increasing the human resources available to conduct independent cost analysis of Agency programs during their formulation, to improve near-term capability to conduct economic cost-benefit assessments, to support NASA management's decision process, and to provide cost analysis results emphasizing "full-cost" and "full-life cycle" considerations. The Agency cost analysis improvement plan has been approved for implementation starting this calendar year. Adequate financial

  2. Space Station Furnace Facility. Volume 3: Program cost estimate

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The approach used to estimate costs for the Space Station Furnace Facility (SSFF) is based on a computer program developed internally at Teledyne Brown Engineering (TBE). The program produces time-phased estimates of cost elements for each hardware component, based on experience with similar components. Engineering estimates of the degree of similarity or difference between the current project and the historical data are then used to adjust the computer-produced cost estimate and to fit it to the current project Work Breakdown Structure (WBS). The SSFF concept as presented at the Requirements Definition Review (RDR) was used as the base configuration for the cost estimate. The program incorporates data on the costs of previous projects and the allocation of those costs to the components of one of three time-phased generic WBSs. Input consists of a list of similar components for which cost data exist; the number of interfaces, with their type and complexity; identification of the extent to which previous designs are applicable; and programmatic data concerning schedules and miscellaneous items (travel, off-site assignments). Output is program cost in labor hours and material dollars for each component, broken down by generic WBS task and program schedule phase.
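
    The adjustment-and-roll-up logic described above can be pictured in a few lines: a historical analog cost is scaled by an engineering similarity judgment, supplemented by interface-driven effort, and summed across the WBS. All component names and numbers below are hypothetical, not SSFF data.

        # Historical analog costs (labor hours) and similarity adjustments
        analog_hours = {"furnace core": 5200.0, "power conditioner": 3100.0}
        similarity = {"furnace core": 1.15,      # judged more complex than analog
                      "power conditioner": 0.90} # judged simpler than analog

        def component_estimate(name, n_interfaces, hours_per_interface=40.0):
            return (analog_hours[name] * similarity[name]
                    + n_interfaces * hours_per_interface)

        wbs = {"furnace core": 6, "power conditioner": 3}  # interface counts
        total = sum(component_estimate(c, n) for c, n in wbs.items())
        print(f"estimated labor hours: {total:,.0f}")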

  3. COSTMODL: An automated software development cost estimation tool

    NASA Technical Reports Server (NTRS)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, in both the public and private sectors. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
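
    The recalibration idea is the key point, and a toy version is easy to state: hold the size exponent of a COCOMO-family equation fixed and refit the multiplicative constant to the organization's own completed projects. The abstract does not give COSTMODL's actual equations, so the model form and coefficients below are purely illustrative.

        import numpy as np

        def effort_pm(ksloc, a=2.94, b=1.10, multipliers=1.0):
            # Person-months = a * KSLOC^b * product of cost-driver multipliers
            return a * ksloc ** b * multipliers

        def recalibrate_a(hist_ksloc, hist_effort, b=1.10):
            # Refit the constant 'a' to local history with 'b' held fixed:
            # log-space mean of effort / size^b over completed projects
            ratios = np.log(hist_effort) - b * np.log(hist_ksloc)
            return float(np.exp(ratios.mean()))

        a_local = recalibrate_a(np.array([10.0, 45.0, 120.0]),
                                np.array([38.0, 180.0, 520.0]))
        print(a_local, effort_pm(60.0, a=a_local))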

  4. A Low-Cost, Accurate, and High-Precision Fluid Dispensing System for Microscale Application.

    PubMed

    Das, Champak; Wang, Guochun; Nguyen, Chien

    2017-04-01

    We present here the development of a low-cost, accurate, and precise fluid dispensing system. It can be used with a peristaltic or any other pump to improve the flow characteristics. The dispensing system has a range of 1 to 100 µL, with an accuracy of ~99.5% and a standard deviation of ~150 nL over the entire range. The system does not depend on the accuracy or precision of the driving pump; therefore, any positive displacement pump can be used to achieve similar accuracy and precision, which provides an opportunity to reduce the cost of the system. The dispensing system does not require periodic calibration and can also be miniaturized for microfluidic applications. Although primarily designed for aqueous liquids, it can be extended to various nonconductive liquids with modifications. The unit is further used for near real-time measurement of lactate from microdialysate. The individual components can easily be made disposable or sterilized for use in biomedical applications.

  5. 48 CFR 9904.401 - Cost accounting standard-consistency in estimating, accumulating and reporting costs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Cost accounting standard-consistency in estimating, accumulating and reporting costs. 9904.401 Section 9904.401 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF...

  6. 48 CFR 9904.401 - Cost accounting standard-consistency in estimating, accumulating and reporting costs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Cost accounting standard-consistency in estimating, accumulating and reporting costs. 9904.401 Section 9904.401 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF...

  7. 48 CFR 9904.401 - Cost accounting standard-consistency in estimating, accumulating and reporting costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Cost accounting standard-consistency in estimating, accumulating and reporting costs. 9904.401 Section 9904.401 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF...

  8. 48 CFR 9904.401 - Cost accounting standard-consistency in estimating, accumulating and reporting costs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Cost accounting standard-consistency in estimating, accumulating and reporting costs. 9904.401 Section 9904.401 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF...

  9. A Parametric Cost Model for Estimating Acquisition Costs of Conventional U.S. Navy Surface Ships.

    DTIC Science & Technology

    1999-09-01

    techniques return cost estimating relationships able to predict average procurement cost from ship light displacement, ship overall length, ship ... propulsion shaft horsepower or number of propulsion engines. The formulated parametric cost model is approximate and appropriate only for rough order of

  10. Costing for the Future: Exploring Cost Estimation With Unmanned Autonomous Systems

    DTIC Science & Technology

    2016-04-30

    Symposium program excerpt: "Controlling Costs: The 6-3-5 Method—Case Studies at NAVSEA and NATO" (Bruce Nagy, President/CEO, and Morgan Ames, Senior Advisor, Catalyst Technologies); "Costing for the Future: Exploring Cost Estimation With Unmanned Autonomous Systems" (Ricardo Valerdi).

  11. Psychometrics for the Cost Conscious: Using Discriminant Analysis to Refine Estimates of Program Cost.

    ERIC Educational Resources Information Center

    Coffin, Raymond J.

    This presentation uses data from a long-term juvenile diversion counseling program to illustrate a sample method of estimating program cost. Several problems in using average cost figures are first presented. Then a method is described in which discriminant analysis can be used to refine cost comparisons. This makes it possible to produce…

  12. Commercial Crew Cost Estimating - A Look at Estimating Processes, Challenges and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Battle, Rick; Cole, Lance

    2015-01-01

    To support annual PPBE budgets and NASA HQ requests for cost information for commercial crew transportation to the International Space Station (ISS), the NASA ISS ACES team developed system development and per-flight cost estimates for the potential providers for each annual PPBE submission from 2009 to 2014. This paper describes the cost estimating processes used, the challenges encountered, and the lessons learned in developing estimates for this key NASA project, which departed from the traditional procurement approach and used a new way of doing business.

  13. Econometric estimation of country-specific hospital costs.

    PubMed

    Adam, Taghreed; Evans, David B; Murray, Christopher JL

    2003-02-26

    Information on the unit cost of inpatient and outpatient care is an essential element for costing, budgeting and economic-evaluation exercises. Many countries lack reliable estimates, however. WHO has recently undertaken an extensive effort to collect and collate data on the unit cost of hospitals and health centres from as many countries as possible; so far, data have been assembled from 49 countries, for various years during the period 1973-2000. The database covers a total of 2173 country-years of observations. Large gaps remain, however, particularly for developing countries. Although the long-term solution is that all countries perform their own costing studies, the question arises whether it is possible to predict unit costs for different countries in a standardized way for short-term use. The purpose of the work described in this paper, a modelling exercise, was to use the data collected across countries to predict unit costs in countries for which data are not yet available, with the appropriate uncertainty intervals. The model presented here forms part of a series of models used to estimate unit costs for the WHO-CHOICE project. The methods and the results of the model, however, may be used to predict a number of different types of country-specific unit costs, depending on the purpose of the exercise. They may be used, for instance, to estimate the costs per bed-day at different capacity levels; the "hotel" component of cost per bed-day; or unit costs net of particular components such as drugs. In addition to reporting estimates for selected countries, the paper shows that unit costs of hospitals vary within countries, sometimes by an order of magnitude. Basing cost-effectiveness studies or budgeting exercises on the results of a study of a single facility, or even a small group of facilities, is likely to be misleading.
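
    A sketch of the general approach such models take, assuming a log-linear specification: regress log unit cost on country covariates, then predict for a country without data, applying a crude lognormal retransformation. The covariates, coefficients, and data below are fabricated for illustration and are not the WHO-CHOICE model.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200
        log_gdp = rng.normal(8.5, 1.0, n)          # log GDP per capita
        occupancy = rng.uniform(0.4, 0.95, n)      # bed occupancy rate
        log_cost = 0.9 * log_gdp - 0.8 * occupancy - 3.0 + rng.normal(0.0, 0.3, n)

        # Ordinary least squares on the log scale
        X = np.column_stack([np.ones(n), log_gdp, occupancy])
        beta, *_ = np.linalg.lstsq(X, log_cost, rcond=None)
        resid_var = np.var(log_cost - X @ beta)

        # Predict cost per bed-day for an out-of-sample country; the 0.5*var
        # term is a crude lognormal retransformation (smearing would be better)
        x_new = np.array([1.0, 7.2, 0.6])
        print(np.exp(x_new @ beta + 0.5 * resid_var))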

  14. Robust and accurate fundamental frequency estimation based on dominant harmonic components.

    PubMed

    Nakatani, Tomohiro; Irino, Toshio

    2004-12-01

    This paper presents a new method for robust and accurate fundamental frequency (F0) estimation in the presence of background noise and spectral distortion. Degree of dominance and dominance spectrum are defined based on instantaneous frequencies. The degree of dominance allows one to evaluate the magnitude of individual harmonic components of the speech signals relative to background noise while reducing the influence of spectral distortion. The fundamental frequency is more accurately estimated from reliable harmonic components which are easy to select given the dominance spectra. Experiments are performed using white and babble background noise with and without spectral distortion as produced by a SRAEN filter. The results show that the present method is better than previously reported methods in terms of both gross and fine F0 errors.
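
    The sketch below conveys the harmonic-selection idea in a deliberately simplified form: spectral peaks standing well above the noise floor are treated as dominant components, and F0 is chosen by a grid search over candidates, averaging peak-frequency/harmonic-number ratios. The paper's instantaneous-frequency machinery is replaced here by a plain windowed FFT, so this is an illustration of the concept, not the published method.

        import numpy as np

        def f0_from_dominant_harmonics(x, fs, f0_lo=70.0, f0_hi=400.0, snr_db=10.0):
            spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
            freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
            floor = np.median(spec)                      # crude noise-floor proxy
            peaks = [i for i in range(1, len(spec) - 1)
                     if spec[i] > spec[i - 1] and spec[i] > spec[i + 1]
                     and 20 * np.log10(spec[i] / floor) > snr_db]
            best, best_weight = None, 0.0
            for cand in np.arange(f0_lo, f0_hi, 1.0):    # coarse grid over F0
                est = weight = 0.0
                saw_fundamental = False
                for i in peaks:
                    k = int(round(freqs[i] / cand))
                    if k >= 1 and abs(freqs[i] - k * cand) < 0.1 * cand:
                        est += spec[i] * freqs[i] / k    # peak "votes" for F0
                        weight += spec[i]
                        saw_fundamental = saw_fundamental or k == 1
                if saw_fundamental and weight > best_weight:
                    best, best_weight = est / weight, weight
            return best

        fs = 16000
        t = np.arange(4096) / fs
        x = sum(np.sin(2 * np.pi * 155.0 * k * t) / k for k in range(1, 6))
        x += np.random.default_rng(2).normal(0.0, 0.3, t.size)
        print(f0_from_dominant_harmonics(x, fs))         # approximately 155 Hz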

  15. Software cost estimation using class point metrics (CPM)

    NASA Astrophysics Data System (ADS)

    Ghode, Aditi; Periyasamy, Kasilingam

    2011-12-01

    Estimating the cost of a software project is one of the most important tasks in maintaining software reliability. Many cost estimation models have been reported, but most have significant drawbacks due to rapid changes in technology. For example, Source Lines Of Code (SLOC) can only be counted once software construction is complete. The Function Point (FP) metric is deficient in handling object-oriented technology, as it was designed for procedural languages such as COBOL. Since object-oriented programming became a popular development practice, most software companies have adopted the Unified Modeling Language (UML). The objective of this research is to develop a new cost estimation model that applies class diagrams to software cost estimation.

  16. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merion M.

    2002-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
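
    The stated thrust-to-labor assumption is easy to illustrate: a low-order polynomial maps thrust to labor hours per test, which then feed a simple cost roll-up. The polynomial coefficients and rates below are invented, since the abstract gives the functional form but not the fitted values.

        import numpy as np

        # Hypothetical cubic: labor hours per test as a function of thrust (klbf)
        labor_hours = np.polynomial.Polynomial([1200.0, 8.5, 0.12, -1.5e-4])

        def test_project_cost(thrust_klbf, n_tests,
                              rate_per_hour=95.0, fixed_cost=250_000.0):
            return fixed_cost + n_tests * labor_hours(thrust_klbf) * rate_per_hour

        print(f"${test_project_cost(thrust_klbf=300.0, n_tests=12):,.0f}")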

  17. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merlon M.

    2004-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.

  18. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merlon M.

    2003-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.

  19. Development of Star Tracker System for Accurate Estimation of Spacecraft Attitude

    DTIC Science & Technology

    2009-12-01

    Naval Postgraduate School thesis by Jack A. Tappe, December 2009. Thesis co-advisors: Jae Jun Kim and Brij N. Agrawal; department chairman: Knox T. Millsaps. Only front matter and acknowledgments survive in this indexed excerpt.

  20. A method to accurately estimate the muscular torques of human wearing exoskeletons by torque sensors.

    PubMed

    Hwang, Beomsoo; Jeon, Doyoung

    2015-04-09

    In exoskeletal robots, quantifying the user's muscular effort is important for recognizing the user's motion intentions and evaluating motor abilities. In this paper, we attempt to estimate users' muscular efforts accurately using joint torque sensors, whose measurements include the dynamic effects of the human body, such as the inertial, Coriolis, and gravitational torques, as well as the torque produced by active muscular effort. It is therefore important to extract the dynamic effects of the user's limb accurately from the measured torque. The user's limb dynamics are formulated, and a convenient method of identifying user-specific parameters is suggested for estimating the user's muscular torque in robotic exoskeletons. Experiments were carried out on a wheelchair-integrated lower-limb exoskeleton, EXOwheel, which was equipped with torque sensors in the hip and knee joints. The proposed methods were evaluated with 10 healthy participants during body-weight-supported gait training. The experimental results show that the torque sensors can estimate the muscular torque accurately under both relaxed and activated muscle conditions.
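
    The torque decomposition underlying the method can be sketched directly: the joint sensor reads the total torque, so the muscular contribution is what remains after subtracting modeled limb dynamics. A one-degree-of-freedom limb model with hypothetical parameters stands in for the paper's formulation.

        import numpy as np

        def muscular_torque(tau_measured, theta, theta_dot, theta_ddot,
                            inertia=0.35, mass=3.5, com_dist=0.22,
                            damping=0.05, g=9.81):
            # Subtract the modeled limb dynamics from the sensed joint torque;
            # whatever remains is attributed to active muscular effort.
            tau_inertial = inertia * theta_ddot
            tau_gravity = mass * g * com_dist * np.sin(theta)
            tau_damping = damping * theta_dot
            return tau_measured - tau_inertial - tau_gravity - tau_damping

        # Example: sensor reads 9.0 N*m at 30 degrees flexion during swing
        print(muscular_torque(9.0, np.deg2rad(30.0), 1.2, 4.0))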

  1. The Acquisition Cost-Estimating Workforce. Census and Characteristics

    DTIC Science & Technology

    2009-01-01

    Abbreviations: AAC, Air Armament Center; ACAT, acquisition category; ACEIT, Automated Cost Estimating Integrated Tools; AF, Air Force; AFB, Air Force Base; AFCAA, Air Force Cost Analysis Agency. The surviving excerpt also tabulates analysts' training sources (AFIT, ACEIT, Tecolote training, the contracting agency that employed them, other, or none), with 29 percent of respondents reporting having received no training.

  2. Accurate Attitude Estimation Using ARS under Conditions of Vehicle Movement Based on Disturbance Acceleration Adaptive Estimation and Correction

    PubMed Central

    Xing, Li; Hang, Yijun; Xiong, Zhi; Liu, Jianye; Wan, Zhong

    2016-01-01

    This paper describes a disturbance-acceleration adaptive estimation and correction approach for an attitude reference system (ARS), intended to improve attitude estimation precision under vehicle movement conditions. The proposed approach relies on a Kalman filter in which the attitude error, the gyroscope zero-offset error, and the disturbance acceleration error are estimated. By switching the filter decay coefficient of the disturbance acceleration model between acceleration modes, the disturbance acceleration is adaptively estimated and corrected, improving the attitude estimation precision. The filter was tested by digital simulation in three disturbance acceleration modes (non-acceleration, vibration-acceleration, and sustained-acceleration), and the approach was also tested in a kinematic vehicle experiment. The simulations and vehicle experiments show that the disturbance acceleration in each mode can be accurately estimated and corrected. Compared with a complementary filter, the experimental results demonstrate that the proposed approach further improves attitude estimation precision under vehicle movement conditions. PMID:27754469

  3. Estimating Costs for Development of Candidate Performance Evaluation Procedures.

    ERIC Educational Resources Information Center

    Payne, David A.

    This paper contains cost unit tables and instructions for their use in estimating the total cost of evaluating a given instructional objective or group of objectives. Included is a list of analytical procedures to be followed in the development of any device to evaluate student performance, (e.g., a unit exam in child development or an attitude…

  4. Estimating design costs for first-of-a-kind projects

    SciTech Connect

    Banerjee, Bakul; /Fermilab

    2006-03-01

    Modern scientific facilities are often outcomes of projects that are first-of-a-kind, that is, minimal historical data are available for project costs and schedules. However, at Fermilab, there was an opportunity to execute two similar projects consecutively. In this paper, a comparative study of the design costs for these two projects is presented using earned value methodology. This study provides some insights into how to estimate the cost of a replicated project.
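
    For reference, the earned-value quantities such a comparison rests on reduce to a few ratios; the dollar figures below are invented. A CPI below 1 means the design work is costing more than planned, and dividing the budget at completion by CPI gives a simple estimate at completion.

        def earned_value_metrics(bcws, bcwp, acwp, bac):
            cpi = bcwp / acwp   # cost performance index (earned / actual)
            spi = bcwp / bcws   # schedule performance index (earned / planned)
            eac = bac / cpi     # simple estimate at completion
            return cpi, spi, eac

        cpi, spi, eac = earned_value_metrics(bcws=1.20e6, bcwp=1.05e6,
                                             acwp=1.30e6, bac=4.00e6)
        print(f"CPI={cpi:.2f}  SPI={spi:.2f}  EAC=${eac:,.0f}")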

  5. Direct estimation of the cost effectiveness of tornado shelters.

    PubMed

    Simmons, Kevin M; Sutter, Daniel

    2006-08-01

    This article estimates the cost effectiveness of tornado shelters using the annual probability of a tornado strike and new data on fatalities per building struck. This approach differs from recent estimates of the cost effectiveness of tornado shelters in Reference 1, which use historical casualties. Historical casualties combine both tornado risk and resident action: if residents of tornado-prone states take greater precautions, observed fatalities might not be much higher than in states with lower risk. Estimation using the tornado probability avoids this potential bias. Despite the very different method used, the estimated costs per life saved are $68 million for permanent homes and $6.0 million for mobile homes in Oklahoma using a 3% real discount rate, within about 10% of estimates based on historical fatalities. The findings suggest that shelters provide cost-effective protection for mobile homes in the most tornado-prone states but not for permanent homes.
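
    The probability-based calculation can be sketched as follows: annualize the shelter cost with a capital recovery factor, then divide by the expected annual lives saved (strike probability times fatalities averted per struck home). The inputs below are illustrative placeholders, not the article's data.

        def cost_per_life_saved(shelter_cost, lifetime_yr, discount,
                                p_strike_per_yr, fatalities_averted_per_strike):
            # Capital recovery factor spreads the up-front cost over the lifetime
            crf = (discount * (1 + discount) ** lifetime_yr
                   / ((1 + discount) ** lifetime_yr - 1))
            annual_cost = shelter_cost * crf
            annual_lives_saved = p_strike_per_yr * fatalities_averted_per_strike
            return annual_cost / annual_lives_saved

        # Illustrative inputs: $2,500 shelter, 50-year life, 3% real discount rate
        print(f"${cost_per_life_saved(2500.0, 50, 0.03, 4e-5, 0.3):,.0f} per life saved")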

  6. Improving the Discipline of Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Piland, William M.; Pine, David J.; Wilson, Delano M.

    2000-01-01

    The need to improve the quality and accuracy of cost estimates of proposed new aerospace systems has been widely recognized. The industry has done the best job of maintaining related capability with improvements in estimation methods and giving appropriate priority to the hiring and training of qualified analysts. Some parts of Government, and National Aeronautics and Space Administration (NASA) in particular, continue to need major improvements in this area. Recently, NASA recognized that its cost estimation and analysis capabilities had eroded to the point that the ability to provide timely, reliable estimates was impacting the confidence in planning many program activities. As a result, this year the Agency established a lead role for cost estimation and analysis. The Independent Program Assessment Office located at the Langley Research Center was given this responsibility.

  7. Hydrogen Production Cost Estimate Using Biomass Gasification: Independent Review

    SciTech Connect

    Ruth, M.

    2011-10-01

    This independent review presents conclusions drawn from data collection, document reviews, interviews, and deliberation conducted from December 2010 through April 2011 on the technical potential and cost of hydrogen production using biomass gasification. The Panel reviewed the current H2A case (Version 2.12, Case 01D) for hydrogen production via biomass gasification and identified four principal components of hydrogen levelized cost: CapEx, feedstock costs, project financing structure, and efficiency/hydrogen yield. The Panel reexamined the assumptions around these components and arrived at new estimates and approaches that better reflect the current technology and business environments.

  8. Cost estimation for unmanned lunar and planetary programs

    NASA Technical Reports Server (NTRS)

    Dunkin, J. H.; Pekar, P. R.; Spadoni, D. J.; Stone, C. A.

    1973-01-01

    A basic model is presented for estimating the cost of unmanned lunar and planetary programs. Cost data were collected and analyzed for eight lunar and planetary programs. Total cost was separated into the following components: labor, overhead, materials, and technical support. The study determined that direct labor cost of unmanned lunar and planetary programs comprises 30 percent of the total program cost. Twelve program categories were defined for modeling: six spacecraft subsystem categories (science, structure, propulsion, electrical power, communications, and guidance) and six program-level categories (integration, test and quality assurance, launch and flight operations, ground equipment, systems analysis and engineering, and program management). An analysis showed that, on a percentage basis, direct labor cost and direct labor man-hours compare on a one-to-one ratio. Therefore, direct labor hours is used as the parameter for predicting cost, with the advantage of eliminating the effect of inflation on the analysis.

  9. Accurate estimation of object location in an image sequence using helicopter flight data

    NASA Technical Reports Server (NTRS)

    Tang, Yuan-Liang; Kasturi, Rangachar

    1994-01-01

    In autonomous navigation, it is essential to obtain a three-dimensional (3D) description of the static environment in which the vehicle is traveling. For a rotorcraft conducting low-altitude flight, this description is particularly useful for obstacle detection and avoidance. In this paper, we address the problem of 3D position estimation for static objects from a monocular sequence of images captured from a low-altitude flying helicopter. Since the environment is static, it is well known that the optical flow in the image will produce a radiating pattern from the focus of expansion. We propose a motion analysis system which utilizes the epipolar constraint to accurately estimate 3D positions of scene objects in a real-world image sequence taken from a low-altitude flying helicopter. Results show that this approach gives good estimates of object positions near the rotorcraft's intended flight path.

  10. Estimating the Effective Permittivity for Reconstructing Accurate Microwave-Radar Images

    PubMed Central

    Lavoie, Benjamin R.; Okoniewski, Michal; Fear, Elise C.

    2016-01-01

    We present preliminary results from a method for estimating the optimal effective permittivity for reconstructing microwave-radar images. Using knowledge of how microwave-radar images are formed, we identify characteristics that are typical of good images, and define a fitness function to measure the relative image quality. We build a polynomial interpolant of the fitness function in order to identify the most likely permittivity values of the tissue. To make the estimation process more efficient, the polynomial interpolant is constructed using a locally and dimensionally adaptive sampling method that is a novel combination of stochastic collocation and polynomial chaos. Examples, using a series of simulated, experimental and patient data collected using the Tissue Sensing Adaptive Radar system, which is under development at the University of Calgary, are presented. These examples show how, using our method, accurate images can be reconstructed starting with only a broad estimate of the permittivity range. PMID:27611785

  11. Estimating the economic and social costs of dementia in Ireland.

    PubMed

    Connolly, Sheelah; Gillespie, Paddy; O'Shea, Eamon; Cahill, Suzanne; Pierce, Maria

    2014-01-01

    Dementia is a costly condition and one that differs from other conditions in the significant cost burden placed on informal caregivers. The aim of this analysis was to estimate the economic and social costs of dementia in Ireland in 2010. With an estimate of 41,470 people with dementia, the total baseline annual cost was found to be over €1.69 billion, 48% of which was attributable to the opportunity cost of informal care provided by family and friends and 43% to residential care. Due to the impact of demographic ageing in the coming decades and the expected increase in the number of people with dementia, family caregivers and the general health and social care system will come under increasing pressure to provide adequate levels of care. Without a significant increase in the amount of resources devoted to dementia, it is unclear how the system will cope in the future.

  12. 48 CFR 9905.501 - Cost accounting standard-consistency in estimating, accumulating and reporting costs by...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...-consistency in estimating, accumulating and reporting costs by educational institutions. 9905.501 Section 9905... ACCOUNTING STANDARDS FOR EDUCATIONAL INSTITUTIONS 9905.501 Cost accounting standard—consistency in estimating, accumulating and reporting costs by educational institutions....

  13. 48 CFR 9905.501 - Cost accounting standard-consistency in estimating, accumulating and reporting costs by...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...-consistency in estimating, accumulating and reporting costs by educational institutions. 9905.501 Section 9905... ACCOUNTING STANDARDS FOR EDUCATIONAL INSTITUTIONS 9905.501 Cost accounting standard—consistency in estimating, accumulating and reporting costs by educational institutions....

  14. 48 CFR 9905.501 - Cost accounting standard-consistency in estimating, accumulating and reporting costs by...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...-consistency in estimating, accumulating and reporting costs by educational institutions. 9905.501 Section 9905... ACCOUNTING STANDARDS FOR EDUCATIONAL INSTITUTIONS 9905.501 Cost accounting standard—consistency in estimating, accumulating and reporting costs by educational institutions....

  15. A Low-Cost Modular Platform for Heterogeneous Data Acquisition with Accurate Interchannel Synchronization.

    PubMed

    Blanco-Claraco, José Luis; López-Martínez, Javier; Torres-Moreno, José Luis; Giménez-Fernández, Antonio

    2015-10-27

    Most experimental fields of science and engineering require the use of data acquisition systems (DAQ), devices in charge of sampling and converting electrical signals into digital data and, typically, performing all of the required signal preconditioning. Since commercial DAQ systems are normally focused on specific types of sensors and actuators, systems engineers may need to employ mutually-incompatible hardware from different manufacturers in applications demanding heterogeneous inputs and outputs, such as small-signal analog inputs, differential quadrature rotatory encoders or variable current outputs. A common undesirable side effect of heterogeneous DAQ hardware is the lack of an accurate synchronization between samples captured by each device. To solve such a problem with low-cost hardware, we present a novel modular DAQ architecture comprising a base board and a set of interchangeable modules. Our main design goal is the ability to sample all sources at predictable, fixed sampling frequencies, with a reduced synchronization mismatch (<1 µs) between heterogeneous signal sources. We present experiments in the field of mechanical engineering, illustrating vibration spectrum analyses from piezoelectric accelerometers and, as a novelty in these kinds of experiments, the spectrum of quadrature encoder signals. Part of the design and software will be publicly released online.

  16. A Low-Cost Modular Platform for Heterogeneous Data Acquisition with Accurate Interchannel Synchronization

    PubMed Central

    Blanco-Claraco, José Luis; López-Martínez, Javier; Torres-Moreno, José Luis; Giménez-Fernández, Antonio

    2015-01-01

    Most experimental fields of science and engineering require the use of data acquisition systems (DAQ), devices in charge of sampling and converting electrical signals into digital data and, typically, performing all of the required signal preconditioning. Since commercial DAQ systems are normally focused on specific types of sensors and actuators, systems engineers may need to employ mutually-incompatible hardware from different manufacturers in applications demanding heterogeneous inputs and outputs, such as small-signal analog inputs, differential quadrature rotatory encoders or variable current outputs. A common undesirable side effect of heterogeneous DAQ hardware is the lack of an accurate synchronization between samples captured by each device. To solve such a problem with low-cost hardware, we present a novel modular DAQ architecture comprising a base board and a set of interchangeable modules. Our main design goal is the ability to sample all sources at predictable, fixed sampling frequencies, with a reduced synchronization mismatch (<1 μs) between heterogeneous signal sources. We present experiments in the field of mechanical engineering, illustrating vibration spectrum analyses from piezoelectric accelerometers and, as a novelty in these kinds of experiments, the spectrum of quadrature encoder signals. Part of the design and software will be publicly released online. PMID:26516865

  17. Estimating the costs of landslide damage in the United States

    USGS Publications Warehouse

    Fleming, Robert W.; Taylor, Fred A.

    1980-01-01

    Landslide damages are one of the most costly natural disasters in the United States. A recent estimate of the total annual cost of landslide damage is in excess of $1 billion (Schuster, 1978). The damages can be significantly reduced, however, through the combined action of technical experts, government, and the public. Before they can be expected to take action, local governments need to have an appreciation of costs of damage in their areas of responsibility and of the reductions in losses that can be achieved. Where studies of cost of landslide damages have been conducted, it is apparent that (1) costs to the public and private sectors of our economy due to landslide damage are much larger than anticipated; (2) taxpayers and public officials generally are unaware of the magnitude of the cost, owing perhaps to the lack of any centralization of data; and (3) incomplete records and unavailability of records result in lower reported costs than actually were incurred. The U.S. Geological Survey has developed a method to estimate the cost of landslide damages in regional and local areas and has applied the method in three urban areas and one rural area. Costs are for different periods and are unadjusted for inflation; therefore, strict comparisons of data from different years should be avoided. Estimates of the average annual cost of landslide damage for the urban areas studied are $5,900,000 in the San Francisco Bay area; $4,000,000 in Allegheny County, Pa.; and $5,170,000 in Hamilton County, Ohio. Adjusting these figures for the population of each area, the annual cost of damages per capita are $1.30 in the nine-county San Francisco Bay region; $2.50 in Allegheny County, Pa.; and $5.80 in Hamilton County, Ohio. On the basis of data from other sources, the estimated annual damages on a per capita basis for the City of Los Angeles, Calif., are about $1.60. If the costs were available for the damages from landslides in Los Angeles in 1977-78 and 1979-80, the annual per

  18. Intraocular lens power estimation by accurate ray tracing for eyes underwent previous refractive surgeries

    NASA Astrophysics Data System (ADS)

    Yang, Que; Wang, Shanshan; Wang, Kai; Zhang, Chunyu; Zhang, Lu; Meng, Qingyu; Zhu, Qiudong

    2015-08-01

    For normal eyes without a history of ocular surgery, traditional equations for calculating intraocular lens (IOL) power, such as SRK/T, Holladay, Haigis, and SRK-II, are all relatively accurate. However, for eyes that have undergone refractive surgeries such as LASIK, or eyes diagnosed with keratoconus, these equations may produce significant postoperative refractive error, which can lead to poor satisfaction after cataract surgery. Although some methods have been proposed to address this problem, such as the Haigis-L equation[1] or using preoperative (pre-LASIK) data to estimate the K value[2], no precise equations are available for these eyes. Here, we introduce a novel intraocular lens power estimation method based on accurate ray tracing with the optical design software ZEMAX. Instead of using traditional regression formulas, we adopt the measured corneal elevation distribution, central corneal thickness, anterior chamber depth, axial length, and estimated effective lens plane as the input parameters. The calculated intraocular lens powers for a patient with keratoconus and a post-LASIK patient agreed well with their visual outcomes after cataract surgery.

  19. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied in ophthalmology, cardiology, and other fields. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging to improve the signal-to-noise ratio (SNR) and the contrast of phase retardation (or birefringence) images introduces a noise bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for quantitative studies. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved image quality, the fundamental limitation of the nonlinear dependency of phase retardation and birefringence on SNR was not overcome, so the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for Jones matrix OCT (JM-OCT) [1]. Based on this, we had developed a maximum a posteriori (MAP) estimator, and quantitative birefringence imaging was demonstrated [2]. However, this first version of the estimator had a theoretical shortcoming: it did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved version of the MAP estimator which takes the stochastic property of SNR into account. This estimator uses a probability distribution function (PDF) of the true local retardation, which is proportional to birefringence, under a specific set of measurements of birefringence and SNR. The PDF was pre-computed by a Monte Carlo (MC) simulation based on the mathematical model of JM-OCT before the measurement. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and in vivo measurements of anterior and

  20. Accurate estimation of cardinal growth temperatures of Escherichia coli from optimal dynamic experiments.

    PubMed

    Van Derlinden, E; Bernaerts, K; Van Impe, J F

    2008-11-30

    Prediction of the microbial growth rate as a response to changing temperatures is an important aspect of controlling food safety and food spoilage. Accurate model predictions of the microbial evolution require correct model structures and reliable parameter values with good statistical quality. Given the widely accepted validity of the Cardinal Temperature Model with Inflection (CTMI) [Rosso, L., Lobry, J. R., Bajard, S. and Flandrois, J. P., 1995. Convenient model to describe the combined effects of temperature and pH on microbial growth, Applied and Environmental Microbiology, 61: 610-616], this paper focuses on the accurate estimation of its four parameters (T(min), T(opt), T(max) and μ(opt)) by applying the technique of optimal experiment design for parameter estimation (OED/PE). This secondary model describes the influence of temperature on the microbial specific growth rate from the minimum to the maximum temperature for growth. Dynamic temperature profiles are optimized within two temperature regions ([15 °C, 43 °C] and [15 °C, 45 °C]), focusing on the minimization of the parameter estimation (co)variance (D-optimal design). The optimal temperature profiles are implemented in a computer-controlled bioreactor, and the CTMI parameters are identified from the resulting experimental data. Approximately equal CTMI parameter values were derived irrespective of the temperature region, except for T(max), which could only be estimated accurately from the optimal experiments within [15 °C, 45 °C]. This observation underlines the importance of selecting the upper temperature constraint for OED/PE as close as possible to the true T(max). Cardinal temperature estimates resulting from designs within [15 °C, 45 °C] correspond with values found in the literature, are characterized by a small uncertainty, and yield a good result during validation. As compared to estimates from non-optimized dynamic
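
    For context, the CTMI secondary model whose parameters are being estimated is itself compact; the sketch below evaluates it with parameter values of typical E. coli magnitude, chosen for illustration rather than taken from the study.

        import numpy as np

        def ctmi(T, t_min=4.0, t_opt=41.0, t_max=47.0, mu_opt=2.3):
            # Cardinal Temperature Model with Inflection (Rosso et al., 1995);
            # parameter values are illustrative, not the study's estimates.
            T = np.asarray(T, dtype=float)
            num = (T - t_max) * (T - t_min) ** 2
            den = (t_opt - t_min) * ((t_opt - t_min) * (T - t_opt)
                                     - (t_opt - t_max) * (t_opt + t_min - 2 * T))
            return np.where((T > t_min) & (T < t_max), mu_opt * num / den, 0.0)

        # Growth rate rises to mu_opt at t_opt and collapses near t_max
        print(ctmi([15.0, 41.0, 45.0]))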

  1. Cost functions to estimate a posteriori probabilities in multiclass problems.

    PubMed

    Cid-Sueiro, J; Arribas, J I; Urbán-Muñoz, S; Figueiras-Vidal, A R

    1999-01-01

    The problem of designing cost functions to estimate a posteriori probabilities in multiclass problems is addressed in this paper. We establish necessary and sufficient conditions that these costs must satisfy in one-class one-output networks whose outputs are consistent with probability laws. We focus our attention on a particular subset of these cost functions: those satisfying two especially useful properties, symmetry and separability (well-known cost functions, such as the quadratic cost and the cross entropy, are particular cases in this subset). Finally, we present a universal stochastic gradient learning rule for single-layer networks, in the sense of minimizing a general version of these cost functions, for a wide family of nonlinear activation functions.
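
    The core fact the paper builds on, that minimizing cross entropy drives network outputs toward the true posteriors, can be checked on a toy problem where the posterior is known in closed form. The sketch below trains a single logistic unit on two equal-variance Gaussian classes, for which the true posterior is a logistic function of x with weight 2 and bias 0.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 20000
        y = rng.integers(0, 2, n)                      # two equiprobable classes
        x = rng.normal(loc=2.0 * y - 1.0, scale=1.0)   # N(-1,1) and N(+1,1)

        # One logistic output trained by full-batch gradient descent on cross entropy
        w, b, lr = 0.0, 0.0, 0.5
        for _ in range(1000):
            p = 1.0 / (1.0 + np.exp(-(w * x + b)))     # network output
            w -= lr * np.mean((p - y) * x)             # d(cross entropy)/dw
            b -= lr * np.mean(p - y)                   # d(cross entropy)/db

        # For these class conditionals the true posterior is sigmoid(2x + 0),
        # so the trained output approximates P(y=1 | x)
        print(w, b)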

  2. Cost estimates for membrane filtration and conventional treatment

    SciTech Connect

    Wiesner, M.R.; Hackney, J.; Sethi, S.; Jacangelo, J.G.; Laine, J.M. (Lyonnaise des Eaux)

    1994-12-01

    Costs of several ultrafiltration and nanofiltration processes are compared with the cost of conventional liquid-solid separation with and without GAC adsorption for small water treatment facilities. Data on raw-water quality, permeate flux, recovery, frequency of backflushing, and chemical dosage obtained from a pilot study were used with a previously developed model for membrane costs to calculate anticipated capital and operating costs for each instance. Data from the US Environmental Protection Agency were used to estimate conventional treatment costs. All of the membrane process calculations showed comparable or lower total costs per unit volume treated compared with conventional treatment for small facilities (< 200,000 m(3)/d or about 5 mgd). Membrane processes may offer small facilities a less expensive alternative for the removal of particles and organic materials from drinking water.

  3. READSCAN: a fast and scalable pathogen discovery program with accurate genome relative abundance estimation

    PubMed Central

    Rashid, Mamoon; Pain, Arnab

    2013-01-01

    Summary: READSCAN is a highly scalable parallel program to identify non-host sequences (of potential pathogen origin) and estimate their genome relative abundance in high-throughput sequence datasets. READSCAN accurately classified human and viral sequences in a simulated dataset of 20.1 million reads in <27 min using a small Beowulf compute cluster with 16 nodes (Supplementary Material). Availability: http://cbrc.kaust.edu.sa/readscan Contact: arnab.pain@kaust.edu.sa or raeece.naeem@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23193222

  4. 48 CFR 1552.216-76 - Estimated cost and cost-sharing.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Estimated cost and cost-sharing. 1552.216-76 Section 1552.216-76 Federal Acquisition Regulations System ENVIRONMENTAL PROTECTION AGENCY CLAUSES AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and...

  5. The estimation of tumor cell percentage for molecular testing by pathologists is not accurate.

    PubMed

    Smits, Alexander J J; Kummer, J Alain; de Bruin, Peter C; Bol, Mijke; van den Tweel, Jan G; Seldenrijk, Kees A; Willems, Stefan M; Offerhaus, G Johan A; de Weger, Roel A; van Diest, Paul J; Vink, Aryan

    2014-02-01

    Molecular pathology is becoming more and more important in present day pathology. A major challenge for any molecular test is its ability to reliably detect mutations in samples consisting of mixtures of tumor cells and normal cells, especially when the tumor content is low. The minimum percentage of tumor cells required to detect genetic abnormalities is a major variable. Information on tumor cell percentage is essential for a correct interpretation of the result. In daily practice, the percentage of tumor cells is estimated by pathologists on hematoxylin and eosin (H&E)-stained slides, the reliability of which has been questioned. This study aimed to determine the reliability of estimated tumor cell percentages in tissue samples by pathologists. On 47 H&E-stained slides of lung tumors a tumor area was marked. The percentage of tumor cells within this area was estimated independently by nine pathologists, using categories of 0-5%, 6-10%, 11-20%, 21-30%, and so on, until 91-100%. As gold standard, the percentage of tumor cells was counted manually. On average, the range between the lowest and the highest estimate per sample was 6.3 categories. In 33% of estimates, the deviation from the gold standard was at least three categories. The mean absolute deviation was 2.0 categories (range between observers 1.5-3.1 categories). There was a significant difference between the observers (P<0.001). If 20% of tumor cells were considered the lower limit to detect a mutation, samples with an insufficient tumor cell percentage (<20%) would have been estimated to contain enough tumor cells in 27/72 (38%) observations, possibly causing false negative results. In conclusion, estimates of tumor cell percentages on H&E-stained slides are not accurate, which could result in misinterpretation of test results. Reliability could possibly be improved by using a training set with feedback.
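
    The category scheme and the deviation-in-categories metric described above can be reproduced in a few lines; the sketch below uses invented counts, not the study's data.

```python
import numpy as np

def pct_to_category(pct):
    """Map a tumor-cell percentage to the paper's bins: 0-5%, 6-10%,
    11-20%, 21-30%, ..., 91-100% (categories numbered 0 to 10)."""
    if pct <= 5:
        return 0
    if pct <= 10:
        return 1
    return 2 + (int(pct) - 11) // 10

# Invented example: gold-standard counts vs. one pathologist's estimates.
gold = [4, 18, 35, 62, 90]
estimated = [12, 30, 25, 75, 85]
dev = [abs(pct_to_category(e) - pct_to_category(g))
       for e, g in zip(estimated, gold)]
print("deviation in categories:", dev)
print("mean absolute deviation: %.1f categories" % np.mean(dev))
```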

  6. Toward an Accurate Estimate of the Exfoliation Energy of Black Phosphorus: A Periodic Quantum Chemical Approach.

    PubMed

    Sansone, Giuseppe; Maschio, Lorenzo; Usvyat, Denis; Schütz, Martin; Karttunen, Antti

    2016-01-07

    The black phosphorus (black-P) crystal is formed of covalently bound layers of phosphorene stacked together by weak van der Waals interactions. An experimental measurement of the exfoliation energy of black-P is not available presently, making theoretical studies the most important source of information for the optimization of phosphorene production. Here, we provide an accurate estimate of the exfoliation energy of black-P on the basis of multilevel quantum chemical calculations, which include the periodic local Møller-Plesset perturbation theory of second order, augmented by higher-order corrections, which are evaluated with finite clusters mimicking the crystal. Very similar results are also obtained by density functional theory with the D3-version of Grimme's empirical dispersion correction. Our estimate of the exfoliation energy for black-P of -151 meV/atom is substantially larger than that of graphite, suggesting the need for different strategies to generate isolated layers for these two systems.

  7. Utilizing Expert Knowledge in Estimating Future STS Costs

    NASA Technical Reports Server (NTRS)

    Fortner, David B.; Ruiz-Torres, Alex J.

    2004-01-01

    A method of estimating the costs of future space transportation systems (STSs) involves classical activity-based cost (ABC) modeling combined with systematic utilization of the knowledge and opinions of experts to extend the process-flow knowledge of existing systems to systems that involve new materials and/or new architectures. The expert knowledge is particularly helpful in filling gaps that arise in computational models of processes because of inconsistencies in historical cost data. Heretofore, the costs of planned STSs have been estimated following a "top-down" approach that tends to force the architectures of new systems to incorporate process flows like those of the space shuttles. In this ABC-based method, one makes assumptions about the processes, but otherwise follows a "bottom-up" approach that does not force the new system architecture to incorporate a space-shuttle-like process flow. Prototype software has been developed to implement this method. Through further development of software, it should be possible to extend the method beyond the space program to almost any setting in which there is a need to estimate the costs of a new system and to extend the applicable knowledge base in order to make the estimate.
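
    A minimal sketch of such a bottom-up ABC roll-up, with expert-elicited triangular estimates filling gaps in the historical data, is shown below; all activities, hours, and rates are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical process flow: (activity, labor hours, hourly rate) where
# historical data exist, and expert-elicited (low, most likely, high)
# triangular hour estimates where they do not.
measured = [("vehicle integration", 12000.0, 85.0),
            ("propulsion checkout", 6500.0, 95.0)]
elicited = [("new-material TPS inspection", (3000.0, 5000.0, 9000.0), 90.0),
            ("composite airframe repair", (800.0, 2000.0, 6000.0), 110.0)]

def rollup(samples=10000):
    """Monte Carlo bottom-up cost roll-up over all process activities."""
    total = np.zeros(samples)
    for _, hours, rate in measured:
        total += hours * rate
    for _, (lo, mode, hi), rate in elicited:
        total += rng.triangular(lo, mode, hi, samples) * rate
    return total

cost = rollup()
print("mean cost: $%.2fM" % (cost.mean() / 1e6))
print("80%% interval: $%.2fM-$%.2fM"
      % (np.percentile(cost, 10) / 1e6, np.percentile(cost, 90) / 1e6))
```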

  8. Lamb mode selection for accurate wall loss estimation via guided wave tomography

    SciTech Connect

    Huthwaite, P.; Ribichini, R.; Lowe, M. J. S.; Cawley, P.

    2014-02-18

    Guided wave tomography offers a method to accurately quantify wall thickness losses in pipes and vessels caused by corrosion. This is achieved using ultrasonic waves transmitted over distances of approximately 1-2 m, which are measured by an array of transducers and then used to reconstruct a map of wall thickness throughout the inspected region. To achieve accurate estimations of remnant wall thickness, it is vital that a suitable Lamb mode is chosen. This paper presents a detailed evaluation of the fundamental modes, S{sub 0} and A{sub 0}, which are of primary interest in guided wave tomography thickness estimates since the higher order modes do not exist at all thicknesses, comparing their performance using both numerical and experimental data while considering a range of challenging phenomena. The sensitivity of A{sub 0} to thickness variations was shown to be superior to that of S{sub 0}; however, the attenuation of A{sub 0} when a liquid loading was present was much higher than that of S{sub 0}. A{sub 0} was also less sensitive than S{sub 0} to the presence of coatings on the surface.

  9. Magnetic gaps in organic tri-radicals: From a simple model to accurate estimates

    NASA Astrophysics Data System (ADS)

    Barone, Vincenzo; Cacelli, Ivo; Ferretti, Alessandro; Prampolini, Giacomo

    2017-03-01

    The calculation of the energy gap between the magnetic states of organic poly-radicals still represents a challenging playground for quantum chemistry, and high-level techniques are required to obtain accurate estimates. On these grounds, the aim of the present study is twofold. On the one hand, it shows that, thanks to recent algorithmic and technical improvements, we are able to compute reliable quantum mechanical results for systems of current fundamental and technological interest. On the other hand, proper parameterization of a simple Hubbard Hamiltonian allows for a sound rationalization of magnetic gaps in terms of basic physical effects, unraveling the role played by electron delocalization, Coulomb repulsion, and effective exchange in tuning the magnetic character of the ground state. As case studies, we have chosen three prototypical organic tri-radicals, namely, 1,3,5-trimethylenebenzene, 1,3,5-tridehydrobenzene, and 1,2,3-tridehydrobenzene, which differ either in geometric or electronic structure. After discussing the differences among the three species and their consequences for the magnetic properties in terms of the simple model mentioned above, accurate and reliable values for the energy gap between the lowest quartet and doublet states are computed by means of the so-called difference dedicated configuration interaction (DDCI) technique, and the final results are discussed and compared to both available experimental and computational estimates.
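
    For reference, the standard single-band Hubbard Hamiltonian on which such parameterizations are built can be written as follows (the paper's model may differ in detail):

```latex
H = -t \sum_{\langle i,j \rangle, \sigma}
      \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
    + U \sum_{i} n_{i\uparrow} n_{i\downarrow}
```

    Here t is the hopping amplitude governing electron delocalization and U is the on-site Coulomb repulsion; the ordering of the quartet and doublet states emerges from the competition between the two, modulated by effective exchange.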

  10. Do modelled or satellite-based estimates of surface solar irradiance accurately describe its temporal variability?

    NASA Astrophysics Data System (ADS)

    Bengulescu, Marc; Blanc, Philippe; Boilley, Alexandre; Wald, Lucien

    2017-02-01

    This study investigates the characteristic time-scales of variability found in long-term time-series of daily means of estimates of surface solar irradiance (SSI). The study is performed at various levels to better understand the causes of variability in the SSI. First, the variability of the solar irradiance at the top of the atmosphere is scrutinized. Then, estimates of the SSI in cloud-free conditions, as provided by the McClear model, are examined in order to reveal the influence of the clear atmosphere (aerosols, water vapour, etc.). Lastly, the role of clouds in variability is inferred from the analysis of in-situ measurements. A description of how the atmosphere affects SSI variability is thus obtained on a time-scale basis. The analysis is also performed with estimates of the SSI provided by the satellite-derived HelioClim-3 database and by two numerical weather re-analyses: ERA-Interim and MERRA2. It is found that HelioClim-3 estimates render an accurate picture of the variability found in ground measurements, not only globally but also with respect to individual characteristic time-scales. On the contrary, the variability found in the re-analyses correlates poorly with ground-measurement variability at all scales.

  11. Removing the thermal component from heart rate provides an accurate VO2 estimation in forest work.

    PubMed

    Dubé, Philippe-Antoine; Imbeau, Daniel; Dubeau, Denise; Lebel, Luc; Kolus, Ahmet

    2016-05-01

    Heart rate (HR) was monitored continuously in 41 forest workers performing brushcutting or tree planting work. 10-min seated rest periods were imposed during the workday to estimate the HR thermal component (ΔHRT) per Vogt et al. (1970, 1973). VO2 was measured using a portable gas analyzer during a morning submaximal step-test conducted at the work site, during a work bout over the course of the day (range: 9-74 min), and during an ensuing 10-min rest pause taken at the worksite. The VO2 estimates from measured HR and from corrected HR (thermal component removed) were compared to the VO2 measured during work and rest. Varied levels of the HR thermal component (ΔHRTavg range: 0-38 bpm), originating from a wide range of ambient thermal conditions, thermal clothing insulation worn, and physical load exerted during work, were observed. Using raw HR significantly overestimated measured work VO2 by 30% on average (range: 1%-64%), and 74% of the VO2 prediction error variance was explained by the HR thermal component. VO2 estimated from corrected HR was not statistically different from measured VO2. Work VO2 can be estimated accurately in the presence of thermal stress using Vogt et al.'s method, which can be implemented easily by the practitioner with inexpensive instruments.
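
    The two-step procedure reads directly as code: calibrate HR against VO2 on the morning step test, then subtract the thermal component before applying the calibration. The linear fit and all numbers below are illustrative assumptions, not the study's data.

```python
import numpy as np

# Morning step-test calibration for one worker (hypothetical values):
hr_cal = np.array([75.0, 95.0, 115.0, 135.0])      # bpm
vo2_cal = np.array([8.0, 14.0, 21.0, 28.0])        # ml/kg/min

# Linear HR -> VO2 calibration (a simplifying assumption of this sketch).
slope, intercept = np.polyfit(hr_cal, vo2_cal, 1)

def estimate_vo2(hr_work, delta_hr_thermal):
    """Correct the working HR by the thermal component first, then
    apply the individual calibration (the order of operations in
    Vogt et al.'s approach as described above)."""
    return slope * (hr_work - delta_hr_thermal) + intercept

# A work bout at 130 bpm with a 20 bpm thermal component:
print("raw-HR estimate:     %.1f ml/kg/min" % (slope * 130.0 + intercept))
print("corrected estimate:  %.1f ml/kg/min" % estimate_vo2(130.0, 20.0))
```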

  12. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets

    PubMed Central

    Granata, Daniele; Carnevale, Vincenzo

    2016-01-01

    The collective behavior of a large number of degrees of freedom can often be described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant “collective” variables. Here, we analyze this issue by focusing on the optimal number of variables needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset. PMID:27510265
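
    To give a flavor of neighbor-based ID estimation, the sketch below implements the simpler TwoNN estimator (Facco et al. 2017), which is explicitly not the graph-distance method of this paper; the synthetic manifold is invented for the demo.

```python
import numpy as np
from scipy.spatial import cKDTree

def twonn_id(points):
    """TwoNN intrinsic-dimension estimator (Facco et al. 2017): the MLE
    d = N / sum(log(r2/r1)) from the two nearest-neighbor distances.
    NOT the graph-distance estimator of the paper above; shown only to
    illustrate neighbor-based ID estimation."""
    d, _ = cKDTree(points).query(points, k=3)   # d[:,0] is the point itself
    mu = d[:, 2] / d[:, 1]
    return len(points) / np.sum(np.log(mu))

rng = np.random.default_rng(0)
# A 2-D strip rolled up in 3-D: intrinsic dimension 2.
t = rng.uniform(0.0, 3.0 * np.pi, 5000)
h = rng.uniform(0.0, 5.0, 5000)
pts = np.column_stack([t * np.cos(t), h, t * np.sin(t)])
print("estimated ID: %.2f (expected ~2)" % twonn_id(pts))
```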

  13. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets

    NASA Astrophysics Data System (ADS)

    Granata, Daniele; Carnevale, Vincenzo

    2016-08-01

    The collective behavior of a large number of degrees of freedom can often be described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant “collective” variables. Here, we analyze this issue by focusing on the optimal number of variables needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset.

  14. MIDAS robust trend estimator for accurate GPS station velocities without step detection.

    PubMed

    Blewitt, Geoffrey; Kreemer, Corné; Hammond, William C; Gazeaux, Julien

    2016-03-01

    Automatic estimation of velocities from GPS coordinate time series is becoming required to cope with the exponentially increasing flood of available data, but problems detectable to the human eye are often overlooked. This motivates us to find an automatic and accurate estimator of trend that is resistant to common problems such as step discontinuities, outliers, seasonality, skewness, and heteroscedasticity. Developed here, Median Interannual Difference Adjusted for Skewness (MIDAS) is a variant of the Theil-Sen median trend estimator, for which the ordinary version is the median of slopes vij = (xj - xi)/(tj - ti) computed between all data pairs i > j. For normally distributed data, Theil-Sen and least squares trend estimates are statistically identical, but unlike least squares, Theil-Sen is resistant to undetected data problems. To mitigate both seasonality and step discontinuities, MIDAS selects data pairs separated by 1 year. This condition is relaxed for time series with gaps so that all data are used. Slopes from data pairs spanning a step function produce one-sided outliers that can bias the median. To reduce bias, MIDAS removes outliers and recomputes the median. MIDAS also computes a robust and realistic estimate of trend uncertainty. Statistical tests using GPS data in the rigid North American plate interior show ±0.23 mm/yr root-mean-square (RMS) accuracy in horizontal velocity. In blind tests using synthetic data, MIDAS velocities have an RMS accuracy of ±0.33 mm/yr horizontal, ±1.1 mm/yr up, with a 5th percentile range smaller than all 20 automatic estimators tested. Considering its general nature, MIDAS has the potential for broader application in the geosciences.
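
    The core recipe translates into a short numpy sketch: slopes from pairs separated by about one year, a median, MAD-based trimming of one-sided outliers, and a recomputed median. The pair-matching tolerance, trim factor, and synthetic series below are assumptions; the published estimator also returns a robust uncertainty.

```python
import numpy as np

def midas_velocity(t, x, tol=0.001):
    """Sketch of the MIDAS recipe: median of slopes from data pairs
    separated by ~1 year, then MAD-based trimming of outliers and a
    recomputed median (tolerance and trim factor are assumptions)."""
    slopes = []
    for i in range(t.size):
        j = np.argmin(np.abs(t - (t[i] + 1.0)))   # partner ~1 year ahead
        if abs(t[j] - t[i] - 1.0) < tol:
            slopes.append((x[j] - x[i]) / (t[j] - t[i]))
    slopes = np.asarray(slopes)
    med = np.median(slopes)
    mad = 1.4826 * np.median(np.abs(slopes - med))
    return np.median(slopes[np.abs(slopes - med) < 2.0 * mad])

# Daily series over 6 years (t in years): 3 mm/yr trend, annual cycle,
# a 10 mm step at t = 3.2, and white noise.
t = np.arange(0.0, 6.0, 1.0 / 365.25)
x = (3.0 * t + 2.0 * np.sin(2.0 * np.pi * t) + 10.0 * (t > 3.2)
     + np.random.default_rng(2).normal(0.0, 1.0, t.size))
print("MIDAS-style velocity: %.2f mm/yr (true 3.00)" % midas_velocity(t, x))
```

    Pairing samples one year apart cancels the seasonal cycle by construction, which is why no explicit seasonal model is needed.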

  15. Methods for accurate estimation of net discharge in a tidal channel

    USGS Publications Warehouse

    Simpson, M.R.; Bland, R.

    2000-01-01

    Accurate estimates of net residual discharge in tidally affected rivers and estuaries are possible because of recently developed ultrasonic discharge measurement techniques. Previous discharge estimates using conventional mechanical current meters and methods based on stage/discharge relations or water slope measurements often yielded errors that were as great as or greater than the computed residual discharge. Ultrasonic measurement methods consist of: 1) the use of ultrasonic instruments for the measurement of a representative 'index' velocity used for in situ estimation of mean water velocity and 2) the use of the acoustic Doppler current discharge measurement system to calibrate the index velocity measurement data. Methods used to calibrate (rate) the index velocity to the channel velocity measured using the Acoustic Doppler Current Profiler are the most critical factors affecting the accuracy of net discharge estimation. The index velocity first must be related to mean channel velocity and then used to calculate instantaneous channel discharge. Finally, discharge is low-pass filtered to remove the effects of the tides. An ultrasonic velocity meter discharge-measurement site in a tidally affected region of the Sacramento-San Joaquin Rivers was used to study the accuracy of the index velocity calibration procedure. Calibration data consisting of ultrasonic velocity meter index velocity and concurrent acoustic Doppler discharge measurement data were collected during three time periods. Two sets of data were collected during a spring tide (monthly maximum tidal current) and one set was collected during a neap tide (monthly minimum tidal current). The relative magnitudes of instrumental errors, acoustic Doppler discharge measurement errors, and calibration errors were evaluated. Calibration error was found to be the most significant source of error in estimating net discharge. Using a comprehensive calibration method, net discharge estimates developed from the three

  16. MIDAS robust trend estimator for accurate GPS station velocities without step detection

    NASA Astrophysics Data System (ADS)

    Blewitt, Geoffrey; Kreemer, Corné; Hammond, William C.; Gazeaux, Julien

    2016-03-01

    Automatic estimation of velocities from GPS coordinate time series is becoming required to cope with the exponentially increasing flood of available data, but problems detectable to the human eye are often overlooked. This motivates us to find an automatic and accurate estimator of trend that is resistant to common problems such as step discontinuities, outliers, seasonality, skewness, and heteroscedasticity. Developed here, Median Interannual Difference Adjusted for Skewness (MIDAS) is a variant of the Theil-Sen median trend estimator, for which the ordinary version is the median of slopes vij = (xj-xi)/(tj-ti) computed between all data pairs i > j. For normally distributed data, Theil-Sen and least squares trend estimates are statistically identical, but unlike least squares, Theil-Sen is resistant to undetected data problems. To mitigate both seasonality and step discontinuities, MIDAS selects data pairs separated by 1 year. This condition is relaxed for time series with gaps so that all data are used. Slopes from data pairs spanning a step function produce one-sided outliers that can bias the median. To reduce bias, MIDAS removes outliers and recomputes the median. MIDAS also computes a robust and realistic estimate of trend uncertainty. Statistical tests using GPS data in the rigid North American plate interior show ±0.23 mm/yr root-mean-square (RMS) accuracy in horizontal velocity. In blind tests using synthetic data, MIDAS velocities have an RMS accuracy of ±0.33 mm/yr horizontal, ±1.1 mm/yr up, with a 5th percentile range smaller than all 20 automatic estimators tested. Considering its general nature, MIDAS has the potential for broader application in the geosciences.

  17. MIDAS robust trend estimator for accurate GPS station velocities without step detection

    PubMed Central

    Kreemer, Corné; Hammond, William C.; Gazeaux, Julien

    2016-01-01

    Automatic estimation of velocities from GPS coordinate time series is becoming required to cope with the exponentially increasing flood of available data, but problems detectable to the human eye are often overlooked. This motivates us to find an automatic and accurate estimator of trend that is resistant to common problems such as step discontinuities, outliers, seasonality, skewness, and heteroscedasticity. Developed here, Median Interannual Difference Adjusted for Skewness (MIDAS) is a variant of the Theil-Sen median trend estimator, for which the ordinary version is the median of slopes vij = (xj - xi)/(tj - ti) computed between all data pairs i > j. For normally distributed data, Theil-Sen and least squares trend estimates are statistically identical, but unlike least squares, Theil-Sen is resistant to undetected data problems. To mitigate both seasonality and step discontinuities, MIDAS selects data pairs separated by 1 year. This condition is relaxed for time series with gaps so that all data are used. Slopes from data pairs spanning a step function produce one-sided outliers that can bias the median. To reduce bias, MIDAS removes outliers and recomputes the median. MIDAS also computes a robust and realistic estimate of trend uncertainty. Statistical tests using GPS data in the rigid North American plate interior show ±0.23 mm/yr root-mean-square (RMS) accuracy in horizontal velocity. In blind tests using synthetic data, MIDAS velocities have an RMS accuracy of ±0.33 mm/yr horizontal, ±1.1 mm/yr up, with a 5th percentile range smaller than all 20 automatic estimators tested. Considering its general nature, MIDAS has the potential for broader application in the geosciences. PMID:27668140

  18. Estimating the mission-related costs of teaching hospitals.

    PubMed

    Koenig, Lane; Dobson, Allen; Ho, Silver; Siegel, Jonathan M; Blumenthal, David; Weissman, Joel S

    2003-01-01

    Academic health centers and other teaching hospitals face higher patient care costs than nonteaching community hospitals, because of their missions of graduate medical education (GME), biomedical research, and the maintenance of standby capacity for medically complex patients. We estimate that total mission-related costs were $27 billion in 2002 for all teaching hospitals, with GME (including indirect and direct GME) and standby capacity accounting for roughly 60 and 35 percent of these costs, respectively. To assure their continued ability to perform important social missions in a competitive environment, it may be necessary to reassess the way in which these activities are financed.

  19. Weight and cost estimating relationships for heavy lift airships

    NASA Technical Reports Server (NTRS)

    Gray, D. W.

    1979-01-01

    Weight and cost estimating relationships, including additional parameters that influence the cost and performance of heavy-lift airships (HLA), are discussed. Inputs to a closed-loop computer program, consisting of useful load, forward speed, lift-module positive or negative thrust, and rotors and propellers, are examined. Detail is given to the HLA cost and weight program (HLACW), which computes component weights, vehicle size, buoyancy lift, rotor and propeller thrust, and engine horsepower. This program solves the problem of interrelating the different aerostat, rotor, engine, and propeller sizes. Six sets of 'default parameters' are left for the operator to change during each computer run, enabling slight data manipulation without altering the program.

  20. Estimating the Cost of Care for Emergency Department Syncope Patients: Comparison of Three Models

    PubMed Central

    Probst, Marc A.; McConnell, John K.; Weiss, Robert E.; Laurie, Amber L.; Yagapen, Annick N.; Lin, Michelle P.; Caterino, Jeffrey M.; Shah, Manish N.; Sun, Benjamin C.

    2017-01-01

    Introduction We sought to compare three hospital cost-estimation models for patients undergoing evaluation for unexplained syncope using hospital cost data. Developing such a model would allow researchers to assess the value of novel clinical algorithms for syncope management. Methods We collected complete health services data, including disposition, testing, and length of stay (LOS), on 67 adult patients (age 60 years and older) who presented to the emergency department (ED) with syncope at a single hospital. Patients were excluded if a serious medical condition was identified. We created three hospital cost-estimation models to estimate facility costs: V1, unadjusted Medicare payments for observation and/or hospital admission; V2: modified Medicare payment, prorated by LOS in calendar days; and V3: modified Medicare payment, prorated by LOS in hours. Total hospital costs included unadjusted Medicare payments for diagnostic testing and estimated facility costs. We plotted these estimates against actual cost data from the hospital finance department, and performed correlation and regression analyses. Results Of the three models, V3 consistently outperformed the others with regard to correlation and goodness of fit. The Pearson correlation coefficient for V3 was 0.88 (95% confidence interval [CI] 0.81, 0.92) with an R-square value of 0.77 and a linear regression coefficient of 0.87 (95% CI 0.76, 0.99). Conclusion Using basic health services data, it is possible to accurately estimate hospital costs for older adults undergoing a hospital-based evaluation for unexplained syncope. This methodology could help assess the potential economic impact of implementing novel clinical algorithms for ED syncope. PMID:28210361

  1. A process model to estimate biodiesel production costs.

    PubMed

    Haas, Michael J; McAloon, Andrew J; Yee, Winnie C; Foglia, Thomas A

    2006-03-01

    'Biodiesel' is the name given to a renewable diesel fuel that is produced from fats and oils. It consists of the simple alkyl esters of fatty acids, most typically the methyl esters. We have developed a computer model to estimate the capital and operating costs of a moderately-sized industrial biodiesel production facility. The major process operations in the plant were continuous-process vegetable oil transesterification, and ester and glycerol recovery. The model was designed using contemporary process simulation software, and current reagent, equipment and supply costs, following current production practices. Crude, degummed soybean oil was specified as the feedstock. Annual production capacity of the plant was set at 37,854,118 l (10 x 10(6) gal). Facility construction costs were calculated to be US$11.3 million. The largest contributors to the equipment cost, accounting for nearly one third of expenditures, were storage tanks to contain a 25 day capacity of feedstock and product. At a value of US$0.52/kg ($0.236/lb) for feedstock soybean oil, a biodiesel production cost of US$0.53/l ($2.00/gal) was predicted. The single greatest contributor to this value was the cost of the oil feedstock, which accounted for 88% of total estimated production costs. An analysis of the dependence of production costs on the cost of the feedstock indicated a direct linear relationship between the two, with a change of US$0.020/l ($0.075/gal) in product cost per US$0.022/kg ($0.01/lb) change in oil cost. Process economics included the recovery of coproduct glycerol generated during biodiesel production, and its sale into the commercial glycerol market as an 80% w/w aqueous solution, which reduced production costs by approximately 6%. The production cost of biodiesel was found to vary inversely and linearly with variations in the market value of glycerol, increasing by US$0.0022/l ($0.0085/gal) for every US
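
    The reported feedstock sensitivity can be captured in a two-line function; the numbers are taken directly from the abstract, and the linearity should be assumed to hold only near the studied range.

```python
def biodiesel_cost_per_litre(oil_cost_per_kg):
    """Linear feedstock sensitivity reported above: $0.53/l at $0.52/kg
    soybean oil, changing by $0.020/l per $0.022/kg change in oil cost."""
    return 0.53 + (oil_cost_per_kg - 0.52) * (0.020 / 0.022)

for oil in (0.40, 0.52, 0.65):
    print("oil $%.2f/kg -> biodiesel $%.3f/l"
          % (oil, biodiesel_cost_per_litre(oil)))
```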

  2. Accurate relative location estimates for the North Korean nuclear tests using empirical slowness corrections

    NASA Astrophysics Data System (ADS)

    Gibbons, S. J.; Pabian, F.; Näsholm, S. P.; Kværna, T.; Mykkeltveit, S.

    2017-01-01

    velocity gradients reduce the residuals, the relative location uncertainties and the sensitivity to the combination of stations used. The traveltime gradients appear to be overestimated for the regional phases, and teleseismic relative location estimates are likely to be more accurate despite an apparent lower precision. Calibrations for regional phases are essential given that smaller magnitude events are likely not to be recorded teleseismically. We discuss the implications for the absolute event locations. Placing the 2006 event under a local maximum of overburden at 41.293°N, 129.105°E would imply a location of 41.299°N, 129.075°E for the January 2016 event, providing almost optimal overburden for the later four events.

  3. Accurate Relative Location Estimates for the North Korean Nuclear Tests Using Empirical Slowness Corrections

    NASA Astrophysics Data System (ADS)

    Gibbons, S. J.; Pabian, F.; Näsholm, S. P.; Kværna', T.; Mykkeltveit, S.

    2016-10-01

    modified velocity gradients reduce the residuals, the relative location uncertainties, and the sensitivity to the combination of stations used. The traveltime gradients appear to be overestimated for the regional phases, and teleseismic relative location estimates are likely to be more accurate despite an apparent lower precision. Calibrations for regional phases are essential given that smaller magnitude events are likely not to be recorded teleseismically. We discuss the implications for the absolute event locations. Placing the 2006 event under a local maximum of overburden at 41.293°N, 129.105°E would imply a location of 41.299°N, 129.075°E for the January 2016 event, providing almost optimal overburden for the later four events.

  4. Estimating the extra cost of living with disability in Vietnam.

    PubMed

    Minh, Hoang Van; Giang, Kim Bao; Liem, Nguyen Thanh; Palmer, Michael; Thao, Nguyen Phuong; Duong, Le Bach

    2015-01-01

    Disability is shown to be both a cause and a consequence of poverty. However, relatively little research has investigated the economic cost of living with a disability. This study reports the results of a study on the extra cost of living with disability in Vietnam in 2011. The study was carried out in eight cities/provinces in Vietnam, including Hanoi and Ho Chi Minh City (the two major metropolitan areas of Vietnam) and six provinces, one from each of the six socio-economic regions of Vietnam. Costs are estimated using the standard of living approach, whereby the difference in incomes between people with disability and those without disability for a given standard of living serves as a proxy for the cost of living with disability. The extra cost of living with disability in Vietnam accounted for about 8.8-9.5% of annual household income, or about US$200-218. Communication difficulty was shown to result in the highest additional cost of living with disability, and self-care difficulty in the lowest. The extra cost of living with disability increased with the severity of impairment. Interventions to promote the economic security of livelihood for people with disabilities are needed.
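
    A sketch of the standard-of-living approach on synthetic data follows; the model form, the coefficients, and the compensating-income formula are illustrative assumptions, not the study's specification.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical household survey: a standard-of-living (SoL) index
# generated as SoL = a + b*ln(income) + d*disability + noise.
n = 2000
income = rng.lognormal(mean=7.5, sigma=0.5, size=n)
disabled = rng.integers(0, 2, n)
sol = 1.0 + 0.8 * np.log(income) - 0.15 * disabled + rng.normal(0.0, 0.2, n)

# OLS fit of the standard-of-living equation.
X = np.column_stack([np.ones(n), np.log(income), disabled.astype(float)])
(a, b, d), *_ = np.linalg.lstsq(X, sol, rcond=None)

# Income needed to restore the same SoL is y0*exp(-d/b), so the extra
# cost as a share of the (higher) required income is 1 - exp(d/b).
print("extra cost of disability: %.1f%% of household income"
      % (100.0 * (1.0 - np.exp(d / b))))
```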

  5. Painfree and accurate Bayesian estimation of psychometric functions for (potentially) overdispersed data.

    PubMed

    Schütt, Heiko H; Harmeling, Stefan; Macke, Jakob H; Wichmann, Felix A

    2016-05-01

    The psychometric function describes how an experimental variable, such as stimulus strength, influences the behaviour of an observer. Estimation of psychometric functions from experimental data plays a central role in fields such as psychophysics, experimental psychology and the behavioural neurosciences. Experimental data may exhibit substantial overdispersion, which may result from non-stationarity in the behaviour of observers. Here we extend the standard binomial model which is typically used for psychometric function estimation to a beta-binomial model. We show that the use of the beta-binomial model makes it possible to determine accurate credible intervals even in data which exhibit substantial overdispersion. This goes beyond classical measures of overdispersion (goodness-of-fit), which can detect overdispersion but provide no method for correct inference on overdispersed data. We use Bayesian inference methods for estimating the posterior distribution of the parameters of the psychometric function. Unlike previous Bayesian psychometric inference methods, our software implementation, psignifit 4, performs numerical integration of the posterior within automatically determined bounds. This avoids the use of Markov chain Monte Carlo (MCMC) methods, which typically require expert knowledge. Extensive numerical tests show the validity of the approach, and we discuss implications of overdispersion for experimental design. A comprehensive MATLAB toolbox implementing the method is freely available; a python implementation providing the basic capabilities is also available.
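
    A minimal sketch of the beta-binomial likelihood at the heart of such a model is given below; the sigmoid choice, the overdispersion parameterization, the grid, and the data are assumptions of this sketch, not psignifit 4's actual interface.

```python
import numpy as np
from scipy.stats import betabinom

def psychometric(x, m, w):
    """Logistic sigmoid with threshold m and width w, where w spans the
    0.1-0.9 performance range (a simplifying choice for this sketch)."""
    return 1.0 / (1.0 + np.exp(-2.0 * np.log(9.0) * (x - m) / w))

def loglik(m, w, eta, x, k, n):
    """Beta-binomial log-likelihood: mean given by the psychometric
    function, overdispersion eta in (0, 1) via nu = 1/eta - 1."""
    p = np.clip(psychometric(x, m, w), 1e-6, 1 - 1e-6)
    nu = 1.0 / eta - 1.0
    return betabinom.logpmf(k, n, p * nu, (1.0 - p) * nu).sum()

# Hypothetical data: stimulus levels, hits, and trials per level.
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
k = np.array([9, 21, 30, 37, 39])
n = np.full(5, 40)

# Coarse grid evaluation (psignifit 4 integrates the posterior on a
# grid; here we only report the grid maximum-likelihood point).
grid = [(m, w, eta)
        for m in np.linspace(0.8, 1.8, 21)
        for w in np.linspace(0.5, 2.5, 21)
        for eta in (0.05, 0.1, 0.2)]
m_hat, w_hat, eta_hat = max(grid, key=lambda q: loglik(*q, x, k, n))
print("grid ML point: m=%.2f  w=%.2f  eta=%.2f" % (m_hat, w_hat, eta_hat))
```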

  6. Accurate estimation of the RMS emittance from single current amplifier data

    SciTech Connect

    Stockli, Martin P.; Welton, R.F.; Keller, R.; Letchford, A.P.; Thomae, R.W.; Thomason, J.W.G.

    2002-05-31

    This paper presents the SCUBEEx rms emittance analysis, a self-consistent, unbiased elliptical exclusion method, which combines traditional data-reduction methods with statistical methods to obtain accurate estimates for the rms emittance. Rather than considering individual data, the method tracks the average current density outside a well-selected, variable boundary to separate the measured beam halo from the background. The average outside current density is assumed to be part of a uniform background and not part of the particle beam. Therefore the average outside current is subtracted from the data before evaluating the rms emittance within the boundary. As the boundary area is increased, the average outside current and the inside rms emittance form plateaus when all data containing part of the particle beam are inside the boundary. These plateaus mark the smallest acceptable exclusion boundary and provide unbiased estimates for the average background and the rms emittance. Small, trendless variations within the plateaus allow for determining the uncertainties of the estimates caused by variations of the measured background outside the smallest acceptable exclusion boundary. The robustness of the method is established with complementary variations of the exclusion boundary. This paper presents a detailed comparison between traditional data reduction methods and SCUBEEx by analyzing two complementary sets of emittance data obtained with a Lawrence Berkeley National Laboratory and an ISIS H{sup -} ion source.
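
    A minimal numpy sketch of the exclusion-boundary idea follows, using circles in place of the paper's self-consistent ellipses; the grid, background level, and radii are made up for the demo.

```python
import numpy as np

def rms_emittance(x, xp, w):
    """rms emittance sqrt(<x^2><x'^2> - <x x'>^2) from weighted moments."""
    w = np.maximum(w, 0.0)
    mx, mxp = np.average(x, weights=w), np.average(xp, weights=w)
    sxx = np.average((x - mx) ** 2, weights=w)
    spp = np.average((xp - mxp) ** 2, weights=w)
    sxp = np.average((x - mx) * (xp - mxp), weights=w)
    return np.sqrt(max(sxx * spp - sxp ** 2, 0.0))

def scubeex_curve(x, xp, j, radii):
    """For each exclusion boundary (circles here; the paper uses
    self-consistent ellipses), subtract the mean outside density and
    return (background, inside rms emittance); plateaus over the
    radius mark the unbiased estimates."""
    mx, mxp = np.average(x, weights=j), np.average(xp, weights=j)
    out = []
    for r in radii:
        inside = (x - mx) ** 2 + (xp - mxp) ** 2 <= r ** 2
        bg = j[~inside].mean()
        out.append((bg, rms_emittance(x[inside], xp[inside], j[inside] - bg)))
    return out

# Synthetic measurement: unit-emittance Gaussian beam plus a uniform
# background of 0.02 on a 201x201 phase-space grid.
g = np.linspace(-10, 10, 201)
X, XP = np.meshgrid(g, g)
J = np.exp(-(X ** 2 + XP ** 2) / 2.0) + 0.02
for r, (bg, eps) in zip((2, 3, 4, 5),
                        scubeex_curve(X.ravel(), XP.ravel(), J.ravel(),
                                      (2, 3, 4, 5))):
    print("r=%d: background=%.4f  emittance=%.3f  (true 1.0)" % (r, bg, eps))
```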

  7. Accurate estimation of human body orientation from RGB-D sensors.

    PubMed

    Liu, Wu; Zhang, Yongdong; Tang, Sheng; Tang, Jinhui; Hong, Richang; Li, Jintao

    2013-10-01

    Accurate estimation of human body orientation can significantly enhance the analysis of human behavior, which is a fundamental task in the field of computer vision. However, existing orientation estimation methods cannot handle the variety of body poses and appearances. In this paper, we propose an innovative RGB-D-based orientation estimation method to address these challenges. By utilizing RGB-D information, which can be acquired in real time by RGB-D sensors, our method is robust to cluttered environments, illumination changes and partial occlusions. Specifically, efficient static and motion cue extraction methods are proposed based on RGB-D superpixels to reduce the noise in the depth data. Since it is hard to discriminate all 360° of orientation using static cues or motion cues independently, we propose to utilize a dynamic Bayesian network system (DBNS) to effectively exploit the complementary nature of both static and motion cues. In order to verify our proposed method, we build an RGB-D-based human body orientation dataset that covers a wide diversity of poses and appearances. Our intensive experimental evaluations on this dataset demonstrate the effectiveness and efficiency of the proposed method.

  8. Accurate estimation of motion blur parameters in noisy remote sensing image

    NASA Astrophysics Data System (ADS)

    Shi, Xueyan; Wang, Lin; Shao, Xiaopeng; Wang, Huilin; Tao, Zhong

    2015-05-01

    The relative motion between a remote sensing satellite sensor and objects is one of the most common reasons for remote sensing image degradation. It seriously weakens image data interpretation and information extraction. In practice, the point spread function (PSF) must be estimated first for image restoration, and identifying the motion blur direction and length accurately is crucial for estimating the PSF and restoring the image with precision. In general, the regular light-and-dark stripes in the spectrum can be employed to obtain the parameters by using the Radon transform. However, the serious noise present in actual remote sensing images often renders the stripes indistinct, making the parameters difficult to calculate and the resulting error relatively large. In this paper, an improved motion blur parameter identification method for noisy remote sensing images is proposed to solve this problem. The spectrum characteristics of noisy remote sensing images are analyzed first. An interactive image segmentation method based on graph theory, called GrabCut, is adopted to effectively extract the edge of the light center in the spectrum. The motion blur direction is estimated by applying the Radon transform to the segmentation result. In order to reduce random error, a method based on whole-column statistics is used when calculating the blur length. Finally, the Lucy-Richardson algorithm is applied to restore remote sensing images of the moon after estimating the blur parameters. The experimental results verify the effectiveness and robustness of our algorithm.
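
    The standard Radon-on-spectrum step that the paper builds on can be sketched as below; the GrabCut segmentation stage is omitted, and the test scene, filter sizes, and angle convention are assumptions to verify on real data.

```python
import numpy as np
from scipy.ndimage import convolve1d, uniform_filter
from skimage.transform import radon

def blur_direction(image):
    """Estimate the motion-blur direction from the stripe pattern of the
    log-spectrum via the Radon transform; the GrabCut segmentation step
    the paper adds for heavy noise is omitted from this sketch."""
    spec = np.log1p(np.abs(np.fft.fftshift(np.fft.fft2(image))))
    spec -= uniform_filter(spec, size=15)   # flatten the smooth background
    angles = np.arange(0.0, 180.0, 1.0)
    sino = radon(spec, theta=angles, circle=False)
    # Projections aligned with the stripes keep their contrast, so the
    # stripe angle maximizes the variance across rays.
    return angles[np.argmax(sino.std(axis=0))]

# Demo: blur a random scene horizontally over 15 pixels. Under skimage's
# angle convention ~0 degrees is expected here (an assumption to verify;
# the spectral stripes run perpendicular to the motion).
rng = np.random.default_rng(4)
scene = uniform_filter(rng.normal(size=(256, 256)), size=3)
blurred = convolve1d(scene, np.ones(15) / 15.0, axis=1)
print("estimated stripe angle: %.0f deg" % blur_direction(blurred))
```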

  9. Common Day Care Safety Renovations: Descriptions, Explanations and Cost Estimates.

    ERIC Educational Resources Information Center

    Spack, Stan

    This booklet explains some of the day care safety features specified by the new Massachusetts State Building Code (January 1, 1975) which must be met before a new day care center can be licensed. The safety features described are those which most often require renovation to meet the building code standards. Best estimates of the costs involved in…

  10. A Semantics-Based Approach to Construction Cost Estimating

    ERIC Educational Resources Information Center

    Niknam, Mehrdad

    2015-01-01

    A construction project requires collaboration of different organizations such as owner, designer, contractor, and resource suppliers. These organizations need to exchange information to improve their teamwork. Understanding the information created in other organizations requires specialized human resources. Construction cost estimating is one of…

  11. Stochastic Estimation of Cost Frontier: Evidence from Bangladesh

    ERIC Educational Resources Information Center

    Mamun, Shamsul Arifeen Khan

    2012-01-01

    In the literature on higher education cost functions, considerable knowledge has been created in the area of economies of scale in the context of developed countries, but knowledge of input demand is lacking. On the other hand, empirical knowledge in the context of developing countries is very meagre. The paper fills this knowledge gap, estimating a…

  12. Advanced Composite Air Frame Life Cycle Cost Estimating

    DTIC Science & Technology

    2014-06-19

    ...exist a relationship between variables and empty weight (EW) to build better cost estimating relationships (CERs). This research will help the

  13. Estimating the Cost-Effectiveness of Coordinated DSM Programs.

    ERIC Educational Resources Information Center

    Hill, Lawrence J.; Brown, Marilyn A.

    1995-01-01

    A methodology for estimating the cost-effectiveness of coordinated programs from the standpoint of an electric or gas utility is described and illustrated. The discussion focuses on demand-side management programs cofunded by the government and utilities, but it can be applied to other types of cofunded programs. (SLD)

  14. 16 CFR 305.5 - Determinations of estimated annual energy consumption, estimated annual operating cost, and...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... consumption, estimated annual operating cost, and energy efficiency rating, and of water use rate. 305.5... energy efficiency rating, and of water use rate. Link to an amendment published at 75 FR 41713, July 19... operating costs, the energy efficiency ratings, and the efficacy factors of the following covered...

  15. Two-wavelength interferometry: extended range and accurate optical path difference analytical estimator.

    PubMed

    Houairi, Kamel; Cassaing, Frédéric

    2009-12-01

    Two-wavelength interferometry combines measurements at two wavelengths lambda(1) and lambda(2) in order to increase the unambiguous range (UR) for the measurement of an optical path difference. With the usual algorithm, the UR is equal to the synthetic wavelength Lambda=lambda(1)lambda(2)/|lambda(1)-lambda(2)|, and the accuracy is a fraction of Lambda. We propose here a new analytical algorithm based on arithmetic properties, allowing estimation of the absolute fringe order of interference in a noniterative way. This algorithm has attractive properties compared with the usual algorithm: it is at least as accurate as the most accurate measurement at one wavelength, whereas the UR is extended to several times the synthetic wavelength. The analysis presented shows how the actual UR depends on the wavelengths and different sources of error. The simulations presented are confirmed by experimental results, showing that the new algorithm has enabled us to reach a UR of 17.3 microm, much larger than the synthetic wavelength, which is only Lambda=2.2 microm. Applications to metrology and fringe tracking are discussed.
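
    For orientation, here is a numpy sketch of the usual two-step synthetic-wavelength algorithm that the paper's arithmetic estimator improves upon; the wavelengths and the noise-free phases are hypothetical.

```python
import numpy as np

LAM1, LAM2 = 1.55, 1.31                    # wavelengths in um (LAM1 > LAM2)
LAM_SYN = LAM1 * LAM2 / abs(LAM1 - LAM2)   # synthetic wavelength

def opd_usual(phi1, phi2):
    """Usual two-step estimator: coarse OPD from the synthetic-wavelength
    phase (phi2 - phi1, since LAM1 > LAM2), then the fringe order of
    LAM1 is rounded to refine the estimate."""
    coarse = LAM_SYN * (((phi2 - phi1) / (2.0 * np.pi)) % 1.0)
    m1 = np.round((coarse - LAM1 * phi1 / (2.0 * np.pi)) / LAM1)
    return LAM1 * (m1 + phi1 / (2.0 * np.pi))

true_opd = 3.7                              # um, within the UR of LAM_SYN
phi1 = (2.0 * np.pi * true_opd / LAM1) % (2.0 * np.pi)
phi2 = (2.0 * np.pi * true_opd / LAM2) % (2.0 * np.pi)
print("synthetic wavelength: %.2f um" % LAM_SYN)
print("estimated OPD: %.4f um (true %.4f um)"
      % (opd_usual(phi1, phi2), true_opd))
```

    The refinement step only succeeds when the coarse estimate is accurate to better than half of lambda(1), which is exactly the noise sensitivity the paper's extended-range algorithm addresses.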

  16. A Simple yet Accurate Method for the Estimation of the Biovolume of Planktonic Microorganisms

    PubMed Central

    2016-01-01

    Determining the biomass of microbial plankton is central to the study of fluxes of energy and materials in aquatic ecosystems. This is typically accomplished by applying proper volume-to-carbon conversion factors to group-specific abundances and biovolumes. A critical step in this approach is the accurate estimation of biovolume from two-dimensional (2D) data such as those available through conventional microscopy techniques or flow-through imaging systems. This paper describes a simple yet accurate method for the assessment of the biovolume of planktonic microorganisms, which works with any image analysis system allowing for the measurement of linear distances and the estimation of the cross-sectional area of an object from a 2D digital image. The proposed method is based on Archimedes’ principle about the relationship between the volume of a sphere and that of a cylinder in which the sphere is inscribed, plus a coefficient of ‘unellipticity’ introduced here. Validation and careful evaluation of the method are provided using a variety of approaches. The new method proved to be highly precise with all convex shapes characterised by approximate rotational symmetry, and combining it with an existing method specific for highly concave or branched shapes allows covering the great majority of cases with good reliability. Thanks to its accuracy, consistency, and low resource demand, the new method can conveniently be used in substitution of any extant method designed for convex shapes, and can readily be coupled with automated cell imaging technologies, including state-of-the-art flow-through imaging devices. PMID:27195667

  17. Accurate biopsy-needle depth estimation in limited-angle tomography using multi-view geometry

    NASA Astrophysics Data System (ADS)

    van der Sommen, Fons; Zinger, Sveta; de With, Peter H. N.

    2016-03-01

    Recently, compressed-sensing based algorithms have enabled volume reconstruction from projection images acquired over a relatively small angle (θ < 20°). These methods enable accurate depth estimation of surgical tools with respect to anatomical structures. However, they are computationally expensive and time consuming, rendering them unattractive for image-guided interventions. We propose an alternative approach for depth estimation of biopsy needles during image-guided interventions, in which we split the problem into two parts and solve them independently: needle-depth estimation and volume reconstruction. The complete proposed system consists of the previous two steps, preceded by needle extraction. First, we detect the biopsy needle in the projection images and remove it by interpolation. Next, we exploit epipolar geometry to find point-to-point correspondences in the projection images to triangulate the 3D position of the needle in the volume. Finally, we use the interpolated projection images to reconstruct the local anatomical structures and indicate the position of the needle within this volume. For validation of the algorithm, we have recorded a full CT scan of a phantom with an inserted biopsy needle. The performance of our approach ranges from a median error of 2.94 mm for a distributed viewing angle of 1° down to an error of 0.30 mm for an angle larger than 10°. Based on the results of this initial phantom study, we conclude that multi-view geometry offers an attractive alternative to time-consuming iterative methods for the depth estimation of surgical tools during C-arm-based image-guided interventions.
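
    The triangulation step can be sketched with standard linear (DLT) triangulation from two views; the C-arm geometry below is invented for the demo and is not the paper's calibration.

```python
import numpy as np

def triangulate(P1, P2, u1, u2):
    """Linear (DLT) triangulation of one 3-D point from its pixel
    coordinates u1, u2 in two views with 3x4 projection matrices."""
    A = np.vstack([u1[0] * P1[2] - P1[0],
                   u1[1] * P1[2] - P1[1],
                   u2[0] * P2[2] - P2[0],
                   u2[1] * P2[2] - P2[1]])
    X = np.linalg.svd(A)[2][-1]          # right singular vector, min sigma
    return X[:3] / X[3]

def proj(angle_deg, f=1000.0):
    """Hypothetical C-arm view: rotation about the y-axis plus a fixed
    source-to-isocenter offset, with an ideal pinhole intrinsic matrix."""
    c, s = np.cos(np.radians(angle_deg)), np.sin(np.radians(angle_deg))
    R = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
    t = np.array([0.0, 0.0, 500.0])
    K = np.array([[f, 0.0, 0.0], [0.0, f, 0.0], [0.0, 0.0, 1.0]])
    return K @ np.hstack([R, t[:, None]])

X_true = np.array([5.0, -3.0, 20.0])     # needle-tip position (mm)
P1, P2 = proj(-5.0), proj(5.0)           # two views 10 degrees apart
h = np.append(X_true, 1.0)
u1 = (P1 @ h)[:2] / (P1 @ h)[2]
u2 = (P2 @ h)[:2] / (P2 @ h)[2]
print("recovered tip:", np.round(triangulate(P1, P2, u1, u2), 3))
```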

  18. Estimating the Cost of NASA's Space Launch Initiative: How SLI Costs Stack Up Against the Shuttle

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph H.; Roth, Axel (Technical Monitor)

    2002-01-01

    NASA is planning to replace the Space Shuttle with a new completely reusable Second Generation Launch System by approximately 2012. Numerous contracted and NASA in-house Space Transportation Architecture Studies and various technology maturation activities are proceeding and have resulted in scores of competing architecture configurations being proposed. Life cycle cost is a key discriminator between all these various concepts. However, the one obvious analogy for costing purposes remains the current Shuttle system. Are there credible reasons to believe that a second generation reusable launch system can be accomplished at less cost than the Shuttle? The need for a credible answer to this question is critical. This paper reviews the cost estimating approaches being used by the contractors and the government estimators to address this issue and explores the rationale behind the numbers.

  19. Building of an experimental cline with Arabidopsis thaliana to estimate herbicide fitness cost.

    PubMed

    Roux, Fabrice; Giancola, Sandra; Durand, Stéphanie; Reboud, Xavier

    2006-06-01

    Various management strategies aim at maintaining pesticide resistance frequency under a threshold value by taking advantage of the fitness penalty (the cost) expressed by the resistance allele outside the treated area or during the pesticide selection "off years." One method to estimate a fitness cost is to analyze the resistance allele frequency along transects across treated and untreated areas. On the basis of the shape of the cline, this method gives the relative contributions of both gene flow and the fitness difference between genotypes in the treated and untreated areas. Taking advantage of the properties of such migration-selection balance, an artificial cline was built up to optimize the conditions under which the fitness cost of two herbicide-resistant mutants (acetolactate synthase and auxin-induced target genes) in the model species Arabidopsis thaliana could be more accurately measured. The analysis of the microevolutionary dynamics in these experimental populations indicated mean fitness costs of approximately 15 and 92% for the csr1-1 and axr2-1 resistances, respectively. In addition, negative frequency dependence of the fitness cost was also detected for the axr2-1 resistance. The advantages and disadvantages of the cline approach are discussed in relation to other methods of cost estimation. This comparison highlights the power of an experimental cline to measure low fitness costs and detect sensitivity to frequency-dependent variation.

  20. Breckinridge Project, initial effort. Report VIII. Capital cost estimate

    SciTech Connect

    1982-01-01

    The major objective of the Initial Effort for the Breckinridge Project is to develop engineering to the point where realistic economics for the construction and operation of the plant can be established. The plant is designed to process 23,000 tons per day of run-of-mine coal to produce a nominal 50,000 barrels per day of liquid products using the H-COAL and standard industry technology. The plant will be located in Breckinridge County, Kentucky. Considerable preliminary engineering has been performed for this estimate. This work uses a single-point design based on the Process Demonstration Unit (PDU) data from run 5, period 29 of the pilot plant. The design basis is discussed in Volume II of this report. Many aspects of plant construction and cost have been considered that were not taken into account in past studies. Ashland and Bechtel believe the accuracy of the capital estimate to be +19%, -17%. This accuracy is based on January 1981 dollars, the as-spent dollar amount naturally depending upon the inflation rate through the construction period. Considerable attention has been devoted to reliability of operation, and redundant equipment has been used where it was deemed necessary to assure reasonable onstream time. This equipment is included in the capital estimate. The capital is summarized by total plant cost in Table 1. The subtotal plant cost, excluding contingency, fee, and adjustment, is $2,710,940,000. Adding the contingency, fee and adjustment, the total depreciable cost of the plant is $3,167,430,000. Adding the working capital to the total plant cost results in total capital requirements of $3,258,430,000, as shown in the individual plant cost summary, Table 2.

  1. The potential of more accurate InSAR covariance matrix estimation for land cover mapping

    NASA Astrophysics Data System (ADS)

    Jiang, Mi; Yong, Bin; Tian, Xin; Malhotra, Rakesh; Hu, Rui; Li, Zhiwei; Yu, Zhongbo; Zhang, Xinxin

    2017-04-01

    Synthetic aperture radar (SAR) and Interferometric SAR (InSAR) provide both structural and electromagnetic information for the ground surface and therefore have been widely used for land cover classification. However, relatively few studies have developed analyses that investigate SAR datasets over richly textured areas where heterogeneous land covers exist and intermingle over short distances. One of the main difficulties is that the shapes of the structures in a SAR image cannot be represented in detail, as mixed pixels are likely to occur when conventional InSAR parameter estimation methods are used. To solve this problem and further extend previous research into remote monitoring of urban environments, we address the use of accurate InSAR covariance matrix estimation to improve the accuracy of land cover mapping. The standard and updated methods were tested using the HH-polarization TerraSAR-X dataset and compared with each other using the random forest classifier. A detailed accuracy assessment compiled for six types of surfaces shows that the updated method outperforms the standard approach by around 9%, with an overall accuracy of 82.46% over areas with rich texture in Zhuhai, China. This paper demonstrates that the accuracy of land cover mapping can benefit from the enhancement of the quality of the observations, in addition to the classifier selection and multi-source data integration reported in previous studies.

  2. Can student health professionals accurately estimate alcohol content in commonly occurring drinks?

    PubMed Central

    Sinclair, Julia; Searle, Emma

    2016-01-01

    Objectives: Correct identification of alcohol as a contributor to, or comorbidity of, many psychiatric diseases requires health professionals to be competent and confident to take an accurate alcohol history. Being able to estimate (or calculate) the alcohol content in commonly consumed drinks is a prerequisite for quantifying levels of alcohol consumption. The aim of this study was to assess this ability in medical and nursing students. Methods: A cross-sectional survey of 891 medical and nursing students across different years of training was conducted. Students were asked the alcohol content of 10 different alcoholic drinks by seeing a slide of the drink (with picture, volume and percentage of alcohol by volume) for 30 s. Results: Overall, the mean number of correctly estimated drinks (out of the 10 tested) was 2.4, increasing to just over 3 if a 10% margin of error was used. Wine and premium strength beers were underestimated by over 50% of students. Those who drank alcohol themselves, or who were further on in their clinical training, did better on the task, but overall the levels remained low. Conclusions: Knowledge of, or the ability to work out, the alcohol content of commonly consumed drinks is poor, and further research is needed to understand the reasons for this and the impact this may have on the likelihood to undertake screening or initiate treatment. PMID:27536344
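
    For reference, the arithmetic the students were effectively being asked to perform is the standard UK-units calculation; the example servings below are illustrative.

```python
def uk_units(volume_ml, abv_percent):
    """UK units of alcohol: one unit = 10 ml (8 g) of pure ethanol,
    so units = volume(ml) x ABV(%) / 1000."""
    return volume_ml * abv_percent / 1000.0

# A 175 ml glass of 13% ABV wine and a 440 ml can of 5.5% premium beer:
print("wine: %.1f units" % uk_units(175, 13))    # ~2.3
print("beer: %.1f units" % uk_units(440, 5.5))   # ~2.4
```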

  3. Greater contrast in Martian hydrological history from more accurate estimates of paleodischarge

    NASA Astrophysics Data System (ADS)

    Jacobsen, R. E.; Burr, D. M.

    2016-09-01

    Correlative width-discharge relationships from the Missouri River Basin are commonly used to estimate fluvial paleodischarge on Mars. However, hydraulic geometry provides alternative, and causal, width-discharge relationships derived from broader samples of channels, including those in reduced-gravity (submarine) environments. Comparison of these relationships implies that causal relationships from hydraulic geometry should yield more accurate and more precise discharge estimates. Our remote analysis of a Martian-terrestrial analog channel, combined with in situ discharge data, substantiates this implication. Applied to Martian features, these results imply that paleodischarges of interior channels of Noachian-Hesperian (~3.7 Ga) valley networks have been underestimated by a factor of several, whereas paleodischarges for smaller fluvial deposits of the Late Hesperian-Early Amazonian (~3.0 Ga) have been overestimated. Thus, these new paleodischarges significantly magnify the contrast between early and late Martian hydrologic activity. Width-discharge relationships from hydraulic geometry represent validated tools for quantifying fluvial input near candidate landing sites of upcoming missions.
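
    The inversion implied by hydraulic geometry is a one-line power-law solve; the coefficients below are generic placeholders of the kind used in downstream hydraulic geometry, not the calibrated values derived in the paper.

```python
def discharge_from_width(width_m, a=4.7, b=0.5):
    """Invert the hydraulic-geometry power law w = a * Q**b to get
    Q = (w / a)**(1 / b). The coefficients a and b are generic
    placeholders, not the paper's calibrated values."""
    return (width_m / a) ** (1.0 / b)

for w in (20.0, 100.0, 500.0):
    print("width %5.0f m -> paleodischarge ~%8.0f m^3/s"
          % (w, discharge_from_width(w)))
```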

  4. Ocean Lidar Measurements of Beam Attenuation and a Roadmap to Accurate Phytoplankton Biomass Estimates

    NASA Astrophysics Data System (ADS)

    Hu, Yongxiang; Behrenfeld, Mike; Hostetler, Chris; Pelon, Jacques; Trepte, Charles; Hair, John; Slade, Wayne; Cetinic, Ivona; Vaughan, Mark; Lu, Xiaomei; Zhai, Pengwang; Weimer, Carl; Winker, David; Verhappen, Carolus C.; Butler, Carolyn; Liu, Zhaoyan; Hunt, Bill; Omar, Ali; Rodier, Sharon; Lifermann, Anne; Josset, Damien; Hou, Weilin; MacDonnell, David; Rhew, Ray

    2016-06-01

    Beam attenuation coefficient, c, provides an important optical index of plankton standing stocks, such as phytoplankton biomass and total particulate carbon concentration. Unfortunately, c has proven difficult to quantify through remote sensing. Here, we introduce an innovative approach for estimating c using lidar depolarization measurements and diffuse attenuation coefficients from ocean color products or lidar measurements of Brillouin scattering. The new approach is based on a theoretical formula established from Monte Carlo simulations that links the depolarization ratio of sea water to the ratio of diffuse attenuation Kd and beam attenuation c (i.e., a multiple scattering factor). On July 17, 2014, the CALIPSO satellite was tilted 30° off-nadir for one nighttime orbit in order to minimize ocean surface backscatter and demonstrate the lidar ocean subsurface measurement concept from space. Depolarization ratios of ocean subsurface backscatter are measured accurately. Beam attenuation coefficients computed from the depolarization ratio measurements compare well with empirical estimates from ocean color measurements. We further verify the beam attenuation coefficient retrievals using aircraft-based high spectral resolution lidar (HSRL) data that are collocated with in-water optical measurements.

  5. Estimating the Deep Space Network modification costs to prepare for future space missions by using major cost drivers

    NASA Technical Reports Server (NTRS)

    Remer, Donald S.; Sherif, Josef; Buchanan, Harry R.

    1993-01-01

    This paper develops a cost model to do long range planning cost estimates for Deep Space Network (DSN) support of future space missions. The paper focuses on the costs required to modify and/or enhance the DSN to prepare for future space missions. The model is a function of eight major mission cost drivers and estimates both the total cost and the annual costs of a similar future space mission. The model is derived from actual cost data from three space missions: Voyager (Uranus), Voyager (Neptune), and Magellan. Estimates derived from the model are tested against actual cost data for two independent missions, Viking and Mariner Jupiter/Saturn (MJS).
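
    The abstract does not list the eight cost drivers or their calibrated weights, but the structure of such a driver-based model is simple: a base cost plus driver-weighted terms. A hypothetical sketch:

```python
# Sketch of a driver-based parametric cost model. Driver names and
# weights are illustrative only; the paper's eight DSN drivers and
# their calibration are not given in the abstract.

DRIVER_WEIGHTS = {            # $M per unit of each hypothetical driver
    "antenna_upgrades": 25.0,
    "downlink_rate_mbps": 0.8,
    "mission_years": 3.5,
}

def total_modification_cost(drivers: dict, base: float = 10.0) -> float:
    """Total DSN modification cost ($M) as a linear function of drivers."""
    return base + sum(DRIVER_WEIGHTS[name] * value
                      for name, value in drivers.items())

print(total_modification_cost({"antenna_upgrades": 2,
                               "downlink_rate_mbps": 50,
                               "mission_years": 6}))
```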

  6. A Comparative Analysis of the Cost Estimating Error Risk Associated with Flyaway Costs Versus Individual Components of Aircraft

    DTIC Science & Technology

    2003-03-01

... prevalent in many different cost estimation techniques. They are the cornerstones of the parametric estimation technique developed by the RAND Corporation. ... At the two ends of the cost estimation technique spectrum are parametric estimation and grass roots estimation; the parametric estimation technique can be considered a macro-level approach.

  7. Regional economic activity and absenteeism: a new approach to estimating the indirect costs of employee productivity loss.

    PubMed

    Bankert, Brian; Coberley, Carter; Pope, James E; Wells, Aaron

    2015-02-01

This paper presents a new approach to estimating the indirect costs of health-related absenteeism. Productivity losses related to employee absenteeism have negative business implications for employers, and these losses effectively deprive the business of an expected level of employee labor. The approach herein quantifies absenteeism cost using an output-per-labor-hour method and extends employer-level results to the region. This new approach was applied to the employed population of 3 health insurance carriers. The economic cost of absenteeism was estimated to be $6.8 million, $0.8 million, and $0.7 million on average for the 3 employers; regional losses were roughly twice the magnitude of employer-specific losses. The new approach suggests that costs related to absenteeism for high output-per-labor-hour industries exceed similar estimates derived from application of the human capital approach. The materially higher costs under the new approach emphasize the importance of accurately estimating productivity losses.
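
    The core of the method is a one-line calculation: lost output equals absence hours times output per labor hour. A minimal sketch with illustrative figures (the human capital approach would substitute the wage rate for output per hour, which is why it yields lower estimates in high-output industries):

```python
# Sketch of the output-per-labor-hour method. All figures illustrative.

def absenteeism_cost(absence_hours: float, annual_output: float,
                     annual_labor_hours: float) -> float:
    """Lost output ($) = absence hours * (output / labor hours)."""
    return absence_hours * (annual_output / annual_labor_hours)

# A firm producing $50M per year with 500,000 labor hours loses
# $100 of output for every hour of absence.
print(absenteeism_cost(absence_hours=8_000,
                       annual_output=50e6, annual_labor_hours=500e3))
```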

  8. Discrete state model and accurate estimation of loop entropy of RNA secondary structures.

    PubMed

    Zhang, Jian; Lin, Ming; Chen, Rong; Wang, Wei; Liang, Jie

    2008-03-28

Conformational entropy makes an important contribution to the stability and folding of RNA molecules, but it is challenging to either measure or compute the conformational entropy associated with long loops. We develop optimized discrete k-state models of the RNA backbone, based on known RNA structures, for computing the entropy of loops, which are modeled as self-avoiding walks. To estimate the entropy of hairpin, bulge, internal, and multibranch loops of long length (up to 50), we develop an efficient sampling method based on the sequential Monte Carlo principle. Our method considers the excluded volume effect. It is general and can be applied to calculating the entropy of loops of longer length and arbitrary complexity. For loops of short length, our results are in good agreement with a recent theoretical model and experimental measurements. For long loops, our estimated entropy of hairpin loops is in excellent agreement with the Jacobson-Stockmayer extrapolation model. However, for bulge loops and more complex secondary structures such as internal and multibranch loops, we find that the Jacobson-Stockmayer extrapolation model has large errors. Based on the estimated entropy, we have developed empirical formulae for accurate calculation of the entropy of long loops in different secondary structures. Our study of the effect of asymmetric loop size suggests that the entropy of internal loops is largely determined by the total loop length and is only marginally affected by the asymmetric sizes of the two loops. Our finding suggests that the significant asymmetric effects of loop length in internal loops measured by experiments are likely to be partially enthalpic. Our method can be applied to develop improved energy parameters important for studying RNA stability and folding, and for predicting RNA secondary and tertiary structures. The discrete model and the program used to calculate loop entropy can be downloaded at http://gila.bioengr.uic.edu/resources/RNA.html.
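
    For reference, the Jacobson-Stockmayer extrapolation mentioned above penalizes long loops logarithmically. A minimal sketch, using the coefficient 1.75 commonly adopted in nearest-neighbor RNA models (the reference entropy value below is hypothetical):

```python
import math

R = 1.987e-3  # gas constant in kcal/(mol*K)

def js_loop_entropy(n: int, s_ref: float, n_ref: int = 9,
                    coeff: float = 1.75) -> float:
    """Jacobson-Stockmayer-style extrapolation:
    dS(n) = dS(n_ref) - coeff * R * ln(n / n_ref)."""
    return s_ref - coeff * R * math.log(n / n_ref)

# Extrapolate a hairpin-loop entropy from a hypothetical 9-nt reference.
print(js_loop_entropy(n=30, s_ref=-0.012))  # kcal/(mol*K)
```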

  9. Verify by Genability - Providing Solar Customers with Accurate Reports of Utility Bill Cost Savings

    SciTech Connect

    2015-12-01

The National Renewable Energy Laboratory (NREL), partnering with Genability and supported by the U.S. Department of Energy's SunShot Incubator program, independently verified the accuracy of Genability's monthly cost-savings reports.

10. Estimating boiling water reactor decommissioning costs: A user's manual for the BWR Cost Estimating Computer Program (CECP) software. Final report

    SciTech Connect

    Bierschbach, M.C.

    1996-06-01

Nuclear power plant licensees are required to submit to the US Nuclear Regulatory Commission (NRC) for review their decommissioning cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning boiling water reactor (BWR) power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

11. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    SciTech Connect

    Bierschbach, M.C.

    1994-12-01

With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the U.S. Nuclear Regulatory Commission (NRC) for review their decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  12. Cost Estimation of Laser Additive Manufacturing of Stainless Steel

    NASA Astrophysics Data System (ADS)

    Piili, Heidi; Happonen, Ari; Väistö, Tapio; Venkataramanan, Vijaikrishnan; Partanen, Jouni; Salminen, Antti

Laser additive manufacturing (LAM) is a layer-wise fabrication method in which a laser beam melts metallic powder to form solid objects. Although 3D printing was invented 30 years ago, its industrial use remains quite limited, whereas the introduction of cheap consumer 3D printers in recent years has familiarized the public with 3D printing. Interest is focused more and more on the manufacturing of functional parts. The aim of this study is to define and discuss the current economic opportunities and restrictions of the LAM process. Manufacturing costs were studied for different build scenarios, each with a cost structure estimated from the calculated build time and from the costs of the machine, material, and energy at optimized machine utilization. All manufacturing and time simulations in this study were carried out with a research machine equivalent to commercial EOS M series equipment. The study shows that the main expense in LAM is the investment cost of the LAM machine, compared to which the relative proportions of the energy and material costs are very low. The manufacturing time per part is therefore the key factor in optimizing the costs of LAM.
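
    The cost structure described above can be made concrete with a small per-part model: amortized machine cost plus operating cost, scaled by build time and divided across the parts in a build. All numbers below are illustrative, not the study's figures:

```python
# Sketch of a LAM per-part cost model; machine amortization dominates,
# so build time per part drives the result. All figures illustrative.

def lam_part_cost(build_h: float, parts_per_build: int,
                  machine_price: float = 500e3,
                  life_h: float = 5 * 8760 * 0.5,   # 5 yr at 50% utilization
                  operating_rate: float = 15.0,     # $/h labor + maintenance
                  material: float = 40.0, energy: float = 3.0) -> float:
    machine_rate = machine_price / life_h            # $/h depreciation
    per_build = (machine_rate + operating_rate) * build_h
    return per_build / parts_per_build + material + energy

print(f"{lam_part_cost(build_h=30, parts_per_build=12):.2f} $/part")
```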

  13. Cost estimate for muddy water palladium production facility at Mound

    SciTech Connect

    McAdams, R.K.

    1988-11-30

An economic feasibility study was performed on the "Muddy Water" low-chlorine-content palladium powder production process developed by Mound. The total capital investment and total operating costs (dollars per gram) were determined for production batch sizes of 1-10 kg in 1-kg increments. The report includes a brief description of the Muddy Water process, the process flow diagram, and material balances for the various production batch sizes. Two types of facilities were evaluated: one for production of new, "virgin" palladium powder, and one for recycling existing material. The total capital investment for virgin facilities ranged from $600,000 to $1.3 million for production batch sizes of 1-10 kg, respectively. The range for recycle facilities was $1 million to $2.3 million. The total operating cost for 100% acceptable powder production in the virgin facilities ranged from $23 per gram for a 1-kg production batch size to $8 per gram for a 10-kg batch size. Similarly, for recycle facilities, the total operating cost ranged from $34 per gram to $5 per gram. The total operating cost versus product acceptability (ranging from 50%-100% acceptability) was also evaluated for both virgin and recycle facilities. Because the production sizes studied vary widely and because scale-up factors are unknown for batch sizes greater than 1 kg, all costs are "order-of-magnitude" estimates. All costs reported are in 1987 dollars.

14. Manpower/cost estimation model: Automated planetary projects

    NASA Technical Reports Server (NTRS)

    Kitchen, L. D.

    1975-01-01

A manpower/cost estimation model is developed based on a detailed financial analysis of over 30 million raw data points, which are compacted by more than three orders of magnitude to the level at which the model is applicable. The major parameter of expenditure is manpower (specifically, direct labor hours) for all spacecraft subsystem and technical support categories. The resultant model provides a mean absolute error of less than fifteen percent for the eight programs comprising the model data base. The model includes cost-saving inheritance factors, broken down into four levels, for estimating follow-on programs where hardware and design inheritance are evident or expected.

  15. Test Case Study: Estimating the Cost of Ada Software Development

    DTIC Science & Technology

    1989-04-01

Selected Navy data were used to develop the SASET model [HEA89]. SASET is meant to be used to estimate development in Assembly, Ada, or any HOL. ... The guiding principle for model selection for inclusion in the study was the availability of the models to the AF, USACEAC, and IITRI.

  16. Computer-Aided Final Design Cost Estimating System Overview.

    DTIC Science & Technology

    1977-05-01

An overview of the Computer-Aided Final Design Cost Estimating System, prepared by the Construction Division (FA), U.S. Army Construction Engineering Research Laboratory (CERL), P.O. Box 4005, Champaign, IL 61820. The Principal Investigator was Mr. Michael ...

  17. Fuzzy/Neural Software Estimates Costs of Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Douglas, Freddie; Bourgeois, Edit Kaminsky

    2005-01-01

The Highly Accurate Cost Estimating Model (HACEM) is a software system for estimating the costs of testing rocket engines and components at Stennis Space Center. HACEM is built on a foundation of adaptive-network-based fuzzy inference systems (ANFIS), a hybrid software concept that combines the adaptive capabilities of neural networks with the ease of development and additional benefits of fuzzy-logic-based systems. In ANFIS, fuzzy inference systems are trained by use of neural networks. HACEM includes selectable subsystems that utilize various numbers and types of inputs, various numbers of fuzzy membership functions, and various input-preprocessing techniques. The inputs to HACEM are parameters of specific tests or series of tests. These parameters include test type (component or engine test), number and duration of tests, and thrust level(s) (in the case of engine tests). The ANFIS in HACEM are trained by use of sets of these parameters, along with costs of past tests. Thereafter, the user feeds HACEM a simple input text file that contains the parameters of a planned test or series of tests, the user selects the desired HACEM subsystem, and the subsystem processes the parameters into an estimate of cost(s).
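
    As a flavor of what ANFIS automates, the sketch below hand-codes a two-rule, zero-order Sugeno-style fuzzy estimate over a single input (test duration). In HACEM the memberships and rule outputs are instead learned from past test costs by neural-network training, and the real system uses more inputs; everything below is hypothetical:

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def estimate_cost(duration_s: float) -> float:
    """Two hypothetical rules: short tests -> $80k, long tests -> $300k."""
    rules = [(tri(duration_s, 0, 100, 400), 80_000.0),
             (tri(duration_s, 100, 500, 1000), 300_000.0)]
    total_w = sum(w for w, _ in rules)
    return sum(w * c for w, c in rules) / total_w if total_w else 0.0

print(estimate_cost(250.0))  # weighted blend of the two rule outputs
```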

  18. Methodology for estimating reprocessing costs for nuclear fuels

    SciTech Connect

    Carter, W. L.; Rainey, R. H.

    1980-02-01

    A technological and economic evaluation of reprocessing requirements for alternate fuel cycles requires a common assessment method and a common basis to which various cycles can be related. A methodology is described for the assessment of alternate fuel cycles utilizing a side-by-side comparison of functional flow diagrams of major areas of the reprocessing plant with corresponding diagrams of the well-developed Purex process as installed in the Barnwell Nuclear Fuel Plant (BNFP). The BNFP treats 1500 metric tons of uranium per year (MTU/yr). Complexity and capacity factors are determined for adjusting the estimated facility and equipment costs of BNFP to determine the corresponding costs for the alternate fuel cycle. Costs of capacities other than the reference 1500 MT of heavy metal per year are estimated by the use of scaling factors. Unit costs of reprocessed fuel are calculated using a discounted cash flow analysis for three economic bases to show the effect of low-risk, typical, and high-risk financing methods.
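
    The adjustment logic lends itself to a compact formula: scale the reference-plant area cost by a complexity factor, then by a capacity ratio raised to a scaling exponent. A sketch, using the conventional six-tenths rule for the exponent (the actual factors are derived area-by-area in the report):

```python
# Sketch of complexity/capacity scaling from the BNFP reference plant.
# The 0.6 exponent is a conventional process-industry value, used here
# only for illustration.

def scaled_area_cost(ref_cost: float, complexity: float, capacity: float,
                     ref_capacity: float = 1500.0,
                     exponent: float = 0.6) -> float:
    """Alternate-cycle area cost scaled from a BNFP reference cost ($M)."""
    return ref_cost * complexity * (capacity / ref_capacity) ** exponent

# A 2x-complexity head-end area at a 750 MTU/yr plant, from a $100M base:
print(scaled_area_cost(100.0, complexity=2.0, capacity=750.0))  # ~$132M
```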

  19. Cost estimate for a proposed GDF Suez LNG testing program

    SciTech Connect

    Blanchat, Thomas K.; Brady, Patrick Dennis; Jernigan, Dann A.; Luketa, Anay Josephine; Nissen, Mark R.; Lopez, Carlos; Vermillion, Nancy; Hightower, Marion Michael

    2014-02-01

At the request of GDF Suez, a Rough Order of Magnitude (ROM) cost estimate was prepared for the design, construction, testing, and data analysis for an experimental series of large-scale liquefied natural gas (LNG) spills on land and water that would result in the largest pool fires and vapor dispersion events ever conducted. Due to the expected cost of this large, multi-year program, the authors utilized Sandia's structured cost estimating methodology. This methodology ensures that the efforts identified can be performed for the cost proposed at a plus or minus 30 percent confidence. The scale of the LNG spill, fire, and vapor dispersion tests proposed by GDF Suez could produce hazard distances and testing safety issues that need to be fully explored. Based on our evaluations, Sandia can utilize much of our existing fire testing infrastructure for the large fire tests and some small dispersion tests (with some modifications) in Albuquerque, but we propose to develop a new dispersion testing site at our remote test area in Nevada because of the large hazard distances. While this might impact some testing logistics, the safety aspects warrant this approach. In addition, we have included a proposal to study cryogenic liquid spills on water and subsequent vaporization in the presence of waves. Sandia is working with DOE on applications that provide infrastructure pertinent to wave production. We present an approach to conduct repeatable wave/spill interaction testing that could utilize such infrastructure.

  20. Methods for cost estimation in software project management

    NASA Astrophysics Data System (ADS)

    Briciu, C. V.; Filip, I.; Indries, I. I.

    2016-02-01

The speed at which the processes used in the software development field have changed makes forecasting the overall costs of a software project very difficult. Many researchers have considered this task unachievable, but there is a group of scientists for whom it can be solved using already known mathematical methods (e.g., multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building cost estimation models for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. In the first part of the paper, a summary of the major achievements in the research area of estimating overall project costs is presented, together with a description of the existing software development process models. In the last part, a basic mathematical model for genetic programming is proposed, including a description of the chosen fitness function and the chromosome representation. The perspective of the model described is linked to the current reality of software development, taking as its basis the software product life cycle and the current challenges and innovations in the software development area. Based on the authors' experience and an analysis of the existing models and product life cycles, it was concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
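
    For context, the COCOMO 81 model named above estimates effort as a power law of size; a genetic program evolved on the PROMISE data is typically scored against such a baseline with an error-based fitness. A sketch using the published Basic COCOMO coefficients (the paper's exact fitness function is not stated, so MMRE below is an assumption):

```python
# Basic COCOMO 81: effort (person-months) = a * KLOC**b.
COCOMO81 = {"organic": (2.4, 1.05),
            "semi-detached": (3.0, 1.12),
            "embedded": (3.6, 1.20)}

def effort_pm(kloc: float, mode: str = "organic") -> float:
    a, b = COCOMO81[mode]
    return a * kloc ** b

def mmre(actual: list, predicted: list) -> float:
    """Mean magnitude of relative error: a common fitness for effort models."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

print(effort_pm(32, "semi-detached"))  # ~146 person-months
```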

  1. Estimating the cost of blood: past, present, and future directions.

    PubMed

    Shander, Aryeh; Hofmann, Axel; Gombotz, Hans; Theusinger, Oliver M; Spahn, Donat R

    2007-06-01

    Understanding the costs associated with blood products requires sophisticated knowledge about transfusion medicine and is attracting the attention of clinical and administrative healthcare sectors worldwide. To improve outcomes, blood usage must be optimized and expenditures controlled so that resources may be channeled toward other diagnostic, therapeutic, and technological initiatives. Estimating blood costs, however, is a complex undertaking, surpassing simple supply versus demand economics. Shrinking donor availability and application of a precautionary principle to minimize transfusion risks are factors that continue to drive the cost of blood products upward. Recognizing that historical accounting attempts to determine blood costs have varied in scope, perspective, and methodology, new approaches have been initiated to identify all potential cost elements related to blood and blood product administration. Activities are also under way to tie these elements together in a comprehensive and practical model that will be applicable to all single-donor blood products without regard to practice type (e.g., academic, private, multi- or single-center clinic). These initiatives, their rationale, importance, and future directions are described.

  2. Accurate optical flow field estimation using mechanical properties of soft tissues

    NASA Astrophysics Data System (ADS)

    Mehrabian, Hatef; Karimi, Hirad; Samani, Abbas

    2009-02-01

A novel optical flow based technique is presented in this paper to measure the nodal displacements of soft tissue undergoing large deformations. In hyperelasticity imaging, soft tissues may be compressed extensively [1], and the deformation may exceed the number of pixels that ordinary optical flow approaches can detect. Furthermore, in most biomedical applications there is a large amount of image information that represents the geometry of the tissue and the number of tissue types present in the organ of interest. Such information is often ignored in applications such as image registration. In this work we incorporate information pertaining to soft tissue mechanical behavior (a Neo-Hookean hyperelastic model is used here), in addition to the tissue geometry before compression, into a hierarchical Horn-Schunck optical flow method to overcome this weakness in detecting large deformations. Applying the proposed method to a phantom at several compression levels showed that it yields reasonably accurate displacement fields. Estimated displacement results of this phantom study, obtained for displacement fields of 85 pixels/frame and 127 pixels/frame, are reported and discussed in this paper.
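
    For reference, the classical Horn-Schunck step that the hierarchical scheme builds on is compact. The sketch below shows only that baseline update; the paper's contributions (the coarse-to-fine hierarchy and the Neo-Hookean tissue prior) are not reproduced here:

```python
import numpy as np
from scipy.ndimage import convolve

KERNEL = np.array([[1/12, 1/6, 1/12],
                   [1/6,  0.0, 1/6 ],
                   [1/12, 1/6, 1/12]])  # neighborhood-averaging weights

def horn_schunck(Ix, Iy, It, alpha=1.0, n_iter=100):
    """Classical Horn-Schunck flow from image gradient arrays Ix, Iy, It."""
    u = np.zeros_like(Ix)
    v = np.zeros_like(Ix)
    for _ in range(n_iter):
        u_avg, v_avg = convolve(u, KERNEL), convolve(v, KERNEL)
        num = Ix * u_avg + Iy * v_avg + It
        den = alpha ** 2 + Ix ** 2 + Iy ** 2
        u = u_avg - Ix * num / den
        v = v_avg - Iy * num / den
    return u, v
```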

  3. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    NASA Technical Reports Server (NTRS)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo

  4. Estimating cost ratio distribution between fatal and non-fatal road accidents in Malaysia

    NASA Astrophysics Data System (ADS)

    Hamdan, Nurhidayah; Daud, Noorizam

    2014-07-01

Road traffic crashes are a major global problem and should be treated as a shared responsibility. In Malaysia, road accidents killed 6,917 people and injured or disabled 17,522 in 2012, and the government spent about RM9.3 billion in 2009; the annual cost to the nation is reported to be approximately 1 to 2 percent of gross domestic product (GDP). The current cost ratio for fatal and non-fatal accidents used by the Ministry of Works Malaysia is simply the arbitrary value 6:4 (equivalently 1.5:1), reflecting the fact that six factors are involved in the accident-cost calculation for fatal accidents and four for non-fatal accidents. This simple rule used by the authority is questionable, since there is little mathematical or conceptual evidence to explain how the ratio is determined. The main aim of this study is to determine a new accident cost ratio for fatal and non-fatal accidents in Malaysia based on a quantitative statistical approach. The cost ratio distributions are estimated with the Weibull distribution. Owing to the unavailability of official accident cost data, insurance claim data for both fatal and non-fatal accidents were used as proxy information for the actual accident costs. Two types of parameter estimates are used in this study: maximum likelihood (MLE) and robust estimation. The findings reveal that the accident cost ratio for fatal versus non-fatal claims is 1.33 under MLE, while for robust estimates the ratio is slightly higher, at 1.51. This study will help the authority determine a more accurate cost ratio between fatal and non-fatal accidents than the official ratio set by the government, since the cost ratio is an important weighting element in modeling road accident data. The study therefore provides some guidance for revising the insurance claim values set by the Malaysian road authority, hence the appropriate method
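
    One simple reading of the estimation step: fit a Weibull distribution to each claim sample by maximum likelihood and take the ratio of the fitted means (the paper models the ratio distribution itself, so this is only an illustration; synthetic claims stand in for the non-public insurance data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
fatal = rng.weibull(1.5, 500) * 90_000      # synthetic fatal claims ($)
nonfatal = rng.weibull(1.5, 2000) * 60_000  # synthetic non-fatal claims ($)

def weibull_mean(sample):
    """MLE Weibull fit (location fixed at 0) -> fitted mean."""
    shape, loc, scale = stats.weibull_min.fit(sample, floc=0)
    return stats.weibull_min(shape, loc, scale).mean()

print(weibull_mean(fatal) / weibull_mean(nonfatal))  # cost ratio, ~1.5 here
```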

  5. Measuring costs of child abuse and neglect: a mathematic model of specific cost estimations.

    PubMed

    Conrad, Cynthia

    2006-01-01

Few empirical facts exist regarding the actual costs of child abuse in the United States. Consistent data are not available for national or even statewide analysis. Clearly there is a need for such accounting in order to fully understand the damage created by child abuse and neglect. Policy makers and social welfare planners should take child abuse costs into consideration when determining expenditures for prevention and intervention programs. The real savings may far outweigh the costs of such programs when both direct and indirect costs of child abuse and neglect enter into the analysis. This paper offers a model of the actual costs of child abuse and neglect based on the direct, indirect, and opportunity costs associated with each case. Direct costs are those associated with the treatment of abused and neglected children as well as the costs of family intervention programs or foster care. Indirect costs are costs to society created by the negative effects of child abuse and neglect, evinced by individuals who suffer such abuse and then, as teens or adults, engage in criminal behavior. Indirect costs also derive from the long-term and ongoing health care needs of victims of abuse, for both physical and mental health disorders. With this model, the author hopes to stimulate discussion and a desire for better data collection and analysis. To demonstrate the utility of the model, the author has included some cost estimates from the Connecticut State Department of Children and Families and the works of other scholars looking into the question of costs for child abuse and neglect. These data represent the best available at this time. As a result, the model appearing here is specific to Connecticut. Even so, once more valid data become available, the model's structure and theoretical framework should adapt to the needs of other states to facilitate better measurement of relevant costs and provide a clearer picture of the utility of

  6. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    NASA Astrophysics Data System (ADS)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, based on a literature review, an evaluation of publicly funded projects such as those in the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art), and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and on the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and the number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families, with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of guidance, grounded in theory, for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis

  7. Why Don't They Just Give Us Money? Project Cost Estimating and Cost Reporting

    NASA Technical Reports Server (NTRS)

    Comstock, Douglas A.; Van Wychen, Kristin; Zimmerman, Mary Beth

    2015-01-01

    Successful projects require an integrated approach to managing cost, schedule, and risk. This is especially true for complex, multi-year projects involving multiple organizations. To explore solutions and leverage valuable lessons learned, NASA's Virtual Project Management Challenge will kick off a three-part series examining some of the challenges faced by project and program managers when it comes to managing these important elements. In this first session of the series, we will look at cost management, with an emphasis on the critical roles of cost estimating and cost reporting. By taking a proactive approach to both of these activities, project managers can better control life cycle costs, maintain stakeholder confidence, and protect other current and future projects in the organization's portfolio. Speakers will be Doug Comstock, Director of NASA's Cost Analysis Division, Kristin Van Wychen, Senior Analyst in the GAO Acquisition and Sourcing Management Team, and Mary Beth Zimmerman, Branch Chief for NASA's Portfolio Analysis Branch, Strategic Investments Division. Moderator Ramien Pierre is from NASA's Academy for Program/Project and Engineering Leadership (APPEL).

  8. Architects and Design-Phase Cost Estimates: Design Professionals Should Reconsider the Value of Third-Party Estimates

    ERIC Educational Resources Information Center

    Coakley, John

    2010-01-01

    Professional cost estimators are widely used by architects during the design phases of a project to provide preliminary cost estimates. These estimates may begin at the conceptual design phase and are prepared at regular intervals through the construction document phase. Estimating professionals are frequently tasked with "selling" the importance…

  9. 48 CFR 2452.216-70 - Estimated cost, base fee and award fee.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

48 CFR 2452.216-70 (Provisions and Clauses), Estimated cost, base fee and award fee: As prescribed in 2416.406(e)(1), insert the following clause in all cost-plus-award-fee contracts: Estimated Cost, Base Fee and Award Fee.

  10. 48 CFR 2452.216-70 - Estimated cost, base fee and award fee.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

48 CFR 2452.216-70 (Provisions and Clauses), Estimated cost, base fee and award fee: As prescribed in 2416.406(e)(1), insert the following clause in all cost-plus-award-fee contracts: Estimated Cost, Base Fee and Award Fee.

  11. Cost-Estimating Relationships for Tactical Combat Aircraft

    DTIC Science & Technology

    1984-11-01

IDA Memorandum Report M-14, "Cost-Estimating Relationships for Tactical Combat Aircraft," by Joseph W. Stahl, Joseph A. Arena, and Mark I. Knapp, November 1984. Prepared for the Office of the Under Secretary of Defense for Research and Engineering. The aircraft covered include those introduced into the force in the 1970s and 1980s: the F-14, F-15, F-16, F/A-18, A-10, and AV-8B.

  12. IUS/TUG orbital operations and mission support study. Volume 5: Cost estimates

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The costing approach, methodology, and rationale utilized for generating cost data for composite IUS and space tug orbital operations are discussed. Summary cost estimates are given along with cost data initially derived for the IUS program and space tug program individually, and cost estimates for each work breakdown structure element.

  13. Evaluation of a low-cost and accurate ocean temperature logger on subsurface mooring systems

    SciTech Connect

    Tian, Chuan; Deng, Zhiqun; Lu, Jun; Xu, Xiaoyang; Zhao, Wei; Xu, Ming

    2014-06-23

    Monitoring seawater temperature is important to understanding evolving ocean processes. To monitor internal waves or ocean mixing, a large number of temperature loggers are typically mounted on subsurface mooring systems to obtain high-resolution temperature data at different water depths. In this study, we redesigned and evaluated a compact, low-cost, self-contained, high-resolution and high-accuracy ocean temperature logger, TC-1121. The newly designed TC-1121 loggers are smaller, more robust, and their sampling intervals can be automatically changed by indicated events. They have been widely used in many mooring systems to study internal wave and ocean mixing. The logger’s fundamental design, noise analysis, calibration, drift test, and a long-term sea trial are discussed in this paper.

  14. DRG-based per diem payment system matches costs more accurately.

    PubMed

    Brannen, T J

    1999-04-01

    Some managed care organizations use the DRG hospital payment method developed for Medicare to set case rates. Unfortunately, when such a method is used in a risk-sharing arrangement, hospital and physician incentives are misaligned. Hospitals and payers would benefit from using a hospital reimbursement model that calculates inpatient per diem payments for medical and surgical cases by classifying DRGs in tiers and ranking the tiers according to how resource-intensive they are. DRGs provide the means for a rational classification system of per diem rates that recognizes cases where the expected resources are going to be higher or lower than the average per diem amount. If payers use per diem rates that are weighted according to a DRG classification, hospital payments can correlate closely with the actual costs per day for a specific case, rather than an average for all surgical or medical admissions.

  15. 48 CFR 9905.501 - Cost accounting standard-consistency in estimating, accumulating and reporting costs by...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

48 CFR 9905.501, Cost accounting standard: consistency in estimating, accumulating and reporting costs by educational institutions (Procurement Practices and Cost Accounting Standards, Cost Accounting Standards for Educational Institutions; Office of Federal Procurement Policy, Office of Management and Budget).

  16. Accurately measuring volume of soil samples using low cost Kinect 3D scanner

    NASA Astrophysics Data System (ADS)

    van der Sterre, B.; Hut, R.; Van De Giesen, N.

    2012-12-01

The 3D scanner of the Kinect game controller can be used to increase the accuracy and efficiency of determining in situ soil moisture content. Soil moisture is one of the principal hydrological variables in both the water and energy interactions between soil and atmosphere. Current in situ measurements of soil moisture either rely on indirect measurements (of electromagnetic constants or heat capacity) or on physically taking a sample and weighing it in a lab. The bottleneck in accurately retrieving soil moisture from samples is determining the volume of the sample. Currently this is mostly done by the very time-consuming "sand cone method," in which the volume where the sample used to sit is filled with sand. We show that the 3D scanner that is part of the $150 game controller extension "Kinect" can be used to make 3D scans before and after taking the sample. The accuracy of this method is tested by scanning forms of known volume. This method is less time consuming and less error-prone than using a sand cone.

  17. Accurately measuring volume of soil samples using low cost Kinect 3D scanner

    NASA Astrophysics Data System (ADS)

    van der Sterre, Boy-Santhos; Hut, Rolf; van de Giesen, Nick

    2013-04-01

The 3D scanner of the Kinect game controller can be used to increase the accuracy and efficiency of determining in situ soil moisture content. Soil moisture is one of the principal hydrological variables in both the water and energy interactions between soil and atmosphere. Current in situ measurements of soil moisture either rely on indirect measurements (of electromagnetic constants or heat capacity) or on physically taking a sample and weighing it in a lab. The bottleneck in accurately retrieving soil moisture from samples is determining the volume of the sample. Currently this is mostly done by the very time-consuming "sand cone method," in which the volume where the sample used to sit is filled with sand. We show that the 3D scanner that is part of the $150 game controller extension "Kinect" can be used to make 3D scans before and after taking the sample. The accuracy of this method is tested by scanning forms of known volume. This method is less time consuming and less error-prone than using a sand cone.
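
    The volume computation implied by both abstracts above reduces to integrating the depth difference between the two registered scans over the per-pixel footprint. A sketch that treats that footprint as constant (an approximation; the Kinect's footprint actually grows with range):

```python
import numpy as np

def sample_volume(depth_before, depth_after, pixel_area_m2=1e-6):
    """Volume (m^3) removed, from two registered depth maps in metres."""
    dz = np.clip(depth_after - depth_before, 0.0, None)  # deepened pixels
    return float(dz.sum() * pixel_area_m2)

# Synthetic check: a 100 x 100 px hole, 5 cm deep, 1 mm^2 per pixel
# -> 0.0005 m^3 (0.5 L).
before = np.zeros((100, 100))
after = np.full((100, 100), 0.05)
print(sample_volume(before, after))
```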

  18. Notification: Preliminary Research: Review of Independent Government Cost Estimates and Indirect Costs for EPA’s Interagency Agreements

    EPA Pesticide Factsheets

    Project #OA-FY14-0130, February 11, 2014. The EPA OIG plans to begin preliminary research of the independent government cost estimates and indirect costs for the EPA's funds-in interagency agreements.

  19. A cost-effective transparency-based digital imaging for efficient and accurate wound area measurement.

    PubMed

    Li, Pei-Nan; Li, Hong; Wu, Mo-Li; Wang, Shou-Yu; Kong, Qing-You; Zhang, Zhen; Sun, Yuan; Liu, Jia; Lv, De-Cheng

    2012-01-01

Wound measurement is an objective and direct way to trace the course of wound healing and to evaluate therapeutic efficacy. Nevertheless, the accuracy and efficiency of current measurement methods need to be improved. Taking advantage of the reliability of transparency tracing and the accuracy of computer-aided digital imaging, a transparency-based digital imaging approach was established, by which data from 340 wound tracings were collected from 6 experimental groups (8 rats/group) at 8 experimental time points (Days 1, 3, 5, 7, 10, 12, 14 and 16) and orderly archived onto a transparency model sheet. This sheet was scanned and its image saved in JPG format. Since a set of standard area units from 1 mm² to 1 cm² was integrated into the sheet, the traced areas in the JPG image were measured directly, using the "Magnetic Lasso" tool in the Adobe Photoshop program. The pixel values (PVs) of individual outlined regions were obtained and recorded at an average speed of 27 seconds/region. All PV data were saved in an Excel file and their corresponding areas were calculated simultaneously by the formula Y (PV of the outlined region)/X (PV of the standard area unit) × Z (area of the standard unit). It took a researcher less than 3 hours to finish the area calculation of 340 regions. In contrast, over 3 hours were expended by three skillful researchers to accomplish the same work with the traditional transparency-based method. Moreover, unlike the results obtained traditionally, little variation was found among the data calculated by different persons or with standard area units of different sizes and shapes. Given its accuracy, reproducibility and efficiency, this transparency-based digital imaging approach should be of significant value in basic wound healing research and clinical practice.
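
    The abstract's area formula translates directly into code. The only inputs are the pixel value (count) of the outlined region, that of a standard area unit, and the unit's physical area:

```python
def wound_area(pv_region: float, pv_unit: float,
               unit_area_mm2: float) -> float:
    """Area = Y (PV of region) / X (PV of standard unit) * Z (unit area)."""
    return pv_region / pv_unit * unit_area_mm2

# e.g. a 1 cm^2 (100 mm^2) standard unit covering 40,000 pixels:
print(wound_area(pv_region=18_500, pv_unit=40_000, unit_area_mm2=100.0))
# -> 46.25 mm^2
```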

  20. A Cost-Benefit and Accurate Method for Assessing Microalbuminuria: Single versus Frequent Urine Analysis.

    PubMed

    Hemmati, Roholla; Gharipour, Mojgan; Khosravi, Alireza; Jozan, Mahnaz

    2013-01-01

Background. The purpose of this study was to answer the question of whether a single test for microalbuminuria yields a reliable conclusion, leading to cost savings. Methods. This cross-sectional study included a total of 126 consecutive persons. Microalbuminuria was assessed by collection of two fasting random urine specimens, one on arrival at the clinic and one a week later in the morning. Results. Overall, 17 of 126 participants suffered from microalbuminuria; among them, 12 subjects were also diagnosed with microalbuminuria when this factor was assessed once, giving a sensitivity of 70.6%, a specificity of 100%, a PPV of 100%, an NPV of 95.6%, and an accuracy of 96.0%. The measured sensitivity, specificity, PPV, NPV, and accuracy in hypertensive patients were 73.3%, 100%, 100%, 94.8%, and 95.5%, respectively. In the nonhypertensive group, these rates were 50.0%, 100%, 100%, 97.3%, and 97.4%, respectively. According to the ROC curve analysis, a single measurement of UACR had high value for discriminating defective from normal renal function (c = 0.989). Urinary albumin concentration in a single measurement also had high discriminative value for diagnosis of a damaged kidney (c = 0.995). Conclusion. Single testing of both UACR and urine albumin level, rather than frequent testing, leads to high diagnostic sensitivity, specificity, and accuracy as well as high predictive values in the total population and also in hypertensive subgroups.

  1. Software cost/resource modeling: Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. J.

    1980-01-01

    A parametric software cost estimation model prepared for JPL deep space network (DSN) data systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models, such as those of the General Research Corporation, Doty Associates, IBM (Walston-Felix), Rome Air Force Development Center, University of Maryland, and Rayleigh-Norden-Putnam. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software lifecycle statistics. The estimation model output scales a standard DSN work breakdown structure skeleton, which is then input to a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.

  2. Towards Contactless, Low-Cost and Accurate 3D Fingerprint Identification.

    PubMed

    Kumar, Ajay; Kwong, Cyril

    2015-03-01

Human identification using fingerprint impressions has been widely studied and employed for more than 2000 years. Despite new advancements in 3D imaging technologies, a widely accepted representation of 3D fingerprint features and a matching methodology have yet to emerge. This paper investigates 3D representation of the widely employed 2D minutiae features by recovering and incorporating (i) minutiae height z and (ii) its 3D orientation φ, and illustrates an effective strategy for matching popular minutiae features extended into 3D space. One of the obstacles preventing emerging 3D fingerprint identification systems from replacing conventional 2D fingerprint systems lies in their bulk and high cost, which mainly result from the use of structured lighting systems or multiple cameras. This paper attempts to address such key limitations of current 3D fingerprint technologies by developing a single-camera 3D fingerprint identification system. We develop a generalized 3D minutiae matching model and recover extended 3D fingerprint features from the reconstructed 3D fingerprints. The 2D fingerprint images acquired for the 3D fingerprint reconstruction can themselves be employed for performance improvement, as illustrated in the work detailed in this paper. This paper also attempts to answer one of the most fundamental questions on the availability of inherent discriminable information in 3D fingerprints. The experimental results are presented on a database of 240 clients' 3D fingerprints, which is made publicly available to further research efforts in this area, and illustrate the discriminant power of 3D minutiae representation and matching in achieving performance improvement.

  3. Space transfer vehicle concepts and requirements study. Volume 3, book 1: Program cost estimates

    NASA Technical Reports Server (NTRS)

    Peffley, Al F.

    1991-01-01

    The Space Transfer Vehicle (STV) Concepts and Requirements Study cost estimate and program planning analysis is presented. The cost estimating technique used to support STV system, subsystem, and component cost analysis is a mixture of parametric cost estimating and selective cost analogy approaches. The parametric cost analysis is aimed at developing cost-effective aerobrake, crew module, tank module, and lander designs with the parametric cost estimates data. This is accomplished using cost as a design parameter in an iterative process with conceptual design input information. The parametric estimating approach segregates costs by major program life cycle phase (development, production, integration, and launch support). These phases are further broken out into major hardware subsystems, software functions, and tasks according to the STV preliminary program work breakdown structure (WBS). The WBS is defined to a low enough level of detail by the study team to highlight STV system cost drivers. This level of cost visibility provided the basis for cost sensitivity analysis against various design approaches aimed at achieving a cost-effective design. The cost approach, methodology, and rationale are described. A chronological record of the interim review material relating to cost analysis is included along with a brief summary of the study contract tasks accomplished during that period of review and the key conclusions or observations identified that relate to STV program cost estimates. The STV life cycle costs are estimated on the proprietary parametric cost model (PCM) with inputs organized by a project WBS. Preliminary life cycle schedules are also included.

  4. Insights on the role of accurate state estimation in coupled model parameter estimation by a conceptual climate model study

    NASA Astrophysics Data System (ADS)

    Yu, Xiaolin; Zhang, Shaoqing; Lin, Xiaopei; Li, Mingkui

    2017-03-01

    The uncertainties in values of coupled model parameters are an important source of model bias that causes model climate drift. The values can be calibrated by a parameter estimation procedure that projects observational information onto model parameters. The signal-to-noise ratio of error covariance between the model state and the parameter being estimated directly determines whether the parameter estimation succeeds or not. With a conceptual climate model that couples the stochastic atmosphere and slow-varying ocean, this study examines the sensitivity of state-parameter covariance on the accuracy of estimated model states in different model components of a coupled system. Due to the interaction of multiple timescales, the fast-varying atmosphere with a chaotic nature is the major source of the inaccuracy of estimated state-parameter covariance. Thus, enhancing the estimation accuracy of atmospheric states is very important for the success of coupled model parameter estimation, especially for the parameters in the air-sea interaction processes. The impact of chaotic-to-periodic ratio in state variability on parameter estimation is also discussed. This simple model study provides a guideline when real observations are used to optimize model parameters in a coupled general circulation model for improving climate analysis and predictions.

  5. Cost estimation of HVDC transmission system of Bangka's NPP candidates

    NASA Astrophysics Data System (ADS)

    Liun, Edwaren; Suparman

    2014-09-01

Regarding nuclear power plant development on Bangka Island, it can be anticipated that the power produced will exceed the island's demand and will need to be transmitted to Sumatra or Java. The distance between the islands causes considerable power loss when transmitting by alternating current, along with a wide range of technical and economic issues. This paper addresses the economic analysis of a direct current transmission system to overcome those technical problems. Direct current transmission has stable characteristics, so large-scale power delivery from Bangka to Sumatra or Java can be accomplished efficiently and reliably. HVDC system costs depend on the power capacity applied to the system and the length of the transmission line, in addition to other variables that may differ between projects.

  6. FASTSim: A Model to Estimate Vehicle Efficiency, Cost and Performance

    SciTech Connect

    Brooker, A.; Gonder, J.; Wang, L.; Wood, E.; Lopp, S.; Ramroth, L.

    2015-05-04

The Future Automotive Systems Technology Simulator (FASTSim) is a high-level advanced vehicle powertrain systems analysis tool supported by the U.S. Department of Energy’s Vehicle Technologies Office. FASTSim provides a quick and simple approach to compare powertrains and estimate the impact of technology improvements on light- and heavy-duty vehicle efficiency, performance, cost, and battery life over batches of real-world drive cycles. FASTSim’s calculation framework and balance among detail, accuracy, and speed enable it to simulate thousands of driven miles in minutes. The key components and vehicle outputs have been validated by comparing the model outputs to test data for many different vehicles to provide confidence in the results. A graphical user interface makes FASTSim easy and efficient to use. FASTSim is freely available for download from the National Renewable Energy Laboratory’s website (see www.nrel.gov/fastsim).

  7. Global cost estimates of reducing carbon emissions through avoided deforestation.

    PubMed

    Kindermann, Georg; Obersteiner, Michael; Sohngen, Brent; Sathaye, Jayant; Andrasko, Kenneth; Rametsteiner, Ewald; Schlamadinger, Bernhard; Wunder, Sven; Beach, Robert

    2008-07-29

Tropical deforestation is estimated to cause about one-quarter of anthropogenic carbon emissions, as well as loss of biodiversity and other environmental services. United Nations Framework Convention on Climate Change talks are now considering mechanisms for avoiding deforestation (AD), but the economic potential of AD has yet to be addressed. We use three economic models of global land use and management to analyze the potential contribution of AD activities to reduced greenhouse gas emissions. AD activities are found to be a competitive, low-cost abatement option. A program providing a 10% reduction in deforestation from 2005 to 2030 could provide 0.3-0.6 Gt (1 Gt = 1 x 10^15 g) CO2/yr in emission reductions and would require $0.4 billion to $1.7 billion per year for 30 years. A 50% reduction in deforestation from 2005 to 2030 could provide 1.5-2.7 Gt CO2/yr in emission reductions and would require $17.2 billion to $28.0 billion per year. Finally, some caveats to the analysis that could increase the costs of AD programs are described.

  8. Advanced Composite Cost Estimating Manual. Volume II. Appendix

    DTIC Science & Technology

    1976-08-01

  9. Life Cycle Cost Estimate of LSD(X)

    DTIC Science & Technology

    2012-06-01

Includes details of regression models for materiel cost in SWBS groups 100 through 500 (Figures 10-14).

  10. Children Can Accurately Monitor and Control Their Number-Line Estimation Performance

    ERIC Educational Resources Information Center

    Wall, Jenna L.; Thompson, Clarissa A.; Dunlosky, John; Merriman, William E.

    2016-01-01

    Accurate monitoring and control are essential for effective self-regulated learning. These metacognitive abilities may be particularly important for developing math skills, such as when children are deciding whether a math task is difficult or whether they made a mistake on a particular item. The present experiments investigate children's ability…

  11. 48 CFR 1852.216-84 - Estimated cost and incentive fee.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

... insert the following clause: Estimated Cost and Incentive Fee (OCT 1996). The target cost of this contract is $___. The target fee of this contract is $___. The total target cost and target fee as contemplated by the ...

  12. Taking the Evolutionary Road to Developing an In-House Cost Estimate

    NASA Technical Reports Server (NTRS)

    Jacintho, David; Esker, Lind; Herman, Frank; Lavaque, Rodolfo; Regardie, Myma

    2011-01-01

This slide presentation reviews the process and some of the problems and challenges of developing an In-House Cost Estimate (IHCE). Using the Space Network Ground Segment Sustainment (SGSS) project as an example, the presentation reviews the phases for developing a cost estimate within the project to estimate government and contractor project costs in support of a budget request.

  13. Improved rapid magnitude estimation for a community-based, low-cost MEMS accelerometer network

    USGS Publications Warehouse

    Chung, Angela I.; Cochran, Elizabeth S.; Kaiser, Anna E.; Christensen, Carl M.; Yildirim, Battalgazi; Lawrence, Jesse F.

    2015-01-01

Immediately following the Mw 7.2 Darfield, New Zealand, earthquake, over 180 Quake‐Catcher Network (QCN) low‐cost micro‐electro‐mechanical systems accelerometers were deployed in the Canterbury region. Using data recorded by this dense network from 2010 to 2013, we significantly improved the QCN rapid magnitude estimation relationship. The previous scaling relationship (Lawrence et al., 2014) did not accurately estimate the magnitudes of nearby (<35 km) events. The new scaling relationship estimates earthquake magnitudes within 1 magnitude unit of the GNS Science GeoNet earthquake catalog magnitudes for 99% of the events tested, within 0.5 magnitude units for 90% of the events, and within 0.25 magnitude units for 57% of the events. These magnitudes are reliably estimated within 3 s of the initial trigger recorded on at least seven stations. In this report, we present the methods used to calculate a new scaling relationship and demonstrate the accuracy of the revised magnitude estimates using a program that is able to retrospectively estimate event magnitudes using archived data.

  14. Handbook for quick cost estimates. A method for developing quick approximate estimates of costs for generic actions for nuclear power plants

    SciTech Connect

    Ball, J.R.

    1986-04-01

    This document is a supplement to a "Handbook for Cost Estimating" (NUREG/CR-3971) and provides specific guidance for developing "quick" approximate estimates of the cost of implementing generic regulatory requirements for nuclear power plants. A method is presented for relating the known construction costs for new nuclear power plants (as contained in the Energy Economic Data Base) to the cost of performing similar work, on a back-fit basis, at existing plants. Cost factors are presented to account for variations in such important cost areas as construction labor productivity, engineering and quality assurance, replacement energy, reworking of existing features, and regional variations in the cost of materials and labor. Other cost categories addressed in this handbook include those for changes in plant operating personnel and plant documents, licensee costs, NRC costs, and costs for other government agencies. Data sheets, worksheets, and appropriate cost algorithms are included to guide the user through preparation of rough estimates. A sample estimate is prepared using the method and the estimating tools provided.

  15. Innovation in the pharmaceutical industry: New estimates of R&D costs.

    PubMed

    DiMasi, Joseph A; Grabowski, Henry G; Hansen, Ronald W

    2016-05-01

    The research and development costs of 106 randomly selected new drugs were obtained from a survey of 10 pharmaceutical firms. These data were used to estimate the average pre-tax cost of new drug and biologics development. The costs of compounds abandoned during testing were linked to the costs of compounds that obtained marketing approval. The estimated average out-of-pocket cost per approved new compound is $1395 million (2013 dollars). Capitalizing out-of-pocket costs to the point of marketing approval at a real discount rate of 10.5% yields a total pre-approval cost estimate of $2558 million (2013 dollars). When compared to the results of the previous study in this series, total capitalized costs were shown to have increased at an annual rate of 8.5% above general price inflation. Adding an estimate of post-approval R&D costs increases the cost estimate to $2870 million (2013 dollars).

  16. Accurate, fast and cost-effective diagnostic test for monosomy 1p36 using real-time quantitative PCR.

    PubMed

    Cunha, Pricila da Silva; Pena, Heloisa B; D'Angelo, Carla Sustek; Koiffmann, Celia P; Rosenfeld, Jill A; Shaffer, Lisa G; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5-0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs.

  17. Accurate, Fast and Cost-Effective Diagnostic Test for Monosomy 1p36 Using Real-Time Quantitative PCR

    PubMed Central

    Cunha, Pricila da Silva; Pena, Heloisa B.; D'Angelo, Carla Sustek; Koiffmann, Celia P.; Rosenfeld, Jill A.; Shaffer, Lisa G.; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5–0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs. PMID:24839341
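
    To make the assay's readout concrete, the following minimal Python sketch computes a relative copy number from qPCR Ct values using the standard 2^-ddCt method; a heterozygous 1p36 deletion should give a ratio near 0.5. This is a generic illustration of relative quantification, not necessarily the authors' exact analysis, and all Ct values below are hypothetical.

      # Relative copy number from qPCR Ct values via the 2^-ddCt method.
      # A heterozygous deletion of the target (e.g. PRKCZ or SKI) should
      # yield a relative quantity near 0.5 versus a two-copy control.

      def relative_copy_number(ct_target, ct_ref, ct_target_ctrl, ct_ref_ctrl):
          """ct_target/ct_ref: patient Ct values for the target gene and a
          two-copy reference gene; *_ctrl: the same for a normal control."""
          dd_ct = (ct_target - ct_ref) - (ct_target_ctrl - ct_ref_ctrl)
          return 2.0 ** (-dd_ct)

      # Hypothetical values: the patient's target amplifies ~1 cycle later.
      ratio = relative_copy_number(26.1, 24.0, 25.1, 24.0)
      print(f"relative copy number: {ratio:.2f}")  # ~0.50, deletion-consistent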

  18. Cost estimation: An expert-opinion approach. [cost analysis of research projects using the Delphi method (forecasting)

    NASA Technical Reports Server (NTRS)

    Buffalano, C.; Fogleman, S.; Gielecki, M.

    1976-01-01

    A methodology is outlined which can be used to estimate the costs of research and development projects. The approach uses the Delphi technique, a method developed by the Rand Corporation for systematically eliciting and evaluating group judgments in an objective manner. Use of the Delphi technique allows expert opinion to be integrated into the cost-estimating process in a consistent and rigorous fashion. The approach can also signal potential cost-problem areas, a result that is useful in planning additional cost analysis or in estimating contingency funds. A Monte Carlo approach is also examined.

  19. Estimating Well Costs for Enhanced Geothermal System Applications

    SciTech Connect

    K. K. Bloomfield; P. T. Laney

    2005-08-01

    The objective of the work reported was to investigate the costs of drilling and completing wells and to relate those costs to the economic viability of enhanced geothermal systems (EGS). This is part of a larger parametric study of major cost components in an EGS. The potential for improving the economics of EGS can be assessed by analyzing the major cost components of the system, which include well drilling and completion. Identifying the cost components to which EGS economics are most sensitive indicates where research can most effectively reduce those costs. The results of the well cost analysis will help determine the cost of a well for EGS development.

  20. IDC reengineering Phase 2 & 3 US industry standard cost estimate summary

    SciTech Connect

    Harris, James M.; Huelskamp, Robert M.

    2015-01-01

    Sandia National Laboratories has prepared a ROM cost estimate for budgetary planning for the IDC Reengineering Phase 2 & 3 effort, using a commercial software cost estimation tool calibrated to US industry performance parameters. This is not a cost estimate for Sandia to perform the project. This report provides the ROM cost estimate and describes the methodology, assumptions, and cost model details used to create the ROM cost estimate. ROM Cost Estimate Disclaimer Contained herein is a Rough Order of Magnitude (ROM) cost estimate that has been provided to enable initial planning for this proposed project. This ROM cost estimate is submitted to facilitate informal discussions in relation to this project and is NOT intended to commit Sandia National Laboratories (Sandia) or its resources. Furthermore, as a Federally Funded Research and Development Center (FFRDC), Sandia must be compliant with the Anti-Deficiency Act and operate on a full-cost recovery basis. Therefore, while Sandia, in conjunction with the Sponsor, will use best judgment to execute work and to address the highest risks and most important issues in order to effectively manage within cost constraints, this ROM estimate and any subsequent approved cost estimates are on a 'full-cost recovery' basis. Thus, work can neither commence nor continue unless adequate funding has been accepted and certified by DOE.

  1. Bayesian parameter estimation of a k-ε model for accurate jet-in-crossflow simulations

    SciTech Connect

    Ray, Jaideep; Lefantzi, Sophia; Arunajatesan, Srinivasan; Dechant, Lawrence

    2016-05-31

    Reynolds-averaged Navier–Stokes models are not very accurate for high-Reynolds-number compressible jet-in-crossflow interactions. The inaccuracy arises from the use of inappropriate model parameters and from model-form errors in the Reynolds-averaged Navier–Stokes model. This study pursues the hypothesis that Reynolds-averaged Navier–Stokes predictions can be significantly improved by using parameters inferred from experimental measurements of a supersonic jet interacting with a transonic crossflow.

  2. Accurate state estimation for a hydraulic actuator via a SDRE nonlinear filter

    NASA Astrophysics Data System (ADS)

    Strano, Salvatore; Terzo, Mario

    2016-06-01

    State estimation in hydraulic actuators is a fundamental tool for the detection of faults and a valid alternative to the installation of sensors. Because of the hard nonlinearities that characterize hydraulic actuators, the performance of linear/linearization-based state estimation techniques is strongly limited. To overcome these limits, this paper focuses on an alternative nonlinear estimation method based on the State-Dependent Riccati Equation (SDRE). The technique is able to fully take into account the system nonlinearities and the measurement noise. A fifth-order nonlinear model is derived and employed for the synthesis of the estimator. Simulations and experimental tests have been conducted, and comparisons with the widely used Extended Kalman Filter (EKF) are illustrated. The results show the effectiveness of the SDRE-based technique for applications characterized by non-negligible nonlinearities such as dead zones and friction.

  3. Accurate liability estimation improves power in ascertained case-control studies.

    PubMed

    Weissbrod, Omer; Lippert, Christoph; Geiger, Dan; Heckerman, David

    2015-04-01

    Linear mixed models (LMMs) have emerged as the method of choice for confounded genome-wide association studies. However, the performance of LMMs in nonrandomly ascertained case-control studies deteriorates with increasing sample size. We propose a framework called LEAP (liability estimator as a phenotype; https://github.com/omerwe/LEAP) that tests for association with estimated latent values corresponding to severity of phenotype, and we demonstrate that this can lead to a substantial power increase.

  4. Robust and Accurate Vision-Based Pose Estimation Algorithm Based on Four Coplanar Feature Points

    PubMed Central

    Zhang, Zimiao; Zhang, Shihai; Li, Qiu

    2016-01-01

    Vision-based pose estimation is an important application of machine vision. Currently, analytical and iterative methods are used to solve the object pose. Analytical solutions generally take less computation time but are extremely susceptible to noise. Iterative solutions minimize the distance error between feature points based on 2D image pixel coordinates, but the nonlinear optimization needs a good initial estimate of the true solution; otherwise, it is more time consuming than the analytical solutions. Moreover, image processing error grows rapidly as the measurement range increases, which leads to pose estimation errors. All of these factors cause accuracy to decrease. To solve this problem, a novel pose estimation method based on four coplanar points is proposed. First, the coordinates of the feature points are determined according to the linear constraints formed by the four points. The initial coordinates acquired through this linear method are then optimized through an iterative method. Finally, the coordinate system of the object motion is established, and a method is introduced to solve the object pose. Because the growing image processing error causes pose estimation errors as the measurement range increases, the coordinate system is used to decrease these errors. The proposed method is compared with two other existing methods through experiments. Experimental results demonstrate that the proposed method works efficiently and stably. PMID:27999338

  5. Accurate and efficient velocity estimation using Transmission matrix formalism based on the domain decomposition method

    NASA Astrophysics Data System (ADS)

    Wang, Benfeng; Jakobsen, Morten; Wu, Ru-Shan; Lu, Wenkai; Chen, Xiaohong

    2017-03-01

    Full waveform inversion (FWI) has been regarded as an effective tool to build the velocity model for subsequent pre-stack depth migration. Traditional inversion methods are built on the Born approximation and are initial-model dependent; this problem can be avoided by introducing the transmission matrix (T-matrix), because the T-matrix includes all orders of scattering effects. The T-matrix can be estimated from spatial-aperture- and frequency-bandwidth-limited seismic data using linear optimization methods. However, the full T-matrix inversion method (FTIM) is required to estimate velocity perturbations, which is very time consuming. The efficiency can be improved using the previously proposed inverse thin-slab propagator (ITSP) method, especially for large-scale models. However, the ITSP method is currently designed for smooth media, so the estimation results are unsatisfactory when the velocity perturbation is relatively large. In this paper, we propose a domain decomposition method (DDM) to improve the efficiency of velocity estimation for models with large perturbations, while guaranteeing the estimation accuracy. Numerical examples for smooth Gaussian ball models and a reservoir model with sharp boundaries are performed using the ITSP method, the proposed DDM, and the FTIM. The estimated velocity distributions, the relative errors, and the elapsed times all demonstrate the validity of the proposed DDM.

  6. Comparing the standards of one metabolic equivalent of task in accurately estimating physical activity energy expenditure based on acceleration.

    PubMed

    Kim, Dohyun; Lee, Jongshill; Park, Hoon Ki; Jang, Dong Pyo; Song, Soohwa; Cho, Baek Hwan; Jung, Yoo-Suk; Park, Rae-Woong; Joo, Nam-Seok; Kim, In Young

    2016-08-24

    The purpose of the study is to analyse how the standard used for the resting metabolic rate (RMR) affects estimation of the metabolic equivalent of task (MET) from an accelerometer. To investigate the effect on estimation across activity intensities, comparisons were conducted between the conventional 3.5 ml O2 · kg(-1) · min(-1) and individually measured resting VO2 as the standard for 1 MET. METs were estimated by linear regression equations derived through five-fold cross-validation using the two types of MET values and accelerations; the accuracy of estimation was analysed through cross-validation, Bland-Altman plots, and a one-way ANOVA test. There were no significant differences in RMS error after cross-validation. However, the individual RMR-based estimations showed mean differences of as much as 0.5 METs in modified Bland-Altman plots compared with the 3.5 ml O2 · kg(-1) · min(-1) standard. Finally, the results of the ANOVA test indicated that the individual RMR-based estimations had fewer significant differences between the reference and estimated values at each intensity of activity. In conclusion, the RMR standard is a factor that affects accurate estimation of METs from acceleration; therefore, the RMR should be individually specified when it is used for estimating METs with an accelerometer.
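
    The quantity at stake can be made concrete with a short sketch: the same measured activity VO2 converts to different MET values depending on which 1-MET standard divides it. The VO2 numbers below are hypothetical.

      # METs from measured VO2 under two different 1-MET standards.

      def mets(vo2_ml_kg_min, one_met_standard=3.5):
          """Convert activity VO2 (ml O2/kg/min) to METs."""
          return vo2_ml_kg_min / one_met_standard

      activity_vo2 = 14.0    # VO2 measured during the activity (hypothetical)
      individual_rmr = 3.1   # individually measured resting VO2 (hypothetical)

      print(mets(activity_vo2))                  # 4.0 METs, conventional standard
      print(mets(activity_vo2, individual_rmr))  # ~4.5 METs, individual standard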

  7. Developing a Cost Model and Methodology to Estimate Capital Costs for Thermal Energy Storage

    SciTech Connect

    Glatzmaier, G.

    2011-12-01

    This report provides an update on the previous cost model for thermal energy storage (TES) systems. The update allows NREL to estimate the costs of such systems that are compatible with the higher operating temperatures associated with advanced power cycles. The goal of the Department of Energy (DOE) Solar Energy Technology Program is to develop solar technologies that can make a significant contribution to the United States domestic energy supply. The recent DOE SunShot Initiative sets a very aggressive cost goal to reach a Levelized Cost of Energy (LCOE) of 6 cents/kWh by 2020 with no incentives or credits for all solar-to-electricity technologies. As this goal is reached, the share of utility power generation that is provided by renewable energy sources is expected to increase dramatically. Because Concentrating Solar Power (CSP) is currently the only renewable technology that is capable of integrating cost-effective energy storage, it is positioned to play a key role in providing renewable, dispatchable power to utilities as the share of power generation from renewable sources increases. Because of this role, future CSP plants will likely have as much as 15 hours of Thermal Energy Storage (TES) included in their design and operation. As such, the cost and performance of the TES system is critical to meeting the SunShot goal for solar technologies. The cost of electricity from a CSP plant depends strongly on its overall efficiency, which is a product of two components - the collection and conversion efficiencies. The collection efficiency determines the portion of incident solar energy that is captured as high-temperature thermal energy. The conversion efficiency determines the portion of thermal energy that is converted to electricity. The operating temperature at which the overall efficiency reaches its maximum depends on many factors, including material properties of the CSP plant components. Increasing the operating temperature of the power generation

  8. Energetic costs of mange in wolves estimated from infrared thermography

    USGS Publications Warehouse

    Cross, Paul C.; Almberg, Emily S.; Haase, Catherine G; Hudson, Peter J.; Maloney, Shane K; Metz, Matthew C; Munn, Adam J; Nugent, Paul; Putzeys, Olivier; Stahler, Daniel R.; Stewart, Anya C; Smith, Doug W.

    2016-01-01

    Parasites, by definition, extract energy from their hosts and thus affect trophic and food web dynamics even when the parasite may have limited effects on host population size. We studied the energetic costs of mange (Sarcoptes scabiei) in wolves (Canis lupus) using thermal cameras to estimate heat losses associated with compromised insulation during the winter. We combined field data from known, naturally infected wolves with a data set on captive wolves with shaved patches of fur as a positive control to simulate mange-induced hair loss. We predict that during the winter in Montana, more severe mange infection increases heat loss by around 5.2 to 12 MJ per night (1,240 to 2,850 kcal, or a 65% to 78% increase) for small and large wolves, respectively, accounting for wind effects. Maintaining body temperature would thus require a significant proportion of a healthy wolf's total daily energy demands (18-22 MJ/day). We also predict how these thermal costs may increase in colder climates by comparing our predictions for Bozeman, Montana with those for a place with lower ambient temperatures (Fairbanks, Alaska). Contrary to our expectations, the 14°C differential between these regions was not as important as the potential differences in wind speed. These large increases in energetic demands can be mitigated by either increasing consumption rates or decreasing other energy demands. Data from GPS-collared wolves indicated that healthy wolves move, on average, 17 km per day, and that this distance was reduced by 1.5, 1.8, and 6.5 km for light, medium, and severe hair loss, respectively. In addition, the wolf with the most hair loss was less active at night and more active during the day, the converse of the movement patterns of healthy wolves. At the individual level, mange infections create significant energy demands and altered behavioral patterns; this may have cascading effects on prey consumption rates, food web dynamics, predator-prey interactions, and scavenger communities.

  9. Accurate kinetic parameter estimation during progress curve analysis of systems with endogenous substrate production.

    PubMed

    Goudar, Chetan T

    2011-10-01

    We have identified an error in the published integral form of the modified Michaelis-Menten equation that accounts for endogenous substrate production. The correct solution is presented, and the error in both the substrate concentration, S, and the kinetic parameters Vm, Km, and R resulting from the incorrect solution is characterized. The incorrect integral form resulted in substrate concentration errors as high as 50%, which translated into 7-50% error in the kinetic parameter estimates. To better reflect experimental scenarios, noise-containing substrate depletion data were analyzed with both the incorrect and correct integral equations. While both equations produced identical fits to the substrate depletion data, the final estimates of Vm, Km, and R differed, and the Km and R estimates from the incorrect integral equation deviated substantially from the actual values. Another observation was that at R = 0, the incorrect integral equation reduced to the correct form of the Michaelis-Menten equation. We believe this combination of excellent fits to experimental data, albeit with incorrect kinetic parameter estimates, and the reduction to the Michaelis-Menten equation at R = 0 is primarily responsible for the error going unnoticed. However, the resulting error in kinetic parameter estimates will lead to incorrect biological interpretation, and we urge the use of the correct integral form presented in this study.
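
    For orientation, the rate model in question (reconstructed here as an assumption, since the abstract does not print it) is standard Michaelis-Menten substrate depletion plus a constant endogenous production term R; the disputed integral form is the implicit solution of this differential equation:

      % Michaelis--Menten depletion with constant endogenous production
      % (assumed model form; Vm, Km, and R as named in the abstract)
      \frac{\mathrm{d}S}{\mathrm{d}t} = -\frac{V_m S}{K_m + S} + R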

  10. Use of Multiple Data Sources to Estimate the Economic Cost of Dengue Illness in Malaysia

    PubMed Central

    Shepard, Donald S.; Undurraga, Eduardo A.; Lees, Rosemary Susan; Halasa, Yara; Lum, Lucy Chai See; Ng, Chiu Wan

    2012-01-01

    Dengue represents a substantial burden in many tropical and sub-tropical regions of the world. We estimated the economic burden of dengue illness in Malaysia. Information about economic burden is needed for setting health policy priorities, but accurate estimation is difficult because of incomplete data. We overcame this limitation by merging multiple data sources to refine our estimates, including an extensive literature review, discussion with experts, review of data from health and surveillance systems, and implementation of a Delphi process. Because Malaysia has a passive surveillance system, the number of dengue cases is under-reported. Using an adjusted estimate of total dengue cases, we estimated an economic burden of dengue illness of US$56 million (MYR 196 million) per year, which is approximately US$2.03 (MYR 7.14) per capita. The overall economic burden of dengue would be even higher if we included costs associated with dengue prevention and control, dengue surveillance, and long-term sequelae of dengue. PMID:23033404

  11. Breckinridge Project, initial effort. Report IX. Operating cost estimate

    SciTech Connect

    1982-01-01

    Operating costs are normally broken into three major categories: variable costs, including raw materials, annual catalyst and chemicals, and utilities; semi-variable costs, including labor and labor-related costs; and fixed or capital-related charges. The raw materials and utilities costs are proportional to production; however, a small component of the utilities cost is independent of production. The catalyst and chemicals costs are also normally proportional to production. Semi-variable costs include direct labor, maintenance labor, labor supervision, contract maintenance, maintenance materials, payroll overheads, operating supplies, and general overhead and administration. Fixed costs include local taxes, insurance, and the time value of the capital investment; the latter charge often includes the investor's anticipated return on investment. In determining operating costs for financial analysis, return on investment (ROI) and depreciation are not treated as cash operating costs. These costs are developed in the financial analysis; the annual operating cost determined here omits ROI and depreciation. Project annual operating costs are summarized in Table 1. Detailed supporting information for the cost elements listed below is included in the following sections: electrical; catalyst and chemicals; and salaries and wages.

  12. Alpha's standard error (ASE): an accurate and precise confidence interval estimate.

    PubMed

    Duhachek, Adam; Iacobucci, Dawn

    2004-10-01

    This research presents the inferential statistics for Cronbach's coefficient alpha on the basis of the standard statistical assumption of multivariate normality. The estimation of alpha's standard error (ASE) and confidence intervals are described, and the authors analytically and empirically investigate the effects of the components of these equations. The authors then demonstrate the superiority of this estimate compared with previous derivations of ASE in a separate Monte Carlo simulation. The authors also present a sampling error and test statistic for a test of independent sample alphas. They conclude with a recommendation that all alpha coefficients be reported in conjunction with standard error or confidence interval estimates and offer SAS and SPSS programming codes for easy implementation.
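
    As a companion to the recommendation that every alpha be reported with an interval estimate, here is a minimal Python sketch that computes coefficient alpha and a bootstrap confidence interval. The bootstrap is a resampling substitute for the paper's analytic ASE (which assumes multivariate normality); the function names are ours.

      import numpy as np

      def cronbach_alpha(items):
          """items: (n_respondents, k_items) array of item scores."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_var_sum = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_var_sum / total_var)

      def alpha_bootstrap_ci(items, n_boot=2000, level=0.95, seed=0):
          """Percentile bootstrap CI for alpha, resampling respondents."""
          items = np.asarray(items, dtype=float)
          rng = np.random.default_rng(seed)
          n = items.shape[0]
          boots = [cronbach_alpha(items[rng.integers(0, n, n)])
                   for _ in range(n_boot)]
          tail = (1 - level) / 2 * 100
          lo, hi = np.percentile(boots, [tail, 100 - tail])
          return lo, hi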

  13. Precision Pointing Control to and Accurate Target Estimation of a Non-Cooperative Vehicle

    NASA Technical Reports Server (NTRS)

    VanEepoel, John; Thienel, Julie; Sanner, Robert M.

    2006-01-01

    In 2004, NASA began investigating a robotic servicing mission for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates in order to achieve capture by the proposed Hubble Robotic Vehicle (HRV), but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST. To generalize the situation, HST is the target vehicle and HRV is the chaser. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a control scheme. Non-cooperative in this context relates to the target vehicle no longer having the ability to maintain attitude control or transmit attitude knowledge.

  14. Accurate State Estimation and Tracking of a Non-Cooperative Target Vehicle

    NASA Technical Reports Server (NTRS)

    Thienel, Julie K.; Sanner, Robert M.

    2006-01-01

    Autonomous space rendezvous scenarios require knowledge of the target vehicle state in order to safely dock with the chaser vehicle. Ideally, the target vehicle state information is derived from telemetered data, or with the use of known tracking points on the target vehicle. However, if the target vehicle is non-cooperative and does not have the ability to maintain attitude control, or transmit attitude knowledge, the docking becomes more challenging. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a tracking control scheme. The approach is tested with the robotic servicing mission concept for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates, but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST.

  15. A microbial clock provides an accurate estimate of the postmortem interval in a mouse model system

    PubMed Central

    Metcalf, Jessica L; Wegener Parfrey, Laura; Gonzalez, Antonio; Lauber, Christian L; Knights, Dan; Ackermann, Gail; Humphrey, Gregory C; Gebert, Matthew J; Van Treuren, Will; Berg-Lyons, Donna; Keepers, Kyle; Guo, Yan; Bullard, James; Fierer, Noah; Carter, David O; Knight, Rob

    2013-01-01

    Establishing the time since death is critical in every death investigation, yet existing techniques are susceptible to a range of errors and biases. For example, forensic entomology is widely used to assess the postmortem interval (PMI), but errors can range from days to months. Microbes may provide a novel method for estimating PMI that avoids many of these limitations. Here we show that postmortem microbial community changes are dramatic, measurable, and repeatable in a mouse model system, allowing PMI to be estimated within approximately 3 days over 48 days. Our results provide a detailed understanding of bacterial and microbial eukaryotic ecology within a decomposing corpse system and suggest that microbial community data can be developed into a forensic tool for estimating PMI. DOI: http://dx.doi.org/10.7554/eLife.01104.001 PMID:24137541

  16. Fast and accurate probability density estimation in large high dimensional astronomical datasets

    NASA Astrophysics Data System (ADS)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2015-01-01

    Astronomical surveys will generate measurements of hundreds of attributes (e.g., color, size, shape) on hundreds of millions of sources. Analyzing these large, high-dimensional data sets will require efficient algorithms for data analysis. An example is probability density estimation, which is at the heart of many classification problems, such as the separation of stars and quasars based on their colors. Popular density estimation techniques use binning or kernel density estimation. Kernel density estimation has a small memory footprint but often requires large computational resources. Binning has small computational requirements, but it is usually implemented with multi-dimensional arrays, which leads to memory requirements that scale exponentially with the number of dimensions. Hence neither technique scales well to large data sets in high dimensions. We present an alternative approach: binning implemented with hash tables (BASH tables). This approach exploits the sparseness of data in high-dimensional space to keep the memory requirements small. However, hashing requires some extra computation, so a priori it is not clear whether the reduction in memory requirements will lead to increased computational requirements. Through an implementation of BASH tables in C++, we show that the additional computational requirements of hashing are negligible; hence this approach has small memory and computational requirements. We apply our density estimation technique to photometric selection of quasars using non-parametric Bayesian classification and show that the accuracy of the classification is the same as that of earlier approaches. Since the BASH table approach is one to three orders of magnitude faster than earlier approaches, it may be useful in various other applications of density estimation in astrostatistics.
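
    The paper's implementation is in C++; the Python sketch below only illustrates the core idea of hash-table binning, namely that memory scales with the number of occupied bins rather than with the full multi-dimensional grid. All names are ours.

      from collections import defaultdict
      import numpy as np

      def hash_binned_counts(points, bin_width):
          """points: (n, d) array. Store counts only for occupied bins,
          keyed by the tuple of integer bin indices."""
          counts = defaultdict(int)
          for p in np.asarray(points, dtype=float):
              counts[tuple(np.floor(p / bin_width).astype(int))] += 1
          return counts

      def density_at(counts, x, bin_width, n_total):
          """Histogram density estimate at x: bin count / (n * bin volume)."""
          key = tuple(np.floor(np.asarray(x, dtype=float) / bin_width).astype(int))
          return counts.get(key, 0) / (n_total * bin_width ** len(x))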

  17. Spectral estimation from laser scanner data for accurate color rendering of objects

    NASA Astrophysics Data System (ADS)

    Baribeau, Rejean

    2002-06-01

    Estimation methods are studied for the recovery of the spectral reflectance across the visible range from sensing at just three discrete laser wavelengths. Methods based on principal component analysis and on spline interpolation are judged by the CIE94 color differences for some reference data sets. These include the Macbeth color checker, the OSA-UCS color charts, some artist pigments, and a collection of miscellaneous surface colors. The optimal three sampling wavelengths are also investigated. It is found that color can be estimated with an average accuracy of ΔE94 = 2.3 when the optimal wavelengths 455 nm, 540 nm, and 610 nm are used.

  18. Data Anonymization that Leads to the Most Accurate Estimates of Statistical Characteristics: Fuzzy-Motivated Approach

    PubMed Central

    Xiang, G.; Ferson, S.; Ginzburg, L.; Longpré, L.; Mayorga, E.; Kosheleva, O.

    2013-01-01

    To preserve privacy, the original data points (with exact values) are replaced by boxes containing each (inaccessible) data point. This privacy-motivated uncertainty leads to uncertainty in the statistical characteristics computed based on this data. In a previous paper, we described how to minimize this uncertainty under the assumption that we use the same standard statistical estimates for the desired characteristics. In this paper, we show that we can further decrease the resulting uncertainty if we allow fuzzy-motivated weighted estimates, and we explain how to optimally select the corresponding weights. PMID:25187183

  19. Accurate and unbiased estimation of power-law exponents from single-emitter blinking data.

    PubMed

    Hoogenboom, Jacob P; den Otter, Wouter K; Offerhaus, Herman L

    2006-11-28

    Single emitter blinking with a power-law distribution for the on and off times has been observed on a variety of systems including semiconductor nanocrystals, conjugated polymers, fluorescent proteins, and organic fluorophores. The origin of this behavior is still under debate. Reliable estimation of power exponents from experimental data is crucial in validating the various models under consideration. We derive a maximum likelihood estimator for power-law distributed data and analyze its accuracy as a function of data set size and power exponent both analytically and numerically. Results are compared to least-squares fitting of the double logarithmically transformed probability density. We demonstrate that least-squares fitting introduces a severe bias in the estimation result and that the maximum likelihood procedure is superior in retrieving the correct exponent and reducing the statistical error. For a data set as small as 50 data points, the error margins of the maximum likelihood estimator are already below 7%, giving the possibility to quantify blinking behavior when data set size is limited, e.g., due to photobleaching.
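
    The contrast the abstract draws can be illustrated with the standard closed-form maximum likelihood estimator for continuous power-law data above a known lower cutoff. This is the generic estimator of that type, shown as an illustration rather than as the paper's exact derivation.

      import numpy as np

      def powerlaw_mle(x, x_min):
          """MLE for p(x) ~ x**(-alpha), x >= x_min (continuous case).
          Returns (alpha_hat, approximate standard error)."""
          x = np.asarray(x, dtype=float)
          x = x[x >= x_min]
          n = x.size
          alpha = 1.0 + n / np.log(x / x_min).sum()
          return alpha, (alpha - 1.0) / np.sqrt(n)

      # Synthetic check: inverse-transform sampling with alpha = 1.5.
      rng = np.random.default_rng(1)
      x = 1e-3 * (1 - rng.random(5000)) ** (-1 / 0.5)
      print(powerlaw_mle(x, 1e-3))  # alpha_hat close to 1.5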

  20. How Accurate and Robust Are the Phylogenetic Estimates of Austronesian Language Relationships?

    PubMed Central

    Greenhill, Simon J.; Drummond, Alexei J.; Gray, Russell D.

    2010-01-01

    We recently used computational phylogenetic methods on lexical data to test between two scenarios for the peopling of the Pacific. Our analyses of lexical data supported a pulse-pause scenario of Pacific settlement in which the Austronesian speakers originated in Taiwan around 5,200 years ago and rapidly spread through the Pacific in a series of expansion pulses and settlement pauses. We claimed that there was high congruence between traditional language subgroups and those observed in the language phylogenies, and that the estimated age of the Austronesian expansion at 5,200 years ago was consistent with the archaeological evidence. However, the congruence between the language phylogenies and the evidence from historical linguistics was not quantitatively assessed using tree comparison metrics. The robustness of the divergence time estimates to different calibration points was also not investigated exhaustively. Here we address these limitations by using a systematic tree comparison metric to calculate the similarity between the Bayesian phylogenetic trees and the subgroups proposed by historical linguistics, and by re-estimating the age of the Austronesian expansion using only the most robust calibrations. The results show that the Austronesian language phylogenies are highly congruent with the traditional subgroupings, and the date estimates are robust even when calculated using a restricted set of historical calibrations. PMID:20224774

  1. Endometriosis in Italy: from cost estimates to new medical treatment.

    PubMed

    Luisi, Stefano; Lazzeri, Lucia; Ciani, Valentina; Petraglia, Felice

    2009-11-01

    Endometriosis is defined as the presence of endometrial-like tissue outside the uterus, which induces a chronic inflammatory reaction. Data collected in Italy show that around 3 million women are affected by endometriosis; the condition is found predominantly in women of reproductive age (50% of affected women are in the 29-39 age range), and only 25% of women are asymptomatic. The associated symptoms can affect general physical, mental, and social well-being. Endometriosis is associated with severe dysmenorrhea, deep dyspareunia, chronic pelvic pain, ovulation pain, cyclical or perimenstrual symptoms with or without abnormal bleeding, infertility, and chronic fatigue. The annual cost of hospital admissions can be estimated at around 54 million euros in total. Even today, the average time to a correct diagnosis is around 9 years, at the end of a long and expensive diagnostic search. Therapies can be useful to relieve and sometimes resolve the symptoms, support fertility, eliminate endometrial lesions, and restore the anatomy of the pelvis. For medical therapy, several different preparations (oral contraceptives, progestogens, gestrinone, danazol, and GnRHa) and new options (GnRH antagonists, aromatase inhibitors, estrogen receptor beta agonists, progesterone receptor modulators, angiogenesis inhibitors, and COX-2 selective inhibitors) are available.

  2. Different approaches to estimating transition costs in the electric- utility industry

    SciTech Connect

    Baxter, L.W.

    1995-10-01

    The term "transition costs" describes the potential revenue shortfall (or welfare loss) a utility (or other actor) may experience through government-initiated deregulation of electricity generation. The potential for transition costs arises whenever a regulated industry is subject to competitive market forces as a result of explicit government action. Federal and state proposals to deregulate electricity generation sparked a national debate on transition costs in the electric-utility industry. Industry-wide transition cost estimates range from about $20 billion to $500 billion. Such disparate estimates raise important questions on estimation methods for decision makers. This report examines different approaches to estimating transition costs. The study has three objectives. First, we discuss the concept of transition cost. Second, we identify the major cost categories included in transition cost estimates and summarize the current debate on which specific costs are appropriately included in these estimates. Finally, we identify general and specific estimation approaches and assess their strengths and weaknesses. We relied primarily on the evidentiary records established at the Federal Energy Regulatory Commission and the California Public Utilities Commission to identify major cost categories and specific estimation approaches. We also contacted regulatory commission staffs in ten states to ascertain estimation activities in each of these states. We refined a classification framework to describe and assess general estimation options. We subsequently developed and applied criteria to describe and assess specific estimation approaches proposed by federal regulators, state regulators, utilities, independent power companies, and consultants.

  3. Accurate estimation of influenza epidemics using Google search data via ARGO

    PubMed Central

    Yang, Shihao; Santillana, Mauricio; Kou, S. C.

    2015-01-01

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search–based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people’s online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions. PMID:26553980

  4. Do hand-held calorimeters provide reliable and accurate estimates of resting metabolic rate?

    PubMed

    Van Loan, Marta D

    2007-12-01

    This paper provides an overview of a new technique for indirect calorimetry and the assessment of resting metabolic rate. Findings from the research literature address the reliability and validity of a new hand-held indirect calorimeter as well as its use in clinical and field settings. Research findings to date are mixed. The MedGem instrument has provided more consistent results when compared with the Douglas bag method of measuring metabolic rate. The BodyGem instrument has been shown to be less accurate when compared with standard metabolic carts. Furthermore, when the BodyGem has been used with clinical patients or with undernourished individuals, the results have not been acceptable. Overall, the body of evidence is not yet large enough to definitively support the use of these hand-held devices for assessment of metabolic rate in a wide variety of clinical or research environments.

  5. Accurate estimation of influenza epidemics using Google search data via ARGO.

    PubMed

    Yang, Shihao; Santillana, Mauricio; Kou, S C

    2015-11-24

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search-based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people's online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions.
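
    Schematically, this model class is an autoregression on recent flu activity augmented with search-term frequencies under an L1 penalty. The sketch below captures that structure only; it is not the published ARGO specification, and the inputs are hypothetical arrays.

      import numpy as np
      from sklearn.linear_model import Lasso

      def fit_argo_style(flu, queries, n_lags=52, penalty=0.01):
          """flu: (T,) activity series; queries: (T, q) search-term
          frequencies. Regress week t on lags t-n_lags..t-1 plus queries."""
          rows, targets = [], []
          for t in range(n_lags, len(flu)):
              rows.append(np.concatenate([flu[t - n_lags:t], queries[t]]))
              targets.append(flu[t])
          model = Lasso(alpha=penalty, max_iter=10000)
          model.fit(np.asarray(rows), np.asarray(targets))
          return model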

  6. Raman spectroscopy for highly accurate estimation of the age of refrigerated porcine muscle

    NASA Astrophysics Data System (ADS)

    Timinis, Constantinos; Pitris, Costas

    2016-03-01

    The high water content of meat, combined with all the nutrients it contains, makes it vulnerable to spoilage at all stages of production and storage, even when refrigerated at 5 °C. A non-destructive, in situ tool for meat sample testing that could provide an accurate indication of storage time would be very useful for the control of meat quality as well as for consumer safety. The proposed solution is based on Raman spectroscopy, which is non-invasive and can be applied in situ. For the purposes of this project, 42 meat samples from 14 animals were obtained, and three Raman spectra per sample were collected every two days for two weeks. The spectra were subsequently processed, and the sample age was calculated using a set of linear differential equations. In addition, the samples were classified into categories corresponding to age in 2-day steps (i.e., 0, 2, 4, 6, 8, 10, 12, or 14 days old) using linear discriminant analysis and cross-validation. Contrary to other studies, where samples were simply grouped into two categories (higher or lower quality, suitable or unsuitable for human consumption, etc.), in this study the age was predicted with a mean error of ~1 day (20%) or classified, in 2-day steps, with 100% accuracy. Although Raman spectroscopy has been used in the past for the analysis of meat samples, the proposed methodology predicts sample age far more accurately than any previous report in the literature.

  7. Disaster warning system study summary. [cost estimates using NOAA satellites

    NASA Technical Reports Server (NTRS)

    Leroy, B. F.; Maloy, J. E.; Braley, R. C.; Provencher, C. E.; Schumaker, H. A.; Valgora, M. E.

    1977-01-01

    A conceptual satellite system to replace or complement NOAA's data collection, internal communications, and public information dissemination systems for the mid-1980's was defined. Program cost and cost sensitivity to variations in communications functions are analyzed.

  8. Multiple candidates and multiple constraints based accurate depth estimation for multi-view stereo

    NASA Astrophysics Data System (ADS)

    Zhang, Chao; Zhou, Fugen; Xue, Bindang

    2017-02-01

    In this paper, we propose a depth estimation method for multi-view image sequences. To enhance the accuracy of dense matching and reduce the incorrect matches produced by inaccurate feature descriptions, we select multiple matching points to build candidate matching sets. We then compute an optimal depth from a candidate matching set that satisfies multiple constraints (an epipolar constraint, a similarity constraint, and a depth consistency constraint). To further increase the accuracy of depth estimation, the depth consistency constraint of neighboring pixels is used to filter out inaccurate matches. On this basis, and to obtain a more complete depth map, depth diffusion is performed using the depth consistency constraint of neighboring pixels. Through experiments on benchmark datasets for multi-view stereo, we demonstrate the superiority of the proposed method over a state-of-the-art method in terms of accuracy.

  9. Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

    NASA Technical Reports Server (NTRS)

    Huang, Zhaofeng; Porter, Albert A.

    1990-01-01

    The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minimums over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.

  10. Accurate dynamic power estimation for CMOS combinational logic circuits with real gate delay model.

    PubMed

    Fadl, Omnia S; Abu-Elyazeed, Mohamed F; Abdelhalim, Mohamed B; Amer, Hassanein H; Madian, Ahmed H

    2016-01-01

    Dynamic power estimation is essential in designing VLSI circuits; many parameters are involved, but the only circuit parameter directly tied to circuit operation is the nodes' toggle rate. This paper discusses a deterministic and fast method to estimate the dynamic power consumption of CMOS combinational logic circuits from gate-level descriptions, based on the Logic Pictures concept, to obtain the circuit nodes' toggle rates. The delay model for the logic gates is the real-delay model. To validate the results, the method is applied to several circuits and compared against exhaustive as well as Monte Carlo simulations. The proposed technique was shown to save up to 96% of processing time compared to exhaustive simulation.
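
    The physical relation behind toggle-rate-based estimation is the standard CMOS switching-power formula, sketched below with hypothetical node values. Conventions differ by a factor of 1/2 depending on how the activity factor is counted; the version here charges a full C·Vdd² per switching event.

      # Dynamic power from per-node toggle rates:
      # P = sum_i(alpha_i * C_i * Vdd^2 * f), with alpha_i the switching
      # activity per clock cycle and C_i the node load capacitance.

      def dynamic_power(toggle_rates, node_caps_f, vdd_volts, freq_hz):
          return sum(a * c * vdd_volts ** 2 * freq_hz
                     for a, c in zip(toggle_rates, node_caps_f))

      # Three nodes of a small combinational block (hypothetical values):
      p = dynamic_power([0.25, 0.10, 0.40], [12e-15, 8e-15, 20e-15],
                        vdd_volts=1.0, freq_hz=500e6)
      print(f"{p * 1e6:.2f} uW")  # 5.90 uW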

  11. Accurate group velocity estimation for unmanned aerial vehicle-based acoustic atmospheric tomography.

    PubMed

    Rogers, Kevin J; Finn, Anthony

    2017-02-01

    Acoustic atmospheric tomography calculates temperature and wind velocity fields in a slice or volume of atmosphere based on travel time estimates between strategically located sources and receivers. The technique discussed in this paper uses the natural acoustic signature of an unmanned aerial vehicle as it overflies an array of microphones on the ground. The sound emitted by the aircraft is recorded on-board and by the ground microphones. The group velocities of the intersecting sound rays are then derived by comparing these measurements. Tomographic inversion is used to estimate the temperature and wind fields from the group velocity measurements. This paper describes a technique for deriving travel time (and hence group velocity) with an accuracy of 0.1% using these assets. This is shown to be sufficient to obtain highly plausible tomographic inversion results that correlate well with independent SODAR measurements.

  12. Techniques for accurate estimation of net discharge in a tidal channel

    USGS Publications Warehouse

    Simpson, Michael R.; Bland, Roger

    1999-01-01

    An ultrasonic velocity meter discharge-measurement site in a tidally affected region of the Sacramento-San Joaquin rivers was used to study the accuracy of the index velocity calibration procedure. Calibration data consisting of ultrasonic velocity meter index velocity and concurrent acoustic Doppler discharge measurement data were collected during three time periods. The relative magnitude of equipment errors, acoustic Doppler discharge measurement errors, and calibration errors were evaluated. Calibration error was the most significant source of error in estimating net discharge. Using a comprehensive calibration method, net discharge estimates developed from the three sets of calibration data differed by less than an average of 4 cubic meters per second. Typical maximum flow rates during the data-collection period averaged 750 cubic meters per second.

  13. Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

    NASA Technical Reports Server (NTRS)

    Huang, Zhaofeng; Porter, Albert A.

    1991-01-01

    The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minimums over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.

  14. Development of hybrid lifecycle cost estimating tool (HLCET) for manufacturing influenced design tradeoff

    NASA Astrophysics Data System (ADS)

    Sirirojvisuth, Apinut

    concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher fidelity tradeoffs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM result for the composite part production cost. The reason for the replacement is that TCM estimates production costs from part weights as if parts resulted from subtractive manufacturing of metallic origin, such as casting, forging, and machining processes. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called the Maynard Operation Sequence Technique (MOST) to be used, similarly to the Activity-Based Costing (ABC) approach, to estimate the manufacturing time of a part by breaking down the operations that occur during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to the traditional method of estimating time by analogy or using response surface equations from historical process data. The MOST concept provides a tailored study of an individual process typically required for a new, innovative design. Nevertheless, the MOST idea has some challenges, one of which is its requirement to build a new process from the ground up. The process development requires a Subject Matter Expert (SME) in the manufacturing method of the particular design. The SME must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside the design discipline and prior training in MOST. To relieve this constraint, this study includes an entirely new sub-system architecture

  15. What Would It Cost to Coach Every New Principal? An Estimate Using Statewide Personnel Data

    ERIC Educational Resources Information Center

    Lochmiller, Chad R.

    2014-01-01

    In this paper, I use Levin and McEwan's (2001) cost feasibility approach and personnel data obtained from the Superintendent of Public Instruction to estimate the cost of providing coaching support to every newly hired principal in Washington State. Based on this descriptive analysis, I estimate that the cost to provide leadership coaching to…

  16. A Robust Design Approach to Cost Estimation: Solar Energy for Marine Corps Expeditionary Operations

    DTIC Science & Technology

    2014-07-14


  17. Energetic costs of mange in wolves estimated from infrared thermography.

    PubMed

    Cross, P C; Almberg, E S; Haase, C G; Hudson, P J; Maloney, S K; Metz, M C; Munn, A J; Nugent, P; Putzeys, O; Stahler, D R; Stewart, A C; Smith, D W

    2016-08-01

    Parasites, by definition, extract energy from their hosts and thus affect trophic and food web dynamics even when the parasite may have limited effects on host population size. We studied the energetic costs of mange (Sarcoptes scabiei) in wolves (Canis lupus) using thermal cameras to estimate heat losses associated with compromised insulation during the winter. We combined the field data of known, naturally infected wolves with a data set on captive wolves with shaved patches of fur as a positive control to simulate mange-induced hair loss. We predict that during the winter in Montana, more severe mange infection increases heat loss by around 5.2-12 MJ per night (1,240-2,850 kcal, or a 65-78% increase) for small and large wolves, respectively, accounting for wind effects. To maintain body temperature would require a significant proportion of a healthy wolf's total daily energy demands (18-22 MJ/day). We also predict how these thermal costs may increase in colder climates by comparing our predictions in Bozeman, Montana to those from a place with lower ambient temperatures (Fairbanks, Alaska). Contrary to our expectations, the 14°C differential between these regions was not as important as the potential differences in wind speed. These large increases in energetic demands can be mitigated by either increasing consumption rates or decreasing other energy demands. Data from GPS-collared wolves indicated that healthy wolves move, on average, 17 km per day, which was reduced by 1.5, 1.8, and 6.5 km for light, medium, and severe hair loss. In addition, the wolf with the most hair loss was less active at night and more active during the day, which is the converse of the movement patterns of healthy wolves. At the individual level, mange infections create significant energy demands and altered behavioral patterns; this may have cascading effects on prey consumption rates, food web dynamics, predator-prey interactions, and scavenger communities.

  18. A Simple and Accurate Equation for Peak Capacity Estimation in Two Dimensional Liquid Chromatography

    PubMed Central

    Li, Xiaoping; Stoll, Dwight R.; Carr, Peter W.

    2009-01-01

    Two dimensional liquid chromatography (2DLC) is a very powerful way to greatly increase the resolving power and overall peak capacity of liquid chromatography. The traditional “product rule” for peak capacity usually overestimates the true resolving power due to neglect of the often quite severe under-sampling effect and thus provides poor guidance for optimizing the separation and biases comparisons to optimized one dimensional gradient liquid chromatography. Here we derive a simple yet accurate equation for the effective two dimensional peak capacity that incorporates a correction for under-sampling of the first dimension. The results show that not only is the speed of the second dimension separation important for reducing the overall analysis time, but it plays a vital role in determining the overall peak capacity when the first dimension is under-sampled. A surprising subsidiary finding is that for relatively short 2DLC separations (much less than a couple of hours), the first dimension peak capacity is far less important than is commonly believed and need not be highly optimized, for example through use of long columns or very small particles. PMID:19053226
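
    A sketch of how such an under-sampling-corrected effective peak capacity is typically computed. The functional form and the constant 3.35 follow the Davis-Stoll-Carr under-sampling correction and are assumptions here (with the first-dimension peak width taken as the 4-sigma width); the paper should be consulted for its exact expression.

      import math

      def effective_2d_peak_capacity(n1, n2, t_s, w1):
          """n1, n2: 1st/2nd dimension peak capacities; t_s: sampling (cycle) time;
          w1: first-dimension 4-sigma peak width, in the same units as t_s."""
          beta = math.sqrt(1.0 + 3.35 * (t_s / w1) ** 2)  # first-dimension broadening factor
          return n1 * n2 / beta

      # Severe under-sampling (t_s comparable to w1) sharply reduces the capacity:
      print(effective_2d_peak_capacity(n1=50, n2=20, t_s=20.0, w1=20.0))  # ~480, not 1000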

  19. Accurate Estimation of Expression Levels of Homologous Genes in RNA-seq Experiments

    NASA Astrophysics Data System (ADS)

    Paşaniuc, Bogdan; Zaitlen, Noah; Halperin, Eran

    Next generation high throughput sequencing (NGS) is poised to replace array based technologies as the experiment of choice for measuring RNA expression levels. Several groups have demonstrated the power of this new approach (RNA-seq), making significant and novel contributions and simultaneously proposing methodologies for the analysis of RNA-seq data. In a typical experiment, millions of short sequences (reads) are sampled from RNA extracts and mapped back to a reference genome. The number of reads mapping to each gene is used as proxy for its corresponding RNA concentration. A significant challenge in analyzing RNA expression of homologous genes is the large fraction of the reads that map to multiple locations in the reference genome. Currently, these reads are either dropped from the analysis, or a naïve algorithm is used to estimate their underlying distribution. In this work, we present a rigorous alternative for handling the reads generated in an RNA-seq experiment within a probabilistic model for RNA-seq data; we develop maximum likelihood based methods for estimating the model parameters. In contrast to previous methods, our model takes into account the fact that the DNA of the sequenced individual is not a perfect copy of the reference sequence. We show with both simulated and real RNA-seq data that our new method improves the accuracy and power of RNA-seq experiments.
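
    The core allocation idea, assigning multi-mapped reads probabilistically rather than dropping them, can be sketched with a generic rescue-EM procedure over a toy read-to-gene compatibility matrix. This illustrates maximum-likelihood estimation of expression proportions only; it omits the paper's modeling of differences between the sequenced individual and the reference.

      import numpy as np

      # compat[i, j] = 1 if read i maps to gene j (toy data: two homologous genes,
      # two unique reads for gene 0, one for gene 1, two ambiguous reads).
      compat = np.array([[1, 0], [1, 0], [1, 1], [1, 1], [0, 1]], dtype=float)

      theta = np.full(compat.shape[1], 1.0 / compat.shape[1])  # expression proportions
      for _ in range(100):
          # E-step: fractionally assign each read in proportion to current theta.
          w = compat * theta
          w /= w.sum(axis=1, keepdims=True)
          # M-step: re-estimate proportions from the expected counts.
          theta = w.sum(axis=0) / w.sum()
      print(theta)  # ML proportions given the compatibility matrix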

  20. Accurate estimation of expression levels of homologous genes in RNA-seq experiments.

    PubMed

    Paşaniuc, Bogdan; Zaitlen, Noah; Halperin, Eran

    2011-03-01

    Abstract Next generation high-throughput sequencing (NGS) is poised to replace array-based technologies as the experiment of choice for measuring RNA expression levels. Several groups have demonstrated the power of this new approach (RNA-seq), making significant and novel contributions and simultaneously proposing methodologies for the analysis of RNA-seq data. In a typical experiment, millions of short sequences (reads) are sampled from RNA extracts and mapped back to a reference genome. The number of reads mapping to each gene is used as proxy for its corresponding RNA concentration. A significant challenge in analyzing RNA expression of homologous genes is the large fraction of the reads that map to multiple locations in the reference genome. Currently, these reads are either dropped from the analysis, or a naive algorithm is used to estimate their underlying distribution. In this work, we present a rigorous alternative for handling the reads generated in an RNA-seq experiment within a probabilistic model for RNA-seq data; we develop maximum likelihood-based methods for estimating the model parameters. In contrast to previous methods, our model takes into account the fact that the DNA of the sequenced individual is not a perfect copy of the reference sequence. We show with both simulated and real RNA-seq data that our new method improves the accuracy and power of RNA-seq experiments.

  1. Cost estimation for solid waste management in industrialising regions - Precedents, problems and prospects

    SciTech Connect

    Parthan, Shantha R.; Milke, Mark W.; Wilson, David C.; Cocks, John H.

    2012-03-15

    Highlights: (1) We review cost estimation approaches for solid waste management. (2) The unit cost method and benchmarking techniques are used in industrialising regions (IR). (3) Variety in scope, quality and stakeholders makes cost estimation challenging in IR. (4) Integrating waste flow and cost models using cost functions improves cost planning. - Abstract: The importance of cost planning for solid waste management (SWM) in industrialising regions (IR) is not well recognised. The approaches used to estimate costs of SWM can broadly be classified into three categories - the unit cost method, benchmarking techniques and developing cost models using sub-approaches such as cost and production function analysis. These methods have been developed into computer programmes with varying functionality and utility. IR mostly use the unit cost and benchmarking approaches to estimate their SWM costs. Cost models, on the other hand, are used at times in industrialised countries, but not in IR. Taken together, these approaches could be viewed as precedents that can be modified appropriately to suit waste management systems in IR. The main challenges (or problems) one might face while attempting to do so are a lack of cost data, and a lack of quality in what data do exist. There are practical benefits to planners in IR where solid waste problems are critical and budgets are limited.
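
    To make the unit cost method concrete, here is a minimal sketch in which stage-by-stage waste quantities are multiplied by per-tonne unit costs; all quantities and rates are hypothetical.

      # Unit cost method: quantity handled at each SWM stage times a per-tonne rate.
      quantities_tonnes = {"collection": 1000, "transfer": 1000, "landfill": 800}
      unit_cost_per_tonne = {"collection": 25.0, "transfer": 8.0, "landfill": 12.0}

      total = sum(quantities_tonnes[s] * unit_cost_per_tonne[s] for s in quantities_tonnes)
      print(f"Annual SWM cost estimate: ${total:,.0f}")  # -> $42,600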

  2. Voxel-based registration of simulated and real patient CBCT data for accurate dental implant pose estimation

    NASA Astrophysics Data System (ADS)

    Moreira, António H. J.; Queirós, Sandro; Morais, Pedro; Rodrigues, Nuno F.; Correia, André Ricardo; Fernandes, Valter; Pinho, A. C. M.; Fonseca, Jaime C.; Vilaça, João. L.

    2015-03-01

    The success of dental implant-supported prostheses is directly linked to the accuracy obtained during the implant's pose estimation (position and orientation). Although traditional impression techniques and recent digital acquisition methods are acceptably accurate, a simultaneously fast, accurate and operator-independent methodology is still lacking. Hereto, an image-based framework is proposed to estimate the patient-specific implant's pose using cone-beam computed tomography (CBCT) and prior knowledge of the implanted model. The pose estimation is accomplished in a three-step approach: (1) a region-of-interest is extracted from the CBCT data using two operator-defined points along the implant's main axis; (2) a simulated CBCT volume of the known implanted model is generated through Feldkamp-Davis-Kress reconstruction and coarsely aligned to the defined axis; and (3) a voxel-based rigid registration is performed to optimally align both patient and simulated CBCT data, extracting the implant's pose from the optimal transformation. Three experiments were performed to evaluate the framework: (1) an in silico study using 48 implants distributed through 12 tridimensional synthetic mandibular models; (2) an in vitro study using an artificial mandible with 2 dental implants acquired with an i-CAT system; and (3) two clinical case studies. The results showed positional errors of 67±34 μm and 108 μm, and angular misfits of 0.15±0.08° and 1.4°, for experiments 1 and 2, respectively. Moreover, in experiment 3, visual assessment of the clinical data results showed a coherent alignment of the reference implant. Overall, a novel image-based framework for implants' pose estimation from CBCT data was proposed, showing accurate results in agreement with dental prosthesis modelling requirements.
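
    The final step, reading the pose (position and orientation) off the optimal rigid transformation, can be illustrated as follows. The 4x4 matrix below is a made-up example standing in for the registration output, and the Euler-angle convention is an arbitrary choice; the registration itself is not shown.

      import numpy as np
      from scipy.spatial.transform import Rotation

      # Made-up optimal rigid transform as would be returned by the registration.
      T = np.eye(4)
      T[:3, :3] = Rotation.from_euler("xyz", [0.15, 1.4, 0.0], degrees=True).as_matrix()
      T[:3, 3] = [12.034, -8.120, 25.5]  # translation in mm

      position_mm = T[:3, 3]                                           # implant position
      angles_deg = Rotation.from_matrix(T[:3, :3]).as_euler("xyz", degrees=True)
      print("implant position (mm):", position_mm)
      print("orientation (deg):", angles_deg)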

  3. Estimating the Cost of Standardized Student Testing in the United States.

    ERIC Educational Resources Information Center

    Phelps, Richard P.

    2000-01-01

    Describes and contrasts different methods of estimating costs of standardized testing. Using a cost-accounting approach, compares gross and marginal costs and considers testing objects (test materials and services, personnel and student time, and administrative/building overhead). Social marginal costs of replacing existing tests with a national…

  4. IDC Reengineering Phase 2 & 3 Rough Order of Magnitude (ROM) Cost Estimate Summary (Leveraged NDC Case).

    SciTech Connect

    Harris, James M.; Prescott, Ryan; Dawson, Jericah M.; Huelskamp, Robert M.

    2014-11-01

    Sandia National Laboratories has prepared a ROM cost estimate for budgetary planning for the IDC Reengineering Phase 2 & 3 effort, based on leveraging a fully funded, Sandia executed NDC Modernization project. This report provides the ROM cost estimate and describes the methodology, assumptions, and cost model details used to create the ROM cost estimate. ROM Cost Estimate Disclaimer Contained herein is a Rough Order of Magnitude (ROM) cost estimate that has been provided to enable initial planning for this proposed project. This ROM cost estimate is submitted to facilitate informal discussions in relation to this project and is NOT intended to commit Sandia National Laboratories (Sandia) or its resources. Furthermore, as a Federally Funded Research and Development Center (FFRDC), Sandia must be compliant with the Anti-Deficiency Act and operate on a full-cost recovery basis. Therefore, while Sandia, in conjunction with the Sponsor, will use best judgment to execute work and to address the highest risks and most important issues in order to effectively manage within cost constraints, this ROM estimate and any subsequent approved cost estimates are on a 'full-cost recovery' basis. Thus, work can neither commence nor continue unless adequate funding has been accepted and certified by DOE.

  5. Los Alamos Waste Management Cost Estimation Model; Final report: Documentation of waste management process, development of Cost Estimation Model, and model reference manual

    SciTech Connect

    Matysiak, L.M.; Burns, M.L.

    1994-03-01

    This final report completes the Los Alamos Waste Management Cost Estimation Project, and includes the documentation of the waste management processes at Los Alamos National Laboratory (LANL) for hazardous, mixed, low-level radioactive solid and transuranic waste, development of the cost estimation model and a user reference manual. The ultimate goal of this effort was to develop an estimate of the life cycle costs for the aforementioned waste types. The Cost Estimation Model is a tool that can be used to calculate the costs of waste management at LANL for the aforementioned waste types, under several different scenarios. Each waste category at LANL is managed in a separate fashion, according to Department of Energy requirements and state and federal regulations. The cost of the waste management process for each waste category has not previously been well documented. In particular, the costs associated with the handling, treatment and storage of the waste have not been well understood. It is anticipated that greater knowledge of these costs will encourage waste generators at the Laboratory to apply waste minimization techniques to current operations. Expected benefits of waste minimization are a reduction in waste volume, decrease in liability and lower waste management costs.

  6. Statistical Analysis of Complexity Generators for Cost Estimation

    NASA Technical Reports Server (NTRS)

    Rowell, Ginger Holmes

    1999-01-01

    Predicting the cost of cutting edge new technologies involved with spacecraft hardware can be quite complicated. A new feature of the NASA Air Force Cost Model (NAFCOM), called the Complexity Generator, is being developed to model the complexity factors that drive the cost of space hardware. This parametric approach is also designed to account for the differences in cost, based on factors that are unique to each system and subsystem. The cost driver categories included in this model are weight, inheritance from previous missions, technical complexity, and management factors. This paper explains the Complexity Generator framework, the statistical methods used to select the best model within this framework, and the procedures used to find the region of predictability and the prediction intervals for the cost of a mission.
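
    As a sketch of the kind of statistical machinery described, here is a regression fit that reports a prediction interval for a new system; the cost drivers, data, and coefficients are synthetic, not NAFCOM's.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      weight = rng.uniform(100, 2000, 30)       # kg, hypothetical subsystem weights
      complexity = rng.uniform(0.5, 2.0, 30)    # unitless complexity factor
      cost = 0.05 * weight * complexity + rng.normal(0, 5, 30)  # synthetic cost, $M

      X = sm.add_constant(np.column_stack([weight, complexity]))
      fit = sm.OLS(cost, X).fit()

      # Point estimate plus a 95% prediction interval for a new mission.
      new_point = sm.add_constant(np.array([[800.0, 1.3]]), has_constant="add")
      pred = fit.get_prediction(new_point).summary_frame(alpha=0.05)
      print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])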

  7. Estimating the direct and indirect costs associated with Parkinson's disease.

    PubMed

    Rodríguez-Blázquez, Carmen; Forjaz, Maria João; Lizán, Luis; Paz, Silvia; Martínez-Martín, Pablo

    2015-01-01

    Parkinson's disease (PD) is a progressive, neurodegenerative disorder whose symptoms and manifestations greatly deteriorate the health, functional status and quality of life of patients, has severe consequences on their families and caregivers and supposes a challenge for the healthcare system and society. The aim of this paper is to comprehensively and descriptively review studies on the economic impact of the disease and interventions, analyzing major contributing factors to direct and indirect costs in PD. Cost-of-illness studies have shown that costs of PD are high, mainly due to drug, hospitalization and productivity loss, and tend to increase as the disease progresses. Studies on PD treatment have suggested that therapies for advanced PD (levodopa/carbidopa intestinal gel and apomorphine) and surgical procedures are cost-effective and cost saving, despite their high expenditures; however, further research such as on the economic impact of non-motor manifestations or on the cost-effectiveness of non-medical interventions is still needed.

  8. Laboratory demonstration of aircraft estimation using low-cost sensors

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.

    1978-01-01

    Four nonlinear state estimators were devised which provide techniques for obtaining the angular orientation (attitude) of the aircraft. An extensive FORTRAN computer program was developed to demonstrate and evaluate the estimators by using recorded flight test data. This program simulates the estimator operation, and it compares the state estimates with actual state measurements. The program was used to evaluate the state estimators with data recorded on the NASA Ames CV-990 and CESSNA 402B aircraft. A preliminary assessment was made of the memory, word length, and timing requirements for implementing the selected state estimator on a typical microcomputer.

  9. Lunar base scenario cost estimates: Lunar base systems study task 6.1

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The projected development and production costs of each of the Lunar Base's systems are described and unit costs are estimated for transporting the systems to the lunar surface and for setting up the system.

  10. [Research on maize multispectral image accurate segmentation and chlorophyll index estimation].

    PubMed

    Wu, Qian; Sun, Hong; Li, Min-zan; Song, Yuan-yuan; Zhang, Yan-e

    2015-01-01

    In order to rapidly acquire maize growing information in the field, a non-destructive method of maize chlorophyll content index measurement was conducted based on multi-spectral imaging and image processing technology. The experiment was conducted at Yangling in Shaanxi province of China, and the crop was Zheng-dan 958 planted in an experimental field of about 1 000 m × 600 m. Firstly, a 2-CCD multi-spectral image monitoring system was used to acquire the canopy images. The system was based on a dichroic prism, allowing precise separation of the visible (Blue (B), Green (G), Red (R): 400-700 nm) and near-infrared (NIR, 760-1 000 nm) bands. The multispectral images were output as RGB and NIR images via the system, which was fixed vertically to the ground with a vertical distance of 2 m and an angular field of 50°. The SPAD index of each sample was measured synchronously to indicate the chlorophyll content index. Secondly, after image smoothing using an adaptive smooth filtering algorithm, the NIR maize image was selected to segment the maize leaves from the background, because the gray histogram showed a big difference between plant and soil background. The NIR image segmentation algorithm followed preliminary and accurate segmentation steps: (1) The results of the Otsu image segmentation method and the variable threshold algorithm were compared, revealing that the latter was better for corn plant and weed segmentation. As a result, the variable threshold algorithm based on local statistics was selected for the preliminary image segmentation, and expansion and corrosion (dilation and erosion) were used to optimize the segmented image. (2) The region labeling algorithm was used to segment corn plants from soil and weed background with an accuracy of 95.59%. The multi-spectral image of the maize canopy was then accurately segmented in the R, G and B bands separately. Thirdly, image parameters were abstracted based on the segmented visible and NIR images. The average gray
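
    The global-versus-variable thresholding comparison in step (1) can be sketched as follows; the random array stands in for a real NIR image, and the block size is an arbitrary choice.

      import numpy as np
      from skimage.filters import threshold_otsu, threshold_local

      nir = np.random.rand(480, 640)  # placeholder for a real NIR canopy image in [0, 1]

      global_mask = nir > threshold_otsu(nir)                 # one threshold for the image
      local_mask = nir > threshold_local(nir, block_size=51)  # threshold varies spatially

      print(global_mask.mean(), local_mask.mean())  # fraction of pixels classed as plant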

  11. On the Utility of National Datasets and Resource Cost Models for Estimating Faculty Instructional Costs in Higher Education

    ERIC Educational Resources Information Center

    Morphew, Christopher; Baker, Bruce

    2007-01-01

    In this article, the authors present the results of a research study in which they used two national datasets to construct and examine a model that estimates relative faculty instructional costs for specific undergraduate degree programs and also identifies differences in these costs by region and institutional type. They conducted this research…

  12. The challenges of accurately estimating time of long bone injury in children.

    PubMed

    Pickett, Tracy A

    2015-07-01

    The ability to determine the time an injury occurred can be of crucial significance in forensic medicine and holds special relevance to the investigation of child abuse. However, dating paediatric long bone injury, including fractures, is nuanced by complexities specific to the paediatric population. These challenges include the ability to identify bone injury in a growing or only partially-calcified skeleton, different injury patterns seen within the spectrum of the paediatric population, the effects of bone growth on healing as a separate entity from injury, differential healing rates seen at different ages, and the relative scarcity of information regarding healing rates in children, especially the very young. The challenges posed by these factors are compounded by a lack of consistency in defining and categorizing healing parameters. This paper sets out the primary limitations of existing knowledge regarding estimating timing of paediatric bone injury. Consideration and understanding of the multitude of factors affecting bone injury and healing in children will assist those providing opinion in the medical-legal forum.

  13. Error Estimation And Accurate Mapping Based ALE Formulation For 3D Simulation Of Friction Stir Welding

    NASA Astrophysics Data System (ADS)

    Guerdoux, Simon; Fourment, Lionel

    2007-05-01

    An Arbitrary Lagrangian Eulerian (ALE) formulation is developed to simulate the different stages of the Friction Stir Welding (FSW) process with the FORGE3® F.E. software. A splitting method is utilized: a) the material velocity/pressure and temperature fields are calculated, b) the mesh velocity is derived from the domain boundary evolution and an adaptive refinement criterion provided by error estimation, c) P1 and P0 variables are remapped. Different velocity computation and remap techniques have been investigated, providing significant improvement with respect to more standard approaches. The proposed ALE formulation is applied to FSW simulation. Steady state welding, but also transient phases are simulated, showing good robustness and accuracy of the developed formulation. Friction parameters are identified for an Eulerian steady state simulation by comparison with experimental results. Void formation can be simulated. Simulations of the transient plunge and welding phases help to better understand the deposition process that occurs at the trailing edge of the probe. Flexibility and robustness of the model finally allows investigating the influence of new tooling designs on the deposition process.

  14. An Analysis of the Cost Estimating Process in Air Force Research and Development Laboratories.

    DTIC Science & Technology

    1981-09-01


  15. Defense Contractor’s Cost Estimating Methods for State-of-the-Art Extensions.

    DTIC Science & Technology

    1987-12-01

    framework in which to place empirical measures of technological change, hedonic price indices, and cost estimating relationships. In this study they...importance and magnitude is software cost estimating, which experts like Elmer Branyan from General Electric predict will account for eighty-five percent...developed by RCA (PRICE-S model) and Hughes Aircraft (developed by Dr. Jensen). Hardware cost estimation for initial development relies on three

  16. COST ESTIMATION MODELS FOR DRINKING WATER TREATMENT UNIT PROCESSES

    EPA Science Inventory

    Cost models for unit processes typically utilized in a conventional water treatment plant and in package treatment plant technology are compiled in this paper. The cost curves are represented as a function of specified design parameters and are categorized into four major catego...

  17. Accurate path integral molecular dynamics simulation of ab-initio water at near-zero added cost

    NASA Astrophysics Data System (ADS)

    Elton, Daniel; Fritz, Michelle; Soler, José; Fernandez-Serra, Marivi

    It is now established that nuclear quantum motion plays an important role in determining water's structure and dynamics. These effects are important to consider when evaluating DFT functionals and attempting to develop better ones for water. The standard way of treating nuclear quantum effects, path integral molecular dynamics (PIMD), multiplies the number of energy/force calculations by the number of beads, which is typically 32. Here we introduce a method whereby PIMD can be incorporated into a DFT molecular dynamics simulation at virtually zero cost. The method is based on the cluster (many body) expansion of the energy. We first subtract the DFT monomer energies, using a custom DFT-based monomer potential energy surface. The evolution of the PIMD beads is then performed using only the more accurate Partridge-Schwenke monomer energy surface. The DFT calculations are done using the centroid positions. Various bead thermostats can be employed to speed up the sampling of the quantum ensemble. The method bears some resemblance to multiple timestep algorithms and other schemes used to speed up PIMD with classical force fields. We show that our method correctly captures some of the key effects of nuclear quantum motion on both the structure and dynamics of water. We acknowledge support from DOE Award No. DE-FG02-09ER16052 (D.E.) and DOE Early Career Award No. DE-SC0003871 (M.V.F.S.).

  18. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  19. Improving Space Project Cost Estimating with Engineering Management Variables

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph W.; Roth, Axel (Technical Monitor)

    2001-01-01

    Current space project cost models attempt to predict space flight project cost via regression equations, which relate the cost of projects to technical performance metrics (e.g. weight, thrust, power, pointing accuracy, etc.). This paper examines the introduction of engineering management parameters to the set of explanatory variables. A number of specific engineering management variables are considered and exploratory regression analysis is performed to determine if there is statistical evidence for cost effects apart from technical aspects of the projects. It is concluded that there are other non-technical effects at work and that further research is warranted to determine if it can be shown that these cost effects are definitely related to engineering management.

  20. A new method based on the subpixel Gaussian model for accurate estimation of asteroid coordinates

    NASA Astrophysics Data System (ADS)

    Savanevych, V. E.; Briukhovetskyi, O. B.; Sokovikova, N. S.; Bezkrovny, M. M.; Vavilova, I. B.; Ivashchenko, Yu. M.; Elenin, L. V.; Khlamov, S. V.; Movsesian, Ia. S.; Dashkova, A. M.; Pogorelov, A. V.

    2015-08-01

    We describe a new iteration method to estimate asteroid coordinates, based on a subpixel Gaussian model of the discrete object image. The method operates on continuous parameters (asteroid coordinates) in a discrete observational space (the set of pixel potentials) of the CCD frame. In this model, the kind of coordinate distribution of the photons hitting a pixel of the CCD frame is known a priori, while the associated parameters are determined from a real digital object image. The method, which is flexible in adapting to any form of object image, has a high measurement accuracy along with a low computational complexity, due to the maximum-likelihood procedure implemented to obtain the best fit, instead of a least-squares method with the Levenberg-Marquardt algorithm for minimization of the quadratic form. Since 2010, the method has been tested as the basis of our Collection Light Technology (COLITEC) software, which has been installed at several observatories across the world with the aim of the automatic discovery of asteroids and comets in sets of CCD frames. As a result, four comets (C/2010 X1 (Elenin), P/2011 NO1 (Elenin), C/2012 S1 (ISON) and P/2013 V3 (Nevski)) as well as more than 1500 small Solar system bodies (including five near-Earth objects (NEOs), 21 Trojan asteroids of Jupiter and one Centaur object) have been discovered. We discuss these results, which allowed us to compare the accuracy parameters of the new method and confirm its efficiency. In 2014, the COLITEC software was recommended to all members of the Gaia-FUN-SSO network for analysing observations as a tool to detect faint moving objects in frames.
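
    To illustrate the general idea of recovering subpixel object coordinates by fitting a parametric image model to pixel data, here is a simple 2D Gaussian fit on a synthetic pixel stamp. Note this uses ordinary least squares via curve_fit rather than the paper's maximum-likelihood subpixel model; it only conveys the concept.

      import numpy as np
      from scipy.optimize import curve_fit

      def gauss2d(coords, amp, x0, y0, sigma, bg):
          x, y = coords
          return (amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2)) + bg).ravel()

      # Synthetic 15x15 stamp with a source at subpixel position (7.3, 6.8).
      y, x = np.mgrid[0:15, 0:15].astype(float)
      true = (1000.0, 7.3, 6.8, 1.5, 20.0)
      stamp = gauss2d((x, y), *true) + np.random.default_rng(1).normal(0, 5, x.size)

      p0 = (stamp.max(), 7.0, 7.0, 2.0, np.median(stamp))
      popt, _ = curve_fit(gauss2d, (x, y), stamp, p0=p0)
      print("estimated centroid:", popt[1], popt[2])  # close to (7.3, 6.8)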

  1. Stochastic Frontier Estimation of a CES Cost Function: The Case of Higher Education in Britain.

    ERIC Educational Resources Information Center

    Izadi, Hooshang; Johnes, Geraint; Oskrochi, Reza; Crouchley, Robert

    2002-01-01

    Examines the use of stochastic frontier estimation of constant elasticity of substitution (CES) cost function to measure differences in efficiency among British universities. (Contains 28 references.) (PKP)

  2. Quaternion-Based Unscented Kalman Filter for Accurate Indoor Heading Estimation Using Wearable Multi-Sensor System

    PubMed Central

    Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng

    2015-01-01

    Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, as one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, the accuracy of the gyroscope degrades over time. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, including one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the multi-sensor system fixed on the waist of a pedestrian and on the quadrotor UAV, respectively, compared to the reference path. PMID:25961384
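
    A minimal sketch of why fusing gyro and magnetometer headings helps, using a first-order complementary filter as a simplified stand-in for the paper's quaternion-based UKF; all signals and noise levels below are synthetic.

      import numpy as np

      def fuse_heading(gyro_rate_dps, mag_heading_deg, dt=0.01, alpha=0.98):
          """Blend a drifting gyro-integrated heading with a noisy magnetic heading."""
          heading = mag_heading_deg[0]
          out = []
          for w, m in zip(gyro_rate_dps, mag_heading_deg):
              heading = alpha * (heading + w * dt) + (1 - alpha) * m  # complementary blend
              out.append(heading)
          return np.array(out)

      t = np.arange(0, 10, 0.01)
      true = 45 + 10 * np.sin(0.5 * t)
      rng = np.random.default_rng(2)
      gyro = np.gradient(true, 0.01) + 0.5        # rate with a constant bias (drift source)
      mag = true + rng.normal(0, 8, t.size)       # absolute but noisy magnetic heading
      print(np.abs(fuse_heading(gyro, mag) - true).mean())  # smaller than either alone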

  3. Quaternion-based unscented Kalman filter for accurate indoor heading estimation using wearable multi-sensor system.

    PubMed

    Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng

    2015-05-07

    Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, as one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, the accuracy of the gyroscope degrades over time. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, including one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the multi-sensor system fixed on the waist of a pedestrian and on the quadrotor UAV, respectively, compared to the reference path.

  4. But what will it Cost? The history of NASA cost estimating

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph W.

    1994-01-01

    Within two years of being chartered in 1958 as an independent agency to conduct civilian pursuits in aeronautics and space, NASA absorbed either wholly or partially the people, facilities, and equipment of several existing organizations. These included the laboratories of the National Advisory Committee of Aeronautics (NACA) at Langley Research Center in Virginia, Ames Research Center in California, and Lewis Research Center in Ohio; the Army Ballistic Missile Agency (ABMA) at Redstone Arsenal Alabama, for which the team of Wernher von Braun worked; and the Department of Defense Advanced Research Projects Agency (ARPA) and their ongoing work on big boosters. These were especially valuable resources to jump start the new agency in light of the shocking success of the Soviet space probe Sputnik in the autumn of the previous year and the corresponding pressure from an impatient American public to produce some response. Along with these inheritances, there came some existing systems engineering and management practices, including project cost estimating methodologies. This paper will briefly trace the origins of those methods and how they evolved within the agency over the past three decades.

  5. The health and visibility cost of air pollution: a comparison of estimation methods.

    PubMed

    Delucchi, Mark A; Murphy, James J; McCubbin, Donald R

    2002-02-01

    Air pollution from motor vehicles, electricity-generating plants, industry, and other sources can harm human health, injure crops and forests, damage building materials, and impair visibility. Economists sometimes analyze the social cost of these impacts in order to illuminate tradeoffs, compare alternatives, and promote efficient use of scarce resources. In this paper, we compare estimates of the health and visibility costs of air pollution derived from a meta-hedonic price analysis with an estimate of health costs derived from a damage-function analysis and an estimate of the visibility cost derived from contingent valuation. We find that the meta-hedonic price analysis produces an estimate of the health cost that lies at the low end of the range of damage-function estimates. This is consistent with the hypotheses that, on the one hand, hedonic price analysis does not capture all of the health costs of air pollution (because individuals may not be fully informed about all of the health effects), and that, on the other hand, the value of mortality used in the high-end damage-function estimates is too high. The analysis of the visibility cost of air pollution derived from a meta-hedonic price analysis produces an estimate that is essentially identical to an independent estimate based on contingent valuation. This close agreement lends some credence to the estimates. We then apply the meta-hedonic price model to estimate the visibility cost per kilogram of motor vehicle emissions.

  6. Aggregate versus individual-level sexual behavior assessment: how much detail is needed to accurately estimate HIV/STI risk?

    PubMed

    Pinkerton, Steven D; Galletly, Carol L; McAuliffe, Timothy L; DiFranceisco, Wayne; Raymond, H Fisher; Chesson, Harrell W

    2010-02-01

    The sexual behaviors of HIV/sexually transmitted infection (STI) prevention intervention participants can be assessed on a partner-by-partner basis, in aggregate (i.e., total numbers of sex acts, collapsed across partners), or using a combination of these two methods (e.g., assessing five partners in detail and any remaining partners in aggregate). There is a natural trade-off between the level of sexual behavior detail and the precision of HIV/STI acquisition risk estimates. The results of this study indicate that relatively simple aggregate data collection techniques suffice to adequately estimate HIV risk. For highly infectious STIs, in contrast, accurate STI risk assessment requires more intensive partner-by-partner methods.
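
    The trade-off can be made concrete with the standard Bernoulli-process risk model often used in this literature (the paper's exact parameterization may differ); the per-act transmission probability, act counts, and prevalences below are illustrative only.

      def per_partner_risk(partners, alpha=0.001):
          """partners: list of (acts, partner_prevalence); alpha: per-act transmission prob."""
          p_escape = 1.0
          for acts, prev in partners:
              p_inf_given_pos = 1 - (1 - alpha) ** acts
              p_escape *= 1 - prev * p_inf_given_pos
          return 1 - p_escape

      def aggregate_risk(total_acts, n_partners, mean_prev, alpha=0.001):
          """Collapse acts across partners, assuming they are spread evenly."""
          acts_each = total_acts / n_partners
          p_inf_given_pos = 1 - (1 - alpha) ** acts_each
          return 1 - (1 - mean_prev * p_inf_given_pos) ** n_partners

      detailed = per_partner_risk([(100, 0.10), (2, 0.01), (2, 0.01)])
      aggregate = aggregate_risk(total_acts=104, n_partners=3, mean_prev=0.04)
      print(detailed, aggregate)  # the aggregate version misallocates acts across partners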

  7. Estimates and implications of the costs of compliance with biosafety regulations in developing countries.

    PubMed

    Falck-Zepeda, Jose; Yorobe, Jose; Husin, Bahagiawati Amir; Manalo, Abraham; Lokollo, Erna; Ramon, Godfrey; Zambrano, Patricia; Sutrisno

    2012-01-01

    Estimating the cost of compliance with biosafety regulations is important as it helps developers focus their investments in product development. We provide estimates of the cost of compliance for a set of technologies in Indonesia, the Philippines and other countries. These costs vary from US$100,000 to US$1.7 million. These are estimates of regulatory costs and do not include product development or deployment costs. Cost estimates need to be compared with the potential gains when the technology is introduced in these countries and with the gains in knowledge that accumulate during the biosafety assessment process. Although the cost of compliance is important, time delays and uncertainty are even more important and may have an adverse impact on innovations reaching farmers.

  8. Estimation of marginal costs at existing waste treatment facilities.

    PubMed

    Martinez-Sanchez, Veronica; Hulgaard, Tore; Hindsgaul, Claus; Riber, Christian; Kamuk, Bettina; Astrup, Thomas F

    2016-04-01

    This investigation aims at providing an improved basis for assessing economic consequences of alternative Solid Waste Management (SWM) strategies for existing waste facilities. A bottom-up methodology was developed to determine marginal costs in existing facilities due to changes in the SWM system, based on the determination of average costs in such waste facilities as a function of key facility and waste compositional parameters. The applicability of the method was demonstrated through a case study including two existing Waste-to-Energy (WtE) facilities, one with co-generation of heat and power (CHP) and another with only power generation (Power), affected by diversion strategies for five waste fractions (fibres, plastic, metals, organics and glass), named "target fractions". The study assumed three possible responses to waste diversion in the WtE facilities: (i) biomass was added to maintain a constant thermal load, (ii) Refuse-Derived Fuel (RDF) was added to maintain a constant thermal load, or (iii) no reaction occurred, resulting in a reduced waste throughput without full utilization of the facility capacity. Results demonstrated that marginal costs of diversion from WtE were up to eleven times larger than average costs and dependent on the response in the WtE plant. Marginal costs of diversion were between 39 and 287 € Mg(-1) target fraction when biomass was added in a CHP (from 34 to 303 € Mg(-1) target fraction in the Power-only case), between -2 and 300 € Mg(-1) target fraction when RDF was added in a CHP (from -2 to 294 € Mg(-1) target fraction in the Power-only case) and between 40 and 303 € Mg(-1) target fraction when no reaction happened in a CHP (from 35 to 296 € Mg(-1) target fraction in the Power-only case). Although average costs at WtE facilities were highly influenced by energy selling prices, marginal costs were not (provided a response was initiated at the WtE to keep the utilized thermal capacity constant). Failing to systematically
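
    A toy calculation of the marginal-versus-average distinction for the "add biomass" response; every figure below is invented, whereas the paper derives facility-specific values bottom-up.

      # Hypothetical WtE facility that buys biomass to hold thermal load constant
      # when a waste fraction is diverted. All numbers invented for illustration.
      baseline_waste_mg = 200_000        # Mg/yr through the facility
      baseline_total_cost = 16_000_000   # EUR/yr (capital + O&M - energy revenues)

      diverted_mg = 10_000               # Mg/yr of a target fraction removed
      biomass_cost = 900_000             # EUR/yr extra fuel to keep the thermal load
      lost_gate_fees = 700_000           # EUR/yr revenue no longer collected

      average_cost = baseline_total_cost / baseline_waste_mg
      marginal_cost = (biomass_cost + lost_gate_fees) / diverted_mg
      print(f"average: {average_cost:.0f} EUR/Mg, marginal: {marginal_cost:.0f} EUR/Mg")
      # marginal (160 EUR/Mg) can be several times the average (80 EUR/Mg)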

  9. Estimating Software Development Costs and Schedules for Space Systems

    DTIC Science & Technology

    1994-05-01

    SSCAG) participants. The SSCAG is an industry and government group formed to enhance space system cost analysis. Another data source was the NASA...government and industry working group formed to advance space systems cost analysis. The SSCAG database contains software development information of..."The Management of Large Software Projects in the Space Industry Meeting," Logica, CNES, Toulouse, France, 1991. C-1 [17] Anderson, Christine, and

  10. Probabilistic estimation of numbers and costs of future landslides in the San Francisco Bay region

    USGS Publications Warehouse

    Crovelli, R.A.; Coe, J.A.

    2009-01-01

    We used historical records of damaging landslides triggered by rainstorms and a newly developed Probabilistic Landslide Assessment Cost Estimation System (PLACES) to estimate the numbers and direct costs of future landslides in the 10-county San Francisco Bay region. Historical records of damaging landslides in the region are incomplete. Therefore, our estimates of numbers and costs of future landslides are minimal estimates. The estimated mean annual number of future damaging landslides for the entire 10-county region is about 65. Santa Cruz County has the highest estimated mean annual number of damaging future landslides (about 18), whereas Napa, San Francisco, and Solano Counties have the lowest estimated mean numbers of damaging landslides (about 1 each). The estimated mean annual cost of future landslides in the entire region is about US $14.80 million (year 2000 $). The estimated mean annual cost is highest for San Mateo County ($3.24 million) and lowest for Solano County ($0.18 million). The annual per capita cost for the entire region will be about $2.10. Santa Cruz County will have the highest annual per capita cost at $8.45, whereas San Francisco County will have the lowest per capita cost at $0.31. Normalising costs by dividing by the percentage of land area with slopes equal to or greater than 17% indicates that San Francisco County will have the highest cost per square km ($7,101), whereas Santa Clara County will have the lowest cost per square km ($229). These results indicate that the San Francisco Bay region has one of the highest levels of landslide risk in the United States. Compared with landslide cost estimates from the rest of the world, the risk level in the Bay region seems high, but not exceptionally high.
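
    In the spirit of a probabilistic assessment of annual landslide numbers and costs, here is a toy Monte Carlo with a Poisson count model and lognormal per-event costs; the distributions and parameters are illustrative, not those of PLACES.

      import numpy as np

      rng = np.random.default_rng(3)
      n_years = 10_000
      counts = rng.poisson(lam=65, size=n_years)  # damaging landslides per simulated year
      # Per-event direct cost drawn from a lognormal (parameters invented).
      annual_cost = [rng.lognormal(mean=11.5, sigma=1.0, size=c).sum() for c in counts]
      print(f"mean annual cost ~ ${np.mean(annual_cost) / 1e6:.1f} million")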

  11. Linear-In-The-Parameters Oblique Least Squares (LOLS) Provides More Accurate Estimates of Density-Dependent Survival

    PubMed Central

    Vieira, Vasco M. N. C. S.; Engelen, Aschwin H.; Huanel, Oscar R.; Guillemin, Marie-Laure

    2016-01-01

    Survival is a fundamental demographic component and the importance of its accurate estimation goes beyond the traditional estimation of life expectancy. The evolutionary stability of isomorphic biphasic life-cycles and the occurrence of their different ploidy phases at uneven abundances are hypothesized to be driven by differences in survival rates between haploids and diploids. We monitored Gracilaria chilensis, a commercially exploited red alga with an isomorphic biphasic life-cycle, and found density-dependent survival with competition and Allee effects. When estimating the linear-in-the-parameters survival function, all model I regression methods (i.e., vertical least squares) provided biased line-fits, rendering them inappropriate for studies of ecology, evolution or population management. Hence, we developed an iterative two-step non-linear model II regression (i.e., oblique least squares), which provided improved line-fits and estimates of the survival function parameters while remaining robust to the data characteristics that usually render regression methods numerically unstable. PMID:27936048
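
    To illustrate the model I versus model II distinction, the sketch below contrasts ordinary (vertical) least squares with reduced major axis regression, the simplest standard model II estimator, on data with error in both variables. This is not the paper's LOLS algorithm; it only demonstrates why vertical least squares biases the fit.

      import numpy as np

      def rma(x, y):
          """Reduced major axis (model II) slope and intercept."""
          slope = np.sign(np.corrcoef(x, y)[0, 1]) * np.std(y, ddof=1) / np.std(x, ddof=1)
          return slope, np.mean(y) - slope * np.mean(x)

      rng = np.random.default_rng(4)
      x_true = rng.uniform(0, 10, 200)
      y = 2.0 * x_true + 1.0 + rng.normal(0, 2, 200)
      x = x_true + rng.normal(0, 2, 200)   # error in x too -> OLS slope is attenuated
      print("OLS slope:", np.polyfit(x, y, 1)[0])  # biased toward zero
      print("RMA slope:", rma(x, y)[0])            # closer to the true slope of 2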

  12. Low-Cost MEMS sensors and vision system for motion and position estimation of a scooter.

    PubMed

    Guarnieri, Alberto; Pirotti, Francesco; Vettore, Antonio

    2013-01-24

    The possibility to identify with significant accuracy the position of a vehicle in a mapping reference frame for driving directions and best-route analysis is a topic which is attracting a lot of interest from the research and development sector. To reach the objective of accurate vehicle positioning and integrate response events, it is necessary to estimate position, orientation and velocity of the system with high measurement rates. In this work we test a system which uses low-cost sensors, based on Micro Electro-Mechanical Systems (MEMS) technology, coupled with information derived from a video camera placed on a two-wheel motor vehicle (scooter). In comparison to a four-wheel vehicle, the dynamics of a two-wheel vehicle feature a higher level of complexity, given that more degrees of freedom must be taken into account. For example, a motorcycle can twist sideways, thus generating a roll angle. A slight pitch angle has to be considered as well, since wheel suspensions have a higher degree of motion compared to four-wheel motor vehicles. In this paper we present a method for the accurate reconstruction of the trajectory of a "Vespa" scooter, which can be used as an alternative to the "classical" approach based on GPS/INS sensor integration. Position and orientation of the scooter are obtained by integrating MEMS-based orientation sensor data with digital images through a cascade of a Kalman filter and a Bayesian particle filter.

  13. Low-Cost MEMS Sensors and Vision System for Motion and Position Estimation of a Scooter

    PubMed Central

    Guarnieri, Alberto; Pirotti, Francesco; Vettore, Antonio

    2013-01-01

    The possibility to identify with significant accuracy the position of a vehicle in a mapping reference frame for driving directions and best-route analysis is a topic which is attracting a lot of interest from the research and development sector. To reach the objective of accurate vehicle positioning and integrate response events, it is necessary to estimate position, orientation and velocity of the system with high measurement rates. In this work we test a system which uses low-cost sensors, based on Micro Electro-Mechanical Systems (MEMS) technology, coupled with information derived from a video camera placed on a two-wheel motor vehicle (scooter). In comparison to a four-wheel vehicle, the dynamics of a two-wheel vehicle feature a higher level of complexity, given that more degrees of freedom must be taken into account. For example, a motorcycle can twist sideways, thus generating a roll angle. A slight pitch angle has to be considered as well, since wheel suspensions have a higher degree of motion compared to four-wheel motor vehicles. In this paper we present a method for the accurate reconstruction of the trajectory of a “Vespa” scooter, which can be used as an alternative to the “classical” approach based on GPS/INS sensor integration. Position and orientation of the scooter are obtained by integrating MEMS-based orientation sensor data with digital images through a cascade of a Kalman filter and a Bayesian particle filter. PMID:23348036

  14. Species Distribution 2.0: An Accurate Time- and Cost-Effective Method of Prospection Using Street View Imagery

    PubMed Central

    Schwoertzig, Eugénie; Millon, Alexandre

    2016-01-01

    Species occurrence data provide crucial information for biodiversity studies in the current context of global environmental changes. Such studies often rely on a limited number of occurrence data collected in the field and on pseudo-absences arbitrarily chosen within the study area, which reduces the value of these studies. To overcome this issue, we propose an alternative method of prospection using geo-located street view imagery (SVI). Following a standardised protocol of virtual prospection using both vertical (aerial photographs) and horizontal (SVI) perceptions, we surveyed 1097 randomly selected cells across Spain (0.1 x 0.1 degree, i.e. 20% of Spain) for the presence of Arundo donax L. (Poaceae). In total we detected A. donax in 345 cells, thus substantially expanding beyond the now two-centuries-old field-derived record, which recorded A. donax in only 216 cells. Among the field occurrence cells, 81.1% were confirmed by SVI prospection to be consistent with species presence. In addition, we recorded, by SVI prospection, 752 absences, i.e. cells where A. donax was considered absent. We also compared the outcomes of climatic niche modeling based on SVI data against those based on field data. Using generalized linear models fitted with bioclimatic predictors, we found SVI data to provide far more compelling results in terms of niche modeling than does field data as classically used in species distribution modelling (SDM). This original, cost- and time-effective method provides the means to accurately locate highly visible taxa, reinforce absence data, and predict species distribution without long and expensive in situ prospection. At this time, the majority of available SVI data is restricted to human-disturbed environments that have road networks. However, SVI is becoming increasingly available in natural areas, which means the technique has considerable potential to become an important factor in future biodiversity studies. PMID:26751565

  15. Accurate Estimation of Fungal Diversity and Abundance through Improved Lineage-Specific Primers Optimized for Illumina Amplicon Sequencing

    PubMed Central

    Walters, William A.; Lennon, Niall J.; Bochicchio, James; Krohn, Andrew; Pennanen, Taina

    2016-01-01

    While high-throughput sequencing methods are revolutionizing fungal ecology, recovering accurate estimates of species richness and abundance has proven elusive. We sought to design internal transcribed spacer (ITS) primers and an Illumina protocol that would maximize coverage of the kingdom Fungi while minimizing nontarget eukaryotes. We inspected alignments of the 5.8S and large subunit (LSU) ribosomal genes and evaluated potential primers using PrimerProspector. We tested the resulting primers using tiered-abundance mock communities and five previously characterized soil samples. We recovered operational taxonomic units (OTUs) belonging to all 8 members in both mock communities, despite DNA abundances spanning 3 orders of magnitude. The expected and observed read counts were strongly correlated (r = 0.94 to 0.97). However, several taxa were consistently over- or underrepresented, likely due to variation in rRNA gene copy numbers. The Illumina data resulted in clustering of soil samples identical to that obtained with Sanger sequence clone library data using different primers. Furthermore, the two methods produced distance matrices with a Mantel correlation of 0.92. Nonfungal sequences comprised less than 0.5% of the soil data set, with most attributable to vascular plants. Our results suggest that high-throughput methods can produce fairly accurate estimates of fungal abundances in complex communities. Further improvements might be achieved through corrections for rRNA copy number and utilization of standardized mock communities. IMPORTANCE: Fungi play numerous important roles in the environment. Improvements in sequencing methods are providing revolutionary insights into fungal biodiversity, yet accurate estimates of the number of fungal species (i.e., richness) and their relative abundances in an environmental sample (e.g., soil, roots, water, etc.) remain difficult to obtain. We present improved methods for high-throughput Illumina sequencing of the

  16. A Review of Cost Estimates for Direct Spending Legislation

    DTIC Science & Technology

    1991-06-10

    savings estimated. OMB estimated savings of $124 billion for all of the fourteen bills—7 percent or $9 billion less than CBO. On balance, almost all of...five-year savings of $133.3 billion—a difference of $8.8 billion, or 7 percent. On balance, almost all of this difference arises in just three...go scorecard. If OMB moved to the CBO definitions for scoring, OMB’s estimate of pay-as-you-go savings in the Budget would be $24.4 billion. If CBO

  17. Estimating Development Cost of an Interactive Website Based Cancer Screening Promotion Program

    PubMed Central

    Lairson, David R.; Chung, Tong Han; Smith, Lisa G.; Springston, Jeffrey K.; Champion, Victoria L.

    2015-01-01

    Objectives: The aim of this study was to estimate the initial development costs for an innovative talk-show-format tailored intervention, delivered via the interactive web, for increasing cancer screening in women aged 50 to 75 who were non-adherent to screening guidelines for colorectal cancer and/or breast cancer. Methods: The cost of the intervention development was estimated from a societal perspective. Micro-costing methods plus vendor contract costs were used to estimate cost. Staff logs were used to track personnel time. Non-personnel costs include all additional resources used to produce the intervention. Results: The development cost of the interactive web-based intervention was $0.39 million, of which 77% was direct cost. About 98% of the cost was incurred in personnel time cost, contract cost and overhead cost. Conclusions: The new web-based disease prevention medium required substantial investment in health promotion and media specialist time. The development cost was primarily driven by the high level of human capital required. The cost of intervention development is important information for assessing and planning future public and private investments in web-based health promotion interventions. PMID:25749548
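
    A minimal micro-costing sketch matching the method described (personnel time from staff logs valued at wages, plus vendor contracts and overhead); every figure below is invented.

      # Micro-costing: personnel hours x wage rates, plus contracts, plus overhead.
      personnel_hours = {"health promotion specialist": 1200, "media specialist": 900,
                         "project coordinator": 600}
      hourly_rates = {"health promotion specialist": 45.0, "media specialist": 55.0,
                      "project coordinator": 35.0}
      vendor_contracts = 150_000
      overhead_rate = 0.30  # overhead as a fraction of direct cost (assumption)

      direct = sum(personnel_hours[r] * hourly_rates[r] for r in personnel_hours) + vendor_contracts
      total = direct * (1 + overhead_rate)
      print(f"direct: ${direct:,.0f}, total with overhead: ${total:,.0f}")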

  18. 31 CFR Appendix I(f) to Part 13 - Estimated Overhead and Administrative Costs

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance: Treasury 1 2012-07-01 2012-07-01 false Estimated Overhead and Administrative Costs I(F) Appendix I(F) to Part 13 Money and Finance: Treasury Office of the Secretary of the Treasury... Pt. 13, App. I(F) Appendix I(F) to Part 13—Estimated Overhead and Administrative Costs Date:...

  19. 31 CFR Appendix I(f) to Part 13 - Estimated Overhead and Administrative Costs

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance: Treasury 1 2013-07-01 2013-07-01 false Estimated Overhead and Administrative Costs I(F) Appendix I(F) to Part 13 Money and Finance: Treasury Office of the Secretary of the Treasury... Pt. 13, App. I(F) Appendix I(F) to Part 13—Estimated Overhead and Administrative Costs Date:...

  20. 31 CFR Appendix I(f) to Part 13 - Estimated Overhead and Administrative Costs

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance: Treasury 1 2011-07-01 2011-07-01 false Estimated Overhead and Administrative Costs I(F) Appendix I(F) to Part 13 Money and Finance: Treasury Office of the Secretary of the Treasury... Pt. 13, App. I(F) Appendix I(F) to Part 13—Estimated Overhead and Administrative Costs Date:...

  1. 31 CFR Appendix I(f) to Part 13 - Estimated Overhead and Administrative Costs

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Estimated Overhead and Administrative Costs I(F) Appendix I(F) to Part 13 Money and Finance: Treasury Office of the Secretary of the Treasury... Pt. 13, App. I(F) Appendix I(F) to Part 13—Estimated Overhead and Administrative Costs Date:...

  2. Estimates of the national benefits and costs of improving ambient air quality

    SciTech Connect

    Brady, G.L.; Bower, B.T.; Lakhani, H.A.

    1983-04-01

    This paper examines the estimates of national benefits and national costs of ambient air quality improvement in the United States for the period 1970 to 1978. Analysis must be at the micro-level for both receptors of pollution and dischargers of residuals. Section 2 discusses techniques for estimating the national benefits from improving ambient air quality. The literature on national benefits to health (mortality and morbidity) and non-health (avoiding damages to materials, plants, crops, etc.) is critically reviewed in this section. For the period 1970 to 1978, the value of these benefits ranged from about $5 billion to $51 billion, with a point estimate of about $22 billion. The national cost estimates by the Council on Environmental Quality, Bureau of Economic Analysis, and McGraw-Hill are provided in section 3. Cost estimates must include not only end-of-pipe treatment measures, but also the alternatives: changes in product specification, product mix, processes, etc. These types of responses are not generally considered in estimates of national costs. For the period 1970 to 1978, the estimates of national costs of improving ambient air quality provided in section 3 ranged from $8 billion to $9 billion in 1978 dollars. Section 4 concludes that the national benefits of improving ambient air quality exceed the national costs for the average and high values of benefits, but not for the low estimates. Section 5 discusses the requirements for establishing a national-regional computational framework for estimating national benefits and national costs. 49 references, 2 tables

  3. The Pilot Training Study: A Cost-Estimating Model for Advanced Pilot Training (APT).

    ERIC Educational Resources Information Center

    Knollmeyer, L. E.

    The Advanced Pilot Training Cost Model is a statement of relationships that may be used, given the necessary inputs, for estimating the resources required and the costs to train pilots in the Air Force formal flying training schools. Resources and costs are computed by weapon system on an annual basis for use in long-range planning or sensitivity…

  4. Estimating Resource Costs of Levy Campaigns in Five Ohio School Districts

    ERIC Educational Resources Information Center

    Ingle, W. Kyle; Petroff, Ruth Ann; Johnson, Paul A.

    2011-01-01

    Using Levin and McEwan's (2001) "ingredients method," this study identified the major activities and associated costs of school levy campaigns in five districts. The ingredients were divided into one of five cost categories--human resources, facilities, fees, marketing, and supplies. As to overall costs of the campaigns, estimates ranged…

  5. Aircraft Contractor Logistics Support: A Cost Estimating Guide.

    DTIC Science & Technology

    1983-09-01

    Reference-list excerpt (partially recovered from OCR): ... April 1983. 17. Enright, Mike. C-20A Cost Analyst, WPAFB OH. Personal interview, 15 August 1983. 18. Fatkin, Allen. AFSC CAIG Research Report NRI... Aviation, Peterson AFB CO. Telephone interview, 11 May 1983. 32. Robert, Bob. Engineer for Doss Aviation, Randolph AFB TX. Telephone interview, 17 June

  6. ESTIMATING INNOVATIVE TECHNOLOGY COSTS FOR THE SITE PROGRAM

    EPA Science Inventory

    Among the objectives of the EPA's Superfund Innovative Technology Evaluation (SITE) Program are two which pertain to the issue of economics: 1) That the program will provide a projected cost for each treatment technology demonstrated. 2) That the program will attempt to identify ...

  7. Estimating the Costs of Torture: Challenges and Opportunities.

    PubMed

    Mpinga, Emmanuel Kabengele; Kandala, Ngianga-Bakwin; Hasselgård-Rowe, Jennifer; Tshimungu Kandolo, Félicien; Verloo, Henk; Bukonda, Ngoyi K Zacharie; Chastonay, Philippe

    2015-12-01

    Due to its nature, extent and consequences, torture is considered a major public health problem and a serious violation of human rights. Our study aims to set the foundation for a theoretical framework of the costs related to torture. It examines existing challenges and proposes some solutions. Our proposed framework targets policy makers, human rights activists, professionals working in programmes, centres and rehabilitation projects, judges and lawyers, survivors of torture and their families and anyone involved in the prevention and fight against this practice and its consequences. We adopted a methodology previously used in studies investigating the challenges in measuring and valuing productivity costs in health disorders. We identify and discuss conceptual, methodological, political and ethical challenges that studies on the economic and social costs of torture pose and propose alternatives in terms of possible solutions to these challenges. The economic dimension of torture is rarely debated and integrated in research, policies and programmes. Several challenges such as epistemological, methodological, ethical or political ones have often been presented as obstacles to cost studies of torture and as an excuse for not investigating this dimension. In identifying, analysing and proposing solutions to these challenges, we intend to stimulate the integration of the economic dimension in research and prevention of torture strategies.

  8. A Robust Design Approach to Cost Estimation: Solar Energy for Marine Corps Expeditionary Operations (Briefing Charts)

    DTIC Science & Technology

    2014-05-01

    Briefing-chart residue (partially recovered): oil and solar photovoltaic cost projections (oil projections from USEIA, 2014) and PV array costs, with data for days 75-134 in Salt Lake City, by year, 1961-2010. Title: A Robust Design Approach to Cost Estimation: Solar Energy for Marine Corps Expeditionary Operations. Authors: S.M. Sanchez, M.M. Morse, S.C. Upton, M.L... Dates covered: 2014.

  9. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... Planning and Performing Construction and Other Development Pt. 1924, Subpt. A, Exh. A Exhibit A to Subpart A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  10. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... Planning and Performing Construction and Other Development Pt. 1924, Subpt. A, Exh. A Exhibit A to Subpart A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  11. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... Planning and Performing Construction and Other Development Pt. 1924, Subpt. A, Exh. A Exhibit A to Subpart A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  12. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... Planning and Performing Construction and Other Development Pt. 1924, Subpt. A, Exh. A Exhibit A to Subpart A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  13. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... Planning and Performing Construction and Other Development Pt. 1924, Subpt. A, Exh. A Exhibit A to Subpart A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  14. Estimating the cost of new drug development: is it really 802 million dollars?

    PubMed

    Adams, Christopher P; Brantner, Van V

    2006-01-01

    This paper replicates the drug development cost estimates of Joseph DiMasi and colleagues ("The Price of Innovation"), using their published cost estimates along with information on success rates and durations from a publicly available data set. For drugs entering human clinical trials for the first time between 1989 and 2002, the paper estimated the cost per new drug to be 868 million dollars. However, our estimates vary from around 500 million dollars to more than 2,000 million dollars, depending on the therapy or the developing firm.

  15. Accurate estimation of entropy in very short physiological time series: the problem of atrial fibrillation detection in implanted ventricular devices.

    PubMed

    Lake, Douglas E; Moorman, J Randall

    2011-01-01

    Entropy estimation is useful but difficult in short time series. For example, automated detection of atrial fibrillation (AF) in very short heart beat interval time series would be useful in patients with cardiac implantable electronic devices that record only from the ventricle. Such devices require efficient algorithms, and the clinical situation demands accuracy. Toward these ends, we optimized the sample entropy measure, which reports the probability that short templates will match with others within the series. We developed general methods for the rational selection of the template length m and the matching tolerance r. The major innovation was to allow r to vary so that sufficient matches are found for confident entropy estimation, with conversion of the final probability to a density by dividing by the matching region volume, (2r)^m. The optimized sample entropy estimate and the mean heart beat interval each contributed to accurate detection of AF in as few as 12 heartbeats. The final algorithm, called the coefficient of sample entropy (COSEn), was developed using the canonical MIT-BIH database and validated in a new and much larger set of consecutive Holter monitor recordings from the University of Virginia. In patients over 40 years of age, COSEn has high degrees of accuracy in distinguishing AF from normal sinus rhythm in 12-beat calculations performed hourly. The most common errors are atrial or ventricular ectopy, which increase entropy despite sinus rhythm, and atrial flutter, which can have low or high entropy states depending on the dynamics of atrioventricular conduction.
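
    A minimal sketch of the variable-tolerance idea described above, assuming a short RR-interval series in seconds. The function name (cosen_like), the widening rule, and all parameter values are illustrative rather than the authors' published algorithm: it counts template matches at lengths m and m+1, widens r until enough matches are found, and then applies the ln(2r) density conversion and the mean-interval term that define COSEn.

        import numpy as np

        def cosen_like(rr, m=1, r=0.03, min_matches=5):
            """Hedged sketch of a COSEn-style entropy estimate on a short series."""
            rr = np.asarray(rr, dtype=float)
            n = len(rr)
            while True:
                a = b = 0
                for i in range(n - m):
                    for j in range(i + 1, n - m):
                        # template match at length m (Chebyshev distance)
                        if np.max(np.abs(rr[i:i + m] - rr[j:j + m])) <= r:
                            b += 1
                            # the match extends to length m + 1
                            if abs(rr[i + m] - rr[j + m]) <= r:
                                a += 1
                if a >= min_matches or r > np.ptp(rr):
                    break
                r *= 1.2  # too few matches for confident estimation: widen r
            if a == 0 or b == 0:
                return np.nan
            # sample entropy, converted to a density and referenced to heart rate
            return -np.log(a / b) + np.log(2 * r) - np.log(np.mean(rr))

        rr = np.array([0.62, 0.80, 0.54, 0.91, 0.66, 0.73,
                       0.59, 0.88, 0.61, 0.79, 0.70, 0.64])  # 12 hypothetical beats
        print(cosen_like(rr))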

  16. Estimating Criminal Justice System Costs and Cost-Savings Benefits of Day Reporting Centers

    ERIC Educational Resources Information Center

    Craddock, Amy

    2004-01-01

    This paper reports on the net cost-savings benefits (loss) to the criminal justice system of one rural and one urban day reporting center, both of which serve high risk/high need probationers. It also discusses issues of conducting criminal justice system cost studies of community corrections programs. The average DRC participant in the rural…

  17. Molten Salt: Concept Definition and Capital Cost Estimate

    SciTech Connect

    Stoddard, Larry; Andrew, Daniel; Adams, Shannon; Galluzzo, Geoff

    2016-06-30

    The Department of Energy’s (DOE’s) Office of Renewable Power (ORP) has been tasked to provide effective program management and strategic direction for all of the DOE’s Energy Efficiency & Renewable Energy’s (EERE’s) renewable power programs. The ORP’s efforts to accomplish this mission are aligned with national energy policies, DOE strategic planning, EERE’s strategic planning, Congressional appropriation, and stakeholder advice. ORP is supported by three renewable energy offices, of which one is the Solar Energy Technology Office (SETO) whose SunShot Initiative has a mission to accelerate research, development and large scale deployment of solar technologies in the United States. SETO has a goal of reducing the cost of Concentrating Solar Power (CSP) by 75 percent of 2010 costs by 2020 to reach parity with base-load energy rates, and to reduce costs 30 percent further by 2030. The SunShot Initiative is promoting the implementation of high temperature CSP with thermal energy storage allowing generation during high demand hours. The SunShot Initiative has funded significant research and development work on component testing, with attention to high temperature molten salts, heliostats, receiver designs, and high efficiency high temperature supercritical CO2 (sCO2) cycles. DOE retained Black & Veatch to support SETO’s SunShot Initiative for CSP solar power tower technology in the following areas: 1. Concept definition, including costs and schedule, of a flexible test facility to be used to test and prove components in part to support financing. 2. Concept definition, including costs and schedule, of an integrated high temperature molten salt (MS) facility with thermal energy storage and with a supercritical CO2 cycle generating approximately 10MWe. 3. Concept definition, including costs and schedule, of an integrated high temperature falling particle facility with thermal energy storage and with a supercritical CO2

  18. A systematic approach for the accurate non-invasive estimation of blood glucose utilizing a novel light-tissue interaction adaptive modelling scheme

    NASA Astrophysics Data System (ADS)

    Rybynok, V. O.; Kyriacou, P. A.

    2007-10-01

    Diabetes is one of the biggest health challenges of the 21st century. The obesity epidemic, sedentary lifestyles and an ageing population mean prevalence of the condition is currently doubling every generation. Diabetes is associated with serious chronic ill health, disability and premature mortality. Long-term complications including heart disease, stroke, blindness, kidney disease and amputations, make the greatest contribution to the costs of diabetes care. Many of these long-term effects could be avoided with earlier, more effective monitoring and treatment. Currently, blood glucose can only be monitored through the use of invasive techniques. To date there is no widely accepted and readily available non-invasive monitoring technique to measure blood glucose despite the many attempts. This paper takes on one of the most difficult non-invasive monitoring problems, that of blood glucose, and proposes a novel approach that will enable the accurate, calibration-free estimation of glucose concentration in blood. This approach is based on spectroscopic techniques and a new adaptive modelling scheme. The theoretical implementation and the effectiveness of the adaptive modelling scheme for this application are described, and a detailed mathematical evaluation is employed to show that such a scheme is capable of accurately extracting the concentration of glucose from a complex biological medium.

  19. Accuracy of a low-cost global positioning system receiver for estimating grade during outdoor walking.

    PubMed

    de Müllenheim, Pierre-Yves; Chaudru, Ségolène; Gernigon, Marie; Mahé, Guillaume; Bickert, Sandrine; Prioux, Jacques; Noury-Desvaux, Bénédicte; Le Faucheur, Alexis

    2016-09-21

    The aim of this study was to assess, for the first time, the accuracy of a low-cost global positioning system (GPS) receiver for estimating grade during outdoor walking. Thirty subjects completed outdoor walks (2.0, 3.5 and 5.0 km·h⁻¹) in three randomized conditions: (1) level walking on a 0.0% grade; (2) graded (uphill and downhill) walking on a 3.4% grade; and (3) graded walking on a 10.4% grade. Subjects were equipped with a GPS receiver (DG100, GlobalSat Technology Corp., Taiwan; ~US$75). The GPS receiver was set to record at 1 Hz and its antenna was placed on the right shoulder. Grade was calculated from GPS speed and altitude data (grade = altitude variation/travelled distance × 100). Two methods were used for the grade calculation: one using uncorrected altitude data given by the GPS receiver and another using corrected altitude data obtained with map projection software (CartoExploreur, version 3.11.0, build 2.6.6.22, Bayo Ltd, Appoigny, France, ~US$35). Linear regression of GPS-estimated versus actual grade with R² coefficients, bias with 95% limits of agreement (±95% LoA), and typical error of the estimate with 95% confidence interval (TEE (95% CI)) were computed to assess the accuracy of the GPS receiver. In total, 444 walking periods were performed. Using uncorrected altitude data, we obtained: R² = 0.88 (p < 0.001), bias = 0.0 ± 6.6%, and TEE between 1.9 (1.7-2.2)% and 4.2 (3.6-4.9)% according to the grade level. Using corrected altitude data, we obtained: R² = 0.98 (p < 0.001), bias = 0.2 ± 1.9%, and TEE between 0.2 (0.2-0.3)% and 1.0 (0.9-1.2)% according to the grade level. The low-cost GPS receiver used was weakly accurate for estimating grade during outdoor walking when using uncorrected altitude data. However, the accuracy was greatly improved when using corrected altitude data. This study supports the potential interest of using GPS for estimating energy
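
    The grade formula quoted in the abstract is simple enough to compute directly. A short sketch, assuming hypothetical 1 Hz samples of GPS speed (m/s) and corrected altitude (m) over one walking period; the function name and all values are invented for illustration.

        import numpy as np

        def gps_grade_percent(speed_mps, altitude_m, dt=1.0):
            travelled = np.sum(speed_mps * dt)            # travelled distance, m
            dalt = altitude_m[-1] - altitude_m[0]         # altitude variation, m
            return 100.0 * dalt / travelled               # grade = variation / distance x 100

        speed = np.full(60, 1.39)                         # ~5.0 km/h held for 60 s
        altitude = np.linspace(100.0, 102.8, 60)          # corrected altitude profile
        print(gps_grade_percent(speed, altitude))         # ~3.4% grade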

  20. Estimating Power Outage Cost based on a Survey for Industrial Customers

    NASA Astrophysics Data System (ADS)

    Yoshida, Yoshikuni; Matsuhashi, Ryuji

    A survey on power outage costs was conducted among industrial customers: 5,139 factories designated as energy management factories in Japan reported their power consumption and the production value lost during a one-hour outage on a summer weekday. The median unit cost of a power outage across all sectors is estimated at 672 yen/kWh. The services for amusement and hobbies sector and the manufacture of information and communication electronics equipment sector have relatively higher unit costs of power outage. Direct damage from power outage across all sectors reaches 77 billion yen. Utilizing input-output analysis, we then estimated the indirect damage caused by the repercussions of the production halt; indirect damage across all sectors reaches 91 billion yen. The wholesale and retail trade sector has the largest direct damage cost, and the manufacture of transportation equipment sector has the largest indirect damage cost.
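
    The indirect-damage step can be made concrete with a toy demand-side Leontief calculation: a vector of direct production losses is propagated through the inter-industry structure via (I - A)^-1. The 3-sector technical-coefficients matrix and the loss vector below are invented; the study itself uses Japan's full input-output tables, and its exact formulation may differ.

        import numpy as np

        # Hypothetical technical coefficients: input from sector i per unit output of sector j
        A = np.array([[0.10, 0.20, 0.05],
                      [0.15, 0.10, 0.10],
                      [0.05, 0.10, 0.15]])
        direct_loss = np.array([10.0, 5.0, 2.0])   # billion yen, by sector

        # Total loss including the repercussions of the production halt
        total_loss = np.linalg.solve(np.eye(3) - A, direct_loss)
        indirect_loss = total_loss - direct_loss
        print(indirect_loss, indirect_loss.sum())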

  1. Falling Particles: Concept Definition and Capital Cost Estimate

    SciTech Connect

    Stoddard, Larry; Galluzzo, Geoff; Adams, Shannon; Andrew, Daniel

    2016-06-30

    The Department of Energy’s (DOE) Office of Renewable Power (ORP) has been tasked to provide effective program management and strategic direction for all of the DOE’s Energy Efficiency & Renewable Energy’s (EERE’s) renewable power programs. The ORP’s efforts to accomplish this mission are aligned with national energy policies, DOE strategic planning, EERE’s strategic planning, Congressional appropriation, and stakeholder advice. ORP is supported by three renewable energy offices, of which one is the Solar Energy Technology Office (SETO) whose SunShot Initiative has a mission to accelerate research, development and large scale deployment of solar technologies in the United States. SETO has a goal of reducing the cost of Concentrating Solar Power (CSP) by 75 percent of 2010 costs by 2020 to reach parity with base-load energy rates, and to reduce costs 30 percent further by 2030. The SunShot Initiative is promoting the implementation of high temperature CSP with thermal energy storage allowing generation during high demand hours. The SunShot Initiative has funded significant research and development work on component testing, with attention to high temperature molten salts, heliostats, receiver designs, and high efficiency high temperature supercritical CO2 (sCO2) cycles.

  2. Impact of interfacial high-density water layer on accurate estimation of adsorption free energy by Jarzynski's equality

    NASA Astrophysics Data System (ADS)

    Zhang, Zhisen; Wu, Tao; Wang, Qi; Pan, Haihua; Tang, Ruikang

    2014-01-01

    The interactions between proteins/peptides and materials are crucial to research and development in many biomedical engineering fields. The energetics of such interactions are key in the evaluation of new proteins/peptides and materials. Much research has recently focused on the quality of free energy profiles by Jarzynski's equality, a widely used equation in biosystems. In the present work, considerable discrepancies were observed between the results obtained by Jarzynski's equality and those derived by umbrella sampling in biomaterial-water model systems. Detailed analyses confirm that such discrepancies turn up only when the target molecule moves in the high-density water layer on a material surface. Then a hybrid scheme was adopted based on this observation. The agreement between the results of the hybrid scheme and umbrella sampling confirms the former observation, which indicates an approach to a fast and accurate estimation of adsorption free energy for large biomaterial interfacial systems.
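
    For context, Jarzynski's equality recovers an equilibrium free-energy difference from nonequilibrium work samples via exp(-dF/kT) = <exp(-W/kT)>. A minimal sketch with a synthetic Gaussian work distribution (all numbers invented); the exponential average is dominated by rare low-work trajectories, which is one reason such estimates can disagree with umbrella sampling in the way the abstract reports.

        import numpy as np

        rng = np.random.default_rng(1)
        kT = 0.593                          # kcal/mol near 298 K
        W = rng.normal(3.0, 1.0, 200)       # hypothetical pulling work, kcal/mol

        # Jarzynski estimator: dF = -kT * ln <exp(-W/kT)>
        dF = -kT * np.log(np.mean(np.exp(-W / kT)))
        print(dF)  # for Gaussian work the exact answer is mean(W) - var(W)/(2*kT)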

  3. Accurate state estimation from uncertain data and models: an application of data assimilation to mathematical models of human brain tumors

    PubMed Central

    2011-01-01

    Background Data assimilation refers to methods for updating the state vector (initial condition) of a complex spatiotemporal model (such as a numerical weather model) by combining new observations with one or more prior forecasts. We consider the potential feasibility of this approach for making short-term (60-day) forecasts of the growth and spread of a malignant brain cancer (glioblastoma multiforme) in individual patient cases, where the observations are synthetic magnetic resonance images of a hypothetical tumor. Results We apply a modern state estimation algorithm (the Local Ensemble Transform Kalman Filter), previously developed for numerical weather prediction, to two different mathematical models of glioblastoma, taking into account likely errors in model parameters and measurement uncertainties in magnetic resonance imaging. The filter can accurately shadow the growth of a representative synthetic tumor for 360 days (six 60-day forecast/update cycles) in the presence of a moderate degree of systematic model error and measurement noise. Conclusions The mathematical methodology described here may prove useful for other modeling efforts in biology and oncology. An accurate forecast system for glioblastoma may prove useful in clinical settings for treatment planning and patient counseling. Reviewers This article was reviewed by Anthony Almudevar, Tomas Radivoyevitch, and Kristin Swanson (nominated by Georg Luebeck). PMID:22185645
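
    A toy stochastic ensemble Kalman update conveys the core of the approach. The paper's Local Ensemble Transform Kalman Filter is a deterministic, localized square-root variant, so the snippet below is a simplified stand-in with invented numbers, not the authors' algorithm.

        import numpy as np

        rng = np.random.default_rng(0)
        n_state, n_ens, obs_sd = 50, 20, 0.1                # 1-D "tumor density" grid

        ensemble = rng.normal(0.5, 0.2, (n_ens, n_state))   # prior forecast ensemble
        truth = np.full(n_state, 0.6)
        obs = truth + rng.normal(0.0, obs_sd, n_state)      # synthetic image-derived data

        anomalies = ensemble - ensemble.mean(axis=0)
        P = anomalies.T @ anomalies / (n_ens - 1)           # sample forecast covariance
        K = P @ np.linalg.inv(P + obs_sd**2 * np.eye(n_state))  # gain, observing every point

        perturbed = obs + rng.normal(0.0, obs_sd, (n_ens, n_state))
        analysis = ensemble + (perturbed - ensemble) @ K.T  # perturbed-observation update
        print(np.abs(analysis.mean(axis=0) - truth).mean())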

  4. Reservoir evaluation of thin-bedded turbidites and hydrocarbon pore thickness estimation for an accurate quantification of resource

    NASA Astrophysics Data System (ADS)

    Omoniyi, Bayonle; Stow, Dorrik

    2016-04-01

    One of the major challenges in the assessment of and production from turbidite reservoirs is to take full account of thin and medium-bedded turbidites (<10 cm and <30 cm, respectively). Although such thinner, low-pay sands may comprise a significant proportion of the reservoir succession, they can go unnoticed by conventional analysis and so negatively impact reserve estimation, particularly in fields producing from prolific thick-bedded turbidite reservoirs. Field development plans often take little note of such thin beds, which are therefore bypassed by mainstream production. In fact, the trapped and bypassed fluids can be vital where maximising field value and optimising production are key business drivers. We have studied in detail a succession of thin-bedded turbidites associated with thicker-bedded reservoir facies in the North Brae Field, UKCS, using a combination of conventional logs and cores to assess the significance of thin-bedded turbidites in computing hydrocarbon pore thickness (HPT). This quantity, being an indirect measure of thickness, is critical for an accurate estimation of original-oil-in-place (OOIP). By using a combination of conventional and unconventional logging analysis techniques, we obtain three different results for the reservoir intervals studied. These results include estimated net sand thickness, average sand thickness, and their distribution trend within a 3D structural grid. The net sand thickness varies from 205 to 380 ft, and HPT ranges from 21.53 to 39.90 ft. We observe that an integrated approach (neutron-density cross plots conditioned to cores) to HPT quantification reduces the associated uncertainties significantly, resulting in estimation of 96% of actual HPT. Further work will focus on assessing the 3D dynamic connectivity of the low-pay sands with the surrounding thick-bedded turbidite facies.
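
    The HPT calculation itself reduces to a bed-by-bed sum of thickness x porosity x hydrocarbon saturation, which is why beds thinner than a net-sand cutoff still carry volume. A sketch with three invented beds (thin, medium, thick); all values are illustrative only.

        import numpy as np

        h = np.array([0.08, 0.25, 2.00])      # bed thickness, m
        phi = np.array([0.18, 0.20, 0.22])    # porosity, fraction
        sw = np.array([0.45, 0.40, 0.30])     # water saturation, fraction

        hpt = np.sum(h * phi * (1.0 - sw))    # hydrocarbon pore thickness, m
        print(hpt)  # the thin bed contributes even though a cutoff would drop it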

  5. Medical costs of smoking in the United States: estimates, their validity, and their implications

    PubMed Central

    Warner, K.; Hodgson, T.; Carroll, C.

    1999-01-01

    OBJECTIVE—To compare estimates of the medical costs of smoking in the United States and to consider their relevance to assessing the costs of smoking in developing countries and the net economic burden of smoking.
DATA SOURCES—A Medline search through early 1999 using keywords "smoking" and "cost", with review of article reference lists.
STUDY SELECTION—Peer-reviewed papers examining medical costs in a single year, covering the non-institutionalised American population.
DATA EXTRACTION—Methods underlying study estimates were identified, described, and compared with attributable expenditure methodology in the literature dealing with costs of illness. Differences in methods were associated with implied differences in findings.
DATA SYNTHESIS—With one exception, the studies find the annual medical costs of smoking to constitute approximately 6-8% of American personal health expenditures. The exception, a recent study, found much larger attributable expenditures. The lower estimates may reflect the limitation of analysis to costs associated with the principal smoking-related diseases. The higher estimate derives from analysis of smoking-attributable differences in all medical costs. However, the finding from the most recent study, also considering all medical costs, fell in the 6-8% range.
CONCLUSIONS—The medical costs of smoking in the United States equal, and may well exceed, the commonly referenced figure of 6-8%. This literature has direct methodological relevance to developing countries interested in assessing the magnitude of their current cost-of-smoking burden and their future burdens, with differences in tobacco use histories and the availability of chronic disease treatment affecting country-specific estimates. The debate over the use of gross or net medical cost estimates is likely to intensify with the proliferation of lawsuits against the tobacco industry to recover expenditures on tobacco-produced disease.

  6. Cost estimation and modeling for space missions at APL/JHU

    NASA Astrophysics Data System (ADS)

    Crawford, L. J.; Coughlin, T. B.; Ebert, W. L.

    1996-07-01

    Cost estimation of space science missions at APL over the past two decades has been singularly successful in arriving at program costs that are within a few percent of the actual costs at program completion. The most recent example is the Near Earth Asteroid Rendezvous (NEAR) mission, which was estimated at approximately 112 million FY-92 dollars and came in at approximately three percent under the estimated cost. This demonstrated performance has been achieved without the benefit of a formal cost model, such as those used in government and industry (GSFC, MSFC, SAIC, etc.). In light of this performance, it is important to understand the parameters that are used in the cost estimating process in an effort to quantify those elements in a program that are most important to the final cost. We have identified a number of areas which contribute to eventual cost performance; these include: (a) spacecraft and mission complexity; (b) use of already-developed (facility-class) instruments versus "to be developed" new instruments; (c) synergism among programs being implemented concurrently; (d) program implementation length; (e) design-to-cost practice for all major subsystems and instruments without contingency; (f) lead engineer responsibility throughout design, layout, fabrication, test, integration, and initial flight operations; (g) designed-in quality and testability to minimize rework; (h) incorporation of reliability and quality assurance engineering within the program structure; (i) minimization of documentation and encouragement of oral and electronic communication as required. We have found that gross parametrization of costs, such as the traditional weight, power, and program length commonly included in typical models, does not reliably predict actual costs. A methodology is presented whereby the elements identified above, plus others, are used to describe the process implemented by APL in previous missions to generate cost estimates and to control costs.

  7. NAS Automation Equipment Operating Cost Estimates, FY 1978-1984,

    DTIC Science & Technology

    1981-06-01

    AIRWAY FACILITIES cost $78,947,000 (70%), AIR TRAFFIC software support was $30,027,000 (27%), and other labor was $3,244,000 (3%). Support of the Enroute... [Table-of-contents residue: Maintenance and Repair; Support Engineering; Automatic Data Interchange System, Service B; Field Software Maintenance; AT Training; Operational Software Support; NAFEC.]

  8. The Hospitalization Costs of Diabetes and Hypertension Complications in Zimbabwe: Estimations and Correlations

    PubMed Central

    Mutowo, Mutsa P.; Lorgelly, Paula K.; Laxy, Michael; Mangwiro, John C.; Owen, Alice J.

    2016-01-01

    Objective. Treating complications associated with diabetes and hypertension imposes significant costs on health care systems. This study estimated the hospitalization costs for inpatients in a public hospital in Zimbabwe. Methods. The study was retrospective and utilized secondary data from medical records. Total hospitalization costs were estimated using generalized linear models. Results. The median cost (interquartile range, IQR) for patients with diabetes, $994 (385–1553), with a mean of $1,319 (95% CI: 981–1657), was higher than for patients with hypertension, $759 (494–1147), with a mean of $914 (95% CI: 825–1003). Female patients aged below 65 years with diabetes had the highest estimated mean costs ($1,467 (95% CI: 1177–1828)). Wound care had the highest estimated mean cost of all procedures, $2,884 (95% CI: 2004–4149) for patients with diabetes and $2,239 (95% CI: 1589–3156) for patients with hypertension. Age below 65 years, medical procedures (amputation, wound care, dialysis, and physiotherapy), the presence of two or more comorbidities, and being prescribed two or more drugs were associated with significantly higher hospitalization costs. Conclusion. Our estimated costs could be used to evaluate and improve current inpatient treatment and management of patients with diabetes and hypertension and to determine the most cost-effective interventions to prevent complications and comorbidities. PMID:27403444
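
    A minimal sketch of the kind of generalized linear model the abstract names, assuming a Gamma family with a log link (a common choice for right-skewed cost data) and synthetic covariates. Variable names and coefficients are invented, and the statsmodels API is assumed available.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 500
        age_under_65 = rng.integers(0, 2, n)         # 1 if the patient is under 65
        n_procedures = rng.integers(0, 4, n)         # count of costly procedures
        X = sm.add_constant(np.column_stack([age_under_65, n_procedures]))

        mu = np.exp(6.0 + 0.30 * age_under_65 + 0.25 * n_procedures)
        cost = rng.gamma(shape=2.0, scale=mu / 2.0)  # synthetic skewed costs

        fit = sm.GLM(cost, X,
                     family=sm.families.Gamma(link=sm.families.links.Log())).fit()
        print(np.exp(fit.params))  # multiplicative effects on expected cost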

  9. Survey of State-Level Cost and Benefit Estimates of Renewable Portfolio Standards

    SciTech Connect

    Heeter, J.; Barbose, G.; Bird, L.; Weaver, S.; Flores-Espino, F.; Kuskova-Burns, K.; Wiser, R.

    2014-05-01

    Most renewable portfolio standards (RPS) have five or more years of implementation experience, enabling an assessment of their costs and benefits. Understanding RPS costs and benefits is essential for policymakers evaluating existing RPS policies, assessing the need for modifications, and considering new policies. This study provides an overview of methods used to estimate RPS compliance costs and benefits, based on available data and estimates issued by utilities and regulators. Over the 2010-2012 period, average incremental RPS compliance costs in the United States were equivalent to 0.8% of retail electricity rates, although substantial variation exists around this average, both from year-to-year and across states. The methods used by utilities and regulators to estimate incremental compliance costs vary considerably from state to state and a number of states are currently engaged in processes to refine and standardize their approaches to RPS cost calculation. The report finds that state assessments of RPS benefits have most commonly attempted to quantitatively assess avoided emissions and human health benefits, economic development impacts, and wholesale electricity price savings. Compared to the summary of RPS costs, the summary of RPS benefits is more limited, as relatively few states have undertaken detailed benefits estimates, and then only for a few types of potential policy impacts. In some cases, the same impacts may be captured in the assessment of incremental costs. For these reasons, and because methodologies and level of rigor vary widely, direct comparisons between the estimates of benefits and costs are challenging.

  10. Cost estimate of hospital stays for premature newborns of adolescent mothers in a Brazilian public hospital

    PubMed Central

    Mwamakamba, Lutufyo Witson; Zucchi, Paola

    2014-01-01

    Objective: To estimate the direct costs of hospital stay for premature newborns of adolescent mothers in a public hospital. Methods: A cost estimate study conducted between 2009 and 2011, in which direct hospital costs were estimated for premature newborns of adolescent mothers, with 22 to 36 6/7 gestational weeks, treated at the neonatal unit of the hospital. Results: In 2006, there were 5,180 deliveries at this hospital, and 17.8% (922) were newborns of adolescent mothers, of which 19.63% (181) were admitted to the neonatal unit. Out of the 181 neonates, 58% (105) were premature and 80% (84) of them were included in this study. These 84 neonates had a total of 1,633 days of in-patient hospital care at a total cost of US$195,609.00. Approximately 72% of this total cost (US$141,323.00) was accounted for by hospital services. The mean daily costs ranged from US$97.00 to US$157.00. Conclusion: This study demonstrated that the average cost of premature newborns from adolescent mothers was US$2,328.00 and varied according to birth weight. For those weighing <1,000 g at birth, the mean direct cost was US$8,930.00 per stay, as opposed to US$642.00 for those with birth weight >2,000 g. The overall estimated direct cost for the 84 neonates in the study totaled US$195,609.00. PMID:25003930
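
    The headline figures can be cross-checked with two divisions, using only numbers quoted in the record.

        total_cost_usd = 195_609
        n_neonates = 84
        total_days = 1_633

        print(total_cost_usd / n_neonates)  # ~2,328.7: matches the US$2,328.00 average per stay
        print(total_cost_usd / total_days)  # ~119.8/day: inside the US$97-157 range of daily means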

  11. Can endocranial volume be estimated accurately from external skull measurements in great-tailed grackles (Quiscalus mexicanus)?

    PubMed Central

    Palmstrom, Christin R.

    2015-01-01

    There is an increasing need to validate and collect data approximating brain size on individuals in the field to understand what evolutionary factors drive brain size variation within and across species. We investigated whether we could accurately estimate endocranial volume (a proxy for brain size), as measured by computerized tomography (CT) scans, using external skull measurements and/or by filling skulls with beads and pouring them out into a graduated cylinder for male and female great-tailed grackles. We found that while females had higher correlations than males, estimations of endocranial volume from external skull measurements or beads did not tightly correlate with CT volumes. We found no accuracy in the ability of external skull measures to predict CT volumes because the prediction intervals for most data points overlapped extensively. We conclude that we are unable to detect individual differences in endocranial volume using external skull measurements. These results emphasize the importance of validating and explicitly quantifying the predictive accuracy of brain size proxies for each species and each sex. PMID:26082858

  12. Estimating the state of a geophysical system with sparse observations: time delay methods to achieve accurate initial states for prediction

    NASA Astrophysics Data System (ADS)

    An, Zhe; Rey, Daniel; Ye, Jingxin; Abarbanel, Henry D. I.

    2017-01-01

    The problem of forecasting the behavior of a complex dynamical system through analysis of observational time-series data becomes difficult when the system expresses chaotic behavior and the measurements are sparse, in both space and/or time. Despite the fact that this situation is quite typical across many fields, including numerical weather prediction, the issue of whether the available observations are "sufficient" for generating successful forecasts is still not well understood. An analysis by Whartenby et al. (2013) found that in the context of the nonlinear shallow water equations on a β plane, standard nudging techniques require observing approximately 70 % of the full set of state variables. Here we examine the same system using a method introduced by Rey et al. (2014a), which generalizes standard nudging methods to utilize time delayed measurements. We show that in certain circumstances, it provides a sizable reduction in the number of observations required to construct accurate estimates and high-quality predictions. In particular, we find that this estimate of 70 % can be reduced to about 33 % using time delays, and even further if Lagrangian drifter locations are also used as measurements.

  13. Comparing NASA and ESA Cost Estimating Methods for Human Missions to Mars

    NASA Technical Reports Server (NTRS)

    Hunt, Charles D.; vanPelt, Michel O.

    2004-01-01

    To compare working methodologies between the cost engineering functions at NASA Marshall Space Flight Center (MSFC) and the ESA European Space Research and Technology Centre (ESTEC), and to set up cost engineering capabilities for future manned Mars projects and other studies involving similar subsystem technologies at MSFC and ESTEC, a demonstration cost estimate exercise was organized. This exercise was a direct way of enhancing not only cooperation between agencies but also both agencies' commitment to credible cost analyses. Cost engineers at MSFC and ESTEC independently prepared life-cycle cost estimates for a reference human Mars project and subsequently compared the results and estimate methods in detail. As a non-sensitive, public domain reference case for human Mars projects, the Mars Direct concept was chosen. In this paper the results of the exercise are shown: the differences and similarities in estimate methodologies, philosophies, and databases between MSFC and ESTEC, as well as the estimate results for the Mars Direct concept. The most significant differences are explained and possible estimate improvements identified. In addition, the Mars Direct plan and the extensive cost breakdown structure jointly set up by MSFC and ESTEC for this concept are presented. It was found that NASA applied estimate models mainly based on historic Apollo and Space Shuttle cost data, taking into account the changes in technology since then. ESA used models mostly based on European satellite and launcher cost data, taking into account the higher equipment and testing standards for human space flight. Most of NASA's and ESA's estimates for the Mars Direct case are comparable, but there are some important, consistent differences in the estimates for: 1) Large Structures and Thermal Control subsystems; 2) System Level Management, Engineering, Product Assurance and Assembly, Integration and Test/Verification activities; 3) Mission Control; 4) Space Agency Program Level

  14. Commercial Vessel Safety. Economic Costs. Appendix A. Estimation Procedures for Costs and Cost Impacts of Marine Safety Regulations.

    DTIC Science & Technology

    1979-12-01

    Vessel Delays at the Hackensack River Portal Bridge (NOTE: These exercises are unpublished). [OCR residue omitted.] Cost-benefit... Engineering Economy, by Eugene L. Grant and W. G. Ireson, Ronald Press Company, 1960. F. Inflation: Cost-benefit analysis is complicated by the fact prices... Alexandria, Virginia: Naval Facilities Engineering Command, June 1975. Hirshleifer, J., Investment, Interest and Capital. Englewood Cliffs, N.J.: Prentice

  15. Uncertainty of air pollution cost estimates: to what extent does it matter?

    PubMed

    Rabl, Ari; Spadaro, Joseph V; van der Zwaan, Bob

    2005-01-15

    How large is the social cost penalty if one makes the wrong choice because of uncertainties in the estimates of the costs and benefits of environmental policy measures? For discrete choices there is no general rule other than the recommendation to always carefully compare costs and benefits when introducing policies for environmental protection. For continuous choices (e.g., the ceiling for the total emissions of a pollutant by an entire sector or region), it is instructive to look at the cost penalty as a function of the error in the incremental damage cost estimate. Using abatement cost curves for NOx, SO2, dioxins, and CO2, this paper evaluates the cost penalty for errors in the following: national emission ceilings for NOx and SO2 in each of 12 countries of Europe, an emission ceiling for dioxins in the UK, and limits for the emission of CO2 in Europe. The cost penalty turns out to be remarkably insensitive to errors. An error by a factor of 3 due to uncertainties in the damage estimates for NOx and SO2 increases the total social cost by at most 20% and in most cases much less. For dioxins, the total social cost is increased by at most 10%. For CO2, several different possible cost curves are examined: for some the sensitivity to uncertainties is greater than for the other pollutants, but even here the penalty is less than 30% and in most cases much less if the true damage costs are twice as high as the ones estimated. The paper also quantifies the benefit of improving the accuracy of damage cost estimates by further research.
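
    The shape of this insensitivity is easy to reproduce with a stylized model: quadratic abatement cost, linear damage, and an emission ceiling set where marginal abatement cost equals the (possibly wrong) marginal damage estimate. All parameters are invented, so the penalty percentages differ from the paper's; the point is that the penalty grows only quadratically in the error factor's deviation from 1 (closed form: d^2 (f-1)^2 / (4a) for this model).

        def social_cost(q, q0=100.0, a=1.0, d=40.0):
            """Abatement cost a*(q0 - q)**2 plus damage d*q at emission level q."""
            return a * (q0 - q) ** 2 + d * q

        q0, a, d = 100.0, 1.0, 40.0
        q_opt = q0 - d / (2 * a)              # marginal abatement cost = marginal damage
        for f in (1 / 3, 1.0, 3.0):           # damage estimate wrong by factor f
            q_hat = q0 - f * d / (2 * a)      # ceiling chosen under the wrong estimate
            penalty = social_cost(q_hat) - social_cost(q_opt)
            print(f, penalty, penalty / social_cost(q_opt))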

  16. An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System

    DTIC Science & Technology

    1989-09-01

    AFIT/GCA/LSQ/89S-5. An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System. Thesis, Caroline L. Hanson, Major, USAF. [OCR residue and repeated cover-page text omitted.] ... Department of Defense. ... Presented to the

  17. Cost estimate of hospital stays for premature newborns in a public tertiary hospital in Brazil

    PubMed Central

    Desgualdo, Claudia Maria; Riera, Rachel; Zucchi, Paola

    2011-01-01

    OBJECTIVES: To estimate the direct costs of hospital stays for premature newborns in the Interlagos Hospital and Maternity Center in São Paulo, Brazil and to assess the difference between the amount reimbursed to the hospital by the Unified Health System and the real cost of care for each premature newborn. METHODS: A cost-estimate study in which hospital and professional costs were estimated for premature infants born at 22 to 36 weeks gestation during the calendar year of 2004 and surviving beyond one hour of age. Direct costs included hospital services, professional care, diagnoses and therapy, orthotics, prosthetics, special materials, and blood products. Costs were estimated using tables published by the Unified Health System and the Brasíndice as well as the list of medical procedures provided by the Brazilian Classification of Medical Procedures. RESULTS: The average direct cost of care for initial hospitalization of a premature newborn in 2004 was $2,386 USD. Total hospital expenses and professional services for all premature infants in this hospital were $227,000 and $69,500 USD, respectively. The costs for diagnostic testing and blood products for all premature infants totaled $22,440 and $1,833 USD. The daily average cost of a premature newborn weighing less than 1,000 g was $115 USD, and the daily average cost of a premature newborn weighing more than 2,500 g was $89 USD. Amounts reimbursed to the hospital by the Unified Health System corresponded to only 27.42% of the real cost of care. CONCLUSIONS: The cost of hospital stays for premature newborns was much greater than the amount reimbursed to the hospital by the Unified Health System. The highest costs corresponded to newborns with lower birth weight. Hospital costs progressively and discretely decreased as the newborns' weight increased. PMID:22012050

  18. Estimating the effect of hospital closure on areawide inpatient hospital costs: a preliminary model and application.

    PubMed Central

    Shepard, D S

    1983-01-01

    A preliminary model is developed for estimating the extent of savings, if any, likely to result from discontinuing a specific inpatient service. By examining the sources of referral to the discontinued service, the model estimates potential demand and how cases will be redistributed among remaining hospitals. This redistribution determines average cost per day in hospitals that receive these cases, relative to average cost per day of the discontinued service. The outflow rate, which measures the proportion of cases not absorbed in other acute care hospitals, is estimated as 30 percent for the average discontinuation. The marginal cost ratio, which relates marginal costs of cases absorbed in surrounding hospitals to the average costs in those hospitals, is estimated as 87 percent in the base case. The model was applied to the discontinuation of all inpatient services in the 75-bed Chelsea Memorial Hospital, near Boston, Massachusetts, using 1976 data. As the precise value of key parameters is uncertain, sensitivity analysis was used to explore a range of values. The most likely result is a small increase ($120,000) in the area's annual inpatient hospital costs, because many patients are referred to more costly teaching hospitals. A similar situation may arise with other urban closures. For service discontinuations to generate savings, recipient hospitals must be low in costs, the outflow rate must be large, and the marginal cost ratio must be low. PMID:6668181
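
    A compact sketch of the model's accounting, with invented figures: closing a service saves its own cost, absorbed cases add cost at the recipient hospitals' marginal rate, and cases that flow out of the acute-care system add nothing. The function and numbers are illustrative, not the paper's data.

        def net_areawide_change(cases, closed_cost_per_case, recipient_avg_cost_per_case,
                                outflow_rate=0.30, marginal_cost_ratio=0.87):
            absorbed = cases * (1.0 - outflow_rate)
            added = absorbed * marginal_cost_ratio * recipient_avg_cost_per_case
            saved = cases * closed_cost_per_case
            return added - saved

        # Referral to costlier teaching hospitals can make a closure a net loss:
        print(net_areawide_change(1_000, closed_cost_per_case=800.0,
                                  recipient_avg_cost_per_case=1_400.0))  # > 0: costs rise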

  19. The social cost of rheumatoid arthritis in Italy: the results of an estimation exercise.

    PubMed

    Turchetti, G; Bellelli, S; Mosca, M

    2014-03-14

    The objective of this study is to estimate the mean annual social cost per adult patient and the total social cost of rheumatoid arthritis (RA) in Italy. A literature review was performed by searching primary economic studies on adults in order to collect cost data for RA in Italy over the last decade. The review results were merged with data from institutional sources to estimate, following the methodological steps of cost-of-illness analysis, the social cost of RA in Italy. The mean annual social cost of RA was €13,595 per adult patient in Italy. Affecting 259,795 persons, RA generates a social cost of €3.5 billion in Italy. Non-medical direct costs and indirect costs represent the main cost items (48% and 31%) of the total social cost of RA in Italy. Based on these results, it is evident that assessing the economic burden of RA solely on the basis of direct medical costs gives a limited view of the phenomenon.
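
    The total follows directly from the per-patient figure; a one-line check using only numbers quoted above.

        per_patient_eur = 13_595
        patients = 259_795
        print(per_patient_eur * patients / 1e9)  # ~3.53, i.e. the quoted ~EUR 3.5 billion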

  20. Technology Cost and Schedule Estimation (TCASE) Final Report

    NASA Technical Reports Server (NTRS)

    Wallace, Jon; Schaffer, Mark

    2015-01-01

    During the 2014-2015 project year, the focus of the TCASE project shifted from collecting historical data from many sources to securing a data pipeline between TCASE and NASA's widely used TechPort system. TCASE v1.0 implements a data import solution that was achievable within the project scope, while still providing the basis for a long-term ability to keep TCASE in sync with TechPort. Conclusion: TCASE data quantity is adequate and the established data pipeline will enable future growth. Data quality is now highly dependent on the quality of data in TechPort. Recommendation: Technology development organizations within NASA should continue to work closely with project/program data tracking and archiving efforts (e.g. TechPort) to ensure that the right data is being captured at the appropriate quality level. TCASE would greatly benefit, for example, if project cost/budget information were included in TechPort in the future.

  1. Motion estimation by integrated low cost system (vision and MEMS) for positioning of a scooter "Vespa"

    NASA Astrophysics Data System (ADS)

    Guarnieri, A.; Milan, N.; Pirotti, F.; Vettore, A.

    2011-12-01

    In the automotive sector, especially in the last decade, a growing number of investigations have taken into account electronic systems to check and correct the behavior of drivers, increasing road safety. The possibility of identifying with high accuracy the vehicle position in a mapping reference frame for driving directions and best-route analysis is another topic which attracts a lot of interest from the research and development sector. To reach the objective of accurate vehicle positioning and integrate response events, it is necessary to estimate, time by time, the position, orientation and velocity of the system. To this aim, low-cost GPS and MEMS sensors can be used. In comparison to a four-wheel vehicle, the dynamics of a two-wheel vehicle (e.g. a scooter) feature a higher level of complexity. Indeed, more degrees of freedom must be taken into account to describe the motion of the latter. For example a scooter can twist sideways, thus generating a roll angle. A slight pitch angle has to be considered as well, since wheel suspensions have a higher degree of motion with respect to four-wheel vehicles. In this paper we present a method for the accurate reconstruction of the trajectory of a motorcycle (a "Vespa" scooter), which can be used as an alternative to the "classical" approach based on the integration of GPS and INS sensors. Position and orientation of the scooter are derived from MEMS data and images acquired by an on-board digital camera. A Bayesian filter provides the means for integrating the data from the MEMS-based orientation sensor and the GPS receiver.
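
    To make the MEMS/GPS integration concrete, here is a toy one-dimensional constant-velocity Kalman filter that predicts with MEMS acceleration and corrects with GPS position fixes. It is a simplified stand-in with invented noise parameters; the paper's Bayesian filter additionally fuses camera-derived orientation (roll, pitch) in more dimensions.

        import numpy as np

        def kalman_step(x, P, a_meas, z_gps, dt=0.1, q=0.5, r_gps=9.0):
            F = np.array([[1.0, dt], [0.0, 1.0]])       # state: [position, velocity]
            B = np.array([0.5 * dt**2, dt])             # acceleration input model
            Q = q * np.array([[dt**4 / 4, dt**3 / 2],
                              [dt**3 / 2, dt**2]])      # process noise
            H = np.array([[1.0, 0.0]])                  # GPS observes position only

            x = F @ x + B * a_meas                      # predict with MEMS acceleration
            P = F @ P @ F.T + Q

            y = z_gps - (H @ x)[0]                      # GPS innovation
            S = (H @ P @ H.T)[0, 0] + r_gps
            K = (P @ H.T / S).ravel()                   # Kalman gain
            x = x + K * y                               # update with the GPS fix
            P = (np.eye(2) - np.outer(K, H[0])) @ P
            return x, P

        x, P = np.zeros(2), np.eye(2)
        x, P = kalman_step(x, P, a_meas=0.3, z_gps=0.1)
        print(x)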

  2. Statistical estimation of service cracks and maintenance cost for aircraft structures

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1975-01-01

    A method is developed for the statistical estimation of the number of cracks to be repaired in service as well as the repair and the maintenance costs. The present approach accounts for the statistical distribution of the initial crack size, the statistical nature of the NDI technique used for detecting the crack, and the renewal process for the crack propagation of repaired cracks. The mean and the standard deviation of the cumulative number of cracks to be repaired are computed as a function of service time. The statistics of the costs of repair and maintenance, expressed in terms of the percentage of the cost of replacement, are estimated as a function of service time. The results of the present study provide relevant information for the decision of fleet management, the estimation of life cycle cost, and procurement specifications. The present study is essential to the design and cost optimization of aircraft structures.
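
    A Monte Carlo sketch of the ingredients the abstract lists: a random initial crack size, an imperfect NDI probability-of-detection curve, and renewal after each repair, tallied over service time. The growth law, POD curve, and every constant below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(2)

        def repairs_over_life(hours=20_000, inspect_every=2_000):
            a = rng.lognormal(mean=-4.0, sigma=0.5)   # initial crack size, inches
            repairs = 0
            for _ in range(hours // inspect_every):
                a *= 2.2                              # crude growth per inspection interval
                pod = 1.0 - np.exp(-a / 0.05)         # NDI probability of detection
                if rng.random() < pod:
                    repairs += 1
                    a = rng.lognormal(mean=-4.0, sigma=0.5)  # renewal after repair
            return repairs

        counts = np.array([repairs_over_life() for _ in range(1_000)])
        print(counts.mean(), counts.std())  # mean and std of cumulative repairs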

  3. Bureau of mines cost estimating system handbook (in two parts). 1. Surface and underground mining

    SciTech Connect

    Not Available

    1987-01-01

    The handbook provides a convenient costing procedure based on the summation of the costs for the unit processes required in any particular mining or mineral processing operation. The costing handbook consists of a series of costing sections, each corresponding to a specific mining unit process. Contained within each section is the methodology to estimate either the capital or operating cost for that unit process. The unit process sections may be used to generate costs, in January 1984 dollars, through the use of either costing curves or formulae representing the prevailing technology. Coverage for surface mining includes dredging, quarrying, strip mining, and open pit mining. Coverage for underground mining includes individual development sections for drifting, raising, shaft sinking, and stope development, along with various mining methods, underground mine haulage, general plant, and underground mine administrative cost.
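
    The handbook's unit-process approach can be sketched as a power-law cost curve evaluated in January 1984 dollars and escalated with a cost-index ratio; the coefficients, index values, and function name below are hypothetical, not taken from the handbook.

        def unit_process_cost(q, a, b, index_now, index_jan_1984):
            """Cost curve C = a * q**b in Jan 1984 dollars, escalated by an index ratio."""
            return a * q**b * (index_now / index_jan_1984)

        # e.g., a hypothetical drifting unit process sized by q (t/day of development)
        print(round(unit_process_cost(q=2_000.0, a=12_500.0, b=0.6,
                                      index_now=1.85, index_jan_1984=1.00)))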

  4. Estimating the human recovery costs of seriously injured road crash casualties.

    PubMed

    Bambach, M R; Mitchell, R J

    2015-12-01

    Road crashes result in substantial trauma and costs to societies around the world. Robust costing methods are an important tool for estimating the costs associated with road trauma, and are key inputs into policy development and cost-benefit analysis for road safety programmes and infrastructure projects. With an expanding focus on seriously injured road crash casualties, in addition to the long-standing focus on fatalities, methods for costing seriously injured casualties are becoming increasingly important. Some road safety agencies define a seriously injured casualty as an individual who was admitted to hospital following a road crash, and as a result, hospital separation data provide substantial potential for estimating the costs associated with seriously injured road crash casualties. The aim of this study is to establish techniques for estimating the human recovery costs of (non-fatal) seriously injured road crash casualties directly from hospital separation data. Individuals' road crash-related hospitalisation records and their personal injury insurance claims were linked for road crashes that occurred in New South Wales, Australia. These records provided the means for estimating all of the costs to the casualty directly related to recovery from their injuries. A total of 10,897 seriously injured road crash casualties were identified, and four methods for estimating their recovery costs were examined, using either unit record or aggregated hospital separation data. The methods are shown to provide robust techniques for estimating the human recovery costs of seriously injured road crash casualties, which may prove useful for identifying, implementing and evaluating safety programmes intended to reduce the incidence of road crash-related serious injuries.

  5. Estimating the gas transfer velocity: a prerequisite for more accurate and higher resolution GHG fluxes (lower Aare River, Switzerland)

    NASA Astrophysics Data System (ADS)

    Sollberger, S.; Perez, K.; Schubert, C. J.; Eugster, W.; Wehrli, B.; Del Sontro, T.

    2013-12-01

    Currently, carbon dioxide (CO2) and methane (CH4) emissions from lakes, reservoirs and rivers are readily investigated due to the global warming potential of those gases and the role these inland waters play in the carbon cycle. However, there is a lack of high spatiotemporally-resolved emission estimates, and how to accurately assess the gas transfer velocity (K) remains controversial. In anthropogenically-impacted systems where run-of-river reservoirs disrupt the flow of sediments by increasing the erosion and load accumulation patterns, the resulting production of carbonic greenhouse gases (GH-C) is likely to be enhanced. The GH-C flux is thus counteracting the terrestrial carbon sink in these environments that act as net carbon emitters. The aim of this project was to determine the GH-C emissions from a medium-sized river heavily impacted by several impoundments and channelization through a densely-populated region of Switzerland. Estimating gas emission from rivers is not trivial and recently several models have been put forth to do so; therefore a second goal of this project was to compare the river emission models available with direct measurements. Finally, we further validated the modeled fluxes by using a combined approach with water sampling, chamber measurements, and highly temporal GH-C monitoring using an equilibrator. We conducted monthly surveys along the 120 km of the lower Aare River where we sampled for dissolved CH4 ('manual' sampling) at a 5-km sampling resolution, and measured gas emissions directly with chambers over a 35 km section. We calculated fluxes (F) via the boundary layer equation (F = K × (Cw − Ceq)) that uses the water-air GH-C concentration (C) gradient (Cw − Ceq) and K, which is the most sensitive parameter. K was estimated using 11 different models found in the literature with varying dependencies on: river hydrology (n=7), wind (2), heat exchange (1), and river width (1). We found that chamber fluxes were always higher than boundary
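
    The boundary layer equation quoted above is directly computable once K is taken from one of the models; a small sketch with hypothetical values (units must agree: K in m/d and concentrations in mmol/m^3 give flux in mmol m^-2 d^-1).

        def gas_flux(k_m_per_day, c_water, c_equilibrium):
            """F = K * (Cw - Ceq); positive flux means evasion to the atmosphere."""
            return k_m_per_day * (c_water - c_equilibrium)

        print(gas_flux(k_m_per_day=2.5, c_water=0.40, c_equilibrium=0.003))
        # mmol CH4 m^-2 d^-1 under these invented inputs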

  6. Cost and price estimate of Brayton and Stirling engines in selected production volumes

    NASA Technical Reports Server (NTRS)

    Fortgang, H. R.; Mayers, H. F.

    1980-01-01

    The methods used to determine the production costs and required selling price of Brayton and Stirling engines modified for use in solar power conversion units are presented. Each engine part, component and assembly was examined and evaluated to determine the costs of its material and the method of manufacture based on specific annual production volumes. Cost estimates are presented for both the Stirling and Brayton engines in annual production volumes of 1,000, 25,000, 100,000 and 400,000. At annual production volumes above 50,000 units, the costs of both engines are similar, although the Stirling engine costs are somewhat lower. It is concluded that modifications to both the Brayton and Stirling engine designs could reduce the estimated costs.

  7. Preliminary Estimates of Performance and Cost of Mercury Control Technology Applications on Electric Utility Boilers.

    PubMed

    Srivastava, Ravi K; Sedman, Charles B; Kilgroe, James D; Smith, Dennis; Renninger, Scott

    2001-10-01

    Under the Clean Air Act Amendments of 1990, the U.S. Environmental Protection Agency (EPA) determined that regulation of mercury emissions from coal-fired power plants is appropriate and necessary. To aid in this determination, preliminary estimates of the performance and cost of powdered activated carbon (PAC) injection-based mercury control technologies were developed. This paper presents these estimates and develops projections of costs for future applications. Cost estimates were developed using PAC to achieve a minimum of 80% mercury removal at plants using electrostatic precipitators and a minimum of 90% removal at plants using fabric filters. These estimates ranged from 0.305 to 3.783 mills/kWh. However, the higher costs were associated with a minority of plants using hot-side electrostatic precipitators (HESPs). If these costs are excluded, the estimates range from 0.305 to 1.915 mills/kWh. Cost projections developed using a composite lime-PAC sorbent for mercury removal ranged from 0.183 to 2.270 mills/kWh, with the higher costs being associated with a minority of plants that used HESPs.

  8. The cost of crime to society: new crime-specific estimates for policy and program evaluation.

    PubMed

    McCollister, Kathryn E; French, Michael T; Fang, Hai

    2010-04-01

    Estimating the cost to society of individual crimes is essential to the economic evaluation of many social programs, such as substance abuse treatment and community policing. A review of the crime-costing literature reveals multiple sources, including published articles and government reports, which collectively represent the alternative approaches for estimating the economic losses associated with criminal activity. Many of these sources are based upon data that are more than 10 years old, indicating a need for updated figures. This study presents a comprehensive methodology for calculating the cost to society of various criminal acts. Tangible and intangible losses are estimated using the most current data available. The selected approach, which incorporates both the cost-of-illness and the jury compensation methods, yields cost estimates for more than a dozen major crime categories, including several categories not found in previous studies. Updated crime cost estimates can help government agencies and other organizations execute more prudent policy evaluations, particularly benefit-cost analyses of substance abuse treatment or other interventions that reduce crime.

  9. How Much Does Intellectual Disability Really Cost? First Estimates for Australia

    ERIC Educational Resources Information Center

    Doran, Christopher M.; Einfeld, Stewart L.; Madden, Rosamond H.; Otim, Michael; Horstead, Sian K.; Ellis, Louise A.; Emerson, Eric

    2012-01-01

    Background: Given the paucity of relevant data, this study estimates the cost of intellectual disability (ID) to families and the government in Australia. Method: Family costs were collected via the Client Service Receipt Inventory, recording information relating to service use and personal expense as a consequence of ID. Government expenditure on…

  10. The Budding SV3: Estimating the Cost of Architectural Growth Early in the Life Cycle

    DTIC Science & Technology

    2014-04-30

    ... (Constructive Systems Engineering Cost Model) and the SV3. Based on its stochastic nature, this procedure is further implemented as a Monte Carlo simulation. ... ABSTRACT: As the systems engineering community continues to mature model-based approaches, exciting opportunities for sophisticated, computational analysis ... we estimate the marginal increase in systems engineering effort via an explicit connection between the open academic cost model COSYSMO (Constructive Systems Engineering Cost Model) ...

  11. 33 CFR 241.5 - Procedures for estimating the alternative cost-share.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROVISION § 241.5 Procedures for estimating the alternative cost-share. (a) Step one, the benefits test... factor determined in § 241.5(a)(1), when expressed as a percentage, is greater than the standard level of cost-sharing, the standard level will apply. (3) If the factor determined in § 241.5(a)(1),...

  12. 33 CFR 241.5 - Procedures for estimating the alternative cost-share.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROVISION § 241.5 Procedures for estimating the alternative cost-share. (a) Step one, the benefits test...) Calculate the ratio of flood control benefits (developed using the Water Resources Council's Principles and... costs. Divide the result by four. For example, if the project's (or separable element's)...

  13. Preliminary weight and cost estimates for transport aircraft composite structural design concepts

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Preliminary weight and cost estimates have been prepared for design concepts utilized for a transonic long range transport airframe with extensive applications of advanced composite materials. The design concepts, manufacturing approach, and anticipated details of manufacturing cost reflected in the composite airframe are substantially different from those found in conventional metal structure and offer further evidence of the advantages of advanced composite materials.

  14. A model for estimating the cost impact of schedule perturbations on aerospace research and development programs

    NASA Technical Reports Server (NTRS)

    Bishop, D. F.

    1972-01-01

    The problem of determining the cost impact attributable to perturbations in an aerospace R and D program schedule is discussed in terms of the diminishing availability of funds. A methodology, and a model derived from it, are presented for updating R and D cost estimates as a function of perturbations in program time.

  15. Guidelines for the Estimation of Program Costs at Macomb County Community College. Project No. 78072.

    ERIC Educational Resources Information Center

    Dulaney-Sorochak, Jeanne

    The Education Amendments of 1976 require that any educational institution must be able to report the full cost of attending that institution to any student who requests the information. This report outlines guidelines for estimating the cost to the student attending Macomb County Community College. Tuition and living expenses, with required fees…

  16. Two Computer Programs for Equipment Cost Estimation and Economic Evaluation of Chemical Processes.

    ERIC Educational Resources Information Center

    Kuri, Carlos J.; Corripio, Armando B.

    1984-01-01

    Describes two computer programs for use in process design courses: an easy-to-use equipment cost estimation program based on the latest cost correlations available, and an economic evaluation program that calculates two profitability indices. Comparisons between programmed and hand-calculated results are included. (JM)

  17. Estimating the Full Cost of Family-Financed Time Inputs to Education.

    ERIC Educational Resources Information Center

    Levine, Victor

    This paper presents a methodology for estimating the full cost of parental time allocated to child-care activities at home. Building upon the human capital hypothesis, a model is developed in which the cost of an hour diverted from labor market activity is seen as consisting of three components: 1) direct wages foregone; 2) investments in…

  18. Simple calculator to estimate the medical cost of diabetes in sub-Saharan Africa

    PubMed Central

    Alouki, Koffi; Delisle, Hélène; Besançon, Stéphane; Baldé, Naby; Sidibé-Traoré, Assa; Drabo, Joseph; Djrolo, François; Mbanya, Jean-Claude; Halimi, Serge

    2015-01-01

    AIM: To design a medical cost calculator and show that diabetes care is beyond the reach of the majority, particularly patients with complications. METHODS: Out-of-pocket expenditures of patients for medical treatment of type-2 diabetes were estimated based on price data collected in Benin, Burkina Faso, Guinea, and Mali. A detailed protocol for realistic medical care of diabetes and its complications in the African context was defined. Care components were based on existing guidelines, published data, and clinical experience. Prices were obtained in public and private health facilities. The cost calculator used Excel. The cost of basic management of uncomplicated diabetes was calculated per person and per year. Incremental costs were also computed per annum for chronic complications and per episode for acute complications. RESULTS: Wide variations in estimated care costs were observed among countries and between the public and private healthcare systems. The minimum estimated cost for the treatment of uncomplicated diabetes (in the public sector) would amount to 21%-34% of the country's gross national income per capita, 26%-47% in the presence of retinopathy, and above 70% for nephropathy, the most expensive complication. CONCLUSION: The study provided objective evidence of the exorbitant medical cost of diabetes, considering that no medical insurance is available in the study countries. Although the calculator only estimates the cost of inaction, it is innovative and of interest to several stakeholders. PMID:26617974
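
    A hedged sketch of the calculator's logic follows: annual cost of uncomplicated care plus incremental costs of complications, expressed as a share of gross national income (GNI) per capita. All prices and the GNI figure are hypothetical placeholders, not the study's price data.

        # Minimal cost-calculator sketch: base care plus per-complication
        # increments, as a share of GNI per capita. Placeholder prices only.

        BASE_CARE = {"drugs": 150.0, "monitoring": 60.0, "consultations": 40.0}  # USD/yr
        INCREMENTAL = {"retinopathy": 90.0, "nephropathy": 400.0}                # USD/yr

        def annual_cost(complications=()):
            return sum(BASE_CARE.values()) + sum(INCREMENTAL[c] for c in complications)

        gni_per_capita = 800.0  # USD, placeholder
        for case in [(), ("retinopathy",), ("nephropathy",)]:
            label = " + ".join(case) or "uncomplicated"
            cost = annual_cost(case)
            print(f"{label}: {cost:.0f} USD/yr = {100 * cost / gni_per_capita:.0f}% of GNI per capita")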

  1. The Effect of Infrastructure Sharing in Estimating Operations Cost of Future Space Transportation Systems

    NASA Technical Reports Server (NTRS)

    Sundaram, Meenakshi

    2005-01-01

    NASA and the aerospace industry are extremely serious about reducing the cost and improving the performance of launch vehicles, both manned and unmanned. In the aerospace industry, sharing infrastructure for manufacturing more than one type of spacecraft is becoming a trend to achieve economies of scale. An example is the Boeing Decatur facility, where both Delta II and Delta IV launch vehicles are made. The author is not sure how Boeing estimates the costs of each spacecraft made in the same facility. Regardless of how a contractor estimates the cost, NASA's popular cost estimating tool, the NASA/Air Force Cost Model (NAFCOM), has to have a built-in method to account for the effect of infrastructure sharing. Since there is no provision in the most recent version, NAFCOM2002, to take care of this, the Engineering Cost Community at MSFC has found that the tool overestimates the manufacturing cost by as much as 30%. Therefore, the objective of this study is to develop a methodology to assess the impact of infrastructure sharing so that better operations cost estimates may be made.

  2. Model for Estimating Life-Cycle Costs Associated with Noise-Induced Hearing Loss

    DTIC Science & Technology

    2007-01-10

    ... decisions. Currently, the cash outlays by the government for noise-induced hearing loss (NIHL) caused to service personnel by loud systems and spaces are unaccounted for in estimates of life-cycle costs. A companion report demonstrated that a NIHL prediction algorithm from the American National ... compensation costs of the predicted NIHL in this population. A numerical example of the algorithm operation was included. Using cost values applicable to

  3. Comparative evaluation of features and techniques for identifying activity type and estimating energy cost from accelerometer data.

    PubMed

    Kate, Rohit J; Swartz, Ann M; Welch, Whitney A; Strath, Scott J

    2016-03-01

    Wearable accelerometers can be used to objectively assess physical activity. However, the accuracy of this assessment depends on the underlying method used to process the time series data obtained from accelerometers. Several methods have been proposed that use these data to identify the type of physical activity and estimate its energy cost. Most of the newer methods employ some machine learning technique along with suitable features to represent the time series data. This paper experimentally compares several of these techniques and features on a large dataset of 146 subjects performing eight different physical activities while wearing an accelerometer on the hip. Besides features based on statistics, distance-based features and simple discrete features taken straight from the time series were also evaluated. On the physical activity type identification task, the results show that using more features significantly improves results. The choice of machine learning technique was also found to be important. However, on the energy cost estimation task, the choice of features and machine learning technique was found to be less influential. On that task, separate energy cost estimation models trained specifically for each type of physical activity were found to be more accurate than a single model trained for all types of physical activities.
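
    The sketch below shows one plausible pipeline of the kind compared in the paper: per-window statistical features fed to a classifier for activity type. It is not the authors' exact feature set or model, and the data are synthetic stand-ins for the 146-subject hip dataset.

        # Statistical features per accelerometer window -> activity classifier.
        # Synthetic data; illustrates the approach, not the paper's pipeline.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)

        def features(window):
            # Simple per-window statistics (means, stds, percentiles) per axis.
            return np.concatenate([window.mean(axis=0), window.std(axis=0),
                                   np.percentile(window, [10, 90], axis=0).ravel()])

        # Fake dataset: 200 windows of 100 tri-axial samples, 4 activity classes.
        y = rng.integers(0, 4, size=200)
        raw = rng.normal(size=(200, 100, 3)) + y[:, None, None]
        X = np.array([features(w) for w in raw])

        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X[:150], y[:150])
        print("held-out activity-identification accuracy:", clf.score(X[150:], y[150:]))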

  4. A method for simple and accurate estimation of fog deposition in a mountain forest using a meteorological model

    NASA Astrophysics Data System (ADS)

    Katata, Genki; Kajino, Mizuo; Hiraki, Takatoshi; Aikawa, Masahide; Kobayashi, Tomiki; Nagai, Haruyasu

    2011-10-01

    To apply a meteorological model to investigate fog occurrence, acidification, and deposition in mountain forests, the meteorological model WRF was modified to calculate fog deposition accurately using a simple linear function of fog deposition onto vegetation, derived from numerical experiments with the detailed multilayer atmosphere-vegetation-soil model SOLVEG. The modified version of WRF that includes fog deposition (fog-WRF) was tested in a mountain forest on Mt. Rokko in Japan. fog-WRF provided a distinctly better prediction of the liquid water content of fog (LWC) than the original version of WRF. It also successfully simulated throughfall observations attributable to fog deposition inside the forest during the summer season, once the effect of forest edges was excluded. Using the linear relationship between fog deposition and altitude given by the fog-WRF calculations, together with throughfall observations at a given altitude, the vertical distribution of fog deposition can be roughly estimated in mountain forests. A meteorological model that includes fog deposition will be useful in mapping fog deposition in mountain cloud forests.
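
    The rough-estimation recipe in the abstract can be sketched as follows: take the linear deposition-altitude slope from the model and anchor it at one throughfall observation. The slope and observation values below are hypothetical, not fog-WRF output.

        # Linear extrapolation of fog deposition with altitude, anchored at a
        # single throughfall observation. All numbers are placeholders.

        slope = 0.15                    # deposition increase per metre of altitude (placeholder)
        z_obs, dep_obs = 800.0, 120.0   # throughfall site: altitude (m), deposition (mm/yr)

        def fog_deposition(z_m):
            """Linear estimate of fog deposition (mm/yr) at altitude z_m (m)."""
            return dep_obs + slope * (z_m - z_obs)

        for z in (600, 800, 1000, 1200):
            print(f"z = {z:4d} m: ~{fog_deposition(z):.0f} mm/yr")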

  5. Development of a new, robust and accurate, spectroscopic metric for scatterer size estimation in optical coherence tomography (OCT) images

    NASA Astrophysics Data System (ADS)

    Kassinopoulos, Michalis; Pitris, Costas

    2016-03-01

    The modulations appearing on the backscattering spectrum originating from a scatterer are related to its diameter as described by Mie theory for spherical particles. Many metrics for Spectroscopic Optical Coherence Tomography (SOCT) take advantage of this observation in order to enhance the contrast of Optical Coherence Tomography (OCT) images. However, none of these metrics has achieved high accuracy when calculating the scatterer size. In this work, Mie theory was used to further investigate the relationship between the degree of modulation in the spectrum and the scatterer size. From this study, a new spectroscopic metric, the bandwidth of the Correlation of the Derivative (COD) was developed which is more robust and accurate, compared to previously reported techniques, in the estimation of scatterer size. The self-normalizing nature of the derivative and the robustness of the first minimum of the correlation as a measure of its width, offer significant advantages over other spectral analysis approaches especially for scatterer sizes above 3 μm. The feasibility of this technique was demonstrated using phantom samples containing 6, 10 and 16 μm diameter microspheres as well as images of normal and cancerous human colon. The results are very promising, suggesting that the proposed metric could be implemented in OCT spectral analysis for measuring nuclear size distribution in biological tissues. A technique providing such information would be of great clinical significance since it would allow the detection of nuclear enlargement at the earliest stages of precancerous development.
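
    A minimal sketch of the "Correlation of the Derivative" idea follows: differentiate the backscattered spectrum, autocorrelate the derivative, and take the lag of the first local minimum as a bandwidth measure related to scatterer size. The spectrum is synthetic, and the details differ from the authors' exact metric.

        # COD-style analysis on a toy spectrum: derivative -> autocorrelation
        # -> lag of first local minimum as a robust width estimate.
        import numpy as np

        wavelength = np.linspace(1.20, 1.40, 512)                       # um, synthetic band
        spectrum = 1.0 + 0.3 * np.cos(2 * np.pi * wavelength / 0.012)   # toy Mie-like ripple

        d = np.diff(spectrum)                   # derivative: self-normalizing w.r.t. offset
        d -= d.mean()
        corr = np.correlate(d, d, mode="full")[d.size - 1:]  # one-sided autocorrelation
        corr /= corr[0]

        # First local minimum of the autocorrelation as the bandwidth measure.
        first_min = next(i for i in range(1, corr.size - 1)
                         if corr[i] < corr[i - 1] and corr[i] <= corr[i + 1])
        print("COD-style bandwidth (lag of first autocorrelation minimum):", first_min)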

  6. ESTIMATING TREATMENT EFFECTS ON HEALTHCARE COSTS UNDER EXOGENEITY: IS THERE A 'MAGIC BULLET'?

    PubMed

    Basu, Anirban; Polsky, Daniel; Manning, Willard G

    2011-07-01

    Methods for estimating average treatment effects, under the assumption of no unmeasured confounders, include regression models; propensity score adjustments using stratification, weighting, or matching; and doubly robust estimators (a combination of both). Researchers continue to debate the best estimator for outcomes such as health care cost data, as these are usually characterized by an asymmetric distribution and heterogeneous treatment effects. Challenges in finding the right specifications for regression models are well documented in the literature. Propensity score estimators have been proposed as alternatives for overcoming these challenges. Using simulations, we find that in moderate-sized samples (n = 5,000), balancing on propensity scores that are estimated from saturated specifications can balance the covariate means across treatment arms but fails to balance higher-order moments and covariances among covariates. Therefore, unlike regression models, even if a formal model for outcomes is not required, propensity score estimators can be inefficient at best and biased at worst for health care cost data. Our simulation study, designed to take a 'proof by contradiction' approach, proves that no one estimator can be considered the best under all data-generating processes for outcomes such as costs. The inverse-propensity weighted estimator is most likely to be unbiased under alternate data-generating processes but is prone to bias under misspecification of the propensity score model and is inefficient compared to an unbiased regression estimator. Our results show that there are no 'magic bullets' when it comes to estimating treatment effects on health care costs. Care should be taken before naively applying any one estimator to estimate average treatment effects in these data. We illustrate the performance of alternative methods using a cost dataset on breast cancer treatment.
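
    For concreteness, here is a minimal sketch of the inverse-propensity-weighted (IPW) estimator discussed above, applied to simulated skewed cost data. It is not the authors' simulation design, and the true propensity is used only to keep the sketch short.

        # IPW estimate of the average treatment effect on skewed costs.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 5000
        x = rng.normal(size=n)                    # a single measured confounder
        p = 1 / (1 + np.exp(-x))                  # propensity of treatment given x
        t = rng.binomial(1, p)                    # treatment indicator
        cost = np.exp(1.0 + 0.5 * x + 0.3 * t + rng.normal(0, 0.5, n))  # skewed costs

        # In practice p is estimated (e.g., by logistic regression); the true
        # propensity is used here only for brevity.
        ate_ipw = np.mean(t * cost / p) - np.mean((1 - t) * cost / (1 - p))
        naive = cost[t == 1].mean() - cost[t == 0].mean()
        print(f"naive difference: {naive:.2f}   IPW estimate: {ate_ipw:.2f}")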

  7. Estimating Development Cost for a Tailored Interactive Computer Program to Enhance Colorectal Cancer Screening Compliance

    PubMed Central

    Lairson, David R.; Chang, Yu-Chia; Bettencourt, Judith L.; Vernon, Sally W.; Greisinger, Anthony

    2006-01-01

    The authors used an actual-work estimate method to estimate the cost of developing a tailored interactive computer education program to improve compliance with colorectal cancer screening guidelines in a large multi-specialty group medical practice. Resource use was prospectively collected from time logs, administrative records, and a design and computing subcontract. Sensitivity analysis was performed to examine the uncertainty of the overhead cost rate and other parameters. The cost of developing the system was $328,866. The development cost was $52.79 per patient when amortized over a 7-year period with a cohort of 1,000 persons. About 20% of the cost was incurred in defining the theoretic framework and supporting literature, constructing the variables and survey, and conducting focus groups. About 41% of the cost was for developing the messages, algorithms, and constructing program elements, and the remaining cost was to create and test the computer education program. About 69% of the cost was attributable to personnel expenses. Development cost is rarely estimated but is important for feasibility studies and ex-ante economic evaluations of alternative interventions. The findings from this study may aid decision makers in planning, assessing, budgeting, and pricing development of tailored interactive computer-based interventions. PMID:16799126

  8. Estimating development cost for a tailored interactive computer program to enhance colorectal cancer screening compliance.

    PubMed

    Lairson, David R; Chang, Yu-Chia; Bettencourt, Judith L; Vernon, Sally W; Greisinger, Anthony

    2006-01-01

    The authors used an actual-work estimate method to estimate the cost of developing a tailored interactive computer education program to improve compliance with colorectal cancer screening guidelines in a large multi-specialty group medical practice. Resource use was prospectively collected from time logs, administrative records, and a design and computing subcontract. Sensitivity analysis was performed to examine the uncertainty of the overhead cost rate and other parameters. The cost of developing the system was $328,866. The development cost was $52.79 per patient when amortized over a 7-year period with a cohort of 1,000 persons. About 20% of the cost was incurred in defining the theoretic framework and supporting literature, constructing the variables and survey, and conducting focus groups. About 41% of the cost was for developing the messages, algorithms, and constructing program elements, and the remaining cost was to create and test the computer education program. About 69% of the cost was attributable to personnel expenses. Development cost is rarely estimated but is important for feasibility studies and ex-ante economic evaluations of alternative interventions. The findings from this study may aid decision makers in planning, assessing, budgeting, and pricing development of tailored interactive computer-based interventions.
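
    The per-patient figure quoted above follows from a standard capital-recovery (annuitization) calculation. The 3% discount rate in the sketch below is our assumption, not stated in the record, chosen because it reproduces the reported $52.79 almost exactly.

        # Annuitize the development cost over 7 years for a yearly cohort of
        # 1,000 persons; the 3% rate is assumed, not taken from the record.

        total_cost = 328_866.0
        years, cohort, rate = 7, 1_000, 0.03

        crf = rate / (1 - (1 + rate) ** -years)   # capital recovery factor
        annualized = total_cost * crf             # equivalent annual cost
        print(f"per patient: ${annualized / cohort:.2f}")  # ~= $52.79, matching the abstract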

  9. Estimates of the national benefits and costs of improving ambient air quality

    SciTech Connect

    Brady, G.L.; Bower, B.T.; Lakhani, H.A.

    1983-04-01

    This paper examines the estimates of national benefits and national costs of ambient air quality improvement in the US for the period 1970 to 1978. Analysis must be at the micro-level for both receptors of pollution and dischargers of residuals. Section 2 discusses techniques for estimating the national benefits from improving ambient air quality. The literature on national benefits to health (mortality and morbidity) and non-health (avoiding damages to materials, plants, crops, etc.) is critically reviewed in this section. For the period 1970 to 1978, the value of these benefits ranged from about $5 billion to $51 billion, with a point estimate of about $22 billion. The national cost estimates by the Council on Environmental Quality, Bureau of Economic Analysis, and McGraw-Hill are provided in section 3. Cost estimates must include not only end-of-pipe treatment measures but also the alternatives: changes in product specification, product mix, processes, etc. These types of responses are not generally considered in estimates of national costs. Estimates of the national costs of improving ambient air quality ranged from $8 to $9 billion in 1978 dollars. Section 4 concludes that the national benefits of improving ambient air quality exceed the national costs for the average and high values of benefits, but not for the low estimates.

  10. F-16X MSIP (Multi-National Staged Improvement Program) Case Example: Operating and Support Cost Estimation Using VAMOSC (Visibility and Management of Operating and Support Costs).

    DTIC Science & Technology

    2014-09-26

    ... Precision Location Strike System (PLSS): major new avionics system; higher materiel and labor cost; small radome may increase profile ... MSIP maintenance material and labor costs are estimated using a bottoms-up cost factor estimation technique. A set of functional analogies for the MSIP

  11. The Cost of Universal Health Care in India: A Model Based Estimate

    PubMed Central

    Prinja, Shankar; Bahuguna, Pankaj; Pinto, Andrew D.; Sharma, Atul; Bharaj, Gursimer; Kumar, Vishal; Tripathy, Jaya Prasad; Kaur, Manmeet; Kumar, Rajesh

    2012-01-01

    Introduction: As high out-of-pocket healthcare expenses pose a heavy financial burden on families, the Government of India is considering a variety of financing and delivery options to universalize health care services. Hence, an estimate of the cost of delivering universal health care services is needed. Methods: We developed a model to estimate recurrent and annual costs for providing health services through a mix of public and private providers in Chandigarh, located in northern India. Necessary health services required to deliver good quality care were defined by the Indian Public Health Standards. National Sample Survey data were utilized to estimate disease burden. In addition, morbidity and treatment data were collected from two secondary and two tertiary care hospitals. The unit cost of treatment was estimated from the published literature. For diseases where data on treatment cost were not available, we collected data on standard treatment protocols and cost of care from local health providers. Results: We estimate that the cost of universal health care delivery through the existing mix of public and private health institutions would be INR 1713 (USD 38, 95% CI USD 18–73) per person per annum in India. This cost would be 24% higher if branded drugs are used. Extrapolation of these costs to the entire country indicates that the Indian government needs to spend 3.8% (2.1%–6.8%) of the GDP to universalize health care services. Conclusion: The cost of universal health care delivered through a combination of public and private providers is estimated to be INR 1713 per capita per year in India. Important issues such as the delivery strategy for ensuring quality, reducing inequities in access, and managing the growth of health care demand need to be explored. PMID:22299038

  12. Handbook of estimating data, factors, and procedures. [for manufacturing cost studies

    NASA Technical Reports Server (NTRS)

    Freeman, L. M.

    1977-01-01

    Elements to be considered in estimating production costs are discussed in this manual. Guidelines, objectives, and methods for analyzing requirements and work structure are given. Time standards for specific operations are listed for machining; sheet metal working; electroplating and metal treating; painting; silk screening, etching, and encapsulating; coil winding; wire preparation and wiring; soldering; and the fabrication of etched circuits and terminal boards. The relation of the various elements of cost to the total cost, as proposed for various programs by various contractors, is compared with government estimates.

  13. Solar thermal technology development: Estimated market size and energy cost savings. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Gates, W. R.

    1983-01-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. The analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. Three fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. STT R&D is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut the growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest.

  14. Solar thermal technology development: Estimated market size and energy cost savings. Volume 1: Executive summary

    NASA Astrophysics Data System (ADS)

    Gates, W. R.

    1983-02-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. The analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. Three fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. STT R&D is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut the growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest.

  15. GME: at what cost?

    PubMed

    Young, David W

    2003-11-01

    Current computing methods impede determining the real cost of graduate medical education. However, a more accurate estimate could be obtained if policy makers would allow for the application of basic cost-accounting principles, including consideration of department-level costs, unbundling of joint costs, and other factors.

  16. Using Data Envelopment Analysis to Improve Estimates of Higher Education Institution's Per-Student Education Costs

    ERIC Educational Resources Information Center

    Salerno, Carlo

    2006-01-01

    This paper puts forth a data envelopment analysis (DEA) approach to estimating higher education institutions' per-student education costs (PSCs) in an effort to redress a number of methodological problems endemic to such estimations, particularly the allocation of shared expenditures between education and other institutional activities. An example…

  17. Price Estimation Guidelines

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.; Aster, R. W.; Firnett, P. J.; Miller, M. A.

    1985-01-01

    The Improved Price Estimation Guidelines program, IPEG4, provides a comparatively simple yet relatively accurate estimate of the price of a manufactured product. IPEG4 processes user-supplied input data to determine an estimate of the price per unit of production. Input data include equipment cost, space required, labor cost, materials and supplies cost, utility expenses, and production volume, on an industry-wide or process-wide basis.
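
    A hypothetical, simplified version of the kind of roll-up such a program performs is sketched below: annualized cost inputs are turned into a required price per unit of production. The input names follow the abstract, but the multipliers are illustrative placeholders, not IPEG4's actual coefficients.

        # Toy unit-price roll-up from the cost inputs named in the abstract.
        # Multipliers are assumed for illustration, not IPEG4's coefficients.

        def unit_price(equipment, floor_space_m2, labor, materials_supplies,
                       utilities, annual_volume):
            annual_cost = (0.30 * equipment            # capital recovery + upkeep (assumed)
                           + 100.0 * floor_space_m2    # occupancy charge, $/m^2-yr (assumed)
                           + 2.0 * labor               # direct labor with burden (assumed)
                           + 1.2 * (materials_supplies + utilities))  # indirect markup (assumed)
            return annual_cost / annual_volume

        print(f"required price: ${unit_price(1e6, 500, 2e5, 3e5, 5e4, 100_000):.2f} per unit")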

  18. A national hypertension treatment program in Germany and its estimated impact on costs, life expectancy, and cost-effectiveness.

    PubMed

    Gandjour, Afschin; Stock, Stephanie

    2007-10-01

    Almost 15 million Germans may suffer from untreated hypertension. The purpose of this paper is to estimate the cost-effectiveness of a national hypertension treatment program compared to no program. A Markov decision model from the perspective of the statutory health insurance (SHI) was built. All data were taken from secondary sources. The target population consists of hypertensive male and female patients at high or low risk for cardiovascular events in different age groups (40-49, 50-59, and 60-69 years). The analysis shows fairly moderate cost-effectiveness ratios even for low-risk groups (less than 12,000 euros per life year gained). In women at high risk, antihypertensive treatment even leads to savings. This suggests that a national hypertension treatment program provides good value for money. Given the considerable costs of the program itself, however, any savings from avoiding the long-term consequences of hypertension are likely to be offset.

  19. Review of cost estimates for reducing CO2 emissions. Final report, Task 9

    SciTech Connect

    Not Available

    1990-10-01

    Since the groundbreaking work of William Nordhaus in 1977, cost estimates for reducing CO2 emissions have been developed by numerous groups. The various studies have reported sometimes widely divergent cost estimates for reducing CO2 emissions. Some recent analyses have indicated that large reductions in CO2 emissions could be achieved at zero or negative costs (e.g., Rocky Mountain Institute 1989). In contrast, a recent study by Alan Manne of Stanford and Richard Richels of the Electric Power Research Institute (Manne-Richels 1989) concluded that in the US the total discounted costs of reducing CO2 emissions by 20 percent below the 1990 level could be as much as 3.6 trillion dollars over the period from 1990 to 2100. Costs of this order of magnitude would represent about 5 percent of US GNP. The purpose of this briefing paper is to summarize the different cost estimates for CO2 emission reduction and to identify the key issues and assumptions that underlie these cost estimates.

  20. Regional Cost Estimates for Reclamation Practices on Arid and Semiarid Lands

    SciTech Connect

    W. K. Ostler

    2002-02-01

    The U.S. Army uses the Integrated Training Area Management program for managing training land. One of the major objectives of the Integrated Training Area Management program has been to develop a method for estimating training land carrying capacity in a sustainable manner. The Army Training and Testing Area Carrying Capacity methodology measures training load in terms of Maneuver Impact Miles. One Maneuver Impact Mile is the equivalent impact of an M1A2 tank traveling one mile while participating in an armor battalion field training exercise. The Army Training and Testing Area Carrying Capacity methodology is also designed to predict land maintenance costs in terms of dollars per Maneuver Impact Mile. The overall cost factor is calculated using the historical cost of land maintenance practices and the effectiveness of controlling erosion. Because land maintenance costs and effectiveness are influenced by the characteristics of the land, Army Training and Testing Area Carrying Capacity cost factors must be developed for each ecological region of the country. Costs for land maintenance activities are presented here for the semiarid and arid regions of the United States. Five ecoregions are recognized, and average values for reclamation activities are presented. Because there are many variables that can influence costs, ranges for reclamation activities are also presented. Costs are broken down into six major categories: seedbed preparation, fertilization, seeding, planting, mulching, and supplemental erosion control. Costs for most land reclamation practices and materials varied widely within and between ecological provinces. Although regional cost patterns were evident for some practices, the patterns were not consistent between practices. For the purpose of estimating land reclamation costs for the Army Training and Testing Area Carrying Capacity methodology, it may be desirable to use the "Combined Average" of all provinces found in the last row of each table.

  1. Cost and size estimates for an electrochemical bulk energy storage concept

    NASA Technical Reports Server (NTRS)

    Warshay, M.; Wright, L. O.

    1975-01-01

    Preliminary capital cost and size estimates were made for a titanium trichloride, titanium tetrachloride, ferric chloride, ferrous chloride redox-flow-cell electric power system. On the basis of these preliminary estimates plus other important considerations, this electrochemical system emerged as having great promise as a bulk energy storage system for power load leveling. The size of this system is less than two per cent of that of a comparable pumped hydroelectric plant. The estimated capital cost of a 10 MW, 60- and 85-MWh redox-flow system compared well with that of competing systems.

  2. Electric Power Interruption Cost Estimates for Individual Industries, Sectors and the U.S. Economy

    SciTech Connect

    Balducci, Patrick J.; Roop, Joseph M.; Schienbein, Lawrence A.; DeSteese, John G.; Weimar, Mark R.

    2003-05-16

    Distributed energy resources (DER) have been promoted as the least-cost approach to meeting steadily increasing energy demand. However, it is unclear whether DER deployment can maintain or improve the electric power supply reliability and quality currently available to consumers. This report addresses two key factors relating to this question: 1) characteristics of existing power supply reliability, and 2) costs resulting from supply interruptions characteristic of the existing power grid. Interruption cost data collected by the University of Saskatchewan was used in conjunction with data generated by the United States Department of Energy’s Annual Survey of Manufacturers, along with industry shares of gross domestic product (GDP) and gross output to derive interruption cost estimates for U.S. industries at the 2-digit Standard Industrial Classification (SIC) level. Interruption cost estimates are presented as a function of outage duration (e.g., 20 minutes, 1-hour, 3-hour), and are normalized in terms of dollars per peak kW.

  3. Comparative analysis of monetary estimates of external environmental costs associated with combustion of fossil fuels

    SciTech Connect

    Koomey, J.

    1990-07-01

    Public utility commissions in a number of states have begun to explicitly treat costs of environmental externalities in the resource planning and acquisition process (Cohen et al. 1990). This paper compares ten different estimates and regulatory determinations of external environmental costs associated with fossil fuel combustion, using consistent assumptions about combustion efficiency, emissions factors, and resource costs. This consistent comparison is useful because it makes explicit the effects of various assumptions. This paper uses the results of the comparison to illustrate pitfalls in calculation of external environmental costs, and to derive lessons for design of policies to incorporate these externalities into resource planning. 38 refs., 2 figs., 10 tabs.

  4. Development of weight and cost estimates for lifting surfaces with active controls

    NASA Technical Reports Server (NTRS)

    Anderson, R. D.; Flora, C. C.; Nelson, R. M.; Raymond, E. T.; Vincent, J. H.

    1976-01-01

    Equations and methodology were developed for estimating the weight and cost incrementals due to active controls added to the wing and horizontal tail of a subsonic transport airplane. The methods are sufficiently generalized to be suitable for preliminary design. Supporting methodology and input specifications for the weight and cost equations are provided. The weight and cost equations are structured to be flexible in terms of the active control technology (ACT) flight control system specification. In order to present a self-contained package, methodology is also presented for generating ACT flight control system characteristics for the weight and cost equations. Use of the methodology is illustrated.

  5. Cost and size estimates for an electrochemical bulk energy storage concept

    NASA Technical Reports Server (NTRS)

    Warshay, M.; Wright, L. O.

    1975-01-01

    Preliminary capital cost and size estimates were made for an electrochemical bulk energy storage concept. The electrochemical system considered was an electrically rechargeable flow cell with a redox couple. On the basis of preliminary capital cost estimates, size estimates, and several other important considerations, the redox-flow-cell system emerges as having great promise as a bulk energy storage system for power load leveling. The size of this system would be less than 2 percent of that of a comparable pumped hydroelectric plant. The capital cost of a 10-megawatt, 60- and 85-megawatt-hour redox system is estimated to be $190 to $330 per kilowatt. The other important features of the redox system contributing to its load leveling application are its low adverse environmental impact, its high efficiency, its apparent absence of electrochemically-related cycle life limitations, and its fast response.

  6. Estimation of Life-Year Loss and Lifetime Costs for Different Stages of Colon Adenocarcinoma in Taiwan

    PubMed Central

    Chen, Po-Chuan; Lee, Jenq-Chang; Wang, Jung-Der

    2015-01-01

    Background and aims: The life expectancy of colon cancer patients cannot be accurately determined due to the lack of both large datasets and long-term follow-up, which impedes accurate estimation of the lifetime cost of treating colon cancer patients. In this study, we applied a method to estimate the life expectancy of colon cancer patients in Taiwan and calculated the lifetime costs by stage and age group. Methods: A total of 17,526 cases with pathologically verified colon adenocarcinoma diagnosed between 2002 and 2009 were extracted from the Taiwan Cancer Registry database for analysis. All patients were followed up until the end of 2011. Life expectancy, expected years of life lost, and lifetime costs were estimated using a semi-parametric survival extrapolation method, borrowing information from the life tables of vital statistics. Results: Patients with more advanced stages of colon cancer were generally younger and less comorbid with major chronic diseases than those with stages I and II. The life expectancy of stage I patients was not significantly different from that of the age- and sex-matched general population, whereas those of stages II, III, and IV colon cancer patients after diagnosis were 16.57±0.07, 13.35±0.07, and 4.05±0.05 years, respectively; the corresponding expected years of life lost were 1.28±0.07, 5.93±0.07, and 16.42±0.06 years, i.e., significantly shorter survival than the general population after accounting for lead-time bias. In addition, the lifetime costs of managing stage II, III, and IV colon cancer patients would be US$8,416±1,939, 14,334±1,755, and 21,837±1,698, respectively, indicating a large saving from early diagnosis and treatment after stratification for age and sex. Conclusions: Treating colon cancer at a younger age and earlier stage saves more life-years and healthcare costs. Future studies are indicated to apply these quantitative results to the cost-effectiveness evaluation of screening programs for colon cancer. PMID:26207912

  7. Cost estimates for near-term deployment of advanced traffic management systems. Final report

    SciTech Connect

    Stevens, S.S.; Chin, S.M.

    1993-02-15

    The objective of this study is to provide cost estimates for the engineering, design, installation, operation, and maintenance of Advanced Traffic Management Systems (ATMS) in the largest 75 metropolitan areas in the United States. This report gives estimates of deployment costs for ATMS in the next five years, subject to the qualifications and caveats set out in the following paragraphs. The report considers the infrastructure components required to fully realize a functional ATMS over each of two highway networks (as discussed in the section describing our general assumptions) under each of the four architectures identified in the MITRE Intelligent Vehicle Highway Systems (IVHS) architecture studies. The architectures are summarized in this report in Table 2. Estimates are given for eight combinations of highway networks and architectures. We estimate that it will cost between $8.5 billion (minimal network) and $26 billion (augmented network) to proceed immediately with deployment of ATMS in the largest 75 metropolitan areas. Costs are given in 1992 dollars and are not adjusted for future inflation. Our estimates are based partially on completed project costs, which have been adjusted to 1992 dollars. We assume that a particular architecture will be chosen; projected costs are broken out by architecture.

  8. Coal gasification systems engineering and analysis. Appendix E: Cost estimation and economic evaluation methodology

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The cost estimation and economic evaluation methodologies presented are consistent with industry practice for assessing capital investment requirements and operating costs of coal conversion systems. All values stated are based on January 1980 dollars, with appropriate recognition of the time value of money. Evaluation of project economic feasibility can be considered a two-step process (subject to considerable refinement). First, the costs of the project must be quantified, and second, the price at which the product can be manufactured must be determined. These two major categories are discussed. The summary of methodology is divided into five parts: (1) systems costs, (2) instant plant costs, (3) annual operating costs, (4) escalation and discounting process, and (5) product pricing.

  9. Multiclass support vector machines with example-dependent costs applied to plankton biomass estimation.

    PubMed

    González, Pablo; Álvarez, Eva; Barranquero, Jose; Díez, Jorge; González-Quirós, Rafael; Nogueira, Enrique; López-Urrutia, Ángel; del Coz, Juan José

    2013-11-01

    In many applications, the mistakes made by an automatic classifier are not equal, they have different costs. These problems may be solved using a cost-sensitive learning approach. The main idea is not to minimize the number of errors, but the total cost produced by such mistakes. This brief presents a new multiclass cost-sensitive algorithm, in which each example has attached its corresponding misclassification cost. Our proposal is theoretically well-founded and is designed to optimize cost-sensitive loss functions. This research was motivated by a real-world problem, the biomass estimation of several plankton taxonomic groups. In this particular application, our method improves the performance of traditional multiclass classification approaches that optimize the accuracy.
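
    The paper's algorithm attaches a misclassification cost to each example. A common, simpler approximation of that idea, shown here instead of the authors' method, is to pass per-example costs as sample weights to a standard multiclass SVM; the data are a synthetic stand-in for the plankton measurements.

        # Example-dependent costs as sample weights in a standard SVM.
        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(2)
        X = rng.normal(size=(300, 5))
        y = rng.integers(0, 3, size=300)             # three taxonomic groups
        X += y[:, None]                              # make the groups separable-ish
        costs = rng.uniform(0.5, 3.0, size=300)      # per-example misclassification cost

        clf = SVC(kernel="rbf", gamma="scale")
        clf.fit(X[:250], y[:250], sample_weight=costs[:250])   # errors weighted by cost
        print("held-out accuracy:", clf.score(X[250:], y[250:]))

    Weighting shifts the decision boundary away from high-cost examples, so the trained model trades raw accuracy for lower total cost, which is the stated goal of cost-sensitive learning.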

  10. MBRidge: an accurate and cost-effective method for profiling DNA methylome at single-base resolution.

    PubMed

    Cai, Wanshi; Mao, Fengbiao; Teng, Huajing; Cai, Tao; Zhao, Fangqing; Wu, Jinyu; Sun, Zhong Sheng

    2015-08-01

    Organisms and cells, in response to environmental influences or during development, undergo considerable changes in DNA methylation on a genome-wide scale, which are linked to a variety of biological processes. Using MethylC-seq to decipher the DNA methylome at single-base resolution is prohibitively costly. In this study, we develop a novel approach, named MBRidge, to detect the methylation levels of repertoire CpGs by innovatively introducing C-hydroxylmethylated adapters and bisulfite treatment into the MeDIP-seq protocol and employing ridge regression in data analysis. A systematic evaluation of the DNA methylome in the human ovarian cell line T29 showed that MBRidge achieved high correlation (R > 0.90) with much lower cost (∼10%) in comparison with MethylC-seq. We further applied MBRidge to profiling the DNA methylome in T29H, an oncogenic counterpart of T29. By comparing the methylomes of T29H and T29, we identified 131,790 differential methylation regions (DMRs), which are mainly enriched in carcinogenesis-related pathways. These are substantially different from the 7,567 DMRs obtained by RRBS, which were related to cell development or differentiation. The integrated analysis of DMRs in the promoter and the expression of DMR-corresponding genes revealed that DNA methylation enforced reverse regulation of gene expression, depending on the distance from the proximal DMR to transcription start sites, for both mRNA and lncRNA. Taken together, our results demonstrate that MBRidge is an efficient and cost-effective method that can be widely applied to profiling DNA methylomes.
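
    A sketch of the ridge-regression step that gives MBRidge its name follows: predict per-region methylation levels from read-count-style features. The feature construction here is synthetic and far simpler than the real MeDIP-seq protocol; it only illustrates the regression step.

        # Ridge regression from count features to methylation levels in [0, 1].
        import numpy as np
        from sklearn.linear_model import Ridge

        rng = np.random.default_rng(3)
        n_regions, n_features = 1000, 20
        X = rng.poisson(5.0, size=(n_regions, n_features)).astype(float)  # count features
        beta = rng.normal(size=n_features)
        m = 1 / (1 + np.exp(-0.05 * (X - X.mean(axis=0)) @ beta))         # methylation levels

        model = Ridge(alpha=1.0).fit(X[:800], m[:800])
        pred = np.clip(model.predict(X[800:]), 0.0, 1.0)
        print("correlation with held-out methylation levels:",
              round(float(np.corrcoef(pred, m[800:])[0, 1]), 3))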

  11. An examination of sources of sensitivity of consumer surplus estimates in travel cost models.

    PubMed

    Blaine, Thomas W; Lichtkoppler, Frank R; Bader, Timothy J; Hartman, Travis J; Lucente, Joseph E

    2015-03-15

    We examine the sensitivity of estimates of recreation demand using the Travel Cost Method (TCM) to four factors. Three of the four have been routinely and widely discussed in the TCM literature: a) Poisson versus negative binomial regression; b) application of the Englin correction to account for endogenous stratification; and c) truncation of the data set to eliminate outliers. A fourth issue we address has not been widely modeled: the potential effect on recreation demand of the interaction between income and travel cost. We provide a straightforward comparison of all four factors, analyzing the impact of each on regression parameters and consumer surplus estimates. Truncation has a modest effect on estimates obtained from the Poisson models but a radical effect on those obtained from the negative binomial. Inclusion of an income-travel cost interaction term generally produces a more conservative, but not statistically significantly different, estimate of consumer surplus in both Poisson and negative binomial models. It also generates broader confidence intervals. Applying truncation, the Englin correction, and the income-travel cost interaction produced the most conservative estimates of consumer surplus and eliminated the statistical difference between the Poisson and the negative binomial. Use of the income-travel cost interaction term reveals that for visitors who face relatively low travel costs, the relationship between income and travel demand is negative, while it is positive for those who face high travel costs. This provides an explanation of the ambiguities in the findings regarding the role of income widely observed in the TCM literature. Our results suggest that policies that reduce access to publicly owned resources inordinately impact local low-income recreationists and are contrary to environmental justice.
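
    A minimal sketch of the Poisson-versus-negative-binomial comparison with an income-travel cost interaction follows. The visitors are simulated, not the study's survey data, and the consumer surplus formula shown is the standard semi-log TCM result, not a claim about the authors' exact specification.

        # Trip counts ~ travel cost + income + interaction, under two count models.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n = 1000
        tc = rng.uniform(5, 100, n)                 # travel cost per trip ($)
        inc = rng.uniform(20, 120, n)               # household income ($1000s)
        mu = np.exp(1.5 - 0.020 * tc + 0.002 * inc + 0.0001 * tc * inc)
        trips = rng.poisson(mu)                     # observed trip counts

        X = sm.add_constant(np.column_stack([tc, inc, tc * inc]))
        pois = sm.GLM(trips, X, family=sm.families.Poisson()).fit()
        negb = sm.GLM(trips, X, family=sm.families.NegativeBinomial()).fit()

        # In a semi-log demand model, consumer surplus per trip is -1/beta_tc;
        # with the interaction term, evaluate the slope at a chosen income.
        for name, res in [("Poisson", pois), ("negative binomial", negb)]:
            slope = res.params[1] + res.params[3] * inc.mean()
            print(f"{name}: CS per trip at mean income = {-1 / slope:.2f}")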

  12. Estimates of incidence and costs of intestinal infectious diseases in the United States.

    PubMed

    Garthright, W E; Archer, D L; Kvenberg, J E

    1988-01-01

    The incidence of acute episodes of intestinal infectious diseases in the United States was estimated through analysis of community-based studies and national interview surveys. Their differing results were reconciled by adjusting the study population age distributions in the community-based studies, by excluding those cases that also showed respiratory symptoms, and by accounting for structural differences in the surveys. The reconciliation process provided an estimate of 99 million acute cases of either vomiting or diarrhea, or both, each year in this country, half of which involved more than a full day of restricted activity. The analysis was limited to cases of acute gastrointestinal diseases with vomiting or diarrhea but without respiratory symptoms. Physicians were consulted for 8.2 million illnesses; 250,000 of these required hospitalization. In 1985, hospitalizations incurred $560 million in medical costs and $200 million in lost productivity. Nonhospitalized cases (7.9 million) for which physicians were consulted incurred $690 million in medical costs and $2.06 billion in lost productivity. More than 90 million cases for which no physician was consulted cost an estimated $19.5 billion in lost productivity. The estimates excluded such costs as death, pain and suffering, lost leisure time, financial losses to food establishments, and legal expenses. According to these estimates, medical costs and lost productivity from acute intestinal infectious diseases amount to a minimum of about $23 billion a year in the United States.
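
    The "about $23 billion a year" figure is just the sum of the component estimates quoted above (1985 dollars), as the short check below shows.

        # Sum of the component cost estimates quoted in the abstract.
        components_billion = {
            "hospitalized - medical costs":                0.56,
            "hospitalized - lost productivity":            0.20,
            "physician visits - medical costs":            0.69,
            "physician visits - lost productivity":        2.06,
            "no physician consulted - lost productivity": 19.50,
        }
        print(f"total: ${sum(components_billion.values()):.2f} billion")  # ~= $23 billion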

  13. Application of Boosting Regression Trees to Preliminary Cost Estimation in Building Construction Projects.

    PubMed

    Shin, Yoonseok

    2015-01-01

    Among the recent data mining techniques available, the boosting approach has attracted a great deal of attention because of its effective learning algorithm and strong boundaries in terms of its generalization performance. However, the boosting approach has yet to be used in regression problems within the construction domain, including cost estimation, although it has been actively utilized in other domains. Therefore, a boosting regression tree (BRT) is applied to cost estimation at the early stage of a construction project to examine the applicability of the boosting approach to a regression problem within the construction domain. To evaluate the performance of the BRT model, its performance was compared with that of a neural network (NN) model, which has been proven to perform well in cost estimation domains. The BRT model showed results similar to those of the NN model using 234 actual cost datasets from a building construction project. In addition, the BRT model can provide additional information, such as the importance plot and structure model, which can support estimators in comprehending the decision-making process. Consequently, the boosting approach has potential applicability to preliminary cost estimation in a building construction project.
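
    The sketch below shows a boosted-regression-tree cost model of the kind evaluated above, using scikit-learn's gradient boosting; the feature importances echo the "importance plot" the paper highlights. The data are synthetic stand-ins for the 234 building-project records.

        # Boosted regression trees for preliminary construction cost estimation.
        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(5)
        n = 234
        area = rng.uniform(500, 50_000, n)       # gross floor area (m^2), assumed feature
        storeys = rng.integers(1, 30, n)         # number of storeys, assumed feature
        quality = rng.uniform(0.5, 3.0, n)       # finish-quality index, assumed feature
        X = np.column_stack([area, storeys, quality])
        cost = 1200 * area * quality * (1 + 0.01 * storeys) + rng.normal(0, 1e6, n)

        brt = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05,
                                        max_depth=3, random_state=0)
        brt.fit(X[:200], cost[:200])
        print("R^2 on held-out projects:", round(brt.score(X[200:], cost[200:]), 3))
        print("feature importances (area, storeys, quality):",
              brt.feature_importances_.round(2))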

  14. Reusable Reentry Satellite (RRS) system design study: System cost estimates document

    NASA Astrophysics Data System (ADS)

    1991-02-01

    The Reusable Reentry Satellite (RRS) program was initiated to provide life science investigators relatively inexpensive, frequent access to space for extended periods of time with eventual satellite recovery on earth. The RRS will provide an on-orbit laboratory for research on biological and material processes, be launched from a number of expendable launch vehicles, and operate in Low-Altitude Earth Orbit (LEO) as a free-flying unmanned laboratory. SAIC's design will provide independent atmospheric reentry and soft landing in the continental U.S., orbit for a maximum of 60 days, and will sustain three flights per year for 10 years. The Reusable Reentry Vehicle (RRV) will be 3-axis stabilized with artificial gravity up to 1.5g's, be rugged and easily maintainable, and have a modular design to accommodate a satellite bus and separate modular payloads (e.g., rodent module, general biological module, ESA microgravity botany facility, general botany module). The purpose of this System Cost Estimate Document is to provide a Life Cycle Cost Estimate (LCCE) for a NASA RRS Program using SAIC's RRS design. The estimate includes development, procurement, and 10 years of operations and support (O&S) costs for NASA's RRS program. The estimate does not include costs for other agencies which may track or interface with the RRS program (e.g., Air Force tracking agencies or individual RRS experimenters involved with special payload modules (PM's)). The life cycle cost estimate extends over the 10 year operation and support period FY99-2008.

  16. Using propensity scores to estimate the cost-effectiveness of medical therapies.

    PubMed

    Indurkhya, Alka; Mitra, Nandita; Schrag, Deborah

    2006-05-15

    The cost-effectiveness ratio is a popular statistic used by policy makers to decide which programs are cost-effective in the public health sector. Recently, the net monetary benefit has been proposed as an alternative summary measure that overcomes the limitations associated with the cost-effectiveness ratio. Research on using the net monetary benefit to assess the cost-effectiveness of therapies in non-randomized studies has yet to be done. Propensity scores are useful in estimating the adjusted effectiveness of programs with non-randomized or quasi-experimental designs. This article introduces the use of propensity score adjustment in cost-effectiveness analyses to estimate net monetary benefits for non-randomized studies. The uncertainty associated with the net monetary benefit estimate is evaluated using cost-effectiveness acceptability curves. Our method is illustrated by applying it to SEER-Medicare data for muscle-invasive bladder cancer to determine the most cost-effective treatment protocol.
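
    A sketch of the core computation under assumed data: each patient's net monetary benefit is NMB = lambda * effectiveness - cost, and the non-randomized treatment contrast is adjusted with inverse-probability-of-treatment weights from a propensity score model. The variable names and the logistic specification are illustrative, not the authors' SEER-Medicare model; sweeping the willingness-to-pay value and bootstrapping the estimate is what traces out a cost-effectiveness acceptability curve.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def incremental_nmb(X, treated, effect, cost, wtp=50_000):
            """Propensity-weighted incremental net monetary benefit.

            X : (n, p) confounders; treated : 0/1; effect : e.g. QALYs;
            cost : dollars; wtp : willingness to pay per effectiveness unit.
            """
            ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
            w = np.where(treated == 1, 1 / ps, 1 / (1 - ps))   # IPT weights
            nmb = wtp * effect - cost                          # per-patient NMB
            t, c = treated == 1, treated == 0
            return np.average(nmb[t], weights=w[t]) - np.average(nmb[c], weights=w[c])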

  17. The economic costs of radiation-induced health effects: Estimation and simulation

    SciTech Connect

    Nieves, L.A.; Tawil, J.J.

    1988-08-01

    This effort improves the quantitative information available for use in evaluating actions that alter health risks due to population exposure to ionizing radiation. To project the potential future costs of changes in health-effect risks, Pacific Northwest Laboratory (PNL) constructed a probabilistic computer model, the Health Effects Costs Model (HECOM), which uses the health-effect incidence estimates from accident consequence models to calculate the discounted sum of the economic costs associated with population exposure to ionizing radiation. Application of HECOM to value-impact and environmental impact analyses should greatly increase the quality of the information available for regulatory decision making. Three major types of health effects present risks for any population sustaining a significant radiation exposure: acute radiation injuries (and fatalities), latent cancers, and impairments due to genetic effects. The literature pertaining to both incidence and treatment of these health effects was reviewed by PNL and provided the basis for developing economic cost estimates. The economic costs of health effects estimated by HECOM represent both the value of resources consumed in diagnosing, treating, and caring for the patient and the value of goods not produced because of illness or premature death due to the health effect. Additional costs to society, such as pain and suffering, are not included in the PNL economic cost measures since they do not divert resources from other uses, are difficult to quantify, and do not have a value observable in the marketplace. 83 refs., 3 figs., 19 tabs.
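
    The bookkeeping at the heart of such a model, as described here, is a discounted sum of yearly costs; a minimal sketch with illustrative figures, not HECOM's actual cost schedules or discount rate:

        def discounted_cost(annual_costs, rate=0.03):
            """Present value of a stream of yearly costs (year 0 = first element)."""
            return sum(c / (1 + rate) ** t for t, c in enumerate(annual_costs))

        # e.g. diagnosis/treatment costs plus lost production for one latent-cancer
        # case spread over five years (hypothetical figures)
        print(discounted_cost([40_000, 25_000, 15_000, 15_000, 60_000]))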

  18. Software Cost Estimating Models: A Comparative Study of What the Models Estimate

    DTIC Science & Technology

    1993-09-01

    Example of SEER-SEM ver. 3.21 report with labor categories; sample Ada and assembly program operation. PRICE-S (Programmed Review of Information for Costing and Evaluation - Software): a commercial cost model (see Chapter I for details regarding this model). Software: the combination of computer programs, data, and documentation which enable computer equipment to perform its functions.

  19. An estimate of the cost of administering intravenous biological agents in Spanish day hospitals

    PubMed Central

    Nolla, Joan Miquel; Martín, Esperanza; Llamas, Pilar; Manero, Javier; Rodríguez de la Serna, Arturo; Fernández-Miera, Manuel Francisco; Rodríguez, Mercedes; López, José Manuel; Ivanova, Alexandra; Aragón, Belén

    2017-01-01

    Objective To estimate the unit costs of administering intravenous (IV) biological agents in day hospitals (DHs) in the Spanish National Health System. Patients and methods Data were obtained from 188 patients with rheumatoid arthritis, collected from nine DHs, receiving one of the following IV therapies: infliximab (n=48), rituximab (n=38), abatacept (n=41), or tocilizumab (n=61). The fieldwork was carried out between March 2013 and March 2014. Three groups of costs were considered: 1) structural costs, 2) material costs, and 3) staff costs. Staff costs were treated as a fixed cost and estimated according to the DH's theoretical level of activity, which includes the personal care of each patient as well as the DH's general activities (complete imputation method, CIM). An alternative calculation was also performed in which staff costs were treated as a variable cost imputed according to the time spent on direct care (partial imputation method, PIM). All costs were expressed in euros for the reference year 2014. Results The average total cost was €146.12 per infusion (standard deviation [SD] ±87.11; CIM) and €29.70 per infusion (SD ±11.42; PIM). The structure-related costs per infusion varied between €2.23 and €62.35 per patient and DH; the cost of consumables ranged between €3.48 and €20.34 per patient and DH. In terms of the care process, the average difference between the shortest and the longest time taken by different hospitals to administer an IV biological therapy was 113 minutes. Conclusion The average total cost per infusion was lower than the figures normally used in models of economic evaluation drawn from secondary sources, and lower still when staff costs are imputed according to the PIM. A high degree of variability was observed between DHs in the cost of consumables, in the structure-related costs, and in the care process. PMID:28356746
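
    A sketch of the two imputation rules under assumed inputs: the complete imputation method (CIM) treats staff cost as fixed and spreads it over the day hospital's theoretical activity, while the partial imputation method (PIM) charges only the staff time spent on direct care. All figures below are illustrative placeholders, not the study's data.

        def staff_cost_cim(annual_staff_cost, annual_infusions):
            """CIM: fixed staff cost allocated over theoretical DH activity."""
            return annual_staff_cost / annual_infusions

        def staff_cost_pim(direct_care_minutes, staff_cost_per_minute):
            """PIM: variable staff cost for time spent on direct patient care."""
            return direct_care_minutes * staff_cost_per_minute

        structure, consumables = 2.23, 3.48        # low-end per-infusion values
        cim_total = structure + consumables + staff_cost_cim(450_000, 4_000)
        pim_total = structure + consumables + staff_cost_pim(45, 0.55)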

  20. Estimating patient-borne water and electricity costs in home hemodialysis: a simulation

    PubMed Central

    Nickel, Matthew; Rideout, Wes; Shah, Nikhil; Reintjes, Frances; Chen, Justin Z.; Burrell, Robert; Pauly, Robert P.

    2017-01-01

    Background: Home hemodialysis is associated with lower costs to the health care system than conventional facility-based hemodialysis because of lower staffing and overhead costs, and because the treatment cost of utilities (water and power) is transferred to the patient. The purpose of this study was to determine the utility costs of home hemodialysis and to create a formula with which patients and renal programs can estimate the annual patient-borne costs of this type of treatment. Methods: Seven common combinations of treatment duration and dialysate flow were each replicated 5 times using various combinations of home hemodialysis and reverse osmosis machines. Real-time utility (electricity and water) consumption was monitored during these simulations. A generic formula was developed to allow patients and programs to calculate a more precise estimate of utility costs based on the combination of dialysis intensity, frequency and utility rates unique to any patient. Results: Using typical 2014 utility costs for Edmonton, the most expensive prescription was for nocturnal home hemodialysis (8 h at 300 mL/min, 6 d/wk), which resulted in a utility cost of $1269 per year; the least expensive prescription was for conventional home hemodialysis (4 h at 500 mL/min, 3 d/wk), which cost $420 per year. Water consumption makes up most of this expense, with electricity accounting for only 12% of the cost. Interpretation: We show that a substantial cost burden, which would otherwise be borne by the renal program, is transferred to the patient on home hemodialysis.
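
    The abstract does not reproduce the authors' generic formula, but its structure is implied: annual utility cost = sessions per year x (water used per session x water tariff + energy used per session x electricity tariff). A sketch with hypothetical per-session consumption and tariff values as placeholders:

        def annual_utility_cost(sessions_per_week, water_L, kwh,
                                water_rate_per_L, kwh_rate):
            """Patient-borne utility cost per year for one home hemodialysis prescription."""
            return sessions_per_week * 52 * (water_L * water_rate_per_L + kwh * kwh_rate)

        # Hypothetical nocturnal prescription: 6 sessions/week, 8 h at 300 mL/min
        print(annual_utility_cost(6, water_L=500, kwh=9.0,
                                  water_rate_per_L=0.0035, kwh_rate=0.08))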

  1. Estimating the costs of tsetse control options: an example for Uganda.

    PubMed

    Shaw, A P M; Torr, S J; Waiswa, C; Cecchi, G; Wint, G R W; Mattioli, R C; Robinson, T P

    2013-07-01

    Decision-making and financial planning for tsetse control is complex, with a particularly wide range of choices to be made on location, timing, strategy and methods. This paper presents full cost estimates for eliminating or continuously controlling tsetse in a hypothetical area of 10,000 km² located in south-eastern Uganda. Four tsetse control techniques were analysed: (i) artificial baits (insecticide-treated traps/targets), (ii) insecticide-treated cattle (ITC), (iii) aerial spraying using the sequential aerosol technique (SAT) and (iv) the addition of the sterile insect technique (SIT) to the insecticide-based methods (i-iii). For the creation of fly-free zones and using a 10% discount rate, the field costs per km² came to US$283 for traps (4 traps per km²), US$30 for ITC (5 treated cattle per km² using restricted application), US$380 for SAT and US$758 for adding SIT. The inclusion of entomological and other preliminary studies plus administrative overheads adds substantially to the overall cost, so that the total costs become US$482 for traps, US$220 for ITC, US$552 for SAT and US$993-1,365 if SIT is added following suppression using another method. These basic costs would apply to trouble-free operations dealing with isolated tsetse populations. Estimates were also made for non-isolated populations, allowing for a barrier covering 10% of the intervention area, maintained for 3 years. Where traps were used as a barrier, the total cost of elimination increased by between 29% and 57%, and for ITC barriers the increase was between 12% and 30%. In the case of continuous tsetse control operations, costs were estimated over a 20-year period and discounted at 10%. Total costs per km² came to US$368 for ITC, US$2,114 for traps, all deployed continuously, and US$2,442 for SAT applied at 3-year intervals. The lower costs compared favourably with the regular treatment of cattle with prophylactic trypanocides (US$3,862 per km² assuming four doses per annum at 45

  2. Estimating Costs Associated with a Community Outbreak of Meningococcal Disease in a Colombian Caribbean City

    PubMed Central

    Pinzón-Redondo, Hernando; Coronell-Rodriguez, Wilfrido; Díaz-Martinez, Inés; Guzmán-Corena, Ángel; Constenla, Dagna

    2014-01-01

    Meningococcal disease is a serious and potentially life-threatening infection caused by the bacterium Neisseria meningitidis (N. meningitidis); it can cause meningitis and meningococcaemia, as well as outbreaks and epidemics. The disease is fatal in 9-12% of cases, with a death rate of up to 40% among patients with meningococcaemia. The objective of this study was to estimate the costs of a meningococcal outbreak that occurred in a Caribbean city of Colombia. We contacted experts involved in the outbreak and asked them specific questions about the diagnosis and treatment of meningococcal cases during the outbreak. Estimates of the costs of the outbreak were also based on an extensive review of the medical records available during the outbreak. The costs associated with the outbreak were divided into the cost of the disease response phase and the cost of the disease surveillance phase, and were expressed in US$ (2011) as cost per 1,000 inhabitants. The average age of patients was 4.6 years (SD 3.5); 50% of the cases died; 50% of the cases were reported to have meningitis (3/6); 33% were diagnosed with meningococcaemia and myocarditis (2/6); 50% of the cases had bacteraemia (3/6); 66% of the cases had a culture specimen positive for Neisseria meningitidis; 5 of the 6 cases had RT-PCR positive for N. meningitidis. All N. meningitidis isolates were serogroup B; 50 doses of ceftriaxone were administered as prophylaxis. Vaccine was not available at the time. The costs associated with control of the outbreak were estimated at US$ 0.8 per 1,000 inhabitants, disease surveillance at US$ 4.1 per 1,000 inhabitants, and healthcare costs at US$ 5.1 per 1,000 inhabitants. The costs associated with meningococcal outbreaks are substantial, and such outbreaks should be prevented. The mass chemoprophylaxis implemented helped control the outbreak. PMID:25395916

  3. Observing Volcanic Thermal Anomalies from Space: How Accurate is the Estimation of the Hotspot's Size and Temperature?

    NASA Astrophysics Data System (ADS)

    Zaksek, K.; Pick, L.; Lombardo, V.; Hort, M. K.

    2015-12-01

    Measuring the heat emission from active volcanic features on the basis of infrared satellite images contributes to the volcano's hazard assessment. Because these thermal anomalies occupy only a small fraction (< 1%) of a typically resolved target pixel (e.g. from Landsat 7, MODIS), the accurate determination of the hotspot's size and temperature is problematic. Conventionally this is overcome by comparing observations in at least two separate infrared spectral wavebands (the Dual-Band method). We investigate the resolution limits of this thermal un-mixing technique by means of a uniquely designed indoor analog experiment, in which the volcanic feature is simulated by an electrical heating alloy of 0.5 mm diameter installed on a plywood panel of high emissivity. Two thermographic cameras (VarioCam high resolution and ImageIR 8300 by Infratec) record images of the artificial heat source in wavebands comparable to those available from satellite data, ranging from the short-wave infrared (1.4-3 µm) over the mid-wave infrared (3-8 µm) to the thermal infrared (8-15 µm). In the experiment, the pixel fraction of the hotspot was successively reduced by increasing the camera-to-target distance from 3 m to 35 m. On the basis of an individual target pixel, the expected decrease of the hotspot pixel area with distance at a relatively constant wire temperature of around 600 °C was confirmed. The deviation of the hotspot's pixel fraction yielded by the Dual-Band method from the theoretically calculated one was within 20% up to a target distance of 25 m. This means that a reliable estimation of the hotspot size is only possible if the hotspot is larger than about 3% of the pixel area, a resolution boundary below which most remotely sensed volcanic hotspots fall. Future efforts will focus on the investigation of a resolution limit for the hotspot's temperature by varying the alloy's amperage. Moreover, the un-mixing results for more realistic multi
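
    A sketch of the Dual-Band calculation the experiment probes: the pixel radiance in each band is modeled as p * B(lambda, T_hot) + (1 - p) * B(lambda, T_bg), with B the Planck function, and the two band equations are solved for the hotspot fraction p and temperature T_hot given an assumed background temperature. The band wavelengths and the synthetic pixel below are illustrative, not the experiment's camera bands.

        import numpy as np
        from scipy.optimize import fsolve

        H, C, KB = 6.626e-34, 2.998e8, 1.381e-23

        def planck(lam, T):
            """Spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
            return (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * T))

        def dual_band(L_mir, L_tir, lam_mir=4e-6, lam_tir=11e-6, T_bg=300.0):
            """Solve the two mixture equations for (p, T_hot)."""
            def residuals(x):
                p, T_hot = x
                return [p * planck(lam_mir, T_hot) + (1 - p) * planck(lam_mir, T_bg) - L_mir,
                        p * planck(lam_tir, T_hot) + (1 - p) * planck(lam_tir, T_bg) - L_tir]
            return fsolve(residuals, x0=[0.01, 800.0])

        # Synthetic pixel: 3% of the area at 873 K (~600 degrees C) on a 300 K background
        L4, L11 = (0.03 * planck(l, 873.0) + 0.97 * planck(l, 300.0) for l in (4e-6, 11e-6))
        print(dual_band(L4, L11))   # recovers approximately (0.03, 873)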

  4. Estimation of the cost of treatment by chemotherapy for early breast cancer in Morocco

    PubMed Central

    2010-01-01

    Background Breast cancer is the most common cancer in women, in both incidence and mortality. The treatment of breast cancer has benefited from progress in chemotherapy and targeted therapies, but with a parallel increase in treatment costs. Despite a relatively high incidence of cancer at many sites, there is so far no national cancer registry in Morocco. The main goal of this paper is to estimate the total cost of chemotherapy in the early stages of breast cancer, given its frequency and the chances of patients being cured. This study provides health decision-makers with a first estimate of costs and the opportunity to make optimal use of the available data to estimate the need for antimitotics and trastuzumab in Morocco. Method We start by evaluating the individual cost according to the therapeutic sub-groups, namely: 1. patients needing chemotherapy with only anthracycline-based therapy; 2. patients needing chemotherapy with both anthracycline and taxane but without trastuzumab; 3. patients needing trastuzumab in addition to chemotherapy. For each sub-group, the protocol of treatment is described and the individual costs, per unit and for the whole cycle, are evaluated. We then estimate the number of women suffering from breast cancer on the basis of the two databases available in Morocco. Finally, we calculate the total annual cost of treatment of breast cancer in Morocco. Results The total cost of breast cancer in Morocco is given in Moroccan dirhams (MAD), in US dollars at the current exchange rate (MAD 10 = USD 1.30) and in international dollars or purchasing power parity (MAD 10 = PPP 1.95). The cost of a therapy with trastuzumab is 8.4 times the cost of a sequential chemotherapy combining anthracycline and taxane, and nearly 60 times the cost of chemotherapy based on anthracycline alone. Globally, between USD 13.3 million and USD 28.6 million needs to be devoted every year by the Moroccan health authorities to treat women with localized breast
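
    A sketch of the aggregation step: the annual national cost is the sum over the three sub-groups of (number of patients) x (cost of the full protocol), converted at the stated exchange rate. The patient counts and protocol costs below are placeholders chosen only to be consistent with the cost ratios quoted above, not the paper's estimates.

        MAD_PER_USD = 10 / 1.30                      # MAD 10 = USD 1.30
        subgroups = {                                # (patients/year, protocol cost in MAD)
            "anthracycline only":         (1500,  10_000),
            "anthracycline + taxane":     (1200,  70_000),
            "chemotherapy + trastuzumab": ( 400, 590_000),
        }
        total_mad = sum(n * cost for n, cost in subgroups.values())
        print(f"annual cost: MAD {total_mad:,.0f} = USD {total_mad / MAD_PER_USD:,.0f}")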

  5. Estimating costs of low-level radioactive waste disposal alternatives for the Commonwealth of Massachusetts

    SciTech Connect

    Not Available

    1994-02-01

    This report was prepared for the Commonwealth of Massachusetts by the Idaho National Engineering Laboratory, National Low-Level Waste Management Program. It presents planning life-cycle cost (PLCC) estimates for four sizes of in-state low-level radioactive waste (LLRW) disposal facilities. These PLCC estimates include preoperational and operational expenditures, all support facilities, materials, labor, closure costs, and long-term institutional care and monitoring costs. This report is intended to be used as a broad decision-making tool for evaluating one of the several complex factors that must be examined when deciding between various LLRW management options: relative costs. Because the underlying assumptions of these analyses will change as the Board decides how it will manage Massachusetts' waste and as the specific characteristics of any disposal facility are determined, the results of this study are not absolute and should only be used to compare the relative costs of the options presented. The disposal technology selected for this analysis is aboveground earth-mounded vaults. These vaults are reinforced concrete structures in which low-level waste is emplaced and later covered with a multi-layered earthen cap. The "base case" PLCC estimate was derived from a preliminary feasibility design developed for the Illinois Low-Level Radioactive Waste Disposal Facility. This report describes facility operations and details the procedure used to develop the base case PLCC estimate for each facility component and size. Sensitivity analyses were performed on the base case PLCC estimate by varying several factors to determine their influence on unit disposal costs. The report presents the results of the sensitivity analyses for the five most significant cost factors.

  6. Sound Cost Estimating: A Pre-Requisite to Ascertaining Affordability of DoD Programs

    DTIC Science & Technology

    2011-10-01

    predictions, especially about the future," has been variously attributed to Niels Bohr, Mark Twain and, of course, Yogi Berra. In his best-selling...critical for program success: conducting a sound program life cycle cost estimate and establishing a program's budget. These two processes are...supposedly "on-track" is one strategy to provide early warning about problems before they become unmanageable. A sound cost estimate is a necessary

  7. Equipment Design and Cost Estimation for Small Modular Biomass Systems, Synthesis Gas Cleanup, and Oxygen Separation Equipment; Task 1: Cost Estimates of Small Modular Systems

    SciTech Connect

    Nexant Inc.

    2006-05-01

    This deliverable is the Final Report for Task 1, Cost Estimates of Small Modular Systems, as part of NREL Award ACO-5-44027, ''Equipment Design and Cost Estimation for Small Modular Biomass Systems, Synthesis Gas Cleanup and Oxygen Separation Equipment''. Subtask 1.1 looked into processes and technologies that have been commercially built at both large and small scales, with three technologies, Fluidized Catalytic Cracking (FCC) of refinery gas oil, Steam Methane Reforming (SMR) of Natural Gas, and Natural Gas Liquids (NGL) Expanders, chosen for further investigation. These technologies were chosen due to their applicability relative to other technologies being considered by NREL for future commercial applications, such as indirect gasification and fluidized bed tar cracking. Research in this subject is driven by an interest in the impact that scaling has on the cost and major process unit designs for commercial technologies. Conclusions from the evaluations performed could be applied to other technologies being considered for modular or skid-mounted applications.
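
    The effect of scale that the task investigates is commonly summarized by the cost-capacity power law C2 = C1 * (S2 / S1)^n, where an exponent n < 1 implies economies of scale; the report derives technology-specific behavior, so the generic n = 0.6 (the "six-tenths rule") below is only an assumed placeholder.

        def scaled_cost(base_cost, base_capacity, new_capacity, exponent=0.6):
            """Cost-capacity power law: C2 = C1 * (S2 / S1) ** n."""
            return base_cost * (new_capacity / base_capacity) ** exponent

        # Scaling a $100M, 2000 t/d plant down to a 200 t/d modular unit gives
        # roughly $25M rather than $10M: small scale costs more per unit of capacity.
        print(scaled_cost(100e6, 2000, 200))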

  8. The cost of forming more accurate impressions: accuracy-motivated perceivers see the personality of others more distinctively but less normatively than perceivers without an explicit goal.

    PubMed

    Biesanz, Jeremy C; Human, Lauren J

    2010-04-01

    Does the motivation to form accurate impressions actually improve accuracy? The present work extended Kenny's (1991, 1994) weighted-average model (WAM), a theoretical model of the factors that influence agreement among personality judgments, to examine two components of interpersonal perception: distinctive and normative accuracy. WAM predicts that an accuracy motivation should enhance distinctive accuracy but decrease normative accuracy. In other words, the impressions of a perceiver with an accuracy motivation will correspond more with the target person's unique characteristics and less with the characteristics of the average person. Perceivers randomly assigned to receive the social goal of forming accurate impressions, communicated through a single-sentence instruction, achieved higher levels of distinctive self-other agreement but lower levels of normative agreement compared with perceivers not given an explicit impression-formation goal. The results suggest that people motivated to form accurate impressions do indeed become more accurate, but at the cost of seeing others less normatively and, in particular, less positively.

  9. Early-Stage Capital Cost Estimation of Biorefinery Processes: A Comparative Study of Heuristic Techniques.

    PubMed

    Tsagkari, Mirela; Couturier, Jean-Luc; Kokossis, Antonis; Dubois, Jean-Luc

    2016-09-08

    Biorefineries offer a promising alternative to fossil-based processing industries and have undergone rapid development in recent years. Limited financial resources and stringent company budgets necessitate quick capital estimation of pioneering biorefinery projects at the early stages of their conception to screen process alternatives, decide on project viability, and allocate resources to the most promising cases. Biorefineries are capital-intensive projects that involve state-of-the-art technologies for which there is no prior experience or sufficient historical data. This work reviews existing rapid cost estimation practices, which can be used by researchers with no previous cost estimating experience. It also comprises a comparative study of six cost methods on three well-documented biorefinery processes to evaluate their accuracy and precision. The results illustrate discrepancies among the methods because their extrapolation on biorefinery data often violates inherent assumptions. This study recommends the most appropriate rapid cost methods and urges the development of an improved early-stage capital cost estimation tool suitable for biorefinery processes.
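
    One family of rapid methods of the kind compared here is factorial estimation, in which total capital investment is a multiplier (e.g. a Lang factor) applied to the sum of purchased-equipment costs. The abstract does not name the six methods evaluated, so treat the following as a generic illustration rather than one of them.

        def lang_estimate(equipment_costs, lang_factor=5.0):
            """Factorial capital estimate: TCI ~ F_L * sum of purchased-equipment costs.

            Lang factors around 4.7-5.9 are quoted in standard references for
            fluid-processing plants; biorefinery-specific factors may differ.
            """
            return lang_factor * sum(equipment_costs)

        print(lang_estimate([2.1e6, 0.8e6, 3.4e6]))   # hypothetical equipment list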

  10. Alternative methods of marginal abatement cost estimation: Non- parametric distance functions

    SciTech Connect

    Boyd, G.; Molburg, J.; Prince, R.

    1996-12-31

    This project implements an economic methodology to measure the marginal abatement costs of pollution by measuring the lost revenue implied by an incremental reduction in pollution. It utilizes the observed performance, or "best practice", of facilities to infer the marginal abatement cost. The initial stage of the project is to use data from an earlier published study on productivity trends and pollution in electric utilities to test this approach and to provide insights on its implementation for the cost-benefit analysis studies needed by the Department of Energy. The basis for this marginal abatement cost estimation is a relationship between the outputs and the inputs of a firm or plant. Given a fixed set of input resources, including quasi-fixed inputs like plant and equipment and variable inputs like labor and fuel, a firm is able to produce a mix of outputs. This paper uses this theoretical view of the joint production process to implement the methodology and obtain empirical estimates of marginal abatement costs. These estimates are compared to engineering estimates.

  12. Site restoration: Estimation of attributable costs from plutonium-dispersal accidents

    SciTech Connect

    Chanin, D.I.; Murfin, W.B.

    1996-05-01

    A nuclear weapons accident is an extremely unlikely event due to the extensive care taken in operations. However, under some hypothetical accident conditions, plutonium might be dispersed to the environment. This would result in costs being incurred by the government to remediate the site and compensate for losses. This study is a multi-disciplinary evaluation of the potential scope of the post-accident response that includes technical factors, current and proposed legal requirements and constraints, as well as social/political factors that could influence decision making. The study provides parameters that can be used to assess economic costs for accidents postulated to occur in urban areas, Midwest farmland, Western rangeland, and forest. Per-area remediation costs have been estimated, using industry-standard methods, for both expedited and extended remediation. Expedited remediation costs have been evaluated for highways, airports, and urban areas. Extended remediation costs have been evaluated for all land uses except highways and airports. The inclusion of cost estimates in risk assessments, together with the conventional estimation of doses and health effects, allows a fuller understanding of the post-accident environment. The insights obtained can be used to minimize economic risks by evaluation of operational and design alternatives, and through development of improved capabilities for accident response.

  13. Tug fleet and ground operations schedules and controls. Volume 3: Program cost estimates

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Cost data for the tug DDT&E and operations phases are presented. Option 6, the recommended option selected from the seven options considered, was used as the basis for the ground processing estimates. Option 6 provides for processing the tug in a factory-clean environment in the low bay area of the VAB, with subsequent cleaning to visibly clean. The basis and results of the trade study to select the Option 6 processing plan are included. A cost estimating methodology, a work breakdown structure, and a dictionary of WBS definitions are also provided.

  14. Conceptual capital-cost estimate and facility design of the Mirror-Fusion Technology Demonstration Facility

    SciTech Connect

    Not Available

    1982-09-01

    This report contains contributions by Bechtel Group, Inc. to Lawrence Livermore National Laboratory (LLNL) for the final report on the conceptual design of the Mirror Fusion Technology Demonstration Facility (TDF). Included in this report are the following contributions: (1) conceptual capital cost estimate, (2) structural design, and (3) plot plan and plant arrangement drawings. The conceptual capital cost estimate is prepared in a format suitable for inclusion as a section in the TDF final report. The structural design and drawings are prepared as partial inputs to the TDF final report section on facilities design, which is being prepared by the FEDC.

  15. Design and cost estimate of an 800 MVA superconducting power transmission

    SciTech Connect

    Alex, P.; Ernst, A.; Forsyth, E.; Gibbs, R.; Thomas, R.; Muller, T.

    1990-10-18

    Numerous studies involving cost estimates have been performed for superconducting power transmission systems. As these systems were usually aimed at providing transmission from large clusters of generation, the base power rating of the corridor was very high; in the case of the most comprehensive study it was 10,000 MVA. The purpose of this study is to examine a system closely based on the prototype 1000 MVA system operated at Brookhaven National Laboratory over a four-year period, to provide cost estimates for the superconducting system, and to compare these estimates with a design based on the use of advanced but conventional cable designs. The work is supported by funding from the Office of Energy Research's Industry/Laboratory Technology Exchange Program, which is designed to commercialize energy technologies. The technical design of the superconducting system was prepared by the BNL staff; the design of the 800 MVA conventional cable system was done by engineers from Underground Systems Incorporated. Both institutions worked on the cost estimate of the superconducting system. The description and cost estimate of the conventional cable system are given in the Appendix. 5 refs.

  16. Particulate Matter Exposure and Preterm Birth: Estimates of U.S. Attributable Burden and Economic Costs

    PubMed Central

    Trasande, Leonardo; Malecha, Patrick; Attina, Teresa M.

    2016-01-01

    Background: Preterm birth (PTB) rates in the United States (11.4% in 2013) remain high and are a substantial cause of morbidity. Studies of prenatal exposure have associated particulate matter ≤ 2.5 μm in diameter (PM2.5) and other ambient air pollutants with adverse birth outcomes; yet, to our knowledge, the burden and costs of PM2.5-attributable PTB have not been estimated in the United States. Objectives: We aimed to estimate the burden of PTB in the United States and the economic costs attributable to PM2.5 exposure in 2010. Methods: Annual deciles of PM2.5 were obtained from the U.S. Environmental Protection Agency. We converted the PTB odds ratio (OR) identified in a previous meta-analysis (1.15 per 10 μg/m³ for our base case; 1.07-1.16 for low- and high-end scenarios) to a relative risk (RR) to obtain an estimate that better represents the true relative risk. A reference level (RL) of 8.8 μg/m³ was applied. We then used the RR estimates and county-level PTB prevalence to quantify PM2.5-attributable PTB. Direct medical costs were obtained from the 2007 Institute of Medicine report, and lost economic productivity (LEP) was estimated using a meta-analysis of PTB-associated IQ loss and well-established relationships of IQ loss with LEP. All costs were calculated in 2010 dollars. Results: An estimated 3.32% of PTBs nationally (corresponding to 15,808 PTBs) in 2010 could be attributed to PM2.5 (PM2.5 > 8.8 μg/m³). Attributable PTB costs were estimated at $5.09 billion [sensitivity analysis (SA): $2.43-9.66 B], of which $760 million was spent on medical care (SA: $362 M-1.44 B). The estimated PM2.5-attributable fraction (AF) of PTB was highest in urban counties, with the highest AFs in the Ohio Valley and the southern United States. Conclusions: PM2.5 may contribute substantially to the burden and costs of PTB in the United States, and considerable health and economic benefits could be achieved through environmental regulatory interventions that reduce PM2.5 exposure in
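
    A sketch of the attributable-burden arithmetic described: convert the OR to an RR, scale the RR to each county's exposure above the 8.8 μg/m³ reference level, take the attributable fraction AF = (RR - 1) / RR, and apply it to the county's PTB count. The OR-to-RR step below uses the standard Zhang-Yu approximation with an assumed baseline risk; the paper's exact procedure may differ.

        def or_to_rr(odds_ratio, baseline_risk):
            """Zhang-Yu approximation: RR = OR / (1 - p0 + p0 * OR)."""
            return odds_ratio / (1 - baseline_risk + baseline_risk * odds_ratio)

        def attributable_ptb(ptb_count, pm25, rr_per_10ug, ref_level=8.8):
            """PTBs attributable to PM2.5 above the reference level in one county."""
            if pm25 <= ref_level:
                return 0.0
            rr = rr_per_10ug ** ((pm25 - ref_level) / 10)   # RR scaled to actual excess
            return ptb_count * (rr - 1) / rr                # AF * cases

        rr = or_to_rr(1.15, baseline_risk=0.114)   # base-case OR, 11.4% PTB rate
        print(attributable_ptb(5000, 12.3, rr))    # hypothetical county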

  17. The Cambridge Face Tracker: Accurate, Low Cost Measurement of Head Posture Using Computer Vision and Face Recognition Software

    PubMed Central

    Thomas, Peter B. M.; Baltrušaitis, Tadas; Robinson, Peter; Vivian, Anthony J.

    2016-01-01

    Purpose We validate a video-based method of head posture measurement. Methods The Cambridge Face Tracker uses neural networks (constrained local neural fields) to recognize facial features in video. The relative position of these facial features is used to calculate head posture. First, we assess the accuracy of this approach against videos in three research databases where each frame is tagged with a precisely measured head posture. Second, we compare our method to a commercially available mechanical device, the Cervical Range of Motion device: four subjects each adopted 43 distinct head postures that were measured using both methods. Results The Cambridge Face Tracker achieved confident facial recognition in 92% of the approximately 38,000 frames of video from the three databases. The respective mean error in absolute head posture was 3.34°, 3.86°, and 2.81°, with a median error of 1.97°, 2.16°, and 1.96°. The accuracy decreased with more extreme head posture. Comparing the Cambridge Face Tracker to the Cervical Range of Motion device gave correlation coefficients of 0.99 (P < 0.0001), 0.96 (P < 0.0001), and 0.99 (P < 0.0001) for yaw, pitch, and roll, respectively. Conclusions The Cambridge Face Tracker performs well under real-world conditions and within the range of normally encountered head posture. It allows useful quantification of head posture in real time or from precaptured video. Its performance is similar to that of a clinically validated mechanical device. It has significant advantages over other approaches in that subjects do not need to wear any apparatus, and it requires only low-cost, easy-to-set-up consumer electronics. Translational Relevance Noncontact assessment of head posture allows more complete clinical assessment of patients, and could benefit surgical planning in the future. PMID:27730008

  18. Accurate molecular dynamics and nuclear quantum effects at low cost by multiple steps in real and imaginary time: Using density functional theory to accelerate wavefunction methods

    NASA Astrophysics Data System (ADS)

    Kapil, V.; VandeVondele, J.; Ceriotti, M.

    2016-02-01

    The development and implementation of increasingly accurate methods for electronic structure calculations mean that, for many atomistic simulation problems, treating light nuclei as classical particles is now one of the most serious approximations. Even though recent developments have significantly reduced the overhead for modeling the quantum nature of the nuclei, the cost is still prohibitive when combined with advanced electronic structure methods. Here we present how multiple time step integrators can be combined with ring-polymer contraction techniques (effectively, multiple time stepping in imaginary time) to reduce virtually to zero the overhead of modelling nuclear quantum effects, while describing inter-atomic forces at high levels of electronic structure theory. This is demonstrated for a combination of MP2 and semi-local DFT applied to the Zundel cation. The approach can be seamlessly combined with other methods to reduce the computational cost of path integral calculations, such as high-order factorizations of the Boltzmann operator or generalized Langevin equation thermostats.

  20. A model to estimate the cost effectiveness of the indoorenvironment improvements in office work

    SciTech Connect

    Seppanen, Olli; Fisk, William J.

    2004-06-01

    A deteriorated indoor climate is commonly related to increases in sick building syndrome symptoms, respiratory illnesses, sick leave, reduced comfort and losses in productivity. The cost of a deteriorated indoor climate to society is high; some calculations show that it exceeds the heating energy costs of the same buildings. Building-level calculations have also shown that many measures taken to improve indoor air quality and climate are cost-effective when the potential monetary savings resulting from an improved indoor climate are included as benefits. As an initial step towards systemizing these building-level calculations, we have developed a conceptual model to estimate the cost-effectiveness of various measures. The model shows the links between improvements in the indoor environment and the following potential financial benefits: reduced medical care costs, reduced sick leave, better work performance, lower employee turnover, and lower building maintenance costs due to fewer complaints about indoor air quality and climate. The pathways from changes in building technology and practices to these potential benefits go via several human responses to the indoor environment, such as infectious diseases, allergies and asthma, sick building syndrome (SBS) symptoms, perceived air quality, and the thermal environment. The model also includes the annual cost of investments, operating costs, and the cost savings of an improved indoor climate. The conceptual model illustrates how the various factors are linked to each other. SBS symptoms are probably the most commonly assessed health responses in IEQ studies and have been linked to several characteristics of buildings and IEQ. While the available evidence indicates that SBS symptoms can affect these outcomes and suggests that such a linkage exists, at present we cannot quantify the relationships sufficiently for cost-benefit modeling. New research and analyses of existing data to quantify the financial
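
    A minimal sketch of the building-level bookkeeping such a model formalizes, with every figure hypothetical: annualize the investment with a capital recovery factor, then compare yearly costs against the yearly value of reduced sick leave, improved work performance and lower maintenance.

        def annualized_capital(investment, rate=0.05, years=15):
            """Capital recovery factor: A = P * r(1+r)^n / ((1+r)^n - 1)."""
            f = (1 + rate) ** years
            return investment * rate * f / (f - 1)

        def net_annual_benefit(investment, operating_cost, sick_leave_savings,
                               productivity_gain, maintenance_savings):
            costs = annualized_capital(investment) + operating_cost
            benefits = sick_leave_savings + productivity_gain + maintenance_savings
            return benefits - costs

        # Hypothetical ventilation upgrade for a 100-person office
        print(net_annual_benefit(250_000, 8_000, 30_000, 45_000, 5_000))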