Science.gov

Sample records for accurate cost estimate

  1. How accurately can we estimate energetic costs in a marine top predator, the king penguin?

    PubMed

    Halsey, Lewis G; Fahlman, Andreas; Handrich, Yves; Schmidt, Alexander; Woakes, Anthony J; Butler, Patrick J

    2007-01-01

    King penguins (Aptenodytes patagonicus) are one of the greatest consumers of marine resources. However, while their influence on the marine ecosystem is likely to be significant, only an accurate knowledge of their energy demands will indicate their true food requirements. Energy consumption has been estimated for many marine species using the heart rate-rate of oxygen consumption (fH-VO2) technique, and the technique has been applied successfully to answer eco-physiological questions. However, previous studies on the energetics of king penguins, based on developing or applying this technique, have raised a number of issues about the degree of validity of the technique for this species. These include the predictive validity of the present fH-VO2 equations across different seasons and individuals and during different modes of locomotion. In many cases, these issues also apply to other species for which the fH-VO2 technique has been applied. In the present study, the accuracy of three prediction equations for king penguins was investigated based on validity studies and on estimates of VO2 from published, field fH data. The major conclusions from the present study are: (1) in contrast to that for walking, the fH-VO2 relationship for swimming king penguins is not affected by body mass; (2) prediction equation (1), log(VO2) = -0.279 + 1.24*log(fH) + 0.0237*t - 0.0157*log(fH)*t, derived in a previous study, is the most suitable equation presently available for estimating VO2 in king penguins for all locomotory and nutritional states. A number of possible problems associated with producing an fH-VO2 relationship are discussed in the present study. Finally, a statistical method to include easy-to-measure morphometric characteristics, which may improve the accuracy of fH-VO2 prediction equations, is explained. PMID:17363231
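
    As an illustration of how such a prediction equation is applied to field heart-rate data, the sketch below simply evaluates equation (1) as reproduced in this abstract. It is only a sketch: the logarithm base, the units of VO2 and fH, and the meaning of the covariate t (not defined in this record) follow the original paper, and the example values are hypothetical.

        import math

        def estimate_vo2(f_h, t):
            """Equation (1) from the abstract:
            log(VO2) = -0.279 + 1.24*log(fH) + 0.0237*t - 0.0157*log(fH)*t
            Logarithms are assumed to be base 10; fH is heart rate and t is the
            covariate used in the original study (not defined in this record)."""
            log_vo2 = (-0.279 + 1.24 * math.log10(f_h)
                       + 0.0237 * t - 0.0157 * math.log10(f_h) * t)
            return 10 ** log_vo2

        # Hypothetical field record: mean heart rate of 120 beats per minute, t = 10.
        print(estimate_vo2(120.0, 10.0))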

  2. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1983-01-01

    Alternatives to sampling-theory stratified and regression estimators of crop production and timber biomass were examined. An alternative estimator that is viewed as especially promising is the errors-in-variables regression estimator. Investigations established the need for caution with this estimator when the ratio of the two error variances is not precisely known.

  3. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1985-01-01

    Research activities conducted under the auspices of National Aeronautics and Space Administration Cooperative Agreement NCC 9-9 are discussed. During this contract period, research efforts were concentrated in two primary areas. The first area is an investigation of the use of measurement error models as alternatives to least squares regression estimators of crop production or timber biomass. The second primary area of investigation is the estimation of the mixing proportion of two-component mixture models. This report lists publications, technical reports, submitted manuscripts, and oral presentations generated by these research efforts. Possible areas of future research are mentioned.

  4. Profitable capitation requires accurate costing.

    PubMed

    West, D A; Hicks, L L; Balas, E A; West, T D

    1996-01-01

    In the name of costing accuracy, nurses are asked to track inventory use on a per-treatment basis, while more significant costs, such as general overhead and nursing salaries, are usually allocated to patients or treatments on an average cost basis. Accurate treatment costing and financial viability require analysis of all resources actually consumed in treatment delivery, including nursing services and inventory. More precise costing information enables more profitable decisions, as is demonstrated by comparing the ratio-of-cost-to-treatment method (aggregate costing) with alternative activity-based costing (ABC) methods. Nurses must participate in this costing process to assure that capitation bids are based upon accurate costs rather than simple averages. PMID:8788799
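
    To make the contrast concrete, the sketch below compares an aggregate (average-cost) allocation with a simple activity-based allocation for two hypothetical treatments. All figures and activity drivers are invented for illustration and are not from the article.

        # Hypothetical cost pools for one nursing unit (illustrative numbers only).
        overhead_pool = 50_000.0                   # general overhead, allocated per treatment
        nursing_pool = 90_000.0                    # nursing salaries, driven by nursing minutes
        supply_costs = {"A": 40.0, "B": 140.0}     # tracked inventory per treatment type

        treatments = {"A": 800, "B": 200}          # volume of each treatment type
        nursing_minutes = {"A": 20.0, "B": 90.0}   # nursing time per treatment

        total_treatments = sum(treatments.values())
        total_minutes = sum(n * nursing_minutes[t] for t, n in treatments.items())

        for t in treatments:
            # Aggregate costing: every treatment absorbs the same average overhead + nursing cost.
            aggregate = (overhead_pool + nursing_pool) / total_treatments + supply_costs[t]
            # Activity-based costing: nursing cost follows the nursing-minutes driver.
            abc = (overhead_pool / total_treatments
                   + nursing_pool * nursing_minutes[t] / total_minutes
                   + supply_costs[t])
            print(f"Treatment {t}: aggregate = {aggregate:.2f}, activity-based = {abc:.2f}")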

  5. Ramjet cost estimating handbook

    NASA Technical Reports Server (NTRS)

    Emmons, H. T.; Norwood, D. L.; Rasmusen, J. E.; Reynolds, H. E.

    1978-01-01

    Research conducted under Air Force Contract F33615-76-C-2043 to generate cost data and to establish a cost methodology that accurately predicts the production costs of ramjet engines is presented. The cost handbook contains a description of over one hundred and twenty-five different components, which are defined as baseline components. The cost estimator selects from the handbook the appropriate components to fit his ramjet assembly, computes the cost from cost computation data sheets in the handbook, and totals all of the appropriate cost elements to arrive at the total engine cost. The methodology described in the cost handbook addresses many different ramjet types from simple podded arrangements of the liquid fuel ramjet to the more complex integral rocket/ramjet configurations including solid fuel ramjets and solid ducted rockets. It is applicable to a range of sizes from 6 in. to 18 in. in diameter and to production quantities up to 5000 engines.

  6. Price and cost estimation

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.

    1979-01-01

    Price and Cost Estimating Program (PACE II) was developed to prepare man-hour and material cost estimates. This versatile and flexible tool significantly reduces computation time and errors, as well as the typing and reproduction time involved in preparing cost estimates.

  7. Estimating Airline Operating Costs

    NASA Technical Reports Server (NTRS)

    Maddalon, D. V.

    1978-01-01

    The factors affecting commercial aircraft operating and delay costs were used to develop an airline operating cost model which includes a method for estimating the labor and material costs of individual airframe maintenance systems. The model permits estimates of aircraft-related costs (e.g., aircraft service, landing fees, flight attendants, and control fees). A method for estimating the costs of certain types of airline delay is also described.

  8. Estimating airline operating costs

    NASA Technical Reports Server (NTRS)

    Maddalon, D. V.

    1978-01-01

    A review was made of the factors affecting commercial aircraft operating and delay costs. From this work, an airline operating cost model was developed which includes a method for estimating the labor and material costs of individual airframe maintenance systems. The model, similar in some respects to the standard Air Transport Association of America (ATA) Direct Operating Cost Model, permits estimates of aircraft-related costs not now included in the standard ATA model (e.g., aircraft service, landing fees, flight attendants, and control fees). A study of the cost of aircraft delay was also made and a method for estimating the cost of certain types of airline delay is described.

  9. Product line cost estimation: a standard cost approach.

    PubMed

    Cooper, J C; Suver, J D

    1988-04-01

    Product line managers often must make decisions based on inaccurate cost information. A method is needed to determine costs more accurately. By using a standard costing model, product line managers can better estimate the cost of intermediate and end products, and hence better estimate the costs of the product line. PMID:10286385

  10. Spacecraft platform cost estimating relationships

    NASA Technical Reports Server (NTRS)

    Gruhl, W. M.

    1972-01-01

    The three main cost areas of unmanned satellite development are discussed. The areas are identified as: (1) the spacecraft platform (SCP), (2) the payload or experiments, and (3) the postlaunch ground equipment and operations. The SCP normally accounts for over half of the total project cost, and accurate estimates of SCP costs are required early in project planning as a basis for determining total project budget requirements. The development of single-formula SCP cost estimating relationships (CERs) from readily available data by statistical linear regression analysis is described. The advantages of single-formula CERs are presented.
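
    The record describes deriving a cost estimating relationship by linear regression on historical platform data. A sketch of that idea, fitting a single-formula CER of the form cost = a * mass^b by least squares in log-log space, is shown below; the data points are invented for illustration and the functional form is an assumption, not the formula from the 1972 study.

        import numpy as np

        # Hypothetical historical platforms: (dry mass in kg, development cost in $M).
        mass = np.array([250.0, 400.0, 650.0, 900.0, 1300.0])
        cost = np.array([38.0, 55.0, 80.0, 105.0, 150.0])

        # Fit log10(cost) = log10(a) + b * log10(mass) by ordinary least squares.
        b, log_a = np.polyfit(np.log10(mass), np.log10(cost), deg=1)
        a = 10 ** log_a
        print(f"CER: cost = {a:.2f} * mass^{b:.2f}")

        # Apply the CER to a new platform concept (hypothetical 750 kg platform).
        print(f"Estimated cost for 750 kg platform: {a * 750.0 ** b:.1f} $M")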

  11. Cost-Estimation Program

    NASA Technical Reports Server (NTRS)

    Cox, Brian

    1995-01-01

    COSTIT computer program estimates cost of electronic design by reading item-list file and file containing cost for each item. Accuracy of cost estimate based on accuracy of cost-list file. Written by use of AWK utility for Sun4-series computers running SunOS 4.x and IBM PC-series and compatible computers running MS-DOS. The Sun version is NPO-19587; the PC version is NPO-19157.
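
    COSTIT itself is an AWK program; the short Python sketch below merely illustrates the same idea of joining an item list against a price list and totaling the result. The file names, formats, and column headings are hypothetical, not those of the NASA program.

        import csv

        def total_cost(item_list_path: str, cost_list_path: str) -> float:
            """Sum the cost of every item in the item list using unit prices from
            the cost list (CSV columns assumed: part,quantity / part,unit_cost)."""
            with open(cost_list_path, newline="") as f:
                unit_cost = {row["part"]: float(row["unit_cost"]) for row in csv.DictReader(f)}
            total = 0.0
            with open(item_list_path, newline="") as f:
                for row in csv.DictReader(f):
                    # Accuracy of the estimate depends entirely on the cost-list file.
                    total += int(row["quantity"]) * unit_cost[row["part"]]
            return total

        # Example usage (hypothetical files): print(total_cost("items.csv", "costs.csv"))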

  12. Updated Conceptual Cost Estimating

    NASA Technical Reports Server (NTRS)

    Brown, J. A.

    1987-01-01

    16-page report discusses development and use of NASA TR-1508, the Kennedy Space Center Aerospace Construction Price Book for preparing conceptual, budget, funding, cost-estimating, and preliminary cost-engineering reports. Updated annually from 1974 through 1985 with actual bid prices and government estimates. Includes labor and material quantities and prices with contractor and subcontractor markups for buildings, facilities, and systems at Kennedy Space Center. While data pertains to aerospace facilities, format and cost-estimating techniques guide estimation of costs in other construction applications.

  13. The Psychology of Cost Estimating

    NASA Technical Reports Server (NTRS)

    Price, Andy

    2016-01-01

    Cost estimation for large (and even not so large) government programs is a challenge. The number and magnitude of cost overruns associated with large Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) programs highlight the difficulties in developing and promulgating accurate cost estimates. These overruns can be the result of inadequate technology readiness or requirements definition, the whims of politicians or government bureaucrats, or even failures of the cost estimating profession itself. However, there may be another reason for cost overruns that is right in front of us, but only recently have we begun to grasp it: the fact that cost estimators and their customers are human. The last 70+ years of research into human psychology and behavioral economics have yielded amazing findings about how we humans process and use information to make judgments and decisions. What these scientists have uncovered is surprising: humans are often irrational and illogical beings, making decisions based on factors such as emotion and perception, rather than facts and data. These built-in biases in our thinking directly affect how we develop our cost estimates and how those cost estimates are used. We cost estimators can use this knowledge of biases to improve our cost estimates and also to improve how we communicate and work with our customers. By understanding how our customers think, and more importantly, why they think the way they do, we can have more productive relationships and greater influence. By using psychology to our advantage, we can more effectively help the decision maker and our organizations make fact-based decisions.

  14. Capital cost estimate

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The capital cost estimate for the nuclear process heat source (NPHS) plant was made by: (1) using costs from the current commercial HTGR for electricity production as a base for items that are essentially the same and (2) development of new estimates for modified or new equipment that is specifically for the process heat application. Results are given in tabular form and cover the total investment required for each process temperature studied.

  15. A better approach to cost estimation.

    PubMed

    Richmond, Russ

    2013-03-01

    Using ratios of costs to charges (RCCs) to estimate costs can cause hospitals to significantly over- or under-invest in service lines. A focus on improving cost estimation in cost centers where physicians have significant control over operating expenses, such as drugs or implants, can strengthen decision making and strategic planning. Connecting patient file information to purchasing data can lead to more accurate reflections of actual costs and help hospitals gain better visibility across service lines.

  16. The high cost of accurate knowledge.

    PubMed

    Sutcliffe, Kathleen M; Weber, Klaus

    2003-05-01

    Many business thinkers believe it's the role of senior managers to scan the external environment to monitor contingencies and constraints, and to use that precise knowledge to modify the company's strategy and design. As these thinkers see it, managers need accurate and abundant information to carry out that role. According to that logic, it makes sense to invest heavily in systems for collecting and organizing competitive information. Another school of pundits contends that, since today's complex information often isn't precise anyway, it's not worth going overboard with such investments. In other words, it's not the accuracy and abundance of information that should matter most to top executives--rather, it's how that information is interpreted. After all, the role of senior managers isn't just to make decisions; it's to set direction and motivate others in the face of ambiguities and conflicting demands. Top executives must interpret information and communicate those interpretations--they must manage meaning more than they must manage information. So which of these competing views is the right one? Research conducted by academics Sutcliffe and Weber found that how accurate senior executives are about their competitive environments is indeed less important for strategy and corresponding organizational changes than the way in which they interpret information about their environments. Investments in shaping those interpretations, therefore, may create a more durable competitive advantage than investments in obtaining and organizing more information. And what kinds of interpretations are most closely linked with high performance? Their research suggests that high performers respond positively to opportunities, yet they aren't overconfident in their abilities to take advantage of those opportunities.

  17. Cost-estimating relationships for space programs

    NASA Technical Reports Server (NTRS)

    Mandell, Humboldt C., Jr.

    1992-01-01

    Cost-estimating relationships (CERs) are defined and discussed as they relate to the estimation of theoretical costs for space programs. The paper primarily addresses CERs based on analogous relationships between physical and performance parameters to estimate future costs. Analytical estimation principles are reviewed examining the sources of errors in cost models, and the use of CERs is shown to be affected by organizational culture. Two paradigms for cost estimation are set forth: (1) the Rand paradigm for single-culture single-system methods; and (2) the Price paradigms that incorporate a set of cultural variables. For space programs that are potentially subject to even small cultural changes, the Price paradigms are argued to be more effective. The derivation and use of accurate CERs is important for developing effective cost models to analyze the potential of a given space program.

  18. Manned Mars mission cost estimate

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph; Smith, Keith

    1986-01-01

    The potential costs of several options of a manned Mars mission are examined. A cost estimating methodology based primarily on existing Marshall Space Flight Center (MSFC) parametric cost models is summarized. These models include the MSFC Space Station Cost Model and the MSFC Launch Vehicle Cost Model as well as other models and techniques. The ground rules and assumptions of the cost estimating methodology are discussed and cost estimates presented for six potential mission options which were studied. The estimated manned Mars mission costs are compared to the cost of the somewhat analogous Apollo Program after normalizing the Apollo cost to the environment and ground rules of the manned Mars missions. It is concluded that a manned Mars mission, as currently defined, could be accomplished for under $30 billion in 1985 dollars excluding launch vehicle development and mission operations.

  19. ADP (Automated Data Processing) cost estimating heuristics

    SciTech Connect

    Sadlowe, A.R.; Arrowood, L.F.; Jones, K.A.; Emrich, M.L.; Watson, B.D.

    1987-09-11

    Artificial Intelligence, in particular expert systems methodologies, is being applied to the US Navy's Automated Data Processing estimating tasks. Skilled Navy project leaders are nearing retirement; replacements may not yet possess the many years of experience required to make accurate decisions regarding time, cost, equipment, and personnel needs. The potential departure of expertise resulted in the development of a system to capture organizational expertise. The prototype allows inexperienced project leaders to interactively generate cost estimates. 5 refs.

  20. Cost Estimating Handbook for Environmental Restoration

    SciTech Connect

    1990-09-01

    Environmental restoration (ER) projects have presented the DOE and cost estimators with a number of properties that are not comparable to the normal estimating climate within DOE. These properties include: An entirely new set of specialized expressions and terminology. A higher than normal exposure to cost and schedule risk, as compared to most other DOE projects, due to changing regulations, public involvement, resource shortages, and scope of work. A higher than normal percentage of indirect costs to the total estimated cost due primarily to record keeping, special training, liability, and indemnification. More than one estimate for a project, particularly in the assessment phase, in order to provide input into the evaluation of alternatives for the cleanup action. While some aspects of existing guidance for cost estimators will be applicable to environmental restoration projects, some components of the present guidelines will have to be modified to reflect the unique elements of these projects. The purpose of this Handbook is to assist cost estimators in the preparation of environmental restoration estimates for Environmental Restoration and Waste Management (EM) projects undertaken by DOE. The DOE has, in recent years, seen a significant increase in the number, size, and frequency of environmental restoration projects that must be costed by the various DOE offices. The coming years will show the EM program to be the largest non-weapons program undertaken by DOE. These projects create new and unique estimating requirements since historical cost and estimating precedents are meager at best. It is anticipated that this Handbook will enhance the quality of cost data within DOE in several ways by providing: The basis for accurate, consistent, and traceable baselines. Sound methodologies, guidelines, and estimating formats. Sources of cost data/databases and estimating tools and techniques available to DOE cost professionals.

  1. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay

    2005-01-01

    The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and d) a unique U.S. asset for science product validation and verification.

  2. 31 CFR 205.24 - How are accurate estimates maintained?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How are accurate estimates maintained... Treasury-State Agreement § 205.24 How are accurate estimates maintained? (a) If a State has knowledge that an estimate does not reasonably correspond to the State's cash needs for a Federal assistance...

  3. Micromagnetometer calibration for accurate orientation estimation.

    PubMed

    Zhang, Zhi-Qiang; Yang, Guang-Zhong

    2015-02-01

    Micromagnetometers, together with inertial sensors, are widely used for attitude estimation for a wide variety of applications. However, appropriate sensor calibration, which is essential to the accuracy of attitude reconstruction, must be performed in advance. Thus far, many different magnetometer calibration methods have been proposed to compensate for errors such as scale, offset, and nonorthogonality. They have also been used to obviate magnetic errors due to soft and hard iron. However, in order to combine the magnetometer with an inertial sensor for attitude reconstruction, the alignment difference between the magnetometer and the axes of the inertial sensor must be determined as well. This paper proposes a practical means of sensor error correction by simultaneous consideration of sensor errors, magnetic errors, and alignment difference. We take the summation of the offset and hard iron error as the combined bias and then amalgamate the alignment difference and all the other errors as a transformation matrix. A two-step approach is presented to determine the combined bias and transformation matrix separately. In the first step, the combined bias is determined by finding an optimal ellipsoid that can best fit the sensor readings. In the second step, the intrinsic relationships of the raw sensor readings are explored to estimate the transformation matrix as a homogeneous linear least-squares problem. Singular value decomposition is then applied to estimate both the transformation matrix and magnetic vector. The proposed method is then applied to calibrate our sensor node. Although there is no ground truth for the combined bias and transformation matrix for our node, the consistency of calibration results among different trials and less than 3° root mean square error for orientation estimation have been achieved, which illustrates the effectiveness of the proposed sensor calibration method for practical applications. PMID:25265625
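
    The first calibration step described above, finding the combined bias as the centre of an ellipsoid fitted to the raw readings, can be sketched as follows. For brevity the sketch fits a sphere (equal scale on all axes) rather than a general ellipsoid, so it is a simplified stand-in for the authors' method, and the readings are synthetic.

        import numpy as np

        def fit_combined_bias(readings: np.ndarray) -> np.ndarray:
            """Estimate the combined bias (offset + hard iron) as the centre of a
            sphere fitted to raw magnetometer readings by linear least squares.
            readings: (N, 3) array of raw measurements."""
            # ||m - b||^2 = r^2 rearranges to 2*m.b + (r^2 - ||b||^2) = ||m||^2,
            # which is linear in the unknowns (bx, by, bz, c).
            A = np.hstack([2.0 * readings, np.ones((readings.shape[0], 1))])
            y = np.sum(readings ** 2, axis=1)
            params, *_ = np.linalg.lstsq(A, y, rcond=None)
            return params[:3]   # fitted centre, i.e. the combined bias

        # Synthetic test: true bias (20, -35, 50), unit field, random directions.
        rng = np.random.default_rng(0)
        dirs = rng.normal(size=(500, 3))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        raw = dirs + np.array([20.0, -35.0, 50.0]) + 0.01 * rng.normal(size=(500, 3))
        print(fit_combined_bias(raw))   # should be close to [20, -35, 50]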

  4. Parametric cost estimation for space science missions

    NASA Astrophysics Data System (ADS)

    Lillie, Charles F.; Thompson, Bruce E.

    2008-07-01

    Cost estimation for space science missions is critically important in budgeting for successful missions. The process requires consideration of a number of parameters, where many of the values are only known to a limited accuracy. The results of cost estimation are not perfect, but must be calculated and compared with the estimates that the government uses for budgeting purposes. Uncertainties in the input parameters result from evolving requirements for missions that are typically the "first of a kind" with "state-of-the-art" instruments and new spacecraft and payload technologies that make it difficult to base estimates on the cost histories of previous missions. Even the cost of heritage avionics is uncertain due to parts obsolescence and the resulting redesign work. Through experience and use of industry best practices developed in participation with the Aerospace Industries Association (AIA), Northrop Grumman has developed a parametric modeling approach that can provide a reasonably accurate cost range and most probable cost for future space missions. During the initial mission phases, the approach uses mass- and power-based cost estimating relationships (CERs) developed with historical data from previous missions. In later mission phases, when the mission requirements are better defined, these estimates are updated with vendors' bids and "bottoms-up", "grass-roots" material and labor cost estimates based on detailed schedules and assigned tasks. In this paper we describe how we develop our CERs for parametric cost estimation and how they can be applied to estimate the costs for future space science missions like those presented to the Astronomy & Astrophysics Decadal Survey Study Committees.

  5. Solar power satellite cost estimate

    NASA Technical Reports Server (NTRS)

    Harron, R. J.; Wadle, R. C.

    1981-01-01

    The solar power configuration costed is the 5 GW silicon solar cell reference system. The subsystems are identified by work breakdown structure elements down to the lowest level for which cost information was generated. This breakdown divides into five sections: the satellite, construction, transportation, the ground receiving station and maintenance. For each work breakdown structure element, a definition, design description and cost estimate were included. An effort was made to include for each element a reference that more thoroughly describes the element and the method of costing used. All costs are in 1977 dollars.

  6. Progress Toward Automated Cost Estimation

    NASA Technical Reports Server (NTRS)

    Brown, Joseph A.

    1992-01-01

    Report discusses efforts to develop standard system of automated cost estimation (ACE) and computer-aided design (CAD). Advantage of system is time saved and accuracy enhanced by automating extraction of quantities from design drawings, consultation of price lists, and application of cost and markup formulas.

  7. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1994-01-01

    NASA is responsible for developing much of the nation's future space technology. Cost estimates for new programs are required early in the planning process so that decisions can be made accurately. Because of the long lead times required to develop space hardware, the cost estimates are frequently required 10 to 15 years before the program delivers hardware. The system design in conceptual phases of a program is usually only vaguely defined, and the technology used is often state-of-the-art or beyond. These factors combine to make cost estimating for conceptual programs very challenging. This paper describes an effort to develop parametric cost estimating methods for space systems in the conceptual design phase. The approach is to identify variables that drive cost such as weight, quantity, development culture, design inheritance and time. The nature of the relationships between the driver variables and cost will be discussed. In particular, the relationship between weight and cost will be examined in detail. A theoretical model of cost will be developed and tested statistically against a historical database of major research and development projects.

  8. Accurate Orientation Estimation Using AHRS under Conditions of Magnetic Distortion

    PubMed Central

    Yadav, Nagesh; Bleakley, Chris

    2014-01-01

    Low cost, compact attitude heading reference systems (AHRS) are now being used to track human body movements in indoor environments by estimation of the 3D orientation of body segments. In many of these systems, heading estimation is achieved by monitoring the strength of the Earth's magnetic field. However, the Earth's magnetic field can be locally distorted due to the proximity of ferrous and/or magnetic objects. Herein, we propose a novel method for accurate 3D orientation estimation using an AHRS, comprised of an accelerometer, gyroscope and magnetometer, under conditions of magnetic field distortion. The system performs online detection and compensation for magnetic disturbances, due to, for example, the presence of ferrous objects. The magnetic distortions are detected by exploiting variations in magnetic dip angle, relative to the gravity vector, and in magnetic strength. We investigate and show the advantages of using both magnetic strength and magnetic dip angle for detecting the presence of magnetic distortions. The correction method is based on a particle filter, which performs the correction using an adaptive cost function and by adapting the variance during particle resampling, so as to place more emphasis on the results of dead reckoning of the gyroscope measurements and less on the magnetometer readings. The proposed method was tested in an indoor environment in the presence of various magnetic distortions and under various accelerations (up to 3 g). In the experiments, the proposed algorithm achieves <2° static peak-to-peak error and <5° dynamic peak-to-peak error, significantly outperforming previous methods. PMID:25347584
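
    The detection step described above, flagging magnetic distortion from changes in field magnitude and in the dip angle between the magnetic vector and gravity, can be sketched as below. The thresholds and reference values are arbitrary placeholders, and the correction itself (the particle filter) is not reproduced here.

        import numpy as np

        REF_STRENGTH = 50.0   # expected local field magnitude (e.g. microtesla), placeholder
        REF_DIP_DEG = 65.0    # expected angle between field and gravity, placeholder
        STRENGTH_TOL = 5.0    # allowed deviation before flagging distortion
        DIP_TOL_DEG = 5.0

        def is_distorted(mag: np.ndarray, accel: np.ndarray) -> bool:
            """Flag a magnetic disturbance when either the field strength or the
            dip angle (relative to the gravity vector from the accelerometer)
            deviates too far from its reference value."""
            strength = np.linalg.norm(mag)
            cos_angle = np.dot(mag, accel) / (strength * np.linalg.norm(accel))
            dip = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
            return (abs(strength - REF_STRENGTH) > STRENGTH_TOL
                    or abs(dip - REF_DIP_DEG) > DIP_TOL_DEG)

        # Undistorted sample vs. a sample near a ferrous object (synthetic values).
        print(is_distorted(np.array([45.3, 0.0, 21.1]), np.array([0.0, 0.0, 9.81])))   # False
        print(is_distorted(np.array([60.0, 10.0, 80.0]), np.array([0.0, 0.0, 9.81])))  # True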

  9. Quantifying Accurate Calorie Estimation Using the "Think Aloud" Method

    ERIC Educational Resources Information Center

    Holmstrup, Michael E.; Stearns-Bruening, Kay; Rozelle, Jeffrey

    2013-01-01

    Objective: Clients often have limited time in a nutrition education setting. An improved understanding of the strategies used to accurately estimate calories may help to identify areas of focused instruction to improve nutrition knowledge. Methods: A "Think Aloud" exercise was recorded during the estimation of calories in a standard dinner meal…

  10. Accurate Parameter Estimation for Unbalanced Three-Phase System

    PubMed Central

    Chen, Yuan

    2014-01-01

    Smart grid is an intelligent power generation and control console in modern electricity networks, where the unbalanced three-phase power system is the commonly used model. Here, parameter estimation for this system is addressed. After converting the three-phase waveforms into a pair of orthogonal signals via the αβ-transformation, the nonlinear least squares (NLS) estimator is developed for accurately finding the frequency, phase, and voltage parameters. The estimator is realized by the Newton-Raphson scheme, whose global convergence is studied in this paper. Computer simulations show that the mean square error performance of NLS method can attain the Cramér-Rao lower bound. Moreover, our proposal provides more accurate frequency estimation when compared with the complex least mean square (CLMS) and augmented CLMS. PMID:25162056

  11. Accurate parameter estimation for unbalanced three-phase system.

    PubMed

    Chen, Yuan; So, Hing Cheung

    2014-01-01

    Smart grid is an intelligent power generation and control console in modern electricity networks, where the unbalanced three-phase power system is the commonly used model. Here, parameter estimation for this system is addressed. After converting the three-phase waveforms into a pair of orthogonal signals via the αβ-transformation, the nonlinear least squares (NLS) estimator is developed for accurately finding the frequency, phase, and voltage parameters. The estimator is realized by the Newton-Raphson scheme, whose global convergence is studied in this paper. Computer simulations show that the mean square error performance of NLS method can attain the Cramér-Rao lower bound. Moreover, our proposal provides more accurate frequency estimation when compared with the complex least mean square (CLMS) and augmented CLMS.
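
    A minimal sketch of the estimation idea in the two records above follows: the three phase voltages are converted to a pair of orthogonal signals by the (amplitude-invariant) αβ-transformation, and the frequency is then found by nonlinear least squares, refined with a Newton-Raphson iteration on the concentrated cost function. This is a simplified, balanced-system illustration with numerical derivatives, not the authors' exact estimator; the signal values are synthetic.

        import numpy as np

        fs, n = 5000.0, np.arange(1000)                 # sample rate (Hz), sample index
        f_true, amp, phase = 50.2, 10.0, 0.3            # synthetic ground truth
        t = n / fs
        va = amp * np.cos(2 * np.pi * f_true * t + phase)
        vb = amp * np.cos(2 * np.pi * f_true * t + phase - 2 * np.pi / 3)
        vc = amp * np.cos(2 * np.pi * f_true * t + phase + 2 * np.pi / 3)

        # Amplitude-invariant alpha-beta (Clarke) transformation.
        v_alpha = (2.0 / 3.0) * (va - 0.5 * vb - 0.5 * vc)
        v_beta = (1.0 / np.sqrt(3.0)) * (vb - vc)
        s = v_alpha + 1j * v_beta                        # complex orthogonal signal

        def cost(f):
            """Concentrated least-squares cost: project s onto exp(j*2*pi*f*t)
            (amplitude and phase solved linearly) and return the residual power."""
            e = np.exp(1j * 2 * np.pi * f * t)
            c = np.vdot(e, s) / len(s)                   # optimal complex amplitude
            return np.sum(np.abs(s - c * e) ** 2)

        # Coarse search followed by Newton-Raphson refinement on dJ/df = 0.
        f = 45.0 + 0.5 * np.argmin([cost(45.0 + 0.5 * k) for k in range(21)])
        h = 1e-3
        for _ in range(20):
            d1 = (cost(f + h) - cost(f - h)) / (2 * h)
            d2 = (cost(f + h) - 2 * cost(f) + cost(f - h)) / h ** 2
            f -= d1 / d2
        print(f"estimated frequency: {f:.4f} Hz (true {f_true} Hz)")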

  12. Estimating the Costs of Preventive Interventions

    ERIC Educational Resources Information Center

    Foster, E. Michael; Porter, Michele M.; Ayers, Tim S.; Kaplan, Debra L.; Sandler, Irwin

    2007-01-01

    The goal of this article is to improve the practice and reporting of cost estimates of prevention programs. It reviews the steps in estimating the costs of an intervention and the principles that should guide estimation. The authors then review prior efforts to estimate intervention costs using a sample of well-known but diverse studies. Finally,…

  13. 40 CFR 261.142 - Cost estimate.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Cost estimate. 261.142 Section 261.142... Materials § 261.142 Cost estimate. (a) The owner or operator must have a detailed written estimate, in... facility. (1) The estimate must equal the cost of conducting the activities described in paragraph (a)...

  14. An Accurate Link Correlation Estimator for Improving Wireless Protocol Performance

    PubMed Central

    Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun

    2015-01-01

    Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation. PMID:25686314

  15. An accurate link correlation estimator for improving wireless protocol performance.

    PubMed

    Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun

    2015-02-12

    Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation.

  16. Software Development Cost Estimation Executive Summary

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus M.; Menzies, Tim

    2006-01-01

    Identify simple fully validated cost models that provide estimation uncertainty with cost estimate. Based on COCOMO variable set. Use machine learning techniques to determine: a) Minimum number of cost drivers required for NASA domain based cost models; b) Minimum number of data records required and c) Estimation Uncertainty. Build a repository of software cost estimation information. Coordinating tool development and data collection with: a) Tasks funded by PA&E Cost Analysis; b) IV&V Effort Estimation Task and c) NASA SEPG activities.

  17. Accurate estimation of sigma(exp 0) using AIRSAR data

    NASA Technical Reports Server (NTRS)

    Holecz, Francesco; Rignot, Eric

    1995-01-01

    During recent years signature analysis, classification, and modeling of Synthetic Aperture Radar (SAR) data as well as estimation of geophysical parameters from SAR data have received a great deal of interest. An important requirement for the quantitative use of SAR data is the accurate estimation of the backscattering coefficient sigma(exp 0). In terrain with relief variations, radar signals are distorted due to the projection of the scene topography into the slant range-Doppler plane. The effect of these variations is to change the physical size of the scattering area, leading to errors in the radar backscatter values and incidence angle. For this reason the local incidence angle, derived from sensor position and Digital Elevation Model (DEM) data, must always be considered. Especially in the airborne case, the antenna gain pattern can be an additional source of radiometric error, because the radar look angle is not known precisely as a result of the aircraft motions and the local surface topography. Consequently, radiometric distortions due to the antenna gain pattern must also be corrected for each resolution cell, by taking into account aircraft displacements (position and attitude) and position of the backscatter element, defined by the DEM data. In this paper, a method to derive an accurate estimation of the backscattering coefficient using NASA/JPL AIRSAR data is presented. The results are evaluated in terms of geometric accuracy, radiometric variations of sigma(exp 0), and precision of the estimated forest biomass.
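
    As a small illustration of the geometric part of such a correction, the sketch below computes the local incidence angle from a DEM-derived surface normal and the radar look vector, and applies a very simple sine-law area normalization. This is a generic, simplified correction for illustration only, not the calibration procedure actually used with the AIRSAR data; vectors and values are synthetic.

        import numpy as np

        def local_incidence_angle(look_vec: np.ndarray, normal: np.ndarray) -> float:
            """Angle (radians) between the radar look direction (sensor-to-ground)
            and the local surface normal derived from the DEM."""
            cos_inc = np.dot(-look_vec, normal) / (np.linalg.norm(look_vec) * np.linalg.norm(normal))
            return np.arccos(np.clip(cos_inc, -1.0, 1.0))

        def area_normalized_sigma0(sigma0_flat: float, theta_local: float, theta_ref: float) -> float:
            """Crude scattering-area normalization: rescale the backscatter estimated
            under a flat-terrain assumption by the ratio of sines of the local and
            reference incidence angles."""
            return sigma0_flat * np.sin(theta_local) / np.sin(theta_ref)

        # Synthetic example: 40 deg off-nadir look, 10 deg slope facing the radar.
        look = np.array([np.sin(np.radians(40.0)), 0.0, -np.cos(np.radians(40.0))])
        normal = np.array([-np.sin(np.radians(10.0)), 0.0, np.cos(np.radians(10.0))])
        theta_loc = local_incidence_angle(look, normal)
        print(np.degrees(theta_loc))                          # ~30 deg: the slope faces the radar
        print(area_normalized_sigma0(0.05, theta_loc, np.radians(40.0)))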

  18. Accurate photometric redshift probability density estimation - method comparison and application

    NASA Astrophysics Data System (ADS)

    Rau, Markus Michael; Seitz, Stella; Brimioulle, Fabrice; Frank, Eibe; Friedrich, Oliver; Gruen, Daniel; Hoyle, Ben

    2015-10-01

    We introduce an ordinal classification algorithm for photometric redshift estimation, which significantly improves the reconstruction of photometric redshift probability density functions (PDFs) for individual galaxies and galaxy samples. As a use case we apply our method to CFHTLS galaxies. The ordinal classification algorithm treats distinct redshift bins as ordered values, which improves the quality of photometric redshift PDFs, compared with non-ordinal classification architectures. We also propose a new single value point estimate of the galaxy redshift, which can be used to estimate the full redshift PDF of a galaxy sample. This method is competitive in terms of accuracy with contemporary algorithms, which stack the full redshift PDFs of all galaxies in the sample, but requires orders of magnitude less storage space. The methods described in this paper greatly improve the log-likelihood of individual object redshift PDFs, when compared with a popular neural network code (ANNZ). In our use case, this improvement reaches 50 per cent for high-redshift objects (z ≥ 0.75). We show that using these more accurate photometric redshift PDFs will lead to a reduction in the systematic biases by up to a factor of 4, when compared with less accurate PDFs obtained from commonly used methods. The cosmological analyses we examine and find improvement upon are the following: gravitational lensing cluster mass estimates, modelling of angular correlation functions and modelling of cosmic shear correlation functions.

  19. Hydrogen Station Cost Estimates: Comparing Hydrogen Station Cost Calculator Results with other Recent Estimates

    SciTech Connect

    Melaina, M.; Penev, M.

    2013-09-01

    This report compares hydrogen station cost estimates conveyed by expert stakeholders through the Hydrogen Station Cost Calculator (HSCC) to a select number of other cost estimates. These other cost estimates include projections based upon cost models and costs associated with recently funded stations.

  20. Cost Estimates for Federal Student Loans: The Market Cost Debate

    ERIC Educational Resources Information Center

    Delisle, Jason

    2008-01-01

    In an ongoing debate about the relative costs of the federal government's direct and guaranteed student loan programs, some budget experts and private lenders have argued for the use of "market cost" estimates. They assert that official government cost estimates for federal student loans differ from what private entities would likely charge…

  1. Accurate Satellite-Derived Estimates of Tropospheric Ozone Radiative Forcing

    NASA Technical Reports Server (NTRS)

    Joiner, Joanna; Schoeberl, Mark R.; Vasilkov, Alexander P.; Oreopoulos, Lazaros; Platnick, Steven; Livesey, Nathaniel J.; Levelt, Pieternel F.

    2008-01-01

    Estimates of the radiative forcing due to anthropogenically-produced tropospheric O3 are derived primarily from models. Here, we use tropospheric ozone and cloud data from several instruments in the A-train constellation of satellites as well as information from the GEOS-5 Data Assimilation System to accurately estimate the instantaneous radiative forcing from tropospheric O3 for January and July 2005. We improve upon previous estimates of tropospheric ozone mixing ratios from a residual approach using the NASA Earth Observing System (EOS) Aura Ozone Monitoring Instrument (OMI) and Microwave Limb Sounder (MLS) by incorporating cloud pressure information from OMI. Since we cannot distinguish between natural and anthropogenic sources with the satellite data, our estimates reflect the total forcing due to tropospheric O3. We focus specifically on the magnitude and spatial structure of the cloud effect on both the short- and long-wave radiative forcing. The estimates presented here can be used to validate present day O3 radiative forcing produced by models.

  2. Accurate, low-cost 3D-models of gullies

    NASA Astrophysics Data System (ADS)

    Onnen, Nils; Gronz, Oliver; Ries, Johannes B.; Brings, Christine

    2015-04-01

    … are able to produce accurate and low-cost 3D-models of gullies.

  3. Accurate estimators of correlation functions in Fourier space

    NASA Astrophysics Data System (ADS)

    Sefusatti, E.; Crocce, M.; Scoccimarro, R.; Couchman, H. M. P.

    2016-08-01

    Efficient estimators of Fourier-space statistics for large numbers of objects rely on fast Fourier transforms (FFTs), which are affected by aliasing from unresolved small-scale modes due to the finite FFT grid. Aliasing takes the form of a sum over images, each of them corresponding to the Fourier content displaced by increasing multiples of the sampling frequency of the grid. These spurious contributions limit the accuracy in the estimation of Fourier-space statistics, and are typically ameliorated by simultaneously increasing grid size and discarding high-frequency modes. This results in inefficient estimates for, e.g., the power spectrum when desired systematic biases are well below the per cent level. We show that using interlaced grids removes odd images, which include the dominant contribution to aliasing. In addition, we discuss the choice of interpolation kernel used to define density perturbations on the FFT grid and demonstrate that using higher order interpolation kernels than the standard Cloud-In-Cell algorithm results in significant reduction of the remaining images. We show that combining fourth-order interpolation with interlacing gives very accurate Fourier amplitudes and phases of density perturbations. This results in power spectrum and bispectrum estimates that have systematic biases below 0.01 per cent all the way to the Nyquist frequency of the grid, thus maximizing the use of unbiased Fourier coefficients for a given grid size and greatly reducing systematics for applications to large cosmological data sets.
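
    A one-dimensional sketch of the interlacing idea discussed above follows: particles are assigned to a grid with the Cloud-In-Cell kernel, the assignment is repeated with the particles shifted by half a grid cell, and the two Fourier transforms are combined with the appropriate phase factor so that the odd aliasing images cancel. This is an illustrative toy implementation (no window deconvolution), not the estimator code used in the paper.

        import numpy as np

        def cic_density(x, n_grid, box):
            """1D Cloud-In-Cell assignment of particle positions x onto n_grid cells."""
            h = box / n_grid
            rho = np.zeros(n_grid)
            idx = np.floor(x / h).astype(int)
            frac = x / h - idx
            np.add.at(rho, idx % n_grid, 1.0 - frac)
            np.add.at(rho, (idx + 1) % n_grid, frac)
            return rho

        def interlaced_delta_k(x, n_grid, box):
            """Fourier modes of the density contrast with interlacing: combine the
            standard grid with one computed from positions shifted by half a cell,
            using the phase factor exp(i k h / 2) to cancel odd aliasing images."""
            h = box / n_grid
            mean = len(x) / n_grid
            d1 = np.fft.fft(cic_density(x, n_grid, box) / mean - 1.0)
            d2 = np.fft.fft(cic_density((x + 0.5 * h) % box, n_grid, box) / mean - 1.0)
            k = 2.0 * np.pi * np.fft.fftfreq(n_grid, d=h)
            return 0.5 * (d1 + np.exp(1j * k * h / 2.0) * d2)

        # Toy example: random particles in a unit box, 64-cell grid.
        rng = np.random.default_rng(1)
        delta_k = interlaced_delta_k(rng.random(10_000), 64, 1.0)
        print(np.abs(delta_k[:5]))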

  4. Outer planet probe cost estimates: First impressions

    NASA Technical Reports Server (NTRS)

    Niehoff, J.

    1974-01-01

    An examination was made of early estimates of outer planetary atmospheric probe cost by comparing the estimates with past planetary projects. Of particular interest is identification of project elements which are likely cost drivers for future probe missions. Data are divided into two parts: first, the description of a cost model developed by SAI for the Planetary Programs Office of NASA, and second, use of this model and its data base to evaluate estimates of probe costs. Several observations are offered in conclusion regarding the credibility of current estimates and specific areas of the outer planet probe concept most vulnerable to cost escalation.

  5. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    The development of parametric cost estimating methods for advanced space systems in the conceptual design phase is discussed. The process of identifying variables which drive cost and the relationship between weight and cost are discussed. A theoretical model of cost is developed and tested using a historical data base of research and development projects.

  6. Process-based Cost Estimation for Ramjet/Scramjet Engines

    NASA Technical Reports Server (NTRS)

    Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John

    2003-01-01

    Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottoms-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As the product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically, the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
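
    Since the record names the analytic hierarchy process (AHP) as the decision-analysis technique incorporated in the model, a minimal, generic AHP weighting step is sketched below: a pairwise-comparison matrix is reduced to criterion weights via its principal eigenvector and checked for consistency. The criteria and judgments are placeholders, not values from the NASA GRC/Boeing model.

        import numpy as np

        # Hypothetical pairwise comparisons for three criteria: cost-risk, performance, schedule.
        # A[i, j] expresses how strongly criterion i is preferred to criterion j (Saaty 1-9 scale).
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        i_max = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, i_max].real)
        weights /= weights.sum()                       # principal eigenvector -> criterion weights

        n = A.shape[0]
        lambda_max = eigvals[i_max].real
        ci = (lambda_max - n) / (n - 1)                # consistency index
        cr = ci / 0.58                                 # random index for n = 3 is 0.58
        print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))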

  7. Accurate heart rate estimation from camera recording via MUSIC algorithm.

    PubMed

    Fouladi, Seyyed Hamed; Balasingham, Ilangko; Ramstad, Tor Audun; Kansanen, Kimmo

    2015-01-01

    In this paper, we propose an algorithm to extract the heart rate frequency from a video camera using the Multiple SIgnal Classification (MUSIC) algorithm. This leads to improved accuracy of the estimated heart rate frequency in cases where the performance is limited by the number of samples and frame rate. Monitoring vital signs remotely can be exploited for both non-contact physiological and psychological diagnosis. The color variation recorded by ordinary cameras is used for heart rate monitoring. The orthogonality between signal space and noise space is used to find a more accurate heart rate frequency in comparison with traditional methods. It is shown via experimental results that the limitation of previous methods can be overcome by using subspace methods. PMID:26738015
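
    A compact sketch of how MUSIC can pick out a heart-rate frequency from a slowly varying colour signal is given below: overlapping snapshots of the signal form a covariance matrix, its noise subspace is obtained by eigendecomposition, and the pseudospectrum is scanned over a physiological frequency range. This is a generic MUSIC illustration on synthetic data, not the authors' pipeline.

        import numpy as np

        fs = 30.0                                   # camera frame rate (Hz)
        t = np.arange(0, 30.0, 1.0 / fs)            # 30 s of the mean colour signal
        hr_hz = 1.2                                 # synthetic "heart rate" of 72 bpm
        signal = (np.cos(2 * np.pi * hr_hz * t)
                  + 0.5 * np.random.default_rng(0).normal(size=t.size))

        # Covariance matrix from overlapping snapshots of the 1-D series.
        m = 40                                      # snapshot length
        snaps = np.array([signal[i:i + m] for i in range(signal.size - m)])
        R = snaps.T @ snaps / snaps.shape[0]

        # Noise subspace: all but the p strongest eigenvectors (p = 2 for one real sinusoid).
        p = 2
        eigvals, eigvecs = np.linalg.eigh(R)
        En = eigvecs[:, :m - p]                     # eigh returns eigenvalues in ascending order

        # MUSIC pseudospectrum over a physiological band (40-180 beats per minute).
        freqs = np.linspace(40.0, 180.0, 1401) / 60.0
        n = np.arange(m)
        pseudo = []
        for f in freqs:
            a = np.exp(1j * 2 * np.pi * f * n / fs)            # steering vector
            pseudo.append(1.0 / np.real(a.conj() @ En @ En.conj().T @ a))
        best = freqs[int(np.argmax(pseudo))]
        print(f"estimated heart rate: {best * 60.0:.1f} bpm")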

  8. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the limitations of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority shortcomings within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities

  9. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities

  10. Parametric Cost Estimates for an International Competitive Edge

    SciTech Connect

    Murphy, L.T.

    2006-07-01

    This paper summarizes the progress to date by CH2M HILL and the UKAEA in development of a parametric modelling capability for estimating the costs of large nuclear decommissioning projects in the United Kingdom (UK) and Europe. The ability to successfully apply parametric cost estimating techniques will be a key factor to commercial success in the UK and European multi-billion dollar waste management, decommissioning and environmental restoration markets. The most useful parametric models will be those that incorporate individual components representing major elements of work: reactor decommissioning, fuel cycle facility decommissioning, waste management facility decommissioning and environmental restoration. Models must be sufficiently robust to estimate indirect costs and overheads, permit pricing analysis and adjustment, and accommodate the intricacies of international monetary exchange, currency fluctuations and contingency. The development of a parametric cost estimating capability is also a key component in building a forward estimating strategy. The forward estimating strategy will enable the preparation of accurate and cost-effective out-year estimates, even when work scope is poorly defined or as yet indeterminate. Preparation of cost estimates for work outside the organizations' current sites, for which detailed measurement is not possible and historical cost data does not exist, will also be facilitated. (authors)

  11. A Cost Estimation Tool for Charter Schools

    ERIC Educational Resources Information Center

    Hayes, Cheryl D.; Keller, Eric

    2009-01-01

    To align their financing strategies and fundraising efforts with their fiscal needs, charter school leaders need to know how much funding they need and what that funding will support. This cost estimation tool offers a simple set of worksheets to help start-up charter school operators identify and estimate the range of costs and timing of…

  12. Demystifying the Cost Estimation Process

    ERIC Educational Resources Information Center

    Obi, Samuel C.

    2010-01-01

    In manufacturing today, nothing is more important than giving a customer a clear and straight-forward accounting of what their money has purchased. Many potentially promising return business orders are lost because of unclear, ambiguous, or improper billing. One of the best ways of resolving cost bargaining conflicts is by providing a…

  13. Contractor-style tunnel cost estimating

    SciTech Connect

    Scapuzzi, D. )

    1990-06-01

    Keeping pace with recent advances in construction technology is a challenge for the cost estimating engineer. Using an estimating style that simulates the actual construction process and is similar in style to the contractor's estimate will give a realistic view of underground construction costs. For a contractor-style estimate, a mining method is chosen; labor crews, plant and equipment are selected, and advance rates are calculated for the various phases of work which are used to determine the length of time necessary to complete each phase of work. The durations are multiplied by the cost of labor and equipment per unit of time and, along with the costs for materials and supplies, combined to complete the estimate. Variations in advance rates, ground support, labor crew size, or other areas are more easily analyzed for their overall effect on the cost and schedule of a project. 14 figs.

  14. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    Parametric cost estimating methods for space systems in the conceptual design phase are developed. The approach is to identify variables that drive cost such as weight, quantity, development culture, design inheritance, and time. The relationship between weight and cost is examined in detail. A theoretical model of cost is developed and tested statistically against a historical data base of major research and development programs. It is concluded that the technique presented is sound, but that it must be refined in order to produce acceptable cost estimates.

  15. A Framework for Automating Cost Estimates in Assembly Processes

    SciTech Connect

    Calton, T.L.; Peters, R.R.

    1998-12-09

    When a product concept emerges, the manufacturing engineer is asked to sketch out a production strategy and estimate its cost. The engineer is given an initial product design, along with a schedule of expected production volumes. The engineer then determines the best approach to manufacturing the product, comparing a variety of alternative production strategies. The engineer must consider capital cost, operating cost, lead-time, and other issues in an attempt to maximize profits. After making these basic choices and sketching the design of overall production, the engineer produces estimates of the required capital, operating costs, and production capacity. This process may iterate as the product design is refined in order to improve its performance or manufacturability. The focus of this paper is on the development of computer tools to aid manufacturing engineers in their decision-making processes. This computer software tool provides a framework in which accurate cost estimates can be seamlessly derived from design requirements at the start of any engineering project. The result is faster cycle times through first-pass success; lower life cycle cost due to requirements-driven design and accurate cost estimates derived early in the process.

  16. An approach to software cost estimation

    NASA Technical Reports Server (NTRS)

    Mcgarry, F.; Page, J.; Card, D.; Rohleder, M.; Church, V.

    1984-01-01

    A general procedure for software cost estimation in any environment is outlined. The basic concepts of work and effort estimation are explained, some popular resource estimation models are reviewed, and the accuracy of source estimates is discussed. A software cost prediction procedure based on the experiences of the Software Engineering Laboratory in the flight dynamics area and incorporating management expertise, cost models, and historical data is described. The sources of information and relevant parameters available during each phase of the software life cycle are identified. The methodology suggested incorporates these elements into a customized management tool for software cost prediction. Detailed guidelines for estimation in the flight dynamics environment developed using this methodology are presented.

  17. Estimating the costs of human space exploration

    NASA Technical Reports Server (NTRS)

    Mandell, Humboldt C., Jr.

    1994-01-01

    The plan for NASA's new exploration initiative has the following strategic themes: (1) incremental, logical evolutionary development; (2) economic viability; and (3) excellence in management. The cost estimation process is involved with all of these themes and they are completely dependent upon the engineering cost estimator for success. The purpose is to articulate the issues associated with beginning this major new government initiative, to show how NASA intends to resolve them, and finally to demonstrate the vital importance of a leadership role by the cost estimation community.

  18. Process Equipment Cost Estimation, Final Report

    SciTech Connect

    H.P. Loh; Jennifer Lyons; Charles W. White, III

    2002-01-01

    This report presents generic cost curves for several equipment types generated using ICARUS Process Evaluator. The curves give Purchased Equipment Cost as a function of a capacity variable. This work was performed to assist NETL engineers and scientists in performing rapid, order of magnitude level cost estimates or as an aid in evaluating the reasonableness of cost estimates submitted with proposed systems studies or proposals for new processes. The specific equipment types contained in this report were selected to represent a relatively comprehensive set of conventional chemical process equipment types.

  19. Statistical methods of estimating mining costs

    USGS Publications Warehouse

    Long, K.R.

    2011-01-01

    Until it was defunded in 1995, the U.S. Bureau of Mines maintained a Cost Estimating System (CES) for prefeasibility-type economic evaluations of mineral deposits and estimating costs at producing and non-producing mines. This system had a significant role in mineral resource assessments to estimate costs of developing and operating known mineral deposits and predicted undiscovered deposits. For legal reasons, the U.S. Geological Survey cannot update and maintain CES. Instead, statistical tools are under development to estimate mining costs from basic properties of mineral deposits such as tonnage, grade, mineralogy, depth, strip ratio, distance from infrastructure, rock strength, and work index. The first step was to reestimate "Taylor's Rule" which relates operating rate to available ore tonnage. The second step was to estimate statistical models of capital and operating costs for open pit porphyry copper mines with flotation concentrators. For a sample of 27 proposed porphyry copper projects, capital costs can be estimated from three variables: mineral processing rate, strip ratio, and distance from nearest railroad before mine construction began. Of all the variables tested, operating costs were found to be significantly correlated only with strip ratio.
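
    Taylor's Rule, which the record says was re-estimated as a first step, has the general form rate = a * tonnage^b. The sketch below shows how such a rule could be re-fitted by log-log regression; the deposit data and the resulting coefficients are invented for illustration, not the USGS sample.

    ```python
    import numpy as np

    # Hypothetical deposits: ore tonnage (t) and observed operating rate (t/day)
    tonnage = np.array([2e6, 8e6, 3e7, 1e8, 4e8])
    rate = np.array([1.1e3, 3.0e3, 8.5e3, 2.1e4, 6.0e4])

    # Taylor's Rule has the form rate = a * tonnage^b; re-estimate a and b
    b, log_a = np.polyfit(np.log(tonnage), np.log(rate), 1)
    a = np.exp(log_a)
    print(f"rate = {a:.4f} * tonnage^{b:.2f}")   # the classic Taylor exponent is ~0.75
    ```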

  20. Estimating archiving costs for engineering records

    SciTech Connect

    Stutz, R.A.; Lamartine, B.C.

    1997-02-01

Information technology has completely changed the concept of record keeping for engineering projects -- the advent of digital records was a momentous discovery, as significant as the invention of the printing press. Digital records allowed huge amounts of information to be stored in a very small space and to be examined quickly. However, digital documents are much more vulnerable to the passage of time than printed documents because the media on which they are stored are easily affected by physical phenomena, such as magnetic fields, oxidation, material decay, and by various environmental factors that may erase the information. Even more important, digital information becomes obsolete because, even if future generations may be able to read it, they may not necessarily be able to interpret it. Engineering projects of all sizes are becoming more dependent on digital records. These records are created on computers used in design, estimating, construction management, and construction. The necessity for the accurate and accessible storage of these documents, generated by computer software systems, is increasing for a number of reasons including legal and environmental issues. This paper will discuss media life considerations and life cycle costs associated with several methods of storing engineering records.

  1. The free energy cost of accurate biochemical oscillations

    PubMed Central

    Cao, Yuansheng; Wang, Hongli; Ouyang, Qi; Tu, Yuhai

    2015-01-01

    Oscillation is an important cellular process that regulates timing of different vital life cycles. However, in the noisy cellular environment, oscillations can be highly inaccurate due to phase fluctuations. It remains poorly understood how biochemical circuits suppress phase fluctuations and what is the incurred thermodynamic cost. Here, we study three different types of biochemical oscillations representing three basic oscillation motifs shared by all known oscillatory systems. In all the systems studied, we find that the phase diffusion constant depends on the free energy dissipation per period following the same inverse relation parameterized by system specific constants. This relationship and its range of validity are shown analytically in a model of noisy oscillation. Microscopically, we find that the oscillation is driven by multiple irreversible cycles that hydrolyze the fuel molecules such as ATP; the number of phase coherent periods is proportional to the free energy consumed per period. Experimental evidence in support of this general relationship and testable predictions are also presented. PMID:26566392

  2. The free-energy cost of accurate biochemical oscillations

    NASA Astrophysics Data System (ADS)

    Cao, Yuansheng; Wang, Hongli; Ouyang, Qi; Tu, Yuhai

    2015-09-01

    Oscillations within the cell regulate the timing of many important life cycles. However, in this noisy environment, oscillations can be highly inaccurate owing to phase fluctuations. It remains poorly understood how biochemical circuits suppress these phase fluctuations and what is the incurred thermodynamic cost. Here, we study three different types of biochemical oscillation, representing three basic oscillation motifs shared by all known oscillatory systems. In all the systems studied, we find that the phase diffusion constant depends on the free-energy dissipation per period, following the same inverse relation parameterized by system-specific constants. This relationship and its range of validity are shown analytically in a model of noisy oscillation. Microscopically, we find that the oscillation is driven by multiple irreversible cycles that hydrolyse fuel molecules such as ATP; the number of phase coherent periods is proportional to the free energy consumed per period. Experimental evidence in support of this general relationship and testable predictions are also presented.

  3. COST ESTIMATING EQUATIONS FOR BEST MANAGEMENT PRACTICES

    EPA Science Inventory

    This paper describes the development of an interactive internet-based cost-estimating tool for commonly used urban storm runoff best management practices (BMP), including: retention and detention ponds, grassed swales, and constructed wetlands. The paper presents the cost data, c...

  4. Fast and Accurate Learning When Making Discrete Numerical Estimates.

    PubMed

    Sanborn, Adam N; Beierholm, Ulrik R

    2016-04-01

    Many everyday estimation tasks have an inherently discrete nature, whether the task is counting objects (e.g., a number of paint buckets) or estimating discretized continuous variables (e.g., the number of paint buckets needed to paint a room). While Bayesian inference is often used for modeling estimates made along continuous scales, discrete numerical estimates have not received as much attention, despite their common everyday occurrence. Using two tasks, a numerosity task and an area estimation task, we invoke Bayesian decision theory to characterize how people learn discrete numerical distributions and make numerical estimates. Across three experiments with novel stimulus distributions we found that participants fell between two common decision functions for converting their uncertain representation into a response: drawing a sample from their posterior distribution and taking the maximum of their posterior distribution. While this was consistent with the decision function found in previous work using continuous estimation tasks, surprisingly the prior distributions learned by participants in our experiments were much more adaptive: When making continuous estimates, participants have required thousands of trials to learn bimodal priors, but in our tasks participants learned discrete bimodal and even discrete quadrimodal priors within a few hundred trials. This makes discrete numerical estimation tasks good testbeds for investigating how people learn and make estimates. PMID:27070155
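
    The two decision functions described in this record, drawing a sample from the posterior versus taking its maximum, can be contrasted on a small discrete example. The sketch below is a hypothetical illustration with an assumed bimodal prior and a Poisson observation model; it is not the authors' experimental setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Discrete prior over possible counts (hypothetical bimodal prior)
    support = np.arange(1, 21)
    prior = np.where((support == 5) | (support == 15), 4.0, 1.0)
    prior /= prior.sum()

    # Noisy observation model: observed count ~ Poisson(true count)
    observed = 12
    likelihood = np.exp(-support) * support.astype(float) ** observed  # unnormalized
    posterior = prior * likelihood
    posterior /= posterior.sum()

    # The two decision functions discussed in the abstract:
    map_estimate = support[np.argmax(posterior)]        # maximum of the posterior
    sampled_estimate = rng.choice(support, p=posterior)  # sample from the posterior
    print(map_estimate, sampled_estimate)
    ```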

  5. Fast and Accurate Learning When Making Discrete Numerical Estimates.

    PubMed

    Sanborn, Adam N; Beierholm, Ulrik R

    2016-04-01

    Many everyday estimation tasks have an inherently discrete nature, whether the task is counting objects (e.g., a number of paint buckets) or estimating discretized continuous variables (e.g., the number of paint buckets needed to paint a room). While Bayesian inference is often used for modeling estimates made along continuous scales, discrete numerical estimates have not received as much attention, despite their common everyday occurrence. Using two tasks, a numerosity task and an area estimation task, we invoke Bayesian decision theory to characterize how people learn discrete numerical distributions and make numerical estimates. Across three experiments with novel stimulus distributions we found that participants fell between two common decision functions for converting their uncertain representation into a response: drawing a sample from their posterior distribution and taking the maximum of their posterior distribution. While this was consistent with the decision function found in previous work using continuous estimation tasks, surprisingly the prior distributions learned by participants in our experiments were much more adaptive: When making continuous estimates, participants have required thousands of trials to learn bimodal priors, but in our tasks participants learned discrete bimodal and even discrete quadrimodal priors within a few hundred trials. This makes discrete numerical estimation tasks good testbeds for investigating how people learn and make estimates.

  6. Fast and Accurate Learning When Making Discrete Numerical Estimates

    PubMed Central

    Sanborn, Adam N.; Beierholm, Ulrik R.

    2016-01-01

    Many everyday estimation tasks have an inherently discrete nature, whether the task is counting objects (e.g., a number of paint buckets) or estimating discretized continuous variables (e.g., the number of paint buckets needed to paint a room). While Bayesian inference is often used for modeling estimates made along continuous scales, discrete numerical estimates have not received as much attention, despite their common everyday occurrence. Using two tasks, a numerosity task and an area estimation task, we invoke Bayesian decision theory to characterize how people learn discrete numerical distributions and make numerical estimates. Across three experiments with novel stimulus distributions we found that participants fell between two common decision functions for converting their uncertain representation into a response: drawing a sample from their posterior distribution and taking the maximum of their posterior distribution. While this was consistent with the decision function found in previous work using continuous estimation tasks, surprisingly the prior distributions learned by participants in our experiments were much more adaptive: When making continuous estimates, participants have required thousands of trials to learn bimodal priors, but in our tasks participants learned discrete bimodal and even discrete quadrimodal priors within a few hundred trials. This makes discrete numerical estimation tasks good testbeds for investigating how people learn and make estimates. PMID:27070155

  7. Cost estimation model for advanced planetary programs, fourth edition

    NASA Technical Reports Server (NTRS)

    Spadoni, D. J.

    1983-01-01

The development of the planetary program cost model is discussed. The Model was updated to incorporate cost data from the most recent US planetary flight projects and extensively revised to more accurately capture the information in the historical cost data base. This data base comprises the historical cost data for 13 unmanned lunar and planetary flight programs. The revision was made with a twofold objective: to increase the flexibility of the model in its ability to deal with the broad scope of scenarios under consideration for future missions, and to maintain and possibly improve upon the confidence in the model's capabilities with an expected accuracy of 20%. The Model development included a labor/cost proxy analysis, selection of the functional forms of the estimating relationships, and test statistics. An analysis of the Model is discussed and two sample applications of the cost model are presented.

  8. Bioaccessibility tests accurately estimate bioavailability of lead to quail

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb, we incorporated Pb-contaminated soils or Pb acetate into diets for Japanese quail (Coturnix japonica), fed the quail for 15 days, and ...

  9. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE BIOAVAILABILITY OF LEAD TO QUAIL

    EPA Science Inventory

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contami...

  10. Cost Estimation in Engineer-to-Order Manufacturing

    NASA Astrophysics Data System (ADS)

    Hooshmand, Yousef; Köhler, Peter; Korff-Krumm, Andrea

    2016-02-01

In Engineer-to-Order (ETO) manufacturing the price of products must be defined during early stages of product design and during the bidding process, so an overestimation of product development (PD) costs may lead to the loss of orders and an underestimation causes a profit loss. What many ETO systems have in common is that the products have to be developed based on different customer requirements, so that each order usually results in a new variant. Furthermore, many customer requirement change-requests may arise in different phases of the PD, which have to be considered properly. Thus it is of utmost importance for ETO systems to have an accurate cost estimation in the first stages of product design and to be able to determine the cost of customer requirement changes in different phases of PD. This paper aims to present a cost estimation methodology as well as a cost estimation model, which estimate the cost of products by relative comparison of the attributes of new product variants with the attributes of standard product variants. In addition, as a necessity in ETO manufacturing, the cost calculation of customer requirement changes in different phases of PD is integrated in the presented method.
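
    One plausible reading of the relative-comparison idea in this record is to scale a standard variant's cost by attribute ratios raised to assumed weights. The sketch below illustrates that reading; the attributes, weights, and baseline cost are hypothetical, and the paper's actual model may differ.

    ```python
    # Minimal sketch of a relative-comparison estimate: the cost of a new variant
    # is scaled from a standard variant by ratios of cost-driving attributes.
    # The attribute list, weights, and baseline cost are all hypothetical.

    standard = {"cost": 120_000.0,
                "attributes": {"mass": 850.0, "parts": 240, "power": 55.0}}
    weights = {"mass": 0.5, "parts": 0.3, "power": 0.2}   # assumed relative influence

    def estimate_variant_cost(new_attributes, standard=standard, weights=weights):
        cost = standard["cost"]
        for name, weight in weights.items():
            ratio = new_attributes[name] / standard["attributes"][name]
            cost *= ratio ** weight
        return cost

    print(estimate_variant_cost({"mass": 1000.0, "parts": 260, "power": 70.0}))
    ```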

  11. Some insight on censored cost estimators.

    PubMed

    Zhao, H; Cheng, Y; Bang, H

    2011-08-30

Censored survival data analysis has been studied for many years. Yet, the analysis of censored mark variables, such as medical cost, quality-adjusted lifetime, and repeated events, faces a unique challenge that makes standard survival analysis techniques invalid. Because of the 'informative' censorship embedded in censored mark variables, the use of the Kaplan-Meier (Journal of the American Statistical Association 1958; 53:457-481) estimator, as an example, will produce biased estimates. Innovative estimators have been developed in the past decade in order to handle this issue. Even though consistent estimators have been proposed, the formulations and interpretations of some estimators are less intuitive to practitioners. On the other hand, more intuitive estimators have been proposed, but their mathematical properties have not been established. In this paper, we prove the analytic identity between some estimators (a statistically motivated estimator and an intuitive estimator) for censored cost data. Efron (1967) made a similar investigation for censored survival data (between the Kaplan-Meier estimator and the redistribute-to-the-right algorithm). Therefore, we view our study as an extension of Efron's work to informatively censored data so that our findings could be applied to other marked variables. PMID:21748774

  12. Novel serologic biomarkers provide accurate estimates of recent Plasmodium falciparum exposure for individuals and communities

    PubMed Central

    Helb, Danica A.; Tetteh, Kevin K. A.; Felgner, Philip L.; Skinner, Jeff; Hubbard, Alan; Arinaitwe, Emmanuel; Mayanja-Kizza, Harriet; Ssewanyana, Isaac; Kamya, Moses R.; Beeson, James G.; Tappero, Jordan; Smith, David L.; Crompton, Peter D.; Rosenthal, Philip J.; Dorsey, Grant; Drakeley, Christopher J.; Greenhouse, Bryan

    2015-01-01

    Tools to reliably measure Plasmodium falciparum (Pf) exposure in individuals and communities are needed to guide and evaluate malaria control interventions. Serologic assays can potentially produce precise exposure estimates at low cost; however, current approaches based on responses to a few characterized antigens are not designed to estimate exposure in individuals. Pf-specific antibody responses differ by antigen, suggesting that selection of antigens with defined kinetic profiles will improve estimates of Pf exposure. To identify novel serologic biomarkers of malaria exposure, we evaluated responses to 856 Pf antigens by protein microarray in 186 Ugandan children, for whom detailed Pf exposure data were available. Using data-adaptive statistical methods, we identified combinations of antibody responses that maximized information on an individual’s recent exposure. Responses to three novel Pf antigens accurately classified whether an individual had been infected within the last 30, 90, or 365 d (cross-validated area under the curve = 0.86–0.93), whereas responses to six antigens accurately estimated an individual’s malaria incidence in the prior year. Cross-validated incidence predictions for individuals in different communities provided accurate stratification of exposure between populations and suggest that precise estimates of community exposure can be obtained from sampling a small subset of that community. In addition, serologic incidence predictions from cross-sectional samples characterized heterogeneity within a community similarly to 1 y of continuous passive surveillance. Development of simple ELISA-based assays derived from the successful selection strategy outlined here offers the potential to generate rich epidemiologic surveillance data that will be widely accessible to malaria control programs. PMID:26216993

  13. 48 CFR 1552.216-76 - Estimated cost and cost-sharing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Estimated cost and cost... 1552.216-76 Estimated cost and cost-sharing. As prescribed in 1516.307(c), insert the following clause: Estimated Cost and Cost-Sharing (APR 1996) (a) The total estimated cost of performing the work under...

  14. 48 CFR 1852.216-73 - Estimated cost and cost sharing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Estimated cost and cost... and Clauses 1852.216-73 Estimated cost and cost sharing. As prescribed in 1816.307-70(a), insert the following clause: Estimated Cost and Cost Sharing (DEC 1991) (a) It is estimated that the total cost...

  15. Cost estimate of electricity produced by TPV

    NASA Astrophysics Data System (ADS)

    Palfinger, Günther; Bitnar, Bernd; Durisch, Wilhelm; Mayor, Jean-Claude; Grützmacher, Detlev; Gobrecht, Jens

    2003-05-01

A crucial parameter for the market penetration of TPV is its electricity production cost. In this work a detailed cost estimate is performed for a Si photocell based TPV system, which was developed for electrically self-powered operation of a domestic heating system. The results are compared to a rough estimate of the cost of electricity for a projected GaSb based system. For the calculation of the price of electricity, a lifetime of 20 years, an interest rate of 4.25% per year and maintenance costs of 1% of the investment are presumed. To determine the production cost of TPV systems with a power of 12-20 kW, the costs of the TPV components and 100 EUR per kW(el, peak) for assembly and miscellaneous were estimated. Alternatively, the system cost for the GaSb system was derived from the cost of the photocells and from the assumption that they account for 35% of the total system cost. The calculation was done for four different TPV scenarios: a Si based prototype system with existing technology (η_sys = 1.0%), leading to 3000 EUR per kW(el, peak); an optimized Si based system using conventional, available technology (η_sys = 1.5%), leading to 900 EUR per kW(el, peak); a further improved system with future technology (η_sys = 5%), leading to 340 EUR per kW(el, peak); and a GaSb based system (η_sys = 12.3% with recuperator), leading to 1900 EUR per kW(el, peak). Thus, prices of electricity from 6 to 25 EUR cents per kWh(el) (including gas at about 3.5 EUR cents per kWh) were calculated and compared with those of fuel cells (31 EUR cents per kWh) and gas engines (23 EUR cents per kWh).
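
    The electricity prices quoted above can be reproduced approximately with a standard capital-recovery calculation using the stated financing assumptions (20-year lifetime, 4.25% interest, maintenance at 1% of investment). The sketch below is illustrative only; the full-load hours and the fuel cost term are assumptions, not values from the study.

    ```python
    # Rough levelized-cost sketch using the financing assumptions in the abstract.
    # System price, operating hours, and fuel cost are illustrative placeholders.

    def electricity_cost_eur_per_kwh(system_cost_per_kw, full_load_hours=5000,
                                     lifetime_years=20, interest=0.0425,
                                     maintenance_frac=0.01, fuel_eur_per_kwh=0.035):
        # capital recovery factor converts the investment into an annual payment
        crf = (interest * (1 + interest) ** lifetime_years
               / ((1 + interest) ** lifetime_years - 1))
        annual_cost = system_cost_per_kw * (crf + maintenance_frac)
        return annual_cost / full_load_hours + fuel_eur_per_kwh

    for price in (3000, 900, 340, 1900):   # the four scenario prices, EUR per kW(el, peak)
        print(price, round(electricity_cost_eur_per_kwh(price), 3))
    ```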

  16. Support to LANL: Cost estimation. Final report

    SciTech Connect

    Not Available

    1993-10-04

    This report summarizes the activities and progress by ICF Kaiser Engineers conducted on behalf of Los Alamos National Laboratories (LANL) for the US Department of Energy, Office of Waste Management (EM-33) in the area of improving methods for Cost Estimation. This work was conducted between October 1, 1992 and September 30, 1993. ICF Kaiser Engineers supported LANL in providing the Office of Waste Management with planning and document preparation services for a Cost and Schedule Estimating Guide (Guide). The intent of the Guide was to use Activity-Based Cost (ABC) estimation as a basic method in preparing cost estimates for DOE planning and budgeting documents, including Activity Data Sheets (ADSs), which form the basis for the Five Year Plan document. Prior to the initiation of the present contract with LANL, ICF Kaiser Engineers was tasked to initiate planning efforts directed toward a Guide. This work, accomplished from June to September, 1992, included visits to eight DOE field offices and consultation with DOE Headquarters staff to determine the need for a Guide, the desired contents of a Guide, and the types of ABC estimation methods and documentation requirements that would be compatible with current or potential practices and expertise in existence at DOE field offices and their contractors.

  17. Does more accurate exposure prediction necessarily improve health effect estimates?

    PubMed

    Szpiro, Adam A; Paciorek, Christopher J; Sheppard, Lianne

    2011-09-01

    A unique challenge in air pollution cohort studies and similar applications in environmental epidemiology is that exposure is not measured directly at subjects' locations. Instead, pollution data from monitoring stations at some distance from the study subjects are used to predict exposures, and these predicted exposures are used to estimate the health effect parameter of interest. It is usually assumed that minimizing the error in predicting the true exposure will improve health effect estimation. We show in a simulation study that this is not always the case. We interpret our results in light of recently developed statistical theory for measurement error, and we discuss implications for the design and analysis of epidemiologic research.

  18. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.

  19. PACE 2: Pricing and Cost Estimating Handbook

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.; Shepherd, T.

    1977-01-01

    An automatic data processing system to be used for the preparation of industrial engineering type manhour and material cost estimates has been established. This computer system has evolved into a highly versatile and highly flexible tool which significantly reduces computation time, eliminates computational errors, and reduces typing and reproduction time for estimators and pricers since all mathematical and clerical functions are automatic once basic inputs are derived.

  20. Accurate feature detection and estimation using nonlinear and multiresolution analysis

    NASA Astrophysics Data System (ADS)

    Rudin, Leonid; Osher, Stanley

    1994-11-01

    A program for feature detection and estimation using nonlinear and multiscale analysis was completed. The state-of-the-art edge detection was combined with multiscale restoration (as suggested by the first author) and robust results in the presence of noise were obtained. Successful applications to numerous images of interest to DOD were made. Also, a new market in the criminal justice field was developed, based in part, on this work.

  1. Simulation model accurately estimates total dietary iodine intake.

    PubMed

    Verkaik-Kloosterman, Janneke; van 't Veer, Pieter; Ocké, Marga C

    2009-07-01

    One problem with estimating iodine intake is the lack of detailed data about the discretionary use of iodized kitchen salt and iodization of industrially processed foods. To be able to take into account these uncertainties in estimating iodine intake, a simulation model combining deterministic and probabilistic techniques was developed. Data from the Dutch National Food Consumption Survey (1997-1998) and an update of the Food Composition database were used to simulate 3 different scenarios: Dutch iodine legislation until July 2008, Dutch iodine legislation after July 2008, and a potential future situation. Results from studies measuring iodine excretion during the former legislation are comparable with the iodine intakes estimated with our model. For both former and current legislation, iodine intake was adequate for a large part of the Dutch population, but some young children (<5%) were at risk of intakes that were too low. In the scenario of a potential future situation using lower salt iodine levels, the percentage of the Dutch population with intakes that were too low increased (almost 10% of young children). To keep iodine intakes adequate, salt iodine levels should not be decreased, unless many more foods will contain iodized salt. Our model should be useful in predicting the effects of food reformulation or fortification on habitual nutrient intakes.

  2. Cost estimate of initial SSC experimental equipment

    SciTech Connect

    1986-06-01

    The cost of the initial detector complement at recently constructed colliding beam facilities (or at those under construction) has been a significant fraction of the cost of the accelerator complex. Because of the complexity of large modern-day detectors, the time-scale for their design and construction is comparable to the time-scale needed for accelerator design and construction. For these reasons it is appropriate to estimate the cost of the anticipated detector complement in parallel with the cost estimates of the collider itself. The fundamental difficulty with this procedure is that, whereas a firm conceptual design of the collider does exist, comparable information is unavailable for the detectors. Traditionally, these have been built by the high energy physics user community according to their perception of the key scientific problems that need to be addressed. The role of the accelerator laboratory in that process has involved technical and managerial coordination and the allocation of running time and local facilities among the proposed experiments. It seems proper that the basic spirit of experimentation reflecting the scientific judgment of the community should be preserved at the SSC. Furthermore, the formal process of initiation of detector proposals can only start once the SSC has been approved as a construction project and a formal laboratory administration put in place. Thus an ad hoc mechanism had to be created to estimate the range of potential detector needs, potential detector costs, and associated computing equipment.

  3. Unmanned Aerial Vehicles unique cost estimating requirements

    NASA Astrophysics Data System (ADS)

    Malone, P.; Apgar, H.; Stukes, S.; Sterk, S.

Unmanned Aerial Vehicles (UAVs), also referred to as drones, are aerial platforms that fly without a human pilot onboard. UAVs are controlled autonomously by a computer in the vehicle or under the remote control of a pilot stationed at a fixed ground location. There are a wide variety of drone shapes, sizes, configurations, complexities, and characteristics. Use of these devices by the Department of Defense (DoD), NASA, and civil and commercial organizations continues to grow. UAVs are commonly used for intelligence, surveillance, and reconnaissance (ISR). They are also used for combat operations and civil applications, such as firefighting, non-military security work, and surveillance of infrastructure (e.g. pipelines, power lines and country borders). UAVs are often preferred for missions that require sustained persistence (over 4 hours in duration), or are “too dangerous, dull or dirty” for manned aircraft. Moreover, they can offer significant acquisition and operations cost savings over traditional manned aircraft. Because of these unique characteristics and missions, UAV estimates require some unique estimating methods. This paper describes a framework for estimating UAV systems total ownership cost including hardware components, software design, and operations. The challenge of collecting data, testing the sensitivities of cost drivers, and creating cost estimating relationships (CERs) for each key work breakdown structure (WBS) element is discussed. The autonomous operation of UAVs is especially challenging from a software perspective.

  4. Estimating Teacher Turnover Costs: A Case Study

    ERIC Educational Resources Information Center

    Levy, Abigail Jurist; Joy, Lois; Ellis, Pamela; Jablonski, Erica; Karelitz, Tzur M.

    2012-01-01

    High teacher turnover in large U.S. cities is a critical issue for schools and districts, and the students they serve; but surprisingly little work has been done to develop methodologies and standards that districts and schools can use to make reliable estimates of turnover costs. Even less is known about how to detect variations in turnover costs…

  5. PROCEDURE FOR ESTIMATING PERMANENT TOTAL ENCLOSURE COSTS

    EPA Science Inventory

    The paper discusses a procedure for estimating permanent total enclosure (PTE) costs. (NOTE: Industries that use add-on control devices must adequately capture emissions before delivering them to the control device. One way to capture emissions is to use PTEs, enclosures that mee...

  6. Hydrogen from coal cost estimation guidebook

    NASA Technical Reports Server (NTRS)

    Billings, R. E.

    1981-01-01

    In an effort to establish baseline information whereby specific projects can be evaluated, a current set of parameters which are typical of coal gasification applications was developed. Using these parameters a computer model allows researchers to interrelate cost components in a sensitivity analysis. The results make possible an approximate estimation of hydrogen energy economics from coal, under a variety of circumstances.

  7. Software Estimates Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Smith, C. L.

    2003-01-01

Simulation-Based Cost Model (SiCM), a discrete event simulation developed in Extend, simulates pertinent aspects of the testing of rocket propulsion test articles for the purpose of estimating the costs of such testing during time intervals specified by its users. A user enters input data for control of simulations; information on the nature of, and activity in, a given testing project; and information on resources. Simulation objects are created on the basis of this input. Costs of the engineering-design, construction, and testing phases of a given project are estimated from the numbers and labor rates of engineers and technicians employed in each phase; the duration of each phase; the costs of materials used in each phase; and, for the testing phase, the rate of maintenance of the testing facility. The three main outputs of SiCM are (1) a curve, updated at each iteration of the simulation, that shows overall expenditures vs. time during the interval specified by the user; (2) a histogram of the total costs from all iterations of the simulation; and (3) a table displaying means and variances of cumulative costs for each phase from all iterations. Other outputs include spending curves for each phase.

  8. 48 CFR 1352.216-70 - Estimated and allowable costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Estimated and allowable costs. As prescribed in 48 CFR 1316.307(a), insert the following clause: Estimated and Allowable Costs (APR 2010) (a) Estimated Costs. The estimated cost of this contract is... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Estimated and...

  9. Bioaccessibility tests accurately estimate bioavailability of lead to quail

    USGS Publications Warehouse

    Beyer, W. Nelson; Basta, Nicholas T; Chaney, Rufus L.; Henry, Paula F.; Mosby, David; Rattner, Barnett A.; Scheckel, Kirk G.; Sprague, Dan; Weber, John

    2016-01-01

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33%-63%, with a mean of about 50%. Treatment of two of the soils with phosphorus significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. They were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test” and the “Waterfowl Physiologically Based Extraction Test.” All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite and tertiary Pb phosphate), and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb.

  10. Bioaccessibility tests accurately estimate bioavailability of lead to quail.

    PubMed

    Beyer, W Nelson; Basta, Nicholas T; Chaney, Rufus L; Henry, Paula F P; Mosby, David E; Rattner, Barnett A; Scheckel, Kirk G; Sprague, Daniel T; Weber, John S

    2016-09-01

    Hazards of soil-borne lead (Pb) to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, the authors measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from 5 Pb-contaminated Superfund sites had relative bioavailabilities from 33% to 63%, with a mean of approximately 50%. Treatment of 2 of the soils with phosphorus (P) significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in 6 in vitro tests and regressed on bioavailability: the relative bioavailability leaching procedure at pH 1.5, the same test conducted at pH 2.5, the Ohio State University in vitro gastrointestinal method, the urban soil bioaccessible lead test, the modified physiologically based extraction test, and the waterfowl physiologically based extraction test. All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the relative bioavailability leaching procedure at pH 2.5 and Ohio State University in vitro gastrointestinal tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite, and tertiary Pb phosphate) and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb, and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb. Environ Toxicol Chem 2016;35:2311-2319. Published 2016 Wiley Periodicals Inc. on behalf of

  11. Software sizing, cost estimation and scheduling

    NASA Technical Reports Server (NTRS)

    Cheadle, William G.

    1988-01-01

    The Technology Implementation and Support Section at Martin Marietta Astronautics Group Denver is tasked with software development analysis, data collection, software productivity improvement and developing and applying various computerized software tools and models. The computerized tools are parametric models that reflect actuals taken from the large data base of completed software development projects. Martin Marietta's data base consists of over 300 completed projects and hundreds of cost estimating relationships (CERs) that are used in sizing, costing, scheduling and productivity improvement equations, studies, models and computerized tools.

  12. Probabilistic cost estimates for climate change mitigation.

    PubMed

    Rogelj, Joeri; McCollum, David L; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-01-01

For more than a decade, the target of keeping global warming below 2 °C has been a key focus of the international climate debate. In response, the scientific community has published a number of scenario studies that estimate the costs of achieving such a target. Producing these estimates remains a challenge, particularly because of relatively well known, but poorly quantified, uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on the one hand, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other hand, has spent years improving its understanding of the geophysical response of the Earth system to emissions of greenhouse gases. This geophysical response remains a key uncertainty in the cost of mitigation scenarios but has been integrated with assessments of other uncertainties in only a rudimentary manner, that is, for equilibrium conditions. Here we bridge this gap between the two research communities by generating distributions of the costs associated with limiting transient global temperature increase to below specific values, taking into account uncertainties in four factors: geophysical, technological, social and political. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by geophysical uncertainties, social factors influencing future energy demand and, lastly, technological uncertainties surrounding the availability of greenhouse gas mitigation options. Our information on temperature risk and mitigation costs provides crucial information for policy-making, because it clarifies the relative importance of mitigation costs, energy demand and the timing of global action in reducing the risk of exceeding a global temperature increase of 2 °C, or other limits such as 3 °C or 1.5 °C.

  13. Estimating the social costs of nitrogen pollution

    NASA Astrophysics Data System (ADS)

    Gourevitch, J.; Keeler, B.; Polasky, S.

    2014-12-01

    Agricultural expansion can degrade water quality and related ecosystem services through increased export of nutrients. Such damages to water quality can negatively affect recreation, property values, and human health. While the relationship between agricultural production and nitrogen export is well-studied, the economic costs of nitrogen loss are less well understood. We present a comprehensive assessment of the full costs associated with nitrate pollution from agricultural sources in Minnesota. We found that the most significant economic costs are likely from groundwater contamination of nitrate in public and private wells. For example, we estimated that loss of grassland to corn cultivation in Minnesota between 2007 and 2012 is expected to increase the future number of domestic wells exceeding nitrate concentrations of 10 ppm by 31%. This increase in contamination is estimated to cost well owners $1.4 to 19 million (present values over a 20 year horizon) through remediation, avoidance, and replacement. Our findings demonstrate linkages between changes in land use, water quality, and human well-being.
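
    The present-value framing in this record ($1.4 to 19 million over a 20-year horizon) corresponds to discounting an annual cost stream for affected wells. The sketch below shows that calculation in generic form; the well count, per-well cost, and discount rate are placeholders, not the study's inputs.

    ```python
    # Minimal present-value sketch: an annual avoidance/remediation cost stream
    # for affected wells, discounted over a 20-year horizon. All inputs are
    # hypothetical placeholders, not values from the study.

    def present_value(annual_cost, years=20, discount_rate=0.03):
        return sum(annual_cost / (1 + discount_rate) ** t for t in range(1, years + 1))

    affected_wells = 1200
    cost_per_well_per_year = 250.0   # e.g., filtration or bottled-water avoidance
    print(f"${present_value(affected_wells * cost_per_well_per_year):,.0f}")
    ```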

  14. 48 CFR 1852.216-81 - Estimated cost.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Estimated cost. 1852.216-81... Estimated cost. As prescribed in 1816.307-70(d), insert the following clause: Estimated cost (DEC 1988) The total estimated cost for complete performance of this contract is $ . See FAR clause 52.216-11,...

  15. Compile-time estimation of communication costs in multicomputers

    NASA Technical Reports Server (NTRS)

    Gupta, Manish; Banerjee, Prithviraj

    1991-01-01

    An important problem facing numerous research projects on parallelizing compilers for distributed memory machines is that of automatically determining a suitable data partitioning scheme for a program. Any strategy for automatic data partitioning needs a mechanism for estimating the performance of a program under a given partitioning scheme, the most crucial part of which involves determining the communication costs incurred by the program. A methodology is described for estimating the communication costs at compile-time as functions of the numbers of processors over which various arrays are distributed. A strategy is described along with its theoretical basis, for making program transformations that expose opportunities for combining of messages, leading to considerable savings in the communication costs. For certain loops with regular dependences, the compiler can detect the possibility of pipelining, and thus estimate communication costs more accurately than it could otherwise. These results are of great significance to any parallelization system supporting numeric applications on multicomputers. In particular, they lay down a framework for effective synthesis of communication on multicomputers from sequential program references.
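
    A simple startup-plus-bandwidth message model illustrates why expressing communication cost as a function of the processor count, and combining messages, matters. The sketch below is a generic illustration in that spirit; the constants and the redistribution pattern are assumptions, not the compiler model described in the record.

    ```python
    # Illustrative compile-time-style model: communication cost expressed as a
    # function of the number of processors an array is distributed over, using a
    # startup-plus-bandwidth message model. All constants are hypothetical.

    STARTUP_US = 50.0       # per-message startup latency (microseconds)
    PER_ELEMENT_US = 0.01   # transfer time per array element (microseconds)

    def redistribution_cost(total_elements, num_procs, combined=True):
        """Estimated cost of each processor forwarding its block to a neighbour."""
        if combined:
            # message combining: one message per processor, so only num_procs startups
            return num_procs * STARTUP_US + total_elements * PER_ELEMENT_US
        # without combining, every element pays its own message startup
        return total_elements * (STARTUP_US + PER_ELEMENT_US)

    for p in (4, 16, 64):
        print(p, redistribution_cost(1_000_000, p, combined=True),
                 redistribution_cost(1_000_000, p, combined=False))
    ```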

  16. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Cost estimate submission. 100.16 Section..., COMMUNICATIONS ASSISTANCE FOR LAW ENFORCEMENT ACT OF 1994 § 100.16 Cost estimate submission. (a) The carrier... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from...

  17. Statistical Cost Estimation in Higher Education: Some Alternatives.

    ERIC Educational Resources Information Center

    Brinkman, Paul T.; Niwa, Shelley

Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs is also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…

  18. 28 CFR 19.4 - Cost and percentage estimates.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Cost and percentage estimates. 19.4... RECOVERY OF MISSING CHILDREN § 19.4 Cost and percentage estimates. It is estimated that this program will cost DOJ $78,000 during the initial year. This figure is based on estimates of printing, inserting,...

  19. Fast Conceptual Cost Estimating of Aerospace Projects Using Historical Information

    NASA Technical Reports Server (NTRS)

    Butts, Glenn

    2007-01-01

Accurate estimates can be created in less than a minute by applying powerful techniques and algorithms to create an Excel-based parametric cost model. In five easy steps you will learn how to normalize your company's historical cost data to the new project parameters. This paper provides a complete, easy-to-understand, step-by-step how-to guide. Such a guide does not seem to currently exist. Over 2,000 hours of research, data collection, and trial and error, and thousands of lines of Excel Visual Basic for Applications (VBA) code were invested in developing these methods. While VBA is not required to use this information, it increases the power and aesthetics of the model. Implementing all of the steps described, while not required, will increase the accuracy of the results.
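
    A core step described in this record is normalizing historical costs before fitting a parametric model. The sketch below illustrates one common normalization, escalating past actuals to a common base year; the inflation indices and project records are invented, and the paper's Excel/VBA implementation is not reproduced here.

    ```python
    # Hedged sketch of the normalization step: escalate historical costs to a
    # common base year before fitting a parametric relationship. The indices and
    # project records below are invented for illustration.

    inflation_index = {2015: 0.92, 2018: 0.97, 2021: 1.00, 2024: 1.08}

    historical = [
        # (year, mass_kg, actual_cost_usd)
        (2015, 120, 4_100_000),
        (2018, 210, 6_300_000),
        (2021, 340, 9_800_000),
    ]

    base_year = 2024
    normalized = [(mass, cost * inflation_index[base_year] / inflation_index[year])
                  for year, mass, cost in historical]
    for mass, cost in normalized:
        print(f"{mass:4d} kg  ${cost:,.0f} (in {base_year} dollars)")
    ```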

  20. IVF cycle cost estimation using Activity Based Costing and Monte Carlo simulation.

    PubMed

    Cassettari, Lucia; Mosca, Marco; Mosca, Roberto; Rolando, Fabio; Costa, Mauro; Pisaturo, Valerio

    2016-03-01

The authors present a new methodological approach, in a stochastic regime, to determine the actual costs of a healthcare process. The paper specifically shows the application of the methodology to the determination of the cost of an assisted reproductive technology (ART) treatment in Italy. The motivation for this research is that a deterministic approach is inadequate for an accurate estimate of the cost of this particular treatment: the durations of the different activities involved are not fixed and are instead described by frequency distributions. Hence the need to determine, in addition to the mean value of the cost, the interval within which it varies with a known confidence level. The cost obtained for each type of cycle investigated (in vitro fertilization and embryo transfer, with or without intracytoplasmic sperm injection) shows tolerance intervals around the mean value sufficiently narrow to make the data statistically robust and therefore usable as a reference for benchmarking with other countries. From a methodological point of view the approach was rigorous: Activity Based Costing was used to determine the cost of the individual activities of the process, and Monte Carlo simulation, with control of experimental error, was used to construct the tolerance intervals on the final result.
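
    The combination of Activity Based Costing with Monte Carlo simulation described in this record can be sketched generically: activity durations are drawn from distributions, costed at staff rates, and the resulting cost distribution yields a mean and a tolerance interval. The example below uses invented activities and rates, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical activities: (mean duration in minutes, sd, staff cost per minute)
    activities = [(25, 5, 1.2), (40, 10, 2.5), (15, 3, 0.9), (60, 12, 1.8)]
    consumables = 300.0   # fixed material cost per cycle (illustrative)

    n_runs = 20_000
    costs = np.full(n_runs, consumables)
    for mean, sd, rate in activities:
        durations = np.clip(rng.normal(mean, sd, n_runs), 0, None)
        costs += durations * rate

    low, high = np.percentile(costs, [2.5, 97.5])
    print(f"mean cost {costs.mean():.0f}, 95% interval [{low:.0f}, {high:.0f}]")
    ```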

  1. IVF cycle cost estimation using Activity Based Costing and Monte Carlo simulation.

    PubMed

    Cassettari, Lucia; Mosca, Marco; Mosca, Roberto; Rolando, Fabio; Costa, Mauro; Pisaturo, Valerio

    2016-03-01

The authors present a new methodological approach, in a stochastic regime, to determine the actual costs of a healthcare process. The paper specifically shows the application of the methodology to the determination of the cost of an assisted reproductive technology (ART) treatment in Italy. The motivation for this research is that a deterministic approach is inadequate for an accurate estimate of the cost of this particular treatment: the durations of the different activities involved are not fixed and are instead described by frequency distributions. Hence the need to determine, in addition to the mean value of the cost, the interval within which it varies with a known confidence level. The cost obtained for each type of cycle investigated (in vitro fertilization and embryo transfer, with or without intracytoplasmic sperm injection) shows tolerance intervals around the mean value sufficiently narrow to make the data statistically robust and therefore usable as a reference for benchmarking with other countries. From a methodological point of view the approach was rigorous: Activity Based Costing was used to determine the cost of the individual activities of the process, and Monte Carlo simulation, with control of experimental error, was used to construct the tolerance intervals on the final result. PMID:24752546

  2. ICPP tank farm closure study. Volume 3: Cost estimates, planning schedules, yearly cost flowcharts, and life-cycle cost estimates

    SciTech Connect

    1998-02-01

    This volume contains information on cost estimates, planning schedules, yearly cost flowcharts, and life-cycle costs for the six options described in Volume 1, Section 2: Option 1 -- Total removal clean closure; No subsequent use; Option 2 -- Risk-based clean closure; LLW fill; Option 3 -- Risk-based clean closure; CERCLA fill; Option 4 -- Close to RCRA landfill standards; LLW fill; Option 5 -- Close to RCRA landfill standards; CERCLA fill; and Option 6 -- Close to RCRA landfill standards; Clean fill. This volume is divided into two portions. The first portion contains the cost and planning schedule estimates while the second portion contains life-cycle costs and yearly cash flow information for each option.

  3. Estimation of immunization providers' activities cost, medication cost, and immunization dose errors cost in Iraq.

    PubMed

    Al-lela, Omer Qutaiba B; Bahari, Mohd Baidi; Al-abbassi, Mustafa G; Salih, Muhannad R M; Basher, Amena Y

    2012-06-01

The immunization status of children is improved by interventions that increase community demand for compulsory and non-compulsory vaccines; among the most important of these are interventions involving immunization providers. The aim of this study is to evaluate the activities of immunization providers in terms of activity time and cost, to calculate the cost of immunization doses, and to determine the cost of immunization dose errors. A time-motion and cost analysis study design was used. Five public health clinics in Mosul, Iraq participated in the study. Fifty (50) vaccine doses were required to estimate activity time and cost. A micro-costing method was used; time and cost data were collected for each immunization-related activity performed by the clinic staff. A stopwatch was used to measure the duration of activity interactions between the parents and clinic staff. The immunization service cost was calculated by multiplying the average salary per minute by the activity time in minutes. A total of 528 immunization cards of Iraqi children were scanned to determine the number and the cost of immunization dose errors (extra immunization doses and invalid doses). The average time for child registration was 6.7 min per immunization dose, and the physician spent more than 10 min per dose. Nurses needed more than 5 min to complete child vaccination. The total cost of immunization activities was 1.67 US$ per immunization dose. The measles vaccine (fifth dose) has a lower price (0.42 US$) than all other immunization doses. The cost of a total of 288 invalid doses was 744.55 US$ and the cost of a total of 195 extra immunization doses was 503.85 US$. The time spent on physicians' activities was longer than that spent on registrars' and nurses' activities. The physicians' total cost was higher than the registrars' and nurses' costs. The total immunization cost will increase by about 13.3% owing to dose errors.
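
    The micro-costing rule quoted in this record, salary per minute multiplied by observed activity time, is straightforward to compute. The sketch below illustrates it with placeholder roles, times, and salaries rather than the study's figures.

    ```python
    # Direct sketch of the micro-costing rule in the abstract: per-dose service
    # cost is the sum over staff activities of (salary per minute) x (minutes).
    # Salaries and times here are placeholders, not the study's figures.

    activities = [
        # (role, minutes_per_dose, salary_usd_per_minute)
        ("registrar", 6.7, 0.05),
        ("physician", 10.2, 0.12),
        ("nurse",      5.4, 0.07),
    ]

    service_cost = sum(minutes * rate for _, minutes, rate in activities)
    print(f"service cost per dose: {service_cost:.2f} US$")
    ```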

  4. Cost estimates supporting West Valley DEIS

    SciTech Connect

    Pirro, J.

    1981-01-01

An Environmental Impact Statement (EIS) is being prepared which considers alternate means for solidifying the high level liquid wastes (HLLW) at the Western New York Nuclear Service Center (WNYNSC). For this purpose three basic scenarios were considered. In the first scenario, the HLLW is converted into a terminal waste form of borosilicate glass. Before vitrification, the non-radioactive chemical salts are separated from the radioactive and transuranic (TRU) constituents in the HLLW. In the second scenario, the HLLW is converted into an intermediate form, fused salt. The stored HLLW is dewatered and melted and the solids are transported to a Department of Energy (DOE) site. The fused salt will be processed at the DOE site at a later date, where it will be converted to a vitrified form in a facility that will be constructed to treat HLLW stored at that site. The vitrified salt will eventually be removed for permanent disposal at a Federal repository. In the third scenario, the HLLW is solidified in the existing HLLW storage tanks with cement and returned for on-site disposal in the existing tanks or additional tanks as needed to accommodate the volume. To support the EIS, the costs to accomplish each of the alternatives are provided. The purpose of this cost estimate is to provide a common basis to evaluate the expenditures required to immobilize the HLLW presently stored at the WNYNSC. (DMC)

  5. 40 CFR 267.142 - Cost estimate for closure.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 26 2010-07-01 2010-07-01 false Cost estimate for closure. 267.142... PERMIT Financial Requirements § 267.142 Cost estimate for closure. (a) The owner or operator must have at the facility a detailed written estimate, in current dollars, of the cost of closing the facility...

  6. 40 CFR 264.142 - Cost estimate for closure.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Cost estimate for closure. 264.142... Financial Requirements § 264.142 Cost estimate for closure. (a) The owner or operator must have a detailed written estimate, in current dollars, of the cost of closing the facility in accordance with...

  7. 48 CFR 252.215-7002 - Cost estimating system requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Cost estimating system... of Provisions And Clauses 252.215-7002 Cost estimating system requirements. As prescribed in 215.408(2), use the following clause: Cost Estimating System Requirements (DEC 2006) (a)...

  8. 40 CFR 265.142 - Cost estimate for closure.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Cost estimate for closure. 265.142... DISPOSAL FACILITIES Financial Requirements § 265.142 Cost estimate for closure. (a) The owner or operator must have a detailed written estimate, in current dollars, of the cost of closing the facility...

  9. 15 CFR 23.4 - Cost and percentage estimates.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Cost and percentage estimates. 23.4... LOCATION AND RECOVERY OF MISSING CHILDREN § 23.4 Cost and percentage estimates. It is estimated that this... estimate that 9% of its penalty mail will transmit missing children photographs and information when...

  10. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver L.

    2016-01-01

    Satellite constellations and Distributed Spacecraft Mission (DSM) architectures offer unique benefits to Earth observation scientists and unique challenges to cost estimators. The Cost and Risk (CR) module of the Tradespace Analysis Tool for Constellations (TAT-C) being developed by NASA Goddard seeks to address some of these challenges by providing a new approach to cost modeling, which aggregates existing Cost Estimating Relationships (CER) from respected sources, cost estimating best practices, and data from existing and proposed satellite designs. Cost estimation through this tool is approached from two perspectives: parametric cost estimating relationships and analogous cost estimation techniques. The dual approach utilized within the TAT-C CR module is intended to address prevailing concerns regarding early design stage cost estimates, and to offer increased transparency and fidelity by providing two preliminary perspectives on mission cost. This work outlines the existing cost model, details assumptions built into the model, and explains what measures have been taken to address the particular challenges of constellation cost estimating. The risk estimation portion of the TAT-C CR module is still in development and will be presented in future work. The cost estimate produced by the CR module is not intended to be an exact mission valuation, but rather a comparative tool to assist in the exploration of the constellation design tradespace. Previous work has noted that estimating the cost of satellite constellations is difficult given that no comprehensive model for constellation cost estimation has yet been developed, and as such, quantitative assessment of multiple spacecraft missions has many remaining areas of uncertainty. By incorporating well-established CERs with preliminary approaches to addressing these uncertainties, the CR module offers a more complete approach to constellation costing than has previously been available to mission architects or Earth

  11. A fast and accurate frequency estimation algorithm for sinusoidal signal with harmonic components

    NASA Astrophysics Data System (ADS)

    Hu, Jinghua; Pan, Mengchun; Zeng, Zhidun; Hu, Jiafei; Chen, Dixiang; Tian, Wugang; Zhao, Jianqiang; Du, Qingfa

    2016-10-01

    Frequency estimation is a fundamental problem in many applications, such as traditional vibration measurement, power system supervision, and microelectromechanical system sensor control. In this paper, a fast and accurate frequency estimation algorithm is proposed to deal with the low-efficiency problem of traditional methods. The proposed algorithm consists of coarse and fine frequency estimation steps, and we demonstrate that applying a modified zero-crossing technique is more efficient than conventional searching methods for coarse frequency estimation (locating the peak of the FFT amplitude). Thus, the proposed estimation algorithm requires fewer hardware and software resources and achieves even higher efficiency as the amount of experimental data increases. Experimental results with a modulated magnetic signal show that the root mean square error of frequency estimation is below 0.032 Hz with the proposed algorithm, which has lower computational complexity and better global performance than conventional frequency estimation methods.
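    A sketch of a two-step (coarse, then fine) frequency estimator in the spirit of the abstract above: a zero-crossing count gives a coarse frequency, which is then refined by quadratic interpolation of the FFT magnitude near that frequency. This is an illustrative stand-in, not the authors' exact algorithm; signal parameters are invented.

        import numpy as np

        def estimate_frequency(x, fs):
            """Coarse zero-crossing estimate refined by local FFT interpolation."""
            x = np.asarray(x, dtype=float) - np.mean(x)
            # Coarse step: count sign changes; f ~ crossings / (2 * duration).
            signs = np.signbit(x)
            crossings = np.count_nonzero(signs[:-1] != signs[1:])
            f_coarse = crossings * fs / (2.0 * len(x))
            # Fine step: quadratic interpolation of |FFT| around the nearest bin.
            spectrum = np.abs(np.fft.rfft(x))
            k = int(round(f_coarse * len(x) / fs))
            k = min(max(k, 1), len(spectrum) - 2)      # keep neighbours in range
            a, b, c = spectrum[k - 1], spectrum[k], spectrum[k + 1]
            denom = a - 2 * b + c
            delta = 0.0 if denom == 0 else 0.5 * (a - c) / denom
            return (k + delta) * fs / len(x)

        if __name__ == "__main__":
            fs, f0 = 1000.0, 50.3
            t = np.arange(2048) / fs
            signal = np.sin(2 * np.pi * f0 * t) + 0.3 * np.sin(2 * np.pi * 3 * f0 * t)
            print(f"Estimated frequency: {estimate_frequency(signal, fs):.3f} Hz")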

  12. AN OVERVIEW OF TOOL FOR RESPONSE ACTION COST ESTIMATING (TRACE)

    SciTech Connect

    FERRIES SR; KLINK KL; OSTAPKOWICZ B

    2012-01-30

    Tools and techniques that provide improved performance and reduced costs are important to government programs, particularly in current times. An opportunity for improvement was identified for preparation of cost estimates used to support the evaluation of response action alternatives. As a result, CH2M HILL Plateau Remediation Company has developed Tool for Response Action Cost Estimating (TRACE). TRACE is a multi-page Microsoft Excel® workbook developed to introduce efficiencies into the timely and consistent production of cost estimates for response action alternatives. This tool combines costs derived from extensive site-specific runs of commercially available remediation cost models with site-specific and estimator-researched and derived costs, providing the best estimating sources available. TRACE also provides for common quantity and key parameter links across multiple alternatives, maximizing ease of updating estimates and performing sensitivity analyses, and ensuring consistency.

  13. Development of Classification and Story Building Data for Accurate Earthquake Damage Estimation

    NASA Astrophysics Data System (ADS)

    Sakai, Yuki; Fukukawa, Noriko; Arai, Kensuke

    We investigated a method of developing building classification and story data from a census population database in order to estimate earthquake damage more accurately, especially in urban areas, presuming that there is a correlation between the numbers of non-wooden or high-rise buildings and the population. We formulated equations for estimating the numbers of wooden houses, low-to-mid-rise (1-9 story) and high-rise (over 10 story) non-wooden buildings in a 1 km mesh from night-time and daytime population databases, based on the building data we investigated and collected in 20 selected meshes in the Kanto area. The formulated equations estimate the numbers of the three building classes accurately, but in some special cases, such as meshes dominated by apartment blocks, the estimated values differ considerably from the actual values.
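    A minimal sketch of the fitting step implied above: linear relations between night-time/daytime population in a mesh and the counts of the three building classes, estimated by least squares. The survey values below are fabricated placeholders purely to show the mechanics.

        import numpy as np

        # columns: night-time population, daytime population (per 1 km mesh) -- placeholders
        population = np.array([[3200, 2100], [5400, 8900], [1200, 900], [7600, 12000]], float)
        # columns: wooden, low/mid-rise non-wooden, high-rise non-wooden counts -- placeholders
        buildings = np.array([[410, 35, 2], [520, 120, 14], [180, 12, 0], [610, 210, 30]], float)

        # Least-squares fit with an intercept: counts ~ b0 + b1*night_pop + b2*day_pop.
        X = np.hstack([np.ones((len(population), 1)), population])
        coeffs, *_ = np.linalg.lstsq(X, buildings, rcond=None)

        def estimate_buildings(night_pop, day_pop):
            """Estimate the three building-class counts for a mesh from its population."""
            return np.array([1.0, night_pop, day_pop]) @ coeffs

        print(estimate_buildings(4000, 6000))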

  14. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from carriers... if, as determined by the FBI, all cost data reasonably available to the carrier are either submitted... explain the estimating process are required by the FBI and the carrier refuses to provide necessary...

  15. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from carriers... if, as determined by the FBI, all cost data reasonably available to the carrier are either submitted... explain the estimating process are required by the FBI and the carrier refuses to provide necessary...

  16. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from carriers... if, as determined by the FBI, all cost data reasonably available to the carrier are either submitted... explain the estimating process are required by the FBI and the carrier refuses to provide necessary...

  17. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from carriers... if, as determined by the FBI, all cost data reasonably available to the carrier are either submitted... explain the estimating process are required by the FBI and the carrier refuses to provide necessary...

  18. Assuring Software Cost Estimates: Is it an Oxymoron?

    NASA Technical Reports Server (NTRS)

    Hihn, Jarius; Tregre, Grant

    2013-01-01

    The software industry repeatedly observes cost growth of well over 100% even after decades of cost estimation research and well-known best practices, so "What's the problem?" In this paper we provide an overview of the current state of software cost estimation best practice. We then explore whether applying some of the methods used in software assurance might improve the quality of software cost estimates. This paper especially focuses on issues associated with model calibration, estimate review, and the development and documentation of estimates as part of an integrated plan.

  19. Systems engineering and integration: Cost estimation and benefits analysis

    NASA Technical Reports Server (NTRS)

    Dean, ED; Fridge, Ernie; Hamaker, Joe

    1990-01-01

    Space Transportation Avionics hardware and software cost has traditionally been estimated in Phase A and B using cost techniques which predict cost as a function of various cost predictive variables such as weight, lines of code, functions to be performed, quantities of test hardware, quantities of flight hardware, design and development heritage, complexity, etc. The output of such analyses has been life cycle costs, economic benefits and related data. The major objectives of Cost Estimation and Benefits analysis are twofold: (1) to play a role in the evaluation of potential new space transportation avionics technologies, and (2) to benefit from emerging technological innovations. Both aspects of cost estimation and technology are discussed here. The role of cost analysis in the evaluation of potential technologies should be one of offering additional quantitative and qualitative information to aid decision-making. The cost analyses process needs to be fully integrated into the design process in such a way that cost trades, optimizations and sensitivities are understood. Current hardware cost models tend to primarily use weights, functional specifications, quantities, design heritage and complexity as metrics to predict cost. Software models mostly use functionality, volume of code, heritage and complexity as cost descriptive variables. Basic research needs to be initiated to develop metrics more responsive to the trades which are required for future launch vehicle avionics systems. These would include cost estimating capabilities that are sensitive to technological innovations such as improved materials and fabrication processes, computer aided design and manufacturing, self checkout and many others. In addition to basic cost estimating improvements, the process must be sensitive to the fact that no cost estimate can be quoted without also quoting a confidence associated with the estimate. In order to achieve this, better cost risk evaluation techniques are
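    The paragraph above describes parametric models that predict cost from variables such as weight and complexity. A minimal sketch of one such cost estimating relationship follows; the power-law form and all coefficients are illustrative assumptions, not calibrated values from any NASA model.

        # Weight-driven parametric cost estimating relationship (CER) sketch:
        # cost = a * weight^b, adjusted by heritage and complexity multipliers.
        # Coefficients are illustrative assumptions, not calibrated values.

        def parametric_cost(weight_kg, a=250.0, b=0.8, heritage=1.0, complexity=1.0):
            """Return an avionics hardware cost estimate in arbitrary currency units."""
            return a * weight_kg ** b * heritage * complexity

        # Example: a 120 kg avionics suite with modest design heritage.
        print(f"Estimated cost: {parametric_cost(120, heritage=0.9, complexity=1.2):,.0f}")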

  20. 48 CFR 36.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Government estimate of... Contracting for Construction 36.203 Government estimate of construction costs. (a) An independent Government estimate of construction costs shall be prepared and furnished to the contracting officer at the...

  1. Space tug economic analysis study. Volume 3: Cost estimates

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Cost estimates for the space tug operation are presented. The subjects discussed are: (1) research and development costs, (2) investment costs, (3) operations costs, and (4) funding requirements. The emphasis is placed on the single stage tug configuration using various types of liquid propellants.

  2. Do We Know Whether Researchers and Reviewers are Estimating Risk and Benefit Accurately?

    PubMed

    Hey, Spencer Phillips; Kimmelman, Jonathan

    2016-10-01

    Accurate estimation of risk and benefit is integral to good clinical research planning, ethical review, and study implementation. Some commentators have argued that various actors in clinical research systems are prone to biased or arbitrary risk/benefit estimation. In this commentary, we suggest the evidence supporting such claims is very limited. Most prior work has imputed risk/benefit beliefs based on past behavior or goals, rather than directly measuring them. We describe an approach - forecast analysis - that would enable a direct and effective measurement of the quality of risk/benefit estimation. We then consider some objections and limitations to the forecasting approach. PMID:27197044

  3. Do We Know Whether Researchers and Reviewers are Estimating Risk and Benefit Accurately?

    PubMed

    Hey, Spencer Phillips; Kimmelman, Jonathan

    2016-10-01

    Accurate estimation of risk and benefit is integral to good clinical research planning, ethical review, and study implementation. Some commentators have argued that various actors in clinical research systems are prone to biased or arbitrary risk/benefit estimation. In this commentary, we suggest the evidence supporting such claims is very limited. Most prior work has imputed risk/benefit beliefs based on past behavior or goals, rather than directly measuring them. We describe an approach - forecast analysis - that would enable a direct and effective measurement of the quality of risk/benefit estimation. We then consider some objections and limitations to the forecasting approach.

  4. 40 CFR 261.142 - Cost estimate.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... current dollars, of the cost of disposing of any hazardous secondary material as listed or characteristic... requirements if he can demonstrate that on-site disposal capacity will exist at all times over the life of the... value. (b) During the active life of the facility, the owner or operator must adjust the cost...

  5. Estimating demolition cost of plutonium buildings for dummies

    SciTech Connect

    Tower, S.E.

    2000-07-01

    The primary purpose of the Rocky Flats Field Office of the US Department of Energy is to decommission the entire plant. In an effort to improve the basis and the accuracy of the future decommissioning cost, Rocky Flats has developed a powerful but easy-to-use tool to determine budget cost estimates to characterize, decontaminate, and demolish all its buildings. The parametric cost-estimating tool is called the Facilities Disposition Cost Model (FDCM).

  6. 48 CFR 252.215-7002 - Cost estimating system requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Contractor shall— (i) Comply with its disclosed estimating system; and (ii) Disclose significant changes to... detection and timely correction of errors. (viii) Protect against cost duplication and omissions....

  7. On the accurate estimation of gap fraction during daytime with digital cover photography

    NASA Astrophysics Data System (ADS)

    Hwang, Y. R.; Ryu, Y.; Kimm, H.; Macfarlane, C.; Lang, M.; Sonnentag, O.

    2015-12-01

    Digital cover photography (DCP) has emerged as an indirect method to obtain gap fraction accurately. Thus far, however, the intervention of subjectivity, such as determining the camera relative exposure value (REV) and the threshold in the histogram, has hindered computing accurate gap fraction. Here we propose a novel method that enables us to measure gap fraction accurately during daytime under various sky conditions by DCP. The novel method computes gap fraction using a single unsaturated DCP raw image which is corrected for scattering effects by canopies and a sky image reconstructed from the raw-format image. To test the sensitivity of the gap fraction derived with the novel method to diverse REVs, solar zenith angles and canopy structures, we took photos at one-hour intervals between sunrise and midday under dense and sparse canopies with REV 0 to -5. The novel method showed little variation of gap fraction across different REVs in both dense and sparse canopies across a diverse range of solar zenith angles. The perforated panel experiment, which was used to test the accuracy of the estimated gap fraction, confirmed that the novel method produced accurate and consistent gap fractions across different hole sizes, gap fractions and solar zenith angles. These findings highlight that the novel method opens new opportunities to estimate gap fraction accurately during daytime from sparse to dense canopies, which will be useful for monitoring LAI precisely and validating satellite remote sensing LAI products efficiently.
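    An illustrative sketch of the basic gap-fraction idea: classify pixels of a canopy photograph as sky or canopy and take the sky fraction. The simple thresholding below is a stand-in only; the paper's method additionally corrects an unsaturated raw image for within-canopy scattering, which is not reproduced here.

        import numpy as np

        def gap_fraction(image, sky_threshold=200):
            """image: 2-D array of brightness values (e.g. blue channel, 0-255)."""
            sky_pixels = np.count_nonzero(image >= sky_threshold)
            return sky_pixels / image.size

        # Synthetic example: a 100x100 "photo" with roughly 30% bright (sky) pixels.
        rng = np.random.default_rng(0)
        photo = np.where(rng.random((100, 100)) < 0.3, 230, 60)
        print(f"Gap fraction: {gap_fraction(photo):.2f}")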

  8. Pros, Cons, and Alternatives to Weight Based Cost Estimating

    NASA Technical Reports Server (NTRS)

    Joyner, Claude R.; Lauriem, Jonathan R.; Levack, Daniel H.; Zapata, Edgar

    2011-01-01

    Many cost estimating tools use weight as a major parameter in projecting the cost. This is often combined with modifying factors such as complexity, technical maturity of design, environment of operation, etc. to increase the fidelity of the estimate. For a set of conceptual designs, all meeting the same requirements, increased weight can be a major driver in increased cost. However, once a design is fixed, increased weight generally decreases cost, while decreased weight generally increases cost - and the relationship is not linear. Alternative approaches to estimating cost without using weight (except perhaps for materials costs) have been attempted to try to produce a tool usable throughout the design process - from concept studies through development. This paper will address the pros and cons of using weight based models for cost estimating, using liquid rocket engines as the example. It will then examine approaches that minimize the impact of weight based cost estimating. The Rocket Engine Cost Model (RECM) is an attribute based model developed internally by Pratt & Whitney Rocketdyne for NASA. RECM will be presented primarily to show a successful method to use design and programmatic parameters instead of weight to estimate both design and development costs and production costs. An operations model developed by KSC, the Launch and Landing Effects Ground Operations model (LLEGO), will also be discussed.

  9. Accurate Estimation of the Entropy of Rotation-Translation Probability Distributions.

    PubMed

    Fogolari, Federico; Dongmo Foumthuim, Cedrix Jurgal; Fortuna, Sara; Soler, Miguel Angel; Corazza, Alessandra; Esposito, Gennaro

    2016-01-12

    The estimation of rotational and translational entropies in the context of ligand binding has been the subject of long-time investigations. The high dimensionality (six) of the problem and the limited amount of sampling often prevent the required resolution to provide accurate estimates by the histogram method. Recently, the nearest-neighbor distance method has been applied to the problem, but the solutions provided either address rotation and translation separately, therefore lacking correlations, or use a heuristic approach. Here we address rotational-translational entropy estimation in the context of nearest-neighbor-based entropy estimation, solve the problem numerically, and provide an exact and an approximate method to estimate the full rotational-translational entropy.
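    For context, a sketch of a generic nearest-neighbour (Kozachenko-Leonenko style) differential entropy estimator, the family of estimators the abstract above builds on. This is shown for illustration only and does not reproduce the paper's treatment of the coupled rotation-translation space.

        import numpy as np
        from scipy.spatial import cKDTree
        from scipy.special import digamma, gammaln

        def knn_entropy(samples, k=3):
            """Differential entropy (in nats) of d-dimensional samples, shape (N, d)."""
            samples = np.asarray(samples, dtype=float)
            n, d = samples.shape
            # distance from each sample to its k-th nearest neighbour (excluding itself)
            r_k = cKDTree(samples).query(samples, k=k + 1)[0][:, -1]
            log_unit_ball = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
            return digamma(n) - digamma(k) + log_unit_ball + d * np.mean(np.log(r_k))

        # Sanity check: a standard 3-D Gaussian has entropy 0.5*d*log(2*pi*e) ~ 4.26 nats.
        rng = np.random.default_rng(1)
        print(f"Estimated entropy: {knn_entropy(rng.standard_normal((5000, 3))):.2f} nats")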

  10. Preliminary estimates of operating costs for lighter than air transports

    NASA Technical Reports Server (NTRS)

    Smith, C. L.; Ardema, M. D.

    1975-01-01

    A preliminary set of operating cost relationships are presented for airship transports. The starting point for the development of the relationships is the direct operating cost formulae and the indirect operating cost categories commonly used for estimating costs of heavier than air commercial transports. Modifications are made to the relationships to account for the unique features of airships. To illustrate the cost estimating method, the operating costs of selected airship cargo transports are computed. Conventional fully buoyant and hybrid semi-buoyant systems are investigated for a variety of speeds, payloads, ranges, and altitudes. Comparisons are made with aircraft transports for a range of cargo densities.

  11. Preliminary estimates of operating costs for lighter than air transports

    NASA Technical Reports Server (NTRS)

    Smith, C. L.; Ardema, M. D.

    1975-01-01

    Presented is a preliminary set of operating cost relationships for airship transports. The starting point for the development of the relationships is the direct operating cost formulae and the indirect operating cost categories commonly used for estimating costs of heavier than air commercial transports. Modifications are made to the relationships to account for the unique features of airships. To illustrate the cost estimating method, the operating costs of selected airship cargo transports are computed. Conventional fully buoyant and hybrid semi-buoyant systems are investigated for a variety of speeds, payloads, ranges, and altitudes. Comparisons are made with aircraft transports for a range of cargo densities.

  12. Handbook for cost estimating. A method for developing estimates of costs for generic actions for nuclear power plants

    SciTech Connect

    Ball, J.R.; Cohen, S.; Ziegler, E.Z.

    1984-10-01

    This document provides overall guidance to assist the NRC in preparing the types of cost estimates required by the Regulatory Analysis Guidelines and to assist in the assignment of priorities in resolving generic safety issues. The Handbook presents an overall cost model that allows the cost analyst to develop a chronological series of activities needed to implement a specific regulatory requirement throughout all applicable commercial LWR power plants and to identify the significant cost elements for each activity. References to available cost data are provided along with rules of thumb and cost factors to assist in evaluating each cost element. A suitable code-of-accounts data base is presented to assist in organizing and aggregating costs. Rudimentary cost analysis methods are described to allow the analyst to produce a constant-dollar, lifetime cost for the requirement. A step-by-step example cost estimate is included to demonstrate the overall use of the Handbook.

  13. Polynomial fitting of DT-MRI fiber tracts allows accurate estimation of muscle architectural parameters.

    PubMed

    Damon, Bruce M; Heemskerk, Anneriet M; Ding, Zhaohua

    2012-06-01

    Fiber curvature is a functionally significant muscle structural property, but its estimation from diffusion-tensor magnetic resonance imaging fiber tracking data may be confounded by noise. The purpose of this study was to investigate the use of polynomial fitting of fiber tracts for improving the accuracy and precision of fiber curvature (κ) measurements. Simulated image data sets were created in order to provide data with known values for κ and pennation angle (θ). Simulations were designed to test the effects of increasing inherent fiber curvature (3.8, 7.9, 11.8 and 15.3 m(-1)), signal-to-noise ratio (50, 75, 100 and 150) and voxel geometry (13.8- and 27.0-mm(3) voxel volume with isotropic resolution; 13.5-mm(3) volume with an aspect ratio of 4.0) on κ and θ measurements. In the originally reconstructed tracts, θ was estimated accurately under most curvature and all imaging conditions studied; however, the estimates of κ were imprecise and inaccurate. Fitting the tracts to second-order polynomial functions provided accurate and precise estimates of κ for all conditions except very high curvature (κ=15.3 m(-1)), while preserving the accuracy of the θ estimates. Similarly, polynomial fitting of in vivo fiber tracking data reduced the κ values of fitted tracts from those of unfitted tracts and did not change the θ values. Polynomial fitting of fiber tracts allows accurate estimation of physiologically reasonable values of κ, while preserving the accuracy of θ estimation.
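    A sketch of the fitting idea described above: approximate a noisy tract with a second-order polynomial and evaluate its curvature analytically, kappa = |y''| / (1 + y'^2)^(3/2). A planar tract is used here for simplicity (the study fits 3-D tracts from DT-MRI tracking), and the synthetic arc and noise level are invented.

        import numpy as np

        def fitted_curvature(x, y, x_eval):
            """Fit y(x) with a quadratic and return curvature at x_eval (1/length)."""
            c2, c1, c0 = np.polyfit(x, y, deg=2)
            dy = 2 * c2 * x_eval + c1      # first derivative of the fit
            d2y = 2 * c2                   # second derivative (constant for a quadratic)
            return abs(d2y) / (1 + dy ** 2) ** 1.5

        # Synthetic noisy tract on a circular arc of radius 0.1 m (true kappa = 10 1/m).
        rng = np.random.default_rng(2)
        x = np.linspace(-0.03, 0.03, 50)
        y = 0.1 - np.sqrt(0.1 ** 2 - x ** 2) + rng.normal(0, 2e-4, x.size)
        print(f"Estimated curvature: {fitted_curvature(x, y, 0.0):.1f} 1/m")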

  14. Fuel Cost Estimation for Sumatra Grid System

    NASA Astrophysics Data System (ADS)

    Liun, Edwaren

    2010-06-01

    Sumatra has experienced high growth in electricity demand since the first decade of this century; in the middle of this decade the growth rate was 11% per annum. On the other hand, the capability of the Government of Indonesia, cq. the PLN authority, is limited, while many of the old existing power plants will be retired. The growth in electricity demand will increase Sumatra's fuel consumption over the next several decades. Based on several cases with varying growth scenarios and economic parameters, it is shown that several kinds of fossil fuel will continue to be required for the next several decades. Although Sumatra has an abundant coal resource, other fuel types such as fuel oil, diesel, gas and nuclear are also needed. In the Base Scenario with a discount rate of 10%, the Sumatra system will require 11.6 million tonnes of coal until 2030, producing 866 TWh at a cost of US$ 10,558 million. Nuclear plants produce about 501 TWh, or 32%, at a cost of US$ 3.1 billion. In the High Scenario with a discount rate of 10%, coal consumption becomes 486.6 million tonnes, with a fuel cost of US$ 12.7 billion, producing 1033 TWh of electricity. The nuclear fuel cost required in this scenario is US$ 7.06 billion. The other fuel consumed in large amounts is natural gas for combined cycle plants, at a cost of US$ 1.38 billion producing 11.7 TWh of electricity in the Base Scenario with a discount rate of 10%. In the High Scenario with a discount rate of 10%, coal plants take the leading role in power generation in Sumatra, producing about 866 TWh, or 54%, of the electricity. Coal consumption will be highest in the Base Scenario with a discount rate of 12%, producing 756 TWh at a cost of US$ 17.1 billion. Nuclear plants are not applicable in this scenario owing to their lack of competitiveness. The overall fuel cost will depend on the role of nuclear power in the Sumatra system: fuel cost increases with coal consumption in the cases where nuclear power plants do not appear.
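    A minimal sketch of how a discounted fuel cost over a planning horizon can be tallied: annual consumption times unit price, discounted back to a base year. The consumption series, price and 10% discount rate are illustrative values only, not figures from the study above.

        def discounted_fuel_cost(annual_tonnes, price_per_tonne, rate=0.10):
            """Present value of a multi-year fuel bill (year 0 = first list entry)."""
            return sum(q * price_per_tonne / (1 + rate) ** year
                       for year, q in enumerate(annual_tonnes))

        coal_tonnes = [3.0e6, 3.3e6, 3.7e6, 4.1e6, 4.5e6]   # hypothetical 5-year ramp-up
        print(f"Discounted coal cost: {discounted_fuel_cost(coal_tonnes, 45.0):,.0f} US$")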

  15. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate

    PubMed Central

    Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul

    2015-01-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere. PMID:26633821
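    A sketch of the mark-re-sight (transect) logic mentioned above: dogs collared at the clinic are the "marked" animals, the share of collared dogs re-sighted on transects estimates vaccination coverage, and a Lincoln-Petersen estimate of the dog population follows directly. The counts below are invented for illustration.

        def coverage_and_population(vaccinated, sighted_total, sighted_marked):
            """Return (coverage estimate, Lincoln-Petersen population estimate)."""
            coverage = sighted_marked / sighted_total
            population = vaccinated * sighted_total / sighted_marked
            return coverage, population

        cov, pop = coverage_and_population(vaccinated=620, sighted_total=180, sighted_marked=117)
        print(f"Estimated coverage: {cov:.0%}, estimated dog population: {pop:.0f}")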

  16. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate.

    PubMed

    Minyoo, Abel B; Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul; Lankester, Felix

    2015-12-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere.

  17. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate.

    PubMed

    Minyoo, Abel B; Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul; Lankester, Felix

    2015-12-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere. PMID:26633821

  18. 48 CFR 236.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... construction costs. 236.203 Section 236.203 Federal Acquisition Regulations System DEFENSE ACQUISITION...-ENGINEER CONTRACTS Special Aspects of Contracting for Construction 236.203 Government estimate of construction costs. Follow the procedures at PGI 236.203 for handling the Government estimate of...

  19. 48 CFR 236.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... construction costs. 236.203 Section 236.203 Federal Acquisition Regulations System DEFENSE ACQUISITION...-ENGINEER CONTRACTS Special Aspects of Contracting for Construction 236.203 Government estimate of construction costs. Follow the procedures at PGI 236.203 for handling the Government estimate of...

  20. 48 CFR 1336.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Government estimate of construction costs. 1336.203 Section 1336.203 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE... Contracting for Construction 1336.203 Government estimate of construction costs. After award, the...

  1. 48 CFR 836.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Government estimate of construction costs. 836.203 Section 836.203 Federal Acquisition Regulations System DEPARTMENT OF VETERANS... Contracting for Construction 836.203 Government estimate of construction costs. The overall amount of...

  2. 40 CFR 267.142 - Cost estimate for closure.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... active life of the facility, the owner or operator must adjust the closure cost estimate for inflation... closure cost estimate must be updated for inflation within 30 days after the close of the firm's fiscal... dollars, or by using an inflation factor derived from the most recent Implicit Price Deflator for...

  3. Mental health disorders among individuals with mental retardation: challenges to accurate prevalence estimates.

    PubMed Central

    Kerker, Bonnie D.; Owens, Pamela L.; Zigler, Edward; Horwitz, Sarah M.

    2004-01-01

    OBJECTIVES: The objectives of this literature review were to assess current challenges to estimating the prevalence of mental health disorders among individuals with mental retardation (MR) and to develop recommendations to improve such estimates for this population. METHODS: The authors identified 200 peer-reviewed articles, book chapters, government documents, or reports from national and international organizations on the mental health status of people with MR. Based on the study's inclusion criteria, 52 articles were included in the review. RESULTS: Available data reveal inconsistent estimates of the prevalence of mental health disorders among those with MR, but suggest that some mental health conditions are more common among these individuals than in the general population. Two main challenges to identifying accurate prevalence estimates were found: (1) health care providers have difficulty diagnosing mental health conditions among individuals with MR; and (2) methodological limitations of previous research inhibit confidence in study results. CONCLUSIONS: Accurate prevalence estimates are necessary to ensure the availability of appropriate treatment services. To this end, health care providers should receive more training regarding the mental health treatment of individuals with MR. Further, government officials should discuss mechanisms of collecting nationally representative data, and the research community should utilize consistent methods with representative samples when studying mental health conditions in this population. PMID:15219798

  4. Accurate estimation of forest carbon stocks by 3-D remote sensing of individual trees.

    PubMed

    Omasa, Kenji; Qiu, Guo Yu; Watanuki, Kenichi; Yoshimi, Kenji; Akiyama, Yukihide

    2003-03-15

    Forests are one of the most important carbon sinks on Earth. However, owing to the complex structure, variable geography, and large area of forests, accurate estimation of forest carbon stocks is still a challenge for both site surveying and remote sensing. For these reasons, the Kyoto Protocol requires the establishment of methodologies for estimating the carbon stocks of forests (Kyoto Protocol, Article 5). A possible solution to this challenge is to remotely measure the carbon stocks of every tree in an entire forest. Here, we present a methodology for estimating carbon stocks of a Japanese cedar forest by using a high-resolution, helicopter-borne 3-dimensional (3-D) scanning lidar system that measures the 3-D canopy structure of every tree in a forest. Results show that a digital image (10-cm mesh) of woody canopy can be acquired. The treetop can be detected automatically with a reasonable accuracy. The absolute error ranges for tree height measurements are within 42 cm. Allometric relationships of height to carbon stocks then permit estimation of total carbon storage by measurement of carbon stocks of every tree. Thus, we suggest that our methodology can be used to accurately estimate the carbon stocks of Japanese cedar forests at a stand scale. Periodic measurements will reveal changes in forest carbon stocks.
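    A sketch of the final step described above: once each tree's height is measured by lidar, an allometric relation converts height to per-tree carbon stock, and the stand total is the sum over trees. The allometric coefficients and heights below are placeholders, not values from the study.

        def tree_carbon_kg(height_m, a=2.5, b=2.4):
            """Hypothetical allometric model: carbon ~ a * height^b (kg per tree)."""
            return a * height_m ** b

        tree_heights = [18.2, 21.5, 16.8, 24.1, 19.9]   # metres, e.g. from airborne lidar
        total = sum(tree_carbon_kg(h) for h in tree_heights)
        print(f"Stand carbon stock: {total / 1000:.2f} t over {len(tree_heights)} trees")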

  5. ABC estimation of unit costs for emergency department services.

    PubMed

    Holmes, R L; Schroeder, R E

    1996-04-01

    Rapid evolution of the health care industry forces managers to make cost-effective decisions. Typical hospital cost accounting systems do not provide emergency department managers with the information needed, but emergency department settings are so complex and dynamic as to make the more accurate activity-based costing (ABC) system prohibitively expensive. Through judicious use of the available traditional cost accounting information and simple computer spreadsheets, managers may approximate the decision-guiding information that would result from the much more costly and time-consuming implementation of ABC. PMID:10156656
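    A minimal sketch of the spreadsheet-style ABC approximation suggested above: ledger cost pools are spread over activities via cost drivers, and unit costs per visit type are the sum of driver rates times activity consumption. All numbers are illustrative placeholders.

        # Activity-based costing approximation: rate = pool cost / driver volume,
        # unit cost per visit = sum over pools of rate x consumption.
        cost_pools = {"nursing": 400_000.0, "triage": 90_000.0, "supplies": 150_000.0}
        driver_volumes = {"nursing": 20_000, "triage": 15_000, "supplies": 25_000}   # minutes, visits, items
        rates = {pool: cost_pools[pool] / driver_volumes[pool] for pool in cost_pools}

        # Activity consumption per visit type (same driver units as above).
        visit_profiles = {
            "minor injury": {"nursing": 15, "triage": 1, "supplies": 3},
            "chest pain":   {"nursing": 60, "triage": 1, "supplies": 9},
        }

        for visit, usage in visit_profiles.items():
            unit_cost = sum(rates[pool] * qty for pool, qty in usage.items())
            print(f"{visit}: {unit_cost:.2f} per visit")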

  6. Review of storage battery system cost estimates

    SciTech Connect

    Brown, D.R.; Russell, J.A.

    1986-04-01

    Cost analyses for zinc bromine, sodium sulfur, and lead acid batteries were reviewed. Zinc bromine and sodium sulfur batteries were selected because of their advanced design nature and the high level of interest in these two technologies. Lead acid batteries were included to establish a baseline representative of a more mature technology.

  7. A Method to Accurately Estimate the Muscular Torques of Human Wearing Exoskeletons by Torque Sensors

    PubMed Central

    Hwang, Beomsoo; Jeon, Doyoung

    2015-01-01

    In exoskeletal robots, the quantification of the user’s muscular effort is important to recognize the user’s motion intentions and evaluate motor abilities. In this paper, we attempt to estimate users’ muscular efforts accurately using a joint torque sensor, whose measurement contains the dynamic effects of the human body, such as the inertial, Coriolis, and gravitational torques, as well as the torque produced by active muscular effort. It is important to extract the dynamic effects of the user’s limb accurately from the measured torque. The user’s limb dynamics are formulated and a convenient method of identifying user-specific parameters is suggested for estimating the user’s muscular torque in robotic exoskeletons. Experiments were carried out on a wheelchair-integrated lower limb exoskeleton, EXOwheel, which was equipped with torque sensors in the hip and knee joints. The proposed methods were evaluated by 10 healthy participants during body weight-supported gait training. The experimental results show that the torque sensors are able to estimate the muscular torque accurately in cases of relaxed and activated muscle conditions. PMID:25860074
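    A sketch of the torque decomposition described above for a single exoskeleton joint: the active muscular torque is the measured joint torque minus the modelled inertial and gravitational torques of the limb segment. The segment parameters and signal values are illustrative assumptions, and Coriolis terms vanish for this one-joint example.

        import numpy as np

        def muscular_torque(tau_measured, q, qdd, inertia, mass, com_length, g=9.81):
            """q: joint angle from vertical [rad]; qdd: angular acceleration [rad/s^2]."""
            tau_inertial = inertia * qdd
            tau_gravity = mass * g * com_length * np.sin(q)
            return tau_measured - tau_inertial - tau_gravity

        # Example: knee joint, shank + foot modelled as a single rigid body (placeholder values).
        print(muscular_torque(tau_measured=28.0, q=0.6, qdd=1.5,
                              inertia=0.35, mass=4.2, com_length=0.25))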

  8. A method to accurately estimate the muscular torques of human wearing exoskeletons by torque sensors.

    PubMed

    Hwang, Beomsoo; Jeon, Doyoung

    2015-04-09

    In exoskeletal robots, the quantification of the user's muscular effort is important to recognize the user's motion intentions and evaluate motor abilities. In this paper, we attempt to estimate users' muscular efforts accurately using a joint torque sensor, whose measurement contains the dynamic effects of the human body, such as the inertial, Coriolis, and gravitational torques, as well as the torque produced by active muscular effort. It is important to extract the dynamic effects of the user's limb accurately from the measured torque. The user's limb dynamics are formulated and a convenient method of identifying user-specific parameters is suggested for estimating the user's muscular torque in robotic exoskeletons. Experiments were carried out on a wheelchair-integrated lower limb exoskeleton, EXOwheel, which was equipped with torque sensors in the hip and knee joints. The proposed methods were evaluated by 10 healthy participants during body weight-supported gait training. The experimental results show that the torque sensors are able to estimate the muscular torque accurately in cases of relaxed and activated muscle conditions.

  9. Chromatography paper as a low-cost medium for accurate spectrophotometric assessment of blood hemoglobin concentration.

    PubMed

    Bond, Meaghan; Elguea, Carlos; Yan, Jasper S; Pawlowski, Michal; Williams, Jessica; Wahed, Amer; Oden, Maria; Tkaczyk, Tomasz S; Richards-Kortum, Rebecca

    2013-06-21

    Anemia affects a quarter of the world's population, and a lack of appropriate diagnostic tools often prevents treatment in low-resource settings. Though the HemoCue 201+ is an appropriate device for diagnosing anemia in low-resource settings, the high cost of disposables ($0.99 per test in Malawi) limits its availability. We investigated using spectrophotometric measurement of blood spotted on chromatography paper as a low-cost (<$0.01 per test) alternative to HemoCue cuvettes. For this evaluation, donor blood was diluted with plasma to simulate anemia, a micropipette spotted blood on paper, and a bench-top spectrophotometer validated the approach before the development of a low-cost reader. We optimized impregnating paper with chemicals to lyse red blood cells, paper type, drying time, wavelengths measured, and sensitivity to variations in volume of blood, and we validated our approach using patient samples. Lysing the blood cells with sodium deoxycholate dried in Whatman Chr4 chromatography paper gave repeatable results, and the absorbance difference between 528 nm and 656 nm was stable over time in measurements taken up to 10 min after sample preparation. The method was insensitive to the amount of blood spotted on the paper over the range of 5 μL to 25 μL. We created a low-cost, handheld reader to measure the transmission of paper cuvettes at these optimal wavelengths. Training and validating our method with patient samples on both the spectrometer and the handheld reader showed that both devices are accurate to within 2 g dL(-1) of the HemoCue device for 98% and 95% of samples, respectively.
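    A sketch of the calibration step implied above: fit a linear relation between the paper-cuvette absorbance difference (A at 528 nm minus A at 656 nm) and reference haemoglobin values, then apply it to new samples. The calibration pairs below are invented for illustration.

        import numpy as np

        # (absorbance difference, reference Hb in g/dL) -- hypothetical training pairs
        calib = np.array([[0.12, 4.1], [0.25, 7.9], [0.38, 11.8], [0.52, 15.9]])
        slope, intercept = np.polyfit(calib[:, 0], calib[:, 1], deg=1)

        def hemoglobin_g_per_dl(delta_absorbance):
            """Map an absorbance difference to a haemoglobin estimate via the linear fit."""
            return slope * delta_absorbance + intercept

        print(f"Estimated Hb: {hemoglobin_g_per_dl(0.30):.1f} g/dL")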

  10. AX Tank Farm waste retrieval alternatives cost estimates

    SciTech Connect

    Krieg, S.A.

    1998-07-21

    This report presents the estimated costs associated with retrieval of the wastes from the four tanks in AX Tank Farm. The engineering cost estimates developed for this report are based on previous cost data prepared for Project W-320 and the HTI 241-C-106 Heel Retrieval System. The costs presented in this report address only the retrieval of the wastes from the four AX Farm tanks. This includes costs for equipment procurement, fabrication, installation, and operation to retrieve the wastes. The costs to modify the existing plant equipment and systems to support the retrieval equipment are also included. The estimates do not include operational costs associated with pumping the waste out of the waste receiver tank (241-AY-102) between AX Farm retrieval campaigns or transportation, processing, and disposal of the retrieved waste.

  11. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.

  12. Measuring nonlinear oscillations using a very accurate and low-cost linear optical position transducer

    NASA Astrophysics Data System (ADS)

    Donoso, Guillermo; Ladera, Celso L.

    2016-09-01

    An accurate linear optical displacement transducer of about 0.2 mm resolution over a range of ∼40 mm is presented. This device consists of a stack of thin cellulose acetate strips, each strip longitudinally slid ∼0.5 mm over the precedent one so that one end of the stack becomes a stepped wedge of constant step. A narrowed light beam from a white LED orthogonally incident crosses the wedge at a known point, the transmitted intensity being detected with a phototransistor whose emitter is connected to a diode. We present the interesting analytical proof that the voltage across the diode is linearly dependent upon the ordinate of the point where the light beam falls on the wedge, as well as the experimental validation of such a theoretical proof. Applications to nonlinear oscillations are then presented—including the interesting case of a body moving under dry friction, and the more advanced case of an oscillator in a quartic energy potential—whose time-varying positions were accurately measured with our transducer. Our sensing device can resolve the dynamics of an object attached to it with great accuracy and precision at a cost considerably less than that of a linear neutral density wedge. The technique used to assemble the wedge of acetate strips is described.

  13. Estimating Software-Development Costs With Greater Accuracy

    NASA Technical Reports Server (NTRS)

    Baker, Dan; Hihn, Jairus; Lum, Karen

    2008-01-01

    COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their project. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development- effort-estimation techniques are used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performances of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.

  14. The application of artificial neural networks in indirect cost estimation

    NASA Astrophysics Data System (ADS)

    Leśniak, Agnieszka

    2013-10-01

    Estimating the costs of a construction project is one of the most important tasks in project management. The total costs can be divided into direct costs, which are related to executing the works, and indirect costs, which accompany delivery. A precise cost estimate is usually a highly labour- and time-intensive task, especially when manual calculation methods are used. This paper presents an Artificial Neural Network (ANN) approach to predicting the indirect cost index of construction projects in Poland. A quantitative study was undertaken on the factors conditioning the indirect costs of Polish construction projects, and the actual costs incurred by enterprises during project implementation were determined. As a result of these studies, a data set was assembled covering 72 real-life cases of building projects constructed in Poland.
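    An illustrative sketch of the ANN approach: a small multilayer perceptron mapping project features to an indirect-cost index. The feature set and training data below are fabricated placeholders; the study used 72 real Polish projects and its own set of conditioning factors.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(3)
        # hypothetical features: [contract value (mln), duration (months), site congestion 0-1]
        X = rng.uniform([0.5, 3, 0.0], [30.0, 36, 1.0], size=(72, 3))
        # synthetic "indirect cost index" (%) loosely increasing with duration and congestion
        y = 8 + 0.3 * X[:, 1] + 6 * X[:, 2] + rng.normal(0, 1.0, 72)

        model = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                             max_iter=5000, random_state=0)
        model.fit(X, y)
        print(f"Predicted indirect cost index: {model.predict([[12.0, 18, 0.4]])[0]:.1f} %")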

  15. Fuzzy case based reasoning in sports facilities unit cost estimating

    NASA Astrophysics Data System (ADS)

    Zima, Krzysztof

    2016-06-01

    This article presents an example of estimating costs in the early phase of a project using fuzzy case-based reasoning. A fragment of the database containing descriptions and unit costs of sports facilities is shown. The formulas used in the Case-Based Reasoning method are also presented. The article presents similarity measurement using a few formulas, including fuzzy similarity. The outcome of the cost calculation based on the CBR method is presented as a fuzzy number for the unit cost of construction work.
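    A sketch of case retrieval with a simple fuzzy-style similarity: each numeric attribute contributes a triangular similarity that decays with relative distance, and the unit costs of the most similar stored facilities are reused. The case base, attributes and weights are invented for illustration and are not the article's data or formulas.

        CASE_BASE = [
            # (floor area m2, seats, pool? 1/0, unit cost per m2) -- placeholder cases
            (2400, 500, 0, 1450.0),
            (5200, 1200, 1, 1980.0),
            (3100, 800, 0, 1610.0),
        ]
        WEIGHTS = (0.5, 0.3, 0.2)

        def attribute_similarity(a, b, spread=0.5):
            """Triangular similarity: 1 when equal, 0 when relative difference >= spread."""
            rel_diff = abs(a - b) / max(abs(a), abs(b), 1e-9)
            return max(0.0, 1.0 - rel_diff / spread)

        def estimate_unit_cost(query, k=2):
            """Similarity-weighted unit cost of the k most similar stored cases."""
            scored = []
            for *attrs, cost in CASE_BASE:
                sim = sum(w * attribute_similarity(q, a) for w, q, a in zip(WEIGHTS, query, attrs))
                scored.append((sim, cost))
            best = sorted(scored, reverse=True)[:k]
            total = sum(sim for sim, _ in best)
            return sum(sim * cost for sim, cost in best) / total

        print(f"Estimated unit cost: {estimate_unit_cost((2800, 650, 0)):.0f} per m2")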

  16. Estimating the Effective Permittivity for Reconstructing Accurate Microwave-Radar Images.

    PubMed

    Lavoie, Benjamin R; Okoniewski, Michal; Fear, Elise C

    2016-01-01

    We present preliminary results from a method for estimating the optimal effective permittivity for reconstructing microwave-radar images. Using knowledge of how microwave-radar images are formed, we identify characteristics that are typical of good images, and define a fitness function to measure the relative image quality. We build a polynomial interpolant of the fitness function in order to identify the most likely permittivity values of the tissue. To make the estimation process more efficient, the polynomial interpolant is constructed using a locally and dimensionally adaptive sampling method that is a novel combination of stochastic collocation and polynomial chaos. Examples, using a series of simulated, experimental and patient data collected using the Tissue Sensing Adaptive Radar system, which is under development at the University of Calgary, are presented. These examples show how, using our method, accurate images can be reconstructed starting with only a broad estimate of the permittivity range.

  17. Estimating the Effective Permittivity for Reconstructing Accurate Microwave-Radar Images

    PubMed Central

    Lavoie, Benjamin R.; Okoniewski, Michal; Fear, Elise C.

    2016-01-01

    We present preliminary results from a method for estimating the optimal effective permittivity for reconstructing microwave-radar images. Using knowledge of how microwave-radar images are formed, we identify characteristics that are typical of good images, and define a fitness function to measure the relative image quality. We build a polynomial interpolant of the fitness function in order to identify the most likely permittivity values of the tissue. To make the estimation process more efficient, the polynomial interpolant is constructed using a locally and dimensionally adaptive sampling method that is a novel combination of stochastic collocation and polynomial chaos. Examples, using a series of simulated, experimental and patient data collected using the Tissue Sensing Adaptive Radar system, which is under development at the University of Calgary, are presented. These examples show how, using our method, accurate images can be reconstructed starting with only a broad estimate of the permittivity range. PMID:27611785

  18. Estimating the Effective Permittivity for Reconstructing Accurate Microwave-Radar Images.

    PubMed

    Lavoie, Benjamin R; Okoniewski, Michal; Fear, Elise C

    2016-01-01

    We present preliminary results from a method for estimating the optimal effective permittivity for reconstructing microwave-radar images. Using knowledge of how microwave-radar images are formed, we identify characteristics that are typical of good images, and define a fitness function to measure the relative image quality. We build a polynomial interpolant of the fitness function in order to identify the most likely permittivity values of the tissue. To make the estimation process more efficient, the polynomial interpolant is constructed using a locally and dimensionally adaptive sampling method that is a novel combination of stochastic collocation and polynomial chaos. Examples, using a series of simulated, experimental and patient data collected using the Tissue Sensing Adaptive Radar system, which is under development at the University of Calgary, are presented. These examples show how, using our method, accurate images can be reconstructed starting with only a broad estimate of the permittivity range. PMID:27611785

  19. Accurate estimation of object location in an image sequence using helicopter flight data

    NASA Technical Reports Server (NTRS)

    Tang, Yuan-Liang; Kasturi, Rangachar

    1994-01-01

    In autonomous navigation, it is essential to obtain a three-dimensional (3D) description of the static environment in which the vehicle is traveling. For a rotorcraft conducting low-altitude flight, this description is particularly useful for obstacle detection and avoidance. In this paper, we address the problem of 3D position estimation for static objects from a monocular sequence of images captured from a low-altitude flying helicopter. Since the environment is static, it is well known that the optical flow in the image will produce a radiating pattern from the focus of expansion. We propose a motion analysis system which utilizes the epipolar constraint to accurately estimate 3D positions of scene objects in a real world image sequence taken from a low-altitude flying helicopter. Results show that this approach gives good estimates of object positions near the rotorcraft's intended flight-path.

  20. Effective Echo Detection and Accurate Orbit Estimation Algorithms for Space Debris Radar

    NASA Astrophysics Data System (ADS)

    Isoda, Kentaro; Sakamoto, Takuya; Sato, Toru

    Orbit estimation of space debris, objects of no inherent value orbiting the earth, is a task that is important for avoiding collisions with spacecraft. The Kamisaibara Spaceguard Center radar system was built in 2004 as the first radar facility in Japan devoted to the observation of space debris. In order to detect the smaller debris, coherent integration is effective in improving SNR (Signal-to-Noise Ratio). However, it is difficult to apply coherent integration to real data because the motions of the targets are unknown. An effective algorithm is proposed for echo detection and orbit estimation of the faint echoes from space debris. The characteristics of the evaluation function are utilized by the algorithm. Experiments show the proposed algorithm improves SNR by 8.32dB and enables estimation of orbital parameters accurately to allow for re-tracking with a single radar.

  1. Estimates of costs by DRG in Sydney teaching hospitals: an application of the Yale cost model.

    PubMed

    Palmer, G; Aisbett, C; Fetter, R; Winchester, L; Reid, B; Rigby, E

    1991-01-01

    The results are reported of a first round of costing by DRG in seven major teaching hospital sites in Sydney using the Yale cost model. These results, when compared between the hospitals and with values of relative costs by DRG from the United States, indicate that the cost modelling procedure has produced credible and potentially useful estimates of casemix costs. The rationale and underlying theory of cost modelling is explained, and the need for further work to improve the method of allocating costs to DRGs, and to improve the cost centre definitions currently used by the hospitals, is emphasised. PMID:10117339

  2. 48 CFR 836.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Government estimate of... Contracting for Construction 836.203 Government estimate of construction costs. The overall amount of the Government estimate must not be disclosed until after award of the contract. After award, the...

  3. 48 CFR 1336.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Government estimate of... Contracting for Construction 1336.203 Government estimate of construction costs. After award, the independent Government estimated price can be released, upon request, to those firms or individuals who...

  4. Estimating the Life Cycle Cost of Space Systems

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    A space system's Life Cycle Cost (LCC) includes design and development, launch and emplacement, and operations and maintenance. Each of these cost factors is usually estimated separately. NASA uses three different parametric models for the design and development cost of crewed space systems: the commercial PRICE-H space hardware cost model, the NASA-Air Force Cost Model (NAFCOM), and the Advanced Missions Cost Model (AMCM). System mass is an important parameter in all three models. System mass also determines the launch and emplacement cost, which directly depends on the cost per kilogram to launch mass to Low Earth Orbit (LEO). The launch and emplacement cost is the cost to launch to LEO the system itself and also the rockets, propellant, and lander needed to emplace it. The ratio of the total launch mass to payload mass depends on the mission scenario and destination. The operations and maintenance costs include any material and spares provided, the ground control crew, and sustaining engineering. The Mission Operations Cost Model (MOCM) estimates these costs as a percentage of the system development cost per year.
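    A back-of-the-envelope sketch of how the three LCC pieces combine. Every number below (development cost, launch price, gear ratio, annual ops fraction) is an illustrative assumption, not a value from the models named above.

```python
# Illustrative life cycle cost roll-up (all inputs are assumed values)
system_mass_kg = 1500.0          # emplaced payload mass
dev_cost = 250e6                 # design & development estimate (e.g., from a parametric model)
launch_price_per_kg = 10e3       # cost to deliver 1 kg to LEO (assumption)
gear_ratio = 3.0                 # total launch mass per kg of emplaced payload (mission dependent)
ops_fraction_per_year = 0.05     # annual ops & maintenance as a fraction of development cost
mission_years = 10

launch_cost = system_mass_kg * gear_ratio * launch_price_per_kg
ops_cost = ops_fraction_per_year * dev_cost * mission_years
life_cycle_cost = dev_cost + launch_cost + ops_cost
print(f"LCC = {life_cycle_cost / 1e6:.1f} M$")
```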

  5. Parameter Estimation of Ion Current Formulations Requires Hybrid Optimization Approach to Be Both Accurate and Reliable

    PubMed Central

    Loewe, Axel; Wilhelms, Mathias; Schmid, Jochen; Krause, Mathias J.; Fischer, Fathima; Thomas, Dierk; Scholz, Eberhard P.; Dössel, Olaf; Seemann, Gunnar

    2016-01-01

    Computational models of cardiac electrophysiology have provided insights into arrhythmogenesis and paved the way toward tailored therapies in recent years. To fully leverage in silico models in future research, however, these models need to be adapted to reflect pathologies, genetic alterations, or pharmacological effects. A common approach is to leave the structure of established models unaltered and estimate the values of a set of parameters. Today's high-throughput patch clamp data acquisition methods require robust, unsupervised algorithms that estimate parameters both accurately and reliably. In this work, two classes of optimization approaches are evaluated: gradient-based trust-region-reflective and derivative-free particle swarm algorithms. Using synthetic input data and different ion current formulations from the Courtemanche et al. electrophysiological model of human atrial myocytes, we show that neither of the two schemes alone succeeds in meeting all requirements. Sequential combination of the two algorithms did improve the performance to some extent but not satisfactorily. Thus, we propose a novel hybrid approach coupling the two algorithms in each iteration. This hybrid approach yielded very accurate estimates with minimal dependency on the initial guess using synthetic input data for which a ground truth parameter set exists. When applied to measured data, the hybrid approach yielded the best fit, again with minimal variation. Using the proposed algorithm, a single run is sufficient to estimate the parameters. The degree of superiority over the other investigated algorithms in terms of accuracy and robustness depended on the type of current. In contrast to the non-hybrid approaches, the proposed method proved to be optimal for data of arbitrary signal-to-noise ratio. The hybrid algorithm proposed in this work provides an important tool to integrate experimental data into computational models both accurately and robustly allowing to assess the often non

  6. Cost estimate guidelines for advanced nuclear power technologies

    SciTech Connect

    Delene, J.G.; Hudson, C.R. II

    1993-05-01

    Several advanced power plant concepts are currently under development. These include the Modular High Temperature Gas Cooled Reactors, the Advanced Liquid Metal Reactor and the Advanced Light Water Reactors. One measure of the attractiveness of a new concept is its cost. Invariably, the cost of a new type of power plant will be compared with that of other alternative forms of electrical generation. This report provides a common starting point, whereby the cost estimates for the various power plants to be considered are developed with common assumptions and ground rules. Comparisons can then be made on a consistent basis. This is the second update of these cost estimate guidelines. Changes have been made to make the guidelines more current (January 1, 1992) and in response to suggestions made as a result of the use of the previous report. The principal changes are that the reference site has been changed from a generic Northeast (Middletown) site to a more central site (EPRI's East/West Central site) and that reference bulk commodity prices and labor productivity rates have been added. This report is designed to provide a framework for the preparation and reporting of costs. The cost estimates will consist of the overnight construction cost, the total plant capital cost, the operation and maintenance costs, the fuel costs, decommissioning costs and the power production or busbar generation cost.

  7. The unit cost factors and calculation methods for decommissioning - Cost estimation of nuclear research facilities

    SciTech Connect

    Kwan-Seong Jeong; Dong-Gyu Lee; Chong-Hun Jung; Kune-Woo Lee

    2007-07-01

    Available in abstract form only. Full text of publication follows: The uncertainties in decommissioning costs are high due to several conditions. Decommissioning cost estimation depends on the complexity of nuclear installations and on their site-specific physical and radiological inventories. Therefore, the decommissioning costs of nuclear research facilities must be estimated in accordance with the detailed sub-tasks and resources of each decommissioning activity. By selecting the classified activities and resources, costs are calculated item by item, and the total costs of all decommissioning activities are then reorganized to match their usage and objectives. The decommissioning cost of nuclear research facilities is calculated by applying a unit cost factor method, based on a classification of decommissioning work suited to the features and specifications of the decommissioning objects and on established composition factors. Decommissioning costs of nuclear research facilities are composed of labor cost, equipment cost and materials cost. Of these three categories, the calculation of labor costs is particularly important because decommissioning activities depend mainly on labor force. Labor costs in decommissioning activities are calculated on the basis of the working time consumed on the decommissioning objects and works. The working times are derived from unit cost factors and work difficulty factors. Finally, labor costs are computed using these factors as calculation parameters. The accuracy of the decommissioning cost estimation results is much higher when compared with actual decommissioning work. (authors)
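    A minimal sketch of the unit-cost-factor idea for labor: working time is a unit factor scaled by a work-difficulty factor and the inventory quantity, then converted to cost. The task names, factor values, and labor rate below are hypothetical.

```python
# Hypothetical task inventory: (quantity, unit, unit cost factor [h/unit], difficulty factor)
tasks = {
    "cut piping":         (120.0, "m",    0.8, 1.2),
    "decontaminate wall": (300.0, "m2",   0.5, 1.5),
    "package waste":      (40.0,  "drum", 2.0, 1.0),
}
labor_rate = 55.0  # cost per labor-hour (assumption)

total_hours = 0.0
for name, (qty, unit, ucf, wdf) in tasks.items():
    hours = qty * ucf * wdf          # working time from unit cost factor and difficulty factor
    total_hours += hours
    print(f"{name}: {hours:.0f} labor-hours")

print(f"Estimated labor cost: {total_hours * labor_rate:,.0f}")
```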

  8. Estimating the cost of large superconducting thin solenoid magnets

    SciTech Connect

    Green, M.A.; St. Lorant, S.J.

    1993-07-01

    The cost of thin superconducting solenoid magnets can be estimated if one knows the magnet stored energy, the magnetic field-volume product, or the overall mass of the superconducting coil and its cryostat. This report shows cost data collected since 1979 for large superconducting solenoid magnets used in high energy physics. These magnets are characterized in most cases by the use of indirect two-phase helium cooling and a superconductor stabilizer of very pure aluminum. This correlation can be used for making a preliminary cost estimate of proposed one-of-a-kind superconducting magnets. The magnet costs quoted include the power supply and quench protection system, but the cost of the helium refrigerator and helium distribution system is not included in the estimated cost.
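    Cost correlations of this kind are often expressed as a power law in a single scaling variable, e.g. C = a·E^b in stored energy, fitted in log space. The sketch below works under that assumption; the data points and fitted coefficients are invented for illustration and are not the report's correlation.

```python
import numpy as np

# Hypothetical (stored energy [MJ], cost [M$]) pairs for past magnets
energy_mj = np.array([20.0, 50.0, 110.0, 260.0, 600.0])
cost_musd = np.array([3.1, 5.4, 9.0, 16.5, 30.0])

# Fit C = a * E**b by linear regression in log-log space
b, log_a = np.polyfit(np.log(energy_mj), np.log(cost_musd), 1)
a = np.exp(log_a)
print(f"C ~ {a:.2f} * E^{b:.2f}  (C in M$, E in MJ)")

# Preliminary estimate for a proposed 400 MJ solenoid
print(f"Estimated cost: {a * 400.0 ** b:.1f} M$")
```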

  9. Intraocular lens power estimation by accurate ray tracing for eyes underwent previous refractive surgeries

    NASA Astrophysics Data System (ADS)

    Yang, Que; Wang, Shanshan; Wang, Kai; Zhang, Chunyu; Zhang, Lu; Meng, Qingyu; Zhu, Qiudong

    2015-08-01

    For normal eyes without a history of any ocular surgery, traditional equations for calculating intraocular lens (IOL) power, such as SRK-T, Holladay, Haigis, and SRK-II, are all relatively accurate. However, for eyes that have undergone refractive surgeries such as LASIK, or eyes diagnosed with keratoconus, these equations may cause significant postoperative refractive error, which may lead to poor satisfaction after cataract surgery. Although some methods have been proposed to solve this problem, such as the Haigis-L equation[1], or using preoperative data (data before LASIK) to estimate the K value[2], no precise equations are available for these eyes. Here, we introduce a novel intraocular lens power estimation method based on accurate ray tracing with the optical design software ZEMAX. Instead of using a traditional regression formula, we adopt the exact measured corneal elevation distribution, central corneal thickness, anterior chamber depth, axial length, and estimated effective lens plane as the input parameters. The calculated intraocular lens power for a patient with keratoconus and for a post-LASIK patient agreed well with their visual outcomes after cataract surgery.

  10. A 1% treadmill grade most accurately reflects the energetic cost of outdoor running.

    PubMed

    Jones, A M; Doust, J H

    1996-08-01

    When running indoors on a treadmill, the lack of air resistance results in a lower energy cost compared with running outdoors at the same velocity. A slight incline of the treadmill gradient can be used to increase the energy cost in compensation. The aim of this study was to determine the treadmill gradient that most accurately reflects the energy cost of outdoor running. Nine trained male runners, thoroughly habituated to treadmill running, ran for 6 min at six different velocities (2.92, 3.33, 3.75, 4.17, 4.58 and 5.0 m s-1) with 6 min recovery between runs. This routine was repeated six times, five times on a treadmill set at different grades (0%, 0%, 1%, 2%, 3%) and once outdoors along a level road. Duplicate collections of expired air were taken during the final 2 min of each run to determine oxygen consumption. The repeatability of the methodology was confirmed by high correlations (r = 0.99) and non-significant differences between the duplicate expired air collections and between the repeated runs at 0% grade. The relationship between oxygen uptake (VO2) and velocity for each grade was highly linear (r > 0.99). At the two lowest velocities, VO2 during road running was not significantly different from treadmill running at 0% or 1% grade, but was significantly less than 2% and 3% grade. For 3.75 m s-1, the VO2 during road running was significantly different from treadmill running at 0%, 2% and 3% grades but not from 1% grade. For 4.17 and 4.58 m s-1, the VO2 during road running was not significantly different from that at 1% or 2% grade but was significantly greater than 0% grade and significantly less than 3% grade. At 5.0 m s-1, the VO2 for road running fell between the VO2 value for 1% and 2% grade treadmill running but was not significantly different from any of the treadmill grade conditions. This study demonstrates equality of the energetic cost of treadmill and outdoor running with the use of a 1% treadmill grade over a duration of approximately 5 min

  11. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied to ophthalmology, cardiology, etc. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging to improve the signal-to-noise ratio (SNR) and the contrast of the phase retardation (or birefringence) images introduces a noise bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for a quantitative study. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved the image quality, the fundamental limitation of the nonlinear dependency of phase retardation and birefringence on the SNR was not overcome. So the birefringence obtained by PS-OCT was still not accurate for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for a Jones matrix OCT (JM-OCT) [1]. Based on this, we had developed a maximum a-posteriori (MAP) estimator and quantitative birefringence imaging was demonstrated [2]. However, this first version of the estimator had a theoretical shortcoming. It did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved version of the MAP estimator which takes into account the stochastic property of SNR. This estimator uses a probability distribution function (PDF) of true local retardation, which is proportional to birefringence, under a specific set of measurements of the birefringence and SNR. The PDF was pre-computed by a Monte-Carlo (MC) simulation based on the mathematical model of JM-OCT before the measurement. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and in vivo measurements of anterior and

  12. Improving The Discipline of Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Piland, William M.; Pine, David J.; Wilson, Delano M.

    2000-01-01

    The need to improve the quality and accuracy of cost estimates of proposed new aerospace systems has been widely recognized. The industry has done the best job of maintaining related capability with improvements in estimation methods and giving appropriate priority to the hiring and training of qualified analysts. Some parts of Government, and National Aeronautics and Space Administration (NASA) in particular, continue to need major improvements in this area. Recently, NASA recognized that its cost estimation and analysis capabilities had eroded to the point that the ability to provide timely, reliable estimates was impacting the confidence in planning many program activities. As a result, this year the Agency established a lead role for cost estimation and analysis. The Independent Program Assessment Office located at the Langley Research Center was given this responsibility. This paper presents the plans for the newly established role. Described is how the Independent Program Assessment Office, working with all NASA Centers, NASA Headquarters, other Government agencies, and industry, is focused on creating cost estimation and analysis as a professional discipline that will be recognized equally with the technical disciplines needed to design new space and aeronautics activities. Investments in selected, new analysis tools, creating advanced training opportunities for analysts, and developing career paths for future analysts engaged in the discipline are all elements of the plan. Plans also include increasing the human resources available to conduct independent cost analysis of Agency programs during their formulation, to improve near-term capability to conduct economic cost-benefit assessments, to support NASA management's decision process, and to provide cost analysis results emphasizing "full-cost" and "full-life cycle" considerations. The Agency cost analysis improvement plan has been approved for implementation starting this calendar year. Adequate financial

  13. Space Station Furnace Facility. Volume 3: Program cost estimate

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The approach used to estimate costs for the Space Station Furnace Facility (SSFF) is based on a computer program developed internally at Teledyne Brown Engineering (TBE). The program produces time-phased estimates of cost elements for each hardware component, based on experience with similar components. Engineering estimates of the degree of similarity or difference between the current project and the historical data are then used to adjust the computer-produced cost estimate and to fit it to the current project Work Breakdown Structure (WBS). The SSFF Concept as presented at the Requirements Definition Review (RDR) was used as the base configuration for the cost estimate. This program incorporates data on costs of previous projects and the allocation of those costs to the components of one of three time-phased, generic WBSs. Input consists of a list of similar components for which cost data exist, the number of interfaces with their type and complexity, identification of the extent to which previous designs are applicable, and programmatic data concerning schedules and miscellaneous data (travel, off-site assignments). Output is program cost in labor hours and material dollars, for each component, broken down by generic WBS task and program schedule phase.

  14. COSTMODL: An automated software development cost estimation tool

    NASA Technical Reports Server (NTRS)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sectors. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
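    The recalibration idea can be illustrated with a generic COCOMO-style parametric form, effort = a·(KSLOC)^b adjusted by multiplicative cost drivers, refit from an organization's own completed projects. This is a sketch of that generic form, not COSTMODL's actual equations; the project data are invented.

```python
import numpy as np

# Historical projects from the user's organization: (KSLOC, actual effort in person-months)
ksloc = np.array([12.0, 30.0, 55.0, 90.0, 140.0])
effort = np.array([38.0, 105.0, 210.0, 370.0, 620.0])

# Recalibrate effort = a * KSLOC**b to local productivity (fit in log space)
b, log_a = np.polyfit(np.log(ksloc), np.log(effort), 1)
a = np.exp(log_a)

def estimate_effort(size_ksloc, cost_drivers=(1.0,)):
    """Effort estimate with optional multiplicative cost-driver adjustments."""
    return a * size_ksloc ** b * np.prod(cost_drivers)

print(f"Calibrated model: effort = {a:.2f} * KSLOC^{b:.2f}")
print(f"Estimate for 70 KSLOC: {estimate_effort(70.0):.0f} person-months")
```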

  15. Commercial Crew Cost Estimating - A Look at Estimating Processes, Challenges and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Battle, Rick; Cole, Lance

    2015-01-01

    To support annual PPBE budgets and NASA HQ requests for cost information for commercial crew transportation to the International Space Station (ISS), the NASA ISS ACES team developed system development and per-flight cost estimates for the potential providers for each annual PPBE submit from 2009-2014. This paper describes the cost estimating processes used, the challenges, and the lessons learned in developing estimates for this key NASA project that departed from the traditional procurement approach and used a new way of doing business.

  16. READSCAN: a fast and scalable pathogen discovery program with accurate genome relative abundance estimation

    PubMed Central

    Rashid, Mamoon; Pain, Arnab

    2013-01-01

    Summary: READSCAN is a highly scalable parallel program to identify non-host sequences (of potential pathogen origin) and estimate their genome relative abundance in high-throughput sequence datasets. READSCAN accurately classified human and viral sequences on a 20.1 million reads simulated dataset in <27 min using a small Beowulf compute cluster with 16 nodes (Supplementary Material). Availability: http://cbrc.kaust.edu.sa/readscan Contact: arnab.pain@kaust.edu.sa or raeece.naeem@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23193222

  17. Estimating instantaneous energetic cost during non-steady-state gait.

    PubMed

    Selinger, Jessica C; Donelan, J Maxwell

    2014-12-01

    Respiratory measures of oxygen and carbon dioxide are routinely used to estimate the body's steady-state metabolic energy use. However, slow mitochondrial dynamics, long transit times, complex respiratory control mechanisms, and high breath-by-breath variability obscure the relationship between the body's instantaneous energy demands (instantaneous energetic cost) and that measured from respiratory gases (measured energetic cost). The purpose of this study was to expand on traditional methods of assessing metabolic cost by estimating instantaneous energetic cost during non-steady-state conditions. To accomplish this goal, we first imposed known changes in energy use (input), while measuring the breath-by-breath response (output). We used these input/output relationships to model the body as a dynamic system that maps instantaneous to measured energetic cost. We found that a first-order linear differential equation well approximates transient energetic cost responses during gait. Across all subjects, model fits were parameterized by an average time constant (τ) of 42 ± 12 s with an average R(2) of 0.94 ± 0.05 (mean ± SD). Armed with this input/output model, we next tested whether we could use it to reliably estimate instantaneous energetic cost from breath-by-breath measures under conditions that simulated dynamically changing gait. A comparison of the imposed energetic cost profiles and our estimated instantaneous cost demonstrated a close correspondence, supporting the use of our methodology to study the role of energetics during locomotor adaptation and learning.
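    A sketch of the first-order model described above: measured cost y(t) lags instantaneous cost u(t) according to τ·dy/dt = u − y. Under that assumption, the forward model can be discretized and τ fitted to breath-by-breath data. The step input, sampling interval, noise level, and grid-search fit below are illustrative stand-ins for the study's actual protocol.

```python
import numpy as np

def simulate_measured_cost(u, dt, tau):
    """Discretized first-order response: tau * dy/dt = u - y."""
    y = np.zeros_like(u)
    for k in range(1, len(u)):
        y[k] = y[k - 1] + dt / tau * (u[k - 1] - y[k - 1])
    return y

dt = 1.0                                  # seconds between samples (assumption)
t = np.arange(0, 600, dt)
u = np.where(t > 120, 10.0, 5.0)          # imposed step in instantaneous cost (illustrative units)
y_meas = simulate_measured_cost(u, dt, tau=42.0) + np.random.normal(0, 0.3, t.size)

# Grid-search fit of the time constant to the noisy "measured" response
taus = np.arange(10.0, 90.0, 0.5)
sse = [np.sum((simulate_measured_cost(u, dt, tau) - y_meas) ** 2) for tau in taus]
print(f"Fitted tau ~ {taus[int(np.argmin(sse))]:.1f} s")
```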

  18. Econometric estimation of country-specific hospital costs.

    PubMed

    Adam, Taghreed; Evans, David B; Murray, Christopher JL

    2003-02-26

    Information on the unit cost of inpatient and outpatient care is an essential element for costing, budgeting and economic-evaluation exercises. Many countries lack reliable estimates, however. WHO has recently undertaken an extensive effort to collect and collate data on the unit cost of hospitals and health centres from as many countries as possible; so far, data have been assembled from 49 countries, for various years during the period 1973-2000. The database covers a total of 2173 country-years of observations. Large gaps remain, however, particularly for developing countries. Although the long-term solution is that all countries perform their own costing studies, the question arises whether it is possible to predict unit costs for different countries in a standardized way for short-term use. The purpose of the work described in this paper, a modelling exercise, was to use the data collected across countries to predict unit costs in countries for which data are not yet available, with the appropriate uncertainty intervals. The model presented here forms part of a series of models used to estimate unit costs for the WHO-CHOICE project. The methods and the results of the model, however, may be used to predict a number of different types of country-specific unit costs, depending on the purpose of the exercise. They may be used, for instance, to estimate the costs per bed-day at different capacity levels; the "hotel" component of cost per bed-day; or unit costs net of particular components such as drugs. In addition to reporting estimates for selected countries, the paper shows that unit costs of hospitals vary within countries, sometimes by an order of magnitude. Basing cost-effectiveness studies or budgeting exercises on the results of a study of a single facility, or even a small group of facilities, is likely to be misleading. PMID:12773218

  19. Econometric estimation of country-specific hospital costs

    PubMed Central

    Adam, Taghreed; Evans, David B; Murray, Christopher JL

    2003-01-01

    Information on the unit cost of inpatient and outpatient care is an essential element for costing, budgeting and economic-evaluation exercises. Many countries lack reliable estimates, however. WHO has recently undertaken an extensive effort to collect and collate data on the unit cost of hospitals and health centres from as many countries as possible; so far, data have been assembled from 49 countries, for various years during the period 1973–2000. The database covers a total of 2173 country-years of observations. Large gaps remain, however, particularly for developing countries. Although the long-term solution is that all countries perform their own costing studies, the question arises whether it is possible to predict unit costs for different countries in a standardized way for short-term use. The purpose of the work described in this paper, a modelling exercise, was to use the data collected across countries to predict unit costs in countries for which data are not yet available, with the appropriate uncertainty intervals. The model presented here forms part of a series of models used to estimate unit costs for the WHO-CHOICE project. The methods and the results of the model, however, may be used to predict a number of different types of country-specific unit costs, depending on the purpose of the exercise. They may be used, for instance, to estimate the costs per bed-day at different capacity levels; the "hotel" component of cost per bed-day; or unit costs net of particular components such as drugs. In addition to reporting estimates for selected countries, the paper shows that unit costs of hospitals vary within countries, sometimes by an order of magnitude. Basing cost-effectiveness studies or budgeting exercises on the results of a study of a single facility, or even a small group of facilities, is likely to be misleading. PMID:12773218

  20. Construction cost estimation of municipal incinerators by fuzzy linear regression

    SciTech Connect

    Chang, N.B.; Chen, Y.L.; Yang, H.H.

    1996-12-31

    Regression analysis has been widely used in engineering cost estimation. It is recognized that the fuzzy structure in cost estimation is a different type of uncertainty from the measurement error addressed in least-squares regression modeling. Hence, the uncertainties encountered in many construction and operating cost estimation and prediction problems cannot be fully captured by conventional least-squares regression models. This paper presents a construction cost analysis of municipal incinerators using the techniques of fuzzy linear regression. A thorough investigation of construction costs in the Taiwan Resource Recovery Project was conducted based on design parameters such as design capacity, type of grate system, and the selected air pollution control process. The focus is placed on the methodology for dealing with the heterogeneity of the set of observations over which the regression is evaluated.
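    One common formulation of fuzzy linear regression (Tanaka-style possibilistic regression) solves a linear program: fuzzy coefficients have a center and a non-negative spread, the total spread is minimized, and each observation must fall inside the predicted fuzzy interval at a chosen possibility level h. The sketch below implements that generic formulation, which may differ from the authors' exact approach; the incinerator data, design variables, and h value are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical incinerator data: intercept, design capacity (t/day), grate-type dummy -> cost (M$)
X = np.array([[1.0,  300.0, 0.0],
              [1.0,  450.0, 1.0],
              [1.0,  600.0, 0.0],
              [1.0,  900.0, 1.0],
              [1.0, 1200.0, 1.0]])
y = np.array([35.0, 55.0, 62.0, 95.0, 120.0])
h = 0.5                                     # possibility level (assumption)

n, p = X.shape
absX = np.abs(X)
# Decision variables: p centers a_j (free) followed by p spreads c_j (>= 0)
cost = np.concatenate([np.zeros(p), absX.sum(axis=0)])   # minimize total spread
A_ub = np.vstack([np.hstack([-X, -(1 - h) * absX]),      # y_i <= center + (1-h)*spread
                  np.hstack([ X, -(1 - h) * absX])])     # center - (1-h)*spread <= y_i
b_ub = np.concatenate([-y, y])
bounds = [(None, None)] * p + [(0, None)] * p

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
centers, spreads = res.x[:p], res.x[p:]
print("centers:", centers.round(3), "spreads:", spreads.round(3))
```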

  1. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merlon M.

    2003-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
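    A minimal illustration of the stated dependence of labor time on thrust level via a low-order polynomial fit. The data points and the resulting coefficients are invented for the example and are not the CEM's calibrated values.

```python
import numpy as np

# Hypothetical (thrust [klbf], labor hours) pairs from past test projects
thrust = np.array([5.0, 20.0, 60.0, 150.0, 300.0, 650.0])
labor_hours = np.array([800.0, 1500.0, 2900.0, 5200.0, 8200.0, 14000.0])

# Third-order polynomial fit of labor time on thrust level
coeffs = np.polyfit(thrust, labor_hours, deg=3)
estimate = np.polyval(coeffs, 200.0)
print(f"Estimated labor hours for a 200 klbf test: {estimate:.0f}")
```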

  2. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merlon M.

    2004-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.

  3. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merion M.

    2002-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.

  4. Software cost estimation using class point metrics (CPM)

    NASA Astrophysics Data System (ADS)

    Ghode, Aditi; Periyasamy, Kasilingam

    2011-12-01

    Estimating the cost of a software project is one of the most important and crucial tasks in maintaining software reliability. Many cost estimation models have been reported to date, but most of them have significant drawbacks due to rapid changes in technology. For example, Source Lines Of Code (SLOC) can only be counted when the software construction is complete. The Function Point (FP) metric is deficient in handling Object Oriented Technology, as it was designed for procedural languages such as COBOL. Since Object-Oriented Programming became a popular development practice, most software companies have started applying the Unified Modeling Language (UML). The objective of this research is to develop a new cost estimation model that applies class diagrams to software cost estimation.

  5. Toward an Accurate Estimate of the Exfoliation Energy of Black Phosphorus: A Periodic Quantum Chemical Approach.

    PubMed

    Sansone, Giuseppe; Maschio, Lorenzo; Usvyat, Denis; Schütz, Martin; Karttunen, Antti

    2016-01-01

    The black phosphorus (black-P) crystal is formed of covalently bound layers of phosphorene stacked together by weak van der Waals interactions. An experimental measurement of the exfoliation energy of black-P is not available presently, making theoretical studies the most important source of information for the optimization of phosphorene production. Here, we provide an accurate estimate of the exfoliation energy of black-P on the basis of multilevel quantum chemical calculations, which include the periodic local Møller-Plesset perturbation theory of second order, augmented by higher-order corrections, which are evaluated with finite clusters mimicking the crystal. Very similar results are also obtained by density functional theory with the D3-version of Grimme's empirical dispersion correction. Our estimate of the exfoliation energy for black-P of -151 meV/atom is substantially larger than that of graphite, suggesting the need for different strategies to generate isolated layers for these two systems. PMID:26651397

  6. Toward an Accurate Estimate of the Exfoliation Energy of Black Phosphorus: A Periodic Quantum Chemical Approach.

    PubMed

    Sansone, Giuseppe; Maschio, Lorenzo; Usvyat, Denis; Schütz, Martin; Karttunen, Antti

    2016-01-01

    The black phosphorus (black-P) crystal is formed of covalently bound layers of phosphorene stacked together by weak van der Waals interactions. An experimental measurement of the exfoliation energy of black-P is not available presently, making theoretical studies the most important source of information for the optimization of phosphorene production. Here, we provide an accurate estimate of the exfoliation energy of black-P on the basis of multilevel quantum chemical calculations, which include the periodic local Møller-Plesset perturbation theory of second order, augmented by higher-order corrections, which are evaluated with finite clusters mimicking the crystal. Very similar results are also obtained by density functional theory with the D3-version of Grimme's empirical dispersion correction. Our estimate of the exfoliation energy for black-P of -151 meV/atom is substantially larger than that of graphite, suggesting the need for different strategies to generate isolated layers for these two systems.

  7. Accurate Estimation of Carotid Luminal Surface Roughness Using Ultrasonic Radio-Frequency Echo

    NASA Astrophysics Data System (ADS)

    Kitamura, Kosuke; Hasegawa, Hideyuki; Kanai, Hiroshi

    2012-07-01

    It would be useful to measure the minute surface roughness of the carotid arterial wall to detect the early stage of atherosclerosis. In conventional ultrasonography, the axial resolution of a B-mode image depends on the ultrasonic wavelength of 150 µm at 10 MHz because a B-mode image is constructed using the amplitude of the radio-frequency (RF) echo. Therefore, the surface roughness caused by atherosclerosis in an early stage cannot be measured using a conventional B-mode image obtained by ultrasonography because the roughness is 10-20 µm. We have realized accurate transcutaneous estimation of such a minute surface profile using the lateral motion of the carotid arterial wall, which is estimated by block matching of received ultrasonic signals. However, the width of the region where the surface profile is estimated depends on the magnitude of the lateral displacement of the carotid arterial wall (i.e., if the lateral displacement of the arterial wall is 1 mm, the surface profile is estimated in a region of 1 mm in width). In this study, the width was increased by combining surface profiles estimated using several ultrasonic beams. In the present study, we first measured a fine wire, whose diameter was 13 µm, using ultrasonic equipment to obtain an ultrasonic beam profile for determination of the optimal kernel size for block matching based on the correlation between RF echoes. Second, we estimated the lateral displacement and surface profile of a phantom, which had a saw tooth profile on its surface, and compared the surface profile measured by ultrasound with that measured by a laser profilometer. Finally, we estimated the lateral displacement and surface roughness of the carotid arterial wall of three healthy subjects (24-, 23-, and 23-year-old males) using the proposed method.

  8. Estimating the Costs of Educating Handicapped Children: A Resource-Cost Model Approach-Summary Report.

    ERIC Educational Resources Information Center

    Hartman, William T.

    1981-01-01

    The resource cost model approach makes the programmatic aspects of special education explicit and links these with associated costs. It facilitates planning for education of handicapped children. An effective cost estimation plan was necessary because of recent political and legal mandates which established the educational rights of the…

  9. Estimating the Cost of a Bachelor's Degree: An Institutional Cost Analysis.

    ERIC Educational Resources Information Center

    To, Duc-Le

    The cost of a bachelor's degree was estimated and compared for different types of institutions. The objective was to develop a single index to show how much each type of institution spends on producing a bachelor's degree graduate, and to use trend data to show how these costs will change over time. The basic concept associated with the cost of a…

  10. A Low-Cost Modular Platform for Heterogeneous Data Acquisition with Accurate Interchannel Synchronization

    PubMed Central

    Blanco-Claraco, José Luis; López-Martínez, Javier; Torres-Moreno, José Luis; Giménez-Fernández, Antonio

    2015-01-01

    Most experimental fields of science and engineering require the use of data acquisition systems (DAQ), devices in charge of sampling and converting electrical signals into digital data and, typically, performing all of the required signal preconditioning. Since commercial DAQ systems are normally focused on specific types of sensors and actuators, systems engineers may need to employ mutually-incompatible hardware from different manufacturers in applications demanding heterogeneous inputs and outputs, such as small-signal analog inputs, differential quadrature rotatory encoders or variable current outputs. A common undesirable side effect of heterogeneous DAQ hardware is the lack of an accurate synchronization between samples captured by each device. To solve such a problem with low-cost hardware, we present a novel modular DAQ architecture comprising a base board and a set of interchangeable modules. Our main design goal is the ability to sample all sources at predictable, fixed sampling frequencies, with a reduced synchronization mismatch (<1 μs) between heterogeneous signal sources. We present experiments in the field of mechanical engineering, illustrating vibration spectrum analyses from piezoelectric accelerometers and, as a novelty in these kinds of experiments, the spectrum of quadrature encoder signals. Part of the design and software will be publicly released online. PMID:26516865

  11. A Low-Cost Modular Platform for Heterogeneous Data Acquisition with Accurate Interchannel Synchronization.

    PubMed

    Blanco-Claraco, José Luis; López-Martínez, Javier; Torres-Moreno, José Luis; Giménez-Fernández, Antonio

    2015-01-01

    Most experimental fields of science and engineering require the use of data acquisition systems (DAQ), devices in charge of sampling and converting electrical signals into digital data and, typically, performing all of the required signal preconditioning. Since commercial DAQ systems are normally focused on specific types of sensors and actuators, systems engineers may need to employ mutually-incompatible hardware from different manufacturers in applications demanding heterogeneous inputs and outputs, such as small-signal analog inputs, differential quadrature rotatory encoders or variable current outputs. A common undesirable side effect of heterogeneous DAQ hardware is the lack of an accurate synchronization between samples captured by each device. To solve such a problem with low-cost hardware, we present a novel modular DAQ architecture comprising a base board and a set of interchangeable modules. Our main design goal is the ability to sample all sources at predictable, fixed sampling frequencies, with a reduced synchronization mismatch (<1 µs) between heterogeneous signal sources. We present experiments in the field of mechanical engineering, illustrating vibration spectrum analyses from piezoelectric accelerometers and, as a novelty in these kinds of experiments, the spectrum of quadrature encoder signals. Part of the design and software will be publicly released online. PMID:26516865

  12. Improving the Discipline of Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Piland, William M.; Pine, David J.; Wilson, Delano M.

    2000-01-01

    The need to improve the quality and accuracy of cost estimates of proposed new aerospace systems has been widely recognized. The industry has done the best job of maintaining related capability with improvements in estimation methods and giving appropriate priority to the hiring and training of qualified analysts. Some parts of Government, and National Aeronautics and Space Administration (NASA) in particular, continue to need major improvements in this area. Recently, NASA recognized that its cost estimation and analysis capabilities had eroded to the point that the ability to provide timely, reliable estimates was impacting the confidence in planning many program activities. As a result, this year the Agency established a lead role for cost estimation and analysis. The Independent Program Assessment Office located at the Langley Research Center was given this responsibility.

  13. COST ESTIMATING EQUATIONS FOR BEST MANAGEMENT PRACTICES (BMP)

    EPA Science Inventory

    This paper describes the development of an interactive internet-based cost-estimating tool for commonly used urban storm runoff best management practices (BMP), including: retention and detention ponds, grassed swales, and constructed wetlands. The paper presents the cost data, c...

  14. Lamb mode selection for accurate wall loss estimation via guided wave tomography

    SciTech Connect

    Huthwaite, P.; Ribichini, R.; Lowe, M. J. S.; Cawley, P.

    2014-02-18

    Guided wave tomography offers a method to accurately quantify wall thickness losses in pipes and vessels caused by corrosion. This is achieved using ultrasonic waves transmitted over distances of approximately 1–2 m, which are measured by an array of transducers and then used to reconstruct a map of wall thickness throughout the inspected region. To achieve accurate estimates of remnant wall thickness, it is vital that a suitable Lamb mode is chosen. This paper presents a detailed evaluation of the fundamental modes, S0 and A0, which are of primary interest in guided wave tomography thickness estimates since the higher-order modes do not exist at all thicknesses, comparing their performance using both numerical and experimental data while considering a range of challenging phenomena. The sensitivity of A0 to thickness variations was shown to be superior to that of S0; however, the attenuation of A0 when a liquid loading was present was much higher than that of S0. A0 was also less sensitive to the presence of coatings on the surface than S0.

  15. Estimating design costs for first-of-a-kind projects

    SciTech Connect

    Banerjee, Bakul; /Fermilab

    2006-03-01

    Modern scientific facilities are often outcomes of projects that are first-of-a-kind, that is, minimal historical data are available for project costs and schedules. However, at Fermilab, there was an opportunity to execute two similar projects consecutively. In this paper, a comparative study of the design costs for these two projects is presented using earned value methodology. This study provides some insights into how to estimate the cost of a replicated project.
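    Earned value metrics reduce to a few ratios between planned value, earned value, and actual cost. A minimal sketch with illustrative numbers, not the Fermilab projects' actual figures.

```python
# Illustrative earned value snapshot for a design effort (all values assumed)
planned_value = 1.20e6   # budgeted cost of work scheduled to date
earned_value = 1.05e6    # budgeted cost of work actually performed
actual_cost = 1.30e6     # actual cost of work performed

cpi = earned_value / actual_cost      # cost performance index (<1 means over budget)
spi = earned_value / planned_value    # schedule performance index (<1 means behind schedule)
budget_at_completion = 4.0e6
estimate_at_completion = budget_at_completion / cpi   # simple CPI-based forecast

print(f"CPI={cpi:.2f}, SPI={spi:.2f}, EAC={estimate_at_completion / 1e6:.2f} M$")
```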

  16. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets

    NASA Astrophysics Data System (ADS)

    Granata, Daniele; Carnevale, Vincenzo

    2016-08-01

    The collective behavior of a large number of degrees of freedom can often be described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant “collective” variables. Here, we analyze this issue by focusing on the optimal number of variables needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset.

  17. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets

    PubMed Central

    Granata, Daniele; Carnevale, Vincenzo

    2016-01-01

    The collective behavior of a large number of degrees of freedom can often be described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant “collective” variables. Here, we analyze this issue by focusing on the optimal number of variables needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset. PMID:27510265

  18. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets.

    PubMed

    Granata, Daniele; Carnevale, Vincenzo

    2016-01-01

    The collective behavior of a large number of degrees of freedom can often be described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant "collective" variables. Here, we analyze this issue by focusing on the optimal number of variables needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset. PMID:27510265

  19. Removing the thermal component from heart rate provides an accurate VO2 estimation in forest work.

    PubMed

    Dubé, Philippe-Antoine; Imbeau, Daniel; Dubeau, Denise; Lebel, Luc; Kolus, Ahmet

    2016-05-01

    Heart rate (HR) was monitored continuously in 41 forest workers performing brushcutting or tree planting work. 10-min seated rest periods were imposed during the workday to estimate the HR thermal component (ΔHRT) per Vogt et al. (1970, 1973). VO2 was measured using a portable gas analyzer during a morning submaximal step-test conducted at the work site, during a work bout over the course of the day (range: 9-74 min), and during an ensuing 10-min rest pause taken at the worksite. The VO2 estimates from measured HR and from corrected HR (thermal component removed) were compared to VO2 measured during work and rest. Varied levels of the HR thermal component (ΔHRTavg range: 0-38 bpm), originating from a wide range of ambient thermal conditions, thermal clothing insulation worn, and physical load exerted during work, were observed. Using raw HR significantly overestimated measured work VO2 by 30% on average (range: 1%-64%). 74% of the VO2 prediction error variance was explained by the HR thermal component. VO2 estimated from corrected HR was not statistically different from measured VO2. Work VO2 can be estimated accurately in the presence of thermal stress using Vogt et al.'s method, which can be implemented easily by the practitioner with inexpensive instruments.
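    The workflow above amounts to calibrating an individual HR-VO2 line from the step test and then applying it to heart rate with the thermal component subtracted. A sketch under that reading; the calibration points, work-bout HR, and ΔHRT value are hypothetical.

```python
import numpy as np

# Morning step-test calibration points for one worker (assumed values)
hr_cal = np.array([85.0, 100.0, 115.0, 130.0])    # beats per minute
vo2_cal = np.array([0.9, 1.3, 1.7, 2.1])          # L/min

slope, intercept = np.polyfit(hr_cal, vo2_cal, 1)  # individual linear HR-VO2 relation

hr_work = 128.0          # average HR measured during a work bout (assumption)
delta_hr_thermal = 15.0  # thermal component estimated from the seated rest pause (assumption)

vo2_raw = slope * hr_work + intercept
vo2_corrected = slope * (hr_work - delta_hr_thermal) + intercept
print(f"VO2 from raw HR: {vo2_raw:.2f} L/min; from corrected HR: {vo2_corrected:.2f} L/min")
```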

  20. MIDAS robust trend estimator for accurate GPS station velocities without step detection

    NASA Astrophysics Data System (ADS)

    Blewitt, Geoffrey; Kreemer, Corné; Hammond, William C.; Gazeaux, Julien

    2016-03-01

    Automatic estimation of velocities from GPS coordinate time series is becoming required to cope with the exponentially increasing flood of available data, but problems detectable to the human eye are often overlooked. This motivates us to find an automatic and accurate estimator of trend that is resistant to common problems such as step discontinuities, outliers, seasonality, skewness, and heteroscedasticity. Developed here, Median Interannual Difference Adjusted for Skewness (MIDAS) is a variant of the Theil-Sen median trend estimator, for which the ordinary version is the median of slopes vij = (xj-xi)/(tj-ti) computed between all data pairs i > j. For normally distributed data, Theil-Sen and least squares trend estimates are statistically identical, but unlike least squares, Theil-Sen is resistant to undetected data problems. To mitigate both seasonality and step discontinuities, MIDAS selects data pairs separated by 1 year. This condition is relaxed for time series with gaps so that all data are used. Slopes from data pairs spanning a step function produce one-sided outliers that can bias the median. To reduce bias, MIDAS removes outliers and recomputes the median. MIDAS also computes a robust and realistic estimate of trend uncertainty. Statistical tests using GPS data in the rigid North American plate interior show ±0.23 mm/yr root-mean-square (RMS) accuracy in horizontal velocity. In blind tests using synthetic data, MIDAS velocities have an RMS accuracy of ±0.33 mm/yr horizontal, ±1.1 mm/yr up, with a 5th percentile range smaller than all 20 automatic estimators tested. Considering its general nature, MIDAS has the potential for broader application in the geosciences.
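    A simplified sketch of the recipe described above: form slopes from data pairs separated by about one year, take the median, trim one-sided outliers with a robust scale, and recompute. This omits the published algorithm's details (gap handling, uncertainty estimate), so treat it only as an illustration; the tolerance, trimming threshold, and synthetic series are assumptions.

```python
import numpy as np

def midas_like_trend(t_years, x, pair_sep=1.0, tol=0.02, trim_sigma=2.0):
    """Median trend from slopes of data pairs separated by ~pair_sep years."""
    slopes = []
    for i in range(len(t_years)):
        dt = t_years - t_years[i]
        j = np.where(np.abs(dt - pair_sep) < tol)[0]     # partners ~1 year later
        slopes.extend((x[j] - x[i]) / dt[j])
    slopes = np.asarray(slopes)
    med = np.median(slopes)
    # Trim outliers (e.g., from undetected steps) using a MAD-based scale, then re-take the median
    scale = 1.4826 * np.median(np.abs(slopes - med))
    kept = slopes[np.abs(slopes - med) < trim_sigma * scale]
    return np.median(kept)

# Synthetic daily series: 3 mm/yr trend + annual signal + noise + an undetected 10 mm step
t = np.arange(0, 8, 1 / 365.25)
x = 3.0 * t + 2.0 * np.sin(2 * np.pi * t) + np.random.normal(0, 1.0, t.size)
x[t > 4.0] += 10.0
print(f"Estimated velocity: {midas_like_trend(t, x):.2f} mm/yr")
```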

  1. Methods for accurate estimation of net discharge in a tidal channel

    USGS Publications Warehouse

    Simpson, M.R.; Bland, R.

    2000-01-01

    Accurate estimates of net residual discharge in tidally affected rivers and estuaries are possible because of recently developed ultrasonic discharge measurement techniques. Previous discharge estimates using conventional mechanical current meters and methods based on stage/discharge relations or water slope measurements often yielded errors that were as great as or greater than the computed residual discharge. Ultrasonic measurement methods consist of: 1) the use of ultrasonic instruments for the measurement of a representative 'index' velocity used for in situ estimation of mean water velocity and 2) the use of the acoustic Doppler current discharge measurement system to calibrate the index velocity measurement data. Methods used to calibrate (rate) the index velocity to the channel velocity measured using the Acoustic Doppler Current Profiler are the most critical factors affecting the accuracy of net discharge estimation. The index velocity first must be related to mean channel velocity and then used to calculate instantaneous channel discharge. Finally, discharge is low-pass filtered to remove the effects of the tides. An ultrasonic velocity meter discharge-measurement site in a tidally affected region of the Sacramento-San Joaquin Rivers was used to study the accuracy of the index velocity calibration procedure. Calibration data consisting of ultrasonic velocity meter index velocity and concurrent acoustic Doppler discharge measurement data were collected during three time periods. Two sets of data were collected during a spring tide (monthly maximum tidal current) and one set during a neap tide (monthly minimum tidal current). The relative magnitude of instrumental errors, acoustic Doppler discharge measurement errors, and calibration errors were evaluated. Calibration error was found to be the most significant source of error in estimating net discharge. Using a comprehensive calibration method, net discharge estimates developed from the three
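    A sketch of the procedure outlined above: rate the index velocity against ADCP-derived mean channel velocity, convert to discharge, then low-pass filter to strip the tides. The rating points, channel area, synthetic tidal signal, and filter choice (a Butterworth filter rather than a tidal-specific filter such as Godin's) are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Step 1: rating -- linear fit of ADCP mean channel velocity against concurrent index velocity
v_index_cal = np.array([-0.9, -0.4, 0.1, 0.5, 1.0])      # m/s, from the ultrasonic meter
v_mean_cal = np.array([-0.8, -0.35, 0.08, 0.46, 0.95])   # m/s, from ADCP calibration measurements
slope, intercept = np.polyfit(v_index_cal, v_mean_cal, 1)

# Step 2: instantaneous discharge from a continuous index-velocity record (synthetic tidal signal)
dt_hours = 0.25
t = np.arange(0, 30 * 24, dt_hours)                       # 30 days of 15-min samples
v_index = 0.8 * np.sin(2 * np.pi * t / 12.42) + 0.05      # semidiurnal tide plus a small residual
area_m2 = 1200.0                                           # assumed channel cross-sectional area
q = (slope * v_index + intercept) * area_m2                # instantaneous discharge, m^3/s

# Step 3: low-pass filter (cutoff ~ 1/(40 h)) to remove tidal variability and leave net discharge
b, a = butter(4, (1.0 / 40.0) / (0.5 / dt_hours))
q_net = filtfilt(b, a, q)
print(f"Mean net discharge: {q_net.mean():.1f} m^3/s")
```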

  2. MIDAS robust trend estimator for accurate GPS station velocities without step detection

    PubMed Central

    Kreemer, Corné; Hammond, William C.; Gazeaux, Julien

    2016-01-01

    Automatic estimation of velocities from GPS coordinate time series is becoming required to cope with the exponentially increasing flood of available data, but problems detectable to the human eye are often overlooked. This motivates us to find an automatic and accurate estimator of trend that is resistant to common problems such as step discontinuities, outliers, seasonality, skewness, and heteroscedasticity. Developed here, Median Interannual Difference Adjusted for Skewness (MIDAS) is a variant of the Theil‐Sen median trend estimator, for which the ordinary version is the median of slopes vij = (xj–xi)/(tj–ti) computed between all data pairs i > j. For normally distributed data, Theil‐Sen and least squares trend estimates are statistically identical, but unlike least squares, Theil‐Sen is resistant to undetected data problems. To mitigate both seasonality and step discontinuities, MIDAS selects data pairs separated by 1 year. This condition is relaxed for time series with gaps so that all data are used. Slopes from data pairs spanning a step function produce one‐sided outliers that can bias the median. To reduce bias, MIDAS removes outliers and recomputes the median. MIDAS also computes a robust and realistic estimate of trend uncertainty. Statistical tests using GPS data in the rigid North American plate interior show ±0.23 mm/yr root‐mean‐square (RMS) accuracy in horizontal velocity. In blind tests using synthetic data, MIDAS velocities have an RMS accuracy of ±0.33 mm/yr horizontal, ±1.1 mm/yr up, with a 5th percentile range smaller than all 20 automatic estimators tested. Considering its general nature, MIDAS has the potential for broader application in the geosciences. PMID:27668140

  4. Accurate Relative Location Estimates for the North Korean Nuclear Tests Using Empirical Slowness Corrections

    NASA Astrophysics Data System (ADS)

    Gibbons, S. J.; Pabian, F.; Näsholm, S. P.; Kværna, T.; Mykkeltveit, S.

    2016-10-01

    modified velocity gradients reduce the residuals, the relative location uncertainties, and the sensitivity to the combination of stations used. The traveltime gradients appear to be overestimated for the regional phases, and teleseismic relative location estimates are likely to be more accurate despite an apparent lower precision. Calibrations for regional phases are essential given that smaller magnitude events are likely not to be recorded teleseismically. We discuss the implications for the absolute event locations. Placing the 2006 event under a local maximum of overburden at 41.293°N, 129.105°E would imply a location of 41.299°N, 129.075°E for the January 2016 event, providing almost optimal overburden for the later four events.
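
    The relative location estimates discussed here rest on the standard plane-wave relationship between a small epicentral offset and the differential arrival times observed at common stations. A minimal least-squares sketch of that geometric core is given below; the sign convention, the assumption of purely horizontal slowness, and the example numbers are illustrative and do not reproduce the empirical slowness corrections developed in the study.

```python
import numpy as np

def relative_location(d_times_s, azimuths_deg, slowness_s_per_km):
    """Least-squares estimate of the (east, north) offset in km of event 2
    relative to event 1 from differential arrival times at common stations,
    assuming plane-wave propagation away from the source region."""
    az = np.radians(azimuths_deg)                       # source-to-station azimuths
    u = np.column_stack((np.sin(az), np.cos(az)))       # unit vectors toward stations
    # moving the source toward a station shortens the travel time: dt ~ -s * (u . dx)
    G = -u * np.asarray(slowness_s_per_km)[:, None]
    dx, *_ = np.linalg.lstsq(G, np.asarray(d_times_s), rcond=None)
    return dx                                           # (d_east, d_north) in km

# hypothetical example: three stations at different azimuths and phase slownesses
offset = relative_location([-0.12, 0.05, 0.20], [30.0, 150.0, 270.0], [0.125, 0.125, 0.06])
```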

  5. 40 CFR 264.142 - Cost estimate for closure.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... estimate for inflation within 60 days prior to the anniversary date of the establishment of the financial... guarantee, the closure cost estimate must be updated for inflation within 30 days after the close of the... current dollars, or by using an inflation factor derived from the most recent Implicit Price Deflator...

  6. Hydrogen Production Cost Estimate Using Biomass Gasification: Independent Review

    SciTech Connect

    none,

    2011-10-01

    This independent review presents the conclusions reached through data collection, document reviews, interviews, and deliberation from December 2010 through April 2011 regarding the cost and technical potential of hydrogen production using biomass gasification. The Panel reviewed the current H2A case (Version 2.12, Case 01D) for hydrogen production via biomass gasification and identified four principal components of hydrogen levelized cost: CapEx; feedstock costs; project financing structure; and efficiency/hydrogen yield. The Panel reexamined the assumptions around these components and arrived at new estimates and approaches that better reflect the current technology and business environments.
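
    The levelized cost components named in the review (CapEx, feedstock, financing, and efficiency/yield) can be illustrated with a generic levelized-cost calculation. The sketch below is not the H2A model; the capital recovery factor form, default discount rate, lifetime, and example inputs are assumptions for illustration only.

```python
def levelized_cost_of_hydrogen(capex_usd, fixed_opex_usd_per_yr,
                               feedstock_usd_per_kg_h2, annual_kg_h2,
                               discount_rate=0.10, lifetime_yr=20):
    """Generic levelized cost: annualized capital plus fixed O&M and feedstock
    costs, divided by annual hydrogen output (USD per kg H2)."""
    crf = (discount_rate * (1 + discount_rate) ** lifetime_yr
           / ((1 + discount_rate) ** lifetime_yr - 1))   # capital recovery factor
    annual_cost = (capex_usd * crf + fixed_opex_usd_per_yr
                   + feedstock_usd_per_kg_h2 * annual_kg_h2)
    return annual_cost / annual_kg_h2

# hypothetical inputs, not the H2A case values
print(levelized_cost_of_hydrogen(2.0e8, 8.0e6, 1.10, 5.0e7))
```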

  7. Fuel Cell System for Transportation -- 2005 Cost Estimate

    SciTech Connect

    Wheeler, D.

    2006-10-01

    Independent review report of the methodology used by TIAX to estimate the cost of producing PEM fuel cells using 2005 cell stack technology. The U.S. Department of Energy (DOE) Hydrogen, Fuel Cells and Infrastructure Technologies Program Manager asked the National Renewable Energy Laboratory (NREL) to commission an independent review of the 2005 TIAX cost analysis for fuel cell production. The NREL Systems Integrator is responsible for conducting independent reviews of progress toward meeting the DOE Hydrogen Program (the Program) technical targets. An important technical target of the Program is the proton exchange membrane (PEM) fuel cell cost in terms of dollars per kilowatt ($/kW). The Program's Multi-Year Program Research, Development, and Demonstration Plan established $125/kW as the 2005 technical target. Over the last several years, the Program has contracted with TIAX, LLC (TIAX) to produce estimates of the high volume cost of PEM fuel cell production for transportation use. Since no manufacturer is yet producing PEM fuel cells in the quantities needed for an initial hydrogen-based transportation economy, these estimates are necessary for DOE to gauge progress toward meeting its targets. For a PEM fuel cell system configuration developed by Argonne National Laboratory, TIAX estimated the total cost to be $108/kW, based on assumptions of 500,000 units per year produced with 2005 cell stack technology, vertical integration of cell stack manufacturing, and balance-of-plant (BOP) components purchased from a supplier network. Furthermore, TIAX conducted a Monte Carlo analysis by varying ten key parameters over a wide range of values and estimated with 98% certainty that the mean PEM fuel cell system cost would be below DOE's 2005 target of $125/kW. NREL commissioned DJW TECHNOLOGY, LLC to form an Independent Review Team (the Team) of industry fuel cell experts and to evaluate the cost estimation process and the results reported by TIAX. The results of this
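
    The Monte Carlo portion of the analysis (varying key parameters and reporting the probability that system cost falls below the $125/kW target) can be illustrated with a generic sketch. The three cost categories, the triangular distributions, and all numbers below are hypothetical; TIAX varied ten parameters that are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def monte_carlo_system_cost(n=100_000, target_usd_per_kw=125.0):
    """Sample hypothetical cost drivers (USD/kW) and report the mean system
    cost and the probability that it falls below the target."""
    stack = rng.triangular(55.0, 70.0, 95.0, n)      # cell stack
    bop = rng.triangular(25.0, 32.0, 45.0, n)        # balance-of-plant components
    assembly = rng.triangular(4.0, 6.0, 10.0, n)     # assembly and test
    total = stack + bop + assembly
    return total.mean(), float((total < target_usd_per_kw).mean())

mean_cost, prob_below_target = monte_carlo_system_cost()
```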

  8. Accurate estimation of human body orientation from RGB-D sensors.

    PubMed

    Liu, Wu; Zhang, Yongdong; Tang, Sheng; Tang, Jinhui; Hong, Richang; Li, Jintao

    2013-10-01

    Accurate estimation of human body orientation can significantly enhance the analysis of human behavior, which is a fundamental task in the field of computer vision. However, existing orientation estimation methods cannot handle the wide variation in body poses and appearances. In this paper, we propose an innovative RGB-D-based orientation estimation method to address these challenges. By utilizing RGB-D information, which can be acquired in real time by RGB-D sensors, our method is robust to cluttered environments, illumination changes, and partial occlusion. Specifically, efficient static and motion cue extraction methods are proposed based on RGB-D superpixels to reduce the noise of depth data. Since it is hard to discriminate the full 360° of orientation using static cues or motion cues independently, we propose to utilize a dynamic Bayesian network system (DBNS) to effectively exploit the complementary nature of static and motion cues. To verify the proposed method, we built an RGB-D-based human body orientation dataset that covers a wide diversity of poses and appearances. Extensive experimental evaluations on this dataset demonstrate the effectiveness and efficiency of the proposed method. PMID:23893759

  9. Accurate estimation of motion blur parameters in noisy remote sensing image

    NASA Astrophysics Data System (ADS)

    Shi, Xueyan; Wang, Lin; Shao, Xiaopeng; Wang, Huilin; Tao, Zhong

    2015-05-01

    The relative motion between a remote sensing satellite sensor and objects on the ground is one of the most common causes of remote sensing image degradation. It seriously weakens image data interpretation and information extraction. In practice, the point spread function (PSF) must be estimated first for image restoration. Identifying the motion blur direction and length accurately is crucial for determining the PSF and restoring the image with precision. In general, the regular light-and-dark stripes in the spectrum can be used to obtain these parameters via the Radon transform. However, the strong noise present in actual remote sensing images often renders the stripes indistinct, so the parameters become difficult to calculate and the resulting error is relatively large. In this paper, an improved motion blur parameter identification method for noisy remote sensing images is proposed to solve this problem. The spectral characteristics of noisy remote sensing images are analyzed first. An interactive image segmentation method based on graph theory, GrabCut, is adopted to effectively extract the edge of the bright central region in the spectrum. The motion blur direction is estimated by applying the Radon transform to the segmentation result. To reduce random error, a method based on whole-column statistics is used when calculating the blur length. Finally, the Lucy-Richardson algorithm is applied to restore remote sensing images of the Moon after the blur parameters are estimated. The experimental results verify the effectiveness and robustness of our algorithm.
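
    The direction-estimation step described above (finding the orientation of the spectral stripes with a Radon transform) can be sketched as follows. This is a simplified stand-in: it applies the Radon transform to the normalized log spectrum directly rather than to a GrabCut segmentation, and the angle grid and variance criterion are assumptions.

```python
import numpy as np
from skimage.transform import radon

def estimate_blur_direction(image, angles=np.arange(0.0, 180.0, 1.0)):
    """Estimate the motion-blur direction from the stripes in the image
    spectrum: the variance of the Radon projection peaks when the projection
    direction is aligned with the stripes."""
    spectrum = np.log1p(np.abs(np.fft.fftshift(np.fft.fft2(image))))
    spectrum = (spectrum - spectrum.mean()) / spectrum.std()
    sinogram = radon(spectrum, theta=angles, circle=True)
    # depending on the stripe/Radon angle convention, a 90-degree offset may apply
    return angles[int(np.argmax(sinogram.var(axis=0)))]   # direction in degrees
```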

  10. Efficient and accurate estimation of relative order tensors from λ- maps

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Rishi; Miao, Xijiang; Shealy, Paul; Valafar, Homayoun

    2009-06-01

    The rapid increase in the availability of RDC data from multiple alignment media in recent years has necessitated the development of more sophisticated analyses that extract the RDC data's full information content. This article presents an analysis of the distribution of RDCs from two media (2D-RDC data), using the information obtained from a λ-map. This article also introduces an efficient algorithm, which leverages these findings to extract the order tensors for each alignment medium using unassigned RDC data in the absence of any structural information. The results of applying this 2D-RDC analysis method to synthetic and experimental data are reported in this article. The relative order tensor estimates obtained from the 2D-RDC analysis are compared to order tensors obtained from the program REDCAT after using assignment and structural information. The final comparisons indicate that the relative order tensors estimated from the unassigned 2D-RDC method very closely match the results from methods that require assignment and structural information. The presented method is successful even in cases with small datasets. The results of analyzing experimental RDC data for the protein 1P7E are presented to demonstrate the potential of the presented work in accurately estimating the principal order parameters from RDC data that incompletely sample the RDC space. In addition to the new algorithm, a discussion of the uniqueness of the solutions is presented; no more than two clusters of distinct solutions have been shown to satisfy each λ-map.

  12. Accurate estimation of the RMS emittance from single current amplifier data

    SciTech Connect

    Stockli, Martin P.; Welton, R.F.; Keller, R.; Letchford, A.P.; Thomae, R.W.; Thomason, J.W.G.

    2002-05-31

    This paper presents the SCUBEEx rms emittance analysis, a self-consistent, unbiased elliptical exclusion method, which combines traditional data-reduction methods with statistical methods to obtain accurate estimates for the rms emittance. Rather than considering individual data, the method tracks the average current density outside a well-selected, variable boundary to separate the measured beam halo from the background. The average outside current density is assumed to be part of a uniform background and not part of the particle beam. Therefore the average outside current is subtracted from the data before evaluating the rms emittance within the boundary. As the boundary area is increased, the average outside current and the inside rms emittance form plateaus when all data containing part of the particle beam are inside the boundary. These plateaus mark the smallest acceptable exclusion boundary and provide unbiased estimates for the average background and the rms emittance. Small, trendless variations within the plateaus allow for determining the uncertainties of the estimates caused by variations of the measured background outside the smallest acceptable exclusion boundary. The robustness of the method is established with complementary variations of the exclusion boundary. This paper presents a detailed comparison between traditional data reduction methods and SCUBEEx by analyzing two complementary sets of emittance data obtained with a Lawrence Berkeley National Laboratory and an ISIS H⁻ ion source.
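
    The plateau-search logic described above can be sketched for gridded phase-space data. The boundary parameterization below (a single ellipse scaled from the density-weighted rms sizes) and the scale range are simplifications of the published procedure; the coordinate meshes x, xp and the measured density d are assumed inputs.

```python
import numpy as np

def rms_emittance(x, xp, d):
    """Density-weighted rms emittance of a phase-space distribution d(x, x')."""
    w = d.sum()
    mx, mxp = (d * x).sum() / w, (d * xp).sum() / w
    sxx = (d * (x - mx) ** 2).sum() / w
    spp = (d * (xp - mxp) ** 2).sum() / w
    sxp = (d * (x - mx) * (xp - mxp)).sum() / w
    return np.sqrt(max(sxx * spp - sxp ** 2, 0.0))

def scubeex_like(x, xp, d, scales=np.linspace(1.5, 5.0, 29)):
    """Grow an elliptical exclusion boundary, subtract the mean outside density
    (treated as a uniform background), and track the inside rms emittance.
    Plateaus in the returned curves indicate background- and halo-insensitive
    estimates, in the spirit of SCUBEEx."""
    w = d.sum()
    mx, mxp = (d * x).sum() / w, (d * xp).sum() / w
    sx = np.sqrt((d * (x - mx) ** 2).sum() / w)
    sp = np.sqrt((d * (xp - mxp) ** 2).sum() / w)
    results = []
    for s in scales:
        inside = ((x - mx) / (s * sx)) ** 2 + ((xp - mxp) / (s * sp)) ** 2 <= 1.0
        background = d[~inside].mean() if (~inside).any() else 0.0
        d_corr = np.where(inside, d - background, 0.0)
        results.append((s, background, rms_emittance(x, xp, d_corr)))
    return results
```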

  13. Estimating the costs of landslide damage in the United States

    USGS Publications Warehouse

    Fleming, Robert W.; Taylor, Fred A.

    1980-01-01

    Landslide damages are one of the most costly natural disasters in the United States. A recent estimate of the total annual cost of landslide damage is in excess of $1 billion (Schuster, 1978). The damages can be significantly reduced, however, through the combined action of technical experts, government, and the public. Before they can be expected to take action, local governments need to have an appreciation of costs of damage in their areas of responsibility and of the reductions in losses that can be achieved. Where studies of cost of landslide damages have been conducted, it is apparent that (1) costs to the public and private sectors of our economy due to landslide damage are much larger than anticipated; (2) taxpayers and public officials generally are unaware of the magnitude of the cost, owing perhaps to the lack of any centralization of data; and (3) incomplete records and unavailability of records result in lower reported costs than actually were incurred. The U.S. Geological Survey has developed a method to estimate the cost of landslide damages in regional and local areas and has applied the method in three urban areas and one rural area. Costs are for different periods and are unadjusted for inflation; therefore, strict comparisons of data from different years should be avoided. Estimates of the average annual cost of landslide damage for the urban areas studied are $5,900,000 in the San Francisco Bay area; $4,000,000 in Allegheny County, Pa.; and $5,170,000 in Hamilton County, Ohio. Adjusting these figures for the population of each area, the annual cost of damages per capita are $1.30 in the nine-county San Francisco Bay region; $2.50 in Allegheny County, Pa.; and $5.80 in Hamilton County, Ohio. On the basis of data from other sources, the estimated annual damages on a per capita basis for the City of Los Angeles, Calif., are about $1.60. If the costs were available for the damages from landslides in Los Angeles in 1977-78 and 1979-80, the annual per

  14. Quick and accurate estimation of the elastic constants using the minimum image method

    NASA Astrophysics Data System (ADS)

    Tretiakov, Konstantin V.; Wojciechowski, Krzysztof W.

    2015-04-01

    A method for determining the elastic properties using the minimum image method (MIM) is proposed and tested on a model system of particles interacting by the Lennard-Jones (LJ) potential. The elastic constants of the LJ system are determined in the thermodynamic limit, N → ∞, using the Monte Carlo (MC) method in the NVT and NPT ensembles. The simulation results show that when determining the elastic constants, the contribution of long-range interactions cannot be ignored, because that would lead to erroneous results. In addition, the simulations have revealed that including the further interactions of each particle with all its minimum image neighbors, even in the case of small systems, leads to results which are very close to the values of the elastic constants in the thermodynamic limit. This enables quick and accurate estimation of the elastic constants using very small samples.

  15. Pitfalls in accurate estimation of overdiagnosis: implications for screening policy and compliance.

    PubMed

    Feig, Stephen A

    2013-01-01

    Stories in the public media that 30 to 50% of screen-detected breast cancers are overdiagnosed dissuade women from being screened because overdiagnosed cancers would never result in death if undetected yet do result in unnecessary treatment. However, such concerns are unwarranted because the frequency of overdiagnosis, when properly calculated, is only 0 to 5%. In the previous issue of Breast Cancer Research, Duffy and Parmar report that accurate estimation of the rate of overdiagnosis recognizes the effect of lead time on detection rates and the consequent requirement for an adequate number of years of follow-up. These indispensable elements were absent from highly publicized studies that overestimated the frequency of overdiagnosis.

  16. A Simple yet Accurate Method for the Estimation of the Biovolume of Planktonic Microorganisms.

    PubMed

    Saccà, Alessandro

    2016-01-01

    Determining the biomass of microbial plankton is central to the study of fluxes of energy and materials in aquatic ecosystems. This is typically accomplished by applying proper volume-to-carbon conversion factors to group-specific abundances and biovolumes. A critical step in this approach is the accurate estimation of biovolume from two-dimensional (2D) data such as those available through conventional microscopy techniques or flow-through imaging systems. This paper describes a simple yet accurate method for the assessment of the biovolume of planktonic microorganisms, which works with any image analysis system allowing for the measurement of linear distances and the estimation of the cross-sectional area of an object from a 2D digital image. The proposed method is based on Archimedes' principle about the relationship between the volume of a sphere and that of a cylinder in which the sphere is inscribed, plus a coefficient of 'unellipticity' introduced here. Validation and careful evaluation of the method are provided using a variety of approaches. The new method proved to be highly precise with all convex shapes characterised by approximate rotational symmetry, and combining it with an existing method specific for highly concave or branched shapes allows covering the great majority of cases with good reliability. Thanks to its accuracy, consistency, and low resource demand, the new method can conveniently be used in substitution of any extant method designed for convex shapes, and can readily be coupled with automated cell imaging technologies, including state-of-the-art flow-through imaging devices. PMID:27195667
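
    The geometric relationship invoked above (Archimedes' result that a sphere fills two-thirds of its circumscribing cylinder) can be illustrated with a toy calculation. The sketch below is not the paper's formula and does not reproduce its 'unellipticity' coefficient; using the silhouette minor axis as the depth and a default coefficient of 1.0 are assumptions for illustration.

```python
import math

def biovolume(projected_area_um2, minor_axis_um, unellipticity=1.0):
    """Approximate cell volume from a 2D silhouette: two-thirds of the
    circumscribing cylinder (projected area x depth), scaled by a shape
    coefficient. For a sphere of radius r: area = pi*r^2, depth = 2r,
    so volume = (2/3) * area * depth = (4/3)*pi*r^3."""
    return (2.0 / 3.0) * projected_area_um2 * minor_axis_um * unellipticity

# sanity check against a sphere of radius 5 um
r = 5.0
assert abs(biovolume(math.pi * r ** 2, 2 * r) - (4.0 / 3.0) * math.pi * r ** 3) < 1e-9
```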

  18. Cost estimation when time and resources are limited: the Brief DATCAP.

    PubMed

    French, Michael T; Roebuck, M Christopher; McLellan, A Thomas

    2004-10-01

    The Drug Abuse Treatment Cost Analysis Program (DATCAP) was designed in the early 1990s as a research guide to collect and analyze financial data from addiction treatment programs. The addiction research community could clearly benefit from a version of the DATCAP that reduced the time and effort required for its administration without compromising the integrity of its cost estimates. This paper introduces the Brief DATCAP and presents some preliminary findings. Initial feedback from respondents suggests that the Brief DATCAP is understandable, and easier and quicker to complete than the DATCAP. More importantly, preliminary results indicate that cost estimates from the Brief DATCAP differ from those of the longer DATCAP by less than 2%. These results have important research and policy implications because a shorter yet reasonably accurate cost instrument will enhance the feasibility and precision of future economic evaluations of addiction interventions.

  19. Accurate biopsy-needle depth estimation in limited-angle tomography using multi-view geometry

    NASA Astrophysics Data System (ADS)

    van der Sommen, Fons; Zinger, Sveta; de With, Peter H. N.

    2016-03-01

    Recently, compressed-sensing based algorithms have enabled volume reconstruction from projection images acquired over a relatively small angle (θ < 20°). These methods enable accurate depth estimation of surgical tools with respect to anatomical structures. However, they are computationally expensive and time consuming, rendering them unattractive for image-guided interventions. We propose an alternative approach for depth estimation of biopsy needles during image-guided interventions, in which we split the problem into two parts and solve them independently: needle-depth estimation and volume reconstruction. The complete proposed system consists of the previous two steps, preceded by needle extraction. First, we detect the biopsy needle in the projection images and remove it by interpolation. Next, we exploit epipolar geometry to find point-to-point correspondences in the projection images to triangulate the 3D position of the needle in the volume. Finally, we use the interpolated projection images to reconstruct the local anatomical structures and indicate the position of the needle within this volume. For validation of the algorithm, we have recorded a full CT scan of a phantom with an inserted biopsy needle. The performance of our approach ranges from a median error of 2.94 mm for a distributed viewing angle of 1° down to an error of 0.30 mm for an angle larger than 10°. Based on the results of this initial phantom study, we conclude that multi-view geometry offers an attractive alternative to time-consuming iterative methods for the depth estimation of surgical tools during C-arm-based image-guided interventions.
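
    The triangulation step described above can be sketched with a standard linear (DLT) two-view triangulation. The 3x4 projection matrices are assumed to be known from C-arm calibration, which the record does not detail; the function below is a generic multi-view-geometry routine rather than the authors' implementation.

```python
import numpy as np

def triangulate_point(P1, P2, pt1, pt2):
    """Linear (DLT) triangulation of a 3D point from two 3x4 projection
    matrices and the corresponding 2D image points (in pixels)."""
    A = np.vstack([
        pt1[0] * P1[2] - P1[0],
        pt1[1] * P1[2] - P1[1],
        pt2[0] * P2[2] - P2[0],
        pt2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                      # homogeneous solution (null space of A)
    return X[:3] / X[3]             # convert to Euclidean coordinates
```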

  20. Accurate Estimation of the Fine Layering Effect on the Wave Propagation in the Carbonate Rocks

    NASA Astrophysics Data System (ADS)

    Bouchaala, F.; Ali, M. Y.

    2014-12-01

    The attenuation of a seismic wave during its propagation can be divided into two main parts: scattering and intrinsic attenuation. Scattering is an elastic redistribution of the energy due to the medium heterogeneities. The intrinsic attenuation, however, is an inelastic phenomenon, mainly due to fluid-grain friction during the wave passage. The intrinsic attenuation is directly related to the physical characteristics of the medium, so this parameter can be used for medium characterization and fluid detection, which is beneficial for the oil and gas industry. The intrinsic attenuation is estimated by subtracting the scattering from the total attenuation; therefore, the accuracy of the intrinsic attenuation is directly dependent on the accuracy of the total attenuation and the scattering. The total attenuation can be estimated from the recorded waves by using in situ methods such as the spectral ratio and frequency shift methods. The scattering is estimated by representing the heterogeneities as a succession of stacked layers, each characterized by a single density and velocity. The accuracy of the scattering estimate is strongly dependent on the layer thicknesses, especially for media composed of carbonate rocks, which are known for their strong heterogeneity. Previous studies proposed assumptions for the choice of the layer thickness, but these showed limitations, especially in the case of carbonate rocks. In this study we established a relationship between the layer thickness and the frequency of propagation, after some mathematical development of the generalized O'Doherty-Anstey formula. We validated this relationship through synthetic tests and real data from a VSP carried out over an onshore oilfield in the emirate of Abu Dhabi in the United Arab Emirates, primarily composed of carbonate rocks. The results showed the utility of our relationship for an accurate estimation of the scattering

  1. Cost estimates for membrane filtration and conventional treatment

    SciTech Connect

    Wiesner, M.R.; Hackney, J.; Sethi, S. ); Jacangelo, J.G. ); Laine, J.M. . Lyonnaise des Eaux)

    1994-12-01

    Costs of several ultrafiltration and nanofiltration processes are compared with the cost of conventional liquid-solid separation with and without GAC adsorption for small water treatment facilities. Data on raw-water quality, permeate flux, recovery, frequency of backflushing, and chemical dosage obtained from a pilot study were used with a previously developed model for membrane costs to calculate anticipated capital and operating costs for each instance. Data from the US Environmental Protection Agency were used to estimate conventional treatment costs. All of the membrane process calculations showed comparable or lower total costs per unit volume treated compared with conventional treatment for small facilities (< 200,000 m³/d or about 5 mgd). Membrane processes may offer small facilities a less expensive alternative for the removal of particles and organic materials from drinking water.

  2. US-based Drug Cost Parameter Estimation for Economic Evaluations

    PubMed Central

    Levy, Joseph F; Meek, Patrick D; Rosenberg, Marjorie A

    2014-01-01

    Introduction In the US, more than 10% of national health expenditures are for prescription drugs. Assessing drug costs in US economic evaluation studies is not consistent, as the true acquisition cost of a drug is not known by decision modelers. Current US practice focuses on identifying one reasonable drug cost and imposing some distributional assumption to assess uncertainty. Methods We propose a set of Rules based on current pharmacy practice that account for the heterogeneity of drug product costs. The set of products derived from our Rules, and their associated costs, form an empirical distribution that can be used for more realistic sensitivity analyses, and create transparency in drug cost parameter computation. The Rules specify an algorithmic process to select clinically equivalent drug products that reduce pill burden, use an appropriate package size, and assume uniform weighting of substitutable products. Three diverse examples show derived empirical distributions and are compared with previously reported cost estimates. Results The shapes of the empirical distributions among the three drugs differ dramatically, including multiple modes and different variation. Previously published estimates differed from the means of the empirical distributions. Published ranges for sensitivity analyses did not cover the ranges of the empirical distributions. In one example using lisinopril, the empirical mean cost of substitutable products was $444 (range $23–$953) as compared to a published estimate of $305 (range $51–$523). Conclusions Our Rules create a simple and transparent approach to create cost estimates of drug products and assess their variability. The approach is easily modified to include a subset of, or different weighting for, substitutable products. The derived empirical distribution is easily incorporated into one-way or probabilistic sensitivity analyses. PMID:25532826
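
    The Rules described above yield an empirical distribution of substitutable product costs that feeds sensitivity analyses. The sketch below shows the general mechanics only (a uniformly weighted set of product costs resampled for a probabilistic sensitivity analysis); the cost values are placeholders loosely echoing the lisinopril range quoted in the abstract, not the Rules-derived product set.

```python
import numpy as np

# hypothetical per-prescription costs (USD) of clinically substitutable products
product_costs = np.array([23.0, 51.0, 118.0, 305.0, 444.0, 523.0, 953.0])
weights = np.full(len(product_costs), 1.0 / len(product_costs))   # uniform weighting

rng = np.random.default_rng(1)

# draw drug-cost parameters for a probabilistic sensitivity analysis
psa_draws = rng.choice(product_costs, size=10_000, p=weights)
print(psa_draws.mean(), np.percentile(psa_draws, [2.5, 97.5]))
```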

  3. Oil and gas pipeline construction cost analysis and developing regression models for cost estimation

    NASA Astrophysics Data System (ADS)

    Thaduri, Ravi Kiran

    In this study, cost data for 180 pipelines and 136 compressor stations have been analyzed. On the basis of the distribution analysis, regression models have been developed. Material, labor, right-of-way (ROW), and miscellaneous costs make up the total cost of a pipeline construction project. The pipelines are analyzed based on different pipeline lengths, diameters, locations, volumes, and years of completion. In pipeline construction, labor costs dominate the total costs with a share of about 40%. Multiple non-linear regression models are developed to estimate the component costs of pipelines for various cross-sectional areas, lengths, and locations. The compressor stations are analyzed based on capacity, year of completion, and location. Unlike the pipeline costs, material costs dominate the total costs in the construction of compressor stations, with an average share of about 50.6%. Land costs have very little influence on the total costs. Similar regression models are developed to estimate the component costs of compressor stations for various capacities and locations.
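
    The multiple non-linear regression described above can be illustrated with one common functional form, a power law fitted by least squares in log space. The choice of regressors, the power-law form, and the function names below are assumptions; the study's models also include location, volume, and year effects that are not reproduced here.

```python
import numpy as np

def fit_power_law_cost(lengths_mi, diameters_in, costs_usd):
    """Fit cost = a * length^b * diameter^c by ordinary least squares on logs."""
    X = np.column_stack([np.ones(len(costs_usd)),
                         np.log(lengths_mi), np.log(diameters_in)])
    coef, *_ = np.linalg.lstsq(X, np.log(costs_usd), rcond=None)
    return np.exp(coef[0]), coef[1], coef[2]          # (a, b, c)

def predict_cost(a, b, c, length_mi, diameter_in):
    return a * length_mi ** b * diameter_in ** c
```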

  4. Detailed cost estimate of reference residential photovoltaic designs

    SciTech Connect

    Palmer, R.S.; Penasa, D.A.; Thomas, M.G.

    1983-04-01

    This report presents estimated installation costs for four reference residential photovoltaic designs. Installation cost estimates ranged from $1.28 to $2.12/Wp for arrays installed by union labor (4.1 to 6.07 kWp systems), and from $1.22 to $1.83/Wp for non-union installations. Standoff mounting was found to increase costs from $1.63/Wp to $2.12/Wp for a representative case, whereas 25 kWh of battery storage capacity increased installation costs from $1.44/Wp to $2.08/Wp. Overall system costs (union-based) were $6000 to $7000 for a 4.1 kW array in the northeast, to approx. $9000 for a 6.07 kWp array in the southwest. This range of installation costs, approx. $1 to $2/Wp (in 1980 dollars), is representative of current installation costs for residential PV systems. Any future cost reductions are likely to be small and can be accomplished only by optimization of mounting techniques, module efficiencies, and module reliability in toto.

  5. Can student health professionals accurately estimate alcohol content in commonly occurring drinks?

    PubMed Central

    Sinclair, Julia; Searle, Emma

    2016-01-01

    Objectives: Correct identification of alcohol as a contributor to, or comorbidity of, many psychiatric diseases requires health professionals to be competent and confident to take an accurate alcohol history. Being able to estimate (or calculate) the alcohol content in commonly consumed drinks is a prerequisite for quantifying levels of alcohol consumption. The aim of this study was to assess this ability in medical and nursing students. Methods: A cross-sectional survey of 891 medical and nursing students across different years of training was conducted. Students were asked the alcohol content of 10 different alcoholic drinks by seeing a slide of the drink (with picture, volume and percentage of alcohol by volume) for 30 s. Results: Overall, the mean number of correctly estimated drinks (out of the 10 tested) was 2.4, increasing to just over 3 if a 10% margin of error was used. Wine and premium strength beers were underestimated by over 50% of students. Those who drank alcohol themselves, or who were further on in their clinical training, did better on the task, but overall the levels remained low. Conclusions: Knowledge of, or the ability to work out, the alcohol content of commonly consumed drinks is poor, and further research is needed to understand the reasons for this and the impact this may have on the likelihood to undertake screening or initiate treatment. PMID:27536344
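
    Calculating alcohol content, as tested in this survey, reduces to simple arithmetic on drink volume and alcohol by volume (ABV). The helper functions below use the standard UK definition of a unit (10 ml, or 8 g, of pure ethanol) and the density of ethanol; the example drink is illustrative.

```python
def grams_of_alcohol(volume_ml, abv_percent):
    """Pure alcohol in grams: volume x alcohol fraction x ethanol density (0.789 g/ml)."""
    return volume_ml * (abv_percent / 100.0) * 0.789

def uk_units(volume_ml, abv_percent):
    """UK units: one unit is 10 ml (8 g) of pure alcohol."""
    return volume_ml * abv_percent / 1000.0

# e.g. a 250 ml glass of 13% ABV wine
print(uk_units(250, 13), grams_of_alcohol(250, 13))   # ~3.25 units, ~25.6 g
```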

  6. Ocean Lidar Measurements of Beam Attenuation and a Roadmap to Accurate Phytoplankton Biomass Estimates

    NASA Astrophysics Data System (ADS)

    Hu, Yongxiang; Behrenfeld, Mike; Hostetler, Chris; Pelon, Jacques; Trepte, Charles; Hair, John; Slade, Wayne; Cetinic, Ivona; Vaughan, Mark; Lu, Xiaomei; Zhai, Pengwang; Weimer, Carl; Winker, David; Verhappen, Carolus C.; Butler, Carolyn; Liu, Zhaoyan; Hunt, Bill; Omar, Ali; Rodier, Sharon; Lifermann, Anne; Josset, Damien; Hou, Weilin; MacDonnell, David; Rhew, Ray

    2016-06-01

    Beam attenuation coefficient, c, provides an important optical index of plankton standing stocks, such as phytoplankton biomass and total particulate carbon concentration. Unfortunately, c has proven difficult to quantify through remote sensing. Here, we introduce an innovative approach for estimating c using lidar depolarization measurements and diffuse attenuation coefficients from ocean color products or lidar measurements of Brillouin scattering. The new approach is based on a theoretical formula established from Monte Carlo simulations that links the depolarization ratio of sea water to the ratio of diffuse attenuation Kd and beam attenuation C (i.e., a multiple scattering factor). On July 17, 2014, the CALIPSO satellite was tilted 30° off-nadir for one nighttime orbit in order to minimize ocean surface backscatter and demonstrate the lidar ocean subsurface measurement concept from space. Depolarization ratios of ocean subsurface backscatter are measured accurately. Beam attenuation coefficients computed from the depolarization ratio measurements compare well with empirical estimates from ocean color measurements. We further verify the beam attenuation coefficient retrievals using aircraft-based high spectral resolution lidar (HSRL) data that are collocated with in-water optical measurements.

  7. mBEEF: An accurate semi-local Bayesian error estimation density functional

    NASA Astrophysics Data System (ADS)

    Wellendorff, Jess; Lundgaard, Keld T.; Jacobsen, Karsten W.; Bligaard, Thomas

    2014-04-01

    We present a general-purpose meta-generalized gradient approximation (MGGA) exchange-correlation functional generated within the Bayesian error estimation functional framework [J. Wellendorff, K. T. Lundgaard, A. Møgelhøj, V. Petzold, D. D. Landis, J. K. Nørskov, T. Bligaard, and K. W. Jacobsen, Phys. Rev. B 85, 235149 (2012)]. The functional is designed to give reasonably accurate density functional theory (DFT) predictions of a broad range of properties in materials physics and chemistry, while exhibiting a high degree of transferability. Particularly, it improves upon solid cohesive energies and lattice constants over the BEEF-vdW functional without compromising high performance on adsorption and reaction energies. We thus expect it to be particularly well-suited for studies in surface science and catalysis. An ensemble of functionals for error estimation in DFT is an intrinsic feature of exchange-correlation models designed this way, and we show how the Bayesian ensemble may provide a systematic analysis of the reliability of DFT based simulations.

  8. Greater contrast in Martian hydrological history from more accurate estimates of paleodischarge

    NASA Astrophysics Data System (ADS)

    Jacobsen, R. E.; Burr, D. M.

    2016-09-01

    Correlative width-discharge relationships from the Missouri River Basin are commonly used to estimate fluvial paleodischarge on Mars. However, hydraulic geometry provides alternative, and causal, width-discharge relationships derived from broader samples of channels, including those in reduced-gravity (submarine) environments. Comparison of these relationships implies that causal relationships from hydraulic geometry should yield more accurate and more precise discharge estimates. Our remote analysis of a Martian-terrestrial analog channel, combined with in situ discharge data, substantiates this implication. Applied to Martian features, these results imply that paleodischarges of interior channels of Noachian-Hesperian (~3.7 Ga) valley networks have been underestimated by a factor of several, whereas paleodischarges for smaller fluvial deposits of the Late Hesperian-Early Amazonian (~3.0 Ga) have been overestimated. Thus, these new paleodischarges significantly magnify the contrast between early and late Martian hydrologic activity. Width-discharge relationships from hydraulic geometry represent validated tools for quantifying fluvial input near candidate landing sites of upcoming missions.
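
    Both the correlative Missouri-basin relations and the hydraulic-geometry relations discussed above take the form of a power law linking channel width to discharge, so the choice of coefficients is what drives the factor-of-several differences in paleodischarge. The sketch below only illustrates that sensitivity; the two coefficient sets are hypothetical placeholders and are not the relations evaluated in the study.

```python
def discharge_from_width(width_m, coefficient, exponent):
    """Hydraulic-geometry style estimate: Q = a * W^b (discharge in m^3/s from
    channel width in m). Coefficient and exponent must come from a calibrated
    relation appropriate to the setting (terrestrial, submarine, reduced gravity)."""
    return coefficient * width_m ** exponent

# two hypothetical calibrations applied to a 100 m wide channel
for a, b in [(0.5, 1.7), (1.2, 1.9)]:
    print(discharge_from_width(100.0, a, b))   # estimates differ by a factor of several
```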

  9. Estimation of traffic accident costs: a prompted model.

    PubMed

    Hejazi, Rokhshad; Shamsudin, Mad Nasir; Radam, Alias; Rahim, Khalid Abdul; Ibrahim, Zelina Zaitun; Yazdani, Saeed

    2013-01-01

    Traffic accidents account for 25% of unnatural deaths in Iran. The main objective of this study is to find a simple model for estimating economic costs, especially in Islamic countries (like Iran), in a straightforward manner. The model can express the magnitude of traffic accident costs in monetary terms. Data were collected from different sources that included traffic police records, insurance companies, and hospitals. The conceptual framework in our study was based on the method of Ayati, who used this method for the estimation of economic costs in Iran. We streamlined his method to use a minimal set of variables. Our final model has only three variables, all readily available from insurance companies and police records. Running the model showed that traffic accident costs were US$2.2 million in 2007 for our case-study route.

  10. Estimating the Costs of Educating Handicapped Children: A Resource-Cost Model Approach. Final Report.

    ERIC Educational Resources Information Center

    Hartman, William T.

    The research described in this report attempts to estimate the costs of providing an appropriate education to all school-aged handicapped children by 1980-81. The study begins by addressing the aspects of special education that will help to predict future costs--patterns of growth to the present, legal and political mandates, the nature of various…

  11. 48 CFR 1552.216-76 - Estimated cost and cost-sharing.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Estimated cost and cost-sharing. 1552.216-76 Section 1552.216-76 Federal Acquisition Regulations System ENVIRONMENTAL PROTECTION AGENCY CLAUSES AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and...

  12. 48 CFR 1552.216-76 - Estimated cost and cost-sharing.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Estimated cost and cost-sharing. 1552.216-76 Section 1552.216-76 Federal Acquisition Regulations System ENVIRONMENTAL PROTECTION AGENCY CLAUSES AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and...

  13. Estimating the Costs of Educating Handicapped Children: A Resource-Cost Model Approach--Summary Report.

    ERIC Educational Resources Information Center

    Hartman, William T.

    The purpose of this study was to develop an appropriate methodology and use it to estimate the costs of providing appropriate special education programs and services for all school-aged handicapped children in the U.S. in 1980-81. A resource-cost model approach was selected, based on a mathematical formulation of the relationships among students,…

  14. Biodiversity on Swedish pastures: estimating biodiversity production costs.

    PubMed

    Nilsson, Fredrik Olof Laurentius

    2009-01-01

    This paper estimates the costs of producing biological diversity on Swedish permanent grasslands. A simple model is introduced where biodiversity on pastures is produced using grazing animals. On the pastures, the grazing animals create a sufficient grazing pressure to lead to an environment that suits many rare and red-listed species. Two types of pastures are investigated: semi-natural and cultivated. Biological diversity produced on a pasture is estimated by combining a biodiversity indicator, which measures the quality of the land, with the size of the pasture. Biodiversity is, in this context, a quantitative measure where a given quantity can be produced either by a small area with high quality or a larger area with lower quality. Two areas in different parts of Sweden are investigated. Box-Cox transformations, which provide flexible functional forms, are used in the empirical analysis, and the results indicate that the biodiversity production costs differ between the regions. The major contribution of this paper is that it develops and tests a method of estimating biodiversity production costs on permanent pastures when biodiversity quality differs between pastures. If the method were to be used with cost data that were more thoroughly collected and covered additional production areas, biodiversity cost functions could be estimated and used in applied policy work. PMID:18079049

  15. Utilizing Expert Knowledge in Estimating Future STS Costs

    NASA Technical Reports Server (NTRS)

    Fortner, David B.; Ruiz-Torres, Alex J.

    2004-01-01

    A method of estimating the costs of future space transportation systems (STSs) involves classical activity-based cost (ABC) modeling combined with systematic utilization of the knowledge and opinions of experts to extend the process-flow knowledge of existing systems to systems that involve new materials and/or new architectures. The expert knowledge is particularly helpful in filling gaps that arise in computational models of processes because of inconsistencies in historical cost data. Heretofore, the costs of planned STSs have been estimated following a "top-down" approach that tends to force the architectures of new systems to incorporate process flows like those of the space shuttles. In this ABC-based method, one makes assumptions about the processes, but otherwise follows a "bottoms up" approach that does not force the new system architecture to incorporate a space-shuttle-like process flow. Prototype software has been developed to implement this method. Through further development of software, it should be possible to extend the method beyond the space program to almost any setting in which there is a need to estimate the costs of a new system and to extend the applicable knowledge base in order to make the estimate.

  17. 48 CFR 1615.406-2 - Certificates of accurate cost or pricing data for community rated carriers.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES HEALTH BENEFITS ACQUISITION REGULATION... knowledge and belief: (1) The cost or pricing data submitted (or, if not submitted, maintained and... with the requirements of 48 CFR Chapter 16 and the FEHB Program contract and are accurate,...

  18. Estimating and reducing company wide property abandonment costs

    SciTech Connect

    Johnstone, J.E.; Neuser, J.M.; Fritts, J.M.

    1995-12-01

    Many domestic oil and gas properties and production facilities are at the end, or rapidly approaching the end, of their useful economic lives. Corporate management is becoming increasingly concerned with the costs associated with the abandonment of these properties and their potential impact in long range planning scenarios and property evaluations. Abandonment and environmental remediation costs are generally expected to rise in the future due to increased regulation and more rigorous clean-up standards. Identifying these costs and raising management's awareness of these costs in today's operations will help management plan for this approaching problem and generate ways to reduce those costs. The paper describes the steps necessary to organize and manage a successful abandonment and remediation cost study of company wide magnitude. Areas discussed include organization of the review teams, field data gathering, determination and application of historical cost data, and actual cost estimation. The paper presents typical results of the study for various types of production fields and facilities as well as ideas for actions today to reduce future abandonment and remediation costs.

  19. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    NASA Astrophysics Data System (ADS)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and thus enables their application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, and the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character, respectively. Finally, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.
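
    The Green-Kubo relation underlying these simulations expresses κ as the time integral of the heat-flux autocorrelation. The sketch below evaluates that standard formula from a heat-flux time series; it is a textbook implementation for illustration, not the accelerated interpolation scheme described in the record, and the array shapes and units are assumptions.

```python
import numpy as np

def green_kubo_kappa(heat_flux, dt, volume, temperature, k_b=1.380649e-23):
    """Running Green-Kubo estimate of the thermal conductivity:
    kappa(t) = V / (3 k_B T^2) * integral_0^t <j(0) . j(t')> dt'.
    heat_flux: (n_steps, 3) array of instantaneous heat-flux components (W/m^2);
    dt in s, volume in m^3, temperature in K. Returns kappa(t) in W/(m K)."""
    n = len(heat_flux)
    acf = np.array([np.mean(np.sum(heat_flux[:n - lag] * heat_flux[lag:], axis=1))
                    for lag in range(n // 2)])        # <j(0) . j(t)> over time origins
    return volume / (3.0 * k_b * temperature ** 2) * np.cumsum(acf) * dt
```

    In practice the running integral is inspected for a plateau before reading off κ, since the tail of the autocorrelation is noisy.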

  20. A process model to estimate biodiesel production costs.

    PubMed

    Haas, Michael J; McAloon, Andrew J; Yee, Winnie C; Foglia, Thomas A

    2006-03-01

    'Biodiesel' is the name given to a renewable diesel fuel that is produced from fats and oils. It consists of the simple alkyl esters of fatty acids, most typically the methyl esters. We have developed a computer model to estimate the capital and operating costs of a moderately-sized industrial biodiesel production facility. The major process operations in the plant were continuous-process vegetable oil transesterification, and ester and glycerol recovery. The model was designed using contemporary process simulation software, and current reagent, equipment and supply costs, following current production practices. Crude, degummed soybean oil was specified as the feedstock. Annual production capacity of the plant was set at 37,854,118 l (10 x 10^6 gal). Facility construction costs were calculated to be US$11.3 million. The largest contributors to the equipment cost, accounting for nearly one third of expenditures, were storage tanks to contain a 25 day capacity of feedstock and product. At a value of US$0.52/kg ($0.236/lb) for feedstock soybean oil, a biodiesel production cost of US$0.53/l ($2.00/gal) was predicted. The single greatest contributor to this value was the cost of the oil feedstock, which accounted for 88% of total estimated production costs. An analysis of the dependence of production costs on the cost of the feedstock indicated a direct linear relationship between the two, with a change of US$0.020/l ($0.075/gal) in product cost per US$0.022/kg ($0.01/lb) change in oil cost. Process economics included the recovery of coproduct glycerol generated during biodiesel production, and its sale into the commercial glycerol market as an 80% w/w aqueous solution, which reduced production costs by approximately 6%. The production cost of biodiesel was found to vary inversely and linearly with variations in the market value of glycerol, increasing by US$0.0022/l ($0.0085/gal) for every US
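
    The linear feedstock sensitivity reported above can be restated as a one-line function anchored at the base case of US$0.53/l at US$0.52/kg oil. This is only an arithmetic restatement of the numbers quoted in the abstract; the function name and the extrapolation outside the studied range are assumptions.

```python
def biodiesel_cost_per_liter(oil_cost_per_kg):
    """Linear sensitivity quoted in the study: $0.020/l change in product cost
    per $0.022/kg change in oil cost, anchored at $0.53/l for $0.52/kg oil."""
    return 0.53 + (0.020 / 0.022) * (oil_cost_per_kg - 0.52)

print(biodiesel_cost_per_liter(0.52))   # 0.53 (base case)
print(biodiesel_cost_per_liter(0.74))   # ~0.73, i.e. +$0.20/l for +$0.22/kg oil
```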

  1. A novel cost function to estimate parameters of oscillatory biochemical systems

    PubMed Central

    2012-01-01

    Oscillatory pathways are among the most important classes of biochemical systems, with examples ranging from circadian rhythms to cell cycle maintenance. Mathematical modeling of these highly interconnected biochemical networks is needed to meet numerous objectives such as investigating, predicting and controlling the dynamics of these systems. Identifying the kinetic rate parameters is essential for fully modeling these and other biological processes. These kinetic parameters, however, are not usually available from measurements and most of them have to be estimated by parameter fitting techniques. One of the issues with estimating kinetic parameters in oscillatory systems is the irregularity of the least-squares (LS) cost function surface used to estimate these parameters, which is caused by the periodicity of the measurements. These irregularities result in numerous local minima, which limit the performance of even some of the most robust global optimization algorithms. We proposed a parameter estimation framework to address these issues that integrates temporal information with periodic information embedded in the measurements used to estimate these parameters. This periodic information is used to build a proposed cost function with better surface properties, leading to fewer local minima and better performance of global optimization algorithms. We verified for three oscillatory biochemical systems that our proposed cost function results in an increased ability to estimate accurate kinetic parameters as compared to the traditional LS cost function. We combine this cost function with an improved noise removal approach that leverages periodic characteristics embedded in the measurements to effectively reduce noise. The results provide strong evidence of the efficacy of this noise removal approach over commonly used wavelet hard-thresholding noise removal methods. This proposed optimization framework results in more accurate kinetic parameters that
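
    The abstract does not spell out the proposed cost function, so the sketch below only illustrates the general idea of augmenting a time-domain least-squares residual with periodic information, here a penalty on the mismatch of the dominant period taken from an FFT. The weighting, the FFT-based period extraction, and the assumed `simulate(params, t)` solver are hypothetical choices, not the authors' formulation.

```python
import numpy as np

def periodic_aware_cost(params, simulate, t, y_meas, weight=1.0):
    """Illustrative cost: mean-squared residual plus a penalty on the mismatch
    of the dominant oscillation period of simulated vs measured signals.
    `simulate(params, t)` is a user-supplied model solver (assumed here)."""
    y_sim = simulate(params, t)
    ls = np.mean((y_sim - y_meas) ** 2)
    freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])

    def dominant_period(y):
        spec = np.abs(np.fft.rfft(y - y.mean()))
        return 1.0 / freqs[np.argmax(spec[1:]) + 1]    # skip the zero frequency

    period_term = (dominant_period(y_sim) - dominant_period(y_meas)) ** 2
    return ls + weight * period_term
```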

  2. Accurate Visual Heading Estimation at High Rotation Rate Without Oculomotor or Static-Depth Cues

    NASA Technical Reports Server (NTRS)

    Stone, Leland S.; Perrone, John A.; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    It has been claimed that either oculomotor or static depth cues provide the signals about self-rotation necessary for accurate heading estimation when rotation rates exceed approximately 1 deg/s. We tested this hypothesis by simulating self-motion along a curved path with the eyes fixed in the head (plus or minus 16 deg/s of rotation). Curvilinear motion offers two advantages: 1) heading remains constant in retinotopic coordinates, and 2) there is no visual-oculomotor conflict (both actual and simulated eye position remain stationary). We simulated 400 ms of rotation combined with 16 m/s of translation at fixed angles with respect to gaze towards two vertical planes of random dots initially 12 and 24 m away, with a field of view of 45 degrees. Four subjects were asked to fixate a central cross and to respond whether they were translating to the left or right of straight-ahead gaze. From the psychometric curves, heading bias (mean) and precision (semi-interquartile range) were derived. The mean bias over 2-5 runs was 3.0, 4.0, -2.0, -0.4 deg for the first author and three naive subjects, respectively (positive indicating towards the rotation direction). The mean precision was 2.0, 1.9, 3.1, 1.6 deg, respectively. The ability of observers to make relatively accurate and precise heading judgments, despite the large rotational flow component, refutes the view that extra-flow-field information is necessary for human visual heading estimation at high rotation rates. Our results support models that process combined translational/rotational flow to estimate heading, but should not be construed to suggest that other cues do not play an important role when they are available to the observer.

  3. An Instructional Cost Estimation Model for the XYZ Community College.

    ERIC Educational Resources Information Center

    Edmonson, William F.

    An enrollment-driven model for estimating instructional costs is presented in this paper as developed by the Western Interstate Commission for Higher Education (WICHE). After stating the principles of the WICHE planning system (i.e., various categories of data are gathered, segmented, and then cross-tabulated against one another to yield certain…

  4. Stochastic Estimation of Cost Frontier: Evidence from Bangladesh

    ERIC Educational Resources Information Center

    Mamun, Shamsul Arifeen Khan

    2012-01-01

    In the literature on higher education cost functions, substantial knowledge has been created in the area of economies of scale in the context of developed countries, but knowledge of input demand is lacking. On the other hand, empirical knowledge in the context of developing countries is very meagre. The paper fills this knowledge gap, estimating a…

  5. Common Day Care Safety Renovations: Descriptions, Explanations and Cost Estimates.

    ERIC Educational Resources Information Center

    Spack, Stan

    This booklet explains some of the day care safety features specified by the new Massachusetts State Building Code (January 1, 1975) which must be met before a new day care center can be licensed. The safety features described are those which most often require renovation to meet the building code standards. Best estimates of the costs involved in…

  6. Estimating the extra cost of living with disability in Vietnam.

    PubMed

    Minh, Hoang Van; Giang, Kim Bao; Liem, Nguyen Thanh; Palmer, Michael; Thao, Nguyen Phuong; Duong, Le Bach

    2015-01-01

    Disability is shown to be both a cause and a consequence of poverty. However, relatively little research has investigated the economic cost of living with a disability. This study reports the results of a study on the extra cost of living with disability in Vietnam in 2011. The study was carried out in eight cities/provinces in Vietnam, including Hanoi and Ho Chi Minh City (the two major metropolitan areas in Vietnam) and six provinces from each of the six socio-economic regions in Vietnam. Costs are estimated using the standard of living approach, whereby the difference in incomes between people with disability and those without disability for a given standard of living serves as a proxy for the cost of living with disability. The extra cost of living with disability in Vietnam accounted for about 8.8-9.5% of annual household income, valued at about US$200-218. Communication difficulty was shown to result in the highest additional cost of living with disability, and self-care difficulty was shown to lead to the lowest extra cost of living. The extra cost of living with disability increased with the severity of impairment. Interventions to promote the economic security of livelihood for people with disabilities are needed.
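
    A minimal sketch of the standard-of-living approach is given below, under one common formulation in which a living-standard index is regressed on log income and a disability indicator and the extra cost is the additional income needed to restore the same living standard. The data, variable names and coefficients are synthetic placeholders, not the Vietnamese survey data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
log_income = rng.normal(8.0, 0.5, n)
disability = rng.integers(0, 2, n)
# Hypothetical living-standard index, depressed by disability.
sol = 2.0 + 1.5 * log_income - 0.12 * disability + rng.normal(0, 0.3, n)

X = sm.add_constant(np.column_stack([log_income, disability]))
fit = sm.OLS(sol, X).fit()
beta_income, gamma_dis = fit.params[1], fit.params[2]

# Holding the living standard fixed: beta * d(ln Y) = -gamma, so income must
# rise by exp(-gamma/beta) - 1 to compensate for disability.
extra_cost_share = np.exp(-gamma_dis / beta_income) - 1
print(f"extra cost of disability ~ {100 * extra_cost_share:.1f}% of income")
```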

  7. Optimization of Correlation Kernel Size for Accurate Estimation of Myocardial Contraction and Relaxation

    NASA Astrophysics Data System (ADS)

    Honjo, Yasunori; Hasegawa, Hideyuki; Kanai, Hiroshi

    2012-07-01

    rates estimated using different kernel sizes were examined using the normalized mean-squared error of the estimated strain rate from the actual one obtained by the 1D phase-sensitive method. Compared with conventional kernel sizes, this result shows that the proposed correlation kernel can enable more accurate measurement of the strain rate. In the in vivo measurement, the regional instantaneous velocities and strain rates in the radial direction of the heart wall were analyzed in detail at an extremely high temporal resolution (frame rate of 860 Hz). In this study, the transition between contraction and relaxation could be detected by 2D tracking. These results indicate the potential of this method for high-accuracy estimation of strain rates and detailed analyses of the physiological function of the myocardium.

  8. Verify by Genability - Providing Solar Customers with Accurate Reports of Utility Bill Cost Savings

    SciTech Connect

    2015-12-01

    The National Renewable Energy Laboratory (NREL), partnering with Genability and supported by the U.S. Department of Energy's SunShot Incubator program, independently verified the accuracy of Genability's monthly cost savings.

  9. Estimating resource use and cost of prophylactic management of neutropenia with filgrastim.

    PubMed

    Annemans, L; Van Overbeke, N; Standaert, B; Van Belle, S

    2005-05-01

    The study objective is to develop a methodology for the measurement of time, resource use and cost of the prophylactic management of neutropenia with filgrastim in different settings where the drug is routinely used: in-hospital care, outpatient care and home care. The activity-based costing method is used to analyse the cost of managing neutropenia prophylactically and comprises four steps. First, department heads in each of the chosen settings were selected and interviewed to obtain the key elements of the workflow involved in the management of neutropenia, followed by the second step involving in-depth, structured interviews of key personnel. The third step was the measurement, by a study nurse, of the time required for frequently occurring activities in monitoring neutropenia and administering filgrastim. Finally, information on resource unit costs and personnel salaries was collected from the administration units to calculate an average cost. Sensitivity analyses were undertaken on estimated variables in the study. A list of eight to 14 consecutive activities linked to the prophylactic management of neutropenia was observed. The number and type of activities do not differ between an in-hospital oncology ward and an outpatient setting except for blood samplings. The difference is more pronounced between hospital and home care settings, as in the latter the patient performs many of the activities him/herself. The cost estimate per setting for prophylactic drug use is 6.30 Euros for in-hospital care, 3.67 Euros for outpatient care and 5.49 Euros for home care. Taking the two most frequently occurring scenarios per chemotherapy cycle (i.e. with or without febrile neutropenia), the following cost estimates are obtained: 60.41 Euros for a patient with febrile neutropenia and 56.77 Euros for a patient without febrile neutropenia, excluding drug costs. With the activity-based costing method it is possible to accurately demonstrate cost savings in the management
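
    The core of the activity-based costing step described above is summing, per care setting, activity time multiplied by a personnel cost rate plus material unit costs; the activities, durations and rates in this sketch are hypothetical placeholders, not the study's measurements.

```python
activities = [
    # (activity, minutes, staff rate EUR/min, material EUR)
    ("prepare filgrastim dose",     4.0, 0.55, 0.40),
    ("administer injection",        3.0, 0.55, 0.25),
    ("monitor patient / paperwork", 5.0, 0.55, 0.00),
    ("blood sample handling",       6.0, 0.45, 1.10),
]

def setting_cost(activity_list):
    """Average cost per prophylactic administration in one care setting."""
    return sum(minutes * rate + material
               for _, minutes, rate, material in activity_list)

print(f"estimated cost per administration: {setting_cost(activities):.2f} EUR")
```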

  10. Estimating the Cost of NASA's Space Launch Initiative: How SLI Costs Stack Up Against the Shuttle

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph H.; Roth, Axel (Technical Monitor)

    2002-01-01

    NASA is planning to replace the Space Shuttle with a new completely reusable Second Generation Launch System by approximately 2012. Numerous contracted and NASA in-house Space Transportation Architecture Studies and various technology maturation activities are proceeding and have resulted in scores of competing architecture configurations being proposed. Life cycle cost is a key discriminator between all these various concepts. However, the one obvious analogy for costing purposes remains the current Shuttle system. Are there credible reasons to believe that a second generation reusable launch system can be accomplished at less cost than the Shuttle? The need for a credible answer to this question is critical. This paper reviews the cost estimating approaches being used by the contractors and the government estimators to address this issue and explores the rationale behind the numbers.

  11. Building of an Experimental Cline With Arabidopsis thaliana to Estimate Herbicide Fitness Cost

    PubMed Central

    Roux, Fabrice; Giancola, Sandra; Durand, Stéphanie; Reboud, Xavier

    2006-01-01

    Various management strategies aim at maintaining pesticide resistance frequency under a threshold value by taking advantage of the benefit of the fitness penalty (the cost) expressed by the resistance allele outside the treated area or during the pesticide selection “off years.” One method to estimate a fitness cost is to analyze the resistance allele frequency along transects across treated and untreated areas. On the basis of the shape of the cline, this method gives the relative contributions of both gene flow and the fitness difference between genotypes in the treated and untreated areas. Taking advantage of the properties of such migration–selection balance, an artificial cline was built up to optimize the conditions where the fitness cost of two herbicide-resistant mutants (acetolactate synthase and auxin-induced target genes) in the model species Arabidopsis thaliana could be more accurately measured. The analysis of the microevolutionary dynamics in these experimental populations indicated mean fitness costs of ∼15 and 92% for the csr1-1 and axr2-1 resistances, respectively. In addition, negative frequency dependence for the fitness cost was also detected for the axr2-1 resistance. The advantages and disadvantages of the cline approach are discussed in regard to other methods of cost estimation. This comparison highlights the powerful ability of an experimental cline to measure low fitness costs and detect sensibility to frequency-dependent variations. PMID:16582450

  12. Estimating age-based antiretroviral therapy costs for HIV-infected children in resource-limited settings based on World Health Organization weight-based dosing recommendations

    PubMed Central

    2014-01-01

    Background Pediatric antiretroviral therapy (ART) has been shown to substantially reduce morbidity and mortality in HIV-infected infants and children. To accurately project program costs, analysts need accurate estimations of antiretroviral drug (ARV) costs for children. However, the costing of pediatric antiretroviral therapy is complicated by weight-based dosing recommendations which change as children grow. Methods We developed a step-by-step methodology for estimating the cost of pediatric ARV regimens for children ages 0–13 years old. The costing approach incorporates weight-based dosing recommendations to provide estimated ARV doses throughout childhood development. Published unit drug costs are then used to calculate average monthly drug costs. We compared our derived monthly ARV costs to published estimates to assess the accuracy of our methodology. Results The estimates of monthly ARV costs are provided for six commonly used first-line pediatric ARV regimens, considering three possible care scenarios. The costs derived in our analysis for children were fairly comparable to or slightly higher than available published ARV drug or regimen estimates. Conclusions The methodology described here can be used to provide an accurate estimation of pediatric ARV regimen costs for cost-effectiveness analysts to project the optimum packages of care for HIV-infected children, as well as for program administrators and budget analysts who wish to assess the feasibility of increasing pediatric ART availability in constrained budget environments. PMID:24885453
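
    The weight-band costing step described here can be sketched as below: look up the dose for the child's weight band, scale by doses per day and days per month, and multiply by a published unit cost. The bands, doses and unit cost are hypothetical placeholders, not WHO or study values.

```python
weight_bands = [
    # (min_kg, max_kg, mg_per_dose) -- hypothetical bands for one drug
    (3.0,  5.9,  30),
    (6.0,  9.9,  60),
    (10.0, 13.9, 90),
    (14.0, 19.9, 120),
]
COST_PER_MG = 0.002    # hypothetical unit cost, US$ per mg
DOSES_PER_DAY = 2

def monthly_cost(weight_kg, days=30):
    """Monthly cost of one drug for a child of the given weight."""
    for lo, hi, mg in weight_bands:
        if lo <= weight_kg <= hi:
            return mg * DOSES_PER_DAY * days * COST_PER_MG
    raise ValueError("weight outside modelled bands")

print(f"8 kg child: US${monthly_cost(8.0):.2f} per month")
```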

  13. Skin Temperature Over the Carotid Artery, an Accurate Non-invasive Estimation of Near Core Temperature

    PubMed Central

    Imani, Farsad; Karimi Rouzbahani, Hamid Reza; Goudarzi, Mehrdad; Tarrahi, Mohammad Javad; Ebrahim Soltani, Alireza

    2016-01-01

    Background: During anesthesia, continuous body temperature monitoring is essential, especially in children. Anesthesia can increase the risk of body temperature loss by three to four times. Hypothermia in children results in increased morbidity and mortality. Since the measurement points of the core body temperature are not easily accessible, near core sites, like the rectum, are used. Objectives: The purpose of this study was to measure skin temperature over the carotid artery and compare it with the rectal temperature, in order to propose a model for accurate estimation of near core body temperature. Patients and Methods: In total, 124 patients aged 2 - 6 years undergoing elective surgery were selected. The temperature of the rectum and of the skin over the carotid artery was measured. Then, the patients were randomly divided into two groups (each including 62 subjects), namely the modeling (MG) and validation (VG) groups. First, in the modeling group, the average temperatures of the rectum and of the skin over the carotid artery were measured separately. The appropriate model was determined according to the significance of the model's coefficients. The obtained model was used to predict the rectal temperature in the second group (VG group). Correlation of the predicted values with the real values (the measured rectal temperature) in the second group was investigated. Also, the difference in the average values of these two groups was examined in terms of significance. Results: In the modeling group, the average rectal and carotid temperatures were 36.47 ± 0.54°C and 35.45 ± 0.62°C, respectively. The final model was obtained as follows: rectal temperature = 0.561 × carotid temperature + 16.583. The predicted value was calculated based on the regression model and then compared with the measured rectal value, which showed no significant difference (P = 0.361). Conclusions: The present study was the first research in which rectal temperature was compared with that
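
    Applying the regression model reported in this abstract is a one-line calculation; the sketch below simply evaluates it at the group-mean carotid temperature, which reproduces the reported mean rectal temperature of about 36.47 °C.

```python
def estimate_rectal_temp(carotid_temp_c: float) -> float:
    """Estimate near-core (rectal) temperature from carotid skin temperature,
    using the regression reported in the abstract."""
    return 0.561 * carotid_temp_c + 16.583

# With the group-mean carotid temperature of 35.45 C the model returns ~36.47 C.
print(f"{estimate_rectal_temp(35.45):.2f} C")
```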

  14. Cost estimate for muddy water palladium production facility at Mound

    SciTech Connect

    McAdams, R.K.

    1988-11-30

    An economic feasibility study was performed on the "Muddy Water" low-chlorine content palladium powder production process developed by Mound. The total capital investment and total operating costs (dollars per gram) were determined for production batch sizes of 1--10 kg in 1-kg increments. The report includes a brief description of the Muddy Water process, the process flow diagram, and material balances for the various production batch sizes. Two types of facilities were evaluated--one for production of new, "virgin" palladium powder, and one for recycling existing material. The total capital investment for virgin facilities ranged from $600,000--$1.3 million for production batch sizes of 1--10 kg, respectively. The range for recycle facilities was $1--$2.3 million. The total operating cost for 100% acceptable powder production in the virgin facilities ranged from $23 per gram for a 1-kg production batch size to $8 per gram for a 10-kg batch size. Similarly for recycle facilities, the total operating cost ranged from $34 per gram to $5 per gram. The total operating cost versus product acceptability (ranging from 50%--100% acceptability) was also evaluated for both virgin and recycle facilities. Because production sizes studied vary widely and because scale-up factors are unknown for batch sizes greater than 1 kg, all costs are "order-of-magnitude" estimates. All costs reported are in 1987 dollars.

  15. Cost Estimation of Laser Additive Manufacturing of Stainless Steel

    NASA Astrophysics Data System (ADS)

    Piili, Heidi; Happonen, Ari; Väistö, Tapio; Venkataramanan, Vijaikrishnan; Partanen, Jouni; Salminen, Antti

    Laser additive manufacturing (LAM) is a layer-wise fabrication method in which a laser beam melts metallic powder to form solid objects. Although 3D printing was invented 30 years ago, its industrial use is quite limited, whereas the introduction of cheap consumer 3D printers in recent years has popularized 3D printing. Interest is focused more and more on the manufacturing of functional parts. The aim of this study is to define and discuss the current economic opportunities and restrictions of the LAM process. Manufacturing costs were studied for different build scenarios, each with an estimated cost structure based on calculated build time and on the costs of the machine, material and energy under optimized machine utilization. All manufacturing and time simulations in this study were carried out with a research machine equivalent to commercial EOS M series equipment. The study shows that the main expense in LAM is the investment cost of the LAM machine, compared to which the relative proportions of the energy and material costs are very low. The manufacturing time per part is the key factor in optimizing the costs of LAM.
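
    The cost structure described here can be sketched with a simple machine-rate calculation: amortize the machine investment over its utilized operating hours, charge it per build hour, and add material and energy. All figures below are hypothetical assumptions, not values from the study, but they illustrate why build time dominates the per-part cost.

```python
MACHINE_PRICE_EUR = 500_000
USEFUL_LIFE_HOURS = 5 * 4000                            # five years of utilized operation
MACHINE_RATE = MACHINE_PRICE_EUR / USEFUL_LIFE_HOURS    # EUR per build hour

POWDER_EUR_PER_KG = 80.0
ENERGY_EUR_PER_KWH = 0.15

def part_cost(build_hours, powder_kg, avg_power_kw):
    machine = build_hours * MACHINE_RATE
    material = powder_kg * POWDER_EUR_PER_KG
    energy = build_hours * avg_power_kw * ENERGY_EUR_PER_KWH
    return {"machine": machine, "material": material, "energy": energy,
            "total": machine + material + energy}

print(part_cost(build_hours=12.0, powder_kg=0.8, avg_power_kw=3.0))
```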

  16. Man power/cost estimation model: Automated planetary projects

    NASA Technical Reports Server (NTRS)

    Kitchen, L. D.

    1975-01-01

    A manpower/cost estimation model is developed which is based on a detailed level of financial analysis of over 30 million raw data points which are then compacted by more than three orders of magnitude to the level at which the model is applicable. The major parameter of expenditure is manpower (specifically direct labor hours) for all spacecraft subsystem and technical support categories. The resultant model is able to provide a mean absolute error of less than fifteen percent for the eight programs comprising the model data base. The model includes cost saving inheritance factors, broken down in four levels, for estimating follow-on type programs where hardware and design inheritance are evident or expected.

  17. Estimating pressurized water reactor decommissioning costs: A user's manual for the PWR Cost Estimating Computer Program (CECP) software. Draft report for comment

    SciTech Connect

    Bierschbach, M.C.; Mencinsky, G.J.

    1993-10-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the US Nuclear Regulatory Commission (NRC) for review their decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning PWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  18. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    SciTech Connect

    Bierschbach, M.C.

    1994-12-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the U.S. Nuclear Regulatory Commission (NRC) for review their decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  19. Estimating boiling water reactor decommissioning costs: A user's manual for the BWR Cost Estimating Computer Program (CECP) software. Final report

    SciTech Connect

    Bierschbach, M.C.

    1996-06-01

    Nuclear power plant licensees are required to submit to the US Nuclear Regulatory Commission (NRC) for review their decommissioning cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning boiling water reactor (BWR) power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  20. Decommissioning Cost Estimating Factors And Earned Value Integration

    SciTech Connect

    Sanford, P.C.; Cimmarron, E.

    2008-07-01

    The Rocky Flats 771 Project progressed from the planning stage of decommissioning a plutonium facility, through the strip-out of highly-contaminated equipment, removal of utilities and structural decontamination, and building demolition. Actual cost data was collected from the strip-out activities and compared to original estimates, allowing the development of cost by equipment groupings and types and over time. Separate data was developed from the project control earned value reporting and compared with the equipment data. The paper discusses the analysis to develop the detailed factors for the different equipment types, and the items that need to be considered during characterization of a similar facility when preparing an estimate. The factors are presented based on direct labor requirements by equipment type. The paper also includes actual support costs, and examples of fixed or one-time start-up costs. The integration of the estimate and the earned value system used for the 771 Project is also discussed. The paper covers the development of the earned value system as well as its application to a facility to be decommissioned and an existing work breakdown structure. Lessons learned are provided, including integration with scheduling and craft supervision, measurement approaches, and verification of scope completion. In summary: The work of decommissioning the Rocky Flats 771 Project process equipment was completed in 2003. Early in the planning process, we had difficulty in identifying credible data and implementing processes for estimating and controlling this work. As the project progressed, we were able to collect actual data on the costs of removing plutonium contaminated equipment from various areas over the life of this work and associate those costs with individual pieces of equipment. We also were able to develop and test out a system for measuring the earned value of a decommissioning project based on an evolving estimate. These were elements that

  1. A Quantitative Method for Estimating Probable Public Costs of Hurricanes.

    PubMed

    BOSWELL; DEYLE; SMITH; BAKER

    1999-04-01

    / A method is presented for estimating probable public costs resulting from damage caused by hurricanes, measured as local government expenditures approved for reimbursement under the Stafford Act Section 406 Public Assistance Program. The method employs a multivariate model developed through multiple regression analysis of an array of independent variables that measure meteorological, socioeconomic, and physical conditions related to the landfall of hurricanes within a local government jurisdiction. From the regression analysis we chose a log-log (base 10) model that explains 74% of the variance in the expenditure data using population and wind speed as predictors. We illustrate application of the method for a local jurisdiction-Lee County, Florida, USA. The results show that potential public costs range from $4.7 million for a category 1 hurricane with winds of 137 kilometers per hour (85 miles per hour) to $130 million for a category 5 hurricane with winds of 265 kilometers per hour (165 miles per hour). Based on these figures, we estimate expected annual public costs of $2.3 million. These cost estimates: (1) provide useful guidance for anticipating the magnitude of the federal, state, and local expenditures that would be required for the array of possible hurricanes that could affect that jurisdiction; (2) allow policy makers to assess the implications of alternative federal and state policies for providing public assistance to jurisdictions that experience hurricane damage; and (3) provide information needed to develop a contingency fund or other financial mechanism for assuring that the community has sufficient funds available to meet its obligations. KEY WORDS: Hurricane; Public costs; Local government; Disaster recovery; Disaster response; Florida; Stafford Act
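
    The prediction form described in this abstract, log10(cost) = b0 + b1·log10(population) + b2·log10(wind speed), and the expected-annual-cost calculation can be sketched as below. The coefficients, population and annual landfall probabilities are made-up placeholders, not the fitted or published values.

```python
import numpy as np

B0, B1, B2 = -6.0, 1.0, 3.0          # hypothetical regression coefficients

def public_cost(population, wind_kmh):
    # Log-log (base 10) model of reimbursable public cost.
    return 10 ** (B0 + B1 * np.log10(population) + B2 * np.log10(wind_kmh))

# Expected annual cost: category costs weighted by hypothetical annual
# landfall probabilities for one jurisdiction.
categories = [(137, 0.020), (178, 0.010), (209, 0.005), (241, 0.002), (265, 0.001)]
pop = 450_000
expected = sum(p * public_cost(pop, wind) for wind, p in categories)
print(f"expected annual public cost: ${expected:,.0f}")
```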

  2. Fuzzy/Neural Software Estimates Costs of Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Douglas, Freddie; Bourgeois, Edit Kaminsky

    2005-01-01

    The Highly Accurate Cost Estimating Model (HACEM) is a software system for estimating the costs of testing rocket engines and components at Stennis Space Center. HACEM is built on a foundation of adaptive-network-based fuzzy inference systems (ANFIS) a hybrid software concept that combines the adaptive capabilities of neural networks with the ease of development and additional benefits of fuzzy-logic-based systems. In ANFIS, fuzzy inference systems are trained by use of neural networks. HACEM includes selectable subsystems that utilize various numbers and types of inputs, various numbers of fuzzy membership functions, and various input-preprocessing techniques. The inputs to HACEM are parameters of specific tests or series of tests. These parameters include test type (component or engine test), number and duration of tests, and thrust level(s) (in the case of engine tests). The ANFIS in HACEM are trained by use of sets of these parameters, along with costs of past tests. Thereafter, the user feeds HACEM a simple input text file that contains the parameters of a planned test or series of tests, the user selects the desired HACEM subsystem, and the subsystem processes the parameters into an estimate of cost(s).

  3. Methods for cost estimation in software project management

    NASA Astrophysics Data System (ADS)

    Briciu, C. V.; Filip, I.; Indries, I. I.

    2016-02-01

    The speed with which the processes used in the software development field have changed makes forecasting the overall costs of a software project very difficult. Many researchers have considered this task unachievable, but there is a group of scientists for whom the task can be solved using already known mathematical methods (e.g. multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building cost estimation models for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. In the first part of the paper, a summary of the major achievements in the research area of finding a model for estimating overall project costs is presented, together with a description of the existing software development process models. In the last part, a basic proposal for a mathematical model of genetic programming is put forward, including a description of the chosen fitness function and chromosome representation. The perspective of the described model is linked with the current reality of software development, taking as a basis the software product life cycle and the current challenges and innovations in the software development area. Based on the authors' experience and the analysis of existing models and product lifecycles, it was concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
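
    For orientation, the baseline this work starts from is the Basic COCOMO 81 effort equation, and a typical fitness function for evolving estimation models against the PROMISE data is the mean magnitude of relative error (MMRE). The coefficients below are the published organic-mode constants; the project sizes and actual efforts are hypothetical, and this is not the paper's own genetic-programming model.

```python
def cocomo_effort(kloc, a=2.4, b=1.05):
    """Basic COCOMO 81, organic mode: effort in person-months."""
    return a * kloc ** b

def mmre(predictions, actuals):
    """Mean magnitude of relative error, a common fitness function."""
    return sum(abs(p - a) / a for p, a in zip(predictions, actuals)) / len(actuals)

projects_kloc = [10, 32, 50]       # hypothetical project sizes (KLOC)
actual_effort = [24, 85, 140]      # hypothetical actual person-months
preds = [cocomo_effort(k) for k in projects_kloc]
print(f"MMRE = {mmre(preds, actual_effort):.3f}")
```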

  4. Cost estimate for a proposed GDF Suez LNG testing program

    SciTech Connect

    Blanchat, Thomas K.; Brady, Patrick Dennis; Jernigan, Dann A.; Luketa, Anay Josephine; Nissen, Mark R.; Lopez, Carlos; Vermillion, Nancy; Hightower, Marion Michael

    2014-02-01

    At the request of GDF Suez, a Rough Order of Magnitude (ROM) cost estimate was prepared for the design, construction, testing, and data analysis of an experimental series of large-scale liquefied natural gas (LNG) spills on land and water that would result in the largest pool fires and vapor dispersion events ever conducted. Due to the expected cost of this large, multi-year program, the authors utilized Sandia's structured cost estimating methodology. This methodology ensures that the efforts identified can be performed for the cost proposed at a plus or minus 30 percent confidence. The scale of the LNG spill, fire, and vapor dispersion tests proposed by GDF could produce hazard distances and testing safety issues that need to be fully explored. Based on our evaluations, Sandia can utilize much of our existing fire testing infrastructure for the large fire tests and some small dispersion tests (with some modifications) in Albuquerque, but we propose to develop a new dispersion testing site at our remote test area in Nevada because of the large hazard distances. While this might impact some testing logistics, the safety aspects warrant this approach. In addition, we have included a proposal to study cryogenic liquid spills on water and subsequent vaporization in the presence of waves. Sandia is working with DOE on applications that provide infrastructure pertinent to wave production. We present an approach to conduct repeatable wave/spill interaction testing that could utilize such infrastructure.

  5. 48 CFR 2452.216-80 - Estimated cost and fixed-fee.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 6 2014-10-01 2014-10-01 false Estimated cost and fixed... Clauses 2452.216-80 Estimated cost and fixed-fee. As prescribed in 2416.307(b), insert the following clause: Estimated Cost And Fixed-Fee (DEC 2012) (a) It is estimated that the total cost to the...

  6. 48 CFR 1852.216-74 - Estimated cost and fixed fee.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 6 2014-10-01 2014-10-01 false Estimated cost and fixed... and Clauses 1852.216-74 Estimated cost and fixed fee. As prescribed in 1816.307-70(b), insert the following clause: Estimated Cost and Fixed Fee (DEC 1991) The estimated cost of this contract...

  7. 48 CFR 1852.216-74 - Estimated cost and fixed fee.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Estimated cost and fixed... and Clauses 1852.216-74 Estimated cost and fixed fee. As prescribed in 1816.307-70(b), insert the following clause: Estimated Cost and Fixed Fee (DEC 1991) The estimated cost of this contract...

  8. 48 CFR 1852.216-74 - Estimated cost and fixed fee.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Estimated cost and fixed... and Clauses 1852.216-74 Estimated cost and fixed fee. As prescribed in 1816.307-70(b), insert the following clause: Estimated Cost and Fixed Fee (DEC 1991) The estimated cost of this contract...

  9. 48 CFR 1852.216-85 - Estimated cost and award fee.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Estimated cost and award... and Clauses 1852.216-85 Estimated cost and award fee. As prescribed in 1816.406-70(e), insert the following clause: Estimated Cost and Award Fee (SEP 1993) The estimated cost of this contract is $___....

  10. 48 CFR 1852.216-74 - Estimated cost and fixed fee.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Estimated cost and fixed... and Clauses 1852.216-74 Estimated cost and fixed fee. As prescribed in 1816.307-70(b), insert the following clause: Estimated Cost and Fixed Fee (DEC 1991) The estimated cost of this contract...

  11. Development of hybrid lifecycle cost estimating tool (HLCET) for manufacturing influenced design tradeoff

    NASA Astrophysics Data System (ADS)

    Sirirojvisuth, Apinut

    concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher fidelity tradeoffs during conceptual and preliminary design. Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM result of the composite part production cost. The reason for the replacement is that TCM estimates production costs from part weights as a result of subtractive manufacturing of metallic origin such as casting, forging, and machining processes. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called Maynard Operation Sequence Technique (MOST) to be used, similarly to Activity-Based Costing (ABC) approach, to estimate manufacturing time of a part by virtue of breaking down the operations occurred during its production. ABC allows a realistic determination of cost incurred in each activity, as opposed to using a traditional method of time estimation by analogy or using response surface equations from historical process data. The MOST concept provides a tailored study of an individual process typically required for a new, innovative design. Nevertheless, the MOST idea has some challenges, one of which is its requirement to build a new process from ground up. The process development requires a Subject Matter Expertise (SME) in manufacturing method of the particular design. The SME must have also a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside of the design discipline and a priori training of MOST. To relieve the constraint, this study includes an entirely new sub-system architecture

  12. Using average cost methods to estimate encounter-level costs for medical-surgical stays in the VA.

    PubMed

    Wagner, Todd H; Chen, Shuo; Barnett, Paul G

    2003-09-01

    The U.S. Department of Veterans Affairs (VA) maintains discharge abstracts, but these do not include cost information. This article describes the methods the authors used to estimate the costs of VA medical-surgical hospitalizations in fiscal years 1998 to 2000. They estimated a cost regression with 1996 Medicare data restricted to veterans receiving VA care in an earlier year. The regression accounted for approximately 74 percent of the variance in cost-adjusted charges, and it proved to be robust to outliers and the year of input data. The beta coefficients from the cost regression were used to impute costs of VA medical-surgical hospital discharges. The estimated aggregate costs were reconciled with VA budget allocations. In addition to the direct medical costs, their cost estimates include indirect costs and physician services; both of these were allocated in proportion to direct costs. They discuss the method's limitations and application in other health care systems. PMID:15095543
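
    The essence of the average-cost method described here is to use a regression fitted on an external payer's data to produce relative cost weights for each stay, then scale those weights so aggregate imputed costs reconcile with the known budget allocation. The relative-value function, stay data and budget figure in this sketch are hypothetical, not the VA's actual model.

```python
import numpy as np

def relative_value(los_days, drg_weight):
    # Hypothetical relative-value model from length of stay and DRG weight.
    return 500 * drg_weight + 300 * np.log1p(los_days)

stays = [(3, 1.0), (10, 2.3), (1, 0.6), (6, 1.4)]        # (LOS, DRG weight)
rel = np.array([relative_value(los, w) for los, w in stays])

budget_allocation = 40_000.0                              # known aggregate cost
costs = rel * (budget_allocation / rel.sum())             # reconcile to budget

for (los, w), c in zip(stays, costs):
    print(f"LOS={los:2d}, DRG={w:.1f}: imputed cost ${c:,.0f}")
```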

  13. Cost estimation of timber bridges using neural networks

    SciTech Connect

    Creese, R.C.; Li, L.

    1995-05-01

    Neural network models, or more simply "neural nets," have great potential application in speech and image recognition. They also have great potential for cost estimating. Neural networks are particularly effective for complex estimation where the relationship between the output and the input cannot be expressed by simple mathematical relationships. A neural network method was applied to the cost estimation of timber bridges to illustrate the technique. The results of the neural network method were evaluated by the coefficient of determination (the R-squared value) for the key input variables. A comparison of the neural network results and the standard linear regression results was performed on the timber bridge data. A step-by-step validation is presented to make it easy to understand the application of neural networks to this estimation process. The input is propagated from the input layer through each layer until an output is generated. The output is compared with the desired output and the error is distributed for each node in the output layer. The error is transmitted backward (thus the phrase "back propagation") from the output layer to the intermediate layers and then to the input layer. Based upon the errors, the weights are adjusted and the procedure is repeated. The number of training cycles is 15,000 to 50,000 for simple networks, but this usually takes only a few minutes on a personal computer. 7 refs., 4 figs., 11 tabs.
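
    A small neural-network cost estimator in the spirit of the method described above can be sketched with an off-the-shelf multilayer perceptron trained by back-propagation; the features (span, width, design load) and costs here are synthetic placeholders, not the timber bridge data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.uniform([10, 3, 5], [40, 12, 20], size=(200, 3))    # span, width, load
cost = 2000 * X[:, 0] + 1500 * X[:, 1] * X[:, 2] ** 0.5 + rng.normal(0, 5000, 200)

scaler = StandardScaler().fit(X)
net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=20000, random_state=0)
net.fit(scaler.transform(X), cost)                          # back-propagation training

new_bridge = scaler.transform([[25, 8, 12]])
print(f"estimated cost: ${net.predict(new_bridge)[0]:,.0f}")
print(f"R^2 on training data: {net.score(scaler.transform(X), cost):.2f}")
```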

  14. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    NASA Technical Reports Server (NTRS)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo

  15. Determining Sample Size for Accurate Estimation of the Squared Multiple Correlation Coefficient.

    ERIC Educational Resources Information Center

    Algina, James; Olejnik, Stephen

    2000-01-01

    Discusses determining sample size for estimation of the squared multiple correlation coefficient and presents regression equations that permit determination of the sample size for estimating this parameter for up to 20 predictor variables. (SLD)

  16. Cost-effective accurate coarse-grid method for highly convective multidimensional unsteady flows

    NASA Technical Reports Server (NTRS)

    Leonard, B. P.; Niknafs, H. S.

    1991-01-01

    A fundamentally multidimensional convection scheme is described based on vector transient interpolation modeling rewritten in conservative control-volume form. Vector third-order upwinding is used as the basis of the algorithm; this automatically introduces important cross-difference terms that are absent from schemes using component-wise one-dimensional formulas. Third-order phase accuracy is good; this is important for coarse-grid large-eddy or full simulation. Potential overshoots or undershoots are avoided by using a recently developed universal limiter. Higher order accuracy is obtained locally, where needed, by the cost-effective strategy of adaptive stencil expansion in a direction normal to each control-volume face; this is controlled by monitoring the absolute normal gradient and curvature across the face. Higher (than third) order cross-terms do not appear to be needed. Since the wider stencil is used only in isolated narrow regions (near discontinuities), extremely high (in this case, seventh) order accuracy can be achieved for little more than the cost of a globally third-order scheme.

  17. Estimating cost ratio distribution between fatal and non-fatal road accidents in Malaysia

    NASA Astrophysics Data System (ADS)

    Hamdan, Nurhidayah; Daud, Noorizam

    2014-07-01

    Road traffic crashes are a major global problem, and should be treated as a shared responsibility. In Malaysia, road accidents killed 6,917 people and injured or disabled 17,522 people in 2012, and the government spent about RM9.3 billion in 2009; the annual loss to the nation is reported at approximately 1 to 2 percent of gross domestic product (GDP). The current cost ratio for fatal and non-fatal accidents used by the Ministry of Works Malaysia is simply based on an arbitrary value of 6:4, or equivalently 1.5:1, reflecting the fact that six factors are involved in calculating the accident cost for a fatal accident while four factors are used for a non-fatal accident. This simple indication used by the authority to calculate the cost ratio is doubtful, since there is a lack of mathematical and conceptual evidence to explain how the ratio is determined. The main aim of this study is to determine a new accident cost ratio for fatal and non-fatal accidents in Malaysia based on a quantitative statistical approach. The cost ratio distributions are estimated based on the Weibull distribution. Due to the unavailability of official accident cost data, insurance claim data for both fatal and non-fatal accidents have been used as proxy information for the actual accident cost. Two types of parameter estimates are used in this study: maximum likelihood (MLE) and robust estimation. The findings of this study reveal that the accident cost ratio for fatal and non-fatal claims when using MLE is 1.33, while for robust estimates the cost ratio is slightly higher at 1.51. This study will help the authority to determine a more accurate cost ratio between fatal and non-fatal accidents as compared to the official ratio set by the government, since the cost ratio is an important element to be used as a weighting in modeling road accident related data. Therefore, this study provides some guidance tips to revise the insurance claim set by the Malaysia road authority, hence the appropriate method
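
    One way to sketch the Weibull-based ratio estimation described above is to fit a Weibull distribution to fatal and non-fatal claim amounts separately by maximum likelihood and take the ratio of the fitted means. The claim samples below are synthetic placeholders, not the Malaysian insurance data, and robust estimation is not shown.

```python
import numpy as np
from scipy import stats

fatal_claims = stats.weibull_min.rvs(c=1.6, scale=30_000, size=400, random_state=10)
nonfatal_claims = stats.weibull_min.rvs(c=1.5, scale=20_000, size=1200, random_state=11)

def weibull_mean(sample):
    # MLE fit with the location fixed at zero, then the analytic Weibull mean.
    c, loc, scale = stats.weibull_min.fit(sample, floc=0)
    return stats.weibull_min.mean(c, loc=loc, scale=scale)

ratio = weibull_mean(fatal_claims) / weibull_mean(nonfatal_claims)
print(f"estimated fatal : non-fatal cost ratio = {ratio:.2f}")
```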

  18. A robust and accurate center-frequency estimation (RACE) algorithm for improving motion estimation performance of SinMod on tagged cardiac MR images without known tagging parameters.

    PubMed

    Liu, Hong; Wang, Jie; Xu, Xiangyang; Song, Enmin; Wang, Qian; Jin, Renchao; Hung, Chih-Cheng; Fei, Baowei

    2014-11-01

    A robust and accurate center-frequency (CF) estimation (RACE) algorithm for improving the performance of the local sine-wave modeling (SinMod) method, which is a good motion estimation method for tagged cardiac magnetic resonance (MR) images, is proposed in this study. The RACE algorithm can automatically, effectively and efficiently produce a very appropriate CF estimate for the SinMod method, under the circumstance that the specified tagging parameters are unknown, on account of the following two key techniques: (1) the well-known mean-shift algorithm, which can provide accurate and rapid CF estimation; and (2) an original two-direction-combination strategy, which can further enhance the accuracy and robustness of CF estimation. Some other available CF estimation algorithms are brought out for comparison. Several validation approaches that can work on the real data without ground truths are specially designed. Experimental results on human body in vivo cardiac data demonstrate the significance of accurate CF estimation for SinMod, and validate the effectiveness of RACE in facilitating the motion estimation performance of SinMod.

  19. Accurate low-cost methods for performance evaluation of cache memory systems

    NASA Technical Reports Server (NTRS)

    Laha, Subhasis; Patel, Janak H.; Iyer, Ravishankar K.

    1988-01-01

    Methods of simulation based on statistical techniques are proposed to decrease the need for large trace measurements and for predicting true program behavior. Sampling techniques are applied while the address trace is collected from a workload. This drastically reduces the space and time needed to collect the trace. Simulation techniques are developed to use the sampled data not only to predict the mean miss rate of the cache, but also to provide an empirical estimate of its actual distribution. Finally, a concept of primed cache is introduced to simulate large caches by the sampling-based method.
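
    A toy version of miss-rate estimation from a sampled trace is sketched below: simulate a direct-mapped cache on short contiguous samples of an address trace and average the per-sample miss rates. The trace is a synthetic stand-in and cold-start (primed-cache) corrections are not modelled.

```python
import random

LINE_SIZE, NUM_SETS = 64, 1024          # 64 KiB direct-mapped cache

def miss_rate(addresses):
    tags = [None] * NUM_SETS
    misses = 0
    for addr in addresses:
        block = addr // LINE_SIZE
        idx, tag = block % NUM_SETS, block // NUM_SETS
        if tags[idx] != tag:            # miss: fill the line
            misses += 1
            tags[idx] = tag
    return misses / len(addresses)

random.seed(0)
trace = [random.randrange(0, 1 << 24) for _ in range(200_000)]   # stand-in trace

# Twenty contiguous samples of 2,000 references instead of the full trace.
samples = [trace[i:i + 2000] for i in range(0, len(trace), 10_000)]
estimate = sum(miss_rate(s) for s in samples) / len(samples)
print(f"sampled miss-rate estimate: {estimate:.3f}, full trace: {miss_rate(trace):.3f}")
```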

  20. Evaluation of a low-cost and accurate ocean temperature logger on subsurface mooring systems

    SciTech Connect

    Tian, Chuan; Deng, Zhiqun; Lu, Jun; Xu, Xiaoyang; Zhao, Wei; Xu, Ming

    2014-06-23

    Monitoring seawater temperature is important to understanding evolving ocean processes. To monitor internal waves or ocean mixing, a large number of temperature loggers are typically mounted on subsurface mooring systems to obtain high-resolution temperature data at different water depths. In this study, we redesigned and evaluated a compact, low-cost, self-contained, high-resolution and high-accuracy ocean temperature logger, TC-1121. The newly designed TC-1121 loggers are smaller, more robust, and their sampling intervals can be automatically changed by indicated events. They have been widely used in many mooring systems to study internal wave and ocean mixing. The logger’s fundamental design, noise analysis, calibration, drift test, and a long-term sea trial are discussed in this paper.

  1. Accurately measuring volume of soil samples using low cost Kinect 3D scanner

    NASA Astrophysics Data System (ADS)

    van der Sterre, Boy-Santhos; Hut, Rolf; van de Giesen, Nick

    2013-04-01

    The 3D scanner of the Kinect game controller can be used to increase the accuracy and efficiency of determining in situ soil moisture content. Soil moisture is one of the principal hydrological variables in both the water and energy interactions between soil and atmosphere. Current in situ measurements of soil moisture either rely on indirect measurements (of electromagnetic constants or heat capacity) or on physically taking a sample and weighing it in a lab. The bottleneck in accurately retrieving soil moisture using samples is determining the volume of the sample. Currently this is mostly done by the very time consuming "sand cone method", in which the volume where the sample used to sit is filled with sand. We show that the 3D scanner that is part of the $150 game controller extension "Kinect" can be used to make 3D scans before and after taking the sample. The accuracy of this method is tested by scanning forms of known volume. This method is less time consuming and less error-prone than using a sand cone.
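
    The volume computation implied by the before/after scanning approach can be sketched as differencing two height maps and summing over the pixel footprint. The depth grids, hole geometry and pixel size below are hypothetical, not Kinect output.

```python
import numpy as np

PIXEL_AREA_M2 = 0.002 ** 2              # assumed 2 mm x 2 mm ground footprint

before = np.zeros((200, 200))                        # flat surface heights (m)
after = before.copy()
yy, xx = np.mgrid[0:200, 0:200]
hole = (xx - 100) ** 2 + (yy - 100) ** 2 < 40 ** 2   # excavated region
after[hole] -= 0.05                                  # 5 cm deep hole

depth_change = before - after                        # positive where soil removed
volume_m3 = np.sum(depth_change) * PIXEL_AREA_M2
print(f"sample volume ~ {volume_m3 * 1e6:.0f} cm^3")
```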

  2. Accurately measuring volume of soil samples using low cost Kinect 3D scanner

    NASA Astrophysics Data System (ADS)

    van der Sterre, B.; Hut, R.; Van De Giesen, N.

    2012-12-01

    The 3D scanner of the Kinect game controller can be used to increase the accuracy and efficiency of determining in situ soil moisture content. Soil moisture is one of the principal hydrological variables in both the water and energy interactions between soil and atmosphere. Current in situ measurements of soil moisture either rely on indirect measurements (of electromagnetic constants or heat capacity) or on physically taking a sample and weighing it in a lab. The bottleneck in accurately retrieving soil moisture using samples is determining the volume of the sample. Currently this is mostly done by the very time consuming "sand cone method", in which the volume where the sample used to sit is filled with sand. We show that the 3D scanner that is part of the $150 game controller extension "Kinect" can be used to make 3D scans before and after taking the sample. The accuracy of this method is tested by scanning forms of known volume. This method is less time consuming and less error-prone than using a sand cone.

  3. Treatment Cost Analysis Tool (TCAT) for estimating costs of outpatient treatment services.

    PubMed

    Flynn, Patrick M; Broome, Kirk M; Beaston-Blaakman, Aaron; Knight, Danica K; Horgan, Constance M; Shepard, Donald S

    2009-02-01

    A Microsoft Excel-based workbook designed for research analysts to use in a national study was retooled for treatment program directors and financial officers to allocate, analyze, and estimate outpatient treatment costs in the U.S. This instrument can also be used as a planning and management tool to optimize resources and forecast the impact of future changes in staffing, client flow, program design, and other resources. The Treatment Cost Analysis Tool (TCAT) automatically provides feedback and generates summaries and charts using comparative data from a national sample of non-methadone outpatient providers. TCAT is being used by program staff to capture and allocate both economic and accounting costs, and outpatient service costs are reported for a sample of 70 programs. Costs for an episode of treatment in regular, intensive, and mixed types of outpatient treatment were $882, $1310, and $1381 respectively (based on 20% trimmed means and 2006 dollars). An hour of counseling cost $64 in regular, $85 intensive, and $86 mixed. Group counseling hourly costs per client were $8, $11, and $10 respectively for regular, intensive, and mixed. Future directions include use of a web-based interview version, much like some of the commercially available tax preparation software tools, and extensions for use in other modalities of treatment.

  4. Treatment Cost Analysis Tool (TCAT) for Estimating Costs of Outpatient Treatment Services

    PubMed Central

    Flynn, Patrick M.; Broome, Kirk M.; Beaston-Blaakman, Aaron; Knight, Danica K.; Horgan, Constance M.; Shepard, Donald S.

    2009-01-01

    A Microsoft® Excel-based workbook designed for research analysts to use in a national study was retooled for treatment program directors and financial officers to allocate, analyze, and estimate outpatient treatment costs in the U.S. This instrument can also be used as a planning and management tool to optimize resources and forecast the impact of future changes in staffing, client flow, program design, and other resources. The Treatment Cost Analysis Tool (TCAT) automatically provides feedback and generates summaries and charts using comparative data from a national sample of non-methadone outpatient providers. TCAT is being used by program staff to capture and allocate both economic and accounting costs, and outpatient service costs are reported for a sample of 70 programs. Costs for an episode of treatment in regular, intensive, and mixed types of outpatient treatment types were $882, $1,310, and $1,381 respectively (based on 20% trimmed means and 2006 dollars). An hour of counseling cost $64 in regular, $85 intensive, and $86 mixed. Group counseling hourly costs per client were $8, $11, and $10 respectively for regular, intensive, and mixed. Future directions include use of a web-based interview version, much like some of the commercially available tax preparation software tools, and extensions for use in other modalities of treatment. PMID:19004576

  5. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    NASA Astrophysics Data System (ADS)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates based on a literature review, an evaluation of publicly funded projects such as part of the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such NASA, the U.S. Department of Defence, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lacking guidance grounded in theory for the selection of uncertainty quantification metrics and lacking practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of; a set-theory based typology, a data library, a classification system, and a corresponding input-output model are put forward to address this research gap as the basis

  6. Improved Recharge Estimation from Portable, Low-Cost Weather Stations.

    PubMed

    Holländer, Hartmut M; Wang, Zijian; Assefa, Kibreab A; Woodbury, Allan D

    2016-03-01

    Groundwater recharge estimation is a critical quantity for sustainable groundwater management. The feasibility and robustness of recharge estimation was evaluated using physical-based modeling procedures, and data from a low-cost weather station with remote sensor techniques in Southern Abbotsford, British Columbia, Canada. Recharge was determined using the Richards-based vadose zone hydrological model, HYDRUS-1D. The required meteorological data were recorded with a HOBO(TM) weather station for a short observation period (about 1 year) and an existing weather station (Abbotsford A) for long-term study purpose (27 years). Undisturbed soil cores were taken at two locations in the vicinity of the HOBO(TM) weather station. The derived soil hydraulic parameters were used to characterize the soil in the numerical model. Model performance was evaluated using observed soil moisture and soil temperature data obtained from subsurface remote sensors. A rigorous sensitivity analysis was used to test the robustness of the model. Recharge during the short observation period was estimated at 863 and 816 mm. The mean annual recharge was estimated at 848 and 859 mm/year based on a time series of 27 years. The relative ratio of annual recharge-precipitation varied from 43% to 69%. From a monthly recharge perspective, the majority (80%) of recharge due to precipitation occurred during the hydrologic winter period. The comparison of the recharge estimates with other studies indicates a good agreement. Furthermore, this method is able to predict transient recharge estimates, and can provide a reasonable tool for estimates on nutrient leaching that is often controlled by strong precipitation events and rapid infiltration of water and nitrate into the soil. PMID:26011672

  7. Improved Recharge Estimation from Portable, Low-Cost Weather Stations.

    PubMed

    Holländer, Hartmut M; Wang, Zijian; Assefa, Kibreab A; Woodbury, Allan D

    2016-03-01

    Groundwater recharge estimation is a critical quantity for sustainable groundwater management. The feasibility and robustness of recharge estimation was evaluated using physical-based modeling procedures, and data from a low-cost weather station with remote sensor techniques in Southern Abbotsford, British Columbia, Canada. Recharge was determined using the Richards-based vadose zone hydrological model, HYDRUS-1D. The required meteorological data were recorded with a HOBO(TM) weather station for a short observation period (about 1 year) and an existing weather station (Abbotsford A) for long-term study purpose (27 years). Undisturbed soil cores were taken at two locations in the vicinity of the HOBO(TM) weather station. The derived soil hydraulic parameters were used to characterize the soil in the numerical model. Model performance was evaluated using observed soil moisture and soil temperature data obtained from subsurface remote sensors. A rigorous sensitivity analysis was used to test the robustness of the model. Recharge during the short observation period was estimated at 863 and 816 mm. The mean annual recharge was estimated at 848 and 859 mm/year based on a time series of 27 years. The relative ratio of annual recharge-precipitation varied from 43% to 69%. From a monthly recharge perspective, the majority (80%) of recharge due to precipitation occurred during the hydrologic winter period. The comparison of the recharge estimates with other studies indicates a good agreement. Furthermore, this method is able to predict transient recharge estimates, and can provide a reasonable tool for estimates on nutrient leaching that is often controlled by strong precipitation events and rapid infiltration of water and nitrate into the soil.

  8. Architects and Design-Phase Cost Estimates: Design Professionals Should Reconsider the Value of Third-Party Estimates

    ERIC Educational Resources Information Center

    Coakley, John

    2010-01-01

    Professional cost estimators are widely used by architects during the design phases of a project to provide preliminary cost estimates. These estimates may begin at the conceptual design phase and are prepared at regular intervals through the construction document phase. Estimating professionals are frequently tasked with "selling" the importance…

  9. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 12 2010-01-01 2010-01-01 false Estimated Breakdown of Dwelling Costs for Estimating Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  10. Estimates of outage costs of electricity in Pakistan

    SciTech Connect

    Ashraf, J.; Sabih, F.

    1993-12-31

    This article estimates outage costs of electricity for each of the four provinces in Pakistan (Punjab, North-West Frontier Province, Baluchistan, and Sind). The term "power outage" refers to all problems associated with electricity supply, such as voltage drops (brownouts), power failures (blackouts), and load shedding. The most significant of these in Pakistan is load shedding, when power supply to different consumers is shut off during different times of the day, especially during peak hours when the pressure on the system is highest. Power shortages mainly arise during the low-water months when the effective capacity of hydropower plants drops significantly. This decline in power supplied by hydro plants cannot be made up by operating thermal power plants because of the limited availability of gas and the high cost of alternative fuels required for the operation of gas turbines.

  11. A Cost-Benefit and Accurate Method for Assessing Microalbuminuria: Single versus Frequent Urine Analysis

    PubMed Central

    Hemmati, Roholla; Gharipour, Mojgan; Khosravi, Alireza; Jozan, Mahnaz

    2013-01-01

    Background. The purpose of this study was to determine whether a single test for microalbuminuria yields a reliable conclusion and can therefore save costs. Methods. This cross-sectional study included a total of 126 consecutive persons. Microalbuminuria was assessed by collection of two fasting random urine specimens, one on arrival at the clinic and one in the morning one week later. Results. Overall, 17 of the 126 participants had microalbuminuria; 12 of them were also identified by the single assessment, giving a sensitivity of 70.6%, a specificity of 100%, a PPV of 100%, an NPV of 95.6%, and an accuracy of 96.0%. The corresponding sensitivity, specificity, PPV, NPV, and accuracy in hypertensive patients were 73.3%, 100%, 100%, 94.8%, and 95.5%, respectively; in the nonhypertensive group they were 50.0%, 100%, 100%, 97.3%, and 97.4%, respectively. According to ROC curve analysis, a single measurement of UACR discriminated impaired from normal renal function well (c = 0.989), and a single measurement of urinary albumin concentration likewise had high discriminative value for diagnosing kidney damage (c = 0.995). Conclusion. A single test of both UACR and urine albumin level, rather than repeated testing, provides high diagnostic sensitivity, specificity, and accuracy as well as high predictive values in the total population and in the hypertensive subgroup. PMID:24455207
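    The reported whole-cohort figures follow from a standard 2x2 confusion matrix against the repeated-test reference (12 true positives, 5 false negatives, 0 false positives, 109 true negatives). A minimal Python sketch:

        # Diagnostic performance from a 2x2 table; counts reproduce the abstract's figures.
        def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn),
                "accuracy": (tp + tn) / (tp + fp + fn + tn),
            }

        print(diagnostic_metrics(tp=12, fp=0, fn=5, tn=109))
        # sensitivity 0.706, specificity 1.0, PPV 1.0, NPV 0.956, accuracy 0.960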

  12. Estimation method of point spread function based on Kalman filter for accurately evaluating real optical properties of photonic crystal fibers.

    PubMed

    Shen, Yan; Lou, Shuqin; Wang, Xin

    2014-03-20

    The evaluation accuracy of real optical properties of photonic crystal fibers (PCFs) is determined by the accurate extraction of air hole edges from microscope images of cross sections of practical PCFs. A novel estimation method of point spread function (PSF) based on Kalman filter is presented to rebuild the micrograph image of the PCF cross-section and thus evaluate real optical properties for practical PCFs. Through tests on both artificially degraded images and microscope images of cross sections of practical PCFs, we prove that the proposed method can achieve more accurate PSF estimation and lower PSF variance than the traditional Bayesian estimation method, and thus also reduce the defocus effect. With this method, we rebuild the microscope images of two kinds of commercial PCFs produced by Crystal Fiber and analyze the real optical properties of these PCFs. Numerical results are in accord with the product parameters.

  13. Allele-Specific Quantitative PCR for Accurate, Rapid, and Cost-Effective Genotyping.

    PubMed

    Lee, Han B; Schwab, Tanya L; Koleilat, Alaa; Ata, Hirotaka; Daby, Camden L; Cervera, Roberto Lopez; McNulty, Melissa S; Bostwick, Hannah S; Clark, Karl J

    2016-06-01

    genotypes. ASQ is cost-effective because universal fluorescent probes negate the necessity of designing expensive probes for each locus. PMID:26986823

  14. Allele-Specific Quantitative PCR for Accurate, Rapid, and Cost-Effective Genotyping.

    PubMed

    Lee, Han B; Schwab, Tanya L; Koleilat, Alaa; Ata, Hirotaka; Daby, Camden L; Cervera, Roberto Lopez; McNulty, Melissa S; Bostwick, Hannah S; Clark, Karl J

    2016-06-01

    genotypes. ASQ is cost-effective because universal fluorescent probes negate the necessity of designing expensive probes for each locus.

  15. Allele-Specific Quantitative PCR for Accurate, Rapid, and Cost-Effective Genotyping

    PubMed Central

    Lee, Han B.; Schwab, Tanya L.; Koleilat, Alaa; Ata, Hirotaka; Daby, Camden L.; Cervera, Roberto Lopez; McNulty, Melissa S.; Bostwick, Hannah S.; Clark, Karl J.

    2016-01-01

    genotypes. ASQ is cost-effective because universal fluorescent probes negate the necessity of designing expensive probes for each locus. PMID:26986823

  16. Towards Contactless, Low-Cost and Accurate 3D Fingerprint Identification.

    PubMed

    Kumar, Ajay; Kwong, Cyril

    2015-03-01

    Human identification using fingerprint impressions has been widely studied and employed for more than 2000 years. Despite new advancements in 3D imaging technologies, a widely accepted representation of 3D fingerprint features and a matching methodology have yet to emerge. This paper investigates a 3D representation of the widely employed 2D minutiae features by recovering and incorporating (i) the minutiae height z and (ii) its 3D orientation φ, and illustrates an effective strategy for matching popular minutiae features extended into 3D space. One of the obstacles preventing emerging 3D fingerprint identification systems from replacing conventional 2D fingerprint systems is their bulk and high cost, which stem mainly from the use of structured lighting or multiple cameras. This paper attempts to address these key limitations of current 3D fingerprint technologies by developing a single-camera 3D fingerprint identification system. We develop a generalized 3D minutiae matching model and recover extended 3D fingerprint features from the reconstructed 3D fingerprints. The 2D fingerprint images acquired for the 3D reconstruction can themselves be employed to improve performance, as illustrated in the work detailed in this paper. This paper also attempts to answer one of the most fundamental questions on the availability of inherent discriminable information in 3D fingerprints. The experimental results are presented on a database of 240 clients' 3D fingerprints, which is made publicly available to further research efforts in this area, and illustrate the discriminant power of 3D minutiae representation and matching to achieve performance improvement.

  17. An evolutionary morphological approach for software development cost estimation.

    PubMed

    Araújo, Ricardo de A; Oliveira, Adriano L I; Soares, Sergio; Meira, Silvio

    2012-08-01

    In this work we present an evolutionary morphological approach to solve the software development cost estimation (SDCE) problem. The proposed approach consists of a hybrid artificial neuron based on the framework of mathematical morphology (MM), with algebraic foundations in complete lattice theory (CLT), referred to as the dilation-erosion perceptron (DEP). We also present an evolutionary learning process, called DEP(MGA), which uses a modified genetic algorithm (MGA) to design the DEP model; this is needed because the classical learning process for the DEP relies on gradient estimation of morphological operators, which are not differentiable in the usual way. Furthermore, an experimental analysis is conducted with the proposed model using five complex SDCE problems and three well-known performance metrics, demonstrating good performance of the DEP model in solving SDCE problems.
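    A minimal sketch of a dilation-erosion perceptron output under one common formulation in the morphological neural network literature (a convex combination of a gray-scale dilation and erosion of the input by additive structuring elements); this illustrates the idea only and is not the authors' implementation:

        import numpy as np

        def dep_output(x: np.ndarray, a: np.ndarray, b: np.ndarray, lam: float) -> float:
            """Dilation-erosion perceptron: convex mix of a morphological dilation and
            erosion of input x by structuring-element weights a and b (0 <= lam <= 1).
            This follows a common formulation from the literature, assumed here."""
            dilation = np.max(x + a)   # gray-scale dilation of x by a
            erosion = np.min(x + b)    # gray-scale erosion of x by b
            return lam * dilation + (1.0 - lam) * erosion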

  18. 48 CFR 2452.216-70 - Estimated cost, base fee and award fee.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Estimated cost, base fee... Provisions and Clauses 2452.216-70 Estimated cost, base fee and award fee. As prescribed in 2416.406(e)(1), insert the following clause in all cost-plus-award-fee contracts: Estimated Cost, Base Fee and Award...

  19. 48 CFR 2452.216-70 - Estimated cost, base fee and award fee.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Estimated cost, base fee... Provisions and Clauses 2452.216-70 Estimated cost, base fee and award fee. As prescribed in 2416.406(e)(1), insert the following clause in all cost-plus-award-fee contracts: Estimated Cost, Base Fee and Award...

  20. 48 CFR 1852.216-84 - Estimated cost and incentive fee.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Estimated cost and... Provisions and Clauses 1852.216-84 Estimated cost and incentive fee. As prescribed in 1816.406-70(d), insert the following clause: Estimated Cost and Incentive Fee (OCT 1996) The target cost of this contract...

  1. Why Don't They Just Give Us Money? Project Cost Estimating and Cost Reporting

    NASA Technical Reports Server (NTRS)

    Comstock, Douglas A.; Van Wychen, Kristin; Zimmerman, Mary Beth

    2015-01-01

    Successful projects require an integrated approach to managing cost, schedule, and risk. This is especially true for complex, multi-year projects involving multiple organizations. To explore solutions and leverage valuable lessons learned, NASA's Virtual Project Management Challenge will kick off a three-part series examining some of the challenges faced by project and program managers when it comes to managing these important elements. In this first session of the series, we will look at cost management, with an emphasis on the critical roles of cost estimating and cost reporting. By taking a proactive approach to both of these activities, project managers can better control life cycle costs, maintain stakeholder confidence, and protect other current and future projects in the organization's portfolio. Speakers will be Doug Comstock, Director of NASA's Cost Analysis Division, Kristin Van Wychen, Senior Analyst in the GAO Acquisition and Sourcing Management Team, and Mary Beth Zimmerman, Branch Chief for NASA's Portfolio Analysis Branch, Strategic Investments Division. Moderator Ramien Pierre is from NASA's Academy for Program/Project and Engineering Leadership (APPEL).

  2. IUS/TUG orbital operations and mission support study. Volume 5: Cost estimates

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The costing approach, methodology, and rationale utilized for generating cost data for composite IUS and space tug orbital operations are discussed. Summary cost estimates are given along with cost data initially derived for the IUS program and space tug program individually, and cost estimates for each work breakdown structure element.

  3. Bayesian parameter estimation of a k-ε model for accurate jet-in-crossflow simulations

    DOE PAGES

    Ray, Jaideep; Lefantzi, Sophia; Arunajatesan, Srinivasan; Dechant, Lawrence

    2016-05-31

    Reynolds-averaged Navier–Stokes models are not very accurate for high-Reynolds-number compressible jet-in-crossflow interactions. The inaccuracy arises from the use of inappropriate model parameters and model-form errors in the Reynolds-averaged Navier–Stokes model. In this study, the hypothesis is pursued that Reynolds-averaged Navier–Stokes predictions can be significantly improved by using parameters inferred from experimental measurements of a supersonic jet interacting with a transonic crossflow.

  4. Accurate state estimation for a hydraulic actuator via a SDRE nonlinear filter

    NASA Astrophysics Data System (ADS)

    Strano, Salvatore; Terzo, Mario

    2016-06-01

    State estimation in hydraulic actuators is a fundamental tool for the detection of faults and a valid alternative to the installation of sensors. Due to the hard nonlinearities that characterize hydraulic actuators, the performance of linear or linearization-based state estimation techniques is strongly limited. In order to overcome these limits, this paper focuses on an alternative nonlinear estimation method based on the State-Dependent Riccati Equation (SDRE). The technique is able to fully take into account the system nonlinearities and the measurement noise. A fifth-order nonlinear model is derived and employed for the synthesis of the estimator. Simulations and experimental tests have been conducted, and comparisons with the widely used Extended Kalman Filter (EKF) are illustrated. The results show the effectiveness of the SDRE-based technique for applications characterized by non-negligible nonlinearities such as dead zones and friction.

  5. The GFR and GFR decline cannot be accurately estimated in type 2 diabetics.

    PubMed

    Gaspari, Flavio; Ruggenenti, Piero; Porrini, Esteban; Motterlini, Nicola; Cannata, Antonio; Carrara, Fabiola; Jiménez Sosa, Alejandro; Cella, Claudia; Ferrari, Silvia; Stucchi, Nadia; Parvanova, Aneliya; Iliev, Ilian; Trevisan, Roberto; Bossi, Antonio; Zaletel, Jelka; Remuzzi, Giuseppe

    2013-07-01

    There are no adequate studies that have formally tested the performance of different estimating formulas in patients with type 2 diabetes both with and without overt nephropathy. Here we evaluated the agreement between baseline GFRs, GFR changes at month 6, and long-term GFR decline measured by iohexol plasma clearance or estimated by 15 creatinine-based formulas in 600 type 2 diabetics followed for a median of 4.0 years. Ninety patients were hyperfiltering. The number of those identified by estimation formulas ranged from 0 to 24; 58 were not identified by any formula. Baseline GFR was significantly underestimated, and a 6-month GFR reduction was missed in hyperfiltering patients. Long-term GFR decline was also underestimated by all formulas in the whole study group and in hyper-, normo-, and hypofiltering patients considered separately. Five formulas generated positive slopes in hyperfiltering patients. Baseline concordance correlation coefficients and total deviation indexes ranged from 32.1% to 92.6% and from 0.21 to 0.53, respectively. Concordance correlation coefficients between estimated and measured long-term GFR decline ranged from -0.21 to 0.35. The agreement between estimated and measured values was also poor within each subgroup considered separately. Thus, our study questions the use of any estimation formula to identify hyperfiltering patients and monitor renal disease progression and response to treatment in type 2 diabetics without overt nephropathy.
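    The agreement statistic used here, the concordance correlation coefficient, can be computed as in the following sketch (a generic Python implementation of Lin's coefficient, not the authors' code):

        import numpy as np

        def concordance_correlation(measured: np.ndarray, estimated: np.ndarray) -> float:
            """Lin's concordance correlation coefficient between measured and estimated GFR."""
            mx, my = measured.mean(), estimated.mean()
            vx, vy = measured.var(), estimated.var()          # population variances
            cov = ((measured - mx) * (estimated - my)).mean()
            return 2.0 * cov / (vx + vy + (mx - my) ** 2)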

  6. FAST TRACK COMMUNICATION Accurate estimate of α variation and isotope shift parameters in Na and Mg+

    NASA Astrophysics Data System (ADS)

    Sahoo, B. K.

    2010-12-01

    We present accurate calculations of fine-structure constant variation coefficients and isotope shifts in Na and Mg+ using the relativistic coupled-cluster method. In our approach, we are able to discover the roles of various correlation effects explicitly to all orders in these calculations. Most of the results, especially for the excited states, are reported for the first time. It is possible to ascertain suitable anchor and probe lines for the studies of possible variation in the fine-structure constant by using the above results in the considered systems.

  7. Precision Pointing Control to and Accurate Target Estimation of a Non-Cooperative Vehicle

    NASA Technical Reports Server (NTRS)

    VanEepoel, John; Thienel, Julie; Sanner, Robert M.

    2006-01-01

    In 2004, NASA began investigating a robotic servicing mission for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates in order to achieve capture by the proposed Hubble Robotic Vehicle (HRV), but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST. To generalize the situation, HST is the target vehicle and HRV is the chaser. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a control scheme. Non-cooperative in this context relates to the target vehicle no longer having the ability to maintain attitude control or transmit attitude knowledge.

  8. Some recommendations for an accurate estimation of Lanice conchilega density based on tube counts

    NASA Astrophysics Data System (ADS)

    van Hoey, Gert; Vincx, Magda; Degraer, Steven

    2006-12-01

    The tube building polychaete Lanice conchilega is a common and ecologically important species in intertidal and shallow subtidal sands. It builds a characteristic tube with ragged fringes and can retract rapidly into its tube to depths of more than 20 cm. Therefore, it is very difficult to sample L. conchilega individuals, especially with a Van Veen grab. Consequently, many studies have used tube counts as estimates of real densities. This study reports on some aspects to be considered when using tube counts as a density estimate of L. conchilega, based on intertidal and subtidal samples. Due to its accuracy and independence of sampling depth, the tube method is considered the prime method to estimate the density of L. conchilega. However, caution is needed when analyzing samples with fragile young individuals and samples from areas where temporary physical disturbance is likely to occur.

  9. Accurate State Estimation and Tracking of a Non-Cooperative Target Vehicle

    NASA Technical Reports Server (NTRS)

    Thienel, Julie K.; Sanner, Robert M.

    2006-01-01

    Autonomous space rendezvous scenarios require knowledge of the target vehicle state in order to safely dock with the chaser vehicle. Ideally, the target vehicle state information is derived from telemetered data, or with the use of known tracking points on the target vehicle. However, if the target vehicle is non-cooperative and does not have the ability to maintain attitude control, or transmit attitude knowledge, the docking becomes more challenging. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a tracking control scheme. The approach is tested with the robotic servicing mission concept for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates, but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST.

  10. A microbial clock provides an accurate estimate of the postmortem interval in a mouse model system

    PubMed Central

    Metcalf, Jessica L; Wegener Parfrey, Laura; Gonzalez, Antonio; Lauber, Christian L; Knights, Dan; Ackermann, Gail; Humphrey, Gregory C; Gebert, Matthew J; Van Treuren, Will; Berg-Lyons, Donna; Keepers, Kyle; Guo, Yan; Bullard, James; Fierer, Noah; Carter, David O; Knight, Rob

    2013-01-01

    Establishing the time since death is critical in every death investigation, yet existing techniques are susceptible to a range of errors and biases. For example, forensic entomology is widely used to assess the postmortem interval (PMI), but errors can range from days to months. Microbes may provide a novel method for estimating PMI that avoids many of these limitations. Here we show that postmortem microbial community changes are dramatic, measurable, and repeatable in a mouse model system, allowing PMI to be estimated within approximately 3 days over 48 days. Our results provide a detailed understanding of bacterial and microbial eukaryotic ecology within a decomposing corpse system and suggest that microbial community data can be developed into a forensic tool for estimating PMI. DOI: http://dx.doi.org/10.7554/eLife.01104.001 PMID:24137541

  11. Fast and accurate probability density estimation in large high dimensional astronomical datasets

    NASA Astrophysics Data System (ADS)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2015-01-01

    Astronomical surveys will generate measurements of hundreds of attributes (e.g. color, size, shape) on hundreds of millions of sources. Analyzing these large, high dimensional data sets will require efficient algorithms for data analysis. An example of this is probability density estimation, which is at the heart of many classification problems such as the separation of stars and quasars based on their colors. Popular density estimation techniques use binning or kernel density estimation. Kernel density estimation has a small memory footprint but often requires large computational resources. Binning has small computational requirements, but it is usually implemented with multi-dimensional arrays, which leads to memory requirements that scale exponentially with the number of dimensions. Hence neither technique scales well to large data sets in high dimensions. We present an alternative approach of binning implemented with hash tables (BASH tables). This approach uses the sparseness of data in the high dimensional space to ensure that the memory requirements are small. However, hashing requires some extra computation, so a priori it is not clear whether the reduction in memory requirements will come at the price of increased computational requirements. Through an implementation of BASH tables in C++ we show that the additional computational requirements of hashing are negligible. Hence this approach has small memory and computational requirements. We apply our density estimation technique to photometric selection of quasars using non-parametric Bayesian classification and show that the accuracy of the classification is the same as the accuracy of earlier approaches. Since the BASH table approach is one to three orders of magnitude faster than the earlier approaches, it may be useful in various other applications of density estimation in astrostatistics.
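    A minimal sketch of the idea of hash-table-backed binning for density estimation (a Python dictionary keyed by tuples of bin indices, so only occupied bins consume memory); the bin width and interface are illustrative assumptions, and the authors' C++ implementation will differ:

        from collections import defaultdict
        import numpy as np

        class HashBinnedDensity:
            """Histogram density estimator that stores only occupied bins in a dict."""
            def __init__(self, bin_width: float):
                self.h = bin_width
                self.counts = defaultdict(int)
                self.n = 0

            def fit(self, X: np.ndarray) -> "HashBinnedDensity":
                for row in X:                       # X: (n_samples, n_dims)
                    self.counts[tuple(np.floor(row / self.h).astype(int))] += 1
                self.n = len(X)
                return self

            def density(self, x: np.ndarray) -> float:
                key = tuple(np.floor(x / self.h).astype(int))
                d = x.shape[0]
                return self.counts.get(key, 0) / (self.n * self.h ** d)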

  12. Spectral estimation from laser scanner data for accurate color rendering of objects

    NASA Astrophysics Data System (ADS)

    Baribeau, Rejean

    2002-06-01

    Estimation methods are studied for the recovery of the spectral reflectance across the visible range from sensing at just three discrete laser wavelengths. Methods based on principal component analysis and on spline interpolation are judged on the basis of the CIE94 color differences for some reference data sets. These include the Macbeth color checker, the OSA-UCS color charts, some artist pigments, and a collection of miscellaneous surface colors. The optimal three sampling wavelengths are also investigated. It is found that color can be estimated with average accuracy ΔE94 = 2.3 when the optimal wavelengths 455 nm, 540 nm, and 610 nm are used.
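    A sketch of the principal-component approach: learn a low-dimensional basis from a training set of reflectance spectra, then recover a full spectrum from reflectances measured at a few wavelengths (e.g. the three laser lines) by least squares. Array shapes and function names are illustrative assumptions, not the author's code:

        import numpy as np

        def fit_basis(train_spectra: np.ndarray, n_components: int = 3):
            """PCA basis from training reflectance spectra (rows = samples, cols = wavelengths)."""
            mean = train_spectra.mean(axis=0)
            _, _, vt = np.linalg.svd(train_spectra - mean, full_matrices=False)
            return mean, vt[:n_components]                  # (n_wl,), (k, n_wl)

        def reconstruct(samples: np.ndarray, sample_idx, mean, basis):
            """Recover a full spectrum from reflectances at a few sampled wavelengths,
            given their indices into the wavelength grid of the training spectra."""
            A = basis[:, sample_idx].T                      # (n_samples, k)
            coeffs, *_ = np.linalg.lstsq(A, samples - mean[sample_idx], rcond=None)
            return mean + coeffs @ basis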

  13. Accurate radiocarbon age estimation using "early" measurements: a new approach to reconstructing the Paleolithic absolute chronology

    NASA Astrophysics Data System (ADS)

    Omori, Takayuki; Sano, Katsuhiro; Yoneda, Minoru

    2014-05-01

    This paper presents new correction approaches for "early" radiocarbon ages to reconstruct the Paleolithic absolute chronology. In order to discuss time-space distribution about the replacement of archaic humans, including Neanderthals in Europe, by the modern humans, a massive data, which covers a wide-area, would be needed. Today, some radiocarbon databases focused on the Paleolithic have been published and used for chronological studies. From a viewpoint of current analytical technology, however, the any database have unreliable results that make interpretation of radiocarbon dates difficult. Most of these unreliable ages had been published in the early days of radiocarbon analysis. In recent years, new analytical methods to determine highly-accurate dates have been developed. Ultrafiltration and ABOx-SC methods, as new sample pretreatments for bone and charcoal respectively, have attracted attention because they could remove imperceptible contaminates and derive reliable accurately ages. In order to evaluate the reliability of "early" data, we investigated the differences and variabilities of radiocarbon ages on different pretreatments, and attempted to develop correction functions for the assessment of the reliability. It can be expected that reliability of the corrected age is increased and the age applied to chronological research together with recent ages. Here, we introduce the methodological frameworks and archaeological applications.

  14. How Accurate and Robust Are the Phylogenetic Estimates of Austronesian Language Relationships?

    PubMed Central

    Greenhill, Simon J.; Drummond, Alexei J.; Gray, Russell D.

    2010-01-01

    We recently used computational phylogenetic methods on lexical data to test between two scenarios for the peopling of the Pacific. Our analyses of lexical data supported a pulse-pause scenario of Pacific settlement in which the Austronesian speakers originated in Taiwan around 5,200 years ago and rapidly spread through the Pacific in a series of expansion pulses and settlement pauses. We claimed that there was high congruence between traditional language subgroups and those observed in the language phylogenies, and that the estimated age of the Austronesian expansion at 5,200 years ago was consistent with the archaeological evidence. However, the congruence between the language phylogenies and the evidence from historical linguistics was not quantitatively assessed using tree comparison metrics. The robustness of the divergence time estimates to different calibration points was also not investigated exhaustively. Here we address these limitations by using a systematic tree comparison metric to calculate the similarity between the Bayesian phylogenetic trees and the subgroups proposed by historical linguistics, and by re-estimating the age of the Austronesian expansion using only the most robust calibrations. The results show that the Austronesian language phylogenies are highly congruent with the traditional subgroupings, and the date estimates are robust even when calculated using a restricted set of historical calibrations. PMID:20224774

  15. Space transfer vehicle concepts and requirements study. Volume 3, book 1: Program cost estimates

    NASA Technical Reports Server (NTRS)

    Peffley, Al F.

    1991-01-01

    The Space Transfer Vehicle (STV) Concepts and Requirements Study cost estimate and program planning analysis is presented. The cost estimating technique used to support STV system, subsystem, and component cost analysis is a mixture of parametric cost estimating and selective cost analogy approaches. The parametric cost analysis is aimed at developing cost-effective aerobrake, crew module, tank module, and lander designs using the parametric cost estimate data. This is accomplished by using cost as a design parameter in an iterative process with conceptual design input information. The parametric estimating approach segregates costs by major program life cycle phase (development, production, integration, and launch support). These phases are further broken out into major hardware subsystems, software functions, and tasks according to the STV preliminary program work breakdown structure (WBS). The WBS is defined to a low enough level of detail by the study team to highlight STV system cost drivers. This level of cost visibility provided the basis for cost sensitivity analysis against various design approaches aimed at achieving a cost-effective design. The cost approach, methodology, and rationale are described. A chronological record of the interim review material relating to cost analysis is included along with a brief summary of the study contract tasks accomplished during that period of review and the key conclusions or observations identified that relate to STV program cost estimates. The STV life cycle costs are estimated on the proprietary parametric cost model (PCM) with inputs organized by a project WBS. Preliminary life cycle schedules are also included.

  16. Accurate, fast and cost-effective diagnostic test for monosomy 1p36 using real-time quantitative PCR.

    PubMed

    Cunha, Pricila da Silva; Pena, Heloisa B; D'Angelo, Carla Sustek; Koiffmann, Celia P; Rosenfeld, Jill A; Shaffer, Lisa G; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5-0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs.
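    For illustration only, a standard relative-quantification (2^-ddCt) calculation of the kind commonly used to call copy-number loss from qPCR Ct values; the reference gene, calibrator sample, and the 0.7 calling threshold are assumptions, not details taken from this study:

        # Hedged sketch: relative copy number via 2^-ddCt; a ratio near 0.5 suggests one
        # copy (heterozygous deletion), near 1.0 suggests two copies. Threshold is assumed.
        def relative_copy_number(ct_target, ct_reference, ct_target_ctrl, ct_reference_ctrl):
            d_ct_sample = ct_target - ct_reference
            d_ct_control = ct_target_ctrl - ct_reference_ctrl
            return 2.0 ** -(d_ct_sample - d_ct_control)

        def call_deletion(rcn: float, threshold: float = 0.7) -> bool:
            return rcn < threshold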

  17. Accurate, Fast and Cost-Effective Diagnostic Test for Monosomy 1p36 Using Real-Time Quantitative PCR

    PubMed Central

    Cunha, Pricila da Silva; Pena, Heloisa B.; D'Angelo, Carla Sustek; Koiffmann, Celia P.; Rosenfeld, Jill A.; Shaffer, Lisa G.; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5–0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs. PMID:24839341

  18. Accurate estimation of influenza epidemics using Google search data via ARGO.

    PubMed

    Yang, Shihao; Santillana, Mauricio; Kou, S C

    2015-11-24

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search-based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people's online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions.
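    A rough sketch of an ARGO-style regression (autoregressive lags of the influenza target combined with current search-query volumes in an L1-regularized linear model); the data arrays, lag depth, and penalty are hypothetical, and this is not the authors' implementation:

        import numpy as np
        from sklearn.linear_model import Lasso

        def build_design(y: np.ndarray, queries: np.ndarray, n_lags: int = 52):
            """Rows = weeks; features = past n_lags of y plus current query volumes."""
            rows, targets = [], []
            for t in range(n_lags, len(y)):
                rows.append(np.concatenate([y[t - n_lags:t], queries[t]]))
                targets.append(y[t])
            return np.array(rows), np.array(targets)

        # y: weekly %ILI history; queries: (weeks, terms) search volumes (hypothetical inputs)
        # X, target = build_design(y, queries)
        # model = Lasso(alpha=0.1).fit(X[:-1], target[:-1])   # refit each week in practice
        # current_week_estimate = model.predict(X[-1:])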

  19. Accurate estimation of influenza epidemics using Google search data via ARGO.

    PubMed

    Yang, Shihao; Santillana, Mauricio; Kou, S C

    2015-11-24

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search-based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people's online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions. PMID:26553980

  20. Accurate estimation of influenza epidemics using Google search data via ARGO

    PubMed Central

    Yang, Shihao; Santillana, Mauricio; Kou, S. C.

    2015-01-01

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search–based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people’s online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions. PMID:26553980

  1. Estimating the additional cost of disability: beyond budget standards.

    PubMed

    Wilkinson-Meyers, Laura; Brown, Paul; McNeill, Robert; Patston, Philip; Dylan, Sacha; Baker, Ronelle

    2010-11-01

    Disabled people have long advocated for sufficient resources to live a life with the same rights and responsibilities as non-disabled people. Identifying the unique resource needs of disabled people relative to the population as a whole and understanding the source of these needs is critical for determining adequate levels of income support and for prioritising service provision. Previous attempts to identify the resources and costs associated with disability have tended to rely on surveys of current resource use. These approaches have been criticised as being inadequate for identifying the resources that would be required to achieve a similar standard of living to non-disabled people and for not using methods that are acceptable to and appropriate for the disabled community. The challenge is therefore to develop a methodology that accurately identifies these unique resource needs, uses an approach that is acceptable to the disabled community, enables all disabled people to participate, and distinguishes 'needs' from 'wants.' This paper describes and presents the rationale for a mixed methodology for identifying and prioritising the resource needs of disabled people. The project is a partnership effort between disabled researchers, a disability support organisation and academic researchers in New Zealand. The method integrates a social model of disability framework and an economic cost model using a budget standards approach to identify additional support, equipment, travel and time required to live an 'ordinary life' in the community. A survey is then used to validate the findings and identify information gaps and resource priorities of the community. Both the theoretical basis of the approach and the practical challenges of designing and implementing a methodology that is acceptable to the disabled community, service providers and funding agencies are discussed. PMID:20933315

  2. Raman spectroscopy for highly accurate estimation of the age of refrigerated porcine muscle

    NASA Astrophysics Data System (ADS)

    Timinis, Constantinos; Pitris, Costas

    2016-03-01

    The high water content of meat, combined with all the nutrients it contains, makes it vulnerable to spoilage at all stages of production and storage, even when refrigerated at 5 °C. A non-destructive and in situ tool for meat sample testing, which could provide an accurate indication of the storage time of meat, would be very useful for the control of meat quality as well as for consumer safety. The proposed solution is based on Raman spectroscopy, which is non-invasive and can be applied in situ. For the purposes of this project, 42 meat samples from 14 animals were obtained and three Raman spectra per sample were collected every two days for two weeks. The spectra were subsequently processed and the sample age was calculated using a set of linear differential equations. In addition, the samples were classified into categories corresponding to the age in 2-day steps (i.e., 0, 2, 4, 6, 8, 10, 12 or 14 days old), using linear discriminant analysis and cross-validation. Contrary to other studies, where the samples were simply grouped into two categories (higher or lower quality, suitable or unsuitable for human consumption, etc.), in this study the age was predicted with a mean error of ~ 1 day (20%) or classified, in 2-day steps, with 100% accuracy. Although Raman spectroscopy has been used in the past for the analysis of meat samples, the proposed methodology predicts the sample age far more accurately than any previous report in the literature.
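    A minimal sketch of the classification step described above (linear discriminant analysis with leave-one-out cross-validation over ages in 2-day steps); the input arrays and preprocessing are assumed:

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        # spectra: (n_samples, n_wavenumbers) preprocessed Raman spectra (hypothetical input)
        # age_days: labels in 2-day steps, e.g. 0, 2, ..., 14
        def classify_age(spectra: np.ndarray, age_days: np.ndarray) -> float:
            """Leave-one-out cross-validated accuracy of LDA age classification."""
            scores = cross_val_score(LinearDiscriminantAnalysis(), spectra, age_days,
                                     cv=LeaveOneOut())
            return scores.mean()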

  3. FASTSim: A Model to Estimate Vehicle Efficiency, Cost and Performance

    SciTech Connect

    Brooker, A.; Gonder, J.; Wang, L.; Wood, E.; Lopp, S.; Ramroth, L.

    2015-05-04

    The Future Automotive Systems Technology Simulator (FASTSim) is a high-level advanced vehicle powertrain systems analysis tool supported by the U.S. Department of Energy’s Vehicle Technologies Office. FASTSim provides a quick and simple approach to compare powertrains and estimate the impact of technology improvements on light- and heavy-duty vehicle efficiency, performance, cost, and battery life over batches of real-world drive cycles. FASTSim’s calculation framework and balance among detail, accuracy, and speed enable it to simulate thousands of driven miles in minutes. The key components and vehicle outputs have been validated by comparing the model outputs to test data for many different vehicles to provide confidence in the results. A graphical user interface makes FASTSim easy and efficient to use. FASTSim is freely available for download from the National Renewable Energy Laboratory’s website (see www.nrel.gov/fastsim).

  4. Are satellite based rainfall estimates accurate enough for crop modelling under Sahelian climate?

    NASA Astrophysics Data System (ADS)

    Ramarohetra, J.; Sultan, B.

    2012-04-01

    Agriculture is considered the most climate-dependent human activity. In West Africa, and especially in the sudano-sahelian zone, rain-fed agriculture - which represents 93% of cultivated areas and is the means of support of 70% of the active population - is highly vulnerable to precipitation variability. To better understand and anticipate climate impacts on agriculture, crop models - which estimate crop yield from climate information (e.g. rainfall, temperature, insolation, humidity) - have been developed. These crop models are useful (i) in ex ante analyses to quantify the impact of implementing different strategies - crop management (e.g. choice of varieties, sowing date), crop insurance or medium-range weather forecasts - on yields, (ii) for early warning systems, and (iii) for assessing future food security. Yet the successful application of these models depends on the accuracy of their climatic drivers. In the sudano-sahelian zone, the quality of precipitation estimates is therefore a key factor in understanding and anticipating climate impacts on agriculture via crop modelling and yield estimation. Different kinds of precipitation estimates can be used. Ground measurements provide long time series but suffer from insufficient network density, a large proportion of missing values, delays in reporting, and limited availability. An answer to these shortcomings may lie in the field of remote sensing, which provides satellite-based precipitation estimates. However, satellite-based rainfall estimates (SRFE) are not a direct measurement but rather an estimation of precipitation. Used as input to crop models, they determine the performance of the simulated yields; hence SRFE require validation. The SARRAH crop model is used to model three different varieties of pearl millet (HKP, MTDO, Souna3) in a square degree centred on 13.5°N and 2.5°E, in Niger. Eight satellite-based rainfall daily products (PERSIANN, CMORPH, TRMM 3b42-RT, GSMAP MKV+, GPCP, TRMM 3b42v6, RFEv2 and

  5. Techniques for accurate estimation of net discharge in a tidal channel

    USGS Publications Warehouse

    Simpson, Michael R.; Bland, Roger

    1999-01-01

    An ultrasonic velocity meter discharge-measurement site in a tidally affected region of the Sacramento-San Joaquin rivers was used to study the accuracy of the index velocity calibration procedure. Calibration data consisting of ultrasonic velocity meter index velocity and concurrent acoustic Doppler discharge measurement data were collected during three time periods. The relative magnitude of equipment errors, acoustic Doppler discharge measurement errors, and calibration errors were evaluated. Calibration error was the most significant source of error in estimating net discharge. Using a comprehensive calibration method, net discharge estimates developed from the three sets of calibration data differed by less than an average of 4 cubic meters per second. Typical maximum flow rates during the data-collection period averaged 750 cubic meters per second.
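    The index velocity method can be sketched as follows: calibrate mean cross-section velocity (ADCP discharge divided by channel area) against the ultrasonic meter's index velocity, then compute discharge as area times the predicted mean velocity. A minimal Python illustration with a simple linear calibration (the stage-area relation is an assumed input, and the comprehensive calibration used in the study is more involved):

        import numpy as np

        def calibrate_index_velocity(index_v: np.ndarray, adcp_q: np.ndarray, area: np.ndarray):
            """Fit mean velocity (ADCP discharge / area) as a linear function of index velocity."""
            mean_v = adcp_q / area
            slope, intercept = np.polyfit(index_v, mean_v, 1)
            return slope, intercept

        def discharge(index_v: float, area: float, slope: float, intercept: float) -> float:
            """Net discharge = cross-sectional area x calibrated mean velocity."""
            return area * (slope * index_v + intercept)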

  6. Plant DNA Barcodes Can Accurately Estimate Species Richness in Poorly Known Floras

    PubMed Central

    Costion, Craig; Ford, Andrew; Cross, Hugh; Crayn, Darren; Harrington, Mark; Lowe, Andrew

    2011-01-01

    Background Widespread uptake of DNA barcoding technology for vascular plants has been slow due to the relatively poor resolution of species discrimination (∼70%) and low sequencing and amplification success of one of the two official barcoding loci, matK. Studies to date have mostly focused on finding a solution to these intrinsic limitations of the markers, rather than posing questions that can maximize the utility of DNA barcodes for plants with the current technology. Methodology/Principal Findings Here we test the ability of plant DNA barcodes using the two official barcoding loci, rbcLa and matK, plus an alternative barcoding locus, trnH-psbA, to estimate the species diversity of trees in a tropical rainforest plot. Species discrimination accuracy was similar to findings from previous studies but species richness estimation accuracy proved higher, up to 89%. All combinations which included the trnH-psbA locus performed better at both species discrimination and richness estimation than matK, which showed little enhanced species discriminatory power when concatenated with rbcLa. The utility of the trnH-psbA locus is limited however, by the occurrence of intraspecific variation observed in some angiosperm families to occur as an inversion that obscures the monophyly of species. Conclusions/Significance We demonstrate for the first time, using a case study, the potential of plant DNA barcodes for the rapid estimation of species richness in taxonomically poorly known areas or cryptic populations revealing a powerful new tool for rapid biodiversity assessment. The combination of the rbcLa and trnH-psbA loci performed better for this purpose than any two-locus combination that included matK. We show that although DNA barcodes fail to discriminate all species of plants, new perspectives and methods on biodiversity value and quantification may overshadow some of these shortcomings by applying barcode data in new ways. PMID:22096501

  7. Improved rapid magnitude estimation for a community-based, low-cost MEMS accelerometer network

    USGS Publications Warehouse

    Chung, Angela I.; Cochran, Elizabeth S.; Kaiser, Anna E.; Christensen, Carl M.; Yildirim, Battalgazi; Lawrence, Jesse F.

    2015-01-01

    Immediately following the Mw 7.2 Darfield, New Zealand, earthquake, over 180 Quake‐Catcher Network (QCN) low‐cost micro‐electro‐mechanical systems accelerometers were deployed in the Canterbury region. Using data recorded by this dense network from 2010 to 2013, we significantly improved the QCN rapid magnitude estimation relationship. The previous scaling relationship (Lawrence et al., 2014) did not accurately estimate the magnitudes of nearby (<35  km) events. The new scaling relationship estimates earthquake magnitudes within 1 magnitude unit of the GNS Science GeoNet earthquake catalog magnitudes for 99% of the events tested, within 0.5 magnitude units for 90% of the events, and within 0.25 magnitude units for 57% of the events. These magnitudes are reliably estimated within 3 s of the initial trigger recorded on at least seven stations. In this report, we present the methods used to calculate a new scaling relationship and demonstrate the accuracy of the revised magnitude estimates using a program that is able to retrospectively estimate event magnitudes using archived data.
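    A hedged sketch of fitting a rapid magnitude scaling relationship of the generic form M = a·log10(PGA) + b·log10(R) + c to catalog magnitudes; the functional form and coefficients used by QCN may differ from this assumed example:

        import numpy as np

        def fit_magnitude_scaling(pga: np.ndarray, dist_km: np.ndarray, catalog_mag: np.ndarray):
            """Least-squares fit of M = a*log10(PGA) + b*log10(R) + c (generic form, assumed)."""
            A = np.column_stack([np.log10(pga), np.log10(dist_km), np.ones_like(pga)])
            coeffs, *_ = np.linalg.lstsq(A, catalog_mag, rcond=None)
            return coeffs          # a, b, c

        def estimate_magnitude(pga: float, dist_km: float, coeffs) -> float:
            a, b, c = coeffs
            return a * np.log10(pga) + b * np.log10(dist_km) + c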

  8. Accurate distortion estimation and optimal bandwidth allocation for scalable H.264 video transmission over MIMO systems.

    PubMed

    Jubran, Mohammad K; Bansal, Manu; Kondi, Lisimachos P; Grover, Rohan

    2009-01-01

    In this paper, we propose an optimal strategy for the transmission of scalable video over packet-based multiple-input multiple-output (MIMO) systems. The scalable extension of H.264/AVC that provides a combined temporal, quality and spatial scalability is used. For given channel conditions, we develop a method for the estimation of the distortion of the received video and propose different error concealment schemes. We show the accuracy of our distortion estimation algorithm in comparison with simulated wireless video transmission with packet errors. In the proposed MIMO system, we employ orthogonal space-time block codes (O-STBC) that guarantee independent transmission of different symbols within the block code. In the proposed constrained bandwidth allocation framework, we use the estimated end-to-end decoder distortion to optimally select the application layer parameters, i.e., quantization parameter (QP) and group of pictures (GOP) size, and physical layer parameters, i.e., rate-compatible turbo (RCPT) code rate and symbol constellation. Results show the substantial performance gain by using different symbol constellations across the scalable layers as compared to a fixed constellation.

  9. Compact and accurate linear and nonlinear autoregressive moving average model parameter estimation using laguerre functions.

    PubMed

    Chon, K H; Cohen, R J; Holstein-Rathlou, N H

    1997-01-01

    A linear and nonlinear autoregressive moving average (ARMA) identification algorithm is developed for modeling time series data. The algorithm uses Laguerre expansion of kernels (LEK) to estimate Volterra-Wiener kernels. However, instead of estimating linear and nonlinear system dynamics via moving average models, as is the case for the Volterra-Wiener analysis, we propose an ARMA model-based approach. The proposed algorithm is essentially the same as LEK, but this algorithm is extended to include past values of the output as well. Thus, all of the advantages associated with using the Laguerre function remain with our algorithm; but, by extending the algorithm to the linear and nonlinear ARMA model, a significant reduction in the number of Laguerre functions can be made, compared with the Volterra-Wiener approach. This translates into a more compact system representation and makes the physiological interpretation of higher order kernels easier. Furthermore, simulation results show better performance of the proposed approach in estimating the system dynamics than LEK in certain cases, and it remains effective in the presence of significant additive measurement noise. PMID:9236985
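    A minimal sketch of the core LEK idea: generate discrete Laguerre basis functions by filtering an impulse, then fit the first-order (linear) kernel by least squares on the filtered inputs. The pole value is an assumption, and the nonlinear and autoregressive terms of the full model are omitted for brevity:

        import numpy as np
        from scipy.signal import lfilter

        def laguerre_basis(n_taps: int, order: int, alpha: float = 0.5) -> np.ndarray:
            """Discrete Laguerre functions l_0..l_{order-1} of length n_taps (pole alpha assumed)."""
            impulse = np.zeros(n_taps); impulse[0] = 1.0
            x = lfilter([np.sqrt(1 - alpha**2)], [1.0, -alpha], impulse)   # l_0
            basis = [x]
            for _ in range(1, order):
                x = lfilter([-alpha, 1.0], [1.0, -alpha], x)               # all-pass cascade
                basis.append(x)
            return np.array(basis)                                          # (order, n_taps)

        def fit_linear_kernel(u: np.ndarray, y: np.ndarray, basis: np.ndarray) -> np.ndarray:
            """Least-squares expansion coefficients of the first-order (linear) kernel."""
            V = np.array([np.convolve(u, b)[:len(u)] for b in basis]).T     # filtered inputs
            c, *_ = np.linalg.lstsq(V, y, rcond=None)
            return c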

  10. Evaluation of the sample needed to accurately estimate outcome-based measurements of dairy welfare on farm.

    PubMed

    Endres, M I; Lobeck-Luchterhand, K M; Espejo, L A; Tucker, C B

    2014-01-01

    Dairy welfare assessment programs are becoming more common on US farms. Outcome-based measurements, such as locomotion, hock lesion, hygiene, and body condition scores (BCS), are included in these assessments. The objective of the current study was to investigate the proportion of cows in the pen or subsamples of pens on a farm needed to provide an accurate estimate of the previously mentioned measurements. In experiment 1, we evaluated cows in 52 high pens (50 farms) for lameness using a 1- to 5-scale locomotion scoring system (1 = normal and 5 = severely lame; 24.4 and 6% of animals were scored ≥ 3 or ≥ 4, respectively). Cows were also given a BCS using a 1- to 5-scale, where 1 = emaciated and 5 = obese; cows were rarely thin (BCS ≤ 2; 0.10% of cows) or fat (BCS ≥ 4; 0.11% of cows). Hygiene scores were assessed on a 1- to 5-scale with 1 = clean and 5 = severely dirty; 54.9% of cows had a hygiene score ≥ 3. Hock injuries were classified as 1 = no lesion, 2 = mild lesion, and 3 = severe lesion; 10.6% of cows had a score of 3. Subsets of data were created with 10 replicates of random sampling that represented 100, 90, 80, 70, 60, 50, 40, 30, 20, 15, 10, 5, and 3% of the cows measured/pen. In experiment 2, we scored the same outcome measures on all cows in lactating pens from 12 farms and evaluated using pen subsamples: high; high and fresh; high, fresh, and hospital; and high, low, and hospital. For both experiments, the association between the estimates derived from all subsamples and entire pen (experiment 1) or herd (experiment 2) prevalence was evaluated using linear regression. To be considered a good estimate, 3 criteria must be met: R(2)>0.9, slope = 1, and intercept = 0. In experiment 1, on average, recording 15% of the pen represented the percentage of clinically lame cows (score ≥ 3), whereas 30% needed to be measured to estimate severe lameness (score ≥ 4). Only 15% of the pen was needed to estimate the percentage of the herd with a hygiene
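    A sketch of the subsampling evaluation described above: draw random subsets of each pen at a given fraction, compute the prevalence of a condition, and regress full-pen prevalence on the subsample estimates, checking the stated criteria (R^2 > 0.9, slope near 1, intercept near 0). Data structures are illustrative assumptions:

        import numpy as np

        def subsample_prevalence(pen_scores, frac: float, threshold: int,
                                 n_reps: int = 10, rng=np.random.default_rng(0)):
            """pen_scores: list of arrays of per-cow scores, one array per pen."""
            full, sub = [], []
            for scores in pen_scores:
                full_prev = np.mean(scores >= threshold)
                for _ in range(n_reps):
                    k = max(1, int(round(frac * len(scores))))
                    pick = rng.choice(scores, size=k, replace=False)
                    full.append(full_prev)
                    sub.append(np.mean(pick >= threshold))
            slope, intercept = np.polyfit(sub, full, 1)
            r2 = np.corrcoef(sub, full)[0, 1] ** 2
            return slope, intercept, r2   # good estimate: R^2 > 0.9, slope ~ 1, intercept ~ 0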

  11. Global cost estimates of reducing carbon emissions through avoided deforestation

    PubMed Central

    Kindermann, Georg; Obersteiner, Michael; Sohngen, Brent; Sathaye, Jayant; Andrasko, Kenneth; Rametsteiner, Ewald; Schlamadinger, Bernhard; Wunder, Sven; Beach, Robert

    2008-01-01

    Tropical deforestation is estimated to cause about one-quarter of anthropogenic carbon emissions, as well as loss of biodiversity and other environmental services. United Nations Framework Convention for Climate Change talks are now considering mechanisms for avoiding deforestation (AD), but the economic potential of AD has yet to be addressed. We use three economic models of global land use and management to analyze the potential contribution of AD activities to reduced greenhouse gas emissions. AD activities are found to be a competitive, low-cost abatement option. A program providing a 10% reduction in deforestation from 2005 to 2030 could provide 0.3–0.6 Gt (1 Gt = 1 × 10^15 g) CO2·yr−1 in emission reductions and would require $0.4 billion to $1.7 billion·yr−1 for 30 years. A 50% reduction in deforestation from 2005 to 2030 could provide 1.5–2.7 Gt CO2·yr−1 in emission reductions and would require $17.2 billion to $28.0 billion·yr−1. Finally, some caveats to the analysis that could increase costs of AD programs are described. PMID:18650377
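    A back-of-envelope check of the implied average abatement cost, obtained simply by dividing the quoted dollar ranges by the quoted emission-reduction ranges (the underlying models are far more detailed than this):

        # Simple division of the quoted ranges; no discounting or model detail assumed.
        cost_10pct = (0.4e9, 1.7e9)      # $/yr for a 10% reduction in deforestation
        abate_10pct = (0.3e9, 0.6e9)     # tCO2/yr (0.3-0.6 Gt CO2/yr)
        cost_50pct = (17.2e9, 28.0e9)    # $/yr for a 50% reduction
        abate_50pct = (1.5e9, 2.7e9)     # tCO2/yr (1.5-2.7 Gt CO2/yr)
        print(cost_10pct[0] / abate_10pct[1], cost_10pct[1] / abate_10pct[0])  # ~$0.7-5.7 per tCO2
        print(cost_50pct[0] / abate_50pct[1], cost_50pct[1] / abate_50pct[0])  # ~$6.4-18.7 per tCO2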

  12. Global cost estimates of reducing carbon emissions through avoided deforestation.

    PubMed

    Kindermann, Georg; Obersteiner, Michael; Sohngen, Brent; Sathaye, Jayant; Andrasko, Kenneth; Rametsteiner, Ewald; Schlamadinger, Bernhard; Wunder, Sven; Beach, Robert

    2008-07-29

    Tropical deforestation is estimated to cause about one-quarter of anthropogenic carbon emissions, as well as loss of biodiversity and other environmental services. United Nations Framework Convention for Climate Change talks are now considering mechanisms for avoiding deforestation (AD), but the economic potential of AD has yet to be addressed. We use three economic models of global land use and management to analyze the potential contribution of AD activities to reduced greenhouse gas emissions. AD activities are found to be a competitive, low-cost abatement option. A program providing a 10% reduction in deforestation from 2005 to 2030 could provide 0.3-0.6 Gt (1 Gt = 1 × 10^15 g) CO2 per year in emission reductions and would require $0.4 billion to $1.7 billion per year for 30 years. A 50% reduction in deforestation from 2005 to 2030 could provide 1.5-2.7 Gt CO2 per year in emission reductions and would require $17.2 billion to $28.0 billion per year. Finally, some caveats to the analysis that could increase costs of AD programs are described.

  13. Cost Estimates of Electricity from a TPV Residential Heating System

    NASA Astrophysics Data System (ADS)

    Palfinger, Günther; Bitnar, Bernd; Durisch, Wilhelm; Mayor, Jean-Claude; Grützmacher, Detlev; Gobrecht, Jens

    2003-01-01

    A thermophotovoltaic (TPV) system was built using a 12 to 20 kWth methane burner which should be integrated into a conventional residential heating system. The TPV system is cylindrical in shape and consists of a selective Yb2O3 emitter, a quartz glass tube to prevent the exhaust gases from heating the cells and a 0.2 m2 monocrystalline silicon solar cell module which is water cooled. The maximum system efficiency of 1.0 % was obtained at a thermal input power of 12 kWth. The electrical power suffices to run a residential heating system in the full power range (12 to 20 kWth) independently of the grid. The end user costs of the TPV components - emitter, glass tube, photocells and cell cooling circuit - were estimated considering 4 different TPV scenarios. The existing technique was compared with an improved system currently under development, which consists of a flexible photocell module that can be glued into the boiler housing and with systems with improved system efficiency (1.5 to 5 %) and geometry. Prices of the electricity from 2.5 to 22 EURcents/kWhel (excl. gas of about 3.5 EURcents/kWh), which corresponds to system costs of 340 to 3000 EUR/kWel,peak, were calculated. The price of electricity by TPV was compared with that of fuel cells and gas engines. While fuel cells are still expensive, gas engines have the disadvantage of maintenance, noise and bulkiness. TPV, in contrast, is a cost efficient alternative to produce heat and electricity, particularly in small peripheral units.

  14. 48 CFR 1852.216-84 - Estimated cost and incentive fee.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... the following clause: Estimated Cost and Incentive Fee (OCT 1996) The target cost of this contract is $___. The target fee of this contract is $___. The total target cost and target fee as contemplated by...

  15. 48 CFR 1852.216-84 - Estimated cost and incentive fee.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... the following clause: Estimated Cost and Incentive Fee (OCT 1996) The target cost of this contract is $___. The target fee of this contract is $___. The total target cost and target fee as contemplated by...

  16. 48 CFR 1852.216-84 - Estimated cost and incentive fee.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... the following clause: Estimated Cost and Incentive Fee (OCT 1996) The target cost of this contract is $___. The target fee of this contract is $___. The total target cost and target fee as contemplated by...

  17. 48 CFR 1852.216-84 - Estimated cost and incentive fee.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... the following clause: Estimated Cost and Incentive Fee (OCT 1996) The target cost of this contract is $___. The target fee of this contract is $___. The total target cost and target fee as contemplated by...

  18. Taking the Evolutionary Road to Developing an In-House Cost Estimate

    NASA Technical Reports Server (NTRS)

    Jacintho, David; Esker, Lind; Herman, Frank; Lavaque, Rodolfo; Regardie, Myma

    2011-01-01

    This slide presentation reviews the process and some of the problems and challenges of developing an In-House Cost Estimate (IHCE). Using the Space Network Ground Segment Sustainment (SGSS) project as an example, the presentation reviews the phases for developing a cost estimate within the project to estimate government and contractor project costs to support a budget request.

  19. 40 CFR 144.62 - Cost estimate for plugging and abandonment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Cost estimate for plugging and... Waste Injection Wells § 144.62 Cost estimate for plugging and abandonment. (a) The owner or operator must prepare a written estimate, in current dollars, of the cost of plugging the injection well...

  20. Accurate Estimation of Airborne Ultrasonic Time-of-Flight for Overlapping Echoes

    PubMed Central

    Sarabia, Esther G.; Llata, Jose R.; Robla, Sandra; Torre-Ferrero, Carlos; Oria, Juan P.

    2013-01-01

    In this work, an analysis of the transmission of ultrasonic signals generated by piezoelectric sensors for air applications is presented. Based on this analysis, an ultrasonic response model is obtained for its application to the recognition of objects and structured environments for navigation by autonomous mobile robots. This model enables the analysis of the ultrasonic response that is generated using a pair of sensors in transmitter-receiver configuration using the pulse-echo technique. This is very interesting for recognizing surfaces that simultaneously generate a multiple echo response. This model takes into account the effect of the radiation pattern, the resonant frequency of the sensor, the number of cycles of the excitation pulse, the dynamics of the sensor and the attenuation with distance in the medium. This model has been developed, programmed and verified through a battery of experimental tests. Using this model a new procedure for obtaining accurate time of flight is proposed. This new method is compared with traditional ones, such as threshold or correlation, to highlight its advantages and drawbacks. Finally the advantages of this method are demonstrated for calculating multiple times of flight when the echo is formed by several overlapping echoes. PMID:24284774

  1. Accurate estimation of airborne ultrasonic time-of-flight for overlapping echoes.

    PubMed

    Sarabia, Esther G; Llata, Jose R; Robla, Sandra; Torre-Ferrero, Carlos; Oria, Juan P

    2013-01-01

    In this work, an analysis of the transmission of ultrasonic signals generated by piezoelectric sensors for air applications is presented. Based on this analysis, an ultrasonic response model is obtained for its application to the recognition of objects and structured environments for navigation by autonomous mobile robots. This model enables the analysis of the ultrasonic response that is generated using a pair of sensors in transmitter-receiver configuration using the pulse-echo technique. This is very interesting for recognizing surfaces that simultaneously generate a multiple echo response. This model takes into account the effect of the radiation pattern, the resonant frequency of the sensor, the number of cycles of the excitation pulse, the dynamics of the sensor and the attenuation with distance in the medium. This model has been developed, programmed and verified through a battery of experimental tests. Using this model a new procedure for obtaining accurate time of flight is proposed. This new method is compared with traditional ones, such as threshold or correlation, to highlight its advantages and drawbacks. Finally the advantages of this method are demonstrated for calculating multiple times of flight when the echo is formed by several overlapping echoes. PMID:24284774
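
    As a rough illustration of the correlation-based time-of-flight estimation that these records compare against, the sketch below matched-filters a synthetic received signal containing two echoes against the transmitted burst and picks the two largest, well-separated correlation peaks. The sampling rate, burst shape and echo delays are assumptions; the authors' own response-model fit, which handles much more closely overlapping echoes, is not reproduced here.

    # Cross-correlation (matched-filter) time-of-flight estimation for two echoes.
    # All signal parameters below are synthetic assumptions.
    import numpy as np

    fs = 1.0e6                      # 1 MHz sampling rate (assumed)
    f0 = 40.0e3                     # 40 kHz transducer resonance (assumed)
    t = np.arange(0, 5e-3, 1 / fs)

    def burst(delay, cycles=8, amp=1.0):
        """Gaussian-windowed sine burst arriving at `delay` seconds."""
        dur = cycles / f0
        tt = t - delay
        env = np.exp(-0.5 * ((tt - dur / 2) / (dur / 4)) ** 2)
        return amp * env * np.sin(2 * np.pi * f0 * tt) * (tt >= 0)

    template = burst(0.0)
    echo = burst(1.00e-3) + 0.6 * burst(1.30e-3)          # two echoes (assumed delays)
    echo += 0.02 * np.random.default_rng(0).normal(size=t.size)

    corr = np.correlate(echo, template, mode="full")[t.size - 1:]
    i1 = int(np.argmax(corr))                              # strongest echo
    masked = corr.copy()
    lo, hi = max(0, i1 - int(0.15e-3 * fs)), i1 + int(0.15e-3 * fs)
    masked[lo:hi] = 0.0                                    # suppress its neighbourhood
    i2 = int(np.argmax(masked))                            # second echo
    print("Estimated TOFs (ms):", sorted(round(i / fs * 1e3, 3) for i in (i1, i2)))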

  2. Handbook for quick cost estimates. A method for developing quick approximate estimates of costs for generic actions for nuclear power plants

    SciTech Connect

    Ball, J.R.

    1986-04-01

    This document is a supplement to a "Handbook for Cost Estimating" (NUREG/CR-3971) and provides specific guidance for developing "quick" approximate estimates of the cost of implementing generic regulatory requirements for nuclear power plants. A method is presented for relating the known construction costs for new nuclear power plants (as contained in the Energy Economic Data Base) to the cost of performing similar work, on a back-fit basis, at existing plants. Cost factors are presented to account for variations in such important cost areas as construction labor productivity, engineering and quality assurance, replacement energy, reworking of existing features, and regional variations in the cost of materials and labor. Other cost categories addressed in this handbook include those for changes in plant operating personnel and plant documents, licensee costs, NRC costs, and costs for other government agencies. Data sheets, worksheets, and appropriate cost algorithms are included to guide the user through preparation of rough estimates. A sample estimate is prepared using the method and the estimating tools provided.

  3. Innovation in the pharmaceutical industry: New estimates of R&D costs.

    PubMed

    DiMasi, Joseph A; Grabowski, Henry G; Hansen, Ronald W

    2016-05-01

    The research and development costs of 106 randomly selected new drugs were obtained from a survey of 10 pharmaceutical firms. These data were used to estimate the average pre-tax cost of new drug and biologics development. The costs of compounds abandoned during testing were linked to the costs of compounds that obtained marketing approval. The estimated average out-of-pocket cost per approved new compound is $1395 million (2013 dollars). Capitalizing out-of-pocket costs to the point of marketing approval at a real discount rate of 10.5% yields a total pre-approval cost estimate of $2558 million (2013 dollars). When compared to the results of the previous study in this series, total capitalized costs were shown to have increased at an annual rate of 8.5% above general price inflation. Adding an estimate of post-approval R&D costs increases the cost estimate to $2870 million (2013 dollars). PMID:26928437

  4. Innovation in the pharmaceutical industry: New estimates of R&D costs.

    PubMed

    DiMasi, Joseph A; Grabowski, Henry G; Hansen, Ronald W

    2016-05-01

    The research and development costs of 106 randomly selected new drugs were obtained from a survey of 10 pharmaceutical firms. These data were used to estimate the average pre-tax cost of new drug and biologics development. The costs of compounds abandoned during testing were linked to the costs of compounds that obtained marketing approval. The estimated average out-of-pocket cost per approved new compound is $1395 million (2013 dollars). Capitalizing out-of-pocket costs to the point of marketing approval at a real discount rate of 10.5% yields a total pre-approval cost estimate of $2558 million (2013 dollars). When compared to the results of the previous study in this series, total capitalized costs were shown to have increased at an annual rate of 8.5% above general price inflation. Adding an estimate of post-approval R&D costs increases the cost estimate to $2870 million (2013 dollars).
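
    The mechanics of capitalizing pre-approval R&D outlays to the point of marketing approval, as in the two records above, can be sketched in a few lines. The 10-year uniform spending profile below is a placeholder; the study uses phase-specific durations and expenditure timing, so this sketch will not reproduce its $2,558 million figure.

    # Illustrative capitalization of out-of-pocket R&D spending to approval.
    # The uniform 10-year spending profile is an assumption, not the study's data.

    out_of_pocket_musd = 1395.0   # reported mean out-of-pocket cost (2013 $M)
    real_discount_rate = 0.105    # reported real cost of capital
    years_to_approval = 10        # assumed development time (placeholder)

    annual_spend = out_of_pocket_musd / years_to_approval
    capitalized = sum(
        annual_spend * (1 + real_discount_rate) ** (years_to_approval - year - 0.5)
        for year in range(years_to_approval)   # mid-year spending convention
    )
    print(f"Capitalized pre-approval cost: ${capitalized:,.0f}M (2013 dollars)")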

  5. Voxel-based registration of simulated and real patient CBCT data for accurate dental implant pose estimation

    NASA Astrophysics Data System (ADS)

    Moreira, António H. J.; Queirós, Sandro; Morais, Pedro; Rodrigues, Nuno F.; Correia, André Ricardo; Fernandes, Valter; Pinho, A. C. M.; Fonseca, Jaime C.; Vilaça, João. L.

    2015-03-01

    The success of a dental implant-supported prosthesis is directly linked to the accuracy obtained during the implant's pose estimation (position and orientation). Although traditional impression techniques and recent digital acquisition methods are acceptably accurate, a simultaneously fast, accurate and operator-independent methodology is still lacking. Hereto, an image-based framework is proposed to estimate the patient-specific implant's pose using cone-beam computed tomography (CBCT) and prior knowledge of the implanted model. The pose estimation is accomplished in a three-step approach: (1) a region-of-interest is extracted from the CBCT data using 2 operator-defined points at the implant's main axis; (2) a simulated CBCT volume of the known implanted model is generated through Feldkamp-Davis-Kress reconstruction and coarsely aligned to the defined axis; and (3) a voxel-based rigid registration is performed to optimally align both patient and simulated CBCT data, extracting the implant's pose from the optimal transformation. Three experiments were performed to evaluate the framework: (1) an in silico study using 48 implants distributed through 12 tridimensional synthetic mandibular models; (2) an in vitro study using an artificial mandible with 2 dental implants acquired with an i-CAT system; and (3) two clinical case studies. The results showed positional errors of 67+/-34 μm and 108 μm, and angular misfits of 0.15+/-0.08° and 1.4°, for experiments 1 and 2, respectively. Moreover, in experiment 3, visual assessment of the clinical data showed a coherent alignment of the reference implant. Overall, a novel image-based framework for implants' pose estimation from CBCT data was proposed, showing accurate results in agreement with dental prosthesis modelling requirements.

  6. IDC reengineering Phase 2 & 3 US industry standard cost estimate summary

    SciTech Connect

    Harris, James M.; Huelskamp, Robert M.

    2015-01-01

    Sandia National Laboratories has prepared a ROM cost estimate for budgetary planning for the IDC Reengineering Phase 2 & 3 effort, using a commercial software cost estimation tool calibrated to US industry performance parameters. This is not a cost estimate for Sandia to perform the project. This report provides the ROM cost estimate and describes the methodology, assumptions, and cost model details used to create the ROM cost estimate. ROM Cost Estimate Disclaimer Contained herein is a Rough Order of Magnitude (ROM) cost estimate that has been provided to enable initial planning for this proposed project. This ROM cost estimate is submitted to facilitate informal discussions in relation to this project and is NOT intended to commit Sandia National Laboratories (Sandia) or its resources. Furthermore, as a Federally Funded Research and Development Center (FFRDC), Sandia must be compliant with the Anti-Deficiency Act and operate on a full-cost recovery basis. Therefore, while Sandia, in conjunction with the Sponsor, will use best judgment to execute work and to address the highest risks and most important issues in order to effectively manage within cost constraints, this ROM estimate and any subsequent approved cost estimates are on a 'full-cost recovery' basis. Thus, work can neither commence nor continue unless adequate funding has been accepted and certified by DOE.

  7. Cost estimation: An expert-opinion approach. [cost analysis of research projects using the Delphi method (forecasting)

    NASA Technical Reports Server (NTRS)

    Buffalano, C.; Fogleman, S.; Gielecki, M.

    1976-01-01

    A methodology is outlined which can be used to estimate the costs of research and development projects. The approach uses the Delphi technique, a method developed by the Rand Corporation for systematically eliciting and evaluating group judgments in an objective manner. The use of the Delphi technique allows for the integration of expert opinion into the cost-estimating process in a consistent and rigorous fashion. This approach can also signal potential cost-problem areas, a result that can be useful in planning additional cost analysis or in estimating contingency funds. A Monte Carlo approach is also examined.
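
    A minimal sketch of how Delphi-style expert ranges can feed a Monte Carlo roll-up of project cost, in the spirit of the approach described above. The work-package names, expert ranges and the use of triangular distributions are illustrative assumptions, not values from the report.

    # Combine expert (low, most likely, high) judgments with a Monte Carlo roll-up.
    import random
    import statistics

    # Consensus (low, most likely, high) cost per work package, in $K (assumed).
    work_packages = {
        "instrument design": (120, 180, 300),
        "prototype fabrication": (200, 260, 450),
        "test and integration": (80, 110, 220),
    }

    def one_total_cost(rng):
        return sum(rng.triangular(low, high, mode)
                   for (low, mode, high) in work_packages.values())

    rng = random.Random(42)
    totals = sorted(one_total_cost(rng) for _ in range(20_000))
    mean_cost = statistics.mean(totals)
    p80 = totals[int(0.80 * len(totals))]

    print(f"Mean total cost: ${mean_cost:,.0f}K")
    print(f"80th percentile: ${p80:,.0f}K")
    print(f"Contingency (80th percentile minus mean): ${p80 - mean_cost:,.0f}K")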

  8. An Energy-Efficient Strategy for Accurate Distance Estimation in Wireless Sensor Networks

    PubMed Central

    Tarrío, Paula; Bernardos, Ana M.; Casar, José R.

    2012-01-01

    In line with recent research efforts made to conceive energy saving protocols and algorithms and power sensitive network architectures, in this paper we propose a transmission strategy to minimize the energy consumption in a sensor network when using a localization technique based on the measurement of the strength (RSS) or the time of arrival (TOA) of the received signal. In particular, we find the transmission power and the packet transmission rate that jointly minimize the total consumed energy, while ensuring at the same time a desired accuracy in the RSS or TOA measurements. We also propose some corrections to these theoretical results to take into account the effects of shadowing and packet loss in the propagation channel. The proposed strategy is shown to be effective in realistic scenarios providing energy savings with respect to other transmission strategies, and also guaranteeing a given accuracy in the distance estimations, which will serve to guarantee a desired accuracy in the localization result. PMID:23202218
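
    The distance measurement underlying the RSS-based variant of this technique is the log-distance path-loss model. The sketch below inverts that model and shows how averaging more packets (at a higher transmission-energy cost) tightens the distance estimate; the path-loss exponent, reference power and shadowing level are assumed values, not the paper's.

    # RSS-based distance estimation with a log-distance path-loss model.
    import math
    import random

    P0_DBM = -40.0      # received power at the 1 m reference distance (assumed)
    N_EXP = 2.5         # path-loss exponent (assumed indoor value)
    SIGMA_DB = 2.0      # shadowing standard deviation in dB (assumed)

    def rss_at(distance_m, rng):
        """Simulated RSS measurement (dBm) at a given distance."""
        return (P0_DBM - 10.0 * N_EXP * math.log10(distance_m)
                + rng.gauss(0.0, SIGMA_DB))

    def distance_from_rss(rss_dbm):
        """Invert the path-loss model to get a distance estimate (m)."""
        return 10.0 ** ((P0_DBM - rss_dbm) / (10.0 * N_EXP))

    rng = random.Random(1)
    true_d = 8.0
    # More packets cost more energy but reduce the spread of the estimate,
    # which is the energy/accuracy trade-off the paper optimizes.
    for n_packets in (1, 5, 20):
        mean_rss = sum(rss_at(true_d, rng) for _ in range(n_packets)) / n_packets
        print(f"{n_packets:2d} packets -> estimated distance "
              f"{distance_from_rss(mean_rss):.2f} m (true {true_d} m)")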

  9. Accurate automatic estimation of total intracranial volume: a nuisance variable with less nuisance.

    PubMed

    Malone, Ian B; Leung, Kelvin K; Clegg, Shona; Barnes, Josephine; Whitwell, Jennifer L; Ashburner, John; Fox, Nick C; Ridgway, Gerard R

    2015-01-01

    Total intracranial volume (TIV/ICV) is an important covariate for volumetric analyses of the brain and brain regions, especially in the study of neurodegenerative diseases, where it can provide a proxy of maximum pre-morbid brain volume. The gold-standard method is manual delineation of brain scans, but this requires careful work by trained operators. We evaluated Statistical Parametric Mapping 12 (SPM12) automated segmentation for TIV measurement in place of manual segmentation and also compared it with SPM8 and FreeSurfer 5.3.0. For T1-weighted MRI acquired from 288 participants in a multi-centre clinical trial in Alzheimer's disease we find a high correlation between SPM12 TIV and manual TIV (R(2)=0.940, 95% Confidence Interval (0.924, 0.953)), with a small mean difference (SPM12 40.4±35.4ml lower than manual, amounting to 2.8% of the overall mean TIV in the study). The correlation with manual measurements (the key aspect when using TIV as a covariate) for SPM12 was significantly higher (p<0.001) than for either SPM8 (R(2)=0.577 CI (0.500, 0.644)) or FreeSurfer (R(2)=0.801 CI (0.744, 0.843)). These results suggest that SPM12 TIV estimates are an acceptable substitute for labour-intensive manual estimates even in the challenging context of multiple centres and the presence of neurodegenerative pathology. We also briefly discuss some aspects of the statistical modelling approaches to adjust for TIV. PMID:25255942

  10. Modelling the Constraints of Spatial Environment in Fauna Movement Simulations: Comparison of a Boundaries Accurate Function and a Cost Function

    NASA Astrophysics Data System (ADS)

    Jolivet, L.; Cohen, M.; Ruas, A.

    2015-08-01

    Landscape influences fauna movement at different levels, from habitat selection to choices of movement direction. Our goal is to provide a development frame in order to test simulation functions for animal movement. We describe our approach for such simulations and we compare two types of functions to calculate trajectories. To do so, we first modelled the role of landscape elements to differentiate between elements that facilitate movements and those that hinder them. Different influences are identified depending on landscape elements and on animal species. Knowledge was gathered from ecologists, literature and observation datasets. Second, we analysed the description of animal movement recorded with GPS at fine scale, corresponding to high temporal frequency and good location accuracy. Analysing this type of data provides information on the relation between landscape features and movements. We implemented an agent-based simulation approach to calculate potential trajectories constrained by the spatial environment and individual behaviour. We tested two functions that consider space differently: one function takes into account the geometry and the types of landscape elements, and one cost function sums up the spatial surroundings of an individual. Results highlight the fact that the cost function exaggerates the distances travelled by an individual and simplifies movement patterns. The geometry-accurate function represents a good bottom-up approach for discovering interesting areas or obstacles for movements.

  11. [Research on maize multispectral image accurate segmentation and chlorophyll index estimation].

    PubMed

    Wu, Qian; Sun, Hong; Li, Min-zan; Song, Yuan-yuan; Zhang, Yan-e

    2015-01-01

    In order to rapidly acquire maize growing information in the field, a non-destructive method of maize chlorophyll content index measurement was conducted based on multi-spectral imaging and image processing technology. The experiment was conducted at Yangling in Shaanxi province of China, and the crop was Zheng-dan 958 planted in an experimental field of about 1 000 m × 600 m. Firstly, a 2-CCD multi-spectral image monitoring system was used to acquire the canopy images. The system was based on a dichroic prism, allowing precise separation of the visible (Blue (B), Green (G), Red (R): 400-700 nm) and near-infrared (NIR, 760-1 000 nm) bands. The multispectral images were output as RGB and NIR images via the system, which was fixed vertically at a distance of 2 m above the ground with an angular field of 50°. The SPAD index of each sample was measured synchronously to provide the chlorophyll content index. Secondly, after image smoothing using an adaptive smoothing filter, the NIR maize image was selected to segment the maize leaves from the background, because the gray-level histogram showed a large difference between plant and soil background. The NIR image segmentation was conducted in preliminary and accurate segmentation steps: (1) The results of the OTSU image segmentation method and the variable threshold algorithm were compared. It was revealed that the latter was the better one for corn plant and weed segmentation. As a result, the variable threshold algorithm based on local statistics was selected for the preliminary image segmentation. Morphological dilation and erosion were used to optimize the segmented image. (2) The region labeling algorithm was used to segment corn plants from the soil and weed background with an accuracy of 95.59%. Then, the multi-spectral image of the maize canopy was accurately segmented in the R, G and B bands separately. Thirdly, image parameters were abstracted based on the segmented visible and NIR images. The average gray
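
    For reference, the sketch below implements global OTSU thresholding on a gray-level histogram, one of the two preliminary segmentation options compared above (the study prefers a locally adaptive variable threshold, which is not reproduced here). The synthetic NIR-like image is an assumption used only to exercise the code.

    # Global Otsu thresholding: pick the gray level maximizing between-class variance.
    import numpy as np

    def otsu_threshold(gray):
        """Return the gray level that maximizes between-class variance."""
        hist, _ = np.histogram(gray, bins=256, range=(0, 256))
        prob = hist / hist.sum()
        best_t, best_var = 0, -1.0
        for t in range(1, 256):
            w0, w1 = prob[:t].sum(), prob[t:].sum()
            if w0 == 0 or w1 == 0:
                continue
            mu0 = (np.arange(t) * prob[:t]).sum() / w0
            mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
            var_between = w0 * w1 * (mu0 - mu1) ** 2
            if var_between > best_var:
                best_t, best_var = t, var_between
        return best_t

    # Synthetic NIR-like image: bright "plant" pixels on a darker "soil" background.
    rng = np.random.default_rng(0)
    soil = rng.normal(60, 15, size=(200, 200))
    plant = rng.normal(180, 20, size=(200, 200))
    mask = rng.random((200, 200)) < 0.3          # 30% of pixels are plant (assumed)
    img = np.clip(np.where(mask, plant, soil), 0, 255)

    thr = otsu_threshold(img)
    segmented = img > thr
    print(f"Otsu threshold: {thr}, segmented plant fraction: {segmented.mean():.2f}")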

  12. Use of Multiple Data Sources to Estimate the Economic Cost of Dengue Illness in Malaysia

    PubMed Central

    Shepard, Donald S.; Undurraga, Eduardo A.; Lees, Rosemary Susan; Halasa, Yara; Lum, Lucy Chai See; Ng, Chiu Wan

    2012-01-01

    Dengue represents a substantial burden in many tropical and sub-tropical regions of the world. We estimated the economic burden of dengue illness in Malaysia. Information about economic burden is needed for setting health policy priorities, but accurate estimation is difficult because of incomplete data. We overcame this limitation by merging multiple data sources to refine our estimates, including an extensive literature review, discussion with experts, review of data from health and surveillance systems, and implementation of a Delphi process. Because Malaysia has a passive surveillance system, the number of dengue cases is under-reported. Using an adjusted estimate of total dengue cases, we estimated an economic burden of dengue illness of US$56 million (Malaysian Ringgit MYR196 million) per year, which is approximately US$2.03 (Malaysian Ringgit 7.14) per capita. The overall economic burden of dengue would be even higher if we included costs associated with dengue prevention and control, dengue surveillance, and long-term sequelae of dengue. PMID:23033404

  13. Accurate Estimation of Protein Folding and Unfolding Times: Beyond Markov State Models.

    PubMed

    Suárez, Ernesto; Adelman, Joshua L; Zuckerman, Daniel M

    2016-08-01

    Because standard molecular dynamics (MD) simulations are unable to access time scales of interest in complex biomolecular systems, it is common to "stitch together" information from multiple shorter trajectories using approximate Markov state model (MSM) analysis. However, MSMs may require significant tuning and can yield biased results. Here, by analyzing some of the longest protein MD data sets available (>100 μs per protein), we show that estimators constructed based on exact non-Markovian (NM) principles can yield significantly improved mean first-passage times (MFPTs) for protein folding and unfolding. In some cases, MSM bias of more than an order of magnitude can be corrected when identical trajectory data are reanalyzed by non-Markovian approaches. The NM analysis includes "history" information, higher order time correlations compared to MSMs, that is available in every MD trajectory. The NM strategy is insensitive to fine details of the states used and works well when a fine time-discretization (i.e., small "lag time") is used. PMID:27340835

  14. Accurate estimation of normal incidence absorption coefficients with confidence intervals using a scanning laser Doppler vibrometer

    NASA Astrophysics Data System (ADS)

    Vuye, Cedric; Vanlanduit, Steve; Guillaume, Patrick

    2009-06-01

    When using optical measurements of the sound fields inside a glass tube, near the material under test, to estimate the reflection and absorption coefficients, not only these acoustical parameters but also confidence intervals can be determined. The sound fields are visualized using a scanning laser Doppler vibrometer (SLDV). In this paper the influence of different test signals on the quality of the results, obtained with this technique, is examined. The amount of data gathered during one measurement scan makes a thorough statistical analysis possible leading to the knowledge of confidence intervals. The use of a multi-sine, constructed on the resonance frequencies of the test tube, shows to be a very good alternative for the traditional periodic chirp. This signal offers the ability to obtain data for multiple frequencies in one measurement, without the danger of a low signal-to-noise ratio. The variability analysis in this paper clearly shows the advantages of the proposed multi-sine compared to the periodic chirp. The measurement procedure and the statistical analysis are validated by measuring the reflection ratio at a closed end and comparing the results with the theoretical value. Results of the testing of two building materials (an acoustic ceiling tile and linoleum) are presented and compared to supplier data.

  15. Energetic costs of mange in wolves estimated from infrared thermography

    USGS Publications Warehouse

    Cross, Paul C.; Almberg, Emily S.; Haase, Catherine G; Hudson, Peter J.; Maloney, Shane K; Metz, Matthew C; Munn, Adam J; Nugent, Paul; Putzeys, Olivier; Stahler, Daniel R.; Stewart, Anya C; Smith, Doug W.

    2016-01-01

    Parasites, by definition, extract energy from their hosts and thus affect trophic and food web dynamics even when the parasite may have limited effects on host population size. We studied the energetic costs of mange (Sarcoptes scabiei) in wolves (Canis lupus) using thermal cameras to estimate heat losses associated with compromised insulation during the winter. We combined field data from known, naturally infected wolves with a data set on captive wolves with shaved patches of fur as a positive control to simulate mange-induced hair loss. We predict that during the winter in Montana, more severe mange infection increases heat loss by around 5.2 to 12 MJ per night (1240 to 2850 kcal, or a 65% to 78% increase) for small and large wolves, respectively, accounting for wind effects. Maintaining body temperature would therefore require a significant proportion of a healthy wolf's total daily energy demands (18-22 MJ/day). We also predict how these thermal costs may increase in colder climates by comparing our predictions for Bozeman, Montana to those for a place with lower ambient temperatures (Fairbanks, Alaska). Contrary to our expectations, the 14°C differential between these regions was not as important as the potential differences in wind speed. These large increases in energetic demands can be mitigated by either increasing consumption rates or decreasing other energy demands. Data from GPS-collared wolves indicated that healthy wolves move, on average, 17 km per day, which was reduced by 1.5, 1.8 and 6.5 km for light, medium, and severe hair loss. In addition, the wolf with the most hair loss was less active at night and more active during the day, which is the converse of the movement patterns of healthy wolves. At the individual level, mange infections create significant energy demands and altered behavioral patterns; this may have cascading effects on prey consumption rates, food web dynamics, predator-prey interactions, and scavenger communities.
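
    A unit-conversion check on the figures quoted above: the predicted nightly heat losses expressed in kilocalories and, additionally, as a share of the stated 18-22 MJ/day energy budget. The share is our own arithmetic on the quoted numbers, not a statistic reported by the study.

    # Convert the quoted nightly heat losses from MJ to kcal and compare them to
    # the stated 18-22 MJ/day demand of a healthy wolf.
    KCAL_PER_MJ = 239.0  # 1 MJ = 1e6 J / 4184 J per kcal ≈ 239 kcal

    for label, loss_mj in (("small wolf, severe mange", 5.2),
                           ("large wolf, severe mange", 12.0)):
        kcal = loss_mj * KCAL_PER_MJ
        share_low, share_high = loss_mj / 22.0, loss_mj / 18.0
        print(f"{label}: {kcal:.0f} kcal/night, "
              f"{100 * share_low:.0f}-{100 * share_high:.0f}% of an 18-22 MJ/day budget")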

  16. Developing a Cost Model and Methodology to Estimate Capital Costs for Thermal Energy Storage

    SciTech Connect

    Glatzmaier, G.

    2011-12-01

    This report provides an update on the previous cost model for thermal energy storage (TES) systems. The update allows NREL to estimate the costs of such systems that are compatible with the higher operating temperatures associated with advanced power cycles. The goal of the Department of Energy (DOE) Solar Energy Technology Program is to develop solar technologies that can make a significant contribution to the United States domestic energy supply. The recent DOE SunShot Initiative sets a very aggressive cost goal to reach a Levelized Cost of Energy (LCOE) of 6 cents/kWh by 2020 with no incentives or credits for all solar-to-electricity technologies.1 As this goal is reached, the share of utility power generation that is provided by renewable energy sources is expected to increase dramatically. Because Concentrating Solar Power (CSP) is currently the only renewable technology that is capable of integrating cost-effective energy storage, it is positioned to play a key role in providing renewable, dispatchable power to utilities as the share of power generation from renewable sources increases. Because of this role, future CSP plants will likely have as much as 15 hours of Thermal Energy Storage (TES) included in their design and operation. As such, the cost and performance of the TES system is critical to meeting the SunShot goal for solar technologies. The cost of electricity from a CSP plant depends strongly on its overall efficiency, which is a product of two components - the collection and conversion efficiencies. The collection efficiency determines the portion of incident solar energy that is captured as high-temperature thermal energy. The conversion efficiency determines the portion of thermal energy that is converted to electricity. The operating temperature at which the overall efficiency reaches its maximum depends on many factors, including material properties of the CSP plant components. Increasing the operating temperature of the power generation

  17. Breckinridge Project, initial effort. Report IX. Operating cost estimate

    SciTech Connect

    1982-01-01

    Operating costs are normally broken into three major categories: variable costs including raw materials, annual catalyst and chemicals, and utilities; semi-variable costs including labor and labor related cost; and fixed or capital related charges. The raw materials and utilities costs are proportional to production; however, a small component of utilities cost is independent of production. The catalyst and chemicals costs are also normally proportional to production. Semi-variable costs include direct labor, maintenance labor, labor supervision, contract maintenance, maintenance materials, payroll overheads, operation supplies, and general overhead and administration. Fixed costs include local taxes, insurance and the time value of the capital investment. The latter charge often includes the investor's anticipated return on investment. In determining operating costs for financial analysis, return on investment (ROI) and depreciation are not treated as cash operating costs. These costs are developed in the financial analysis; the annual operating cost determined here omits ROI and depreciation. Project Annual Operating Costs are summarized in Table 1. Detailed supporting information for the cost elements listed below is included in the following sections: Electrical, catalyst and chemicals, and salaries and wages.

  18. Wind effect on PV module temperature: Analysis of different techniques for an accurate estimation.

    NASA Astrophysics Data System (ADS)

    Schwingshackl, Clemens; Petitta, Marcello; Ernst Wagner, Jochen; Belluardo, Giorgio; Moser, David; Castelli, Mariapina; Zebisch, Marc; Tetzlaff, Anke

    2013-04-01

    temperature estimation using meteorological parameters. References: [1] Skoplaki, E. et al., 2008: A simple correlation for the operating temperature of photovoltaic modules of arbitrary mounting, Solar Energy Materials & Solar Cells 92, 1393-1402 [2] Skoplaki, E. et al., 2008: Operating temperature of photovoltaic modules: A survey of pertinent correlations, Renewable Energy 34, 23-29 [3] Koehl, M. et al., 2011: Modeling of the nominal operating cell temperature based on outdoor weathering, Solar Energy Materials & Solar Cells 95, 1638-1646 [4] Mattei, M. et al., 2005: Calculation of the polycrystalline PV module temperature using a simple method of energy balance, Renewable Energy 31, 553-567 [5] Kurtz, S. et al.: Evaluation of high-temperature exposure of rack-mounted photovoltaic modules

  19. Zero-Cost Estimation of Zero-Point Energies.

    PubMed

    Császár, Attila G; Furtenbacher, Tibor

    2015-10-01

    An additive, linear, atom-type-based (ATB) scheme is developed allowing no-cost estimation of zero-point vibrational energies (ZPVE) of neutral, closed-shell molecules in their ground electronic states. The atom types employed correspond to those defined within the MM2 molecular mechanics force field approach. The reference training set of 156 molecules covers chained and branched alkanes, alkenes, cycloalkanes and cycloalkenes, alkynes, alcohols, aldehydes, carboxylic acids, amines, amides, ethers, esters, ketones, benzene derivatives, heterocycles, nucleobases, all the natural amino acids, some dipeptides and sugars, as well as further simple molecules and ones containing several structural units, including several vitamins. A weighted linear least-squares fit of atom-type-based ZPVE increments results in recommended values for the following atoms, with the number of atom types defined in parentheses: H(8), D(1), B(1), C(6), N(7), O(3), F(1), Si(1), P(2), S(3), and Cl(1). The average accuracy of the ATB ZPVEs is considerably better than 1 kcal mol(-1), that is, better than chemical accuracy. The proposed ATB scheme could be extended to many more atoms and atom types, following a careful validation procedure; deviation from the MM2 atom types seems to be necessary, especially for third-row elements. PMID:26398318
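
    A minimal sketch of how such an additive atom-type-based estimate is evaluated once increments are in hand. The increment values and atom-type labels below are made-up placeholders chosen only to illustrate the mechanics; the paper fits its own increments to the MM2 atom types.

    # Additive ZPVE estimate: sum per-atom-type increments over a molecule.
    ZPVE_INCREMENT_KCAL = {      # hypothetical kcal/mol per atom type (placeholders)
        "C_sp3": 2.5,
        "H_on_C": 7.0,
        "O_sp3": 1.5,
        "H_on_O": 7.3,
    }

    def estimate_zpve(atom_type_counts):
        """Sum per-atom-type increments to approximate the molecular ZPVE."""
        return sum(ZPVE_INCREMENT_KCAL[t] * n for t, n in atom_type_counts.items())

    # Methanol (CH3OH) broken down into the hypothetical atom types above.
    methanol = {"C_sp3": 1, "H_on_C": 3, "O_sp3": 1, "H_on_O": 1}
    print(f"Estimated ZPVE(CH3OH) ≈ {estimate_zpve(methanol):.1f} kcal/mol")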

  20. Accurate Intermolecular Interactions at Dramatically Reduced Cost and a Many-Body Energy Decomposition Scheme for XPol+SAPT

    NASA Astrophysics Data System (ADS)

    Lao, Ka Un; Herbert, John M.

    2013-06-01

    An efficient, monomer-based electronic structure method is introduced for computing non-covalent interactions in molecular and ionic clusters. It builds upon our "explicit polarization" (XPol) approach with pairwise-additive symmetry-adapted perturbation theory (SAPT), using the Kohn-Sham (KS) version of SAPT, but replaces the problematic and expensive sum-over-states dispersion terms with empirical potentials. This modification reduces the scaling from O(N^5) to O(N^3) and also facilitates the use of Kohn-Sham density functional theory (KS-DFT) as a low-cost means to capture intramolecular electron correlation. Accurate binding energies are obtained for benchmark databases of dimer binding energies, and potential energy curves are also captured accurately, for a variety of challenging systems. As compared to traditional DFT-SAPT or SAPT(DFT) methods, it removes the limitation to dimers and extends SAPT-based methodology to many-body systems. For many-body systems such as water clusters and halide-water cluster anions, the new method is superior to established density-functional methods for non-covalent interactions. We suggest that, as in DFT-SAPT or SAPT(DFT), using different asymptotic corrections for different monomers is necessary to obtain good binding energies in general, especially for hydrogen-bonded complexes. We also introduce a decomposition scheme for the interaction energy that extends traditional SAPT energy decomposition analysis to systems containing more than two monomers, and we find that the various energy components (electrostatic, exchange, induction, and dispersion) are in very good agreement with high-level SAPT benchmarks for dimers. For (H2O)6, the many-body contribution to the interaction energy agrees well with that obtained from traditional Kitaura-Morokuma energy decomposition analysis.

  1. Quaternion-based unscented Kalman filter for accurate indoor heading estimation using wearable multi-sensor system.

    PubMed

    Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng

    2015-01-01

    Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, as one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, the accuracy of the gyroscope becomes unreliable over time. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, including one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the multi-sensor system fixed on the waist of the pedestrian and on the quadrotor UAV, respectively, compared to the reference path. PMID:25961384

  2. Quaternion-Based Unscented Kalman Filter for Accurate Indoor Heading Estimation Using Wearable Multi-Sensor System

    PubMed Central

    Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng

    2015-01-01

    Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, as one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, the accuracy of the gyroscope becomes unreliable over time. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, including one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the multi-sensor system fixed on the waist of the pedestrian and on the quadrotor UAV, respectively, compared to the reference path. PMID:25961384

  3. Quaternion-based unscented Kalman filter for accurate indoor heading estimation using wearable multi-sensor system.

    PubMed

    Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng

    2015-05-07

    Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, as one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, the accuracy of the gyroscope becomes unreliable over time. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, including one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the multi-sensor system fixed on the waist of the pedestrian and on the quadrotor UAV, respectively, compared to the reference path.
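
    The three records above fuse gyroscope and magnetometer data because each source fails on its own: magnetometer headings are corrupted by indoor magnetic disturbances, while gyro-integrated headings drift with sensor bias. The sketch below illustrates that trade-off with a simulated 1-D yaw example and a simple complementary filter; it is not the papers' quaternion UKF, and all sensor parameters are assumed values.

    # Gyro-only heading drifts with bias; blending in noisy magnetometer headings
    # with a complementary filter keeps the estimate bounded.
    import random

    DT = 0.01                 # 100 Hz sampling (assumed)
    GYRO_BIAS = 0.5           # deg/s bias (assumed)
    GYRO_NOISE = 0.2          # deg/s white noise (assumed)
    MAG_NOISE = 8.0           # deg of magnetic disturbance (assumed)

    rng = random.Random(0)
    true_heading = 45.0       # pedestrian walking a constant heading (deg)

    gyro_heading = true_heading
    fused_heading = true_heading
    alpha = 0.02              # complementary-filter weight on the magnetometer

    for _ in range(int(60.0 / DT)):                         # one minute of data
        gyro_rate = 0.0 + GYRO_BIAS + rng.gauss(0.0, GYRO_NOISE)
        mag_heading = true_heading + rng.gauss(0.0, MAG_NOISE)

        gyro_heading += gyro_rate * DT                      # dead reckoning only
        fused_heading += gyro_rate * DT                     # propagate with gyro
        # Correct with the magnetometer (angle wrap-around ignored in this example).
        fused_heading += alpha * (mag_heading - fused_heading)

    print(f"true heading:        {true_heading:.1f} deg")
    print(f"gyro-only heading:   {gyro_heading:.1f} deg (drifted)")
    print(f"complementary fused: {fused_heading:.1f} deg")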

  4. A new set of atomic radii for accurate estimation of solvation free energy by Poisson-Boltzmann solvent model.

    PubMed

    Yamagishi, Junya; Okimoto, Noriaki; Morimoto, Gentaro; Taiji, Makoto

    2014-11-01

    The Poisson-Boltzmann implicit solvent (PB) is widely used to estimate the solvation free energies of biomolecules in molecular simulations. An optimized set of atomic radii (PB radii) is an important parameter for PB calculations, which determines the distribution of dielectric constants around the solute. We here present new PB radii for the AMBER protein force field to accurately reproduce the solvation free energies obtained from explicit solvent simulations. The presented PB radii were optimized using results from explicit solvent simulations of the large systems. In addition, we discriminated PB radii for N- and C-terminal residues from those for nonterminal residues. The performances using our PB radii showed high accuracy for the estimation of solvation free energies at the level of the molecular fragment. The obtained PB radii are effective for the detailed analysis of the solvation effects of biomolecules.

  5. Different approaches to estimating transition costs in the electric- utility industry

    SciTech Connect

    Baxter, L.W.

    1995-10-01

    The term "transition costs" describes the potential revenue shortfall (or welfare loss) a utility (or other actor) may experience through government-initiated deregulation of electricity generation. The potential for transition costs arises whenever a regulated industry is subject to competitive market forces as a result of explicit government action. Federal and state proposals to deregulate electricity generation sparked a national debate on transition costs in the electric-utility industry. Industry-wide transition cost estimates range from about $20 billion to $500 billion. Such disparate estimates raise important questions on estimation methods for decision makers. This report examines different approaches to estimating transition costs. The study has three objectives. First, we discuss the concept of transition cost. Second, we identify the major cost categories included in transition cost estimates and summarize the current debate on which specific costs are appropriately included in these estimates. Finally, we identify general and specific estimation approaches and assess their strengths and weaknesses. We relied primarily on the evidentiary records established at the Federal Energy Regulatory Commission and the California Public Utilities Commission to identify major cost categories and specific estimation approaches. We also contacted regulatory commission staffs in ten states to ascertain estimation activities in each of these states. We refined a classification framework to describe and assess general estimation options. We subsequently developed and applied criteria to describe and assess specific estimation approaches proposed by federal regulators, state regulators, utilities, independent power companies, and consultants.

  6. Disaster warning system study summary. [cost estimates using NOAA satellites

    NASA Technical Reports Server (NTRS)

    Leroy, B. F.; Maloy, J. E.; Braley, R. C.; Provencher, C. E.; Schumaker, H. A.; Valgora, M. E.

    1977-01-01

    A conceptual satellite system to replace or complement NOAA's data collection, internal communications, and public information dissemination systems for the mid-1980's was defined. Program cost and cost sensitivity to variations in communications functions are analyzed.

  7. Development of hybrid lifecycle cost estimating tool (HLCET) for manufacturing influenced design tradeoff

    NASA Astrophysics Data System (ADS)

    Sirirojvisuth, Apinut

    concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher fidelity tradeoffs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM result for the composite part production cost. The reason for the replacement is that TCM estimates production costs from part weights, reflecting subtractive manufacturing processes of metallic origin such as casting, forging, and machining. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called the Maynard Operation Sequence Technique (MOST) to be used, similarly to the Activity-Based Costing (ABC) approach, to estimate the manufacturing time of a part by breaking down the operations that occur during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to the traditional method of time estimation by analogy or response surface equations fitted to historical process data. The MOST concept provides a tailored study of an individual process typically required for a new, innovative design. Nevertheless, the MOST idea has some challenges, one of which is its requirement to build a new process from the ground up. The process development requires a Subject Matter Expert (SME) in the manufacturing method of the particular design. The SME must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside the design discipline and prior training in MOST. To relieve the constraint, this study includes an entirely new sub-system architecture

  8. Estimating Costs of Services and Resource Allocations Using Self Reports of Time Proportions.

    ERIC Educational Resources Information Center

    Newman, Warren B.; Jones, Robert A.

    A method using estimates of individual time allocations was developed to produce a profile of departmental functioning as a whole. Previous research suggested that staff estimates were a reasonably accurate and economical method for obtaining estimates of time. Each staff member estimated the percentage of his or her time spent on different tasks…

  9. 48 CFR 9904.401 - Cost accounting standard-consistency in estimating, accumulating and reporting costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.401 Cost... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Cost accounting...

  10. What Would It Cost to Coach Every New Principal? An Estimate Using Statewide Personnel Data

    ERIC Educational Resources Information Center

    Lochmiller, Chad R.

    2014-01-01

    In this paper, I use Levin and McEwan's (2001) cost feasibility approach and personnel data obtained from the Superintendent of Public Instruction to estimate the cost of providing coaching support to every newly hired principal in Washington State. Based on this descriptive analysis, I estimate that the cost to provide leadership coaching to…

  11. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  12. Accurate path integral molecular dynamics simulation of ab-initio water at near-zero added cost

    NASA Astrophysics Data System (ADS)

    Elton, Daniel; Fritz, Michelle; Soler, José; Fernandez-Serra, Marivi

    It is now established that nuclear quantum motion plays an important role in determining water's structure and dynamics. These effects are important to consider when evaluating DFT functionals and attempting to develop better ones for water. The standard way of treating nuclear quantum effects, path integral molecular dynamics (PIMD), multiplies the number of energy/force calculations by the number of beads, which is typically 32. Here we introduce a method whereby PIMD can be incorporated into a DFT molecular dynamics simulation at virtually zero cost. The method is based on the cluster (many-body) expansion of the energy. We first subtract the DFT monomer energies, using a custom DFT-based monomer potential energy surface. The evolution of the PIMD beads is then performed using only the more accurate Partridge-Schwenke monomer energy surface. The DFT calculations are done using the centroid positions. Various bead thermostats can be employed to speed up the sampling of the quantum ensemble. The method bears some resemblance to multiple-timestep algorithms and other schemes used to speed up PIMD with classical force fields. We show that our method correctly captures some of the key effects of nuclear quantum motion on both the structure and dynamics of water. We acknowledge support from DOE Award No. DE-FG02-09ER16052 (D.E.) and DOE Early Career Award No. DE-SC0003871 (M.V.F.S.).

  13. Estimating the cost related to surveillance of colorectal cancer in a French population

    PubMed Central

    Lejeune, Catherine; Binquet, Christine; Bonnetain, Franck; Mahboubi, Amel; Abrahamowicz, Michal; Moreau, Thierry; Raikou, Maria; Bedenne, Laurent; Quantin, Catherine; Bonithon-Kopp, Claire

    2009-01-01

    Little is known about costs related to the surveillance of patients that have undergone curative resection of colorectal cancer. The aim of this study was to calculate the observed surveillance costs for 385 patients followed-up over a 3-year period, to estimate surveillance costs if French guidelines are respected, and to identify the determinants related to surveillance costs to derive a global estimation for France, using a linear mixed model. The observed mean surveillance cost was € 713. If French recommendations were strictly applied, the estimated mean cost would vary between € 680 and € 1 069 according to the frequency of abdominal ultrasound. The predicted determinants of the cost were: age, recurrence, duration of surveillance since diagnosis, and adjuvant treatments. For France, the surveillance cost represented 4.4% of the cost of colorectal cancer management. The cost of surveillance should now be balanced with its effectiveness and compared with surveillance alternatives. PMID:19259712

  14. 40 CFR 265.144 - Cost estimate for post-closure care.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Cost estimate for post-closure care..., STORAGE, AND DISPOSAL FACILITIES Financial Requirements § 265.144 Cost estimate for post-closure care. (a) The owner or operator of a hazardous waste disposal unit must have a detailed written estimate,...

  15. 48 CFR 1336.605 - Government cost estimate for architect-engineer work.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Government cost estimate... Architect-Engineer Services 1336.605 Government cost estimate for architect-engineer work. After award, the independent Government estimated price can be released, upon request, to those firms or individuals...

  16. 40 CFR 264.144 - Cost estimate for post-closure care.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Cost estimate for post-closure care... FACILITIES Financial Requirements § 264.144 Cost estimate for post-closure care. (a) The owner or operator of... contingent closure and post-closure plan, must have a detailed written estimate, in current dollars, of...

  17. Estimating dietary costs of low-income women in California: A comparison of two approaches

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Objective: Compare two approaches for estimating individual daily diet costs in a population of low-income women in California. Design: Cost estimates based on the time-intensive Method 1 (three 24-h recalls and associated food prices on receipts) were compared with estimates using a less intensive M...

  18. Cost estimation for solid waste management in industrialising regions - Precedents, problems and prospects

    SciTech Connect

    Parthan, Shantha R.; Milke, Mark W.; Wilson, David C.; Cocks, John H.

    2012-03-15

    Highlights: ► We review cost estimation approaches for solid waste management. ► Unit cost method and benchmarking techniques used in industrialising regions (IR). ► Variety in scope, quality and stakeholders makes cost estimation challenging in IR. ► Integrate waste flow and cost models using cost functions to improve cost planning. - Abstract: The importance of cost planning for solid waste management (SWM) in industrialising regions (IR) is not well recognised. The approaches used to estimate costs of SWM can broadly be classified into three categories - the unit cost method, benchmarking techniques and developing cost models using sub-approaches such as cost and production function analysis. These methods have been developed into computer programmes with varying functionality and utility. IR mostly use the unit cost and benchmarking approach to estimate their SWM costs. The models for cost estimation, on the other hand, are used at times in industrialised countries, but not in IR. Taken together, these approaches could be viewed as precedents that can be modified appropriately to suit waste management systems in IR. The main challenges (or problems) one might face while attempting to do so are a lack of cost data, and a lack of quality for what data do exist. There are practical benefits to planners in IR where solid waste problems are critical and budgets are limited.
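
    A minimal sketch of the unit cost method named in the highlights above: annual SWM cost as the sum of quantities handled times per-tonne unit costs. The activity names, tonnages and unit costs are illustrative assumptions, not data from the review.

    # Unit cost method: annual cost = sum over activities of (tonnes x unit cost).
    activities = [
        # (activity, tonnes per year, unit cost in $ per tonne) -- assumed values
        ("collection and transport", 50_000, 35.0),
        ("transfer station",         50_000,  8.0),
        ("landfill disposal",        42_000, 12.0),
        ("composting",                8_000, 20.0),
    ]

    total = sum(tonnes * unit_cost for _, tonnes, unit_cost in activities)
    for name, tonnes, unit_cost in activities:
        print(f"{name:<25s} ${tonnes * unit_cost:>12,.0f}")
    print(f"{'total annual cost':<25s} ${total:>12,.0f}")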

  19. Laboratory demonstration of aircraft estimation using low-cost sensors

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.

    1978-01-01

    Four nonlinear state estimators were devised which provide techniques for obtaining the angular orientation (attitude) of the aircraft. An extensive FORTRAN computer program was developed to demonstrate and evaluate the estimators by using recorded flight test data. This program simulates the estimator operation, and it compares the state estimates with actual state measurements. The program was used to evaluate the state estimators with data recorded on the NASA Ames CV-990 and CESSNA 402B aircraft. A preliminary assessment was made of the memory, word length, and timing requirements for implementing the selected state estimator on a typical microcomputer.

  20. Statistical Analysis of Complexity Generators for Cost Estimation

    NASA Technical Reports Server (NTRS)

    Rowell, Ginger Holmes

    1999-01-01

    Predicting the cost of cutting edge new technologies involved with spacecraft hardware can be quite complicated. A new feature of the NASA Air Force Cost Model (NAFCOM), called the Complexity Generator, is being developed to model the complexity factors that drive the cost of space hardware. This parametric approach is also designed to account for the differences in cost, based on factors that are unique to each system and subsystem. The cost driver categories included in this model are weight, inheritance from previous missions, technical complexity, and management factors. This paper explains the Complexity Generator framework, the statistical methods used to select the best model within this framework, and the procedures used to find the region of predictability and the prediction intervals for the cost of a mission.
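
    A minimal sketch of the prediction-interval idea mentioned above, using ordinary least squares on a single hypothetical complexity driver. NAFCOM's Complexity Generator uses several drivers and its own database, so the data points and the one-driver model below are purely illustrative.

    # OLS fit of cost vs. a complexity score, with a 95% prediction interval.
    import math

    # (complexity score, cost in $M) for past subsystems -- made-up example data.
    data = [(1.0, 12.0), (1.5, 15.0), (2.0, 21.0), (2.5, 24.0), (3.0, 31.0),
            (3.5, 33.0), (4.0, 41.0), (4.5, 44.0), (5.0, 52.0), (5.5, 55.0)]

    n = len(data)
    xbar = sum(x for x, _ in data) / n
    ybar = sum(y for _, y in data) / n
    sxx = sum((x - xbar) ** 2 for x, _ in data)
    slope = sum((x - xbar) * (y - ybar) for x, y in data) / sxx
    intercept = ybar - slope * xbar

    resid_var = sum((y - (intercept + slope * x)) ** 2 for x, y in data) / (n - 2)
    s = math.sqrt(resid_var)

    x_new = 4.2
    y_hat = intercept + slope * x_new
    # 95% prediction interval; 2.306 is the t critical value for n - 2 = 8 df.
    half_width = 2.306 * s * math.sqrt(1 + 1 / n + (x_new - xbar) ** 2 / sxx)
    print(f"Predicted cost at complexity {x_new}: "
          f"${y_hat:.1f}M ± ${half_width:.1f}M (95% prediction interval)")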

  1. IDC Reengineering Phase 2 & 3 Rough Order of Magnitude (ROM) Cost Estimate Summary (Leveraged NDC Case).

    SciTech Connect

    Harris, James M.; Prescott, Ryan; Dawson, Jericah M.; Huelskamp, Robert M.

    2014-11-01

    Sandia National Laboratories has prepared a ROM cost estimate for budgetary planning for the IDC Reengineering Phase 2 & 3 effort, based on leveraging a fully funded, Sandia executed NDC Modernization project. This report provides the ROM cost estimate and describes the methodology, assumptions, and cost model details used to create the ROM cost estimate. ROM Cost Estimate Disclaimer Contained herein is a Rough Order of Magnitude (ROM) cost estimate that has been provided to enable initial planning for this proposed project. This ROM cost estimate is submitted to facilitate informal discussions in relation to this project and is NOT intended to commit Sandia National Laboratories (Sandia) or its resources. Furthermore, as a Federally Funded Research and Development Center (FFRDC), Sandia must be compliant with the Anti-Deficiency Act and operate on a full-cost recovery basis. Therefore, while Sandia, in conjunction with the Sponsor, will use best judgment to execute work and to address the highest risks and most important issues in order to effectively manage within cost constraints, this ROM estimate and any subsequent approved cost estimates are on a 'full-cost recovery' basis. Thus, work can neither commence nor continue unless adequate funding has been accepted and certified by DOE.

  2. Los Alamos Waste Management Cost Estimation Model; Final report: Documentation of waste management process, development of Cost Estimation Model, and model reference manual

    SciTech Connect

    Matysiak, L.M.; Burns, M.L.

    1994-03-01

    This final report completes the Los Alamos Waste Management Cost Estimation Project, and includes the documentation of the waste management processes at Los Alamos National Laboratory (LANL) for hazardous, mixed, low-level radioactive solid and transuranic waste, development of the cost estimation model and a user reference manual. The ultimate goal of this effort was to develop an estimate of the life cycle costs for the aforementioned waste types. The Cost Estimation Model is a tool that can be used to calculate the costs of waste management at LANL for the aforementioned waste types, under several different scenarios. Each waste category at LANL is managed in a separate fashion, according to Department of Energy requirements and state and federal regulations. The cost of the waste management process for each waste category has not previously been well documented. In particular, the costs associated with the handling, treatment and storage of the waste have not been well understood. It is anticipated that greater knowledge of these costs will encourage waste generators at the Laboratory to apply waste minimization techniques to current operations. Expected benefits of waste minimization are a reduction in waste volume, decrease in liability and lower waste management costs.

  3. 48 CFR 252.215-7002 - Cost estimating system requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Contractor's policies, procedures, and practices for budgeting and planning controls, and generating...) Flow of work, coordination, and communication; and (5) Budgeting, planning, estimating methods... personnel have sufficient training, experience, and guidance to perform estimating and budgeting tasks...

  4. Estimating the Cost of Standardized Student Testing in the United States.

    ERIC Educational Resources Information Center

    Phelps, Richard P.

    2000-01-01

    Describes and contrasts different methods of estimating costs of standardized testing. Using a cost-accounting approach, compares gross and marginal costs and considers testing objects (test materials and services, personnel and student time, and administrative/building overhead). Social marginal costs of replacing existing tests with a national…

  5. Lunar base scenario cost estimates: Lunar base systems study task 6.1

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The projected development and production costs of each of the Lunar Base's systems are described and unit costs are estimated for transporting the systems to the lunar surface and for setting up the system.

  6. 48 CFR 9904.401 - Cost accounting standard-consistency in estimating, accumulating and reporting costs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Cost accounting standard... Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.401...

  7. 48 CFR 9904.401 - Cost accounting standard-consistency in estimating, accumulating and reporting costs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Cost accounting standard... Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.401...

  8. 48 CFR 9904.401 - Cost accounting standard-consistency in estimating, accumulating and reporting costs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Cost accounting standard... Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.401...

  9. COST ESTIMATION MODELS FOR DRINKING WATER TREATMENT UNIT PROCESSES

    EPA Science Inventory

    Cost models for unit processes typically utilized in a conventional water treatment plant and in package treatment plant technology are compiled in this paper. The cost curves are represented as a function of specified design parameters and are categorized into four major catego...

  10. Improving Space Project Cost Estimating with Engineering Management Variables

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph W.; Roth, Axel (Technical Monitor)

    2001-01-01

    Current space project cost models attempt to predict space flight project cost via regression equations, which relate the cost of projects to technical performance metrics (e.g. weight, thrust, power, pointing accuracy, etc.). This paper examines the introduction of engineering management parameters to the set of explanatory variables. A number of specific engineering management variables are considered and exploratory regression analysis is performed to determine if there is statistical evidence for cost effects apart from technical aspects of the projects. It is concluded that non-technical effects are indeed at work, and that further research is warranted to determine whether these cost effects can be definitively attributed to engineering management.

  11. Species Distribution 2.0: An Accurate Time- and Cost-Effective Method of Prospection Using Street View Imagery

    PubMed Central

    Schwoertzig, Eugénie; Millon, Alexandre

    2016-01-01

    Species occurrence data provide crucial information for biodiversity studies in the current context of global environmental changes. Such studies often rely on a limited number of occurrence data collected in the field and on pseudo-absences arbitrarily chosen within the study area, which reduces the value of these studies. To overcome this issue, we propose an alternative method of prospection using geo-located street view imagery (SVI). Following a standardised protocol of virtual prospection using both vertical (aerial photographs) and horizontal (SVI) perceptions, we have surveyed 1097 randomly selected cells across Spain (0.1x0.1 degree, i.e. 20% of Spain) for the presence of Arundo donax L. (Poaceae). In total we have detected A. donax in 345 cells, thus substantially expanding beyond the now two-centuries-old field-derived record, which recorded A. donax in only 216 cells. Among the field occurrence cells, 81.1% were confirmed by SVI prospection to be consistent with species presence. In addition, we recorded, by SVI prospection, 752 absences, i.e. cells where A. donax was considered absent. We have also compared the outcomes of climatic niche modeling based on SVI data against those based on field data. Using generalized linear models fitted with bioclimatic predictors, we have found SVI data to provide far more compelling results in terms of niche modeling than does field data as classically used in SDM. This original, cost- and time-effective method provides the means to accurately locate highly visible taxa, reinforce absence data, and predict species distribution without long and expensive in situ prospection. At this time, the majority of available SVI data is restricted to human-disturbed environments that have road networks. However, SVI is becoming increasingly available in natural areas, which means the technique has considerable potential to become an important factor in future biodiversity studies. PMID:26751565
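    The modelling step described above can be illustrated with a simple logistic-link GLM relating presence/absence cells to bioclimatic predictors. The two predictors and the synthetic survey below are assumptions for illustration; they are not the study's variables or data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 1097                                    # number of surveyed cells
temp = rng.normal(15.0, 4.0, n)             # mean annual temperature, deg C (assumed predictor)
precip = rng.normal(600.0, 200.0, n)        # annual precipitation, mm (assumed predictor)
logit = -6.0 + 0.35 * temp + 0.002 * precip
presence = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))   # synthetic presence/absence

X = np.column_stack([temp, precip])
glm = LogisticRegression(max_iter=5000).fit(X, presence)
prob = glm.predict_proba(X)[:, 1]           # modelled climatic suitability per cell

print("coefficients:", glm.coef_.round(4), "intercept:", glm.intercept_.round(3))
print(f"mean predicted suitability in occupied cells:   {prob[presence == 1].mean():.2f}")
print(f"mean predicted suitability in unoccupied cells: {prob[presence == 0].mean():.2f}")
```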

  12. User's manual for the INDCEPT code for estimating industrial steam boiler plant capital investment costs

    SciTech Connect

    Bowers, H I; Fuller, L C; Hudson, II, C R

    1982-09-01

    The INDCEPT computer code package was developed to provide conceptual capital investment cost estimates for single- and multiple-unit industrial steam boiler plants. Cost estimates can be made as a function of boiler type, size, location, and date of initial operation. The output includes a detailed breakdown of the estimate into direct and indirect costs. Boiler plant cost models are provided to reflect various types and sources of coal and alternate means of sulfur and particulate removal. Cost models are also included for low-Btu and medium-Btu gas produced in coal gasification plants.

  13. Stochastic Frontier Estimation of a CES Cost Function: The Case of Higher Education in Britain.

    ERIC Educational Resources Information Center

    Izadi, Hooshang; Johnes, Geraint; Oskrochi, Reza; Crouchley, Robert

    2002-01-01

    Examines the use of stochastic frontier estimation of constant elasticity of substitution (CES) cost function to measure differences in efficiency among British universities. (Contains 28 references.) (PKP)
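    For readers unfamiliar with the specification, one common textbook form of a two-input, constant-returns CES cost function and its stochastic cost frontier is sketched below; the exact functional form and distributional assumptions used by the authors may differ.

```latex
% y: output, w_1, w_2: input prices, sigma: elasticity of substitution,
% v_i: symmetric noise, u_i >= 0: inefficiency term.
\begin{align*}
  C(y, w_1, w_2) &= \frac{y}{A}\,
    \Bigl[\delta^{\sigma} w_1^{\,1-\sigma} + (1-\delta)^{\sigma} w_2^{\,1-\sigma}\Bigr]^{\frac{1}{1-\sigma}},\\[4pt]
  \ln C_i &= \ln C(y_i, w_{1i}, w_{2i};\, A, \delta, \sigma) + v_i + u_i,
  \qquad v_i \sim N(0,\sigma_v^2),\quad u_i \sim \left|N(0,\sigma_u^2)\right|.
\end{align*}
```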

  14. A systematic approach for the accurate non-invasive estimation of blood glucose utilizing a novel light-tissue interaction adaptive modelling scheme

    NASA Astrophysics Data System (ADS)

    Rybynok, V. O.; Kyriacou, P. A.

    2007-10-01

    Diabetes is one of the biggest health challenges of the 21st century. The obesity epidemic, sedentary lifestyles and an ageing population mean prevalence of the condition is currently doubling every generation. Diabetes is associated with serious chronic ill health, disability and premature mortality. Long-term complications, including heart disease, stroke, blindness, kidney disease and amputations, make the greatest contribution to the costs of diabetes care. Many of these long-term effects could be avoided with earlier, more effective monitoring and treatment. Currently, blood glucose can only be monitored through the use of invasive techniques. To date there is no widely accepted and readily available non-invasive monitoring technique to measure blood glucose despite the many attempts. This paper addresses one of the most difficult non-invasive monitoring problems, that of blood glucose, and proposes a novel approach intended to enable accurate, calibration-free estimation of glucose concentration in blood. This approach is based on spectroscopic techniques and a new adaptive modelling scheme. The theoretical implementation and the effectiveness of the adaptive modelling scheme for this application have been described, and a detailed mathematical evaluation has been employed to show that such a scheme is capable of accurately extracting the concentration of glucose from a complex biological medium.

  15. Estimating costs of sea lice control strategy in Norway.

    PubMed

    Liu, Yajie; Bjelland, Hans Vanhauwaer

    2014-12-01

    This paper explores the costs of sea lice control strategies associated with salmon aquaculture at a farm level in Norway. Diseases can cause reduced growth, lower feed efficiency and market prices, increased mortality rates, and additional expenditures on prevention and treatment measures. Aquaculture farms suffer the most direct and immediate economic losses from diseases. The goal of a control strategy is to minimize total disease costs, including biological losses and treatment costs, while maximizing overall profit. Prevention and control strategies are required to eliminate or minimize the disease, while cost-effective disease control strategies at the fish farm level are designed to reduce the losses and to enhance productivity and profitability. Thus, the goal can be achieved by integrating models of fish growth, sea lice dynamics and economic factors. A production function is first constructed to incorporate the effects of sea lice on production at a farm level, followed by a detailed cost analysis of several prevention and treatment strategies associated with sea lice in Norway. The results reveal that treatments are costly and treatment costs are very sensitive to the treatment type used and the timing of treatment. Applying treatment at an early growth stage is more economical than at a later stage.

  16. Mass screening for neuroblastoma and estimation of costs.

    PubMed

    Nishi, M; Miyake, H; Takeda, T; Takasugi, N; Hanai, J; Kawai, T

    1991-01-01

    On the basis of epidemiological data and medical costs for patients with neuroblastoma, we have calculated the cost of mass screening for neuroblastoma with high performance liquid chromatography (HPLC) compared to the cost when it is not performed. If the sensitivity of the mass screening is 80% and 22,000 infants are screened annually, the cost will be 27,809,000 yen ($191,800). If mass screening is not performed, the cost will be 28,446,000 yen ($196,200). The difference in cost (637,000 yen or $4,400) is fairly small. If the sensitivity is 75% and 16,500 infants are screened, the difference is also small (174,000 yen or $1,200). Therefore, mass screening with the HPLC method will not be an undue financial burden. But re-screening at an older age will be done with less financially favorable results, considering that the sensitivity may not be as high as that of the first screening and that mothers are somewhat reluctant about re-screening. The balance of the cost of mass screening by qualitative methods may also be less favorable, since the detection rate is low. PMID:1957600

  17. Reservoir evaluation of thin-bedded turbidites and hydrocarbon pore thickness estimation for an accurate quantification of resource

    NASA Astrophysics Data System (ADS)

    Omoniyi, Bayonle; Stow, Dorrik

    2016-04-01

    One of the major challenges in the assessment of and production from turbidite reservoirs is to take full account of thin and medium-bedded turbidites (<10cm and <30cm respectively). Although such thinner, low-pay sands may comprise a significant proportion of the reservoir succession, they can go unnoticed by conventional analysis and so negatively impact on reserve estimation, particularly in fields producing from prolific thick-bedded turbidite reservoirs. Field development plans often take little note of such thin beds, which are therefore bypassed by mainstream production. In fact, the trapped and bypassed fluids can be vital where maximising field value and optimising production are key business drivers. We have studied in detail, a succession of thin-bedded turbidites associated with thicker-bedded reservoir facies in the North Brae Field, UKCS, using a combination of conventional logs and cores to assess the significance of thin-bedded turbidites in computing hydrocarbon pore thickness (HPT). This quantity, being an indirect measure of thickness, is critical for an accurate estimation of original-oil-in-place (OOIP). By using a combination of conventional and unconventional logging analysis techniques, we obtain three different results for the reservoir intervals studied. These results include estimated net sand thickness, average sand thickness, and their distribution trend within a 3D structural grid. The net sand thickness varies from 205 to 380 ft, and HPT ranges from 21.53 to 39.90 ft. We observe that an integrated approach (neutron-density cross plots conditioned to cores) to HPT quantification reduces the associated uncertainties significantly, resulting in estimation of 96% of actual HPT. Further work will focus on assessing the 3D dynamic connectivity of the low-pay sands with the surrounding thick-bedded turbidite facies.
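    The hydrocarbon pore thickness referred to above is conventionally accumulated as porosity times hydrocarbon saturation times thickness over net-pay intervals. The following sketch shows that summation with assumed cutoffs and synthetic log values; none of the numbers are taken from the North Brae study.

```python
import numpy as np

dz = 0.5                                                 # log sampling interval, ft (assumed)
phi = np.array([0.02, 0.18, 0.21, 0.15, 0.19, 0.03])     # porosity, fraction (synthetic)
sw  = np.array([0.95, 0.35, 0.30, 0.55, 0.40, 0.90])     # water saturation (synthetic)
vsh = np.array([0.70, 0.15, 0.10, 0.25, 0.20, 0.65])     # shale volume (synthetic)

# Net-pay flags from assumed porosity, saturation and shale cutoffs.
net = (phi >= 0.10) & (sw <= 0.60) & (vsh <= 0.40)

net_sand = net.sum() * dz                                # net sand thickness, ft
hpt = np.sum(phi[net] * (1.0 - sw[net]) * dz)            # hydrocarbon pore thickness, ft
print(f"net sand: {net_sand:.1f} ft   HPT: {hpt:.2f} ft")
```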

  18. Performance evaluation of ocean color satellite models for deriving accurate chlorophyll estimates in the Gulf of Saint Lawrence

    NASA Astrophysics Data System (ADS)

    Montes-Hugo, M.; Bouakba, H.; Arnone, R.

    2014-06-01

    The understanding of phytoplankton dynamics in the Gulf of the Saint Lawrence (GSL) is critical for managing major fisheries off the Canadian East coast. In this study, the accuracy of two atmospheric correction techniques (NASA standard algorithm, SA, and Kuchinke's spectral optimization, KU) and three ocean color inversion models (Carder's empirical for SeaWiFS (Sea-viewing Wide Field-of-View Sensor), EC, Lee's quasi-analytical, QAA, and Garver-Siegel-Maritorena semi-empirical, GSM) for estimating the phytoplankton absorption coefficient at 443 nm (aph(443)) and the chlorophyll concentration (chl) in the GSL is examined. Each model was validated based on SeaWiFS images and shipboard measurements obtained during May of 2000 and April 2001. In general, aph(443) estimates derived from coupling KU and QAA models presented the smallest differences with respect to in situ determinations as measured by high-performance liquid chromatography (HPLC) (median absolute bias per cruise up to 0.005, RMSE up to 0.013). A change in the inversion approach used for estimating aph(443) values produced up to a 43.4% increase in prediction error as inferred from the median relative bias per cruise. Likewise, the impact of applying different atmospheric correction schemes was secondary and represented an additive error of up to 24.3%. By using SeaDAS (SeaWiFS Data Analysis System) default values for the optical cross section of phytoplankton (i.e., a*ph(443) = aph(443)/chl = 0.056 m2 mg-1), the median relative bias of our chl estimates as derived from the most accurate spaceborne aph(443) retrievals and with respect to in situ determinations increased up to 29%.

  19. The health and visibility cost of air pollution: a comparison of estimation methods.

    PubMed

    Delucchi, Mark A; Murphy, James J; McCubbin, Donald R

    2002-02-01

    Air pollution from motor vehicles, electricity-generating plants, industry, and other sources can harm human health, injure crops and forests, damage building materials, and impair visibility. Economists sometimes analyze the social cost of these impacts, in order to illuminate tradeoffs, compare alternatives, and promote efficient use of scarce resources. In this paper, we compare estimates of the health and visibility costs of air pollution derived from a meta-hedonic price analysis, with an estimate of health costs derived from a damage-function analysis and an estimate of the visibility cost derived from contingent valuation. We find that the meta-hedonic price analysis produces an estimate of the health cost that lies at the low end of the range of damage-function estimates. This is consistent with hypotheses that on the one hand, hedonic price analysis does not capture all of the health costs of air pollution (because individuals may not be fully informed about all of the health effects), and that on the other hand, the value of mortality used in the high-end damage function estimates is too high. The analysis of the visibility cost of air pollution derived from a meta-hedonic price analysis produces an estimate that is essentially identical to an independent estimate based on contingent valuation. This close agreement lends some credence to the estimates. We then apply the meta-hedonic price model to estimate the visibility cost per kilogram of motor vehicle emissions.

  20. But what will it Cost? The history of NASA cost estimating

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph W.

    1994-01-01

    Within two years of being chartered in 1958 as an independent agency to conduct civilian pursuits in aeronautics and space, NASA absorbed either wholly or partially the people, facilities, and equipment of several existing organizations. These included the laboratories of the National Advisory Committee of Aeronautics (NACA) at Langley Research Center in Virginia, Ames Research Center in California, and Lewis Research Center in Ohio; the Army Ballistic Missile Agency (ABMA) at Redstone Arsenal Alabama, for which the team of Wernher von Braun worked; and the Department of Defense Advanced Research Projects Agency (ARPA) and their ongoing work on big boosters. These were especially valuable resources to jump start the new agency in light of the shocking success of the Soviet space probe Sputnik in the autumn of the previous year and the corresponding pressure from an impatient American public to produce some response. Along with these inheritances, there came some existing systems engineering and management practices, including project cost estimating methodologies. This paper will briefly trace the origins of those methods and how they evolved within the agency over the past three decades.

  1. Estimation of marginal costs at existing waste treatment facilities.

    PubMed

    Martinez-Sanchez, Veronica; Hulgaard, Tore; Hindsgaul, Claus; Riber, Christian; Kamuk, Bettina; Astrup, Thomas F

    2016-04-01

    This investigation aims at providing an improved basis for assessing economic consequences of alternative Solid Waste Management (SWM) strategies for existing waste facilities. A bottom-up methodology was developed to determine marginal costs in existing facilities due to changes in the SWM system, based on the determination of average costs in such waste facilities as function of key facility and waste compositional parameters. The applicability of the method was demonstrated through a case study including two existing Waste-to-Energy (WtE) facilities, one with co-generation of heat and power (CHP) and another with only power generation (Power), affected by diversion strategies of five waste fractions (fibres, plastic, metals, organics and glass), named "target fractions". The study assumed three possible responses to waste diversion in the WtE facilities: (i) biomass was added to maintain a constant thermal load, (ii) Refuse-Derived Fuel (RDF) was included to maintain a constant thermal load, or (iii) no reaction occurred resulting in a reduced waste throughput without full utilization of the facility capacity. Results demonstrated that marginal costs of diversion from WtE were up to eleven times larger than average costs and dependent on the response in the WtE plant. Marginal costs of diversion were between 39 and 287 € Mg(-1) target fraction when biomass was added in a CHP (from 34 to 303 € Mg(-1) target fraction in the only Power case), between -2 and 300 € Mg(-1) target fraction when RDF was added in a CHP (from -2 to 294 € Mg(-1) target fraction in the only Power case) and between 40 and 303 € Mg(-1) target fraction when no reaction happened in a CHP (from 35 to 296 € Mg(-1) target fraction in the only Power case). Although average costs at WtE facilities were highly influenced by energy selling prices, marginal costs were not (provided a response was initiated at the WtE to keep constant the utilized thermal capacity). Failing to systematically

  3. Estimates and implications of the costs of compliance with biosafety regulations in developing countries.

    PubMed

    Falck-Zepeda, Jose; Yorobe, Jose; Husin, Bahagiawati Amir; Manalo, Abraham; Lokollo, Erna; Ramon, Godfrey; Zambrano, Patricia; Sutrisno

    2012-01-01

    Estimating the cost of compliance with biosafety regulations is important as it helps developers focus their investments in product development. We provide estimates for the cost of compliance for a set of technologies in Indonesia, the Philippines and other countries. These costs vary from US $100,000 to $1.7 million. These are estimates of regulatory costs and do not include product development or deployment costs. Cost estimates need to be compared with potential gains when the technology is introduced in these countries and the gains in knowledge accumulate during the biosafety assessment process. Although the cost of compliance is important, time delays and uncertainty are even more important and may have an adverse impact on innovations reaching farmers.

  4. Accurate recovery of 4D left ventricular deformations using volumetric B-splines incorporating phase based displacement estimates

    NASA Astrophysics Data System (ADS)

    Chen, Jian; Tustison, Nicholas J.; Amini, Amir A.

    2006-03-01

    In this paper, an improved framework for estimation of 3-D left-ventricular deformations from tagged MRI is presented. Contiguous short- and long-axis tagged MR images are collected and are used within a 4-D B-Spline based deformable model to determine 4-D displacements and strains. An initial 4-D B-spline model fitted to sparse tag line data is first constructed by minimizing a 4-D Chamfer distance potential-based energy function for aligning isoparametric planes of the model with tag line locations; subsequently, dense virtual tag lines based on 2-D phase-based displacement estimates and the initial model are created. A final 4-D B-spline model with increased knots is fitted to the virtual tag lines. From the final model, we can extract accurate 3-D myocardial deformation fields and corresponding strain maps which are local measures of non-rigid deformation. Lagrangian strains in simulated data are derived which show improvement over our previous work. The method is also applied to 3-D tagged MRI data collected in a canine.

  5. Can endocranial volume be estimated accurately from external skull measurements in great-tailed grackles (Quiscalus mexicanus)?

    PubMed Central

    Palmstrom, Christin R.

    2015-01-01

    There is an increasing need to validate and collect data approximating brain size on individuals in the field to understand what evolutionary factors drive brain size variation within and across species. We investigated whether we could accurately estimate endocranial volume (a proxy for brain size), as measured by computerized tomography (CT) scans, using external skull measurements and/or by filling skulls with beads and pouring them out into a graduated cylinder for male and female great-tailed grackles. We found that while females had higher correlations than males, estimations of endocranial volume from external skull measurements or beads did not tightly correlate with CT volumes. We found no accuracy in the ability of external skull measures to predict CT volumes because the prediction intervals for most data points overlapped extensively. We conclude that we are unable to detect individual differences in endocranial volume using external skull measurements. These results emphasize the importance of validating and explicitly quantifying the predictive accuracy of brain size proxies for each species and each sex. PMID:26082858

  6. Probabilistic estimation of numbers and costs of future landslides in the San Francisco Bay region

    USGS Publications Warehouse

    Crovelli, R.A.; Coe, J.A.

    2009-01-01

    We used historical records of damaging landslides triggered by rainstorms and a newly developed Probabilistic Landslide Assessment Cost Estimation System (PLACES) to estimate the numbers and direct costs of future landslides in the 10-county San Francisco Bay region. Historical records of damaging landslides in the region are incomplete. Therefore, our estimates of numbers and costs of future landslides are minimal estimates. The estimated mean annual number of future damaging landslides for the entire 10-county region is about 65. Santa Cruz County has the highest estimated mean annual number of damaging future landslides (about 18), whereas Napa, San Francisco, and Solano Counties have the lowest estimated mean numbers of damaging landslides (about 1 each). The estimated mean annual cost of future landslides in the entire region is about US $14.80 million (year 2000 $). The estimated mean annual cost is highest for San Mateo County ($3.24 million) and lowest for Solano County ($0.18 million). The annual per capita cost for the entire region will be about $2.10. Santa Cruz County will have the highest annual per capita cost at $8.45, whereas San Francisco County will have the lowest per capita cost at $0.31. Normalising costs by dividing by the percentage of land area with slopes equal to or greater than 17% indicates that San Francisco County will have the highest cost per square km ($7,101), whereas Santa Clara County will have the lowest cost per square km ($229). These results indicate that the San Francisco Bay region has one of the highest levels of landslide risk in the United States. Compared with landslide cost estimates from the rest of the world, the risk level in the Bay region seems high, but not exceptionally high.

  7. Bounds Estimation Via Regression with Asymmetric Cost Functions

    NASA Technical Reports Server (NTRS)

    DeCoste, D.

    1997-01-01

    This paper addresses a significant but mostly-neglected class of problems that we call bounds estimation. This includes learning empirical best-case and worst-case algorithmic complexity bounds and red-line bounds on sensor data.
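    One simple way to learn a worst-case style bound with an asymmetric cost is to fit a linear model under a squared loss that penalizes under-prediction much more heavily than over-prediction, as sketched below; the penalty ratio and synthetic data are illustrative, and the paper's actual estimators may differ.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 * x + 1.0 + rng.exponential(2.0, n)     # synthetic run-time observations

def asym_sq_loss_grad(w, b, under_weight=20.0, over_weight=1.0):
    """Gradient of the mean asymmetric squared loss for the model w*x + b."""
    resid = y - (w * x + b)                     # positive residual = under-prediction
    weight = np.where(resid > 0, under_weight, over_weight)
    dw = -2.0 * np.mean(weight * resid * x)
    db = -2.0 * np.mean(weight * resid)
    return dw, db

w, b, lr = 0.0, 0.0, 1e-3
for _ in range(20000):                          # plain gradient descent
    dw, db = asym_sq_loss_grad(w, b)
    w, b = w - lr * dw, b - lr * db

pred = w * x + b
print(f"fitted upper bound: y = {w:.2f}*x + {b:.2f}")
print(f"fraction of points at or below the bound: {np.mean(y <= pred):.2f}")
```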

  8. USDA Estimates of the Cost of Raising a Child: A Guide to Their Use and Interpretation.

    ERIC Educational Resources Information Center

    Edwards, Carolyn S.

    This guide describes estimates of the cost of raising a child made by the Family Economics Research Group of the United States Department of Agriculture (USDA). The guide starts with a description of what estimates are available, giving short profiles of the cost of raising urban, rural nonfarm, and rural farm children. The next section defines…

  9. A Markov model to estimate Salmonella morbidity, mortality, illness duration, and cost.

    PubMed

    Herrick, Robert L; Buchberger, Steven G; Clark, Robert M; Kupferle, Margaret; Murray, Regan; Succop, Paul

    2012-10-01

    Approximately 690000-1790000 Salmonella cases, 20000 hospitalizations, and 400 deaths occur in the USA annually, costing approximately $2.6bn. Existing models estimate morbidity, mortality, and cost solely from incidence. They do not estimate illness duration or use time as an independent cost predictor. Existing models may underestimate physician visits, hospitalizations, deaths, and associated costs. We developed a Markov chain Monte Carlo model to estimate illness duration, physician/emergency room visits, inpatient hospitalizations, mortality, and resultant costs for a given Salmonella incidence. Interested parties include society, third-party payers, health providers, federal, state and local governments, businesses, and individual patients and their families. The marginal approach estimates individual disease behavior for every patient, explicitly estimates disease duration and calculates separate time-dependent costs. The aggregate approach is a Markov equivalent of the existing models; it assumes average disease behavior and cost for a given morbidity/mortality. Transition probabilities were drawn from a meta-analysis of 53 Salmonella studies. Both approaches were tested using the 1993 Salmonella typhimurium outbreak in Gideon, Missouri. This protocol can be applied to estimate morbidity, mortality and cost of specific outbreaks, provide better national Salmonella burden estimates, and estimate the benefits of reducing Salmonella risk.
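    A heavily simplified sketch of the per-patient simulation idea is shown below: each case moves day by day through illness states according to a transition matrix, accumulating state-specific daily costs until recovery or death. The transition probabilities and daily costs are invented placeholders, not the values from the paper's meta-analysis.

```python
import numpy as np

states = ["home", "physician", "hospital", "recovered", "dead"]
# Daily transition matrix (rows sum to 1); "recovered" and "dead" are absorbing.
P = np.array([
    [0.80, 0.10, 0.02, 0.08, 0.00],   # ill at home
    [0.55, 0.20, 0.10, 0.15, 0.00],   # physician / ER visit
    [0.00, 0.00, 0.78, 0.20, 0.02],   # hospitalized
    [0.00, 0.00, 0.00, 1.00, 0.00],   # recovered
    [0.00, 0.00, 0.00, 0.00, 1.00],   # dead
])
daily_cost = np.array([25.0, 150.0, 2000.0, 0.0, 0.0])   # USD per day, assumed

rng = np.random.default_rng(4)
n_cases, horizon = 10000, 60
durations, costs = np.zeros(n_cases), np.zeros(n_cases)
for i in range(n_cases):
    s = 0                                   # every case starts ill at home
    for _day in range(horizon):
        if s >= 3:                          # absorbed: recovered or dead
            break
        costs[i] += daily_cost[s]
        durations[i] += 1
        s = rng.choice(5, p=P[s])           # sample the next day's state

print(f"mean illness duration: {durations.mean():.1f} days")
print(f"mean cost per case:    ${costs.mean():,.0f}")
print(f"fraction of cases costing over $1,000: {(costs > 1000).mean():.1%}")
```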

  10. 31 CFR Appendix I(f) to Part 13 - Estimated Overhead and Administrative Costs

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance: Treasury 1 2014-07-01 2014-07-01 false Estimated Overhead and Administrative Costs I(F) Appendix I(F) to Part 13 Money and Finance: Treasury Office of the Secretary of the Treasury... Pt. 13, App. I(F) Appendix I(F) to Part 13—Estimated Overhead and Administrative Costs Date:...

  11. 31 CFR Appendix I(f) to Part 13 - Estimated Overhead and Administrative Costs

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance: Treasury 1 2011-07-01 2011-07-01 false Estimated Overhead and Administrative Costs I(F) Appendix I(F) to Part 13 Money and Finance: Treasury Office of the Secretary of the Treasury... Pt. 13, App. I(F) Appendix I(F) to Part 13—Estimated Overhead and Administrative Costs Date:...

  12. 48 CFR 736.605 - Government cost estimate for architect-engineer work.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Government cost estimate for architect-engineer work. 736.605 Section 736.605 Federal Acquisition Regulations System AGENCY FOR... Architect-Engineer Services 736.605 Government cost estimate for architect-engineer work. See 736.602-3(c)(5)....

  13. 31 CFR Appendix I(f) to Part 13 - Estimated Overhead and Administrative Costs

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Estimated Overhead and Administrative Costs I(F) Appendix I(F) to Part 13 Money and Finance: Treasury Office of the Secretary of the Treasury... Pt. 13, App. I(F) Appendix I(F) to Part 13—Estimated Overhead and Administrative Costs Date:...

  14. 40 CFR 264.144 - Cost estimate for post-closure care.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... facility, the owner or operator must adjust the post-closure cost estimate for inflation within 60 days... cost estimate must be updated for inflation within 30 days after the close of the firm's fiscal year... dollars or by using an inflation factor derived from the most recent Implicit Price Deflator for...

  15. 40 CFR 265.144 - Cost estimate for post-closure care.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of the facility, the owner or operator must adjust the post-closure cost estimate for inflation...-closure care cost estimate must be updated for inflation no later than 30 days after the close of the firm... current dollars or by using an inflation factor derived from the most recent Implicit Price Deflator...

  16. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... Planning and Performing Construction and Other Development Pt. 1924, Subpt. A, Exh. A Exhibit A to Subpart A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  17. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... Planning and Performing Construction and Other Development Pt. 1924, Subpt. A, Exh. A Exhibit A to Subpart A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  18. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... Planning and Performing Construction and Other Development Pt. 1924, Subpt. A, Exh. A Exhibit A to Subpart A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  19. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... Planning and Performing Construction and Other Development Pt. 1924, Subpt. A, Exh. A Exhibit A to Subpart A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  20. Estimating Development Cost of an Interactive Website Based Cancer Screening Promotion Program

    PubMed Central

    Lairson, David R.; Chung, Tong Han; Smith, Lisa G.; Springston, Jeffrey K.; Champion, Victoria L.

    2015-01-01

    Objectives The aim of this study was to estimate the initial development costs for an innovative talk show format tailored intervention delivered via the interactive web, for increasing cancer screening in women 50 to 75 who were non-adherent to screening guidelines for colorectal cancer and/or breast cancer. Methods The cost of the intervention development was estimated from a societal perspective. Micro costing methods plus vendor contract costs were used to estimate cost. Staff logs were used to track personnel time. Non-personnel costs include all additional resources used to produce the intervention. Results Development cost of the interactive web based intervention was $0.39 million, of which 77% was direct cost. About 98% of the cost was incurred in personnel time cost, contract cost and overhead cost. Conclusions The new web-based disease prevention medium required substantial investment in health promotion and media specialist time. The development cost was primarily driven by the high level of human capital required. The cost of intervention development is important information for assessing and planning future public and private investments in web-based health promotion interventions. PMID:25749548

  1. Estimating the Costs of Torture: Challenges and Opportunities.

    PubMed

    Mpinga, Emmanuel Kabengele; Kandala, Ngianga-Bakwin; Hasselgård-Rowe, Jennifer; Tshimungu Kandolo, Félicien; Verloo, Henk; Bukonda, Ngoyi K Zacharie; Chastonay, Philippe

    2015-12-01

    Due to its nature, extent and consequences, torture is considered a major public health problem and a serious violation of human rights. Our study aims to set the foundation for a theoretical framework of the costs related to torture. It examines existing challenges and proposes some solutions. Our proposed framework targets policy makers, human rights activists, professionals working in programmes, centres and rehabilitation projects, judges and lawyers, survivors of torture and their families and anyone involved in the prevention and fight against this practice and its consequences. We adopted a methodology previously used in studies investigating the challenges in measuring and valuing productivity costs in health disorders. We identify and discuss conceptual, methodological, political and ethical challenges that studies on the economic and social costs of torture pose and propose alternatives in terms of possible solutions to these challenges. The economic dimension of torture is rarely debated and integrated in research, policies and programmes. Several challenges such as epistemological, methodological, ethical or political ones have often been presented as obstacles to cost studies of torture and as an excuse for not investigating this dimension. In identifying, analysing and proposing solutions to these challenges, we intend to stimulate the integration of the economic dimension in research and prevention of torture strategies. PMID:26385586

  3. ESTIMATING INNOVATIVE TECHNOLOGY COSTS FOR THE SITE PROGRAM

    EPA Science Inventory

    Among the objectives of the EPA's Superfund Innovative Technology Evaluation (SITE) Program are two which pertain to the issue of economics: 1) That the program will provide a projected cost for each treatment technology demonstrated. 2) That the program will attempt to identify ...

  4. The Pilot Training Study: A Cost-Estimating Model for Advanced Pilot Training (APT).

    ERIC Educational Resources Information Center

    Knollmeyer, L. E.

    The Advanced Pilot Training Cost Model is a statement of relationships that may be used, given the necessary inputs, for estimating the resources required and the costs to train pilots in the Air Force formal flying training schools. Resources and costs are computed by weapon system on an annual basis for use in long-range planning or sensitivity…

  5. Estimating Resource Costs of Levy Campaigns in Five Ohio School Districts

    ERIC Educational Resources Information Center

    Ingle, W. Kyle; Petroff, Ruth Ann; Johnson, Paul A.

    2011-01-01

    Using Levin and McEwan's (2001) "ingredients method," this study identified the major activities and associated costs of school levy campaigns in five districts. The ingredients were divided into one of five cost categories--human resources, facilities, fees, marketing, and supplies. As to overall costs of the campaigns, estimates ranged from a…

  6. Estimating the Degree Cost Functions of the Philippines Public and Private Higher Educational Institutions

    ERIC Educational Resources Information Center

    Rufino, Cesar C.

    2006-01-01

    A flexible one-output and two-input cost function is estimated for the degree program offerings of public and private higher educational institutions (HEIs) of the Philippines, employing the data from a nationally representative sample of 29 HEIs. This model, called Flexible Fixed Cost Quadratic cost function includes as output--full time…

  7. Estimating the costs of intensity-modulated and 3-dimensional conformal radiotherapy in Ontario

    PubMed Central

    Yong, J.H.E.; McGowan, T.; Redmond-Misner, R.; Beca, J.; Warde, P.; Gutierrez, E.; Hoch, J.S.

    2016-01-01

    Background Radiotherapy is a common treatment for many cancers, but up-to-date estimates of the costs of radiotherapy are lacking. In the present study, we estimated the unit costs of intensity-modulated radiotherapy (IMRT) and 3-dimensional conformal radiotherapy (3D-CRT) in Ontario. Methods An activity-based costing model was developed to estimate the costs of IMRT and 3D-CRT in prostate cancer. It included the costs of equipment, staff, and supporting infrastructure. The framework was subsequently adapted to estimate the costs of radiotherapy in breast cancer and head-and-neck cancer. We also tested various scenarios by varying the program maturity and the use of volumetric modulated arc therapy (VMAT) alongside IMRT. Results From the perspective of the health care system, treating prostate cancer with IMRT and 3D-CRT respectively cost $12,834 and $12,453 per patient. The cost of radiotherapy ranged from $5,270 to $14,155 and was sensitive to analytic perspective, radiation technique, and disease site. Cases of head-and-neck cancer were the most costly, being driven by treatment complexity and fractions per treatment. Although IMRT was more costly than 3D-CRT, its cost will likely decline over time as programs mature and VMAT is incorporated. Conclusions Our costing model can be modified to estimate the costs of 3D-CRT and IMRT for various disease sites and settings. The results demonstrate the important role of capital costs in studies of radiotherapy cost from a health system perspective, which our model can accommodate. In addition, our study established the need for future analyses of IMRT cost to consider how VMAT affects time consumption. PMID:27330359
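    In the spirit of the activity-based framework described above, the sketch below builds a per-course cost from planning time, per-fraction machine and staff time, and an overhead charge; all resource quantities and unit prices are placeholders rather than the Ontario figures.

```python
# Hedged activity-based costing sketch; every number below is an assumption.
linac_cost_per_hour = 450.0        # amortized equipment and maintenance
staff_rates = {"radiation_therapist": 55.0, "physicist": 95.0, "oncologist": 180.0}
overhead_per_fraction = 60.0       # facility infrastructure share per delivered fraction

def course_cost(n_fractions, minutes_per_fraction, planning_hours):
    """Cost of one radiotherapy course under the assumed unit prices."""
    planning = planning_hours * (staff_rates["physicist"] + staff_rates["oncologist"])
    per_fraction = ((minutes_per_fraction / 60.0)
                    * (linac_cost_per_hour + 2 * staff_rates["radiation_therapist"])
                    + overhead_per_fraction)
    return planning + n_fractions * per_fraction

# Illustrative comparison: a more complex technique uses more planning time
# and slightly longer delivery per fraction.
print(f"3D-CRT-like course: ${course_cost(20, 15, 6):,.0f}")
print(f"IMRT-like course:   ${course_cost(20, 20, 12):,.0f}")
```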

  8. Estimating the gas transfer velocity: a prerequisite for more accurate and higher resolution GHG fluxes (lower Aare River, Switzerland)

    NASA Astrophysics Data System (ADS)

    Sollberger, S.; Perez, K.; Schubert, C. J.; Eugster, W.; Wehrli, B.; Del Sontro, T.

    2013-12-01

    Currently, carbon dioxide (CO2) and methane (CH4) emissions from lakes, reservoirs and rivers are readily investigated due to the global warming potential of those gases and the role these inland waters play in the carbon cycle. However, there is a lack of high spatiotemporally-resolved emission estimates, and how to accurately assess the gas transfer velocity (K) remains controversial. In anthropogenically-impacted systems where run-of-river reservoirs disrupt the flow of sediments by increasing the erosion and load accumulation patterns, the resulting production of carbonic greenhouse gases (GH-C) is likely to be enhanced. The GH-C flux is thus counteracting the terrestrial carbon sink in these environments that act as net carbon emitters. The aim of this project was to determine the GH-C emissions from a medium-sized river heavily impacted by several impoundments and channelization through a densely-populated region of Switzerland. Estimating gas emission from rivers is not trivial and recently several models have been put forth to do so; therefore a second goal of this project was to compare the river emission models available with direct measurements. Finally, we further validated the modeled fluxes by using a combined approach with water sampling, chamber measurements, and highly temporal GH-C monitoring using an equilibrator. We conducted monthly surveys along the 120 km of the lower Aare River where we sampled for dissolved CH4 ('manual' sampling) at a 5-km sampling resolution, and measured gas emissions directly with chambers over a 35 km section. We calculated fluxes (F) via the boundary layer equation (F = K × (Cw − Ceq)) that uses the water-air GH-C concentration (C) gradient (Cw-Ceq) and K, which is the most sensitive parameter. K was estimated using 11 different models found in the literature with varying dependencies on: river hydrology (n=7), wind (2), heat exchange (1), and river width (1). We found that chamber fluxes were always higher than boundary
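    The boundary layer calculation quoted above is straightforward once K is chosen; the sketch below evaluates F = K × (Cw − Ceq) across a range of gas transfer velocities to show how strongly the flux depends on K. The concentrations and K values are illustrative, not measurements from the Aare surveys.

```python
import numpy as np

c_water = 250e-9        # dissolved CH4 in the river, mol per litre (assumed)
c_equil = 3e-9          # CH4 at equilibrium with the atmosphere, mol per litre (assumed)

# Gas transfer velocities spanning roughly the spread produced by different
# literature parameterizations (hydrology-based, wind-based, etc.); assumed values.
k_m_per_day = np.array([1.0, 2.0, 4.0, 8.0, 16.0])

# Flux in mol m^-2 d^-1: the factor 1000 converts the gradient from mol/L to mol/m^3.
flux = k_m_per_day * (c_water - c_equil) * 1000.0
for k, f in zip(k_m_per_day, flux):
    print(f"k = {k:5.1f} m/d  ->  F = {f * 1e6:7.1f} umol m^-2 d^-1")
```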

  9. Estimating Criminal Justice System Costs and Cost-Savings Benefits of Day Reporting Centers

    ERIC Educational Resources Information Center

    Craddock, Amy

    2004-01-01

    This paper reports on the net cost-savings benefits (loss) to the criminal justice system of one rural and one urban day reporting center, both of which serve high risk/high need probationers. It also discusses issues of conducting criminal justice system cost studies of community corrections programs. The average DRC participant in the rural…

  10. Solid Waste Operations Complex W-113: Project cost estimate. Preliminary design report. Volume IV

    SciTech Connect

    1995-01-01

    This document contains Volume IV of the Preliminary Design Report for the Solid Waste Operations Complex W-113 which is the Project Cost Estimate and construction schedule. The estimate was developed based upon Title 1 material take-offs, budgetary equipment quotes and Raytheon historical in-house data. The W-113 project cost estimate and project construction schedule were integrated together to provide a resource loaded project network.

  11. Development of a new, robust and accurate, spectroscopic metric for scatterer size estimation in optical coherence tomography (OCT) images

    NASA Astrophysics Data System (ADS)

    Kassinopoulos, Michalis; Pitris, Costas

    2016-03-01

    The modulations appearing on the backscattering spectrum originating from a scatterer are related to its diameter as described by Mie theory for spherical particles. Many metrics for Spectroscopic Optical Coherence Tomography (SOCT) take advantage of this observation in order to enhance the contrast of Optical Coherence Tomography (OCT) images. However, none of these metrics has achieved high accuracy when calculating the scatterer size. In this work, Mie theory was used to further investigate the relationship between the degree of modulation in the spectrum and the scatterer size. From this study, a new spectroscopic metric, the bandwidth of the Correlation of the Derivative (COD) was developed which is more robust and accurate, compared to previously reported techniques, in the estimation of scatterer size. The self-normalizing nature of the derivative and the robustness of the first minimum of the correlation as a measure of its width, offer significant advantages over other spectral analysis approaches especially for scatterer sizes above 3 μm. The feasibility of this technique was demonstrated using phantom samples containing 6, 10 and 16 μm diameter microspheres as well as images of normal and cancerous human colon. The results are very promising, suggesting that the proposed metric could be implemented in OCT spectral analysis for measuring nuclear size distribution in biological tissues. A technique providing such information would be of great clinical significance since it would allow the detection of nuclear enlargement at the earliest stages of precancerous development.
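    A hedged sketch of the idea behind the metric follows: differentiate the backscattered spectrum, autocorrelate the derivative, and take the lag of the first correlation minimum as the width measure. The synthetic spectrum is a stand-in rather than a Mie calculation, and the code is not the authors' implementation.

```python
import numpy as np

wavelengths_nm = np.linspace(1200, 1400, 512)
# Toy spectrum: slowly varying envelope plus a size-dependent modulation
# (the modulation period is an assumed placeholder for the Mie oscillations).
modulation_period_nm = 25.0
spectrum = ((1.0 + 0.3 * np.cos(2 * np.pi * wavelengths_nm / modulation_period_nm))
            * np.exp(-((wavelengths_nm - 1300.0) / 120.0) ** 2))

def cod_bandwidth(spec):
    """Lag (in samples) of the first minimum of the autocorrelation of d(spec)."""
    d = np.diff(spec)
    d = d - d.mean()
    corr = np.correlate(d, d, mode="full")[d.size - 1:]   # non-negative lags only
    corr = corr / corr[0]                                  # self-normalized
    minima = np.where((corr[1:-1] < corr[:-2]) & (corr[1:-1] < corr[2:]))[0] + 1
    return int(minima[0]) if minima.size else corr.size

print("COD bandwidth (samples to first correlation minimum):",
      cod_bandwidth(spectrum))
```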

  12. Optimization of tissue physical parameters for accurate temperature estimation from finite-element simulation of radiofrequency ablation

    NASA Astrophysics Data System (ADS)

    Subramanian, Swetha; Mast, T. Douglas

    2015-09-01

    Computational finite element models are commonly used for the simulation of radiofrequency ablation (RFA) treatments. However, the accuracy of these simulations is limited by the lack of precise knowledge of tissue parameters. In this technical note, an inverse solver based on the unscented Kalman filter (UKF) is proposed to optimize values for specific heat, thermal conductivity, and electrical conductivity resulting in accurately simulated temperature elevations. A total of 15 RFA treatments were performed on ex vivo bovine liver tissue. For each RFA treatment, 15 finite-element simulations were performed using a set of deterministically chosen tissue parameters to estimate the mean and variance of the resulting tissue ablation. The UKF was implemented as an inverse solver to recover the specific heat, thermal conductivity, and electrical conductivity corresponding to the measured area of the ablated tissue region, as determined from gross tissue histology. These tissue parameters were then employed in the finite element model to simulate the position- and time-dependent tissue temperature. Results show good agreement between simulated and measured temperature.

  13. Optimization of tissue physical parameters for accurate temperature estimation from finite-element simulation of radiofrequency ablation.

    PubMed

    Subramanian, Swetha; Mast, T Douglas

    2015-10-01

    Computational finite element models are commonly used for the simulation of radiofrequency ablation (RFA) treatments. However, the accuracy of these simulations is limited by the lack of precise knowledge of tissue parameters. In this technical note, an inverse solver based on the unscented Kalman filter (UKF) is proposed to optimize values for specific heat, thermal conductivity, and electrical conductivity resulting in accurately simulated temperature elevations. A total of 15 RFA treatments were performed on ex vivo bovine liver tissue. For each RFA treatment, 15 finite-element simulations were performed using a set of deterministically chosen tissue parameters to estimate the mean and variance of the resulting tissue ablation. The UKF was implemented as an inverse solver to recover the specific heat, thermal conductivity, and electrical conductivity corresponding to the measured area of the ablated tissue region, as determined from gross tissue histology. These tissue parameters were then employed in the finite element model to simulate the position- and time-dependent tissue temperature. Results show good agreement between simulated and measured temperature. PMID:26352462
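    The note uses an unscented Kalman filter as the inverse solver; the sketch below illustrates the same inverse-problem structure with a plain nonlinear least-squares fit instead. The function simulate_ablation_area is a hypothetical stand-in for the finite-element forward model, and all parameter values and bounds are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def simulate_ablation_area(params):
    """Placeholder forward model: ablated area (cm^2) as a smooth function of tissue parameters."""
    specific_heat, thermal_cond, elec_cond = params
    return 4.0 * elec_cond / (0.5 + thermal_cond) / np.sqrt(specific_heat / 3600.0)

measured_area = 2.8            # cm^2, e.g. from gross histology (assumed observation)

def residual(params):
    return simulate_ablation_area(params) - measured_area

# Initial guess and physical bounds for specific heat, thermal and electrical conductivity (assumed).
x0 = np.array([3600.0, 0.5, 0.3])
fit = least_squares(residual, x0,
                    bounds=([3000.0, 0.3, 0.1], [4200.0, 0.7, 0.8]))

print("recovered parameters:", fit.x.round(3))
print("modelled vs measured area:",
      round(simulate_ablation_area(fit.x), 3), "vs", measured_area)
```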

  14. Estimating Power Outage Cost based on a Survey for Industrial Customers

    NASA Astrophysics Data System (ADS)

    Yoshida, Yoshikuni; Matsuhashi, Ryuji

    A survey on power outage costs was conducted for industrial customers. 5139 factories, which are designated energy management factories in Japan, reported their power consumption and the loss of production value due to a one-hour power outage on a summer weekday. The median unit cost of a power outage across all sectors is estimated at 672 yen/kWh. The sector of services for amusement and hobbies and the sector of manufacture of information and communication electronics equipment have relatively high unit costs of power outage. Direct damage cost from a power outage across all sectors reaches 77 billion yen. Then, utilizing input-output analysis, we estimated the indirect damage cost caused by the knock-on effects of production halts. Indirect damage cost across all sectors reaches 91 billion yen. The sector of wholesale and retail trade has the largest direct damage cost. The sector of manufacture of transportation equipment has the largest indirect damage cost.
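    The input-output step mentioned above can be sketched with the Leontief relation: treating the direct production loss as a drop in final demand, total losses are (I − A)⁻¹ times the direct loss, and the indirect damage is the difference. The three-sector coefficient matrix and loss figures below are toy numbers, not the survey's data.

```python
import numpy as np

sectors = ["manufacturing", "wholesale/retail", "services"]
A = np.array([            # technical coefficients (inputs per unit of output), assumed
    [0.20, 0.10, 0.05],
    [0.15, 0.05, 0.10],
    [0.10, 0.20, 0.05],
])
direct_loss = np.array([30.0, 25.0, 22.0])   # billion yen, illustrative direct losses

# Leontief propagation: total = (I - A)^-1 @ direct, indirect = total - direct.
total_loss = np.linalg.solve(np.eye(3) - A, direct_loss)
indirect_loss = total_loss - direct_loss

for s, d, i in zip(sectors, direct_loss, indirect_loss):
    print(f"{s:18s} direct {d:5.1f}  indirect {i:5.1f}  (billion yen)")
print(f"{'all sectors':18s} direct {direct_loss.sum():5.1f}  indirect {indirect_loss.sum():5.1f}")
```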

  15. Medical costs of smoking in the United States: estimates, their validity, and their implications

    PubMed Central

    Warner, K.; Hodgson, T.; Carroll, C.

    1999-01-01

    OBJECTIVE—To compare estimates of the medical costs of smoking in the United States and to consider their relevance to assessing the costs of smoking in developing countries and the net economic burden of smoking.
DATA SOURCES—A Medline search through early 1999 using keywords "smoking" and "cost", with review of article reference lists.
STUDY SELECTION—Peer-reviewed papers examining medical costs in a single year, covering the non-institutionalised American population.
DATA EXTRACTION—Methods underlying study estimates were identified, described, and compared with attributable expenditure methodology in the literature dealing with costs of illness. Differences in methods were associated with implied differences in findings.
DATA SYNTHESIS—With one exception, the studies find the annual medical costs of smoking to constitute approximately 6-8% of American personal health expenditures. The exception, a recent study, found much larger attributable expenditures. The lower estimates may reflect the limitation of analysis to costs associated with the principal smoking-related diseases. The higher estimate derives from analysis of smoking-attributable differences in all medical costs. However, the finding from the most recent study, also considering all medical costs, fell in the 6-8% range.
CONCLUSIONS—The medical costs of smoking in the United States equal, and may well exceed, the commonly referenced figure of 6-8%. This literature has direct methodological relevance to developing countries interested in assessing the magnitude of their current cost-of-smoking burden and their future burdens, with differences in tobacco use histories and the availability of chronic disease treatment affecting country-specific estimates. The debate over the use of gross or net medical cost estimates is likely to intensify with the proliferation of lawsuits against the tobacco industry to recover expenditures on tobacco-produced disease.


Keywords: medical

  16. Users' Guide to USDA Estimates of the Cost of Raising a Child.

    ERIC Educational Resources Information Center

    Edwards, Carolyn S.

    In this article, estimates of the cost of raising a child, that are available from the U.S. Department of Agriculture, are described; the most widely requested estimates updated to current price levels are provided; and the most frequently asked questions about the use and interpretation of these estimates are answered. Information on additional…

  17. Hyperketonemia in early lactation dairy cattle: a deterministic estimate of component and total cost per case.

    PubMed

    McArt, J A A; Nydam, D V; Overton, M W

    2015-03-01

    The purpose of this study was to develop a deterministic economic model to estimate the costs associated with (1) the component cost per case of hyperketonemia (HYK) and (2) the total cost per case of HYK when accounting for costs related to HYK-attributed diseases. Data from current literature was used to model the incidence and risks of HYK (defined as a blood β-hydroxybutyrate concentration≥1.2 mmol/L), displaced abomasa (DA), metritis, disease associations, milk production, culling, and reproductive outcomes. The component cost of HYK was estimated based on 1,000 calvings per year; the incidence of HYK in primiparous and multiparous animals; the percent of animals receiving clinical treatment; the direct costs of diagnostics, therapeutics, labor, and death loss; and the indirect costs of future milk production losses, future culling losses, and reproduction losses. Costs attributable to DA and metritis were estimated based on the incidence of each disease in the first 30 DIM; the number of cases of each disease attributable to HYK; the direct costs of diagnostics, therapeutics, discarded milk during treatment and the withdrawal period, veterinary service (DA only), and death loss; and the indirect costs of future milk production losses, future culling losses, and reproduction losses. The component cost per case of HYK was estimated at $134 and $111 for primiparous and multiparous animals, respectively; the average component cost per case of HYK was estimated to be $117. Thirty-four percent of the component cost of HYK was due to future reproductive losses, 26% to death loss, 26% to future milk production losses, 8% to future culling losses, 3% to therapeutics, 2% to labor, and 1% to diagnostics. The total cost per case of HYK was estimated at $375 and $256 for primiparous and multiparous animals, respectively; the average total cost per case of HYK was $289. Forty-one percent of the total cost of HYK was due to the component cost of HYK, 33% to costs

  18. The Hospitalization Costs of Diabetes and Hypertension Complications in Zimbabwe: Estimations and Correlations.

    PubMed

    Mutowo, Mutsa P; Lorgelly, Paula K; Laxy, Michael; Renzaho, Andre M N; Mangwiro, John C; Owen, Alice J

    2016-01-01

    Objective. Treating complications associated with diabetes and hypertension imposes significant costs on health care systems. This study estimated the hospitalization costs for inpatients in a public hospital in Zimbabwe. Methods. The study was retrospective and utilized secondary data from medical records. Total hospitalization costs were estimated using generalized linear models. Results. The median cost and interquartile range (IQR) for patients with diabetes, $994 (385-1553) mean $1319 (95% CI: 981-1657), was higher than patients with hypertension, $759 (494-1147) mean $914 (95% CI: 825-1003). Female patients aged below 65 years with diabetes had the highest estimated mean costs ($1467 (95% CI: 1177-1828)). Wound care had the highest estimated mean cost of all procedures, $2884 (95% CI: 2004-4149) for patients with diabetes and $2239 (95% CI: 1589-3156) for patients with hypertension. Age below 65 years, medical procedures (amputation, wound care, dialysis, and physiotherapy), the presence of two or more comorbidities, and being prescribed two or more drugs were associated with significantly higher hospitalization costs. Conclusion. Our estimated costs could be used to evaluate and improve current inpatient treatment and management of patients with diabetes and hypertension and determine the most cost-effective interventions to prevent complications and comorbidities. PMID:27403444
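
    The generalized linear model used for total hospitalization costs can be illustrated with a gamma family and log link, a common specification for right-skewed cost data. The synthetic covariates below (age group, wound care, comorbidities) are placeholders for the study's variables, and the coefficients are invented for the example.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(42)
        n = 400
        age_under_65 = rng.integers(0, 2, n)          # 1 = below 65 years
        wound_care = rng.integers(0, 2, n)            # 1 = wound care performed
        comorbid_2plus = rng.integers(0, 2, n)        # 1 = two or more comorbidities

        # Synthetic skewed costs on a multiplicative scale (illustrative only).
        mu = np.exp(6.5 + 0.3 * age_under_65 + 0.9 * wound_care + 0.4 * comorbid_2plus)
        cost = rng.gamma(shape=2.0, scale=mu / 2.0)

        X = sm.add_constant(np.column_stack([age_under_65, wound_care, comorbid_2plus]))
        model = sm.GLM(cost, X, family=sm.families.Gamma(link=sm.families.links.Log()))
        result = model.fit()
        print(result.summary())

        # Predicted mean cost for a patient under 65 with wound care and 2+ comorbidities.
        print(result.predict([[1, 1, 1, 1]]))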

  19. The Hospitalization Costs of Diabetes and Hypertension Complications in Zimbabwe: Estimations and Correlations

    PubMed Central

    Mutowo, Mutsa P.; Lorgelly, Paula K.; Laxy, Michael; Mangwiro, John C.; Owen, Alice J.

    2016-01-01

    Objective. Treating complications associated with diabetes and hypertension imposes significant costs on health care systems. This study estimated the hospitalization costs for inpatients in a public hospital in Zimbabwe. Methods. The study was retrospective and utilized secondary data from medical records. Total hospitalization costs were estimated using generalized linear models. Results. The median cost and interquartile range (IQR) for patients with diabetes, $994 (385–1553) mean $1319 (95% CI: 981–1657), was higher than patients with hypertension, $759 (494–1147) mean $914 (95% CI: 825–1003). Female patients aged below 65 years with diabetes had the highest estimated mean costs ($1467 (95% CI: 1177–1828)). Wound care had the highest estimated mean cost of all procedures, $2884 (95% CI: 2004–4149) for patients with diabetes and $2239 (95% CI: 1589–3156) for patients with hypertension. Age below 65 years, medical procedures (amputation, wound care, dialysis, and physiotherapy), the presence of two or more comorbidities, and being prescribed two or more drugs were associated with significantly higher hospitalization costs. Conclusion. Our estimated costs could be used to evaluate and improve current inpatient treatment and management of patients with diabetes and hypertension and determine the most cost-effective interventions to prevent complications and comorbidities. PMID:27403444

  20. Survey of State-Level Cost and Benefit Estimates of Renewable Portfolio Standards

    SciTech Connect

    Heeter, J.; Barbose, G.; Bird, L.; Weaver, S.; Flores-Espino, F.; Kuskova-Burns, K.; Wiser, R.

    2014-05-01

    Most renewable portfolio standards (RPS) have five or more years of implementation experience, enabling an assessment of their costs and benefits. Understanding RPS costs and benefits is essential for policymakers evaluating existing RPS policies, assessing the need for modifications, and considering new policies. This study provides an overview of methods used to estimate RPS compliance costs and benefits, based on available data and estimates issued by utilities and regulators. Over the 2010-2012 period, average incremental RPS compliance costs in the United States were equivalent to 0.8% of retail electricity rates, although substantial variation exists around this average, both from year-to-year and across states. The methods used by utilities and regulators to estimate incremental compliance costs vary considerably from state to state and a number of states are currently engaged in processes to refine and standardize their approaches to RPS cost calculation. The report finds that state assessments of RPS benefits have most commonly attempted to quantitatively assess avoided emissions and human health benefits, economic development impacts, and wholesale electricity price savings. Compared to the summary of RPS costs, the summary of RPS benefits is more limited, as relatively few states have undertaken detailed benefits estimates, and then only for a few types of potential policy impacts. In some cases, the same impacts may be captured in the assessment of incremental costs. For these reasons, and because methodologies and level of rigor vary widely, direct comparisons between the estimates of benefits and costs are challenging.

  1. Cost estimate of hospital stays for premature newborns of adolescent mothers in a Brazilian public hospital

    PubMed Central

    Mwamakamba, Lutufyo Witson; Zucchi, Paola

    2014-01-01

    ABSTRACT Objective: To estimate the direct costs of hospital stay for premature newborns of adolescent mothers, in a public hospital. Methods: A cost estimate study conducted between 2009 and 2011, in which direct hospital costs were estimated for premature newborns of adolescent mothers, with 22 to 36 6/7 gestational weeks, and treated at the neonatal unit of the hospital. Results: In 2006, there were 5,180 deliveries at this hospital, and 17.8% (922) were newborns of adolescent mothers, of which 19.63% (181) were admitted to the neonatal unit. Out of the 181 neonates, 58% (105) were premature and 80% (84) of them were included in this study. These 84 neonates had a total of 1,633 days in-patient hospital care at a total cost of US$195,609.00. Approximately 72% of this total cost (US$141,323.00) accounted for hospital services. The mean daily costs ranged from US$97.00 to US$157.00. Conclusion: This study demonstrated that the average cost of premature newborns from adolescent mothers was US$2,328.00 and varied according to birth weight. For those weighing <1,000g at birth, the mean direct cost was US$8,930.00 per stay as opposed to a cost of US$642.00 for those with birth weight >2,000g. The overall estimated direct cost for the 84 neonates in the study totaled US$195,609.00. PMID:25003930

  2. Comparing NASA and ESA Cost Estimating Methods for Human Missions to Mars

    NASA Technical Reports Server (NTRS)

    Hunt, Charles D.; vanPelt, Michel O.

    2004-01-01

    To compare working methodologies between the cost engineering functions at NASA Marshall Space Flight Center (MSFC) and the ESA European Space Research and Technology Centre (ESTEC), and to set up cost engineering capabilities for future manned Mars projects and other studies involving similar subsystem technologies at MSFC and ESTEC, a demonstration cost estimate exercise was organized. This exercise was a direct way of enhancing not only cooperation between the agencies but also both agencies' commitment to credible cost analyses. Cost engineers at MSFC and ESTEC independently prepared life-cycle cost estimates for a reference human Mars project and subsequently compared the results and estimate methods in detail. As a non-sensitive, public-domain reference case for human Mars projects, the Mars Direct concept was chosen. In this paper the results of the exercise are shown: the differences and similarities in estimate methodologies, philosophies, and databases between MSFC and ESTEC, as well as the estimate results for the Mars Direct concept. The most significant differences are explained and possible estimate improvements identified. In addition, the Mars Direct plan and the extensive cost breakdown structure jointly set up by MSFC and ESTEC for this concept are presented. It was found that NASA applied estimate models mainly based on historical Apollo and Space Shuttle cost data, taking into account the changes in technology since then. ESA used models mostly based on European satellite and launcher cost data, taking into account the higher equipment and testing standards for human space flight. Most of NASA's and ESA's estimates for the Mars Direct case are comparable, but there are some important, consistent differences in the estimates for: 1) Large Structures and Thermal Control subsystems; 2) System Level Management, Engineering, Product Assurance and Assembly, Integration and Test/Verification activities; 3) Mission Control; 4) Space Agency Program Level
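
    Parametric models of the kind both agencies use rest on cost estimating relationships (CERs) fit to historical data. A minimal weight-based CER of the form cost = a * mass^b, fit in log-log space to hypothetical data points, is sketched below; the numbers are illustrative, not agency data.

        import numpy as np

        mass_kg = np.array([1_200, 3_500, 8_000, 20_000, 45_000])       # element dry mass (hypothetical)
        cost_musd = np.array([180, 390, 700, 1_450, 2_600])             # development cost (hypothetical)

        b, log_a = np.polyfit(np.log(mass_kg), np.log(cost_musd), 1)    # fit log(cost) = log(a) + b*log(mass)
        a = np.exp(log_a)
        print(f"CER: cost ~= {a:.1f} * mass^{b:.2f}")
        print(f"estimate for a 12,000 kg element: {a * 12_000**b:,.0f} M$")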

  3. Evaluation of economic effects of population ageing--methodology of estimating indirect costs.

    PubMed

    Schubert, Agata; Czech, Marcin; Gębska-Kuczerowska, Anita

    2015-01-01

    The process of demographic ageing, especially in recent decades, has been growing steadily in pace and importance owing to increasing health-related needs and expectations regarding guaranteed social services. Developing the most effective model of care, tailored to Polish conditions, requires an estimate of the actual costs of that care, including indirect costs, which are largely related to informal care. The omission of informal care costs results from the perspective adopted in most analyses: costs are considered only from the health budget perspective rather than a societal one, so the costs borne by recipients of services are neglected. As a consequence, the costs of informal care are underestimated or excluded from calculations altogether, even when indirect costs are otherwise included. A comprehensive methodological approach to estimating the costs of informal care is therefore important for a properly conducted economic evaluation in the health care sector.

  4. Cost Estimates for the Decontamination and Decommissioning of Eight ORNL Buildings

    SciTech Connect

    Hogan, M.

    2006-07-01

    The U.S. Department of Energy's Oak Ridge National Laboratory (ORNL) contains a number of buildings that are antiquated and no longer used. These buildings historically were used for the production of atomic weapons and often remain contaminated with radioactive materials. Certain costs and risks are associated with the long-term stewardship of the buildings. One way to reduce these liabilities is to eliminate the buildings that are no longer in use and are not expected to be used in the future. Some of these buildings at ORNL are located in an area known as 'Isotope Circle'. From this area, eight buildings that are expected to be decontaminated and decommissioned (D&D) in the next five to ten years were chosen to have cost estimates completed. The specific facilities are Buildings 3030, 3031, 3118, 3032, 3033, 3033 Annex, 3034, and 3093. There are many challenges in estimating the costs to D&D buildings potentially contaminated with radionuclides. Each building is unique, has various types and levels of contamination, and (as in this case) often lacks up-to-date information. Because of these limitations, order-of-magnitude cost estimates for each of the eight ORNL buildings were completed using parametric cost modeling software known as RACER(TM) (Remedial Action Cost Engineering and Requirements System). This type of cost estimate is useful for screening technical concepts and is used for budgetary planning. For the eight buildings evaluated in this study, the total cost to D&D was estimated to be nearly $6 M. This value includes the direct cost of approximately $3.5 M to complete D&D and $2.5 M in cost markups. Also, assuming the actual project does not begin until the year 2010, this total cost is escalated to almost $6.7 M, which accounts for expected inflation. Although the cost estimates in this study were expected to have a wide range in accuracy, there are various factors that could impact these estimates in a negative or positive
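
    The escalation step mentioned above (from roughly $6 M in current-year dollars to about $6.7 M by a 2010 start) follows the usual compound-escalation formula. The roughly 2.8% annual rate below is back-calculated for illustration, not taken from the report.

        # Compound escalation of a base-year cost estimate to the project start year.
        base_cost_musd = 6.0        # estimate in base-year dollars
        annual_escalation = 0.028   # assumed escalation rate (illustrative)
        years_to_start = 4          # e.g., a 2006 estimate escalated to a 2010 start

        escalated = base_cost_musd * (1 + annual_escalation) ** years_to_start
        print(f"escalated cost: ${escalated:.2f}M")   # ~ $6.7M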

  5. Bread Basket: a gaming model for estimating home-energy costs

    SciTech Connect

    Not Available

    1982-01-01

    An instructional manual for answering the twenty variables on COLORADO ENERGY's computerized program estimating home energy costs. The program will generate home-energy cost estimates based on individual household data, such as total square footage, number of windows and doors, number and variety of appliances, heating system design, etc., and will print out detailed costs, showing the percentages of the total household budget that energy costs will amount to over a twenty-year span. Using the program, homeowners and policymakers alike can predict the effects of rising energy prices on total spending by Colorado households.

  6. Cost estimate of hospital stays for premature newborns in a public tertiary hospital in Brazil

    PubMed Central

    Desgualdo, Claudia Maria; Riera, Rachel; Zucchi, Paola

    2011-01-01

    OBJECTIVES: To estimate the direct costs of hospital stays for premature newborns in the Interlagos Hospital and Maternity Center in São Paulo, Brazil and to assess the difference between the amount reimbursed to the hospital by the Unified Health System and the real cost of care for each premature newborn. METHODS: A cost-estimate study in which hospital and professional costs were estimated for premature infants born at 22 to 36 weeks gestation during the calendar year of 2004 and surviving beyond one hour of age. Direct costs included hospital services, professional care, diagnoses and therapy, orthotics, prosthetics, special materials, and blood products. Costs were estimated using tables published by the Unified Health System and the Brasíndice as well as the list of medical procedures provided by the Brazilian Classification of Medical Procedures. RESULTS: The average direct cost of care for initial hospitalization of a premature newborn in 2004 was $2,386 USD. Total hospital expenses and professional services for all premature infants in this hospital were $227,000 and $69,500 USD, respectively. The costs for diagnostic testing and blood products for all premature infants totaled $22,440 and $1,833 USD. The daily average cost of a premature newborn weighing less than 1,000 g was $115 USD, and the daily average cost of a premature newborn weighing more than 2,500 g was $89 USD. Amounts reimbursed to the hospital by the Unified Health System corresponded to only 27.42% of the real cost of care. CONCLUSIONS: The cost of hospital stays for premature newborns was much greater than the amount reimbursed to the hospital by the Unified Health System. The highest costs corresponded to newborns with lower birth weight. Hospital costs progressively and discretely decreased as the newborns' weight increased. PMID:22012050

  7. Motion estimation by integrated low cost system (vision and MEMS) for positioning of a scooter "Vespa"

    NASA Astrophysics Data System (ADS)

    Guarnieri, A.; Milan, N.; Pirotti, F.; Vettore, A.

    2011-12-01

    In the automotive sector, especially over the last decade, a growing number of investigations have examined electronic systems that monitor and correct driver behavior, thereby increasing road safety. The possibility of identifying the vehicle position with high accuracy in a mapping reference frame, for driving directions and best-route analysis, is another topic attracting considerable interest from the research and development sector. To achieve accurate vehicle positioning and integrate response events, the position, orientation, and velocity of the system must be estimated at each time step. To this aim, low-cost GPS and MEMS sensors can be used. Compared with a four-wheel vehicle, the dynamics of a two-wheel vehicle (e.g., a scooter) are more complex: more degrees of freedom must be taken into account to describe its motion. For example, a scooter can twist sideways, generating a roll angle, and a slight pitch angle must be considered as well, since the wheel suspensions have a greater range of motion than those of four-wheel vehicles. In this paper we present a method for the accurate reconstruction of the trajectory of a motorcycle (a "Vespa" scooter), which can be used as an alternative to the "classical" approach based on the integration of GPS and INS sensors. Position and orientation of the scooter are derived from MEMS data and images acquired by an on-board digital camera. A Bayesian filter provides the means for integrating the data from the MEMS-based orientation sensor and the GPS receiver.
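
    As a much-simplified illustration of fusing MEMS data for orientation (not the paper's full Bayesian vision/GPS/MEMS filter), a complementary filter blends the integrated gyro roll rate with the accelerometer-derived roll angle. All signals below are synthetic and the blending constant is an assumption.

        import numpy as np

        dt = 0.01                                   # 100 Hz IMU sampling
        t = np.arange(0, 10, dt)
        true_roll = 0.3 * np.sin(0.5 * t)           # synthetic roll angle (rad)

        rng = np.random.default_rng(0)
        gyro = np.gradient(true_roll, dt) + rng.normal(0, 0.02, t.size) + 0.01   # rate + noise + bias
        accel_roll = true_roll + rng.normal(0, 0.05, t.size)                     # noisy absolute angle

        alpha = 0.98                                # trust the gyro on short time scales
        roll_est = np.zeros_like(t)
        for k in range(1, t.size):
            gyro_pred = roll_est[k - 1] + gyro[k] * dt                    # integrate the rate gyro
            roll_est[k] = alpha * gyro_pred + (1 - alpha) * accel_roll[k] # correct the drift

        print("RMS error (rad):", np.sqrt(np.mean((roll_est - true_roll) ** 2)))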

  8. Estimating the financial cost of chronic kidney disease to the NHS in England

    PubMed Central

    Kerr, Marion; Bray, Benjamin; Medcalf, James; O'Donoghue, Donal J.; Matthews, Beverley

    2012-01-01

    Background Chronic kidney disease (CKD) is a major challenge for health care systems around the world, and the prevalence rates appear to be increasing. We estimate the costs of CKD in a universal health care system. Methods Economic modelling was used to estimate the annual cost of Stages 3–5 CKD to the National Health Service (NHS) in England, including CKD-related prescribing and care, renal replacement therapy (RRT), and excess strokes, myocardial infarctions (MIs) and Methicillin-Resistant Staphylococcus Aureus (MRSA) infections in people with CKD. Results The cost of CKD to the English NHS in 2009–10 is estimated at £1.44 to £1.45 billion, which is ∼1.3% of all NHS spending in that year. More than half this sum was spent on RRT, which was provided for 2% of the CKD population. The economic model estimates that ∼7000 excess strokes and 12 000 excess MIs occurred in the CKD population in 2009–10, relative to an age- and gender-matched population without CKD. The cost of excess strokes and MIs is estimated at £174–£178 million. Conclusions The financial impact of CKD is large, with particularly high costs relating to RRT and cardiovascular complications. It is hoped that these detailed cost estimates will be useful in analysing the cost-effectiveness of treatments for CKD. PMID:22815543

  9. SPS susceptible-system cost factors investment summary and mitigation-cost-increment estimates

    SciTech Connect

    Morrison, E L

    1980-05-01

    The Electromagnetic Compatibility (EMC) evaluation program supporting the SPS Concept Development Evaluation Phase has included examinations of the degradation in capability of all susceptible communications and electronic systems that could be exposed to SPS emissions, the development and testing of mitigation techniques to allow operation in the SPS environment, and the development of total investment and mitigation cost data. Mitigation costs relate only to the modification or reconfiguration of susceptible systems; redeployment being a possible consideration for rectenna siting exercises during the SPS Engineering Development Phase. An extensive survey is summarized regarding the current and planned facilities using the equipment categories listed: microwave communications; radar systems; sensors; computers; medical equipment; and research support. Current investment, future plans, and mitigation costs are presented, with geographic distribution in six CONUS areas.

  10. Technology Cost and Schedule Estimation (TCASE) Final Report

    NASA Technical Reports Server (NTRS)

    Wallace, Jon; Schaffer, Mark

    2015-01-01

    During the 2014-2015 project year, the focus of the TCASE project shifted from collecting historical data from many sources to securing a data pipeline between TCASE and NASA's widely used TechPort system. TCASE v1.0 implements a data import solution that was achievable within the project scope while still providing the basis for a long-term ability to keep TCASE in sync with TechPort. Conclusion: TCASE data quantity is adequate, and the established data pipeline will enable future growth. Data quality is now highly dependent on the quality of data in TechPort. Recommendation: Technology development organizations within NASA should continue to work closely with project/program data tracking and archiving efforts (e.g., TechPort) to ensure that the right data are being captured at the appropriate quality level. TCASE would greatly benefit, for example, if project cost/budget information were included in TechPort in the future.

  11. The social cost of rheumatoid arthritis in Italy: the results of an estimation exercise.

    PubMed

    Turchetti, G; Bellelli, S; Mosca, M

    2014-03-14

    The objective of this study is to estimate the mean annual social cost per adult patient and the total social cost of rheumatoid arthritis (RA) in Italy. A literature review was performed by searching primary economic studies on adults in order to collect cost data for RA in Italy over the last decade. The review results were merged with data from institutional sources to estimate the social cost of RA in Italy, following the methodological steps of a cost-of-illness analysis. The mean annual social cost of RA was €13,595 per adult patient in Italy. Affecting 259,795 persons, RA generates a social cost of €3.5 billion in Italy. Non-medical direct costs and indirect costs represent the main cost items (48% and 31%, respectively) of the total social cost of RA in Italy. Based on these results, it is evident that assessing the economic burden of RA solely on the basis of direct medical costs gives a limited view of the phenomenon.

  12. Statistical estimation of service cracks and maintenance cost for aircraft structures

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1975-01-01

    A method is developed for the statistical estimation of the number of cracks to be repaired in service as well as the repair and the maintenance costs. The present approach accounts for the statistical distribution of the initial crack size, the statistical nature of the NDI technique used for detecting the crack, and the renewal process for the crack propagation of repaired cracks. The mean and the standard deviation of the cumulative number of cracks to be repaired are computed as a function of service time. The statistics of the costs of repair and maintenance, expressed in terms of the percentage of the cost of replacement, are estimated as a function of service time. The results of the present study provide relevant information for the decision of fleet management, the estimation of life cycle cost, and procurement specifications. The present study is essential to the design and cost optimization of aircraft structures.
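
    A Monte Carlo sketch of the ingredients described above: a distribution of initial crack sizes, crack growth with service time, a probability-of-detection (POD) curve for the NDI inspections, and renewal (repair) of detected cracks. The distributions, growth law, and cost fraction are illustrative assumptions, not the paper's models.

        import numpy as np

        rng = np.random.default_rng(7)
        n_structures = 2000                               # Monte Carlo samples of one airframe
        inspections = np.arange(1000, 11000, 1000)        # inspection times (flight hours)
        growth_rate = 3e-4                                # crack growth, mm per flight hour (assumed)
        repair_cost_frac = 0.02                           # repair cost as a fraction of replacement cost

        def prob_of_detection(size_mm):
            """Simple NDI probability-of-detection curve: larger cracks are easier to find."""
            return 1.0 - np.exp(-size_mm / 2.0)

        crack = rng.lognormal(mean=np.log(0.3), sigma=0.5, size=n_structures)   # initial crack sizes (mm)
        repairs = np.zeros(n_structures)
        last_t = 0.0
        for t in inspections:
            crack += growth_rate * (t - last_t)                                  # growth since last check
            found = rng.random(n_structures) < prob_of_detection(crack)
            repairs += found
            crack[found] = rng.lognormal(np.log(0.3), 0.5, found.sum())          # renewal after repair
            last_t = t

        print("cumulative repairs per structure: mean", repairs.mean(), "std", repairs.std())
        print("expected maintenance cost (% of replacement):", 100 * repair_cost_frac * repairs.mean())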

  13. Twice-weighted multiple interval estimation of a marginal structural model to analyze cost-effectiveness.

    PubMed

    Goldfeld, K S

    2014-03-30

    Cost-effectiveness analysis is an important tool that can be applied to the evaluation of a health treatment or policy. When the observed costs and outcomes result from a nonrandomized treatment, making causal inference about the effects of the treatment requires special care. The challenges are compounded when the observation period is truncated for some of the study subjects. This paper presents a method for unbiased estimation of cost-effectiveness using observational study data that are not fully observed. The method, twice-weighted multiple interval estimation of a marginal structural model, was developed to analyze the cost-effectiveness of treatment protocols for advanced dementia residents living in nursing homes when they become acutely ill. A key feature of this estimation approach is that it facilitates a sensitivity analysis that identifies the potential effects of unmeasured confounding on the conclusions concerning cost-effectiveness.

  14. 48 CFR 1615.406-2 - Certificate of accurate cost or pricing data for community-rated carriers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES HEALTH BENEFITS ACQUISITION REGULATION... Community-Rated Carriers This is to certify that, to the best of my knowledge and belief: (1) The cost or... the ____* FEHB Program rates were developed in accordance with the requirements of 48 CFR Chapter...

  15. 48 CFR 1615.406-2 - Certificates of accurate cost or pricing data for community rated carriers.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES HEALTH BENEFITS ACQUISITION REGULATION... certify that, to the best of my knowledge and belief: (1) The cost or pricing data submitted (or, if not... were developed in accordance with the requirements of 48 CFR Chapter 16 and the FEHB Program...

  16. Bureau of mines cost estimating system handbook (in two parts). 1. Surface and underground mining

    SciTech Connect

    Not Available

    1987-01-01

    The handbook provides a convenient costing procedure based on the summation of the costs for unit processes required in any particular mining or mineral processing operation. The costing handbook consists of a series of costing sections, each corresponding to a specific mining unit process. Contained within each section is the methodology to estimate either the capital or operating cost for that unit process. The unit process sections may be used to generate, in January 1984 dollars, costs through the use of either costing curves or formulae representing the prevailing technology. Coverage for surface mining includes dredging, quarrying, strip mining, and open pit mining. The underground mining includes individual development sections for drifting, raising, shaft sinking, stope development, various mining methods, underground mine haulage, general plant, and underground mine administrative cost.

  17. Estimating the human recovery costs of seriously injured road crash casualties.

    PubMed

    Bambach, M R; Mitchell, R J

    2015-12-01

    Road crashes result in substantial trauma and costs to societies around the world. Robust costing methods are an important tool for estimating the costs associated with road trauma, and they are key inputs into policy development and cost-benefit analysis for road safety programmes and infrastructure projects. With an expanding focus on seriously injured road crash casualties, in addition to the long-standing focus on fatalities, methods for costing seriously injured casualties are becoming increasingly important. Some road safety agencies define a seriously injured casualty as an individual admitted to hospital following a road crash, so hospital separation data provide substantial potential for estimating the costs associated with seriously injured road crash casualties. The aim of this study is to establish techniques for estimating the human recovery costs of (non-fatal) seriously injured road crash casualties directly from hospital separation data. Individuals' road crash-related hospitalisation records and their personal injury insurance claims were linked for road crashes that occurred in New South Wales, Australia. These records provided the means for estimating all of the costs to the casualty directly related to recovery from their injuries. A total of 10,897 seriously injured road crash casualties were identified, and four methods for estimating their recovery costs were examined, using either unit record or aggregated hospital separation data. The methods are shown to provide robust techniques for estimating the human recovery costs of seriously injured road crash casualties, which may prove useful for identifying, implementing and evaluating safety programmes intended to reduce the incidence of road crash-related serious injuries.

  18. Costs of cervical cancer treatment: population-based estimates from Ontario

    PubMed Central

    Pendrith, C.; Thind, A.; Zaric, G.S.; Sarma, S.

    2016-01-01

    Objectives The objectives of the present study were to estimate the overall and specific medical care costs associated with cervical cancer in the first 5 years after diagnosis in Ontario. Methods Incident cases of invasive cervical cancer during 2007–2010 were identified from the Ontario Cancer Registry and linked to administrative databases held at the Institute for Clinical Evaluative Sciences. Mean costs in 2010 Canadian dollars were estimated using the arithmetic mean and estimators that adjust for censored data. Results Mean age of the patients in the study cohort (779 cases) was 49.3 years. The mean overall medical care cost was $39,187 [standard error (se): $1,327] in the 1st year after diagnosis. Costs in year 1 ranged from $34,648 (se: $1,275) for those who survived at least 1 year to $69,142 (se: $4,818) for those who died from cervical cancer within 1 year. At 5 years after diagnosis, the mean overall unadjusted cost was $63,131 (se: $3,131), and the cost adjusted for censoring was $68,745 (se: $2,963). Inpatient hospitalizations and cancer-related care were the two largest components of cancer treatment costs. Conclusions We found that the estimated mean costs that did not account for censoring were consistently undervalued, highlighting the importance of estimates based on censoring-adjusted costs in cervical cancer. Our results are reliable for estimating the economic burden of cervical cancer and the cost-effectiveness of cervical cancer prevention strategies. PMID:27122978
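
    One standard censoring adjustment of the kind referenced here is the inverse-probability-of-censoring weighted (IPCW) mean, in the spirit of the simple weighted estimator of Bang and Tsiatis: complete cases are up-weighted by the Kaplan-Meier probability of remaining uncensored at their accrual time. The sketch below uses synthetic data and a deliberately simple Kaplan-Meier implementation; it is an illustration of the general technique, not the study's estimator.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 1000
        t_event = rng.exponential(3.0, n)                 # years of cost accrual if uncensored
        t_cens = rng.uniform(1.0, 6.0, n)                 # censoring (end of follow-up)
        t_obs = np.minimum(t_event, t_cens)
        complete = (t_event <= t_cens)                    # True: total cost fully observed
        total_cost = 10_000 + 8_000 * t_event + rng.normal(0, 2_000, n)

        def censoring_survival(t_obs, complete, eval_times):
            """Kaplan-Meier estimate of P(censoring time > t): treat censoring as the event."""
            order = np.argsort(t_obs)
            t_sorted = t_obs[order]
            cens_event = (~complete)[order].astype(float)
            surv = np.empty(eval_times.size)
            for j, t in enumerate(eval_times):
                s = 1.0
                for i, ti in enumerate(t_sorted):
                    if ti > t:
                        break
                    if cens_event[i]:
                        s *= 1.0 - 1.0 / (t_sorted.size - i)
                surv[j] = s
            return surv

        K = censoring_survival(t_obs, complete, t_obs)
        naive = total_cost[complete].mean()                                   # complete cases only
        ipcw = np.sum(complete * total_cost / np.clip(K, 0.05, None)) / n     # weighted estimator
        print(f"complete-case mean: {naive:,.0f}   IPCW-adjusted mean: {ipcw:,.0f}")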

  19. Double robust estimator of average causal treatment effect for censored medical cost data.

    PubMed

    Wang, Xuan; Beste, Lauren A; Maier, Marissa M; Zhou, Xiao-Hua

    2016-08-15

    In observational studies, estimation of average causal treatment effect on a patient's response should adjust for confounders that are associated with both treatment exposure and response. In addition, the response, such as medical cost, may have incomplete follow-up. In this article, a double robust estimator is proposed for average causal treatment effect for right censored medical cost data. The estimator is double robust in the sense that it remains consistent when either the model for the treatment assignment or the regression model for the response is correctly specified. Double robust estimators increase the likelihood the results will represent a valid inference. Asymptotic normality is obtained for the proposed estimator, and an estimator for the asymptotic variance is also derived. Simulation studies show good finite sample performance of the proposed estimator and a real data analysis using the proposed method is provided as illustration. Copyright © 2016 John Wiley & Sons, Ltd.
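
    The double robust (augmented inverse-probability-weighted) construction the abstract describes combines a propensity model for treatment assignment with an outcome regression, and stays consistent if either model is correctly specified. The sketch below uses uncensored synthetic costs for brevity, so it omits the additional censoring weights the paper handles.

        import numpy as np
        from sklearn.linear_model import LinearRegression, LogisticRegression

        rng = np.random.default_rng(3)
        n = 5000
        x = rng.normal(size=(n, 2))                              # confounders
        p_treat = 1 / (1 + np.exp(-(0.5 * x[:, 0] - 0.5 * x[:, 1])))
        z = rng.binomial(1, p_treat)                             # treatment assignment
        cost = 20_000 + 3_000 * z + 4_000 * x[:, 0] + 2_000 * x[:, 1] + rng.normal(0, 1_000, n)

        e = LogisticRegression().fit(x, z).predict_proba(x)[:, 1]          # propensity model
        m1 = LinearRegression().fit(x[z == 1], cost[z == 1]).predict(x)    # outcome model, treated
        m0 = LinearRegression().fit(x[z == 0], cost[z == 0]).predict(x)    # outcome model, control

        # Augmented IPW estimator of the average causal treatment effect on cost.
        mu1 = np.mean(z * (cost - m1) / e + m1)
        mu0 = np.mean((1 - z) * (cost - m0) / (1 - e) + m0)
        print(f"estimated average treatment effect on cost: {mu1 - mu0:,.0f}")   # ~ 3,000 in this synthetic setup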

  20. Double robust estimator of average causal treatment effect for censored medical cost data.

    PubMed

    Wang, Xuan; Beste, Lauren A; Maier, Marissa M; Zhou, Xiao-Hua

    2016-08-15

    In observational studies, estimation of average causal treatment effect on a patient's response should adjust for confounders that are associated with both treatment exposure and response. In addition, the response, such as medical cost, may have incomplete follow-up. In this article, a double robust estimator is proposed for average causal treatment effect for right censored medical cost data. The estimator is double robust in the sense that it remains consistent when either the model for the treatment assignment or the regression model for the response is correctly specified. Double robust estimators increase the likelihood the results will represent a valid inference. Asymptotic normality is obtained for the proposed estimator, and an estimator for the asymptotic variance is also derived. Simulation studies show good finite sample performance of the proposed estimator and a real data analysis using the proposed method is provided as illustration. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26818601

  1. Estimating the Environmental Costs of Africa's Massive "Development Corridors".

    PubMed

    Laurance, William F; Sloan, Sean; Weng, Lingfei; Sayer, Jeffrey A

    2015-12-21

    In sub-Saharan Africa, dozens of major "development corridors" have been proposed or are being created to increase agricultural production [1-4], mineral exports [5-7], and economic integration. The corridors involve large-scale expansion of infrastructure such as roads, railroads, pipelines, and port facilities and will open up extensive areas of land to new environmental pressures [1, 4, 8]. We assessed the potential environmental impacts of 33 planned or existing corridors that, if completed, would total over 53,000 km in length and crisscross much of the African continent. We mapped each corridor and estimated human occupancy (using the distribution of persistent night-lights) and environmental values (endangered and endemic vertebrates, plant diversity, critical habitats, carbon storage, and climate-regulation services) inside a 50-km-wide band overlaid onto each corridor. We also assessed the potential for each corridor to facilitate increases in agricultural production. The corridors varied considerably in their environmental values, and many were only sparsely populated. Because of marginal soils or climates, some corridors appear to have only modest agricultural potential. Collectively, the corridors would bisect over 400 existing protected areas and could degrade a further ~1,800 by promoting habitat disruption near or inside the reserves. We conclude that many of the development corridors will promote serious and largely irreversible environmental changes and should proceed only if rigorous mitigation and protection measures can be employed. Some planned corridors with high environmental values and limited agricultural benefits should possibly be cancelled altogether. VIDEO ABSTRACT.

  2. Estimating the Environmental Costs of Africa's Massive "Development Corridors".

    PubMed

    Laurance, William F; Sloan, Sean; Weng, Lingfei; Sayer, Jeffrey A

    2015-12-21

    In sub-Saharan Africa, dozens of major "development corridors" have been proposed or are being created to increase agricultural production [1-4], mineral exports [5-7], and economic integration. The corridors involve large-scale expansion of infrastructure such as roads, railroads, pipelines, and port facilities and will open up extensive areas of land to new environmental pressures [1, 4, 8]. We assessed the potential environmental impacts of 33 planned or existing corridors that, if completed, would total over 53,000 km in length and crisscross much of the African continent. We mapped each corridor and estimated human occupancy (using the distribution of persistent night-lights) and environmental values (endangered and endemic vertebrates, plant diversity, critical habitats, carbon storage, and climate-regulation services) inside a 50-km-wide band overlaid onto each corridor. We also assessed the potential for each corridor to facilitate increases in agricultural production. The corridors varied considerably in their environmental values, and many were only sparsely populated. Because of marginal soils or climates, some corridors appear to have only modest agricultural potential. Collectively, the corridors would bisect over 400 existing protected areas and could degrade a further ~1,800 by promoting habitat disruption near or inside the reserves. We conclude that many of the development corridors will promote serious and largely irreversible environmental changes and should proceed only if rigorous mitigation and protection measures can be employed. Some planned corridors with high environmental values and limited agricultural benefits should possibly be cancelled altogether. VIDEO ABSTRACT. PMID:26628009

  3. Price Estimation Guidelines

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.; Aster, R. W.; Firnett, P. J.; Miller, M. A.

    1985-01-01

    The Improved Price Estimation Guidelines (IPEG4) program provides a comparatively simple yet relatively accurate estimate of the price of a manufactured product. IPEG4 processes user-supplied input data to determine an estimate of the price per unit of production. Input data include equipment cost, space required, labor cost, materials and supplies cost, utility expenses, and production volume, on an industry-wide or process-wide basis.
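
    A minimal sketch of the kind of unit-price calculation such a program performs: annualize each cost category with a multiplier and divide by the production volume. The multipliers and inputs below are hypothetical placeholders, not the coefficients used in IPEG4.

        def unit_price(equipment, floor_space_m2, direct_labor, materials, utilities, annual_units,
                       k_eqpt=0.5, k_space=100.0, k_labor=2.1, k_mats=1.2, k_util=1.2):
            """Annualized cost per unit of production (all coefficients are illustrative)."""
            annual_cost = (k_eqpt * equipment             # capital recovery on equipment
                           + k_space * floor_space_m2     # facility charge per square metre
                           + k_labor * direct_labor       # labor with overheads
                           + k_mats * materials           # materials and supplies
                           + k_util * utilities)          # utility expenses
            return annual_cost / annual_units

        print(unit_price(equipment=2_000_000, floor_space_m2=800, direct_labor=600_000,
                         materials=900_000, utilities=150_000, annual_units=50_000))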

  4. Cost and price estimate of Brayton and Stirling engines in selected production volumes

    NASA Technical Reports Server (NTRS)

    Fortgang, H. R.; Mayers, H. F.

    1980-01-01

    The methods used to determine the production costs and required selling price of Brayton and Stirling engines modified for use in solar power conversion units are presented. Each engine part, component and assembly was examined and evaluated to determine the costs of its material and the method of manufacture based on specific annual production volumes. Cost estimates are presented for both the Stirling and Brayton engines in annual production volumes of 1,000, 25,000, 100,000 and 400,000. At annual production volumes above 50,000 units, the costs of both engines are similar, although the Stirling engine costs are somewhat lower. It is concluded that modifications to both the Brayton and Stirling engine designs could reduce the estimated costs.

  5. The Cost of Crime to Society: New Crime-Specific Estimates for Policy and Program Evaluation

    PubMed Central

    French, Michael T.; Fang, Hai

    2010-01-01

    Estimating the cost to society of individual crimes is essential to the economic evaluation of many social programs, such as substance abuse treatment and community policing. A review of the crime-costing literature reveals multiple sources, including published articles and government reports, which collectively represent the alternative approaches for estimating the economic losses associated with criminal activity. Many of these sources are based upon data that are more than ten years old, indicating a need for updated figures. This study presents a comprehensive methodology for calculating the cost to society of various criminal acts. Tangible and intangible losses are estimated using the most current data available. The selected approach, which incorporates both the cost-of-illness and the jury compensation methods, yields cost estimates for more than a dozen major crime categories, including several categories not found in previous studies. Updated crime cost estimates can help government agencies and other organizations execute more prudent policy evaluations, particularly benefit-cost analyses of substance abuse treatment or other interventions that reduce crime. PMID:20071107

  6. How Much Does Intellectual Disability Really Cost? First Estimates for Australia

    ERIC Educational Resources Information Center

    Doran, Christopher M.; Einfeld, Stewart L.; Madden, Rosamond H.; Otim, Michael; Horstead, Sian K.; Ellis, Louise A.; Emerson, Eric

    2012-01-01

    Background: Given the paucity of relevant data, this study estimates the cost of intellectual disability (ID) to families and the government in Australia. Method: Family costs were collected via the Client Service Receipt Inventory, recording information relating to service use and personal expense as a consequence of ID. Government expenditure on…

  7. A model for estimating the cost impact of schedule perturbations on aerospace research and development programs

    NASA Technical Reports Server (NTRS)

    Bishop, D. F.

    1972-01-01

    The problem of determining the cost impact attributable to perturbations in an aerospace R&D program schedule is discussed in terms of the diminishing availability of funds. The methodology behind a model for updating R&D cost estimates as a function of perturbations in program time is presented.

  8. 78 FR 61227 - Public Assistance Cost Estimating Format for Large Projects

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-03

    ...--Construction Cost Contingencies/Uncertainties (Design and Construction) 5. Part D Factor--General Contractor's... Project Management and Design Costs 10. Summary and Application of the Parts B Through H Factors III. The... Results for Each Large Project Estimated by the CEF 8. The Engineering and Design Services Curves (A and...

  9. The Effect of Infrastructure Sharing in Estimating Operations Cost of Future Space Transportation Systems

    NASA Technical Reports Server (NTRS)

    Sundaram, Meenakshi

    2005-01-01

    NASA and the aerospace industry are extremely serious about reducing the cost and improving the performance of launch vehicles, both manned and unmanned. In the aerospace industry, sharing infrastructure for manufacturing more than one type of spacecraft is becoming a trend to achieve economies of scale. An example is the Boeing Decatur facility, where both Delta II and Delta IV launch vehicles are made. The author is not sure how Boeing estimates the costs of each spacecraft made in the same facility. Regardless of how a contractor estimates the cost, NASA's popular cost estimating tool, the NASA/Air Force Cost Model (NAFCOM), must have a built-in method to account for the effect of infrastructure sharing. Since there is no provision in the most recent version, NAFCOM2002, to take care of this, the Engineering Cost Community at MSFC has found that the tool overestimates the manufacturing cost by as much as 30%. Therefore, the objective of this study is to develop a methodology to assess the impact of infrastructure sharing so that better operations cost estimates may be made.

  10. Two Computer Programs for Equipment Cost Estimation and Economic Evaluation of Chemical Processes.

    ERIC Educational Resources Information Center

    Kuri, Carlos J.; Corripio, Armando B.

    1984-01-01

    Describes two computer programs for use in process design courses: an easy-to-use equipment cost estimation program based on latest cost correlations available and an economic evaluation program which calculates two profitability indices. Comparisons between programed and hand-calculated results are included. (JM)

  11. Estimating the Full Cost of Family-Financed Time Inputs to Education.

    ERIC Educational Resources Information Center

    Levine, Victor

    This paper presents a methodology for estimating the full cost of parental time allocated to child-care activities at home. Building upon the human capital hypothesis, a model is developed in which the cost of an hour diverted from labor market activity is seen as consisting of three components: 1) direct wages foregone; 2) investments in…

  12. Preliminary weight and cost estimates for transport aircraft composite structural design concepts

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Preliminary weight and cost estimates have been prepared for design concepts utilized for a transonic long range transport airframe with extensive applications of advanced composite materials. The design concepts, manufacturing approach, and anticipated details of manufacturing cost reflected in the composite airframe are substantially different from those found in conventional metal structure and offer further evidence of the advantages of advanced composite materials.

  13. Observing Volcanic Thermal Anomalies from Space: How Accurate is the Estimation of the Hotspot's Size and Temperature?

    NASA Astrophysics Data System (ADS)

    Zaksek, K.; Pick, L.; Lombardo, V.; Hort, M. K.

    2015-12-01

    Measuring the heat emission from active volcanic features on the basis of infrared satellite images contributes to the volcano's hazard assessment. Because these thermal anomalies occupy only a small fraction (<1%) of a typically resolved target pixel (e.g., from Landsat 7 or MODIS), accurately determining the hotspot's size and temperature is problematic. Conventionally this is overcome by comparing observations in at least two separate infrared spectral wavebands (the Dual-Band method). We investigate the resolution limits of this thermal un-mixing technique by means of a uniquely designed indoor analog experiment. Therein the volcanic feature is simulated by an electrical heating alloy of 0.5 mm diameter installed on a plywood panel of high emissivity. Two thermographic cameras (VarioCam high resolution and ImageIR 8300 by Infratec) record images of the artificial heat source in wavebands comparable to those available from satellite data. These range from the short-wave infrared (1.4-3 µm) through the mid-wave infrared (3-8 µm) to the thermal infrared (8-15 µm). In the experiment the pixel fraction of the hotspot was successively reduced by increasing the camera-to-target distance from 3 m to 35 m. On the basis of an individual target pixel, the expected decrease of the hotspot pixel area with distance at a relatively constant wire temperature of around 600 °C was confirmed. The deviation of the hotspot's pixel fraction yielded by the Dual-Band method from the theoretically calculated one was found to be within 20% up to a target distance of 25 m. This means that a reliable estimation of the hotspot size is only possible if the hotspot is larger than about 3% of the pixel area, a resolution boundary below which most remotely sensed volcanic hotspots fall. Future efforts will focus on the investigation of a resolution limit for the hotspot's temperature by varying the alloy's amperage. Moreover, the un-mixing results for more realistic multi
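
    The Dual-Band method solves two Planck-mixture equations, one per waveband, for the hotspot's pixel fraction and temperature given a known background temperature. A minimal sketch with synthetic radiances follows; the band wavelengths, temperatures, and solver initialization are illustrative assumptions.

        import numpy as np
        from scipy.optimize import fsolve

        H, C, KB = 6.626e-34, 2.998e8, 1.381e-23

        def planck(wavelength_m, temp_k):
            """Spectral radiance B(lambda, T) in W m^-3 sr^-1."""
            return (2 * H * C**2 / wavelength_m**5
                    / (np.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0))

        bands = np.array([4.0e-6, 11.0e-6])     # mid-wave and thermal infrared bands (m)
        t_background = 290.0                    # assumed known background temperature (K)

        def pixel_radiance(frac, t_hot):
            """Mixed-pixel radiance in each band: hot fraction plus background remainder."""
            return frac * planck(bands, t_hot) + (1 - frac) * planck(bands, t_background)

        # Synthetic "observation" from a known hotspot, then recover it.
        observed = pixel_radiance(0.02, 900.0)

        def residuals(params):
            frac, t_hot = params
            return pixel_radiance(frac, t_hot) - observed

        frac_est, t_hot_est = fsolve(residuals, x0=[0.1, 600.0])
        print(f"recovered fraction: {frac_est:.3f}, temperature: {t_hot_est:.0f} K")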

  14. Cost estimation for solid waste management in industrialising regions--precedents, problems and prospects.

    PubMed

    Parthan, Shantha R; Milke, Mark W; Wilson, David C; Cocks, John H

    2012-03-01

    The importance of cost planning for solid waste management (SWM) in industrialising regions (IR) is not well recognised. The approaches used to estimate costs of SWM can broadly be classified into three categories - the unit cost method, benchmarking techniques and developing cost models using sub-approaches such as cost and production function analysis. These methods have been developed into computer programmes with varying functionality and utility. IR mostly use the unit cost and benchmarking approach to estimate their SWM costs. The models for cost estimation, on the other hand, are used at times in industrialised countries, but not in IR. Taken together, these approaches could be viewed as precedents that can be modified appropriately to suit waste management systems in IR. The main challenges (or problems) one might face while attempting to do so are a lack of cost data, and a lack of quality for what data do exist. There are practical benefits to planners in IR where solid waste problems are critical and budgets are limited.

  15. Estimating development cost for a tailored interactive computer program to enhance colorectal cancer screening compliance.

    PubMed

    Lairson, David R; Chang, Yu-Chia; Bettencourt, Judith L; Vernon, Sally W; Greisinger, Anthony

    2006-01-01

    The authors used an actual-work estimate method to estimate the cost of developing a tailored interactive computer education program to improve compliance with colorectal cancer screening guidelines in a large multi-specialty group medical practice. Resource use was prospectively collected from time logs, administrative records, and a design and computing subcontract. Sensitivity analysis was performed to examine the uncertainty of the overhead cost rate and other parameters. The cost of developing the system was $328,866. The development cost was $52.79 per patient when amortized over a 7-year period with a cohort of 1,000 persons. About 20% of the cost was incurred in defining the theoretic framework and supporting literature, constructing the variables and survey, and conducting focus groups. About 41% of the cost was for developing the messages, algorithms, and constructing program elements, and the remaining cost was to create and test the computer education program. About 69% of the cost was attributable to personnel expenses. Development cost is rarely estimated but is important for feasibility studies and ex-ante economic evaluations of alternative interventions. The findings from this study may aid decision makers in planning, assessing, budgeting, and pricing development of tailored interactive computer-based interventions. PMID:16799126
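
    The per-patient figure follows from annualizing the development cost with a capital-recovery (annuity) factor and dividing by the annual cohort. Assuming a 3% discount rate reproduces the reported $52.79; the rate itself is an assumption made for illustration.

        def amortized_cost_per_patient(total_cost, years, annual_cohort, discount_rate):
            """Annualize with a capital-recovery factor, then spread over the yearly cohort."""
            annuity_factor = (1 - (1 + discount_rate) ** -years) / discount_rate
            return total_cost / annuity_factor / annual_cohort

        print(amortized_cost_per_patient(328_866, years=7, annual_cohort=1_000, discount_rate=0.03))
        # -> roughly 52.79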

  16. Estimating the cost-savings associated with bundling maternal and child health interventions: a proposed methodology

    PubMed Central

    2013-01-01

    Background There is a pressing need to include cost data in the Lives Saved Tool (LiST). This paper proposes a method that combines data from both the WHO CHOosing Interventions that are Cost-Effective (CHOICE) database and the OneHealth Tool (OHT) to develop unit costs for delivering child and maternal health services, both alone and bundled. Methods First, a translog cost function is estimated to calculate factor shares of personnel, consumables, other direct (variable or recurrent costs excluding personnel and consumables) and indirect (capital or investment) costs. Primary source facility level data from Kenya, Namibia, South Africa, Uganda, Zambia and Zimbabwe are utilized, with separate analyses for hospitals and health centres. Second, the resulting other-direct and indirect factor shares are applied to country unit costs from the WHO CHOICE unit cost database to calculate those portions of unit cost. Third, the remainder of the costs is calculated using default data from the OHT. Fourth, we calculate the effect of bundling services by assuming that a LiST intervention visit takes an average of 20 minutes when delivered alone but only incremental time in addition to the basic visit when delivered in a bundle. Results Personnel costs account for the greatest share of costs for both hospitals and health centres at 50% and 38%, respectively. The percentages differ between hospitals and health centres for consumables (21% versus 17%), other direct (7.5% versus 6.75%), and indirect (22% versus 23%) costs. Combining the other-direct and indirect factor shares with the WHO CHOICE database and the other costs from OHT provides a comprehensive cost estimate of LiST interventions. Finally, the cost of six recommended antenatal care (ANC) interventions is $69.76 when delivered alone, but $61.18 when delivered as a bundle, a savings of $8.58 (12.2%). Conclusions This paper proposes a method for estimating a comprehensive cost of providing child and maternal health
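
    The first step above (estimating a translog cost function and deriving factor shares via the logarithmic derivative, i.e., Shephard's lemma) can be sketched on synthetic facility data with two input prices; the study itself uses four cost categories and real facility data, so the coefficients and shares below are illustrative only.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 200
        lw1 = rng.normal(0.0, 0.3, n)          # log personnel price
        lw2 = rng.normal(0.0, 0.3, n)          # log consumables price

        # Synthetic "true" translog cost with noise (illustrative parameters).
        lc = (1.0 + 0.55 * lw1 + 0.45 * lw2
              + 0.05 * lw1**2 + 0.04 * lw2**2 - 0.09 * lw1 * lw2
              + rng.normal(0, 0.05, n))

        X = sm.add_constant(np.column_stack([lw1, lw2, 0.5 * lw1**2, 0.5 * lw2**2, lw1 * lw2]))
        fit = sm.OLS(lc, X).fit()
        b0, b1, b2, g11, g22, g12 = fit.params

        # Factor share of input i: s_i = b_i + g_ii*ln w_i + g_ij*ln w_j,
        # evaluated here at the sample mean of log prices.
        s1 = b1 + g11 * lw1.mean() + g12 * lw2.mean()
        s2 = b2 + g22 * lw2.mean() + g12 * lw1.mean()
        print(f"personnel share ~ {s1:.2f}, consumables share ~ {s2:.2f}")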

  17. Estimating costs of traffic crashes and crime: tools for informed decision making.

    PubMed

    Streff, F M; Molnar, L J; Cohen, M A; Miller, T R; Rossman, S B

    1992-01-01

    Traffic crashes and crime both impose significant economic and social burdens through injury and loss of life, as well as property damage and loss. Efforts to reduce crashes and crime often result in competing demands on limited public resources. Comparable and up-to-date cost data on crashes and crime contribute to informed decisions about allocation of these resources in important ways. As a first step, cost data provide information about the magnitude of the problems of crashes and crime by allowing us to estimate associated dollar losses to society. More importantly, cost data on crashes and crime are essential to evaluating costs and benefits of various policy alternatives that compete for resources. This paper presents the first comparable comprehensive cost estimates for crashes and crime and applies them to crash and crime incidence data for Michigan to generate dollar losses for the state. An example illustrates how cost estimates can be used to evaluate costs and benefits of crash-reduction and crime-reduction policies in making resource allocation decisions. Traffic crash and selected index crime incidence data from the calendar year 1988 were obtained from the Michigan State Police. Costs for crashes and index crimes were generated and applied to incidence data to estimate dollar losses from crashes and index crimes for the state of Michigan. In 1988, index crimes in Michigan resulted in $0.8 billion in monetary costs and $2.4 billion in total monetary and nonmonetary quality-of-life costs (using the willingness-to-pay approach). Traffic crashes in Michigan resulted in $2.3 billion in monetary costs and $7.1 billion in total monetary and nonmonetary quality-of-life costs, nearly three times the costs of index crimes. Based on dollar losses to the state, the magnitude of the problem of traffic crashes clearly exceeded that of index crimes in Michigan in 1988. From a policy perspective, summing the total dollar losses from crashes or crime is of less

  18. Using Data Envelopment Analysis to Improve Estimates of Higher Education Institution's Per-Student Education Costs

    ERIC Educational Resources Information Center

    Salerno, Carlo

    2006-01-01

    This paper puts forth a data envelopment analysis (DEA) approach to estimating higher education institutions' per-student education costs (PSCs) in an effort to redress a number of methodological problems endemic to such estimations, particularly the allocation of shared expenditures between education and other institutional activities. An example…
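
    The abstract above is truncated, so the sketch below is only a generic illustration of how a DEA efficiency score can be computed (an input-oriented CCR model solved as a linear program), not the author's specification; the institutions, expenditures, and outputs are invented.

        # Generic input-oriented CCR DEA sketch (envelopment form), solved with linear
        # programming. Data are hypothetical; the paper's actual model may differ.
        import numpy as np
        from scipy.optimize import linprog

        # rows = institutions (DMUs); input: total expenditure; outputs: degrees, credit hours
        X = np.array([[120.0], [200.0], [90.0], [150.0]])                    # inputs (e.g., $M spent)
        Y = np.array([[3.0, 40.0], [4.5, 70.0], [2.8, 35.0], [3.2, 50.0]])   # outputs

        n, m = X.shape[0], X.shape[1]
        s = Y.shape[1]

        def efficiency(o):
            """Input-oriented CCR efficiency score of DMU o (1.0 = efficient)."""
            c = np.r_[1.0, np.zeros(n)]                 # minimise theta
            # inputs:  sum_j x_ij * lambda_j - theta * x_io <= 0
            A_in = np.hstack([-X[o].reshape(m, 1), X.T])
            b_in = np.zeros(m)
            # outputs: -sum_j y_rj * lambda_j <= -y_ro
            A_out = np.hstack([np.zeros((s, 1)), -Y.T])
            b_out = -Y[o]
            res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                          bounds=[(None, None)] + [(0, None)] * n)
            return res.x[0]

        for o in range(n):
            print(f"DMU {o}: efficiency = {efficiency(o):.3f}")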

  19. Handbook of estimating data, factors, and procedures. [for manufacturing cost studies]

    NASA Technical Reports Server (NTRS)

    Freeman, L. M.

    1977-01-01

    Elements to be considered in estimating production costs are discussed in this manual. Guidelines, objectives, and methods for analyzing requirements and work structure are given. Time standards for specific operations are listed for machining, sheet metal working, electroplating and metal treating; painting; silk screening, etching and encapsulating; coil winding; wire preparation and wiring; soldering; and the fabrication of etched circuits and terminal boards. The relation of the various elements of cost to the total cost as proposed for various programs by various contractors is compared with government estimates.

  20. Solar thermal technology development: Estimated market size and energy cost savings. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Gates, W. R.

    1983-01-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. Analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. Alternative fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. STT R&D is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest.

  1. Solar thermal technology development: Estimated market size and energy cost savings. Volume 1: Executive summary

    NASA Astrophysics Data System (ADS)

    Gates, W. R.

    1983-02-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. Analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. Alternative fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. STT R&D is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest.

  2. MBRidge: an accurate and cost-effective method for profiling DNA methylome at single-base resolution

    PubMed Central

    Cai, Wanshi; Mao, Fengbiao; Teng, Huajing; Cai, Tao; Zhao, Fangqing; Wu, Jinyu; Sun, Zhong Sheng

    2015-01-01

    Organisms and cells, in response to environmental influences or during development, undergo considerable changes in DNA methylation on a genome-wide scale, which are linked to a variety of biological processes. Using MethylC-seq to decipher the DNA methylome at single-base resolution is prohibitively costly. In this study, we develop a novel approach, named MBRidge, to detect the methylation levels of repertoire CpGs, by innovatively introducing C-hydroxymethylated adapters and bisulfite treatment into the MeDIP-seq protocol and employing ridge regression in data analysis. A systematic evaluation of the DNA methylome in the human ovarian cell line T29 showed that MBRidge achieved high correlation (R > 0.90) with much less cost (∼10%) in comparison with MethylC-seq. We further applied MBRidge to profiling the DNA methylome in T29H, an oncogenic counterpart of T29. By comparing the methylomes of T29H and T29, we identified 131,790 differential methylation regions (DMRs), which are mainly enriched in carcinogenesis-related pathways. These are substantially different from the 7,567 DMRs that were obtained by RRBS and related to cell development or differentiation. The integrated analysis of DMRs in the promoter and expression of DMR-corresponding genes revealed that DNA methylation enforced reverse regulation of gene expression, depending on the distance from the proximal DMR to transcription start sites in both mRNA and lncRNA. Taken together, our results demonstrate that MBRidge is an efficient and cost-effective method that can be widely applied to profiling DNA methylomes. PMID:26078362
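
    The MBRidge pipeline itself is not reproduced here; the sketch below only illustrates, on synthetic data, the kind of ridge-regression calibration named in the abstract, mapping enrichment-style read-count features to methylation levels using a gold-standard subset. The feature set and noise model are invented assumptions.

        # Generic ridge-regression calibration sketch on synthetic data; it illustrates
        # the regression step named in the abstract, not the MBRidge pipeline itself.
        import numpy as np
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n_cpg = 5_000
        true_meth = rng.beta(0.5, 0.5, n_cpg)                 # true methylation level per CpG

        # Features an enrichment-style assay might provide: enriched read count,
        # local CpG density and input coverage (all synthetic).
        cpg_density = rng.uniform(1, 20, n_cpg)
        coverage    = rng.poisson(30, n_cpg) + 1
        reads       = rng.poisson(coverage * true_meth * (0.4 + 0.03 * cpg_density))
        X = np.c_[reads / coverage, cpg_density, reads]
        y = true_meth                                         # calibration target (e.g., a MethylC-seq subset)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
        model = Ridge(alpha=1.0).fit(X_tr, y_tr)
        pred = np.clip(model.predict(X_te), 0, 1)
        r = np.corrcoef(pred, y_te)[0, 1]
        print(f"correlation with held-out calibration values: R = {r:.3f}")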

  3. Life cycle cost estimation and systems analysis of Waste Management Facilities

    SciTech Connect

    Shropshire, D.; Feizollahi, F.

    1995-10-01

    This paper presents general conclusions from application of a system cost analysis method developed under the United States Department of Energy (DOE) Waste Management Division (WM) Waste Management Facilities Cost Information (WMFCI) program. The WMFCI method has been used to assess the DOE complex-wide management of radioactive, hazardous, and mixed wastes. The Idaho National Engineering Laboratory, along with its subcontractor Morrison Knudsen Corporation, has been responsible for developing and applying the WMFCI cost analysis method. The cost analyses are based on system planning-level life-cycle costs. The life-cycle costs estimated by WMFCI cover waste management activities ranging from the bench-scale testing and developmental work needed to design and construct a facility, through facility permitting and startup and operation and maintenance, to the final decontamination, decommissioning, and closure of the facility. For DOE complex-wide assessments, cost estimates have been developed at the treatment, storage, and disposal module level and rolled up for each DOE installation. Discussions include conclusions reached by studies covering complex-wide consolidation of treatment, storage, and disposal facilities; system cost modeling; system cost sensitivity; system cost optimization; and the integration of WM waste with environmental restoration and decontamination and decommissioning secondary wastes.

  4. Review of cost estimates for reducing CO2 emissions. Final report, Task 9

    SciTech Connect

    Not Available

    1990-10-01

    Since the groundbreaking work of William Nordhaus in 1977, cost estimates for reducing CO2 emissions have been developed by numerous groups. The various studies have reported sometimes widely divergent cost estimates for reducing CO2 emissions. Some recent analyses have indicated that large reductions in CO2 emissions could be achieved at zero or negative cost (e.g., Rocky Mountain Institute 1989). In contrast, a recent study by Alan Manne of Stanford and Richard Richels of the Electric Power Research Institute (Manne-Richels 1989) concluded that in the US the total discounted cost of reducing CO2 emissions by 20 percent below the 1990 level could be as much as 3.6 trillion dollars over the period from 1990 to 2100. Costs of this order of magnitude would represent about 5 percent of US GNP. The purpose of this briefing paper is to summarize the different cost estimates for CO2 emission reduction and to identify the key issues and assumptions that underlie these cost estimates.

  5. Cost and size estimates for an electrochemical bulk energy storage concept

    NASA Technical Reports Server (NTRS)

    Warshay, M.; Wright, L. O.

    1975-01-01

    Preliminary capital cost and size estimates were made for a titanium trichloride, titanium tetrachloride, ferric chloride, ferrous chloride redox-flow-cell electric power system. On the basis of these preliminary estimates plus other important considerations, this electrochemical system emerged as having great promise as a bulk energy storage system for power load leveling. The size of this system is less than two per cent of that of a comparable pumped hydroelectric plant. The estimated capital cost of a 10 MW, 60- and 85-MWh redox-flow system compared well with that of competing systems.

  6. Regional Cost Estimates for Reclamation Practices on Arid and Semiarid Lands

    SciTech Connect

    W. K. Ostler

    2002-02-01

    The U.S. Army uses the Integrated Training Area Management program for managing training land. One of the major objectives of the Integrated Training Area Management program has been to develop a method for estimating training land carrying capacity in a sustainable manner. The Army Training and Testing Area Carrying Capacity methodology measures training load in terms of Maneuver Impact Miles. One Maneuver Impact Mile is the equivalent impact of an M1A2 tank traveling one mile while participating in an armor battalion field training exercise. The Army Training and Testing Area Carrying Capacity methodology is also designed to predict land maintenance costs in terms of dollars per Maneuver Impact Mile. The overall cost factor is calculated using the historical cost of land maintenance practices and the effectiveness of controlling erosion. Because land maintenance costs and effectiveness are influenced by the characteristics of the land, Army Training and Testing Area Carrying Capacity cost factors must be developed for each ecological region of the country. Costs for land maintenance activities are presented here for the semiarid and arid regions of the United States. Five ecoregions are recognized, and average values for reclamation activities are presented. Because there are many variables that can influence costs, ranges for reclamation activities are also presented. Costs are broken down into six major categories: seedbed preparation, fertilization, seeding, planting, mulching, and supplemental erosion control. Costs for most land reclamation practices and materials varied widely within and between ecological provinces. Although regional cost patterns were evident for some practices, the patterns were not consistent between practices. For the purpose of estimating land reclamation costs for the Army Training and Testing Area Carrying Capacity methodology, it may be desirable to use the "Combined Average" of all provinces found in the last row of each table
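
    The dollars-per-Maneuver-Impact-Mile roll-up implied by this record can be sketched roughly as below: per-acre reclamation costs are combined with an effectiveness term and the training load. Every cost, acreage, and load figure is a hypothetical placeholder, and the way the effectiveness term enters the calculation is an assumption here, not the methodology's own rule.

        # Sketch of a dollars-per-Maneuver-Impact-Mile (MIM) cost factor, combining the
        # per-acre cost of reclamation practices in one ecoregion with an assumed
        # effectiveness and training load. All figures are hypothetical placeholders.

        practices = {   # practice: (cost US$/acre, fraction of maintained acres using it)
            "seedbed_preparation":        (45.0, 1.00),
            "seeding":                    (60.0, 1.00),
            "mulching":                   (90.0, 0.50),
            "supplemental_erosion_ctrl": (150.0, 0.20),
        }

        acres_maintained_per_year = 2_000      # hypothetical
        erosion_control_effectiveness = 0.8    # fraction of training impact actually repaired (hypothetical)
        mim_per_year = 250_000                 # training load in Maneuver Impact Miles (hypothetical)

        annual_cost = acres_maintained_per_year * sum(c * f for c, f in practices.values())
        cost_per_mim = annual_cost / (erosion_control_effectiveness * mim_per_year)
        print(f"land maintenance cost factor: ${cost_per_mim:.2f} per MIM")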

  7. Estimating the economic impact of environmental investments on retail costs and dealer strategies for offsetting these costs

    SciTech Connect

    Simpson, G.S.

    1995-09-01

    Retail agrichemical dealers have become increasingly familiar with the concept of containment over the past several years. Currently, containment regulations are in place in 13 states, and are being drafted in 7 others. Agrichemical dealers in these states will be required to assess the potential environmental impact of their operating practices on the land under and around the retail production site. A recent survey of TVA model site and individual technology demonstration dealers attempted to gain insight into the impact these investments in containment structures, changing operating practices, and state containment regulations were having on annual production costs. The purpose of this paper is to (1) provide the agrichemical dealer a methodology for quickly estimating the potential impact that environmental investments will have on annual production costs, and (2) to evaluate the effectiveness of alternative management strategies employed to offset some, or in some cases, all of these additional costs.

  8. Cost-effective genome-wide estimation of allele frequencies from pooled DNA in Atlantic salmon (Salmo salar L.)

    PubMed Central

    2013-01-01

    Background New sequencing technologies have tremendously increased the number of known molecular markers (single nucleotide polymorphisms; SNPs) in a variety of species. Concurrently, improvements to genotyping technology have now made it possible to efficiently genotype large numbers of genome-wide distributed SNPs, enabling genome-wide association studies (GWAS). However, genotyping significant numbers of individuals with large numbers of SNPs remains prohibitively expensive for many research groups. A possible solution to this problem is to determine allele frequencies from pooled DNA samples; such ‘allelotyping’ has been presented as a cost-effective alternative to individual genotyping and has become popular in human GWAS. In this article we have tested the effectiveness of DNA pooling to obtain accurate allele frequency estimates for Atlantic salmon (Salmo salar L.) populations using an Illumina SNP-chip. Results In total, 56 Atlantic salmon DNA pools from 14 populations were analyzed on an Atlantic salmon SNP-chip containing probes for 5568 SNP markers, 3928 of which were bi-allelic. We developed an efficient quality control filter which enables exclusion of loci showing a high error rate and minor allele frequency (MAF) close to zero. After applying multiple quality control filters we obtained allele frequency estimates for 3631 bi-allelic loci. We observed high concordance (r > 0.99) between allele frequency estimates derived from individual genotyping and DNA pools. Our results also indicate that even relatively small DNA pools (35 individuals) can provide accurate allele frequency estimates for a given sample. Conclusions Despite the higher level of variation associated with array replicates compared to pool construction, we suggest that both sources of variation should be taken into account. This study demonstrates that DNA pooling allows fast and high-throughput determination of allele frequencies in Atlantic salmon enabling cost
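
    A minimal sketch of the allelotyping workflow described here — averaging replicate pool estimates, filtering loci with high replicate error or near-zero MAF, and checking concordance against individual genotyping — is given below on synthetic data; the intensity model and thresholds are assumptions, not the study's.

        # Sketch of pooled-DNA allelotyping QC and concordance checking on synthetic
        # data; thresholds and the measurement model are illustrative only.
        import numpy as np

        rng = np.random.default_rng(1)
        n_loci, n_replicates = 3_000, 3
        true_freq = rng.uniform(0.0, 0.5, n_loci)             # "individual genotyping" frequencies

        # Pooled-array estimates: true frequency plus per-replicate measurement noise.
        noise_sd = rng.uniform(0.01, 0.08, n_loci)
        pool_reps = true_freq[:, None] + rng.normal(0, noise_sd[:, None], (n_loci, n_replicates))
        pool_reps = np.clip(pool_reps, 0, 1)

        pool_freq = pool_reps.mean(axis=1)
        rep_sd = pool_reps.std(axis=1, ddof=1)

        # QC filter: drop loci with noisy replicates or minor allele frequency near zero.
        keep = (rep_sd < 0.05) & (np.minimum(pool_freq, 1 - pool_freq) > 0.01)

        r = np.corrcoef(pool_freq[keep], true_freq[keep])[0, 1]
        print(f"{keep.sum()} / {n_loci} loci pass QC; concordance r = {r:.3f}")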

  9. Estimation of Life-Year Loss and Lifetime Costs for Different Stages of Colon Adenocarcinoma in Taiwan

    PubMed Central

    Chen, Po-Chuan; Lee, Jenq-Chang; Wang, Jung-Der

    2015-01-01

    Backgrounds and aims The life-expectancy of colon cancer patients cannot be accurately determined due to the lack of both large datasets and long-term follow-up, which impedes accurate estimation of the lifetime cost of treating colon cancer patients. In this study, we applied a method to estimate the life-expectancy of colon cancer patients in Taiwan and calculate the lifetime costs by stage and age group. Methods A total of 17,526 cases with pathologically verified colon adenocarcinoma between 2002 and 2009 were extracted from the Taiwan Cancer Registry database for analysis. All patients were followed up until the end of 2011. Life-expectancy, expected-years-of-life-lost and lifetime costs were estimated using a semi-parametric survival extrapolation method and borrowing information from the life tables of vital statistics. Results Patients with more advanced stages of colon cancer were generally younger and less co-morbid with major chronic diseases than those with stages I and II. The life-expectancy (LE) of stage I patients was not significantly different from that of the age- and sex-matched general population, whereas those of stages II, III, and IV colon cancer patients after diagnosis were 16.57±0.07, 13.35±0.07, and 4.05±0.05 years, respectively; the corresponding expected-years-of-life-lost were 1.28±0.07, 5.93±0.07 and 16.42±0.06 years, indicating survival significantly shorter than that of the general population after accounting for lead-time bias. Moreover, the lifetime costs of managing stages II, III, and IV colon cancer patients would be US $8,416±1,939, $14,334±1,755, and $21,837±1,698, respectively, indicating a substantial saving from early diagnosis and treatment after stratification for age and sex. Conclusions Treating colon cancer at a younger age and an earlier stage saves more life-years and healthcare costs. Future studies are indicated to apply these quantitative results in the cost-effectiveness evaluation of screening programs for colon cancer. PMID:26207912

  10. Cost and size estimates for an electrochemical bulk energy storage concept

    NASA Technical Reports Server (NTRS)

    Warshay, M.; Wright, L. O.

    1975-01-01

    Preliminary capital cost and size estimates were made for an electrochemical bulk energy storage concept. The electrochemical system considered was an electrically rechargeable flow cell with a redox couple. On the basis of preliminary capital cost estimates, size estimates, and several other important considerations, the redox-flow-cell system emerges as having great promise as a bulk energy storage system for power load leveling. The size of this system would be less than 2 percent of that of a comparable pumped hydroelectric plant. The capital cost of a 10-megawatt, 60- and 85-megawatt-hour redox system is estimated to be $190 to $330 per kilowatt. The other important features of the redox system contributing to its load leveling application are its low adverse environmental impact, its high efficiency, its apparent absence of electrochemically-related cycle life limitations, and its fast response.

  11. Development of weight and cost estimates for lifting surfaces with active controls

    NASA Technical Reports Server (NTRS)

    Anderson, R. D.; Flora, C. C.; Nelson, R. M.; Raymond, E. T.; Vincent, J. H.

    1976-01-01

    Equations and methodology were developed for estimating the weight and cost incrementals due to active controls added to the wing and horizontal tail of a subsonic transport airplane. The methods are sufficiently generalized to be suitable for preliminary design. Supporting methodology and input specifications for the weight and cost equations are provided. The weight and cost equations are structured to be flexible in terms of the active control technology (ACT) flight control system specification. In order to present a self-contained package, methodology is also presented for generating ACT flight control system characteristics for the weight and cost equations. Use of the methodology is illustrated.

  12. GME: at what cost?

    PubMed

    Young, David W

    2003-11-01

    Current computing methods impede determining the real cost of graduate medical education. However, a more accurate estimate could be obtained if policy makers would allow for the application of basic cost-accounting principles, including consideration of department-level costs, unbundling of joint costs, and other factors.

  13. Constrained low-cost GPS/INS filter with encoder bias estimation for ground vehicles' applications

    NASA Astrophysics Data System (ADS)

    Abdel-Hafez, Mamoun F.; Saadeddin, Kamal; Amin Jarrah, Mohammad

    2015-06-01

    In this paper, a constrained, fault-tolerant, low-cost navigation system is proposed for ground vehicle applications. The system is designed to provide a vehicle navigation solution at 50 Hz by fusing measurements from the inertial measurement unit (IMU), the global positioning system (GPS) receiver, and the velocity measurements from wheel encoders. A high-integrity estimation filter is proposed to obtain a high-accuracy state estimate. The filter utilizes vehicle velocity constraints as measurements to enhance the estimation accuracy. However, if the velocity measurement of the encoder is biased, the accuracy of the estimate is degraded. Therefore, a noise estimation algorithm is proposed to estimate a possible bias in the encoder's velocity measurement. Experimental tests, with simulated biases on the encoder's readings, are conducted and the obtained results are presented. The experimental results show the enhancement in the estimation accuracy when the simulated bias is estimated using the proposed method.
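
    The sketch below is not the paper's 50 Hz GPS/INS/encoder filter; it is a minimal one-dimensional Kalman filter with the encoder bias appended as an extra state, showing how a constant velocity bias can be estimated alongside the navigation states. All noise levels, update rates, and the simulated trajectory are assumed.

        # Minimal 1-D Kalman filter with an augmented encoder-bias state, illustrating
        # how a constant bias in the wheel-encoder velocity can be estimated alongside
        # the navigation states. Noise levels and trajectory are assumed, not the paper's.
        import numpy as np

        dt, n_steps = 0.02, 2_000              # 50 Hz for 40 s
        true_bias = 0.3                        # simulated encoder velocity bias (m/s)
        rng = np.random.default_rng(2)

        F = np.array([[1, dt, 0],              # state: [position, velocity, encoder_bias]
                      [0,  1, 0],
                      [0,  0, 1]])
        Q = np.diag([1e-4, 1e-3, 1e-6])        # process noise (assumed)
        H_gps = np.array([[1.0, 0.0, 0.0]])    # GPS measures position
        H_enc = np.array([[0.0, 1.0, 1.0]])    # encoder measures velocity + bias
        R_gps, R_enc = np.array([[4.0]]), np.array([[0.01]])

        x = np.zeros(3)
        P = np.diag([10.0, 1.0, 1.0])
        pos, vel = 0.0, 2.0                    # simulated truth: constant 2 m/s

        def update(x, P, z, H, R):
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + (K @ (z - H @ x)).ravel()
            P = (np.eye(3) - K @ H) @ P
            return x, P

        for k in range(n_steps):
            pos += vel * dt
            x, P = F @ x, F @ P @ F.T + Q                      # predict
            z_enc = np.array([vel + true_bias + rng.normal(0, 0.1)])
            x, P = update(x, P, z_enc, H_enc, R_enc)           # encoder at 50 Hz
            if k % 50 == 0:                                    # GPS at 1 Hz
                z_gps = np.array([pos + rng.normal(0, 2.0)])
                x, P = update(x, P, z_gps, H_gps, R_gps)

        print(f"estimated encoder bias: {x[2]:.3f} m/s (true {true_bias} m/s)")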

  14. Estimating the cost of cervical cancer screening in five developing countries

    PubMed Central

    Goldhaber-Fiebert, Jeremy D; Goldie, Sue J

    2006-01-01

    Background Cost-effectiveness analyses (CEAs) can provide useful information to policymakers concerned with the broad allocation of resources as well as to local decision makers choosing between different options for reducing the burden from a single disease. For the latter, it is important to use country-specific data when possible and to represent cost differences between countries that might make one strategy more or less attractive than another strategy locally. As part of a CEA of cervical cancer screening in five developing countries, we supplemented limited primary cost data by developing other estimation techniques for direct medical and non-medical costs associated with alternative screening approaches using one of three initial screening tests: simple visual screening, HPV DNA testing, and cervical cytology. Here, we report estimation methods and results for three cost areas in which data were lacking. Methods To supplement direct medical costs, including staff, supplies, and equipment depreciation using country-specific data, we used alternative techniques to quantify cervical cytology and HPV DNA laboratory sample processing costs. We used a detailed quantity and price approach whose face validity was compared to an adaptation of a US laboratory estimation methodology. This methodology was also used to project annual sample processing capacities for each laboratory type. The cost of sample transport from the clinic to the laboratory was estimated using spatial models. A plausible range of the cost of patient time spent seeking and receiving screening was estimated using only formal sector employment and wages as well as using both formal and informal sector participation and country-specific minimum wages. Data sources included primary data from country-specific studies, international databases, international prices, and expert opinion. Costs were standardized to year 2000 international dollars using inflation adjustment and purchasing power parity
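
    A minimal sketch of the unit-cost build-up described in this record — laboratory quantities times prices, a transport component, and patient time valued at a wage — appears below; every quantity, price, and wage is a hypothetical placeholder in generic international dollars.

        # Sketch of the quantity-and-price approach to a per-woman screening visit cost,
        # plus transport and patient time components. All values are hypothetical.

        lab_inputs = {   # item: (quantity per sample, unit price in I$)
            "cytotech_minutes": (8.0, 0.25),
            "slide_and_stain":  (1.0, 0.60),
            "fixative_ml":      (5.0, 0.04),
        }
        lab_cost = sum(q * p for q, p in lab_inputs.values())

        transport_cost_per_sample = 0.35   # from a clinic-to-lab distance model (hypothetical)
        clinic_staff_cost = 1.10           # nurse time per screening visit (hypothetical)
        patient_hours, hourly_wage = 3.0, 0.55
        patient_time_cost = patient_hours * hourly_wage

        total = lab_cost + transport_cost_per_sample + clinic_staff_cost + patient_time_cost
        print(f"lab {lab_cost:.2f} + transport {transport_cost_per_sample:.2f} "
              f"+ clinic {clinic_staff_cost:.2f} + patient time {patient_time_cost:.2f} "
              f"= I${total:.2f} per woman screened")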

  15. A bottom-up approach to estimating cost elements of REDD+ pilot projects in Tanzania

    PubMed Central

    2012-01-01

    Background Several previous global REDD+ cost studies have been conducted, demonstrating that payments for maintaining forest carbon stocks have significant potential to be a cost-effective mechanism for climate change mitigation. These studies have mostly followed highly aggregated top-down approaches without estimating the full range of REDD+ cost elements, thus underestimating the actual costs of REDD+. Based on three REDD+ pilot projects in Tanzania, representing an area of 327,825 ha, this study explicitly adopts a bottom-up approach to data assessment. By estimating opportunity, implementation, transaction and institutional costs of REDD+ we develop a practical and replicable methodological framework to consistently assess REDD+ cost elements. Results Based on historical land use change patterns, current region-specific economic conditions and carbon stocks, project-specific opportunity costs ranged between US$ -7.8 and 28.8 tCO2 for deforestation and forest degradation drivers such as agriculture, fuel wood production, unsustainable timber extraction and pasture expansion. The mean opportunity costs for the three projects ranged between US$ 10.1 – 12.5 tCO2. Implementation costs comprised between 89% and 95% of total project costs (excluding opportunity costs) ranging between US$ 4.5 - 12.2 tCO2 for a period of 30 years. Transaction costs for measurement, reporting, verification (MRV), and other carbon market related compliance costs comprised a minor share, between US$ 0.21 - 1.46 tCO2. Similarly, the institutional costs comprised around 1% of total REDD+ costs in a range of US$ 0.06 – 0.11 tCO2. Conclusions The use of bottom-up approaches to estimate REDD+ economics by considering regional variations in economic conditions and carbon stocks has been shown to be an appropriate approach to provide policy and decision-makers robust economic information on REDD+. The assessment of opportunity costs is a crucial first step to provide information on the
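
    The opportunity-cost element can be sketched as forgone net returns divided by avoided emissions, as below; the land-use returns and carbon stocks are hypothetical placeholders, not the Tanzanian project data.

        # Sketch of a per-driver opportunity cost in US$ per tCO2: net returns forgone by
        # not converting forest, divided by the avoided emissions. Values are hypothetical.

        npv_agriculture_per_ha = 2_500.0   # net present value of converting to cropland (US$/ha)
        npv_forest_use_per_ha = 400.0      # NPV of continued sustainable forest use (US$/ha)

        carbon_forest_t_per_ha = 80.0      # carbon stock of intact forest (tC/ha)
        carbon_cropland_t_per_ha = 20.0    # carbon stock after conversion (tC/ha)
        C_TO_CO2 = 44.0 / 12.0             # convert tonnes of carbon to tonnes of CO2

        avoided_emissions = (carbon_forest_t_per_ha - carbon_cropland_t_per_ha) * C_TO_CO2
        opportunity_cost = (npv_agriculture_per_ha - npv_forest_use_per_ha) / avoided_emissions
        print(f"opportunity cost: US$ {opportunity_cost:.2f} per tCO2 avoided")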

  17. The cost of forming more accurate impressions: accuracy-motivated perceivers see the personality of others more distinctively but less normatively than perceivers without an explicit goal.

    PubMed

    Biesanz, Jeremy C; Human, Lauren J

    2010-04-01

    Does the motivation to form accurate impressions actually improve accuracy? The present work extended Kenny's (1991, 1994) weighted-average model (WAM)--a theoretical model of the factors that influence agreement among personality judgments--to examine two components of interpersonal perception: distinctive and normative accuracy. WAM predicts that an accuracy motivation should enhance distinctive accuracy but decrease normative accuracy. In other words, the impressions of a perceiver with an accuracy motivation will correspond more with the target person's unique characteristics and less with the characteristics of the average person. Perceivers randomly assigned to receive the social goal of forming accurate impressions, which was communicated through a single-sentence instruction, achieved higher levels of distinctive self-other agreement but lower levels of normative agreement compared with perceivers not given an explicit impression-formation goal. The results suggest that people motivated to form accurate impressions do indeed become more accurate, but at the cost of seeing others less normatively and, in particular, less positively. PMID:20424106

  18. Cost estimates for near-term deployment of advanced traffic management systems. Final report

    SciTech Connect

    Stevens, S.S.; Chin, S.M.

    1993-02-15

    The objective of this study is to provide cost estimates for the engineering, design, installation, operation, and maintenance of Advanced Traffic Management Systems (ATMS) in the largest 75 metropolitan areas in the United States. This report gives estimates of deployment costs for ATMS in the next five years, subject to the qualifications and caveats set out in the following paragraphs. The report considers the infrastructure components required to fully realize a functional ATMS over each of two highway networks (as discussed in the section describing our general assumptions) under each of the four architectures identified in the MITRE Intelligent Vehicle Highway Systems (IVHS) Architecture studies. The architectures are summarized in this report in Table 2. Estimates are given for the eight combinations of highway networks and architectures. We estimate that it will cost between $8.5 billion (minimal network) and $26 billion (augmented network) to proceed immediately with deployment of ATMS in the largest 75 metropolitan areas. Costs are given in 1992 dollars and are not adjusted for future inflation. Our estimates are based partially on completed project costs, which have been adjusted to 1992 dollars. We assume that a particular architecture will be chosen; projected costs are broken down by architecture.

  19. Application of Boosting Regression Trees to Preliminary Cost Estimation in Building Construction Projects.

    PubMed

    Shin, Yoonseok

    2015-01-01

    Among the recent data mining techniques available, the boosting approach has attracted a great deal of attention because of its effective learning algorithm and strong bounds on its generalization performance. However, while the boosting approach has been actively utilized in other domains, it has yet to be used for regression problems within the construction domain, including cost estimation. Therefore, a boosting regression tree (BRT) is applied to cost estimation at the early stage of a construction project to examine the applicability of the boosting approach to a regression problem within the construction domain. To evaluate the performance of the BRT model, it was compared with a neural network (NN) model, which has been proven to perform well in cost estimation domains. The BRT model showed results similar to those of the NN model using 234 actual cost data sets from building construction projects. In addition, the BRT model can provide additional information, such as the importance plot and structure model, which can support estimators in comprehending the decision-making process. Consequently, the boosting approach has potential applicability to preliminary cost estimation in building construction projects. PMID:26339227
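
    A sketch of the comparison described above — a boosted regression tree versus a neural network on early-stage cost data — is given below using scikit-learn on synthetic records; GradientBoostingRegressor stands in for the paper's BRT, and all features and costs are invented.

        # Sketch: boosted regression trees vs. a neural network on synthetic preliminary
        # cost data, echoing the comparison described above. Data and features are synthetic.
        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import mean_absolute_percentage_error

        rng = np.random.default_rng(3)
        n = 234                                          # matches the number of cost records cited above
        gfa = rng.uniform(1_000, 60_000, n)              # gross floor area (m^2)
        storeys = rng.integers(1, 30, n)
        quality = rng.integers(1, 4, n)                  # finish grade 1-3
        X = np.c_[gfa, storeys, quality]
        unit_cost = (550 + 40 * quality) * (1 + 0.01 * storeys) * rng.lognormal(0, 0.08, n)

        X_tr, X_te, y_tr, y_te = train_test_split(X, unit_cost, test_size=0.3, random_state=0)
        brt = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05).fit(X_tr, y_tr)
        nn = make_pipeline(StandardScaler(),
                           MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5_000,
                                        random_state=0)).fit(X_tr, y_tr)

        for name, model in [("BRT", brt), ("NN", nn)]:
            mape = mean_absolute_percentage_error(y_te, model.predict(X_te))
            print(f"{name}: MAPE = {100 * mape:.1f}%")
        # brt.feature_importances_ provides the variable-importance view mentioned above.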

  20. Reusable Reentry Satellite (RRS) system design study: System cost estimates document

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Reusable Reentry Satellite (RRS) program was initiated to provide life science investigators relatively inexpensive, frequent access to space for extended periods of time with eventual satellite recovery on earth. The RRS will provide an on-orbit laboratory for research on biological and material processes, be launched from a number of expendable launch vehicles, and operate in Low-Altitude Earth Orbit (LEO) as a free-flying unmanned laboratory. SAIC's design will provide independent atmospheric reentry and soft landing in the continental U.S., orbit for a maximum of 60 days, and will sustain three flights per year for 10 years. The Reusable Reentry Vehicle (RRV) will be 3-axis stabilized with artificial gravity up to 1.5g's, be rugged and easily maintainable, and have a modular design to accommodate a satellite bus and separate modular payloads (e.g., rodent module, general biological module, ESA microgravity botany facility, general botany module). The purpose of this System Cost Estimate Document is to provide a Life Cycle Cost Estimate (LCCE) for a NASA RRS Program using SAIC's RRS design. The estimate includes development, procurement, and 10 years of operations and support (O&S) costs for NASA's RRS program. The estimate does not include costs for other agencies which may track or interface with the RRS program (e.g., Air Force tracking agencies or individual RRS experimenters involved with special payload modules (PM's)). The life cycle cost estimate extends over the 10 year operation and support period FY99-2008.