Science.gov

Sample records for accurate cost estimate

  1. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1983-01-01

    Alternatives to sampling-theory stratified and regression estimators of crop production and timber biomass were examined. An alternative estimator which is viewed as especially promising is the errors-in-variable regression estimator. Investigations established the need for caution with this estimator when the ratio of two error variances is not precisely known.

  2. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1985-01-01

    Research activities conducted under the auspices of National Aeronautics and Space Administration Cooperative Agreement NCC 9-9 are discussed. During this contract period, research efforts were concentrated in two primary areas. The first area is an investigation of the use of measurement error models as alternatives to least squares regression estimators of crop production or timber biomass. The second primary area of investigation is the estimation of the mixing proportion of two-component mixture models. This report lists publications, technical reports, submitted manuscripts, and oral presentations generated by these research efforts. Possible areas of future research are mentioned.

  3. ESTIMATING IRRIGATION COSTS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Having accurate estimates of the cost of irrigation is important when making irrigation decisions. Estimates of fixed costs are critical for investment decisions. Operating cost estimates can assist in decisions regarding additional irrigations. This fact sheet examines the costs associated with ...

  4. The cost of doing a cost estimate

    NASA Technical Reports Server (NTRS)

    Remer, Donald S.; Buchanan, Harry R.

    1993-01-01

    A model for estimating the cost required to do a cost estimate for Deep Space Network (DSN) projects that range from $0.1 to $100 million is presented. The cost of the cost estimate in thousands of dollars, C(sub E), is found to be approximately given by C(sub E) = K((C(sub p))(sup 0.35)), where C(sub p) is the cost of the project being estimated in millions of dollars and K is a constant depending on the accuracy of the estimate. For an order-of-magnitude estimate, K = 24; for a budget estimate, K = 60; and for a definitive estimate, K = 115. That is, for a specific project, the cost of doing a budget estimate is about 2.5 times as much as that for an order-of-magnitude estimate, and a definitive estimate costs about twice as much as a budget estimate. Use of this model should help provide the level of resources required for doing cost estimates and, as a result, provide insights towards more accurate estimates with less potential for cost overruns.
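
    The relationship above is simple enough to evaluate directly. The short sketch below is illustrative only: the K values come from the abstract, while the function name and the $10M example project are assumptions made for demonstration.

```python
# Minimal sketch of the cost-of-a-cost-estimate relationship from the abstract:
# C_E = K * C_p**0.35, with C_p the project cost in millions of dollars and
# C_E the cost of preparing the estimate in thousands of dollars.
K = {"order-of-magnitude": 24, "budget": 60, "definitive": 115}

def cost_of_estimate(project_cost_millions: float, estimate_class: str) -> float:
    """Approximate cost of preparing the estimate, in thousands of dollars."""
    return K[estimate_class] * project_cost_millions ** 0.35

for cls in K:
    print(f"{cls:>18}: ~${cost_of_estimate(10.0, cls):.0f}k for a $10M project")

# Ratios quoted in the abstract: 60/24 = 2.5 (budget vs. order-of-magnitude)
# and 115/60 ~ 1.9 (definitive vs. budget, i.e. about twice as much).
```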

  5. Price and cost estimation

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.

    1979-01-01

    Price and Cost Estimating Program (PACE II) was developed to prepare man-hour and material cost estimates. Versatile and flexible tool significantly reduces computation time and errors and reduces typing and reproduction time involved in preparation of cost estimates.

  6. Estimating Airline Operating Costs

    NASA Technical Reports Server (NTRS)

    Maddalon, D. V.

    1978-01-01

    The factors affecting commercial aircraft operating and delay costs were used to develop an airline operating cost model which includes a method for estimating the labor and material costs of individual airframe maintenance systems. The model permits estimates of aircraft related costs, i.e., aircraft service, landing fees, flight attendants, and control fees. A method for estimating the costs of certain types of airline delay is also described.

  7. Estimating airline operating costs

    NASA Technical Reports Server (NTRS)

    Maddalon, D. V.

    1978-01-01

    A review was made of the factors affecting commercial aircraft operating and delay costs. From this work, an airline operating cost model was developed which includes a method for estimating the labor and material costs of individual airframe maintenance systems. The model, similar in some respects to the standard Air Transport Association of America (ATA) Direct Operating Cost Model, permits estimates of aircraft-related costs not now included in the standard ATA model (e.g., aircraft service, landing fees, flight attendants, and control fees). A study of the cost of aircraft delay was also made and a method for estimating the cost of certain types of airline delay is described.

  8. Conceptual Cost Estimating

    NASA Technical Reports Server (NTRS)

    Brown, J. A.

    1983-01-01

    Kennedy Space Center data aid in efficient construction-cost management. Report discusses development and use of NASA TR-1508, the Kennedy Space Center Aerospace Construction Price Book, for preparing conceptual, budget, funding, cost-estimating, and preliminary cost-engineering reports. Report based on actual bid prices and Government estimates.

  9. Spacecraft platform cost estimating relationships

    NASA Technical Reports Server (NTRS)

    Gruhl, W. M.

    1972-01-01

    The three main cost areas of unmanned satellite development are discussed. The areas are identified as: (1) the spacecraft platform (SCP), (2) the payload or experiments, and (3) the postlaunch ground equipment and operations. The SCP normally accounts for over half of the total project cost and accurate estimates of SCP costs are required early in project planning as a basis for determining total project budget requirements. The development of single formula SCP cost estimating relationships (CER) from readily available data by statistical linear regression analysis is described. The advantages of single formula CER are presented.

  10. Cost-Estimation Program

    NASA Technical Reports Server (NTRS)

    Cox, Brian

    1995-01-01

    COSTIT computer program estimates cost of electronic design by reading an item-list file and a file containing the cost of each item. Accuracy of the cost estimate depends on the accuracy of the cost-list file. Written using the AWK utility for Sun4-series computers running SunOS 4.x and for IBM PC-series and compatible computers running MS-DOS. The Sun version is NPO-19587; the PC version is NPO-19157.
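
    The description above amounts to a lookup-and-sum over two files. The sketch below is a hypothetical Python illustration of that flow (COSTIT itself was written in AWK); file names, formats, and field layout are assumptions.

```python
# Hypothetical sketch of the flow the abstract describes; not the COSTIT source.
def total_cost(item_list_path: str, cost_list_path: str) -> float:
    costs = {}
    with open(cost_list_path) as f:        # e.g. lines like: "resistor 0.02"
        for line in f:
            name, price = line.split()
            costs[name] = float(price)
    total = 0.0
    with open(item_list_path) as f:        # e.g. lines like: "resistor 4"
        for line in f:
            name, quantity = line.split()
            total += costs[name] * int(quantity)  # accuracy tracks the cost list
    return total
```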

  11. Updated Conceptual Cost Estimating

    NASA Technical Reports Server (NTRS)

    Brown, J. A.

    1987-01-01

    16-page report discusses development and use of NASA TR-1508, the Kennedy Space Center Aerospace Construction Price Book for preparing conceptual, budget, funding, cost-estimating, and preliminary cost-engineering reports. Updated annually from 1974 through 1985 with actual bid prices and government estimates. Includes labor and material quantities and prices with contractor and subcontractor markups for buildings, facilities, and systems at Kennedy Space Center. While data pertains to aerospace facilities, format and cost-estimating techniques guide estimation of costs in other construction applications.

  12. The Psychology of Cost Estimating

    NASA Technical Reports Server (NTRS)

    Price, Andy

    2016-01-01

    Cost estimation for large (and even not so large) government programs is a challenge. The number and magnitude of cost overruns associated with large Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) programs highlight the difficulties in developing and promulgating accurate cost estimates. These overruns can be the result of inadequate technology readiness or requirements definition, the whims of politicians or government bureaucrats, or even failures of the cost estimating profession itself. However, there may be another reason for cost overruns that is right in front of us, but only recently have we begun to grasp it: the fact that cost estimators and their customers are human. The last 70+ years of research into human psychology and behavioral economics have yielded amazing findings about how we humans process and use information to make judgments and decisions. What these scientists have uncovered is surprising: humans are often irrational and illogical beings, making decisions based on factors such as emotion and perception, rather than facts and data. These built-in biases in our thinking directly affect how we develop our cost estimates and how those cost estimates are used. We cost estimators can use this knowledge of biases to improve our cost estimates and also to improve how we communicate and work with our customers. By understanding how our customers think, and more importantly, why they think the way they do, we can have more productive relationships and greater influence. By using psychology to our advantage, we can more effectively help the decision maker and our organizations make fact-based decisions.

  13. Capital cost estimate

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The capital cost estimate for the nuclear process heat source (NPHS) plant was made by: (1) using costs from the current commercial HTGR for electricity production as a base for items that are essentially the same and (2) development of new estimates for modified or new equipment that is specifically for the process heat application. Results are given in tabular form and cover the total investment required for each process temperature studied.

  14. Electric propulsion cost estimation

    NASA Technical Reports Server (NTRS)

    Palaszewski, B. A.

    1985-01-01

    A parametric cost model for mercury ion propulsion modules is presented. A detailed work breakdown structure is included. Cost estimating relationships were developed for the individual subsystems and the nonhardware items (systems engineering, software, etc.). Solar array and power processor unit (PPU) costs are the significant cost drivers. Simplification of both of these subsystems through applications of advanced technology (lightweight solar arrays and high-efficiency, self-radiating PPUs) can reduce costs. A comparison of the performance and cost of several chemical propulsion systems with the Hg ion module is also presented. For outer-planet missions, advanced solar electric propulsion (ASEP) trip times and O2/H2 propulsion trip times are comparable. A three-year trip time savings over the baselined NTO/MMH propulsion system is possible with ASEP.

  15. Accurate pose estimation for forensic identification

    NASA Astrophysics Data System (ADS)

    Merckx, Gert; Hermans, Jeroen; Vandermeulen, Dirk

    2010-04-01

    In forensic authentication, one aims to identify the perpetrator among a series of suspects or distractors. A fundamental problem in any recognition system that aims for identification of subjects in a natural scene is the lack of constrains on viewing and imaging conditions. In forensic applications, identification proves even more challenging, since most surveillance footage is of abysmal quality. In this context, robust methods for pose estimation are paramount. In this paper we will therefore present a new pose estimation strategy for very low quality footage. Our approach uses 3D-2D registration of a textured 3D face model with the surveillance image to obtain accurate far field pose alignment. Starting from an inaccurate initial estimate, the technique uses novel similarity measures based on the monogenic signal to guide a pose optimization process. We will illustrate the descriptive strength of the introduced similarity measures by using them directly as a recognition metric. Through validation, using both real and synthetic surveillance footage, our pose estimation method is shown to be accurate, and robust to lighting changes and image degradation.

  16. A model for the cost of doing a cost estimate

    NASA Technical Reports Server (NTRS)

    Remer, D. S.; Buchanan, H. R.

    1992-01-01

    A model for estimating the cost required to do a cost estimate for Deep Space Network (DSN) projects that range from $0.1 to $100 million is presented. The cost of the cost estimate in thousands of dollars, C(sub E), is found to be approximately given by C(sub E) = K((C(sub p))(sup 0.35)) where C(sub p) is the cost of the project being estimated in millions of dollars and K is a constant depending on the accuracy of the estimate. For an order-of-magnitude estimate, K = 24; for a budget estimate, K = 60; and for a definitive estimate, K = 115. That is, for a specific project, the cost of doing a budget estimate is about 2.5 times as much as that for an order-of-magnitude estimate, and a definitive estimate costs about twice as much as a budget estimate. Use of this model should help provide the level of resources required for doing cost estimates and, as a result, provide insights towards more accurate estimates with less potential for cost overruns.

  17. You Can Accurately Predict Land Acquisition Costs.

    ERIC Educational Resources Information Center

    Garrigan, Richard

    1967-01-01

    Land acquisition costs were tested for predictability based upon the 1962 assessed valuations of privately held land acquired for campus expansion by the University of Wisconsin from 1963-1965. By correlating the land acquisition costs of 108 properties acquired during the 3 year period with--(1) the assessed value of the land, (2) the assessed…

  18. Estimating the Cost to do a Cost Estimate

    NASA Technical Reports Server (NTRS)

    Remer, D. S.; Buchanan, H. R.

    1998-01-01

    This article provides a model for estimating the cost required to do a cost estimate. Overruns may lead to cancellation of a project. In 1991, we completed a study on the cost of doing cost estimates for the class of projects normally encountered in the development and implementation of equipment at the network of tracking stations operated by the Jet Propulsion Laboratory (JPL) for NASA.

  19. The thermodynamic cost of accurate sensory adaptation

    NASA Astrophysics Data System (ADS)

    Tu, Yuhai

    2015-03-01

    Living organisms need to obtain and process environment information accurately in order to make decisions critical for their survival. Much progress has been made in identifying key components responsible for various biological functions; however, major challenges remain to understand system-level behaviors from the molecular-level knowledge of biology and to unravel possible physical principles for the underlying biochemical circuits. In this talk, we will present some recent work in understanding the chemical sensory system of E. coli by combining theoretical approaches with quantitative experiments. We focus on addressing the questions of how cells process chemical information and adapt to varying environments, and what the thermodynamic limits are for key regulatory functions, such as adaptation.

  20. The high cost of accurate knowledge.

    PubMed

    Sutcliffe, Kathleen M; Weber, Klaus

    2003-05-01

    Many business thinkers believe it's the role of senior managers to scan the external environment to monitor contingencies and constraints, and to use that precise knowledge to modify the company's strategy and design. As these thinkers see it, managers need accurate and abundant information to carry out that role. According to that logic, it makes sense to invest heavily in systems for collecting and organizing competitive information. Another school of pundits contends that, since today's complex information often isn't precise anyway, it's not worth going overboard with such investments. In other words, it's not the accuracy and abundance of information that should matter most to top executives--rather, it's how that information is interpreted. After all, the role of senior managers isn't just to make decisions; it's to set direction and motivate others in the face of ambiguities and conflicting demands. Top executives must interpret information and communicate those interpretations--they must manage meaning more than they must manage information. So which of these competing views is the right one? Research conducted by academics Sutcliffe and Weber found that how accurate senior executives are about their competitive environments is indeed less important for strategy and corresponding organizational changes than the way in which they interpret information about their environments. Investments in shaping those interpretations, therefore, may create a more durable competitive advantage than investments in obtaining and organizing more information. And what kinds of interpretations are most closely linked with high performance? Their research suggests that high performers respond positively to opportunities, yet they aren't overconfident in their abilities to take advantage of those opportunities. PMID:12747164

  1. Cost-estimating relationships for space programs

    NASA Technical Reports Server (NTRS)

    Mandell, Humboldt C., Jr.

    1992-01-01

    Cost-estimating relationships (CERs) are defined and discussed as they relate to the estimation of theoretical costs for space programs. The paper primarily addresses CERs based on analogous relationships between physical and performance parameters to estimate future costs. Analytical estimation principles are reviewed, examining the sources of errors in cost models, and the use of CERs is shown to be affected by organizational culture. Two paradigms for cost estimation are set forth: (1) the Rand paradigm for single-culture single-system methods; and (2) the Price paradigms that incorporate a set of cultural variables. For space programs that are potentially subject to even small cultural changes, the Price paradigms are argued to be more effective. The derivation and use of accurate CERs is important for developing effective cost models to analyze the potential of a given space program.

  2. Manned Mars mission cost estimate

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph; Smith, Keith

    1986-01-01

    The potential costs of several options of a manned Mars mission are examined. A cost estimating methodology based primarily on existing Marshall Space Flight Center (MSFC) parametric cost models is summarized. These models include the MSFC Space Station Cost Model and the MSFC Launch Vehicle Cost Model as well as other models and techniques. The ground rules and assumptions of the cost estimating methodology are discussed and cost estimates presented for six potential mission options which were studied. The estimated manned Mars mission costs are compared to the cost of the somewhat analogous Apollo Program after normalizing the Apollo cost to the environment and ground rules of the manned Mars missions. It is concluded that a manned Mars mission, as currently defined, could be accomplished for under $30 billion in 1985 dollars excluding launch vehicle development and mission operations.

  3. ADP (Automated Data Processing) cost estimating heuristics

    SciTech Connect

    Sadlowe, A.R.; Arrowood, L.F.; Jones, K.A.; Emrich, M.L.; Watson, B.D.

    1987-09-11

    Artificial Intelligence, in particular expert systems methodologies, is being applied to the US Navy's Automated Data Processing estimating tasks. Skilled Navy project leaders are nearing retirement; replacements may not yet possess the many years of experience required to make accurate decisions regarding time, cost, equipment, and personnel needs. The potential departure of expertise resulted in the development of a system to capture organizational expertise. The prototype allows inexperienced project leaders to interactively generate cost estimates. 5 refs.

  4. Estimating the Cost of Doing a Cost Estimate

    NASA Technical Reports Server (NTRS)

    Remer, D. S.; Buchanan, H. R.

    1996-01-01

    This article provides a model for estimating the cost required to do a cost estimate...Our earlier work provided data for high technology projects. This article adds data from the construction industry which validates the model over a wider range of technology.

  5. Cost Estimating Handbook for Environmental Restoration

    SciTech Connect

    1990-09-01

    Environmental restoration (ER) projects have presented the DOE and cost estimators with a number of properties that are not comparable to the normal estimating climate within DOE. These properties include: An entirely new set of specialized expressions and terminology. A higher than normal exposure to cost and schedule risk, as compared to most other DOE projects, due to changing regulations, public involvement, resource shortages, and scope of work. A higher than normal percentage of indirect costs to the total estimated cost due primarily to record keeping, special training, liability, and indemnification. More than one estimate for a project, particularly in the assessment phase, in order to provide input into the evaluation of alternatives for the cleanup action. While some aspects of existing guidance for cost estimators will be applicable to environmental restoration projects, some components of the present guidelines will have to be modified to reflect the unique elements of these projects. The purpose of this Handbook is to assist cost estimators in the preparation of environmental restoration estimates for Environmental Restoration and Waste Management (EM) projects undertaken by DOE. The DOE has, in recent years, seen a significant increase in the number, size, and frequency of environmental restoration projects that must be costed by the various DOE offices. The coming years will show the EM program to be the largest non-weapons program undertaken by DOE. These projects create new and unique estimating requirements since historical cost and estimating precedents are meager at best. It is anticipated that this Handbook will enhance the quality of cost data within DOE in several ways by providing: The basis for accurate, consistent, and traceable baselines. Sound methodologies, guidelines, and estimating formats. Sources of cost data/databases and estimating tools and techniques available to DOE cost professionals.

  6. 31 CFR 205.24 - How are accurate estimates maintained?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How are accurate estimates maintained... Treasury-State Agreement § 205.24 How are accurate estimates maintained? (a) If a State has knowledge that an estimate does not reasonably correspond to the State's cash needs for a Federal assistance...

  7. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay

    2005-01-01

    The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and d) a unique U.S. asset for science product validation and verification.

  8. Micromagnetometer calibration for accurate orientation estimation.

    PubMed

    Zhang, Zhi-Qiang; Yang, Guang-Zhong

    2015-02-01

    Micromagnetometers, together with inertial sensors, are widely used for attitude estimation for a wide variety of applications. However, appropriate sensor calibration, which is essential to the accuracy of attitude reconstruction, must be performed in advance. Thus far, many different magnetometer calibration methods have been proposed to compensate for errors such as scale, offset, and nonorthogonality. They have also been used to obviate magnetic errors due to soft and hard iron. However, in order to combine the magnetometer with the inertial sensor for attitude reconstruction, alignment difference between the magnetometer and the axes of the inertial sensor must be determined as well. This paper proposes a practical means of sensor error correction by simultaneous consideration of sensor errors, magnetic errors, and alignment difference. We take the summation of the offset and hard iron error as the combined bias and then amalgamate the alignment difference and all the other errors as a transformation matrix. A two-step approach is presented to determine the combined bias and transformation matrix separately. In the first step, the combined bias is determined by finding an optimal ellipsoid that can best fit the sensor readings. In the second step, the intrinsic relationships of the raw sensor readings are explored to estimate the transformation matrix as a homogeneous linear least-squares problem. Singular value decomposition is then applied to estimate both the transformation matrix and magnetic vector. The proposed method is then applied to calibrate our sensor node. Although there is no ground truth for the combined bias and transformation matrix for our node, the consistency of calibration results among different trials and less than 3° root mean square error for orientation estimation have been achieved, which illustrates the effectiveness of the proposed sensor calibration method for practical applications. PMID:25265625
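
    The first step of the two-step approach (recover a combined bias from the geometry of the raw readings) can be illustrated with the simplified sketch below. It fits a sphere rather than the full ellipsoid the paper describes, and all names and the synthetic data are assumptions, so it is a rough outline of the idea only, not the authors' method.

```python
import numpy as np

def estimate_combined_bias(readings: np.ndarray) -> np.ndarray:
    """Estimate the combined bias (offset plus hard iron) from raw magnetometer samples.

    Simplified sketch: a sphere ||m - b||^2 = r^2 is fitted by linear least squares,
    instead of the full ellipsoid described in the abstract. `readings` is (N, 3).
    """
    # ||m||^2 = 2 m.b + (r^2 - ||b||^2)  -> linear in the unknowns [b, c]
    A = np.hstack([2.0 * readings, np.ones((readings.shape[0], 1))])
    y = np.sum(readings ** 2, axis=1)
    solution, *_ = np.linalg.lstsq(A, y, rcond=None)
    return solution[:3]

# Synthetic check: samples on a sphere of radius 50 displaced by a known bias.
rng = np.random.default_rng(0)
true_bias = np.array([3.0, -7.0, 12.0])
directions = rng.normal(size=(500, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
samples = true_bias + 50.0 * directions + rng.normal(scale=0.1, size=(500, 3))
print(estimate_combined_bias(samples))  # approximately [3, -7, 12]
```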

  9. Parametric cost estimation for space science missions

    NASA Astrophysics Data System (ADS)

    Lillie, Charles F.; Thompson, Bruce E.

    2008-07-01

    Cost estimation for space science missions is critically important in budgeting for successful missions. The process requires consideration of a number of parameters, where many of the values are only known to a limited accuracy. The results of cost estimation are not perfect, but must be calculated and compared with the estimates that the government uses for budgeting purposes. Uncertainties in the input parameters result from evolving requirements for missions that are typically the "first of a kind" with "state-of-the-art" instruments and new spacecraft and payload technologies that make it difficult to base estimates on the cost histories of previous missions. Even the cost of heritage avionics is uncertain due to parts obsolescence and the resulting redesign work. Through experience and use of industry best practices developed in participation with the Aerospace Industries Association (AIA), Northrop Grumman has developed a parametric modeling approach that can provide a reasonably accurate cost range and most probable cost for future space missions. During the initial mission phases, the approach uses mass- and power-based cost estimating relationships (CERs) developed with historical data from previous missions. In later mission phases, when the mission requirements are better defined, these estimates are updated with vendors' bids and "bottoms-up," "grass-roots" material and labor cost estimates based on detailed schedules and assigned tasks. In this paper we describe how we develop our CERs for parametric cost estimation and how they can be applied to estimate the costs for future space science missions like those presented to the Astronomy & Astrophysics Decadal Survey Study Committees.

  10. Solar power satellite cost estimate

    NASA Technical Reports Server (NTRS)

    Harron, R. J.; Wadle, R. C.

    1981-01-01

    The solar power configuration costed is the 5 GW silicon solar cell reference system. The subsystems are identified by work breakdown structure elements to the lowest level for which cost information was generated. This breakdown divides into five sections: the satellite, construction, transportation, the ground receiving station, and maintenance. For each work breakdown structure element, a definition, design description, and cost estimate were included. An effort was made to include for each element a reference that more thoroughly describes the element and the method of costing used. All costs are in 1977 dollars.

  11. Progress Toward Automated Cost Estimation

    NASA Technical Reports Server (NTRS)

    Brown, Joseph A.

    1992-01-01

    Report discusses efforts to develop standard system of automated cost estimation (ACE) and computer-aided design (CAD). Advantage of system is time saved and accuracy enhanced by automating extraction of quantities from design drawings, consultation of price lists, and application of cost and markup formulas.

  12. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1994-01-01

    NASA is responsible for developing much of the nation's future space technology. Cost estimates for new programs are required early in the planning process so that decisions can be made accurately. Because of the long lead times required to develop space hardware, the cost estimates are frequently required 10 to 15 years before the program delivers hardware. The system design in conceptual phases of a program is usually only vaguely defined and the technology used is often state-of-the-art or beyond. These factors combine to make cost estimating for conceptual programs very challenging. This paper describes an effort to develop parametric cost estimating methods for space systems in the conceptual design phase. The approach is to identify variables that drive cost such as weight, quantity, development culture, design inheritance and time. The nature of the relationships between the driver variables and cost will be discussed. In particular, the relationship between weight and cost will be examined in detail. A theoretical model of cost will be developed and tested statistically against a historical database of major research and development projects.

  13. Accurate Orientation Estimation Using AHRS under Conditions of Magnetic Distortion

    PubMed Central

    Yadav, Nagesh; Bleakley, Chris

    2014-01-01

    Low cost, compact attitude heading reference systems (AHRS) are now being used to track human body movements in indoor environments by estimation of the 3D orientation of body segments. In many of these systems, heading estimation is achieved by monitoring the strength of the Earth's magnetic field. However, the Earth's magnetic field can be locally distorted due to the proximity of ferrous and/or magnetic objects. Herein, we propose a novel method for accurate 3D orientation estimation using an AHRS, comprised of an accelerometer, gyroscope and magnetometer, under conditions of magnetic field distortion. The system performs online detection and compensation for magnetic disturbances, due to, for example, the presence of ferrous objects. The magnetic distortions are detected by exploiting variations in magnetic dip angle, relative to the gravity vector, and in magnetic strength. We investigate and show the advantages of using both magnetic strength and magnetic dip angle for detecting the presence of magnetic distortions. The correction method is based on a particle filter, which performs the correction using an adaptive cost function and by adapting the variance during particle resampling, so as to place more emphasis on the results of dead reckoning of the gyroscope measurements and less on the magnetometer readings. The proposed method was tested in an indoor environment in the presence of various magnetic distortions and under various accelerations (up to 3 g). In the experiments, the proposed algorithm achieves <2° static peak-to-peak error and <5° dynamic peak-to-peak error, significantly outperforming previous methods. PMID:25347584
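
    The detection criterion summarized above (flag a disturbance when the measured field strength or the dip angle relative to gravity departs from reference values) can be sketched as below. This is an illustrative outline under stated assumptions: the function name, threshold parameters, and defaults are not from the paper, and the particle-filter correction step is not reproduced.

```python
import numpy as np

def detect_magnetic_distortion(mag, acc, ref_strength, ref_dip_deg,
                               strength_tol=0.1, dip_tol_deg=5.0):
    """Flag a magnetic disturbance from one magnetometer/accelerometer sample pair.

    Compares the measured field strength and the dip angle (here taken as the angle
    between the magnetic vector and the gravity vector from the accelerometer)
    against reference values; thresholds are illustrative assumptions.
    """
    mag = np.asarray(mag, dtype=float)
    acc = np.asarray(acc, dtype=float)
    strength = np.linalg.norm(mag)
    cos_angle = np.dot(mag, acc) / (strength * np.linalg.norm(acc))
    dip_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    strength_bad = abs(strength - ref_strength) > strength_tol * ref_strength
    dip_bad = abs(dip_deg - ref_dip_deg) > dip_tol_deg
    return strength_bad or dip_bad
```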

  14. Quantifying Accurate Calorie Estimation Using the "Think Aloud" Method

    ERIC Educational Resources Information Center

    Holmstrup, Michael E.; Stearns-Bruening, Kay; Rozelle, Jeffrey

    2013-01-01

    Objective: Clients often have limited time in a nutrition education setting. An improved understanding of the strategies used to accurately estimate calories may help to identify areas of focused instruction to improve nutrition knowledge. Methods: A "Think Aloud" exercise was recorded during the estimation of calories in a standard dinner meal…

  15. Estimated costs of malingered disability.

    PubMed

    Chafetz, Michael; Underhill, James

    2013-11-01

    The feigning of disabling illness for the purpose of disability compensation, or "malingering," is common in Social Security Disability examinations, occurring in 45.8%-59.7% of adult cases. In this study, we estimated the costs of malingering based on mental disorder data published by the Social Security Administration. At the most widely accepted base rate of malingering in medicolegal cases involving external incentive, costs were high, totaling $20.02 billion in 2011 for adult mental disorder claimants. Moreover, these figures clearly underestimate the costs of the larger problem with feigned disability in both adults and children. We urge a change in Social Security policies to allow the use of validity testing in the examination for disability claims. PMID:23800432

  16. Xenia Spacecraft Study Addendum: Spacecraft Cost Estimate

    NASA Technical Reports Server (NTRS)

    Hill, Spencer; Hopkins, Randall

    2009-01-01

    This slide presentation reviews the Xenia spacecraft cost estimates as an addendum for the Xenia Spacecraft study. The NASA/Air Force Cost model (NAFCPOM) was used to derive the cost estimates that are expressed in 2009 dollars.

  17. Accurate parameter estimation for unbalanced three-phase system.

    PubMed

    Chen, Yuan; So, Hing Cheung

    2014-01-01

    Smart grid is an intelligent power generation and control console in modern electricity networks, where the unbalanced three-phase power system is the commonly used model. Here, parameter estimation for this system is addressed. After converting the three-phase waveforms into a pair of orthogonal signals via the αβ-transformation, the nonlinear least squares (NLS) estimator is developed for accurately finding the frequency, phase, and voltage parameters. The estimator is realized by the Newton-Raphson scheme, whose global convergence is studied in this paper. Computer simulations show that the mean square error performance of the NLS method can attain the Cramér-Rao lower bound. Moreover, our proposal provides more accurate frequency estimation when compared with the complex least mean square (CLMS) and augmented CLMS. PMID:25162056
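
    The preprocessing step named above, mapping three-phase waveforms to a pair of orthogonal signals, is the Clarke (αβ) transform. The sketch below shows the standard amplitude-invariant form as a hedged illustration; the NLS/Newton-Raphson estimator applied afterwards is not reproduced here, and the example signals are assumptions.

```python
import numpy as np

def alpha_beta_transform(va, vb, vc):
    """Amplitude-invariant Clarke (alpha-beta) transform of three-phase samples."""
    alpha = (2.0 / 3.0) * (va - 0.5 * vb - 0.5 * vc)
    beta = (vb - vc) / np.sqrt(3.0)
    return alpha, beta

# A balanced three-phase system maps to a unit-amplitude rotating (alpha, beta) phasor.
t = np.linspace(0.0, 0.04, 400)
theta = 2.0 * np.pi * 50.0 * t
va = np.cos(theta)
vb = np.cos(theta - 2.0 * np.pi / 3.0)
vc = np.cos(theta + 2.0 * np.pi / 3.0)
alpha, beta = alpha_beta_transform(va, vb, vc)
print(np.allclose(alpha ** 2 + beta ** 2, 1.0))  # True for the balanced case
```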

  18. Cost Estimation and Control for Flight Systems

    NASA Technical Reports Server (NTRS)

    Hammond, Walter E.; Vanhook, Michael E. (Technical Monitor)

    2002-01-01

    Good program management practices, cost analysis, cost estimation, and cost control for aerospace flight systems are interrelated and depend upon each other. The best cost control process cannot overcome poor design or poor systems trades that lead to the wrong approach. The project needs robust Technical, Schedule, Cost, Risk, and Cost Risk practices before it can incorporate adequate Cost Control. Cost analysis both precedes and follows cost estimation -- the two are closely coupled with each other and with Risk analysis. Parametric cost estimating relationships and computerized models are most often used. NASA has learned some valuable lessons in controlling cost problems, and recommends use of a summary Project Manager's checklist as shown here.

  19. Accurate pose estimation using single marker single camera calibration system

    NASA Astrophysics Data System (ADS)

    Pati, Sarthak; Erat, Okan; Wang, Lejing; Weidert, Simon; Euler, Ekkehard; Navab, Nassir; Fallavollita, Pascal

    2013-03-01

    Visual marker based tracking is one of the most widely used tracking techniques in Augmented Reality (AR) applications. Generally, multiple square markers are needed to perform robust and accurate tracking. Various marker based methods for calibrating relative marker poses have already been proposed. However, the calibration accuracy of these methods relies on the order of the image sequence and pre-evaluation of pose-estimation errors, making the method offline. Several studies have shown that the accuracy of pose estimation for an individual square marker depends on camera distance and viewing angle. We propose a method to accurately model the error in the estimated pose and translation of a camera using a single marker via an online method based on the Scaled Unscented Transform (SUT). Thus, the pose of each marker can be estimated with highly accurate calibration results, independent of the order of image sequences, compared to cases when this knowledge is not used. This removes the need for having multiple markers and an offline estimation system to calculate camera pose in an AR application.

  20. Supplemental report on cost estimates

    SciTech Connect

    1992-04-29

    The Office of Management and Budget (OMB) and the U.S. Army Corps of Engineers have completed an analysis of the Department of Energy's (DOE) Fiscal Year (FY) 1993 budget request for its Environmental Restoration and Waste Management (ERWM) program. The results were presented to an interagency review group (IAG) of senior-Administration officials for their consideration in the budget process. This analysis included evaluations of the underlying legal requirements and cost estimates on which the ERWM budget request was based. The major conclusions are contained in a separate report entitled, ''Interagency Review of the Department of Energy Environmental Restoration and Waste Management Program.'' This Corps supplemental report provides greater detail on the cost analysis.

  1. 40 CFR 261.142 - Cost estimate.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Cost estimate. 261.142 Section 261.142... Materials § 261.142 Cost estimate. (a) The owner or operator must have a detailed written estimate, in... facility. (1) The estimate must equal the cost of conducting the activities described in paragraph (a)...

  2. Estimating the Costs of Preventive Interventions

    ERIC Educational Resources Information Center

    Foster, E. Michael; Porter, Michele M.; Ayers, Tim S.; Kaplan, Debra L.; Sandler, Irwin

    2007-01-01

    The goal of this article is to improve the practice and reporting of cost estimates of prevention programs. It reviews the steps in estimating the costs of an intervention and the principles that should guide estimation. The authors then review prior efforts to estimate intervention costs using a sample of well-known but diverse studies. Finally,…

  3. An Accurate Link Correlation Estimator for Improving Wireless Protocol Performance

    PubMed Central

    Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun

    2015-01-01

    Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation. PMID:25686314

  4. An accurate link correlation estimator for improving wireless protocol performance.

    PubMed

    Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun

    2015-01-01

    Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation. PMID:25686314

  5. Software Development Cost Estimation Executive Summary

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus M.; Menzies, Tim

    2006-01-01

    Identify simple fully validated cost models that provide estimation uncertainty with cost estimate. Based on COCOMO variable set. Use machine learning techniques to determine: a) Minimum number of cost drivers required for NASA domain based cost models; b) Minimum number of data records required and c) Estimation Uncertainty. Build a repository of software cost estimation information. Coordinating tool development and data collection with: a) Tasks funded by PA&E Cost Analysis; b) IV&V Effort Estimation Task and c) NASA SEPG activities.

  6. Fast and accurate estimation for astrophysical problems in large databases

    NASA Astrophysics Data System (ADS)

    Richards, Joseph W.

    2010-10-01

    A recent flood of astronomical data has created much demand for sophisticated statistical and machine learning tools that can rapidly draw accurate inferences from large databases of high-dimensional data. In this Ph.D. thesis, methods for statistical inference in such databases will be proposed, studied, and applied to real data. I use methods for low-dimensional parametrization of complex, high-dimensional data that are based on the notion of preserving the connectivity of data points in the context of a Markov random walk over the data set. I show how this simple parameterization of data can be exploited to: define appropriate prototypes for use in complex mixture models, determine data-driven eigenfunctions for accurate nonparametric regression, and find a set of suitable features to use in a statistical classifier. In this thesis, methods for each of these tasks are built up from simple principles, compared to existing methods in the literature, and applied to data from astronomical all-sky surveys. I examine several important problems in astrophysics, such as estimation of star formation history parameters for galaxies, prediction of redshifts of galaxies using photometric data, and classification of different types of supernovae based on their photometric light curves. Fast methods for high-dimensional data analysis are crucial in each of these problems because they all involve the analysis of complicated high-dimensional data in large, all-sky surveys. Specifically, I estimate the star formation history parameters for the nearly 800,000 galaxies in the Sloan Digital Sky Survey (SDSS) Data Release 7 spectroscopic catalog, determine redshifts for over 300,000 galaxies in the SDSS photometric catalog, and estimate the types of 20,000 supernovae as part of the Supernova Photometric Classification Challenge. Accurate predictions and classifications are imperative in each of these examples because these estimates are utilized in broader inference problems

  7. Hydrogen Station Cost Estimates: Comparing Hydrogen Station Cost Calculator Results with other Recent Estimates

    SciTech Connect

    Melaina, M.; Penev, M.

    2013-09-01

    This report compares hydrogen station cost estimates conveyed by expert stakeholders through the Hydrogen Station Cost Calculation (HSCC) to a select number of other cost estimates. These other cost estimates include projections based upon cost models and costs associated with recently funded stations.

  8. Accurate estimation of sigma(exp 0) using AIRSAR data

    NASA Technical Reports Server (NTRS)

    Holecz, Francesco; Rignot, Eric

    1995-01-01

    During recent years signature analysis, classification, and modeling of Synthetic Aperture Radar (SAR) data as well as estimation of geophysical parameters from SAR data have received a great deal of interest. An important requirement for the quantitative use of SAR data is the accurate estimation of the backscattering coefficient sigma(exp 0). In terrain with relief variations radar signals are distorted due to the projection of the scene topography into the slant range-Doppler plane. The effect of these variations is to change the physical size of the scattering area, leading to errors in the radar backscatter values and incidence angle. For this reason the local incidence angle, derived from sensor position and Digital Elevation Model (DEM) data, must always be considered. Especially in the airborne case, the antenna gain pattern can be an additional source of radiometric error, because the radar look angle is not known precisely as a result of the aircraft motions and the local surface topography. Consequently, radiometric distortions due to the antenna gain pattern must also be corrected for each resolution cell, by taking into account aircraft displacements (position and attitude) and position of the backscatter element, defined by the DEM data. In this paper, a method to derive an accurate estimation of the backscattering coefficient using NASA/JPL AIRSAR data is presented. The results are evaluated in terms of geometric accuracy, radiometric variations of sigma(exp 0), and precision of the estimated forest biomass.

  9. Cost Estimates for Federal Student Loans: The Market Cost Debate

    ERIC Educational Resources Information Center

    Delisle, Jason

    2008-01-01

    In an ongoing debate about the relative costs of the federal government's direct and guaranteed student loan programs, some budget experts and private lenders have argued for the use of "market cost" estimates. They assert that official government cost estimates for federal student loans differ from what private entities would likely charge…

  10. Accurate and robust estimation of camera parameters using RANSAC

    NASA Astrophysics Data System (ADS)

    Zhou, Fuqiang; Cui, Yi; Wang, Yexin; Liu, Liu; Gao, He

    2013-03-01

    Camera calibration plays an important role in the field of machine vision applications. The popularly used calibration approach based on a 2D planar target sometimes fails to give reliable and accurate results due to the inaccurate or incorrect localization of feature points. To solve this problem, an accurate and robust estimation method for camera parameters based on the RANSAC algorithm is proposed to detect the unreliability and provide the corresponding solutions. Through this method, most of the outliers are removed and the calibration errors that are the main factors influencing measurement accuracy are reduced. Both simulative and real experiments have been carried out to evaluate the performance of the proposed method, and the results show that the proposed method is robust under large noise conditions and quite effective in improving the calibration accuracy compared with the original state.
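
    The core RANSAC idea used above (repeatedly fit a model to a random minimal subset, keep the largest consensus set, and discard the rest as outliers) is shown in the generic sketch below. It uses 2D line fitting as a stand-in for the calibration model, so names, thresholds, and the model are assumptions, not the authors' implementation.

```python
import numpy as np

def ransac_inliers(points, n_iters=200, inlier_tol=0.05, seed=0):
    """Generic RANSAC consensus loop, shown here for 2D line fitting."""
    rng = np.random.default_rng(seed)
    points = np.asarray(points, dtype=float)
    best = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, d = points[i], points[j] - points[i]
        norm = np.linalg.norm(d)
        if norm < 1e-12:
            continue
        # Perpendicular distance of every point to the candidate line through i and j.
        dists = np.abs((points[:, 0] - p[0]) * d[1] - (points[:, 1] - p[1]) * d[0]) / norm
        inliers = dists < inlier_tol
        if inliers.sum() > best.sum():
            best = inliers
    return best  # a final least-squares fit on this consensus set would follow
```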

  11. Accurate Satellite-Derived Estimates of Tropospheric Ozone Radiative Forcing

    NASA Technical Reports Server (NTRS)

    Joiner, Joanna; Schoeberl, Mark R.; Vasilkov, Alexander P.; Oreopoulos, Lazaros; Platnick, Steven; Livesey, Nathaniel J.; Levelt, Pieternel F.

    2008-01-01

    Estimates of the radiative forcing due to anthropogenically-produced tropospheric O3 are derived primarily from models. Here, we use tropospheric ozone and cloud data from several instruments in the A-train constellation of satellites as well as information from the GEOS-5 Data Assimilation System to accurately estimate the instantaneous radiative forcing from tropospheric O3 for January and July 2005. We improve upon previous estimates of tropospheric ozone mixing ratios from a residual approach using the NASA Earth Observing System (EOS) Aura Ozone Monitoring Instrument (OMI) and Microwave Limb Sounder (MLS) by incorporating cloud pressure information from OMI. Since we cannot distinguish between natural and anthropogenic sources with the satellite data, our estimates reflect the total forcing due to tropospheric O3. We focus specifically on the magnitude and spatial structure of the cloud effect on both the short- and long-wave radiative forcing. The estimates presented here can be used to validate present day O3 radiative forcing produced by models.

  12. Accurate, low-cost 3D-models of gullies

    NASA Astrophysics Data System (ADS)

    Onnen, Nils; Gronz, Oliver; Ries, Johannes B.; Brings, Christine

    2015-04-01

    are able to produce accurate and low-cost 3D-models of gullies.

  13. Outer planet probe cost estimates: First impressions

    NASA Technical Reports Server (NTRS)

    Niehoff, J.

    1974-01-01

    An examination was made of early estimates of outer planetary atmospheric probe cost by comparing the estimates with past planetary projects. Of particular interest is identification of project elements which are likely cost drivers for future probe missions. Data are divided into two parts: first, the description of a cost model developed by SAI for the Planetary Programs Office of NASA, and second, use of this model and its data base to evaluate estimates of probe costs. Several observations are offered in conclusion regarding the credibility of current estimates and specific areas of the outer planet probe concept most vulnerable to cost escalation.

  14. Robust ODF smoothing for accurate estimation of fiber orientation.

    PubMed

    Beladi, Somaieh; Pathirana, Pubudu N; Brotchie, Peter

    2010-01-01

    Q-ball imaging was presented as a model free, linear and multimodal diffusion sensitive approach to reconstruct the diffusion orientation distribution function (ODF) using diffusion weighted MRI data. The ODFs are widely used to estimate the fiber orientations. However, the smoothness constraint was proposed to achieve a balance between the angular resolution and noise stability for ODF constructs. Different regularization methods were proposed for this purpose. However, these methods are not robust and quite sensitive to the global regularization parameter. Although numerical methods such as the L-curve test are used to define a globally appropriate regularization parameter, it cannot serve as a universal value suitable for all regions of interest. This may result in over smoothing and potentially end up in neglecting an existing fiber population. In this paper, we propose to include an interpolation step prior to the spherical harmonic decomposition. This interpolation-based approach, built on Delaunay triangulation, provides a reliable, robust and accurate smoothing method. This method is easy to implement and does not require other numerical methods to define the required parameters. Also, the fiber orientations estimated using this approach are more accurate compared to other common approaches. PMID:21096202

  15. Accurate estimators of correlation functions in Fourier space

    NASA Astrophysics Data System (ADS)

    Sefusatti, E.; Crocce, M.; Scoccimarro, R.; Couchman, H. M. P.

    2016-08-01

    Efficient estimators of Fourier-space statistics for large numbers of objects rely on fast Fourier transforms (FFTs), which are affected by aliasing from unresolved small-scale modes due to the finite FFT grid. Aliasing takes the form of a sum over images, each of them corresponding to the Fourier content displaced by increasing multiples of the sampling frequency of the grid. These spurious contributions limit the accuracy in the estimation of Fourier-space statistics, and are typically ameliorated by simultaneously increasing grid size and discarding high-frequency modes. This results in inefficient estimates for, e.g., the power spectrum when the desired systematic biases are well under the per cent level. We show that using interlaced grids removes odd images, which include the dominant contribution to aliasing. In addition, we discuss the choice of interpolation kernel used to define density perturbations on the FFT grid and demonstrate that using higher order interpolation kernels than the standard Cloud-In-Cell algorithm results in significant reduction of the remaining images. We show that combining fourth-order interpolation with interlacing gives very accurate Fourier amplitudes and phases of density perturbations. This results in power spectrum and bispectrum estimates that have systematic biases below 0.01 per cent all the way to the Nyquist frequency of the grid, thus maximizing the use of unbiased Fourier coefficients for a given grid size and greatly reducing systematics for applications to large cosmological data sets.
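
    For readers unfamiliar with the standard Cloud-In-Cell kernel named above, the sketch below shows a minimal 1D version of the mass assignment step, with a parameter for the half-cell shift used by the interlaced grid. It is an illustrative assumption-laden sketch (1D only, names chosen here), not the estimator from the paper; the Fourier-space combination of the two grids is not shown.

```python
import numpy as np

def cic_assign(positions, n_grid, box_size, shift=0.0):
    """1D Cloud-In-Cell (CIC) assignment of particle positions to a periodic FFT grid.

    Passing shift=0.5 builds the half-cell displaced grid used for interlacing.
    """
    grid = np.zeros(n_grid)
    cell = box_size / n_grid
    x = np.asarray(positions) / cell - shift
    left = np.floor(x).astype(int)
    frac = x - left                      # fractional distance to the left grid point
    np.add.at(grid, left % n_grid, 1.0 - frac)
    np.add.at(grid, (left + 1) % n_grid, frac)
    return grid

# Interlacing (per the abstract) combines the FFTs of the shift=0.0 and shift=0.5
# grids to cancel the odd aliasing images; that combination step is not shown here.
```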

  16. Accurate charge capture and cost allocation: cost justification for bedside computing.

    PubMed Central

    Grewal, R.; Reed, R. L.

    1993-01-01

    This paper shows that cost justification for bedside clinical computing can be made by recouping charges with accurate charge capture. Twelve months' worth of professional charges for a sixteen-bed surgical intensive care unit are computed from charted data in a bedside clinical database and are compared to the professional charges actually billed by the unit. A substantial difference between predicted charges and billed charges was found. This paper also discusses the concept of appropriate cost allocation in the inpatient environment and the feasibility of appropriate allocation as a by-product of bedside computing. PMID:8130444

  17. 40 CFR 261.142 - Cost estimate.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 26 2011-07-01 2011-07-01 false Cost estimate. 261.142 Section 261.142... AND LISTING OF HAZARDOUS WASTE Financial Requirements for Management of Excluded Hazardous Secondary Materials § 261.142 Cost estimate. (a) The owner or operator must have a detailed written estimate,...

  18. 40 CFR 261.142 - Cost estimate.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 26 2014-07-01 2014-07-01 false Cost estimate. 261.142 Section 261.142... AND LISTING OF HAZARDOUS WASTE Financial Requirements for Management of Excluded Hazardous Secondary Materials § 261.142 Cost estimate. (a) The owner or operator must have a detailed written estimate,...

  19. 40 CFR 261.142 - Cost estimate.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 27 2013-07-01 2013-07-01 false Cost estimate. 261.142 Section 261.142... AND LISTING OF HAZARDOUS WASTE Financial Requirements for Management of Excluded Hazardous Secondary Materials § 261.142 Cost estimate. (a) The owner or operator must have a detailed written estimate,...

  20. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    The development of parametric cost estimating methods for advanced space systems in the conceptual design phase is discussed. The process of identifying variables which drive cost and the relationship between weight and cost are discussed. A theoretical model of cost is developed and tested using a historical data base of research and development projects.

  1. Process-based Cost Estimation for Ramjet/Scramjet Engines

    NASA Technical Reports Server (NTRS)

    Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John

    2003-01-01

    Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottoms-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically, the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.

  2. Developing Analogy Cost Estimates for Space Missions

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2004-01-01

    The analogy approach in cost estimation combines actual cost data from similar existing systems, activities, or items with adjustments for a new project's technical, physical or programmatic differences to derive a cost estimate for the new system. This method is normally used early in a project cycle when there is insufficient design/cost data to use as a basis for (or insufficient time to perform) a detailed engineering cost estimate. The major limitation of this method is that it relies on the judgment and experience of the analyst/estimator. The analyst must ensure that the best analogy or analogies have been selected, and that appropriate adjustments have been made. While analogy costing is common, there is a dearth of advice in the literature on the 'adjustment methodology', especially for hardware projects. This paper discusses some potential approaches that can improve rigor and repeatability in the analogy costing process.
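    A minimal sketch of the analogy-with-adjustments arithmetic described above; the analog cost and the adjustment factors are hypothetical, and a real estimate would justify each factor from the technical, physical, or programmatic difference it represents.

```python
def analogy_estimate(analog_cost, adjustments):
    """Scale an analog's actual cost by multiplicative adjustment factors.

    analog_cost: actual cost of the existing, similar system.
    adjustments: dict of factor name -> multiplier (>1 means the new
                 system is expected to cost more in that respect).
    """
    estimate = analog_cost
    for name, factor in adjustments.items():
        estimate *= factor
    return estimate

# Hypothetical example: new instrument judged 20% heavier, built by a less
# experienced team, but reusing a substantial part of the analog's design.
new_cost = analogy_estimate(
    analog_cost=42.0,        # $M, actual cost of the analog (hypothetical)
    adjustments={"mass": 1.20, "team experience": 1.10, "design reuse": 0.85},
)
print(f"analogy estimate: ${new_cost:.1f}M")
```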

  3. Data Service Provider Cost Estimation Tool

    NASA Technical Reports Server (NTRS)

    Fontaine, Kathy; Hunolt, Greg; Booth, Arthur L.; Banks, Mel

    2011-01-01

    The Data Service Provider Cost Estimation Tool (CET) and Comparables Database (CDB) package provides NASA's Earth Science Enterprise (ESE) with the ability to produce the full range of year-by-year lifecycle cost estimates for the implementation and operation of data service providers required by ESE to support its science and applications programs. The CET can make estimates dealing with staffing costs, supplies, facility costs, network services, hardware and maintenance, commercial off-the-shelf (COTS) software licenses, software development and sustaining engineering, and the changes in costs that result from changes in workload. Data Service Providers may be stand-alone or embedded in flight projects, field campaigns, research or applications projects, or other activities. The CET and CDB package employs a cost-estimation-by-analogy approach. It is based on a new, general data service provider reference model that provides a framework for construction of a database by describing existing data service providers that are analogs (or comparables) to planned, new ESE data service providers. The CET implements the staff effort and cost estimation algorithms that access the CDB and generates the lifecycle cost estimate for a new data services provider. These data create a common basis on which an ESE proposal evaluator can consider projected data service provider costs.

  4. A Cost Estimation Tool for Charter Schools

    ERIC Educational Resources Information Center

    Hayes, Cheryl D.; Keller, Eric

    2009-01-01

    To align their financing strategies and fundraising efforts with their fiscal needs, charter school leaders need to know how much funding they need and what that funding will support. This cost estimation tool offers a simple set of worksheets to help start-up charter school operators identify and estimate the range of costs and timing of…

  5. COST ESTIMATING SYSTEMS FOR REMEDIAL ACTION PROJECTS

    EPA Science Inventory

    This paper details the ongoing collaboration between the U.S. EPA and the U.S. Army Corps of Engineers in the development of complementary micro-computer based cost estimating systems for hazardous waste remediations. he U.S. EPA system, "Remedial Action Cost Estimating System" (...

  6. Demystifying the Cost Estimation Process

    ERIC Educational Resources Information Center

    Obi, Samuel C.

    2010-01-01

    In manufacturing today, nothing is more important than giving a customer a clear and straight-forward accounting of what their money has purchased. Many potentially promising return business orders are lost because of unclear, ambiguous, or improper billing. One of the best ways of resolving cost bargaining conflicts is by providing a…

  7. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    Parametric cost estimating methods for space systems in the conceptual design phase are developed. The approach is to identify variables that drive cost such as weight, quantity, development culture, design inheritance, and time. The relationship between weight and cost is examined in detail. A theoretical model of cost is developed and tested statistically against a historical data base of major research and development programs. It is concluded that the technique presented is sound, but that it must be refined in order to produce acceptable cost estimates.
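    As a rough illustration of the weight-cost relationship examined above, the sketch below fits a power-law cost estimating relationship C = a·W^b by least squares in log space; the weight and cost data points are hypothetical, not drawn from the paper's historical data base.

```python
import numpy as np

# Hypothetical historical data: dry weight (kg) and development cost ($M).
weight = np.array([150, 320, 500, 890, 1200, 2100])
cost = np.array([38, 70, 95, 160, 210, 330])

# Fit log(C) = log(a) + b*log(W)  ->  C = a * W**b
b, log_a = np.polyfit(np.log(weight), np.log(cost), 1)
a = np.exp(log_a)

def cer(w):
    """Predict development cost ($M) from weight (kg) using the fitted CER."""
    return a * w ** b

print(f"C = {a:.2f} * W^{b:.2f}; predicted cost at 750 kg: ${cer(750):.0f}M")
```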

  8. A Framework for Automating Cost Estimates in Assembly Processes

    SciTech Connect

    Calton, T.L.; Peters, R.R.

    1998-12-09

    When a product concept emerges, the manufacturing engineer is asked to sketch out a production strategy and estimate its cost. The engineer is given an initial product design, along with a schedule of expected production volumes. The engineer then determines the best approach to manufacturing the product, comparing a variety of alternative production strategies. The engineer must consider capital cost, operating cost, lead-time, and other issues in an attempt to maximize profits. After making these basic choices and sketching the design of overall production, the engineer produces estimates of the required capital, operating costs, and production capacity. This process may iterate as the product design is refined in order to improve its performance or manufacturability. The focus of this paper is on the development of computer tools to aid manufacturing engineers in their decision-making processes. This computer software tool provides a framework in which accurate cost estimates can be seamlessly derived from design requirements at the start of any engineering project. The result is faster cycle times through first-pass success, and lower life cycle cost due to requirements-driven design and accurate cost estimates derived early in the process.

  9. An approach to software cost estimation

    NASA Technical Reports Server (NTRS)

    Mcgarry, F.; Page, J.; Card, D.; Rohleder, M.; Church, V.

    1984-01-01

    A general procedure for software cost estimation in any environment is outlined. The basic concepts of work and effort estimation are explained, some popular resource estimation models are reviewed, and the accuracy of source estimates is discussed. A software cost prediction procedure based on the experiences of the Software Engineering Laboratory in the flight dynamics area and incorporating management expertise, cost models, and historical data is described. The sources of information and relevant parameters available during each phase of the software life cycle are identified. The methodology suggested incorporates these elements into a customized management tool for software cost prediction. Detailed guidelines for estimation in the flight dynamics environment developed using this methodology are presented.

  10. Estimating the costs of human space exploration

    NASA Technical Reports Server (NTRS)

    Mandell, Humboldt C., Jr.

    1994-01-01

    The plan for NASA's new exploration initiative has the following strategic themes: (1) incremental, logical evolutionary development; (2) economic viability; and (3) excellence in management. The cost estimation process is involved with all of these themes and they are completely dependent upon the engineering cost estimator for success. The purpose is to articulate the issues associated with beginning this major new government initiative, to show how NASA intends to resolve them, and finally to demonstrate the vital importance of a leadership role by the cost estimation community.

  11. Process Equipment Cost Estimation, Final Report

    SciTech Connect

    H.P. Loh; Jennifer Lyons; Charles W. White, III

    2002-01-01

    This report presents generic cost curves for several equipment types generated using ICARUS Process Evaluator. The curves give Purchased Equipment Cost as a function of a capacity variable. This work was performed to assist NETL engineers and scientists in performing rapid, order of magnitude level cost estimates or as an aid in evaluating the reasonableness of cost estimates submitted with proposed systems studies or proposals for new processes. The specific equipment types contained in this report were selected to represent a relatively comprehensive set of conventional chemical process equipment types.

  12. Statistical methods of estimating mining costs

    USGS Publications Warehouse

    Long, K.R.

    2011-01-01

    Until it was defunded in 1995, the U.S. Bureau of Mines maintained a Cost Estimating System (CES) for prefeasibility-type economic evaluations of mineral deposits and estimating costs at producing and non-producing mines. This system had a significant role in mineral resource assessments to estimate costs of developing and operating known mineral deposits and predicted undiscovered deposits. For legal reasons, the U.S. Geological Survey cannot update and maintain CES. Instead, statistical tools are under development to estimate mining costs from basic properties of mineral deposits such as tonnage, grade, mineralogy, depth, strip ratio, distance from infrastructure, rock strength, and work index. The first step was to reestimate "Taylor's Rule" which relates operating rate to available ore tonnage. The second step was to estimate statistical models of capital and operating costs for open pit porphyry copper mines with flotation concentrators. For a sample of 27 proposed porphyry copper projects, capital costs can be estimated from three variables: mineral processing rate, strip ratio, and distance from nearest railroad before mine construction began. Of all the variables tested, operating costs were found to be significantly correlated only with strip ratio.
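    As a rough illustration of the first step described above (re-estimating a Taylor-type rule relating operating rate to available ore tonnage), the sketch below fits the power law by log-log regression; the deposit tonnages and operating rates are hypothetical, not USGS data.

```python
import numpy as np

# Hypothetical deposits: ore tonnage (t) and realized operating rate (t/day).
tonnage = np.array([2.0e6, 8.5e6, 3.0e7, 1.2e8, 4.0e8])
rate = np.array([1.4e3, 4.2e3, 1.1e4, 3.0e4, 7.5e4])

# Taylor-type rule: rate = k * tonnage**b, estimated in log space.
b, log_k = np.polyfit(np.log(tonnage), np.log(rate), 1)
k = np.exp(log_k)
print(f"rate = {k:.3g} * tonnage^{b:.2f}")
# The classic formulation has an exponent near 0.75; the fit above simply
# re-estimates it from whatever sample of deposits is supplied.
```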

  13. Estimating archiving costs for engineering records

    SciTech Connect

    Stutz, R.A.; Lamartine, B.C.

    1997-02-01

    Information technology has completely changed the concept of record keeping for engineering projects -- the advent of digital records was a momentous discovery, as significant as the invention of the printing press. Digital records allowed huge amounts of information to be stored in a very small space and to be examined quickly. However, digital documents are much more vulnerable to the passage of time than printed documents because the media on which they are stored are easily affected by physical phenomena, such as magnetic fields, oxidation, material decay, and by various environmental factors that may erase the information. Even more important, digital information becomes obsolete because, even if future generations may be able to read it, they may not necessarily be able to interpret it. Engineering projects of all sizes are becoming more dependent on digital records. These records are created on computers used in design, estimating, construction management, and construction. The necessity for the accurate and accessible storage of these documents, generated by computer software systems, is increasing for a number of reasons including legal and environmental issues. This paper will discuss media life considerations and life cycle costs associated with several methods of storing engineering records.

  14. Dynamic cost risk estimation and budget misspecification

    NASA Technical Reports Server (NTRS)

    Ebbeler, D. H.; Fox, G.; Habib-Agahi, H.

    2003-01-01

    Cost risk for new technology development is estimated by explicit stochastic processes. Monte Carlo simulation is used to propagate technology development activity budget changes during the technology development cycle.
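    A minimal sketch of the Monte Carlo idea described above, under purely hypothetical distributions: uncertain year-by-year budget growth factors are propagated through a development cycle and cost-risk percentiles are read off the simulated totals. The baseline profile, lognormal parameters, and carry-forward assumption are illustrative, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_years = 10_000, 5
baseline = np.array([10.0, 18.0, 25.0, 20.0, 12.0])   # planned $M per year (hypothetical)

# Each year's actual spend drifts from plan by a lognormal growth factor,
# and overruns carry forward into later years via the cumulative product.
growth = rng.lognormal(mean=0.05, sigma=0.15, size=(n_trials, n_years))
totals = (baseline * np.cumprod(growth, axis=1)).sum(axis=1)

for p in (50, 70, 90):
    print(f"P{p} total cost: ${np.percentile(totals, p):.1f}M")
```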

  15. The free energy cost of accurate biochemical oscillations

    PubMed Central

    Cao, Yuansheng; Wang, Hongli; Ouyang, Qi; Tu, Yuhai

    2015-01-01

    Oscillation is an important cellular process that regulates timing of different vital life cycles. However, in the noisy cellular environment, oscillations can be highly inaccurate due to phase fluctuations. It remains poorly understood how biochemical circuits suppress phase fluctuations and what is the incurred thermodynamic cost. Here, we study three different types of biochemical oscillations representing three basic oscillation motifs shared by all known oscillatory systems. In all the systems studied, we find that the phase diffusion constant depends on the free energy dissipation per period following the same inverse relation parameterized by system specific constants. This relationship and its range of validity are shown analytically in a model of noisy oscillation. Microscopically, we find that the oscillation is driven by multiple irreversible cycles that hydrolyze the fuel molecules such as ATP; the number of phase coherent periods is proportional to the free energy consumed per period. Experimental evidence in support of this general relationship and testable predictions are also presented. PMID:26566392

  16. COST ESTIMATING EQUATIONS FOR BEST MANAGEMENT PRACTICES

    EPA Science Inventory

    This paper describes the development of an interactive internet-based cost-estimating tool for commonly used urban storm runoff best management practices (BMP), including: retention and detention ponds, grassed swales, and constructed wetlands. The paper presents the cost data, c...

  17. Cost estimation model for advanced planetary programs, fourth edition

    NASA Technical Reports Server (NTRS)

    Spadoni, D. J.

    1983-01-01

    The development of the planetary program cost model is discussed. The Model was updated to incorporate cost data from the most recent US planetary flight projects and extensively revised to more accurately capture the information in the historical cost data base. This data base comprises the historical cost data for 13 unmanned lunar and planetary flight programs. The revision was made with a twofold objective: to increase the flexibility of the model in its ability to deal with the broad scope of scenarios under consideration for future missions, and to maintain and possibly improve upon the confidence in the model's capabilities with an expected accuracy of 20%. The Model development included a labor/cost proxy analysis, selection of the functional forms of the estimating relationships, and test statistics. An analysis of the Model is discussed and two sample applications of the cost model are presented.

  18. Estimating patient-level nursing home costs.

    PubMed Central

    Schlenker, R E; Shaughnessy, P W; Yslas, I

    1985-01-01

    This article presents a methodology developed to estimate patient-level nursing home costs. Such estimates are difficult to obtain because most cost data for nursing homes are available from Medicare or Medicaid cost reports, which provide only average values per patient-day across all patients (or all of a particular payer's patients). The methodology presented in this article yields "resource consumption" (RC) measures of the variable cost of nursing staff care incurred in treating individual nursing home patients. Results from the application of the methodology are presented, using data collected in 1980 on a sample of 961 nursing home patients in 74 Colorado nursing homes. This type of approach could be used to link nursing home payments to the care needs of individual patients, thus improving the overall equity of the payment system and possibly reducing the access barriers facing especially Medicaid patients with high-cost care needs. PMID:3921494

  19. Fast and Accurate Learning When Making Discrete Numerical Estimates.

    PubMed

    Sanborn, Adam N; Beierholm, Ulrik R

    2016-04-01

    Many everyday estimation tasks have an inherently discrete nature, whether the task is counting objects (e.g., a number of paint buckets) or estimating discretized continuous variables (e.g., the number of paint buckets needed to paint a room). While Bayesian inference is often used for modeling estimates made along continuous scales, discrete numerical estimates have not received as much attention, despite their common everyday occurrence. Using two tasks, a numerosity task and an area estimation task, we invoke Bayesian decision theory to characterize how people learn discrete numerical distributions and make numerical estimates. Across three experiments with novel stimulus distributions we found that participants fell between two common decision functions for converting their uncertain representation into a response: drawing a sample from their posterior distribution and taking the maximum of their posterior distribution. While this was consistent with the decision function found in previous work using continuous estimation tasks, surprisingly the prior distributions learned by participants in our experiments were much more adaptive: When making continuous estimates, participants have required thousands of trials to learn bimodal priors, but in our tasks participants learned discrete bimodal and even discrete quadrimodal priors within a few hundred trials. This makes discrete numerical estimation tasks good testbeds for investigating how people learn and make estimates. PMID:27070155

  20. Fast and Accurate Learning When Making Discrete Numerical Estimates

    PubMed Central

    Sanborn, Adam N.; Beierholm, Ulrik R.

    2016-01-01

    Many everyday estimation tasks have an inherently discrete nature, whether the task is counting objects (e.g., a number of paint buckets) or estimating discretized continuous variables (e.g., the number of paint buckets needed to paint a room). While Bayesian inference is often used for modeling estimates made along continuous scales, discrete numerical estimates have not received as much attention, despite their common everyday occurrence. Using two tasks, a numerosity task and an area estimation task, we invoke Bayesian decision theory to characterize how people learn discrete numerical distributions and make numerical estimates. Across three experiments with novel stimulus distributions we found that participants fell between two common decision functions for converting their uncertain representation into a response: drawing a sample from their posterior distribution and taking the maximum of their posterior distribution. While this was consistent with the decision function found in previous work using continuous estimation tasks, surprisingly the prior distributions learned by participants in our experiments were much more adaptive: When making continuous estimates, participants have required thousands of trials to learn bimodal priors, but in our tasks participants learned discrete bimodal and even discrete quadrimodal priors within a few hundred trials. This makes discrete numerical estimation tasks good testbeds for investigating how people learn and make estimates. PMID:27070155

  1. Cost Estimation in Engineer-to-Order Manufacturing

    NASA Astrophysics Data System (ADS)

    Hooshmand, Yousef; Köhler, Peter; Korff-Krumm, Andrea

    2016-02-01

    In Engineer-to-Order (ETO) manufacturing, the price of products must be defined during early stages of product design and during the bidding process; thus an overestimation of product development (PD) costs may lead to the loss of orders, while an underestimation causes a profit loss. What many ETO systems have in common is that the products have to be developed based on different customer requirements, so that each order usually results in a new variant. Furthermore, many customer requirement change-requests may arise in different phases of the PD, which must be accounted for properly. Thus it is of the utmost importance for ETO systems to have accurate cost estimation in the first stages of product design and to be able to determine the cost of customer requirement changes in different phases of PD. This paper aims to present a cost estimation methodology as well as a cost estimation model, which estimate the cost of products by relative comparison of the attributes of new product variants with the attributes of standard product variants. In addition, as a necessity in ETO manufacturing, the cost calculation of customer requirement changes in different phases of PD is integrated into the presented method.
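    A minimal sketch of the relative-comparison idea described above: a standard variant's known cost is scaled by a weighted ratio of the new variant's cost-driving attributes to the standard variant's. The attribute names, weights, and costs are hypothetical, not taken from the paper's model.

```python
def relative_cost(standard_cost, standard_attrs, new_attrs, weights):
    """Estimate a new variant's cost from a standard variant by weighted
    relative comparison of cost-driving attributes."""
    ratio = 0.0
    for name, w in weights.items():
        ratio += w * (new_attrs[name] / standard_attrs[name])
    return standard_cost * ratio

estimate = relative_cost(
    standard_cost=120_000.0,    # known cost of the standard variant (hypothetical)
    standard_attrs={"parts": 240, "mass_kg": 850, "custom_interfaces": 4},
    new_attrs={"parts": 310, "mass_kg": 900, "custom_interfaces": 7},
    weights={"parts": 0.5, "mass_kg": 0.2, "custom_interfaces": 0.3},  # sum to 1
)
print(f"estimated variant cost: {estimate:,.0f}")
```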

  2. Bioaccessibility tests accurately estimate bioavailability of lead to quail

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb, we incorporated Pb-contaminated soils or Pb acetate into diets for Japanese quail (Coturnix japonica), fed the quail for 15 days, and ...

  3. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE BIOAVAILABILITY OF LEAD TO QUAIL

    EPA Science Inventory

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contami...

  4. Some insight on censored cost estimators.

    PubMed

    Zhao, H; Cheng, Y; Bang, H

    2011-08-30

    Censored survival data analysis has been studied for many years. Yet, the analysis of censored mark variables, such as medical cost, quality-adjusted lifetime, and repeated events, faces a unique challenge that makes standard survival analysis techniques invalid. Because of the 'informative' censorship imbedded in censored mark variables, the use of the Kaplan-Meier (Journal of the American Statistical Association 1958; 53:457-481) estimator, as an example, will produce biased estimates. Innovative estimators have been developed in the past decade in order to handle this issue. Even though consistent estimators have been proposed, the formulations and interpretations of some estimators are less intuitive to practitioners. On the other hand, more intuitive estimators have been proposed, but their mathematical properties have not been established. In this paper, we prove the analytic identity between some estimators (a statistically motivated estimator and an intuitive estimator) for censored cost data. Efron (1967) made a similar investigation for censored survival data (between the Kaplan-Meier estimator and the redistribute-to-the-right algorithm). Therefore, we view our study as an extension of Efron's work to informatively censored data so that our findings could be applied to other marked variables. PMID:21748774
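    A minimal sketch of one intuitive estimator in this family, a weighted complete-case (inverse-probability-of-censoring-weighted) mean cost: uncensored subjects are reweighted by the Kaplan-Meier survival of the censoring distribution at their follow-up times. This is a generic illustration, not the specific estimators whose identity the paper proves, and ties between death and censoring times are ignored for simplicity.

```python
import numpy as np

def censoring_survival(followup, death_observed):
    """Kaplan-Meier estimate of K(t) = P(censoring time > t), evaluated at each
    subject's own follow-up time; censoring (death_observed == 0) is the 'event'."""
    followup = np.asarray(followup, float)
    censored = 1 - np.asarray(death_observed, int)
    surv, K = 1.0, {}
    for t in np.unique(followup):                      # unique times, increasing
        at_risk = np.sum(followup >= t)
        events = np.sum((followup == t) & (censored == 1))
        surv *= 1.0 - events / at_risk
        K[t] = surv
    return np.array([K[t] for t in followup])

def simple_weighted_mean_cost(total_cost, followup, death_observed):
    """Weighted complete-case mean cost: only uncensored subjects contribute,
    each reweighted by 1/K(T_i) to compensate for informatively censored costs."""
    delta = np.asarray(death_observed, int)
    K = np.clip(censoring_survival(followup, death_observed), 1e-8, None)
    return float(np.mean(delta * np.asarray(total_cost, float) / K))

# Hypothetical data: total observed cost, follow-up time, and death indicator.
cost = [12.5, 30.0, 8.0, 22.0, 15.5]
followup = [2.0, 5.0, 1.5, 4.0, 3.0]
death = [1, 0, 1, 1, 0]
print(simple_weighted_mean_cost(cost, followup, death))   # 10.7 for this toy sample
```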

  5. 48 CFR 1852.216-73 - Estimated cost and cost sharing.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Estimated cost and cost... and Clauses 1852.216-73 Estimated cost and cost sharing. As prescribed in 1816.307-70(a), insert the following clause: Estimated Cost and Cost Sharing (DEC 1991) (a) It is estimated that the total cost...

  6. 48 CFR 1552.216-76 - Estimated cost and cost-sharing.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Estimated cost and cost... 1552.216-76 Estimated cost and cost-sharing. As prescribed in 1516.307(c), insert the following clause: Estimated Cost and Cost-Sharing (APR 1996) (a) The total estimated cost of performing the work under...

  7. 48 CFR 1852.216-73 - Estimated cost and cost sharing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Estimated cost and cost... and Clauses 1852.216-73 Estimated cost and cost sharing. As prescribed in 1816.307-70(a), insert the following clause: Estimated Cost and Cost Sharing (DEC 1991) (a) It is estimated that the total cost...

  8. 48 CFR 1552.216-76 - Estimated cost and cost-sharing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Estimated cost and cost... 1552.216-76 Estimated cost and cost-sharing. As prescribed in 1516.307(c), insert the following clause: Estimated Cost and Cost-Sharing (APR 1996) (a) The total estimated cost of performing the work under...

  9. Support to LANL: Cost estimation. Final report

    SciTech Connect

    Not Available

    1993-10-04

    This report summarizes the activities and progress by ICF Kaiser Engineers conducted on behalf of Los Alamos National Laboratories (LANL) for the US Department of Energy, Office of Waste Management (EM-33) in the area of improving methods for Cost Estimation. This work was conducted between October 1, 1992 and September 30, 1993. ICF Kaiser Engineers supported LANL in providing the Office of Waste Management with planning and document preparation services for a Cost and Schedule Estimating Guide (Guide). The intent of the Guide was to use Activity-Based Cost (ABC) estimation as a basic method in preparing cost estimates for DOE planning and budgeting documents, including Activity Data Sheets (ADSs), which form the basis for the Five Year Plan document. Prior to the initiation of the present contract with LANL, ICF Kaiser Engineers was tasked to initiate planning efforts directed toward a Guide. This work, accomplished from June to September, 1992, included visits to eight DOE field offices and consultation with DOE Headquarters staff to determine the need for a Guide, the desired contents of a Guide, and the types of ABC estimation methods and documentation requirements that would be compatible with current or potential practices and expertise in existence at DOE field offices and their contractors.

  10. Cost estimating Brayton and Stirling engines

    NASA Technical Reports Server (NTRS)

    Fortgang, H. R.

    1980-01-01

    Brayton and Stirling engines were analyzed for cost and selling price for production quantities ranging from 1000 to 400,000 units per year. Parts and components were subjected to in-depth scrutiny to determine optimum manufacturing processes coupled with make-or-buy decisions on materials and small parts. Tooling and capital equipment costs were estimated for each detail and/or assembly. For low annual production volumes, the Brayton engine appears to have a lower cost and selling price than the Stirling engine. As annual production quantities increase, the Stirling becomes a lower cost engine than the Brayton. Both engines could benefit cost-wise if changes were made in materials, design, and manufacturing processes as annual production quantities increase.

  11. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.

  12. PACE 2: Pricing and Cost Estimating Handbook

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.; Shepherd, T.

    1977-01-01

    An automatic data processing system to be used for the preparation of industrial engineering type manhour and material cost estimates has been established. This computer system has evolved into a highly versatile and highly flexible tool which significantly reduces computation time, eliminates computational errors, and reduces typing and reproduction time for estimators and pricers since all mathematical and clerical functions are automatic once basic inputs are derived.

  13. How Accurately Do Spectral Methods Estimate Effective Elastic Thickness?

    NASA Astrophysics Data System (ADS)

    Perez-Gussinye, M.; Lowry, A. R.; Watts, A. B.; Velicogna, I.

    2002-12-01

    The effective elastic thickness, Te, is an important parameter that has the potential to provide information on the long-term thermal and mechanical properties of the lithosphere. Previous studies have estimated Te using both forward and inverse (spectral) methods. While there is generally good agreement between the results obtained using these methods, spectral methods are limited because they depend on the spectral estimator and the window size chosen for analysis. In order to address this problem, we have used a multitaper technique which yields optimal estimates of the bias and variance of the Bouguer coherence function relating topography and gravity anomaly data. The technique has been tested using realistic synthetic topography and gravity. Synthetic data were generated assuming surface and sub-surface (buried) loading of an elastic plate with fractal statistics consistent with real data sets. The cases of uniform and spatially varying Te are examined. The topography and gravity anomaly data consist of 2000x2000 km grids sampled at an 8 km interval. The bias in the Te estimate is assessed from the difference between the true Te value and the mean from analyzing 100 overlapping windows within the 2000x2000 km data grids. For the case in which Te is uniform, the bias and variance decrease with window size and increase with increasing true Te value. In the case of a spatially varying Te, however, there is a trade-off between spatial resolution and variance. With increasing window size the variance of the Te estimate decreases, but the spatial changes in Te are smeared out. We find that for a Te distribution consisting of a strong central circular region of Te=50 km (radius 600 km) and progressively smaller Te towards its edges, the 800x800 and 1000x1000 km windows gave the best compromise between spatial resolution and variance. Our studies demonstrate that assumed stationarity of the relationship between gravity and topography data yields good results even in

  14. Cost estimate of initial SSC experimental equipment

    SciTech Connect

    1986-06-01

    The cost of the initial detector complement at recently constructed colliding beam facilities (or at those under construction) has been a significant fraction of the cost of the accelerator complex. Because of the complexity of large modern-day detectors, the time-scale for their design and construction is comparable to the time-scale needed for accelerator design and construction. For these reasons it is appropriate to estimate the cost of the anticipated detector complement in parallel with the cost estimates of the collider itself. The fundamental difficulty with this procedure is that, whereas a firm conceptual design of the collider does exist, comparable information is unavailable for the detectors. Traditionally, these have been built by the high energy physics user community according to their perception of the key scientific problems that need to be addressed. The role of the accelerator laboratory in that process has involved technical and managerial coordination and the allocation of running time and local facilities among the proposed experiments. It seems proper that the basic spirit of experimentation reflecting the scientific judgment of the community should be preserved at the SSC. Furthermore, the formal process of initiation of detector proposals can only start once the SSC has been approved as a construction project and a formal laboratory administration put in place. Thus an ad hoc mechanism had to be created to estimate the range of potential detector needs, potential detector costs, and associated computing equipment.

  15. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Deep Space Network (DSN) Data Systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit DSN software life cycle statistics. The estimation model output scales a standard DSN Work Breakdown Structure skeleton, which is then input into a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.

  16. PROCEDURE FOR ESTIMATING PERMANENT TOTAL ENCLOSURE COSTS

    EPA Science Inventory

    The paper discusses a procedure for estimating permanent total enclosure (PTE) costs. (NOTE: Industries that use add-on control devices must adequately capture emissions before delivering them to the control device. One way to capture emissions is to use PTEs, enclosures that mee...

  17. Hydrogen from coal cost estimation guidebook

    NASA Technical Reports Server (NTRS)

    Billings, R. E.

    1981-01-01

    In an effort to establish baseline information whereby specific projects can be evaluated, a current set of parameters which are typical of coal gasification applications was developed. Using these parameters a computer model allows researchers to interrelate cost components in a sensitivity analysis. The results make possible an approximate estimation of hydrogen energy economics from coal, under a variety of circumstances.

  18. Estimating Teacher Turnover Costs: A Case Study

    ERIC Educational Resources Information Center

    Levy, Abigail Jurist; Joy, Lois; Ellis, Pamela; Jablonski, Erica; Karelitz, Tzur M.

    2012-01-01

    High teacher turnover in large U.S. cities is a critical issue for schools and districts, and the students they serve; but surprisingly little work has been done to develop methodologies and standards that districts and schools can use to make reliable estimates of turnover costs. Even less is known about how to detect variations in turnover costs…

  19. Estimating the cost of production stoppage

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1979-01-01

    The estimation model considers learning curve quantities and the time of the break to forecast losses due to a break in the production schedule. The major parameters capable of predicting costs are the number of units made prior to the break in the production sequence, the length of the production break, and the slope of the learning curve established prior to the break.
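    A minimal sketch of the learning-curve arithmetic implied above, under a simple hypothetical "lost learning" assumption: the units built before the break fix the position on the curve, and the break is modeled as forfeiting a fraction of the accumulated learning, which raises the cost of post-break units. The first-unit cost, slope, and forgetting fraction are illustrative, not the paper's parameters.

```python
import numpy as np

def unit_cost(first_unit_cost, unit_number, slope):
    """Crawford unit learning curve: cost of unit n for a given slope
    (slope=0.85 means each doubling of quantity cuts unit cost to 85%)."""
    b = np.log(slope) / np.log(2.0)
    return first_unit_cost * unit_number ** b

# Hypothetical scenario: 50 units built before the break; the break is
# assumed to forfeit 40% of the learning accumulated so far.
T1, slope, built, forgotten = 1_000.0, 0.85, 50, 0.40
no_break = unit_cost(T1, built + 1, slope)              # next unit if production had continued
after_break = no_break + forgotten * (T1 - no_break)    # learning partially "unlearned"
print(f"unit {built + 1}: {no_break:.0f} without a break, {after_break:.0f} after the break "
      f"(+{after_break - no_break:.0f} per unit until the learning is regained)")
```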

  20. 2012 NASA Cost Estimating Handbook Highlights

    NASA Technical Reports Server (NTRS)

    Rosenberg, Leigh; Stukes, Sherry

    2012-01-01

    The major goal is to ensure that appropriate policy is adopted and that best practices are being developed, communicated, and used across the Agency. This is accomplished by engaging NASA Cost Estimating Community representatives in the update, which is scheduled to be complete by the end of FY 2012. The document has been through three detailed reviews across NASA.

  1. Unmanned Aerial Vehicles unique cost estimating requirements

    NASA Astrophysics Data System (ADS)

    Malone, P.; Apgar, H.; Stukes, S.; Sterk, S.

    Unmanned Aerial Vehicles (UAVs), also referred to as drones, are aerial platforms that fly without a human pilot onboard. UAVs are controlled autonomously by a computer in the vehicle or under the remote control of a pilot stationed at a fixed ground location. There are a wide variety of drone shapes, sizes, configurations, complexities, and characteristics. Use of these devices by the Department of Defense (DoD), NASA, and civil and commercial organizations continues to grow. UAVs are commonly used for intelligence, surveillance, and reconnaissance (ISR). They are also used for combat operations and civil applications, such as firefighting, non-military security work, and surveillance of infrastructure (e.g. pipelines, power lines and country borders). UAVs are often preferred for missions that require sustained persistence (over 4 hours in duration), or are “too dangerous, dull or dirty” for manned aircraft. Moreover, they can offer significant acquisition and operations cost savings over traditional manned aircraft. Because of these unique characteristics and missions, UAV estimates require some unique estimating methods. This paper describes a framework for estimating UAV systems' total ownership cost including hardware components, software design, and operations. The challenge of collecting data, testing the sensitivities of cost drivers, and creating cost estimating relationships (CERs) for each key work breakdown structure (WBS) element is discussed. The autonomous operation of UAVs is especially challenging from a software perspective.

  2. Accurate feature detection and estimation using nonlinear and multiresolution analysis

    NASA Astrophysics Data System (ADS)

    Rudin, Leonid; Osher, Stanley

    1994-11-01

    A program for feature detection and estimation using nonlinear and multiscale analysis was completed. The state-of-the-art edge detection was combined with multiscale restoration (as suggested by the first author) and robust results in the presence of noise were obtained. Successful applications to numerous images of interest to DOD were made. Also, a new market in the criminal justice field was developed, based in part, on this work.

  3. Software Estimates Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Smith, C. L.

    2003-01-01

    Simulation-Based Cost Model (SiCM), a discrete event simulation developed in Extend, simulates pertinent aspects of the testing of rocket propulsion test articles for the purpose of estimating the costs of such testing during time intervals specified by its users. A user enters input data for control of simulations; information on the nature of, and activity in, a given testing project; and information on resources. Simulation objects are created on the basis of this input. Costs of the engineering-design, construction, and testing phases of a given project are estimated from the numbers and labor rates of engineers and technicians employed in each phase; the duration of each phase; the costs of materials used in each phase; and, for the testing phase, the rate of maintenance of the testing facility. The three main outputs of SiCM are (1) a curve, updated at each iteration of the simulation, that shows overall expenditures vs. time during the interval specified by the user; (2) a histogram of the total costs from all iterations of the simulation; and (3) a table displaying means and variances of cumulative costs for each phase from all iterations. Other outputs include spending curves for each phase.

  4. Software Estimates Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Simulation-Based Cost Model (SiCM) is a computer program that simulates pertinent aspects of the testing of rocket engines for the purpose of estimating the costs of such testing during time intervals specified by its users. A user enters input data for control of simulations; information on the nature of, and activity in, a given testing project; and information on resources. Simulation objects are created on the basis of this input. Costs of the engineering-design, construction, and testing phases of a given project are estimated from the numbers and labor rates of engineers and technicians employed in each phase; the duration of each phase; the costs of materials used in each phase; and, for the testing phase, the rate of maintenance of the testing facility. The three main outputs of SiCM are (1) a curve, updated at each iteration of the simulation, that shows overall expenditures vs. time during the interval specified by the user; (2) a histogram of the total costs from all iterations of the simulation; and (3) a table displaying means and variances of cumulative costs for each phase from all iterations. Other outputs include spending curves for each phase.

  5. 48 CFR 1352.216-70 - Estimated and allowable costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Estimated and allowable costs. As prescribed in 48 CFR 1316.307(a), insert the following clause: Estimated and Allowable Costs (APR 2010) (a) Estimated Costs. The estimated cost of this contract is... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Estimated and...

  6. New developments in capital cost estimating

    SciTech Connect

    Stutz, R.A.; Zocher, M.A.

    1988-01-01

    The new developments in cost engineering revolve around the ability to capture information that in the past could not be automated. The purpose of automation is not to eliminate the expert cost engineer. The goal is to use available technology to have more information available to the professionals in the cost engineering field. In that sense, the demand for expertise increases in order to produce the highest quality estimate and project possible from all levels of cost engineers. We cannot overemphasize the importance of using a good source of expert information in building these types of programs. "Garbage in, garbage out" still applies in this form of programming. Expert systems technology will become commonplace in many vertical markets; it is important to understand what can and cannot be accomplished in our field, and where this technology will lead us in the future.

  7. Accurate tempo estimation based on harmonic + noise decomposition

    NASA Astrophysics Data System (ADS)

    Alonso, Miguel; Richard, Gael; David, Bertrand

    2006-12-01

    We present an innovative tempo estimation system that processes acoustic audio signals and does not use any high-level musical knowledge. Our proposal relies on a harmonic + noise decomposition of the audio signal by means of a subspace analysis method. Then, a technique to measure the degree of musical accentuation as a function of time is developed and separately applied to the harmonic and noise parts of the input signal. This is followed by a periodicity estimation block that calculates the salience of musical accents for a large number of potential periods. Next, a multipath dynamic programming algorithm searches among all the potential periodicities for the most consistent prospects through time, and finally the most energetic candidate is selected as the tempo. Our proposal is validated using a manually annotated test-base containing 961 music signals from various musical genres. In addition, the performance of the algorithm under different configurations is compared. The robustness of the algorithm when processing signals of degraded quality is also measured.

  8. Bioaccessibility tests accurately estimate bioavailability of lead to quail

    USGS Publications Warehouse

    Beyer, W. Nelson; Basta, Nicholas T; Chaney, Rufus L.; Henry, Paula F.; Mosby, David; Rattner, Barnett A.; Scheckel, Kirk G.; Sprague, Dan; Weber, John

    2016-01-01

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33%-63%, with a mean of about 50%. Treatment of two of the soils with phosphorus significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. They were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test” and the “Waterfowl Physiologically Based Extraction Test.” All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite and tertiary Pb phosphate), and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb.

  9. Bioaccessibility tests accurately estimate bioavailability of lead to quail.

    PubMed

    Beyer, W Nelson; Basta, Nicholas T; Chaney, Rufus L; Henry, Paula F P; Mosby, David E; Rattner, Barnett A; Scheckel, Kirk G; Sprague, Daniel T; Weber, John S

    2016-09-01

    Hazards of soil-borne lead (Pb) to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, the authors measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from 5 Pb-contaminated Superfund sites had relative bioavailabilities from 33% to 63%, with a mean of approximately 50%. Treatment of 2 of the soils with phosphorus (P) significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in 6 in vitro tests and regressed on bioavailability: the relative bioavailability leaching procedure at pH 1.5, the same test conducted at pH 2.5, the Ohio State University in vitro gastrointestinal method, the urban soil bioaccessible lead test, the modified physiologically based extraction test, and the waterfowl physiologically based extraction test. All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the relative bioavailability leaching procedure at pH 2.5 and Ohio State University in vitro gastrointestinal tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite, and tertiary Pb phosphate) and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb, and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb. Environ Toxicol Chem 2016;35:2311-2319. Published 2016 Wiley Periodicals Inc. on behalf of

  10. Probabilistic cost estimates for climate change mitigation.

    PubMed

    Rogelj, Joeri; McCollum, David L; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-01-01

    For more than a decade, the target of keeping global warming below 2 °C has been a key focus of the international climate debate. In response, the scientific community has published a number of scenario studies that estimate the costs of achieving such a target. Producing these estimates remains a challenge, particularly because of relatively well known, but poorly quantified, uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on the one hand, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other hand, has spent years improving its understanding of the geophysical response of the Earth system to emissions of greenhouse gases. This geophysical response remains a key uncertainty in the cost of mitigation scenarios but has been integrated with assessments of other uncertainties in only a rudimentary manner, that is, for equilibrium conditions. Here we bridge this gap between the two research communities by generating distributions of the costs associated with limiting transient global temperature increase to below specific values, taking into account uncertainties in four factors: geophysical, technological, social and political. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by geophysical uncertainties, social factors influencing future energy demand and, lastly, technological uncertainties surrounding the availability of greenhouse gas mitigation options. Our information on temperature risk and mitigation costs provides crucial information for policy-making, because it clarifies the relative importance of mitigation costs, energy demand and the timing of global action in reducing the risk of exceeding a global temperature increase of 2 °C, or other limits such as 3 °C or 1.5

  11. 48 CFR 1852.216-81 - Estimated cost.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Estimated cost. 1852.216... 1852.216-81 Estimated cost. As prescribed in 1816.307-70(d), insert the following clause: Estimated cost (DEC 1988) The total estimated cost for complete performance of this contract is $ . See...

  12. 48 CFR 1852.216-81 - Estimated cost.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Estimated cost. 1852.216-81... Estimated cost. As prescribed in 1816.307-70(d), insert the following clause: Estimated cost (DEC 1988) The total estimated cost for complete performance of this contract is $ . See FAR clause 52.216-11,...

  13. Estimating the social costs of nitrogen pollution

    NASA Astrophysics Data System (ADS)

    Gourevitch, J.; Keeler, B.; Polasky, S.

    2014-12-01

    Agricultural expansion can degrade water quality and related ecosystem services through increased export of nutrients. Such damages to water quality can negatively affect recreation, property values, and human health. While the relationship between agricultural production and nitrogen export is well-studied, the economic costs of nitrogen loss are less well understood. We present a comprehensive assessment of the full costs associated with nitrate pollution from agricultural sources in Minnesota. We found that the most significant economic costs are likely from groundwater contamination of nitrate in public and private wells. For example, we estimated that loss of grassland to corn cultivation in Minnesota between 2007 and 2012 is expected to increase the future number of domestic wells exceeding nitrate concentrations of 10 ppm by 31%. This increase in contamination is estimated to cost well owners $1.4 to 19 million (present values over a 20 year horizon) through remediation, avoidance, and replacement. Our findings demonstrate linkages between changes in land use, water quality, and human well-being.

  14. Measuring system complexity to support development cost estimates

    NASA Astrophysics Data System (ADS)

    Malone, P.; Wolfarth, L.

    Systems and System-of-Systems (SoS) are being used more frequently either as a design element of stand-alone systems or of architectural frameworks. Consequently, a programmatic need has arisen to understand and measure systems complexity in order to more accurately estimate development plans and life-cycle costs. In a prior paper, we introduced the System Readiness Level (SRL) concept as a composite function of both Technology Readiness Levels (TRLs) and Integration Readiness Levels (IRLs) and touched on system complexity. While the SRL approach provides a repeatable, process-driven method to assess the maturity of a system or SoS, it does not capture all aspects of system complexity. In this paper we assess the concept of cyclomatic complexity as a system complexity metric and consider its utility as an approach for estimating the life-cycle costs and cost growth of complex systems. We hypothesize that the greater the number of technologies and integration tasks, the more complex the system and the higher its cost to develop and maintain. We base our analysis on historical data from DoD programs that have experienced significant cost growth, including some that have been cancelled due to unsustainable cost (and schedule) growth. We begin by describing the original implementation of the cyclomatic method, which was developed to estimate the effort to maintain system software. We then describe how the method can be generalized and applied to systems. Next, we show how to estimate the cyclomatic number (CN) and show the statistical significance of the relationship between a system's CN metric and its cost. We illustrate the method with an example. Last, we discuss opportunities for future research.
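    A minimal sketch of the cyclomatic number computation that the paper generalizes from software to systems: treat components as graph nodes and integrations as edges, and apply McCabe's CN = E - N + 2P, where P is the number of connected components. The example system graph is hypothetical.

```python
def cyclomatic_number(nodes, edges):
    """McCabe cyclomatic number CN = E - N + 2P for an undirected graph,
    where P is the number of connected components."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, components = set(), 0
    for n in nodes:
        if n not in seen:                 # start a new connected component
            components += 1
            stack = [n]
            while stack:
                cur = stack.pop()
                if cur not in seen:
                    seen.add(cur)
                    stack.extend(adj[cur] - seen)
    return len(edges) - len(nodes) + 2 * components

# Hypothetical system: 6 subsystems with 8 integration interfaces.
nodes = ["bus", "payload", "comms", "gnc", "power", "ground"]
edges = [("bus", "payload"), ("bus", "comms"), ("bus", "gnc"), ("bus", "power"),
         ("payload", "comms"), ("gnc", "power"), ("comms", "ground"), ("payload", "ground")]
print("CN =", cyclomatic_number(nodes, edges))   # 8 - 6 + 2 = 4
```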

  15. The use of parametric cost estimating relationships for transport aircraft systems in establishing initial Design to Cost Targets

    NASA Technical Reports Server (NTRS)

    Beltramo, M. N.; Anderson, J. L.

    1977-01-01

    This paper provides a brief overview of Design to Cost (DTC). Problems inherent in attempting to estimate costs are discussed, along with techniques and types of models that have been developed to estimate aircraft costs. A set of cost estimating relationships that estimate the total production cost of commercial and military transport aircraft at the systems level is presented and the manner in which these equations might be used effectively in developing initial DTC targets is indicated. The principal point made in this paper is that, by using a disaggregated set of equations to estimate transport aircraft costs at the systems level, reasonably accurate preliminary cost estimates may be achieved. These estimates may serve directly as initial DTC targets, or adjustments may be made to the estimates obtained for some of the systems to estimate the production cost impact of alternative designs or manufacturing technologies. The relative ease by which estimates may be made with this model, the flexibility it provides by being disaggregated, and the accuracy of the estimates it provides make it a unique and useful tool in establishing initial DTC targets.
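    A minimal sketch of the disaggregated idea described above: a separate (hypothetical) estimating relationship is applied to each aircraft system and the results are summed, so an alternative design for one system can be re-priced in isolation when setting Design to Cost targets. The CER forms, coefficients, and driver values below are illustrative only, not the paper's equations.

```python
# Hypothetical system-level CERs: production cost in $M as a function of a driver variable.
cers = {
    "wing":      lambda area_m2:  0.08 * area_m2 ** 0.9,
    "fuselage":  lambda length_m: 0.60 * length_m ** 1.1,
    "engines":   lambda thrust_kN: 0.05 * thrust_kN ** 1.0,
    "avionics":  lambda mass_kg:  0.02 * mass_kg ** 0.8,
}

baseline = {"wing": 360, "fuselage": 45, "engines": 240, "avionics": 800}
total = sum(cers[s](x) for s, x in baseline.items())
print(f"baseline airframe estimate: ${total:.1f}M")

# Design-to-Cost use: re-price a single system for an alternative design.
alternative = dict(baseline, avionics=1100)          # heavier avionics suite
delta = cers["avionics"](alternative["avionics"]) - cers["avionics"](baseline["avionics"])
print(f"avionics change impact: ${delta:+.1f}M")
```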

  16. The cost of a hospital ward in Europe: is there a methodology available to accurately measure the costs?

    PubMed

    Negrini, D; Kettle, A; Sheppard, L; Mills, G H; Edbrooke, D L

    2004-01-01

    Costing health care services has become a major requirement due to an increase in demand for health care and technological advances. Several studies have been published describing the computation of the costs of hospital wards. The objective of this article is to examine the methodologies utilised to try to describe the basic components of a standardised method, which could be applied throughout Europe. Cost measurement however is a complex matter and a lack of clarity exists in the terminology and the cost concepts utilised. The methods discussed in this review make it evident that there is a lack of standardized methodologies for the determination of accurate costs of hospital wards. A standardized costing methodology would facilitate comparisons, encourage economic evaluation within the ward and hence assist in the decision-making process with regard to the efficient allocation of resources. PMID:15366283

  17. Compile-time estimation of communication costs in multicomputers

    NASA Technical Reports Server (NTRS)

    Gupta, Manish; Banerjee, Prithviraj

    1991-01-01

    An important problem facing numerous research projects on parallelizing compilers for distributed memory machines is that of automatically determining a suitable data partitioning scheme for a program. Any strategy for automatic data partitioning needs a mechanism for estimating the performance of a program under a given partitioning scheme, the most crucial part of which involves determining the communication costs incurred by the program. A methodology is described for estimating the communication costs at compile-time as functions of the numbers of processors over which various arrays are distributed. A strategy is described along with its theoretical basis, for making program transformations that expose opportunities for combining of messages, leading to considerable savings in the communication costs. For certain loops with regular dependences, the compiler can detect the possibility of pipelining, and thus estimate communication costs more accurately than it could otherwise. These results are of great significance to any parallelization system supporting numeric applications on multicomputers. In particular, they lay down a framework for effective synthesis of communication on multicomputers from sequential program references.
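    As a rough illustration of the compile-time cost functions described above, the sketch below uses a simple startup-plus-per-element communication model to compare sending boundary data row by row against combining it into one message per destination, as a function of the number of processors. The machine parameters and array size are hypothetical, and the model is far simpler than the paper's methodology.

```python
ALPHA, BETA = 50.0, 0.02   # per-message startup and per-element costs, microseconds (hypothetical)

def transfer_cost(elements, messages):
    """Linear communication model: cost = messages * ALPHA + elements * BETA."""
    return messages * ALPHA + elements * BETA

def exchange_cost(n, p, combined):
    """Per-processor cost of exchanging one (n/p) x (n/p) block with each of the
    other p-1 processors, either row by row or combined into one message each."""
    rows = n // p
    block = rows * rows
    messages = (p - 1) if combined else (p - 1) * rows
    return transfer_cost(block * (p - 1), messages)

for p in (4, 16, 64):
    print(f"p={p:>2}: row-by-row {exchange_cost(4096, p, False):>12.0f} us, "
          f"combined {exchange_cost(4096, p, True):>12.0f} us")
```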

  18. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Cost estimate submission. 100.16 Section..., COMMUNICATIONS ASSISTANCE FOR LAW ENFORCEMENT ACT OF 1994 § 100.16 Cost estimate submission. (a) The carrier... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from...

  19. 28 CFR 19.4 - Cost and percentage estimates.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Cost and percentage estimates. 19.4... RECOVERY OF MISSING CHILDREN § 19.4 Cost and percentage estimates. It is estimated that this program will cost DOJ $78,000 during the initial year. This figure is based on estimates of printing, inserting,...

  20. Accurate and cost-effective natural resource data from super large scale aerial photography

    NASA Astrophysics Data System (ADS)

    Grotefendt, Richard Alan

    Increasing amounts and types of timely and accurate data are required for monitoring to ensure compliance with natural resource regulatory requirements. This study developed a cost-effective method to partially fulfill these data requirements using super large scale aerial photography (Scale: greater than 1:2,000). Two synchronized, metric, Rolleiflex 70mm (2.76in) cameras mounted 12m (40ft) apart on a rigid platform and carried at 5.6 km/hr (3 knots) by a helicopter collected this high resolution, 3D imagery from Alaska and Washington. The overlapping photo pairs provided 3D views of natural resource objects as fine as twigs. The 12m (40ft) inter-camera distance improved ground visibility between tree crowns of dense old growth forests. Analytical stereoplotters and the application of photogrammetric principles enabled measurement and interpretation of photo objects such as trees and their height in a cost-effective way. Horizontal and vertical measurement accuracy was within 2% and 3% of field measurement, respectively. Forest inventory and riparian buffer monitoring applications were used to test this method. Although field work is still required to develop photo-field relationships unique to each ecosystem and for quality assurance, the photo estimates of individual tree height, volume, diameter, type, and location, as well as down tree decay class and landing spot, plot timber volume, and area were comparable to and may replace approximately 95% of field effort. For example, the average of the absolute differences between field and photo estimates for tree height was 2.4m (7.8ft) (s.d. = 2.1m (6.8ft), n = 376), diameter at breast height (1.4m (4.5ft) above ground on uphill tree side) was 5.8cm (2.3in) (s.d. = 5.6cm (2.2in), n = 109), and plot volume in gross board feet was within 10.9% to 13.4% (n = 10) depending on the estimator used. Forest type was correctly classified 99.4% (n = 180) of the time. Timber inventory, species identification, sample

  1. Fast Conceptual Cost Estimating of Aerospace Projects Using Historical Information

    NASA Technical Reports Server (NTRS)

    Butts, Glenn

    2007-01-01

    Accurate estimates can be created in less than a minute by applying powerful techniques and algorithms to create an Excel-based parametric cost model. In five easy steps you will learn how to normalize your company 's historical cost data to the new project parameters. This paper provides a complete, easy-to-understand, step by step how-to guide. Such a guide does not seem to currently exist. Over 2,000 hours of research, data collection, and trial and error, and thousands of lines of Excel Visual Basic Application (VBA) code were invested in developing these methods. While VBA is not required to use this information, it increases the power and aesthetics of the model. Implementing all of the steps described, while not required, will increase the accuracy of the results.
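
    As a hedged illustration of the normalization step described above (bringing historical costs to a common year and scaling them to the new project's parameters), the sketch below applies a flat escalation rate and a power-law size scaling; the rate, exponent, and example numbers are assumptions for illustration only, not values from the paper.

        # Minimal sketch of normalizing a historical cost to a new project
        # (escalation rate and scaling exponent are illustrative assumptions).
        def escalate(cost, from_year, to_year, annual_rate=0.03):
            """Bring a historical cost forward to the estimate year."""
            return cost * (1.0 + annual_rate) ** (to_year - from_year)

        def scale_to_new_project(cost, historical_mass_kg, new_mass_kg, exponent=0.7):
            """Scale a normalized cost to the new project's size (power law)."""
            return cost * (new_mass_kg / historical_mass_kg) ** exponent

        normalized = escalate(42.0e6, 2001, 2007)            # $42M spent in 2001
        estimate = scale_to_new_project(normalized, 900.0, 1300.0)
        print(f"analogy-based estimate: ${estimate / 1e6:.1f}M")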

  2. PRELIMINARY COST ESTIMATES OF POLLUTION CONTROL TECHNOLOGIES FOR GEOTHERMAL DEVELOPMENTS

    EPA Science Inventory

    This report provides preliminary cost estimates of air and water pollution control technologies for geothermal energy conversion facilities. Costs for solid waste disposal are also estimated. The technologies examined include those for control of hydrogen sulfide emissions and fo...

  3. ICPP tank farm closure study. Volume 3: Cost estimates, planning schedules, yearly cost flowcharts, and life-cycle cost estimates

    SciTech Connect

    1998-02-01

    This volume contains information on cost estimates, planning schedules, yearly cost flowcharts, and life-cycle costs for the six options described in Volume 1, Section 2: Option 1 -- Total removal clean closure; No subsequent use; Option 2 -- Risk-based clean closure; LLW fill; Option 3 -- Risk-based clean closure; CERCLA fill; Option 4 -- Close to RCRA landfill standards; LLW fill; Option 5 -- Close to RCRA landfill standards; CERCLA fill; and Option 6 -- Close to RCRA landfill standards; Clean fill. This volume is divided into two portions. The first portion contains the cost and planning schedule estimates while the second portion contains life-cycle costs and yearly cash flow information for each option.

  4. Estimation of immunization providers' activities cost, medication cost, and immunization dose errors cost in Iraq.

    PubMed

    Al-lela, Omer Qutaiba B; Bahari, Mohd Baidi; Al-abbassi, Mustafa G; Salih, Muhannad R M; Basher, Amena Y

    2012-06-01

    The immunization status of children is improved by interventions that increase community demand for compulsory and non-compulsory vaccines, one of the most important interventions related to immunization providers. The aim of this study is to evaluate the activities of immunization providers in terms of activity time and cost, to calculate the cost of immunization doses, and to determine the cost of immunization dose errors. A time-motion and cost analysis study design was used. Five public health clinics in Mosul-Iraq participated in the study. Fifty (50) vaccine doses were required to estimate activity time and cost. The micro-costing method was used; time and cost data were collected for each immunization-related activity performed by the clinic staff. A stopwatch was used to measure the duration of activity interactions between the parents and clinic staff. The immunization service cost was calculated by multiplying the average salary/min by activity time per minute. A total of 528 immunization cards of Iraqi children were scanned to determine the number and the cost of immunization dose errors (extra immunization doses and invalid doses). The average time for child registration was 6.7 min per immunization dose, and the physician spent more than 10 min per dose. Nurses needed more than 5 min to complete child vaccination. The total cost of immunization activities was 1.67 US$ per immunization dose. The measles vaccine (fifth dose) has a lower price (0.42 US$) than all other immunization doses. The cost of a total of 288 invalid doses was 744.55 US$ and the cost of a total of 195 extra immunization doses was 503.85 US$. The time spent on physicians' activities was longer than that spent on registrars' and nurses' activities. The physician total cost was higher than the registrar cost and nurse cost. The total immunization cost will increase by about 13.3% owing to dose errors. PMID:22521848
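
    A minimal sketch of the micro-costing rule stated above (activity cost equals the average salary per minute multiplied by the activity time in minutes, plus the cost of erroneous doses) is given below; the salary rates, times, and dose price are illustrative placeholders, not figures from the study.

        # Micro-costing sketch: service cost per dose and dose-error cost.
        # Salary rates (US$/min), activity times, and the dose price are
        # illustrative placeholders.
        activities = [
            ("registrar", 0.05, 6.7),   # (role, salary per minute, minutes per dose)
            ("physician", 0.08, 10.2),
            ("nurse",     0.06, 5.4),
        ]

        service_cost_per_dose = sum(rate * minutes for _, rate, minutes in activities)

        def dose_error_cost(n_error_doses, price_per_dose):
            """Cost of invalid or extra immunization doses."""
            return n_error_doses * price_per_dose

        print(f"service cost per dose: ${service_cost_per_dose:.2f}")
        print(f"cost of 100 invalid doses: ${dose_error_cost(100, 2.50):.2f}")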

  5. Cost estimates supporting West Valley DEIS

    SciTech Connect

    Pirro, J.

    1981-01-01

    An Environmental Impact Statement (EIS) is being prepared which considers alternate means for solidifying the high level liquid wastes (HLLW) at the Western New York Nuclear Service Center (WNYNSC). For this purpose three basic scenarios were considered. In the first scenario, the HLLW is converted into a terminal waste form of borosilicate glass. Before vitrification, the non-radioactive chemical salts are separated from the radioactive and transuranic (TRU) constituents in the HLLW. In the second scenario, the HLLW is converted into an intermediate form, fused salt. The stored HLLW is dewatered and melted and the solids are transported to a Department of Energy (DOE) site. The fused salt will be processed at the DOE site at a later date, where it will be converted to a vitrified form in a facility that will be constructed to treat HLLW stored at that site. The vitrified salt will eventually be removed for permanent disposal at a Federal repository. In the third scenario, the HLLW is solidified in the existing HLLW storage tanks with cement and retained for on-site disposal in the existing tanks or additional tanks as needed to accommodate the volume. To support the EIS, the costs to accomplish each of the alternatives are provided. The purpose of this cost estimate is to provide a common basis to evaluate the expenditures required to immobilize the HLLW presently stored at the WNYNSC. (DMC)

  6. 48 CFR 252.215-7002 - Cost estimating system requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Cost estimating system... of Provisions And Clauses 252.215-7002 Cost estimating system requirements. As prescribed in 215.408(2), use the following clause: Cost Estimating System Requirements (DEC 2012) (a)...

  7. 48 CFR 252.215-7002 - Cost estimating system requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Cost estimating system... of Provisions And Clauses 252.215-7002 Cost estimating system requirements. As prescribed in 215.408(2), use the following clause: Cost Estimating System Requirements (DEC 2012) (a)...

  8. 48 CFR 252.215-7002 - Cost estimating system requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Cost estimating system... of Provisions And Clauses 252.215-7002 Cost estimating system requirements. As prescribed in 215.408(2), use the following clause: Cost Estimating System Requirements (FEB 2012) (a)...

  9. ESTIMATED SAVINGS IN MEDICAL COSTS RESULTING FROM ASTHMA MANAGEMENT PROGRAMS

    EPA Science Inventory

    The purpose of this project is to estimate the direct medical costs of asthma to HMOs and health insurers. The study will estimate full medical costs and the subset of these full medical costs that is borne by HMOs/insurers. Next, the study will estimate the potential savings to ...

  10. 40 CFR 267.142 - Cost estimate for closure.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 26 2010-07-01 2010-07-01 false Cost estimate for closure. 267.142... PERMIT Financial Requirements § 267.142 Cost estimate for closure. (a) The owner or operator must have at the facility a detailed written estimate, in current dollars, of the cost of closing the facility...

  11. 40 CFR 265.142 - Cost estimate for closure.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Cost estimate for closure. 265.142... DISPOSAL FACILITIES Financial Requirements § 265.142 Cost estimate for closure. (a) The owner or operator must have a detailed written estimate, in current dollars, of the cost of closing the facility...

  12. 40 CFR 264.142 - Cost estimate for closure.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Cost estimate for closure. 264.142... Financial Requirements § 264.142 Cost estimate for closure. (a) The owner or operator must have a detailed written estimate, in current dollars, of the cost of closing the facility in accordance with...

  13. 48 CFR 252.215-7002 - Cost estimating system requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Cost estimating system... of Provisions And Clauses 252.215-7002 Cost estimating system requirements. As prescribed in 215.408(2), use the following clause: Cost Estimating System Requirements (DEC 2006) (a)...

  14. Radiologists’ ability to accurately estimate and compare their own interpretative mammography performance to their peers

    PubMed Central

    Cook, Andrea J.; Elmore, Joann G.; Zhu, Weiwei; Jackson, Sara L.; Carney, Patricia A.; Flowers, Chris; Onega, Tracy; Geller, Berta; Rosenberg, Robert D.; Miglioretti, Diana L.

    2013-01-01

    Objective To determine if U.S. radiologists accurately estimate their own interpretive performance of screening mammography and how they compare their performance to their peers’. Materials and Methods 174 radiologists from six Breast Cancer Surveillance Consortium (BCSC) registries completed a mailed survey between 2005 and 2006. Radiologists’ estimated and actual recall, false positive, and cancer detection rates and positive predictive value of biopsy recommendation (PPV2) for screening mammography were compared. Radiologists’ ratings of their performance as lower, similar, or higher than their peers were compared to their actual performance. Associations with radiologist characteristics were estimated using weighted generalized linear models. The study was approved by the institutional review boards of the participating sites, informed consent was obtained from radiologists, and procedures were HIPAA compliant. Results While most radiologists accurately estimated their cancer detection and recall rates (74% and 78% of radiologists), fewer accurately estimated their false positive rate and PPV2 (19% and 26%). Radiologists reported having similar (43%) or lower (31%) recall rates and similar (52%) or lower (33%) false positive rates compared to their peers, and similar (72%) or higher (23%) cancer detection rates and similar (72%) or higher (38%) PPV2. Estimation accuracy did not differ by radiologists’ characteristics except radiologists who interpret ≤1,000 mammograms annually were less accurate at estimating their recall rates. Conclusion Radiologists perceive their performance to be better than it actually is and at least as good as their peers. Radiologists have particular difficulty estimating their false positive rates and PPV2. PMID:22915414

  15. IDC RP2 & 3 US Industry Standard Cost Estimate Summary.

    SciTech Connect

    Harris, James M.; Huelskamp, Robert M.

    2015-01-01

    Sandia National Laboratories has prepared a ROM cost estimate for budgetary planning for the IDC Reengineering Phase 2 & 3 effort, using a commercial software cost estimation tool calibrated to US industry performance parameters. This is not a cost estimate for Sandia to perform the project. This report provides the ROM cost estimate and describes the methodology, assumptions, and cost model details used to create the ROM cost estimate. ROM Cost Estimate Disclaimer Contained herein is a Rough Order of Magnitude (ROM) cost estimate that has been provided to enable initial planning for this proposed project. This ROM cost estimate is submitted to facilitate informal discussions in relation to this project and is NOT intended to commit Sandia National Laboratories (Sandia) or its resources. Furthermore, as a Federally Funded Research and Development Center (FFRDC), Sandia must be compliant with the Anti-Deficiency Act and operate on a full-cost recovery basis. Therefore, while Sandia, in conjunction with the Sponsor, will use best judgment to execute work and to address the highest risks and most important issues in order to effectively manage within cost constraints, this ROM estimate and any subsequent approved cost estimates are on a 'full-cost recovery' basis. Thus, work can neither commence nor continue unless adequate funding has been accepted and certified by DOE.

  16. 15 CFR 23.4 - Cost and percentage estimates.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Cost and percentage estimates. 23.4... LOCATION AND RECOVERY OF MISSING CHILDREN § 23.4 Cost and percentage estimates. It is estimated that this... estimate that 9% of its penalty mail will transmit missing children photographs and information when...

  17. Accurate Non-parametric Estimation of Recent Effective Population Size from Segments of Identity by Descent

    PubMed Central

    Browning, Sharon R.; Browning, Brian L.

    2015-01-01

    Existing methods for estimating historical effective population size from genetic data have been unable to accurately estimate effective population size during the most recent past. We present a non-parametric method for accurately estimating recent effective population size by using inferred long segments of identity by descent (IBD). We found that inferred segments of IBD contain information about effective population size from around 4 generations to around 50 generations ago for SNP array data and to over 200 generations ago for sequence data. In human populations that we examined, the estimates of effective size were approximately one-third of the census size. We estimate the effective population size of European-ancestry individuals in the UK four generations ago to be eight million and the effective population size of Finland four generations ago to be 0.7 million. Our method is implemented in the open-source IBDNe software package. PMID:26299365

  18. Bi-fluorescence imaging for estimating accurately the nuclear condition of Rhizoctonia spp.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Aims: To simplify the determination of the nuclear condition of pathogenic Rhizoctonia, which currently must be performed either with two fluorescent dyes, which is more costly and time-consuming, or with only one fluorescent dye, which is less accurate. Methods and Results: A red primary ...

  19. ESTIMATION OF SMALL SYSTEM WATER TREATMENT COSTS

    EPA Science Inventory

    This report presents cost data for unit processes that are capable of removing contaminants included in the National Interim Primary Drinking Water Regulations. Construction and operation and maintenance cost data are presented for 45 centralized treatment unit processes that are...

  20. AN OVERVIEW OF TOOL FOR RESPONSE ACTION COST ESTIMATING (TRACE)

    SciTech Connect

    FERRIES SR; KLINK KL; OSTAPKOWICZ B

    2012-01-30

    Tools and techniques that provide improved performance and reduced costs are important to government programs, particularly in current times. An opportunity for improvement was identified for preparation of cost estimates used to support the evaluation of response action alternatives. As a result, CH2M HILL Plateau Remediation Company has developed the Tool for Response Action Cost Estimating (TRACE). TRACE is a multi-page Microsoft Excel® workbook developed to introduce efficiencies into the timely and consistent production of cost estimates for response action alternatives. This tool combines costs derived from extensive site-specific runs of commercially available remediation cost models with site-specific and estimator-researched and derived costs, providing the best estimating sources available. TRACE also provides for common quantity and key parameter links across multiple alternatives, maximizing ease of updating estimates and performing sensitivity analyses, and ensuring consistency.

  1. 43 CFR 9185.4-1 - Estimate of cost.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false Estimate of cost. 9185.4-1 Section 9185.4... Estimate of cost. (a) The cost of resurvey procedure is as a rule considerably in excess of that incident... required in order to obtain technical control, and where, by reason of errors in the original survey,...

  2. 43 CFR 9185.4-1 - Estimate of cost.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Estimate of cost. 9185.4-1 Section 9185.4... Estimate of cost. (a) The cost of resurvey procedure is as a rule considerably in excess of that incident... required in order to obtain technical control, and where, by reason of errors in the original survey,...

  3. 43 CFR 9185.4-1 - Estimate of cost.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false Estimate of cost. 9185.4-1 Section 9185.4... Estimate of cost. (a) The cost of resurvey procedure is as a rule considerably in excess of that incident... required in order to obtain technical control, and where, by reason of errors in the original survey,...

  4. 43 CFR 9185.4-1 - Estimate of cost.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Estimate of cost. 9185.4-1 Section 9185.4... Estimate of cost. (a) The cost of resurvey procedure is as a rule considerably in excess of that incident... required in order to obtain technical control, and where, by reason of errors in the original survey,...

  5. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from carriers... if, as determined by the FBI, all cost data reasonably available to the carrier are either submitted... explain the estimating process are required by the FBI and the carrier refuses to provide necessary...

  6. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from carriers... if, as determined by the FBI, all cost data reasonably available to the carrier are either submitted... explain the estimating process are required by the FBI and the carrier refuses to provide necessary...

  7. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from carriers... if, as determined by the FBI, all cost data reasonably available to the carrier are either submitted... explain the estimating process are required by the FBI and the carrier refuses to provide necessary...

  8. 28 CFR 100.16 - Cost estimate submission.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... evaluation of the estimated costs. The FBI reserves the right to request additional cost data from carriers... if, as determined by the FBI, all cost data reasonably available to the carrier are either submitted... explain the estimating process are required by the FBI and the carrier refuses to provide necessary...

  9. Assuring Software Cost Estimates: Is it an Oxymoron?

    NASA Technical Reports Server (NTRS)

    Hihn, Jarius; Tregre, Grant

    2013-01-01

    The software industry repeatedly observes cost growth of well over 100% even after decades of cost estimation research and well-known best practices, so "What's the problem?" In this paper we will provide an overview of the current state of software cost estimation best practice. We then explore whether applying some of the methods used in software assurance might improve the quality of software cost estimates. This paper especially focuses on issues associated with model calibration, estimate review, and the development and documentation of estimates as part of an integrated plan.

  10. Simple, fast and accurate eight points amplitude estimation method of sinusoidal signals for DSP based instrumentation

    NASA Astrophysics Data System (ADS)

    Vizireanu, D. N.; Halunga, S. V.

    2012-04-01

    A simple, fast and accurate amplitude estimation algorithm of sinusoidal signals for DSP based instrumentation is proposed. It is shown that eight samples, used in two steps, are sufficient. A practical analytical formula for amplitude estimation is obtained. Numerical results are presented. Simulations have been performed when the sampled signal is affected by white Gaussian noise and when the samples are quantized on a given number of bits.

  11. Systems engineering and integration: Cost estimation and benefits analysis

    NASA Technical Reports Server (NTRS)

    Dean, ED; Fridge, Ernie; Hamaker, Joe

    1990-01-01

    Space Transportation Avionics hardware and software cost has traditionally been estimated in Phase A and B using cost techniques which predict cost as a function of various cost predictive variables such as weight, lines of code, functions to be performed, quantities of test hardware, quantities of flight hardware, design and development heritage, complexity, etc. The output of such analyses has been life cycle costs, economic benefits and related data. The major objectives of Cost Estimation and Benefits analysis are twofold: (1) to play a role in the evaluation of potential new space transportation avionics technologies, and (2) to benefit from emerging technological innovations. Both aspects of cost estimation and technology are discussed here. The role of cost analysis in the evaluation of potential technologies should be one of offering additional quantitative and qualitative information to aid decision-making. The cost analysis process needs to be fully integrated into the design process in such a way that cost trades, optimizations and sensitivities are understood. Current hardware cost models tend to primarily use weights, functional specifications, quantities, design heritage and complexity as metrics to predict cost. Software models mostly use functionality, volume of code, heritage and complexity as cost descriptive variables. Basic research needs to be initiated to develop metrics more responsive to the trades which are required for future launch vehicle avionics systems. These would include cost estimating capabilities that are sensitive to technological innovations such as improved materials and fabrication processes, computer aided design and manufacturing, self checkout and many others. In addition to basic cost estimating improvements, the process must be sensitive to the fact that no cost estimate can be quoted without also quoting a confidence associated with the estimate. In order to achieve this, better cost risk evaluation techniques are

  12. 48 CFR 36.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Government estimate of... Contracting for Construction 36.203 Government estimate of construction costs. (a) An independent Government estimate of construction costs shall be prepared and furnished to the contracting officer at the...

  13. Space tug economic analysis study. Volume 3: Cost estimates

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Cost estimates for the space tug operation are presented. The subjects discussed are: (1) research and development costs, (2) investment costs, (3) operations costs, and (4) funding requirements. The emphasis is placed on the single stage tug configuration using various types of liquid propellants.

  14. 40 CFR 261.142 - Cost estimate.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... current dollars, of the cost of disposing of any hazardous secondary material as listed or characteristic... requirements if he can demonstrate that on-site disposal capacity will exist at all times over the life of the... value. (b) During the active life of the facility, the owner or operator must adjust the cost...

  15. ESTIMATING COSTS FOR WATER SUPPLY DISTRIBUTION SYSTEMS

    EPA Science Inventory

    The issue of economic effects and the cost of water supply is a continuing factor in implementing the Safe Drinking Water Act. The cost of distributing water to the final user after it has been treated is of growing concern as well as its quality. There are a significant number o...

  16. Estimating demolition cost of plutonium buildings for dummies

    SciTech Connect

    Tower, S.E.

    2000-07-01

    The primary purpose of the Rocky Flats Field Office of the US Department of Energy is to decommission the entire plant. In an effort to improve the basis and the accuracy of the future decommissioning cost, Rocky Flats has developed a powerful but easy-to-use tool to determine budget cost estimates to characterize, decontaminate, and demolish all its buildings. The parametric cost-estimating tool is called the Facilities Disposition Cost Model (FDCM).

  17. State Medicaid Pharmacy Payments and Their Relation to Estimated Costs

    PubMed Central

    Adams, E. Kathleen; Kreling, David H.; Gondek, Kathleen

    1994-01-01

    Although prescription drugs do not appear to be a primary source of recent surges in Medicaid spending, their share of Medicaid expenditures has risen despite efforts to control costs. As part of a general concern with prescription drug policy, Congress mandated a study of the adequacy of Medicaid payments to pharmacies. In this study, several data sources were used to develop 1991 estimates of average pharmacy ingredient and dispensing costs. A simulation was used to estimate the amounts States pay. Nationally, simulated payments averaged 96 percent of estimated costs overall but were lower for dispensing costs (79 percent) and higher for ingredient costs (102 percent). PMID:10137796

  18. Pros, Cons, and Alternatives to Weight Based Cost Estimating

    NASA Technical Reports Server (NTRS)

    Joyner, Claude R.; Lauriem, Jonathan R.; Levack, Daniel H.; Zapata, Edgar

    2011-01-01

    Many cost estimating tools use weight as a major parameter in projecting the cost. This is often combined with modifying factors such as complexity, technical maturity of design, environment of operation, etc. to increase the fidelity of the estimate. For a set of conceptual designs, all meeting the same requirements, increased weight can be a major driver in increased cost. However, once a design is fixed, increased weight generally decreases cost, while decreased weight generally increases cost, and the relationship is not linear. Alternative approaches to estimating cost without using weight (except perhaps for materials costs) have been attempted to try to produce a tool usable throughout the design process, from concept studies through development. This paper will address the pros and cons of using weight-based models for cost estimating, using liquid rocket engines as the example. It will then examine approaches that minimize the impact of weight-based cost estimating. The Rocket Engine Cost Model (RECM) is an attribute-based model developed internally by Pratt & Whitney Rocketdyne for NASA. RECM will be presented primarily to show a successful method to use design and programmatic parameters instead of weight to estimate both design and development costs and production costs. An operations model developed by KSC, the Launch and Landing Effects Ground Operations model (LLEGO), will also be discussed.
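
    To make the weight-based approach being critiqued concrete, a conventional CER of this kind can be sketched as a power law in dry weight with multiplicative complexity and heritage factors; the coefficients and the example comparison below are illustrative assumptions, not RECM or any model from the paper.

        # Hedged sketch of a conventional weight-based CER:
        # cost = a * weight^b * complexity * heritage (all coefficients illustrative).
        def weight_based_cer(dry_weight_kg, a=120.0, b=0.85, complexity=1.0, heritage=1.0):
            """First-unit cost in $K (illustrative units)."""
            return a * dry_weight_kg ** b * complexity * heritage

        # A lighter but more complex design meeting the same requirement can come
        # out more expensive than a heavier, simpler one.
        print(f"heavier, simple design:  ${weight_based_cer(2000.0):,.0f}K")
        print(f"lighter, complex design: ${weight_based_cer(1600.0, complexity=1.4):,.0f}K")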

  19. On the accurate estimation of gap fraction during daytime with digital cover photography

    NASA Astrophysics Data System (ADS)

    Hwang, Y. R.; Ryu, Y.; Kimm, H.; Macfarlane, C.; Lang, M.; Sonnentag, O.

    2015-12-01

    Digital cover photography (DCP) has emerged as an indirect method to obtain gap fraction accurately. Thus far, however, the intervention of subjectivity, such as determining the camera relative exposure value (REV) and the threshold in the histogram, has hindered computing accurate gap fraction. Here we propose a novel method that enables us to measure gap fraction accurately during daytime under various sky conditions by DCP. The novel method computes gap fraction using a single unsaturated DCP raw image which is corrected for scattering effects by canopies and a sky image reconstructed from the raw format image. To test the sensitivity of the gap fraction derived by the novel method to diverse REVs, solar zenith angles and canopy structures, we took photos at one-hour intervals between sunrise and midday under dense and sparse canopies with REV 0 to -5. The novel method showed little variation of gap fraction across different REVs in both dense and sparse canopies across a diverse range of solar zenith angles. The perforated panel experiment, which was used to test the accuracy of the estimated gap fraction, confirmed that the novel method resulted in accurate and consistent gap fractions across different hole sizes, gap fractions and solar zenith angles. These findings highlight that the novel method opens new opportunities to estimate gap fraction accurately during daytime from sparse to dense canopies, which will be useful in monitoring LAI precisely and validating satellite remote sensing LAI products efficiently.

  20. Preliminary estimates of operating costs for lighter than air transports

    NASA Technical Reports Server (NTRS)

    Smith, C. L.; Ardema, M. D.

    1975-01-01

    A preliminary set of operating cost relationships are presented for airship transports. The starting point for the development of the relationships is the direct operating cost formulae and the indirect operating cost categories commonly used for estimating costs of heavier than air commercial transports. Modifications are made to the relationships to account for the unique features of airships. To illustrate the cost estimating method, the operating costs of selected airship cargo transports are computed. Conventional fully buoyant and hybrid semi-buoyant systems are investigated for a variety of speeds, payloads, ranges, and altitudes. Comparisons are made with aircraft transports for a range of cargo densities.

  1. Preliminary estimates of operating costs for lighter than air transports

    NASA Technical Reports Server (NTRS)

    Smith, C. L.; Ardema, M. D.

    1975-01-01

    Presented is a preliminary set of operating cost relationships for airship transports. The starting point for the development of the relationships is the direct operating cost formulae and the indirect operating cost categories commonly used for estimating costs of heavier than air commercial transports. Modifications are made to the relationships to account for the unique features of airships. To illustrate the cost estimating method, the operating costs of selected airship cargo transports are computed. Conventional fully buoyant and hybrid semi-buoyant systems are investigated for a variety of speeds, payloads, ranges, and altitudes. Comparisons are made with aircraft transports for a range of cargo densities.

  2. Estimation of the cost of using chemical protective clothing

    SciTech Connect

    Schwope, A.D.; Renard, E.R.

    1993-01-01

    The U.S. Environmental Protection Agency, either directly or through its Superfund contractors, is a major user of chemical protective clothing. The purpose of the study was to develop estimates for the cost of using this clothing. These estimates can be used to guide purchase decisions and use practices. For example, economic guidelines would assist in decisions pertinent to single-use versus reusable clothing. Eight cost elements were considered: (1) purchase cost, (2) the number of times an item is used, (3) the number of items used per day, (4) cost of decontamination, (5) cost of inspection, (6) cost of maintenance, (7) cost of storage, and (8) cost of disposal. Estimates or assumed inputs for each of these elements were developed based on labor costs, fixed costs, and recurring costs. The cost elements were combined into an economic (mathematical) model having the single output of cost/use. By comparing cost/use for various use scenarios, conclusions are readily reached as to the optimum economics for purchase, use, and reuse of the clothing. In general, an item should be considered disposable if its purchase cost is less than the average cost per use it would incur over the anticipated number of reuses.
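
    The eight cost elements listed above combine naturally into a cost-per-use figure; the sketch below shows one way such a model might be arranged, with all dollar values invented for illustration rather than taken from the study.

        # Cost-per-use sketch: one-time costs are spread over the number of uses,
        # recurring costs are charged each use. Dollar values are illustrative.
        def cost_per_use(purchase, uses_per_item, decon, inspection,
                         maintenance, storage, disposal):
            one_time = (purchase + disposal) / uses_per_item
            recurring = decon + inspection + maintenance + storage
            return one_time + recurring

        reusable = cost_per_use(purchase=300.0, uses_per_item=10, decon=12.0,
                                inspection=3.0, maintenance=4.0, storage=1.0,
                                disposal=25.0)
        disposable = cost_per_use(purchase=45.0, uses_per_item=1, decon=0.0,
                                  inspection=1.0, maintenance=0.0, storage=0.5,
                                  disposal=8.0)
        print(f"reusable: ${reusable:.2f}/use   disposable: ${disposable:.2f}/use")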

  3. Estimating the cost of major ongoing cost plus hardware development programs

    NASA Technical Reports Server (NTRS)

    Bush, J. C.

    1990-01-01

    Approaches are developed for forecasting the cost of major hardware development programs while these programs are in the design and development C/D phase. Three approaches are developed: a schedule assessment technique for bottom-line summary cost estimation, a detailed cost estimation approach, and an intermediate cost element analysis procedure. The schedule assessment technique was developed using historical cost/schedule performance data.

  4. A new geometric-based model to accurately estimate arm and leg inertial estimates.

    PubMed

    Wicke, Jason; Dumas, Geneviève A

    2014-06-01

    Segment estimates of mass, center of mass and moment of inertia are required input parameters to analyze the forces and moments acting across the joints. The objectives of this study were to propose a new geometric model for limb segments, to evaluate it against criterion values obtained from DXA, and to compare its performance to five other popular models. Twenty-five female and 24 male college students participated in the study. For the criterion measures, the participants underwent a whole body DXA scan, and estimates for segment mass, center of mass location, and moment of inertia (frontal plane) were directly computed from the DXA mass units. For the new model, the volume was determined from two standing photographs, one frontal and one sagittal. Each segment was modeled as a stack of slices, the sections of which were ellipses if they did not adjoin another segment and sectioned ellipses if they did (e.g. upper arm and trunk). The lengths of the axes of the ellipses were obtained from the photographs. In addition, a sex-specific, non-uniform density function was developed for each segment. A series of anthropometric measurements were also taken by directly following the definitions provided for the different body segment models tested, and the same parameters were determined for each model. Comparison of models showed that estimates from the new model were consistently closer to the DXA criterion than those from the other models, with an error of less than 5% for mass and moment of inertia and less than about 6% for center of mass location. PMID:24735506
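
    The stacked-ellipse idea described above can be sketched as follows: each slice's volume is the ellipse area times the slice thickness, and mass and centre of mass follow from an assumed density profile. The axis lengths, thickness, and density in this sketch are illustrative, not the study's measurements or its sex-specific density functions.

        import numpy as np

        # Segment mass and centre of mass from stacked elliptical slices
        # (axis lengths are full diameters; all values are illustrative).
        def segment_properties(a_diam_m, b_diam_m, slice_thickness_m, density_kg_m3):
            a = np.asarray(a_diam_m)
            b = np.asarray(b_diam_m)
            rho = np.asarray(density_kg_m3)
            volumes = np.pi * (a / 2.0) * (b / 2.0) * slice_thickness_m
            masses = volumes * rho
            z = (np.arange(len(a)) + 0.5) * slice_thickness_m   # slice centres
            mass = masses.sum()
            com = (masses * z).sum() / mass                     # from proximal end
            return mass, com

        # Ten 3 cm slices tapering from 11 cm x 10 cm to 7 cm x 6 cm, uniform density.
        mass, com = segment_properties(np.linspace(0.11, 0.07, 10),
                                       np.linspace(0.10, 0.06, 10),
                                       0.03, np.full(10, 1050.0))
        print(f"mass = {mass:.2f} kg, centre of mass = {100 * com:.1f} cm")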

  5. Satellite servicing mission preliminary cost estimation model

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The cost model presented is a preliminary methodology for determining a rough order-of-magnitude cost for implementing a satellite servicing mission. Mission implementation, in this context, encompasses all activities associated with mission design and planning, including both flight and ground crew training and systems integration (payload processing) of servicing hardware with the Shuttle. A basic assumption made in developing this cost model is that a generic set of servicing hardware was developed and flight tested, is inventoried, and is maintained by NASA. This implies that all hardware physical and functional interfaces are well known and therefore recurring CITE testing is not required. The development of the cost model algorithms and examples of their use are discussed.

  6. Handbook for cost estimating. A method for developing estimates of costs for generic actions for nuclear power plants

    SciTech Connect

    Ball, J.R.; Cohen, S.; Ziegler, E.Z.

    1984-10-01

    This document provides overall guidance to assist the NRC in preparing the types of cost estimates required by the Regulatory Analysis Guidelines and to assist in the assignment of priorities in resolving generic safety issues. The Handbook presents an overall cost model that allows the cost analyst to develop a chronological series of activities needed to implement a specific regulatory requirement throughout all applicable commercial LWR power plants and to identify the significant cost elements for each activity. References to available cost data are provided along with rules of thumb and cost factors to assist in evaluating each cost element. A suitable code-of-accounts data base is presented to assist in organizing and aggregating costs. Rudimentary cost analysis methods are described to allow the analyst to produce a constant-dollar, lifetime cost for the requirement. A step-by-step example cost estimate is included to demonstrate the overall use of the Handbook.

  7. Accurate Estimation of the Entropy of Rotation-Translation Probability Distributions.

    PubMed

    Fogolari, Federico; Dongmo Foumthuim, Cedrix Jurgal; Fortuna, Sara; Soler, Miguel Angel; Corazza, Alessandra; Esposito, Gennaro

    2016-01-12

    The estimation of rotational and translational entropies in the context of ligand binding has been the subject of long-time investigations. The high dimensionality (six) of the problem and the limited amount of sampling often prevent the required resolution to provide accurate estimates by the histogram method. Recently, the nearest-neighbor distance method has been applied to the problem, but the solutions provided either address rotation and translation separately, therefore lacking correlations, or use a heuristic approach. Here we address rotational-translational entropy estimation in the context of nearest-neighbor-based entropy estimation, solve the problem numerically, and provide an exact and an approximate method to estimate the full rotational-translational entropy. PMID:26605696
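
    For readers unfamiliar with nearest-neighbour entropy estimation, a generic Kozachenko-Leonenko-style estimator in a plain Euclidean space is sketched below; the paper's contribution, handling the rotational metric and rotation-translation coupling of the full six-dimensional problem, is not reproduced here, and the Gaussian test case is only a sanity check.

        import numpy as np
        from scipy.spatial import cKDTree
        from scipy.special import digamma, gammaln

        def knn_entropy(samples, k=1):
            """Nearest-neighbour differential entropy estimate (nats) for an
            (N, d) sample array, Euclidean metric."""
            x = np.asarray(samples, dtype=float)
            n, d = x.shape
            tree = cKDTree(x)
            # distance to the k-th nearest neighbour, excluding the point itself
            r = tree.query(x, k=k + 1)[0][:, k]
            log_unit_ball = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
            return digamma(n) - digamma(k) + log_unit_ball + d * np.mean(np.log(r))

        rng = np.random.default_rng(0)
        x = rng.standard_normal((20_000, 3))
        analytic = 1.5 * np.log(2.0 * np.pi * np.e)   # entropy of a 3-D standard normal
        print(f"estimate {knn_entropy(x):.3f} vs analytic {analytic:.3f} nats")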

  8. Polynomial Fitting of DT-MRI Fiber Tracts Allows Accurate Estimation of Muscle Architectural Parameters

    PubMed Central

    Damon, Bruce M.; Heemskerk, Anneriet M.; Ding, Zhaohua

    2012-01-01

    Fiber curvature is a functionally significant muscle structural property, but its estimation from diffusion-tensor MRI fiber tracking data may be confounded by noise. The purpose of this study was to investigate the use of polynomial fitting of fiber tracts for improving the accuracy and precision of fiber curvature (κ) measurements. Simulated image datasets were created in order to provide data with known values for κ and pennation angle (θ). Simulations were designed to test the effects of increasing inherent fiber curvature (3.8, 7.9, 11.8, and 15.3 m−1), signal-to-noise ratio (50, 75, 100, and 150), and voxel geometry (13.8 and 27.0 mm3 voxel volume with isotropic resolution; 13.5 mm3 volume with an aspect ratio of 4.0) on κ and θ measurements. In the originally reconstructed tracts, θ was estimated accurately under most curvature and all imaging conditions studied; however, the estimates of κ were imprecise and inaccurate. Fitting the tracts to 2nd order polynomial functions provided accurate and precise estimates of κ for all conditions except very high curvature (κ=15.3 m−1), while preserving the accuracy of the θ estimates. Similarly, polynomial fitting of in vivo fiber tracking data reduced the κ values of fitted tracts from those of unfitted tracts and did not change the θ values. Polynomial fitting of fiber tracts allows accurate estimation of physiologically reasonable values of κ, while preserving the accuracy of θ estimation. PMID:22503094
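
    A bare-bones version of the fitting step can be sketched as follows: fit each coordinate of a tract to a second-order polynomial in a common parameter and evaluate the curvature of the fitted space curve as |r' x r''| / |r'|^3. The synthetic circular-arc tract and noise level below are illustrative, not DT-MRI data or the study's simulation settings.

        import numpy as np

        def fit_tract(points, order=2):
            """Fit x(t), y(t), z(t) polynomials to an (N, 3) array of tract points."""
            t = np.linspace(0.0, 1.0, len(points))
            coeffs = [np.polyfit(t, points[:, i], order) for i in range(3)]
            return t, coeffs

        def curvature(coeffs, t):
            """Curvature of the fitted curve: |r' x r''| / |r'|**3."""
            d1 = np.stack([np.polyval(np.polyder(c, 1), t) for c in coeffs], axis=-1)
            d2 = np.stack([np.polyval(np.polyder(c, 2), t) for c in coeffs], axis=-1)
            num = np.linalg.norm(np.cross(d1, d2), axis=-1)
            return num / np.linalg.norm(d1, axis=-1) ** 3

        # Noisy samples of a planar arc of radius 0.1 m (true curvature 10 per metre).
        rng = np.random.default_rng(1)
        s = np.linspace(0.0, 0.6, 60)
        tract = np.column_stack([0.1 * np.cos(s), 0.1 * np.sin(s), np.zeros_like(s)])
        tract += 0.0005 * rng.standard_normal(tract.shape)
        t, coeffs = fit_tract(tract)
        print(f"mean fitted curvature: {curvature(coeffs, t).mean():.1f} per metre")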

  9. Fuel Cost Estimation for Sumatra Grid System

    NASA Astrophysics Data System (ADS)

    Liun, Edwaren

    2010-06-01

    Sumatra has had a high growth rate of electricity demand since the first decade of this century; at the middle of this decade the growth is 11% per annum. On the other side, the capability of the Government of Indonesia and the PLN authority is limited, while many existing power plants, most of them old, will be retired. The growth of electricity demand in Sumatra will increase fuel consumption over the next several decades. Based on several cases with varying growth scenarios and economic parameters, it is shown that several kinds of fossil fuel will continue to be required for the next several decades. Although Sumatra has abundant coal resources, other fuel types such as fuel oil, diesel, gas and nuclear are also needed. Under the Base Scenario with a discount rate of 10%, the Sumatra system will require 11.6 million tonnes of coal through 2030, producing 866 TWh at a cost of US$10,558 million. Nuclear plants produce about 501 TWh, or 32%, at a cost of US$3.1 billion. Under the High Scenario with a discount rate of 10%, coal consumption becomes 486.6 million tonnes at a fuel cost of US$12.7 billion, producing 1033 TWh of electricity. The nuclear fuel cost required in this scenario is US$7.06 billion. The other fuel consumed in large amounts is natural gas for combined cycle plants, at a cost of US$1.38 billion and producing 11.7 TWh of electricity under the Base Scenario with a discount rate of 10%. In the High Scenario with a discount rate of 10%, coal plants take a large role in power generation in Sumatra, producing about 866 TWh or 54% of electricity. Coal consumption is highest under the Base Scenario with a discount rate of 12%, producing 756 TWh at a required cost of US$17.1 billion. Nuclear plants are not applicable in this scenario owing to their lack of competitiveness. The fuel cost will depend on the role of nuclear power in the Sumatra system; fuel cost will increase in line with the increase in coal consumption in the case where nuclear power plants do not appear.

  10. ABC estimation of unit costs for emergency department services.

    PubMed

    Holmes, R L; Schroeder, R E

    1996-04-01

    Rapid evolution of the health care industry forces managers to make cost-effective decisions. Typical hospital cost accounting systems do not provide emergency department managers with the information needed, but emergency department settings are so complex and dynamic as to make the more accurate activity-based costing (ABC) system prohibitively expensive. Through judicious use of the available traditional cost accounting information and simple computer spreadsheets, managers may approximate the decision-guiding information that would result from the much more costly and time-consuming implementation of ABC. PMID:10156656

  11. Cost estimate guidelines for advanced nuclear power technologies

    SciTech Connect

    Hudson, C.R. II

    1986-07-01

    To make comparative assessments of competing technologies, consistent ground rules must be applied when developing cost estimates. This document provides a uniform set of assumptions, ground rules, and requirements that can be used in developing cost estimates for advanced nuclear power technologies.

  12. 40 CFR 267.142 - Cost estimate for closure.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... active life of the facility, the owner or operator must adjust the closure cost estimate for inflation... closure cost estimate must be updated for inflation within 30 days after the close of the firm's fiscal... dollars, or by using an inflation factor derived from the most recent Implicit Price Deflator for...

  13. 40 CFR 267.142 - Cost estimate for closure.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... active life of the facility, the owner or operator must adjust the closure cost estimate for inflation... closure cost estimate must be updated for inflation within 30 days after the close of the firm's fiscal... dollars, or by using an inflation factor derived from the most recent Implicit Price Deflator for...

  14. 40 CFR 267.142 - Cost estimate for closure.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... latest published annual Deflator by the Deflator for the previous year. (1) The first adjustment is made... cost estimate. (2) Subsequent adjustments are made by multiplying the latest adjusted closure cost estimate by the latest inflation factor. (c) During the active life of the facility, the owner or...
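
    The adjustment rule these closure-cost provisions describe reduces to a small calculation: the annual inflation factor is the latest published Implicit Price Deflator divided by the previous year's value, applied first to the original estimate and thereafter to the latest adjusted estimate. The sketch below illustrates it with made-up deflator values.

        # Inflation adjustment of a closure cost estimate using Implicit Price
        # Deflator ratios (deflator values below are illustrative).
        def inflation_factor(latest_deflator, previous_deflator):
            return latest_deflator / previous_deflator

        estimate = 1_250_000.0          # closure cost estimate in current dollars
        for latest, previous in [(104.1, 101.8), (106.5, 104.1)]:
            estimate *= inflation_factor(latest, previous)
            print(f"adjusted closure cost estimate: ${estimate:,.0f}")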

  15. 40 CFR 264.142 - Cost estimate for closure.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... dividing the latest published annual Deflator by the Deflator for the previous year. (1) The first... adjusted closure cost estimate. (2) Subsequent adjustments are made by multiplying the latest adjusted closure cost estimate by the latest inflation factor. (c) During the active life of the facility,...

  16. 40 CFR 265.142 - Cost estimate for closure.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... dividing the latest published annual Deflator by the Deflator for the previous year. (1) The first... adjusted closure cost estimate. (2) Subsequent adjustments are made by multiplying the latest adjusted closure cost estimate by the latest inflation factor. (c) During the active life of the facility,...

  17. Cost estimate guidelines for advanced nuclear power technologies

    SciTech Connect

    Delene, J.G.; Hudson, C.R. II.

    1990-03-01

    To make comparative assessments of competing technologies, consistent ground rules must be applied when developing cost estimates. This document provides a uniform set of assumptions, ground rules, and requirements that can be used in developing cost estimates for advanced nuclear power technologies. 10 refs., 8 figs., 32 tabs.

  18. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate

    PubMed Central

    Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul

    2015-01-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere. PMID:26633821

  19. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate.

    PubMed

    Minyoo, Abel B; Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul; Lankester, Felix

    2015-12-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere. PMID:26633821

  20. The DOE National Transportation Program Cost-Estimating Model

    SciTech Connect

    Rawl, R.R.

    2001-09-25

    The United States Department of Energy (DOE) carries out a significant amount of transportation each year, including waste remediation activities at the sites for which it is responsible. In future years, the amount of material transported is expected to increase, and the costs of this transportation are expected to be large. To support the assessment of such costs, a cost-estimating model was developed in 1996, peer-reviewed against other available packaging and transportation cost data, and used to calculate the costs for a significant number of shipping campaigns of radioactive waste. This cost-estimating model, known as the Ten-year Plan Transportation Model (TEPTRAM), served as the cost-estimating model for radioactive material shipments in developing the DOE Office of Environmental Management's Ten Year Plan. The TEPTRAM model considered costs for recovery and processing of the wastes, packaging of the wastes for transport, carriage of the waste and a rough estimate of labor costs associated with preparing and undertaking the shipments. At the user's direction, the model could also include the cost for DOE's interaction with its external stakeholders (e.g., state and local governments and tribal entities) and the cost associated with tracking and communication (e.g., use of the DOE TRANSCOM system). By considering all of these sources of costs, it provided a mechanism for assessing and comparing the costs of various waste processing and shipping campaign alternatives to help guide decision-making. Recognizing that a more user-friendly version of a cost-estimating model would be more useful to the DOE packaging and transportation community, the National Transportation Program sponsored an update of the TEPTRAM model. The new Transportation Cost Estimating Model (TRANSCOST) was developed to fulfill this need. TRANSCOST utilizes a series of input and output screens to facilitate information flow, and a number of new features were added on the basis of features

  1. Review of storage battery system cost estimates

    SciTech Connect

    Brown, D.R.; Russell, J.A.

    1986-04-01

    Cost analyses for zinc bromine, sodium sulfur, and lead acid batteries were reviewed. Zinc bromine and sodium sulfur batteries were selected because of their advanced design nature and the high level of interest in these two technologies. Lead acid batteries were included to establish a baseline representative of a more mature technology.

  2. Estimating software development costs for a patient multimedia education project.

    PubMed

    Caban, A; Cimino, C; Swencionis, C; Ginsberg, M; Wylie-Rosett, J

    2001-01-01

    The authors compare alternative methods of cost estimation for a patient multimedia education (PME) program, using a computerized weight-reduction PME project as an example. Data from the project planning and budgeting process and actual costs of the completed project are analyzed retrospectively to calculate three different estimates: pre-work, post-work, and actual work. Three traditional methods of estimating the cost of computer programs (the lines-of-code, function point, and task ratio analyses) underestimate costs in this example. A commercial program (Cost Xpert) that calculates the cost of developing a graphical user interface provided a better estimate, as did a tally reflecting the complexity and quality of media material in the project. PMID:11230386
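
    For readers unfamiliar with the lines-of-code style of estimate mentioned above, a textbook basic-COCOMO "organic mode" relationship is sketched below purely for orientation; it is not the model applied in the PME study, and the labor rate is an illustrative assumption.

        # Basic COCOMO (organic mode) sketch: effort and schedule from estimated
        # thousands of lines of code; labor rate is an illustrative assumption.
        def cocomo_organic(kloc, labor_rate_per_pm=12_000.0):
            effort_pm = 2.4 * kloc ** 1.05          # person-months
            schedule_months = 2.5 * effort_pm ** 0.38
            return effort_pm, schedule_months, effort_pm * labor_rate_per_pm

        effort, months, cost = cocomo_organic(8.0)   # an 8 KLOC program
        print(f"{effort:.1f} person-months, {months:.1f} months, about ${cost:,.0f}")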

  3. Accurate estimation of forest carbon stocks by 3-D remote sensing of individual trees.

    PubMed

    Omasa, Kenji; Qiu, Guo Yu; Watanuki, Kenichi; Yoshimi, Kenji; Akiyama, Yukihide

    2003-03-15

    Forests are one of the most important carbon sinks on Earth. However, owing to the complex structure, variable geography, and large area of forests, accurate estimation of forest carbon stocks is still a challenge for both site surveying and remote sensing. For these reasons, the Kyoto Protocol requires the establishment of methodologies for estimating the carbon stocks of forests (Kyoto Protocol, Article 5). A possible solution to this challenge is to remotely measure the carbon stocks of every tree in an entire forest. Here, we present a methodology for estimating carbon stocks of a Japanese cedar forest by using a high-resolution, helicopter-borne 3-dimensional (3-D) scanning lidar system that measures the 3-D canopy structure of every tree in a forest. Results show that a digital image (10-cm mesh) of woody canopy can be acquired. The treetop can be detected automatically with a reasonable accuracy. The absolute error ranges for tree height measurements are within 42 cm. Allometric relationships of height to carbon stocks then permit estimation of total carbon storage by measurement of carbon stocks of every tree. Thus, we suggest that our methodology can be used to accurately estimate the carbon stocks of Japanese cedar forests at a stand scale. Periodic measurements will reveal changes in forest carbon stocks. PMID:12680675

  4. Retrofit FGD cost-estimating guidelines. Final report. [6 processes

    SciTech Connect

    Shattuck, D.M.; Ireland, P.A.; Keeth, R.J.; Mora, R.R.; Scheck, R.W.; Archambeault, J.A.; Rathbun, G.R.

    1984-10-01

    This report presents a method to estimate specific plant FGD retrofit costs. The basis of the estimate is a new plant's FGD system cost, as provided in EPRI's Economic Evaluation of FGD Systems CS-3342, or any other generalized cost estimate. The methodology adjusts the capital cost for the sulfur content of the coal, sulfur removal required, unit size, geographic location variables, and retrofit considerations. The methodology also allows the user to calculate first year operating and maintenance (O and M) costs based on site-specific variables. Finally, the report provides a means to adjust for remaining unit life in determining the levelized busbar cost. Levelized cost is presented in mills/kWh and $/t SO2 removed.
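
    The multiplicative, factor-based adjustment described above can be sketched as follows. This is only an illustration of the structure (a generalized new-plant FGD cost scaled by sulfur content, removal requirement, unit size, location, and retrofit difficulty); the function name, factor values, and base cost are placeholders, not figures from the EPRI methodology.

      # Hypothetical factor-based retrofit FGD capital-cost adjustment.
      # All numeric values are placeholders, not EPRI data.

      def retrofit_fgd_capital_cost(base_cost_new_plant,
                                    sulfur_factor,
                                    removal_factor,
                                    size_factor,
                                    location_factor,
                                    retrofit_factor):
          """Scale a new-plant FGD capital cost by site-specific adjustment factors."""
          return (base_cost_new_plant * sulfur_factor * removal_factor *
                  size_factor * location_factor * retrofit_factor)

      cost = retrofit_fgd_capital_cost(
          base_cost_new_plant=150e6,  # $, generalized new-plant estimate (placeholder)
          sulfur_factor=1.10,         # higher-sulfur coal (placeholder)
          removal_factor=1.05,        # stricter SO2 removal requirement (placeholder)
          size_factor=0.95,           # unit-size scaling (placeholder)
          location_factor=1.08,       # geographic labor/material adjustment (placeholder)
          retrofit_factor=1.30,       # congestion, demolition, tie-ins (placeholder)
      )
      print(f"Adjusted retrofit capital cost: ${cost:,.0f}")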

  5. AX Tank Farm waste retrieval alternatives cost estimates

    SciTech Connect

    Krieg, S.A.

    1998-07-21

    This report presents the estimated costs associated with retrieval of the wastes from the four tanks in AX Tank Farm. The engineering cost estimates developed for this report are based on previous cost data prepared for Project W-320 and the HTI 241-C-106 Heel Retrieval System. The costs presented in this report address only the retrieval of the wastes from the four AX Farm tanks. This includes costs for equipment procurement, fabrication, installation, and operation to retrieve the wastes. The costs to modify the existing plant equipment and systems to support the retrieval equipment are also included. The estimates do not include operational costs associated with pumping the waste out of the waste receiver tank (241-AY-102) between AX Farm retrieval campaigns or transportation, processing, and disposal of the retrieved waste.

  6. A method to accurately estimate the muscular torques of human wearing exoskeletons by torque sensors.

    PubMed

    Hwang, Beomsoo; Jeon, Doyoung

    2015-01-01

    In exoskeletal robots, the quantification of the user's muscular effort is important to recognize the user's motion intentions and evaluate motor abilities. In this paper, we attempt to estimate users' muscular efforts accurately using a joint torque sensor whose measurements contain the dynamic effects of the human body, such as the inertial, Coriolis, and gravitational torques, as well as the torque produced by active muscular effort. It is important to extract the dynamic effects of the user's limb accurately from the measured torque. The user's limb dynamics are formulated and a convenient method of identifying user-specific parameters is suggested for estimating the user's muscular torque in robotic exoskeletons. Experiments were carried out on a wheelchair-integrated lower limb exoskeleton, EXOwheel, which was equipped with torque sensors in the hip and knee joints. The proposed methods were evaluated by 10 healthy participants during body weight-supported gait training. The experimental results show that the torque sensors can be used to estimate the muscular torque accurately under both relaxed and activated muscle conditions. PMID:25860074

  7. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.

  8. The use of artificial neural networks for residential buildings conceptual cost estimation

    NASA Astrophysics Data System (ADS)

    Juszczyk, Michał

    2013-10-01

    Accurate cost estimation in the early phase of a building's design process is of key importance for a project's success. Both underestimation and overestimation may lead to project failure in terms of costs. The paper presents synthetically some research results on the use of neural networks for conceptual cost estimation of residential buildings. In the course of the research the author focused on regression models binding together the basic information about residential buildings available in the early stage of design and construction cost. Application of different neural network types was analysed (multilayer perceptron, multilayer perceptron with data compression based on principal component analysis, and radial basis function networks). Based on the research results, multilayer perceptron networks proved to be the best neural network type for this problem. The research results indicate that a neural approach may be an interesting alternative to traditional methods of conceptual cost estimation in construction projects.
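
    The record does not include the network architecture, features, or data; the sketch below shows only the general shape of the approach (a multilayer perceptron regressor mapping basic design attributes to construction cost), using scikit-learn and synthetic stand-in data. The chosen features, network size, and cost relationship are assumptions for illustration.

      # Illustrative MLP regression from basic building attributes to cost.
      # Features, data, and network size are synthetic placeholders, not the study's.
      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)
      # Synthetic design attributes: [floor area m^2, number of storeys, volume m^3]
      X = rng.uniform([80, 1, 200], [600, 4, 2500], size=(200, 3))
      # Synthetic "true" cost with noise, only to make the example runnable
      y = 900 * X[:, 0] + 15000 * X[:, 1] + 50 * X[:, 2] + rng.normal(0, 20000, 200)

      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                                         random_state=0))
      model.fit(X, y)
      new_building = np.array([[300.0, 2.0, 1200.0]])
      print("Predicted cost for a 300 m^2, 2-storey, 1200 m^3 building:",
            round(float(model.predict(new_building)[0])))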

  9. Measuring nonlinear oscillations using a very accurate and low-cost linear optical position transducer

    NASA Astrophysics Data System (ADS)

    Donoso, Guillermo; Ladera, Celso L.

    2016-09-01

    An accurate linear optical displacement transducer of about 0.2 mm resolution over a range of ∼40 mm is presented. This device consists of a stack of thin cellulose acetate strips, each strip longitudinally slid ∼0.5 mm over the preceding one so that one end of the stack becomes a stepped wedge of constant step. A narrowed light beam from a white LED orthogonally incident crosses the wedge at a known point, the transmitted intensity being detected with a phototransistor whose emitter is connected to a diode. We present the interesting analytical proof that the voltage across the diode is linearly dependent upon the ordinate of the point where the light beam falls on the wedge, as well as the experimental validation of such a theoretical proof. Applications to nonlinear oscillations are then presented, including the interesting case of a body moving under dry friction and the more advanced case of an oscillator in a quartic energy potential, whose time-varying positions were accurately measured with our transducer. Our sensing device can resolve the dynamics of an object attached to it with great accuracy and precision at a cost considerably less than that of a linear neutral density wedge. The technique used to assemble the wedge of acetate strips is described.

  10. Estimating Software-Development Costs With Greater Accuracy

    NASA Technical Reports Server (NTRS)

    Baker, Dan; Hihn, Jairus; Lum, Karen

    2008-01-01

    COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their project. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development- effort-estimation techniques are used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performances of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.

  11. The application of artificial neural networks in indirect cost estimation

    NASA Astrophysics Data System (ADS)

    Leśniak, Agnieszka

    2013-10-01

    Estimating the costs of a construction project is one of the most important tasks in project management. The total costs can be divided into direct costs, which are related to executing the works, and indirect costs, which accompany their delivery. Precise cost estimation is usually a highly labour- and time-intensive task, especially when manual calculation methods are used. This paper presents an Artificial Neural Network (ANN) approach to predicting the index of indirect costs of construction projects in Poland. A quantitative study was undertaken on the factors conditioning indirect costs of Polish construction projects, and a determination was made of the actual costs incurred by enterprises during project implementation. As a result of these studies, a data set was assembled covering 72 real-life cases of building projects constructed in Poland.

  12. Fuzzy case based reasoning in sports facilities unit cost estimating

    NASA Astrophysics Data System (ADS)

    Zima, Krzysztof

    2016-06-01

    This article presents an example of estimating costs in the early phase of a project using fuzzy case-based reasoning. A fragment of a database containing descriptions and unit costs of sports facilities is shown, together with the formulas used in the Case-Based Reasoning (CBR) method. Similarity measurement is presented using several formulas, including fuzzy similarity. The outcome of the cost calculations based on the CBR method is presented as a fuzzy number representing the unit cost of construction work.

  13. Estimates of costs by DRG in Sydney teaching hospitals: an application of the Yale cost model.

    PubMed

    Palmer, G; Aisbett, C; Fetter, R; Winchester, L; Reid, B; Rigby, E

    1991-01-01

    The results are reported of a first round of costing by DRG in seven major teaching hospital sites in Sydney using the Yale cost model. These results, when compared between the hospitals and with values of relative costs by DRG from the United States, indicate that the cost modelling procedure has produced credible and potentially useful estimates of casemix costs. The rationale and underlying theory of cost modelling is explained, and the need for further work to improve the method of allocating costs to DRGs, and to improve the cost centre definitions currently used by the hospitals, is emphasised. PMID:10117339

  14. Estimating the Life Cycle Cost of Space Systems

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    A space system's Life Cycle Cost (LCC) includes design and development, launch and emplacement, and operations and maintenance. Each of these cost factors is usually estimated separately. NASA uses three different parametric models for the design and development cost of crewed space systems; the commercial PRICE-H space hardware cost model, the NASA-Air Force Cost Model (NAFCOM), and the Advanced Missions Cost Model (AMCM). System mass is an important parameter in all three models. System mass also determines the launch and emplacement cost, which directly depends on the cost per kilogram to launch mass to Low Earth Orbit (LEO). The launch and emplacement cost is the cost to launch to LEO the system itself and also the rockets, propellant, and lander needed to emplace it. The ratio of the total launch mass to payload mass depends on the mission scenario and destination. The operations and maintenance costs include any material and spares provided, the ground control crew, and sustaining engineering. The Mission Operations Cost Model (MOCM) estimates these costs as a percentage of the system development cost per year.
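
    The additive structure described above lends itself to a simple worked sketch. The figures below are arbitrary placeholders chosen only to show how the pieces combine (development cost, launch and emplacement driven by mass and a gear ratio, and operations estimated as a fraction of development cost per year); they are not outputs of PRICE-H, NAFCOM, AMCM, or MOCM.

      # Hypothetical life-cycle-cost roll-up; all inputs are illustrative placeholders.

      def life_cycle_cost(dev_cost, system_mass_kg, gear_ratio, cost_per_kg_leo,
                          ops_fraction_per_year, mission_years):
          """Sum development, launch/emplacement, and operations costs.

          gear_ratio: total launch mass to LEO per kg of emplaced payload
          ops_fraction_per_year: annual O&M cost as a fraction of development cost
          """
          launch_cost = system_mass_kg * gear_ratio * cost_per_kg_leo
          ops_cost = dev_cost * ops_fraction_per_year * mission_years
          return dev_cost + launch_cost + ops_cost

      total = life_cycle_cost(dev_cost=500e6,          # $ (placeholder)
                              system_mass_kg=2000,     # kg (placeholder)
                              gear_ratio=4.0,          # depends on destination (placeholder)
                              cost_per_kg_leo=10_000,  # $/kg to LEO (placeholder)
                              ops_fraction_per_year=0.05,
                              mission_years=10)
      print(f"Life cycle cost: ${total / 1e6:,.0f} M")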

  15. Cost estimates for removal of orbital debris

    NASA Technical Reports Server (NTRS)

    Petro, Andrew; Ashley, Howard

    1989-01-01

    While there are currently no active measures for the removal of nonfunctional satellites or spent rocket stages from earth orbit, it has been deemed prudent to begin to identify and economically evaluate potential approaches for such orbital decluttering. The methods presently considered encompass retrieval with an OMV, forcible deorbiting via attached propulsive devices, and deorbiting via passive, drag-augmentation devices; the increases in payload-delivery costs they represent are respectively $15-20 million/object, $7.8 million/vehicle, and $5.5-15.5 million/unit. OMV removal appears the least economically feasible method.

  16. 48 CFR 236.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Government estimate of...-ENGINEER CONTRACTS Special Aspects of Contracting for Construction 236.203 Government estimate of construction costs. Follow the procedures at PGI 236.203 for handling the Government estimate of...

  17. 48 CFR 836.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Government estimate of... Contracting for Construction 836.203 Government estimate of construction costs. The overall amount of the Government estimate must not be disclosed until after award of the contract. After award, the...

  18. 48 CFR 1336.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Government estimate of... Contracting for Construction 1336.203 Government estimate of construction costs. After award, the independent Government estimated price can be released, upon request, to those firms or individuals who...

  19. Cost estimate guidelines for advanced nuclear power technologies

    SciTech Connect

    Delene, J.G.; Hudson, C.R. II

    1993-05-01

    Several advanced power plant concepts are currently under development. These include the Modular High Temperature Gas Cooled Reactors, the Advanced Liquid Metal Reactor and the Advanced Light Water Reactors. One measure of the attractiveness of a new concept is its cost. Invariably, the cost of a new type of power plant will be compared with other alternative forms of electrical generation. This report provides a common starting point, whereby the cost estimates for the various power plants to be considered are developed with common assumptions and ground rules. Comparisons can then be made on a consistent basis. This is the second update of these cost estimate guidelines. Changes have been made to make the guidelines more current (January 1, 1992) and in response to suggestions made as a result of the use of the previous report. The principal changes are that the reference site has been changed from a generic Northeast (Middletown) site to a more central site (EPRI's East/West Central site) and that reference bulk commodity prices and labor productivity rates have been added. This report is designed to provide a framework for the preparation and reporting of costs. The cost estimates will consist of the overnight construction cost, the total plant capital cost, the operation and maintenance costs, the fuel costs, decommissioning costs and the power production or busbar generation cost.
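
    As an illustration of how the listed cost categories roll up into a busbar generation cost, here is a hedged sketch of a simple levelization. The fixed charge rate, sinking-fund treatment of decommissioning, and all numeric inputs are placeholder assumptions; the actual guidelines define the levelization procedure in more detail.

      # Illustrative levelized busbar cost roll-up; values are placeholders,
      # not figures from the cost estimate guidelines.

      def busbar_cost_mills_per_kwh(total_capital, fixed_charge_rate,
                                    annual_om, annual_fuel, annual_decom_fund,
                                    net_capacity_kw, capacity_factor):
          """Levelized generation cost in mills/kWh (1 mill = $0.001)."""
          annual_generation_kwh = net_capacity_kw * capacity_factor * 8760
          annual_cost = (total_capital * fixed_charge_rate + annual_om +
                         annual_fuel + annual_decom_fund)
          return annual_cost / annual_generation_kwh * 1000  # $ -> mills

      print(round(busbar_cost_mills_per_kwh(
          total_capital=2.0e9, fixed_charge_rate=0.095,
          annual_om=60e6, annual_fuel=40e6, annual_decom_fund=5e6,
          net_capacity_kw=1_200_000, capacity_factor=0.80), 1), "mills/kWh")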

  20. The unit cost factors and calculation methods for decommissioning - Cost estimation of nuclear research facilities

    SciTech Connect

    Kwan-Seong Jeong; Dong-Gyu Lee; Chong-Hun Jung; Kune-Woo Lee

    2007-07-01

    Available in abstract form only. Full text of publication follows: The uncertainties of decommissioning costs are high due to several conditions. Decommissioning cost estimation depends on the complexity of nuclear installations and their site-specific physical and radiological inventories. Therefore, the decommissioning costs of nuclear research facilities must be estimated in accordance with the detailed sub-tasks and resources required by each decommissioning activity. By selecting the classified activities and resources, costs are calculated item by item, and the total costs of all decommissioning activities are then aggregated to match the intended usage and objectives. The decommissioning cost of nuclear research facilities is calculated by applying a unit cost factor method, based on a classification of decommissioning work fitted to the features and specifications of the decommissioning objects and on the establishment of composition factors. Decommissioning costs of nuclear research facilities are composed of labor, equipment, and materials costs. Of these three categories, the calculation of labor costs is the most important because decommissioning activities depend mainly on labor force. Labor costs are calculated on the basis of the working time consumed on the decommissioning objects and works. The working times are derived from unit cost factors and work difficulty factors, and labor costs are then computed using these factors as calculation parameters. The accuracy of the decommissioning cost estimation results is much higher when compared to real decommissioning works. (authors)
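
    The labor-cost logic described above (working time derived from unit cost factors and work difficulty factors, then priced at a labor rate) can be sketched as follows. The activity list, factor values, and rate are hypothetical placeholders, not data from the paper.

      # Hypothetical unit-cost-factor labor estimate for decommissioning activities.
      # Unit cost factors are expressed here as person-hours per unit of work inventory.

      activities = [
          # (name, inventory, unit, person-hours per unit, difficulty factor)
          ("Decontaminate concrete surface", 400.0, "m^2", 0.6, 1.2),
          ("Dismantle contaminated piping",  150.0, "m",   1.1, 1.5),
          ("Cut and package steel",           30.0, "t",   8.0, 1.3),
      ]
      labor_rate = 55.0  # $/person-hour (placeholder)

      total_hours = 0.0
      for name, qty, unit, unit_factor, difficulty in activities:
          hours = qty * unit_factor * difficulty  # working time from unit cost and difficulty factors
          total_hours += hours
          print(f"{name:35s} {hours:8.1f} person-hours")

      print(f"Total labor cost: ${total_hours * labor_rate:,.0f}")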

  1. Easy and accurate variance estimation of the nonparametric estimator of the partial area under the ROC curve and its application.

    PubMed

    Yu, Jihnhee; Yang, Luge; Vexler, Albert; Hutson, Alan D

    2016-06-15

    The receiver operating characteristic (ROC) curve is a popular technique with applications, for example, investigating an accuracy of a biomarker to delineate between disease and non-disease groups. A common measure of accuracy of a given diagnostic marker is the area under the ROC curve (AUC). In contrast with the AUC, the partial area under the ROC curve (pAUC) looks into the area with certain specificities (i.e., true negative rate) only, and it can be often clinically more relevant than examining the entire ROC curve. The pAUC is commonly estimated based on a U-statistic with the plug-in sample quantile, making the estimator a non-traditional U-statistic. In this article, we propose an accurate and easy method to obtain the variance of the nonparametric pAUC estimator. The proposed method is easy to implement for both one biomarker test and the comparison of two correlated biomarkers because it simply adapts the existing variance estimator of U-statistics. In this article, we show accuracy and other advantages of the proposed variance estimation method by broadly comparing it with previously existing methods. Further, we develop an empirical likelihood inference method based on the proposed variance estimator through a simple implementation. In an application, we demonstrate that, depending on the inferences by either the AUC or pAUC, we can make a different decision on a prognostic ability of a same set of biomarkers. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26790540
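
    For context, the nonparametric pAUC estimator the article builds on (a U-statistic with a plug-in sample quantile of the control scores) can be sketched as below. This does not reproduce the authors' variance estimator or empirical likelihood method; the data are synthetic and the specificity cut-off is an arbitrary choice.

      # Nonparametric partial AUC over specificities in [spec_min, 1], written as a
      # U-statistic with a plug-in sample quantile of the control (non-diseased) scores.
      import numpy as np

      def partial_auc(cases, controls, spec_min=0.8):
          cases = np.asarray(cases, dtype=float)
          controls = np.asarray(controls, dtype=float)
          q = np.quantile(controls, spec_min)  # plug-in sample quantile
          # Count case/control pairs that are concordant AND lie in the high-specificity region
          concordant = (cases[:, None] > controls[None, :]) & (controls[None, :] >= q)
          return concordant.mean()

      rng = np.random.default_rng(1)
      controls = rng.normal(0.0, 1.0, 200)  # synthetic non-diseased scores
      cases = rng.normal(1.0, 1.0, 150)     # synthetic diseased scores
      print("pAUC (specificity >= 0.8):", round(partial_auc(cases, controls), 4))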

  2. Life cycle cost estimating of waste management facilities

    SciTech Connect

    Shropshire, D.; Feizollahi, F.; Teheranian, B.; Waldman, M.

    1994-12-31

    Waste Management Facilities Cost Information (WMFCI) provides a modular cost method for estimating planning-level life-cycle costs of waste management alternatives. This methodology includes over 120 cost modules that cover a variety of treatment, storage, disposal, and support facility options. The WMFCI method can be used to estimate virtually every technology option and related facilities needed by the Department of Energy for cradle-to-grave management of hazardous, radioactive, mixed waste, and spent nuclear fuel. Various waste streams covered by the WMFCI are low-level waste (LLW), mixed low-level waste (MLLW), alpha contaminated LLW, alpha contaminated MLLW, transuranic waste, spent nuclear fuel, Greater-Than-Class C and DOE equivalent special case wastes, and hazardous wastes. The methodology also contains cost versus capacity relationships for each cost module to aid in estimating various waste management configurations.

  3. Estimating the Effective Permittivity for Reconstructing Accurate Microwave-Radar Images.

    PubMed

    Lavoie, Benjamin R; Okoniewski, Michal; Fear, Elise C

    2016-01-01

    We present preliminary results from a method for estimating the optimal effective permittivity for reconstructing microwave-radar images. Using knowledge of how microwave-radar images are formed, we identify characteristics that are typical of good images, and define a fitness function to measure the relative image quality. We build a polynomial interpolant of the fitness function in order to identify the most likely permittivity values of the tissue. To make the estimation process more efficient, the polynomial interpolant is constructed using a locally and dimensionally adaptive sampling method that is a novel combination of stochastic collocation and polynomial chaos. Examples, using a series of simulated, experimental and patient data collected using the Tissue Sensing Adaptive Radar system, which is under development at the University of Calgary, are presented. These examples show how, using our method, accurate images can be reconstructed starting with only a broad estimate of the permittivity range. PMID:27611785

  4. Accurate estimation of object location in an image sequence using helicopter flight data

    NASA Technical Reports Server (NTRS)

    Tang, Yuan-Liang; Kasturi, Rangachar

    1994-01-01

    In autonomous navigation, it is essential to obtain a three-dimensional (3D) description of the static environment in which the vehicle is traveling. For a rotorcraft conducting low-altitude flight, this description is particularly useful for obstacle detection and avoidance. In this paper, we address the problem of 3D position estimation for static objects from a monocular sequence of images captured from a low-altitude flying helicopter. Since the environment is static, it is well known that the optical flow in the image will produce a radiating pattern from the focus of expansion. We propose a motion analysis system which utilizes the epipolar constraint to accurately estimate 3D positions of scene objects in a real world image sequence taken from a low-altitude flying helicopter. Results show that this approach gives good estimates of object positions near the rotorcraft's intended flight-path.

  5. Effective Echo Detection and Accurate Orbit Estimation Algorithms for Space Debris Radar

    NASA Astrophysics Data System (ADS)

    Isoda, Kentaro; Sakamoto, Takuya; Sato, Toru

    Orbit estimation of space debris, objects of no inherent value orbiting the earth, is a task that is important for avoiding collisions with spacecraft. The Kamisaibara Spaceguard Center radar system was built in 2004 as the first radar facility in Japan devoted to the observation of space debris. In order to detect the smaller debris, coherent integration is effective in improving SNR (Signal-to-Noise Ratio). However, it is difficult to apply coherent integration to real data because the motions of the targets are unknown. An effective algorithm is proposed for echo detection and orbit estimation of the faint echoes from space debris. The characteristics of the evaluation function are utilized by the algorithm. Experiments show the proposed algorithm improves SNR by 8.32 dB and enables estimation of orbital parameters accurately enough to allow for re-tracking with a single radar.

  6. Parameter Estimation of Ion Current Formulations Requires Hybrid Optimization Approach to Be Both Accurate and Reliable

    PubMed Central

    Loewe, Axel; Wilhelms, Mathias; Schmid, Jochen; Krause, Mathias J.; Fischer, Fathima; Thomas, Dierk; Scholz, Eberhard P.; Dössel, Olaf; Seemann, Gunnar

    2016-01-01

    Computational models of cardiac electrophysiology provided insights into arrhythmogenesis and paved the way toward tailored therapies in the last years. To fully leverage in silico models in future research, these models need to be adapted to reflect pathologies, genetic alterations, or pharmacological effects, however. A common approach is to leave the structure of established models unaltered and estimate the values of a set of parameters. Today’s high-throughput patch clamp data acquisition methods require robust, unsupervised algorithms that estimate parameters both accurately and reliably. In this work, two classes of optimization approaches are evaluated: gradient-based trust-region-reflective and derivative-free particle swarm algorithms. Using synthetic input data and different ion current formulations from the Courtemanche et al. electrophysiological model of human atrial myocytes, we show that neither of the two schemes alone succeeds to meet all requirements. Sequential combination of the two algorithms did improve the performance to some extent but not satisfactorily. Thus, we propose a novel hybrid approach coupling the two algorithms in each iteration. This hybrid approach yielded very accurate estimates with minimal dependency on the initial guess using synthetic input data for which a ground truth parameter set exists. When applied to measured data, the hybrid approach yielded the best fit, again with minimal variation. Using the proposed algorithm, a single run is sufficient to estimate the parameters. The degree of superiority over the other investigated algorithms in terms of accuracy and robustness depended on the type of current. In contrast to the non-hybrid approaches, the proposed method proved to be optimal for data of arbitrary signal to noise ratio. The hybrid algorithm proposed in this work provides an important tool to integrate experimental data into computational models both accurately and robustly allowing to assess the often non

  7. 16 CFR 305.5 - Determinations of estimated annual energy consumption, estimated annual operating cost, and...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... consumption, estimated annual operating cost, and energy efficiency rating, and of water use rate. 305.5... energy efficiency rating, and of water use rate. (a) Procedures for determining the estimated annual energy consumption, the estimated annual operating costs, the energy efficiency ratings, and the...

  8. 16 CFR 305.5 - Determinations of estimated annual energy consumption, estimated annual operating cost, and...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... consumption, estimated annual operating cost, and energy efficiency rating, and of water use rate. 305.5... energy efficiency rating, and of water use rate. (a) Procedures for determining the estimated annual energy consumption, the estimated annual operating costs, the energy efficiency ratings, and the...

  9. 16 CFR 305.5 - Determinations of estimated annual energy consumption, estimated annual operating cost, and...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... consumption, estimated annual operating cost, and energy efficiency rating, and of water use rate. 305.5... energy efficiency rating, and of water use rate. (a) Procedures for determining the estimated annual energy consumption, the estimated annual operating costs, the energy efficiency ratings, and the...

  10. Cost-estimating systems for remedial-action projects

    SciTech Connect

    Evans, G.M.; Peterson, J.

    1991-01-01

    The paper details the ongoing collaboration between the U.S. EPA and the U.S. Army Corps of Engineers in the development of complementary microcomputer-based cost estimating systems for hazardous waste remediations. The U.S. EPA system, Remedial Action Cost Estimating System (RACES), is a technology based application. Estimates generated by RACES are based upon cost engineering relationships. The estimates are designed for use in the early stages of remediation design. The U.S. Army Corps of Engineers system, Micro-Computer Aided Cost Engineering System (M-CACES), is a bottom-up system for use in situations where detailed design information is available. While both systems will stand alone, they have been designed to allow the transfer of estimates generated by RACES directly into the M-CACES system.

  11. Accurate reconstruction of viral quasispecies spectra through improved estimation of strain richness

    PubMed Central

    2015-01-01

    Background Estimating the number of different species (richness) in a mixed microbial population has been a main focus in metagenomic research. Existing methods of species richness estimation ride on the assumption that the reads in each assembled contig correspond to only one of the microbial genomes in the population. This assumption and the underlying probabilistic formulations of existing methods are not useful for quasispecies populations where the strains are highly genetically related. The lack of knowledge on the number of different strains in a quasispecies population is observed to hinder the precision of existing Viral Quasispecies Spectrum Reconstruction (QSR) methods due to the uncontrolled reconstruction of a large number of in silico false positives. In this work, we formulated a novel probabilistic method for strain richness estimation specifically targeting viral quasispecies. By using this approach we improved our recently proposed spectrum reconstruction pipeline ViQuaS to achieve higher levels of precision in reconstructed quasispecies spectra without compromising the recall rates. We also discuss how one other existing popular QSR method named ShoRAH can be improved using this new approach. Results On benchmark data sets, our estimation method provided accurate richness estimates (< 0.2 median estimation error) and improved the precision of ViQuaS by 2%-13% and F-score by 1%-9% without compromising the recall rates. We also demonstrate that our estimation method can be used to improve the precision and F-score of ShoRAH by 0%-7% and 0%-5% respectively. Conclusions The proposed probabilistic estimation method can be used to estimate the richness of viral populations with a quasispecies behavior and to improve the accuracy of the quasispecies spectra reconstructed by the existing methods ViQuaS and ShoRAH in the presence of a moderate level of technical sequencing errors. Availability http://sourceforge.net/projects/viquas/ PMID:26678073

  12. Asthma control cost-utility randomized trial evaluation (ACCURATE): the goals of asthma treatment

    PubMed Central

    2011-01-01

    Background Despite the availability of effective therapies, asthma remains a source of significant morbidity and use of health care resources. The central research question of the ACCURATE trial is whether maximal doses of (combination) therapy should be used for long periods in an attempt to achieve complete control of all features of asthma. An additional question is whether patients and society value the potential incremental benefit, if any, sufficiently to concur with such a treatment approach. We assessed patient preferences and cost-effectiveness of three treatment strategies aimed at achieving different levels of clinical control: 1. sufficiently controlled asthma 2. strictly controlled asthma 3. strictly controlled asthma based on exhaled nitric oxide as an additional disease marker Design 720 patients with mild to moderate persistent asthma from general practices with a practice nurse, age 18-50 yr, daily treatment with inhaled corticosteroids (more than 3 months usage of inhaled corticosteroids in the previous year), will be identified via patient registries of general practices in the Leiden, Nijmegen, and Amsterdam areas in The Netherlands. The design is a 12-month cluster-randomised parallel trial with 40 general practices in each of the three arms. The patients will visit the general practice at baseline, 3, 6, 9, and 12 months. At each planned and unplanned visit to the general practice treatment will be adjusted with support of an internet-based asthma monitoring system supervised by a central coordinating specialist nurse. Patient preferences and utilities will be assessed by questionnaire and interview. Data on asthma control, treatment step, adherence to treatment, utilities and costs will be obtained every 3 months and at each unplanned visit. Differences in societal costs (medication, other (health) care and productivity) will be compared to differences in the number of limited activity days and in quality adjusted life years (Dutch EQ5D, SF6D

  13. Intraocular lens power estimation by accurate ray tracing for eyes underwent previous refractive surgeries

    NASA Astrophysics Data System (ADS)

    Yang, Que; Wang, Shanshan; Wang, Kai; Zhang, Chunyu; Zhang, Lu; Meng, Qingyu; Zhu, Qiudong

    2015-08-01

    For normal eyes without a history of any ocular surgery, traditional equations for calculating intraocular lens (IOL) power, such as SRK-T, Holladay, Haigis, SRK-II, et al., are all relatively accurate. However, for eyes that underwent refractive surgeries, such as LASIK, or eyes diagnosed with keratoconus, these equations may cause significant postoperative refractive error, which may cause poor satisfaction after cataract surgery. Although some methods have been carried out to solve this problem, such as the Haigis-L equation[1], or using preoperative data (data before LASIK) to estimate the K value[2], no precise equations were available for these eyes. Here, we introduced a novel intraocular lens power estimation method by accurate ray tracing with the optical design software ZEMAX. Instead of using a traditional regression formula, we adopted the exact measured corneal elevation distribution, central corneal thickness, anterior chamber depth, axial length, and estimated effective lens plane as the input parameters. The calculated intraocular lens power for a patient with keratoconus and another post-LASIK patient agreed well with their visual outcomes after cataract surgery.

  14. Improving The Discipline of Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Piland, William M.; Pine, David J.; Wilson, Delano M.

    2000-01-01

    The need to improve the quality and accuracy of cost estimates of proposed new aerospace systems has been widely recognized. The industry has done the best job of maintaining related capability with improvements in estimation methods and giving appropriate priority to the hiring and training of qualified analysts. Some parts of Government, and National Aeronautics and Space Administration (NASA) in particular, continue to need major improvements in this area. Recently, NASA recognized that its cost estimation and analysis capabilities had eroded to the point that the ability to provide timely, reliable estimates was impacting the confidence in planning many program activities. As a result, this year the Agency established a lead role for cost estimation and analysis. The Independent Program Assessment Office located at the Langley Research Center was given this responsibility. This paper presents the plans for the newly established role. Described is how the Independent Program Assessment Office, working with all NASA Centers, NASA Headquarters, other Government agencies, and industry, is focused on creating cost estimation and analysis as a professional discipline that will be recognized equally with the technical disciplines needed to design new space and aeronautics activities. Investments in selected, new analysis tools, creating advanced training opportunities for analysts, and developing career paths for future analysts engaged in the discipline are all elements of the plan. Plans also include increasing the human resources available to conduct independent cost analysis of Agency programs during their formulation, to improve near-term capability to conduct economic cost-benefit assessments, to support NASA management's decision process, and to provide cost analysis results emphasizing "full-cost" and "full-life cycle" considerations. The Agency cost analysis improvement plan has been approved for implementation starting this calendar year. Adequate financial

  15. Space Station Furnace Facility. Volume 3: Program cost estimate

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The approach used to estimate costs for the Space Station Furnace Facility (SSFF) is based on a computer program developed internally at Teledyne Brown Engineering (TBE). The program produces time-phased estimates of cost elements for each hardware component, based on experience with similar components. Engineering estimates of the degree of similarity or difference between the current project and the historical data is then used to adjust the computer-produced cost estimate and to fit it to the current project Work Breakdown Structure (WBS). The SSFF Concept as presented at the Requirements Definition Review (RDR) was used as the base configuration for the cost estimate. This program incorporates data on costs of previous projects and the allocation of those costs to the components of one of three, time-phased, generic WBS's. Input consists of a list of similar components for which cost data exist, number of interfaces with their type and complexity, identification of the extent to which previous designs are applicable, and programmatic data concerning schedules and miscellaneous data (travel, off-site assignments). Output is program cost in labor hours and material dollars, for each component, broken down by generic WBS task and program schedule phase.
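
    The analogy-plus-adjustment logic described above (historical costs of similar components, scaled by an engineering judgment of similarity or complexity, then rolled up by WBS element) can be sketched as follows. The component names, WBS labels, reference costs, and adjustment factors are hypothetical placeholders, not data from the TBE program.

      # Hypothetical analogy-based cost estimate: scale historical component costs
      # by an engineering similarity/complexity adjustment, then total by WBS element.
      from collections import defaultdict

      components = [
          # (name, WBS element, historical analog cost $, adjustment factor)
          ("Furnace core module",   "1.1 Hardware", 4.0e6, 1.20),  # more complex than analog
          ("Thermal control loop",  "1.1 Hardware", 1.5e6, 0.90),  # simpler than analog
          ("Data/control avionics", "1.2 Avionics", 2.2e6, 1.10),
          ("Integration and test",  "2.0 I&T",      1.0e6, 1.00),
      ]

      by_wbs = defaultdict(float)
      for name, wbs, analog_cost, adjustment in components:
          by_wbs[wbs] += analog_cost * adjustment

      for wbs, cost in sorted(by_wbs.items()):
          print(f"{wbs:15s} ${cost / 1e6:5.2f} M")
      print(f"{'Total':15s} ${sum(by_wbs.values()) / 1e6:5.2f} M")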

  16. COSTMODL: An automated software development cost estimation tool

    NASA Technical Reports Server (NTRS)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sector. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
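
    COSTMODL's specific algorithms and calibration data are not given in the record; the sketch below shows only the general shape of a recalibratable effort model of the COCOMO family (effort = A x KSLOC^B x effort-adjustment factor), with the multiplicative constant treated as the quantity an organization would re-fit from its own project history. All coefficients and project figures are placeholders.

      # Generic COCOMO-style effort model with a recalibratable coefficient.
      # Coefficients and cost-driver multipliers are placeholders, not COSTMODL values.
      import math

      def effort_person_months(ksloc, a=2.8, b=1.05, cost_drivers=()):
          """Effort = a * KSLOC**b * product(cost-driver multipliers)."""
          eaf = math.prod(cost_drivers) if cost_drivers else 1.0
          return a * ksloc ** b * eaf

      def recalibrate_a(history, b=1.05):
          """Re-fit the multiplicative constant 'a' from completed projects
          (list of (ksloc, actual person-months)), holding the exponent fixed."""
          return sum(pm / ksloc ** b for ksloc, pm in history) / len(history)

      history = [(12, 45.0), (30, 120.0), (55, 260.0)]  # placeholder project data
      a_cal = recalibrate_a(history)
      print("Recalibrated a:", round(a_cal, 2))
      print("Estimated effort for 40 KSLOC:",
            round(effort_person_months(40, a=a_cal, cost_drivers=(1.1, 0.9)), 1),
            "person-months")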

  17. Commercial Crew Cost Estimating - A Look at Estimating Processes, Challenges and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Battle, Rick; Cole, Lance

    2015-01-01

    To support annual PPBE budgets and NASA HQ requests for cost information for commercial crew transportation to the International Space Station (ISS), the NASA ISS ACES team developed system development and per flight cost estimates for the potential providers for each annual PPBE submit from 2009-2014. This paper describes the cost estimating processes used, challenges, and lessons learned in developing estimates for this key NASA project, which deviated from the traditional procurement approach and used a new way of doing business.

  18. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied to ophthalmology, cardiology, etc. Proper quantitative imaging is required for a widespread clinical utility. However, the conventional method of averaging to improve the signal to noise ratio (SNR) and the contrast of the phase retardation (or birefringence) images introduce a noise bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for a quantitative study. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved the image quality, the fundamental limitation of nonlinear dependency of phase retardation and birefringence to the SNR was not overcome. So the birefringence obtained by PS-OCT was still not accurate for a quantitative imaging. The nonlinear effect of SNR to phase retardation and birefringence measurement was previously formulated in detail for a Jones matrix OCT (JM-OCT) [1]. Based on this, we had developed a maximum a-posteriori (MAP) estimator and quantitative birefringence imaging was demonstrated [2]. However, this first version of estimator had a theoretical shortcoming. It did not take into account the stochastic nature of SNR of OCT signal. In this paper, we present an improved version of the MAP estimator which takes into account the stochastic property of SNR. This estimator uses a probability distribution function (PDF) of true local retardation, which is proportional to birefringence, under a specific set of measurements of the birefringence and SNR. The PDF was pre-computed by a Monte-Carlo (MC) simulation based on the mathematical model of JM-OCT before the measurement. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and in vivo measurements of anterior and

  19. Regularization Based Iterative Point Match Weighting for Accurate Rigid Transformation Estimation.

    PubMed

    Liu, Yonghuai; De Dominicis, Luigi; Wei, Baogang; Chen, Liang; Martin, Ralph R

    2015-09-01

    Feature extraction and matching (FEM) for 3D shapes finds numerous applications in computer graphics and vision for object modeling, retrieval, morphing, and recognition. However, unavoidable incorrect matches lead to inaccurate estimation of the transformation relating different datasets. Inspired by AdaBoost, this paper proposes a novel iterative re-weighting method to tackle the challenging problem of evaluating point matches established by typical FEM methods. Weights are used to indicate the degree of belief that each point match is correct. Our method has three key steps: (i) estimation of the underlying transformation using weighted least squares, (ii) penalty parameter estimation via minimization of the weighted variance of the matching errors, and (iii) weight re-estimation taking into account both matching errors and information learnt in previous iterations. A comparative study, based on real shapes captured by two laser scanners, shows that the proposed method outperforms four other state-of-the-art methods in terms of evaluating point matches between overlapping shapes established by two typical FEM methods, resulting in more accurate estimates of the underlying transformation. This improved transformation can be used to better initialize the iterative closest point algorithm and its variants, making 3D shape registration more likely to succeed. PMID:26357287

  20. Econometric estimation of country-specific hospital costs

    PubMed Central

    Adam, Taghreed; Evans, David B; Murray, Christopher JL

    2003-01-01

    Information on the unit cost of inpatient and outpatient care is an essential element for costing, budgeting and economic-evaluation exercises. Many countries lack reliable estimates, however. WHO has recently undertaken an extensive effort to collect and collate data on the unit cost of hospitals and health centres from as many countries as possible; so far, data have been assembled from 49 countries, for various years during the period 1973–2000. The database covers a total of 2173 country-years of observations. Large gaps remain, however, particularly for developing countries. Although the long-term solution is that all countries perform their own costing studies, the question arises whether it is possible to predict unit costs for different countries in a standardized way for short-term use. The purpose of the work described in this paper, a modelling exercise, was to use the data collected across countries to predict unit costs in countries for which data are not yet available, with the appropriate uncertainty intervals. The model presented here forms part of a series of models used to estimate unit costs for the WHO-CHOICE project. The methods and the results of the model, however, may be used to predict a number of different types of country-specific unit costs, depending on the purpose of the exercise. They may be used, for instance, to estimate the costs per bed-day at different capacity levels; the "hotel" component of cost per bed-day; or unit costs net of particular components such as drugs. In addition to reporting estimates for selected countries, the paper shows that unit costs of hospitals vary within countries, sometimes by an order of magnitude. Basing cost-effectiveness studies or budgeting exercises on the results of a study of a single facility, or even a small group of facilities, is likely to be misleading. PMID:12773218
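
    A hedged sketch of the general modelling idea, regressing the log of unit cost on country- and facility-level covariates and then predicting for settings without data, is shown below. The covariates, coefficients, and data are synthetic placeholders and do not reproduce the WHO-CHOICE model.

      # Illustrative log-linear regression for predicting hospital unit costs.
      # Synthetic data; not the WHO-CHOICE model or its covariates.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 120
      log_gdp_pc = rng.uniform(6.5, 10.5, n)   # log GDP per capita (placeholder covariate)
      occupancy = rng.uniform(0.4, 0.95, n)    # bed occupancy rate (placeholder covariate)
      log_cost = 0.9 * log_gdp_pc - 0.8 * occupancy + rng.normal(0, 0.3, n)

      X = np.column_stack([np.ones(n), log_gdp_pc, occupancy])
      beta, *_ = np.linalg.lstsq(X, log_cost, rcond=None)

      # Predict cost per bed-day for a setting with given covariate values
      x_new = np.array([1.0, 8.0, 0.7])
      print("Predicted cost per bed-day:", round(float(np.exp(x_new @ beta)), 2))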

  1. Construction cost estimation of municipal incinerators by fuzzy linear regression

    SciTech Connect

    Chang, N.B.; Chen, Y.L.; Yang, H.H.

    1996-12-31

    Regression analysis has been widely used in engineering cost estimation. It is recognized that the fuzzy structure in cost estimation is a different type of uncertainty compared to the measurement error in the least-squares regression modeling. Hence, the uncertainties encountered in many events of construction and operating costs estimation and prediction cannot be fully depicted by conventional least-squares regression models. This paper presents a construction cost analysis of municipal incinerators by the techniques of fuzzy linear regression. A thorough investigation of construction costs in the Taiwan Resource Recovery Project was conducted based on design parameters such as design capacity, type of grate system, and the selected air pollution control process. The focus has been placed upon the methodology for dealing with the heterogeneity phenomenon of a set of observations for which regression is evaluated.

  2. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merlon M.

    2003-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.

  3. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merlon M.

    2004-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.

  4. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merion M.

    2002-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.

  5. Software cost estimation using class point metrics (CPM)

    NASA Astrophysics Data System (ADS)

    Ghode, Aditi; Periyasamy, Kasilingam

    2011-12-01

    Estimating the cost of a software project is one of the most important and crucial tasks in maintaining software reliability. Many cost estimation models have been reported to date, but most of them have significant drawbacks due to rapid changes in technology. For example, Source Lines Of Code (SLOC) can only be counted when the software construction is complete. The Function Point (FP) metric is deficient in handling object-oriented technology, as it was designed for procedural languages such as COBOL. Since object-oriented programming became a popular development practice, most software companies started applying the Unified Modeling Language (UML). The objective of this research is to develop a new cost estimation model that applies class diagrams to software cost estimation.
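
    The record does not spell out the model's equations; below is a hedged sketch of the basic Class Point idea (weight each class by its type and complexity, sum to a size measure, and convert size to effort with a calibrated productivity factor). The weight table and productivity constant are placeholder assumptions, not values from this research.

      # Sketch of a Class Point style size/effort estimate; all weights are placeholders.

      # Placeholder complexity weights per class type (low, average, high complexity)
      WEIGHTS = {
          "problem_domain":    (3, 6, 10),
          "human_interaction": (4, 7, 12),
          "data_management":   (5, 8, 13),
          "task_management":   (4, 6, 9),
      }
      COMPLEXITY = {"low": 0, "average": 1, "high": 2}

      def class_points(classes):
          """classes: iterable of (class_type, complexity) tuples from a class diagram."""
          return sum(WEIGHTS[ctype][COMPLEXITY[level]] for ctype, level in classes)

      def effort_person_hours(cp, productivity=2.5):
          """Convert class points to effort with a calibrated productivity factor (placeholder)."""
          return cp * productivity

      design = [("problem_domain", "high"), ("problem_domain", "average"),
                ("human_interaction", "average"), ("data_management", "high"),
                ("task_management", "low")]
      cp = class_points(design)
      print(f"Class points: {cp}, estimated effort: {effort_person_hours(cp):.1f} person-hours")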

  6. A Low-Cost Modular Platform for Heterogeneous Data Acquisition with Accurate Interchannel Synchronization

    PubMed Central

    Blanco-Claraco, José Luis; López-Martínez, Javier; Torres-Moreno, José Luis; Giménez-Fernández, Antonio

    2015-01-01

    Most experimental fields of science and engineering require the use of data acquisition systems (DAQ), devices in charge of sampling and converting electrical signals into digital data and, typically, performing all of the required signal preconditioning. Since commercial DAQ systems are normally focused on specific types of sensors and actuators, systems engineers may need to employ mutually-incompatible hardware from different manufacturers in applications demanding heterogeneous inputs and outputs, such as small-signal analog inputs, differential quadrature rotatory encoders or variable current outputs. A common undesirable side effect of heterogeneous DAQ hardware is the lack of an accurate synchronization between samples captured by each device. To solve such a problem with low-cost hardware, we present a novel modular DAQ architecture comprising a base board and a set of interchangeable modules. Our main design goal is the ability to sample all sources at predictable, fixed sampling frequencies, with a reduced synchronization mismatch (<1 μs) between heterogeneous signal sources. We present experiments in the field of mechanical engineering, illustrating vibration spectrum analyses from piezoelectric accelerometers and, as a novelty in these kinds of experiments, the spectrum of quadrature encoder signals. Part of the design and software will be publicly released online. PMID:26516865

  7. A Low-Cost Modular Platform for Heterogeneous Data Acquisition with Accurate Interchannel Synchronization.

    PubMed

    Blanco-Claraco, José Luis; López-Martínez, Javier; Torres-Moreno, José Luis; Giménez-Fernández, Antonio

    2015-01-01

    Most experimental fields of science and engineering require the use of data acquisition systems (DAQ), devices in charge of sampling and converting electrical signals into digital data and, typically, performing all of the required signal preconditioning. Since commercial DAQ systems are normally focused on specific types of sensors and actuators, systems engineers may need to employ mutually-incompatible hardware from different manufacturers in applications demanding heterogeneous inputs and outputs, such as small-signal analog inputs, differential quadrature rotatory encoders or variable current outputs. A common undesirable side effect of heterogeneous DAQ hardware is the lack of an accurate synchronization between samples captured by each device. To solve such a problem with low-cost hardware, we present a novel modular DAQ architecture comprising a base board and a set of interchangeable modules. Our main design goal is the ability to sample all sources at predictable, fixed sampling frequencies, with a reduced synchronization mismatch (<1 µs) between heterogeneous signal sources. We present experiments in the field of mechanical engineering, illustrating vibration spectrum analyses from piezoelectric accelerometers and, as a novelty in these kinds of experiments, the spectrum of quadrature encoder signals. Part of the design and software will be publicly released online. PMID:26516865

  8. The accurate estimation of physicochemical properties of ternary mixtures containing ionic liquids via artificial neural networks.

    PubMed

    Cancilla, John C; Díaz-Rodríguez, Pablo; Matute, Gemma; Torrecilla, José S

    2015-02-14

    The estimation of the density and refractive index of ternary mixtures comprising the ionic liquid (IL) 1-butyl-3-methylimidazolium tetrafluoroborate, 2-propanol, and water at a fixed temperature of 298.15 K has been attempted through artificial neural networks. The obtained results indicate that the selection of this mathematical approach was a well-suited option. The mean prediction errors obtained, after simulating with a dataset never involved in the training process of the model, were 0.050% and 0.227% for refractive index and density estimation, respectively. These accurate results, which have been attained using only the composition of the solutions (mass fractions), imply that, most likely, ternary mixtures similar to the one analyzed can be easily evaluated utilizing this algorithmic tool. In addition, different chemical processes involving ILs can be monitored precisely, and furthermore, the purity of the compounds in the studied mixtures can be indirectly assessed thanks to the high accuracy of the model. PMID:25583241

  9. Toward an Accurate Estimate of the Exfoliation Energy of Black Phosphorus: A Periodic Quantum Chemical Approach.

    PubMed

    Sansone, Giuseppe; Maschio, Lorenzo; Usvyat, Denis; Schütz, Martin; Karttunen, Antti

    2016-01-01

    The black phosphorus (black-P) crystal is formed of covalently bound layers of phosphorene stacked together by weak van der Waals interactions. An experimental measurement of the exfoliation energy of black-P is not available presently, making theoretical studies the most important source of information for the optimization of phosphorene production. Here, we provide an accurate estimate of the exfoliation energy of black-P on the basis of multilevel quantum chemical calculations, which include the periodic local Møller-Plesset perturbation theory of second order, augmented by higher-order corrections, which are evaluated with finite clusters mimicking the crystal. Very similar results are also obtained by density functional theory with the D3-version of Grimme's empirical dispersion correction. Our estimate of the exfoliation energy for black-P of -151 meV/atom is substantially larger than that of graphite, suggesting the need for different strategies to generate isolated layers for these two systems. PMID:26651397

  10. Direct estimation of the cost effectiveness of tornado shelters.

    PubMed

    Simmons, Kevin M; Sutter, Daniel

    2006-08-01

    This article estimates the cost effectiveness of tornado shelters using the annual probability of a tornado and new data on fatalities per building struck by a tornado. This approach differs from recent estimates of the cost effectiveness of tornado shelters in Reference 1 that use historical casualties. Historical casualties combine both tornado risk and resident action. If residents of tornado-prone states take greater precautions, observed fatalities might not be much higher than in states with lower risk. Estimation using the tornado probability avoids this potential bias. Despite the very different method used, the estimates are 68 million US dollars per life saved in permanent homes and 6.0 million US dollars per life saved in mobile homes in Oklahoma using a 3% real discount rate, within about 10% of estimates based on historical fatalities. The findings suggest that shelters provide cost-effective protection for mobile homes in the most tornado-prone states but not for permanent homes. PMID:16948687
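
    A minimal sketch of the cost-effectiveness arithmetic implied above: an annualized shelter cost divided by the expected number of lives saved per year, which depends on the annual tornado strike probability and the fatalities avoided per sheltered home struck. All input values are placeholders, not figures from the study.

      # Hedged sketch of the cost-per-life-saved arithmetic; all numbers are
      # placeholders for illustration, not values from the paper.

      def annualized_cost(capital_cost, discount_rate, lifetime_years):
          """Convert an up-front shelter cost into an equivalent annual cost."""
          crf = discount_rate / (1.0 - (1.0 + discount_rate) ** -lifetime_years)
          return capital_cost * crf

      shelter_cost = 2500.0        # up-front cost of a shelter (placeholder)
      rate = 0.03                  # 3% real discount rate, as in the abstract
      lifetime = 50                # assumed shelter lifetime in years (placeholder)

      p_hit = 5e-5                 # annual probability a tornado strikes the home (placeholder)
      fatalities_prevented = 0.05  # expected fatalities avoided per struck, sheltered home (placeholder)

      annual_cost = annualized_cost(shelter_cost, rate, lifetime)
      expected_lives_saved_per_year = p_hit * fatalities_prevented
      print(annual_cost / expected_lives_saved_per_year)  # dollars per life saved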

  11. Improving the Discipline of Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Piland, William M.; Pine, David J.; Wilson, Delano M.

    2000-01-01

    The need to improve the quality and accuracy of cost estimates of proposed new aerospace systems has been widely recognized. The industry has done the best job of maintaining related capability with improvements in estimation methods and giving appropriate priority to the hiring and training of qualified analysts. Some parts of Government, and National Aeronautics and Space Administration (NASA) in particular, continue to need major improvements in this area. Recently, NASA recognized that its cost estimation and analysis capabilities had eroded to the point that the ability to provide timely, reliable estimates was impacting the confidence in planning many program activities. As a result, this year the Agency established a lead role for cost estimation and analysis. The Independent Program Assessment Office located at the Langley Research Center was given this responsibility.

  12. COST ESTIMATING EQUATIONS FOR BEST MANAGEMENT PRACTICES (BMP)

    EPA Science Inventory

    This paper describes the development of an interactive internet-based cost-estimating tool for commonly used urban storm runoff best management practices (BMP), including: retention and detention ponds, grassed swales, and constructed wetlands. The paper presents the cost data, c...

  13. 40 CFR 265.142 - Cost estimate for closure.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... operator may use costs for on-site disposal if he can demonstrate that on-site disposal capacity will exist... 40 Protection of Environment 26 2011-07-01 2011-07-01 false Cost estimate for closure. 265.142... (CONTINUED) INTERIM STATUS STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT, STORAGE,...

  14. Estimating design costs for first-of-a-kind projects

    SciTech Connect

    Banerjee, Bakul; /Fermilab

    2006-03-01

    Modern scientific facilities are often outcomes of projects that are first-of-a-kind, that is, minimal historical data are available for project costs and schedules. However, at Fermilab, there was an opportunity to execute two similar projects consecutively. In this paper, a comparative study of the design costs for these two projects is presented using earned value methodology. This study provides some insights into how to estimate the cost of a replicated project.
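
    For readers unfamiliar with the earned value methodology mentioned above, the basic indices compare planned value (PV), earned value (EV), and actual cost (AC); the figures below are hypothetical and unrelated to the Fermilab projects.

      # Minimal earned-value sketch; PV, EV, and AC are hypothetical values.
      pv, ev, ac = 1_000_000.0, 900_000.0, 950_000.0

      cpi = ev / ac   # cost performance index: >1 means under budget
      spi = ev / pv   # schedule performance index: >1 means ahead of schedule
      cv = ev - ac    # cost variance
      sv = ev - pv    # schedule variance

      print(f"CPI={cpi:.2f}, SPI={spi:.2f}, CV={cv:,.0f}, SV={sv:,.0f}")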

  15. Lamb mode selection for accurate wall loss estimation via guided wave tomography

    SciTech Connect

    Huthwaite, P.; Ribichini, R.; Lowe, M. J. S.; Cawley, P.

    2014-02-18

    Guided wave tomography offers a method to accurately quantify wall thickness losses in pipes and vessels caused by corrosion. This is achieved using ultrasonic waves transmitted over distances of approximately 1–2m, which are measured by an array of transducers and then used to reconstruct a map of wall thickness throughout the inspected region. To achieve accurate estimations of remnant wall thickness, it is vital that a suitable Lamb mode is chosen. This paper presents a detailed evaluation of the fundamental modes, S{sub 0} and A{sub 0}, which are of primary interest in guided wave tomography thickness estimates since the higher order modes do not exist at all thicknesses, to compare their performance using both numerical and experimental data while considering a range of challenging phenomena. The sensitivity of A{sub 0} to thickness variations was shown to be superior to that of S{sub 0}; however, the attenuation of A{sub 0} when a liquid loading was present was much higher than that of S{sub 0}. A{sub 0} was less sensitive than S{sub 0} to the presence of coatings on the surface.

  16. Lamb mode selection for accurate wall loss estimation via guided wave tomography

    NASA Astrophysics Data System (ADS)

    Huthwaite, P.; Ribichini, R.; Lowe, M. J. S.; Cawley, P.

    2014-02-01

    Guided wave tomography offers a method to accurately quantify wall thickness losses in pipes and vessels caused by corrosion. This is achieved using ultrasonic waves transmitted over distances of approximately 1-2m, which are measured by an array of transducers and then used to reconstruct a map of wall thickness throughout the inspected region. To achieve accurate estimations of remnant wall thickness, it is vital that a suitable Lamb mode is chosen. This paper presents a detailed evaluation of the fundamental modes, S0 and A0, which are of primary interest in guided wave tomography thickness estimates since the higher order modes do not exist at all thicknesses, to compare their performance using both numerical and experimental data while considering a range of challenging phenomena. The sensitivity of A0 to thickness variations was shown to be superior to that of S0; however, the attenuation of A0 when a liquid loading was present was much higher than that of S0. A0 was less sensitive than S0 to the presence of coatings on the surface.

  17. Removing the thermal component from heart rate provides an accurate VO2 estimation in forest work.

    PubMed

    Dubé, Philippe-Antoine; Imbeau, Daniel; Dubeau, Denise; Lebel, Luc; Kolus, Ahmet

    2016-05-01

    Heart rate (HR) was monitored continuously in 41 forest workers performing brushcutting or tree planting work. 10-min seated rest periods were imposed during the workday to estimate the HR thermal component (ΔHRT) per Vogt et al. (1970, 1973). VO2 was measured using a portable gas analyzer during a morning submaximal step-test conducted at the work site, during a work bout over the course of the day (range: 9-74 min), and during an ensuing 10-min rest pause taken at the worksite. The VO2 estimates from measured HR and from corrected HR (thermal component removed) were compared to VO2 measured during work and rest. Varied levels of HR thermal component (ΔHRTavg range: 0-38 bpm) originating from a wide range of ambient thermal conditions, thermal clothing insulation worn, and physical load exerted during work were observed. Using raw HR significantly overestimated measured work VO2 by 30% on average (range: 1%-64%). 74% of VO2 prediction error variance was explained by the HR thermal component. VO2 estimated from corrected HR was not statistically different from measured VO2. Work VO2 can be estimated accurately in the presence of thermal stress using Vogt et al.'s method, which can be implemented easily by the practitioner with inexpensive instruments. PMID:26851474
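
    A minimal sketch of the correction described above, assuming a linear individual HR-to-VO2 calibration from the step test and a thermal component obtained from the seated rest pause; all numbers are placeholders rather than study data.

      # Illustrative sketch: calibrate an individual HR-to-VO2 relation from the
      # morning step test, then remove the thermal component (Delta-HRT) from the
      # work heart rate before predicting VO2. Numbers are placeholders.
      import numpy as np

      # Step-test calibration points for one worker: heart rate (bpm) vs VO2 (L/min)
      hr_cal = np.array([85.0, 100.0, 115.0, 130.0])
      vo2_cal = np.array([0.9, 1.3, 1.7, 2.1])
      slope, intercept = np.polyfit(hr_cal, vo2_cal, 1)   # linear HR-VO2 relation

      hr_work = 138.0      # average heart rate during a work bout (placeholder)
      delta_hrt = 18.0     # thermal component from the seated rest pause (placeholder)

      vo2_raw = slope * hr_work + intercept                      # raw HR (overestimates)
      vo2_corrected = slope * (hr_work - delta_hrt) + intercept  # thermal component removed
      print(vo2_raw, vo2_corrected)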

  18. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets

    PubMed Central

    Granata, Daniele; Carnevale, Vincenzo

    2016-01-01

    The collective behavior of a large number of degrees of freedom can be often described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant “collective” variables. Here, we analyze this issue by focusing on the optimal number of variables needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset. PMID:27510265

  19. Hybridization modeling of oligonucleotide SNP arrays for accurate DNA copy number estimation

    PubMed Central

    Wan, Lin; Sun, Kelian; Ding, Qi; Cui, Yuehua; Li, Ming; Wen, Yalu; Elston, Robert C.; Qian, Minping; Fu, Wenjiang J

    2009-01-01

    Affymetrix SNP arrays have been widely used for single-nucleotide polymorphism (SNP) genotype calling and DNA copy number variation inference. Although numerous methods have achieved high accuracy in these fields, most studies have paid little attention to the modeling of hybridization of probes to off-target allele sequences, which can affect the accuracy greatly. In this study, we address this issue and demonstrate that hybridization with mismatch nucleotides (HWMMN) occurs in all SNP probe-sets and has a critical effect on the estimation of allelic concentrations (ACs). We study sequence binding through binding free energy and then binding affinity, and develop a probe intensity composite representation (PICR) model. The PICR model allows the estimation of ACs at a given SNP through statistical regression. Furthermore, we demonstrate with cell-line data of known true copy numbers that the PICR model can achieve reasonable accuracy in copy number estimation at a single SNP locus, by using the ratio of the estimated AC of each sample to that of the reference sample, and can reveal subtle genotype structure of SNPs at abnormal loci. We also demonstrate with HapMap data that the PICR model yields accurate SNP genotype calls consistently across samples, laboratories and even across array platforms. PMID:19586935

  20. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets.

    PubMed

    Granata, Daniele; Carnevale, Vincenzo

    2016-01-01

    The collective behavior of a large number of degrees of freedom can be often described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant "collective" variables. Here, we analyze this issue by focusing on the optimal number of variables needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset. PMID:27510265

  1. Methods for accurate estimation of net discharge in a tidal channel

    USGS Publications Warehouse

    Simpson, M.R.; Bland, R.

    2000-01-01

    Accurate estimates of net residual discharge in tidally affected rivers and estuaries are possible because of recently developed ultrasonic discharge measurement techniques. Previous discharge estimates using conventional mechanical current meters and methods based on stage/discharge relations or water slope measurements often yielded errors that were as great as or greater than the computed residual discharge. Ultrasonic measurement methods consist of: 1) the use of ultrasonic instruments for the measurement of a representative 'index' velocity used for in situ estimation of mean water velocity and 2) the use of the acoustic Doppler current discharge measurement system to calibrate the index velocity measurement data. Methods used to calibrate (rate) the index velocity to the channel velocity measured using the Acoustic Doppler Current Profiler are the most critical factors affecting the accuracy of net discharge estimation. The index velocity first must be related to mean channel velocity and then used to calculate instantaneous channel discharge. Finally, discharge is low-pass filtered to remove the effects of the tides. An ultrasonic velocity meter discharge-measurement site in a tidally affected region of the Sacramento-San Joaquin Rivers was used to study the accuracy of the index velocity calibration procedure. Calibration data consisting of ultrasonic velocity meter index velocity and concurrent acoustic Doppler discharge measurement data were collected during three time periods. Two sets of data were collected during a spring tide (monthly maximum tidal current) and one set was collected during a neap tide (monthly minimum tidal current). The relative magnitude of instrumental errors, acoustic Doppler discharge measurement errors, and calibration errors were evaluated. Calibration error was found to be the most significant source of error in estimating net discharge. Using a comprehensive calibration method, net discharge estimates developed from the three
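
    A simplified sketch of the workflow described above: rate the index velocity against concurrent ADCP mean channel velocities, convert to instantaneous discharge, and low-pass filter to remove the tidal signal. A Butterworth filter is used here as a generic stand-in for the tidal filter actually applied, and all data are synthetic.

      # (1) rate the ultrasonic index velocity against ADCP mean channel velocities,
      # (2) compute instantaneous discharge, (3) low-pass filter out the tides.
      import numpy as np
      from scipy.signal import butter, filtfilt

      t = np.arange(0, 30 * 24, 0.25)                         # 30 days of 15-min samples (hours)
      v_index = 0.6 * np.sin(2 * np.pi * t / 12.42) + 0.05    # synthetic index velocity (m/s)

      # (1) Rating: linear fit of ADCP mean velocity against index velocity
      v_index_cal = np.array([-0.5, -0.2, 0.1, 0.4, 0.7])       # calibration index velocities
      v_adcp_cal = np.array([-0.55, -0.21, 0.12, 0.45, 0.78])   # concurrent ADCP mean velocities
      a, b = np.polyfit(v_index_cal, v_adcp_cal, 1)

      # (2) Instantaneous discharge from rated mean velocity and channel area
      area = 850.0                                  # channel cross-sectional area (m^2), placeholder
      q_inst = (a * v_index + b) * area             # m^3/s

      # (3) Low-pass filter (cutoff ~ 1/30 cycles per hour) to remove the tidal signal
      fs = 4.0                                      # samples per hour
      b_f, a_f = butter(4, (1.0 / 30.0) / (fs / 2.0))
      q_net = filtfilt(b_f, a_f, q_inst)            # residual (net) discharge series
      print(q_net.mean())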

  2. MIDAS robust trend estimator for accurate GPS station velocities without step detection

    NASA Astrophysics Data System (ADS)

    Blewitt, Geoffrey; Kreemer, Corné; Hammond, William C.; Gazeaux, Julien

    2016-03-01

    Automatic estimation of velocities from GPS coordinate time series is becoming required to cope with the exponentially increasing flood of available data, but problems detectable to the human eye are often overlooked. This motivates us to find an automatic and accurate estimator of trend that is resistant to common problems such as step discontinuities, outliers, seasonality, skewness, and heteroscedasticity. Developed here, Median Interannual Difference Adjusted for Skewness (MIDAS) is a variant of the Theil-Sen median trend estimator, for which the ordinary version is the median of slopes vij = (xj-xi)/(tj-ti) computed between all data pairs i > j. For normally distributed data, Theil-Sen and least squares trend estimates are statistically identical, but unlike least squares, Theil-Sen is resistant to undetected data problems. To mitigate both seasonality and step discontinuities, MIDAS selects data pairs separated by 1 year. This condition is relaxed for time series with gaps so that all data are used. Slopes from data pairs spanning a step function produce one-sided outliers that can bias the median. To reduce bias, MIDAS removes outliers and recomputes the median. MIDAS also computes a robust and realistic estimate of trend uncertainty. Statistical tests using GPS data in the rigid North American plate interior show ±0.23 mm/yr root-mean-square (RMS) accuracy in horizontal velocity. In blind tests using synthetic data, MIDAS velocities have an RMS accuracy of ±0.33 mm/yr horizontal, ±1.1 mm/yr up, with a 5th percentile range smaller than all 20 automatic estimators tested. Considering its general nature, MIDAS has the potential for broader application in the geosciences.
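
    A minimal sketch of the core MIDAS idea, a Theil-Sen-style median of slopes computed only over data pairs separated by about one year; the full algorithm's outlier trimming, gap handling, and uncertainty estimate are omitted here.

      # Median of slopes from data pairs separated by ~1 year, which suppresses
      # seasonal signals and resists steps and outliers. Simplified illustration,
      # not the full MIDAS algorithm.
      import numpy as np

      def midas_like_trend(t, x, pair_span=1.0, tol=0.02):
          """t in years, x in mm; returns a robust trend estimate in mm/yr."""
          slopes = []
          for i in range(len(t)):
              target = t[i] + pair_span          # look for a sample ~1 year later
              j = np.searchsorted(t, target)
              if j < len(t) and abs(t[j] - target) < tol:
                  slopes.append((x[j] - x[i]) / (t[j] - t[i]))
          return np.median(slopes)

      # Synthetic daily series: 3 mm/yr trend + annual cycle + noise + a 10 mm step
      rng = np.random.default_rng(0)
      t = np.arange(0, 5, 1 / 365.25)
      x = 3.0 * t + 2.0 * np.sin(2 * np.pi * t) + rng.normal(0, 1.0, t.size)
      x[t > 2.5] += 10.0
      print(midas_like_trend(t, x))   # close to 3 mm/yr despite the step and seasonality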

  3. Toward an Accurate and Inexpensive Estimation of CCSD(T)/CBS Binding Energies of Large Water Clusters.

    PubMed

    Sahu, Nityananda; Singh, Gurmeet; Nandi, Apurba; Gadre, Shridhar R

    2016-07-21

    Owing to the steep scaling behavior, highly accurate CCSD(T) calculations, the contemporary gold standard of quantum chemistry, are prohibitively difficult for moderate- and large-sized water clusters even with the high-end hardware. The molecular tailoring approach (MTA), a fragmentation-based technique, is found to be useful for enabling such high-level ab initio calculations. The present work reports the CCSD(T) level binding energies of many low-lying isomers of large (H2O)n (n = 16, 17, and 25) clusters employing aug-cc-pVDZ and aug-cc-pVTZ basis sets within the MTA framework. Accurate estimation of the CCSD(T) level binding energies [within 0.3 kcal/mol of the respective full calculation (FC) results] is achieved after effecting the grafting procedure, a protocol for minimizing the errors in the MTA-derived energies arising due to the approximate nature of MTA. The CCSD(T) level grafting procedure presented here hinges upon the well-known fact that the MP2 method, which scales as O(N^5), can be a suitable starting point for approximating to the highly accurate CCSD(T) [that scales as O(N^7)] energies. On account of the requirement of only an MP2-level FC on the entire cluster, the current methodology ultimately leads to a cost-effective solution for the CCSD(T) level accurate binding energies of large-sized water clusters even at the complete basis set limit utilizing off-the-shelf hardware. PMID:27351269
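
    One plausible reading of the grafting arithmetic described above: the fragment-based (MTA) CCSD(T) energy is corrected by the MTA error observed at the cheaper MP2 level, for which a full calculation on the entire cluster is still affordable. The energies below are placeholders in hartree.

      # Hedged arithmetic sketch of the grafting idea; energies are placeholders.
      e_mta_ccsdt = -1220.4150   # MTA-derived CCSD(T) energy of the full cluster
      e_mta_mp2 = -1219.8020     # MTA-derived MP2 energy of the full cluster
      e_fc_mp2 = -1219.8105      # full-calculation MP2 energy of the full cluster

      # Grafted estimate: assume the MTA error is similar at the MP2 and CCSD(T) levels
      e_grafted_ccsdt = e_mta_ccsdt + (e_fc_mp2 - e_mta_mp2)
      print(e_grafted_ccsdt)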

  4. Hydrogen Production Cost Estimate Using Biomass Gasification: Independent Review

    SciTech Connect

    Ruth, M.

    2011-10-01

    This independent review is the conclusion arrived at from data collection, document reviews, interviews and deliberation from December 2010 through April 2011 and the technical potential of Hydrogen Production Cost Estimate Using Biomass Gasification. The Panel reviewed the current H2A case (Version 2.12, Case 01D) for hydrogen production via biomass gasification and identified four principal components of hydrogen levelized cost: CapEx; feedstock costs; project financing structure; efficiency/hydrogen yield. The panel reexamined the assumptions around these components and arrived at new estimates and approaches that better reflect the current technology and business environments.

  5. 48 CFR 436.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Government estimate of... Contracting for Construction 436.203 Government estimate of construction costs. For acquisitions using sealed bid procedures, the contracting officer may disclose the overall amount of the Government's...

  6. ESTIMATION OF THE COST OF USING CHEMICAL PROTECTIVE CLOTHING

    EPA Science Inventory

    The U.S. Environmental Protection Agency, either directly or through its Superfund contractors, is a major user of chemical protective clothing. he purpose of this study was to develop estimates for the cost of using this clothing. hese estimates can be used to guide purchase dec...

  7. High-Resolution Tsunami Inundation Simulations Based on Accurate Estimations of Coastal Waveforms

    NASA Astrophysics Data System (ADS)

    Oishi, Y.; Imamura, F.; Sugawara, D.; Furumura, T.

    2015-12-01

    We evaluate the accuracy of high-resolution tsunami inundation simulations in detail using the actual observational data of the 2011 Tohoku-Oki earthquake (Mw9.0) and investigate the methodologies to improve the simulation accuracy. Due to the recent development of parallel computing technologies, high-resolution tsunami inundation simulations are conducted more commonly than before. To evaluate how accurately these simulations can reproduce inundation processes, we test several types of simulation configurations on a parallel computer, where we can utilize the observational data (e.g., offshore and coastal waveforms and inundation properties) that are recorded during the Tohoku-Oki earthquake. Before discussing the accuracy of inundation processes on land, the incident waves at coastal sites must be accurately estimated. However, for megathrust earthquakes, it is difficult to find the tsunami source that can provide accurate estimations of tsunami waveforms at every coastal site because of the complex spatiotemporal distribution of the source and the limitation of observation. To overcome this issue, we employ a site-specific source inversion approach that increases the estimation accuracy within a specific coastal site by applying appropriate weighting to the observational data in the inversion process. We applied our source inversion technique to the Tohoku tsunami and conducted inundation simulations using 5-m resolution digital elevation model data (DEM) for the coastal area around Miyako Bay and Sendai Bay. The estimated waveforms at the coastal wave gauges of these bays successfully agree with the observed waveforms. However, the simulations overestimate the inundation extent, indicating the necessity to improve the inundation model. We find that the value of Manning's roughness coefficient should be modified from the often-used value of n = 0.025 to n = 0.033 to obtain proper results at both cities. In this presentation, the simulation results with several

  8. Cost estimation for unmanned lunar and planetary programs

    NASA Technical Reports Server (NTRS)

    Dunkin, J. H.; Pekar, P. R.; Spadoni, D. J.; Stone, C. A.

    1973-01-01

    A basic model is presented for estimating the cost of unmanned lunar and planetary programs. Cost data were collected and analyzed for eight lunar and planetary programs. Total cost was separated into the following components: labor, overhead, materials, and technical support. The study determined that direct labor cost of unmanned lunar and planetary programs comprises 30 percent of the total program cost. Twelve program categories were defined for modeling: six spacecraft subsystem categories (science, structure, propulsion, electrical power, communications, and guidance) and six additional categories (integration, test and quality assurance, launch and flight operations, ground equipment, systems analysis and engineering, and program management). An analysis showed that on a percentage basis, direct labor cost and direct labor manhours compare on a one-to-one ratio. Therefore, direct labor hours is used as the parameter for predicting cost, with the advantage of eliminating the effect of inflation on the analysis.

  9. Updated cost estimates of meeting geothermal hydrogen sulfide emission regulations

    SciTech Connect

    Wells, K.D.; Currie, J.W.; Weakley, S.A.; Ballinger, M.Y.

    1981-08-01

    A means of estimating the cost of hydrogen sulfide (H/sub 2/S) emission control was investigated. This study was designed to derive H/sub 2/S emission abatement cost functions and illustrate the cost of H/sub 2/S emission abatement at a hydrothermal site. Four tasks were undertaken: document the release of H/sub 2/S associated with geothermal development; review H/sub 2/S environmental standards; develop functional relationships that may be used to estimate the most cost-effective available H/sub 2/S abatement process; and use the cost functions to generate abatement cost estimates for a specific site. The conclusions and recommendations derived from the research are presented. The definition of the term impacts as used in this research is discussed and current estimates of the highest expected H/sub 2/S concentrations in geothermal reservoirs are provided. Regulations governing H/sub 2/S emissions are reviewed and a review of H/sub 2/S control technology and a summary of the control cost functions are included. A case study is presented to illustrate H/sub 2/S abatement costs at the Baca KGRA in New Mexico.

  10. Fuel Cell System for Transportation -- 2005 Cost Estimate

    SciTech Connect

    Wheeler, D.

    2006-10-01

    Independent review report of the methodology used by TIAX to estimate the cost of producing PEM fuel cells using 2005 cell stack technology. The U.S. Department of Energy (DOE) Hydrogen, Fuel Cells and Infrastructure Technologies Program Manager asked the National Renewable Energy Laboratory (NREL) to commission an independent review of the 2005 TIAX cost analysis for fuel cell production. The NREL Systems Integrator is responsible for conducting independent reviews of progress toward meeting the DOE Hydrogen Program (the Program) technical targets. An important technical target of the Program is the proton exchange membrane (PEM) fuel cell cost in terms of dollars per kilowatt ($/kW). The Program's Multi-Year Program Research, Development, and Demonstration Plan established $125/kW as the 2005 technical target. Over the last several years, the Program has contracted with TIAX, LLC (TIAX) to produce estimates of the high volume cost of PEM fuel cell production for transportation use. Since no manufacturer is yet producing PEM fuel cells in the quantities needed for an initial hydrogen-based transportation economy, these estimates are necessary for DOE to gauge progress toward meeting its targets. For a PEM fuel cell system configuration developed by Argonne National Laboratory, TIAX estimated the total cost to be $108/kW, based on assumptions of 500,000 units per year produced with 2005 cell stack technology, vertical integration of cell stack manufacturing, and balance-of-plant (BOP) components purchased from a supplier network. Furthermore, TIAX conducted a Monte Carlo analysis by varying ten key parameters over a wide range of values and estimated with 98% certainty that the mean PEM fuel cell system cost would be below DOE's 2005 target of $125/kW. NREL commissioned DJW TECHNOLOGY, LLC to form an Independent Review Team (the Team) of industry fuel cell experts and to evaluate the cost estimation process and the results reported by TIAX. The results of this

  11. Estimating the costs of landslide damage in the United States

    USGS Publications Warehouse

    Fleming, Robert W.; Taylor, Fred A.

    1980-01-01

    Landslide damages are one of the most costly natural disasters in the United States. A recent estimate of the total annual cost of landslide damage is in excess of $1 billion (Schuster, 1978). The damages can be significantly reduced, however, through the combined action of technical experts, government, and the public. Before they can be expected to take action, local governments need to have an appreciation of costs of damage in their areas of responsibility and of the reductions in losses that can be achieved. Where studies of cost of landslide damages have been conducted, it is apparent that (1) costs to the public and private sectors of our economy due to landslide damage are much larger than anticipated; (2) taxpayers and public officials generally are unaware of the magnitude of the cost, owing perhaps to the lack of any centralization of data; and (3) incomplete records and unavailability of records result in lower reported costs than actually were incurred. The U.S. Geological Survey has developed a method to estimate the cost of landslide damages in regional and local areas and has applied the method in three urban areas and one rural area. Costs are for different periods and are unadjusted for inflation; therefore, strict comparisons of data from different years should be avoided. Estimates of the average annual cost of landslide damage for the urban areas studied are $5,900,000 in the San Francisco Bay area; $4,000,000 in Allegheny County, Pa.; and $5,170,000 in Hamilton County, Ohio. Adjusting these figures for the population of each area, the annual cost of damages per capita are $1.30 in the nine-county San Francisco Bay region; $2.50 in Allegheny County, Pa.; and $5.80 in Hamilton County, Ohio. On the basis of data from other sources, the estimated annual damages on a per capita basis for the City of Los Angeles, Calif., are about $1.60. If the costs were available for the damages from landslides in Los Angeles in 1977-78 and 1979-80, the annual per
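
    The per-capita figures quoted above are annual damage estimates divided by area population; inverting that arithmetic recovers the implied populations, as sketched below using only the numbers given in the abstract.

      # Arithmetic illustration only: per-capita damage = annual damage / population,
      # so the implied population follows from the two figures quoted above.
      areas = {
          # area: (estimated annual damage in dollars, reported damage per capita)
          "San Francisco Bay region": (5_900_000, 1.30),
          "Allegheny County, Pa.": (4_000_000, 2.50),
          "Hamilton County, Ohio": (5_170_000, 5.80),
      }
      for name, (annual_cost, per_capita) in areas.items():
          implied_population = annual_cost / per_capita
          print(f"{name}: implied population ~ {implied_population:,.0f}")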

  12. Accurate estimation of the RMS emittance from single current amplifier data

    SciTech Connect

    Stockli, Martin P.; Welton, R.F.; Keller, R.; Letchford, A.P.; Thomae, R.W.; Thomason, J.W.G.

    2002-05-31

    This paper presents the SCUBEEx rms emittance analysis, a self-consistent, unbiased elliptical exclusion method, which combines traditional data-reduction methods with statistical methods to obtain accurate estimates for the rms emittance. Rather than considering individual data, the method tracks the average current density outside a well-selected, variable boundary to separate the measured beam halo from the background. The average outside current density is assumed to be part of a uniform background and not part of the particle beam. Therefore the average outside current is subtracted from the data before evaluating the rms emittance within the boundary. As the boundary area is increased, the average outside current and the inside rms emittance form plateaus when all data containing part of the particle beam are inside the boundary. These plateaus mark the smallest acceptable exclusion boundary and provide unbiased estimates for the average background and the rms emittance. Small, trendless variations within the plateaus allow for determining the uncertainties of the estimates caused by variations of the measured background outside the smallest acceptable exclusion boundary. The robustness of the method is established with complementary variations of the exclusion boundary. This paper presents a detailed comparison between traditional data reduction methods and SCUBEEx by analyzing two complementary sets of emittance data obtained with a Lawrence Berkeley National Laboratory and an ISIS H{sup -} ion source.
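
    A simplified sketch of the exclusion-boundary idea, assuming a uniform background and synthetic Gaussian phase-space data: as the elliptical boundary grows, the average outside density and the background-subtracted rms emittance both level off, and the plateau values are taken as the estimates. This is an illustration, not the published SCUBEEx implementation.

      # Grow an elliptical boundary in (x, x') space, treat the mean signal outside
      # it as a uniform background, subtract it, and compute the rms emittance from
      # the data inside; both quantities plateau once the beam is fully enclosed.
      import numpy as np

      # Synthetic phase-space density: Gaussian beam + constant background of 0.02
      x = np.linspace(-20, 20, 201)           # position (mm)
      xp = np.linspace(-20, 20, 201)          # divergence (mrad)
      X, XP = np.meshgrid(x, xp)
      beam = np.exp(-(X**2 / (2 * 4.0**2) + XP**2 / (2 * 2.0**2)))
      density = beam + 0.02

      def rms_emittance(w, X, XP):
          """Intensity-weighted rms emittance sqrt(<x^2><x'^2> - <x x'>^2)."""
          w = np.clip(w, 0, None)
          s = w.sum()
          mx, mxp = (w * X).sum() / s, (w * XP).sum() / s
          vx = (w * (X - mx) ** 2).sum() / s
          vxp = (w * (XP - mxp) ** 2).sum() / s
          cov = (w * (X - mx) * (XP - mxp)).sum() / s
          return np.sqrt(vx * vxp - cov**2)

      for scale in [1.0, 2.0, 3.0, 4.0, 5.0]:
          inside = (X / (scale * 4.0)) ** 2 + (XP / (scale * 2.0)) ** 2 <= 1.0
          background = density[~inside].mean()    # average density outside the boundary
          corrected = density - background        # subtract the uniform background
          corrected[~inside] = 0.0                # evaluate the emittance inside only
          print(scale, round(background, 4), round(rms_emittance(corrected, X, XP), 3))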

  13. Accurate estimation of motion blur parameters in noisy remote sensing image

    NASA Astrophysics Data System (ADS)

    Shi, Xueyan; Wang, Lin; Shao, Xiaopeng; Wang, Huilin; Tao, Zhong

    2015-05-01

    The relative motion between a remote sensing satellite sensor and objects is one of the most common causes of remote sensing image degradation. It seriously weakens image data interpretation and information extraction. In practice, the point spread function (PSF) should be estimated first for image restoration. Accurately identifying the motion blur direction and length is crucial for determining the PSF and restoring the image with precision. In general, the regular light-and-dark stripes in the spectrum can be used to obtain these parameters by applying the Radon transform. However, the heavy noise present in actual remote sensing images often makes the stripes indistinct, so the parameters become difficult to calculate and the resulting error relatively large. In this paper, an improved motion blur parameter identification method for noisy remote sensing images is proposed to solve this problem. The spectrum characteristics of noisy remote sensing images are analyzed first. An interactive image segmentation method based on graph theory, called GrabCut, is adopted to effectively extract the edge of the light center in the spectrum. The motion blur direction is estimated by applying the Radon transform to the segmentation result. In order to reduce random error, a method based on whole-column statistics is used when calculating the blur length. Finally, the Lucy-Richardson algorithm is applied to restore remote sensing images of the moon after estimating the blur parameters. The experimental results verify the effectiveness and robustness of our algorithm.
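
    A rough sketch of the spectrum-based direction estimate, assuming a synthetic blurred image and using the variance of Radon projections of the log-magnitude spectrum as the selection criterion; the GrabCut segmentation, length estimation, and noise handling of the paper are omitted, and the mapping from the selected angle to the blur direction depends on the transform's angle convention, so it should be checked against a blur of known direction.

      # Motion blur produces parallel stripes in the image spectrum; the Radon
      # transform of the log-spectrum responds most strongly along the stripe
      # orientation. The criterion below (maximum projection variance) is a common
      # stand-in, not the exact selection rule of the paper.
      import numpy as np
      from scipy.ndimage import rotate, uniform_filter1d
      from skimage.transform import radon

      rng = np.random.default_rng(1)
      img = rng.normal(size=(256, 256))

      # Apply a horizontal motion blur of length 15 px, then rotate the image by 30 degrees
      blurred = uniform_filter1d(img, size=15, axis=1)
      blurred = rotate(blurred, angle=30, reshape=False, mode="nearest")

      # Log magnitude spectrum (centered), then Radon transform over candidate angles
      spectrum = np.log1p(np.abs(np.fft.fftshift(np.fft.fft2(blurred))))
      angles = np.arange(0, 180, 1.0)
      sinogram = radon(spectrum, theta=angles, circle=False)

      # Angle with the strongest (highest-variance) projection of the striped spectrum;
      # relate it to the blur direction by calibrating against a known test case.
      est = angles[np.argmax(sinogram.var(axis=0))]
      print("angle of maximum projection variance (deg):", est)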

  14. Painfree and accurate Bayesian estimation of psychometric functions for (potentially) overdispersed data.

    PubMed

    Schütt, Heiko H; Harmeling, Stefan; Macke, Jakob H; Wichmann, Felix A

    2016-05-01

    The psychometric function describes how an experimental variable, such as stimulus strength, influences the behaviour of an observer. Estimation of psychometric functions from experimental data plays a central role in fields such as psychophysics, experimental psychology and in the behavioural neurosciences. Experimental data may exhibit substantial overdispersion, which may result from non-stationarity in the behaviour of observers. Here we extend the standard binomial model which is typically used for psychometric function estimation to a beta-binomial model. We show that the use of the beta-binomial model makes it possible to determine accurate credible intervals even in data which exhibit substantial overdispersion. This goes beyond classical measures for overdispersion (goodness-of-fit), which can detect overdispersion but provide no method to do correct inference for overdispersed data. We use Bayesian inference methods for estimating the posterior distribution of the parameters of the psychometric function. Unlike previous Bayesian psychometric inference methods, our software implementation, psignifit 4, performs numerical integration of the posterior within automatically determined bounds. This avoids the use of Markov chain Monte Carlo (MCMC) methods, which typically require expert knowledge. Extensive numerical tests show the validity of the approach and we discuss implications of overdispersion for experimental design. A comprehensive MATLAB toolbox implementing the method is freely available; a python implementation providing the basic capabilities is also available. PMID:27013261
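
    A toy version of the overdispersion model described above: the number of correct responses at each stimulus level follows a beta-binomial whose mean is a sigmoidal psychometric function. The parameterization and data below are illustrative and not the psignifit 4 implementation.

      # Beta-binomial likelihood with mean tied to a 2AFC sigmoid; eta controls
      # overdispersion (eta -> 0 recovers the binomial limit). Toy example only.
      import numpy as np
      from scipy.stats import betabinom
      from scipy.special import expit

      def neg_log_likelihood(params, levels, n_correct, n_trials):
          """params = (threshold m, width w, overdispersion eta in (0, 1))."""
          m, w, eta = params
          psi = 0.5 + 0.5 * expit((levels - m) / w)      # 2AFC: guess rate 0.5
          nu = 1.0 / eta - 1.0                           # concentration of the beta mixing
          a, b = psi * nu, (1.0 - psi) * nu
          return -np.sum(betabinom.logpmf(n_correct, n_trials, a, b))

      levels = np.array([1.0, 2.0, 3.0, 4.0, 5.0])        # stimulus strengths (hypothetical)
      n_trials = np.array([40, 40, 40, 40, 40])
      n_correct = np.array([22, 25, 31, 36, 39])           # hypothetical data

      print(neg_log_likelihood((3.0, 0.8, 0.1), levels, n_correct, n_trials))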

  15. Accurate estimation of human body orientation from RGB-D sensors.

    PubMed

    Liu, Wu; Zhang, Yongdong; Tang, Sheng; Tang, Jinhui; Hong, Richang; Li, Jintao

    2013-10-01

    Accurate estimation of human body orientation can significantly enhance the analysis of human behavior, which is a fundamental task in the field of computer vision. However, existing orientation estimation methods cannot handle the various body poses and appearances. In this paper, we propose an innovative RGB-D-based orientation estimation method to address these challenges. By utilizing the RGB-D information, which can be acquired in real time by RGB-D sensors, our method is robust to cluttered environments, illumination change and partial occlusions. Specifically, efficient static and motion cue extraction methods are proposed based on the RGB-D superpixels to reduce the noise of depth data. Since it is hard to discriminate all 360° of orientation using static cues or motion cues independently, we propose to utilize a dynamic Bayesian network system (DBNS) to effectively employ the complementary nature of both static and motion cues. In order to verify our proposed method, we build a RGB-D-based human body orientation dataset that covers a wide diversity of poses and appearances. Our intensive experimental evaluations on this dataset demonstrate the effectiveness and efficiency of the proposed method. PMID:23893759

  16. Cost estimates for membrane filtration and conventional treatment

    SciTech Connect

    Wiesner, M.R.; Hackney, J.; Sethi, S. ); Jacangelo, J.G. ); Laine, J.M. . Lyonnaise des Eaux)

    1994-12-01

    Costs of several ultrafiltration and nanofiltration processes are compared with the cost of conventional liquid-solid separation with and without GAC adsorption for small water treatment facilities. Data on raw-water quality, permeate flux, recovery, frequency of backflushing, and chemical dosage obtained from a pilot study were used with a previously developed model for membrane costs to calculate anticipated capital and operating costs for each instance. Data from the US Environmental Protection Agency were used to estimate conventional treatment costs. All of the membrane process calculations showed comparable or lower total costs per unit volume treated compared with conventional treatment for small facilities (< 200,000 m[sup 3]/d or about 5 mgd). Membrane processes may offer small facilities a less expensive alternative for the removal of particles and organic materials from drinking water.

  17. Quick and accurate estimation of the elastic constants using the minimum image method

    NASA Astrophysics Data System (ADS)

    Tretiakov, Konstantin V.; Wojciechowski, Krzysztof W.

    2015-04-01

    A method for determining the elastic properties using the minimum image method (MIM) is proposed and tested on a model system of particles interacting by the Lennard-Jones (LJ) potential. The elastic constants of the LJ system are determined in the thermodynamic limit, N → ∞, using the Monte Carlo (MC) method in the NVT and NPT ensembles. The simulation results show that when determining the elastic constants, the contribution of long-range interactions cannot be ignored, because that would lead to erroneous results. In addition, the simulations have revealed that the inclusion of further interactions of each particle with all its minimum image neighbors even in case of small systems leads to results which are very close to the values of elastic constants in the thermodynamic limit. This enables one for a quick and accurate estimation of the elastic constants using very small samples.
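
    The minimum image convention itself, the key ingredient referred to above, reduces each pairwise displacement to the nearest periodic image in a cubic box; the sketch below shows only this bookkeeping, not the Monte Carlo estimation of elastic constants.

      # Minimum image convention (MIM/MIC) distances in a cubic periodic box.
      import numpy as np

      def minimum_image_displacements(positions, box_length):
          """Pairwise displacement vectors under the minimum image convention."""
          diff = positions[:, None, :] - positions[None, :, :]
          diff -= box_length * np.round(diff / box_length)   # wrap into [-L/2, L/2]
          return diff

      rng = np.random.default_rng(0)
      L = 10.0
      pos = rng.uniform(0.0, L, size=(50, 3))

      disp = minimum_image_displacements(pos, L)
      dist = np.linalg.norm(disp, axis=-1)
      print(dist.max() <= np.sqrt(3) * L / 2)   # True: no distance exceeds the MIC bound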

  18. Oil and gas pipeline construction cost analysis and developing regression models for cost estimation

    NASA Astrophysics Data System (ADS)

    Thaduri, Ravi Kiran

    In this study, cost data for 180 pipelines and 136 compressor stations have been analyzed. On the basis of the distribution analysis, regression models have been developed. Material, labor, ROW and miscellaneous costs make up the total cost of pipeline construction. The pipelines are analyzed based on different pipeline lengths, diameter, location, pipeline volume and year of completion. In pipeline construction, labor costs dominate the total costs with a share of about 40%. Multiple non-linear regression models are developed to estimate the component costs of pipelines for various cross-sectional areas, lengths and locations. The compressor stations are analyzed based on capacity, year of completion and location. Unlike the pipeline costs, material costs dominate the total costs in the construction of compressor stations, with an average share of about 50.6%. Land costs have very little influence on the total costs. Similar regression models are developed to estimate the component costs of compressor stations for various capacities and locations.
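
    A sketch of one plausible functional form for such models, a power-law (log-linear) regression of a cost component on diameter and length with a location indicator; the data and coefficients below are invented for illustration and are not the models estimated in the study.

      # log(cost) = b0 + b1*log(diameter) + b2*log(length) + b3*location
      import numpy as np

      # Hypothetical observations: diameter (in), length (mi), location flag, labor cost ($)
      diam = np.array([12, 16, 20, 24, 30, 36], dtype=float)
      length = np.array([10, 25, 5, 40, 15, 60], dtype=float)
      loc = np.array([0, 1, 0, 1, 0, 1], dtype=float)      # e.g., 1 = high-cost region
      cost = np.array([2.1e6, 7.9e6, 1.6e6, 1.8e7, 6.2e6, 3.9e7])

      A = np.column_stack([np.ones_like(diam), np.log(diam), np.log(length), loc])
      coef, *_ = np.linalg.lstsq(A, np.log(cost), rcond=None)

      def predict_cost(d, l, location_flag):
          return np.exp(coef @ np.array([1.0, np.log(d), np.log(l), location_flag]))

      print(predict_cost(24, 30, 1))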

  19. US-based Drug Cost Parameter Estimation for Economic Evaluations

    PubMed Central

    Levy, Joseph F; Meek, Patrick D; Rosenberg, Marjorie A

    2014-01-01

    Introduction In the US, more than 10% of national health expenditures are for prescription drugs. Assessing drug costs in US economic evaluation studies is not consistent, as the true acquisition cost of a drug is not known by decision modelers. Current US practice focuses on identifying one reasonable drug cost and imposing some distributional assumption to assess uncertainty. Methods We propose a set of Rules based on current pharmacy practice that account for the heterogeneity of drug product costs. The set of products derived from our Rules, and their associated costs, form an empirical distribution that can be used for more realistic sensitivity analyses, and create transparency in drug cost parameter computation. The Rules specify an algorithmic process to select clinically equivalent drug products that reduce pill burden, use an appropriate package size, and assume uniform weighting of substitutable products. Three diverse examples show derived empirical distributions and are compared with previously reported cost estimates. Results The shapes of the empirical distributions among the three drugs differ dramatically, including multiple modes and different variation. Previously published estimates differed from the means of the empirical distributions. Published ranges for sensitivity analyses did not cover the ranges of the empirical distributions. In one example using lisinopril, the empirical mean cost of substitutable products was $444 (range $23–$953) as compared to a published estimate of $305 (range $51–$523). Conclusions Our Rules create a simple and transparent approach to create cost estimates of drug products and assess their variability. The approach is easily modified to include a subset of, or different weighting for, substitutable products. The derived empirical distribution is easily incorporated into one-way or probabilistic sensitivity analyses. PMID:25532826
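
    A simplified sketch of the rule-based construction described above, assuming hypothetical product data: keep clinically substitutable products that minimize pill burden, weight them uniformly, and sample the resulting empirical cost distribution in a probabilistic sensitivity analysis.

      # Build an empirical cost distribution from substitutable products and sample
      # it for sensitivity analysis. Product data are invented; the rules below are
      # a simplification of those proposed in the paper.
      import numpy as np

      target_daily_dose_mg = 40
      products = [
          # (strength per pill in mg, package size, package price in dollars)
          (10, 90, 12.0),
          (20, 90, 15.0),
          (40, 90, 23.0),
          (40, 30, 11.0),
          (40, 1000, 180.0),
      ]

      # Rule 1: keep products whose strength achieves the dose with the fewest pills
      min_pills = min(-(-target_daily_dose_mg // s) for s, _, _ in products)
      eligible = [p for p in products if -(-target_daily_dose_mg // p[0]) == min_pills]

      # Rule 2: cost of a 30-day regimen for each eligible product (uniform weights)
      costs = np.array([(price / size) * min_pills * 30 for _, size, price in eligible])
      print("empirical mean:", costs.mean(), "range:", (costs.min(), costs.max()))

      # Probabilistic sensitivity analysis: resample the empirical distribution
      rng = np.random.default_rng(0)
      psa_draws = rng.choice(costs, size=10_000, replace=True)
      print("PSA 2.5-97.5 percentiles:", np.percentile(psa_draws, [2.5, 97.5]))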

  20. A Simple yet Accurate Method for the Estimation of the Biovolume of Planktonic Microorganisms.

    PubMed

    Saccà, Alessandro

    2016-01-01

    Determining the biomass of microbial plankton is central to the study of fluxes of energy and materials in aquatic ecosystems. This is typically accomplished by applying proper volume-to-carbon conversion factors to group-specific abundances and biovolumes. A critical step in this approach is the accurate estimation of biovolume from two-dimensional (2D) data such as those available through conventional microscopy techniques or flow-through imaging systems. This paper describes a simple yet accurate method for the assessment of the biovolume of planktonic microorganisms, which works with any image analysis system allowing for the measurement of linear distances and the estimation of the cross sectional area of an object from a 2D digital image. The proposed method is based on Archimedes' principle about the relationship between the volume of a sphere and that of a cylinder in which the sphere is inscribed, plus a coefficient of 'unellipticity' introduced here. Validation and careful evaluation of the method are provided using a variety of approaches. The new method proved to be highly precise with all convex shapes characterised by approximate rotational symmetry, and combining it with an existing method specific for highly concave or branched shapes allows covering the great majority of cases with good reliability. Thanks to its accuracy, consistency, and low resources demand, the new method can conveniently be used in substitution of any extant method designed for convex shapes, and can readily be coupled with automated cell imaging technologies, including state-of-the-art flow-through imaging devices. PMID:27195667

  1. A Simple yet Accurate Method for the Estimation of the Biovolume of Planktonic Microorganisms

    PubMed Central

    2016-01-01

    Determining the biomass of microbial plankton is central to the study of fluxes of energy and materials in aquatic ecosystems. This is typically accomplished by applying proper volume-to-carbon conversion factors to group-specific abundances and biovolumes. A critical step in this approach is the accurate estimation of biovolume from two-dimensional (2D) data such as those available through conventional microscopy techniques or flow-through imaging systems. This paper describes a simple yet accurate method for the assessment of the biovolume of planktonic microorganisms, which works with any image analysis system allowing for the measurement of linear distances and the estimation of the cross sectional area of an object from a 2D digital image. The proposed method is based on Archimedes’ principle about the relationship between the volume of a sphere and that of a cylinder in which the sphere is inscribed, plus a coefficient of ‘unellipticity’ introduced here. Validation and careful evaluation of the method are provided using a variety of approaches. The new method proved to be highly precise with all convex shapes characterised by approximate rotational symmetry, and combining it with an existing method specific for highly concave or branched shapes allows covering the great majority of cases with good reliability. Thanks to its accuracy, consistency, and low resources demand, the new method can conveniently be used in substitution of any extant method designed for convex shapes, and can readily be coupled with automated cell imaging technologies, including state-of-the-art flow-through imaging devices. PMID:27195667
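
    A plain reading of the geometric principle invoked above: a sphere occupies two thirds of its circumscribed cylinder, so the volume of a convex, approximately rotationally symmetric cell can be approximated from its 2D silhouette area and width, scaled by an unellipticity coefficient. The function below illustrates that idea and is not the exact published formula or coefficient.

      # Biovolume from 2D silhouette area and width via the sphere/cylinder ratio.
      import math

      def biovolume_estimate(silhouette_area_um2, width_um, unellipticity=1.0):
          """Approximate biovolume (um^3) from 2D area, width, and a shape coefficient."""
          return (2.0 / 3.0) * silhouette_area_um2 * width_um * unellipticity

      # Consistency check on a sphere of diameter 10 um: with unellipticity = 1 the
      # estimate should match the true volume pi*d^3/6.
      d = 10.0
      area = math.pi * d**2 / 4.0
      print(biovolume_estimate(area, d), math.pi * d**3 / 6.0)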

  2. Accurate Estimation of the Fine Layering Effect on the Wave Propagation in the Carbonate Rocks

    NASA Astrophysics Data System (ADS)

    Bouchaala, F.; Ali, M. Y.

    2014-12-01

    The attenuation caused to the seismic wave during its propagation can be mainly divided into two parts, the scattering and the intrinsic attenuation. The scattering is an elastic redistribution of the energy due to the medium heterogeneities. However, the intrinsic attenuation is an inelastic phenomenon, mainly due to the fluid-grain friction during the wave passage. The intrinsic attenuation is directly related to the physical characteristics of the medium, so this parameter can be used for media characterization and fluid detection, which is beneficial for the oil and gas industry. The intrinsic attenuation is estimated by subtracting the scattering from the total attenuation, therefore the accuracy of the intrinsic attenuation is directly dependent on the accuracy of the total attenuation and the scattering. The total attenuation can be estimated from the recorded waves, by using in-situ methods such as the spectral ratio and frequency shift methods. The scattering is estimated by assuming the heterogeneities as a succession of stacked layers, each layer being characterized by a single density and velocity. The accuracy of the scattering is strongly dependent on the layer thicknesses, especially in the case of media composed of carbonate rocks, since such media are known for their strong heterogeneity. Previous studies gave some assumptions for the choice of the layer thickness, but they showed some limitations especially in the case of carbonate rocks. In this study we established a relationship between the layer thicknesses and the frequency of the propagation, after certain mathematical development of the Generalized O'Doherty-Anstey formula. We validated this relationship through some synthetic tests and real data provided from a VSP carried out over an onshore oilfield in the emirate of Abu Dhabi in the United Arab Emirates, primarily composed of carbonate rocks. The results showed the utility of our relationship for an accurate estimation of the scattering

  3. Accurate biopsy-needle depth estimation in limited-angle tomography using multi-view geometry

    NASA Astrophysics Data System (ADS)

    van der Sommen, Fons; Zinger, Sveta; de With, Peter H. N.

    2016-03-01

    Recently, compressed-sensing based algorithms have enabled volume reconstruction from projection images acquired over a relatively small angle (θ < 20°). These methods enable accurate depth estimation of surgical tools with respect to anatomical structures. However, they are computationally expensive and time consuming, rendering them unattractive for image-guided interventions. We propose an alternative approach for depth estimation of biopsy needles during image-guided interventions, in which we split the problem into two parts and solve them independently: needle-depth estimation and volume reconstruction. The complete proposed system consists of the previous two steps, preceded by needle extraction. First, we detect the biopsy needle in the projection images and remove it by interpolation. Next, we exploit epipolar geometry to find point-to-point correspondences in the projection images to triangulate the 3D position of the needle in the volume. Finally, we use the interpolated projection images to reconstruct the local anatomical structures and indicate the position of the needle within this volume. For validation of the algorithm, we have recorded a full CT scan of a phantom with an inserted biopsy needle. The performance of our approach ranges from a median error of 2.94 mm for a distributed viewing angle of 1° down to an error of 0.30 mm for an angle larger than 10°. Based on the results of this initial phantom study, we conclude that multi-view geometry offers an attractive alternative to time-consuming iterative methods for the depth estimation of surgical tools during C-arm-based image-guided interventions.
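
    A generic sketch of the triangulation step described above, using linear (DLT) triangulation of a needle tip seen in two projection images with known 3x4 projection matrices; the geometry is synthetic and this is not the paper's exact implementation.

      # Recover a 3D point from its image coordinates in two views by solving the
      # homogeneous DLT system with an SVD.
      import numpy as np

      def triangulate(P1, P2, uv1, uv2):
          """Linear DLT triangulation of one point seen in two views."""
          A = np.vstack([
              uv1[0] * P1[2] - P1[0],
              uv1[1] * P1[2] - P1[1],
              uv2[0] * P2[2] - P2[0],
              uv2[1] * P2[2] - P2[1],
          ])
          _, _, vt = np.linalg.svd(A)
          X = vt[-1]
          return X[:3] / X[3]

      # Synthetic C-arm geometry: two views rotated 10 degrees apart about the y axis
      K = np.array([[1000.0, 0, 256], [0, 1000.0, 256], [0, 0, 1]])
      def proj(angle_deg):
          a = np.radians(angle_deg)
          R = np.array([[np.cos(a), 0, np.sin(a)], [0, 1, 0], [-np.sin(a), 0, np.cos(a)]])
          t = np.array([[0.0], [0.0], [500.0]])
          return K @ np.hstack([R, t])

      P1, P2 = proj(0.0), proj(10.0)
      X_true = np.array([5.0, -3.0, 20.0, 1.0])          # ground-truth needle tip (homogeneous)
      uv1 = P1 @ X_true
      uv1 = uv1[:2] / uv1[2]
      uv2 = P2 @ X_true
      uv2 = uv2[:2] / uv2[2]
      print(triangulate(P1, P2, uv1, uv2))   # ~ [5, -3, 20]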

  4. Detailed cost estimate of reference residential photovoltaic designs

    SciTech Connect

    Palmer, R.S.; Penasa, D.A.; Thomas, M.G.

    1983-04-01

    This report presents estimated installation costs for four reference residential photovoltaic designs. Installation cost estimates ranged from $1.28 to $2.12/W/sub p/ for arrays installed by union labor (4.1 to 6.07 kW/sub p/-systems), and from $1.22 to $1.83/W/sub p/ for non-union installations. Standoff mounting was found to increase costs from $1.63/W/sub p/ to $2.12/W/sub p/ for a representative case, whereas 25 kWh of battery storage capacity increased installation costs from $1.44/W/sub p/ to $2.08/W/sub p/. Overall system costs (union-based) were $6000 to $7000 for a 4.1 kW array in the northeast, to approx. $9000 for a 6.07 kW/sub p/ array in the southwest. This range of installation costs, approx. $1 to $2/W/sub p/ (in 1980 dollars), is representative of current installation costs for residential PV systems. Any future cost reductions are likely to be small and can be accomplished only by optimization of mounting techniques, module efficiencies, and module reliability in toto.

  5. Estimating the Costs of Educating Handicapped Children: A Resource-Cost Model Approach--Summary Report.

    ERIC Educational Resources Information Center

    Hartman, William T.

    The purpose of this study was to develop an appropriate methodology and use it to estimate the costs of providing appropriate special education programs and services for all school-aged handicapped children in the U.S. in 1980-81. A resource-cost model approach was selected, based on a mathematical formulation of the relationships among students,…

  6. Can student health professionals accurately estimate alcohol content in commonly occurring drinks?

    PubMed Central

    Sinclair, Julia; Searle, Emma

    2016-01-01

    Objectives: Correct identification of alcohol as a contributor to, or comorbidity of, many psychiatric diseases requires health professionals to be competent and confident to take an accurate alcohol history. Being able to estimate (or calculate) the alcohol content in commonly consumed drinks is a prerequisite for quantifying levels of alcohol consumption. The aim of this study was to assess this ability in medical and nursing students. Methods: A cross-sectional survey of 891 medical and nursing students across different years of training was conducted. Students were asked the alcohol content of 10 different alcoholic drinks by seeing a slide of the drink (with picture, volume and percentage of alcohol by volume) for 30 s. Results: Overall, the mean number of correctly estimated drinks (out of the 10 tested) was 2.4, increasing to just over 3 if a 10% margin of error was used. Wine and premium strength beers were underestimated by over 50% of students. Those who drank alcohol themselves, or who were further on in their clinical training, did better on the task, but overall the levels remained low. Conclusions: Knowledge of, or the ability to work out, the alcohol content of commonly consumed drinks is poor, and further research is needed to understand the reasons for this and the impact this may have on the likelihood to undertake screening or initiate treatment. PMID:27536344
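
    The underlying drink-strength arithmetic being tested is straightforward: UK units equal volume in litres times percentage alcohol by volume, and grams of ethanol follow from volume, ABV, and the density of ethanol (about 0.789 g/mL). The drinks below are generic examples, not the ten used in the survey.

      # Alcohol content from serving volume and ABV.
      # UK units = volume (L) x ABV (%); grams of ethanol = volume (mL) x ABV/100 x 0.789.
      drinks = {
          # name: (volume in mL, ABV in %)
          "premium lager (pint)": (568, 5.2),
          "wine (large glass)": (250, 13.0),
          "spirit (single measure)": (25, 40.0),
      }
      for name, (vol_ml, abv) in drinks.items():
          uk_units = vol_ml / 1000 * abv
          grams = vol_ml * abv / 100 * 0.789
          print(f"{name}: {uk_units:.1f} UK units, {grams:.0f} g ethanol")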

  7. Ultrasound Fetal Weight Estimation: How Accurate Are We Now Under Emergency Conditions?

    PubMed

    Dimassi, Kaouther; Douik, Fatma; Ajroudi, Mariem; Triki, Amel; Gara, Mohamed Faouzi

    2015-10-01

    The primary aim of this study was to evaluate the accuracy of sonographic estimation of fetal weight when performed at due date by first-line sonographers. This was a prospective study including 500 singleton pregnancies. Ultrasound examinations were performed by residents on delivery day. Estimated fetal weights (EFWs) were calculated and compared with the corresponding birth weights. The median absolute difference between EFW and birth weight was 200 g (100-330). This difference was within ±10% in 75.2% of the cases. The median absolute percentage error was 5.53% (2.70%-10.03%). Linear regression analysis revealed a good correlation between EFW and birth weight (r = 0.79, p < 0.0001). According to Bland-Altman analysis, bias was -85.06 g (95% limits of agreement: -663.33 to 494.21). In conclusion, EFWs calculated by residents were as accurate as those calculated by experienced sonographers. Nevertheless, predictive performance remains limited, with a low sensitivity in the diagnosis of macrosomia. PMID:26164286

  8. Ocean Lidar Measurements of Beam Attenuation and a Roadmap to Accurate Phytoplankton Biomass Estimates

    NASA Astrophysics Data System (ADS)

    Hu, Yongxiang; Behrenfeld, Mike; Hostetler, Chris; Pelon, Jacques; Trepte, Charles; Hair, John; Slade, Wayne; Cetinic, Ivona; Vaughan, Mark; Lu, Xiaomei; Zhai, Pengwang; Weimer, Carl; Winker, David; Verhappen, Carolus C.; Butler, Carolyn; Liu, Zhaoyan; Hunt, Bill; Omar, Ali; Rodier, Sharon; Lifermann, Anne; Josset, Damien; Hou, Weilin; MacDonnell, David; Rhew, Ray

    2016-06-01

    Beam attenuation coefficient, c, provides an important optical index of plankton standing stocks, such as phytoplankton biomass and total particulate carbon concentration. Unfortunately, c has proven difficult to quantify through remote sensing. Here, we introduce an innovative approach for estimating c using lidar depolarization measurements and diffuse attenuation coefficients from ocean color products or lidar measurements of Brillouin scattering. The new approach is based on a theoretical formula established from Monte Carlo simulations that links the depolarization ratio of sea water to the ratio of diffuse attenuation Kd and beam attenuation C (i.e., a multiple scattering factor). On July 17, 2014, the CALIPSO satellite was tilted 30° off-nadir for one nighttime orbit in order to minimize ocean surface backscatter and demonstrate the lidar ocean subsurface measurement concept from space. Depolarization ratios of ocean subsurface backscatter are measured accurately. Beam attenuation coefficients computed from the depolarization ratio measurements compare well with empirical estimates from ocean color measurements. We further verify the beam attenuation coefficient retrievals using aircraft-based high spectral resolution lidar (HSRL) data that are collocated with in-water optical measurements.

  9. Utilizing Expert Knowledge in Estimating Future STS Costs

    NASA Technical Reports Server (NTRS)

    Fortner, David B.; Ruiz-Torres, Alex J.

    2004-01-01

    A method of estimating the costs of future space transportation systems (STSs) involves classical activity-based cost (ABC) modeling combined with systematic utilization of the knowledge and opinions of experts to extend the process-flow knowledge of existing systems to systems that involve new materials and/or new architectures. The expert knowledge is particularly helpful in filling gaps that arise in computational models of processes because of inconsistencies in historical cost data. Heretofore, the costs of planned STSs have been estimated following a "top-down" approach that tends to force the architectures of new systems to incorporate process flows like those of the space shuttles. In this ABC-based method, one makes assumptions about the processes, but otherwise follows a "bottoms up" approach that does not force the new system architecture to incorporate a space-shuttle-like process flow. Prototype software has been developed to implement this method. Through further development of software, it should be possible to extend the method beyond the space program to almost any setting in which there is a need to estimate the costs of a new system and to extend the applicable knowledge base in order to make the estimate.

  10. Biodiversity on Swedish pastures: estimating biodiversity production costs.

    PubMed

    Nilsson, Fredrik Olof Laurentius

    2009-01-01

    This paper estimates the costs of producing biological diversity on Swedish permanent grasslands. A simple model is introduced in which biodiversity on pastures is produced using grazing animals. On the pastures, the grazing animals create a grazing pressure sufficient to lead to an environment that suits many rare and red-listed species. Two types of pastures are investigated: semi-natural and cultivated. Biological diversity produced on a pasture is estimated by combining a biodiversity indicator, which measures the quality of the land, with the size of the pasture. Biodiversity is, in this context, a quantitative measure where a given quantity can be produced either by a small area of high quality or by a larger area of lower quality. Two areas in different parts of Sweden are investigated. Box-Cox transformations, which provide flexible functional forms, are used in the empirical analysis, and the results indicate that biodiversity production costs differ between the regions. The major contribution of this paper is that it develops and tests a method of estimating biodiversity production costs on permanent pastures when biodiversity quality differs between pastures. If the method were used with cost data that were more thoroughly collected and covered additional production areas, biodiversity cost functions could be estimated and used in applied policy work. PMID:18079049

  11. Evaluation and application of cost estimates for hazardous waste remediation

    SciTech Connect

    LeBoeuf, E.J.; Roberts, P.V.; McCarty, P.L.

    1996-11-01

    The remediation of sites contaminated by hazardous wastes is often a very difficult and frustrating task for all parties involved. The public rightfully demands quick elimination of possible health threats caused by the contamination of the subsurface with hazardous chemicals. The government demands the same, but is also concerned with the permanence of the remediation process and with ensuring that the potentially responsible parties (PRP) are held fully liable for the cleanup. Finally, the PRP is concerned about all of the aforementioned factors, its reputation, and, as important, costs. It is this final aspect of hazardous waste remediation projects that has caused the largest concern. Because business and government often evaluate costs with differing criteria, it is necessary that both parties understand each other's position, and especially the limitations and uncertainties associated with the preparation of preliminary remediation project cost estimates. Often it is these preliminary estimates that are used to determine which available technology will be employed at a specific site. The purpose of this paper is to describe the development of remediation cost estimates, evaluate available cost assessment programs, and finally compare remediation technologies using the US Environmental Protection Agency's Cost of Remedial Action (CORA) program in an actual remedial action case study.

  12. 48 CFR 1615.406-2 - Certificate of accurate cost or pricing data for community-rated carriers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Certificate of accurate cost or pricing data for community-rated carriers. 1615.406-2 Section 1615.406-2 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES HEALTH BENEFITS ACQUISITION...

  13. Power shortage costs: estimates and applications. Final report

    SciTech Connect

    Mosbaek, E.J.

    1981-12-01

    This report presents estimates of costs associated with major outages in San Diego, California, and Key West, Florida. It also outlines several applications of shortage cost estimates. The total losses for the several-hour outage in San Diego were $3.12 and $2.62 per kWh for industrial and commercial users, respectively. For the 26-day shortage in Key West, the total loss for commercial and industrial users combined was $2.20 per kWh in the short run (during and immediately after the outage) plus an additional $.19 per kWh during the subsequent year. The suggested applications of shortage cost estimates include: (1) designing and evaluating better rate design; (2) establishing optimum reliability; (3) scheduling plant expansion; (4) designing programs in load management; (5) selecting options in loss-of-load management; and (6) establishing a benchmark from which to measure improvements in the supply of electric power. The estimates and suggested applications in this report should be helpful for many decisions by utility companies, regulatory authorities, and government programs. All estimates in this report are of the true cost of a shortage: they represent the willingness to pay to avoid a kWh of shortage. Willingness to pay is the most helpful measure of shortage impact because it shows the value of eliminating or reducing a shortage of electricity.

  14. Discrete state model and accurate estimation of loop entropy of RNA secondary structures.

    PubMed

    Zhang, Jian; Lin, Ming; Chen, Rong; Wang, Wei; Liang, Jie

    2008-03-28

    Conformational entropy makes an important contribution to the stability and folding of RNA molecules, but it is challenging to either measure or compute the conformational entropy associated with long loops. We develop optimized discrete k-state models of RNA backbone based on known RNA structures for computing entropy of loops, which are modeled as self-avoiding walks. To estimate entropy of hairpin, bulge, internal loop, and multibranch loop of long length (up to 50), we develop an efficient sampling method based on the sequential Monte Carlo principle. Our method considers the excluded volume effect. It is general and can be applied to calculating entropy of loops with longer length and arbitrary complexity. For loops of short length, our results are in good agreement with a recent theoretical model and experimental measurement. For long loops, our estimated entropy of hairpin loops is in excellent agreement with the Jacobson-Stockmayer extrapolation model. However, for bulge loops and more complex secondary structures such as internal and multibranch loops, we find that the Jacobson-Stockmayer extrapolation model has large errors. Based on estimated entropy, we have developed empirical formulae for accurate calculation of entropy of long loops in different secondary structures. Our study on the effect of asymmetric loop size suggests that the loop entropy of internal loops is largely determined by the total loop length, and is only marginally affected by the asymmetric size of the two loops. Our finding suggests that the significant asymmetric effects of loop length in internal loops measured by experiments are likely to be partially enthalpic. Our method can be applied to develop improved energy parameters important for studying RNA stability and folding, and for predicting RNA secondary and tertiary structures. The discrete model and the program used to calculate loop entropy can be downloaded at http://gila.bioengr.uic.edu/resources/RNA.html. PMID:18376982

  15. Cost estimates for commercial plasma source ion implantation

    SciTech Connect

    Rej, D.J.; Alexander, R.B.

    1994-07-01

    A semiempirical model for the cost of a commercial plasma source ion implantation (PSII) facility is presented. Amortized capital and operating expenses are estimated as functions of the surface area throughput T. The impact of secondary electron emission and batch processing time is considered. Treatment costs are found to decrease monotonically with T until they saturate at large T, when capital equipment payback and space rental dominate the expense. A reasonably sized PSII treatment facility should be able to treat a surface area of 10^4 m^2 per year at a cost of $0.01 per cm^2.
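    The cost behavior described above (amortized capital plus operating expenses spread over the treated area) can be illustrated with a toy calculation. The sketch below is not the paper's semiempirical PSII model; the functional form and every parameter value are illustrative assumptions.

```python
# Toy amortized-cost model for a surface-treatment facility: annual capital
# amortization plus fixed and variable costs, spread over the treated area.
# All parameter values are illustrative assumptions, not the paper's model.

def treatment_cost_per_cm2(throughput_m2_per_yr,
                           capital_cost=2.0e6,         # facility capital, $
                           payback_years=5.0,          # amortization period
                           fixed_annual_cost=1.5e5,    # space rental, upkeep, $
                           variable_cost_per_m2=10.0): # power, gas, labor, $/m^2
    """Treatment cost per cm^2 as a function of annual surface-area throughput."""
    annual_cost = (capital_cost / payback_years + fixed_annual_cost
                   + variable_cost_per_m2 * throughput_m2_per_yr)
    return annual_cost / throughput_m2_per_yr / 1.0e4   # 1 m^2 = 10^4 cm^2

for t in (1e2, 1e3, 1e4, 1e5):
    print(f"{t:>8.0f} m^2/yr -> ${treatment_cost_per_cm2(t):.4f} per cm^2")
```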

  16. A process model to estimate biodiesel production costs.

    PubMed

    Haas, Michael J; McAloon, Andrew J; Yee, Winnie C; Foglia, Thomas A

    2006-03-01

    'Biodiesel' is the name given to a renewable diesel fuel that is produced from fats and oils. It consists of the simple alkyl esters of fatty acids, most typically the methyl esters. We have developed a computer model to estimate the capital and operating costs of a moderately sized industrial biodiesel production facility. The major process operations in the plant were continuous-process vegetable oil transesterification, and ester and glycerol recovery. The model was designed using contemporary process simulation software, and current reagent, equipment and supply costs, following current production practices. Crude, degummed soybean oil was specified as the feedstock. Annual production capacity of the plant was set at 37,854,118 l (10 x 10^6 gal). Facility construction costs were calculated to be US$11.3 million. The largest contributors to the equipment cost, accounting for nearly one third of expenditures, were storage tanks to contain a 25 day capacity of feedstock and product. At a value of US$0.52/kg ($0.236/lb) for feedstock soybean oil, a biodiesel production cost of US$0.53/l ($2.00/gal) was predicted. The single greatest contributor to this value was the cost of the oil feedstock, which accounted for 88% of total estimated production costs. An analysis of the dependence of production costs on the cost of the feedstock indicated a direct linear relationship between the two, with a change of US$0.020/l ($0.075/gal) in product cost per US$0.022/kg ($0.01/lb) change in oil cost. Process economics included the recovery of coproduct glycerol generated during biodiesel production, and its sale into the commercial glycerol market as an 80% w/w aqueous solution, which reduced production costs by approximately 6%. The production cost of biodiesel was found to vary inversely and linearly with variations in the market value of glycerol, increasing by US$0.0022/l ($0.0085/gal) for every US
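    The linear feedstock sensitivity reported above lends itself to a one-line model. The sketch below anchors on the baseline price and slope quoted in the abstract; treating the relationship as linear outside the studied range is an assumption.

```python
# Linear sensitivity of biodiesel production cost to feedstock price, anchored
# on the baseline and slope reported in the abstract; assuming the linear
# relationship also holds at prices outside the range studied.

BASE_OIL_COST = 0.52        # US$/kg soybean oil (abstract baseline)
BASE_PROD_COST = 0.53       # US$/L biodiesel at that oil price
SLOPE = 0.020 / 0.022       # US$/L change per US$/kg change in oil price

def biodiesel_cost_per_litre(oil_cost_per_kg):
    return BASE_PROD_COST + SLOPE * (oil_cost_per_kg - BASE_OIL_COST)

for price in (0.40, 0.52, 0.66, 0.80):
    print(f"oil at ${price:.2f}/kg -> ~${biodiesel_cost_per_litre(price):.2f}/L")
```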

  17. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    NASA Astrophysics Data System (ADS)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and thus enables their application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, along with the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character. Eventually, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.

  18. Estimating the Cost-Effectiveness of Coordinated DSM Programs.

    ERIC Educational Resources Information Center

    Hill, Lawrence J.; Brown, Marilyn A.

    1995-01-01

    A methodology for estimating the cost-effectiveness of coordinated programs from the standpoint of an electric or gas utility is described and illustrated. The discussion focuses on demand-side management programs cofunded by the government and utilities, but it can be applied to other types of cofunded programs. (SLD)

  19. Stochastic Estimation of Cost Frontier: Evidence from Bangladesh

    ERIC Educational Resources Information Center

    Mamun, Shamsul Arifeen Khan

    2012-01-01

    In the literature on higher education cost function studies, much knowledge has been created in the area of economies of scale in the context of developed countries, but knowledge of input demand is lacking. On the other hand, empirical knowledge in the context of developing countries is very meagre. The paper fills this knowledge gap, estimating a…

  20. 40 CFR 264.142 - Cost estimate for closure.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 26 2011-07-01 2011-07-01 false Cost estimate for closure. 264.142 Section 264.142 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT, STORAGE, AND DISPOSAL...

  1. 48 CFR 836.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Government estimate of construction costs. 836.203 Section 836.203 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  2. 48 CFR 36.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Government estimate of construction costs. 36.203 Section 36.203 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  3. 48 CFR 836.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Government estimate of construction costs. 836.203 Section 836.203 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  4. 48 CFR 436.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Government estimate of construction costs. 436.203 Section 436.203 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  5. 48 CFR 36.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Government estimate of construction costs. 36.203 Section 36.203 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  6. 48 CFR 1336.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Government estimate of construction costs. 1336.203 Section 1336.203 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  7. 48 CFR 436.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Government estimate of construction costs. 436.203 Section 436.203 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  8. 48 CFR 1336.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Government estimate of construction costs. 1336.203 Section 1336.203 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  9. 48 CFR 1336.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Government estimate of construction costs. 1336.203 Section 1336.203 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  10. 48 CFR 836.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Government estimate of construction costs. 836.203 Section 836.203 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  11. 48 CFR 436.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Government estimate of construction costs. 436.203 Section 436.203 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  12. 48 CFR 1336.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Government estimate of construction costs. 1336.203 Section 1336.203 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  13. 48 CFR 836.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Government estimate of construction costs. 836.203 Section 836.203 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  14. 48 CFR 436.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Government estimate of construction costs. 436.203 Section 436.203 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  15. 48 CFR 36.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Government estimate of construction costs. 36.203 Section 36.203 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  16. 48 CFR 36.203 - Government estimate of construction costs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Government estimate of construction costs. 36.203 Section 36.203 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Special Aspects...

  17. Common Day Care Safety Renovations: Descriptions, Explanations and Cost Estimates.

    ERIC Educational Resources Information Center

    Spack, Stan

    This booklet explains some of the day care safety features specified by the new Massachusetts State Building Code (January 1, 1975) which must be met before a new day care center can be licensed. The safety features described are those which most often require renovation to meet the building code standards. Best estimates of the costs involved in…

  18. Scheduling and Estimating the Cost of Crew Time

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Levri, Julie A.; Vaccari, David A.; Luna, Bernadette (Technical Monitor)

    2000-01-01

    In a previous paper, Theory and Application of the Equivalent System Mass Metric, Julie Levri, David Vaccari, and Alan Drysdale developed a method for computing the Equivalent System Mass (ESM) of crew time. ESM is an analog of cost. The suggested approach has been applied but seems to impose too high a cost for small additional requirements for crew time. The proposed method is based on the minimum average cost of crew time. In this work, the scheduling of crew time is examined in more detail, using suggested crew time allocations and daily work schedules. Crew tasks are typically assigned using priorities, which can also be used to construct a crew time demand curve mapping the value or cost per hour versus the total number of hours worked. The cost of additional crew time can be estimated by considering the intersection and shapes of the demand and supply curves. If we assume a mathematical form for the demand curve, a revised method can be developed for computing the cost or ESM of crew time. This method indicates a low cost per hour for small additional requirements for crew time and an increasing cost per hour for larger requirements.
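    The priority-based reading of the demand curve can be made concrete with a small sketch: the cost of an additional crew-time requirement is taken as the value of the lowest-priority work it displaces, which is why small additions are cheap per hour and larger ones become progressively more expensive. The task list and per-hour values below are hypothetical, not taken from the paper.

```python
# Toy crew-time demand curve built from a priority-ordered task list. An extra
# requirement displaces the lowest-value scheduled work first, so the average
# cost per hour rises as the requirement grows. Task values are hypothetical.

tasks = [("critical operations", 6, 1000.0),   # (task, hours, value per hour)
         ("science",             4,  400.0),
         ("maintenance",         3,  150.0),
         ("housekeeping",        2,   50.0)]   # listed from highest priority down

def cost_of_extra_requirement(extra_hours):
    """Value of the lowest-priority work displaced by an extra requirement."""
    displaced, remaining = 0.0, extra_hours
    for _, hours, value in reversed(tasks):    # displace the cheapest work first
        take = min(hours, remaining)
        displaced += take * value
        remaining -= take
        if remaining <= 0:
            break
    return displaced

for h in (0.5, 2, 5, 8):
    c = cost_of_extra_requirement(h)
    print(f"{h:>4} extra hours -> displaced value ${c:,.0f} (avg ${c / h:,.0f}/hr)")
```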

  19. 24 CFR 886.330 - Work write-ups and cost estimates.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... cost estimate to complete rehabilitation. The cost of any necessary relocation, as determined by HUD as... costs allowable by HUD will be included in the cost estimate. The work write-up and cost estimate shall... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Work write-ups and cost...

  20. Accurate Visual Heading Estimation at High Rotation Rate Without Oculomotor or Static-Depth Cues

    NASA Technical Reports Server (NTRS)

    Stone, Leland S.; Perrone, John A.; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    It has been claimed that either oculomotor or static-depth cues provide the signals about self-rotation necessary for accurate heading estimation at rotation rates above approximately 1 deg/s. We tested this hypothesis by simulating self-motion along a curved path with the eyes fixed in the head (plus or minus 16 deg/s of rotation). Curvilinear motion offers two advantages: 1) heading remains constant in retinotopic coordinates, and 2) there is no visual-oculomotor conflict (both actual and simulated eye position remain stationary). We simulated 400 ms of rotation combined with 16 m/s of translation at fixed angles with respect to gaze towards two vertical planes of random dots initially 12 and 24 m away, with a field of view of 45 degrees. Four subjects were asked to fixate a central cross and to respond whether they were translating to the left or right of straight-ahead gaze. From the psychometric curves, heading bias (mean) and precision (semi-interquartile) were derived. The mean bias over 2-5 runs was 3.0, 4.0, -2.0, -0.4 deg for the first author and three naive subjects, respectively (positive indicating towards the rotation direction). The mean precision was 2.0, 1.9, 3.1, 1.6 deg, respectively. The ability of observers to make relatively accurate and precise heading judgments, despite the large rotational flow component, refutes the view that extra-flow-field information is necessary for human visual heading estimation at high rotation rates. Our results support models that process combined translational/rotational flow to estimate heading, but should not be construed to suggest that other cues do not play an important role when they are available to the observer.

  1. 16 CFR 305.5 - Determinations of estimated annual energy consumption, estimated annual operating cost, and...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... consumption, estimated annual operating cost, and energy efficiency rating, and of water use rate. 305.5... energy efficiency rating, and of water use rate. Link to an amendment published at 75 FR 41713, July 19... operating costs, the energy efficiency ratings, and the efficacy factors of the following covered...

  2. Verify by Genability - Providing Solar Customers with Accurate Reports of Utility Bill Cost Savings

    SciTech Connect

    2015-12-01

    The National Renewable Energy Laboratory (NREL), partnering with Genability and supported by the U.S. Department of Energy's SunShot Incubator program, independently verified the accuracy of Genability's monthly cost-savings estimates.

  3. An accurate cost effective DFT approach to study the sensing behaviour of polypyrrole towards nitrate ions in gas and aqueous phases.

    PubMed

    Wasim, Fatima; Mahmood, Tariq; Ayub, Khurshid

    2016-07-28

    Density functional theory (DFT) calculations have been performed to study the response of polypyrrole towards nitrate ions in gas and aqueous phases. First, an accurate estimate of interaction energies is obtained by methods calibrated against the gold standard CCSD(T) method. Then, a number of low cost DFT methods are also evaluated for their ability to accurately estimate the binding energies of polymer-nitrate complexes. The low cost methods evaluated here include dispersion corrected potential (DCP), Grimme's D3 correction, counterpoise correction of the B3LYP method, and Minnesota functionals (M05-2X). The interaction energies calculated using the counterpoise (CP) correction and DCP methods at the B3LYP level are in better agreement with the interaction energies calculated using the calibrated methods. The interaction energies of an infinite polymer (polypyrrole) with nitrate ions are calculated by a variety of low cost methods in order to find the associated errors. The electronic and spectroscopic properties of polypyrrole oligomers nPy (where n = 1-9) and nPy-NO3(-) complexes are calculated, and then extrapolated for an infinite polymer through a second degree polynomial fit. Charge analysis, frontier molecular orbital (FMO) analysis and density of states studies also reveal the sensing ability of polypyrrole towards nitrate ions. Interaction energies, charge analysis and density of states analyses illustrate that the response of polypyrrole towards nitrate ions is considerably reduced in the aqueous medium (compared to the gas phase). PMID:27375267

  4. Estimating resource use and cost of prophylactic management of neutropenia with filgrastim.

    PubMed

    Annemans, L; Van Overbeke, N; Standaert, B; Van Belle, S

    2005-05-01

    The study objective is to develop a methodology for the measurement of time, resource use and cost of the prophylactic management of neutropenia with filgrastim in different settings where the drug is routinely used: in-hospital care, outpatient care and home care. The activity-based costing method is used to analyse the cost of prophylactically managing neutropenia and comprises four steps. First, department heads in each of the chosen settings were selected and interviewed to obtain key elements in the workflow that involves the management of neutropenia, followed by the second step involving in-depth, structured interviews of key personnel. The third step was the measurement of the time required for frequently occurring activities in monitoring neutropenia and the administration of filgrastim by a study nurse. Finally, information on resource unit costs and personnel salaries was collected from the administration units to calculate an average cost. Sensitivity analyses were undertaken on estimated variables in the study. A list of eight to 14 consecutive activities linked to the prophylactic management of neutropenia was observed. The number and type of activities do not differ between an in-hospital oncology ward and an outpatient setting except for blood samplings. The difference is more pronounced between hospital and home care settings, as in the latter the patient performs many of the activities him/herself. The cost estimate per setting for prophylactic drug use is 6.30 Euros for in-hospital care, 3.67 Euros for outpatient care and 5.49 Euros for home care. Taking the two most frequently occurring scenarios per chemotherapy cycle (i.e. with or without febrile neutropenia), the following cost estimates are obtained: 60.41 Euros for a patient with febrile neutropenia and 56.77 Euros for a patient without febrile neutropenia, excluding drug costs. With the activity-based costing method it is possible to accurately demonstrate cost savings in the management
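    The activity-based costing step reduces to summing, over the observed activities, the product of activity time and personnel rate, plus material unit costs. The sketch below shows that arithmetic with hypothetical activities, times, and rates; it does not reproduce the study's measured values.

```python
# Minimal activity-based costing sketch: the cost per prophylactic filgrastim
# administration is the sum over observed activities of (time x personnel rate)
# plus material unit costs. Activities, times, and rates are hypothetical.

activities = [  # (activity, minutes, personnel rate in EUR/hour)
    ("prepare dose",              4.0, 45.0),
    ("administer injection",      3.0, 45.0),
    ("document administration",   2.0, 45.0),
    ("monitoring / blood sample", 6.0, 45.0),
]
materials_eur = 1.20  # syringe, swabs, etc. (hypothetical)

personnel_eur = sum(minutes / 60.0 * rate for _, minutes, rate in activities)
print(f"cost per administration (excluding drug): EUR {personnel_eur + materials_eur:.2f}")
```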

  5. 16 CFR 305.5 - Determinations of estimated annual energy consumption, estimated annual operating cost, and...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Determinations of estimated annual energy consumption, estimated annual operating cost, and energy efficiency rating, water use rate, and other required disclosure content. 305.5 Section 305.5 Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS ENERGY...

  6. Building of an experimental cline with Arabidopsis thaliana to estimate herbicide fitness cost.

    PubMed

    Roux, Fabrice; Giancola, Sandra; Durand, Stéphanie; Reboud, Xavier

    2006-06-01

    Various management strategies aim at maintaining pesticide resistance frequency under a threshold value by taking advantage of the benefit of the fitness penalty (the cost) expressed by the resistance allele outside the treated area or during the pesticide selection "off years." One method to estimate a fitness cost is to analyze the resistance allele frequency along transects across treated and untreated areas. On the basis of the shape of the cline, this method gives the relative contributions of both gene flow and the fitness difference between genotypes in the treated and untreated areas. Taking advantage of the properties of such migration-selection balance, an artificial cline was built up to optimize the conditions where the fitness cost of two herbicide-resistant mutants (acetolactate synthase and auxin-induced target genes) in the model species Arabidopsis thaliana could be more accurately measured. The analysis of the microevolutionary dynamics in these experimental populations indicated mean fitness costs of approximately 15 and 92% for the csr1-1 and axr2-1 resistances, respectively. In addition, negative frequency dependence for the fitness cost was also detected for the axr2-1 resistance. The advantages and disadvantages of the cline approach are discussed in regard to other methods of cost estimation. This comparison highlights the powerful ability of an experimental cline to measure low fitness costs and detect sensibility to frequency-dependent variations. PMID:16582450

  7. Building of an Experimental Cline With Arabidopsis thaliana to Estimate Herbicide Fitness Cost

    PubMed Central

    Roux, Fabrice; Giancola, Sandra; Durand, Stéphanie; Reboud, Xavier

    2006-01-01

    Various management strategies aim at maintaining pesticide resistance frequency under a threshold value by taking advantage of the benefit of the fitness penalty (the cost) expressed by the resistance allele outside the treated area or during the pesticide selection “off years.” One method to estimate a fitness cost is to analyze the resistance allele frequency along transects across treated and untreated areas. On the basis of the shape of the cline, this method gives the relative contributions of both gene flow and the fitness difference between genotypes in the treated and untreated areas. Taking advantage of the properties of such migration–selection balance, an artificial cline was built up to optimize the conditions where the fitness cost of two herbicide-resistant mutants (acetolactate synthase and auxin-induced target genes) in the model species Arabidopsis thaliana could be more accurately measured. The analysis of the microevolutionary dynamics in these experimental populations indicated mean fitness costs of ∼15 and 92% for the csr1-1 and axr2-1 resistances, respectively. In addition, negative frequency dependence for the fitness cost was also detected for the axr2-1 resistance. The advantages and disadvantages of the cline approach are discussed in regard to other methods of cost estimation. This comparison highlights the powerful ability of an experimental cline to measure low fitness costs and detect sensibility to frequency-dependent variations. PMID:16582450

  8. Estimating the Deep Space Network modification costs to prepare for future space missions by using major cost drivers

    NASA Technical Reports Server (NTRS)

    Remer, Donald S.; Sherif, Josef; Buchanan, Harry R.

    1993-01-01

    This paper develops a cost model to do long range planning cost estimates for Deep Space Network (DSN) support of future space missions. The paper focuses on the costs required to modify and/or enhance the DSN to prepare for future space missions. The model is a function of eight major mission cost drivers and estimates both the total cost and the annual costs of a similar future space mission. The model is derived from actual cost data from three space missions: Voyager (Uranus), Voyager (Neptune), and Magellan. Estimates derived from the model are tested against actual cost data for two independent missions, Viking and Mariner Jupiter/Saturn (MJS).

  9. Skin Temperature Over the Carotid Artery, an Accurate Non-invasive Estimation of Near Core Temperature

    PubMed Central

    Imani, Farsad; Karimi Rouzbahani, Hamid Reza; Goudarzi, Mehrdad; Tarrahi, Mohammad Javad; Ebrahim Soltani, Alireza

    2016-01-01

    Background: During anesthesia, continuous body temperature monitoring is essential, especially in children. Anesthesia can increase the risk of loss of body temperature by three to four times. Hypothermia in children results in increased morbidity and mortality. Since the measurement points of the core body temperature are not easily accessible, near-core sites, such as the rectum, are used. Objectives: The purpose of this study was to measure skin temperature over the carotid artery and compare it with the rectal temperature, in order to propose a model for accurate estimation of near-core body temperature. Patients and Methods: A total of 124 patients aged 2 - 6 years, undergoing elective surgery, were selected. Temperature of the rectum and of the skin over the carotid artery was measured. Then, the patients were randomly divided into two groups (each including 62 subjects), namely the modeling (MG) and validation (VG) groups. First, in the modeling group, the average temperatures of the rectum and of the skin over the carotid artery were measured separately. The appropriate model was determined according to the significance of the model's coefficients. The obtained model was used to predict the rectal temperature in the second group (VG group). Correlation of the predicted values with the real values (the measured rectal temperature) in the second group was investigated. Also, the difference in the average values of these two groups was examined for significance. Results: In the modeling group, the average rectal and carotid temperatures were 36.47 ± 0.54°C and 35.45 ± 0.62°C, respectively. The final model was: rectal temperature = 0.561 × carotid skin temperature + 16.583. The predicted value was calculated based on the regression model and then compared with the measured rectal value, which showed no significant difference (P = 0.361). Conclusions: The present study was the first research, in which rectum temperature was compared with that
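    The reported regression is straightforward to apply; the sketch below simply evaluates the published coefficients for a few carotid skin temperatures (illustrative inputs, not study data).

```python
# Applying the regression model reported above:
#   rectal temperature = 0.561 x carotid skin temperature + 16.583  (deg C)

def estimate_rectal_temp(carotid_skin_temp_c):
    return 0.561 * carotid_skin_temp_c + 16.583

for t in (34.5, 35.45, 36.5):
    print(f"carotid skin {t:.2f} C -> estimated rectal {estimate_rectal_temp(t):.2f} C")
```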

  10. Cost Estimation of Laser Additive Manufacturing of Stainless Steel

    NASA Astrophysics Data System (ADS)

    Piili, Heidi; Happonen, Ari; Väistö, Tapio; Venkataramanan, Vijaikrishnan; Partanen, Jouni; Salminen, Antti

    Laser additive manufacturing (LAM) is a layer-wise fabrication method in which a laser beam melts metallic powder to form solid objects. Although 3D printing was invented 30 years ago, its industrial use is still quite limited, whereas the introduction of cheap consumer 3D printers in recent years has popularized the technology. Interest is focusing more and more on the manufacturing of functional parts. The aim of this study is to define and discuss the current economic opportunities and restrictions of the LAM process. Manufacturing costs were studied for different build scenarios, each with a cost structure estimated from the calculated build time and from the machine, material, and energy costs under optimized machine utilization. All manufacturing and time simulations in this study were carried out with a research machine equivalent to commercial EOS M series equipment. The study shows that the main expense in LAM is the investment cost of the LAM machine, compared to which the relative proportions of the energy and material costs are very low. The manufacturing time per part is therefore the key factor in optimizing the costs of LAM.
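    The conclusion that machine amortization dominates and that build time per part drives the cost can be illustrated with a simple hourly-rate split. Every number in the sketch below is a hypothetical placeholder, not a value from the study.

```python
# Toy cost breakdown for one LAM part, showing why machine amortization
# dominates and why build time per part is the key cost driver.
# All parameter values are hypothetical.

machine_price_eur   = 500_000.0
machine_life_years  = 7.0
utilization_h_per_y = 5_000.0      # optimized machine utilization
machine_rate_eur_h  = machine_price_eur / (machine_life_years * utilization_h_per_y)

material_eur_per_kg = 80.0
energy_eur_per_kwh  = 0.15
average_power_kw    = 0.4          # laser plus auxiliaries

def part_cost(build_hours, part_mass_kg):
    machine  = machine_rate_eur_h * build_hours
    material = material_eur_per_kg * part_mass_kg
    energy   = energy_eur_per_kwh * average_power_kw * build_hours
    return machine, material, energy

machine, material, energy = part_cost(build_hours=10.0, part_mass_kg=0.5)
print(f"machine {machine:.0f} EUR, material {material:.0f} EUR, energy {energy:.2f} EUR")
```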

  11. Cost estimate for muddy water palladium production facility at Mound

    SciTech Connect

    McAdams, R.K.

    1988-11-30

    An economic feasibility study was performed on the "Muddy Water" low-chlorine content palladium powder production process developed by Mound. The total capital investment and total operating costs (dollars per gram) were determined for production batch sizes of 1--10 kg in 1-kg increments. The report includes a brief description of the Muddy Water process, the process flow diagram, and material balances for the various production batch sizes. Two types of facilities were evaluated--one for production of new, "virgin" palladium powder, and one for recycling existing material. The total capital investment for virgin facilities ranged from $600,000 to $1.3 million for production batch sizes of 1--10 kg, respectively. The range for recycle facilities was $1 million to $2.3 million. The total operating cost for 100% acceptable powder production in the virgin facilities ranged from $23 per gram for a 1-kg production batch size to $8 per gram for a 10-kg batch size. Similarly for recycle facilities, the total operating cost ranged from $34 per gram to $5 per gram. The total operating cost versus product acceptability (ranging from 50%--100% acceptability) was also evaluated for both virgin and recycle facilities. Because production sizes studied vary widely and because scale-up factors are unknown for batch sizes greater than 1 kg, all costs are "order-of-magnitude" estimates. All costs reported are in 1987 dollars.

  12. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    SciTech Connect

    Bierschbach, M.C.

    1994-12-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the U.S. Nuclear Regulatory Commission (NRC) for review decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  13. Estimating pressurized water reactor decommissioning costs: A user's manual for the PWR Cost Estimating Computer Program (CECP) software. Draft report for comment

    SciTech Connect

    Bierschbach, M.C.; Mencinsky, G.J.

    1993-10-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the US Nuclear Regulatory Commission (NRC) for review decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning PWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  14. Estimating boiling water reactor decommissioning costs: A user's manual for the BWR Cost Estimating Computer Program (CECP) software. Final report

    SciTech Connect

    Bierschbach, M.C.

    1996-06-01

    Nuclear power plant licensees are required to submit to the US Nuclear Regulatory Commission (NRC) for review their decommissioning cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning boiling water reactor (BWR) power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  15. Man power/cost estimation model: Automated planetary projects

    NASA Technical Reports Server (NTRS)

    Kitchen, L. D.

    1975-01-01

    A manpower/cost estimation model is developed that is based on a detailed level of financial analysis of over 30 million raw data points, which are then compacted by more than three orders of magnitude to the level at which the model is applicable. The major parameter of expenditure is manpower (specifically direct labor hours) for all spacecraft subsystem and technical support categories. The resultant model is able to provide a mean absolute error of less than fifteen percent for the eight programs comprising the model data base. The model includes cost-saving inheritance factors, broken down into four levels, for estimating follow-on-type programs where hardware and design inheritance are evident or expected.

  16. Decommissioning Cost Estimating Factors And Earned Value Integration

    SciTech Connect

    Sanford, P.C.; Cimmarron, E.

    2008-07-01

    The Rocky Flats 771 Project progressed from the planning stage of decommissioning a plutonium facility, through the strip-out of highly-contaminated equipment, removal of utilities and structural decontamination, and building demolition. Actual cost data was collected from the strip-out activities and compared to original estimates, allowing the development of cost by equipment groupings and types and over time. Separate data was developed from the project control earned value reporting and compared with the equipment data. The paper discusses the analysis to develop the detailed factors for the different equipment types, and the items that need to be considered during characterization of a similar facility when preparing an estimate. The factors are presented based on direct labor requirements by equipment type. The paper also includes actual support costs, and examples of fixed or one-time start-up costs. The integration of the estimate and the earned value system used for the 771 Project is also discussed. The paper covers the development of the earned value system as well as its application to a facility to be decommissioned and an existing work breakdown structure. Lessons learned are provided, including integration with scheduling and craft supervision, measurement approaches, and verification of scope completion. In summary: The work of decommissioning the Rocky Flats 771 Project process equipment was completed in 2003. Early in the planning process, we had difficulty in identifying credible data and implementing processes for estimating and controlling this work. As the project progressed, we were able to collect actual data on the costs of removing plutonium contaminated equipment from various areas over the life of this work and associate those costs with individual pieces of equipment. We also were able to develop and test out a system for measuring the earned value of a decommissioning project based on an evolving estimate. These were elements that

  17. Fuzzy/Neural Software Estimates Costs of Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Douglas, Freddie; Bourgeois, Edit Kaminsky

    2005-01-01

    The Highly Accurate Cost Estimating Model (HACEM) is a software system for estimating the costs of testing rocket engines and components at Stennis Space Center. HACEM is built on a foundation of adaptive-network-based fuzzy inference systems (ANFIS), a hybrid software concept that combines the adaptive capabilities of neural networks with the ease of development and additional benefits of fuzzy-logic-based systems. In ANFIS, fuzzy inference systems are trained by use of neural networks. HACEM includes selectable subsystems that utilize various numbers and types of inputs, various numbers of fuzzy membership functions, and various input-preprocessing techniques. The inputs to HACEM are parameters of specific tests or series of tests. These parameters include test type (component or engine test), number and duration of tests, and thrust level(s) (in the case of engine tests). The ANFIS in HACEM are trained by use of sets of these parameters, along with costs of past tests. Thereafter, the user feeds HACEM a simple input text file that contains the parameters of a planned test or series of tests, the user selects the desired HACEM subsystem, and the subsystem processes the parameters into an estimate of cost(s).
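    At its core, ANFIS tunes a first-order Sugeno fuzzy inference system: Gaussian-type memberships weight linear rule consequents, and the estimate is their weighted average. The sketch below shows only that inference step (no training); the two rules and all parameters are invented for illustration and are not HACEM's.

```python
import math

# Minimal first-order Takagi-Sugeno fuzzy inference for a cost estimate. ANFIS
# tunes exactly this kind of structure (membership parameters and linear rule
# consequents) from historical test costs; the two rules and every number
# below are made up for illustration and are not HACEM's rules.

def gauss(x, mean, sigma):
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2)

def estimate_cost(test_hours):
    # Rule 1: if duration is "short" then cost = 40 * hours + 200  (k$)
    # Rule 2: if duration is "long"  then cost = 25 * hours + 800  (k$)
    w1 = gauss(test_hours, mean=10.0, sigma=8.0)
    w2 = gauss(test_hours, mean=60.0, sigma=25.0)
    c1 = 40.0 * test_hours + 200.0
    c2 = 25.0 * test_hours + 800.0
    return (w1 * c1 + w2 * c2) / (w1 + w2)     # rule-weighted average

for h in (5, 20, 60):
    print(f"{h:>3} test hours -> estimated cost ~{estimate_cost(h):.0f} k$")
```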

  18. Development of hybrid lifecycle cost estimating tool (HLCET) for manufacturing influenced design tradeoff

    NASA Astrophysics Data System (ADS)

    Sirirojvisuth, Apinut

    concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher fidelity tradeoffs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM result of the composite part production cost. The reason for the replacement is that TCM estimates production costs from part weights as a result of subtractive manufacturing of metallic origin such as casting, forging, and machining processes. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called the Maynard Operation Sequence Technique (MOST) to be used, similarly to the Activity-Based Costing (ABC) approach, to estimate the manufacturing time of a part by breaking down the operations that occur during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to using a traditional method of time estimation by analogy or using response surface equations from historical process data. The MOST concept provides a tailored study of an individual process typically required for a new, innovative design. Nevertheless, the MOST idea has some challenges, one of which is its requirement to build a new process from the ground up. The process development requires a Subject Matter Expert (SME) in the manufacturing method of the particular design. The SME must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside of the design discipline and a priori training in MOST. To relieve the constraint, this study includes an entirely new sub-system architecture

  19. 48 CFR 1852.216-85 - Estimated cost and award fee.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Estimated cost and award... and Clauses 1852.216-85 Estimated cost and award fee. As prescribed in 1816.406-70(e), insert the following clause: Estimated Cost and Award Fee (SEP 1993) The estimated cost of this contract is $___....

  20. 48 CFR 1852.216-74 - Estimated cost and fixed fee.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Estimated cost and fixed... and Clauses 1852.216-74 Estimated cost and fixed fee. As prescribed in 1816.307-70(b), insert the following clause: Estimated Cost and Fixed Fee (DEC 1991) The estimated cost of this contract...

  1. Methods for cost estimation in software project management

    NASA Astrophysics Data System (ADS)

    Briciu, C. V.; Filip, I.; Indries, I. I.

    2016-02-01

    The speed at which the processes used in the software development field have changed makes forecasting the overall costs of a software project very difficult. Many researchers have considered this task unachievable, but a group of scientists holds that it can be solved using well-known mathematical methods (e.g., multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building a cost estimation model for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. The first part of the paper summarizes the major achievements in the research area of estimating overall project costs, together with a description of existing software development process models. The last part proposes a basic mathematical model based on genetic programming, including a description of the chosen fitness function and chromosome representation. The perspective of the described model is linked with the current reality of software development, taking the software product life cycle and the current challenges and innovations in the field as its basis. Based on the authors' experience and an analysis of existing models and product life cycles, it is concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
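    For context, the COCOMO 81 model being calibrated has a simple closed form. The sketch below evaluates Basic COCOMO in "organic" mode with its standard published coefficients; in the approach described above, a genetic algorithm would instead search for coefficients that minimize error on the PROMISE data.

```python
# Basic COCOMO 81 ("organic" mode) as a worked example of the model form being
# calibrated; a genetic algorithm, as in the paper, would search for the
# coefficients that minimize estimation error on the PROMISE / COCOMO 81 data.

def basic_cocomo_organic(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
    effort_pm = a * kloc ** b           # effort in person-months
    schedule_m = c * effort_pm ** d     # development time in months
    return effort_pm, schedule_m

for kloc in (10, 50, 100):
    effort, months = basic_cocomo_organic(kloc)
    print(f"{kloc:>4} KLOC -> {effort:6.1f} person-months over {months:4.1f} months")
```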

  2. Cost estimate for a proposed GDF Suez LNG testing program

    SciTech Connect

    Blanchat, Thomas K.; Brady, Patrick Dennis; Jernigan, Dann A.; Luketa, Anay Josephine; Nissen, Mark R.; Lopez, Carlos; Vermillion, Nancy; Hightower, Marion Michael

    2014-02-01

    At the request of GDF Suez, a Rough Order of Magnitude (ROM) cost estimate was prepared for the design, construction, testing, and data analysis for an experimental series of large-scale liquefied natural gas (LNG) spills on land and water that would result in the largest pool fires and vapor dispersion events ever conducted. Due to the expected cost of this large, multi-year program, the authors utilized Sandia's structured cost estimating methodology. This methodology ensures that the efforts identified can be performed for the cost proposed at a plus or minus 30 percent confidence. The scale of the LNG spill, fire, and vapor dispersion tests proposed by GDF could produce hazard distances and testing safety issues that need to be fully explored. Based on our evaluations, Sandia can utilize much of our existing fire testing infrastructure for the large fire tests and some small dispersion tests (with some modifications) in Albuquerque, but we propose to develop a new dispersion testing site at our remote test area in Nevada because of the large hazard distances. While this might impact some testing logistics, the safety aspects warrant this approach. In addition, we have included a proposal to study cryogenic liquid spills on water and subsequent vaporization in the presence of waves. Sandia is working with DOE on applications that provide infrastructure pertinent to wave production. We present an approach to conduct repeatable wave/spill interaction testing that could utilize such infrastructure.

  3. Estimated incident cost savings in shipping due to inspections.

    PubMed

    Knapp, Sabine; Bijwaard, Govert; Heij, Christiaan

    2011-07-01

    The effectiveness of safety inspections of ships has been analysed from various angles, but until now, relatively little attention has been given to translating risk reduction into incident cost savings. This paper provides a monetary quantification of the cost savings that can be attributed to port state control inspections and industry vetting inspections. The dataset consists of more than half a million ship arrivals between 2002 and 2007 and contains inspections of port state authorities in the USA and Australia and of three industry vetting regimes. The effect of inspections in reducing the risk of total loss accidents is estimated by means of duration models, in terms of the gained probability of survival. The monetary benefit of port state control inspections is estimated to range, on average, from about 70 to 190 thousand dollars, with median values ranging from about 20 to 45 thousand dollars. Industry inspections have even higher benefits, especially for tankers. The savings are in general higher for older and larger vessels, and also for vessels with undefined flag and unknown classification society. As inspection costs are relatively low in comparison to potential cost savings, the results underline the importance of identifying ships with relatively high risk of total loss. PMID:21545887
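    The monetization step reduces to multiplying the gained survival probability by the loss incurred if the vessel is lost. The sketch below shows that arithmetic with hypothetical probabilities and an assumed ship value; the paper's duration models, which produce the probabilities, are not reproduced.

```python
# Monetizing an inspection as (gain in survival probability) x (loss given a
# total-loss accident). The probabilities and ship value below are hypothetical.

def inspection_benefit(p_loss_uninspected, p_loss_inspected, total_loss_value):
    return (p_loss_uninspected - p_loss_inspected) * total_loss_value

benefit = inspection_benefit(p_loss_uninspected=0.004,
                             p_loss_inspected=0.002,
                             total_loss_value=40_000_000.0)   # US$
print(f"expected saving per inspected arrival: ${benefit:,.0f}")
```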

  4. Using average cost methods to estimate encounter-level costs for medical-surgical stays in the VA.

    PubMed

    Wagner, Todd H; Chen, Shuo; Barnett, Paul G

    2003-09-01

    The U.S. Department of Veterans Affairs (VA) maintains discharge abstracts, but these do not include cost information. This article describes the methods the authors used to estimate the costs of VA medical-surgical hospitalizations in fiscal years 1998 to 2000. They estimated a cost regression with 1996 Medicare data restricted to veterans receiving VA care in an earlier year. The regression accounted for approximately 74 percent of the variance in cost-adjusted charges, and it proved to be robust to outliers and the year of input data. The beta coefficients from the cost regression were used to impute costs of VA medical-surgical hospital discharges. The estimated aggregate costs were reconciled with VA budget allocations. In addition to the direct medical costs, their cost estimates include indirect costs and physician services; both of these were allocated in proportion to direct costs. They discuss the method's limitations and application in other health care systems. PMID:15095543
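    A minimal sketch of the impute-then-reconcile idea described above: regression coefficients are applied to each stay's covariates, and the resulting costs are rescaled so their total matches the budget allocation. The coefficients, covariates, and budget figure are hypothetical.

```python
import numpy as np

# Sketch of the impute-then-reconcile idea: apply regression coefficients to
# each discharge to get a relative cost, then rescale so the total matches a
# known budget allocation. Coefficients and covariates here are hypothetical.

beta = np.array([2_000.0, 900.0, 1_500.0])   # intercept, per-day, surgery flag
X = np.array([[1.0,  3.0, 0.0],              # one row per hospital stay
              [1.0, 10.0, 1.0],
              [1.0,  5.0, 1.0]])

raw_cost = X @ beta                          # regression-imputed costs
budget_total = 40_000.0                      # aggregate budget allocation
reconciled = raw_cost * (budget_total / raw_cost.sum())

print("imputed:", raw_cost, "reconciled:", reconciled.round(0))
```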

  5. Cost estimation of timber bridges using neural networks

    SciTech Connect

    Creese, R.C.; Li, L.

    1995-05-01

    Neural network models, or more simply "neural nets," have great potential application in speech and image recognition. They also have great potential for cost estimating. Neural networks are particularly effective for complex estimation where the relationship between the output and the input cannot be expressed by simple mathematical relationships. A neural network method was applied to the cost estimation of timber bridges to illustrate the technique. The results of the neural network method were evaluated by the coefficient of determination (the R-squared value) for the key input variables. A comparison of the neural network results and the standard linear regression results was performed on the timber bridge data. A step-by-step validation is presented to make it easy to understand the application of neural networks to this estimation process. The input is propagated from the input layer through each layer until an output is generated. The output is compared with the desired output, and the error for each node in the output layer is determined. The error is transmitted backward (thus the phrase "back propagation") from the output layer to the intermediate layers and then to the input layer. Based upon the errors, the weights are adjusted and the procedure is repeated. The number of training cycles is 15,000 to 50,000 for simple networks, but this usually takes only a few minutes on a personal computer. 7 refs., 4 figs., 11 tabs.
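    The forward/backward procedure described above can be shown end to end with a minimal network. The sketch below trains a one-hidden-layer regression by back propagation on synthetic data (not the timber-bridge dataset); the architecture and learning rate are arbitrary choices.

```python
import numpy as np

# Minimal one-hidden-layer network trained by back propagation for a scalar
# cost regression, mirroring the forward/backward procedure described above.
# The data here are synthetic, not the timber-bridge dataset.

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 3))                 # 3 normalized inputs
y = (2.0 * X[:, 0] + X[:, 1] * X[:, 2])[:, None]         # synthetic "cost"

W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1

for _ in range(20_000):                                  # training cycles
    h = np.tanh(X @ W1 + b1)                             # forward pass
    pred = h @ W2 + b2
    err = pred - y                                       # output-layer error
    dW2 = h.T @ err / len(X); db2 = err.mean(axis=0)     # backward pass
    dh = (err @ W2.T) * (1.0 - h ** 2)
    dW1 = X.T @ dh / len(X); db1 = dh.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2                       # weight updates
    W1 -= lr * dW1; b1 -= lr * db1

ss_res = float(((pred - y) ** 2).sum())
ss_tot = float(((y - y.mean()) ** 2).sum())
print(f"R-squared on the training data: {1.0 - ss_res / ss_tot:.3f}")
```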

  6. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    NASA Technical Reports Server (NTRS)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo
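
    For reference, the Basic COCOMO model mentioned above reduces to two power-law formulas. A small sketch using the standard published coefficients follows; the 40 KLOC project size is an illustrative assumption, and this is not the COSTMODL implementation itself.

      # Basic COCOMO: effort (person-months) = a * KLOC**b,
      # schedule (months) = c * effort**d, with mode-dependent constants.
      COEFFS = {
          "organic":       (2.4, 1.05, 2.5, 0.38),
          "semi-detached": (3.0, 1.12, 2.5, 0.35),
          "embedded":      (3.6, 1.20, 2.5, 0.32),
      }

      def basic_cocomo(kloc: float, mode: str = "organic"):
          a, b, c, d = COEFFS[mode]
          effort = a * kloc ** b           # person-months
          schedule = c * effort ** d       # calendar months
          staffing = effort / schedule     # average number of people
          return effort, schedule, staffing

      effort, schedule, staffing = basic_cocomo(40.0, "semi-detached")
      print(f"effort ≈ {effort:.0f} PM, schedule ≈ {schedule:.0f} months, "
            f"average staff ≈ {staffing:.1f}")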

  7. Cost-effective accurate coarse-grid method for highly convective multidimensional unsteady flows

    NASA Technical Reports Server (NTRS)

    Leonard, B. P.; Niknafs, H. S.

    1991-01-01

    A fundamentally multidimensional convection scheme is described based on vector transient interpolation modeling rewritten in conservative control-volume form. Vector third-order upwinding is used as the basis of the algorithm; this automatically introduces important cross-difference terms that are absent from schemes using component-wise one-dimensional formulas. Third-order phase accuracy is good; this is important for coarse-grid large-eddy or full simulation. Potential overshoots or undershoots are avoided by using a recently developed universal limiter. Higher order accuracy is obtained locally, where needed, by the cost-effective strategy of adaptive stencil expansion in a direction normal to each control-volume face; this is controlled by monitoring the absolute normal gradient and curvature across the face. Higher (than third) order cross-terms do not appear to be needed. Since the wider stencil is used only in isolated narrow regions (near discontinuities), extremely high (in this case, seventh) order accuracy can be achieved for little more than the cost of a globally third-order scheme.

  8. Estimating cost ratio distribution between fatal and non-fatal road accidents in Malaysia

    NASA Astrophysics Data System (ADS)

    Hamdan, Nurhidayah; Daud, Noorizam

    2014-07-01

    Road traffic crashes are a global major problem and should be treated as a shared responsibility. In Malaysia, road accident tragedies killed 6,917 people and injured or disabled 17,522 people in 2012, and the government spent about RM9.3 billion in 2009; the losses reported annually cost the nation approximately 1 to 2 percent of gross domestic product (GDP). The current cost ratio for fatal and non-fatal accidents used by the Ministry of Works Malaysia is simply based on an arbitrary value of 6:4 (equivalently 1.5:1), reflecting the fact that six factors are involved in calculating the accident cost for a fatal accident while four factors are involved for a non-fatal accident. The simple indication used by the authority to calculate the cost ratio is doubtful since there is a lack of mathematical and conceptual evidence to explain how this ratio is determined. The main aim of this study is to determine a new accident cost ratio for fatal and non-fatal accidents in Malaysia based on a quantitative statistical approach. The cost ratio distributions are estimated based on the Weibull distribution. Due to the unavailability of official accident cost data, insurance claim data for both fatal and non-fatal accidents have been used as proxy information for the actual accident cost. Two types of parameter estimates are used in this study: maximum likelihood estimation (MLE) and robust estimation. The findings of this study reveal that the accident cost ratio for fatal and non-fatal claims is 1.33 when using MLE, while for robust estimation the cost ratio is slightly higher at 1.51. This study will help the authority to determine a more accurate cost ratio between fatal and non-fatal accidents as compared to the official ratio set by the government, since the cost ratio is an important element used as a weightage in modeling road accident related data. Therefore, this study provides some guidance tips to revise the insurance claim set by the Malaysia road authority, hence the appropriate method
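
    A minimal sketch of the estimation idea, assuming fatal and non-fatal claims are each fitted with a two-parameter Weibull by maximum likelihood and the cost ratio is taken as the ratio of the fitted means; the claim samples and parameters below are synthetic, not the study's data.

      import numpy as np
      from math import gamma
      from scipy import stats

      rng = np.random.default_rng(2)

      # Synthetic claim amounts (proxy for accident costs), in RM thousands.
      fatal = stats.weibull_min.rvs(c=1.4, scale=120, size=400, random_state=rng)
      nonfatal = stats.weibull_min.rvs(c=1.6, scale=85, size=2000, random_state=rng)

      def weibull_mean(sample):
          # Fit shape c and scale by MLE (location fixed at 0), then use
          # mean = scale * Gamma(1 + 1/shape).
          c, loc, scale = stats.weibull_min.fit(sample, floc=0)
          return scale * gamma(1 + 1 / c)

      ratio = weibull_mean(fatal) / weibull_mean(nonfatal)
      print(f"estimated fatal : non-fatal cost ratio ≈ {ratio:.2f}")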

  9. Evaluation of a low-cost and accurate ocean temperature logger on subsurface mooring systems

    SciTech Connect

    Tian, Chuan; Deng, Zhiqun; Lu, Jun; Xu, Xiaoyang; Zhao, Wei; Xu, Ming

    2014-06-23

    Monitoring seawater temperature is important to understanding evolving ocean processes. To monitor internal waves or ocean mixing, a large number of temperature loggers are typically mounted on subsurface mooring systems to obtain high-resolution temperature data at different water depths. In this study, we redesigned and evaluated a compact, low-cost, self-contained, high-resolution and high-accuracy ocean temperature logger, TC-1121. The newly designed TC-1121 loggers are smaller, more robust, and their sampling intervals can be automatically changed by indicated events. They have been widely used in many mooring systems to study internal wave and ocean mixing. The logger’s fundamental design, noise analysis, calibration, drift test, and a long-term sea trial are discussed in this paper.

  10. Accurately measuring volume of soil samples using low cost Kinect 3D scanner

    NASA Astrophysics Data System (ADS)

    van der Sterre, Boy-Santhos; Hut, Rolf; van de Giesen, Nick

    2013-04-01

    The 3D scanner of the Kinect game controller can be used to increase the accuracy and efficiency of determining in situ soil moisture content. Soil moisture is one of the principal hydrological variables in both the water and energy interactions between soil and atmosphere. Current in situ measurements of soil moisture either rely on indirect measurements (of electromagnetic constants or heat capacity) or on physically taking a sample and weighing it in a lab. The bottleneck in accurately retrieving soil moisture using samples is determining the volume of the sample. Currently this is mostly done by the very time consuming "sand cone method" in which the volume where the sample used to sit is filled with sand. We show that the 3D scanner that is part of the $150 game controller extension "Kinect" can be used to make 3D scans before and after taking the sample. The accuracy of this method is tested by scanning forms of known volume. This method is less time consuming and less error-prone than using a sand cone.

  11. Accurately measuring volume of soil samples using low cost Kinect 3D scanner

    NASA Astrophysics Data System (ADS)

    van der Sterre, B.; Hut, R.; Van De Giesen, N.

    2012-12-01

    The 3D scanner of the Kinect game controller can be used to increase the accuracy and efficiency of determining in situ soil moisture content. Soil moisture is one of the principal hydrological variables in both the water and energy interactions between soil and atmosphere. Current in situ measurements of soil moisture either rely on indirect measurements (of electromagnetic constants or heat capacity) or on physically taking a sample and weighing it in a lab. The bottleneck in accurately retrieving soil moisture using samples is determining the volume of the sample. Currently this is mostly done by the very time consuming "sand cone method" in which the volume where the sample used to sit is filled with sand. We show that the 3D scanner that is part of the $150 game controller extension "Kinect" can be used to make 3D scans before and after taking the sample. The accuracy of this method is tested by scanning forms of known volume. This method is less time consuming and less error-prone than using a sand cone.
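
    A minimal sketch of the before/after scan idea in the two entries above: if each scan is reduced to a gridded surface-height map over the same footprint, the removed volume is the integral of the height drop times the cell area. The arrays, cell size, and noise threshold are synthetic stand-ins for real Kinect scans.

      import numpy as np

      cell_area_cm2 = 0.25          # illustrative: 0.5 cm x 0.5 cm grid cells

      # Surface height (cm) over the plot before and after the sample is removed.
      rng = np.random.default_rng(3)
      before = 10.0 + rng.normal(0, 0.05, size=(200, 200))
      after = before.copy()
      after[80:120, 80:120] -= 4.0  # a 40 x 40 cell hole, 4 cm deep

      # Volume removed = sum of (height drop) * cell area, ignoring scan noise
      # below a small threshold.
      drop = np.clip(before - after, 0, None)
      drop[drop < 0.2] = 0.0
      volume_cm3 = drop.sum() * cell_area_cm2
      print(f"estimated sample volume ≈ {volume_cm3:.0f} cm³")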

  12. Treatment Cost Analysis Tool (TCAT) for Estimating Costs of Outpatient Treatment Services

    PubMed Central

    Flynn, Patrick M.; Broome, Kirk M.; Beaston-Blaakman, Aaron; Knight, Danica K.; Horgan, Constance M.; Shepard, Donald S.

    2009-01-01

    A Microsoft® Excel-based workbook designed for research analysts to use in a national study was retooled for treatment program directors and financial officers to allocate, analyze, and estimate outpatient treatment costs in the U.S. This instrument can also be used as a planning and management tool to optimize resources and forecast the impact of future changes in staffing, client flow, program design, and other resources. The Treatment Cost Analysis Tool (TCAT) automatically provides feedback and generates summaries and charts using comparative data from a national sample of non-methadone outpatient providers. TCAT is being used by program staff to capture and allocate both economic and accounting costs, and outpatient service costs are reported for a sample of 70 programs. Costs for an episode of treatment in regular, intensive, and mixed types of outpatient treatment were $882, $1,310, and $1,381 respectively (based on 20% trimmed means and 2006 dollars). An hour of counseling cost $64 in regular, $85 intensive, and $86 mixed. Group counseling hourly costs per client were $8, $11, and $10 respectively for regular, intensive, and mixed. Future directions include use of a web-based interview version, much like some of the commercially available tax preparation software tools, and extensions for use in other modalities of treatment. PMID:19004576

  13. Treatment Cost Analysis Tool (TCAT) for estimating costs of outpatient treatment services.

    PubMed

    Flynn, Patrick M; Broome, Kirk M; Beaston-Blaakman, Aaron; Knight, Danica K; Horgan, Constance M; Shepard, Donald S

    2009-02-01

    A Microsoft Excel-based workbook designed for research analysts to use in a national study was retooled for treatment program directors and financial officers to allocate, analyze, and estimate outpatient treatment costs in the U.S. This instrument can also be used as a planning and management tool to optimize resources and forecast the impact of future changes in staffing, client flow, program design, and other resources. The Treatment Cost Analysis Tool (TCAT) automatically provides feedback and generates summaries and charts using comparative data from a national sample of non-methadone outpatient providers. TCAT is being used by program staff to capture and allocate both economic and accounting costs, and outpatient service costs are reported for a sample of 70 programs. Costs for an episode of treatment in regular, intensive, and mixed types of outpatient treatment were $882, $1310, and $1381 respectively (based on 20% trimmed means and 2006 dollars). An hour of counseling cost $64 in regular, $85 intensive, and $86 mixed. Group counseling hourly costs per client were $8, $11, and $10 respectively for regular, intensive, and mixed. Future directions include use of a web-based interview version, much like some of the commercially available tax preparation software tools, and extensions for use in other modalities of treatment. PMID:19004576

  14. Measuring costs of child abuse and neglect: a mathematic model of specific cost estimations.

    PubMed

    Conrad, Cynthia

    2006-01-01

    Few empirical facts exist regarding the actual costs of child abuse in the United States. Consistent data are not available for national or even statewide analysis. Clearly there is a need for such accounting in order to fully understand the damage created by child abuse and neglect. Policy makers and social welfare planners should take child abuse costs into consideration when determining expenditures for prevention and intervention programs. The real savings may far outweigh the costs of such programs when both direct and indirect costs of child abuse and neglect enter into the analysis. This paper offers a model in which the actual costs of child abuse and neglect are estimated from the direct, indirect, and opportunity costs associated with each case. Direct costs are those associated with the treatment of abused and neglected children as well as the costs of family intervention programs or foster care. Indirect costs are costs to society created by the negative effects of child abuse and neglect evinced by individuals who suffer such abuse and then as teens or adults engage in criminal behavior. Indirect costs also derive from the long-term and ongoing health care needs required by victims of abuse, for both physical and mental health disorders. With the existence of this model, the author hopes to stimulate the discussion and desire for better data collection and analysis. In order to demonstrate the utility of the model, the author has included some cost estimates from the Connecticut State Department of Children and Families and the works of other scholars looking into the question of costs for child abuse and neglect. These data represent the best available at this time. As a result, the model appearing here is specific to Connecticut. Even so, once more valid data becomes available, the model's structure and theoretical framework should adapt to the needs of other states to facilitate better measurement of relevant costs and provide a clearer picture of the utility of

  15. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    NASA Astrophysics Data System (ADS)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates based on a literature review, an evaluation of publicly funded projects such as those under the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defence, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of guidance grounded in theory for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis

  16. An accurate modeling, simulation, and analysis tool for predicting and estimating Raman LIDAR system performance

    NASA Astrophysics Data System (ADS)

    Grasso, Robert J.; Russo, Leonard P.; Barrett, John L.; Odhner, Jefferson E.; Egbert, Paul I.

    2007-09-01

    BAE Systems presents the results of a program to model the performance of Raman LIDAR systems for the remote detection of atmospheric gases, air polluting hydrocarbons, chemical and biological weapons, and other molecular species of interest. Our model, which integrates remote Raman spectroscopy, 2D and 3D LADAR, and USAF atmospheric propagation codes, permits accurate determination of the performance of a Raman LIDAR system. The very high predictive performance accuracy of our model is due to the very accurate calculation of the differential scattering cross section for the species of interest at user-selected wavelengths. We show excellent correlation of our calculated cross section data, used in our model, with experimental data obtained from both laboratory measurements and the published literature. In addition, the use of standard USAF atmospheric models provides very accurate determination of the atmospheric extinction at both the excitation and Raman shifted wavelengths.
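
    The Raman-shifted return wavelength such a model must evaluate follows from the standard Stokes-shift relation 1/λ_s = 1/λ_0 − Δν̃, with the shift Δν̃ in cm⁻¹. A small worked example follows; the 355 nm excitation and the nitrogen shift of about 2331 cm⁻¹ are standard textbook values used here only as an illustration, not parameters from the program described above.

      def raman_shifted_wavelength_nm(excitation_nm: float, shift_cm1: float) -> float:
          """Stokes-shifted wavelength: 1/lambda_s = 1/lambda_0 - delta_nu_tilde."""
          wavenumber_cm1 = 1.0e7 / excitation_nm      # convert nm to cm^-1
          return 1.0e7 / (wavenumber_cm1 - shift_cm1)

      # Example: 355 nm excitation, N2 vibrational shift of ~2331 cm^-1
      # gives a return near 387 nm.
      print(f"{raman_shifted_wavelength_nm(355.0, 2331.0):.1f} nm")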

  17. Improved Recharge Estimation from Portable, Low-Cost Weather Stations.

    PubMed

    Holländer, Hartmut M; Wang, Zijian; Assefa, Kibreab A; Woodbury, Allan D

    2016-03-01

    Groundwater recharge estimation is critical for sustainable groundwater management. The feasibility and robustness of recharge estimation were evaluated using physically based modeling procedures and data from a low-cost weather station with remote sensor techniques in Southern Abbotsford, British Columbia, Canada. Recharge was determined using the Richards-based vadose zone hydrological model, HYDRUS-1D. The required meteorological data were recorded with a HOBO(TM) weather station for a short observation period (about 1 year) and an existing weather station (Abbotsford A) for long-term study purposes (27 years). Undisturbed soil cores were taken at two locations in the vicinity of the HOBO(TM) weather station. The derived soil hydraulic parameters were used to characterize the soil in the numerical model. Model performance was evaluated using observed soil moisture and soil temperature data obtained from subsurface remote sensors. A rigorous sensitivity analysis was used to test the robustness of the model. Recharge during the short observation period was estimated at 863 and 816 mm. The mean annual recharge was estimated at 848 and 859 mm/year based on a time series of 27 years. The ratio of annual recharge to precipitation varied from 43% to 69%. From a monthly recharge perspective, the majority (80%) of recharge due to precipitation occurred during the hydrologic winter period. The comparison of the recharge estimates with other studies indicates a good agreement. Furthermore, this method is able to provide transient recharge estimates and can serve as a reasonable tool for estimating nutrient leaching, which is often controlled by strong precipitation events and rapid infiltration of water and nitrate into the soil. PMID:26011672

  18. Architects and Design-Phase Cost Estimates: Design Professionals Should Reconsider the Value of Third-Party Estimates

    ERIC Educational Resources Information Center

    Coakley, John

    2010-01-01

    Professional cost estimators are widely used by architects during the design phases of a project to provide preliminary cost estimates. These estimates may begin at the conceptual design phase and are prepared at regular intervals through the construction document phase. Estimating professionals are frequently tasked with "selling" the importance…

  19. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    7 CFR (2010-01-01 edition), Agriculture, Regulations of the Department of Agriculture: Exhibit A to Subpart A of Part 1924, Estimated Breakdown of Dwelling Costs for Estimating Partial Payments. With slab...

  20. Allele-Specific Quantitative PCR for Accurate, Rapid, and Cost-Effective Genotyping.

    PubMed

    Lee, Han B; Schwab, Tanya L; Koleilat, Alaa; Ata, Hirotaka; Daby, Camden L; Cervera, Roberto Lopez; McNulty, Melissa S; Bostwick, Hannah S; Clark, Karl J

    2016-06-01

    genotypes. ASQ is cost-effective because universal fluorescent probes negate the necessity of designing expensive probes for each locus. PMID:26986823

  1. Allele-Specific Quantitative PCR for Accurate, Rapid, and Cost-Effective Genotyping

    PubMed Central

    Lee, Han B.; Schwab, Tanya L.; Koleilat, Alaa; Ata, Hirotaka; Daby, Camden L.; Cervera, Roberto Lopez; McNulty, Melissa S.; Bostwick, Hannah S.; Clark, Karl J.

    2016-01-01

    genotypes. ASQ is cost-effective because universal fluorescent probes negate the necessity of designing expensive probes for each locus. PMID:26986823

  2. 48 CFR 1852.216-84 - Estimated cost and incentive fee.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    48 CFR (2010-10-01 edition), Federal Acquisition Regulations System, Provisions and Clauses: 1852.216-84 Estimated cost and incentive fee. As prescribed in 1816.406-70(d), insert the following clause: Estimated Cost and Incentive Fee (OCT 1996). The target cost of this contract...

  3. Why Don't They Just Give Us Money? Project Cost Estimating and Cost Reporting

    NASA Technical Reports Server (NTRS)

    Comstock, Douglas A.; Van Wychen, Kristin; Zimmerman, Mary Beth

    2015-01-01

    Successful projects require an integrated approach to managing cost, schedule, and risk. This is especially true for complex, multi-year projects involving multiple organizations. To explore solutions and leverage valuable lessons learned, NASA's Virtual Project Management Challenge will kick off a three-part series examining some of the challenges faced by project and program managers when it comes to managing these important elements. In this first session of the series, we will look at cost management, with an emphasis on the critical roles of cost estimating and cost reporting. By taking a proactive approach to both of these activities, project managers can better control life cycle costs, maintain stakeholder confidence, and protect other current and future projects in the organization's portfolio. Speakers will be Doug Comstock, Director of NASA's Cost Analysis Division, Kristin Van Wychen, Senior Analyst in the GAO Acquisition and Sourcing Management Team, and Mary Beth Zimmerman, Branch Chief for NASA's Portfolio Analysis Branch, Strategic Investments Division. Moderator Ramien Pierre is from NASA's Academy for Program/Project and Engineering Leadership (APPEL).

  4. An evolutionary morphological approach for software development cost estimation.

    PubMed

    Araújo, Ricardo de A; Oliveira, Adriano L I; Soares, Sergio; Meira, Silvio

    2012-08-01

    In this work we present an evolutionary morphological approach to solve the software development cost estimation (SDCE) problem. The proposed approach consists of a hybrid artificial neuron based on the framework of mathematical morphology (MM), with algebraic foundations in complete lattice theory (CLT), referred to as the dilation-erosion perceptron (DEP). We also present an evolutionary learning process, called DEP(MGA), which uses a modified genetic algorithm (MGA) to design the DEP model, because a drawback arises in the classical learning process of the DEP from the gradient estimation of morphological operators, which are not differentiable in the usual way. Furthermore, an experimental analysis is conducted with the proposed model using five complex SDCE problems and three well-known performance metrics, demonstrating good performance of the DEP model in solving SDCE problems. PMID:22560678
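
    A minimal sketch of the dilation-erosion perceptron structure the abstract refers to, taken here as a convex combination of a morphological dilation and erosion of the input over the (max, +) and (min, +) lattices. The weights, mixing parameter, and input vector are illustrative assumptions, and the genetic-algorithm fitting step is omitted.

      import numpy as np

      def dep_output(x, a, b, lam):
          """Dilation-erosion perceptron: lam * dilation + (1 - lam) * erosion."""
          dilation = np.max(x + a)    # morphological dilation of x by weights a
          erosion = np.min(x + b)     # morphological erosion of x by weights b
          return lam * dilation + (1.0 - lam) * erosion

      # Illustrative input: three normalized project-effort features.
      x = np.array([0.42, 0.55, 0.48])
      a = np.array([-0.10, 0.05, 0.02])     # dilation weights
      b = np.array([0.08, -0.03, 0.04])     # erosion weights
      print(f"DEP output: {dep_output(x, a, b, lam=0.6):.3f}")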

  5. Estimation method of point spread function based on Kalman filter for accurately evaluating real optical properties of photonic crystal fibers.

    PubMed

    Shen, Yan; Lou, Shuqin; Wang, Xin

    2014-03-20

    The evaluation accuracy of real optical properties of photonic crystal fibers (PCFs) is determined by the accurate extraction of air hole edges from microscope images of cross sections of practical PCFs. A novel estimation method of point spread function (PSF) based on Kalman filter is presented to rebuild the micrograph image of the PCF cross-section and thus evaluate real optical properties for practical PCFs. Through tests on both artificially degraded images and microscope images of cross sections of practical PCFs, we prove that the proposed method can achieve more accurate PSF estimation and lower PSF variance than the traditional Bayesian estimation method, and thus also reduce the defocus effect. With this method, we rebuild the microscope images of two kinds of commercial PCFs produced by Crystal Fiber and analyze the real optical properties of these PCFs. Numerical results are in accord with the product parameters. PMID:24663461

  6. IUS/TUG orbital operations and mission support study. Volume 5: Cost estimates

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The costing approach, methodology, and rationale utilized for generating cost data for composite IUS and space tug orbital operations are discussed. Summary cost estimates are given along with cost data initially derived for the IUS program and space tug program individually, and cost estimates for each work breakdown structure element.

  7. Technical note: tree truthing: how accurate are substrate estimates in primate field studies?

    PubMed

    Bezanson, Michelle; Watts, Sean M; Jobin, Matthew J

    2012-04-01

    Field studies of primate positional behavior typically rely on ground-level estimates of substrate size, angle, and canopy location. These estimates potentially influence the identification of positional modes by the observer recording behaviors. In this study we aim to test ground-level estimates against direct measurements of support angles, diameters, and canopy heights in trees at La Suerte Biological Research Station in Costa Rica. After reviewing methods that have been used by past researchers, we provide data collected within trees that are compared to estimates obtained from the ground. We climbed five trees and measured 20 supports. Four observers collected measurements of each support from different locations on the ground. Diameter estimates varied from the direct tree measures by 0-28 cm (Mean: 5.44 ± 4.55). Substrate angles varied by 1-55° (Mean: 14.76 ± 14.02). Height in the tree was best estimated using a clinometer, as estimates with a two-meter reference placed by the tree varied by 3-11 meters (Mean: 5.31 ± 2.44). We determined that the best support size estimates were those generated relative to the size of the focal animal and divided into broader categories. Support angles were best estimated in 5° increments and then checked using a Haglöf clinometer in combination with a laser pointer. We conclude that three major factors should be addressed when estimating support features: observer error (e.g., experience and distance from the target), support deformity, and how support size and angle influence the positional mode selected by a primate individual. PMID:22371099

  8. Accurate state estimation for a hydraulic actuator via a SDRE nonlinear filter

    NASA Astrophysics Data System (ADS)

    Strano, Salvatore; Terzo, Mario

    2016-06-01

    The state estimation in hydraulic actuators is a fundamental tool for the detection of faults or a valid alternative to the installation of sensors. Due to the hard nonlinearities that characterize hydraulic actuators, the performance of linear/linearization-based state estimation techniques is strongly limited. In order to overcome these limits, this paper focuses on an alternative nonlinear estimation method based on the State-Dependent Riccati Equation (SDRE). The technique is able to fully take into account the system nonlinearities and the measurement noise. A fifth-order nonlinear model is derived and employed for the synthesis of the estimator. Simulations and experimental tests have been conducted and comparisons with the widely used Extended Kalman Filter (EKF) are illustrated. The results show the effectiveness of the SDRE-based technique for applications characterized by non-negligible nonlinearities such as dead zones and friction.

  9. Software cost/resource modeling: Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. J.

    1980-01-01

    A parametric software cost estimation model prepared for JPL deep space network (DSN) data systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models, such as those of the General Research Corporation, Doty Associates, IBM (Walston-Felix), Rome Air Force Development Center, University of Maryland, and Rayleigh-Norden-Putnam. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software lifecycle statistics. The estimation model output scales a standard DSN work breakdown structure skeleton, which is then input to a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.

  10. FAST TRACK COMMUNICATION Accurate estimate of α variation and isotope shift parameters in Na and Mg+

    NASA Astrophysics Data System (ADS)

    Sahoo, B. K.

    2010-12-01

    We present accurate calculations of fine-structure constant variation coefficients and isotope shifts in Na and Mg+ using the relativistic coupled-cluster method. In our approach, we are able to discover the roles of various correlation effects explicitly to all orders in these calculations. Most of the results, especially for the excited states, are reported for the first time. It is possible to ascertain suitable anchor and probe lines for the studies of possible variation in the fine-structure constant by using the above results in the considered systems.

  11. Accurate State Estimation and Tracking of a Non-Cooperative Target Vehicle

    NASA Technical Reports Server (NTRS)

    Thienel, Julie K.; Sanner, Robert M.

    2006-01-01

    Autonomous space rendezvous scenarios require knowledge of the target vehicle state in order to safely dock with the chaser vehicle. Ideally, the target vehicle state information is derived from telemetered data, or with the use of known tracking points on the target vehicle. However, if the target vehicle is non-cooperative and does not have the ability to maintain attitude control, or transmit attitude knowledge, the docking becomes more challenging. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a tracking control scheme. The approach is tested with the robotic servicing mission concept for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates, but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST.

  12. Precision Pointing Control to and Accurate Target Estimation of a Non-Cooperative Vehicle

    NASA Technical Reports Server (NTRS)

    VanEepoel, John; Thienel, Julie; Sanner, Robert M.

    2006-01-01

    In 2004, NASA began investigating a robotic servicing mission for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates in order to achieve capture by the proposed Hubble Robotic Vehicle (HRV), but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST. To generalize the situation, HST is the target vehicle and HRV is the chaser. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a control scheme. Non-cooperative in this context relates to the target vehicle no longer having the ability to maintain attitude control or transmit attitude knowledge.

  13. A microbial clock provides an accurate estimate of the postmortem interval in a mouse model system

    PubMed Central

    Metcalf, Jessica L; Wegener Parfrey, Laura; Gonzalez, Antonio; Lauber, Christian L; Knights, Dan; Ackermann, Gail; Humphrey, Gregory C; Gebert, Matthew J; Van Treuren, Will; Berg-Lyons, Donna; Keepers, Kyle; Guo, Yan; Bullard, James; Fierer, Noah; Carter, David O; Knight, Rob

    2013-01-01

    Establishing the time since death is critical in every death investigation, yet existing techniques are susceptible to a range of errors and biases. For example, forensic entomology is widely used to assess the postmortem interval (PMI), but errors can range from days to months. Microbes may provide a novel method for estimating PMI that avoids many of these limitations. Here we show that postmortem microbial community changes are dramatic, measurable, and repeatable in a mouse model system, allowing PMI to be estimated within approximately 3 days over 48 days. Our results provide a detailed understanding of bacterial and microbial eukaryotic ecology within a decomposing corpse system and suggest that microbial community data can be developed into a forensic tool for estimating PMI. DOI: http://dx.doi.org/10.7554/eLife.01104.001 PMID:24137541

  14. Space transfer vehicle concepts and requirements study. Volume 3, book 1: Program cost estimates

    NASA Astrophysics Data System (ADS)

    Peffley, Al F.

    1991-04-01

    The Space Transfer Vehicle (STV) Concepts and Requirements Study cost estimate and program planning analysis is presented. The cost estimating technique used to support STV system, subsystem, and component cost analysis is a mixture of parametric cost estimating and selective cost analogy approaches. The parametric cost analysis is aimed at developing cost-effective aerobrake, crew module, tank module, and lander designs with the parametric cost estimates data. This is accomplished using cost as a design parameter in an iterative process with conceptual design input information. The parametric estimating approach segregates costs by major program life cycle phase (development, production, integration, and launch support). These phases are further broken out into major hardware subsystems, software functions, and tasks according to the STV preliminary program work breakdown structure (WBS). The WBS is defined to a low enough level of detail by the study team to highlight STV system cost drivers. This level of cost visibility provided the basis for cost sensitivity analysis against various design approaches aimed at achieving a cost-effective design. The cost approach, methodology, and rationale are described. A chronological record of the interim review material relating to cost analysis is included along with a brief summary of the study contract tasks accomplished during that period of review and the key conclusions or observations identified that relate to STV program cost estimates. The STV life cycle costs are estimated on the proprietary parametric cost model (PCM) with inputs organized by a project WBS. Preliminary life cycle schedules are also included.

  15. Space transfer vehicle concepts and requirements study. Volume 3, book 1: Program cost estimates

    NASA Technical Reports Server (NTRS)

    Peffley, Al F.

    1991-01-01

    The Space Transfer Vehicle (STV) Concepts and Requirements Study cost estimate and program planning analysis is presented. The cost estimating technique used to support STV system, subsystem, and component cost analysis is a mixture of parametric cost estimating and selective cost analogy approaches. The parametric cost analysis is aimed at developing cost-effective aerobrake, crew module, tank module, and lander designs with the parametric cost estimates data. This is accomplished using cost as a design parameter in an iterative process with conceptual design input information. The parametric estimating approach segregates costs by major program life cycle phase (development, production, integration, and launch support). These phases are further broken out into major hardware subsystems, software functions, and tasks according to the STV preliminary program work breakdown structure (WBS). The WBS is defined to a low enough level of detail by the study team to highlight STV system cost drivers. This level of cost visibility provided the basis for cost sensitivity analysis against various design approaches aimed at achieving a cost-effective design. The cost approach, methodology, and rationale are described. A chronological record of the interim review material relating to cost analysis is included along with a brief summary of the study contract tasks accomplished during that period of review and the key conclusions or observations identified that relate to STV program cost estimates. The STV life cycle costs are estimated on the proprietary parametric cost model (PCM) with inputs organized by a project WBS. Preliminary life cycle schedules are also included.

  16. Fast and accurate probability density estimation in large high dimensional astronomical datasets

    NASA Astrophysics Data System (ADS)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2015-01-01

    Astronomical surveys will generate measurements of hundreds of attributes (e.g. color, size, shape) on hundreds of millions of sources. Analyzing these large, high dimensional data sets will require efficient algorithms for data analysis. An example of this is probability density estimation, which is at the heart of many classification problems such as the separation of stars and quasars based on their colors. Popular density estimation techniques use binning or kernel density estimation. Kernel density estimation has a small memory footprint but often requires large computational resources. Binning has small computational requirements, but it is usually implemented with multi-dimensional arrays, which leads to memory requirements that scale exponentially with the number of dimensions. Hence neither technique scales well to large data sets in high dimensions. We present an alternative approach of binning implemented with hash tables (BASH tables). This approach uses the sparseness of data in the high dimensional space to ensure that the memory requirements are small. However, hashing requires some extra computation, so a priori it is not clear whether the reduction in memory requirements will lead to increased computational requirements. Through an implementation of BASH tables in C++ we show that the additional computational requirements of hashing are negligible. Hence this approach has small memory and computational requirements. We apply our density estimation technique to photometric selection of quasars using non-parametric Bayesian classification and show that the accuracy of the classification is the same as the accuracy of earlier approaches. Since the BASH table approach is one to three orders of magnitude faster than the earlier approaches, it may be useful in various other applications of density estimation in astrostatistics.
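
    A minimal sketch of the hash-table binning idea, written in Python rather than the paper's C++ and assuming fixed bin widths per dimension: only occupied bins are stored, keyed by their integer bin indices, so memory scales with the number of occupied bins rather than exponentially with dimension. The data and bin width are synthetic.

      import numpy as np
      from collections import Counter

      def build_bash_table(points: np.ndarray, bin_width: float) -> Counter:
          """Count points per occupied bin, keyed by the tuple of bin indices."""
          keys = map(tuple, np.floor(points / bin_width).astype(int))
          return Counter(keys)

      def density(table: Counter, point: np.ndarray, bin_width: float,
                  n_total: int, n_dims: int) -> float:
          """Estimated density = bin count / (N * bin volume)."""
          key = tuple(np.floor(point / bin_width).astype(int))
          return table[key] / (n_total * bin_width ** n_dims)

      rng = np.random.default_rng(4)
      colors = rng.normal(0.0, 1.0, size=(100_000, 5))   # e.g. 5 photometric colors
      table = build_bash_table(colors, bin_width=0.5)

      print("occupied bins:", len(table))
      print("density near origin:",
            density(table, np.zeros(5), 0.5, len(colors), 5))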

  17. Spectral estimation from laser scanner data for accurate color rendering of objects

    NASA Astrophysics Data System (ADS)

    Baribeau, Rejean

    2002-06-01

    Estimation methods are studied for the recovery of the spectral reflectance across the visible range from the sensing at just three discrete laser wavelengths. Methods based on principal component analysis and on spline interpolation are judged based on the CIE94 color differences for some reference data sets. These include the Macbeth color checker, the OSA-UCS color charts, some artist pigments, and a collection of miscellaneous surface colors. The optimal three sampling wavelengths are also investigated. It is found that color can be estimated with average accuracy ΔE94 = 2.3 when the optimal wavelengths 455 nm, 540 nm, and 610 nm are used.
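
    A minimal sketch of the principal-component approach described above: the full visible reflectance is modeled as a mean spectrum plus a few basis spectra, and the basis coefficients are solved from the three laser-wavelength samples. The training set here is built from random smooth synthetic curves, not the reference chart data used in the paper.

      import numpy as np

      rng = np.random.default_rng(5)
      wavelengths = np.arange(400, 701, 10)              # visible range, 10 nm steps

      # Synthetic training reflectances (stand-in for color chart spectra):
      # smooth random curves clipped to [0, 1].
      def smooth_curve():
          return np.clip(0.5 + 0.3 * np.sin(wavelengths / rng.uniform(30, 80)
                                            + rng.uniform(0, 6.28)), 0, 1)

      training = np.array([smooth_curve() for _ in range(200)])

      mean_spec = training.mean(axis=0)
      _, _, vt = np.linalg.svd(training - mean_spec, full_matrices=False)
      basis = vt[:3]                                      # first three components

      # The scanner samples the target at three laser wavelengths.
      laser_nm = [455, 540, 610]
      idx = [int((nm - 400) // 10) for nm in laser_nm]    # nearest grid indices
      target = smooth_curve()
      samples = target[idx]

      # Solve for the 3 basis coefficients from the 3 samples, then reconstruct.
      coeffs, *_ = np.linalg.lstsq(basis[:, idx].T, samples - mean_spec[idx], rcond=None)
      estimate = mean_spec + coeffs @ basis
      print("max abs reflectance error:", float(np.abs(estimate - target).max()))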

  18. Accurate, Fast and Cost-Effective Diagnostic Test for Monosomy 1p36 Using Real-Time Quantitative PCR

    PubMed Central

    Cunha, Pricila da Silva; Pena, Heloisa B.; D'Angelo, Carla Sustek; Koiffmann, Celia P.; Rosenfeld, Jill A.; Shaffer, Lisa G.; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5–0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs. PMID:24839341

  19. Accurate radiocarbon age estimation using "early" measurements: a new approach to reconstructing the Paleolithic absolute chronology

    NASA Astrophysics Data System (ADS)

    Omori, Takayuki; Sano, Katsuhiro; Yoneda, Minoru

    2014-05-01

    This paper presents new correction approaches for "early" radiocarbon ages to reconstruct the Paleolithic absolute chronology. In order to discuss the time-space distribution of the replacement of archaic humans, including Neanderthals in Europe, by modern humans, a massive dataset covering a wide area would be needed. Today, some radiocarbon databases focused on the Paleolithic have been published and used for chronological studies. From the viewpoint of current analytical technology, however, any such database contains unreliable results that make interpretation of radiocarbon dates difficult. Most of these unreliable ages were published in the early days of radiocarbon analysis. In recent years, new analytical methods to determine highly accurate dates have been developed. Ultrafiltration and ABOx-SC methods, as new sample pretreatments for bone and charcoal respectively, have attracted attention because they can remove imperceptible contaminants and yield reliably accurate ages. In order to evaluate the reliability of "early" data, we investigated the differences and variabilities of radiocarbon ages under different pretreatments, and attempted to develop correction functions for assessing their reliability. It can be expected that the reliability of the corrected ages is increased and that they can be applied to chronological research together with recent ages. Here, we introduce the methodological frameworks and archaeological applications.

  20. A Generalized Subspace Least Mean Square Method for High-resolution Accurate Estimation of Power System Oscillation Modes

    SciTech Connect

    Zhang, Peng; Zhou, Ning; Abdollahi, Ali

    2013-09-10

    A Generalized Subspace-Least Mean Square (GSLMS) method is presented for accurate and robust estimation of oscillation modes from exponentially damped power system signals. The method is based on the orthogonality of the signal and noise eigenvectors of the signal autocorrelation matrix. Performance of the proposed method is evaluated using Monte Carlo simulation and compared with the Prony method. Test results show that the GSLMS is highly resilient to noise and significantly outperforms the Prony method in tracking power system modes under noisy environments.

  1. Accurate motion parameter estimation for colonoscopy tracking using a regression method

    NASA Astrophysics Data System (ADS)

    Liu, Jianfei; Subramanian, Kalpathi R.; Yoo, Terry S.

    2010-03-01

    Co-located optical and virtual colonoscopy images have the potential to provide important clinical information during routine colonoscopy procedures. In our earlier work, we presented an optical flow based algorithm to compute egomotion from live colonoscopy video, permitting navigation and visualization of the corresponding patient anatomy. In the original algorithm, motion parameters were estimated using the traditional least sum of squares (LS) procedure, which can be unstable in the context of optical flow vectors with large errors. In the improved algorithm, we use the Least Median of Squares (LMS) method, a robust regression method, for motion parameter estimation. Using the LMS method, we iteratively analyze and converge toward the main distribution of the flow vectors, while disregarding outliers. We show through three experiments the improvement in tracking results obtained using the LMS method, in comparison to the LS estimator. The first experiment demonstrates better spatial accuracy in positioning the virtual camera in the sigmoid colon. The second and third experiments demonstrate the robustness of this estimator, resulting in longer tracked sequences: from 300 to 1310 in the ascending colon, and from 410 to 1316 in the transverse colon.
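
    A minimal sketch of the least-median-of-squares idea used above, applied to a simple line fit rather than the paper's egomotion model: repeatedly fit a model to small random subsets, score each candidate by the median of its squared residuals over all points, and keep the candidate with the smallest median, so outlying flow vectors cannot dominate. The data and trial count are illustrative.

      import numpy as np

      rng = np.random.default_rng(6)

      # Inlier points on y = 2x + 1, plus gross outliers (analogous to bad flow vectors).
      x = rng.uniform(0, 10, 200)
      y = 2 * x + 1 + rng.normal(0, 0.2, 200)
      y[:40] += rng.uniform(20, 50, 40)                  # 20% outliers

      def lmeds_line(x, y, n_trials=500):
          best, best_score = None, np.inf
          for _ in range(n_trials):
              i, j = rng.choice(len(x), size=2, replace=False)
              if x[i] == x[j]:
                  continue
              slope = (y[j] - y[i]) / (x[j] - x[i])       # fit to a minimal subset
              intercept = y[i] - slope * x[i]
              score = np.median((y - (slope * x + intercept)) ** 2)
              if score < best_score:                      # keep the lowest median
                  best, best_score = (slope, intercept), score
          return best

      slope, intercept = lmeds_line(x, y)
      print(f"LMedS fit: y ≈ {slope:.2f} x + {intercept:.2f}")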

  2. How Accurate and Robust Are the Phylogenetic Estimates of Austronesian Language Relationships?

    PubMed Central

    Greenhill, Simon J.; Drummond, Alexei J.; Gray, Russell D.

    2010-01-01

    We recently used computational phylogenetic methods on lexical data to test between two scenarios for the peopling of the Pacific. Our analyses of lexical data supported a pulse-pause scenario of Pacific settlement in which the Austronesian speakers originated in Taiwan around 5,200 years ago and rapidly spread through the Pacific in a series of expansion pulses and settlement pauses. We claimed that there was high congruence between traditional language subgroups and those observed in the language phylogenies, and that the estimated age of the Austronesian expansion at 5,200 years ago was consistent with the archaeological evidence. However, the congruence between the language phylogenies and the evidence from historical linguistics was not quantitatively assessed using tree comparison metrics. The robustness of the divergence time estimates to different calibration points was also not investigated exhaustively. Here we address these limitations by using a systematic tree comparison metric to calculate the similarity between the Bayesian phylogenetic trees and the subgroups proposed by historical linguistics, and by re-estimating the age of the Austronesian expansion using only the most robust calibrations. The results show that the Austronesian language phylogenies are highly congruent with the traditional subgroupings, and the date estimates are robust even when calculated using a restricted set of historical calibrations. PMID:20224774

  3. Accurate Angle Estimator for High-Frame-Rate 2-D Vector Flow Imaging.

    PubMed

    Villagomez Hoyos, Carlos Armando; Stuart, Matthias Bo; Hansen, Kristoffer Lindskov; Nielsen, Michael Bachmann; Jensen, Jorgen Arendt

    2016-06-01

    This paper presents a novel approach for estimating 2-D flow angles using a high-frame-rate ultrasound method. The angle estimator features high accuracy and low standard deviation (SD) over the full 360° range. The method is validated on Field II simulations and phantom measurements using the experimental ultrasound scanner SARUS and a flow rig before being tested in vivo. An 8-MHz linear array transducer is used with defocused beam emissions. In the simulations of a spinning disk phantom, a 360° uniform behavior on the angle estimation is observed with a median angle bias of 1.01° and a median angle SD of 1.8°. Similar results are obtained on a straight vessel for both simulations and measurements, where the obtained angle biases are below 1.5° with SDs around 1°. Estimated velocity magnitudes are also kept under 10% bias and 5% relative SD in both simulations and measurements. An in vivo measurement is performed on a carotid bifurcation of a healthy individual. A 3-s acquisition during three heart cycles is captured. A consistent and repetitive vortex is observed in the carotid bulb during systoles. PMID:27093598

  4. Estimating the additional cost of disability: beyond budget standards.

    PubMed

    Wilkinson-Meyers, Laura; Brown, Paul; McNeill, Robert; Patston, Philip; Dylan, Sacha; Baker, Ronelle

    2010-11-01

    Disabled people have long advocated for sufficient resources to live a life with the same rights and responsibilities as non-disabled people. Identifying the unique resource needs of disabled people relative to the population as a whole and understanding the source of these needs is critical for determining adequate levels of income support and for prioritising service provision. Previous attempts to identify the resources and costs associated with disability have tended to rely on surveys of current resource use. These approaches have been criticised as being inadequate for identifying the resources that would be required to achieve a similar standard of living to non-disabled people and for not using methods that are acceptable to and appropriate for the disabled community. The challenge is therefore to develop a methodology that accurately identifies these unique resource needs, uses an approach that is acceptable to the disabled community, enables all disabled people to participate, and distinguishes 'needs' from 'wants.' This paper describes and presents the rationale for a mixed methodology for identifying and prioritising the resource needs of disabled people. The project is a partnership effort between disabled researchers, a disability support organisation and academic researchers in New Zealand. The method integrates a social model of disability framework and an economic cost model using a budget standards approach to identify additional support, equipment, travel and time required to live an 'ordinary life' in the community. A survey is then used to validate the findings and identify information gaps and resource priorities of the community. Both the theoretical basis of the approach and the practical challenges of designing and implementing a methodology that is acceptable to the disabled community, service providers and funding agencies are discussed. PMID:20933315

  5. Cost estimation of HVDC transmission system of Bangka's NPP candidates

    NASA Astrophysics Data System (ADS)

    Liun, Edwaren; Suparman

    2014-09-01

    Regarding nuclear power plant development on Bangka Island, it can be expected that the power produced will exceed the island's demand and will need to be transmitted to Sumatra or Java. The distance between the islands causes considerable power loss when transmitting by alternating current, along with a wide range of technical and economic issues. This paper addresses the economic analysis of a direct current transmission system to overcome those technical problems. Direct current transmission has stable characteristics, so large-scale power delivery from Bangka to Sumatra or Java can be carried out efficiently and reliably. HVDC system costs depend on the power capacity applied to the system and the length of the transmission line, in addition to other variables that may differ between projects.
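
    A minimal sketch of a parametric HVDC cost relation of the kind the abstract describes, splitting the total into converter-station cost (scaling with capacity) and line cost (scaling with capacity and length); every coefficient below is an illustrative placeholder, not a figure from the study.

      def hvdc_cost_musd(capacity_mw: float, line_km: float,
                         station_musd_per_mw: float = 0.12,
                         line_musd_per_mw_km: float = 0.0006,
                         fixed_musd: float = 50.0) -> float:
          """Total cost (million USD) = fixed overhead + two converter stations + DC line."""
          stations = 2 * station_musd_per_mw * capacity_mw
          line = line_musd_per_mw_km * capacity_mw * line_km
          return fixed_musd + stations + line

      # Illustrative case: a 2,000 MW link over a 700 km route.
      print(f"≈ {hvdc_cost_musd(2000, 700):.0f} million USD")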

  6. FASTSim: A Model to Estimate Vehicle Efficiency, Cost and Performance

    SciTech Connect

    Brooker, A.; Gonder, J.; Wang, L.; Wood, E.; Lopp, S.; Ramroth, L.

    2015-05-04

    The Future Automotive Systems Technology Simulator (FASTSim) is a high-level advanced vehicle powertrain systems analysis tool supported by the U.S. Department of Energy's Vehicle Technologies Office. FASTSim provides a quick and simple approach to compare powertrains and estimate the impact of technology improvements on light- and heavy-duty vehicle efficiency, performance, cost, and battery life over batches of real-world drive cycles. FASTSim's calculation framework and balance among detail, accuracy, and speed enable it to simulate thousands of driven miles in minutes. The key components and vehicle outputs have been validated by comparing the model outputs to test data for many different vehicles to provide confidence in the results. A graphical user interface makes FASTSim easy and efficient to use. FASTSim is freely available for download from the National Renewable Energy Laboratory's website (see www.nrel.gov/fastsim).

  7. Improved rapid magnitude estimation for a community-based, low-cost MEMS accelerometer network

    USGS Publications Warehouse

    Chung, Angela I.; Cochran, Elizabeth S.; Kaiser, Anna E.; Christensen, Carl M.; Yildirim, Battalgazi; Lawrence, Jesse F.

    2015-01-01

    Immediately following the Mw 7.2 Darfield, New Zealand, earthquake, over 180 Quake‐Catcher Network (QCN) low‐cost micro‐electro‐mechanical systems accelerometers were deployed in the Canterbury region. Using data recorded by this dense network from 2010 to 2013, we significantly improved the QCN rapid magnitude estimation relationship. The previous scaling relationship (Lawrence et al., 2014) did not accurately estimate the magnitudes of nearby (<35  km) events. The new scaling relationship estimates earthquake magnitudes within 1 magnitude unit of the GNS Science GeoNet earthquake catalog magnitudes for 99% of the events tested, within 0.5 magnitude units for 90% of the events, and within 0.25 magnitude units for 57% of the events. These magnitudes are reliably estimated within 3 s of the initial trigger recorded on at least seven stations. In this report, we present the methods used to calculate a new scaling relationship and demonstrate the accuracy of the revised magnitude estimates using a program that is able to retrospectively estimate event magnitudes using archived data.

  8. Accurate estimation of influenza epidemics using Google search data via ARGO.

    PubMed

    Yang, Shihao; Santillana, Mauricio; Kou, S C

    2015-11-24

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search-based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people's online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions. PMID:26553980

  9. Accurate estimation of influenza epidemics using Google search data via ARGO

    PubMed Central

    Yang, Shihao; Santillana, Mauricio; Kou, S. C.

    2015-01-01

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search–based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people’s online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions. PMID:26553980

  10. Raman spectroscopy for highly accurate estimation of the age of refrigerated porcine muscle

    NASA Astrophysics Data System (ADS)

    Timinis, Constantinos; Pitris, Costas

    2016-03-01

    The high water content of meat, combined with all the nutrients it contains, makes it vulnerable to spoilage at all stages of production and storage even when refrigerated at 5 °C. A non-destructive and in situ tool for meat sample testing, which could provide an accurate indication of the storage time of meat, would be very useful for the control of meat quality as well as for consumer safety. The proposed solution is based on Raman spectroscopy which is non-invasive and can be applied in situ. For the purposes of this project, 42 meat samples from 14 animals were obtained and three Raman spectra per sample were collected every two days for two weeks. The spectra were subsequently processed and the sample age was calculated using a set of linear differential equations. In addition, the samples were classified in categories corresponding to the age in 2-day steps (i.e., 0, 2, 4, 6, 8, 10, 12 or 14 days old), using linear discriminant analysis and cross-validation. Contrary to other studies, where the samples were simply grouped into two categories (higher or lower quality, suitable or unsuitable for human consumption, etc.), in this study, the age was predicted with a mean error of ~ 1 day (20%) or classified, in 2-day steps, with 100% accuracy. Although Raman spectroscopy has been used in the past for the analysis of meat samples, the proposed methodology has resulted in prediction of the sample age far more accurate than any reported in the literature.
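
    A minimal sketch of the classification step described above (linear discriminant analysis with cross-validation over 2-day age classes), assuming scikit-learn; the spectra matrix and age labels below are random placeholders, not the study's data.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        # Placeholder data: rows are preprocessed Raman spectra, labels are
        # storage age in 2-day steps (0, 2, ..., 14 days).
        rng = np.random.default_rng(1)
        spectra = rng.random((120, 500))               # 8 age classes x 15 spectra
        ages = np.repeat(np.arange(0, 16, 2), 15)      # 0, 2, ..., 14 days

        lda = LinearDiscriminantAnalysis()
        scores = cross_val_score(lda, spectra, ages, cv=5)
        print("cross-validated accuracy:", scores.mean())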

  11. An estimating rule for deep space station control room equipment energy costs

    NASA Technical Reports Server (NTRS)

    Younger, H. C.

    1980-01-01

    A rule is described which can be used to estimate power costs for new equipment under development, helping to reduce life-cycle costs and energy consumption by justifying design alternatives that are more costly, but more efficient.

  12. Global cost estimates of reducing carbon emissions through avoided deforestation

    PubMed Central

    Kindermann, Georg; Obersteiner, Michael; Sohngen, Brent; Sathaye, Jayant; Andrasko, Kenneth; Rametsteiner, Ewald; Schlamadinger, Bernhard; Wunder, Sven; Beach, Robert

    2008-01-01

    Tropical deforestation is estimated to cause about one-quarter of anthropogenic carbon emissions, loss of biodiversity, and other environmental services. United Nations Framework Convention on Climate Change talks are now considering mechanisms for avoiding deforestation (AD), but the economic potential of AD has yet to be addressed. We use three economic models of global land use and management to analyze the potential contribution of AD activities to reduced greenhouse gas emissions. AD activities are found to be a competitive, low-cost abatement option. A program providing a 10% reduction in deforestation from 2005 to 2030 could provide 0.3–0.6 Gt (1 Gt = 1 × 10¹⁵ g) CO2·yr⁻¹ in emission reductions and would require $0.4 billion to $1.7 billion·yr⁻¹ for 30 years. A 50% reduction in deforestation from 2005 to 2030 could provide 1.5–2.7 Gt CO2·yr⁻¹ in emission reductions and would require $17.2 billion to $28.0 billion·yr⁻¹. Finally, some caveats to the analysis that could increase costs of AD programs are described. PMID:18650377

  13. Cost Estimates of Electricity from a TPV Residential Heating System

    NASA Astrophysics Data System (ADS)

    Palfinger, Günther; Bitnar, Bernd; Durisch, Wilhelm; Mayor, Jean-Claude; Grützmacher, Detlev; Gobrecht, Jens

    2003-01-01

    A thermophotovoltaic (TPV) system was built using a 12 to 20 kWth methane burner which should be integrated into a conventional residential heating system. The TPV system is cylindrical in shape and consists of a selective Yb2O3 emitter, a quartz glass tube to prevent the exhaust gases from heating the cells and a 0.2 m2 monocrystalline silicon solar cell module which is water cooled. The maximum system efficiency of 1.0 % was obtained at a thermal input power of 12 kWth. The electrical power suffices to run a residential heating system in the full power range (12 to 20 kWth) independently of the grid. The end user costs of the TPV components - emitter, glass tube, photocells and cell cooling circuit - were estimated considering 4 different TPV scenarios. The existing technique was compared with an improved system currently under development, which consists of a flexible photocell module that can be glued into the boiler housing and with systems with improved system efficiency (1.5 to 5 %) and geometry. Prices of the electricity from 2.5 to 22 EURcents/kWhel (excl. gas of about 3.5 EURcents/kWh), which corresponds to system costs of 340 to 3000 EUR/kWel,peak, were calculated. The price of electricity by TPV was compared with that of fuel cells and gas engines. While fuel cells are still expensive, gas engines have the disadvantage of maintenance, noise and bulkiness. TPV, in contrast, is a cost efficient alternative to produce heat and electricity, particularly in small peripheral units.

  14. Techniques for accurate estimation of net discharge in a tidal channel

    USGS Publications Warehouse

    Simpson, Michael R.; Bland, Roger

    1999-01-01

    An ultrasonic velocity meter discharge-measurement site in a tidally affected region of the Sacramento-San Joaquin rivers was used to study the accuracy of the index velocity calibration procedure. Calibration data consisting of ultrasonic velocity meter index velocity and concurrent acoustic Doppler discharge measurement data were collected during three time periods. The relative magnitude of equipment errors, acoustic Doppler discharge measurement errors, and calibration errors were evaluated. Calibration error was the most significant source of error in estimating net discharge. Using a comprehensive calibration method, net discharge estimates developed from the three sets of calibration data differed by less than an average of 4 cubic meters per second. Typical maximum flow rates during the data-collection period averaged 750 cubic meters per second.
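
    For illustration only, an index-velocity rating of the kind referred to above can be built as a simple least-squares line relating concurrent acoustic Doppler mean channel velocities to the ultrasonic meter's index velocity, then applied with a channel area to obtain signed discharge; all numbers below are invented.

        import numpy as np

        # Hypothetical calibration pairs: index velocity and concurrent
        # acoustic Doppler mean channel velocity, both in m/s (signed).
        index_v = np.array([-0.8, -0.4, -0.1, 0.2, 0.5, 0.9])
        adcp_v = np.array([-0.71, -0.36, -0.08, 0.19, 0.46, 0.83])

        # Index-velocity rating: mean_v = a + b * index_v.
        b, a = np.polyfit(index_v, adcp_v, 1)

        # Apply the rating to a continuous index-velocity record and convert
        # to discharge with the (here fixed) channel cross-sectional area.
        area_m2 = 900.0
        record = np.array([0.31, 0.28, -0.05, -0.42])
        discharge = (a + b * record) * area_m2     # cubic meters per second
        net_discharge = discharge.mean()           # tidally averaged estimate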

  15. Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

    NASA Technical Reports Server (NTRS)

    Huang, Zhaofeng; Porter, Albert A.

    1991-01-01

    The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minimums over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.
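
    To illustrate the idea of bounding reliability without a point estimate of the shape parameter, the sketch below sweeps Beta over a plausible range and keeps the worst-case bound; the particular bound formula is a standard zero-failure (success-run) Weibull result used here as a stand-in, not necessarily the construction developed in the work above.

        import numpy as np

        def reliability_lower_bound(beta, n_units, test_time, mission_time, alpha=0.05):
            """Zero-failure (1 - alpha) lower confidence bound on reliability at
            mission_time for a Weibull model with shape beta (Weibayes-style)."""
            return alpha ** ((mission_time / test_time) ** beta / n_units)

        # Sweep beta and take the global minimum, so no estimate of beta is needed.
        betas = np.linspace(0.5, 5.0, 1000)
        bounds = reliability_lower_bound(betas, n_units=30, test_time=1000.0,
                                         mission_time=800.0)
        print("worst-case reliability lower bound:", bounds.min(),
              "at beta =", betas[np.argmin(bounds)])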

  16. Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

    NASA Technical Reports Server (NTRS)

    Huang, Zhaofeng; Porter, Albert A.

    1990-01-01

    The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minimums over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.

  17. Are satellite based rainfall estimates accurate enough for crop modelling under Sahelian climate?

    NASA Astrophysics Data System (ADS)

    Ramarohetra, J.; Sultan, B.

    2012-04-01

    Agriculture is considered the most climate-dependent human activity. In West Africa, and especially in the sudano-sahelian zone, rain-fed agriculture - which represents 93% of cultivated areas and is the means of support of 70% of the active population - is highly vulnerable to precipitation variability. To better understand and anticipate climate impacts on agriculture, crop models - which estimate crop yield from climate information (e.g. rainfall, temperature, insolation, humidity) - have been developed. These crop models are useful (i) in ex ante analyses to quantify the impact of implementing different strategies - crop management (e.g. choice of varieties, sowing date), crop insurance or medium-range weather forecasts - on yields, (ii) for early warning systems and (iii) to assess future food security. Yet the successful application of these models depends on the accuracy of their climatic drivers. In the sudano-sahelian zone, the quality of precipitation estimates is therefore a key factor in understanding and anticipating climate impacts on agriculture via crop modelling and yield estimation. Different kinds of precipitation estimates can be used. Ground measurements have long time series but suffer from insufficient network density, a large proportion of missing values, delays in reporting, and limited availability. An answer to these shortcomings may lie in remote sensing, which provides satellite-based precipitation estimates. However, satellite-based rainfall estimates (SRFE) are not a direct measurement but rather an estimation of precipitation. Used as an input for crop models, their quality determines the performance of the simulated yields; hence SRFE require validation. The SARRAH crop model is used to model three different varieties of pearl millet (HKP, MTDO, Souna3) in a square degree centred on 13.5°N and 2.5°E, in Niger. Eight satellite-based rainfall daily products (PERSIANN, CMORPH, TRMM 3b42-RT, GSMAP MKV+, GPCP, TRMM 3b42v6, RFEv2 and

  18. Plant DNA Barcodes Can Accurately Estimate Species Richness in Poorly Known Floras

    PubMed Central

    Costion, Craig; Ford, Andrew; Cross, Hugh; Crayn, Darren; Harrington, Mark; Lowe, Andrew

    2011-01-01

    Background Widespread uptake of DNA barcoding technology for vascular plants has been slow due to the relatively poor resolution of species discrimination (∼70%) and low sequencing and amplification success of one of the two official barcoding loci, matK. Studies to date have mostly focused on finding a solution to these intrinsic limitations of the markers, rather than posing questions that can maximize the utility of DNA barcodes for plants with the current technology. Methodology/Principal Findings Here we test the ability of plant DNA barcodes using the two official barcoding loci, rbcLa and matK, plus an alternative barcoding locus, trnH-psbA, to estimate the species diversity of trees in a tropical rainforest plot. Species discrimination accuracy was similar to findings from previous studies but species richness estimation accuracy proved higher, up to 89%. All combinations which included the trnH-psbA locus performed better at both species discrimination and richness estimation than matK, which showed little enhanced species discriminatory power when concatenated with rbcLa. The utility of the trnH-psbA locus is limited, however, by intraspecific variation observed in some angiosperm families, which occurs as an inversion that obscures the monophyly of species. Conclusions/Significance We demonstrate for the first time, using a case study, the potential of plant DNA barcodes for the rapid estimation of species richness in taxonomically poorly known areas or cryptic populations, revealing a powerful new tool for rapid biodiversity assessment. The combination of the rbcLa and trnH-psbA loci performed better for this purpose than any two-locus combination that included matK. We show that although DNA barcodes fail to discriminate all species of plants, new perspectives and methods on biodiversity value and quantification may overshadow some of these shortcomings by applying barcode data in new ways. PMID:22096501

  19. Accurate estimation of retinal vessel width using bagged decision trees and an extended multiresolution Hermite model.

    PubMed

    Lupaşcu, Carmen Alina; Tegolo, Domenico; Trucco, Emanuele

    2013-12-01

    We present an algorithm estimating the width of retinal vessels in fundus camera images. The algorithm uses a novel parametric surface model of the cross-sectional intensities of vessels, and ensembles of bagged decision trees to estimate the local width from the parameters of the best-fit surface. We report comparative tests with REVIEW, currently the public database of reference for retinal width estimation, containing 16 images with 193 annotated vessel segments and 5066 profile points annotated manually by three independent experts. Comparative tests are reported also with our own set of 378 vessel widths selected sparsely in 38 images from the Tayside Scotland diabetic retinopathy screening programme and annotated manually by two clinicians. We obtain considerably better accuracies compared to leading methods in REVIEW tests and in Tayside tests. An important advantage of our method is its stability (success rate, i.e., meaningful measurement returned, of 100% on all REVIEW data sets and on the Tayside data set) compared to a variety of methods from the literature. We also find that results depend crucially on testing data and conditions, and discuss criteria for selecting a training set yielding optimal accuracy. PMID:24001930

  20. Compact and accurate linear and nonlinear autoregressive moving average model parameter estimation using Laguerre functions.

    PubMed

    Chon, K H; Cohen, R J; Holstein-Rathlou, N H

    1997-01-01

    A linear and nonlinear autoregressive moving average (ARMA) identification algorithm is developed for modeling time series data. The algorithm uses Laguerre expansion of kernels (LEK) to estimate Volterra-Wiener kernels. However, instead of estimating linear and nonlinear system dynamics via moving average models, as is the case for the Volterra-Wiener analysis, we propose an ARMA model-based approach. The proposed algorithm is essentially the same as LEK, but this algorithm is extended to include past values of the output as well. Thus, all of the advantages associated with using the Laguerre function remain with our algorithm; but, by extending the algorithm to the linear and nonlinear ARMA model, a significant reduction in the number of Laguerre functions can be made, compared with the Volterra-Wiener approach. This translates into a more compact system representation and makes the physiological interpretation of higher order kernels easier. Furthermore, simulation results show better performance of the proposed approach in estimating the system dynamics than LEK in certain cases, and it remains effective in the presence of significant additive measurement noise. PMID:9236985

  1. Accurate estimation of sea surface temperatures using dissolution-corrected calibrations for Mg/Ca paleothermometry

    NASA Astrophysics Data System (ADS)

    Rosenthal, Yair; Lohmann, George P.

    2002-09-01

    Paired δ¹⁸O and Mg/Ca measurements on the same foraminiferal shells offer the ability to independently estimate sea surface temperature (SST) changes and assess their temporal relationship to the growth and decay of continental ice sheets. The accuracy of this method is confounded, however, by the absence of a quantitative method to correct Mg/Ca records for alteration by dissolution. Here we describe dissolution-corrected calibrations for Mg/Ca-paleothermometry in which the preexponent constant is a function of size-normalized shell weight: (1) for G. ruber (212-300 μm), (Mg/Ca)_ruber = (0.025 wt + 0.11) e^(0.095T), and (2) for G. sacculifer (355-425 μm), (Mg/Ca)_sacc = (0.0032 wt + 0.181) e^(0.095T). The new calibrations improve the accuracy of SST estimates and are globally applicable. With this correction, eastern equatorial Atlantic SST during the Last Glacial Maximum is estimated to be 2.9° ± 0.4°C colder than today.
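
    The calibrations quoted above can be inverted for temperature; the constants in the small sketch below are copied from the abstract, while the example Mg/Ca value and size-normalized shell weight are hypothetical.

        import numpy as np

        def sst_ruber(mg_ca, shell_weight):
            """Invert (Mg/Ca)_ruber = (0.025 wt + 0.11) e^(0.095 T) for T (deg C)."""
            return np.log(mg_ca / (0.025 * shell_weight + 0.11)) / 0.095

        def sst_sacculifer(mg_ca, shell_weight):
            """Invert (Mg/Ca)_sacc = (0.0032 wt + 0.181) e^(0.095 T) for T (deg C)."""
            return np.log(mg_ca / (0.0032 * shell_weight + 0.181)) / 0.095

        # Hypothetical example: Mg/Ca of 4.2 mmol/mol and a size-normalized
        # shell weight of 14 (units as in the original calibration).
        print(sst_ruber(4.2, 14.0))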

  2. 14 CFR 151.24 - Procedures: Application; information on estimated project costs.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... estimated project costs. 151.24 Section 151.24 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Development Projects § 151.24 Procedures: Application; information on estimated project costs. (a) If any part of the estimated project costs consists of the value of donated land, labor, materials, or...

  3. 14 CFR 151.24 - Procedures: Application; information on estimated project costs.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... estimated project costs. 151.24 Section 151.24 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Development Projects § 151.24 Procedures: Application; information on estimated project costs. (a) If any part of the estimated project costs consists of the value of donated land, labor, materials, or...

  4. 14 CFR 151.24 - Procedures: Application; information on estimated project costs.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... estimated project costs. 151.24 Section 151.24 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Development Projects § 151.24 Procedures: Application; information on estimated project costs. (a) If any part of the estimated project costs consists of the value of donated land, labor, materials, or...

  5. 14 CFR 151.24 - Procedures: Application; information on estimated project costs.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... estimated project costs. 151.24 Section 151.24 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Development Projects § 151.24 Procedures: Application; information on estimated project costs. (a) If any part of the estimated project costs consists of the value of donated land, labor, materials, or...

  6. 14 CFR 151.24 - Procedures: Application; information on estimated project costs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... estimated project costs. 151.24 Section 151.24 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Development Projects § 151.24 Procedures: Application; information on estimated project costs. (a) If any part of the estimated project costs consists of the value of donated land, labor, materials, or...

  7. Taking the Evolutionary Road to Developing an In-House Cost Estimate

    NASA Technical Reports Server (NTRS)

    Jacintho, David; Esker, Lind; Herman, Frank; Lavaque, Rodolfo; Regardie, Myma

    2011-01-01

    This slide presentation reviews the process and some of the problems and challenges of developing an In-House Cost Estimate (IHCE). Using as an example the Space Network Ground Segment Sustainment (SGSS) project, the presentation reviews the phases for developing a Cost estimate within the project to estimate government and contractor project costs to support a budget request.

  8. 40 CFR 144.62 - Cost estimate for plugging and abandonment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Cost estimate for plugging and... Waste Injection Wells § 144.62 Cost estimate for plugging and abandonment. (a) The owner or operator must prepare a written estimate, in current dollars, of the cost of plugging the injection well...

  9. An estimate of the global health care and lost productivity costs of dengue.

    PubMed

    Selck, Frederic W; Adalja, Amesh A; Boddie, Crystal R

    2014-11-01

    Contemporary cost estimates of dengue fever are difficult to attain in many countries in which the disease is endemic. By applying publicly available health care costs and wage data to recently available country-level estimates of dengue incidence, we estimate the total cost of dengue to be nearly 40 billion dollars in 2011. PMID:25409275
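
    The kind of bottom-up calculation described - per-country cases multiplied by a per-case health-care cost plus wage-valued productivity loss, summed over countries - can be illustrated with the toy figures below; every number is hypothetical.

        # Hypothetical per-country inputs: annual cases, direct health-care cost
        # per case (USD), work days lost per case, and average daily wage (USD).
        countries = {
            "A": dict(cases=2.5e6, care_cost=45.0, days_lost=6.0, daily_wage=12.0),
            "B": dict(cases=0.8e6, care_cost=120.0, days_lost=5.0, daily_wage=30.0),
        }

        total = sum(c["cases"] * (c["care_cost"] + c["days_lost"] * c["daily_wage"])
                    for c in countries.values())
        print(f"estimated total cost: ${total / 1e9:.2f} billion")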

  10. Handbook for quick cost estimates. A method for developing quick approximate estimates of costs for generic actions for nuclear power plants

    SciTech Connect

    Ball, J.R.

    1986-04-01

    This document is a supplement to a "Handbook for Cost Estimating" (NUREG/CR-3971) and provides specific guidance for developing "quick" approximate estimates of the cost of implementing generic regulatory requirements for nuclear power plants. A method is presented for relating the known construction costs for new nuclear power plants (as contained in the Energy Economic Data Base) to the cost of performing similar work, on a back-fit basis, at existing plants. Cost factors are presented to account for variations in such important cost areas as construction labor productivity, engineering and quality assurance, replacement energy, reworking of existing features, and regional variations in the cost of materials and labor. Other cost categories addressed in this handbook include those for changes in plant operating personnel and plant documents, licensee costs, NRC costs, and costs for other government agencies. Data sheets, worksheets, and appropriate cost algorithms are included to guide the user through preparation of rough estimates. A sample estimate is prepared using the method and the estimating tools provided.
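
    A hypothetical sketch of the factor-based roll-up such a quick estimate involves is shown below; the factor names and values are illustrative placeholders chosen for this example, not figures from the handbook.

        # Illustrative adjustment of a new-construction (EEDB) base cost to a
        # back-fit estimate at an operating plant; all values are examples.
        base_labor_hours = 12_000        # from the applicable EEDB line item
        base_material_cost = 450_000     # USD, from the applicable EEDB line item

        factors = dict(
            labor_productivity=1.6,      # back-fit work is less efficient
            rework_existing=1.10,        # reworking existing plant features
            engineering_qa=1.35,         # added engineering and quality assurance
            regional_labor_rate=42.0,    # USD per craft labor hour
            regional_material=1.05,      # regional material cost adjustment
        )

        labor_cost = (base_labor_hours * factors["labor_productivity"]
                      * factors["rework_existing"] * factors["regional_labor_rate"])
        material_cost = base_material_cost * factors["regional_material"]
        total_estimate = (labor_cost + material_cost) * factors["engineering_qa"]
        print(f"quick back-fit estimate: ${total_estimate:,.0f}")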

  11. Higher Accurate Estimation of Axial and Bending Stiffnesses of Plates Clamped by Bolts

    NASA Astrophysics Data System (ADS)

    Naruse, Tomohiro; Shibutani, Yoji

    The equivalent stiffness of clamped plates must be prescribed not only to evaluate the strength of bolted joints by the “joint diagram” scheme but also to perform structural analyses of practical structures with many bolted joints. We estimated the axial and bending stiffnesses of clamped plates using Finite Element (FE) analyses that take into account the contact conditions on the bearing surfaces and between the plates. The FE models were constructed for bolted joints tightened with M8, M10, M12 and M16 bolts and plate thicknesses of 3.2, 4.5, 6.0 and 9.0 mm, and the axial and bending compliances were evaluated precisely. These compliances of clamped plates were compared with those from the VDI 2230 (2003) code, which assumes an equivalent conical compressive stress field in the plate. The code gives an axial stiffness about 11% larger and a bending stiffness about 22% larger than the FE results, and it cannot be applied to clamped plates of differing thicknesses; it therefore yields a lower bolt stress (an unsafe estimate). We modified the tangent of the vertical angle, tanφ, of the equivalent cone by adding a term in the logarithm of the thickness ratio t1/t2 and fitting to the analysis results. The modified tanφ estimates the axial compliance with errors from -1.5% to 6.8% and the bending compliance with errors from -6.5% to 10%. Furthermore, the modified tanφ takes the thickness difference into consideration.

  12. Innovation in the pharmaceutical industry: New estimates of R&D costs.

    PubMed

    DiMasi, Joseph A; Grabowski, Henry G; Hansen, Ronald W

    2016-05-01

    The research and development costs of 106 randomly selected new drugs were obtained from a survey of 10 pharmaceutical firms. These data were used to estimate the average pre-tax cost of new drug and biologics development. The costs of compounds abandoned during testing were linked to the costs of compounds that obtained marketing approval. The estimated average out-of-pocket cost per approved new compound is $1395 million (2013 dollars). Capitalizing out-of-pocket costs to the point of marketing approval at a real discount rate of 10.5% yields a total pre-approval cost estimate of $2558 million (2013 dollars). When compared to the results of the previous study in this series, total capitalized costs were shown to have increased at an annual rate of 8.5% above general price inflation. Adding an estimate of post-approval R&D costs increases the cost estimate to $2870 million (2013 dollars). PMID:26928437
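
    The capitalization step can be illustrated as follows: out-of-pocket spending in each year before approval is compounded forward at the 10.5% real rate reported above. The year-by-year profile below is an assumption made for this example (it sums to the reported $1395 million), so the resulting capitalized figure is illustrative rather than a reproduction of the study's $2558 million.

        # Assumed out-of-pocket spending (millions of 2013 dollars), keyed by
        # years before marketing approval; the 10.5% real rate is from the study.
        spending = {10: 80, 8: 150, 6: 250, 4: 350, 2: 400, 0: 165}
        rate = 0.105

        capitalized = sum(amount * (1 + rate) ** years
                          for years, amount in spending.items())
        out_of_pocket = sum(spending.values())
        print(f"out-of-pocket: ${out_of_pocket}M, capitalized: ${capitalized:,.0f}M")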

  13. Accurate estimation of airborne ultrasonic time-of-flight for overlapping echoes.

    PubMed

    Sarabia, Esther G; Llata, Jose R; Robla, Sandra; Torre-Ferrero, Carlos; Oria, Juan P

    2013-01-01

    In this work, an analysis of the transmission of ultrasonic signals generated by piezoelectric sensors for air applications is presented. Based on this analysis, an ultrasonic response model is obtained for its application to the recognition of objects and structured environments for navigation by autonomous mobile robots. This model enables the analysis of the ultrasonic response that is generated using a pair of sensors in transmitter-receiver configuration using the pulse-echo technique. This is very interesting for recognizing surfaces that simultaneously generate a multiple echo response. This model takes into account the effect of the radiation pattern, the resonant frequency of the sensor, the number of cycles of the excitation pulse, the dynamics of the sensor and the attenuation with distance in the medium. This model has been developed, programmed and verified through a battery of experimental tests. Using this model a new procedure for obtaining accurate time of flight is proposed. This new method is compared with traditional ones, such as threshold or correlation, to highlight its advantages and drawbacks. Finally the advantages of this method are demonstrated for calculating multiple times of flight when the echo is formed by several overlapping echoes. PMID:24284774
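
    For context, the "traditional" correlation method mentioned above as a point of comparison can be sketched in a few lines: the time of flight is read from the lag of the cross-correlation peak between the transmitted burst and the received signal. The sampling rate, pulse shape and delay below are made-up values, and this is not the model-based procedure proposed in the work itself.

        import numpy as np

        fs = 1e6                                   # sampling rate (Hz), assumed
        t = np.arange(0, 2e-3, 1 / fs)

        # Synthetic transmitted burst: a windowed 40 kHz tone of a few cycles.
        burst = np.sin(2 * np.pi * 40e3 * t[:200]) * np.hanning(200)

        # Synthetic received signal: a delayed, attenuated echo plus noise.
        true_tof = 580e-6
        echo = np.zeros_like(t)
        start = int(true_tof * fs)
        echo[start:start + 200] = 0.3 * burst
        echo += 0.01 * np.random.default_rng(2).standard_normal(t.size)

        # Correlation-based TOF: lag of the cross-correlation maximum.
        xcorr = np.correlate(echo, burst, mode="full")
        lag = np.argmax(xcorr) - (burst.size - 1)
        print("estimated TOF:", lag / fs, "s (true:", true_tof, "s)")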

  14. Accurate Estimation of Airborne Ultrasonic Time-of-Flight for Overlapping Echoes

    PubMed Central

    Sarabia, Esther G.; Llata, Jose R.; Robla, Sandra; Torre-Ferrero, Carlos; Oria, Juan P.

    2013-01-01

    In this work, an analysis of the transmission of ultrasonic signals generated by piezoelectric sensors for air applications is presented. Based on this analysis, an ultrasonic response model is obtained for its application to the recognition of objects and structured environments for navigation by autonomous mobile robots. This model enables the analysis of the ultrasonic response that is generated using a pair of sensors in transmitter-receiver configuration using the pulse-echo technique. This is very interesting for recognizing surfaces that simultaneously generate a multiple echo response. This model takes into account the effect of the radiation pattern, the resonant frequency of the sensor, the number of cycles of the excitation pulse, the dynamics of the sensor and the attenuation with distance in the medium. This model has been developed, programmed and verified through a battery of experimental tests. Using this model a new procedure for obtaining accurate time of flight is proposed. This new method is compared with traditional ones, such as threshold or correlation, to highlight its advantages and drawbacks. Finally the advantages of this method are demonstrated for calculating multiple times of flight when the echo is formed by several overlapping echoes. PMID:24284774

  15. Cost estimation: An expert-opinion approach. [cost analysis of research projects using the Delphi method (forecasting)]

    NASA Technical Reports Server (NTRS)

    Buffalano, C.; Fogleman, S.; Gielecki, M.

    1976-01-01

    A methodology is outlined which can be used to estimate the costs of research and development projects. The approach uses the Delphi technique, a method developed by the Rand Corporation for systematically eliciting and evaluating group judgments in an objective manner. The use of the Delphi technique allows for the integration of expert opinion into the cost-estimating process in a consistent and rigorous fashion. This approach can also signal potential cost-problem areas. This result can be a useful tool in planning additional cost analysis or in estimating contingency funds. A Monte Carlo approach is also examined.
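
    A minimal sketch of the Monte Carlo step, assuming each cost element has been given low / most-likely / high values through a Delphi-style elicitation and is sampled from a triangular distribution; the elements and dollar figures below are invented.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical expert-elicited (low, most likely, high) estimates, $K.
        elements = {
            "design": (120, 180, 300),
            "hardware": (400, 550, 900),
            "integration": (150, 220, 420),
            "test": (80, 120, 250),
        }

        n_trials = 100_000
        total = np.zeros(n_trials)
        for low, mode, high in elements.values():
            total += rng.triangular(low, mode, high, n_trials)

        p50, p80 = np.percentile(total, [50, 80])
        print(f"median: ${p50:,.0f}K, 80th percentile: ${p80:,.0f}K, "
              f"suggested contingency: ${p80 - p50:,.0f}K")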

  16. Modelling the Constraints of Spatial Environment in Fauna Movement Simulations: Comparison of a Boundaries Accurate Function and a Cost Function

    NASA Astrophysics Data System (ADS)

    Jolivet, L.; Cohen, M.; Ruas, A.

    2015-08-01

    Landscape influences fauna movement at different levels, from habitat selection to choices of movements' direction. Our goal is to provide a development frame in order to test simulation functions for animal movement. We describe our approach for such simulations and we compare two types of functions to calculate trajectories. To do so, we first modelled the role of landscape elements to differentiate between elements that facilitate movements and those that are hindrances. Different influences are identified depending on landscape elements and on animal species. Knowledge was gathered from ecologists, literature and observation datasets. Second, we analysed the description of animal movement recorded with GPS at fine scale, corresponding to high temporal frequency and good location accuracy. Analysing this type of data provides information on the relation between landscape features and movements. We implemented an agent-based simulation approach to calculate potential trajectories constrained by the spatial environment and the individual's behaviour. We tested two functions that consider space differently: one function takes into account the geometry and the types of landscape elements and one cost function sums up the spatial surroundings of an individual. Results highlight the fact that the cost function exaggerates the distances travelled by an individual and simplifies movement patterns. The geometry accurate function represents a good bottom-up approach for discovering interesting areas or obstacles for movements.

  17. Voxel-based registration of simulated and real patient CBCT data for accurate dental implant pose estimation

    NASA Astrophysics Data System (ADS)

    Moreira, António H. J.; Queirós, Sandro; Morais, Pedro; Rodrigues, Nuno F.; Correia, André Ricardo; Fernandes, Valter; Pinho, A. C. M.; Fonseca, Jaime C.; Vilaça, João. L.

    2015-03-01

    The success of dental implant-supported prostheses is directly linked to the accuracy obtained during the implant's pose estimation (position and orientation). Although traditional impression techniques and recent digital acquisition methods are acceptably accurate, a simultaneously fast, accurate and operator-independent methodology is still lacking. Hereto, an image-based framework is proposed to estimate the patient-specific implant's pose using cone-beam computed tomography (CBCT) and prior knowledge of the implanted model. The pose estimation is accomplished in a three-step approach: (1) a region-of-interest is extracted from the CBCT data using 2 operator-defined points along the implant's main axis; (2) a simulated CBCT volume of the known implanted model is generated through Feldkamp-Davis-Kress reconstruction and coarsely aligned to the defined axis; and (3) a voxel-based rigid registration is performed to optimally align both patient and simulated CBCT data, extracting the implant's pose from the optimal transformation. Three experiments were performed to evaluate the framework: (1) an in silico study using 48 implants distributed through 12 tridimensional synthetic mandibular models; (2) an in vitro study using an artificial mandible with 2 dental implants acquired with an i-CAT system; and (3) two clinical case studies. The results showed positional errors of 67+/-34μm and 108μm, and angular misfits of 0.15+/-0.08° and 1.4°, for experiments 1 and 2, respectively. Moreover, in experiment 3, visual assessment of the clinical data showed a coherent alignment of the reference implant. Overall, a novel image-based framework for implant pose estimation from CBCT data was proposed, showing accurate results in agreement with dental prosthesis modelling requirements.

  18. 48 CFR 9905.501 - Cost accounting standard-consistency in estimating, accumulating and reporting costs by...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Cost accounting standard-consistency in estimating, accumulating and reporting costs by educational institutions. 9905.501 Section 9905.501 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF...

  19. An Energy-Efficient Strategy for Accurate Distance Estimation in Wireless Sensor Networks

    PubMed Central

    Tarrío, Paula; Bernardos, Ana M.; Casar, José R.

    2012-01-01

    In line with recent research efforts made to conceive energy saving protocols and algorithms and power sensitive network architectures, in this paper we propose a transmission strategy to minimize the energy consumption in a sensor network when using a localization technique based on the measurement of the strength (RSS) or the time of arrival (TOA) of the received signal. In particular, we find the transmission power and the packet transmission rate that jointly minimize the total consumed energy, while ensuring at the same time a desired accuracy in the RSS or TOA measurements. We also propose some corrections to these theoretical results to take into account the effects of shadowing and packet loss in the propagation channel. The proposed strategy is shown to be effective in realistic scenarios providing energy savings with respect to other transmission strategies, and also guaranteeing a given accuracy in the distance estimations, which will serve to guarantee a desired accuracy in the localization result. PMID:23202218
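
    The RSS-based ranging underlying this trade-off is commonly an inversion of the log-distance path-loss model; the reference power, reference distance and path-loss exponent in the sketch below are assumed values, not parameters from the paper.

        import numpy as np

        def rss_to_distance(rss_dbm, p0_dbm=-40.0, d0=1.0, n=2.7):
            """Invert RSS(d) = P0 - 10 n log10(d / d0) for the distance d (m)."""
            return d0 * 10 ** ((p0_dbm - rss_dbm) / (10 * n))

        # Averaging several packets reduces the variance of the RSS estimate,
        # which is why the packet rate enters the energy/accuracy trade-off.
        readings_dbm = np.array([-63.1, -61.8, -64.0, -62.5])
        print(rss_to_distance(readings_dbm.mean()))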

  20. Energetic costs of mange in wolves estimated from infrared thermography

    USGS Publications Warehouse

    Cross, Paul C.; Almberg, Emily S.; Haase, Catherine G; Hudson, Peter J.; Maloney, Shane K; Metz, Matthew C; Munn, Adam J; Nugent, Paul; Putzeys, Olivier; Stahler, Daniel R.; Stewart, Anya C; Smith, Doug W.

    2016-01-01

    Parasites, by definition, extract energy from their hosts and thus affect trophic and food web dynamics even when the parasite may have limited effects on host population size. We studied the energetic costs of mange (Sarcoptes scabiei) in wolves (Canis lupus) using thermal cameras to estimate heat losses associated with compromised insulation during the winter. We combined field data from known, naturally infected wolves with a data set on captive wolves with shaved patches of fur as a positive control to simulate mange-induced hair loss. We predict that during the winter in Montana, more severe mange infection increases heat loss by around 5.2 to 12 MJ per night (1240 to 2850 kcal, or a 65% to 78% increase) for small and large wolves, respectively, accounting for wind effects. Offsetting this loss to maintain body temperature would require a significant proportion of a healthy wolf's total daily energy demands (18-22 MJ/day). We also predict how these thermal costs may increase in colder climates by comparing our predictions in Bozeman, Montana to those from a place with lower ambient temperatures (Fairbanks, Alaska). Contrary to our expectations, the 14°C differential between these regions was not as important as the potential differences in wind speed. These large increases in energetic demands can be mitigated by either increasing consumption rates or decreasing other energy demands. Data from GPS-collared wolves indicated that healthy wolves move, on average, 17 km per day, which was reduced by 1.5, 1.8 and 6.5 km for light, medium, and severe hair loss. In addition, the wolf with the most hair loss was less active at night and more active during the day, which is the converse of the movement patterns of healthy wolves. At the individual level, mange infections create significant energy demands and altered behavioral patterns; these may have cascading effects on prey consumption rates, food web dynamics, predator-prey interactions, and scavenger communities.

  1. [Research on maize multispectral image accurate segmentation and chlorophyll index estimation].

    PubMed

    Wu, Qian; Sun, Hong; Li, Min-zan; Song, Yuan-yuan; Zhang, Yan-e

    2015-01-01

    In order to rapidly acquire maize growing information in the field, a non-destructive method of maize chlorophyll content index measurement was developed based on multi-spectral imaging and image processing technology. The experiment was conducted at Yangling in Shaanxi province of China, and the crop was Zheng-dan 958 planted in an experiment field of about 1 000 m × 600 m. Firstly, a 2-CCD multi-spectral image monitoring system was used to acquire the canopy images. The system was based on a dichroic prism, allowing precise separation of the visible (Blue (B), Green (G), Red (R): 400-700 nm) and near-infrared (NIR, 760-1 000 nm) bands. The multispectral images were output as RGB and NIR images via the system, which was fixed vertically 2 m above the ground with an angular field of 50°. The SPAD index of each sample was measured synchronously to indicate the chlorophyll content. Secondly, after image smoothing using an adaptive smooth filtering algorithm, the NIR maize image was selected to segment the maize leaves from the background, because the gray histogram showed a large difference between plant and soil background. The NIR image segmentation algorithm followed steps of preliminary and accurate segmentation: (1) The results of the OTSU image segmentation method and the variable threshold algorithm were compared, and the latter proved better for corn plant and weed segmentation. As a result, the variable threshold algorithm based on local statistics was selected for the preliminary image segmentation. Expansion and corrosion (dilation and erosion) were used to optimize the segmented image. (2) The region labeling algorithm was used to segment corn plants from soil and weed background with an accuracy of 95.59%. And then, the multi-spectral image of maize canopy was accurately segmented in the R, G and B bands separately. Thirdly, the image parameters were abstracted based on the segmented visible and NIR images. The average gray
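
    A rough sketch of the preliminary segmentation described above (a variable threshold based on local statistics of the NIR band, morphological clean-up, and region labeling), assuming scipy.ndimage; the window size, offset and input image are placeholders.

        import numpy as np
        from scipy import ndimage

        def segment_canopy(nir, window=51, offset=0.05):
            """Mark a pixel as plant if it exceeds the local mean of the NIR
            band by a fixed offset (a variable threshold from local statistics)."""
            local_mean = ndimage.uniform_filter(nir.astype(float), size=window)
            mask = nir > local_mean + offset
            # Morphological clean-up (dilation/erosion) and component labeling.
            mask = ndimage.binary_opening(mask, iterations=2)
            mask = ndimage.binary_closing(mask, iterations=2)
            labels, n_regions = ndimage.label(mask)
            return labels, n_regions

        # Placeholder NIR image in [0, 1]; real input comes from the 2-CCD camera.
        nir = np.random.default_rng(4).random((480, 640))
        labels, n_regions = segment_canopy(nir)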

  2. Developing a Cost Model and Methodology to Estimate Capital Costs for Thermal Energy Storage

    SciTech Connect

    Glatzmaier, G.

    2011-12-01

    This report provides an update on the previous cost model for thermal energy storage (TES) systems. The update allows NREL to estimate the costs of such systems that are compatible with the higher operating temperatures associated with advanced power cycles. The goal of the Department of Energy (DOE) Solar Energy Technology Program is to develop solar technologies that can make a significant contribution to the United States domestic energy supply. The recent DOE SunShot Initiative sets a very aggressive cost goal to reach a Levelized Cost of Energy (LCOE) of 6 cents/kWh by 2020 with no incentives or credits for all solar-to-electricity technologies.1 As this goal is reached, the share of utility power generation that is provided by renewable energy sources is expected to increase dramatically. Because Concentrating Solar Power (CSP) is currently the only renewable technology that is capable of integrating cost-effective energy storage, it is positioned to play a key role in providing renewable, dispatchable power to utilities as the share of power generation from renewable sources increases. Because of this role, future CSP plants will likely have as much as 15 hours of Thermal Energy Storage (TES) included in their design and operation. As such, the cost and performance of the TES system is critical to meeting the SunShot goal for solar technologies. The cost of electricity from a CSP plant depends strongly on its overall efficiency, which is a product of two components - the collection and conversion efficiencies. The collection efficiency determines the portion of incident solar energy that is captured as high-temperature thermal energy. The conversion efficiency determines the portion of thermal energy that is converted to electricity. The operating temperature at which the overall efficiency reaches its maximum depends on many factors, including material properties of the CSP plant components. Increasing the operating temperature of the power generation

  3. Breckinridge Project, initial effort. Report IX. Operating cost estimate

    SciTech Connect

    1982-01-01

    Operating costs are normally broken into three major categories: variable costs including raw materials, annual catalyst and chemicals, and utilities; semi-variable costs including labor and labor related cost; and fixed or capital related charges. The raw materials and utilities costs are proportional to production; however, a small component of utilities cost is independent of production. The catalyst and chemicals costs are also normally proportional to production. Semi-variable costs include direct labor, maintenance labor, labor supervision, contract maintenance, maintenance materials, payroll overheads, operation supplies, and general overhead and administration. Fixed costs include local taxes, insurance and the time value of the capital investment. The latter charge often includes the investor's anticipated return on investment. In determining operating costs for financial analysis, return on investment (ROI) and depreciation are not treated as cash operating costs. These costs are developed in the financial analysis; the annual operating cost determined here omits ROI and depreciation. Project Annual Operating Costs are summarized in Table 1. Detailed supporting information for the cost elements listed below is included in the following sections: Electrical, catalyst and chemicals, and salaries and wages.
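
    The roll-up implied by this breakdown is a straightforward sum of the three categories, with ROI and depreciation excluded as noted; the line items and dollar values in the sketch below are placeholders, not figures from the report.

        # Placeholder annual operating cost roll-up (millions of dollars).
        variable = {"raw_materials": 310.0, "catalyst_and_chemicals": 22.0,
                    "utilities": 48.0}
        semi_variable = {"direct_labor": 35.0, "maintenance": 35.0,
                         "supervision": 6.0, "payroll_overheads": 14.0,
                         "operating_supplies": 5.0, "g_and_a": 18.0}
        fixed = {"local_taxes": 12.0, "insurance": 9.0}  # ROI/depreciation excluded

        annual_operating_cost = (sum(variable.values())
                                 + sum(semi_variable.values())
                                 + sum(fixed.values()))
        print(f"annual operating cost: ${annual_operating_cost:.1f} million")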

  4. The challenges of accurately estimating time of long bone injury in children.

    PubMed

    Pickett, Tracy A

    2015-07-01

    The ability to determine the time an injury occurred can be of crucial significance in forensic medicine and holds special relevance to the investigation of child abuse. However, dating paediatric long bone injury, including fractures, is nuanced by complexities specific to the paediatric population. These challenges include the ability to identify bone injury in a growing or only partially-calcified skeleton, different injury patterns seen within the spectrum of the paediatric population, the effects of bone growth on healing as a separate entity from injury, differential healing rates seen at different ages, and the relative scarcity of information regarding healing rates in children, especially the very young. The challenges posed by these factors are compounded by a lack of consistency in defining and categorizing healing parameters. This paper sets out the primary limitations of existing knowledge regarding estimating timing of paediatric bone injury. Consideration and understanding of the multitude of factors affecting bone injury and healing in children will assist those providing opinion in the medical-legal forum. PMID:26048508

  5. Accurate estimation of normal incidence absorption coefficients with confidence intervals using a scanning laser Doppler vibrometer

    NASA Astrophysics Data System (ADS)

    Vuye, Cedric; Vanlanduit, Steve; Guillaume, Patrick

    2009-06-01

    When using optical measurements of the sound fields inside a glass tube, near the material under test, to estimate the reflection and absorption coefficients, not only these acoustical parameters but also confidence intervals can be determined. The sound fields are visualized using a scanning laser Doppler vibrometer (SLDV). In this paper the influence of different test signals on the quality of the results, obtained with this technique, is examined. The amount of data gathered during one measurement scan makes a thorough statistical analysis possible leading to the knowledge of confidence intervals. The use of a multi-sine, constructed on the resonance frequencies of the test tube, shows to be a very good alternative for the traditional periodic chirp. This signal offers the ability to obtain data for multiple frequencies in one measurement, without the danger of a low signal-to-noise ratio. The variability analysis in this paper clearly shows the advantages of the proposed multi-sine compared to the periodic chirp. The measurement procedure and the statistical analysis are validated by measuring the reflection ratio at a closed end and comparing the results with the theoretical value. Results of the testing of two building materials (an acoustic ceiling tile and linoleum) are presented and compared to supplier data.

  6. Accurate Estimation of Protein Folding and Unfolding Times: Beyond Markov State Models.

    PubMed

    Suárez, Ernesto; Adelman, Joshua L; Zuckerman, Daniel M

    2016-08-01

    Because standard molecular dynamics (MD) simulations are unable to access time scales of interest in complex biomolecular systems, it is common to "stitch together" information from multiple shorter trajectories using approximate Markov state model (MSM) analysis. However, MSMs may require significant tuning and can yield biased results. Here, by analyzing some of the longest protein MD data sets available (>100 μs per protein), we show that estimators constructed based on exact non-Markovian (NM) principles can yield significantly improved mean first-passage times (MFPTs) for protein folding and unfolding. In some cases, MSM bias of more than an order of magnitude can be corrected when identical trajectory data are reanalyzed by non-Markovian approaches. The NM analysis includes "history" information, higher order time correlations compared to MSMs, that is available in every MD trajectory. The NM strategy is insensitive to fine details of the states used and works well when a fine time-discretization (i.e., small "lag time") is used. PMID:27340835

  7. ProViDE: A software tool for accurate estimation of viral diversity in metagenomic samples

    PubMed Central

    Ghosh, Tarini Shankar; Mohammed, Monzoorul Haque; Komanduri, Dinakar; Mande, Sharmila Shekhar

    2011-01-01

    Given the absence of universal marker genes in the viral kingdom, researchers typically use BLAST (with stringent E-values) for taxonomic classification of viral metagenomic sequences. Since the majority of metagenomic sequences originate from hitherto unknown viral groups, using stringent E-values results in most sequences remaining unclassified. Furthermore, using less stringent E-values results in a high number of incorrect taxonomic assignments. The SOrt-ITEMS algorithm provides an approach to address the above issues. Based on alignment parameters, SOrt-ITEMS follows an elaborate work-flow for assigning reads originating from hitherto unknown archaeal/bacterial genomes. In SOrt-ITEMS, alignment parameter thresholds were generated by observing patterns of sequence divergence within and across various taxonomic groups belonging to bacterial and archaeal kingdoms. However, many taxonomic groups within the viral kingdom lack a typical Linnean-like taxonomic hierarchy. In this paper, we present ProViDE (Program for Viral Diversity Estimation), an algorithm that uses a customized set of alignment parameter thresholds, specifically suited for viral metagenomic sequences. These thresholds capture the pattern of sequence divergence and the non-uniform taxonomic hierarchy observed within/across various taxonomic groups of the viral kingdom. Validation results indicate that the percentage of ‘correct’ assignments by ProViDE is around 1.7 to 3 times higher than that by the widely used similarity based method MEGAN. The misclassification rate of ProViDE is around 3 to 19% (as compared to 5 to 42% by MEGAN), indicating significantly better assignment accuracy. The ProViDE software and a supplementary file (containing supplementary figures and tables referred to in this article) are available for download from http://metagenomics.atc.tcs.com/binning/ProViDE/ PMID:21544173

  8. A new method based on the subpixel Gaussian model for accurate estimation of asteroid coordinates

    NASA Astrophysics Data System (ADS)

    Savanevych, V. E.; Briukhovetskyi, O. B.; Sokovikova, N. S.; Bezkrovny, M. M.; Vavilova, I. B.; Ivashchenko, Yu. M.; Elenin, L. V.; Khlamov, S. V.; Movsesian, Ia. S.; Dashkova, A. M.; Pogorelov, A. V.

    2015-08-01

    We describe a new iteration method to estimate asteroid coordinates, based on a subpixel Gaussian model of the discrete object image. The method operates by continuous parameters (asteroid coordinates) in a discrete observational space (the set of pixel potentials) of the CCD frame. In this model, the kind of coordinate distribution of the photons hitting a pixel of the CCD frame is known a priori, while the associated parameters are determined from a real digital object image. The method that is developed, which is flexible in adapting to any form of object image, has a high measurement accuracy along with a low calculating complexity, due to the maximum-likelihood procedure that is implemented to obtain the best fit instead of a least-squares method and Levenberg-Marquardt algorithm for minimization of the quadratic form. Since 2010, the method has been tested as the basis of our Collection Light Technology (COLITEC) software, which has been installed at several observatories across the world with the aim of the automatic discovery of asteroids and comets in sets of CCD frames. As a result, four comets (C/2010 X1 (Elenin), P/2011 NO1(Elenin), C/2012 S1 (ISON) and P/2013 V3 (Nevski)) as well as more than 1500 small Solar system bodies (including five near-Earth objects (NEOs), 21 Trojan asteroids of Jupiter and one Centaur object) have been discovered. We discuss these results, which allowed us to compare the accuracy parameters of the new method and confirm its efficiency. In 2014, the COLITEC software was recommended to all members of the Gaia-FUN-SSO network for analysing observations as a tool to detect faint moving objects in frames.
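
    The core of such subpixel centroiding is fitting a smooth point-spread model to the pixel counts; the sketch below fits a 2D Gaussian by least squares with scipy's curve_fit on synthetic data, which only approximates the maximum-likelihood fit over pixel potentials that the method itself uses.

        import numpy as np
        from scipy.optimize import curve_fit

        def gaussian2d(coords, amp, x0, y0, sigma, background):
            x, y = coords
            return amp * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * sigma**2)) + background

        # Synthetic CCD cutout of a point-like object with Poisson noise.
        yy, xx = np.mgrid[0:15, 0:15].astype(float)
        truth = gaussian2d((xx, yy), 500.0, 7.3, 6.8, 1.6, 20.0)
        image = np.random.default_rng(5).poisson(truth).astype(float)

        p0 = [image.max(), 7.0, 7.0, 2.0, np.median(image)]
        popt, _ = curve_fit(gaussian2d, (xx.ravel(), yy.ravel()), image.ravel(), p0=p0)
        print("subpixel centroid:", popt[1], popt[2])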

  9. Zero-Cost Estimation of Zero-Point Energies.

    PubMed

    Császár, Attila G; Furtenbacher, Tibor

    2015-10-01

    An additive, linear, atom-type-based (ATB) scheme is developed allowing no-cost estimation of zero-point vibrational energies (ZPVE) of neutral, closed-shell molecules in their ground electronic states. The atom types employed correspond to those defined within the MM2 molecular mechanics force field approach. The reference training set of 156 molecules covers chained and branched alkanes, alkenes, cycloalkanes and cycloalkenes, alkynes, alcohols, aldehydes, carboxylic acids, amines, amides, ethers, esters, ketones, benzene derivatives, heterocycles, nucleobases, all the natural amino acids, some dipeptides and sugars, as well as further simple molecules and ones containing several structural units, including several vitamins. A weighted linear least-squares fit of atom-type-based ZPVE increments results in recommended values for the following atoms, with the number of atom types defined in parentheses: H(8), D(1), B(1), C(6), N(7), O(3), F(1), Si(1), P(2), S(3), and Cl(1). The average accuracy of the ATB ZPVEs is considerably better than 1 kcal mol(-1), that is, better than chemical accuracy. The proposed ATB scheme could be extended to many more atoms and atom types, following a careful validation procedure; deviation from the MM2 atom types seems to be necessary, especially for third-row elements. PMID:26398318
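
    The additive scheme amounts to a (weighted) linear least-squares fit of per-atom-type increments to reference ZPVEs; the toy sketch below uses made-up atom types, counts and reference values purely to show the mechanics.

        import numpy as np

        # Toy reference set: each row counts the atoms of each type in a molecule,
        # and zpve_ref holds its reference ZPVE (kcal/mol). Values are made up.
        atom_types = ["H", "C_sp3", "C_sp2", "O"]
        counts = np.array([
            [4, 1, 0, 0],   # methane-like
            [6, 2, 0, 0],   # ethane-like
            [4, 0, 2, 0],   # ethene-like
            [4, 1, 0, 1],   # methanol-like
            [2, 0, 0, 1],   # water-like
        ], dtype=float)
        zpve_ref = np.array([27.7, 46.3, 31.5, 31.9, 13.3])
        weights = np.ones_like(zpve_ref)       # could down-weight uncertain data

        # Weighted linear least squares for the per-atom-type increments.
        w = np.sqrt(weights)[:, None]
        increments, *_ = np.linalg.lstsq(counts * w, zpve_ref * w.ravel(), rcond=None)
        estimates = counts @ increments        # no-cost ZPVE estimates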

  10. Figure of merit of diamond power devices based on accurately estimated impact ionization processes

    NASA Astrophysics Data System (ADS)

    Hiraiwa, Atsushi; Kawarada, Hiroshi

    2013-07-01

    Although a high breakdown voltage or field is considered as a major advantage of diamond, there has been a large difference in breakdown voltages or fields of diamond devices in literature. Most of these apparently contradictory results did not correctly reflect material properties because of specific device designs, such as punch-through structure and insufficient edge termination. Once these data were removed, the remaining few results, including a record-high breakdown field of 20 MV/cm, were theoretically reproduced, exactly calculating ionization integrals based on the ionization coefficients that were obtained after compensating for possible errors involved in reported theoretical values. In this compensation, we newly developed a method for extracting an ionization coefficient from an arbitrary relationship between breakdown voltage and doping density in the Chynoweth's framework. The breakdown field of diamond was estimated to depend on the doping density more than other materials, and accordingly required to be compared at the same doping density. The figure of merit (FOM) of diamond devices, obtained using these breakdown data, was comparable to the FOMs of 4H-SiC and Wurtzite-GaN devices at room temperature, but was projected to be larger than the latter by more than one order of magnitude at higher temperatures about 300 °C. Considering the relatively undeveloped state of diamond technology, there is room for further enhancement of the diamond FOM, improving breakdown voltage and mobility. Through these investigations, junction breakdown was found to be initiated by electrons or holes in a p--type or n--type drift layer, respectively. The breakdown voltages in the two types of drift layers differed from each other in a strict sense but were practically the same. Hence, we do not need to care about the conduction type of drift layers, but should rather exactly calculate the ionization integral without approximating ionization coefficients by a power

  11. Wind effect on PV module temperature: Analysis of different techniques for an accurate estimation.

    NASA Astrophysics Data System (ADS)

    Schwingshackl, Clemens; Petitta, Marcello; Ernst Wagner, Jochen; Belluardo, Giorgio; Moser, David; Castelli, Mariapina; Zebisch, Marc; Tetzlaff, Anke

    2013-04-01

    temperature estimation using meteorological parameters. References: [1] Skoplaki, E. et al., 2008: A simple correlation for the operating temperature of photovoltaic modules of arbitrary mounting, Solar Energy Materials & Solar Cells 92, 1393-1402 [2] Skoplaki, E. et al., 2008: Operating temperature of photovoltaic modules: A survey of pertinent correlations, Renewable Energy 34, 23-29 [3] Koehl, M. et al., 2011: Modeling of the nominal operating cell temperature based on outdoor weathering, Solar Energy Materials & Solar Cells 95, 1638-1646 [4] Mattei, M. et al., 2005: Calculation of the polycrystalline PV module temperature using a simple method of energy balance, Renewable Energy 31, 553-567 [5] Kurtz, S. et al.: Evaluation of high-temperature exposure of rack-mounted photovoltaic modules

  12. Different approaches to estimating transition costs in the electric-utility industry

    SciTech Connect

    Baxter, L.W.

    1995-10-01

    The term "transition costs" describes the potential revenue shortfall (or welfare loss) a utility (or other actor) may experience through government-initiated deregulation of electricity generation. The potential for transition costs arises whenever a regulated industry is subject to competitive market forces as a result of explicit government action. Federal and state proposals to deregulate electricity generation sparked a national debate on transition costs in the electric-utility industry. Industry-wide transition cost estimates range from about $20 billion to $500 billion. Such disparate estimates raise important questions on estimation methods for decision makers. This report examines different approaches to estimating transition costs. The study has three objectives. First, we discuss the concept of transition cost. Second, we identify the major cost categories included in transition cost estimates and summarize the current debate on which specific costs are appropriately included in these estimates. Finally, we identify general and specific estimation approaches and assess their strengths and weaknesses. We relied primarily on the evidentiary records established at the Federal Energy Regulatory Commission and the California Public Utilities Commission to identify major cost categories and specific estimation approaches. We also contacted regulatory commission staffs in ten states to ascertain estimation activities in each of these states. We refined a classification framework to describe and assess general estimation options. We subsequently developed and applied criteria to describe and assess specific estimation approaches proposed by federal regulators, state regulators, utilities, independent power companies, and consultants.

  13. Quaternion-Based Unscented Kalman Filter for Accurate Indoor Heading Estimation Using Wearable Multi-Sensor System

    PubMed Central

    Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng

    2015-01-01

    Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, as one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, its accuracy degrades over time. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, comprising one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the multi-sensor system fixed on the waist of the pedestrian and on the quadrotor UAV, respectively, compared to the reference path. PMID:25961384
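
    The record above fuses gyroscope, accelerometer and magnetometer data with a quaternion-based UKF. A full quaternion UKF is too long to sketch here; instead, the snippet below illustrates the underlying gyro/magnetometer fusion idea with a one-dimensional Kalman filter on heading alone. This is a simplified stand-in, not the paper's algorithm, and all noise parameters and signals are hypothetical.

```python
import numpy as np

def fuse_heading(gyro_z, mag_heading, dt, q=0.01, r=4.0):
    """Fuse gyro yaw rate (rad/s) with magnetometer heading (rad) using a
    one-dimensional Kalman filter. q: process noise added per step (rad^2),
    r: magnetometer measurement noise variance (rad^2)."""
    theta, p = mag_heading[0], r          # initialise from the first magnetometer fix
    fused = []
    for w, z in zip(gyro_z, mag_heading):
        theta += w * dt                   # predict: integrate the gyro rate
        p += q
        innov = (z - theta + np.pi) % (2 * np.pi) - np.pi   # wrap innovation to [-pi, pi)
        k = p / (p + r)                   # Kalman gain
        theta += k * innov                # update with the magnetometer heading
        p *= (1 - k)
        fused.append(theta)
    return np.array(fused)

# Hypothetical usage: constant 0.1 rad/s turn sampled at 100 Hz with a noisy magnetometer.
t = np.arange(0, 10, 0.01)
gyro = np.full_like(t, 0.1) + np.random.normal(0, 0.005, t.size)
mag = 0.1 * t + np.random.normal(0, 0.2, t.size)
heading = fuse_heading(gyro, mag, dt=0.01)
```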

  14. Quaternion-based unscented Kalman filter for accurate indoor heading estimation using wearable multi-sensor system.

    PubMed

    Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng

    2015-01-01

    Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, as one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, its accuracy degrades over time. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, comprising one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the multi-sensor system fixed on the waist of the pedestrian and on the quadrotor UAV, respectively, compared to the reference path. PMID:25961384

  15. Disaster warning system study summary. [cost estimates using NOAA satellites

    NASA Technical Reports Server (NTRS)

    Leroy, B. F.; Maloy, J. E.; Braley, R. C.; Provencher, C. E.; Schumaker, H. A.; Valgora, M. E.

    1977-01-01

    A conceptual satellite system to replace or complement NOAA's data collection, internal communications, and public information dissemination systems for the mid-1980's was defined. Program cost and cost sensitivity to variations in communications functions are analyzed.

  16. Development of hybrid lifecycle cost estimating tool (HLCET) for manufacturing influenced design tradeoff

    NASA Astrophysics Data System (ADS)

    Sirirojvisuth, Apinut

    concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher-fidelity tradeoffs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM result for the composite part production cost. The reason for the replacement is that TCM estimates production costs from part weights, reflecting subtractive manufacturing of metallic origin such as casting, forging, and machining processes. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called the Maynard Operation Sequence Technique (MOST), used similarly to an Activity-Based Costing (ABC) approach to estimate the manufacturing time of a part by breaking down the operations that occur during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to a traditional method of time estimation by analogy or using response surface equations from historical process data. The MOST concept provides a tailored study of an individual process typically required for a new, innovative design. Nevertheless, the MOST idea has some challenges, one of which is its requirement to build a new process from the ground up. The process development requires a Subject Matter Expert (SME) in the manufacturing method of the particular design. The SME must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside the design discipline and prior training in MOST. To relieve this constraint, this study includes an entirely new sub-system architecture
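
    The MOST/ABC idea described above reduces, at its simplest, to rolling up standard operation times and labour rates into a part cost. The sketch below shows only that roll-up; the operations, times and rates are invented for illustration and are not taken from ACCEM or this study.

```python
# Hypothetical activity-based roll-up: sum (standard time x burdened labour rate) per operation.
operations = [
    # (operation name, standard time in hours, fully burdened rate in $/h)
    ("ply cutting",      0.8, 95.0),
    ("hand lay-up",      3.5, 95.0),
    ("autoclave cure",   1.2, 60.0),
    ("trim and inspect", 0.9, 80.0),
]

def part_labour_cost(ops):
    """Return total labour hours and labour cost for one part."""
    hours = sum(t for _, t, _ in ops)
    cost = sum(t * rate for _, t, rate in ops)
    return hours, cost

hours, cost = part_labour_cost(operations)
print(f"{hours:.1f} h, ${cost:,.2f} per part")
```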

  17. 48 CFR 9904.401 - Cost accounting standard-consistency in estimating, accumulating and reporting costs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.401 Cost... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Cost accounting...

  18. 48 CFR 9904.401 - Cost accounting standard-consistency in estimating, accumulating and reporting costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.401 Cost... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Cost accounting...

  19. 48 CFR 9904.401 - Cost accounting standard-consistency in estimating, accumulating and reporting costs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.401 Cost... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Cost accounting...

  20. 48 CFR 9904.401 - Cost accounting standard-consistency in estimating, accumulating and reporting costs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.401 Cost... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Cost accounting...

  1. Space launch systems cost estimation as design tool

    NASA Astrophysics Data System (ADS)

    Koelle, D. E.

    The paper describes the methods of cost engineering for launch vehicles: the application of cost analysis as the principal design criterion at the very beginning of a vehicle design, and not (only) as a final step. The statistical-analytical TRANSCOST Model is a typical tool for such an economic design optimization. The major elements of "Cost per Launch" (CpL) are described, and the influence of launch system type and its development cost are discussed. Finally, examples are shown of system design optimization by cost analysis.
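
    Cost-per-launch figures of the kind described above combine amortised development cost, vehicle production cost and launch operations cost. The sketch below shows only this generic roll-up; the numbers are hypothetical and are not TRANSCOST's cost estimating relationships.

```python
def cost_per_launch(dev_cost, n_amortised, unit_cost, ops_cost):
    """Cost per launch = amortised development + vehicle production + operations per flight.
    All figures in the same currency unit (e.g. M$); values used below are illustrative."""
    return dev_cost / n_amortised + unit_cost + ops_cost

# Hypothetical example: $3,000M development spread over 50 flights,
# $60M per vehicle, $25M operations per flight.
print(cost_per_launch(3000.0, 50, 60.0, 25.0))   # -> 145.0
```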

  2. What Would It Cost to Coach Every New Principal? An Estimate Using Statewide Personnel Data

    ERIC Educational Resources Information Center

    Lochmiller, Chad R.

    2014-01-01

    In this paper, I use Levin and McEwan's (2001) cost feasibility approach and personnel data obtained from the Superintendent of Public Instruction to estimate the cost of providing coaching support to every newly hired principal in Washington State. Based on this descriptive analysis, I estimate that the cost to provide leadership coaching to…

  3. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  4. Bar codes and intrinsic-surface-roughness tag: Accurate and low-cost accountability for CFE. [Conventional force equipment (CFE)

    SciTech Connect

    DeVolpi, A.; Palm, R.

    1990-01-01

    CFE poses a number of verification challenges that could be met in part by an accurate and low-cost means of aiding the accountability of treaty-limited equipment. Although the treaty as signed does not explicitly call for the use of tags, there is a provision for recording "serial numbers" and placing "special marks" on equipment subject to reduction. There are approximately 150,000 residual items to be tracked for CFE-I, about half for each alliance of state parties. These highly mobile items are subject to complex treaty limitations: deployment limits and zones, ceilings and subceilings, holdings and allowances. There are controls and requirements for storage, conversion, and reduction. In addition, there are national security concerns regarding modernization and mobilization capability. As written into the treaty, heavy reliance has been placed on human inspectors for CFE verification. Inspectors will mostly make visual observations and photographs as the means of monitoring compliance; these observations can be recorded by hand or keyed into a laptop computer. CFE is now less a treaty between two alliances than a treaty among 22 state parties, with inspection data and reports to be shared with each party in the official languages designated by CSCE. One of the potential roles for bar-coded tags would be to provide a universal, exchangeable, computer-compatible language for tracking TLE. 10 figs.

  5. Accurate path integral molecular dynamics simulation of ab-initio water at near-zero added cost

    NASA Astrophysics Data System (ADS)

    Elton, Daniel; Fritz, Michelle; Soler, José; Fernandez-Serra, Marivi

    It is now established that nuclear quantum motion plays an important role in determining water's structure and dynamics. These effects are important to consider when evaluating DFT functionals and attempting to develop better ones for water. The standard way of treating nuclear quantum effects, path integral molecular dynamics (PIMD), multiplies the number of energy/force calculations by the number of beads, which is typically 32. Here we introduce a method whereby PIMD can be incorporated into a DFT molecular dynamics simulation at virtually zero cost. The method is based on the cluster (many-body) expansion of the energy. We first subtract the DFT monomer energies, using a custom DFT-based monomer potential energy surface. The evolution of the PIMD beads is then performed using only the more accurate Partridge-Schwenke monomer energy surface. The DFT calculations are done using the centroid positions. Various bead thermostats can be employed to speed up the sampling of the quantum ensemble. The method bears some resemblance to multiple-timestep algorithms and other schemes used to speed up PIMD with classical force fields. We show that our method correctly captures some of the key effects of nuclear quantum motion on both the structure and dynamics of water. We acknowledge support from DOE Award No. DE-FG02-09ER16052 (D.E.) and DOE Early Career Award No. DE-SC0003871 (M.V.F.S.).

  6. Estimating the Cost-Effectiveness of Implementation: Is Sufficient Evidence Available?

    PubMed Central

    Whyte, Sophie; Dixon, Simon; Faria, Rita; Walker, Simon; Palmer, Stephen; Sculpher, Mark; Radford, Stefanie

    2016-01-01

    Background Timely implementation of recommended interventions can provide health benefits to patients and cost savings to the health service provider. Effective approaches to increase the implementation of guidance are needed. Since investment in activities that improve implementation competes for funding against other health-generating interventions, it should be assessed in terms of its costs and benefits. Objective In 2010, the National Institute for Health and Care Excellence released a clinical guideline recommending natriuretic peptide (NP) testing in patients with suspected heart failure. However, its implementation in practice was variable across the National Health Service in England. This study demonstrates the use of multi-period analysis together with diffusion curves to estimate the value of investing in implementation activities to increase uptake of NP testing. Methods Diffusion curves were estimated based on historic data to produce predictions of future utilization. The value of an implementation activity (given its expected costs and effectiveness) was estimated. Both a static-population and a multi-period analysis were undertaken. Results The value of implementation interventions encouraging the utilization of NP testing is shown to decrease over time as natural diffusion occurs. Sensitivity analyses indicated that the value of the implementation activity depends on its efficacy and on the population size. Conclusions Value-of-implementation analysis can help inform policy decisions on how to invest in implementation activities, even in situations in which data are sparse. Multi-period analysis is essential to accurately quantify the time profile of the value of implementation given the natural diffusion of the intervention and the incidence of the disease. PMID:27021746
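
    As a rough illustration of the approach described above, the sketch below fits a logistic diffusion curve to historic uptake and values a hypothetical implementation activity by the extra uptake it would produce. The uptake series, population size and net benefit per additional test are all made up and do not come from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, k, t0, r):
    """Logistic diffusion: uptake fraction at time t."""
    return k / (1.0 + np.exp(-r * (t - t0)))

# Hypothetical observed uptake (fraction of eligible patients) by year.
years = np.array([0, 1, 2, 3, 4], dtype=float)
uptake = np.array([0.10, 0.18, 0.30, 0.45, 0.58])

(k, t0, r), _ = curve_fit(logistic, years, uptake, p0=[0.9, 3.0, 1.0], maxfev=10000)

# Value of an activity that lifts uptake by 10 percentage points next year,
# assuming 20,000 eligible patients and a net benefit of 40 per additional test.
baseline = logistic(5.0, k, t0, r)
boosted = min(baseline + 0.10, k)
value = (boosted - baseline) * 20_000 * 40
print(f"baseline uptake {baseline:.2f}, value of activity ~ {value:,.0f}")
```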

  7. Cost estimation for solid waste management in industrialising regions - Precedents, problems and prospects

    SciTech Connect

    Parthan, Shantha R.; Milke, Mark W.; Wilson, David C.; Cocks, John H.

    2012-03-15

    Highlights: ► We review cost estimation approaches for solid waste management. ► The unit cost method and benchmarking techniques are used in industrialising regions (IR). ► Variety in scope, quality and stakeholders makes cost estimation challenging in IR. ► Integrating waste flow and cost models using cost functions can improve cost planning. - Abstract: The importance of cost planning for solid waste management (SWM) in industrialising regions (IR) is not well recognised. The approaches used to estimate costs of SWM can broadly be classified into three categories - the unit cost method, benchmarking techniques and developing cost models using sub-approaches such as cost and production function analysis. These methods have been developed into computer programmes with varying functionality and utility. IR mostly use the unit cost and benchmarking approaches to estimate their SWM costs. The models for cost estimation, on the other hand, are used at times in industrialised countries, but not in IR. Taken together, these approaches could be viewed as precedents that can be modified appropriately to suit waste management systems in IR. The main challenges (or problems) one might face while attempting to do so are a lack of cost data and a lack of quality in what data do exist. There are practical benefits to planners in IR where solid waste problems are critical and budgets are limited.

  8. 40 CFR 264.144 - Cost estimate for post-closure care.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Cost estimate for post-closure care... FACILITIES Financial Requirements § 264.144 Cost estimate for post-closure care. (a) The owner or operator of... contingent closure and post-closure plan, must have a detailed written estimate, in current dollars, of...

  9. 48 CFR 1336.605 - Government cost estimate for architect-engineer work.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Government cost estimate... Architect-Engineer Services 1336.605 Government cost estimate for architect-engineer work. After award, the independent Government estimated price can be released, upon request, to those firms or individuals...

  10. 40 CFR 265.144 - Cost estimate for post-closure care.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Cost estimate for post-closure care..., STORAGE, AND DISPOSAL FACILITIES Financial Requirements § 265.144 Cost estimate for post-closure care. (a) The owner or operator of a hazardous waste disposal unit must have a detailed written estimate,...

  11. Estimating dietary costs of low-income women in California: A comparison of two approaches

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Objective: Compare two approaches for estimating individual daily diet costs in a population of low-income women in California. Design: Cost estimates based on time-intensive Method 1 (three 24-h recalls and associated food prices on receipts) were compared with estimates using a lesser intensive M...

  12. A medical cost estimation with fuzzy neural network of acute hepatitis patients in emergency room.

    PubMed

    Kuo, R J; Cheng, W C; Lien, W C; Yang, T J

    2015-10-01

    Taiwan is an area where chronic hepatitis is endemic. Liver cancer is so common that it has ranked first among cancer mortality rates since the early 1980s in Taiwan. In addition, liver cirrhosis and chronic liver diseases rank sixth or seventh among the causes of death. Therefore, as shown by the active research on hepatitis, it is not only a health threat, but also a huge medical cost for the government. The estimated total number of hepatitis B carriers in the general population aged more than 20 years old is 3,067,307. Thus, a case record review was conducted of all patients with a diagnosis of acute hepatitis admitted to the Emergency Department (ED) of a well-known teaching-oriented hospital in Taipei. The cost of medical resource utilization is defined as the total medical fee. In this study, a fuzzy neural network (FNN) is employed to develop the cost forecasting model. A total of 110 patients met the inclusion criteria. The computational results indicate that the FNN model can provide more accurate forecasts than support vector regression (SVR) or an artificial neural network (ANN). In addition, unlike SVR and ANN, FNN can also provide fuzzy IF-THEN rules for interpretation. PMID:26153643

  13. Laboratory demonstration of aircraft estimation using low-cost sensors

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.

    1978-01-01

    Four nonlinear state estimators were devised which provide techniques for obtaining the angular orientation (attitude) of the aircraft. An extensive FORTRAN computer program was developed to demonstrate and evaluate the estimators by using recorded flight test data. This program simulates the estimator operation, and it compares the state estimates with actual state measurements. The program was used to evaluate the state estimators with data recorded on the NASA Ames CV-990 and CESSNA 402B aircraft. A preliminary assessment was made of the memory, word length, and timing requirements for implementing the selected state estimator on a typical microcomputer.

  14. Statistical Analysis of Complexity Generators for Cost Estimation

    NASA Technical Reports Server (NTRS)

    Rowell, Ginger Holmes

    1999-01-01

    Predicting the cost of cutting edge new technologies involved with spacecraft hardware can be quite complicated. A new feature of the NASA Air Force Cost Model (NAFCOM), called the Complexity Generator, is being developed to model the complexity factors that drive the cost of space hardware. This parametric approach is also designed to account for the differences in cost, based on factors that are unique to each system and subsystem. The cost driver categories included in this model are weight, inheritance from previous missions, technical complexity, and management factors. This paper explains the Complexity Generator framework, the statistical methods used to select the best model within this framework, and the procedures used to find the region of predictability and the prediction intervals for the cost of a mission.

  15. IDC Reengineering Phase 2 & 3 Rough Order of Magnitude (ROM) Cost Estimate Summary (Leveraged NDC Case).

    SciTech Connect

    Harris, James M.; Prescott, Ryan; Dawson, Jericah M.; Huelskamp, Robert M.

    2014-11-01

    Sandia National Laboratories has prepared a ROM cost estimate for budgetary planning for the IDC Reengineering Phase 2 & 3 effort, based on leveraging a fully funded, Sandia executed NDC Modernization project. This report provides the ROM cost estimate and describes the methodology, assumptions, and cost model details used to create the ROM cost estimate. ROM Cost Estimate Disclaimer Contained herein is a Rough Order of Magnitude (ROM) cost estimate that has been provided to enable initial planning for this proposed project. This ROM cost estimate is submitted to facilitate informal discussions in relation to this project and is NOT intended to commit Sandia National Laboratories (Sandia) or its resources. Furthermore, as a Federally Funded Research and Development Center (FFRDC), Sandia must be compliant with the Anti-Deficiency Act and operate on a full-cost recovery basis. Therefore, while Sandia, in conjunction with the Sponsor, will use best judgment to execute work and to address the highest risks and most important issues in order to effectively manage within cost constraints, this ROM estimate and any subsequent approved cost estimates are on a 'full-cost recovery' basis. Thus, work can neither commence nor continue unless adequate funding has been accepted and certified by DOE.

  16. Los Alamos Waste Management Cost Estimation Model; Final report: Documentation of waste management process, development of Cost Estimation Model, and model reference manual

    SciTech Connect

    Matysiak, L.M.; Burns, M.L.

    1994-03-01

    This final report completes the Los Alamos Waste Management Cost Estimation Project, and includes the documentation of the waste management processes at Los Alamos National Laboratory (LANL) for hazardous, mixed, low-level radioactive solid and transuranic waste, development of the cost estimation model and a user reference manual. The ultimate goal of this effort was to develop an estimate of the life cycle costs for the aforementioned waste types. The Cost Estimation Model is a tool that can be used to calculate the costs of waste management at LANL for the aforementioned waste types, under several different scenarios. Each waste category at LANL is managed in a separate fashion, according to Department of Energy requirements and state and federal regulations. The cost of the waste management process for each waste category has not previously been well documented. In particular, the costs associated with the handling, treatment and storage of the waste have not been well understood. It is anticipated that greater knowledge of these costs will encourage waste generators at the Laboratory to apply waste minimization techniques to current operations. Expected benefits of waste minimization are a reduction in waste volume, decrease in liability and lower waste management costs.

  17. Estimating the Cost of Standardized Student Testing in the United States.

    ERIC Educational Resources Information Center

    Phelps, Richard P.

    2000-01-01

    Describes and contrasts different methods of estimating costs of standardized testing. Using a cost-accounting approach, compares gross and marginal costs and considers testing objects (test materials and services, personnel and student time, and administrative/building overhead). Social marginal costs of replacing existing tests with a national…

  18. Cost estimates for flat plate and concentrator collector arrays

    NASA Technical Reports Server (NTRS)

    Shimada, K.

    1982-01-01

    The current module and installation costs for the U.S. National Photovoltaic Program's grid-connected systems are significantly higher than required for economic viability of this alternative. Attention is accordingly given to the prospects for installed module cost reductions in flat plate, linear focus Fresnel concentrator, and point focus Fresnel concentrator candidate systems. Cost projections indicate that all three systems would meet near-term and midterm goals, provided that module costs of $2.80/W(p) and $0.70/W(p), respectively, are met. The point focus Fresnel system emerges as the most viable for the near term.

  19. 48 CFR 252.215-7002 - Cost estimating system requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Contractor's policies, procedures, and practices for budgeting and planning controls, and generating...) Flow of work, coordination, and communication; and (5) Budgeting, planning, estimating methods... personnel have sufficient training, experience, and guidance to perform estimating and budgeting tasks...

  20. 40 CFR 144.62 - Cost estimate for plugging and abandonment.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Waste Injection Wells § 144.62 Cost estimate for plugging and abandonment. (a) The owner or operator... Oil and Gas Field Equipment Cost Index. The inflation factor is the result of dividing the...

  1. 40 CFR 144.62 - Cost estimate for plugging and abandonment.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Waste Injection Wells § 144.62 Cost estimate for plugging and abandonment. (a) The owner or operator... Oil and Gas Field Equipment Cost Index. The inflation factor is the result of dividing the...

  2. 40 CFR 144.62 - Cost estimate for plugging and abandonment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Waste Injection Wells § 144.62 Cost estimate for plugging and abandonment. (a) The owner or operator... Oil and Gas Field Equipment Cost Index. The inflation factor is the result of dividing the...

  3. 40 CFR 144.62 - Cost estimate for plugging and abandonment.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Waste Injection Wells § 144.62 Cost estimate for plugging and abandonment. (a) The owner or operator... Oil and Gas Field Equipment Cost Index. The inflation factor is the result of dividing the...

  4. Lunar base scenario cost estimates: Lunar base systems study task 6.1

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The projected development and production costs of each of the Lunar Base's systems are described and unit costs are estimated for transporting the systems to the lunar surface and for setting up the system.

  5. Systematic methodology for estimating direct capital costs for blanket tritium processing systems

    SciTech Connect

    Finn, P.A.

    1985-01-01

    This paper describes the methodology developed for estimating the relative capital costs of blanket processing systems. The capital costs of the nine blanket concepts selected in the Blanket Comparison and Selection Study are presented and compared.

  6. On the Utility of National Datasets and Resource Cost Models for Estimating Faculty Instructional Costs in Higher Education

    ERIC Educational Resources Information Center

    Morphew, Christopher; Baker, Bruce

    2007-01-01

    In this article, the authors present the results of a research study in which they used two national datasets to construct and examine a model that estimates relative faculty instructional costs for specific undergraduate degree programs and also identifies differences in these costs by region and institutional type. They conducted this research…

  7. COST ESTIMATION MODELS FOR DRINKING WATER TREATMENT UNIT PROCESSES

    EPA Science Inventory

    Cost models for unit processes typically utilized in a conventional water treatment plant and in package treatment plant technology are compiled in this paper. The cost curves are represented as a function of specified design parameters and are categorized into four major catego...

  8. Improving Space Project Cost Estimating with Engineering Management Variables

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph W.; Roth, Axel (Technical Monitor)

    2001-01-01

    Current space project cost models attempt to predict space flight project cost via regression equations, which relate the cost of projects to technical performance metrics (e.g. weight, thrust, power, pointing accuracy, etc.). This paper examines the introduction of engineering management parameters to the set of explanatory variables. A number of specific engineering management variables are considered and exploratory regression analysis is performed to determine if there is statistical evidence for cost effects apart from technical aspects of the projects. It is concluded that there are other non-technical effects at work and that further research is warranted to determine if it can be shown that these cost effects are definitely related to engineering management.
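
    The kind of exploratory regression described above can be sketched as an ordinary least squares fit of log cost on a technical variable (weight) plus candidate engineering management variables. The data, variable names and coefficients below are entirely synthetic and only illustrate the model form.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40
log_weight = rng.uniform(4, 9, n)           # ln(dry mass, kg) - technical variable
new_design = rng.uniform(0, 1, n)           # hypothetical heritage/management proxy
team_changes = rng.integers(0, 4, n)        # hypothetical management variable
log_cost = 1.0 + 0.6 * log_weight + 0.8 * new_design + 0.1 * team_changes \
           + rng.normal(0, 0.2, n)

# OLS fit of ln(cost) on technical plus management explanatory variables.
X = np.column_stack([np.ones(n), log_weight, new_design, team_changes])
beta, *_ = np.linalg.lstsq(X, log_cost, rcond=None)
print("estimated coefficients:", np.round(beta, 3))
```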

  9. Species Distribution 2.0: An Accurate Time- and Cost-Effective Method of Prospection Using Street View Imagery

    PubMed Central

    Schwoertzig, Eugénie; Millon, Alexandre

    2016-01-01

    Species occurrence data provide crucial information for biodiversity studies in the current context of global environmental changes. Such studies often rely on a limited number of occurrence data collected in the field and on pseudo-absences arbitrarily chosen within the study area, which reduces the value of these studies. To overcome this issue, we propose an alternative method of prospection using geo-located street view imagery (SVI). Following a standardised protocol of virtual prospection using both vertical (aerial photographs) and horizontal (SVI) perceptions, we surveyed 1097 randomly selected cells across Spain (0.1 x 0.1 degree, i.e. 20% of Spain) for the presence of Arundo donax L. (Poaceae). In total we detected A. donax in 345 cells, thus substantially expanding beyond the now two-centuries-old field-derived record, which described A. donax in only 216 cells. Among the field occurrence cells, 81.1% were confirmed by SVI prospection to be consistent with species presence. In addition, we recorded, by SVI prospection, 752 absences, i.e. cells where A. donax was considered absent. We also compared the outcomes of climatic niche modeling based on SVI data against those based on field data. Using generalized linear models fitted with bioclimatic predictors, we found SVI data to provide far more compelling results in terms of niche modeling than field data as classically used in species distribution modeling (SDM). This original, cost- and time-effective method provides the means to accurately locate highly visible taxa, reinforce absence data, and predict species distribution without long and expensive in situ prospection. At this time, the majority of available SVI data is restricted to human-disturbed environments that have road networks. However, SVI is becoming increasingly available in natural areas, which means the technique has considerable potential to become an important factor in future biodiversity studies. PMID:26751565

  10. User's manual for the INDCEPT code for estimating industrial steam boiler plant capital investment costs

    SciTech Connect

    Bowers, H I; Fuller, L C; Hudson, II, C R

    1982-09-01

    The INDCEPT computer code package was developed to provide conceptual capital investment cost estimates for single- and multiple-unit industrial steam boiler plants. Cost estimates can be made as a function of boiler type, size, location, and date of initial operation. The output includes a detailed breakdown of the estimate into direct and indirect costs. Boiler plant cost models are provided to reflect various types and sources of coal and alternate means of sulfur and particulate removal. Cost models are also included for low-Btu and medium-Btu gas produced in coal gasification plants.

  11. Stochastic Frontier Estimation of a CES Cost Function: The Case of Higher Education in Britain.

    ERIC Educational Resources Information Center

    Izadi, Hooshang; Johnes, Geraint; Oskrochi, Reza; Crouchley, Robert

    2002-01-01

    Examines the use of stochastic frontier estimation of constant elasticity of substitution (CES) cost function to measure differences in efficiency among British universities. (Contains 28 references.) (PKP)

  12. Mass screening for neuroblastoma and estimation of costs.

    PubMed

    Nishi, M; Miyake, H; Takeda, T; Takasugi, N; Hanai, J; Kawai, T

    1991-01-01

    On the basis of epidemiological data and medical costs for patients with neuroblastoma, we have calculated the cost of mass screening for neuroblastoma with high-performance liquid chromatography (HPLC) compared to the cost when it is not performed. If the sensitivity of the mass screening is 80% and 22,000 infants are screened annually, the cost will be 27,809,000 yen ($191,800). If mass screening is not performed, the cost will be 28,446,000 yen ($196,200). The difference in cost (637,000 yen or $4,400) is fairly small. If the sensitivity is 75% and 16,500 infants are screened, the difference is also small (174,000 yen or $1,200). Therefore, mass screening with the HPLC method will not be an undue financial burden. However, re-screening at an older age will yield less financially favorable results, considering that its sensitivity may not be as high as that of the first screening and that mothers are somewhat reluctant about re-screening. The balance of the cost of mass screening by qualitative methods may also be less favorable, since the detection rate is low. PMID:1957600
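
    The comparison in this record is simple arithmetic; the fragment below just reproduces the reported annual totals and their difference for the 80% sensitivity scenario.

```python
# Figures reported in the abstract (80% sensitivity, 22,000 infants screened annually).
cost_with_screening = 27_809_000      # yen per year
cost_without_screening = 28_446_000   # yen per year
print(cost_without_screening - cost_with_screening)   # 637,000 yen in favour of screening
```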

  13. Estimating costs of sea lice control strategy in Norway.

    PubMed

    Liu, Yajie; Bjelland, Hans Vanhauwaer

    2014-12-01

    This paper explores the costs of sea lice control strategies associated with salmon aquaculture at the farm level in Norway. Diseases can cause reduced growth, lower feed efficiency and market prices, increased mortality rates, and expenditures on prevention and treatment measures. Aquaculture farms suffer the most direct and immediate economic losses from diseases. The goal of a control strategy is to minimize the total disease costs, including biological losses and treatment costs, while maximizing overall profit. Prevention and control strategies are required to eliminate or minimize the disease, while cost-effective disease control strategies at the fish farm level are designed to reduce losses and to enhance productivity and profitability. Thus, the goal can be achieved by integrating models of fish growth, sea lice dynamics and economic factors. A production function is first constructed to incorporate the effects of sea lice on production at the farm level, followed by a detailed cost analysis of several prevention and treatment strategies associated with sea lice in Norway. The results reveal that treatments are costly and that treatment costs are very sensitive to the type of treatment used and the timing of treatment. Applying treatment at an early growth stage is more economical than at a later stage. PMID:25443395

  14. A systematic approach for the accurate non-invasive estimation of blood glucose utilizing a novel light-tissue interaction adaptive modelling scheme

    NASA Astrophysics Data System (ADS)

    Rybynok, V. O.; Kyriacou, P. A.

    2007-10-01

    Diabetes is one of the biggest health challenges of the 21st century. The obesity epidemic, sedentary lifestyles and an ageing population mean prevalence of the condition is currently doubling every generation. Diabetes is associated with serious chronic ill health, disability and premature mortality. Long-term complications, including heart disease, stroke, blindness, kidney disease and amputations, make the greatest contribution to the costs of diabetes care. Many of these long-term effects could be avoided with earlier, more effective monitoring and treatment. Currently, blood glucose can only be monitored through the use of invasive techniques. To date there is no widely accepted and readily available non-invasive monitoring technique to measure blood glucose, despite many attempts. This paper tackles one of the most difficult non-invasive monitoring challenges, that of blood glucose, and proposes a novel approach that will enable the accurate, calibration-free estimation of glucose concentration in blood. This approach is based on spectroscopic techniques and a new adaptive modelling scheme. The theoretical implementation and the effectiveness of the adaptive modelling scheme for this application are described, and a detailed mathematical evaluation has been employed to prove that such a scheme has the capability of accurately extracting the concentration of glucose from a complex biological medium.

  15. Accurate state estimation from uncertain data and models: an application of data assimilation to mathematical models of human brain tumors

    PubMed Central

    2011-01-01

    Background Data assimilation refers to methods for updating the state vector (initial condition) of a complex spatiotemporal model (such as a numerical weather model) by combining new observations with one or more prior forecasts. We consider the potential feasibility of this approach for making short-term (60-day) forecasts of the growth and spread of a malignant brain cancer (glioblastoma multiforme) in individual patient cases, where the observations are synthetic magnetic resonance images of a hypothetical tumor. Results We apply a modern state estimation algorithm (the Local Ensemble Transform Kalman Filter), previously developed for numerical weather prediction, to two different mathematical models of glioblastoma, taking into account likely errors in model parameters and measurement uncertainties in magnetic resonance imaging. The filter can accurately shadow the growth of a representative synthetic tumor for 360 days (six 60-day forecast/update cycles) in the presence of a moderate degree of systematic model error and measurement noise. Conclusions The mathematical methodology described here may prove useful for other modeling efforts in biology and oncology. An accurate forecast system for glioblastoma may prove useful in clinical settings for treatment planning and patient counseling. Reviewers This article was reviewed by Anthony Almudevar, Tomas Radivoyevitch, and Kristin Swanson (nominated by Georg Luebeck). PMID:22185645

  16. Estimated Position Replacement Costs for Technician Personnel in a State's Public Facilities

    ERIC Educational Resources Information Center

    Zaharia, E. S.; Baumeister, A. A.

    1978-01-01

    Estimates and fiscal data were gathered from three public institutions for the developmentally disabled to estimate technician replacement costs in the residential service delivery system of a southeastern state. (Author/SBH)

  17. But what will it Cost? The history of NASA cost estimating

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph W.

    1994-01-01

    Within two years of being chartered in 1958 as an independent agency to conduct civilian pursuits in aeronautics and space, NASA absorbed either wholly or partially the people, facilities, and equipment of several existing organizations. These included the laboratories of the National Advisory Committee of Aeronautics (NACA) at Langley Research Center in Virginia, Ames Research Center in California, and Lewis Research Center in Ohio; the Army Ballistic Missile Agency (ABMA) at Redstone Arsenal Alabama, for which the team of Wernher von Braun worked; and the Department of Defense Advanced Research Projects Agency (ARPA) and their ongoing work on big boosters. These were especially valuable resources to jump start the new agency in light of the shocking success of the Soviet space probe Sputnik in the autumn of the previous year and the corresponding pressure from an impatient American public to produce some response. Along with these inheritances, there came some existing systems engineering and management practices, including project cost estimating methodologies. This paper will briefly trace the origins of those methods and how they evolved within the agency over the past three decades.

  18. Reservoir evaluation of thin-bedded turbidites and hydrocarbon pore thickness estimation for an accurate quantification of resource

    NASA Astrophysics Data System (ADS)

    Omoniyi, Bayonle; Stow, Dorrik

    2016-04-01

    One of the major challenges in the assessment of, and production from, turbidite reservoirs is to take full account of thin- and medium-bedded turbidites (<10 cm and <30 cm thick, respectively). Although such thinner, low-pay sands may comprise a significant proportion of the reservoir succession, they can go unnoticed by conventional analysis and so negatively impact reserve estimation, particularly in fields producing from prolific thick-bedded turbidite reservoirs. Field development plans often take little note of such thin beds, which are therefore bypassed by mainstream production. In fact, the trapped and bypassed fluids can be vital where maximising field value and optimising production are key business drivers. We have studied in detail a succession of thin-bedded turbidites associated with thicker-bedded reservoir facies in the North Brae Field, UKCS, using a combination of conventional logs and cores to assess the significance of thin-bedded turbidites in computing hydrocarbon pore thickness (HPT). This quantity, being an indirect measure of thickness, is critical for an accurate estimation of original oil in place (OOIP). By using a combination of conventional and unconventional logging analysis techniques, we obtain three different results for the reservoir intervals studied: estimated net sand thickness, average sand thickness, and their distribution trend within a 3D structural grid. The net sand thickness varies from 205 to 380 ft, and HPT ranges from 21.53 to 39.90 ft. We observe that an integrated approach (neutron-density cross plots conditioned to cores) to HPT quantification reduces the associated uncertainties significantly, resulting in estimation of 96% of actual HPT. Further work will focus on assessing the 3D dynamic connectivity of the low-pay sands with the surrounding thick-bedded turbidite facies.

  19. Estimation of marginal costs at existing waste treatment facilities.

    PubMed

    Martinez-Sanchez, Veronica; Hulgaard, Tore; Hindsgaul, Claus; Riber, Christian; Kamuk, Bettina; Astrup, Thomas F

    2016-04-01

    This investigation aims at providing an improved basis for assessing the economic consequences of alternative Solid Waste Management (SWM) strategies for existing waste facilities. A bottom-up methodology was developed to determine marginal costs in existing facilities due to changes in the SWM system, based on the determination of average costs in such waste facilities as a function of key facility and waste compositional parameters. The applicability of the method was demonstrated through a case study including two existing Waste-to-Energy (WtE) facilities, one with co-generation of heat and power (CHP) and another with only power generation (Power), affected by diversion strategies for five waste fractions (fibres, plastic, metals, organics and glass), named "target fractions". The study assumed three possible responses to waste diversion in the WtE facilities: (i) biomass was added to maintain a constant thermal load, (ii) Refuse-Derived Fuel (RDF) was added to maintain a constant thermal load, or (iii) no reaction occurred, resulting in a reduced waste throughput without full utilization of the facility capacity. Results demonstrated that marginal costs of diversion from WtE were up to eleven times larger than average costs and depended on the response in the WtE plant. The marginal cost of diversion was between 39 and 287 € Mg(-1) target fraction when biomass was added in the CHP case (from 34 to 303 € Mg(-1) target fraction in the Power-only case), between -2 and 300 € Mg(-1) target fraction when RDF was added in the CHP case (from -2 to 294 € Mg(-1) target fraction in the Power-only case), and between 40 and 303 € Mg(-1) target fraction when no reaction happened in the CHP case (from 35 to 296 € Mg(-1) target fraction in the Power-only case). Although average costs at WtE facilities were highly influenced by energy selling prices, marginal costs were not (provided a response was initiated at the WtE plant to keep the utilized thermal capacity constant). Failing to systematically
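
    In the bottom-up method above, the marginal cost of diverting a target fraction is the change in total facility cost divided by the diverted mass. The sketch below states only that definition; the facility costs and tonnage are hypothetical.

```python
def marginal_cost_of_diversion(total_cost_before, total_cost_after, tonnes_diverted):
    """Marginal cost (per Mg of diverted target fraction) = change in total
    facility cost divided by the diverted mass. Inputs below are illustrative."""
    return (total_cost_after - total_cost_before) / tonnes_diverted

# Hypothetical WtE facility: diverting 10,000 Mg of plastics raises annual net cost
# from 21.0 to 22.5 million EUR (e.g. biomass is bought to keep the thermal load constant).
print(marginal_cost_of_diversion(21.0e6, 22.5e6, 10_000))   # -> 150.0 EUR per Mg
```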

  20. Estimates and implications of the costs of compliance with biosafety regulations in developing countries.

    PubMed

    Falck-Zepeda, Jose; Yorobe, Jose; Husin, Bahagiawati Amir; Manalo, Abraham; Lokollo, Erna; Ramon, Godfrey; Zambrano, Patricia; Sutrisno

    2012-01-01

    Estimating the cost of compliance with biosafety regulations is important, as it helps developers focus their investments in product development. We provide estimates of the cost of compliance for a set of technologies in Indonesia, the Philippines and other countries. These costs vary from US$100,000 to US$1.7 million. These are estimates of regulatory costs and do not include product development or deployment costs. Cost estimates need to be compared with the potential gains when the technology is introduced in these countries and with the gains in knowledge that accumulate during the biosafety assessment process. Although the cost of compliance is important, time delays and uncertainty are even more important and may have an adverse impact on innovations reaching farmers. PMID:22614639

  1. Probabilistic estimation of numbers and costs of future landslides in the San Francisco Bay region

    USGS Publications Warehouse

    Crovelli, R.A.; Coe, J.A.

    2009-01-01

    We used historical records of damaging landslides triggered by rainstorms and a newly developed Probabilistic Landslide Assessment Cost Estimation System (PLACES) to estimate the numbers and direct costs of future landslides in the 10-county San Francisco Bay region. Historical records of damaging landslides in the region are incomplete. Therefore, our estimates of numbers and costs of future landslides are minimal estimates. The estimated mean annual number of future damaging landslides for the entire 10-county region is about 65. Santa Cruz County has the highest estimated mean annual number of damaging future landslides (about 18), whereas Napa, San Francisco, and Solano Counties have the lowest estimated mean numbers of damaging landslides (about 1 each). The estimated mean annual cost of future landslides in the entire region is about US $14.80 million (year 2000 $). The estimated mean annual cost is highest for San Mateo County ($3.24 million) and lowest for Solano County ($0.18 million). The annual per capita cost for the entire region will be about $2.10. Santa Cruz County will have the highest annual per capita cost at $8.45, whereas San Francisco County will have the lowest per capita cost at $0.31. Normalising costs by dividing by the percentage of land area with slopes equal to or greater than 17% indicates that San Francisco County will have the highest cost per square km ($7,101), whereas Santa Clara County will have the lowest cost per square km ($229). These results indicate that the San Francisco Bay region has one of the highest levels of landslide risk in the United States. Compared with landslide cost estimates from the rest of the world, the risk level in the Bay region seems high, but not exceptionally high.
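
    The per-capita and per-area figures in this record are straightforward normalisations of the estimated mean annual cost. The sketch below shows the two normalisations with a hypothetical county, not with the study's actual inputs.

```python
def per_capita_cost(annual_cost, population):
    """Mean annual landslide cost per resident."""
    return annual_cost / population

def cost_per_sq_km(annual_cost, area_sq_km, steep_fraction):
    """Normalise by the land area with slopes >= 17% (steep_fraction of total area)."""
    return annual_cost / (area_sq_km * steep_fraction)

# Hypothetical county: $3.0M mean annual cost, 700,000 residents,
# 1,200 km^2 of land of which 30% has slopes >= 17%.
print(per_capita_cost(3.0e6, 700_000))        # ~4.29 $ per person per year
print(cost_per_sq_km(3.0e6, 1_200, 0.30))     # ~8,333 $ per km^2 per year
```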

  2. Can endocranial volume be estimated accurately from external skull measurements in great-tailed grackles (Quiscalus mexicanus)?

    PubMed

    Logan, Corina J; Palmstrom, Christin R

    2015-01-01

    There is an increasing need to validate and collect data approximating brain size on individuals in the field to understand what evolutionary factors drive brain size variation within and across species. We investigated whether we could accurately estimate endocranial volume (a proxy for brain size), as measured by computerized tomography (CT) scans, using external skull measurements and/or by filling skulls with beads and pouring them out into a graduated cylinder for male and female great-tailed grackles. We found that while females had higher correlations than males, estimations of endocranial volume from external skull measurements or beads did not tightly correlate with CT volumes. We found no accuracy in the ability of external skull measures to predict CT volumes because the prediction intervals for most data points overlapped extensively. We conclude that we are unable to detect individual differences in endocranial volume using external skull measurements. These results emphasize the importance of validating and explicitly quantifying the predictive accuracy of brain size proxies for each species and each sex. PMID:26082858
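
    The validation question above (whether external skull measurements predict CT-measured endocranial volume) can be posed as a regression with per-observation prediction intervals; extensive overlap of those intervals is what indicates poor individual-level predictive accuracy. The sketch below uses statsmodels on synthetic data, not the grackle measurements.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 30
skull_length = rng.normal(40.0, 2.0, n)                    # mm, synthetic
ct_volume = 0.05 * skull_length + rng.normal(0, 0.15, n)   # ml, weak synthetic relation

X = sm.add_constant(skull_length)
fit = sm.OLS(ct_volume, X).fit()

# 95% prediction intervals for each individual; heavy overlap between individuals
# suggests the external measure cannot resolve individual differences in volume.
pred = fit.get_prediction(X).summary_frame(alpha=0.05)
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]].head())
```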

  3. A plan for accurate estimation of daily area-mean rainfall during the CaPE experiment

    NASA Technical Reports Server (NTRS)

    Duchon, Claude E.

    1992-01-01

    The Convection and Precipitation/Electrification (CaPE) experiment took place in east central Florida from 8 July to 18 August 1991. There were five research themes associated with CaPE. In broad terms they are: investigation of the evolution of the electric field in convective clouds, determination of meteorological and electrical conditions associated with lightning, development of mesoscale numerical forecasts (2-12 hr) and nowcasts (less than 2 hr) of convective initiation, and remote estimation of rainfall. It is the last theme, coupled with numerous raingage and streamgage measurements, satellite and aircraft remote sensing, radiosondes and other meteorological measurements in the atmospheric boundary layer, that provides the basis for determining the hydrologic cycle for the CaPE experiment area. The largest component of the hydrologic cycle in this region is rainfall. An accurate determination of daily area-mean rainfall is important in correctly modeling its apportionment into runoff, infiltration and evapotranspiration. In order to achieve this goal, a research plan was devised and initial analysis begun. The overall research plan is discussed, with special emphasis placed on the adjustment of radar rainfall estimates to raingage rainfall.
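
    One common way to adjust radar rainfall estimates to raingage rainfall, as emphasised in the plan above, is a mean-field bias correction followed by an area average. The sketch below shows that sequence with synthetic fields; it illustrates the general technique only and is not the CaPE procedure itself.

```python
import numpy as np

def mean_field_bias(gauge_totals, radar_at_gauges):
    """Single multiplicative adjustment factor: sum of gauge daily totals
    divided by the sum of collocated radar estimates."""
    return np.sum(gauge_totals) / np.sum(radar_at_gauges)

rng = np.random.default_rng(2)
radar_field = rng.gamma(2.0, 5.0, size=(50, 50))      # synthetic daily radar rainfall (mm)
gauge_totals = np.array([12.0, 8.5, 20.1, 15.3])      # synthetic gauge daily totals (mm)
radar_at_gauges = np.array([10.0, 9.0, 17.0, 13.0])   # radar pixels over those gauges

bias = mean_field_bias(gauge_totals, radar_at_gauges)
area_mean = np.mean(bias * radar_field)               # adjusted daily area-mean rainfall (mm)
print(round(bias, 3), round(float(area_mean), 2))
```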

  4. Application of parametric weight and cost estimating relationships to future transport aircraft

    NASA Technical Reports Server (NTRS)

    Beltramo, M. N.; Morris, M. A.; Anderson, J. L.

    1979-01-01

    A model comprising system-level weight and cost estimating relationships (CERs) for transport aircraft is presented. In order to determine the production cost of a future aircraft, its weight is first estimated from performance parameters, and then the cost is estimated as a function of weight. For initial evaluation, CERs were applied to the actual system weights of six aircraft (3 military and 3 commercial) with mean empty weights ranging from 30,000 to 300,000 lb. The resulting cost estimates were compared with actual costs; the average absolute error was only 4.3%. The model was then applied to five aircraft still in the design phase (Boeing 757, 767 and 777, and BAC HS146-100 and HS146-200). While the estimates for the 757 and 767 are within 2 to 3 percent of their assumed break-even costs, it is recognized that these are very sensitive to the validity of the estimated weights, the inflation factor, the amount assumed for nonrecurring costs, etc., and it is suggested that the model be used in conjunction with other information such as RDT&E cost estimates and market forecasts. The model will help NASA evaluate new technologies and the production costs of future aircraft.
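
    The two-step logic of the model (estimate weight from performance parameters, then cost from weight) can be sketched with power-law estimating relationships. The functional forms and coefficients below are hypothetical placeholders, not the relationships fitted in the paper.

```python
def estimate_weight(payload_kg, range_km, a=30.0, b=0.55, c=0.25):
    """Hypothetical weight estimating relationship: empty weight (kg) from performance."""
    return a * payload_kg**b * range_km**c

def estimate_cost(empty_weight_kg, k=1.5, e=0.95):
    """Hypothetical cost estimating relationship: production cost (M$) from weight (tonnes)."""
    return k * (empty_weight_kg / 1000.0)**e

w = estimate_weight(payload_kg=20_000, range_km=5_000)
print(round(w), round(estimate_cost(w), 1))   # empty weight in kg, cost in M$
```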

  5. 40 CFR 265.144 - Cost estimate for post-closure care.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of the facility, the owner or operator must adjust the post-closure cost estimate for inflation...-closure care cost estimate must be updated for inflation no later than 30 days after the close of the firm... current dollars or by using an inflation factor derived from the most recent Implicit Price Deflator...
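
    The inflation adjustment referred to in this section reduces to multiplying the previous estimate by an inflation factor, i.e. the latest annual Implicit Price Deflator divided by the deflator used for the previous estimate. A minimal sketch of that arithmetic, with placeholder deflator values rather than actual published figures:

```python
def inflation_adjusted_estimate(previous_estimate, deflator_latest, deflator_previous):
    """Adjust a post-closure cost estimate using an inflation factor derived from
    the annual Implicit Price Deflator (latest value over previous value)."""
    inflation_factor = deflator_latest / deflator_previous
    return previous_estimate * inflation_factor

# Placeholder inputs: $1.25M previous estimate, deflator rising from 118.9 to 122.3.
print(round(inflation_adjusted_estimate(1_250_000.0, 122.3, 118.9), 2))
```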

  6. Estimating the Cost of National Class Size Reductions under Different Policy Alternatives.

    ERIC Educational Resources Information Center

    Brewer, Dominic J.; Krop, Cathy; Gill, Brian P.; Reichardt, Robert

    1999-01-01

    Estimates the operational costs of nationwide class-size-reduction programs under various policy alternatives, including the specified class size, flexibility in implementation, and whether the policy is targeted toward at-risk students. Depending on the options, estimated costs range from about $2 billion per year to over $11 billion per year.…

  7. 31 CFR Appendix I(f) to Part 13 - Estimated Overhead and Administrative Costs

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance: Treasury 1 2011-07-01 2011-07-01 false Estimated Overhead and Administrative Costs I(F) Appendix I(F) to Part 13 Money and Finance: Treasury Office of the Secretary of the Treasury... Pt. 13, App. I(F) Appendix I(F) to Part 13—Estimated Overhead and Administrative Costs Date:...

  8. USDA Estimates of the Cost of Raising a Child: A Guide to Their Use and Interpretation.

    ERIC Educational Resources Information Center

    Edwards, Carolyn S.

    This guide describes estimates of the cost of raising a child made by the Family Economics Research Group of the United States Department of Agriculture (USDA). The guide starts with a description of what estimates are available, giving short profiles of the cost of raising urban, rural nonfarm, and rural farm children. The next section defines…

  9. 48 CFR 736.605 - Government cost estimate for architect-engineer work.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Government cost estimate for architect-engineer work. 736.605 Section 736.605 Federal Acquisition Regulations System AGENCY FOR... Architect-Engineer Services 736.605 Government cost estimate for architect-engineer work. See 736.602-3(c)(5)....

  10. 31 CFR Appendix I(f) to Part 13 - Estimated Overhead and Administrative Costs

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Estimated Overhead and Administrative Costs I(F) Appendix I(F) to Part 13 Money and Finance: Treasury Office of the Secretary of the Treasury... Pt. 13, App. I(F) Appendix I(F) to Part 13—Estimated Overhead and Administrative Costs Date:...

  11. Estimating Development Cost of an Interactive Website Based Cancer Screening Promotion Program

    PubMed Central

    Lairson, David R.; Chung, Tong Han; Smith, Lisa G.; Springston, Jeffrey K.; Champion, Victoria L.

    2015-01-01

    Objectives The aim of this study was to estimate the initial development costs for an innovative, talk-show-format tailored intervention delivered via the interactive web for increasing cancer screening in women aged 50 to 75 who were non-adherent to screening guidelines for colorectal cancer and/or breast cancer. Methods The cost of the intervention development was estimated from a societal perspective. Micro-costing methods plus vendor contract costs were used to estimate cost. Staff logs were used to track personnel time. Non-personnel costs include all additional resources used to produce the intervention. Results Development cost of the interactive web-based intervention was $0.39 million, of which 77% was direct cost. About 98% of the cost was incurred in personnel time cost, contract cost, and overhead cost. Conclusions The new web-based disease prevention medium required substantial investment in health promotion and media specialist time. The development cost was primarily driven by the high level of human capital required. The cost of intervention development is important information for assessing and planning future public and private investments in web-based health promotion interventions. PMID:25749548
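
    A minimal sketch of the micro-costing aggregation described in the abstract, in which logged staff hours are valued at wage rates and combined with vendor contract and overhead costs; every role, rate, and figure below is hypothetical.

    ```python
    # Micro-costing sketch: value logged staff hours at wage rates, then add
    # vendor contract costs and an overhead share. All numbers are hypothetical.
    staff_logs = [  # (role, hours, hourly wage including fringe)
        ("health promotion specialist", 620, 55.0),
        ("media specialist",            480, 65.0),
        ("project coordinator",         300, 40.0),
    ]
    contract_costs = 45_000.0   # e.g., video production vendor
    overhead_rate = 0.30        # fraction applied to personnel cost

    personnel_cost = sum(hours * wage for _, hours, wage in staff_logs)
    overhead_cost = overhead_rate * personnel_cost
    total_development_cost = personnel_cost + contract_costs + overhead_cost

    print(f"personnel ${personnel_cost:,.0f} + contracts ${contract_costs:,.0f} "
          f"+ overhead ${overhead_cost:,.0f} = ${total_development_cost:,.0f}")
    ```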

  12. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... Planning and Performing Construction and Other Development Pt. 1924, Subpt. A, Exh. A Exhibit A to Subpart A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  13. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... Planning and Performing Construction and Other Development Pt. 1924, Subpt. A, Exh. A Exhibit A to Subpart A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  14. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... Planning and Performing Construction and Other Development Pt. 1924, Subpt. A, Exh. A Exhibit A to Subpart A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  15. 7 CFR Exhibit A to Subpart A of... - Estimated Breakdown of Dwelling Costs for Estimating Partial Payments

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Partial Payments A Exhibit A to Subpart A of Part 1924 Agriculture Regulations of the Department of... Planning and Performing Construction and Other Development Pt. 1924, Subpt. A, Exh. A Exhibit A to Subpart A of Part 1924—Estimated Breakdown of Dwelling Costs for Estimating Partial Payments With slab...

  16. ESTIMATING INNOVATIVE TECHNOLOGY COSTS FOR THE SITE PROGRAM

    EPA Science Inventory

    Among the objectives of the EPA`s Superfund Innovative Technology Evaluation (SITE) Program are two which pertain to the issue of economics: 1) That the program will provide a projected cost for each treatment technology demonstrated. 2) That the program will attempt to identify ...

  17. Estimating the Costs of Torture: Challenges and Opportunities.

    PubMed

    Mpinga, Emmanuel Kabengele; Kandala, Ngianga-Bakwin; Hasselgård-Rowe, Jennifer; Tshimungu Kandolo, Félicien; Verloo, Henk; Bukonda, Ngoyi K Zacharie; Chastonay, Philippe

    2015-12-01

    Due to its nature, extent and consequences, torture is considered a major public health problem and a serious violation of human rights. Our study aims to set the foundation for a theoretical framework of the costs related to torture. It examines existing challenges and proposes some solutions. Our proposed framework targets policy makers, human rights activists, professionals working in programmes, centres and rehabilitation projects, judges and lawyers, survivors of torture and their families and anyone involved in the prevention and fight against this practice and its consequences. We adopted a methodology previously used in studies investigating the challenges in measuring and valuing productivity costs in health disorders. We identify and discuss conceptual, methodological, political and ethical challenges that studies on the economic and social costs of torture pose and propose alternatives in terms of possible solutions to these challenges. The economic dimension of torture is rarely debated and integrated in research, policies and programmes. Several challenges such as epistemological, methodological, ethical or political ones have often been presented as obstacles to cost studies of torture and as an excuse for not investigating this dimension. In identifying, analysing and proposing solutions to these challenges, we intend to stimulate the integration of the economic dimension in research and prevention of torture strategies. PMID:26385586

  18. ACID RAIN MITIGATION STUDY. VOLUME II. FGD COST ESTIMATES (APPENDICES)

    EPA Science Inventory

    The report gives results of work to provide a consistent set of capital investment and operating costs for flue gas desulfurization (FGD) systems retrofitted to existing industrial boilers. The investigation of wet limestone scrubbers and lime spray drying FGD systems included: (...

  19. ACID RAIN MITIGATION STUDY. VOLUME I: FGD COST ESTIMATES

    EPA Science Inventory

    The report gives results of work to provide a consistent set of capital investment and operating costs for flue gas desulfurization (FGD) systems retrofitted to existing industrial boilers. The investigation of wet limestone scrubbers and lime spray drying FGD systems included: (...

  20. An Analysis of Government Cost Estimates of Freedom of Information Act Compliance.

    ERIC Educational Resources Information Center

    Ullmann, John; List, Karen

    1985-01-01

    Examines the Reagan administration's use of cost as an argument for amending the Freedom of Information Act (FOIA). Finds two general problems with the argument: (1) incomplete, inconsistent, and inaccurate reporting of FOIA costs and (2) agency attitudes that affected the cost estimates they presented. (FL)

  1. Estimating Resource Costs of Levy Campaigns in Five Ohio School Districts

    ERIC Educational Resources Information Center

    Ingle, W. Kyle; Petroff, Ruth Ann; Johnson, Paul A.

    2011-01-01

    Using Levin and McEwan's (2001) "ingredients method," this study identified the major activities and associated costs of school levy campaigns in five districts. The ingredients were divided into one of five cost categories--human resources, facilities, fees, marketing, and supplies. As to overall costs of the campaigns, estimates ranged from a…

  2. Estimating the costs of intensity-modulated and 3-dimensional conformal radiotherapy in Ontario

    PubMed Central

    Yong, J.H.E.; McGowan, T.; Redmond-Misner, R.; Beca, J.; Warde, P.; Gutierrez, E.; Hoch, J.S.

    2016-01-01

    Background Radiotherapy is a common treatment for many cancers, but up-to-date estimates of the costs of radiotherapy are lacking. In the present study, we estimated the unit costs of intensity-modulated radiotherapy (imrt) and 3-dimensional conformal radiotherapy (3D-crt) in Ontario. Methods An activity-based costing model was developed to estimate the costs of imrt and 3D-crt in prostate cancer. It included the costs of equipment, staff, and supporting infrastructure. The framework was subsequently adapted to estimate the costs of radiotherapy in breast cancer and head-and-neck cancer. We also tested various scenarios by varying the program maturity and the use of volumetric modulated arc therapy (vmat) alongside imrt. Results From the perspective of the health care system, treating prostate cancer with imrt and 3D-crt respectively cost $12,834 and $12,453 per patient. The cost of radiotherapy ranged from $5,270 to $14,155 and was sensitive to analytic perspective, radiation technique, and disease site. Cases of head-and-neck cancer were the most costly, being driven by treatment complexity and fractions per treatment. Although imrt was more costly than 3D-crt, its cost will likely decline over time as programs mature and vmat is incorporated. Conclusions Our costing model can be modified to estimate the costs of 3D-crt and imrt for various disease sites and settings. The results demonstrate the important role of capital costs in studies of radiotherapy cost from a health system perspective, which our model can accommodate. In addition, our study established the need for future analyses of imrt cost to consider how vmat affects time consumption. PMID:27330359
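
    A compact, hedged sketch of an activity-based costing allocation of the kind the abstract describes: annualized equipment and staff costs are converted to per-minute rates and charged to planning, quality-assurance, and treatment-fraction activities. All resources, minutes, and dollar figures are invented for illustration.

    ```python
    # Activity-based costing sketch for one radiotherapy course: annualized
    # resource costs are allocated to activities in proportion to the minutes
    # each resource is used. Every number below is illustrative only.
    annual_cost    = {"linac": 450_000.0, "therapist": 95_000.0, "physicist": 130_000.0}
    annual_minutes = {"linac": 100_000.0, "therapist": 90_000.0, "physicist": 85_000.0}
    cost_per_minute = {r: annual_cost[r] / annual_minutes[r] for r in annual_cost}

    # Minutes of each resource consumed per activity.
    activities = {
        "planning": {"physicist": 120, "therapist": 60},
        "qa":       {"physicist": 45},
        "fraction": {"linac": 15, "therapist": 30},   # per treatment fraction
    }
    fractions_per_course = 20

    def activity_cost(usage: dict) -> float:
        return sum(minutes * cost_per_minute[res] for res, minutes in usage.items())

    course_cost = (activity_cost(activities["planning"])
                   + activity_cost(activities["qa"])
                   + fractions_per_course * activity_cost(activities["fraction"]))
    print(f"estimated cost per treatment course: ${course_cost:,.0f}")
    ```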

  3. Time-Dependent Risk Estimation and Cost-Benefit Analysis for Mitigation Actions

    NASA Astrophysics Data System (ADS)

    van Stiphout, T.; Wiemer, S.; Marzocchi, W.

    2009-04-01

    Earthquakes strongly cluster in space and time. Consequently, the most dangerous time is right after a moderate earthquake has happened, because there is a 'high' (i.e., 2-5 percent) probability that this event will be followed by an aftershock as large as or larger than the initiating event. The seismic hazard during this period exceeds the background probability by several orders of magnitude. Scientists have developed increasingly accurate forecast models that capture this time-dependent hazard, and such models are currently being validated in prospective testing. However, this probabilistic hazard information is difficult for decision makers, the media, and the general public to digest. Here, we introduce a possible bridge between seismology and decision makers (authorities, civil defense) by proposing a more objective approach to time-dependent risk assessment. Short Term Earthquake Risk assessment (STEER) combines aftershock hazard and loss assessments. We use site-specific information on site effects and building class distribution and combine this with existing loss models to compute site-specific time-dependent risk curves (probabilities of exceedance for fatalities, injuries, damage, etc.). We show the effect of uncertainties in the different components using Monte Carlo simulations of the input parameters. These time-dependent risk curves can act as decision support. We extend the STEER approach by introducing a cost-benefit approach for certain mitigation actions after a medium-sized earthquake. Such cost-benefit approaches have recently been developed for volcanic risk assessment to rationalize precautionary evacuations in densely inhabited areas threatened by volcanoes. Here we extend the concept to time-dependent probabilistic seismic risk assessment. For the cost-benefit analysis of mitigation actions we calculate the ratio between the cost for the mitigation actions and the cost of the
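
    Although the abstract is truncated above, the cost-benefit criterion it refers to can be illustrated simply: a mitigation action is worthwhile when its cost is lower than the expected loss it avoids, i.e. when the event probability exceeds the cost/loss ratio. All numbers in the sketch below are invented.

    ```python
    # Cost-benefit check for a precautionary mitigation action (e.g., evacuation):
    # act if P_event * avoided_loss > mitigation_cost, i.e. if P_event > C / L.
    # All values are illustrative.
    mitigation_cost = 2.0e6   # cost C of the action
    avoided_loss    = 1.5e8   # loss L prevented if the event occurs
    p_event         = 0.03    # forecast probability of a damaging aftershock

    threshold = mitigation_cost / avoided_loss
    print(f"probability threshold C/L = {threshold:.4f}")
    print("mitigate" if p_event > threshold else "do not mitigate")
    ```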

  4. Estimating Criminal Justice System Costs and Cost-Savings Benefits of Day Reporting Centers

    ERIC Educational Resources Information Center

    Craddock, Amy

    2004-01-01

    This paper reports on the net cost-savings benefits (loss) to the criminal justice system of one rural and one urban day reporting center, both of which serve high risk/high need probationers. It also discusses issues of conducting criminal justice system cost studies of community corrections programs. The average DRC participant in the rural…

  5. Overweight and obesity on the island of Ireland: an estimation of costs

    PubMed Central

    Dee, Anne; Callinan, Aoife; Doherty, Edel; O'Neill, Ciaran; McVeigh, Treasa; Sweeney, Mary Rose; Staines, Anthony; Kearns, Karen; Fitzgerald, Sarah; Sharp, Linda; Kee, Frank; Hughes, John; Balanda, Kevin; Perry, Ivan J

    2015-01-01

    Objectives The increasing prevalence of overweight and obesity worldwide continues to compromise population health and creates a wider societal cost in terms of productivity loss and premature mortality. Despite extensive international literature on the cost of overweight and obesity, findings are inconsistent between Europe and the USA, and particularly within Europe. Studies vary on issues of focus, specific costs and methods. This study aims to estimate the healthcare and productivity costs of overweight and obesity for the island of Ireland in 2009, using both top-down and bottom-up approaches. Methods Costs were estimated across four categories: healthcare utilisation, drug costs, work absenteeism and premature mortality. Healthcare costs were estimated using Population Attributable Fractions (PAFs). PAFs were applied to national cost data for hospital care and drug prescribing. PAFs were also applied to social welfare and national mortality data to estimate productivity costs due to absenteeism and premature mortality. Results The healthcare costs of overweight and obesity in 2009 were estimated at €437 million for the Republic of Ireland (ROI) and €127.41 million for NI. Productivity loss due to overweight and obesity was up to €865 million for ROI and €362 million for NI. The main drivers of healthcare costs are cardiovascular disease, type II diabetes, colon cancer, stroke and gallbladder disease. In terms of absenteeism, low back pain is the main driver in both jurisdictions, and for productivity loss due to premature mortality the primary driver of cost is coronary heart disease. Conclusions The costs are substantial, and urgent public health action is required in Ireland to address the problem of increasing prevalence of overweight and obesity, which if left unchecked will lead to unsustainable cost escalation within the health service and unacceptable societal costs. PMID:25776042
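
    A small sketch of the population-attributable-fraction calculation described in the methods, using Levin's formula PAF = p(RR - 1) / (1 + p(RR - 1)) and applying it to a total disease cost; the prevalence, relative risk, and cost figures are invented, not the study's inputs.

    ```python
    # Attributable-cost sketch using a population attributable fraction (PAF).
    # Levin's formula: PAF = p*(RR - 1) / (1 + p*(RR - 1)).
    # Prevalence, relative risk, and total cost below are illustrative only.
    def paf(prevalence: float, relative_risk: float) -> float:
        excess = prevalence * (relative_risk - 1.0)
        return excess / (1.0 + excess)

    obesity_prevalence  = 0.25           # exposure prevalence in the population
    rr_type2_diabetes   = 3.0            # relative risk of disease given exposure
    annual_disease_cost = 120_000_000.0  # total annual cost of the disease (EUR)

    fraction = paf(obesity_prevalence, rr_type2_diabetes)
    print(f"PAF = {fraction:.1%}, attributable cost = EUR {fraction * annual_disease_cost:,.0f}")
    ```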

  6. Solid Waste Operations Complex W-113: Project cost estimate. Preliminary design report. Volume IV

    SciTech Connect

    1995-01-01

    This document contains Volume IV of the Preliminary Design Report for the Solid Waste Operations Complex W-113 which is the Project Cost Estimate and construction schedule. The estimate was developed based upon Title 1 material take-offs, budgetary equipment quotes and Raytheon historical in-house data. The W-113 project cost estimate and project construction schedule were integrated together to provide a resource loaded project network.

  7. Estimating Power Outage Cost based on a Survey for Industrial Customers

    NASA Astrophysics Data System (ADS)

    Yoshida, Yoshikuni; Matsuhashi, Ryuji

    A survey on power outage costs was conducted among industrial customers. A total of 5,139 factories, all designated energy management factories in Japan, reported their power consumption and the loss of production value that a one-hour power outage on a summer weekday would cause. The median unit cost of a power outage across all sectors is estimated at 672 yen/kWh. The services for amusement and hobbies sector and the manufacture of information and communication electronics equipment sector have relatively high unit outage costs. The direct damage cost from a power outage across all sectors reaches 77 billion yen. Using input-output analysis, we then estimated the indirect damage cost caused by the knock-on effects of the production halt; across all sectors it reaches 91 billion yen. The wholesale and retail trade sector has the largest direct damage cost, and the manufacture of transportation equipment sector has the largest indirect damage cost.
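
    A hedged sketch of the input-output step: a vector of direct production losses is propagated through the Leontief inverse to obtain total (direct plus indirect) losses. The three-sector technical-coefficient matrix and the loss figures are illustrative only.

    ```python
    # Input-output sketch of indirect damage: a direct production loss propagates
    # through the economy via the Leontief inverse (I - A)^-1.
    # The 3-sector coefficient matrix and loss vector are illustrative only.
    import numpy as np

    A = np.array([[0.10, 0.20, 0.05],   # technical coefficients a_ij
                  [0.15, 0.10, 0.10],
                  [0.05, 0.25, 0.15]])
    direct_loss = np.array([30.0, 50.0, 10.0])   # direct output loss by sector (billion yen)

    leontief_inverse = np.linalg.inv(np.eye(3) - A)
    total_loss = leontief_inverse @ direct_loss   # direct + indirect output loss
    indirect_loss = total_loss - direct_loss

    print("total loss by sector:", np.round(total_loss, 1))
    print("economy-wide indirect loss:", round(float(indirect_loss.sum()), 1), "billion yen")
    ```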

  8. COMPUTER PROGRAMS FOR ESTIMATING THE COST OF PARTICULATE CONTROL EQUIPMENT

    EPA Science Inventory

    The report describes an interactive computer program, written to estimate the capital and operating expenses of electrostatic precipitators, fabric filters, and venturi scrubbers used on coal-fired boilers. The program accepts as input the current interest rate, coal analysis, em...

  9. Development of a new, robust and accurate, spectroscopic metric for scatterer size estimation in optical coherence tomography (OCT) images

    NASA Astrophysics Data System (ADS)

    Kassinopoulos, Michalis; Pitris, Costas

    2016-03-01

    The modulations appearing on the backscattering spectrum originating from a scatterer are related to its diameter as described by Mie theory for spherical particles. Many metrics for Spectroscopic Optical Coherence Tomography (SOCT) take advantage of this observation in order to enhance the contrast of Optical Coherence Tomography (OCT) images. However, none of these metrics has achieved high accuracy when calculating the scatterer size. In this work, Mie theory was used to further investigate the relationship between the degree of modulation in the spectrum and the scatterer size. From this study, a new spectroscopic metric, the bandwidth of the Correlation of the Derivative (COD) was developed which is more robust and accurate, compared to previously reported techniques, in the estimation of scatterer size. The self-normalizing nature of the derivative and the robustness of the first minimum of the correlation as a measure of its width, offer significant advantages over other spectral analysis approaches especially for scatterer sizes above 3 μm. The feasibility of this technique was demonstrated using phantom samples containing 6, 10 and 16 μm diameter microspheres as well as images of normal and cancerous human colon. The results are very promising, suggesting that the proposed metric could be implemented in OCT spectral analysis for measuring nuclear size distribution in biological tissues. A technique providing such information would be of great clinical significance since it would allow the detection of nuclear enlargement at the earliest stages of precancerous development.
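
    A rough numerical sketch of the idea behind a correlation-of-the-derivative metric: differentiate the backscattered spectrum, autocorrelate the derivative, and take the lag of the first minimum as a width measure related to scatterer size. The cosine-modulated synthetic spectrum below stands in for the Mie-theory computation and is purely illustrative.

    ```python
    # Sketch of a "correlation of the derivative" style metric: differentiate the
    # backscattering spectrum, autocorrelate it, and use the lag of the first
    # minimum as a width measure related to scatterer size.
    # The synthetic cosine-modulated spectrum is illustrative only.
    import numpy as np

    wavenumber = np.linspace(7.0, 9.0, 512)    # 1/um, illustrative band
    modulation_period = 0.35                   # toy value set by scatterer diameter
    spectrum = 1.0 + 0.3 * np.cos(2 * np.pi * wavenumber / modulation_period)

    derivative = np.gradient(spectrum, wavenumber)   # suppresses the slow envelope
    derivative -= derivative.mean()
    acf = np.correlate(derivative, derivative, mode="full")[len(derivative) - 1:]
    acf /= acf[0]

    first_min_lag = next(i for i in range(1, len(acf) - 1)
                         if acf[i] < acf[i - 1] and acf[i] <= acf[i + 1])
    lag_width = first_min_lag * (wavenumber[1] - wavenumber[0])
    print(f"first ACF minimum at lag {lag_width:.3f} 1/um (about half the modulation period)")
    ```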

  10. Optimization of tissue physical parameters for accurate temperature estimation from finite-element simulation of radiofrequency ablation.

    PubMed

    Subramanian, Swetha; Mast, T Douglas

    2015-10-01

    Computational finite element models are commonly used for the simulation of radiofrequency ablation (RFA) treatments. However, the accuracy of these simulations is limited by the lack of precise knowledge of tissue parameters. In this technical note, an inverse solver based on the unscented Kalman filter (UKF) is proposed to optimize values for specific heat, thermal conductivity, and electrical conductivity resulting in accurately simulated temperature elevations. A total of 15 RFA treatments were performed on ex vivo bovine liver tissue. For each RFA treatment, 15 finite-element simulations were performed using a set of deterministically chosen tissue parameters to estimate the mean and variance of the resulting tissue ablation. The UKF was implemented as an inverse solver to recover the specific heat, thermal conductivity, and electrical conductivity corresponding to the measured area of the ablated tissue region, as determined from gross tissue histology. These tissue parameters were then employed in the finite element model to simulate the position- and time-dependent tissue temperature. Results show good agreement between simulated and measured temperature. PMID:26352462
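
    A hedged sketch of using an unscented Kalman filter as an inverse solver for the three tissue parameters, in the spirit of the abstract. It relies on the third-party filterpy package, and the measurement function hx below is a toy surrogate standing in for the finite-element forward simulation; all values are invented.

    ```python
    # UKF as an inverse solver: recover tissue parameters (specific heat c,
    # thermal conductivity k, electrical conductivity sigma) from a measured
    # ablation area. hx is a toy surrogate for the finite-element forward model.
    # All numbers are illustrative. Requires the third-party "filterpy" package.
    import numpy as np
    from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

    def fx(x, dt):
        return x    # the tissue parameters are static in time

    def hx(x):
        c, k, sigma = x    # toy mapping from parameters to ablation area (cm^2)
        return np.array([0.8 + 2.0 * sigma + 0.5 * k - 1e-4 * c])

    points = MerweScaledSigmaPoints(n=3, alpha=1.0, beta=2.0, kappa=0.0)
    ukf = UnscentedKalmanFilter(dim_x=3, dim_z=1, dt=1.0, hx=hx, fx=fx, points=points)
    ukf.x = np.array([3600.0, 0.5, 0.3])           # initial guesses: c [J/kg/K], k [W/m/K], sigma [S/m]
    ukf.P = np.diag([200.0**2, 0.1**2, 0.1**2])    # prior parameter uncertainty
    ukf.R = np.array([[0.05**2]])                  # measurement noise on ablation area
    ukf.Q = np.eye(3) * 1e-8                       # tiny process noise for numerical stability

    measured_area = 1.9    # cm^2, e.g. from gross tissue histology (illustrative)
    for _ in range(10):    # iterate on the same measurement to tighten the estimate
        ukf.predict()
        ukf.update(np.array([measured_area]))
    print("recovered parameters (c, k, sigma):", ukf.x)
    ```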

  11. Optimization of tissue physical parameters for accurate temperature estimation from finite-element simulation of radiofrequency ablation

    NASA Astrophysics Data System (ADS)

    Subramanian, Swetha; Mast, T. Douglas

    2015-09-01

    Computational finite element models are commonly used for the simulation of radiofrequency ablation (RFA) treatments. However, the accuracy of these simulations is limited by the lack of precise knowledge of tissue parameters. In this technical note, an inverse solver based on the unscented Kalman filter (UKF) is proposed to optimize values for specific heat, thermal conductivity, and electrical conductivity resulting in accurately simulated temperature elevations. A total of 15 RFA treatments were performed on ex vivo bovine liver tissue. For each RFA treatment, 15 finite-element simulations were performed using a set of deterministically chosen tissue parameters to estimate the mean and variance of the resulting tissue ablation. The UKF was implemented as an inverse solver to recover the specific heat, thermal conductivity, and electrical conductivity corresponding to the measured area of the ablated tissue region, as determined from gross tissue histology. These tissue parameters were then employed in the finite element model to simulate the position- and time-dependent tissue temperature. Results show good agreement between simulated and measured temperature.

  12. Hyperketonemia in early lactation dairy cattle: a deterministic estimate of component and total cost per case.

    PubMed

    McArt, J A A; Nydam, D V; Overton, M W

    2015-03-01

    The purpose of this study was to develop a deterministic economic model to estimate the costs associated with (1) the component cost per case of hyperketonemia (HYK) and (2) the total cost per case of HYK when accounting for costs related to HYK-attributed diseases. Data from current literature was used to model the incidence and risks of HYK (defined as a blood β-hydroxybutyrate concentration≥1.2 mmol/L), displaced abomasa (DA), metritis, disease associations, milk production, culling, and reproductive outcomes. The component cost of HYK was estimated based on 1,000 calvings per year; the incidence of HYK in primiparous and multiparous animals; the percent of animals receiving clinical treatment; the direct costs of diagnostics, therapeutics, labor, and death loss; and the indirect costs of future milk production losses, future culling losses, and reproduction losses. Costs attributable to DA and metritis were estimated based on the incidence of each disease in the first 30 DIM; the number of cases of each disease attributable to HYK; the direct costs of diagnostics, therapeutics, discarded milk during treatment and the withdrawal period, veterinary service (DA only), and death loss; and the indirect costs of future milk production losses, future culling losses, and reproduction losses. The component cost per case of HYK was estimated at $134 and $111 for primiparous and multiparous animals, respectively; the average component cost per case of HYK was estimated to be $117. Thirty-four percent of the component cost of HYK was due to future reproductive losses, 26% to death loss, 26% to future milk production losses, 8% to future culling losses, 3% to therapeutics, 2% to labor, and 1% to diagnostics. The total cost per case of HYK was estimated at $375 and $256 for primiparous and multiparous animals, respectively; the average total cost per case of HYK was $289. Forty-one percent of the total cost of HYK was due to the component cost of HYK, 33% to costs
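
    Although the abstract is truncated, the deterministic component-cost structure it describes can be sketched as a simple sum of direct and indirect cost items per case, scaled by incidence to a herd-level total. The cost items below are illustrative values loosely consistent with the percentage breakdown quoted above, not the study's actual inputs.

    ```python
    # Deterministic component-cost sketch: per-case cost as the sum of direct and
    # indirect items, then an annual herd-level total from incidence.
    # All cost items and rates are illustrative only.
    direct_costs = {"diagnostics": 2.0, "therapeutics": 4.0, "labor": 3.0, "death_loss": 30.0}
    indirect_costs = {"future_milk_loss": 30.0, "future_culling_loss": 9.0,
                      "reproduction_loss": 39.0}

    cost_per_case = sum(direct_costs.values()) + sum(indirect_costs.values())

    calvings_per_year = 1_000
    incidence = 0.40    # fraction of fresh cows affected (illustrative)
    annual_herd_cost = calvings_per_year * incidence * cost_per_case

    print(f"component cost per case: ${cost_per_case:.0f}")
    print(f"annual herd-level cost:  ${annual_herd_cost:,.0f}")
    ```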

  13. Survey of State-Level Cost and Benefit Estimates of Renewable Portfolio Standards

    SciTech Connect

    Heeter, J.; Barbose, G.; Bird, L.; Weaver, S.; Flores-Espino, F.; Kuskova-Burns, K.; Wiser, R.

    2014-05-01

    Most renewable portfolio standards (RPS) have five or more years of implementation experience, enabling an assessment of their costs and benefits. Understanding RPS costs and benefits is essential for policymakers evaluating existing RPS policies, assessing the need for modifications, and considering new policies. This study provides an overview of methods used to estimate RPS compliance costs and benefits, based on available data and estimates issued by utilities and regulators. Over the 2010-2012 period, average incremental RPS compliance costs in the United States were equivalent to 0.8% of retail electricity rates, although substantial variation exists around this average, both from year-to-year and across states. The methods used by utilities and regulators to estimate incremental compliance costs vary considerably from state to state and a number of states are currently engaged in processes to refine and standardize their approaches to RPS cost calculation. The report finds that state assessments of RPS benefits have most commonly attempted to quantitatively assess avoided emissions and human health benefits, economic development impacts, and wholesale electricity price savings. Compared to the summary of RPS costs, the summary of RPS benefits is more limited, as relatively few states have undertaken detailed benefits estimates, and then only for a few types of potential policy impacts. In some cases, the same impacts may be captured in the assessment of incremental costs. For these reasons, and because methodologies and level of rigor vary widely, direct comparisons between the estimates of benefits and costs are challenging.

  14. The Hospitalization Costs of Diabetes and Hypertension Complications in Zimbabwe: Estimations and Correlations.

    PubMed

    Mutowo, Mutsa P; Lorgelly, Paula K; Laxy, Michael; Renzaho, Andre M N; Mangwiro, John C; Owen, Alice J

    2016-01-01

    Objective. Treating complications associated with diabetes and hypertension imposes significant costs on health care systems. This study estimated the hospitalization costs for inpatients in a public hospital in Zimbabwe. Methods. The study was retrospective and utilized secondary data from medical records. Total hospitalization costs were estimated using generalized linear models. Results. The median cost and interquartile range (IQR) for patients with diabetes, $994 (385-1553) mean $1319 (95% CI: 981-1657), was higher than patients with hypertension, $759 (494-1147) mean $914 (95% CI: 825-1003). Female patients aged below 65 years with diabetes had the highest estimated mean costs ($1467 (95% CI: 1177-1828)). Wound care had the highest estimated mean cost of all procedures, $2884 (95% CI: 2004-4149) for patients with diabetes and $2239 (95% CI: 1589-3156) for patients with hypertension. Age below 65 years, medical procedures (amputation, wound care, dialysis, and physiotherapy), the presence of two or more comorbidities, and being prescribed two or more drugs were associated with significantly higher hospitalization costs. Conclusion. Our estimated costs could be used to evaluate and improve current inpatient treatment and management of patients with diabetes and hypertension and determine the most cost-effective interventions to prevent complications and comorbidities. PMID:27403444
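
    A minimal sketch of fitting right-skewed cost data with a generalized linear model, using a Gamma family and log link as is common for hospital-cost analyses (the study's exact family, link, and covariates may differ); the miniature dataset is fabricated for illustration.

    ```python
    # GLM sketch for right-skewed hospitalization costs: Gamma family, log link.
    # The miniature dataset below is fabricated purely for illustration.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    data = pd.DataFrame({
        "cost":        [640, 1180, 2950, 820, 1470, 3890, 560, 2210, 990, 1730],
        "age_under65": [1, 1, 1, 0, 1, 1, 0, 1, 0, 0],
        "n_comorbid":  [0, 1, 2, 0, 1, 3, 0, 2, 1, 1],
        "wound_care":  [0, 0, 1, 0, 0, 1, 0, 1, 0, 0],
    })

    model = smf.glm("cost ~ age_under65 + n_comorbid + wound_care",
                    data=data,
                    family=sm.families.Gamma(link=sm.families.links.Log()))
    result = model.fit()

    # Exponentiated coefficients read as multiplicative effects on expected cost.
    print(np.exp(result.params).round(2))
    ```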

  15. The Hospitalization Costs of Diabetes and Hypertension Complications in Zimbabwe: Estimations and Correlations

    PubMed Central

    Mutowo, Mutsa P.; Lorgelly, Paula K.; Laxy, Michael; Mangwiro, John C.; Owen, Alice J.

    2016-01-01

    Objective. Treating complications associated with diabetes and hypertension imposes significant costs on health care systems. This study estimated the hospitalization costs for inpatients in a public hospital in Zimbabwe. Methods. The study was retrospective and utilized secondary data from medical records. Total hospitalization costs were estimated using generalized linear models. Results. The median cost and interquartile range (IQR) for patients with diabetes, $994 (385–1553) mean $1319 (95% CI: 981–1657), was higher than patients with hypertension, $759 (494–1147) mean $914 (95% CI: 825–1003). Female patients aged below 65 years with diabetes had the highest estimated mean costs ($1467 (95% CI: 1177–1828)). Wound care had the highest estimated mean cost of all procedures, $2884 (95% CI: 2004–4149) for patients with diabetes and $2239 (95% CI: 1589–3156) for patients with hypertension. Age below 65 years, medical procedures (amputation, wound care, dialysis, and physiotherapy), the presence of two or more comorbidities, and being prescribed two or more drugs were associated with significantly higher hospitalization costs. Conclusion. Our estimated costs could be used to evaluate and improve current inpatient treatment and management of patients with diabetes and hypertension and determine the most cost-effective interventions to prevent complications and comorbidities. PMID:27403444

  16. Comparing NASA and ESA Cost Estimating Methods for Human Missions to Mars

    NASA Technical Reports Server (NTRS)

    Hunt, Charles D.; vanPelt, Michel O.

    2004-01-01

    To compare working methodologies between the cost engineering functions at NASA Marshall Space Flight Center (MSFC) and the ESA European Space Research and Technology Centre (ESTEC), as well as to set up cost engineering capabilities for future manned Mars projects and other studies involving similar subsystem technologies at MSFC and ESTEC, a demonstration cost estimate exercise was organized. This exercise was a direct way of enhancing not only cooperation between the agencies but also both agencies' commitment to credible cost analyses. Cost engineers at MSFC and ESTEC independently prepared life-cycle cost estimates for a reference human Mars project and subsequently compared the results and estimating methods in detail. As a non-sensitive, public-domain reference case for human Mars projects, the Mars Direct concept was chosen. In this paper the results of the exercise are shown: the differences and similarities in estimating methodologies, philosophies, and databases between MSFC and ESTEC, as well as the estimate results for the Mars Direct concept. The most significant differences are explained and possible estimate improvements identified. In addition, the Mars Direct plan and the extensive cost breakdown structure jointly set up by MSFC and ESTEC for this concept are presented. It was found that NASA applied estimating models based mainly on historical Apollo and Space Shuttle cost data, taking into account the changes in technology since then. ESA used models based mostly on European satellite and launcher cost data, taking into account the higher equipment and testing standards for human space flight. Most of NASA's and ESA's estimates for the Mars Direct case are comparable, but there are some important, consistent differences in the estimates for: 1) Large Structures and Thermal Control subsystems; 2) System Level Management, Engineering, Product Assurance and Assembly, Integration and Test/Verification activities; 3) Mission Control; 4) Space Agency Program Level

  17. Estimate of the direct and indirect annual cost of bacterial conjunctivitis in the United States

    PubMed Central

    2009-01-01

    Background The aim of this study was to estimate both the direct and indirect annual costs of treating bacterial conjunctivitis (BC) in the United States. This was a cost of illness study performed from a U.S. healthcare payer perspective. Methods A comprehensive review of the medical literature was supplemented by data on the annual incidence of BC which was obtained from an analysis of the National Ambulatory Medical Care Survey (NAMCS) database for the year 2005. Cost estimates for medical visits and laboratory or diagnostic tests were derived from published Medicare CPT fee codes. The cost of prescription drugs was obtained from standard reference sources. Indirect costs were calculated as those due to lost productivity. Due to the acute nature of BC, no cost discounting was performed. All costs are expressed in 2007 U.S. dollars. Results The number of BC cases in the U.S. for 2005 was estimated at approximately 4 million yielding an estimated annual incidence rate of 135 per 10,000. Base-case analysis estimated the total direct and indirect cost of treating patients with BC in the United States at $589 million. One-way sensitivity analysis, assuming either a 20% variation in the annual incidence of BC or treatment costs, generated a cost range of $469 million to $705 million. Two-way sensitivity analysis, assuming a 20% variation in both the annual incidence of BC and treatment costs occurring simultaneously, resulted in an estimated cost range of $377 million to $857 million. Conclusion The economic burden posed by BC is significant. The findings may prove useful to decision makers regarding the allocation of healthcare resources necessary to address the economic burden of BC in the United States. PMID:19939250
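
    A small sketch of the base-case and one-way/two-way sensitivity calculations described: total cost = cases x cost per case, with +/-20% variation applied to incidence, unit cost, or both. The per-case cost below is an illustrative implied value, so the ranges only approximate those reported.

    ```python
    # Base-case and sensitivity sketch: total cost = cases * cost per case,
    # varied by +/-20% in one or both inputs. The per-case cost is illustrative.
    base_cases = 4_000_000
    cost_per_case = 147.0    # illustrative; implies roughly a $588M base total
    variation = 0.20

    base_total = base_cases * cost_per_case
    one_way = [base_total * f for f in (1 - variation, 1 + variation)]
    two_way = [base_cases * (1 + a) * cost_per_case * (1 + b)
               for a in (-variation, variation) for b in (-variation, variation)]

    print(f"base case:     ${base_total / 1e6:,.0f}M")
    print(f"one-way range: ${min(one_way) / 1e6:,.0f}M to ${max(one_way) / 1e6:,.0f}M")
    print(f"two-way range: ${min(two_way) / 1e6:,.0f}M to ${max(two_way) / 1e6:,.0f}M")
    ```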

  18. Bread Basket: a gaming model for estimating home-energy costs

    SciTech Connect

    Not Available

    1982-01-01

    An instructional manual for answering the twenty variables on COLORADO ENERGY's computerized program estimating home energy costs. The program will generate home-energy cost estimates based on individual household data, such as total square footage, number of windows and doors, number and variety of appliances, heating system design, etc., and will print out detailed costs, showing the percentages of the total household budget that energy costs will amount to over a twenty-year span. Using the program, homeowners and policymakers alike can predict the effects of rising energy prices on total spending by Colorado households.

  19. Cost estimate of hospital stays for premature newborns in a public tertiary hospital in Brazil

    PubMed Central

    Desgualdo, Claudia Maria; Riera, Rachel; Zucchi, Paola

    2011-01-01

    OBJECTIVES: To estimate the direct costs of hospital stays for premature newborns in the Interlagos Hospital and Maternity Center in São Paulo, Brazil and to assess the difference between the amount reimbursed to the hospital by the Unified Health System and the real cost of care for each premature newborn. METHODS: A cost-estimate study in which hospital and professional costs were estimated for premature infants born at 22 to 36 weeks gestation during the calendar year of 2004 and surviving beyond one hour of age. Direct costs included hospital services, professional care, diagnoses and therapy, orthotics, prosthetics, special materials, and blood products. Costs were estimated using tables published by the Unified Health System and the Brasíndice as well as the list of medical procedures provided by the Brazilian Classification of Medical Procedures. RESULTS: The average direct cost of care for initial hospitalization of a premature newborn in 2004 was $2,386 USD. Total hospital expenses and professional services for all premature infants in this hospital were $227,000 and $69,500 USD, respectively. The costs for diagnostic testing and blood products for all premature infants totaled $22,440 and $1,833 USD. The daily average cost of a premature newborn weighing less than 1,000 g was $115 USD, and the daily average cost of a premature newborn weighing more than 2,500 g was $89 USD. Amounts reimbursed to the hospital by the Unified Health System corresponded to only 27.42% of the real cost of care. CONCLUSIONS: The cost of hospital stays for premature newborns was much greater than the amount reimbursed to the hospital by the Unified Health System. The highest costs corresponded to newborns with lower birth weight. Hospital costs progressively and discretely decreased as the newborns' weight increased. PMID:22012050

  20. Motion estimation by integrated low cost system (vision and MEMS) for positioning of a scooter "Vespa"

    NASA Astrophysics Data System (ADS)

    Guarnieri, A.; Milan, N.; Pirotti, F.; Vettore, A.

    2011-12-01

    In the automotive sector, especially in the last decade, a growing number of investigations have examined electronic systems that check and correct driver behavior to increase road safety. The ability to identify the vehicle position with high accuracy in a mapping reference frame, for driving directions and best-route analysis, is another topic attracting a great deal of interest from the research and development sector. To achieve accurate vehicle positioning and integrate response events, it is necessary to estimate, at each time step, the position, orientation, and velocity of the system. To this aim, low-cost GPS and MEMS sensors can be used. Compared with a four-wheel vehicle, the dynamics of a two-wheel vehicle (e.g., a scooter) are more complex: more degrees of freedom must be taken into account to describe its motion. For example, a scooter can twist sideways, generating a roll angle. A slight pitch angle has to be considered as well, since the wheel suspensions have a greater range of motion than those of four-wheel vehicles. In this paper we present a method for the accurate reconstruction of the trajectory of a motorcycle (a "Vespa" scooter), which can be used as an alternative to the "classical" approach based on the integration of GPS and INS sensors. The position and orientation of the scooter are derived from MEMS data and images acquired by an on-board digital camera. A Bayesian filter provides the means for integrating the data from the MEMS-based orientation sensor and the GPS receiver.
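
    As a much-simplified, hedged stand-in for the Bayesian fusion of MEMS, GPS, and camera data described above, the sketch below estimates the roll angle with a complementary filter that blends an integrated (drifting) gyro rate with a noisy accelerometer tilt; all signals are synthetic.

    ```python
    # Complementary-filter sketch for roll estimation from MEMS gyro and
    # accelerometer data -- a simplified stand-in for the full Bayesian fusion of
    # MEMS, GPS and camera measurements. All signals below are synthetic.
    import math

    def complementary_filter(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
        """Blend integrated gyro rate (smooth but drifting) with accelerometer tilt (noisy but absolute)."""
        roll = accel_angles[0]
        history = []
        for rate, acc_angle in zip(gyro_rates, accel_angles):
            roll = alpha * (roll + rate * dt) + (1.0 - alpha) * acc_angle
            history.append(roll)
        return history

    # Synthetic 2-second lean into a turn: true roll ramps from 0 to 0.3 rad.
    n, dt = 200, 0.01
    true_roll = [0.3 * min(1.0, i / 100.0) for i in range(n)]
    gyro = [(true_roll[i] - true_roll[i - 1]) / dt + 0.01 for i in range(1, n)]   # rate with bias
    accel = [r + 0.05 * math.sin(7.0 * i) for i, r in enumerate(true_roll[1:])]   # noisy tilt

    estimate = complementary_filter(gyro, accel, dt=dt)
    print(f"final roll estimate: {estimate[-1]:.3f} rad (true value 0.300 rad)")
    ```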