Science.gov

Sample records for accurate cost information

  1. Profitable capitation requires accurate costing.

    PubMed

    West, D A; Hicks, L L; Balas, E A; West, T D

    1996-01-01

    In the name of costing accuracy, nurses are asked to track inventory use on a per-treatment basis, even though more significant costs, such as general overhead and nursing salaries, are usually allocated to patients or treatments on an average-cost basis. Accurate treatment costing and financial viability require analysis of all resources actually consumed in treatment delivery, including nursing services and inventory. More precise costing information enables more profitable decisions, as is demonstrated by comparing the ratio-of-cost-to-treatment method (aggregate costing) with alternative activity-based costing (ABC) methods. Nurses must participate in this costing process to ensure that capitation bids are based upon accurate costs rather than simple averages. PMID:8788799
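    The aggregate-versus-ABC contrast in this abstract can be sketched numerically. All treatments, rates, and overhead figures below are hypothetical; the point is only that averaging hides the cost spread that activity-based tracing reveals.

```python
# Hypothetical comparison of aggregate (ratio-of-cost-to-treatment)
# costing versus activity-based costing (ABC).

treatments = {
    # treatment: (nursing_minutes, inventory_cost) -- invented figures
    "wound_dressing": (15, 4.00),
    "iv_therapy": (45, 22.00),
}

NURSING_RATE_PER_MIN = 0.75    # hypothetical fully loaded nursing cost
OVERHEAD_PER_TREATMENT = 10.0  # hypothetical allocated general overhead

def abc_cost(name):
    """Trace the resources each treatment actually consumes."""
    minutes, inventory = treatments[name]
    return minutes * NURSING_RATE_PER_MIN + inventory + OVERHEAD_PER_TREATMENT

def aggregate_cost():
    """Spread the total cost evenly over all treatments."""
    total = sum(abc_cost(t) for t in treatments)
    return total / len(treatments)
```

    With these numbers, ABC prices the simple dressing well below the average and the IV therapy well above it, so a capitation bid built on the average would over-bid one service and under-bid the other.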

  2. The high cost of accurate knowledge.

    PubMed

    Sutcliffe, Kathleen M; Weber, Klaus

    2003-05-01

    Many business thinkers believe it's the role of senior managers to scan the external environment to monitor contingencies and constraints, and to use that precise knowledge to modify the company's strategy and design. As these thinkers see it, managers need accurate and abundant information to carry out that role. According to that logic, it makes sense to invest heavily in systems for collecting and organizing competitive information. Another school of pundits contends that, since today's complex information often isn't precise anyway, it's not worth going overboard with such investments. In other words, it's not the accuracy and abundance of information that should matter most to top executives--rather, it's how that information is interpreted. After all, the role of senior managers isn't just to make decisions; it's to set direction and motivate others in the face of ambiguities and conflicting demands. Top executives must interpret information and communicate those interpretations--they must manage meaning more than they must manage information. So which of these competing views is the right one? Research conducted by academics Sutcliffe and Weber found that how accurate senior executives are about their competitive environments is indeed less important for strategy and corresponding organizational changes than the way in which they interpret information about their environments. Investments in shaping those interpretations, therefore, may create a more durable competitive advantage than investments in obtaining and organizing more information. And what kinds of interpretations are most closely linked with high performance? Their research suggests that high performers respond positively to opportunities, yet they aren't overconfident in their abilities to take advantage of those opportunities.

  3. Costing Information Services

    PubMed Central

    Lutz, Raymond P.

    1971-01-01

    Information centers are being established for many disciplines. For the medical profession, users can benefit directly from these centers by having information searched by medical library professionals and readily available. If the users of an information system are to share in the operating expenses, some equitable system of charges must be established. The numerous systems of establishing user charges are listed and discussed, with the advantages or disadvantages of each system explained. After the systems have been reviewed, alternative methods of establishing prices are presented along with a typical example of what these prices might be, ranging from $7.50 to $2.50 per request. The implementation of the cost system is outlined and certain philosophical questions are posed. PMID:5582090

  4. Accurate, low-cost 3D-models of gullies

    NASA Astrophysics Data System (ADS)

    Onnen, Nils; Gronz, Oliver; Ries, Johannes B.; Brings, Christine

    2015-04-01

    ...are able to produce accurate and low-cost 3D-models of gullies.

  5. Approaching system equilibrium with accurate or not accurate feedback information in a two-route system

    NASA Astrophysics Data System (ADS)

    Zhao, Xiao-mei; Xie, Dong-fan; Li, Qi

    2015-02-01

    With the development of intelligent transport systems, advanced information feedback strategies have been developed to reduce traffic congestion and enhance capacity. However, previous strategies provide accurate information to travelers, and our simulation results show that accurate information brings negative effects, especially when it is delayed: travelers prefer the route reported to be in the best condition, yet delayed information reflects past rather than current traffic conditions. Travelers therefore make wrong routing decisions, which decreases capacity, increases oscillations, and drives the system away from equilibrium. To avoid this negative effect, bounded rationality is taken into account by introducing a boundedly rational threshold BR: when the difference between the two routes is less than BR, the routes are chosen with equal probability. Bounded rationality helps improve efficiency in terms of capacity, oscillation, and the gap from the system equilibrium.
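    A minimal sketch of the boundedly rational threshold rule described above (the rule only; the paper's full traffic model with feedback dynamics is not reproduced here):

```python
import random

def choose_route(reported_t1, reported_t2, BR):
    """Boundedly rational choice between two routes: if the reported
    travel-time difference is below the threshold BR, the routes are
    treated as equivalent and picked with equal probability; otherwise
    the traveler takes the route reported as faster."""
    if abs(reported_t1 - reported_t2) < BR:
        return random.choice([1, 2])
    return 1 if reported_t1 < reported_t2 else 2

# With BR = 0 every traveler chases the (possibly stale) reported
# optimum, which is the herding behavior the abstract identifies as
# destabilizing under delayed feedback.
```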

  6. 78 FR 34604 - Submitting Complete and Accurate Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-10

    ... COMMISSION 10 CFR Part 50 Submitting Complete and Accurate Information AGENCY: Nuclear Regulatory Commission... accurate information as would a licensee or an applicant for a license.'' DATES: Submit comments by August... may submit comments by any of the following methods (unless this document describes a different...

  7. Managing Costs and Medical Information

    Cancer.gov

    People with cancer may face major financial challenges and need help dealing with the high costs of care. Cancer treatment can be very expensive, even when you have insurance. Learn ways to manage medical information, paperwork, bills, and other records.

  8. The free energy cost of accurate biochemical oscillations

    PubMed Central

    Cao, Yuansheng; Wang, Hongli; Ouyang, Qi; Tu, Yuhai

    2015-01-01

    Oscillation is an important cellular process that regulates timing of different vital life cycles. However, in the noisy cellular environment, oscillations can be highly inaccurate due to phase fluctuations. It remains poorly understood how biochemical circuits suppress phase fluctuations and what is the incurred thermodynamic cost. Here, we study three different types of biochemical oscillations representing three basic oscillation motifs shared by all known oscillatory systems. In all the systems studied, we find that the phase diffusion constant depends on the free energy dissipation per period following the same inverse relation parameterized by system specific constants. This relationship and its range of validity are shown analytically in a model of noisy oscillation. Microscopically, we find that the oscillation is driven by multiple irreversible cycles that hydrolyze the fuel molecules such as ATP; the number of phase coherent periods is proportional to the free energy consumed per period. Experimental evidence in support of this general relationship and testable predictions are also presented. PMID:26566392

  9. The free-energy cost of accurate biochemical oscillations

    NASA Astrophysics Data System (ADS)

    Cao, Yuansheng; Wang, Hongli; Ouyang, Qi; Tu, Yuhai

    2015-09-01

    Oscillations within the cell regulate the timing of many important life cycles. However, in this noisy environment, oscillations can be highly inaccurate owing to phase fluctuations. It remains poorly understood how biochemical circuits suppress these phase fluctuations and what is the incurred thermodynamic cost. Here, we study three different types of biochemical oscillation, representing three basic oscillation motifs shared by all known oscillatory systems. In all the systems studied, we find that the phase diffusion constant depends on the free-energy dissipation per period, following the same inverse relation parameterized by system-specific constants. This relationship and its range of validity are shown analytically in a model of noisy oscillation. Microscopically, we find that the oscillation is driven by multiple irreversible cycles that hydrolyse fuel molecules such as ATP; the number of phase coherent periods is proportional to the free energy consumed per period. Experimental evidence in support of this general relationship and testable predictions are also presented.
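    The two records above describe the same result. As a hedged paraphrase (the exact formula and constants are given in the paper, not here), the inverse relation between the phase diffusion constant D and the free-energy dissipation ΔW per period can be written as:

```latex
% Illustrative parameterization only: D_infty and C stand in for the
% "system-specific constants" mentioned in the abstracts.
D \;\approx\; D_{\infty} + \frac{C}{\Delta W},
\qquad
N_{\text{coherent}} \;\propto\; \Delta W
```

    The second proportionality restates the abstracts' claim that the number of phase-coherent periods grows with the free energy consumed per period.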

  10. Ultra-accurate collaborative information filtering via directed user similarity

    NASA Astrophysics Data System (ADS)

    Guo, Q.; Song, W.-J.; Liu, J.-G.

    2014-07-01

    A key challenge in collaborative filtering (CF) is how to obtain reliable and accurate results with the help of peers' recommendations. Since similarities from small-degree users to large-degree users tend to be larger than those in the opposite direction, the selections of large-degree users are recommended extensively by traditional second-order CF algorithms. By considering the direction of user similarity and second-order correlations to suppress the influence of mainstream preferences, we present the directed second-order CF (HDCF) algorithm to address the accuracy and diversity challenges of CF. Numerical results for two benchmark data sets, MovieLens and Netflix, show that the new algorithm outperforms state-of-the-art CF algorithms in accuracy. Compared with the random-walk-based CF algorithm proposed by Liu et al. (Int. J. Mod. Phys. C, 20 (2009) 285), the average ranking score reaches 0.0767 and 0.0402, an improvement of 27.3% and 19.1% for MovieLens and Netflix, respectively. In addition, diversity, precision and recall are also greatly enhanced. Without relying on any context-specific information, tuning the similarity direction of CF algorithms yields accurate and diverse recommendations. This work suggests that user similarity direction is an important factor in improving personalized recommendation performance.
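    The degree-induced asymmetry the abstract describes can be illustrated with a toy directed similarity that normalizes the item overlap by the source user's degree; the actual HDCF weighting in the paper may differ.

```python
def directed_similarity(items_src, items_dst):
    """Overlap normalized by the *source* user's degree, so that
    s(small -> large) can exceed s(large -> small). Illustrative only;
    not the paper's exact HDCF similarity."""
    if not items_src:
        return 0.0
    return len(items_src & items_dst) / len(items_src)

small_user = {1, 2}                    # degree 2
large_user = {1, 2, 3, 4, 5, 6, 7, 8}  # degree 8

s_small_to_large = directed_similarity(small_user, large_user)
s_large_to_small = directed_similarity(large_user, small_user)
# The two directions give different values, which is exactly the
# asymmetry a symmetric similarity measure cannot express.
```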

  11. Methods of Cost Reduction in Information Retrieval.

    ERIC Educational Resources Information Center

    Wilmoth, James Noel

    Cost effectiveness of the QUERY program for searching the Educational Resources Information Center (ERIC) data base has been an important issue at Auburn University. At least two broad categories of costs are associated with information retrieval from a data base such as ERIC: fixed costs or overhead and data base associated costs. The concern at…

  12. 76 FR 6516 - Insurance Cost Information Regulation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-04

    ... Vehicle Information and Cost Savings Act, 15 U.S.C. 1941(e), on March 5, 1993, 58 FR 12545, the National... National Highway Traffic Safety Administration Insurance Cost Information Regulation AGENCY: National... announces publication by NHTSA of the 2011 text and data for the annual insurance cost information...

  13. 77 FR 11191 - Insurance Cost Information Regulation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-24

    ..., 58 FR 12545, NHTSA amended 49 CFR part 582, Insurance Cost Information Regulation, to require all... insurance cost information booklet that all car dealers must make available to prospective purchasers... insurance costs of different makes and models of passenger cars based on differences in...

  14. Automatic and Accurate Shadow Detection Using Near-Infrared Information.

    PubMed

    Rüfenacht, Dominic; Fredembach, Clément; Süsstrunk, Sabine

    2014-08-01

    We present a method to automatically detect shadows in a fast and accurate manner by taking advantage of the inherent sensitivity of digital camera sensors to the near-infrared (NIR) part of the spectrum. Dark objects, which confound many shadow detection algorithms, often have much higher reflectance in the NIR. We can thus build an accurate shadow candidate map based on image pixels that are dark both in the visible and NIR representations. We further refine the shadow map by incorporating ratios of the visible to the NIR image, based on the observation that commonly encountered light sources have very distinct spectra in the NIR band. The results are validated on a new database, which contains visible/NIR images for a large variety of real-world shadow creating illuminant conditions, as well as manually labeled shadow ground truth. Both quantitative and qualitative evaluations show that our method outperforms current state-of-the-art shadow detection algorithms in terms of accuracy and computational efficiency.
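    The candidate-map rule in this abstract, where shadow pixels are dark in both the visible and NIR images while dark objects stay bright in the NIR, can be sketched as follows. The threshold and sample arrays are hypothetical, and the paper's refinement via visible/NIR ratios is omitted.

```python
import numpy as np

def shadow_candidates(visible, nir, dark_thresh=0.3):
    """Boolean shadow-candidate map for images normalized to [0, 1]:
    a pixel is a candidate only if it is dark in BOTH bands.
    The threshold value is an assumption for illustration."""
    return (visible < dark_thresh) & (nir < dark_thresh)

visible = np.array([[0.10, 0.12],
                    [0.80, 0.20]])
nir     = np.array([[0.10, 0.90],   # (0, 1): dark object, bright in NIR
                    [0.85, 0.15]])
mask = shadow_candidates(visible, nir)
# (0, 0) is dark in both bands -> shadow candidate;
# (0, 1) is dark only in the visible -> rejected as a dark object.
```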

  15. A new accurate pill recognition system using imprint information

    NASA Astrophysics Data System (ADS)

    Chen, Zhiyuan; Kamata, Sei-ichiro

    2013-12-01

    Great achievements in modern medicine benefit human beings, but they have also brought an explosive growth in the number of pharmaceuticals currently on the market. In daily life, pharmaceuticals can confuse people when they are found unlabeled. In this paper, we propose an automatic pill recognition technique to solve this problem. It functions mainly on the imprint feature of the pills, which is extracted by the proposed MSWT (modified stroke width transform) and described by WSC (weighted shape context). Experiments show that our proposed pill recognition method reaches an accuracy of up to 92.03% within the top 5 ranks when classifying more than 10 thousand query pill images into around 2000 categories.

  16. 75 FR 5169 - Insurance Cost Information Regulation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-01

    ... Vehicle Information and Cost Savings Act, 15 U.S.C. 1941(e), on March 5, 1993, 58 FR 12545, the National... that all car dealers must make available to prospective purchasers, pursuant to 49 CFR 582.4. This... compares differences in insurance costs of different makes and models of passenger cars based...

  17. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 2 2011-01-01 2011-01-01 false Reasonable steps to assure information is... Reasonable Steps Commission Will Take To Assure Information It Discloses Is Accurate, and That Disclosure Is... Administers § 1101.32 Reasonable steps to assure information is accurate. (a) The Commission considers...

  18. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Reasonable steps to assure information is... Reasonable Steps Commission Will Take To Assure Information It Discloses Is Accurate, and That Disclosure Is... Administers § 1101.32 Reasonable steps to assure information is accurate. (a) The Commission considers...

  19. Unifying cost and information in information-theoretic competitive learning.

    PubMed

    Kamimura, Ryotaro

    2005-01-01

    In this paper, we introduce costs into the framework of information maximization and try to maximize the ratio of information to its associated cost. We have shown that competitive learning is realized by maximizing mutual information between input patterns and competitive units. One shortcoming of the method is that maximizing information does not necessarily produce representations faithful to the input patterns: information maximization primarily focuses on the parts of input patterns that are used to distinguish between patterns. Therefore, we introduce a cost, which represents the average distance between input patterns and connection weights. By minimizing the cost, the final connection weights reflect the input patterns well. We applied the method to a political data analysis, a voting attitude problem and a Wisconsin cancer problem. Experimental results confirmed that, when the cost was introduced, representations faithful to input patterns were obtained. In addition, improved generalization performance was obtained within a relatively short learning time.
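    The objective described above, maximizing information relative to its cost where the cost is the average distance between inputs and connection weights, can be restated schematically. The notation is illustrative, not Kamimura's exact formulation:

```latex
% I = mutual information between input patterns x^s and competitive
% units j; C = average input-to-weight distance. Maximizing the ratio
% rewards representations that are informative yet faithful to inputs.
\max_{\mathbf{w}} \; \frac{I(\text{inputs};\,\text{units})}{C},
\qquad
C = \frac{1}{S}\sum_{s=1}^{S}\sum_{j} p(j \mid \mathbf{x}^{s})
    \,\lVert \mathbf{x}^{s} - \mathbf{w}_{j} \rVert^{2}
```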

  20. Transaction costs, externalities and information technology in health care.

    PubMed

    Ferguson, B; Keen, J

    1996-01-01

    This paper discusses some of the economic issues which underpin the rationale for investment in information and communications technologies (ICTs). Information imperfections lead to significant transaction costs (search, negotiating and monitoring), which in turn confer a negative externality on parties involved in exchange. This divergence between private and social costs leads to a degree of resource misallocation (efficiency loss) which, uncorrected, results in a sub-optimal outcome. Traditional solutions to this problem are to rely upon direct government action to reduce the costs of transacting between market agents, or to employ tax/subsidy measures and other legislative action to achieve the desired market outcome. Three key policy questions are raised in the context of the NHS purchaser/provider relationship. Firstly, what is the optimum level of transaction costs; secondly, can ICTs assist in lowering the level of transaction costs to the optimum level; thirdly, who should bear the investment cost in reducing the level of transaction costs? The issue of property rights in different information systems is discussed and raises interesting policy questions about how much investment should be undertaken centrally rather than devolved to a more local level. In some ways this economic framework offers a post hoc justification of why different ICT systems have been introduced at various levels of the NHS. Essentially this reduces to the problem of externalities: providing good information confers a positive externality; not providing relevant, timely and accurate information confers a negative externality, by increasing further the level of transaction costs. The crucial role which ICT systems can play lies in attempting to reduce the level of transaction costs and driving the market towards what Dahlman has described as the transaction-cost-constrained equilibrium.

  1. Chromatography paper as a low-cost medium for accurate spectrophotometric assessment of blood hemoglobin concentration.

    PubMed

    Bond, Meaghan; Elguea, Carlos; Yan, Jasper S; Pawlowski, Michal; Williams, Jessica; Wahed, Amer; Oden, Maria; Tkaczyk, Tomasz S; Richards-Kortum, Rebecca

    2013-06-21

    Anemia affects a quarter of the world's population, and a lack of appropriate diagnostic tools often prevents treatment in low-resource settings. Though the HemoCue 201+ is an appropriate device for diagnosing anemia in low-resource settings, the high cost of disposables ($0.99 per test in Malawi) limits its availability. We investigated using spectrophotometric measurement of blood spotted on chromatography paper as a low-cost (<$0.01 per test) alternative to HemoCue cuvettes. For this evaluation, donor blood was diluted with plasma to simulate anemia, spotted on paper with a micropipette, and measured with a bench-top spectrophotometer to validate the approach before the development of a low-cost reader. We optimized the red-blood-cell lysing chemicals impregnated into the paper, the paper type, drying time, and wavelengths measured, assessed sensitivity to variations in blood volume, and validated our approach using patient samples. Lysing the blood cells with sodium deoxycholate dried in Whatman Chr4 chromatography paper gave repeatable results, and the absorbance difference between 528 nm and 656 nm was stable over time in measurements taken up to 10 min after sample preparation. The method was insensitive to the amount of blood spotted on the paper over the range of 5 μL to 25 μL. We created a low-cost, handheld reader to measure the transmission of paper cuvettes at these optimal wavelengths. Training and validating our method with patient samples on both the spectrometer and the handheld reader showed that both devices are accurate to within 2 g dL(-1) of the HemoCue device for 98% and 95% of samples, respectively.
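    A reader like the one described would map the two-wavelength absorbance difference to a hemoglobin concentration via a calibration curve. The sketch below fits a straight line through invented calibration pairs; the paper's actual calibration against HemoCue reference values is not reproduced here.

```python
# Illustrative linear calibration mapping the absorbance difference
# dA = A(528 nm) - A(656 nm) to hemoglobin concentration (g/dL).
# The calibration pairs below are invented for illustration only.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

delta_A = [0.10, 0.20, 0.30, 0.40]   # hypothetical dA readings
hb_g_dl = [4.0, 8.0, 12.0, 16.0]     # hypothetical reference Hb values
m, b = fit_line(delta_A, hb_g_dl)

def predict_hb(dA):
    """Estimate hemoglobin concentration from a paper-cuvette reading."""
    return m * dA + b
```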

  2. Measuring nonlinear oscillations using a very accurate and low-cost linear optical position transducer

    NASA Astrophysics Data System (ADS)

    Donoso, Guillermo; Ladera, Celso L.

    2016-09-01

    An accurate linear optical displacement transducer of about 0.2 mm resolution over a range of ∼40 mm is presented. This device consists of a stack of thin cellulose acetate strips, each strip longitudinally slid ∼0.5 mm over the preceding one so that one end of the stack forms a stepped wedge of constant step. A narrowed light beam from a white LED, orthogonally incident, crosses the wedge at a known point, the transmitted intensity being detected with a phototransistor whose emitter is connected to a diode. We present the interesting analytical proof that the voltage across the diode is linearly dependent upon the ordinate of the point where the light beam falls on the wedge, as well as the experimental validation of that proof. Applications to nonlinear oscillations are then presented, including the interesting case of a body moving under dry friction, and the more advanced case of an oscillator in a quartic energy potential, whose time-varying positions were accurately measured with our transducer. Our sensing device can resolve the dynamics of an object attached to it with great accuracy and precision at a cost considerably less than that of a linear neutral density wedge. The technique used to assemble the wedge of acetate strips is described.
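    A plausible reconstruction of the linearity argument (not necessarily the paper's exact derivation): a beam crossing n strips of per-strip transmittance T produces a photocurrent proportional to T^n, and the diode's logarithmic current-voltage characteristic converts that exponential decay into a signal linear in n, hence in position along the wedge.

```latex
% I_0, T, eta, V_T, I_s: source intensity, per-strip transmittance,
% diode ideality factor, thermal voltage, saturation current.
I_{\mathrm{ph}}(n) = I_{0}\,T^{\,n},
\qquad
V_{d} = \eta V_{T}\ln\frac{I_{\mathrm{ph}}}{I_{s}}
      = \underbrace{\eta V_{T}\ln\frac{I_{0}}{I_{s}}}_{\text{const}}
      + \left(\eta V_{T}\ln T\right) n
```

    Since n grows by one per constant 0.5 mm step, V_d is a linear function of the beam's position on the wedge.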

  3. A 1% treadmill grade most accurately reflects the energetic cost of outdoor running.

    PubMed

    Jones, A M; Doust, J H

    1996-08-01

    When running indoors on a treadmill, the lack of air resistance results in a lower energy cost compared with running outdoors at the same velocity. A slight incline of the treadmill gradient can be used to increase the energy cost in compensation. The aim of this study was to determine the treadmill gradient that most accurately reflects the energy cost of outdoor running. Nine trained male runners, thoroughly habituated to treadmill running, ran for 6 min at six different velocities (2.92, 3.33, 3.75, 4.17, 4.58 and 5.0 m s-1) with 6 min recovery between runs. This routine was repeated six times, five times on a treadmill set at different grades (0%, 0%, 1%, 2%, 3%) and once outdoors along a level road. Duplicate collections of expired air were taken during the final 2 min of each run to determine oxygen consumption. The repeatability of the methodology was confirmed by high correlations (r = 0.99) and non-significant differences between the duplicate expired air collections and between the repeated runs at 0% grade. The relationship between oxygen uptake (VO2) and velocity for each grade was highly linear (r > 0.99). At the two lowest velocities, VO2 during road running was not significantly different from treadmill running at 0% or 1% grade, but was significantly less than 2% and 3% grade. For 3.75 m s-1, the VO2 during road running was significantly different from treadmill running at 0%, 2% and 3% grades but not from 1% grade. For 4.17 and 4.58 m s-1, the VO2 during road running was not significantly different from that at 1% or 2% grade but was significantly greater than 0% grade and significantly less than 3% grade. At 5.0 m s-1, the VO2 for road running fell between the VO2 value for 1% and 2% grade treadmill running but was not significantly different from any of the treadmill grade conditions. This study demonstrates equality of the energetic cost of treadmill and outdoor running with the use of a 1% treadmill grade over a duration of approximately 5 min.

  4. A Low-Cost Modular Platform for Heterogeneous Data Acquisition with Accurate Interchannel Synchronization

    PubMed Central

    Blanco-Claraco, José Luis; López-Martínez, Javier; Torres-Moreno, José Luis; Giménez-Fernández, Antonio

    2015-01-01

    Most experimental fields of science and engineering require the use of data acquisition systems (DAQ), devices in charge of sampling and converting electrical signals into digital data and, typically, performing all of the required signal preconditioning. Since commercial DAQ systems are normally focused on specific types of sensors and actuators, systems engineers may need to employ mutually-incompatible hardware from different manufacturers in applications demanding heterogeneous inputs and outputs, such as small-signal analog inputs, differential quadrature rotatory encoders or variable current outputs. A common undesirable side effect of heterogeneous DAQ hardware is the lack of an accurate synchronization between samples captured by each device. To solve such a problem with low-cost hardware, we present a novel modular DAQ architecture comprising a base board and a set of interchangeable modules. Our main design goal is the ability to sample all sources at predictable, fixed sampling frequencies, with a reduced synchronization mismatch (<1 μs) between heterogeneous signal sources. We present experiments in the field of mechanical engineering, illustrating vibration spectrum analyses from piezoelectric accelerometers and, as a novelty in these kinds of experiments, the spectrum of quadrature encoder signals. Part of the design and software will be publicly released online. PMID:26516865

  5. A Low-Cost Modular Platform for Heterogeneous Data Acquisition with Accurate Interchannel Synchronization.

    PubMed

    Blanco-Claraco, José Luis; López-Martínez, Javier; Torres-Moreno, José Luis; Giménez-Fernández, Antonio

    2015-01-01

    Most experimental fields of science and engineering require the use of data acquisition systems (DAQ), devices in charge of sampling and converting electrical signals into digital data and, typically, performing all of the required signal preconditioning. Since commercial DAQ systems are normally focused on specific types of sensors and actuators, systems engineers may need to employ mutually-incompatible hardware from different manufacturers in applications demanding heterogeneous inputs and outputs, such as small-signal analog inputs, differential quadrature rotatory encoders or variable current outputs. A common undesirable side effect of heterogeneous DAQ hardware is the lack of an accurate synchronization between samples captured by each device. To solve such a problem with low-cost hardware, we present a novel modular DAQ architecture comprising a base board and a set of interchangeable modules. Our main design goal is the ability to sample all sources at predictable, fixed sampling frequencies, with a reduced synchronization mismatch (<1 µs) between heterogeneous signal sources. We present experiments in the field of mechanical engineering, illustrating vibration spectrum analyses from piezoelectric accelerometers and, as a novelty in these kinds of experiments, the spectrum of quadrature encoder signals. Part of the design and software will be publicly released online. PMID:26516865

  6. A Low Cost Course Information Syndication System

    ERIC Educational Resources Information Center

    Ajayi, A. O.; Olajubu, E. A.; Bello, S. A.; Soriyan, H. A.; Obamuyide, A. V.

    2011-01-01

    This study presents a cost effective, reliable, and convenient mobile web-based system to facilitate the dissemination of course information to students, to support interaction that goes beyond the classroom. The system employed the Really Simple Syndication (RSS) technology and was developed using Rapid Application Development (RAD) methodology.…

  7. Cost of Information Handling in Hospitals

    PubMed Central

    Jydstrup, Ronald A.; Gross, Malvern J.

    1966-01-01

    Cost of information handling (noncomputerized) in hospitals was studied in detail from an industrial engineering point of view at Rochester General, Highland, and Geneva General hospitals. Activities were observed, personnel questioned, and time studies carried out. It was found that information handling comprises about one fourth of the hospitals' operating cost—a finding strongly recommending revision and streamlining of both forms and inefficient operations. In an Appendix to this study are presented 15 items that would improve information handling in one area of the hospital, nursing units, where this activity is greater than in any other in a hospital. PMID:5971636

  8. Modelling the Constraints of Spatial Environment in Fauna Movement Simulations: Comparison of a Boundaries Accurate Function and a Cost Function

    NASA Astrophysics Data System (ADS)

    Jolivet, L.; Cohen, M.; Ruas, A.

    2015-08-01

    Landscape influences fauna movement at different levels, from habitat selection to the choice of movement direction. Our goal is to provide a development frame in order to test simulation functions for animal movement. We describe our approach for such simulations and we compare two types of functions to calculate trajectories. To do so, we first modelled the role of landscape elements to differentiate between elements that facilitate movements and those that act as hindrances. Different influences are identified depending on landscape elements and on animal species. Knowledge was gathered from ecologists, the literature and observation datasets. Second, we analysed descriptions of animal movement recorded with GPS at fine scale, corresponding to high temporal frequency and good location accuracy. Analysing this type of data provides information on the relation between landscape features and movements. We implemented an agent-based simulation approach to calculate potential trajectories constrained by the spatial environment and the individual's behaviour. We tested two functions that consider space differently: one takes into account the geometry and the types of landscape elements, and one cost function sums up the spatial surroundings of an individual. Results highlight the fact that the cost function exaggerates the distances travelled by an individual and simplifies movement patterns. The geometry-accurate function represents a good bottom-up approach for discovering interesting areas or obstacles for movements.

  9. Waste Management Facilities Cost Information Report

    SciTech Connect

    Feizollahi, F.; Shropshire, D.

    1992-10-01

    The Waste Management Facility Cost Information (WMFCI) Report, commissioned by the US Department of Energy (DOE), develops planning life-cycle cost (PLCC) estimates for treatment, storage, and disposal facilities. This report contains PLCC estimates versus capacity for 26 different facility cost modules. A procedure to guide DOE and its contractor personnel in the use of estimating data is also provided. Estimates in the report apply to five distinctive waste streams: low-level waste, low-level mixed waste, alpha contaminated low-level waste, alpha contaminated low-level mixed waste, and transuranic waste. The report addresses five different treatment types: incineration, metal/melting and recovery, shredder/compaction, solidification, and vitrification. Data in this report allows the user to develop PLCC estimates for various waste management options.

  10. 77 FR 67366 - Federal Acquisition Regulation; Information Collection; Travel Costs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-09

    ... Regulation; Information Collection; Travel Costs AGENCY: Department of Defense (DOD), General Services... collection requirement concerning Travel Costs. Public comments are particularly invited on: Whether this... Information Collection 9000- 0088, Travel Costs by any of the following methods: Regulations.gov :...

  11. Towards Contactless, Low-Cost and Accurate 3D Fingerprint Identification.

    PubMed

    Kumar, Ajay; Kwong, Cyril

    2015-03-01

    Human identification using fingerprint impressions has been widely studied and employed for more than 2000 years. Despite new advancements in 3D imaging technologies, a widely accepted representation of 3D fingerprint features and a matching methodology are yet to emerge. This paper investigates a 3D representation of the widely employed 2D minutiae features by recovering and incorporating (i) minutiae height z and (ii) 3D orientation φ information, and illustrates an effective strategy for matching popular minutiae features extended into 3D space. One of the obstacles preventing emerging 3D fingerprint identification systems from replacing conventional 2D systems lies in their bulk and high cost, mainly attributable to the use of structured lighting systems or multiple cameras. This paper attempts to address such key limitations of current 3D fingerprint technologies by developing a single-camera 3D fingerprint identification system. We develop a generalized 3D minutiae matching model and recover extended 3D fingerprint features from the reconstructed 3D fingerprints. The 2D fingerprint images acquired for the 3D reconstruction can themselves be employed to improve performance, as illustrated in the work detailed in this paper. This paper also attempts to answer one of the most fundamental questions on the availability of inherent discriminable information in 3D fingerprints. The experimental results are presented on a database of 240 clients' 3D fingerprints, which is made publicly available to further research efforts in this area, and illustrate the discriminant power of 3D minutiae representation and matching to achieve performance improvement.

  12. The minimal work cost of information processing.

    PubMed

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-07

    Irreversible information processing cannot be carried out without some inevitable thermodynamic work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes further performance gains. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditioned on the output of the computation. Our formula takes precise account of the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
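In its simplest average form (a hedged paraphrase of the abstract, glossing over the single-shot, smooth-entropy formulation the paper actually uses), the bound described above can be written as:

```latex
% W: work cost; k_B: Boltzmann constant; T: temperature;
% X: discarded information; Y: output of the computation.
W \;\ge\; k_B T \ln 2 \; H(X \mid Y)
```

Landauer's original bound of $k_B T \ln 2$ per erased bit is recovered when one bit is discarded with no information retained in the output, i.e. $H(X \mid Y) = 1$.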

  13. The minimal work cost of information processing

    PubMed Central

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-01-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamic work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes further performance gains. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditioned on the output of the computation. Our formula takes precise account of the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics. PMID:26151678

  14. 48 CFR 1615.406-2 - Certificates of accurate cost or pricing data for community rated carriers.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES HEALTH BENEFITS ACQUISITION REGULATION... knowledge and belief: (1) The cost or pricing data submitted (or, if not submitted, maintained and... with the requirements of 48 CFR Chapter 16 and the FEHB Program contract and are accurate,...

  15. Fast Conceptual Cost Estimating of Aerospace Projects Using Historical Information

    NASA Technical Reports Server (NTRS)

    Butts, Glenn

    2007-01-01

    Accurate estimates can be created in less than a minute by applying powerful techniques and algorithms to create an Excel-based parametric cost model. In five easy steps you will learn how to normalize your company's historical cost data to the new project parameters. This paper provides a complete, easy-to-understand, step-by-step how-to guide; such a guide does not currently seem to exist. Over 2,000 hours of research, data collection, and trial and error, and thousands of lines of Excel Visual Basic for Applications (VBA) code, were invested in developing these methods. While VBA is not required to use this information, it increases the power and aesthetics of the model. Implementing all of the steps described, while not required, will increase the accuracy of the results.
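The paper's model is Excel/VBA-based; as a language-neutral sketch of the underlying idea, a power-law cost-estimating relationship (CER) can be fitted to normalized historical data by linear regression in log-log space. All data points and the 300 kg target below are hypothetical:

```python
import math

def fit_cer(weights, costs):
    """Fit a power-law cost-estimating relationship cost = a * weight**b
    by ordinary least squares in log-log space."""
    xs = [math.log(w) for w in weights]
    ys = [math.log(c) for c in costs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical normalized historical data (dry mass in kg, cost in $M):
a, b = fit_cer([100, 200, 400, 800], [10, 17, 29, 50])
estimate = a * 300 ** b   # conceptual estimate for a hypothetical 300 kg project
```

An exponent b below 1, as here, encodes the economies of scale that parametric models typically capture from historical data.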

  16. Verify by Genability - Providing Solar Customers with Accurate Reports of Utility Bill Cost Savings

    SciTech Connect

    2015-12-01

    The National Renewable Energy Laboratory (NREL), partnering with Genability and supported by the U.S. Department of Energy's SunShot Incubator program, independently verified the accuracy of Genability's monthly cost savings.

  17. 75 FR 57284 - Agency Information Collection Activities: Cost Submission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-20

    ... SECURITY U.S. Customs and Border Protection Agency Information Collection Activities: Cost Submission... other Federal agencies to comment on an information collection requirement concerning: Cost Submission... forms of information technology; and (e) the annual costs burden to respondents or record keepers...

  18. Species Distribution 2.0: An Accurate Time- and Cost-Effective Method of Prospection Using Street View Imagery

    PubMed Central

    Schwoertzig, Eugénie; Millon, Alexandre

    2016-01-01

    Species occurrence data provide crucial information for biodiversity studies in the current context of global environmental changes. Such studies often rely on a limited number of occurrence data collected in the field and on pseudo-absences arbitrarily chosen within the study area, which reduces the value of these studies. To overcome this issue, we propose an alternative method of prospection using geo-located street view imagery (SVI). Following a standardised protocol of virtual prospection using both vertical (aerial photographs) and horizontal (SVI) perceptions, we have surveyed 1097 randomly selected cells across Spain (0.1x0.1 degree, i.e. 20% of Spain) for the presence of Arundo donax L. (Poaceae). In total we detected A. donax in 345 cells, thus substantially expanding beyond the now two-centuries-old field-derived record, which reported A. donax in only 216 cells. Among the field occurrence cells, 81.1% were confirmed by SVI prospection to be consistent with species presence. In addition, we recorded, by SVI prospection, 752 absences, i.e. cells where A. donax was considered absent. We also compared the outcomes of climatic niche modeling based on SVI data against those based on field data. Using generalized linear models fitted with bioclimatic predictors, we found SVI data to provide far more compelling results in terms of niche modeling than do field data as classically used in species distribution models (SDM). This original, cost- and time-effective method provides the means to accurately locate highly visible taxa, reinforce absence data, and predict species distributions without long and expensive in situ prospection. At this time, the majority of available SVI data is restricted to human-disturbed environments that have road networks. However, SVI is becoming increasingly available in natural areas, which means the technique has considerable potential to become an important factor in future biodiversity studies. PMID:26751565

  19. Cost-effective accurate coarse-grid method for highly convective multidimensional unsteady flows

    NASA Technical Reports Server (NTRS)

    Leonard, B. P.; Niknafs, H. S.

    1991-01-01

    A fundamentally multidimensional convection scheme is described based on vector transient interpolation modeling rewritten in conservative control-volume form. Vector third-order upwinding is used as the basis of the algorithm; this automatically introduces important cross-difference terms that are absent from schemes using component-wise one-dimensional formulas. Third-order phase accuracy is good; this is important for coarse-grid large-eddy or full simulation. Potential overshoots or undershoots are avoided by using a recently developed universal limiter. Higher order accuracy is obtained locally, where needed, by the cost-effective strategy of adaptive stencil expansion in a direction normal to each control-volume face; this is controlled by monitoring the absolute normal gradient and curvature across the face. Higher (than third) order cross-terms do not appear to be needed. Since the wider stencil is used only in isolated narrow regions (near discontinuities), extremely high (in this case, seventh) order accuracy can be achieved for little more than the cost of a globally third-order scheme.
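As a much-reduced illustration (a 1D periodic analogue with constant positive velocity, not the multidimensional scheme itself), a conservative update with a third-order upwind-biased face interpolation looks like:

```python
# 1D linear advection, conservative flux form, third-order upwind-biased
# face values -- the building block that the scheme above generalizes with
# cross-terms, a universal limiter, and adaptive stencil expansion.
# Assumptions: periodic domain, Courant number c = u*dt/dx with u > 0.

def step_third_order_upwind(phi, c):
    """Advance the cell-average field phi by one explicit time step."""
    n = len(phi)

    def face(i):
        # Upwind-biased interpolation of phi at face i+1/2 (u > 0).
        return (2 * phi[(i + 1) % n] + 5 * phi[i] - phi[(i - 1) % n]) / 6.0

    # Conservative update: difference of fluxes through the two cell faces.
    return [phi[i] - c * (face(i) - face(i - 1)) for i in range(n)]

phi = [0.0] * 8
phi[2] = 1.0                                   # an isolated pulse
phi_new = step_third_order_upwind(phi, c=0.5)  # one step at Courant 0.5
```

Because the update is written as a flux difference, the total of phi is conserved exactly, which is the property the control-volume formulation above is designed to guarantee.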

  20. Evaluation of a low-cost and accurate ocean temperature logger on subsurface mooring systems

    SciTech Connect

    Tian, Chuan; Deng, Zhiqun; Lu, Jun; Xu, Xiaoyang; Zhao, Wei; Xu, Ming

    2014-06-23

    Monitoring seawater temperature is important to understanding evolving ocean processes. To monitor internal waves or ocean mixing, a large number of temperature loggers are typically mounted on subsurface mooring systems to obtain high-resolution temperature data at different water depths. In this study, we redesigned and evaluated a compact, low-cost, self-contained, high-resolution and high-accuracy ocean temperature logger, TC-1121. The newly designed TC-1121 loggers are smaller, more robust, and their sampling intervals can be automatically changed by indicated events. They have been widely used in many mooring systems to study internal wave and ocean mixing. The logger’s fundamental design, noise analysis, calibration, drift test, and a long-term sea trial are discussed in this paper.

  1. Accurately measuring volume of soil samples using low cost Kinect 3D scanner

    NASA Astrophysics Data System (ADS)

    van der Sterre, Boy-Santhos; Hut, Rolf; van de Giesen, Nick

    2013-04-01

    The 3D scanner of the Kinect game controller can be used to increase the accuracy and efficiency of determining in situ soil moisture content. Soil moisture is one of the principal hydrological variables in both the water and energy interactions between soil and atmosphere. Current in situ measurements of soil moisture either rely on indirect measurements (of electromagnetic constants or heat capacity) or on physically taking a sample and weighing it in a lab. The bottleneck in accurately retrieving soil moisture using samples is determining the volume of the sample. Currently this is mostly done by the very time-consuming "sand cone method", in which the volume where the sample used to sit is filled with sand. We show that the 3D scanner that is part of the $150 game controller extension "Kinect" can be used to make 3D scans before and after taking the sample. The accuracy of this method is tested by scanning forms of known volume. This method is less time consuming and less error-prone than using a sand cone.
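The before/after scanning idea reduces to a simple computation once both scans are gridded: the sample volume is the per-cell increase in depth integrated over the cell area. The sketch below is a hypothetical minimal version (real Kinect point clouds would first need registration and gridding):

```python
# Hedged sketch: estimate excavated volume from two gridded depth scans
# ("before" and "after" taking the soil sample). Depth is measured from
# the sensor, so the hole appears as an INCREASE in depth.

def excavated_volume(depth_before, depth_after, cell_area):
    """Sum per-cell depth increases times cell area -> hole volume (m^3)."""
    volume = 0.0
    for row_b, row_a in zip(depth_before, depth_after):
        for d_b, d_a in zip(row_b, row_a):
            if d_a > d_b:            # surface moved away from the sensor
                volume += (d_a - d_b) * cell_area
    return volume

before = [[1.00, 1.00], [1.00, 1.00]]   # flat surface 1 m from the sensor
after  = [[1.10, 1.05], [1.00, 1.00]]   # a small hole spanning two cells
vol = excavated_volume(before, after, cell_area=0.01)  # 0.01 m^2 per cell
```

Here the two deepened cells contribute 0.10 m and 0.05 m over 0.01 m^2 each, i.e. 1.5 litres in total.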

  2. Accurately measuring volume of soil samples using low cost Kinect 3D scanner

    NASA Astrophysics Data System (ADS)

    van der Sterre, B.; Hut, R.; Van De Giesen, N.

    2012-12-01

    The 3D scanner of the Kinect game controller can be used to increase the accuracy and efficiency of determining in situ soil moisture content. Soil moisture is one of the principal hydrological variables in both the water and energy interactions between soil and atmosphere. Current in situ measurements of soil moisture either rely on indirect measurements (of electromagnetic constants or heat capacity) or on physically taking a sample and weighing it in a lab. The bottleneck in accurately retrieving soil moisture using samples is determining the volume of the sample. Currently this is mostly done by the very time-consuming "sand cone method", in which the volume where the sample used to sit is filled with sand. We show that the 3D scanner that is part of the $150 game controller extension "Kinect" can be used to make 3D scans before and after taking the sample. The accuracy of this method is tested by scanning forms of known volume. This method is less time consuming and less error-prone than using a sand cone.

  3. A Cost-Benefit and Accurate Method for Assessing Microalbuminuria: Single versus Frequent Urine Analysis

    PubMed Central

    Hemmati, Roholla; Gharipour, Mojgan; Khosravi, Alireza; Jozan, Mahnaz

    2013-01-01

    Background. The purpose of this study was to answer the question of whether a single test for microalbuminuria yields a reliable conclusion, leading to cost savings. Methods. This cross-sectional study included a total of 126 consecutive persons. Microalbuminuria was assessed by collection of two fasting random urine specimens, one on arrival at the clinic and one a week later in the morning. Results. Overall, 17 of the 126 participants had microalbuminuria, of whom 12 were also identified when the factor was assessed once, giving a sensitivity of 70.6%, a specificity of 100%, a PPV of 100%, an NPV of 95.6%, and an accuracy of 96.0%. The measured sensitivity, specificity, PPV, NPV, and accuracy in hypertensive patients were 73.3%, 100%, 100%, 94.8%, and 95.5%, respectively; in the non-hypertensive group these rates were 50.0%, 100%, 100%, 97.3%, and 97.4%, respectively. According to the ROC curve analysis, a single measurement of UACR had high value for discriminating impaired from normal renal function (c = 0.989). Urinary albumin concentration in a single measurement also had high discriminative value for diagnosing kidney damage (c = 0.995). Conclusion. Single testing of both UACR and urine albumin level, rather than frequent testing, yields high diagnostic sensitivity, specificity, and accuracy, as well as high predictive values, in the total population and in hypertensive subgroups. PMID:24455207
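The headline figures can be reproduced from a 2x2 confusion table: 17 true cases of which the single test caught 12, with no false positives among the remaining 109 participants:

```python
# Standard diagnostic-test metrics from a 2x2 confusion table.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Counts implied by the abstract: 12 detected of 17 true cases,
# no false positives among the other 109 of 126 participants.
m = diagnostic_metrics(tp=12, fp=0, fn=5, tn=109)
```

These counts give sensitivity 70.6%, specificity 100%, PPV 100%, NPV 95.6%, and accuracy 96.0%, matching the reported results.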

  4. How accurately can we estimate energetic costs in a marine top predator, the king penguin?

    PubMed

    Halsey, Lewis G; Fahlman, Andreas; Handrich, Yves; Schmidt, Alexander; Woakes, Anthony J; Butler, Patrick J

    2007-01-01

    King penguins (Aptenodytes patagonicus) are one of the greatest consumers of marine resources. However, while their influence on the marine ecosystem is likely to be significant, only an accurate knowledge of their energy demands will indicate their true food requirements. Energy consumption has been estimated for many marine species using the heart rate versus rate of oxygen consumption (fH-VO2) technique, and the technique has been applied successfully to answer eco-physiological questions. However, previous studies on the energetics of king penguins, based on developing or applying this technique, have raised a number of issues about the degree of validity of the technique for this species. These include the predictive validity of the present fH-VO2 equations across different seasons and individuals and during different modes of locomotion. In many cases, these issues also apply to other species for which the fH-VO2 technique has been applied. In the present study, the accuracy of three prediction equations for king penguins was investigated based on validity studies and on estimates of VO2 from published field fH data. The major conclusions from the present study are: (1) in contrast to that for walking, the fH-VO2 relationship for swimming king penguins is not affected by body mass; (2) prediction equation (1), log(VO2) = -0.279 + 1.24 log(fH) + 0.0237t - 0.0157 log(fH)t, derived in a previous study, is the most suitable equation presently available for estimating VO2 in king penguins for all locomotory and nutritional states. A number of possible problems associated with producing an fH-VO2 relationship are discussed in the present study. Finally, a statistical method to include easy-to-measure morphometric characteristics, which may improve the accuracy of fH-VO2 prediction equations, is explained. PMID:17363231
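For illustration, prediction equation (1) can be applied directly. Treat this as a sketch: the grouping of terms is reconstructed from the abstract text, the covariate t is defined in the original study rather than here, and the input values below are invented:

```python
import math

def predict_vo2(f_h, t):
    """Estimate VO2 from heart rate f_H via the equation quoted above:
    log10(VO2) = -0.279 + 1.24*log10(f_H) + 0.0237*t - 0.0157*log10(f_H)*t
    (term grouping reconstructed; t is a covariate from the original study)."""
    log_fh = math.log10(f_h)
    log_vo2 = -0.279 + 1.24 * log_fh + 0.0237 * t - 0.0157 * log_fh * t
    return 10 ** log_vo2

vo2 = predict_vo2(f_h=100, t=1.0)  # illustrative inputs only
```

Note the interaction term: the effective exponent on heart rate shrinks slightly as t grows, so VO2 estimates from the same heart rate drift with the covariate.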

  5. Allele-Specific Quantitative PCR for Accurate, Rapid, and Cost-Effective Genotyping.

    PubMed

    Lee, Han B; Schwab, Tanya L; Koleilat, Alaa; Ata, Hirotaka; Daby, Camden L; Cervera, Roberto Lopez; McNulty, Melissa S; Bostwick, Hannah S; Clark, Karl J

    2016-06-01

    genotypes. ASQ is cost-effective because universal fluorescent probes negate the necessity of designing expensive probes for each locus. PMID:26986823

  6. Allele-Specific Quantitative PCR for Accurate, Rapid, and Cost-Effective Genotyping.

    PubMed

    Lee, Han B; Schwab, Tanya L; Koleilat, Alaa; Ata, Hirotaka; Daby, Camden L; Cervera, Roberto Lopez; McNulty, Melissa S; Bostwick, Hannah S; Clark, Karl J

    2016-06-01

    genotypes. ASQ is cost-effective because universal fluorescent probes negate the necessity of designing expensive probes for each locus.

  7. Allele-Specific Quantitative PCR for Accurate, Rapid, and Cost-Effective Genotyping

    PubMed Central

    Lee, Han B.; Schwab, Tanya L.; Koleilat, Alaa; Ata, Hirotaka; Daby, Camden L.; Cervera, Roberto Lopez; McNulty, Melissa S.; Bostwick, Hannah S.; Clark, Karl J.

    2016-01-01

    genotypes. ASQ is cost-effective because universal fluorescent probes negate the necessity of designing expensive probes for each locus. PMID:26986823

  8. Managerial Cost Accounting for a Technical Information Center.

    ERIC Educational Resources Information Center

    Helmkamp, John G.

    A two-fold solution to the cost information deficiency problem is proposed. A formal managerial cost accounting system is designed expressly for the two information services of retrospective search and selective dissemination. The system was employed during a trial period to test its effectiveness in a technical information center. Once…

  9. Differential contribution of visual and auditory information to accurately predict the direction and rotational motion of a visual stimulus.

    PubMed

    Park, Seoung Hoon; Kim, Seonjin; Kwon, MinHyuk; Christou, Evangelos A

    2016-03-01

    Visual and auditory information are critical for perception and enhance the ability of an individual to respond accurately to a stimulus. However, it is unknown whether visual and auditory information contribute differentially to identifying the direction and rotational motion of a stimulus. The purpose of this study was to determine the ability of an individual to accurately predict the direction and rotational motion of a stimulus based on visual and auditory information. We recruited 9 expert table-tennis players and used the table-tennis service as our experimental model. Participants watched recorded services with different levels of visual and auditory information. The goal was to anticipate the direction of the service (left or right) and the rotational motion of the service (topspin, sidespin, or cut). We recorded their responses and quantified the following outcomes: (i) directional accuracy and (ii) rotational-motion accuracy. Response accuracy was the number of accurate predictions relative to the total number of trials. The ability of the participants to accurately predict the direction of the service increased with additional visual information but not with auditory information. In contrast, their ability to accurately predict the rotational motion of the service increased with the addition of auditory information to visual information, but not with additional visual information alone. In conclusion, this finding demonstrates that visual information enhances the ability of an individual to accurately predict the direction of a stimulus, whereas additional auditory information enhances the ability to accurately predict its rotational motion.

  10. Activity-based costing via an information system: an application created for a breast imaging center.

    PubMed

    Hawkins, H; Langer, J; Padua, E; Reaves, J

    2001-06-01

    Activity-based costing (ABC) is a process that enables the estimation of the cost of producing a product or service. More accurate than traditional charge-based approaches, it emphasizes analysis of processes, and more specific identification of both direct and indirect costs. This accuracy is essential in today's healthcare environment, in which managed care organizations necessitate responsible and accountable costing. However, to be successfully utilized, it requires time, effort, expertise, and support. Data collection can be tedious and expensive. By integrating ABC with information management (IM) and systems (IS), organizations can take advantage of the process orientation of both, extend and improve ABC, and decrease resource utilization for ABC projects. In our case study, we have examined the process of a multidisciplinary breast center. We have mapped the constituent activities and established cost drivers. This information has been structured and included in our information system database for subsequent analysis.
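The core ABC mechanics described above fit in a few lines: pool resource costs by activity, derive a rate per cost-driver unit, and drive costs to each service by its activity consumption. All activity names and figures below are hypothetical, not from the breast-center case study:

```python
# Hedged illustration of activity-based costing for an imaging service.

activity_cost = {            # annual cost pooled per activity ($)
    "scheduling": 40_000,
    "image_acquisition": 250_000,
    "interpretation": 180_000,
}
annual_driver_volume = {     # total cost-driver units per activity
    "scheduling": 8_000,         # calls handled
    "image_acquisition": 5_000,  # scans performed
    "interpretation": 6_000,     # reads completed
}

# Rate per driver unit, e.g. $/call, $/scan, $/read.
driver_rate = {a: activity_cost[a] / annual_driver_volume[a]
               for a in activity_cost}

def service_cost(consumption):
    """Cost of one service = sum over activities of units used x rate."""
    return sum(units * driver_rate[a] for a, units in consumption.items())

screening_exam = service_cost(
    {"scheduling": 1, "image_acquisition": 1, "interpretation": 1})
```

In contrast to charge-based averaging, the cost of each service here reflects only the activities it actually consumes, which is the accuracy gain the abstract emphasizes.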

  11. Activity-based costing via an information system: an application created for a breast imaging center.

    PubMed

    Hawkins, H; Langer, J; Padua, E; Reaves, J

    2001-06-01

    Activity-based costing (ABC) is a process that enables the estimation of the cost of producing a product or service. More accurate than traditional charge-based approaches, it emphasizes analysis of processes, and more specific identification of both direct and indirect costs. This accuracy is essential in today's healthcare environment, in which managed care organizations necessitate responsible and accountable costing. However, to be successfully utilized, it requires time, effort, expertise, and support. Data collection can be tedious and expensive. By integrating ABC with information management (IM) and systems (IS), organizations can take advantage of the process orientation of both, extend and improve ABC, and decrease resource utilization for ABC projects. In our case study, we have examined the process of a multidisciplinary breast center. We have mapped the constituent activities and established cost drivers. This information has been structured and included in our information system database for subsequent analysis. PMID:11442093

  12. The utility of accurate mass and LC elution time information in the analysis of complex proteomes

    SciTech Connect

    Norbeck, Angela D.; Monroe, Matthew E.; Adkins, Joshua N.; Anderson, Kevin K.; Daly, Don S.; Smith, Richard D.

    2005-08-01

    Theoretical tryptic digests of all predicted proteins from the genomes of three organisms of varying complexity were evaluated for specificity and possible utility of combined peptide accurate mass and predicted LC normalized elution time (NET) information. The uniqueness of each peptide was evaluated using its combined mass (+/- 5 ppm and 1 ppm) and NET value (no constraint, +/- 0.05 and 0.01 on a 0-1 NET scale). The set of peptides both underestimates actual biological complexity, due to the lack of specific modifications, and overestimates the expected complexity, since many proteins will not be present in the sample or observable on the mass spectrometer because of dynamic range limitations. Once a peptide is identified from an LC-MS/MS experiment, its mass and elution time constitute a unique fingerprint for that peptide. The uniqueness of that fingerprint in comparison with those of the other peptides present is indicative of the ability to confidently identify that peptide based on accurate mass and NET measurements. These measurements can be made using HPLC coupled with high-resolution MS in a high-throughput manner. Results show that for organisms with comparatively small proteomes, such as Deinococcus radiodurans, modest mass and elution time accuracies are generally adequate for peptide identification. For more complex proteomes, increasingly accurate measurements are required. However, the majority of proteins should be uniquely identifiable by using LC-MS with mass accuracies within +/- 1 ppm and elution time measurements within +/- 0.01 NET.
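The uniqueness test described above amounts to checking whether any other peptide falls inside both the mass tolerance (in ppm) and the NET tolerance. A minimal sketch, with illustrative mass/NET values rather than real digest data:

```python
# A peptide's (mass, NET) "fingerprint" is unique if no other peptide
# falls within BOTH the mass tolerance (ppm) and the NET tolerance.

def is_unique(peptide, others, ppm_tol=1.0, net_tol=0.01):
    mass, net = peptide
    for m, n in others:
        if (abs(m - mass) <= mass * ppm_tol * 1e-6
                and abs(n - net) <= net_tol):
            return False        # another peptide shares the fingerprint
    return True

# Three hypothetical peptides: near-identical masses, differing NETs.
peptides = [(1500.7300, 0.42), (1500.7305, 0.70), (1500.7310, 0.421)]
target = peptides[0]
others = [p for p in peptides if p is not target]
unique = is_unique(target, others)
```

With the default +/- 1 ppm and +/- 0.01 NET windows the third peptide collides with the target, but tightening the NET tolerance resolves it, mirroring the paper's finding that more complex proteomes need more accurate measurements.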

  13. Accurate, fast and cost-effective diagnostic test for monosomy 1p36 using real-time quantitative PCR.

    PubMed

    Cunha, Pricila da Silva; Pena, Heloisa B; D'Angelo, Carla Sustek; Koiffmann, Celia P; Rosenfeld, Jill A; Shaffer, Lisa G; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5-0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs.
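A deletion call from qPCR data is typically made from relative copy number. The sketch below uses the generic 2^-ddCt comparative method with invented Ct values and threshold; it illustrates the principle rather than the authors' exact analysis:

```python
# Generic qPCR relative copy number via the comparative 2^-ddCt method:
# target gene (e.g. PRKCZ or SKI) vs a reference gene, patient vs control.

def relative_copy_number(ct_target_pt, ct_ref_pt, ct_target_ctl, ct_ref_ctl):
    ddct = (ct_target_pt - ct_ref_pt) - (ct_target_ctl - ct_ref_ctl)
    return 2 ** (-ddct)

# Hypothetical Ct values: patient's target amplifies one cycle later
# relative to the reference than the control's does.
rcn = relative_copy_number(26.0, 24.0, 25.0, 24.0)
deletion_call = rcn < 0.7   # heterozygous deletion expected near 0.5
```

A relative copy number near 0.5 for both PRKCZ and SKI is what a heterozygous 1p36 deletion would produce, consistent with the dual-gene strategy described above.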

  14. Accurate, Fast and Cost-Effective Diagnostic Test for Monosomy 1p36 Using Real-Time Quantitative PCR

    PubMed Central

    Cunha, Pricila da Silva; Pena, Heloisa B.; D'Angelo, Carla Sustek; Koiffmann, Celia P.; Rosenfeld, Jill A.; Shaffer, Lisa G.; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5–0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs. PMID:24839341

  15. Collecting and Reporting Real Costs of Information Systems.

    ERIC Educational Resources Information Center

    Price, Douglas S.

    This document attempts to provide managers and designers of information systems with a usable, practical "building block" system for unit costing. The model is sufficiently flexible to be applicable to a wide variety of cost control requirements; costing elements include project, product, account, organization and function. Design of the cost…

  16. A Cost Benefit Technique for R & D Based Information.

    ERIC Educational Resources Information Center

    Stern, B. T.

    A cost benefit technique consisting of the following five phases is proposed: (a) specific objectives of the service, (b) measurement of work flow, (c) work costing, (d) charge to users of the information service, and (e) equating demand and cost. In this approach, objectives are best stated by someone not routinely concerned with the individual…

  17. 48 CFR 1515.305-72 - Release of cost information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Release of cost information. 1515.305-72 Section 1515.305-72 Federal Acquisition Regulations System ENVIRONMENTAL PROTECTION AGENCY CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection 1515.305-72 Release of cost information. (a) In...

  18. Capturing Accurate and Useful Information on Medication-Related Telenursing Triage Calls.

    PubMed

    Lake, R; Li, L; Baysari, M; Byrne, M; Robinson, M; Westbrook, J I

    2016-01-01

    Registered nurses providing telenursing triage and advice services record information on the medication-related calls they handle. However, the quality and consistency of these data have rarely been examined. Our aim was to examine medication-related calls made to the healthdirect advice service in November 2014, to assess their basic characteristics and how the data entry format influenced the information collected and its consistency. Registered nurses selected the patient question type from a range of categories and entered the medications involved in a free-text field. Medication names were manually extracted from the free-text fields. We also compared the selected patient question type with the free-text description of the call, in order to gauge data consistency. Results showed that nurses provided patients with advice on medication-related queries in a timely manner (median call duration of 9 minutes). From 1835 calls, we were able to identify and classify 2156 medications into 384 generic names. However, in 204 cases (11.2% of calls) no medication name was entered, and a further 308 (15.0%) of the medication names entered were not identifiable. When we compared the selected patient question with the free-text description of calls, we found that these were consistent in 63.27% of cases. Telenursing triage and advice services provide a valuable resource to the public, offering quick and easily accessible advice. To support nurses in providing quality services and recording accurate information about queries, appropriate data entry formats and design would be beneficial. PMID:27440292

  19. Accurate Intermolecular Interactions at Dramatically Reduced Cost and a Many-Body Energy Decomposition Scheme for XPol+SAPT

    NASA Astrophysics Data System (ADS)

    Lao, Ka Un; Herbert, John M.

    2013-06-01

    An efficient, monomer-based electronic structure method is introduced for computing non-covalent interactions in molecular and ionic clusters. It builds upon our "explicit polarization" (XPol) method with pairwise-additive symmetry-adapted perturbation theory (SAPT), using the Kohn-Sham (KS) version of SAPT, but replaces the problematic and expensive sum-over-states dispersion terms with empirical potentials. This modification reduces the scaling from O(N^5) to O(N^3) and also facilitates the use of Kohn-Sham density functional theory (KS-DFT) as a low-cost means to capture intramolecular electron correlation. Accurate binding energies are obtained for benchmark databases of dimer binding energies, and potential energy curves are also captured accurately for a variety of challenging systems. As compared to traditional DFT-SAPT or SAPT(DFT) methods, our approach removes the limitation to dimers and extends SAPT-based methodology to many-body systems. For many-body systems such as water clusters and halide-water cluster anions, the new method is superior to established density-functional methods for non-covalent interactions. We suggest that using different asymptotic corrections for different monomers, as in DFT-SAPT or SAPT(DFT), is necessary to obtain good binding energies in general, especially for hydrogen-bonded complexes. We also introduce a decomposition scheme for the interaction energy that extends traditional SAPT energy decomposition analysis to systems containing more than two monomers, and we find that the various energy components (electrostatics, exchange, induction, and dispersion) are in very good agreement with high-level SAPT benchmarks for dimers. For (H2O)6, the many-body contribution to the interaction energy agrees well with that obtained from traditional Kitaura-Morokuma energy decomposition analysis.
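
    The record does not specify the form of the empirical dispersion potentials, but damped -C6/r^6 terms such as the Tang-Toennies model are a standard choice for this kind of replacement. The sketch below (illustrative parameters, not the authors' fit) shows the shape of such a term: the damping function suppresses the divergence at short range and approaches 1 asymptotically.

    ```python
    import math

    def tang_toennies_f6(b, r):
        """Tang-Toennies damping factor for the C6 dispersion term:
        f6(r) = 1 - exp(-b r) * sum_{k=0}^{6} (b r)^k / k!"""
        s = sum((b * r) ** k / math.factorial(k) for k in range(7))
        return 1.0 - math.exp(-b * r) * s

    def damped_dispersion(c6, b, r):
        """-f6(r) * C6 / r^6: finite at short range, -C6/r^6 at large r."""
        return -tang_toennies_f6(b, r) * c6 / r ** 6

    # Damping vanishes at small separation and saturates to 1 at large r.
    print(tang_toennies_f6(1.0, 0.1))   # ~0 (heavily damped)
    print(tang_toennies_f6(1.0, 50.0))  # ~1 (undamped asymptote)
    ```

    At large r the energy reduces to the familiar -C6/r^6 attraction, which is what makes such potentials a cheap stand-in for sum-over-states dispersion.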

  20. Conditional mutual inclusive information enables accurate quantification of associations in gene regulatory networks.

    PubMed

    Zhang, Xiujun; Zhao, Juan; Hao, Jin-Kao; Zhao, Xing-Ming; Chen, Luonan

    2015-03-11

    Mutual information (MI), a quantity describing the nonlinear dependence between two random variables, has been widely used to construct gene regulatory networks (GRNs). Despite its good performance, MI cannot separate the direct regulations from indirect ones among genes. Although the conditional mutual information (CMI) is able to identify the direct regulations, it generally underestimates the regulation strength, i.e. it may result in false negatives when inferring gene regulations. In this work, to overcome the problems, we propose a novel concept, namely conditional mutual inclusive information (CMI2), to describe the regulations between genes. Furthermore, with CMI2, we develop a new approach, namely CMI2NI (CMI2-based network inference), for reverse-engineering GRNs. In CMI2NI, CMI2 is used to quantify the mutual information between two genes given a third one through calculating the Kullback-Leibler divergence between the postulated distributions of including and excluding the edge between the two genes. The benchmark results on the GRNs from DREAM challenge as well as the SOS DNA repair network in Escherichia coli demonstrate the superior performance of CMI2NI. Specifically, even for gene expression data with small sample size, CMI2NI can not only infer the correct topology of the regulation networks but also accurately quantify the regulation strength between genes. As a case study, CMI2NI was also used to reconstruct cancer-specific GRNs using gene expression data from The Cancer Genome Atlas (TCGA). CMI2NI is freely accessible at http://www.comp-sysbio.org/cmi2ni.
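
    CMI2 itself is defined through a Kullback-Leibler divergence between postulated distributions, which this record does not give in full. As a hedged illustration of the underlying idea, the sketch below estimates ordinary conditional mutual information for (approximately) Gaussian data via partial correlation, and shows why conditioning on a common driver Z distinguishes a direct regulation X→Y from an indirect one. All variable names and data here are synthetic, not from the DREAM or TCGA datasets.

    ```python
    import numpy as np

    def gaussian_cmi(x, y, z):
        """I(X;Y|Z) in nats for Gaussian data, via the partial correlation
        of X and Y given Z (regress Z out of both, correlate residuals)."""
        Z = np.column_stack([np.ones_like(z), z])
        rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
        ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
        rho = np.corrcoef(rx, ry)[0, 1]
        return -0.5 * np.log(1.0 - rho ** 2)

    rng = np.random.default_rng(0)
    z = rng.normal(size=5000)                      # common driver
    x = z + 0.1 * rng.normal(size=5000)            # X regulated by Z
    y_direct = x + 0.5 * rng.normal(size=5000)     # direct X -> Y link
    y_indirect = z + 0.5 * rng.normal(size=5000)   # Y linked to X only via Z

    cmi_direct = gaussian_cmi(x, y_direct, z)
    cmi_indirect = gaussian_cmi(x, y_indirect, z)
    print(cmi_direct > cmi_indirect)  # direct regulation survives conditioning
    ```

    The indirect pair's conditional dependence collapses toward zero, which is the property CMI-style measures exploit to prune spurious edges from inferred networks.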

  1. Waste management facilities cost information for transuranic waste

    SciTech Connect

    Shropshire, D.; Sherick, M.; Biagi, C.

    1995-06-01

    This report contains preconceptual designs and planning level life-cycle cost estimates for managing transuranic waste. The report's information on treatment and storage modules can be integrated to develop total life-cycle costs for various waste management options. A procedure to guide the U.S. Department of Energy and its contractor personnel in the use of cost estimation data is also summarized in this report.

  2. Waste Management Facilities cost information for low-level waste

    SciTech Connect

    Shropshire, D.; Sherick, M.; Biadgi, C.

    1995-06-01

    This report contains preconceptual designs and planning level life-cycle cost estimates for managing low-level waste. The report's information on treatment, storage, and disposal modules can be integrated to develop total life-cycle costs for various waste management options. A procedure to guide the US Department of Energy and its contractor personnel in the use of cost estimation data is also summarized in this report.

  3. Waste management facilities cost information for hazardous waste. Revision 1

    SciTech Connect

    Shropshire, D.; Sherick, M.; Biagi, C.

    1995-06-01

    This report contains preconceptual designs and planning level life-cycle cost estimates for managing hazardous waste. The report's information on treatment, storage, and disposal modules can be integrated to develop total life-cycle costs for various waste management options. A procedure to guide the US Department of Energy and its contractor personnel in the use of cost estimation data is also summarized in this report.

  4. 75 FR 76022 - Agency Information Collection Activities: Cost Submission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-07

    ... published in the Federal Register (75 FR 57284) on September 20, 2010, allowing for a 60-day comment period... SECURITY Customs and Border Protection Agency Information Collection Activities: Cost Submission AGENCY: U... approval in accordance with the Paperwork Reduction Act: Cost Submission (CBP Form 247). This is a...

  5. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA Ames Research Center, has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of Navier-Stokes simulations of transonic flow over a flexible aeroelastic wing-body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are performed on a wing undergoing rigid pitch-and-plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall-clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  6. Accurate path integral molecular dynamics simulation of ab-initio water at near-zero added cost

    NASA Astrophysics Data System (ADS)

    Elton, Daniel; Fritz, Michelle; Soler, José; Fernandez-Serra, Marivi

    It is now established that nuclear quantum motion plays an important role in determining water's structure and dynamics. These effects are important to consider when evaluating DFT functionals and attempting to develop better ones for water. The standard way of treating nuclear quantum effects, path integral molecular dynamics (PIMD), multiplies the number of energy/force calculations by the number of beads, typically 32. Here we introduce a method whereby PIMD can be incorporated into a DFT molecular dynamics simulation at virtually zero cost. The method is based on the cluster (many-body) expansion of the energy. We first subtract the DFT monomer energies, using a custom DFT-based monomer potential energy surface. The evolution of the PIMD beads is then performed using only the more accurate Partridge-Schwenke monomer energy surface, while the DFT calculations are done using the centroid positions. Various bead thermostats can be employed to speed up the sampling of the quantum ensemble. The method bears some resemblance to multiple-timestep algorithms and other schemes used to speed up PIMD with classical force fields. We show that our method correctly captures some of the key effects of nuclear quantum motion on both the structure and dynamics of water. We acknowledge support from DOE Award No. DE-FG02-09ER16052 (D.E.) and DOE Early Career Award No. DE-SC0003871 (M.V.F.S.).
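
    For context on why naive PIMD multiplies the cost by the bead count: each of the P beads needs its own force evaluation, and the beads are coupled by harmonic springs with frequency omega_P = P/(beta*hbar). The sketch below shows only that standard inter-bead spring term in 1D with illustrative (dimensionless) units; it is not the authors' monomer-surface scheme.

    ```python
    import numpy as np

    HBAR = 1.0  # illustrative reduced units, not SI

    def spring_energy(x, mass, beta):
        """Harmonic inter-bead energy of a 1D ring polymer:
        E = 0.5 * m * omega_P^2 * sum_k (x_k - x_{k+1})^2, cyclic in k."""
        P = len(x)
        omega_P = P / (beta * HBAR)
        diffs = x - np.roll(x, -1)  # x_k - x_{k+1}, with x_{P+1} = x_1
        return 0.5 * mass * omega_P ** 2 * np.sum(diffs ** 2)

    # A collapsed ring (all beads coincident, the classical limit) costs nothing;
    # a spread-out ring pays a quantum "stiffness" penalty.
    print(spring_energy(np.zeros(32), 1.0, 1.0))                    # 0.0
    print(spring_energy(np.array([0.0, 1.0, 0.0, 1.0]), 1.0, 1.0))  # 32.0
    ```

    Schemes like the one in this record avoid paying the external-potential cost P times by evolving the beads on a cheap monomer surface and calling DFT only at the centroid.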

  7. 76 FR 35218 - Federal Acquisition Regulation; Information Collection; Cost or Pricing Data Requirements and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-16

    ... Regulation; Information Collection; Cost or Pricing Data Requirements and Information Other Than Cost or... requirement concerning cost or pricing data requirements and information other than cost or pricing data... Information Collection 9000- 0013, Cost or Pricing Data Requirements and Information Other Than Cost...

  8. 48 CFR 239.7406 - Cost or pricing data and information other than cost or pricing data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... information other than cost or pricing data. 239.7406 Section 239.7406 Federal Acquisition Regulations System... ACQUISITION OF INFORMATION TECHNOLOGY Telecommunications Services 239.7406 Cost or pricing data and information other than cost or pricing data. (a) Common carriers are not required to submit cost or...

  9. Highly Accurate Prediction of Protein-Protein Interactions via Incorporating Evolutionary Information and Physicochemical Characteristics

    PubMed Central

    Li, Zheng-Wei; You, Zhu-Hong; Chen, Xing; Gui, Jie; Nie, Ru

    2016-01-01

    Protein-protein interactions (PPIs) occur at almost all levels of cell functions and play crucial roles in various cellular processes. Thus, identification of PPIs is critical for deciphering the molecular mechanisms and further providing insight into biological processes. Although a variety of high-throughput experimental techniques have been developed to identify PPIs, existing PPI pairs by experimental approaches only cover a small fraction of the whole PPI networks, and further, those approaches hold inherent disadvantages, such as being time-consuming, expensive, and having high false positive rate. Therefore, it is urgent and imperative to develop automatic in silico approaches to predict PPIs efficiently and accurately. In this article, we propose a novel mixture of physicochemical and evolutionary-based feature extraction method for predicting PPIs using our newly developed discriminative vector machine (DVM) classifier. The improvements of the proposed method mainly consist in introducing an effective feature extraction method that can capture discriminative features from the evolutionary-based information and physicochemical characteristics, and then a powerful and robust DVM classifier is employed. To the best of our knowledge, it is the first time that DVM model is applied to the field of bioinformatics. When applying the proposed method to the Yeast and Helicobacter pylori (H. pylori) datasets, we obtain excellent prediction accuracies of 94.35% and 90.61%, respectively. The computational results indicate that our method is effective and robust for predicting PPIs, and can be taken as a useful supplementary tool to the traditional experimental methods for future proteomics research. PMID:27571061

  12. Effect of Health Information Technology Expenditure on Patient Level Cost

    PubMed Central

    Dowd, Bryan

    2013-01-01

    Objectives: This study investigates the effect of health information technology (IT) expenditure on individual patient-level cost using California Office of Statewide Health Planning and Development (OSHPD) data from 2000 to 2007. Methods: We used a traditional cost function and applied hospital fixed effects with errors clustered within hospitals. Results: We found that a quadratic function of IT expenditure best fit the data. The quadratic function predicts a decrease in cost of up to US$1,550 per bed for IT labor, US$27,909 per bed for IT capital, and US$28,695 per bed for all IT expenditure. Moreover, we found that IT expenditure reduced costs more quickly for medical conditions than for surgical diseases. Conclusions: Interest in health IT is greater than ever before. Many studies have examined the effect of health IT on hospital-level cost, but few have examined the relationship between health IT expenditure and individual patient-level cost. We found that IT expenditure was associated with patient cost; in particular, patient-level cost is related to IT expenditure by a quadratic (second-order polynomial) function. PMID:24175120
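
    A quadratic cost-expenditure relationship implies a cost-minimizing level of IT spending at the parabola's vertex. The sketch below fits such a curve to entirely synthetic data (the true OSHPD regression includes fixed effects and controls not reproduced here) and recovers the vertex at -b/(2a).

    ```python
    import numpy as np

    # Hypothetical observations; the true relationship planted here is
    # cost = 2*(it - 25)^2 + 100, plus noise.
    rng = np.random.default_rng(1)
    it = rng.uniform(0, 50, 200)                       # IT spend per bed
    cost = 2 * (it - 25) ** 2 + 100 + rng.normal(0, 5, 200)

    a, b, c = np.polyfit(it, cost, 2)  # cost ~ a*it^2 + b*it + c
    it_min = -b / (2 * a)              # vertex: spend level minimizing cost
    print(it_min)                      # close to the planted minimum of 25
    ```

    A convex fit (a > 0) is what lets the study report a cost decrease "up to" a specific dollar figure: beyond the vertex, additional expenditure no longer reduces predicted cost.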

  13. Thermodynamic Costs of Information Processing in Sensory Adaptation

    PubMed Central

    Sartori, Pablo; Granger, Léo; Lee, Chiu Fan; Horowitz, Jordan M.

    2014-01-01

    Biological sensory systems react to changes in their surroundings. They are characterized by fast response and slow adaptation to varying environmental cues. Insofar as sensory adaptive systems map environmental changes to changes of their internal degrees of freedom, they can be regarded as computational devices manipulating information. Landauer established that information is ultimately physical, and its manipulation subject to the entropic and energetic bounds of thermodynamics. Thus the fundamental costs of biological sensory adaptation can be elucidated by tracking how the information the system has about its environment is altered. These bounds are particularly relevant for small organisms, which unlike everyday computers, operate at very low energies. In this paper, we establish a general framework for the thermodynamics of information processing in sensing. With it, we quantify how during sensory adaptation information about the past is erased, while information about the present is gathered. This process produces entropy larger than the amount of old information erased and has an energetic cost bounded by the amount of new information written to memory. We apply these principles to the E. coli's chemotaxis pathway during binary ligand concentration changes. In this regime, we quantify the amount of information stored by each methyl group and show that receptors consume energy in the range of the information-theoretic minimum. Our work provides a basis for further inquiries into more complex phenomena, such as gradient sensing and frequency response. PMID:25503948
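
    The Landauer bound invoked above has a simple numerical form: erasing one bit at temperature T dissipates at least k_B * T * ln(2) of heat. The short sketch below just evaluates that bound; the framing as an erasure cost is standard thermodynamics of computation, not specific to this paper's chemotaxis model.

    ```python
    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

    def landauer_bound_joules(bits, temperature_kelvin):
        """Minimum heat dissipated to erase `bits` of information at T."""
        return bits * K_B * temperature_kelvin * math.log(2)

    print(landauer_bound_joules(1, 300))  # ~2.87e-21 J per bit at 300 K
    ```

    That this energy is roughly a thousandth of the ~1e-19 J released by ATP hydrolysis is why the bound matters for small organisms operating near thermodynamic limits.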

  14. 48 CFR 570.110 - Cost or pricing data and information other than cost or pricing data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Cost or pricing data and information other than cost or pricing data. 570.110 Section 570.110 Federal Acquisition Regulations System... PROPERTY General 570.110 Cost or pricing data and information other than cost or pricing data. (a)...

  15. Overwriting information: Correlations, physical costs, and environment models

    NASA Astrophysics Data System (ADS)

    Anderson, Neal G.

    2012-03-01

    In this sequel to our previous study of the entropic and energetic costs of information erasure [N.G. Anderson, Phys. Lett. A 372 (2008) 5552], we consider direct overwriting of classical information encoded in a quantum-mechanical memory system interacting with a heat bath. Lower bounds on physical costs of overwriting - in both “single-shot” and “sequential” overwriting scenarios - are obtained from globally unitary quantum dynamics and entropic inequalities alone, all within a referential approach that grounds information content in correlations between physical system states. A heterogeneous environment model, required for consistent treatment of sequential overwriting, is introduced and used to establish and relate bounds for various cases.

  16. Information Technology: A Tool to Cut Health Care Costs

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Maly, K. J.; Overstreet, C. M.; Foudriat, E. C.

    1996-01-01

    Old Dominion University embarked on a project to see how current computer technology could be applied to reduce the cost and/or improve the efficiency of health care services. We designed and built a prototype for an integrated medical record system (MRS). The MRS is written in the Tool Command Language and its toolkit (Tcl/Tk). While the initial version of the prototype had patient information hard-coded into the system, later versions used an INGRES database for storing patient information. Currently, we have proposed an object-oriented model for implementing the MRS. These projects involve developing information systems for physicians and medical researchers to enhance their ability to deliver improved treatment at reduced cost. The move to computerized patient records is well underway: several standards exist for laboratory records, and several groups are working on standards for other portions of the patient record.

  17. Accurate refinement of docked protein complexes using evolutionary information and deep learning.

    PubMed

    Akbal-Delibas, Bahar; Farhoodi, Roshanak; Pomplun, Marc; Haspel, Nurit

    2016-06-01

    One of the major challenges for protein docking methods is to accurately discriminate native-like structures from false positives. Docking methods are often inaccurate, and the results have to be refined and re-ranked to obtain native-like complexes and remove outliers. In a previous work, we introduced AccuRefiner, a machine learning based tool for refining protein-protein complexes. Given a docked complex, the refinement tool produces a small set of refined versions of the input complex with lower root-mean-square deviation (RMSD) of atomic positions with respect to the native structure. The method employs a unique ranking tool that accurately predicts the RMSD of docked complexes with respect to the native structure. In this work, we use a deep learning network with a similar set of features and five layers. We show that a properly trained deep learning network can accurately predict the RMSD of a docked complex with a 1.40 Å error margin on average, by approximating the complex relationship between a wide set of scoring function terms and the RMSD of a docked structure. The network was trained on 35,000 unbound docking complexes generated by RosettaDock. We tested our method on 25 putative docked complexes, also produced by RosettaDock, for five proteins that were not included in the training data. The results demonstrate that the high accuracy of the ranking tool enables AccuRefiner to consistently choose refinement candidates with lower RMSD values than the coarsely docked input structures. PMID:26846813
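
    The quantity the network above is trained to predict is the RMSD between a docked pose and the native structure. For reference, a minimal sketch of that metric for already-superimposed coordinates (no optimal rotation/fitting step, which real pipelines would add):

    ```python
    import numpy as np

    def rmsd(a, b):
        """Root-mean-square deviation between two (N, 3) coordinate arrays,
        assuming they are already superimposed in the same frame."""
        return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))

    a = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0]])
    b = a + np.array([1.0, 0.0, 0.0])  # rigid 1 Å shift along x
    print(rmsd(a, b))  # 1.0
    ```

    Predicting this value directly from scoring-function terms, as the record describes, sidesteps needing the (unknown) native structure at ranking time.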

  19. Improving the delivery of care and reducing healthcare costs with the digitization of information.

    PubMed

    Noffsinger, R; Chin, S

    2000-01-01

    In the coming years, the digitization of information and the Internet will be extremely powerful in reducing healthcare costs while assisting providers in the delivery of care. One example of healthcare inefficiency that can be managed through information digitization is the process of prescription writing. Due to the handwritten and verbal communication surrounding prescription writing, as well as the multiple tiers of authorizations, the prescription drug process causes extensive financial waste, medical errors, lost time, and even fatal accidents. Electronic prescription management systems are being designed to address these inefficiencies. By utilizing new electronic prescription systems, physicians not only prescribe more accurately but also improve formulary compliance, thereby reducing pharmacy utilization. These systems expand patient care by presenting proactive alternatives at the point of prescription while reducing costs and providing additional benefits for consumers and healthcare providers. PMID:11066646

  20. Can Raters with Reduced Job Descriptive Information Provide Accurate Position Analysis Questionnaire (PAQ) Ratings?

    ERIC Educational Resources Information Center

    Friedman, Lee; Harvey, Robert J.

    1986-01-01

    Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…

  1. 77 FR 63804 - Federal Acquisition Regulation; Information Collection; Indirect Cost Rates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-17

    ... Regulation; Information Collection; Indirect Cost Rates AGENCY: Department of Defense (DOD), General Services... requirement concerning Indirect Cost Rates. Public comments are particularly invited on: Whether this... Information Collection 9000- 0069, Indirect Cost Rates, by any of the following methods:...

  2. Dynamic disorder and the energetic costs of information transduction

    SciTech Connect

    Thill, Peter

    2014-07-07

    We study a model of dynamic disorder relevant for signal transduction pathways in which enzymatic reaction rates fluctuate over several orders of magnitude. For the simple networks we consider, dynamic disorder drives the system far from equilibrium and imposes an energetic burden for high fidelity signaling capability. We study how the dynamics of the underlying stochastic behavior in the reaction rate process is related to the energetic cost of transmitting information through the network.

  3. 48 CFR 1615.406-2 - Certificate of accurate cost or pricing data for community-rated carriers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES HEALTH BENEFITS ACQUISITION REGULATION... Community-Rated Carriers This is to certify that, to the best of my knowledge and belief: (1) The cost or... the ____* FEHB Program rates were developed in accordance with the requirements of 48 CFR Chapter...

  4. 48 CFR 1615.406-2 - Certificates of accurate cost or pricing data for community rated carriers.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES HEALTH BENEFITS ACQUISITION REGULATION... certify that, to the best of my knowledge and belief: (1) The cost or pricing data submitted (or, if not... were developed in accordance with the requirements of 48 CFR Chapter 16 and the FEHB Program...

  5. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... to release to the public: (1) The Commission staff or a qualified person or entity outside the... will review the information in light of the comments. The degree of review by the Commission and...

  6. An examination of information quality as a moderator of accurate personality judgment.

    PubMed

    Letzring, Tera D; Human, Lauren J

    2014-10-01

    Information quality is an important moderator of the accuracy of personality judgment, and this article describes research focusing on how specific kinds of information are related to accuracy. In this study, 228 participants (159 female, 69 male; mean age = 23.43 years; 86.4% Caucasian) in unacquainted dyads were assigned to discuss thoughts and feelings, discuss behaviors, or engage in behaviors. Interactions lasted 25-30 min, and participants provided ratings of their partners and themselves following the interaction on the Big Five traits, ego-control, and ego-resiliency. Next, the amount of each type of information made available by each participant was objectively coded. The accuracy criterion, composed of self- and acquaintance ratings, was used to assess distinctive and normative accuracy using the Social Accuracy Model. Participants in the discussion conditions achieved higher distinctive accuracy than participants who engaged in behaviors, but normative accuracy did not differ across conditions. Information about specific behaviors and about general behaviors was among the most consistent predictors of higher distinctive accuracy. Normative accuracy was more likely to decrease than increase when higher-quality information was available. Verbal information about behaviors is the most useful for learning about how people are unique.

  7. Polyallelic structural variants can provide accurate, highly informative genetic markers focused on diagnosis and therapeutic targets: Accuracy vs. Precision.

    PubMed

    Roses, A D

    2016-02-01

    Structural variants (SVs) include all insertions, deletions, and rearrangements in the genome, with several common types of nucleotide repeats including simple sequence repeats, short tandem repeats, and insertion-deletion length variants. Polyallelic SVs provide highly informative markers for association studies with well-phenotyped cohorts. SVs can influence gene regulation by affecting epigenetics, transcription, splicing, and/or translation. Accurate assays of polyallelic SV loci are required to define the range and allele frequency of variable-length alleles. PMID:26517180

  8. Exploratory Movement Generates Higher-Order Information That Is Sufficient for Accurate Perception of Scaled Egocentric Distance

    PubMed Central

    Mantel, Bruno; Stoffregen, Thomas A.; Campbell, Alain; Bardy, Benoît G.

    2015-01-01

    Body movement influences the structure of multiple forms of ambient energy, including optics and gravito-inertial force. Some researchers have argued that egocentric distance is derived from inferential integration of visual and non-visual stimulation. We suggest that accurate information about egocentric distance exists in perceptual stimulation as higher-order patterns that extend across optics and inertia. We formalize a pattern that specifies the egocentric distance of a stationary object across higher-order relations between optics and inertia. This higher-order parameter is created by self-generated movement of the perceiver in inertial space relative to the illuminated environment. For this reason, we placed minimal restrictions on the exploratory movements of our participants. We asked whether humans can detect and use the information available in this higher-order pattern. Participants judged whether a virtual object was within reach. We manipulated relations between body movement and the ambient structure of optics and inertia. Judgments were precise and accurate when the higher-order optical-inertial parameter was available. When only optic flow was available, judgments were poor. Our results reveal that participants perceived egocentric distance from the higher-order, optical-inertial consequences of their own exploratory activity. Analysis of participants’ movement trajectories revealed that self-selected movements were complex, and tended to optimize availability of the optical-inertial pattern that specifies egocentric distance. We argue that accurate information about egocentric distance exists in higher-order patterns of ambient energy, that self-generated movement can generate these higher-order patterns, and that these patterns can be detected and used to support perception of egocentric distance that is precise and accurate. PMID:25856410

  9. MBRidge: an accurate and cost-effective method for profiling DNA methylome at single-base resolution

    PubMed Central

    Cai, Wanshi; Mao, Fengbiao; Teng, Huajing; Cai, Tao; Zhao, Fangqing; Wu, Jinyu; Sun, Zhong Sheng

    2015-01-01

    Organisms and cells, in response to environmental influences or during development, undergo considerable changes in DNA methylation on a genome-wide scale, which are linked to a variety of biological processes. Using MethylC-seq to decipher the DNA methylome at single-base resolution is prohibitively costly. In this study, we develop a novel approach, named MBRidge, to detect the methylation levels of repertoire CpGs, by innovatively introducing C-hydroxylmethylated adapters and bisulfite treatment into the MeDIP-seq protocol and employing ridge regression in data analysis. A systematic evaluation of the DNA methylome in a human ovarian cell line T29 showed that MBRidge achieved high correlation (R > 0.90) with much less cost (∼10%) in comparison with MethylC-seq. We further applied MBRidge to profiling the DNA methylome in T29H, an oncogenic counterpart of T29. By comparing the methylomes of T29H and T29, we identified 131790 differential methylation regions (DMRs), which are mainly enriched in carcinogenesis-related pathways. These are substantially different from the 7567 DMRs obtained by RRBS, which are related to cell development or differentiation. The integrated analysis of DMRs in the promoter and expression of DMR-corresponding genes revealed that DNA methylation exerted inverse regulation of gene expression, depending on the distance from the proximal DMR to transcription start sites in both mRNA and lncRNA. Taken together, our results demonstrate that MBRidge is an efficient and cost-effective method that can be widely applied to profiling DNA methylomes. PMID:26078362
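The ridge-regression step at the heart of MBRidge is not spelled out in the abstract; as a general illustration of the technique, the sketch below fits ridge coefficients via the closed-form normal equations on toy data (all names and numbers are invented; this is not the authors' pipeline).

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam*I)^-1 X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Toy data: recover known coefficients from lightly noised observations.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([0.5, -1.0, 2.0])
y = X @ true_w + rng.normal(scale=0.01, size=100)
w = ridge_fit(X, y, lam=0.1)   # close to true_w; lam shrinks slightly toward 0
```

The penalty term `lam` stabilizes the estimate when predictors are correlated or noisy, which is the usual motivation for choosing ridge over ordinary least squares.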

  10. Combining Evolutionary Information and an Iterative Sampling Strategy for Accurate Protein Structure Prediction.

    PubMed

    Braun, Tatjana; Koehler Leman, Julia; Lange, Oliver F

    2015-12-01

    Recent work has shown that the accuracy of ab initio structure prediction can be significantly improved by integrating evolutionary information in form of intra-protein residue-residue contacts. Following this seminal result, much effort is put into the improvement of contact predictions. However, there is also a substantial need to develop structure prediction protocols tailored to the type of restraints gained by contact predictions. Here, we present a structure prediction protocol that combines evolutionary information with the resolution-adapted structural recombination approach of Rosetta, called RASREC. Compared to the classic Rosetta ab initio protocol, RASREC achieves improved sampling, better convergence and higher robustness against incorrect distance restraints, making it the ideal sampling strategy for the stated problem. To demonstrate the accuracy of our protocol, we tested the approach on a diverse set of 28 globular proteins. Our method is able to converge for 26 out of the 28 targets and improves the average TM-score of the entire benchmark set from 0.55 to 0.72 when compared to the top ranked models obtained by the EVFold web server using identical contact predictions. Using a smaller benchmark, we furthermore show that the prediction accuracy of our method is only slightly reduced when the contact prediction accuracy is comparatively low. This observation is of special interest for protein sequences that only have a limited number of homologs.

  11. Honey bees can perform accurately directed waggle dances based solely on information from a homeward trip.

    PubMed

    Edrich, Wolfgang

    2015-10-01

    Honey bees were displaced several hundred meters from their hive to an unfamiliar site and provisioned with honey. After feeding, almost two-thirds of the bees flew home to their hive within a 50 min observation time. About half of these returning bees signalled the direction of the release site in waggle dances, thus demonstrating that the dance can be guided entirely by information gathered on a single homeward trip. The likely reason for the bees' enthusiastic dancing on their initial return from this new site was the highly rewarding honeycomb that they were given there. The attractive nature of the site is confirmed by many of these bees revisiting the site and continuing to forage there.

  12. Accurately decoding visual information from fMRI data obtained in a realistic virtual environment

    PubMed Central

    Floren, Andrew; Naylor, Bruce; Miikkulainen, Risto; Ress, David

    2015-01-01

    Three-dimensional interactive virtual environments (VEs) are a powerful tool for brain-imaging based cognitive neuroscience that are presently under-utilized. This paper presents machine-learning based methods for identifying brain states induced by realistic VEs with improved accuracy as well as the capability for mapping their spatial topography on the neocortex. VEs provide the ability to study the brain under conditions closer to the environment in which humans evolved, and thus to probe deeper into the complexities of human cognition. As a test case, we designed a stimulus to reflect a military combat situation in the Middle East, motivated by the potential of using real-time functional magnetic resonance imaging (fMRI) in the treatment of post-traumatic stress disorder. Each subject experienced moving through the virtual town where they encountered 1–6 animated combatants at different locations, while fMRI data was collected. To analyze the data from what is, compared to most studies, more complex and less controlled stimuli, we employed statistical machine learning in the form of Multi-Voxel Pattern Analysis (MVPA) with special attention given to artificial Neural Networks (NN). Extensions to NN that exploit the block structure of the stimulus were developed to improve the accuracy of the classification, achieving performances from 58 to 93% (chance was 16.7%) with six subjects. This demonstrates that MVPA can decode a complex cognitive state, viewing a number of characters, in a dynamic virtual environment. To better understand the source of this information in the brain, a novel form of sensitivity analysis was developed to use NN to quantify the degree to which each voxel contributed to classification. Compared with maps produced by general linear models and the searchlight approach, these sensitivity maps revealed a more diverse pattern of information relevant to the classification of cognitive state. PMID:26106315
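As an illustration of the MVPA idea described above (decoding cognitive state from patterns across many voxels), the sketch below uses a minimal nearest-centroid decoder on synthetic "voxel" data. It is not the authors' neural-network pipeline; the dimensions and signal levels are invented.

```python
import numpy as np

def nearest_centroid_decode(train_X, train_y, test_X):
    """Minimal MVPA-style decoder: assign each test pattern to the class
    whose mean training pattern (centroid) is nearest in voxel space."""
    classes = np.unique(train_y)
    centroids = np.stack([train_X[train_y == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(test_X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[dists.argmin(axis=1)]

# Synthetic "voxel" data: 6 conditions x 20 trials x 50 voxels, each
# condition carrying a distinct spatial signal buried in noise.
rng = np.random.default_rng(1)
signals = rng.normal(size=(6, 50))
X = np.vstack([s + 0.5 * rng.normal(size=(20, 50)) for s in signals])
y = np.repeat(np.arange(6), 20)
pred = nearest_centroid_decode(X, y, X)   # resubstitution, for illustration
accuracy = (pred == y).mean()             # chance would be ~0.167
```

With this much signal the toy decoder is nearly perfect; real fMRI patterns are far noisier, which is one reason classifier extensions of the kind the paper develops are needed.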

  13. Environmental indivisibilities and information costs: fanaticism, agnosticism, and intellectual progress

    SciTech Connect

    Olson, M.

    1982-05-01

    This analysis suggests several distinctive policy recommendations about environmental problems. One is that some of the alarms about ecological catastrophes cannot simply be dismissed, even when some of those who sound the alarms seem almost fanatic. The information needed to be sure one way or another is simply lacking, and may not be attainable at reasonable cost for a long time. We are therefore left with inevitable risk. Ecological systems could also be incomparably more robust than the alarmists claim, so we might also be worrying needlessly. The implication for environmental and ecological research is that we should not expect that it will produce conclusive information, but should fund a lot of it anyhow. If previous research has produced few compelling results, valid information about these problems is scarce and therefore more valuable. The harvest of research in the areas characterized by indivisibilities is then poor but precious knowledge. If it is important to be able to change behavior quickly, when and if we finally get the information that the ecosystem can't take any more, then it is important that we have the open-mindedness needed to change our views and policies the moment decisive information arrives. Those who shout wolf too often, and those who are sure there are no wolves around, could be our undoing.

  14. The cost of forming more accurate impressions: accuracy-motivated perceivers see the personality of others more distinctively but less normatively than perceivers without an explicit goal.

    PubMed

    Biesanz, Jeremy C; Human, Lauren J

    2010-04-01

    Does the motivation to form accurate impressions actually improve accuracy? The present work extended Kenny's (1991, 1994) weighted-average model (WAM)--a theoretical model of the factors that influence agreement among personality judgments--to examine two components of interpersonal perception: distinctive and normative accuracy. WAM predicts that an accuracy motivation should enhance distinctive accuracy but decrease normative accuracy. In other words, the impressions of a perceiver with an accuracy motivation will correspond more with the target person's unique characteristics and less with the characteristics of the average person. Perceivers randomly assigned to receive the social goal of forming accurate impressions, which was communicated through a single-sentence instruction, achieved higher levels of distinctive self-other agreement but lower levels of normative agreement compared with perceivers not given an explicit impression-formation goal. The results suggest that people motivated to form accurate impressions do indeed become more accurate, but at the cost of seeing others less normatively and, in particular, less positively.
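The distinctive/normative decomposition used above can be made concrete with a toy computation. This is a simplified sketch of the idea, not Kenny's WAM or the Social Accuracy Model itself, and the trait profiles are invented.

```python
import numpy as np

def accuracy_components(impression, target_self, norm_profile):
    """Normative accuracy: corr of the impression with the average person's
    profile. Distinctive accuracy: corr with the target's deviation from it."""
    corr = lambda a, b: np.corrcoef(a, b)[0, 1]
    distinctive = target_self - norm_profile
    return corr(impression, norm_profile), corr(impression, distinctive)

# Invented profiles over 10 trait items.
norm = np.arange(1.0, 11.0)                         # the "average person"
d = np.array([1, -1, 1, -1, 0, 0, -1, 1, -1, 1.0])  # target's unique deviation
target = norm + d
impression = norm + 0.5 * d   # perceiver picks up half the distinctive signal
norm_acc, dist_acc = accuracy_components(impression, target, norm)
```

Here the impression tracks the normative profile almost perfectly (norm_acc ≈ 0.99) while capturing only part of what makes the target unique (dist_acc ≈ 0.15), illustrating the paper's point that the two components can move independently.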

  16. 30 CFR 251.13 - Reimbursement for the costs of reproducing data and information and certain processing costs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false Reimbursement for the costs of reproducing data and information and certain processing costs. 251.13 Section 251.13 Mineral Resources BUREAU OF OCEAN... GEOPHYSICAL (G&G) EXPLORATIONS OF THE OUTER CONTINENTAL SHELF § 251.13 Reimbursement for the costs...

  17. Accurate prediction of interfacial residues in two-domain proteins using evolutionary information: implications for three-dimensional modeling.

    PubMed

    Bhaskara, Ramachandra M; Padhi, Amrita; Srinivasan, Narayanaswamy

    2014-07-01

    With the preponderance of multidomain proteins in eukaryotic genomes, it is essential to recognize the constituent domains and their functions. Often function involves communications across the domain interfaces, and knowledge of the interacting sites is essential to our understanding of the structure-function relationship. Using evolutionary information extracted from homologous domains in at least two diverse domain architectures (single and multidomain), we predict the interface residues corresponding to domains from the two-domain proteins. We also use information from the three-dimensional structures of individual domains of two-domain proteins to train a naïve Bayes classifier to predict the interfacial residues. Our predictions are highly accurate (∼85%) and specific (∼95%) to the domain-domain interfaces. This method is specific to multidomain proteins which contain domains in more than one protein architectural context. Using predicted residues to constrain domain-domain interaction, rigid-body docking was able to provide us with accurate full-length protein structures with correct orientation of domains. We believe that these results can be of considerable interest toward rational protein and interaction design, apart from providing us with valuable information on the nature of interactions.

  18. The Cambridge Face Tracker: Accurate, Low Cost Measurement of Head Posture Using Computer Vision and Face Recognition Software

    PubMed Central

    Thomas, Peter B. M.; Baltrušaitis, Tadas; Robinson, Peter; Vivian, Anthony J.

    2016-01-01

    Purpose We validate a video-based method of head posture measurement. Methods The Cambridge Face Tracker uses neural networks (constrained local neural fields) to recognize facial features in video. The relative position of these facial features is used to calculate head posture. First, we assess the accuracy of this approach against videos in three research databases where each frame is tagged with a precisely measured head posture. Second, we compare our method to a commercially available mechanical device, the Cervical Range of Motion device: four subjects each adopted 43 distinct head postures that were measured using both methods. Results The Cambridge Face Tracker achieved confident facial recognition in 92% of the approximately 38,000 frames of video from the three databases. The respective mean error in absolute head posture was 3.34°, 3.86°, and 2.81°, with a median error of 1.97°, 2.16°, and 1.96°. The accuracy decreased with more extreme head posture. Comparing The Cambridge Face Tracker to the Cervical Range of Motion Device gave correlation coefficients of 0.99 (P < 0.0001), 0.96 (P < 0.0001), and 0.99 (P < 0.0001) for yaw, pitch, and roll, respectively. Conclusions The Cambridge Face Tracker performs well under real-world conditions and within the range of normally-encountered head posture. It allows useful quantification of head posture in real time or from precaptured video. Its performance is similar to that of a clinically validated mechanical device. It has significant advantages over other approaches in that subjects do not need to wear any apparatus, and it requires only low cost, easy-to-setup consumer electronics. Translational Relevance Noncontact assessment of head posture allows more complete clinical assessment of patients, and could benefit surgical planning in future. PMID:27730008

  19. Accurate molecular dynamics and nuclear quantum effects at low cost by multiple steps in real and imaginary time: Using density functional theory to accelerate wavefunction methods.

    PubMed

    Kapil, V; VandeVondele, J; Ceriotti, M

    2016-02-01

    The development and implementation of increasingly accurate methods for electronic structure calculations mean that, for many atomistic simulation problems, treating light nuclei as classical particles is now one of the most serious approximations. Even though recent developments have significantly reduced the overhead for modeling the quantum nature of the nuclei, the cost is still prohibitive when combined with advanced electronic structure methods. Here we present how multiple time step integrators can be combined with ring-polymer contraction techniques (effectively, multiple time stepping in imaginary time) to reduce virtually to zero the overhead of modelling nuclear quantum effects, while describing inter-atomic forces at high levels of electronic structure theory. This is demonstrated for a combination of MP2 and semi-local DFT applied to the Zundel cation. The approach can be seamlessly combined with other methods to reduce the computational cost of path integral calculations, such as high-order factorizations of the Boltzmann operator or generalized Langevin equation thermostats.

  20. Accurate molecular dynamics and nuclear quantum effects at low cost by multiple steps in real and imaginary time: Using density functional theory to accelerate wavefunction methods

    NASA Astrophysics Data System (ADS)

    Kapil, V.; VandeVondele, J.; Ceriotti, M.

    2016-02-01

    The development and implementation of increasingly accurate methods for electronic structure calculations mean that, for many atomistic simulation problems, treating light nuclei as classical particles is now one of the most serious approximations. Even though recent developments have significantly reduced the overhead for modeling the quantum nature of the nuclei, the cost is still prohibitive when combined with advanced electronic structure methods. Here we present how multiple time step integrators can be combined with ring-polymer contraction techniques (effectively, multiple time stepping in imaginary time) to reduce virtually to zero the overhead of modelling nuclear quantum effects, while describing inter-atomic forces at high levels of electronic structure theory. This is demonstrated for a combination of MP2 and semi-local DFT applied to the Zundel cation. The approach can be seamlessly combined with other methods to reduce the computational cost of path integral calculations, such as high-order factorizations of the Boltzmann operator or generalized Langevin equation thermostats.

  1. 77 FR 20012 - Federal Acquisition Regulation; Information Collection; Corporate Aircraft Costs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-03

    ... Regulation; Information Collection; Corporate Aircraft Costs AGENCY: Department of Defense (DOD), General... collection requirement concerning corporate aircraft costs. Public comments are particularly invited on..., Corporate Aircraft Costs, by any of the following methods: Regulations.gov :...

  2. Information System for Societal Cost and Benefit Analysis of Vocational and Manpower Programs. Final Report.

    ERIC Educational Resources Information Center

    Arora, Mehar

    The study was directed toward developing a manual for establishing societal benefits and costs of vocational and manpower programs in Wisconsin. After first outlining the background of benefit-cost analysis, problems in establishing cost functions in education are presented along with some important cost concepts and uses of cost information in…

  3. 40 CFR 2.311 - Special rules governing certain information obtained under the Motor Vehicle Information and Cost...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Vehicle Information and Cost Savings Act, as amended, 15 U.S.C. 1901 et seq. (2) Average fuel economy has... section applies, except that information that is fuel economy data is not eligible for confidential... information obtained under the Motor Vehicle Information and Cost Savings Act. 2.311 Section 2.311...

  4. The implications for information system design of how health care costs are determined.

    PubMed

    Ehreth, J

    1996-03-01

    As the costs of health care assume increasing importance in national health policy, information systems will be required to supply better information about how costs are generated and how resources are distributed. Costs, as determined by accounting systems, often are inadequate for policy analysis because they represent resources consumed (expenditures) to produce given outputs but do not measure forgone alternative uses of the resources (opportunity costs). To accommodate cost studies at the program level and the system level, relational information systems must be developed that allow costs to be summed across individuals to determine an organization's costs, across providers to determine an individual patient's costs, and across both to determine system and population costs. Program-level studies require that cost variables be grouped into variable costs that are tied to changes in volume of output and fixed costs that are allocated rationally. Data sources for program-level analyses are organizational financial statements, cost center accounting records, Medicare cost reports, American Hospital Association surveys, and the Department of Veterans Affairs (VA) cost distribution files. System-level studies are performed to predict future costs and to compare costs of alternative modes of treatment. System-level analyses aggregate all costs associated with individuals to produce population-based costs. Data sources for system-level analyses include insurance claims; Medicare files; hospital billing records; and VA inpatient, outpatient, and management databases. Future cost studies will require the assessment of costs from all providers, regardless of organizational membership status, for all individuals in defined populations.

  5. What does an MRI scan cost?

    PubMed

    Young, David W

    2015-11-01

    Historically, hospital departments have computed the costs of individual tests or procedures using the ratio of cost to charges (RCC) method, which can produce inaccurate results. To determine a more accurate cost of a test or procedure, the activity-based costing (ABC) method must be used. Accurate cost calculations will ensure reliable information about the profitability of a hospital's DRGs. PMID:26685437
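The contrast between the RCC and ABC methods described above can be shown with a toy calculation; the department figures and activity rates below are invented for illustration.

```python
def rcc_cost(charge, dept_costs, dept_charges):
    """Ratio-of-cost-to-charges: scale a procedure's charge by the
    department-wide cost/charge ratio."""
    return charge * (dept_costs / dept_charges)

def abc_cost(activities):
    """Activity-based costing: sum (driver quantity x rate) over the
    activities the procedure actually consumes."""
    return sum(qty * rate for qty, rate in activities)

# Hypothetical MRI scan: a $1,200 charge in a department with $5M costs
# on $10M charges, versus the activities the scan actually uses.
rcc = rcc_cost(1200, 5_000_000, 10_000_000)   # 600.0
abc = abc_cost([(0.75, 320),   # 45 min scanner time at $320/hr
                (0.50, 80),    # 30 min technologist time at $80/hr
                (1.00, 95)])   # contrast agent and supplies per scan
```

The gap between the two figures (here $600 vs. $375) is exactly the kind of distortion the article warns can mislead DRG profitability analysis.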

  6. Preferential access to genetic information from endogenous hominin ancient DNA and accurate quantitative SNP-typing via SPEX

    PubMed Central

    Brotherton, Paul; Sanchez, Juan J.; Cooper, Alan; Endicott, Phillip

    2010-01-01

    The analysis of targeted genetic loci from ancient, forensic and clinical samples is usually built upon polymerase chain reaction (PCR)-generated sequence data. However, many studies have shown that PCR amplification from poor-quality DNA templates can create sequence artefacts at significant levels. With hominin (human and other hominid) samples, the pervasive presence of highly PCR-amplifiable human DNA contaminants in the vast majority of samples can lead to the creation of recombinant hybrids and other non-authentic artefacts. The resulting PCR-generated sequences can then be difficult, if not impossible, to authenticate. In contrast, single primer extension (SPEX)-based approaches can genotype single nucleotide polymorphisms from ancient fragments of DNA as accurately as modern DNA. A single SPEX-type assay can amplify just one of the duplex DNA strands at target loci and generate a multi-fold depth-of-coverage, with non-authentic recombinant hybrids reduced to undetectable levels. Crucially, SPEX-type approaches can preferentially access genetic information from damaged and degraded endogenous ancient DNA templates over modern human DNA contaminants. The development of SPEX-type assays offers the potential for highly accurate, quantitative genotyping from ancient hominin samples. PMID:19864251

  7. 30 CFR 551.13 - Reimbursement for the costs of reproducing data and information and certain processing costs.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 2 2012-07-01 2012-07-01 false Reimbursement for the costs of reproducing data and information and certain processing costs. 551.13 Section 551.13 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE GEOLOGICAL AND GEOPHYSICAL (G&G) EXPLORATIONS...

  8. 30 CFR 551.13 - Reimbursement for the costs of reproducing data and information and certain processing costs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 2 2013-07-01 2013-07-01 false Reimbursement for the costs of reproducing data and information and certain processing costs. 551.13 Section 551.13 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE GEOLOGICAL AND GEOPHYSICAL (G&G) EXPLORATIONS...

  9. 30 CFR 551.13 - Reimbursement for the costs of reproducing data and information and certain processing costs.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 2 2014-07-01 2014-07-01 false Reimbursement for the costs of reproducing data and information and certain processing costs. 551.13 Section 551.13 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE GEOLOGICAL AND GEOPHYSICAL (G&G) EXPLORATIONS...

  10. The Social Costs of Ubiquitous Information: Consuming Information on Mobile Phones Is Associated with Lower Trust.

    PubMed

    Kushlev, Kostadin; Proulx, Jason D E

    2016-01-01

    In an age already saturated with information, the ongoing revolution in mobile computing has expanded the realm of immediate information access far beyond our homes and offices. In addition to changing where people can access information, mobile computing has changed what information people access-from finding specific directions to a restaurant to exploring nearby businesses when on the go. Does this ability to instantly gratify our information needs anytime and anywhere have any bearing on how much we trust those around us-from neighbors to strangers? Using data from a large nationally representative survey (World Values Survey: Wave 6), we found that the more people relied on their mobile phones for information, the less they trusted strangers, neighbors and people from other religions and nationalities. In contrast, obtaining information through any other method-including TV, radio, newspapers, and even the Internet more broadly-predicted higher trust in those groups. Mobile information had no bearing on how much people trusted close others, such as their family. Although causality cannot be inferred, these findings provide an intriguing first glimpse into the possible unforeseen costs of convenient information access for the social lubricant of society-our sense of trust in one another. PMID:27606707

  11. The Social Costs of Ubiquitous Information: Consuming Information on Mobile Phones Is Associated with Lower Trust

    PubMed Central

    Kushlev, Kostadin; Proulx, Jason D. E.

    2016-01-01

    In an age already saturated with information, the ongoing revolution in mobile computing has expanded the realm of immediate information access far beyond our homes and offices. In addition to changing where people can access information, mobile computing has changed what information people access—from finding specific directions to a restaurant to exploring nearby businesses when on the go. Does this ability to instantly gratify our information needs anytime and anywhere have any bearing on how much we trust those around us—from neighbors to strangers? Using data from a large nationally representative survey (World Values Survey: Wave 6), we found that the more people relied on their mobile phones for information, the less they trusted strangers, neighbors and people from other religions and nationalities. In contrast, obtaining information through any other method—including TV, radio, newspapers, and even the Internet more broadly—predicted higher trust in those groups. Mobile information had no bearing on how much people trusted close others, such as their family. Although causality cannot be inferred, these findings provide an intriguing first glimpse into the possible unforeseen costs of convenient information access for the social lubricant of society—our sense of trust in one another. PMID:27606707

  12. Cost-Benefit Analysis of Electronic Information: A Case Study.

    ERIC Educational Resources Information Center

    White, Gary W.; Crawford, Gregory A.

    1998-01-01

    Describes a study at Pennsylvania State University Harrisburg in which cost-benefit analysis (CBA) was used to examine the cost effectiveness of an electronic database. Concludes that librarians can use the results of CBA studies to justify budgets and acquisitions and to provide insight into the true costs of providing library services. (PEN)
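A minimal sketch of the CBA arithmetic such a study relies on, assuming a simple discounted benefit-cost ratio (the subscription figures and discount rate are invented):

```python
def npv(flows, rate):
    """Net present value: discount the year-t cash flow by (1 + rate)**t."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

def benefit_cost_ratio(benefits, costs, rate):
    """A program is cost-effective when this ratio exceeds 1."""
    return npv(benefits, rate) / npv(costs, rate)

# Hypothetical database subscription: $10k/yr vs. estimated staff-time savings.
ratio = benefit_cost_ratio(benefits=[12_000, 14_000, 15_000],
                           costs=[10_000, 10_000, 10_000],
                           rate=0.05)   # ~1.36: benefits exceed costs
```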

  13. 48 CFR 215.403-5 - Instructions for submission of cost or pricing data or information other than cost or pricing data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... submission of cost or pricing data or information other than cost or pricing data. 215.403-5 Section 215.403... Instructions for submission of cost or pricing data or information other than cost or pricing data. When the solicitation requires contractor compliance with the Contractor Cost Data Reporting System, follow...

  14. Product line cost estimation: a standard cost approach.

    PubMed

    Cooper, J C; Suver, J D

    1988-04-01

    Product line managers often must make decisions based on inaccurate cost information. A method is needed to determine costs more accurately. By using a standard costing model, product line managers can better estimate the cost of intermediate and end products, and hence better estimate the costs of the product line. PMID:10286385
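A standard-costing system also supports variance analysis, which is what lets a product line manager see why actual costs departed from estimates. The sketch below shows the classic price/usage split with invented nursing-time figures; it is a generic textbook decomposition, not the authors' model.

```python
def flexible_budget_variances(actual_qty, actual_price, std_qty, std_price):
    """Split total spending variance into price and usage effects,
    the core of a standard-costing system."""
    price_variance = (actual_price - std_price) * actual_qty
    usage_variance = std_price * (actual_qty - std_qty)
    # Identity check: the two effects sum to the total variance.
    total = actual_qty * actual_price - std_qty * std_price
    assert abs(total - (price_variance + usage_variance)) < 1e-9
    return price_variance, usage_variance

# Hypothetical: nursing time per treatment budgeted at 2.0 h x $45/h,
# actually 2.4 h x $43/h.
pv, uv = flexible_budget_variances(2.4, 43.0, 2.0, 45.0)
# pv is favorable (cheaper rate), uv is unfavorable (more hours used)
```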

  15. 19 CFR 10.21 - Updating cost data and other information.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 1 2010-04-01 2010-04-01 false Updating cost data and other information. 10.21... Articles Assembled Abroad with United States Components § 10.21 Updating cost data and other information. When a claim for the exemption is predicated on estimated cost data furnished either in advance of...

  16. Waste management facilities cost information: System cost model product description. Revision 2

    SciTech Connect

    Lundeen, A.S.; Hsu, K.M.; Shropshire, D.E.

    1996-02-01

    In May of 1994, Lockheed Idaho Technologies Company (LITCO) in Idaho Falls, Idaho and subcontractors developed the System Cost Model (SCM) application. The SCM estimates life-cycle costs of the entire US Department of Energy (DOE) complex for designing; constructing; operating; and decommissioning treatment, storage, and disposal (TSD) facilities for mixed low-level, low-level, transuranic, and mixed transuranic waste. The SCM uses parametric cost functions to estimate life-cycle costs for various treatment, storage, and disposal modules which reflect planned and existing facilities at DOE installations. In addition, SCM can model new facilities based on capacity needs over the program life cycle. The SCM also provides transportation costs for DOE wastes. Transportation costs are provided for truck and rail and include transport of contact-handled, remote-handled, and alpha (transuranic) wastes. The user can provide input data (default data is included in the SCM) including the volume and nature of waste to be managed, the time period over which the waste is to be managed, and the configuration of the waste management complex (i.e., where each installation's generated waste will be treated, stored, and disposed). The SCM then uses parametric cost equations to estimate the costs of pre-operations (design), construction, operations, and decommissioning of these waste management facilities.
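    The parametric cost functions described above can be illustrated with a generic power-law form, cost = a * capacity^b, evaluated per life-cycle phase. The coefficients, phase names, and capacity below are invented for illustration and are not actual SCM equations.

```python
# Hypothetical illustration of a parametric life-cycle cost estimate:
# each phase's cost is a power-law function of a capacity driver.
# Coefficients (a, b) are invented, not SCM values.
def phase_cost(capacity_m3, a, b):
    """Generic parametric form: cost = a * capacity^b."""
    return a * capacity_m3 ** b

phases = {  # (a, b) per life-cycle phase, hypothetical
    "design":        (0.8e6, 0.30),
    "construction":  (2.5e6, 0.55),
    "operations":    (1.2e6, 0.60),
    "decommission":  (0.6e6, 0.45),
}

capacity = 5000.0  # waste volume to manage, m^3 (hypothetical)
life_cycle = {name: phase_cost(capacity, a, b) for name, (a, b) in phases.items()}
total = sum(life_cycle.values())
print(f"total life-cycle cost estimate: ${total:,.0f}")
```

    Summing such phase estimates across treatment, storage, disposal, and transportation modules yields a complex-wide total of the kind the SCM reports.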

  17. Allele Specific Locked Nucleic Acid Quantitative PCR (ASLNAqPCR): An Accurate and Cost-Effective Assay to Diagnose and Quantify KRAS and BRAF Mutation

    PubMed Central

    Morandi, Luca; de Biase, Dario; Visani, Michela; Cesari, Valentina; De Maglio, Giovanna; Pizzolitto, Stefano; Pession, Annalisa; Tallini, Giovanni

    2012-01-01

    The use of tyrosine kinase inhibitors (TKIs) requires testing for hot spot mutations in the molecular effectors downstream of the membrane-bound tyrosine kinases, since their wild-type status is expected for response to TKI therapy. We report a novel assay that we have called Allele Specific Locked Nucleic Acid quantitative PCR (ASLNAqPCR). The assay uses LNA-modified allele specific primers and LNA-modified beacon probes to increase sensitivity, specificity and to accurately quantify mutations. We designed primers specific for codon 12/13 KRAS mutations and BRAF V600E, and validated the assay with 300 routine samples from a variety of sources, including cytology specimens. All were analyzed by ASLNAqPCR and Sanger sequencing. Discordant cases were pyrosequenced. ASLNAqPCR correctly identified BRAF and KRAS mutations in all discordant cases and all had a mutated/wild type DNA ratio below the analytical sensitivity of the Sanger method. ASLNAqPCR was 100% specific with greater accuracy, positive and negative predictive values compared with Sanger sequencing. The analytical sensitivity of ASLNAqPCR is 0.1%, allowing quantification of mutated DNA in small neoplastic cell clones. ASLNAqPCR can be performed in any laboratory with real-time PCR equipment, is very cost-effective and can easily be adapted to detect hot spot mutations in other oncogenes. PMID:22558339

  18. Allele specific locked nucleic acid quantitative PCR (ASLNAqPCR): an accurate and cost-effective assay to diagnose and quantify KRAS and BRAF mutation.

    PubMed

    Morandi, Luca; de Biase, Dario; Visani, Michela; Cesari, Valentina; De Maglio, Giovanna; Pizzolitto, Stefano; Pession, Annalisa; Tallini, Giovanni

    2012-01-01

    The use of tyrosine kinase inhibitors (TKIs) requires testing for hot spot mutations in the molecular effectors downstream of the membrane-bound tyrosine kinases, since their wild-type status is expected for response to TKI therapy. We report a novel assay that we have called Allele Specific Locked Nucleic Acid quantitative PCR (ASLNAqPCR). The assay uses LNA-modified allele specific primers and LNA-modified beacon probes to increase sensitivity, specificity and to accurately quantify mutations. We designed primers specific for codon 12/13 KRAS mutations and BRAF V600E, and validated the assay with 300 routine samples from a variety of sources, including cytology specimens. All were analyzed by ASLNAqPCR and Sanger sequencing. Discordant cases were pyrosequenced. ASLNAqPCR correctly identified BRAF and KRAS mutations in all discordant cases and all had a mutated/wild type DNA ratio below the analytical sensitivity of the Sanger method. ASLNAqPCR was 100% specific with greater accuracy, positive and negative predictive values compared with Sanger sequencing. The analytical sensitivity of ASLNAqPCR is 0.1%, allowing quantification of mutated DNA in small neoplastic cell clones. ASLNAqPCR can be performed in any laboratory with real-time PCR equipment, is very cost-effective and can easily be adapted to detect hot spot mutations in other oncogenes.

  19. The Government and Information: Costs, Choices and Challenges.

    ERIC Educational Resources Information Center

    Challman, Laura E.

    This paper examines the involvement of the federal government in information activities and services, and raises questions about the legitimacy and consistency of this involvement. Three major areas of government policy in the information sector are discussed: research and development, the National Technical Information Service (NTIS), and public…

  20. Better Informing Decision Making with Multiple Outcomes Cost-Effectiveness Analysis under Uncertainty in Cost-Disutility Space

    PubMed Central

    McCaffrey, Nikki; Agar, Meera; Harlum, Janeane; Karnon, Jonathon; Currow, David; Eckermann, Simon

    2015-01-01

    Introduction: Comparing multiple, diverse outcomes with cost-effectiveness analysis (CEA) is important, yet challenging in areas like palliative care where domains are unamenable to integration with survival. Generic multi-attribute utility values exclude important domains and non-health outcomes, while partial analyses—where outcomes are considered separately, with their joint relationship under uncertainty ignored—lead to incorrect inference regarding preferred strategies. Objective: The objective of this paper is to consider whether such decision making can be better informed with alternative presentation and summary measures, extending methods previously shown to have advantages in multiple strategy comparison. Methods: Multiple outcomes CEA of a home-based palliative care model (PEACH) relative to usual care is undertaken in cost-disutility (CDU) space and compared with analysis on the cost-effectiveness plane. Summary measures developed for comparing strategies across potential threshold values for multiple outcomes include: expected net loss (ENL) planes quantifying differences in expected net benefit; the ENL contour identifying preferred strategies minimising ENL and their expected value of perfect information; and cost-effectiveness acceptability planes showing probability of strategies minimising ENL. Results: Conventional analysis suggests PEACH is cost-effective when the threshold value per additional day at home (λ1) exceeds $1,068, or dominated by usual care when only the proportion of home deaths is considered. In contrast, neither alternative dominates in CDU space where cost and outcomes are jointly considered, with the optimal strategy depending on threshold values. For example, PEACH minimises ENL when λ1 = $2,000 and λ2 = $2,000 (threshold value for dying at home), with a 51.6% chance of PEACH being cost-effective. Conclusion: Comparison in CDU space and associated summary measures have distinct advantages over partial multiple-domain comparisons, aiding

  1. Seek and ye shall find: consumer search for objective health care cost and quality information.

    PubMed

    Sick, Brian; Abraham, Jean M

    2011-01-01

    Significant investments have been made in developing and disseminating health care provider cost and quality information on the Internet with the expectation that stronger consumer engagement will lead consumers to seek providers who deliver high-quality, low-cost care. However, prior research shows that the awareness and use of such information is low. This study investigates how the information search process may contribute to explaining this result. The analysis reveals that the Web sites most likely to be found by consumers are owned by private companies and provide information based on anecdotal patient experiences. Web sites less likely to be found have government or community-based ownership, are based on administrative data, and contain a mixture of quality, cost, and patient experience information. Searches for information on hospitals reveal more cost and quality information based on administrative data, whereas searches that focus on clinics or physicians are more likely to produce information based on patient narratives.

  2. 77 FR 4997 - Proposed Information Collection; Comment Request; Alaska Individual Fishing Quota Cost Recovery...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-01

    ... National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; Alaska Individual Fishing Quota Cost Recovery Program Requirements AGENCY: National Oceanic and Atmospheric... to take this opportunity to comment on proposed and/or continuing information collections,...

  3. Digital Avionics Information System Preliminary Life-Cycle-Cost Analysis. Final Report (November 1974-May 1975).

    ERIC Educational Resources Information Center

    Pruitt, Gary K.; Dieterly, Duncan L.

    The results of a study to evaluate the potential life-cycle costs and cost savings that could be realized by applying the Digital Avionics Information System (DAIS) concept to future avionic systems were presented. The tasks evaluated included selection of program elements for costing, selection of DAIS installation potential, definition of a…

  4. Thermodynamic cost of computation, algorithmic complexity and the information metric

    NASA Technical Reports Server (NTRS)

    Zurek, W. H.

    1989-01-01

    Algorithmic complexity is discussed as a computational counterpart to the second law of thermodynamics. It is shown that algorithmic complexity, which is a measure of randomness, sets limits on the thermodynamic cost of computations and casts a new light on the limitations of Maxwell's demon. Algorithmic complexity can also be used to define distance between binary strings.

  5. Microfiche for Technical Information Dissemination: A Cost-Benefit Analysis.

    ERIC Educational Resources Information Center

    Teplitz, Arthur

    The paper introduces the concept of Book Fiche, a microfiche form which provides up to 390 pages on a single 4 X 6 inch film sheet, and its application to the reproduction and distribution of technical documents. A cost comparison is made between conventional printing and distribution techniques, and microfiche dissemination utilizing…

  6. Integrated Undergraduate Management Education: An Informal Benefit/Cost Analysis

    ERIC Educational Resources Information Center

    Casey, William L., Jr.

    2005-01-01

    This paper seeks to contribute to the literature of management education by evaluating assessment data on Babson College's integrated undergraduate management core program (IMC). Transitions from functionally isolated curricula to more integrated alternatives involve both benefits and costs, accruing to faculty, students and sponsoring…

  7. 48 CFR 215.403-3 - Requiring information other than cost or pricing data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONTRACTING BY NEGOTIATION Contract Pricing 215.403-3 Requiring information other than cost or pricing data... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Requiring information other than cost or pricing data. 215.403-3 Section 215.403-3 Federal Acquisition Regulations...

  8. Cost-Effectiveness of Management Training in the Informal Sector. Discussion Paper No. 101.

    ERIC Educational Resources Information Center

    Nubler, Irmgard

    A research project in the Ivory Coast, Kenya, and Tanzania evaluated the cost effectiveness of management training seminars for women entrepreneurs in the informal sector. Women, a large and growing part of entrepreneurs, had less access to needed resources, skills, and information than men. Reasons for failure to study the cost effectiveness and…

  9. 75 FR 10455 - Information Collection; Timber Purchaser Cost and Sales Data

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-08

    ...; ] DEPARTMENT OF AGRICULTURE Forest Service Information Collection; Timber Purchaser Cost and Sales Data AGENCY... organizations on the extension of a currently approved information collection, Timber Purchaser Cost and Sales... CONTACT: Richard Fitzgerald, Timber Staff, Forest Management at 202-205-1753. Individuals who use TDD...

  10. Trabectedin in the treatment of metastatic soft tissue sarcoma: cost-effectiveness, cost-utility and value of information

    PubMed Central

    Soini, E. J. O.; García San Andrés, B.; Joensuu, T.

    2011-01-01

    Background: To assess the cost-effectiveness of trabectedin compared with end-stage treatment (EST) after failure with anthracycline and/or ifosfamide in metastatic soft tissue sarcoma (mSTS). Design: Analysis was carried out using a probabilistic Markov model with trabectedin → EST and EST arms, three health states (stable disease, progressive disease and death) and a lifetime perspective (3% annual discount rate). Finnish resources (drugs, mSTS, adverse events and travelling) and costs (year 2008) were used. Efficacy was based on an indirect comparison of the STS-201 and European Organisation for Research and Treatment of Cancer trials. QLQ-C30 scale scores were mapped to 15D, Short Form 6D and EuroQol 5D utilities. The outcome measures were the cost-effectiveness acceptability frontier, incremental cost per life year gained (LYG) and quality-adjusted life year (QALY) gained and the expected value of perfect information (EVPI). Results: Trabectedin → EST was associated with 14.0 (95% confidence interval 9.1–19.2) months longer survival, €36 778 higher costs (€32 816 using hospital price for trabectedin) and €31 590 (€28 192) incremental cost per LYG with an EVPI of €3008 (€3188) compared with EST. With a threshold of €50 000 per LYG, trabectedin → EST had 98.5% (98.2%) probability of being cost-effective. The incremental cost per QALY gained with trabectedin → EST was €42 633–47 735 (€37 992–42 819) compared with EST. The results were relatively insensitive to changes. Conclusion: Trabectedin is a potentially cost-effective treatment of mSTS patients. PMID:20627875
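    The "incremental cost per LYG" reported above is an incremental cost-effectiveness ratio (ICER): the cost difference between strategies divided by their effect difference. A minimal sketch follows, using hypothetical arm costs and effects rather than the trial's discounted figures.

```python
# Minimal incremental cost-effectiveness ratio (ICER) sketch.
# Inputs are hypothetical, not the trial's actual discounted values.
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost per unit of incremental effect (e.g. per LYG or QALY)."""
    d_cost = cost_new - cost_old
    d_effect = effect_new - effect_old
    if d_effect <= 0:
        raise ValueError("new strategy must add effect for an ICER to be meaningful")
    return d_cost / d_effect

# hypothetical arms: new treatment strategy vs end-stage treatment
ratio = icer(cost_new=50_000, cost_old=13_000, effect_new=2.5, effect_old=1.3)
print(f"incremental cost per LYG: {ratio:,.0f}")
```

    The resulting ratio is then compared against a willingness-to-pay threshold (e.g. €50,000 per LYG in the abstract) to judge cost-effectiveness.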

  11. Cost Accounting for Decision Makers.

    ERIC Educational Resources Information Center

    Kaneklides, Ann L.

    1985-01-01

    Underscores the importance of informed decision making through accurate anticipation of cost incurrence in light of changing economic and environmental conditions. Explains the concepts of cost accounting, full allocation of costs, the selection of an allocation base, the allocation of indirect costs, depreciation, and implications for community…

  12. Cost-volume-profit and net present value analysis of health information systems.

    PubMed

    McLean, R A

    1998-08-01

    The adoption of any information system should be justified by an economic analysis demonstrating that its projected benefits outweigh its projected costs. Analysts differ, however, on which methods to employ for such a justification. Accountants prefer cost-volume-profit analysis, and economists prefer net present value analysis. The article explains the strengths and weaknesses of each method and shows how they can be used together so that well-informed investments in information systems can be made.
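    The two methods the abstract contrasts can be sketched side by side: cost-volume-profit analysis finds the break-even volume where contribution margin covers fixed cost, while net present value discounts the projected benefit stream against the up-front outlay. The system price, cash flows, and discount rate below are invented for illustration.

```python
# Side-by-side sketch of the two appraisal methods (hypothetical figures).

def breakeven_volume(fixed_cost, price, variable_cost):
    """Cost-volume-profit: volume at which contribution margin covers fixed cost."""
    return fixed_cost / (price - variable_cost)

def npv(rate, initial_outlay, cash_flows):
    """Net present value: discounted future cash flows minus the initial outlay."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1)) - initial_outlay

# CVP view: how many billable transactions cover the system's fixed cost?
volume = breakeven_volume(fixed_cost=120_000, price=25.0, variable_cost=5.0)

# NPV view: is a 5-year stream of annual benefits worth the investment at 8%?
value = npv(rate=0.08, initial_outlay=120_000, cash_flows=[40_000] * 5)

print(f"break-even volume: {volume:,.0f} transactions")
print(f"NPV at 8%: ${value:,.0f}")
```

    Used together, as the article suggests, the CVP result shows the activity level required to cover costs while the NPV result shows whether the investment creates value over its life.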

  13. A Basis for Time and Cost Evaluation of Information Systems.

    ERIC Educational Resources Information Center

    Korfhage, R. R.; DeLutis, T. G.

    A general model for information storage and retrieval (IS&R) systems is proposed. The system is selected from the set of all available IS&R components. These components define the system's users and data sources; the hardware, software, and personnel performing the actual storage and retrieval activities; and the funder who acts as a filter in the…

  14. Low-cost information distribution - New directions for technology developments

    NASA Technical Reports Server (NTRS)

    Catoe, C. E.

    1978-01-01

    The use of space satellites for data storage and retrieval is discussed with respect to short-term (1978-1985) and long-term (1985-2000) developments. The present structure, where access to satellite-transmitted data is controlled largely by the Federal Government for its own use, will be gradually replaced by a continually expanding user-community with a broadening scope of needs. Technological improvements in satellite-communication and data-processing will drive down the cost of both transmitting and receiving hardware, making such hardware available to more and more people. By the closing years of the century, personal, direct satellite communication should be available to every American household, providing video, printed, and archivable data over a wide range of subjects, from bank statements to medical records.

  15. A Trial of Nursing Cost Accounting using Nursing Practice Data on a Hospital Information System.

    PubMed

    Miyahira, Akiko; Tada, Kazuko; Ishima, Masatoshi; Nagao, Hidenori; Miyamoto, Tadashi; Nakagawa, Yoshiaki; Takemura, Tadamasa

    2015-01-01

    Hospital administration is very important, and many hospitals carry out activity-based costing under comprehensive medicine. Nursing costs, however, remain unclear: nursing practice is expanding both quantitatively and qualitatively, it is difficult to capture every nursing practice, and nursing costs are in many cases calculated only as a comprehensive aggregate. On the other hand, nursing information systems (NIS) are now implemented in many hospitals in Japan, and nursing practice data are becoming available. In this paper, we propose a nursing cost accounting model and simulate costs by nursing contribution using NIS data. PMID:26262246
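    A minimal sketch of the kind of NIS-driven costing the abstract proposes: per-practice time records are priced at a fully loaded hourly nursing rate to yield a per-patient nursing cost instead of a single comprehensive average. All records, practice names, and rates below are hypothetical.

```python
# Hypothetical NIS-driven nursing cost allocation (not the paper's model).
HOURLY_NURSE_COST = 38.0  # fully loaded cost per nursing hour, invented

# NIS-style practice log: (patient_id, practice, minutes)
nis_records = [
    ("P01", "medication administration", 20),
    ("P01", "wound care", 35),
    ("P02", "patient education", 15),
    ("P02", "vital signs monitoring", 10),
]

def nursing_cost_per_patient(records, hourly_rate):
    """Sum recorded practice minutes per patient and price them at the hourly rate."""
    minutes_by_patient = {}
    for patient, _practice, minutes in records:
        minutes_by_patient[patient] = minutes_by_patient.get(patient, 0) + minutes
    return {p: m / 60 * hourly_rate for p, m in minutes_by_patient.items()}

print(nursing_cost_per_patient(nis_records, HOURLY_NURSE_COST))
```

    The same log could be grouped by practice rather than patient to show which nursing activities drive cost, which is the kind of contribution analysis the abstract simulates.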

  16. A Trial of Nursing Cost Accounting using Nursing Practice Data on a Hospital Information System.

    PubMed

    Miyahira, Akiko; Tada, Kazuko; Ishima, Masatoshi; Nagao, Hidenori; Miyamoto, Tadashi; Nakagawa, Yoshiaki; Takemura, Tadamasa

    2015-01-01

    Hospital administration is very important, and many hospitals carry out activity-based costing under comprehensive medicine. Nursing costs, however, remain unclear: nursing practice is expanding both quantitatively and qualitatively, it is difficult to capture every nursing practice, and nursing costs are in many cases calculated only as a comprehensive aggregate. On the other hand, nursing information systems (NIS) are now implemented in many hospitals in Japan, and nursing practice data are becoming available. In this paper, we propose a nursing cost accounting model and simulate costs by nursing contribution using NIS data.

  17. Migrating from Informal to Formal Consortium — COSTLI Issues

    NASA Astrophysics Data System (ADS)

    Birdie, C.; Patil, Y. M.

    2010-10-01

    There are many models of library consortia which have come into existence due to various reasons and compulsions. FORSA (Forum for Resource Sharing in Astronomy) is an informal consortium born from the links between academic institutions specializing in astronomy in India. FORSA is a cooperative venture initiated by library professionals. Though this consortium was formed mainly for inter-lending activities and bibliographic access, it has matured over the years to adopt the consortium approach on cooperative acquisitions, due to increased requirements.

  18. Judgements about the Value and Cost of Human Factors Information in Design.

    ERIC Educational Resources Information Center

    Burns, Catherine M.; Vicente, Kim J.

    1996-01-01

    Describes an empirical evaluation that investigated the criteria by which designers of human-machine systems evaluate design information. Professional designers of nuclear power plant control rooms rated hypothetical information search questions in terms of relevance, importance, cost, and effort based on Rouse's model of information search…

  19. Waste Management Facilities cost information for mixed low-level waste. Revision 1

    SciTech Connect

    Shropshire, D.; Sherick, M.; Biadgi, C.

    1995-06-01

    This report contains preconceptual designs and planning level life-cycle cost estimates for managing mixed low-level waste. The report's information on treatment, storage, and disposal modules can be integrated to develop total life-cycle costs for various waste management options. A procedure to guide the US Department of Energy and its contractor personnel in the use of cost estimation data is also summarized in this report.

  20. Comparison of Informal Care Time and Costs in Different Age-Related Dementias: A Review

    PubMed Central

    Costa, Nadège; Ferlicoq, Laura; Derumeaux-Burel, Hélène; Rapp, Thomas; Garnault, Valérie; Gillette-Guyonnet, Sophie; Andrieu, Sandrine; Vellas, Bruno; Lamure, Michel; Grand, Alain; Molinier, Laurent

    2013-01-01

    Objectives. Age-related dementia is a progressive degenerative brain syndrome whose prevalence increases with age. Dementias cause a substantial burden on society and on families who provide informal care. This study aims to review the relevant papers to compare informal care time and costs in different dementias. Methods. A bibliographic search was performed on an international medical literature database (MEDLINE). All studies which assessed the social economic burden of different dementias were selected. Informal care time and costs were analyzed in three care settings by disease stages. Results. 21 studies met our criteria. Mean informal care time was 55.73 h per week for Alzheimer disease and 15.8 h per week for Parkinson disease (P = 0.0076), and the associated mean annual informal costs were $17,492 versus $3,284, respectively (P = 0.0393). Conclusion. There is a lack of data on informal care time and costs for dementias other than AD or PD. Overall, AD entails substantially higher informal care costs than PD ($17,492 versus $3,284 per year, respectively). PMID:23509789

  1. 78 FR 37883 - Information Collection Activities: Report of Fuel Cost, Consumption, and Surcharge Revenue

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-24

    ... Surface Transportation Board Information Collection Activities: Report of Fuel Cost, Consumption, and..., Consumption, and Surcharge Revenue. Comments are requested concerning: (1) The accuracy of the Board's burden... Board's request for OMB approval. Description of Collection Title: Report of Fuel Cost, Consumption,...

  2. The neglected topic: presentation of cost information in patient decision aids.

    PubMed

    Blumenthal-Barby, J S; Robinson, Emily; Cantor, Scott B; Naik, Aanand D; Russell, Heidi Voelker; Volk, Robert J

    2015-05-01

    Costs are an important component of patients' decision making, but a comparatively underemphasized aspect of formal shared decision making. We hypothesized that decision aids also avoid discussion of costs, despite their being tools designed to facilitate shared decision making about patient-centered outcomes. We sought to define the frequency of cost-related information and identify the common modes of presenting cost and cost-related information in the 290 decision aids catalogued in the Ottawa Hospital Research Institute's Decision Aid Library Inventory (DALI) system. We found that 56% (n = 161) of the decision aids mentioned cost in some way, but only 13% (n = 37) gave a specific price or range of prices. We identified 9 different ways in which cost was mentioned. The most common approach was as a "pro" of one of the treatment options (e.g., "you avoid the cost of medication"). Of the 37 decision aids that gave specific prices or ranges of prices for treatment options, only 2 were about surgery decisions despite the fact that surgery decision aids were the most common. Our findings suggest that presentation of cost information in decision aids is highly variable. Evidence-based guidelines should be developed by the International Patient Decision Aid Standards (IPDAS) Collaboration. PMID:25583552

  3. The neglected topic: presentation of cost information in patient decision aids.

    PubMed

    Blumenthal-Barby, J S; Robinson, Emily; Cantor, Scott B; Naik, Aanand D; Russell, Heidi Voelker; Volk, Robert J

    2015-05-01

    Costs are an important component of patients' decision making, but a comparatively underemphasized aspect of formal shared decision making. We hypothesized that decision aids also avoid discussion of costs, despite their being tools designed to facilitate shared decision making about patient-centered outcomes. We sought to define the frequency of cost-related information and identify the common modes of presenting cost and cost-related information in the 290 decision aids catalogued in the Ottawa Hospital Research Institute's Decision Aid Library Inventory (DALI) system. We found that 56% (n = 161) of the decision aids mentioned cost in some way, but only 13% (n = 37) gave a specific price or range of prices. We identified 9 different ways in which cost was mentioned. The most common approach was as a "pro" of one of the treatment options (e.g., "you avoid the cost of medication"). Of the 37 decision aids that gave specific prices or ranges of prices for treatment options, only 2 were about surgery decisions despite the fact that surgery decision aids were the most common. Our findings suggest that presentation of cost information in decision aids is highly variable. Evidence-based guidelines should be developed by the International Patient Decision Aid Standards (IPDAS) Collaboration.

  4. Guide to documenting and managing cost and performance information for remediation projects. Revised version

    SciTech Connect

    1998-10-01

    This document summarizes the recommended procedures for reporting costs and performance of remediation projects. Chapter 2 focuses on costs, while Chapter 3 focuses on performance. Both chapters include examples on how to use the recommended formats. Chapter 4 identifies factors that affect cost or performance, and Chapter 5 presents information about specific reporting formats. A discussion about the Roundtable web site strategy is provided in Chapter 6. Appendices provide additional information related to that presented in Chapters 2 through 6. Appendix A discusses the effects of matrix characteristics and operating parameters on cost or performance, while measurement procedures for those parameters are shown in Appendix B. A recommended format for preparing case study abstracts is provided in Appendix C, and a generic format for full case studies is shown in Appendix D. Appendices E and F list the active members of the Ad Hoc Work Group on Cost and Performance, and the members of the Federal Remediation Technologies Roundtable, respectively.

  5. Relationship between patient dependence and direct medical-, social-, indirect-, and informal-care costs in Spain

    PubMed Central

    Darbà, Josep; Kaskens, Lisette

    2015-01-01

    Objective: The objectives of this analysis were to examine how patients’ dependence on others relates to costs of care and explore the incremental effects of patient dependence measured by the Dependence Scale on costs for patients with Alzheimer’s disease (AD) in Spain. Methods: The Co-Dependence in Alzheimer’s Disease study is a multicenter, cross-sectional, observational study conducted in 18 centers among patients with AD according to the clinical dementia rating score and their caregivers in Spain. This study also gathered data on resource utilization for medical care, social care, caregiver productivity losses, and informal caregiver time reported in the Resource Utilization in Dementia Lite instrument and a complementary questionnaire. The data of 343 patients and their caregivers were collected through the completion of a clinical report form during one visit/assessment at an outpatient center or hospital, where all instruments were administered. The data collected (in addition to clinical measures) also included sociodemographic data concerning the patients and their caregivers. Cost analysis was based on this resource-use data. Resource unit costs were applied to value direct medical-, social-, and indirect-care costs. A replacement cost method was used to value informal care. Patient dependence on others was measured using the Dependence Scale, and the Cumulative Index Rating Scale was administered to the patient to assess multi-morbidity. Multivariate regression analysis was used to model the effects of dependence and other sociodemographic and clinical variables on cost of care. Results: The mean (standard deviation) costs per patient over 6 months for direct medical-, social-, indirect-, and informal-care costs were estimated at €1,028.10 (€1,655.00), €843.80 (€2,684.80), €464.20 (
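    The replacement cost method named above values informal caregiver hours at the wage of a professional substitute. A minimal sketch, with hypothetical hours and wage (not the study's Spanish unit costs):

```python
# Replacement-cost valuation of informal care (hypothetical inputs).
def replacement_cost(care_hours_per_week, replacement_wage, weeks=26):
    """Value informal care over a period at a professional carer's hourly wage."""
    return care_hours_per_week * weeks * replacement_wage

# e.g. 30 h/week of informal care over 6 months (26 weeks) at a EUR 12/h substitute wage
cost_6_months = replacement_cost(care_hours_per_week=30, replacement_wage=12.0)
print(f"informal care cost over 6 months: EUR {cost_6_months:,.2f}")
```

    The main design choice is the wage used: a replacement (professional substitute) wage, as here, typically values the same hours higher than an opportunity-cost wage based on the caregiver's own forgone earnings.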

  6. The role of cognitive switching in head-up displays. [to determine pilot ability to accurately extract information from either of two sources

    NASA Technical Reports Server (NTRS)

    Fischer, E.

    1979-01-01

    The pilot's ability to accurately extract information from either one or both of two superimposed sources of information was determined. Static, aerial, color 35 mm slides of external runway environments and slides of corresponding static head-up display (HUD) symbology were used as the sources. A three channel tachistoscope was utilized to show either the HUD alone, the scene alone, or the two slides superimposed. Cognitive performance of the pilots was assessed by determining the percentage of correct answers given to two HUD related questions, two scene related questions, or one HUD and one scene related question.

  7. Financial and Cost Management for Libraries and Information Services. Second Edition.

    ERIC Educational Resources Information Center

    Roberts, Stephen A.

    This book highlights the importance of good financial and cost management in libraries and information centers to make best use of available resources, and illustrates how to maintain user services. The book applies current research and theory of financial management techniques to situations faced in libraries and information services. Eight…

  8. The Relationship between Return on Profitability and Costs of Outsourcing Information Technology Technical Support

    ERIC Educational Resources Information Center

    Odion, Segun

    2011-01-01

    The purpose of this quantitative correlational research study was to examine the relationship between costs of operation and total return on profitability of outsourcing information technology technical support in a two-year period of outsourcing operations. United States of America list of Fortune 1000 companies' chief information officers…

  9. 77 FR 75163 - Federal Acquisition Regulation; Information Collection; Contract Funding-Limitation of Costs/Funds

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-19

    ... Regulation; Information Collection; Contract Funding--Limitation of Costs/Funds AGENCIES: Department of... (NASA). ACTION: Notice of request for public comments regarding an extension of an existing OMB...: Submit comments identified by Information Collection 9000- 0074, Contract Funding--Limitation of...

  10. Multiplying probe for accurate power measurements on an RF driven atmospheric pressure plasma jet applied to the COST reference microplasma jet

    NASA Astrophysics Data System (ADS)

    Beijer, P. A. C.; Sobota, A.; van Veldhuizen, E. M.; Kroesen, G. M. W.

    2016-03-01

    In this paper a new multiplying probe for measuring the power dissipated in a miniature capacitively coupled, RF driven, atmospheric pressure plasma jet (μAPPJ—COST Reference Microplasma Jet—COST RMJ) is presented. The approach aims for substantially higher accuracy than provided by traditionally applied methods using bi-directional power meters or commercially available voltage and current probes in conjunction with digitizing oscilloscopes. The probe is placed on a miniature PCB and designed to minimize losses, influence of unknown elements, crosstalk and variations in temperature. The probe is designed to measure powers of the order of magnitude of 0.1-10 W. It is estimated that it measures power with less than 2% deviation from the real value in the tested power range. The design was applied to measure power dissipated in COST-RMJ running in helium with a small addition of oxygen.

  11. Cost reduction and minimization of land based on an accurate determination of fault current distribution in shield wires and grounding systems

    SciTech Connect

    Daily, W.K.; Dawalibi, F.

    1993-01-01

    Careful analyses of fault current distribution in neutral metallic paths, power system protection requirements, and ground potential rise (GPR) were carried out at FPL's Lauderdale Power Plant and associated switchyard. These studies yielded substantial cost savings and minimized land use for the power system expansions at the Lauderdale Plant by confirming that in-situ expansion and reconfiguration, aimed at constructing two electrically independent substations sharing the same site and grounding system, is a sound economical alternative to building a new substation with its significant site preparation and construction costs. This paper describes the methodology used to conduct this study.

  12. Estimating costs of traffic crashes and crime: tools for informed decision making.

    PubMed

    Streff, F M; Molnar, L J; Cohen, M A; Miller, T R; Rossman, S B

    1992-01-01

    Traffic crashes and crime both impose significant economic and social burdens through injury and loss of life, as well as property damage and loss. Efforts to reduce crashes and crime often result in competing demands on limited public resources. Comparable and up-to-date cost data on crashes and crime contribute to informed decisions about allocation of these resources in important ways. As a first step, cost data provide information about the magnitude of the problems of crashes and crime by allowing us to estimate associated dollar losses to society. More importantly, cost data on crashes and crime are essential to evaluating costs and benefits of various policy alternatives that compete for resources. This paper presents the first comparable comprehensive cost estimates for crashes and crime and applies them to crash and crime incidence data for Michigan to generate dollar losses for the state. An example illustrates how cost estimates can be used to evaluate costs and benefits of crash-reduction and crime-reduction policies in making resource allocation decisions. Traffic crash and selected index crime incidence data from the calendar year 1988 were obtained from the Michigan State Police. Costs for crashes and index crimes were generated and applied to incidence data to estimate dollar losses from crashes and index crimes for the state of Michigan. In 1988, index crimes in Michigan resulted in $0.8 billion in monetary costs and $2.4 billion in total monetary and nonmonetary quality-of-life costs (using the willingness-to-pay approach). Traffic crashes in Michigan resulted in $2.3 billion in monetary costs and $7.1 billion in total monetary and nonmonetary quality-of-life costs, nearly three times the costs of index crimes. Based on dollar losses to the state, the magnitude of the problem of traffic crashes clearly exceeded that of index crimes in Michigan in 1988. From a policy perspective, summing the total dollar losses from crashes or crime is of less
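
The "nearly three times" comparison quoted in the abstract can be checked with simple arithmetic, using the dollar figures stated there (illustrative only):

```python
# Reproducing the Michigan 1988 dollar-loss comparison from the figures
# quoted in the abstract (monetary + nonmonetary quality-of-life costs).
crime_total = 2.4e9   # index crimes, total costs
crash_total = 7.1e9   # traffic crashes, total costs

ratio = crash_total / crime_total
print(f"crash-to-crime cost ratio: {ratio:.2f}")  # prints 2.96, "nearly three times"
```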

  13. Assignment of Calibration Information to Deeper Phylogenetic Nodes is More Effective in Obtaining Precise and Accurate Divergence Time Estimates.

    PubMed

    Mello, Beatriz; Schrago, Carlos G

    2014-01-01

    Divergence time estimation has become an essential tool for understanding macroevolutionary events. Molecular dating aims to obtain reliable inferences, which, within a statistical framework, means jointly increasing the accuracy and precision of estimates. Bayesian dating methods exhibit the property of a linear relationship between uncertainty and estimated divergence dates. This relationship occurs even if the number of sites approaches infinity and places a limit on the maximum precision of node ages. However, how the placement of calibration information may affect the precision of divergence time estimates remains an open question. In this study, relying on simulated and empirical data, we investigated how the location of calibration within a phylogeny affects the accuracy and precision of time estimates. We found that calibration priors set at median and deep phylogenetic nodes were associated with higher precision values compared to analyses involving calibration at the shallowest node. The results were independent of the tree symmetry. An empirical mammalian dataset produced results that were consistent with those generated by the simulated sequences. Assigning time information to the deeper nodes of a tree is crucial to guarantee the accuracy and precision of divergence times. This finding highlights the importance of the appropriate choice of outgroups in molecular dating. PMID:24855333

  14. Interim report: Waste management facilities cost information for mixed low-level waste

    SciTech Connect

    Feizollahi, F.; Shropshire, D.

    1994-03-01

    This report contains preconceptual designs and planning-level life-cycle cost estimates for treating alpha and nonalpha mixed low-level radioactive waste. It covers twenty-seven treatment, storage, and disposal modules that can be integrated to develop total life-cycle costs for various waste management options, and it also summarizes a procedure to guide the US Department of Energy and its contractor personnel in the use of the estimating data.

  15. Short-Term Medical Costs of a VHA Health Information Exchange: A CHEERS-Compliant Article

    PubMed Central

    French, Dustin D.; Dixon, Brian E.; Perkins, Susan M.; Myers, Laura J.; Weiner, Michael; Zillich, Allan J.; Haggstrom, David A.

    2016-01-01

    Abstract The Virtual Lifetime Electronic Record (VLER) Health program provides the Veterans Health Administration (VHA) a framework whereby VHA providers can access the veterans’ electronic health record information to coordinate healthcare across multiple sites of care. As an early adopter of VLER, the Indianapolis VHA and Regenstrief Institute implemented a regional demonstration program involving bi-directional health information exchange (HIE) between VHA and non-VHA providers. The aim of the study is to determine whether implementation of VLER HIE reduces 1 year VHA medical costs. A cohort evaluation with a concurrent control group compared VHA healthcare costs using propensity score adjustment. A CHEERs compliant checklist was used to conduct the cost evaluation. Patients were enrolled in the VLER program onsite at the Indianapolis VHA in outpatient clinics or through the release-of-information office. VHA cost data (in 2014 dollars) were obtained for both enrolled and nonenrolled (control) patients for 1 year prior to, and 1 year after, the index date of patient enrollment. There were 6104 patients enrolled in VLER and 45,700 patients in the control group. The annual adjusted total cost difference per patient was associated with a higher cost for VLER enrollees $1152 (95% CI: $807–1433) (P < 0.01) (in 2014 dollars) than VLER nonenrollees. Short-term evaluation of this demonstration project did not show immediate reductions in healthcare cost as might be expected if HIE decreased redundant medical tests and treatments. Cost reductions from shared health information may be realized with longer time horizons. PMID:26765453

  16. Considerations in developing geographic informations systems based on low-cost digital image processing

    NASA Technical Reports Server (NTRS)

    Henderson, F. M.; Dobson, M. W.

    1981-01-01

    The potential of digital image processing systems costing $20,000 or less for geographic information systems is assessed with the emphasis on the volume of data to be handled, the commercial hardware systems available, and the basic software for: (1) data entry, conversion and digitization; (2) georeferencing and geometric correction; (3) data structuring; (4) editing and updating; (5) analysis and retrieval; (6) output drivers; and (7) data management. Costs must also be considered as tangible and intangible factors.


  18. The Lunar Laser Ranging Experiment: Accurate ranges have given a large improvement in the lunar orbit and new selenophysical information.

    PubMed

    Bender, P L; Currie, D G; Poultney, S K; Alley, C O; Dicke, R H; Wilkinson, D T; Eckhardt, D H; Faller, J E; Kaula, W M; Mulholland, J D; Plotkin, H H; Silverberg, E C; Williams, J G

    1973-10-19

    The lunar ranging measurements now being made at the McDonald Observatory have an accuracy of 1 nsec in round-trip travel time. This corresponds to 15 cm in the one-way distance. The use of lasers with pulse-lengths of less than 1 nsec is expected to give an accuracy of 2 to 3 cm in the next few years. A new station is under construction in Hawaii, and additional stations in other countries are either in operation or under development. It is hoped that these stations will form the basis for a worldwide network to determine polar motion and earth rotation on a regular basis, and will assist in providing information about movement of the tectonic plates making up the earth's surface. Several mobile lunar ranging stations with telescopes having diameters of 1.0 m or less could, in the future, greatly extend the information obtainable about motions within and between the tectonic plates. The data obtained so far by the McDonald Observatory have been used to generate a new lunar ephemeris based on direct numerical integration of the equations of motion for the moon and planets. With this ephemeris, the range to the three Apollo retro-reflectors can be fit to an accuracy of 5 m by adjusting the differences in moments of inertia of the moon about its principal axes, the selenocentric coordinates of the reflectors, and the McDonald longitude. The accuracy of fitting the results is limited currently by errors of the order of an arc second in the angular orientation of the moon, as derived from the best available theory of how the moon rotates in response to the torques acting on it. Both a new calculation of the moon's orientation as a function of time based on direct numerical integration of the torque equations and a new analytic theory of the moon's orientation are expected to be available soon, and to improve considerably the accuracy of fitting the data. The accuracy already achieved routinely in lunar laser ranging represents a hundredfold improvement over any


  20. Subjective sense of memory strength and the objective amount of information accurately remembered are related to distinct neural correlates at encoding.

    PubMed

    Qin, Shaozheng; van Marle, Hein J F; Hermans, Erno J; Fernández, Guillén

    2011-06-15

    Although commonly used, the term memory strength is not well defined in humans. Besides durability, it has been conceptualized by retrieval characteristics, such as subjective confidence associated with retrieval, or objectively, by the amount of information accurately retrieved. Behaviorally, these measures are not necessarily correlated, indicating that distinct neural processes may underlie them. Thus, we aimed at disentangling neural activity at encoding associated with either a subsequent subjective sense of memory strength or with a subsequent objective amount of information remembered. Using functional magnetic resonance imaging (fMRI), participants were scanned while incidentally encoding a series of photographs of complex scenes. The next day, they underwent two memory tests, quantifying memory strength either subjectively (confidence on remembering the gist of a scene) or objectively (the number of details accurately remembered within a scene). Correlations between these measurements were mutually partialed out in subsequent memory analyses of fMRI data. Results revealed that activation in left ventral lateral prefrontal cortex and temporoparietal junction predicted subsequent confidence ratings. In contrast, parahippocampal and hippocampal activity predicted the number of details remembered. Our findings suggest that memory strength may reflect a functionally heterogeneous set of (at least two) phenomena. One phenomenon appears related to prefrontal and temporoparietal top-down modulations, resulting in the subjective sense of memory strength that is potentially based on gist memory. The other phenomenon is likely related to medial-temporal binding processes, determining the amount of information accurately encoded into memory. Thus, our study dissociated two distinct phenomena that are usually described as memory strength.

  1. Exploring the costs and benefits of social information use: an appraisal of current experimental evidence

    PubMed Central

    Rieucau, Guillaume; Giraldeau, Luc-Alain

    2011-01-01

    Research on social learning has focused traditionally on whether animals possess the cognitive ability to learn novel motor patterns from tutors. More recently, social learning has included the use of others as sources of inadvertent social information. This type of social learning seems more taxonomically widespread and its use can more readily be approached as an economic decision. Social sampling information, however, can be tricky to use and calls for a more lucid appraisal of its costs. In this four-part review, we address these costs. Firstly, we address the possibility that only a fraction of group members are actually providing social information at any one time. Secondly, we review experimental research which shows that animals are circumspect about social information use. Thirdly, we consider the cases where social information can lead to incorrect decisions and finally, we review studies investigating the effect of social information quality. We address the possibility that using social information or not is not a binary decision and present results of a study showing that nutmeg mannikins combine both sources of information, a condition that can lead to the establishment of informational cascades. We discuss the importance of empirically investigating the economics of social information use. PMID:21357217

  2. Compact and cost-effective temperature-insensitive bio-sensor based on long-period fiber gratings for accurate detection of E. coli bacteria in water.

    PubMed

    Dandapat, Krishnendu; Tripathi, Saurabh Mani; Chinifooroshan, Yasser; Bock, Wojtek J; Mikulic, Predrag

    2016-09-15

    We propose and demonstrate a novel temperature-insensitive bio-sensor for accurate and quantitative detection of Escherichia coli (E. coli) bacteria in water. Surface sensitivity is maximized by operating the long-period fiber grating (LPFG) closest to its turnaround wavelength, and the temperature insensitivity is achieved by selectively exciting a pair of cladding modes with opposite dispersion characteristics. Our sensor shows a nominal temperature sensitivity of ∼1.25 pm/°C, which can be further reduced by properly adjusting the LPFG lengths, while maintaining a high refractive index sensitivity of 1929 nm/RIU. The overall length of the sensor is ∼3.6 cm, making it ideally suitable for bio-sensing applications. As an example, we also show the sensor's capability for reliable, quantitative detection of E. coli bacteria in water over a temperature fluctuation of room temperature to 40°C. PMID:27628356

  3. 20 CFR 663.540 - What kind of performance and cost information is required for determinations of subsequent...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... program costs (such as tuition and fees) for WIA participants in the program. (b) Governors may require... 20 Employees' Benefits 4 2012-04-01 2012-04-01 false What kind of performance and cost information... performance and cost information is required for determinations of subsequent eligibility? (a)...

  4. 20 CFR 663.540 - What kind of performance and cost information is required for determinations of subsequent...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... program costs (such as tuition and fees) for WIA participants in the program. (b) Governors may require... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false What kind of performance and cost information... and cost information is required for determinations of subsequent eligibility? (a) Eligible...

  5. 20 CFR 663.540 - What kind of performance and cost information is required for determinations of subsequent...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... program costs (such as tuition and fees) for WIA participants in the program. (b) Governors may require... 20 Employees' Benefits 4 2014-04-01 2014-04-01 false What kind of performance and cost information... performance and cost information is required for determinations of subsequent eligibility? (a)...

  6. A study of the relative effectiveness and cost of computerized information retrieval in the interactive mode

    NASA Technical Reports Server (NTRS)

    Smetana, F. O.; Furniss, M. A.; Potter, T. R.

    1974-01-01

    Results of a number of experiments to illuminate the relative effectiveness and costs of computerized information retrieval in the interactive mode are reported. It was found that for equal time spent in preparing the search strategy, the batch and interactive modes gave approximately equal recall and relevance. The interactive mode however encourages the searcher to devote more time to the task and therefore usually yields improved output. Engineering costs as a result are higher in this mode. Estimates of associated hardware costs also indicate that operation in this mode is more expensive. Skilled RECON users like the rapid feedback and additional features offered by this mode if they are not constrained by considerations of cost.

  7. 78 FR 70282 - Proposed Information Collection; Comment Request; Cost-Earnings Surveys of Hawaii and American...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-25

    ... National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; Cost-Earnings Surveys of Hawaii and American Samoa Small Boat-Based Fisheries AGENCY: National Oceanic and Atmospheric Administration (NOAA), Commerce. ACTION: Notice. SUMMARY: The Department of Commerce, as part...

  8. Digital Avionics Information System (DAIS): Impact of DAIS Concept on Life Cycle Cost. Final Report.

    ERIC Educational Resources Information Center

    Goclowski, John C.; And Others

    Designed to identify and quantify the potential impacts of the Digital Avionics Information System (DAIS) on weapon system personnel requirements and life cycle cost (LCC), this study postulated a typical close-air-support (CAS) mission avionics suite to serve as a basis for comparing present day and DAIS configuration specifications. The purpose…

  9. An Accurate and Fault-Tolerant Target Positioning System for Buildings Using Laser Rangefinders and Low-Cost MEMS-Based MARG Sensors.

    PubMed

    Zhao, Lin; Guan, Dongxue; Landry, René; Cheng, Jianhua; Sydorenko, Kostyantyn

    2015-01-01

    Target positioning systems based on MEMS gyros and laser rangefinders (LRs) have extensive prospects due to their advantages of low cost, small size and easy realization. The target positioning accuracy is mainly determined by the LR's attitude derived by the gyros. However, the attitude error is large due to the inherent noises from isolated MEMS gyros. In this paper, both accelerometer/magnetometer and LR attitude aiding systems are introduced to aid MEMS gyros. A no-reset Federated Kalman Filter (FKF) is employed, which consists of two local Kalman Filters (KF) and a Master Filter (MF). The local KFs are designed by using the Direction Cosine Matrix (DCM)-based dynamic equations and the measurements from the two aiding systems. The KFs can estimate the attitude simultaneously to limit the attitude errors resulting from the gyros. Then, the MF fuses the redundant attitude estimates to yield globally optimal estimates. Simulation and experimental results demonstrate that the FKF-based system can improve the target positioning accuracy effectively and allow for good fault-tolerant capability.
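
The core of the no-reset federated architecture described above is the master filter's information-weighted fusion of the local estimates. A minimal sketch of that fusion step (scalar case; the function name and variables are illustrative, not from the paper):

```python
# Minimal sketch of the master-filter fusion step in a federated Kalman
# filter: each local KF supplies an estimate x_i and its error variance
# P_i, and the master filter combines them with inverse-variance
# (information) weighting to yield the globally optimal estimate.

def federated_fuse(estimates, variances):
    """Fuse local estimates by inverse-variance weighting.

    Returns (fused_estimate, fused_variance)."""
    info = [1.0 / p for p in variances]   # information of each local KF
    total_info = sum(info)
    fused = sum(w * x for w, x in zip(info, estimates)) / total_info
    return fused, 1.0 / total_info

# Example: a noisier accelerometer/magnetometer aid and a cleaner LR aid.
x, p = federated_fuse([2.0, 1.0], [0.5, 0.25])
# The lower-variance estimate dominates: x = 4/3, p = 1/6
```

Because the fused variance is always smaller than each local variance, the master filter limits the attitude drift that any isolated MEMS gyro would accumulate.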


  11. Metabolic cost of neuronal information in an empirical stimulus-response model.

    PubMed

    Kostal, Lubomir; Lansky, Petr; McDonnell, Mark D

    2013-06-01

    The limits on maximum information that can be transferred by single neurons may help us to understand how sensory and other information is being processed in the brain. According to the efficient-coding hypothesis (Barlow, Sensory Communication, MIT Press, Cambridge, 1961), neurons are adapted to the statistical properties of the signals to which they are exposed. In this paper we employ methods of information theory to calculate, both exactly (numerically) and approximately, the ultimate limits on reliable information transmission for an empirical neuronal model. We couple information transfer with the metabolic cost of neuronal activity and determine the optimal information-to-metabolic cost ratios. We find that the optimal input distribution is discrete with only six points of support, both with and without a metabolic constraint. However, we also find that many different input distributions achieve mutual information close to capacity, which implies that the precise structure of the capacity-achieving input is of lesser importance than the value of capacity.
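
Numerical capacity calculations of the kind the abstract describes are typically done with the Blahut-Arimoto iteration. The sketch below applies it to a generic binary symmetric channel purely for illustration; it is not the paper's empirical neuronal model, and all names are our own:

```python
# Minimal Blahut-Arimoto iteration for numerically computing the capacity
# of a discrete memoryless channel. W[x][y] = P(output y | input x).
import math

def blahut_arimoto(W, iters=200):
    """Return (capacity_bits, capacity-achieving input distribution)."""
    m, n = len(W), len(W[0])
    p = [1.0 / m] * m                      # start from the uniform input
    for _ in range(iters):
        # r[x] = exp( sum_y W[x][y] * log q(x|y) ), the BA re-weighting,
        # where q(x|y) = p(x) W[x][y] / sum_k p(k) W[k][y].
        r = []
        for x in range(m):
            s = 0.0
            for y in range(n):
                if W[x][y] > 0:
                    qy = sum(p[k] * W[k][y] for k in range(m))
                    s += W[x][y] * math.log(p[x] * W[x][y] / qy)
            r.append(math.exp(s))
        z = sum(r)
        p = [v / z for v in r]
    # mutual information of the final input distribution, in bits
    C = 0.0
    for x in range(m):
        for y in range(n):
            if W[x][y] > 0 and p[x] > 0:
                qy = sum(p[k] * W[k][y] for k in range(m))
                C += p[x] * W[x][y] * math.log2(W[x][y] / qy)
    return C, p

# Binary symmetric channel, crossover 0.1: capacity = 1 - H2(0.1), about 0.531 bits
C, p = blahut_arimoto([[0.9, 0.1], [0.1, 0.9]])
```

Coupling this objective with a per-input metabolic cost term, as the authors do, would turn the quantity being maximized into an information-to-cost ratio rather than raw mutual information.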

  12. Using a small/low cost computer in an information center

    NASA Technical Reports Server (NTRS)

    Wilde, D. U.

    1972-01-01

    Small/low cost computers are available with I/O capacities that make them suitable for SDI and retrospective searching on any of the many commercially available data bases. A small two-tape computer system is assumed, and an analysis of its run-time equations leads to a three-step search procedure. Run times and costs are shown as a function of file size, number of search terms, and input transmission rates. Actual examples verify that it is economically feasible for an information center to consider its own small, dedicated computer system.

  13. The Spatial Dynamics of Predators and the Benefits and Costs of Sharing Information

    PubMed Central

    2016-01-01

    Predators of all kinds, be they lions hunting in the Serengeti or fishermen searching for their catch, display various collective strategies. A common strategy is to share information about the location of prey. However, depending on the spatial characteristics and mobility of predators and prey, information sharing can either improve or hinder individual success. Here, our goal is to investigate the interacting effects of space and information sharing on predation efficiency, represented by the expected rate at which prey are found and consumed. We derive a feeding functional response that accounts for both spatio-temporal heterogeneity and communication, and validate this mathematical analysis with a computational agent-based model. This agent-based model has an explicit yet minimal representation of space, as well as information sharing about the location of prey. The analytical model simplifies predator behavior into a few discrete states and one essential trade-off, between the individual benefit of acquiring information and the cost of creating spatial and temporal correlation between predators. Despite the absence of an explicit spatial dimension in these equations, they quantitatively predict the predator consumption rates measured in the agent-based simulations across the explored parameter space. Together, the mathematical analysis and agent-based simulations identify the conditions for when there is a benefit to sharing information, and also when there is a cost. PMID:27764098

  14. Argon Cluster Sputtering Source for ToF-SIMS Depth Profiling of Insulating Materials: High Sputter Rate and Accurate Interfacial Information.

    PubMed

    Wang, Zhaoying; Liu, Bingwen; Zhao, Evan W; Jin, Ke; Du, Yingge; Neeway, James J; Ryan, Joseph V; Hu, Dehong; Zhang, Kelvin H L; Hong, Mina; Le Guernic, Solenne; Thevuthasan, Suntharampilai; Wang, Fuyi; Zhu, Zihua

    2015-08-01

    The use of an argon cluster ion sputtering source has been demonstrated to perform superiorly relative to traditional oxygen and cesium ion sputtering sources for ToF-SIMS depth profiling of insulating materials. The superior performance has been attributed to effective alleviation of surface charging. A simulated nuclear waste glass (SON68) and layered hole-perovskite oxide thin films were selected as model systems because of their fundamental and practical significance. Our results show that high sputter rates and accurate interfacial information can be achieved simultaneously for argon cluster sputtering, whereas this is not the case for cesium and oxygen sputtering. Therefore, the implementation of an argon cluster sputtering source can significantly improve the analysis efficiency of insulating materials and, thus, can expand its applications to the study of glass corrosion, perovskite oxide thin film characterization, and many other systems of interest.

  15. DRG production costs: one more dimension in management information for South Australia hospitals.

    PubMed

    Hindle, D; Scutery, J

    1988-01-01

    This paper summarises and discusses a report of a recent study undertaken by the South Australian Health Commission (SAHC) aimed at the measurement of the production costs of diagnosis related groups. Data were generated on an experimental basis for the seven major public hospitals in the State. The methodology was judged to be feasible, and data outputs of great potential value. It has therefore been decided that the costing process should become a routine component of the hospital management information system. Work is proceeding to extend the process to other sites. It is anticipated that production costs data of these types will make significant contributions to clinical program management, facilities planning, and hospital budgeting decisions in the future.

  16. Waste management facilities cost information for transportation of radioactive and hazardous materials

    SciTech Connect

    Feizollahi, F.; Shropshire, D.; Burton, D.

    1995-06-01

    This report contains cost information on the U.S. Department of Energy (DOE) Complex waste streams that will be addressed by DOE in the programmatic environmental impact statement (PEIS) project. It describes the results of the task commissioned by DOE to develop cost information for transportation of radioactive and hazardous waste. It contains transportation costs for most types of DOE waste streams: low-level waste (LLW), mixed low-level waste (MLLW), alpha LLW and alpha MLLW, Greater-Than-Class C (GTCC) LLW and DOE equivalent waste, transuranic (TRU) waste, spent nuclear fuel (SNF), and hazardous waste. Unit rates for transportation of contact-handled (<200 mrem/hr contact dose) and remote-handled (>200 mrem/hr contact dose) radioactive waste are estimated. Land transportation of radioactive and hazardous waste is subject to regulations promulgated by DOE, the U.S. Department of Transportation (DOT), the U.S. Nuclear Regulatory Commission (NRC), and state and local agencies. The cost estimates in this report assume compliance with applicable regulations.

  17. Interruptions improve choice performance in gray jays: prolonged information processing versus minimization of costly errors.

    PubMed

    Waite, Thomas A

    2002-12-01

    Under the assumption that selection favors minimization of costly errors, erroneous choice may be common when its fitness cost is low. According to an adaptive-choice model, this cost depends on the rate at which an animal encounters the choice: the higher this rate, the smaller the cost of choosing a less valuable option. Errors should thus be more common when interruptions to foraging are shorter. A previous experiment supported this prediction: gray jays, Perisoreus canadensis, were more error prone when subjected to shorter delays to access to food rewards. This pattern, though, is also predicted by an attentional-constraints model. Because the subjects were able to inspect the rewards during delays, their improved performance when subjected to longer delays could have been a byproduct of the experimentally prolonged opportunity for information processing. To evaluate this possibility, a follow-up experiment manipulated both delay to access and whether rewards could be inspected during delays. Depriving jays of the opportunity to inspect rewards (using opaque lids) induced only a small, nonsignificant increase in error rate. This effect was independent of length of delay and so the jays' improved performance when subjected to longer delays was not simply a byproduct of prolonged information processing. More definitively, even when the jays were prevented from inspecting rewards during delays, their performance improved when subjected to longer delays. The findings are thus consistent with the adaptive-choice model.

  18. Estimating costs of programme services and products using information provided in standard financial statements.

    PubMed Central

    Ellwein, L. B.; Thulasiraj, R. D.; Boulter, A. R.; Dhittal, S. P.

    1998-01-01

    The financial viability of programme services and product offerings requires that revenue exceeds expenses. Revenue includes payments for services and products as well as donor cash and in-kind contributions. Expenses reflect consumption of purchased or contributed time and materials and utilization (depreciation) of physical plant facilities and equipment. Standard financial reports contain this revenue and expense information, complemented when necessary by valuation and accounting of in-kind contributions. Since financial statements are prepared using consistent and accepted accounting practices, year-to-year and organization-to-organization comparisons can be made. The use of such financial information is illustrated in this article by determining the unit cost of cataract surgery in two hospitals in Nepal. The proportion of unit cost attributed to personnel, medical supplies, administrative materials, and depreciation varied significantly by institution. These variations are accounted for by examining differences in operational structure and capacity utilization. PMID:9868836
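
The unit-cost arithmetic the article applies to cataract surgery can be sketched as follows; the expense categories mirror those named above, but every figure is hypothetical, not the Nepal hospitals' data.

```python
# Unit cost of a service from standard financial-statement expense lines.
# All figures are hypothetical illustrations.

def unit_cost(expenses, n_units):
    """Total expenses (including depreciation and valued in-kind
    contributions) divided by the number of service units delivered."""
    return sum(expenses.values()) / n_units

expenses = {
    "personnel": 120_000.0,
    "medical_supplies": 45_000.0,
    "administrative_materials": 15_000.0,
    "depreciation": 20_000.0,   # utilization of plant and equipment
}
surgeries = 4_000

cost = unit_cost(expenses, surgeries)
print(f"unit cost per surgery: {cost:.2f}")   # 50.00

# Share of unit cost by category, the basis for the kind of
# institution-to-institution comparison the article describes.
total = sum(expenses.values())
shares = {k: v / total for k, v in expenses.items()}
print(shares)
```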

  19. Cost comparison of Transcatheter and Operative Pulmonary Valve Replacement (from the Pediatric Health Information Systems Database).

    PubMed

    O'Byrne, Michael L; Gillespie, Matthew J; Shinohara, Russell T; Dori, Yoav; Rome, Jonathan J; Glatz, Andrew C

    2016-01-01

    Outcomes for transcatheter pulmonary valve replacement (TC-PVR) and operative pulmonary valve replacement (S-PVR) are excellent. Thus, their respective cost is a relevant clinical outcome. We performed a retrospective cohort study of children and adults who underwent PVR at age ≥ 8 years from January 1, 2011, to December 31, 2013, at 35 centers contributing data to the Pediatric Health Information Systems database to address this question. A propensity score-adjusted multivariable analysis was performed to adjust for known confounders. Secondary analyses of department-level charges, risk of re-admission, and associated costs were performed. A total of 2,108 PVR procedures were performed in 2,096 subjects (14% transcatheter and 86% operative). The observed cost of S-PVR and TC-PVR was not significantly different (2013 US $50,030 vs 2013 US $51,297; p = 0.85). In multivariate analysis, total costs of S-PVR and TC-PVR were not significantly different (p = 0.52). Length of stay was shorter after TC-PVR (p <0.0001). Clinical and supply charges were greater for TC-PVR (p <0.0001), whereas laboratory, pharmacy, and other charges (all p <0.0001) were greater for S-PVR. Risks of both 7- and 30-day readmission were not significantly different. In conclusion, short-term costs of TC-PVR and S-PVR are not significantly different after adjustment. PMID:26552510

  20. Energy information systems (EIS): Technology costs, benefit, and best practice uses

    SciTech Connect

    Granderson, Jessica; Lin, Guanjing; Piette, Mary Ann

    2013-11-26

    Energy information systems are the web-based software, data acquisition hardware, and communication systems used to store, analyze, and display building energy data. They often include analysis methods such as baselining, benchmarking, load profiling, and energy anomaly detection. This report documents a large-scale assessment of energy information system (EIS) uses, costs, and energy benefits, based on a series of focused case study investigations that are synthesized into generalizable findings. The overall objective is to provide organizational decision makers with the information they need to make informed choices as to whether or not to invest in an EIS--a promising technology that can enable up to 20 percent site energy savings, quick payback, and persistent low-energy performance when implemented as part of best-practice energy management programs.

  1. Know Thy Neighbor: Costly Information Can Hurt Cooperation in Dynamic Networks

    PubMed Central

    Antonioni, Alberto; Cacault, Maria Paula; Lalive, Rafael; Tomassini, Marco

    2014-01-01

    People need to rely on cooperation with other individuals in many aspects of everyday life, such as teamwork and economic exchange in anonymous markets. We study whether and how the ability to make or break links in social networks fosters cooperation, paying particular attention to whether information on an individual's actions is freely available to potential partners. Studying the role of information is relevant as information on other people's actions is often not available for free: a recruiting firm may need to call a job candidate's references, a bank may need to find out about the credit history of a new client, etc. We find that people cooperate almost fully when information on their actions is freely available to their potential partners. Cooperation is less likely, however, if people have to pay about half of what they gain from cooperating with a cooperator. Cooperation declines even further if people have to pay a cost that is almost equivalent to the gain from cooperating with a cooperator. Thus, costly information on potential neighbors' actions can undermine the incentive to cooperate in fluid networks. PMID:25356905

  2. Informing disinvestment through cost-effectiveness modelling: is lack of data a surmountable barrier?

    PubMed

    Karnon, Jonathan; Carlton, Jill; Czoski-Murray, Carolyn; Smith, Kevin

    2009-01-01

    The mandatory nature of recommendations made by the National Institute for Health and Clinical Excellence (NICE) in the UK has highlighted inherent difficulties in the process of disinvestment in existing technologies to fund NICE-approved technologies. A lack of evidence on candidate technologies means that the process of disinvestment is subject to greater uncertainty than the investment process, and inefficiencies may occur as a result of the inverse evidence law. This article describes a potential disinvestment scenario and the options for the decision maker, including the conduct of value of information analyses. To illustrate the scenario, an economic evaluation of a disinvestment candidate (screening for amblyopia and strabismus) is presented. Only very limited data were available. The reference case analysis found that screening is not cost effective at currently accepted values of a QALY. However, a small utility decrement due to unilateral vision loss reduced the incremental cost per QALY gained, with screening expected to be extremely cost effective. The discussion highlights the specific options to be considered by decision makers in light of the model-based evaluation. It is shown that the evaluation provides useful information to guide the disinvestment decision, providing a range of focused options with respect to the decision and the decision-making process. A combination of explicit model-based evaluation, and pragmatic and generalizable approaches to interpreting uncertainty in the decision-making process is proposed, which should enable informed decisions around the disinvestment of technologies with weak evidence bases.
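
The incremental cost-effectiveness calculation underlying an evaluation like the screening example can be sketched as follows; all figures are hypothetical illustrations, not the article's data.

```python
# Incremental cost-effectiveness ratio (ICER): incremental cost per
# QALY gained from a new programme. All figures are hypothetical.

def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Screening vs. no screening (hypothetical per-child values)
value = icer(cost_new=250.0, cost_old=100.0,
             qaly_new=20.005, qaly_old=20.000)
print(f"ICER: {value:,.0f} per QALY gained")

# Compare against a willingness-to-pay threshold; note that a small
# change in the assumed utility decrement moves the QALY gain, and
# hence the verdict, which is the sensitivity the article highlights.
threshold = 20_000
print("cost effective" if value <= threshold else "not cost effective")
```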

  3. 39 CFR 3050.30 - Information needed to estimate the cost of the universal service obligation. [Reserved

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    39 CFR § 3050.30, Postal Regulatory Commission, Periodic Reporting: Information needed to estimate the cost of the universal service obligation. [Reserved]

  4. IrisPlex: a sensitive DNA tool for accurate prediction of blue and brown eye colour in the absence of ancestry information.

    PubMed

    Walsh, Susan; Liu, Fan; Ballantyne, Kaye N; van Oven, Mannis; Lao, Oscar; Kayser, Manfred

    2011-06-01

    A new era of 'DNA intelligence' is arriving in forensic biology, due to the impending ability to predict externally visible characteristics (EVCs) from biological material such as those found at crime scenes. EVC prediction from forensic samples, or from body parts, is expected to help concentrate police investigations towards finding unknown individuals, at times when conventional DNA profiling fails to provide informative leads. Here we present a robust and sensitive tool, termed IrisPlex, for the accurate prediction of blue and brown eye colour from DNA in future forensic applications. We used the six currently most eye colour-informative single nucleotide polymorphisms (SNPs) that previously revealed prevalence-adjusted prediction accuracies of over 90% for blue and brown eye colour in 6168 Dutch Europeans. The single multiplex assay, based on SNaPshot chemistry and capillary electrophoresis, both widely used in forensic laboratories, displays high levels of genotyping sensitivity with complete profiles generated from as little as 31 pg of DNA, approximately six human diploid cell equivalents. We also present a prediction model to correctly classify an individual's eye colour, via probability estimation solely based on DNA data, and illustrate the accuracy of the developed prediction test on 40 individuals from various geographic origins. Moreover, we obtained insights into the worldwide allele distribution of these six SNPs using the HGDP-CEPH samples of 51 populations. Eye colour prediction analyses from HGDP-CEPH samples provide evidence that the test and model presented here perform reliably without prior ancestry information, although future worldwide genotype and phenotype data shall confirm this notion. As our IrisPlex eye colour prediction test is capable of immediate implementation in forensic casework, it represents one of the first steps forward in the creation of a fully individualised EVC prediction system for future use in forensic DNA intelligence.

  5. Information from Pharmaceutical Companies and the Quality, Quantity, and Cost of Physicians' Prescribing: A Systematic Review

    PubMed Central

    Spurling, Geoffrey K.; Mansfield, Peter R.; Montgomery, Brett D.; Lexchin, Joel; Doust, Jenny; Othman, Noordin; Vitry, Agnes I.

    2010-01-01

    Background Pharmaceutical companies spent $57.5 billion on pharmaceutical promotion in the United States in 2004. The industry claims that promotion provides scientific and educational information to physicians. While some evidence indicates that promotion may adversely influence prescribing, physicians hold a wide range of views about pharmaceutical promotion. The objective of this review is to examine the relationship between exposure to information from pharmaceutical companies and the quality, quantity, and cost of physicians' prescribing. Methods and Findings We searched for studies of physicians with prescribing rights who were exposed to information from pharmaceutical companies (promotional or otherwise). Exposures included pharmaceutical sales representative visits, journal advertisements, attendance at pharmaceutical sponsored meetings, mailed information, prescribing software, and participation in sponsored clinical trials. The outcomes measured were quality, quantity, and cost of physicians' prescribing. We searched Medline (1966 to February 2008), International Pharmaceutical Abstracts (1970 to February 2008), Embase (1997 to February 2008), Current Contents (2001 to 2008), and Central (The Cochrane Library Issue 3, 2007) using the search terms developed with an expert librarian. Additionally, we reviewed reference lists and contacted experts and pharmaceutical companies for information. Randomized and observational studies evaluating information from pharmaceutical companies and measures of physicians' prescribing were independently appraised for methodological quality by two authors. Studies were excluded where insufficient study information precluded appraisal. The full text of 255 articles was retrieved from electronic databases (7,185 studies) and other sources (138 studies). Articles were then excluded because they did not fulfil inclusion criteria (179) or quality appraisal criteria (18), leaving 58 included studies with 87 distinct analyses

  6. The Effects of Health Information Technology on the Costs and Quality of Medical Care

    PubMed Central

    Agha, Leila

    2015-01-01

    Information technology has been linked to productivity growth in a wide variety of sectors, and health information technology (HIT) is a leading example of an innovation with the potential to transform industry-wide productivity. This paper analyzes the impact of health information technology (HIT) on the quality and intensity of medical care. Using Medicare claims data from 1998-2005, I estimate the effects of early investment in HIT by exploiting variation in hospitals’ adoption statuses over time, analyzing 2.5 million inpatient admissions across 3900 hospitals. HIT is associated with a 1.3 percent increase in billed charges (p-value: 5.6%), and there is no evidence of cost savings even five years after adoption. Additionally, HIT adoption appears to have little impact on the quality of care, measured by patient mortality, adverse drug events, and readmission rates. PMID:24463141

  7. Allocating health care: cost-utility analysis, informed democratic decision making, or the veil of ignorance?

    PubMed

    Goold, S D

    1996-01-01

    Assuming that rationing health care is unavoidable, and that it requires moral reasoning, how should we allocate limited health care resources? This question is difficult because our pluralistic, liberal society has no consensus on a conception of distributive justice. In this article I focus on an alternative: Who shall decide how to ration health care, and how shall this be done to respect autonomy, pluralism, liberalism, and fairness? I explore three processes for making rationing decisions: cost-utility analysis, informed democratic decision making, and applications of the veil of ignorance. I evaluate these processes as examples of procedural justice, assuming that there is no outcome considered the most just. I use consent as a criterion to judge competing processes so that rationing decisions are, to some extent, self-imposed. I also examine the processes' feasibility in our current health care system. Cost-utility analysis does not meet criteria for actual or presumed consent, even if costs and health-related utility could be measured perfectly. Existing structures of government cannot creditably assimilate the information required for sound rationing decisions, and grassroots efforts are not representative. Applications of the veil of ignorance are more useful for identifying principles relevant to health care rationing than for making concrete rationing decisions. I outline a process of decision making, specifically for health care, that relies on substantive, selected representation, respects pluralism, liberalism, and deliberative democracy, and could be implemented at the community or organizational level.

  8. Effective cost modeling for service line planning.

    PubMed

    Scott, Michael; Stephen, Robert

    2010-05-01

    Healthcare executives have struggled to have accurate, timely information about cost and resources to model and monitor service line performance. Process-based cost modeling has been used successfully in other industries, but is relatively new in health care. Understanding costs and resources at process and patient levels can make the difference between a service line having a positive or negative margin. PMID:20446426

  9. A Cost-Effectiveness Tool for Informing Policies on Zika Virus Control

    PubMed Central

    Tamagnan, Jules A.; Medlock, Jan; Ndeffo-Mbah, Martial L.; Fish, Durland; Ávila-Agüero, María L.; Marín, Rodrigo; Ko, Albert I.; Galvani, Alison P.

    2016-01-01

    in real-time within our user-friendly tool to provide updated estimates on cost-effectiveness of interventions and inform policy decisions in country-specific settings. PMID:27205899

  10. A better approach to cost estimation.

    PubMed

    Richmond, Russ

    2013-03-01

    Using ratios of costs to charges (RCCs) to estimate costs can cause hospitals to significantly over- or under-invest in service lines. A focus on improving cost estimation in cost centers where physicians have significant control over operating expenses, such as drugs or implants, can strengthen decision making and strategic planning. Connecting patient file information to purchasing data can lead to more accurate reflections of actual costs and help hospitals gain better visibility across service lines.
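
The contrast between an RCC-based estimate and item-level actual costs can be illustrated with a short sketch; the department totals and implant costs below are hypothetical.

```python
# Ratio-of-cost-to-charges (RCC) estimate vs. item-level actual cost,
# illustrating how RCC can misstate costs where physicians control
# expensive supplies such as implants. All figures are hypothetical.

def rcc_estimate(charge, dept_costs, dept_charges):
    """Estimated cost = patient charge x department-wide cost-to-charge ratio."""
    return charge * (dept_costs / dept_charges)

# Department totals (hypothetical): $8M costs on $20M charges -> RCC = 0.40
dept_costs, dept_charges = 8_000_000, 20_000_000

# Two cases with identical charges but different actual implant costs
# (the actuals would come from linking patient files to purchasing data).
charge = 50_000
actual_cost_cheap_implant = 14_000
actual_cost_premium_implant = 26_000

est = rcc_estimate(charge, dept_costs, dept_charges)
print(f"RCC estimate: {est:.0f}")          # same figure for both cases
print(f"cheap implant error:   {est - actual_cost_cheap_implant:+.0f}")
print(f"premium implant error: {est - actual_cost_premium_implant:+.0f}")
```

Because both cases carry the same charge, RCC assigns them the same cost, over-estimating one service line and under-estimating the other.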

  11. Magnetic dipolar coupling and collective effects for binary information codification in cost-effective logic devices

    NASA Astrophysics Data System (ADS)

    Chiolerio, Alessandro; Allia, Paolo; Graziano, Mariagrazia

    2012-09-01

    Physical limitations foreshadow the eventual end to traditional Complementary Metal Oxide Semiconductor (CMOS) scaling. Therefore, interest has turned to various materials and technologies aimed at succeeding traditional CMOS. Magnetic Quantum dot Cellular Automata (MQCA) are one of these technologies. Working MQCA arrays require very complex techniques and excellent control over the geometry of the nanomagnets and the quality of the magnetic thin film, thus limiting the possibility that MQCA represent a definite solution to the demand for cost-effective, high-density and low-power-consumption devices. Counter-intuitively, by moving towards bigger sizes and lighter technologies it is still possible to develop multi-state logic devices, as we demonstrate, whose main advantage is cost-effectiveness. Applications may be seen in low-cost logic devices where integration and computational power are not the main issue, eventually using flexible substrates and taking advantage of the intrinsic mechanical toughness of systems where long-range interactions do not need wiring. We realized cobalt micrometric MQCA arrays by means of Electron Beam Lithography, exploiting cost-effective processes such as lift-off and RF sputtering that are usually avoided due to their low control over array geometry and film roughness. Information on the magnetic configuration of MQCA elements, including their possible magnetic interactions, was obtained from Magnetic Force Microscope (MFM) images, enhanced by means of a numerical procedure and presented in differential maps. We report the existence of bi-stable magnetic patterns, as detected by MFM while sampling the z-component of the magnetic induction field, arising from dipolar inter-element magnetostatic coupling, able to store and propagate binary information. This is achieved despite the array quality and element magnetic state, which are low and multi-domain, respectively. We discuss in detail shape, inter-element spacing and dot profile

  12. Effective information channels for reducing costs of environmentally- friendly technologies: evidence from residential PV markets

    NASA Astrophysics Data System (ADS)

    Rai, Varun; Robinson, Scott A.

    2013-03-01

    Realizing the environmental benefits of solar photovoltaics (PV) will require reducing costs associated with perception, informational gaps and technological uncertainties. To identify opportunities to decrease costs associated with residential PV adoption, in this letter we use multivariate regression models to analyze a unique, household-level dataset of PV adopters in Texas (USA) to systematically quantify the effect of different information channels on aspiring PV adopters’ decision-making. We find that the length of the decision period depends on the business model, such as whether the system was bought or leased, and on special opportunities to learn, such as the influence of other PV owners in the neighborhood. This influence accrues passively through merely witnessing PV systems in the neighborhood, increasing confidence and motivation, as well as actively through peer-to-peer communications. Using these insights we propose a new framework to provide public information on PV that could drastically reduce barriers to PV adoption, thereby accelerating its market penetration and environmental benefits. This framework could also serve as a model for other distributed generation technologies.

  13. Case studies of energy information systems and related technology: Operational practices, costs, and benefits

    SciTech Connect

    Motegi, Naoya; Piette, Mary Ann; Kinney, Satkartar; Dewey, Jim

    2003-09-02

    Energy Information Systems (EIS), which can monitor and analyze building energy consumption and related data throughout the Internet, have been increasing in use over the last decade. Though EIS developers describe the capabilities, costs, and benefits of EIS, many of these descriptions are idealized and often insufficient for potential users to evaluate cost, benefit and operational usefulness. LBNL has conducted a series of case studies of existing EIS and related technology installations. This study explored the following questions: (1) How is the EIS used in day-to-day operation? (2) What are the costs and benefits of an EIS? (3) Where do the energy savings come from? This paper reviews the process of these technologies from installation through energy management practice. The study is based on interviews with operators and energy managers who use EIS. Analysis of energy data trended by EIS and utility bills was also conducted to measure the benefit. This paper explores common uses and findings to identify energy savings attributable to EIS, and discusses non-energy benefits as well. This paper also addresses technologies related to EIS that have been demonstrated and evaluated by LBNL.

  14. Costs and effects of a nursing information system in three Dutch hospitals.

    PubMed

    van Gennip, E M; Klaassen-Leil, C C; Stokman, R; van Valkenburg, R K

    1995-01-01

    VISION is an integrated nursing information system developed in the Netherlands. In mid-1992, a technology assessment of this system was started: the VISTA project. Its aim is to assess the costs and effects of VISION in three different types of hospitals: a University Hospital, a General Hospital, and a Psychiatric Hospital. The study was financially supported by the Dutch Ministry of Welfare, Health, and Culture. Each hospital selected an experimental ward at which VISION parts were installed (VISION was not yet complete during the study). Also, control wards were selected at which no VISION parts were installed. A series of two measurements were carried out at the experimental and control wards. The first experiences show that VISION can be used in various applications, in different environments. Although the second measurements were done just a few weeks after the installation of VISION parts, first effects could already be demonstrated. Nurses were enthusiastic and the quality of coordination of care increased. Time savings have not yet been demonstrated, but are expected after a longer, more extensive use of the system. Extrapolation of costs for hospital-wide implementation revealed that the costs of VISION are low compared to similar systems in the US. After reviewing the first results, each hospital decided to continue the implementation of VISION parts. More measurements are planned. PMID:8591462

  15. Risk information in support of cost estimates for the Baseline Environmental Management Report (BEMR). Section 1

    SciTech Connect

    Gelston, G.M.; Jarvis, M.F.; Warren, B.R.; Von Berg, R.

    1995-06-01

    The Pacific Northwest Laboratory (PNL) effort on the overall Baseline Environmental Management Report (BEMR) project consists of four installation-specific work components performed in succession. These components include (1) development of source terms, (2) collection of data and preparation of environmental settings reports, (3) calculation of unit risk factors, and (4) utilization of the unit risk factors in Automated Remedial Action Methodology (ARAM) for computation of target concentrations and cost estimates. This report documents work completed for the Nevada Test Site, Nevada, for components 2 and 3. The product of this phase of the BEMR project is the development of unit factors (i.e., unit transport factors, unit exposure factors, and unit risk factors). Thousands of these unit factors are generated and fill approximately one megabyte of computer information per installation. The final unit risk factors (URF) are transmitted electronically to BEMR-Cost task personnel as input to a computer program (ARAM). Abstracted files and exhibits of the URF information are included in this report. These visual formats are intended to provide a sample of the final task deliverable (the URF files), which can be easily read without a computer.

  16. 47 CFR 64.903 - Cost allocation manuals.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... mid-sized incumbent local exchange carriers is required to file a cost allocation manual describing how it separates regulated from nonregulated costs. The manual shall contain the following information... shall ensure that the information contained in its cost allocation manual is accurate. Carriers...

  17. Digital Avionics Information System (DAIS): Impact of DAIS Concept on Life Cycle Cost--Supplement. Final Report.

    ERIC Educational Resources Information Center

    Goclowski, John C.; And Others

    This supplement to a technical report providing the results of a preliminary investigation of the potential impact of the Digital Avionics Information System (DAIS) concept on system support personnel requirements and life cycle cost (LCC) includes: (1) additional details of the cost comparison of a hypothetical application of a conceptual…

  18. The Perceived Effect of Hidden Costs on the Operational Management of Information Technology Outsourcing: A Qualitative Study

    ERIC Educational Resources Information Center

    Swift, Ian

    2011-01-01

    Information technology (IT) outsourcing is a business trend aimed at reducing costs and enabling companies to concentrate on their core competencies. This qualitative multiple case design research study explored the effects of hidden costs on the operational management of IT outsourcing. The study involved analyzing IT outsourcing agreements as…

  19. A review of non-contact, low-cost physiological information measurement based on photoplethysmographic imaging.

    PubMed

    Liu, He; Wang, Yadong; Wang, Lei

    2012-01-01

    In recent decades, there has been increasing interest in low-cost, non-contact and pervasive methods for measuring physiological information, such as heart rate (HR), respiratory rate, heart rate variability (HRV) and oxyhemoglobin saturation. The conventional methods including wet adhesive Ag/AgCl electrodes for HR and HRV, the capnograph device for respiratory status and pulse oximetry for oxyhemoglobin saturation provide excellent signals but are expensive, troublesome and inconvenient. A method to monitor physiological information based on photoplethysmographic imaging offers a new means for health monitoring. Blood volume can be indirectly assessed in terms of blood velocity, blood flow rate and blood pressure, which, in turn, can reflect changes in physiological parameters. Changes in blood volume can be determined from the spectra of light reflected from or transmitted through body tissues. Images of an area of the skin surface are consecutively captured with the color camera of a computer or smartphone and, by processing and analyzing the light signals, physiological information such as HR, respiratory rate, HRV and oxyhemoglobin saturation can be acquired. In this paper, we review the latest developments in using photoplethysmographic imaging for non-contact health monitoring and discuss the challenges and future directions for this field. PMID:23366332
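
The core signal-processing idea can be sketched briefly: treat the mean intensity of a camera-imaged skin region as a time series and read the heart rate off its dominant cardiac-band frequency. The signal below is synthetic; real pipelines add detrending, region tracking and artifact rejection.

```python
# Heart-rate estimation from a (synthetic) photoplethysmographic
# imaging signal via its frequency spectrum. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
fs = 30.0                        # camera frame rate (Hz)
t = np.arange(0, 20, 1 / fs)     # 20 s of frames
hr_hz = 1.2                      # simulated pulse: 72 beats per minute
signal = (0.02 * np.sin(2 * np.pi * hr_hz * t)
          + 0.005 * rng.standard_normal(t.size))

# Spectrum of the zero-mean signal
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)

# Restrict to a plausible cardiac band (0.7-3.0 Hz, i.e. 42-180 bpm)
band = (freqs >= 0.7) & (freqs <= 3.0)
peak_hz = freqs[band][np.argmax(spectrum[band])]
print(f"estimated heart rate: {peak_hz * 60:.0f} bpm")
```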

  20. Layer-switching cost and optimality in information spreading on multiplex networks

    NASA Astrophysics Data System (ADS)

    Min, Byungjoon; Gwak, Sang-Hwan; Lee, Nanoom; Goh, K.-I.

    2016-02-01

We study a model of information spreading on multiplex networks, in which agents interact through multiple interaction channels (layers), say online vs. offline communication layers, subject to a layer-switching cost for transmissions across different interaction layers. The model is characterized by a layer-wise, path-dependent transmissibility over a contact, which is determined dynamically by both the incoming and outgoing transmission layers. We formulate an analytical framework to deal with such path-dependent transmissibility and demonstrate the nontrivial interplay between multiplexity and spreading dynamics, including optimality. It is shown that the epidemic threshold and prevalence respond to the layer-switching cost non-monotonically and that the optimal conditions can change in abrupt, non-analytic ways, depending also on the densities of the network layers and the type of seed infections. Our results elucidate the essential role of multiplexity, suggesting that its explicit consideration is crucial for realistic modeling and prediction of spreading phenomena on multiplex social networks in an era of ever-diversifying social interaction layers.
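The path-dependent transmissibility described above lends itself to a compact simulation sketch. The sketch below assumes a two-layer multiplex in which a transmission leaving on a different layer than it arrived on pays a multiplicative penalty `c` on its transmissibility; the variable names and the penalty form are our illustration, not necessarily the paper's notation:

```python
import random

def spread(neigh, T, c, seed_node, seed_layer, rng):
    """Bond-percolation-style spread on a two-layer multiplex.

    neigh[l][i] : neighbours of node i in layer l (l = 0, 1)
    T           : baseline transmissibility over a contact
    c           : layer-switching cost factor (0 <= c <= 1); a transmission
                  that leaves on a different layer than it arrived on
                  succeeds with probability T * c instead of T
    Returns the set of ever-infected nodes.
    """
    infected = {seed_node}
    frontier = [(seed_node, seed_layer)]  # (node, layer it was infected on)
    while frontier:
        node, in_layer = frontier.pop()
        for out_layer in (0, 1):
            p = T if out_layer == in_layer else T * c
            for nb in neigh[out_layer][node]:
                if nb not in infected and rng.random() < p:
                    infected.add(nb)
                    frontier.append((nb, out_layer))
    return infected

# Tiny 4-node example: with an infinite switching cost (c = 0) the seed's
# layer confines the outbreak; with no cost (c = 1) it reaches everyone.
layer0 = {0: [1], 1: [0], 2: [3], 3: [2]}
layer1 = {0: [2], 2: [0], 1: [3], 3: [1]}
rng = random.Random(42)
print(sorted(spread([layer0, layer1], 1.0, 0.0, 0, 0, rng)))  # -> [0, 1]
print(sorted(spread([layer0, layer1], 1.0, 1.0, 0, 0, rng)))  # -> [0, 1, 2, 3]
```

Sweeping `c` between these extremes over many runs is how the non-monotonic response of prevalence to the switching cost would show up numerically.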

  1. Layer-switching cost and optimality in information spreading on multiplex networks

    PubMed Central

    Min, Byungjoon; Gwak, Sang-Hwan; Lee, Nanoom; Goh, K. -I.

    2016-01-01

We study a model of information spreading on multiplex networks, in which agents interact through multiple interaction channels (layers), say online vs. offline communication layers, subject to a layer-switching cost for transmissions across different interaction layers. The model is characterized by a layer-wise, path-dependent transmissibility over a contact, which is determined dynamically by both the incoming and outgoing transmission layers. We formulate an analytical framework to deal with such path-dependent transmissibility and demonstrate the nontrivial interplay between multiplexity and spreading dynamics, including optimality. It is shown that the epidemic threshold and prevalence respond to the layer-switching cost non-monotonically and that the optimal conditions can change in abrupt, non-analytic ways, depending also on the densities of the network layers and the type of seed infections. Our results elucidate the essential role of multiplexity, suggesting that its explicit consideration is crucial for realistic modeling and prediction of spreading phenomena on multiplex social networks in an era of ever-diversifying social interaction layers. PMID:26887527

  2. Groundwater Exploration Using Remote Sensing And A Low-Cost Geographical Information System

    NASA Astrophysics Data System (ADS)

    Teeuw, R. M.

    1995-03-01

Now that personal computers (PCs) have become more powerful, portable, and affordable, geoscientists can make full use of developments in computer-aided mapping, particularly Geographical Information Systems (GIS). The IDRISI GIS was used to 1) carry out image processing on satellite images; 2) assess the reliability of the interpreted lineaments; 3) create maps showing individual lineament lengths, areal extent of interconnected lineaments, and targets for groundwater boreholes; and 4) incorporate socio-economic factors, by creating maps that show the proximity of villages to sites considered favourable for boreholes. The exact location of each site for drilling was decided on the basis of geophysical surveys over the areas that had been targeted by the remote sensing and GIS analysis. Most of the remote sensing and GIS work was carried out in Ghana in two weeks, during which the 'ground truth' of the lineament maps was checked. The total cost of the hardware and software used in this project (16-colour laptop PC, portable colour printer, and IDRISI) was slightly less than US$ 2,600. The relatively low cost and ease of use of this system make it a technology that is readily transferable to developing countries.

  3. Cost and results of information systems for health and poverty indicators in the United Republic of Tanzania.

    PubMed Central

    Rommelmann, Vanessa; Setel, Philip W.; Hemed, Yusuf; Angeles, Gustavo; Mponezya, Hamisi; Whiting, David; Boerma, Ties

    2005-01-01

    OBJECTIVE: To examine the costs of complementary information generation activities in a resource-constrained setting and compare the costs and outputs of information subsystems that generate the statistics on poverty, health and survival required for monitoring, evaluation and reporting on health programmes in the United Republic of Tanzania. METHODS: Nine systems used by four government agencies or ministries were assessed. Costs were calculated from budgets and expenditure data made available by information system managers. System coverage, quality assurance and information production were reviewed using questionnaires and interviews. Information production was characterized in terms of 38 key sociodemographic indicators required for national programme monitoring. FINDINGS: In 2002-03 approximately US$ 0.53 was spent per Tanzanian citizen on the nine information subsystems that generated information on 37 of the 38 selected indicators. The census and reporting system for routine health service statistics had the largest participating populations and highest total costs. Nationally representative household surveys and demographic surveillance systems (which are not based on nationally representative samples) produced more than half the indicators and used the most rigorous quality assurance. Five systems produced fewer than 13 indicators and had comparatively high costs per participant. CONCLUSION: Policy-makers and programme planners should be aware of the many trade-offs with respect to system costs, coverage, production, representativeness and quality control when making investment choices for monitoring and evaluation. In future, formal cost-effectiveness studies of complementary information systems would help guide investments in the monitoring, evaluation and planning needed to demonstrate the impact of poverty-reduction and health programmes. PMID:16184275

  4. College Attendance Costs Vary and Result from Higher Tuition, Room, and Board. OPPAGA Information Brief.

    ERIC Educational Resources Information Center

    Florida State Legislature, Tallahassee. Office of Program Policy Analysis and Government Accountability.

    College students must pay for educational expenses and living costs. Educational expenses account for 18 to 25% of the cost of college attendance. The cost of attending college, as calculated by each institution, is used to determine a student's financial need and eligibility for financial aid. The cost of attendance varies among institutions,…

  5. Cost model for biobanks.

    PubMed

    Gonzalez-Sanchez, M Beatriz; Lopez-Valeiras, Ernesto; Morente, Manuel M; Fernández Lago, Orlando

    2013-10-01

Current economic conditions and budget constraints in publicly funded biomedical research have brought about a renewed interest in analyzing the cost and economic viability of research infrastructures. However, there are no proposals for specific cost accounting models for these types of organizations in the international scientific literature. The aim of this paper is to present the basis of a cost analysis model useful for any biobank regardless of the human biological samples that it stores for biomedical research. The development of a unique cost model for biobanks can be a complicated task due to the diversity of the biological samples they store. Different types of samples (DNA, tumor tissues, blood, serum, etc.) require different production processes. Nonetheless, the common basic steps of the production process can be identified. Thus, the costs incurred in each step can be analyzed in detail to provide cost information. Six stages and four cost objects were obtained by taking the production processes of biobanks belonging to the Spanish National Biobank Network as a starting point. Templates and examples are provided to help managers to identify and classify the costs involved in their own biobanks to implement the model. The application of this methodology will provide accurate information on cost objects, along with useful information to give an economic value to the stored samples, to analyze the efficiency of the production process and to evaluate the viability of some sample collections. PMID:24835258
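The stage-based costing the authors describe can be illustrated with a toy activity-based calculation. The stage names and driver rates below are hypothetical placeholders; the paper derives its actual six stages and four cost objects from the production processes of the Spanish National Biobank Network:

```python
# Hypothetical stages and cost-driver rates -- illustrative only.
STAGE_RATES = {           # cost per unit of each stage's activity driver
    "reception":  4.0,    # per sample received
    "processing": 12.0,   # per hour of technician time
    "storage":    0.5,    # per sample-month at -80 C
}

def sample_cost(activity):
    """Activity-based cost of one stored sample.

    activity maps each production stage to the quantity of its
    cost driver consumed by this sample type.
    """
    return sum(STAGE_RATES[stage] * qty for stage, qty in activity.items())

# A serum aliquot: received once, 0.25 h of processing, 12 months of storage
serum = {"reception": 1, "processing": 0.25, "storage": 12}
print(sample_cost(serum))  # -> 13.0
```

Because each sample type consumes the common stages in different quantities, the same rate table prices DNA, tissue, or serum collections consistently, which is what lets the model value heterogeneous stored collections.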

  7. Food Assistance: Financial Information on WIC Nutrition Services and Administrative Costs. United States General Accounting Office Report to Congressional Committees.

    ERIC Educational Resources Information Center

    Robertson, Robert E.

    The Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) is a federally funded nutrition assistance program administered by the Department of Agriculture's (USDA) Food and Nutrition Service (FNS). Responding to Congressional requests for information regarding program costs, this report provides information on: (1) funding…

  8. The Telecommunications Stranglehold on Europe's Information Use: Practical Constraints for SMEs and Economic Assessment Based on Cost Models.

    ERIC Educational Resources Information Center

    Delcroix, Jean-Claude

    There is a general feeling that European telecommunications are delaying the introduction of new information services. This paper responds to some of the questions concerning online information. The views result from research work at DECADE (Belgium) on the requirements of smaller organizations on the one hand and on telecommunications costs on…

  9. Towards a personalized environmental health information service using low-cost sensors and crowdsourcing

    NASA Astrophysics Data System (ADS)

    Castell, Nuria; Liu, Hai-Ying; Schneider, Philipp; Cole-Hunter, Tom; Lahoz, William; Bartonova, Alena

    2015-04-01

Most European cities exceed the air quality guidelines established by the WHO to protect human health. As such, citizens are exposed to potentially harmful pollutant levels. Some cities have services (e.g., web pages, mobile apps, etc.) which provide timely air quality information to the public. However, air quality data at the individual level are currently scarce or non-existent. Making this information directly useful to individuals poses a challenge. For instance, if a user is informed that the air quality is "poor", what does that mean for him/her, and how can this information be acted upon? Despite individuals having a unique relationship with their environment, the information on the state of atmospheric components and related hazards is currently mostly generic, and seldom personally relevant. This undermines citizens' interest in their environment, and consequently limits their ability to recognize and change both their contribution and their exposure to air pollution. In Oslo, two EU-funded projects, CITI-SENSE (Engelken-Jorge et al., 2014) and Citi-Sense-MOB (Castell et al., 2014), are trying to establish a dialogue with citizens by providing them with the possibility of getting personalized air quality information on their smartphones. The users are able to check the air quality in their immediate surroundings and track their individual exposure while moving through the urban environment (Castell et al., 2014). In this way, they may be able to reduce their exposure, for example by changing transport modes or routes, or by selecting less polluted streets to walk or cycle through. Using a smartphone application, citizens are engaged in collecting and sharing environmental data generated by low-cost air quality sensors, and in reporting their individual perception (turning citizens into sensors themselves). The highly spatially resolved data on air quality and perception are geo-located. This allows for simultaneous visualization of both kinds of the sensor…

  10. Investigating the cost-effectiveness of health information technologies: a systematic review protocol

    PubMed Central

    Sheikh, Aziz; Nurmatov, Ulugbek B; Cresswell, Kathrin; Bates, David

    2013-01-01

Introduction There is a need to develop new, more cost-effective models of healthcare and in this vein there is a considerable international interest in exploiting the potential offered by major developments in health information technologies (HITs). Very substantial investments are, as a result, now being made globally, but these still probably only represent a fraction of the investments needed if healthcare is to make the transition from the paper to the digital era. Investing greater resources is, however, inherently challenging and unpopular at a time of financial austerity and this is furthermore complicated by the thus far variable evidence of health benefits and demonstrable short-term to medium-term returns associated with investments in HITs. Objectives Building on our related systematic overviews investigating the impact of HITs, we now seek to estimate the cost-effectiveness of HITs and as a secondary aim to identify potentially transferable lessons in relation to how to realise returns on investments in these technologies. Methods We will conduct a systematic review to identify the empirical evidence base surrounding the return on investments from implementing HITs. Two reviewers will independently search major international databases for published, unpublished and on-going experimental and quasi-experimental studies of interest published during the period 1990–2013. These searches of bibliographic databases will be supplemented by contacting an international panel of experts. There will be no restriction on the language of publication of studies. Studies will be critically appraised using the Critical Appraisal Skills Programme (CASP) Economic Evaluations checklist. In view of the anticipated heterogeneity in intervention investigated, study design and health system contexts, we will undertake a descriptive, narrative and interpretative synthesis of data. Ethics and dissemination Ethical approval is not required. Results These will be presented in…

  11. Waste Management Facilities Cost Information report for Greater-Than-Class C and DOE equivalent special case waste

    SciTech Connect

    Feizollahi, F.; Shropshire, D.

    1993-07-01

    This Waste Management Facility Cost Information (WMFCI) report for Greater-Than-Class C low-level waste (GTCC LLW) and DOE equivalent special case waste contains preconceptual designs and planning level life-cycle cost (PLCC) estimates for treatment, storage, and disposal facilities needed for management of GTCC LLW and DOE equivalent waste. The report contains information on 16 facilities (referred to as cost modules). These facilities are treatment facility front-end and back-end support functions (administration support, and receiving, preparation, and shipping cost modules); seven treatment concepts (incineration, metal melting, shredding/compaction, solidification, vitrification, metal sizing and decontamination, and wet/air oxidation cost modules); two storage concepts (enclosed vault and silo); disposal facility front-end functions (disposal receiving and inspection cost module); and four disposal concepts (shallow-land, engineered shallow-land, intermediate depth, and deep geological cost modules). Data in this report allow the user to develop PLCC estimates for various waste management options. A procedure to guide the U.S. Department of Energy (DOE) and its contractor personnel in the use of estimating data is also included in this report.

  12. Draft Submission; Social Cost of Energy Generation

    SciTech Connect

    1990-01-05

This report is intended to provide a general understanding of the social costs associated with electric power generation. Based on a thorough review of recent literature on the subject, the report describes how these social costs can be most fully and accurately evaluated, and discusses important considerations in applying this information within the competitive bidding process.

  13. Development of Cost Benefit Methodology for Scientific and Technical Information Communication and Application to Information Analysis Centers. Final Report.

    ERIC Educational Resources Information Center

    Mason, Robert M.; And Others

This document presents a research effort intended to improve the economic information available for formulating policies and making decisions related to Information Analysis Centers (IACs) and IAC services. The project used a system of IAC information activities to analyze the functional aspects of IAC services, calculate the present value of net…

  14. Student Awareness of Costs and Benefits of Educational Decisions: Effects of an Information Campaign. CEE DP 139

    ERIC Educational Resources Information Center

    McGuigan, Martin; McNally, Sandra; Wyness, Gill

    2012-01-01

    The economic benefits of staying on in education have been well established. But do students know this? One of the reasons why students might drop out of education too soon is because they are not well informed about the costs and benefits of staying on in education at an appropriate time of their educational career. Indeed, the fact that…

  15. Cost-Effectiveness Analysis. Instructor Guide. Working for Clean Water: An Information Program for Advisory Groups.

    ERIC Educational Resources Information Center

    Buskirk, E. Drannon, Jr.

    Presented is the instructor's manual for a one-hour presentation on cost-effectiveness analysis. Topics covered are the scope of cost-effectiveness analysis, basic assessment procedures, and the role of citizens in the analysis of alternatives. A supplementary audiovisual program is available. These materials are part of the Working for Clean…

  16. 78 FR 4425 - Notice of Proposed Information Collection: Comment Request; Multifamily Contractor's Mortgagor's...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-22

... data to keep Field Office cost data banks and cost estimates current and accurate. HUD-92205A is used to certify the actual costs of acquisition or refinancing... Mortgagor's Cost Breakdowns and Certifications. AGENCY: Office of the Chief Information Officer, HUD....

  17. 32 CFR Appendix D to Part 286 - DD Form 2086-1, “Record of Freedom of Information (FOI) Processing Cost for Technical Data”

    Code of Federal Regulations, 2010 CFR

    2010-07-01

..., “Record of Freedom of Information (FOI) Processing Cost for Technical Data” ... Appendix D to Part 286, National Defense, Department of Defense (Continued), OFFICE OF THE SECRETARY OF DEFENSE (CONTINUED), FREEDOM OF INFORMATION ACT PROGRAM...

  18. Student Information Systems Demystified: The Increasing Demand for Accurate, Timely Data Means Schools and Districts Are Relying Heavily on SIS Technologies

    ERIC Educational Resources Information Center

    McIntire, Todd

    2004-01-01

    Student information systems, one of the first applications of computer technology in education, are undergoing a significant transition yet again. The first major shift in SIS technologies occurred about 15 years ago when they evolved from mainframe programs to client-server solutions. Now, vendors across the board are offering centralized…

  19. 78 FR 77432 - Proposed Information Collection; Comment Request; Bait and Tackle Store Cost-Earnings Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-23

    ... and quantify their operational costs and sales revenues in addition to describing their clientele. As... mail or internet-based surveys, but telephone and personal interviews may be employed to supplement...

  20. 78 FR 40696 - Proposed Information Collection; Comment Request; Alaska Crab Cost Recovery

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-08

    ... Crab Cost Recovery AGENCY: National Oceanic and Atmospheric Administration (NOAA), Commerce. ACTION... and Aleutian Islands (BSAI) Crab includes the Crab Rationalization (CR) Program, a limited access system that allocates BSAI Crab resources among harvesters, processors, and coastal communities....

  1. 14 CFR 151.24 - Procedures: Application; information on estimated project costs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) is not the actual cost or the amount of an award in eminent domain proceedings, the sponsor must so... application, notifies the sponsor that it may, within a stated time, ask in writing for reconsideration of...

  2. Impact of marketing, information system, modularity, and low-cost solution on the implementation of CIM in SMEs

    NASA Astrophysics Data System (ADS)

    Marri, Hussain B.; McGaughey, Ronald; Gunasekaran, Angappa

    2000-10-01

Globalization can have a dramatic impact on the manufacturing sector due to the fact that the majority of establishments in this industry are small to medium manufacturing companies. The role of Small and Medium Enterprises (SMEs) in the national economy has been emphasized all over the world, considering their contribution to total manufacturing output and employment opportunities. The lack of marketing forces to regulate the operation of SMEs has been a fundamental cause of low efficiency for a long time. Computer Integrated Manufacturing (CIM) is emerging as one of the most promising opportunities for shrinking the time delays in information transfer and reducing manufacturing costs. CIM is the architecture for integrating the engineering, marketing and manufacturing functions through information system technologies. SMEs in general have not made full use of new technologies, although their investments in CIM technology tended to be wider in scale and scope. Most SMEs focus only on short-term benefits, and overlook long-term and fundamental development in the application of new technologies. With the help of suitable information systems, modularity and low-cost solutions, SMEs can compete in the global market. Considering the importance of marketing, information systems, modularity and low-cost solutions in the implementation of CIM in SMEs, a model has been developed and studied, with the help of an empirical study conducted with British SMEs, to facilitate the adoption of CIM. Finally, a summary of findings and recommendations is presented.

  3. Prenatal nutrition services: a cost analysis.

    PubMed

    Splett, P L; Caldwell, H M; Holey, E S; Alton, I R

    1987-02-01

    The scarcity of information about program costs in relation to quality care prompted a cost analysis of prenatal nutrition services in two urban settings. This study examined prenatal nutrition services in terms of total costs, per client costs, per visit costs, and cost per successful outcome. Standard cost-accounting principles were used. Outcome measures, based on written quality assurance criteria, were audited using standard procedures. In the studied programs, nutrition services were delivered for a per client cost of $72 in a health department setting and $121 in a hospital-based prenatal care program. Further analysis illustrates that total and per client costs can be misleading and that costs related to successful outcomes are much higher. The three levels of cost analysis reported provide baseline data for quantifying the costs of providing prenatal nutrition services to healthy pregnant women. Cost information from these cost analysis procedures can be used to guide adjustments in service delivery to assure successful outcomes of nutrition care. Accurate cost and outcome data are necessary prerequisites to cost-effectiveness and cost-benefit studies.
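The three levels of cost analysis reported in the study can be made concrete with a small sketch. The $72 per-client figure comes from the abstract; the total cost, visit count, and audited success count below are invented purely so the arithmetic is visible:

```python
def cost_metrics(total_cost, clients, visits, successes):
    """The study's three levels of cost analysis:
    per client, per visit, and per successful outcome."""
    return {
        "per_client":  total_cost / clients,
        "per_visit":   total_cost / visits,
        "per_success": total_cost / successes,
    }

# Illustrative numbers: $36,000 total, 500 clients, 1,500 visits,
# 300 outcomes judged successful against written quality-assurance criteria.
m = cost_metrics(36_000, 500, 1_500, 300)
print(m["per_client"], m["per_visit"], m["per_success"])  # -> 72.0 24.0 120.0
```

The gap between `per_client` and `per_success` is exactly the study's point: averages over all clients understate what each successful outcome actually costs.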

  4. Economic valuation of informal care: lessons from the application of the opportunity costs and proxy good methods.

    PubMed

    van den Berg, Bernard; Brouwer, Werner; van Exel, Job; Koopmanschap, Marc; van den Bos, Geertrudis A M; Rutten, Frans

    2006-02-01

    This paper reports the results of the application of the opportunity costs and proxy good methods to determine a monetary value of informal care. We developed a survey in which we asked informal caregivers in The Netherlands to indicate the different types of time forgone (paid work, unpaid work and leisure) in order to be able to provide care. Moreover, we asked informal caregivers how much time they spent on a list of 16 informal care tasks during the week before the interview. Data were obtained from surveys in two different populations: informal caregivers and their care recipients with stroke and with rheumatoid arthritis (RA). A total of 218 care recipients with stroke and their primary informal caregivers completed a survey as well as 147 caregivers and their care recipients with RA. The measurement of care according to both methods is more problematic compared to the valuation. This is especially the case for the opportunity costs method and for the housework part in the proxy good method. More precise guidelines are necessary for the consistent application of both methods in order to ensure comparability of results and of economic evaluations of health care.
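The two valuation methods compared in the paper reduce to applying different prices to caregiving hours. A hedged sketch, with hours and hourly rates that are purely illustrative:

```python
def opportunity_cost(hours_forgone, wages):
    """Value caregiving time by what the caregiver gave up: each forgone
    hour of paid work, unpaid work or leisure is priced at that
    caregiver's own rate for the forgone activity.
    hours_forgone: {"paid_work": h, "leisure": h, ...}
    wages:         matching {activity: hourly rate} for this caregiver
    """
    return sum(h * wages[k] for k, h in hours_forgone.items())

def proxy_good(task_hours, market_rates):
    """Value caregiving time by what a market substitute would cost: each
    hour spent on a care task is priced at the wage of a professional
    stand-in (e.g. a housekeeper for housework, a nurse for personal care)."""
    return sum(h * market_rates[task] for task, h in task_hours.items())

# Hypothetical week of informal care (rates in euros/hour, illustrative)
forgone = {"paid_work": 8, "leisure": 10}
own_wages = {"paid_work": 20.0, "leisure": 10.0}
tasks = {"housework": 12, "personal_care": 6}
market = {"housework": 11.0, "personal_care": 25.0}
print(opportunity_cost(forgone, own_wages))  # -> 260.0
print(proxy_good(tasks, market))             # -> 282.0
```

The paper's measurement difficulties sit upstream of this arithmetic: eliciting `hours_forgone` reliably (opportunity costs) and separating normal from care-related housework hours (proxy good) is where the methods diverge in practice.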

  5. An Examination of the Explicit Costs of Sensitive Information Security Breaches

    ERIC Educational Resources Information Center

    Toe, Cleophas Adeodat

    2013-01-01

    Data security breaches are categorized as loss of information that is entrusted in an organization by its customers, partners, shareholders, and stakeholders. Data breaches are significant risk factors for companies that store, process, and transmit sensitive personal information. Sensitive information is defined as confidential or proprietary…

  6. Cost-effectiveness of compression technologies for evidence-informed leg ulcer care: results from the Canadian Bandaging Trial

    PubMed Central

    2012-01-01

    Background Venous leg ulcers, affecting approximately 1% of the population, are costly to manage due to poor healing and high recurrence rates. We evaluated an evidence-informed leg ulcer care protocol with two frequently used high compression systems: ‘four-layer bandage’ (4LB) and ‘short-stretch bandage’ (SSB). Methods We conducted a cost-effectiveness analysis using individual patient data from the Canadian Bandaging Trial, a publicly funded, pragmatic, randomized trial evaluating high compression therapy with 4LB (n = 215) and SSB (n = 209) for community care of venous leg ulcers. We estimated costs (in 2009–2010 Canadian dollars) from the societal perspective and used a time horizon corresponding to each trial participant’s first year. Results Relative to SSB, 4LB was associated with an average 15 ulcer-free days gained, although the 95% confidence interval [−32, 21 days] crossed zero, indicating no treatment difference; an average health benefit of 0.009 QALYs gained [−0.019, 0.037] and overall, an average cost increase of $420 [$235, $739] (due to twice as many 4LB bandages used); or equivalently, a cost of $46,667 per QALY gained. If decision makers are willing to pay from $50,000 to $100,000 per QALY, the probability of 4LB being more cost effective increased from 51% to 63%. Conclusions Our findings differ from the emerging clinical and economic evidence that supports high compression therapy with 4LB, and therefore suggest another perspective on high compression practice, namely when delivered by trained registered nurses using an evidence-informed protocol, both 4LB and SSB systems offer comparable effectiveness and value for money. Trial registration ClinicalTrials.gov Identifier: NCT00202267 PMID:23031428
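The headline figure in this abstract is an incremental cost-effectiveness ratio, computable directly from the reported point estimates (an average cost increase of $420 and a gain of 0.009 QALYs for 4LB relative to SSB):

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return delta_cost / delta_qaly

# Trial point estimates: 4LB cost $420 more and gained 0.009 QALYs vs SSB
print(round(icer(420, 0.009)))  # -> 46667
```

This reproduces the abstract's $46,667 per QALY; the trial's uncertainty analysis then asks how often 4LB stays below a $50,000-$100,000 willingness-to-pay threshold across resampled cost and QALY differences.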

  7. Investing in International Information Exchange Activities to Improve the Safety, Cost Effectiveness and Schedule of Cleanup - 13281

    SciTech Connect

    Seed, Ian; James, Paula; Mathieson, John; Judd, Laurie; Elmetti-Ramirez, Rosa; Han, Ana

    2013-07-01

With decreasing budgets and increasing pressure to complete cleanup missions as quickly, safely and cost-effectively as possible, there is significant benefit to be gained from collaboration and joint efforts between organizations facing similar issues. With this in mind, the US Department of Energy (DOE) and the UK Nuclear Decommissioning Authority (NDA) have formally agreed to share information on lessons learned in the development and application of new technologies and approaches to improve the safety, cost effectiveness and schedule of cleaning up legacy wastes. To facilitate information exchange, a range of tools and methodologies was established. These included tacit knowledge exchange through facilitated meetings, conference calls and site visits, as well as explicit knowledge exchange through document sharing and newsletters. A DOE web-based portal has been established to capture these exchanges and add to them via discussion boards. The information exchange operates at the Government-to-Government strategic level as well as at the site-contractor level, addressing both technical and managerial topic areas. This effort has resulted in opening a dialogue and building working relationships. In some areas, joint programs of work have been initiated, saving resources and enabling the parties to leverage one another's activities. The potential benefits of high-quality information exchange are significant, ranging from cost avoidance, through identification of an approach to a problem that has been proven elsewhere, to cost sharing and joint development of a new technology to address a common problem. The benefits in outcomes significantly outweigh the costs of the process. The applicability of the tools and methods, along with the lessons learned regarding some key issues, is of use to any organization that wants to improve value for money. In the waste management marketplace, there are a multitude of challenges being addressed by multiple organizations and…

  8. Examining the Potential of Information Technologies to Improve Cost Control in Community Colleges

    ERIC Educational Resources Information Center

    Sudhakar, Samuel

    2013-01-01

    The challenges facing publicly funded community colleges have never been greater. Declining state and federal support and decreasing property tax revenues have placed tremendous pressure on tuition rates. Declining revenues, combined with the lack of adequate cost control, have caused in-state tuition and fees at public 2-year colleges to increase…

  9. Least-cost groundwater remediation design using uncertain hydrogeological information. 1998 annual progress report

    SciTech Connect

    Pinder, G.F.

    1998-06-01

    The objective of the project is to formulate, test, and evaluate a new approach to the least-cost design of groundwater contamination containment and decontamination systems. The proposed methodology employs robust optimization, the outer-approximation method of non-linear programming, and groundwater flow and transport modeling to find the most cost-effective pump-and-treat design possible given that the physical parameters describing the groundwater reservoir are known only with uncertainty. The result is a methodology that will provide the least-cost groundwater remediation design possible given a specified set of design objectives and physical and sociological constraints. As of the end of the first year of this 3-year project, the author has developed and tested the concept of robust optimization within the framework of least-cost groundwater-contamination-containment design. The outer-approximation method has been employed in this context for the relatively simple linear-constraint case associated with the containment problem. In an effort to enhance the efficiency and applicability of this methodology, a new strategy for selecting the various realizations arising out of the Monte Carlo underpinnings of the robust-optimization technique has been developed and tested. Based upon observations arising out of this work, a yet more promising approach has been discovered. The theoretical foundation for this most recent approach has been, and continues to be, the primary focus of the research.
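    The robust-optimization idea in this record can be sketched in a few lines: size the design for the worst case over Monte Carlo realizations of the uncertain aquifer parameters rather than for the mean. Everything below (the surrogate `required_rate` function, the lognormal conductivity model, and the unit cost) is hypothetical and stands in for a full flow-and-transport simulation; it is not the project's model.

    ```python
    import random

    random.seed(0)

    def required_rate(conductivity):
        # Hypothetical surrogate: higher conductivity demands more pumping
        # to maintain hydraulic containment of the plume.
        return 50.0 * conductivity

    # Monte Carlo realizations of an uncertain conductivity parameter.
    realizations = [random.lognormvariate(0.0, 0.3) for _ in range(200)]

    # Robust design: pump at the worst-case requirement over all realizations,
    # so the containment constraint holds in every realization.
    robust_q = max(required_rate(k) for k in realizations)

    # Non-robust design: size the system for the mean realization only.
    mean_k = sum(realizations) / len(realizations)
    nominal_q = required_rate(mean_k)

    cost_per_unit = 12.0  # hypothetical $ per unit of pumping rate
    print(f"robust rate {robust_q:.1f}, cost ${robust_q * cost_per_unit:.0f}")
    print(f"nominal rate {nominal_q:.1f}, cost ${nominal_q * cost_per_unit:.0f}")
    ```

    The gap between the two costs is the price of reliability; the project's realization-selection strategy aims to get the robust answer without simulating every realization.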

  10. Shaker K(+)-channels are predicted to reduce the metabolic cost of neural information in Drosophila photoreceptors.

    PubMed

    Niven, J E; Vähäsöyrinki, M; Juusola, M

    2003-08-01

    Shaker K(+)-channels are one of several voltage-activated K(+)-channels expressed in Drosophila photoreceptors. We have shown recently that Shaker channels act as selective amplifiers, attenuating some signals while boosting others. Loss of these channels reduces the photoreceptor information capacity (bits s(-1)) and induces compensatory changes in photoreceptors enabling them to minimize the impact of this loss upon coding natural-like stimuli. Energy, as well as coding, is an important consideration in understanding the role of ion channels in neural processing. Here, we use a simple circuit model that incorporates the major ion channels, pumps and exchangers of the photoreceptors to derive experimentally based estimates of the metabolic cost of neural information in wild-type (WT) and Shaker mutant photoreceptors. We show that in WT photoreceptors, which contain Shaker K(+)-channels, each bit of information costs approximately half as many ATP molecules as each bit in Shaker photoreceptors, in which the lack of Shaker K(+)-channels is compensated by an increased leak conductance. Additionally, using a Hodgkin-Huxley-type model coupled to the circuit model, we show that the amount of leak present in both WT and Shaker photoreceptors is optimized to both maximize the available voltage range and minimize the metabolic cost.
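    The paper's efficiency metric is simply ATP turnover divided by information rate. The sketch below illustrates the comparison with invented numbers chosen so that the WT cost per bit comes out at half the mutant's, mirroring the reported result; none of the rates are the paper's data.

    ```python
    # Metabolic efficiency as ATP molecules hydrolysed per bit transmitted:
    # (ATP consumption rate) / (information capacity). All values hypothetical.

    def atp_per_bit(atp_rate_per_s, info_rate_bits_per_s):
        return atp_rate_per_s / info_rate_bits_per_s

    # The Shaker mutant compensates with a larger leak conductance, raising
    # ATP turnover while its information capacity falls.
    wt_cost = atp_per_bit(atp_rate_per_s=1.0e8, info_rate_bits_per_s=500.0)
    shaker_cost = atp_per_bit(atp_rate_per_s=1.6e8, info_rate_bits_per_s=400.0)

    print(f"WT: {wt_cost:.2e} ATP/bit, Shaker: {shaker_cost:.2e} ATP/bit")
    print(f"ratio (WT/Shaker) = {wt_cost / shaker_cost:.2f}")
    ```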

  11. ABC estimation of unit costs for emergency department services.

    PubMed

    Holmes, R L; Schroeder, R E

    1996-04-01

    Rapid evolution of the health care industry forces managers to make cost-effective decisions. Typical hospital cost accounting systems do not provide emergency department managers with the information needed, but emergency department settings are so complex and dynamic as to make the more accurate activity-based costing (ABC) system prohibitively expensive. Through judicious use of the available traditional cost accounting information and simple computer spreadsheets, managers may approximate the decision-guiding information that would result from the much more costly and time-consuming implementation of ABC. PMID:10156656
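    As a rough illustration of the spreadsheet-style ABC approximation described above: overhead is split into activity cost pools, each pool gets a rate per driver unit, and a visit's cost is the sum of the rates for the activities it consumes. All pools, driver volumes, and dollar figures below are hypothetical.

    ```python
    # Hypothetical annual cost of each emergency-department activity pool ($).
    cost_pools = {
        "triage": 120_000,
        "physician_eval": 480_000,
        "discharge": 60_000,
    }
    # Hypothetical annual driver volume for each pool.
    driver_volumes = {
        "triage": 30_000,          # patients triaged
        "physician_eval": 24_000,  # physician encounters
        "discharge": 30_000,       # discharges
    }

    # Rate per driver unit for each activity.
    rates = {k: cost_pools[k] / driver_volumes[k] for k in cost_pools}

    def visit_cost(consumption):
        """Cost of one visit given its activity consumption (driver units)."""
        return sum(rates[a] * units for a, units in consumption.items())

    # A simple visit touching each activity once vs. a complex visit with
    # two physician encounters; aggregate costing would price them the same.
    simple = visit_cost({"triage": 1, "physician_eval": 1, "discharge": 1})
    complex_ = visit_cost({"triage": 1, "physician_eval": 2, "discharge": 1})
    print(f"simple ${simple:.2f}, complex ${complex_:.2f}")
    ```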

  12. Allocating Information Costs in a Negotiated Information Order: Interorganizational Constraints on Decision Making in Norwegian Oil Insurance.

    ERIC Educational Resources Information Center

    Heimer, Carol A.

    1985-01-01

    This paper analyzes two types of decisions for insuring mobile oil rigs and fixed installations in the Norwegian North Sea: (1) decisions about information for ratemaking and underwriting, and (2) decisions about the conditions of insurance. Appended are 46 references. (MLF)

  13. 40 CFR 2.311 - Special rules governing certain information obtained under the Motor Vehicle Information and Cost...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL PUBLIC INFORMATION Confidentiality of Business... calculation of fuel economy for any model type and average fuel economy of a manufacturer under section 503(d..., 15 U.S.C. 2001(9). (6) Model type has the meaning given it in section 501(11) of the Act, 15...

  14. Member satisfaction information as competitive intelligence: a new tool for increasing market share and reducing costs.

    PubMed

    Cooperman, T

    1995-01-01

    MCOs have begun to realize the impact that consumer satisfaction has on enrollment and pricing. Taking a lesson from auto manufacturers, MCOs are now realizing the additional advantages of obtaining consumer satisfaction information about their competitors. Knowing competitors' members' intentions to stay or leave their plans, pinpointing competitors' strengths and weaknesses, and identifying unmet consumer needs allow MCOs to more successfully develop tactics and strategies for sales, marketing, and planning. This article describes the use of member satisfaction information as competitive intelligence, what to look for in this information, and sources for obtaining reliable information.

  16. 76 FR 22410 - Notice of Proposed Information Collection: Comment Request; Mortgagor's Certificate of Actual Cost

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-21

    ... Federal Information Relay Service, (1-800-877-8339). FOR FURTHER INFORMATION CONTACT: Joyce Allen... of 1995 (44 U.S.C. chapter 35, as amended). This Notice is soliciting comments from members of the... approved collection. Authority: The Paperwork Reduction Act of 1995, 44 U.S.C., Chapter 35, as...

  17. Smart Aquifer Characterisation validated using Information Theory and Cost benefit analysis

    NASA Astrophysics Data System (ADS)

    Moore, Catherine

    2016-04-01

    The field data acquisition required to characterise aquifer systems is time consuming and expensive. Decisions regarding field testing, the type of field measurements to make and the spatial and temporal resolution of measurements have significant cost repercussions and impact the accuracy of various predictive simulations. The Smart Aquifer Characterisation (SAC) research programme (New Zealand (NZ)) addresses this issue by assembling and validating a suite of innovative methods for characterising groundwater systems at the large, regional and national scales. The primary outcome is a suite of cost-effective tools and procedures provided to resource managers to advance the understanding and management of groundwater systems and thereby assist decision makers and communities in the management of their groundwater resources, including the setting of land use limits that protect fresh water flows and quality and the ecosystems dependent on that fresh water. The programme has focused on novel investigation approaches, including the use of geophysics, satellite remote sensing, temperature sensing and age dating. The SMART (Save Money And Reduce Time) aspect of the programme emphasises techniques that use these passive, cost-effective data sources to characterise groundwater systems at both the aquifer and the national scale by: • Determination of aquifer hydraulic properties • Determination of aquifer dimensions • Quantification of fluxes between ground waters and surface water • Groundwater age dating These methods allow either a lower cost method for estimating these properties and fluxes, or a greater spatial and temporal coverage for the same cost. To demonstrate the cost effectiveness of the methods a 'data worth' analysis is undertaken. The data worth method involves quantification of the utility of observation data in terms of how much it reduces the uncertainty of model parameters and decision-focussed predictions which depend on these parameters. Such

  19. New frac analysis uses real-time information to lower costs, raise

    SciTech Connect

    Aud, W.W.; Middlebrook, M.L.

    1995-08-01

    Evolving hydraulic fracturing technology during the past 10 years has led to reduced costs, improved production and increased cash flow. This report describes several critical aspects of hydraulic fracturing, including fracture mechanisms and approaches to treatment quality control procedures and stress-profile development, and finally shows how these mechanisms and processes can be addressed and improved through advanced, real-time hydraulic fracture treatment evaluation and execution.

  20. The information revolution reaches pharmaceuticals: balancing innovation incentives, cost, and access in the post-genomics era.

    PubMed

    Rai, A K

    2001-01-01

    Recent developments in genomics--the science that lies at the intersection of information technology and biotechnology--have ushered in a new era of pharmaceutical innovation. Professor Rai advances a theory of pharmaceutical development and allocation that takes account of these recent developments from the perspective of both patent law and health law--that is, from both the production side and the consumption side. She argues that genomics has the potential to make reforms that increase access to prescription drugs not only more necessary as a matter of equity but also more feasible as a matter of innovation policy. On the production end, so long as patent rights in upstream genomics research do not create transaction cost bottlenecks, genomics should, in the not-too-distant future, yield some reduction in drug research and development costs. If these cost reductions are realized, it may be possible to scale back certain features of the pharmaceutical patent regime that cause patent protection for pharmaceuticals to be significantly stronger than patent protection for other innovations. On the consumption side, genomics should make drug therapy even more important in treating illness. This reality, coupled with empirical data revealing that cost and access problems are particularly severe for those individuals who are not able to secure favorable price discrimination through insurance, militates in favor of government subsidies for such insurance. As contrasted with patent buyouts, the approach favored by many patent scholars, subsidies would take account of, and indeed capitalize on, the institutional realities of health care consumption. These subsidies should, however, be linked to insurance regulation that works to channel innovation in a cost-effective direction by requiring coverage of drugs that provide significant benefit relative to their cost.

  1. Laboratory computing--process and information management supporting high-quality, cost-effective healthcare.

    PubMed

    Buffone, G J; Moreau, D R

    1995-09-01

    One currently observes many healthcare institutions rushing to reengineer and install information systems with the expectation of achieving enhanced efficiency, competitiveness, and, it is hoped, higher patient satisfaction resulting from timely, high-quality care. Unfortunately, information system concepts, design, and implementation have not yet addressed the complexity of representing and managing clinical processes. As a result, much of the synergy one might expect to derive from understanding and designing clinical processes to gain efficiency and quality while maintaining humanness is not readily achievable by implementing traditional information systems. In this presentation, with laboratory services as an example, we describe a conceptually different information systems model, which we believe would aid care-givers in their efforts to deliver compassionate, quality care while addressing the highly competitive nature of market-driven healthcare. PMID:7656450

  2. Impact of information cost and switching of trading strategies in an artificial stock market

    NASA Astrophysics Data System (ADS)

    Liu, Yi-Fang; Zhang, Wei; Xu, Chao; Vitting Andersen, Jørgen; Xu, Hai-Chuan

    2014-08-01

    This paper studies the switching of trading strategies and its effect on the market volatility in a continuous double auction market. We describe the behavior when some uninformed agents, who we call switchers, decide whether or not to pay for information before they trade. By paying for the information they behave as informed traders. First we verify that our model is able to reproduce some of the stylized facts in real financial markets. Next we consider the relationship between switching and the market volatility under different structures of investors. We find that there exists a positive relationship between the market volatility and the percentage of switchers. We therefore conclude that the switchers are a destabilizing factor in the market. However, for a given fixed percentage of switchers, the proportion of switchers that decide to buy information at a given moment of time is negatively related to the current market volatility. In other words, if more agents pay for information to know the fundamental value at some time, the market volatility will be lower. This is because the market price is closer to the fundamental value due to information diffusion between switchers.

  3. Accurate monotone cubic interpolation

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1991-01-01

    Monotone piecewise cubic interpolants are simple and effective. They are generally third-order accurate, except near strict local extrema where accuracy degenerates to second-order due to the monotonicity constraint. Algorithms for piecewise cubic interpolants that preserve monotonicity as well as uniform third- and fourth-order accuracy are presented. The gain of accuracy is obtained by relaxing the monotonicity constraint in a geometric framework in which the median function plays a crucial role.
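    For context, here is a minimal sketch of the classic monotone cubic Hermite construction (Fritsch-Carlson-style slope limiting) that the abstract's algorithms improve upon: limiting the node slopes guarantees monotonicity but costs accuracy near extrema. This is the standard baseline, not Huynh's method.

    ```python
    # Monotone cubic Hermite interpolation with harmonic-mean slope limiting.

    def monotone_slopes(x, y):
        n = len(x)
        d = [(y[i + 1] - y[i]) / (x[i + 1] - x[i]) for i in range(n - 1)]
        m = [0.0] * n
        m[0], m[-1] = d[0], d[-1]
        for i in range(1, n - 1):
            if d[i - 1] * d[i] > 0:  # same sign: weighted harmonic mean
                m[i] = 2.0 * d[i - 1] * d[i] / (d[i - 1] + d[i])
            # opposite signs or zero secant: m[i] stays 0 (local extremum)
        return m

    def hermite_eval(x, y, m, t):
        # Locate the interval containing t, then evaluate the cubic Hermite
        # basis functions on the normalized coordinate s in [0, 1].
        i = max(j for j in range(len(x) - 1) if x[j] <= t)
        h = x[i + 1] - x[i]
        s = (t - x[i]) / h
        h00 = (1 + 2 * s) * (1 - s) ** 2
        h10 = s * (1 - s) ** 2
        h01 = s * s * (3 - 2 * s)
        h11 = s * s * (s - 1)
        return h00 * y[i] + h10 * h * m[i] + h01 * y[i + 1] + h11 * h * m[i + 1]

    xs = [0.0, 1.0, 2.0, 3.0]
    ys = [0.0, 0.5, 2.0, 2.1]  # monotone data
    ms = monotone_slopes(xs, ys)
    samples = [hermite_eval(xs, ys, ms, 3.0 * k / 100) for k in range(101)]
    # The limited interpolant is monotone everywhere between the nodes.
    assert all(a <= b + 1e-12 for a, b in zip(samples, samples[1:]))
    ```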

  4. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single step explicit methods, they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high order and high resolution algorithms can produce accurate results after O(10(exp 6)) periods of propagation with eight grid points per wavelength.
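    The payoff of higher-order stencils can be illustrated with ordinary central differences (generic textbook formulas, not the report's specific single-step schemes): halving the grid spacing cuts the second-order error roughly 4x but the fourth-order error roughly 16x.

    ```python
    import math

    # Second- and fourth-order central-difference approximations to f'(x).
    def d1_second(f, x, h):
        return (f(x + h) - f(x - h)) / (2 * h)

    def d1_fourth(f, x, h):
        return (-f(x + 2*h) + 8*f(x + h) - 8*f(x - h) + f(x - 2*h)) / (12 * h)

    x = 1.0
    exact = math.cos(x)  # derivative of sin at x
    for h in (0.1, 0.05):
        e2 = abs(d1_second(math.sin, x, h) - exact)
        e4 = abs(d1_fourth(math.sin, x, h) - exact)
        print(f"h={h}: 2nd-order err {e2:.2e}, 4th-order err {e4:.2e}")
    ```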

  5. U.S. Geological Survey Streamgage Operation and Maintenance Cost Evaluation...from the National Streamflow Information Program

    USGS Publications Warehouse

    Norris, J. Michael

    2010-01-01

    To help meet the goal of providing earth-science information to the Nation, the U.S. Geological Survey (USGS) operates and maintains the largest streamgage network in the world, with over 7,600 active streamgages in 2010. This network is operated in cooperation with over 850 Federal, tribal, State, and local funding partners. The streamflow information provided by the USGS is used for the protection of life and property; for the assessment, allocation, and management of water resources; for the design of roads, bridges, dams, and water works; for the delineation of flood plains; for the assessment and evaluation of habitat; for understanding the effects of land-use, water-use, and climate changes; for evaluation of water quality; and for recreational safety and enjoyment. USGS streamgages are managed and operated to rigorous national standards, allowing analyses of data from streamgages in different areas and spanning long time periods, some with more than 100 years of data. About 90 percent of USGS streamgages provide streamflow information real-time on the web. Physical measurements of streamflow are made at streamgages multiple times a year, depending on flow conditions, to ensure the highest level of accuracy possible. In addition, multiple reviews and quality assurance checks are performed before the data is finalized. In 2006, the USGS reviewed all activities, operations, equipment, support, and costs associated with operating and maintaining a streamgage program (Norris and others, 2008). A summary of the percentages of costs associated with activities required to operate a streamgage on an annual basis are presented in figure 1. This information represents what it costs to fund a 'typical' USGS streamgage and how those funds are utilized. It should be noted that some USGS streamgages have higher percentages for some categories than do others depending on location and conditions. 
Forty-one percent of the funding for the typical USGS streamgage is for labor

  6. Prognostic testing in coronary artery disease: An analysis of the relationship between increments in cost and information

    SciTech Connect

    Pollock, B.H.

    1988-01-01

    Tests analyzed include stress electrocardiography (ECG), thallium myocardial perfusion scintigraphy, and technetium wall motion scintigraphy. The incremental value of each test was evaluated using staged survival regression and was measured as the area under the receiver operating characteristic (ROC) curve. This approach is preferable to one based on sensitivity and specificity derived from heterogeneous populations, or to approaches that report the most powerful predictor obtained from stepwise regression. Cost-effectiveness for each test was assessed as the increment of ROC area divided by the marginal cost. Three populations were studied. In the thallium population, a significant increment in ROC area was added at each stage of testing; more prognostic information was added by ECG than by thallium. In the technetium population, ECG added a significant increment of ROC area, but technetium did not. In the population receiving both nuclear tests, more incremental information was added by thallium than by technetium. Thallium was found to be more cost-effective than technetium; thus, it is preferred for assessing prognosis in patients with suspected disease.
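    The study's cost-effectiveness metric is simply the ROC-area increment a test stage adds, divided by its marginal cost. A sketch with invented numbers (the ΔAUC values and prices below are illustrative, not the study's data):

    ```python
    # Cost-effectiveness of a test stage = incremental ROC area / marginal cost.
    def roc_gain_per_dollar(delta_auc, marginal_cost):
        return delta_auc / marginal_cost

    stages = {
        # stage: (hypothetical incremental AUC, hypothetical marginal cost $)
        "stress ECG": (0.08, 200.0),
        "thallium": (0.04, 900.0),
        "technetium": (0.01, 900.0),
    }
    for name, (d_auc, cost) in stages.items():
        print(f"{name}: {roc_gain_per_dollar(d_auc, cost):.2e} AUC per $")
    ```

    With equal prices, the test adding more ROC area wins outright, matching the study's preference for thallium over technetium.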

  7. Cost goals

    NASA Technical Reports Server (NTRS)

    Hoag, J.

    1981-01-01

    Cost goal activities for the point-focusing parabolic dish program are reported. Cost goals involve three tasks: (1) determining the value of the dish systems to potential users; (2) setting cost targets for the dish system; and (3) integrating the value and cost sides to provide information about the potential size of the market for parabolic dishes. The latter two activities are emphasized.

  8. Optimal Mandates and The Welfare Cost of Asymmetric Information: Evidence from The U.K. Annuity Market*

    PubMed Central

    Einav, Liran; Finkelstein, Amy; Schrimpf, Paul

    2009-01-01

    Much of the extensive empirical literature on insurance markets has focused on whether adverse selection can be detected. Once detected, however, there has been little attempt to quantify its welfare cost, or to assess whether and what potential government interventions may reduce these costs. To do so, we develop a model of annuity contract choice and estimate it using data from the U.K. annuity market. The model allows for private information about mortality risk as well as heterogeneity in preferences over different contract options. We focus on the choice of length of guarantee among individuals who are required to buy annuities. The results suggest that asymmetric information along the guarantee margin reduces welfare relative to a first best symmetric information benchmark by about £127 million per year, or about 2 percent of annuitized wealth. We also find that by requiring that individuals choose the longest guarantee period allowed, mandates could achieve the first-best allocation. However, we estimate that other mandated guarantee lengths would have detrimental effects on welfare. Since determining the optimal mandate is empirically difficult, our findings suggest that achieving welfare gains through mandatory social insurance may be harder in practice than simple theory may suggest. PMID:20592943

  9. Information-Seeking in Family Day Care: Access, Quality and Personal Cost

    ERIC Educational Resources Information Center

    Corr, L.; Davis, E.; Cook, K.; Mackinnon, A.; Sims, M.; Herrman, H.

    2014-01-01

    Family day-care (FDC) educators work autonomously to provide care and education for children of mixed ages, backgrounds and abilities. To meet the demands and opportunities of their work and regulatory requirements, educators need access to context-relevant and high quality information. No previous research has examined how and where these workers…

  10. 77 FR 69441 - Federal Acquisition Regulation; Information Collection; Cost Accounting Standards Administration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-19

    ... Federal eRulemaking portal by searching the OMB control number. Select the link ``Submit a Comment'' that... provided. FOR FURTHER INFORMATION CONTACT: Mr. Edward Chambers, Procurement Analyst, Office of Acquisition... estimated annual reporting burden is increased from that published in the Federal Register at 75 FR 3236,...

  11. Information Technology Cost Center Employee Perception of Their Contribution Value in a For Profit Organizational Culture

    ERIC Educational Resources Information Center

    Gilstrap, Donald E.

    2010-01-01

    A literature review revealed a lack of academic research related to cultural dynamics within organizations that influence information technology investments. The goal of this single descriptive case study of a for profit international company was to examine one area of cultural influence on investments. The aim was to gain an understanding of…

  12. A Costing Model for Project-Based Information and Communication Technology Systems

    ERIC Educational Resources Information Center

    Stewart, Brian; Hrenewich, Dave

    2009-01-01

    A major difficulty facing IT departments is ensuring that the projects and activities to which information and communications technologies (ICT) resources are committed represent an effective, economic, and efficient use of those resources. This complex problem has no single answer. To determine effective use requires, at the least, a…

  13. Which University? A Study of the Influence of Cost and Information Factors on Scottish Undergraduate Choice

    ERIC Educational Resources Information Center

    Briggs, Senga; Wilson, Alex

    2007-01-01

    At a time when higher education institutions (HEIs) around the globe face declining student numbers and decreasing funding grants, it becomes imperative for those involved in the recruitment process to understand the factors utilized by students in the search process. This paper explores the influence of two such factors: Information Supplied by…

  14. Codes, Costs, and Critiques: The Organization of Information in "Library Quarterly", 1931-2004

    ERIC Educational Resources Information Center

    Olson, Hope A.

    2006-01-01

    This article reports the results of a quantitative and thematic content analysis of the organization of information literature in the "Library Quarterly" ("LQ") between its inception in 1931 and 2004. The majority of articles in this category were published in the first half of "LQ's" run. Prominent themes have included cataloging codes and the…

  15. 76 FR 26705 - Proposed Information Collection; Comment Request; Commercial Fishing Vessel Cost and Earnings...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-09

    ... Environmental Policy Act, Executive Order 12866 and the Regulatory Flexibility Act. The Social Sciences Branch (SSB) of the NMFS, Northeast Fisheries Science Center (NEFSC) is responsible for estimating the economic and social impacts of fishery management actions. Lack of information on vessel operating...

  16. A cost comparison of two malaria control methods in Kyunggi Province, Republic of Korea, using remote sensing and geographic information systems.

    PubMed

    Claborn, David M; Masuoka, Penny M; Klein, Terry A; Hooper, Tomoko; Lee, Arthur; Andre, Richard G

    2002-06-01

    A cost-comparison of two methods for the control of malaria in the Republic of Korea was performed. The cost of larviciding with methoprene granules was estimated at $93.48/hectare. The annual cost of providing chemoprophylaxis was estimated at $37.53/person. Remote sensing and geographic information systems were used to obtain estimates of the size of vector larval habitats around two U.S. Army camps, allowing an estimate of the cost of larviciding around each of the camps. This estimate was compared to the cost of providing chloroquine and primaquine chemoprophylaxis for the camp populations. Costs on each of the camps differed by the size of the larval habitats and the size of the at-risk population. These tools allow extrapolation of larval surveillance data to a regional scale while simultaneously providing site-specific cost analysis, thus reducing the cost and labor associated with vector surveillance over large areas.
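    Using the two unit costs reported above, the cheaper method for a given camp follows from habitat area and population size alone. The camp figures below are hypothetical, and a full comparison would also account for repeat larvicide applications per season.

    ```python
    LARVICIDE_PER_HA = 93.48         # $ per hectare, from the study
    PROPHYLAXIS_PER_PERSON = 37.53   # $ per person per year, from the study

    def cheaper_method(habitat_ha, population):
        larvicide = LARVICIDE_PER_HA * habitat_ha
        prophylaxis = PROPHYLAXIS_PER_PERSON * population
        winner = "larviciding" if larvicide < prophylaxis else "chemoprophylaxis"
        return winner, larvicide, prophylaxis

    # Hypothetical camps: a large habitat with a small population favours
    # prophylaxis; a small habitat with a large population favours larviciding.
    for ha, pop in ((120.0, 150), (20.0, 600)):
        method, lc, pc = cheaper_method(ha, pop)
        print(f"{ha} ha, {pop} people -> {method} (${lc:.2f} vs ${pc:.2f})")
    ```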

  17. Accurate pointing of tungsten welding electrodes

    NASA Technical Reports Server (NTRS)

    Ziegelmeier, P.

    1971-01-01

    Thoriated-tungsten is pointed accurately and quickly by using sodium nitrite. Point produced is smooth and no effort is necessary to hold the tungsten rod concentric. The chemically produced point can be used several times longer than ground points. This method reduces time and cost of preparing tungsten electrodes.

  18. Developing Information on Energy Savings and Associated Costs and Benefits of Energy Efficient Emerging Technologies Applicable in California

    SciTech Connect

    Xu, Tengfang; Slaa, Jan Willem; Sathaye, Jayant

    2010-12-15

    Implementation and adoption of efficient end-use technologies have proven to be one of the key measures for reducing greenhouse gas (GHG) emissions throughout industry. In many cases, implementing energy efficiency measures is among the most cost-effective investments the industry could make in improving efficiency and productivity while reducing carbon dioxide (CO2) emissions. Over the years, there have been incentives to use resources and energy in a cleaner and more efficient way to create industries that are sustainable and more productive. As energy programs and policies address GHG inventories and regulation, understanding and managing the costs associated with mitigation measures for GHG reductions is very important for industry and policy makers around the world and in California. Successful implementation of applicable emerging technologies not only may help advance productivity, improve environmental impacts, or enhance industrial competitiveness, but also can play a significant role in climate-mitigation efforts by saving energy and reducing the associated GHG emissions. Developing new information on the costs and savings benefits of energy-efficient emerging technologies applicable to the California market is important for policy makers as well as industry. Therefore, the timely evaluation and estimation of the costs and energy-savings potential of emerging technologies applicable to California is the focus of this report. The overall goal of the project is to identify and select a set of emerging and under-utilized energy-efficient technologies and practices that are important for reducing energy consumption in industry while maintaining economic growth. Specifically, this report contains the results from performing Task 3, Technology Characterization for California Industries, for the project titled Research Opportunities in Emerging and Under-Utilized Energy-Efficient Industrial Technologies, sponsored by

  19. Accurate quantum chemical calculations

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics is discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed; these are then applied to a number of chemical and spectroscopic problems, to transition metals, and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.
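
    FCI amounts to exact diagonalization of the Hamiltonian in the full determinant space of a given basis. As a hedged illustration of that idea (a toy model, not the benchmark systems discussed in the abstract), the sketch below solves a two-site Hubbard dimer at half filling, where the singlet sector reduces to a 2x2 block with a closed-form ground-state energy.

```python
import math

def hubbard_dimer_ground_energy(t: float, U: float) -> float:
    """Exact (FCI) ground-state energy of a two-site Hubbard model at
    half filling. In the singlet sector the Hamiltonian reduces to a
    2x2 block coupling the covalent and ionic configurations:
        [[ U,  -2t],
         [-2t,   0]]
    whose lower eigenvalue is (U - sqrt(U^2 + 16 t^2)) / 2.
    """
    return 0.5 * (U - math.sqrt(U * U + 16.0 * t * t))

# Limiting checks: U = 0 gives -2t (free electrons); for large U the
# energy approaches the Heisenberg exchange value -4t^2/U.
```

    The same construction scales to real molecules only exponentially, which is why FCI serves as a benchmark rather than a production method.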

  20. Using architecture information and real-time resource state to reduce power consumption and communication costs in parallel applications.

    SciTech Connect

    Brandt, James M.; Devine, Karen Dragon; Gentile, Ann C.; Leung, Vitus Joseph; Olivier, Stephen Lecler; Pedretti, Kevin; Rajamanickam, Sivasankaran; Bunde, David P.; Deveci, Mehmet; Catalyurek, Umit V.

    2014-09-01

    As computer systems grow in both size and complexity, the need for applications and run-time systems to adjust to their dynamic environment also grows. The goal of the RAAMP LDRD was to combine static architecture information and real-time system state with algorithms to conserve power, reduce communication costs, and avoid network contention. We developed new data collection and aggregation tools to extract static hardware information (e.g., node/core hierarchy, network routing) as well as real-time performance data (e.g., CPU utilization, power consumption, memory bandwidth saturation, percentage of used bandwidth, number of network stalls). We created application interfaces that allowed this data to be used easily by algorithms. Finally, we demonstrated the benefit of integrating system and application information for two use cases. The first used real-time power consumption and memory bandwidth saturation data to throttle concurrency to save power without increasing application execution time. The second used static or real-time network traffic information to reduce or avoid network congestion by remapping MPI tasks to allocated processors. Results from our work are summarized in this report; more details are available in our publications [2, 6, 14, 16, 22, 29, 38, 44, 51, 54].
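
    The first use case, throttling concurrency when memory bandwidth saturates, can be sketched as a simple feedback rule. The thresholds and function signature below are hypothetical placeholders, not the RAAMP tools' actual API.

```python
def choose_thread_count(current_threads: int,
                        bandwidth_utilization: float,
                        min_threads: int = 1,
                        max_threads: int = 16,
                        high_water: float = 0.90,
                        low_water: float = 0.60) -> int:
    """Feedback rule: if measured memory-bandwidth utilization (0..1)
    is saturated, shed a thread to save power; if there is ample
    headroom, add one back. Thresholds are illustrative, not tuned."""
    if bandwidth_utilization >= high_water and current_threads > min_threads:
        return current_threads - 1
    if bandwidth_utilization <= low_water and current_threads < max_threads:
        return current_threads + 1
    return current_threads
```

    The intuition is that once memory bandwidth is saturated, extra threads burn power without improving runtime, so shedding them saves energy at no performance cost.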

  1. Digital Avionics Information System (DAIS): Life Cycle Cost Impact Modeling System Reliability, Maintainability, and Cost Model (RMCM)--Description. Users Guide. Final Report.

    ERIC Educational Resources Information Center

    Goclowski, John C.; And Others

    The Reliability, Maintainability, and Cost Model (RMCM) described in this report is an interactive mathematical model with a built-in sensitivity analysis capability. It is a major component of the Life Cycle Cost Impact Model (LCCIM), which was developed as part of the DAIS advanced development program to be used to assess the potential impacts…

  2. GME: at what cost?

    PubMed

    Young, David W

    2003-11-01

    Current computing methods impede determining the real cost of graduate medical education. However, a more accurate estimate could be obtained if policy makers would allow for the application of basic cost-accounting principles, including consideration of department-level costs, unbundling of joint costs, and other factors.

  3. Using Cost-Effectiveness Tests to Design CHP Incentive Programs

    SciTech Connect

    Tidball, Rick

    2014-11-01

    This paper examines the structure of cost-effectiveness tests to illustrate how they can accurately reflect the costs and benefits of CHP systems. This paper begins with a general background discussion on cost-effectiveness analysis of DER and then describes how cost-effectiveness tests can be applied to CHP. Cost-effectiveness results are then calculated and analyzed for CHP projects in five states: Arkansas, Colorado, Iowa, Maryland, and North Carolina. Based on the results obtained for these five states, this paper offers four considerations to inform regulators in the application of cost-effectiveness tests in developing CHP programs.

  4. A New Activity-Based Financial Cost Management Method

    NASA Astrophysics Data System (ADS)

    Qingge, Zhang

    The standard activity-based financial cost management model is a new approach to financial cost management that builds on the standard cost system and activity-based costing, integrating the advantages of both. By taking R&D expenses as the accounting starting point and after-sale service expenses as the end point, it covers the whole production and operating process, the entire activity chain, and the value chain, yielding more accurate and complete cost information to serve internal management and decision making.

  5. Agent Based Modelling of Communication Costs: Why Information Can Be Free

    NASA Astrophysics Data System (ADS)

    Čače, Ivana; Bryson, Joanna J.

    What purposes, other than facilitating the sharing of information, can language have served? First, it may not have evolved to serve any purpose at all. It is possible that language is just a side effect of the large human brain — a spandrel or exaptation — that only became useful later. If language is adaptive, this does not necessarily mean that it is adaptive for the purpose of communication. For example, Dennett (1996) and Chomsky (1980) have stressed the utility of language in thinking. Also, there are different ways to view communication. The purpose of language, according to Dunbar (1993), is to replace grooming as a social bonding process and in this way to ensure the stability of large social groups.

  6. The P600 reflects cost of new information in discourse memory.

    PubMed

    Burkhardt, Petra

    2007-11-19

    Discourse processing depends on long-term semantic memory and discourse memory (i.e. the organization and maintenance of a mental model). Using event-related potentials, the information-processing functions of the activities manifested by a centroparietal negativity (N400) and a posterior positivity (P600) were investigated during the processing of inferences. On the basis of findings from semantic priming, it was predicted that the standard N400 signature would reflect differences in the degree of inferential processing (semantic memory). Instead of N400 modulations, however, the two conditions that depended on a more demanding drawing of inferences elicited a P600, indicating that updating the mental model encumbers discourse memory capacity. The results highlight the functional significance of this positivity for discourse updating, and support the view that the P600 reflects distinct language processes.

  7. Accurate and Accidental Empathy.

    ERIC Educational Resources Information Center

    Chandler, Michael

    The author offers two controversial criticisms of what are rapidly becoming standard assessment procedures for the measurement of empathic skill. First, he asserts that assessment procedures which attend exclusively to the accuracy with which subjects are able to characterize other people's feelings provide little or no useful information about…

  8. Hospital cost accounting and the new imperative.

    PubMed

    Sabin, P

    1987-05-01

    Government regulatory structures, prospective payment mechanisms, a more competitive environment, and attempts to link cost accounting principles to planning, budgeting, and fiscal control all have served as catalysts for hospitals to increase their reliance and emphasis on cost accounting. Current hospital accounting systems are relatively inexpensive to develop and maintain, and they fulfill the financial reporting requirements mandated by Medicare and other third-party payers. These systems, however, do not provide information on what specific service units cost, and managers must have this information to make optimal trade-offs between quality, availability, and cost of medical services. Most health care organizations have a predetermined charge for each type of service, but the charge may not accurately portray the cost of providing the service. Knowing true costs will enable managers to select the most cost-effective method of treating a patient; know the financial implications of adding tests or procedures; relate costs to established norms of care; establish ranges of acceptable costs in various diagnostic groups; negotiate more successfully with rate review organizations and health maintenance organizations; and vigorously market and advertise the services that most contribute to the organization's overall financial health. The goal of microcosting is to determine the full cost of providing specific service units. The microcosting process comprises three components: data collection, cost modeling, and cost analysis. Microcosting is used to determine full costs for 20 percent of the hospital's procedures that are responsible for generating 80 percent of the hospital's gross revenue. Full costs are established by adding labor costs, materials costs, equipment depreciation costs, departmental overhead costs, and corporate overhead costs.(ABSTRACT TRUNCATED AT 250 WORDS)
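
    The microcosting arithmetic described above (full cost as the sum of labor, materials, equipment depreciation, departmental overhead, and corporate overhead) can be sketched directly; the figures and field names below are hypothetical.

```python
def full_service_cost(labor: float, materials: float,
                      equipment_depreciation: float,
                      departmental_overhead: float,
                      corporate_overhead: float) -> float:
    """Full cost of one service unit as the sum of the five
    components listed in the abstract."""
    return (labor + materials + equipment_depreciation
            + departmental_overhead + corporate_overhead)

# Hypothetical example: one laboratory test.
cost = full_service_cost(labor=12.50, materials=4.20,
                         equipment_depreciation=1.10,
                         departmental_overhead=3.00,
                         corporate_overhead=2.20)
# cost is 23.00 USD, versus a posted charge that may differ
```

    A charge-based figure would hide how these components differ across services, which is exactly the gap microcosting closes.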

  9. Accurate Optical Reference Catalogs

    NASA Astrophysics Data System (ADS)

    Zacharias, N.

    2006-08-01

    Current and near future all-sky astrometric catalogs on the ICRF are reviewed with the emphasis on reference star data at optical wavelengths for user applications. The standard error of a Hipparcos Catalogue star position is now about 15 mas per coordinate. For the Tycho-2 data it is typically 20 to 100 mas, depending on magnitude. The USNO CCD Astrograph Catalog (UCAC) observing program was completed in 2004 and reductions toward the final UCAC3 release are in progress. This all-sky reference catalogue will have positional errors of 15 to 70 mas for stars in the 10 to 16 mag range, with a high degree of completeness. Proper motions for the about 60 million UCAC stars will be derived by combining UCAC astrometry with available early epoch data, including yet unpublished scans of the complete set of AGK2, Hamburg Zone astrograph and USNO Black Birch programs. Accurate positional and proper motion data are combined in the Naval Observatory Merged Astrometric Dataset (NOMAD) which includes Hipparcos, Tycho-2, UCAC2, USNO-B1, NPM+SPM plate scan data for astrometry, and is supplemented by multi-band optical photometry as well as 2MASS near infrared photometry. The Milli-Arcsecond Pathfinder Survey (MAPS) mission is currently being planned at USNO. This is a micro-satellite to obtain 1 mas positions, parallaxes, and 1 mas/yr proper motions for all bright stars down to about 15th magnitude. This program will be supplemented by a ground-based program to reach 18th magnitude on the 5 mas level.
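
    When proper motions are derived by combining two positional epochs, as described above for UCAC and the early-epoch plate data, the proper-motion error follows from standard error propagation. The sketch below shows that textbook relation, not the actual UCAC reduction pipeline.

```python
import math

def proper_motion_error(sigma1_mas: float, sigma2_mas: float,
                        epoch_diff_years: float) -> float:
    """Standard error of a proper motion (mas/yr) derived from two
    position measurements with independent errors sigma1 and sigma2
    (mas), separated by epoch_diff_years."""
    return math.sqrt(sigma1_mas ** 2 + sigma2_mas ** 2) / epoch_diff_years

# Illustrative: a 70 mas early-epoch plate position combined with a
# 20 mas UCAC position over a 50-year baseline gives ~1.5 mas/yr.
```

    The long epoch baselines of the plate material are what make useful proper motions possible despite the plates' larger positional errors.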

  10. Ultra-low-cost 3D gaze estimation: an intuitive high information throughput complement to direct brain-machine interfaces.

    PubMed

    Abbott, W W; Faisal, A A

    2012-08-01

    Eye movements are highly correlated with motor intentions and are often retained by patients with serious motor deficiencies. Despite this, eye tracking is not widely used as control interface for movement in impaired patients due to poor signal interpretation and lack of control flexibility. We propose that tracking the gaze position in 3D rather than 2D provides a considerably richer signal for human machine interfaces by allowing direct interaction with the environment rather than via computer displays. We demonstrate here that by using mass-produced video-game hardware, it is possible to produce an ultra-low-cost binocular eye-tracker with comparable performance to commercial systems, yet 800 times cheaper. Our head-mounted system has 30 USD material costs and operates at over 120 Hz sampling rate with a 0.5-1 degree of visual angle resolution. We perform 2D and 3D gaze estimation, controlling a real-time volumetric cursor essential for driving complex user interfaces. Our approach yields an information throughput of 43 bits s(-1), more than ten times that of invasive and semi-invasive brain-machine interfaces (BMIs) that are vastly more expensive. Unlike many BMIs our system yields effective real-time closed loop control of devices (10 ms latency), after just ten minutes of training, which we demonstrate through a novel BMI benchmark--the control of the video arcade game 'Pong'.
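
    Throughput figures like the 43 bits/s quoted above are commonly computed with the Wolpaw information-transfer-rate formula for a selection interface. The sketch below implements that standard formula as an illustration; it is not necessarily the exact measure the authors used.

```python
import math

def wolpaw_itr_bits_per_s(n_targets: int, accuracy: float,
                          selections_per_s: float) -> float:
    """Wolpaw ITR: bits per selection times selection rate.
    n_targets: number of selectable targets (N >= 2)
    accuracy:  probability of a correct selection (0 < P <= 1)
    """
    n, p = n_targets, accuracy
    bits = math.log2(n)
    if p < 1.0:  # error terms vanish at perfect accuracy
        bits += p * math.log2(p) + (1.0 - p) * math.log2((1.0 - p) / (n - 1))
    return bits * selections_per_s
```

    At chance accuracy the formula correctly reports zero throughput, and at perfect accuracy it reduces to log2(N) bits per selection.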

  11. Ultra-low-cost 3D gaze estimation: an intuitive high information throughput complement to direct brain-machine interfaces

    NASA Astrophysics Data System (ADS)

    Abbott, W. W.; Faisal, A. A.

    2012-08-01

    Eye movements are highly correlated with motor intentions and are often retained by patients with serious motor deficiencies. Despite this, eye tracking is not widely used as control interface for movement in impaired patients due to poor signal interpretation and lack of control flexibility. We propose that tracking the gaze position in 3D rather than 2D provides a considerably richer signal for human machine interfaces by allowing direct interaction with the environment rather than via computer displays. We demonstrate here that by using mass-produced video-game hardware, it is possible to produce an ultra-low-cost binocular eye-tracker with comparable performance to commercial systems, yet 800 times cheaper. Our head-mounted system has 30 USD material costs and operates at over 120 Hz sampling rate with a 0.5-1 degree of visual angle resolution. We perform 2D and 3D gaze estimation, controlling a real-time volumetric cursor essential for driving complex user interfaces. Our approach yields an information throughput of 43 bits s-1, more than ten times that of invasive and semi-invasive brain-machine interfaces (BMIs) that are vastly more expensive. Unlike many BMIs our system yields effective real-time closed loop control of devices (10 ms latency), after just ten minutes of training, which we demonstrate through a novel BMI benchmark—the control of the video arcade game ‘Pong’.

  12. Daily variation in natural disaster casualties: information flows, safety, and opportunity costs in tornado versus hurricane strikes.

    PubMed

    Zahran, Sammy; Tavani, Daniele; Weiler, Stephan

    2013-07-01

    Casualties from natural disasters may depend on the day of the week they strike. With data from the Spatial Hazard Events and Losses Database for the United States (SHELDUS), daily variation in hurricane and tornado casualties from 5,043 tornado and 2,455 hurricane time/place events is analyzed. Hurricane forecasts provide at-risk populations with considerable lead time. Such lead time allows strategic behavior in choosing protective measures under hurricane threat; opportunity costs in terms of lost income are higher during weekdays than during weekends. On the other hand, the lead time provided by tornadoes is near zero; hence tornadoes generate no opportunity costs. Tornado casualties are related to risk information flows, which are higher during workdays than during leisure periods, and are related to sheltering-in-place opportunities, which are better in permanent buildings like businesses and schools. Consistent with theoretical expectations, random effects negative binomial regression results indicate that tornado events occurring on the workdays of Monday through Thursday are significantly less lethal than tornadoes that occur on weekends. In direct contrast, and also consistent with theory, the expected count of hurricane casualties increases significantly with weekday occurrences. The policy implications of observed daily variation in tornado and hurricane events are considered. PMID:23126406

  13. Daily variation in natural disaster casualties: information flows, safety, and opportunity costs in tornado versus hurricane strikes.

    PubMed

    Zahran, Sammy; Tavani, Daniele; Weiler, Stephan

    2013-07-01

    Casualties from natural disasters may depend on the day of the week they strike. With data from the Spatial Hazard Events and Losses Database for the United States (SHELDUS), daily variation in hurricane and tornado casualties from 5,043 tornado and 2,455 hurricane time/place events is analyzed. Hurricane forecasts provide at-risk populations with considerable lead time. Such lead time allows strategic behavior in choosing protective measures under hurricane threat; opportunity costs in terms of lost income are higher during weekdays than during weekends. On the other hand, the lead time provided by tornadoes is near zero; hence tornadoes generate no opportunity costs. Tornado casualties are related to risk information flows, which are higher during workdays than during leisure periods, and are related to sheltering-in-place opportunities, which are better in permanent buildings like businesses and schools. Consistent with theoretical expectations, random effects negative binomial regression results indicate that tornado events occurring on the workdays of Monday through Thursday are significantly less lethal than tornadoes that occur on weekends. In direct contrast, and also consistent with theory, the expected count of hurricane casualties increases significantly with weekday occurrences. The policy implications of observed daily variation in tornado and hurricane events are considered.

  14. The Outdoor Dust Information Node (ODIN) - development and performance assessment of a low cost ambient dust sensor

    NASA Astrophysics Data System (ADS)

    Olivares, G.; Edwards, S.

    2015-07-01

    The large gradients in air quality expected in urban areas present a significant challenge to standard measurement technologies. Small, low-cost devices have been developing rapidly in recent years and have the potential to improve the spatial coverage of traditional air quality measurements. Here we present the first version of the Outdoor Dust Information Node (ODIN) as well as the results of the first real-world measurements. The lab tests indicate that the Sharp dust sensor used in the ODIN presents a stable baseline response only slightly affected by ambient temperature. The field tests indicate that ODIN data can be used to estimate hourly and daily PM2.5 concentrations after appropriate temperature and baseline corrections are applied. The ODIN seems suitable for campaign deployments complementing more traditional measurements.
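
    The temperature and baseline corrections mentioned above can be sketched as a simple linear model. The coefficients and functional form here are hypothetical placeholders, not the calibration actually fitted to the ODIN.

```python
def estimate_pm25(raw_signal: float, temperature_c: float,
                  gain: float = 0.8,
                  baseline_intercept: float = 0.6,
                  baseline_temp_slope: float = 0.01) -> float:
    """Estimate PM2.5 (ug/m^3) from a raw dust-sensor reading by
    subtracting a temperature-dependent baseline, then scaling.
    All coefficients are illustrative, not fitted values."""
    baseline = baseline_intercept + baseline_temp_slope * temperature_c
    return max(0.0, gain * (raw_signal - baseline))
```

    In practice the gain and baseline terms would be fitted against co-located reference instruments, which is what the field deployment described above provides.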

  15. Accounting for Heterogeneity in Relative Treatment Effects for Use in Cost-Effectiveness Models and Value-of-Information Analyses

    PubMed Central

    Soares, Marta O.; Palmer, Stephen; Ades, Anthony E.; Harrison, David; Shankar-Hari, Manu; Rowan, Kathy M.

    2015-01-01

    Cost-effectiveness analysis (CEA) models are routinely used to inform health care policy. Key model inputs include relative effectiveness of competing treatments, typically informed by meta-analysis. Heterogeneity is ubiquitous in meta-analysis, and random effects models are usually used when there is variability in effects across studies. In the absence of observed treatment effect modifiers, various summaries from the random effects distribution (random effects mean, predictive distribution, random effects distribution, or study-specific estimate [shrunken or independent of other studies]) can be used depending on the relationship between the setting for the decision (population characteristics, treatment definitions, and other contextual factors) and the included studies. If covariates have been measured that could potentially explain the heterogeneity, then these can be included in a meta-regression model. We describe how covariates can be included in a network meta-analysis model and how the output from such an analysis can be used in a CEA model. We outline a model selection procedure to help choose between competing models and stress the importance of clinical input. We illustrate the approach with a health technology assessment of intravenous immunoglobulin for the management of adult patients with severe sepsis in an intensive care setting, which exemplifies how risk of bias information can be incorporated into CEA models. We show that the results of the CEA and value-of-information analyses are sensitive to the model and highlight the importance of sensitivity analyses when conducting CEA in the presence of heterogeneity. The methods presented extend naturally to heterogeneity in other model inputs, such as baseline risk. PMID:25712447
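
    The distinction drawn above between the random-effects mean and the predictive distribution matters when choosing a CEA input. A minimal DerSimonian-Laird sketch makes it concrete; this is an illustration of pairwise random-effects pooling, not the network meta-regression model used in the paper.

```python
import math

def dersimonian_laird(effects, std_errors):
    """Random-effects meta-analysis via DerSimonian-Laird.
    Returns (pooled mean, tau^2, approximate 95% predictive interval)."""
    w = [1.0 / se ** 2 for se in std_errors]
    fixed_mean = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed_mean) ** 2 for wi, y in zip(w, effects))
    k = len(effects)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)  # between-study variance
    w_star = [1.0 / (se ** 2 + tau2) for se in std_errors]
    mu = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    var_mu = 1.0 / sum(w_star)
    # The predictive interval adds tau^2, so it is wider than the
    # confidence interval for the mean whenever heterogeneity exists.
    pred_half = 1.96 * math.sqrt(var_mu + tau2)
    return mu, tau2, (mu - pred_half, mu + pred_half)
```

    Feeding the mean versus a draw from the predictive distribution into a CEA model answers different decision questions, which is the sensitivity the paper highlights.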

  16. Application of the Activity-Based Costing Method for Unit-Cost Calculation in a Hospital

    PubMed Central

    Javid, Mahdi; Hadian, Mohammad; Ghaderi, Hossein; Ghaffari, Shahram; Salehi, Masoud

    2016-01-01

    Background: Choosing an appropriate accounting system for a hospital has always been a challenge for hospital managers. Traditional cost system (TCS) causes cost distortions in hospitals. Activity-based costing (ABC) method is a newer and more effective cost system. Objective: This study aimed to compare ABC with TCS method in calculating the unit cost of medical services and to assess its applicability in Kashani Hospital, Shahrekord City, Iran. Methods: This cross-sectional study was performed on accounting data of Kashani Hospital in 2013. Data on accounting reports of 2012 and other relevant sources at the end of 2012 were included. To apply ABC method, the hospital was divided into several cost centers and five cost categories were defined: wage, equipment, space, material, and overhead costs. Then activity centers were defined. ABC method was performed in two phases. First, the total costs of cost centers were assigned to activities by using related cost factors. Then the costs of activities were assigned to cost objects by using cost drivers. After determining the cost of objects, the cost price of medical services was calculated and compared with that obtained from TCS. Results: The Kashani Hospital had 81 physicians, 306 nurses, and 328 beds with the mean occupancy rate of 67.4% during 2012. Unit cost of medical services, cost price of occupancy bed per day, and cost per outpatient service were calculated. The total unit costs by ABC and TCS were respectively 187.95 and 137.70 USD, a difference of 50.25 USD per unit. ABC method represented more accurate information on the major cost components. Conclusion: By utilizing ABC, hospital managers have a valuable accounting system that provides a true insight into the organizational costs of their department. PMID:26234974
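
    The two-phase ABC allocation described in the Methods can be sketched directly: cost-center totals flow to activities via cost factors, then from activities to cost objects via cost drivers. All names and numbers below are hypothetical.

```python
def abc_allocate(center_costs, activity_factors, driver_shares):
    """Two-phase activity-based costing.
    center_costs:     {cost_center: total_cost}
    activity_factors: {cost_center: {activity: fraction}}   (phase 1)
    driver_shares:    {activity: {cost_object: fraction}}   (phase 2)
    Fractions per center and per activity should each sum to 1.
    Returns {cost_object: allocated_cost}."""
    activity_costs = {}
    for center, total in center_costs.items():
        for activity, frac in activity_factors[center].items():
            activity_costs[activity] = activity_costs.get(activity, 0.0) + total * frac
    object_costs = {}
    for activity, cost in activity_costs.items():
        for obj, frac in driver_shares[activity].items():
            object_costs[obj] = object_costs.get(obj, 0.0) + cost * frac
    return object_costs

# Hypothetical example: one nursing cost center split across two
# activities, each consumed entirely by one service.
costs = abc_allocate(
    {"nursing": 1000.0},
    {"nursing": {"bed_care": 0.7, "outpatient": 0.3}},
    {"bed_care": {"bed_day": 1.0}, "outpatient": {"visit": 1.0}},
)
# costs == {"bed_day": 700.0, "visit": 300.0}
```

    A TCS would instead spread the 1000.0 by a single volume ratio, which is the source of the distortion the study quantifies.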

  17. 32 CFR Appendix D to Part 286 - DD Form 2086-1, “Record of Freedom of Information (FOI) Processing Cost for Technical Data”

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 2 2011-07-01 2011-07-01 false DD Form 2086-1, "Record of Freedom of Information (FOI) Processing Cost for Technical Data" D Appendix D to Part 286 National Defense Department of... FREEDOM OF INFORMATION ACT PROGRAM REGULATION Pt. 286, App. D Appendix D to Part 286—DD Form...

  18. Do Low-Cost Private School Leavers in the Informal Settlement Have a Good Chance of Admission to a Government Secondary school? A Study from Kibera in Kenya

    ERIC Educational Resources Information Center

    Ohba, Asayo

    2013-01-01

    There are growing numbers of low-cost private schools in urban informal settlements in developing countries. It has been argued that these institutions may constitute alternatives for government schools, as they are able to meet the educational needs of children in urban informal settlements. This study explores the question of whether low-cost…

  19. Federal Information Resources for Professional Counselors. A Sourcebook of Free and Low-Cost Resources To Support and Enrich Your Work as a Professional Counselor.

    ERIC Educational Resources Information Center

    Lum, Christie

    This publication is a sourcebook of free and low-cost resources to support the work of professional counselors. The information includes: (1) synthesis of current research, statistics, and research reports; (2) background material about current and emerging policy issues; (3) information about model programs and policies; (4) materials and…

  20. Business models for cost effective use of health information technologies: lessons learned in the CHCS II project.

    PubMed

    Riley, David L

    2003-01-01

    The Department of Defense (DoD) has embarked on an initiative to create an electronic medical record for all of its eligible beneficiaries. The Clinical Information Technology Program Office (CITPO) is the joint-service program office established to centrally manage this multi-year project. The Composite Health Care System II (CHCS II) is the name of the system under development. Given the historical failure rate of large-scale government information system projects, CITPO has employed an incremental acquisition approach and striven to use industry best practices to the greatest degree possible within the constraints of federal acquisition law. Based on lessons learned during the concept exploration phase of this project, CITPO, in partnership with Integic Corporation, the prime integration contractor, has reengineered its software acquisition process to include industry best practices. This reengineering has reduced the total projected life cycle costs for CHCS II from the original estimate of $7.6 billion over a 14-year period to between $3.9 and $4.3 billion. PMID:15455852

  1. The Cost-Effectiveness of Two Forms of Case Management Compared to a Control Group for Persons with Dementia and Their Informal Caregivers from a Societal Perspective

    PubMed Central

    Eekhout, Iris; Joling, Karlijn J.; van Mierlo, Lisa D.; Meiland, Franka J. M.; van Hout, Hein P. J.; de Rooij, Sophia E.

    2016-01-01

    Objectives The objective of this article was to compare the costs and cost-effectiveness of the two most prominent types of case management in the Netherlands (intensive case management and linkage models) against no access to case management (control group) for people with already diagnosed dementia and their informal caregivers. Methods The economic evaluation was conducted from a societal perspective embedded within a two year prospective, observational, controlled, cohort study with 521 informal caregivers and community-dwelling persons with dementia. Case management provided within one care organization (intensive case management model, ICMM), case management where care was provided by different care organizations within one region (Linkage model, LM), and a group with no access to case management (control) were compared. The economic evaluation related incremental costs to incremental effects regarding neuropsychiatric symptoms (NPI), psychological health of the informal caregiver (GHQ-12), and quality adjusted life years (QALY) of the person with dementia and informal caregiver. Results Inverse-propensity-score-weighted models showed no significant differences in clinical or total cost outcomes between the three groups. Informal care costs were significantly lower in the ICMM group compared to both other groups. Day center costs were significantly lower in the ICMM group compared to the control group. For all outcomes, the probability that the ICMM was cost-effective in comparison with LM and the control group was larger than 0.97 at a threshold ratio of 0 €/incremental unit of effect. Conclusion This study provides preliminary evidence that the ICMM is cost-effective compared to the control group and the LM. However, the findings should be interpreted with caution since this study was not a randomized controlled trial. PMID:27655234

  2. Principles and methods of managerial cost-accounting systems.

    PubMed

    Suver, J D; Cooper, J C

    1988-01-01

    An introduction to cost-accounting systems for pharmacy managers is provided; terms are defined and examples of specific applications are given. Cost-accounting systems determine, record, and report the resources consumed in providing services. An effective cost-accounting system must provide the information needed for both internal and external reports. In accounting terms, cost is the value given up to secure an asset. In determining how volumes of activity affect costs, fixed costs and variable costs are calculated; applications include pricing strategies, cost determinations, and break-even analysis. Also discussed are the concepts of direct and indirect costs, opportunity costs, and incremental and sunk costs. For most pharmacy department services, process costing, an accounting of intermediate outputs and homogeneous units, is used; in determining the full cost of providing a product or service (e.g., patient stay), job-order costing is used. Development of work-performance standards is necessary for monitoring productivity and determining product costs. In allocating pharmacy department costs, a ratio of costs to charges can be used; this method is convenient, but microcosting (specific identification of the costs of products) is more accurate. Pharmacy managers can use cost-accounting systems to evaluate the pharmacy's strategies, policies, and services and to improve budgets and reports.
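
    Break-even analysis, one application of the fixed/variable cost split mentioned above, can be sketched as follows; the pharmacy figures are hypothetical.

```python
import math

def break_even_units(fixed_costs: float, price_per_unit: float,
                     variable_cost_per_unit: float) -> int:
    """Units needed for revenue to cover fixed plus variable costs:
    fixed / (price - variable cost), rounded up. The denominator is
    the contribution margin per unit."""
    margin = price_per_unit - variable_cost_per_unit
    if margin <= 0:
        raise ValueError("price must exceed variable cost per unit")
    return math.ceil(fixed_costs / margin)

# Hypothetical: 50,000 in fixed costs, a 25 price, and a 15 variable
# cost per unit require 5,000 units to break even.
```

    The same margin drives pricing decisions: any price above variable cost contributes toward fixed costs, even if it is below full cost.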

  3. Principles and methods of managerial cost-accounting systems.

    PubMed

    Suver, J D; Cooper, J C

    1988-01-01

    An introduction to cost-accounting systems for pharmacy managers is provided; terms are defined and examples of specific applications are given. Cost-accounting systems determine, record, and report the resources consumed in providing services. An effective cost-accounting system must provide the information needed for both internal and external reports. In accounting terms, cost is the value given up to secure an asset. In determining how volumes of activity affect costs, fixed costs and variable costs are calculated; applications include pricing strategies, cost determinations, and break-even analysis. Also discussed are the concepts of direct and indirect costs, opportunity costs, and incremental and sunk costs. For most pharmacy department services, process costing, an accounting of intermediate outputs and homogeneous units, is used; in determining the full cost of providing a product or service (e.g., patient stay), job-order costing is used. Development of work-performance standards is necessary for monitoring productivity and determining product costs. In allocating pharmacy department costs, a ratio of costs to charges can be used; this method is convenient, but microcosting (specific identification of the costs of products) is more accurate. Pharmacy managers can use cost-accounting systems to evaluate the pharmacy's strategies, policies, and services and to improve budgets and reports. PMID:3348229

  4. The cost-effectiveness of the MobileMums intervention to increase physical activity among mothers with young children: a Markov model informed by a randomised controlled trial

    PubMed Central

    Burn, Edward; Barnett, Adrian G; Fjeldsoe, Brianna S; Graves, Nicholas

    2015-01-01

    Objectives To determine the cost-effectiveness of the MobileMums intervention. MobileMums is a 12-week programme which assists mothers with young children to be more physically active, primarily through the use of personalised SMS text-messages. Design A cost-effectiveness analysis using a Markov model to estimate and compare the costs and consequences of MobileMums and usual care. Setting This study considers the cost-effectiveness of MobileMums in Queensland, Australia. Participants A hypothetical cohort of over 36 000 women with a child under 1 year old is considered. These women are expected to be eligible and willing to participate in the intervention in Queensland, Australia. Data sources The model was informed by the effectiveness results from a 9-month two-arm community-based randomised controlled trial undertaken in 2011 and registered retrospectively with the Australian Clinical Trials Registry (ACTRN12611000481976). Baseline characteristics for the model cohort, treatment effects and resource utilisation were all informed by this trial. Main outcome measures The incremental cost per quality-adjusted life year (QALY) of MobileMums compared with usual care. Results The intervention is estimated to lead to an increase of 131 QALYs for an additional cost to the health system of 1.1 million Australian dollars (AUD). The expected incremental cost-effectiveness ratio for MobileMums is 8608 AUD per QALY gained. MobileMums has a 98% probability of being cost-effective at a cost-effectiveness threshold of 64 000 AUD. Varying modelling assumptions has little effect on this result. Conclusions At a cost-effectiveness threshold of 64 000 AUD, MobileMums would likely be a cost-effective use of healthcare resources in Queensland, Australia. Trial registration number Australian Clinical Trials Registry; ACTRN12611000481976. PMID:25926145
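The headline result above is an incremental cost-effectiveness ratio (ICER): incremental cost divided by incremental QALYs. A small sketch using the rounded figures quoted in the abstract; because "1.1 million AUD" is itself rounded, the computed ratio only approximates the published 8608 AUD per QALY:

```python
def icer(incremental_cost, incremental_qalys):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return incremental_cost / incremental_qalys

# Rounded figures from the abstract: ~1.1 million AUD for 131 QALYs
ratio = icer(1_100_000, 131)
print(round(ratio))  # ~8397 AUD/QALY with rounded inputs; the paper reports 8608
assert ratio < 64_000  # well under the 64 000 AUD/QALY threshold, hence cost-effective
```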

  5. Accurate whole human genome sequencing using reversible terminator chemistry.

    PubMed

    Bentley, David R; Balasubramanian, Shankar; Swerdlow, Harold P; Smith, Geoffrey P; Milton, John; Brown, Clive G; Hall, Kevin P; Evers, Dirk J; Barnes, Colin L; Bignell, Helen R; Boutell, Jonathan M; Bryant, Jason; Carter, Richard J; Keira Cheetham, R; Cox, Anthony J; Ellis, Darren J; Flatbush, Michael R; Gormley, Niall A; Humphray, Sean J; Irving, Leslie J; Karbelashvili, Mirian S; Kirk, Scott M; Li, Heng; Liu, Xiaohai; Maisinger, Klaus S; Murray, Lisa J; Obradovic, Bojan; Ost, Tobias; Parkinson, Michael L; Pratt, Mark R; Rasolonjatovo, Isabelle M J; Reed, Mark T; Rigatti, Roberto; Rodighiero, Chiara; Ross, Mark T; Sabot, Andrea; Sankar, Subramanian V; Scally, Aylwyn; Schroth, Gary P; Smith, Mark E; Smith, Vincent P; Spiridou, Anastassia; Torrance, Peta E; Tzonev, Svilen S; Vermaas, Eric H; Walter, Klaudia; Wu, Xiaolin; Zhang, Lu; Alam, Mohammed D; Anastasi, Carole; Aniebo, Ify C; Bailey, David M D; Bancarz, Iain R; Banerjee, Saibal; Barbour, Selena G; Baybayan, Primo A; Benoit, Vincent A; Benson, Kevin F; Bevis, Claire; Black, Phillip J; Boodhun, Asha; Brennan, Joe S; Bridgham, John A; Brown, Rob C; Brown, Andrew A; Buermann, Dale H; Bundu, Abass A; Burrows, James C; Carter, Nigel P; Castillo, Nestor; Chiara E Catenazzi, Maria; Chang, Simon; Neil Cooley, R; Crake, Natasha R; Dada, Olubunmi O; Diakoumakos, Konstantinos D; Dominguez-Fernandez, Belen; Earnshaw, David J; Egbujor, Ugonna C; Elmore, David W; Etchin, Sergey S; Ewan, Mark R; Fedurco, Milan; Fraser, Louise J; Fuentes Fajardo, Karin V; Scott Furey, W; George, David; Gietzen, Kimberley J; Goddard, Colin P; Golda, George S; Granieri, Philip A; Green, David E; Gustafson, David L; Hansen, Nancy F; Harnish, Kevin; Haudenschild, Christian D; Heyer, Narinder I; Hims, Matthew M; Ho, Johnny T; Horgan, Adrian M; Hoschler, Katya; Hurwitz, Steve; Ivanov, Denis V; Johnson, Maria Q; James, Terena; Huw Jones, T A; Kang, Gyoung-Dong; Kerelska, Tzvetana H; Kersey, Alan D; Khrebtukova, Irina; Kindwall, Alex P; Kingsbury, 
Zoya; Kokko-Gonzales, Paula I; Kumar, Anil; Laurent, Marc A; Lawley, Cynthia T; Lee, Sarah E; Lee, Xavier; Liao, Arnold K; Loch, Jennifer A; Lok, Mitch; Luo, Shujun; Mammen, Radhika M; Martin, John W; McCauley, Patrick G; McNitt, Paul; Mehta, Parul; Moon, Keith W; Mullens, Joe W; Newington, Taksina; Ning, Zemin; Ling Ng, Bee; Novo, Sonia M; O'Neill, Michael J; Osborne, Mark A; Osnowski, Andrew; Ostadan, Omead; Paraschos, Lambros L; Pickering, Lea; Pike, Andrew C; Pike, Alger C; Chris Pinkard, D; Pliskin, Daniel P; Podhasky, Joe; Quijano, Victor J; Raczy, Come; Rae, Vicki H; Rawlings, Stephen R; Chiva Rodriguez, Ana; Roe, Phyllida M; Rogers, John; Rogert Bacigalupo, Maria C; Romanov, Nikolai; Romieu, Anthony; Roth, Rithy K; Rourke, Natalie J; Ruediger, Silke T; Rusman, Eli; Sanches-Kuiper, Raquel M; Schenker, Martin R; Seoane, Josefina M; Shaw, Richard J; Shiver, Mitch K; Short, Steven W; Sizto, Ning L; Sluis, Johannes P; Smith, Melanie A; Ernest Sohna Sohna, Jean; Spence, Eric J; Stevens, Kim; Sutton, Neil; Szajkowski, Lukasz; Tregidgo, Carolyn L; Turcatti, Gerardo; Vandevondele, Stephanie; Verhovsky, Yuli; Virk, Selene M; Wakelin, Suzanne; Walcott, Gregory C; Wang, Jingwen; Worsley, Graham J; Yan, Juying; Yau, Ling; Zuerlein, Mike; Rogers, Jane; Mullikin, James C; Hurles, Matthew E; McCooke, Nick J; West, John S; Oaks, Frank L; Lundberg, Peter L; Klenerman, David; Durbin, Richard; Smith, Anthony J

    2008-11-01

    DNA sequence information underpins genetic research, enabling discoveries of important biological or medical benefit. Sequencing projects have traditionally used long (400-800 base pair) reads, but the existence of reference sequences for the human and many other genomes makes it possible to develop new, fast approaches to re-sequencing, whereby shorter reads are compared to a reference to identify intraspecies genetic variation. Here we report an approach that generates several billion bases of accurate nucleotide sequence per experiment at low cost. Single molecules of DNA are attached to a flat surface, amplified in situ and used as templates for synthetic sequencing with fluorescent reversible terminator deoxyribonucleotides. Images of the surface are analysed to generate high-quality sequence. We demonstrate application of this approach to human genome sequencing on flow-sorted X chromosomes and then scale the approach to determine the genome sequence of a male Yoruba from Ibadan, Nigeria. We build an accurate consensus sequence from >30x average depth of paired 35-base reads. We characterize four million single-nucleotide polymorphisms and four hundred thousand structural variants, many of which were previously unknown. Our approach is effective for accurate, rapid and economical whole-genome re-sequencing and many other biomedical applications.

  6. An Analysis of Rocket Propulsion Testing Costs

    NASA Technical Reports Server (NTRS)

    Ramirez, Carmen; Rahman, Shamim

    2010-01-01

    The primary mission at NASA Stennis Space Center (SSC) is rocket propulsion testing. Such testing is commonly characterized as one of two types: production testing for certification and acceptance of engine hardware, and developmental testing for prototype evaluation or research and development (R&D) purposes. For programmatic reasons there is a continuing need to assess and evaluate the test costs for the various types of test campaigns that involve liquid rocket propellant test articles. Presently, in fact, there is a critical need to provide guidance on what represents a best value for testing and to provide some key economic insights for decision-makers within NASA and the test customers outside the Agency. Hence, selected rocket propulsion test databases and references have been evaluated and analyzed with the intent of discovering correlations between technical information and test costs that could help produce more reliable and accurate cost projections in the future. The process of searching, collecting, and validating propulsion test cost information presented some unique obstacles, which led to a set of recommendations for improvement in order to facilitate future cost information gathering and analysis. In summary, this historical account and evaluation of rocket propulsion test cost information will enhance understanding of the various kinds of project cost information and identify certain trends of interest to the aerospace testing community.

  7. Low-Cost Sensors Deliver Nanometer-Accurate Measurements

    NASA Technical Reports Server (NTRS)

    2015-01-01

    As part of a unique partnership program, Kennedy Space Center collaborated with a nearby business school to allow MBA students to examine and analyze the market potential for a selection of NASA-patented technologies. Following the semester, a group of students decided to form Winter Park, Florida-based Juntura Group Inc. to license and sell a technology they had worked with: a sensor capable of detecting position changes as small as 10 nanometers-approximately the thickness of a cell wall.

  8. Toxic Substances: Information on Costs and Financial Aid to Schools To Control Asbestos. Fact Sheet for the Honorable John J. La Falce, House of Representatives.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Resources, Community, and Economic Development Div.

    Information on the costs of and financial aid available to schools for asbestos abatement is provided in this report. Data are based on interviews with officials from 15 school districts in 5 states--Illinois, New Jersey, New York, Ohio, and Pennsylvania. Section 1 provides background on the use of asbestos in buildings, health problems, federal…

  9. EPA (Environmental Protection Agency) evaluation of the HYDRO-VAC device under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1983-08-01

    This document announces the conclusions of the EPA evaluation of the HYDRO-VAC device under Section 511 of the Motor Vehicle Information and Cost Savings Act. The evaluation of the HYDRO-VAC device was conducted upon the application of the manufacturer. The product is claimed to improve fuel economy and performance for both gasoline- and diesel-fueled vehicles.

  10. EPA evaluation of the SYNERGY-1 fuel additive under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1981-06-01

    This document announces the conclusions of the EPA evaluation of the 'SYNERGY-1' additive under the provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. This additive is intended to improve fuel economy and exhaust emission levels of two- and four-cycle gasoline-fueled engines.

  11. Exploring Cost Exchange at Colleges and Universities. A Report on the 1973 Field Test of NCHEMS' Preliminary Information Exchange and Reporting Procedures at 70 Institutions.

    ERIC Educational Resources Information Center

    Collard, William; Huff, Robert

    The Preliminary Information Exchange Procedures (IEP) cost study project was undertaken as a joint venture by a group of colleges and universities and the National Center for Higher Education Management Systems (NCHEMS). The project was initiated and sponsored by NCHEMS to accomplish six objectives that would benefit both the institutions and…

  12. Federal Information Resources for Professional Counselors: A Sourcebook of Free and Low-Cost Resources To Support and Enrich Your Work as a Professional Counselor.

    ERIC Educational Resources Information Center

    Lum, Christie

    The Federal government supports many information clearinghouses and research institutions that produce free and low-cost publications and materials that can support and enrich the work of a professional counselor. This sourcebook is designed to help tap into and take greater advantage of these resources. The sourcebook provides syntheses of…

  13. Federal Information Resources for Professional Counselors. A Sourcebook of Free and Low-Cost Resources To Support and Enrich Your Work as a Professional Counselor.

    ERIC Educational Resources Information Center

    Lum, Christie

    The federal government supports many information clearinghouses and research institutions that produce free and low-cost publications and materials that can support and enrich the work of a professional counselor. This sourcebook is designed to help tap into and take greater advantage of these resources. The sourcebook provides syntheses of…

  14. Federal Information Resources for Professional Counselors: A Sourcebook of Free and Low-Cost Resources To Support and Enrich Your Work as a Professional Counselor, 2002.

    ERIC Educational Resources Information Center

    Lum, Christie

    The federal government supports many information clearinghouses and research institutions that produce free and low-cost publications and materials that can support and enrich the work of a professional counselor. This sourcebook is designed to help tap into and take greater advantage of these resources. The sourcebook provides syntheses of…

  15. How Do Students Meet the Cost of Attending a State University? Information Brief. Volume 4, Issue 2

    ERIC Educational Resources Information Center

    Florida Board of Governors, State University System, 2007

    2007-01-01

    Students and their families must cover, on average, 83% of the roughly $16,000 cost of attendance for a full-time, in-state undergraduate at a state university in Florida. On average, 75% of the cost of attendance in Florida's public universities is from expenses other than tuition and fees and books. The largest expense is room and board,…

  16. Digital Avionics Information System (DAIS): Life Cycle Cost Impact Modeling System (LCCIM)--A Managerial Overview. Final Report.

    ERIC Educational Resources Information Center

    Goclowski, John C.; Baran, H. Anthony

    This report gives a managerial overview of the Life Cycle Cost Impact Modeling System (LCCIM), which was designed to provide the Air Force with an in-house capability of assessing the life cycle cost impact of weapon system design alternatives. LCCIM consists of computer programs and the analyses which the user must perform to generate input data.…

  17. Accurate modeling of parallel scientific computations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Townsend, James C.

    1988-01-01

    Scientific codes are usually parallelized by partitioning a grid among processors. To achieve top performance it is necessary to partition the grid so as to balance workload and minimize communication/synchronization costs. This problem is particularly acute when the grid is irregular, changes over the course of the computation, and is not known until load time. Critical mapping and remapping decisions rest on the ability to accurately predict performance, given a description of a grid and its partition. This paper discusses one approach to this problem, and illustrates its use on a one-dimensional fluids code. The models constructed are shown to be accurate, and are used to find optimal remapping schedules.
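The abstract's central idea, predicting performance from a description of a grid and its partition, can be illustrated with a toy 1-D cost model. This is a generic sketch, not the paper's actual model: it assumes a fixed compute cost per cell and a fixed communication cost per halo exchange, and predicts the per-step time as the slowest processor's workload:

```python
def partition_cost(cells_per_proc, compute_per_cell, comm_per_boundary):
    """Predicted time per step for a 1-D grid partition: the slowest
    processor's compute time plus its halo-exchange communication cost
    (interior ranks exchange two boundaries per step)."""
    compute = max(cells_per_proc) * compute_per_cell
    comm = 2 * comm_per_boundary
    return compute + comm

# Balanced vs. unbalanced split of a 1000-cell grid over 4 processors
balanced = partition_cost([250, 250, 250, 250], 1.0, 10.0)
skewed = partition_cost([400, 200, 200, 200], 1.0, 10.0)
print(balanced, skewed)  # 270.0 420.0
```

Even this crude model captures why remapping pays off: the skewed partition is bound by its most loaded rank, so rebalancing cells reduces the predicted step time.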

  18. NNLOPS accurate associated HW production

    NASA Astrophysics Data System (ADS)

    Astill, William; Bizon, Wojciech; Re, Emanuele; Zanderighi, Giulia

    2016-06-01

    We present a next-to-next-to-leading order accurate description of associated HW production consistently matched to a parton shower. The method is based on reweighting events obtained with the HW plus one jet NLO accurate calculation implemented in POWHEG, extended with the MiNLO procedure, to reproduce NNLO accurate Born distributions. Since the Born kinematics is more complex than the cases treated before, we use a parametrization of the Collins-Soper angles to reduce the number of variables required for the reweighting. We present phenomenological results at 13 TeV, with cuts suggested by the Higgs Cross section Working Group.

  19. The Psychology of Cost Estimating

    NASA Technical Reports Server (NTRS)

    Price, Andy

    2016-01-01

    Cost estimation for large (and even not so large) government programs is a challenge. The number and magnitude of cost overruns associated with large Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) programs highlight the difficulties in developing and promulgating accurate cost estimates. These overruns can be the result of inadequate technology readiness or requirements definition, the whims of politicians or government bureaucrats, or even failures of the cost estimating profession itself. However, there may be another reason for cost overruns that is right in front of us, but only recently have we begun to grasp it: the fact that cost estimators and their customers are human. The last 70+ years of research into human psychology and behavioral economics have yielded amazing findings into how we humans process and use information to make judgments and decisions. What these scientists have uncovered is surprising: humans are often irrational and illogical beings, making decisions based on factors such as emotion and perception, rather than facts and data. These built-in biases in our thinking directly affect how we develop our cost estimates and how those cost estimates are used. We cost estimators can use this knowledge of biases to improve our cost estimates and also to improve how we communicate and work with our customers. By understanding how our customers think, and more importantly, why they think the way they do, we can have more productive relationships and greater influence. By using psychology to our advantage, we can more effectively help the decision maker and our organizations make fact-based decisions.

  20. EPA (Environmental Protection Agency) evaluation of the Cyclone-Z device under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1983-01-01

    This document announces the conclusions of the EPA evaluation of the Cyclone-Z device under the provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. The evaluation of the Cyclone-Z device was conducted upon receiving an application from the marketer. The device is claimed to improve fuel economy and driveability and to reduce exhaust emissions. EPA fully considered all of the information submitted by the applicant. The evaluation of the Cyclone-Z device was based on that information, EPA's engineering judgement, and its experience with other air bleed devices.

  1. Ramjet cost estimating handbook

    NASA Technical Reports Server (NTRS)

    Emmons, H. T.; Norwood, D. L.; Rasmusen, J. E.; Reynolds, H. E.

    1978-01-01

    Research conducted under Air Force Contract F33615-76-C-2043 to generate cost data and to establish a cost methodology that accurately predicts the production costs of ramjet engines is presented. The cost handbook contains a description of over one hundred and twenty-five different components which are defined as baseline components. The cost estimator selects from the handbook the appropriate components to fit his ramjet assembly, computes the cost from cost computation data sheets in the handbook, and totals all of the appropriate cost elements to arrive at the total engine cost. The methodology described in the cost handbook addresses many different ramjet types, from simple podded arrangements of the liquid fuel ramjet to the more complex integral rocket/ramjet configurations, including solid fuel ramjets and solid ducted rockets. It is applicable to a range of sizes from 6 in. to 18 in. in diameter and to production quantities up to 5000 engines.

  2. COSTS OF URBAN STORMWATER CONTROL

    EPA Science Inventory

    This report presents information on the cost of stormwater pollution control facilities in urban areas, including collection, control, and treatment systems. Information on prior cost studies of control technologies and cost estimating models used in these studies was collected,...

  3. COSTS OF URBAN STORMWATER CONTROL

    EPA Science Inventory

    This paper presents information on the cost of stormwater pollution control facilities in urban areas, including collection, control, and treatment systems. Information on prior cost studies of control technologies and cost estimating models used in these studies was collected, r...

  4. Accurate mask model for advanced nodes

    NASA Astrophysics Data System (ADS)

    Zine El Abidine, Nacer; Sundermann, Frank; Yesilada, Emek; Ndiaye, El Hadji Omar; Mishra, Kushlendra; Paninjath, Sankaranarayanan; Bork, Ingo; Buck, Peter; Toublan, Olivier; Schanen, Isabelle

    2014-07-01

    Standard OPC models consist of a physical optical model and an empirical resist model. The resist model compensates for the optical model's imprecision on top of modeling resist development. The optical model imprecision may result from mask topography effects and real mask information, including mask ebeam writing and mask process contributions. For advanced technology nodes, significant progress has been made in modeling mask topography to improve optical model accuracy. However, mask information is difficult to decorrelate from the standard OPC model. Our goal is to establish an accurate mask model through a dedicated calibration exercise. In this paper, we present a flow to calibrate an accurate mask model, enabling its implementation. The study covers the different effects that should be embedded in the mask model as well as the experiments required to model them.

  5. PPD-QALY-an index for cost-effectiveness in orthopedics: providing essential information to both physicians and health care policy makers for appropriate allocation of medical resources.

    PubMed

    Dougherty, Christopher P; Howard, Timothy

    2013-09-01

    Because of the increasing health care costs and the need for proper allocation of resources, it is important to ensure the best use of health benefits for sick and injured people of the population. An index or indicator is needed to help us quantify what is being spent so that comparisons with other options can be implemented. Cost-effective analysis seems to be well suited to provide this essential information to health care policy makers and those charged with distributing disability funds so that the proper allocation of resources can be achieved. There is currently no such index to show whether the benefits paid out are the most cost-effective. By comparing the quality-adjusted life year (QALY) of a treatment method to the disability an individual would experience, on the basis of lost wages as measure of disability, we provide decision makers more information for the basis of cost allocation in health care. To accomplish this, we describe a new term, the PPD-QALY (permanent partial disability-quality of life year). This term was developed to establish an index to which musculoskeletal care can be compared, to evaluate the cost-effectiveness of a treatment on the basis of the monetary value of the disability. This term serves to standardize the monetary value of an injury. Cost-effective analysis in arthroscopic surgery may prove to be a valuable asset in this role and to provide decision makers the information needed to determine the societal benefit from new arthroscopic procedures as they are developed and implemented. PMID:23924750

  7. Accurate eye center location through invariant isocentric patterns.

    PubMed

    Valenti, Roberto; Gevers, Theo

    2012-09-01

    Locating the center of the eyes allows for valuable information to be captured and used in a wide range of applications. Accurate eye center location can be determined using commercial eye-gaze trackers, but additional constraints and expensive hardware make these existing solutions unattractive and impossible to use on standard (i.e., visible wavelength), low-resolution images of eyes. Systems based solely on appearance are proposed in the literature, but their accuracy does not allow us to accurately locate and distinguish eye centers movements in these low-resolution settings. Our aim is to bridge this gap by locating the center of the eye within the area of the pupil on low-resolution images taken from a webcam or a similar device. The proposed method makes use of isophote properties to gain invariance to linear lighting changes (contrast and brightness), to achieve in-plane rotational invariance, and to keep low-computational costs. To further gain scale invariance, the approach is applied to a scale space pyramid. In this paper, we extensively test our approach for its robustness to changes in illumination, head pose, scale, occlusion, and eye rotation. We demonstrate that our system can achieve a significant improvement in accuracy over state-of-the-art techniques for eye center location in standard low-resolution imagery. PMID:22813958

  8. Cost and quality of fuels for electric plants 1993

    SciTech Connect

    Not Available

    1994-07-01

    The Cost and Quality of Fuels for Electric Utility Plants (C&Q) presents an annual summary of statistics at the national, Census division, State, electric utility, and plant levels regarding the quantity, quality, and cost of fossil fuels used to produce electricity. The purpose of this publication is to provide energy decision-makers with accurate and timely information that may be used in forming various perspectives on issues regarding electric power.

  9. Cost and quality of fuels for electric utility plants, 1994

    SciTech Connect

    1995-07-14

    This document presents an annual summary of statistics at the national, Census division, State, electric utility, and plant levels regarding the quantity, quality, and cost of fossil fuels used to produce electricity. The purpose of this publication is to provide energy decision-makers with accurate and timely information that may be used in forming various perspectives on issues regarding electric power.

  10. Cost and quality of fuels for electric utility plants, 1992

    SciTech Connect

    Not Available

    1993-08-02

    This publication presents an annual summary of statistics at the national, Census division, State, electric utility, and plant levels regarding the quantity, quality, and cost of fossil fuels used to produce electricity. The purpose of this publication is to provide energy decision-makers with accurate and timely information that may be used in forming various perspectives on issues regarding electric power.

  11. Priority Setting for Universal Health Coverage: We Need Evidence-Informed Deliberative Processes, Not Just More Evidence on Cost-Effectiveness

    PubMed Central

    Baltussen, Rob; Jansen, Maarten P.; Mikkelsen, Evelinn; Tromp, Noor; Hontelez, Jan; Bijlmakers, Leon; Van der Wilt, Gert Jan

    2016-01-01

    Priority setting of health interventions is generally considered as a valuable approach to support low- and middle-income countries (LMICs) in their strive for universal health coverage (UHC). However, present initiatives on priority setting are mainly geared towards the development of more cost-effectiveness information, and this evidence does not sufficiently support countries to make optimal choices. The reason is that priority setting is in reality a value-laden political process in which multiple criteria beyond cost-effectiveness are important, and stakeholders often justifiably disagree about the relative importance of these criteria. Here, we propose the use of ‘evidence-informed deliberative processes’ as an approach that does explicitly recognise priority setting as a political process and an intrinsically complex task. In these processes, deliberation between stakeholders is crucial to identify, reflect and learn about the meaning and importance of values, informed by evidence on these values. Such processes then result in the use of a broader range of explicit criteria that can be seen as the product of both international learning (‘core’ criteria, which include eg, cost-effectiveness, priority to the worse off, and financial protection) and learning among local stakeholders (‘contextual’ criteria). We believe that, with these evidence-informed deliberative processes in place, priority setting can provide a more meaningful contribution to achieving UHC. PMID:27801355

  12. Automatic classification and accurate size measurement of blank mask defects

    NASA Astrophysics Data System (ADS)

    Bhamidipati, Samir; Paninjath, Sankaranarayanan; Pereira, Mark; Buck, Peter

    2015-07-01

    A blank mask and its preparation stages, such as cleaning or resist coating, play an important role in the eventual yield obtained by using it. Blank mask defects' impact analysis directly depends on the amount of available information such as the number of defects observed, their accurate locations and sizes. Mask usability qualification at the start of the preparation process, is crudely based on number of defects. Similarly, defect information such as size is sought to estimate eventual defect printability on the wafer. Tracking of defect characteristics, specifically size and shape, across multiple stages, can further be indicative of process related information such as cleaning or coating process efficiencies. At the first level, inspection machines address the requirement of defect characterization by detecting and reporting relevant defect information. The analysis of this information though is still largely a manual process. With advancing technology nodes and reducing half-pitch sizes, a large number of defects are observed; and the detailed knowledge associated, make manual defect review process an arduous task, in addition to adding sensitivity to human errors. Cases where defect information reported by inspection machine is not sufficient, mask shops rely on other tools. Use of CDSEM tools is one such option. However, these additional steps translate into increased costs. Calibre NxDAT based MDPAutoClassify tool provides an automated software alternative to the manual defect review process. Working on defect images generated by inspection machines, the tool extracts and reports additional information such as defect location, useful for defect avoidance[4][5]; defect size, useful in estimating defect printability; and, defect nature e.g. particle, scratch, resist void, etc., useful for process monitoring. The tool makes use of smart and elaborate post-processing algorithms to achieve this. Their elaborateness is a consequence of the variety and

  13. Development of standardized air-blown coal gasifier/gas turbine concepts for future electric power systems. Volume 5, Appendix D: Cost support information: Final report

    SciTech Connect

    Sadowski, R.S.; Brown, M.J.; Harriz, J.T.; Ostrowski, E.

    1991-01-01

    The cost estimate provided for the DOE-sponsored study of Air Blown Coal Gasification was developed from vendor quotes obtained directly for the equipment needed in the 50 MW, 100 MW, and 200 MW sized plants, and from quotes for other jobs that have been referenced to apply to the particular cycle. Quotes were generally obtained for the 100 MW cycle, and a scale up/down factor was used to generate the cost estimates for the 200 MW and 50 MW cycles, respectively. Information from GTPro (property of Thermoflow, Inc.) was used to estimate the cost of the 200 MW and 50 MW gas turbines, HRSGs, and steam turbines. To validate the use of GTPro's estimated values for this equipment, a comparison was made between the quotes obtained for the 100 MW cycle (an ABB GT 11N combustion turbine and an HRSG) and the values estimated by GTPro.
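The scale up/down factor mentioned above is commonly implemented as a power-law capacity relation. The sketch below uses a hypothetical base quote and a generic illustrative exponent of 0.7; real scaling factors are equipment-specific and would be derived from vendor data such as that described in the report.

```python
def scaled_cost(base_cost, base_capacity_mw, target_capacity_mw, exponent=0.7):
    """Power-law capacity scaling: cost2 = cost1 * (cap2 / cap1) ** n.

    The exponent 0.7 is a generic illustrative value, not one taken
    from the study; cost grows sub-linearly with capacity when n < 1.
    """
    return base_cost * (target_capacity_mw / base_capacity_mw) ** exponent

# Scale a hypothetical $10M quote for a 100 MW unit down to 50 MW and up to 200 MW.
cost_50 = scaled_cost(10e6, 100, 50)
cost_200 = scaled_cost(10e6, 100, 200)
```

Doubling capacity with an exponent of 0.7 raises cost by a factor of 2^0.7 (about 1.62), which is why per-MW costs fall as plant size grows.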

  14. Cost Analysis of Selected Patient Categories within a Dermatology Department Using an ABC Approach

    PubMed Central

    Papadaki, Šárka; Popesko, Boris

    2016-01-01

    Background: Present trends in hospital management are facilitating the utilization of more accurate costing methods, which potentially results in superior cost-related information and improved managerial decision-making. However, the Activity-Based Costing method (ABC), which was designed for cost allocation purposes in the 1980s, is not widely used by healthcare organizations. This study analyzes costs related to selected categories of patients, those suffering from psoriasis, varicose ulcers, eczema and other conditions, within a dermatology department at a Czech regional hospital. Methods: The study was conducted in a hospital department where both inpatient and outpatient care are offered. Firstly, the diseases treated at the department were identified. Costs were then determined for each activity using ABC. The study utilized data from managerial and financial accounting, as well as data obtained through interviews with departmental staff. Using a defined cost-allocation procedure makes it possible to determine the cost of an individual patient with a given disease more accurately than via traditional costing procedures. Results: The cost analysis focused on the differences between the costs related to individual patients within the selected diagnoses, variations between inpatient and outpatient treatments, and the costs of activities performed by the dermatology department. Furthermore, the costs identified through this approach can be compared with the revenue received from the health insurance system. Conclusions: Activity-Based Costing is more accurate and relevant than the traditional costing method. The outputs of ABC provide an abundance of additional information for managers. The benefits of this research lie in its practically-tested outputs, resulting from calculating the costs of hospitalization, which could prove invaluable to persons involved in hospital management and decision-making. The study also defines the managerial implications of
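The core mechanic of ABC, allocating each activity's cost pool to patients via a cost driver rate, can be sketched in a few lines. All activity names and figures below are hypothetical, invented for illustration, not taken from the study:

```python
# Minimal activity-based costing sketch. Each activity has an annual cost
# pool and a cost driver; a patient's cost is the sum of driver units
# consumed times the activity's driver rate. All numbers are hypothetical.

activity_pools = {            # annual cost pool per activity
    "admission":    120_000,
    "nursing_care": 480_000,
    "dressing":      90_000,
}
driver_totals = {             # annual driver volume per activity
    "admission":     2_000,   # number of admissions
    "nursing_care": 24_000,   # nursing hours
    "dressing":      6_000,   # dressing changes
}

# Driver rate = cost pool / driver volume (e.g. cost per nursing hour).
rates = {a: activity_pools[a] / driver_totals[a] for a in activity_pools}

def patient_cost(consumption):
    """Cost of one patient, given driver units consumed per activity."""
    return sum(rates[a] * units for a, units in consumption.items())

# e.g. a hypothetical varicose-ulcer inpatient:
# 1 admission, 40 nursing hours, 12 dressing changes.
cost = patient_cost({"admission": 1, "nursing_care": 40, "dressing": 12})
```

A traditional aggregate method would instead divide total departmental cost by patient count, giving every patient the same average cost regardless of the activities actually consumed.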

  15. Accurate Stellar Parameters for Exoplanet Host Stars

    NASA Astrophysics Data System (ADS)

    Brewer, John Michael; Fischer, Debra; Basu, Sarbani; Valenti, Jeff A.

    2015-01-01

    A large impediment to our understanding of planet formation is obtaining a clear picture of planet radii and densities. Although determining precise ratios between a planet and its stellar host is relatively easy, determining accurate stellar parameters is still a difficult and costly undertaking. High resolution spectral analysis has traditionally yielded precise values for some stellar parameters, but stars in common between catalogs from different authors, or analyzed using different techniques, often show offsets far in excess of their uncertainties. Most analyses now use some external constraint, when available, to break observed degeneracies between surface gravity, effective temperature, and metallicity, which can otherwise lead to correlated errors in results. However, these external constraints are impossible to obtain for all stars and can require more costly observations than the initial high resolution spectra. We demonstrate that these discrepancies can be mitigated by use of a larger line list with carefully tuned atomic line data. We use an iterative modeling technique that does not require external constraints. We compare the surface gravity obtained with our spectral synthesis modeling to asteroseismically determined values for 42 Kepler stars. Our analysis agrees well, with only a 0.048 dex offset and an rms scatter of 0.05 dex. Such accurate stellar gravities can reduce the primary source of uncertainty in radii by almost an order of magnitude over unconstrained spectral analysis.

  16. Human African trypanosomiasis prevention, treatment and control costs: a systematic review.

    PubMed

    Keating, Joseph; Yukich, Joshua O; Sutherland, C Simone; Woods, Geordie; Tediosi, Fabrizio

    2015-10-01

    The control and eventual elimination of human African trypanosomiasis (HAT) requires the expansion of current control and surveillance activities. A systematic review of the published literature on the costs of HAT prevention, treatment, and control, in addition to the economic burden, was conducted. All studies that contained primary or secondary data on costs of prevention, treatment and control were considered, resulting in the inclusion of 42 papers. The geographically focal nature of the disease and a lack of standardization in the cost data limit the usefulness of the available information for making generalizations across diverse settings. More recent information on the costs of treatment and control interventions for HAT is needed to provide accurate information for analyses and planning. The cost information contained herein can be used to inform rational decision making in control and elimination programs, and to assess potential synergies with existing vector-borne disease control programs, but programs would benefit significantly from new cost data collection.

  17. EPA evaluation of the Malpassi Filter King device under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1983-01-01

    This report announces the conclusions of the EPA evaluation of the 'Malpassi Filter King' device under provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. The evaluation of the 'Malpassi Filter King' device was conducted upon receiving an application from the marketer. The device is a gasoline pressure regulator. The 'Malpassi Filter King' device is claimed to save gasoline by improving the fuel economy of carburetor-equipped, automotive engines.

  18. An Investigation of the Perceptions of Low-Income Students of Color Concerning College Costs and Financial Aid Information

    ERIC Educational Resources Information Center

    Waters, Jennifer A.

    2009-01-01

    As college enrollments continue to increase, the disparity between middle-income white students and low-income students of color enrolling in private higher educational institutions continues to widen. Previous research has identified barriers such as access and equity in education, the high cost of education, and limited knowledge regarding…

  19. Accurate phase-shift velocimetry in rock.

    PubMed

    Shukla, Matsyendra Nath; Vallatos, Antoine; Phoenix, Vernon R; Holmes, William M

    2016-06-01

    Spatially resolved Pulsed Field Gradient (PFG) velocimetry techniques can provide precious information concerning flow through opaque systems, including rocks. This velocimetry data is used to enhance flow models in a wide range of systems, from oil behaviour in reservoir rocks to contaminant transport in aquifers. Phase-shift velocimetry is the fastest way to produce velocity maps but critical issues have been reported when studying flow through rocks and porous media, leading to inaccurate results. Combining PFG measurements for flow through Bentheimer sandstone with simulations, we demonstrate that asymmetries in the molecular displacement distributions within each voxel are the main source of phase-shift velocimetry errors. We show that when flow-related average molecular displacements are negligible compared to self-diffusion ones, symmetric displacement distributions can be obtained while phase measurement noise is minimised. We elaborate a complete method for the production of accurate phase-shift velocimetry maps in rocks and low porosity media and demonstrate its validity for a range of flow rates. This development of accurate phase-shift velocimetry now enables more rapid and accurate velocity analysis, potentially helping to inform both industrial applications and theoretical models. PMID:27111139
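For readers unfamiliar with PFG velocimetry, the basic phase-to-velocity relation can be sketched as follows. This assumes the standard narrow-gradient-pulse approximation and illustrative parameter values; the paper's treatment of asymmetric displacement distributions is not reproduced here.

```python
# Phase-shift velocimetry sketch. In the narrow-gradient-pulse
# approximation, coherent flow along the gradient direction accumulates
# a phase  phi = gamma * g * delta * Delta * v,  so the mean velocity is
# recovered as  v = phi / (gamma * g * delta * Delta).

GAMMA_H = 2.675e8  # 1H gyromagnetic ratio, rad s^-1 T^-1

def velocity_from_phase(phi, g, delta, big_delta, gamma=GAMMA_H):
    """phi: measured phase shift (rad); g: gradient amplitude (T/m);
    delta: gradient pulse duration (s); big_delta: observation time (s).
    Returns the mean flow velocity in m/s."""
    return phi / (gamma * g * delta * big_delta)
```

For example, with g = 0.1 T/m, delta = 1 ms and Delta = 20 ms, a 1 mm/s flow produces a phase of roughly 5.35 mrad, which the function inverts exactly.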

  2. Processing of Perceptual Information Is More Robust than Processing of Conceptual Information in Preschool-Age Children: Evidence from Costs of Switching

    ERIC Educational Resources Information Center

    Fisher, Anna V.

    2011-01-01

    Is processing of conceptual information as robust as processing of perceptual information early in development? Existing empirical evidence is insufficient to answer this question. To examine this issue, 3- to 5-year-old children were presented with a flexible categorization task, in which target items (e.g., an open red umbrella) shared category…

  3. Rabbit System. Low cost, high reliability front end electronics featuring 16 bit dynamic range. [Redundant Analog Bus Based Information Transfer

    SciTech Connect

    Drake, G.; Droege, T.F.; Nelson, C.A. Jr.; Turner, K.J.; Ohska, T.K.

    1985-10-01

    A new crate-based front end system has been built which features low cost, compact packaging, command capability, 16 bit dynamic range digitization, and a high degree of redundancy. The crate can contain a variety of instrumentation modules, and is designed to be situated close to the detector. The system is suitable for readout of a large number of channels via parallel multiprocessor data acquisition.

  4. SUPPORT Tools for evidence-informed health Policymaking (STP) 12: Finding and using research evidence about resource use and costs.

    PubMed

    Oxman, Andrew D; Fretheim, Atle; Lavis, John N; Lewin, Simon

    2009-01-01

    This article is part of a series written for people responsible for making decisions about health policies and programmes and for those who support these decision makers. In this article, we address considerations about resource use and costs. The consequences of a policy or programme option for resource use differ from other impacts (both in terms of benefits and harms) in several ways. However, considerations of the consequences of options for resource use are similar to considerations related to other impacts in that policymakers and their staff need to identify important impacts on resource use, acquire and appraise the best available evidence regarding those impacts, and ensure that appropriate monetary values have been applied. We suggest four questions that can be considered when assessing resource use and the cost consequences of an option. These are: 1. What are the most important impacts on resource use? 2. What evidence is there for important impacts on resource use? 3. How confident is it possible to be in the evidence for impacts on resource use? 4. Have the impacts on resource use been valued appropriately in terms of their true costs? PMID:20018102

  6. The Impact of Densification by Means of Informal Shacks in the Backyards of Low-Cost Houses on the Environment and Service Delivery in Cape Town, South Africa

    PubMed Central

    Govender, Thashlin; Barnes, Jo M.; Pieper, Clarissa H.

    2011-01-01

    This paper investigates the state-sponsored low-cost housing provided to previously disadvantaged communities in the City of Cape Town. The strain imposed on municipal services by informal densification through unofficial backyard shacks was found to create unintended public health risks. Four subsidized low-cost housing communities within the City of Cape Town were selected for this cross-sectional survey. Data were obtained from 1080 persons with a response rate of 100%. Illegal electrical connections to backyard shacks, which are made of flimsy materials, posed increased fire risks. A high proportion of main house owners did not pay for water but sold water to backyard dwellers. The design of the state-subsidised houses and the unplanned housing in the backyards added enormous pressure on the existing municipal infrastructure and the environment. Municipal water, sewerage, and solid waste disposal systems cannot cope with the increased population density and poor sanitation behaviour of the inhabitants of these settlements. The low-cost housing program in South Africa requires improved management and prudent policies to cope with the densification of state-funded low-cost housing settlements. PMID:21695092

  7. New law requires 'medically accurate' lesson plans.

    PubMed

    1999-09-17

    The California Legislature has passed a bill requiring that all textbooks and materials used to teach about AIDS be medically accurate and objective. Statements made within the curriculum must be supported by research conducted in compliance with scientific methods and published in peer-reviewed journals. Some of the current lesson plans were found to contain scientifically unsupported and biased information. In addition, the bill requires material to be "free of racial, ethnic, or gender biases." The legislation is supported by a wide range of interests, but opposed by the California Right to Life Education Fund, which believes it discredits abstinence-only material.

  9. The impact of activity based cost accounting on health care capital investment decisions.

    PubMed

    Greene, J K; Metwalli, A

    2001-01-01

    For the future survival of the rural hospitals in the U.S., there is a need to make sound financial decisions. The Activity Based Cost Accounting (ABC) provides more accurate and detailed cost information to make an informed capital investment decision taking into consideration all the costs and revenue reimbursement from third party payors. The paper analyzes, evaluates and compares two scenarios of acquiring capital equipment and attempts to show the importance of utilizing the ABC method in making a sound financial decision as compared to the traditional cost method. PMID:11794757
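Such capital investment comparisons are typically framed as discounted cash flows over each scenario's net annual cash flows (third-party reimbursement minus ABC-derived operating costs). The sketch below uses hypothetical figures, not numbers from the paper:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at time 0 (e.g. the outlay)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Two hypothetical acquisition scenarios for the same equipment:
# an outright purchase versus a lease, each with net annual cash flows
# (reimbursement minus ABC-derived operating costs). Illustrative only.
purchase = [-500_000] + [150_000] * 5   # large outlay, higher net inflows
lease    = [0] + [30_000] * 5           # no outlay, smaller net inflows

# At an assumed 8% discount rate, pick the scenario with the higher NPV.
better = "purchase" if npv(0.08, purchase) > npv(0.08, lease) else "lease"
```

The point the abstract makes is that the cash flows fed into such a calculation are only as good as the costing behind them; ABC-derived costs change which scenario wins compared with averaged traditional costs.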

  10. Social cost impact assessment of pipeline infrastructure projects

    SciTech Connect

    Matthews, John C.; Allouche, Erez N.; Sterling, Raymond L.

    2015-01-15

    A key advantage of trenchless construction methods compared with traditional open-cut methods is their ability to install or rehabilitate underground utility systems with limited disruption to the surrounding built and natural environments. The equivalent monetary values of these disruptions are commonly called social costs. Social costs are often ignored by engineers or project managers during project planning and design phases, partially because they cannot be calculated using standard estimating methods. In recent years some approaches for estimating social costs were presented. Nevertheless, the cost data needed for validation of these estimating methods is lacking. Development of such social cost databases can be accomplished by compiling relevant information reported in various case histories. This paper identifies eight most important social cost categories, presents mathematical methods for calculating them, and summarizes the social cost impacts for two pipeline construction projects. The case histories are analyzed in order to identify trends for the various social cost categories. The effectiveness of the methods used to estimate these values is also discussed. These findings are valuable for pipeline infrastructure engineers making renewal technology selection decisions by providing a more accurate process for the assessment of social costs and impacts. - Highlights: • Identified the eight most important social cost factors for pipeline construction • Presented mathematical methods for calculating those social cost factors • Summarized social cost impacts for two pipeline construction projects • Analyzed those projects to identify trends for the social cost factors.

  11. Activity-Based Costing: A Cost Management Tool.

    ERIC Educational Resources Information Center

    Turk, Frederick J.

    1993-01-01

    In college and university administration, overhead costs are often charged to programs indiscriminately, whereas the support activities that underlie those costs remain unanalyzed. It is time for institutions to decrease ineffective use of resources. Activity-based management attributes costs more accurately and can improve efficiency. (MSE)

  12. Practical aspects of spatially high accurate methods

    NASA Technical Reports Server (NTRS)

    Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.

    1992-01-01

    The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.
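The standard MUSCL differencing included for comparison can be illustrated with a one-dimensional limited linear reconstruction. This is a generic minmod-limited sketch, far simpler than the paper's ENO and k-exact operators:

```python
# 1D MUSCL reconstruction sketch with a minmod slope limiter
# (illustrative; the paper's adaptive-stencil ENO operators are
# considerably more elaborate).

def minmod(a, b):
    """Return the argument of smaller magnitude, or 0 at an extremum."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def muscl_left_states(u):
    """Left states at each interior cell's right face, from cell averages u.
    Uses the limited linear reconstruction u_i + 0.5 * slope_i."""
    faces = []
    for i in range(1, len(u) - 1):
        slope = minmod(u[i] - u[i - 1], u[i + 1] - u[i])
        faces.append(u[i] + 0.5 * slope)
    return faces
```

On smooth linear data the limiter returns the exact slope (second-order accuracy); at a local extremum it clips the slope to zero, which suppresses the oscillations near shocks that the paper examines.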

  13. Financing and budgetary impact of landslide losses for highways and urban infrastructures in NW Germany - an economic analysis using landslide database information and cost survey data

    NASA Astrophysics Data System (ADS)

    Maurischat, Philipp; Klose, Martin

    2014-05-01

    Recent studies show that, even in the low mountain areas of Central and Western Europe, landslides cause millions of dollars in losses each year (Klose et al., 2012; Vranken et al., 2013). The objective of this study has therefore been to model landslide disaster financing and to assess the budgetary impacts of landslide losses for highways and urban infrastructures in the Lower Saxon Uplands, NW Germany. The present contribution includes two case studies on the financial burden of landslides for public budgets, using the examples of the Lower Saxony Department of Transportation and the city of Hann. Münden. The basis of this research is a regional subset of a landslide database for the Federal Republic of Germany. Using a toolset for landslide cost modeling based on landslide databases (Klose et al., 2013), the direct costs of more than 30 landslide damage events to highways in a local case study area were determined. The average annual landslide maintenance, repair, and mitigation costs for highways in this case study area are estimated at 0.76 million between 1980 and 2010. In addition, a cost survey based on expert interviews was conducted to collect landslide loss data for urban infrastructures. This cost survey for the city of Hann. Münden shows annual landslide losses of up to 3.4 million during the previous 10 years. Further expert interviews at city and highway agency level focused on identifying the procedures, resources, and limits of financing landslide damage costs. The information on landslide disaster financing, together with cost survey data on annual maintenance and construction budgets for highways, city sewer lines, and urban roads, was used to evaluate the fiscal significance of the estimated landslide losses. The results of this economic impact assessment reveal variable financial burdens on the analyzed public budgets. Thus, in costly years with landslide losses of more than 7 million, the Lower Saxony Department of Transportation is required to shift up to 19% of its

  14. Activity-Based Costing in the After Press Services Industry

    NASA Astrophysics Data System (ADS)

    Shevasuthisilp, Suntichai; Punsathitwong, Kosum

    2009-10-01

    This research was conducted to apply activity-based costing (ABC) in an after press service company in Chiang Mai province, Thailand. The company produces all of its products by one-stop service (such as coating, stitching, binding, die cutting, and gluing). All products are made to order, and have different sizes and patterns. A strategy of low price is used to compete in the marketplace. After cost analysis, the study found that the company has high overhead (36.5% of total cost). The company's problem is its use of traditional cost accounting, which has low accuracy in assigning overhead costs. If management uses this information when pricing customer orders, losses may occur because real production costs may be higher than the selling price. Therefore, the application of ABC in cost analysis can help executives receive accurate cost information; establish a sound pricing strategy; and improve the manufacturing process by determining work activities which have excessively high production costs. According to this research, 6 out of 56 items had a production cost higher than the selling price, leading to losses of 123,923 baht per year. Methods used to solve this problem were: reducing production costs; establishing suitable prices; and creating a sales promotion with lower prices for customers whose orders include processes involving unused capacity. These actions will increase overall sales of the company, and allow more efficient use of its machinery.

  15. Accurate documentation and wound measurement.

    PubMed

    Hampton, Sylvie

    This article, part 4 in a series on wound management, addresses the sometimes routine yet crucial task of documentation. Clear and accurate records of a wound enable its progress to be determined so the appropriate treatment can be applied. Thorough records mean any practitioner picking up a patient's notes will know when the wound was last checked, how it looked and what dressing and/or treatment was applied, ensuring continuity of care. Documenting every assessment also has legal implications, demonstrating due consideration and care of the patient and the rationale for any treatment carried out. Part 5 in the series discusses wound dressing characteristics and selection.

  16. Accurate calculation of the absolute free energy of binding for drug molecules

    PubMed Central

    Aldeghi, Matteo; Heifetz, Alexander; Bodkin, Michael J.; Knapp, Stefan

    2016-01-01

    Accurate prediction of binding affinities has been a central goal of computational chemistry for decades, yet remains elusive. Despite good progress, the required accuracy for use in a drug-discovery context has not been consistently achieved for drug-like molecules. Here, we perform absolute free energy calculations based on a thermodynamic cycle for a set of diverse inhibitors binding to bromodomain-containing protein 4 (BRD4) and demonstrate that a mean absolute error of 0.6 kcal mol^-1 can be achieved. We also show that a similar level of accuracy (1.0 kcal mol^-1) can be achieved in a pseudo-prospective approach. Bromodomains are epigenetic mark readers that recognize acetylation motifs and regulate gene transcription, and are currently being investigated as therapeutic targets for cancer and inflammation. The unprecedented accuracy offers the exciting prospect that the binding free energy of drug-like compounds can be predicted for pharmacologically relevant targets. PMID:26798447

  17. Financial managers' costing expertise is needed in clinical trials.

    PubMed

    West, D A; Balas, E A; West, T D

    2000-01-01

    In addition to providing comparable and verifiable evidence regarding outcomes, clinical trials could also serve as sources of accurate and replicable financial information. Trial reports that identify expenses associated with effective diagnostic and therapeutic interventions enable cost controls. Standardized cost calculations could help clinicians and administrators identify more efficient health care technologies. Unfortunately, relatively few published trials include economic analyses and when they do, data are incomplete. Based on analyses of 97 clinical trial reports, this article proposes a standard costing format. Health care financial managers have the costing expertise necessary to implement and interpret standardized cost calculations for clinical trials. With the active involvement of financial managers, a standard costing format for clinical trials can be achieved. PMID:10961828

  18. Cost estimation model for advanced planetary programs, fourth edition

    NASA Technical Reports Server (NTRS)

    Spadoni, D. J.

    1983-01-01

    The development of the planetary program cost model is discussed. The model was updated to incorporate cost data from the most recent US planetary flight projects and extensively revised to more accurately capture the information in the historical cost database, which comprises the historical cost data for 13 unmanned lunar and planetary flight programs. The revision had a twofold objective: to increase the model's flexibility in dealing with the broad scope of scenarios under consideration for future missions, and to maintain, and possibly improve upon, confidence in the model's capabilities, with an expected accuracy of 20%. The model development included a labor/cost proxy analysis, selection of the functional forms of the estimating relationships, and test statistics. An analysis of the model is discussed and two sample applications of the cost model are presented.

  19. The cost of breast cancer recurrences.

    PubMed Central

    Hurley, S. F.; Huggins, R. M.; Snyder, R. D.; Bishop, J. F.

    1992-01-01

    Information about the costs of recurrent breast cancer is potentially important for targeting cost containment strategies and analysing the cost-effectiveness of breast cancer control programmes. We estimated these costs by abstracting health service and consumable usage data from the medical histories of 128 patients, and valuing each of the resources used. Resource usage and costs were summarised by regarding the recurrence as a series of episodes, which were categorised into five anatomical site-based groups according to the following hierarchy: visceral, central nervous system (CNS), bone, local and other. Hospital visits and investigations comprised 78% of total costs for all episodes combined, and there were significant differences between the site-based groups in the frequency of hospital visits and most investigations. Total costs were most accurately described by separate linear regression models for each group, with the natural logarithm of the cost of the episode as the dependent variable, and predictor variables including the duration of the episode, duration squared, duration cubed and a variable indicating whether the episode was fatal. Visceral and CNS episodes were associated with higher costs than the other groups and were more likely to be shorter and fatal. A fatal recurrence of duration 15.7 months (the median for our sample) was predicted to cost $10,575 (Australian dollars, 1988; or 4,877 pounds). Reduction of the substantial costs of recurrent breast cancer is likely to be a sizable economic benefit of adjuvant systemic therapy and mammographic screening. We did not identify any major opportunities for cost containment during the management of recurrences. PMID:1558803
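The reported model form, the natural logarithm of episode cost regressed on duration, its square and cube, plus a fatal-episode indicator, can be sketched as an ordinary least-squares fit. The data below are synthetic, generated from an arbitrary coefficient vector purely to illustrate the fit; none of the numbers come from the study:

```python
import numpy as np

def fit_log_cost(duration, fatal, cost):
    """OLS fit of ln(cost) on 1, d, d^2, d^3 and a fatal indicator,
    mirroring the model form reported above."""
    X = np.column_stack([np.ones_like(duration), duration,
                         duration**2, duration**3, fatal])
    beta, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
    return beta

def predict_cost(beta, d, fatal):
    """Predicted episode cost for duration d (months) and fatal flag."""
    x = np.array([1.0, d, d**2, d**3, float(fatal)])
    return float(np.exp(x @ beta))

# Synthetic illustration: costs generated from a known coefficient
# vector, then recovered exactly by the fit (noise-free data).
dur = np.linspace(1.0, 36.0, 40)             # episode durations, months
fatal = (np.arange(40) % 2).astype(float)    # fatal-episode indicator
beta_true = np.array([8.0, 0.1, -0.002, 1e-5, 0.3])
costs = np.exp(np.column_stack([np.ones_like(dur), dur, dur**2,
                                dur**3, fatal]) @ beta_true)
beta_hat = fit_log_cost(dur, fatal, costs)
```

Because the response enters in log form, the fatal-indicator coefficient acts multiplicatively: a coefficient of 0.3 implies fatal episodes cost about exp(0.3), roughly 35%, more than otherwise identical non-fatal ones.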

  20. EPA (Environmental Protection Agency) evaluation of the P. S. C. U. 01 device under section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1983-08-01

    This document announces the conclusions of the EPA evaluation of the 'P.S.C.U. 01' device under the provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. The evaluation of the P.S.C.U. 01 was conducted upon the application of Dutch Pacific, Incorporated. The device comprises several mechanical and electrical components and is intended to generate steam and deliver it to the combustion chamber via an inline catalyst. The device is claimed to improve fuel economy and to reduce exhaust emissions. The P.S.C.U. 01 is classified by EPA as a vapor bleed device.

  1. EPA (Environmental Protection Agency) evaluation of the gyroscopic wheel cover device under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1983-06-01

    This report announces the conclusions of the Environmental Protection Agency (EPA) evaluation of the Gyroscopic Wheel Cover under the provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. The evaluation of the Gyroscopic Wheel Cover device was conducted upon the application of Simmer Wheels, Incorporated. The device is a mechanical assembly which replaces each of the standard wheel covers on a vehicle. The device is claimed to improve fuel economy, handling and braking characteristics, and the life of the brakes and tires.

  2. Cost of dengue outbreaks: literature review and country case studies

    PubMed Central

    2013-01-01

    different cost components (vector control; surveillance; information, education and communication; direct medical and indirect costs), as percentage of total costs, differed across the respective countries. Resources used for dengue disease control and treatment were country specific. Conclusions The evidence so far collected further confirms the methodological challenges in this field: 1) to define technically dengue outbreaks (what do we measure?) and 2) to measure accurately the costs in prospective field studies (how do we measure?). Currently, consensus on the technical definition of an outbreak is sought through the International Research Consortium on Dengue Risk Assessment, Management and Surveillance (IDAMS). Best practice guidelines should be further developed, also to improve the quality and comparability of cost study findings. Modelling the costs of dengue outbreaks and validating these models through field studies should guide further research. PMID:24195519

  3. Urban School Desegregation Costs.

    ERIC Educational Resources Information Center

    Colton, David L.

    The findings of an exploratory study of urban school desegregation costs are reported in this paper. The study examined five cities faced with desegregating their schools: Cleveland, Columbus, Buffalo, Dayton, and Milwaukee. The main body of the report presents descriptive information about desegregation costs. Cost variations among cities are…

  4. COST OF MTBE REMEDIATION

    EPA Science Inventory

    Widespread contamination of methyl tert-butyl ether (MTBE) in ground water has raised concerns about the increased cost of remediation of MTBE releases compared to BTEX-only sites. To evaluate these costs, cost information for 311 sites was furnished by U.S. EPA Office of Undergr...

  5. SPLASH: Accurate OH maser positions

    NASA Astrophysics Data System (ADS)

    Walsh, Andrew; Gomez, Jose F.; Jones, Paul; Cunningham, Maria; Green, James; Dawson, Joanne; Ellingsen, Simon; Breen, Shari; Imai, Hiroshi; Lowe, Vicki; Jones, Courtney

    2013-10-01

    The hydroxyl (OH) 18 cm lines are powerful and versatile probes of diffuse molecular gas that may trace a largely unstudied component of the Galactic ISM. SPLASH (the Southern Parkes Large Area Survey in Hydroxyl) is a large, unbiased and fully-sampled survey of OH emission, absorption and masers in the Galactic Plane that will achieve sensitivities an order of magnitude better than previous work. In this proposal, we request ATCA time to follow up OH maser candidates. This will give us accurate (~10") positions of the masers, which can be compared to other maser positions from HOPS, MMB and MALT-45, and will provide full polarisation measurements towards a sample of OH masers that have not been observed in MAGMO.

  6. Role of information systems in controlling costs: the electronic medical record (EMR) and the high-performance computing and communications (HPCC) efforts

    NASA Astrophysics Data System (ADS)

    Kun, Luis G.

    1994-12-01

    On October 18, 1991, the IEEE-USA produced an entity statement which endorsed the vital importance of the High Performance Computer and Communications Act of 1991 (HPCC) and called for the rapid implementation of all its elements. Efforts are now underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the HPCC, and the so-called 'Patient Card'. Multiple legislative initiatives which address these and related information technology issues are pending in Congress. Clearly, a national information system will greatly affect the way health care delivery is provided to the United States public. Timely and reliable information represents a critical element in any initiative to reform the health care system as well as to protect and improve the health of every person. Appropriately used, information technologies offer a vital means of improving the quality of patient care, increasing access to universal care and lowering overall costs within a national health care program. Health care reform legislation should reflect increased budgetary support and a legal mandate for the creation of a national health care information system by: (1) constructing a National Information Infrastructure; (2) building a Computer Based Patient Record System; (3) bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and (4) utilizing Government (e.g. DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs and accelerate technology transfer to address health care issues.

  7. Accurate thickness measurement of graphene

    NASA Astrophysics Data System (ADS)

    Shearer, Cameron J.; Slattery, Ashley D.; Stapleton, Andrew J.; Shapter, Joseph G.; Gibson, Christopher T.

    2016-03-01

    Graphene has emerged as a material with a vast variety of applications. The electronic, optical and mechanical properties of graphene are strongly influenced by the number of layers present in a sample. As a result, the dimensional characterization of graphene films is crucial, especially with the continued development of new synthesis methods and applications. A number of techniques exist to determine the thickness of graphene films including optical contrast, Raman scattering and scanning probe microscopy techniques. Atomic force microscopy (AFM), in particular, is used extensively since it provides three-dimensional images that enable the measurement of the lateral dimensions of graphene films as well as the thickness, and by extension the number of layers present. However, in the literature AFM has proven to be inaccurate with a wide range of measured values for single layer graphene thickness reported (between 0.4 and 1.7 nm). This discrepancy has been attributed to tip-surface interactions, image feedback settings and surface chemistry. In this work, we use standard and carbon nanotube modified AFM probes and a relatively new AFM imaging mode known as PeakForce tapping mode to establish a protocol that will allow users to accurately determine the thickness of graphene films. In particular, the error in measuring the first layer is reduced from 0.1-1.3 nm to 0.1-0.3 nm. Furthermore, in the process we establish that the graphene-substrate adsorbate layer and imaging force, in particular the pressure the tip exerts on the surface, are crucial components in the accurate measurement of graphene using AFM. These findings can be applied to other 2D materials.

  8. Accurate thickness measurement of graphene.

    PubMed

    Shearer, Cameron J; Slattery, Ashley D; Stapleton, Andrew J; Shapter, Joseph G; Gibson, Christopher T

    2016-03-29

    Graphene has emerged as a material with a vast variety of applications. The electronic, optical and mechanical properties of graphene are strongly influenced by the number of layers present in a sample. As a result, the dimensional characterization of graphene films is crucial, especially with the continued development of new synthesis methods and applications. A number of techniques exist to determine the thickness of graphene films including optical contrast, Raman scattering and scanning probe microscopy techniques. Atomic force microscopy (AFM), in particular, is used extensively since it provides three-dimensional images that enable the measurement of the lateral dimensions of graphene films as well as the thickness, and by extension the number of layers present. However, in the literature AFM has proven to be inaccurate with a wide range of measured values for single layer graphene thickness reported (between 0.4 and 1.7 nm). This discrepancy has been attributed to tip-surface interactions, image feedback settings and surface chemistry. In this work, we use standard and carbon nanotube modified AFM probes and a relatively new AFM imaging mode known as PeakForce tapping mode to establish a protocol that will allow users to accurately determine the thickness of graphene films. In particular, the error in measuring the first layer is reduced from 0.1-1.3 nm to 0.1-0.3 nm. Furthermore, in the process we establish that the graphene-substrate adsorbate layer and imaging force, in particular the pressure the tip exerts on the surface, are crucial components in the accurate measurement of graphene using AFM. These findings can be applied to other 2D materials.

  9. [Costing nuclear medicine diagnostic procedures].

    PubMed

    Markou, Pavlos

    2005-01-01

    To the Editor: Referring to a recent special report about the cost analysis of twenty-nine nuclear medicine procedures, I would like to clarify some basic aspects of determining the costs of nuclear medicine procedures with various costing methodologies. The activity-based costing (ABC) method is a new approach to costing imaging services that can provide the most accurate cost data, but it is difficult to apply to nuclear medicine diagnostic procedures. That is because ABC requires determining and analyzing all direct and indirect costs of each procedure, according to all the activities it involves. Traditional costing methods, such as estimating incomes and expenses per procedure or fixed and variable costs per procedure, which are widely used in break-even point analysis, and the ratio-of-costs-to-charges method per procedure, may be performed more easily in nuclear medicine departments to evaluate the variability and differences between costs and reimbursement charges. PMID:15886748
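
    The contrast between the two approaches described above can be illustrated numerically. All figures below are invented for the example; neither the activity names nor the amounts come from the cited report.

    ```python
    # Activity-based costing (ABC): sum the cost of each activity the
    # procedure actually consumes (hypothetical activities and amounts).
    activities = {
        "radiopharmaceutical": 120.0,
        "technologist_time":    45.0,  # minutes x labor rate
        "camera_time":          80.0,
        "physician_reading":    60.0,
        "overhead_share":       25.0,
    }
    abc_cost = sum(activities.values())

    # Ratio-of-costs-to-charges (RCC): scale the procedure's charge by the
    # department-wide cost/charge ratio, ignoring per-procedure detail.
    department_costs = 1_800_000.0
    department_charges = 2_400_000.0
    procedure_charge = 450.0
    rcc_cost = procedure_charge * (department_costs / department_charges)

    print(abc_cost, rcc_cost)  # prints 330.0 337.5
    ```

    RCC is easy to compute from existing ledgers, but because it averages across the whole department it can misstate the cost of any individual procedure; ABC traces resources directly, at the price of much more data collection.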

  10. Cost Recovery Through Depreciation.

    ERIC Educational Resources Information Center

    Forrester, Robert T.; Wesolowski, Leonard V.

    1983-01-01

    The approach of adopting depreciation rather than use allowance in order to recover more accurately the cost of college buildings and equipment used on federal projects is considered. It is suggested that depreciation will offer most colleges and universities a higher annual recovery rate, and an opportunity for better facilities planning. For…

  11. Price-transparency and cost accounting: challenges for health care organizations in the consumer-driven era.

    PubMed

    Hilsenrath, Peter; Eakin, Cynthia; Fischer, Katrina

    2015-01-01

    Health care reform is directed toward improving access and quality while containing costs. An essential part of this is improvement of pricing models to more accurately reflect the costs of providing care. Transparent prices that reflect costs are necessary to signal information to consumers and producers. This information is central in a consumer-driven marketplace. The rapid increase in high deductible insurance and other forms of cost sharing incentivizes the search for price information. The organizational ability to measure costs across a cycle of care is an integral component of creating value, and will play a greater role as reimbursements transition to episode-based care, value-based purchasing, and accountable care organization models. This article discusses use of activity-based costing (ABC) to better measure the cost of health care. It describes examples of ABC in health care organizations and discusses impediments to adoption in the United States including cultural and institutional barriers. PMID:25862425

  13. Non-targeted analysis of electronics waste by comprehensive two-dimensional gas chromatography combined with high-resolution mass spectrometry: Using accurate mass information and mass defect analysis to explore the data.

    PubMed

    Ubukata, Masaaki; Jobst, Karl J; Reiner, Eric J; Reichenbach, Stephen E; Tao, Qingping; Hang, Jiliang; Wu, Zhanpin; Dane, A John; Cody, Robert B

    2015-05-22

    Comprehensive two-dimensional gas chromatography (GC×GC) and high-resolution mass spectrometry (HRMS) offer the best possible separation of their respective techniques. Recent commercialization of combined GC×GC-HRMS systems offers new possibilities for the analysis of complex mixtures. However, such experiments yield enormous data sets that require new informatics tools to facilitate the interpretation of the rich information content. This study reports on the analysis of dust obtained from an electronics recycling facility by using GC×GC in combination with a new high-resolution time-of-flight (TOF) mass spectrometer. New software tools for (non-traditional) Kendrick mass defect analysis were developed in this research and greatly aided in the identification of compounds containing chlorine and bromine, elements that feature in most persistent organic pollutants (POPs). In essence, the mass defect plot serves as a visual aid from which halogenated compounds are recognizable on the basis of their mass defect and isotope patterns. Mass chromatograms were generated based on specific ions identified in the plots as well as region of the plot predominantly occupied by halogenated contaminants. Tentative identification was aided by database searches, complementary electron-capture negative ionization experiments and elemental composition determinations from the exact mass data. These included known and emerging flame retardants, such as polybrominated diphenyl ethers (PBDEs), hexabromobenzene, tetrabromo bisphenol A and tris (1-chloro-2-propyl) phosphate (TCPP), as well as other legacy contaminants such as polychlorinated biphenyls (PCBs) and polychlorinated terphenyls (PCTs).
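
    The mass defect arithmetic underlying such plots is straightforward. The sketch below shows the conventional CH2-based Kendrick mass defect; the study uses non-traditional bases oriented toward halogenated species, but the calculation is the same with a different base mass. The example masses are hypothetical.

    ```python
    CH2_EXACT = 14.01565  # exact mass of the CH2 repeat unit

    def kendrick_mass_defect(exact_mass: float,
                             base_exact: float = CH2_EXACT,
                             base_nominal: float = 14.0) -> float:
        """Rescale so the base unit has integer mass, then take the defect."""
        kendrick_mass = exact_mass * (base_nominal / base_exact)
        return round(kendrick_mass) - kendrick_mass

    # Members of a homologous series (differing by CH2) share the same KMD,
    # so they fall on a horizontal line in a KMD-versus-mass plot.
    m1 = 254.0927          # hypothetical exact mass
    m2 = m1 + CH2_EXACT    # next homologue up
    kmd1 = kendrick_mass_defect(m1)
    kmd2 = kendrick_mass_defect(m2)
    ```

    Because chlorine and bromine shift the mass defect strongly, halogenated compounds cluster in characteristic regions of such a plot, which is what makes it a useful visual filter for POPs in complex GC×GC-HRMS data.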

  14. Environmental Protection Agency (EPA) evaluation of the Super-Mag Fuel Extender under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Ashby, H.A.

    1982-01-01

    This document announces the conclusions of the EPA evaluation of the 'Super-Mag Fuel Extender' device under provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. On December 10, 1980, the EPA received a written request from the Metropolitan Denver District Attorney's Office of Consumer Fraud and Economic Crime to test at least one 'cow magnet' type of fuel economy device. Following a survey of devices being marketed, the Metropolitan Denver District Attorney's Office selected the 'Super-Mag' device as typical of its category and on April 13, 1981 provided EPA with units for testing. The EPA evaluation of the device using three vehicles showed neither fuel economy nor exhaust emissions were affected by the installation of the 'Super-Mag' device. In addition, any differences between baseline test results and results from tests with the device installed were within the range of normal test variability.

  15. Accurate, meshless methods for magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Hopkins, Philip F.; Raives, Matthias J.

    2016-01-01

    Recently, we explored new meshless finite-volume Lagrangian methods for hydrodynamics: the 'meshless finite mass' (MFM) and 'meshless finite volume' (MFV) methods; these capture advantages of both smoothed particle hydrodynamics (SPH) and adaptive mesh refinement (AMR) schemes. We extend these to include ideal magnetohydrodynamics (MHD). The MHD equations are second-order consistent and conservative. We augment these with a divergence-cleaning scheme, which maintains ∇·B ≈ 0. We implement these in the code GIZMO, together with state-of-the-art SPH MHD. We consider a large test suite, and show that on all problems the new methods are competitive with AMR using constrained transport (CT) to ensure ∇·B = 0. They correctly capture the growth/structure of the magnetorotational instability, MHD turbulence, and launching of magnetic jets, in some cases converging more rapidly than state-of-the-art AMR. Compared to SPH, the MFM/MFV methods exhibit convergence at fixed neighbour number, sharp shock-capturing, and dramatically reduced noise, divergence errors, and diffusion. Still, 'modern' SPH can handle most test problems, at the cost of larger kernels and 'by hand' adjustment of artificial diffusion. Compared to non-moving meshes, the new methods exhibit enhanced 'grid noise' but reduced advection errors and diffusion, easily include self-gravity, and feature velocity-independent errors and superior angular momentum conservation. They converge more slowly on some problems (smooth, slow-moving flows), but more rapidly on others (involving advection/rotation). In all cases, we show divergence control beyond the Powell 8-wave approach is necessary, or all methods can converge to unphysical answers even at high resolution.

  16. Costing the satellite power system

    NASA Technical Reports Server (NTRS)

    Hazelrigg, G. A., Jr.

    1978-01-01

    The paper presents a methodology for satellite power system costing, places approximate limits on the accuracy possible in cost estimates made at this time, and outlines the use of probabilistic cost information in support of the decision-making process. Reasons for using probabilistic costing or risk analysis procedures instead of standard deterministic costing procedures are considered. Components of cost, costing estimating relationships, grass roots costing, and risk analysis are discussed. Risk analysis using a Monte Carlo simulation model is used to estimate future costs.
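
    A Monte Carlo risk analysis in the spirit described above treats each cost component as a distribution rather than a point estimate and builds the total-cost distribution by repeated sampling. The component names, distributions, and parameters below are hypothetical, not the paper's estimating relationships.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000  # number of Monte Carlo trials

    # Triangular (low, most-likely, high) cost distributions, in $M.
    components = {
        "solar_array":     (400, 550, 900),
        "transmitter":     (200, 260, 450),
        "launch":          (300, 380, 700),
        "ground_rectenna": (250, 310, 500),
    }
    total = sum(rng.triangular(lo, mode, hi, n)
                for lo, mode, hi in components.values())

    # Percentiles summarize cost risk for the decision-maker.
    p10, p50, p90 = np.percentile(total, [10, 50, 90])
    ```

    The spread between the 10th and 90th percentiles, rather than a single point estimate, is what such an analysis feeds into the decision-making process.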

  17. Physician Awareness of Drug Cost: A Systematic Review

    PubMed Central

    Allan, G. Michael; Lexchin, Joel; Wiebe, Natasha

    2007-01-01

    Background Pharmaceutical costs are the fastest-growing health-care expense in most developed countries. Higher drug costs have been shown to negatively impact patient outcomes. Studies suggest that doctors have a poor understanding of pharmaceutical costs, but the data are variable and there is no consistent pattern in awareness. We designed this systematic review to investigate doctors' knowledge of the relative and absolute costs of medications and to determine the factors that influence awareness. Methods and Findings Our search strategy included The Cochrane Library, EconoLit, EMBASE, and MEDLINE as well as reference lists and contact with authors who had published two or more articles on the topic or who had published within 10 y of the commencement of our review. Studies were included if: either doctors, trainees (interns or residents), or medical students were surveyed; there were more than ten survey respondents; cost of pharmaceuticals was estimated; results were expressed quantitatively; there was a clear description of how authors defined “accurate estimates”; and there was a description of how the true cost was determined. Two authors reviewed each article for eligibility and extracted data independently. Cost accuracy outcomes were summarized, but data were not combined in meta-analysis because of extensive heterogeneity. Qualitative data related to physicians and drug costs were also extracted. The final analysis included 24 articles. Cost accuracy was low; 31% of estimates were within 20% or 25% of the true cost, and fewer than 50% were accurate by any definition of cost accuracy. Methodological weaknesses were common, and studies of low methodological quality showed better cost awareness. The most important factor influencing the pattern and accuracy of estimation was the true cost of therapy. High-cost drugs were estimated more accurately than inexpensive ones (74% versus 31%, Chi-square p < 0.001). Doctors consistently overestimated the cost

  18. A pharmacist-led information technology intervention for medication errors (PINCER): a multicentre, cluster randomised, controlled trial and cost-effectiveness analysis

    PubMed Central

    Avery, Anthony J; Rodgers, Sarah; Cantrill, Judith A; Armstrong, Sarah; Cresswell, Kathrin; Eden, Martin; Elliott, Rachel A; Howard, Rachel; Kendrick, Denise; Morris, Caroline J; Prescott, Robin J; Swanwick, Glen; Franklin, Matthew; Putman, Koen; Boyd, Matthew; Sheikh, Aziz

    2012-01-01

    Summary Background Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information technology-based intervention was more effective than simple feedback in reducing the number of patients at risk of measures related to hazardous prescribing and inadequate blood-test monitoring of medicines 6 months after the intervention. Methods In this pragmatic, cluster randomised trial general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service in block sizes of two or four to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to general practices, patients, pharmacists, researchers, and statisticians. Primary outcomes were the proportions of patients at 6 months after the intervention who had had any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; long-term prescription of angiotensin converting enzyme (ACE) inhibitor or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299. Findings 72 general practices with a combined list size of 480 942 patients were randomised. At 6 months' follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0

  19. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  20. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2013-07-01 2013-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  1. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  2. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2014-07-01 2014-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  3. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2012-07-01 2012-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  4. Accurate On-Line Intervention Practices for Efficient Improvement of Reading Skills in Africa

    ERIC Educational Resources Information Center

    Marshall, Minda B.

    2016-01-01

    Lifelong learning is the only way to sustain proficient learning in a rapidly changing world. Knowledge and information are exploding across the globe. We need accurate ways to facilitate the process of drawing external factual information into an internal perceptive advantage from which to interpret and argue new information. Accurate and…

  5. The importance of accurate atmospheric modeling

    NASA Astrophysics Data System (ADS)

    Payne, Dylan; Schroeder, John; Liang, Pang

    2014-11-01

    This paper will focus on the effect of atmospheric conditions on EO sensor performance using computer models. We have shown the importance of accurately modeling atmospheric effects for predicting the performance of an EO sensor. A simple example will demonstrate how real conditions for several sites in China can significantly impact image correction, hyperspectral imaging, and remote sensing. The current state-of-the-art model for computing atmospheric transmission and radiance is MODTRAN® 5, developed by the US Air Force Research Laboratory and Spectral Science, Inc. Research by the US Air Force, Navy and Army resulted in the public release of LOWTRAN 2 in the early 1970's. Subsequent releases of LOWTRAN and MODTRAN® have continued until the present. The paper will demonstrate the importance of using validated models and locally measured meteorological, atmospheric and aerosol conditions to accurately simulate the atmospheric transmission and radiance. Frequently default conditions are used, which can produce errors of as much as 75% in these values. This can have a significant impact on remote sensing applications.

  6. Accurate Mass Measurements in Proteomics

    SciTech Connect

    Liu, Tao; Belov, Mikhail E.; Jaitly, Navdeep; Qian, Weijun; Smith, Richard D.

    2007-08-01

    To understand different aspects of life at the molecular level, one would think that ideally all components of specific processes should be individually isolated and studied in detail. Reductionist approaches, i.e., studying biological events on a one-gene or one-protein-at-a-time basis, indeed have made significant contributions to our understanding of many basic facts of biology. However, these individual "building blocks" cannot be assembled into a comprehensive "model" of the life of cells, tissues, and organisms without using more integrative approaches.1,2 For example, the emerging field of "systems biology" aims to quantify all of the components of a biological system to assess their interactions and to integrate diverse types of information obtainable from this system into models that could explain and predict behaviors.3-6 Recent breakthroughs in genomics, proteomics, and bioinformatics are making this daunting task a reality.7-14 Proteomics, the systematic study of the entire complement of proteins expressed by an organism, tissue, or cell under a specific set of conditions at a specific time (i.e., the proteome), has become an essential enabling component of systems biology. While the genome of an organism may be considered static over short timescales, the expression of that genome as the actual gene products (i.e., mRNAs and proteins) is a dynamic event that is constantly changing due to the influence of environmental and physiological conditions. Exclusive monitoring of the transcriptomes can be carried out using high-throughput cDNA microarray analysis,15-17 however the measured mRNA levels do not necessarily correlate strongly with the corresponding abundances of proteins.18-20 The actual amount of functional proteins can be altered significantly and become independent of mRNA levels as a result of post-translational modifications (PTMs),21 alternative splicing,22,23 and protein turnover.24,25 Moreover, the functions of expressed

  7. Costs and cost-effectiveness of periviable care.

    PubMed

    Caughey, Aaron B; Burchfield, David J

    2014-02-01

    With increasing concerns regarding rapidly expanding healthcare costs, cost-effectiveness analysis allows assessment of whether marginal gains from new technology are worth the increased costs. Particular methodologic issues related to cost and cost-effectiveness analysis in the area of neonatal and periviable care include how costs are estimated, such as the use of charges and whether long-term costs are included; the challenges of measuring utilities; and whether to use a maternal, neonatal, or dual perspective in such analyses. A number of studies over the past three decades have examined the costs and the cost-effectiveness of neonatal and periviable care. Broadly, while neonatal care is costly, it is also cost effective as it produces both life-years and quality-adjusted life-years (QALYs). However, as the gestational age of the neonate decreases, the costs increase and the cost-effectiveness threshold is harder to achieve. In the periviable range of gestational age (22-24 weeks of gestation), whether the care is cost effective is questionable and is dependent on the perspective. Understanding the methodology and salient issues of cost-effectiveness analysis is critical for researchers, editors, and clinicians to accurately interpret results of the growing body of cost-effectiveness studies related to the care of periviable pregnancies and neonates.
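
    The core comparison in such analyses is the incremental cost-effectiveness ratio (ICER): the extra cost per extra quality-adjusted life-year. The figures and willingness-to-pay threshold below are invented for illustration, not estimates from the periviable-care literature.

    ```python
    def icer(cost_new: float, qaly_new: float,
             cost_old: float, qaly_old: float) -> float:
        """Incremental cost-effectiveness ratio: extra dollars per extra QALY."""
        return (cost_new - cost_old) / (qaly_new - qaly_old)

    # Hypothetical comparison of an intensive strategy against usual care.
    ratio = icer(cost_new=250_000, qaly_new=18.0,
                 cost_old=50_000, qaly_old=15.0)

    threshold = 100_000  # illustrative $/QALY willingness-to-pay threshold
    cost_effective = ratio <= threshold
    ```

    As the abstract notes, at the lowest gestational ages the incremental QALY gain shrinks while costs rise, pushing the ratio above typical thresholds; the result also flips depending on whether a maternal, neonatal, or dual perspective supplies the costs and QALYs.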

  8. Wind Integration Cost and Cost-Causation: Preprint

    SciTech Connect

    Milligan, M.; Kirby, B.; Holttinen, H.; Kiviluoma, J.; Estanqueiro, A.; Martin-Martinez, S.; Gomez-Lazaro, E.; Peneda, I.; Smith, C.

    2013-10-01

    The question of wind integration cost has received much attention in the past several years. The methodological challenges to calculating integration costs are discussed in this paper. There are other sources of integration cost unrelated to wind energy. A performance-based approach would be technology neutral, and would provide price signals for all technology types. However, it is difficult to correctly formulate such an approach. Determining what is and is not an integration cost is challenging. Another problem is the allocation of system costs to one source. Because of significant nonlinearities, this can prove to be impossible to determine in an accurate and objective way.

  9. Educational Costs.

    ERIC Educational Resources Information Center

    Arnold, Robert

    Problems in educational cost accounting and a new cost accounting approach are described in this paper. The limitations of the individualized cost (student units) approach and the comparative cost approach (in the form of fund-function-object) are illustrated. A new strategy, an activity-based system of accounting, is advocated. Borrowed from…

  10. Leasing strategies reduce the cost of financing healthcare equipment.

    PubMed

    Bayless, M E; Diltz, J D

    1985-10-01

    Prospective payment has increased the importance of controlling capital costs. One area where this may be possible is lease financing. Reasons commonly cited in favor of leasing may be of questionable validity, but, under an easily identified set of circumstances, lease financing can be cost effective. Recent developments in finance make it possible not only to evaluate the financial attractiveness of a given lease but also to predict accurately the bounds within which the terms of the lease must fall. Hospital administrators armed with this information should be able to negotiate more favorable lease terms under given tax and economic environments.

  11. Cost of Computer Searching

    ERIC Educational Resources Information Center

    Chenery, Peter J.

    1973-01-01

    The program described has the primary objective of making Federally generated technology and research information available to public and private agencies. Cost analysis, data banks, and search strategies are explained. (Author/DH)

  12. Measuring Fisher information accurately in correlated neural populations.

    PubMed

    Kanitscheider, Ingmar; Coen-Cagli, Ruben; Kohn, Adam; Pouget, Alexandre

    2015-06-01

    Neural responses are known to be variable. To understand how this neural variability constrains behavioral performance, we need to be able to measure the reliability with which a sensory stimulus is encoded in a given population. Such measures are challenging for two reasons: first, they must take into account noise correlations, which can have a large influence on reliability; second, they need to be as efficient as possible, since the number of trials available in a set of neural recordings is usually limited by experimental constraints. Traditionally, cross-validated decoding has been used as a reliability measure, but it provides only a lower bound on reliability and underestimates reliability substantially in small datasets. We show that, if the number of trials per condition is larger than the number of neurons, there is an alternative, direct estimate of reliability which consistently leads to smaller errors and is much faster to compute. The superior performance of the direct estimator is evident both for simulated data and for neuronal population recordings from macaque primary visual cortex. Furthermore, we propose generalizations of the direct estimator which measure changes in stimulus encoding across conditions and the impact of correlations on encoding and decoding, typically denoted I_shuffle and I_diag, respectively.
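
    As a hedged illustration of the direct approach this abstract describes, the sketch below computes a naive plug-in estimate of linear Fisher information from two stimulus conditions. The function name and toy data are hypothetical, and the paper's bias-corrected estimator is not reproduced here; this is only the basic quadratic form the method builds on.

```python
import numpy as np

def linear_fisher_plugin(resp_a, resp_b, ds):
    """Naive plug-in estimate of linear Fisher information.

    resp_a, resp_b: (trials x neurons) response matrices for stimuli s and s + ds.
    Assumes trials > neurons so that the pooled covariance is invertible.
    """
    # Derivative of the mean population response with respect to the stimulus
    f_prime = (resp_b.mean(axis=0) - resp_a.mean(axis=0)) / ds
    # Noise covariance pooled across the two conditions
    cov = 0.5 * (np.cov(resp_a, rowvar=False) + np.cov(resp_b, rowvar=False))
    # I = f'^T C^{-1} f'
    return float(f_prime @ np.linalg.solve(cov, f_prime))

# Toy data: 200 trials, 10 neurons, small stimulus step
rng = np.random.default_rng(0)
slopes = rng.normal(1.0, 0.2, size=10)
resp_a = rng.normal(0.0, 1.0, size=(200, 10))
resp_b = slopes * 0.1 + rng.normal(0.0, 1.0, size=(200, 10))
info = linear_fisher_plugin(resp_a, resp_b, ds=0.1)
```

    Note that this plug-in estimate is biased upward for small trial counts, which is precisely the issue the paper's corrected estimator addresses.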

  13. Important Nearby Galaxies without Accurate Distances

    NASA Astrophysics Data System (ADS)

    McQuinn, Kristen

    2014-10-01

    The Spitzer Infrared Nearby Galaxies Survey (SINGS) and its offspring programs (e.g., THINGS, HERACLES, KINGFISH) have resulted in a fundamental change in our view of star formation and the ISM in galaxies, and together they represent the most complete multi-wavelength data set yet assembled for a large sample of nearby galaxies. These great investments of observing time have been dedicated to the goal of understanding the interstellar medium, the star formation process, and, more generally, galactic evolution at the present epoch. Nearby galaxies provide the basis on which we interpret the distant universe, and the SINGS sample represents the best-studied nearby galaxies. Accurate distances are fundamental to interpreting observations of galaxies. Surprisingly, many of the SINGS spiral galaxies have numerous conflicting distance estimates. We can rectify this situation for 8 of the SINGS spiral galaxies within 10 Mpc at very low cost through measurements of the tip of the red giant branch. The proposed observations will provide an accuracy of better than 0.1 in distance modulus. Our sample includes such well-known galaxies as M51 (the Whirlpool), M63 (the Sunflower), M104 (the Sombrero), and M74 (the archetypal grand-design spiral). We are also proposing coordinated parallel WFC3 UV observations of the central regions of the galaxies, which are rich in high-mass UV-bright stars. As a secondary science goal, we will compare the resolved UV stellar populations with the integrated UV emission measurements used in calibrating star formation rates. Our observations will complement the growing HST UV atlas of high-resolution images of nearby galaxies.

  14. How flatbed scanners upset accurate film dosimetry

    NASA Astrophysics Data System (ADS)

    van Battum, L. J.; Huizenga, H.; Verdaasdonk, R. M.; Heukelom, S.

    2016-01-01

    Film is an excellent dosimeter for verification of dose distributions due to its high spatial resolution. Irradiated film can be digitized with low-cost, transmission, flatbed scanners. However, a disadvantage is their lateral scan effect (LSE): a change in scanner readout along the lateral scan axis. Although anisotropic light scattering has been presented as the origin of the LSE, this paper presents an alternative cause. To this end, the LSE of two flatbed scanners (Epson 1680 Expression Pro and Epson 10000XL) with Gafchromic film (EBT, EBT2, EBT3) was investigated, focusing on three effects: cross talk, optical path length, and polarization. Cross talk was examined using triangular sheets of various optical densities. The optical path length effect was studied using absorptive and reflective neutral density filters with well-defined optical characteristics (OD range 0.2-2.0). Linear polarizer sheets were used to investigate the effect of light polarization on the CCD signal in the absence and presence of (un)irradiated Gafchromic film. Film dose values ranged from 0.2 to 9 Gy, i.e. an optical density range from 0.25 to 1.1. Measurements were performed in the scanner’s transmission mode, with red-green-blue channels. The LSE was found to depend on scanner construction and film type. Its magnitude depends on dose, increasing for 9 Gy up to 14% at the maximum lateral position. Cross talk was significant only in high-contrast regions, up to 2% for very small fields. The optical path length effect introduced by the film causes a 3% change for pixels at the extreme lateral position. Light polarization due to the film and the scanner’s optical mirror system is the main contributor, with a magnitude that differs among the red, green, and blue channels. We conclude that any Gafchromic EBT-type film scanned with a flatbed scanner will face these optical effects. Accurate dosimetry requires correction of the LSE and, therefore, determination of the LSE per color channel and per dose delivered to the film.

  16. Accurately measuring MPI broadcasts in a computational grid

    SciTech Connect

    Karonis N T; de Supinski, B R

    1999-05-06

    An MPI library's implementation of broadcast communication can significantly affect the performance of applications built with that library. In order to choose between similar implementations or to evaluate available libraries, accurate measurements of broadcast performance are required. As we demonstrate, existing methods for measuring broadcast performance are either inaccurate or inadequate. We have therefore designed an accurate method for measuring broadcast performance, even in a challenging grid environment. Measuring broadcast performance is not easy. Simply sending one broadcast after another allows them to proceed through the network concurrently, resulting in inaccurate per-broadcast timings. Existing methods either fail to eliminate this pipelining effect or eliminate it by introducing overheads that are as difficult to measure as the performance of the broadcast itself. The problem becomes even more challenging in grid environments. Latencies along different links can vary significantly, so an algorithm's performance is difficult to predict from its communication pattern. Even when accurate prediction is possible, the pattern is often unknown. Our method introduces a measurable overhead to eliminate the pipelining effect, regardless of variations in link latencies. Accurate measurements of broadcast performance allow users to choose between different available implementations, and accurate and complete measurements can guide use of a given implementation to improve application performance. These choices will become even more important as grid-enabled MPI libraries [6, 7] become more common, since bad choices are likely to cost significantly more in grid environments. In short, the distributed processing community needs accurate, succinct and complete measurements of collective communications performance. Since successive collective communications can often proceed concurrently, accurately measuring them is difficult. Some benchmarks use knowledge of the communication algorithm to predict the

  17. An Analysis of Rocket Propulsion Testing Costs

    NASA Technical Reports Server (NTRS)

    Ramirez-Pagan, Carmen P.; Rahman, Shamim A.

    2009-01-01

    The primary mission at NASA Stennis Space Center (SSC) is rocket propulsion testing. Such testing is generally performed within two arenas: (1) production testing for certification and acceptance, and (2) developmental testing for prototype or experimental purposes. The customer base consists of NASA, DOD, and commercial programs. Resources in place to perform on-site testing include civil servant and contractor personnel, hardware and software (including data acquisition and control), and six test stands with a total of 14 test positions/cells. For several business reasons there is a need to better understand test costs for the various types of test campaigns. Historical propulsion test data were evaluated and analyzed in many different ways with the intent of finding any correlations or statistics that could help produce more reliable and accurate cost estimates and projections. The analytical efforts included timeline trends, statistical curve fitting, average cost per test, cost per test second, test cost timelines, and test cost envelopes. The analysis also examined test cost from the perspective of thrust level and test article characteristics. Some analytical approaches did not produce evidence strong enough for further analysis; others yielded promising results and are candidates for further development and focused study. Information was organized into three elements: a Project Profile, a Test Cost Timeline, and a Cost Envelope. The Project Profile is a snapshot of the project life cycle in timeline fashion, which includes various statistical analyses. The Test Cost Timeline shows the cumulative average test cost, for each project, at each month in which there was test activity. The Test Cost Envelope shows a range of cost for a given number of tests. The supporting information upon which this study was performed came from diverse sources and thus it was necessary to

  18. Rapidly falling costs of battery packs for electric vehicles

    NASA Astrophysics Data System (ADS)

    Nykvist, Björn; Nilsson, Måns

    2015-04-01

    To properly evaluate the prospects for commercially competitive battery electric vehicles (BEVs), one must have accurate information on the current and predicted cost of battery packs. The literature reveals that costs are coming down, but with large uncertainties about past, current, and future costs of the dominant Li-ion technology. This paper presents an original systematic review, analysing over 80 different estimates reported between 2007 and 2014 to systematically trace the costs of Li-ion battery packs for BEV manufacturers. We show that industry-wide cost estimates declined by approximately 14% annually between 2007 and 2014, from above US$1,000 per kWh to around US$410 per kWh, and that the costs of battery packs used by market-leading BEV manufacturers are even lower, at US$300 per kWh, having declined by 8% annually. The learning rate, the cost reduction following a cumulative doubling of production, is found to be between 6 and 9%, in line with earlier studies on vehicle battery technology. We reveal that the costs of Li-ion battery packs continue to decline and that the costs among market leaders are much lower than previously reported. This has significant implications for the assumptions used when modelling future energy and transport systems and permits an optimistic outlook for BEVs contributing to low-carbon transport.
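
    The headline figures above can be checked with simple arithmetic. The sketch below (function names are illustrative, not from the paper) computes the compound annual decline implied by two point estimates and an experience-curve projection from a learning rate. Note that fitting only the two endpoints gives roughly a 12% annual decline, close to but not identical to the 14% the authors obtain by fitting all estimates.

```python
def annual_decline(cost_start, cost_end, years):
    """Compound annual rate of cost decline implied by two point estimates."""
    return 1.0 - (cost_end / cost_start) ** (1.0 / years)

def cost_after_doublings(cost0, learning_rate, doublings):
    """Experience-curve cost after n cumulative doublings of production."""
    return cost0 * (1.0 - learning_rate) ** doublings

# Industry-wide endpoints quoted above: ~US$1,000/kWh (2007) to ~US$410/kWh (2014)
rate = annual_decline(1000.0, 410.0, 7)          # about 0.12 from endpoints alone
projected = cost_after_doublings(300.0, 0.08, 2) # market leaders, 8% learning rate
```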

  19. Accurate lineshape spectroscopy and the Boltzmann constant

    PubMed Central

    Truong, G.-W.; Anstie, J. D.; May, E. F.; Stace, T. M.; Luiten, A. N.

    2015-01-01

    Spectroscopy has an illustrious history of delivering serendipitous discoveries and providing a stringent testbed for new physical predictions, with applications ranging from trace materials detection, to understanding the atmospheres of stars and planets, to constraining cosmological models. Reaching fundamental-noise limits permits optimal extraction of spectroscopic information from an absorption measurement. Here, we demonstrate a quantum-limited spectrometer that delivers high-precision measurements of the absorption lineshape. These measurements yield a very accurate determination of the excited-state (6P1/2) hyperfine splitting in Cs and reveal a breakdown in the well-known Voigt spectral profile. We develop a theoretical model that accounts for this breakdown, explaining the observations to within the shot-noise limit. Our model enables us to infer the thermal velocity dispersion of the Cs vapour with an uncertainty of 35 p.p.m. within an hour. This allows us to determine a value for Boltzmann's constant with a precision of 6 p.p.m., and an uncertainty of 71 p.p.m. PMID:26465085

  20. Accurate SHAPE-directed RNA structure determination

    PubMed Central

    Deigan, Katherine E.; Li, Tian W.; Mathews, David H.; Weeks, Kevin M.

    2009-01-01

    Almost all RNAs can fold to form extensive base-paired secondary structures. Many of these structures then modulate numerous fundamental elements of gene expression. Deducing these structure–function relationships requires that it be possible to predict RNA secondary structures accurately. However, RNA secondary structure prediction for large RNAs, such that a single predicted structure for a single sequence reliably represents the correct structure, has remained an unsolved problem. Here, we demonstrate that quantitative, nucleotide-resolution information from a SHAPE experiment can be interpreted as a pseudo-free energy change term and used to determine RNA secondary structure with high accuracy. Free energy minimization, by using SHAPE pseudo-free energies, in conjunction with nearest neighbor parameters, predicts the secondary structure of deproteinized Escherichia coli 16S rRNA (>1,300 nt) and a set of smaller RNAs (75–155 nt) with accuracies of up to 96–100%, which are comparable to the best accuracies achievable by comparative sequence analysis. PMID:19109441
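
    A minimal sketch of the pseudo-free energy term described above, assuming the commonly cited linear-log form and default slope/intercept values (m = 2.6, b = -0.8 kcal/mol); the function name is illustrative, and the constants should be treated as assumptions rather than figures quoted in this abstract.

```python
import math

def shape_pseudo_energy(reactivity, m=2.6, b=-0.8):
    """Pseudo-free energy change (kcal/mol) for one nucleotide, using the
    linear-log form dG_SHAPE = m * ln(reactivity + 1) + b. The default
    slope m and intercept b are commonly cited values, assumed here."""
    return m * math.log(reactivity + 1.0) + b

# A reactive (likely unpaired) nucleotide is penalized for base pairing,
# while an unreactive one receives a small pairing bonus.
penalty = shape_pseudo_energy(1.5)  # positive: disfavors pairing
bonus = shape_pseudo_energy(0.0)    # negative: favors pairing
```

    In a free energy minimization, this per-nucleotide term is simply added to the nearest-neighbor stacking energies for each base pair the nucleotide participates in.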

  1. Costing Hospital Surgery Services: The Method Matters

    PubMed Central

    Mercier, Gregoire; Naro, Gerald

    2014-01-01

    Background Accurate hospital costs are required for policy-makers, hospital managers and clinicians to improve efficiency and transparency. However, different methods are used to allocate direct costs, and their agreement is poorly understood. The aim of this study was to assess the agreement between bottom-up and top-down unit costs of a large sample of surgical operations in a French tertiary centre. Methods Two thousand one hundred and thirty consecutive procedures performed between January and October 2010 were analysed. Top-down costs were based on pre-determined weights, while bottom-up costs were calculated through an activity-based costing (ABC) model. The agreement was assessed using correlation coefficients and the Bland and Altman method. Variables associated with the difference between methods were identified with bivariate and multivariate linear regressions. Results The correlation coefficient amounted to 0.73 (95%CI: 0.72; 0.76). The overall agreement between methods was poor. In a multivariate analysis, the cost difference was independently associated with age (Beta = −2.4; p = 0.02), ASA score (Beta = 76.3; p<0.001), RCI (Beta = 5.5; p<0.001), staffing level (Beta = 437.0; p<0.001) and intervention duration (Beta = −10.5; p<0.001). Conclusions The ability of the current method to provide relevant information to managers, clinicians and payers is questionable. As in other European countries, a shift towards time-driven activity-based costing should be advocated. PMID:24817167
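
    As a hedged illustration of the agreement analysis described above, the sketch below computes the Bland-Altman bias and 95% limits of agreement between two costing methods; the cost figures are hypothetical, not data from the study.

```python
import numpy as np

def bland_altman(costs_a, costs_b):
    """Mean difference (bias) and 95% limits of agreement between two methods,
    conventionally bias +/- 1.96 standard deviations of the differences."""
    diff = np.asarray(costs_a, float) - np.asarray(costs_b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical per-procedure costs (EUR) from a top-down and a bottom-up method
top_down = [1200.0, 950.0, 2100.0, 1800.0, 760.0]
bottom_up = [1100.0, 990.0, 2400.0, 1650.0, 800.0]
bias, (low, high) = bland_altman(top_down, bottom_up)
```

    Wide limits of agreement relative to the procedure costs, as the study reports, mean the two methods cannot be used interchangeably even when their correlation looks respectable.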

  2. Second EPA evaluation of the Platinum Gasaver Device under Section 511 of the Motor Vehicle Information and Cost Savings Act (updated). Technical report

    SciTech Connect

    Not Available

    1991-07-01

    The report announces the conclusions of the EPA evaluation of the Platinum Gasaver Device under the provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. The second evaluation of the Platinum Gasaver device was conducted upon the request of the Federal Trade Commission. The unit is a vapor bleed device. It functions by bleeding a mixture of air and 'platinum concentrate' through a 'T' connection that is installed in the Positive Crankcase Ventilation (PCV) line. It is claimed to reduce emissions, improve fuel economy, raise the octane of gasoline, and extend engine life. Three typical vehicles were tested at the EPA's Motor Vehicle Emission Laboratory. The basic test sequence included 2,000 miles of mileage accumulation, replicate Federal Test Procedures (FTP) and replicate Highway Fuel Economy Tests (HFET). The test sequence was conducted both without and with the Platinum Gasaver installed. The overall conclusion from these tests is that the Platinum Gasaver device did not significantly change vehicle emissions or fuel economy for either the FTP or the HFET.

  3. Low-Cost Spectral Sensor Development Description.

    SciTech Connect

    Armijo, Kenneth Miguel; Yellowhair, Julius

    2014-11-01

    Solar spectral data for all parts of the US are limited, due in part to the high cost of commercial spectrometers. Solar spectral information is necessary for accurate photovoltaic (PV) performance forecasting, especially for large utility-scale PV installations. A low-cost solar spectral sensor would address these obstacles and needs. In this report, a novel low-cost, discrete-band sensor device, comprised of five narrow-band sensors, is described. The hardware is built from commercial off-the-shelf components to keep the cost low. Data processing algorithms were developed and are being refined for robustness. PV module short-circuit current (Isc) prediction methods were developed based on an interaction-terms regression methodology and a spectrum reconstruction methodology for computing Isc. The results suggest the spectrum computed using the reconstruction method agreed well with the measured spectrum from the wide-band spectrometer (RMS error of 38.2 W/m2-nm). Further analysis of the computed Isc found a close correspondence, with 0.05 A RMS error. The goal is ubiquitous adoption of the low-cost spectral sensor in solar PV and other applications such as weather forecasting.

  4. Tracking Costs

    ERIC Educational Resources Information Center

    Erickson, Paul W.

    2010-01-01

    Even though there's been a slight reprieve in energy costs, the reality is that the cost of non-renewable energy is increasing, and state education budgets are shrinking. One way to keep energy and operations costs from overshadowing education budgets is to develop a 10-year energy audit plan to eliminate waste. First, facility managers should…

  5. Cost-effectiveness in orthopedics: providing essential information to both physicians and health care policy makers for appropriate allocation of medical resources.

    PubMed

    Dougherty, Christopher P; Howard, Timothy

    2013-09-01

    Cost-effectiveness analysis has become an important tool for determining which procedures are both cost-effective and appropriate in today's cost-conscious health care environment. The quality-adjusted life-year (QALY) is a standard measure of health-related quality of life in medical cost-effectiveness research. It can be used to compare different interventions to determine the cost-effectiveness of each procedure. Use of the QALY to compare health care interventions has become the new gold standard. The key words arthroscopy, cost-effectiveness analysis, QALY, shoulder, hip, knee, ankle, elbow, wrist, and pubic symphysis were searched using PubMed and an internet search engine. Cost/QALY ratios were determined and compared with those of surgical procedures using techniques other than arthroscopy. Cost/QALY ratios were found for the shoulder, hip, knee, and elbow: $13,092 for the shoulder, $5,783 for a simple knee procedure, $21,700 for the hip, and $2,031 for the elbow. General costs were found for the ankle, wrist, and pubic symphysis, which could be used to estimate QALYs without the complex formal calculation. On the basis of our findings, arthroscopy is an extremely cost-effective allocation of health care resources.
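
    The cost-per-QALY ratios quoted above follow from a simple division. The sketch below (function names and figures are illustrative, not from the study) shows that ratio along with the related incremental cost-effectiveness ratio (ICER) used when comparing two interventions.

```python
def cost_per_qaly(cost, qalys_gained):
    """Cost-effectiveness ratio: dollars spent per quality-adjusted life-year."""
    return cost / qalys_gained

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio of a new intervention vs. a comparator."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical: a procedure costing $6,000 that yields 1.2 QALYs
ratio = cost_per_qaly(6000.0, 1.2)           # $5,000 per QALY
# Hypothetical comparison: the new option costs more but gains more QALYs
increment = icer(10000.0, 2.0, 4000.0, 1.0)  # $6,000 per additional QALY
```

    The resulting ratio is then judged against a willingness-to-pay threshold (often cited around $50,000-$100,000 per QALY) to decide whether the intervention is cost effective.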

  6. Cost accounting of radiological examinations. Cost analysis of radiological examinations of intermediate referral hospitals and general practice.

    PubMed

    Lääperi, A L

    1996-01-01

    nonmonetary variables was developed. In it the radiologist, radiographer and examination-specific equipment costs were allocated to the examinations applying estimated cost equivalents. Some minor cost items were replaced by a general cost factor (GCF). The program is suitable for internal cost accounting of radiological departments as well as regional planning. If more accurate cost information is required, cost assignment employing the actual consumption of the resources and applying the principles of activity-based cost accounting is recommended. As an application of the cost accounting formula the average costs of the radiological examinations were calculated. In conventional radiography the average proportion of the cost factors in the total material was: personnel costs 43%, equipment costs 26%, material costs 7%, real estate costs 11%, administration and overheads 14%. The average total costs including radiologist costs in the hospitals were (FIM): conventional roentgen examinations 188, contrast medium examinations 695, ultrasound 296, mammography 315, roentgen examinations with mobile equipment 1578. The average total costs without radiologist costs in the public health centres were (FIM): conventional roentgen examinations 107, contrast medium examinations 988, ultrasound 203, mammography 557. The average currency rate of exchange in 1991 was USD 1 = FIM 4.046. The following formula is proposed for calculating the cost of a radiological examination (or a group of examinations) performed with a certain piece of equipment during a period of time (e.g. 1 year): (a2/Σax)·ax + (b2/Σbx)·bx + (d1/d5)·dx + e1 + [(c1 + c2) + d4 + (e2 − e3) + f5 + g1 + g2 + i]/n.

  8. Determination of chest x-ray cost using activity based costing approach at Penang General Hospital, Malaysia

    PubMed Central

    Atif, Muhammad; Sulaiman, Syed Azhar Syed; Shafie, Asrul Akmal; Saleem, Fahad; Ahmad, Nafees

    2012-01-01

    Background Activity-based costing (ABC) is an approach for gaining insight into true costs and solving accounting problems. It provides more accurate information on product cost than a conventional accounting system. The purpose of this study was to identify detailed resource consumption for the chest X-ray procedure. Methods Human resource cost was calculated by multiplying the mean time spent by employees on each activity by their per-minute salaries. The costs of consumables and clinical equipment were obtained from the procurement section of the Radiology Department. The cost of the building was calculated by multiplying the area of space used by the chest X-ray facility by the unit cost of the public building department. Straight-line depreciation with a discount rate of 3% was assumed for calculation of equivalent annual costs for the building and machines. Cost of electricity was calculated by multiplying the number of kilowatt-hours used by electrical appliances in the year 2010 by the electricity tariff for Malaysian commercial consumers (MYR 0.31 per kWh). Results Five activities were identified that were required to develop one chest X-ray film. Human resource, capital, consumable, and electricity costs were MYR 1.48, MYR 1.98, MYR 2.15, and MYR 0.04, respectively. The total cost of a single chest X-ray was MYR 5.65 (USD 1.75). Conclusion By applying the ABC approach, we can obtain a more detailed and precise estimate of the cost of a specific activity or service. The choice of whether to repeat a chest X-ray can be informed by our findings when cost is a limiting factor. PMID:22891098
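
    The per-category figures above sum to the reported total, as the following sketch verifies. Function names are illustrative, and only the MYR figures come from the abstract.

```python
def activity_cost(minutes, salary_per_minute):
    """Human-resource cost of one activity: staff time x per-minute salary."""
    return minutes * salary_per_minute

def total_procedure_cost(components):
    """Sum the per-category costs of a procedure."""
    return sum(components.values())

# Per-category costs of one chest X-ray as reported above (MYR)
chest_xray = {
    "human_resource": 1.48,
    "capital": 1.98,
    "consumables": 2.15,
    "electricity": 0.04,
}
total = total_procedure_cost(chest_xray)  # MYR 5.65, matching the reported figure
```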

  9. Cost analysis and the practicing radiologist/manager: an introduction to managerial accounting.

    PubMed

    Forman, H P; Yin, D

    1996-06-01

    Cost analysis is inherently one of the most tedious tasks falling on the shoulders of any manager. In today's world, whether in a service business such as radiology or medicine or in a product line such as car manufacturing, accurate cost analysis is critical to all aspects of management: marketing, competitive strategy, quality control, human resource management, financial accounting, and operations management, to name but a few. We explore this topic with the intention of giving the radiologist/manager the understanding and the basic skills to use cost analysis efficiently, and of making sure that major financial decisions are made with adequate cost information. Cost accounting is really managerial accounting: it pays little attention to the bottom line of financial statements and instead emphasizes equipping managers with the information to determine budgets, prices, salaries, and incentives, and it influences capital budgeting decisions through an understanding of product profitability rather than firm profitability.

  10. DR-TAMAS: Diffeomorphic Registration for Tensor Accurate Alignment of Anatomical Structures.

    PubMed

    Irfanoglu, M Okan; Nayak, Amritha; Jenkins, Jeffrey; Hutchinson, Elizabeth B; Sadeghi, Neda; Thomas, Cibu P; Pierpaoli, Carlo

    2016-05-15

    In this work, we propose DR-TAMAS (Diffeomorphic Registration for Tensor Accurate alignMent of Anatomical Structures), a novel framework for intersubject registration of Diffusion Tensor Imaging (DTI) data sets. This framework is optimized for brain data and its main goal is to achieve an accurate alignment of all brain structures, including white matter (WM), gray matter (GM), and spaces containing cerebrospinal fluid (CSF). Currently most DTI-based spatial normalization algorithms emphasize alignment of anisotropic structures. While some diffusion-derived metrics, such as diffusion anisotropy and tensor eigenvector orientation, are highly informative for proper alignment of WM, other tensor metrics such as the trace or mean diffusivity (MD) are fundamental for a proper alignment of GM and CSF boundaries. Moreover, it is desirable to include information from structural MRI data, e.g., T1-weighted or T2-weighted images, which are usually available together with the diffusion data. The fundamental property of DR-TAMAS is to achieve global anatomical accuracy by incorporating in its cost function the most informative metrics locally. Another important feature of DR-TAMAS is a symmetric time-varying velocity-based transformation model, which enables it to account for potentially large anatomical variability in healthy subjects and patients. The performance of DR-TAMAS is evaluated with several data sets and compared with other widely-used diffeomorphic image registration techniques employing both full tensor information and/or DTI-derived scalar maps. Our results show that the proposed method has excellent overall performance in the entire brain, while being equivalent to the best existing methods in WM.

  11. Spacecraft platform cost estimating relationships

    NASA Technical Reports Server (NTRS)

    Gruhl, W. M.

    1972-01-01

    The three main cost areas of unmanned satellite development are discussed. The areas are identified as: (1) the spacecraft platform (SCP), (2) the payload or experiments, and (3) the postlaunch ground equipment and operations. The SCP normally accounts for over half of the total project cost and accurate estimates of SCP costs are required early in project planning as a basis for determining total project budget requirements. The development of single formula SCP cost estimating relationships (CER) from readily available data by statistical linear regression analysis is described. The advantages of single formula CER are presented.
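The single-formula CER approach described above amounts to fitting cost against a driver variable by ordinary least squares. A minimal sketch, with a hypothetical driver (platform dry weight) and made-up data points, not the study's actual CERs:

```python
# Hedged sketch of a single-formula cost estimating relationship (CER)
# fitted by least-squares linear regression, as the abstract describes.
# The driver variable and every data point below are hypothetical.

def fit_cer(xs, ys):
    """Least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical historical platforms: dry weight (kg) vs. cost ($M).
weights = [300, 500, 800, 1200, 1500]
costs = [25, 38, 61, 90, 112]

a, b = fit_cer(weights, costs)
print(f"cost estimate ~= {a:.1f} + {b:.3f} * weight")
```

A new platform's weight estimate, available early in project planning, can then be plugged into the fitted formula to obtain a budget-level cost estimate.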

  12. Cost Differential Analysis: Providing Data for Added Cost Funding

    ERIC Educational Resources Information Center

    Nystrom, Dennis C.; Hennessy, James V.

    1975-01-01

    A 1972-73 statewide study conducted in Illinois to develop a cost accounting system which facilitates cost differential ratios for secondary vocational education courses indicated that vocational programs are approximately twice as expensive as nonvocational. Specific cost elements identified in the study provided essential information regarding…

  13. Mill profiler machines soft materials accurately

    NASA Technical Reports Server (NTRS)

    Rauschl, J. A.

    1966-01-01

    Mill profiler machines bevels, slots, and grooves in soft materials, such as styrofoam phenolic-filled cores, to any desired thickness. A single operator can accurately control cutting depths in contour or straight line work.

  14. Selecting accurate statements from the cognitive interview using confidence ratings.

    PubMed

    Roberts, Wayne T; Higham, Philip A

    2002-03-01

    Participants viewed a videotape of a simulated murder, and their recall (and confidence) was tested 1 week later with the cognitive interview. Results indicated that (a) the subset of statements assigned high confidence was more accurate than the full set of statements; (b) the accuracy benefit was limited to information that forensic experts considered relevant to an investigation, whereas peripheral information showed the opposite pattern; (c) the confidence-accuracy relationship was higher for relevant than for peripheral information; (d) the focused-retrieval phase was associated with a greater proportion of peripheral and a lesser proportion of relevant information than the other phases; and (e) only about 50% of the relevant information was elicited, and most of this was elicited in Phase 1.

  15. Variation in the costs of delivering routine immunization services in Peru.

    PubMed Central

    Walker, D.; Mosqueira, N. R.; Penny, M. E.; Lanata, C. F.; Clark, A. D.; Sanderson, C. F. B.; Fox-Rushby, J. A.

    2004-01-01

    OBJECTIVE: Estimates of vaccination costs usually provide only point estimates at national level with no information on cost variation. In practice, however, such information is necessary for programme managers. This paper presents information on the variations in costs of delivering routine immunization services in three diverse districts of Peru: Ayacucho (a mountainous area), San Martin (a jungle area) and Lima (a coastal area). METHODS: We consider the impact of variability on predictions of cost and reflect on the likely impact on expected cost-effectiveness ratios, policy decisions and future research practice. All costs are in 2002 prices in US dollars and include the costs of providing vaccination services incurred by 19 government health facilities during the January-December 2002 financial year. Vaccine wastage rates have been estimated using stock records. FINDINGS: The cost per fully vaccinated child ranged from US$ 16.63 to US$ 24.52 in Ayacucho, US$ 21.79 to US$ 36.69 in San Martin, and US$ 9.58 to US$ 20.31 in Lima. The volume of vaccines administered and wastage rates are determinants of the variation in costs of delivering routine immunization services. CONCLUSION: This study shows there is considerable variation in the costs of providing vaccines across geographical regions and different types of facilities. Information on how costs vary can be used as a basis from which to generalize to other settings and provide more accurate estimates for decision-makers who do not have disaggregated data on local costs. Future studies should include sufficiently large sample sizes and ensure that regions are carefully selected in order to maximize the interpretation of cost variation. PMID:15628205
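The two quantities at the heart of the findings above can be sketched in a few lines. All figures below are hypothetical, not the study's data; wastage is estimated from stock records as the fraction of issued doses never administered:

```python
# Illustrative sketch of two quantities used in the study above: a vaccine
# wastage rate estimated from stock records, and cost per fully vaccinated
# child. Every number here is made up for illustration.

def wastage_rate(doses_issued, doses_administered):
    """Fraction of issued doses not administered (from stock records)."""
    return (doses_issued - doses_administered) / doses_issued

def cost_per_fvc(total_cost, fully_vaccinated_children):
    """Facility-level cost per fully vaccinated child."""
    return total_cost / fully_vaccinated_children

rate = wastage_rate(doses_issued=12000, doses_administered=9000)
unit_cost = cost_per_fvc(total_cost=30000.0, fully_vaccinated_children=1800)
print(f"wastage={rate:.0%}, cost per child=US$ {unit_cost:.2f}")
```

Because both volume and wastage enter the denominator-side arithmetic, facilities with low throughput or high wastage show markedly higher cost per child, which is the variation the study documents.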

  16. Avoidable waste management costs

    SciTech Connect

    Hsu, K.; Burns, M.; Priebe, S.; Robinson, P.

    1995-01-01

    This report describes the activity based costing method used to acquire variable (volume dependent or avoidable) waste management cost data for routine operations at Department of Energy (DOE) facilities. Waste volumes from environmental restoration, facility stabilization activities, and legacy waste were specifically excluded from this effort. A core team consisting of Idaho National Engineering Laboratory, Los Alamos National Laboratory, Rocky Flats Environmental Technology Site, and Oak Ridge Reservation developed and piloted the methodology, which can be used to determine avoidable waste management costs. The method developed to gather information was based on activity based costing, which is a common industrial engineering technique. Sites submitted separate flow diagrams that showed the progression of work from activity to activity for each waste type or treatability group. Each activity on a flow diagram was described in a narrative, which detailed the scope of the activity. Labor and material costs based on a unit quantity of waste being processed were then summed to generate a total cost for that flow diagram. Cross-complex values were calculated by determining a weighted average for each waste type or treatability group based on the volume generated. This study will provide DOE and contractors with a better understanding of waste management processes and their associated costs. Other potential benefits include providing cost data for sites to perform consistent cost/benefit analysis of waste minimization and pollution prevention (WMIN/PP) options identified during pollution prevention opportunity assessments and providing a means for prioritizing and allocating limited resources for WMIN/PP.
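The cross-complex roll-up described above is a volume-weighted average of per-site unit costs. A minimal sketch with hypothetical sites, volumes, and unit costs (none taken from the report):

```python
# Sketch of the activity-based costing roll-up described above: each site
# reports a unit cost (labor + materials per unit of waste) for a waste
# type, and the cross-complex value is a volume-weighted average.
# Site labels, volumes, and costs are hypothetical.

def weighted_average_cost(site_data):
    """site_data: list of (unit_cost, volume) pairs for one waste type."""
    total_volume = sum(v for _, v in site_data)
    return sum(c * v for c, v in site_data) / total_volume

# Hypothetical unit costs ($/m^3) and annual volumes (m^3) per site.
low_level_waste = [
    (1200.0, 500),   # site A
    (950.0, 1500),   # site B
    (1400.0, 250),   # site C
]

print(f"Cross-complex cost: ${weighted_average_cost(low_level_waste):.2f}/m^3")
```

Weighting by volume, rather than averaging site costs directly, keeps a small high-cost site from skewing the complex-wide figure.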

  17. Cost accounting for blood bank laboratories.

    PubMed

    Tessel, J A

    1989-01-01

    To meet the challenges of present-day blood banking, laboratory managers and supervisors must acquire and use skills in financial management. One such skill is cost analysis. Cost analyses vary from simple to complex and are used to determine the basic elements contributing to a test cost. Cost analysis can be used to identify costs, justify updating laboratory test prices, monitor general supply and reagent costs, help in the decision to lease or buy an instrument, modify existing test procedures to cut costs, determine staffing needs, and assure accurate reimbursement for laboratory services.

  18. Troubleshooting Costs

    NASA Astrophysics Data System (ADS)

    Kornacki, Jeffrey L.

    Seventy-six million cases of foodborne disease occur each year in the United States alone. Medical and lost productivity costs of the most common pathogens are estimated to be $5.6-9.4 billion. Product recalls, whether from foodborne illness or spoilage, result in added costs to manufacturers in a variety of ways. These may include expenses associated with lawsuits from real or allegedly stricken individuals and lawsuits from shorted customers. Other costs include those associated with efforts involved in finding the source of the contamination and eliminating it and include time when lines are shut down and therefore non-productive, additional non-routine testing, consultant fees, time and personnel required to overhaul the entire food safety system, lost market share to competitors, and the cost associated with redesign of the factory and redesign or acquisition of more hygienic equipment. The cost associated with an effective quality assurance plan is well worth the effort to prevent the situations described.

  19. The fiscal impact of informal caregiving to home care recipients in Canada: how the intensity of care influences costs and benefits to government.

    PubMed

    Jacobs, Josephine C; Lilly, Meredith B; Ng, Carita; Coyte, Peter C

    2013-03-01

    The objective of this study was to estimate the annual costs and consequences of unpaid caregiving by Canadians from a government perspective. We estimated these costs both at the individual and population levels for caregivers aged 45 and older. We conducted a cost-benefit analysis where we considered the costs of unpaid caregiving to be potential losses in income tax revenues and changes in social assistance payments and the potential benefit of reduced paid care expenditures. Our costing methods were based on multivariate analyses using the 2007 General Social Survey, a cross-sectional survey of 23,404 individuals. We determined the differential probability of employment, wages, and hours worked by caregivers of varying intensity versus non-caregivers. We also used multivariate analysis to determine how receiving different intensities of unpaid care impacted both the probability of receiving paid care and the weekly hours of paid care received. At the lowest intensities of caregiving, there was a net benefit to government from caregiving, at both the individual and population levels. At the population level, the net benefit to government was estimated to be $4.4 billion for caregivers providing less than five hours of weekly care. At the highest intensity of caregiving, there was a net cost to government of $641 million. Our overall findings were robust to a number of changes applied in our sensitivity analysis. We found that the factor with the greatest impact on cost was the probability of labour force participation. As the biggest cost driver appears to be the higher likelihood of intense caregivers dropping out of the labour force, government policies that enable intense caregivers to balance caregiving with employment may help to mitigate these losses.
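The cost-benefit framing above nets avoided paid-care spending against lost tax revenue and changes in social assistance. A toy illustration with hypothetical figures (the study's own estimates come from multivariate models on survey data, not this arithmetic alone):

```python
# Toy version of the fiscal framing described above: the net impact of
# unpaid caregiving on government = avoided paid-care spending minus lost
# income tax revenue minus added social assistance. All figures are
# hypothetical per-caregiver annual amounts.

def net_fiscal_impact(avoided_paid_care, lost_tax_revenue,
                      added_social_assistance):
    """Positive result = net benefit to government; negative = net cost."""
    return avoided_paid_care - lost_tax_revenue - added_social_assistance

# Low-intensity caregiving: paid-care offsets dominate small labour losses.
low = net_fiscal_impact(5000.0, 300.0, 100.0)
# High-intensity caregiving: labour-force exit dominates the offsets.
high = net_fiscal_impact(8000.0, 9500.0, 1200.0)
print(f"low intensity: {low:+.0f}, high intensity: {high:+.0f}")
```

The sign flip between the two cases mirrors the study's finding: a net benefit to government at low caregiving intensity and a net cost at the highest intensity.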

  1. Leveraging Two Kinect Sensors for Accurate Full-Body Motion Capture.

    PubMed

    Gao, Zhiquan; Yu, Yao; Zhou, Yu; Du, Sidan

    2015-09-22

    Accurate motion capture plays an important role in sports analysis, the medical field and virtual reality. Current methods for motion capture often suffer from occlusions, which limits the accuracy of their pose estimation. In this paper, we propose a complete system to measure the pose parameters of the human body accurately. Different from previous monocular depth camera systems, we leverage two Kinect sensors to acquire more information about human movements, which ensures that we can still get an accurate estimation even when significant occlusion occurs. Because human motion is temporally constant, we adopt a learning analysis to mine the temporal information across the posture variations. Using this information, we estimate human pose parameters accurately, regardless of rapid movement. Our experimental results show that our system can perform an accurate pose estimation of the human body with the constraint of information from the temporal domain.

  2. Leveraging Two Kinect Sensors for Accurate Full-Body Motion Capture

    PubMed Central

    Gao, Zhiquan; Yu, Yao; Zhou, Yu; Du, Sidan

    2015-01-01

    Accurate motion capture plays an important role in sports analysis, the medical field and virtual reality. Current methods for motion capture often suffer from occlusions, which limits the accuracy of their pose estimation. In this paper, we propose a complete system to measure the pose parameters of the human body accurately. Different from previous monocular depth camera systems, we leverage two Kinect sensors to acquire more information about human movements, which ensures that we can still get an accurate estimation even when significant occlusion occurs. Because human motion is temporally constant, we adopt a learning analysis to mine the temporal information across the posture variations. Using this information, we estimate human pose parameters accurately, regardless of rapid movement. Our experimental results show that our system can perform an accurate pose estimation of the human body with the constraint of information from the temporal domain. PMID:26402681

  3. The Yale Cost Model and cost centres: servant or master?

    PubMed

    Rigby, E

    1993-01-01

    Cost accounting describes that aspect of accounting which collects, allocates and controls the cost of producing a service. Costing information is primarily reported to management to enable control of costs and to ensure the financial viability of units, departments and divisions. As costing studies continue to produce estimates of Diagnosis Related Group (DRG) costs in New South Wales hospitals, as well as in other states, costs for different hospitals are being externally compared, using a tool which is usually related to internal management and reporting. Comparability of costs is assumed even though accounting systems differ. This paper examines the cost centre structures at five major teaching hospitals in Sydney. It describes the similarities and differences in how the cost centres were constituted, and then details the line items of expenditure that are charged to each cost centre. The results of a comparative study of a medical specialty are included as evidence of different costing methodologies in the hospitals. The picture that emerged from the study is that the hospitals are constituting their cost centres to meet their internal management needs, that is, to know the cost of running a ward or nursing unit, a medical specialty, department and so on. The rationale for the particular cost centre construction was that cost centre managers could manage and control costs and assign responsibility. There are variations in procedures for assigning costs to cost centres, and the question is asked 'Do these variations in procedures make a material difference to our ability to compare costs per Diagnosis Related Group at the various hospitals?' It is contended that the accounting information, which is produced as a result of different practices, is primarily for internal management, not external comparison. It would be better for hospitals to compare their estimated costs per Diagnosis Related Group to an internal standard cost rather than the costs from other

  4. Closing the mental health treatment gap in South Africa: a review of costs and cost-effectiveness

    PubMed Central

    Jack, Helen; Wagner, Ryan G.; Petersen, Inge; Thom, Rita; Newton, Charles R.; Stein, Alan; Kahn, Kathleen; Tollman, Stephen; Hofman, Karen J.

    2014-01-01

    Background Nearly one in three South Africans will suffer from a mental disorder in his or her lifetime, a higher prevalence than many low- and middle-income countries. Understanding the economic costs and consequences of prevention and packages of care is essential, particularly as South Africa considers scaling-up mental health services and works towards universal health coverage. Economic evaluations can inform how priorities are set in system or spending changes. Objective To identify and review research from South Africa and sub-Saharan Africa on the direct and indirect costs of mental, neurological, and substance use (MNS) disorders and the cost-effectiveness of treatment interventions. Design Narrative overview methodology. Results and conclusions Reviewed studies indicate that integrating mental health care into existing health systems may be the most effective and cost-efficient approach to increase access to mental health services in South Africa. Integration would also direct treatment, prevention, and screening to people with HIV and other chronic health conditions who are at high risk for mental disorders. We identify four major knowledge gaps: 1) accurate and thorough assessment of the health burdens of MNS disorders, 2) design and assessment of interventions that integrate mental health screening and treatment into existing health systems, 3) information on the use and costs of traditional medicines, and 4) cost-effectiveness evaluation of a range of specific interventions or packages of interventions that are tailored to the national context. PMID:24848654

  5. Information logistics: A production-line approach to information services

    NASA Technical Reports Server (NTRS)

    Adams, Dennis; Lee, Chee-Seng

    1991-01-01

    Logistics can be defined as the process of strategically managing the acquisition, movement, and storage of materials, parts, and finished inventory (and the related information flow) through the organization and its marketing channels in a cost effective manner. It is concerned with delivering the right product to the right customer in the right place at the right time. The logistics function is composed of inventory management, facilities management, communications, unitization, transportation, materials management, and production scheduling. The relationship between logistics and information systems is clear. Systems such as Electronic Data Interchange (EDI), Point of Sale (POS) systems, and Just in Time (JIT) inventory management systems are important elements in the management of product development and delivery. With improved access to market demand figures, logisticians can decrease inventory sizes and better service customer demand. However, without accurate, timely information, little, if any, of this would be feasible in today's global markets. Information systems specialists can learn from logisticians. In a manner similar to logistics management, information logistics is concerned with the delivery of the right data, to the right customer, at the right time. As such, information systems are integral components of the information logistics system charged with providing customers with accurate, timely, cost-effective, and useful information.
Information logistics is a management style and is composed of elements similar to those associated with the traditional logistics activity: inventory management (data resource management), facilities management (distributed, centralized and decentralized information systems), communications (participative design and joint application development methodologies), unitization (input/output system design, i.e., packaging or formatting of the information), transportations (voice, data, image, and video communication systems

  6. Assessing the Cost of Global Biodiversity and Conservation Knowledge.

    PubMed

    Juffe-Bignoli, Diego; Brooks, Thomas M; Butchart, Stuart H M; Jenkins, Richard B; Boe, Kaia; Hoffmann, Michael; Angulo, Ariadne; Bachman, Steve; Böhm, Monika; Brummitt, Neil; Carpenter, Kent E; Comer, Pat J; Cox, Neil; Cuttelod, Annabelle; Darwall, William R T; Di Marco, Moreno; Fishpool, Lincoln D C; Goettsch, Bárbara; Heath, Melanie; Hilton-Taylor, Craig; Hutton, Jon; Johnson, Tim; Joolia, Ackbar; Keith, David A; Langhammer, Penny F; Luedtke, Jennifer; Nic Lughadha, Eimear; Lutz, Maiko; May, Ian; Miller, Rebecca M; Oliveira-Miranda, María A; Parr, Mike; Pollock, Caroline M; Ralph, Gina; Rodríguez, Jon Paul; Rondinini, Carlo; Smart, Jane; Stuart, Simon; Symes, Andy; Tordoff, Andrew W; Woodley, Stephen; Young, Bruce; Kingston, Naomi

    2016-01-01

    Knowledge products comprise assessments of authoritative information supported by standards, governance, quality control, data, tools, and capacity building mechanisms. Considerable resources are dedicated to developing and maintaining knowledge products for biodiversity conservation, and they are widely used to inform policy and advise decision makers and practitioners. However, the financial cost of delivering this information is largely undocumented. We evaluated the costs and funding sources for developing and maintaining four global biodiversity and conservation knowledge products: The IUCN Red List of Threatened Species, the IUCN Red List of Ecosystems, Protected Planet, and the World Database of Key Biodiversity Areas. These are secondary data sets, built on primary data collected by extensive networks of expert contributors worldwide. We estimate that US$160 million (range: US$116-204 million), plus 293 person-years of volunteer time (range: 278-308 person-years) valued at US$ 14 million (range US$12-16 million), were invested in these four knowledge products between 1979 and 2013. More than half of this financing was provided through philanthropy, and nearly three-quarters was spent on personnel costs. The estimated annual cost of maintaining data and platforms for three of these knowledge products (excluding the IUCN Red List of Ecosystems for which annual costs were not possible to estimate for 2013) is US$6.5 million in total (range: US$6.2-6.7 million). We estimated that an additional US$114 million will be needed to reach pre-defined baselines of data coverage for all the four knowledge products, and that once achieved, annual maintenance costs will be approximately US$12 million. These costs are much lower than those to maintain many other, similarly important, global knowledge products. 
Ensuring that biodiversity and conservation knowledge products are sufficiently up to date, comprehensive and accurate is fundamental to inform decision-making for

  7. Novel serologic biomarkers provide accurate estimates of recent Plasmodium falciparum exposure for individuals and communities

    PubMed Central

    Helb, Danica A.; Tetteh, Kevin K. A.; Felgner, Philip L.; Skinner, Jeff; Hubbard, Alan; Arinaitwe, Emmanuel; Mayanja-Kizza, Harriet; Ssewanyana, Isaac; Kamya, Moses R.; Beeson, James G.; Tappero, Jordan; Smith, David L.; Crompton, Peter D.; Rosenthal, Philip J.; Dorsey, Grant; Drakeley, Christopher J.; Greenhouse, Bryan

    2015-01-01

    Tools to reliably measure Plasmodium falciparum (Pf) exposure in individuals and communities are needed to guide and evaluate malaria control interventions. Serologic assays can potentially produce precise exposure estimates at low cost; however, current approaches based on responses to a few characterized antigens are not designed to estimate exposure in individuals. Pf-specific antibody responses differ by antigen, suggesting that selection of antigens with defined kinetic profiles will improve estimates of Pf exposure. To identify novel serologic biomarkers of malaria exposure, we evaluated responses to 856 Pf antigens by protein microarray in 186 Ugandan children, for whom detailed Pf exposure data were available. Using data-adaptive statistical methods, we identified combinations of antibody responses that maximized information on an individual’s recent exposure. Responses to three novel Pf antigens accurately classified whether an individual had been infected within the last 30, 90, or 365 d (cross-validated area under the curve = 0.86–0.93), whereas responses to six antigens accurately estimated an individual’s malaria incidence in the prior year. Cross-validated incidence predictions for individuals in different communities provided accurate stratification of exposure between populations and suggest that precise estimates of community exposure can be obtained from sampling a small subset of that community. In addition, serologic incidence predictions from cross-sectional samples characterized heterogeneity within a community similarly to 1 y of continuous passive surveillance. Development of simple ELISA-based assays derived from the successful selection strategy outlined here offers the potential to generate rich epidemiologic surveillance data that will be widely accessible to malaria control programs. PMID:26216993

  8. Modified chemiluminescent NO analyzer accurately measures NOX

    NASA Technical Reports Server (NTRS)

    Summers, R. L.

    1978-01-01

    Installation of molybdenum nitric oxide (NO)-to-higher oxides of nitrogen (NOx) converter in chemiluminescent gas analyzer and use of air purge allow accurate measurements of NOx in exhaust gases containing as much as thirty percent carbon monoxide (CO). Measurements using conventional analyzer are highly inaccurate for NOx if as little as five percent CO is present. In modified analyzer, molybdenum has high tolerance to CO, and air purge substantially quenches NOx destruction. In test, modified chemiluminescent analyzer accurately measured NO and NOx concentrations for over 4 months with no degradation in performance.

  9. PRIMAL: Fast and Accurate Pedigree-based Imputation from Sequence Data in a Founder Population

    PubMed Central

    Livne, Oren E.; Han, Lide; Alkorta-Aranburu, Gorka; Wentworth-Sheilds, William; Abney, Mark; Ober, Carole; Nicolae, Dan L.

    2015-01-01

    Founder populations and large pedigrees offer many well-known advantages for genetic mapping studies, including cost-efficient study designs. Here, we describe PRIMAL (PedigRee IMputation ALgorithm), a fast and accurate pedigree-based phasing and imputation algorithm for founder populations. PRIMAL incorporates both existing and original ideas, such as a novel indexing strategy of Identity-By-Descent (IBD) segments based on clique graphs. We were able to impute the genomes of 1,317 South Dakota Hutterites, who had genome-wide genotypes for ~300,000 common single nucleotide variants (SNVs), from 98 whole genome sequences. Using a combination of pedigree-based and LD-based imputation, we were able to assign 87% of genotypes with >99% accuracy over the full range of allele frequencies. Using the IBD cliques we were also able to infer the parental origin of 83% of alleles, and genotypes of deceased recent ancestors for whom no genotype information was available. This imputed data set will enable us to better study the relative contribution of rare and common variants on human phenotypes, as well as parental origin effect of disease risk alleles in >1,000 individuals at minimal cost. PMID:25735005

  10. Comprehensive cost analysis of sentinel node biopsy in solid head and neck tumors using a time-driven activity-based costing approach.

    PubMed

    Crott, Ralph; Lawson, Georges; Nollevaux, Marie-Cécile; Castiaux, Annick; Krug, Bruno

    2016-09-01

    Head and neck cancer (HNC) is predominantly a locoregional disease. Sentinel lymph node (SLN) biopsy offers a minimally invasive means of accurately staging the neck. Value in healthcare is determined by both outcomes and the costs associated with achieving them. Time-driven activity-based costing (TDABC) may offer more precise estimates of the true cost. Process maps were developed for nuclear medicine, operating room and pathology care phases. TDABC estimates the costs by combining information about the process with the unit cost of each resource used. Resource utilization is based on observation of care and staff interviews. Unit costs are calculated as a capacity cost rate, measured in euros per minute (2014 prices), for each resource consumed. Multiplying together the unit costs and resource quantities and summing across all resources used will produce the average cost for each phase of care. Three time equations with six different scenarios were modeled based on the type of camera, the number of SLN and the type of staining used. Total times for the different SLN scenarios vary between 284 and 307 min, with a total cost between 2794 and 3541€. The unit costs vary between 788€/h for the intraoperative evaluation with a gamma-probe and 889€/h for a preoperative imaging with a SPECT/CT. The unit costs for the lymphadenectomy and the pathological examination are, respectively, 560 and 713€/h. A 10 % increase of time per individual activity generates only 1 % change in the total cost. TDABC evaluates the cost of SLN in HNC. The total cost across all phases varied between 2761 and 3744€ per standard case. PMID:27170361
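The TDABC arithmetic described above (a capacity cost rate in euros per minute, multiplied by the minutes each resource is consumed and summed over resources) can be sketched as follows. The resource names and minutes per phase are hypothetical; only the hourly rates echo figures from the abstract:

```python
# Minimal sketch of time-driven activity-based costing (TDABC): each
# resource has a capacity cost rate (euros/minute), and a scenario's cost
# is the sum over resources of rate x minutes consumed.

def phase_cost(resource_minutes, rates_per_min):
    """Sum rate * minutes over every resource used in a care scenario."""
    return sum(rates_per_min[r] * mins for r, mins in resource_minutes.items())

# Capacity cost rates in euros/minute, converted from the hourly figures
# quoted in the abstract (e.g. 788 euros/h for the gamma-probe phase).
rates = {
    "gamma_probe": 788 / 60,
    "spect_ct": 889 / 60,
    "lymphadenectomy": 560 / 60,
    "pathology": 713 / 60,
}

# Hypothetical minutes consumed per resource in one scenario.
scenario = {
    "spect_ct": 45,
    "gamma_probe": 30,
    "lymphadenectomy": 90,
    "pathology": 120,
}

total = phase_cost(scenario, rates)
print(f"Total scenario cost: {total:.0f} EUR")
```

Because cost scales linearly with minutes, a uniform 10 % increase in every activity time raises the total by exactly 10 % of the time-driven portion, which is why the abstract reports low sensitivity when only one activity's time changes.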

  12. 48 CFR 52.215-10 - Price Reduction for Defective Certified Cost or Pricing Data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... complete, accurate, and current as certified in its Certificate of Current Cost or Pricing Data; (2) A... complete, accurate, and current as certified in the Contractor's Certificate of Current Cost or Pricing... been modified even if accurate, complete, and current certified cost or pricing data had been...

  13. Can Appraisers Rate Work Performance Accurately?

    ERIC Educational Resources Information Center

    Hedge, Jerry W.; Laue, Frances J.

    The ability of individuals to make accurate judgments about others is examined and literature on this subject is reviewed. A wide variety of situational factors affects the appraisal of performance. It is generally accepted that the purpose of the appraisal influences the accuracy of the appraiser. The instrumentation, or tools, available to the…

  14. Practical Schemes for Accurate Forces in Quantum Monte Carlo.

    PubMed

    Moroni, S; Saccani, S; Filippi, C

    2014-11-11

While the computation of interatomic forces has become a well-established practice within variational Monte Carlo (VMC), the use of the more accurate Fixed-Node Diffusion Monte Carlo (DMC) method is still largely limited to the computation of total energies on structures obtained at a lower level of theory. Algorithms to compute exact DMC forces have been proposed in the past, and one such scheme is also put forward in this work, but these remain rather impractical due to their high computational cost. As a practical route to DMC forces, we therefore revisit here an approximate method, originally developed in the context of correlated sampling and named here the Variational Drift-Diffusion (VD) approach. We thoroughly investigate its accuracy by checking the consistency between the approximate VD force and the derivative of the DMC potential energy surface for the SiH and C2 molecules and employ a wide range of wave functions optimized in VMC to assess its robustness against the choice of trial function. We find that, for all but the poorest wave function, the discrepancy between force and energy is very small over all interatomic distances, affecting the equilibrium bond length obtained with the VD forces by less than 0.004 au. Furthermore, when the VMC forces are approximate due to the use of a partially optimized wave function, the DMC forces have smaller errors and always lead to an equilibrium distance in better agreement with the experimental value. We also show that the cost of computing the VD forces is only slightly larger than the cost of calculating the DMC energy. Therefore, the VD approximation represents a robust and efficient approach to compute accurate DMC forces, superior to the VMC counterparts.

  15. A systematic review of the unit costs of allied health and community services used by older people in Australia

    PubMed Central

    2013-01-01

Background An economic evaluation of interventions for older people requires accurate assessment of costing and consideration of both acute and long-term services. Accurate information on the unit cost of allied health and community services, however, is not readily available in Australia. This systematic review therefore aims to synthesise information available in the literature on the unit costs of allied health and community services that may be utilised by an older person living in Australia. Method A comprehensive search of Medline, Embase, CINAHL, Google Scholar and Google was undertaken. Specialised economic databases were also reviewed. In addition, Australian Government Department websites were inspected. The search identified the cost of specified allied health services including: physiotherapy, occupational therapy, dietetics, podiatry, counselling and home nursing. The range of community services included: personal care, meals on wheels, transport costs and domestic services. Where the information was not available, direct contact with service providers was made. Results The number of eligible studies included in the qualitative synthesis was forty-nine. Calculated hourly rates for Australian allied health services were adjusted to equivalent currency and were as follows: physiotherapy $157.75, occupational therapy $150.77, dietetics $163.11, psychological services $165.77, community nursing $105.76 and podiatry $129.72. Conclusions Utilisation of the Medicare Benefits Schedule fee as a broad indicator of the costs of services may lead to underestimation of the real costs of services and therefore to inaccuracies in economic evaluation. PMID:23421756

  16. Predicting patients with high risk of becoming high-cost healthcare users in Ontario (Canada).

    PubMed

    Chechulin, Yuriy; Nazerian, Amir; Rais, Saad; Malikov, Kamil

    2014-02-01

    Literature and original analysis of healthcare costs have shown that a small proportion of patients consume the majority of healthcare resources. A proactive approach is to target interventions towards those patients who are at risk of becoming high-cost users (HCUs). This approach requires identifying high-risk patients accurately before substantial avoidable costs have been incurred and health status has deteriorated further. We developed a predictive model to identify patients at risk of becoming HCUs in Ontario. HCUs were defined as the top 5% of patients incurring the highest costs. Information was collected on various demographic and utilization characteristics. The modelling technique used was logistic regression. If the top 5% of patients at risk of becoming HCUs are followed, the sensitivity is 42.2% and specificity is 97%. Alternatives for implementation of the model include collaboration between different levels of healthcare services for personalized healthcare interventions and interventions addressing needs of patient cohorts with high-cost conditions. PMID:24726075
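As a minimal illustration of the screening arithmetic in this abstract, the sketch below flags the top 5% of patients by predicted risk and scores the flags against known outcomes. The risk scores and labels are invented, so the resulting numbers differ from the study's reported 42.2% sensitivity and 97% specificity:

```python
# Toy illustration (invented data): flag the top 5% of patients by predicted
# risk of becoming high-cost users (HCUs), then compute sensitivity and
# specificity of those flags against the true outcomes.

def flag_top_fraction(scores, fraction=0.05):
    """Flag the top `fraction` of patients by predicted risk score."""
    k = max(1, int(len(scores) * fraction))
    cutoff = sorted(scores, reverse=True)[k - 1]
    return [s >= cutoff for s in scores]

def sens_spec(y_true, y_flag):
    tp = sum(t and f for t, f in zip(y_true, y_flag))
    fn = sum(t and not f for t, f in zip(y_true, y_flag))
    tn = sum(not t and not f for t, f in zip(y_true, y_flag))
    fp = sum(not t and f for t, f in zip(y_true, y_flag))
    return tp / (tp + fn), tn / (tn + fp)

scores = [0.95, 0.60, 0.40, 0.35, 0.30, 0.25, 0.20, 0.18, 0.15, 0.12,
          0.10, 0.09, 0.08, 0.07, 0.06, 0.05, 0.04, 0.03, 0.02, 0.01]
y_true = [True, True] + [False] * 18  # two patients actually become HCUs

sensitivity, specificity = sens_spec(y_true, flag_top_fraction(scores))
print(sensitivity, specificity)  # 0.5 1.0 on this toy data
```

The trade-off visible even in this toy case (perfect specificity, half the true HCUs missed) mirrors the paper's finding that a top-5% threshold favors specificity over sensitivity.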

  17. Light Field Imaging Based Accurate Image Specular Highlight Removal.

    PubMed

    Wang, Haoqian; Xu, Chenxue; Wang, Xingzheng; Zhang, Yongbing; Peng, Bo

    2016-01-01

Specular reflection removal is indispensable to many computer vision tasks. However, most existing methods fail or degrade in complex real scenarios because of their individual drawbacks. Benefiting from light field imaging technology, this paper proposes a novel and accurate approach to remove specularity and improve image quality. We first capture images with specularity by the light field camera (Lytro ILLUM). After accurately estimating the image depth, a simple and concise threshold strategy is adopted to cluster the specular pixels into "unsaturated" and "saturated" categories. Finally, a color variance analysis of multiple views and a local color refinement are individually conducted on the two categories to recover diffuse color information. Experimental evaluation by comparison with existing methods, based on our light field dataset together with the Stanford light field archive, verifies the effectiveness of the proposed algorithm. PMID:27253083

  18. A catalog of isolated galaxy pairs with accurate radial velocities

    NASA Astrophysics Data System (ADS)

    Chamaraux, P.; Nottale, L.

    2016-07-01

The present paper is devoted to the construction of a catalog of isolated galaxy pairs from the Uppsala Galaxy Catalog (UGC), using accurate radial velocities. The UGC lists 12 921 galaxies to δ > -2°30' and is complete to an apparent diameter of 1'. The criteria used to define the isolated galaxy pairs are based on velocity, interdistance, reciprocity and isolation information. A dedicated investigation allowed us to gather very accurate radial velocities for pair members from high-quality HI and optical measurements (median uncertainty on velocity differences 10 km s-1). Our final catalog contains 1005 galaxy pairs with ρ > 2.5, of which 509 have ρ > 5 (50% of the pairs, i.e. 8% of the UGC galaxies) and 273 are highly isolated with ρ > 10 (27% of the pairs, i.e. 4% of the UGC galaxies). Some global properties of the pair catalog are given.

  19. Methods for accurate homology modeling by global optimization.

    PubMed

    Joo, Keehyoung; Lee, Jinwoo; Lee, Jooyoung

    2012-01-01

High-accuracy protein modeling from sequence information is an important step toward revealing the sequence-structure-function relationship of proteins, and it is becoming increasingly useful for practical purposes such as drug discovery and protein design. We have developed a protocol for protein structure prediction that can generate highly accurate protein models in terms of backbone structure, side-chain orientation, hydrogen bonding, and binding sites of ligands. To obtain accurate protein models, we have combined a powerful global optimization method with traditional homology modeling procedures such as multiple sequence alignment, chain building, and side-chain remodeling. We have built a series of specific score functions for these steps and optimized them by utilizing conformational space annealing, one of the most successful combinatorial optimization algorithms currently available.

  20. Accurate Development of Thermal Neutron Scattering Cross Section Libraries

    SciTech Connect

    Hawari, Ayman; Dunn, Michael

    2014-06-10

The objective of this project is to develop a holistic (fundamental and accurate) approach for generating thermal neutron scattering cross section libraries for a collection of important neutron moderators and reflectors. The primary components of this approach are the physical accuracy and completeness of the generated data libraries. Consequently, for the first time, thermal neutron scattering cross section data libraries will be generated that are based on accurate theoretical models, that are carefully benchmarked against experimental and computational data, and that contain complete covariance information that can be used in propagating the data uncertainties through the various components of the nuclear design and execution process. To achieve this objective, computational and experimental investigations will be performed on a carefully selected subset of materials that play a key role in all stages of the nuclear fuel cycle.

  1. Light Field Imaging Based Accurate Image Specular Highlight Removal

    PubMed Central

    Wang, Haoqian; Xu, Chenxue; Wang, Xingzheng; Zhang, Yongbing; Peng, Bo

    2016-01-01

Specular reflection removal is indispensable to many computer vision tasks. However, most existing methods fail or degrade in complex real scenarios because of their individual drawbacks. Benefiting from light field imaging technology, this paper proposes a novel and accurate approach to remove specularity and improve image quality. We first capture images with specularity by the light field camera (Lytro ILLUM). After accurately estimating the image depth, a simple and concise threshold strategy is adopted to cluster the specular pixels into “unsaturated” and “saturated” categories. Finally, a color variance analysis of multiple views and a local color refinement are individually conducted on the two categories to recover diffuse color information. Experimental evaluation by comparison with existing methods, based on our light field dataset together with the Stanford light field archive, verifies the effectiveness of the proposed algorithm. PMID:27253083

  2. Costs and benefits

    NASA Technical Reports Server (NTRS)

    1975-01-01

Two models of cost benefit analysis are illustrated, and the application of these models to assessing the economic scope of space applications programs is discussed. Four major areas cited as improvable through space-derived information were discussed: food supply and distribution, energy sources, mineral reserves, and communication and navigation. Specific illustrations are given for agriculture and maritime traffic.

  3. Feedback about More Accurate versus Less Accurate Trials: Differential Effects on Self-Confidence and Activation

    ERIC Educational Resources Information Center

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-01-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On Day 1, participants performed a golf putting task under one of…

  4. Accurate object tracking system by integrating texture and depth cues

    NASA Astrophysics Data System (ADS)

    Chen, Ju-Chin; Lin, Yu-Hang

    2016-03-01

A robust object tracking system that is invariant to object appearance variations and background clutter is proposed. Multiple instance learning with a boosting algorithm is applied to select discriminant texture information between the object and background data. Additionally, depth information, which is important to distinguish the object from a complicated background, is integrated. We propose two depth-based models that can compensate texture information to cope with both appearance variations and background clutter. Moreover, to reduce the increased risk of drift with textureless depth templates, an update mechanism is proposed that selects more precise tracking results and avoids incorrect model updates. In the experiments, the robustness of the proposed system is evaluated and quantitative results are provided for performance analysis. Experimental results show that the proposed system achieves the best success rate and more accurate tracking results than other well-known algorithms.

  5. Efficient and Accurate Indoor Localization Using Landmark Graphs

    NASA Astrophysics Data System (ADS)

    Gu, F.; Kealy, A.; Khoshelham, K.; Shang, J.

    2016-06-01

    Indoor localization is important for a variety of applications such as location-based services, mobile social networks, and emergency response. Fusing spatial information is an effective way to achieve accurate indoor localization with little or with no need for extra hardware. However, existing indoor localization methods that make use of spatial information are either too computationally expensive or too sensitive to the completeness of landmark detection. In this paper, we solve this problem by using the proposed landmark graph. The landmark graph is a directed graph where nodes are landmarks (e.g., doors, staircases, and turns) and edges are accessible paths with heading information. We compared the proposed method with two common Dead Reckoning (DR)-based methods (namely, Compass + Accelerometer + Landmarks and Gyroscope + Accelerometer + Landmarks) by a series of experiments. Experimental results show that the proposed method can achieve 73% accuracy with a positioning error less than 2.5 meters, which outperforms the other two DR-based methods.
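The landmark graph described above can be sketched as a plain adjacency structure whose nodes are landmarks and whose edges carry heading and length annotations. The landmark names, headings and distances below are invented for illustration:

```python
# Minimal sketch (invented landmarks) of a landmark graph: nodes are detected
# landmarks (doors, turns, staircases), edges are accessible paths annotated
# with heading and length, so a matched landmark sequence constrains position.
landmark_graph = {
    "door_A":   [("turn_1", {"heading_deg": 90, "length_m": 5.0})],
    "turn_1":   [("stairs_B", {"heading_deg": 0, "length_m": 8.0})],
    "stairs_B": [],
}

def path_length(graph, nodes):
    """Total edge length along a sequence of landmark nodes, if each hop exists."""
    total = 0.0
    for a, b in zip(nodes, nodes[1:]):
        edge = dict(graph[a]).get(b)
        if edge is None:
            raise ValueError(f"no edge {a} -> {b}")
        total += edge["length_m"]
    return total

print(path_length(landmark_graph, ["door_A", "turn_1", "stairs_B"]))  # 13.0
```

In a full localizer the dead-reckoned trajectory would be matched against such edge headings and lengths to correct accumulated drift; the sketch only shows the data structure itself.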

  6. The impact of disease stage on direct medical costs of HIV management: a review of the international literature.

    PubMed

    Levy, Adrian; Johnston, Karissa; Annemans, Lieven; Tramarin, Andrea; Montaner, Julio

    2010-01-01

The global prevalence of HIV infection continues to grow, as a result of increasing incidence in some countries and improved survival where highly active antiretroviral therapy (HAART) is available. Growing healthcare expenditure and shifts in the types of medical resources used have created a greater need for accurate information on the costs of treatment. The objectives of this review were to compare published estimates of direct medical costs for treating HIV and to determine the impact of disease stage on such costs, based on CD4 cell count and plasma viral load. A literature review was conducted to identify studies meeting prespecified criteria for information content, including an original estimate of the direct medical costs of treating an HIV-infected individual, stratified based on markers of disease progression. Three unpublished cost-of-care studies were also included, which were applied in the economic analyses published in this supplement. A two-step procedure was used to convert costs into a common price year (2004) using country-specific health expenditure inflators and, to account for differences in currency, using health-specific purchasing power parities to express all cost estimates in US dollars. In all nine studies meeting the eligibility criteria, infected individuals were followed longitudinally and a 'bottom-up' approach was used to estimate costs. The same patterns were observed in all studies: the lowest CD4 categories had the highest cost; there was a sharp decrease in costs as CD4 cell counts rose towards 100 cells/mm³; and there was a more gradual decline in costs as CD4 cell counts rose above 100 cells/mm³. In the single study reporting cost according to viral load, it was shown that a higher plasma viral load level (> 100,000 HIV-RNA copies/mL) was associated with higher costs of care. The results demonstrate that the cost of treating HIV disease increases with disease progression, particularly at CD4 cell counts below 100 cells/mm³.

  7. Application of activity-based costing (ABC) for a Peruvian NGO healthcare provider.

    PubMed

    Waters, H; Abdallah, H; Santillán, D

    2001-01-01

    This article describes the application of activity-based costing (ABC) to calculate the unit costs of the services for a health care provider in Peru. While traditional costing allocates overhead and indirect costs in proportion to production volume or to direct costs, ABC assigns costs through activities within an organization. ABC uses personnel interviews to determine principal activities and the distribution of individual's time among these activities. Indirect costs are linked to services through time allocation and other tracing methods, and the result is a more accurate estimate of unit costs. The study concludes that applying ABC in a developing country setting is feasible, yielding results that are directly applicable to pricing and management. ABC determines costs for individual clinics, departments and services according to the activities that originate these costs, showing where an organization spends its money. With this information, it is possible to identify services that are generating extra revenue and those operating at a loss, and to calculate cross subsidies across services. ABC also highlights areas in the health care process where efficiency improvements are possible. Conclusions about the ultimate impact of the methodology are not drawn here, since the study was not repeated and changes in utilization patterns and the addition of new clinics affected applicability of the results. A potential constraint to implementing ABC is the availability and organization of cost information. Applying ABC efficiently requires information to be readily available, by cost category and department, since the greatest benefits of ABC come from frequent, systematic application of the methodology in order to monitor efficiency and provide feedback for management. The article concludes with a discussion of the potential applications of ABC in the health sector in developing countries. PMID:11326572
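The core ABC step described here (tracing indirect costs to services in proportion to interview-derived staff time shares, then combining with direct costs and dividing by volume) can be sketched as follows. All service names and figures are hypothetical:

```python
# Hedged ABC sketch with invented figures: indirect costs are allocated by
# staff time shares obtained from personnel interviews, then combined with
# direct costs and divided by service volume to yield a unit cost per service.
indirect_cost = 10_000.0  # overhead to allocate for the period
time_shares = {"consultations": 0.5, "lab_tests": 0.3, "pharmacy": 0.2}
direct_costs = {"consultations": 4_000.0, "lab_tests": 2_500.0, "pharmacy": 1_500.0}
volumes = {"consultations": 400, "lab_tests": 250, "pharmacy": 300}

allocated = {svc: indirect_cost * share for svc, share in time_shares.items()}
unit_costs = {svc: (allocated[svc] + direct_costs[svc]) / volumes[svc]
              for svc in volumes}
for svc, cost in unit_costs.items():
    print(f"{svc}: {cost:.2f} per unit")
```

Contrast this with traditional costing, which would spread the same overhead in proportion to volume or direct cost; the activity-based shares are what let the method reveal cross-subsidies between services.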


  9. The Business Case for Payer Support of a Community-Based Health Information Exchange: A Humana Pilot Evaluating Its Effectiveness in Cost Control for Plan Members Seeking Emergency Department Care

    PubMed Central

    Tzeel, Albert; Lawnicki, Victor; Pemble, Kim R.

    2011-01-01

Background As emergency department utilization continues to increase, health plans must limit their cost exposure, which may be driven by duplicate testing and a lack of medical history at the point of care. Based on previous studies, health information exchanges (HIEs) can potentially provide health plans with the ability to address this need. Objective To assess the effectiveness of a community-based HIE in controlling plan costs arising from emergency department care for a health plan's members. Methods The study design was observational, with an eligible population (N = 1482) of fully insured plan members who sought emergency department care on at least 2 occasions during the study period, from December 2008 through March 2010. Cost and utilization data, obtained from member claims, were matched to a list of persons utilizing the emergency department where HIE querying could have occurred. Eligible members underwent propensity score matching to create a test group (N = 326) in which the HIE database was queried in all emergency department visits, and a control group (N = 325) in which the HIE database was not queried in any emergency department visit. Results Post-propensity-matching analysis showed that the test group achieved an average savings of $29 per emergency department visit compared with the control group. Decreased utilization of imaging procedures and diagnostic tests drove this cost-savings. Conclusions When clinicians utilize HIE in the care of patients who present to the emergency department, the costs borne by a health plan providing coverage for these patients decrease. Although many factors can play a role in this finding, it is likely that HIEs obviate unnecessary service utilization through provision of historical medical information regarding specific patients at the point of care. PMID:25126351

  10. Feedback about more accurate versus less accurate trials: differential effects on self-confidence and activation.

    PubMed

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-06-01

One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On day 1, participants performed a golf putting task under one of two conditions: one group received feedback on the most accurate trials, whereas another group received feedback on the least accurate trials. On day 2, participants completed an anxiety questionnaire and performed a retention test. Skin conductance level, as a measure of arousal, was determined. The results indicated that feedback about more accurate trials resulted in more effective learning as well as increased self-confidence. Also, activation was a predictor of performance. PMID:22808705

  11. Two highly accurate methods for pitch calibration

    NASA Astrophysics Data System (ADS)

    Kniel, K.; Härtig, F.; Osawa, S.; Sato, O.

    2009-11-01

Among profile, helix and tooth thickness, pitch is one of the most important parameters of an involute gear measurement evaluation. In principle, coordinate measuring machines (CMMs) and CNC-controlled gear measuring machines, as a variant of a CMM, are suited for these kinds of gear measurements. The Japan National Institute of Advanced Industrial Science and Technology (NMIJ/AIST) and the German national metrology institute, the Physikalisch-Technische Bundesanstalt (PTB), have each independently developed highly accurate pitch calibration methods applicable to CMMs or gear measuring machines. Both calibration methods are based on the so-called closure technique, which allows the separation of the systematic errors of the measurement device from the errors of the gear. For the verification of both calibration methods, NMIJ/AIST and PTB performed measurements on a specially designed pitch artifact. The comparison of the results shows that both methods can be used for highly accurate calibrations of pitch standards.
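The closure idea can be illustrated with a toy simulation: the artifact is measured in several rotated orientations, and because pitch deviations sum to zero around a full circle, averaging over orientations cancels the artifact term and isolates the machine's systematic errors. All values below are invented:

```python
# Toy closure-technique simulation (invented values, in micrometers).
# obs[r][p]: reading at machine position p with the artifact rotated by r steps.
N = 3
machine = [0.30, -0.10, 0.20]   # systematic machine errors, fixed per position
artifact = [0.05, -0.15, 0.10]  # true pitch deviations; they sum to zero

obs = [[machine[p] + artifact[(p + r) % N] for p in range(N)] for r in range(N)]

# Averaging each position over all rotations cancels the artifact term
# (its deviations sum to zero), leaving the machine error alone; subtracting
# that estimate from any single orientation recovers the artifact deviations.
machine_est = [sum(obs[r][p] for r in range(N)) / N for p in range(N)]
artifact_est = [obs[0][p] - machine_est[p] for p in range(N)]
```

This is only the separation principle in miniature; the actual NMIJ/AIST and PTB procedures involve many more positions and a full uncertainty budget.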

  12. Accurate guitar tuning by cochlear implant musicians.

    PubMed

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081
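The beat-based strategy reported above reduces to simple arithmetic: two simultaneous tones at f1 and f2 produce amplitude beats at |f1 - f2|, so tuning until the beats slow to a stop requires only temporal cues, not fine pitch discrimination. A one-line sketch:

```python
# Beat rate heard when two tones sound together: |f1 - f2| beats per second.
def beat_frequency(f1_hz, f2_hz):
    return abs(f1_hz - f2_hz)

# A string at 440.4 Hz against a 440.0 Hz reference beats about 0.4 times per
# second, a rate slow enough to count, which makes sub-0.5 Hz matching by
# beats plausible even without accurate pitch perception.
print(beat_frequency(440.4, 440.0))
```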

  13. Accurate Guitar Tuning by Cochlear Implant Musicians

    PubMed Central

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081

  14. Preparation and accurate measurement of pure ozone.

    PubMed

    Janssen, Christof; Simone, Daniela; Guinet, Mickaël

    2011-03-01

    Preparation of high purity ozone as well as precise and accurate measurement of its pressure are metrological requirements that are difficult to meet due to ozone decomposition occurring in pressure sensors. The most stable and precise transducer heads are heated and, therefore, prone to accelerated ozone decomposition, limiting measurement accuracy and compromising purity. Here, we describe a vacuum system and a method for ozone production, suitable to accurately determine the pressure of pure ozone by avoiding the problem of decomposition. We use an inert gas in a particularly designed buffer volume and can thus achieve high measurement accuracy and negligible degradation of ozone with purities of 99.8% or better. The high degree of purity is ensured by comprehensive compositional analyses of ozone samples. The method may also be applied to other reactive gases. PMID:21456766


  16. Line gas sampling system ensures accurate analysis

    SciTech Connect

    Not Available

    1992-06-01

    Tremendous changes in the natural gas business have resulted in new approaches to the way natural gas is measured. Electronic flow measurement has altered the business forever, with developments in instrumentation and a new sensitivity to the importance of proper natural gas sampling techniques. This paper reports that YZ Industries Inc., Snyder, Texas, combined its 40 years of sampling experience with the latest in microprocessor-based technology to develop the KynaPak 2000 series, the first on-line natural gas sampling system that is both compact and extremely accurate. This means the composition of the sampled gas must be representative of the whole and related to flow. If so, relative measurement and sampling techniques are married, gas volumes are accurately accounted for and adjustments to composition can be made.

  17. Solar power satellite cost estimate

    NASA Technical Reports Server (NTRS)

    Harron, R. J.; Wadle, R. C.

    1981-01-01

The solar power configuration costed is the 5 GW silicon solar cell reference system. The subsystems were identified by work breakdown structure elements to the lowest level for which cost information was generated. This breakdown divides into five sections: the satellite, construction, transportation, the ground receiving station and maintenance. For each work breakdown structure element, a definition, design description and cost estimate were included. An effort was made to include for each element a reference that more thoroughly describes the element and the method of costing used. All costs are in 1977 dollars.

  18. Selected Tether Applications Cost Model

    NASA Technical Reports Server (NTRS)

    Keeley, Michael G.

    1988-01-01

    Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.

  19. Efficient Research Design: Using Value-of-Information Analysis to Estimate the Optimal Mix of Top-down and Bottom-up Costing Approaches in an Economic Evaluation alongside a Clinical Trial.

    PubMed

    Wilson, Edward C F; Mugford, Miranda; Barton, Garry; Shepstone, Lee

    2016-04-01

    In designing economic evaluations alongside clinical trials, analysts are frequently faced with alternative methods of collecting the same data, the extremes being top-down ("gross costing") and bottom-up ("micro-costing") approaches. A priori, bottom-up approaches may be considered superior to top-down approaches but are also more expensive to collect and analyze. In this article, we use value-of-information analysis to estimate the efficient mix of observations on each method in a proposed clinical trial. By assigning a prior bivariate distribution to the 2 data collection processes, the predicted posterior (i.e., preposterior) mean and variance of the superior process can be calculated from proposed samples using either process. This is then used to calculate the preposterior mean and variance of incremental net benefit and hence the expected net gain of sampling. We apply this method to a previously collected data set to estimate the value of conducting a further trial and to identify the optimal mix of observations on drug costs at 2 levels: by individual item (process A) and by drug class (process B). We find that substituting a number of observations on process A for process B leads to a modest £35,000 increase in expected net gain of sampling. Drivers of the results are the correlation between the 2 processes and their relative cost. This method has potential use following a pilot study to inform efficient data collection approaches for a subsequent full-scale trial. It provides a formal quantitative approach to inform trialists whether it is efficient to collect resource use data on all patients in a trial or on a subset of patients only or to collect limited data on most and detailed data on a subset.
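    A hedged sketch of the value-of-information logic may help. The snippet below uses the textbook normal two-option EVSI formula rather than the authors' bivariate machinery, and every number (prior mean, standard deviations, unit costs) is a hypothetical assumption:

```python
import math

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def evsi(mu0, sigma0, sigma_obs, n):
    """EVSI for a choice between two options under a normal prior N(mu0, sigma0^2)
    on incremental net benefit, updated with n observations of sd sigma_obs."""
    if n == 0:
        return 0.0
    post_var = 1.0 / (1.0 / sigma0**2 + n / sigma_obs**2)
    s = math.sqrt(sigma0**2 - post_var)   # sd of the preposterior mean
    # E[max(X, 0)] for X ~ N(mu0, s^2), minus the no-information payoff max(mu0, 0)
    return mu0 * norm_cdf(mu0 / s) + s * norm_pdf(mu0 / s) - max(mu0, 0.0)

def engs(mu0, sigma0, sigma_obs, n, cost_per_obs):
    """Expected net gain of sampling: information value minus data-collection cost."""
    return evsi(mu0, sigma0, sigma_obs, n) - n * cost_per_obs

# Hypothetical numbers: prior mean INB 500 GBP (sd 2000); micro-costing
# observations are precise (sd 3000) but cost 50 GBP each; gross costing is
# noisy (sd 8000) but cheap (5 GBP). Pick the sample size maximizing ENGS.
micro = max(range(0, 301), key=lambda n: engs(500, 2000, 3000, n, 50))
gross = max(range(0, 301), key=lambda n: engs(500, 2000, 8000, n, 5))
```

    In this toy setup the noisier process is cheaper per unit of information, so the optimizer buys more of it; the efficient design trades precision per observation against cost per observation, the same trade-off the trial analysis formalizes.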

  20. Accurate maser positions for MALT-45

    NASA Astrophysics Data System (ADS)

    Jordan, Christopher; Bains, Indra; Voronkov, Maxim; Lo, Nadia; Jones, Paul; Muller, Erik; Cunningham, Maria; Burton, Michael; Brooks, Kate; Green, James; Fuller, Gary; Barnes, Peter; Ellingsen, Simon; Urquhart, James; Morgan, Larry; Rowell, Gavin; Walsh, Andrew; Loenen, Edo; Baan, Willem; Hill, Tracey; Purcell, Cormac; Breen, Shari; Peretto, Nicolas; Jackson, James; Lowe, Vicki; Longmore, Steven

    2013-10-01

    MALT-45 is an untargeted survey, mapping the Galactic plane in CS (1-0), Class I methanol masers, SiO masers and thermal emission, and high frequency continuum emission. After obtaining images from the survey, a number of masers were detected, but without accurate positions. This project seeks to resolve each maser and its environment, with the ultimate goal of placing the Class I methanol maser into a timeline of high mass star formation.

  1. Accurate maser positions for MALT-45

    NASA Astrophysics Data System (ADS)

    Jordan, Christopher; Bains, Indra; Voronkov, Maxim; Lo, Nadia; Jones, Paul; Muller, Erik; Cunningham, Maria; Burton, Michael; Brooks, Kate; Green, James; Fuller, Gary; Barnes, Peter; Ellingsen, Simon; Urquhart, James; Morgan, Larry; Rowell, Gavin; Walsh, Andrew; Loenen, Edo; Baan, Willem; Hill, Tracey; Purcell, Cormac; Breen, Shari; Peretto, Nicolas; Jackson, James; Lowe, Vicki; Longmore, Steven

    2013-04-01

    MALT-45 is an untargeted survey, mapping the Galactic plane in CS (1-0), Class I methanol masers, SiO masers and thermal emission, and high frequency continuum emission. After obtaining images from the survey, a number of masers were detected, but without accurate positions. This project seeks to resolve each maser and its environment, with the ultimate goal of placing the Class I methanol maser into a timeline of high mass star formation.

  2. Accurate Molecular Polarizabilities Based on Continuum Electrostatics

    PubMed Central

    Truchon, Jean-François; Nicholls, Anthony; Iftimie, Radu I.; Roux, Benoît; Bayly, Christopher I.

    2013-01-01

    A novel approach for representing the intramolecular polarizability as a continuum dielectric is introduced to account for molecular electronic polarization. It is shown, using a finite-difference solution to the Poisson equation, that the Electronic Polarization from Internal Continuum (EPIC) model yields accurate gas-phase molecular polarizability tensors for a test set of 98 challenging molecules composed of heteroaromatics, alkanes and diatomics. The electronic polarization originates from a high intramolecular dielectric that produces polarizabilities consistent with B3LYP/aug-cc-pVTZ and experimental values when surrounded by vacuum dielectric. In contrast to other approaches to model electronic polarization, this simple model avoids the polarizability catastrophe and accurately calculates molecular anisotropy with the use of very few fitted parameters and without resorting to auxiliary sites or anisotropic atomic centers. On average, the unsigned errors in the average polarizability and anisotropy compared to B3LYP are 2% and 5%, respectively. The correlation between the polarizability components from B3LYP and this approach leads to an R2 of 0.990 and a slope of 0.999. Even the F2 anisotropy, shown to be a difficult case for existing polarizability models, can be reproduced within 2% error. In addition to providing new parameters for a rapid method directly applicable to the calculation of polarizabilities, this work extends the widely used Poisson equation to areas where accurate molecular polarizabilities matter. PMID:23646034

  3. Cost Control

    ERIC Educational Resources Information Center

    Foreman, Phillip

    2009-01-01

    Education administrators involved in construction initiatives unanimously agree that when it comes to change orders, less is more. Change orders have a negative rippling effect of driving up building costs and producing expensive project delays that often interfere with school operations and schedules. Some change orders are initiated by schools…

  4. Medicaid: Determining Cost-Effectiveness of Home and Community-Based Services. Report to the Administrator, Health Care Financing Administration, Department of Health and Human Services.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC.

    To examine alternatives to nursing home care, states have been testing home and community-based services under the Medicaid program. Information on the operations of the state projects will be vital to designing cost-effective alternative services in the future. The General Accounting Office (GAO) reviewed state reports to see if accurate,…

  5. High Frequency QRS ECG Accurately Detects Cardiomyopathy

    NASA Technical Reports Server (NTRS)

    Schlegel, Todd T.; Arenare, Brian; Poulin, Gregory; Moser, Daniel R.; Delgado, Reynolds

    2005-01-01

    High frequency (HF, 150-250 Hz) analysis over the entire QRS interval of the ECG is more sensitive than conventional ECG for detecting myocardial ischemia. However, the accuracy of HF QRS ECG for detecting cardiomyopathy is unknown. We obtained simultaneous resting conventional and HF QRS 12-lead ECGs in 66 patients with cardiomyopathy (EF = 23.2 ± 6.1%, mean ± SD) and in 66 age- and gender-matched healthy controls using PC-based ECG software recently developed at NASA. The single most accurate ECG parameter for detecting cardiomyopathy was an HF QRS morphological score that takes into consideration the total number and severity of reduced amplitude zones (RAZs) present plus the clustering of RAZs together in contiguous leads. This RAZ score had an area under the receiver operator curve (ROC) of 0.91, and was 88% sensitive, 82% specific and 85% accurate for identifying cardiomyopathy at the optimum score cut-off of 140 points. Although conventional ECG parameters such as the QRS and QTc intervals were also significantly longer in patients than controls (P < 0.001, BBBs excluded), these conventional parameters were less accurate (area under the ROC = 0.77 and 0.77, respectively) than HF QRS morphological parameters for identifying underlying cardiomyopathy. The total amplitude of the HF QRS complexes, as measured by summed root mean square voltages (RMSVs), also differed between patients and controls (33.8 ± 11.5 vs. 41.5 ± 13.6 mV, respectively, P < 0.003), but this parameter was even less accurate in distinguishing the two groups (area under ROC = 0.67) than the HF QRS morphologic and conventional ECG parameters. Diagnostic accuracy was optimal (86%) when the RAZ score from the HF QRS ECG and the QTc interval from the conventional ECG were used simultaneously with cut-offs of ≥ 40 points and ≥ 445 ms, respectively. In conclusion 12-lead HF QRS ECG employing

  6. An Iterative, Low-Cost Strategy to Building Information Systems Allows a Small Jurisdiction Local Health Department to Increase Efficiencies and Expand Services

    PubMed Central

    Shah, Gulzar H.

    2016-01-01

    Objective and Methods: The objective of this case study was to describe the process and outcomes of a small local health department's (LHD's) strategy to build and use information systems. The case study is based on a review of documents and semi-structured interviews with key informants in the Pomperaug District Health Department. Interviews were recorded, transcribed, coded, and analyzed. Results and Conclusions: The case study here suggests that small LHDs can use a low-resource, incremental strategy to build information systems for improving departmental effectiveness and efficiency. Specifically, we suggest that the elements for this department's success were simple information systems, clear vision, consistent leadership, and the involvement, training, and support of staff. PMID:27684628

  7. Lifetime costs of cerebral palsy.

    PubMed

    Kruse, Marie; Michelsen, Susan Ishøy; Flachs, Esben Meulengracht; Brønnum-Hansen, Henrik; Madsen, Mette; Uldall, Peter

    2009-08-01

    This study quantified the lifetime costs of cerebral palsy (CP) in a register-based setting. It was the first study outside the US to assess the lifetime costs of CP. The lifetime costs attributable to CP were divided into three categories: health care costs, productivity costs, and social costs. The population analysed was retrieved from the Danish Cerebral Palsy Register, which covers the eastern part of the country and has registered about half of the Danish population of individuals with CP since 1950. For this study we analysed 2367 individuals with CP, who were born in 1930 to 2000 and were alive in 2000. The prevalence of CP in eastern Denmark was approximately 1.7 per 1000. Information on productivity and the use of health care was retrieved from registers. The lifetime cost of CP was about 860,000 euro for men and about 800,000 euro for women. The largest component was social care costs, particularly during childhood. A sensitivity analysis found that alterations in social care costs had a small effect, whereas lowering the discount rate from 5 to 3 per cent markedly increased total lifetime costs. Discounting decreases the value of costs in the future compared with the present. The high social care costs and productivity costs associated with CP point to a potential gain from labour market interventions that benefit individuals with CP. PMID:19416329
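    The discount-rate sensitivity reported above is easy to reproduce in principle. A minimal sketch follows; the €12,000/year, 70-year cost stream is an illustrative assumption, not the study's data:

```python
def present_value(annual_cost, years, rate):
    """Discounted sum of a constant annual cost stream (end-of-year convention)."""
    return sum(annual_cost / (1.0 + rate) ** t for t in range(1, years + 1))

# Hypothetical stand-in for a lifetime cost stream: 12,000 EUR/year for 70 years.
pv_at_5 = present_value(12_000, 70, 0.05)
pv_at_3 = present_value(12_000, 70, 0.03)
ratio = pv_at_3 / pv_at_5   # ~1.5: lowering the rate raises the total markedly
```

    Costs far in the future are devalued less at 3% than at 5%, which is exactly why the sensitivity analysis found the discount rate to be a dominant driver of total lifetime cost.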

  8. Costing Children's Speech, Language and Communication Interventions

    ERIC Educational Resources Information Center

    Beecham, Jennifer; Law, James; Zeng, Biao; Lindsay, Geoff

    2012-01-01

    Background: There are few economic evaluations of speech and language interventions. Such work requires underpinning by an accurate estimate of the costs of the intervention. This study seeks to address some of the complexities of this task by applying existing approaches of cost estimation to interventions described in published effectiveness…

  9. CLOMP: Accurately Characterizing OpenMP Application Overheads

    SciTech Connect

    Bronevetsky, G; Gyllenhaal, J; de Supinski, B

    2008-02-11

    Despite its ease of use, OpenMP has failed to gain widespread use on large scale systems, largely due to its failure to deliver sufficient performance. Our experience indicates that the cost of initiating OpenMP regions is simply too high for the desired OpenMP usage scenario of many applications. In this paper, we introduce CLOMP, a new benchmark to characterize this aspect of OpenMP implementations accurately. CLOMP complements the existing EPCC benchmark suite to provide simple, easy to understand measurements of OpenMP overheads in the context of application usage scenarios. Our results for several OpenMP implementations demonstrate that CLOMP identifies the amount of work required to compensate for the overheads observed with EPCC. Further, we show that CLOMP also captures limitations for OpenMP parallelization on NUMA systems.

  10. Accurate multiplex gene synthesis from programmable DNA microchips

    NASA Astrophysics Data System (ADS)

    Tian, Jingdong; Gong, Hui; Sheng, Nijing; Zhou, Xiaochuan; Gulari, Erdogan; Gao, Xiaolian; Church, George

    2004-12-01

    Testing the many hypotheses from genomics and systems biology experiments demands accurate and cost-effective gene and genome synthesis. Here we describe a microchip-based technology for multiplex gene synthesis. Pools of thousands of `construction' oligonucleotides and tagged complementary `selection' oligonucleotides are synthesized on photo-programmable microfluidic chips, released, amplified and selected by hybridization to reduce synthesis errors ninefold. A one-step polymerase assembly multiplexing reaction assembles these into multiple genes. This technology enabled us to synthesize all 21 genes that encode the proteins of the Escherichia coli 30S ribosomal subunit, and to optimize their translation efficiency in vitro through alteration of codon bias. This is a significant step towards the synthesis of ribosomes in vitro and should have utility for synthetic biology in general.

  11. Accurate thermoelastic tensor and acoustic velocities of NaCl

    NASA Astrophysics Data System (ADS)

    Marcondes, Michel L.; Shukla, Gaurav; da Silveira, Pedro; Wentzcovitch, Renata M.

    2015-12-01

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  12. Can blind persons accurately assess body size from the voice?

    PubMed

    Pisanski, Katarzyna; Oleszkiewicz, Anna; Sorokowska, Agnieszka

    2016-04-01

    Vocal tract resonances provide reliable information about a speaker's body size that human listeners use for biosocial judgements as well as speech recognition. Although humans can accurately assess men's relative body size from the voice alone, how this ability is acquired remains unknown. In this study, we test the prediction that accurate voice-based size estimation is possible without prior audiovisual experience linking low frequencies to large bodies. Ninety-one healthy congenitally or early blind, late blind and sighted adults (aged 20-65) participated in the study. On the basis of vowel sounds alone, participants assessed the relative body sizes of male pairs of varying heights. Accuracy of voice-based body size assessments significantly exceeded chance and did not differ among participants who were sighted, or congenitally blind or who had lost their sight later in life. Accuracy increased significantly with relative differences in physical height between men, suggesting that both blind and sighted participants used reliable vocal cues to size (i.e. vocal tract resonances). Our findings demonstrate that prior visual experience is not necessary for accurate body size estimation. This capacity, integral to both nonverbal communication and speech perception, may be present at birth or may generalize from broader cross-modal correspondences. PMID:27095264

  13. Accurate thermoelastic tensor and acoustic velocities of NaCl

    SciTech Connect

    Marcondes, Michel L.; Shukla, Gaurav; Silveira, Pedro da; Wentzcovitch, Renata M.

    2015-12-15

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  14. Manufacturing cost analysis of integrated photonic packages

    NASA Astrophysics Data System (ADS)

    Stirk, Charles W.; Liu, Qin; Ball, Matthew V.

    1999-04-01

    This paper analyzes the manufacturing cost of photonic systems using software that combines several methods for accurate cost accounting. Activity-based costing assigns all capital equipment, material, and labor costs directly to the product rather than to overheads. Cost-of-ownership models determine the cost of using machines under different financial and utilization scenarios. Libraries of standard machines, process steps, and process sequences facilitate rapid model building and modification. Using libraries for semiconductor and photonics fabrication, along with packaging and optomechanical assembly, we construct cost models for 2D VCSEL array communication modules. The result of the analysis is that the module cost is driven mainly by the epitaxial material cost, and laser yield limits VCSEL arrays to small scale integration.
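    A cost-of-ownership model of the kind the software combines can be sketched as follows; the formula and all parameter values are illustrative assumptions, not the paper's model:

```python
def cost_per_good_part(capital, lifetime_yr, annual_maint, labor_rate_hr,
                       hours_per_yr, utilization, parts_per_hr, yield_frac):
    """Spread a machine's annualized capital, maintenance and labor over the
    good parts it actually ships; low utilization or yield inflates unit cost."""
    annual_cost = capital / lifetime_yr + annual_maint + labor_rate_hr * hours_per_yr
    good_parts = parts_per_hr * hours_per_yr * utilization * yield_frac
    return annual_cost / good_parts

# Hypothetical epitaxy tool: the per-part cost doubles when yield halves, which
# is how a yield limit (as with the VCSEL arrays above) comes to dominate cost.
base = cost_per_good_part(2_000_000, 5, 100_000, 60, 6000, 0.8, 25, 1.0)
half = cost_per_good_part(2_000_000, 5, 100_000, 60, 6000, 0.8, 25, 0.5)
```

    Because the numerator is fixed while output scales with yield, unit cost is inversely proportional to yield, a simple instance of the yield-driven conclusion above.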

  15. Standard cost elements for technology programs

    NASA Technical Reports Server (NTRS)

    Christensen, Carisa B.; Wagenfuehrer, Carl

    1992-01-01

    The suitable structure for an effective and accurate cost estimate for general purposes is discussed in the context of a NASA technology program. Cost elements are defined for the research, management, and facility-construction portions of technology programs. Attention is given to the mechanisms for ensuring the viability of spending programs, and the role of program managers in effecting timely fund disbursement is established. Formal, structured, and intuitive techniques are discussed for cost-estimate development, and cost-estimate defensibility can be improved with increased documentation. NASA policies for cash management are examined to demonstrate the importance of the ability to obligate funds and the ability to cost contracted funds. The NASA approach to consistent cost justification is set forth with a list of standard cost-element definitions. The cost elements reflect the three primary concerns of cost estimates: the identification of major assumptions, the specification of secondary analytic assumptions, and the status of program factors.

  16. Estimating archiving costs for engineering records

    SciTech Connect

    Stutz, R.A.; Lamartine, B.C.

    1997-02-01

    Information technology has completely changed the concept of record keeping for engineering projects -- the advent of digital records was a momentous discovery, as significant as the invention of the printing press. Digital records allowed huge amounts of information to be stored in a very small space and to be examined quickly. However, digital documents are much more vulnerable to the passage of time than printed documents because the media on which they are stored are easily affected by physical phenomena, such as magnetic fields, oxidation, material decay, and by various environmental factors that may erase the information. Even more important, digital information becomes obsolete because, even if future generations may be able to read it, they may not necessarily be able to interpret it. Engineering projects of all sizes are becoming more dependent on digital records. These records are created on computers used in design, estimating, construction management, and construction. The necessity for the accurate and accessible storage of these documents, generated by computer software systems, is increasing for a number of reasons including legal and environmental issues. This paper will discuss media life considerations and life cycle costs associated with several methods of storing engineering records.

  17. 50 CFR 85.41 - Allowable costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... applicable Federal cost principles in 43 CFR 12.60(b). Purchase of informational signs, program signs, and... Title 50, Wildlife and Fisheries, Use/Acceptance of Funds, § 85.41 Allowable costs. (a) Allowable grant costs are limited to those...

  18. Activity-based costing and its application in a Turkish university hospital.

    PubMed

    Yereli, Ayşe Necef

    2009-03-01

    Resource management in hospitals is of increasing importance in today's global economy. Traditional accounting systems have become inadequate for managing hospital resources and accurately determining service costs. Conversely, the activity-based costing approach to hospital accounting is an effective cost management model that determines costs and evaluates financial performance across departments. Obtaining costs that are more accurate can enable hospitals to analyze and interpret costing decisions and make more accurate budgeting decisions. Traditional and activity-based costing approaches were compared using a cost analysis of gall bladder surgeries in the general surgery department of one university hospital in Manisa, Turkey. PMID:19269382
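    The contrast between the two approaches can be made concrete. In this illustrative sketch (the rates and resource-use figures are invented, not the hospital's data), aggregate costing averages away exactly the variation that activity-based costing preserves:

```python
def aggregate_cost(total_dept_cost, n_procedures):
    """Ratio-of-cost-to-procedure: every procedure gets the department average."""
    return total_dept_cost / n_procedures

def abc_cost(resource_rates, usage):
    """Activity-based costing: charge each procedure only for the activity
    driver units it actually consumes."""
    return sum(resource_rates[k] * usage[k] for k in usage)

# Hypothetical gall-bladder example: two surgeries that look identical under
# aggregate costing but differ once theatre minutes and nursing hours are traced.
rates = {"theatre_min": 12.0, "nursing_hr": 35.0, "consumables": 1.0}
open_surgery = {"theatre_min": 120, "nursing_hr": 40, "consumables": 300}
laparoscopic = {"theatre_min": 60, "nursing_hr": 10, "consumables": 700}
c_open = abc_cost(rates, open_surgery)   # 1440 + 1400 + 300 = 3140
c_lap = abc_cost(rates, laparoscopic)    # 720 + 350 + 700 = 1770
avg = aggregate_cost(c_open + c_lap, 2)  # 2455: hides a ~1400 spread per case
```

    Budgeting on the average would overprice one procedure and underprice the other, which is the kind of distortion the ABC comparison above is designed to expose.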

  19. Activity-based costing and its application in a Turkish university hospital.

    PubMed

    Yereli, Ayşe Necef

    2009-03-01

    Resource management in hospitals is of increasing importance in today's global economy. Traditional accounting systems have become inadequate for managing hospital resources and accurately determining service costs. Conversely, the activity-based costing approach to hospital accounting is an effective cost management model that determines costs and evaluates financial performance across departments. Obtaining costs that are more accurate can enable hospitals to analyze and interpret costing decisions and make more accurate budgeting decisions. Traditional and activity-based costing approaches were compared using a cost analysis of gall bladder surgeries in the general surgery department of one university hospital in Manisa, Turkey.

  20. ACCURATE CHEMICAL MASTER EQUATION SOLUTION USING MULTI-FINITE BUFFERS

    PubMed Central

    Cao, Youfang; Terebus, Anna; Liang, Jie

    2016-01-01

    The discrete chemical master equation (dCME) provides a fundamental framework for studying stochasticity in mesoscopic networks. Because of the multi-scale nature of many networks where reaction rates have large disparity, directly solving dCMEs is intractable due to the exploding size of the state space. It is important to truncate the state space effectively with quantified errors, so accurate solutions can be computed. It is also important to know if all major probabilistic peaks have been computed. Here we introduce the Accurate CME (ACME) algorithm for obtaining direct solutions to dCMEs. With multi-finite buffers for reducing the state space by O(n!), exact steady-state and time-evolving network probability landscapes can be computed. We further describe a theoretical framework of aggregating microstates into a smaller number of macrostates by decomposing a network into independent aggregated birth and death processes, and give an a priori method for rapidly determining steady-state truncation errors. The maximal sizes of the finite buffers for a given error tolerance can also be pre-computed without costly trial solutions of dCMEs. We show exactly computed probability landscapes of three multi-scale networks, namely, a 6-node toggle switch, 11-node phage-lambda epigenetic circuit, and 16-node MAPK cascade network, the latter two with no known solutions. We also show how probabilities of rare events can be computed from first-passage times, another class of unsolved problems challenging for simulation-based techniques due to large separations in time scales. Overall, the ACME method enables accurate and efficient solutions of the dCME for a large class of networks. PMID:27761104
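    The finite-buffer idea can be illustrated on the simplest possible network. The sketch below solves a single-species birth-death process on a truncated state space; it is a toy analogue of state-space truncation, not the ACME algorithm itself:

```python
import numpy as np

def birth_death_steady_state(k_birth, k_death, buffer_size):
    """Stationary distribution of a birth-death process on the finite state
    space {0, ..., buffer_size}: constant synthesis k_birth, first-order
    degradation i * k_death, solved from the truncated generator matrix."""
    n = buffer_size + 1
    A = np.zeros((n, n))          # columns index the source state
    for i in range(n):
        if i < n - 1:             # birth i -> i+1 (blocked at the buffer edge)
            A[i + 1, i] += k_birth
            A[i, i] -= k_birth
        if i > 0:                 # death i -> i-1
            A[i - 1, i] += k_death * i
            A[i, i] -= k_death * i
    # Solve A p = 0 with sum(p) = 1: swap one redundant balance row
    # for the normalization condition.
    M = A.copy()
    M[-1, :] = 1.0
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(M, b)

# For k_birth/k_death = 10 the exact stationary law is Poisson(10); a buffer of
# 40, many standard deviations past the mean, makes the truncation error negligible.
p = birth_death_steady_state(10.0, 1.0, 40)
```

    Choosing the buffer just large enough for a given error tolerance is the design question the ACME method answers a priori for far larger, multi-scale networks.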

  1. Accurately Mapping M31's Microlensing Population

    NASA Astrophysics Data System (ADS)

    Crotts, Arlin

    2004-07-01

    We propose to augment an existing microlensing survey of M31 with source identifications provided by a modest amount of ACS {and WFPC2 parallel} observations to yield an accurate measurement of the masses responsible for microlensing in M31, and presumably much of its dark matter. The main benefit of these data is the determination of the physical {or "einstein"} timescale of each microlensing event, rather than an effective {"FWHM"} timescale, allowing masses to be determined more than twice as accurately as without HST data. The einstein timescale is the ratio of the lensing cross-sectional radius and relative velocities. Velocities are known from kinematics, and the cross-section is directly proportional to the {unknown} lensing mass. We cannot easily measure these quantities without knowing the amplification, hence the baseline magnitude, which requires the resolution of HST to find the source star. This makes a crucial difference because M31 lens mass determinations can be more accurate than those towards the Magellanic Clouds through our Galaxy's halo {for the same number of microlensing events} due to the better constrained geometry in the M31 microlensing situation. Furthermore, our larger survey, just completed, should yield at least 100 M31 microlensing events, more than any Magellanic survey. A small amount of ACS+WFPC2 imaging will deliver the potential of this large database {about 350 nights}. For the whole survey {and a delta-function mass distribution} the mass error should approach only about 15%, or about 6% error in slope for a power-law distribution. These results will better allow us to pinpoint the lens halo fraction, and the shape of the halo lens spatial distribution, and allow generalization/comparison of the nature of halo dark matter in spiral galaxies. In addition, we will be able to establish the baseline magnitude for about 50,000 variable stars, as well as measure an unprecedentedly detailed color-magnitude diagram and luminosity

  2. Accurate upwind methods for the Euler equations

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1993-01-01

    A new class of piecewise linear methods for the numerical solution of the one-dimensional Euler equations of gas dynamics is presented. These methods are uniformly second-order accurate, and can be considered as extensions of Godunov's scheme. With an appropriate definition of monotonicity preservation for the case of linear convection, it can be shown that they preserve monotonicity. Similar to Van Leer's MUSCL scheme, they consist of two key steps: a reconstruction step followed by an upwind step. For the reconstruction step, a monotonicity constraint that preserves uniform second-order accuracy is introduced. Computational efficiency is enhanced by devising a criterion that detects the 'smooth' part of the data where the constraint is redundant. The concept and coding of the constraint are simplified by the use of the median function. A slope steepening technique, which has no effect at smooth regions and can resolve a contact discontinuity in four cells, is described. As for the upwind step, existing and new methods are applied in a manner slightly different from those in the literature. These methods are derived by approximating the Euler equations via linearization and diagonalization. At a 'smooth' interface, Harten, Lax, and Van Leer's one intermediate state model is employed. A modification for this model that can resolve contact discontinuities is presented. Near a discontinuity, either this modified model or a more accurate one, namely Roe's flux-difference splitting, is used. The current presentation of Roe's method, via the conceptually simple flux-vector splitting, not only establishes a connection between the two splittings, but also leads to an admissibility correction with no conditional statement, and an efficient approximation to Osher's approximate Riemann solver. These reconstruction and upwind steps result in schemes that are uniformly second-order accurate and economical at smooth regions, and yield high resolution at discontinuities.
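    The median trick mentioned above is compact enough to show directly. This sketch implements the classic minmod limiter via the median function; it illustrates the idea rather than reproducing the paper's full monotonicity constraint:

```python
def median3(a, b, c):
    """Median of three numbers, the building block used to express limiters compactly."""
    return max(min(a, b), min(max(a, b), c))

def limited_slope(u_left, u_center, u_right):
    """Minmod-limited slope for a piecewise-linear (MUSCL-type) reconstruction.
    minmod(a, b) == median(0, a, b): zero when the one-sided differences
    disagree in sign, otherwise the smaller of the two in magnitude."""
    return median3(0.0, u_center - u_left, u_right - u_center)

# Monotone data keeps a nonzero (but bounded) slope; at an extremum the
# differences change sign and the slope collapses to zero, preserving monotonicity.
smooth = limited_slope(0.0, 1.0, 3.0)    # differences 1 and 2 -> slope 1
extremum = limited_slope(0.0, 1.0, 0.0)  # differences 1 and -1 -> slope 0
```

    Writing the limiter as a median removes the case analysis over signs, which is the coding simplification the abstract credits to the median function.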

  3. Accurate measurement of unsteady state fluid temperature

    NASA Astrophysics Data System (ADS)

    Jaremkiewicz, Magdalena

    2016-07-01

    In this paper, two accurate methods for determining the transient fluid temperature were presented. Measurements were conducted for boiling water since its temperature is known. At the beginning the thermometers are at the ambient temperature and next they are immediately immersed into saturated water. The measurements were carried out with two thermometers of different construction but with the same housing outer diameter equal to 15 mm. One of them is a K-type industrial thermometer widely available commercially. The temperature indicated by the thermometer was corrected considering the thermometers as the first or second order inertia devices. The new design of a thermometer was proposed and also used to measure the temperature of boiling water. Its characteristic feature is a cylinder-shaped housing with the sheath thermocouple located in its center. The temperature of the fluid was determined based on measurements taken in the axis of the solid cylindrical element (housing) using the inverse space marching method. Measurements of the transient temperature of the air flowing through the wind tunnel using the same thermometers were also carried out. The proposed measurement technique provides more accurate results compared with measurements using industrial thermometers in conjunction with simple temperature correction using the inertial thermometer model of the first or second order. By comparing the results, it was demonstrated that the new thermometer allows obtaining the fluid temperature much faster and with higher accuracy in comparison to the industrial thermometer. Accurate measurements of the fast changing fluid temperature are possible due to the low inertia thermometer and fast space marching method applied for solving the inverse heat conduction problem.
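    The first-order inertia correction mentioned above can be sketched in a few lines. The time constant and the synthetic step response below are assumptions for illustration; the paper's inverse space-marching method is more elaborate:

```python
import math

def correct_first_order(times, readings, tau):
    """First-order inertia correction: T_fluid ~= y + tau * dy/dt, with the
    derivative taken by central (one-sided at the ends) differences."""
    n = len(readings)
    corrected = []
    for i in range(n):
        lo = max(i - 1, 0)
        hi = min(i + 1, n - 1)
        dydt = (readings[hi] - readings[lo]) / (times[hi] - times[lo])
        corrected.append(readings[i] + tau * dydt)
    return corrected

# Synthetic check against a known answer: a step to 100 C seen through a
# tau = 4 s sensor reads y(t) = 100 * (1 - exp(-t / tau)); the corrected series
# sits near 100 C long before the raw reading gets there.
tau = 4.0
times = [0.1 * k for k in range(0, 101)]
readings = [100.0 * (1.0 - math.exp(-t / tau)) for t in times]
corrected = correct_first_order(times, readings, tau)
```

    For a true first-order sensor the correction is exact up to the finite-difference error; real thermometers deviate from first-order behavior, which is why the paper's low-inertia design and inverse method outperform this simple model.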

  4. The first accurate description of an aurora

    NASA Astrophysics Data System (ADS)

    Schröder, Wilfried

    2006-12-01

    As technology has advanced, the scientific study of auroral phenomena has increased by leaps and bounds. A look back at the earliest descriptions of aurorae offers interesting insight into how medieval scholars viewed the subjects that we study. Although there are earlier fragmentary references in the literature, the first accurate description of the aurora borealis appears to be that published by the German Catholic scholar Konrad von Megenberg (1309-1374) in his book Das Buch der Natur (The Book of Nature). The book was written between 1349 and 1350.

  5. Accurate density functional thermochemistry for larger molecules.

    SciTech Connect

    Raghavachari, K.; Stefanov, B. B.; Curtiss, L. A.; Lucent Tech.

    1997-06-20

    Density functional methods are combined with isodesmic bond separation reaction energies to yield accurate thermochemistry for larger molecules. Seven different density functionals are assessed for the evaluation of heats of formation, ΔH⁰ (298 K), for a test set of 40 molecules composed of H, C, O and N. The use of bond separation energies results in a dramatic improvement in the accuracy of all the density functionals. The B3-LYP functional has the smallest mean absolute deviation from experiment (1.5 kcal mol⁻¹).

  6. Universality: Accurate Checks in Dyson's Hierarchical Model

    NASA Astrophysics Data System (ADS)

    Godina, J. J.; Meurice, Y.; Oktay, M. B.

    2003-06-01

    In this talk we present high-accuracy calculations of the susceptibility near βc for Dyson's hierarchical model in D = 3. Using linear fitting, we estimate the leading (γ) and subleading (Δ) exponents. Independent estimates are obtained by calculating the first two eigenvalues of the linearized renormalization group transformation. We found γ = 1.29914073 ± 10⁻⁸ and Δ = 0.4259469 ± 10⁻⁷, independently of the choice of local integration measure (Ising or Landau-Ginzburg). After a suitable rescaling, the approximate fixed points for a large class of local measures coincide accurately with a fixed point constructed by Koch and Wittwer.
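
    The linear-fit estimate of the leading exponent can be illustrated with a short sketch: for χ ≈ A(βc − β)^(−γ), the slope of log χ versus log(βc − β) is −γ. This toy version ignores the subleading correction (Δ) that the authors also fit; names are illustrative:

```python
import math

def fit_exponent(beta, chi, beta_c):
    """Least-squares slope of log(chi) vs. log(beta_c - beta); returns gamma.

    For chi ~ A * (beta_c - beta)**(-gamma) the fitted slope is -gamma.
    """
    x = [math.log(beta_c - b) for b in beta]
    y = [math.log(c) for c in chi]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return -slope
```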

  7. CARETS: A prototype regional environmental information system. Volume 6: Cost, accuracy and consistency comparisons of land use maps made from high-altitude aircraft photography and ERTS imagery

    NASA Technical Reports Server (NTRS)

    Alexander, R. H. (Principal Investigator); Fitzpatrick, K. A.

    1975-01-01

    The author has identified the following significant results. Level 2 land use maps produced at three scales (1:24,000, 1:100,000, and 1:250,000) from high altitude photography were compared with each other and with point data obtained in the field. The same procedures were employed to determine the accuracy of the Level 1 land use maps produced at 1:250,000 from high altitude photography and color composite ERTS imagery. Accuracy of the Level 2 maps was 84.9 percent at 1:24,000, 77.4 percent at 1:100,000, and 73.0 percent at 1:250,000. Accuracy of the Level 1 1:250,000 maps was 76.5 percent for aerial photographs and 69.5 percent for ERTS imagery. The cost of Level 2 land use mapping at 1:24,000 was found to be high ($11.93 per sq km). Mapping at 1:100,000 ($1.75 per sq km) was about twice as expensive as mapping at 1:250,000 ($0.88 per sq km), while accuracy increased by only 4.4 percentage points.

  8. Program Tracks Cost Of Travel

    NASA Technical Reports Server (NTRS)

    Mauldin, Lemuel E., III

    1993-01-01

    Travel Forecaster is menu-driven, easy-to-use computer program that plans, forecasts cost, and tracks actual vs. planned cost of business-related travel of division or branch of organization and compiles information into data base to aid travel planner. Ability of program to handle multiple trip entries makes it valuable time-saving device.

  9. Gauging Technology Costs and Benefits

    ERIC Educational Resources Information Center

    Kaestner, Rich

    2007-01-01

    Regardless of the role technology plays in a school district, district personnel should know the costs associated with technology, understand the consequences of technology purchases, and be able to measure the benefits of technology, so they can make more informed decisions. However, determining costs and benefits of current technology or…

  10. Treatment Program Operations and Costs

    PubMed Central

    Broome, Kirk M.; Knight, Danica K.; Joe, George W.; Flynn, Patrick M.

    2011-01-01

    This study investigates how average costs for an episode of care in outpatient drug-free (ODF) treatment relate to clinical intensity (length of stay and weekly counseling hours) and program structure (e.g., size, staffing), controlling for prices paid and selected clientele measures. Based on cost assessments from a naturalistic sample of 67 programs located across the US (using the Treatment Cost Analysis Tool), robust regression techniques showed that programs having 10% longer treatment stays had episode costs 7% higher; those having 10% more weekly counseling hours per client had 4% higher episode costs. Other important factors included wages, amount of counselors’ time conducting sessions, and serving more clients referred from the criminal justice system. The study provides valuable information on treatment program features that relate to costs. Most importantly, cost differences associated with longer stays or more intensive counseling protocols appear modest, and may be justified by improved client outcomes. PMID:22154033
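
    A back-of-the-envelope reading of the figures above, assuming the log-log (elasticity) functional form typical of such cost regressions: an elasticity of about 0.7 maps a 10% longer stay to roughly a 7% cost increase. The helper below is purely illustrative, not the study's robust-regression model:

```python
def pct_effect(elasticity, pct_change):
    """Percent change in cost implied by a log-log elasticity.

    E.g. an elasticity of 0.7 maps a 10% longer stay to a cost increase
    of (1.10**0.7 - 1), about 6.9%, close to the reported 7%.
    """
    return ((1.0 + pct_change / 100.0) ** elasticity - 1.0) * 100.0
```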

  11. Analysis of Costs and Performance

    ERIC Educational Resources Information Center

    Duchesne, Roderick M.

    1973-01-01

    This article outlines a library management information system concerned with total library costs and performance. The system is essentially an adaptation of well-proven industrial and commercial management accounting techniques to the library context. (24 references) (Author)

  12. Accurate shear measurement with faint sources

    SciTech Connect

    Zhang, Jun; Foucaud, Sebastien; Luo, Wentao E-mail: walt@shao.ac.cn

    2015-01-01

    For cosmic shear to become an accurate cosmological probe, systematic errors in the shear measurement method must be unambiguously identified and corrected for. Previous work of this series has demonstrated that cosmic shears can be measured accurately in Fourier space in the presence of background noise and finite pixel size, without assumptions on the morphologies of the galaxy and the PSF. The remaining major source of error is source Poisson noise, due to the finiteness of the source photon number. This problem is particularly important for faint galaxies in space-based weak lensing measurements, and for ground-based images of short exposure times. In this work, we propose a simple and rigorous way of removing the shear bias from the source Poisson noise. Our noise treatment can be generalized for images made of multiple exposures through MultiDrizzle. This is demonstrated with the SDSS and COSMOS/ACS data. With a large ensemble of mock galaxy images of unrestricted morphologies, we show that our shear measurement method can achieve sub-percent level accuracy even for images of signal-to-noise ratio less than 5 in general, making it the most promising technique for cosmic shear measurement in the ongoing and upcoming large scale galaxy surveys.

  13. Accurate basis set truncation for wavefunction embedding

    NASA Astrophysics Data System (ADS)

    Barnes, Taylor A.; Goodpaster, Jason D.; Manby, Frederick R.; Miller, Thomas F.

    2013-07-01

    Density functional theory (DFT) provides a formally exact framework for performing embedded subsystem electronic structure calculations, including DFT-in-DFT and wavefunction theory-in-DFT descriptions. In the interest of efficiency, it is desirable to truncate the atomic orbital basis set in which the subsystem calculation is performed, thus avoiding high-order scaling with respect to the size of the MO virtual space. In this study, we extend a recently introduced projection-based embedding method [F. R. Manby, M. Stella, J. D. Goodpaster, and T. F. Miller III, J. Chem. Theory Comput. 8, 2564 (2012)], 10.1021/ct300544e to allow for the systematic and accurate truncation of the embedded subsystem basis set. The approach is applied to both covalently and non-covalently bound test cases, including water clusters and polypeptide chains, and it is demonstrated that errors associated with basis set truncation are controllable to well within chemical accuracy. Furthermore, we show that this approach allows for switching between accurate projection-based embedding and DFT embedding with approximate kinetic energy (KE) functionals; in this sense, the approach provides a means of systematically improving upon the use of approximate KE functionals in DFT embedding.

  14. Accurate determination of characteristic relative permeability curves

    NASA Astrophysics Data System (ADS)

    Krause, Michael H.; Benson, Sally M.

    2015-09-01

    A recently developed technique to accurately characterize sub-core scale heterogeneity is applied to investigate the factors responsible for flowrate-dependent effective relative permeability curves measured on core samples in the laboratory. The dependency of laboratory measured relative permeability on flowrate has long been both supported and challenged by a number of investigators. Studies have shown that this apparent flowrate dependency is a result of both sub-core scale heterogeneity and outlet boundary effects. However this has only been demonstrated numerically for highly simplified models of porous media. In this paper, flowrate dependency of effective relative permeability is demonstrated using two rock cores, a Berea Sandstone and a heterogeneous sandstone from the Otway Basin Pilot Project in Australia. Numerical simulations of steady-state coreflooding experiments are conducted at a number of injection rates using a single set of input characteristic relative permeability curves. Effective relative permeability is then calculated from the simulation data using standard interpretation methods for calculating relative permeability from steady-state tests. Results show that simplified approaches may be used to determine flowrate-independent characteristic relative permeability provided flow rate is sufficiently high, and the core heterogeneity is relatively low. It is also shown that characteristic relative permeability can be determined at any typical flowrate, and even for geologically complex models, when using accurate three-dimensional models.

  15. How Accurately can we Calculate Thermal Systems?

    SciTech Connect

    Cullen, D; Blomquist, R N; Dean, C; Heinrichs, D; Kalugin, M A; Lee, M; Lee, Y; MacFarlan, R; Nagaya, Y; Trkov, A

    2004-04-20

    I would like to determine how accurately a variety of neutron transport code packages (code and cross section libraries) can calculate simple integral parameters, such as K_eff, for systems that are sensitive to thermal neutron scattering. Since we will only consider theoretical systems, we cannot really determine absolute accuracy compared to any real system. Therefore, rather than accuracy, it would be more precise to say that I would like to determine the spread in answers that we obtain from a variety of code packages. This spread should serve as an excellent indicator of how accurately we can really model and calculate such systems today. Hopefully this will eventually lead to improvements in both our codes and the thermal scattering models that they use. In order to accomplish this I propose a number of extremely simple systems that involve thermal neutron scattering and that can be easily modeled and calculated by a variety of neutron transport codes. These are theoretical systems designed to emphasize the effects of thermal scattering, since that is what we are interested in studying. I have attempted to keep these systems very simple, and yet at the same time they include most, if not all, of the important thermal scattering effects encountered in a large, water-moderated, uranium-fueled thermal system, i.e., our typical thermal reactors.
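
    The "spread in answers" described above can be summarized with simple statistics; a minimal sketch, assuming the common convention of quoting K_eff differences in pcm (1 pcm = 10⁻⁵ in K_eff):

```python
import statistics

def keff_spread(keff_values):
    """Mean, standard deviation, and max-min spread (in pcm) of K_eff results.

    1 pcm = 1e-5 in K_eff, a common unit for quoting code-to-code spreads.
    """
    mean = statistics.mean(keff_values)
    stdev = statistics.pstdev(keff_values)
    spread_pcm = (max(keff_values) - min(keff_values)) * 1e5
    return mean, stdev, spread_pcm
```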

  16. Results of an Experimental Program to Provide Low Cost Computer Searches of the NASA Information File to University Graduate Students in the Southeast. Final Report.

    ERIC Educational Resources Information Center

    Smetana, Frederick O.; Phillips, Dennis M.

    In an effort to increase dissemination of scientific and technological information, a program was undertaken whereby graduate students in science and engineering could request a computer-produced bibliography and/or abstracts of documents identified by the computer. The principal resource was the National Aeronautics and Space Administration…

  17. Providing Low-Cost Information Technology Access to Rural Communities in Developing Countries: What Works? What Pays? OECD Development Centre Working Paper No. 229 (Formerly Webdoc No. 17)

    ERIC Educational Resources Information Center

    Caspary, Georg; O'Connor, David

    2003-01-01

    Rural areas of the developing world are the last frontier of the information technology revolution. Telephone and internet penetration there remains a small fraction of what it is in the developed world. Limited means of electronic communication with the outside world are just one source of isolation of rural communities and economies from the…

  18. Informing mental health policies and services in the EMR: cost-effective deployment of human resources to deliver integrated community-based care.

    PubMed

    Ivbijaro, G; Patel, V; Chisholm, D; Goldberg, D; Khoja, T A M; Edwards, T M; Enum, Y; Kolkiewic, L A

    2015-09-28

    For EMR countries to deliver the expectations of the Global Mental Health Action Plan 2013-2020 & the ongoing move towards universal health coverage, all health & social care providers need to innovate and transform their services to provide evidence-based health care that is accessible, cost-effective & with the best patient outcomes. For the primary and community workforce, this includes general medical practitioners, practice & community nurses, community social workers, housing officers, lay health workers, nongovernmental organizations & civil society, including community spiritual leaders/healers. This paper brings together the current best evidence to support transformation & discusses key approaches to achieve this, including skill mix and/or task shifting and integrated care. The important factors that need to be in place to support skill mix/task shifting and good integrated care are outlined with reference to EMR countries.

  19. Informing mental health policies and services in the EMR: cost-effective deployment of human resources to deliver integrated community-based care.

    PubMed

    Ivbijaro, G; Patel, V; Chisholm, D; Goldberg, D; Khoja, T A M; Edwards, T M; Enum, Y; Kolkiewic, L A

    2015-07-01

    For EMR countries to deliver the expectations of the Global Mental Health Action Plan 2013-2020 & the ongoing move towards universal health coverage, all health & social care providers need to innovate and transform their services to provide evidence-based health care that is accessible, cost-effective & with the best patient outcomes. For the primary and community workforce, this includes general medical practitioners, practice & community nurses, community social workers, housing officers, lay health workers, nongovernmental organizations & civil society, including community spiritual leaders/healers. This paper brings together the current best evidence to support transformation & discusses key approaches to achieve this, including skill mix and/or task shifting and integrated care. The important factors that need to be in place to support skill mix/task shifting and good integrated care are outlined with reference to EMR countries. PMID:26442888

  20. Subvoxel accurate graph search using non-Euclidean graph space.

    PubMed

    Abràmoff, Michael D; Wu, Xiaodong; Lee, Kyungmoo; Tang, Li

    2014-01-01

    Graph search is attractive for the quantitative analysis of volumetric medical images, and especially for layered tissues, because it allows globally optimal solutions in low-order polynomial time. However, because nodes of graphs typically encode evenly distributed voxels of the volume with arcs connecting orthogonally sampled voxels in Euclidean space, segmentation cannot achieve greater precision than a single unit, i.e. the distance between two adjoining nodes, and partial volume effects are ignored. We generalize the graph to non-Euclidean space by allowing non-equidistant spacing between nodes, so that subvoxel accurate segmentation is achievable. Because the number of nodes and edges in the graph remains the same, running time and memory use are similar, while all the advantages of graph search, including global optimality and computational efficiency, are retained. A deformation field calculated from the volume data adaptively changes regional node density so that node density varies with the inverse of the expected cost. We validated our approach using optical coherence tomography (OCT) images of the retina and 3-D MR of the arterial wall, and achieved statistically significant increased accuracy. Our approach allows improved accuracy in volume data acquired with the same hardware, and also, preserved accuracy with lower resolution, more cost-effective, image acquisition equipment. The method is not limited to any specific imaging modality and readily extensible to higher dimensions.

  1. Parallel kinetic Monte Carlo simulation framework incorporating accurate models of adsorbate lateral interactions

    NASA Astrophysics Data System (ADS)

    Nielsen, Jens; d'Avezac, Mayeul; Hetherington, James; Stamatakis, Michail

    2013-12-01

    Ab initio kinetic Monte Carlo (KMC) simulations have been successfully applied for over two decades to elucidate the underlying physico-chemical phenomena on the surfaces of heterogeneous catalysts. These simulations necessitate detailed knowledge of the kinetics of elementary reactions constituting the reaction mechanism, and the energetics of the species participating in the chemistry. The information about the energetics is encoded in the formation energies of gas and surface-bound species, and the lateral interactions between adsorbates on the catalytic surface, which can be modeled at different levels of detail. The majority of previous works accounted for only pairwise-additive first nearest-neighbor interactions. More recently, cluster-expansion Hamiltonians incorporating long-range interactions and many-body terms have been used for detailed estimations of catalytic rate [C. Wu, D. J. Schmidt, C. Wolverton, and W. F. Schneider, J. Catal. 286, 88 (2012)]. In view of the increasing interest in accurate predictions of catalytic performance, there is a need for general-purpose KMC approaches incorporating detailed cluster expansion models for the adlayer energetics. We have addressed this need by building on the previously introduced graph-theoretical KMC framework, and we have developed Zacros, a FORTRAN2003 KMC package for simulating catalytic chemistries. To tackle the high computational cost in the presence of long-range interactions we introduce parallelization with OpenMP. We further benchmark our framework by simulating a KMC analogue of the NO oxidation system established by Schneider and co-workers [J. Catal. 286, 88 (2012)]. We show that taking into account only first nearest-neighbor interactions may lead to large errors in the prediction of the catalytic rate, whereas for accurate estimates thereof, one needs to include long-range terms in the cluster expansion.
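
    A toy analogue of the kind of simulation described above, showing how first nearest-neighbor lateral interactions enter the rate constants of a rejection-free KMC step. This is a minimal 1-D sketch with illustrative names, not the Zacros implementation:

```python
import math
import random

def desorption_rates(occ, nu, E0, eps, kT):
    """Per-site desorption rates on a 1-D periodic lattice.

    An occupied site desorbs with rate nu * exp(-(E0 + n*eps) / kT), where
    n counts occupied first nearest neighbors and eps is the pairwise
    lateral interaction energy entering the activation barrier.
    """
    N = len(occ)
    rates = []
    for i in range(N):
        if not occ[i]:
            rates.append(0.0)
            continue
        n = occ[(i - 1) % N] + occ[(i + 1) % N]
        rates.append(nu * math.exp(-(E0 + n * eps) / kT))
    return rates

def kmc_step(occ, rates, rng):
    """One rejection-free KMC event: pick a site with probability rate/total,
    desorb it, and return the elapsed time drawn from an exponential."""
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    for i, rate in enumerate(rates):
        acc += rate
        if r < acc:
            occ[i] = 0
            break
    return -math.log(1.0 - rng.random()) / total
```

    Because lateral interactions make each rate depend on the local environment, every event changes the rates of neighboring sites, which is what makes long-range cluster-expansion Hamiltonians computationally expensive and motivates the OpenMP parallelization described above.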

  2. 7 CFR 2903.4 - Indirect costs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... AGRICULTURE BIODIESEL FUEL EDUCATION PROGRAM General Information § 2903.4 Indirect costs. (a) For the Biodiesel Fuel Education Program, applicants should use the current indirect cost rate negotiated with...

  3. EPA (Environmental Protection Agency) evaluation of the POWERFuel Extender System under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1983-08-01

    The evaluation of the POWERFuel Extender System was conducted upon the application of the manufacturer. The device is claimed to improve fuel economy and driveability and to reduce exhaust emissions and required engine maintenance. The device is classified by EPA as a vapor-air bleed device. EPA fully considered all of the information submitted by the applicant. The evaluation of the POWERFuel Extender System was based on that information and on EPA's experience with other similar devices. Although, in theory, the introduction of alcohol and water could have a favorable effect on an engine's cleanliness, power, and maintenance requirements, and could even allow some vehicles to use lower octane fuel, data were not submitted to substantiate that the POWERFuel Extender System could provide these benefits.

  4. Highly accurate articulated coordinate measuring machine

    DOEpatents

    Bieg, Lothar F.; Jokiel, Jr., Bernhard; Ensz, Mark T.; Watson, Robert D.

    2003-12-30

    Disclosed is a highly accurate articulated coordinate measuring machine, comprising a revolute joint, comprising a circular encoder wheel, having an axis of rotation; a plurality of marks disposed around at least a portion of the circumference of the encoder wheel; bearing means for supporting the encoder wheel, while permitting free rotation of the encoder wheel about the wheel's axis of rotation; and a sensor, rigidly attached to the bearing means, for detecting the motion of at least some of the marks as the encoder wheel rotates; a probe arm, having a proximal end rigidly attached to the encoder wheel, and having a distal end with a probe tip attached thereto; and coordinate processing means, operatively connected to the sensor, for converting the output of the sensor into a set of cylindrical coordinates representing the position of the probe tip relative to a reference cylindrical coordinate system.
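
    The conversion of sensor output to cylindrical coordinates described above can be sketched for a single revolute joint; the geometry below (one joint, rigid arm of fixed length and height) is a simplifying assumption, not the full articulated machine:

```python
import math

def tip_position(counts, counts_per_rev, arm_length, arm_z):
    """Cylindrical coordinates (r, theta, z) of the probe tip.

    The encoder angle is counts * 2*pi / counts_per_rev; with the probe arm
    rigidly attached to the encoder wheel, the tip sweeps a circle of radius
    arm_length at height arm_z above the reference plane.
    """
    theta = 2.0 * math.pi * counts / counts_per_rev
    return arm_length, theta, arm_z

def to_cartesian(r, theta, z):
    """Convert cylindrical coordinates to Cartesian."""
    return r * math.cos(theta), r * math.sin(theta), z
```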

  5. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  6. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, Douglas D.

    1985-01-01

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  7. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, D.D.

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  8. Micron Accurate Absolute Ranging System: Range Extension

    NASA Technical Reports Server (NTRS)

    Smalley, Larry L.; Smith, Kely L.

    1999-01-01

    The purpose of this research is to investigate Fresnel diffraction as a means of obtaining absolute distance measurements with micron or greater accuracy. It is believed that such a system would prove useful to the Next Generation Space Telescope (NGST) as a non-intrusive, non-contact measuring system for use with secondary concentrator station-keeping systems. The present research attempts to validate past experiments and develop ways to apply the phenomenon of Fresnel diffraction to micron accurate measurement. This report discusses past research on the phenomenon and the basis for using Fresnel diffraction in distance metrology. The apparatus used in the recent investigations, the experimental procedures, and preliminary results are discussed in detail. Continued research and the equipment required to extend the effective range of Fresnel diffraction systems are also described.

  9. Accurate metacognition for visual sensory memory representations.

    PubMed

    Vandenbroucke, Annelinde R E; Sligte, Ilja G; Barrett, Adam B; Seth, Anil K; Fahrenfort, Johannes J; Lamme, Victor A F

    2014-04-01

    The capacity to attend to multiple objects in the visual field is limited. However, introspectively, people feel that they see the whole visual world at once. Some scholars suggest that this introspective feeling is based on short-lived sensory memory representations, whereas others argue that the feeling of seeing more than can be attended to is illusory. Here, we investigated this phenomenon by combining objective memory performance with subjective confidence ratings during a change-detection task. This allowed us to compute a measure of metacognition--the degree of knowledge that subjects have about the correctness of their decisions--for different stages of memory. We show that subjects store more objects in sensory memory than they can attend to but, at the same time, have similar metacognition for sensory memory and working memory representations. This suggests that these subjective impressions are not an illusion but accurate reflections of the richness of visual perception.

  10. Accurate Thermal Stresses for Beams: Normal Stress

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore F.; Pilkey, Walter D.

    2003-01-01

    Formulations for a general theory of thermoelasticity to generate accurate thermal stresses for structural members of aeronautical vehicles were developed in 1954 by Boley. The formulation also provides three normal stresses and a shear stress along the entire length of the beam. The Poisson effect of the lateral and transverse normal stresses on a thermally loaded beam is taken into account in this theory by employing an Airy stress function. The Airy stress function enables the reduction of the three-dimensional thermal stress problem to a two-dimensional one. Numerical results from the general theory of thermoelasticity are compared to those obtained from strength of materials. It is concluded that the theory of thermoelasticity for prismatic beams proposed in this paper can be used instead of strength of materials when precise stress results are desired.

  11. Accurate Thermal Stresses for Beams: Normal Stress

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore F.; Pilkey, Walter D.

    2002-01-01

    Formulations for a general theory of thermoelasticity to generate accurate thermal stresses for structural members of aeronautical vehicles were developed in 1954 by Boley. The formulation also provides three normal stresses and a shear stress along the entire length of the beam. The Poisson effect of the lateral and transverse normal stresses on a thermally loaded beam is taken into account in this theory by employing an Airy stress function. The Airy stress function enables the reduction of the three-dimensional thermal stress problem to a two-dimensional one. Numerical results from the general theory of thermoelasticity are compared to those obtained from strength of materials. It is concluded that the theory of thermoelasticity for prismatic beams proposed in this paper can be used instead of strength of materials when precise stress results are desired.

  12. Accurate metacognition for visual sensory memory representations.

    PubMed

    Vandenbroucke, Annelinde R E; Sligte, Ilja G; Barrett, Adam B; Seth, Anil K; Fahrenfort, Johannes J; Lamme, Victor A F

    2014-04-01

    The capacity to attend to multiple objects in the visual field is limited. However, introspectively, people feel that they see the whole visual world at once. Some scholars suggest that this introspective feeling is based on short-lived sensory memory representations, whereas others argue that the feeling of seeing more than can be attended to is illusory. Here, we investigated this phenomenon by combining objective memory performance with subjective confidence ratings during a change-detection task. This allowed us to compute a measure of metacognition--the degree of knowledge that subjects have about the correctness of their decisions--for different stages of memory. We show that subjects store more objects in sensory memory than they can attend to but, at the same time, have similar metacognition for sensory memory and working memory representations. This suggests that these subjective impressions are not an illusion but accurate reflections of the richness of visual perception. PMID:24549293

  13. Accurate Telescope Mount Positioning with MEMS Accelerometers

    NASA Astrophysics Data System (ADS)

    Mészáros, L.; Jaskó, A.; Pál, A.; Csépány, G.

    2014-08-01

    This paper describes the advantages and challenges of applying microelectromechanical accelerometer systems (MEMS accelerometers) in order to attain precise, accurate, and stateless positioning of telescope mounts. This provides a completely independent method from other forms of electronic, optical, mechanical or magnetic feedback or real-time astrometry. Our goal is to reach the subarcminute range which is considerably smaller than the field-of-view of conventional imaging telescope systems. Here we present how this subarcminute accuracy can be achieved with very cheap MEMS sensors and we also detail how our procedures can be extended in order to attain even finer measurements. In addition, our paper discusses how can a complete system design be implemented in order to be a part of a telescope control system.
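
    A minimal sketch of the underlying idea: at rest, a 3-axis MEMS accelerometer measures only gravity, so mount tilt follows from the direction of the measured acceleration vector. Axis conventions vary between devices, and reaching subarcminute accuracy requires the calibration and averaging the paper describes; this shows only the geometric core, with one common convention assumed:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Pitch and roll (radians) from a static 3-axis accelerometer reading.

    At rest the sensor measures only gravity, so the mount's tilt follows
    from the direction of the measured acceleration vector. Axis/sign
    conventions are an assumption here; devices differ.
    """
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll
```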

  14. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  15. Accurate Weather Forecasting for Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Maddalena, Ronald J.

    2010-01-01

    The NRAO Green Bank Telescope routinely observes at wavelengths from 3 mm to 1 m. As with all mm-wave telescopes, observing conditions depend upon the variable atmospheric water content. The site provides over 100 days/yr when opacities are low enough for good observing at 3 mm, but winds on the open-air structure reduce the time suitable for 3-mm observing, where pointing is critical. Thus, to maximize productivity, the observing wavelength needs to match weather conditions. For 6 years the telescope has used a dynamic scheduling system (recently upgraded; www.gb.nrao.edu/DSS) that requires accurate multi-day forecasts for winds and opacities. Since opacity forecasts are not provided by the National Weather Service (NWS), I have developed an automated system that takes available forecasts, derives forecasted opacities, and deploys the results on the web in user-friendly graphical overviews (www.gb.nrao.edu/~rmaddale/Weather). The system relies on the "North American Mesoscale" models, which are updated by the NWS every 6 hrs, have a 12 km horizontal resolution and 1 hr temporal resolution, run to 84 hrs, and have 60 vertical layers that extend to 20 km. Each forecast consists of a time series of ground conditions, cloud coverage, etc., and, most importantly, temperature, pressure, and humidity as a function of height. I use Liebe's MPM model (Radio Science, 20, 1069, 1985) to determine the absorption in each layer for each hour for 30 observing wavelengths. Radiative transfer then provides, for each hour and wavelength, the total opacity and the radio brightness of the atmosphere, which contributes substantially at some wavelengths to Tsys and the observational noise. Comparisons of measured and forecasted Tsys at 22.2 and 44 GHz imply that the forecasted opacities are good to about 0.01 nepers, which is sufficient for forecasting and accurate calibration. Reliability is high out to 2 days and degrades slowly for longer-range forecasts.
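    The per-wavelength radiative-transfer step described above can be sketched as follows: given each layer's opacity (from the absorption model) and physical temperature, accumulate the total zenith opacity and the atmosphere's radio brightness temperature as seen from the ground. The layer values and function name here are illustrative assumptions, not output of the actual NAM-based pipeline.

```python
import math

def zenith_opacity_and_tb(layer_tau, layer_temp_k):
    """layer_tau: opacity (nepers) of each layer, ordered ground upward.
    layer_temp_k: physical temperature (K) of each layer.
    Returns (total zenith opacity, brightness temperature in K)."""
    total_tau, tb, tau_below = 0.0, 0.0, 0.0
    for tau, temp in zip(layer_tau, layer_temp_k):
        # Emission from this layer, attenuated by the layers between it
        # and the ground-based observer (accumulated as we move upward).
        tb += temp * (1.0 - math.exp(-tau)) * math.exp(-tau_below)
        tau_below += tau
        total_tau += tau
    return total_tau, tb

# Toy 3-layer atmosphere, warm near the ground and cooler aloft.
tau, tb = zenith_opacity_and_tb([0.005, 0.003, 0.002], [280.0, 250.0, 220.0])
```

    Repeating this for each forecast hour and each of the 30 observing wavelengths yields the opacity and sky-brightness time series that feed the Tsys estimates mentioned above.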

  16. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1985-01-01

    Research activities conducted under the auspices of National Aeronautics and Space Administration Cooperative Agreement NCC 9-9 are discussed. During this contract period, research efforts were concentrated in two primary areas. The first area is an investigation of the use of measurement error models as alternatives to least squares regression estimators of crop production or timber biomass. The second primary area of investigation is the estimation of the mixing proportion of two-component mixture models. This report lists publications, technical reports, submitted manuscripts, and oral presentations generated by these research efforts. Possible areas of future research are mentioned.

  17. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1983-01-01

    Alternatives to sampling-theory stratified and regression estimators of crop production and timber biomass were examined. An alternative estimator viewed as especially promising is the errors-in-variables regression estimator. Investigations established the need for caution with this estimator when the ratio of the two error variances is not precisely known.

  18. Accurate low-cost methods for performance evaluation of cache memory systems

    NASA Technical Reports Server (NTRS)

    Laha, Subhasis; Patel, Janak H.; Iyer, Ravishankar K.

    1988-01-01

    Methods of simulation based on statistical techniques are proposed to decrease the need for large trace measurements and for predicting true program behavior. Sampling techniques are applied while the address trace is collected from a workload. This drastically reduces the space and time needed to collect the trace. Simulation techniques are developed to use the sampled data not only to predict the mean miss rate of the cache, but also to provide an empirical estimate of its actual distribution. Finally, a concept of primed cache is introduced to simulate large caches by the sampling-based method.
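    The sampling idea described above can be sketched as follows: simulate a small cache on contiguous samples of an address trace and record a miss rate per sample, giving both a mean estimate and an empirical distribution. This is a minimal direct-mapped illustration with made-up parameters and a cold ("unprimed") cache per sample, not the paper's full method.

```python
# Estimate cache miss rate from trace samples rather than the full trace.

def simulate_sample(addresses, n_sets, block_bytes=16):
    """Direct-mapped cache, cold-started per sample; returns the miss rate."""
    cache = [None] * n_sets                 # one stored tag per set
    misses = 0
    for addr in addresses:
        block = addr // block_bytes         # which memory block this access hits
        s, tag = block % n_sets, block // n_sets
        if cache[s] != tag:                 # tag mismatch or cold set: miss
            cache[s] = tag
            misses += 1
    return misses / len(addresses)

trace = list(range(0, 4096, 4)) * 2         # toy trace: two sweeps over 4 KB
samples = [trace[i:i + 256] for i in range(0, len(trace), 256)]
miss_rates = [simulate_sample(s, n_sets=64) for s in samples]  # distribution
mean_rate = sum(miss_rates) / len(miss_rates)                  # point estimate
```

    The cold start per sample is exactly the bias the paper's "primed cache" concept addresses for large caches: early accesses in each sample miss only because the sampled simulation discarded the cache state between samples.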

  19. Video distribution system cost model

    NASA Technical Reports Server (NTRS)

    Gershkoff, I.; Haspert, J. K.; Morgenstern, B.

    1980-01-01

    A cost model that can be used to systematically identify the costs of procuring and operating satellite-linked communications systems is described. The user defines a network configuration by specifying the location of each participating site, the interconnection requirements, and the transmission paths available for the uplink (studio to satellite), downlink (satellite to audience), and voice talkback (between audience and studio) segments of the network. The model uses this information to calculate the least expensive signal distribution path for each participating site. Cost estimates are broken down by capital, installation, lease, and operations and maintenance. The design of the model permits flexibility in specifying network and cost structure.
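    The model's core step, as described above, amounts to choosing the least expensive of the transmission options available at each site and reporting the cost broken down by category. The option names, cost figures, and function below are hypothetical placeholders, not the model's actual inputs.

```python
# Minimal sketch: pick the cheapest transmission option for one site,
# with costs kept per category (capital, installation, lease, O&M).

def cheapest_option(options):
    """options: {option_name: {category: cost}}; returns (name, cost dict)
    for the option with the lowest total across all categories."""
    return min(options.items(), key=lambda kv: sum(kv[1].values()))

site_options = {
    "C-band downlink":    {"capital": 9000, "install": 1500, "lease": 0,    "o&m": 800},
    "Leased terrestrial": {"capital": 0,    "install": 500,  "lease": 7200, "o&m": 300},
}
name, costs = cheapest_option(site_options)  # cheapest path for this site
```

    Running this per site, then summing the chosen cost dictionaries by category across sites, reproduces the model's network-level capital/installation/lease/O&M breakdown.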

  20. Costing blood products and services.

    PubMed

    Wallace, E L

    1991-05-01

    At present, blood centers and transfusion services have limited alternatives for offsetting the ever-rising costs of health care inputs. In the face of current revenue constraints, cost reduction or cost containment through efficiency improvements or service reduction is the principal available means. Such methods ought to be pursued vigorously by blood bankers with the aid of well-designed costing and other physical measurement systems. Experience indicates, however, that blood bankers, in their attempts to reduce or contain costs, are likely to place undue reliance on cost accounting systems as the means of capturing sought-for benefits. Management must learn enough about methods of costing to judge directly the uses and limitations of the information produced. Such understanding begins with recognition that all costs and cost comparisons should be specific to the purpose for which they are developed. No costing procedure is capable of producing measures generally applicable to all management decisions. A measure relevant to a planning decision is unlikely to be appropriate for performance evaluation. Useful comparisons of costs, or of measures of physical inputs and outputs, among sets of organizations require assurance that the methods of measurement employed are the same and that the sets of organizations from which the measures are drawn are reasonably comparable.