Science.gov

Sample records for accurate cost information

  1. The high cost of accurate knowledge.

    PubMed

    Sutcliffe, Kathleen M; Weber, Klaus

    2003-05-01

    Many business thinkers believe it's the role of senior managers to scan the external environment to monitor contingencies and constraints, and to use that precise knowledge to modify the company's strategy and design. As these thinkers see it, managers need accurate and abundant information to carry out that role. According to that logic, it makes sense to invest heavily in systems for collecting and organizing competitive information. Another school of pundits contends that, since today's complex information often isn't precise anyway, it's not worth going overboard with such investments. In other words, it's not the accuracy and abundance of information that should matter most to top executives--rather, it's how that information is interpreted. After all, the role of senior managers isn't just to make decisions; it's to set direction and motivate others in the face of ambiguities and conflicting demands. Top executives must interpret information and communicate those interpretations--they must manage meaning more than they must manage information. So which of these competing views is the right one? Research conducted by academics Sutcliffe and Weber found that how accurate senior executives are about their competitive environments is indeed less important for strategy and corresponding organizational changes than the way in which they interpret information about their environments. Investments in shaping those interpretations, therefore, may create a more durable competitive advantage than investments in obtaining and organizing more information. And what kinds of interpretations are most closely linked with high performance? Their research suggests that high performers respond positively to opportunities, yet they aren't overconfident in their abilities to take advantage of those opportunities.

  2. You Can Accurately Predict Land Acquisition Costs.

    ERIC Educational Resources Information Center

    Garrigan, Richard

    1967-01-01

    Land acquisition costs were tested for predictability based upon the 1962 assessed valuations of privately held land acquired for campus expansion by the University of Wisconsin from 1963-1965. By correlating the land acquisition costs of 108 properties acquired during the 3 year period with--(1) the assessed value of the land, (2) the assessed…

  3. Managing Information On Costs

    NASA Technical Reports Server (NTRS)

    Taulbee, Zoe A.

    1990-01-01

    Cost Management Model, CMM, software tool for planning, tracking, and reporting costs and information related to costs. Capable of estimating costs, comparing estimated to actual costs, performing "what-if" analyses on estimates of costs, and providing mechanism to maintain data on costs in format oriented to management. Number of supportive cost methods built in: escalation rates, production-learning curves, activity/event schedules, unit production schedules, set of spread distributions, tables of rates and factors defined by user, and full arithmetic capability. Import/export capability possible with 20/20 Spreadsheet available on Data General equipment. Program requires AOS/VS operating system available on Data General MV series computers. Written mainly in FORTRAN 77 but uses SGU (Screen Generation Utility).
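    The summary above lists escalation rates and production-learning curves among the cost methods built into CMM. As a hedged illustration only (CMM itself is FORTRAN 77 on Data General hardware, and its exact formulations are not given here), the conventional forms of those two calculations look like this:

    import math

    def escalate(cost, base_year, target_year, annual_rate):
        # Escalate a cost from its base year to a target year at a fixed annual rate.
        return cost * (1.0 + annual_rate) ** (target_year - base_year)

    def wright_unit_cost(first_unit_cost, unit_number, learning_rate=0.9):
        # Wright production-learning curve: each doubling of cumulative output
        # multiplies the unit cost by the learning rate (e.g. 0.9 for a "90% curve").
        return first_unit_cost * unit_number ** math.log(learning_rate, 2)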

  4. Preparing Rapid, Accurate Construction Cost Estimates with a Personal Computer.

    ERIC Educational Resources Information Center

    Gerstel, Sanford M.

    1986-01-01

    An inexpensive and rapid method for preparing accurate cost estimates of construction projects in a university setting, using a personal computer, purchased software, and one estimator, is described. The case against defined estimates, the rapid estimating system, and adjusting standard unit costs are discussed. (MLW)

  5. How utilities can achieve more accurate decommissioning cost estimates

    SciTech Connect

    Knight, R.

    1999-07-01

    The number of commercial nuclear power plants that are undergoing decommissioning coupled with the economic pressure of deregulation has increased the focus on adequate funding for decommissioning. The introduction of spent-fuel storage and disposal of low-level radioactive waste into the cost analysis raises even greater concern as to the accuracy of the fund calculation basis. The size and adequacy of the decommissioning fund have also played a major part in the negotiations for transfer of plant ownership. For all of these reasons, it is important that the operating plant owner reduce the margin of error in the preparation of decommissioning cost estimates. To date, all of these estimates have been prepared via the building block method. That is, numerous individual calculations defining the planning, engineering, removal, and disposal of plant systems and structures are performed. These activity costs are supplemented by the period-dependent costs reflecting the administration, control, licensing, and permitting of the program. This method will continue to be used in the foreseeable future until adequate performance data are available. The accuracy of the activity cost calculation is directly related to the accuracy of the inventory of plant system components, piping and equipment, and plant structural composition. Typically, it is left up to the cost-estimating contractor to develop this plant inventory. The data are generated by searching and analyzing property asset records, plant databases, piping and instrumentation drawings, piping system isometric drawings, and component assembly drawings. However, experience has shown that these sources may not be up to date, discrepancies may exist, there may be missing data, and the level of detail may not be sufficient. Again, typically, the time constraints associated with the development of the cost estimate preclude perfect resolution of the inventory questions. Another problem area in achieving accurate cost

  6. Managing Costs and Medical Information

    Cancer.gov

    People with cancer may face major financial challenges and need help dealing with the high costs of care. Cancer treatment can be very expensive, even when you have insurance. Learn ways to manage medical information, paperwork, bills, and other records.

  7. Approaching system equilibrium with accurate or not accurate feedback information in a two-route system

    NASA Astrophysics Data System (ADS)

    Zhao, Xiao-mei; Xie, Dong-fan; Li, Qi

    2015-02-01

    With the development of intelligent transport systems, advanced information feedback strategies have been developed to reduce traffic congestion and enhance capacity. Previous strategies provide accurate information to travelers, yet our simulation results show that accurate information can bring negative effects, especially when the feedback is delayed: travelers prefer the route reported to be in the best condition, while delayed information reflects past rather than current traffic conditions. Travelers therefore make wrong routing decisions, which decreases capacity, increases oscillations, and drives the system away from equilibrium. To avoid these negative effects, bounded rationality is taken into account by introducing a boundedly rational threshold BR. When the difference between the two routes is less than BR, the routes are chosen with equal probability. Bounded rationality helps improve efficiency in terms of capacity, oscillations, and the gap from the system equilibrium.
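    A minimal sketch of the boundedly rational route-choice rule described above, assuming reported travel times as the feedback quantity; the function name, interface, and units are illustrative assumptions rather than the paper's model code:

    import random

    def choose_route(reported_time_a, reported_time_b, br_threshold):
        # If the reported difference is below the bounded-rationality threshold BR,
        # the two routes are treated as indistinguishable and one is chosen at random.
        if abs(reported_time_a - reported_time_b) < br_threshold:
            return random.choice(("A", "B"))
        # Otherwise the traveler takes the route the feedback reports as better.
        return "A" if reported_time_a < reported_time_b else "B"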

  8. Counting the cost of not costing HIV health facilities accurately: pay now, or pay more later.

    PubMed

    Beck, Eduard J; Avila, Carlos; Gerbase, Sofia; Harling, Guy; De Lay, Paul

    2012-10-01

    The HIV pandemic continues to be one of our greatest contemporary public health threats. Policy makers in many middle- and low-income countries are in the process of scaling up HIV prevention, treatment and care services in the context of a reduction in international HIV funding due to the global economic downturn. In order to scale up services that are sustainable in the long term, policy makers and implementers need to have access to robust and contemporary strategic information, including financial information on expenditure and cost, in order to be able to plan, implement, monitor and evaluate HIV services. A major problem in middle- and low-income countries continues to be a lack of basic information on the use of services, their cost, outcome and impact, while those few costing studies that have been performed were often not done in a standardized fashion. Some researchers handle this by transposing information from one country to another, developing mathematical or statistical models that rest on assumptions or information that may not be applicable, or using top-down costing methods that only provide global financial costs rather than using bottom-up ingredients-based costing. While these methods provide answers in the short term, countries should develop systematic data collection systems to store, transfer and produce robust and contemporary strategic financial information for stakeholders at local, sub-national and national levels. National aggregated information should act as the main source of financial data for international donors, agencies or other organizations involved with the global HIV response. This paper describes the financial information required by policy makers and other stakeholders to enable them to make evidence-informed decisions and reviews the quantity and quality of the financial information available, as indicated by cost studies published between 1981 and 2008. Among the lessons learned from reviewing these studies, a need was

  9. 77 FR 11191 - Insurance Cost Information Regulation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-24

    ... National Highway Traffic Safety Administration Insurance Cost Information Regulation AGENCY: National... insurance cost information booklet that all car dealers must make available to prospective purchasers... differences in passenger vehicle collision loss experience that could affect auto insurance costs....

  10. Ultra-accurate collaborative information filtering via directed user similarity

    NASA Astrophysics Data System (ADS)

    Guo, Q.; Song, W.-J.; Liu, J.-G.

    2014-07-01

    A key challenge of collaborative filtering (CF) information filtering is how to obtain reliable and accurate results with the help of peers' recommendations. Since the similarities from small-degree users to large-degree users would be larger than the ones in the opposite direction, the large-degree users' selections are recommended extensively by the traditional second-order CF algorithms. By considering the users' similarity direction and the second-order correlations to depress the influence of mainstream preferences, we present the directed second-order CF (HDCF) algorithm specifically to address the challenge of accuracy and diversity of the CF algorithm. The numerical results for two benchmark data sets, MovieLens and Netflix, show that the accuracy of the new algorithm outperforms the state-of-the-art CF algorithms. Compared with the CF algorithm based on random walks proposed by Liu et al. (Int. J. Mod. Phys. C, 20 (2009) 285), the average ranking score reaches 0.0767 and 0.0402, an enhancement of 27.3% and 19.1% for MovieLens and Netflix, respectively. In addition, the diversity, precision and recall are also enhanced greatly. Without relying on any context-specific information, tuning the similarity direction of CF algorithms can yield accurate and diverse recommendations. This work suggests that the user similarity direction is an important factor in improving personalized recommendation performance.
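    A hedged sketch of an asymmetric (directed) user similarity of the kind discussed above: normalizing the overlap by the source user's degree makes s(i→j) and s(j→i) differ when the users' degrees differ. This is a generic illustration, not the exact HDCF definition from the paper:

    def directed_similarity(items_of_i, items_of_j):
        # items_of_i, items_of_j: sets of items selected by users i and j.
        # Dividing by the source user's degree makes the measure directional, so
        # small-degree users look more similar to large-degree users than vice versa.
        if not items_of_i:
            return 0.0
        return len(items_of_i & items_of_j) / len(items_of_i)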

  11. Cost Information and Formula Funding: New Approaches.

    ERIC Educational Resources Information Center

    Allen, Richard H., Ed.; Topping, James R., Ed.

    A report based on a conference on the impact of cost information on statewide budgeting and planning is presented. The conference was organized around case-study reports on the use of cost information in higher education budgeting in Florida, Indiana, Washington, and Wisconsin. Rather than publishing convention proceedings, the case studies were…

  12. Unifying cost and information in information-theoretic competitive learning.

    PubMed

    Kamimura, Ryotaro

    2005-01-01

    In this paper, we introduce costs into the framework of information maximization and try to maximize the ratio of information to its associated cost. We have shown that competitive learning is realized by maximizing mutual information between input patterns and competitive units. One shortcoming of the method is that maximizing information does not necessarily produce representations faithful to input patterns. Information maximization primarily focuses on some parts of input patterns that are used to distinguish between patterns. Therefore, we introduce the cost, which represents the average distance between input patterns and connection weights. By minimizing the cost, final connection weights reflect input patterns well. We applied the method to a political data analysis, a voting attitude problem and a Wisconsin cancer problem. Experimental results confirmed that, when the cost was introduced, representations faithful to input patterns were obtained. In addition, improved generalization performance was obtained within a relatively short learning time.

  13. A new accurate pill recognition system using imprint information

    NASA Astrophysics Data System (ADS)

    Chen, Zhiyuan; Kamata, Sei-ichiro

    2013-12-01

    Great achievements in modern medicine benefit human beings. They have also brought about an explosive growth in the pharmaceuticals currently on the market. In daily life, pharmaceuticals sometimes confuse people when they are found unlabeled. In this paper, we propose an automatic pill recognition technique to solve this problem. It functions mainly based on the imprint feature of the pills, which is extracted by the proposed MSWT (modified stroke width transform) and described by WSC (weighted shape context). Experiments show that our proposed pill recognition method can reach an accuracy rate of up to 92.03% within the top 5 ranks when classifying more than 10 thousand query pill images into around 2000 categories.

  14. Building accurate geometric models from abundant range imaging information

    SciTech Connect

    Diegert, C.; Sackos, J.; Nellums, R.

    1997-05-01

    The authors define two simple metrics for accuracy of models built from range imaging information. They apply the metric to a model built from a recent range image taken at the Laser Radar Development and Evaluation Facility (LDERF), Eglin AFB, using a Scannerless Range Imager (SRI) from Sandia National Laboratories. They also present graphical displays of the residual information produced as a byproduct of this measurement, and discuss mechanisms that these data suggest for further improvement in the performance of this already impressive SRI.

  15. A Low Cost Course Information Syndication System

    ERIC Educational Resources Information Center

    Ajayi, A. O.; Olajubu, E. A.; Bello, S. A.; Soriyan, H. A.; Obamuyide, A. V.

    2011-01-01

    This study presents a cost effective, reliable, and convenient mobile web-based system to facilitate the dissemination of course information to students, to support interaction that goes beyond the classroom. The system employed the Really Simple Syndication (RSS) technology and was developed using Rapid Application Development (RAD) methodology.…

  16. The Good, the Strong, and the Accurate: Preschoolers' Evaluations of Informant Attributes

    ERIC Educational Resources Information Center

    Fusaro, Maria; Corriveau, Kathleen H.; Harris, Paul L.

    2011-01-01

    Much recent evidence shows that preschoolers are sensitive to the accuracy of an informant. Faced with two informants, one of whom names familiar objects accurately and the other inaccurately, preschoolers subsequently prefer to learn the names and functions of unfamiliar objects from the more accurate informant. This study examined the inference…

  17. Waste Management Facilities Cost Information Report

    SciTech Connect

    Feizollahi, F.; Shropshire, D.

    1992-10-01

    The Waste Management Facility Cost Information (WMFCI) Report, commissioned by the US Department of Energy (DOE), develops planning life-cycle cost (PLCC) estimates for treatment, storage, and disposal facilities. This report contains PLCC estimates versus capacity for 26 different facility cost modules. A procedure to guide DOE and its contractor personnel in the use of estimating data is also provided. Estimates in the report apply to five distinctive waste streams: low-level waste, low-level mixed waste, alpha contaminated low-level waste, alpha contaminated low-level mixed waste, and transuranic waste. The report addresses five different treatment types: incineration, metal/melting and recovery, shredder/compaction, solidification, and vitrification. Data in this report allows the user to develop PLCC estimates for various waste management options.

  18. Chromatography paper as a low-cost medium for accurate spectrophotometric assessment of blood hemoglobin concentration.

    PubMed

    Bond, Meaghan; Elguea, Carlos; Yan, Jasper S; Pawlowski, Michal; Williams, Jessica; Wahed, Amer; Oden, Maria; Tkaczyk, Tomasz S; Richards-Kortum, Rebecca

    2013-06-21

    Anemia affects a quarter of the world's population, and a lack of appropriate diagnostic tools often prevents treatment in low-resource settings. Though the HemoCue 201+ is an appropriate device for diagnosing anemia in low-resource settings, the high cost of disposables ($0.99 per test in Malawi) limits its availability. We investigated using spectrophotometric measurement of blood spotted on chromatography paper as a low-cost (<$0.01 per test) alternative to HemoCue cuvettes. For this evaluation, donor blood was diluted with plasma to simulate anemia, a micropipette spotted blood on paper, and a bench-top spectrophotometer validated the approach before the development of a low-cost reader. We optimized impregnating paper with chemicals to lyse red blood cells, paper type, drying time, wavelengths measured, and sensitivity to variations in volume of blood, and we validated our approach using patient samples. Lysing the blood cells with sodium deoxycholate dried in Whatman Chr4 chromatography paper gave repeatable results, and the absorbance difference between 528 nm and 656 nm was stable over time in measurements taken up to 10 min after sample preparation. The method was insensitive to the amount of blood spotted on the paper over the range of 5 μL to 25 μL. We created a low-cost, handheld reader to measure the transmission of paper cuvettes at these optimal wavelengths. Training and validating our method with patient samples on both the spectrometer and the handheld reader showed that both devices are accurate to within 2 g dL(-1) of the HemoCue device for 98% and 95% of samples, respectively.
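    A minimal sketch of the measurement idea described above: the hemoglobin estimate is driven by the absorbance difference between 528 nm and 656 nm, mapped to concentration through a calibration against a reference device. The linear form and the parameter names are assumptions for illustration, not the paper's calibration:

    def hemoglobin_g_per_dl(absorbance_528nm, absorbance_656nm, slope, intercept):
        # The 528 nm - 656 nm absorbance difference is reported to be stable for up to
        # 10 min after sample preparation; slope and intercept would come from fitting
        # paper-cuvette readings against a reference method such as the HemoCue.
        delta_a = absorbance_528nm - absorbance_656nm
        return slope * delta_a + intercept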

  19. Measuring nonlinear oscillations using a very accurate and low-cost linear optical position transducer

    NASA Astrophysics Data System (ADS)

    Donoso, Guillermo; Ladera, Celso L.

    2016-09-01

    An accurate linear optical displacement transducer of about 0.2 mm resolution over a range of ∼40 mm is presented. This device consists of a stack of thin cellulose acetate strips, each strip longitudinally slid ∼0.5 mm over the preceding one so that one end of the stack becomes a stepped wedge of constant step. A narrowed light beam from a white LED, orthogonally incident, crosses the wedge at a known point, the transmitted intensity being detected with a phototransistor whose emitter is connected to a diode. We present the interesting analytical proof that the voltage across the diode is linearly dependent upon the ordinate of the point where the light beam falls on the wedge, as well as the experimental validation of such a theoretical proof. Applications to nonlinear oscillations are then presented—including the interesting case of a body moving under dry friction, and the more advanced case of an oscillator in a quartic energy potential—whose time-varying positions were accurately measured with our transducer. Our sensing device can resolve the dynamics of an object attached to it with great accuracy and precision at a cost considerably less than that of a linear neutral density wedge. The technique used to assemble the wedge of acetate strips is described.

  20. A Low-Cost, Accurate, and High-Precision Fluid Dispensing System for Microscale Application.

    PubMed

    Das, Champak; Wang, Guochun; Nguyen, Chien

    2017-04-01

    We present here the development of a low-cost, accurate, and precise fluid dispensing system. It can be used with peristaltic or any other pump to improve the flow characteristics. The dispensing system has a range of 1 to 100 µL with accuracy of ~99.5% and standard deviation at ~150 nL over the entire range. The system developed does not depend on the accuracy or precision of the driving pump; therefore, any positive displacement pump can be used to get similar accuracy and precision, which gives an opportunity to reduce the cost of the system. The dispensing system does not require periodic calibration and can also be miniaturized for microfluidic application. Although primarily designed for aqueous liquid, it can be extended for different nonconductive liquids as well with modifications. The unit is further used for near real-time measurement of lactate from microdialysate. The individual components can easily be made disposable or sterilized for use in biomedical applications.

  1. The minimal work cost of information processing

    PubMed Central

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-01-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics. PMID:26151678
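    Schematically, the statement above can be written as follows (notation assumed here; the paper's exact single-shot result uses a smoothed entropy measure rather than the plain conditional entropy):

    % Minimal work to implement a logical process, with H measured in bits:
    % the entropy of the discarded information, conditioned on the computation's output.
    W_{\min} \;\approx\; k_{B} T \ln 2 \cdot H(\mathrm{discarded} \mid \mathrm{output})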

  2. Fast Conceptual Cost Estimating of Aerospace Projects Using Historical Information

    NASA Technical Reports Server (NTRS)

    Butts, Glenn

    2007-01-01

    Accurate estimates can be created in less than a minute by applying powerful techniques and algorithms to create an Excel-based parametric cost model. In five easy steps you will learn how to normalize your company's historical cost data to the new project parameters. This paper provides a complete, easy-to-understand, step-by-step how-to guide. Such a guide does not seem to currently exist. Over 2,000 hours of research, data collection, and trial and error, and thousands of lines of Excel Visual Basic for Applications (VBA) code were invested in developing these methods. While VBA is not required to use this information, it increases the power and aesthetics of the model. Implementing all of the steps described, while not required, will increase the accuracy of the results.
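    As a hedged sketch of the normalization step described above, historical costs are typically escalated to a common year and scaled to the new project's driving parameter before entering a parametric estimate; the escalation rate, the choice of mass as the driver, and the scaling exponent below are placeholder assumptions, not values from the paper:

    def normalize_historical_cost(historical_cost, base_year, target_year, annual_escalation,
                                  historical_mass_kg, new_mass_kg, scaling_exponent=0.6):
        # Bring the historical cost to the target year's dollars...
        escalated = historical_cost * (1.0 + annual_escalation) ** (target_year - base_year)
        # ...then scale it to the new project's size with an assumed power law.
        return escalated * (new_mass_kg / historical_mass_kg) ** scaling_exponent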

  3. A Low-Cost Modular Platform for Heterogeneous Data Acquisition with Accurate Interchannel Synchronization.

    PubMed

    Blanco-Claraco, José Luis; López-Martínez, Javier; Torres-Moreno, José Luis; Giménez-Fernández, Antonio

    2015-10-27

    Most experimental fields of science and engineering require the use of data acquisition systems (DAQ), devices in charge of sampling and converting electrical signals into digital data and, typically, performing all of the required signal preconditioning. Since commercial DAQ systems are normally focused on specific types of sensors and actuators, systems engineers may need to employ mutually-incompatible hardware from different manufacturers in applications demanding heterogeneous inputs and outputs, such as small-signal analog inputs, differential quadrature rotatory encoders or variable current outputs. A common undesirable side effect of heterogeneous DAQ hardware is the lack of an accurate synchronization between samples captured by each device. To solve such a problem with low-cost hardware, we present a novel modular DAQ architecture comprising a base board and a set of interchangeable modules. Our main design goal is the ability to sample all sources at predictable, fixed sampling frequencies, with a reduced synchronization mismatch (<1 µs) between heterogeneous signal sources. We present experiments in the field of mechanical engineering, illustrating vibration spectrum analyses from piezoelectric accelerometers and, as a novelty in these kinds of experiments, the spectrum of quadrature encoder signals. Part of the design and software will be publicly released online.

  4. A Low-Cost Modular Platform for Heterogeneous Data Acquisition with Accurate Interchannel Synchronization

    PubMed Central

    Blanco-Claraco, José Luis; López-Martínez, Javier; Torres-Moreno, José Luis; Giménez-Fernández, Antonio

    2015-01-01

    Most experimental fields of science and engineering require the use of data acquisition systems (DAQ), devices in charge of sampling and converting electrical signals into digital data and, typically, performing all of the required signal preconditioning. Since commercial DAQ systems are normally focused on specific types of sensors and actuators, systems engineers may need to employ mutually-incompatible hardware from different manufacturers in applications demanding heterogeneous inputs and outputs, such as small-signal analog inputs, differential quadrature rotatory encoders or variable current outputs. A common undesirable side effect of heterogeneous DAQ hardware is the lack of an accurate synchronization between samples captured by each device. To solve such a problem with low-cost hardware, we present a novel modular DAQ architecture comprising a base board and a set of interchangeable modules. Our main design goal is the ability to sample all sources at predictable, fixed sampling frequencies, with a reduced synchronization mismatch (<1 μs) between heterogeneous signal sources. We present experiments in the field of mechanical engineering, illustrating vibration spectrum analyses from piezoelectric accelerometers and, as a novelty in these kinds of experiments, the spectrum of quadrature encoder signals. Part of the design and software will be publicly released online. PMID:26516865

  5. 75 FR 57284 - Agency Information Collection Activities: Cost Submission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-20

    ... SECURITY U.S. Customs and Border Protection Agency Information Collection Activities: Cost Submission... other Federal agencies to comment on an information collection requirement concerning: Cost Submission... forms of information technology; and (e) the annual costs burden to respondents or record keepers...

  6. Accurate and cost-effective natural resource data from super large scale aerial photography

    NASA Astrophysics Data System (ADS)

    Grotefendt, Richard Alan

    Increasing amounts and types of timely and accurate data are required for monitoring to ensure compliance with natural resource regulatory requirements. This study developed a cost-effective method to partially fulfill these data requirements using super large scale aerial photography (Scale: greater than 1:2,000). Two synchronized, metric, Rolleiflex 70mm (2.76in) cameras mounted 12m (40ft) apart on a rigid platform and carried at 5.6 km/hr (3 knots) by a helicopter collected this high resolution, 3D imagery from Alaska and Washington. The overlapping photo pairs provided 3D views of natural resource objects as fine as twigs. The 12m (40ft) inter-camera distance improved ground visibility between tree crowns of dense old growth forests. Analytical stereoplotters and the application of photogrammetric principles enabled measurement and interpretation of photo objects such as trees and their height in a cost-effective way. Horizontal and vertical measurement accuracy was within 2% and 3% of field measurement, respectively. Forest inventory and riparian buffer monitoring applications were used to test this method. Although field work is still required to develop photo-field relationships unique to each ecosystem and for quality assurance, the photo estimates of individual tree height, volume, diameter, type, and location, as well as down tree decay class and landing spot, plot timber volume, and area were comparable to and may replace approximately 95% of field effort. For example, the average of the absolute differences between field and photo estimates for tree height was 2.4m (7.8ft) (s.d. = 2.1m (6.8ft), n = 376), diameter at breast height (1.4m (4.5ft) above ground on uphill tree side) was 5.8cm (2.3in) (s.d. = 5.6cm (2.2in), n = 109), and plot volume in gross board feet was within 10.9% to 13.4% (n = 10) depending on the estimator used. Forest type was correctly classified 99.4% (n = 180) of the time. Timber inventory, species identification, sample

  7. Towards Contactless, Low-Cost and Accurate 3D Fingerprint Identification.

    PubMed

    Kumar, Ajay; Kwong, Cyril

    2015-03-01

    Human identification using fingerprint impressions has been widely studied and employed for more than 2000 years. Despite new advancements in 3D imaging technologies, a widely accepted representation of 3D fingerprint features and matching methodology has yet to emerge. This paper investigates 3D representation of widely employed 2D minutiae features by recovering and incorporating (i) minutiae height z and (ii) its 3D orientation φ information and illustrates an effective matching strategy for matching popular minutiae features extended in 3D space. One of the obstacles preventing emerging 3D fingerprint identification systems from replacing the conventional 2D fingerprint system lies in their bulk and high cost, which mainly results from the use of structured lighting systems or multiple cameras. This paper attempts to address such key limitations of current 3D fingerprint technologies by developing a single camera-based 3D fingerprint identification system. We develop a generalized 3D minutiae matching model and recover extended 3D fingerprint features from the reconstructed 3D fingerprints. The 2D fingerprint images acquired for the 3D fingerprint reconstruction can themselves be employed for performance improvement, as illustrated in the work detailed in this paper. This paper also attempts to answer one of the most fundamental questions on the availability of inherent discriminable information from 3D fingerprints. The experimental results are presented on a database of 240 clients' 3D fingerprints, which is made publicly available to further research efforts in this area, and illustrate the discriminant power of 3D minutiae representation and matching to achieve performance improvement.

  8. 75 FR 76022 - Agency Information Collection Activities: Cost Submission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-07

    ... SECURITY Customs and Border Protection Agency Information Collection Activities: Cost Submission AGENCY: U... approval in accordance with the Paperwork Reduction Act: Cost Submission (CBP Form 247). This is a proposed... forms of information. Title: Cost Submission. OMB Number: 1651-0028. Form Number: 247. Abstract:...

  9. Major Trends and Portents Related to Information Costs.

    ERIC Educational Resources Information Center

    Wilson, John H., Jr.

    Managers' having to account for cost of information activities is here to stay. Budgeting is going to become more stringent and imaginative. Costs should not be equated with human values--psychological and subjective--which apparently many managers do, feeling that having to cost information activities is degrading. Some trends are: Buy products…

  10. Verify by Genability - Providing Solar Customers with Accurate Reports of Utility Bill Cost Savings

    SciTech Connect

    2015-12-01

    The National Renewable Energy Laboratory (NREL), partnering with Genability and supported by the U.S. Department of Energy's SunShot Incubator program, independently verified the accuracy of Genability's monthly cost savings.

  11. Single-sideband modulator accurately reproduces phase information in 2-Mc signals

    NASA Technical Reports Server (NTRS)

    Strenglein, H. F.

    1966-01-01

    Phase-locked oscillator system employing solid state components acts as a single-sideband modulator to accurately reproduce phase information in 2-Mc signals. This system is useful in telemetry, aircraft communications and position-finding stations, and VHF test circuitry.

  12. Managerial Cost Accounting for a Technical Information Center.

    ERIC Educational Resources Information Center

    Helmkamp, John G.

    A two-fold solution to the cost information deficiency problem is proposed. A formal managerial cost accounting system is designed expressly for the two information services of retrospective search and selective dissemination. The system was employed during a trial period to test its effectiveness in a technical information center. Once…

  13. 75 FR 5169 - Insurance Cost Information Regulation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-01

    ... that all car dealers must make available to prospective purchasers, pursuant to 49 CFR 582.4. This... compares differences in insurance costs of different makes and models of passenger cars based...

  14. 76 FR 6516 - Insurance Cost Information Regulation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-04

    ... that all car dealers must make available to prospective purchasers, pursuant to 49 CFR 582.4. This... compares differences in insurance costs of different makes and models of passenger cars based...

  15. 78 FR 71558 - Insurance Cost Information Regulation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    ... aspects of existing vehicle designs and include new or better safety technologies in future vehicle... vehicle damage susceptibility information to consumers. NHTSA plans to use this information to meet a... spending on maintenance of automobiles and repair of crash damage, the Motor Vehicle Information and...

  16. Rapid, cost-effective and accurate quantification of Yucca schidigera Roezl. steroidal saponins using HPLC-ELSD method.

    PubMed

    Tenon, Mathieu; Feuillère, Nicolas; Roller, Marc; Birtić, Simona

    2017-04-15

    Yucca GRAS-labelled saponins have been and are increasingly used in the food/feed, pharmaceutical and cosmetic industries. Existing techniques for Yucca steroidal saponin quantification remain either inaccurate and misleading, or accurate but time-consuming and cost-prohibitive. The method reported here addresses all of the above challenges. The HPLC/ELSD technique is an accurate and reliable method that yields results of appropriate repeatability and reproducibility. This method does not over- or under-estimate levels of steroidal saponins. The HPLC/ELSD method does not require a pure standard for each and every saponin in order to quantify the group of steroidal saponins. The method is a time- and cost-effective technique that is suitable for routine industrial analyses. HPLC/ELSD methods yield saponin fingerprints specific to the plant species. As the method is capable of distinguishing saponin profiles from taxonomically distant species, it can unravel plant adulteration issues.

  17. A Cost Benefit Technique for R & D Based Information.

    ERIC Educational Resources Information Center

    Stern, B. T.

    A cost benefit technique consisting of the following five phases is proposed: (a) specific objectives of the service, (b) measurement of work flow, (c) work costing, (d) charge to users of the information service, and (e) equating demand and cost. In this approach, objectives are best stated by someone not routinely concerned with the individual…

  18. Species Distribution 2.0: An Accurate Time- and Cost-Effective Method of Prospection Using Street View Imagery

    PubMed Central

    Schwoertzig, Eugénie; Millon, Alexandre

    2016-01-01

    Species occurrence data provide crucial information for biodiversity studies in the current context of global environmental changes. Such studies often rely on a limited number of occurrence data collected in the field and on pseudo-absences arbitrarily chosen within the study area, which reduces the value of these studies. To overcome this issue, we propose an alternative method of prospection using geo-located street view imagery (SVI). Following a standardised protocol of virtual prospection using both vertical (aerial photographs) and horizontal (SVI) perceptions, we have surveyed 1097 randomly selected cells across Spain (0.1x0.1 degree, i.e. 20% of Spain) for the presence of Arundo donax L. (Poaceae). In total we have detected A. donax in 345 cells, thus substantially expanding beyond the now two-centuries-old field-derived record, which described A. donax in only 216 cells. Among the field occurrence cells, 81.1% were confirmed by SVI prospection to be consistent with species presence. In addition, we recorded, by SVI prospection, 752 absences, i.e. cells where A. donax was considered absent. We have also compared the outcomes of climatic niche modeling based on SVI data against those based on field data. Using generalized linear models fitted with bioclimatic predictors, we have found SVI data to provide far more compelling results in terms of niche modeling than does field data as classically used in SDM. This original, cost- and time-effective method provides the means to accurately locate highly visible taxa, reinforce absence data, and predict species distribution without long and expensive in situ prospection. At this time, the majority of available SVI data is restricted to human-disturbed environments that have road networks. However, SVI is becoming increasingly available in natural areas, which means the technique has considerable potential to become an important factor in future biodiversity studies. PMID:26751565

  19. Accurate protein structure modeling using sparse NMR data and homologous structure information.

    PubMed

    Thompson, James M; Sgourakis, Nikolaos G; Liu, Gaohua; Rossi, Paolo; Tang, Yuefeng; Mills, Jeffrey L; Szyperski, Thomas; Montelione, Gaetano T; Baker, David

    2012-06-19

    While information from homologous structures plays a central role in X-ray structure determination by molecular replacement, such information is rarely used in NMR structure determination because it can be incorrect, both locally and globally, when evolutionary relationships are inferred incorrectly or there has been considerable evolutionary structural divergence. Here we describe a method that allows robust modeling of protein structures of up to 225 residues by combining (1)H(N), (13)C, and (15)N backbone and (13)Cβ chemical shift data, distance restraints derived from homologous structures, and a physically realistic all-atom energy function. Accurate models are distinguished from inaccurate models generated using incorrect sequence alignments by requiring that (i) the all-atom energies of models generated using the restraints are lower than models generated in unrestrained calculations and (ii) the low-energy structures converge to within 2.0 Å backbone rmsd over 75% of the protein. Benchmark calculations on known structures and blind targets show that the method can accurately model protein structures, even with very remote homology information, to a backbone rmsd of 1.2-1.9 Å relative to the conventionally determined NMR ensembles and of 0.9-1.6 Å relative to X-ray structures for well-defined regions of the protein structures. This approach facilitates the accurate modeling of protein structures using backbone chemical shift data without need for side-chain resonance assignments and extensive analysis of NOESY cross-peak assignments.

  20. Evaluation of a low-cost and accurate ocean temperature logger on subsurface mooring systems

    SciTech Connect

    Tian, Chuan; Deng, Zhiqun; Lu, Jun; Xu, Xiaoyang; Zhao, Wei; Xu, Ming

    2014-06-23

    Monitoring seawater temperature is important to understanding evolving ocean processes. To monitor internal waves or ocean mixing, a large number of temperature loggers are typically mounted on subsurface mooring systems to obtain high-resolution temperature data at different water depths. In this study, we redesigned and evaluated a compact, low-cost, self-contained, high-resolution and high-accuracy ocean temperature logger, TC-1121. The newly designed TC-1121 loggers are smaller, more robust, and their sampling intervals can be automatically changed by indicated events. They have been widely used in many mooring systems to study internal wave and ocean mixing. The logger’s fundamental design, noise analysis, calibration, drift test, and a long-term sea trial are discussed in this paper.

  1. DRG-based per diem payment system matches costs more accurately.

    PubMed

    Brannen, T J

    1999-04-01

    Some managed care organizations use the DRG hospital payment method developed for Medicare to set case rates. Unfortunately, when such a method is used in a risk-sharing arrangement, hospital and physician incentives are misaligned. Hospitals and payers would benefit from using a hospital reimbursement model that calculates inpatient per diem payments for medical and surgical cases by classifying DRGs in tiers and ranking the tiers according to how resource-intensive they are. DRGs provide the means for a rational classification system of per diem rates that recognizes cases where the expected resources are going to be higher or lower than the average per diem amount. If payers use per diem rates that are weighted according to a DRG classification, hospital payments can correlate closely with the actual costs per day for a specific case, rather than an average for all surgical or medical admissions.
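    A toy sketch of the tiered per diem idea described above: DRGs are grouped into tiers ranked by resource intensity, each tier carries a weight, and the payment is the weighted per diem times the length of stay. The tier weights and base rate are invented for illustration only:

    def per_diem_payment(tier, length_of_stay_days, base_per_diem=1500.0,
                         tier_weights=(0.7, 1.0, 1.4, 2.0)):
        # tier: index into tier_weights, ordered from least to most resource-intensive.
        return tier_weights[tier] * base_per_diem * length_of_stay_days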

  2. Accurately measuring volume of soil samples using low cost Kinect 3D scanner

    NASA Astrophysics Data System (ADS)

    van der Sterre, B.; Hut, R.; Van De Giesen, N.

    2012-12-01

    The 3D scanner of the Kinect game controller can be used to increase the accuracy and efficiency of determining in situ soil moisture content. Soil moisture is one of the principal hydrological variables in both the water and energy interactions between soil and atmosphere. Current in situ measurements of soil moisture either rely on indirect measurements (of electromagnetic constants or heat capacity) or on physically taking a sample and weighing it in a lab. The bottleneck in accurately retrieving soil moisture using samples is determining the volume of the sample. Currently this is mostly done by the very time-consuming "sand cone method" in which the volume where the sample used to sit is filled with sand. We show that the 3D scanner that is part of the $150 game controller extension "Kinect" can be used to make 3D scans before and after taking the sample. The accuracy of this method is tested by scanning forms of known volume. This method is less time-consuming and less error-prone than using a sand cone.

  3. Accurately measuring volume of soil samples using low cost Kinect 3D scanner

    NASA Astrophysics Data System (ADS)

    van der Sterre, Boy-Santhos; Hut, Rolf; van de Giesen, Nick

    2013-04-01

    The 3D scanner of the Kinect game controller can be used to increase the accuracy and efficiency of determining in situ soil moisture content. Soil moisture is one of the principal hydrological variables in both the water and energy interactions between soil and atmosphere. Current in situ measurements of soil moisture either rely on indirect measurements (of electromagnetic constants or heat capacity) or on physically taking a sample and weighing it in a lab. The bottleneck in accurately retrieving soil moisture using samples is determining the volume of the sample. Currently this is mostly done by the very time-consuming "sand cone method" in which the volume where the sample used to sit is filled with sand. We show that the 3D scanner that is part of the $150 game controller extension "Kinect" can be used to make 3D scans before and after taking the sample. The accuracy of this method is tested by scanning forms of known volume. This method is less time-consuming and less error-prone than using a sand cone.
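    A minimal sketch of the volume computation implied by the two records above: subtract the after-sampling surface scan from the before-sampling scan and integrate the difference over the grid. The array layout, units, and gridding step are assumptions for illustration:

    import numpy as np

    def excavated_volume_m3(height_before, height_after, cell_area_m2):
        # height_before, height_after: 2D arrays of surface height (m) on the same grid,
        # e.g. gridded from Kinect depth scans taken before and after sampling.
        removed = np.clip(height_before - height_after, 0.0, None)  # ignore noise below the old surface
        return float(removed.sum() * cell_area_m2)                  # integrate removed depth over the grid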

  4. A cost-effective transparency-based digital imaging for efficient and accurate wound area measurement.

    PubMed

    Li, Pei-Nan; Li, Hong; Wu, Mo-Li; Wang, Shou-Yu; Kong, Qing-You; Zhang, Zhen; Sun, Yuan; Liu, Jia; Lv, De-Cheng

    2012-01-01

    Wound measurement is an objective and direct way to trace the course of wound healing and to evaluate therapeutic efficacy. Nevertheless, the accuracy and efficiency of current measurement methods need to be improved. Taking advantage of the reliability of transparency tracing and the accuracy of computer-aided digital imaging, a transparency-based digital imaging approach was established, by which data from 340 wound tracings were collected from 6 experimental groups (8 rats/group) at 8 experimental time points (Day 1, 3, 5, 7, 10, 12, 14 and 16) and archived in order onto a transparency model sheet. This sheet was scanned and its image was saved in JPG form. Since a set of standard area units from 1 mm(2) to 1 cm(2) was integrated into the sheet, the tracing areas in the JPG image were measured directly, using the "Magnetic lasso tool" in the Adobe Photoshop program. The pixel values (PVs) of individual outlined regions were obtained and recorded at an average speed of 27 seconds/region. All PV data were saved in an Excel spreadsheet and their corresponding areas were calculated simultaneously by the formula Y (PV of the outlined region)/X (PV of standard area unit) × Z (area of standard unit). It took a researcher less than 3 hours to finish the area calculation of 340 regions. In contrast, over 3 hours were expended by three skillful researchers to accomplish the same work with the traditional transparency-based method. Moreover, unlike the results obtained traditionally, little variation was found among the data calculated by different persons or with standard area units of different sizes and shapes. Given its accuracy, reproducibility and efficiency, this transparency-based digital imaging approach would be of significant value in basic wound healing research and clinical practice.
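    The area conversion stated above (Y/X × Z) is a simple pixel-count ratio; a minimal sketch, with parameter names chosen here for clarity:

    def region_area_mm2(region_pixel_value, standard_unit_pixel_value, standard_unit_area_mm2=1.0):
        # Y (pixels in the outlined region) / X (pixels in a standard area unit)
        # * Z (known area of that unit) gives the physical area of the region.
        return region_pixel_value / standard_unit_pixel_value * standard_unit_area_mm2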

  5. A Cost-Benefit and Accurate Method for Assessing Microalbuminuria: Single versus Frequent Urine Analysis.

    PubMed

    Hemmati, Roholla; Gharipour, Mojgan; Khosravi, Alireza; Jozan, Mahnaz

    2013-01-01

    Background. The purpose of this study was to determine whether a single test for microalbuminuria yields a reliable conclusion, thereby saving costs. Methods. This cross-sectional study included a total of 126 consecutive persons. Microalbuminuria was assessed from two fasting random urine specimens, one collected on arrival at the clinic and another in the morning one week later. Results. Overall, 17 of the 126 participants had microalbuminuria; 12 of them were also identified by the single assessment, giving a sensitivity of 70.6%, a specificity of 100%, a PPV of 100%, an NPV of 95.6%, and an accuracy of 96.0%. The measured sensitivity, specificity, PPV, NPV, and accuracy in hypertensive patients were 73.3%, 100%, 100%, 94.8%, and 95.5%, respectively. These rates in the nonhypertensive group were 50.0%, 100%, 100%, 97.3%, and 97.4%, respectively. According to the ROC curve analysis, a single measurement of UACR had high value for discriminating impaired from normal renal function (c = 0.989). A single measurement of urinary albumin concentration also had high discriminative value for the diagnosis of kidney damage (c = 0.995). Conclusion. Single testing of both UACR and urine albumin level, rather than frequent testing, leads to high diagnostic sensitivity, specificity, and accuracy as well as high predictive values in the total population and in hypertensive subgroups.
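    The reported figures follow from the standard 2x2 confusion-table definitions; a short sketch, using the counts implied by the abstract as a worked example:

    def diagnostic_metrics(tp, fp, tn, fn):
        # Sensitivity, specificity, PPV, NPV and accuracy from a 2x2 confusion table.
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
            "accuracy": (tp + tn) / (tp + fp + tn + fn),
        }

    # Counts implied above: 12 of 17 cases detected by the single test, no false
    # positives among the 109 non-cases.
    # diagnostic_metrics(tp=12, fp=0, tn=109, fn=5)
    # -> sensitivity 0.706, specificity 1.0, PPV 1.0, NPV 0.956, accuracy 0.960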

  6. How accurately can we estimate energetic costs in a marine top predator, the king penguin?

    PubMed

    Halsey, Lewis G; Fahlman, Andreas; Handrich, Yves; Schmidt, Alexander; Woakes, Anthony J; Butler, Patrick J

    2007-01-01

    King penguins (Aptenodytes patagonicus) are one of the greatest consumers of marine resources. However, while their influence on the marine ecosystem is likely to be significant, only an accurate knowledge of their energy demands will indicate their true food requirements. Energy consumption has been estimated for many marine species using the heart rate-rate of oxygen consumption (f(H) - V(O2)) technique, and the technique has been applied successfully to answer eco-physiological questions. However, previous studies on the energetics of king penguins, based on developing or applying this technique, have raised a number of issues about the degree of validity of the technique for this species. These include the predictive validity of the present f(H) - V(O2) equations across different seasons and individuals and during different modes of locomotion. In many cases, these issues also apply to other species for which the f(H) - V(O2) technique has been applied. In the present study, the accuracy of three prediction equations for king penguins was investigated based on validity studies and on estimates of V(O2) from published, field f(H) data. The major conclusions from the present study are: (1) in contrast to that for walking, the f(H) - V(O2) relationship for swimming king penguins is not affected by body mass; (2) prediction equation (1), log(V(O2)) = -0.279 + 1.24log(f(H)) + 0.0237t - 0.0157log(f(H))t, derived in a previous study, is the most suitable equation presently available for estimating V(O2) in king penguins for all locomotory and nutritional states. A number of possible problems associated with producing an f(H) - V(O2) relationship are discussed in the present study. Finally, a statistical method to include easy-to-measure morphometric characteristics, which may improve the accuracy of f(H) - V(O2) prediction equations, is explained.
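    Applying prediction equation (1) is direct once heart rate and the time-related term t are known; a sketch assuming base-10 logarithms (the usual convention for such equations; units and the exact definition of t should be checked against the original study):

    import math

    def estimate_vo2(f_h, t):
        # Equation (1): log(VO2) = -0.279 + 1.24 log(fH) + 0.0237 t - 0.0157 log(fH) t
        log_vo2 = -0.279 + 1.24 * math.log10(f_h) + 0.0237 * t - 0.0157 * math.log10(f_h) * t
        return 10.0 ** log_vo2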

  7. Cas9-chromatin binding information enables more accurate CRISPR off-target prediction

    PubMed Central

    Singh, Ritambhara; Kuscu, Cem; Quinlan, Aaron; Qi, Yanjun; Adli, Mazhar

    2015-01-01

    The CRISPR system has become a powerful biological tool with a wide range of applications. However, improving targeting specificity and accurately predicting potential off-targets remains a significant goal. Here, we introduce a web-based CRISPR/Cas9 Off-target Prediction and Identification Tool (CROP-IT) that performs improved off-target binding and cleavage site predictions. Unlike existing prediction programs that solely use DNA sequence information, CROP-IT integrates whole-genome-level biological information from existing Cas9 binding and cleavage data sets. Utilizing whole-genome chromatin state information from 125 human cell types further enhances its computational prediction power. Comparative analyses on experimentally validated datasets show that CROP-IT outperforms existing computational algorithms in predicting both Cas9 binding as well as cleavage sites. With a user-friendly web-interface, CROP-IT outputs a scored and ranked list of potential off-targets that enables improved guide RNA design and more accurate prediction of Cas9 binding or cleavage sites. PMID:26032770

  8. Waste management facilities cost information for hazardous waste. Revision 1

    SciTech Connect

    Shropshire, D.; Sherick, M.; Biagi, C.

    1995-06-01

    This report contains preconceptual designs and planning level life-cycle cost estimates for managing hazardous waste. The report's information on treatment, storage, and disposal modules can be integrated to develop total life-cycle costs for various waste management options. A procedure to guide the US Department of Energy and its contractor personnel in the use of cost estimation data is also summarized in this report.

  9. Waste Management Facilities cost information for low-level waste

    SciTech Connect

    Shropshire, D.; Sherick, M.; Biadgi, C.

    1995-06-01

    This report contains preconceptual designs and planning level life-cycle cost estimates for managing low-level waste. The report's information on treatment, storage, and disposal modules can be integrated to develop total life-cycle costs for various waste management options. A procedure to guide the US Department of Energy and its contractor personnel in the use of cost estimation data is also summarized in this report.

  10. Improving Training Cost Information at the Naval Avionics Center

    DTIC Science & Technology

    1991-12-01

    LIST OF REFERENCES (excerpt): 1. Cooper, Robin, "You Need a New Cost System When...," Harvard Business Review, January-February 1989. 2. Ames, B. Charles, and Hlavacek, James D., "Vital Truths About Managing Your Costs," Harvard Business Review, January-February 1990. 3. Konsynski, Benn R., and McFarlan, F. Warren, "Information Partnerships--Shared Data, Shared Scale," Harvard Business Review, September-October 1990. 4. Kaplan, Robert S., "One Cost...

  11. The utility of accurate mass and LC elution time information in the analysis of complex proteomes

    SciTech Connect

    Norbeck, Angela D.; Monroe, Matthew E.; Adkins, Joshua N.; Anderson, Kevin K.; Daly, Don S.; Smith, Richard D.

    2005-08-01

    Theoretical tryptic digests of all predicted proteins from the genomes of three organisms of varying complexity were evaluated for specificity and possible utility of combined peptide accurate mass and predicted LC normalized elution time (NET) information. The uniqueness of each peptide was evaluated using its combined mass (+/- 5 ppm and 1 ppm) and NET value (no constraint, +/- 0.05 and 0.01 on a 0-1 NET scale). The set of peptides both underestimates actual biological complexity due to the lack of specific modifications, and overestimates the expected complexity since many proteins will not be present in the sample or observable on the mass spectrometer because of dynamic range limitations. Once a peptide is identified from an LC-MS/MS experiment, its mass and elution time is representative of a unique fingerprint for that peptide. The uniqueness of that fingerprint in comparison to that for the other peptides present is indicative of the ability to confidently identify that peptide based on accurate mass and NET measurements. These measurements can be made using HPLC coupled with high resolution MS in a high-throughput manner. Results show that for organisms with comparatively small proteomes, such as Deinococcus radiodurans, modest mass and elution time accuracies are generally adequate for peptide identifications. For more complex proteomes, increasingly accurate measurements are required. However, the majority of proteins should be uniquely identifiable by using LC-MS with mass accuracies within +/- 1 ppm and elution time measurements within +/- 0.01 NET.
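    The uniqueness test described above amounts to asking whether any other peptide falls inside the same accurate-mass and elution-time window; a small sketch using the tightest tolerances quoted (+/- 1 ppm, +/- 0.01 NET):

    def is_uniquely_identifiable(peptide, other_peptides, ppm_tol=1.0, net_tol=0.01):
        # peptide: (monoisotopic_mass_da, normalized_elution_time) for the peptide of interest;
        # other_peptides: the same pair for every other candidate peptide in the proteome.
        mass, net = peptide
        for other_mass, other_net in other_peptides:
            ppm_difference = abs(other_mass - mass) / mass * 1e6
            if ppm_difference <= ppm_tol and abs(other_net - net) <= net_tol:
                return False  # another peptide shares the same mass/NET window
        return True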

  12. Accurate, fast and cost-effective diagnostic test for monosomy 1p36 using real-time quantitative PCR.

    PubMed

    Cunha, Pricila da Silva; Pena, Heloisa B; D'Angelo, Carla Sustek; Koiffmann, Celia P; Rosenfeld, Jill A; Shaffer, Lisa G; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5-0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs.

  13. Accurate, Fast and Cost-Effective Diagnostic Test for Monosomy 1p36 Using Real-Time Quantitative PCR

    PubMed Central

    Cunha, Pricila da Silva; Pena, Heloisa B.; D'Angelo, Carla Sustek; Koiffmann, Celia P.; Rosenfeld, Jill A.; Shaffer, Lisa G.; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5–0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs. PMID:24839341

  14. What Do Information Technology Support Services Really Cost?

    ERIC Educational Resources Information Center

    Leach, Karen; Smallen, David

    1998-01-01

    A study examined the cost of information-technology support services in higher education institutions. The report describes the project's origins and work to date and reports initial results in three areas: network services, desktop repair services, and administrative information systems, looking in each case at economies of scale, outsourcing…

  15. The utility of accurate mass and LC elution time information in the analysis of complex proteomes

    PubMed Central

    Norbeck, Angela D.; Monroe, Matthew E.; Adkins, Joshua N.; Smith, Richard D.

    2007-01-01

    Theoretical tryptic digests of all predicted proteins from the genomes of three organisms of varying complexity were evaluated for specificity and possible utility of combined peptide accurate mass and predicted LC normalized elution time (NET) information. The uniqueness of each peptide was evaluated using its combined mass (+/− 5 ppm and 1 ppm) and NET value (no constraint, +/− 0.05 and 0.01 on a 0–1 NET scale). The set of peptides both underestimates actual biological complexity due to the lack of specific modifications, and overestimates the expected complexity since many proteins will not be present in the sample or observable on the mass spectrometer because of dynamic range limitations. Once a peptide is identified from an LC-MS/MS experiment, its mass and elution time is representative of a unique fingerprint for that peptide. The uniqueness of that fingerprint in comparison to that for the other peptides present is indicative of the ability to confidently identify that peptide based on accurate mass and NET measurements. These measurements can be made using HPLC coupled with high resolution MS in a high-throughput manner. Results show that for organisms with comparatively small proteomes, such as Deinococcus radiodurans, modest mass and elution time accuracies are generally adequate for peptide identifications. For more complex proteomes, increasingly accurate measurements are required. However, the majority of proteins should be uniquely identifiable by using LC-MS with mass accuracies within +/− 1 ppm and elution time measurements within +/− 0.01 NET. PMID:15979333

  16. Thermodynamic Costs of Information Processing in Sensory Adaptation

    PubMed Central

    Sartori, Pablo; Granger, Léo; Lee, Chiu Fan; Horowitz, Jordan M.

    2014-01-01

    Biological sensory systems react to changes in their surroundings. They are characterized by fast response and slow adaptation to varying environmental cues. Insofar as sensory adaptive systems map environmental changes to changes of their internal degrees of freedom, they can be regarded as computational devices manipulating information. Landauer established that information is ultimately physical, and its manipulation subject to the entropic and energetic bounds of thermodynamics. Thus the fundamental costs of biological sensory adaptation can be elucidated by tracking how the information the system has about its environment is altered. These bounds are particularly relevant for small organisms, which unlike everyday computers, operate at very low energies. In this paper, we establish a general framework for the thermodynamics of information processing in sensing. With it, we quantify how during sensory adaptation information about the past is erased, while information about the present is gathered. This process produces entropy larger than the amount of old information erased and has an energetic cost bounded by the amount of new information written to memory. We apply these principles to the E. coli's chemotaxis pathway during binary ligand concentration changes. In this regime, we quantify the amount of information stored by each methyl group and show that receptors consume energy in the range of the information-theoretic minimum. Our work provides a basis for further inquiries into more complex phenomena, such as gradient sensing and frequency response. PMID:25503948
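
    Read schematically, the two bounds stated in this abstract can be written as information-thermodynamic inequalities. The rendering below is an informal paraphrase (information measured in nats); the paper's precise definitions of the erased and written information may differ:

```latex
\[
  \Delta S_{\mathrm{tot}} \;\ge\; k_B \, I_{\mathrm{erased}},
  \qquad
  W \;\ge\; k_B T \, I_{\mathrm{written}},
\]
```

    where Delta S_tot is the total entropy produced during adaptation, I_erased the information about the past environment that is erased, W the energetic cost, and I_written the new information written to memory.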

  17. The Costs of Information Technology and the Electronic Library.

    ERIC Educational Resources Information Center

    Tebbetts, Diane R.

    2000-01-01

    Discusses the impact of information technology requirements on the costs of electronic libraries. Addresses key questions concerning hardware, software and network installation and upgrades and provides strategies for dealing with the needs for continuous funding and long-term financing that are essential for keeping up with the requirements of…

  18. Accurate path integral molecular dynamics simulation of ab-initio water at near-zero added cost

    NASA Astrophysics Data System (ADS)

    Elton, Daniel; Fritz, Michelle; Soler, José; Fernandez-Serra, Marivi

    It is now established that nuclear quantum motion plays an important role in determining water's structure and dynamics. These effects are important to consider when evaluating DFT functionals and attempting to develop better ones for water. The standard way of treating nuclear quantum effects, path integral molecular dynamics (PIMD), multiplies the number of energy/force calculations by the number of beads, which is typically 32. Here we introduce a method whereby PIMD can be incorporated into a DFT molecular dynamics simulation at virtually zero cost. The method is based on the cluster (many body) expansion of the energy. We first subtract the DFT monomer energies, using a custom DFT-based monomer potential energy surface. The evolution of the PIMD beads is then performed using only the more-accurate Partridge-Schwenke monomer energy surface. The DFT calculations are done using the centroid positions. Various bead thermostats can be employed to speed up the sampling of the quantum ensemble. The method bears some resemblance to multiple timestep algorithms and other schemes used to speed up PIMD with classical force fields. We show that our method correctly captures some of the key effects of nuclear quantum motion on both the structure and dynamics of water. We acknowledge support from DOE Award No. DE-FG02-09ER16052 (D.E.) and DOE Early Career Award No. DE-SC0003871 (M.V.F.S.).
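
    A plausible reading of the energy decomposition sketched above, for bead k with centroid configuration R-bar and water monomers m, is the following (an illustrative reconstruction, not necessarily the authors' exact expression):

```latex
\[
  E\!\left(R^{(k)}\right) \;\approx\;
  \Big[\, E^{\mathrm{DFT}}\!\big(\bar{R}\big)
        \;-\; \sum_{m} E^{\mathrm{DFT}}_{\mathrm{mono}}\!\big(\bar{R}_m\big) \Big]
  \;+\; \sum_{m} E^{\mathrm{PS}}_{\mathrm{mono}}\!\big(R^{(k)}_m\big),
\]
```

    so the expensive DFT terms are evaluated once per step at the centroid, while only the cheap Partridge-Schwenke monomer surface is evaluated per bead; the bead-to-bead spread of the potential then comes entirely from the intramolecular monomer terms.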

  19. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.
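
    For context, the modal structural equations of motion that such aeroelastic solvers integrate alongside the flow equations take the generic form below; ENSAERO's exact formulation (generalized masses, damping treatment, coupling terms) may differ:

```latex
\[
  [M]\{\ddot{q}\} + [C]\{\dot{q}\} + [K]\{q\} = \{Q_a(t)\},
\]
```

    where q are the modal (generalized) coordinates and Q_a(t) the generalized aerodynamic forces obtained by projecting the Euler or Navier-Stokes surface pressures onto the structural mode shapes.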

  20. Information Technology: A Tool to Cut Health Care Costs

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Maly, K. J.; Overstreet, C. M.; Foudriat, E. C.

    1996-01-01

    Old Dominion University embarked on a project to see how current computer technology could be applied to reduce the cost and/or to improve the efficiency of health care services. We designed and built a prototype for an integrated medical record system (MRS). The MRS is written in Tool Command Language/Toolkit (Tcl/Tk). While the initial version of the prototype had patient information hard coded into the system, later versions used an INGRES database for storing patient information. Currently, we have proposed an object-oriented model for implementing MRS. These projects involve developing information systems for physicians and medical researchers to enhance their ability for improved treatment at reduced costs. The move to computerized patient records is well underway, several standards exist for laboratory records, and several groups are working on standards for other portions of the patient record.

  1. Highly Accurate Prediction of Protein-Protein Interactions via Incorporating Evolutionary Information and Physicochemical Characteristics

    PubMed Central

    Li, Zheng-Wei; You, Zhu-Hong; Chen, Xing; Gui, Jie; Nie, Ru

    2016-01-01

    Protein-protein interactions (PPIs) occur at almost all levels of cell functions and play crucial roles in various cellular processes. Thus, identification of PPIs is critical for deciphering the molecular mechanisms and further providing insight into biological processes. Although a variety of high-throughput experimental techniques have been developed to identify PPIs, the PPI pairs identified by experimental approaches cover only a small fraction of the whole PPI network, and furthermore, those approaches have inherent disadvantages: they are time-consuming, expensive, and prone to high false-positive rates. Therefore, it is urgent and imperative to develop automatic in silico approaches to predict PPIs efficiently and accurately. In this article, we propose a novel mixed physicochemical and evolutionary-based feature extraction method for predicting PPIs using our newly developed discriminative vector machine (DVM) classifier. The improvements of the proposed method consist mainly of introducing an effective feature extraction method that can capture discriminative features from the evolutionary-based information and physicochemical characteristics, and then employing a powerful and robust DVM classifier. To the best of our knowledge, this is the first time that a DVM model has been applied to the field of bioinformatics. When applying the proposed method to the Yeast and Helicobacter pylori (H. pylori) datasets, we obtain excellent prediction accuracies of 94.35% and 90.61%, respectively. The computational results indicate that our method is effective and robust for predicting PPIs, and can be taken as a useful supplementary tool to the traditional experimental methods for future proteomics research. PMID:27571061

  2. The earnings of informal carers: wage differentials and opportunity costs.

    PubMed

    Heitmueller, Axel; Inglis, Kirsty

    2007-07-01

    A substantial proportion of working age individuals in Britain are looking after sick, disabled or elderly people, often combining their work and caring responsibilities. Previous research has shown that informal care is linked with substantial opportunity costs for the individual due to forgone wages as a result of non-labour market participation. In this paper we show that informal carers exhibit further disadvantages even when participating. Using the British Household Panel Study (BHPS) we decompose wage differentials and show that carers can expect lower returns for a given set of characteristics, with this wage penalty varying along the pay distribution and by gender. Furthermore, opportunity costs from forgone wages and wage penalties are estimated and found to be substantial.

  3. 40 CFR 2.311 - Special rules governing certain information obtained under the Motor Vehicle Information and Cost...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... information obtained under the Motor Vehicle Information and Cost Savings Act. 2.311 Section 2.311 Protection... and Cost Savings Act. (a) Definitions. For the purposes of this section: (1) Act means the Motor Vehicle Information and Cost Savings Act, as amended, 15 U.S.C. 1901 et seq. (2) Average fuel economy...

  4. Accurate refinement of docked protein complexes using evolutionary information and deep learning.

    PubMed

    Akbal-Delibas, Bahar; Farhoodi, Roshanak; Pomplun, Marc; Haspel, Nurit

    2016-06-01

    One of the major challenges for protein docking methods is to accurately discriminate native-like structures from false positives. Docking methods are often inaccurate and the results have to be refined and re-ranked to obtain native-like complexes and remove outliers. In a previous work, we introduced AccuRefiner, a machine learning based tool for refining protein-protein complexes. Given a docked complex, the refinement tool produces a small set of refined versions of the input complex, with lower root-mean-square-deviation (RMSD) of atomic positions with respect to the native structure. The method employs a unique ranking tool that accurately predicts the RMSD of docked complexes with respect to the native structure. In this work, we use a deep learning network with a similar set of features and five layers. We show that a properly trained deep learning network can accurately predict the RMSD of a docked complex with 1.40 Å error margin on average, by approximating the complex relationship between a wide set of scoring function terms and the RMSD of a docked structure. The network was trained on 35000 unbound docking complexes generated by RosettaDock. We tested our method on 25 different putative docked complexes produced also by RosettaDock for five proteins that were not included in the training data. The results demonstrate that the high accuracy of the ranking tool enables AccuRefiner to consistently choose the refinement candidates with lower RMSD values compared to the coarsely docked input structures.
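
    The regression idea, predicting RMSD from a vector of scoring-function terms with a multi-layer network, can be sketched as follows; the features, layer sizes, and synthetic data are placeholders and do not reproduce AccuRefiner's actual configuration or training set.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_absolute_error

# Synthetic stand-in for scoring-function terms of docked complexes
# (e.g. van der Waals, electrostatics, solvation, ...) and their true RMSDs.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 20))                                       # 20 hypothetical score terms
y = np.abs(X[:, :5].sum(axis=1)) + rng.normal(scale=0.5, size=5000)   # fake RMSD values (A)

# Five hidden layers, loosely mirroring the "five layers" mentioned above.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(128, 64, 64, 32, 16),
                 max_iter=500, random_state=0),
)
model.fit(X[:4000], y[:4000])

pred = model.predict(X[4000:])
print(f"MAE on held-out decoys: {mean_absolute_error(y[4000:], pred):.2f} A")
```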

  5. Polyallelic structural variants can provide accurate, highly informative genetic markers focused on diagnosis and therapeutic targets: Accuracy vs. Precision.

    PubMed

    Roses, A D

    2016-02-01

    Structural variants (SVs) include all insertions, deletions, and rearrangements in the genome, with several common types of nucleotide repeats including single sequence repeats, short tandem repeats, and insertion-deletion length variants. Polyallelic SVs provide highly informative markers for association studies with well-phenotyped cohorts. SVs can influence gene regulation by affecting epigenetics, transcription, splicing, and/or translation. Accurate assays of polyallelic SV loci are required to define the range and allele frequency of variable length alleles.

  6. Unsupervised Neural Network Quantifies the Cost of Visual Information Processing.

    PubMed

    Orbán, Levente L; Chartier, Sylvain

    2015-01-01

    Untrained, "flower-naïve" bumblebees display behavioural preferences when presented with visual properties such as colour, symmetry, spatial frequency and others. Two unsupervised neural networks were implemented to understand the extent to which these models capture elements of bumblebees' unlearned visual preferences towards flower-like visual properties. The computational models, which are variants of Independent Component Analysis and Feature-Extracting Bidirectional Associative Memory, use images of test-patterns that are identical to ones used in behavioural studies. Each model works by decomposing images of floral patterns into meaningful underlying factors. We reconstruct the original floral image using the components and compare the quality of the reconstructed image to the original image. Independent Component Analysis matches behavioural results substantially better across several visual properties. These results are interpreted to support a hypothesis that the temporal and energetic costs of information processing by pollinators served as a selective pressure on floral displays: flowers adapted to pollinators' cognitive constraints.

  7. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Reasonable steps to assure information is... to the best of the submitter's knowledge and belief, provided that: (i) The confirmation is made by... physician; or (v) The confirmation is made by a parent or guardian of a child involved in an...

  8. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Reasonable steps to assure information is... to the best of the submitter's knowledge and belief, provided that: (i) The confirmation is made by... physician; or (v) The confirmation is made by a parent or guardian of a child involved in an...

  9. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 2 2014-01-01 2014-01-01 false Reasonable steps to assure information is... to the best of the submitter's knowledge and belief, provided that: (i) The confirmation is made by... physician; or (v) The confirmation is made by a parent or guardian of a child involved in an...

  10. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 2 2013-01-01 2013-01-01 false Reasonable steps to assure information is... to the best of the submitter's knowledge and belief, provided that: (i) The confirmation is made by... physician; or (v) The confirmation is made by a parent or guardian of a child involved in an...

  11. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 2 2011-01-01 2011-01-01 false Reasonable steps to assure information is... to the best of the submitter's knowledge and belief, provided that: (i) The confirmation is made by... physician; or (v) The confirmation is made by a parent or guardian of a child involved in an...

  12. 77 FR 69441 - Federal Acquisition Regulation; Information Collection; Cost Accounting Standards Administration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-19

    ... Regulation; Information Collection; Cost Accounting Standards Administration AGENCY: Department of Defense...: Under the provisions of the Paperwork Reduction Act, the Regulatory Secretariat will be submitting to... approved information collection requirement concerning cost accounting standards administration....

  13. Information System for Societal Cost and Benefit Analysis of Vocational and Manpower Programs. Final Report.

    ERIC Educational Resources Information Center

    Arora, Mehar

    The study was directed toward developing a manual for establishing societal benefits and costs of vocational and manpower programs in Wisconsin. After first outlining the background of benefit-cost analysis, problems in establishing cost functions in education are presented along with some important cost concepts and uses of cost information in…

  14. The implications for information system design of how health care costs are determined.

    PubMed

    Ehreth, J

    1996-03-01

    As the costs of health care assume increasing importance in national health policy, information systems will be required to supply better information about how costs are generated and how resources are distributed. Costs, as determined by accounting systems, often are inadequate for policy analysis because they represent resources consumed (expenditures) to produce given outputs but do not measure forgone alternative uses of the resources (opportunity costs). To accommodate cost studies at the program level and the system level, relational information systems must be developed that allow costs to be summed across individuals to determine an organization's costs, across providers to determine an individual patient's costs, and across both to determine system and population costs. Program-level studies require that cost variables be grouped into variable costs that are tied to changes in volume of output and fixed costs that are allocated rationally. Data sources for program-level analyses are organizational financial statements, cost center accounting records, Medicare cost reports, American Hospital Association surveys, and the Department of Veterans Affairs (VA) cost distribution files. System-level studies are performed to predict future costs and to compare costs of alternative modes of treatment. System-level analyses aggregate all costs associated with individuals to produce population-based costs. Data sources for system-level analyses include insurance claims; Medicare files; hospital billing records; and VA inpatient, outpatient, and management databases. Future cost studies will require the assessment of costs from all providers, regardless of organizational membership status, for all individuals in defined populations.
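
    The two aggregations the authors call for, summing costs across individuals for an organization and across providers for a patient, can be sketched with a relational-style grouping; the column names and figures below are invented.

```python
import pandas as pd

# Hypothetical encounter-level records linking patients, providers and costs.
encounters = pd.DataFrame({
    "patient_id":  ["p1", "p1", "p2", "p3", "p3"],
    "provider_id": ["hospA", "clinicB", "hospA", "hospA", "va1"],
    "cost":        [1200.0, 90.0, 450.0, 300.0, 800.0],
})

# Organization-level costs: sum across all individuals seen by each provider.
org_costs = encounters.groupby("provider_id")["cost"].sum()

# Patient-level costs: sum across all providers who treated each individual.
patient_costs = encounters.groupby("patient_id")["cost"].sum()

# System/population costs: everything incurred by a defined population.
population = ["p1", "p2", "p3"]
system_cost = encounters[encounters["patient_id"].isin(population)]["cost"].sum()

print(org_costs, patient_costs, system_cost, sep="\n")
```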

  15. Exploratory Movement Generates Higher-Order Information That Is Sufficient for Accurate Perception of Scaled Egocentric Distance

    PubMed Central

    Mantel, Bruno; Stoffregen, Thomas A.; Campbell, Alain; Bardy, Benoît G.

    2015-01-01

    Body movement influences the structure of multiple forms of ambient energy, including optics and gravito-inertial force. Some researchers have argued that egocentric distance is derived from inferential integration of visual and non-visual stimulation. We suggest that accurate information about egocentric distance exists in perceptual stimulation as higher-order patterns that extend across optics and inertia. We formalize a pattern that specifies the egocentric distance of a stationary object across higher-order relations between optics and inertia. This higher-order parameter is created by self-generated movement of the perceiver in inertial space relative to the illuminated environment. For this reason, we placed minimal restrictions on the exploratory movements of our participants. We asked whether humans can detect and use the information available in this higher-order pattern. Participants judged whether a virtual object was within reach. We manipulated relations between body movement and the ambient structure of optics and inertia. Judgments were precise and accurate when the higher-order optical-inertial parameter was available. When only optic flow was available, judgments were poor. Our results reveal that participants perceived egocentric distance from the higher-order, optical-inertial consequences of their own exploratory activity. Analysis of participants’ movement trajectories revealed that self-selected movements were complex, and tended to optimize availability of the optical-inertial pattern that specifies egocentric distance. We argue that accurate information about egocentric distance exists in higher-order patterns of ambient energy, that self-generated movement can generate these higher-order patterns, and that these patterns can be detected and used to support perception of egocentric distance that is precise and accurate. PMID:25856410

  16. Predicting hospital accounting costs

    PubMed Central

    Newhouse, Joseph P.; Cretin, Shan; Witsberger, Christina J.

    1989-01-01

    Two alternative methods to Medicare Cost Reports that provide information about hospital costs more promptly but less accurately are investigated. Both employ utilization data from current-year bills. The first attaches costs to utilization data using cost-charge ratios from the previous year's cost report; the second uses charges from current year's bills. The first method is the more accurate of the two, but even using it, only 40 percent of hospitals had predicted costs within plus or minus 5 percent of actual costs. The feasibility and cost of obtaining cost reports from a small, fast-track sample of hospitals should be investigated. PMID:10313352
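
    A minimal sketch of the first method, attaching the prior year's cost-to-charge ratios to current-year charges and checking the ±5% criterion, might look as follows (all figures hypothetical).

```python
# Prior-year cost-to-charge ratios by department (hypothetical values).
ccr_prev_year = {"radiology": 0.42, "lab": 0.35, "pharmacy": 0.28}

# Current-year billed charges for one hospital, by department (hypothetical).
current_charges = {"radiology": 1_800_000, "lab": 950_000, "pharmacy": 1_200_000}

predicted_cost = sum(current_charges[d] * ccr_prev_year[d] for d in current_charges)

actual_cost = 1_520_000   # what the eventual cost report shows (made up)
error = (predicted_cost - actual_cost) / actual_cost
print(f"predicted ${predicted_cost:,.0f}, error {error:+.1%}, "
      f"within +/-5%: {abs(error) <= 0.05}")
```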

  17. Honey bees can perform accurately directed waggle dances based solely on information from a homeward trip.

    PubMed

    Edrich, Wolfgang

    2015-10-01

    Honey bees were displaced several hundred metres from their hive to an unfamiliar site and provisioned with honey. After feeding, almost two-thirds of the bees flew home to their hive within a 50 min observation time. About half of these returning bees signalled the direction of the release site in waggle dances, thus demonstrating that the dance can be guided entirely by information gathered on a single homeward trip. The likely reason for the bees' enthusiastic dancing on their initial return from this new site was the highly rewarding honeycomb that they were given there. The attractive nature of the site is confirmed by many of these bees revisiting the site and continuing to forage there.

  18. Accurately decoding visual information from fMRI data obtained in a realistic virtual environment

    PubMed Central

    Floren, Andrew; Naylor, Bruce; Miikkulainen, Risto; Ress, David

    2015-01-01

    Three-dimensional interactive virtual environments (VEs) are a powerful but presently under-utilized tool for brain-imaging based cognitive neuroscience. This paper presents machine-learning based methods for identifying brain states induced by realistic VEs with improved accuracy as well as the capability for mapping their spatial topography on the neocortex. VEs provide the ability to study the brain under conditions closer to the environment in which humans evolved, and thus to probe deeper into the complexities of human cognition. As a test case, we designed a stimulus to reflect a military combat situation in the Middle East, motivated by the potential of using real-time functional magnetic resonance imaging (fMRI) in the treatment of post-traumatic stress disorder. Each subject experienced moving through the virtual town where they encountered 1–6 animated combatants at different locations, while fMRI data was collected. To analyze the data from what is, compared to most studies, more complex and less controlled stimuli, we employed statistical machine learning in the form of Multi-Voxel Pattern Analysis (MVPA) with special attention given to artificial Neural Networks (NN). Extensions to NN that exploit the block structure of the stimulus were developed to improve the accuracy of the classification, achieving performances from 58 to 93% (chance was 16.7%) with six subjects. This demonstrates that MVPA can decode a complex cognitive state, viewing a number of characters, in a dynamic virtual environment. To better understand the source of this information in the brain, a novel form of sensitivity analysis was developed to use NN to quantify the degree to which each voxel contributed to classification. Compared with maps produced by general linear models and the searchlight approach, these sensitivity maps revealed a more diverse pattern of information relevant to the classification of cognitive state. PMID:26106315

  19. The Social Costs of Ubiquitous Information: Consuming Information on Mobile Phones Is Associated with Lower Trust.

    PubMed

    Kushlev, Kostadin; Proulx, Jason D E

    2016-01-01

    In an age already saturated with information, the ongoing revolution in mobile computing has expanded the realm of immediate information access far beyond our homes and offices. In addition to changing where people can access information, mobile computing has changed what information people access-from finding specific directions to a restaurant to exploring nearby businesses when on the go. Does this ability to instantly gratify our information needs anytime and anywhere have any bearing on how much we trust those around us-from neighbors to strangers? Using data from a large nationally representative survey (World Values Survey: Wave 6), we found that the more people relied on their mobile phones for information, the less they trusted strangers, neighbors and people from other religions and nationalities. In contrast, obtaining information through any other method-including TV, radio, newspapers, and even the Internet more broadly-predicted higher trust in those groups. Mobile information had no bearing on how much people trusted close others, such as their family. Although causality cannot be inferred, these findings provide an intriguing first glimpse into the possible unforeseen costs of convenient information access for the social lubricant of society-our sense of trust in one another.

  20. The Social Costs of Ubiquitous Information: Consuming Information on Mobile Phones Is Associated with Lower Trust

    PubMed Central

    Proulx, Jason D. E.

    2016-01-01

    In an age already saturated with information, the ongoing revolution in mobile computing has expanded the realm of immediate information access far beyond our homes and offices. In addition to changing where people can access information, mobile computing has changed what information people access—from finding specific directions to a restaurant to exploring nearby businesses when on the go. Does this ability to instantly gratify our information needs anytime and anywhere have any bearing on how much we trust those around us—from neighbors to strangers? Using data from a large nationally representative survey (World Values Survey: Wave 6), we found that the more people relied on their mobile phones for information, the less they trusted strangers, neighbors and people from other religions and nationalities. In contrast, obtaining information through any other method—including TV, radio, newspapers, and even the Internet more broadly—predicted higher trust in those groups. Mobile information had no bearing on how much people trusted close others, such as their family. Although causality cannot be inferred, these findings provide an intriguing first glimpse into the possible unforeseen costs of convenient information access for the social lubricant of society—our sense of trust in one another. PMID:27606707

  1. MBRidge: an accurate and cost-effective method for profiling DNA methylome at single-base resolution.

    PubMed

    Cai, Wanshi; Mao, Fengbiao; Teng, Huajing; Cai, Tao; Zhao, Fangqing; Wu, Jinyu; Sun, Zhong Sheng

    2015-08-01

    Organisms and cells, in response to environmental influences or during development, undergo considerable changes in DNA methylation on a genome-wide scale, which are linked to a variety of biological processes. Using MethylC-seq to decipher DNA methylome at single-base resolution is prohibitively costly. In this study, we develop a novel approach, named MBRidge, to detect the methylation levels of repertoire CpGs, by innovatively introducing C-hydroxylmethylated adapters and bisulfite treatment into the MeDIP-seq protocol and employing ridge regression in data analysis. A systematic evaluation of DNA methylome in a human ovarian cell line T29 showed that MBRidge achieved high correlation (R > 0.90) with much less cost (∼10%) in comparison with MethylC-seq. We further applied MBRidge to profiling DNA methylome in T29H, an oncogenic counterpart of T29. By comparing methylomes of T29H and T29, we identified 131790 differential methylation regions (DMRs), which are mainly enriched in carcinogenesis-related pathways. These are substantially different from 7567 DMRs that were obtained by RRBS and related with cell development or differentiation. The integrated analysis of DMRs in the promoter and expression of DMR-corresponding genes revealed that DNA methylation enforced reverse regulation of gene expression, depending on the distance from the proximal DMR to transcription starting sites in both mRNA and lncRNA. Taken together, our results demonstrate that MBRidge is an efficient and cost-effective method that can be widely applied to profiling DNA methylomes.
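
    The statistical core, a ridge regression mapping enrichment-derived features to methylation levels, can be sketched as below; the features and targets are synthetic, and MBRidge's actual feature construction, calibration, and evaluation are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Synthetic per-CpG features (e.g. local enrichment read counts, CpG density)
# and "true" methylation levels in [0, 1] that a reference method would give.
rng = np.random.default_rng(2)
X = rng.poisson(lam=20, size=(10_000, 5)).astype(float)
beta = np.array([0.02, -0.01, 0.015, 0.0, 0.005])
y = np.clip(X @ beta / 2 + rng.normal(scale=0.05, size=10_000), 0, 1)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2)
model = Ridge(alpha=1.0).fit(X_train, y_train)          # the ridge-regression step

pred = np.clip(model.predict(X_test), 0, 1)
r = np.corrcoef(pred, y_test)[0, 1]
print(f"correlation with reference methylation levels: R = {r:.2f}")
```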

  2. The cost of forming more accurate impressions: accuracy-motivated perceivers see the personality of others more distinctively but less normatively than perceivers without an explicit goal.

    PubMed

    Biesanz, Jeremy C; Human, Lauren J

    2010-04-01

    Does the motivation to form accurate impressions actually improve accuracy? The present work extended Kenny's (1991, 1994) weighted-average model (WAM)--a theoretical model of the factors that influence agreement among personality judgments--to examine two components of interpersonal perception: distinctive and normative accuracy. WAM predicts that an accuracy motivation should enhance distinctive accuracy but decrease normative accuracy. In other words, the impressions of a perceiver with an accuracy motivation will correspond more with the target person's unique characteristics and less with the characteristics of the average person. Perceivers randomly assigned to receive the social goal of forming accurate impressions, which was communicated through a single-sentence instruction, achieved higher levels of distinctive self-other agreement but lower levels of normative agreement compared with perceivers not given an explicit impression-formation goal. The results suggest that people motivated to form accurate impressions do indeed become more accurate, but at the cost of seeing others less normatively and, in particular, less positively.

  3. 75 FR 10455 - Information Collection; Timber Purchaser Cost and Sales Data

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-08

    ... DEPARTMENT OF AGRICULTURE Forest Service Information Collection; Timber Purchaser Cost and Sales Data AGENCY... organizations on the extension of a currently approved information collection, Timber Purchaser Cost and Sales... also may be submitted via facsimile to 202-205-1045 or by e-mail to: cost_collecting@fs.fed.us ....

  4. The Cambridge Face Tracker: Accurate, Low Cost Measurement of Head Posture Using Computer Vision and Face Recognition Software

    PubMed Central

    Thomas, Peter B. M.; Baltrušaitis, Tadas; Robinson, Peter; Vivian, Anthony J.

    2016-01-01

    Purpose We validate a video-based method of head posture measurement. Methods The Cambridge Face Tracker uses neural networks (constrained local neural fields) to recognize facial features in video. The relative position of these facial features is used to calculate head posture. First, we assess the accuracy of this approach against videos in three research databases where each frame is tagged with a precisely measured head posture. Second, we compare our method to a commercially available mechanical device, the Cervical Range of Motion device: four subjects each adopted 43 distinct head postures that were measured using both methods. Results The Cambridge Face Tracker achieved confident facial recognition in 92% of the approximately 38,000 frames of video from the three databases. The respective mean error in absolute head posture was 3.34°, 3.86°, and 2.81°, with a median error of 1.97°, 2.16°, and 1.96°. The accuracy decreased with more extreme head posture. Comparing The Cambridge Face Tracker to the Cervical Range of Motion Device gave correlation coefficients of 0.99 (P < 0.0001), 0.96 (P < 0.0001), and 0.99 (P < 0.0001) for yaw, pitch, and roll, respectively. Conclusions The Cambridge Face Tracker performs well under real-world conditions and within the range of normally-encountered head posture. It allows useful quantification of head posture in real time or from precaptured video. Its performance is similar to that of a clinically validated mechanical device. It has significant advantages over other approaches in that subjects do not need to wear any apparatus, and it requires only low cost, easy-to-setup consumer electronics. Translational Relevance Noncontact assessment of head posture allows more complete clinical assessment of patients, and could benefit surgical planning in future. PMID:27730008
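
    The Cambridge Face Tracker derives posture from constrained local neural field landmarks; as a generic illustration of going from detected 2D facial landmarks to head-posture angles, a perspective-n-point solve can be sketched as below. The 3D model points, camera intrinsics, and landmark coordinates are placeholders, not the tool's actual implementation.

```python
import cv2
import numpy as np

# Rough 3D positions (mm) of a few facial landmarks in a generic head model.
model_points = np.array([
    (0.0,    0.0,    0.0),    # nose tip
    (0.0,  -63.6,  -12.5),    # chin
    (-43.3,  32.7,  -26.0),   # left eye outer corner
    (43.3,   32.7,  -26.0),   # right eye outer corner
    (-28.9, -28.9,  -24.1),   # left mouth corner
    (28.9,  -28.9,  -24.1),   # right mouth corner
], dtype=np.float64)

# Hypothetical 2D landmark detections (pixels) from one video frame.
image_points = np.array([
    (359, 391), (399, 561), (337, 297),
    (513, 301), (345, 465), (453, 469),
], dtype=np.float64)

h, w = 720, 960                                   # frame size
camera_matrix = np.array([[w, 0, w / 2],
                          [0, w, h / 2],
                          [0, 0, 1]], dtype=np.float64)  # crude focal length = frame width
dist_coeffs = np.zeros((4, 1))                    # assume no lens distortion

ok, rvec, tvec = cv2.solvePnP(model_points, image_points,
                              camera_matrix, dist_coeffs,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)

# Extract yaw / pitch / roll (ZYX convention) from the rotation matrix.
sy = np.hypot(R[0, 0], R[1, 0])
pitch = np.degrees(np.arctan2(-R[2, 0], sy))
yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
print(f"yaw {yaw:.1f} deg, pitch {pitch:.1f} deg, roll {roll:.1f} deg")
```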

  5. Accurate molecular dynamics and nuclear quantum effects at low cost by multiple steps in real and imaginary time: Using density functional theory to accelerate wavefunction methods

    NASA Astrophysics Data System (ADS)

    Kapil, V.; VandeVondele, J.; Ceriotti, M.

    2016-02-01

    The development and implementation of increasingly accurate methods for electronic structure calculations mean that, for many atomistic simulation problems, treating light nuclei as classical particles is now one of the most serious approximations. Even though recent developments have significantly reduced the overhead for modeling the quantum nature of the nuclei, the cost is still prohibitive when combined with advanced electronic structure methods. Here we present how multiple time step integrators can be combined with ring-polymer contraction techniques (effectively, multiple time stepping in imaginary time) to reduce virtually to zero the overhead of modelling nuclear quantum effects, while describing inter-atomic forces at high levels of electronic structure theory. This is demonstrated for a combination of MP2 and semi-local DFT applied to the Zundel cation. The approach can be seamlessly combined with other methods to reduce the computational cost of path integral calculations, such as high-order factorizations of the Boltzmann operator or generalized Langevin equation thermostats.
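
    The central idea can be sketched as a force splitting applied in both real and imaginary time; the notation below is generic, and the paper's exact splitting and contraction scheme may differ:

```latex
\[
  F^{(k)} \;=\; F^{(k)}_{\mathrm{DFT}} \;+\; \Delta F^{(k)},
  \qquad
  \Delta F^{(k)} \;=\; F^{(k)}_{\mathrm{MP2}} \;-\; F^{(k)}_{\mathrm{DFT}},
\]
```

    where the cheap semi-local DFT force is evaluated at every time step on all P beads, while the expensive correction Delta F is evaluated only every M outer steps (multiple time stepping in real time) and only on a contracted ring polymer with P' << P beads (multiple time stepping in imaginary time), driving the extra cost of the high-level method toward zero.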

  6. Accurate molecular dynamics and nuclear quantum effects at low cost by multiple steps in real and imaginary time: Using density functional theory to accelerate wavefunction methods

    SciTech Connect

    Kapil, V.; Ceriotti, M.; VandeVondele, J.

    2016-02-07

    The development and implementation of increasingly accurate methods for electronic structure calculations mean that, for many atomistic simulation problems, treating light nuclei as classical particles is now one of the most serious approximations. Even though recent developments have significantly reduced the overhead for modeling the quantum nature of the nuclei, the cost is still prohibitive when combined with advanced electronic structure methods. Here we present how multiple time step integrators can be combined with ring-polymer contraction techniques (effectively, multiple time stepping in imaginary time) to reduce virtually to zero the overhead of modelling nuclear quantum effects, while describing inter-atomic forces at high levels of electronic structure theory. This is demonstrated for a combination of MP2 and semi-local DFT applied to the Zundel cation. The approach can be seamlessly combined with other methods to reduce the computational cost of path integral calculations, such as high-order factorizations of the Boltzmann operator or generalized Langevin equation thermostats.

  7. Preferential access to genetic information from endogenous hominin ancient DNA and accurate quantitative SNP-typing via SPEX

    PubMed Central

    Brotherton, Paul; Sanchez, Juan J.; Cooper, Alan; Endicott, Phillip

    2010-01-01

    The analysis of targeted genetic loci from ancient, forensic and clinical samples is usually built upon polymerase chain reaction (PCR)-generated sequence data. However, many studies have shown that PCR amplification from poor-quality DNA templates can create sequence artefacts at significant levels. With hominin (human and other hominid) samples, the pervasive presence of highly PCR-amplifiable human DNA contaminants in the vast majority of samples can lead to the creation of recombinant hybrids and other non-authentic artefacts. The resulting PCR-generated sequences can then be difficult, if not impossible, to authenticate. In contrast, single primer extension (SPEX)-based approaches can genotype single nucleotide polymorphisms from ancient fragments of DNA as accurately as modern DNA. A single SPEX-type assay can amplify just one of the duplex DNA strands at target loci and generate a multi-fold depth-of-coverage, with non-authentic recombinant hybrids reduced to undetectable levels. Crucially, SPEX-type approaches can preferentially access genetic information from damaged and degraded endogenous ancient DNA templates over modern human DNA contaminants. The development of SPEX-type assays offers the potential for highly accurate, quantitative genotyping from ancient hominin samples. PMID:19864251

  8. FAA Information Technology: Complete Cost Data not Provided to OMB

    DTIC Science & Technology

    1991-01-01

    and use of information technology. However, FAA has not provided required data on information technology supporting the air traffic control system...decisionmakers in the Department of Transportation, OMB, and the Congress to focus needed attention on information technology and understates the level

  9. The Government and Information: Costs, Choices and Challenges.

    ERIC Educational Resources Information Center

    Challman, Laura E.

    This paper examines the involvement of the federal government in information activities and services, and raises questions about the legitimacy and consistency of this involvement. Three major areas of government policy in the information sector are discussed: research and development, the National Technical Information Service (NTIS), and public…

  10. Thermodynamic cost of computation, algorithmic complexity and the information metric

    NASA Technical Reports Server (NTRS)

    Zurek, W. H.

    1989-01-01

    Algorithmic complexity is discussed as a computational counterpart to the second law of thermodynamics. It is shown that algorithmic complexity, which is a measure of randomness, sets limits on the thermodynamic cost of computations and casts a new light on the limitations of Maxwell's demon. Algorithmic complexity can also be used to define distance between binary strings.
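
    Stated schematically, the bound this line of work places on computation is a sharpened, record-dependent form of Landauer's principle: erasing (resetting) a memory that holds a string s costs at least work proportional to the algorithmic complexity K(s) of that string. The inequality below is an informal paraphrase, not a quotation of the paper:

```latex
\[
  W_{\mathrm{erase}}(s) \;\ge\; k_B T \ln 2 \; K(s),
\]
```

    where K(s) is the length, in bits, of the shortest program that outputs s; for typical (incompressible) strings this recovers Landauer's k_B T ln 2 per bit.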

  11. 77 FR 63804 - Federal Acquisition Regulation; Information Collection; Indirect Cost Rates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-17

    ... Paperwork Reduction Act, the Regulatory Secretariat will be submitting to the Office of Management and... the use of appropriate technological collection techniques or other forms of information technology... cost accounting information normally prepared by organizations under sound management and...

  12. 77 FR 67366 - Federal Acquisition Regulation; Information Collection; Travel Costs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-09

    ... the Paperwork Reduction Act, the Regulatory Secretariat will be submitting to the Office of Management... the use of appropriate technological collection techniques or other forms of information...

  13. Risk informed resource allocation policy: safety can save costs.

    PubMed

    Pasman, H J

    2000-01-07

    During economic doldrums, decision making on investments for safety is even more difficult than it already is when funds are abundant. This paper attempts to offer some guidance. After stating the present challenge to prevention of losses in the process industries, the systematic approach of quantified risk assessment is briefly reviewed and improvements in the methodology are mentioned. In addition, attention is given to the use of a risk matrix to survey a plant and to derive a plan of action. Subsequently, the reduction of risk is reviewed. Measures for prevention, protection, and mitigation are discussed. The organization of safety has become at least as important as technical safety of equipment and standards. It is reflected in the introduction of a safety management system. Furthermore, the design process in a pro-active approach is described and the concept of inherent safety is briefly addressed. The concept of Layer of Protection Analysis is explained and also the reason why it is relevant to provide a cost-benefit analysis. Finally, after comments regarding the cost of accidents, the basics of costing and profitability are summarized and a way is suggested to apply this approach to risk-reducing measures. An example is provided on how a selection can be made from a number of alternatives.

  14. 47 CFR 25.111 - Additional information and ITU cost recovery.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 2 2014-10-01 2014-10-01 false Additional information and ITU cost recovery....111 Additional information and ITU cost recovery. (a) The Commission may request from any party at any... interference caused by radio stations authorized by other Administrations is guaranteed unless ITU...

  15. Cost-Effectiveness of Management Training in the Informal Sector. Discussion Paper No. 101.

    ERIC Educational Resources Information Center

    Nubler, Irmgard

    A research project in the Ivory Coast, Kenya, and Tanzania evaluated the cost effectiveness of management training seminars for women entrepreneurs in the informal sector. Women, a large and growing part of entrepreneurs, had less access to needed resources, skills, and information than men. Reasons for failure to study the cost effectiveness and…

  16. 40 CFR 2.311 - Special rules governing certain information obtained under the Motor Vehicle Information and Cost...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 1 2011-07-01 2011-07-01 false Special rules governing certain information obtained under the Motor Vehicle Information and Cost Savings Act. 2.311 Section 2.311 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL PUBLIC INFORMATION Confidentiality of...

  17. The hidden cost of information in collective foraging

    PubMed Central

    Dechaume-Moncharmont, François-Xavier; Dornhaus, Anna; Houston, Alasdair I; McNamara, John M; Collins, Edmund J; Franks, Nigel R

    2005-01-01

    Many animals nest or roost colonially. At the start of a potential foraging period, they may set out independently or await information from returning foragers. When should such individuals act independently and when should they wait for information? In a social insect colony, for example, information transfer may greatly increase a recruit's probability of finding food, and it is commonly assumed that this will always increase the colony's net energy gain. We test this assumption with a mathematical model. Energy gain by a colony is a function both of the probability of finding food sources and of the duration of their availability. A key factor is the ratio of pro-active foragers to re-active foragers. When leaving the nest, pro-active foragers search for food independently, whereas re-active foragers rely on information from successful foragers to find food. Under certain conditions, the optimum strategy is totally independent (pro-active) foraging because potentially valuable information that re-active foragers may gain from successful foragers is not worth waiting for. This counter-intuitive outcome is remarkably robust over a wide range of parameters. It occurs because food sources are only available for a limited period. Our study emphasizes the importance of time constraints and the analysis of dynamics, not just steady states, to understand social insect foraging. PMID:16087424

  18. Whatever the cost? Information integration in memory-based inferences depends on cognitive effort.

    PubMed

    Hilbig, Benjamin E; Michalkiewicz, Martha; Castela, Marta; Pohl, Rüdiger F; Erdfelder, Edgar

    2015-05-01

    One of the most prominent models of probabilistic inferences from memory is the simple recognition heuristic (RH). The RH theory assumes that judgments are based on recognition in isolation, such that other information is ignored. However, some prior research has shown that available knowledge is not generally ignored. In line with the notion of adaptive strategy selection--and, thus, a trade-off between accuracy and effort--we hypothesized that information integration crucially depends on how easily accessible information beyond recognition is, how much confidence decision makers have in this information, and how (cognitively) costly it is to acquire it. In three experiments, we thus manipulated (a) the availability of information beyond recognition, (b) the subjective usefulness of this information, and (c) the cognitive costs associated with acquiring this information. In line with the predictions, we found that RH use decreased substantially, the more easily and confidently information beyond recognition could be integrated, and increased substantially with increasing cognitive costs.

  19. A Basis for Time and Cost Evaluation of Information Systems.

    ERIC Educational Resources Information Center

    Korfhage, R. R.; DeLutis, T. G.

    A general model for information storage and retrieval (IS&R) systems is proposed. The system is selected from the set of all available IS&R components. These components define the system's users and data sources; the hardware, software, and personnel performing the actual storage and retrieval activities; and the funder who acts as a filter in the…

  20. Cost-volume-profit and net present value analysis of health information systems.

    PubMed

    McLean, R A

    1998-08-01

    The adoption of any information system should be justified by an economic analysis demonstrating that its projected benefits outweigh its projected costs. Analysts differ, however, on which methods to employ for such a justification. Accountants prefer cost-volume-profit analysis, and economists prefer net present value analysis. The article explains the strengths and weaknesses of each method and shows how they can be used together so that well-informed investments in information systems can be made.
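
    The two methods can be put side by side for a hypothetical information-system investment; the figures below are invented and serve only to show the calculations.

```python
def break_even_volume(fixed_costs, price_per_unit, variable_cost_per_unit):
    """Cost-volume-profit: volume at which contribution margin covers fixed costs."""
    return fixed_costs / (price_per_unit - variable_cost_per_unit)

def net_present_value(initial_outlay, annual_cash_flows, discount_rate):
    """NPV: discounted future cash flows minus the up-front investment."""
    pv = sum(cf / (1 + discount_rate) ** t
             for t, cf in enumerate(annual_cash_flows, start=1))
    return pv - initial_outlay

# Hypothetical system: $500k up front, $150k net savings per year for 5 years,
# and a $40 charge vs. $15 variable cost per transaction processed.
print(f"break-even volume: {break_even_volume(500_000, 40, 15):,.0f} transactions")
print(f"NPV at 8%: ${net_present_value(500_000, [150_000] * 5, 0.08):,.0f}")
```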

  1. A Trial of Nursing Cost Accounting using Nursing Practice Data on a Hospital Information System.

    PubMed

    Miyahira, Akiko; Tada, Kazuko; Ishima, Masatoshi; Nagao, Hidenori; Miyamoto, Tadashi; Nakagawa, Yoshiaki; Takemura, Tadamasa

    2015-01-01

    Hospital administration is very important, and many hospitals carry out activity-based costing under comprehensive medicine. However, nursing costs remain unclear: nursing practice is expanding both quantitatively and qualitatively, it is difficult to capture every nursing practice, and nursing costs are in many cases calculated only in aggregate. On the other hand, a nursing information system (NIS) has been implemented in many hospitals in Japan, and nursing practice data are becoming available. In this paper, we propose a nursing cost accounting model and simulate costs by nursing contribution using NIS data.

  2. Judgements about the Value and Cost of Human Factors Information in Design.

    ERIC Educational Resources Information Center

    Burns, Catherine M.; Vicente, Kim J.

    1996-01-01

    Describes an empirical evaluation that investigated the criteria by which designers of human-machine systems evaluate design information. Professional designers of nuclear power plant control rooms rated hypothetical information search questions in terms of relevance, importance, cost, and effort based on Rouse's model of information search…

  3. Waste Management Facilities cost information for mixed low-level waste. Revision 1

    SciTech Connect

    Shropshire, D.; Sherick, M.; Biadgi, C.

    1995-06-01

    This report contains preconceptual designs and planning-level life-cycle cost estimates for managing mixed low-level waste. The report's information on treatment, storage, and disposal modules can be integrated to develop total life-cycle costs for various waste management options. A procedure to guide the US Department of Energy and its contractor personnel in the use of cost estimation data is also summarized in this report.

  4. Allele Specific Locked Nucleic Acid Quantitative PCR (ASLNAqPCR): An Accurate and Cost-Effective Assay to Diagnose and Quantify KRAS and BRAF Mutation

    PubMed Central

    Morandi, Luca; de Biase, Dario; Visani, Michela; Cesari, Valentina; De Maglio, Giovanna; Pizzolitto, Stefano; Pession, Annalisa; Tallini, Giovanni

    2012-01-01

    The use of tyrosine kinase inhibitors (TKIs) requires testing for hot spot mutations of the molecular effectors downstream of the membrane-bound tyrosine kinases, since their wild type status is expected for response to TKI therapy. We report a novel assay that we have called Allele Specific Locked Nucleic Acid quantitative PCR (ASLNAqPCR). The assay uses LNA-modified allele-specific primers and LNA-modified beacon probes to increase sensitivity and specificity and to accurately quantify mutations. We designed primers specific for codon 12/13 KRAS mutations and BRAF V600E, and validated the assay with 300 routine samples from a variety of sources, including cytology specimens. All were analyzed by ASLNAqPCR and Sanger sequencing. Discordant cases were pyrosequenced. ASLNAqPCR correctly identified BRAF and KRAS mutations in all discordant cases and all had a mutated/wild type DNA ratio below the analytical sensitivity of the Sanger method. ASLNAqPCR was 100% specific with greater accuracy, positive and negative predictive values compared with Sanger sequencing. The analytical sensitivity of ASLNAqPCR is 0.1%, allowing quantification of mutated DNA in small neoplastic cell clones. ASLNAqPCR can be performed in any laboratory with real-time PCR equipment, is very cost-effective and can easily be adapted to detect hot spot mutations in other oncogenes. PMID:22558339
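
    A minimal sketch of how a mutated/wild-type ratio can be derived from the Ct difference between an allele-specific reaction and a reference reaction is shown below; it assumes equal, perfect amplification efficiencies, whereas the actual assay is calibrated with standards, so this is not the authors' exact quantification procedure.

```python
def mutant_allele_fraction(ct_mutant_assay, ct_reference_assay, efficiency=2.0):
    """Estimate the fraction of mutated template from a Ct difference.

    Assumes the allele-specific and reference reactions amplify with the same
    efficiency (2.0 = perfect doubling per cycle); real assays calibrate this
    with dilution series of mutant/wild-type mixtures.
    """
    delta_ct = ct_mutant_assay - ct_reference_assay
    return efficiency ** (-delta_ct)

# Hypothetical Ct values: the mutant-specific reaction crosses threshold
# ~3.3 cycles later than the total-DNA reference reaction.
frac = mutant_allele_fraction(ct_mutant_assay=28.3, ct_reference_assay=25.0)
print(f"estimated mutated fraction ~ {frac:.1%}")   # ~10%
```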

  5. Guide to Documenting and Managing Cost and Performance Information for Remediation Projects - Revised Version

    EPA Pesticide Factsheets

    This Guide to Documenting and Managing Cost and Performance Information for Remediation Projects provides the recommended procedures for documenting the results of completed and on-going full-scale and demonstration-scale remediation projects.

  6. Information on cost-effectiveness: an essential product of a national comparative effectiveness program.

    PubMed

    2008-06-17

    The American College of Physicians recently highlighted the need to provide increased information comparing the effectiveness of health care interventions to ensure the rational and effective practice of medicine. Comparative effectiveness refers to the evaluation of the relative clinical effectiveness, safety, and cost of 2 or more medical services, drugs, devices, therapies, or procedures used to treat the same condition. The College further recommended the establishment of an adequately funded, trusted national entity that should prioritize, sponsor, or produce both comparative clinical and cost-effectiveness data. This article addresses the need for the proposed entity to develop cost-effectiveness information. It examines the current reluctance to develop and use cost-effectiveness in the United States; it argues for the importance of this information for all health care stakeholders; and it makes specific recommendations regarding how this information can best be made available and used for the good of the public and our patients.

  7. Cost Accounting for Decision Makers.

    ERIC Educational Resources Information Center

    Kaneklides, Ann L.

    1985-01-01

    Underscores the importance of informed decision making through accurate anticipation of cost incurrence in light of changing economic and environmental conditions. Explains the concepts of cost accounting, full allocation of costs, the selection of an allocation base, the allocation of indirect costs, depreciation, and implications for community…

  8. Measuring the cost impact of hospital information systems: 1987-1994.

    PubMed

    Borzekowski, Ron

    2009-09-01

    This study measures the impact of information technology (IT) use on hospital operating costs during the late 1980s and early 1990s. Using a proprietary eight-year panel dataset (1987-1994) that catalogues application-level automation for the complete census of the 3000 U.S. hospitals with more than 100 beds, this study finds that both financial/administrative and clinical IT systems at the most thoroughly automated hospitals are associated with declining costs three and five years after adoption. At the application level, declining costs are associated with the adoption of some of the newest technologies, including systems designed for cost management, the administration of managed care contracts, and for both financial and clinical decision support. The association of cost declines with lagged IT as well as the cost patterns at the less automated hospitals both provide some evidence of learning effects.

  9. Financial and Cost Management for Libraries and Information Services. Second Edition.

    ERIC Educational Resources Information Center

    Roberts, Stephen A.

    This book highlights the importance of good financial and cost management in libraries and information centers to make best use of available resources, and illustrates how to maintain user services. The book applies current research and theory of financial management techniques to situations faced in libraries and information services. Eight…

  10. 76 FR 22410 - Notice of Proposed Information Collection: Comment Request; Mortgagor's Certificate of Actual Cost

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-21

    ... other forms of information technology, e.g., permitting electronic submission of responses. This Notice... profits. It provides a base for evaluating housing programs, labor costs, and physical improvements in.... Estimation of the total number of hours needed to prepare the information collection including number...

  11. Methods for Evaluating Costs of Automated Hospital Information Systems. Research Summary Series.

    ERIC Educational Resources Information Center

    Drazen, Erica; Metzger, Jane

    To provide a compendium of methodologies on cost impacts of automated hospital information systems (AHIS), this report sponsored by the National Center for Services Research identifies, reviews, and summarizes ten studies on information systems which manage patient care data. The studies were identified by a literature search and those that…

  12. Improving Memory after Interruption: Exploiting Soft Constraints and Manipulating Information Access Cost

    ERIC Educational Resources Information Center

    Morgan, Phillip L.; Patrick, John; Waldron, Samuel M.; King, Sophia L.; Patrick, Tanya

    2009-01-01

    Forgetting what one was doing prior to interruption is an everyday problem. The recent soft constraints hypothesis (Gray, Sims, Fu, & Schoelles, 2006) emphasizes the strategic adaptation of information processing strategy to the task environment. It predicts that increasing information access cost (IAC: the time, and physical and mental effort…

  13. The Relationship between Return on Profitability and Costs of Outsourcing Information Technology Technical Support

    ERIC Educational Resources Information Center

    Odion, Segun

    2011-01-01

    The purpose of this quantitative correlational research study was to examine the relationship between costs of operation and total return on profitability of outsourcing information technology technical support in a two-year period of outsourcing operations. United States of America list of Fortune 1000 companies' chief information officers…

  14. 77 FR 75163 - Federal Acquisition Regulation; Information Collection; Contract Funding-Limitation of Costs/Funds

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-19

    ... Regulation; Information Collection; Contract Funding--Limitation of Costs/Funds AGENCIES: Department of... (NASA). ACTION: Notice of request for public comments regarding an extension of an existing OMB...: Submit comments identified by Information Collection 9000- 0074, Contract Funding--Limitation of...

  15. The price of information: Increased inspection costs reduce the confirmation bias in visual search.

    PubMed

    Rajsic, Jason; Wilson, Daryl E; Pratt, Jay

    2017-01-31

    In visual search, there is a confirmation bias such that attention is biased towards stimuli that match a target template, which has been attributed to covert costs of updating the templates that guide search [Rajsic, Wilson, & Pratt, 2015. Confirmation bias in visual search. Journal of Experimental Psychology: Human Perception and Performance. Advance online publication. doi: 10.1037/xhp0000090 ]. In order to provide direct evidence for this speculation, the present study increased the cost of inspections in search by using gaze- and mouse-contingent searches, which restrict the manner in which information in search displays can be accrued, and incur additional motor costs (in the case of mouse-contingent searches). In a fourth experiment, we rhythmically mask elements in the search display to induce temporal inspection costs. Our results indicated that confirmation bias is indeed attenuated when inspection costs are increased. We conclude that confirmation bias results from the low-cost strategy of matching information to a single, concrete visual template, and that more sophisticated guidance strategies will be used when sufficiently beneficial. This demonstrates that search guidance itself comes at a cost, and that the form of guidance adopted in a given search depends on a comparison between guidance costs and the expected benefits of their implementation.

  16. Interim report: Waste management facilities cost information for mixed low-level waste

    SciTech Connect

    Feizollahi, F.; Shropshire, D.

    1994-03-01

    This report contains preconceptual designs and planning level life-cycle cost estimates for treating alpha and nonalpha mixed low-level radioactive waste. This report contains information on twenty-seven treatment, storage, and disposal modules that can be integrated to develop total life cycle costs for various waste management options. A procedure to guide the US Department of Energy and its contractor personnel in the use of estimating data is also summarized in this report.

  17. Considerations in developing geographic information systems based on low-cost digital image processing

    NASA Technical Reports Server (NTRS)

    Henderson, F. M.; Dobson, M. W.

    1981-01-01

    The potential of digital image processing systems costing $20,000 or less for geographic information systems is assessed with the emphasis on the volume of data to be handled, the commercial hardware systems available, and the basic software for: (1) data entry, conversion and digitization; (2) georeferencing and geometric correction; (3) data structuring; (4) editing and updating; (5) analysis and retrieval; (6) output drivers; and (7) data management. Costs must also be considered as tangible and intangible factors.

  18. Low-Cost Rapid Usability Testing for health information systems: is it worth the effort?

    PubMed

    Baylis, Tristin B; Kushniruk, Andre W; Borycki, Elizabeth M

    2012-01-01

    Usability testing is a step of the usability engineering process that focuses on analyzing and improving user interactions with computer systems. This study was designed to determine if an approach known as Low-Cost Rapid Usability Testing can be introduced as a standard part of the system development lifecycle (SDLC) for health information systems in a cost-effective manner by completing a full cost-benefit analysis of this testing technique. It was found that by introducing this technique into the system development lifecycle to allow for earlier detection of errors in a health information system, it is possible for a health organization to achieve an estimated 36.5% to 78.5% cost savings compared to the impact of errors going undetected and potentially causing a technology-induced error. Overall it was found that Low-Cost Rapid Usability Testing can be implemented in a cost-effective manner to develop health information systems, and computer systems in general, which will have a lower incidence of technology-induced errors.

  19. Digital Avionics Information System (DAIS): Life Cycle Cost Impact Modeling System Reliability, Maintainability, and Cost Model (RMCM) - Description, Users Guide.

    DTIC Science & Technology

    1980-08-01

    [The indexed excerpt for this report consists of table-of-contents and cost-element fragments rather than an abstract. Recoverable items: Cost of Support (CS); Disposal Costs (CDP); Time Value of Money; General Assumptions of the RMCM; Data File. Cost elements are defined against a baseline year value: NRC = nonrecurring cost total, RC = recurring cost total, CDP = cost of system disposal. The treatment of disposal costs depends on the application, the availability of data, and assurance that costs are not duplicated between cost elements.]

  20. Health information technology and its effects on hospital costs, outcomes, and patient safety.

    PubMed

    Encinosa, William E; Bae, Jaeyong

    Underlying many reforms in the Patient Protection and Affordable Care Act (ACA) is the use of electronic medical records (EMRs) to help contain costs. We use MarketScan claims data and American Hospital Association information technology (IT) data to examine whether EMRs can contain costs in the ACA's reforms to reduce patient safety events. We find EMRs do not reduce the rate of patient safety events. However, once an event occurs, EMRs reduce death by 34%, readmissions by 39%, and spending by $4,850 (16%), a cost offset of $1.75 per $1 spent on IT capital. Thus, EMRs contain costs by better coordinating care to rescue patients from medical errors once they occur.

  1. A study of the relative effectiveness and cost of computerized information retrieval in the interactive mode

    NASA Technical Reports Server (NTRS)

    Smetana, F. O.; Furniss, M. A.; Potter, T. R.

    1974-01-01

    Results of a number of experiments to illuminate the relative effectiveness and costs of computerized information retrieval in the interactive mode are reported. It was found that for equal time spent in preparing the search strategy, the batch and interactive modes gave approximately equal recall and relevance. The interactive mode however encourages the searcher to devote more time to the task and therefore usually yields improved output. Engineering costs as a result are higher in this mode. Estimates of associated hardware costs also indicate that operation in this mode is more expensive. Skilled RECON users like the rapid feedback and additional features offered by this mode if they are not constrained by considerations of cost.

  2. 30 CFR 551.13 - Reimbursement for the costs of reproducing data and information and certain processing costs.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... reimburse you or a third party for data acquisition costs or for the costs of analyzing or processing... 30 Mineral Resources 2 2012-07-01 2012-07-01 false Reimbursement for the costs of reproducing data... THE OUTER CONTINENTAL SHELF § 551.13 Reimbursement for the costs of reproducing data and...

  3. 30 CFR 551.13 - Reimbursement for the costs of reproducing data and information and certain processing costs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... reimburse you or a third party for data acquisition costs or for the costs of analyzing or processing... 30 Mineral Resources 2 2013-07-01 2013-07-01 false Reimbursement for the costs of reproducing data... THE OUTER CONTINENTAL SHELF § 551.13 Reimbursement for the costs of reproducing data and...

  4. 30 CFR 551.13 - Reimbursement for the costs of reproducing data and information and certain processing costs.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... reimburse you or a third party for data acquisition costs or for the costs of analyzing or processing... 30 Mineral Resources 2 2014-07-01 2014-07-01 false Reimbursement for the costs of reproducing data... THE OUTER CONTINENTAL SHELF § 551.13 Reimbursement for the costs of reproducing data and...

  5. 76 FR 26705 - Proposed Information Collection; Comment Request; Commercial Fishing Vessel Cost and Earnings...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-09

    ... Commerce, to make informed decisions about the expected economic effects of proposed management...; Commercial Fishing Vessel Cost and Earnings Data Collection Survey in the Northeast Region AGENCY: National... legislative requirements of the Magnuson-Stevens Fishery Conservation and Management Act, the...

  6. The role of cognitive switching in head-up displays. [to determine pilot ability to accurately extract information from either of two sources

    NASA Technical Reports Server (NTRS)

    Fischer, E.

    1979-01-01

    The pilot's ability to accurately extract information from either one or both of two superimposed sources of information was determined. Static, aerial, color 35 mm slides of external runway environments and slides of corresponding static head-up display (HUD) symbology were used as the sources. A three channel tachistoscope was utilized to show either the HUD alone, the scene alone, or the two slides superimposed. Cognitive performance of the pilots was assessed by determining the percentage of correct answers given to two HUD related questions, two scene related questions, or one HUD and one scene related question.

  7. Relation between physical time-energy cost of a quantum process and its information fidelity

    NASA Astrophysics Data System (ADS)

    Fung, Chi-Hang Fred; Chau, H. F.

    2014-08-01

    A quantum system can be described and characterized by at least two different concepts, namely, its physical and informational properties. Here, we explicitly connect these two concepts by equating the time-energy cost, which is the product of the largest energy of a Hamiltonian of quantum dynamics and the evolution time, with the entanglement fidelity, which is the informational difference between an input state and the corresponding output state produced by a quantum channel characterized by the Hamiltonian. Specifically, the worst-case entanglement fidelity between the input and output states is exactly the cosine of the channel's time-energy cost (except when the fidelity is zero). The exactness of our relation makes a strong statement about the intimate connection between information and physics. Our exact result may also be regarded as a time-energy uncertainty relation for the fastest state that achieves a certain fidelity.
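
    Restated in symbols as a hedged paraphrase of the abstract (the normalization is an assumption here; units with hbar = 1 are used, and the relation is quoted as holding except where the fidelity vanishes):

```latex
% Illustrative restatement, not the paper's exact notation:
% worst-case entanglement fidelity versus time-energy cost C.
F_{\min} \;=\; \cos\!\left( C \right),
\qquad C \;\equiv\; E_{\max}\, T ,
\qquad \hbar = 1 .
```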

  8. Using a small/low cost computer in an information center

    NASA Technical Reports Server (NTRS)

    Wilde, D. U.

    1972-01-01

    Small/low cost computers are available with I/O capacities that make them suitable for SDI and retrospective searching on any of the many commercially available data bases. A small two-tape computer system is assumed, and an analysis of its run-time equations leads to a three-step search procedure. Run times and costs are shown as a function of file size, number of search terms, and input transmission rates. Actual examples verify that it is economically feasible for an information center to consider its own small, dedicated computer system.

  9. Who pays and who benefits? How different models of shared responsibilities between formal and informal carers influence projections of costs of dementia management

    PubMed Central

    2011-01-01

    Background The few studies that have attempted to estimate the future cost of caring for people with dementia in Australia are typically based on total prevalence and the cost per patient over the average duration of illness. However, costs associated with dementia care also vary according to the length of the disease, severity of symptoms and type of care provided. This study aimed to determine more accurately the future costs of dementia management by taking these factors into consideration. Methods The current study estimated the prevalence of dementia in Australia (2010-2040). Data from a variety of sources was recalculated to distribute this prevalence according to the location (home/institution), care requirements (informal/formal), and dementia severity. The cost of care was attributed to redistributed prevalences and used in prediction of future costs of dementia. Results Our computer modeling indicates that the ratio between the prevalence of people with mild/moderate/severe dementia will change over the three decades from 2010 to 2040 from 50/30/20 to 44/32/24. Taking into account the severity of symptoms, location of care and cost of care per hour, the current study estimates that the informal cost of care in 2010 is AU$3.2 billion and formal care at AU$5.0 billion per annum. By 2040 informal care is estimated to cost AU$11.6 billion and formal care $AU16.7 billion per annum. Interventions to slow disease progression will result in relative savings of 5% (AU$1.5 billion) per annum and interventions to delay disease onset will result in relative savings of 14% (AU$4 billion) of the cost per annum. With no intervention, the projected combined annual cost of formal and informal care for a person with dementia in 2040 will be around AU$38,000 (in 2010 dollars). An intervention to delay progression by 2 years will see this reduced to AU$35,000. Conclusions These findings highlight the need to account for more than total prevalence when estimating the costs of
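
    As a rough sketch of the costing approach this abstract describes (prevalence redistributed by severity and care setting, each attributed an hourly cost of care), the Python fragment below uses entirely hypothetical prevalence, hours, and rates; it is not the study's model or data.

```python
# Hypothetical illustration of a severity- and setting-weighted dementia cost calculation.
# All numbers below are made up for demonstration; the cited study uses its own estimates.

prevalence = 250_000  # hypothetical number of people with dementia in a given year

# Share of prevalence by severity, e.g. the 44/32/24 split cited for 2040
severity_share = {"mild": 0.44, "moderate": 0.32, "severe": 0.24}

# Hypothetical average care hours per week by severity and care type
care_hours_per_week = {
    ("mild", "informal"): 10, ("mild", "formal"): 2,
    ("moderate", "informal"): 25, ("moderate", "formal"): 8,
    ("severe", "informal"): 40, ("severe", "formal"): 30,
}
hourly_cost = {"informal": 28.0, "formal": 45.0}  # hypothetical AU$ per hour

annual_cost = {"informal": 0.0, "formal": 0.0}
for severity, share in severity_share.items():
    people = prevalence * share
    for care_type in ("informal", "formal"):
        hours = care_hours_per_week[(severity, care_type)] * 52
        annual_cost[care_type] += people * hours * hourly_cost[care_type]

for care_type, cost in annual_cost.items():
    print(f"{care_type} care: AU${cost / 1e9:.2f} billion per year")
```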

  10. Multiplying probe for accurate power measurements on an RF driven atmospheric pressure plasma jet applied to the COST reference microplasma jet

    NASA Astrophysics Data System (ADS)

    Beijer, P. A. C.; Sobota, A.; van Veldhuizen, E. M.; Kroesen, G. M. W.

    2016-03-01

    In this paper a new multiplying probe for measuring the power dissipated in a miniature capacitively coupled, RF driven, atmospheric pressure plasma jet (μAPPJ—COST Reference Microplasma Jet—COST RMJ) is presented. The approach aims for substantially higher accuracy than provided by traditionally applied methods using bi-directional power meters or commercially available voltage and current probes in conjunction with digitizing oscilloscopes. The probe is placed on a miniature PCB and designed to minimize losses, influence of unknown elements, crosstalk and variations in temperature. The probe is designed to measure powers of the order of magnitude of 0.1-10 W. It is estimated that it measures power with less than 2% deviation from the real value in the tested power range. The design was applied to measure power dissipated in COST-RMJ running in helium with a small addition of oxygen.

  11. The Spatial Dynamics of Predators and the Benefits and Costs of Sharing Information

    PubMed Central

    2016-01-01

    Predators of all kinds, be they lions hunting in the Serengeti or fishermen searching for their catch, display various collective strategies. A common strategy is to share information about the location of prey. However, depending on the spatial characteristics and mobility of predators and prey, information sharing can either improve or hinder individual success. Here, our goal is to investigate the interacting effects of space and information sharing on predation efficiency, represented by the expected rate at which prey are found and consumed. We derive a feeding functional response that accounts for both spatio-temporal heterogeneity and communication, and validate this mathematical analysis with a computational agent-based model. This agent-based model has an explicit yet minimal representation of space, as well as information sharing about the location of prey. The analytical model simplifies predator behavior into a few discrete states and one essential trade-off, between the individual benefit of acquiring information and the cost of creating spatial and temporal correlation between predators. Despite the absence of an explicit spatial dimension in these equations, they quantitatively predict the predator consumption rates measured in the agent-based simulations across the explored parameter space. Together, the mathematical analysis and agent-based simulations identify the conditions for when there is a benefit to sharing information, and also when there is a cost. PMID:27764098

  12. The Lunar Laser Ranging Experiment: Accurate ranges have given a large improvement in the lunar orbit and new selenophysical information.

    PubMed

    Bender, P L; Currie, D G; Poultney, S K; Alley, C O; Dicke, R H; Wilkinson, D T; Eckhardt, D H; Faller, J E; Kaula, W M; Mulholland, J D; Plotkin, H H; Silverberg, E C; Williams, J G

    1973-10-19

    The lunar ranging measurements now being made at the McDonald Observatory have an accuracy of 1 nsec in round-trip travel time. This corresponds to 15 cm in the one-way distance. The use of lasers with pulse-lengths of less than 1 nsec is expected to give an accuracy of 2 to 3 cm in the next few years. A new station is under construction in Hawaii, and additional stations in other countries are either in operation or under development. It is hoped that these stations will form the basis for a worldwide network to determine polar motion and earth rotation on a regular basis, and will assist in providing information about movement of the tectonic plates making up the earth's surface. Several mobile lunar ranging stations with telescopes having diameters of 1.0 m or less could, in the future, greatly extend the information obtainable about motions within and between the tectonic plates. The data obtained so far by the McDonald Observatory have been used to generate a new lunar ephemeris based on direct numerical integration of the equations of motion for the moon and planets. With this ephemeris, the range to the three Apollo retro-reflectors can be fit to an accuracy of 5 m by adjusting the differences in moments of inertia of the moon about its principal axes, the selenocentric coordinates of the reflectors, and the McDonald longitude. The accuracy of fitting the results is limited currently by errors of the order of an arc second in the angular orientation of the moon, as derived from the best available theory of how the moon rotates in response to the torques acting on it. Both a new calculation of the moon's orientation as a function of time based on direct numerical integration of the torque equations and a new analytic theory of the moon's orientation are expected to be available soon, and to improve considerably the accuracy of fitting the data. The accuracy already achieved routinely in lunar laser ranging represents a hundredfold improvement over any

  13. Multiple descent cost competition: restorable self-organization and multimedia information processing.

    PubMed

    Matsuyama, Y

    1998-01-01

    Multiple descent cost competition is a composition of learning phases for minimizing a given measure of total performance, i.e., cost. If these phases are heterogeneous toward each other, the total learning algorithm shows a variety of extraordinary abilities; especially in regards to multimedia information processing. In the first phase of descent cost learning, elements of source data are grouped. Simultaneously, a weight vector for minimal learning, (i.e., a winner), is found. Then, the winner and its partners are updated for further cost reduction. Therefore, two classes of self-organizing feature maps are generated. One is called a grouping feature map, which partitions the source data. The other is an ordinary weight vector feature map. The grouping feature map, together with the winners, retains most of the source data information. This feature map is able to assist in a high quality approximation of the original data. Traditional weight vector feature maps lack this ability. Another important capacity of the grouping feature map is that it can change its shape. Thus, the grouping pattern can accept external directions in order to metamorphose. In the text, the total algorithm of the multiple descent cost competition is explained first. In that section, image processing concepts are introduced in order to assist in the description of this algorithm. Then, a still image is first data-compressed (DC). Next, a restored image is morphed using the grouping feature map by receiving directions given by an external intelligence. Next, an interpolation of frames is applied in order to complete animation coding (AC). Thus, multiple descent cost competition bridges "DC to AC." Examples of multimedia processing on virtual digital movies are given.
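
    The "find a winner, then update it for further cost reduction" step described above resembles a generic descent-cost competitive update. The sketch below shows only that generic step on synthetic data, using numpy as an assumed dependency; it does not reproduce the paper's multi-phase algorithm, grouping feature map, or multimedia coding stages.

```python
# Generic descent-cost competitive update (k-means-style winner selection and pull),
# shown only to illustrate the "find a winner, then reduce its cost" idea; the cited
# multiple descent cost competition algorithm is not reproduced here.
import numpy as np

rng = np.random.default_rng(4)
data = rng.normal(size=(500, 2))          # source data elements (synthetic)
weights = rng.normal(size=(8, 2))         # weight vectors, one per group
lr = 0.05                                 # learning rate

for x in data:
    cost = np.sum((weights - x) ** 2, axis=1)      # per-vector distortion cost
    winner = int(np.argmin(cost))                  # winner = minimal-cost weight vector
    weights[winner] += lr * (x - weights[winner])  # descend the cost for the winner
```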

  14. Waste management facilities cost information for transportation of radioactive and hazardous materials

    SciTech Connect

    Feizollahi, F.; Shropshire, D.; Burton, D.

    1995-06-01

    This report contains cost information on the U.S. Department of Energy (DOE) Complex waste streams that will be addressed by DOE in the programmatic environmental impact statement (PEIS) project. It describes the results of the task commissioned by DOE to develop cost information for transportation of radioactive and hazardous waste. It contains transportation costs for most types of DOE waste streams: low-level waste (LLW), mixed low-level waste (MLLW), alpha LLW and alpha MLLW, Greater-Than-Class C (GTCC) LLW and DOE equivalent waste, transuranic (TRU) waste, spent nuclear fuel (SNF), and hazardous waste. Unit rates for transportation of contact-handled (<200 mrem/hr contact dose) and remote-handled (>200 mrem/hr contact dose) radioactive waste are estimated. Land transportation of radioactive and hazardous waste is subject to regulations promulgated by DOE, the U.S. Department of Transportation (DOT), the U.S. Nuclear Regulatory Commission (NRC), and state and local agencies. The cost estimates in this report assume compliance with applicable regulations.

  15. Energy information systems (EIS): Technology costs, benefit, and best practice uses

    SciTech Connect

    Granderson, Jessica; Lin, Guanjing; Piette, Mary Ann

    2013-11-26

    Energy information systems are the web-based software, data acquisition hardware, and communication systems used to store, analyze, and display building energy data. They often include analysis methods such as baselining, benchmarking, load profiling, and energy anomaly detection. This report documents a large-scale assessment of energy information system (EIS) uses, costs, and energy benefits, based on a series of focused case study investigations that are synthesized into generalizable findings. The overall objective is to provide organizational decision makers with the information they need to make informed choices as to whether or not to invest in an EIS--a promising technology that can enable up to 20 percent site energy savings, quick payback, and persistent low-energy performance when implemented as part of best-practice energy management programs.

  16. Know thy neighbor: costly information can hurt cooperation in dynamic networks.

    PubMed

    Antonioni, Alberto; Cacault, Maria Paula; Lalive, Rafael; Tomassini, Marco

    2014-01-01

    People need to rely on cooperation with other individuals in many aspects of everyday life, such as teamwork and economic exchange in anonymous markets. We study whether and how the ability to make or break links in social networks fosters cooperation, paying particular attention to whether information on an individual's actions is freely available to potential partners. Studying the role of information is relevant as information on other people's actions is often not available for free: a recruiting firm may need to call a job candidate's references, a bank may need to find out about the credit history of a new client, etc. We find that people cooperate almost fully when information on their actions is freely available to their potential partners. Cooperation is less likely, however, if people have to pay about half of what they gain from cooperating with a cooperator. Cooperation declines even further if people have to pay a cost that is almost equivalent to the gain from cooperating with a cooperator. Thus, costly information on potential neighbors' actions can undermine the incentive to cooperate in fluid networks.

  17. Compact and cost-effective temperature-insensitive bio-sensor based on long-period fiber gratings for accurate detection of E. coli bacteria in water.

    PubMed

    Dandapat, Krishnendu; Tripathi, Saurabh Mani; Chinifooroshan, Yasser; Bock, Wojtek J; Mikulic, Predrag

    2016-09-15

    We propose and demonstrate a novel temperature-insensitive bio-sensor for accurate and quantitative detection of Escherichia coli (E. coli) bacteria in water. Surface sensitivity is maximized by operating the long-period fiber grating (LPFG) closest to its turnaround wavelength, and the temperature insensitivity is achieved by selectively exciting a pair of cladding modes with opposite dispersion characteristics. Our sensor shows a nominal temperature sensitivity of ∼1.25  pm/°C, which can be further reduced by properly adjusting the LPFG lengths, while maintaining a high refractive index sensitivity of 1929 nm/RIU. The overall length of the sensor is ∼3.6  cm, making it ideally suitable for bio-sensing applications. As an example, we also show the sensor's capability for reliable, quantitative detection of E. coli bacteria in water over a temperature fluctuation of room temperature to 40°C.

  18. A Framework for evaluating the costs, effort, and value of nationwide health information exchange

    PubMed Central

    Zafar, Atif; Overhage, J Marc

    2010-01-01

    Objective The nationwide health information network (NHIN) has been proposed to securely link community and state health information exchange (HIE) entities to create a national, interoperable network for sharing healthcare data in the USA. This paper describes a framework for evaluating the costs, effort, and value of nationwide data exchange as the NHIN moves toward a production state. The paper further presents the results of an initial assessment of the framework by those engaged in HIE activities. Design Using a literature review and knowledge gained from active NHIN technology and policy development, the authors constructed a framework for evaluating the costs, effort, and value of data exchange between an HIE entity and the NHIN. Measurement An online survey was used to assess the perceived usefulness of the metrics in the framework among HIE professionals and researchers. Results The framework is organized into five broad categories: implementation; technology; policy; data; and value. Each category enumerates a variety of measures and measure types. Survey respondents generally indicated the framework contained useful measures for current and future use in HIE and NHIN evaluation. Answers varied slightly based on a respondent's participation in active development of NHIN components. Conclusion The proposed framework supports efforts to measure the costs, effort, and value associated with nationwide data exchange. Collecting longitudinal data along the NHIN's path to production should help with the development of an evidence base that will drive adoption, create value, and stimulate further investment in nationwide data exchange. PMID:20442147

  19. The Effects of Health Information Technology on the Costs and Quality of Medical Care

    PubMed Central

    Agha, Leila

    2015-01-01

    Information technology has been linked to productivity growth in a wide variety of sectors, and health information technology (HIT) is a leading example of an innovation with the potential to transform industry-wide productivity. This paper analyzes the impact of health information technology (HIT) on the quality and intensity of medical care. Using Medicare claims data from 1998-2005, I estimate the effects of early investment in HIT by exploiting variation in hospitals’ adoption statuses over time, analyzing 2.5 million inpatient admissions across 3900 hospitals. HIT is associated with a 1.3 percent increase in billed charges (p-value: 5.6%), and there is no evidence of cost savings even five years after adoption. Additionally, HIT adoption appears to have little impact on the quality of care, measured by patient mortality, adverse drug events, and readmission rates. PMID:24463141

  20. The effects of health information technology on the costs and quality of medical care.

    PubMed

    Agha, Leila

    2014-03-01

    Information technology has been linked to productivity growth in a wide variety of sectors, and health information technology (HIT) is a leading example of an innovation with the potential to transform industry-wide productivity. This paper analyzes the impact of health information technology (HIT) on the quality and intensity of medical care. Using Medicare claims data from 1998 to 2005, I estimate the effects of early investment in HIT by exploiting variation in hospitals' adoption statuses over time, analyzing 2.5 million inpatient admissions across 3900 hospitals. HIT is associated with a 1.3% increase in billed charges (p-value: 5.6%), and there is no evidence of cost savings even five years after adoption. Additionally, HIT adoption appears to have little impact on the quality of care, measured by patient mortality, adverse drug events, and readmission rates.
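
    A minimal sketch of the kind of two-way fixed-effects comparison described in both versions of this abstract (an adoption indicator with hospital and year effects). The toy data, variable names, and the use of pandas/statsmodels are illustrative assumptions; this is not the author's estimation code.

```python
# Hypothetical two-way fixed-effects regression: log charges on an HIT-adoption indicator,
# absorbing hospital and year effects. Toy data only; not the cited study's code or data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
hospitals, years = 50, 8
df = pd.DataFrame(
    [(h, y) for h in range(hospitals) for y in range(years)],
    columns=["hospital_id", "year"],
)
adopt_year = rng.integers(2, years, size=hospitals)           # hypothetical adoption timing
df["hit_adopted"] = (df["year"] >= adopt_year[df["hospital_id"]]).astype(int)
df["log_charges"] = (
    8.0 + 0.013 * df["hit_adopted"]                           # ~1.3% effect, as quoted above
    + rng.normal(0, 0.05, len(df))
)

model = smf.ols("log_charges ~ hit_adopted + C(hospital_id) + C(year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["hospital_id"]}
)
print(model.params["hit_adopted"])
```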

  1. The benefits and costs of disclosing information about risks: what do we know about right-to-know?

    PubMed

    Beierle, Thomas C

    2004-04-01

    Following the attacks of September 11, 2001, the Environmental Protection Agency and other government agencies removed information from their web sites that they feared could invite attacks on critical public and private infrastructure. Accordingly, the benefits and costs of environmental information disclosure programs have come under increasing scrutiny. This article describes a framework for examining these benefits and costs and illustrates the framework through brief case studies of two information disclosure programs: risk management planning and materials accounting. The article outlines what we know and still need to find out about information disclosure programs in order to appropriately balance benefits and costs.

  2. The value of information as applied to the Landsat Follow-on benefit-cost analysis

    NASA Technical Reports Server (NTRS)

    Wood, D. B.

    1978-01-01

    An econometric model was run to compare the current forecasting system with a hypothetical (Landsat Follow-on) space-based system. The baseline current system was a hybrid of USDA SRS domestic forecasts and the best known foreign data. The space-based system improved upon the present Landsat by the higher spatial resolution capability of the thematic mapper. This satellite system is a major improvement for foreign forecasts but no better than SRS for domestic forecasts. The benefit analysis was concentrated on the use of Landsat Follow-on to forecast world wheat production. Results showed that it was possible to quantify the value of satellite information and that there are significant benefits in more timely and accurate crop condition information.

  3. A Cost-Effectiveness Tool for Informing Policies on Zika Virus Control

    PubMed Central

    Tamagnan, Jules A.; Medlock, Jan; Ndeffo-Mbah, Martial L.; Fish, Durland; Ávila-Agüero, María L.; Marín, Rodrigo; Ko, Albert I.; Galvani, Alison P.

    2016-01-01

    in real-time within our user-friendly tool to provide updated estimates on cost-effectiveness of interventions and inform policy decisions in country-specific settings. PMID:27205899

  4. Effective information channels for reducing costs of environmentally- friendly technologies: evidence from residential PV markets

    NASA Astrophysics Data System (ADS)

    Rai, Varun; Robinson, Scott A.

    2013-03-01

    Realizing the environmental benefits of solar photovoltaics (PV) will require reducing costs associated with perception, informational gaps and technological uncertainties. To identify opportunities to decrease costs associated with residential PV adoption, in this letter we use multivariate regression models to analyze a unique, household-level dataset of PV adopters in Texas (USA) to systematically quantify the effect of different information channels on aspiring PV adopters’ decision-making. We find that the length of the decision period depends on the business model, such as whether the system was bought or leased, and on special opportunities to learn, such as the influence of other PV owners in the neighborhood. This influence accrues passively through merely witnessing PV systems in the neighborhood, increasing confidence and motivation, as well as actively through peer-to-peer communications. Using these insights we propose a new framework to provide public information on PV that could drastically reduce barriers to PV adoption, thereby accelerating its market penetration and environmental benefits. This framework could also serve as a model for other distributed generation technologies.

  5. An Accurate and Fault-Tolerant Target Positioning System for Buildings Using Laser Rangefinders and Low-Cost MEMS-Based MARG Sensors

    PubMed Central

    Zhao, Lin; Guan, Dongxue; Landry, René Jr.; Cheng, Jianhua; Sydorenko, Kostyantyn

    2015-01-01

    Target positioning systems based on MEMS gyros and laser rangefinders (LRs) have extensive prospects due to their advantages of low cost, small size and easy realization. The target positioning accuracy is mainly determined by the LR’s attitude derived by the gyros. However, the attitude error is large due to the inherent noises from isolated MEMS gyros. In this paper, both accelerometer/magnetometer and LR attitude aiding systems are introduced to aid MEMS gyros. A no-reset Federated Kalman Filter (FKF) is employed, which consists of two local Kalman Filters (KF) and a Master Filter (MF). The local KFs are designed by using the Direction Cosine Matrix (DCM)-based dynamic equations and the measurements from the two aiding systems. The KFs can estimate the attitude simultaneously to limit the attitude errors resulting from the gyros. Then, the MF fuses the redundant attitude estimates to yield globally optimal estimates. Simulation and experimental results demonstrate that the FKF-based system can improve the target positioning accuracy effectively and allow for good fault-tolerant capability. PMID:26512672
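
    A minimal numpy sketch of just the master-filter fusion step of a federated Kalman filter (inverse-covariance weighting of two local estimates). The DCM-based local filters and no-reset information-sharing details of the cited system are not reproduced, and all numbers are hypothetical.

```python
# Master-filter fusion step of a federated Kalman filter: two local attitude estimates are
# combined by inverse-covariance (information) weighting. Hypothetical values only.
import numpy as np

def fuse(x1, P1, x2, P2):
    """Fuse two local estimates (x1, P1) and (x2, P2) into a global estimate."""
    info1, info2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P_g = np.linalg.inv(info1 + info2)           # fused covariance
    x_g = P_g @ (info1 @ x1 + info2 @ x2)        # fused state (e.g., roll/pitch/yaw errors)
    return x_g, P_g

# Hypothetical local estimates of attitude error (radians) and their covariances
x_acc_mag = np.array([0.010, -0.004, 0.020])     # accelerometer/magnetometer-aided KF
P_acc_mag = np.diag([1e-4, 1e-4, 4e-4])
x_lr      = np.array([0.008, -0.006, 0.015])     # laser-rangefinder-aided KF
P_lr      = np.diag([2e-4, 2e-4, 1e-4])

x_global, P_global = fuse(x_acc_mag, P_acc_mag, x_lr, P_lr)
print(x_global)
```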

  6. The challenge of monitoring elusive large carnivores: An accurate and cost-effective tool to identify and sex pumas (Puma concolor) from footprints

    PubMed Central

    2017-01-01

    Acquiring reliable data on large felid populations is crucial for effective conservation and management. However, large felids, typically solitary, elusive and nocturnal, are difficult to survey. Tagging and following individuals with VHF or GPS technology is the standard approach, but costs are high and these methodologies can compromise animal welfare. Such limitations can restrict the use of these techniques at population or landscape levels. In this paper we describe a robust technique to identify and sex individual pumas from footprints. We used a standardized image collection protocol to collect a reference database of 535 footprints from 35 captive pumas over 10 facilities; 19 females (300 footprints) and 16 males (235 footprints), ranging in age from 1–20 yrs. Images were processed in JMP data visualization software, generating one hundred and twenty three measurements from each footprint. Data were analyzed using a customized model based on a pairwise trail comparison using robust cross-validated discriminant analysis with a Ward’s clustering method. Classification accuracy was consistently > 90% for individuals, and for the correct classification of footprints within trails, and > 99% for sex classification. The technique has the potential to greatly augment the methods available for studying puma and other elusive felids, and is amenable to both citizen-science and opportunistic/local community data collection efforts, particularly as the data collection protocol is inexpensive and intuitive. PMID:28273159

  7. The challenge of monitoring elusive large carnivores: An accurate and cost-effective tool to identify and sex pumas (Puma concolor) from footprints.

    PubMed

    Alibhai, Sky; Jewell, Zoe; Evans, Jonah

    2017-01-01

    Acquiring reliable data on large felid populations is crucial for effective conservation and management. However, large felids, typically solitary, elusive and nocturnal, are difficult to survey. Tagging and following individuals with VHF or GPS technology is the standard approach, but costs are high and these methodologies can compromise animal welfare. Such limitations can restrict the use of these techniques at population or landscape levels. In this paper we describe a robust technique to identify and sex individual pumas from footprints. We used a standardized image collection protocol to collect a reference database of 535 footprints from 35 captive pumas over 10 facilities; 19 females (300 footprints) and 16 males (235 footprints), ranging in age from 1-20 yrs. Images were processed in JMP data visualization software, generating one hundred and twenty three measurements from each footprint. Data were analyzed using a customized model based on a pairwise trail comparison using robust cross-validated discriminant analysis with a Ward's clustering method. Classification accuracy was consistently > 90% for individuals, and for the correct classification of footprints within trails, and > 99% for sex classification. The technique has the potential to greatly augment the methods available for studying puma and other elusive felids, and is amenable to both citizen-science and opportunistic/local community data collection efforts, particularly as the data collection protocol is inexpensive and intuitive.
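
    To illustrate the general shape of a cross-validated discriminant analysis on footprint measurements, the scikit-learn sketch below runs on synthetic data; it does not reproduce the study's pairwise trail-comparison model, Ward clustering, or JMP workflow, and all sizes are hypothetical.

```python
# Illustrative cross-validated discriminant analysis for sex classification from footprint
# measurements, using synthetic data in place of the study's reference database.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_per_class, n_features = 300, 123          # e.g., 123 measurements per footprint
males = rng.normal(loc=0.3, scale=1.0, size=(n_per_class, n_features))
females = rng.normal(loc=-0.3, scale=1.0, size=(n_per_class, n_features))
X = np.vstack([males, females])
y = np.array(["male"] * n_per_class + ["female"] * n_per_class)

scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=10)
print(f"10-fold CV accuracy: {scores.mean():.2%}")
```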

  8. Argon Cluster Sputtering Source for ToF-SIMS Depth Profiling of Insulating Materials: High Sputter Rate and Accurate Interfacial Information.

    PubMed

    Wang, Zhaoying; Liu, Bingwen; Zhao, Evan W; Jin, Ke; Du, Yingge; Neeway, James J; Ryan, Joseph V; Hu, Dehong; Zhang, Kelvin H L; Hong, Mina; Le Guernic, Solenne; Thevuthasan, Suntharampilai; Wang, Fuyi; Zhu, Zihua

    2015-08-01

    The use of an argon cluster ion sputtering source has been demonstrated to perform superiorly relative to traditional oxygen and cesium ion sputtering sources for ToF-SIMS depth profiling of insulating materials. The superior performance has been attributed to effective alleviation of surface charging. A simulated nuclear waste glass (SON68) and layered hole-perovskite oxide thin films were selected as model systems because of their fundamental and practical significance. Our results show that high sputter rates and accurate interfacial information can be achieved simultaneously for argon cluster sputtering, whereas this is not the case for cesium and oxygen sputtering. Therefore, the implementation of an argon cluster sputtering source can significantly improve the analysis efficiency of insulating materials and, thus, can expand its applications to the study of glass corrosion, perovskite oxide thin film characterization, and many other systems of interest.

  9. Mutual information-based feature selection for low-cost BCIs based on motor imagery.

    PubMed

    Schiatti, L; Faes, L; Tessadori, J; Barresi, G; Mattos, L

    2016-08-01

    In the present study, a feature selection algorithm based on mutual information (MI) was applied to electro-encephalographic (EEG) data acquired during three different motor imagery tasks from two datasets: Dataset I from BCI Competition IV, including full scalp recordings from four subjects, and new data recorded from three subjects using the popular low-cost Emotiv EPOC EEG headset. The aim was to evaluate optimal channels and band-power (BP) features for motor imagery task discrimination, in order to assess the feasibility of a portable low-cost motor imagery-based Brain-Computer Interface (BCI) system. The minimal subset of features most relevant to task description and least redundant to each other was determined, and the corresponding classification accuracy was assessed offline employing a linear support vector machine (SVM) in a 10-fold cross-validation scheme. The analysis was performed: (a) on the original full Dataset I from BCI Competition IV, (b) on a restricted channel set from Dataset I corresponding to available Emotiv EPOC electrode locations, and (c) on data recorded with the EPOC system. Results from (a) showed that an offline classification accuracy above 80% can be reached using only 5 features. Limiting the analysis to EPOC channels caused a decrease in classification accuracy, although it still remained above chance level, both for data from (b) and (c). A top accuracy of 70% was achieved using 2 optimal features. These results encourage further research towards the development of portable low-cost motor imagery-based BCI systems.
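
    A small scikit-learn sketch of the overall recipe (mutual-information feature ranking followed by a linear SVM evaluated with 10-fold cross-validation) on synthetic band-power features. The redundancy-minimizing selection and EEG preprocessing of the study are not reproduced, and all sizes are hypothetical.

```python
# Sketch of mutual-information-based feature selection followed by a linear SVM with
# 10-fold cross-validation, on synthetic band-power features. Illustrative only.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(2)
n_trials, n_features = 180, 56              # e.g., band-power features over channels
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 3, size=n_trials)       # three motor-imagery classes
X[:, 0] += y                                # make a few features informative
X[:, 1] -= y

clf = make_pipeline(
    SelectKBest(mutual_info_classif, k=5),  # keep the 5 most informative features
    SVC(kernel="linear"),
)
print(cross_val_score(clf, X, y, cv=10).mean())
```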

  10. SnowyOwl: accurate prediction of fungal genes by using RNA-Seq and homology information to select among ab initio models

    PubMed Central

    2014-01-01

    Background Locating the protein-coding genes in novel genomes is essential to understanding and exploiting the genomic information but it is still difficult to accurately predict all the genes. The recent availability of detailed information about transcript structure from high-throughput sequencing of messenger RNA (RNA-Seq) delineates many expressed genes and promises increased accuracy in gene prediction. Computational gene predictors have been intensively developed for and tested in well-studied animal genomes. Hundreds of fungal genomes are now or will soon be sequenced. The differences of fungal genomes from animal genomes and the phylogenetic sparsity of well-studied fungi call for gene-prediction tools tailored to them. Results SnowyOwl is a new gene prediction pipeline that uses RNA-Seq data to train and provide hints for the generation of Hidden Markov Model (HMM)-based gene predictions and to evaluate the resulting models. The pipeline has been developed and streamlined by comparing its predictions to manually curated gene models in three fungal genomes and validated against the high-quality gene annotation of Neurospora crassa; SnowyOwl predicted N. crassa genes with 83% sensitivity and 65% specificity. SnowyOwl gains sensitivity by repeatedly running the HMM gene predictor Augustus with varied input parameters and selectivity by choosing the models with best homology to known proteins and best agreement with the RNA-Seq data. Conclusions SnowyOwl efficiently uses RNA-Seq data to produce accurate gene models in both well-studied and novel fungal genomes. The source code for the SnowyOwl pipeline (in Python) and a web interface (in PHP) is freely available from http://sourceforge.net/projects/snowyowl/. PMID:24980894
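
    The sensitivity and specificity figures quoted above can be read in the usual gene-prediction sense, as sketched below with hypothetical gene-model counts chosen only to reproduce the quoted percentages; the counts are not from the paper.

```python
# Toy illustration of the quoted 83% sensitivity / 65% specificity, using made-up counts
# of predicted vs. manually curated gene models.
def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity_like(tp, fp):
    # In gene-prediction benchmarks "specificity" is often reported as precision, TP / (TP + FP)
    return tp / (tp + fp)

tp, fp, fn = 8300, 4470, 1700            # hypothetical gene-level counts
print(f"sensitivity ~ {sensitivity(tp, fn):.0%}, specificity ~ {specificity_like(tp, fp):.0%}")
```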

  11. Effectiveness and cost of different strategies for information feedback in general practice.

    PubMed Central

    Szczepura, A; Wilmot, J; Davies, C; Fletcher, J

    1994-01-01

    AIM. The aim of this study was to determine the effectiveness and relative cost of three forms of information feedback to general practices--graphical, graphical plus a visit by a medical facilitator and tabular. METHOD. Routinely collected, centrally-held data were used where possible, analysed at practice level. Some non-routine practice data in the form of risk factor recording in medical notes, for example weight, smoking status, alcohol consumption and blood pressure, were also provided to those who requested it. The 52 participating practices were stratified and randomly allocated to one of the three feedback groups. The cost of providing each type of feedback was determined. The immediate response of practitioners to the form of feedback (acceptability), ease of understanding (intelligibility), and usefulness of regular feedback was recorded. Changes introduced as a result of feedback were assessed by questionnaire shortly after feedback, and 12 months later. Changes at the practice level in selected indicators were also assessed 12 and 24 months after initial feedback. RESULTS. The resulting cost per effect was calculated to be 46.10 pounds for both graphical and tabular feedback, 132.50 pounds for graphical feedback plus facilitator visit and 773.00 pounds for the manual audit of risk factors recorded in the practice notes. The three forms of feedback did not differ in intelligibility or usefulness, but feedback plus a medical facilitator visit was significantly less acceptable. There was a high level of self-reported organizational change following feedback, with 69% of practices reporting changes as a direct result; this was not significantly different for the three types of feedback. There were no significant changes in the selected indicators at 12 or 24 months following feedback. The practice characteristic most closely related to better indicators of preventive practice was practice size, smaller practices performing significantly better. Separate

  12. Risk information in support of cost estimates for the Baseline Environmental Management Report (BEMR). Section 1

    SciTech Connect

    Gelston, G.M.; Jarvis, M.F.; Warren, B.R.; Von Berg, R.

    1995-06-01

    The Pacific Northwest Laboratory (PNL) effort on the overall Baseline Environmental Management Report (BEMR) project consists of four installation-specific work components performed in succession. These components include (1) development of source terms, (2) collection of data and preparation of environmental settings reports, (3) calculation of unit risk factors, and (4) utilization of the unit risk factors in Automated Remedial Action Methodology (ARAM) for computation of target concentrations and cost estimates. This report documents work completed for the Nevada Test Site, Nevada, for components 2 and 3. The product of this phase of the BEMR project is the development of unit factors (i.e., unit transport factors, unit exposure factors, and unit risk factors). Thousands of these unit factors are generated and fill approximately one megabyte of computer information per installation. The final unit risk factors (URF) are transmitted electronically to BEMR-Cost task personnel as input to a computer program (ARAM). Abstracted files and exhibits of the URF information are included in this report. These visual formats are intended to provide a sample of the final task deliverable (the URF files) which can be easily read without a computer.

  13. 48 CFR 570.110 - Cost or pricing data and information other than cost or pricing data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... to and waivers for submitting cost or pricing data. Most leasing actions will have adequate price competition. For price analysis, you may use a market survey or an appraisal conducted using accepted real property appraisal procedures to establish a market price for comparison. (c) In exceptional cases,...

  14. The Perceived Effect of Hidden Costs on the Operational Management of Information Technology Outsourcing: A Qualitative Study

    ERIC Educational Resources Information Center

    Swift, Ian

    2011-01-01

    Information technology (IT) outsourcing is a business trend aimed at reducing costs and enabling companies to concentrate on their core competencies. This qualitative multiple case design research study explored the effects of hidden costs on the operational management of IT outsourcing. The study involved analyzing IT outsourcing agreements as…

  15. A better approach to cost estimation.

    PubMed

    Richmond, Russ

    2013-03-01

    Using ratios of costs to charges (RCCs) to estimate costs can cause hospitals to significantly over- or under-invest in service lines. A focus on improving cost estimation in cost centers where physicians have significant control over operating expenses, such as drugs or implants, can strengthen decision making and strategic planning. Connecting patient file information to purchasing data can lead to more accurate reflections of actual costs and help hospitals gain better visibility across service lines.
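
    A toy numeric contrast (all figures hypothetical) between an RCC-based estimate and an itemized cost that links purchasing data to the patient file, of the kind the piece argues for in implant- and drug-heavy cost centers.

```python
# Hypothetical comparison of an RCC-based cost estimate with an itemized cost built from
# purchasing data. All figures are made up for illustration.

rcc = 0.45                      # hospital-wide ratio of costs to charges
billed_charges = 60_000         # charges for an implant-heavy case

rcc_estimate = rcc * billed_charges

itemized_cost = (
    14_000   # implant, from purchasing data
    + 3_500  # drugs and supplies
    + 9_000  # OR time, nursing, and other department costs
)

print(f"RCC estimate:  ${rcc_estimate:,.0f}")
print(f"Itemized cost: ${itemized_cost:,.0f}")
print(f"Difference:    ${rcc_estimate - itemized_cost:,.0f}")
```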

  16. IrisPlex: a sensitive DNA tool for accurate prediction of blue and brown eye colour in the absence of ancestry information.

    PubMed

    Walsh, Susan; Liu, Fan; Ballantyne, Kaye N; van Oven, Mannis; Lao, Oscar; Kayser, Manfred

    2011-06-01

    A new era of 'DNA intelligence' is arriving in forensic biology, due to the impending ability to predict externally visible characteristics (EVCs) from biological material such as those found at crime scenes. EVC prediction from forensic samples, or from body parts, is expected to help concentrate police investigations towards finding unknown individuals, at times when conventional DNA profiling fails to provide informative leads. Here we present a robust and sensitive tool, termed IrisPlex, for the accurate prediction of blue and brown eye colour from DNA in future forensic applications. We used the six currently most eye colour-informative single nucleotide polymorphisms (SNPs) that previously revealed prevalence-adjusted prediction accuracies of over 90% for blue and brown eye colour in 6168 Dutch Europeans. The single multiplex assay, based on SNaPshot chemistry and capillary electrophoresis, both widely used in forensic laboratories, displays high levels of genotyping sensitivity with complete profiles generated from as little as 31pg of DNA, approximately six human diploid cell equivalents. We also present a prediction model to correctly classify an individual's eye colour, via probability estimation solely based on DNA data, and illustrate the accuracy of the developed prediction test on 40 individuals from various geographic origins. Moreover, we obtained insights into the worldwide allele distribution of these six SNPs using the HGDP-CEPH samples of 51 populations. Eye colour prediction analyses from HGDP-CEPH samples provide evidence that the test and model presented here perform reliably without prior ancestry information, although future worldwide genotype and phenotype data shall confirm this notion. As our IrisPlex eye colour prediction test is capable of immediate implementation in forensic casework, it represents one of the first steps forward in the creation of a fully individualised EVC prediction system for future use in forensic DNA intelligence.
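
    The published IrisPlex model is a multinomial logistic regression over six SNPs with fixed, published parameters. The scikit-learn sketch below only illustrates that general form by refitting a stand-in model on synthetic genotypes (coded 0/1/2) and reporting class probabilities; its coefficients, labels, and labeling rule are hypothetical.

```python
# Illustrative multinomial logistic regression predicting eye colour from six SNP genotypes
# coded as 0/1/2 minor-allele counts. Synthetic data; the published IrisPlex model uses
# fixed, published coefficients rather than a refit model like this one.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n_samples, n_snps = 500, 6
X = rng.integers(0, 3, size=(n_samples, n_snps))          # genotypes for six SNPs
# Hypothetical rule to label the synthetic data: the first SNP dominates, as a stand-in
y = np.where(X[:, 0] == 2, "blue", np.where(X[:, 0] == 0, "brown", "intermediate"))

model = LogisticRegression(max_iter=1000).fit(X, y)
new_profile = np.array([[2, 1, 0, 2, 1, 1]])              # hypothetical genotype profile
probs = dict(zip(model.classes_, model.predict_proba(new_profile)[0]))
print(probs)
```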

  17. Layer-switching cost and optimality in information spreading on multiplex networks

    PubMed Central

    Min, Byungjoon; Gwak, Sang-Hwan; Lee, Nanoom; Goh, K. -I.

    2016-01-01

    We study a model of information spreading on multiplex networks, in which agents interact through multiple interaction channels (layers), say online vs. offline communication layers, subject to layer-switching cost for transmissions across different interaction layers. The model is characterized by the layer-wise path-dependent transmissibility over a contact, that is dynamically determined dependently on both incoming and outgoing transmission layers. We formulate an analytical framework to deal with such path-dependent transmissibility and demonstrate the nontrivial interplay between the multiplexity and spreading dynamics, including optimality. It is shown that the epidemic threshold and prevalence respond to the layer-switching cost non-monotonically and that the optimal conditions can change in abrupt non-analytic ways, depending also on the densities of network layers and the type of seed infections. Our results elucidate the essential role of multiplexity that its explicit consideration should be crucial for realistic modeling and prediction of spreading phenomena on multiplex social networks in an era of ever-diversifying social interaction layers. PMID:26887527

  18. Cost and results of information systems for health and poverty indicators in the United Republic of Tanzania.

    PubMed Central

    Rommelmann, Vanessa; Setel, Philip W.; Hemed, Yusuf; Angeles, Gustavo; Mponezya, Hamisi; Whiting, David; Boerma, Ties

    2005-01-01

    OBJECTIVE: To examine the costs of complementary information generation activities in a resource-constrained setting and compare the costs and outputs of information subsystems that generate the statistics on poverty, health and survival required for monitoring, evaluation and reporting on health programmes in the United Republic of Tanzania. METHODS: Nine systems used by four government agencies or ministries were assessed. Costs were calculated from budgets and expenditure data made available by information system managers. System coverage, quality assurance and information production were reviewed using questionnaires and interviews. Information production was characterized in terms of 38 key sociodemographic indicators required for national programme monitoring. FINDINGS: In 2002-03 approximately US$ 0.53 was spent per Tanzanian citizen on the nine information subsystems that generated information on 37 of the 38 selected indicators. The census and reporting system for routine health service statistics had the largest participating populations and highest total costs. Nationally representative household surveys and demographic surveillance systems (which are not based on nationally representative samples) produced more than half the indicators and used the most rigorous quality assurance. Five systems produced fewer than 13 indicators and had comparatively high costs per participant. CONCLUSION: Policy-makers and programme planners should be aware of the many trade-offs with respect to system costs, coverage, production, representativeness and quality control when making investment choices for monitoring and evaluation. In future, formal cost-effectiveness studies of complementary information systems would help guide investments in the monitoring, evaluation and planning needed to demonstrate the impact of poverty-reduction and health programmes. PMID:16184275

  19. Towards a personalized environmental health information service using low-cost sensors and crowdsourcing

    NASA Astrophysics Data System (ADS)

    Castell, Nuria; Liu, Hai-Ying; Schneider, Philipp; Cole-Hunter, Tom; Lahoz, William; Bartonova, Alena

    2015-04-01

    Most European cities exceed the air quality guidelines established by the WHO to protect human health. As such, citizens are exposed to potentially harmful pollutant levels. Some cities have services (e.g., web pages, mobile apps, etc.) which provide timely air quality information to the public. However, air quality data at the individual level are currently scarce or non-existent. Making this information directly useful to individuals poses a challenge. For instance, if a user is informed that the air quality is "poor", what does that mean for him/her, and how can this information be acted upon? Despite individuals having a unique relationship with their environment, the information on the state of atmospheric components and related hazards is currently mostly generic, and seldom personally relevant. This undermines citizens' interest in their environment, and consequently limits their ability to recognize and change both their contribution and their exposure to air pollution. In Oslo, two EU-funded projects, CITI-SENSE (Engelken-Jorge et al., 2014) and Citi-Sense-MOB (Castell et al., 2014), are trying to establish a dialogue with citizens by providing them with the possibility of getting personalized air quality information on their smartphones. The users are able to check the air quality in their immediate surroundings and track their individual exposure while moving through the urban environment (Castell et al., 2014). In this way, they may be able to reduce their exposure by changing transport modes or routes, for example by selecting less polluted streets to walk or cycle through. Using a smartphone application, citizens are engaged in collecting and sharing environmental data generated by low-cost air quality sensors, and in reporting their individual perception (turning citizens into sensors themselves). The highly spatially resolved data on air quality and perception is geo-located. This allows for simultaneous visualization of both kinds of the sensor

  20. The Costs of Information Retrieval Television. A Case Study in the Cost-Effectiveness of Educational Media.

    ERIC Educational Resources Information Center

    Gailitis, Maris M.

    The Information Retrieval Television (IRTV) system was a unique experimental media program initiated in several Ottawa, Canada schools in the fall of 1968. The program allowed teachers to select televised audiovisual programs for their classes at any time. This arrangement freed them from having to adapt to broadcast schedules or to the rigidities…

  1. Food Assistance: Financial Information on WIC Nutrition Services and Administrative Costs. United States General Accounting Office Report to Congressional Committees.

    ERIC Educational Resources Information Center

    Robertson, Robert E.

    The Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) is a federally funded nutrition assistance program administered by the Department of Agriculture's (USDA) Food and Nutrition Service (FNS). Responding to Congressional requests for information regarding program costs, this report provides information on: (1) funding…

  2. The Telecommunications Stranglehold on Europe's Information Use: Practical Constraints for SMEs and Economic Assessment Based on Cost Models.

    ERIC Educational Resources Information Center

    Delcroix, Jean-Claude

    There is a general feeling that European telecommunications are delaying the introduction of new information services. This paper responds to some of the questions concerning online information. The views result from research work at DECADE (Belgium) on the requirements of smaller organizations on the one hand and on telecommunications costs on…

  3. An economic approach that links volumetric estimates of resources with cost and price information

    SciTech Connect

    Nesbitt, D.M. )

    1993-01-01

    For many years, organizations such as the US Geological Survey have assembled volumetric estimates of gas and oil in place. It is legitimate for people in industry to ask: "What do such estimates mean to me? What do they mean to my business? What do they mean for commodity prices?" In a world of ideal, efficient markets, such estimates would have little relevance; the best use of one's time would be to merely survey the various markets. In reality, markets are not completely efficient, and methods other than market observations are required. Volumetric estimates can contribute to better decisionmaking if they can be associated with cost and price information and if their implications in the market can thereby be determined. Until the generalized equilibrium approach, volumetric information has never been linked with the market. It has never entered the decision process of private companies in the United States, Canada, or the rest of the world. With the approach outlined, the US Geological Survey volumetric estimates can be used to support such decisionmaking and lead to better industry profits, more enlightened regulation and Government administration, and more efficient use of resources. 66 refs., 28 figs.

  4. A Cost Analysis of the U.S. Air Force Overseas Posture: Informing Strategic Choices

    DTIC Science & Technology

    2013-01-01

    included end strength reductions of 50,000 personnel (Sustainable Defense Task Force, Debt, Deficits, and Defense: A Way Forward, Washington, D.C.)... estimate support costs, it is necessary to use a method that will capture the variable- versus fixed-costs dynamic effectively. The nature of these... of forward presence will have swamped the relative costs. Cost estimates for Operation Iraqi Freedom range from about $800 billion to several

  5. Sexual harassment induces a temporary fitness cost but does not constrain the acquisition of environmental information in fruit flies.

    PubMed

    Teseo, Serafino; Veerus, Liisa; Moreno, Céline; Mery, Frédéric

    2016-01-01

    Across animals, sexual harassment induces fitness costs for females and males. However, little is known about the cognitive costs involved, i.e. whether it constrains learning processes, which could ultimately affect an individual's fitness. Here we evaluate the acquisition of environmental information in groups of fruit flies challenged with various levels of male sexual harassment. We show that, although high sexual harassment induces a temporary fitness cost for females, all fly groups of both sexes exhibit similar levels of learning. This suggests that, in fruit flies, the fitness benefits of acquiring environmental information are not affected by the fitness costs of sexual harassment, and that selection may favour cognition even in unfavourable social contexts. Our study provides novel insights into the relationship between sexual conflicts and cognition and the evolution of female counterstrategies against male sexual harassment.

  6. Waste Management Facilities Cost Information report for Greater-Than-Class C and DOE equivalent special case waste

    SciTech Connect

    Feizollahi, F.; Shropshire, D.

    1993-07-01

    This Waste Management Facility Cost Information (WMFCI) report for Greater-Than-Class C low-level waste (GTCC LLW) and DOE equivalent special case waste contains preconceptual designs and planning level life-cycle cost (PLCC) estimates for treatment, storage, and disposal facilities needed for management of GTCC LLW and DOE equivalent waste. The report contains information on 16 facilities (referred to as cost modules). These facilities are treatment facility front-end and back-end support functions (administration support, and receiving, preparation, and shipping cost modules); seven treatment concepts (incineration, metal melting, shredding/compaction, solidification, vitrification, metal sizing and decontamination, and wet/air oxidation cost modules); two storage concepts (enclosed vault and silo); disposal facility front-end functions (disposal receiving and inspection cost module); and four disposal concepts (shallow-land, engineered shallow-land, intermediate depth, and deep geological cost modules). Data in this report allow the user to develop PLCC estimates for various waste management options. A procedure to guide the U.S. Department of Energy (DOE) and its contractor personnel in the use of estimating data is also included in this report.

  7. Cost model for biobanks.

    PubMed

    Gonzalez-Sanchez, M Beatriz; Lopez-Valeiras, Ernesto; Morente, Manuel M; Fernández Lago, Orlando

    2013-10-01

    Current economic conditions and budget constraints in publicly funded biomedical research have brought about a renewed interest in analyzing the cost and economic viability of research infrastructures. However, there are no proposals for specific cost accounting models for these types of organizations in the international scientific literature. The aim of this paper is to present the basis of a cost analysis model useful for any biobank regardless of the human biological samples that it stores for biomedical research. The development of a unique cost model for biobanks can be a complicated task due to the diversity of the biological samples they store. Different types of samples (DNA, tumor tissues, blood, serum, etc.) require different production processes. Nonetheless, the common basic steps of the production process can be identified. Thus, the costs incurred in each step can be analyzed in detail to provide cost information. Six stages and four cost objects were obtained by taking the production processes of biobanks belonging to the Spanish National Biobank Network as a starting point. Templates and examples are provided to help managers to identify and classify the costs involved in their own biobanks to implement the model. The application of this methodology will provide accurate information on cost objects, along with useful information to give an economic value to the stored samples, to analyze the efficiency of the production process and to evaluate the viability of some sample collections.

  8. In-Office Application of Fluoride Gel or Varnish: Cost-Effectiveness and Expected Value of Perfect Information Analysis.

    PubMed

    Schwendicke, Falk; Stolpe, Michael

    2017-04-08

    Application of fluoride gel/varnish (FG/FV) reduces caries increments but generates costs. Avoiding restorative treatments by preventing caries might compensate for these costs. We assessed the cost-effectiveness of dentists applying FG/FV in office and the expected value of perfect information (EVPI). EVPI analyses estimate the economic value of having perfect knowledge, assisting research resource allocation. A mixed public-private-payer perspective in Germany was adopted. A population of 12-year-olds was followed over their lifetime, with caries increments modelled using wide intervals to reflect the uncertainty of caries risk. Biannual application of FV/FG until age 18 years was compared to no fluoride application. Effectiveness parameters and their uncertainty were derived from systematic reviews. The health outcome was caries increment (decayed, missing, or filled teeth; DMFT). Cost calculations were based on fee catalogs or microcosting, including costs for individual-prophylactic fluoridation and, for FG, an individualized tray, plus material costs. Microsimulations, sensitivity, and EVPI analyses were performed. On average and applied to a largely low-risk population, no application of fluoride was least costly but also least effective (EUR 230; 11 DMFT). FV was more costly and effective (EUR 357; 7 DMFT). FG was less effective than FV and also more costly when using individualized trays. FV was the best choice for payers willing to invest EUR 39 or more per avoided DMFT. This cost-effectiveness will differ in different settings/countries or if FG/FV is applied by other care professionals. The EVPI was mainly driven by the individual's caries risk, as FV/FG were significantly more cost-effective in high-risk populations than in low-risk ones. Future studies should focus on caries risk prediction.
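
    A simplified, deterministic reading of the comparison reported above can be expressed as an incremental cost-effectiveness ratio and a net-monetary-benefit rule. The sketch below uses only the average figures quoted in the abstract; the paper's own EUR 39 threshold comes from probabilistic microsimulation, so this back-of-the-envelope figure is only indicative.

```python
# Back-of-the-envelope comparison using the averages quoted in the abstract.
# The paper's EUR 39 threshold comes from probabilistic microsimulation,
# so this deterministic figure is only indicative.
no_fluoride = {"name": "no fluoride", "cost": 230.0, "dmft": 11}
varnish     = {"name": "fluoride varnish", "cost": 357.0, "dmft": 7}

icer = (varnish["cost"] - no_fluoride["cost"]) / (no_fluoride["dmft"] - varnish["dmft"])
print(f"deterministic ICER: EUR {icer:.1f} per avoided DMFT")

def nmb(option, wtp, dmft_baseline=11):
    """Net monetary benefit at a willingness to pay (wtp) per avoided DMFT."""
    return wtp * (dmft_baseline - option["dmft"]) - option["cost"]

for wtp in (20, 40, 60):
    best = max((no_fluoride, varnish), key=lambda o: nmb(o, wtp))
    print(f"WTP EUR {wtp}/avoided DMFT -> prefer {best['name']}")
```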

  9. Risk-adjusted capitation based on the Diagnostic Cost Group Model: an empirical evaluation with health survey information.

    PubMed Central

    Lamers, L M

    1999-01-01

    OBJECTIVE: To evaluate the predictive accuracy of the Diagnostic Cost Group (DCG) model using health survey information. DATA SOURCES/STUDY SETTING: Longitudinal data collected for a sample of members of a Dutch sickness fund. In the Netherlands the sickness funds provide compulsory health insurance coverage for the 60 percent of the population in the lowest income brackets. STUDY DESIGN: A demographic model and DCG capitation models are estimated by means of ordinary least squares, with an individual's annual healthcare expenditures in 1994 as the dependent variable. For subgroups based on health survey information, costs predicted by the models are compared with actual costs. Using stepwise regression procedures a subset of relevant survey variables that could improve the predictive accuracy of the three-year DCG model was identified. Capitation models were extended with these variables. DATA COLLECTION/EXTRACTION METHODS: For the empirical analysis, panel data of sickness fund members were used that contained demographic information, annual healthcare expenditures, and diagnostic information from hospitalizations for each member. In 1993, a mailed health survey was conducted among a random sample of 15,000 persons in the panel data set, with a 70 percent response rate. PRINCIPAL FINDINGS: The predictive accuracy of the demographic model improves when it is extended with diagnostic information from prior hospitalizations (DCGs). A subset of survey variables further improves the predictive accuracy of the DCG capitation models. The predictable profits and losses based on survey information for the DCG models are smaller than for the demographic model. Most persons with predictable losses based on health survey information were not hospitalized in the preceding year. CONCLUSIONS: The use of diagnostic information from prior hospitalizations is a promising option for improving the demographic capitation payment formula. This study suggests that diagnostic
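
    The estimation step described above (ordinary least squares with annual healthcare expenditures as the dependent variable, with and without diagnostic-cost-group information) can be sketched as follows; the data, variables and effect sizes are invented for illustration and are not the Dutch sickness-fund data.

```python
# Synthetic illustration of the estimation step: OLS on annual expenditures
# with demographic regressors only, then extended with a DCG-style indicator
# of a costly diagnosis at a prior hospitalisation. Data are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
age = rng.integers(18, 90, n)
male = rng.integers(0, 2, n)
dcg = rng.integers(0, 2, n)   # 1 = costly diagnosis at prior hospitalisation
cost = 400 + 15 * age + 120 * male + 2500 * dcg + rng.gamma(2.0, 400.0, n)

def ols_r2(X, y):
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

print("demographic model R^2:      ", round(ols_r2(np.column_stack([age, male]), cost), 3))
print("demographic + DCG model R^2:", round(ols_r2(np.column_stack([age, male, dcg]), cost), 3))
```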

  10. Development of Cost Benefit Methodology for Scientific and Technical Information Communication and Application to Information Analysis Centers. Final Report.

    ERIC Educational Resources Information Center

    Mason, Robert M.; And Others

    This document presents a research effort intended to improve the economic information available for formulating policies and making decisions related to Information Analysis Centers (IAC's) and IAC services. The project used a system of IAC information activities to analyze the functional aspects of IAC services, calculate the present value of net…

  11. Using value-based total cost of ownership (TCO) measures to inform subsystem trade-offs

    NASA Astrophysics Data System (ADS)

    Radziwill, Nicole M.; DuPlain, Ronald F.

    2010-07-01

    Total Cost of Ownership (TCO) is a metric from management accounting that helps expose both the direct and indirect costs of a business decision. However, TCO can sometimes be too simplistic for "make vs. buy" decisions (or even choosing between competing design alternatives) when value and extensibility are more critical than total cost. A three-dimensional value-based TCO, which was developed to clarify product decisions for an observatory prior to Final Design Review (FDR), will be presented in this session. This value-based approach incorporates priority of requirements, satisfiability of requirements, and cost, and can be easily applied in any environment.
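
    One plausible way to operationalise a three-dimensional, value-based TCO, combining requirement priority, satisfiability and cost, is sketched below; the weighting scheme and all numbers are assumptions, not the authors' exact formula.

```python
# Hypothetical value-per-cost comparison of two alternatives; the weighting
# scheme (priority x satisfiability, divided by lifetime cost) is an assumed
# reading of the three dimensions named in the abstract.
def value_per_cost(priorities, satisfiability, total_cost):
    """priorities: {requirement: weight}; satisfiability: {requirement: 0..1}."""
    value = sum(w * satisfiability.get(req, 0.0) for req, w in priorities.items())
    return value / total_cost

reqs = {"throughput": 5, "maintainability": 3, "extensibility": 4}

build = value_per_cost(reqs, {"throughput": 0.9, "maintainability": 0.8, "extensibility": 0.9}, 1_200_000)
buy   = value_per_cost(reqs, {"throughput": 0.8, "maintainability": 0.6, "extensibility": 0.4}, 700_000)
print(f"build: {build * 1e6:.2f}, buy: {buy * 1e6:.2f}  (value units per million spent)")
```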

  12. Cost-Effectiveness Analysis. Instructor Guide. Working for Clean Water: An Information Program for Advisory Groups.

    ERIC Educational Resources Information Center

    Buskirk, E. Drannon, Jr.

    Presented is the instructor's manual for a one-hour presentation on cost-effectiveness analysis. Topics covered are the scope of cost-effectiveness analysis, basic assessment procedures, and the role of citizens in the analysis of alternatives. A supplementary audiovisual program is available. These materials are part of the Working for Clean…

  13. Student Awareness of Costs and Benefits of Educational Decisions: Effects of an Information Campaign. CEE DP 139

    ERIC Educational Resources Information Center

    McGuigan, Martin; McNally, Sandra; Wyness, Gill

    2012-01-01

    The economic benefits of staying on in education have been well established. But do students know this? One of the reasons why students might drop out of education too soon is that they are not well informed about the costs and benefits of staying on in education at an appropriate time of their educational career. Indeed, the fact that…

  14. 78 FR 40696 - Proposed Information Collection; Comment Request; Alaska Crab Cost Recovery

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-08

    ... Crab Cost Recovery AGENCY: National Oceanic and Atmospheric Administration (NOAA), Commerce. ACTION... and Aleutian Islands (BSAI) Crab includes the Crab Rationalization (CR) Program, a limited access system that allocates BSAI Crab resources among harvesters, processors, and coastal communities....

  15. Use of Programmed Review of Information for Costing and Evaluation (PRICE) model at CNES

    NASA Astrophysics Data System (ADS)

    Roux, J. B.

    The bases of the PRICE model, which calculates a cost-mass relation for a given project from data on thousands of other projects, are outlined, along with its application to the Ariane project. Ten to 20 descriptors per project define key elements, such as manufacturing processes, and regression analysis calculates their coefficients and ranges. The model assesses physical parameters as well as others such as time limits. The few differences between model forecasts and actual production costs for Ariane were due to special circumstances.

  16. Economically and environmentally informed policy for road resurfacing: tradeoffs between costs and greenhouse gas emissions

    NASA Astrophysics Data System (ADS)

    Reger, Darren; Madanat, Samer; Horvath, Arpad

    2014-10-01

    As road conditions worsen, users experience an increase in fuel consumption and vehicle wear and tear. This increases the costs incurred by the drivers, and also increases the amount of greenhouse gases (GHGs) that vehicles emit. Pavement condition can be improved through rehabilitation activities (resurfacing) to reduce the effects on users, but these activities also have significant cost and GHG emission impacts. The objective of pavement management is to minimize total societal (user and agency) costs. However, the environmental impacts associated with the cost-minimizing policy are not currently accounted for. We show that there exists a range of potentially optimal decisions, known as the Pareto frontier, in which it is not possible to decrease total emissions without increasing total costs and vice versa. This research explores these tradeoffs for a system of pavement segments. For a case study, a network was created from a subset of California’s highways using available traffic data. It was shown that the current resurfacing strategy used by the state’s transportation agency, Caltrans, does not fall on the Pareto frontier, meaning that significant savings in both total costs and total emissions can be achieved by switching to one of the optimal policies. The methods presented in this paper also allow the decision maker to evaluate the impact of other policies, such as reduced vehicle kilometers traveled or better construction standards.
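
    The Pareto-frontier idea described above can be illustrated with a short sketch: given candidate resurfacing policies evaluated for total societal cost and total GHG emissions, keep only the non-dominated ones. Policy names and numbers below are invented.

```python
# Toy Pareto-frontier computation over candidate resurfacing policies,
# each evaluated for total societal cost and total GHG emissions
# (policy names and numbers are invented).
def pareto_frontier(policies):
    """policies: list of (name, cost, emissions); lower is better on both axes."""
    frontier = []
    for name, cost, ghg in policies:
        dominated = any(c <= cost and g <= ghg and (c < cost or g < ghg)
                        for _, c, g in policies)
        if not dominated:
            frontier.append((name, cost, ghg))
    return sorted(frontier, key=lambda p: p[1])

candidates = [
    ("resurface at roughness 3.0", 100.0, 90.0),
    ("resurface at roughness 3.5",  95.0, 97.0),
    ("status-quo trigger",         110.0, 105.0),   # dominated: dearer and dirtier
    ("resurface at roughness 2.5", 120.0, 80.0),
]
for name, cost, ghg in pareto_frontier(candidates):
    print(f"{name}: cost {cost}, emissions {ghg}")
```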

  17. Estimating the Cost and Payment for Sanitation in the Informal Settlements of Kisumu, Kenya: A Cross Sectional Study.

    PubMed

    Simiyu, Sheillah; Swilling, Mark; Rheingans, Richard; Cairncross, Sandy

    2017-01-06

    Lack of sanitation facilities is a common occurrence in the informal settlements found in most developing countries. One challenge with sanitation provision in these settlements is the cost and financing of sanitation. This study aimed at estimating the cost of sanitation, and investigating the social and economic dynamics within Kisumu's informal settlements that hinder provision and uptake of sanitation facilities. Primary data was collected from residents of the settlements, and using logistic and hedonic regression analysis, we identify characteristics of residents with sanitation facilities, and estimate the cost of sanitation as revealed in rental prices. Our study finds that sanitation constitutes approximately 54% of the rent paid in the settlements; and dynamics such as landlords' and tenants' preferences, and sharing of sanitation facilities influence provision and payment for sanitation. This study contributes to general development by estimating the cost of sanitation, and further identifies barriers and opportunities for improvement including the interplay between landlords and tenants. Provision of sanitation in informal settlements is intertwined with social and economic dynamics, and development approaches should target both landlords and tenants, while also engaging various stakeholders to work together to identify affordable and appropriate sanitation technologies.
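
    The hedonic step described above can be sketched as a log-rent regression with a sanitation dummy, whose coefficient gives the rent premium associated with having a facility. The data below are synthetic, not the Kisumu survey, and the specification is a simplified reading of the method rather than the authors' exact model.

```python
# Synthetic hedonic regression: log monthly rent on dwelling characteristics
# plus a sanitation dummy; the dummy's coefficient gives the rent premium
# associated with having a facility. Data and coefficients are invented.
import numpy as np

rng = np.random.default_rng(42)
n = 800
rooms = rng.integers(1, 4, n)
brick_walls = rng.integers(0, 2, n)
has_toilet = rng.integers(0, 2, n)
log_rent = 6.5 + 0.25 * rooms + 0.15 * brick_walls + 0.43 * has_toilet + rng.normal(0, 0.2, n)

X = np.column_stack([np.ones(n), rooms, brick_walls, has_toilet])
beta, *_ = np.linalg.lstsq(X, log_rent, rcond=None)
premium = np.exp(beta[3]) - 1
print(f"rent premium associated with a sanitation facility: {premium:.0%}")
```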

  18. Estimating the Cost and Payment for Sanitation in the Informal Settlements of Kisumu, Kenya: A Cross Sectional Study

    PubMed Central

    Simiyu, Sheillah; Swilling, Mark; Rheingans, Richard; Cairncross, Sandy

    2017-01-01

    Lack of sanitation facilities is a common occurrence in the informal settlements found in most developing countries. One challenge with sanitation provision in these settlements is the cost and financing of sanitation. This study aimed at estimating the cost of sanitation, and investigating the social and economic dynamics within Kisumu’s informal settlements that hinder provision and uptake of sanitation facilities. Primary data was collected from residents of the settlements, and using logistic and hedonic regression analysis, we identify characteristics of residents with sanitation facilities, and estimate the cost of sanitation as revealed in rental prices. Our study finds that sanitation constitutes approximately 54% of the rent paid in the settlements; and dynamics such as landlords’ and tenants’ preferences, and sharing of sanitation facilities influence provision and payment for sanitation. This study contributes to general development by estimating the cost of sanitation, and further identifies barriers and opportunities for improvement including the interplay between landlords and tenants. Provision of sanitation in informal settlements is intertwined with social and economic dynamics, and development approaches should target both landlords and tenants, while also engaging various stakeholders to work together to identify affordable and appropriate sanitation technologies. PMID:28067812

  19. Draft Submission; Social Cost of Energy Generation

    SciTech Connect

    1990-01-05

    This report is intended to provide a general understanding of the social costs associated with electric power generation. Based on a thorough review of recent literature on the subject, the report describes how these social costs can be most fully and accurately evaluated, and discusses important considerations in applying this information within the competitive bidding process.

  20. 77 FR 20012 - Federal Acquisition Regulation; Information Collection; Corporate Aircraft Costs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-03

    ... Management and Budget (OMB) a request to review and approve an extension of a previously approved information... technological collection techniques or other forms of information technology. DATES: Submit comments on...

  1. An Examination of the Explicit Costs of Sensitive Information Security Breaches

    ERIC Educational Resources Information Center

    Toe, Cleophas Adeodat

    2013-01-01

    Data security breaches are categorized as loss of information that is entrusted to an organization by its customers, partners, shareholders, and stakeholders. Data breaches are significant risk factors for companies that store, process, and transmit sensitive personal information. Sensitive information is defined as confidential or proprietary…

  2. 78 FR 4425 - Notice of Proposed Information Collection: Comment Request; Multifamily Contractor's Mortgagor's...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-22

    ... data to keep Field Office cost data banks and cost estimates current and accurate. HUD-92205A is used to certify the actual costs of acquisition or refinancing... Mortgagor's Cost Breakdowns and Certifications AGENCY: Office of the Chief Information Officer, HUD....

  3. Allocating Information Costs in a Negotiated Information Order: Interorganizational Constraints on Decision Making in Norwegian Oil Insurance.

    ERIC Educational Resources Information Center

    Heimer, Carol A.

    1985-01-01

    This paper analyzes two types of decisions for insuring mobile oil rigs and fixed installations in the Norwegian North Sea: (1) decisions about information for ratemaking and underwriting, and (2) decisions about the conditions of insurance. Appended are 46 references. (MLF)

  4. Investing in International Information Exchange Activities to Improve the Safety, Cost Effectiveness and Schedule of Cleanup - 13281

    SciTech Connect

    Seed, Ian; James, Paula; Mathieson, John; Judd, Laurie; Elmetti-Ramirez, Rosa; Han, Ana

    2013-07-01

    With decreasing budgets and increasing pressure on completing cleanup missions as quickly, safely and cost-effectively as possible, there is significant benefit to be gained from collaboration and joint efforts between organizations facing similar issues. With this in mind, the US Department of Energy (DOE) and the UK Nuclear Decommissioning Authority (NDA) have formally agreed to share information on lessons learned on the development and application of new technologies and approaches to improve the safety, cost effectiveness and schedule of the cleanup of legacy wastes. To facilitate information exchange a range of tools and methodologies were established. These included tacit knowledge exchange through facilitated meetings, conference calls and Site visits as well as explicit knowledge exchange through document sharing and newsletters. A DOE web-based portal has been established to capture these exchanges and add to them via discussion boards. The information exchange is operating at the Government-to-Government strategic level as well as at the Site Contractor level to address both technical and managerial topic areas. This effort has resulted in opening a dialogue and building working relationships. In some areas joint programs of work have been initiated, thus saving resources and enabling the parties to leverage one another's activities. The potential benefits of high quality information exchange are significant, ranging from cost avoidance through identification of an approach to a problem that has been proven elsewhere to cost sharing and joint development of a new technology to address a common problem. The benefits in outcomes significantly outweigh the costs of the process. The applicability of the tools and methods along with the lessons learned regarding some key issues is of use to any organization that wants to improve value for money. In the waste management marketplace, there are a multitude of challenges being addressed by multiple organizations and

  5. Examining the Potential of Information Technologies to Improve Cost Control in Community Colleges

    ERIC Educational Resources Information Center

    Sudhakar, Samuel

    2013-01-01

    The challenges facing publicly funded community colleges have never been greater. Declining state and federal support and decreasing property tax revenues have placed a tremendous pressure on tuition rates. Declining revenues combined with the lack of adequate cost control, has caused in-state tuition and fees at public 2-year colleges to increase…

  6. 78 FR 37883 - Information Collection Activities: Report of Fuel Cost, Consumption, and Surcharge Revenue

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-24

    ... Surcharge Revenue AGENCY: Surface Transportation Board. ACTION: 60-day notice of request for comments and... Management and Budget (OMB) an extension of approval for the collection of the Report of Fuel Cost, Consumption, and Surcharge Revenue. Comments are requested concerning: (1) The accuracy of the Board's...

  7. Models of Collaboration and Cost Sharing in Transition Programming. Information Brief. Volume 6, Issue 1

    ERIC Educational Resources Information Center

    Timmons, Joe

    2007-01-01

    This brief describes collaboration and cost-sharing among agencies and Federal programs to improve transition supports for youth with disabilities. The brief reviews suggested activities for forming collaborations and pooling resources, and provides examples of programs that have "blended" or "braided" funds provided by partner agencies. It also…

  8. [Hospital information systems ineffectiveness in costing ambulatory chemotherapy in pulmonary oncology].

    PubMed

    Thomas, P; Raholimina, V; Ferri-Dessens, R M; Pibarot, M; Penot Ragon, C; Gregoire, R; Kleisbauer, J P

    2000-06-01

    The real cost of medical consumption was compared with the proportion of medication consumption of GHM no. 681 (homogeneous group of patients, chemotherapy for cancer in day care) in the French case mix system (PMSI). For those patients in our thoracic oncology unit (Sainte-Marguerite Hospital, Marseille, France), the real medication cost was calculated from prices paid by the hospital, then compared to the expected expenditures for the medication consumption of GHM 681, i.e. 678 French francs (24.1% of the 225 ISA points (synthetic activity index)). Over a period of 2 months in 1998, 87 patients (mean age 63 +/- 11) had 194 chemotherapy sessions in day care, with multi-drug therapy in 38 cases. Vinorelbine or gemcitabine represented 81% of the single drug chemotherapy. In 84% of the single drug and 76% of the multi-drug chemotherapy, the real cost of medication consumption was above the allocated budget. The mean cost for single drug chemotherapy was 1722 FF and 2920 FF for multi-drug chemotherapy. The budget allocated by the PMSI shows a deficit in most cases. To avoid a restriction in the use of some drugs, it appears that the French system of budget evaluation needs to be improved.

  9. 19 CFR 10.21 - Updating cost data and other information.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...; DEPARTMENT OF THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. General Provisions... data. Actual cost data must be submitted as soon as accounting procedures permit. To insure that... because of the accounting period normally used in the trade, or because of other relevant circumstances....

  10. Member satisfaction information as competitive intelligence: a new tool for increasing market share and reducing costs.

    PubMed

    Cooperman, T

    1995-01-01

    MCOs have begun to realize the impact that consumer satisfaction has on enrollment and pricing. Taking a lesson from auto manufacturers, MCOs are now realizing the additional advantages of obtaining consumer satisfaction information about their competitors. Knowing whether competitors' members intend to stay with or leave their plans, pinpointing competitors' strengths and weaknesses, and identifying unmet consumer needs allow MCOs to more successfully develop tactics and strategies for sales, marketing, and planning. This article describes the use of member satisfaction information as competitive intelligence, what to look for in this information, and sources for obtaining reliable information.

  11. Orangutans (Pongo abelii) "play the odds": information-seeking strategies in relation to cost, risk, and benefit.

    PubMed

    Marsh, Heidi L; MacDonald, Suzanne E

    2012-08-01

    Recent research has examined whether animals possess metacognition, or the ability to monitor their knowledge states. However, the extent to which animals actively control their knowledge states is still not well delineated. Although organisms might be capable of seeking information when it is lacking, it does not mean that it is always adaptive to do so. In the present set of experiments, we examined the flexibility of this behavior in captive orangutans (Pongo abelii; two adults and one juvenile) in a foraging task, by varying the necessity of information-seeking, the cost associated with it, the likelihood of error, and the value of the reward. In Experiment 1, subjects searched for information most often when it was "cheapest" energetically. In Experiment 2, subjects searched for information most often when the odds of making an error were the greatest. In Experiment 3, subjects searched for information more when the reward was doubled in value. In Experiment 4, adult subjects adapted to risk/benefit trade-offs in their searching behavior. In every experiment, subjects sought information more often when they needed it than when they already knew the solution to the problem. Therefore, the current research suggests that information-seeking behavior in orangutans shows a sophisticated level of flexibility, comparable to that seen in human children, as they appear to "play the odds" when making the decision to seek information or not.

  12. Information Technology Investment: Agencies Can Improve Performance, Reduce Costs, and Minimize Risks.

    DTIC Science & Technology

    2007-11-02

    strengthened management of three fundamental assets: personnel, knowledge and information, and capital property/fixed assets. Investments in information technology (IT) can have a dramatic impact on all three of these assets. However, an IT project’s impact comes from how the investment is selected, designed

  13. Cost (and Quality and Value) of Information Technology Support in Large Research Universities.

    ERIC Educational Resources Information Center

    Peebles, Christopher S.; Antolovic, Laurie

    1999-01-01

    Shows how financial and quality measures associated with the Balanced Scorecard (developed by Kaplan and Norton to measure organizational performance) can be applied to information technology (IT) user education and support in large research universities. Focuses on University Information Technology Services that has measured the quality of IT…

  14. Smart Aquifer Characterisation validated using Information Theory and Cost benefit analysis

    NASA Astrophysics Data System (ADS)

    Moore, Catherine

    2016-04-01

    The field data acquisition required to characterise aquifer systems is time consuming and expensive. Decisions regarding field testing, the type of field measurements to make and the spatial and temporal resolution of measurements have significant cost repercussions and impact the accuracy of various predictive simulations. The Smart Aquifer Characterisation (SAC) research programme (New Zealand (NZ)) addresses this issue by assembling and validating a suite of innovative methods for characterising groundwater systems at the large, regional and national scales. The primary outcome is a suite of cost effective tools and procedures provided to resource managers to advance the understanding and management of groundwater systems and thereby assist decision makers and communities in the management of their groundwater resources, including the setting of land use limits that protect fresh water flows and quality and the ecosystems dependent on that fresh water. The programme has focused on novel investigation approaches including the use of geophysics, satellite remote sensing, temperature sensing and age dating. The SMART (Save Money And Reduce Time) aspect of the programme emphasises techniques that use these passive, cost effective data sources to characterise groundwater systems at both the aquifer and the national scale by: • Determination of aquifer hydraulic properties • Determination of aquifer dimensions • Quantification of fluxes between ground waters and surface water • Groundwater age dating These methods allow either a lower cost method for estimating these properties and fluxes, or a greater spatial and temporal coverage for the same cost. To demonstrate the cost effectiveness of the methods a 'data worth' analysis is undertaken. The data worth method involves quantification of the utility of observation data in terms of how much it reduces the uncertainty of model parameters and decision-focussed predictions which depend on these parameters. Such

  15. Program Information Digest for the Professional Designation in Cost Analysis and Price Analysis

    DTIC Science & Technology

    1981-01-01

    Science degree. THE PROGRAM The primary purpose of the graduate programs offered by the School of Systems and Logistics is to provide selected...Each of the graduate programs conducted by the School is designed to help students accomplish the following specific educational objectives: 1. Apply...Management/Cost Analysis Elective STUDENTS Students are accepted in the graduate programs of the School of Systems and Logistics in the grades of

  16. Proceedings of the Printing Resources Management Information Systems Cost and Financial Workshop (1st), held 28-29 October 1982, Washington, DC.

    DTIC Science & Technology

    1982-12-01

    ... (NPPS). Cost and Financial (C&F) is a subsystem of the proposed second Printing Resources Management Information System (PRMIS II). The objectives of the

  17. 76 FR 35218 - Federal Acquisition Regulation; Information Collection; Cost or Pricing Data Requirements and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-16

    ... Aeronautics and Space Administration (NASA). ACTION: Notice of request for public comments regarding an... Budget (OMB) a request to review and approve an extension of a previously approved information...

  18. Information Search in Judgment Tasks: The Effects of Unequal Cue Validity and Cost.

    DTIC Science & Technology

    1984-05-01

    1976), marketers in market surveys (Chestnut and Jacoby, 1982), and drilling companies in test wells (Raiffa, 1968). In each case, a complex balance...contested issues." Journal of Marketing Research, 1977, 14, 569-573. Jennings, D., Amabile, M. & Ross, L.: "Informal covariation assessment: Data-based..." University Press, 1982. Kleitzer, G.D. & Wimmer, H.: "Information seeking in a multistage betting game." Archiv fur Psychologie, 1974, 126, 213-230. Lanzetta

  19. Impact of information cost and switching of trading strategies in an artificial stock market

    NASA Astrophysics Data System (ADS)

    Liu, Yi-Fang; Zhang, Wei; Xu, Chao; Vitting Andersen, Jørgen; Xu, Hai-Chuan

    2014-08-01

    This paper studies the switching of trading strategies and its effect on the market volatility in a continuous double auction market. We describe the behavior when some uninformed agents, who we call switchers, decide whether or not to pay for information before they trade. By paying for the information they behave as informed traders. First we verify that our model is able to reproduce some of the stylized facts in real financial markets. Next we consider the relationship between switching and the market volatility under different structures of investors. We find that there exists a positive relationship between the market volatility and the percentage of switchers. We therefore conclude that the switchers are a destabilizing factor in the market. However, for a given fixed percentage of switchers, the proportion of switchers that decide to buy information at a given moment of time is negatively related to the current market volatility. In other words, if more agents pay for information to know the fundamental value at some time, the market volatility will be lower. This is because the market price is closer to the fundamental value due to information diffusion between switchers.
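
    A speculative sketch of the switching decision described above, not the authors' agent-based model: an uninformed agent pays the information cost only when the expected gain from trading on the fundamental value, proxied here by recent mispricing, exceeds that cost. All names and numbers are illustrative.

```python
# Speculative decision rule for an uninformed "switcher": pay for the
# fundamental-value information only if the expected trading gain, proxied
# by recent mispricing, exceeds the information cost. Not the authors' model.
def should_buy_information(recent_prices, fundamental_estimate, info_cost, position_size=10):
    expected_gain = abs(fundamental_estimate - recent_prices[-1]) * position_size
    return expected_gain > info_cost

print(should_buy_information([99.5, 100.2, 101.0],
                             fundamental_estimate=98.0, info_cost=20.0))
```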

  20. U.S. Geological Survey Streamgage Operation and Maintenance Cost Evaluation...from the National Streamflow Information Program

    USGS Publications Warehouse

    Norris, J. Michael

    2010-01-01

    To help meet the goal of providing earth-science information to the Nation, the U.S. Geological Survey (USGS) operates and maintains the largest streamgage network in the world, with over 7,600 active streamgages in 2010. This network is operated in cooperation with over 850 Federal, tribal, State, and local funding partners. The streamflow information provided by the USGS is used for the protection of life and property; for the assessment, allocation, and management of water resources; for the design of roads, bridges, dams, and water works; for the delineation of flood plains; for the assessment and evaluation of habitat; for understanding the effects of land-use, water-use, and climate changes; for evaluation of water quality; and for recreational safety and enjoyment. USGS streamgages are managed and operated to rigorous national standards, allowing analyses of data from streamgages in different areas and spanning long time periods, some with more than 100 years of data. About 90 percent of USGS streamgages provide streamflow information real-time on the web. Physical measurements of streamflow are made at streamgages multiple times a year, depending on flow conditions, to ensure the highest level of accuracy possible. In addition, multiple reviews and quality assurance checks are performed before the data is finalized. In 2006, the USGS reviewed all activities, operations, equipment, support, and costs associated with operating and maintaining a streamgage program (Norris and others, 2008). A summary of the percentages of costs associated with activities required to operate a streamgage on an annual basis are presented in figure 1. This information represents what it costs to fund a 'typical' USGS streamgage and how those funds are utilized. It should be noted that some USGS streamgages have higher percentages for some categories than do others depending on location and conditions. Forty-one percent of the funding for the typical USGS streamgage is for labor

  1. Accurate spectral color measurements

    NASA Astrophysics Data System (ADS)

    Hiltunen, Jouni; Jaeaeskelaeinen, Timo; Parkkinen, Jussi P. S.

    1999-08-01

    Surface color measurement is of importance in a very wide range of industrial applications including paint, paper, printing, photography, textiles, plastics and so on. For demanding color measurements a spectral approach is often needed. One can measure a color spectrum with a spectrophotometer using calibrated standard samples as a reference. Because it is impossible to define absolute color values of a sample, we always work with approximations. The human eye can perceive color differences as small as 0.5 CIELAB units and thus distinguish millions of colors. This 0.5 unit difference should be the goal for precise color measurements. This limit is not a problem if we only want to measure the color difference between two samples, but if we also want to know exact color coordinate values at the same time, accuracy problems arise. The values reported by two instruments can be astonishingly different. The accuracy of the instrument used in color measurement may depend on various errors such as photometric non-linearity, wavelength error, integrating sphere dark level error, and integrating sphere error in both specular included and specular excluded modes. Thus correction formulas should be used to get more accurate results. Another question is how many channels, i.e. wavelengths, we use to measure a spectrum. It is obvious that the sampling interval should be short to get more precise results. Furthermore, the result we get is always a compromise of measuring time, conditions and cost. Sometimes we have to use a portable system, or the shape and size of the samples make it impossible to use sensitive equipment. In this study a small set of calibrated color tiles measured with the Perkin Elmer Lambda 18 and the Minolta CM-2002 spectrophotometers is compared. In the paper we explain the typical error sources of spectral color measurements, and show which accuracy demands a good colorimeter should meet.

  2. Lost in translation: preclinical studies on 3,4-methylenedioxymethamphetamine provide information on mechanisms of action, but do not allow accurate prediction of adverse events in humans

    PubMed Central

    Green, AR; King, MV; Shortall, SE; Fone, KCF

    2012-01-01

    3,4-Methylenedioxymethamphetamine (MDMA) induces both acute adverse effects and long-term neurotoxic loss of brain 5-HT neurones in laboratory animals. However, when choosing doses, most preclinical studies have paid little attention to the pharmacokinetics of the drug in humans or animals. The recreational use of MDMA and current clinical investigations of the drug for therapeutic purposes demand better translational pharmacology to allow accurate risk assessment of its ability to induce adverse events. Recent pharmacokinetic studies on MDMA in animals and humans are reviewed and indicate that the risks following MDMA ingestion should be re-evaluated. Acute behavioural and body temperature changes result from rapid MDMA-induced monoamine release, whereas long-term neurotoxicity is primarily caused by metabolites of the drug. Therefore acute physiological changes in humans are fairly accurately mimicked in animals by appropriate dosing, although allometric dosing calculations have little value. Long-term changes require MDMA to be metabolized in a similar manner in experimental animals and humans. However, the rate of metabolism of MDMA and its major metabolites is slower in humans than rats or monkeys, potentially allowing endogenous neuroprotective mechanisms to function in a species specific manner. Furthermore acute hyperthermia in humans probably limits the chance of recreational users ingesting sufficient MDMA to produce neurotoxicity, unlike in the rat. MDMA also inhibits the major enzyme responsible for its metabolism in humans thereby also assisting in preventing neurotoxicity. These observations question whether MDMA alone produces long-term 5-HT neurotoxicity in human brain, although when taken in combination with other recreational drugs it may induce neurotoxicity. LINKED ARTICLES This article is commented on by Parrott, pp. 1518–1520 of this issue. To view this commentary visit http://dx.doi.org/10.1111/j.1476-5381.2012.01941.x and to view the the

  3. Student Information Systems Demystified: The Increasing Demand for Accurate, Timely Data Means Schools and Districts Are Relying Heavily on SIS Technologies

    ERIC Educational Resources Information Center

    McIntire, Todd

    2004-01-01

    Student information systems, one of the first applications of computer technology in education, are undergoing a significant transition yet again. The first major shift in SIS technologies occurred about 15 years ago when they evolved from mainframe programs to client-server solutions. Now, vendors across the board are offering centralized…

  4. Information technology facilitates cost-effectiveness analysis in developing countries: an observational study of breast cancer chemotherapy in Taiwan.

    PubMed

    Shih, Ya-Chen Tina; Pan, I-Wen; Tsai, Yi-Wen

    2009-01-01

    Health information technology offers a powerful tool to monitor the performance of a healthcare system. Advances in computer technology and capacity combined with lower start-up costs will allow developing countries to achieve greater impact when they initiate electronic health information systems. We focused on the integrated health information system that was established in Taiwan in conjunction with the launch of the National Health Insurance (NHI) programme. We used data from that health information system to conduct a cost-effectiveness analysis of chemotherapy use among breast cancer patients. We then used this analysis to discuss what policy makers can learn from this type of analysis. We identified a cohort of patients in the NHI Research Database who had been diagnosed with breast cancer in 2001 and had received chemotherapy following surgical removal of the tumour. We followed these patients for 3 years and conducted a cost-effectiveness analysis from the payer's perspective. Using the net benefit regression approach, we compared the cost effectiveness of the two most commonly prescribed first-line chemotherapy regimens for the treatment of breast cancer in 2001 in Taiwan. The dependent variable of the regression model was the individual-level net benefit, and the independent variables included a binary variable indicating the choice of chemotherapy regimen, the patients' age, co-morbidity, type of surgery, geographic region and type of treatment facility. We employed both frequentist and Bayesian approaches in our net benefit regression analyses. In the Bayesian analysis, we applied non-informative priors to all parameters in the base-case analyses. We then explored the use of informative priors in the sensitivity analysis, using cost-effectiveness data published in the literature to form the prior distributions for the relevant parameters. Over 60% of surgically treated breast cancer patients received either CMF (cyclophosphamide, methotrexate
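
    The net benefit regression approach mentioned above can be sketched on synthetic data: each patient's net benefit is computed as lambda times effectiveness minus cost, and the coefficient on the regimen indicator estimates the incremental net monetary benefit. The variables and effect sizes below are invented and are not the Taiwan NHI data.

```python
# Synthetic sketch of net benefit regression: NB_i = lambda * E_i - C_i is
# regressed on a regimen indicator and covariates; the indicator's coefficient
# is the incremental net monetary benefit. Variables and effects are invented.
import numpy as np

rng = np.random.default_rng(7)
n = 1_000
regimen = rng.integers(0, 2, n)                       # 0 = regimen A, 1 = regimen B
age = rng.integers(30, 80, n)
effect = 2.0 + 0.3 * regimen + rng.normal(0, 0.5, n)  # e.g. event-free years
cost = 8_000 + 2_500 * regimen + 50 * age + rng.gamma(2.0, 500.0, n)

lam = 20_000                                          # willingness to pay per unit of effect
net_benefit = lam * effect - cost

X = np.column_stack([np.ones(n), regimen, age])
beta, *_ = np.linalg.lstsq(X, net_benefit, rcond=None)
print(f"incremental net benefit of regimen B at lambda={lam}: {beta[1]:.0f}")
```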

  5. Bridging the gap between finance and clinical operations with activity-based cost management.

    PubMed

    Storfjell, J L; Jessup, S

    1996-12-01

    Activity-based cost management (ABCM) is an exciting management tool that links financial information with operations. By determining the costs of specific activities and processes, nurse managers determine the true costs of services more accurately than with traditional cost accounting methods, and can then target processes for improvement and monitor them for change. The authors describe the ABCM process applied to nursing management situations.
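
    A minimal illustration of the activity-based costing idea behind ABCM, with invented activity names and rates: trace resource costs to activities via cost drivers, then roll the activities consumed by a nursing service up into its cost.

```python
# Invented activity rates and drivers illustrating the ABC calculation:
# resource costs are traced to activities via drivers, and a service's cost
# is the sum of the activities it consumes.
activity_rates = {                 # cost per unit of each activity's driver
    "patient assessment": 35.0,    # per assessment
    "medication round":   12.0,    # per round
    "care documentation":  8.0,    # per record entry
}

def service_cost(driver_volumes):
    """driver_volumes: {activity: driver units consumed by the service}."""
    return sum(activity_rates[activity] * units
               for activity, units in driver_volumes.items())

home_visit = {"patient assessment": 1, "medication round": 2, "care documentation": 3}
print("cost of one home visit:", service_cost(home_visit))   # 35 + 24 + 24 = 83
```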

  6. Information Technology Cost Center Employee Perception of Their Contribution Value in a For Profit Organizational Culture

    ERIC Educational Resources Information Center

    Gilstrap, Donald E.

    2010-01-01

    A literature review revealed a lack of academic research related to cultural dynamics within organizations that influence information technology investments. The goal of this single descriptive case study of a for profit international company was to examine one area of cultural influence on investments. The aim was to gain an understanding of…

  7. A Costing Model for Project-Based Information and Communication Technology Systems

    ERIC Educational Resources Information Center

    Stewart, Brian; Hrenewich, Dave

    2009-01-01

    A major difficulty facing IT departments is ensuring that the projects and activities to which information and communications technologies (ICT) resources are committed represent an effective, economic, and efficient use of those resources. This complex problem has no single answer. To determine effective use requires, at the least, a…

  8. Information-Seeking in Family Day Care: Access, Quality and Personal Cost

    ERIC Educational Resources Information Center

    Corr, L.; Davis, E.; Cook, K.; Mackinnon, A.; Sims, M.; Herrman, H.

    2014-01-01

    Family day-care (FDC) educators work autonomously to provide care and education for children of mixed ages, backgrounds and abilities. To meet the demands and opportunities of their work and regulatory requirements, educators need access to context-relevant and high quality information. No previous research has examined how and where these workers…

  9. Which University? A Study of the Influence of Cost and Information Factors on Scottish Undergraduate Choice

    ERIC Educational Resources Information Center

    Briggs, Senga; Wilson, Alex

    2007-01-01

    At a time when higher education institutions (HEIs) around the globe face declining student numbers and decreasing funding grants, it becomes imperative for those involved in the recruitment process to understand the factors utilized by students in the search process. This paper explores the influence of two such factors: Information Supplied by…

  10. VA/DOD Federal Health Care Center: Costly Information Technology Delays Continue and Evaluation Plan Lacking

    DTIC Science & Technology

    2012-06-01

    portability, which would allow VA and DOD clinicians to place, manage, and update clinical orders from either VA or DOD electronic health records systems...on, which includes five dedicated, full-time pharmacists to conduct manual checks of patient records to reconcile allergy information and identify

  11. Optimal Mandates and The Welfare Cost of Asymmetric Information: Evidence from The U.K. Annuity Market*

    PubMed Central

    Einav, Liran; Finkelstein, Amy; Schrimpf, Paul

    2009-01-01

    Much of the extensive empirical literature on insurance markets has focused on whether adverse selection can be detected. Once detected, however, there has been little attempt to quantify its welfare cost, or to assess whether and what potential government interventions may reduce these costs. To do so, we develop a model of annuity contract choice and estimate it using data from the U.K. annuity market. The model allows for private information about mortality risk as well as heterogeneity in preferences over different contract options. We focus on the choice of length of guarantee among individuals who are required to buy annuities. The results suggest that asymmetric information along the guarantee margin reduces welfare relative to a first best symmetric information benchmark by about £127 million per year, or about 2 percent of annuitized wealth. We also find that by requiring that individuals choose the longest guarantee period allowed, mandates could achieve the first-best allocation. However, we estimate that other mandated guarantee lengths would have detrimental effects on welfare. Since determining the optimal mandate is empirically difficult, our findings suggest that achieving welfare gains through mandatory social insurance may be harder in practice than simple theory may suggest. PMID:20592943

  12. Trade-offs between data resolution, accuracy, and cost when choosing information to plan reserves for coral reef ecosystems.

    PubMed

    Tulloch, Vivitskaia J; Klein, Carissa J; Jupiter, Stacy D; Tulloch, Ayesha I T; Roelfsema, Chris; Possingham, Hugh P

    2017-03-01

    Conservation planners must reconcile trade-offs associated with using biodiversity data of differing qualities to make decisions. Coarse habitat classifications are commonly used as surrogates to design marine reserve networks when fine-scale biodiversity data are incomplete or unavailable. Although finely-classified habitat maps provide more detail, they may have more misclassification errors, a common problem when remotely-sensed imagery is used. Despite these issues, planners rarely consider the effects of errors when choosing data for spatially explicit conservation prioritizations. Here we evaluate trade-offs between accuracy and resolution of hierarchical coral reef habitat data (geomorphology and benthic substrate) derived from remote sensing, in spatial planning for Kubulau District, Fiji. For both, we use accuracy information describing the probability that a mapped habitat classification is correct to design marine reserve networks that achieve habitat conservation targets, and demonstrate inadequacies of using habitat maps without accuracy data. We show that using more detailed habitat information ensures better representation of biogenic habitats (i.e. coral and seagrass), but leads to larger and more costly reserves, because these data have more misclassification errors, and are also more expensive to obtain. Reduced impacts on fishers are possible using coarsely-classified data, which are also more cost-effective for planning reserves if we account for data collection costs, but using these data may under-represent reef habitats that are important for fisheries and biodiversity, due to the maps' low thematic resolution. Finally, we show that explicitly accounting for accuracy information in decisions maximizes the chance of successful conservation outcomes by reducing the risk of missing conservation representation targets, particularly when using finely classified data.

  13. Update of cost information contained in a previous GAO report on specific aspects of the Clinch River Breeder Reactor Project

    SciTech Connect

    Not Available

    1981-06-26

    As part of our June 23, 1977, report, the Energy Research and Development Administration (ERDA)--now part of the Department of Energy (DOE)--provided us with some cost and schedule information for the Clinch River Breeder Reactor Project as it related to three different licensing cases. At the time, the administration was attempting to terminate the Clinch River Project. And then, as now, it was a topic of heated debate within the Congress and between the Congress and the executive branch. Consequently, it was against this backdrop that we asked ERDA officials to provide us with specific cost and schedule data for the Clinch River Project, assuming it would be terminated and then restarted about 4 months later, after the Congress had an opportunity to fully consider whether to go ahead with the entire breeder reactor program. At the time, we used the 4-month lapse as an estimate that would provide an indication of the impact the project termination would have on the Clinch River Project's cost and schedule.

  14. Costs and benefits of health information technology: new trends from the literature.

    PubMed

    Goldzweig, Caroline Lubick; Towfigh, Ali; Maglione, Margaret; Shekelle, Paul G

    2009-01-01

    To understand what is new in health information technology (IT), we updated a systematic review of health IT with studies published during 2004-2007. From 4,683 titles, 179 met inclusion criteria. We identified a proliferation of patient-focused applications although little formal evaluation in this area; more descriptions of commercial electronic health records (EHRs) and health IT systems designed to run independently from EHRs; and proportionately fewer relevant studies from the health IT leaders. Accelerating the adoption of health IT will require greater public-private partnerships, new policies to address the misalignment of financial incentives, and a more robust evidence base regarding IT implementation.

  15. Developing Information on Energy Savings and Associated Costs and Benefits of Energy Efficient Emerging Technologies Applicable in California

    SciTech Connect

    Xu, Tengfang; Slaa, Jan Willem; Sathaye, Jayant

    2010-12-15

    Implementation and adoption of efficient end-use technologies have proven to be among the key measures for reducing greenhouse gas (GHG) emissions across industry. In many cases, implementing energy efficiency measures is one of the most cost-effective investments industry can make to improve efficiency and productivity while reducing carbon dioxide (CO2) emissions. Over the years, there have been incentives to use resources and energy in cleaner and more efficient ways to create industries that are sustainable and more productive. As energy programs and policies on GHG inventories and regulation take effect, understanding and managing the costs associated with GHG mitigation measures is very important for industry and policy makers around the world and in California. Successful implementation of applicable emerging technologies not only may advance productivity, improve environmental impacts, or enhance industrial competitiveness, but also can play a significant role in climate-mitigation efforts by saving energy and reducing the associated GHG emissions. Developing new information on the costs and savings benefits of energy-efficient emerging technologies applicable to the California market is therefore important for policy makers as well as industry, and timely evaluation and estimation of the costs and energy savings potential of such technologies is the focus of this report. The overall goal of the project is to identify and select a set of emerging and under-utilized energy-efficient technologies and practices that can reduce energy consumption in industry while maintaining economic growth. Specifically, this report contains the results of Task 3, Technology Characterization for California Industries, for the project titled Research Opportunities in Emerging and Under-Utilized Energy-Efficient Industrial Technologies, sponsored by

  16. Cost goals

    NASA Technical Reports Server (NTRS)

    Hoag, J.

    1981-01-01

    Cost goal activities for the point-focusing parabolic dish program are reported. Cost goals involve three tasks: (1) determining the value of the dish systems to potential users; (2) setting out cost targets for the dish system; and (3) integrating the value and cost sides to provide information about the potential size of the market for parabolic dishes. The latter two activities are emphasized.

  17. Agent Based Modelling of Communication Costs: Why Information Can Be Free

    NASA Astrophysics Data System (ADS)

    Čače, Ivana; Bryson, Joanna J.

    What purposes, other than facilitating the sharing of information, can language have served? First, it may not have evolved to serve any purpose at all. It is possible that language is just a side effect of the large human brain — a spandrel or exaptation — that only became useful later. If language is adaptive, this does not necessarily mean that it is adaptive for the purpose of communication. For example Dennett (1996) and Chomsky (1980) have stressed the utility of language in thinking. Also, there are different ways to view communication. The purpose of language according to Dunbar (1993), is to replace grooming as a social bonding process and in this way to ensure the stability of large social groups.

  18. Using architecture information and real-time resource state to reduce power consumption and communication costs in parallel applications.

    SciTech Connect

    Brandt, James M.; Devine, Karen Dragon; Gentile, Ann C.; Leung, Vitus Joseph; Olivier, Stephen Lecler; Pedretti, Kevin; Rajamanickam, Sivasankaran; Bunde, David P.; Deveci, Mehmet; Catalyurek, Umit V.

    2014-09-01

    As computer systems grow in both size and complexity, the need for applications and run-time systems to adjust to their dynamic environment also grows. The goal of the RAAMP LDRD was to combine static architecture information and real-time system state with algorithms to conserve power, reduce communication costs, and avoid network contention. We developed new data collection and aggregation tools to extract static hardware information (e.g., node/core hierarchy, network routing) as well as real-time performance data (e.g., CPU utilization, power consumption, memory bandwidth saturation, percentage of used bandwidth, number of network stalls). We created application interfaces that allowed this data to be used easily by algorithms. Finally, we demonstrated the benefit of integrating system and application information for two use cases. The first used real-time power consumption and memory bandwidth saturation data to throttle concurrency to save power without increasing application execution time. The second used static or real-time network traffic information to reduce or avoid network congestion by remapping MPI tasks to allocated processors. Results from our work are summarized in this report; more details are available in our publications [2, 6, 14, 16, 22, 29, 38, 44, 51, 54].
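    A minimal sketch (not the RAAMP implementation) of the first use case described above: backing off application concurrency when measured node power exceeds a cap. The read_node_power() stub and the cap value are hypothetical stand-ins for a real power counter and site policy.

```python
# Hypothetical power-aware concurrency throttle; read_node_power() is a stub.
import random
import time

POWER_CAP_WATTS = 300.0
MIN_THREADS, MAX_THREADS = 4, 32

def read_node_power():
    """Stand-in for querying a node power sensor (e.g. an out-of-band counter)."""
    return random.uniform(250.0, 350.0)

def throttle(current_threads):
    power = read_node_power()
    if power > POWER_CAP_WATTS and current_threads > MIN_THREADS:
        return current_threads - 2          # over the cap: shed concurrency
    if power < 0.9 * POWER_CAP_WATTS and current_threads < MAX_THREADS:
        return current_threads + 1          # headroom available: restore threads
    return current_threads

threads = MAX_THREADS
for _ in range(20):
    threads = throttle(threads)
    time.sleep(0.05)                        # sampling interval between adjustments
print("settled thread count:", threads)
```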

  19. Ultra-low-cost 3D gaze estimation: an intuitive high information throughput compliment to direct brain-machine interfaces

    NASA Astrophysics Data System (ADS)

    Abbott, W. W.; Faisal, A. A.

    2012-08-01

    Eye movements are highly correlated with motor intentions and are often retained by patients with serious motor deficiencies. Despite this, eye tracking is not widely used as a control interface for movement in impaired patients due to poor signal interpretation and lack of control flexibility. We propose that tracking the gaze position in 3D rather than 2D provides a considerably richer signal for human machine interfaces by allowing direct interaction with the environment rather than via computer displays. We demonstrate here that by using mass-produced video-game hardware, it is possible to produce an ultra-low-cost binocular eye-tracker with comparable performance to commercial systems, yet 800 times cheaper. Our head-mounted system has 30 USD material costs and operates at over 120 Hz sampling rate with a resolution of 0.5-1 degree of visual angle. We perform 2D and 3D gaze estimation, controlling a real-time volumetric cursor essential for driving complex user interfaces. Our approach yields an information throughput of 43 bits/s, more than ten times that of invasive and semi-invasive brain-machine interfaces (BMIs) that are vastly more expensive. Unlike many BMIs, our system yields effective real-time closed loop control of devices (10 ms latency), after just ten minutes of training, which we demonstrate through a novel BMI benchmark—the control of the video arcade game ‘Pong’.
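    A back-of-the-envelope illustration of how a throughput figure of this kind can be computed. The Shannon-style formula log2(N)/T and the example numbers below are assumptions for illustration, not the authors' exact metric.

```python
# Illustrative information-throughput estimate for a pointing device.
import math

def throughput_bits_per_s(n_targets, selection_time_s):
    """Bits conveyed per selection (log2 of target count) divided by selection time."""
    return math.log2(n_targets) / selection_time_s

# Example: choosing one of 64 distinguishable targets in ~0.14 s gives ~43 bits/s,
# the order of magnitude reported for the gaze tracker.
print(round(throughput_bits_per_s(64, 0.14), 1))
```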

  20. Ultra-low-cost 3D gaze estimation: an intuitive high information throughput compliment to direct brain-machine interfaces.

    PubMed

    Abbott, W W; Faisal, A A

    2012-08-01

    Eye movements are highly correlated with motor intentions and are often retained by patients with serious motor deficiencies. Despite this, eye tracking is not widely used as a control interface for movement in impaired patients due to poor signal interpretation and lack of control flexibility. We propose that tracking the gaze position in 3D rather than 2D provides a considerably richer signal for human machine interfaces by allowing direct interaction with the environment rather than via computer displays. We demonstrate here that by using mass-produced video-game hardware, it is possible to produce an ultra-low-cost binocular eye-tracker with comparable performance to commercial systems, yet 800 times cheaper. Our head-mounted system has 30 USD material costs and operates at over 120 Hz sampling rate with a resolution of 0.5-1 degree of visual angle. We perform 2D and 3D gaze estimation, controlling a real-time volumetric cursor essential for driving complex user interfaces. Our approach yields an information throughput of 43 bits/s, more than ten times that of invasive and semi-invasive brain-machine interfaces (BMIs) that are vastly more expensive. Unlike many BMIs, our system yields effective real-time closed loop control of devices (10 ms latency), after just ten minutes of training, which we demonstrate through a novel BMI benchmark--the control of the video arcade game 'Pong'.

  1. Daily variation in natural disaster casualties: information flows, safety, and opportunity costs in tornado versus hurricane strikes.

    PubMed

    Zahran, Sammy; Tavani, Daniele; Weiler, Stephan

    2013-07-01

    Casualties from natural disasters may depend on the day of the week they strike. With data from the Spatial Hazard Events and Losses Database for the United States (SHELDUS), daily variation in hurricane and tornado casualties from 5,043 tornado and 2,455 hurricane time/place events is analyzed. Hurricane forecasts provide at-risk populations with considerable lead time. Such lead time allows strategic behavior in choosing protective measures under hurricane threat; opportunity costs in terms of lost income are higher during weekdays than during weekends. On the other hand, the lead time provided by tornadoes is near zero; hence tornados generate no opportunity costs. Tornado casualties are related to risk information flows, which are higher during workdays than during leisure periods, and are related to sheltering-in-place opportunities, which are better in permanent buildings like businesses and schools. Consistent with theoretical expectations, random effects negative binomial regression results indicate that tornado events occurring on the workdays of Monday through Thursday are significantly less lethal than tornados that occur on weekends. In direct contrast, and also consistent with theory, the expected count of hurricane casualties increases significantly with weekday occurrences. The policy implications of observed daily variation in tornado and hurricane events are considered.
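    The study uses a random effects negative binomial regression; the sketch below fits a simplified pooled negative binomial model (no random effects) to synthetic data with statsmodels, purely to illustrate the model family relating casualty counts to a workday indicator.

```python
# Simplified count-data sketch: casualties vs. a workday indicator (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
workday = rng.integers(0, 2, size=n)                   # 1 = Mon-Thu event, 0 = weekend
lam = np.exp(0.5 - 0.4 * workday)                      # fewer expected casualties on workdays
casualties = rng.poisson(lam * rng.gamma(2.0, 0.5, size=n))   # overdispersed counts

X = sm.add_constant(workday.astype(float))
fit = sm.GLM(casualties, X, family=sm.families.NegativeBinomial()).fit()
print(fit.params)                                      # negative workday coefficient expected
```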

  2. Co-emergence of multi-scale cortical activities of irregular firing, oscillations and avalanches achieves cost-efficient information capacity.

    PubMed

    Yang, Dong-Ping; Zhou, Hai-Jun; Zhou, Changsong

    2017-02-01

    The brain is highly energy consuming and is therefore under strong selective pressure to achieve cost-efficiency in both cortical connectivities and activities. However, cost-efficiency as a design principle for cortical activities has rarely been studied. In particular, it is not clear how cost-efficiency is related to the ubiquitously observed multi-scale properties: irregular firing, oscillations and neuronal avalanches. Here we demonstrate that these prominent properties can be simultaneously observed in a generic, biologically plausible neural circuit model that captures excitation-inhibition balance and realistic dynamics of synaptic conductance. Their co-emergence achieves minimal energy cost as well as maximal energy efficiency on information capacity, when neuronal firing is coordinated and shaped by moderate synchrony to reduce otherwise redundant spikes, and the dynamical clusterings are maintained in the form of neuronal avalanches. Such cost-efficient neural dynamics can be employed as a foundation for further efficient information processing under energy constraint.

  3. Co-emergence of multi-scale cortical activities of irregular firing, oscillations and avalanches achieves cost-efficient information capacity

    PubMed Central

    Zhou, Hai-Jun; Zhou, Changsong

    2017-01-01

    The brain is highly energy consuming and is therefore under strong selective pressure to achieve cost-efficiency in both cortical connectivities and activities. However, cost-efficiency as a design principle for cortical activities has rarely been studied. In particular, it is not clear how cost-efficiency is related to the ubiquitously observed multi-scale properties: irregular firing, oscillations and neuronal avalanches. Here we demonstrate that these prominent properties can be simultaneously observed in a generic, biologically plausible neural circuit model that captures excitation-inhibition balance and realistic dynamics of synaptic conductance. Their co-emergence achieves minimal energy cost as well as maximal energy efficiency on information capacity, when neuronal firing is coordinated and shaped by moderate synchrony to reduce otherwise redundant spikes, and the dynamical clusterings are maintained in the form of neuronal avalanches. Such cost-efficient neural dynamics can be employed as a foundation for further efficient information processing under energy constraint. PMID:28192429

  4. Quantum measurement and the first law of thermodynamics: the energy cost of measurement is the work value of the acquired information.

    PubMed

    Jacobs, Kurt

    2012-10-01

    The energy cost of measurement is an important fundamental question, and may have profound implications for quantum technologies. In the context of Maxwell's demon, it is often stated that measurement has no minimum energy cost, while information has a work value. However, as we elucidate, the first of these statements does not refer to the cost paid by the measuring device. Here we show that it is only when a measuring device has access to a zero-temperature reservoir (that is, never) that measurement requires no energy. To obtain a given amount of information, all measuring devices must pay a cost equal to that which a heat engine would pay to obtain the equivalent work value of that information.
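    The quantitative content of that statement is the standard Szilard/Landauer relation; the following is our reconstruction of its usual form, not a formula quoted from the paper.

```latex
% Work value of I bits of information acquired at temperature T, and hence the
% minimum energy a finite-temperature measuring device must pay to acquire it:
W_{\mathrm{meas}} \;\ge\; W_{\mathrm{ext}} \;=\; k_B \, T \, I \ln 2 ,
% which vanishes only in the unattainable limit T \to 0 (the "zero-temperature
% reservoir" of the abstract).
```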

  5. Accounting for Heterogeneity in Relative Treatment Effects for Use in Cost-Effectiveness Models and Value-of-Information Analyses

    PubMed Central

    Soares, Marta O.; Palmer, Stephen; Ades, Anthony E.; Harrison, David; Shankar-Hari, Manu; Rowan, Kathy M.

    2015-01-01

    Cost-effectiveness analysis (CEA) models are routinely used to inform health care policy. Key model inputs include relative effectiveness of competing treatments, typically informed by meta-analysis. Heterogeneity is ubiquitous in meta-analysis, and random effects models are usually used when there is variability in effects across studies. In the absence of observed treatment effect modifiers, various summaries from the random effects distribution (random effects mean, predictive distribution, random effects distribution, or study-specific estimate [shrunken or independent of other studies]) can be used depending on the relationship between the setting for the decision (population characteristics, treatment definitions, and other contextual factors) and the included studies. If covariates have been measured that could potentially explain the heterogeneity, then these can be included in a meta-regression model. We describe how covariates can be included in a network meta-analysis model and how the output from such an analysis can be used in a CEA model. We outline a model selection procedure to help choose between competing models and stress the importance of clinical input. We illustrate the approach with a health technology assessment of intravenous immunoglobulin for the management of adult patients with severe sepsis in an intensive care setting, which exemplifies how risk of bias information can be incorporated into CEA models. We show that the results of the CEA and value-of-information analyses are sensitive to the model and highlight the importance of sensitivity analyses when conducting CEA in the presence of heterogeneity. The methods presented extend naturally to heterogeneity in other model inputs, such as baseline risk. PMID:25712447
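    A minimal numerical sketch of the distinction drawn above between the random effects mean and the predictive distribution for a new setting, using a DerSimonian-Laird style estimate on made-up study effects (the data and the choice of estimator are illustrative assumptions).

```python
# Random effects mean vs. predictive interval for a new study (illustrative data).
import numpy as np

y = np.array([-0.30, -0.10, -0.45, 0.05, -0.25])    # study-level log relative effects
se = np.array([0.12, 0.15, 0.20, 0.18, 0.10])       # their standard errors

w = 1.0 / se**2
mu_fe = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - mu_fe) ** 2)
tau2 = max(0.0, (Q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1.0 / (se**2 + tau2)
mu = np.sum(w_re * y) / np.sum(w_re)                 # random effects mean
var_mu = 1.0 / np.sum(w_re)

ci = (mu - 1.96 * np.sqrt(var_mu), mu + 1.96 * np.sqrt(var_mu))                # for the mean
pi = (mu - 1.96 * np.sqrt(var_mu + tau2), mu + 1.96 * np.sqrt(var_mu + tau2))  # new setting
print("RE mean 95% CI:", ci)
print("95% predictive interval:", pi)
```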

  6. GME: at what cost?

    PubMed

    Young, David W

    2003-11-01

    Current computing methods impede determining the real cost of graduate medical education. However, a more accurate estimate could be obtained if policy makers would allow for the application of basic cost-accounting principles, including consideration of department-level costs, unbundling of joint costs, and other factors.

  7. 32 CFR Appendix D to Part 286 - DD Form 2086-1, “Record of Freedom of Information (FOI) Processing Cost for Technical Data”

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 2 2011-07-01 2011-07-01 false DD Form 2086-1, "Record of Freedom of Information (FOI) Processing Cost for Technical Data" D Appendix D to Part 286 National Defense Department of... FREEDOM OF INFORMATION ACT PROGRAM REGULATION Pt. 286, App. D Appendix D to Part 286—DD Form...

  8. 32 CFR Appendix D to Part 286 - DD Form 2086-1, “Record of Freedom of Information (FOI) Processing Cost for Technical Data”

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 2 2010-07-01 2010-07-01 false DD Form 2086-1, "Record of Freedom of Information (FOI) Processing Cost for Technical Data" D Appendix D to Part 286 National Defense Department of... FREEDOM OF INFORMATION ACT PROGRAM REGULATION Pt. 286, App. D Appendix D to Part 286—DD Form...

  9. Using Cost-Effectiveness Tests to Design CHP Incentive Programs

    SciTech Connect

    Tidball, Rick

    2014-11-01

    This paper examines the structure of cost-effectiveness tests to illustrate how they can accurately reflect the costs and benefits of CHP systems. This paper begins with a general background discussion on cost-effectiveness analysis of DER and then describes how cost-effectiveness tests can be applied to CHP. Cost-effectiveness results are then calculated and analyzed for CHP projects in five states: Arkansas, Colorado, Iowa, Maryland, and North Carolina. Based on the results obtained for these five states, this paper offers four considerations to inform regulators in the application of cost-effectiveness tests in developing CHP programs.

  10. Tokamak reactor cost model based on STARFIRE/WILDCAT costing

    SciTech Connect

    Evans, K. Jr.

    1983-03-01

    A cost model is presented which is useful for survey and comparative studies of tokamak reactors. The model is heavily based on STARFIRE and WILDCAT costing guidelines, philosophies, and procedures and reproduces the costing for these devices quite accurately.

  11. The Cost-Effectiveness of Two Forms of Case Management Compared to a Control Group for Persons with Dementia and Their Informal Caregivers from a Societal Perspective

    PubMed Central

    Eekhout, Iris; Joling, Karlijn J.; van Mierlo, Lisa D.; Meiland, Franka J. M.; van Hout, Hein P. J.; de Rooij, Sophia E.

    2016-01-01

    Objectives The objective of this article was to compare the costs and cost-effectiveness of the two most prominent types of case management in the Netherlands (intensive case management and linkage models) against no access to case management (control group) for people with already diagnosed dementia and their informal caregivers. Methods The economic evaluation was conducted from a societal perspective embedded within a two year prospective, observational, controlled, cohort study with 521 informal caregivers and community-dwelling persons with dementia. Case management provided within one care organization (intensive case management model, ICMM), case management where care was provided by different care organizations within one region (Linkage model, LM), and a group with no access to case management (control) were compared. The economic evaluation related incremental costs to incremental effects regarding neuropsychiatric symptoms (NPI), psychological health of the informal caregiver (GHQ-12), and quality adjusted life years (QALY) of the person with dementia and informal caregiver. Results Inverse-propensity-score-weighted models showed no significant differences in clinical or total cost outcomes between the three groups. Informal care costs were significantly lower in the ICMM group compared to both other groups. Day center costs were significantly lower in the ICMM group compared to the control group. For all outcomes, the probability that the ICMM was cost-effective in comparison with LM and the control group was larger than 0.97 at a threshold ratio of 0 €/incremental unit of effect. Conclusion This study provides preliminary evidence that the ICMM is cost-effective compared to the control group and the LM. However, the findings should be interpreted with caution since this study was not a randomized controlled trial. PMID:27655234

  12. Resource costing for multinational neurologic clinical trials: methods and results.

    PubMed

    Schulman, K; Burke, J; Drummond, M; Davies, L; Carlsson, P; Gruger, J; Harris, A; Lucioni, C; Gisbert, R; Llana, T; Tom, E; Bloom, B; Willke, R; Glick, H

    1998-11-01

    We present the results of a multinational resource costing study for a prospective economic evaluation of a new medical technology for treatment of subarachnoid hemorrhage within a clinical trial. The study describes a framework for the collection and analysis of international resource cost data that can contribute to a consistent and accurate intercountry estimation of cost. Of the 15 countries that participated in the clinical trial, we collected cost information in the following seven: Australia, France, Germany, the UK, Italy, Spain, and Sweden. The collection of cost data in these countries was structured through the use of worksheets to provide accurate and efficient cost reporting. We converted total average costs to average variable costs and then aggregated the data to develop study unit costs. When unit costs were unavailable, we developed an index table, based on a market-basket approach, to estimate unit costs. To estimate the cost of a given procedure, the market-basket estimation process required that cost information be available for at least one country. When cost information was unavailable in all countries for a given procedure, we estimated costs using a method based on physician-work and practice-expense resource-based relative value units. Finally, we converted study unit costs to a common currency using purchasing power parity measures. Through this costing exercise we developed a set of unit costs for patient services and per diem hospital services. We conclude by discussing the implications of our costing exercise and suggest guidelines to facilitate more effective multinational costing exercises.
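    A toy sketch of the final conversion step described above, expressing country-specific unit costs in a common currency via purchasing power parity (PPP); the cost and PPP values below are hypothetical.

```python
# Convert local-currency unit costs to PPP-adjusted US dollars (hypothetical values).
unit_cost_local = {"Germany": 450.0, "Italy": 600000.0, "UK": 300.0}   # local currency units
ppp_lcu_per_usd = {"Germany": 2.05, "Italy": 1600.0, "UK": 0.65}       # PPP conversion factors

unit_cost_usd = {c: unit_cost_local[c] / ppp_lcu_per_usd[c] for c in unit_cost_local}
print(unit_cost_usd)   # all study unit costs now on a comparable US$ basis
```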

  13. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods; they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order and high-resolution algorithms can produce accurate results after O(10^6) periods of propagation with eight grid points per wavelength.

  14. Accurate monotone cubic interpolation

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1991-01-01

    Monotone piecewise cubic interpolants are simple and effective. They are generally third-order accurate, except near strict local extrema where accuracy degenerates to second order due to the monotonicity constraint. Algorithms for piecewise cubic interpolants, which preserve monotonicity as well as uniform third- and fourth-order accuracy, are presented. The gain of accuracy is obtained by relaxing the monotonicity constraint in a geometric framework in which the median function plays a crucial role.
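    The uniformly third- and fourth-order-accurate variants described above are not part of a standard library, but the classic monotone piecewise cubic (PCHIP) scheme they improve upon is available in SciPy and illustrates the basic monotonicity-preserving behavior.

```python
# Monotone piecewise cubic interpolation of monotone data (no overshoot at the flat step).
import numpy as np
from scipy.interpolate import PchipInterpolator

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.0, 0.1, 0.1, 2.0, 2.1])      # monotone data with a flat stretch

p = PchipInterpolator(x, y)
xs = np.linspace(0.0, 4.0, 9)
print(np.round(p(xs), 3))                    # interpolated values stay within [0.0, 2.1]
```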

  15. Estimating the cost of extremely large telescopes

    NASA Astrophysics Data System (ADS)

    Stepp, Larry M.; Daggert, Larry G.; Gillett, Paul E.

    2003-01-01

    For future giant telescopes, control of construction and operation costs will be the key factor in their success. The best way to accomplish this cost control, while maximizing the performance of the telescope, will be through design-to-cost methods that use value engineering techniques to develop the most cost-effective design in terms of performance per dollar. This will require quantifiable measures of performance and cost, including: (1) a way of quantifying science value with scientific merit functions; (2) a way of predicting telescope performance in the presence of real-world disturbances by means of integrated modeling; and (3) a way of predicting the cost of multiple design configurations. Design-to-cost methods should be applied as early as possible in the project, since the majority of the life-cycle costs for the observatory will be locked in by choices made during the conceptual design phase. However, there is a dilemma: how can costs be accurately estimated for systems that have not yet been designed? This paper discusses cost estimating methods and describes their application to estimating the cost of ELTs, showing that the best method to use during the conceptual design phase is parametric cost estimating. Examples of parametric estimating techniques are described, based on experience gained from instrument development programs at NOAO. We then describe efforts underway to collect historical cost information and develop cost estimating relationships in preparation for the conceptual design phase of the Giant Segmented Mirror Telescope.
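    A small sketch of a parametric cost estimating relationship (CER) of the kind described, fitting a power law cost = a * D^b to telescope aperture D in log-log space; the data points are illustrative, not real project costs.

```python
# Fit a power-law CER to hypothetical historical (aperture, cost) pairs and extrapolate.
import numpy as np

D = np.array([3.5, 6.5, 8.1, 10.0])            # aperture in meters (hypothetical projects)
cost = np.array([30.0, 100.0, 180.0, 300.0])   # construction cost in $M (hypothetical)

b, log_a = np.polyfit(np.log(D), np.log(cost), 1)   # linear fit in log-log space
a = np.exp(log_a)
print(f"CER: cost ~ {a:.1f} * D^{b:.2f}  ($M)")
print(f"extrapolated 30 m cost: {a * 30.0 ** b:.0f} $M")
```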

  16. Accurate pointing of tungsten welding electrodes

    NASA Technical Reports Server (NTRS)

    Ziegelmeier, P.

    1971-01-01

    Thoriated-tungsten is pointed accurately and quickly by using sodium nitrite. Point produced is smooth and no effort is necessary to hold the tungsten rod concentric. The chemically produced point can be used several times longer than ground points. This method reduces time and cost of preparing tungsten electrodes.

  17. Principles and methods of managerial cost-accounting systems.

    PubMed

    Suver, J D; Cooper, J C

    1988-01-01

    An introduction to cost-accounting systems for pharmacy managers is provided; terms are defined and examples of specific applications are given. Cost-accounting systems determine, record, and report the resources consumed in providing services. An effective cost-accounting system must provide the information needed for both internal and external reports. In accounting terms, cost is the value given up to secure an asset. In determining how volumes of activity affect costs, fixed costs and variable costs are calculated; applications include pricing strategies, cost determinations, and break-even analysis. Also discussed are the concepts of direct and indirect costs, opportunity costs, and incremental and sunk costs. For most pharmacy department services, process costing, an accounting of intermediate outputs and homogeneous units, is used; in determining the full cost of providing a product or service (e.g., patient stay), job-order costing is used. Development of work-performance standards is necessary for monitoring productivity and determining product costs. In allocating pharmacy department costs, a ratio of costs to charges can be used; this method is convenient, but microcosting (specific identification of the costs of products) is more accurate. Pharmacy managers can use cost-accounting systems to evaluate the pharmacy's strategies, policies, and services and to improve budgets and reports.
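    A minimal illustration of the break-even analysis mentioned above; the prices and costs are hypothetical.

```python
# Break-even volume: fixed costs divided by contribution margin per unit.
fixed_costs = 120_000.0          # annual fixed cost of the service
price_per_unit = 45.0            # charge per dose dispensed
variable_cost_per_unit = 25.0    # drug and supplies per dose

break_even_units = fixed_costs / (price_per_unit - variable_cost_per_unit)
print(f"break-even volume: {break_even_units:.0f} doses per year")   # 6000
```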

  18. Cost-Effectiveness of Colorectal Cancer Screening in High Risk Spanish Patients: Use of a Validated Model to Inform Public Policy

    PubMed Central

    Ladabaum, Uri; Ferrandez, Angel; Lanas, Angel I.

    2011-01-01

    Background The European Community has made a commitment to colorectal cancer (CRC) screening, but regional considerations may affect the design of national screening programs. We developed a decision analytic model tailored to a pilot screening program for high risk persons in Spain with the aim of informing public policy decisions. Methods We constructed a decision analytic Markov model based on our validated model of CRC screening that reflected CRC epidemiology and costs in persons with first-degree relatives with CRC in Aragón, Spain, and superimposed colonoscopy every 5 or 10 years from ages 40-80 years. The pilot program’s preliminary clinical results and our modeling results were presented to regional health authorities. Results In the model, without screening, 88 CRC cases occurred per 1,000 persons from age 40-85 years. In the base case, screening reduced this by 72-77% and gained 0.12 discounted life-years/person. Screening every 10 years was cost-saving, and screening every 5 years vs. every 10 years cost 7,250 €/life-year gained. Based on these savings, 36-39 €/person/year could go towards operating costs while maintaining a neutral budget. If screening costs doubled, screening remained highly cost-effective, but no longer cost-saving. These results contributed to the health authorities’ decision to expand the pilot program to the entire region in 2009. Conclusions Colonoscopic screening of first-degree relatives of persons with CRC may be cost-saving in public systems like Spain’s. Decision analytic modeling tailored to regional considerations can inform public policy decisions. Impact Tailored decision analytic modeling can inform regional policy decisions on cancer screening. PMID:20810603
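    The ratios quoted above come from an incremental cost-effectiveness calculation of the following form; the per-person inputs here are hypothetical, chosen only so that the ratio reproduces the reported 7,250 €/life-year figure.

```python
# Incremental cost-effectiveness ratio (ICER) between two screening strategies.
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost per unit of incremental effect (e.g. per life-year gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical discounted per-person values: 5-yearly vs. 10-yearly colonoscopy.
print(icer(cost_new=1450.0, cost_old=1015.0, effect_new=0.13, effect_old=0.07))
# -> 7250.0 euro per life-year gained
```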

  19. Accurate upper body rehabilitation system using kinect.

    PubMed

    Sinha, Sanjana; Bhowmick, Brojeshwar; Chakravarty, Kingshuk; Sinha, Aniruddha; Das, Abhijit

    2016-08-01

    The growing importance of Kinect as a tool for clinical assessment and rehabilitation is due to its portability, low cost and markerless system for human motion capture. However, the accuracy of Kinect in measuring three-dimensional body joint center locations often fails to meet clinical standards of accuracy when compared to marker-based motion capture systems such as Vicon. The length of the body segment connecting any two joints, measured as the distance between three-dimensional Kinect skeleton joint coordinates, has been observed to vary with time. The orientation of the line connecting adjoining Kinect skeletal coordinates has also been seen to differ from the actual orientation of the physical body segment. Hence we have proposed an optimization method that utilizes Kinect depth and RGB information to search for the joint center location that satisfies constraints on body segment length as well as orientation. An experimental study has been carried out on ten healthy participants performing upper body range of motion exercises. The results report a 72% reduction in body segment length variance and a 2° improvement in range of motion (ROM) angle, enabling more accurate measurements for upper limb exercises.

  20. Accurate quantum chemical calculations

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.

  1. How Do Students Meet the Cost of Attending a State University? Information Brief. Volume 4, Issue 2

    ERIC Educational Resources Information Center

    Florida Board of Governors, State University System, 2007

    2007-01-01

    Students and their families must cover, on average, 83% of the roughly $16,000 cost of attendance for a full-time, in-state undergraduate at a state university in Florida. On average, 75% of the cost of attendance in Florida's public universities is from expenses other than tuition and fees and books. The largest expense is room and board,…

  2. Toxic Substances: Information on Costs and Financial Aid to Schools To Control Asbestos. Fact Sheet for the Honorable John J. La Falce, House of Representatives.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Resources, Community, and Economic Development Div.

    Information on the costs of and financial aid available to schools for asbestos abatement is provided in this report. Data are based on interviews with officials from 15 school districts in 5 states--Illinois, New Jersey, New York, Ohio, and Pennsylvania. Section 1 provides background on the use of asbestos in buildings, health problems, federal…

  3. EPA (Environmental Protection Agency) evaluation of the HYDRO-VAC device under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1983-08-01

    This document announces the conclusions of the EPA evaluation of the HYDRO-VAC device under Section 511 of the Motor Vehicle Information and Cost Savings Act. The evaluation of the HYDRO-VAC device was conducted upon the application of the manufacturer. The product is claimed to improve fuel economy and performance for both gasoline and diesel fueled vehicles.

  4. EPA evaluation of the SYNERGY-1 fuel additive under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1981-06-01

    This document announces the conclusions of the EPA evaluation of the 'SYNERGY-1' device under provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. This additive is intended to improve fuel economy and reduce exhaust emission levels of two- and four-cycle gasoline-fueled engines.

  5. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE ...

    EPA Pesticide Factsheets

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33%-63%, with a mean of about 50%. Treatment of two of the soils with P significantly reduced the bioavailability of Pb. The bioaccessibility of the Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. They were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test” and the “Waterfowl Physiologically Based Extraction Test.” All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter 24%, or present as Pb sulfate 18%. Ad

  6. EPA (Environmental Protection Agency) evaluation of the Cyclone-Z device under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1983-01-01

    This document announces the conclusions of the EPA evaluation of the Cyclone-Z device under the provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. The evaluation of the Cyclone-Z device was conducted upon receiving an application from the marketer. The device is claimed to improve fuel economy and driveability and to reduce exhaust emissions. EPA fully considered all of the information submitted by the applicant. The evaluation of the Cyclone-Z device was based on that information, EPA's engineering judgement, and its experience with other air bleed devices.

  7. An Analysis of Rocket Propulsion Testing Costs

    NASA Technical Reports Server (NTRS)

    Ramirez, Carmen; Rahman, Shamim

    2010-01-01

    The primary mission at NASA Stennis Space Center (SSC) is rocket propulsion testing. Such testing is commonly characterized as one of two types: production testing for certification and acceptance of engine hardware, and developmental testing for prototype evaluation or research and development (R&D) purposes. For programmatic reasons there is a continuing need to assess and evaluate the test costs for the various types of test campaigns that involve liquid rocket propellant test articles. Presently, in fact, there is a critical need to provide guidance on what represents a best value for testing and to provide some key economic insights for decision-makers within NASA and the test customers outside the Agency. Hence, selected rocket propulsion test databases and references have been evaluated and analyzed with the intent of discovering correlations between technical information and test costs that could help produce more reliable and accurate cost projections in the future. The process of searching, collecting, and validating propulsion test cost information presented some unique obstacles, which led to a set of recommendations for improvement to facilitate future cost information gathering and analysis. In summary, this historical account and evaluation of rocket propulsion test cost information will enhance understanding of the various kinds of project cost information and identify certain trends of interest to the aerospace testing community.

  8. Guaranteed cost consensus protocol design for linear multi-agent systems with sampled-data information: An input delay approach.

    PubMed

    Zhao, Yadong; Zhang, Weidong

    2017-03-01

    To investigate the energy consumption involved in a sampled-data consensus process, the problem of guaranteed cost consensus for sampled-data linear multi-agent systems is considered. By using an input delay approach, an equivalent system is constructed to convert the guaranteed cost consensus problem to a guaranteed cost stabilization problem. A sufficient condition for guaranteed cost consensus is given in terms of linear matrix inequalities (LMIs), based on a refined time-dependent Lyapunov functional analysis. Reduced-order protocol design methodologies are proposed, complemented by further discussion of how to determine a sub-optimal protocol gain and enlarge the allowable sampling-interval bound. Simulation results illustrate the effectiveness of the theoretical results.
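    A typical quadratic guaranteed-cost index for problems of this kind (the paper's exact functional may differ) is shown below; delta(t) denotes the consensus error and u(t) the control input.

```latex
% With weighting matrices Q \succeq 0 and R \succ 0, the protocol must achieve
% consensus while guaranteeing a computable bound J^{*} on the quadratic cost
J \;=\; \int_0^{\infty} \Big( \delta^{\mathsf{T}}(t)\, Q\, \delta(t)
      \;+\; u^{\mathsf{T}}(t)\, R\, u(t) \Big)\, dt \;\le\; J^{*},
% certified via the LMI conditions referred to in the abstract.
```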

  9. Public Perceptions of Regulatory Costs, Their Uncertainty and Interindividual Distribution.

    PubMed

    Johnson, Branden B; Finkel, Adam M

    2016-06-01

    Public perceptions of both risks and regulatory costs shape rational regulatory choices. Despite decades of risk perception studies, this article is the first on regulatory cost perceptions. A survey of 744 U.S. residents probed: (1) How knowledgeable are laypeople about regulatory costs incurred to reduce risks? (2) Do laypeople see official estimates of cost and benefit (lives saved) as accurate? (3) (How) do preferences for hypothetical regulations change when mean-preserving spreads of uncertainty replace certain cost or benefit? and (4) (How) do preferences change when unequal interindividual distributions of hypothetical regulatory costs replace equal distributions? Respondents overestimated costs of regulatory compliance, while assuming agencies underestimate costs. Most assumed agency estimates of benefits are accurate; a third believed both cost and benefit estimates are accurate. Cost and benefit estimates presented without uncertainty were slightly preferred to those surrounded by "narrow uncertainty" (a range of costs or lives entirely within a personally-calibrated zone without clear acceptance or rejection of tradeoffs). Certain estimates were more preferred than "wide uncertainty" (a range of agency estimates extending beyond these personal bounds, thus posing a gamble between favored and unacceptable tradeoffs), particularly for costs as opposed to benefits (but even for costs a quarter of respondents preferred wide uncertainty to certainty). Agency-acknowledged uncertainty in general elicited mixed judgments of honesty and trustworthiness. People preferred egalitarian distributions of regulatory costs, despite skewed actual cost distributions, and preferred progressive cost distributions (the rich pay a greater than proportional share) to regressive ones. Efficient and socially responsive regulations require disclosure of much more information about regulatory costs and risks.

  10. PPD-QALY-an index for cost-effectiveness in orthopedics: providing essential information to both physicians and health care policy makers for appropriate allocation of medical resources.

    PubMed

    Dougherty, Christopher P; Howard, Timothy

    2013-09-01

    Because of increasing health care costs and the need for proper allocation of resources, it is important to ensure the best use of health benefits for sick and injured people in the population. An index or indicator is needed to help quantify what is being spent so that comparisons with other options can be made. Cost-effectiveness analysis seems well suited to provide this essential information to health care policy makers and those charged with distributing disability funds so that the proper allocation of resources can be achieved. There is currently no such index to show whether the benefits paid out are the most cost-effective. By comparing the quality-adjusted life years (QALYs) of a treatment method with the disability an individual would experience, using lost wages as a measure of disability, we provide decision makers with more information on which to base cost allocation in health care. To accomplish this, we describe a new term, the PPD-QALY (permanent partial disability-quality of life year). This term was developed to establish an index to which musculoskeletal care can be compared, to evaluate the cost-effectiveness of a treatment on the basis of the monetary value of the disability. This term serves to standardize the monetary value of an injury. Cost-effectiveness analysis in arthroscopic surgery may prove to be a valuable asset in this role, providing decision makers with the information needed to determine the societal benefit of new arthroscopic procedures as they are developed and implemented.

  11. The Efficacy of Written Information Intervention in Reduction of Hospital Re-admission Cost in Patients With Heart Failure; A Systematic Review and Meta-Analysis

    PubMed Central

    Zarea Gavgani, Vahideh; Kazemi Majd, Faranak; Nosratnejad, Shirin; Golmohammadi, Ali; Sadeghi-Bazargani, Homayoun

    2015-01-01

    Objective: To assess the efficacy of written versus non-written information interventions in reducing hospital readmission costs when prescribed or presented to patients with HF. Methods: The study was a systematic review and meta-analysis. We searched Medline (Ovid) and the Cochrane Library over the 20-year period from 1993 to 2013. We also conducted a manual search through Google Scholar and a direct search of related journals on the Blackwell and ScienceDirect websites. Two reviewers appraised the identified studies, and a meta-analysis was performed to estimate the mean saving in patient readmission cost. Only randomized studies were eligible for inclusion. Result: We assessed the full texts of 3 out of 65 studies, covering 754 patients with an average age of 74.33 years. The mean estimated saving in readmission cost in the intervention group versus the control group was US $2,751 (95% CI: 2708-2794), and the mean total cost saving in the intervention group versus the control group was US $2,047 (base year 2010; 95% CI: 2004-2089). No publication bias was found when testing the heterogeneity of studies. Conclusion: One of the effective factors in minimizing healthcare costs and preventing hospital re-admission is providing patients with an information prescription in written format. It is suggested that hospital management, Medicare organizations, policy makers and individual physicians consider the prescription of appropriate medical information an indispensable part of the patient care process. PMID:25859308

  12. Organizational Uses of Health Information Exchange to Change Cost and Utilization Outcomes: A Typology from a Multi-Site Qualitative Analysis.

    PubMed

    Vest, Joshua R; Abramson, Erika

    2015-01-01

    Health information exchange (HIE) systems facilitate access to patient information for a variety of health care organizations, end users, and clinical and organizational goals. While a complex intervention, organizations' usage of HIE is often conceptualized and measured narrowly. We sought to provide greater specificity to the concept of HIE as an intervention by formulating a typology of organizational HIE usage. We interviewed representatives of a regional health information organization and health care organizations actively using HIE information to change patient utilization and costs. The resultant typology includes three dimensions: user role, usage initiation, and patient set. This approach to categorizing how health care organizations are actually applying HIE information to clinical and business tasks provides greater clarity about HIE as an intervention and helps elucidate the conceptual linkage between HIE and organizational and patient outcomes.

  13. Priority Setting for Universal Health Coverage: We Need Evidence-Informed Deliberative Processes, Not Just More Evidence on Cost-Effectiveness.

    PubMed

    Baltussen, Rob; Jansen, Maarten P; Mikkelsen, Evelinn; Tromp, Noor; Hontelez, Jan; Bijlmakers, Leon; Van der Wilt, Gert Jan

    2016-06-22

    Priority setting of health interventions is generally considered a valuable approach to support low- and middle-income countries (LMICs) in their efforts to achieve universal health coverage (UHC). However, present initiatives on priority setting are mainly geared towards the development of more cost-effectiveness information, and this evidence does not sufficiently support countries in making optimal choices. The reason is that priority setting is in reality a value-laden political process in which multiple criteria beyond cost-effectiveness are important, and stakeholders often justifiably disagree about the relative importance of these criteria. Here, we propose the use of 'evidence-informed deliberative processes' as an approach that explicitly recognises priority setting as a political process and an intrinsically complex task. In these processes, deliberation between stakeholders is crucial to identify, reflect and learn about the meaning and importance of values, informed by evidence on these values. Such processes then result in the use of a broader range of explicit criteria that can be seen as the product of both international learning ('core' criteria, which include, e.g., cost-effectiveness, priority to the worse off, and financial protection) and learning among local stakeholders ('contextual' criteria). We believe that, with these evidence-informed deliberative processes in place, priority setting can provide a more meaningful contribution to achieving UHC.

  14. Development of standardized air-blown coal gasifier/gas turbine concepts for future electric power systems. Volume 5, Appendix D: Cost support information: Final report

    SciTech Connect

    Sadowski, R.S.; Brown, M.J.; Harriz, J.T.; Ostrowski, E.

    1991-01-01

    The cost estimate provided for the DOE-sponsored study of Air Blown Coal Gasification was developed from vendor quotes obtained directly for the equipment needed in the 50 MW, 100 MW, and 200 MW sized plants and from quotes from other jobs that have been referenced to apply to the particular cycle. Quotes were generally obtained for the 100 MW cycle, and a scale up/down factor was used to generate the cost estimates for the 200 MW and 50 MW cycles, respectively. Information from GTPro (property of Thermoflow, Inc.) was used to estimate the cost of the 200 MW and 50 MW gas turbine, HRSG, and steam turbines. To validate the use of GTPro's estimated values for this equipment, a comparison was made between the quotes obtained for the 100 MW cycle (ABB GT 11N combustion turbine and an HRSG) and the values estimated by GTPro.

  15. COSTS OF URBAN STORMWATER CONTROL

    EPA Science Inventory

    This paper presents information on the cost of stormwater pollution control facilities in urban areas, including collection, control, and treatment systems. Information on prior cost studies of control technologies and cost estimating models used in these studies was collected, r...

  16. The Psychology of Cost Estimating

    NASA Technical Reports Server (NTRS)

    Price, Andy

    2016-01-01

    Cost estimation for large (and even not so large) government programs is a challenge. The number and magnitude of cost overruns associated with large Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) programs highlight the difficulties in developing and promulgating accurate cost estimates. These overruns can be the result of inadequate technology readiness or requirements definition, the whims of politicians or government bureaucrats, or even as failures of the cost estimating profession itself. However, there may be another reason for cost overruns that is right in front of us, but only recently have we begun to grasp it: the fact that cost estimators and their customers are human. The last 70+ years of research into human psychology and behavioral economics have yielded amazing findings into how we humans process and use information to make judgments and decisions. What these scientists have uncovered is surprising: humans are often irrational and illogical beings, making decisions based on factors such as emotion and perception, rather than facts and data. These built-in biases to our thinking directly affect how we develop our cost estimates and how those cost estimates are used. We cost estimators can use this knowledge of biases to improve our cost estimates and also to improve how we communicate and work with our customers. By understanding how our customers think, and more importantly, why they think the way they do, we can have more productive relationships and greater influence. By using psychology to our advantage, we can more effectively help the decision maker and our organizations make fact-based decisions.

  17. An Investigation of the Perceptions of Low-Income Students of Color Concerning College Costs and Financial Aid Information

    ERIC Educational Resources Information Center

    Waters, Jennifer A.

    2009-01-01

    As college enrollments continue to increase, the disparity between middle-income white students and low-income students of color enrolling in private higher educational institutions continues to widen. Previous research has identified barriers such as access and equity in education, the high cost of education, and limited knowledge regarding…

  18. Processing of Perceptual Information Is More Robust than Processing of Conceptual Information in Preschool-Age Children: Evidence from Costs of Switching

    ERIC Educational Resources Information Center

    Fisher, Anna V.

    2011-01-01

    Is processing of conceptual information as robust as processing of perceptual information early in development? Existing empirical evidence is insufficient to answer this question. To examine this issue, 3- to 5-year-old children were presented with a flexible categorization task, in which target items (e.g., an open red umbrella) shared category…

  19. EPA evaluation of the Malpassi Filter King device under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1983-01-01

    This report announces the conclusions of the EPA evaluation of the 'Malpassi Filter King' device under provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. The evaluation of the 'Malpassi Filter King' device was conducted upon receiving an application from the marketer. The device is a gasoline pressure regulator. The 'Malpassi Filter King' device is claimed to save gasoline by improving the fuel economy of carburetor-equipped automotive engines.

  20. Low-Cost Sensors Deliver Nanometer-Accurate Measurements

    NASA Technical Reports Server (NTRS)

    2015-01-01

    As part of a unique partnership program, Kennedy Space Center collaborated with a nearby business school to allow MBA students to examine and analyze the market potential for a selection of NASA-patented technologies. Following the semester, a group of students decided to form Winter Park, Florida-based Juntura Group Inc. to license and sell a technology they had worked with: a sensor capable of detecting position changes as small as 10 nanometers, approximately the thickness of a cell wall.

  1. SUPPORT Tools for evidence-informed health Policymaking (STP) 12: Finding and using research evidence about resource use and costs

    PubMed Central

    2009-01-01

    This article is part of a series written for people responsible for making decisions about health policies and programmes and for those who support these decision makers. In this article, we address considerations about resource use and costs. The consequences of a policy or programme option for resource use differ from other impacts (both in terms of benefits and harms) in several ways. However, considerations of the consequences of options for resource use are similar to considerations related to other impacts in that policymakers and their staff need to identify important impacts on resource use, acquire and appraise the best available evidence regarding those impacts, and ensure that appropriate monetary values have been applied. We suggest four questions that can be considered when assessing resource use and the cost consequences of an option. These are: 1. What are the most important impacts on resource use? 2. What evidence is there for important impacts on resource use? 3. How confident is it possible to be in the evidence for impacts on resource use? 4. Have the impacts on resource use been valued appropriately in terms of their true costs? PMID:20018102

  2. Cost and quality of fuels for electric plants 1993

    SciTech Connect

    Not Available

    1994-07-01

    The Cost and Quality of Fuels for Electric Utility Plants (C&Q) presents an annual summary of statistics at the national, Census division, State, electric utility, and plant levels regarding the quantity, quality, and cost of fossil fuels used to produce electricity. The purpose of this publication is to provide energy decision-makers with accurate and timely information that may be used in forming various perspectives on issues regarding electric power.

  3. Within a smoking-cessation program, what impact does genetic information on lung cancer need to have to demonstrate cost-effectiveness?

    PubMed Central

    2010-01-01

    Background Many smoking-cessation programs and pharmaceutical aids demonstrate substantial health gains for a relatively low allocation of resources. Genetic information represents a type of individualized or personal feedback regarding the risk of developing lung cancer, and hence the potential benefits of stopping smoking, and may motivate the person to remain smoke-free. The purpose of this study was to explore what impact a genetic test needs to have within a typical smoking-cessation program aimed at heavy smokers in order to be cost-effective. Methods Two strategies were modelled for a hypothetical cohort of heavy smokers aged 50 years; individuals either received or did not receive a genetic test within the course of a usual smoking-cessation intervention comprising nicotine replacement therapy (NRT) and counselling. A Markov model was constructed using evidence from published randomized controlled trials and meta-analyses for estimates on 12-month quit rates and long-term relapse rates. Epidemiological data were used for estimates on lung cancer risk stratified by time since quitting and smoking patterns. Extensive sensitivity analyses were used to explore parameter uncertainty. Results The discounted incremental cost per QALY was AU$34,687 (95% CI $12,483, $87,734) over 35 years. At a willingness-to-pay of AU$20,000 per QALY gained, the genetic testing strategy needs to produce a 12-month quit rate of at least 12.4% or a relapse rate 12% lower than NRT and counselling alone for it to be equally cost-effective. The likelihood that adding a genetic test to the usual smoking-cessation intervention is cost-effective was 20.6%; however, cost-effectiveness ratios were favourable in certain situations (e.g., applied to men only, or to a 60-year-old cohort). Conclusions The findings were sensitive to small changes in critical variables such as the 12-month quit rates and relapse rates. As such, the cost-effectiveness of the genetic testing smoking cessation program
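    The study's Markov model itself is not reproduced in the abstract, but the basic calculation it feeds, discounting cohort costs and QALYs for each strategy and forming an incremental cost-effectiveness ratio (ICER) against a willingness-to-pay threshold, can be sketched as follows. All transition probabilities, costs, and utilities below are hypothetical placeholders, not values from the study.

```python
import numpy as np

# Minimal sketch of a two-strategy cost-effectiveness comparison via a small
# three-state Markov cohort model ('smoker', 'former smoker', 'lung cancer').
# Every number below is a hypothetical placeholder.

def run_cohort(p_quit, p_relapse, p_cancer_smoker, p_cancer_quitter,
               cost_intervention, years=35, discount=0.05):
    states = np.array([1.0, 0.0, 0.0])            # everyone starts as a smoker
    # annual transition matrix (rows: from-state, cols: to-state); rows sum to 1
    T = np.array([
        [1 - p_quit - p_cancer_smoker, p_quit, p_cancer_smoker],
        [p_relapse, 1 - p_relapse - p_cancer_quitter, p_cancer_quitter],
        [0.0, 0.0, 1.0],
    ])
    utilities = np.array([0.85, 0.90, 0.60])      # QALY weight per state-year
    annual_costs = np.array([0.0, 0.0, 30000.0])  # treatment cost per state-year
    total_cost, total_qaly = cost_intervention, 0.0
    for year in range(years):
        df = 1.0 / (1 + discount) ** year
        total_cost += df * states @ annual_costs
        total_qaly += df * states @ utilities
        states = states @ T
    return total_cost, total_qaly

cost_base, qaly_base = run_cohort(0.10, 0.05, 0.012, 0.006, cost_intervention=500)
cost_test, qaly_test = run_cohort(0.13, 0.04, 0.012, 0.006, cost_intervention=900)
d_cost, d_qaly = cost_test - cost_base, qaly_test - qaly_base
# If d_cost < 0 and d_qaly > 0 the test strategy dominates; otherwise compare
# the ICER with the willingness-to-pay threshold (AU$20,000/QALY in the study).
icer = d_cost / d_qaly
print(f"incremental cost {d_cost:,.0f}, incremental QALYs {d_qaly:.4f}, ICER {icer:,.0f}")
```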

  4. Cost and quality of fuels for electric utility plants, 1992

    SciTech Connect

    Not Available

    1993-08-02

    This publication presents an annual summary of statistics at the national, Census division, State, electric utility, and plant levels regarding the quantity, quality, and cost of fossil fuels used to produce electricity. The purpose of this publication is to provide energy decision-makers with accurate and timely information that may be used in forming various perspectives on issues regarding electric power.

  5. Cost and quality of fuels for electric utility plants, 1994

    SciTech Connect

    1995-07-14

    This document presents an annual summary of statistics at the national, Census division, State, electric utility, and plant levels regarding the quantity, quality, and cost of fossil fuels used to produce electricity. The purpose of this publication is to provide energy decision-makers with accurate, timely information that may be used in forming various perspectives on issues regarding electric power.

  6. Cost Analysis of Selected Patient Categories within a Dermatology Department Using an ABC Approach

    PubMed Central

    Papadaki, Šárka; Popesko, Boris

    2016-01-01

    Background: Present trends in hospital management are facilitating the utilization of more accurate costing methods, which potentially results in superior cost-related information and improved managerial decision-making. However, the Activity-Based Costing method (ABC), which was designed for cost allocation purposes in the 1980s, is not widely used by healthcare organizations. This study analyzes costs related to selected categories of patients, those suffering from psoriasis, varicose ulcers, eczema and other conditions, within a dermatology department at a Czech regional hospital. Methods: The study was conducted in a hospital department where both inpatient and outpatient care are offered. Firstly, the diseases treated at the department were identified. Further, costs were determined for each activity using ABC. The study utilized data from managerial and financial accounting, as well as data obtained through interviews with departmental staff. Using a defined cost-allocation procedure makes it possible to determine the cost of an individual patient with a given disease more accurately than via traditional costing procedures. Results: The cost analysis focused on the differences between the costs related to individual patients within the selected diagnoses, variations between inpatient and outpatient treatments and the costs of activities performed by the dermatology department. Furthermore, comparing the costs identified through this approach and the revenue stemming from the health insurance system is an option. Conclusions: Activity-Based Costing is more accurate and relevant than the traditional costing method. The outputs of ABC provide an abundance of additional information for managers. The benefits of this research lie in its practically-tested outputs, resulting from calculating the costs of hospitalization, which could prove invaluable to persons involved in hospital management and decision-making. The study also defines the managerial implications of
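    The core ABC allocation logic referred to above (resource costs pooled into activities, an activity rate per driver unit, and patients charged for the driver units they consume) can be sketched as follows; the activities, cost pools, and driver volumes are hypothetical, not the department's data.

```python
# Minimal sketch of activity-based costing (ABC) allocation. Activity names,
# pooled costs, and driver volumes are hypothetical placeholders.

activity_costs = {          # total annual cost pooled into each activity
    "admission": 120_000.0,
    "nursing_care": 480_000.0,
    "outpatient_visit": 200_000.0,
}
activity_driver_volume = {  # total annual driver units per activity
    "admission": 1_500,          # admissions
    "nursing_care": 24_000,      # bed-days
    "outpatient_visit": 10_000,  # visits
}
activity_rates = {a: activity_costs[a] / activity_driver_volume[a]
                  for a in activity_costs}

def patient_cost(driver_usage, direct_costs):
    """Cost of one patient = direct costs + sum(driver units * activity rate)."""
    allocated = sum(units * activity_rates[a] for a, units in driver_usage.items())
    return direct_costs + allocated

# e.g. a hypothetical inpatient: 1 admission, 6 bed-days, 2 follow-up visits
print(round(patient_cost({"admission": 1, "nursing_care": 6, "outpatient_visit": 2},
                         direct_costs=350.0), 2))
```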

  7. The Impact of Densification by Means of Informal Shacks in the Backyards of Low-Cost Houses on the Environment and Service Delivery in Cape Town, South Africa

    PubMed Central

    Govender, Thashlin; Barnes, Jo M.; Pieper, Clarissa H.

    2011-01-01

    This paper investigates the state-sponsored low cost housing provided to previously disadvantaged communities in the City of Cape Town. The strain imposed on municipal services by informal densification of unofficial backyard shacks was found to create unintended public health risks. Four subsidized low-cost housing communities were selected within the City of Cape Town in this cross-sectional survey. Data was obtained from 1080 persons with a response rate of 100%. Illegal electrical connections to backyard shacks that are made of flimsy materials posed increased fire risks. A high proportion of main house owners did not pay for water but sold water to backyard dwellers. The design of state-subsidised houses and the unplanned housing in the backyard added enormous pressure on the existing municipal infrastructure and the environment. Municipal water and sewerage systems and solid waste disposal cannot cope with the increased population density and poor sanitation behaviour of the inhabitants of these settlements. The low-cost housing program in South Africa requires improved management and prudent policies to cope with the densification of state-funded low-cost housing settlements. PMID:21695092

  8. The Cost of Information Systems Modernization: A Comparison of Options for Life-Cycle Project Management Systems

    DTIC Science & Technology

    1990-08-01

    [Record excerpt garbled in extraction; the surviving fragments come from a comparison table of life-cycle project management system options covering hardware platforms (PC, minicomputer, mainframe), off-the-shelf project management/scheduling software (e.g., Open Plan, Primavera), databases (e.g., INFORMIX), and existing systems such as CETAL, CWS, PMRS, and COEMIS F&A.]

  9. Financing and budgetary impact of landslide losses for highways and urban infrastructures in NW Germany - an economic analysis using landslide database information and cost survey data

    NASA Astrophysics Data System (ADS)

    Maurischat, Philipp; Klose, Martin

    2014-05-01

    Recent studies show that, even in the low mountain areas of Central and Western Europe, landslides cause millions of dollars in annual losses (Klose et al., 2012; Vranken et al., 2013). The objective of this study has therefore been to model landslide disaster financing and to assess the budgetary impacts of landslide losses for highways and urban infrastructures in the Lower Saxon Uplands, NW Germany. The present contribution includes two case studies on the financial burden of landslides for public budgets, using the examples of the Lower Saxony Department of Transportation and the city of Hann. Münden. The basis of this research is a regional subset of a landslide database for the Federal Republic of Germany. Using a toolset for landslide cost modeling based on landslide databases (Klose et al., 2013), the direct costs of more than 30 landslide damage events to highways in a local case study area were determined. The annual average landslide maintenance, repair, and mitigation costs for highways in this case study area are estimated at 0.76 million between 1980 and 2010. In addition, a cost survey based on expert interviews was conducted to collect landslide loss data for urban infrastructures. This cost survey for the city of Hann. Münden shows annual landslide losses of up to 3.4 million during the previous 10 years. Further expert interviews at city and highway agency level focused on identifying the procedures, resources, and limits of financing landslide damage costs. The information on landslide disaster financing and cost survey data on annual maintenance and construction budgets for highways, city sewer lines, and urban roads were used to evaluate the fiscal significance of estimated landslide losses. The results of this economic impact assessment reveal variable financial burdens on the analyzed public budgets. Thus, in costly years with landslide losses of more than 7 million, the Lower Saxony Department of Transportation is required to shift up to 19% of its

  10. Human African trypanosomiasis prevention, treatment and control costs: a systematic review.

    PubMed

    Keating, Joseph; Yukich, Joshua O; Sutherland, C Simone; Woods, Geordie; Tediosi, Fabrizio

    2015-10-01

    The control and eventual elimination of human African trypanosomiasis (HAT) requires the expansion of current control and surveillance activities. A systematic review of the published literature on the costs of HAT prevention, treatment, and control, in addition to the economic burden, was conducted. All studies that contained primary or secondary data on costs of prevention, treatment and control were considered, resulting in the inclusion of 42 papers. The geographically focal nature of the disease and a lack of standardization in the cost data limit the usefulness of the available information for making generalizations across diverse settings. More recent information on the costs of treatment and control interventions for HAT is needed to provide accurate information for analyses and planning. The cost information contained herein can be used to inform rational decision making in control and elimination programs, and to assess potential synergies with existing vector-borne disease control programs, but programs would benefit significantly from new cost data collection.

  11. Who bears the cost of 'informal mhealth'? Health-workers' mobile phone practices and associated political-moral economies of care in Ghana and Malawi.

    PubMed

    Hampshire, Kate; Porter, Gina; Mariwah, Simon; Munthali, Alister; Robson, Elsbeth; Owusu, Samuel Asiedu; Abane, Albert; Milner, James

    2017-02-01

    Africa's recent communications 'revolution' has generated optimism that using mobile phones for health (mhealth) can help bridge healthcare gaps, particularly for rural, hard-to-reach populations. However, while scale-up of mhealth pilots remains limited, health-workers across the continent possess mobile phones. This article draws on interviews from Ghana and Malawi to ask whether/how health-workers are using their phones informally and with what consequences. Health-workers were found to use personal mobile phones for a wide range of purposes: obtaining help in emergencies; communicating with patients/colleagues; facilitating community-based care, patient monitoring and medication adherence; obtaining clinical advice/information and managing logistics. However, the costs were being borne by the health-workers themselves, particularly by those at the lower echelons, in rural communities, often on minimal stipends/salaries, who are required to 'care' even at substantial personal cost. Although there is significant potential for 'informal mhealth' to improve (rural) healthcare, there is a risk that the associated moral and political economies of care will reinforce existing socioeconomic and geographic inequalities.

  12. EPA (Environmental Protection Agency) evaluation of the gyroscopic wheel cover device under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1983-06-01

    This report announces the conclusions of the Environmental Protection Agency (EPA) evaluation of the Gyroscopic Wheel Cover under the provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. The evaluation of the Gyroscopic Wheel Cover device was conducted upon the application of Simmer Wheels, Incorporated. The device is a mechanical assembly which replaces each of the standard wheel covers on a vehicle. The device is claimed to improve fuel economy, handling and braking characteristics, and the life of the brakes and tires.

  13. EPA (Environmental Protection Agency) evaluation of the P. S. C. U. 01 device under section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1983-08-01

    This document announces the conclusions of the EPA evaluation of the 'P.S.C.U. 01' device under the provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. The evaluation of the P.S.C.U. 01 was conducted upon the application of Dutch Pacific, Incorporated. The device comprises several mechanical and electrical components and is intended to generate steam and deliver it to the combustion chamber via an inline catalyst. The device is claimed to improve fuel economy and to reduce exhaust emissions. The P.S.C.U. 01 is classified by EPA as a vapor bleed device.

  14. Bronchial asthma healthcare costs in Mexico: analysis of trends from 1991-1996 with information from the Mexican Institute of Social Security.

    PubMed

    Rico-Méndez, F G; Barquera, S; Cabrera, D A; Escobedo, S; Ochoa, L G; Massey-Reynaud, L F

    2000-01-01

    Cost trends for bronchial asthma have not been previously estimated in Mexico. The increasing prevalence of bronchial asthma as well as its elevated costs make it necessary to expand the availability of information for health planners. This is a growing problem which has been given little attention in national health reports. We did a descriptive, retrospective analysis using national data from the Mexican Institute for Social Security. We estimated the number of medical consultations provided by the state family medicine and specialty areas. A total of 756,843 consultations due to bronchial asthma were provided between 1991 and 1996 in the service areas under study. The healthcare expenditure for bronchial asthma showed an ascending and sustained trend during the study period. When analyzing the trends by type of service, a significant increase in in-hospital care was observed, ranging from US $14.5 (1991) to $19.8 (1996) million and a maximum of $28.4 (1994) million. A similar increase was found in specialty consultations, from $3.96 (1991) to $8.5 (1996) million; in emergencies, from $1.1 (1991) to $2.9 (1996) million; and in family medicine, from $0.66 (1991) to $0.79 (1996) million. Bronchial asthma follows the same pattern as other noncommunicable chronic diseases, increasing in highly urbanized areas and nationwide. In order to improve healthcare and maximize results with scarce resources, a set of strategies is presented to reduce bronchial asthma recurrence, decrease healthcare costs, and improve quality of life.

  15. Influence of pansharpening techniques in obtaining accurate vegetation thematic maps

    NASA Astrophysics Data System (ADS)

    Ibarrola-Ulzurrun, Edurne; Gonzalo-Martin, Consuelo; Marcello-Ruiz, Javier

    2016-10-01

    In recent decades there has been a decline in natural resources, making it important to develop reliable methodologies for their management. The appearance of very high resolution sensors has offered a practical and cost-effective means for good environmental management. In this context, improvements are needed to obtain higher-quality information and thereby reliable classified images. Pansharpening enhances the spatial resolution of the multispectral bands by incorporating information from the panchromatic image. The main goal of the study is to apply pixel-based and object-based classification techniques to imagery fused with different pansharpening algorithms, and to evaluate the thematic maps generated, which serve to obtain accurate information for the conservation of natural resources. A vulnerable, heterogeneous ecosystem in the Canary Islands (Spain) was chosen, Teide National Park, and Worldview-2 high resolution imagery was employed. The classes considered of interest were set by the National Park conservation managers. Seven pansharpening techniques (GS, FIHS, HCS, MTF based, Wavelet 'à trous' and Weighted Wavelet 'à trous' through Fractal Dimension Maps) were chosen in order to improve the data quality, with the goal of analyzing the vegetation classes. Different classification algorithms were then applied using pixel-based and object-based approaches, and an accuracy assessment of the different thematic maps obtained was performed. The highest classification accuracy was obtained by applying the Support Vector Machine classifier at the object-based level to the image fused with the Weighted Wavelet 'à trous' through Fractal Dimension Maps technique. Finally, the results highlight the difficulty of classification in the Teide ecosystem due to the heterogeneity and the small size of the species. It is therefore important to obtain accurate thematic maps for further studies in the management and conservation of natural resources.
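    The accuracy assessment step mentioned above usually reduces to a confusion matrix and the overall, producer's, and user's accuracies derived from it. A minimal sketch using hypothetical reference and predicted labels rather than the study's data:

```python
import numpy as np

# Minimal sketch of thematic-map accuracy assessment: build a confusion matrix
# from reference vs. predicted class labels and derive overall, producer's and
# user's accuracies. The label arrays are hypothetical.

def confusion_matrix(reference, predicted, n_classes):
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for r, p in zip(reference, predicted):
        cm[r, p] += 1
    return cm

reference = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 0])   # ground-truth classes
predicted = np.array([0, 1, 1, 1, 2, 2, 2, 2, 1, 0])   # classifier output

cm = confusion_matrix(reference, predicted, n_classes=3)
overall = np.trace(cm) / cm.sum()
producers = np.diag(cm) / cm.sum(axis=1)   # omission-error view (per reference class)
users = np.diag(cm) / cm.sum(axis=0)       # commission-error view (per mapped class)
print(cm, overall, producers, users, sep="\n")
```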

  16. Role of information systems in controlling costs: the electronic medical record (EMR) and the high-performance computing and communications (HPCC) efforts

    NASA Astrophysics Data System (ADS)

    Kun, Luis G.

    1994-12-01

    On October 18, 1991, the IEEE-USA produced an entity statement which endorsed the vital importance of the High Performance Computer and Communications Act of 1991 (HPCC) and called for the rapid implementation of all its elements. Efforts are now underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the HPCC, and the so-called `Patient Card'. Multiple legislative initiatives which address these and related information technology issues are pending in Congress. Clearly, a national information system will greatly affect the way health care delivery is provided to the United States public. Timely and reliable information represents a critical element in any initiative to reform the health care system as well as to protect and improve the health of every person. Appropriately used, information technologies offer a vital means of improving the quality of patient care, increasing access to universal care and lowering overall costs within a national health care program. Health care reform legislation should reflect increased budgetary support and a legal mandate for the creation of a national health care information system by: (1) constructing a National Information Infrastructure; (2) building a Computer Based Patient Record System; (3) bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and (4) utilizing Government (e.g. DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs and accelerate technology transfer to address health care issues.

  17. Accurate Evaluation of Quantum Integrals

    NASA Technical Reports Server (NTRS)

    Galant, D. C.; Goorvitch, D.; Witteborn, Fred C. (Technical Monitor)

    1995-01-01

    Combining an appropriate finite difference method with Richardson's extrapolation results in a simple, highly accurate numerical method for solving the Schrodinger equation. Important results are that error estimates are provided, and that one can extrapolate expectation values rather than the wavefunctions to obtain highly accurate expectation values. We discuss the eigenvalues, the error growth in repeated Richardson's extrapolation, and show that the expectation values calculated on a crude mesh can be extrapolated to obtain expectation values of high accuracy.
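    A minimal sketch of Richardson extrapolation applied to a simple second-order finite-difference approximation, illustrating how results computed on crude meshes can be extrapolated to much higher accuracy; this shows the general technique only and is not the authors' Schrodinger-equation code.

```python
import math

# Richardson extrapolation of a second-order central difference (error ~ h^2).
# Column j of the table removes the h^(2j) error term using the factor 4^j.

def central_diff(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)

def richardson(f, x, h, levels=4):
    """Return the most extrapolated entry of a Richardson table."""
    table = [[central_diff(f, x, h / 2**i)] for i in range(levels)]
    for j in range(1, levels):
        for i in range(j, levels):
            factor = 4**j
            table[i].append((factor * table[i][j - 1] - table[i - 1][j - 1]) / (factor - 1))
    return table[-1][-1]

exact = math.cos(1.0)
print(abs(central_diff(math.sin, 1.0, 0.1) - exact))   # plain finite difference
print(abs(richardson(math.sin, 1.0, 0.1) - exact))     # extrapolated, far smaller error
```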

  18. Social cost impact assessment of pipeline infrastructure projects

    SciTech Connect

    Matthews, John C.; Allouche, Erez N.; Sterling, Raymond L.

    2015-01-15

    A key advantage of trenchless construction methods compared with traditional open-cut methods is their ability to install or rehabilitate underground utility systems with limited disruption to the surrounding built and natural environments. The equivalent monetary values of these disruptions are commonly called social costs. Social costs are often ignored by engineers or project managers during project planning and design phases, partially because they cannot be calculated using standard estimating methods. In recent years some approaches for estimating social costs were presented. Nevertheless, the cost data needed for validation of these estimating methods is lacking. Development of such social cost databases can be accomplished by compiling relevant information reported in various case histories. This paper identifies eight most important social cost categories, presents mathematical methods for calculating them, and summarizes the social cost impacts for two pipeline construction projects. The case histories are analyzed in order to identify trends for the various social cost categories. The effectiveness of the methods used to estimate these values is also discussed. These findings are valuable for pipeline infrastructure engineers making renewal technology selection decisions by providing a more accurate process for the assessment of social costs and impacts. - Highlights: • Identified the eight most important social cost factors for pipeline construction • Presented mathematical methods for calculating those social cost factors • Summarized social cost impacts for two pipeline construction projects • Analyzed those projects to identify trends for the social cost factors.
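    As an illustration of how such social-cost categories are typically monetized and summed, the sketch below compares hypothetical open-cut and trenchless scenarios; the category formulas and unit values are placeholders, not the paper's eight categories or its case-history data.

```python
# Minimal sketch of monetizing and summing a few social-cost categories for a
# pipeline renewal comparison. Categories, formulas, and unit values are
# hypothetical placeholders.

def traffic_delay_cost(vehicles_per_day, delay_min_per_vehicle, days, value_of_time_per_hr):
    return vehicles_per_day * (delay_min_per_vehicle / 60.0) * days * value_of_time_per_hr

def business_loss_cost(affected_businesses, lost_revenue_per_day, days, margin=0.15):
    return affected_businesses * lost_revenue_per_day * days * margin

def pavement_restoration_cost(trench_area_m2, unit_cost_per_m2):
    return trench_area_m2 * unit_cost_per_m2

open_cut = (traffic_delay_cost(8000, 4, 30, 18.0)
            + business_loss_cost(12, 2500, 30)
            + pavement_restoration_cost(900, 60.0))
trenchless = (traffic_delay_cost(8000, 1, 10, 18.0)
              + business_loss_cost(12, 2500, 10)
              + pavement_restoration_cost(60, 60.0))
print(f"open-cut social cost ~ ${open_cut:,.0f}; trenchless ~ ${trenchless:,.0f}")
```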

  19. Despite the spread of health information exchange, there is little evidence of its impact on cost, use, and quality of care.

    PubMed

    Rahurkar, Saurabh; Vest, Joshua R; Menachemi, Nir

    2015-03-01

    Health information exchange (HIE), which is the transfer of electronic information such as laboratory results, clinical summaries, and medication lists, is believed to boost efficiency, reduce health care costs, and improve outcomes for patients. Stimulated by federal financial incentives, about two-thirds of hospitals and almost half of physician practices are now engaged in some type of HIE with outside organizations. To determine how HIE has affected such health care measures as cost, service use, and quality, we identified twenty-seven scientific studies, extracted selected characteristics from each, and meta-analyzed these characteristics for trends. Overall, 57 percent of published analyses reported some benefit from HIE. However, articles employing study designs having strong internal validity, such as randomized controlled trials or quasi-experiments, were significantly less likely than others to associate HIE with benefits. Among six articles with strong internal validity, one study reported paradoxical negative effects, three studies found no effect, and two studies reported that HIE led to benefits. Furthermore, these two studies had narrower focuses than the others. Overall, little generalizable evidence currently exists regarding benefits attributable to HIE.

  20. Automatic classification and accurate size measurement of blank mask defects

    NASA Astrophysics Data System (ADS)

    Bhamidipati, Samir; Paninjath, Sankaranarayanan; Pereira, Mark; Buck, Peter

    2015-07-01

    A blank mask and its preparation stages, such as cleaning or resist coating, play an important role in the eventual yield obtained by using it. Blank mask defects' impact analysis directly depends on the amount of available information such as the number of defects observed, their accurate locations and sizes. Mask usability qualification at the start of the preparation process, is crudely based on number of defects. Similarly, defect information such as size is sought to estimate eventual defect printability on the wafer. Tracking of defect characteristics, specifically size and shape, across multiple stages, can further be indicative of process related information such as cleaning or coating process efficiencies. At the first level, inspection machines address the requirement of defect characterization by detecting and reporting relevant defect information. The analysis of this information though is still largely a manual process. With advancing technology nodes and reducing half-pitch sizes, a large number of defects are observed; and the detailed knowledge associated, make manual defect review process an arduous task, in addition to adding sensitivity to human errors. Cases where defect information reported by inspection machine is not sufficient, mask shops rely on other tools. Use of CDSEM tools is one such option. However, these additional steps translate into increased costs. Calibre NxDAT based MDPAutoClassify tool provides an automated software alternative to the manual defect review process. Working on defect images generated by inspection machines, the tool extracts and reports additional information such as defect location, useful for defect avoidance[4][5]; defect size, useful in estimating defect printability; and, defect nature e.g. particle, scratch, resist void, etc., useful for process monitoring. The tool makes use of smart and elaborate post-processing algorithms to achieve this. Their elaborateness is a consequence of the variety and

  1. Activity-Based Costing: A Cost Management Tool.

    ERIC Educational Resources Information Center

    Turk, Frederick J.

    1993-01-01

    In college and university administration, overhead costs are often charged to programs indiscriminately, whereas the support activities that underlie those costs remain unanalyzed. It is time for institutions to decrease ineffective use of resources. Activity-based management attributes costs more accurately and can improve efficiency. (MSE)

  2. Activity-Based Costing in the After Press Services Industry

    NASA Astrophysics Data System (ADS)

    Shevasuthisilp, Suntichai; Punsathitwong, Kosum

    2009-10-01

    This research was conducted to apply activity-based costing (ABC) in an after press service company in Chiang Mai province, Thailand. The company produces all of its products by one-stop service (such as coating, stitching, binding, die cutting, and gluing). All products are made to order, and have different sizes and patterns. A strategy of low price is used to compete in the marketplace. After cost analysis, the study found that the company has high overhead (36.5% of total cost). The company's problem is its use of traditional cost accounting, which has low accuracy in assigning overhead costs. If management uses this information when pricing customer orders, losses may occur because real production costs may be higher than the selling price. Therefore, the application of ABC in cost analysis can help executives receive accurate cost information; establish a sound pricing strategy; and improve the manufacturing process by determining work activities which have excessively high production costs. According to this research, 6 out of 56 items had a production cost higher than the selling price, leading to losses of 123,923 baht per year. Methods used to solve this problem were: reducing production costs; establishing suitable prices; and creating a sales promotion with lower prices for customers whose orders include processes involving unused capacity. These actions will increase overall sales of the company, and allow more efficient use of its machinery.
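    The pricing check described above, comparing ABC-derived unit costs with selling prices and totalling the annual loss from under-priced items, can be sketched as follows with hypothetical item data (not the company's 56 actual products).

```python
# Minimal sketch: flag items sold below their ABC unit cost and total the
# resulting annual loss. The item data are hypothetical placeholders.

items = [
    # (name, selling_price, abc_unit_cost, annual_volume)
    ("coating_A4",  2.50, 2.80, 40_000),
    ("stitching",   1.20, 0.95, 60_000),
    ("die_cut_box", 4.00, 4.35, 15_000),
    ("gluing",      0.80, 0.70, 90_000),
]

annual_loss = 0.0
for name, price, cost, volume in items:
    margin = price - cost
    if margin < 0:
        annual_loss += -margin * volume
        print(f"{name}: sold below ABC cost by {-margin:.2f} per unit")
print(f"total annual loss from under-priced items: {annual_loss:,.0f} baht")
```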

  3. Use of information systems as management tools in health care

    NASA Astrophysics Data System (ADS)

    Davila, Fidel

    1995-10-01

    Information systems that can be used as effective management tools in healthcare do not exist. This is because current information systems do not accurately reflect reality and because they do not provide information to important end-users, i.e., clinicians. To reflect reality, healthcare information systems must assess total health care costs. These not only include the direct economic costs (dollars paid) but also the indirect economic costs (dollars lost, spent, or saved) from having a person ill. These systems must also accurately assess the adjusted, qualitative costs of human life and human pain and suffering resulting from the illness and healthcare provided. Once information systems reflect reality, they can be used to manage healthcare by profiling utilization, projecting need, modeling programs, assessing quality of care and establishing guidelines.

  4. Cost estimation model for advanced planetary programs, fourth edition

    NASA Technical Reports Server (NTRS)

    Spadoni, D. J.

    1983-01-01

    The development of the planetary program cost model is discussed. The Model was updated to incorporate cost data from the most recent US planetary flight projects and extensively revised to more accurately capture the information in the historical cost data base. This data base comprises the historical cost data for 13 unmanned lunar and planetary flight programs. The revision was made with a twofold objective: to increase the flexibility of the model in its ability to deal with the broad scope of scenarios under consideration for future missions, and to maintain and possibly improve upon the confidence in the model's capabilities with an expected accuracy of 20%. The Model development included a labor/cost proxy analysis, selection of the functional forms of the estimating relationships, and test statistics. An analysis of the Model is discussed and two sample applications of the cost model are presented.
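    The Model's estimating relationships are not given in the abstract, but the generic step of choosing a functional form and fitting it to historical cost data can be sketched as a power-law cost estimating relationship (CER) fitted in log space; the (mass, cost) pairs below are hypothetical, not the 13-program database.

```python
import numpy as np

# Minimal sketch of fitting a power-law CER, cost = a * mass^b, by ordinary
# least squares in log space. The historical (mass, cost) pairs are hypothetical.

mass_kg = np.array([150, 300, 450, 700, 1100, 1600])          # spacecraft dry mass
cost_m = np.array([55, 95, 120, 180, 260, 340], dtype=float)  # cost, $M (constant-year)

b, log_a = np.polyfit(np.log(mass_kg), np.log(cost_m), 1)
a = np.exp(log_a)

def cer(mass):
    return a * mass ** b

residuals = np.log(cost_m) - np.log(cer(mass_kg))
scatter_pct = (np.exp(residuals.std()) - 1) * 100   # multiplicative scatter of the fit
print(f"CER: cost ~ {a:.2f} * mass^{b:.2f}; ~{scatter_pct:.0f}% log-space scatter")
print(f"estimate for a hypothetical 900 kg spacecraft: ${cer(900):.0f} M")
```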

  5. COST OF MTBE REMEDIATION

    EPA Science Inventory

    Widespread contamination of methyl tert-butyl ether (MTBE) in ground water has raised concerns about the increased cost of remediation of MTBE releases compared to BTEX-only sites. To evaluate these costs, cost information for 311 sites was furnished by U.S. EPA Office of Undergr...

  6. Cost of dengue outbreaks: literature review and country case studies

    PubMed Central

    2013-01-01

    different cost components (vector control; surveillance; information, education and communication; direct medical and indirect costs), as a percentage of total costs, differed across the respective countries. Resources used for dengue disease control and treatment were country specific. Conclusions The evidence so far collected further confirms the methodological challenges in this field: 1) defining dengue outbreaks in technical terms (what do we measure?) and 2) measuring the costs accurately in prospective field studies (how do we measure?). Currently, consensus on the technical definition of an outbreak is sought through the International Research Consortium on Dengue Risk Assessment, Management and Surveillance (IDAMS). Best practice guidelines should be further developed, also to improve the quality and comparability of cost study findings. Modelling the costs of dengue outbreaks and validating these models through field studies should guide further research. PMID:24195519

  7. Accurate tracking of high dynamic vehicles with translated GPS

    NASA Astrophysics Data System (ADS)

    Blankshain, Kenneth M.

    The GPS concept and the translator processing system (TPS), which were developed for accurate and cost-effective tracking of various types of high-dynamic expendable vehicles, are described. A technique used by the TPS to accomplish very accurate high-dynamic tracking is presented. Automatic frequency control and fast Fourier transform processes are combined to track 100 g acceleration and 100 g/s jerk with 1-sigma velocity measurement error less than 1 ft/sec.
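    The abstract does not detail the TPS processing chain, but the FFT element it mentions can be illustrated with a generic FFT-based frequency estimator for a noisy tone; the sample rate, block length, and tone frequency below are hypothetical and this is not the actual TPS design.

```python
import numpy as np

# Minimal sketch of a generic FFT-based frequency estimator for a noisy tone,
# of the kind used to track a Doppler-shifted carrier. All parameters are
# hypothetical placeholders.

fs = 100_000.0                        # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)        # 10 ms block
true_freq = 12_345.6                  # hypothetical Doppler-shifted tone, Hz
signal = np.cos(2 * np.pi * true_freq * t) + 0.1 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
peak = np.argmax(spectrum)
print(f"estimated frequency: {freqs[peak]:.1f} Hz (bin width {freqs[1]:.1f} Hz)")
```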

  8. [The study of the costs of schizophrenia].

    PubMed

    Prot-Herczyńska, K

    1998-01-01

    The present paper considers the problems of measuring the costs of psychiatric services. The classification of cost studies includes cost-of-illness, cost-benefit, cost-effectiveness and cost-utility studies. Costs should be measured comprehensively, and a randomised clinical trial design is recommended. Information on costs should be integrated with information on patient outcomes. The paper discusses research that tries to estimate the costs and benefits of community care using naturalistic or random patient sample methods.

  9. Environmental Protection Agency (EPA) evaluation of the Super-Mag Fuel Extender under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Ashby, H.A.

    1982-01-01

    This document announces the conclusions of the EPA evaluation of the 'Super-Mag Fuel Extender' device under provisions of Section 511 of the Motor Vehicle Information and Cost Savings Act. On December 10, 1980, the EPA received a written request from the Metropolitan Denver District Attorney's Office of Consumer Fraud and Economic Crime to test at least one 'cow magnet' type of fuel economy device. Following a survey of devices being marketed, the Metropolitan Denver District Attorney's Office selected the 'Super-Mag' device as typical of its category and on April 13, 1981 provided EPA with units for testing. The EPA evaluation of the device using three vehicles showed neither fuel economy nor exhaust emissions were affected by the installation of the 'Super-Mag' device. In addition, any differences between baseline test results and results from tests with the device installed were within the range of normal test variability.

  10. Determining true nursing costs improves financial planning.

    PubMed

    Payson, A A

    1987-05-01

    The traditional method of apportioning nursing care costs on a per diem basis does not consider nursing intensity or patients' special needs and often includes nonnursing duties. Many hospitals now favor a fee-for-service concept and are determining direct patient care costs to identify the true nursing cost. A patient classification system correlated with the diagnosis-related group (DRG) classification improves nursing cost analyses. For each patient, nurse managers need systems to determine quantified nursing tasks and patient acuity levels for each day. This information can be used to adjust staffing and to establish variable billing procedures. Then they can institute variable billing methods that are based on direct care costs as well as indirect costs of administration, education, and supplies. Variable billing identifies revenue cost centers, allows systematic monitoring of nursing services, and improves budget planning. The entire nursing staff must become involved in the financial system so the hospital can obtain an accurate data base for rate setting and third-party reimbursement.

  11. Practical aspects of spatially high accurate methods

    NASA Technical Reports Server (NTRS)

    Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.

    1992-01-01

    The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.

  12. New law requires 'medically accurate' lesson plans.

    PubMed

    1999-09-17

    The California Legislature has passed a bill requiring that all textbooks and materials used to teach about AIDS be medically accurate and objective. Statements made within the curriculum must be supported by research conducted in compliance with scientific methods, and published in peer-reviewed journals. Some of the current lesson plans were found to contain scientifically unsupported and biased information. In addition, the bill requires material to be "free of racial, ethnic, or gender biases." The legislation is supported by a wide range of interests, but opposed by the California Right to Life Education Fund, because they believe it discredits abstinence-only material.

  13. Cost Recovery Through Depreciation.

    ERIC Educational Resources Information Center

    Forrester, Robert T.; Wesolowski, Leonard V.

    1983-01-01

    The approach of adopting depreciation rather than use allowance in order to recover more accurately the cost of college buildings and equipment used on federal projects is considered. It is suggested that depreciation will offer most colleges and universities a higher annual recovery rate, and an opportunity for better facilities planning. For…

  14. Price-transparency and cost accounting: challenges for health care organizations in the consumer-driven era.

    PubMed

    Hilsenrath, Peter; Eakin, Cynthia; Fischer, Katrina

    2015-01-01

    Health care reform is directed toward improving access and quality while containing costs. An essential part of this is improvement of pricing models to more accurately reflect the costs of providing care. Transparent prices that reflect costs are necessary to signal information to consumers and producers. This information is central in a consumer-driven marketplace. The rapid increase in high deductible insurance and other forms of cost sharing incentivizes the search for price information. The organizational ability to measure costs across a cycle of care is an integral component of creating value, and will play a greater role as reimbursements transition to episode-based care, value-based purchasing, and accountable care organization models. This article discusses use of activity-based costing (ABC) to better measure the cost of health care. It describes examples of ABC in health care organizations and discusses impediments to adoption in the United States including cultural and institutional barriers.

  15. Non-targeted analysis of electronics waste by comprehensive two-dimensional gas chromatography combined with high-resolution mass spectrometry: Using accurate mass information and mass defect analysis to explore the data.

    PubMed

    Ubukata, Masaaki; Jobst, Karl J; Reiner, Eric J; Reichenbach, Stephen E; Tao, Qingping; Hang, Jiliang; Wu, Zhanpin; Dane, A John; Cody, Robert B

    2015-05-22

    Comprehensive two-dimensional gas chromatography (GC×GC) and high-resolution mass spectrometry (HRMS) offer the best possible separation of their respective techniques. Recent commercialization of combined GC×GC-HRMS systems offers new possibilities for the analysis of complex mixtures. However, such experiments yield enormous data sets that require new informatics tools to facilitate the interpretation of the rich information content. This study reports on the analysis of dust obtained from an electronics recycling facility by using GC×GC in combination with a new high-resolution time-of-flight (TOF) mass spectrometer. New software tools for (non-traditional) Kendrick mass defect analysis were developed in this research and greatly aided in the identification of compounds containing chlorine and bromine, elements that feature in most persistent organic pollutants (POPs). In essence, the mass defect plot serves as a visual aid from which halogenated compounds are recognizable on the basis of their mass defect and isotope patterns. Mass chromatograms were generated based on specific ions identified in the plots as well as region of the plot predominantly occupied by halogenated contaminants. Tentative identification was aided by database searches, complementary electron-capture negative ionization experiments and elemental composition determinations from the exact mass data. These included known and emerging flame retardants, such as polybrominated diphenyl ethers (PBDEs), hexabromobenzene, tetrabromo bisphenol A and tris (1-chloro-2-propyl) phosphate (TCPP), as well as other legacy contaminants such as polychlorinated biphenyls (PCBs) and polychlorinated terphenyls (PCTs).
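    The mass defect plots described above build on the Kendrick mass defect (KMD) idea; the classic CH2-based form is sketched below (the paper applies non-traditional variants with other repeating units). The m/z values are illustrative members of a hypothetical CH2 homologous series, which is why they share a KMD; sign conventions for the defect vary between groups.

```python
# Minimal sketch of the classic CH2-based Kendrick mass defect calculation.
# The m/z values are illustrative, not data from the study.

CH2_NOMINAL, CH2_EXACT = 14.0, 14.01565

def kendrick_mass_defect(mz, nominal=CH2_NOMINAL, exact=CH2_EXACT):
    kendrick_mass = mz * (nominal / exact)
    return round(kendrick_mass) - kendrick_mass

# members of a hypothetical homologous series differing by one CH2 each
for mz in (403.9257, 417.9414, 431.9570):
    print(f"m/z {mz:9.4f} -> KMD {kendrick_mass_defect(mz):+.4f}")
```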

  16. Accurate SHAPE-directed RNA secondary structure modeling, including pseudoknots

    PubMed Central

    Hajdin, Christine E.; Bellaousov, Stanislav; Huggins, Wayne; Leonard, Christopher W.; Mathews, David H.; Weeks, Kevin M.

    2013-01-01

    A pseudoknot forms in an RNA when nucleotides in a loop pair with a region outside the helices that close the loop. Pseudoknots occur relatively rarely in RNA but are highly overrepresented in functionally critical motifs in large catalytic RNAs, in riboswitches, and in regulatory elements of viruses. Pseudoknots are usually excluded from RNA structure prediction algorithms. When included, these pairings are difficult to model accurately, especially in large RNAs, because allowing this structure dramatically increases the number of possible incorrect folds and because it is difficult to search the fold space for an optimal structure. We have developed a concise secondary structure modeling approach that combines SHAPE (selective 2′-hydroxyl acylation analyzed by primer extension) experimental chemical probing information and a simple, but robust, energy model for the entropic cost of single pseudoknot formation. Structures are predicted with iterative refinement, using a dynamic programming algorithm. This melded experimental and thermodynamic energy function predicted the secondary structures and the pseudoknots for a set of 21 challenging RNAs of known structure ranging in size from 34 to 530 nt. On average, 93% of known base pairs were predicted, and all pseudoknots in well-folded RNAs were identified. PMID:23503844
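    A common way to fold SHAPE reactivities into such an energy model, and the flavour of term generally used in SHAPE-directed prediction, is a per-nucleotide pseudo-free-energy change of the form dG_SHAPE(i) = m*ln(reactivity_i + 1) + b applied to paired nucleotides. The sketch below uses frequently cited default slope/intercept values as assumptions; it is not the paper's full pseudoknot-aware algorithm.

```python
import math

# Minimal sketch of a SHAPE pseudo-free-energy term. The slope and intercept
# are commonly cited defaults, treated here as assumptions.

M_SLOPE = 2.6       # kcal/mol
B_INTERCEPT = -0.8  # kcal/mol

def shape_pseudo_energy(reactivity, m=M_SLOPE, b=B_INTERCEPT):
    """Pseudo-energy contribution for one paired nucleotide."""
    if reactivity is None or reactivity < 0:   # missing data contributes nothing
        return 0.0
    return m * math.log(reactivity + 1.0) + b

# low reactivity (likely paired) -> negative contribution favouring the pair;
# high reactivity (likely unpaired) -> positive penalty against pairing it
for r in (0.05, 0.4, 1.5):
    print(f"reactivity {r:>4}: dG_SHAPE = {shape_pseudo_energy(r):+.2f} kcal/mol")
```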

  17. Estimating the cost of health care-associated infections: mind your p's and q's.

    PubMed

    Graves, Nicholas; Harbarth, Stephan; Beyersmann, Jan; Barnett, Adrian; Halton, Kate; Cooper, Ben

    2010-04-01

    Monetary valuations of the economic cost of health care-associated infections (HAIs) are important for decision making and should be estimated accurately. Erroneously high estimates of costs, designed to jolt decision makers into action, may do more harm than good in the struggle to attract funding for infection control. Expectations among policy makers might be raised, and then they are disappointed when the reduction in the number of HAIs does not yield the anticipated cost saving. For this article, we critically review the field and discuss 3 questions. Why measure the cost of an HAI? What outcome should be used to measure the cost of an HAI? What is the best method for making this measurement? The aim is to encourage researchers to collect and then disseminate information that accurately guides decisions about the economic value of expanding or changing current infection control activities.

  18. On numerically accurate finite element

    NASA Technical Reports Server (NTRS)

    Nagtegaal, J. C.; Parks, D. M.; Rice, J. R.

    1974-01-01

    A general criterion for testing a mesh with topologically similar repeat units is given, and the analysis shows that only a few conventional element types and arrangements are, or can be made, suitable for computations in the fully plastic range. Further, a new variational principle, which can easily and simply be incorporated into an existing finite element program, is presented. This allows accurate computations to be made even for element designs that would not normally be suitable. Numerical results are given for three plane strain problems, namely pure bending of a beam, a thick-walled tube under pressure, and a deep double edge cracked tensile specimen. The effects of various element designs and of the new variational procedure are illustrated. Elastic-plastic computations at finite strain are discussed.

  19. 77 FR 6137 - Notice of Proposed Information Collection for Public Comment: Data Collection for Full Housing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-07

    ... collect time measurement and cost data at up to 60 PHAs across the country and to survey an additional 130... the study is to collect accurate information on the costs of administering the HCV program across a... the current OMB request, the research team proposes three main types of data collection: (1)...

  20. Wind Integration Cost and Cost-Causation: Preprint

    SciTech Connect

    Milligan, M.; Kirby, B.; Holttinen, H.; Kiviluoma, J.; Estanqueiro, A.; Martin-Martinez, S.; Gomez-Lazaro, E.; Peneda, I.; Smith, C.

    2013-10-01

    The question of wind integration cost has received much attention in the past several years. The methodological challenges to calculating integration costs are discussed in this paper. There are other sources of integration cost unrelated to wind energy. A performance-based approach would be technology neutral, and would provide price signals for all technology types. However, it is difficult to correctly formulate such an approach. Determining what is and is not an integration cost is challenging. Another problem is the allocation of system costs to one source. Because of significant nonlinearities, this can prove to be impossible to determine in an accurate and objective way.

  1. Factors Impacting Decommissioning Costs - 13576

    SciTech Connect

    Kim, Karen; McGrath, Richard

    2013-07-01

    The Electric Power Research Institute (EPRI) studied United States experience with decommissioning cost estimates and the factors that impact the actual cost of decommissioning projects. This study gathered available estimated and actual decommissioning costs from eight nuclear power plants in the United States to understand the major components of decommissioning costs. Major cost categories for decommissioning a nuclear power plant are removal costs, radioactive waste costs, staffing costs, and other costs. The technical factors that impact the costs were analyzed based on the plants' decommissioning experiences. Detailed cost breakdowns by major projects and other cost categories from actual power plant decommissioning experiences will be presented. Such information will be useful in planning future decommissioning and designing new plants. (authors)

  2. Leasing strategies reduce the cost of financing healthcare equipment.

    PubMed

    Bayless, M E; Diltz, J D

    1985-10-01

    Prospective payment has increased the importance of controlling capital costs. One area where this may be possible is lease financing. Reasons commonly cited in favor of leasing may be of questionable validity, but, under an easily identified set of circumstances, lease financing can be cost effective. Recent developments in finance make it possible to not only evaluate the financial attractiveness of a given lease, but also to accurately predict bounds within which the terms of the lease must fall. Hospital administrators armed with this information should be able to negotiate more favorable lease terms under given tax and economic environments.
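    The evaluation referred to above usually comes down to an after-tax net-present-value comparison of leasing versus purchasing. A minimal sketch with hypothetical purchase price, lease payments, tax rate, and discount rate follows; it is not the article's method for bounding acceptable lease terms.

```python
# Minimal sketch of an after-tax NPV comparison of leasing vs. purchasing
# equipment. All figures are hypothetical placeholders, and tax treatment is
# simplified (e.g., salvage proceeds are taken as untaxed).

def npv(cashflows, rate):
    """cashflows[t] occurs at end of year t (t = 0 is today)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

price = 500_000.0
salvage = 50_000.0
lease_payment = 115_000.0      # paid at the end of each of 5 years
tax_rate = 0.30
after_tax_rate = 0.08 * (1 - tax_rate)
years = 5
depreciation = (price - salvage) / years

# Buying: pay the price now, recover depreciation tax shields, receive salvage.
buy_cf = [-price] + [tax_rate * depreciation] * years
buy_cf[-1] += salvage
# Leasing: after-tax lease payments only.
lease_cf = [0.0] + [-lease_payment * (1 - tax_rate)] * years

print(f"NPV of buying:  {npv(buy_cf, after_tax_rate):,.0f}")
print(f"NPV of leasing: {npv(lease_cf, after_tax_rate):,.0f}")
```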

  3. Accurate ab Initio Spin Densities.

    PubMed

    Boguslawski, Katharina; Marti, Konrad H; Legeza, Ors; Reiher, Markus

    2012-06-12

    We present an approach for the calculation of spin density distributions for molecules that require very large active spaces for a qualitatively correct description of their electronic structure. Our approach is based on the density-matrix renormalization group (DMRG) algorithm to calculate the spin density matrix elements as a basic quantity for the spatially resolved spin density distribution. The spin density matrix elements are directly determined from the second-quantized elementary operators optimized by the DMRG algorithm. As an analytic convergence criterion for the spin density distribution, we employ our recently developed sampling-reconstruction scheme [J. Chem. Phys.2011, 134, 224101] to build an accurate complete-active-space configuration-interaction (CASCI) wave function from the optimized matrix product states. The spin density matrix elements can then also be determined as an expectation value employing the reconstructed wave function expansion. Furthermore, the explicit reconstruction of a CASCI-type wave function provides insight into chemically interesting features of the molecule under study such as the distribution of α and β electrons in terms of Slater determinants, CI coefficients, and natural orbitals. The methodology is applied to an iron nitrosyl complex which we have identified as a challenging system for standard approaches [J. Chem. Theory Comput.2011, 7, 2740].

  4. An Analysis of Rocket Propulsion Testing Costs

    NASA Technical Reports Server (NTRS)

    Ramirez-Pagan, Carmen P.; Rahman, Shamim A.

    2009-01-01

    The primary mission at NASA Stennis Space Center (SSC) is rocket propulsion testing. Such testing is generally performed within two arenas: (1) Production testing for certification and acceptance, and (2) Developmental testing for prototype or experimental purposes. The customer base consists of NASA programs, DOD programs, and commercial programs. Resources in place to perform on-site testing include both civil servants and contractor personnel, hardware and software including data acquisition and control, and 6 test stands with a total of 14 test positions/cells. For several business reasons there is a need to augment understanding of the test costs for all the various types of test campaigns. Historical propulsion test data were evaluated and analyzed in many different ways with the intent to find any correlation or statistics that could help produce more reliable and accurate cost estimates and projections. The analytical efforts included timeline trends, statistical curve fitting, average cost per test, cost per test second, test cost timeline, and test cost envelopes. Further, the analytical effort included examining the test cost from the perspective of thrust level and test article characteristics. Some of the analytical approaches did not produce evidence strong enough for further analysis. Other analytical approaches yielded promising results and are candidates for further development and focused study. Information was organized into its elements: a Project Profile, Test Cost Timeline, and Cost Envelope. The Project Profile is a snapshot of the project life cycle in timeline fashion, which includes various statistical analyses. The Test Cost Timeline shows the cumulative average test cost, for each project, at each month where there was test activity. The Test Cost Envelope shows a range of cost for a given number of test(s). The supporting information upon which this study was performed came from diverse sources and thus it was necessary to

  5. Rapidly falling costs of battery packs for electric vehicles

    NASA Astrophysics Data System (ADS)

    Nykvist, Björn; Nilsson, Måns

    2015-04-01

    To properly evaluate the prospects for commercially competitive battery electric vehicles (BEV) one must have accurate information on current and predicted cost of battery packs. The literature reveals that costs are coming down, but with large uncertainties on past, current and future costs of the dominating Li-ion technology. This paper presents an original systematic review, analysing over 80 different estimates reported between 2007 and 2014 to systematically trace the costs of Li-ion battery packs for BEV manufacturers. We show that industry-wide cost estimates declined by approximately 14% annually between 2007 and 2014, from above US$1,000 per kWh to around US$410 per kWh, and that the cost of battery packs used by market-leading BEV manufacturers is even lower, at US$300 per kWh, having declined by 8% annually. Learning rate, the cost reduction following a cumulative doubling of production, is found to be between 6 and 9%, in line with earlier studies on vehicle battery technology. We reveal that the costs of Li-ion battery packs continue to decline and that the costs among market leaders are much lower than previously reported. This has significant implications for the assumptions used when modelling future energy and transport systems and permits an optimistic outlook for BEVs contributing to low-carbon transport.
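
    A back-of-envelope sketch of the two quantities discussed follows: the compound annual rate of decline implied by two cost points, and the cost implied by a given learning rate after a number of cumulative-production doublings. The round numbers are illustrative, not the paper's underlying data set.

```python
# Illustrative checks on the reported trends (round numbers, not the paper's data).

def annual_decline_rate(cost_start, cost_end, years):
    """Compound annual rate of decline implied by two cost points."""
    return 1.0 - (cost_end / cost_start) ** (1.0 / years)

def cost_after_doublings(cost0, learning_rate, doublings):
    """Cost after n cumulative-production doublings at a given learning rate."""
    return cost0 * (1.0 - learning_rate) ** doublings

# Industry-wide endpoints of roughly US$1,000/kWh (2007) and US$410/kWh (2014);
# these rounded figures imply ~12%/yr, in the neighbourhood of the reported ~14%.
print(f"implied annual decline: {annual_decline_rate(1000, 410, 7):.1%}")

# Cost implied by a 6-9% learning rate after, say, four further doublings of cumulative production
for lr in (0.06, 0.09):
    print(f"learning rate {lr:.0%}: US${cost_after_doublings(410, lr, 4):.0f}/kWh after 4 doublings")
```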

  6. Tracking Costs

    ERIC Educational Resources Information Center

    Erickson, Paul W.

    2010-01-01

    Even though there's been a slight reprieve in energy costs, the reality is that the cost of non-renewable energy is increasing, and state education budgets are shrinking. One way to keep energy and operations costs from overshadowing education budgets is to develop a 10-year energy audit plan to eliminate waste. First, facility managers should…

  7. Low-Cost Spectral Sensor Development Description.

    SciTech Connect

    Armijo, Kenneth Miguel; Yellowhair, Julius

    2014-11-01

    Solar spectral data for all parts of the US is limited due in part to the high cost of commercial spectrometers. Solar spectral information is necessary for accurate photovoltaic (PV) performance forecasting, especially for large utility-scale PV installations. A low-cost solar spectral sensor would address the obstacles and needs. In this report, a novel low-cost, discrete-band sensor device, comprised of five narrow-band sensors, is described. The hardware is comprised of commercial-off-the-shelf components to keep the cost low. Data processing algorithms were developed and are being refined for robustness. PV module short-circuit current (Isc) prediction methods were developed based on interaction-terms regression methodology and spectrum reconstruction methodology for computing Isc. The results suggest the computed spectrum using the reconstruction method agreed well with the measured spectrum from the wide-band spectrometer (RMS error of 38.2 W/m²·nm). Further analysis of the computed Isc found a close correspondence (0.05 A RMS error). The goal is for ubiquitous adoption of the low-cost spectral sensor in solar PV and other applications such as weather forecasting.
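
    A minimal sketch of an interaction-terms regression for Isc from five narrow-band readings is given below; the synthetic data, feature construction, and least-squares fit are assumptions for illustration and do not reproduce the report's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: rows are samples, columns are the five band irradiances (W/m^2)
bands = rng.uniform(50, 300, size=(200, 5))
true_coeffs = np.array([0.004, 0.003, 0.005, 0.002, 0.001])
isc = bands @ true_coeffs + 1e-5 * bands[:, 0] * bands[:, 2] + rng.normal(0, 0.02, 200)

def design_matrix(x):
    """Intercept, main effects, and all pairwise interaction terms."""
    n, k = x.shape
    inter = [x[:, i] * x[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack([np.ones(n), x, *inter])

X = design_matrix(bands)
beta, *_ = np.linalg.lstsq(X, isc, rcond=None)

pred = X @ beta
rms = np.sqrt(np.mean((pred - isc) ** 2))
print(f"in-sample RMS error of predicted Isc: {rms:.3f} A")
```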

  8. 78 FR 34604 - Submitting Complete and Accurate Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-10

    .... Nuclear Regulatory Commission (NRC) is publishing for comment a petition for rulemaking (PRM) filed with... into ADAMS. The Petition The NRC has received a PRM (ADAMS Accession No. ML13113A443) requesting the... been docketed as PRM-50-107. The full text of the incoming petition is available at...

  9. Accurate thermoplasmonic simulation of metallic nanoparticles

    NASA Astrophysics Data System (ADS)

    Yu, Da-Miao; Liu, Yan-Nan; Tian, Fa-Lin; Pan, Xiao-Min; Sheng, Xin-Qing

    2017-01-01

    Thermoplasmonics leads to enhanced heat generation due to the localized surface plasmon resonances. The measurement of heat generation is fundamentally a complicated task, which necessitates the development of theoretical simulation techniques. In this paper, an efficient and accurate numerical scheme is proposed for applications with complex metallic nanostructures. Light absorption and temperature increase are, respectively, obtained by solving the volume integral equation (VIE) and the steady-state heat diffusion equation through the method of moments (MoM). Previously, methods based on surface integral equations (SIEs) were utilized to obtain light absorption. However, computing light absorption from the equivalent current is as expensive as O(Ns·Nv), where Ns and Nv, respectively, denote the number of surface and volumetric unknowns. Our approach reduces the cost to O(Nv) by using the VIE. The accuracy, efficiency and capability of the proposed scheme are validated by multiple simulations. The simulations show that our proposed method is more efficient than the approach based on SIEs under comparable accuracy, especially when many incident excitations are of interest. The simulations also indicate that the temperature profile can be tuned by several factors, such as the geometry configuration of the array, the beam direction, and the light wavelength.

  10. Cost accounting of radiological examinations. Cost analysis of radiological examinations of intermediate referral hospitals and general practice.

    PubMed

    Lääperi, A L

    1996-01-01

    nonmonetary variables was developed. In it the radiologist, radiographer and examination-specific equipment costs were allocated to the examinations applying estimated cost equivalents. Some minor cost items were replaced by a general cost factor (GCF). The program is suitable for internal cost accounting of radiological departments as well as regional planning. If more accurate cost information is required, cost assignment employing the actual consumption of the resources and applying the principles of activity-based cost accounting is recommended. As an application of the cost accounting formula the average costs of the radiological examinations were calculated. In conventional radiography the average proportion of the cost factors in the total material was: personnel costs 43%, equipment costs 26%, material costs 7%, real estate costs 11%, administration and overheads 14%. The average total costs including radiologist costs in the hospitals were (FIM): conventional roentgen examinations 188, contrast medium examinations 695, ultrasound 296, mammography 315, roentgen examinations with mobile equipment 1578. The average total costs without radiologist costs in the public health centres were (FIM): conventional roentgen examinations 107, contrast medium examinations 988, ultrasound 203, mammography 557. The average currency rate of exchange in 1991 was USD 1 = FIM 4.046. The following formula is proposed for calculating the cost of a radiological examination (or a group of examinations) performed with a certain piece of equipment during a period of time (e.g. 1 year): a2/Σax·ax + b2/Σbx·bx + d1/d5·dx + e1 + [(c1 + c2) + d4 + (e2 − e3) + f5 + g1 + g2 + i]/n.
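
    The sketch below illustrates only the general allocation idea, department-level cost factors distributed to one examination type via an activity share; the allocation key and all figures are hypothetical, and the paper's published formula is more detailed than this.

```python
# Deliberately simplified allocation of department-level cost factors to one
# examination type. Cost categories mirror those listed above; the allocation key
# (share of resource minutes) and all numbers are hypothetical.

annual_costs_fim = {            # department-level annual costs (FIM)
    "personnel": 4_300_000,
    "equipment": 2_600_000,
    "materials": 700_000,
    "real_estate": 1_100_000,
    "admin_overheads": 1_400_000,
}

share_of_resource_minutes = 0.012   # fraction of the department's resource time used by this exam type
exams_of_this_type = 450            # number of such examinations performed per year

allocated = sum(annual_costs_fim.values()) * share_of_resource_minutes
print(f"allocated cost per examination: FIM {allocated / exams_of_this_type:,.0f}")
```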

  11. Accurate, meshless methods for magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Hopkins, Philip F.; Raives, Matthias J.

    2016-01-01

    Recently, we explored new meshless finite-volume Lagrangian methods for hydrodynamics: the `meshless finite mass' (MFM) and `meshless finite volume' (MFV) methods; these capture advantages of both smoothed particle hydrodynamics (SPH) and adaptive mesh refinement (AMR) schemes. We extend these to include ideal magnetohydrodynamics (MHD). The MHD equations are second-order consistent and conservative. We augment these with a divergence-cleaning scheme, which maintains ∇·B ≈ 0. We implement these in the code GIZMO, together with state-of-the-art SPH MHD. We consider a large test suite, and show that on all problems the new methods are competitive with AMR using constrained transport (CT) to ensure ∇·B = 0. They correctly capture the growth/structure of the magnetorotational instability, MHD turbulence, and launching of magnetic jets, in some cases converging more rapidly than state-of-the-art AMR. Compared to SPH, the MFM/MFV methods exhibit convergence at fixed neighbour number, sharp shock-capturing, and dramatically reduced noise, divergence errors, and diffusion. Still, `modern' SPH can handle most test problems, at the cost of larger kernels and `by hand' adjustment of artificial diffusion. Compared to non-moving meshes, the new methods exhibit enhanced `grid noise' but reduced advection errors and diffusion, easily include self-gravity, and feature velocity-independent errors and superior angular momentum conservation. They converge more slowly on some problems (smooth, slow-moving flows), but more rapidly on others (involving advection/rotation). In all cases, we show divergence control beyond the Powell 8-wave approach is necessary, or all methods can converge to unphysical answers even at high resolution.

  12. Multiple-frequency continuous wave ultrasonic system for accurate distance measurement

    NASA Astrophysics Data System (ADS)

    Huang, C. F.; Young, M. S.; Li, Y. C.

    1999-02-01

    A highly accurate multiple-frequency continuous wave ultrasonic range-measuring system for use in air is described. The proposed system uses a method heretofore applied to radio frequency distance measurement but not to air-based ultrasonic systems. The method presented here is based upon the comparative phase shifts generated by three continuous ultrasonic waves of different but closely spaced frequencies. In the test embodiment to confirm concept feasibility, two low cost 40 kHz ultrasonic transducers are set face to face and used to transmit and receive ultrasound. Individual frequencies are transmitted serially, each generating its own phase shift. For any given frequency, the transmitter/receiver distance modulates the phase shift between the transmitted and received signals. Comparison of the phase shifts allows a highly accurate evaluation of target distance. A single-chip microcomputer-based multiple-frequency continuous wave generator and phase detector was designed to record and compute the phase shift information and the resulting distance, which is then sent to either a LCD or a PC. The PC is necessary only for calibration of the system, which can be run independently after calibration. Experiments were conducted to test the performance of the whole system. Experimentally, ranging accuracy was found to be within ±0.05 mm, with a range of over 1.5 m. The main advantages of this ultrasonic range measurement system are high resolution, low cost, narrow bandwidth requirements, and ease of implementation.
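
    A minimal sketch of the underlying idea follows, assuming two closely spaced frequencies rather than the paper's three: the phase difference between the frequencies yields a coarse range over the synthetic (beat) wavelength, which then resolves the cycle ambiguity of a single frequency's fine phase. The frequencies and the noise-free measurement model are assumptions for illustration only.

```python
import math

C_AIR = 343.0  # nominal speed of sound in air (m/s); temperature-dependent in practice

def phase_at(distance, freq):
    """Wrapped phase shift (radians) accumulated over a one-way path of `distance` metres."""
    return (2 * math.pi * distance * freq / C_AIR) % (2 * math.pi)

def estimate_distance(phi1, phi2, f1, f2):
    # Coarse, unambiguous range from the synthetic (beat) wavelength of the two frequencies
    synth_wavelength = C_AIR / abs(f1 - f2)
    coarse = ((phi1 - phi2) % (2 * math.pi)) / (2 * math.pi) * synth_wavelength
    # Fine range: pick the whole-cycle count at f1 that best matches the coarse estimate,
    # then add the fractional cycle measured as phi1
    lam1 = C_AIR / f1
    frac = phi1 / (2 * math.pi) * lam1
    n_cycles = round((coarse - frac) / lam1)
    return n_cycles * lam1 + frac

f1, f2 = 40_000.0, 39_800.0   # Hz: two closely spaced frequencies near 40 kHz
true_d = 1.2345               # metres (within the synthetic wavelength of ~1.7 m)
d_hat = estimate_distance(phase_at(true_d, f1), phase_at(true_d, f2), f1, f2)
print(f"estimated distance: {d_hat * 1000:.2f} mm (true {true_d * 1000:.2f} mm)")
```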

  13. The fiscal impact of informal caregiving to home care recipients in Canada: how the intensity of care influences costs and benefits to government.

    PubMed

    Jacobs, Josephine C; Lilly, Meredith B; Ng, Carita; Coyte, Peter C

    2013-03-01

    The objective of this study was to estimate the annual costs and consequences of unpaid caregiving by Canadians from a government perspective. We estimated these costs both at the individual and population levels for caregivers aged 45 and older. We conducted a cost-benefit analysis where we considered the costs of unpaid caregiving to be potential losses in income tax revenues and changes in social assistance payments and the potential benefit of reduced paid care expenditures. Our costing methods were based on multivariate analyses using the 2007 General Social Survey, a cross-sectional survey of 23,404 individuals. We determined the differential probability of employment, wages, and hours worked by caregivers of varying intensity versus non-caregivers. We also used multivariate analysis to determine how receiving different intensities of unpaid care impacted both the probability of receiving paid care and the weekly hours of paid care received. At the lowest intensities of caregiving, there was a net benefit to government from caregiving, at both the individual and population levels. At the population level, the net benefit to government was estimated to be $4.4 billion for caregivers providing less than five hours of weekly care. At the highest intensity of caregiving, there was a net cost to government of $641 million. Our overall findings were robust to a number of changes applied in our sensitivity analysis. We found that the factor with the greatest impact on cost was the probability of labour force participation. As the biggest cost driver appears to be the higher likelihood of intense caregivers dropping out of the labour force, government policies that enable intense caregivers to balance caregiving with employment may help to mitigate these losses.
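
    A stylized sketch of the accounting identity used in the study, at the level of a single caregiver: the net fiscal impact equals avoided paid-care spending minus forgone income-tax revenue minus any additional social-assistance payments. All figures below are hypothetical, not the study's estimates.

```python
# Stylized version of the study's accounting identity for one caregiver per year;
# every figure below is invented.

def net_fiscal_impact(avoided_paid_care, forgone_income_tax, extra_social_assistance):
    """Net benefit (+) or cost (-) to government of one caregiver, per year."""
    return avoided_paid_care - forgone_income_tax - extra_social_assistance

# Low-intensity caregiver (<5 h/week): little labour-force effect, some avoided paid care
print(net_fiscal_impact(avoided_paid_care=1_800, forgone_income_tax=300, extra_social_assistance=0))

# High-intensity caregiver: much higher probability of leaving the labour force
print(net_fiscal_impact(avoided_paid_care=4_500, forgone_income_tax=6_000, extra_social_assistance=1_200))
```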

  14. Accurate On-Line Intervention Practices for Efficient Improvement of Reading Skills in Africa

    ERIC Educational Resources Information Center

    Marshall, Minda B.

    2016-01-01

    Lifelong learning is the only way to sustain proficient learning in a rapidly changing world. Knowledge and information are exploding across the globe. We need accurate ways to facilitate the process of drawing external factual information into an internal perceptive advantage from which to interpret and argue new information. Accurate and…

  15. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2013-07-01 2013-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  16. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2012-07-01 2012-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  17. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  18. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2014-07-01 2014-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  19. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  20. Cost analysis and the practicing radiologist/manager: an introduction to managerial accounting.

    PubMed

    Forman, H P; Yin, D

    1996-06-01

    Cost analysis is inherently one of the most tedious tasks falling on the shoulders of any manager. In today's world, whether in a service business such as radiology or medicine or in a product line such as car manufacturing, accurate cost analysis is critical to all aspects of management: marketing, competitive strategy, quality control, human resource management, accounting (financial), and operations management, to name but a few. This is a topic that we will explore with the intention of giving the radiologist/manager the understanding and the basic skills to use cost analysis efficiently and to ensure that major financial decisions are made with adequate cost information. We also show that cost accounting is really managerial accounting: it pays little attention to the bottom line of financial statements, placing much more emphasis on equipping managers with the information to determine budgets, prices, salaries, and incentives, and it influences capital budgeting decisions through an understanding of product profitability rather than firm profitability.

  1. Spacecraft platform cost estimating relationships

    NASA Technical Reports Server (NTRS)

    Gruhl, W. M.

    1972-01-01

    The three main cost areas of unmanned satellite development are discussed. The areas are identified as: (1) the spacecraft platform (SCP), (2) the payload or experiments, and (3) the postlaunch ground equipment and operations. The SCP normally accounts for over half of the total project cost and accurate estimates of SCP costs are required early in project planning as a basis for determining total project budget requirements. The development of single formula SCP cost estimating relationships (CER) from readily available data by statistical linear regression analysis is described. The advantages of single formula CER are presented.
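
    The sketch below shows one common way such a single-formula CER can be derived, a power-law relationship fitted by linear regression in log space; the driver variable (dry mass) and data points are hypothetical, not taken from the report.

```python
import numpy as np

# Hypothetical historical spacecraft platforms: dry mass (kg) and SCP cost ($M)
mass = np.array([250.0, 410.0, 620.0, 900.0, 1400.0, 2100.0])
cost = np.array([18.0, 26.0, 35.0, 47.0, 66.0, 90.0])

# Fit cost = a * mass^b by ordinary linear regression on the log-transformed data
b, log_a = np.polyfit(np.log(mass), np.log(cost), 1)
a = np.exp(log_a)
print(f"CER: cost [$M] = {a:.3f} * mass^{b:.3f}")

# Early-phase estimate for a proposed 1,000 kg platform
print(f"predicted SCP cost: ${a * 1000.0 ** b:.1f}M")
```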

  2. Older Adults Seeking Healthcare Information on the Internet

    ERIC Educational Resources Information Center

    Hardt, Jeffrey H.; Hollis-Sawyer, Lisa

    2007-01-01

    Due to an aging population and increases in healthcare costs, particular attention needs to be focused on developing Internet sites that provide older adults with credible and accurate healthcare information. Present research findings suggest that motivation is only one factor that influences whether or not older adults utilize the World Wide Web…

  3. Information logistics: A production-line approach to information services

    NASA Technical Reports Server (NTRS)

    Adams, Dennis; Lee, Chee-Seng

    1991-01-01

    Logistics can be defined as the process of strategically managing the acquisition, movement, and storage of materials, parts, and finished inventory (and the related information flow) through the organization and its marketing channels in a cost effective manner. It is concerned with delivering the right product to the right customer in the right place at the right time. The logistics function is composed of inventory management, facilities management, communications, unitization, transportation, materials management, and production scheduling. The relationship between logistics and information systems is clear. Systems such as Electronic Data Interchange (EDI), Point of Sale (POS) systems, and Just in Time (JIT) inventory management systems are important elements in the management of product development and delivery. With improved access to market demand figures, logisticians can decrease inventory sizes and better service customer demand. However, without accurate, timely information, little, if any, of this would be feasible in today's global markets. Information systems specialists can learn from logisticians. In a manner similar to logistics management, information logistics is concerned with the delivery of the right data, to the right customer, at the right time. As such, information systems are integral components of the information logistics system charged with providing customers with accurate, timely, cost-effective, and useful information. Information logistics is a management style and is composed of elements similar to those associated with the traditional logistics activity: inventory management (data resource management), facilities management (distributed, centralized and decentralized information systems), communications (participative design and joint application development methodologies), unitization (input/output system design, i.e., packaging or formatting of the information), transportation (voice, data, image, and video communication systems

  4. Important Nearby Galaxies without Accurate Distances

    NASA Astrophysics Data System (ADS)

    McQuinn, Kristen

    2014-10-01

    The Spitzer Infrared Nearby Galaxies Survey (SINGS) and its offspring programs (e.g., THINGS, HERACLES, KINGFISH) have resulted in a fundamental change in our view of star formation and the ISM in galaxies, and together they represent the most complete multi-wavelength data set yet assembled for a large sample of nearby galaxies. These great investments of observing time have been dedicated to the goal of understanding the interstellar medium, the star formation process, and, more generally, galactic evolution at the present epoch. Nearby galaxies provide the basis for which we interpret the distant universe, and the SINGS sample represents the best studied nearby galaxies. Accurate distances are fundamental to interpreting observations of galaxies. Surprisingly, many of the SINGS spiral galaxies have numerous distance estimates resulting in confusion. We can rectify this situation for 8 of the SINGS spiral galaxies within 10 Mpc at a very low cost through measurements of the tip of the red giant branch. The proposed observations will provide an accuracy of better than 0.1 in distance modulus. Our sample includes such well known galaxies as M51 (the Whirlpool), M63 (the Sunflower), M104 (the Sombrero), and M74 (the archetypal grand design spiral). We are also proposing coordinated parallel WFC3 UV observations of the central regions of the galaxies, rich with high-mass UV-bright stars. As a secondary science goal we will compare the resolved UV stellar populations with integrated UV emission measurements used in calibrating star formation rates. Our observations will complement the growing HST UV atlas of high resolution images of nearby galaxies.

  5. How flatbed scanners upset accurate film dosimetry

    NASA Astrophysics Data System (ADS)

    van Battum, L. J.; Huizenga, H.; Verdaasdonk, R. M.; Heukelom, S.

    2016-01-01

    Film is an excellent dosimeter for verification of dose distributions due to its high spatial resolution. Irradiated film can be digitized with low-cost, transmission, flatbed scanners. However, a disadvantage is their lateral scan effect (LSE): a scanner readout change over its lateral scan axis. Although anisotropic light scattering was presented as the origin of the LSE, this paper presents an alternative cause. Hereto, the LSE for two flatbed scanners (Epson 1680 Expression Pro and Epson 10000XL) and Gafchromic film (EBT, EBT2, EBT3) was investigated, focusing on three effects: cross talk, optical path length and polarization. Cross talk was examined using triangular sheets of various optical densities. The optical path length effect was studied using absorptive and reflective neutral density filters with well-defined optical characteristics (OD range 0.2-2.0). Linear polarizer sheets were used to investigate light polarization on the CCD signal in the absence and presence of (un)irradiated Gafchromic film. Film dose values ranged from 0.2 to 9 Gy, i.e. an optical density range from 0.25 to 1.1. Measurements were performed in the scanner’s transmission mode, with red-green-blue channels. The LSE was found to depend on scanner construction and film type. Its magnitude depends on dose: for 9 Gy it increases up to 14% at the maximum lateral position. Cross talk was only significant in high contrast regions, up to 2% for very small fields. The optical path length effect introduced by film on the scanner causes a 3% readout change for pixels in the extreme lateral position. Light polarization due to film and the scanner’s optical mirror system is the main contributor, different in magnitude for the red, green and blue channel. We concluded that any Gafchromic EBT type film scanned with a flatbed scanner will face these optical effects. Accurate dosimetry requires correction of the LSE and, therefore, determination of the LSE per color channel and per dose delivered to the film.

  6. Variation in the costs of delivering routine immunization services in Peru.

    PubMed Central

    Walker, D.; Mosqueira, N. R.; Penny, M. E.; Lanata, C. F.; Clark, A. D.; Sanderson, C. F. B.; Fox-Rushby, J. A.

    2004-01-01

    OBJECTIVE: Estimates of vaccination costs usually provide only point estimates at national level with no information on cost variation. In practice, however, such information is necessary for programme managers. This paper presents information on the variations in costs of delivering routine immunization services in three diverse districts of Peru: Ayacucho (a mountainous area), San Martin (a jungle area) and Lima (a coastal area). METHODS: We consider the impact of variability on predictions of cost and reflect on the likely impact on expected cost-effectiveness ratios, policy decisions and future research practice. All costs are in 2002 prices in US dollars and include the costs of providing vaccination services incurred by 19 government health facilities during the January-December 2002 financial year. Vaccine wastage rates have been estimated using stock records. FINDINGS: The cost per fully vaccinated child ranged from US$16.63 to US$24.52 in Ayacucho, from US$21.79 to US$36.69 in San Martin, and from US$9.58 to US$20.31 in Lima. The volume of vaccines administered and wastage rates are determinants of the variation in costs of delivering routine immunization services. CONCLUSION: This study shows there is considerable variation in the costs of providing vaccines across geographical regions and different types of facilities. Information on how costs vary can be used as a basis from which to generalize to other settings and provide more accurate estimates for decision-makers who do not have disaggregated data on local costs. Future studies should include sufficiently large sample sizes and ensure that regions are carefully selected in order to maximize the interpretation of cost variation. PMID:15628205
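
    As a hedged illustration of how the cost per fully vaccinated child responds to vaccine wastage, the sketch below combines facility delivery costs with doses inflated by a wastage rate; all inputs are invented, and the study's costing is more detailed.

```python
# Stylized cost-per-fully-vaccinated-child (FVC) calculation with wastage inflating
# the doses purchased. All inputs are hypothetical (2002 US$).

def cost_per_fvc(delivery_cost, doses_needed, price_per_dose, wastage_rate, n_fvc):
    doses_purchased = doses_needed / (1.0 - wastage_rate)
    total_cost = delivery_cost + doses_purchased * price_per_dose
    return total_cost / n_fvc

# Example: a rural facility with high wastage vs. an urban facility with low wastage
print(f"rural: ${cost_per_fvc(4_000, 3_200, 0.60, 0.40, 280):.2f} per FVC")
print(f"urban: ${cost_per_fvc(9_500, 9_000, 0.60, 0.15, 1_050):.2f} per FVC")
```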

  7. The Effect of Activity-Based Costing on Logistics Management

    DTIC Science & Technology

    1993-01-01

    46. Current Cost Accounting System Can Trace Costs to Specific Vendors ... 47. Will Need to Accurately ... elsewhere in the firm? a. What problems have the logistics functions encountered with their current cost accounting systems? b. What benefits ... have already recognized problems within their current cost accounting systems. He claims the systems "...give managers incorrect product costing

  8. Troubleshooting Costs

    NASA Astrophysics Data System (ADS)

    Kornacki, Jeffrey L.

    Seventy-six million cases of foodborne disease occur each year in the United States alone. Medical and lost productivity costs of the most common pathogens are estimated to be $5.6-9.4 billion. Product recalls, whether from foodborne illness or spoilage, result in added costs to manufacturers in a variety of ways. These may include expenses associated with lawsuits from real or allegedly stricken individuals and lawsuits from shorted customers. Other costs include those associated with efforts involved in finding the source of the contamination and eliminating it and include time when lines are shut down and therefore non-productive, additional non-routine testing, consultant fees, time and personnel required to overhaul the entire food safety system, lost market share to competitors, and the cost associated with redesign of the factory and redesign or acquisition of more hygienic equipment. The cost associated with an effective quality assurance plan is well worth the effort to prevent the situations described.

  9. Defense Management: DOD Needs Better Information and Guidance to More Effectively Manage and Reduce Operating and Support Costs of Major Weapon Systems

    DTIC Science & Technology

    2010-07-01

    contractor logistics support arrangements since before 2005, the estimates included the costs for government-provided logistics support of the aircraft. For ... GAO, United States Government Accountability Office, Report to Congressional Committees. Government Accountability Office, 441 G Street NW, Washington, DC 20548.

  10. 50 CFR 85.41 - Allowable costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 Wildlife and Fisheries 6 2010-10-01 2010-10-01 false Allowable costs. 85.41 Section 85.41... Use/Acceptance of Funds § 85.41 Allowable costs. (a) Allowable grant costs are limited to those costs... applicable Federal cost principles in 43 CFR 12.60(b). Purchase of informational signs, program signs,...

  11. Costs in Perspective: Understanding Cost-Effectiveness Analysis.

    PubMed

    Detsky

    1996-01-01

    This paper covers five questions: (1) What is cost-effectiveness analysis? (2) How can cost-effectiveness analysis help policymakers allocate scarce resources? (3) What are misconceptions about the cost effectiveness of health care interventions? (4) What is an attractive cost-effectiveness ratio? (5) What is the relevance of cost effectiveness to clinicians? The cost side of the equation includes not simply the cost of the intervention but the cost of all of the downstream clinical events that occur with either therapeutic alternative. Cost-effectiveness analyses are used to help decisionmakers rank programs competing for scarce resources in order to achieve the following objective: to maximize the net health benefits derived from a fixed budget for a target population. A simple example is shown. Measured cost-effectiveness ratios for selected cardiovascular interventions are displayed. The systematic use of information on effectiveness and cost effectiveness should help those involved in setting policies to have a more rational basis for funding of new programs and discontinuation of funding for old programs. In Canadian health care it is important that we use this information to make room for innovations that are effective and efficient, and to remove funding from programs that are either known to be ineffective and costly or an inefficient use of resources. More energy should be put toward generating the information necessary to make these kinds of decisions.
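
    A toy numeric illustration of an incremental cost-effectiveness ratio follows, in which each alternative's cost includes downstream clinical events as well as the intervention itself; all numbers are invented.

```python
# Toy incremental cost-effectiveness ratio (ICER) with downstream-event costs included.

def total_cost(intervention_cost, downstream_event_costs):
    return intervention_cost + sum(downstream_event_costs)

new_tx = {"cost": total_cost(12_000, [3_000, 1_500]), "qalys": 6.3}
old_tx = {"cost": total_cost(4_000, [4_000, 1_500]), "qalys": 5.8}

icer = (new_tx["cost"] - old_tx["cost"]) / (new_tx["qalys"] - old_tx["qalys"])
print(f"ICER = ${icer:,.0f} per QALY gained")
```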

  12. Airframe RDT&E Cost Estimating: A Justification for and Development of Unique Cost Estimating Relationships According to Aircraft Type.

    DTIC Science & Technology

    1982-09-01

    Airframe RDT&E costs are invariably predicted by utilizing one general cost estimating relationship (CER) regardless of...and in funding decisions demanding reliable, internally consistent estimates of absolute cost (10:1). Cost estimating capability is only as accurate as...pertaining to the RDT&E phase appear to be limited in their ability to accurately predict weapon system development costs. This thesis focuses on a

  13. Accurate lineshape spectroscopy and the Boltzmann constant

    PubMed Central

    Truong, G.-W.; Anstie, J. D.; May, E. F.; Stace, T. M.; Luiten, A. N.

    2015-01-01

    Spectroscopy has an illustrious history delivering serendipitous discoveries and providing a stringent testbed for new physical predictions, including applications from trace materials detection, to understanding the atmospheres of stars and planets, and even constraining cosmological models. Reaching fundamental-noise limits permits optimal extraction of spectroscopic information from an absorption measurement. Here, we demonstrate a quantum-limited spectrometer that delivers high-precision measurements of the absorption lineshape. These measurements yield a very accurate measurement of the excited-state (6P1/2) hyperfine splitting in Cs, and reveals a breakdown in the well-known Voigt spectral profile. We develop a theoretical model that accounts for this breakdown, explaining the observations to within the shot-noise limit. Our model enables us to infer the thermal velocity dispersion of the Cs vapour with an uncertainty of 35 p.p.m. within an hour. This allows us to determine a value for Boltzmann's constant with a precision of 6 p.p.m., and an uncertainty of 71 p.p.m. PMID:26465085

  14. Accurate, reproducible measurement of blood pressure.

    PubMed Central

    Campbell, N R; Chockalingam, A; Fodor, J G; McKay, D W

    1990-01-01

    The diagnosis of mild hypertension and the treatment of hypertension require accurate measurement of blood pressure. Blood pressure readings are altered by various factors that influence the patient, the techniques used and the accuracy of the sphygmomanometer. The variability of readings can be reduced if informed patients prepare in advance by emptying their bladder and bowel, by avoiding over-the-counter vasoactive drugs the day of measurement and by avoiding exposure to cold, caffeine consumption, smoking and physical exertion within half an hour before measurement. The use of standardized techniques to measure blood pressure will help to avoid large systematic errors. Poor technique can account for differences in readings of more than 15 mm Hg and ultimately misdiagnosis. Most of the recommended procedures are simple and, when routinely incorporated into clinical practice, require little additional time. The equipment must be appropriate and in good condition. Physicians should have a suitable selection of cuff sizes readily available; the use of the correct cuff size is essential to minimize systematic errors in blood pressure measurement. Semiannual calibration of aneroid sphygmomanometers and annual inspection of mercury sphygmomanometers and blood pressure cuffs are recommended. We review the methods recommended for measuring blood pressure and discuss the factors known to produce large differences in blood pressure readings. PMID:2192791

  15. Closing the mental health treatment gap in South Africa: a review of costs and cost-effectiveness

    PubMed Central

    Jack, Helen; Wagner, Ryan G.; Petersen, Inge; Thom, Rita; Newton, Charles R.; Stein, Alan; Kahn, Kathleen; Tollman, Stephen; Hofman, Karen J.

    2014-01-01

    Background Nearly one in three South Africans will suffer from a mental disorder in his or her lifetime, a higher prevalence than many low- and middle-income countries. Understanding the economic costs and consequences of prevention and packages of care is essential, particularly as South Africa considers scaling-up mental health services and works towards universal health coverage. Economic evaluations can inform how priorities are set in system or spending changes. Objective To identify and review research from South Africa and sub-Saharan Africa on the direct and indirect costs of mental, neurological, and substance use (MNS) disorders and the cost-effectiveness of treatment interventions. Design Narrative overview methodology. Results and conclusions Reviewed studies indicate that integrating mental health care into existing health systems may be the most effective and cost-efficient approach to increase access to mental health services in South Africa. Integration would also direct treatment, prevention, and screening to people with HIV and other chronic health conditions who are at high risk for mental disorders. We identify four major knowledge gaps: 1) accurate and thorough assessment of the health burdens of MNS disorders, 2) design and assessment of interventions that integrate mental health screening and treatment into existing health systems, 3) information on the use and costs of traditional medicines, and 4) cost-effectiveness evaluation of a range of specific interventions or packages of interventions that are tailored to the national context. PMID:24848654

  16. Assessing the Cost of Global Biodiversity and Conservation Knowledge.

    PubMed

    Juffe-Bignoli, Diego; Brooks, Thomas M; Butchart, Stuart H M; Jenkins, Richard B; Boe, Kaia; Hoffmann, Michael; Angulo, Ariadne; Bachman, Steve; Böhm, Monika; Brummitt, Neil; Carpenter, Kent E; Comer, Pat J; Cox, Neil; Cuttelod, Annabelle; Darwall, William R T; Di Marco, Moreno; Fishpool, Lincoln D C; Goettsch, Bárbara; Heath, Melanie; Hilton-Taylor, Craig; Hutton, Jon; Johnson, Tim; Joolia, Ackbar; Keith, David A; Langhammer, Penny F; Luedtke, Jennifer; Nic Lughadha, Eimear; Lutz, Maiko; May, Ian; Miller, Rebecca M; Oliveira-Miranda, María A; Parr, Mike; Pollock, Caroline M; Ralph, Gina; Rodríguez, Jon Paul; Rondinini, Carlo; Smart, Jane; Stuart, Simon; Symes, Andy; Tordoff, Andrew W; Woodley, Stephen; Young, Bruce; Kingston, Naomi

    2016-01-01

    Knowledge products comprise assessments of authoritative information supported by standards, governance, quality control, data, tools, and capacity building mechanisms. Considerable resources are dedicated to developing and maintaining knowledge products for biodiversity conservation, and they are widely used to inform policy and advise decision makers and practitioners. However, the financial cost of delivering this information is largely undocumented. We evaluated the costs and funding sources for developing and maintaining four global biodiversity and conservation knowledge products: The IUCN Red List of Threatened Species, the IUCN Red List of Ecosystems, Protected Planet, and the World Database of Key Biodiversity Areas. These are secondary data sets, built on primary data collected by extensive networks of expert contributors worldwide. We estimate that US$160 million (range: US$116-204 million), plus 293 person-years of volunteer time (range: 278-308 person-years) valued at US$ 14 million (range US$12-16 million), were invested in these four knowledge products between 1979 and 2013. More than half of this financing was provided through philanthropy, and nearly three-quarters was spent on personnel costs. The estimated annual cost of maintaining data and platforms for three of these knowledge products (excluding the IUCN Red List of Ecosystems for which annual costs were not possible to estimate for 2013) is US$6.5 million in total (range: US$6.2-6.7 million). We estimated that an additional US$114 million will be needed to reach pre-defined baselines of data coverage for all the four knowledge products, and that once achieved, annual maintenance costs will be approximately US$12 million. These costs are much lower than those to maintain many other, similarly important, global knowledge products. Ensuring that biodiversity and conservation knowledge products are sufficiently up to date, comprehensive and accurate is fundamental to inform decision-making for

  17. The cost of medicines in the United Kingdom. A survey of general practitioners' opinions and knowledge.

    PubMed

    Silcock, J; Ryan, M; Bond, C M; Taylor, R J

    1997-01-01

    Prescribing costs in general practice continue to grow. Their importance is underlined by the amount of information concerned with costs that general practitioners (GPs) receive, and by the existence of target budgets. In 1986 and 1991, surveys showed that GPs agreed that cost should be borne in mind when choosing medicines, but that their knowledge of drug prices was often inaccurate. This study assessed the current knowledge and attitudes of GPs in the UK in respect of prescribing costs, and examined the influence of various developments in general practice since 1986 on the accuracy of drug price estimation. 1000 randomly selected GP principals (500 in Scotland and 125 in each of 4 English health regions) were sent a postal questionnaire. The GPs' level of agreement with 5 statements concerned with prescribing costs, and the accuracy of their estimates of the basic price of 31 drugs, were analysed. Most GPs (71%) agreed that prescribing costs should be taken into account when deciding on the best treatment for patients. Fundholders were more likely than non-fundholders: (i) to agree that prescribing costs could be reduced without affecting patient care; (ii) to agree that providing more information on costs would lower the cost of prescribing; and (iii) to comment that cost guidelines had changed their prescribing habits. Fundholders were less likely than non-fundholders to reject the principle of fixed limits on prescribing costs. Overall, one-third of the price estimates given were accurate (within 25% of the actual cost). For the most expensive drugs in the survey [those priced over 10 pounds sterling (£) per pack], half of the price estimates were accurate. There were significant differences between non-fundholders' and fundholders' estimates of the price of less expensive drugs (those priced at less than 10 pounds per pack). Use of a formulary or computer-displayed drug price information did not affect the accuracy of price estimates. It may be that GPs

  18. Efficient Research Design: Using Value-of-Information Analysis to Estimate the Optimal Mix of Top-down and Bottom-up Costing Approaches in an Economic Evaluation alongside a Clinical Trial.

    PubMed

    Wilson, Edward C F; Mugford, Miranda; Barton, Garry; Shepstone, Lee

    2016-04-01

    In designing economic evaluations alongside clinical trials, analysts are frequently faced with alternative methods of collecting the same data, the extremes being top-down ("gross costing") and bottom-up ("micro-costing") approaches. A priori, bottom-up approaches may be considered superior to top-down approaches but are also more expensive to collect and analyze. In this article, we use value-of-information analysis to estimate the efficient mix of observations on each method in a proposed clinical trial. By assigning a prior bivariate distribution to the 2 data collection processes, the predicted posterior (i.e., preposterior) mean and variance of the superior process can be calculated from proposed samples using either process. This is then used to calculate the preposterior mean and variance of incremental net benefit and hence the expected net gain of sampling. We apply this method to a previously collected data set to estimate the value of conducting a further trial and identifying the optimal mix of observations on drug costs at 2 levels: by individual item (process A) and by drug class (process B). We find that substituting a number of observations on process A for process B leads to a modest £ 35,000 increase in expected net gain of sampling. Drivers of the results are the correlation between the 2 processes and their relative cost. This method has potential use following a pilot study to inform efficient data collection approaches for a subsequent full-scale trial. It provides a formal quantitative approach to inform trialists whether it is efficient to collect resource use data on all patients in a trial or on a subset of patients only or to collect limited data on most and detailed data on a subset.

  19. Comprehensive cost analysis of sentinel node biopsy in solid head and neck tumors using a time-driven activity-based costing approach.

    PubMed

    Crott, Ralph; Lawson, Georges; Nollevaux, Marie-Cécile; Castiaux, Annick; Krug, Bruno

    2016-09-01

    Head and neck cancer (HNC) is predominantly a locoregional disease. Sentinel lymph node (SLN) biopsy offers a minimally invasive means of accurately staging the neck. Value in healthcare is determined by both outcomes and the costs associated with achieving them. Time-driven activity-based costing (TDABC) may offer more precise estimates of the true cost. Process maps were developed for the nuclear medicine, operating room and pathology care phases. TDABC estimates the costs by combining information about the process with the unit cost of each resource used. Resource utilization is based on observation of care and staff interviews. Unit costs are calculated as a capacity cost rate, measured in euros per minute (2014 prices), for each resource consumed. Multiplying together the unit costs and resource quantities and summing across all resources used produces the average cost for each phase of care. Three time equations with six different scenarios were modeled based on the type of camera, the number of SLNs and the type of staining used. Total times for the different SLN scenarios vary between 284 and 307 min, with total costs between 2794€ and 3541€. The unit costs vary between 788€/h for the intraoperative evaluation with a gamma probe and 889€/h for preoperative imaging with SPECT/CT. The unit costs for the lymphadenectomy and the pathological examination are 560€/h and 713€/h, respectively. A 10% increase of time for an individual activity generates only a 1% change in the total cost. TDABC evaluates the cost of SLN biopsy in HNC; the total cost across all phases varied between 2761€ and 3744€ per standard case.
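
    A small sketch of the TDABC arithmetic follows: each phase's cost is its capacity cost rate (quoted above in euros per hour) multiplied by the observed process time, summed across the process map. The activity times below are assumptions chosen only to land near the reported cost range.

```python
# Time-driven activity-based costing: cost = capacity cost rate x process time, summed
# over the process map. Rates follow the abstract; the minutes are hypothetical.

process_map = [
    # (activity, minutes, capacity cost rate in euros per hour)
    ("preoperative SPECT/CT imaging", 70, 889.0),
    ("intraoperative gamma-probe SLN detection", 45, 788.0),
    ("selective lymphadenectomy", 90, 560.0),
    ("pathological examination", 80, 713.0),
]

total = 0.0
for activity, minutes, rate_per_hour in process_map:
    phase_cost = minutes * rate_per_hour / 60.0
    total += phase_cost
    print(f"{activity:45s} {phase_cost:8.0f} EUR")
print(f"{'total cost per standard case':45s} {total:8.0f} EUR")
```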

  20. An Iterative, Low-Cost Strategy to Building Information Systems Allows a Small Jurisdiction Local Health Department to Increase Efficiencies and Expand Services

    PubMed Central

    Shah, Gulzar H.

    2016-01-01

    Objective and Methods: The objective of this case study was to describe the process and outcomes of a small local health department's (LHD's) strategy to build and use information systems. The case study is based on a review of documents and semi-structured interviews with key informants in the Pomperaug District Health Department. Interviews were recorded, transcribed, coded, and analyzed. Results and Conclusions: The case study here suggests that small LHDs can use a low-resource, incremental strategy to build information systems for improving departmental effectiveness and efficiency. Specifically, we suggest that the elements for this department's success were simple information systems, clear vision, consistent leadership, and the involvement, training, and support of staff. PMID:27684628

  1. Costing and cost analysis in randomized controlled trials: caveat emptor.

    PubMed

    Polsky, Daniel; Glick, Henry

    2009-01-01

    This article provides an overview of the central issues regarding cost valuation and analysis for a decision maker's evaluation of costing performed within randomized controlled trials (RCTs). Costing involves specific choices for valuation and analysis that involve trade-offs. Understanding these choices and their implications is necessary for proper evaluation of how costs are valued and analyzed within an RCT and cannot be assessed through a checklist of adherence to general principles. Resource costing, the most common method of costing, involves measuring medical service use in study case report forms and translating this use into a cost by multiplying the number of units of each medical service by price weights for those services. A choice must be made as to how detailed the measurement of resources will be. Micro-costing improves the specificity of the cost estimate, but it is often impractical to precisely measure resources at this level and the price weights for these micro-units may not be available. Gross-costing may be more practical, and price weights are often easier to find and are more reliable, but important resource differences between treatment groups may be lost in the bundling of resources. Price weights can either be nationally determined or centre specific, but the appropriate price weight will depend on perspective, convenience, completeness and accuracy. Identifying the resource types and the appropriate price weights for these resources are the essential elements to an accurate valuation of costs. Once medical services are valued, the resulting individual patient cost estimates must be analysed. The difference in the mean cost between treatment groups is the important summary statistic for cost-effectiveness analysis from both the budgetary and the social perspectives. The statistical challenges with cost data typically stem from its skewed distribution and the resulting possibility that the sample mean may be inefficient and possibly
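
    As a minimal sketch of resource costing as described above, per-patient cost is the sum of measured service units multiplied by price weights, and the analysis compares the mean cost between treatment groups; the service categories and price weights below are hypothetical.

```python
# Resource costing sketch: cost per patient = sum(units x price weight), then compare arm means.

price_weights = {"inpatient_day": 950.0, "outpatient_visit": 180.0, "gp_visit": 45.0}

def patient_cost(resource_use):
    return sum(units * price_weights[item] for item, units in resource_use.items())

arm_a = [{"inpatient_day": 2, "outpatient_visit": 3, "gp_visit": 4},
         {"inpatient_day": 0, "outpatient_visit": 5, "gp_visit": 2}]
arm_b = [{"inpatient_day": 4, "outpatient_visit": 1, "gp_visit": 1},
         {"inpatient_day": 1, "outpatient_visit": 2, "gp_visit": 6}]

mean_a = sum(map(patient_cost, arm_a)) / len(arm_a)
mean_b = sum(map(patient_cost, arm_b)) / len(arm_b)
print(f"difference in mean cost (A - B): {mean_a - mean_b:,.0f}")
```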

  2. The costs of schizophrenia in Spain.

    PubMed

    Oliva-Moreno, Juan; López-Bastida, Julio; Osuna-Guerrero, Rubén; Montejo-González, Angel Luis; Duque-González, Beatriz

    2006-09-01

    This study estimated the economic impact of schizophrenia-related direct costs (medical and nonmedical costs) in Spain. Direct medical costs (hospitalizations, outpatient consultations, drug costs) and direct nonmedical costs (costs of informal care) were estimated based on prevalence costs for 2002. The total costs of schizophrenia were estimated at €1,970.8 million; direct medical costs accounted for 53% and informal care costs for 47%. Despite having implemented a conservative approach, the health care costs associated with schizophrenia account for 2.7% of total public health care expenditure in Spain. The sum of medical and nonmedical costs gives a better definition of the magnitude of the problem in Spain and helps make the debate on this issue more transparent.

  3. Costs and benefits

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Two models of cost-benefit analysis are illustrated, and the application of these models to assessing the economic scope of space applications programs is discussed. Four major areas cited as improvable through space-derived information are discussed: food supply and distribution, energy sources, mineral reserves, and communication and navigation. Specific illustrations are given for agriculture and maritime traffic.

  4. Nonexposure Accurate Location K-Anonymity Algorithm in LBS

    PubMed Central

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS) where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinate and replaces it with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existent cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present such two nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas which were reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than the existent cloaking algorithms, need not have all the users reporting their locations all the time, and can generate smaller ASR. PMID:24605060

  5. Nonexposure accurate location K-anonymity algorithm in LBS.

    PubMed

    Jia, Jinying; Zhang, Fengli

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS) where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinate and replaces it with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existent cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present such two nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas which were reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than the existent cloaking algorithms, need not have all the users reporting their locations all the time, and can generate smaller ASR.
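
    A simplified sketch of the grid-ID idea follows: users report only the ID of the grid cell they occupy, and the anonymizer grows a block of cells around the querying user's cell until it covers at least K reported users, returning that block as the ASR. This illustrates the concept under assumed data; it is not the paper's exact algorithms.

```python
# Grid-ID-based cloaking sketch: no exact coordinates are ever revealed, only cell IDs.

from collections import Counter

GRID = 16  # the map is a GRID x GRID lattice of cells

def cell_id(cx, cy):
    return cy * GRID + cx

def cloak(reported_ids, my_cell, k):
    """Return the set of cell IDs forming the anonymous spatial region (ASR)."""
    counts = Counter(reported_ids)
    cx, cy = my_cell % GRID, my_cell // GRID
    for radius in range(GRID):
        cells = {cell_id(x, y)
                 for x in range(max(0, cx - radius), min(GRID, cx + radius + 1))
                 for y in range(max(0, cy - radius), min(GRID, cy + radius + 1))}
        if sum(counts[c] for c in cells) >= k:
            return cells
    return set(range(GRID * GRID))

# Example: reported cell IDs from all users (including the querying user's own report)
reports = [cell_id(3, 4), cell_id(4, 4), cell_id(5, 6), cell_id(12, 1), cell_id(4, 5)]
asr = cloak(reports, my_cell=cell_id(4, 4), k=3)
print(f"ASR covers {len(asr)} cells")
```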

  6. 48 CFR 52.215-10 - Price Reduction for Defective Certified Cost or Pricing Data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... complete, accurate, and current as certified in its Certificate of Current Cost or Pricing Data; (2) A... complete, accurate, and current as certified in the Contractor's Certificate of Current Cost or Pricing... been modified even if accurate, complete, and current certified cost or pricing data had been...

  7. Accurate Energy Transaction Allocation using Path Integration and Interpolation

    NASA Astrophysics Data System (ADS)

    Bhide, Mandar Mohan

    This thesis investigates many of the popular cost allocation methods that are based on actual usage of the transmission network. The Energy Transaction Allocation (ETA) method originally proposed by A. Fradi, S. Brigonne and B. Wollenberg, which gives the unique advantage of accurately allocating transmission network usage, is then discussed. A modified calculation of ETA based on a simple interpolation technique is proposed. The proposed methodology not only increases the accuracy of the calculation but also reduces the number of calculations to less than half of that required by the original ETA method.

  8. Selected Tether Applications Cost Model

    NASA Technical Reports Server (NTRS)

    Keeley, Michael G.

    1988-01-01

    Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.

  9. Solar power satellite cost estimate

    NASA Technical Reports Server (NTRS)

    Harron, R. J.; Wadle, R. C.

    1981-01-01

    The solar power configuration costed is the 5 GW silicon solar cell reference system. The subsystems are identified by work breakdown structure elements down to the lowest level for which cost information was generated. This breakdown divides into five sections: the satellite, construction, transportation, the ground receiving station and maintenance. For each work breakdown structure element, a definition, design description and cost estimate were included. An effort was made to include for each element a reference that more thoroughly describes the element and the method of costing used. All costs are in 1977 dollars.

  10. Providing Low-Cost Information Technology Access to Rural Communities in Developing Countries: What Works? What Pays? OECD Development Centre Working Paper No. 229 (Formerly Webdoc No. 17)

    ERIC Educational Resources Information Center

    Caspary, Georg; O'Connor, David

    2003-01-01

    Rural areas of the developing world are the last frontier of the information technology revolution. Telephone and internet penetration there remains a small fraction of what it is in the developed world. Limited means of electronic communication with the outside world are just one source of isolation of rural communities and economies from the…

  11. Results of an Experimental Program to Provide Low Cost Computer Searches of the NASA Information File to University Graduate Students in the Southeast. Final Report.

    ERIC Educational Resources Information Center

    Smetana, Frederick O.; Phillips, Dennis M.

    In an effort to increase dissemination of scientific and technological information, a program was undertaken whereby graduate students in science and engineering could request a computer-produced bibliography and/or abstracts of documents identified by the computer. The principal resource was the National Aeronautics and Space Administration…

  12. Informing mental health policies and services in the EMR: cost-effective deployment of human resources to deliver integrated community-based care.

    PubMed

    Ivbijaro, G; Patel, V; Chisholm, D; Goldberg, D; Khoja, T A M; Edwards, T M; Enum, Y; Kolkiewic, L A

    2015-09-28

    For EMR countries to deliver the expectations of the Global Mental Health Action Plan 2013-2020 & the ongoing move towards universal health coverage, all health & social care providers need to innovate and transform their services to provide evidence-based health care that is accessible, cost-effective & with the best patient outcomes. For the primary and community workforce, this includes general medical practitioners, practice & community nurses, community social workers, housing officers, lay health workers, nongovernmental organizations & civil society, including community spiritual leaders/healers. This paper brings together the current best evidence to support transformation & discusses key approaches to achieve this, including skill mix and/or task shifting and integrated care. The important factors that need to be in place to support skill mix/task shifting and good integrated care are outlined with reference to EMR countries.

  13. Cost Control

    ERIC Educational Resources Information Center

    Foreman, Phillip

    2009-01-01

    Education administrators involved in construction initiatives unanimously agree that when it comes to change orders, less is more. Change orders have a negative rippling effect of driving up building costs and producing expensive project delays that often interfere with school operations and schedules. Some change orders are initiated by schools…

  14. The impact of disease stage on direct medical costs of HIV management: a review of the international literature.

    PubMed

    Levy, Adrian; Johnston, Karissa; Annemans, Lieven; Tramarin, Andrea; Montaner, Julio

    2010-01-01

    The global prevalence of HIV infection continues to grow, as a result of increasing incidence in some countries and improved survival where highly active antiretroviral therapy (HAART) is available. Growing healthcare expenditure and shifts in the types of medical resources used have created a greater need for accurate information on the costs of treatment. The objectives of this review were to compare published estimates of direct medical costs for treating HIV and to determine the impact of disease stage on such costs, based on CD4 cell count and plasma viral load. A literature review was conducted to identify studies meeting prespecified criteria for information content, including an original estimate of the direct medical costs of treating an HIV-infected individual, stratified based on markers of disease progression. Three unpublished cost-of-care studies were also included, which were applied in the economic analyses published in this supplement. A two-step procedure was used to convert costs into a common price year (2004) using country-specific health expenditure inflators and, to account for differences in currency, using health-specific purchasing power parities to express all cost estimates in US dollars. In all nine studies meeting the eligibility criteria, infected individuals were followed longitudinally and a 'bottom-up' approach was used to estimate costs. The same patterns were observed in all studies: the lowest CD4 categories had the highest cost; there was a sharp decrease in costs as CD4 cell counts rose towards 100 cells/mm³; and there was a more gradual decline in costs as CD4 cell counts rose above 100 cells/mm³. In the single study reporting cost according to viral load, it was shown that higher plasma viral load level (> 100,000 HIV-RNA copies/mL) was associated with higher costs of care. The results demonstrate that the cost of treating HIV disease increases with disease progression, particularly at CD4 cell counts below 100 cells

  15. A model for the cost of doing a cost estimate

    NASA Technical Reports Server (NTRS)

    Remer, D. S.; Buchanan, H. R.

    1992-01-01

    A model for estimating the cost required to do a cost estimate for Deep Space Network (DSN) projects that range from $0.1 to $100 million is presented. The cost of the cost estimate in thousands of dollars, C_E, is found to be approximately given by C_E = K(C_p)^0.35, where C_p is the cost of the project being estimated in millions of dollars and K is a constant depending on the accuracy of the estimate. For an order-of-magnitude estimate, K = 24; for a budget estimate, K = 60; and for a definitive estimate, K = 115. That is, for a specific project, the cost of doing a budget estimate is about 2.5 times as much as that for an order-of-magnitude estimate, and a definitive estimate costs about twice as much as a budget estimate. Use of this model should help provide the level of resources required for doing cost estimates and, as a result, provide insights towards more accurate estimates with less potential for cost overruns.
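
    As a quick check of the relationship reported above, the following sketch (not part of the original report) evaluates C_E = K(C_p)^0.35 for the three accuracy classes; the K values come from the abstract, while the $10 million project cost is an arbitrary example.

```python
# Sketch of the cost-of-estimating relationship described above.
# C_E (thousands of dollars) = K * C_p**0.35, with C_p in millions of dollars.
# The K values come from the abstract; the $10M project cost is an arbitrary example.

K_BY_ESTIMATE_CLASS = {
    "order-of-magnitude": 24,
    "budget": 60,
    "definitive": 115,
}

def cost_of_estimate(project_cost_millions: float, estimate_class: str) -> float:
    """Return the approximate cost of producing the estimate, in thousands of dollars."""
    k = K_BY_ESTIMATE_CLASS[estimate_class]
    return k * project_cost_millions ** 0.35

if __name__ == "__main__":
    for estimate_class in K_BY_ESTIMATE_CLASS:
        c_e = cost_of_estimate(10.0, estimate_class)   # $10M project (assumed)
        print(f"{estimate_class:>18}: about ${c_e:,.0f}K to prepare")
```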

  16. Mill profiler machines soft materials accurately

    NASA Technical Reports Server (NTRS)

    Rauschl, J. A.

    1966-01-01

    Mill profiler machines bevels, slots, and grooves in soft materials, such as styrofoam phenolic-filled cores, to any desired thickness. A single operator can accurately control cutting depths in contour or straight line work.

  17. The Cost of Accumulating Evidence in Perceptual Decision Making

    PubMed Central

    Drugowitsch, Jan; Moreno-Bote, Rubén; Churchland, Anne K.; Shadlen, Michael N.; Pouget, Alexandre

    2012-01-01

    Decision making often involves the accumulation of information over time, but acquiring information typically comes at a cost. Little is known about the cost incurred by animals and humans for acquiring additional information from sensory variables, due, for instance, to attentional efforts. Through a novel integration of diffusion models and dynamic programming, we were able to estimate the cost of making additional observations per unit of time from two monkeys and six humans in a reaction time random dot motion discrimination task. Surprisingly, we find that the cost is neither zero nor constant over time: for both the animals and the humans it features a brief period in which it is constant, after which it increases. In addition, we show that our theory accurately matches the observed reaction time distributions for each stimulus condition, the time-dependent choice accuracy both conditional on stimulus strength and independent of it, and choice accuracy and mean reaction times as a function of stimulus strength. The theory also correctly predicts that urgency signals in the brain should be independent of the difficulty, or stimulus strength, at each trial. PMID:22423085

  18. Computational Time-Accurate Body Movement: Methodology, Validation, and Application

    DTIC Science & Technology

    1995-10-01

    used that had a leading-edge sweep angle of 45 deg and a NACA 64A010 symmetrical airfoil section. A cross section of the pylon is a symmetrical... Information Flow for the Time-Accurate Store Trajectory Prediction Process... Pitch Rates for NACA-0012 Airfoil... section are comparisons of the computational results to data for a NACA-0012 airfoil following a predefined pitching motion. Validation of the

  19. EPA (Environmental Protection Agency) evaluation of the POWERFuel Extender System under Section 511 of the Motor Vehicle Information and Cost Savings Act. Technical report

    SciTech Connect

    Syria, S.L.

    1983-08-01

    The evaluation of the POWERFuel Extender System was conducted upon the application of the manufacturer. The device is claimed to improve fuel economy and driveability and to reduce exhaust emissions and required engine maintenance. The device is classified by EPA as a vapor-air bleed device. EPA fully considered all of the information submitted by the applicant. The evaluation of the POWERFuel Extender System was based on that information and on EPA's experience with other similar devices. Although, in theory, the introduction of alcohol and water could have a favorable effect on an engine's cleanliness, power and maintenance requirements and could even allow some vehicles to use lower octane fuel, data were not submitted to substantiate that the POWERFuel Extender System could cause these benefits.

  20. Jail Removal Cost Study. Volume 2.

    ERIC Educational Resources Information Center

    Illinois Univ., Champaign. Community Research Center.

    This second volume of the jail removal cost study provides a detailed report of the study findings which examine the costs, experiences, and ramifications of removing children from adult jails and lockups. The foreword supplies background information and hypothetical removal cost estimates. The approach used to conduct the jail removal cost study…

  1. Cost Accounting in the Automated Manufacturing Environment

    DTIC Science & Technology

    1988-06-01

    current cost accounting systems... [Bennett et al., 1987]. First, users of cost accounting information express different levels of satisfaction with the effectiveness of current cost accounting systems... [Bennett et al., 1987]. For example, it was found that ... percent of the users are unhappy with current cost accounting

  2. PRIMAL: Fast and accurate pedigree-based imputation from sequence data in a founder population.

    PubMed

    Livne, Oren E; Han, Lide; Alkorta-Aranburu, Gorka; Wentworth-Sheilds, William; Abney, Mark; Ober, Carole; Nicolae, Dan L

    2015-03-01

    Founder populations and large pedigrees offer many well-known advantages for genetic mapping studies, including cost-efficient study designs. Here, we describe PRIMAL (PedigRee IMputation ALgorithm), a fast and accurate pedigree-based phasing and imputation algorithm for founder populations. PRIMAL incorporates both existing and original ideas, such as a novel indexing strategy of Identity-By-Descent (IBD) segments based on clique graphs. We were able to impute the genomes of 1,317 South Dakota Hutterites, who had genome-wide genotypes for ~300,000 common single nucleotide variants (SNVs), from 98 whole genome sequences. Using a combination of pedigree-based and LD-based imputation, we were able to assign 87% of genotypes with >99% accuracy over the full range of allele frequencies. Using the IBD cliques we were also able to infer the parental origin of 83% of alleles, and genotypes of deceased recent ancestors for whom no genotype information was available. This imputed data set will enable us to better study the relative contribution of rare and common variants on human phenotypes, as well as parental origin effect of disease risk alleles in >1,000 individuals at minimal cost.

  3. Standard cost elements for technology programs

    NASA Technical Reports Server (NTRS)

    Christensen, Carisa B.; Wagenfuehrer, Carl

    1992-01-01

    The suitable structure for an effective and accurate cost estimate for general purposes is discussed in the context of a NASA technology program. Cost elements are defined for research, management, and facility-construction portions of technology programs. Attention is given to the mechanisms for ensuring the viability of spending programs, and the need for program managers is established for effecting timely fund disbursement. Formal, structured, and intuitive techniques are discussed for cost-estimate development, and cost-estimate defensibility can be improved with increased documentation. NASA policies for cash management are examined to demonstrate the importance of the ability to obligate funds and the ability to cost contracted funds. The NASA approach to consistent cost justification is set forth with a list of standard cost-element definitions. The cost elements reflect the three primary concerns of cost estimates: the identification of major assumptions, the specification of secondary analytic assumptions, and the status of program factors.

  4. Novel serologic biomarkers provide accurate estimates of recent Plasmodium falciparum exposure for individuals and communities.

    PubMed

    Helb, Danica A; Tetteh, Kevin K A; Felgner, Philip L; Skinner, Jeff; Hubbard, Alan; Arinaitwe, Emmanuel; Mayanja-Kizza, Harriet; Ssewanyana, Isaac; Kamya, Moses R; Beeson, James G; Tappero, Jordan; Smith, David L; Crompton, Peter D; Rosenthal, Philip J; Dorsey, Grant; Drakeley, Christopher J; Greenhouse, Bryan

    2015-08-11

    Tools to reliably measure Plasmodium falciparum (Pf) exposure in individuals and communities are needed to guide and evaluate malaria control interventions. Serologic assays can potentially produce precise exposure estimates at low cost; however, current approaches based on responses to a few characterized antigens are not designed to estimate exposure in individuals. Pf-specific antibody responses differ by antigen, suggesting that selection of antigens with defined kinetic profiles will improve estimates of Pf exposure. To identify novel serologic biomarkers of malaria exposure, we evaluated responses to 856 Pf antigens by protein microarray in 186 Ugandan children, for whom detailed Pf exposure data were available. Using data-adaptive statistical methods, we identified combinations of antibody responses that maximized information on an individual's recent exposure. Responses to three novel Pf antigens accurately classified whether an individual had been infected within the last 30, 90, or 365 d (cross-validated area under the curve = 0.86-0.93), whereas responses to six antigens accurately estimated an individual's malaria incidence in the prior year. Cross-validated incidence predictions for individuals in different communities provided accurate stratification of exposure between populations and suggest that precise estimates of community exposure can be obtained from sampling a small subset of that community. In addition, serologic incidence predictions from cross-sectional samples characterized heterogeneity within a community similarly to 1 y of continuous passive surveillance. Development of simple ELISA-based assays derived from the successful selection strategy outlined here offers the potential to generate rich epidemiologic surveillance data that will be widely accessible to malaria control programs.

  5. Accurate derivation of heart rate variability signal for detection of sleep disordered breathing in children.

    PubMed

    Chatlapalli, S; Nazeran, H; Melarkod, V; Krishnam, R; Estrada, E; Pamula, Y; Cabrera, S

    2004-01-01

    The electrocardiogram (ECG) signal is used extensively as a low cost diagnostic tool to provide information concerning the heart's state of health. Accurate determination of the QRS complex, in particular, reliable detection of the R wave peak, is essential in computer based ECG analysis. ECG data from Physionet's Sleep-Apnea database were used to develop, test, and validate a robust heart rate variability (HRV) signal derivation algorithm. The HRV signal was derived from pre-processed ECG signals by developing an enhanced Hilbert transform (EHT) algorithm with built-in missing beat detection capability for reliable QRS detection. The performance of the EHT algorithm was then compared against that of a popular Hilbert transform-based (HT) QRS detection algorithm. Autoregressive (AR) modeling of the HRV power spectrum for both EHT- and HT-derived HRV signals was achieved and different parameters from their power spectra as well as approximate entropy were derived for comparison. Poincare plots were then used as a visualization tool to highlight the detection of the missing beats in the EHT method. After validation of the EHT algorithm on ECG data from the Physionet database, the algorithm was further tested and validated on a dataset obtained from children undergoing polysomnography for detection of sleep disordered breathing (SDB). Sensitive measures of accurate HRV signals were then derived to be used in detecting and diagnosing sleep disordered breathing in children. All signal processing algorithms were implemented in MATLAB. We present a description of the EHT algorithm and analyze pilot data for eight children undergoing nocturnal polysomnography. The pilot data demonstrated that the EHT method provides an accurate way of deriving the HRV signal and plays an important role in extraction of reliable measures to distinguish between periods of normal and sleep disordered breathing (SDB) in children.
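
    The abstract does not give the details of the EHT algorithm, so the sketch below is only a generic illustration of a Hilbert-transform-based R-peak detector and RR-interval (HRV) derivation of the kind compared above; the filter band, threshold, sampling rate, and function names are assumptions, not the authors' implementation.

```python
# Generic sketch of Hilbert-transform-based R-peak detection and HRV derivation.
# This is NOT the authors' EHT algorithm; filter band, threshold, and sampling
# rate are assumptions chosen only to make the example self-contained.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, find_peaks

def derive_rr_intervals(ecg: np.ndarray, fs: float = 250.0) -> np.ndarray:
    """Return RR intervals (seconds) from a single-lead ECG sampled at fs Hz."""
    # 1. Band-pass filter to emphasize the QRS complex (5-15 Hz is a common choice).
    b, a = butter(3, [5.0 / (fs / 2), 15.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)

    # 2. The envelope of the analytic signal highlights QRS energy.
    envelope = np.abs(hilbert(filtered))

    # 3. Peak picking with a 0.25 s refractory period and a simple global threshold.
    peaks, _ = find_peaks(envelope, distance=int(0.25 * fs),
                          height=np.mean(envelope) + np.std(envelope))

    # 4. Successive R-R differences form the HRV signal; a missing-beat check (as in
    #    the EHT idea) would re-search implausibly long intervals for an extra peak.
    return np.diff(peaks) / fs
```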

  6. Costing Children's Speech, Language and Communication Interventions

    ERIC Educational Resources Information Center

    Beecham, Jennifer; Law, James; Zeng, Biao; Lindsay, Geoff

    2012-01-01

    Background: There are few economic evaluations of speech and language interventions. Such work requires underpinning by an accurate estimate of the costs of the intervention. This study seeks to address some of the complexities of this task by applying existing approaches of cost estimation to interventions described in published effectiveness…

  7. Discrete sensors distribution for accurate plantar pressure analyses.

    PubMed

    Claverie, Laetitia; Ille, Anne; Moretto, Pierre

    2016-12-01

    The aim of this study was to determine the distribution of discrete sensors under the footprint for accurate plantar pressure analyses. For this purpose, two different sensor layouts have been tested and compared, to determine which was the most accurate to monitor plantar pressure with wireless devices in research and/or clinical practice. Ten healthy volunteers participated in the study (age range: 23-58 years). The barycenter of pressures (BoP) determined from the plantar pressure system (W-inshoe®) was compared to the center of pressures (CoP) determined from a force platform (AMTI) in the medial-lateral (ML) and anterior-posterior (AP) directions. Then, the vertical ground reaction force (vGRF) obtained from both W-inshoe® and force platform was compared for both layouts for each subject. The BoP and vGRF determined from the plantar pressure system data showed good correlation (SCC) with those determined from the force platform data, notably for the second sensor organization (ML SCC= 0.95; AP SCC=0.99; vGRF SCC=0.91). The study demonstrates that an adjusted placement of removable sensors is key to accurate plantar pressure analyses. These results are promising for plantar pressure recording outside clinical or laboratory settings, for long-term monitoring, real-time feedback, or for any activity requiring a low-cost system.
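
    The comparison described above can be illustrated with a minimal sketch: a barycenter of pressure computed from a few discrete sensors is correlated, direction by direction, with a force-platform center of pressure. It assumes the abstract's SCC denotes a Spearman correlation coefficient; the sensor layout and readings are made-up examples.

```python
# Minimal sketch of the BoP-vs-CoP comparison described above. Sensor positions and
# readings are invented, and "SCC" is assumed to mean Spearman correlation coefficient.
import numpy as np
from scipy.stats import spearmanr

def barycenter_of_pressure(xy: np.ndarray, pressures: np.ndarray) -> np.ndarray:
    """Pressure-weighted mean sensor position, one (x, y) point per time sample."""
    weights = pressures / pressures.sum(axis=1, keepdims=True)
    return weights @ xy

# Example: 4 sensors (heel, midfoot, 1st and 5th metatarsal heads), 100 time samples.
sensor_xy = np.array([[0.00, 0.00], [0.02, 0.08], [0.01, 0.16], [0.05, 0.15]])
rng = np.random.default_rng(0)
pressures = rng.uniform(10.0, 200.0, size=(100, 4))

bop = barycenter_of_pressure(sensor_xy, pressures)
cop = bop + rng.normal(0.0, 0.002, size=bop.shape)   # stand-in for force-platform CoP

rho_ml, _ = spearmanr(bop[:, 0], cop[:, 0])   # medial-lateral direction
rho_ap, _ = spearmanr(bop[:, 1], cop[:, 1])   # anterior-posterior direction
print(f"ML SCC = {rho_ml:.2f}, AP SCC = {rho_ap:.2f}")
```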

  8. Gauging Technology Costs and Benefits

    ERIC Educational Resources Information Center

    Kaestner, Rich

    2007-01-01

    Regardless of the role technology plays in a school district, district personnel should know the costs associated with technology, understand the consequences of technology purchases, and be able to measure the benefits of technology, so they can make more informed decisions. However, determining costs and benefits of current technology or…

  9. Program Tracks Cost Of Travel

    NASA Technical Reports Server (NTRS)

    Mauldin, Lemuel E., III

    1993-01-01

    Travel Forecaster is menu-driven, easy-to-use computer program that plans, forecasts cost, and tracks actual vs. planned cost of business-related travel of division or branch of organization and compiles information into data base to aid travel planner. Ability of program to handle multiple trip entries makes it valuable time-saving device.

  10. Estimating archiving costs for engineering records

    SciTech Connect

    Stutz, R.A.; Lamartine, B.C.

    1997-02-01

    Information technology has completely changed the concept of record keeping for engineering projects -- the advent of digital records was a momentous discovery, as significant as the invention of the printing press. Digital records allowed huge amounts of information to be stored in a very small space and to be examined quickly. However, digital documents are much more vulnerable to the passage of time than printed documents because the media on which they are stored are easily affected by physical phenomena, such as magnetic fields, oxidation, material decay, and by various environmental factors that may erase the information. Even more important, digital information becomes obsolete because, even if future generations may be able to read it, they may not necessarily be able to interpret it. Engineering projects of all sizes are becoming more dependent on digital records. These records are created on computers used in design, estimating, construction management, and construction. The necessity for the accurate and accessible storage of these documents, generated by computer software systems, is increasing for a number of reasons including legal and environmental issues. This paper will discuss media life considerations and life cycle costs associated with several methods of storing engineering records.

  11. Activity-based costing and its application in a Turkish university hospital.

    PubMed

    Yereli, Ayşe Necef

    2009-03-01

    Resource management in hospitals is of increasing importance in today's global economy. Traditional accounting systems have become inadequate for managing hospital resources and accurately determining service costs. Conversely, the activity-based costing approach to hospital accounting is an effective cost management model that determines costs and evaluates financial performance across departments. Obtaining costs that are more accurate can enable hospitals to analyze and interpret costing decisions and make more accurate budgeting decisions. Traditional and activity-based costing approaches were compared using a cost analysis of gall bladder surgeries in the general surgery department of one university hospital in Manisa, Turkey.
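
    To make the contrast concrete, the sketch below compares a single volume-based overhead rate with activity-based allocation. All activities, drivers, and figures are hypothetical and are not taken from the hospital study summarized above.

```python
# Illustrative contrast between traditional volume-based overhead allocation and
# activity-based costing (ABC). All activities, drivers, and figures are hypothetical;
# none of them come from the hospital study summarized above.

procedures = {"gall_bladder_surgery": {"theatre_hours": 2.0, "lab_tests": 6, "bed_days": 3},
              "appendectomy":         {"theatre_hours": 1.0, "lab_tests": 3, "bed_days": 2}}

# Traditional costing: spread all overhead using one driver (e.g., bed-days).
total_overhead = 500_000.0
total_bed_days = 2_500.0
traditional_rate = total_overhead / total_bed_days   # cost per bed-day

# ABC: pool overhead by activity and charge each procedure by its actual consumption.
activity_rates = {"theatre_hours": 120.0, "lab_tests": 15.0, "bed_days": 90.0}

for name, usage in procedures.items():
    traditional = traditional_rate * usage["bed_days"]
    abc = sum(activity_rates[act] * qty for act, qty in usage.items())
    print(f"{name}: traditional = {traditional:,.0f}, ABC = {abc:,.0f}")
```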

  12. 7 CFR 2903.4 - Indirect costs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... AGRICULTURE BIODIESEL FUEL EDUCATION PROGRAM General Information § 2903.4 Indirect costs. (a) For the Biodiesel Fuel Education Program, applicants should use the current indirect cost rate negotiated with...

  13. Health Care Costs for Patients With Chronic Spinal Cord Injury in the Veterans Health Administration

    PubMed Central

    French, Dustin D; Campbell, Robert R; Sabharwal, Sunil; Nelson, Audrey L; Palacios, Polly A; Gavin-Dreschnack, Deborah

    2007-01-01

    Background/Objective: Recurring annual costs of caring for patients with chronic spinal cord injury (SCI) is a large economic burden on health care systems, but information on costs of SCI care beyond the acute and initial postacute phase is sparse. The objective of this study was to establish a frame of reference and estimate of the annual direct medical costs associated with health care for a sample of patients with chronic SCI (ie, >2 years after injury). Methods: Patients were recruited from 3 Veterans Health Administration (VHA) SCI facilities; baseline patient information was cross-referenced to the Decision Support System (DSS) National Data Extracts (NDE) to obtain patient-specific health care costs in VHA. Descriptive statistical analysis of annual DSS-NDE cost of patients with SCI (N = 675) for fiscal year (FY) 2005 by level and completeness of injury was conducted. Results: Total (inpatient and outpatient) annual (FY 2005) direct medical costs for 675 patients with SCI exceeded $14.47 million or $21,450 per patient. Average annual total costs varied from $28,334 for cervical complete SCI to $16,792 for thoracic incomplete SCI. Two hundred thirty-three of the 675 patients with SCI who were hospitalized over the study period accounted for a total of 378 hospital discharges, costing in excess of $7.19 million. This approximated a cost of outpatient care received of $7.28 million for our entire sample. Conclusions: The comprehensive nature of health care delivery and related cost capture for people with chronic SCI in the VHA provided us the opportunity to accurately determine health care costs for this population. Future SCI postacute care cost analyses should consider case-mix adjusting patients at high risk for rehospitalization. PMID:18092564

  14. 78 FR 41415 - Information Collection Activities; Submitted for Office of Management and Budget (OMB) Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-10

    ... information to ensure that the volumes of hydrocarbons produced are measured accurately, and royalties are... cost burdens: Liquid Hydrocarbon Measurement, 1202(a)(1), (b)(1); 1203(b)(1); 1204(a)(1); submit application for liquid hydrocarbon or gas measurement procedures...

  15. 76 FR 35232 - Notice of Proposed Information Collection for Public Comment; Housing Choice Voucher Program...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-16

    ... Voucher (HCV) Program Administrative Fee study. The purpose of the study is to collect accurate information on the costs of administering the HCV program across a representative sample of high performing... for the HCV program. The study is proceeding in multiple phases. This request is for data collection...

  16. 76 FR 60854 - Notice of Submission of Proposed Information Collection to OMB; Housing Choice Voucher Program...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-30

    ... Housing Choice Voucher (HCV) Program Administrative Fee study. The purpose of the study is to collect accurate information on the costs of administering the HCV program across a representative sample of high... formula for the HCV program. The study is proceeding in multiple phases. This request is for...

  17. Equity flotation cost adjustments in utilities' cost of service

    SciTech Connect

    Bierman, H. Jr.; Hass, J.E.

    1984-03-01

    Recovery of the unavoidable costs of issuing new shares of stock is generally agreed to be appropriate in determining utility revenue requirements. This article suggests that the methods by which that is usually accomplished are of questionable accuracy. The conventional practice of adjusting the allowed rate of return on common equity is examined, and an improved adjustment formulation is presented. Acknowledging that application of the formula remains subject to considerable error, however, the authors propose yet another solution. Capitalization of flotation costs as intangible assets is suggested as a way of more accurately factoring such expenses into tariff determinations. 6 references.
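
    The conventional adjustment criticized above is commonly implemented as a dividend-growth (DCF) cost of equity in which the issue price is reduced by the flotation percentage. The sketch below shows that generic textbook form; it is not necessarily the exact formulation the authors examine, and the figures are purely illustrative.

```python
# Generic textbook form of the conventional flotation-cost adjustment discussed above:
# a dividend-growth (DCF) cost of equity with the issue price reduced by the flotation
# percentage f. Not necessarily the authors' exact formulation; figures are illustrative.

def cost_of_equity(dividend_next_year: float, price: float, growth: float,
                   flotation_pct: float = 0.0) -> float:
    """k_e = D1 / (P0 * (1 - f)) + g."""
    return dividend_next_year / (price * (1.0 - flotation_pct)) + growth

d1, p0, g = 2.00, 25.00, 0.05                             # hypothetical utility
unadjusted = cost_of_equity(d1, p0, g)                    # 13.0%
adjusted = cost_of_equity(d1, p0, g, flotation_pct=0.04)  # about 13.3%
print(f"Allowed return without flotation: {unadjusted:.2%}")
print(f"Allowed return with 4% flotation: {adjusted:.2%}")
```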

  18. The development of rapid and accurate screening test for RET hotspot somatic and germline mutations in MEN2 syndromes.

    PubMed

    Zupan, Andrej; Glavač, Damjan

    2015-12-01

    Medullary thyroid carcinoma (MTC) is a rare endocrine malignancy with distinctive features separating it from other thyroid cancers. Cancer may be sporadic or occur as a consequence of the hereditary syndrome called multiple endocrine neoplasia type 2 (MEN2) with three distinct phenotypes in MEN2A, MEN2B and FMTC. Each variant of MEN2 results from different RET gene mutations, with a good genotype-phenotype correlation. The goal of the study was to develop a fast and accurate screening method for a reliable detection of hot-spot RET germline and sporadic tumor mutations. From a cohort of 191 patients with MTC and their relatives, 38 who tested positive and 31 who tested negative for a germline or somatic tumor RET mutation were selected. A positive HRM mutation pattern was detected in all mutation-positive patients and altogether the method was able to clearly differentiate between twenty different genotypes. A novel germline variant p.Ala639Thr was detected in an MTC patient, which was determined to be likely benign. Analytical specificity was determined to be 98.6% and a sensitivity threshold was determined to be 30%. The fast and accurate HRM method reduces the turnaround time providing fast and important information, especially when targeted anti-tyrosine kinase therapy on tumor samples is considered. Overall, we developed a high-throughput, accurate and cost-effective approach for the detection of RET germline and sporadic tumor mutations.

  19. Effects of Price, Information, and Transactions Cost Interventions to Raise Voluntary Enrollment in a Social Health Insurance Scheme: A Randomized Experiment in the Philippines.

    PubMed

    Capuno, Joseph J; Kraft, Aleli D; Quimbo, Stella; Tan, Carlos R; Wagstaff, Adam

    2016-06-01

    A cluster randomized experiment was undertaken testing two sets of interventions encouraging enrollment in the Individually Paying Program (IPP), the voluntary component of the Philippines' social health insurance program. In early 2011, 1037 unenrolled IPP-eligible families in 179 randomly selected intervention municipalities were given an information kit and offered a 50% premium subsidy valid until the end of 2011; 383 IPP-eligible families in 64 control municipalities were not. In February 2012, the 787 families in the intervention sites who were still IPP-eligible but had not enrolled had their vouchers extended, were resent the enrollment kits and received SMS reminders. Half the group also received a 'handholding' intervention: in the endline interview, the enumerator offered to help complete the enrollment form, deliver it to the insurer's office in the provincial capital, and mail the membership cards. The main intervention raised the enrollment rate by 3 percentage points (ppts) (p = 0.11), with an 8 ppt larger effect (p < 0.01) among city-dwellers, consistent with travel time to the insurance office affecting enrollment. The handholding intervention raised enrollment by 29 ppts (p < 0.01), with a smaller effect (p < 0.01) among city-dwellers, likely because of shorter travel times, and higher education levels facilitating unaided completion of the enrollment form. Copyright © The World Bank Health Economics © 2015 John Wiley & Sons, Ltd.

  20. Costing blood products and services.

    PubMed

    Wallace, E L

    1991-05-01

    At present, blood centers and transfusion services have limited alternatives for offsetting the ever-rising costs of health care inputs. In the face of current revenue constraints, cost reduction or cost containment through efficiency improvements or service reduction is the principal available means. Such methods ought to be pursued vigorously by blood bankers with the aid of well-designed costing and other physical measurement systems. Experience indicates, however, that blood bankers, in their attempts to reduce or contain costs, are likely to place undue reliance on cost accounting systems as the means of capturing sought-for benefits. Management must learn enough about methods of costing to judge directly the uses and limitations of the information produced. Such understanding begins with recognition that all costs and cost comparisons should be specific to the purpose for which they are developed. No costing procedure is capable of producing measures generally applicable to all management decisions. A measure relevant to a planning decision is unlikely to be appropriate for performance evaluation. Useful comparisons among sets of organizations of costs, or of measures of physical inputs and outputs, require assurance that the methods of measurement employed are the same and that the sets of organizations from which the measures are drawn are reasonably comparable.

  1. Evaluating cost center productivity.

    PubMed

    DiJerome, L; Dunham-Taylor, J; Ash, D; Brown, R

    1999-01-01

    The monthly and yearly productivity summaries were developed and applied to a computer spreadsheet to aid the nurse manager in better understanding and communicating budget issues for diverse ambulatory care departments. A computerized spreadsheet using a commercially available personal computer program, such as Lotus, Quattro Pro, or Excel, can be used to more quickly and accurately track and summarize monthly budget reports. The data can be entered into the spreadsheet either manually or imported by query from the financial mainframe system. Contact your agency's finance or information department for information on how to accomplish this. Periodically acuity and resources should be measured and compared with quality monitors to maintain standards. For the past 10 years, our facility has successfully used this tool to make more informed decisions by identifying trouble spots early, and taking corrective action to avoid crisis management.

  2. Video distribution system cost model

    NASA Technical Reports Server (NTRS)

    Gershkoff, I.; Haspert, J. K.; Morgenstern, B.

    1980-01-01

    A cost model that can be used to systematically identify the costs of procuring and operating satellite linked communications systems is described. The user defines a network configuration by specifying the location of each participating site, the interconnection requirements, and the transmission paths available for the uplink (studio to satellite), downlink (satellite to audience), and voice talkback (between audience and studio) segments of the network. The model uses this information to calculate the least expensive signal distribution path for each participating site. Cost estimates are broken downy by capital, installation, lease, operations and maintenance. The design of the model permits flexibility in specifying network and cost structure.
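
    A minimal sketch of the per-site least-cost path selection described above follows; the cost categories mirror the abstract's breakdown (capital, installation, lease, operations and maintenance), while the option names and figures are hypothetical.

```python
# Sketch of the per-site least-cost path selection described above. Each candidate
# transmission option carries capital, installation, lease, and O&M costs; the model
# picks the cheapest total for every participating site. All figures are hypothetical.

def total_cost(option: dict, years: int = 5) -> float:
    """One-time costs plus recurring costs over the evaluation period."""
    return (option["capital"] + option["installation"]
            + years * (option["lease"] + option["o_and_m"]))

sites = {
    "site_A": [
        {"name": "C-band downlink", "capital": 12_000, "installation": 3_000,
         "lease": 1_500, "o_and_m": 800},
        {"name": "terrestrial microwave", "capital": 8_000, "installation": 5_000,
         "lease": 2_500, "o_and_m": 600},
    ],
}

for site, options in sites.items():
    best = min(options, key=total_cost)
    print(f"{site}: cheapest path is '{best['name']}' at ${total_cost(best):,.0f} over 5 years")
```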

  3. Light Field Imaging Based Accurate Image Specular Highlight Removal

    PubMed Central

    Wang, Haoqian; Xu, Chenxue; Wang, Xingzheng; Zhang, Yongbing; Peng, Bo

    2016-01-01

    Specular reflection removal is indispensable to many computer vision tasks. However, most existing methods fail or degrade in complex real scenarios for their individual drawbacks. Benefiting from the light field imaging technology, this paper proposes a novel and accurate approach to remove specularity and improve image quality. We first capture images with specularity by the light field camera (Lytro ILLUM). After accurately estimating the image depth, a simple and concise threshold strategy is adopted to cluster the specular pixels into “unsaturated” and “saturated” categories. Finally, a color variance analysis of multiple views and a local color refinement are individually conducted on the two categories to recover diffuse color information. Experimental evaluation by comparison with existing methods based on our light field dataset together with Stanford light field archive verifies the effectiveness of our proposed algorithm. PMID:27253083

  4. Accurate Development of Thermal Neutron Scattering Cross Section Libraries

    SciTech Connect

    Hawari, Ayman; Dunn, Michael

    2014-06-10

    The objective of this project is to develop a holistic (fundamental and accurate) approach for generating thermal neutron scattering cross section libraries for a collection of important neutron moderators and reflectors. The primary components of this approach are the physical accuracy and completeness of the generated data libraries. Consequently, for the first time, thermal neutron scattering cross section data libraries will be generated that are based on accurate theoretical models, that are carefully benchmarked against experimental and computational data, and that contain complete covariance information that can be used in propagating the data uncertainties through the various components of the nuclear design and execution process. To achieve this objective, computational and experimental investigations will be performed on a carefully selected subset of materials that play a key role in all stages of the nuclear fuel cycle.

  5. Personnel administration can act on cost containment.

    PubMed

    Adams, J

    1980-01-01

    Hospital personnel administration programs have the potential to contain costs in a variety of areas, including human resources management, payroll costs, benefits costs, labor turnover, and recruitment costs. Recognizing this, the Maine Society for Hospital Personnel Administration presented a seminar and prepared a guide for trustees and administrators on the role that personnel administration can play in the Voluntary Effort to contain costs. The following article is an adaptation of the information included in the seminar and guide.

  6. The economics of enteric infections: human foodborne disease costs.

    PubMed

    Buzby, Jean C; Roberts, Tanya

    2009-05-01

    The World Health Organization estimates that in 2005, 1.5 million people died, worldwide, from diarrheal diseases. A separate study estimated that 70% of diarrheal diseases are foodborne. The widely cited US estimate is that there are 76 million foodborne illnesses annually, resulting in 325,000 hospitalizations and 5200 deaths. However, there are epidemiologic and methodologic challenges to accurately estimate the economic burden of foodborne disease on society, either in terms of monetary costs or non-monetary units of measurement. Studies on the economic burden of foodborne disease vary considerably: some analyze the effects of a single pathogen or a single outbreak, whereas others attempt to estimate all foodborne disease in a country. Differences in surveillance systems, methodology, and other factors preclude meaningful comparisons across existing studies. However, if it were possible to completely estimate the societal costs for all acute foodborne diseases and their chronic sequelae worldwide, on the basis of currently available data, worldwide costs from these illnesses would be substantial. Moreover, foodborne infections are largely manifested as intestinal illnesses and are largely preventable. Total costs of foodborne disease would be much smaller in the United States and the world if economic incentives for industry to produce safer food were improved. However, costs of implementing new food safety prevention and control rules must be weighed against the estimated benefits of reducing foodborne disease to determine net benefits so that governments have information to efficiently allocate funds among competing programs.

  7. Efficient and Accurate Indoor Localization Using Landmark Graphs

    NASA Astrophysics Data System (ADS)

    Gu, F.; Kealy, A.; Khoshelham, K.; Shang, J.

    2016-06-01

    Indoor localization is important for a variety of applications such as location-based services, mobile social networks, and emergency response. Fusing spatial information is an effective way to achieve accurate indoor localization with little or no need for extra hardware. However, existing indoor localization methods that make use of spatial information are either too computationally expensive or too sensitive to the completeness of landmark detection. In this paper, we solve this problem by using the proposed landmark graph. The landmark graph is a directed graph where nodes are landmarks (e.g., doors, staircases, and turns) and edges are accessible paths with heading information. We compared the proposed method with two common Dead Reckoning (DR)-based methods (namely, Compass + Accelerometer + Landmarks and Gyroscope + Accelerometer + Landmarks) in a series of experiments. Experimental results show that the proposed method can achieve 73% accuracy with a positioning error less than 2.5 meters, which outperforms the other two DR-based methods.
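
    A minimal sketch of the landmark-graph structure described above is shown below: nodes are landmarks and directed edges are accessible paths annotated with heading and length, which can then be matched against a dead-reckoned heading. The landmark names, tolerance, and values are illustrative, not taken from the paper.

```python
# Minimal sketch of the landmark graph described above: nodes are landmarks
# (doors, staircases, turns) and directed edges are accessible paths annotated
# with heading and length. Names, tolerance, and values are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Edge:
    to: str
    heading_deg: float   # walking direction along the path
    length_m: float

@dataclass
class LandmarkGraph:
    edges: dict[str, list[Edge]] = field(default_factory=dict)

    def add_path(self, frm: str, to: str, heading_deg: float, length_m: float) -> None:
        self.edges.setdefault(frm, []).append(Edge(to, heading_deg, length_m))

    def candidates(self, frm: str, observed_heading_deg: float, tol: float = 30.0):
        """Landmarks reachable from `frm` whose edge heading matches the observed heading."""
        return [e for e in self.edges.get(frm, [])
                if abs((e.heading_deg - observed_heading_deg + 180) % 360 - 180) <= tol]

g = LandmarkGraph()
g.add_path("door_1", "staircase_A", heading_deg=90.0, length_m=12.5)
g.add_path("door_1", "turn_3", heading_deg=180.0, length_m=6.0)
print([e.to for e in g.candidates("door_1", observed_heading_deg=85.0)])
```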

  8. Laboratory cost control and financial management software.

    PubMed

    Mayer, M

    1998-02-09

    Economical constraints within the health care system advocate the introduction of tighter control of costs in clinical laboratories. Detailed cost information forms the basis for cost control and financial management. Based on the cost information, proper decisions regarding priorities, procedure choices, personnel policies and investments can be made. This presentation outlines some principles of cost analysis, describes common limitations of cost analysis, and exemplifies use of software to achieve optimized cost control. One commercially available cost analysis software, LabCost, is described in some detail. In addition to provision of cost information, LabCost also serves as a general management tool for resource handling, accounting, inventory management and billing. The application of LabCost in the selection process of a new high throughput analyzer for a large clinical chemistry service is taken as an example for decisions that can be assisted by cost evaluation. It is concluded that laboratory management that wisely utilizes cost analysis to support the decision-making process will undoubtedly have a clear advantage over those laboratories that fail to employ cost considerations to guide their actions.

  9. Understanding the cost of power interruptions to U.S. electricity consumers

    SciTech Connect

    LaCommare, Kristina Hamachi; Eto, Joseph H.

    2004-09-01

    The massive electric power blackout in the northeastern United States and Canada on August 14-15, 2003 resulted in the U.S. electricity system being called ''antiquated'' and catalyzed discussions about modernizing the grid. Industry sources suggested that investments of $50 to $100 billion would be needed. This report seeks to quantify an important piece of information that has been missing from these discussions: how much do power interruptions and fluctuations in power quality (power-quality events) cost U.S. electricity consumers? Accurately estimating this cost will help assess the potential benefits of investments in improving the reliability of the grid. We develop a comprehensive end-use framework for assessing the cost to U.S. electricity consumers of power interruptions and power-quality events (referred to collectively as ''reliability events''). The framework expresses these costs as a function of: (1) Number of customers by type in a region; (2) Frequency and type of reliability events experienced annually (including both power interruptions and power-quality events) by these customers; (3) Cost of reliability events; and (4) Vulnerability of customers to these events. The framework is designed so that its cost estimate can be improved as additional data become available. Using our framework, we estimate that the national cost of power interruptions is about $80 billion annually, based on the best information available in the public domain. However, there are large gaps in and significant uncertainties about the information currently available. Notably, we were not able to develop an estimate of power-quality events. Sensitivity analysis of some of these uncertainties suggests that the total annual cost could range from less than $30 billion to more than $130 billion. Because of this large range and the enormous cost of the decisions that may be based on this estimate, we encourage policy makers, regulators, and industry to jointly undertake the
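
    The four-factor framework enumerated above can be sketched directly: annual cost is accumulated over customer classes as customers × events per year × cost per event × vulnerability. The classes and figures below are invented placeholders, not the data behind the $80 billion estimate.

```python
# Sketch of the end-use framework enumerated above: annual interruption cost is
# accumulated over customer classes as (number of customers) x (events per year)
# x (cost per event) x (vulnerability factor). The classes and numbers below are
# invented placeholders, not the values used in the report.

customer_classes = [
    # name,          customers,   events/yr, cost per event ($), vulnerability (0-1)
    ("residential",  100_000_000, 1.5,          5.0,             0.8),
    ("commercial",    15_000_000, 1.5,        900.0,             0.9),
    ("industrial",       700_000, 1.5,      8_000.0,             0.9),
]

total = sum(n * freq * cost * vuln for _, n, freq, cost, vuln in customer_classes)
print(f"Illustrative national interruption cost: ${total / 1e9:.1f} billion per year")
```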

  10. 36 CFR 1202.26 - Who will make sure that my record is accurate?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... RECORDS ADMINISTRATION GENERAL RULES REGULATIONS IMPLEMENTING THE PRIVACY ACT OF 1974 Collecting Information § 1202.26 Who will make sure that my record is accurate? The system manager ensures that...

  11. Step-by-step guideline for disease-specific costing studies in low- and middle-income countries: a mixed methodology

    PubMed Central

    Hendriks, Marleen E.; Kundu, Piyali; Boers, Alexander C.; Bolarinwa, Oladimeji A.; te Pas, Mark J.; Akande, Tanimola M.; Agbede, Kayode; Gomez, Gabriella B.; Redekop, William K.; Schultsz, Constance; Tan, Siok Swan

    2014-01-01

    Background Disease-specific costing studies can be used as input into cost-effectiveness analyses and provide important information for efficient resource allocation. However, limited data availability and limited expertise constrain such studies in low- and middle-income countries (LMICs). Objective To describe a step-by-step guideline for conducting disease-specific costing studies in LMICs where data availability is limited and to illustrate how the guideline was applied in a costing study of cardiovascular disease prevention care in rural Nigeria. Design The step-by-step guideline provides practical recommendations on methods and data requirements for six sequential steps: 1) definition of the study perspective, 2) characterization of the unit of analysis, 3) identification of cost items, 4) measurement of cost items, 5) valuation of cost items, and 6) uncertainty analyses. Results We discuss the necessary tradeoffs between the accuracy of estimates and data availability constraints at each step and illustrate how a mixed methodology of accurate bottom-up micro-costing and more feasible approaches can be used to make optimal use of all available data. An illustrative example from Nigeria is provided. Conclusions An innovative, user-friendly guideline for disease-specific costing in LMICs is presented, using a mixed methodology to account for limited data availability. The illustrative example showed that the step-by-step guideline can be used by healthcare professionals in LMICs to conduct feasible and accurate disease-specific cost analyses. PMID:24685170
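
    In the spirit of steps 3-5 of the guideline (identify, measure, and value cost items), the sketch below assembles a bottom-up cost per patient and adds a crude sensitivity range for step 6. The cost items, quantities, and unit prices are hypothetical, not the Nigerian study's data.

```python
# Minimal bottom-up micro-costing sketch in the spirit of steps 3-5 above:
# identify cost items, measure the quantity consumed per patient, and value each
# item with a unit price. Items, quantities, and prices are hypothetical.

cost_items = [
    # item,                             quantity per patient-year, unit price (US$)
    ("nurse consultation",              4,                          3.50),
    ("physician review",                1,                         12.00),
    ("antihypertensive drugs (months)", 12,                         2.25),
    ("laboratory tests",                2,                          6.00),
]

annual_cost_per_patient = sum(qty * price for _, qty, price in cost_items)
print(f"Annual cost per patient: ${annual_cost_per_patient:.2f}")

# Step 6 (uncertainty): vary the least certain unit price, e.g. +/-25% on drug costs.
low = annual_cost_per_patient - 0.25 * 12 * 2.25
high = annual_cost_per_patient + 0.25 * 12 * 2.25
print(f"Sensitivity range: ${low:.2f} - ${high:.2f}")
```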

  12. Feedback about More Accurate versus Less Accurate Trials: Differential Effects on Self-Confidence and Activation

    ERIC Educational Resources Information Center

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-01-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On Day 1, participants performed a golf putting task under one of…

  13. Accurate Guitar Tuning by Cochlear Implant Musicians

    PubMed Central

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081

  14. New model accurately predicts reformate composition

    SciTech Connect

    Ancheyta-Juarez, J.; Aguilar-Rodriguez, E. )

    1994-01-31

    Although naphtha reforming is a well-known process, the evolution of catalyst formulation, as well as new trends in gasoline specifications, have led to rapid evolution of the process, including: reactor design, regeneration mode, and operating conditions. Mathematical modeling of the reforming process is an increasingly important tool. It is fundamental to the proper design of new reactors and revamp of existing ones. Modeling can be used to optimize operating conditions, analyze the effects of process variables, and enhance unit performance. Instituto Mexicano del Petroleo has developed a model of the catalytic reforming process that accurately predicts reformate composition at the higher-severity conditions at which new reformers are being designed. The new AA model is more accurate than previous proposals because it takes into account the effects of temperature and pressure on the rate constants of each chemical reaction.

  15. Accurate colorimetric feedback for RGB LED clusters

    NASA Astrophysics Data System (ADS)

    Man, Kwong; Ashdown, Ian

    2006-08-01

    We present an empirical model of LED emission spectra that is applicable to both InGaN and AlInGaP high-flux LEDs, and which accurately predicts their relative spectral power distributions over a wide range of LED junction temperatures. We further demonstrate with laboratory measurements that changes in LED spectral power distribution with temperature can be accurately predicted with first- or second-order equations. This provides the basis for a real-time colorimetric feedback system for RGB LED clusters that can maintain the chromaticity of white light at constant intensity to within +/-0.003 Δuv over a range of 45 degrees Celsius, and to within 0.01 Δuv when dimmed over an intensity range of 10:1.

  16. Accurate guitar tuning by cochlear implant musicians.

    PubMed

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task.

  17. Accurate documentation in cultural heritage by merging TLS and high-resolution photogrammetric data

    NASA Astrophysics Data System (ADS)

    Grussenmeyer, Pierre; Alby, Emmanuel; Assali, Pierre; Poitevin, Valentin; Hullo, Jean-François; Smigiel, Eddie

    2011-07-01

    Several recording techniques are used together in Cultural Heritage Documentation projects. The main purpose of the documentation and conservation works is usually to generate geometric and photorealistic 3D models for both accurate reconstruction and visualization purposes. The recording approach discussed in this paper is based on the combination of photogrammetric dense matching and Terrestrial Laser Scanning (TLS) techniques. Both techniques have pros and cons, and criteria such as geometry, texture, accuracy, resolution, recording and processing time are often compared. TLS techniques (time of flight or phase shift systems) are often used for the recording of large and complex objects or sites. Point cloud generation from images by dense stereo or multi-image matching can be used as an alternative or a complementary method to TLS. Compared to TLS, the photogrammetric solution is a low cost one as the acquisition system is limited to a digital camera and a few accessories only. Indeed, the stereo matching process offers a cheap, flexible and accurate solution to get 3D point clouds and textured models. The calibration of the camera allows the processing of distortion free images, accurate orientation of the images, and matching at the subpixel level. The main advantage of this photogrammetric methodology is to get at the same time a point cloud (the resolution depends on the size of the pixel on the object), and therefore an accurate meshed object with its texture. After the matching and processing steps, we can use the resulting data in much the same way as a TLS point cloud, but with really better raster information for textures. The paper will address the automation of recording and processing steps, the assessment of the results, and the deliverables (e.g. PDF-3D files). Visualization aspects of the final 3D models are presented. Two case studies with merged photogrammetric and TLS data are finally presented: - The Gallo-Roman Theatre of Mandeure (France); - The

  18. An Accurate, Simplified Model Intrabeam Scattering

    SciTech Connect

    Bane, Karl LF

    2002-05-23

    Beginning with the general Bjorken-Mtingwa solution for intrabeam scattering (IBS) we derive an accurate, greatly simplified model of IBS, valid for high energy beams in normal storage ring lattices. In addition, we show that, under the same conditions, a modified version of Piwinski's IBS formulation (where η_{x,y}²/β_{x,y} has been replaced by H_{x,y}) asymptotically approaches the result of Bjorken-Mtingwa.

  19. An accurate registration technique for distorted images

    NASA Technical Reports Server (NTRS)

    Delapena, Michele; Shaw, Richard A.; Linde, Peter; Dravins, Dainis

    1990-01-01

    Accurate registration of International Ultraviolet Explorer (IUE) images is crucial because the variability of the geometrical distortions that are introduced by the SEC-Vidicon cameras ensures that raw science images are never perfectly aligned with the Intensity Transfer Functions (ITFs) (i.e., graded floodlamp exposures that are used to linearize and normalize the camera response). A technique for precisely registering IUE images which uses a cross correlation of the fixed pattern that exists in all raw IUE images is described.

  20. On accurate determination of contact angle

    NASA Technical Reports Server (NTRS)

    Concus, P.; Finn, R.

    1992-01-01

    Methods are proposed that exploit a microgravity environment to obtain highly accurate measurement of contact angle. These methods, which are based on our earlier mathematical results, do not require detailed measurement of a liquid free-surface, as they incorporate discontinuous or nearly-discontinuous behavior of the liquid bulk in certain container geometries. Physical testing is planned in the forthcoming IML-2 space flight and in related preparatory ground-based experiments.

  1. Considering value of information when using CFD in design

    SciTech Connect

    Misra, John Satprim

    2009-01-01

    This thesis presents an approach to find lower resolution CFD models that can accurately lead a designer to a correct decision at a lower computational cost. High-fidelity CFD models often contain too much information and come at a higher computational cost, limiting the designs a designer can test and how much optimization can be performed on the design. Lower model resolution is commonly used to reduce computational time. However, there are no clear guidelines on how much model accuracy is required. Instead, experience and intuition are used to select an appropriate lower resolution model. This thesis presents an alternative to this ad hoc method by considering the added value of the additional information provided by increasingly accurate and more computationally expensive models.

  2. Advanced Fuel Cycle Cost Basis

    SciTech Connect

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2009-12-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  3. Advanced Fuel Cycle Cost Basis

    SciTech Connect

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert

    2007-04-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 26 cost modules—24 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, and high-level waste.

  4. Advanced Fuel Cycle Cost Basis

    SciTech Connect

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2008-03-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  5. Underestimation of Project Costs

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    Large projects almost always exceed their budgets. Estimating cost is difficult and estimated costs are usually too low. Three different reasons are suggested: bad luck, overoptimism, and deliberate underestimation. Project management can usually point to project difficulty and complexity, technical uncertainty, stakeholder conflicts, scope changes, unforeseen events, and other not really unpredictable bad luck. Project planning is usually over-optimistic, so the likelihood and impact of bad luck are systematically underestimated. Project plans reflect optimism and hope for success in a supposedly unique new effort rather than rational expectations based on historical data. Past project problems are claimed to be irrelevant because "This time it's different." Some bad luck is inevitable and reasonable optimism is understandable, but deliberate deception must be condemned. In a competitive environment, project planners and advocates often deliberately underestimate costs to help gain project approval and funding. Project benefits, cost savings, and probability of success are exaggerated and key risks ignored. Project advocates have incentives to distort information and conceal difficulties from project approvers. One naively suggested cure is more openness, honesty, and group adherence to shared overall goals. A more realistic alternative is threatening overrun projects with cancellation. Neither approach seems to solve the problem. A better method to avoid the delusions of over-optimism and the deceptions of biased advocacy is to base the project cost estimate on the actual costs of a large group of similar projects. Over-optimism and deception can continue beyond the planning phase and into project execution. Hard milestones based on verified tests and demonstrations can provide a reality check.
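
    The closing recommendation, basing the estimate on the actual costs of a large group of similar projects, can be made concrete with a short sketch. The planned cost and overrun ratios below are invented for illustration and are not taken from the report.

```python
# Minimal sketch of reference-class-based estimation: instead of trusting a
# bottom-up plan, scale it by the overrun distribution of similar past
# projects. All numbers are hypothetical.
import statistics

planned_cost = 100.0   # the project's own (optimistic) estimate, in $M

# actual/planned cost ratios from a hypothetical reference class of projects
historical_overruns = [1.05, 1.20, 1.35, 1.10, 1.60, 1.25, 1.45, 1.15, 1.80, 1.30]
historical_overruns.sort()

median_uplift = statistics.median(historical_overruns)
p80_uplift = historical_overruns[int(0.8 * len(historical_overruns)) - 1]

print(f"median-based estimate : {planned_cost * median_uplift:.1f} $M")
print(f"80th-percentile budget: {planned_cost * p80_uplift:.1f} $M")
```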

  6. 38 CFR 17.260 - Patient care costs to be excluded from direct costs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... VETERANS AFFAIRS MEDICAL Grants for Exchange of Information § 17.260 Patient care costs to be excluded from direct costs. Grant funds for planning or implementing agreements for the exchange of medical information shall not be available for the payment of any hospital, medical, or other costs involving the care...

  7. 38 CFR 17.260 - Patient care costs to be excluded from direct costs.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... VETERANS AFFAIRS MEDICAL Grants for Exchange of Information § 17.260 Patient care costs to be excluded from direct costs. Grant funds for planning or implementing agreements for the exchange of medical information shall not be available for the payment of any hospital, medical, or other costs involving the care...

  8. 38 CFR 17.260 - Patient care costs to be excluded from direct costs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... VETERANS AFFAIRS MEDICAL Grants for Exchange of Information § 17.260 Patient care costs to be excluded from direct costs. Grant funds for planning or implementing agreements for the exchange of medical information shall not be available for the payment of any hospital, medical, or other costs involving the care...

  9. 38 CFR 17.260 - Patient care costs to be excluded from direct costs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... VETERANS AFFAIRS MEDICAL Grants for Exchange of Information § 17.260 Patient care costs to be excluded from direct costs. Grant funds for planning or implementing agreements for the exchange of medical information shall not be available for the payment of any hospital, medical, or other costs involving the care...

  10. 48 CFR 1652.215-70 - Rate Reduction for Defective Pricing or Defective Cost or Pricing Data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... that were not complete, accurate, or current as certified in the Certificate of Accurate Cost or... accurate, complete, and current cost or pricing data had been submitted or maintained and identified. (ii... submit or keep in its files a Certificate of Current Cost or Pricing Data. (2)(i) Except as prohibited...

  11. An Inexpensive and Accurate Tensiometer Using an Electronic Balance

    NASA Astrophysics Data System (ADS)

    Dolz, Manuel; Delegido, Jesús; Hernández, María-Jesús; Pellicer, Julio

    2001-09-01

    A method for measuring the surface tension of liquid-air interfaces, consisting of a modification of the du Noüy tensiometer, is proposed. An electronic balance is used to determine the detachment force with high resolution, and the relative displacement between the ring/plate and the liquid surface is produced by the descent of the free surface of the liquid. The procedure familiarizes undergraduate students in applied science and technology with the experimental study of surface tension by means of a simple and accurate method that offers the advantages of sophisticated devices at considerably less cost. The operational aspects that must be taken into account are analyzed: the measuring system and determination of its effective length, measurement of the detachment force, and the relative system-liquid interface displacement rate. To check the accuracy of the proposed tensiometer, measurements of the surface tension of different known liquids have been performed, and good agreement with results reported in the literature was obtained.

  12. Accurate documentation, correct coding, and compliance: it's your best defense!

    PubMed

    Coles, T S; Babb, E F

    1999-07-01

    This article focuses on the need for physicians to maintain an awareness of regulatory policy and the law impacting the federal government's medical insurance programs, and to internalize and apply this knowledge in their practices. Basic information concerning selected fraud and abuse statutes and the civil monetary penalties and sanctions for noncompliance is discussed. The application of accurate documentation and correct coding principles, as well as the rationale for implementing an effective compliance plan in order to prevent fraud and abuse and/or minimize disciplinary action from government regulatory agencies, are emphasized.

  13. High Frequency QRS ECG Accurately Detects Cardiomyopathy

    NASA Technical Reports Server (NTRS)

    Schlegel, Todd T.; Arenare, Brian; Poulin, Gregory; Moser, Daniel R.; Delgado, Reynolds

    2005-01-01

    High frequency (HF, 150-250 Hz) analysis over the entire QRS interval of the ECG is more sensitive than conventional ECG for detecting myocardial ischemia. However, the accuracy of HF QRS ECG for detecting cardiomyopathy is unknown. We obtained simultaneous resting conventional and HF QRS 12-lead ECGs in 66 patients with cardiomyopathy (EF = 23.2 plus or minus 6.1%, mean plus or minus SD) and in 66 age- and gender-matched healthy controls using PC-based ECG software recently developed at NASA. The single most accurate ECG parameter for detecting cardiomyopathy was an HF QRS morphological score that takes into consideration the total number and severity of reduced amplitude zones (RAZs) present plus the clustering of RAZs together in contiguous leads. This RAZ score had an area under the receiver operator curve (ROC) of 0.91, and was 88% sensitive, 82% specific and 85% accurate for identifying cardiomyopathy at optimum score cut-off of 140 points. Although conventional ECG parameters such as the QRS and QTc intervals were also significantly longer in patients than controls (P less than 0.001, BBBs excluded), these conventional parameters were less accurate (area under the ROC = 0.77 and 0.77, respectively) than HF QRS morphological parameters for identifying underlying cardiomyopathy. The total amplitude of the HF QRS complexes, as measured by summed root mean square voltages (RMSVs), also differed between patients and controls (33.8 plus or minus 11.5 vs. 41.5 plus or minus 13.6 mV, respectively, P less than 0.003), but this parameter was even less accurate in distinguishing the two groups (area under ROC = 0.67) than the HF QRS morphologic and conventional ECG parameters. Diagnostic accuracy was optimal (86%) when the RAZ score from the HF QRS ECG and the QTc interval from the conventional ECG were used simultaneously with cut-offs of greater than or equal to 40 points and greater than or equal to 445 ms, respectively. In conclusion 12-lead HF QRS ECG employing

  14. Parametric Cost Deployment

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1995-01-01

    Parametric cost analysis is a mathematical approach to estimating cost. Parametric cost analysis uses non-cost parameters, such as quality characteristics, to estimate the cost to bring forth, sustain, and retire a product. This paper reviews parametric cost analysis and shows how it can be used within the cost deployment process.
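
    A minimal sketch of what such a parametric estimate can look like is shown below; the power-law cost estimating relationship, its coefficients, and the learning-curve slope are all hypothetical and only illustrate the form of the method, not any specific NASA model.

```python
import math

# Illustrative parametric cost estimating relationship (CER): unit cost as a
# power law of a non-cost parameter (here, dry mass), plus a learning-curve
# factor for later production units. All coefficients are hypothetical.
def first_unit_cost(dry_mass_kg, a=0.5, b=0.85):
    """Theoretical first-unit cost in $M: a * mass**b (hypothetical CER)."""
    return a * dry_mass_kg ** b

def unit_cost(n, t1_cost, learning=0.90):
    """Cost of the n-th unit under a Crawford (unit) learning curve."""
    return t1_cost * n ** math.log(learning, 2)   # 90% curve -> exponent ~ -0.152

t1 = first_unit_cost(450.0)                        # hypothetical 450 kg item
buy_of_five = sum(unit_cost(n, t1) for n in range(1, 6))
print(f"first unit: {t1:.1f} $M, five-unit buy: {buy_of_five:.1f} $M")
```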

  15. Accurate thermoelastic tensor and acoustic velocities of NaCl

    NASA Astrophysics Data System (ADS)

    Marcondes, Michel L.; Shukla, Gaurav; da Silveira, Pedro; Wentzcovitch, Renata M.

    2015-12-01

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  16. Accurate thermoelastic tensor and acoustic velocities of NaCl

    SciTech Connect

    Marcondes, Michel L.; Shukla, Gaurav; Silveira, Pedro da; Wentzcovitch, Renata M.

    2015-12-15

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  17. Accurate 3D quantification of the bronchial parameters in MDCT

    NASA Astrophysics Data System (ADS)

    Saragaglia, A.; Fetita, C.; Preteux, F.; Brillet, P. Y.; Grenier, P. A.

    2005-08-01

    The assessment of bronchial reactivity and wall remodeling in asthma plays a crucial role in better understanding such a disease and evaluating therapeutic responses. Today, multi-detector computed tomography (MDCT) makes it possible to perform an accurate estimation of bronchial parameters (lumen and wall areas) by allowing a quantitative analysis in a cross-section plane orthogonal to the bronchus axis. This paper provides the tools for such an analysis by developing a 3D investigation method which relies on 3D reconstruction of the bronchial lumen and central axis computation. Cross-section images at bronchial locations interactively selected along the central axis are generated at appropriate spatial resolution. An automated approach is then developed for accurately segmenting the inner and outer bronchi contours on the cross-section images. It combines mathematical morphology operators, such as "connection cost", and energy-controlled propagation in order to overcome the difficulties raised by vessel adjacencies and wall irregularities. The segmentation accuracy was validated with respect to a 3D mathematically-modeled phantom of a bronchus-vessel pair, which mimics the characteristics of real data in terms of gray-level distribution, caliber and orientation. When applying the developed quantification approach to such a model with calibers ranging from 3 to 10 mm diameter, the lumen area relative errors varied from 3.7% to 0.15%, while the bronchus area was estimated with a relative error less than 5.1%.

  18. ACCURATE CHEMICAL MASTER EQUATION SOLUTION USING MULTI-FINITE BUFFERS

    PubMed Central

    Cao, Youfang; Terebus, Anna; Liang, Jie

    2016-01-01

    The discrete chemical master equation (dCME) provides a fundamental framework for studying stochasticity in mesoscopic networks. Because of the multi-scale nature of many networks where reaction rates have large disparity, directly solving dCMEs is intractable due to the exploding size of the state space. It is important to truncate the state space effectively with quantified errors, so accurate solutions can be computed. It is also important to know if all major probabilistic peaks have been computed. Here we introduce the Accurate CME (ACME) algorithm for obtaining direct solutions to dCMEs. With multi-finite buffers for reducing the state space by O(n!), exact steady-state and time-evolving network probability landscapes can be computed. We further describe a theoretical framework of aggregating microstates into a smaller number of macrostates by decomposing a network into independent aggregated birth and death processes, and give an a priori method for rapidly determining steady-state truncation errors. The maximal sizes of the finite buffers for a given error tolerance can also be pre-computed without costly trial solutions of dCMEs. We show exactly computed probability landscapes of three multi-scale networks, namely, a 6-node toggle switch, 11-node phage-lambda epigenetic circuit, and 16-node MAPK cascade network, the latter two with no known solutions. We also show how probabilities of rare events can be computed from first-passage times, another class of unsolved problems challenging for simulation-based techniques due to large separations in time scales. Overall, the ACME method enables accurate and efficient solutions of the dCME for a large class of networks. PMID:27761104

  19. ACCURATE CHEMICAL MASTER EQUATION SOLUTION USING MULTI-FINITE BUFFERS.

    PubMed

    Cao, Youfang; Terebus, Anna; Liang, Jie

    2016-01-01

    The discrete chemical master equation (dCME) provides a fundamental framework for studying stochasticity in mesoscopic networks. Because of the multi-scale nature of many networks where reaction rates have large disparity, directly solving dCMEs is intractable due to the exploding size of the state space. It is important to truncate the state space effectively with quantified errors, so accurate solutions can be computed. It is also important to know if all major probabilistic peaks have been computed. Here we introduce the Accurate CME (ACME) algorithm for obtaining direct solutions to dCMEs. With multi-finite buffers for reducing the state space by O(n!), exact steady-state and time-evolving network probability landscapes can be computed. We further describe a theoretical framework of aggregating microstates into a smaller number of macrostates by decomposing a network into independent aggregated birth and death processes, and give an a priori method for rapidly determining steady-state truncation errors. The maximal sizes of the finite buffers for a given error tolerance can also be pre-computed without costly trial solutions of dCMEs. We show exactly computed probability landscapes of three multi-scale networks, namely, a 6-node toggle switch, 11-node phage-lambda epigenetic circuit, and 16-node MAPK cascade network, the latter two with no known solutions. We also show how probabilities of rare events can be computed from first-passage times, another class of unsolved problems challenging for simulation-based techniques due to large separations in time scales. Overall, the ACME method enables accurate and efficient solutions of the dCME for a large class of networks.

  20. Improving the accuracy of admitted subacute clinical costing: an action research approach.

    PubMed

    Hakkennes, Sharon; Arblaster, Ross; Lim, Kim

    2016-08-29

    Objective: The aim of the present study was to determine whether action research could be used to improve the breadth and accuracy of clinical costing data in an admitted subacute setting. Methods: The setting was a 100-bed in-patient rehabilitation centre. Using a pre-post study design, all admitted subacute separations during the 2011-12 financial year were eligible for inclusion. An action research framework aimed at improving clinical costing methodology was developed and implemented. Results: In all, 1499 separations were included in the study. A medical record audit of a random selection of 80 separations demonstrated that the use of an action research framework was effective in improving the breadth and accuracy of the costing data. This was evidenced by a significant increase in the average number of activities costed, a reduction in the average number of activities incorrectly costed and a reduction in the average number of activities missing from the costing, per episode of care. Conclusions: Engaging clinicians and cost centre managers was effective in facilitating the development of robust clinical costing data in an admitted subacute setting. Further investigation into the value of this approach across other care types and healthcare services is warranted. What is known about this topic? Accurate clinical costing data is essential for informing price models used in activity-based funding. In Australia, there is currently a lack of robust admitted subacute cost data to inform the price model for this care type. What does this paper add? The action research framework presented in this study was effective in improving the breadth and accuracy of clinical costing data in an admitted subacute setting. What are the implications for practitioners? To improve clinical costing practices, health services should consider engaging key stakeholders, including clinicians and cost centre managers, in reviewing clinical costing methodology. Robust clinical costing data has the

  1. Contribution of imaging to cancer care costs.

    PubMed

    Yang, Yang; Czernin, Johannes

    2011-12-01

    Health care costs in the United States are increasing faster than the gross domestic product (GDP), and the growth rate of costs related to diagnostic imaging exceeds those of overall health care expenditures. Here we show that the contribution of imaging to cancer care costs pales in comparison to those of other key cost components, such as cancer drugs. Specifically, we estimate that (18)F-FDG PET or PET/CT accounted for approximately 1.5% of overall Medicare cancer care costs in 2009. Moreover, we propose that the appropriate use of (18)F-FDG PET or PET/CT could reduce the costs of cancer care. Because the U.S. health care system is complex and because it is difficult to find accurate data elsewhere, most cost and use assessments are based on published data from the U.S. Centers for Medicare & Medicaid Services.

  2. Bi-fluorescence imaging for estimating accurately the nuclear condition of Rhizoctonia spp.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Aims: To simplify the determination of the nuclear condition of the pathogenic Rhizoctonia, which currently needs to be performed either with two fluorescent dyes, which is more costly and time-consuming, or with only one fluorescent dye, which is less accurate. Methods and Results: A red primary ...

  3. Time-Driven Activity-Based Costing in Emergency Medicine.

    PubMed

    Yun, Brian J; Prabhakar, Anand M; Warsh, Jonathan; Kaplan, Robert; Brennan, John; Dempsey, Kyle E; Raja, Ali S

    2016-06-01

    Value in emergency medicine is determined by both patient-important outcomes and the costs associated with achieving them. However, measuring true costs is challenging. Without an understanding of costs, emergency department (ED) leaders will be unable to determine which interventions might improve value for their patients. Although ongoing research may determine which outcomes are meaningful, an accurate costing system is also needed. This article reviews current costing mechanisms in the ED and their pitfalls. It then describes how time-driven activity-based costing may be superior to these current costing systems. Time-driven activity-based costing, in addition to being a more accurate costing system, can be used for process improvements in the ED.
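
    A minimal sketch of the time-driven activity-based costing arithmetic is given below; the capacity cost rates, practical capacities, and per-visit minutes are hypothetical and are not taken from the article.

```python
# Minimal time-driven activity-based costing (TDABC) sketch for one ED visit.
# All figures are hypothetical.
resources = {
    # resource: (cost of capacity supplied per month in $, practical capacity in minutes)
    "physician":  (30000, 8000),
    "nurse":      (9000, 9000),
    "ct_scanner": (40000, 20000),
}

# minutes of each resource consumed by one (hypothetical) chest-pain visit
visit_minutes = {"physician": 25, "nurse": 45, "ct_scanner": 15}

visit_cost = 0.0
for resource, minutes in visit_minutes.items():
    monthly_cost, capacity_min = resources[resource]
    rate = monthly_cost / capacity_min           # capacity cost rate, $/minute
    visit_cost += rate * minutes
    print(f"{resource:10s} {rate:5.2f} $/min x {minutes:3d} min")

print(f"estimated visit cost: ${visit_cost:.2f}")
```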

  4. Accurate upwind methods for the Euler equations

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1993-01-01

    A new class of piecewise linear methods for the numerical solution of the one-dimensional Euler equations of gas dynamics is presented. These methods are uniformly second-order accurate, and can be considered as extensions of Godunov's scheme. With an appropriate definition of monotonicity preservation for the case of linear convection, it can be shown that they preserve monotonicity. Similar to Van Leer's MUSCL scheme, they consist of two key steps: a reconstruction step followed by an upwind step. For the reconstruction step, a monotonicity constraint that preserves uniform second-order accuracy is introduced. Computational efficiency is enhanced by devising a criterion that detects the 'smooth' part of the data where the constraint is redundant. The concept and coding of the constraint are simplified by the use of the median function. A slope steepening technique, which has no effect at smooth regions and can resolve a contact discontinuity in four cells, is described. As for the upwind step, existing and new methods are applied in a manner slightly different from those in the literature. These methods are derived by approximating the Euler equations via linearization and diagonalization. At a 'smooth' interface, Harten, Lax, and Van Leer's one intermediate state model is employed. A modification for this model that can resolve contact discontinuities is presented. Near a discontinuity, either this modified model or a more accurate one, namely, Roe's flux-difference splitting, is used. The current presentation of Roe's method, via the conceptually simple flux-vector splitting, not only establishes a connection between the two splittings, but also leads to an admissibility correction with no conditional statement, and an efficient approximation to Osher's approximate Riemann solver. These reconstruction and upwind steps result in schemes that are uniformly second-order accurate and economical at smooth regions, and yield high resolution at discontinuities.
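
    The reconstruction step can be illustrated with a short sketch of a median-based slope limiter applied to cell-averaged data; this is a generic minmod-style constraint expressed through the median function, not necessarily the exact constraint derived in the paper.

```python
# Sketch of a piecewise-linear (MUSCL-type) reconstruction whose slopes are
# limited with the median function, so the reconstruction stays monotone.
# This is a generic minmod-style constraint, shown only for illustration.
import numpy as np

def median3(a, b, c):
    """Element-wise median of three arrays."""
    return a + b + c - np.minimum(np.minimum(a, b), c) - np.maximum(np.maximum(a, b), c)

def limited_slopes(u):
    """Limited slopes for the interior cells of a 1-D array of cell averages."""
    fwd = u[2:] - u[1:-1]     # forward differences
    bwd = u[1:-1] - u[:-2]    # backward differences
    # median(fwd, 0, bwd) equals minmod(fwd, bwd): zero at extrema,
    # otherwise the smaller-magnitude one-sided difference
    return median3(fwd, np.zeros_like(fwd), bwd)

u = np.array([1.0, 1.0, 1.0, 0.2, 0.0, 0.0])
slopes = limited_slopes(u)
left_face  = u[1:-1] - 0.5 * slopes    # reconstructed value at each cell's left face
right_face = u[1:-1] + 0.5 * slopes    # reconstructed value at each cell's right face
print(slopes)                          # -> [ 0.   0.  -0.2  0. ]
```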

  5. Accurate measurement of unsteady state fluid temperature

    NASA Astrophysics Data System (ADS)

    Jaremkiewicz, Magdalena

    2017-03-01

    In this paper, two accurate methods for determining the transient fluid temperature are presented. Measurements were conducted for boiling water since its temperature is known. At the beginning, the thermometers are at ambient temperature; they are then immediately immersed into saturated water. The measurements were carried out with two thermometers of different construction but with the same housing outer diameter equal to 15 mm. One of them is a K-type industrial thermometer widely available commercially. The temperature indicated by the thermometer was corrected considering the thermometer as a first- or second-order inertia device. A new design of thermometer was proposed and also used to measure the temperature of boiling water. Its characteristic feature is a cylinder-shaped housing with the sheath thermocouple located in its center. The temperature of the fluid was determined based on measurements taken in the axis of the solid cylindrical element (housing) using the inverse space marching method. Measurements of the transient temperature of the air flowing through the wind tunnel using the same thermometers were also carried out. The proposed measurement technique provides more accurate results compared with measurements using industrial thermometers in conjunction with simple temperature correction using a first- or second-order inertial thermometer model. By comparing the results, it was demonstrated that the new thermometer allows obtaining the fluid temperature much faster and with higher accuracy in comparison to the industrial thermometer. Accurate measurements of the fast-changing fluid temperature are possible due to the low-inertia thermometer and the fast space marching method applied for solving the inverse heat conduction problem.
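
    The simple first-order inertia correction mentioned above can be sketched as follows; the time constant and the synthetic step-response data are assumptions used only for illustration.

```python
# Sketch of a first-order inertia correction: the fluid temperature is
# recovered from the indicated temperature as T_fluid(t) ~ T_ind(t) + tau * dT_ind/dt.
# The time constant and synthetic data are assumptions for illustration only.
import numpy as np

tau = 4.0                                    # thermometer time constant in s (assumed)
t = np.linspace(0.0, 30.0, 301)
T_indicated = 20.0 + (100.0 - 20.0) * (1.0 - np.exp(-t / tau))   # first-order response to boiling water

dTdt = np.gradient(T_indicated, t)           # finite-difference derivative of the reading
T_corrected = T_indicated + tau * dTdt       # corrected (reconstructed) fluid temperature

print(f"indicated at t=5 s : {np.interp(5.0, t, T_indicated):6.1f} C")
print(f"corrected at t=5 s : {np.interp(5.0, t, T_corrected):6.1f} C")   # ~100 C
```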

  6. Monte Carlo simulation by computer for life-cycle costing

    NASA Technical Reports Server (NTRS)

    Gralow, F. H.; Larson, W. J.

    1969-01-01

    Predicting the behavior and support requirements of a system over its entire life cycle by computer-based Monte Carlo simulation enables accurate cost estimates. The approach reduces the ultimate cost to the procuring agency because it takes into consideration the costs of initial procurement, operation, and maintenance.
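
    A minimal sketch of such a Monte Carlo life-cycle cost simulation is shown below; the cost distributions and their parameters are entirely hypothetical.

```python
# Minimal Monte Carlo life-cycle cost sketch: procurement, operation and
# maintenance costs are drawn from assumed distributions and summed; the
# spread of totals supports an estimate with confidence bounds.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

procurement = rng.normal(50.0, 5.0, n)                              # $M
operation   = rng.normal(3.0, 0.5, n) * 15                          # $M/yr over a 15-year life
maintenance = rng.lognormal(mean=np.log(20.0), sigma=0.3, size=n)   # $M, right-skewed

total = procurement + operation + maintenance
print(f"mean LCC : {total.mean():6.1f} $M")
print(f"P50 / P90: {np.percentile(total, 50):6.1f} / {np.percentile(total, 90):6.1f} $M")
```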

  7. Cost Analysis of TEOAE-Based Universal Newborn Hearing Screening.

    ERIC Educational Resources Information Center

    Weirather, Yusnita; Korth, Nancy; White, Karl R.; Woods-Kershner, Nancy; Downs, Diane

    1997-01-01

    After reviewing the extant literature, this article describes a cost analysis of the transient evoked otoacoustic emissions (TEOAE)-based universal newborn hearing screening program. Reasons why the cost per baby ($7.42) is lower than in previous reports are explained, and the benefits of having accurate cost analysis data are summarized. (Author/CR)

  8. 78 FR 7718 - Review of the General Purpose Costing System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-04

    ... switch engine minutes; (2) equipment costs for the use of railroad-owned cars during switching; and (3... alternative distance between I&I switches that more accurately reflects railroad operations. Definition of... general purpose costing system, the Uniform Railroad Costing System (URCS). Specifically, the Board...

  9. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1983-01-01

    Alternatives to sampling-theory stratified and regression estimators of crop production and timber biomass were examined. An alternative estimator which is viewed as especially promising is the errors-in-variables regression estimator. Investigations established the need for caution with this estimator when the ratio of the two error variances is not precisely known.
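
    One standard errors-in-variables estimator, Deming regression, illustrates the sensitivity flagged above: the fitted slope depends on the assumed ratio of the two error variances. The data below are synthetic, and the estimator is shown only as a representative example, not necessarily the one studied in the report.

```python
# Deming regression sketch: the slope depends on lam, the assumed ratio of the
# y-error variance to the x-error variance, which is exactly the quantity that
# is rarely known precisely. Data are synthetic.
import numpy as np

def deming_slope(x, y, lam=1.0):
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    d = syy - lam * sxx
    return (d + np.sqrt(d * d + 4.0 * lam * sxy * sxy)) / (2.0 * sxy)

rng = np.random.default_rng(1)
truth = rng.uniform(0, 100, 200)                      # e.g. true crop area per segment
x = truth + rng.normal(0, 8, truth.size)              # cheap but noisy remotely-sensed value
y = 0.9 * truth + 5 + rng.normal(0, 3, truth.size)    # accurate but costly ground value

for lam in (0.1, 1.0, 10.0):
    print(f"lambda={lam:5.1f}  slope={deming_slope(x, y, lam):.3f}")
```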

  10. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1985-01-01

    Research activities conducted under the auspices of National Aeronautics and Space Administration Cooperative Agreement NCC 9-9 are discussed. During this contract period, research efforts are concentrated in two primary areas. The first area is an investigation of the use of measurement error models as alternatives to least squares regression estimators of crop production or timber biomass. The second primary area of investigation is the estimation of the mixing proportion of two-component mixture models. This report lists publications, technical reports, submitted manuscripts, and oral presentations generated by these research efforts. Possible areas of future research are mentioned.

  11. The first accurate description of an aurora

    NASA Astrophysics Data System (ADS)

    Schröder, Wilfried

    2006-12-01

    As technology has advanced, the scientific study of auroral phenomena has increased by leaps and bounds. A look back at the earliest descriptions of aurorae offers an interesting look into how medieval scholars viewed the subjects that we study. Although there are earlier fragmentary references in the literature, the first accurate description of the aurora borealis appears to be that published by the German Catholic scholar Konrad von Megenberg (1309-1374) in his book Das Buch der Natur (The Book of Nature). The book was written between 1349 and 1350.

  12. Determining accurate distances to nearby galaxies

    NASA Astrophysics Data System (ADS)

    Bonanos, Alceste Zoe

    2005-11-01

    Determining accurate distances to nearby or distant galaxies is a conceptually very simple, yet in practice complicated, task. Presently, distances to nearby galaxies are only known to an accuracy of 10-15%. The current anchor galaxy of the extragalactic distance scale is the Large Magellanic Cloud, which has large (10-15%) systematic uncertainties associated with it, because of its morphology, its non-uniform reddening and the unknown metallicity dependence of the Cepheid period-luminosity relation. This work aims to determine accurate distances to some nearby galaxies, and subsequently help reduce the error in the extragalactic distance scale and the Hubble constant H0. In particular, this work presents the first distance determination of the DIRECT Project to M33 with detached eclipsing binaries. DIRECT aims to obtain a new anchor galaxy for the extragalactic distance scale by measuring direct, accurate (to 5%) distances to two Local Group galaxies, M31 and M33, with detached eclipsing binaries. It involves a massive variability survey of these galaxies and subsequent photometric and spectroscopic follow-up of the detached binaries discovered. In this work, I also present a catalog of variable stars discovered in one of the DIRECT fields, M31Y, which includes 41 eclipsing binaries. Additionally, we derive the distance to the Draco Dwarf Spheroidal galaxy, with ~100 RR Lyrae found in our first CCD variability study of this galaxy. A "hybrid" method of discovering Cepheids with ground-based telescopes is described next. It involves applying the image subtraction technique on the images obtained from ground-based telescopes and then following them up with the Hubble Space Telescope to derive Cepheid period-luminosity distances. By re-analyzing ESO Very Large Telescope data on M83 (NGC 5236), we demonstrate that this method is much more powerful for detecting variability, especially in crowded fields. I finally present photometry for the Wolf-Rayet binary WR 20a

  13. Information Warfare Arms Control: Risks and Costs

    DTIC Science & Technology

    2006-03-01

    of facilities such as breweries, yogurt manufacturers, and agricultural ethanol plants if a verification provision was adopted.109 Since there would...just as overwhelming as the inspection of breweries, yogurt manufacturers, and agricultural ethanol plants would be for the BWC.118 However, a

  14. Accurate Methods for Large Molecular Systems (Preprint)

    DTIC Science & Technology

    2009-01-06

    Gaussian functions. These basis sets can be used in a systematic way to obtain results approaching the complete basis set (CBS) limit. However...convergence to the CBS limit. The high accuracy of these basis sets still comes at a significant computational cost, only feasible on relatively small...J. Chem. Phys. 2006, 124, 114103. (b) ccCA: DeYonker, N. J.; Grimes, T.; Yockel, S.; Dinescu, A.; Mintz, B.; Cundari, T. R.; Wilson, A. K. J. Chem

  15. 7 CFR 2903.4 - Indirect costs.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 15 2011-01-01 2011-01-01 false Indirect costs. 2903.4 Section 2903.4 Agriculture... AGRICULTURE BIODIESEL FUEL EDUCATION PROGRAM General Information § 2903.4 Indirect costs. (a) For the Biodiesel Fuel Education Program, applicants should use the current indirect cost rate negotiated with...

  16. In Search of Cost-Effective Schools.

    ERIC Educational Resources Information Center

    Raywid, Mary Anne; Shaheen, Thomas A.

    1994-01-01

    Examines major cost-effectiveness proposals, describing developments that highlight concerns over making schools cost effective. The article discusses ways to blend the concerns of educational quality, equity, and costs (district consolidations, shared service and facilities arrangements, new accountability strategies, new information systems,…

  17. COSTS TO REMEDIATE MTBE-CONTAMINATED SITES

    EPA Science Inventory

    The extensive contamination of methyl tert-butyl ether (MTBE) in ground water has introduced concerns about the increased cost of remediation of MTBE releases compared to sites with BTEX only contamination. In an attempt to evaluate these costs, cost information for 311 sites wa...

  18. 7 CFR 2903.4 - Indirect costs.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AGRICULTURE BIODIESEL FUEL EDUCATION PROGRAM General Information § 2903.4 Indirect costs. (a) For the Biodiesel Fuel Education Program, applicants should use the current indirect cost rate negotiated with the... funds. Grantees electing this alternative will not be allowed to charge, as direct costs, indirect...

  19. 7 CFR 2903.4 - Indirect costs.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AGRICULTURE BIODIESEL FUEL EDUCATION PROGRAM General Information § 2903.4 Indirect costs. (a) For the Biodiesel Fuel Education Program, applicants should use the current indirect cost rate negotiated with the... funds. Grantees electing this alternative will not be allowed to charge, as direct costs, indirect...

  20. 7 CFR 2903.4 - Indirect costs.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AGRICULTURE BIODIESEL FUEL EDUCATION PROGRAM General Information § 2903.4 Indirect costs. (a) For the Biodiesel Fuel Education Program, applicants should use the current indirect cost rate negotiated with the... funds. Grantees electing this alternative will not be allowed to charge, as direct costs, indirect...

  1. Accurate colon residue detection algorithm with partial volume segmentation

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Liang, Zhengrong; Zhang, PengPeng; Kutcher, Gerald J.

    2004-05-01

    Colon cancer is the second leading cause of cancer-related death in the United States. Earlier detection and removal of polyps can dramatically reduce the chance of developing a malignant tumor. Due to some limitations of optical colonoscopy used in the clinic, many researchers have developed virtual colonoscopy as an alternative technique, in which accurate colon segmentation is crucial. However, the partial volume effect and the existence of residue make it very challenging. The electronic colon cleaning technique proposed by Chen et al. is a very attractive method, which is also a kind of hard segmentation method. As mentioned in their paper, some artifacts were produced, which might affect the accurate colon reconstruction. In our paper, instead of labeling each voxel with a unique label or tissue type, the percentage of different tissues within each voxel, which we call a mixture, was considered in establishing a maximum a posteriori probability (MAP) image-segmentation framework. A Markov random field (MRF) model was developed to reflect the spatial information for the tissue mixtures. The spatial information based on hard segmentation was used to determine which tissue types are in the specific voxel. Parameters of each tissue class were estimated by the expectation-maximization (EM) algorithm during the MAP tissue-mixture segmentation. Real CT experimental results demonstrated that the partial volume effects between four tissue types have been precisely detected. Meanwhile, the residue has been electronically removed and a very smooth and clean interface along the colon wall has been obtained.
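
    A much-simplified sketch of the EM parameter estimation referred to above is given below for a two-class Gaussian mixture of voxel intensities; the MRF spatial prior and per-voxel mixture fractions of the full method are omitted, and the intensity distributions are synthetic.

```python
# Simplified sketch: EM fitting of a two-class Gaussian mixture to voxel
# intensities. The full method additionally models per-voxel tissue fractions
# with an MRF spatial prior, which is omitted here. Data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
voxels = np.concatenate([rng.normal(30, 8, 4000),     # hypothetical soft-tissue intensities
                         rng.normal(90, 10, 2000)])   # hypothetical tagged-residue intensities

mu, sigma, w = np.array([20.0, 100.0]), np.array([10.0, 10.0]), np.array([0.5, 0.5])
for _ in range(50):
    # E-step: responsibility of each class for each voxel
    pdf = np.exp(-0.5 * ((voxels[:, None] - mu) / sigma) ** 2) / sigma
    resp = w * pdf
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update mixture weights, means and standard deviations
    nk = resp.sum(axis=0)
    w = nk / voxels.size
    mu = (resp * voxels[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (voxels[:, None] - mu) ** 2).sum(axis=0) / nk)

print("means:", mu.round(1), "sigmas:", sigma.round(1), "weights:", w.round(2))
```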

  2. Accurate taxonomic assignment of short pyrosequencing reads.

    PubMed

    Clemente, José C; Jansson, Jesper; Valiente, Gabriel

    2010-01-01

    Ambiguities in the taxonomy dependent assignment of pyrosequencing reads are usually resolved by mapping each read to the lowest common ancestor in a reference taxonomy of all those sequences that match the read. This conservative approach has the drawback of mapping a read to a possibly large clade that may also contain many sequences not matching the read. A more accurate taxonomic assignment of short reads can be made by mapping each read to the node in the reference taxonomy that provides the best precision and recall. We show that given a suffix array for the sequences in the reference taxonomy, a short read can be mapped to the node of the reference taxonomy with the best combined value of precision and recall in time linear in the size of the taxonomy subtree rooted at the lowest common ancestor of the matching sequences. An accurate taxonomic assignment of short reads can thus be made with about the same efficiency as when mapping each read to the lowest common ancestor of all matching sequences in a reference taxonomy. We demonstrate the effectiveness of our approach on several metagenomic datasets of marine and gut microbiota.
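
    The node-selection rule can be illustrated with a toy example: among candidate taxonomy nodes, choose the one whose descendant sequences give the best combined precision and recall (here the F-measure) against the sequences matching the read. The tiny taxonomy below is invented.

```python
# Toy sketch of selecting the taxonomy node with the best combined precision
# and recall for a read, contrasted with the lowest common ancestor (here "root").
taxonomy = {
    # node: set of reference sequences below that node (invented)
    "root":      {"s1", "s2", "s3", "s4", "s5", "s6"},
    "genusA":    {"s1", "s2", "s3"},
    "speciesA1": {"s1", "s2"},
    "genusB":    {"s4", "s5", "s6"},
}

matches = {"s1", "s2", "s4"}    # sequences matching the read

def f_measure(node_seqs, matches):
    hit = len(node_seqs & matches)
    if hit == 0:
        return 0.0
    precision = hit / len(node_seqs)
    recall = hit / len(matches)
    return 2 * precision * recall / (precision + recall)

best = max(taxonomy, key=lambda n: f_measure(taxonomy[n], matches))
print(best, round(f_measure(taxonomy[best], matches), 2))   # -> speciesA1, better than the LCA
```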

  3. Accurate shear measurement with faint sources

    SciTech Connect

    Zhang, Jun; Foucaud, Sebastien; Luo, Wentao E-mail: walt@shao.ac.cn

    2015-01-01

    For cosmic shear to become an accurate cosmological probe, systematic errors in the shear measurement method must be unambiguously identified and corrected for. Previous work of this series has demonstrated that cosmic shears can be measured accurately in Fourier space in the presence of background noise and finite pixel size, without assumptions on the morphologies of galaxy and PSF. The remaining major source of error is source Poisson noise, due to the finiteness of source photon number. This problem is particularly important for faint galaxies in space-based weak lensing measurements, and for ground-based images of short exposure times. In this work, we propose a simple and rigorous way of removing the shear bias from the source Poisson noise. Our noise treatment can be generalized for images made of multiple exposures through MultiDrizzle. This is demonstrated with the SDSS and COSMOS/ACS data. With a large ensemble of mock galaxy images of unrestricted morphologies, we show that our shear measurement method can achieve sub-percent level accuracy even for images of signal-to-noise ratio less than 5 in general, making it the most promising technique for cosmic shear measurement in the ongoing and upcoming large scale galaxy surveys.

  4. Accurate pose estimation for forensic identification

    NASA Astrophysics Data System (ADS)

    Merckx, Gert; Hermans, Jeroen; Vandermeulen, Dirk

    2010-04-01

    In forensic authentication, one aims to identify the perpetrator among a series of suspects or distractors. A fundamental problem in any recognition system that aims for identification of subjects in a natural scene is the lack of constraints on viewing and imaging conditions. In forensic applications, identification proves even more challenging, since most surveillance footage is of abysmal quality. In this context, robust methods for pose estimation are paramount. In this paper we will therefore present a new pose estimation strategy for very low quality footage. Our approach uses 3D-2D registration of a textured 3D face model with the surveillance image to obtain accurate far-field pose alignment. Starting from an inaccurate initial estimate, the technique uses novel similarity measures based on the monogenic signal to guide a pose optimization process. We will illustrate the descriptive strength of the introduced similarity measures by using them directly as a recognition metric. Through validation, using both real and synthetic surveillance footage, our pose estimation method is shown to be accurate, and robust to lighting changes and image degradation.

  5. Sparse and accurate high resolution SAR imaging

    NASA Astrophysics Data System (ADS)

    Vu, Duc; Zhao, Kexin; Rowe, William; Li, Jian

    2012-05-01

    We investigate the usage of an adaptive method, the Iterative Adaptive Approach (IAA), in combination with a maximum a posteriori (MAP) estimate to reconstruct high resolution SAR images that are both sparse and accurate. IAA is a nonparametric weighted least squares algorithm that is robust and user parameter-free. IAA has been shown to reconstruct SAR images with excellent side lobe suppression and high resolution enhancement. We first reconstruct the SAR images using IAA, and then we enforce sparsity by using MAP with a sparsity-inducing prior. By coupling these two methods, we can produce a sparse and accurate high resolution image that is conducive to feature extraction and target classification applications. In addition, we show how IAA can be made computationally efficient without sacrificing accuracy, a desirable property for SAR applications where the size of the problems is quite large. We demonstrate the success of our approach using the Air Force Research Lab's "Gotcha Volumetric SAR Data Set Version 1.0" challenge dataset. Via the widely used FFT, individual vehicles contained in the scene are barely recognizable due to the poor resolution and high side lobe nature of FFT. However, with our approach, clear edges, boundaries, and textures of the vehicles are obtained.

  6. Accurate basis set truncation for wavefunction embedding

    NASA Astrophysics Data System (ADS)

    Barnes, Taylor A.; Goodpaster, Jason D.; Manby, Frederick R.; Miller, Thomas F.

    2013-07-01

    Density functional theory (DFT) provides a formally exact framework for performing embedded subsystem electronic structure calculations, including DFT-in-DFT and wavefunction theory-in-DFT descriptions. In the interest of efficiency, it is desirable to truncate the atomic orbital basis set in which the subsystem calculation is performed, thus avoiding high-order scaling with respect to the size of the MO virtual space. In this study, we extend a recently introduced projection-based embedding method [F. R. Manby, M. Stella, J. D. Goodpaster, and T. F. Miller III, J. Chem. Theory Comput. 8, 2564 (2012)], 10.1021/ct300544e to allow for the systematic and accurate truncation of the embedded subsystem basis set. The approach is applied to both covalently and non-covalently bound test cases, including water clusters and polypeptide chains, and it is demonstrated that errors associated with basis set truncation are controllable to well within chemical accuracy. Furthermore, we show that this approach allows for switching between accurate projection-based embedding and DFT embedding with approximate kinetic energy (KE) functionals; in this sense, the approach provides a means of systematically improving upon the use of approximate KE functionals in DFT embedding.

  7. Information Presentation

    NASA Technical Reports Server (NTRS)

    Holden, Kritina L.; Thompson, Shelby G.; Sandor, Aniko; McCann, Robert S.; Kaiser, Mary K.; Adelstein, Barnard D.; Begault, Durand R.; Beutter, Brent R.; Stone, Leland S.; Godfroy, Martine

    2009-01-01

    The goal of the Information Presentation Directed Research Project (DRP) is to address design questions related to the presentation of information to the crew. In addition to addressing display design issues associated with information formatting, style, layout, and interaction, the Information Presentation DRP is also working toward understanding the effects of extreme environments encountered in space travel on information processing. Work is also in progress to refine human factors-based design tools, such as human performance modeling, that will supplement traditional design techniques and help ensure that optimal information design is accomplished in the most cost-efficient manner. The major areas of work, or subtasks, within the Information Presentation DRP for FY10 are: 1) Displays, 2) Controls, 3) Procedures and Fault Management, and 4) Human Performance Modeling. The poster will highlight completed and planned work for each subtask.

  8. Costs and cost containment in nursing homes.

    PubMed Central

    Smith, H L; Fottler, M D

    1981-01-01

    The study examines the impact of structural and process variables on the cost of nursing home care and the utilization of various cost containment methods in 43 California nursing homes. Several predictors were statistically significant in their relation to cost per patient day. A diverse range of cost containment techniques was discovered, along with strong predictors of the utilization of these techniques by nursing home administrators. The trade-off between quality of care and cost of care is discussed. PMID:7228713

  9. Assessing the Cost of Global Biodiversity and Conservation Knowledge

    PubMed Central

    Juffe-Bignoli, Diego; Brooks, Thomas M.; Butchart, Stuart H. M.; Jenkins, Richard B.; Boe, Kaia; Hoffmann, Michael; Angulo, Ariadne; Bachman, Steve; Böhm, Monika; Brummitt, Neil; Carpenter, Kent E.; Comer, Pat J.; Cox, Neil; Cuttelod, Annabelle; Darwall, William R. T.; Fishpool, Lincoln D. C.; Goettsch, Bárbara; Heath, Melanie; Hilton-Taylor, Craig; Hutton, Jon; Johnson, Tim; Joolia, Ackbar; Keith, David A.; Langhammer, Penny F.; Luedtke, Jennifer; Nic Lughadha, Eimear; Lutz, Maiko; May, Ian; Miller, Rebecca M.; Oliveira-Miranda, María A.; Parr, Mike; Pollock, Caroline M.; Ralph, Gina; Rodríguez, Jon Paul; Rondinini, Carlo; Smart, Jane; Stuart, Simon; Symes, Andy; Tordoff, Andrew W.; Young, Bruce; Kingston, Naomi

    2016-01-01

    Knowledge products comprise assessments of authoritative information supported by standards, governance, quality control, data, tools, and capacity building mechanisms. Considerable resources are dedicated to developing and maintaining knowledge products for biodiversity conservation, and they are widely used to inform policy and advise decision makers and practitioners. However, the financial cost of delivering this information is largely undocumented. We evaluated the costs and funding sources for developing and maintaining four global biodiversity and conservation knowledge products: The IUCN Red List of Threatened Species, the IUCN Red List of Ecosystems, Protected Planet, and the World Database of Key Biodiversity Areas. These are secondary data sets, built on primary data collected by extensive networks of expert contributors worldwide. We estimate that US$160 million (range: US$116–204 million), plus 293 person-years of volunteer time (range: 278–308 person-years) valued at US$ 14 million (range US$12–16 million), were invested in these four knowledge products between 1979 and 2013. More than half of this financing was provided through philanthropy, and nearly three-quarters was spent on personnel costs. The estimated annual cost of maintaining data and platforms for three of these knowledge products (excluding the IUCN Red List of Ecosystems for which annual costs were not possible to estimate for 2013) is US$6.5 million in total (range: US$6.2–6.7 million). We estimated that an additional US$114 million will be needed to reach pre-defined baselines of data coverage for all the four knowledge products, and that once achieved, annual maintenance costs will be approximately US$12 million. These costs are much lower than those to maintain many other, similarly important, global knowledge products. Ensuring that biodiversity and conservation knowledge products are sufficiently up to date, comprehensive and accurate is fundamental to inform decision

  10. Teaching ABC & Cost Behaviors to Non-Numbers People

    ERIC Educational Resources Information Center

    Taylor, Virginia Anne; Rudnick, Martin

    2007-01-01

    Simply put, a cost analysis studies how you spend your money. Activity based costing models associate costs with services and cost benefit analysis weighs whether or not the costs expended were worth the money given the efforts involved and the results achieved. This study seeks to understand the financial choices and information seeking behaviors…

  11. Unraveling Higher Education's Costs.

    ERIC Educational Resources Information Center

    Gordon, Gus; Charles, Maria

    1998-01-01

    The activity-based costing (ABC) method of analyzing institutional costs in higher education involves four procedures: determining the various discrete activities of the organization; calculating the cost of each; determining the cost drivers; tracing cost to the cost objective or consumer of each activity. Few American institutions have used the…
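
    The four procedures can be sketched in a few lines; the activities, costs, drivers, and consumption figures below are invented for illustration.

```python
# Sketch of the four ABC procedures listed above, with invented numbers:
# (1) discrete activities, (2) cost of each, (3) cost drivers, and
# (4) tracing costs to the cost objects that consume the activities.
activities = {
    # activity: (total cost in $, cost driver volume, driver name)
    "advising":     (200_000, 4_000, "advising hours"),
    "registration": (150_000, 10_000, "enrollments"),
    "lab_support":  (300_000, 6_000, "lab hours"),
}

# driver units consumed by each cost object (here, two academic programs)
consumption = {
    "engineering": {"advising": 1_500, "registration": 3_000, "lab_support": 4_000},
    "humanities":  {"advising": 2_500, "registration": 7_000, "lab_support": 2_000},
}

for program, usage in consumption.items():
    total = 0.0
    for activity, units in usage.items():
        cost, volume, _ = activities[activity]
        total += cost / volume * units        # rate per driver unit x units consumed
    print(f"{program:12s} allocated cost: ${total:,.0f}")
```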

  12. Adaptive and accurate color edge extraction method for one-shot shape acquisition

    NASA Astrophysics Data System (ADS)

    Yin, Wei; Cheng, Xiaosheng; Cui, Haihua; Li, Dawei; Zhou, Lei

    2016-09-01

    This paper presents an approach to extract accurate color edge information using encoded patterns in hue, saturation, and intensity (HSI) color space. This method is applied to one-shot shape acquisition. Theoretical analysis shows that the hue transition between primary and secondary colors in a color edge is based on light interference and diffraction. We set up a color transition model to illustrate the hue transition on an edge and then define the segmenting position of two stripes. By setting up an adaptive HSI color space, the colors of the stripes and subpixel edges are obtained precisely without a dark laboratory environment, using a low-cost processing algorithm. Since this method does not place any constraints on the colors of neighboring stripes, the encoding is an easy procedure. The experimental results show that the edges of dense modulation patterns can be obtained under complicated environmental illumination, and the precision can ensure that the three-dimensional shape of the object is obtained reliably with only one image.
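
    For reference, one standard RGB-to-HSI conversion is sketched below; this only makes the color space concrete and does not reproduce the paper's adaptive HSI space or subpixel edge localization.

```python
import math

# One standard RGB-to-HSI conversion, shown only to make the color space
# concrete; the paper's adaptive HSI space is not reproduced here.
def rgb_to_hsi(r, g, b):
    """r, g, b in [0, 1]; returns hue in degrees, saturation and intensity in [0, 1]."""
    i = (r + g + b) / 3.0
    s = 0.0 if i == 0 else 1.0 - min(r, g, b) / i
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-12
    h = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
    if b > g:
        h = 360.0 - h
    return h, s, i

print(rgb_to_hsi(1.0, 0.0, 0.0))   # pure red   -> hue ~ 0
print(rgb_to_hsi(0.0, 1.0, 0.0))   # pure green -> hue ~ 120
```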

  13. Accurate Damage Location in Complex Composite Structures and Industrial Environments using Acoustic Emission

    NASA Astrophysics Data System (ADS)

    Eaton, M.; Pearson, M.; Lee, W.; Pullin, R.

    2015-07-01

    The ability to accurately locate damage in any given structure is a highly desirable attribute for an effective structural health monitoring system and could help to reduce operating costs and improve safety. This becomes a far greater challenge in complex geometries and materials, such as modern composite airframes. The poor translation of promising laboratory-based SHM demonstrators to industrial environments forms a barrier to commercial uptake of the technology. The acoustic emission (AE) technique is a passive NDT method that detects elastic stress waves released by the growth of damage. It offers very sensitive damage detection, using a sparse array of sensors to detect and globally locate damage within a structure. However, its application to complex structures commonly yields poor accuracy due to anisotropic wave propagation and the interruption of wave propagation by structural features such as holes and thickness changes. This work adopts an empirical mapping technique for AE location, known as Delta T Mapping, which uses experimental training data to account for such structural complexities. The technique is applied to a complex geometry composite aerospace structure undergoing certification testing. The component consists of a carbon fibre composite tube with varying wall thickness and multiple holes, and was loaded in bending. The damage location was validated using X-ray CT scanning, and the Delta T Mapping technique was shown to improve location accuracy when compared with commercial algorithms. The onset and progression of damage were monitored throughout the test and used to inform future design iterations.

  14. Projected 1999-2000 Cost Allocation Summary.

    ERIC Educational Resources Information Center

    Wisconsin Technical Coll. System Board, Madison.

    Information contained in this summary was derived from data submitted by Wisconsin technical colleges on their 1999-2000 projected cost allocation schedules. Cost allocation information is used to calculate the distribution of state aids to each college, and prepare financial and enrollment reports including state statistical summaries and reports…

  15. Start2quit: a randomised clinical controlled trial to evaluate the effectiveness and cost-effectiveness of using personal tailored risk information and taster sessions to increase the uptake of the NHS Stop Smoking Services.

    PubMed Central

    Gilbert, Hazel; Sutton, Stephen; Morris, Richard; Petersen, Irene; Wu, Qi; Parrott, Steve; Galton, Simon; Kale, Dimitra; Magee, Molly Sweeney; Gardner, Leanne; Nazareth, Irwin

    2017-01-01

    BACKGROUND The NHS Stop Smoking Services (SSSs) offer help to smokers who want to quit. However, the proportion of smokers attending the SSSs is low and current figures show a continuing downward trend. This research addressed the problem of how to motivate more smokers to accept help to quit. OBJECTIVES To assess the relative effectiveness, and cost-effectiveness, of an intervention consisting of proactive recruitment by a brief computer-tailored personal risk letter and an invitation to a 'Come and Try it' taster session to provide information about the SSSs, compared with a standard generic letter advertising the service, in terms of attendance at the SSSs of at least one session and validated 7-day point prevalent abstinence at the 6-month follow-up. DESIGN Randomised controlled trial of a complex intervention with follow-up 6 months after the date of randomisation. SETTING SSSs and general practices in England. PARTICIPANTS All smokers aged ≥ 16 years identified from medical records in participating practices who were motivated to quit and who had not attended the SSS in the previous 12 months. Participants were randomised in the ratio 3 : 2 (intervention to control) by a computer program. INTERVENTIONS Intervention - brief personalised and tailored letter sent from the general practitioner using information obtained from the screening questionnaire and from medical records, and an invitation to attend a taster session, run by the local SSS. Control - standard generic letter from the general practice advertising the local SSS and the therapies available, and asking the smoker to contact the service to make an appointment. MAIN OUTCOME MEASURES (1) Proportion of people attending the first session of a 6-week course over a period of 6 months from the receipt of the invitation letter, measured by records of attendance at the SSSs; (2) 7-day point prevalent abstinence at the 6-month follow-up, validated by salivary cotinine analysis; and (3) cost

  16. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, D.D.

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  17. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, Douglas D.

    1985-01-01

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  18. LSM: perceptually accurate line segment merging

    NASA Astrophysics Data System (ADS)

    Hamid, Naila; Khan, Nazar

    2016-11-01

    Existing line segment detectors tend to break up perceptually distinct line segments into multiple segments. We propose an algorithm for merging such broken segments to recover the original perceptually accurate line segments. The algorithm proceeds by grouping line segments on the basis of angular and spatial proximity. Then those line segment pairs within each group that satisfy unique, adaptive mergeability criteria are successively merged to form a single line segment. This process is repeated until no more line segments can be merged. We also propose a method for quantitative comparison of line segment detection algorithms. Results on the York Urban dataset show that our merged line segments are closer to human-marked ground-truth line segments compared to state-of-the-art line segment detection algorithms.
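
    A minimal sketch of the grouping-and-merging idea described above, assuming fixed angular and spatial thresholds chosen purely for illustration; the authors' unique, adaptive mergeability criteria and their quantitative comparison method are not reproduced here.

```python
# Minimal sketch of proximity-based line segment merging (not the authors'
# exact adaptive criteria): two segments are merged when their orientations
# and endpoint gaps fall below fixed thresholds chosen here for illustration.
import math

def _angle(seg):
    (x1, y1), (x2, y2) = seg
    return math.atan2(y2 - y1, x2 - x1) % math.pi  # orientation in [0, pi)

def _gap(a, b):
    # smallest endpoint-to-endpoint distance between two segments
    return min(math.dist(p, q) for p in a for q in b)

def _merge_pair(a, b):
    # replace the pair by the longest span among their four endpoints
    pts = list(a) + list(b)
    return max(((p, q) for p in pts for q in pts), key=lambda s: math.dist(*s))

def merge_segments(segments, max_angle=math.radians(5), max_gap=10.0):
    """Greedily merge segments that are angularly and spatially close."""
    segs = list(segments)
    merged = True
    while merged:
        merged = False
        for i in range(len(segs)):
            for j in range(i + 1, len(segs)):
                da = abs(_angle(segs[i]) - _angle(segs[j]))
                da = min(da, math.pi - da)                 # angular wrap-around
                if da <= max_angle and _gap(segs[i], segs[j]) <= max_gap:
                    segs[i] = _merge_pair(segs[i], segs[j])
                    del segs[j]
                    merged = True
                    break
            if merged:
                break
    return segs

# Two nearly collinear fragments separated by a small gap collapse into one segment.
print(merge_segments([((0, 0), (40, 1)), ((45, 1), (90, 2))]))
```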

  19. Highly accurate articulated coordinate measuring machine

    DOEpatents

    Bieg, Lothar F.; Jokiel, Jr., Bernhard; Ensz, Mark T.; Watson, Robert D.

    2003-12-30

    Disclosed is a highly accurate articulated coordinate measuring machine, comprising a revolute joint, comprising a circular encoder wheel, having an axis of rotation; a plurality of marks disposed around at least a portion of the circumference of the encoder wheel; bearing means for supporting the encoder wheel, while permitting free rotation of the encoder wheel about the wheel's axis of rotation; and a sensor, rigidly attached to the bearing means, for detecting the motion of at least some of the marks as the encoder wheel rotates; a probe arm, having a proximal end rigidly attached to the encoder wheel, and having a distal end with a probe tip attached thereto; and coordinate processing means, operatively connected to the sensor, for converting the output of the sensor into a set of cylindrical coordinates representing the position of the probe tip relative to a reference cylindrical coordinate system.
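
    For illustration only, a tiny sketch of the coordinate-processing step for a single revolute joint: an encoder count is converted into cylindrical coordinates of the probe tip. The counts-per-revolution, arm length and tip offset are invented values, not taken from the patent.

```python
# Hedged sketch: convert an encoder count and probe-arm geometry into
# cylindrical coordinates (r, theta, z) of the probe tip for one revolute
# joint. All numeric values below are illustrative assumptions.
import math

COUNTS_PER_REV = 36000      # encoder marks per full rotation (assumed)
ARM_LENGTH = 250.0          # radial distance from axis to probe tip, mm (assumed)
TIP_HEIGHT = 40.0           # probe-tip offset along the rotation axis, mm (assumed)

def probe_tip_cylindrical(encoder_count):
    """Return (r, theta_rad, z) of the probe tip in the joint's reference frame."""
    theta = 2.0 * math.pi * (encoder_count % COUNTS_PER_REV) / COUNTS_PER_REV
    return ARM_LENGTH, theta, TIP_HEIGHT

r, theta, z = probe_tip_cylindrical(9000)   # a quarter turn
print(r, math.degrees(theta), z)            # 250.0 90.0 40.0
```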

  20. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  1. Obtaining accurate translations from expressed sequence tags.

    PubMed

    Wasmuth, James; Blaxter, Mark

    2009-01-01

    The genomes of an increasing number of species are being investigated through the generation of expressed sequence tags (ESTs). However, ESTs are prone to sequencing errors and typically define incomplete transcripts, making downstream annotation difficult. Annotation would be greatly improved with robust polypeptide translations. Many current solutions for EST translation require a large number of full-length gene sequences for training purposes, a resource that is not available for the majority of EST projects. As part of our ongoing EST programs investigating these "neglected" genomes, we have developed a polypeptide prediction pipeline, prot4EST. It incorporates freely available software to produce final translations that are more accurate than those derived from any single method. We describe how this integrated approach goes a long way to overcoming the deficit in training data.

  2. Micron Accurate Absolute Ranging System: Range Extension

    NASA Technical Reports Server (NTRS)

    Smalley, Larry L.; Smith, Kely L.

    1999-01-01

    The purpose of this research is to investigate Fresnel diffraction as a means of obtaining absolute distance measurements with micron or greater accuracy. It is believed that such a system would prove useful to the Next Generation Space Telescope (NGST) as a non-intrusive, non-contact measuring system for use with secondary concentrator station-keeping systems. The present research attempts to validate past experiments and develop ways to apply the phenomenon of Fresnel diffraction to micron-accurate measurement. This report discusses past research on the phenomenon and the basis of the use of Fresnel diffraction for distance metrology. The apparatus used in the recent investigations, the experimental procedures, and preliminary results are discussed in detail. Continued research and the equipment requirements for extending the effective range of the Fresnel diffraction systems are also described.

  3. Accurate radio positions with the Tidbinbilla interferometer

    NASA Technical Reports Server (NTRS)

    Batty, M. J.; Gulkis, S.; Jauncey, D. L.; Rayner, P. T.

    1979-01-01

    The Tidbinbilla interferometer (Batty et al., 1977) is designed specifically to provide accurate radio position measurements of compact radio sources in the Southern Hemisphere with high sensitivity. The interferometer uses the 26-m and 64-m antennas of the Deep Space Network at Tidbinbilla, near Canberra. The two antennas are separated by 200 m on a north-south baseline. By utilizing the existing antennas and the low-noise traveling-wave masers at 2.29 GHz, it has been possible to produce a high-sensitivity instrument with a minimum of capital expenditure. The north-south baseline ensures that a good range of UV coverage is obtained, so that sources lying in the declination range between about -80 and +30 deg may be observed with nearly orthogonal projected baselines of no less than about 1000 lambda. The instrument also provides high-accuracy flux density measurements for compact radio sources.

  4. Magnetic ranging tool accurately guides replacement well

    SciTech Connect

    Lane, J.B.; Wesson, J.P. )

    1992-12-21

    This paper reports on magnetic ranging surveys and directional drilling technology which accurately guided a replacement well bore to intersect a leaking gas storage well with casing damage. The second well bore was then used to pump cement into the original leaking casing shoe. The repair well bore kicked off from the surface hole, bypassed casing damage in the middle of the well, and intersected the damaged well near the casing shoe. The repair well was subsequently completed in the gas storage zone near the original well bore, salvaging the valuable bottom hole location in the reservoir. This method would prevent the loss of storage gas, and it would prevent a potential underground blowout that could permanently damage the integrity of the storage field.

  5. Sensitive and accurate identification of protein–DNA binding events in ChIP-chip assays using higher order derivative analysis

    PubMed Central

    Barrett, Christian L.; Cho, Byung-Kwan

    2011-01-01

    Immuno-precipitation of protein–DNA complexes followed by microarray hybridization is a powerful and cost-effective technology for discovering protein–DNA binding events at the genome scale. It is still an unresolved challenge to comprehensively, accurately and sensitively extract binding event information from the produced data. We have developed a novel strategy composed of an information-preserving signal-smoothing procedure, higher order derivative analysis and application of the principle of maximum entropy to address this challenge. Importantly, our method does not require any input parameters to be specified by the user. Using genome-scale binding data of two Escherichia coli global transcription regulators for which a relatively large number of experimentally supported sites are known, we show that ∼90% of known sites were resolved to within four probes, or ∼88 bp. Over half of the sites were resolved to within two probes, or ∼38 bp. Furthermore, we demonstrate that our strategy delivers significant quantitative and qualitative performance gains over available methods. Such accurate and sensitive binding site resolution has important consequences for accurately reconstructing transcriptional regulatory networks, for motif discovery, for furthering our understanding of local and non-local factors in protein–DNA interactions and for extending the usefulness horizon of the ChIP-chip platform. PMID:21051353
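
    A hedged sketch of the general signal-processing idea only: smooth a probe-level signal and use derivative sign changes to delimit candidate binding regions. The authors' information-preserving smoothing, higher-order derivative analysis and maximum-entropy step are not reproduced, and the window size below is an assumption.

```python
# Sketch: moving-average smoothing followed by first/second-derivative tests
# to locate candidate enriched regions in a ChIP-chip probe signal.
import numpy as np

def candidate_peaks(signal, window=5):
    """Return indices of local maxima of a smoothed probe-level signal."""
    kernel = np.ones(window) / window                  # simple moving average
    smooth = np.convolve(signal, kernel, mode="same")
    d1 = np.gradient(smooth)                           # first derivative
    d2 = np.gradient(d1)                               # second derivative
    # peak: first derivative crosses zero from + to - with negative curvature
    crossings = np.where((d1[:-1] > 0) & (d1[1:] <= 0) & (d2[:-1] < 0))[0]
    return crossings

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 500)
probe = np.exp(-(x - 3) ** 2) + 0.5 * np.exp(-(x - 7) ** 2) + 0.05 * rng.normal(size=x.size)
print(candidate_peaks(probe))   # candidate indices: the two simulated events plus possible noise hits
```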

  6. Costs and cost-minimisation analysis.

    PubMed

    Robinson, R

    1993-09-18

    Whatever kind of economic evaluation you plan to undertake, the costs must be assessed. In health care these are first of all divided into costs borne by the NHS (like drugs), by patients and their families (like travel), and by the rest of society (like health education). Next the costs have to be valued in monetary terms; direct costs, like wages, pose little problem, but indirect costs (like time spent in hospital) have to have values imputed to them. And that is not all: costs must be further subdivided into average, marginal, and joint costs, which help decisions on how much of a service should be provided. Capital costs (investments in plant, buildings, and machinery) are also important, as are discounting and inflation. In this second article in the series Ray Robinson defines the types of costs, their measurement, and how they should be valued in monetary terms.
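
    The discounting mentioned above can be made concrete with a short present-value calculation; the 3.5% discount rate and the cost stream below are purely illustrative and are not taken from the article.

```python
# Minimal sketch of discounting: a future cost C_t incurred t years from now
# contributes C_t / (1 + r)^t to the present value of a cost stream.
def present_value(costs_by_year, rate=0.035):
    """costs_by_year[t] is the cost incurred t years from now (t = 0 is today)."""
    return sum(c / (1.0 + rate) ** t for t, c in enumerate(costs_by_year))

print(round(present_value([1000, 1000, 1000]), 2))  # three annual payments of 1000
```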

  7. Construction Cost Analysis : Residential Construction Demonstration Project Cycle II.

    SciTech Connect

    Barnett, Cole; Thor, Philip W.

    1990-06-01

    The Residential Construction Demonstration Project (RCDP) is designed to demonstrate new residential building techniques and product innovations which advance the state-of-the-art in constructing energy-efficient electrically heated residences. A secondary purpose is to obtain documented cost and energy savings data from which to make accurate assessments of the cost-effectiveness of various conservation innovations. The project solicits participation of regional homebuilders by offering them financial incentives for constructing homes to the Model Conservation Standards (MCS) and including at least one 'innovation.' The innovations are determined by BPA and the States prior to construction and represent construction techniques or energy saving products that might reduce the cost of building MCS homes, or expand the options available to builders in achieving MCS levels of energy efficiency in homes. Besides covering some of the additional risk for employing the innovation, the incentive payment guarantees that builders will provide certain amounts of information regarding the cost and acceptability of building the homes. In addition, an incentive is paid to homeowners for their participation in data collection efforts following construction. Several 'one-time' tests were performed on the houses and homeowners were required to report energy consumption and temperature data on a weekly basis for approximately 18 months. BPA and the States compile the information obtained from the builders and homeowners. Access to this data is provided for the purpose of analyzing the cost and performance of the RCDP homes, as well as understanding the value of the various innovations that are tested. 25 tabs., 4 figs.

  8. Time, Monetary and Other Costs of Participation in Family-Based Child Weight Management Interventions: Qualitative and Systematic Review Evidence

    PubMed Central

    Arai, Lisa; Panca, Monica; Morris, Steve; Curtis-Tyler, Katherine; Lucas, Patricia J.; Roberts, Helen M.

    2015-01-01

    Background Childhood overweight and obesity have health and economic impacts on individuals and the wider society. Families participating in weight management programmes may foresee or experience monetary and other costs which deter them from signing up to or completing programmes. This is recognised in the health economics literature, though within this sparse body of work, costs to families are often narrowly defined and not fully accounted for. A societal perspective incorporating a broader array of costs may provide a more accurate picture. This paper brings together a review of the health economics literature on the costs to families attending child weight management programmes with qualitative data from families participating in a programme to manage child overweight and obesity. Methods A search identified economic evaluation studies of lifestyle interventions in childhood obesity. The qualitative work drew on interviews with families who attended a weight management intervention in three UK regions. Results We identified four cost-effectiveness analyses that include information on costs to families. These were categorised as direct (e.g. monetary) and indirect (e.g. time) costs. Our analysis of qualitative data demonstrated that, for families who attended the programme, costs were associated both with participation on the scheme and with maintaining a healthy lifestyle afterwards. Respondents reported three kinds of cost: time-related, social/emotional and monetary. Conclusion Societal approaches to measuring cost-effectiveness provide a framework for assessing the monetary and non-monetary costs borne by participants attending treatment programmes. From this perspective, all costs should be considered in any analysis of cost-effectiveness. Our data suggest that family costs are important, and may act as a barrier to the uptake, completion and maintenance of behaviours to reduce child obesity. These findings have implications for the development and

  9. FAS 33: accurately recording effects of changing prices.

    PubMed

    Sage, L G

    1987-02-01

    FAS 33 addresses the problem of distortion in conventional historical cost financial statements because of changing prices. It requires 1300 business enterprises to report selected changing price data on a supplementary basis. It has been demonstrated that it is also feasible and beneficial for hospitals to present price disclosures as supplementary information to their financial statements. The possible application of FAS 33 is supported on the basis that the accounting and reporting methods of healthcare institutions are similar to the accounting and reporting practices of profit-seeking entities.

  10. A unique, accurate LWIR optics measurement system

    NASA Astrophysics Data System (ADS)

    Fantone, Stephen D.; Orband, Daniel G.

    2011-05-01

    A compact low-cost LWIR test station has been developed that provides real-time MTF testing of IR optical systems and EO imaging systems. The test station is intended to be operated by a technician and can be used to measure the focal length, blur spot size, distortion, and other metrics of system performance. The challenges and tradeoffs incorporated into this instrumentation will be presented. The test station performs the measurement of an IR lens or optical system's first-order quantities (focal length, back focal length) including on- and off-axis imaging performance (e.g., MTF, resolution, spot size) under actual test conditions to enable the simulation of their actual use. Also described is the method of attaining the needed accuracies so that derived calculations like focal length (EFL = image shift/tan(theta)) can be performed to the requisite accuracy. The station incorporates a patented video capture technology and measures MTF and blur characteristics using newly available low-cost LWIR cameras. This allows real-time determination of the optical system performance, enabling faster measurements, higher throughput and lower-cost results than scanning systems. Multiple spectral filters are also accommodated within the test station, which facilitates performance evaluation under various spectral conditions.

  11. Accurate simulation of optical properties in dyes.

    PubMed

    Jacquemin, Denis; Perpète, Eric A; Ciofini, Ilaria; Adamo, Carlo

    2009-02-17

    Since Antiquity, humans have produced and commercialized dyes. To this day, extraction of natural dyes often requires lengthy and costly procedures. In the 19th century, global markets and new industrial products drove a significant effort to synthesize artificial dyes, characterized by low production costs, huge quantities, and new optical properties (colors). Dyes that encompass classes of molecules absorbing in the UV-visible part of the electromagnetic spectrum now have a wider range of applications, including coloring (textiles, food, paintings), energy production (photovoltaic cells, OLEDs), or pharmaceuticals (diagnostics, drugs). Parallel to the growth in dye applications, researchers have increased their efforts to design and synthesize new dyes to customize absorption and emission properties. In particular, dyes containing one or more metallic centers allow for the construction of fairly sophisticated systems capable of selectively reacting to light of a given wavelength and behaving as molecular devices (photochemical molecular devices, PMDs).Theoretical tools able to predict and interpret the excited-state properties of organic and inorganic dyes allow for an efficient screening of photochemical centers. In this Account, we report recent developments defining a quantitative ab initio protocol (based on time-dependent density functional theory) for modeling dye spectral properties. In particular, we discuss the importance of several parameters, such as the methods used for electronic structure calculations, solvent effects, and statistical treatments. In addition, we illustrate the performance of such simulation tools through case studies. We also comment on current weak points of these methods and ways to improve them.

  12. ADP (Automated Data Processing) cost estimating heuristics

    SciTech Connect

    Sadlowe, A.R.; Arrowood, L.F.; Jones, K.A.; Emrich, M.L.; Watson, B.D.

    1987-09-11

    Artificial Intelligence, in particular expert systems methodologies, is being applied to the US Navy's Automated Data Processing estimating tasks. Skilled Navy project leaders are nearing retirement; replacements may not yet possess the many years of experience required to make accurate decisions regarding time, cost, equipment, and personnel needs. The potential departure of expertise resulted in the development of a system to capture organizational expertise. The prototype allows inexperienced project leaders to interactively generate cost estimates. 5 refs.

  13. Equipment Cost Estimator

    SciTech Connect

    2016-08-24

    The ECE application forecasts annual costs of preventive and corrective maintenance for budgeting purposes. Features within the application enable the user to change the specifications of the model to customize the forecast to best fit their needs and support "what if" analysis. Based on the user's selections, the ECE model forecasts annual maintenance costs. Preventive maintenance costs include the cost of labor to perform preventive maintenance activities at the specified frequency and labor rate. Corrective maintenance costs include the cost of labor and the cost of replacement parts. The application presents forecasted maintenance costs for the next five years in two tables: costs by year and costs by site.
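
    A hedged sketch of the kind of forecast described: preventive costs as PM events × labor hours × rate, corrective costs as expected failures × (labor + parts), tabulated over five years. The field names and figures are illustrative and do not come from the ECE application.

```python
# Hedged sketch of an annual maintenance cost forecast. Preventive cost =
# annual PM events x labor hours x rate; corrective cost = expected failures
# x (labor + parts). All names and numbers are illustrative assumptions.
def forecast_maintenance(pm_per_year, pm_hours, cm_per_year, cm_hours,
                         parts_cost, labor_rate, years=5):
    rows = []
    for year in range(1, years + 1):
        preventive = pm_per_year * pm_hours * labor_rate
        corrective = cm_per_year * (cm_hours * labor_rate + parts_cost)
        rows.append({"year": year, "preventive": preventive,
                     "corrective": corrective,
                     "total": preventive + corrective})
    return rows

for row in forecast_maintenance(pm_per_year=4, pm_hours=2.0, cm_per_year=1.5,
                                cm_hours=6.0, parts_cost=800.0, labor_rate=95.0):
    print(row)
```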

  14. Cost-estimating relationships for space programs

    NASA Technical Reports Server (NTRS)

    Mandell, Humboldt C., Jr.

    1992-01-01

    Cost-estimating relationships (CERs) are defined and discussed as they relate to the estimation of theoretical costs for space programs. The paper primarily addresses CERs based on analogous relationships between physical and performance parameters to estimate future costs. Analytical estimation principles are reviewed, examining the sources of errors in cost models, and the use of CERs is shown to be affected by organizational culture. Two paradigms for cost estimation are set forth: (1) the Rand paradigm for single-culture single-system methods; and (2) the Price paradigms that incorporate a set of cultural variables. For space programs that are potentially subject to even small cultural changes, the Price paradigms are argued to be more effective. The derivation and use of accurate CERs is important for developing effective cost models to analyze the potential of a given space program.
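
    A generic illustration of a CER of the common power-law form cost = a · x^b, fitted in log space to a handful of analogous historical points; the data are invented, and this is not the Rand or Price paradigm discussed in the paper.

```python
# Generic illustration of a cost-estimating relationship (CER): a power law
# cost = a * mass^b fitted in log space to historical data points. The data
# are made up; the fitted law is then used to estimate a new program's cost.
import numpy as np

mass = np.array([250.0, 500.0, 1200.0, 3000.0])     # kg, hypothetical programs
cost = np.array([40.0, 70.0, 150.0, 320.0])         # $M, hypothetical

b, log_a = np.polyfit(np.log(mass), np.log(cost), 1)
a = np.exp(log_a)
print(f"CER: cost ~ {a:.2f} * mass^{b:.2f}")
print("predicted cost for 2000 kg:", round(a * 2000.0 ** b, 1), "$M")
```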

  15. Higher order accurate partial implicitization: An unconditionally stable fourth-order-accurate explicit numerical technique

    NASA Technical Reports Server (NTRS)

    Graves, R. A., Jr.

    1975-01-01

    The previously obtained second-order-accurate partial implicitization numerical technique used in the solution of fluid dynamic problems was modified with little complication to achieve fourth-order accuracy. The von Neumann stability analysis demonstrated the unconditional linear stability of the technique. The order of the truncation error was deduced from the Taylor series expansions of the linearized difference equations and was verified by numerical solutions to Burgers' equation. For comparison, results were also obtained for Burgers' equation using a second-order-accurate partial-implicitization scheme, as well as the fourth-order scheme of Kreiss.
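
    The partial-implicitization schemes themselves are not reproduced here; as a point of reference, the sketch below is a plain first-order upwind explicit solver for the inviscid Burgers' equation u_t + u u_x = 0, the test problem named in the abstract. The grid, time step and initial profile are assumptions.

```python
# Not the paper's partial-implicitization scheme: a plain first-order upwind
# explicit solver for the inviscid Burgers' equation u_t + u u_x = 0, of the
# kind higher-order schemes are benchmarked against. Periodic boundaries and
# u > 0 everywhere are assumed so that upwinding to the left is valid.
import numpy as np

def burgers_upwind(u0, dx, dt, steps):
    u = u0.copy()
    for _ in range(steps):
        u = u - dt / dx * u * (u - np.roll(u, 1))   # periodic, upwind for u > 0
    return u

x = np.linspace(0.0, 1.0, 101)
u0 = 1.0 + 0.5 * np.sin(2 * np.pi * x)   # smooth initial profile, u > 0
u = burgers_upwind(u0, dx=x[1] - x[0], dt=0.004, steps=100)
print(u.min(), u.max())                  # profile steepens while staying bounded
```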

  16. Development of a practical costing method for hospitals.

    PubMed

    Cao, Pengyu; Toyabe, Shin-Ichi; Akazawa, Kouhei

    2006-03-01

    To realize effective cost control, a practical and accurate cost accounting system is indispensable in hospitals. Among traditional cost accounting systems, volume-based costing (VBC) is the most popular method. In this method, the indirect costs are allocated to each cost object (services or units of a hospital) using a single indicator named a cost driver (e.g., labor hours, revenues or the number of patients). However, this method often yields rough and inaccurate results. The activity based costing (ABC) method introduced in the mid-1990s can provide more accurate results. With the ABC method, all events or transactions that cause costs are recognized as "activities", and a specific cost driver is prepared for each activity. Finally, the costs of activities are allocated to cost objects by the corresponding cost driver. However, it is much more complex and costly than other traditional cost accounting methods because the data collection for cost drivers is not always easy. In this study, we developed a simplified ABC (S-ABC) costing method to reduce the workload of ABC costing by reducing the number of cost drivers used in the ABC method. Using the S-ABC method, we estimated the cost of laboratory tests and obtained results of similar accuracy to the ABC method (the largest difference was 2.64%). At the same time, this new method reduces the seven cost drivers used in the ABC method to four. Moreover, we performed an evaluation using other sample data from the physiological laboratory department to confirm the effectiveness of this new method. In conclusion, the S-ABC method provides two advantages in comparison to the VBC and ABC methods: (1) it can obtain accurate results, and (2) it is simpler to perform. Once we reduce the number of cost drivers by applying the proposed S-ABC method to the data for the ABC method, we can easily perform the cost accounting using few cost drivers after the second round of costing.
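
    A hedged sketch of the core ABC allocation step described above: each activity's pooled cost is spread over cost objects in proportion to their consumption of that activity's driver. The activities, drivers and figures are invented and are not the laboratory drivers used in the study.

```python
# Hedged sketch of activity-based cost allocation: each activity's pooled
# cost is allocated to cost objects in proportion to driver consumption.
def abc_allocate(activity_costs, driver_usage):
    """activity_costs: {activity: pooled cost}
       driver_usage: {activity: {cost_object: driver units consumed}}"""
    allocated = {}
    for activity, pool in activity_costs.items():
        usage = driver_usage[activity]
        total_units = sum(usage.values())
        for obj, units in usage.items():
            allocated[obj] = allocated.get(obj, 0.0) + pool * units / total_units
    return allocated

activity_costs = {"specimen handling": 12000.0, "equipment setup": 8000.0}
driver_usage = {
    "specimen handling": {"test A": 300, "test B": 700},   # driver: specimens
    "equipment setup":   {"test A": 40,  "test B": 10},    # driver: setups
}
print(abc_allocate(activity_costs, driver_usage))
```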

  17. Keeping Public Information Public.

    ERIC Educational Resources Information Center

    Kelley, Wayne P.

    1998-01-01

    Discusses the trend toward the transfer of federal government information from the public domain to the private sector. Topics include free access, privatization, information-policy revision, accountability, copyright issues, costs, pricing, and market needs versus public needs. (LRW)

  18. Cost accounting for end-of-life care: recommendations to the field by the Cost Accounting Workgroup.

    PubMed

    Seninger, Stephen; Smith, Dean G

    2004-01-01

    Accurate measurement of economic costs is prerequisite to progress in improving the care delivered to Americans during the last stage of life. The Robert Wood Johnson Excellence in End-of-Life Care national program assembled a Cost Accounting Workgroup to identify accurate and meaningful methods to measure palliative and end-of-life health care use and costs. Eight key issues were identified: (1) planning the cost analysis; (2) identifying the perspective for cost analysis; (3) describing the end-of-life care program; (4) identifying the appropriate comparison group; (5) defining the period of care to be studied; (6) identifying the units of health care services; (7) assigning monetary values to health care service units; and (8) calculating costs. Economic principles of cost measurement and cost measurement issues encountered by practitioners were reviewed and incorporated into a set of recommendations.

  19. The cost of linearization

    NASA Astrophysics Data System (ADS)

    Morel, Danielle; Levy, William B.

    2006-03-01

    Information processing in the brain is metabolically expensive and energy usage by the different components of the nervous system is not well understood. In a continuing effort to explore the costs and constraints of information processing at the single neuron level, dendritic processes are being studied. More specifically, the role of various ion channel conductances is explored in terms of integrating dendritic excitatory synaptic input. Biophysical simulations of dendritic behavior show that the complexity of voltage-dependent, non-linear dendritic conductances can produce simplicity in the form of linear synaptic integration. Over increasing levels of synaptic activity, it is shown that two types of voltage-dependent conductances produce linearization over a limited range. This range is determined by the parameters defining the ion channel and the 'passive' properties of the dendrite. A persistent sodium and a transient A-type potassium channel were considered at steady-state transmembrane potentials in the vicinity of and hyperpolarized to the threshold for action potential initiation. The persistent sodium is seen to amplify and linearize the synaptic input over a short range of low synaptic activity. In contrast, the A-type potassium channel has a broader linearization range but tends to operate at higher levels of synaptic bombardment. Given equivalent 'passive' dendritic properties, the persistent sodium is found to be less costly than the A-type potassium in linearizing synaptic input.

  20. A new approach to compute accurate velocity of meteors

    NASA Astrophysics Data System (ADS)

    Egal, Auriane; Gural, Peter; Vaubaillon, Jeremie; Colas, Francois; Thuillot, William

    2016-10-01

    The CABERNET project was designed to push the limits of meteoroid orbit measurements by improving the determination of the meteors' velocities. Indeed, despite the development of camera networks dedicated to the observation of meteors, there is still an important discrepancy between the measured orbits of meteoroids and theoretical results. The gap between the observed and theoretical semi-major axes of the orbits is especially significant; an accurate determination of the orbits of meteoroids therefore largely depends on the computation of the pre-atmospheric velocities. It is then imperative to find out how to increase the precision of the velocity measurements. In this work, we perform an analysis of different methods currently used to compute the velocities and trajectories of meteors. They are based on the intersecting planes method developed by Ceplecha (1987), the least squares method of Borovicka (1990), and the multi-parameter fitting (MPF) method published by Gural (2012). In order to objectively compare the performances of these techniques, we have simulated realistic meteors ('fakeors') reproducing the different measurement errors of many camera networks. Some fakeors are built following the propagation models studied by Gural (2012), and others are created by numerical integrations using the Borovicka et al. 2007 model. Different optimization techniques have also been investigated in order to pick the most suitable one to solve the MPF, and the influence of the geometry of the trajectory on the result is also presented. We will present here the results of an improved implementation of the multi-parameter fitting that allows accurate orbit computation of meteors with CABERNET. The comparison of different velocity computations seems to show that while the MPF is by far the best method to solve for the trajectory and the velocity of a meteor, the ill-conditioning of the cost functions used can lead to large estimate errors for noisy

  1. Accurate Modeling of Scaffold Hopping Transformations in Drug Discovery.

    PubMed

    Wang, Lingle; Deng, Yuqing; Wu, Yujie; Kim, Byungchan; LeBard, David N; Wandschneider, Dan; Beachy, Mike; Friesner, Richard A; Abel, Robert

    2017-01-10

    The accurate prediction of protein-ligand binding free energies remains a significant challenge of central importance in computational biophysics and structure-based drug design. Multiple recent advances including the development of greatly improved protein and ligand molecular mechanics force fields, more efficient enhanced sampling methods, and low-cost powerful GPU computing clusters have enabled accurate and reliable predictions of relative protein-ligand binding free energies through the free energy perturbation (FEP) methods. However, the existing FEP methods can only be used to calculate the relative binding free energies for R-group modifications or single-atom modifications and cannot be used to efficiently evaluate scaffold hopping modifications to a lead molecule. Scaffold hopping or core hopping, a very common design strategy in drug discovery projects, is critical not only in the early stages of a discovery campaign where novel active matter must be identified but also in lead optimization where the resolution of a variety of ADME/Tox problems may require identification of a novel core structure. In this paper, we introduce a method that enables theoretically rigorous, yet computationally tractable, relative protein-ligand binding free energy calculations to be pursued for scaffold hopping modifications. We apply the method to six pharmaceutically interesting cases where diverse types of scaffold hopping modifications were required to identify the drug molecules ultimately sent into the clinic. For these six diverse cases, the predicted binding affinities were in close agreement with experiment, demonstrating the wide applicability and the significant impact Core Hopping FEP may provide in drug discovery projects.

  2. Intrinsic Valuation of Information in Decision Making under Uncertainty

    PubMed Central

    Bode, Stefan; Brydevall, Maja; Murawski, Carsten

    2016-01-01

    In a dynamic world, an accurate model of the environment is vital for survival, and agents ought regularly to seek out new information with which to update their world models. This aspect of behaviour is not captured well by classical theories of decision making, and the cognitive mechanisms of information seeking are poorly understood. In particular, it is not known whether information is valued only for its instrumental use, or whether humans also assign it a non-instrumental intrinsic value. To address this question, the present study assessed preference for non-instrumental information among 80 healthy participants in two experiments. Participants performed a novel information preference task in which they could choose to pay a monetary cost to receive advance information about the outcome of a monetary lottery. Importantly, acquiring information did not alter lottery outcome probabilities. We found that participants were willing to incur considerable monetary costs to acquire payoff-irrelevant information about the lottery outcome. This behaviour was well explained by a computational cognitive model in which information preference resulted from aversion to temporally prolonged uncertainty. These results strongly suggest that humans assign an intrinsic value to information in a manner inconsistent with normative accounts of decision making under uncertainty. This intrinsic value may be associated with adaptive behaviour in real-world environments by producing a bias towards exploratory and information-seeking behaviour. PMID:27416034

  3. Intrinsic Valuation of Information in Decision Making under Uncertainty.

    PubMed

    Bennett, Daniel; Bode, Stefan; Brydevall, Maja; Warren, Hayley; Murawski, Carsten

    2016-07-01

    In a dynamic world, an accurate model of the environment is vital for survival, and agents ought regularly to seek out new information with which to update their world models. This aspect of behaviour is not captured well by classical theories of decision making, and the cognitive mechanisms of information seeking are poorly understood. In particular, it is not known whether information is valued only for its instrumental use, or whether humans also assign it a non-instrumental intrinsic value. To address this question, the present study assessed preference for non-instrumental information among 80 healthy participants in two experiments. Participants performed a novel information preference task in which they could choose to pay a monetary cost to receive advance information about the outcome of a monetary lottery. Importantly, acquiring information did not alter lottery outcome probabilities. We found that participants were willing to incur considerable monetary costs to acquire payoff-irrelevant information about the lottery outcome. This behaviour was well explained by a computational cognitive model in which information preference resulted from aversion to temporally prolonged uncertainty. These results strongly suggest that humans assign an intrinsic value to information in a manner inconsistent with normative accounts of decision making under uncertainty. This intrinsic value may be associated with adaptive behaviour in real-world environments by producing a bias towards exploratory and information-seeking behaviour.

  4. Does a pneumotach accurately characterize voice function?

    NASA Astrophysics Data System (ADS)

    Walters, Gage; Krane, Michael

    2016-11-01

    A study is presented which addresses how a pneumotach might adversely affect clinical measurements of voice function. A pneumotach is a device, typically a mask, worn over the mouth, in order to measure time-varying glottal volume flow. By measuring the time-varying difference in pressure across a known aerodynamic resistance element in the mask, the glottal volume flow waveform is estimated. Because it adds aerodynamic resistance to the vocal system, there is some concern that using a pneumotach may not accurately portray the behavior of the voice. To test this hypothesis, experiments were performed in a simplified airway model with the principal dimensions of an adult human upper airway. A compliant constriction, fabricated from silicone rubber, modeled the vocal folds. Variations of transglottal pressure, time-averaged volume flow, model vocal fold vibration amplitude, and radiated sound with subglottal pressure were performed, with and without the pneumotach in place, and differences noted. The authors acknowledge support of NIH Grant 2R01DC005642-10A1.

  5. Accurate method for computing correlated color temperature.

    PubMed

    Li, Changjun; Cui, Guihua; Melgosa, Manuel; Ruan, Xiukai; Zhang, Yaoju; Ma, Long; Xiao, Kaida; Luo, M Ronnier

    2016-06-27

    For the correlated color temperature (CCT) of a light source to be estimated, a nonlinear optimization problem must be solved. In all previous methods available to compute CCT, the objective function has only been approximated, and their predictions have achieved limited accuracy. For example, different unacceptable CCT values have been predicted for light sources located on the same isotemperature line. In this paper, we propose to compute CCT using the Newton method, which requires the first and second derivatives of the objective function. Following the current recommendation by the International Commission on Illumination (CIE) for the computation of tristimulus values (summations at 1 nm steps from 360 nm to 830 nm), the objective function and its first and second derivatives are explicitly given and used in our computations. Comprehensive tests demonstrate that the proposed method, together with an initial estimation of CCT using Robertson's method [J. Opt. Soc. Am. 58, 1528-1535 (1968)], gives highly accurate predictions below 0.0012 K for light sources with CCTs ranging from 500 K to 10^6 K.
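
    A generic sketch of the numerical skeleton only: a Newton iteration that minimizes a smooth one-dimensional objective f(T) using finite-difference first and second derivatives. The actual CIE objective (chromaticity distance to the Planckian locus built from 1 nm summations over 360-830 nm) and Robertson's initial estimate are not reproduced; the stand-in objective and step size below are assumptions.

```python
# Generic Newton iteration for minimizing a smooth 1-D objective f(T). The
# real CCT objective is not reproduced here; f below is a stand-in with a
# known minimum, and positive curvature near the minimum is assumed.
def newton_minimize(f, t0, h=1.0, tol=1e-6, max_iter=50):
    t = t0
    for _ in range(max_iter):
        d1 = (f(t + h) - f(t - h)) / (2 * h)            # first derivative
        d2 = (f(t + h) - 2 * f(t) + f(t - h)) / h ** 2  # second derivative
        step = d1 / d2
        t -= step
        if abs(step) < tol:
            break
    return t

# Stand-in objective with a known minimum at 6504 K.
f = lambda T: (T - 6504.0) ** 2 / 1e6
print(round(newton_minimize(f, t0=5000.0), 3))   # converges to 6504.0
```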

  6. Accurate Theoretical Thermochemistry for Fluoroethyl Radicals.

    PubMed

    Ganyecz, Ádám; Kállay, Mihály; Csontos, József

    2017-02-09

    An accurate coupled-cluster (CC) based model chemistry was applied to calculate reliable thermochemical quantities for hydrofluorocarbon derivatives including radicals 1-fluoroethyl (CH3-CHF), 1,1-difluoroethyl (CH3-CF2), 2-fluoroethyl (CH2F-CH2), 1,2-difluoroethyl (CH2F-CHF), 2,2-difluoroethyl (CHF2-CH2), 2,2,2-trifluoroethyl (CF3-CH2), 1,2,2,2-tetrafluoroethyl (CF3-CHF), and pentafluoroethyl (CF3-CF2). The model chemistry used contains iterative triple and perturbative quadruple excitations in CC theory, as well as scalar relativistic and diagonal Born-Oppenheimer corrections. To obtain heat of formation values with better than chemical accuracy, perturbative quadruple excitations and scalar relativistic corrections were indispensable. Their contributions to the heats of formation steadily increase with the number of fluorine atoms in the radical, reaching 10 kJ/mol for CF3-CF2. When discrepancies were found between the experimental values and ours, it was always possible to resolve the issue by recalculating the experimental result with currently recommended auxiliary data. For each radical studied here this study delivers the best heat of formation as well as entropy data.

  7. Accurate, reliable prototype earth horizon sensor head

    NASA Technical Reports Server (NTRS)

    Schwarz, F.; Cohen, H.

    1973-01-01

    The design and performance are described of an accurate and reliable prototype earth sensor head (ARPESH). The ARPESH employs a detection logic 'locator' concept and horizon sensor mechanization which should lead to high-accuracy horizon sensing that is minimally degraded by spatial or temporal variations in sensing attitude from a satellite in orbit around the earth at altitudes in the 500 km range. An accuracy of horizon location to within 0.7 km has been predicted, independent of meteorological conditions. This corresponds to an error of 0.015 deg at 500 km altitude. Laboratory evaluation of the sensor indicates that this accuracy is achieved. First, the basic operating principles of ARPESH are described; next, detailed design and construction data are presented; and then the performance of the sensor is reported under laboratory conditions, in which the sensor is installed in a simulator that permits it to scan over a blackbody source against a background representing the earth-space interface for various equivalent planet temperatures.

  8. Accurate methods for large molecular systems.

    PubMed

    Gordon, Mark S; Mullin, Jonathan M; Pruitt, Spencer R; Roskop, Luke B; Slipchenko, Lyudmila V; Boatz, Jerry A

    2009-07-23

    Three exciting new methods that address the accurate prediction of processes and properties of large molecular systems are discussed. The systematic fragmentation method (SFM) and the fragment molecular orbital (FMO) method both decompose a large molecular system (e.g., protein, liquid, zeolite) into small subunits (fragments) in very different ways that are designed to both retain the high accuracy of the chosen quantum mechanical level of theory while greatly reducing the demands on computational time and resources. Each of these methods is inherently scalable and is therefore eminently capable of taking advantage of massively parallel computer hardware while retaining the accuracy of the corresponding electronic structure method from which it is derived. The effective fragment potential (EFP) method is a sophisticated approach for the prediction of nonbonded and intermolecular interactions. Therefore, the EFP method provides a way to further reduce the computational effort while retaining accuracy by treating the far-field interactions in place of the full electronic structure method. The performance of the methods is demonstrated using applications to several systems, including benzene dimer, small organic species, pieces of the alpha helix, water, and ionic liquids.

  9. Accurate equilibrium structures for piperidine and cyclohexane.

    PubMed

    Demaison, Jean; Craig, Norman C; Groner, Peter; Écija, Patricia; Cocinero, Emilio J; Lesarri, Alberto; Rudolph, Heinz Dieter

    2015-03-05

    Extended and improved microwave (MW) measurements are reported for the isotopologues of piperidine. New ground state (GS) rotational constants are fitted to MW transitions with quartic centrifugal distortion constants taken from ab initio calculations. Predicate values for the geometric parameters of piperidine and cyclohexane are found from a high level of ab initio theory including adjustments for basis set dependence and for correlation of the core electrons. Equilibrium rotational constants are obtained from GS rotational constants corrected for vibration-rotation interactions and electronic contributions. Equilibrium structures for piperidine and cyclohexane are fitted by the mixed estimation method. In this method, structural parameters are fitted concurrently to predicate parameters (with appropriate uncertainties) and moments of inertia (with uncertainties). The new structures are regarded as being accurate to 0.001 Å and 0.2°. Comparisons are made between bond parameters in equatorial piperidine and cyclohexane. Another interesting result of this study is that a structure determination is an effective way to check the accuracy of the ground state experimental rotational constants.

  10. Noninvasive hemoglobin monitoring: how accurate is enough?

    PubMed

    Rice, Mark J; Gravenstein, Nikolaus; Morey, Timothy E

    2013-10-01

    Evaluating the accuracy of medical devices has traditionally been a blend of statistical analyses, at times without contextualizing the clinical application. There have been a number of recent publications on the accuracy of a continuous noninvasive hemoglobin measurement device, the Masimo Radical-7 Pulse Co-oximeter, focusing on the traditional statistical metrics of bias and precision. In this review, which contains material presented at the Innovations and Applications of Monitoring Perfusion, Oxygenation, and Ventilation (IAMPOV) Symposium at Yale University in 2012, we critically investigated these metrics as applied to the new technology, exploring what is required of a noninvasive hemoglobin monitor and whether the conventional statistics adequately answer our questions about clinical accuracy. We discuss the glucose error grid, well known in the glucose monitoring literature, and describe an analogous version for hemoglobin monitoring. This hemoglobin error grid can be used to evaluate the required clinical accuracy (±g/dL) of a hemoglobin measurement device to provide more conclusive evidence on whether to transfuse an individual patient. The important decision to transfuse a patient usually requires both an accurate hemoglobin measurement and a physiologic reason to elect transfusion. It is our opinion that the published accuracy data of the Masimo Radical-7 is not good enough to make the transfusion decision.

  11. Fast and accurate exhaled breath ammonia measurement.

    PubMed

    Solga, Steven F; Mudalel, Matthew L; Spacek, Lisa A; Risby, Terence H

    2014-06-11

    This exhaled breath ammonia method uses a fast and highly sensitive spectroscopic method known as quartz enhanced photoacoustic spectroscopy (QEPAS) that uses a quantum cascade based laser. The monitor is coupled to a sampler that measures mouth pressure and carbon dioxide. The system is temperature controlled and specifically designed to address the reactivity of this compound. The sampler provides immediate feedback to the subject and the technician on the quality of the breath effort. Together with the quick response time of the monitor, this system is capable of accurately measuring exhaled breath ammonia representative of deep lung systemic levels. Because the system is easy to use and produces real time results, it has enabled experiments to identify factors that influence measurements. For example, mouth rinse and oral pH reproducibly and significantly affect results and therefore must be controlled. Temperature and mode of breathing are other examples. As our understanding of these factors evolves, error is reduced, and clinical studies become more meaningful. This system is very reliable and individual measurements are inexpensive. The sampler is relatively inexpensive and quite portable, but the monitor is neither. This limits options for some clinical studies and provides rational for future innovations.

  12. Accurate Fission Data for Nuclear Safety

    NASA Astrophysics Data System (ADS)

    Solders, A.; Gorelov, D.; Jokinen, A.; Kolhinen, V. S.; Lantz, M.; Mattera, A.; Penttilä, H.; Pomp, S.; Rakopoulos, V.; Rinta-Antila, S.

    2014-05-01

    The Accurate fission data for nuclear safety (AlFONS) project aims at high precision measurements of fission yields, using the renewed IGISOL mass separator facility in combination with a new high current light ion cyclotron at the University of Jyväskylä. The 30 MeV proton beam will be used to create fast and thermal neutron spectra for the study of neutron induced fission yields. Thanks to a series of mass separating elements, culminating with the JYFLTRAP Penning trap, it is possible to achieve a mass resolving power on the order of a few hundred thousand. In this paper we present the experimental setup and the design of a neutron converter target for IGISOL. The goal is to have a flexible design. For studies of exotic nuclei far from stability a high neutron flux (10^12 neutrons/s) at energies 1 - 30 MeV is desired, while for reactor applications neutron spectra that resemble those of thermal and fast nuclear reactors are preferred. It is also desirable to be able to produce (semi-)monoenergetic neutrons for benchmarking and to study the energy dependence of fission yields. The scientific program is extensive and is planned to start in 2013 with a measurement of isomeric yield ratios of proton induced fission in uranium. This will be followed by studies of independent yields of thermal and fast neutron induced fission of various actinides.

  13. A new and accurate continuum description of moving fronts

    NASA Astrophysics Data System (ADS)

    Johnston, S. T.; Baker, R. E.; Simpson, M. J.

    2017-03-01

    Processes that involve moving fronts of populations are prevalent in ecology and cell biology. A common approach to describe these processes is a lattice-based random walk model, which can include mechanisms such as crowding, birth, death, movement and agent–agent adhesion. However, these models are generally analytically intractable and it is computationally expensive to perform sufficiently many realisations of the model to obtain an estimate of average behaviour that is not dominated by random fluctuations. To avoid these issues, both mean-field (MF) and corrected mean-field (CMF) continuum descriptions of random walk models have been proposed. However, both continuum descriptions are inaccurate outside of limited parameter regimes, and CMF descriptions cannot be employed to describe moving fronts. Here we present an alternative description in terms of the dynamics of groups of contiguous occupied lattice sites and contiguous vacant lattice sites. Our description provides an accurate prediction of the average random walk behaviour in all parameter regimes. Critically, our description accurately predicts the persistence or extinction of the population in situations where previous continuum descriptions predict the opposite outcome. Furthermore, unlike traditional MF models, our approach provides information about the spatial clustering within the population and, subsequently, the moving front.
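
    For context, a minimal sketch of the kind of lattice-based random walk that such continuum descriptions approximate: agents move and proliferate on a one-dimensional lattice subject to crowding (exclusion), and the averaged occupancy forms a moving front. The grid size, rates and seeding below are illustrative assumptions; the authors' contiguous-group description itself is not implemented here.

```python
# Minimal 1-D lattice random walk with crowding (exclusion) and proliferation,
# the discrete model whose averaged occupancy shows a moving front.
import random

def simulate_front(length=200, seeded=10, motility=1.0, prolif=0.05, steps=400):
    occ = [i < seeded for i in range(length)]          # occupied left edge
    for _ in range(steps):
        agents = [i for i, o in enumerate(occ) if o]
        random.shuffle(agents)                          # random update order
        for i in agents:
            if random.random() < motility:              # attempt a move
                j = i + random.choice((-1, 1))
                if 0 <= j < length and not occ[j]:      # crowding: exclusion
                    occ[i], occ[j] = False, True
                    i = j
            if random.random() < prolif:                # attempt a birth
                j = i + random.choice((-1, 1))
                if 0 <= j < length and not occ[j]:
                    occ[j] = True
    return occ

random.seed(1)
occ = simulate_front()
print("front position ~", max(i for i, o in enumerate(occ) if o))
```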

  14. Accurate three-dimensional documentation of distinct sites

    NASA Astrophysics Data System (ADS)

    Singh, Mahesh K.; Dutta, Ashish; Subramanian, Venkatesh K.

    2017-01-01

    One of the most critical aspects of documenting distinct sites is acquiring detailed and accurate range information. Several three-dimensional (3-D) acquisition techniques are available, but each has its own limitations. This paper presents a range data fusion method with the aim to enhance the descriptive contents of the entire 3-D reconstructed model. A kernel function is introduced for supervised classification of the range data using a kernelized support vector machine. The classification method is based on the local saliency features of the acquired range data. The range data acquired from heterogeneous range sensors are transformed into a defined common reference frame. Based on the segmentation criterion, the fusion of range data is performed by integrating finer regions of range data acquired from a laser range scanner with the coarser region of Kinect's range data. After fusion, the Delaunay triangulation algorithm is applied to generate the highly accurate, realistic 3-D model of the scene. Finally, experimental results show the robustness of the proposed approach.

  15. Mouse models of human AML accurately predict chemotherapy response

    PubMed Central

    Zuber, Johannes; Radtke, Ina; Pardee, Timothy S.; Zhao, Zhen; Rappaport, Amy R.; Luo, Weijun; McCurrach, Mila E.; Yang, Miao-Miao; Dolan, M. Eileen; Kogan, Scott C.; Downing, James R.; Lowe, Scott W.

    2009-01-01

    The genetic heterogeneity of cancer influences the trajectory of tumor progression and may underlie clinical variation in therapy response. To model such heterogeneity, we produced genetically and pathologically accurate mouse models of common forms of human acute myeloid leukemia (AML) and developed methods to mimic standard induction chemotherapy and efficiently monitor therapy response. We see that murine AMLs harboring two common human AML genotypes show remarkably diverse responses to conventional therapy that mirror clinical experience. Specifically, murine leukemias expressing the AML1/ETO fusion oncoprotein, associated with a favorable prognosis in patients, show a dramatic response to induction chemotherapy owing to robust activation of the p53 tumor suppressor network. Conversely, murine leukemias expressing MLL fusion proteins, associated with a dismal prognosis in patients, are drug-resistant due to an attenuated p53 response. Our studies highlight the importance of genetic information in guiding the treatment of human AML, functionally establish the p53 network as a central determinant of chemotherapy response in AML, and demonstrate that genetically engineered mouse models of human cancer can accurately predict therapy response in patients. PMID:19339691

  16. Mouse models of human AML accurately predict chemotherapy response.

    PubMed

    Zuber, Johannes; Radtke, Ina; Pardee, Timothy S; Zhao, Zhen; Rappaport, Amy R; Luo, Weijun; McCurrach, Mila E; Yang, Miao-Miao; Dolan, M Eileen; Kogan, Scott C; Downing, James R; Lowe, Scott W

    2009-04-01

    The genetic heterogeneity of cancer influences the trajectory of tumor progression and may underlie clinical variation in therapy response. To model such heterogeneity, we produced genetically and pathologically accurate mouse models of common forms of human acute myeloid leukemia (AML) and developed methods to mimic standard induction chemotherapy and efficiently monitor therapy response. We see that murine AMLs harboring two common human AML genotypes show remarkably diverse responses to conventional therapy that mirror clinical experience. Specifically, murine leukemias expressing the AML1/ETO fusion oncoprotein, associated with a favorable prognosis in patients, show a dramatic response to induction chemotherapy owing to robust activation of the p53 tumor suppressor network. Conversely, murine leukemias expressing MLL fusion proteins, associated with a dismal prognosis in patients, are drug-resistant due to an attenuated p53 response. Our studies highlight the importance of genetic information in guiding the treatment of human AML, functionally establish the p53 network as a central determinant of chemotherapy response in AML, and demonstrate that genetically engineered mouse models of human cancer can accurately predict therapy response in patients.

  17. Accurate and Robust Genomic Prediction of Celiac Disease Using Statistical Learning

    PubMed Central

    Abraham, Gad; Tye-Din, Jason A.; Bhalala, Oneil G.; Kowalczyk, Adam; Zobel, Justin; Inouye, Michael

    2014-01-01

    Practical application of genomic-based risk stratification to clinical diagnosis is appealing, yet performance varies widely depending on the disease and genomic risk score (GRS) method. Celiac disease (CD), a common immune-mediated illness, is strongly genetically determined and requires specific HLA haplotypes. HLA testing can exclude diagnosis but has low specificity, providing little information suitable for clinical risk stratification. Using six European cohorts, we provide a proof-of-concept that statistical learning approaches which simultaneously model all SNPs can generate robust and highly accurate predictive models of CD based on genome-wide SNP profiles. The high predictive capacity replicated both in cross-validation within each cohort (AUC of 0.87–0.89) and in independent replication across cohorts (AUC of 0.86–0.9), despite differences in ethnicity. The models explained 30–35% of disease variance and up to ∼43% of heritability. The GRS's utility was assessed in different clinically relevant settings. Comparable to HLA typing, the GRS can be used to identify individuals without CD with ≥99.6% negative predictive value; however, unlike HLA typing, fine-scale stratification of individuals into categories of higher risk for CD can identify those that would benefit from more invasive and costly definitive testing. The GRS is flexible and its performance can be adapted to the clinical situation by adjusting the threshold cut-off. Despite explaining a minority of disease heritability, our findings indicate a genomic risk score provides clinically relevant information to improve upon current diagnostic pathways for CD and support further studies evaluating the clinical utility of this approach in CD and other complex diseases. PMID:24550740
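
    A minimal sketch of the general workflow (a sparse penalized model over all SNPs, scored on held-out samples), assuming simulated genotype dosages and an L1-penalized logistic regression from scikit-learn as stand-ins; this is not the authors' exact model, data or validation design.

```python
# Sketch of a genomic risk score: fit a sparse penalized model over all SNPs
# and evaluate its discrimination on held-out samples. Data are simulated and
# the hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p, causal = 2000, 500, 20
X = rng.integers(0, 3, size=(n, p)).astype(float)        # 0/1/2 genotype dosages
beta = np.zeros(p)
beta[:causal] = rng.normal(0.0, 0.4, causal)             # a few causal SNPs
lin = X @ beta
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(lin - lin.mean()))))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
grs = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X_tr, y_tr)
score = grs.decision_function(X_te)                      # the genomic risk score
print("held-out AUC:", round(roc_auc_score(y_te, score), 3))
```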

  18. Accurate response surface approximations for weight equations based on structural optimization

    NASA Astrophysics Data System (ADS)

    Papila, Melih

    Accurate weight prediction methods are vitally important for aircraft design optimization. Therefore, designers seek weight prediction techniques with low computational cost and high accuracy, and usually require a compromise between the two. The compromise can be achieved by combining stress analysis and response surface (RS) methodology. While stress analysis provides accurate weight information, RS techniques help to transmit this information effectively to the optimization procedure. The focus of this dissertation is structural weight equations in the form of RS approximations and their accuracy when fitted to results of structural optimizations that are based on finite element analyses. Use of RS methodology filters out the numerical noise in structural optimization results and provides a smooth weight function that can easily be used in gradient-based configuration optimization. In engineering applications, RS approximations of low-order polynomials are widely used, but the weight may not be modeled well by low-order polynomials, leading to bias errors. In addition, some structural optimization results may have high-amplitude errors (outliers) that may severely affect the accuracy of the weight equation. Statistical techniques associated with RS methodology are sought in order to deal with these two difficulties: (1) high-amplitude numerical noise (outliers) and (2) approximation model inadequacy. The investigation starts with reducing approximation error by identifying and repairing outliers. A potential reason for outliers in optimization results is premature convergence, and outliers of such nature may be corrected by employing different convergence settings. It is demonstrated that outlier repair can lead to accuracy improvements over the more standard approach of removing outliers. The adequacy of approximation is then studied by a modified lack-of-fit approach, and RS errors due to the approximation model are reduced by using higher-order polynomials.
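
    As a rough illustration of the response-surface idea described above, the following sketch fits a full quadratic polynomial to noisy "weight" data by least squares and flags high-amplitude residuals. The design variables, the 3-sigma screen, and the refit-without-outliers step are assumptions for demonstration only; the dissertation instead repairs outliers by re-running the structural optimizations with different convergence settings.

      # Hedged sketch: quadratic response-surface fit with a simple outlier screen.
      # Design variables (span, sweep), the weight model, and the injected outliers
      # are all synthetic and illustrative.
      import numpy as np

      rng = np.random.default_rng(1)
      span = rng.uniform(30, 40, 60)           # toy design variables
      sweep = rng.uniform(10, 35, 60)
      weight = 500 + 12*span + 4*sweep + 0.3*span*sweep + rng.normal(0, 20, 60)
      weight[::15] += 300                       # a few "premature convergence" outliers

      def quad_design(x1, x2):
          """Full quadratic model in two variables: 1, x1, x2, x1^2, x2^2, x1*x2."""
          return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])

      X = quad_design(span, sweep)
      beta, *_ = np.linalg.lstsq(X, weight, rcond=None)
      resid = weight - X @ beta

      # Flag high-amplitude residuals; here we simply refit without them, whereas
      # the dissertation argues for repairing (re-optimizing) such points instead.
      mask = np.abs(resid) < 3 * resid.std()
      beta_clean, *_ = np.linalg.lstsq(X[mask], weight[mask], rcond=None)
      print("coefficients before outlier screen:", np.round(beta, 2))
      print("coefficients after outlier screen: ", np.round(beta_clean, 2))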

  19. Cost comparisons for SSC magnet dependent systems

    SciTech Connect

    1985-08-15

    An SSC Cost Estimating Task Force was appointed by the SSC Director in May 1985. The charge to the task force was to perform a detailed review of costs for all superconducting magnet design styles that are under consideration for the SSC. Cost information on five magnet styles was reviewed in detail by the task force members. The basic cost information was developed by participating laboratories and by industry. Details of the procedure and analysis are presented in Chapter III. The purpose of this report is to provide a comparison of all SSC construction project cost information that is dependent on the various magnet styles. It is emphasized that the costs displayed in the tables of this report are not the total costs for an SSC construction project. Only those systems for which costs vary with magnet style are included. In Appendix E, current results are compared with the relevant parts of the 1984 SSC Reference Designs Study (RDS) cost estimate. Following the method used in the RDS, the costs that are developed here are non-site-specific. The labor rates utilized are based on a national average for the various labor categories. The Conventional Systems costs for underground structures are derived from an extension of the "median-site" model as described in the RDS.

  20. Method for Accurate Surface Temperature Measurements During Fast Induction Heating

    NASA Astrophysics Data System (ADS)

    Larregain, Benjamin; Vanderesse, Nicolas; Bridier, Florent; Bocher, Philippe; Arkinson, Patrick

    2013-07-01

    A robust method is proposed for the measurement of surface temperature fields during induction heating. It is based on the original coupling of temperature-indicating lacquers and a high-speed camera system. Image analysis tools have been implemented to automatically extract the temporal evolution of isotherms. This method was applied to the fast induction treatment of a 4340 steel spur gear, allowing the full history of surface isotherms to be accurately documented during sequential heating, i.e., a medium-frequency preheating followed by a high-frequency final heating. Three isotherms, i.e., 704, 816, and 927°C, were acquired every 0.3 ms with a spatial resolution of 0.04 mm per pixel. The information provided by the method is described and discussed. Finally, the transformation temperature Ac1 is linked to the temperature at specific locations of the gear tooth.
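
    A minimal sketch of the isotherm-tracking step, assuming the lacquer's change of appearance can be detected by a simple intensity threshold in each frame. The synthetic frame stack, the threshold value, and the reuse of the 0.04 mm/px and 0.3 ms figures quoted above are illustrative only, not the authors' image-analysis pipeline.

      # Hedged sketch: tracking an isotherm front from a stack of high-speed frames.
      # A synthetic 1-D intensity profile stands in for real camera frames.
      import numpy as np

      mm_per_px = 0.04
      ms_per_frame = 0.3
      n_frames, n_px = 200, 500

      # Synthetic frames: the region past the lacquer's indicating temperature
      # (rendered dark) grows inward from the surface over time.
      frames = np.ones((n_frames, n_px))
      for t in range(n_frames):
          front = int(5 * np.sqrt(t))              # toy heat-penetration law
          frames[t, :front] = 0.2

      threshold = 0.5                               # intensity below this = isotherm reached
      front_px = (frames < threshold).sum(axis=1)   # width of the transformed region, in pixels

      time_ms = np.arange(n_frames) * ms_per_frame
      depth_mm = front_px * mm_per_px
      print("isotherm depth at 30 ms: %.2f mm" % depth_mm[int(30 / ms_per_frame)])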