Sample records for additional computational costs

  1. 20 CFR 404.278 - Additional cost-of-living increase.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Section 404.278 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts Cost-Of-Living Increases § 404.278...) Measuring period for the additional increase—(1) Beginning. To compute the additional increase, we begin...

  2. Computer assisted yarding cost analysis.

    Treesearch

    Ronald W. Mifflin

    1980-01-01

    Programs for a programmable calculator and a desk-top computer are provided for quickly determining yarding cost and comparing the economics of alternative yarding systems. The programs emphasize the importance of the relationship between production rate and machine rate, which is the hourly cost of owning and operating yarding equipment. In addition to generating the...
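
The machine-rate relationship this abstract emphasizes can be sketched in a few lines. The figures and function names below are illustrative assumptions, not a reproduction of Mifflin's programs.

```python
# Illustrative sketch of the machine-rate relationship described above.
# All dollar figures and rates are hypothetical.

def machine_rate(ownership_cost_per_hr, operating_cost_per_hr):
    """Hourly cost of owning and operating a piece of yarding equipment."""
    return ownership_cost_per_hr + operating_cost_per_hr

def yarding_cost_per_unit(rate_per_hr, production_rate_units_per_hr):
    """Cost per unit yarded: machine rate divided by production rate."""
    return rate_per_hr / production_rate_units_per_hr

rate = machine_rate(ownership_cost_per_hr=55.0, operating_cost_per_hr=45.0)  # $100/hr
print(yarding_cost_per_unit(rate, production_rate_units_per_hr=20.0))  # 5.0 ($/unit)
```

At a fixed machine rate, halving the production rate doubles the unit cost, which is why the two quantities must be considered together when comparing yarding systems.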

  3. Quantum ring-polymer contraction method: Including nuclear quantum effects at no additional computational cost in comparison to ab initio molecular dynamics

    NASA Astrophysics Data System (ADS)

    John, Christopher; Spura, Thomas; Habershon, Scott; Kühne, Thomas D.

    2016-04-01

    We present a simple and accurate computational method which facilitates ab initio path-integral molecular dynamics simulations, where the quantum-mechanical nature of the nuclei is explicitly taken into account, at essentially no additional computational cost in comparison to the corresponding calculation using classical nuclei. The predictive power of the proposed quantum ring-polymer contraction method is demonstrated by computing various static and dynamic properties of liquid water at ambient conditions using density functional theory. This development will enable routine inclusion of nuclear quantum effects in ab initio molecular dynamics simulations of condensed-phase systems.

  4. The Additional Costs and Health Effects of a Patient Having Overweight or Obesity: A Computational Model.

    PubMed

    Fallah-Fini, Saeideh; Adam, Atif; Cheskin, Lawrence J; Bartsch, Sarah M; Lee, Bruce Y

    2017-10-01

    This paper estimates specific additional disease outcomes and costs that could be prevented by helping a patient go from an obesity or overweight category to a normal weight category at different ages. This information could help physicians, other health care workers, patients, and third-party payers determine how to prioritize weight reduction. A computational Markov model was developed that represented the BMI status, chronic health states, health outcomes, and associated costs (from various perspectives) for an adult at different age points throughout his or her lifetime. Incremental costs were calculated for adult patients with obesity or overweight (vs. normal weight) at different starting ages. For example, for a metabolically healthy 20-year-old, having obesity (vs. normal weight) added lifetime third-party payer costs averaging $14,059 (95% range: $13,956-$14,163), productivity losses of $14,141 ($13,969-$14,312), and total societal costs of $28,020 ($27,751-$28,289); having overweight vs. normal weight added $5,055 ($4,967-$5,144), $5,358 ($5,199-$5,518), and $10,365 ($10,140-$10,590). For a metabolically healthy 50-year-old, having obesity added $15,925 ($15,831-$16,020), $20,120 ($19,887-$20,352), and $36,278 ($35,977-$36,579); having overweight added $5,866 ($5,779-$5,953), $10,205 ($9,980-$10,429), and $16,169 ($15,899-$16,438). Incremental lifetime costs of a patient with obesity or overweight (vs. normal weight) increased with the patient's age, peaked at age 50, and decreased with older ages. However, weight reduction even in older adults still yielded incremental cost savings. © 2017 The Obesity Society.
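
A cohort Markov model of the general kind described can be sketched as follows. The states, transition probabilities, annual costs, and discount rate below are invented for illustration; they are not the published model's parameters.

```python
import numpy as np

# Minimal sketch of a cohort Markov model: a cohort distribution over health
# states evolves each yearly cycle, and discounted state costs accumulate.
# All parameters are hypothetical, not those of the obesity model above.
states = ["healthy", "chronic", "dead"]
P = np.array([[0.94, 0.05, 0.01],   # transitions from "healthy"
              [0.00, 0.90, 0.10],   # transitions from "chronic"
              [0.00, 0.00, 1.00]])  # "dead" is absorbing
annual_cost = np.array([500.0, 4000.0, 0.0])  # payer cost per state-year

def lifetime_cost(start_state=0, cycles=60, discount=0.03):
    dist = np.eye(3)[start_state]          # cohort starts in one state
    total = 0.0
    for t in range(cycles):
        total += (dist @ annual_cost) / (1 + discount) ** t
        dist = dist @ P                    # advance the cohort one cycle
    return total

print(round(lifetime_cost(), 2))
```

Incremental cost figures like those reported are obtained by running such a model twice (e.g., starting from an "obesity" versus a "normal weight" state) and differencing the discounted totals.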

  5. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.

  6. NCRETURN Computer Program for Evaluating Investments Revised to Provide Additional Information

    Treesearch

    Allen L. Lundgren; Dennis L. Schweitzer

    1971-01-01

    Reports a modified version of NCRETURN, a computer program for evaluating forestry investments. The revised version, RETURN, provides additional information about each investment, including future net worths and benefit-cost ratios, with no added input.

  7. Cost-Benefit Analysis of Computer Resources for Machine Learning

    USGS Publications Warehouse

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
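
The diminishing-returns point can be illustrated numerically: the expected calibration error of a sample statistic shrinks roughly as 1/sqrt(n), so each doubling of the sample (and of computing cost) buys a smaller improvement. The synthetic data below are an assumption; the report's PNN experiments are not reproduced.

```python
import numpy as np

# Sketch of diminishing returns: quadrupling the calibration sample halves
# the expected error, so later expenditures buy progressively less benefit.
rng = np.random.default_rng(0)
population = rng.normal(loc=10.0, scale=2.0, size=100_000)  # synthetic data

for n in (100, 400, 1600, 6400):
    se = population.std() / np.sqrt(n)  # expected sampling error at cost ~ n
    print(f"n={n:5d}  expected error ~ {se:.3f}")
```

A stratified design, as the report suggests, tries to beat this curve by placing samples where they reduce calibration error most.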

  8. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until layer-by-layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost -many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  9. Computers in Education: Their Use and Cost, Education Automation Monograph Number 2.

    ERIC Educational Resources Information Center

    American Data Processing, Inc., Detroit, MI.

    This monograph on the cost and use of computers in education consists of two parts. Part I is a report of the President's Science Advisory Committee concerning the cost and use of the computer in undergraduate, secondary, and higher education. In addition, the report contains a discussion of the interaction between research and educational uses of…

  10. Cost-Effectiveness and Cost-Utility of Internet-Based Computer Tailoring for Smoking Cessation

    PubMed Central

    Evers, Silvia MAA; de Vries, Hein; Hoving, Ciska

    2013-01-01

    Background Although effective smoking cessation interventions exist, information is limited about their cost-effectiveness and cost-utility. Objective To assess the cost-effectiveness and cost-utility of an Internet-based multiple computer-tailored smoking cessation program and tailored counseling by practice nurses working in Dutch general practices compared with an Internet-based multiple computer-tailored program only and care as usual. Methods The economic evaluation was embedded in a randomized controlled trial, for which 91 practice nurses recruited 414 eligible smokers. Smokers were randomized to receive multiple tailoring and counseling (n=163), multiple tailoring only (n=132), or usual care (n=119). Self-reported cost and quality of life were assessed during a 12-month follow-up period. Prolonged abstinence and 24-hour and 7-day point prevalence abstinence were assessed at 12-month follow-up. The trial-based economic evaluation was conducted from a societal perspective. Uncertainty was accounted for by bootstrapping (1000 times) and sensitivity analyses. Results No significant differences were found between the intervention arms with regard to baseline characteristics or effects on abstinence, quality of life, and addiction level. However, participants in the multiple tailoring and counseling group reported significantly more annual health care–related costs than participants in the usual care group. Cost-effectiveness analysis, using prolonged abstinence as the outcome measure, showed that the mere multiple computer-tailored program had the highest probability of being cost-effective. Compared with usual care, in this group €5100 had to be paid for each additional abstinent participant. With regard to cost-utility analyses, using quality of life as the outcome measure, usual care was probably most efficient. Conclusions To our knowledge, this was the first study to determine the cost-effectiveness and cost-utility of an Internet-based smoking
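
The €5100-per-additional-abstinent-participant figure is an incremental cost-effectiveness ratio (ICER): extra cost divided by extra effect. A minimal sketch, using hypothetical cost and abstinence numbers rather than the study's data:

```python
# Incremental cost-effectiveness ratio (ICER) as used in the study above.
# The input numbers here are hypothetical, chosen only for illustration.
def icer(cost_new, cost_usual, effect_new, effect_usual):
    """Extra cost paid per additional unit of effect (e.g., per abstainer)."""
    return (cost_new - cost_usual) / (effect_new - effect_usual)

# e.g., €150 higher mean cost per participant and a 3-percentage-point
# higher prolonged-abstinence rate give €5000 per additional abstainer:
print(icer(cost_new=650.0, cost_usual=500.0, effect_new=0.13, effect_usual=0.10))
```

In trial-based evaluations like this one, the uncertainty in the ratio is typically characterized by bootstrapping the cost and effect pairs, as the study reports (1000 replications).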

  11. Costs incurred by applying computer-aided design/computer-aided manufacturing techniques for the reconstruction of maxillofacial defects.

    PubMed

    Rustemeyer, Jan; Melenberg, Alex; Sari-Rieger, Aynur

    2014-12-01

    This study aims to evaluate the additional costs incurred by using a computer-aided design/computer-aided manufacturing (CAD/CAM) technique for reconstructing maxillofacial defects by analyzing typical cases. The medical charts of 11 consecutive patients who were subjected to the CAD/CAM technique were considered, and invoices from the companies providing the CAD/CAM devices were reviewed for every case. The number of devices used was significantly correlated with cost (r = 0.880; p < 0.001). Significant differences in mean costs were found between cases in which prebent reconstruction plates were used (€3346.00 ± €29.00) and cases in which they were not (€2534.22 ± €264.48; p < 0.001). Significant differences were also obtained between the costs of two, three and four devices, even when ignoring the cost of reconstruction plates. Additional fees provided by statutory health insurance covered a mean of 171.5% ± 25.6% of the cost of the CAD/CAM devices. Since the additional fees provide financial compensation, we believe that the CAD/CAM technique is suited for wide application and not restricted to complex cases. Where additional fees/funds are not available, the CAD/CAM technique might be unprofitable, so the decision whether or not to use it remains a case-to-case decision with respect to cost versus benefit. Copyright © 2014 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  12. 48 CFR 352.216-70 - Additional cost principles.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Additional cost principles... Additional cost principles. As prescribed in 316.307(j), the Contracting Officer shall insert the following clause: Additional Cost Principles (January 2006) (a) Bid and proposal (B & P) costs. (1) B & P costs are...

  13. Cost Considerations in Nonlinear Finite-Element Computing

    NASA Technical Reports Server (NTRS)

    Utku, S.; Melosh, R. J.; Islam, M.; Salama, M.

    1985-01-01

    Conference paper discusses computational requirements for finite-element analysis using a quasi-linear approach to nonlinear problems. Paper evaluates computational efficiency of different computer architectural types in terms of relative cost and computing time.

  14. Computer Maintenance Operations Center (CMOC), additional computer support equipment ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Computer Maintenance Operations Center (CMOC), additional computer support equipment - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA

  15. Cost-effectiveness of PET and PET/computed tomography: a systematic review.

    PubMed

    Gerke, Oke; Hermansson, Ronnie; Hess, Søren; Schifter, Søren; Vach, Werner; Høilund-Carlsen, Poul Flemming

    2015-01-01

    The development of clinical diagnostic procedures comprises early-phase and late-phase studies to elucidate diagnostic accuracy and patient outcome. Economic assessments of new diagnostic procedures compared with established work-ups indicate additional cost for 1 additional unit of effectiveness measure by means of incremental cost-effectiveness ratios when considering the replacement of the standard regimen by a new diagnostic procedure. This article discusses economic assessments of PET and PET/computed tomography reported until mid-July 2014. Forty-seven studies on cancer and noncancer indications were identified but, because of the widely varying scope of the analyses, a substantial amount of work remains to be done. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Cut Costs with Thin Client Computing.

    ERIC Educational Resources Information Center

    Hartley, Patrick H.

    2001-01-01

    Discusses how school districts can considerably increase the number of administrative computers in their districts without a corresponding increase in costs by using the "Thin Client" component of the Total Cost of Ownership (TCO) model. TCO and Thin Client are described, including their software and hardware components. An example of a…

  17. Costs, Benefits, and Adoption of Additive Manufacturing: A Supply Chain Perspective.

    PubMed

    Thomas, Douglas

    2016-07-01

    There are three primary aspects to the economics of additive manufacturing: measuring the value of goods produced, measuring the costs and benefits of using the technology, and estimating the adoption and diffusion of the technology. This paper provides an updated estimate of the value of goods produced. It then reviews the literature on additive manufacturing costs and identifies those instances in the literature where this technology is cost effective. The paper then goes on to propose an approach for examining and understanding the societal costs and benefits of this technology both from a monetary viewpoint and a resource consumption viewpoint. The final section discusses the trends in the adoption of additive manufacturing. Globally, there is an estimated $667 million in value added produced using additive manufacturing, which equates to 0.01 % of total global manufacturing value added. US value added is estimated as $241 million. Current research on additive manufacturing costs reveals that it is cost effective for manufacturing small batches with continued centralized production; however, with increased automation distributed production may become cost effective. Due to the complexities of measuring additive manufacturing costs and data limitations, current studies are limited in their scope. Many of the current studies examine the production of single parts and those that examine assemblies tend not to examine supply chain effects such as inventory and transportation costs along with decreased risk to supply disruption. The additive manufacturing system and the material costs constitute a significant portion of an additive manufactured product; however, these costs are declining over time. The current trends in costs and benefits have resulted in this technology representing 0.02 % of the relevant manufacturing industries in the US; however, as the costs of additive manufacturing systems decrease, this technology may become widely adopted and change the

  18. Costs, Benefits, and Adoption of Additive Manufacturing: A Supply Chain Perspective

    PubMed Central

    Thomas, Douglas

    2017-01-01

    There are three primary aspects to the economics of additive manufacturing: measuring the value of goods produced, measuring the costs and benefits of using the technology, and estimating the adoption and diffusion of the technology. This paper provides an updated estimate of the value of goods produced. It then reviews the literature on additive manufacturing costs and identifies those instances in the literature where this technology is cost effective. The paper then goes on to propose an approach for examining and understanding the societal costs and benefits of this technology both from a monetary viewpoint and a resource consumption viewpoint. The final section discusses the trends in the adoption of additive manufacturing. Globally, there is an estimated $667 million in value added produced using additive manufacturing, which equates to 0.01 % of total global manufacturing value added. US value added is estimated as $241 million. Current research on additive manufacturing costs reveals that it is cost effective for manufacturing small batches with continued centralized production; however, with increased automation distributed production may become cost effective. Due to the complexities of measuring additive manufacturing costs and data limitations, current studies are limited in their scope. Many of the current studies examine the production of single parts and those that examine assemblies tend not to examine supply chain effects such as inventory and transportation costs along with decreased risk to supply disruption. The additive manufacturing system and the material costs constitute a significant portion of an additive manufactured product; however, these costs are declining over time. The current trends in costs and benefits have resulted in this technology representing 0.02 % of the relevant manufacturing industries in the US; however, as the costs of additive manufacturing systems decrease, this technology may become widely adopted and change the

  19. A Low Cost Micro-Computer Based Local Area Network for Medical Office and Medical Center Automation

    PubMed Central

    Epstein, Mel H.; Epstein, Lynn H.; Emerson, Ron G.

    1984-01-01

    A low-cost microcomputer-based local area network for medical office automation is described which makes use of an array of multiple, different personal computers interconnected by a local area network. Each computer on the network functions as a fully potent workstation for data entry and report generation. The network allows each workstation complete access to the entire database. Additionally, designated computers may serve as access ports for remote terminals. Through “Gateways” the network may serve as a front end for a large mainframe, or may interface with another network. The system provides for the medical office environment the expandability and flexibility of a multi-terminal mainframe system at a far lower cost without sacrifice of performance.

  20. CHARMM additive and polarizable force fields for biophysics and computer-aided drug design

    PubMed Central

    Vanommeslaeghe, K.

    2014-01-01

    Background Molecular Mechanics (MM) is the method of choice for computational studies of biomolecular systems owing to its modest computational cost, which makes it possible to routinely perform molecular dynamics (MD) simulations on chemical systems of biophysical and biomedical relevance. Scope of Review As one of the main factors limiting the accuracy of MD results is the empirical force field used, the present paper offers a review of recent developments in the CHARMM additive force field, one of the most popular biomolecular force fields. Additionally, we present a detailed discussion of the CHARMM Drude polarizable force field, anticipating a growth in the importance and utilization of polarizable force fields in the near future. Throughout the discussion emphasis is placed on the force fields’ parametrization philosophy and methodology. Major Conclusions Recent improvements in the CHARMM additive force field are mostly related to newly found weaknesses in the previous generation of additive force fields. Beyond the additive approximation is the newly available CHARMM Drude polarizable force field, which allows for MD simulations of up to 1 microsecond on proteins, DNA, lipids and carbohydrates. General Significance Addressing the limitations ensures the reliability of the new CHARMM36 additive force field for the types of calculations that are presently coming into routine computational reach while the availability of the Drude polarizable force fields offers an inherently more accurate model of the underlying physical forces driving macromolecular structures and dynamics. PMID:25149274

  1. 47 CFR 25.111 - Additional information and ITU cost recovery.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 2 2014-10-01 2014-10-01 false Additional information and ITU cost recovery....111 Additional information and ITU cost recovery. (a) The Commission may request from any party at any time additional information concerning any application, or any other submission or pleading regarding...

  2. A Web-Based Computer-Tailored Alcohol Prevention Program for Adolescents: Cost-Effectiveness and Intersectoral Costs and Benefits.

    PubMed

    Drost, Ruben M W A; Paulus, Aggie T G; Jander, Astrid F; Mercken, Liesbeth; de Vries, Hein; Ruwaard, Dirk; Evers, Silvia M A A

    2016-04-21

    Preventing excessive alcohol use among adolescents is important not only to foster individual and public health, but also to reduce alcohol-related costs inside and outside the health care sector. Computer tailoring can be both effective and cost-effective for working with many lifestyle behaviors, yet the available information on the cost-effectiveness of computer tailoring for reducing alcohol use by adolescents is limited as is information on the costs and benefits pertaining to sectors outside the health care sector, also known as intersectoral costs and benefits (ICBs). The aim was to assess the cost-effectiveness of a Web-based computer-tailored intervention for reducing alcohol use and binge drinking by adolescents from a health care perspective (excluding ICBs) and from a societal perspective (including ICBs). Data used were from the Alcoholic Alert study, a cluster randomized controlled trial with randomization at the level of schools into two conditions. Participants either played a game with tailored feedback on alcohol awareness after the baseline assessment (intervention condition) or received care as usual (CAU), meaning that they had the opportunity to play the game subsequent to the final measurement (waiting list control condition). Data were recorded at baseline (T0=January/February 2014) and after 4 months (T1=May/June 2014) and were used to calculate incremental cost-effectiveness ratios (ICERs), both from a health care perspective and a societal perspective. Stochastic uncertainty in the data was dealt with by using nonparametric bootstraps (5000 simulated replications). Additional sensitivity analyses were conducted based on excluding cost outliers. Subgroup cost-effectiveness analyses were conducted based on several background variables, including gender, age, educational level, religion, and ethnicity. From both the health care perspective and the societal perspective for both outcome measures, the intervention was more costly and more

  3. A study of the additional costs of dispensing workers' compensation prescriptions.

    PubMed

    Schafermeyer, Kenneth W

    2007-03-01

    Although there is a significant amount of additional work involved in dispensing workers' compensation prescriptions, these costs have not been quantified. A study of the additional costs to dispense a workers' compensation prescription is needed to measure actual costs and to help determine the reasonableness of reimbursement for prescriptions dispensed under workers' compensation programs. The purpose of this study was to determine the minimum additional time and costs required to dispense workers' compensation prescriptions in Texas. A convenience sample of 30 store-level pharmacy staff members involved in submitting and processing prescription claims for the Texas Mutual workers' compensation program were interviewed by telephone. Data collected to determine the additional costs of dispensing a workers' compensation prescription included (1) the amount of additional time and personnel costs required to dispense and process an average workers' compensation prescription claim, (2) the difference in time required for a new versus a refilled prescription, (3) overhead costs for processing workers' compensation prescription claims by experienced experts at a central processing facility, (4) carrying costs for workers' compensation accounts receivable, and (5) bad debts due to uncollectible workers' compensation claims. The median of the sample pharmacies' additional costs for dispensing a workers' compensation prescription was estimated to be at least $9.86 greater than for a cash prescription. This study shows that the estimated costs for workers' compensation prescriptions were significantly higher than for cash prescriptions. These costs are probably much more than most employers, workers' compensation payers, and pharmacy managers would expect. It is recommended that pharmacy managers should estimate their own costs and compare these costs to actual reimbursement when considering the reasonableness of workers' compensation prescriptions and whether to accept

  4. Cutting Costs on Computer Forms.

    ERIC Educational Resources Information Center

    Rupp, Robert V., Jr.

    1989-01-01

    Using the experience of Ford Motor Company, Oscar Mayer, and IBM, this article shows that companies are enjoying high-quality product performance and substantially lower costs by converting from premium white bond computer stock forms to blended bond forms. School administrators are advised to do likewise. (MLH)

  5. CHARMM additive and polarizable force fields for biophysics and computer-aided drug design.

    PubMed

    Vanommeslaeghe, K; MacKerell, A D

    2015-05-01

    Molecular Mechanics (MM) is the method of choice for computational studies of biomolecular systems owing to its modest computational cost, which makes it possible to routinely perform molecular dynamics (MD) simulations on chemical systems of biophysical and biomedical relevance. As one of the main factors limiting the accuracy of MD results is the empirical force field used, the present paper offers a review of recent developments in the CHARMM additive force field, one of the most popular biomolecular force fields. Additionally, we present a detailed discussion of the CHARMM Drude polarizable force field, anticipating a growth in the importance and utilization of polarizable force fields in the near future. Throughout the discussion emphasis is placed on the force fields' parametrization philosophy and methodology. Recent improvements in the CHARMM additive force field are mostly related to newly found weaknesses in the previous generation of additive force fields. Beyond the additive approximation is the newly available CHARMM Drude polarizable force field, which allows for MD simulations of up to 1μs on proteins, DNA, lipids and carbohydrates. Addressing the limitations ensures the reliability of the new CHARMM36 additive force field for the types of calculations that are presently coming into routine computational reach while the availability of the Drude polarizable force fields offers an inherently more accurate model of the underlying physical forces driving macromolecular structures and dynamics. This article is part of a Special Issue entitled "Recent developments of molecular dynamics". Copyright © 2014 Elsevier B.V. All rights reserved.

  6. A Web-Based Computer-Tailored Alcohol Prevention Program for Adolescents: Cost-Effectiveness and Intersectoral Costs and Benefits

    PubMed Central

    2016-01-01

    Background Preventing excessive alcohol use among adolescents is important not only to foster individual and public health, but also to reduce alcohol-related costs inside and outside the health care sector. Computer tailoring can be both effective and cost-effective for working with many lifestyle behaviors, yet the available information on the cost-effectiveness of computer tailoring for reducing alcohol use by adolescents is limited as is information on the costs and benefits pertaining to sectors outside the health care sector, also known as intersectoral costs and benefits (ICBs). Objective The aim was to assess the cost-effectiveness of a Web-based computer-tailored intervention for reducing alcohol use and binge drinking by adolescents from a health care perspective (excluding ICBs) and from a societal perspective (including ICBs). Methods Data used were from the Alcoholic Alert study, a cluster randomized controlled trial with randomization at the level of schools into two conditions. Participants either played a game with tailored feedback on alcohol awareness after the baseline assessment (intervention condition) or received care as usual (CAU), meaning that they had the opportunity to play the game subsequent to the final measurement (waiting list control condition). Data were recorded at baseline (T0=January/February 2014) and after 4 months (T1=May/June 2014) and were used to calculate incremental cost-effectiveness ratios (ICERs), both from a health care perspective and a societal perspective. Stochastic uncertainty in the data was dealt with by using nonparametric bootstraps (5000 simulated replications). Additional sensitivity analyses were conducted based on excluding cost outliers. Subgroup cost-effectiveness analyses were conducted based on several background variables, including gender, age, educational level, religion, and ethnicity. Results From both the health care perspective and the societal perspective for both outcome measures, the

  7. Benchmarking DoD Use of Additive Manufacturing and Quantifying Costs

    DTIC Science & Technology

    2017-03-01

...developing a cost model. The US Army Logistics Innovation Agency published a study called "Additive Manufacturing Cost-Benefit Analysis". This...to over fifteen thousand dollars on GSA Advantage. Desktop printers do not require extensive support equipment.

  8. A computer program for analysis of fuelwood harvesting costs

    Treesearch

    George B. Harpole; Giuseppe Rensi

    1985-01-01

The fuelwood harvesting computer program (FHP) is written in FORTRAN 60 and designed to select a collection of harvest units and systems from among alternatives to satisfy specified energy requirements at the lowest cost per million Btu recovered in a boiler, or per thousand pounds of H2O evaporative capacity in kiln drying. Computed energy costs are used as a...

  9. A cost-utility analysis of the use of preoperative computed tomographic angiography in abdomen-based perforator flap breast reconstruction.

    PubMed

    Offodile, Anaeze C; Chatterjee, Abhishek; Vallejo, Sergio; Fisher, Carla S; Tchou, Julia C; Guo, Lifei

    2015-04-01

Computed tomographic angiography is a diagnostic tool increasingly used for preoperative vascular mapping in abdomen-based perforator flap breast reconstruction. This study compared the use of computed tomographic angiography and the conventional practice of Doppler ultrasonography only in postmastectomy reconstruction using a cost-utility model. Following a comprehensive literature review, a decision analytic model was created using the three most clinically relevant health outcomes in free autologous breast reconstruction with computed tomographic angiography versus Doppler ultrasonography only. Cost and utility estimates for each health outcome were used to derive the quality-adjusted life-years and incremental cost-utility ratio. One-way sensitivity analysis was performed to scrutinize the robustness of the authors' results. Six studies and 782 patients were identified. Cost-utility analysis revealed a baseline cost savings of $3179 and a gain in quality-adjusted life-years of 0.25. This yielded an incremental cost-utility ratio of -$12,716, implying a dominant choice favoring preoperative computed tomographic angiography. Sensitivity analysis revealed that computed tomographic angiography was costlier when the operative time difference between the two techniques was less than 21.3 minutes. However, the clinical advantage of computed tomographic angiography over Doppler ultrasonography only showed that computed tomographic angiography would still remain the cost-effective option even if it offered no additional operating time advantage. The authors' results show that computed tomographic angiography is a cost-effective technology for identifying lower abdominal perforators for autologous breast reconstruction. Although the perfect study would be a randomized controlled trial of the two approaches with true cost accrual, the authors' results represent the best available evidence.
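The incremental cost-utility arithmetic reported above can be reproduced directly. `icur` is a hypothetical helper; the figures are taken from the abstract.

```python
def icur(delta_cost, delta_qaly):
    """Incremental cost-utility ratio: incremental cost per QALY gained."""
    return delta_cost / delta_qaly

# Figures from the abstract: CTA saves $3,179 and gains 0.25 QALYs
# relative to Doppler ultrasonography only.
ratio = icur(-3179, 0.25)
# A negative ratio arising from lower cost together with higher utility
# marks CTA as the dominant strategy.
```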

  10. Additive Manufacturing of Low Cost Upper Stage Propulsion Components

    NASA Technical Reports Server (NTRS)

    Protz, Christopher; Bowman, Randy; Cooper, Ken; Fikes, John; Taminger, Karen; Wright, Belinda

    2014-01-01

NASA is currently developing Additive Manufacturing (AM) technologies and design tools aimed at reducing the costs and manufacturing time of regeneratively cooled rocket engine components. These Low Cost Upper Stage Propulsion (LCUSP) tasks are funded through NASA's Game Changing Development Program in the Space Technology Mission Directorate. The LCUSP project will develop a copper alloy additive manufacturing design process and develop and optimize the Electron Beam Freeform Fabrication (EBF3) manufacturing process to direct deposit a nickel alloy structural jacket and manifolds onto an SLM manufactured GRCop chamber and Ni-alloy nozzle. In order to develop these processes, the project will characterize both the microstructural and mechanical properties of the SLM-produced GRCop-84, and will explore and document novel design techniques specific to AM combustion device components. These manufacturing technologies will be used to build a 25K-class regenerative chamber and nozzle (to be used with tested DMLS injectors) that will be tested individually and as a system in hot fire tests to demonstrate the applicability of the technologies. These tasks are expected to bring costs and manufacturing time down as spacecraft propulsion systems typically comprise more than 70% of the total vehicle cost and account for a significant portion of the development schedule. Additionally, high pressure/high temperature combustion chambers and nozzles must be regeneratively cooled to survive their operating environment, causing their design to be time consuming and costly to build. LCUSP presents an opportunity to develop and demonstrate a process that can infuse these technologies into industry, build competition, and drive down costs of future engines.

  11. A nearly-linear computational-cost scheme for the forward dynamics of an N-body pendulum

    NASA Technical Reports Server (NTRS)

    Chou, Jack C. K.

    1989-01-01

The dynamic equations of motion of an n-body pendulum with spherical joints are derived to be a mixed system of differential and algebraic equations (DAE's). The DAE's are kept in implicit form to save arithmetic and preserve the sparsity of the system and are solved by the robust implicit integration method. At each solution point, the predicted solution is corrected to its exact solution within a given tolerance using Newton's iterative method. For each iteration, a linear system of the form J delta X = E has to be solved. The computational cost for solving this linear system directly by LU factorization is O(n^3), and it can be reduced significantly by exploiting the structure of J. It is shown that by recognizing the recursive patterns and exploiting the sparsity of the system the multiplicative and additive computational costs for solving J delta X = E are O(n) and O(n^2), respectively. The formulation and solution method for an n-body pendulum is presented. The computational cost is shown to be nearly linearly proportional to the number of bodies.
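The idea of exploiting a chain-like sparsity pattern to cut the cost of solving J delta X = E can be illustrated with the tridiagonal (Thomas) algorithm, which solves a banded system in O(n) rather than the O(n^3) of dense LU. The pendulum Jacobian's actual block structure differs, so this is an analogy, not the paper's method.

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system in O(n).

    a: sub-diagonal (a[0] unused), b: main diagonal,
    c: super-diagonal (c[-1] unused), d: right-hand side.
    """
    n = len(b)
    cp = [0.0] * n  # modified super-diagonal
    dp = [0.0] * n  # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

Each unknown is visited a constant number of times, which is what makes the cost linear in the number of bodies.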

  12. Computer-Controlled HVAC -- at Low Cost

    ERIC Educational Resources Information Center

    American School and University, 1974

    1974-01-01

    By tying into a computerized building-automation network, Schaumburg High School, Illinois, slashed its energy consumption by one-third. The remotely connected computer controls the mechanical system for the high school as well as other buildings in the community, with the cost being shared by all. (Author)

  13. Model implementation for dynamic computation of system cost

    NASA Astrophysics Data System (ADS)

    Levri, J.; Vaccari, D.

The Advanced Life Support (ALS) Program metric is the ratio of the equivalent system mass (ESM) of a mission based on International Space Station (ISS) technology to the ESM of that same mission based on ALS technology. ESM is a mission cost analog that converts the volume, power, cooling and crewtime requirements of a mission into mass units to compute an estimate of the life support system emplacement cost. Traditionally, ESM has been computed statically, using nominal values for system sizing. However, computation of ESM with static, nominal sizing estimates cannot capture the peak sizing requirements driven by system dynamics. In this paper, a dynamic model for a near-term Mars mission is described. The model is implemented in Matlab/Simulink for the purpose of dynamically computing ESM. This paper provides a general overview of the crew, food, biomass, waste, water and air blocks in the Simulink model. Dynamic simulations of the life support system track mass flow, volume and crewtime needs, as well as power and cooling requirement profiles. The mission's ESM is computed, based upon simulation responses. Ultimately, computed ESM values for various system architectures will feed into an optimization search (non-derivative) algorithm to predict parameter combinations that result in reduced objective function values.
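The ESM computation itself reduces to a weighted sum. A minimal sketch, with entirely hypothetical equivalency factors and subsystem numbers (the ALS program publishes its own):

```python
# Hypothetical equivalency factors (kg per unit of each resource);
# actual ALS values differ and depend on the mission.
FACTORS = {"mass_kg": 1.0, "volume_m3": 66.7, "power_kw": 237.0,
           "cooling_kw": 60.0, "crewtime_hr": 1.0}

def esm(subsystem):
    """Equivalent system mass: convert each resource requirement of a
    subsystem into mass units and sum."""
    return sum(FACTORS[k] * v for k, v in subsystem.items())

# Hypothetical water-recovery subsystem sizing.
water_recovery = {"mass_kg": 500.0, "volume_m3": 2.0, "power_kw": 1.5,
                  "cooling_kw": 1.5, "crewtime_hr": 100.0}
```

A dynamic computation, as in the paper, would evaluate this sum over simulated peak requirements rather than static nominal values.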

  14. Estimating the economic opportunity cost of water use with river basin simulators in a computationally efficient way

    NASA Astrophysics Data System (ADS)

    Rougé, Charles; Harou, Julien J.; Pulido-Velazquez, Manuel; Matrosov, Evgenii S.

    2017-04-01

The marginal opportunity cost of water refers to benefits forgone by not allocating an additional unit of water to its most economically productive use at a specific location in a river basin at a specific moment in time. Estimating the opportunity cost of water is an important contribution to water management as it can be used for better water allocation or better system operation, and can suggest where future water infrastructure could be most beneficial. Opportunity costs can be estimated using 'shadow values' provided by hydro-economic optimization models. Yet, such models' reliance on optimization means they have difficulty accurately representing the impact of operating rules and regulatory and institutional mechanisms on actual water allocation. In this work we use more widely available river basin simulation models to estimate opportunity costs. This has been done before by adding a small quantity of water to the model at the place and time where the opportunity cost should be computed, then running a simulation and comparing the difference in system benefits. The added system benefits per unit of water added to the system then provide an approximation of the opportunity cost. This approximation can then be used to design efficient pricing policies that provide incentives for users to reduce their water consumption. Yet, this method requires one simulation run per node and per time step, which is demanding computationally for large-scale systems and short time steps (e.g., a day or a week). Besides, opportunity cost estimates are supposed to reflect the most productive use of an additional unit of water, yet the simulation rules do not necessarily use water that way. In this work, we propose an alternative approach, which computes the opportunity cost through a double backward induction, first recursively from outlet to headwaters within the river network at each time step, then recursively backwards in time. Both backward inductions only require linear
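The perturbation method the abstract describes (add a small volume of water, re-simulate, divide the benefit gain by the added volume) can be sketched generically. `simulate` stands in for any river basin simulator and is an assumption, as is the allocation data structure.

```python
def opportunity_cost(simulate, allocation, node, t, delta=1.0):
    """Finite-difference estimate of the marginal opportunity cost of water.

    Adds a small volume `delta` at (node, t), re-runs the basin simulator,
    and divides the gain in total system benefits by `delta`. One such
    run is needed per node and time step, which is the computational
    burden the paper's backward-induction scheme avoids.
    """
    base = simulate(allocation)
    perturbed = dict(allocation)
    perturbed[(node, t)] = perturbed.get((node, t), 0.0) + delta
    return (simulate(perturbed) - base) / delta
```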

  15. Computer Software for Life Cycle Cost.

    DTIC Science & Technology

    1987-04-01

...obsolete), physical life (utility before physically wearing out), or application life (utility in a given function)." (7:5) The costs are usually

  16. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    NASA Astrophysics Data System (ADS)

    Goodwin, Bruce

    2015-03-01

This presentation will discuss the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examines their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the engineering design and prototype iterative cycle, thereby dramatically reducing cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the "cloud," these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  17. Cost-effectiveness of additional catheter-directed thrombolysis for deep vein thrombosis.

    PubMed

    Enden, T; Resch, S; White, C; Wik, H S; Kløw, N E; Sandset, P M

    2013-06-01

Additional treatment with catheter-directed thrombolysis (CDT) has recently been shown to reduce post-thrombotic syndrome (PTS). To estimate the cost effectiveness of additional CDT compared with standard treatment alone. Using a Markov decision model, we compared the two treatment strategies in patients with a high proximal deep vein thrombosis (DVT) and a low risk of bleeding. The model captured the development of PTS, recurrent venous thromboembolism and treatment-related adverse events within a lifetime horizon and the perspective of a third-party payer. Uncertainty was assessed with one-way and probabilistic sensitivity analyses. Model inputs from the CaVenT study included PTS development, major bleeding from CDT and utilities for post DVT states including PTS. The remaining clinical inputs were obtained from the literature. Costs obtained from the CaVenT study, hospital accounts and the literature are expressed in US dollars ($); effects in quality adjusted life years (QALY). In base case analyses, additional CDT accumulated 32.31 QALYs compared with 31.68 QALYs after standard treatment alone. Direct medical costs were $64,709 for additional CDT and $51,866 for standard treatment. The incremental cost-effectiveness ratio (ICER) was $20,429/QALY gained. One-way sensitivity analysis showed model sensitivity to the clinical efficacy of both strategies, but the ICER remained < $55,000/QALY over the full range of all parameters. The probability that CDT is cost effective was 82% at a willingness to pay threshold of $50,000/QALY gained. Additional CDT is likely to be a cost-effective alternative to the standard treatment for patients with a high proximal DVT and a low risk of bleeding. © 2013 International Society on Thrombosis and Haemostasis.

  18. Cost-effectiveness of additional catheter-directed thrombolysis for deep vein thrombosis

    PubMed Central

    ENDEN, T.; RESCH, S.; WHITE, C.; WIK, H. S.; KLØW, N. E.; SANDSET, P. M.

    2013-01-01

Summary Background Additional treatment with catheter-directed thrombolysis (CDT) has recently been shown to reduce post-thrombotic syndrome (PTS). Objectives To estimate the cost effectiveness of additional CDT compared with standard treatment alone. Methods Using a Markov decision model, we compared the two treatment strategies in patients with a high proximal deep vein thrombosis (DVT) and a low risk of bleeding. The model captured the development of PTS, recurrent venous thromboembolism and treatment-related adverse events within a lifetime horizon and the perspective of a third-party payer. Uncertainty was assessed with one-way and probabilistic sensitivity analyses. Model inputs from the CaVenT study included PTS development, major bleeding from CDT and utilities for post DVT states including PTS. The remaining clinical inputs were obtained from the literature. Costs obtained from the CaVenT study, hospital accounts and the literature are expressed in US dollars ($); effects in quality adjusted life years (QALY). Results In base case analyses, additional CDT accumulated 32.31 QALYs compared with 31.68 QALYs after standard treatment alone. Direct medical costs were $64 709 for additional CDT and $51 866 for standard treatment. The incremental cost-effectiveness ratio (ICER) was $20 429/QALY gained. One-way sensitivity analysis showed model sensitivity to the clinical efficacy of both strategies, but the ICER remained < $55 000/QALY over the full range of all parameters. The probability that CDT is cost effective was 82% at a willingness to pay threshold of $50 000/QALY gained. Conclusions Additional CDT is likely to be a cost-effective alternative to the standard treatment for patients with a high proximal DVT and a low risk of bleeding. PMID:23452204

  19. [Cost analysis for navigation in knee endoprosthetics].

    PubMed

    Cerha, O; Kirschner, S; Günther, K-P; Lützner, J

    2009-12-01

Total knee arthroplasty (TKA) is one of the most frequent procedures in orthopaedic surgery. The outcome depends on a range of factors including alignment of the leg and the positioning of the implant in addition to patient-associated factors. Computer-assisted navigation systems can improve the restoration of a neutral leg alignment. This procedure has been established especially in Europe and North America. The additional expenses are not reimbursed in the German DRG system (Diagnosis Related Groups). In the present study a cost analysis of computer-assisted TKA compared to the conventional technique was performed. The acquisition expenses of various navigation systems (5 and 10 year depreciation), annual costs for maintenance and software updates as well as the accompanying costs per operation (consumables, additional operating time) were considered. The additional operating time was determined on the basis of a meta-analysis according to the current literature. Situations with 25, 50, 100, 200 and 500 computer-assisted TKAs per year were simulated. The amount of the incremental costs of the computer-assisted TKA depends mainly on the annual volume and the additional operating time. A relevant decrease of the incremental costs was detected between 50 and 100 procedures per year. In a model with 100 computer-assisted TKAs per year, an additional operating time of 14 mins and a 10 year depreciation of the investment costs, the incremental expenses amount to EUR 300-395 depending on the navigation system. Computer-assisted TKA is associated with additional costs. From an economical point of view an amount of more than 50 procedures per year appears to be favourable. The cost-effectiveness could be estimated if long-term results show a reduction of revisions or a better clinical outcome.
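The cost structure described (amortized acquisition plus annual maintenance spread over yearly volume, plus consumables and extra operating time per case) can be captured in a short function. The inputs below are hypothetical round numbers, not the study's figures.

```python
def navigation_cost_per_case(acquisition, depreciation_years, annual_maintenance,
                             consumables_per_case, or_minutes_extra,
                             or_cost_per_minute, cases_per_year):
    """Incremental cost of one computer-assisted TKA: fixed annual costs
    (amortized capital plus maintenance) divided by annual volume, plus
    the per-case consumables and extra operating-room time."""
    fixed_annual = acquisition / depreciation_years + annual_maintenance
    return (fixed_annual / cases_per_year
            + consumables_per_case
            + or_minutes_extra * or_cost_per_minute)

# Hypothetical example: the fixed-cost share shrinks as volume grows,
# which is why incremental costs drop sharply between 50 and 100 cases.
per_case = navigation_cost_per_case(100000, 10, 5000, 50, 14, 10, 100)
```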

  20. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    PubMed

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource, Roundup, using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
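Ordering jobs by predicted runtime before submission is essentially longest-processing-time-first (LPT) scheduling. A minimal sketch of that policy follows; it illustrates the idea only and is not Roundup's actual deployment code.

```python
import heapq

def schedule_longest_first(predicted_runtimes, n_instances):
    """Greedy LPT scheduling: submit the longest predicted jobs first,
    always to the instance that frees up earliest. Returns the makespan,
    which drives the total instance-hours billed by the cloud provider."""
    finish_times = [0.0] * n_instances   # current finish time of each instance
    heapq.heapify(finish_times)
    for rt in sorted(predicted_runtimes, reverse=True):
        earliest = heapq.heappop(finish_times)
        heapq.heappush(finish_times, earliest + rt)
    return max(finish_times)
```

Submitting in random order tends to leave long jobs for last and instances idle, inflating the makespan and hence the bill.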

  1. Cost-Effective Cloud Computing: A Case Study Using the Comparative Genomics Tool, Roundup

    PubMed Central

    Kudtarkar, Parul; DeLuca, Todd F.; Fusaro, Vincent A.; Tonellato, Peter J.; Wall, Dennis P.

    2010-01-01

    Background Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource—Roundup—using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Methods Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon’s Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. Results We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon’s computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing

  2. Costs of cloud computing for a biometry department. A case study.

    PubMed

    Knaus, J; Hieke, S; Binder, H; Schwarzer, G

    2013-01-01

"Cloud" computing providers, such as the Amazon Web Services (AWS), offer stable and scalable computational resources based on hardware virtualization, with short, usually hourly, billing periods. The idea of pay-as-you-use seems appealing for biometry research units which have only limited access to university or corporate data center resources or grids. This case study compares the costs of an existing heterogeneous on-site hardware pool in a Medical Biometry and Statistics department to a comparable AWS offer. The "total cost of ownership", including all direct costs, is determined for the on-site hardware, and hourly prices are derived, based on actual system utilization during the year 2011. Indirect costs, which are difficult to quantify, are not included in this comparison, but nevertheless some rough guidance from our experience is given. To indicate the scale of costs for a methodological research project, a simulation study of a permutation-based statistical approach is performed using AWS and on-site hardware. In the presented case, with a system utilization of 25-30 percent and 3-5-year amortization, on-site hardware can result in smaller costs, compared to hourly rental in the cloud, dependent on the instance chosen. Renting cloud instances with sufficient main memory is a deciding factor in this comparison. Costs for on-site hardware may vary, depending on the specific infrastructure at a research unit, but have only moderate impact on the overall comparison and subsequent decision for obtaining affordable scientific computing resources. Overall utilization has a much stronger impact as it determines the actual computing hours needed per year. Taking this into account, cloud computing might still be a viable option for projects with limited maturity, or as a supplement for short peaks in demand.
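Deriving an effective hourly price from total cost of ownership and utilization reduces to one formula. A sketch with assumed amortization and utilization figures (the study's actual numbers are not reproduced here):

```python
def hourly_rate(total_cost_of_ownership, amortization_years, utilization):
    """Effective price per utilized hour of on-site hardware: spread the
    TCO over the amortization period and charge only the hours the
    system is actually busy."""
    wall_clock_hours = amortization_years * 365 * 24
    return total_cost_of_ownership / (wall_clock_hours * utilization)

# Hypothetical: a $43,800 TCO pool amortized over 5 years at 25% utilization.
rate = hourly_rate(43800.0, 5, 0.25)
```

Note that the rate scales inversely with utilization, which is why the abstract singles out overall utilization as the strongest driver of the comparison.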

  3. Cost-effectiveness methodology for computer systems selection

    NASA Technical Reports Server (NTRS)

    Vallone, A.; Bajaj, K. S.

    1980-01-01

A new approach to the problem of selecting a computer system design has been developed. The purpose of this methodology is to identify a system design that is capable of fulfilling system objectives in the most economical way. The methodology characterizes each system design by the cost of the system life cycle and by the system's effectiveness in reaching objectives. Cost is measured by a 'system cost index' derived from an analysis of all expenditures and possible revenues over the system life cycle. Effectiveness is measured by a 'system utility index' obtained by combining the impact that each selection factor has on the system objectives, with each impact assessed through a 'utility curve'. A preestablished algorithm combines cost and utility and provides a ranking of the alternative system designs from which the 'best' design is selected.
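The abstract leaves the combining algorithm unspecified. One common choice, ranking designs by the ratio of utility index to cost index, can be sketched as follows; the data and the ratio criterion are assumptions, not the paper's method.

```python
def rank_designs(designs):
    """Rank alternative system designs by a simple cost-effectiveness
    criterion: utility index per unit of cost index (higher is better)."""
    return sorted(designs, key=lambda d: d["utility"] / d["cost"], reverse=True)

# Hypothetical candidate designs with normalized cost and utility indices.
candidates = [
    {"name": "A", "cost": 2.0, "utility": 0.8},
    {"name": "B", "cost": 1.5, "utility": 0.9},
    {"name": "C", "cost": 3.0, "utility": 0.95},
]
best = rank_designs(candidates)[0]["name"]
```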

  4. Can Additional Homeopathic Treatment Save Costs? A Retrospective Cost-Analysis Based on 44500 Insured Persons

    PubMed Central

    Ostermann, Julia K.; Reinhold, Thomas; Witt, Claudia M.

    2015-01-01

    Objectives The aim of this study was to compare the health care costs for patients using additional homeopathic treatment (homeopathy group) with the costs for those receiving usual care (control group). Methods Cost data provided by a large German statutory health insurance company were retrospectively analysed from the societal perspective (primary outcome) and from the statutory health insurance perspective. Patients in both groups were matched using a propensity score matching procedure based on socio-demographic variables as well as costs, number of hospital stays and sick leave days in the previous 12 months. Total cumulative costs over 18 months were compared between the groups with an analysis of covariance (adjusted for baseline costs) across diagnoses and for six specific diagnoses (depression, migraine, allergic rhinitis, asthma, atopic dermatitis, and headache). Results Data from 44,550 patients (67.3% females) were available for analysis. From the societal perspective, total costs after 18 months were higher in the homeopathy group (adj. mean: EUR 7,207.72 [95% CI 7,001.14–7,414.29]) than in the control group (EUR 5,857.56 [5,650.98–6,064.13]; p<0.0001) with the largest differences between groups for productivity loss (homeopathy EUR 3,698.00 [3,586.48–3,809.53] vs. control EUR 3,092.84 [2,981.31–3,204.37]) and outpatient care costs (homeopathy EUR 1,088.25 [1,073.90–1,102.59] vs. control EUR 867.87 [853.52–882.21]). Group differences decreased over time. For all diagnoses, costs were higher in the homeopathy group than in the control group, although this difference was not always statistically significant. Conclusion Compared with usual care, additional homeopathic treatment was associated with significantly higher costs. These analyses did not confirm previously observed cost savings resulting from the use of homeopathy in the health care system. PMID:26230412
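Propensity score matching of the kind described can be illustrated with a greedy 1:1 nearest-neighbour match on precomputed scores. The caliper, data, and function name are hypothetical; the study's actual matching procedure may differ.

```python
def match_nearest(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on precomputed propensity
    scores. Inputs are {id: score} dicts; each control is used at most
    once, and pairs are accepted only within the caliper."""
    pool = dict(controls)
    pairs = []
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not pool:
            break
        c_id = min(pool, key=lambda c: abs(pool[c] - t_score))
        if abs(pool[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del pool[c_id]          # matching without replacement
    return pairs
```

After matching, group costs can be compared with an analysis of covariance adjusted for baseline costs, as the abstract describes.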

  5. The Hidden Cost of Buying a Computer.

    ERIC Educational Resources Information Center

    Johnson, Michael

    1983-01-01

    In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)

  6. Computer program to perform cost and weight analysis of transport aircraft. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A digital computer program for evaluating the weight and costs of advanced transport designs was developed. The resultant program, intended for use at the preliminary design level, incorporates both batch mode and interactive graphics run capability. The basis of the weight and cost estimation method developed is a unique way of predicting the physical design of each detail part of a vehicle structure at a time when only configuration concept drawings are available. In addition, the technique relies on methods to predict the precise manufacturing processes and the associated material required to produce each detail part. Weight data are generated in four areas of the program. Overall vehicle system weights are derived on a statistical basis as part of the vehicle sizing process. Theoretical weights, actual weights, and the weight of the raw material to be purchased are derived as part of the structural synthesis and part definition processes based on the computed part geometry.

  7. COMPUTER PROGRAM FOR CALCULATING THE COST OF DRINKING WATER TREATMENT SYSTEMS

    EPA Science Inventory

    This FORTRAN computer program calculates the construction and operation/maintenance costs for 45 centralized unit treatment processes for water supply. The calculated costs are based on various design parameters and raw water quality. These cost data are applicable to small size ...

  8. X-ray computed tomography for additive manufacturing: a review

    NASA Astrophysics Data System (ADS)

    Thompson, A.; Maskery, I.; Leach, R. K.

    2016-07-01

    In this review, the use of x-ray computed tomography (XCT) is examined, identifying the requirement for volumetric dimensional measurements in industrial verification of additively manufactured (AM) parts. The XCT technology and AM processes are summarised, and their historical use is documented. The use of XCT and AM as tools for medical reverse engineering is discussed, and the transition of XCT from a tool used solely for imaging to a vital metrological instrument is documented. The current states of the combined technologies are then examined in detail, separated into porosity measurements and general dimensional measurements. In the conclusions of this review, the limitation of resolution on improvement of porosity measurements and the lack of research regarding the measurement of surface texture are identified as the primary barriers to ongoing adoption of XCT in AM. The limitations of both AM and XCT regarding slow speeds and high costs, when compared to other manufacturing and measurement techniques, are also noted as general barriers to continued adoption of XCT and AM.

  9. Additively Manufactured Low Cost Upper Stage Combustion Chamber

    NASA Technical Reports Server (NTRS)

    Protz, Christopher; Cooper, Ken; Ellis, David; Fikes, John; Jones, Zachary; Kim, Tony; Medina, Cory; Taminger, Karen; Willingham, Derek

    2016-01-01

Over the past two years NASA's Low Cost Upper Stage Propulsion (LCUSP) project has developed Additive Manufacturing (AM) technologies and design tools aimed at reducing the costs and manufacturing time of regeneratively cooled rocket engine components. High pressure/high temperature combustion chambers and nozzles must be regeneratively cooled to survive their operating environment, causing their design and fabrication to be costly and time consuming due to the number of individual steps and different processes required. Under LCUSP, AM technologies in Selective Laser Melting (SLM) GRCop-84 and Electron Beam Freeform Fabrication (EBF3) Inconel 625 have been significantly advanced, allowing the team to successfully fabricate a 25k-class regenerative chamber. Estimates of the costs and schedule of future builds indicate cost reductions and significant schedule reductions will be enabled by this technology. Characterization of the microstructural and mechanical properties of the SLM-produced GRCop-84, EBF3 Inconel 625 and the interface layer between the two has been performed and indicates the properties will meet the design requirements. The LCUSP chamber is to be tested with a previously demonstrated SLM injector in order to advance the Technology Readiness Level (TRL) and demonstrate the capability of the application of these processes. NASA is advancing these technologies to reduce cost and schedule for future engine applications and commercial needs.

  10. The Hidden Costs of Wireless Computer Labs

    ERIC Educational Resources Information Center

    Daly, Una

    2005-01-01

    Various elementary schools and middle schools across the U.S. have purchased one or more mobile laboratories. Although the wireless labs have provided more classroom computing, teachers and technology aides still have mixed views about their cost-benefit ratio. This is because the proliferation of viruses and spyware has dramatically increased…

  11. Costs, needs must be balanced when buying computer systems.

    PubMed

    Krantz, G M; Doyle, J J; Stone, S G

    1989-06-01

    A healthcare institution must carefully examine its internal needs and external requirements before selecting an information system. The system's costs must be carefully weighed because significant computer cost overruns can cripple overall hospital finances. A New Jersey hospital carefully studied these issues and determined that a contract with a regional data center was its best option.

  12. How Elected Officials Can Control Computer Costs.

    ERIC Educational Resources Information Center

    Grady, Daniel B.

    Elected officials have a special obligation to monitor and make informed decisions about computer expenditures. In doing so, officials should insist that a needs assessment be carried out; review all cost and configuration data; draw up a master plan specifying user needs as well as hardware, software, and personnel requirements; and subject…

  13. Low-Cost Computer-Aided Instruction/Computer-Managed Instruction (CAI/CMI) System: Feasibility Study. Final Report.

    ERIC Educational Resources Information Center

    Lintz, Larry M.; And Others

    This study investigated the feasibility of a low cost computer-aided instruction/computer-managed instruction (CAI/CMI) system. Air Force instructors and training supervisors were surveyed to determine the potential payoffs of various CAI and CMI functions. Results indicated that a wide range of capabilities had potential for resident technical…

  14. The (cost-)effectiveness of a lifestyle physical activity intervention in addition to a work style intervention on the recovery from neck and upper limb symptoms in computer workers

    PubMed Central

    Bernaards, Claire M; Ariëns, Geertje AM; Hildebrandt, Vincent H

    2006-01-01

    Background: Neck and upper limb symptoms are frequently reported by computer workers. Work style interventions are most commonly used to reduce work-related neck and upper limb symptoms, but lifestyle physical activity interventions are becoming more popular to enhance workers' health and reduce work-related symptoms. A combined approach targeting work style and lifestyle physical activity seems promising, but little is known about the effectiveness of such combined interventions. Methods/design: The RSI@Work study is a randomised controlled trial that aims to assess the added value of a lifestyle physical activity intervention in addition to a work style intervention to reduce neck and upper limb symptoms in computer workers. Computer workers from seven Dutch companies with frequent or long-term neck and upper limb symptoms in the preceding six months and/or the last two weeks are randomised into three groups: (1) work style group, (2) work style and physical activity group, or (3) control group. The work style intervention consists of six group meetings in a six-month period that take place at the workplace, during work time, and under the supervision of a specially trained counsellor. The goal of this intervention is to stimulate workplace adjustment and to improve body posture, the number and quality of breaks, and coping behaviour with regard to high work demands. In the combined (work style and physical activity) intervention the additional goal is to increase moderate to heavy physical activity. The control group receives usual care. Primary outcome measures are degree of recovery, pain intensity, disability, number of days with neck and upper limb symptoms, and number of months without neck and upper limb symptoms. Outcome measures will be assessed at baseline and six and 12 months after randomisation. Cost-effectiveness of the group meetings will be assessed from an employer's perspective. Discussion: This study will be one of the first to assess the added value

  15. Fixed-point image orthorectification algorithms for reduced computational cost

    NASA Astrophysics Data System (ADS)

    French, Joseph Clinton

    Imaging systems have been applied to many new applications in recent years. With the advent of low-cost, low-power focal planes and more powerful, lower-cost computers, remote sensing applications have become more widespread. Many of these applications require some form of geolocation, especially when relative distances are desired. However, when greater global positional accuracy is needed, orthorectification becomes necessary. Orthorectification is the process of projecting an image onto a Digital Elevation Map (DEM), which removes terrain distortions and corrects the perspective distortion by changing the viewing angle to be perpendicular to the projection plane. Orthorectification is used in disaster tracking, landscape management, wildlife monitoring and many other applications. However, orthorectification is a computationally expensive process due to the floating-point operations and divisions in the algorithm. To reduce the computational cost of on-board processing, two novel algorithm modifications are proposed. The first is projection using fixed-point arithmetic, which removes the floating-point operations and reduces processing time by operating only on integers. The second is replacement of the division inherent in projection with multiplication by the inverse; because computing the inverse exactly requires iteration, the inverse is instead replaced by a linear approximation. As a result of these modifications, the processing time of projection is reduced by a factor of 1.3x, with an average pixel position error of 0.2% of a pixel size, for 128-bit integer processing, and by over 4x, with an average pixel position error of less than 13% of a pixel size, for 64-bit integer processing. A secondary inverse function approximation is also developed that replaces the linear approximation with a quadratic. The quadratic approximation produces a more accurate approximation of the inverse, allowing for an integer multiplication calculation
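    The two modifications described in this abstract can be illustrated with a small sketch. This is not the author's implementation; the Q16 format, the depth range, and the two-point linear fit below are illustrative assumptions.

```python
# Sketch of the two modifications: fixed-point arithmetic plus a linear
# approximation of the inverse in place of division. The Q16 format and
# the [1.5, 2.5] divisor range are illustrative assumptions.

Q = 16                 # fractional bits; a value v is stored as round(v * 2**Q)
ONE = 1 << Q

def to_fixed(x):
    return round(x * ONE)

def fixed_mul(a, b):
    return (a * b) >> Q          # rescale the double-width product back to Q16

def approx_inv(d, d_lo, d_hi):
    """Linear approximation of 1/d over [d_lo, d_hi] (all values in Q16).

    Fits a line through (d_lo, 1/d_lo) and (d_hi, 1/d_hi), so the costly
    division is replaced by one multiply and one add at run time.
    """
    inv_lo = (ONE * ONE) // d_lo          # 1/d_lo in Q16
    inv_hi = (ONE * ONE) // d_hi
    slope = ((inv_hi - inv_lo) * ONE) // (d_hi - d_lo)
    intercept = inv_lo - fixed_mul(slope, d_lo)
    return fixed_mul(slope, d) + intercept

# A projection-style quotient x/d computed without a division, for a divisor
# known to lie in a bounded range:
x, d = to_fixed(3.0), to_fixed(1.8)
approx = fixed_mul(x, approx_inv(d, to_fixed(1.5), to_fixed(2.5))) / ONE
exact = 3.0 / 1.8
```

    The error of a single linear fit is a few percent of the true quotient, which is why the thesis also develops the quadratic refinement mentioned above.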

  16. Cost Analysis of the Addition of Hyperacute Magnetic Resonance Imaging for Selection of Patients for Endovascular Stroke Therapy.

    PubMed

    John, Seby; Thompson, Nicolas R; Lesko, Terry; Papesh, Nancy; Obuchowski, Nancy; Tomic, Dan; Wisco, Dolora; Khawaja, Zeshaun; Uchino, Ken; Man, Shumei; Cheng-Ching, Esteban; Toth, Gabor; Masaryk, Thomas; Ruggieri, Paul; Modic, Michael; Hussain, Muhammad Shazam

    2017-10-01

    Patient selection is important to determine the best candidates for endovascular stroke therapy. In application of a hyperacute magnetic resonance imaging (MRI) protocol for patient selection, we have shown decreased utilization with improved outcomes. A cost analysis comparing the pre- and post-MRI protocol time periods was performed to determine if the previous findings translated into cost opportunities. We retrospectively identified individuals considered for endovascular stroke therapy from January 2008 to August 2012 who were ≤8 h from stroke symptom onset. Patients prior to April 30, 2010 were selected based on results of computed tomography/computed tomography angiography alone (pre-hyperacute), whereas patients after April 30, 2010 were selected based on results of MRI (post-hyperacute MRI). Demographic, outcome, and financial information was collected. Log-transformed average daily direct costs were regressed on time period. The regression model included demographic and clinical covariates as potential confounders. Multiple imputation was used to account for missing data. We identified 267 patients in our database (88 patients in the pre-hyperacute MRI period, 179 in the hyperacute MRI protocol period). Patient length of stay was not significantly different in the hyperacute MRI protocol period as compared to the pre-hyperacute MRI period (10.6 vs. 9.9 days, p = 0.42). The median of average daily direct costs was reduced by 24.5% (95% confidence interval 14.1-33.7%, p < 0.001). Use of the hyperacute MRI protocol translated into reduced costs, in addition to reduced utilization and better outcomes. MRI selection of patients is an effective strategy, both for patients and hospital systems.

  17. Cost-effectiveness of additional blood screening tests in the Netherlands.

    PubMed

    Borkent-Raven, Barbara A; Janssen, Mart P; van der Poel, Cees L; Bonsel, Gouke J; van Hout, Ben A

    2012-03-01

    During the past decade, blood screening tests such as triplex nucleic acid amplification testing (NAT) and human T-cell lymphotropic virus type I or II (HTLV-I/II) antibody testing were added to existing serologic testing for hepatitis B virus (HBV), human immunodeficiency virus (HIV), and hepatitis C virus (HCV). In some low-prevalence regions these additional tests yielded disputable benefits that can be evaluated by cost-effectiveness analyses (CEAs). CEAs are used to support decision making on implementation of medical technology. We present CEAs of selected additional screening tests that are not uniformly implemented in the EU. Cost-effectiveness was analyzed for: 1) HBV, HCV, and HIV triplex NAT in addition to serologic testing; 2) HTLV-I/II antibody testing for all donors, for first-time donors only, and for pediatric recipients only; and 3) hepatitis A virus (HAV) NAT for all donations. Disease progression of the studied viral infections was described in five Markov models. In the Netherlands, the incremental cost-effectiveness ratio (ICER) of triplex NAT is €5.20 million per quality-adjusted life-year (QALY) for testing minipools of six donation samples and €4.65 million/QALY for individual donation testing. The ICER for anti-HTLV-I/II is €45.2 million/QALY if testing all donations, €2.23 million/QALY if testing new donors only, and €27.0 million/QALY if testing blood products for pediatric patients only. The ICER of HAV NAT is €18.6 million/QALY. The resulting ICERs are very high, especially when compared to other health care interventions. Nevertheless, these screening tests are implemented in the Netherlands and elsewhere. Policy makers should reflect more explicitly on the acceptability of costs and effects whenever additional blood screening tests are implemented. © 2011 American Association of Blood Banks.
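    All of the ICERs quoted in this abstract come from the same arithmetic: incremental cost divided by incremental health gain. A minimal sketch with hypothetical figures (not the Dutch estimates):

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical screening programme: 2.6M EUR of extra testing cost buys
# 0.5 extra QALYs across all recipients -> 5.2M EUR per QALY, in the
# range of the figures reported above.
example = icer(cost_new=2_600_000, cost_old=0, qaly_new=0.5, qaly_old=0.0)
```

    Decision makers then compare the ratio against a willingness-to-pay threshold, which is exactly the comparison the authors argue should be made explicit.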

  18. A survey of computer search service costs in the academic health sciences library.

    PubMed Central

    Shirley, S

    1978-01-01

    The Norris Medical Library, University of Southern California, has recently completed an extensive survey of costs involved in the provision of computer search services beyond vendor charges for connect time and printing. In this survey costs for such items as terminal depreciation, repair contract, personnel time, and supplies are analyzed. Implications of this cost survey are discussed in relation to planning and price setting for computer search services. PMID:708953

  19. Cost-effectiveness of computer-assisted training in cognitive-behavioral therapy as an adjunct to standard care for addiction.

    PubMed

    Olmstead, Todd A; Ostrow, Cary D; Carroll, Kathleen M

    2010-08-01

    To determine the cost-effectiveness, from clinic and patient perspectives, of a computer-based version of cognitive-behavioral therapy (CBT4CBT) as an addition to regular clinical practice for substance dependence. PARTICIPANTS, DESIGN AND MEASUREMENTS: This cost-effectiveness study is based on a randomized clinical trial in which 77 individuals seeking treatment for substance dependence at an outpatient community setting were randomly assigned to treatment as usual (TAU) or TAU plus biweekly access to computer-based training in CBT (TAU plus CBT4CBT). The primary patient outcome measure was the total number of drug-free specimens provided during treatment. Incremental cost-effectiveness ratios (ICERs) and cost-effectiveness acceptability curves (CEACs) were used to determine the cost-effectiveness of TAU plus CBT4CBT relative to TAU alone. Results are presented from both the clinic and patient perspectives and are shown to be robust to (i) sensitivity analyses and (ii) a secondary objective patient outcome measure. The per patient cost of adding CBT4CBT to standard care was $39 ($27) from the clinic (patient) perspective. From the clinic (patient) perspective, TAU plus CBT4CBT is likely to be cost-effective when the threshold value to decision makers of an additional drug-free specimen is greater than approximately $21 ($15), and TAU alone is likely to be cost-effective when the threshold value is less than approximately $21 ($15). The ICERs for TAU plus CBT4CBT also compare favorably to ICERs reported elsewhere for other empirically validated therapies, including contingency management. TAU plus CBT4CBT appears to be a good value from both the clinic and patient perspectives. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  20. Performance, Agility and Cost of Cloud Computing Services for NASA GES DISC Giovanni Application

    NASA Astrophysics Data System (ADS)

    Pham, L.; Chen, A.; Wharton, S.; Winter, E. L.; Lynnes, C.

    2013-12-01

    The NASA Goddard Earth Science Data and Information Services Center (GES DISC) is investigating the performance, agility and cost of Cloud computing for GES DISC applications. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), one of the core applications at the GES DISC for online climate-related Earth science data access, subsetting, analysis, visualization, and downloading, was used to evaluate the feasibility and effort of porting an application to the Amazon Cloud Services platform. The performance and the cost of running Giovanni on the Amazon Cloud were compared to similar parameters for the GES DISC local operational system. A Giovanni time-series analysis of aerosol absorption optical depth (388 nm) from OMI (Ozone Monitoring Instrument)/Aura was selected for these comparisons. All required data were pre-cached in both the Cloud and the local system to avoid data transfer delays. Time series of 3, 6, 12, and 24 months were analyzed on both the Cloud and the local system, and the processing times for the analysis were used to evaluate system performance. To investigate application agility, Giovanni was installed and tested on multiple Cloud platforms. The cost of using a Cloud computing platform mainly consists of computing, storage, data requests, and data transfer in/out. Computing cost is charged at an hourly rate, and storage cost at a rate per gigabyte per month. Incoming data transfer is free; data transfer out is charged per gigabyte. The costs for a local server system consist of buying hardware/software, system maintenance/updating, and operating costs. The results showed that the Cloud platform had 38% better performance and cost 36% less than the local system. This investigation shows the potential of Cloud computing to increase system performance and lower the overall cost of system management.
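    The cost components listed in this abstract combine into a simple monthly cost model. The rates below are illustrative placeholders, not actual Amazon pricing:

```python
def monthly_cloud_cost(compute_hours, rate_per_hour,
                       storage_gb, rate_per_gb_month,
                       egress_gb, rate_per_gb_out):
    """Monthly cost under the pricing structure described above:
    hourly compute + per-GB-month storage + per-GB data transfer out
    (incoming transfer is free, so it contributes no term)."""
    return (compute_hours * rate_per_hour
            + storage_gb * rate_per_gb_month
            + egress_gb * rate_per_gb_out)

# Hypothetical workload: one instance running all month, 500 GB of
# pre-cached data, 40 GB of results delivered to users.
cost = monthly_cloud_cost(compute_hours=720, rate_per_hour=0.10,
                          storage_gb=500, rate_per_gb_month=0.03,
                          egress_gb=40, rate_per_gb_out=0.09)
```

    A comparison like the one in the study would weigh such a monthly figure against the amortized hardware, maintenance, and operating costs of the local system.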

  1. ESF-X: a low-cost modular experiment computer for space flight experiments

    NASA Astrophysics Data System (ADS)

    Sell, Steven; Zapetis, Joseph; Littlefield, Jim; Vining, Joanne

    2004-08-01

    The high cost associated with spaceflight research often compels experimenters to scale back their research goals significantly purely for budgetary reasons; among experiment systems, control and data collection electronics are a major contributor to total project cost. ESF-X was developed as an architecture demonstration in response to this need: it is a highly capable, radiation-protected experiment support computer, designed to be configurable on demand to each investigator's particular experiment needs, and operational in LEO for missions lasting up to several years (e.g., ISS EXPRESS) without scheduled service or maintenance. ESF-X can accommodate up to 255 data channels (I/O, A/D, D/A, etc.), allocated per customer request, with data rates up to 40kHz. Additionally, ESF-X can be programmed using the graphical block-diagram based programming languages Simulink and MATLAB. This represents a major cost saving opportunity for future investigators, who can now obtain a customized, space-qualified experiment controller at steeply reduced cost compared to 'new' design, and without the performance compromises associated with using preexisting 'generic' systems. This paper documents the functional benchtop prototype, which utilizes a combination of COTS and space-qualified components, along with unit-gravity-specific provisions appropriate to laboratory environment evaluation of the ESF-X design concept and its physical implementation.

  2. Low-cost computer mouse for the elderly or disabled in Taiwan.

    PubMed

    Chen, C-C; Chen, W-L; Chen, B-N; Shih, Y-Y; Lai, J-S; Chen, Y-L

    2014-01-01

    A mouse is an important communication interface between a human and a computer, but it remains difficult for the elderly or disabled to use. The objective was to develop a low-cost computer mouse auxiliary tool. The principal structure of the tool is an IR (infrared) array module and a Wii icon sensor module, combined with reflective tape and a SQL Server database. This design has several benefits, including low hardware cost, fluent control, prompt response, adaptive adjustment and portability. It also carries a game module for training and evaluation, which helps trainees improve sensory awareness and concentration. In the intervention and maintenance phases, clicking accuracy and time of use reached statistical significance (p < 0.05). The development of the low-cost adaptive computer mouse auxiliary tool was completed during the study and was verified as having the characteristics of low cost, easy operation and adaptability. The tool is suitable for patients with physical disabilities who retain independent control of some part of their limbs; the user only needs to attach the reflective tape to an independently controlled part of the body to operate it.

  3. The Processing Cost of Reference Set Computation: Acquisition of Stress Shift and Focus

    ERIC Educational Resources Information Center

    Reinhart, Tanya

    2004-01-01

    Reference set computation -- the construction of a (global) comparison set to determine whether a given derivation is appropriate in context -- comes with a processing cost. I argue that this cost is directly visible at the acquisition stage: In those linguistic areas in which it has been independently established that such computation is indeed…

  4. Low-cost space-varying FIR filter architecture for computational imaging systems

    NASA Astrophysics Data System (ADS)

    Feng, Guotong; Shoaib, Mohammed; Schwartz, Edward L.; Dirk Robinson, M.

    2010-01-01

    Recent research demonstrates the advantage of designing electro-optical imaging systems by jointly optimizing the optical and digital subsystems. The optical systems designed using this joint approach intentionally introduce large and often space-varying optical aberrations that produce blurry optical images. Digital sharpening restores reduced contrast due to these intentional optical aberrations. Computational imaging systems designed in this fashion have several advantages including extended depth-of-field, lower system costs, and improved low-light performance. Currently, most consumer imaging systems lack the necessary computational resources to compensate for these optical systems with large aberrations in the digital processor. Hence, the exploitation of the advantages of the jointly designed computational imaging system requires low-complexity algorithms enabling space-varying sharpening. In this paper, we describe a low-cost algorithmic framework and associated hardware enabling the space-varying finite impulse response (FIR) sharpening required to restore largely aberrated optical images. Our framework leverages the space-varying properties of optical images formed using rotationally-symmetric optical lens elements. First, we describe an approach to leverage the rotational symmetry of the point spread function (PSF) about the optical axis allowing computational savings. Second, we employ a specially designed bank of sharpening filters tuned to the specific radial variation common to optical aberrations. We evaluate the computational efficiency and image quality achieved by using this low-cost space-varying FIR filter architecture.

  5. Additional development of the XTRAN3S computer program

    NASA Technical Reports Server (NTRS)

    Borland, C. J.

    1989-01-01

    Additional developments and enhancements to the XTRAN3S computer program, a code for calculation of steady and unsteady aerodynamics, and associated aeroelastic solutions, for 3-D wings in the transonic flow regime are described. Algorithm improvements for the XTRAN3S program were provided including an implicit finite difference scheme to enhance the allowable time step and vectorization for improved computational efficiency. The code was modified to treat configurations with a fuselage, multiple stores/nacelles/pylons, and winglets. Computer program changes (updates) for error corrections and updates for version control are provided.

  6. Manual of phosphoric acid fuel cell power plant cost model and computer program

    NASA Technical Reports Server (NTRS)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

    Cost analysis of a phosphoric acid fuel cell power plant includes two parts: a method for estimating system capital costs, and an economic analysis that determines the levelized annual cost of operating the system used in the capital cost estimation. A FORTRAN computer program has been developed for this cost analysis.

  7. Thermodynamic cost of computation, algorithmic complexity and the information metric

    NASA Technical Reports Server (NTRS)

    Zurek, W. H.

    1989-01-01

    Algorithmic complexity is discussed as a computational counterpart to the second law of thermodynamics. It is shown that algorithmic complexity, which is a measure of randomness, sets limits on the thermodynamic cost of computations and casts a new light on the limitations of Maxwell's demon. Algorithmic complexity can also be used to define distance between binary strings.

  8. 42 CFR 413.355 - Additional payment: QIO photocopy and mailing costs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospective Payment for Skilled Nursing Facilities § 413.355 Additional payment: QIO photocopy and mailing costs. An additional payment is made to a skilled nursing facility in accordance with § 476.78 of this...

  9. 42 CFR 413.355 - Additional payment: QIO photocopy and mailing costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospective Payment for Skilled Nursing Facilities § 413.355 Additional payment: QIO photocopy and mailing costs. An additional payment is made to a skilled nursing facility in accordance with § 476.78 of this...

  10. IUWare and Computing Tools: Indiana University's Approach to Low-Cost Software.

    ERIC Educational Resources Information Center

    Sheehan, Mark C.; Williams, James G.

    1987-01-01

    Describes strategies for providing low-cost microcomputer-based software for classroom use on college campuses. Highlights include descriptions of the software (IUWare and Computing Tools); computing center support; license policies; documentation; promotion; distribution; staff, faculty, and user training; problems; and future plans. (LRW)

  11. Incremental Costs and Cost Effectiveness of Intensive Treatment in Individuals with Type 2 Diabetes Detected by Screening in the ADDITION-UK Trial: An Update with Empirical Trial-Based Cost Data.

    PubMed

    Laxy, Michael; Wilson, Edward C F; Boothby, Clare E; Griffin, Simon J

    2017-12-01

    There is uncertainty about the cost effectiveness of early intensive treatment versus routine care in individuals with type 2 diabetes detected by screening. To derive a trial-informed estimate of the incremental costs of intensive treatment as delivered in the Anglo-Danish-Dutch Study of Intensive Treatment in People with Screen-Detected Diabetes in Primary Care-Europe (ADDITION) trial and to revisit the long-term cost-effectiveness analysis from the perspective of the UK National Health Service. We analyzed the electronic primary care records of a subsample of the ADDITION-Cambridge trial cohort (n = 173). Unit costs of used primary care services were taken from the published literature. Incremental annual costs of intensive treatment versus routine care in years 1 to 5 after diagnosis were calculated using multilevel generalized linear models. We revisited the long-term cost-utility analyses for the ADDITION-UK trial cohort and reported results for ADDITION-Cambridge using the UK Prospective Diabetes Study Outcomes Model and the trial-informed cost estimates according to a previously developed evaluation framework. Incremental annual costs of intensive treatment over years 1 to 5 averaged £29.10 (standard error = £33.00) for consultations with general practitioners and nurses and £54.60 (standard error = £28.50) for metabolic and cardioprotective medication. For ADDITION-UK, over the 10-, 20-, and 30-year time horizon, adjusted incremental quality-adjusted life-years (QALYs) were 0.014, 0.043, and 0.048, and adjusted incremental costs were £1,021, £1,217, and £1,311, resulting in incremental cost-effectiveness ratios of £71,232/QALY, £28,444/QALY, and £27,549/QALY, respectively. Respective incremental cost-effectiveness ratios for ADDITION-Cambridge were slightly higher. The incremental costs of intensive treatment as delivered in the ADDITION-Cambridge trial were lower than expected. Given UK willingness-to-pay thresholds in patients with screen

  12. Addressing the computational cost of large EIT solutions.

    PubMed

    Boyle, Alistair; Borsic, Andrea; Adler, Andy

    2012-05-01

    Electrical impedance tomography (EIT) is a soft field tomography modality based on the application of electric current to a body and measurement of voltages through electrodes at the boundary. The interior conductivity is reconstructed on a discrete representation of the domain using a finite-element method (FEM) mesh and a parametrization of that domain. The reconstruction requires a sequence of numerically intensive calculations. There is strong interest in reducing the cost of these calculations. An improvement in the compute time for current problems would encourage further exploration of computationally challenging problems such as the incorporation of time series data, widespread adoption of three-dimensional simulations and correlation of other modalities such as CT and ultrasound. Multicore processors offer an opportunity to reduce EIT computation times but may require some restructuring of the underlying algorithms to maximize the use of available resources. This work profiles two EIT software packages (EIDORS and NDRM) to experimentally determine where the computational costs arise in EIT as problems scale. Sparse matrix solvers, a key component for the FEM forward problem and sensitivity estimates in the inverse problem, are shown to take a considerable portion of the total compute time in these packages. A sparse matrix solver performance measurement tool, Meagre-Crowd, is developed to interface with a variety of solvers and compare their performance over a range of two- and three-dimensional problems of increasing node density. Results show that distributed sparse matrix solvers that operate on multiple cores are advantageous up to a limit that increases as the node density increases. We recommend a selection procedure to find a solver and hardware arrangement matched to the problem and provide guidance and tools to perform that selection.
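    The sparse matrices at the core of the FEM forward problem store only their nonzero entries. As an illustration of the kind of kernel such solvers repeat many times, here is a minimal compressed-sparse-row (CSR) matrix-vector product in plain Python (a didactic sketch, not code from EIDORS, NDRM, or Meagre-Crowd, which use optimized multicore solver libraries):

```python
def csr_matvec(indptr, indices, data, x):
    """y = A @ x for A stored in CSR form: row i's nonzeros are
    data[indptr[i]:indptr[i+1]], in columns indices[indptr[i]:indptr[i+1]]."""
    y = [0.0] * (len(indptr) - 1)
    for i in range(len(y)):
        for k in range(indptr[i], indptr[i + 1]):
            y[i] += data[k] * x[indices[k]]
    return y

# 3x3 example:  [[2, 0, 1],
#                [0, 3, 0],
#                [4, 0, 5]]
indptr = [0, 2, 3, 5]
indices = [0, 2, 1, 0, 2]
data = [2.0, 1.0, 3.0, 4.0, 5.0]
y = csr_matvec(indptr, indices, data, [1.0, 1.0, 1.0])  # -> [3.0, 3.0, 9.0]
```

    Because only nonzeros are touched, the work grows with the number of mesh edges rather than with the square of the node count, which is why solver choice dominates the scaling behaviour the authors measure.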

  13. Low Cost, Scalable Proteomics Data Analysis Using Amazon's Cloud Computing Services and Open Source Search Algorithms

    PubMed Central

    Halligan, Brian D.; Geiger, Joey F.; Vallejos, Andrew K.; Greene, Andrew S.; Twigger, Simon N.

    2009-01-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step by step instructions on how to implement the virtual proteomics analysis clusters as well as a list of current available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center website (http://proteomics.mcw.edu/vipdac). PMID:19358578

  14. Low cost, scalable proteomics data analysis using Amazon's cloud computing services and open source search algorithms.

    PubMed

    Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N

    2009-06-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of current available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center Web site ( http://proteomics.mcw.edu/vipdac ).

  15. Low Cost Injection Mold Creation via Hybrid Additive and Conventional Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dehoff, Ryan R.; Watkins, Thomas R.; List, III, Frederick Alyious

    2015-12-01

    The purpose of the proposed project between Cummins and ORNL is to significantly reduce the cost of the tooling (machining and materials) required to create injection molds to make plastic components. Presently, the high cost of this tooling forces the design decision to make cast aluminum parts, because Cummins' typical production volumes are too low to allow injection molded plastic parts to be cost effective with the amortized cost of the injection molding tooling. In addition to reducing the weight of components, polymer injection molding allows the opportunity for alternative cooling methods via nitrogen gas. Nitrogen gas cooling offers an environmentally and economically attractive cooling option, if the mold can be manufactured economically. In this project, a current injection molding design was optimized for cooling using nitrogen gas. The various components of the injection mold tooling were fabricated using the Renishaw powder bed laser additive manufacturing technology. Subsequent machining was performed on the as-deposited components to form a working assembly. The injection mold is scheduled to be tested in a production setting at a commercial vendor selected by Cummins.

  16. An economic model to evaluate cost-effectiveness of computer assisted knee replacement surgery in Norway.

    PubMed

    Gøthesen, Øystein; Slover, James; Havelin, Leif; Askildsen, Jan Erik; Malchau, Henrik; Furnes, Ove

    2013-07-06

The use of Computer Assisted Surgery (CAS) for knee replacements is intended to improve the alignment of knee prostheses in order to reduce the number of revision operations. Is the cost-effectiveness of computer assisted surgery influenced by patient volume and age? By employing a Markov model, we analysed the cost-effectiveness of computer assisted surgery versus conventional arthroplasty with respect to implant survival and operation volume in two theoretical Norwegian age cohorts. We obtained mortality and hospital cost data over a 20-year period from Norwegian registers. We presumed that the cost of an intervention would need to be below NOK 500,000 per QALY (Quality Adjusted Life Year) gained to be considered cost effective. The added cost of computer assisted surgery, provided it has no impact on implant survival, is NOK 1037 and NOK 1414 per quality-adjusted life year for 60- and 75-year-olds respectively at a volume of 25 prostheses per year, and NOK 128 and NOK 175 respectively at a volume of 250 prostheses per year. Sensitivity analyses showed that the 10-year implant survival in cohort 1 needs to rise from 89.8% to 90.6% at 25 prostheses per year, and from 89.8% to 89.9% at 250 prostheses per year, for computer assisted surgery to be considered cost effective. In cohort 2, the required improvement is a rise from 95.1% to 95.4% at 25 prostheses per year, and from 95.10% to 95.14% at 250 prostheses per year. The cost of using computer navigation for total knee replacements may be acceptable for 60-year-old as well as 75-year-old patients if the technique increases the implant survival rate even marginally and the department has a high operation volume. A low-volume department might not achieve cost-effectiveness unless computer navigation has a more significant impact on implant survival, and may therefore defer the investment until such data are available.

  17. The performance of low-cost commercial cloud computing as an alternative in computational chemistry.

    PubMed

    Thackston, Russell; Fortenberry, Ryan C

    2015-05-05

The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon Web Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical, "in-house" physical machines. Further considerations are made for the number of cores or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best utilized by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services. © 2015 Wiley Periodicals, Inc.
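The trade-off described above can be made concrete with a back-of-the-envelope cost model: pay-as-you-go cloud pricing versus hardware amortized over its lifetime. All prices, runtimes, and utilization figures below are invented for illustration, not numbers from the study:

```python
# Hypothetical cost-per-job comparison: on-demand cloud instance vs. an
# amortized in-house workstation. All figures are illustrative assumptions.

def cloud_cost_per_job(hourly_rate: float, runtime_hours: float) -> float:
    """Pay-as-you-go: cost accrues only while the job runs."""
    return hourly_rate * runtime_hours

def inhouse_cost_per_job(purchase_price: float, lifetime_years: float,
                         utilization: float, runtime_hours: float) -> float:
    """Amortize hardware over its lifetime, scaled by how busy it is kept."""
    usable_hours = lifetime_years * 365 * 24 * utilization
    return (purchase_price / usable_hours) * runtime_hours

cloud = cloud_cost_per_job(hourly_rate=0.50, runtime_hours=3.0)
local = inhouse_cost_per_job(purchase_price=8000, lifetime_years=4,
                             utilization=0.25, runtime_hours=3.0)
print(f"cloud ${cloud:.2f} vs in-house ${local:.2f} per job")
```

Under these assumed numbers the cloud run is cheaper per job, but the comparison flips as the workstation's utilization rises, which is the paper's point about large, sustained workloads.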

  18. Some Useful Cost-Benefit Criteria for Evaluating Computer-Based Test Delivery Models and Systems

    ERIC Educational Resources Information Center

    Luecht, Richard M.

    2005-01-01

Computer-based testing (CBT) is typically implemented using one of three general test delivery models: (1) multiple fixed testing (MFT); (2) computer-adaptive testing (CAT); or (3) multistage testing (MST). This article reviews some of the real cost drivers associated with CBT implementation--focusing on item production costs, the costs…

  19. Low-Budget Computer Programming in Your School (An Alternative to the Cost of Large Computers). Illinois Series on Educational Applications of Computers. No. 14.

    ERIC Educational Resources Information Center

    Dennis, J. Richard; Thomson, David

    This paper is concerned with a low cost alternative for providing computer experience to secondary school students. The brief discussion covers the programmable calculator and its relevance for teaching the concepts and the rudiments of computer programming and for computer problem solving. A list of twenty-five programming activities related to…

  20. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 11 2014-01-01 2014-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of the...

  1. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 11 2012-01-01 2012-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of the...

  2. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of the...

  3. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 11 2013-01-01 2013-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of the...

  4. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 11 2011-01-01 2011-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of the...

  5. Cost-Effectiveness of Alternative Approaches to Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Levin, Henry M.; And Others

    Operating on the premise that different approaches to computer-assisted instruction (CAI) may use different configurations of hardware and software, different curricula, and different organizational and personnel arrangements, this study explored the feasibility of collecting evaluations of CAI to evaluate the comparative cost-effectiveness of…

  6. A low-cost vector processor boosting compute-intensive image processing operations

    NASA Technical Reports Server (NTRS)

    Adorf, Hans-Martin

    1992-01-01

Low-cost vector processing (VP) is within reach of everyone seriously engaged in scientific computing. The advent of affordable add-on VP boards for standard workstations, complemented by mathematical/statistical libraries, is beginning to impact compute-intensive tasks such as image processing. A case in point is the restoration of distorted images from the Hubble Space Telescope. A low-cost implementation of the standard Tarasko-Richardson-Lucy restoration algorithm is presented on an Intel i860-based VP board seamlessly interfaced to a commercial, interactive image processing system. First experience is reported (including some benchmarks for standalone FFTs) and some conclusions are drawn.

  7. SideRack: A Cost-Effective Addition to Commercial Zebrafish Housing Systems

    PubMed Central

    Burg, Leonard; Gill, Ryan; Balciuniene, Jorune

    2014-01-01

Commercially available aquatic housing systems provide excellent and relatively trouble-free hardware for rearing and housing juvenile as well as adult zebrafish. However, the cost of such systems is quite high and potentially prohibitive for smaller educational and research institutions. The need for tank space prompted us to experiment with various additions to our existing Aquaneering system. We also noted that the high water exchange rates typical of commercial systems are suboptimal for quick growth of juvenile fish. We devised a housing system we call "SideRack," which contains 20 large tanks with air supply and slow water circulation. It enables cost-effective expansion of an existing fish facility, with the key additional benefit of increased growth and maturation rates of juvenile fish. PMID:24611601

  8. DEP : a computer program for evaluating lumber drying costs and investments

    Treesearch

    Stewart Holmes; George B. Harpole; Edward Bilek

    1983-01-01

    The DEP computer program is a modified discounted cash flow computer program designed for analysis of problems involving economic analysis of wood drying processes. Wood drying processes are different from other processes because of the large amounts of working capital required to finance inventories, and because of relatively large shares of costs charged to inventory...

  9. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2017-01-01

    Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws and geometric features were inspected using a 2-megavolt linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed to determine the impact of additive manufacturing on inspectability of objects with complex geometries.

  10. Cost-effectiveness of implementing computed tomography screening for lung cancer in Taiwan.

    PubMed

    Yang, Szu-Chun; Lai, Wu-Wei; Lin, Chien-Chung; Su, Wu-Chou; Ku, Li-Jung; Hwang, Jing-Shiang; Wang, Jung-Der

    2017-06-01

A screening program for lung cancer requires more empirical evidence. Based on the experience of the National Lung Screening Trial (NLST), we developed a method to adjust for lead-time bias and quality-of-life changes in estimating the cost-effectiveness of implementing computed tomography (CT) screening in Taiwan. The target population was high-risk (≥30 pack-years) smokers between 55 and 75 years of age. From a nation-wide, 13-year follow-up cohort, we estimated quality-adjusted life expectancy (QALE), loss-of-QALE, and lifetime healthcare expenditures per case of lung cancer stratified by pathology and stage. Cumulative stage distributions for CT-screening and no-screening were assumed equal to those for CT-screening and radiography-screening in the NLST to estimate the savings of loss-of-QALE and additional costs of lifetime healthcare expenditures after CT screening. Costs attributable to screen-negative subjects, false-positive cases and radiation-induced lung cancer were included to obtain the incremental cost-effectiveness ratio from the public payer's perspective. The incremental costs were US$22,755 per person. After dividing this by savings of loss-of-QALE (1.16 quality-adjusted life years (QALY)), the incremental cost-effectiveness ratio was US$19,683 per QALY. This ratio would fall to US$10,947 per QALY if the stage distribution for CT-screening were the same as that of screen-detected cancers in the NELSON trial. Low-dose CT screening for lung cancer among high-risk smokers would be cost-effective in Taiwan. As only about 5% of women in Taiwan are smokers, future research is necessary to identify high-risk groups among non-smokers and increase the coverage. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
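The headline ratio above is a straightforward division of incremental cost by QALYs saved. A minimal sketch using the abstract's rounded inputs (US$22,755 and 1.16 QALYs) gives a value close to, but not exactly, the published US$19,683 per QALY, because the published summary figures are themselves rounded:

```python
def icer(incremental_cost: float, incremental_qaly: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return incremental_cost / incremental_qaly

# Rounded inputs from the abstract; yields ~US$19,616 per QALY,
# versus the paper's US$19,683 computed from unrounded data.
print(round(icer(22755, 1.16)))
```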

  11. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2016-01-01

    Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.

  12. Cost effectiveness of a computer-delivered intervention to improve HIV medication adherence

    PubMed Central

    2013-01-01

    Background High levels of adherence to medications for HIV infection are essential for optimal clinical outcomes and to reduce viral transmission, but many patients do not achieve required levels. Clinician-delivered interventions can improve patients’ adherence, but usually require substantial effort by trained individuals and may not be widely available. Computer-delivered interventions can address this problem by reducing required staff time for delivery and by making the interventions widely available via the Internet. We previously developed a computer-delivered intervention designed to improve patients’ level of health literacy as a strategy to improve their HIV medication adherence. The intervention was shown to increase patients’ adherence, but it was not clear that the benefits resulting from the increase in adherence could justify the costs of developing and deploying the intervention. The purpose of this study was to evaluate the relation of development and deployment costs to the effectiveness of the intervention. Methods Costs of intervention development were drawn from accounting reports for the grant under which its development was supported, adjusted for costs primarily resulting from the project’s research purpose. Effectiveness of the intervention was drawn from results of the parent study. The relation of the intervention’s effects to changes in health status, expressed as utilities, was also evaluated in order to assess the net cost of the intervention in terms of quality adjusted life years (QALYs). Sensitivity analyses evaluated ranges of possible intervention effectiveness and durations of its effects, and costs were evaluated over several deployment scenarios. Results The intervention’s cost effectiveness depends largely on the number of persons using it and the duration of its effectiveness. Even with modest effects for a small number of patients the intervention was associated with net cost savings in some scenarios and for

  13. Cost effectiveness of a computer-delivered intervention to improve HIV medication adherence.

    PubMed

    Ownby, Raymond L; Waldrop-Valverde, Drenna; Jacobs, Robin J; Acevedo, Amarilis; Caballero, Joshua

    2013-02-28

    High levels of adherence to medications for HIV infection are essential for optimal clinical outcomes and to reduce viral transmission, but many patients do not achieve required levels. Clinician-delivered interventions can improve patients' adherence, but usually require substantial effort by trained individuals and may not be widely available. Computer-delivered interventions can address this problem by reducing required staff time for delivery and by making the interventions widely available via the Internet. We previously developed a computer-delivered intervention designed to improve patients' level of health literacy as a strategy to improve their HIV medication adherence. The intervention was shown to increase patients' adherence, but it was not clear that the benefits resulting from the increase in adherence could justify the costs of developing and deploying the intervention. The purpose of this study was to evaluate the relation of development and deployment costs to the effectiveness of the intervention. Costs of intervention development were drawn from accounting reports for the grant under which its development was supported, adjusted for costs primarily resulting from the project's research purpose. Effectiveness of the intervention was drawn from results of the parent study. The relation of the intervention's effects to changes in health status, expressed as utilities, was also evaluated in order to assess the net cost of the intervention in terms of quality adjusted life years (QALYs). Sensitivity analyses evaluated ranges of possible intervention effectiveness and durations of its effects, and costs were evaluated over several deployment scenarios. The intervention's cost effectiveness depends largely on the number of persons using it and the duration of its effectiveness. Even with modest effects for a small number of patients the intervention was associated with net cost savings in some scenarios and for durations greater than three months and

  14. The economics of time shared computing: Congestion, user costs and capacity

    NASA Technical Reports Server (NTRS)

    Agnew, C. E.

    1982-01-01

Time-shared systems permit the fixed costs of computing resources to be spread over large numbers of users. However, bottleneck results from the theory of closed queueing networks show that this economy of scale will be offset by the increased congestion that results as more users are added to the system. If one considers total costs, including the congestion cost, there is an optimal number of users for a system, which equals the saturation value usually used to define system capacity.
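The saturation behavior described above can be reproduced with exact Mean Value Analysis (MVA) for a closed system with one queueing center (the shared computer) and a delay center (user think time). The service demand and think time below are illustrative assumptions; the saturation population is the classical N* = (Z + D) / D:

```python
def mva(n_users: int, service_demand: float, think_time: float):
    """Exact Mean Value Analysis for a closed system with a single
    queueing center and a delay (think-time) center. Requires n_users >= 1.
    Returns (response_time, throughput) at the given population."""
    q = 0.0  # mean queue length at the computer with zero users
    for n in range(1, n_users + 1):
        r = service_demand * (1.0 + q)   # response time seen by an arrival
        x = n / (r + think_time)         # system throughput
        q = x * r                        # Little's law at the computer
    return r, x

D, Z = 0.1, 10.0                 # service demand and think time (seconds)
saturation = (Z + D) / D         # population where congestion takes over
r_low, x_low = mva(10, D, Z)     # well below saturation: r stays near D
r_high, x_high = mva(200, D, Z)  # well above: r grows, throughput caps at 1/D
print(saturation, r_low, r_high, x_high)
```

Below roughly 101 users the response time stays near the bare service demand; above it, each added user simply queues, which is the congestion cost that offsets the economy of scale.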

  15. Computational algorithm to evaluate product disassembly cost index

    NASA Astrophysics Data System (ADS)

    Zeid, Ibrahim; Gupta, Surendra M.

    2002-02-01

Environmentally conscious manufacturing is an important paradigm in today's engineering practice, and disassembly is a crucial factor in implementing it. Disassembly allows the reuse and recycling of parts and products that have reached the end of their life cycle. Many questions must be answered before a disassembly decision can be reached; the most important is economic: the cost of disassembling a product is always weighed against the cost of scrapping it. This paper develops a computational tool that allows decision-makers to calculate the disassembly cost of a product and makes it simple to perform 'what if' scenarios fairly quickly. The tool is Web based and has two main parts. The front-end part is a Web page that runs on the client side in a Web browser, while the back-end part is a disassembly engine (servlet) that holds the disassembly knowledge and costing algorithms and runs on the server side. The tool is thus based on the client/server model pervasively utilized throughout the World Wide Web. An example is used to demonstrate the implementation and capabilities of the tool.
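As a rough sketch of the kind of "what if" computation such a tool supports: sum labor and tooling cost over a disassembly sequence, then compare the recovered value net of that cost against the product's scrap value. The operations, times, and rates here are invented for illustration, not taken from the paper:

```python
# Hypothetical disassembly cost index; all numbers are illustrative.
LABOR_RATE = 0.50   # assumed dollars per second of disassembly work

def disassembly_cost(operations):
    """Sum labor and tool cost over (name, seconds, tool_cost) steps."""
    return sum(sec * LABOR_RATE + tool for _, sec, tool in operations)

def worth_disassembling(operations, recovered_value, scrap_value):
    """'What if' check: disassemble only if the recovered value, net of
    disassembly cost, beats simply scrapping the product."""
    return recovered_value - disassembly_cost(operations) > scrap_value

ops = [("unscrew cover", 30, 0.0),
       ("unclip board", 10, 0.0),
       ("desolder connector", 60, 2.0)]
print(disassembly_cost(ops), worth_disassembling(ops, 80.0, 15.0))
```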

  16. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., COMPETITIVE MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Risk Basis § 417.588 Computation... 42 Public Health 3 2012-10-01 2012-10-01 false Computation of adjusted average per capita cost (AAPCC). 417.588 Section 417.588 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF...

  17. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Risk Basis § 417.588 Computation of... 42 Public Health 3 2011-10-01 2011-10-01 false Computation of adjusted average per capita cost (AAPCC). 417.588 Section 417.588 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF...

  18. 42 CFR 417.588 - Computation of adjusted average per capita cost (AAPCC).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... MEDICAL PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Risk Basis § 417.588 Computation of... 42 Public Health 3 2010-10-01 2010-10-01 false Computation of adjusted average per capita cost (AAPCC). 417.588 Section 417.588 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF...

  19. hPIN/hTAN: Low-Cost e-Banking Secure against Untrusted Computers

    NASA Astrophysics Data System (ADS)

    Li, Shujun; Sadeghi, Ahmad-Reza; Schmitz, Roland

We propose hPIN/hTAN, a low-cost token-based e-banking protection scheme for the setting in which the adversary has full control over the user's computer. Compared with existing hardware-based solutions, hPIN/hTAN relies on neither a second trusted channel, nor a secure keypad, nor a computationally expensive encryption module.

  20. Reducing Communication in Algebraic Multigrid Using Additive Variants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vassilevski, Panayot S.; Yang, Ulrike Meier

Algebraic multigrid (AMG) has proven to be an effective scalable solver on many high performance computers. However, its increasing communication complexity on coarser levels has been shown to seriously impact its performance on computers with high communication cost. Additive AMG variants provide increased parallelism and fewer messages per cycle, but generally exhibit slower convergence. Here we present various new additive variants with convergence rates that are significantly improved compared to the classical additive algebraic multigrid method and investigate their potential for decreased communication and improved communication-computation overlap, features that are essential for good performance on future exascale architectures.

  1. Reducing Communication in Algebraic Multigrid Using Additive Variants

    DOE PAGES

    Vassilevski, Panayot S.; Yang, Ulrike Meier

    2014-02-12

Algebraic multigrid (AMG) has proven to be an effective scalable solver on many high performance computers. However, its increasing communication complexity on coarser levels has been shown to seriously impact its performance on computers with high communication cost. Additive AMG variants provide increased parallelism and fewer messages per cycle, but generally exhibit slower convergence. Here we present various new additive variants with convergence rates that are significantly improved compared to the classical additive algebraic multigrid method and investigate their potential for decreased communication and improved communication-computation overlap, features that are essential for good performance on future exascale architectures.

  2. Cost analysis for computer supported multiple-choice paper examinations

    PubMed Central

    Mandel, Alexander; Hörnlein, Alexander; Ifland, Marianus; Lüneburg, Edeltraud; Deckert, Jürgen; Puppe, Frank

    2011-01-01

Introduction: Multiple-choice examinations are still fundamental for assessment in medical degree programs. In addition to content-related research, the optimization of the technical procedure is an important question. Medical examiners face three options: paper-based examinations with or without computer support, or completely electronic examinations. Critical aspects are the effort for formatting, the logistic effort during the actual examination, the quality, promptness and effort of the correction, the time for making the documents available for inspection by the students, and the statistical analysis of the examination results. Methods: For the past three semesters, a computer program for the input and formatting of MC questions in medical and other paper-based examinations has been in use and continuously improved at Wuerzburg University. In the winter semester (WS) 2009/10 eleven, in the summer semester (SS) 2010 twelve, and in WS 2010/11 thirteen medical examinations were administered with the program and automatically evaluated. For the last two semesters the remaining manual workload was recorded. Results: The effort for formatting and subsequent analysis (including adjustments of the analysis) of an average examination with about 140 participants and about 35 questions was 5-7 hours for exams without complications in WS 2009/10, about 2 hours in SS 2010, and about 1.5 hours in WS 2010/11. Including exams with complications, the average time was about 3 hours per exam in SS 2010 and 2.67 hours in WS 2010/11. Discussion: For conventional multiple-choice exams, computer-based formatting and evaluation of paper-based exams offers a significant time reduction for lecturers in comparison with manual correction, and compared to purely electronically conducted exams it needs a much simpler technological infrastructure and fewer staff during the exam. PMID:22205913

  3. Cost analysis for computer supported multiple-choice paper examinations.

    PubMed

    Mandel, Alexander; Hörnlein, Alexander; Ifland, Marianus; Lüneburg, Edeltraud; Deckert, Jürgen; Puppe, Frank

    2011-01-01

Multiple-choice examinations are still fundamental for assessment in medical degree programs. In addition to content-related research, the optimization of the technical procedure is an important question. Medical examiners face three options: paper-based examinations with or without computer support, or completely electronic examinations. Critical aspects are the effort for formatting, the logistic effort during the actual examination, the quality, promptness and effort of the correction, the time for making the documents available for inspection by the students, and the statistical analysis of the examination results. For the past three semesters, a computer program for the input and formatting of MC questions in medical and other paper-based examinations has been in use and continuously improved at Wuerzburg University. In the winter semester (WS) 2009/10 eleven, in the summer semester (SS) 2010 twelve, and in WS 2010/11 thirteen medical examinations were administered with the program and automatically evaluated. For the last two semesters the remaining manual workload was recorded. The effort for formatting and subsequent analysis (including adjustments of the analysis) of an average examination with about 140 participants and about 35 questions was 5-7 hours for exams without complications in WS 2009/10, about 2 hours in SS 2010, and about 1.5 hours in WS 2010/11. Including exams with complications, the average time was about 3 hours per exam in SS 2010 and 2.67 hours in WS 2010/11. For conventional multiple-choice exams, computer-based formatting and evaluation of paper-based exams offers a significant time reduction for lecturers in comparison with manual correction, and compared to purely electronically conducted exams it needs a much simpler technological infrastructure and fewer staff during the exam.

  4. Shortcomings of low-cost imaging systems for viewing computed radiographs.

    PubMed

    Ricke, J; Hänninen, E L; Zielinski, C; Amthauer, H; Stroszczynski, C; Liebig, T; Wolf, M; Hosten, N

    2000-01-01

To assess potential advantages of a new PC-based viewing tool featuring image post-processing for viewing computed radiographs on low-cost hardware (a PC with a common display card and color monitor), and to evaluate the effect of using color versus monochrome monitors. Computed radiographs of a statistical phantom were viewed on a PC, with and without post-processing (spatial frequency and contrast processing), employing a monochrome or a color monitor. Findings were compared with viewing on a radiological workstation and evaluated with ROC analysis. Image post-processing significantly improved the perception of low-contrast details irrespective of the monitor used. No significant difference in perception was observed between monochrome and color monitors. Review at the radiological workstation was superior to review on the PC even with image processing. The lower-quality hardware (graphics card and monitor) used in low-cost PCs negatively affects the perception of low-contrast details in computed radiographs. In this situation, it is highly recommended to use spatial frequency and contrast processing. No significant quality gain was observed for the high-end monochrome monitor compared to the color display; however, the color monitor was more strongly affected by high ambient illumination.

  5. Comparison of different strategies in prenatal screening for Down's syndrome: cost effectiveness analysis of computer simulation.

    PubMed

    Gekas, Jean; Gagné, Geneviève; Bujold, Emmanuel; Douillard, Daniel; Forest, Jean-Claude; Reinharz, Daniel; Rousseau, François

    2009-02-13

To assess and compare the cost effectiveness of three different strategies for prenatal screening for Down's syndrome (integrated test, sequential screening, and contingent screenings) and to determine the most useful cut-off values for risk. Computer simulations to study integrated, sequential, and contingent screening strategies with various cut-offs leading to 19 potential screening algorithms. The computer simulation was populated with data from the Serum Urine and Ultrasound Screening Study (SURUSS), real unit costs for healthcare interventions, and a population of 110 948 pregnancies from the province of Québec for the year 2001. Cost effectiveness ratios, incremental cost effectiveness ratios, and screening options' outcomes. The contingent screening strategy dominated all other screening options: it had the best cost effectiveness ratio ($C26,833 per case of Down's syndrome) with fewer procedure related euploid miscarriages and unnecessary terminations (respectively, 6 and 16 per 100,000 pregnancies). It also outperformed serum screening at the second trimester. In terms of the incremental cost effectiveness ratio, contingent screening was still dominant: compared with screening based on maternal age alone, the savings were $C30,963 per additional birth with Down's syndrome averted. Contingent screening was the only screening strategy that offered early reassurance to the majority of women (77.81%) in first trimester and minimised costs by limiting retesting during the second trimester (21.05%). For the contingent and sequential screening strategies, the choice of cut-off value for risk in the first trimester test significantly affected the cost effectiveness ratios (respectively, from $C26,833 to $C37,260 and from $C35,215 to $C45,314 per case of Down's syndrome), the number of procedure related euploid miscarriages (from 6 to 46 and from 6 to 45 per 100,000 pregnancies), and the number of unnecessary terminations (from 16 to 26 and from 16 to 25 per 100

  6. A Cost-Utility Analysis of Lung Cancer Screening and the Additional Benefits of Incorporating Smoking Cessation Interventions

    PubMed Central

    Villanti, Andrea C.; Jiang, Yiding; Abrams, David B.; Pyenson, Bruce S.

    2013-01-01

    Background A 2011 report from the National Lung Screening Trial indicates that three annual low-dose computed tomography (LDCT) screenings for lung cancer reduced lung cancer mortality by 20% compared to chest X-ray among older individuals at high risk for lung cancer. Discussion has shifted from clinical proof to financial feasibility. The goal of this study was to determine whether LDCT screening for lung cancer in a commercially-insured population (aged 50–64) at high risk for lung cancer is cost-effective and to quantify the additional benefits of incorporating smoking cessation interventions in a lung cancer screening program. Methods and Findings The current study builds upon a previous simulation model to estimate the cost-utility of annual, repeated LDCT screenings over 15 years in a high risk hypothetical cohort of 18 million adults between age 50 and 64 with 30+ pack-years of smoking history. In the base case, the lung cancer screening intervention cost $27.8 billion over 15 years and yielded 985,284 quality-adjusted life years (QALYs) gained for a cost-utility ratio of $28,240 per QALY gained. Adding smoking cessation to these annual screenings resulted in increases in both the costs and QALYs saved, reflected in cost-utility ratios ranging from $16,198 per QALY gained to $23,185 per QALY gained. Annual LDCT lung cancer screening in this high risk population remained cost-effective across all sensitivity analyses. Conclusions The findings of this study indicate that repeat annual lung cancer screening in a high risk cohort of adults aged 50–64 is highly cost-effective. Offering smoking cessation interventions with the annual screening program improved the cost-effectiveness of lung cancer screening between 20% and 45%. The cost-utility ratios estimated in this study were in line with other accepted cancer screening interventions and support inclusion of annual LDCT screening for lung cancer in a high risk population in clinical recommendations. PMID

  7. Low-cost autonomous perceptron neural network inspired by quantum computation

    NASA Astrophysics Data System (ADS)

    Zidan, Mohammed; Abdel-Aty, Abdel-Haleem; El-Sadek, Alaa; Zanaty, E. A.; Abdel-Aty, Mahmoud

    2017-11-01

    Achieving low-cost learning with reliable accuracy is an important goal for building intelligent machines that save time and energy and can learn on machines with limited computational resources. In this paper, we propose an efficient algorithm for a perceptron neural network inspired by quantum computing, composed of a single neuron, that classifies linearly separable applications after a single training iteration, O(1). The algorithm is applied to a real-world data set, and the results outperform other state-of-the-art algorithms.
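    For context, a minimal classical single-neuron perceptron on a linearly separable problem (logical AND) is sketched below. This is only the baseline model the paper's quantum-inspired variant accelerates, not the O(1) algorithm itself; the learning rate and epoch count are arbitrary illustrative choices.

```python
# Classical single-neuron perceptron trained on the linearly separable
# AND problem with the standard perceptron learning rule.
def train_perceptron(samples, labels, lr=0.1, epochs=20):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = y - pred                      # perceptron update rule
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]                                # logical AND
w, b = train_perceptron(X, y)
preds = [1 if w[0] * a + w[1] * c + b > 0 else 0 for a, c in X]
print(preds)  # converges to [0, 0, 0, 1]
```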

  8. Grid connected integrated community energy system. Phase II: final state 2 report. Cost benefit analysis, operating costs and computer simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-03-22

    A grid-connected Integrated Community Energy System (ICES) with a coal-burning power plant located on the University of Minnesota campus is planned. The cost benefit analysis performed for this ICES, the cost accounting methods used, and a computer simulation of the operation of the power plant are described. (LCL)

  9. Computed tomographic colonography to screen for colorectal cancer, extracolonic cancer, and aortic aneurysm: model simulation with cost-effectiveness analysis.

    PubMed

    Hassan, Cesare; Pickhardt, Perry J; Laghi, Andrea; Kim, Daniel H; Zullo, Angelo; Iafrate, Franco; Di Giulio, Lorenzo; Morini, Sergio

    2008-04-14

    In addition to detecting colorectal neoplasia, abdominal computed tomography (CT) with colonography technique (CTC) can also detect unsuspected extracolonic cancers and abdominal aortic aneurysms (AAA). The efficacy and cost-effectiveness of this combined abdominal CT screening strategy are unknown. A computerized Markov model was constructed to simulate the occurrence of colorectal neoplasia, extracolonic malignant neoplasm, and AAA in a hypothetical cohort of 100,000 subjects from the United States who were 50 years of age. Simulated screening with CTC, using a 6-mm polyp size threshold for reporting, was compared with a competing model of optical colonoscopy (OC), both without and with abdominal ultrasonography for AAA detection (OC-US strategy). In the simulated population, CTC was the dominant screening strategy, gaining an additional 1458 and 462 life-years compared with the OC and OC-US strategies and being less costly, with a savings of $266 and $449 per person, respectively. The additional gains for CTC were largely due to a decrease in AAA-related deaths, whereas the modeled benefit from extracolonic cancer downstaging was a relatively minor factor. At sensitivity analysis, OC-US became more cost-effective only when the CTC sensitivity for large polyps dropped to 61% or when broad variations of costs were simulated, such as an increase in CTC cost from $814 to $1300 or a decrease in OC cost from $1100 to $500. With the OC-US approach, suboptimal compliance had a strong negative influence on efficacy and cost-effectiveness. The estimated mortality from CT-induced cancer was less than estimated colonoscopy-related mortality (8 vs 22 deaths), both of which were minor compared with the positive benefit from screening.
When detection of extracolonic findings such as AAA and extracolonic cancer is considered in addition to colorectal neoplasia in our model simulation, CT colonography is a dominant screening strategy (ie, more clinically effective and more cost-effective).

  10. Low-cost computing and network communication for a point-of-care device to perform a 3-part leukocyte differential

    NASA Astrophysics Data System (ADS)

    Powless, Amy J.; Feekin, Lauren E.; Hutcheson, Joshua A.; Alapat, Daisy V.; Muldoon, Timothy J.

    2016-03-01

    Point-of-care approaches for 3-part leukocyte differentials (granulocyte, monocyte, and lymphocyte), traditionally performed using a hematology analyzer within a panel of tests called a complete blood count (CBC), are essential not only to reduce cost but to provide faster results in low-resource areas. Recent developments in lab-on-a-chip devices have shown promise in reducing the size and reagents used, relating to a decrease in overall cost. Furthermore, smartphone diagnostic approaches have shown much promise in the area of point-of-care diagnostics, but the relatively high per-unit cost may limit their utility in some settings. We present here a method to reduce the computing cost of a simple epi-fluorescence imaging system using a Raspberry Pi (single-board computer, <$40) to perform a 3-part leukocyte differential comparable to results from a hematology analyzer. This system uses a USB color camera in conjunction with a leukocyte-selective vital dye (acridine orange) in order to determine a leukocyte count and differential from a low volume (<20 microliters) of whole blood obtained via fingerstick. Additionally, the system utilizes a "cloud-based" approach to send image data from the Raspberry Pi to a main server and return results back to the user, exporting the bulk of the computational requirements. Six images were acquired per minute with up to 200 cells per field of view. Preliminary results showed that the differential count varied significantly in monocytes with a 1 minute time difference, indicating the importance of time-gating to produce an accurate/consistent differential.

  11. A unified RANS–LES model: Computational development, accuracy and cost

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopalan, Harish, E-mail: hgopalan@uwyo.edu; Heinz, Stefan, E-mail: heinz@uwyo.edu; Stöllinger, Michael K., E-mail: MStoell@uwyo.edu

    2013-09-15

    Large eddy simulation (LES) is computationally extremely expensive for the investigation of wall-bounded turbulent flows at high Reynolds numbers. A way to reduce the computational cost of LES by orders of magnitude is to combine LES equations with Reynolds-averaged Navier–Stokes (RANS) equations used in the near-wall region. A large variety of such hybrid RANS–LES methods are currently in use such that there is the question of which hybrid RANS–LES method represents the optimal approach. The properties of an optimal hybrid RANS–LES model are formulated here by taking reference to fundamental properties of fluid flow equations. It is shown that unified RANS–LES models derived from an underlying stochastic turbulence model have the properties of optimal hybrid RANS–LES models. The rest of the paper is organized in two parts. First, a priori and a posteriori analyses of channel flow data are used to find the optimal computational formulation of the theoretically derived unified RANS–LES model and to show that this computational model, which is referred to as linear unified model (LUM), does also have all the properties of an optimal hybrid RANS–LES model. Second, a posteriori analyses of channel flow data are used to study the accuracy and cost features of the LUM. The following conclusions are obtained. (i) Compared to RANS, which require evidence for their predictions, the LUM has the significant advantage that the quality of predictions is relatively independent of the RANS model applied. (ii) Compared to LES, the significant advantage of the LUM is a cost reduction of high-Reynolds number simulations by a factor of 0.07 Re^0.46. For coarse grids, the LUM has a significant accuracy advantage over corresponding LES. (iii) Compared to other usually applied hybrid RANS–LES models, it is shown that the LUM provides significantly improved predictions.
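    The quoted cost-reduction factor can be evaluated directly; the sketch below assumes the factor 0.07 Re^0.46 applies as written, and the Reynolds numbers are illustrative choices, not values from the paper:

```python
# Evaluate the quoted LES cost-reduction factor 0.07 * Re**0.46
# at a few illustrative Reynolds numbers.
for Re in (1e4, 1e5, 1e6):
    factor = 0.07 * Re ** 0.46
    print(f"Re = {Re:.0e}: cost reduction by a factor of ~{factor:.0f}")
```

At Re = 10^6 this gives a reduction of roughly a factor of 40 relative to wall-resolved LES.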

  12. Minnesota Computer Aided Library System (MCALS); University of Minnesota Subsystem Cost/Benefits Analysis.

    ERIC Educational Resources Information Center

    Lourey, Eugene D., Comp.

    The Minnesota Computer Aided Library System (MCALS) provides a basis of unification for library service program development in Minnesota for eventual linkage to the national information network. A prototype plan for communications functions is illustrated. A cost/benefits analysis was made to show the cost/effectiveness potential for MCALS. System…

  13. Cost effectiveness of the addition of a comprehensive CT scan to the abdomen and pelvis for the detection of cancer after unprovoked venous thromboembolism.

    PubMed

    Coyle, Kathryn; Carrier, Marc; Lazo-Langner, Alejandro; Shivakumar, Sudeep; Zarychanski, Ryan; Tagalakis, Vicky; Solymoss, Susan; Routhier, Nathalie; Douketis, James; Coyle, Douglas

    2017-03-01

    Unprovoked venous thromboembolism (VTE) can be the first manifestation of cancer. It is unclear whether extensive screening for occult cancer, including a comprehensive computed tomography (CT) scan of the abdomen/pelvis, is cost-effective in this patient population. The objectives were to assess the health-care-related costs, number of missed cancer cases and health-related utility values of a limited screening strategy with and without the addition of a comprehensive CT scan of the abdomen/pelvis, and to identify to what extent testing should be done in these circumstances to allow early detection of occult cancers. A cost-effectiveness analysis was conducted using data collected alongside the SOME randomized controlled trial, which compared extensive occult cancer screening including a CT of the abdomen/pelvis with a more limited screening strategy in patients with a first unprovoked VTE. Analyses were conducted with a one-year time horizon from a Canadian health care perspective. The primary analysis was based on complete cases, with sensitivity analysis using appropriate multiple imputation methods to account for missing data. Data from a total of 854 patients with a first unprovoked VTE were included in these analyses. The addition of a comprehensive CT scan was associated with higher costs ($551 CDN) with no improvement in utility values or number of missed cancers. Results were consistent when adopting multiple imputation methods. The addition of a comprehensive CT scan of the abdomen/pelvis for the screening of occult cancer in patients with unprovoked VTE is not cost effective, as it is both more costly and not more effective in detecting occult cancer. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Cost considerations in automating the library.

    PubMed Central

    Bolef, D

    1987-01-01

    The purchase price of a computer and its software is but a part of the cost of any automated system. There are many additional costs, including one-time costs of terminals, printers, multiplexors, microcomputers, consultants, workstations and retrospective conversion, and ongoing costs of maintenance and maintenance contracts for the equipment and software, telecommunications, and supplies. This paper examines those costs in an effort to produce a more realistic picture of an automated system. PMID:3594021

  15. Demonstration of Cost-Effective, High-Performance Computing at Performance and Reliability Levels Equivalent to a 1994 Vector Supercomputer

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa

    2000-01-01

    The Affordable High Performance Computing (AHPC) project demonstrated that high-performance computing based on a distributed network of computer workstations is a cost-effective alternative to vector supercomputers for running CPU- and memory-intensive design and analysis tools. The AHPC project created an integrated system called a Network Supercomputer. By connecting computer workstations through a network and utilizing the workstations when they are idle, the resulting distributed-workstation environment has the same performance and reliability levels as the Cray C90 vector supercomputer at less than 25 percent of the C90 cost. In fact, the cost comparison between a Cray C90 supercomputer and Sun workstations showed that the number of distributed networked workstations equivalent to a C90 costs approximately 8 percent of the C90.

  16. Is computer aided detection (CAD) cost effective in screening mammography? A model based on the CADET II study

    PubMed Central

    2011-01-01

    Background Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. Cost (Pound (£), year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7 year time horizon. As the equipment cost varies according to the unit size, a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results CAD is cost-increasing for all sizes of screening unit. The introduction of CAD is cost-increasing compared to double reading because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment will result in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions Without improvements in CAD effectiveness (e.g. a decrease in the recall rate) CAD is unlikely to be a cost-effective alternative to double reading for mammography screening in the UK. This study

  17. Open-source meteor detection software for low-cost single-board computers

    NASA Astrophysics Data System (ADS)

    Vida, D.; Zubović, D.; Šegon, D.; Gural, P.; Cupec, R.

    2016-01-01

    This work aims to overcome the current price threshold of meteor stations which can sometimes deter meteor enthusiasts from owning one. In recent years small card-sized computers became widely available and are used for numerous applications. To utilize such computers for meteor work, software which can run on them is needed. In this paper we present a detailed description of newly-developed open-source software for fireball and meteor detection optimized for running on low-cost single board computers. Furthermore, an update on the development of automated open-source software which will handle video capture, fireball and meteor detection, astrometry and photometry is given.

  18. Using 3D Printing (Additive Manufacturing) to Produce Low-Cost Simulation Models for Medical Training.

    PubMed

    Lichtenberger, John P; Tatum, Peter S; Gada, Satyen; Wyn, Mark; Ho, Vincent B; Liacouras, Peter

    2018-03-01

    This work describes customized, task-specific simulation models derived from 3D printing in clinical settings and medical professional training programs. Simulation models/task trainers have an array of purposes and desired achievements for the trainee; defining these is the first step in the production process. After this purpose is defined, computer-aided design and 3D printing (additive manufacturing) are used to create a customized anatomical model. Simulation models then undergo initial in-house testing by medical specialists followed by larger scale beta testing. Feedback is acquired, via surveys, to validate effectiveness and to guide or determine whether any future modifications and/or improvements are necessary. Numerous custom simulation models have been successfully completed, with resulting task trainers designed for procedures including removal of ocular foreign bodies, ultrasound-guided joint injections, nerve block injections, and various suturing and reconstruction procedures. These task trainers have been frequently utilized in the delivery of simulation-based training, with increasing demand. 3D printing has been integral to the production of limited-quantity, low-cost simulation models across a variety of medical specialties. In general, production cost is a small fraction of that of a commercial, generic simulation model, if available. These simulation and training models are customized to the educational need and serve an integral role in the education of our military health professionals.

  19. Cost-effective computational method for radiation heat transfer in semi-crystalline polymers

    NASA Astrophysics Data System (ADS)

    Boztepe, Sinan; Gilblas, Rémi; de Almeida, Olivier; Le Maoult, Yannick; Schmidt, Fabrice

    2018-05-01

    This paper introduces a cost-effective numerical model for infrared (IR) heating of semi-crystalline polymers. For the numerical and experimental studies presented here, semi-crystalline polyethylene (PE) was used. The optical properties of PE were experimentally analyzed under varying temperature and the obtained results were used as input in the numerical studies. The model was built on an optically homogeneous medium assumption, while the strong variation in the thermo-optical properties of semi-crystalline PE under heating was taken into account. Thus, the change in the amount of radiative energy absorbed by the PE medium, induced by its temperature-dependent thermo-optical properties, was introduced in the model. The computational study was carried out as an iterative closed-loop computation in which the absorbed radiation was computed using an in-house radiation heat transfer algorithm, RAYHEAT, and the computed results were transferred into the commercial software COMSOL Multiphysics to solve the transient heat transfer problem and predict the temperature field. The predicted temperature field was used to iterate the thermo-optical properties of PE that vary under heating. To analyze the accuracy of the numerical model, experimental analyses were carried out by performing IR-thermographic measurements during the heating of a PE plate. The applicability of the model in terms of computational cost, number of numerical inputs and accuracy is highlighted.

  20. Government regulation and public opposition create high additional costs for field trials with GM crops in Switzerland.

    PubMed

    Bernauer, Thomas; Tribaldos, Theresa; Luginbühl, Carolin; Winzeler, Michael

    2011-12-01

    Field trials with GM crops are not only plant science experiments. They are also social experiments concerning the implications of government-imposed regulatory constraints and public opposition for scientific activity. We assess these implications by estimating the additional costs due to government regulation and public opposition in a recent set of field trials in Switzerland. We find that for every Euro spent on research, an additional 78 cents were spent on security, an additional 31 cents on biosafety, and an additional 17 cents on government regulatory supervision. Hence the total additional spending due to government regulation and public opposition was around 1.26 Euros for every Euro spent on the research per se. These estimates are conservative; they do not include additional costs that are hard to monetize (e.g. stakeholder information and dialogue activities, involvement of various government agencies). We conclude that further field experiments with GM crops in Switzerland are unlikely unless protected sites are set up to reduce these additional costs.
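    The per-Euro cost components quoted above sum directly to the reported total:

```python
# Additional spending per Euro of research, per the abstract's estimates.
additional = {"security": 0.78, "biosafety": 0.31, "regulatory supervision": 0.17}
total = sum(additional.values())
print(f"additional cost per research Euro: {total:.2f} EUR")  # 1.26
```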

  1. Scilab software as an alternative low-cost computing in solving the linear equations problem

    NASA Astrophysics Data System (ADS)

    Agus, Fahrul; Haviluddin

    2017-02-01

    Numerical computation packages are widely used in both teaching and research. These packages include licensed (proprietary) and open-source (non-proprietary) software. One reason to use such a package is the complexity of mathematical functions (e.g., linear problems) and the growing number of variables in linear and non-linear functions. The aim of this paper was to reflect on key aspects related to method, didactics and creative praxis in the teaching of linear equations in higher education. If implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative, low-cost computing environment. In this paper, Scilab is applied to a set of activities related to mathematical models. In this experiment, four numerical methods, Gaussian elimination, Gauss-Jordan, inverse matrix, and lower-upper (LU) decomposition, were implemented. The results of this study show that routines for these numerical methods can be created and explored using Scilab procedures, and that these routines can then be exploited as teaching material for a course.
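    The first of the four methods can be sketched compactly; the study implements these as Scilab routines, and the version below is an equivalent pure-Python sketch of Gaussian elimination with partial pivoting on a small textbook system:

```python
# Gaussian elimination with partial pivoting for a dense linear system
# A x = b, followed by back-substitution.
def gauss_solve(A, b):
    n = len(A)
    # build the augmented matrix [A | b]
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        # partial pivoting: bring the largest pivot into position
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        # eliminate the entries below the pivot
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # back-substitution on the upper-triangular system
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

A = [[2.0, 1.0, -1.0],
     [-3.0, -1.0, 2.0],
     [-2.0, 1.0, 2.0]]
b = [8.0, -11.0, -3.0]
print(gauss_solve(A, b))  # ≈ [2.0, 3.0, -1.0]
```

In Scilab itself the same system is solved with the backslash operator, `x = A \ b`.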

  2. Early assessment of the likely cost-effectiveness of a new technology: A Markov model with probabilistic sensitivity analysis of computer-assisted total knee replacement.

    PubMed

    Dong, Hengjin; Buxton, Martin

    2006-01-01

    The objective of this study is to apply a Markov model to compare the cost-effectiveness of total knee replacement (TKR) using computer-assisted surgery (CAS) with that of TKR using a conventional manual method in the absence of formal clinical trial evidence. A structured search was carried out to identify evidence relating to the clinical outcome, cost, and effectiveness of TKR. Nine Markov states were identified based on the progress of the disease after TKR. Effectiveness was expressed by quality-adjusted life years (QALYs). The simulation was carried out initially for 120 cycles of a month each, starting with 1,000 TKRs. A discount rate of 3.5 percent was used for both cost and effectiveness in the incremental cost-effectiveness analysis. Then, a probabilistic sensitivity analysis was carried out using a Monte Carlo approach with 10,000 iterations. Computer-assisted TKR was a long-term cost-effective technology, but the QALYs gained were small. After the first 2 years, computer-assisted TKR was dominant, as it was cheaper and yielded more QALYs. The incremental cost-effectiveness ratio (ICER) was sensitive to the "effect of CAS," to the CAS extra cost, and to the utility of the state "Normal health after primary TKR," but it was not sensitive to utilities of other Markov states. Both probabilistic and deterministic analyses produced similar cumulative serious or minor complication rates and complex or simple revision rates. They also produced similar ICERs. Compared with conventional TKR, computer-assisted TKR is a cost-saving technology in the long term and may offer small additional QALYs. The "effect of CAS" is to reduce revision rates and complications through more accurate and precise alignment, and although the conclusions from the model, even when allowing for a full probabilistic analysis of uncertainty, are clear, the "effect of CAS" on the rate of revisions awaits long-term clinical evidence.
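    The structure of such a Markov cost-utility model (monthly cycles, discounting, cohort of 1,000) can be sketched in a few lines. All transition probabilities, costs and utilities below are made-up placeholders for a reduced 3-state version, not values from the study:

```python
# Illustrative 3-state Markov cohort model with monthly cycles and a
# 3.5% annual discount rate, in the spirit of the TKR model described.
states = ["post_tkr", "revision", "dead"]
P = {                       # monthly transition probabilities (placeholders)
    "post_tkr": {"post_tkr": 0.995, "revision": 0.004, "dead": 0.001},
    "revision": {"post_tkr": 0.900, "revision": 0.097, "dead": 0.003},
    "dead":     {"post_tkr": 0.0,   "revision": 0.0,   "dead": 1.0},
}
cost = {"post_tkr": 50.0, "revision": 4000.0, "dead": 0.0}     # per cycle
utility = {"post_tkr": 0.85, "revision": 0.60, "dead": 0.0}    # annual

cohort = {"post_tkr": 1000.0, "revision": 0.0, "dead": 0.0}
disc = (1 + 0.035) ** (1 / 12)  # monthly discount factor from 3.5% annual
total_cost = total_qalys = 0.0
for cycle in range(120):        # 120 monthly cycles, as in the study
    d = disc ** -cycle
    total_cost += d * sum(cohort[s] * cost[s] for s in states)
    total_qalys += d * sum(cohort[s] * utility[s] / 12 for s in states)
    # advance the cohort one cycle through the transition matrix
    cohort = {t: sum(cohort[s] * P[s][t] for s in states) for t in states}

print(f"discounted cost: {total_cost:,.0f}; discounted QALYs: {total_qalys:,.0f}")
```

Running the model twice, once per arm with arm-specific inputs, and dividing the cost difference by the QALY difference gives the ICER; the probabilistic sensitivity analysis repeats this with inputs drawn from distributions.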

  3. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 3 2014-01-01 2014-01-01 false Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions K Appendix K to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED..., App. K Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage...

  4. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 3 2013-01-01 2013-01-01 false Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions K Appendix K to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED..., App. K Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage...

  5. 48 CFR 3452.216-70 - Additional cost principles.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... practice is to treat these costs by some other method, they may be accepted if they are found to be reasonable and equitable. Bid and proposal costs do not include independent research and development costs or pre-award costs. (b) Independent research and development costs. Independent research and development...

  6. Resources and costs for microbial sequence analysis evaluated using virtual machines and cloud computing.

    PubMed

    Angiuoli, Samuel V; White, James R; Matalka, Malcolm; White, Owen; Fricke, W Florian

    2011-01-01

    The widespread popularity of genomic applications is threatened by the "bioinformatics bottleneck" resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. 
Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S rRNA amplicon sequencing, microbial single

  7. Resources and Costs for Microbial Sequence Analysis Evaluated Using Virtual Machines and Cloud Computing

    PubMed Central

    Angiuoli, Samuel V.; White, James R.; Matalka, Malcolm; White, Owen; Fricke, W. Florian

    2011-01-01

    Background The widespread popularity of genomic applications is threatened by the “bioinformatics bottleneck” resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. Results We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Conclusions Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S r

  8. Predicting Cost/Performance Trade-Offs for Whitney: A Commodity Computing Cluster

    NASA Technical Reports Server (NTRS)

    Becker, Jeffrey C.; Nitzberg, Bill; VanderWijngaart, Rob F.; Kutler, Paul (Technical Monitor)

    1997-01-01

    Recent advances in low-end processor and network technology have made it possible to build a "supercomputer" out of commodity components. We develop simple models of the NAS Parallel Benchmarks version 2 (NPB 2) to explore the cost/performance trade-offs involved in building a balanced parallel computer supporting a scientific workload. We develop closed form expressions detailing the number and size of messages sent by each benchmark. Coupling these with measured single processor performance, network latency, and network bandwidth, our models predict benchmark performance to within 30%. A comparison based on total system cost reveals that current commodity technology (200 MHz Pentium Pros with 100baseT Ethernet) is well balanced for the NPBs up to a total system cost of around $1,000,000.
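    The style of model described, closed-form message counts combined with measured single-processor performance, network latency and bandwidth, can be sketched as follows; the workload and hardware numbers below are illustrative placeholders, not NPB 2 measurements:

```python
# Toy version of the closed-form performance model described above:
# predicted runtime = compute time + message count * latency + bytes / bandwidth.
def predicted_runtime(flops, n_msgs, total_bytes,
                      flops_per_s, latency_s, bandwidth_Bps):
    compute = flops / flops_per_s
    comm = n_msgs * latency_s + total_bytes / bandwidth_Bps
    return compute + comm

# Illustrative per-node workload on hardware loosely like the paper's era
# (a ~200 MFLOP/s processor on 100baseT Ethernet: ~100 us latency, ~10 MB/s).
t = predicted_runtime(flops=5e9, n_msgs=2_000, total_bytes=200e6,
                      flops_per_s=200e6, latency_s=100e-6, bandwidth_Bps=10e6)
print(f"predicted runtime: {t:.1f} s")
```

Comparing the compute and communication terms for a given benchmark is what reveals whether a configuration is "balanced", i.e. neither term dominates.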

  9. The cost of an additional disability-free life year for older Americans: 1992-2005.

    PubMed

    Cai, Liming

    2013-02-01

    To estimate the cost of an additional disability-free life year for older Americans in 1992-2005. This study used the 1992-2005 Medicare Current Beneficiary Survey, a longitudinal survey of Medicare beneficiaries with a rotating panel design. The analysis used a multistate life table model to estimate probabilities of transition among a discrete set of health states (nondisabled, disabled, and dead) for two panels of older Americans in 1992 and 2002. Health spending incurred between annual health interviews was estimated by a generalized linear mixed model. Health status, including death, was simulated for each member of the panel using these transition probabilities; the associated health spending was cross-walked to the simulated health changes. Disability-free life expectancy (DFLE) increased significantly more than life expectancy during the study period. Assuming that 50 percent of the gains in DFLE between 1992 and 2005 were attributable to increases in spending, the average discounted cost per additional disability-free life year was $71,000. There were small differences between gender and racial/ethnic groups. The cost of an additional disability-free life year was substantially below previous estimates based on mortality trends alone. © Health Research and Educational Trust.
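    A simplified version of the calculation, discounted incremental spending divided by the gain in disability-free life expectancy, can be sketched as follows. The spending level, horizon, DFLE gain and 3% discount rate below are illustrative assumptions chosen only to land near the reported order of magnitude, not the study's inputs:

```python
# Sketch: cost per additional disability-free life year as the present
# value of extra annual spending divided by the DFLE gain (in years).
def cost_per_dfly(extra_spending_per_year, years, dfle_gain, rate=0.03):
    pv = sum(extra_spending_per_year / (1 + rate) ** t for t in range(years))
    return pv / dfle_gain

# e.g. $3,500/yr extra spending over 13 years buying 0.55 extra DFLE years
print(f"${cost_per_dfly(3_500, 13, 0.55):,.0f} per disability-free life year")
```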

  10. A lightweight distributed framework for computational offloading in mobile cloud computing.

    PubMed

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.

  11. 25 CFR 171.555 - What additional costs will I incur if I am granted a Payment Plan?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false What additional costs will I incur if I am granted a... AND WATER IRRIGATION OPERATION AND MAINTENANCE Financial Matters: Assessments, Billing, and Collections § 171.555 What additional costs will I incur if I am granted a Payment Plan? You will incur the...

  12. 25 CFR 171.555 - What additional costs will I incur if I am granted a Payment Plan?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false What additional costs will I incur if I am granted a... AND WATER IRRIGATION OPERATION AND MAINTENANCE Financial Matters: Assessments, Billing, and Collections § 171.555 What additional costs will I incur if I am granted a Payment Plan? You will incur the...

  13. Computer program to perform cost and weight analysis of transport aircraft. Volume 2: Technical volume

    NASA Technical Reports Server (NTRS)

    1973-01-01

    An improved method for estimating aircraft weight and cost using a unique and fundamental approach was developed. The results of this study were integrated into a comprehensive digital computer program, which is intended for use at the preliminary design stage of aircraft development. The program provides a means of computing absolute values for weight and cost, and enables the user to perform trade studies with a sensitivity to detail design and overall structural arrangement. Both batch and interactive graphics modes of program operation are available.

  14. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  15. Cost Per Additional Responder Associated With Ixekizumab and Etanercept in the Treatment of Moderate-to-Severe Psoriasis.

    PubMed

    Feldman, Steven R; Foster, Shonda A; Zhu, Baojin; Burge, Russel; Al Sawah, Sarah; Goldblum, Orin M

    2017-12-01

BACKGROUND: Newer psoriasis treatments can achieve greater levels of efficacy than older systemic therapies; however, current psoriasis costs are substantial. We sought to estimate costs per additional responder associated with ixekizumab and etanercept, versus placebo, using efficacy data from phase 3 clinical trials (UNCOVER-2 and UNCOVER-3). METHODS: In UNCOVER-2/UNCOVER-3, patients received subcutaneous placebo, etanercept 50 mg twice weekly (BIW), or ixekizumab one 80 mg injection every 2 weeks (Q2W) after a 160-mg starting dose. Twelve-week induction-phase Psoriasis Area and Severity Index (PASI) 75, PASI 90, and PASI 100 response rates for ixekizumab, etanercept, and placebo were obtained from pooled data from the overall and United States (US) subgroup intention-to-treat (ITT) populations, and used to calculate numbers needed to treat (NNTs) to achieve one additional PASI 75, PASI 90, or PASI 100 response for ixekizumab Q2W and etanercept BIW versus placebo. Twelve-week drug costs per patient were calculated based on the UNCOVER-2/UNCOVER-3 dosing schedule and wholesale acquisition costs. Mean costs per additional responder for PASI 75, PASI 90, and PASI 100 for each treatment versus placebo were calculated for pooled UNCOVER-2/UNCOVER-3 overall and US subgroup ITT populations. RESULTS: Pooled overall ITT population: costs per additional PASI 75, PASI 90, or PASI 100 responder were US $37,540, US $46,299, or US $80,710 for ixekizumab Q2W and US $57,533, US $120,720, or US $404,695 for etanercept BIW, respectively. US subgroup ITT population: costs per additional PASI 75, PASI 90, or PASI 100 responder were US $38,165, US $49,740, or US $93,536 for ixekizumab Q2W and US $69,580, US $140,881, or US $631,875 for etanercept BIW, respectively.
CONCLUSIONS: Twelve-week costs per additional responder were lower for ixekizumab Q2W than for etanercept BIW across all levels of clearance (PASI 75, PASI 90, and PASI 100) in the pooled UNCOVER-2/UNCOVER-3 overall and US subgroup ITT populations.
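The cost-per-additional-responder method described above reduces to multiplying the per-patient drug cost by the number needed to treat (NNT). A minimal sketch, with purely hypothetical response rates and drug cost (not the UNCOVER trial values):

```python
def cost_per_additional_responder(drug_cost, p_drug, p_placebo):
    """Cost to obtain one extra responder versus placebo:
    per-patient drug cost times the number needed to treat,
    where NNT = 1 / (response rate difference)."""
    nnt = 1.0 / (p_drug - p_placebo)
    return drug_cost * nnt

# Hypothetical inputs: a $20,000 twelve-week course with an 80%
# response rate versus 5% on placebo gives NNT = 1/0.75 ≈ 1.33.
cost = cost_per_additional_responder(20_000, 0.80, 0.05)
# cost ≈ $26,667 per additional responder
```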

  16. Computational cost of two alternative formulations of Cahn-Hilliard equations

    NASA Astrophysics Data System (ADS)

    Paszyński, Maciej; Gurgul, Grzegorz; Łoś, Marcin; Szeliga, Danuta

    2018-05-01

In this paper we propose two formulations of the Cahn-Hilliard equations, which have several applications in cancer growth modeling and material science phase-field simulations. The first formulation uses one C4 partial differential equation (PDE); the second uses two C2 PDEs. Finally, we compare the computational costs of direct solvers for both formulations, using the refined isogeometric analysis (rIGA) approach.
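The splitting that typically underlies such one-equation versus two-equation formulations can be written in standard Cahn-Hilliard notation (the symbols below are conventional, not necessarily the paper's):

```latex
% Single fourth-order equation in the order parameter u:
\frac{\partial u}{\partial t}
  = \nabla \cdot \Bigl( M \,\nabla \bigl( f'(u) - \epsilon^2 \nabla^2 u \bigr) \Bigr)

% Equivalent pair of second-order equations, introducing the
% chemical potential \mu as an auxiliary unknown:
\frac{\partial u}{\partial t} = \nabla \cdot ( M \,\nabla \mu ),
\qquad
\mu = f'(u) - \epsilon^2 \nabla^2 u
```

The first form demands higher-continuity basis functions but fewer unknowns; the second doubles the unknowns while relaxing the continuity requirement, which is why the direct-solver costs of the two formulations can differ.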

  17. 10 CFR Appendix I to Part 504 - Procedures for the Computation of the Real Cost of Capital

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Procedures for the Computation of the Real Cost of Capital I Appendix I to Part 504 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS EXISTING POWERPLANTS Pt. 504, App. I Appendix I to Part 504—Procedures for the Computation of the Real Cost of Capital (a) The firm's real after-tax weighted average...

  18. Cost-effective poster and print production with digital camera and computer technology.

    PubMed

    Chen, M Y; Ott, D J; Rohde, R P; Henson, E; Gelfand, D W; Boehme, J M

    1997-10-01

    The purpose of this report is to describe a cost-effective method for producing black-and-white prints and color posters within a radiology department. Using a high-resolution digital camera, personal computer, and color printer, the average cost of a 5 x 7 inch (12.5 x 17.5 cm) black-and-white print may be reduced from $8.50 to $1 each in our institution. The average cost for a color print (8.5 x 14 inch [21.3 x 35 cm]) varies from $2 to $3 per sheet depending on the selection of ribbons for a color-capable laser printer and the paper used. For a 30-panel, 4 x 8 foot (1.2 x 2.4 m) standard-sized poster, the cost for materials and construction is approximately $100.

  19. A Modeling Framework for Optimal Computational Resource Allocation Estimation: Considering the Trade-offs between Physical Resolutions, Uncertainty and Computational Costs

    NASA Astrophysics Data System (ADS)

    Moslehi, M.; de Barros, F.; Rajagopal, R.

    2014-12-01

Hydrogeological models that represent flow and transport in subsurface domains are usually large-scale with excessive computational complexity and uncertain characteristics. Uncertainty quantification for predicting flow and transport in heterogeneous formations often entails utilizing a numerical Monte Carlo framework, which repeatedly simulates the model according to a random field representing hydrogeological characteristics of the field. The physical resolution (e.g. grid resolution associated with the physical space) for the simulation is customarily chosen based on recommendations in the literature, independent of the number of Monte Carlo realizations. This practice may lead to either excessive computational burden or inaccurate solutions. We propose an optimization-based methodology that considers the trade-off between the following conflicting objectives: time associated with computational costs, statistical convergence of the model predictions and physical errors corresponding to numerical grid resolution. In this research, we optimally allocate computational resources by developing a modeling framework for the overall error based on a joint statistical and numerical analysis and optimizing the error model subject to a given computational constraint. The derived expression for the overall error explicitly takes into account the joint dependence between the discretization error of the physical space and the statistical error associated with Monte Carlo realizations. The accuracy of the proposed framework is verified by applying it to several computationally intensive examples. This framework helps hydrogeologists achieve the optimum physical and statistical resolutions that minimize the error for a given computational budget. Moreover, the influence of the available computational resources and the geometric properties of the contaminant source zone on the optimum resolutions is investigated.
We conclude that the computational
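The trade-off described above, between grid-discretization error and Monte Carlo sampling error under a fixed budget, can be illustrated with a toy error model. Everything here is an assumed sketch (the error constants, exponents, and budget are placeholders, not the paper's derived expression):

```python
import math

def total_error(h, n, c_disc=1.0, p=2, c_stat=1.0):
    """Toy overall-error model: discretization term C1*h^p plus
    Monte Carlo statistical term C2/sqrt(N)."""
    return c_disc * h**p + c_stat / math.sqrt(n)

def best_allocation(budget, cell_cost=1.0, dim=2, length=1.0):
    """Brute-force search over grid spacings h: each h fixes the
    per-realization cost, the budget then buys N realizations, and
    we keep the (h, N) pair with the smallest modeled total error."""
    best = None
    for k in range(2, 200):          # grid has k cells per dimension
        h = length / k
        cost_per_run = cell_cost * (length / h) ** dim
        n = int(budget // cost_per_run)
        if n < 1:
            continue
        err = total_error(h, n)
        if best is None or err < best[0]:
            best = (err, h, n)
    return best

# Hypothetical budget in abstract cost units.
err, h, n = best_allocation(budget=1e6)
```

Refining the grid alone or adding realizations alone both waste budget; the minimum of the joint error model is what the optimization-based allocation targets.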

  20. The Ruggedized STD Bus Microcomputer - A low cost computer suitable for Space Shuttle experiments

    NASA Technical Reports Server (NTRS)

    Budney, T. J.; Stone, R. W.

    1982-01-01

    Previous space flight computers have been costly in terms of both hardware and software. The Ruggedized STD Bus Microcomputer is based on the commercial Mostek/Pro-Log STD Bus. Ruggedized PC cards can be based on commercial cards from more than 60 manufacturers, reducing hardware cost and design time. Software costs are minimized by using standard 8-bit microprocessors and by debugging code using commercial versions of the ruggedized flight boards while the flight hardware is being fabricated.

  1. Assessing Tax Form Distribution Costs: A Proposed Method for Computing the Dollar Value of Tax Form Distribution in a Public Library.

    ERIC Educational Resources Information Center

    Casey, James B.

    1998-01-01

    Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…

  2. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    PubMed Central

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on the modern Smartphones. However, such applications are still constrained by limitations in processing potentials, storage capacity and battery lifetime of the Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resources limitations in SMDs. Currently, a number of computational offloading frameworks are proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resources intensive. The resource constraint nature of SMDs require lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resources utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing prototype application in the real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resources utilization and therefore offers lightweight solution for computational offloading in MCC. PMID:25127245

  3. Computational Fluid Dynamics and Additive Manufacturing to Diagnose and Treat Cardiovascular Disease.

    PubMed

    Randles, Amanda; Frakes, David H; Leopold, Jane A

    2017-11-01

    Noninvasive engineering models are now being used for diagnosing and planning the treatment of cardiovascular disease. Techniques in computational modeling and additive manufacturing have matured concurrently, and results from simulations can inform and enable the design and optimization of therapeutic devices and treatment strategies. The emerging synergy between large-scale simulations and 3D printing is having a two-fold benefit: first, 3D printing can be used to validate the complex simulations, and second, the flow models can be used to improve treatment planning for cardiovascular disease. In this review, we summarize and discuss recent methods and findings for leveraging advances in both additive manufacturing and patient-specific computational modeling, with an emphasis on new directions in these fields and remaining open questions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Using a Computer Game to Reinforce Skills in Addition Basic Facts in Second Grade.

    ERIC Educational Resources Information Center

    Kraus, William H.

    1981-01-01

    A computer-generated game called Fish Chase was developed to present drill-and-practice exercises on addition facts. The subjects of the study were 19 second-grade pupils. The results indicate a computer game can be used effectively to increase proficiency with basic facts. (MP)

  5. Accuracy Maximization Analysis for Sensory-Perceptual Tasks: Computational Improvements, Filter Robustness, and Coding Advantages for Scaled Additive Noise

    PubMed Central

    Burge, Johannes

    2017-01-01

    Accuracy Maximization Analysis (AMA) is a recently developed Bayesian ideal observer method for task-specific dimensionality reduction. Given a training set of proximal stimuli (e.g. retinal images), a response noise model, and a cost function, AMA returns the filters (i.e. receptive fields) that extract the most useful stimulus features for estimating a user-specified latent variable from those stimuli. Here, we first contribute two technical advances that significantly reduce AMA’s compute time: we derive gradients of cost functions for which two popular estimators are appropriate, and we implement a stochastic gradient descent (AMA-SGD) routine for filter learning. Next, we show how the method can be used to simultaneously probe the impact on neural encoding of natural stimulus variability, the prior over the latent variable, noise power, and the choice of cost function. Then, we examine the geometry of AMA’s unique combination of properties that distinguish it from better-known statistical methods. Using binocular disparity estimation as a concrete test case, we develop insights that have general implications for understanding neural encoding and decoding in a broad class of fundamental sensory-perceptual tasks connected to the energy model. Specifically, we find that non-orthogonal (partially redundant) filters with scaled additive noise tend to outperform orthogonal filters with constant additive noise; non-orthogonal filters and scaled additive noise can interact to sculpt noise-induced stimulus encoding uncertainty to match task-irrelevant stimulus variability. Thus, we show that some properties of neural response thought to be biophysical nuisances can confer coding advantages to neural systems. Finally, we speculate that, if repurposed for the problem of neural systems identification, AMA may be able to overcome a fundamental limitation of standard subunit model estimation. As natural stimuli become more widely used in the study of psychophysical and

  6. Weight and cost estimating relationships for heavy lift airships

    NASA Technical Reports Server (NTRS)

    Gray, D. W.

    1979-01-01

    Weight and cost estimating relationships, including additional parameters that influence the cost and performance of heavy-lift airships (HLA), are discussed. Inputs to a closed loop computer program, consisting of useful load, forward speed, lift module positive or negative thrust, and rotors and propellers, are examined. Detail is given to the HLA cost and weight program (HLACW), which computes component weights, vehicle size, buoyancy lift, rotor and propellar thrust, and engine horse power. This program solves the problem of interrelating the different aerostat, rotors, engines and propeller sizes. Six sets of 'default parameters' are left for the operator to change during each computer run enabling slight data manipulation without altering the program.

  7. Clinical and cost effectiveness of computer treatment for aphasia post stroke (Big CACTUS): study protocol for a randomised controlled trial.

    PubMed

    Palmer, Rebecca; Cooper, Cindy; Enderby, Pam; Brady, Marian; Julious, Steven; Bowen, Audrey; Latimer, Nicholas

    2015-01-27

    Aphasia affects the ability to speak, comprehend spoken language, read and write. One third of stroke survivors experience aphasia. Evidence suggests that aphasia can continue to improve after the first few months with intensive speech and language therapy, which is frequently beyond what resources allow. The development of computer software for language practice provides an opportunity for self-managed therapy. This pragmatic randomised controlled trial will investigate the clinical and cost effectiveness of a computerised approach to long-term aphasia therapy post stroke. A total of 285 adults with aphasia at least four months post stroke will be randomly allocated to either usual care, computerised intervention in addition to usual care or attention and activity control in addition to usual care. Those in the intervention group will receive six months of self-managed word finding practice on their home computer with monthly face-to-face support from a volunteer/assistant. Those in the attention control group will receive puzzle activities, supplemented by monthly telephone calls. Study delivery will be coordinated by 20 speech and language therapy departments across the United Kingdom. Outcome measures will be made at baseline, six, nine and 12 months after randomisation by blinded speech and language therapist assessors. Primary outcomes are the change in number of words (of personal relevance) named correctly at six months and improvement in functional conversation. Primary outcomes will be analysed using a Hochberg testing procedure. Significance will be declared if differences in both word retrieval and functional conversation at six months are significant at the 5% level, or if either comparison is significant at 2.5%. A cost utility analysis will be undertaken from the NHS and personal social service perspective. Differences between costs and quality-adjusted life years in the three groups will be described and the incremental cost effectiveness ratio
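The two-outcome significance rule stated in the protocol (both comparisons at 5%, or either at 2.5%) amounts to a small decision function. A sketch; the function name and argument names are illustrative:

```python
def hochberg_two_outcomes(p_word_retrieval, p_conversation, alpha=0.05):
    """Hochberg-style rule for two co-primary comparisons: declare
    success if both p-values are below alpha, or if the smaller
    one is below alpha / 2."""
    p_low, p_high = sorted((p_word_retrieval, p_conversation))
    return p_high < alpha or p_low < alpha / 2

# Both below 0.05 -> success; one below 0.025 -> success;
# otherwise -> no success declared.
```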

  8. The Cost of an Additional Disability-Free Life Year for Older Americans: 1992–2005

    PubMed Central

    Cai, Liming

    2013-01-01

    Objective To estimate the cost of an additional disability-free life year for older Americans in 1992–2005. Data Source This study used 1992–2005 Medicare Current Beneficiary Survey, a longitudinal survey of Medicare beneficiaries with a rotating panel design. Study Design This analysis used multistate life table model to estimate probabilities of transition among a discrete set of health states (nondisabled, disabled, and dead) for two panels of older Americans in 1992 and 2002. Health spending incurred between annual health interviews was estimated by a generalized linear mixed model. Health status, including death, was simulated for each member of the panel using these transition probabilities; the associated health spending was cross-walked to the simulated health changes. Principal Findings Disability-free life expectancy (DFLE) increased significantly more than life expectancy during the study period. Assuming that 50 percent of the gains in DFLE between 1992 and 2005 were attributable to increases in spending, the average discounted cost per additional disability-free life year was $71,000. There were small differences between gender and racial/ethnic groups. Conclusions The cost of an additional disability-free life year was substantially below previous estimates based on mortality trends alone. PMID:22670874

  9. Computational cost for detecting inspiralling binaries using a network of laser interferometric detectors

    NASA Astrophysics Data System (ADS)

    Pai, Archana; Bose, Sukanta; Dhurandhar, Sanjeev

    2002-04-01

We extend a coherent network data-analysis strategy developed earlier for detecting Newtonian waveforms to the case of post-Newtonian (PN) waveforms. Since the PN waveform depends on the individual masses of the inspiralling binary, the parameter-space dimension increases by one from that of the Newtonian case. We obtain the number of templates and estimate the computational costs for PN waveforms: for a lower mass limit of 1 Msolar, for LIGO-I noise and with 3% maximum mismatch, the online computational speed requirement for a single detector is a few Gflops; for a two-detector network it is hundreds of Gflops and for a three-detector network it is tens of Tflops. Apart from idealistic networks, we obtain results for realistic networks comprising LIGO and VIRGO. Finally, we compare costs incurred in a coincidence detection strategy with those incurred in the coherent strategy detailed above.

  10. 78 FR 32224 - Availability of Version 3.1.2 of the Connect America Fund Phase II Cost Model; Additional...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-29

    ... Version 3.1.2 of the Connect America Fund Phase II Cost Model; Additional Discussion Topics in Connect America Cost Model Virtual Workshop AGENCY: Federal Communications Commission. ACTION: Proposed rule... America Cost Model (CAM v3.1.2), which allows Commission staff and interested parties to calculate costs...

  11. Two weeks of additional standing balance circuit classes during inpatient rehabilitation are cost saving and effective: an economic evaluation.

    PubMed

    Treacy, Daniel; Howard, Kirsten; Hayes, Alison; Hassett, Leanne; Schurr, Karl; Sherrington, Catherine

    2018-01-01

    Among people admitted for inpatient rehabilitation, is usual care plus standing balance circuit classes more cost-effective than usual care alone? Cost-effectiveness study embedded within a randomised controlled trial with concealed allocation, assessor blinding and intention-to-treat analysis. 162 rehabilitation inpatients from a metropolitan hospital in Sydney, Australia. The experimental group received a 1-hour standing balance circuit class, delivered three times a week for 2 weeks, in addition to usual therapy. The circuit classes were supervised by one physiotherapist and one physiotherapy assistant for up to eight patients. The control group received usual therapy alone. Costs were estimated from routinely collected hospital use data in the 3 months after randomisation. The functional outcome measure was mobility measured at 3 months using the Short Physical Performance Battery administered by a blinded assessor. An incremental analysis was conducted and the joint probability distribution of costs and outcomes was examined using bootstrapping. The median cost savings for the intervention group was AUD4,741 (95% CI 137 to 9,372) per participant; 94% of bootstraps showed that the intervention was both effective and cost saving. Two weeks of additional standing balance circuit classes delivered in addition to usual therapy resulted in decreased healthcare costs at 3 months in hospital inpatients admitted for rehabilitation. There is a high probability that this intervention is both cost saving and effective. ACTRN12611000412932. [Treacy D, Howard K, Hayes A, Hassett L, Schurr K, Sherrington C (2018) Two weeks of additional standing balance circuit classes during inpatient rehabilitation are cost saving and effective: an economic evaluation. Journal of Physiotherapy 64: 41-47]. Copyright © 2017 Australian Physiotherapy Association. Published by Elsevier B.V. All rights reserved.
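The joint probability statement above ("94% of bootstraps showed that the intervention was both effective and cost saving") comes from resampling costs and outcomes together. A simplified sketch, assuming paired per-participant cost and outcome differences rather than the trial's actual two-group resampling, with toy data:

```python
import random

def prob_cost_saving_and_effective(pairs, n_boot=2000, seed=1):
    """Resample (cost difference, outcome difference) pairs with
    replacement and report the fraction of bootstrap replicates in
    which the intervention is both cost saving (mean cost diff < 0)
    and effective (mean outcome diff > 0)."""
    random.seed(seed)
    hits = 0
    for _ in range(n_boot):
        sample = [random.choice(pairs) for _ in pairs]
        mean_cost = sum(c for c, _ in sample) / len(sample)
        mean_out = sum(o for _, o in sample) / len(sample)
        if mean_cost < 0 and mean_out > 0:
            hits += 1
    return hits / n_boot

# Hypothetical toy data: mostly negative cost differences (savings)
# paired with positive mobility-score differences.
pairs = [(-5000, 1.0), (-3000, 0.5), (2000, -0.2), (-4000, 0.8),
         (-1000, 0.3), (-6000, 1.2), (500, 0.1), (-2500, 0.6)]
prob = prob_cost_saving_and_effective(pairs)
```

Examining the joint distribution, rather than costs and outcomes separately, is what supports the "both cost saving and effective" conclusion.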

  12. 48 CFR 352.216-70 - Additional cost principles.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-Federal contracts, grants, and agreements, including the development of scientific, cost, and other data... method, they may be accepted if they are found to be reasonable and equitable. (4) B & P costs do not...

  13. 48 CFR 3452.216-70 - Additional cost principles.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... scientific, cost, and other data needed to support the bids, proposals, and applications. Bid and proposal... practice is to treat these costs by some other method, they may be accepted if they are found to be...

  14. 48 CFR 3452.216-70 - Additional cost principles.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... scientific, cost, and other data needed to support the bids, proposals, and applications. Bid and proposal... practice is to treat these costs by some other method, they may be accepted if they are found to be...

  15. Cost accounting in radiation oncology: a computer-based model for reimbursement.

    PubMed

    Perez, C A; Kobeissi, B; Smith, B D; Fox, S; Grigsby, P W; Purdy, J A; Procter, H D; Wasserman, T H

    1993-04-02

    The skyrocketing cost of medical care in the United States has resulted in multiple efforts in cost containment. The present work offers a rational computer-based cost accounting approach to determine the actual use of resources in providing a specific service in a radiation oncology center. A procedure-level cost accounting system was developed by using recorded information on actual time and effort spent by individual staff members performing various radiation oncology procedures, and analyzing direct and indirect costs related to staffing (labor), facilities and equipment, supplies, etc. Expenditures were classified as direct or indirect and fixed or variable. A relative value unit was generated to allocate specific cost factors to each procedure. Different costs per procedure were identified according to complexity. Whereas there was no significant difference in the treatment time between low-energy (4 and 6 MV) or high-energy (18 MV) accelerators, there were significantly higher costs identified in the operation of a high-energy linear accelerator, a reflection of initial equipment investment, quality assurance and calibration procedures, maintenance costs, service contract, and replacement parts. Utilization of resources was related to the complexity of the procedures performed and whether the treatments were delivered to inpatients or outpatients. In analyzing time motion for physicians and other staff, it was apparent that a greater effort must be made to train the staff to accurately record all times involved in a given procedure, and it is strongly recommended that each institution perform its own time motion studies to more accurately determine operating costs. Sixty-six percent of our facility's global costs were for labor, 20% for other operating expenses, 10% for space, and 4% for equipment. 
Significant differences were noted in the cost allocation for professional versus technical functions, as labor, space, and equipment costs are higher in the latter.
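Allocating a cost pool to individual procedures via relative value units, as the abstract describes, is a proportional calculation. A minimal sketch; the function and all dollar/RVU figures are hypothetical, not the study's values:

```python
def procedure_cost(rvu, total_costs, total_rvus, direct_supplies=0.0):
    """Allocate a shared cost pool to one procedure in proportion
    to its relative value units (RVUs), then add any supply costs
    traced directly to that procedure."""
    return total_costs * (rvu / total_rvus) + direct_supplies

# Hypothetical pool of $1,000,000 spread over 50,000 annual RVUs;
# a complex treatment worth 40 RVUs with $25 of traced supplies.
cost = procedure_cost(rvu=40, total_costs=1_000_000,
                      total_rvus=50_000, direct_supplies=25.0)
# cost = 1,000,000 * 40/50,000 + 25 = $825
```

Assigning more RVUs to complex procedures (e.g. high-energy accelerator treatments with their quality-assurance and maintenance overhead) is how the model produces different costs per procedure by complexity.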

  16. The Applications of Computers in Education in Developing Countries--with Specific Reference to the Cost-Effectiveness of Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Lai, Kwok-Wing

    Designed to examine the application and cost-effectiveness of computer-assisted instruction (CAI) for secondary education in developing countries, this document is divided into eight chapters. A general introduction defines the research problem, describes the research methodology, and provides definitions of key terms used throughout the paper.…

  17. Strapdown cost trend study and forecast

    NASA Technical Reports Server (NTRS)

    Eberlein, A. J.; Savage, P. G.

    1975-01-01

    The potential cost advantages offered by advanced strapdown inertial technology in future commercial short-haul aircraft are summarized. The initial procurement cost and six-year cost of ownership, which includes spares and direct maintenance costs, were calculated for kinematic and inertial navigation systems so that traditional and strapdown mechanization costs could be compared. Cost results for the inertial navigation systems showed that initial costs and the cost of ownership for traditional triple-redundant gimbaled inertial navigators are three times the cost of the equivalent skewed redundant strapdown inertial navigator. The net cost advantage for the strapdown kinematic system is directly attributable to the reduction in sensor count for strapdown. The strapdown kinematic system has the added advantage of providing a fail-operational inertial navigation capability at no additional cost, due to the use of inertial-grade sensors and attitude reference computers.

  18. Postprocessing of Voxel-Based Topologies for Additive Manufacturing Using the Computational Geometry Algorithms Library (CGAL)

    DTIC Science & Technology

    2015-06-01

    Postprocessing of 3-dimensional (3-D) topologies that are defined as a set of voxels using the Computational Geometry Algorithms Library (CGAL) is described. The library provides computational geometry algorithms, several of which are suited to the task. The work flow described in this report involves first defining a set of...

  19. A retrospective cost-analysis of additional homeopathic treatment in Germany: Long-term economic outcomes

    PubMed Central

    Ostermann, Julia K.; Witt, Claudia M.; Reinhold, Thomas

    2017-01-01

    Objectives This study aimed to provide a long-term cost comparison of patients using additional homeopathic treatment (homeopathy group) with patients using usual care (control group) over an observation period of 33 months. Methods Health claims data from a large statutory health insurance company were analysed from both the societal perspective (primary outcome) and from the statutory health insurance perspective (secondary outcome). To compare costs between patient groups, homeopathy and control patients were matched in a 1:1 ratio using propensity scores. Predictor variables for the propensity scores included health care costs and both medical and demographic variables. Health care costs were analysed using an analysis of covariance, adjusted for baseline costs, between groups both across diagnoses and for specific diagnoses over a period of 33 months. Specific diagnoses included depression, migraine, allergic rhinitis, asthma, atopic dermatitis, and headache. Results Data from 21,939 patients in the homeopathy group (67.4% females) and 21,861 patients in the control group (67.2% females) were analysed. Health care costs over the 33 months were 12,414 EUR [95% CI 12,022–12,805] in the homeopathy group and 10,428 EUR [95% CI 10,036–10,820] in the control group (p<0.0001). The largest cost differences were attributed to productivity losses (homeopathy: EUR 6,289 [6,118–6,460]; control: EUR 5,498 [5,326–5,670], p<0.0001) and outpatient costs (homeopathy: EUR 1,794 [1,770–1,818]; control: EUR 1,438 [1,414–1,462], p<0.0001). Although the costs of the two groups converged over time, cost differences remained over the full 33 months. For all diagnoses, homeopathy patients generated higher costs than control patients. Conclusion The analysis showed that even when following up over 33 months, there were still cost differences between groups, with higher costs in the homeopathy group. PMID:28915242
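    The 1:1 propensity-score matching described above can be illustrated with a minimal greedy nearest-neighbour sketch. Score estimation itself is omitted, and the scores and caliper below are assumed illustrative values, not figures from the study:

```python
def greedy_match(ps_treated, ps_control, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.

    Returns (treated_index, control_index) pairs; each control is used at
    most once, and pairs farther apart than `caliper` are skipped.
    """
    available = set(range(len(ps_control)))
    pairs = []
    for i, p in enumerate(ps_treated):
        if not available:
            break
        j = min(available, key=lambda k: abs(ps_control[k] - p))
        if abs(ps_control[j] - p) <= caliper:
            pairs.append((i, j))
            available.remove(j)
    return pairs

# Toy example: 3 "homeopathy" patients matched against a pool of 5 controls.
treated = [0.30, 0.55, 0.80]
controls = [0.10, 0.32, 0.52, 0.78, 0.90]
print(greedy_match(treated, controls))  # [(0, 1), (1, 2), (2, 3)]
```

    A caliper keeps dissimilar patients from being forced into a pair; unmatched treated patients are simply dropped, as is usual in 1:1 matched designs.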

  20. Thermodynamics of quasideterministic digital computers

    NASA Astrophysics Data System (ADS)

    Chu, Dominique

    2018-02-01

    A central result of stochastic thermodynamics is that irreversible state transitions of Markovian systems entail a cost in terms of an infinite entropy production. A corollary of this is that strictly deterministic computation is not possible. Using a thermodynamically consistent model, we show that quasideterministic computation can be achieved at finite, and indeed modest, cost with accuracies that are indistinguishable from deterministic behavior for all practical purposes. Concretely, we consider the entropy production of stochastic (Markovian) systems that behave like AND and NOT gates. Combinations of these gates can implement any logical function. We require that these gates return the correct result with a probability that is very close to 1 and, additionally, that they do so within finite time. The central component of the model is a machine that can read and write binary tapes. We find that the error probability of the computation of these gates falls with a power of the system size, whereas the cost only increases linearly with the system size.

  1. Nutritional supplementation: the additional costs of managing children infected with HIV in resource-constrained settings.

    PubMed

    Cobb, G; Bland, R M

    2013-01-01

    To explore the financial implications of applying the WHO guidelines for the nutritional management of HIV-infected children in a rural South African HIV programme. WHO guidelines describe Nutritional Care Plans (NCPs) for three categories of HIV-infected children: NCP-A: growing adequately; NCP-B: weight-for-age z-score (WAZ) ≤-2 but no evidence of severe acute malnutrition (SAM), confirmed weight loss/growth curve flattening, or condition with increased nutritional needs (e.g. tuberculosis); NCP-C: SAM. In resource-constrained settings, children requiring NCP-B or NCP-C usually need supplementation to achieve the additional energy recommendation. We estimated the proportion of children initiating antiretroviral treatment (ART) in the Hlabisa HIV Programme who would have been eligible for supplementation in 2010. The cost of supplying 26 weeks of supplementation as a proportion of the cost of supplying ART to the same group was calculated. A total of 251 children aged 6 months to 14 years initiated ART. Eighty-eight required 6 months of NCP-B, including 41 with a WAZ ≤-2 (no evidence of SAM) and 47 with a WAZ >-2 with co-existent morbidities including tuberculosis. Additionally, 25 children had SAM and required 10 weeks of NCP-C followed by 16 weeks of NCP-B. Thus, 113 of 251 (45%) children were eligible for nutritional supplementation at an estimated overall cost of $11,136, using 2010 exchange rates. These costs are an estimated additional 11.6% on top of the cost of supplying 26 weeks of ART to the 251 children initiated. It is essential to address the nutritional needs of HIV-infected children to optimise their health outcomes. Nutritional supplementation should be integral to, and budgeted for in, HIV programmes. © 2012 Blackwell Publishing Ltd.

  2. On Convergence of Development Costs and Cost Models for Complex Spaceflight Instrument Electronics

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Patel, Umeshkumar D.; Kasa, Robert L.; Hestnes, Phyllis; Brown, Tammy; Vootukuru, Madhavi

    2008-01-01

    Development costs of a few recent spaceflight instrument electrical and electronics subsystems have diverged from respective heritage cost model predictions. The cost models used are Grass Roots, Price-H and Parametric Model. These cost models originated in the military and industry around 1970 and were successfully adopted and patched by NASA on a mission-by-mission basis for years. However, the complexity of new instruments recently changed rapidly by orders of magnitude. This is most obvious in the complexity of representative spaceflight instrument electronics' data system. It is now required to perform intermediate processing of digitized data apart from conventional processing of science phenomenon signals from multiple detectors. This involves on-board instrument formatting of computational operands from row data for example, images), multi-million operations per second on large volumes of data in reconfigurable hardware (in addition to processing on a general purpose imbedded or standalone instrument flight computer), as well as making decisions for on-board system adaptation and resource reconfiguration. The instrument data system is now tasked to perform more functions, such as forming packets and instrument-level data compression of more than one data stream, which are traditionally performed by the spacecraft command and data handling system. It is furthermore required that the electronics box for new complex instruments is developed for one-digit watt power consumption, small size and that it is light-weight, and delivers super-computing capabilities. The conflict between the actual development cost of newer complex instruments and its electronics components' heritage cost model predictions seems to be irreconcilable. This conflict and an approach to its resolution are addressed in this paper by determining the complexity parameters, complexity index, and their use in enhanced cost model.

  3. Computing the Expected Cost of an Appointment Schedule for Statistically Identical Customers with Probabilistic Service Times

    PubMed Central

    Dietz, Dennis C.

    2014-01-01

    A cogent method is presented for computing the expected cost of an appointment schedule where customers are statistically identical, the service time distribution has known mean and variance, and customer no-shows occur with time-dependent probability. The approach is computationally efficient and can be easily implemented to evaluate candidate schedules within a schedule optimization algorithm. PMID:24605070
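    The paper's method is analytical, but its ingredients, a service-time distribution with known mean and variance, time-dependent show probabilities, and waiting versus idle costs, can be illustrated with a simple Monte Carlo sketch. This is not the authors' algorithm, and all parameter values below are assumptions:

```python
import random

def expected_cost(slots, show_prob, mean_svc, sd_svc,
                  wait_cost=1.0, idle_cost=2.0, n_sims=20000, seed=1):
    """Monte Carlo estimate of the expected cost of an appointment schedule.

    slots: scheduled appointment times; show_prob[k]: probability that
    customer k shows up (no-shows are time-dependent). Service times are
    drawn from a gamma distribution with the given mean and standard
    deviation, so only the first two moments enter, as in the paper.
    """
    shape = (mean_svc / sd_svc) ** 2      # gamma parameters from mean/variance
    scale = sd_svc ** 2 / mean_svc
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        free_at, cost = 0.0, 0.0          # server is free at clinic opening
        for t, p in zip(slots, show_prob):
            if rng.random() > p:          # no-show: slot is simply skipped
                continue
            start = max(t, free_at)
            cost += wait_cost * (start - t)        # customer waiting time
            cost += idle_cost * (start - free_at)  # server idle time
            free_at = start + rng.gammavariate(shape, scale)
        total += cost
    return total / n_sims

# Four 15-minute slots, 90% show probability, service mean 20 / sd 10 minutes.
print(expected_cost([0, 15, 30, 45], [0.9] * 4, 20, 10))
```

    A schedule optimizer would call such an evaluator on each candidate slot vector; the appeal of the paper's closed-form approach is precisely that it avoids this simulation cost.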

  4. Cost-Effective Additive Manufacturing in Space: HELIOS Technology Challenge Guide

    NASA Technical Reports Server (NTRS)

    DeVieneni, Alayna; Velez, Carlos Andres; Benjamin, David; Hollenbeck, Jay

    2012-01-01

    Welcome to the HELIOS Technology Challenge Guide. This document is intended to serve as a general road map for participants of the HELIOS Technology Challenge [HTC] Program and the associated inaugural challenge: HTC-01: Cost-Effective Additive Manufacturing in Space. Please note that this guide is not a rule book and is not meant to hinder the development of innovative ideas. Its primary goal is to highlight the objectives of the HTC-01 Challenge and to describe possible solution routes and pitfalls that such technology may encounter in space. Please also note that participants wishing to demonstrate any hardware developed under this program during any future HELIOS Technology Challenge showcase event(s) may be subject to event regulations to be published separately at a later date.

  5. Training auscultatory skills: computer simulated heart sounds or additional bedside training? A randomized trial on third-year medical students

    PubMed Central

    2010-01-01

    Background The present study compares the value of additional use of computer-simulated heart sounds with conventional bedside auscultation training for the cardiac auscultation skills of 3rd-year medical students at Oslo University Medical School. Methods In addition to their usual curriculum courses, groups of seven students each were randomized to receive four hours of additional auscultation training, either employing a computer simulator system or adding on more conventional bedside training. Cardiac auscultation skills were afterwards tested using live patients. Each student gave a written description of the auscultation findings in four selected patients and was awarded from 0 to 10 points for each patient. Differences between the two study groups were evaluated using Student's t-test. Results At the auscultation test no significant difference in mean score was found between the students who had used additional computer-based sound simulation and those who had received additional bedside training. Conclusions Students at an early stage of their cardiology training demonstrated equal performance of cardiac auscultation whether they had received an additional short auscultation course based on computer-simulated training or had had additional bedside training. PMID:20082701

  6. A Novel Cost Based Model for Energy Consumption in Cloud Computing

    PubMed Central

    Horri, A.; Dastghaibyfard, Gh.

    2015-01-01

    Cloud data centers consume enormous amounts of electrical energy. To support green cloud computing, providers also need to minimize cloud infrastructure energy consumption while maintaining QoS. In this study, an energy consumption model is proposed for the time-shared policy in the virtualization layer of cloud environments. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator based upon results obtained from a real system, and the proposed model was then evaluated under different scenarios. The proposed model accounts for cache interference costs, which depend on the size of the data. The model was implemented in the CloudSim simulator, and the related simulation results indicate that the energy consumption may be considerable and that it can vary with parameters such as the quantum parameter, the data size, and the number of VMs on a host. Measured results validate the model and demonstrate that there is a tradeoff between energy consumption and QoS in the cloud environment. PMID:25705716
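    The abstract above does not state the power model's functional form. A common starting point in CloudSim-based energy studies is a linear host power model, an idle floor plus a utilization-proportional share; the wattages below are assumed illustrative values, and the paper's model additionally charges cache-interference costs that this sketch omits:

```python
def host_power(util, p_idle=70.0, p_max=250.0):
    """Linear host power model (watts): P(u) = P_idle + (P_max - P_idle) * u.

    p_idle and p_max are assumed values for illustration; real studies take
    them from measured host power profiles.
    """
    return p_idle + (p_max - p_idle) * util

def energy_joules(samples, dt=1.0):
    """Integrate power over utilization samples taken every dt seconds."""
    return sum(host_power(u) * dt for u in samples)

# One minute at 50% utilization: (70 + 180 * 0.5) W * 60 s = 9600 J.
print(energy_joules([0.5] * 60))  # 9600.0
```

    The tradeoff the paper reports follows naturally from such a model: consolidating VMs raises per-host utilization (and cache interference), lowering total energy but degrading QoS.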

  7. A novel cost based model for energy consumption in cloud computing.

    PubMed

    Horri, A; Dastghaibyfard, Gh

    2015-01-01

    Cloud data centers consume enormous amounts of electrical energy. To support green cloud computing, providers also need to minimize cloud infrastructure energy consumption while maintaining QoS. In this study, an energy consumption model is proposed for the time-shared policy in the virtualization layer of cloud environments. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator based upon results obtained from a real system, and the proposed model was then evaluated under different scenarios. The proposed model accounts for cache interference costs, which depend on the size of the data. The model was implemented in the CloudSim simulator, and the related simulation results indicate that the energy consumption may be considerable and that it can vary with parameters such as the quantum parameter, the data size, and the number of VMs on a host. Measured results validate the model and demonstrate that there is a tradeoff between energy consumption and QoS in the cloud environment.

  8. First- and Second-Line Bevacizumab in Addition to Chemotherapy for Metastatic Colorectal Cancer: A United States–Based Cost-Effectiveness Analysis

    PubMed Central

    Goldstein, Daniel A.; Chen, Qiushi; Ayer, Turgay; Howard, David H.; Lipscomb, Joseph; El-Rayes, Bassel F.; Flowers, Christopher R.

    2015-01-01

    Purpose The addition of bevacizumab to fluorouracil-based chemotherapy is a standard of care for previously untreated metastatic colorectal cancer. Continuation of bevacizumab beyond progression is an accepted standard of care based on a 1.4-month increase in median overall survival observed in a randomized trial. No United States–based cost-effectiveness modeling analyses are currently available addressing the use of bevacizumab in metastatic colorectal cancer. Our objective was to determine the cost effectiveness of bevacizumab in the first-line setting and when continued beyond progression from the perspective of US payers. Methods We developed two Markov models to compare the cost and effectiveness of fluorouracil, leucovorin, and oxaliplatin with or without bevacizumab in the first-line treatment and subsequent fluorouracil, leucovorin, and irinotecan with or without bevacizumab in the second-line treatment of metastatic colorectal cancer. Model robustness was addressed by univariable and probabilistic sensitivity analyses. Health outcomes were measured in life-years and quality-adjusted life-years (QALYs). Results Using bevacizumab in first-line therapy provided an additional 0.10 QALYs (0.14 life-years) at a cost of $59,361. The incremental cost-effectiveness ratio was $571,240 per QALY. Continuing bevacizumab beyond progression provided an additional 0.11 QALYs (0.16 life-years) at a cost of $39,209. The incremental cost-effectiveness ratio was $364,083 per QALY. In univariable sensitivity analyses, the variables with the greatest influence on the incremental cost-effectiveness ratio were bevacizumab cost, overall survival, and utility. Conclusion Bevacizumab provides minimal incremental benefit at high incremental cost per QALY in both the first- and second-line settings of metastatic colorectal cancer treatment. PMID:25691669
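    The incremental cost-effectiveness ratios reported above follow directly from the incremental cost and QALY figures. A minimal sketch of the arithmetic; the raw quotients differ slightly from the published ratios because the QALY gains are rounded to two decimals in the abstract:

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra dollars per extra QALY."""
    return delta_cost / delta_qaly

# First-line bevacizumab: $59,361 for 0.10 QALYs (abstract figures).
# The published $571,240/QALY implies an unrounded QALY gain near 0.104.
print(round(icer(59361, 0.10)))  # 593610

# Beyond-progression bevacizumab: $39,209 for 0.11 QALYs.
print(round(icer(39209, 0.11)))  # 356445
```

    An ICER is only meaningful relative to a willingness-to-pay threshold; at the commonly cited $100,000-$150,000/QALY range, both ratios here sit far above it, which is the paper's conclusion.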

  9. 12 CFR Appendix K to Part 1026 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions K Appendix K to Part 1026 Banks and Banking BUREAU OF CONSUMER FINANCIAL PROTECTION TRUTH IN LENDING (REGULATION Z) Pt. 1026, App. K Appendix K to Part 1026—Total Annual Loan Cost...

  10. 12 CFR Appendix K to Part 1026 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions K Appendix K to Part 1026 Banks and Banking BUREAU OF CONSUMER FINANCIAL PROTECTION TRUTH IN LENDING (REGULATION Z) Pt. 1026, App. K Appendix K to Part 1026—Total Annual Loan Cost...

  11. Design and implementation of a reliable and cost-effective cloud computing infrastructure: the INFN Napoli experience

    NASA Astrophysics Data System (ADS)

    Capone, V.; Esposito, R.; Pardi, S.; Taurino, F.; Tortone, G.

    2012-12-01

    Over the last few years we have seen an increasing number of services and applications needed to manage and maintain cloud computing facilities. This is particularly true for computing in high energy physics, which often requires complex configurations and distributed infrastructures. In this scenario a cost-effective rationalization and consolidation strategy is the key to success in terms of scalability and reliability. In this work we describe an IaaS (Infrastructure as a Service) cloud computing system, with high availability and redundancy features, which is currently in production at the INFN-Naples and ATLAS Tier-2 data centre. The main goal we intended to achieve was a simplified method to manage our computing resources and deliver reliable user services, reusing existing hardware without incurring heavy costs. A combined usage of virtualization and clustering technologies allowed us to consolidate our services on a small number of physical machines, reducing electric power costs. As a result of our efforts we developed a complete solution for data and computing centres that can be easily replicated using commodity hardware. Our architecture consists of two main subsystems: a clustered storage solution, built on top of disk servers running the GlusterFS file system, and a virtual machine execution environment. GlusterFS is a network file system able to perform parallel writes on multiple disk servers, thereby providing live replication of data. High availability is also achieved via a network configuration using redundant switches and multiple paths between hypervisor hosts and disk servers. We also developed a set of management scripts to easily perform basic system administration tasks such as automatic deployment of new virtual machines, adaptive scheduling of virtual machines on hypervisor hosts, live migration and automated restart in case of hypervisor failures.

  12. COEFUV: A Computer Implementation of a Generalized Unmanned Vehicle Cost Model.

    DTIC Science & Technology

    1978-10-01

    DAS-TR-78-4. COEFUV: A Computer Implementation of a Generalized Unmanned Vehicle Cost Model. ...and the time to generate them are important. Many DAS participants supported this effort. The authors wish to acknowledge Richard H. Anderson for... ...conflict and the on-going COMBAT ANGEL program at Davis-Monthan Air Force Base, there is not a generally accepted costing methodology for unmanned vehicles

  13. Versatile, low-cost, computer-controlled, sample positioning system for vacuum applications

    NASA Technical Reports Server (NTRS)

    Vargas-Aburto, Carlos; Liff, Dale R.

    1991-01-01

    A versatile, low-cost, easy-to-implement, microprocessor-based motorized positioning system (MPS) suitable for accurate sample manipulation in a Secondary Ion Mass Spectrometry (SIMS) system, and for other ultra-high vacuum (UHV) applications, was designed and built at NASA LeRC. The system can be operated manually or under computer control. In the latter case, local as well as remote operation is possible via the IEEE-488 bus. The position of the sample can be controlled in three orthogonal linear coordinates and one angular coordinate.

  14. On the role of cost-sensitive learning in multi-class brain-computer interfaces.

    PubMed

    Devlaminck, Dieter; Waegeman, Willem; Wyns, Bart; Otte, Georges; Santens, Patrick

    2010-06-01

    Brain-computer interfaces (BCIs) present an alternative way of communication for people with severe disabilities. One of the shortcomings in current BCI systems, recently put forward in the fourth BCI competition, is the asynchronous detection of motor imagery versus resting state. We investigated this extension to the three-class case, in which the resting state is considered virtually lying between two motor classes, resulting in a large penalty when one motor task is misclassified into the other motor class. We particularly focus on the behavior of different machine-learning techniques and on the role of multi-class cost-sensitive learning in such a context. To this end, four different kernel methods are empirically compared, namely pairwise multi-class support vector machines (SVMs), two cost-sensitive multi-class SVMs and kernel-based ordinal regression. The experimental results illustrate that ordinal regression performs better than the other three approaches when a cost-sensitive performance measure such as the mean-squared error is considered. By contrast, multi-class cost-sensitive learning enables us to control the number of large errors made between two motor tasks.
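    The cost-sensitive setting described above, where confusing one motor class with the other is penalised far more than confusing either with rest, can be sketched as a minimum-expected-cost (Bayes) decision over class posteriors. The cost matrix values below are illustrative assumptions, not taken from the paper:

```python
# Classes: 0 = left-hand motor imagery, 1 = rest, 2 = right-hand motor imagery.
# COST[true][predicted]; motor-to-motor confusions are penalised most heavily,
# matching the idea that rest lies "between" the two motor classes.
# The concrete values are assumptions for illustration.
COST = [
    [0, 1, 4],   # true left:  predicting right costs 4
    [1, 0, 1],   # true rest:  either motor prediction costs 1
    [4, 1, 0],   # true right: predicting left costs 4
]

def bayes_decision(posteriors, cost=COST):
    """Pick the class that minimises the expected misclassification cost."""
    n = len(cost)
    expected = [sum(posteriors[t] * cost[t][p] for t in range(n))
                for p in range(n)]
    return min(range(n), key=expected.__getitem__)

# A plain argmax over these posteriors would output class 0 (left);
# the cost-sensitive rule prefers the safer "rest" prediction.
print(bayes_decision([0.40, 0.25, 0.35]))  # 1
```

    This is the decision-theoretic core that cost-sensitive SVM variants and ordinal regression approximate in different ways; ordinal regression additionally exploits the left < rest < right ordering when minimising squared error.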

  15. Additive Manufacturing and Casting Technology Comparison: Mechanical Properties, Productivity and Cost Benchmark

    NASA Astrophysics Data System (ADS)

    Vevers, A.; Kromanis, A.; Gerins, E.; Ozolins, J.

    2018-04-01

    Casting is one of the oldest production technologies in the world, but in recent years metal additive manufacturing, also known as metal 3D printing, has been evolving rapidly. Both technologies can produce parts with internal holes, and at first glance surface roughness is similar for both, which means that for precise dimensions parts have to be machined wherever a precise fit is necessary. Benchmark tests have been made to find out whether parts produced by metal additive manufacturing can replace parts produced by casting. Most of the comparative tests were made with the GJS-400-15 grade, one of the most popular cast iron grades. To compare mechanical properties, samples were produced using additive manufacturing and tested for tensile strength, hardness, surface roughness and microstructure, and the results were compared with samples produced by casting. In addition, both technologies were compared in terms of production time and production costs to see if additive manufacturing is competitive with casting. The original paper was written in the Latvian language as part of the Master Thesis within the framework of the production technology study programme at Riga Technical University.

  16. Outline of cost-benefit analysis and a case study

    NASA Technical Reports Server (NTRS)

    Kellizy, A.

    1978-01-01

    The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.

  17. Space shuttle solid rocket booster cost-per-flight analysis technique

    NASA Technical Reports Server (NTRS)

    Forney, J. A.

    1979-01-01

    A cost-per-flight computer model is described which considers: traffic model, component attrition, hardware useful life, turnaround time for refurbishment, manufacturing rates, learning curves on the time to perform tasks, cost improvement curves on quantity hardware buys, inflation, spares philosophy, long lead, hardware funding requirements, and other logistics and scheduling constraints. Additional uses of the model include assessing the cost-per-flight impact of changing major space shuttle program parameters and searching for opportunities to make cost-effective management decisions.
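    The "cost improvement curves on quantity hardware buys" mentioned above are conventionally modelled with a Wright learning curve, where every doubling of cumulative quantity multiplies unit cost by a fixed slope. The formula and the 85% slope below are a standard assumed form, not details taken from the SRB model itself:

```python
from math import log

def unit_cost(n, first_unit_cost, slope=0.85):
    """Wright learning curve: cost of the n-th unit produced.

    Every doubling of cumulative quantity multiplies unit cost by `slope`
    (e.g. 0.85 for an 85% curve). The slope here is an assumed value.
    """
    b = log(slope) / log(2)          # learning exponent (negative)
    return first_unit_cost * n ** b

print(round(unit_cost(1, 100.0), 2))  # 100.0
print(round(unit_cost(2, 100.0), 2))  # 85.0  (one doubling)
print(round(unit_cost(4, 100.0), 2))  # 72.25 (two doublings)
```

    Summing `unit_cost` over a planned buy gives the quantity-dependent hardware term that a cost-per-flight model would amortize across the traffic model's flight count.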

  18. Specialized computer architectures for computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  19. 12 CFR Appendix L to Part 226 - Assumed Loan Periods for Computations of Total Annual Loan Cost Rates

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Annual Loan Cost Rates L Appendix L to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. L Appendix L to Part 226—Assumed Loan Periods for Computations of Total Annual Loan Cost Rates (a) Required...

  20. Pupillary dynamics reveal computational cost in sentence planning.

    PubMed

    Sevilla, Yamila; Maldonado, Mora; Shalóm, Diego E

    2014-01-01

    This study investigated the computational cost associated with grammatical planning in sentence production. We measured people's pupillary responses as they produced spoken descriptions of depicted events. We manipulated the syntactic structure of the target by training subjects to use different types of sentences following a colour cue. The results showed a greater increase in pupil size for the production of passive and object-dislocated sentences than for active canonical subject-verb-object sentences, indicating that more cognitive effort is associated with more complex noncanonical thematic order. We also manipulated the time at which the cue that triggered structure-building processes was presented. The differential increase in pupil diameter for more complex sentences was shown to rise earlier as the colour cue was presented earlier, suggesting that the observed pupillary changes are due to differential demands in relatively independent structure-building processes during grammatical planning. Task-evoked pupillary responses provide a reliable measure to study the cognitive processes involved in sentence production.

  1. Multi-tasking computer control of video related equipment

    NASA Technical Reports Server (NTRS)

    Molina, Rod; Gilbert, Bob

    1989-01-01

    The flexibility, cost-effectiveness and widespread availability of personal computers now makes it possible to completely integrate the previously separate elements of video post-production into a single device. Specifically, a personal computer, such as the Commodore-Amiga, can perform multiple and simultaneous tasks from an individual unit. Relatively low cost, minimal space requirements and user-friendliness, provides the most favorable environment for the many phases of video post-production. Computers are well known for their basic abilities to process numbers, text and graphics and to reliably perform repetitive and tedious functions efficiently. These capabilities can now apply as either additions or alternatives to existing video post-production methods. A present example of computer-based video post-production technology is the RGB CVC (Computer and Video Creations) WorkSystem. A wide variety of integrated functions are made possible with an Amiga computer existing at the heart of the system.

  2. Resource utilization and costs during the initial years of lung cancer screening with computed tomography in Canada.

    PubMed

    Cressman, Sonya; Lam, Stephen; Tammemagi, Martin C; Evans, William K; Leighl, Natasha B; Regier, Dean A; Bolbocean, Corneliu; Shepherd, Frances A; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R; Mayo, John R; McWilliams, Annette; Couture, Christian; English, John C; Goffin, John; Hwang, David M; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J; Goss, Glenwood D; Nicholas, Garth; Seely, Jean M; Sekhon, Harmanjatinder S; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D; Tan, Wan C; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J

    2014-10-01

    It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had 2% or greater lung cancer risk over 3 years using a risk prediction tool were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer's perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400-$505) for the initial 18 months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553-$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone ($47,792; 95% CI, $43,254-$52,200; p = 0.061). In the Pan-Canadian study, the average cost to screen individuals with a high risk for developing lung cancer using LDCT and the average initial cost of curative intent treatment were lower than the average per-person cost of treating advanced-stage lung cancer, which infrequently results in a cure.

  3. Resource Utilization and Costs during the Initial Years of Lung Cancer Screening with Computed Tomography in Canada

    PubMed Central

    Lam, Stephen; Tammemagi, Martin C.; Evans, William K.; Leighl, Natasha B.; Regier, Dean A.; Bolbocean, Corneliu; Shepherd, Frances A.; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R.; Mayo, John R.; McWilliams, Annette; Couture, Christian; English, John C.; Goffin, John; Hwang, David M.; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J.; Goss, Glenwood D.; Nicholas, Garth; Seely, Jean M.; Sekhon, Harmanjatinder S.; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N.; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D.; Tan, Wan C.; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J.

    2014-01-01

    Background: It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Methods: Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had 2% or greater lung cancer risk over 3 years using a risk prediction tool were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer’s perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. Results: The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400–$505) for the initial 18 months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553–$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone ($47,792; 95% CI, $43,254–$52,200; p = 0.061). Conclusion: In the Pan-Canadian study, the average cost to screen individuals at high risk for developing lung cancer using LDCT and the average initial cost of curative-intent treatment were lower than the average per-person cost of treating advanced-stage lung cancer, which infrequently results in a cure. PMID:25105438

  4. Analysis of the additional costs of clinical complications in patients undergoing transcatheter aortic valve replacement in the German Health Care System.

    PubMed

    Gutmann, Anja; Kaier, Klaus; Sorg, Stefan; von Zur Mühlen, Constantin; Siepe, Matthias; Moser, Martin; Geibel, Annette; Zirlik, Andreas; Ahrens, Ingo; Baumbach, Hardy; Beyersdorf, Friedhelm; Vach, Werner; Zehender, Manfred; Bode, Christoph; Reinöhl, Jochen

    2015-01-20

    This study aimed to analyze the complication-induced additional costs of patients undergoing transcatheter aortic valve replacement (TAVR). In a prospective observational study, a total of 163 consecutive patients received either transfemoral (TF-, n=97) or transapical (TA-) TAVR (n=66) between February 2009 and December 2012. Clinical endpoints were categorized according to VARC-2 definitions and in-hospital costs were determined from the hospital perspective. Finally, the additional costs of complications were estimated using multiple linear regression models. TF-TAVR patients experienced significantly more minor access site bleeding, major non-access site bleeding, minor vascular complications, stage 2 acute kidney injury (AKI) and permanent pacemaker implantation. Total in-hospital costs did not differ between groups and were on average €40,348 (SD 15,851) per patient. The average incremental cost component of a single complication was €3438 (p<0.01) and the estimated cost of a TF-TAVR without complications was €34,351. The complications associated with the highest additional costs were life-threatening non-access site bleeding (€47,494; p<0.05), stage 3 AKI (€20,468; p<0.01), implantation of a second valve (€16,767; p<0.01) and other severe cardiac dysrhythmia (€10,611; p<0.05). Overall, the presence of complication-related in-hospital mortality increased costs. Bleeding complications, severe kidney failure, and implantation of a second valve were the most important cost drivers in our TAVR patients. Strategies and advances in device design aimed at reducing these complications have the potential to generate significant in-hospital cost reductions for the German Health Care System. Copyright © 2014. Published by Elsevier Ireland Ltd.

  5. Evaluation of low-cost computer monitors for the detection of cervical spine injuries in the emergency room: an observer confidence-based study.

    PubMed

    Brem, M H; Böhner, C; Brenning, A; Gelse, K; Radkow, T; Blanke, M; Schlechtweg, P M; Neumann, G; Wu, I Y; Bautz, W; Hennig, F F; Richter, H

    2006-11-01

    To compare the diagnostic value of low-cost computer monitors and a Picture Archiving and Communication System (PACS) workstation for the evaluation of cervical spine fractures in the emergency room. Two groups of readers blinded to the diagnoses (2 radiologists and 3 orthopaedic surgeons) independently assessed digital radiographs of the cervical spine (anterior-posterior, oblique and trans-oral-dens views). The radiographs of 57 patients who presented consecutively to the emergency room in 2004 with clinical suspicion of a cervical spine injury were evaluated. The diagnostic values of these radiographs were scored on a 3-point scale (1 = diagnosis not possible/bad image quality, 2 = diagnosis uncertain, 3 = clear diagnosis of fracture or no fracture) on a PACS workstation and on two different liquid crystal display (LCD) personal computer monitors. The images were randomised to avoid memory effects. We used logistic mixed-effects models to determine the possible effects of monitor type on the evaluation of x-ray images. To determine the overall effects of monitor type, this variable was used as a fixed effect, and the image number and reader group (radiologist or orthopaedic surgeon) were used as random effects on display quality. Group-specific effects were examined, with the reader group and additional fixed effects as terms. A significance level of 0.05 was established for assessing the contribution of each fixed effect to the model. Overall, the diagnostic score did not differ significantly between standard personal computer monitors and the PACS workstation (both p values were 0.78). Low-cost LCD personal computer monitors may be useful in establishing a diagnosis of cervical spine fractures in the emergency room.

  6. PET-CT in oncological patients: analysis of informal care costs in cost-benefit assessment.

    PubMed

    Orlacchio, Antonio; Ciarrapico, Anna Micaela; Schillaci, Orazio; Chegai, Fabrizio; Tosti, Daniela; D'Alba, Fabrizio; Guazzaroni, Manlio; Simonetti, Giovanni

    2014-04-01

    The authors analysed the impact of nonmedical costs (travel, loss of productivity) in an economic analysis of PET-CT (positron-emission tomography-computed tomography) performed with standard contrast-enhanced CT protocols (CECT). From October to November 2009, a total of 100 patients referred to our institute were administered a questionnaire to evaluate the nonmedical costs of PET-CT. In addition, the medical costs (equipment maintenance and depreciation, consumables and staff) related to PET-CT performed with CECT and PET-CT with low-dose nonenhanced CT and separate CECT were also estimated. The medical costs were 919.3 euro for PET-CT with separate CECT, and 801.3 euro for PET-CT with CECT. Therefore, savings of approximately 13% are possible. Moreover, savings in nonmedical costs can be achieved by reducing the number of hospital visits required by patients undergoing diagnostic imaging. Nonmedical costs heavily affect patients' finances as well as having an indirect impact on national health expenditure. Our results show that PET-CT performed with standard dose CECT in a single session provides benefits in terms of both medical and nonmedical costs.
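    The "approximately 13%" saving quoted above follows directly from the two per-examination medical-cost figures in the abstract; a quick arithmetic check:

    ```python
    # Per-examination medical costs reported in the abstract (euro).
    cost_separate = 919.3   # PET-CT with low-dose CT plus a separate CECT
    cost_combined = 801.3   # PET-CT performed with CECT in a single session

    saving = cost_separate - cost_combined      # 118.0 euro per examination
    saving_pct = 100 * saving / cost_separate   # ~12.8%, i.e. "approximately 13%"
    print(f"{saving:.1f} euro saved per exam ({saving_pct:.1f}%)")
    ```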

  7. Design and fabrication of a sleep apnea device using computer-aided design/additive manufacture technologies.

    PubMed

    Al Mortadi, Noor; Eggbeer, Dominic; Lewis, Jeffrey; Williams, Robert J

    2013-04-01

    The aim of this study was to analyze the latest innovations in additive manufacture techniques and uniquely apply them to dentistry, to build a sleep apnea device requiring rotating hinges. Laser scanning was used to capture the three-dimensional topography of an upper and lower dental cast. The data sets were imported into an appropriate computer-aided design software environment, which was used to design a sleep apnea device. This design was then exported as a stereolithography file and transferred for three-dimensional printing by an additive manufacture machine. The results not only revealed that the novel computer-based technique presented provides new design opportunities but also highlighted limitations that must be addressed before the techniques can become clinically viable.

  8. Comparison of progressive addition lenses for general purpose and for computer vision: an office field study.

    PubMed

    Jaschinski, Wolfgang; König, Mirjam; Mekontso, Tiofil M; Ohlendorf, Arne; Welscher, Monique

    2015-05-01

    Two types of progressive addition lenses (PALs) were compared in an office field study: 1. General purpose PALs with continuous clear vision between infinity and near reading distances and 2. Computer vision PALs with a wider zone of clear vision at the monitor and in near vision but no clear distance vision. Twenty-three presbyopic participants wore each type of lens for two weeks in a double-masked four-week quasi-experimental procedure that included an adaptation phase (Weeks 1 and 2) and a test phase (Weeks 3 and 4). Questionnaires on visual and musculoskeletal conditions as well as preferences regarding the type of lenses were administered. After eight more weeks of free use of the spectacles, the preferences were assessed again. The ergonomic conditions were analysed from photographs. Head inclination when looking at the monitor was significantly lower by 2.3 degrees with the computer vision PALs than with the general purpose PALs. Vision at the monitor was judged significantly better with computer vision PALs, while distance vision was judged better with general purpose PALs; however, the reported advantage of computer vision PALs differed in extent between participants. Accordingly, 61 per cent of the participants preferred the computer vision PALs, when asked without information about lens design. After full information about lens characteristics and an additional eight weeks of free spectacle use, 44 per cent preferred the computer vision PALs. On average, computer vision PALs were rated significantly better with respect to vision at the monitor during the experimental part of the study. In the final forced-choice ratings, approximately half of the participants preferred either the computer vision PAL or the general purpose PAL. Individual factors seem to play a role in this preference and in the rated advantage of computer vision PALs. © 2015 The Authors. Clinical and Experimental Optometry © 2015 Optometry Australia.

  9. Cost aware cache replacement policy in shared last-level cache for hybrid memory based fog computing

    NASA Astrophysics Data System (ADS)

    Jia, Gangyong; Han, Guangjie; Wang, Hao; Wang, Feng

    2018-04-01

    Fog computing requires a large main memory capacity to decrease latency and increase the Quality of Service (QoS). However, dynamic random access memory (DRAM), the commonly used random access memory, cannot be included in a fog computing system due to its high power consumption. In recent years, non-volatile memories (NVM) such as Phase-Change Memory (PCM) and Spin-transfer torque RAM (STT-RAM), with their low power consumption, have emerged to replace DRAM. Moreover, the currently proposed hybrid main memory, consisting of both DRAM and NVM, has shown promising advantages in terms of scalability and power consumption. However, the drawbacks of NVM, such as long read/write latency, give rise to potential problems, leading to asymmetric cache misses in the hybrid main memory. Current last-level cache (LLC) policies assume a uniform miss cost, resulting in poor LLC performance and adding to the cost of using NVM. In order to minimize the cache miss cost in the hybrid main memory, we propose a cost aware cache replacement policy (CACRP) that reduces the number of cache misses from NVM and improves the cache performance for a hybrid memory system. Experimental results show that our CACRP improves LLC performance by up to 43.6% (15.5% on average) compared to LRU.
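    The abstract does not give CACRP's exact algorithm, but the general idea of cost-aware replacement can be sketched as follows: among the least-recently-used eviction candidates, prefer evicting blocks whose future misses are cheap to service, so that blocks backed by slow NVM tend to stay cached. The class, the two-candidate window, and the relative miss costs below are all illustrative assumptions, not the paper's implementation.

    ```python
    # Hypothetical relative miss-service latencies (NVM misses cost more).
    MISS_COST = {"DRAM": 1.0, "NVM": 3.0}

    class CostAwareCache:
        """Toy LLC model: LRU ordering, but eviction is biased by miss cost."""

        def __init__(self, capacity):
            self.capacity = capacity
            self.backing = {}   # block address -> "DRAM" or "NVM"
            self.lru = []       # least-recently-used first

        def access(self, addr, memory):
            if addr in self.backing:
                self.lru.remove(addr)                  # hit: refresh recency
            elif len(self.backing) >= self.capacity:   # miss with a full cache
                candidates = self.lru[:2]              # the two oldest blocks
                # Evict the candidate whose re-fetch would be cheapest.
                victim = min(candidates,
                             key=lambda a: MISS_COST[self.backing[a]])
                self.lru.remove(victim)
                del self.backing[victim]
            self.backing[addr] = memory
            self.lru.append(addr)
    ```

    With capacity 2, accessing A (NVM), B (DRAM), then C evicts the DRAM-backed B even though A is older, because re-fetching B is cheaper; plain LRU would have evicted the NVM-backed A.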

  10. Cost and resource utilization associated with use of computed tomography to evaluate chest pain in the emergency department: the Rule Out Myocardial Infarction using Computer Assisted Tomography (ROMICAT) study.

    PubMed

    Hulten, Edward; Goehler, Alexander; Bittencourt, Marcio Sommer; Bamberg, Fabian; Schlett, Christopher L; Truong, Quynh A; Nichols, John; Nasir, Khurram; Rogers, Ian S; Gazelle, Scott G; Nagurney, John T; Hoffmann, Udo; Blankstein, Ron

    2013-09-01

    Coronary computed tomographic angiography (cCTA) allows rapid, noninvasive exclusion of obstructive coronary artery disease (CAD). However, concern exists whether implementation of cCTA in the assessment of patients presenting to the emergency department with acute chest pain will lead to increased downstream testing and costs compared with alternative strategies. Our aim was to compare observed actual costs of usual care (UC) with projected costs of a strategy including early cCTA in the evaluation of patients with acute chest pain in the Rule Out Myocardial Infarction Using Computer Assisted Tomography I (ROMICAT I) study. We compared cost and hospital length of stay of UC observed among 368 patients enrolled in the ROMICAT I study with projected costs of management based on cCTA. Costs of UC were determined by an electronic cost accounting system. Notably, UC was not influenced by cCTA results because patients and caregivers were blinded to the cCTA results. Costs after early implementation of cCTA were estimated assuming changes in management based on cCTA findings of the presence and severity of CAD. Sensitivity analysis was used to test the influence of key variables on both outcomes and costs. We determined that in comparison with UC, cCTA-guided triage, whereby patients with no CAD are discharged, could reduce total hospital costs by 23% (P<0.001). However, when the prevalence of obstructive CAD increases, index hospitalization cost increases such that when the prevalence of ≥ 50% stenosis is >28% to 33%, the use of cCTA becomes more costly than UC. cCTA may be a cost-saving tool in acute chest pain populations that have a prevalence of potentially obstructive CAD <30%. However, increased cost would be anticipated in populations with higher prevalence of disease.

  11. A Comprehensive and Cost-Effective Computer Infrastructure for K-12 Schools

    NASA Technical Reports Server (NTRS)

    Warren, G. P.; Seaton, J. M.

    1996-01-01

    Since 1993, NASA Langley Research Center has been developing and implementing a low-cost Internet connection model, including system architecture, training, and support, to provide Internet access for an entire network of computers. This infrastructure allows local area networks which exceed 50 machines per school to independently access the complete functionality of the Internet by connecting to a central site, using state-of-the-art commercial modem technology, through a single standard telephone line. By locating high-cost resources at this central site and sharing these resources and their costs among the school districts throughout a region, a practical, efficient, and affordable infrastructure for providing scalable Internet connectivity has been developed. As the demand for faster Internet access grows, the model has a simple expansion path that eliminates the need to replace major system components and re-train personnel. Observations of Internet usage within an environment, particularly school classrooms, have shown that after an initial period of 'surfing,' the Internet traffic becomes repetitive. By automatically storing requested Internet information on a high-capacity networked disk drive at the local site (network based disk caching), then updating this information only when it changes, well over 80 percent of the Internet traffic that leaves a location can be eliminated by retrieving the information from the local disk cache.

  12. 24 CFR 908.108 - Cost.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... computer hardware or software, or both, the cost of contracting for those services, or the cost of... operating budget. At the HA's option, the cost of the computer software may include service contracts to...

  13. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    PubMed

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  14. Can Broader Diffusion of Value-Based Insurance Design Increase Benefits from US Health Care without Increasing Costs? Evidence from a Computer Simulation Model

    PubMed Central

    Scott Braithwaite, R.; Omokaro, Cynthia; Justice, Amy C.; Nucifora, Kimberly; Roberts, Mark S.

    2010-01-01

    Background Evidence suggests that cost sharing (i.e., copayments and deductibles) decreases health expenditures but also reduces essential care. Value-based insurance design (VBID) has been proposed to encourage essential care while controlling health expenditures. Our objective was to estimate the impact of broader diffusion of VBID on US health care benefits and costs. Methods and Findings We used a published computer simulation of costs and life expectancy gains from US health care to estimate the impact of broader diffusion of VBID. Two scenarios were analyzed: (1) applying VBID solely to pharmacy benefits and (2) applying VBID to both pharmacy benefits and other health care services (e.g., devices). We assumed that cost sharing would be eliminated for high-value services (<$100,000 per life-year), would remain unchanged for intermediate- or unknown-value services ($100,000–$300,000 per life-year or unknown), and would be increased for low-value services (>$300,000 per life-year). All costs are provided in 2003 US dollars. Our simulation estimated that approximately 60% of health expenditures in the US are spent on low-value services, 20% are spent on intermediate-value services, and 20% are spent on high-value services. Correspondingly, the vast majority (80%) of health expenditures would have cost sharing that is impacted by VBID. With prevailing patterns of cost sharing, health care conferred 4.70 life-years at a per-capita annual expenditure of US$5,688. Broader diffusion of VBID to pharmaceuticals increased the benefit conferred by health care by 0.03 to 0.05 additional life-years, without increasing costs and without increasing out-of-pocket payments. Broader diffusion of VBID to other health care services could increase the benefit conferred by health care by 0.24 to 0.44 additional life-years, also without increasing costs and without increasing overall out-of-pocket payments. Among those without health insurance, using cost saving from VBID to

  15. Can broader diffusion of value-based insurance design increase benefits from US health care without increasing costs? Evidence from a computer simulation model.

    PubMed

    Braithwaite, R Scott; Omokaro, Cynthia; Justice, Amy C; Nucifora, Kimberly; Roberts, Mark S

    2010-02-16

    Evidence suggests that cost sharing (i.e., copayments and deductibles) decreases health expenditures but also reduces essential care. Value-based insurance design (VBID) has been proposed to encourage essential care while controlling health expenditures. Our objective was to estimate the impact of broader diffusion of VBID on US health care benefits and costs. We used a published computer simulation of costs and life expectancy gains from US health care to estimate the impact of broader diffusion of VBID. Two scenarios were analyzed: (1) applying VBID solely to pharmacy benefits and (2) applying VBID to both pharmacy benefits and other health care services (e.g., devices). We assumed that cost sharing would be eliminated for high-value services (<$100,000 per life-year), would remain unchanged for intermediate- or unknown-value services ($100,000-$300,000 per life-year or unknown), and would be increased for low-value services (>$300,000 per life-year). All costs are provided in 2003 US dollars. Our simulation estimated that approximately 60% of health expenditures in the US are spent on low-value services, 20% are spent on intermediate-value services, and 20% are spent on high-value services. Correspondingly, the vast majority (80%) of health expenditures would have cost sharing that is impacted by VBID. With prevailing patterns of cost sharing, health care conferred 4.70 life-years at a per-capita annual expenditure of US$5,688. Broader diffusion of VBID to pharmaceuticals increased the benefit conferred by health care by 0.03 to 0.05 additional life-years, without increasing costs and without increasing out-of-pocket payments. Broader diffusion of VBID to other health care services could increase the benefit conferred by health care by 0.24 to 0.44 additional life-years, also without increasing costs and without increasing overall out-of-pocket payments. Among those without health insurance, using cost saving from VBID to subsidize insurance coverage would
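    The cost-sharing rule assumed in both VBID records reduces to a three-way threshold on a service's cost per life-year gained; a direct transcription (thresholds from the abstract, in 2003 US dollars; the function name is illustrative):

    ```python
    def vbid_cost_sharing(cost_per_life_year):
        """Map a service's cost per life-year to the VBID cost-sharing change
        described in the abstract. None means the service's value is unknown."""
        if cost_per_life_year is None:
            return "unchanged"      # unknown-value services
        if cost_per_life_year < 100_000:
            return "eliminated"     # high-value services
        if cost_per_life_year <= 300_000:
            return "unchanged"      # intermediate-value services
        return "increased"          # low-value services

    print(vbid_cost_sharing(50_000))   # a high-value service: cost sharing eliminated
    ```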

  16. An Alternative Method for Computing Unit Costs and Productivity Ratios. AIR 1984 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Winstead, Wayland H.; And Others

    An alternative measure for evaluating the performance of academic departments was studied. A comparison was made with the traditional manner for computing unit costs and productivity ratios: prorating the salary and effort of each faculty member to each course level based on the personal mix of courses taught. The alternative method used averaging…

  17. Learning Together; part 2: training costs and health gain - a cost analysis.

    PubMed

    Cullen, Katherine; Riches, Wendy; Macaulay, Chloe; Spicer, John

    2017-01-01

    Learning Together is a complex educational intervention aimed at improving health outcomes for children and young people. There is an additional cost, as two doctors see patients together for a longer appointment than a standard general practice (GP) appointment. Our approach combines the impact of the training clinics on activity in South London in 2014-15 with health gain, using NICE guidance and standards to allow comparison of training options. Activity data were collected from Training Practices hosting Learning Together. A computer-based model was developed to analyse the costs of the Learning Together intervention compared to usual training in a partial economic evaluation. The results of the model were used to value the health gain required to make the intervention cost effective. Data were returned for 363 patients booked into 61 clinics across 16 Training Practices. Learning Together clinics resulted in an increase in costs of £37 per clinic. Threshold analysis illustrated that one child with a common illness such as constipation needs to be well for two weeks, in one Practice hosting four training clinics, for the clinics to be considered cost effective. Learning Together incurs minimal training cost. Our threshold analysis produced a rubric that can be used locally to test cost effectiveness at a Practice or Programme level.

  18. Regional geoid computation by least squares modified Hotine's formula with additive corrections

    NASA Astrophysics Data System (ADS)

    Märdla, Silja; Ellmann, Artu; Ågren, Jonas; Sjöberg, Lars E.

    2018-03-01

    Geoid and quasigeoid modelling from gravity anomalies by the method of least squares modification of Stokes's formula with additive corrections is adapted for use with gravity disturbances and Hotine's formula. The biased, unbiased and optimum versions of least squares modification are considered. Equations are presented for the four additive corrections that account for the combined (direct plus indirect) effect of downward continuation (DWC), topographic, atmospheric and ellipsoidal corrections in geoid or quasigeoid modelling. The geoid or quasigeoid modelling scheme by the least squares modified Hotine formula is numerically verified, analysed and compared to the Stokes counterpart in a heterogeneous study area. The resulting geoid models and the additive corrections computed both for use with Stokes's or Hotine's formula differ most in high topography areas. Over the study area (reaching almost 2 km in altitude), the approximate geoid models (before the additive corrections) differ by 7 mm on average with a 3 mm standard deviation (SD) and a maximum of 1.3 cm. The additive corrections, out of which only the DWC correction has a numerically significant difference, improve the agreement between respective geoid or quasigeoid models to an average difference of 5 mm with a 1 mm SD and a maximum of 8 mm.

  19. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  20. Low-cost, high-performance and efficiency computational photometer design

    NASA Astrophysics Data System (ADS)

    Siewert, Sam B.; Shihadeh, Jeries; Myers, Randall; Khandhar, Jay; Ivanov, Vitaly

    2014-05-01

    Researchers at the University of Alaska Anchorage and University of Colorado Boulder have built a low-cost, high-performance and high-efficiency drop-in-place Computational Photometer (CP) to test in field applications ranging from port security and safety monitoring to environmental compliance monitoring and surveying. The CP integrates off-the-shelf visible spectrum cameras with near to long wavelength infrared detectors and high resolution digital snapshots in a single device. The proof of concept combines three or more detectors into a single multichannel imaging system that can time correlate read-out, capture, and image process all of the channels concurrently with high performance and energy efficiency. The dual-channel continuous read-out is combined with a third high definition digital snapshot capability and has been designed using an FPGA (Field Programmable Gate Array) to capture, decimate, down-convert, re-encode, and transform images from two standard definition CCD (Charge Coupled Device) cameras at 30 Hz. The continuous stereo vision can be time correlated to megapixel high definition snapshots. This proof of concept has been fabricated as a four-layer PCB (Printed Circuit Board) suitable for use in education and research for low-cost, high-efficiency field monitoring applications that need multispectral and three dimensional imaging capabilities. Initial testing is in progress and includes field testing in ports, potential test flights in unmanned aerial systems, and future planned missions to image harsh environments in the arctic including volcanic plumes, ice formation, and arctic marine life.

  1. The cost-effectiveness of the RSI QuickScan intervention programme for computer workers: Results of an economic evaluation alongside a randomised controlled trial.

    PubMed

    Speklé, Erwin M; Heinrich, Judith; Hoozemans, Marco J M; Blatter, Birgitte M; van der Beek, Allard J; van Dieën, Jaap H; van Tulder, Maurits W

    2010-11-11

    The costs of arm, shoulder and neck symptoms are high. In order to decrease these costs, employers implement interventions aimed at reducing these symptoms. One frequently used intervention is the RSI QuickScan intervention programme. It establishes a risk profile of the target population and subsequently advises interventions following a decision tree based on that risk profile. The purpose of this study was to perform an economic evaluation, from both the societal and companies' perspective, of the RSI QuickScan intervention programme for computer workers. In this study, effectiveness was defined at three levels: exposure to risk factors, prevalence of arm, shoulder and neck symptoms, and days of sick leave. The economic evaluation was conducted alongside a randomised controlled trial (RCT). Participating computer workers from 7 companies (N = 638) were assigned to either the intervention group (N = 320) or the usual care group (N = 318) by means of cluster randomisation (N = 50). The intervention consisted of a tailor-made programme, based on a previously established risk profile. At baseline, 6 and 12 month follow-up, the participants completed the RSI QuickScan questionnaire. Analyses to estimate the effect of the intervention were done according to the intention-to-treat principle. To compare costs between groups, confidence intervals for cost differences were computed by bias-corrected and accelerated bootstrapping. The mean intervention costs, paid by the employer, were 59 euro per participant in the intervention group and 28 euro in the usual care group. Mean total health care and non-health care costs per participant were 108 euro in both groups. As to cost-effectiveness, improvements in received information on healthy computer use as well as in work posture and movement were observed, at higher costs. With regard to the other risk factors, symptoms and sick leave, only small and non-significant effects were found. In this study, the RSI Quick

  2. Low-dose chest computed tomography for lung cancer screening among Hodgkin lymphoma survivors: a cost-effectiveness analysis.

    PubMed

    Wattson, Daniel A; Hunink, M G Myriam; DiPiro, Pamela J; Das, Prajnan; Hodgson, David C; Mauch, Peter M; Ng, Andrea K

    2014-10-01

    Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. 
Screening may be cost effective for all smokers but possibly not

  3. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wattson, Daniel A., E-mail: dwattson@partners.org; Hunink, M.G. Myriam; DiPiro, Pamela J.

    2014-10-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT.

  4. Effectiveness of Multimedia Elements in Computer Supported Instruction: Analysis of Personalization Effects, Students' Performances and Costs

    ERIC Educational Resources Information Center

    Zaidel, Mark; Luo, XiaoHui

    2010-01-01

    This study investigates the efficiency of multimedia instruction at the college level by comparing the effectiveness of multimedia elements used in the computer supported learning with the cost of their preparation. Among the various technologies that advance learning, instructors and students generally identify interactive multimedia elements as…

  5. Cost Savings Associated with the Adoption of a Cloud Computing Data Transfer System for Trauma Patients.

    PubMed

    Feeney, James M; Montgomery, Stephanie C; Wolf, Laura; Jayaraman, Vijay; Twohig, Michael

    2016-09-01

    Among transferred trauma patients, challenges with the transfer of radiographic studies include problems loading or viewing the studies at the receiving hospitals, and problems manipulating, reconstructing, or evaluating the transferred images. Cloud-based image transfer systems may address some of these problems. We reviewed the charts of patients transferred during one year surrounding the adoption of a cloud computing data transfer system. We compared the rates of repeat imaging before (precloud) and after (postcloud) the adoption of the cloud-based data transfer system. During the precloud period, 28 out of 100 patients required 90 repeat studies. With the cloud computing transfer system in place, three out of 134 patients required seven repeat films. There was a statistically significant decrease in the proportion of patients requiring repeat films (28% to 2.2%, P < .0001). Based on an annualized volume of 200 trauma patient transfers, the cost savings, estimated using three methods of cost analysis, is between $30,272 and $192,453.
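
    The significance claim above (28% vs 2.2% of patients requiring repeat films, P < .0001) can be checked from the reported counts alone with a standard pooled two-proportion z-test. A minimal sketch in Python (this is an independent check, not the test the authors necessarily used):

    ```python
    import math

    def two_proportion_z(x1, n1, x2, n2):
        """Pooled two-sample z statistic for comparing two proportions."""
        p1, p2 = x1 / n1, x2 / n2
        p = (x1 + x2) / (n1 + n2)                      # pooled proportion
        se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
        return (p1 - p2) / se

    # Counts reported in the abstract: 28/100 precloud vs 3/134 postcloud
    z = two_proportion_z(28, 100, 3, 134)
    print(round(z, 2))  # a |z| above 3.89 corresponds to two-sided P < .0001
    ```

    The statistic comes out well above 3.89, consistent with the reported P < .0001.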

  6. Estimate of the benefits of a population-based reduction in dietary sodium additives on hypertension and its related health care costs in Canada.

    PubMed

    Joffres, Michel R; Campbell, Norm R C; Manns, Braden; Tu, Karen

    2007-05-01

    Hypertension is the leading risk factor for mortality worldwide. One-quarter of the adult Canadian population has hypertension, and more than 90% of the population is estimated to develop hypertension if they live an average lifespan. Reductions in dietary sodium additives significantly lower systolic and diastolic blood pressure, and population reductions in dietary sodium are recommended by major scientific and public health organizations. To estimate the reduction in hypertension prevalence and specific hypertension management cost savings associated with a population-wide reduction in dietary sodium additives. Based on data from clinical trials, reducing dietary sodium additives by 1840 mg/day would result in a decrease of 5.06 mmHg (systolic) and 2.7 mmHg (diastolic) blood pressures. Using Canadian Heart Health Survey data, the resulting reduction in hypertension was estimated. Costs of laboratory testing and physician visits were based on 2001 to 2003 Ontario Health Insurance Plan data, and the number of physician visits and costs of medications for patients with hypertension were taken from 2003 IMS Canada. To estimate the reduction in total physician visits and laboratory costs, current estimates of aware hypertensive patients in Canada were used from the Canadian Community Health Survey. Reducing dietary sodium additives may decrease hypertension prevalence by 30%, resulting in one million fewer hypertensive patients in Canada, and almost double the treatment and control rate. Direct cost savings related to fewer physician visits, laboratory tests and lower medication use are estimated to be approximately $430 million per year. Physician visits and laboratory costs would decrease by 6.5%, and 23% fewer treated hypertensive patients would require medications for control of blood pressure. Based on these estimates, lowering dietary sodium additives would lead to a large reduction in hypertension prevalence and result in health care cost savings in Canada.

  7. Estimate of the benefits of a population-based reduction in dietary sodium additives on hypertension and its related health care costs in Canada

    PubMed Central

    Joffres, Michel R; Campbell, Norm RC; Manns, Braden; Tu, Karen

    2007-01-01

    BACKGROUND: Hypertension is the leading risk factor for mortality worldwide. One-quarter of the adult Canadian population has hypertension, and more than 90% of the population is estimated to develop hypertension if they live an average lifespan. Reductions in dietary sodium additives significantly lower systolic and diastolic blood pressure, and population reductions in dietary sodium are recommended by major scientific and public health organizations. OBJECTIVES: To estimate the reduction in hypertension prevalence and specific hypertension management cost savings associated with a population-wide reduction in dietary sodium additives. METHODS: Based on data from clinical trials, reducing dietary sodium additives by 1840 mg/day would result in a decrease of 5.06 mmHg (systolic) and 2.7 mmHg (diastolic) blood pressures. Using Canadian Heart Health Survey data, the resulting reduction in hypertension was estimated. Costs of laboratory testing and physician visits were based on 2001 to 2003 Ontario Health Insurance Plan data, and the number of physician visits and costs of medications for patients with hypertension were taken from 2003 IMS Canada. To estimate the reduction in total physician visits and laboratory costs, current estimates of aware hypertensive patients in Canada were used from the Canadian Community Health Survey. RESULTS: Reducing dietary sodium additives may decrease hypertension prevalence by 30%, resulting in one million fewer hypertensive patients in Canada, and almost double the treatment and control rate. Direct cost savings related to fewer physician visits, laboratory tests and lower medication use are estimated to be approximately $430 million per year. Physician visits and laboratory costs would decrease by 6.5%, and 23% fewer treated hypertensive patients would require medications for control of blood pressure. CONCLUSIONS: Based on these estimates, lowering dietary sodium additives would lead to a large reduction in hypertension

  8. Direct costs and cost-effectiveness of dual-source computed tomography and invasive coronary angiography in patients with an intermediate pretest likelihood for coronary artery disease.

    PubMed

    Dorenkamp, Marc; Bonaventura, Klaus; Sohns, Christian; Becker, Christoph R; Leber, Alexander W

    2012-03-01

    The study aims to determine the direct costs and comparative cost-effectiveness of latest-generation dual-source computed tomography (DSCT) and invasive coronary angiography for diagnosing coronary artery disease (CAD) in patients suspected of having this disease. The study was based on a previously elaborated cohort with an intermediate pretest likelihood for CAD and on complementary clinical data. Cost calculations were based on a detailed analysis of direct costs, and generally accepted accounting principles were applied. Based on Bayes' theorem, a mathematical model was used to compare the cost-effectiveness of both diagnostic approaches. Total costs included direct costs, induced costs and costs of complications. Effectiveness was defined as the ability of a diagnostic test to accurately identify a patient with CAD. Direct costs amounted to €98.60 for DSCT and to €317.75 for invasive coronary angiography. Analysis of model calculations indicated that cost-effectiveness grew hyperbolically with increasing prevalence of CAD. Given the prevalence of CAD in the study cohort (24%), DSCT was found to be more cost-effective than invasive coronary angiography (€970 vs €1354 for one patient correctly diagnosed as having CAD). At a disease prevalence of 49%, DSCT and invasive angiography were equally effective with costs of €633. Above a threshold value of disease prevalence of 55%, proceeding directly to invasive coronary angiography was more cost-effective than DSCT. With proper patient selection and consideration of disease prevalence, DSCT coronary angiography is cost-effective for diagnosing CAD in patients with an intermediate pretest likelihood for it. However, the range of eligible patients may be smaller than previously reported.
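
    The hyperbolic relationship the abstract describes follows directly from its definition of effectiveness: cost per correctly identified CAD patient is expected total cost divided by the probability of a correct positive identification, which scales with prevalence. A simplified sketch (the sensitivity and total-cost figures below are hypothetical; the study's totals also include induced and complication costs not modelled here):

    ```python
    def cost_per_correct_diagnosis(total_cost, prevalence, sensitivity):
        """Expected cost per patient correctly identified as having CAD.

        Effectiveness = P(disease) * P(test positive | disease), so this
        ratio falls hyperbolically as disease prevalence rises.
        """
        return total_cost / (prevalence * sensitivity)

    # Hypothetical inputs for illustration only (not the study's figures)
    at_24_pct = cost_per_correct_diagnosis(total_cost=233.0, prevalence=0.24, sensitivity=0.96)
    at_10_pct = cost_per_correct_diagnosis(total_cost=233.0, prevalence=0.10, sensitivity=0.96)
    print(at_24_pct < at_10_pct)  # higher prevalence -> lower cost per correct diagnosis
    ```

    The break-even prevalences reported (49% and 55%) are the points where the two tests' cost-per-correct-diagnosis curves cross.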

  9. Dealing with electronic waste: modeling the costs and environmental benefits of computer monitor disposal.

    PubMed

    Macauley, Molly; Palmer, Karen; Shih, Jhih-Shyang

    2003-05-01

    The importance of information technology to the world economy has brought about a surge in demand for electronic equipment. With rapid technological change, a growing fraction of the increasing stock of many types of electronics becomes obsolete each year. We model the costs and benefits of policies to manage 'e-waste' by focusing on a large component of the electronic waste stream-computer monitors-and the environmental concerns associated with disposal of the lead embodied in cathode ray tubes (CRTs) used in most monitors. We find that the benefits of avoiding health effects associated with CRT disposal appear far outweighed by the costs for a wide range of policies. For the stock of monitors disposed of in the United States in 1998, we find that policies restricting or banning some popular disposal options would increase disposal costs from about US$1 per monitor to between US$3 and US$20 per monitor. Policies to promote a modest amount of recycling of monitor parts, including lead, can be less expensive. In all cases, however, the costs of the policies exceed the value of the avoided health effects of CRT disposal.


  10. Can low-cost VOR and Omega receivers suffice for RNAV - A new computer-based navigation technique

    NASA Technical Reports Server (NTRS)

    Hollaar, L. A.

    1978-01-01

    It is shown that although RNAV is particularly valuable for the personal transportation segment of general aviation, it has not gained complete acceptance. This is due, in part, to its high cost and the special handling it requires from air traffic control. VOR/DME RNAV calculations are ideally suited for analog computers, and the use of microprocessor technology has been suggested for reducing RNAV costs. Three navigation systems, VOR, Omega, and DR, are compared with respect to common navigational difficulties, such as station geometry, siting errors, ground disturbances, and terminal area coverage. The Kalman filtering technique is described with reference to its disadvantages when used in a system built from standard microprocessors. An integrated navigation system, using input data from various low-cost sensor systems, is presented, and current simulation studies are noted.

  11. Quantum computation with realistic magic-state factories

    NASA Astrophysics Data System (ADS)

    O'Gorman, Joe; Campbell, Earl T.

    2017-03-01

    Leading approaches to fault-tolerant quantum computation dedicate a significant portion of the hardware to computational factories that churn out high-fidelity ancillas called magic states. Consequently, efficient and realistic factory design is of paramount importance. Here we present the most detailed resource assessment to date of magic-state factories within a surface code quantum computer, along the way introducing a number of techniques. We show that the block codes of Bravyi and Haah [Phys. Rev. A 86, 052329 (2012), 10.1103/PhysRevA.86.052329] have been systematically undervalued; we track correlated errors both numerically and analytically, providing fidelity estimates without appeal to the union bound. We also introduce a subsystem code realization of these protocols with constant time and low ancilla cost. Additionally, we confirm that magic-state factories have space-time costs that scale as a constant factor of surface code costs. We find that the magic-state factory required for postclassical factoring can be as small as 6.3 million data qubits, ignoring ancilla qubits, assuming 10^-4 error gates and the availability of long-range interactions.

  12. Integration of High-Performance Computing into Cloud Computing Services

    NASA Astrophysics Data System (ADS)

    Vouk, Mladen A.; Sills, Eric; Dreher, Patrick

    High-Performance Computing (HPC) projects span a spectrum of computer hardware implementations ranging from peta-flop supercomputers, high-end tera-flop facilities running a variety of operating systems and applications, to mid-range and smaller computational clusters used for HPC application development, pilot runs and prototype staging clusters. What they all have in common is that they operate as a stand-alone system rather than a scalable and shared user re-configurable resource. The advent of cloud computing has changed the traditional HPC implementation. In this article, we will discuss a very successful production-level architecture and policy framework for supporting HPC services within a more general cloud computing infrastructure. This integrated environment, called Virtual Computing Lab (VCL), has been operating at NC State since fall 2004. Nearly 8,500,000 HPC CPU-Hrs were delivered by this environment to NC State faculty and students during 2009. In addition, we present and discuss operational data that show that integration of HPC and non-HPC (or general VCL) services in a cloud can substantially reduce the cost of delivering cloud services (down to cents per CPU hour).

  13. Cost-Benefit Analysis for ECIA Chapter 1 and State DPPF Programs Comparing Groups Receiving Regular Program Instruction and Groups Receiving Computer Assisted Instruction/Computer Management System (CAI/CMS). 1986-87.

    ERIC Educational Resources Information Center

    Chamberlain, Ed

    A cost benefit study was conducted to determine the effectiveness of a computer assisted instruction/computer management system (CAI/CMS) as an alternative to conventional methods of teaching reading within Chapter 1 and DPPF funded programs of the Columbus (Ohio) Public Schools. The Chapter 1 funded Compensatory Language Experiences and Reading…

  14. Make or Buy: Cost Impacts of Additive Manufacturing, 3D Laser Scanning Technology, and Collaborative Product Lifecycle Management on Ship Maintenance and Modernization

    DTIC Science & Technology

    2015-05-01

    Briefing slides on three potential technologies for ship maintenance and modernization: 3D laser scanning (3D LS) for management during operations; additive manufacturing ("3D printing"), e.g., producing a final part from a 3D design/image such as one captured by 3D laser scanning; and collaborative product lifecycle management. Covers 2015.

  15. Benchmarking Undedicated Cloud Computing Providers for Analysis of Genomic Datasets

    PubMed Central

    Yazar, Seyhan; Gooden, George E. C.; Mackey, David A.; Hewitt, Alex W.

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5–78.2) for E.coli and 53.5% (95% CI: 34.4–72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5–303.1) and 173.9% (95% CI: 134.6–213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE. PMID:25247298

  16. Mobile HIV screening in Cape Town, South Africa: clinical impact, cost and cost-effectiveness.

    PubMed

    Bassett, Ingrid V; Govindasamy, Darshini; Erlwanger, Alison S; Hyle, Emily P; Kranzer, Katharina; van Schaik, Nienke; Noubary, Farzad; Paltiel, A David; Wood, Robin; Walensky, Rochelle P; Losina, Elena; Bekker, Linda-Gail; Freedberg, Kenneth A

    2014-01-01

    Mobile HIV screening may facilitate early HIV diagnosis. Our objective was to examine the cost-effectiveness of adding a mobile screening unit to current medical facility-based HIV testing in Cape Town, South Africa. We used the Cost Effectiveness of Preventing AIDS Complications International (CEPAC-I) computer simulation model to evaluate two HIV screening strategies in Cape Town: 1) medical facility-based testing (the current standard of care) and 2) addition of a mobile HIV-testing unit intervention in the same community. Baseline input parameters were derived from a Cape Town-based mobile unit that tested 18,870 individuals over 2 years: prevalence of previously undiagnosed HIV (6.6%), mean CD4 count at diagnosis (males 423/µL, females 516/µL), CD4 count-dependent linkage to care rates (males 31%-58%, females 49%-58%), mobile unit intervention cost (includes acquisition, operation and HIV test costs, $29.30 per negative result and $31.30 per positive result). We conducted extensive sensitivity analyses to evaluate input uncertainty. Model outcomes included site of HIV diagnosis, life expectancy, medical costs, and the incremental cost-effectiveness ratio (ICER) of the intervention compared to medical facility-based testing. We considered the intervention to be "very cost-effective" when the ICER was less than South Africa's annual per capita Gross Domestic Product (GDP) ($8,200 in 2012). We projected that, with medical facility-based testing, the discounted (undiscounted) HIV-infected population life expectancy was 132.2 (197.7) months; this increased to 140.7 (211.7) months with the addition of the mobile unit. The ICER for the mobile unit was $2,400/year of life saved (YLS). Results were most sensitive to the previously undiagnosed HIV prevalence, linkage to care rates, and frequency of HIV testing at medical facilities. The addition of mobile HIV screening to current testing programs can improve survival and be very cost-effective in South Africa and
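
    The gap between the discounted (132.2) and undiscounted (197.7) life-expectancy months above comes from discounting future survival to present value, here at the model's 3% annual rate. A minimal sketch of monthly discounting over a hypothetical flat survival curve (this is not the CEPAC-I model, just the discounting mechanic):

    ```python
    def discounted_life_months(survival_probs, annual_rate=0.03):
        """Sum each month's survival probability, discounted to present value."""
        monthly = (1 + annual_rate) ** (1 / 12) - 1   # equivalent monthly rate
        return sum(p / (1 + monthly) ** t for t, p in enumerate(survival_probs))

    # Hypothetical cohort: alive with certainty for 198 months, then dead
    undiscounted = discounted_life_months([1.0] * 198, annual_rate=0.0)
    discounted = discounted_life_months([1.0] * 198, annual_rate=0.03)
    print(undiscounted, round(discounted, 1))  # discounting shrinks the total
    ```

    Even with this crude survival curve, 3% annual discounting removes roughly a fifth of the nominal months, which is why model outputs routinely report both figures.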

  17. 19 CFR 201.14 - Computation of time, additional hearings, postponements, continuances, and extensions of time.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 3 2014-04-01 2014-04-01 false Computation of time, additional hearings, postponements, continuances, and extensions of time. 201.14 Section 201.14 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF GENERAL APPLICATION Initiation and Conduct of Investigations...

  18. 19 CFR 201.14 - Computation of time, additional hearings, postponements, continuances, and extensions of time.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 3 2013-04-01 2013-04-01 false Computation of time, additional hearings, postponements, continuances, and extensions of time. 201.14 Section 201.14 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF GENERAL APPLICATION Initiation and Conduct of Investigations...

  19. 19 CFR 210.6 - Computation of time, additional hearings, postponements, continuances, and extensions of time.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Computation of time, additional hearings, postponements, continuances, and extensions of time. 210.6 Section 210.6 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION INVESTIGATIONS OF UNFAIR PRACTICES IN IMPORT TRADE ADJUDICATION AND ENFORCEMENT...

  20. Expedited Holonomic Quantum Computation via Net Zero-Energy-Cost Control in Decoherence-Free Subspace.

    PubMed

    Pyshkin, P V; Luo, Da-Wei; Jing, Jun; You, J Q; Wu, Lian-Ao

    2016-11-25

    Holonomic quantum computation (HQC) may not show its full potential in quantum speedup due to the prerequisite of a long coherent runtime imposed by the adiabatic condition. Here we show that the conventional HQC can be dramatically accelerated by using external control fields, of which the effectiveness is exclusively determined by the integral of the control fields in the time domain. This control scheme can be realized with net zero energy cost and it is fault-tolerant against fluctuation and noise, significantly relaxing the experimental constraints. We demonstrate how to realize the scheme via decoherence-free subspaces. In this way we unify quantum robustness merits of this fault-tolerant control scheme, the conventional HQC and decoherence-free subspace, and propose an expedited holonomic quantum computation protocol.

  1. Manipulative therapy in addition to usual medical care accelerates recovery of shoulder complaints at higher costs: economic outcomes of a randomized trial.

    PubMed

    Bergman, Gert J D; Winter, Jan C; van Tulder, Maurits W; Meyboom-de Jong, Betty; Postema, Klaas; van der Heijden, Geert J M G

    2010-09-06

    Shoulder complaints are common in primary care and have an unfavourable long-term prognosis. Our objective was to evaluate the clinical effectiveness of manipulative therapy of the cervicothoracic spine and the adjacent ribs in addition to usual medical care (UMC) by the general practitioner in the treatment of shoulder complaints. This economic evaluation was conducted alongside a randomized trial in primary care. Included were 150 patients with shoulder complaints and a dysfunction of the cervicothoracic spine and adjacent ribs. Patients were treated with UMC (NSAIDs, corticosteroid injection or referral to physical therapy) and were allocated at random (yes/no) to manipulative therapy (manipulation and mobilization). Patient-perceived recovery, severity of the main complaint, shoulder pain, disability and general health were outcome measures. Data on direct and indirect costs were collected by means of a cost diary. Manipulative therapy as an add-on to UMC accelerated recovery on all outcome measures included. At 26 weeks after randomization, both groups reported similar recovery rates (41% vs. 38%), but the between-group differences in improvement of the severity of the main complaint, shoulder pain and disability were sustained. Compared with the UMC group, total costs were higher in the manipulative therapy group (€1167 vs. €555). This is explained mainly by the costs of the manipulative therapy itself and higher costs due to sick leave from work. The cost-effectiveness ratio showed that additional manipulative treatment is more costly but also more effective than UMC alone. The cost-effectiveness acceptability curve shows that a 50% probability of recovery with additional manipulative therapy within 6 months after initiation of treatment is achieved at €2876. Manipulative therapy in addition to UMC accelerates recovery and is more effective than UMC alone in the long term, but is associated with higher costs. INTERNATIONAL STANDARD RANDOMIZED CONTROLLED TRIAL NUMBER REGISTER: ISRCTN11216.

  2. Operating Dedicated Data Centers - Is It Cost-Effective?

    NASA Astrophysics Data System (ADS)

    Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.

    2014-06-01

    The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.

  3. Computation Directorate 2008 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, D L

    2009-03-25

    Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.

  4. [Is surgical education associated with additional costs? A controlled economic study on the German DRG System for primary TKA].

    PubMed

    Göbel, P; Piesche, K; Randau, T; Wimmer, M D; Wirtz, D C; Gravius, S

    2013-04-01

    Total knee arthroplasty (TKA) is one of the most common procedures in orthopaedic surgery, yet the cost of surgical training has not yet been quantified. In a pilot study, we investigated the economic impact of surgical training under the DRG system, analysing the cost-proceeds structure of surgical training for orthopaedic residents. Consecutive TKAs were performed by the most experienced surgeon (Group A), who had implanted ≥ 1000 TKAs, another attending (Group B) with ≥ 200 TKAs, and a resident (Group C) who had assisted in 25 TKAs (n = 30 patients per Group A-C). All patients were embedded in a standardised clinical pathway. By analysing cost parameters such as the number of blood transfusions, the operating time and the length of hospital stay, we investigated the health care-related costs matched to the DRG-based financial refunding. Data were analysed by analysis of variance followed by a post-hoc Scheffé procedure. On the one hand, the resident generated additional costs of 1111.7 ± 97 € compared with the Group A surgeon and 1729.8 ± 152 € compared with the Group B attending (p > 0.05); these were generated by a longer hospital stay, longer operation time and greater use of resources. On the other hand, there were significantly higher proceeds for Group C compared with both the Group B attending and Group A: 474.78 ± 82 € vs Group A and 150.54 ± 52 € vs Group B (p < 0.05). This was generated both by a higher patient clinical complexity level (PCCL) and by increased complication rates, resulting in a consecutively augmented profit from grouping these patients to a more lucrative DRG. Overall, the deficit per patient treated by the resident is 637 ± 77 € vs Group A and 1579.3 ± 137 € vs Group B (p > 0.05). The German DRG matrix results in higher profits accounted to the learning surgeon through increased PCCL-relevant status and grouping of the case to a more profitable DRG. Hereby, the additional costs are only partly

  5. Cost-effectiveness analysis of additional bevacizumab to pemetrexed plus cisplatin for malignant pleural mesothelioma based on the MAPS trial.

    PubMed

    Zhan, Mei; Zheng, Hanrui; Xu, Ting; Yang, Yu; Li, Qiu

    2017-08-01

    Malignant pleural mesothelioma (MPM) is a rare malignancy, and pemetrexed/cisplatin (PC) is the gold-standard first-line regimen. This study evaluated the cost-effectiveness of the addition of bevacizumab to PC (with maintenance bevacizumab) for unresectable MPM, based on a phase III trial that showed a survival benefit compared with chemotherapy alone. To estimate the incremental cost-effectiveness ratio (ICER) of incorporating bevacizumab, a Markov model based on the MAPS trial, including the disease states of progression-free survival, progressive disease and death, was used. Total costs were calculated from a Chinese payer perspective, and health outcomes were converted into quality-adjusted life years (QALYs). Model robustness was explored in sensitivity analyses. The addition of bevacizumab to PC was estimated to increase the cost by $81,446.69, with a gain of 0.112 QALYs, resulting in an ICER of $727,202.59 per QALY. In both one-way and probabilistic sensitivity analyses, the ICER exceeded the commonly accepted willingness-to-pay threshold of 3 times the gross domestic product per capita of China ($23,970.00 per QALY). The cost of bevacizumab had the most important impact on the ICER. The combination of bevacizumab with PC chemotherapy is not a cost-effective treatment option for MPM in China. Given bevacizumab's positive clinical value and the extremely low incidence of MPM, an appropriate price discount, assistance programs and medical insurance should be considered to make bevacizumab more affordable for this rare patient population. Copyright © 2017 Elsevier B.V. All rights reserved.
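    The ICER reported in this record is simply the incremental cost divided by the incremental health gain. A minimal Python sketch of that arithmetic, using the deltas quoted in the abstract (the function and variable names are illustrative, not from the study):

    ```python
    def icer(delta_cost, delta_qaly):
        """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
        return delta_cost / delta_qaly

    # deltas for bevacizumab + pemetrexed/cisplatin vs. chemotherapy alone
    ratio = icer(delta_cost=81446.69, delta_qaly=0.112)
    threshold = 3 * 7990.0  # assumed 3x GDP per capita, i.e. $23,970 per QALY

    print(round(ratio, 2))    # ~ $727,202.59 per QALY, matching the abstract
    print(ratio > threshold)  # True: above the willingness-to-pay threshold
    ```

    The comparison against three times GDP per capita is the standard WHO-style willingness-to-pay heuristic the abstract invokes.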

  6. The Hidden Costs of Owning a Microcomputer.

    ERIC Educational Resources Information Center

    McDole, Thomas L.

    Before purchasing computer hardware, individuals must consider the costs associated with the setup and operation of a microcomputer system. Included among the initial costs of purchasing a computer are the costs of the computer, one or more disk drives, a monitor, and a printer as well as the costs of such optional peripheral devices as a plotter…

  7. General aviation design synthesis utilizing interactive computer graphics

    NASA Technical Reports Server (NTRS)

    Galloway, T. L.; Smith, M. R.

    1976-01-01

    Interactive computer graphics is a fast growing area of computer application, due to such factors as substantial cost reductions in hardware, general availability of software, and expanded data communication networks. In addition to allowing faster and more meaningful input/output, computer graphics permits the use of data in graphic form to carry out parametric studies for configuration selection and for assessing the impact of advanced technologies on general aviation designs. The incorporation of interactive computer graphics into a NASA developed general aviation synthesis program is described, and the potential uses of the synthesis program in preliminary design are demonstrated.

  8. The UCLA MEDLARS Computer System *

    PubMed Central

    Garvis, Francis J.

    1966-01-01

    Under a subcontract with UCLA the Planning Research Corporation has changed the MEDLARS system to make it possible to use the IBM 7094/7040 direct-couple computer instead of the Honeywell 800 for demand searches. The major tasks were the rewriting of the programs in COBOL and copying of the stored information on the narrower tapes that IBM computers require. (In the future NLM will copy the tapes for IBM computer users.) The differences in the software required by the two computers are noted. Major and costly revisions would be needed to adapt the large MEDLARS system to the smaller IBM 1401 and 1410 computers. In general, MEDLARS is transferable to other computers of the IBM 7000 class, the new IBM 360, and those of like size, such as the CDC 1604 or UNIVAC 1108, although additional changes are necessary. Potential future improvements are suggested. PMID:5901355

  9. Additional support for the TDK/MABL computer program

    NASA Technical Reports Server (NTRS)

    Nickerson, G. R.; Dunn, Stuart S.

    1993-01-01

    An advanced version of the Two-Dimensional Kinetics (TDK) computer program was developed under contract and released to the propulsion community in early 1989. Exposure of the code to this community indicated a need for improvements in certain areas. In particular, the TDK code needed to be adapted to the special requirements imposed by the Space Transportation Main Engine (STME) development program. This engine utilizes injection of the gas generator exhaust into the primary nozzle by means of a set of slots. The subsequent mixing of this secondary stream with the primary stream with finite rate chemical reaction can have a major impact on the engine performance and the thermal protection of the nozzle wall. In attempting to calculate this reacting boundary layer problem, the Mass Addition Boundary Layer (MABL) module of TDK was found to be deficient in several respects. For example, when finite rate chemistry was used to determine gas properties, (MABL-K option) the program run times became excessive because extremely small step sizes were required to maintain numerical stability. A robust solution algorithm was required so that the MABL-K option could be viable as a rocket propulsion industry design tool. Solving this problem was a primary goal of the phase 1 work effort.

  10. Additive Manufacture of Plasma Diagnostic Components Final Report Phase II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodruff, Simon; Romero-Talamas, Carlos; You, Setthivoine

    There is now a well-established set of plasma diagnostics (see e.g. [3]), but these remain some of the most expensive assemblies in fusion systems, since for every system they have to be custom built, and time for diagnostic development can pace the project. Additive manufacturing (AM) has the potential to decrease production cost and significantly lower design time of fusion diagnostic subsystems, which would realize significant cost reduction for standard diagnostics. In some cases, these basic components can be additively manufactured for less than 1/100th the cost of conventional manufacturing. In our DOE Phase II SBIR, we examined the impact that AM can have on plasma diagnostic cost by taking 15 separate diagnostics through an engineering design using Conventional Manufacturing (CM) techniques, then optimizing the design to exploit the benefits of AM. The impact of AM techniques on cost is found to be in several areas. First, the cost of materials falls because AM parts can be manufactured with little to no waste, and engineered to use less material than CM. Next, the cost of fabrication falls for AM parts relative to CM, since the fabrication time can be computed exactly, and often no post-processing is required for the part to be functional. We find that AM techniques are well suited for plasma diagnostics since typical diagnostic complexity comes at no additional cost. Cooling channels, for example, can be built in to plasma-facing components at no extra cost. Fabrication costs associated with assembly are lower for AM parts because many components can be combined and printed as monoliths, thereby mitigating the need for alignment or calibration. Finally, the cost of engineering is impacted by exploiting AM design tools that allow standard components to be customized through web interfaces. Furthermore, we find that concept design costs can be impacted by scripting interfaces for online engineering design tools.

  11. Cutting Technology Costs with Refurbished Computers

    ERIC Educational Resources Information Center

    Dessoff, Alan

    2010-01-01

    Many district administrators are finding that they can save money on computers by buying preowned ones instead of new ones. The practice has other benefits as well: It allows districts to give more computers to more students who need them, and it also promotes good environmental practices by keeping the machines out of landfills, where they…

  12. Projections of costs, financing, and additional resource requirements for low- and lower middle-income country immunization programs over the decade, 2011-2020.

    PubMed

    Gandhi, Gian; Lydon, Patrick; Cornejo, Santiago; Brenzel, Logan; Wrobel, Sandra; Chang, Hugh

    2013-04-18

    The Decade of Vaccines Global Vaccine Action Plan has outlined a set of ambitious goals to broaden the impact and reach of immunization across the globe. A projections exercise has been undertaken to assess the costs, financing availability, and additional resource requirements to achieve these goals through the delivery of vaccines against 19 diseases across 94 low- and middle-income countries for the period 2011-2020. The exercise draws upon data from existing published and unpublished global forecasts, country immunization plans, and costing studies. A combination of an ingredients-based approach and use of approximations based on past spending has been used to generate vaccine and non-vaccine delivery costs for routine programs, as well as supplementary immunization activities (SIAs). Financing projections focused primarily on support from governments and the GAVI Alliance. Cost and financing projections are presented in constant 2010 US dollars (US$). Cumulative total costs for the decade are projected to be US$57.5 billion, with 85% for routine programs and the remaining 15% for SIAs. Delivery costs account for 54% of total cumulative costs, and vaccine costs make up the remainder. A conservative estimate of total financing for immunization programs is projected to be $34.3 billion over the decade, with country governments financing 65%. These projections imply a cumulative funding gap of $23.2 billion. About 57% of the total resources required to close the funding gap are needed just to maintain existing programs and scale up other currently available vaccines (i.e., before adding in the additional costs of vaccines still in development). Efforts to mobilize additional resources, manage program costs, and establish mutual accountability between countries and development partners will all be necessary to ensure the goals of the Decade of Vaccines are achieved. Establishing or building on existing mechanisms to more comprehensively track resources and
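    The funding-gap figure in this record is straightforward arithmetic on the decade totals. A quick check in Python, with the numbers taken from the abstract (constant 2010 US$ billions):

    ```python
    total_cost = 57.5      # cumulative decade cost, US$ billions
    routine_share = 0.85   # share for routine programs; the remainder is SIAs
    financing = 34.3       # conservative projected total financing, US$ billions

    gap = total_cost - financing
    print(round(gap, 1))  # 23.2 billions: the cumulative funding gap

    sia_cost = total_cost * (1 - routine_share)
    print(round(sia_cost, 1))  # ~8.6 billions for supplementary immunization activities
    ```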

  13. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective of distributed computing research and industry for some time. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability of Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measuring the costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and in comparing these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real-world scenarios.

  14. Scaling predictive modeling in drug development with cloud computing.

    PubMed

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets with increased time for analysis is hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compare with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.

  15. The Instructional Cost Index. A Simplified Approach to Interinstitutional Cost Comparison.

    ERIC Educational Resources Information Center

    Beatty, George, Jr.; And Others

    The paper describes a simple, yet effective method of computing a comparative index of instructional costs. The Instructional Cost Index identifies direct cost differentials among instructional programs. Cost differentials are described in terms of differences among numerical values of variables that reflect fundamental academic and resource…

  16. Expedited Holonomic Quantum Computation via Net Zero-Energy-Cost Control in Decoherence-Free Subspace

    PubMed Central

    Pyshkin, P. V.; Luo, Da-Wei; Jing, Jun; You, J. Q.; Wu, Lian-Ao

    2016-01-01

    Holonomic quantum computation (HQC) may not show its full potential in quantum speedup due to the prerequisite of a long coherent runtime imposed by the adiabatic condition. Here we show that the conventional HQC can be dramatically accelerated by using external control fields, of which the effectiveness is exclusively determined by the integral of the control fields in the time domain. This control scheme can be realized with net zero energy cost and it is fault-tolerant against fluctuation and noise, significantly relaxing the experimental constraints. We demonstrate how to realize the scheme via decoherence-free subspaces. In this way we unify quantum robustness merits of this fault-tolerant control scheme, the conventional HQC and decoherence-free subspace, and propose an expedited holonomic quantum computation protocol. PMID:27886234

  17. Addition of flexible body option to the TOLA computer program, part 1

    NASA Technical Reports Server (NTRS)

    Dick, J. W.; Benda, B. J.

    1975-01-01

    This report describes a flexible body option that was developed and added to the Takeoff and Landing Analysis (TOLA) computer program. The addition of the flexible body option to TOLA allows it to be used to study essentially any conventional type of airplane in the ground operating environment. It provides the capability to predict the total motion of selected points on the airplane. The analytical methods incorporated in the program and operating instructions for the option are described. A program listing is included along with several example problems to aid in interpretation of the operating instructions and to illustrate program usage.

  18. Computationally Efficient Adaptive Beamformer for Ultrasound Imaging Based on QR Decomposition.

    PubMed

    Park, Jongin; Wi, Seok-Min; Lee, Jin S

    2016-02-01

    Adaptive beamforming methods for ultrasound imaging have been studied to improve image resolution and contrast. The most common approach is the minimum variance (MV) beamformer, which minimizes the power of the beamformed output while keeping the response from the direction of interest constant. The method achieves higher resolution and better contrast than the delay-and-sum (DAS) beamformer, but it suffers from high computational cost. This cost is mainly due to the computation of the spatial covariance matrix and its inverse, which requires O(L^3) computations, where L denotes the subarray size. In this study, we propose a computationally efficient MV beamformer based on QR decomposition. The idea behind our approach is to transform the spatial covariance matrix into a scalar matrix σI, from which we obtain the apodization weights and the beamformed output without computing the matrix inverse. The QR decomposition itself can be executed at low cost, and therefore the computational complexity is reduced to O(L^2). In addition, our approach is mathematically equivalent to the conventional MV beamformer and thus shows equivalent performance. The simulation and experimental results support the validity of our approach.
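    For context, the baseline that this paper accelerates is the standard minimum-variance (Capon) weight computation, which requires inverting the subarray covariance matrix. A minimal NumPy sketch with synthetic data — the subarray size L, snapshot count, and diagonal loading are illustrative assumptions, and this is the conventional O(L^3) formulation, not the paper's QR-based variant:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    L = 8                                    # illustrative subarray size
    a = np.ones(L)                           # steering vector, direction of interest
    snapshots = rng.standard_normal((L, 100))  # synthetic subarray data

    # spatial covariance estimate, with diagonal loading for numerical stability
    R = snapshots @ snapshots.T / 100 + 1e-3 * np.eye(L)

    # MV weights w = R^-1 a / (a^H R^-1 a); solve() avoids forming R^-1 explicitly
    Ri_a = np.linalg.solve(R, a)
    w = Ri_a / (a @ Ri_a)

    # distortionless constraint: unit response toward the direction of interest
    print(np.isclose(w @ a, 1.0))  # True
    ```

    The paper's contribution is to avoid even the linear solve above by reducing the covariance matrix to scalar form via QR decomposition.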

  19. 10 CFR Appendix I to Part 504 - Procedures for the Computation of the Real Cost of Capital

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Procedures for the Computation of the Real Cost of Capital I Appendix I to Part 504 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS EXISTING... parameters specified above are not obtainable, alternate parameters that closely correspond to those above...

  20. 10 CFR Appendix I to Part 504 - Procedures for the Computation of the Real Cost of Capital

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Procedures for the Computation of the Real Cost of Capital I Appendix I to Part 504 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS EXISTING... obtainable, alternate parameters that closely correspond to those above may be used. This may include...

  1. 10 CFR Appendix I to Part 504 - Procedures for the Computation of the Real Cost of Capital

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Procedures for the Computation of the Real Cost of Capital I Appendix I to Part 504 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS EXISTING... parameters specified above are not obtainable, alternate parameters that closely correspond to those above...

  2. Selecting an Architecture for a Safety-Critical Distributed Computer System with Power, Weight and Cost Considerations

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.

  3. Getting the most out of additional guidance information in deformable image registration by leveraging multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Alderliesten, Tanja; Bosman, Peter A. N.; Bel, Arjan

    2015-03-01

    Incorporating additional guidance information, e.g., landmark/contour correspondence, in deformable image registration is often desirable and is typically done by adding constraints or cost terms to the optimization function. Commonly, deciding between a "hard" constraint and a "soft" additional cost term, as well as the weighting of cost terms in the optimization function, is done on a trial-and-error basis. The aim of this study is to investigate the advantages of exploiting guidance information by taking a multi-objective optimization perspective. To this end, next to objectives related to match quality and amount of deformation, we define a third objective related to guidance information. Multi-objective optimization eliminates the need to tune a weighting of objectives a priori in a single optimization function, as well as the strict requirement of fulfilling hard guidance constraints. Instead, Pareto-efficient trade-offs between all objectives are found, effectively making the introduction of guidance information straightforward, independent of its type or scale. Further, since complete Pareto fronts also contain less interesting parts (i.e., solutions with near-zero deformation effort), we study how adaptive steering mechanisms can be incorporated to automatically focus more on solutions of interest. We performed experiments on artificial and real clinical data with large differences, including disappearing structures. Results show the substantial benefit of using additional guidance information. Moreover, compared to the 2-objective case, the additional computational cost is negligible. Finally, with the same computational budget, use of the adaptive steering mechanism provides superior solutions in the area of interest.
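    The Pareto-efficient trade-offs described in this record rest on the standard dominance relation between objective vectors. A small generic sketch, with all three objectives minimized (names and numbers are illustrative, not from the paper's code):

    ```python
    def dominates(f1, f2):
        """True if f1 Pareto-dominates f2: no worse in every objective, strictly
        better in at least one (all objectives are minimized)."""
        return all(x <= y for x, y in zip(f1, f2)) and any(x < y for x, y in zip(f1, f2))

    # objectives: (match-quality error, deformation effort, guidance mismatch)
    print(dominates((0.2, 1.0, 0.1), (0.3, 1.0, 0.2)))  # True: first solution dominates
    print(dominates((0.2, 1.0, 0.1), (0.1, 2.0, 0.1)))  # False: a trade-off; both stay on the front
    ```

    Solutions that no other solution dominates form the Pareto front; adding the guidance objective simply extends these vectors by one component.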

  4. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Reverse Mortgage Transactions K Appendix K to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. K Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions (a...

  5. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Reverse Mortgage Transactions K Appendix K to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. K Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions (a...

  6. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Reverse Mortgage Transactions K Appendix K to Part 226 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. K Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions (a...

  7. A Low-Cost Computer-Controlled Arduino-Based Educational Laboratory System for Teaching the Fundamentals of Photovoltaic Cells

    ERIC Educational Resources Information Center

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental…

  8. Use of several Cloud Computing approaches for climate modelling: performance, costs and opportunities

    NASA Astrophysics Data System (ADS)

    Perez Montes, Diego A.; Añel Cabanelas, Juan A.; Wallom, David C. H.; Arribas, Alberto; Uhe, Peter; Caderno, Pablo V.; Pena, Tomas F.

    2017-04-01

    Cloud Computing is a technological option that offers great possibilities for modelling in geosciences. We have studied how two different climate models, HadAM3P-HadRM3P and CESM-WACCM, can be adapted in two different ways to run on Cloud Computing environments from three different vendors: Amazon, Google and Microsoft. We have also evaluated qualitatively how the use of Cloud Computing can affect the allocation of resources by funding bodies, as well as issues related to computing security, including scientific reproducibility. Our first experiments were developed using the well-known ClimatePrediction.net (CPDN), which uses BOINC, over the infrastructure of two cloud providers, namely Microsoft Azure and Amazon Web Services (hereafter AWS). For this comparison we ran a set of thirteen-month climate simulations for CPDN in Azure and AWS using a range of different virtual machines (VMs) for HadRM3P (50 km resolution over the South America CORDEX region) nested in the global atmosphere-only model HadAM3P. These simulations were run on a single processor and took between 3 and 5 days to compute, depending on the VM type. The last part of our simulation experiments was running WACCM over different VMs on the Google Compute Engine (GCE) and comparing with the supercomputer (SC) Finisterrae1 from the Centro de Supercomputacion de Galicia. It was shown that GCE gives better performance than the SC for smaller numbers of cores/MPI tasks, but the model throughput shows clearly how the SC performance is better beyond approximately 100 cores (related to differences in network speed and latency). From a cost point of view, Cloud Computing moves researchers from a traditional approach, where experiments were limited by the available hardware resources, to one limited by monetary resources (how many resources can be afforded). As there is an increasing movement towards, and recommendation for, budgeting HPC projects on this technology (budgets can be calculated in a more realistic way), we could see a shift on

  9. Computer-assisted cognitive remediation therapy in schizophrenia: Durability of the effects and cost-utility analysis.

    PubMed

    Garrido, Gemma; Penadés, Rafael; Barrios, Maite; Aragay, Núria; Ramos, Irene; Vallès, Vicenç; Faixa, Carlota; Vendrell, Josep M

    2017-08-01

    The durability of the effects of computer-assisted cognitive remediation (CACR) therapy over time and the cost-effectiveness of the treatment remain unclear. The aim of the current study is to investigate the effectiveness of CACR and to examine the use and cost of acute psychiatric admissions before and after CACR. Sixty-seven participants were initially recruited. For the follow-up study a total of 33 participants were enrolled, 20 in the CACR condition group and 13 in the active control condition group. All participants were assessed at baseline, post-therapy and 12 months post-therapy on neuropsychological, QoL and self-esteem measurements. The use and cost of acute psychiatric admissions were collected retrospectively at four assessment points: baseline, 12 months post-therapy, 24 months post-therapy, and 36 months post-therapy. The results indicated that treatment effectiveness persisted in the CACR group one year post-therapy on neuropsychological and well-being outcomes. The CACR group showed a clear decrease in the use of acute psychiatric admissions at 12, 24 and 36 months post-therapy, which lowered the global costs of acute psychiatric admissions at those time points. The effects of CACR are durable over at least a 12-month period, and CACR may help to reduce health care costs for schizophrenia patients. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  10. JPL Energy Consumption Program (ECP) documentation: A computer model simulating heating, cooling and energy loads in buildings. [low cost solar array efficiency

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.; Chai, V. W.; Lascu, D.; Urbenajo, R.; Wong, P.

    1978-01-01

    The engineering manual provides complete companion documentation on the structure of the main program and subroutines, the preparation of input data, the interpretation of output results, access and use of the program, and a detailed description of all the analytic and logical expressions and flow charts used in the computations and program structure. A numerical example is provided and solved completely to show the sequence of computations followed. The program is carefully structured to reduce both the user's time and costs without sacrificing accuracy. The user can expect a CPU-time cost of approximately $5.00 per building zone, excluding printing costs. The accuracy, on the other hand, measured by the deviation of simulated consumption from watt-hour meter readings, was found in many simulation tests not to exceed a ±10 percent margin.
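    The accuracy criterion in this record — simulated consumption within ±10 percent of watt-hour meter readings — is a relative-deviation check. A tiny illustrative sketch (the readings are hypothetical, not from the JPL report):

    ```python
    simulated_kwh = 1040.0  # hypothetical model output for one building zone
    metered_kwh = 1000.0    # hypothetical watt-hour meter reading

    deviation = (simulated_kwh - metered_kwh) / metered_kwh
    print(round(deviation, 3))     # 0.04, i.e. +4 percent
    print(abs(deviation) <= 0.10)  # True: within the ±10 percent margin
    ```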

  11. Cost-Effectiveness and Cost-Benefit Analysis: Confronting the Problem of Choice.

    ERIC Educational Resources Information Center

    Clardy, Alan

    Cost-effectiveness analysis and cost-benefit analysis are two related yet distinct methods to help decision makers choose the best course of action from among competing alternatives. For both types of analysis, costs are computed similarly. Costs may be reduced to present value amounts for multi-year programs, and parameters may be altered to show…

  12. Furniture rough mill costs evaluated by computer simulation

    Treesearch

    R. Bruce Anderson

    1983-01-01

    A crosscut-first furniture rough mill was simulated to evaluate processing and raw material costs on an individual part basis. Distributions representing the real-world characteristics of lumber, equipment feed speeds, and processing requirements are programed into the simulation. Costs of parts from a specific cutting bill are given, and effects of lumber input costs...

  13. Cost analysis of non-invasive fractional flow reserve derived from coronary computed tomographic angiography in Japan.

    PubMed

    Kimura, Takeshi; Shiomi, Hiroki; Kuribayashi, Sachio; Isshiki, Takaaki; Kanazawa, Susumu; Ito, Hiroshi; Ikeda, Shunya; Forrest, Ben; Zarins, Christopher K; Hlatky, Mark A; Norgaard, Bjarne L

    2015-01-01

    Percutaneous coronary intervention (PCI) based on fractional flow reserve (FFRcath) measurement during invasive coronary angiography (CAG) results in improved patient outcome and reduced healthcare costs. FFR can now be computed non-invasively from standard coronary CT angiography (cCTA) scans (FFRCT). The purpose of this study is to determine the potential impact of non-invasive FFRCT on costs and clinical outcomes of patients with suspected coronary artery disease in Japan. Clinical data from 254 patients in the HeartFlowNXT trial, costs of goods and services in Japan, and clinical outcome data from the literature were used to estimate the costs and outcomes of 4 clinical pathways: (1) CAG-visual guided PCI, (2) CAG-FFRcath guided PCI, (3) cCTA followed by CAG-visual guided PCI, (4) cCTA-FFRCT guided PCI. The CAG-visual strategy demonstrated the highest projected cost ($10,360) and highest projected 1-year death/myocardial infarction rate (2.4 %). An assumed price for FFRCT of US $2,000 produced equivalent clinical outcomes (death/MI rate: 1.9 %) and healthcare costs ($7,222) for the cCTA-FFRCT strategy and the CAG-FFRcath guided PCI strategy. Use of the cCTA-FFRCT strategy to select patients for PCI would result in 32 % lower costs and 19 % fewer cardiac events at 1 year compared to the most commonly used CAG-visual strategy. Use of cCTA-FFRCT to select patients for CAG and PCI may reduce costs and improve clinical outcome in patients with suspected coronary artery disease in Japan.

  14. High Temperature Thermoplastic Additive Manufacturing Using Low-Cost, Open-Source Hardware

    NASA Technical Reports Server (NTRS)

    Gardner, John M.; Stelter, Christopher J.; Yashin, Edward A.; Siochi, Emilie J.

    2016-01-01

    Additive manufacturing (or 3D printing) via Fused Filament Fabrication (FFF), also known as Fused Deposition Modeling (FDM), is a process where material is placed in specific locations layer-by-layer to create a complete part. Printers designed for FFF build parts by extruding a thermoplastic filament from a nozzle in a predetermined path. Originally developed for commercial printers, 3D printing via FFF has become accessible to a much larger community of users since the introduction of RepRap printers. These low-cost, desktop machines are typically used to print prototype parts or novelty items. As the adoption of desktop-sized 3D printers broadens, there is increased demand for these machines to produce functional parts that can withstand harsher conditions such as high temperature and mechanical loads. Materials meeting these requirements tend to possess better mechanical properties and higher glass transition temperatures (Tg), thus requiring printers with high-temperature printing capability. This report outlines the problems and solutions, and includes a detailed description of the machine design, printing parameters, and processes specific to high-temperature thermoplastic 3D printing.

  15. 48 CFR 246.470-1 - Assessment of additional costs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... of Supplies—Fixed-Price; and (2) Demand payment of the costs in accordance with the collection... Supplies—Fixed-Price, after considering the factors in paragraph (c) of this subsection, the quality...

  16. Decision making based on analysis of benefit versus costs of preventive retrofit versus costs of repair after earthquake hazards

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, M.

    2012-04-01

    dissipaters) to different amounts and locations in the building was considered. Device computations, a civil engineering method for building economics (which, before statistics existed, was also the method for computing the costs of general building upgrades), were done for the retrofit and repair measures; they can be applied to different countries, including those with no database of existing seismic retrofit projects. The building elements for which the device computations were done are named "retrofit elements"; they can be new elements, modified elements, or replaced elements of the initial building. The addition of the devices is simple, as the row in project management was, but, for the sake of comparison, the complex project management computed in other works was also compared for innovative measures such as FRP (with glass and fibre). The theoretical costs for model measures were compared with the way costs of real retrofits of this building type (with reinforced concrete jacketing and FRP) are computed in Greece. The theoretically proposed measures were also compared with those applied in practice in Romania and Italy. A further study will include these, as diagonal braces with dissipation had been used in Italy. The typology of braces is also relevant to the local seismic culture, possibly extending to another type of skeleton structure whose distribution has been studied: the timber skeleton. A subtype of Romanian reinforced concrete skeleton buildings includes diagonal braces. To assess the costs of rebuilding or of a general upgrade without retrofit, architectural methods for building economics based on floor surface are considered. Diagrams have been built to show how the total cost varies as the sum of preventive retrofit and post-earthquake repair, and tables compare this with the costs of rebuilding, starting from the model of adding day-lighting in atria of buildings.
The moment when a repair measure

  17. Feasibility study of an Integrated Program for Aerospace-vehicle Design (IPAD) system. Volume 6: Implementation schedule, development costs, operational costs, benefit assessment, impact on company organization, spin-off assessment, phase 1, tasks 3 to 8

    NASA Technical Reports Server (NTRS)

    Garrocq, C. A.; Hurley, M. J.; Dublin, M.

    1973-01-01

    A baseline implementation plan, including alternative implementation approaches for critical software elements and variants to the plan, was developed. The basic philosophy was aimed at: (1) a progressive release of capability for three major computing systems, (2) an end product that was a working tool, (3) giving participation to industry, government agencies, and universities, and (4) emphasizing the development of critical elements of the IPAD framework software. The results of these tasks indicate an IPAD first release capability 45 months after go-ahead, a five-year total implementation schedule, and a total developmental cost of 2027 man-months and 1074 computer hours. Several areas of operational cost increases were identified, mainly due to the impact of additional equipment needed and additional computer overhead. The benefits of an IPAD system were related mainly to potential savings in engineering man-hours, reduction of design-cycle calendar time, and indirect upgrading of product quality and performance.

  18. Addition of Adult-to-Adult Living Donation to Liver Transplant Programs Improves Survival but at an Increased Cost

    PubMed Central

    Northup, Patrick G.; Abecassis, Michael M.; Englesbe, Michael J.; Emond, Jean C.; Lee, Vanessa D.; Stukenborg, George J.; Tong, Lan; Berg, Carl L.

    2011-01-01

    We performed a cost-effectiveness analysis exploring the costs and benefits of LDLT using outcomes data from the Adult to Adult Living Donor Liver Transplantation Cohort Study (A2ALL). A multistage Markov decision analysis model was developed with treatment strategies including medical management only (strategy 1), waiting list with possible deceased donor liver transplant (strategy 2), and waiting list with possible LDLT or DDLT (strategy 3) over ten years. Decompensated cirrhosis with medical management offered 2.0 quality-adjusted life-years (QALY) of survival while costing an average of $65,068; waiting list with possible DDLT offered 4.4 QALY of survival and a mean cost of $151,613; and waiting list with possible DDLT or LDLT offered 4.9 QALY of survival and a mean cost of $208,149. Strategy 2 had an incremental cost-effectiveness ratio (ICER) of $35,976 over strategy 1, while strategy 3 produced an ICER of $106,788 over strategy 2. On average, strategy 3 cost $47,693 more per QALY than strategy 1. Both DDLT and LDLT are cost-effective compared to medical management of cirrhosis over our ten-year study period. The addition of LDLT to a standard waiting list DDLT program is effective at improving recipient survival and preventing waiting list deaths but at a greater cost. PMID:19177435
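    The ICERs quoted in this abstract follow the standard definition, ICER = ΔCost / ΔQALY. A minimal sketch using the rounded cost/QALY figures from the abstract; the results differ slightly from the published $35,976 and $106,788 because the abstract rounds the QALY values:

```python
# ICER sketch using the rounded cost/QALY figures from the abstract.
# ICER = (cost_new - cost_old) / (qaly_new - qaly_old)

def icer(old, new):
    return (new["cost"] - old["cost"]) / (new["qaly"] - old["qaly"])

medical = {"cost": 65068,  "qaly": 2.0}  # strategy 1: medical management only
ddlt    = {"cost": 151613, "qaly": 4.4}  # strategy 2: waiting list, DDLT
ldlt    = {"cost": 208149, "qaly": 4.9}  # strategy 3: waiting list, DDLT or LDLT

print(f"strategy 2 vs 1: ${icer(medical, ddlt):,.0f}/QALY")  # ~$36,000/QALY
print(f"strategy 3 vs 2: ${icer(ddlt, ldlt):,.0f}/QALY")     # ~$113,000/QALY
```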

  19. GME: at what cost?

    PubMed

    Young, David W

    2003-11-01

    Current computing methods impede determining the real cost of graduate medical education. However, a more accurate estimate could be obtained if policy makers would allow for the application of basic cost-accounting principles, including consideration of department-level costs, unbundling of joint costs, and other factors.

  20. Costs of fire suppression forces based on cost-aggregation approach

    Treesearch

    Armando González-Cabán; Charles W. McKetta; Thomas J. Mills

    1984-01-01

    A cost-aggregation approach has been developed for determining the cost of Fire Management Inputs (FMIs), the direct fireline production units (personnel and equipment) used in initial attack and large-fire suppression activities. All components contributing to an FMI are identified, computed, and summed to estimate hourly costs. This approach can be applied to any FMI...
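    The cost-aggregation idea described here amounts to summing every cost component attributable to an FMI and expressing the total per hour of use. A minimal sketch with entirely hypothetical component values (none are taken from the report):

```python
# Hypothetical sketch of cost aggregation: sum all cost components
# attributable to a Fire Management Input (FMI), converting annual fixed
# costs to an hourly figure via annual hours of use.

engine_crew = {
    "salaries_per_hour": 85.0,    # personnel costs incurred per hour of use
    "annual_ownership": 12000.0,  # depreciation, insurance, overhead
    "annual_operating": 8000.0,   # fuel, maintenance, supplies
    "annual_use_hours": 400.0,
}

def hourly_cost(fmi):
    # Spread annual ownership and operating costs over annual hours of use,
    # then add the per-hour personnel cost.
    fixed = (fmi["annual_ownership"] + fmi["annual_operating"]) / fmi["annual_use_hours"]
    return fmi["salaries_per_hour"] + fixed

print(f"${hourly_cost(engine_crew):.2f}/hour")  # $135.00/hour on these inputs
```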

  1. Debugging embedded computer programs. [tactical missile computers

    NASA Technical Reports Server (NTRS)

    Kemp, G. H.

    1980-01-01

    Every embedded computer program must complete its debugging cycle using some system that will allow real time debugging. Many of the common items addressed during debugging are listed. Seven approaches to debugging are analyzed to evaluate how well they treat those items. Cost evaluations are also included in the comparison. The results indicate that the best collection of capabilities to cover the common items present in the debugging task occurs in the approach where a minicomputer handles the environment simulation with an emulation of some kind representing the embedded computer. This approach can be taken at a reasonable cost. The case study chosen is an embedded computer in a tactical missile. Several choices of computer for the environment simulation are discussed as well as different approaches to the embedded emulator.

  2. Price and cost estimation

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.

    1979-01-01

    The Price and Cost Estimating Program (PACE II) was developed to prepare man-hour and material cost estimates. This versatile and flexible tool significantly reduces computation time and errors, as well as the typing and reproduction time involved in preparing cost estimates.

  3. The Prevalence of Phosphorus Containing Food Additives in Top Selling Foods in Grocery Stores

    PubMed Central

    León, Janeen B.; Sullivan, Catherine M.; Sehgal, Ashwini R.

    2013-01-01

    Objective: To determine the prevalence of phosphorus-containing food additives in best selling processed grocery products and to compare the phosphorus content of a subset of top selling foods with and without phosphorus additives. Design: The labels of 2394 best selling branded grocery products in northeast Ohio were reviewed for phosphorus additives. The top 5 best selling products containing phosphorus additives from each food category were matched with similar products without phosphorus additives and analyzed for phosphorus content. Four days of sample meals consisting of foods with and without phosphorus additives were created and daily phosphorus and pricing differentials were computed. Setting: Northeast Ohio. Main outcome measures: Presence of phosphorus-containing food additives; phosphorus content. Results: 44% of the best selling grocery items contained phosphorus additives. The additives were particularly common in prepared frozen foods (72%), dry food mixes (70%), packaged meat (65%), bread & baked goods (57%), soup (54%), and yogurt (51%) categories. Phosphorus additive containing foods averaged 67 mg phosphorus/100 gm more than matched non-additive containing foods (p=.03). Sample meals comprised mostly of phosphorus additive-containing foods had 736 mg more phosphorus per day compared to meals consisting of only additive-free foods. Phosphorus additive-free meals cost an average of $2.00 more per day. Conclusion: Phosphorus additives are common in best selling processed groceries and contribute significantly to their phosphorus content. Moreover, phosphorus additive foods are less costly than phosphorus additive-free foods. As a result, persons with chronic kidney disease may purchase these popular low-cost groceries and unknowingly increase their intake of highly bioavailable phosphorus. PMID:23402914

  4. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    NASA Astrophysics Data System (ADS)

    Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.

    2011-12-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services makes them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand", as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost-effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.

  5. A simple, low-cost, data logging pendulum built from a computer mouse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gintautas, Vadas; Hubler, Alfred

    Lessons and homework problems involving a pendulum are often a big part of introductory physics classes and laboratory courses from high school to undergraduate levels. Although laboratory equipment for pendulum experiments is commercially available, it is often expensive and may not be affordable for teachers on fixed budgets, particularly in developing countries. We present a low-cost, easy-to-build rotary sensor pendulum using the existing hardware in a ball-type computer mouse. We demonstrate how this apparatus may be used to measure both the frequency and coefficient of damping of a simple physical pendulum. This easily constructed laboratory equipment makes it possible for all students to have hands-on experience with one of the most important simple physical systems.

  6. International survey on willingness-to-pay (WTP) for one additional QALY gained: what is the threshold of cost effectiveness?

    PubMed

    Shiroiwa, Takeru; Sung, Yoon-Kyoung; Fukuda, Takashi; Lang, Hui-Chu; Bae, Sang-Cheol; Tsutani, Kiichiro

    2010-04-01

    Although the threshold of cost effectiveness of medical interventions is thought to be 20 000-30 000 UK pounds in the UK, and $50 000-$100 000 in the US, it is well known that these values are unjustified, due to lack of explicit scientific evidence. We measured willingness-to-pay (WTP) for one additional quality-adjusted life-year gained to determine the threshold of the incremental cost-effectiveness ratio. Our study used the Internet to compare WTP for an additional year of survival in a perfect state of health in Japan, the Republic of Korea (ROK), Taiwan, Australia, the UK, and the US. The research utilized a double-bound dichotomous choice, and analysis by the nonparametric Turnbull method. WTP values were JPY 5 million (Japan), KWN 68 million (ROK), NT$ 2.1 million (Taiwan), 23 000 UK pounds (UK), AU$ 64 000 (Australia), and US$ 62 000 (US). The discount rates of outcome were estimated at 6.8% (Japan), 3.7% (ROK), 1.6% (Taiwan), 2.8% (UK), 1.9% (Australia), and 3.2% (US). Based on the current study, we suggest a new classification of the cost-effectiveness plane and a methodology for decision making. Copyright (c) 2009 John Wiley & Sons, Ltd.

  7. The prevalence of phosphorus-containing food additives in top-selling foods in grocery stores.

    PubMed

    León, Janeen B; Sullivan, Catherine M; Sehgal, Ashwini R

    2013-07-01

    The objective of this study was to determine the prevalence of phosphorus-containing food additives in best-selling processed grocery products and to compare the phosphorus content of a subset of top-selling foods with and without phosphorus additives. The labels of 2394 best-selling branded grocery products in northeast Ohio were reviewed for phosphorus additives. The top 5 best-selling products containing phosphorus additives from each food category were matched with similar products without phosphorus additives and analyzed for phosphorus content. Four days of sample meals consisting of foods with and without phosphorus additives were created, and daily phosphorus and pricing differentials were computed. The main outcome measures were the presence of phosphorus-containing food additives and phosphorus content. Forty-four percent of the best-selling grocery items contained phosphorus additives. The additives were particularly common in prepared frozen foods (72%), dry food mixes (70%), packaged meat (65%), bread and baked goods (57%), soup (54%), and yogurt (51%) categories. Phosphorus additive-containing foods averaged 67 mg phosphorus/100 g more than matched nonadditive-containing foods (P = .03). Sample meals comprised mostly of phosphorus additive-containing foods had 736 mg more phosphorus per day compared with meals consisting of only additive-free foods. Phosphorus additive-free meals cost an average of $2.00 more per day. Phosphorus additives are common in best-selling processed groceries and contribute significantly to their phosphorus content. Moreover, phosphorus additive foods are less costly than phosphorus additive-free foods. As a result, persons with chronic kidney disease may purchase these popular low-cost groceries and unknowingly increase their intake of highly bioavailable phosphorus. Copyright © 2013 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  8. Effectiveness and cost-effectiveness of computer and other electronic aids for smoking cessation: a systematic review and network meta-analysis.

    PubMed

    Chen, Y-F; Madan, J; Welton, N; Yahaya, I; Aveyard, P; Bauld, L; Wang, D; Fry-Smith, A; Munafò, M R

    2012-01-01

    Smoking is harmful to health. On average, lifelong smokers lose 10 years of life, and about half of all lifelong smokers have their lives shortened by smoking. Stopping smoking reverses or prevents many of these harms. However, cessation services in the NHS achieve variable success rates with smokers who want to quit. Approaches to behaviour change can be supplemented with electronic aids, and this may significantly increase quit rates and prevent a proportion of cases that relapse. The primary research question we sought to answer was: What is the effectiveness and cost-effectiveness of internet, PC and other electronic aids to help people stop smoking? We addressed the following three questions: (1) What is the effectiveness of internet sites, computer programs, mobile telephone text messages and other electronic aids for smoking cessation and/or reducing relapse? (2) What is the cost-effectiveness of incorporating internet sites, computer programs, mobile telephone text messages and other electronic aids into current NHS smoking cessation programmes? and (3) What are the current gaps in research into the effectiveness of internet sites, computer programs, mobile telephone text messages and other electronic aids to help people stop smoking? For the effectiveness review, relevant primary studies were sought from The Cochrane Library [Cochrane Central Register of Controlled Trials (CENTRAL)] 2009, Issue 4, and MEDLINE (Ovid), EMBASE (Ovid), PsycINFO (Ovid), Health Management Information Consortium (HMIC) (Ovid) and Cumulative Index to Nursing and Allied Health Literature (CINAHL) (EBSCOhost) from 1980 to December 2009. In addition, NHS Economic Evaluation Database (NHS EED) and Database of Abstracts of Reviews of Effects (DARE) were searched for information on cost-effectiveness and modelling for the same period. Reference lists of included studies and of relevant systematic reviews were examined to identify further potentially relevant studies. 
Research registries

  9. Energy sources for laparoscopic colectomy: a prospective randomized comparison of conventional electrosurgery, bipolar computer-controlled electrosurgery and ultrasonic dissection. Operative outcome and costs analysis.

    PubMed

    Targarona, Eduardo Ma; Balague, Carmen; Marin, Juan; Neto, Rene Berindoague; Martinez, Carmen; Garriga, Jordi; Trias, Manuel

    2005-12-01

    The development of operative laparoscopic surgery is linked to advances in ancillary surgical instrumentation. Ultrasonic energy devices avoid the use of electricity and provide effective control of small- to medium-sized vessels. Bipolar computer-controlled electrosurgical technology eliminates the disadvantages of electrical energy, and a mechanical blade adds a cutting action. This instrument can provide effective hemostasis of large vessels up to 7 mm. Such devices significantly increase the cost of laparoscopic procedures, however, and the amount of evidence-based information on this topic is surprisingly scarce. This study compared the effectiveness of three different energy sources on the laparoscopic performance of a left colectomy. The trial included 38 nonselected patients with a disease of the colon requiring an elective segmental left-sided colon resection. Patients were preoperatively randomized into three groups. Group I had electrosurgery; vascular dissection was performed entirely with an electrosurgery generator, and vessels were controlled with clips. Group II underwent computer-controlled bipolar electrosurgery; vascular and mesocolon section was completed by using the 10-mm Ligasure device alone. In group III, 5-mm ultrasonic shears (Harmonic Scalpel) were used for bowel dissection, vascular pedicle dissection, and mesocolon transection. The mesenteric vessel pedicle was controlled with an endostapler. Demographics (age, sex, body mass index, comorbidity, previous surgery and diagnoses requiring surgery) were recorded, as were surgical details (operative time, conversion, blood loss), additional disposable instruments (number of trocars, EndoGIA charges, and clip appliers), and clinical outcome. Intraoperative economic costs were also evaluated. End points of the trial were operative time and intraoperative blood loss, and an intention-to-treat principle was followed. The three groups were well matched for demographic and pathologic features

  10. Network Computer Technology. Phase I: Viability and Promise within NASA's Desktop Computing Environment

    NASA Technical Reports Server (NTRS)

    Paluzzi, Peter; Miller, Rosalind; Kurihara, West; Eskey, Megan

    1998-01-01

    Over the past several months, major industry vendors have made a business case for the network computer as a win-win solution toward lowering total cost of ownership. This report provides results from Phase I of the Ames Research Center network computer evaluation project. It identifies factors to be considered for determining cost of ownership; further, it examines where, when, and how network computer technology might fit in NASA's desktop computing architecture.

  11. Task inhibition, conflict, and the n-2 repetition cost: A combined computational and empirical approach.

    PubMed

    Sexton, Nicholas J; Cooper, Richard P

    2017-05-01

    Task inhibition (also known as backward inhibition) is an hypothesised form of cognitive inhibition evident in multi-task situations, with the role of facilitating switching between multiple, competing tasks. This article presents a novel cognitive computational model of a backward inhibition mechanism. By combining aspects of previous cognitive models in task switching and conflict monitoring, the model instantiates the theoretical proposal that backward inhibition is the direct result of conflict between multiple task representations. In a first simulation, we demonstrate that the model produces two effects widely observed in the empirical literature, specifically, reaction time costs for both (n-1) task switches and n-2 task repeats. Through a systematic search of parameter space, we demonstrate that these effects are a general property of the model's theoretical content, and not specific parameter settings. We further demonstrate that the model captures previously reported empirical effects of inter-trial interval on n-2 switch costs. A final simulation extends the paradigm of switching between tasks of asymmetric difficulty to three tasks, and generates novel predictions for n-2 repetition costs. Specifically, the model predicts that n-2 repetition costs associated with hard-easy-hard alternations are greater than for easy-hard-easy alternations. Finally, we report two behavioural experiments testing this hypothesis, with results consistent with the model predictions. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
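    The n-2 repetition cost discussed in this abstract is defined over task sequences: a trial is an "n-2 repeat" (an ABA pattern) when the current task matches the task from two trials back, and an "n-2 switch" (CBA) otherwise. A minimal sketch of that trial classification, with an illustrative sequence (this is only the bookkeeping, not the authors' computational model):

```python
# Classify trials in a task-switching sequence as n-2 repeats (ABA) or
# n-2 switches (CBA). Backward inhibition predicts slower responses on
# n-2 repeats, because a recently inhibited task must be reactivated.

def classify_n2(tasks):
    labels = []
    for i, task in enumerate(tasks):
        if i < 2 or task == tasks[i - 1]:
            labels.append(None)          # too early in sequence, or immediate repeat
        elif task == tasks[i - 2]:
            labels.append("n-2 repeat")  # ABA pattern
        else:
            labels.append("n-2 switch")  # CBA pattern
    return labels

sequence = ["A", "B", "A", "C", "B", "C"]
print(classify_n2(sequence))
# [None, None, 'n-2 repeat', 'n-2 switch', 'n-2 switch', 'n-2 repeat']
```

The n-2 repetition cost is then the mean reaction time on "n-2 repeat" trials minus that on "n-2 switch" trials.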

  12. Cost-effectiveness of digital subtraction angiography in the setting of computed tomographic angiography negative subarachnoid hemorrhage.

    PubMed

    Jethwa, Pinakin R; Punia, Vineet; Patel, Tapan D; Duffis, E Jesus; Gandhi, Chirag D; Prestigiacomo, Charles J

    2013-04-01

    Recent studies have documented the high sensitivity of computed tomography angiography (CTA) in detecting a ruptured aneurysm in the presence of acute subarachnoid hemorrhage (SAH). The practice of digital subtraction angiography (DSA) when CTA does not reveal an aneurysm has thus been called into question. We examined this dilemma from a cost-effectiveness perspective by using current decision analysis techniques. A decision tree was created with the use of TreeAge Pro Suite 2012; in 1 arm, a CTA-negative SAH was followed up with DSA; in the other arm, patients were observed without further imaging. Based on literature review, costs and utilities were assigned to each potential outcome. Base-case and sensitivity analyses were performed to determine the cost-effectiveness of each strategy. A Monte Carlo simulation was then conducted by sampling each variable over a plausible distribution to evaluate the robustness of the model. With the use of a negative predictive value of 95.7% for CTA, observation was found to be the most cost-effective strategy ($6737/Quality Adjusted Life Year [QALY] vs $8460/QALY) in the base-case analysis. One-way sensitivity analysis demonstrated that DSA became the more cost-effective option if the negative predictive value of CTA fell below 93.72%. The Monte Carlo simulation produced an incremental cost-effectiveness ratio of $83,083/QALY. At the conventional willingness-to-pay threshold of $50,000/QALY, observation was the more cost-effective strategy in 83.6% of simulations. The decision to perform a DSA in CTA-negative SAH depends strongly on the sensitivity of CTA, and therefore must be evaluated at each center treating these types of patients. Given the high sensitivity of CTA reported in the current literature, performing DSA on all patients with CTA-negative SAH may not be cost-effective at every institution.
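    The sensitivity analysis in this abstract reduces to a threshold rule: follow-up DSA becomes the preferred strategy only when CTA's negative predictive value (NPV) falls below a break-even point. A minimal sketch of that rule; only the 93.72% threshold and the 95.7% base-case NPV are taken from the abstract, and the decision rule is a simplification of the full decision-tree model:

```python
# Hypothetical sketch of the threshold logic from the abstract: for
# CTA-negative SAH, DSA follow-up is preferred only when the local NPV of
# CTA falls below the break-even value found in the sensitivity analysis.

NPV_BREAK_EVEN = 0.9372  # threshold reported in the abstract

def preferred_strategy(cta_npv):
    """Pick a strategy for CTA-negative SAH given the local NPV of CTA."""
    return "DSA follow-up" if cta_npv < NPV_BREAK_EVEN else "observation"

print(preferred_strategy(0.957))  # observation (base-case NPV of 95.7%)
print(preferred_strategy(0.930))  # DSA follow-up
```

This framing makes explicit why the authors conclude the decision "must be evaluated at each center": the relevant input is each institution's own NPV for CTA.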

  13. Additive Manufacturing for Cost Efficient Production of Compact Ceramic Heat Exchangers and Recuperators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shulman, Holly; Ross, Nicole

    2015-10-30

    An additive manufacture technique known as laminated object manufacturing (LOM) was used to fabricate compact ceramic heat exchanger prototypes. LOM uses precision CO2 laser cutting of ceramic green tapes, which are then precision stacked to build a 3D object with fine internal features. Modeling was used to develop prototype designs and predict the thermal response, stress, and efficiency in the ceramic heat exchangers. Build testing and materials analyses were used to provide feedback for the design selection. During this development process, laminated object manufacturing protocols were established. This included laser optimization, strategies for fine feature integrity, lamination fluid control, green handling, and firing profile. Three full size prototypes were fabricated using two different designs. One prototype was selected for performance testing. During testing, cross talk leakage prevented the application of a high pressure differential; however, the prototype was successful at withstanding the high temperature operating conditions (1300 °F). In addition, analysis showed that the bulk of the part did not have cracks or leakage issues. This led to the development of a module method for next generation LOM heat exchangers. A scale-up cost analysis showed that given a purpose built LOM system, these ceramic heat exchangers would be affordable for the applications.

  14. Computer program to assess impact of fatigue and fracture criteria on weight and cost of transport aircraft

    NASA Technical Reports Server (NTRS)

    Tanner, C. J.; Kruse, G. S.; Oman, B. H.

    1975-01-01

    A preliminary design analysis tool for rapidly performing trade-off studies involving fatigue, fracture, static strength, weight, and cost is presented. Analysis subprograms were developed for fatigue life, crack growth life, and residual strength; and linked to a structural synthesis module which in turn was integrated into a computer program. The part definition module of a cost and weight analysis program was expanded to be compatible with the upgraded structural synthesis capability. The resultant vehicle design and evaluation program is named VDEP-2. It is an accurate and useful tool for estimating purposes at the preliminary design stage of airframe development. A sample case along with an explanation of program applications and input preparation is presented.

  15. The updated algorithm of the Energy Consumption Program (ECP): A computer model simulating heating and cooling energy loads in buildings

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.; Strain, D. M.; Chai, V. W.; Higgins, S.

    1979-01-01

    The Energy Consumption Program (ECP) was developed to simulate building heating and cooling loads and to compute thermal and electric energy consumption and cost. This article reports on the new algorithms and modifications made in an effort to widen the areas of application. The program structure was rewritten accordingly to refine and advance the building model and to further reduce the processing time and cost. The program is noted for its very low cost and ease of use compared to other available codes. The accuracy of computations is not sacrificed, however, since the results are expected to lie within ±10% of actual energy meter readings.

  16. Situational Awareness from a Low-Cost Camera System

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.; Ward, David; Lesage, John

    2010-01-01

    A method gathers scene information from a low-cost camera system. Existing surveillance systems using sufficient cameras for continuous coverage of a large field necessarily generate enormous amounts of raw data. Digitizing and channeling that data to a central computer and processing it in real time is difficult when using low-cost, commercially available components. A newly developed system is located on a combined power and data wire to form a string-of-lights camera system. Each camera is accessible through this network interface using standard TCP/IP networking protocols. The cameras more closely resemble cell-phone cameras than traditional security camera systems. Processing capabilities are built directly onto the camera backplane, which helps maintain a low cost. The low power requirements of each camera allow the creation of a single imaging system comprising over 100 cameras. Each camera has built-in processing capabilities to detect events and cooperatively share this information with neighboring cameras. The location of the event is reported to the host computer in Cartesian coordinates computed from data correlation across multiple cameras. In this way, events in the field of view can present low-bandwidth information to the host rather than high-bandwidth bitmap data constantly being generated by the cameras. This approach offers greater flexibility than conventional systems, without compromising performance, through using many small, low-cost cameras with overlapping fields of view. This means significantly increased viewing without ignoring surveillance areas, which can occur when pan, tilt, and zoom cameras look away. Additionally, due to the sharing of a single cable for power and data, the installation costs are lower. The technology is targeted toward 3D scene extraction and automatic target tracking for military and commercial applications. Security systems and environmental/vehicular monitoring systems are also potential applications.

  17. Accelerating statistical image reconstruction algorithms for fan-beam x-ray CT using cloud computing

    NASA Astrophysics Data System (ADS)

    Srivastava, Somesh; Rao, A. Ravishankar; Sheinin, Vadim

    2011-03-01

    Statistical image reconstruction algorithms potentially offer many advantages to x-ray computed tomography (CT), e.g. lower radiation dose. But, their adoption in practical CT scanners requires extra computation power, which is traditionally provided by incorporating additional computing hardware (e.g. CPU-clusters, GPUs, FPGAs etc.) into a scanner. An alternative solution is to access the required computation power over the internet from a cloud computing service, which is orders-of-magnitude more cost-effective. This is because users only pay a small pay-as-you-go fee for the computation resources used (i.e. CPU time, storage etc.), and completely avoid purchase, maintenance and upgrade costs. In this paper, we investigate the benefits and shortcomings of using cloud computing for statistical image reconstruction. We parallelized the most time-consuming parts of our application, the forward and back projectors, using MapReduce, the standard parallelization library on clouds. From preliminary investigations, we found that a large speedup is possible at a very low cost. But, communication overheads inside MapReduce can limit the maximum speedup, and a better MapReduce implementation might become necessary in the future. All the experiments for this paper, including development and testing, were completed on the Amazon Elastic Compute Cloud (EC2) for less than $20.

  18. New Federal Cost Accounting Regulations

    ERIC Educational Resources Information Center

    Wolff, George J.; Handzo, Joseph J.

    1973-01-01

    Discusses a new set of indirect cost accounting procedures which must be followed by school districts wishing to recover any indirect costs of administering federal grants and contracts. Also discusses the amount of indirect costs that may be recovered, computing indirect costs, classifying project costs, and restricted grants. (Author/DN)

  19. Cost-effectiveness of intensive multifactorial treatment compared with routine care for individuals with screen-detected Type 2 diabetes: analysis of the ADDITION-UK cluster-randomized controlled trial

    PubMed Central

    Tao, L; Wilson, E C F; Wareham, N J; Sandbæk, A; Rutten, G E H M; Lauritzen, T; Khunti, K; Davies, M J; Borch-Johnsen, K; Griffin, S J; Simmons, R K

    2015-01-01

    Aims To examine the short- and long-term cost-effectiveness of intensive multifactorial treatment compared with routine care among people with screen-detected Type 2 diabetes. Methods Cost–utility analysis in ADDITION-UK, a cluster-randomized controlled trial of early intensive treatment in people with screen-detected diabetes in 69 UK general practices. Unit treatment costs and utility decrement data were taken from published literature. Accumulated costs and quality-adjusted life years (QALYs) were calculated using ADDITION-UK data from 1 to 5 years (short-term analysis, n = 1024); trial data were extrapolated to 30 years using the UKPDS outcomes model (version 1.3) (long-term analysis; n = 999). All costs were transformed to the UK 2009/10 price level. Results Adjusted incremental costs to the NHS were £285, £935, £1190 and £1745 over a 1-, 5-, 10- and 30-year time horizon, respectively (discounted at 3.5%). Adjusted incremental QALYs were 0.0000, – 0.0040, 0.0140 and 0.0465 over the same time horizons. Point estimate incremental cost-effectiveness ratios (ICERs) suggested that the intervention was not cost-effective although the ratio improved over time: the ICER over 10 years was £82 250, falling to £37 500 over 30 years. The ICER fell below £30 000 only when the intervention cost was below £631 per patient: we estimated the cost at £981. Conclusion Given conventional thresholds of cost-effectiveness, the intensive treatment delivered in ADDITION was not cost-effective compared with routine care for individuals with screen-detected diabetes in the UK. The intervention may be cost-effective if it can be delivered at reduced cost. PMID:25661661
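    The ICERs quoted above are simple ratios of incremental cost to incremental QALYs; a minimal check using the abstract's 30-year figures:

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return delta_cost / delta_qaly

# 30-year horizon, discounted at 3.5% (figures from the abstract):
# incremental cost £1,745, incremental QALYs 0.0465.
print(round(icer(1745, 0.0465)))  # 37527, i.e. the ~£37,500 reported
```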

  20. Incentive motivation deficits in schizophrenia reflect effort computation impairments during cost-benefit decision-making.

    PubMed

    Fervaha, Gagan; Graff-Guerrero, Ariel; Zakzanis, Konstantine K; Foussias, George; Agid, Ofer; Remington, Gary

    2013-11-01

    Motivational impairments are a core feature of schizophrenia and, although there are numerous reports studying this feature using clinical rating scales, objective behavioural assessments are lacking. Here, we use a translational paradigm to measure incentive motivation in individuals with schizophrenia. Sixteen stable outpatients with schizophrenia and sixteen matched healthy controls completed a modified version of the Effort Expenditure for Rewards Task that accounts for differences in motoric ability. Briefly, subjects were presented with a series of trials in which they could choose to expend a greater amount of effort for a larger monetary reward versus less effort for a smaller reward. Additionally, the probability of receiving money for a given trial was varied at 12%, 50% and 88%. Clinical and other reward-related variables were also evaluated. Patients opted to expend greater effort significantly less often than controls for trials of high but uncertain (i.e. 50% and 88% probability) incentive value, which was related to amotivation and neurocognitive deficits. Other abnormalities were also noted but were related to different clinical variables such as impulsivity (low reward and 12% probability). These motivational deficits were not due to group differences in reward learning, reward valuation or hedonic capacity. Our findings offer novel support for incentive motivation deficits in schizophrenia. Clinical amotivation is associated with impairments in the computation of effort during cost-benefit decision-making. This objective translational paradigm may guide future investigations of the neural circuitry underlying these motivational impairments. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Protecting child health and nutrition status with ready-to-use food in addition to food assistance in urban Chad: a cost-effectiveness analysis

    PubMed Central

    2013-01-01

    Background Despite growing interest in use of lipid nutrient supplements for preventing child malnutrition and morbidity, there is inconclusive evidence on the effectiveness, and no evidence on the cost-effectiveness, of this strategy. Methods A cost-effectiveness analysis was conducted comparing costs and outcomes of two arms of a cluster randomized controlled trial implemented in eastern Chad during the 2010 hunger gap by Action contre la Faim France and Ghent University. This trial assessed the effect on child malnutrition and morbidity of a 5-month general distribution of staple rations, or staple rations plus a ready-to-use supplementary food (RUSF). RUSF was distributed to households with a child aged 6–36 months who was not acutely malnourished (weight-for-height ≥ 80% of the NCHS reference median, and absence of bilateral pitting edema), to prevent acute malnutrition in these children. While the addition of RUSF to a staple ration did not result in significant reduction in wasting rates, cost-effectiveness was assessed using the secondary outcomes for which the intervention was effective: episodes of diarrhea and cases of anemia (hemoglobin <110 g/L) averted among children receiving RUSF. Total costs of the program and incremental costs of RUSF and related management and logistics were estimated using accounting records and key informant interviews, and include costs to institutions and communities. An activity-based costing methodology was applied and incremental costs were calculated per episode of diarrhea and case of anemia averted. Results Adding RUSF to a general food distribution increased total costs by 23%, resulting in an additional cost per child of 374 EUR, and an incremental cost per episode of diarrhea averted of 1,083 EUR and per case of anemia averted of 3,627 EUR. Conclusions Adding RUSF to a staple ration was less cost-effective than other standard intervention options for averting diarrhea and anemia. This strategy holds potential to address a broad array of health and

  2. A precise goniometer/tensiometer using a low cost single-board computer

    NASA Astrophysics Data System (ADS)

    Favier, Benoit; Chamakos, Nikolaos T.; Papathanasiou, Athanasios G.

    2017-12-01

    Measuring the surface tension and the Young contact angle of a droplet is extremely important for many industrial applications. Here, considering the booming interest for small and cheap but precise experimental instruments, we have constructed a low-cost contact angle goniometer/tensiometer, based on a single-board computer (Raspberry Pi). The device runs an axisymmetric drop shape analysis (ADSA) algorithm written in Python. The code, here named DropToolKit, was developed in-house. We initially present the mathematical framework of our algorithm and then we validate our software tool against other well-established ADSA packages, including the commercial ramé-hart DROPimage Advanced as well as the DropAnalysis plugin in ImageJ. After successfully testing for various combinations of liquids and solid surfaces, we concluded that our prototype device would be highly beneficial for industrial applications as well as for scientific research in wetting phenomena compared to the commercial solutions.
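    Full ADSA fits the imaged drop profile to the Young-Laplace equation. As a much cruder illustration of optical contact-angle measurement (not the DropToolKit algorithm), a spherical-cap approximation estimates the angle from drop height and contact-line width alone:

```python
import math

def contact_angle_spherical_cap(height, base_width):
    """Sessile-drop contact angle (degrees), assuming a spherical-cap shape.
    Valid only for small drops where gravitational flattening is negligible;
    a full ADSA solver fits the Young-Laplace equation instead."""
    return math.degrees(2.0 * math.atan2(2.0 * height, base_width))

# A hemispherical drop (height = radius, base width = diameter) sits at 90°.
print(contact_angle_spherical_cap(1.0, 2.0))  # → 90.0
```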

  3. An economic evaluation of a video- and text-based computer-tailored intervention for smoking cessation: a cost-effectiveness and cost-utility analysis of a randomized controlled trial.

    PubMed

    Stanczyk, Nicola E; Smit, Eline S; Schulz, Daniela N; de Vries, Hein; Bolman, Catherine; Muris, Jean W M; Evers, Silvia M A A

    2014-01-01

    Although evidence exists for the effectiveness of web-based smoking cessation interventions, information about the cost-effectiveness of these interventions is limited. The study investigated the cost-effectiveness and cost-utility of two web-based computer-tailored (CT) smoking cessation interventions (video- vs. text-based CT) compared to a control condition that received general text-based advice. In a randomized controlled trial, respondents were allocated to the video-based condition (N = 670), the text-based condition (N = 708) or the control condition (N = 721). Societal costs, smoking status, and quality-adjusted life years (QALYs; EQ-5D-3L) were assessed at baseline and at six- and twelve-month follow-up. The incremental costs per abstinent respondent and per QALYs gained were calculated. To account for uncertainty, bootstrapping techniques and sensitivity analyses were carried out. No significant differences were found among the three conditions regarding demographics, baseline values of outcomes and societal costs over the three months prior to baseline. Analyses using prolonged abstinence as outcome measure indicated that from a willingness to pay of €1,500, the video-based intervention was likely to be the most cost-effective treatment, whereas from a willingness to pay of €50,400, the text-based intervention was likely to be the most cost-effective. With regard to cost-utilities, when quality of life was used as outcome measure, the control condition had the highest probability of being the most preferable treatment. Sensitivity analyses yielded comparable results. The video-based CT smoking cessation intervention was the most cost-effective treatment for smoking abstinence after twelve months, varying the willingness to pay per abstinent respondent from €0 up to €80,000. With regard to cost-utility, the control condition seemed to be the most preferable treatment. Probably, more time will be required to assess changes in quality of life

  4. The Additional Costs per Month of Progression-Free Survival and Overall Survival: An Economic Model Comparing Everolimus with Cabozantinib, Nivolumab, and Axitinib for Second-Line Treatment of Metastatic Renal Cell Carcinoma.

    PubMed

    Swallow, Elyse; Messali, Andrew; Ghate, Sameer; McDonald, Evangeline; Duchesneau, Emilie; Perez, Jose Ricardo

    2018-04-01

    When considering optimal second-line treatments for metastatic renal cell carcinoma (mRCC), clinicians and payers seek to understand the relative clinical benefits and costs of treatment. To use an economic model to compare the additional cost per month of overall survival (OS) and of progression-free survival (PFS) for cabozantinib, nivolumab, and axitinib with everolimus for the second-line treatment of mRCC from a third-party U.S. payer perspective. The model evaluated mean OS and PFS and costs associated with drug acquisition/administration; adverse event (AE) treatment; monitoring; and postprogression (third-line treatment, monitoring, and end-of-life costs) over 1- and 2-year horizons. Efficacy, safety, and treatment duration inputs were estimated from regimens' pivotal clinical trials; for everolimus, results were weighted across trials. Mean 1- and 2-year OS and mean 1-year PFS were estimated using regimens' reported OS and PFS Kaplan-Meier curves. Dosing and administration inputs were consistent with approved prescribing information and the clinical trials used to estimate efficacy and safety inputs. Cost inputs came from published literature and public data. Additional cost per additional month of OS or PFS was calculated using the ratio of the cost difference per treated patient and the corresponding difference in mean OS or PFS between everolimus and each comparator. One-way sensitivity analyses were conducted by varying efficacy and cost inputs. Compared with everolimus, cabozantinib, nivolumab, and axitinib were associated with 1.6, 0.3, and 0.5 additional months of PFS, respectively, over 1 year. Cabozantinib and nivolumab were associated with additional months of OS compared with everolimus (1 year: 0.7 and 0.8 months; 2 years: 1.6 and 2.3 months; respectively); axitinib was associated with fewer months (1 year: -0.2 months; 2 years: -0.7 months). The additional costs of treatment with cabozantinib, nivolumab, or axitinib versus everolimus over 1
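    The model's headline metric is a simple ratio. Since the abstract's cost figures are truncated here, the cost difference below is hypothetical; only the 0.7-month OS gain for cabozantinib over 1 year is taken from the abstract.

```python
def cost_per_additional_month(delta_cost, delta_months):
    """The model's headline ratio: extra cost per extra month of OS or PFS."""
    if delta_months <= 0:
        raise ValueError("comparator gained no additional months")
    return delta_cost / delta_months

# Hypothetical $50,000 cost difference vs. everolimus, against cabozantinib's
# reported 0.7 additional months of OS over the 1-year horizon:
print(round(cost_per_additional_month(50_000, 0.7)))  # ≈ 71429 USD per month
```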

  5. Ship Compliance in Emission Control Areas: Technology Costs and Policy Instruments.

    PubMed

    Carr, Edward W; Corbett, James J

    2015-08-18

    This paper explores whether a Panama Canal Authority pollution tax could be an effective economic instrument to achieve Emission Control Area (ECA)-like reductions in emissions from ships transiting the Panama Canal. This tariff-based policy action, whereby vessels in compliance with International Maritime Organisation (IMO) ECA standards pay a lower transit tariff than noncompliant vessels, could be a feasible alternative to petitioning for a Panamanian ECA through the IMO. A $4.06/container fuel tax could incentivize ECA-compliant emissions reductions for nearly two-thirds of Panama Canal container vessels, mainly through fuel switching; if the vessel(s) also operate in IMO-defined ECAs, exhaust-gas treatment technologies may be cost-effective. The RATES model presented here compares current abatement technologies based on hours of operation within an ECA, computing costs for a container vessel to comply with ECA standards in addition to computing the Canal tax that would reduce emissions in Panama. Retrofitted open-loop scrubbers are cost-effective only for vessels operating within an ECA for more than 4500 h annually. Fuel switching is the least-cost option to industry for vessels that operate mostly outside of ECA regions, whereas vessels operating entirely within an ECA region could reduce compliance cost with exhaust-gas treatment technology (scrubbers).
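    The scrubber-versus-fuel-switching trade-off reduces to a breakeven calculation on annual ECA hours. The costs below are invented for illustration (chosen so the example lands on the abstract's reported ~4,500 h threshold) and are not the RATES model's actual inputs.

```python
def breakeven_eca_hours(scrubber_annualized_cost, hfo_cost_per_h, mgo_cost_per_h):
    """Annual ECA hours above which a scrubber beats switching to cleaner fuel.
    The scrubber lets the vessel keep burning cheaper heavy fuel oil (HFO)
    instead of marine gas oil (MGO) while operating inside an ECA."""
    saving_per_h = mgo_cost_per_h - hfo_cost_per_h  # fuel cost avoided per hour
    return scrubber_annualized_cost / saving_per_h

# Invented figures: $900k annualized scrubber cost, $200/h fuel-cost spread.
print(breakeven_eca_hours(900_000, 600, 800))  # 4500.0 hours per year
```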

  6. The Effects of Computer-Assisted Instruction on Student Achievement in Addition and Subtraction at First Grade Level.

    ERIC Educational Resources Information Center

    Spivey, Patsy M.

    This study was conducted to determine whether the traditional classroom approach to instruction involving the addition and subtraction of number facts (digits 0-6) is more or less effective than the traditional classroom approach plus a commercially-prepared computer game. A pretest-posttest control group design was used with two groups of first…

  7. Near DC eddy current measurement of aluminum multilayers using MR sensors and commodity low-cost computer technology

    NASA Astrophysics Data System (ADS)

    Perry, Alexander R.

    2002-06-01

    Low Frequency Eddy Current (EC) probes are capable of measurement from 5 MHz down to DC through the use of Magnetoresistive (MR) sensors. Choosing components with appropriate electrical specifications allows them to be matched to the power and impedance characteristics of standard computer connectors. This permits direct attachment of the probe to inexpensive computers, thereby eliminating external power supplies, amplifiers and modulators that have heretofore precluded very low system purchase prices. Such price reduction is key to increased market penetration in General Aviation maintenance and consequent reduction in recurring costs. This paper examines our computer software CANDETECT, which implements this approach and permits effective probe operation. Results are presented to show the intrinsic sensitivity of the software and demonstrate its practical performance when seeking cracks in the underside of a thick aluminum multilayer structure. The majority of the General Aviation light aircraft fleet uses rivets and screws to attach sheet aluminum skin to the airframe, resulting in similar multilayer lap joints.

  8. Cost, cost-efficiency and cost-effectiveness of integrated family planning and HIV services.

    PubMed

    Shade, Starley B; Kevany, Sebastian; Onono, Maricianah; Ochieng, George; Steinfeld, Rachel L; Grossman, Daniel; Newmann, Sara J; Blat, Cinthia; Bukusi, Elizabeth A; Cohen, Craig R

    2013-10-01

    To evaluate costs, cost-efficiency and cost-effectiveness of integration of family planning into HIV services. Integration of family planning services into HIV care and treatment clinics. A cluster-randomized trial. Twelve health facilities in Nyanza, Kenya were randomized to integrate family planning into HIV care and treatment; six health facilities were randomized to (nonintegrated) standard-of-care with separately delivered family planning and HIV services. We assessed costs, cost-efficiency (cost per additional use of more effective family planning), and cost-effectiveness (cost per pregnancy averted) associated with the first year of integration of family planning into HIV care. More effective family planning methods included oral and injectable contraceptives, subdermal implants, intrauterine device, and female and male sterilization. We collected cost data through interviews with study staff and review of financial records to determine costs of service integration. Integration of services was associated with an average marginal cost of $841 per site and $48 per female patient. Average overall and marginal costs of integration were associated with personnel costs [initial ($1003 vs. $872) and refresher ($498 vs. $330) training, mentoring ($1175 vs. $902) and supervision ($1694 vs. $1636)], with fewer resources required for other fixed ($18 vs. $0) and recurring expenses ($471 vs. $287). Integration was associated with a marginal cost of $65 for each additional use of more effective family planning and $1368 for each pregnancy averted. Integration of family planning and HIV services is feasible, inexpensive to implement, and cost-efficient in the Kenyan setting, and thus supports current Kenyan integration policy.

  9. ANL/RBC: A computer code for the analysis of Rankine bottoming cycles, including system cost evaluation and off-design performance

    NASA Technical Reports Server (NTRS)

    Mclennan, G. A.

    1986-01-01

    This report describes, and is a User's Manual for, a computer code (ANL/RBC) which calculates cycle performance for Rankine bottoming cycles extracting heat from a specified source gas stream. The code calculates cycle power and efficiency and the sizes for the heat exchangers, using tabular input of the properties of the cycle working fluid. An option is provided to calculate the costs of system components from user defined input cost functions. These cost functions may be defined in equation form or by numerical tabular data. A variety of functional forms have been included for these functions and they may be combined to create very general cost functions. An optional calculation mode can be used to determine the off-design performance of a system when operated away from the design-point, using the heat exchanger areas calculated for the design-point.

  10. Cloud computing for comparative genomics.

    PubMed

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
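    A back-of-envelope check of the reported figures (not a calculation from the paper itself): 100 nodes running for just under 70 hours at a total of $6,302 implies roughly $0.90 per instance-hour.

```python
# Back-of-envelope check of the abstract's reported figures.
nodes = 100          # high-capacity EC2 compute nodes
wall_hours = 70      # "just under 70 hours" of total computation
total_cost = 6302.0  # USD

instance_hours = nodes * wall_hours
print(instance_hours)                         # 7000
print(round(total_cost / instance_hours, 2))  # ~0.9 USD per instance-hour
```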

  11. Distributed computing environments for future space control systems

    NASA Technical Reports Server (NTRS)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  12. Computer-generated holograms by multiple wavefront recording plane method with occlusion culling.

    PubMed

    Symeonidou, Athanasia; Blinder, David; Munteanu, Adrian; Schelkens, Peter

    2015-08-24

    We propose a novel fast method for full-parallax computer-generated holograms with occlusion processing, suitable for volumetric data such as point clouds. A novel light-wave propagation strategy relying on the sequential use of the wavefront recording plane method is proposed, which employs look-up tables in order to reduce the computational complexity in the calculation of the fields. Also, a novel technique for occlusion culling with little additional computation cost is introduced. Additionally, the method applies a Gaussian distribution to the individual points in order to improve visual quality. Performance tests show that for a full-parallax high-definition CGH a speedup factor of more than 2,500 compared to the ray-tracing method can be achieved without hardware acceleration.
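    The look-up-table idea, caching a precomputed field contribution per depth so that repeated points reuse it rather than recomputing, can be sketched as follows. The Fresnel-phase row and all parameters are invented for illustration and are not the paper's actual tables.

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def psf_row(depth, width=8, wavelength=0.5e-6, pitch=1e-6):
    """One cached row of a quadratic (Fresnel) phase pattern for a point at
    the given depth from the wavefront recording plane. Illustrative only."""
    return tuple(
        math.cos(math.pi * (i * pitch) ** 2 / (wavelength * depth))
        for i in range(width)
    )

row_a = psf_row(0.01)
row_b = psf_row(0.01)  # cache hit: the precomputed row is reused, not rebuilt
print(row_a is row_b)  # True
```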

  13. Systems engineering and integration: Cost estimation and benefits analysis

    NASA Technical Reports Server (NTRS)

    Dean, ED; Fridge, Ernie; Hamaker, Joe

    1990-01-01

    Space Transportation Avionics hardware and software cost has traditionally been estimated in Phase A and B using cost techniques which predict cost as a function of various cost predictive variables such as weight, lines of code, functions to be performed, quantities of test hardware, quantities of flight hardware, design and development heritage, complexity, etc. The output of such analyses has been life cycle costs, economic benefits and related data. The major objectives of Cost Estimation and Benefits analysis are twofold: (1) to play a role in the evaluation of potential new space transportation avionics technologies, and (2) to benefit from emerging technological innovations. Both aspects of cost estimation and technology are discussed here. The role of cost analysis in the evaluation of potential technologies should be one of offering additional quantitative and qualitative information to aid decision-making. The cost analyses process needs to be fully integrated into the design process in such a way that cost trades, optimizations and sensitivities are understood. Current hardware cost models tend to primarily use weights, functional specifications, quantities, design heritage and complexity as metrics to predict cost. Software models mostly use functionality, volume of code, heritage and complexity as cost descriptive variables. Basic research needs to be initiated to develop metrics more responsive to the trades which are required for future launch vehicle avionics systems. These would include cost estimating capabilities that are sensitive to technological innovations such as improved materials and fabrication processes, computer aided design and manufacturing, self checkout and many others. In addition to basic cost estimating improvements, the process must be sensitive to the fact that no cost estimate can be quoted without also quoting a confidence associated with the estimate. In order to achieve this, better cost risk evaluation techniques are

  14. Grading Multiple Choice Exams with Low-Cost and Portable Computer-Vision Techniques

    NASA Astrophysics Data System (ADS)

    Fisteus, Jesus Arias; Pardo, Abelardo; García, Norberto Fernández

    2013-08-01

    Although technology for automatic grading of multiple choice exams has existed for several decades, it is not yet as widely available or affordable as it should be. The main reasons preventing this adoption are the cost and the complexity of the setup procedures. In this paper, Eyegrade, a system for automatic grading of multiple choice exams is presented. While most current solutions are based on expensive scanners, Eyegrade offers a truly low-cost solution requiring only a regular off-the-shelf webcam. Additionally, Eyegrade performs both mark recognition as well as optical character recognition of handwritten student identification numbers, which avoids the use of bubbles in the answer sheet. When compared with similar webcam-based systems, the user interface in Eyegrade has been designed to provide a more efficient and error-free data collection procedure. The tool has been validated with a set of experiments that show the ease of use (both setup and operation), the reduction in grading time, and an increase in the reliability of the results when compared with conventional, more expensive systems.

  15. Addition of docetaxel and/or zoledronic acid to standard of care for hormone-naive prostate cancer: a cost-effectiveness analysis.

    PubMed

    Zhang, Pengfei; Wen, Feng; Fu, Ping; Yang, Yu; Li, Qiu

    2017-07-31

    The effectiveness of the addition of docetaxel and/or zoledronic acid to the standard of care (SOC) for hormone-naive prostate cancer has been evaluated in the STAMPEDE trial. The object of the present analysis was to evaluate the cost-effectiveness of these treatment options in the treatment of advanced hormone-naive prostate cancer in China. A cost-effectiveness analysis using a Markov model was carried out from the Chinese societal perspective. The efficacy data were obtained from the STAMPEDE trial and health utilities were derived from previous studies. Transition probabilities were calculated based on the survival in each group. The primary endpoint in the analysis was the incremental cost-effectiveness ratio (ICER), and model uncertainties were explored by 1-way sensitivity analysis and probabilistic sensitivity analysis. SOC alone generated an effectiveness of 2.65 quality-adjusted life years (QALYs) at a lifetime cost of $20,969.23. At a cost of $25,001.34, SOC plus zoledronic acid was associated with 2.69 QALYs, resulting in an ICER of $100,802.75/QALY compared with SOC alone. SOC plus docetaxel gained an effectiveness of 2.85 QALYs at a cost of $28,764.66, while the effectiveness and cost data in the SOC plus zoledronic acid/docetaxel group were 2.78 QALYs and $32,640.95. Based on the results of the analysis, SOC plus zoledronic acid, SOC plus docetaxel, and SOC plus zoledronic acid/docetaxel are unlikely to be cost-effective options in patients with advanced hormone-naive prostate cancer compared with SOC alone.
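    The reported ICER for SOC plus zoledronic acid can be reproduced directly from the abstract's cost and QALY figures:

```python
def icer(cost_new, cost_ref, qaly_new, qaly_ref):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# SOC + zoledronic acid vs. SOC alone, using the abstract's figures.
value = icer(25001.34, 20969.23, 2.69, 2.65)
print(round(value, 2))  # 100802.75, matching the reported ICER
```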

  16. Chest Computed Tomographic Image Screening for Cystic Lung Diseases in Patients with Spontaneous Pneumothorax Is Cost Effective.

    PubMed

    Gupta, Nishant; Langenderfer, Dale; McCormack, Francis X; Schauer, Daniel P; Eckman, Mark H

    2017-01-01

    Patients without a known history of lung disease presenting with a spontaneous pneumothorax are generally diagnosed as having primary spontaneous pneumothorax. However, occult diffuse cystic lung diseases such as Birt-Hogg-Dubé syndrome (BHD), lymphangioleiomyomatosis (LAM), and pulmonary Langerhans cell histiocytosis (PLCH) can also first present with a spontaneous pneumothorax, and their early identification by high-resolution computed tomographic (HRCT) chest imaging has implications for subsequent management. The objective of our study was to evaluate the cost-effectiveness of HRCT chest imaging to facilitate early diagnosis of LAM, BHD, and PLCH. We constructed a Markov state-transition model to assess the cost-effectiveness of screening HRCT to facilitate early diagnosis of diffuse cystic lung diseases in patients presenting with an apparent primary spontaneous pneumothorax. Baseline data for prevalence of BHD, LAM, and PLCH and rates of recurrent pneumothoraces in each of these diseases were derived from the literature. Costs were extracted from 2014 Medicare data. We compared a strategy of HRCT screening followed by pleurodesis in patients with LAM, BHD, or PLCH versus conventional management with no HRCT screening. In our base case analysis, screening for the presence of BHD, LAM, or PLCH in patients presenting with a spontaneous pneumothorax was cost effective, with a marginal cost-effectiveness ratio of $1,427 per quality-adjusted life-year gained. Sensitivity analysis showed that screening HRCT remained cost effective for diffuse cystic lung diseases prevalence as low as 0.01%. HRCT image screening for BHD, LAM, and PLCH in patients with apparent primary spontaneous pneumothorax is cost effective. Clinicians should consider performing a screening HRCT in patients presenting with apparent primary spontaneous pneumothorax.

  17. Teaching with Technology: The Classroom Manager. Cost-Conscious Computing.

    ERIC Educational Resources Information Center

    Smith, Rhea; And Others

    1992-01-01

    Teachers discuss how to make the most of technology in the classroom during a tight economy. Ideas include recycling computer printer ribbons, buying replacement batteries for computer power supply packs, upgrading via software, and soliciting donated computer equipment. (SM)

  18. Basic principles of cone beam computed tomography.

    PubMed

    Abramovitch, Kenneth; Rice, Dwight D

    2014-07-01

    At the end of the millennium, cone-beam computed tomography (CBCT) heralded a new dental technology for the next century. Owing to the dramatic and positive impact of CBCT on implant dentistry and orthognathic/orthodontic patient care, additional applications for this technology soon evolved. New software programs were developed to improve the applicability of, and access to, CBCT for dental patients. Improved, rapid, and cost-effective computer technology, combined with the ability of software engineers to develop multiple dental imaging applications for CBCT with broad diagnostic capability, have played a large part in the rapid incorporation of CBCT technology into dentistry. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Computer-assisted learning in critical care: from ENIAC to HAL.

    PubMed

    Tegtmeyer, K; Ibsen, L; Goldstein, B

    2001-08-01

    Computers are commonly used to serve many functions in today's intensive care unit. One of the most intriguing and perhaps most challenging applications of computers has been to attempt to improve medical education. With the introduction of the first computer, medical educators began looking for ways to incorporate their use into the modern curriculum. The cost and complexity of computers have steadily decreased since their introduction, making it increasingly feasible to incorporate them into medical education. Simultaneously, the capabilities and capacities of computers have increased. Combining the computer with other modern digital technology has allowed the development of more intricate and realistic educational tools. The purpose of this article is to briefly describe the history and use of computers in medical education with special reference to critical care medicine. In addition, we will examine the role of computers in teaching and learning and discuss the types of interaction between the computer user and the computer.

  20. Is a computer-assisted design and computer-assisted manufacturing method for mandibular reconstruction economically viable?

    PubMed

    Tarsitano, Achille; Battaglia, Salvatore; Crimi, Salvatore; Ciocca, Leonardo; Scotti, Roberto; Marchetti, Claudio

    2016-07-01

    The design and manufacture of patient-specific mandibular reconstruction plates, particularly in combination with cutting guides, has created many new opportunities for the planning and implementation of mandibular reconstruction. Although this surgical method is being used more widely and the outcomes appear to be improved, the question of the additional cost has to be discussed. To evaluate the cost generated by the management of this technology, we studied a cohort of patients treated for mandibular neoplasms. The population was divided into two groups of 20 patients each who were undergoing a 'traditional' freehand mandibular reconstruction or a computer-aided design/computer-aided manufacturing (CAD-CAM) mandibular reconstruction. Data concerning operation time, complications, and days of hospitalisation were used to evaluate costs related to the management of these patients. The mean operating time for the CAD-CAM group was 435 min, whereas that for the freehand group was 550.5 min. The total difference in terms of average time gain was 115.5 min. No microvascular complication occurred in the CAD-CAM group; two complications (10%) were observed in patients undergoing freehand reconstructions. The mean overall lengths of hospital stay were 13.8 days for the CAD-CAM group and 17 days for the freehand group. Finally, considering that the institutional cost per minute of theatre time is €30, the money saved as a result of the time gained was €3,450. This cost corresponds approximately to the total price of the CAD-CAM surgery. In conclusion, we believe that CAD-CAM technology for mandibular reconstruction will become a widely used reconstructive method and that its cost will be covered by gains in terms of surgical time, quality of reconstruction, and reduced complications. Copyright © 2016 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  1. CAI System Costs: Present and Future.

    ERIC Educational Resources Information Center

    Pressman, Israel; Rosenbloom, Bruce

    1984-01-01

    Discusses costs related to providing computer assisted instruction (CAI), considering hardware, software, user training, maintenance, and installation. Provides an example of the total cost of CAI broken down into these categories, giving an adjusted yearly cost. Projects future trends and costs of CAI as well as cost savings possibilities. (JM)

  2. Consumer Security Perceptions and the Perceived Influence on Adopting Cloud Computing: A Quantitative Study Using the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Paquet, Katherine G.

    2013-01-01

    Cloud computing may provide cost benefits for organizations by eliminating the overhead costs of software, hardware, and maintenance (e.g., license renewals, upgrading software, servers and their physical storage space, administration along with funding a large IT department). In addition to the promised savings, the organization may require…

  3. Computed tomography imaging in the management of headache in the emergency department: cost efficacy and policy implications.

    PubMed

    Jordan, Yusef J; Lightfoote, Johnson B; Jordan, John E

    2009-04-01

    To evaluate the economic impact and diagnostic utility of computed tomography (CT) in the management of emergency department (ED) patients presenting with headache and nonfocal physical examinations. Computerized medical records from 2 major community hospitals were retrospectively reviewed for patients presenting with headache over a 2.5-year period (2003-2006). A model was developed to assess test outcomes, CT result costs, and average institutional costs of the ED visit. The binomial probabilistic distribution of expected maximum cases was also calculated. Of the 5510 patient records queried, 882 (16%) met the above criteria. Two hundred eighty-one patients demonstrated positive CT findings (31.8%), but only 9 (1.02%) demonstrated clinically significant results (requiring a change in management). Most positive studies were incidental, including old infarcts, chronic ischemic changes, encephalomalacia, and sinusitis. The average cost of the head CT exam and ED visit was $764 (2006 dollars). This was approximately 3 times the cost of a routine outpatient visit (plus CT) for headache ($253). The incremental cost per clinically significant case detected in the ED was $50,078. The calculated expected maximum number of clinically significant positive cases was almost 50% lower than what was actually detected. Our results indicate that emergent CT imaging of nonfocal headache yields a low percentage of positive clinically significant results, and has limited cost efficacy. Since the use of CT for imaging patients with headache in the ED is widespread, the economic implications are considerable. Health policy reforms are indicated to better direct utilization in these patients.
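
    The abstract's incremental-cost figure can be reproduced directly from the numbers it reports: the extra cost of an ED workup over an outpatient visit, summed over the whole cohort, divided by the number of clinically significant findings.

```python
# Reproducing the abstract's incremental-cost figure from its own numbers:
# the extra cost of an ED workup over an outpatient visit, summed over the
# cohort, divided by the number of clinically significant findings.
n_patients = 882       # ED patients meeting criteria
n_significant = 9      # CT results requiring a change in management
cost_ed = 764          # average ED visit + head CT (2006 dollars)
cost_outpatient = 253  # routine outpatient visit + CT

incremental_total = (cost_ed - cost_outpatient) * n_patients
cost_per_case = incremental_total / n_significant
print(f"${cost_per_case:,.0f} per significant case")  # → $50,078 per significant case
```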

  4. Can a Costly Intervention Be Cost-effective?

    PubMed Central

    Foster, E. Michael; Jones, Damon

    2009-01-01

    Objectives To examine the cost-effectiveness of the Fast Track intervention, a multi-year, multi-component intervention designed to reduce violence among at-risk children. A previous report documented the favorable effect of intervention on the highest-risk group of ninth-graders diagnosed with conduct disorder, as well as self-reported delinquency. The current report addressed the cost-effectiveness of the intervention for these measures of program impact. Design Costs of the intervention were estimated using program budgets. Incremental cost-effectiveness ratios were computed to determine the cost per unit of improvement in the 3 outcomes measured in the 10th year of the study. Results Examination of the total sample showed that the intervention was not cost-effective at likely levels of policymakers' willingness to pay for the key outcomes. Subsequent analysis of those most at risk, however, showed that the intervention likely was cost-effective given specified willingness-to-pay criteria. Conclusions Results indicate that the intervention is cost-effective for the children at highest risk. From a policy standpoint, this finding is encouraging because such children are likely to generate higher costs for society over their lifetimes. However, substantial barriers to cost-effectiveness remain, such as the ability to effectively identify and recruit such higher-risk children in future implementations. PMID:17088509
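
    One common way to formalize "willingness to pay" in analyses like this is the net monetary benefit, which is positive when an intervention is worth funding at a given threshold. The sketch below uses purely illustrative values; the WTP, effect size, and cost are not the Fast Track study's estimates.

```python
# Hypothetical net-monetary-benefit (NMB) check. An intervention is
# conventionally deemed cost-effective when
#   NMB = WTP * effect_gain - incremental_cost > 0.
# All inputs below are illustrative placeholders, not Fast Track estimates.

def net_monetary_benefit(wtp, effect_gain, incremental_cost):
    """WTP-weighted benefit minus incremental cost, per participant."""
    return wtp * effect_gain - incremental_cost

# e.g., $75,000 willingness to pay per case of conduct disorder averted,
# 0.08 cases averted per child, $5,000 incremental cost per child:
nmb = net_monetary_benefit(75_000, 0.08, 5_000)
print(f"NMB per child: ${nmb:,.0f}")  # → NMB per child: $1,000
```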

  5. Computer-Aided Surgical Simulation in Head and Neck Reconstruction: A Cost Comparison among Traditional, In-House, and Commercial Options.

    PubMed

    Li, Sean S; Copeland-Halperin, Libby R; Kaminsky, Alexander J; Li, Jihui; Lodhi, Fahad K; Miraliakbari, Reza

    2018-06-01

    Computer-aided surgical simulation (CASS) has redefined surgery, improved precision and reduced the reliance on intraoperative trial-and-error manipulations. CASS is provided by third-party services; however, it may be cost-effective for some hospitals to develop in-house programs. This study provides the first cost analysis comparison among traditional (no CASS), commercial CASS, and in-house CASS for head and neck reconstruction.  The costs of three-dimensional (3D) pre-operative planning for mandibular and maxillary reconstructions were obtained from an in-house CASS program at our large tertiary care hospital in Northern Virginia, as well as a commercial provider (Synthes, Paoli, PA). A cost comparison was performed among these modalities and extrapolated in-house CASS costs were derived. The calculations were based on estimated CASS use with cost structures similar to our institution and sunk costs were amortized over 10 years.  Average operating room time was estimated at 10 hours, with an average of 2 hours saved with CASS. The hourly cost to the hospital for the operating room (including anesthesia and other ancillary costs) was estimated at $4,614/hour. Per case, traditional cases were $46,140, commercial CASS cases were $40,951, and in-house CASS cases were $38,212. Annual in-house CASS costs were $39,590.  CASS reduced operating room time, likely due to improved efficiency and accuracy. Our data demonstrate that hospitals with similar cost structure as ours, performing greater than 27 cases of 3D head and neck reconstructions per year can see a financial benefit from developing an in-house CASS program. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.
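
    A minimal break-even sketch for the in-house option: the annual fixed program cost divided by the per-case saving gives the case volume at which the program pays for itself. The two inputs below are hypothetical; the paper's threshold of 27 cases per year comes from its full cost model, not from this simplified calculation.

```python
# Minimal break-even sketch: annual fixed cost of an in-house CASS
# program recovered through per-case savings versus a commercial service.
# The inputs are hypothetical; the paper's threshold of 27 cases/year
# comes from its full cost model, not from these two numbers.
import math

def break_even_cases(annual_fixed_cost, per_case_saving):
    """Smallest annual case count at which the program pays for itself."""
    return math.ceil(annual_fixed_cost / per_case_saving)

# e.g., $40,000/yr program cost and $1,500 saved per case:
print(break_even_cases(40_000, 1_500))  # → 27
```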

  6. Cloud computing for comparative genomics

    PubMed Central

    2010-01-01

    Background Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems. PMID:20482786
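
    A back-of-the-envelope rate implied by the reported run: dividing the total bill by wall-clock hours times node count gives the effective cost per node-hour.

```python
# Back-of-the-envelope rate implied by the reported run: total cost
# divided by (wall-clock hours x node count). Figures from the abstract.
total_cost_usd = 6302  # total EC2 bill for the ortholog computation
wall_hours = 70        # "just under 70 hours" of total computation time
nodes = 100            # high-capacity compute nodes used

cost_per_node_hour = total_cost_usd / (wall_hours * nodes)
print(f"~${cost_per_node_hour:.2f} per node-hour")  # → ~$0.90 per node-hour
```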

  7. Performance Assessment of a Custom, Portable, and Low-Cost Brain-Computer Interface Platform.

    PubMed

    McCrimmon, Colin M; Fu, Jonathan Lee; Wang, Ming; Lopes, Lucas Silva; Wang, Po T; Karimi-Bidhendi, Alireza; Liu, Charles Y; Heydari, Payam; Nenadic, Zoran; Do, An Hong

    2017-10-01

    Conventional brain-computer interfaces (BCIs) are often expensive, complex to operate, and lack portability, which confines their use to laboratory settings. Portable, inexpensive BCIs can mitigate these problems, but it remains unclear whether their low-cost design compromises their performance. Therefore, we developed a portable, low-cost BCI and compared its performance to that of a conventional BCI. The BCI was assembled by integrating a custom electroencephalogram (EEG) amplifier with an open-source microcontroller and a touchscreen. The function of the amplifier was first validated against a commercial bioamplifier, followed by a head-to-head comparison between the custom BCI (using four EEG channels) and a conventional 32-channel BCI. Specifically, five able-bodied subjects were cued to alternate between hand opening/closing and remaining motionless while the BCI decoded their movement state in real time and provided visual feedback through a light-emitting diode. Subjects repeated the above task for a total of 10 trials, and were unaware of which system was being used. The performance in each trial was defined as the temporal correlation between the cues and the decoded states. The EEG data simultaneously acquired with the custom and commercial amplifiers were visually similar and highly correlated (ρ = 0.79). The decoding performances of the custom and conventional BCIs averaged across trials and subjects were 0.70 ± 0.12 and 0.68 ± 0.10, respectively, and were not significantly different. The performance of our portable, low-cost BCI is comparable to that of conventional BCIs. Platforms such as the one developed here are suitable for BCI applications outside of a laboratory.
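
    The per-trial metric described above, the temporal correlation between the cue sequence and the decoded movement state, can be sketched as a Pearson correlation. The two sequences below are toy data, not recorded decoder output.

```python
# Sketch of the per-trial metric described above: the temporal (Pearson)
# correlation between the cue sequence and the decoded movement state.
# The two sequences here are toy data, not recorded EEG decoder output.
import statistics

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

cues = [0, 0, 1, 1, 0, 0, 1, 1]     # 0 = rest, 1 = hand opening/closing
decoded = [0, 0, 1, 0, 0, 1, 1, 1]  # decoder output with two errors
print(round(pearson(cues, decoded), 2))  # → 0.5
```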

  8. An Economic Evaluation of a Video- and Text-Based Computer-Tailored Intervention for Smoking Cessation: A Cost-Effectiveness and Cost-Utility Analysis of a Randomized Controlled Trial

    PubMed Central

    Stanczyk, Nicola E.; Smit, Eline S.; Schulz, Daniela N.; de Vries, Hein; Bolman, Catherine; Muris, Jean W. M.; Evers, Silvia M. A. A.

    2014-01-01

    Background Although evidence exists for the effectiveness of web-based smoking cessation interventions, information about the cost-effectiveness of these interventions is limited. Objective The study investigated the cost-effectiveness and cost-utility of two web-based computer-tailored (CT) smoking cessation interventions (video- vs. text-based CT) compared to a control condition that received general text-based advice. Methods In a randomized controlled trial, respondents were allocated to the video-based condition (N = 670), the text-based condition (N = 708) or the control condition (N = 721). Societal costs, smoking status, and quality-adjusted life years (QALYs; EQ-5D-3L) were assessed at baseline, six- and twelve-month follow-up. The incremental costs per abstinent respondent and per QALYs gained were calculated. To account for uncertainty, bootstrapping techniques and sensitivity analyses were carried out. Results No significant differences were found in the three conditions regarding demographics, baseline values of outcomes and societal costs over the three months prior to baseline. Analyses using prolonged abstinence as outcome measure indicated that from a willingness to pay of €1,500, the video-based intervention was likely to be the most cost-effective treatment, whereas from a willingness to pay of €50,400, the text-based intervention was likely to be the most cost-effective. With regard to cost-utilities, when quality of life was used as outcome measure, the control condition had the highest probability of being the most preferable treatment. Sensitivity analyses yielded comparable results. Conclusion The video-based CT smoking cessation intervention was the most cost-effective treatment for smoking abstinence after twelve months, varying the willingness to pay per abstinent respondent from €0 up to €80,000. With regard to cost-utility, the control condition seemed to be the most preferable treatment. Probably, more time will be

  9. Computational methods in drug discovery

    PubMed Central

    Leelananda, Sumudu P

    2016-01-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein–ligand docking, pharmacophore modeling and QSAR techniques are reviewed. PMID:28144341

  10. Computational methods in drug discovery.

    PubMed

    Leelananda, Sumudu P; Lindert, Steffen

    2016-01-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein-ligand docking, pharmacophore modeling and QSAR techniques are reviewed.

  11. Automatic Computer Mapping of Terrain

    NASA Technical Reports Server (NTRS)

    Smedes, H. W.

    1971-01-01

    Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost, and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations, using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, but were greatly reduced by using preprocessing techniques.

  12. Reducing metal alloy powder costs for use in powder bed fusion additive manufacturing: Improving the economics for production

    NASA Astrophysics Data System (ADS)

    Medina, Fransisco

    Titanium and its associated alloys have been used in industry for over 50 years and have become more popular in recent decades. Titanium has been most successful in areas where the high strength-to-weight ratio provides an advantage over aluminum and steels. Other advantages of titanium include biocompatibility and corrosion resistance. Electron Beam Melting (EBM) is an additive manufacturing (AM) technology that has been successfully applied in the manufacturing of titanium components for the aerospace and medical industry with mechanical properties equivalent to or better than those of parts fabricated via more traditional casting and machining methods. As the demand for titanium powder continues to increase, the price also increases. Titanium spheroidized powder from different vendors ranges in price from $260/kg to $450/kg; other spheroidized alloys such as niobium can cost as high as $1,200/kg. Alternative titanium powders produced from methods such as the Titanium Hydride-Dehydride (HDH) process and the Armstrong Commercially Pure Titanium (CPTi) process can be fabricated at a fraction of the cost of powders fabricated via gas atomization. The alternative powders can be spheroidized and blended. Current sectors in additive manufacturing such as the medical industry are concerned that there will not be enough spherical powder for production and are seeking other powder options. It is believed that EBM technology can use a blend of spherical and angular powder to build fully dense parts with mechanical properties equal to those of parts produced using traditional powders. Some of the challenges with angular and irregular powders are overcoming the poor flow characteristics and attaining packing densities comparable to those of spherical powders. The goal of this research is to demonstrate the feasibility of utilizing alternative and lower cost powders in the EBM process. Reducing the cost of the raw material would, in turn, reduce the overall cost of the products produced with this process.
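
    A hypothetical blended-powder cost can be sketched as a weighted average of spheroidized and lower-cost angular (e.g., HDH) powder prices. The prices below are placeholders chosen within the ranges quoted above, not measured values.

```python
# Hypothetical blended-powder cost: a weighted average of spheroidized
# and lower-cost angular (e.g., HDH) powder prices. The prices are
# placeholders chosen within the ranges quoted above, not measured values.

def blend_cost(price_spherical, price_angular, frac_spherical):
    """Cost per kg of a blend with the given spherical mass fraction."""
    return price_spherical * frac_spherical + price_angular * (1 - frac_spherical)

# e.g., $300/kg spheroidized Ti powder, $100/kg HDH powder, 50/50 blend:
print(f"${blend_cost(300, 100, 0.5):.0f}/kg")  # → $200/kg
```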

  13. Using additive manufacturing in accuracy evaluation of reconstructions from computed tomography.

    PubMed

    Smith, Erin J; Anstey, Joseph A; Venne, Gabriel; Ellis, Randy E

    2013-05-01

    Bone models derived from patient imaging and fabricated using additive manufacturing technology have many potential uses including surgical planning, training, and research. This study evaluated the accuracy of bone surface reconstruction of two diarthrodial joints, the hip and shoulder, from computed tomography. Image segmentation of the tomographic series was used to develop a three-dimensional virtual model, which was fabricated using fused deposition modelling. Laser scanning was used to compare cadaver bones, printed models, and intermediate segmentations. The overall bone reconstruction process had a reproducibility of 0.3 ± 0.4 mm. Production of the model had an accuracy of 0.1 ± 0.1 mm, while the segmentation had an accuracy of 0.3 ± 0.4 mm, indicating that segmentation accuracy was the key factor in reconstruction. Generally, the shape of the articular surfaces was reproduced accurately, with poorer accuracy near the periphery of the articular surfaces, particularly in regions with periosteum covering and where osteophytes were apparent.
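
    One illustrative way to see why segmentation accuracy dominated the overall error: if the segmentation and fabrication errors are treated as independent, they combine in quadrature. Independence is an assumption made here for illustration, not a claim made by the study.

```python
# Illustrative check: if segmentation and fabrication errors are treated
# as independent, they combine in quadrature, showing why the 0.3 mm
# segmentation error dominates the overall figure. (Independence is an
# assumption here, not a claim made by the study.)
seg_error = 0.3  # mm, mean segmentation accuracy
fab_error = 0.1  # mm, mean fused-deposition fabrication accuracy

overall = (seg_error**2 + fab_error**2) ** 0.5
print(f"overall ≈ {overall:.1f} mm")  # → overall ≈ 0.3 mm
```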

  14. CAI: Its Cost and Its Role.

    ERIC Educational Resources Information Center

    Pressman, Israel; Rosenbloom, Bruce

    1984-01-01

    Describes and evaluates costs of hardware, software, training, and maintenance for computer assisted instruction (CAI) as they relate to total system cost. An example of an educational system provides an illustration of CAI cost analysis. Future developments, cost effectiveness, affordability, and applications in public and private environments…

  15. Ubiquitous Computing: The Universal Use of Computers on College Campuses.

    ERIC Educational Resources Information Center

    Brown, David G., Ed.

    This book is a collection of vignettes from 13 universities where everyone on campus has his or her own computer. These 13 institutions have instituted "ubiquitous computing" in very different ways at very different costs. The chapters are: (1) "Introduction: The Ubiquitous Computing Movement" (David G. Brown); (2) "Dartmouth College" (Malcolm…

  16. Estimation of optimal educational cost per medical student.

    PubMed

    Yang, Eunbae B; Lee, Seunghee

    2009-09-01

    This study aims to estimate the optimal educational cost per medical student. A private medical college in Seoul was targeted by the study, and its 2006 learning environment and data from the 2003~2006 budget and settlement were carefully analyzed. Through interviews with 3 medical professors and 2 experts in the economics of education, the study attempted to establish the educational cost estimation model, which yields an empirically computed estimate of the optimal cost per student in medical college. The estimation model was based primarily upon the educational cost which consisted of direct educational costs (47.25%), support costs (36.44%), fixed asset purchases (11.18%) and costs for student affairs (5.14%). These results indicate that the optimal cost per student is approximately 20,367,000 won each semester; thus, training a doctor costs 162,936,000 won over 4 years. Consequently, we inferred that the tuition levels of a local medical college or professional medical graduate school cover one quarter or one-half of the per-student cost. The findings of this study do not necessarily imply an increase in medical college tuition; the estimation of the per-student cost for training to be a doctor is one matter, and the issue of who should bear this burden is another. For further study, we should consider the college type and its location for general application of the estimation method, in addition to living expenses and opportunity costs.
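
    The abstract's arithmetic checks out: the per-semester estimate times eight semesters (4 years) reproduces the quoted four-year cost of training a doctor.

```python
# Checking the abstract's arithmetic: the per-semester estimate times
# eight semesters (4 years) reproduces the quoted cost of training a doctor.
per_semester_won = 20_367_000
four_year_total = per_semester_won * 8
print(f"{four_year_total:,} won")  # → 162,936,000 won
```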

  17. Educationally and Cost Effective: Computers in the Classroom.

    ERIC Educational Resources Information Center

    Agee, Roy

    1986-01-01

    The author states that the educational community must provide programs that assure students they will be able to learn how to use and control computers. He discusses micro labs, prerequisites to computer literacy, curriculum development, teaching methods, simulation projects, a systems analysis project, new job titles, and primary basic skills…

  18. Cloud Computing for radiologists.

    PubMed

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  19. Cloud Computing for radiologists

    PubMed Central

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future. PMID:23599560

  20. Contextuality supplies the 'magic' for quantum computation.

    PubMed

    Howard, Mark; Wallman, Joel; Veitch, Victor; Emerson, Joseph

    2014-06-19

    Quantum computers promise dramatic advantages over their classical counterparts, but the source of the power in quantum computing has remained elusive. Here we prove a remarkable equivalence between the onset of contextuality and the possibility of universal quantum computation via 'magic state' distillation, which is the leading model for experimentally realizing a fault-tolerant quantum computer. This is a conceptually satisfying link, because contextuality, which precludes a simple 'hidden variable' model of quantum mechanics, provides one of the fundamental characterizations of uniquely quantum phenomena. Furthermore, this connection suggests a unifying paradigm for the resources of quantum information: the non-locality of quantum theory is a particular kind of contextuality, and non-locality is already known to be a critical resource for achieving advantages with quantum communication. In addition to clarifying these fundamental issues, this work advances the resource framework for quantum computation, which has a number of practical applications, such as characterizing the efficiency and trade-offs between distinct theoretical and experimental schemes for achieving robust quantum computation, and putting bounds on the overhead cost for the classical simulation of quantum algorithms.

  1. Chest Computed Tomographic Image Screening for Cystic Lung Diseases in Patients with Spontaneous Pneumothorax Is Cost Effective

    PubMed Central

    Langenderfer, Dale; McCormack, Francis X.; Schauer, Daniel P.; Eckman, Mark H.

    2017-01-01

    Rationale: Patients without a known history of lung disease presenting with a spontaneous pneumothorax are generally diagnosed as having primary spontaneous pneumothorax. However, occult diffuse cystic lung diseases such as Birt-Hogg-Dubé syndrome (BHD), lymphangioleiomyomatosis (LAM), and pulmonary Langerhans cell histiocytosis (PLCH) can also first present with a spontaneous pneumothorax, and their early identification by high-resolution computed tomographic (HRCT) chest imaging has implications for subsequent management. Objectives: The objective of our study was to evaluate the cost-effectiveness of HRCT chest imaging to facilitate early diagnosis of LAM, BHD, and PLCH. Methods: We constructed a Markov state-transition model to assess the cost-effectiveness of screening HRCT to facilitate early diagnosis of diffuse cystic lung diseases in patients presenting with an apparent primary spontaneous pneumothorax. Baseline data for prevalence of BHD, LAM, and PLCH and rates of recurrent pneumothoraces in each of these diseases were derived from the literature. Costs were extracted from 2014 Medicare data. We compared a strategy of HRCT screening followed by pleurodesis in patients with LAM, BHD, or PLCH versus conventional management with no HRCT screening. Measurements and Main Results: In our base case analysis, screening for the presence of BHD, LAM, or PLCH in patients presenting with a spontaneous pneumothorax was cost effective, with a marginal cost-effectiveness ratio of $1,427 per quality-adjusted life-year gained. Sensitivity analysis showed that screening HRCT remained cost effective for diffuse cystic lung diseases prevalence as low as 0.01%. Conclusions: HRCT image screening for BHD, LAM, and PLCH in patients with apparent primary spontaneous pneumothorax is cost effective. Clinicians should consider performing a screening HRCT in patients presenting with apparent primary spontaneous pneumothorax. PMID:27737563

  2. The role of dedicated data computing centers in the age of cloud computing

    NASA Astrophysics Data System (ADS)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  3. The Computer-based Lecture

    PubMed Central

    Wofford, Marcia M; Spickard, Anderson W; Wofford, James L

    2001-01-01

Advancing computer technology, cost-containment pressures, and the desire to make innovative improvements in medical education argue for moving learning resources to the computer. A reasonable target for such a strategy is the traditional clinical lecture. The purpose of the lecture, the advantages and disadvantages of “live” versus computer-based lectures, and the technical options in computerizing the lecture deserve attention in developing a cost-effective, complementary learning strategy that preserves the teacher-learner relationship. Based on a literature review of the traditional clinical lecture, we build on the strengths of the lecture format and discuss strategies for converting the lecture to a computer-based learning presentation. PMID:11520384

  4. Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics.

    PubMed

    Patrizi, Alfredo; Pennestrì, Ettore; Valentini, Pier Paolo

    2016-01-01

The paper deals with the comparison between a high-end marker-based acquisition system and a low-cost marker-less methodology for the assessment of the human posture during working tasks. The low-cost methodology is based on the use of a single Microsoft Kinect V1 device. The high-end acquisition system is the BTS SMART that requires the use of reflective markers to be placed on the subject's body. Three practical working activities involving object lifting and displacement have been investigated. The operational risk has been evaluated according to the lifting equation proposed by the American National Institute for Occupational Safety and Health. The results of the study show that the risk multipliers computed from the two acquisition methodologies are very close for all the analysed activities. In agreement with this outcome, the marker-less methodology based on the Microsoft Kinect V1 device seems very promising for promoting the dissemination of computer-aided assessment of ergonomics while maintaining good accuracy and affordable costs. PRACTITIONER’S SUMMARY: The study is motivated by the increasing interest in on-site working ergonomics assessment. We compared a low-cost marker-less methodology with a high-end marker-based system. We tested them on three different working tasks, assessing the working risk of lifting loads. The two methodologies showed comparable precision in all the investigations.
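
    The risk multipliers mentioned here come from the revised NIOSH lifting equation. As a hedged illustration (the multiplier formulas below are the standard metric ones from the 1991 revision, recalled here rather than taken from this paper; the frequency multiplier FM and coupling multiplier CM are table lookups supplied as inputs, and the task dimensions are invented):

```python
# Sketch of the revised NIOSH lifting equation that underlies the risk
# multipliers above. FM (frequency) and CM (coupling) come from lookup
# tables in the NIOSH applications manual and are passed in directly.

LC = 23.0  # load constant, kg

def recommended_weight_limit(H, V, D, A, FM, CM):
    """H: horizontal distance (cm), V: vertical height (cm),
    D: vertical travel distance (cm), A: asymmetry angle (degrees)."""
    HM = 25.0 / H                    # horizontal multiplier
    VM = 1.0 - 0.003 * abs(V - 75)   # vertical multiplier
    DM = 0.82 + 4.5 / D              # distance multiplier
    AM = 1.0 - 0.0032 * A            # asymmetry multiplier
    return LC * HM * VM * DM * AM * FM * CM

def lifting_index(load_kg, rwl):
    """LI > 1 indicates elevated risk of lifting-related injury."""
    return load_kg / rwl

# Hypothetical lifting task:
rwl = recommended_weight_limit(H=30, V=60, D=50, A=0, FM=0.94, CM=1.0)
print(f"RWL = {rwl:.1f} kg, LI for a 10 kg load = {lifting_index(10, rwl):.2f}")
```

    Each multiplier is what either acquisition system must estimate from body posture, which is why close agreement in the multipliers implies close agreement in the computed risk.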

  5. In-Situ monitoring and modeling of metal additive manufacturing powder bed fusion

    NASA Astrophysics Data System (ADS)

    Alldredge, Jacob; Slotwinski, John; Storck, Steven; Kim, Sam; Goldberg, Arnold; Montalbano, Timothy

    2018-04-01

    One of the major challenges in metal additive manufacturing is developing in-situ sensing and feedback control capabilities to eliminate build errors and allow qualified part creation without the need for costly and destructive external testing. Previously, many groups have focused on high fidelity numerical modeling and true temperature thermal imaging systems. These approaches require large computational resources or costly hardware that requires complex calibration and are difficult to integrate into commercial systems. In addition, due to the rapid change in the state of the material as well as its surface properties, getting true temperature is complicated and difficult. Here, we describe a different approach where we implement a low cost thermal imaging solution allowing for relative temperature measurements sufficient for detecting unwanted process variability. We match this with a faster than real time qualitative model that allows the process to be rapidly modeled during the build. The hope is to combine these two, allowing for the detection of anomalies in real time, enabling corrective action to potentially be taken, or parts to be stopped immediately after the error, saving material and time. Here we describe our sensor setup, its costs and abilities. We also show the ability to detect in real time unwanted process deviations. We also show that the output of our high speed model agrees qualitatively with experimental results. These results lay the groundwork for our vision of an integrated feedback and control scheme that combines low cost, easy to use sensors and fast modeling for process deviation monitoring.

  6. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babu, Sudarsanam Suresh; Love, Lonnie J.; Peter, William H.

Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high-value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of the Big Area Additive Manufacturing (BAAM) system for polymer matrix composites was presented as the background motivation for the workshop. Following this, the extension of the underlying technology to low-cost metals was proposed with the following goals: (i) high deposition rates (approaching 100 lb/h); (ii) low cost (<$10/lb) for steel, iron, aluminum, and nickel, as well as higher-cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace and heavy manufacturing. In addition, DOE’s newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate a deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session where everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped in different categories including (i) CAD to PART software, (ii) selection of energy source, (iii

  7. Software for Tracking Costs of Mars Projects

    NASA Technical Reports Server (NTRS)

    Wong, Alvin; Warfield, Keith

    2003-01-01

    The Mars Cost Tracking Model is a computer program that administers a system set up for tracking the costs of future NASA projects that pertain to Mars. Previously, no such tracking system existed, and documentation was written in a variety of formats and scattered in various places. It was difficult to justify costs or even track the history of costs of a spacecraft mission to Mars. The present software enables users to maintain all cost-model definitions, documentation, and justifications of cost estimates in one computer system that is accessible via the Internet. The software provides sign-off safeguards to ensure the reliability of information entered into the system. This system may eventually be used to track the costs of projects other than only those that pertain to Mars.

  8. A hierarchical storage management (HSM) scheme for cost-effective on-line archival using lossy compression.

    PubMed

    Avrin, D E; Andriole, K P; Yin, L; Gould, R G; Arenson, R L

    2001-03-01

    A hierarchical storage management (HSM) scheme for cost-effective on-line archival of image data using lossy compression is described. This HSM scheme also provides an off-site tape backup mechanism and disaster recovery. The full-resolution image data are viewed originally for primary diagnosis, then losslessly compressed and sent off site to a tape backup archive. In addition, the original data are wavelet lossy compressed (at approximately 25:1 for computed radiography, 10:1 for computed tomography, and 5:1 for magnetic resonance) and stored on a large RAID device for maximum cost-effective, on-line storage and immediate retrieval of images for review and comparison. This HSM scheme provides a solution to 4 problems in image archiving, namely cost-effective on-line storage, disaster recovery of data, off-site tape backup for the legal record, and maximum intermediate storage and retrieval through the use of on-site lossy compression.
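
    The storage arithmetic behind the scheme can be sketched as follows. The compression ratios are the ones quoted above; the daily study volumes are invented for illustration:

```python
# Back-of-the-envelope storage arithmetic for the HSM scheme, using the
# lossy compression ratios quoted in the abstract (25:1 CR, 10:1 CT,
# 5:1 MR). Daily uncompressed volumes per modality are hypothetical.

ratios = {"CR": 25, "CT": 10, "MR": 5}
daily_gb = {"CR": 8.0, "CT": 20.0, "MR": 6.0}  # hypothetical GB/day, uncompressed

compressed_gb = {m: daily_gb[m] / ratios[m] for m in ratios}
total_raw = sum(daily_gb.values())
total_compressed = sum(compressed_gb.values())

print(f"uncompressed: {total_raw:.1f} GB/day; on-line (lossy): {total_compressed:.1f} GB/day")
print(f"on-line RAID capacity reduced by {100 * (1 - total_compressed / total_raw):.0f}%")
```

    The full-resolution data still exist on the off-site tape archive, so the lossy tier only has to be good enough for review and comparison, not primary diagnosis.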

  9. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    NASA Astrophysics Data System (ADS)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  10. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    DOE PAGES

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; ...

    2015-12-29

The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  11. What does an MRI scan cost?

    PubMed

    Young, David W

    2015-11-01

    Historically, hospital departments have computed the costs of individual tests or procedures using the ratio of cost to charges (RCC) method, which can produce inaccurate results. To determine a more accurate cost of a test or procedure, the activity-based costing (ABC) method must be used. Accurate cost calculations will ensure reliable information about the profitability of a hospital's DRGs.
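
    The contrast between the two costing methods can be made concrete with a toy example. All figures are hypothetical; the point is only that RCC scales a single department-wide ratio by the procedure's charge, while ABC sums the costs of the activities the procedure actually consumes:

```python
# Toy comparison of ratio-of-cost-to-charges (RCC) vs activity-based
# costing (ABC). All dollar figures are invented for illustration.

def rcc_cost(charge, dept_total_cost, dept_total_charges):
    """RCC: one blended department-wide ratio applied to every procedure."""
    return charge * (dept_total_cost / dept_total_charges)

def abc_cost(activities):
    """ABC: sum of (resource rate x quantity consumed) per activity."""
    return sum(rate * qty for rate, qty in activities)

# Hypothetical MRI scan charged $2,000 in a department with
# $5M annual cost on $8M annual charges:
print("RCC estimate:", rcc_cost(2_000, 5_000_000, 8_000_000))

# The same scan broken into the activities it actually consumes:
mri_activities = [
    (9.0, 45),   # scanner time: $9/min for 45 min
    (1.2, 60),   # technologist time: $1.20/min for 60 min
    (40.0, 1),   # contrast and supplies, flat
]
print("ABC estimate:", abc_cost(mri_activities))
```

    The two estimates can diverge widely because the department-wide ratio has no connection to what a particular test consumes, which is the inaccuracy the abstract refers to.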

  12. Addition and Removal Energies via the In-Medium Similarity Renormalization Group Method

    NASA Astrophysics Data System (ADS)

    Yuan, Fei

    The in-medium similarity renormalization group (IM-SRG) is an ab initio many-body method suitable for systems with moderate numbers of particles due to its polynomial scaling in computational cost. The formalism is highly flexible and admits a variety of modifications that extend its utility beyond the original goal of computing ground state energies of closed-shell systems. In this work, we present an extension of IM-SRG through quasidegenerate perturbation theory (QDPT) to compute addition and removal energies (single particle energies) near the Fermi level at low computational cost. This expands the range of systems that can be studied from closed-shell ones to nearby systems that differ by one particle. The method is applied to circular quantum dot systems and nuclei, and compared against other methods including equations-of-motion (EOM) IM-SRG and EOM coupled-cluster (CC) theory. The results are in good agreement for most cases. As part of this work, we present an open-source implementation of our flexible and easy-to-use J-scheme framework as well as the HF, IM-SRG, and QDPT codes built upon this framework. We include an overview of the overall structure, the implementation details, and strategies for maintaining high code quality and efficiency. Lastly, we also present a graphical application for manipulation of angular momentum coupling coefficients through a diagrammatic notation for angular momenta (Jucys diagrams). The tool enables rapid derivations of equations involving angular momentum coupling--such as in J-scheme--and significantly reduces the risk of human errors.

  13. Cost-effectiveness of routine imaging of suspected appendicitis.

    PubMed

    D'Souza, N; Marsden, M; Bottomley, S; Nagarajah, N; Scutt, F; Toh, S

    2018-01-01

Introduction The misdiagnosis of appendicitis and consequent removal of a normal appendix occurs in one in five patients in the UK. In contrast, in healthcare systems with routine cross-sectional imaging of suspected appendicitis, the negative appendicectomy rate is around 5%. If we could reduce the rate in the UK to similar numbers, would this be cost effective? This study aimed to calculate the financial impact of negative appendicectomy at the Queen Alexandra Hospital and to explore whether a policy of routine imaging of such patients could reduce hospital costs. Materials and methods We performed a retrospective analysis of all appendicectomies over a 1-year period at our institution. Data were extracted on outcomes including appendix histology, operative time and length of stay to calculate the negative appendicectomy rate and to analyse costs. Results A total of 531 patients over 5 years of age had an appendicectomy. The negative appendicectomy rate was 22% (115/531). The additional financial costs of negative appendicectomy to the hospital during this period were £270,861. Universal imaging of all patients with right iliac fossa pain that could result in a 5% negative appendicectomy rate would cost between £67,200 and £165,600 per year but could save £33,896 (magnetic resonance imaging), £105,896 (computed tomography) or £132,296 (ultrasound) depending on the imaging modality used. Conclusions Negative appendicectomy is still too frequent and results in an additional financial burden to the health service. Routine imaging of patients with suspected appendicitis would not only reduce the negative appendicectomy rate but could lead to cost savings and a better service for our patients.
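
    The trade-off the study evaluates (imaging cost per patient versus avoided negative operations) can be sketched with round numbers. The inputs below approximate, but are not, the study's figures:

```python
# Sketch of the cost trade-off: universal imaging adds a per-patient cost
# but avoids some fraction of negative appendicectomies. All inputs are
# illustrative round numbers, not the study's exact values.

def net_saving(n_patients, neg_rate_now, neg_rate_target,
               cost_per_negative_op, imaging_cost_per_patient):
    avoided_ops = n_patients * (neg_rate_now - neg_rate_target)
    return avoided_ops * cost_per_negative_op - n_patients * imaging_cost_per_patient

# Roughly 530 patients/year, 22% -> 5% negative rate,
# ~GBP 2,400 per negative operation, hypothetical per-scan costs:
for modality, img_cost in [("MRI", 300), ("CT", 150), ("US", 100)]:
    s = net_saving(530, 0.22, 0.05, 2_400, img_cost)
    print(f"{modality}: net saving of about £{s:,.0f}/year")
```

    Whichever modality is cheapest per scan yields the largest net saving, matching the ordering in the study's results.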

  14. Non-additive benefit or cost? Disentangling the indirect effects that occur when plants bearing extrafloral nectaries and honeydew-producing insects share exotic ant mutualists.

    PubMed

    Savage, Amy M; Rudgers, Jennifer A

    2013-06-01

    In complex communities, organisms often form mutualisms with multiple different partners simultaneously. Non-additive effects may emerge among species linked by these positive interactions. Ants commonly participate in mutualisms with both honeydew-producing insects (HPI) and their extrafloral nectary (EFN)-bearing host plants. Consequently, HPI and EFN-bearing plants may experience non-additive benefits or costs when these groups co-occur. The outcomes of these interactions are likely to be influenced by variation in preferences among ants for honeydew vs. nectar. In this study, a test was made for non-additive effects on HPI and EFN-bearing plants resulting from sharing exotic ant guards. Preferences of the dominant exotic ant species for nectar vs. honeydew resources were also examined. Ant access, HPI and nectar availability were manipulated on the EFN-bearing shrub, Morinda citrifolia, and ant and HPI abundances, herbivory and plant growth were assessed. Ant-tending behaviours toward HPI across an experimental gradient of nectar availability were also tracked in order to investigate mechanisms underlying ant responses. The dominant ant species, Anoplolepis gracilipes, differed from less invasive ants in response to multiple mutualists, with reductions in plot-wide abundances when nectar was reduced, but no response to HPI reduction. Conversely, at sites where A. gracilipes was absent or rare, abundances of less invasive ants increased when nectar was reduced, but declined when HPI were reduced. Non-additive benefits were found at sites dominated by A. gracilipes, but only for M. citrifolia plants. Responses of HPI at these sites supported predictions of the non-additive cost model. Interestingly, the opposite non-additive patterns emerged at sites dominated by other ants. It was demonstrated that strong non-additive benefits and costs can both occur when a plant and herbivore share mutualist partners. These findings suggest that broadening the community

  15. Non-additive benefit or cost? Disentangling the indirect effects that occur when plants bearing extrafloral nectaries and honeydew-producing insects share exotic ant mutualists

    PubMed Central

    Savage, Amy M.; Rudgers, Jennifer A.

    2013-01-01

    Background and Aims In complex communities, organisms often form mutualisms with multiple different partners simultaneously. Non-additive effects may emerge among species linked by these positive interactions. Ants commonly participate in mutualisms with both honeydew-producing insects (HPI) and their extrafloral nectary (EFN)-bearing host plants. Consequently, HPI and EFN-bearing plants may experience non-additive benefits or costs when these groups co-occur. The outcomes of these interactions are likely to be influenced by variation in preferences among ants for honeydew vs. nectar. In this study, a test was made for non-additive effects on HPI and EFN-bearing plants resulting from sharing exotic ant guards. Preferences of the dominant exotic ant species for nectar vs. honeydew resources were also examined. Methods Ant access, HPI and nectar availability were manipulated on the EFN-bearing shrub, Morinda citrifolia, and ant and HPI abundances, herbivory and plant growth were assessed. Ant-tending behaviours toward HPI across an experimental gradient of nectar availability were also tracked in order to investigate mechanisms underlying ant responses. Key Results The dominant ant species, Anoplolepis gracilipes, differed from less invasive ants in response to multiple mutualists, with reductions in plot-wide abundances when nectar was reduced, but no response to HPI reduction. Conversely, at sites where A. gracilipes was absent or rare, abundances of less invasive ants increased when nectar was reduced, but declined when HPI were reduced. Non-additive benefits were found at sites dominated by A. gracilipes, but only for M. citrifolia plants. Responses of HPI at these sites supported predictions of the non-additive cost model. Interestingly, the opposite non-additive patterns emerged at sites dominated by other ants. Conclusions It was demonstrated that strong non-additive benefits and costs can both occur when a plant and herbivore share mutualist partners. 

  16. The Protein Cost of Metabolic Fluxes: Prediction from Enzymatic Rate Laws and Cost Minimization.

    PubMed

    Noor, Elad; Flamholz, Avi; Bar-Even, Arren; Davidi, Dan; Milo, Ron; Liebermeister, Wolfram

    2016-11-01

    Bacterial growth depends crucially on metabolic fluxes, which are limited by the cell's capacity to maintain metabolic enzymes. The necessary enzyme amount per unit flux is a major determinant of metabolic strategies both in evolution and bioengineering. It depends on enzyme parameters (such as kcat and KM constants), but also on metabolite concentrations. Moreover, similar amounts of different enzymes might incur different costs for the cell, depending on enzyme-specific properties such as protein size and half-life. Here, we developed enzyme cost minimization (ECM), a scalable method for computing enzyme amounts that support a given metabolic flux at a minimal protein cost. The complex interplay of enzyme and metabolite concentrations, e.g. through thermodynamic driving forces and enzyme saturation, would make it hard to solve this optimization problem directly. By treating enzyme cost as a function of metabolite levels, we formulated ECM as a numerically tractable, convex optimization problem. Its tiered approach allows for building models at different levels of detail, depending on the amount of available data. Validating our method with measured metabolite and protein levels in E. coli central metabolism, we found typical prediction fold errors of 4.1 and 2.6, respectively, for the two kinds of data. This result from the cost-optimized metabolic state is significantly better than randomly sampled metabolite profiles, supporting the hypothesis that enzyme cost is important for the fitness of E. coli. ECM can be used to predict enzyme levels and protein cost in natural and engineered pathways, and could be a valuable computational tool to assist metabolic engineering projects. Furthermore, it establishes a direct connection between protein cost and thermodynamics, and provides a physically plausible and computationally tractable way to include enzyme kinetics into constraint-based metabolic models, where kinetics have usually been ignored or oversimplified.
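
    A toy version of the ECM idea may help: enzyme demand per reaction is flux divided by kcat times a factor that depends on a metabolite level, and the intermediate level is chosen to minimize total cost. The two-step pathway, simplified rate-law factors, and parameters below are all invented, and a grid search stands in for the convex solver the authors use:

```python
# Toy enzyme cost minimization for a two-step pathway S -> X -> P at
# fixed flux v. Step 1 is penalized by a high product level X
# (thermodynamic factor), step 2 is helped by it (saturation factor),
# so an interior optimum exists. All parameters are made up.

v = 1.0             # required pathway flux (arbitrary units)
kcat1, kcat2 = 100.0, 80.0
q1 = 5.0            # upper bound on X set by thermodynamics of step 1
K2 = 1.0            # Michaelis constant of step 2 for X
h1, h2 = 1.0, 1.5   # per-enzyme cost weights (e.g. protein size)

def total_enzyme_cost(x):
    e1 = v / (kcat1 * (1.0 - x / q1))   # step 1 demand: rises as X -> q1
    e2 = v / (kcat2 * (x / (x + K2)))   # step 2 demand: falls as X rises
    return h1 * e1 + h2 * e2

# Grid search over the thermodynamically feasible range 0 < X < q1:
xs = [q1 * i / 10_000 for i in range(1, 10_000)]
best_x = min(xs, key=total_enzyme_cost)
print(f"optimal X level: {best_x:.3f}; minimal enzyme cost: {total_enzyme_cost(best_x):.5f}")
```

    This captures the trade-off the abstract describes: enzyme cost is a function of metabolite levels, and minimizing it over those levels is a well-posed (and in ECM, convex) problem.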

  17. The Protein Cost of Metabolic Fluxes: Prediction from Enzymatic Rate Laws and Cost Minimization

    PubMed Central

    Noor, Elad; Flamholz, Avi; Bar-Even, Arren; Davidi, Dan; Milo, Ron; Liebermeister, Wolfram

    2016-01-01

Bacterial growth depends crucially on metabolic fluxes, which are limited by the cell’s capacity to maintain metabolic enzymes. The necessary enzyme amount per unit flux is a major determinant of metabolic strategies both in evolution and bioengineering. It depends on enzyme parameters (such as kcat and KM constants), but also on metabolite concentrations. Moreover, similar amounts of different enzymes might incur different costs for the cell, depending on enzyme-specific properties such as protein size and half-life. Here, we developed enzyme cost minimization (ECM), a scalable method for computing enzyme amounts that support a given metabolic flux at a minimal protein cost. The complex interplay of enzyme and metabolite concentrations, e.g. through thermodynamic driving forces and enzyme saturation, would make it hard to solve this optimization problem directly. By treating enzyme cost as a function of metabolite levels, we formulated ECM as a numerically tractable, convex optimization problem. Its tiered approach allows for building models at different levels of detail, depending on the amount of available data. Validating our method with measured metabolite and protein levels in E. coli central metabolism, we found typical prediction fold errors of 4.1 and 2.6, respectively, for the two kinds of data. This result from the cost-optimized metabolic state is significantly better than randomly sampled metabolite profiles, supporting the hypothesis that enzyme cost is important for the fitness of E. coli. ECM can be used to predict enzyme levels and protein cost in natural and engineered pathways, and could be a valuable computational tool to assist metabolic engineering projects. Furthermore, it establishes a direct connection between protein cost and thermodynamics, and provides a physically plausible and computationally tractable way to include enzyme kinetics into constraint-based metabolic models, where kinetics have usually been ignored or oversimplified.

  18. 32 CFR 701.52 - Computation of fees.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... correspondence and preparation costs, these fees are not recoupable from the requester. (b) DD 2086, Record of... costs, as requesters may solicit a copy of that document to ensure accurate computation of fees. Costs... 32 National Defense 5 2010-07-01 2010-07-01 false Computation of fees. 701.52 Section 701.52...

  19. Computing technology in the 1980's. [computers

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.

  20. Information sampling behavior with explicit sampling costs

    PubMed Central

    Juni, Mordechai Z.; Gureckis, Todd M.; Maloney, Laurence T.

    2015-01-01

    The decision to gather information should take into account both the value of information and its accrual costs in time, energy and money. Here we explore how people balance the monetary costs and benefits of gathering additional information in a perceptual-motor estimation task. Participants were rewarded for touching a hidden circular target on a touch-screen display. The target’s center coincided with the mean of a circular Gaussian distribution from which participants could sample repeatedly. Each “cue” — sampled one at a time — was plotted as a dot on the display. Participants had to repeatedly decide, after sampling each cue, whether to stop sampling and attempt to touch the hidden target or continue sampling. Each additional cue increased the participants’ probability of successfully touching the hidden target but reduced their potential reward. Two experimental conditions differed in the initial reward associated with touching the hidden target and the fixed cost per cue. For each condition we computed the optimal number of cues that participants should sample, before taking action, to maximize expected gain. Contrary to recent claims that people gather less information than they objectively should before taking action, we found that participants over-sampled in one experimental condition, and did not significantly under- or over-sample in the other. Additionally, while the ideal observer model ignores the current sample dispersion, we found that participants used it to decide whether to stop sampling and take action or continue sampling, a possible consequence of imperfect learning of the underlying population dispersion across trials. PMID:27429991
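
    The expected-gain computation can be sketched as follows, assuming the sample mean of n cues misses the true center by a two-dimensional Gaussian error with per-axis standard deviation sigma/sqrt(n), and ignoring motor (pointing) error. All parameter values are invented:

```python
import math

# Expected gain of stopping after n cues. With n samples from a circular
# Gaussian (per-axis sd sigma), the sample mean's radial miss distance is
# Rayleigh-distributed, so the probability of landing within a disk of
# radius r is 1 - exp(-r^2 * n / (2 * sigma^2)). Each cue also reduces
# the reward by a fixed cost, mirroring the task described above.

def expected_gain(n, sigma, radius, reward0, cost_per_cue):
    p_hit = 1.0 - math.exp(-radius**2 * n / (2.0 * sigma**2))
    return (reward0 - cost_per_cue * n) * p_hit

sigma, radius = 3.0, 1.0           # cue scatter vs target size (screen units)
reward0, cost_per_cue = 100.0, 2.0  # initial reward and fixed cost per cue

best_n = max(range(1, 50),
             key=lambda n: expected_gain(n, sigma, radius, reward0, cost_per_cue))
print(f"gain-maximizing number of cues: {best_n}")
```

    Note that this ideal-observer calculation depends only on n, not on the dispersion of the cues actually seen, which is exactly the quantity the participants were found to use anyway.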

  1. Medical imaging and registration in computer assisted surgery.

    PubMed

    Simon, D A; Lavallée, S

    1998-09-01

Imaging, sensing, and computing technologies that are being introduced to aid in the planning and execution of surgical procedures are providing orthopaedic surgeons with a powerful new set of tools for improving clinical accuracy, reliability, and patient outcomes while reducing costs and operating times. Current computer assisted surgery systems typically include a measurement process for collecting patient specific medical data, a decision making process for generating a surgical plan, a registration process for aligning the surgical plan to the patient, and an action process for accurately achieving the goals specified in the plan. Some of the key concepts in computer assisted surgery applied to orthopaedics, with a focus on the basic framework and underlying technologies, are outlined. In addition, technical challenges and future trends in the field are discussed.

  2. COST FUNCTION STUDIES FOR POWER REACTORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heestand, J.; Wos, L.T.

    1961-11-01

    A function to evaluate the cost of electricity produced by a nuclear power reactor was developed. The basic equation, revenue = capital charges + profit + operating expenses, was expanded in terms of various cost parameters to enable analysis of multiregion nuclear reactors with uranium and/or plutonium for fuel. A corresponding IBM 704 computer program, which will compute either the price of electricity or the value of plutonium, is presented in detail. (auth)
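
    The basic equation can be turned into a small calculation in the spirit of the original IBM 704 program: solve revenue = capital charges + profit + operating expenses for the electricity price. All numbers below are hypothetical:

```python
# The basic relation quoted above, solved for the electricity price that
# makes revenue cover capital charges, profit, and operating expenses.
# All figures are invented for illustration.

def electricity_price(capital_charges, profit, operating_expenses, kwh_sold):
    """Price in $/kWh at which revenue covers all three components."""
    return (capital_charges + profit + operating_expenses) / kwh_sold

price = electricity_price(
    capital_charges=12e6,        # annual charges on plant capital, $
    profit=2e6,                  # required annual profit, $
    operating_expenses=6e6,      # fuel, staffing, maintenance, $
    kwh_sold=2.5e9,              # annual net generation sold, kWh
)
print(f"required price: {price * 1000:.2f} mills/kWh")
```

    The actual program expands each of these terms over multiple reactor regions and fuel types, and can equivalently hold the price fixed and solve for the implied value of plutonium.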

  3. 48 CFR 42.709-4 - Computing interest.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computing interest. 42.709... MANAGEMENT CONTRACT ADMINISTRATION AND AUDIT SERVICES Indirect Cost Rates 42.709-4 Computing interest. For 42.709-1(a)(1)(ii), compute interest on any paid portion of the disallowed cost as follows: (a) Consider...

  4. 48 CFR 42.709-4 - Computing interest.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Computing interest. 42.709... MANAGEMENT CONTRACT ADMINISTRATION AND AUDIT SERVICES Indirect Cost Rates 42.709-4 Computing interest. For 42.709-1(a)(1)(ii), compute interest on any paid portion of the disallowed cost as follows: (a) Consider...

  5. 48 CFR 42.709-4 - Computing interest.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Computing interest. 42.709... MANAGEMENT CONTRACT ADMINISTRATION AND AUDIT SERVICES Indirect Cost Rates 42.709-4 Computing interest. For 42.709-1(a)(1)(ii), compute interest on any paid portion of the disallowed cost as follows: (a) Consider...

  6. Spacelab experiment computer study. Volume 1: Executive summary (presentation)

    NASA Technical Reports Server (NTRS)

    Lewis, J. L.; Hodges, B. C.; Christy, J. O.

    1976-01-01

A quantitative cost for various Spacelab flight hardware configurations is provided along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented. The cost study is discussed based on utilization of a central experiment computer with optional auxiliary equipment. Ground rules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented. The ground rules and assumptions are analysed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.

  7. Virtualization and cloud computing in dentistry.

    PubMed

    Chow, Frank; Muftu, Ali; Shorter, Richard

    2014-01-01

The use of virtualization and cloud computing has changed the way we use computers. Virtualization is a method of placing software called a hypervisor on the hardware of a computer or a host operating system. It allows a guest operating system to run on top of the physical computer with a virtual machine (i.e., virtual computer). Virtualization allows multiple virtual computers to run on top of one physical computer and to share its hardware resources, such as printers, scanners, and modems. This increases the efficient use of the computer by decreasing costs (e.g., hardware, electricity, administration, and management) since only one physical computer is needed and running. This virtualization platform is the basis for cloud computing. It has expanded into areas of server and storage virtualization. One of the commonly used dental storage systems is cloud storage. Patient information is encrypted as required by the Health Insurance Portability and Accountability Act (HIPAA) and stored on off-site private cloud services for a monthly service fee. As computer costs continue to increase, so too will the need for more storage and processing power. Virtual and cloud computing will be a method for dentists to minimize costs and maximize computer efficiency in the near future. This article will provide some useful information on current uses of cloud computing.

  8. Cost and cost-effectiveness of digital mammography compared with film-screen mammography in Australia.

    PubMed

    Wang, Shuhong; Merlin, Tracy; Kreisz, Florian; Craft, Paul; Hiller, Janet E

    2009-10-01

A systematic review assessed the relative safety and effectiveness of digital mammography compared with film-screen mammography. This study utilised the evidence from the review to examine the economic value of digital compared with film-screen mammography in Australia. A cost-comparison analysis between the two technologies was conducted for the overall population for the purposes of breast cancer screening and diagnosis. In addition, a cost-effectiveness analysis was conducted for the screening subgroups where digital mammography was considered to be more accurate than film-screen mammography. Digital mammography costs $11 more per examination than film-screen mammography in a screening setting, and $36 or $33 more per examination in a diagnostic setting when either digital radiography or computed radiography is used. In both the screening and diagnostic settings, the throughput of the mammography system had the most significant impact on decreasing the incremental cost/examination/year of digital mammography. Digital mammography is more expensive than film-screen mammography. Whether digital mammography represents good value for money depends on the eventual life-years and quality-adjusted life-years gained from the early cancer diagnosis. The evidence generated from this study has informed the allocation of public resources for the screening and diagnosis of breast cancer in Australia.

  9. Thoracoabdominal Computed Tomography in Trauma Patients: A Cost-Consequences Analysis

    PubMed Central

    van Vugt, Raoul; Kool, Digna R.; Brink, Monique; Dekker, Helena M.; Deunk, Jaap; Edwards, Michael J.

    2014-01-01

Background: CT is increasingly used during the initial evaluation of blunt trauma patients. In this era of increasing cost-awareness, the pros and cons of CT have to be assessed. Objectives: This study was performed to evaluate cost-consequences of different diagnostic algorithms that use thoracoabdominal CT in primary evaluation of adult patients with high-energy blunt trauma. Materials and Methods: We compared three different algorithms in which CT was applied as an immediate diagnostic tool (rush CT), a diagnostic tool after limited conventional work-up (routine CT), and a selective tool (selective CT). Probabilities of detecting and missing clinically relevant injuries were retrospectively derived. We collected data on radiation exposure and performed a micro-cost analysis on a reference case-based approach. Results: Both rush and routine CT detected all thoracoabdominal injuries in 99.1% of the patients during primary evaluation (n = 1040). Selective CT missed one or more diagnoses in 11% of the patients, in which a change of treatment was necessary in 4.8%. The rush CT algorithm cost € 2676 (US$ 3660) per patient with a mean radiation dose of 26.40 mSv per patient. Routine CT cost € 2815 (US$ 3850) and resulted in the same radiation exposure. Selective CT resulted in a lower radiation dose (23.23 mSv) and cost € 2771 (US$ 3790). Conclusions: Rush CT seems to result in the lowest costs and is comparable in terms of radiation dose exposure and diagnostic certainty with routine CT after a limited conventional work-up. However, selective CT results in less radiation dose exposure but a slightly higher cost and less certainty. PMID:25337521

  10. Development and implementation of a low cost micro computer system for LANDSAT analysis and geographic data base applications

    NASA Technical Reports Server (NTRS)

    Faust, N.; Jordon, L.

    1981-01-01

Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970's, geographic data analysis has moved from large computers to minicomputers and now to microcomputers, with a radical reduction in the costs associated with planning analyses. Once NIMGRID (new IMGRID), a raster-oriented geographic information system, was implemented on the microcomputer, programs designed to process LANDSAT data for use as one element in a geographic data base were employed. Programs for training field selection, supervised and unsupervised classification, and image enhancement were added. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color infrared format. The basic microcomputer hardware needed to run NIMGRID and most LANDSAT analyses is listed, as well as the software available for LANDSAT processing.

  11. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell; Nettles, Mindy

    2015-01-01

The Additive Manufacturing Infrared Inspection Task started the development of a real-time dimensional inspection technique and digital quality record for the additive manufacturing process using infrared camera imaging and processing techniques. This project will benefit additive manufacturing by providing real-time inspection of internal geometry that is not currently possible, and by reducing the time and cost of additively manufactured parts through automated real-time dimensional inspections that eliminate post-production inspections.

  12. Heterogeneous High Throughput Scientific Computing with APM X-Gene and Intel Xeon Phi

    NASA Astrophysics Data System (ADS)

    Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Muzaffar, Shahzad

    2015-05-01

Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. We report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).
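The performance-per-watt metric named in this abstract is straightforward to compute and compare across platforms; the sketch below uses invented throughput and power figures purely for illustration (the abstract reports no such numbers):

```python
# Performance-per-watt comparison for two hypothetical platforms.
# Throughput (events/s) and power (watts) are illustrative values only.
platforms = {
    "xeon_phi": {"throughput": 240.0, "power_w": 300.0},
    "x_gene":   {"throughput": 45.0,  "power_w": 60.0},
}

def perf_per_watt(name):
    """Sustained throughput divided by power draw, in events/s per watt."""
    p = platforms[name]
    return p["throughput"] / p["power_w"]

# Pick the most cost-efficient platform under this metric.
best = max(platforms, key=perf_per_watt)
```

Under these made-up figures the Xeon Phi entry wins on raw throughput per watt, but a real evaluation (as in the paper) must also weigh porting effort and absolute node cost.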

  13. Compact and low-cost THz QTDS system.

    PubMed

    Probst, Thorsten; Rehn, Arno; Koch, Martin

    2015-08-24

    We present a terahertz quasi time domain spectroscopy (QTDS) system setup which is improved regarding cost and compactness. The diode laser is mounted directly onto the optical delay line, making the optical setup more compact. The system is operated using a Raspberry Pi and an additional sound card. This combination replaces the desktop/laptop computer, the lock-in-amplifier, the stage controller and the signal generator. We examined not only a commercially available stepper motor driven delay line, but also the repurposed internal mechanics from a DVD drive. We characterize the performance of the new system concept.

  14. Identification of cost effective energy conservation measures

    NASA Technical Reports Server (NTRS)

    Bierenbaum, H. S.; Boggs, W. H.

    1978-01-01

In addition to a successful program of readily implemented conservation actions for reducing building energy consumption at Kennedy Space Center, recent detailed analyses have identified further substantial savings for buildings representative of technical facilities designed when energy costs were low. The techniques employed for determination of these energy savings consisted of facility configuration analysis, power and lighting measurements, detailed computer simulations and simulation verifications. Use of these methods resulted in identification of projected energy savings as large as $330,000 a year (approximately two year break-even period) in a single building. Application of these techniques to other commercial buildings is discussed.

  15. A strategy for improved computational efficiency of the method of anchored distributions

    NASA Astrophysics Data System (ADS)

    Over, Matthew William; Yang, Yarong; Chen, Xingyuan; Rubin, Yoram

    2013-06-01

    This paper proposes a strategy for improving the computational efficiency of model inversion using the method of anchored distributions (MAD) by "bundling" similar model parametrizations in the likelihood function. Inferring the likelihood function typically requires a large number of forward model (FM) simulations for each possible model parametrization; as a result, the process is quite expensive. To ease this prohibitive cost, we present an approximation for the likelihood function called bundling that relaxes the requirement for high quantities of FM simulations. This approximation redefines the conditional statement of the likelihood function as the probability of a set of similar model parametrizations "bundle" replicating field measurements, which we show is neither a model reduction nor a sampling approach to improving the computational efficiency of model inversion. To evaluate the effectiveness of these modifications, we compare the quality of predictions and computational cost of bundling relative to a baseline MAD inversion of 3-D flow and transport model parameters. Additionally, to aid understanding of the implementation we provide a tutorial for bundling in the form of a sample data set and script for the R statistical computing language. For our synthetic experiment, bundling achieved a 35% reduction in overall computational cost and had a limited negative impact on predicted probability distributions of the model parameters. Strategies for minimizing error in the bundling approximation, for enforcing similarity among the sets of model parametrizations, and for identifying convergence of the likelihood function are also presented.
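The core trade-off in the bundling approximation described above is fewer forward-model runs in exchange for a coarser likelihood. The sketch below is an illustrative toy (not the authors' R implementation): the forward model, bundle width, and Gaussian error model are all hypothetical stand-ins:

```python
# Toy sketch of "bundling": similar parametrizations are grouped, and the
# likelihood is evaluated once per bundle instead of once per sample,
# reducing the number of forward-model (FM) runs. The FM, bundle width,
# and noise level below are hypothetical.
import math
import random

def forward_model(theta):
    # Hypothetical FM mapping a parameter to a predicted measurement.
    return 2.0 * theta + 1.0

def bundled_likelihood(samples, measurement, bundle_width=0.5, sigma=1.0):
    """Average Gaussian likelihood, one FM run per bundle of similar samples."""
    bundles = {}
    for theta in samples:                      # group samples by rounded value
        key = round(theta / bundle_width)
        bundles.setdefault(key, []).append(theta)
    total, fm_runs = 0.0, 0
    for members in bundles.values():
        rep = sum(members) / len(members)      # bundle representative
        pred = forward_model(rep)              # one FM run for the whole bundle
        fm_runs += 1
        lik = math.exp(-0.5 * ((pred - measurement) / sigma) ** 2)
        total += lik * len(members)            # weight by bundle size
    return total / len(samples), fm_runs

random.seed(0)
samples = [random.gauss(1.0, 0.8) for _ in range(1000)]
lik, runs = bundled_likelihood(samples, measurement=3.0)
```

With 1000 samples, only a few dozen forward-model runs are needed; the paper's reported 35% cost reduction for a 3-D flow and transport inversion reflects the same principle at realistic model scale.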

  16. Performance Comparison of Mainframe, Workstations, Clusters, and Desktop Computers

    NASA Technical Reports Server (NTRS)

    Farley, Douglas L.

    2005-01-01

A performance evaluation of a variety of computers frequently found in a scientific or engineering research environment was conducted using synthetic and application program benchmarks. From a performance perspective, emerging commodity processors have superior performance relative to legacy mainframe computers. In many cases, the PC clusters exhibited performance comparable to traditional mainframe hardware when 8-12 processors were used. The main advantage of the PC clusters was their cost. Regardless of whether the clusters were built from new computers or created from retired computers, their performance-to-cost ratio was superior to that of the legacy mainframe computers. Finally, the typical annual maintenance cost of legacy mainframe computers is several times the cost of new equipment such as multiprocessor PC workstations. The savings from eliminating the annual maintenance fee on legacy hardware can result in a yearly increase in total computational capability for an organization.

  17. The Next Generation of Personal Computers.

    ERIC Educational Resources Information Center

    Crecine, John P.

    1986-01-01

Discusses factors converging to create the high-capacity, low-cost nature of the next generation of microcomputers: a coherent vision of what the graphics workstation and future computing environment should be like; hardware developments leading to greater storage capacity at lower costs; and the development of software and expertise to exploit computing power…

  18. Processing power limits social group size: computational evidence for the cognitive costs of sociality

    PubMed Central

    Dávid-Barrett, T.; Dunbar, R. I. M.

    2013-01-01

    Sociality is primarily a coordination problem. However, the social (or communication) complexity hypothesis suggests that the kinds of information that can be acquired and processed may limit the size and/or complexity of social groups that a species can maintain. We use an agent-based model to test the hypothesis that the complexity of information processed influences the computational demands involved. We show that successive increases in the kinds of information processed allow organisms to break through the glass ceilings that otherwise limit the size of social groups: larger groups can only be achieved at the cost of more sophisticated kinds of information processing that are disadvantageous when optimal group size is small. These results simultaneously support both the social brain and the social complexity hypotheses. PMID:23804623

  19. Economic evaluation of a web-based tailored lifestyle intervention for adults: findings regarding cost-effectiveness and cost-utility from a randomized controlled trial.

    PubMed

    Schulz, Daniela N; Smit, Eline S; Stanczyk, Nicola E; Kremers, Stef P J; de Vries, Hein; Evers, Silvia M A A

    2014-03-20

    Different studies have reported the effectiveness of Web-based computer-tailored lifestyle interventions, but economic evaluations of these interventions are scarce. The objective was to assess the cost-effectiveness and cost-utility of a sequential and a simultaneous Web-based computer-tailored lifestyle intervention for adults compared to a control group. The economic evaluation, conducted from a societal perspective, was part of a 2-year randomized controlled trial including 3 study groups. All groups received personalized health risk appraisals based on the guidelines for physical activity, fruit intake, vegetable intake, alcohol consumption, and smoking. Additionally, respondents in the sequential condition received personal advice about one lifestyle behavior in the first year and a second behavior in the second year; respondents in the simultaneous condition received personal advice about all unhealthy behaviors in both years. During a period of 24 months, health care use, medication use, absenteeism from work, and quality of life (EQ-5D-3L) were assessed every 3 months using Web-based questionnaires. Demographics were assessed at baseline, and lifestyle behaviors were assessed at both baseline and after 24 months. Cost-effectiveness and cost-utility analyses were performed based on the outcome measures lifestyle factor (the number of guidelines respondents adhered to) and quality of life, respectively. We accounted for uncertainty by using bootstrapping techniques and sensitivity analyses. A total of 1733 respondents were included in the analyses. From a willingness to pay of €4594 per additional guideline met, the sequential intervention (n=552) was likely to be the most cost-effective, whereas from a willingness to pay of €10,850, the simultaneous intervention (n=517) was likely to be most cost-effective. The control condition (n=664) appeared to be preferred with regard to quality of life. Both the sequential and the simultaneous lifestyle

  20. The truncated conjugate gradient (TCG), a non-iterative/fixed-cost strategy for computing polarization in molecular dynamics: Fast evaluation of analytical forces

    NASA Astrophysics Data System (ADS)

    Aviat, Félix; Lagardère, Louis; Piquemal, Jean-Philip

    2017-10-01

    In a recent paper [F. Aviat et al., J. Chem. Theory Comput. 13, 180-190 (2017)], we proposed the Truncated Conjugate Gradient (TCG) approach to compute the polarization energy and forces in polarizable molecular simulations. The method consists in truncating the conjugate gradient algorithm at a fixed predetermined order leading to a fixed computational cost and can thus be considered "non-iterative." This gives the possibility to derive analytical forces avoiding the usual energy conservation (i.e., drifts) issues occurring with iterative approaches. A key point concerns the evaluation of the analytical gradients, which is more complex than that with a usual solver. In this paper, after reviewing the present state of the art of polarization solvers, we detail a viable strategy for the efficient implementation of the TCG calculation. The complete cost of the approach is then measured as it is tested using a multi-time step scheme and compared to timings using usual iterative approaches. We show that the TCG methods are more efficient than traditional techniques, making it a method of choice for future long molecular dynamics simulations using polarizable force fields where energy conservation matters. We detail the various steps required for the implementation of the complete method by software developers.

  1. The truncated conjugate gradient (TCG), a non-iterative/fixed-cost strategy for computing polarization in molecular dynamics: Fast evaluation of analytical forces.

    PubMed

    Aviat, Félix; Lagardère, Louis; Piquemal, Jean-Philip

    2017-10-28

    In a recent paper [F. Aviat et al., J. Chem. Theory Comput. 13, 180-190 (2017)], we proposed the Truncated Conjugate Gradient (TCG) approach to compute the polarization energy and forces in polarizable molecular simulations. The method consists in truncating the conjugate gradient algorithm at a fixed predetermined order leading to a fixed computational cost and can thus be considered "non-iterative." This gives the possibility to derive analytical forces avoiding the usual energy conservation (i.e., drifts) issues occurring with iterative approaches. A key point concerns the evaluation of the analytical gradients, which is more complex than that with a usual solver. In this paper, after reviewing the present state of the art of polarization solvers, we detail a viable strategy for the efficient implementation of the TCG calculation. The complete cost of the approach is then measured as it is tested using a multi-time step scheme and compared to timings using usual iterative approaches. We show that the TCG methods are more efficient than traditional techniques, making it a method of choice for future long molecular dynamics simulations using polarizable force fields where energy conservation matters. We detail the various steps required for the implementation of the complete method by software developers.
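The fixed-cost idea behind TCG, truncating the conjugate gradient iteration at a predetermined order rather than at a convergence threshold, can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: the small symmetric positive-definite system below stands in for the actual polarization equations:

```python
# Truncated conjugate gradient (TCG) sketch: run CG for exactly `order`
# steps, so every call has the same, fixed computational cost. The 3x3
# SPD system is an illustrative stand-in for the polarization equations.

def mat_vec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def tcg_solve(A, b, order=2):
    """Return the CG iterate after exactly `order` steps (zero initial guess)."""
    x = [0.0] * len(b)
    r = list(b)                  # residual b - A x with x = 0
    p = list(r)
    for _ in range(order):
        Ap = mat_vec(A, p)
        alpha = dot(r, r) / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r_new = [ri - alpha * api for ri, api in zip(r, Ap)]
        beta = dot(r_new, r_new) / dot(r, r)
        p = [rn + beta * pi for rn, pi in zip(r_new, p)]
        r = r_new
    return x

A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]
mu = tcg_solve(A, b, order=2)    # "TCG-2": two CG steps, always
```

Because the truncated iterate is a fixed, closed-form function of the inputs, its analytical gradient can be derived exactly, which is what lets the method avoid the energy drift of convergence-threshold solvers.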

  2. Construction and field test of a programmable and self-cleaning auto-sampler controlled by a low-cost one-board computer

    NASA Astrophysics Data System (ADS)

    Stadler, Philipp; Farnleitner, Andreas H.; Zessner, Matthias

    2016-04-01

This presentation describes in depth how a low-cost micro-computer was used for substantial improvement of established measuring systems through the construction and implementation of a purposeful complementary device for on-site sample pretreatment. A fully automated on-site device was developed and field-tested that enables water sampling with simultaneous filtration as well as an effective cleaning procedure for the device's components. The described auto-sampler is controlled by a low-cost one-board computer and designed for sample pre-treatment, with minimal sample alteration, to meet the requirements of on-site measurement devices that cannot handle coarse suspended solids within the measurement procedure or cycle. The automated sample pretreatment was tested for over one year for rapid and on-site enzymatic activity (beta-D-glucuronidase, GLUC) determination in sediment-laden stream water. The formerly used proprietary sampling set-up was assumed to lead to a significant damping of the measurement signal due to its susceptibility to clogging, debris and biofilm accumulation. Results show that the installation of the developed apparatus considerably enhanced the error-free running time of connected measurement devices and increased the measurement accuracy to an up-to-now unmatched quality.

  3. Thermodynamics of Computational Copying in Biochemical Systems

    NASA Astrophysics Data System (ADS)

    Ouldridge, Thomas E.; Govern, Christopher C.; ten Wolde, Pieter Rein

    2017-04-01

    Living cells use readout molecules to record the state of receptor proteins, similar to measurements or copies in typical computational devices. But is this analogy rigorous? Can cells be optimally efficient, and if not, why? We show that, as in computation, a canonical biochemical readout network generates correlations; extracting no work from these correlations sets a lower bound on dissipation. For general input, the biochemical network cannot reach this bound, even with arbitrarily slow reactions or weak thermodynamic driving. It faces an accuracy-dissipation trade-off that is qualitatively distinct from and worse than implied by the bound, and more complex steady-state copy processes cannot perform better. Nonetheless, the cost remains close to the thermodynamic bound unless accuracy is extremely high. Additionally, we show that biomolecular reactions could be used in thermodynamically optimal devices under exogenous manipulation of chemical fuels, suggesting an experimental system for testing computational thermodynamics.

  4. Cost-of-illness studies based on massive data: a prevalence-based, top-down regression approach.

    PubMed

    Stollenwerk, Björn; Welchowski, Thomas; Vogl, Matthias; Stock, Stephanie

    2016-04-01

Despite the increasing availability of routine data, no analysis method has yet been presented for cost-of-illness (COI) studies based on massive data. We aim, first, to present such a method and, second, to assess the relevance of the associated gain in numerical efficiency. We propose a prevalence-based, top-down regression approach consisting of five steps: aggregating the data; fitting a generalized additive model (GAM); predicting costs via the fitted GAM; comparing predicted costs between prevalent and non-prevalent subjects; and quantifying the stochastic uncertainty via error propagation. To demonstrate the method, it was applied, in the context of chronic lung disease, to aggregated German sickness fund data (from 1999) covering over 7.3 million insured. To assess the gain in numerical efficiency, the computational time of the innovative approach was compared with that of corresponding GAMs applied to simulated individual-level data. Furthermore, the probability of model failure was modeled via logistic regression. Applying the innovative method was reasonably fast (19 min). In contrast, for patient-level data, computational time increased disproportionately with sample size. Furthermore, using patient-level data was accompanied by a substantial risk of model failure (about 80% for 6 million subjects). The gain in computational efficiency of the innovative COI method seems to be of practical relevance. Furthermore, it may yield more precise cost estimates.
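The prevalent-versus-non-prevalent cost comparison at the heart of the five-step approach can be illustrated on aggregated cells. This is a toy sketch, with a stratified mean standing in for the fitted GAM and all figures invented (the paper's data are not reproduced here):

```python
# Top-down COI sketch: records are aggregated cells, not individuals.
# A simple weighted mean per prevalence group stands in for the GAM;
# all cost and population figures are invented.

# Cells: (age_group, prevalent, mean_annual_cost_eur, n_insured)
cells = [
    ("40-59", False, 1200.0, 500_000),
    ("40-59", True,  2900.0,  40_000),
    ("60-79", False, 2100.0, 300_000),
    ("60-79", True,  4800.0,  60_000),
]

def mean_cost(prevalent):
    """Population-weighted mean annual cost for one prevalence group."""
    total = sum(c * n for _, p, c, n in cells if p == prevalent)
    count = sum(n for _, p, _, n in cells if p == prevalent)
    return total / count

# Excess cost attributable to the condition, then scaled to all patients.
excess_per_patient = mean_cost(True) - mean_cost(False)
n_prevalent = sum(n for _, p, _, n in cells if p)
total_excess = excess_per_patient * n_prevalent   # crude COI estimate
```

Working on a handful of aggregated cells instead of millions of individual records is what makes the approach fast; the GAM in the paper additionally smooths over covariates such as age before the groups are compared.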

  5. Computing at the speed limit (supercomputers)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernhard, R.

    1982-07-01

The author discusses how unheralded efforts in the United States, mainly in universities, have removed major stumbling blocks to building cost-effective superfast computers for scientific and engineering applications within five years. These computers would have sustained speeds of billions of floating-point operations per second (flops), whereas with the fastest machines today the top sustained speed is only 25 million flops, with bursts to 160 megaflops. Cost-effective superfast machines can be built because of advances in very large-scale integration and the special software needed to program the new machines. VLSI greatly reduces the cost per unit of computing power. The development of such computers would come at an opportune time. Although the US leads the world in large-scale computer technology, its supremacy is now threatened, not surprisingly, by the Japanese. Publicized reports indicate that the Japanese government is funding a cooperative effort by commercial computer manufacturers to develop superfast computers, about 1000 times faster than modern supercomputers. The US computer industry, by contrast, has balked at attempting to boost computer power so sharply because of the uncertain market for the machines and the failure of similar projects in the past to show significant results.

  6. Comprehensive silicon solar cell computer modeling

    NASA Technical Reports Server (NTRS)

    Lamorte, M. F.

    1984-01-01

The development of an efficient, comprehensive Si solar cell modeling program capable of a simulation accuracy of 5 percent or less is examined. A general investigation of computerized simulation is provided. Computer simulation programs are subdivided into a number of major tasks: (1) the analytical method used to represent the physical system; (2) the phenomena submodels that comprise the simulation of the system; (3) the coding of the analysis and the phenomena submodels; (4) a coding scheme that results in efficient use of the CPU so that CPU costs are low; and (5) a modularized simulation program with respect to the structures that may be analyzed, the addition and/or modification of phenomena submodels as new experimental data become available, and the addition of other photovoltaic materials.

  7. Using technology to support investigations in the electronic age: tracking hackers to large scale international computer fraud

    NASA Astrophysics Data System (ADS)

    McFall, Steve

    1994-03-01

    With the increase in business automation and the widespread availability and low cost of computer systems, law enforcement agencies have seen a corresponding increase in criminal acts involving computers. The examination of computer evidence is a new field of forensic science with numerous opportunities for research and development. Research is needed to develop new software utilities to examine computer storage media, expert systems capable of finding criminal activity in large amounts of data, and to find methods of recovering data from chemically and physically damaged computer storage media. In addition, defeating encryption and password protection of computer files is also a topic requiring more research and development.

  8. Activity-based costing: a practical model for cost calculation in radiotherapy.

    PubMed

    Lievens, Yolande; van den Bogaert, Walter; Kesteloot, Katrien

    2003-10-01

    The activity-based costing method was used to compute radiotherapy costs. This report describes the model developed, the calculated costs, and possible applications for the Leuven radiotherapy department. Activity-based costing is an advanced cost calculation technique that allocates resource costs to products based on activity consumption. In the Leuven model, a complex allocation principle with a large diversity of cost drivers was avoided by introducing an extra allocation step between activity groups and activities. A straightforward principle of time consumption, weighed by some factors of treatment complexity, was used. The model was developed in an iterative way, progressively defining the constituting components (costs, activities, products, and cost drivers). Radiotherapy costs are predominantly determined by personnel and equipment cost. Treatment-related activities consume the greatest proportion of the resource costs, with treatment delivery the most important component. This translates into products that have a prolonged total or daily treatment time being the most costly. The model was also used to illustrate the impact of changes in resource costs and in practice patterns. The presented activity-based costing model is a practical tool to evaluate the actual cost structure of a radiotherapy department and to evaluate possible resource or practice changes.
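The allocation principle described in this abstract (resource costs flow to activities, and activity costs flow to products in proportion to complexity-weighted time consumption) can be made concrete in a short sketch. All resource figures, activities, and weights below are invented for illustration; this is not the Leuven model itself:

```python
# Activity-based costing sketch: resources -> activities -> treatments,
# with treatment shares driven by complexity-weighted time consumption.
# All figures are hypothetical.

resource_costs = {"personnel": 600_000.0, "equipment": 400_000.0}

# Fraction of each resource consumed by each activity (each column sums to 1).
activity_shares = {
    "preparation":        {"personnel": 0.3, "equipment": 0.1},
    "treatment_delivery": {"personnel": 0.7, "equipment": 0.9},
}

# Treatments: per activity, (minutes consumed, complexity weight).
treatments = {
    "standard": {"preparation": (30, 1.0), "treatment_delivery": (200, 1.0)},
    "complex":  {"preparation": (60, 1.5), "treatment_delivery": (400, 1.3)},
}

# Step 1: allocate resource costs to activities.
activity_costs = {
    act: sum(resource_costs[r] * share for r, share in shares.items())
    for act, shares in activity_shares.items()
}

# Step 2: allocate each activity's cost to treatments by weighted minutes.
def treatment_cost(name):
    cost = 0.0
    for act, (minutes, weight) in treatments[name].items():
        total_weighted = sum(m * w
                             for t in treatments.values()
                             for a, (m, w) in t.items() if a == act)
        cost += activity_costs[act] * (minutes * weight) / total_weighted
    return cost

std_cost = treatment_cost("standard")
cpx_cost = treatment_cost("complex")
```

As in the paper's findings, treatments with longer or more complex delivery absorb the largest share of the (dominant) personnel and equipment costs, and the allocation distributes the full resource budget across the product mix.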

  9. Study of USGS/NASA land use classification system. [computer analysis from LANDSAT data

    NASA Technical Reports Server (NTRS)

    Spann, G. W.

    1975-01-01

The results of a computer mapping project using LANDSAT data and the USGS/NASA land use classification system are summarized. During the computer mapping portion of the project, accuracies of 67 percent to 79 percent were achieved using Level II of the classification system and a 4,000 acre test site centered on Douglasville, Georgia. Analysis of responses to a questionnaire circulated to actual and potential LANDSAT data users reveals several important findings: (1) there is a substantial desire for additional information related to LANDSAT capabilities; (2) a majority of the respondents feel computer mapping from LANDSAT data could aid present or future projects; and (3) the costs of computer mapping are substantially less than those of other methods.

  10. Costs of rearing children in agricultural economies: an alternative estimation approach and findings from rural Bangladesh.

    PubMed

    Khan, M M; Magnani, R J; Mock, N B; Saadat, Y S

    1993-03-01

    There are changes in child costs during demographic transition. This study examines household time allocation from 66 agricultural households in 3 villages in Tangail District in rural north central Bangladesh in 1984-85 (371 days). Component and total child-rearing costs are estimated in alternative ways. Conventional "opportunity wage" measures are considered overestimated. The methodological shortcomings of direct cost accounting procedures and consumer demand methods in computing time cost and monetary cost of child rearing are pointed out. In this study's alternative computation, age standardized equivalent costs are generated. Child food consumption costs were generated from a large national survey conducted in 1983. Nonfood expenditures were estimated by food to nonfood expenditure ratios taken from the aforementioned survey. For estimating breast-feeding costs, an estimate was produced based on the assumption that costs for infant food consumption were a fixed proportion of food costs for older children. Land ownership groups were set up to reflect socioeconomic status: 1) landless households, 2) marginal farm households with 1 acre or .4 hectares of land, 3) middle income households with 1-2 acres of land, 4) upper middle income households with 2-4 acres of land, and 5) upper income or rich households with over 4 acres of land. The nonmarket wage rate for hired household help was used to determine the value of cooking, fetching water, and household cleaning and repairing. The results confirm the low costs of child rearing in high fertility societies. Productive nonmarket activities are effective in subsidizing the costs of children. The addition of a child into households already with children has a low impact on time costs of children; "this economies of scale effect is estimated ... at 20%." The highest relative costs were found in the lowest income households, and the lowest costs were in the highest income households. 5% of total household income is

  11. Development of a small-scale computer cluster

    NASA Astrophysics Data System (ADS)

    Wilhelm, Jay; Smith, Justin T.; Smith, James E.

    2008-04-01

    Growing demand for computing power in academia has created a need for high-performance machines. The computing power of a single processor has been increasing steadily, but it lags behind the demand for fast simulations. Because a single processor has hard limits on its performance, a cluster of computers running the proper software can multiply the performance of a single machine, and cluster computing has therefore become a much sought-after technology. Typical desktop computers could be used for cluster computing, but they are not intended for constant full-speed operation and occupy more space than rack-mount servers. Specialty computers designed for clusters meet high-availability and space requirements, but can be costly. A market segment exists where custom-built desktop computers can be arranged in a rack-mount configuration, gaining the space savings of traditional rack-mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster from desktop components for the purpose of decreasing the computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components that multiplies the performance of a single desktop machine, while minimizing occupied space and remaining cost effective.

  12. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming

    Treesearch

    Philip A. Araman

    1990-01-01

    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...

  13. Heterogeneous high throughput scientific computing with APM X-Gene and Intel Xeon Phi

    DOE PAGES

    Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; ...

    2015-05-22

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. As a result, we report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).

  14. Cost-effectiveness of colorectal cancer screening with computed tomography colonography according to a polyp size threshold for polypectomy.

    PubMed

    Heresbach, Denis; Chauvin, Pauline; Hess-Migliorretti, Aurélie; Riou, Françoise; Grolier, Jacques; Josselin, Jean-Michel

    2010-06-01

    Computed tomography colonography (CTC) has acceptable accuracy in detecting colonic lesions, especially polyps of at least 6 mm. The aim of this analysis is to determine the cost-effectiveness of population-based screening for colorectal cancer (CRC) using CTC with a polyp size threshold. The cost-effectiveness ratios of CTC performed at 50, 60 and 70 years old, without (PL strategy) or with (TS strategy) a polyp size threshold, were compared using a Markov process. Incremental cost-effectiveness ratios (ICER) were calculated per life-year gained (LYG) for a time horizon of 30 years. The ICER of the PL and TS strategies were 12,042 and 2765 euro/LYG, associated with CRC prevention rates of 37.9 and 36.5%. The ICER of the PL and TS strategies dropped to 9687 and 1857 euro/LYG when advanced adenoma (AA) prevalence increased from 6.9 to 8.6% for male participants and from 3.8 to 4.9% for female participants, or to 9482 and 2067 euro/LYG when adenoma and AA annual recurrence rates dropped to 3.2 and 0.25%. The ICER for the PL and TS strategies decreased to 7947 and 954 euro/LYG when only two CTC were performed, at 50 and 60 years old. Conversely, the ICER did not significantly change when varying the population participation rate or the accuracy of CTC. CTC with a 6 mm threshold for polypectomy is associated with a substantial cost reduction without significant loss of efficacy. Cost-effectiveness depends more on the AA prevalence or transition rate to CRC than on CTC accuracy or screening compliance.
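    All of the incremental cost-effectiveness ratios quoted above follow the same arithmetic: the extra cost of one strategy over another, divided by the extra effect it buys. A minimal sketch in Python; the input numbers are illustrative and are not taken from the study.

```python
def icer(cost_new, cost_comparator, effect_new, effect_comparator):
    """Incremental cost-effectiveness ratio: extra cost per unit of
    extra effect (here, euros per life-year gained, LYG)."""
    return (cost_new - cost_comparator) / (effect_new - effect_comparator)

# Illustrative numbers only (not from the study): screening costs
# 1,500 euros more per person and gains 0.124 life-years on average.
euro_per_lyg = icer(2000.0, 500.0, 0.624, 0.500)
```

    A lower ratio, as with the TS strategy above, means each life-year gained costs the payer less.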

  15. Cost per remission and cost per response with infliximab, adalimumab, and golimumab for the treatment of moderately-to-severely active ulcerative colitis.

    PubMed

    Toor, Kabirraaj; Druyts, Eric; Jansen, Jeroen P; Thorlund, Kristian

    2015-06-01

    To determine the short-term costs per sustained remission and sustained response of three tumor necrosis factor inhibitors (infliximab, adalimumab, and golimumab) in comparison to conventional therapy for the treatment of moderately-to-severely active ulcerative colitis. A probabilistic Markov model was developed. This included an 8-week induction period, and 22 subsequent 2-week cycles (up to 1 year). The model included three disease states: remission, response, and relapse. Costs were from a Canadian public payer perspective. Estimates for the additional cost per 1 year of sustained remission and sustained response were obtained. Golimumab 100 mg provided the lowest cost per additional remission ($935) and cost per additional response ($701) compared with conventional therapy. Golimumab 50 mg yielded slightly higher costs than golimumab 100 mg. Infliximab was associated with the largest additional number of estimated remissions and responses, but also higher cost at $1975 per remission and $1311 per response. Adalimumab was associated with the largest cost per remission ($7430) and cost per response ($2361). The cost per additional remission and cost per additional response associated with infliximab vs golimumab 100 mg was $14,659 and $4753, respectively. The results suggest that the additional cost of 1 full year of remission and response are lowest with golimumab 100 mg, followed by golimumab 50 mg. Although infliximab has the highest efficacy, it did not exhibit the lowest cost per additional remission or response. Adalimumab produced the highest cost per additional remission and response.

  16. Development of computer software for pavement life cycle cost analysis.

    DOT National Transportation Integrated Search

    1988-01-01

    The life cycle cost analysis program (LCCA) is designed to automate and standardize life cycle costing in Virginia. It allows the user to input information necessary for the analysis, and it then completes the calculations and produces a printed copy...

  17. Cost and Cost-effectiveness of Students for Nutrition and Exercise (SNaX)

    PubMed Central

    Ladapo, Joseph A.; Bogart, Laura M.; Klein, David J.; Cowgill, Burton O.; Uyeda, Kimberly; Binkle, David G.; Stevens, Elizabeth R.; Schuster, Mark A.

    2015-01-01

    Objective To examine the cost and cost-effectiveness of implementing Students for Nutrition and eXercise (SNaX), a 5-week middle-school-based obesity-prevention intervention combining school-wide environmental changes, multimedia, encouragement to eat healthy school cafeteria foods, and peer-led education. Methods Five intervention and five control middle schools (mean enrollment = 1,520 students) from the Los Angeles Unified School District participated in a randomized controlled trial of SNaX. Acquisition costs for materials and time and wage data for employees involved in implementing the program were used to estimate fixed and variable costs. Cost-effectiveness was determined using the ratio of variable costs to program efficacy outcomes. Results The costs of implementing the program over 5 weeks were $5,433.26 per school in fixed costs and $2.11 per student in variable costs, equaling a total cost of $8,637.17 per school, or $0.23 per student per day. This investment yielded significant increases in the proportion of students served fruit and lunch and a significant decrease in the proportion of students buying snacks. The cost-effectiveness of the program, per student over 5 weeks, was $1.20 per additional fruit served during meals, $8.43 per additional full-priced lunch served, $2.11 per additional reduced-price/free lunch served, and $1.69 per reduction in snacks sold. Conclusions SNaX demonstrated the feasibility and cost-effectiveness of a middle-school-based obesity-prevention intervention combining school-wide environmental changes, multimedia, encouragement to eat healthy school cafeteria foods, and peer-led education. Its cost is modest and unlikely to be a significant barrier to adoption for many schools considering its implementation. PMID:26427719
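    The per-student-per-day figure reported above can be reproduced from the other numbers in the abstract. A small sketch, assuming 25 school days in the 5-week program (5 days per week, an assumption not stated in the abstract):

```python
def cost_per_student_day(total_cost_per_school, enrollment, school_days):
    """Total implementation cost spread over every enrolled student,
    per school day."""
    return total_cost_per_school / (enrollment * school_days)

# Figures from the abstract: $8,637.17 total per school, mean
# enrollment of 1,520 students; 25 school days is an assumption.
rate = cost_per_student_day(8637.17, 1520, 25)  # about $0.23
```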

  18. The Cost-Effectiveness of Dual Mobility Implants for Primary Total Hip Arthroplasty: A Computer-Based Cost-Utility Model.

    PubMed

    Barlow, Brian T; McLawhorn, Alexander S; Westrich, Geoffrey H

    2017-05-03

    Dislocation remains a clinically important problem following primary total hip arthroplasty, and it is a common reason for revision total hip arthroplasty. Dual mobility (DM) implants decrease the risk of dislocation but can be more expensive than conventional implants and have idiosyncratic failure mechanisms. The purpose of this study was to investigate the cost-effectiveness of DM implants compared with conventional bearings for primary total hip arthroplasty. Markov model analysis was conducted from the societal perspective with use of direct and indirect costs. Costs, expressed in 2013 U.S. dollars, were derived from the literature, the National Inpatient Sample, and the Centers for Medicare & Medicaid Services. Effectiveness was expressed in quality-adjusted life years (QALYs). The model was populated with health state utilities and state transition probabilities derived from previously published literature. The analysis was performed for a patient's lifetime, and costs and effectiveness were discounted at 3% annually. The principal outcome was the incremental cost-effectiveness ratio (ICER), with a willingness-to-pay threshold of $100,000/QALY. Sensitivity analyses were performed to explore relevant uncertainty. In the base case, DM total hip arthroplasty showed absolute dominance over conventional total hip arthroplasty, with lower accrued costs ($39,008 versus $40,031 U.S. dollars) and higher accrued utility (13.18 versus 13.13 QALYs) indicating cost-savings. DM total hip arthroplasty ceased being cost-saving when its implant costs exceeded those of conventional total hip arthroplasty by $1,023, and the cost-effectiveness threshold for DM implants was $5,287 greater than that for conventional implants. DM was not cost-effective when the annualized incremental probability of revision from any unforeseen failure mechanism or mechanisms exceeded 0.29%. The probability of intraprosthetic dislocation exerted the most influence on model results. This model
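    The 3% annual discounting of costs and effectiveness used in the model works in the standard way: a cost or QALY accrued t years in the future is divided by (1 + 0.03)^t before being summed. A minimal sketch; the value streams are illustrative, not model outputs:

```python
def discounted_total(annual_values, rate=0.03):
    """Present value of a stream of yearly costs or QALYs: year 0 is
    undiscounted, and year t is divided by (1 + rate) ** t."""
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(annual_values))

# Illustrative: one QALY accrued per year for three years comes to
# slightly less than 3.0 discounted QALYs.
qalys = discounted_total([1.0, 1.0, 1.0])
```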

  19. Additional extensions to the NASCAP computer code, volume 3

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  20. The Challenge of Computers.

    ERIC Educational Resources Information Center

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  1. Cost effectiveness of computer tailored and non-tailored smoking cessation letters in general practice: randomised controlled trial

    PubMed Central

    Lennox, A Scott; Osman, Liesl M; Reiter, Ehud; Robertson, Roma; Friend, James; McCann, Ian; Skatun, Diane; Donnan, Peter T

    2001-01-01

    Objectives To develop and evaluate, in a primary care setting, a computerised system for generating tailored letters about smoking cessation. Design Randomised controlled trial. Setting Six general practices in Aberdeen, Scotland. Participants 2553 smokers aged 17 to 65. Interventions All participants received a questionnaire asking about their smoking. Participants subsequently received either a computer tailored or a non-tailored, standard letter on smoking cessation, or no letter. Main outcome measures Prevalence of validated abstinence at six months; change in intention to stop smoking in the next six months. Results The validated cessation rate at six months was 3.5% (30/857) (95% confidence interval 2.3% to 4.7%) for the tailored letter group, 4.4% (37/846) (3.0% to 5.8%) for the non-tailored letter group, and 2.6% (22/850) (1.5% to 3.7%) for the control (no letter) group. After adjustment for significant covariates, the cessation rate was 66% greater (−4% to 186%; P=0.07) in the non-tailored letter group than in the no letter group. Among participants who smoked <20 cigarettes per day, the cessation rate in the non-tailored letter group was 87% greater (0% to 246%; P=0.05) than in the no letter group. Among heavy smokers who did not quit, the tailored letter group showed a 76% higher rate of positive shift in “stage of change” (intention to quit within a particular period of time) than those who received no letter (11% to 180%; P=0.02). The increase in cost for each additional quitter in the non-tailored letter group compared with the no letter group was £89. Conclusions In a large general practice, a brief non-tailored letter effectively increased cessation rates among smokers. A tailored letter was not effective in increasing cessation rates but promoted shift in movement towards cessation (“stage of change”) in heavy smokers. As a pragmatic tool to encourage cessation of smoking, a mass mailing of non-tailored letters from general practices is more

  2. Impact of Classroom Computer Use on Computer Anxiety.

    ERIC Educational Resources Information Center

    Lambert, Matthew E.; And Others

    Increasing use of computer programs for undergraduate psychology education has raised concern over the impact of computer anxiety on educational performance. Additionally, some researchers have indicated that classroom computer use can exacerbate pre-existing computer anxiety. To evaluate the relationship between in-class computer use and computer…

  3. Architecture and data processing alternatives for the TSE computer. Volume 2: Extraction of topological information from an image by the Tse computer

    NASA Technical Reports Server (NTRS)

    Jones, J. R.; Bodenheimer, R. E.

    1976-01-01

    A simple programmable Tse processor organization and the arithmetic operations necessary for extraction of the desired topological information are described. Hardware additions to this organization are discussed, along with trade-offs peculiar to the Tse computing concept. An improved organization is presented, along with the complementary software for the various arithmetic operations. The performance of the two organizations is compared in terms of speed, power, and cost. Software routines developed to extract the desired information from an image are included.

  4. Additive direct-write microfabrication for MEMS: A review

    NASA Astrophysics Data System (ADS)

    Teh, Kwok Siong

    2017-12-01

    Direct-write additive manufacturing refers to a rich and growing repertoire of well-established fabrication techniques that build solid objects directly from computer-generated solid models without elaborate intermediate fabrication steps. At the macroscale, direct-write techniques such as stereolithography, selective laser sintering, fused deposition modeling, ink-jet printing, and laminated object manufacturing have significantly reduced concept-to-product lead time, enabled complex geometries, and, importantly, led to the renaissance in fabrication known as the maker movement. The technological premise of all direct-write additive manufacturing is identical: computer-generated three-dimensional models are converted into layers of two-dimensional planes, or slices, which are then reconstructed sequentially into three-dimensional solid objects in a layer-by-layer format. The key differences between the various additive manufacturing techniques are the means of creating the finished layers and the ancillary processes that accompany them. While still in its infancy, direct-write additive manufacturing at the microscale has the potential to significantly lower the barrier of entry, in terms of cost, time and training, for the prototyping and fabrication of MEMS parts that have larger dimensions, high aspect ratios, and complex shapes. In recent years, significant advancements in materials chemistry, laser technology, heat and fluid modeling, and control systems have enabled additive manufacturing to achieve higher resolutions at the micrometer and nanometer length scales, making it a viable technology for MEMS fabrication. Compared to traditional MEMS processes that rely heavily on expensive equipment and time-consuming steps, direct-write additive manufacturing techniques allow for rapid design-to-prototype realization by limiting or circumventing the need for cleanrooms, photolithography and extensive training. With current direct-write additive

  5. Cloud Computing with iPlant Atmosphere.

    PubMed

    McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos

    2013-10-15

    Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.

  6. Cost Optimization Model for Business Applications in Virtualized Grid Environments

    NASA Astrophysics Data System (ADS)

    Strebel, Jörg

    The advent of Grid computing gives enterprises an ever increasing choice of computing options, yet research has so far hardly addressed the problem of mixing the different computing options in a cost-minimal fashion. The following paper presents a comprehensive cost model and a mixed integer optimization model which can be used to minimize the IT expenditures of an enterprise and help in decision-making when to outsource certain business software applications. A sample scenario is analyzed and promising cost savings are demonstrated. Possible applications of the model to future research questions are outlined.
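    The decision the paper models, which computing option to assign to each application so that total cost is minimal, can be illustrated with a brute-force enumeration standing in for the paper's mixed integer program. All application names and costs below are hypothetical:

```python
from itertools import product

# Hypothetical annual costs per application and sourcing option
# (purely illustrative; not taken from the paper).
apps = {
    "crm":       {"in_house": 120.0, "grid": 90.0},
    "billing":   {"in_house": 80.0,  "grid": 95.0},
    "reporting": {"in_house": 60.0,  "grid": 40.0},
}

def cheapest_assignment(apps):
    """Enumerate every in-house/grid assignment and return the
    (total_cost, assignment) pair with minimal total cost."""
    names = list(apps)
    best = None
    for choice in product(["in_house", "grid"], repeat=len(names)):
        total = sum(apps[a][c] for a, c in zip(names, choice))
        if best is None or total < best[0]:
            best = (total, dict(zip(names, choice)))
    return best

cost, plan = cheapest_assignment(apps)
```

    With independent per-application costs the choices decouple, but because the enumeration scores whole assignments, it also handles coupled costs (e.g. shared license fees), which is where a real mixed integer solver becomes necessary at scale.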

  7. Adapting smartphones for low-cost optical medical imaging

    NASA Astrophysics Data System (ADS)

    Pratavieira, Sebastião.; Vollet-Filho, José D.; Carbinatto, Fernanda M.; Blanco, Kate; Inada, Natalia M.; Bagnato, Vanderlei S.; Kurachi, Cristina

    2015-06-01

    Optical images have been used in several medical situations to improve the diagnosis of lesions or to monitor treatments. However, most systems employ expensive scientific (CCD or CMOS) cameras and need computers to display and save the images, usually resulting in a high final cost for the system. Additionally, operating this sort of apparatus usually becomes more complex, requiring increasingly specialized technical knowledge from the operator. Currently, the number of people using smartphone-like devices with built-in high quality cameras is increasing, which might allow such devices to be used as efficient, lower cost, portable imaging systems for medical applications. Thus, we aim to develop methods of adapting those devices to optical medical imaging techniques, such as fluorescence. In particular, smartphone covers were adapted to connect a smartphone-like device to widefield fluorescence imaging systems. These systems were used to detect lesions in different tissues, such as cervix and mouth/throat mucosa, and to monitor ALA-induced protoporphyrin-IX formation for photodynamic treatment of Cervical Intraepithelial Neoplasia. This approach may contribute significantly to low-cost, portable and simple clinical optical imaging collection.

  8. Out-of-hours primary care. Implications of organisation on costs

    PubMed Central

    van Uden, Caro JT; Ament, Andre JHA; Voss, Gemma BWE; Wesseling, Geertjan; Winkens, Ron AG; van Schayck, Onno CP; Crebolder, Harry FJM

    2006-01-01

    Background To perform out-of-hours primary care, Dutch general practitioners (GPs) have organised themselves in large-scale GP cooperatives. Roughly, two models of out-of-hours care can be distinguished: GP cooperatives working separate from the hospital emergency department (ED), and GP cooperatives integrated with the hospital ED. Research has shown differences in care utilisation between these two models: in the integrated model, a significant shift from utilisation of ED care to primary care. These differences may have implications for costs; however, until now this has not been investigated. This study was performed to provide insight into the costs of these two different models of out-of-hours care. Methods Annual reports of two GP cooperatives (one separate from and one integrated with a hospital emergency department) in 2003 were analysed on costs and use of out-of-hours care. Costs were calculated per capita, and comparisons were made between the two cooperatives. In addition, the costs of the hospital ED of the integrated model before and after the set-up of the GP cooperative were compared. Results Costs per capita of the GP cooperative in the integrated model were slightly higher than in the separate model (€11.47 and €10.54 respectively). Differences were mainly caused by personnel and other costs, including transportation, interest, cleaning, computers and overhead. Despite a significant reduction in patients utilising ED care as a result of the introduction of the GP cooperative integrated within the ED, the costs of the ED remained the same. Conclusion The study results show that the costs of primary care appear to be more dependent on the size of the population the cooperative covers than on the way the GP cooperative is organised, i.e. separated versus integrated. In addition, despite the substantial reduction in patients, locating the GP cooperative at the same site as the ED was found to have little effect on the costs of the ED.

  9. Out-of-hours primary care. Implications of organisation on costs.

    PubMed

    van Uden, Caro J T; Ament, Andre J H A; Voss, Gemma B W E; Wesseling, Geertjan; Winkens, Ron A G; van Schayck, Onno C P; Crebolder, Harry F J M

    2006-05-04

    To perform out-of-hours primary care, Dutch general practitioners (GPs) have organised themselves in large-scale GP cooperatives. Roughly, two models of out-of-hours care can be distinguished: GP cooperatives working separate from the hospital emergency department (ED), and GP cooperatives integrated with the hospital ED. Research has shown differences in care utilisation between these two models: in the integrated model, a significant shift from utilisation of ED care to primary care. These differences may have implications for costs; however, until now this has not been investigated. This study was performed to provide insight into the costs of these two different models of out-of-hours care. Annual reports of two GP cooperatives (one separate from and one integrated with a hospital emergency department) in 2003 were analysed on costs and use of out-of-hours care. Costs were calculated per capita, and comparisons were made between the two cooperatives. In addition, the costs of the hospital ED of the integrated model before and after the set-up of the GP cooperative were compared. Costs per capita of the GP cooperative in the integrated model were slightly higher than in the separate model (€11.47 and €10.54 respectively). Differences were mainly caused by personnel and other costs, including transportation, interest, cleaning, computers and overhead. Despite a significant reduction in patients utilising ED care as a result of the introduction of the GP cooperative integrated within the ED, the costs of the ED remained the same. The study results show that the costs of primary care appear to be more dependent on the size of the population the cooperative covers than on the way the GP cooperative is organised, i.e. separated versus integrated. In addition, despite the substantial reduction in patients, locating the GP cooperative at the same site as the ED was found to have little effect on the costs of the ED. Sharing more facilities

  10. Computer-aided communication satellite system analysis and optimization

    NASA Technical Reports Server (NTRS)

    Stagl, T. W.; Morgan, N. H.; Morley, R. E.; Singh, J. P.

    1973-01-01

    The capabilities and limitations of the various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. A Satellite Telecommunication Analysis and Modeling Program (STAMP) for costing and sensitivity analysis work in the application of communication satellites to educational development is given. The modifications made to STAMP include: extension of the six-beam capability to eight; addition of the generation of multiple beams from a single reflector system with an array of feeds; improved system costing to reflect the time value of money and the growth in earth terminal population with time, and to account for various measures of system reliability; inclusion of a model for scintillation at microwave frequencies in the communication link loss model; and an updated technological environment.

  11. Avoiding the Enumeration of Infeasible Elementary Flux Modes by Including Transcriptional Regulatory Rules in the Enumeration Process Saves Computational Costs

    PubMed Central

    Jungreuthmayer, Christian; Ruckerbauer, David E.; Gerstl, Matthias P.; Hanscho, Michael; Zanghellini, Jürgen

    2015-01-01

    Despite the significant progress made in recent years, the computation of the complete set of elementary flux modes of large or even genome-scale metabolic networks is still impossible. We introduce a novel approach to speed up the calculation of elementary flux modes by including transcriptional regulatory information in the analysis of metabolic networks. Taking gene regulation into account dramatically reduces the solution space and allows the presented algorithm to constantly eliminate biologically infeasible modes at an early stage of the computation procedure. As a result, computational costs such as runtime, memory usage, and disk space are greatly reduced. Moreover, we show that the application of transcriptional rules identifies non-trivial system-wide effects on metabolism. Using the presented algorithm pushes the size of metabolic networks that can be studied by elementary flux modes to new and much higher limits without loss of predictive quality. This makes unbiased, system-wide predictions in large-scale metabolic networks possible without resorting to any optimization principle. PMID:26091045

  12. The cost-effectiveness and budget impact of stepwise addition of bolus insulin in the treatment of type 2 diabetes: evaluation of the FullSTEP trial.

    PubMed

    Saunders, Rhodri; Lian, Jean; Karolicki, Boris; Valentine, William

    2014-12-01

    Background and aims: Intensification of basal insulin-only therapy in type 2 diabetes is often achieved through the addition of bolus insulin 3 times daily. The FullSTEP trial demonstrated that stepwise addition (SWA) of bolus insulin aspart was non-inferior to full basal-bolus (FBB) therapy and reduced the rate of hypoglycemia. Here the cost-effectiveness and budget impact of SWA are evaluated. Cost-effectiveness and budget impact models were developed to assess the cost and quality-of-life (QoL) implications of intensification using SWA compared with FBB in the US setting. At assessment, SWA patients added one bolus dose to their current regimen if the HbA1c target was not met; SWA patients reaching three bolus doses used FBB event rates. Outcomes were evaluated at trial end and projected annually up to 5 years. The models captured hypoglycemic events, the proportion of patients meeting the HbA1c target, and self-measured blood glucose. Event rates and QoL utilities were taken from trial data and published literature. Costs were evaluated from a healthcare-payer perspective, reported in 2013 USD, and discounted (like clinical outcomes) at 3.5% annually. This analysis applies to patients with HbA1c 7.0-9.0% and body mass index <40 kg/m(2). SWA was associated with improved QoL and reduced costs compared with FBB; the improvement in QoL and the cost reduction were driven by lower rates of hypoglycemia. Sensitivity analyses showed that outcomes were most influenced by the cost of bolus insulin and the QoL impact of symptomatic hypoglycemia. The budget impact analysis estimated that, by moving from FBB to SWA, a health plan with 77,000 patients with type 2 diabetes, of whom 7.8% annually intensified to basal-bolus therapy, would save USD 1304 per intensifying patient over the trial period. SWA of bolus insulin should be considered a beneficial and cost-saving alternative to FBB therapy for the intensification of treatment in type 2 diabetes.
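    The plan-level implication of the quoted figures is a simple product, assuming the USD 1304 saving applies uniformly to every intensifying patient:

```python
def plan_savings(plan_size, intensifying_share, saving_per_patient):
    """Trial-period savings for a health plan, if each intensifying
    patient realizes the reported per-patient saving."""
    return plan_size * intensifying_share * saving_per_patient

# Figures from the abstract: 77,000 patients with type 2 diabetes,
# 7.8% intensifying annually, USD 1304 saved per intensifying patient.
total_usd = plan_savings(77_000, 0.078, 1304)  # roughly USD 7.8 million
```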

  13. Gedanken Experiments in Educational Cost Effectiveness

    ERIC Educational Resources Information Center

    Brudner, Harvey J.

    1978-01-01

    Discusses the effectiveness of cost determining techniques in education. The areas discussed are: education and management; cost-effectiveness models; figures of merit determination; and the implications as they relate to the areas of audio-visual and computer educational technology. (Author/GA)

  14. Towards a Low-Cost Real-Time Photogrammetric Landslide Monitoring System Utilising Mobile and Cloud Computing Technology

    NASA Astrophysics Data System (ADS)

    Chidburee, P.; Mills, J. P.; Miller, P. E.; Fieber, K. D.

    2016-06-01

    Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard at a local-scale.

  15. Future Costs, Fixed Healthcare Budgets, and the Decision Rules of Cost-Effectiveness Analysis.

    PubMed

    van Baal, Pieter; Meltzer, David; Brouwer, Werner

    2016-02-01

    Life-saving medical technologies result in additional demand for health care due to increased life expectancy. However, most economic evaluations do not include all medical costs that may result from this additional demand in health care and include only future costs of related illnesses. Although there has been much debate regarding the extent to which future costs should be included from a societal perspective, the appropriate role of future medical costs in the widely adopted but more narrow healthcare perspective has been neglected. Using a theoretical model, we demonstrate that optimal decision rules for cost-effectiveness analyses assuming fixed healthcare budgets dictate that future costs of both related and unrelated medical care should be included. Practical relevance of including the costs of future unrelated medical care is illustrated using the example of transcatheter aortic valve implantation. Our findings suggest that guidelines should prescribe inclusion of these costs. Copyright © 2014 John Wiley & Sons, Ltd.

  16. Dopamine Manipulation Affects Response Vigor Independently of Opportunity Cost.

    PubMed

    Zénon, Alexandre; Devesse, Sophie; Olivier, Etienne

    2016-09-14

    Dopamine is known to be involved in regulating effort investment in relation to reward, and the disruption of this mechanism is thought to be central in some pathological situations such as Parkinson's disease, addiction, and depression. According to an influential model, dopamine plays this role by encoding the opportunity cost, i.e., the average value of forfeited actions, which is an important parameter to take into account when making decisions about which action to undertake and how fast to execute it. We tested this hypothesis by asking healthy human participants to perform two effort-based decision-making tasks, following either placebo or levodopa intake in a double blind within-subject protocol. In the effort-constrained task, there was a trade-off between the amount of force exerted and the time spent in executing the task, such that investing more effort decreased the opportunity cost. In the time-constrained task, the effort duration was constant, but exerting more force allowed the subject to earn more substantial reward instead of saving time. Contrary to the model predictions, we found that levodopa caused an increase in the force exerted only in the time-constrained task, in which there was no trade-off between effort and opportunity cost. In addition, a computational model showed that dopamine manipulation left the opportunity cost factor unaffected but altered the ratio between the effort cost and reinforcement value. These findings suggest that dopamine does not represent the opportunity cost but rather modulates how much effort a given reward is worth. Dopamine has been proposed in a prevalent theory to signal the average reward rate, used to estimate the cost of investing time in an action, also referred to as opportunity cost. We contrasted the effect of dopamine manipulation in healthy participants in two tasks, in which increasing response vigor (i.e., the amount of effort invested in an action) allowed either to save time or to earn more

  17. 24 CFR 208.112 - Cost.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... increases. (b) At the owner's option, the cost of the computer software may include service contracts to... requirements. (c) The source of funds for the purchase of hardware or software, or contracting for services for... formatted data, including either the purchase and maintenance of computer hardware or software, or both, the...

  18. Cost utility, budget impact, and scenario analysis of racecadotril in addition to oral rehydration for acute diarrhea in children in Malaysia

    PubMed Central

    Rautenberg, Tamlyn Anne; Zerwes, Ute; Lee, Way Seah

    2018-01-01

    Objective To perform cost utility (CU) and budget impact (BI) analyses augmented by scenario analyses of critical model structure components to evaluate racecadotril as adjuvant to oral rehydration solution (ORS) for children under 5 years with acute diarrhea in Malaysia. Methods A CU model was adapted to evaluate racecadotril plus ORS vs ORS alone for acute diarrhea in children younger than 5 years from a Malaysian public payer’s perspective. A bespoke BI analysis was undertaken in addition to detailed scenario analyses with respect to critical model structure components. Results According to the CU model, the intervention is less costly and more effective than comparator for the base case with a dominant incremental cost-effectiveness ratio of −RM 1,272,833/quality-adjusted life year (USD −312,726/quality-adjusted life year) in favor of the intervention. According to the BI analysis (assuming an increase of 5% market share per year for racecadotril+ORS for 5 years), the total cumulative incremental percentage reduction in health care expenditure for diarrhea in children is 0.136578%, resulting in a total potential cumulative cost savings of −RM 73,193,603 (USD −17,983,595) over a 5-year period. Results hold true across a range of plausible scenarios focused on critical model components. Conclusion Adjuvant racecadotril vs ORS alone is potentially cost-effective from a Malaysian public payer perspective subject to the assumptions and limitations of the model. BI analysis shows that this translates into potential cost savings for the Malaysian public health care system. Results hold true at evidence-based base case values and over a range of alternate scenarios. PMID:29588606

  19. A Home Computer Primer.

    ERIC Educational Resources Information Center

    Stone, Antonia

    1982-01-01

    Provides general information on currently available microcomputers, computer programs (software), hardware requirements, software sources, costs, computer games, and programing. Includes a list of popular microcomputers, providing price category, model, list price, software (cassette, tape, disk), monitor specifications, amount of random access…

  20. Clinical and cost-effectiveness of the Lightning Process in addition to specialist medical care for paediatric chronic fatigue syndrome: randomised controlled trial

    PubMed Central

    Crawley, Esther M; Gaunt, Daisy M; Garfield, Kirsty; Hollingworth, William; Sterne, Jonathan A C; Beasant, Lucy; Collin, Simon M; Mills, Nicola; Montgomery, Alan A

    2018-01-01

    Objective Investigate the effectiveness and cost-effectiveness of the Lightning Process (LP) in addition to specialist medical care (SMC) compared with SMC alone, for children with chronic fatigue syndrome (CFS)/myalgic encephalomyelitis (ME). Design Pragmatic randomised controlled open trial. Participants were randomly assigned to SMC or SMC+LP. Randomisation was minimised by age and gender. Setting Specialist paediatric CFS/ME service. Patients 12–18 year olds with mild/moderate CFS/ME. Main outcome measures The primary outcome was the 36-Item Short-Form Health Survey Physical Function Subscale (SF-36-PFS) at 6 months. Secondary outcomes included pain, anxiety, depression, school attendance and cost-effectiveness from a health service perspective at 3, 6 and 12 months. Results We recruited 100 participants, of whom 51 were randomised to SMC+LP. Data from 81 participants were analysed at 6 months. Physical function (SF-36-PFS) was better in those allocated SMC+LP (adjusted difference in means 12.5(95% CI 4.5 to 20.5), p=0.003) and this improved further at 12 months (15.1 (5.8 to 24.4), p=0.002). At 6 months, fatigue and anxiety were reduced, and at 12 months, fatigue, anxiety, depression and school attendance had improved in the SMC+LP arm. Results were similar following multiple imputation. SMC+LP was probably more cost-effective in the multiple imputation dataset (difference in means in net monetary benefit at 12 months £1474(95% CI £111 to £2836), p=0.034) but not for complete cases. Conclusion The LP is effective and is probably cost-effective when provided in addition to SMC for mild/moderately affected adolescents with CFS/ME. Trial registration number ISRCTN81456207. PMID:28931531
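The net monetary benefit (NMB) statistic reported in this trial combines incremental QALYs and incremental cost at a chosen willingness-to-pay threshold. A minimal sketch of the standard definition, with hypothetical inputs (not the trial's data):

```python
def net_monetary_benefit(delta_qaly, delta_cost, wtp):
    """NMB = willingness-to-pay x incremental QALYs - incremental cost.
    Positive values favour the intervention at that threshold."""
    return wtp * delta_qaly - delta_cost

# Hypothetical: 0.1 incremental QALYs, 500 incremental cost,
# at a 20,000-per-QALY willingness-to-pay threshold
nmb = net_monetary_benefit(0.1, 500.0, 20000.0)
```

A between-arm difference in mean NMB greater than zero (as in the trial's £1474 estimate) indicates the intervention is probably cost-effective at the assumed threshold.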

  1. Computer-assisted Behavioral Therapy and Contingency Management for Cannabis Use Disorder

    PubMed Central

    Budney, Alan J.; Stanger, Catherine; Tilford, J. Mick; Scherer, Emily; Brown, Pamela C.; Li, Zhongze; Li, Zhigang; Walker, Denise

    2015-01-01

    Computer-assisted behavioral treatments hold promise for enhancing access to and reducing costs of treatments for substance use disorders. This study assessed the efficacy of a computer-assisted version of an efficacious, multicomponent treatment for cannabis use disorders (CUD), i.e., motivational enhancement therapy, cognitive-behavioral therapy, and abstinence-based contingency-management (MET/CBT/CM). An initial cost comparison was also performed. Seventy-five adult participants, 59% African Americans, seeking treatment for CUD received either MET only (BRIEF), therapist-delivered MET/CBT/CM (THERAPIST), or computer-delivered MET/CBT/CM (COMPUTER). During treatment, the THERAPIST and COMPUTER conditions engendered longer durations of continuous cannabis abstinence than BRIEF (p < .05), but did not differ from each other. Abstinence rates and reduction in days of use over time were maintained in COMPUTER at least as well as in THERAPIST. COMPUTER averaged approximately $130 (p < .05) less per case than THERAPIST in therapist costs, which offset most of the costs of CM. Results add to promising findings that illustrate potential for computer-assisted delivery methods to enhance access to evidence-based care, reduce costs, and possibly improve outcomes. The observed maintenance effects and the cost findings require replication in larger clinical trials. PMID:25938629

  2. Cost-effectiveness of per oral endoscopic myotomy relative to laparoscopic Heller myotomy for the treatment of achalasia.

    PubMed

    Greenleaf, Erin K; Winder, Joshua S; Hollenbeak, Christopher S; Haluck, Randy S; Mathew, Abraham; Pauli, Eric M

    2018-01-01

    Per oral endoscopic myotomy (POEM) has recently emerged as a viable option relative to the classic approach of laparoscopic Heller myotomy (LHM) for the treatment of esophageal achalasia. In this cost-utility analysis of POEM and LHM, we hypothesized that POEM would be cost-effective relative to LHM. A stochastic cost-utility analysis of treatment for achalasia was performed to determine the cost-effectiveness of POEM relative to LHM. Costs were estimated from the provider perspective and obtained from our institution's cost-accounting database. The measure of effectiveness was quality-adjusted life years (QALYs) which were estimated from direct elicitation of utility using a visual analog scale. The primary outcome was the incremental cost-effectiveness ratio (ICER). Uncertainty was assessed by bootstrapping the sample and computing the cost-effectiveness acceptability curve (CEAC). Patients treated within an 11-year period (2004-2016) were recruited for participation (20 POEM, 21 LHM). During the index admission, the mean costs for POEM ($8630 ± $2653) and the mean costs for LHM ($7604 ± $2091) were not significantly different (P = 0.179). Additionally, mean QALYs for POEM (0.413 ± 0.248) were higher than that associated with LHM (0.357 ± 0.338), but this difference was also not statistically significant (P = 0.55). The ICER suggested that it would cost an additional $18,536 for each QALY gained using POEM. There was substantial uncertainty in the ICER; there was a 48.25% probability that POEM was cost-effective at the mean ICER. At a willingness-to-pay threshold of $100,000, there was a 68.31% probability that POEM was cost-effective relative to LHM. In the treatment of achalasia, POEM appears to be cost-effective relative to LHM depending on one's willingness-to-pay for an additional QALY.
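The incremental cost-effectiveness ratio (ICER) in this abstract is the extra cost per QALY gained. A minimal sketch using the mean costs and QALYs reported above; note that the point estimate from these rounded means (~$18,321) only approximates the paper's $18,536, which presumably reflects the full bootstrapped analysis:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Mean values reported in the abstract (POEM vs. LHM)
ratio = icer(8630.0, 7604.0, 0.413, 0.357)
```

At a willingness-to-pay threshold of $100,000 per QALY, a ratio of this size is well below the threshold, consistent with the abstract's conclusion that POEM is probably cost-effective despite the uncertainty.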

  3. 14 CFR 152.203 - Allowable project costs.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Allowable project costs. 152.203 Section...) AIRPORTS AIRPORT AID PROGRAM Funding of Approved Projects § 152.203 Allowable project costs. (a) Airport development. To be an allowable project cost, for the purposes of computing the amount of an airport...

  4. 14 CFR 152.203 - Allowable project costs.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Allowable project costs. 152.203 Section...) AIRPORTS AIRPORT AID PROGRAM Funding of Approved Projects § 152.203 Allowable project costs. (a) Airport development. To be an allowable project cost, for the purposes of computing the amount of an airport...

  5. 14 CFR 152.203 - Allowable project costs.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Allowable project costs. 152.203 Section...) AIRPORTS AIRPORT AID PROGRAM Funding of Approved Projects § 152.203 Allowable project costs. (a) Airport development. To be an allowable project cost, for the purposes of computing the amount of an airport...

  6. 14 CFR 152.203 - Allowable project costs.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Allowable project costs. 152.203 Section...) AIRPORTS AIRPORT AID PROGRAM Funding of Approved Projects § 152.203 Allowable project costs. (a) Airport development. To be an allowable project cost, for the purposes of computing the amount of an airport...

  7. 14 CFR 152.203 - Allowable project costs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Allowable project costs. 152.203 Section...) AIRPORTS AIRPORT AID PROGRAM Funding of Approved Projects § 152.203 Allowable project costs. (a) Airport development. To be an allowable project cost, for the purposes of computing the amount of an airport...

  8. Health Monitoring System Technology Assessments: Cost Benefits Analysis

    NASA Technical Reports Server (NTRS)

    Kent, Renee M.; Murphy, Dennis A.

    2000-01-01

    The subject of sensor-based structural health monitoring is very diverse and encompasses a wide range of activities including initiatives and innovations involving the development of advanced sensor, signal processing, data analysis, and actuation and control technologies. In addition, it embraces the consideration of the availability of low-cost, high-quality contributing technologies, computational utilities, and hardware and software resources that enable the operational realization of robust health monitoring technologies. This report presents a detailed analysis of the cost benefit and other logistics and operational considerations associated with the implementation and utilization of sensor-based technologies for use in aerospace structure health monitoring. The scope of this volume is to assess the economic impact, from an end-user perspective, of implementing health monitoring technologies on three structures. It specifically focuses on evaluating the impact on maintaining and supporting these structures with and without health monitoring capability.

  9. Arduino: a low-cost multipurpose lab equipment.

    PubMed

    D'Ausilio, Alessandro

    2012-06-01

    Typical experiments in psychological and neurophysiological settings often require the accurate control of multiple input and output signals. These signals are often generated or recorded via computer software and/or external dedicated hardware. Dedicated hardware is usually very expensive and requires additional software to control its behavior. In the present article, I present some accuracy tests on a low-cost and open-source I/O board (Arduino family) that may be useful in many lab environments. One of the strengths of Arduinos is the possibility they afford to load the experimental script on the board's memory and let it run without interfacing with computers or external software, thus granting complete independence, portability, and accuracy. Furthermore, a large community has arisen around the Arduino idea and offers many hardware add-ons and hundreds of free scripts for different projects. Accuracy tests show that Arduino boards may be an inexpensive tool for many psychological and neurophysiological labs.

  10. 33 CFR 239.8 - Cost sharing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... to provide additional cost sharing to reflect special local benefits or betterments. Such additional... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Cost sharing. 239.8 Section 239.8... RESOURCES POLICIES AND AUTHORITIES: FEDERAL PARTICIPATION IN COVERED FLOOD CONTROL CHANNELS § 239.8 Cost...

  11. Cost and Cost-Effectiveness of Students for Nutrition and eXercise (SNaX).

    PubMed

    Ladapo, Joseph A; Bogart, Laura M; Klein, David J; Cowgill, Burton O; Uyeda, Kimberly; Binkle, David G; Stevens, Elizabeth R; Schuster, Mark A

    2016-04-01

    To examine the cost and cost-effectiveness of implementing Students for Nutrition and eXercise (SNaX), a 5-week middle school-based obesity-prevention intervention combining school-wide environmental changes, multimedia, encouragement to eat healthy school cafeteria foods, and peer-led education. Five intervention and 5 control middle schools (mean enrollment, 1520 students) from the Los Angeles Unified School District participated in a randomized controlled trial of SNaX. Acquisition costs for materials and time and wage data for employees involved in implementing the program were used to estimate fixed and variable costs. Cost-effectiveness was determined using the ratio of variable costs to program efficacy outcomes. The costs of implementing the program over 5 weeks were $5433.26 per school in fixed costs and $2.11 per student in variable costs, equaling a total cost of $8637.17 per school, or $0.23 per student per day. This investment yielded significant increases in the proportion of students served fruit and lunch and a significant decrease in the proportion of students buying snacks. The cost-effectiveness of the program, per student over 5 weeks, was $1.20 per additional fruit served during meals, $8.43 per additional full-priced lunch served, $2.11 per additional reduced-price/free lunch served, and $1.69 per reduction in snacks sold. SNaX demonstrated the feasibility and cost-effectiveness of a middle school-based obesity-prevention intervention combining school-wide environmental changes, multimedia, encouragement to eat healthy school cafeteria foods, and peer-led education. Its cost is modest and unlikely to be a significant barrier to adoption for many schools considering its implementation. Copyright © 2016 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
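The cost figures in this abstract follow a fixed-plus-variable structure: total per-school cost = fixed cost + variable cost x enrollment. A sketch using the reported means; the small gap between the computed total (~$8640) and the reported $8637.17 is presumably due to per-school enrollment differences, and the 25 school days assumes 5 school days per week over the 5-week program:

```python
fixed_cost = 5433.26     # per school (materials, setup)
variable_cost = 2.11     # per student
enrollment = 1520        # mean enrollment per school
school_days = 25         # 5 weeks x 5 school days/week (assumption)

total = fixed_cost + variable_cost * enrollment        # ~ $8640 per school
per_student_day = total / (enrollment * school_days)   # ~ $0.23 per student per day
```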

  12. Program Tracks Cost Of Travel

    NASA Technical Reports Server (NTRS)

    Mauldin, Lemuel E., III

    1993-01-01

    Travel Forecaster is menu-driven, easy-to-use computer program that plans, forecasts cost, and tracks actual vs. planned cost of business-related travel of division or branch of organization and compiles information into data base to aid travel planner. Ability of program to handle multiple trip entries makes it valuable time-saving device.

  13. Clinical Effectiveness and Impact on Insulin Therapy Cost After Addition of Dapagliflozin to Patients with Uncontrolled Type 2 Diabetes.

    PubMed

    Sosale, Bhavana; Sosale, Aravind; Bhattacharyya, Arpandev

    2016-12-01

    and body weight along with minimal side effects was observed. Addition of dapagliflozin reduced the insulin daily dose requirement and cost of insulin therapy in these patients. Diacon Hospital, Bangalore, India.

  14. Method and computer program product for maintenance and modernization backlogging

    DOEpatents

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
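The relation stated in this patent abstract is a simple additive one. A minimal sketch of that stated relation (the numeric inputs below are hypothetical):

```python
def future_facility_conditions(maintenance_cost, modernization_factor, backlog_factor):
    """Per the described method, future facility conditions equal the sum of the
    time-period-specific maintenance cost, modernization factor, and backlog factor."""
    return maintenance_cost + modernization_factor + backlog_factor

# Hypothetical time-period-specific inputs
conditions = future_facility_conditions(100.0, 20.0, 5.0)
```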

  15. Low cost infrared and near infrared sensors for UAVs

    NASA Astrophysics Data System (ADS)

    Aden, S. T.; Bialas, J. P.; Champion, Z.; Levin, E.; McCarty, J. L.

    2014-11-01

    Thermal remote sensing has a wide range of applications, though the extent of its use is inhibited by cost. Robotic and computer components are now widely available to consumers on a scale that makes thermal data a readily accessible resource. In this project, thermal imagery collected via a lightweight remote sensing Unmanned Aerial Vehicle (UAV) was used to create a surface temperature map for the purpose of providing wildland firefighting crews with a cost-effective and time-saving resource. The UAV system proved to be flexible, allowing for customized sensor packages to be designed that could include visible or infrared cameras, GPS, temperature sensors, and rangefinders, in addition to many data management options. Altogether, such a UAV system could be used to rapidly collect thermal and aerial data, with a geographic accuracy of less than one meter.

  16. Is the progression free survival advantage of concurrent gemcitabine plus cisplatin and radiation followed by adjuvant gemcitabine and cisplatin in patients with advanced cervical cancer worth the additional cost? A cost-effectiveness analysis.

    PubMed

    Smith, B; Cohn, D E; Clements, A; Tierney, B J; Straughn, J M

    2013-09-01

    The objective of this study is to determine whether concurrent and adjuvant chemoradiation with gemcitabine/cisplatin is cost-effective in patients with stage IIB to IVA cervical cancer. A cost-effectiveness model compared two arms of the trial performed by Duenas-Gonzalez et al. [1]: concurrent and adjuvant chemoradiation with gemcitabine/cisplatin (RT/GC+GC) versus concurrent radiation with cisplatin (RT/C). Major adverse events (AEs) and progression free survival (PFS) rates of each arm were incorporated in the model. AEs were defined as any hospitalization including grade 4 anemia, grade 4 neutropenia, and death. Medicare data and literature review were used to estimate costs. Incremental cost-effectiveness ratios (ICERs) per progression-free life-year saved (PF-LYS) were calculated. Sensitivity analyses were performed for pertinent uncertainties. For 10,000 women with locally advanced cervical cancer, the cost of therapy and AEs was $173.9 million (M) for RT/C versus $259.8M for RT/GC+GC. There were 879 additional 3-year progression-free survivors in the RT/GC+GC arm. The ICER for RT/GC+GC was $97,799 per PF-LYS. When the rate of hospitalization was equalized to 4.3%, the ICER for RT/GC+GC exceeded $80,000. The resultant ICER when increasing PFS in the RT/GC+GC arm by 5% was $62,605 per PF-LYS. When the cost of chemotherapy was decreased by 50%, the ICER was below $50,000 at $41,774 per PF-LYS. Radiation and gemcitabine/cisplatin for patients with stage IIB to IVA cervical cancer are not cost-effective. The increased financial burden of radiation with gemcitabine/cisplatin and associated toxicities appears to outweigh the benefit of increased 3-year PFS and is primarily dependent on chemotherapy drug costs. Copyright © 2013 Elsevier Inc. All rights reserved.
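The ICER per progression-free life-year saved (PF-LYS) in this abstract can be approximated directly from the reported totals: incremental cost divided by additional progression-free survivors. Using the rounded figures above gives ~$97,725, close to the paper's $97,799 (the difference is presumably rounding in the $-million totals):

```python
cost_rt_c = 173.9e6        # total cost, RT/C arm (cohort of 10,000 women)
cost_rt_gc = 259.8e6       # total cost, RT/GC+GC arm
extra_pf_survivors = 879   # additional 3-year progression-free survivors

icer_pf_lys = (cost_rt_gc - cost_rt_c) / extra_pf_survivors
```

The sensitivity analyses in the abstract amount to re-running this ratio with altered inputs, e.g., halving chemotherapy cost lowers the numerator enough to bring the ratio under $50,000 per PF-LYS.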

  17. Middleware enabling computational self-reflection: exploring the need for and some costs of selfreflecting networks with application to homeland defense

    NASA Astrophysics Data System (ADS)

    Kramer, Michael J.; Bellman, Kirstie L.; Landauer, Christopher

    2002-07-01

    This paper reviews and examines the definitions of Self-Reflection and Active Middleware. It then illustrates a conceptual framework for understanding and enumerating the costs of Self-Reflection and Active Middleware at increasing levels of application, and reviews some applications of Self-Reflection and Active Middleware to simulations. Finally, it considers the application of Self-Reflection and Active Middleware to sharing information among the organizations expected to participate in Homeland Defense, and the additional kinds of costs this entails.

  18. Computer simulation of energy use, greenhouse gas emissions, and costs for alternative methods of processing fluid milk.

    PubMed

    Tomasula, P M; Datta, N; Yee, W C F; McAloon, A J; Nutter, D W; Sampedro, F; Bonnaillie, L M

    2014-07-01

    Computer simulation is a useful tool for benchmarking electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature, short time (HTST) pasteurization was extended to include models for processes for shelf-stable milk and extended shelf-life milk that may help prevent the loss or waste of milk that leads to increases in the greenhouse gas (GHG) emissions for fluid milk. The models were for UHT processing, crossflow microfiltration (MF) without HTST pasteurization, crossflow MF followed by HTST pasteurization (MF/HTST), crossflow MF/HTST with partial homogenization, and pulsed electric field (PEF) processing, and were incorporated into the existing model for the fluid milk process. Simulation trials were conducted assuming a production rate for the plants of 113.6 million liters of milk per year to produce only whole milk (3.25%) and 40% cream. Results showed that GHG emissions in the form of process-related CO₂ emissions, defined as CO₂ equivalents (e)/kg of raw milk processed (RMP), and specific energy consumptions (SEC) for electricity and natural gas use for the HTST process alone were 37.6 g of CO₂e/kg of RMP, 0.14 MJ/kg of RMP, and 0.13 MJ/kg of RMP, respectively. Emissions of CO₂ and SEC for electricity and natural gas use were highest for the PEF process, with values of 99.1 g of CO₂e/kg of RMP, 0.44 MJ/kg of RMP, and 0.10 MJ/kg of RMP, respectively, and lowest for the UHT process at 31.4 g of CO₂e/kg of RMP, 0.10 MJ/kg of RMP, and 0.17 MJ/kg of RMP. Estimated unit production costs associated with the various processes were lowest for the HTST process and MF/HTST with partial homogenization at $0.507/L and highest for the UHT process at $0.60/L. The increase in shelf life associated with the UHT and MF processes may eliminate some of the supply chain product and consumer losses and waste of milk and compensate for the small increases in GHG

  19. Low cost computer subsystem for the Solar Electric Propulsion Stage (SEPS)

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The Solar Electric Propulsion Stage (SEPS) subsystem which consists of the computer, custom input/output (I/O) unit, and tape recorder for mass storage of telemetry data was studied. Computer software and interface requirements were developed along with computer and I/O unit design parameters. Redundancy implementation was emphasized. Reliability analysis was performed for the complete command computer subsystem. A SEPS fault tolerant memory breadboard was constructed and its operation demonstrated.

  20. Handgrip strength measurement as a predictor of hospitalization costs.

    PubMed

    Guerra, R S; Amaral, T F; Sousa, A S; Pichel, F; Restivo, M T; Ferreira, S; Fonseca, I

    2015-02-01

    Undernutrition status at hospital admission is related to increased hospital costs. Handgrip strength (HGS) is an indicator of undernutrition, but the ability of HGS to predict hospitalization costs has yet to be studied. To explore whether HGS measurement at hospital admission can predict patient's hospitalization costs. A prospective study was conducted in a university hospital. Inpatients' (n=637) HGS and undernutrition status by Patient-Generated Subjective Global Assessment were ascertained. Multivariable linear regression analysis, computing HGS quartiles by sex (reference: fourth quartile, highest), was conducted in order to identify the independent predictors of hospitalization costs. Costs were evaluated through percentage deviation from the mean cost, after adjustment for patients' characteristics, disease severity and undernutrition status. Being in the first or second HGS quartiles at hospital admission increased patient's hospitalization costs, respectively, by 17.5% (95% confidence interval: 2.7-32.3) and 21.4% (7.5-35.3), which translated into an increase from €375 (58-692) to €458 (161-756). After the additional adjustment for undernutrition status, being in the first or second HGS quartiles had, respectively, an economic impact of 16.6% (1.9-31.2) and 20.0% (6.2-33.8), corresponding to an increase in hospitalization expenditure from €356 (41-668) to €428 (133-724). Low HGS at hospital admission is associated with increased hospitalization costs of between 16.6 and 20.0% after controlling for possible confounders, including undernutrition status. HGS is an inexpensive, noninvasive and easy-to-use method that has clinical potential to predict hospitalization costs.

  1. How Costs Influence Decision Values for Mixed Outcomes

    PubMed Central

    Talmi, Deborah; Pine, Alex

    2012-01-01

    The things that we hold dearest often require a sacrifice, as epitomized in the maxim “no pain, no gain.” But how is the subjective value of outcomes established when they consist of mixtures of costs and benefits? We describe theoretical models for the integration of costs and benefits into a single value, drawing on both the economic and the empirical literatures, with the goal of rendering them accessible to the neuroscience community. We propose two key assays that go beyond goodness of fit for deciding between the dominant additive model and four varieties of interactive models. First, how they model decisions between costs when reward is not on offer; and second, whether they predict changes in reward sensitivity when costs are added to outcomes, and in what direction. We provide a selective review of relevant neurobiological work from a computational perspective, focusing on those studies that illuminate the underlying valuation mechanisms. Cognitive neuroscience has great potential to decide which of the theoretical models is actually employed by our brains, but empirical work has yet to fully embrace this challenge. We hope that future research improves our understanding of how our brain decides whether mixed outcomes are worthwhile. PMID:23112758
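
    The additive-versus-interactive distinction can be made concrete with a toy sketch. Both value functions below are hypothetical parameterizations chosen for illustration, not the models analyzed in the paper.

```python
# Toy value functions (hypothetical parameterizations, illustration only)
def additive_value(benefit, cost, k=1.0):
    # cost shifts value down by a fixed amount, independent of reward size
    return benefit - k * cost

def interactive_value(benefit, cost, k=0.5):
    # cost scales the benefit, so sensitivity to reward changes with cost
    return benefit * (1.0 - k * cost)

# Diagnostic echoed in the abstract: adding a cost changes the slope of
# value in benefit only under the interactive model
slope_additive = additive_value(10, 1) - additive_value(5, 1)
slope_interactive = interactive_value(10, 1) - interactive_value(5, 1)
print(slope_additive, slope_interactive)  # 5.0 2.5
```

    This is the second assay the authors propose: under the additive model reward sensitivity (the slope) is unchanged by a cost, whereas an interactive model predicts it shrinks or grows.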

  2. Computation and Dynamics: Classical and Quantum

    NASA Astrophysics Data System (ADS)

    Kisil, Vladimir V.

    2010-05-01

    We discuss classical and quantum computations in terms of the corresponding Hamiltonian dynamics. This allows us to introduce quantum computations which involve parallel processing of both the data and the programme instructions. Using mixed quantum-classical dynamics, we look for the full cost of computations on quantum computers with classical terminals.

  3. Beat-ID: Towards a computationally low-cost single heartbeat biometric identity check system based on electrocardiogram wave morphology

    PubMed Central

    Paiva, Joana S.; Dias, Duarte

    2017-01-01

    In recent years, safer and more reliable biometric methods have been developed. Apart from the need for enhanced security, the media and entertainment sectors have also been applying biometrics in the emerging market of user-adaptable objects/systems to make these systems more user-friendly. However, the complexity of some state-of-the-art biometric systems (e.g., iris recognition) or their high false rejection rate (e.g., fingerprint recognition) is neither compatible with the simple hardware architecture required by reduced-size devices nor the new trend of implementing smart objects within the dynamic market of the Internet of Things (IoT). It was recently shown that an individual can be recognized by extracting features from their electrocardiogram (ECG). However, most current ECG-based biometric algorithms are computationally demanding and/or rely on relatively large (several seconds) ECG samples, which are incompatible with the aforementioned application fields. Here, we present a computationally low-cost method (patent pending), including simple mathematical operations, for identifying a person using only three ECG morphology-based characteristics from a single heartbeat. The algorithm was trained/tested using ECG signals of different duration from the Physionet database on more than 60 different training/test datasets. The proposed method achieved maximal averaged accuracy of 97.450% in distinguishing each subject from a ten-subject set and false acceptance and rejection rates (FAR and FRR) of 5.710±1.900% and 3.440±1.980%, respectively, placing Beat-ID in a very competitive position in terms of the FRR/FAR among state-of-the-art methods. Furthermore, the proposed method can identify a person using an average of 1.020 heartbeats. It therefore has FRR/FAR behavior similar to obtaining a fingerprint, yet it is simpler and requires less expensive hardware. This method targets low-computational/energy-cost scenarios, such as tiny wearable devices (e.g., a
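
    The reported FAR/FRR trade-off comes from thresholding a match score. A minimal sketch of how the two rates are computed (toy scores, not the Physionet experiments; the threshold and score scale are assumptions):

```python
import numpy as np

def far_frr(genuine_scores, impostor_scores, threshold):
    # accept a claim when score >= threshold:
    #   FAR = fraction of impostor attempts wrongly accepted
    #   FRR = fraction of genuine attempts wrongly rejected
    genuine = np.asarray(genuine_scores)
    impostor = np.asarray(impostor_scores)
    far = np.mean(impostor >= threshold)
    frr = np.mean(genuine < threshold)
    return far, frr

# Toy similarity scores (hypothetical, for illustration only)
genuine = [0.91, 0.85, 0.78, 0.95, 0.88]
impostor = [0.30, 0.55, 0.81, 0.42, 0.25]

far, frr = far_frr(genuine, impostor, threshold=0.80)
print(far, frr)  # 0.2 0.2
```

    Sweeping the threshold trades one rate against the other; the paper reports the resulting operating point (FAR 5.710±1.900%, FRR 3.440±1.980%) for its three-feature, single-heartbeat classifier.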

  4. Beat-ID: Towards a computationally low-cost single heartbeat biometric identity check system based on electrocardiogram wave morphology.

    PubMed

    Paiva, Joana S; Dias, Duarte; Cunha, João P S

    2017-01-01

    In recent years, safer and more reliable biometric methods have been developed. Apart from the need for enhanced security, the media and entertainment sectors have also been applying biometrics in the emerging market of user-adaptable objects/systems to make these systems more user-friendly. However, the complexity of some state-of-the-art biometric systems (e.g., iris recognition) or their high false rejection rate (e.g., fingerprint recognition) is neither compatible with the simple hardware architecture required by reduced-size devices nor the new trend of implementing smart objects within the dynamic market of the Internet of Things (IoT). It was recently shown that an individual can be recognized by extracting features from their electrocardiogram (ECG). However, most current ECG-based biometric algorithms are computationally demanding and/or rely on relatively large (several seconds) ECG samples, which are incompatible with the aforementioned application fields. Here, we present a computationally low-cost method (patent pending), including simple mathematical operations, for identifying a person using only three ECG morphology-based characteristics from a single heartbeat. The algorithm was trained/tested using ECG signals of different duration from the Physionet database on more than 60 different training/test datasets. The proposed method achieved maximal averaged accuracy of 97.450% in distinguishing each subject from a ten-subject set and false acceptance and rejection rates (FAR and FRR) of 5.710±1.900% and 3.440±1.980%, respectively, placing Beat-ID in a very competitive position in terms of the FRR/FAR among state-of-the-art methods. Furthermore, the proposed method can identify a person using an average of 1.020 heartbeats. It therefore has FRR/FAR behavior similar to obtaining a fingerprint, yet it is simpler and requires less expensive hardware. This method targets low-computational/energy-cost scenarios, such as tiny wearable devices (e.g., a

  5. [Process-oriented cost calculation in interventional radiology. A case study].

    PubMed

    Mahnken, A H; Bruners, P; Günther, R W; Rasche, C

    2012-01-01

    Currently used costing methods such as cost centre accounting do not sufficiently reflect the process-based resource utilization in medicine. The goal of this study was to establish a process-oriented cost assessment of percutaneous radiofrequency (RF) ablation of liver and lung metastases. In each of 15 patients a detailed task analysis of the primary process of hepatic and pulmonary RF ablation was performed. Based on these data a dedicated cost calculation model was developed for each primary process. The costs of each process were computed and compared with the revenue for in-patients according to the German diagnosis-related groups (DRG) system 2010. The RF ablation of liver metastases in patients without relevant comorbidities and a low patient complexity level results in a loss of EUR 588.44, whereas the treatment of patients with a higher complexity level yields an acceptable profit. The treatment of pulmonary metastases is profitable even in cases of additional expenses due to complications. Process-oriented costing provides relevant information that is needed for understanding the economic impact of treatment decisions. It is well suited as a starting point for economically driven process optimization and reengineering. Under the terms of the German DRG 2010 system percutaneous RF ablation of lung metastases is economically reasonable, while RF ablation of liver metastases in cases of low patient complexity levels does not cover the costs.

  6. Additives for cement compositions based on modified peat

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kopanitsa, Natalya, E-mail: kopanitsa@mail.ru; Sarkisov, Yurij, E-mail: sarkisov@tsuab.ru; Gorshkova, Aleksandra, E-mail: kasatkina.alexandra@gmail.com

    High-quality, competitive dry building mixes require modifying additives for various purposes to be included in their composition. There is an insufficient supply of quality additives with stable properties for controlling the properties of cement compositions produced in Russia. Using foreign modifying additives significantly increases the final cost of the product: imported modifiers can account for up to 90% of the material cost of a dry building mix, depending on the composition complexity. Thus, the problem of import substitution has become relevant, especially in recent years, due to the difficult economic situation. The article discusses the possibility of using local raw materials as a basis for obtaining dry building mixture components. The properties of organo-mineral additives for cement compositions based on thermally modified peat raw materials are studied. The structure and composition of the additives are investigated by physicochemical research methods: electron microscopy and X-ray analysis. Experimental results showed that the peat additives improve the strength and hydrophysical properties of cement-sand mortar.

  7. Energy measurement using flow computers and chromatography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beeson, J.

    1995-12-01

    Arkla Pipeline Group (APG), along with most transmission companies, went to electronic flow measurement (EFM) to: (1) increase resolution and accuracy; (2) correct flow variables in real time; (3) increase the speed of data retrieval; (4) reduce capital expenditures; and (5) reduce operation and maintenance expenditures. Prior to EFM, mechanical seven-day charts were used, which yielded 800 pressure and differential pressure readings. EFM yields 1.2 million readings, a 1,500-fold improvement in resolution and additional flow representation. The total system accuracy of the EFM system is 0.25%, compared with 2% for the chart system, which gives APG improved accuracy. A typical APG electronic measurement system includes a microprocessor-based flow computer, a telemetry communications package, and a gas chromatograph. Live relative density (specific gravity), BTU, CO{sub 2}, and N{sub 2} are updated from the chromatograph to the flow computer every six minutes, which provides accurate MMBTU computations. Because gas contract lengths have changed from years to monthly, and from a majority of direct sales to transports, both Arkla and its customers wanted access to actual volumes on a much more timely basis than charts allow. The new electronic system allows volumes and other system data to be retrieved continuously if the EFM is on Supervisory Control and Data Acquisition (SCADA), or daily if on dial-up telephone. Previously, because of chart integration, information was not available for four to six weeks. EFM costs much less than the combined costs of the telemetry transmitters, pressure and differential pressure chart recorders, and temperature chart recorder it replaces. APG will install this equipment on smaller-volume stations at a customer's expense. APG requires backup measurement on metering facilities of this size; it could be another APG flow computer or chart recorder, or the other company's flow computer or chart recorder.

  8. An initial assessment of the cost and utilization of the Integrated Academic Information System (IAIMS) at Columbia Presbyterian Medical Center.

    PubMed Central

    Clayton, P. D.; Anderson, R. K.; Hill, C.; McCormack, M.

    1991-01-01

    The concept of "one stop information shopping" is becoming a reality at Columbia Presbyterian Medical Center (CPMC). The goal of our effort is to provide access to university and hospital administrative systems as well as clinical and library applications from a single workstation, which also provides utility functions such as word processing and mail. Since June 1987, CPMC has invested the equivalent of $23 million to install a digital communications network that encompasses 18 buildings at seven geographically separate sites and to develop clinical and library applications that are integrated with the existing hospital and university administrative and research computing facilities. During June 1991, 2425 different individuals used the clinical information system, 425 different individuals used the library applications, and 900 different individuals used the hospital administrative applications via network access. If we were to freeze the system in its current state, amortize the development and network installation costs, and add projected maintenance costs for the clinical and library applications, our integrated information system would cost $2.8 million on an annual basis. This cost is 0.3% of the medical center's annual budget. These expenditures could be justified by very small improvements in time savings for personnel and/or decreased length of hospital stay and/or more efficient use of resources. In addition to the direct benefits which we detail, a major benefit is the ease with which additional computer-based applications can be added incrementally at an extremely modest cost. PMID:1666966

  9. Market frictions: A unified model of search costs and switching costs

    PubMed Central

    Wilson, Chris M.

    2012-01-01

    It is well known that search costs and switching costs can create market power by constraining the ability of consumers to change suppliers. While previous research has examined each cost in isolation, this paper demonstrates the benefits of examining the two types of friction in unison. The paper shows how subtle distinctions between the two costs can provide important differences in their effects upon consumer behaviour, competition and welfare. In addition, the paper also illustrates a simple empirical methodology for estimating separate measures of both costs, while demonstrating a potential bias that can arise if only one cost is considered. PMID:25550674

  10. GPU-accelerated computing for Lagrangian coherent structures of multi-body gravitational regimes

    NASA Astrophysics Data System (ADS)

    Lin, Mingpei; Xu, Ming; Fu, Xiaoyu

    2017-04-01

    Based on a well-established theoretical foundation, Lagrangian Coherent Structures (LCSs) have elicited widespread research on the intrinsic structures of dynamical systems in many fields, including the field of astrodynamics. Although the application of LCSs in dynamical problems seems straightforward theoretically, its associated computational cost is prohibitive. We propose a block decomposition algorithm developed on Compute Unified Device Architecture (CUDA) platform for the computation of the LCSs of multi-body gravitational regimes. In order to take advantage of GPU's outstanding computing properties, such as Shared Memory, Constant Memory, and Zero-Copy, the algorithm utilizes a block decomposition strategy to facilitate computation of finite-time Lyapunov exponent (FTLE) fields of arbitrary size and timespan. Simulation results demonstrate that this GPU-based algorithm can satisfy double-precision accuracy requirements and greatly decrease the time needed to calculate final results, increasing speed by approximately 13 times. Additionally, this algorithm can be generalized to various large-scale computing problems, such as particle filters, constellation design, and Monte-Carlo simulation.
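
    The FTLE field that the algorithm parallelizes can be sketched on the CPU for the classic double-gyre benchmark flow (an assumed test case, not the paper's multi-body regime). The per-grid-point trajectory integration below is exactly the part a CUDA kernel would distribute across threads.

```python
import numpy as np

A, eps, om = 0.1, 0.25, 2 * np.pi / 10   # double-gyre parameters

def velocity(x, y, t):
    s = eps * np.sin(om * t)
    f = s * x**2 + (1 - 2 * s) * x
    dfdx = 2 * s * x + (1 - 2 * s)
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return u, v

def flow_map(x, y, t0, T, steps=200):
    # forward Euler integration; adequate for a sketch
    dt, t = T / steps, t0
    for _ in range(steps):
        u, v = velocity(x, y, t)
        x, y, t = x + dt * u, y + dt * v, t + dt
    return x, y

nx, ny, T = 40, 20, 10.0
xs, ys = np.meshgrid(np.linspace(0, 2, nx), np.linspace(0, 1, ny))
fx, fy = flow_map(xs, ys, 0.0, T)

# Jacobian of the flow map by finite differences over the initial grid
hx, hy = 2 / (nx - 1), 1 / (ny - 1)
dxdx = np.gradient(fx, hx, axis=1); dxdy = np.gradient(fx, hy, axis=0)
dydx = np.gradient(fy, hx, axis=1); dydy = np.gradient(fy, hy, axis=0)

# Largest eigenvalue of the Cauchy-Green tensor C = J^T J at each point
c11 = dxdx**2 + dydx**2
c12 = dxdx * dxdy + dydx * dydy
c22 = dxdy**2 + dydy**2
lam = 0.5 * (c11 + c22 + np.sqrt((c11 - c22)**2 + 4 * c12**2))
ftle = np.log(np.maximum(lam, 1e-12)) / (2 * T)
print(ftle.shape)  # (20, 40)
```

    Each grid point's trajectory is independent of its neighbors, which is what makes the block decomposition across GPU threads effective; the gradient step only couples neighbors after integration finishes.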

  11. Cost containment and KSC Shuttle facilities or cost containment and aerospace construction

    NASA Technical Reports Server (NTRS)

    Brown, J. A.

    1985-01-01

    This presentation aims to show examples of cost containment in aerospace construction at Kennedy Space Center (KSC), taking into account four major levels of project development of the Space Shuttle facilities. The levels relate to conceptual criteria and site selection, the design of construction and ground support equipment, the construction of facilities and ground support equipment (GSE), and operation and maintenance. Examples of cost containment are discussed. The continued reduction of processing time from landing to launching demonstrates the success of the cost containment methods. Attention is given to the factors which led to the selection of KSC, the use of Cost Engineering, the employment of the Construction Management Concept, and the use of Computer Aided Design/Drafting.

  12. Avoiding Split Attention in Computer-Based Testing: Is Neglecting Additional Information Facilitative?

    ERIC Educational Resources Information Center

    Jarodzka, Halszka; Janssen, Noortje; Kirschner, Paul A.; Erkens, Gijsbert

    2015-01-01

    This study investigated whether design guidelines for computer-based learning can be applied to computer-based testing (CBT). Twenty-two students completed a CBT exam with half of the questions presented in a split-screen format that was analogous to the original paper-and-pencil version and half in an integrated format. Results show that students…

  13. Designer's unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state of the art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a data base and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress is presented for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  14. Distributed management of scientific projects - An analysis of two computer-conferencing experiments at NASA

    NASA Technical Reports Server (NTRS)

    Vallee, J.; Gibbs, B.

    1976-01-01

    Between August 1975 and March 1976, two NASA projects with geographically separated participants used a computer-conferencing system developed by the Institute for the Future for portions of their work. Monthly usage statistics for the system were collected in order to examine the group and individual participation figures for all conferences. The conference transcripts were analysed to derive observations about the use of the medium. In addition to the results of these analyses, the attitudes of users and the major components of the costs of computer conferencing are discussed.

  15. Towards a Cloud Computing Environment: Near Real-time Cloud Product Processing and Distribution for Next Generation Satellites

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.; Palikonda, R.; Smith, W. L., Jr.; Spangenberg, D.

    2016-12-01

    The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) processes and derives near real-time (NRT) global cloud products from operational geostationary satellite imager datasets. These products are being used in NRT to improve forecast models and aircraft icing warnings, and to support aircraft field campaigns. Next-generation satellites, such as the Japanese Himawari-8 and the upcoming NOAA GOES-R, present challenges for NRT data processing and product dissemination due to the increase in temporal and spatial resolution: the volume of data is expected to increase approximately 10-fold. This increase in data volume will require additional IT resources to keep up with the processing demands of NRT requirements, and these resources are not readily available due to cost and other technical limitations. To anticipate and meet these computing resource requirements, we have employed a hybrid cloud computing environment to augment the generation of SatCORPS products. This paper will describe the workflow and technologies used to ingest, process, and distribute SatCORPS products. Lessons learned from working on both AWS Cloud and GovCloud will be discussed: benefits, similarities, and differences that could impact the decision to use cloud computing and storage. A detailed cost analysis will be presented. In addition, future cloud utilization, parallelization, and architecture layout for GOES-R will be discussed.

  16. 20 CFR 404.270 - Cost-of-living increases.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Cost-of-living increases. 404.270 Section 404.270 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950- ) Computing Primary Insurance Amounts Cost-Of-Living Increases § 404.270 Cost-of-living...

  17. The cost effectiveness of intracytoplasmic sperm injection (ICSI).

    PubMed

    Hollingsworth, Bruce; Harris, Anthony; Mortimer, Duncan

    2007-12-01

    The aim was to estimate the incremental cost effectiveness of ICSI and its total costs for the population of Australia. Treatment effects for three patient groups were drawn from a published systematic review and meta-analysis of trials comparing fertilisation outcomes for ICSI. Incremental costs were derived from resource-based costing of ICSI and existing-practice comparators for each patient group. The incremental cost per live birth for patients unsuited to IVF is estimated at between A$8,500 and A$13,400. For the subnormal semen indication, cost per live birth could be as low as A$3,600, but in the worst-case scenario there would simply be additional incremental costs of A$600 per procedure. Multiplying the additional costs of ICSI over the relevant target populations in Australia gives potential total financial implications of over A$31 million per annum. While there are additional benefits from the ICSI procedure, particularly for those with subnormal sperm, the additional cost for the health care system is substantial.
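
    The headline figures above are incremental cost-effectiveness ratios (ICERs). A minimal sketch of the calculation, with a hypothetical cohort chosen only so the ratio lands inside the reported A$8,500-13,400 range:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    # incremental cost per additional unit of effect (here, per live birth)
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical cohort totals (AUD and live-birth counts are illustrative)
ratio = icer(cost_new=1_200_000, cost_old=500_000,
             effect_new=120, effect_old=50)
print(ratio)  # 10000.0, i.e. A$10,000 per additional live birth
```

    Scaling the per-procedure incremental cost across the national target population gives the budget-impact figure (over A$31 million per annum) quoted in the abstract.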

  18. Costs and Cost-Effectiveness of Plasmodium vivax Control

    PubMed Central

    White, Michael T.; Yeung, Shunmay; Patouillard, Edith; Cibulskis, Richard

    2016-01-01

    The continued success of efforts to reduce the global malaria burden will require sustained funding for interventions specifically targeting Plasmodium vivax. The optimal use of limited financial resources necessitates cost and cost-effectiveness analyses of strategies for diagnosing and treating P. vivax and vector control tools. Herein, we review the existing published evidence on the costs and cost-effectiveness of interventions for controlling P. vivax, identifying nine studies focused on diagnosis and treatment and seven studies focused on vector control. Although many of the results from the much more extensive P. falciparum literature can be applied to P. vivax, it is not always possible to extrapolate results from P. falciparum–specific cost-effectiveness analyses. Notably, there is a need for additional studies to evaluate the potential cost-effectiveness of radical cure with primaquine for the prevention of P. vivax relapses with glucose-6-phosphate dehydrogenase testing. PMID:28025283

  19. Index cost estimate based BIM method - Computational example for sports fields

    NASA Astrophysics Data System (ADS)

    Zima, Krzysztof

    2017-07-01

    The paper presents an example of cost estimation in the early phase of a project. A fragment of a relational database containing solutions, descriptions, geometry of construction objects, and unit costs of sports facilities is shown. Calculations with the Index Cost Estimate Based BIM method, using Case-Based Reasoning, are presented. The article describes local and global similarity measurement and gives an example of a BIM-based quantity takeoff process. The outcome of cost calculations based on the CBR method is presented as the final result.
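
    The local/global similarity measurement mentioned above follows the standard weighted scheme from case-based reasoning. A sketch with hypothetical sports-field attributes, ranges, and weights (none are taken from the paper):

```python
def local_sim(a, b, value_range):
    # normalized similarity for a numeric attribute
    return 1.0 - abs(a - b) / value_range

def global_sim(case_a, case_b, ranges, weights):
    # weighted aggregation of local similarities into one global score
    total = sum(weights.values())
    return sum(w * local_sim(case_a[k], case_b[k], ranges[k])
               for k, w in weights.items()) / total

# Hypothetical attributes of a new project and one stored case
ranges  = {"area_m2": 10000, "track_length_m": 400}
weights = {"area_m2": 2.0, "track_length_m": 1.0}
new_case    = {"area_m2": 6000, "track_length_m": 400}
stored_case = {"area_m2": 5000, "track_length_m": 350}

score = global_sim(new_case, stored_case, ranges, weights)
print(round(score, 3))  # 0.892
```

    The stored case with the highest global score supplies the unit costs that seed the early-phase estimate; the BIM model then provides the quantities to multiply them by.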

  20. Cloud Computing. Technology Briefing. Number 1

    ERIC Educational Resources Information Center

    Alberta Education, 2013

    2013-01-01

    Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…

  1. Cost effectiveness of alternative imaging strategies for the diagnosis of small-bowel Crohn's disease.

    PubMed

    Levesque, Barrett G; Cipriano, Lauren E; Chang, Steven L; Lee, Keane K; Owens, Douglas K; Garber, Alan M

    2010-03-01

    The cost effectiveness of alternative approaches to the diagnosis of small-bowel Crohn's disease is unknown. This study evaluates whether computed tomographic enterography (CTE) is a cost-effective alternative to small-bowel follow-through (SBFT) and whether capsule endoscopy is a cost-effective third test in patients in whom a high suspicion of disease remains after 2 previous negative tests. A decision-analytic model was developed to compare the lifetime costs and benefits of each diagnostic strategy. Patients were considered with low (20%) and high (75%) pretest probability of small-bowel Crohn's disease. Effectiveness was measured in quality-adjusted life-years (QALYs) gained. Parameter assumptions were tested with sensitivity analyses. With a moderate to high pretest probability of small-bowel Crohn's disease, and a higher likelihood of isolated jejunal disease, follow-up evaluation with CTE has an incremental cost-effectiveness ratio of less than $54,000/QALY gained compared with SBFT. The addition of capsule endoscopy after ileocolonoscopy and negative CTE or SBFT costs greater than $500,000 per QALY gained in all scenarios. Results were not sensitive to the costs of tests or complications but were sensitive to test accuracies. The cost effectiveness of the strategies depends critically on the pretest probability of Crohn's disease and on whether the terminal ileum is examined at ileocolonoscopy. CTE is a cost-effective alternative to SBFT in patients with moderate to high suspicion of small-bowel Crohn's disease. The addition of capsule endoscopy is not cost-effective as a third test, even in patients with a high pretest probability of disease. Copyright 2010 AGA Institute. Published by Elsevier Inc. All rights reserved.

  2. Understanding the Performance and Potential of Cloud Computing for Scientific Applications

    DOE PAGES

    Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin; ...

    2015-02-19

    Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources, yet not all scientists have access to sufficiently high-end computing systems, many of which can be found in the Top500 list. Cloud computing has gained the attention of scientists as a competitive resource for running HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with a reasonable performance per money spent. This work studies the performance of public clouds and places this performance in the context of price. We evaluate the raw performance of different services of the AWS cloud in terms of basic resources such as compute, memory, network, and I/O, as well as the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well and to evaluate the cost of running scientific applications in the cloud. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud, covering EC2, S3, EBS, and DynamoDB among the many Amazon AWS services. We evaluated memory sub-system performance with CacheBench, network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper to serve as a recipe cookbook to help scientists decide where to deploy and run their scientific applications: public clouds, private clouds, or hybrid clouds.

  3. Understanding the Performance and Potential of Cloud Computing for Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin

    Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources, yet not all scientists have access to sufficiently high-end computing systems, many of which can be found in the Top500 list. Cloud computing has gained the attention of scientists as a competitive resource for running HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with a reasonable performance per money spent. This work studies the performance of public clouds and places this performance in the context of price. We evaluate the raw performance of different services of the AWS cloud in terms of basic resources such as compute, memory, network, and I/O, as well as the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well and to evaluate the cost of running scientific applications in the cloud. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud, covering EC2, S3, EBS, and DynamoDB among the many Amazon AWS services. We evaluated memory sub-system performance with CacheBench, network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper to serve as a recipe cookbook to help scientists decide where to deploy and run their scientific applications: public clouds, private clouds, or hybrid clouds.

  4. Modeling methods for merging computational and experimental aerodynamic pressure data

    NASA Astrophysics Data System (ADS)

    Haderlie, Jacob C.

    This research describes a process to model surface pressure data sets as a function of wing geometry from computational and wind tunnel sources and then merge them into a single predicted value. The described merging process will enable engineers to integrate these data sets with the goal of utilizing the advantages of each data source while overcoming the limitations of both; this provides a single, combined data set to support analysis and design. The main challenge with this process is accurately representing each data source everywhere on the wing. Additionally, this effort demonstrates methods to model wind tunnel pressure data as a function of angle of attack as an initial step towards a merging process that uses both location on the wing and flow conditions (e.g., angle of attack, flow velocity or Reynolds number) as independent variables. This surrogate model of pressure as a function of angle of attack can be useful for engineers who need to predict the location of zero-order discontinuities, e.g., flow separation or normal shocks. Because, to the author's best knowledge, there is no published, well-established merging method for aerodynamic pressure data (here, the coefficient of pressure Cp), this work identifies promising modeling and merging methods and then makes a critical comparison of them. Surrogate models represent the pressure data for both data sets. Cubic B-spline surrogate models represent the computational simulation results. Machine learning and multi-fidelity surrogate models represent the experimental data. This research compares three surrogates for the experimental data (sequential, a.k.a. online, Gaussian processes; batch Gaussian processes; and multi-fidelity additive corrector) on the merits of accuracy and computational cost. The Gaussian process (GP) methods employ cubic B-spline CFD surrogates as a model basis function to build a surrogate model of the WT data, and this usage of the CFD surrogate in building the WT

  5. Cutting Transportation Costs.

    ERIC Educational Resources Information Center

    Lewis, Barbara

    1982-01-01

    Beginning on the front cover, this article tells how school districts are reducing their transportation costs. Particularly effective measures include the use of computers for bus maintenance and scheduling, school board ownership of buses, and the conversion of gasoline-powered buses to alternative fuels. (Author/MLF)

  6. Trade-off between speed and cost in shortcuts to adiabaticity

    NASA Astrophysics Data System (ADS)

    Campbell, Steve

    Recent years have witnessed a surge of interest in the study of thermal nano-machines that are capable of converting disordered forms of energy into useful work. It has been shown for both classical and quantum systems that external drivings can allow a system to evolve adiabatically even when driven in finite time, a technique commonly known as shortcuts to adiabaticity. It was suggested to use such external drivings to render the unitary processes of a thermodynamic cycle quantum adiabatic, while being performed in finite time. However, implementing an additional external driving requires resources that should be accounted for. Furthermore, and in line with natural intuition, these transformations should not be achievable in arbitrarily short times. First, we will present a computable measure of the cost of a shortcut to adiabaticity. Using this, we then examine the speed with which a quantum system can be driven. As a main result, we will establish a rigorous link between this speed, the quantum speed limit, and the (energetic) cost of implementing such a shortcut to adiabaticity. Interestingly, this link elucidates a trade-off between speed and cost, namely that instantaneous manipulation is impossible as it requires an infinite cost.

  7. A novel cost-effective computer-assisted imaging technology for accurate placement of thoracic pedicle screws.

    PubMed

    Abe, Yuichiro; Ito, Manabu; Abumi, Kuniyoshi; Kotani, Yoshihisa; Sudo, Hideki; Minami, Akio

    2011-11-01

    region in corrective surgery for AIS. The authors found that 3D-VG TIPS worked on a consumer-class computer and easily visualized the ideal entry point and trajectory of PSs in any operating theater without costly special equipment. This new technique is suitable for preoperative planning and intraoperative guidance when performing reconstructive surgery with PSs.

  8. Fast mode decision based on human noticeable luminance difference and rate distortion cost for H.264/AVC

    NASA Astrophysics Data System (ADS)

    Li, Mian-Shiuan; Chen, Mei-Juan; Tai, Kuang-Han; Sue, Kuen-Liang

    2013-12-01

    This article proposes a fast mode decision algorithm based on the correlation between the just-noticeable-difference (JND) and the rate distortion cost (RD cost) to reduce the computational complexity of H.264/AVC. First, the relationship between the average RD cost and the number of JND pixels is modeled by Gaussian distributions. The RD cost of the Inter 16 × 16 mode is then compared with thresholds predicted from these models for fast mode selection. In addition, we use the image content, the residual data, and the JND visual model for horizontal/vertical detection, and then utilize the result to predict the partition in a macroblock. The experimental results show that the proposed algorithm achieves substantial time savings while effectively maintaining rate-distortion performance and visual quality.
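    The early-termination idea (compare the Inter 16 × 16 RD cost against a threshold predicted from a JND-conditioned Gaussian model) can be sketched roughly as follows. The model coefficients, the proportional-spread assumption, and all function names are hypothetical, not the paper's fitted values.

```python
def jnd_threshold(n_jnd_pixels, a=1200.0, b=8.5, k=1.0):
    """Hypothetical Gaussian-model threshold: the mean RD cost is assumed to
    grow with the count of just-noticeable-difference pixels; k widens the
    acceptance band by k standard deviations."""
    mean = a + b * n_jnd_pixels      # assumed fitted mean RD cost for this JND count
    std = 0.1 * mean                 # assumed proportional spread
    return mean + k * std

def early_mode_decision(rd_cost_16x16, n_jnd_pixels):
    """Accept the large partition early when its RD cost is already low."""
    if rd_cost_16x16 <= jnd_threshold(n_jnd_pixels):
        return "INTER_16x16"                  # skip smaller-partition search
    return "CHECK_SMALLER_PARTITIONS"         # fall back to full mode search

decision_flat = early_mode_decision(rd_cost_16x16=1500.0, n_jnd_pixels=20)
decision_busy = early_mode_decision(rd_cost_16x16=9000.0, n_jnd_pixels=20)
```

    A low RD cost relative to the JND-predicted band ends the mode search early; the time saving comes from how often that branch is taken.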

  9. From Phonomecanocardiography to Computer-Aided Phonocardiography

    NASA Astrophysics Data System (ADS)

    Granados, J.; Tavera, F.; López, G.; Velázquez, J. M.; Hernández, R. T.; López, G. A.

    2017-01-01

    Because many doctors lack the training to identify heart disorders by conventional auscultation, an objective and methodological analysis is needed to support this technique. In order to obtain information on the performance of the heart and to diagnose heart disease through a simple, cost-effective procedure based on a data acquisition system, we have obtained phonocardiograms (PCG), which are images of the sounds emitted by the heart. A program of acoustic, visual, and artificial vision recognition was developed to interpret them. Based on the results of previous cardiology research, a code for interpreting PCGs and their associated diseases was developed. An experimental cardiac data sampling site was also created on the university campus. Computer-aided phonocardiography is a viable, low-cost procedure that provides additional medical information for diagnosing complex heart diseases. We show some preliminary results.

  10. Costs and Cost-Effectiveness of Plasmodium vivax Control.

    PubMed

    White, Michael T; Yeung, Shunmay; Patouillard, Edith; Cibulskis, Richard

    2016-12-28

    The continued success of efforts to reduce the global malaria burden will require sustained funding for interventions specifically targeting Plasmodium vivax. The optimal use of limited financial resources necessitates cost and cost-effectiveness analyses of strategies for diagnosing and treating P. vivax and of vector control tools. Herein, we review the existing published evidence on the costs and cost-effectiveness of interventions for controlling P. vivax, identifying nine studies focused on diagnosis and treatment and seven studies focused on vector control. Although many of the results from the much more extensive P. falciparum literature can be applied to P. vivax, it is not always possible to extrapolate results from P. falciparum-specific cost-effectiveness analyses. Notably, there is a need for additional studies to evaluate the potential cost-effectiveness of radical cure with primaquine, combined with glucose-6-phosphate dehydrogenase testing, for the prevention of P. vivax relapses. © The American Society of Tropical Medicine and Hygiene.

  11. The Cost of CAI: A Matter of Assumptions.

    ERIC Educational Resources Information Center

    Kearsley, Greg P.

    Cost estimates for Computer Assisted Instruction (CAI) depend crucially upon the particular assumptions made about the components of the system to be included in the costs, the expected lifetime of the system and courseware, and the anticipated student utilization of the system/courseware. The cost estimates of three currently operational systems…

  12. Additive manufacturing of titanium alloys in the biomedical field: processes, properties and applications.

    PubMed

    Trevisan, Francesco; Calignano, Flaviana; Aversa, Alberta; Marchese, Giulio; Lombardi, Mariangela; Biamino, Sara; Ugues, Daniele; Manfredi, Diego

    2018-04-01

    The mechanical properties and biocompatibility of titanium alloy medical devices and implants produced by additive manufacturing (AM) technologies - in particular, selective laser melting (SLM), electron beam melting (EBM) and laser metal deposition (LMD) - have been investigated by several researchers demonstrating how these innovative processes are able to fulfil medical requirements for clinical applications. This work reviews the advantages given by these technologies, which include the possibility to create porous complex structures to improve osseointegration and mechanical properties (best match with the modulus of elasticity of local bone), to lower processing costs, to produce custom-made implants according to the data for the patient acquired via computed tomography and to reduce waste.

  13. On computational methods for crashworthiness

    NASA Technical Reports Server (NTRS)

    Belytschko, T.

    1992-01-01

    The evolution of computational methods for crashworthiness and related fields is described and linked with the decreasing cost of computational resources and with improvements in computation methodologies. The latter includes more effective time integration procedures and more efficient elements. Some recent developments in methodologies and future trends are also summarized. These include multi-time step integration (or subcycling), further improvements in elements, adaptive meshes, and the exploitation of parallel computers.

  14. The science of computing - Parallel computation

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1985-01-01

    Although parallel computation architectures have been known for computers since the 1920s, it was only in the 1970s that microelectronic components technologies advanced to the point where it became feasible to incorporate multiple processors in one machine. Concomitantly, the development of algorithms for parallel processing lagged due to hardware limitations. The speed of computing with solid-state chips is limited by gate switching delays. The physical limit implies that a 1 Gflop operational speed is the maximum for sequential processors. A computer recently introduced features a 'hypercube' architecture with 128 processors connected in networks at 5, 6 or 7 points per grid, depending on the design choice. Its computing speed rivals that of supercomputers, but at a fraction of the cost. The added speed with less hardware is due to parallel processing, which utilizes algorithms representing different parts of an equation that can be broken into simpler statements and processed simultaneously. Present, highly developed computer languages like FORTRAN, PASCAL, COBOL, etc., rely on sequential instructions. Thus, increased emphasis will now be directed at parallel processing algorithms to exploit the new architectures.

  15. Can a costly intervention be cost-effective?: An analysis of violence prevention.

    PubMed

    Foster, E Michael; Jones, Damon

    2006-11-01

    To examine the cost-effectiveness of the Fast Track intervention, a multi-year, multi-component intervention designed to reduce violence among at-risk children. A previous report documented the favorable effect of intervention on the highest-risk group of ninth-graders diagnosed with conduct disorder, as well as self-reported delinquency. The current report addressed the cost-effectiveness of the intervention for these measures of program impact. Costs of the intervention were estimated using program budgets. Incremental cost-effectiveness ratios were computed to determine the cost per unit of improvement in the 3 outcomes measured in the 10th year of the study. Examination of the total sample showed that the intervention was not cost-effective at likely levels of policymakers' willingness to pay for the key outcomes. Subsequent analysis of those most at risk, however, showed that the intervention likely was cost-effective given specified willingness-to-pay criteria. Results indicate that the intervention is cost-effective for the children at highest risk. From a policy standpoint, this finding is encouraging because such children are likely to generate higher costs for society over their lifetimes. However, substantial barriers to cost-effectiveness remain, such as the ability to effectively identify and recruit such higher-risk children in future implementations.

  16. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    NASA Astrophysics Data System (ADS)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are orders of magnitude larger than in present day missions. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require fast turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of their Level-2 full physics data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment driven by market forces. We will present how we enabled highly fault-tolerant computing in order to achieve large-scale computing as well as operational cost savings.

  17. Usability of a Low-Cost Head Tracking Computer Access Method following Stroke.

    PubMed

    Mah, Jasmine; Jutai, Jeffrey W; Finestone, Hillel; Mckee, Hilary; Carter, Melanie

    2015-01-01

    Assistive technology devices for computer access can facilitate social reintegration and promote independence for people who have had a stroke. This work explores the usefulness and acceptability of a new computer access device called the Nouse™ (Nose-as-mouse). The device uses a standard webcam and video recognition algorithms to map the movement of the user's nose to a computer cursor, thereby allowing hands-free computer operation. Ten participants receiving in- or outpatient stroke rehabilitation completed a series of standardized and everyday computer tasks using the Nouse™ and then completed a device usability questionnaire. Task completion rates were high (90%), though only in the absence of time constraints. Most of the participants were satisfied with ease of use (70%) and liked using the Nouse™ (60%), indicating they could resume most of their usual computer activities apart from word-processing using the device. The findings suggest that hands-free computer access devices like the Nouse™ may be an option for people who experience upper motor impairment caused by stroke and are highly motivated to resume personal computing. More research is necessary to further evaluate the effectiveness of this technology, especially in relation to other computer access assistive technology devices.

  18. Performance limits and trade-offs in entropy-driven biochemical computers.

    PubMed

    Chu, Dominique

    2018-04-14

    It is now widely accepted that biochemical reaction networks can perform computations. Examples are kinetic proofreading, gene regulation, or signalling networks. For many of these systems it was found that their computational performance is limited by a trade-off between the metabolic cost, the speed and the accuracy of the computation. In order to gain insight into the origins of these trade-offs, we consider entropy-driven computers as a model of biochemical computation. Using tools from stochastic thermodynamics, we show that entropy-driven computation is subject to a trade-off between accuracy and metabolic cost, but does not involve time trade-offs. Time trade-offs appear when it is taken into account that the result of the computation needs to be measured in order to be known. We argue that this measurement process, although usually ignored, is a major contributor to the cost of biochemical computation. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Cost-effectiveness of Standard vs. a Navigated Intervention on Colorectal Cancer Screening Use in Primary Care

    PubMed Central

    Lairson, David; DiCarlo, Melissa; Deshmuk, Ashish A.; Fagan, Heather B.; Sifri, Randa; Katurakes, Nora; Cocroft, James; Sendecki, Jocelyn; Swan, Heidi; Vernon, Sally W.; Myers, Ronald E.

    2014-01-01

    Background Colorectal cancer (CRC) screening is cost-effective but underutilized. This study aimed to determine the cost-effectiveness of a mailed standard intervention (SI) and tailored navigation interventions (TNI) to increase CRC screening use in the context of a randomized trial among primary care patients. Methods Participants (n=945) were randomized either to a usual care Control Group (n=317), SI Group (n=316), or TNI Group (n=312). The SI Group was sent both colonoscopy instructions and stool blood tests irrespective of baseline preference. TNI Group participants were sent instructions for scheduling a colonoscopy, a stool blood test, or both based on their test preference as determined at baseline, and then received a navigation telephone call. Activity cost estimation was used to determine the cost of each intervention and to compute incremental cost-effectiveness ratios. Statistical uncertainty within the base case was assessed with 95 percent confidence intervals derived from net benefit regression analysis. Effects of uncertain parameters such as the cost of planning, training, and involvement of those receiving “investigator salaries” were assessed with sensitivity analyses. Results Program costs of the SI were $167 per participant. The average cost of the TNI was $289 per participant. Conclusion The TNI was more effective than the SI, but substantially increased the cost per additional person screened. Decision-makers need to consider the cost structure, level of planning, and training required to implement these two intervention strategies, and their willingness to pay for additional persons screened, to determine whether tailored navigation would be justified and feasible. PMID:24435411
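    The incremental cost-effectiveness ratio used in such studies divides the extra cost by the extra effect. A minimal sketch, using the per-participant program costs reported above ($167 SI, $289 TNI) but assumed screening rates, since the abstract does not report them:

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Program costs are from the abstract; screening rates are assumed for illustration
cost_si, rate_si = 167.0, 0.33     # standard intervention
cost_tni, rate_tni = 289.0, 0.38   # tailored navigation intervention

# Extra cost per additional person screened
extra_cost_per_person_screened = icer(cost_tni, rate_tni, cost_si, rate_si)
```

    With these assumed rates the ratio is $122 / 0.05, i.e. the willingness-to-pay figure a decision-maker would weigh against the TNI's extra effectiveness.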

  20. How rebates, copayments, and administration costs affect the cost-effectiveness of osteoporosis therapies.

    PubMed

    Ferko, Nicole C; Borisova, Natalie; Airia, Parisa; Grima, Daniel T; Thompson, Melissa F

    2012-11-01

    Because of rising drug expenditures, cost considerations have become essential, necessitating cost-effectiveness analyses for managed care organizations (MCOs). The study objective is to examine the impact of various drug-cost components, in addition to the wholesale acquisition cost (WAC), on the cost-effectiveness of osteoporosis therapies. A Markov model of osteoporosis was used to exemplify different drug cost scenarios. We examined the effect of varying rebates for oral bisphosphonates--risedronate and ibandronate--as well as considering the impact of varying copayments and administration costs for intravenous zoledronate. The population modeled was 1,000 American women, aged ≥ 50 years, with osteoporosis. Patients were followed for 1 year to reflect an annual budget review of formularies by MCOs. The cost of therapy was based on an adjusted WAC, and is referred to as the net drug cost. The total annual cost incurred by an MCO for each drug regimen was calculated using the net drug cost and fracture cost. We estimated cost on a quality adjusted life year (QALY) basis. When considering different rebates, results for risedronate versus ibandronate vary from cost-saving (i.e., costs less and more effective) to approximately $70,000 per QALY. With no risedronate rebate, an ibandronate rebate of approximately 65% is required before the cost per QALY surpasses $50,000. With rebates greater than 25% for risedronate, irrespective of ibandronate rebates, results become cost-saving. Results also showed that the magnitude of cost savings to the MCO varied by as much as 65% when considering no administration cost and the highest coinsurance rate for zoledronate. Our study showed that cost-effectiveness varies considerably when factors in addition to the WAC are considered. This paper provides recommendations for pharmaceutical manufacturers and MCOs when developing and interpreting such analyses.
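    The net-drug-cost and cost-per-QALY arithmetic described here can be sketched as follows; all WACs, rebates, fracture costs, and QALY values are hypothetical illustrations, not the study's inputs. A negative ratio with higher effectiveness corresponds to the "cost-saving" outcomes mentioned in the abstract.

```python
def net_drug_cost(wac, rebate):
    """Net annual drug cost to the MCO after the manufacturer rebate."""
    return wac * (1.0 - rebate)

def cost_per_qaly(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost per quality-adjusted life year between two regimens."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Total annual MCO cost per patient = net drug cost + expected fracture cost.
# All figures below are hypothetical illustrations.
total_a = net_drug_cost(wac=1200.0, rebate=0.25) + 350.0   # higher WAC, larger rebate
total_b = net_drug_cost(wac=900.0, rebate=0.05) + 420.0    # cheaper, less effective

# Drug A costs less AND yields more QALYs: a negative ratio, i.e. cost-saving
icer_ab = cost_per_qaly(total_a, total_b, qaly_new=0.82, qaly_old=0.81)
```

    Varying the `rebate` argument reproduces the abstract's central point: the ranking of regimens can flip on rebate assumptions alone, before copayments or administration costs are even considered.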

  1. The Effect of Emphasizing Mathematical Structure in the Acquisition of Whole Number Computation Skills (Addition and Subtraction) By Seven- and Eight-Year Olds: A Clinical Investigation.

    ERIC Educational Resources Information Center

    Uprichard, A. Edward; Collura, Carolyn

    This investigation sought to determine the effect of emphasizing mathematical structure in the acquisition of computational skills by seven- and eight-year-olds. The meaningful development-of-structure approach emphasized closure, commutativity, associativity, and the identity element of addition; the inverse relationship between addition and…

  2. Design and development of data glove based on printed polymeric sensors and Zigbee networks for Human-Computer Interface.

    PubMed

    Tongrod, Nattapong; Lokavee, Shongpun; Watthanawisuth, Natthapol; Tuantranont, Adisorn; Kerdcharoen, Teerakiat

    2013-03-01

    Current trends in Human-Computer Interface (HCI) have brought on a wave of new consumer devices that can track the motion of our hands. These devices have enabled more natural interfaces with computer applications. Data gloves are commonly used as input devices, equipped with sensors that detect the movements of hands and a communication unit that interfaces those movements with a computer. Unfortunately, the high cost of sensor technology inevitably puts some burden on most general users. In this research, we have proposed a low-cost data glove concept based on printed polymeric pressure and bending sensors fabricated with a consumer ink-jet printer. These sensors were realized using a conductive polymer (poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) [PEDOT:PSS]) thin film printed on glossy photo paper. The performance of these sensors can be enhanced by the addition of dimethyl sulfoxide (DMSO) into the aqueous dispersion of PEDOT:PSS. The concept of surface resistance was successfully adopted for the design and fabrication of the sensors. To demonstrate the printed sensors, we constructed a data glove using them and developed software for real-time hand tracking. Wireless networks based on low-cost Zigbee technology were used to transfer data from the glove to a computer. To our knowledge, this is the first report on a low-cost data glove based on paper pressure sensors. This low-cost implementation of both the sensors and the communication network, as proposed in this paper, should pave the way toward widespread implementation of data gloves for real-time hand tracking applications.

  3. A Low Cost Structurally Optimized Design for Diverse Filter Types

    PubMed Central

    Kazmi, Majida; Aziz, Arshad; Akhtar, Pervez; Ikram, Nassar

    2016-01-01

    A wide range of image processing applications deploys two dimensional (2D) filters for performing diversified tasks such as image enhancement, edge detection, noise suppression, multi-scale decomposition and compression. All of these tasks require multiple types of 2D filters simultaneously to acquire the desired results. The resource-hungry conventional approach is not a viable option for implementing these computationally intensive 2D filters, especially in a resource-constrained environment, and thus calls for optimized solutions. The optimization of these filters is mostly based on exploiting structural properties. A common shortcoming of all previously reported optimized approaches is their restricted applicability to only a specific filter type. These narrow-scoped solutions completely disregard the versatility attribute of advanced image processing applications and in turn offset their effectiveness while implementing a complete application. This paper presents an efficient framework which exploits the structural properties of 2D filters to effectively reduce their computational cost, with the added advantage of versatility in supporting diverse filter types. A composite symmetric filter structure is introduced which exploits the identities of quadrant and circular T-symmetries in two distinct filter regions simultaneously. These T-symmetries effectively reduce the number of filter coefficients and consequently the multiplier count. The proposed framework at the same time empowers this composite filter structure with the additional capability of realizing all of its Ψ-symmetry based subtypes and also its special asymmetric filter case. The two-fold optimized framework thus reduces the filter computational cost by up to 75% compared to the conventional approach, while its versatility attribute not only supports diverse filter types but also offers further cost reduction via resource sharing for sequential implementation of diversified image
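    The quadrant-symmetry identity exploited by such frameworks can be sketched in a few lines: symmetric input samples are folded (added) before multiplication, so each distinct coefficient is multiplied only once, roughly a four-fold multiplier saving. This is a generic illustration of the identity, not the paper's composite T-symmetric structure.

```python
import numpy as np

def quad_symmetric_response(window, quarter):
    """One output sample of a quadrant-symmetric 2D FIR filter.

    window : (2m+1, 2m+1) input samples centred on the output pixel
    quarter: (m+1, m+1) distinct coefficients (centre row/column included)

    Symmetric samples are folded (added) first, so each distinct coefficient
    is multiplied only once.
    """
    m = quarter.shape[0] - 1
    acc = 0.0
    for i in range(m + 1):
        for j in range(m + 1):
            rows = {m - i, m + i}              # the up-to-two mirrored rows
            cols = {m - j, m + j}              # the up-to-two mirrored columns
            fold = sum(window[r, c] for r in rows for c in cols)
            acc += quarter[i, j] * fold
    return acc

# Check against a naive reference built from the full mirrored kernel
rng = np.random.default_rng(0)
window = rng.normal(size=(5, 5))
quarter = rng.normal(size=(3, 3))

full = np.zeros((5, 5))
for i in range(3):
    for j in range(3):
        for r in {2 - i, 2 + i}:
            for c in {2 - j, 2 + j}:
                full[r, c] = quarter[i, j]

fast = quad_symmetric_response(window, quarter)
naive = float(np.sum(window * full))
```

    For a (2m+1)×(2m+1) kernel the folded form needs (m+1)² multiplications instead of (2m+1)², which is where the reported hardware savings come from.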

  4. Designers' unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  5. Life-cycle costs of high-performance cells

    NASA Technical Reports Server (NTRS)

    Daniel, R.; Burger, D.; Reiter, L.

    1985-01-01

    A life cycle cost analysis of high efficiency cells was presented. Although high efficiency cells produce more power, they also cost more to make and are more susceptible to array hot-spot heating. Three different computer analysis programs were used: SAMICS (solar array manufacturing industry costing standards), PVARRAY (an array failure mode/degradation simulator), and LCP (lifetime cost and performance). The high efficiency cell modules were found to be more economical in this study, but parallel redundancy is recommended.

  6. Effects on costs of frontline diagnostic evaluation in patients suspected of angina: coronary computed tomography angiography vs. conventional ischaemia testing.

    PubMed

    Nielsen, Lene H; Olsen, Jens; Markenvard, John; Jensen, Jesper M; Nørgaard, Bjarne L

    2013-05-01

    The aim of this study was to investigate in patients with stable angina the effects on costs of frontline diagnostics by exercise-stress testing (ex-test) vs. coronary computed tomography angiography (CTA). In two coronary units at Lillebaelt Hospital, Denmark, 498 patients were identified in whom either ex-test (n = 247) or CTA (n = 251) were applied as the frontline diagnostic strategy in symptomatic patients with a low-intermediate pre-test probability of coronary artery disease (CAD). During 12 months of follow-up, death, myocardial infarction and costs associated with downstream diagnostic utilization (DTU), treatment, ambulatory visits, and hospitalizations were registered. There was no difference between cohorts in demographic characteristics or the pre-test probability of significant CAD. The mean (SD) age was 56 (11) years; 52% were men; and 96% were at low-intermediate pre-test probability of CAD. All serious cardiac events (n = 3) during follow-up occurred in patients with a negative ex-test result. Mean costs per patient associated with DTU, ambulatory visits, and cardiovascular medication were significantly higher in the ex-test than in the CTA group. The mean (SD) total costs per patient at the end of the follow-up were 14% lower in the CTA group than in the ex-test group, € 1510 (3474) vs. €1777 (3746) (P = 0.03). Diagnostic assessment of symptomatic patients with a low-intermediate probability of CAD by CTA incurred lower costs when compared with the ex-test. These findings need confirmation in future prospective trials.

  7. RVU costing applications.

    PubMed

    Berlin, M F; Faber, B P; Berlin, L M; Budzynski, M R

    1997-11-01

    Relative value unit (RVU) cost accounting, which uses the resource-based relative value scale (RBRVS), can be used to determine the cost of producing given services and to set appropriate physician fees. The calculations derived from RVU costing have additional applications, such as analyzing fee schedules, evaluating the profitability of third-party payer reimbursement, calculating a floor capitation rate, and allocating capitation payments within the group. The ability to produce this information can help group practice administrators determine ways to manage the cost of providing services, set more realistic fees, and negotiate more profitable contracts.
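    The core RVU allocation is simple arithmetic: divide total practice cost by total RVUs produced, then scale by a service's RVUs. A sketch with purely illustrative figures (the dollar amounts and the 2.11 RVU weight are assumptions, not published RBRVS values):

```python
def cost_per_service(total_practice_cost, total_rvus, service_rvus):
    """Allocate practice cost to one service in proportion to its RVUs."""
    cost_per_rvu = total_practice_cost / total_rvus
    return cost_per_rvu * service_rvus

# Illustrative figures only: $900,000 annual cost spread over 30,000 total RVUs,
# applied to a service assumed to be worth 2.11 RVUs
visit_cost = cost_per_service(900_000, 30_000, service_rvus=2.11)
```

    Comparing `visit_cost` against a payer's reimbursement for the same service is the profitability check the abstract describes.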

  8. 42 CFR 412.29 - Excluded rehabilitation units: Additional requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Costs and Inpatient Capital-Related Costs § 412.29 Excluded rehabilitation units: Additional..., social services, psychological services (including neuropsychological services), and orthotic and...

  9. Computerized LCC/ORLA methodology. [Life cycle cost/optimum repair level analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henderson, J.T.

    1979-01-01

    The effort by Sandia Laboratories in developing CDC6600 computer programs for Optimum Repair Level Analysis (ORLA) and Life Cycle Cost (LCC) analysis is described. Investigation of the three repair-level strategies referenced in AFLCM/AFSCM 800-4 (base discard of subassemblies, base repair of subassemblies, and depot repair of subassemblies) was expanded to include an additional three repair-level strategies (base discard of complete assemblies and, upon shipment of complete assemblies to the depot, depot repair of assemblies by subassembly repair, and depot repair of assemblies by subassembly discard). The expanded ORLA was used directly in an LCC model that was procedurally altered to accommodate the ORLA input data. Available from the LCC computer run was an LCC value corresponding to the strategy chosen from the ORLA. 2 figures.
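    Once an LCC value has been computed for each of the six repair-level strategies named above, the ORLA choice reduces to picking the minimum. A sketch with hypothetical 10-year support costs (the dollar figures are illustrative only):

```python
# Hypothetical 10-year support costs (k$) for the six repair-level strategies
strategy_costs = {
    "base discard of subassemblies": 410.0,
    "base repair of subassemblies": 365.0,
    "depot repair of subassemblies": 392.0,
    "base discard of complete assemblies": 455.0,
    "depot repair of assemblies by subassembly repair": 348.0,
    "depot repair of assemblies by subassembly discard": 377.0,
}

# The repair-level decision: pick the strategy with the lowest life cycle cost
best_strategy = min(strategy_costs, key=strategy_costs.get)
best_cost = strategy_costs[best_strategy]
```

    In the actual programs each cost would itself come from an LCC run; the selection step is the part the ORLA output feeds.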

  10. DOD Civilian and Contractor Workforces: Additional Cost Savings Data and Efficiencies Plan are Needed

    DTIC Science & Technology

    2016-10-01

    Note: Reductions were reported in average strength and in full-time equivalents. DOD cost savings provided for the prior FY are ... comparing costs from FY 2012 to FY 2017, and not each year in between. Further, officials stated that DOD did not include full-time equivalents ... Abbreviations: FTE, full-time equivalent; NDAA, National Defense Authorization Act. This is a work of the U.S. government and is not subject to copyright.

  11. Applying Activity Based Costing (ABC) Method to Calculate Cost Price in Hospital and Remedy Services

    PubMed Central

    Rajabi, A; Dabiri, A

    2012-01-01

    Background Activity Based Costing (ABC) is one of the new methods that began appearing as a costing methodology in the 1990s. It calculates cost price by determining the usage of resources. In this study, the ABC method was used to calculate the cost price of remedial services in hospitals. Methods: To apply the ABC method, Shahid Faghihi Hospital was selected. First, hospital units were divided into three main departments: administrative, diagnostic, and hospitalized. Second, activity centers were defined by the activity analysis method. Third, costs of administrative activity centers were allocated to the diagnostic and operational departments based on cost drivers. Finally, with regard to the usage of cost objectives from the services of the activity centers, the cost price of medical services was calculated. Results: The cost price from the ABC method differs significantly from the tariff method. In addition, the high share of indirect costs in the hospital indicates that resource capacities are not used properly. Conclusion: The cost price of remedial services calculated with the tariff method differs from that obtained with the ABC method. ABC calculates cost price through suitable allocation mechanisms, whereas the tariff method is based on fixed prices. In addition, ABC provides useful information about the amount and composition of the cost price of services. PMID:23113171
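    The driver-based allocation at the heart of ABC can be sketched as follows; the cost figure, the head-count driver, and the centre names are illustrative only, not the hospital's data:

```python
def abc_allocate(admin_cost, driver_usage):
    """Spread an administrative activity-centre cost over operational centres
    in proportion to each centre's consumption of the cost driver."""
    total = sum(driver_usage.values())
    return {centre: admin_cost * use / total for centre, use in driver_usage.items()}

# Illustrative: allocate a 120,000 administrative cost by staff head-count
allocation = abc_allocate(120_000, {"diagnostic": 30, "hospitalized": 90})
```

    The allocated shares, plus each centre's direct costs, divided by its service volume, give the ABC cost price that the study compares against the fixed tariff.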

  12. Adsorption of molecular additive onto lead halide perovskite surfaces: A computational study on Lewis base thiophene additive passivation

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Yu, Fengxi; Chen, Lihong; Li, Jingfa

    2018-06-01

    Organic additives, such as the Lewis base thiophene, have been successfully applied to passivate halide perovskite surfaces, improving the stability and properties of perovskite devices based on CH3NH3PbI3. Yet the detailed nanostructure of additive-passivated perovskite surfaces and the mechanisms of such passivation are not well understood. This study presents a nanoscopic view of the interfacial structure of an additive/perovskite interface, consisting of a Lewis base thiophene molecular additive on a lead halide perovskite surface substrate, providing insights into the mechanisms by which molecular additives passivate halide perovskite surfaces and enhance the performance of perovskite-based devices. A molecular dynamics study of the interactions between water molecules and perovskite surfaces passivated by the investigated additive reveals the effectiveness of employing molecular additives to improve the stability of halide perovskite materials. The additive/perovskite surface system is further probed via molecular engineering of the perovskite surfaces. This study reveals the nanoscopic structure-property relationships of halide perovskite surfaces passivated by molecular additives, which aids the fundamental understanding of surface/interface engineering strategies for the development of halide perovskite based devices.

  13. Mechanism and enantioselectivity in palladium-catalyzed conjugate addition of arylboronic acids to β-substituted cyclic enones: insights from computation and experiment.

    PubMed

    Holder, Jeffrey C; Zou, Lufeng; Marziale, Alexander N; Liu, Peng; Lan, Yu; Gatti, Michele; Kikushima, Kotaro; Houk, K N; Stoltz, Brian M

    2013-10-09

    Enantioselective conjugate additions of arylboronic acids to β-substituted cyclic enones have been previously reported from our laboratories. Air- and moisture-tolerant conditions were achieved with a catalyst derived in situ from palladium(II) trifluoroacetate and the chiral ligand (S)-t-BuPyOx. We now report a combined experimental and computational investigation on the mechanism, the nature of the active catalyst, the origins of the enantioselectivity, and the stereoelectronic effects of the ligand and the substrates of this transformation. Enantioselectivity is controlled primarily by steric repulsions between the t-Bu group of the chiral ligand and the α-methylene hydrogens of the enone substrate in the enantiodetermining carbopalladation step. Computations indicate that the reaction occurs via formation of a cationic arylpalladium(II) species, and subsequent carbopalladation of the enone olefin forms the key carbon-carbon bond. Studies of nonlinear effects and stoichiometric and catalytic reactions of isolated (PyOx)Pd(Ph)I complexes show that a monomeric arylpalladium-ligand complex is the active species in the selectivity-determining step. The addition of water and ammonium hexafluorophosphate synergistically increases the rate of the reaction, corroborating the hypothesis that a cationic palladium species is involved in the reaction pathway. These additives also allow the reaction to be performed at 40 °C and facilitate an expanded substrate scope.

  14. Mechanism and Enantioselectivity in Palladium-Catalyzed Conjugate Addition of Arylboronic Acids to β-Substituted Cyclic Enones: Insights from Computation and Experiment

    PubMed Central

    Holder, Jeffrey C.; Zou, Lufeng; Marziale, Alexander N.; Liu, Peng; Lan, Yu; Gatti, Michele; Kikushima, Kotaro; Houk, K. N.; Stoltz, Brian M.

    2013-01-01

    Enantioselective conjugate additions of arylboronic acids to β-substituted cyclic enones have been reported previously from our laboratories. Air and moisture tolerant conditions were achieved with a catalyst derived in situ from palladium(II) trifluoroacetate and the chiral ligand (S)-t-BuPyOx. We now report a combined experimental and computational investigation on the mechanism, the nature of the active catalyst, the origins of the enantioselectivity, and the stereoelectronic effects of the ligand and the substrates of this transformation. Enantioselectivity is controlled primarily by steric repulsions between the t-Bu group of the chiral ligand and the α-methylene hydrogens of the enone substrate in the enantiodetermining carbopalladation step. Computations indicate that the reaction occurs via formation of a cationic arylpalladium(II) species, and subsequent carbopalladation of the enone olefin forms the key carbon-carbon bond. Studies of non-linear effects and stoichiometric and catalytic reactions of isolated (PyOx)Pd(Ph)I complexes show that a monomeric arylpalladium-ligand complex is the active species in the selectivity-determining step. The addition of water and ammonium hexafluorophosphate synergistically increases the rate of the reaction, corroborating the hypothesis that a cationic palladium species is involved in the reaction pathway. These additives also allow the reaction to be performed at 40 °C and facilitate an expanded substrate scope. PMID:24028424

  15. The Winds, and the Costs, of Change.

    ERIC Educational Resources Information Center

    Moran, Charles

    1993-01-01

    Outlines the drastic changes involved in introducing computer technology into the teaching of writing, particularly with regard to the overarching costs of such changes and the sense of dislocation that often accompanies them. Discusses problems that teachers typically have when shifting to computer instruction. (HB)

  16. omniClassifier: a Desktop Grid Computing System for Big Data Prediction Modeling

    PubMed Central

    Phan, John H.; Kothari, Sonal; Wang, May D.

    2016-01-01

    Robust prediction models are important for numerous science, engineering, and biomedical applications. However, best-practice procedures for optimizing prediction models can be computationally complex, especially when choosing models from among hundreds or thousands of parameter choices. Computational complexity has further increased with the growth of data in these fields, concurrent with the era of “Big Data”. Grid computing is a potential solution to the computational challenges of Big Data. Desktop grid computing, which uses idle CPU cycles of commodity desktop machines, coupled with commercial cloud computing resources can enable research labs to gain easier and more cost-effective access to vast computing resources. We have developed omniClassifier, a multi-purpose prediction modeling application that provides researchers with a tool for conducting machine learning research within the guidelines of recommended best-practices. omniClassifier is implemented as a desktop grid computing system using the Berkeley Open Infrastructure for Network Computing (BOINC) middleware. In addition to describing implementation details, we use various gene expression datasets to demonstrate the potential scalability of omniClassifier for efficient and robust Big Data prediction modeling. A prototype of omniClassifier can be accessed at http://omniclassifier.bme.gatech.edu/. PMID:27532062

  17. Additive Manufacturing for Affordable Rocket Engines

    NASA Technical Reports Server (NTRS)

    West, Brian; Robertson, Elizabeth; Osborne, Robin; Calvert, Marty

    2016-01-01

    Additive manufacturing (also known as 3D printing) technology has the potential to drastically reduce the costs and lead times associated with the development of complex liquid rocket engine systems. NASA is using 3D printing to manufacture rocket engine components including augmented spark igniters, injectors, turbopumps, and valves, and is advancing the process to certify these components for flight. Success Story: MSFC has been developing rocket 3D-printing technology using the Selective Laser Melting (SLM) process. Over the last several years, NASA has built and tested several injectors and combustion chambers. Recently, MSFC 3D printed an augmented spark igniter for potential use on the RS-25 engines that will be used on the Space Launch System. The new design is expected to reduce the cost of the igniter by a factor of four. MSFC has also 3D printed and tested a liquid hydrogen turbopump for potential use on an Upper Stage Engine. Additive manufacturing of the turbopump resulted in a 45% part-count reduction. To understand how the 3D-printed parts perform and to certify them for flight, MSFC built a breadboard liquid rocket engine using additively manufactured components including injectors, turbomachinery, and valves. The liquid rocket engine was tested seven times in 2016 using liquid oxygen and liquid hydrogen. In addition to exposing the hardware to harsh environments, engineers learned to design for the new manufacturing technique, taking advantage of its capabilities and gaining awareness of its limitations. Benefit: The 3D-printing technology promises reduced cost and schedule for rocket engines. Cost is a function of complexity, and the most complicated features provide the largest opportunities for cost reductions. This is especially true where brazes or welds can be eliminated. The drastic reduction in part count achievable with 3D printing creates a waterfall effect that reduces the number of processes and drawings, decreases the amount of touch

  18. Computer software to estimate timber harvesting system production, cost, and revenue

    Treesearch

    Dr. John E. Baumgras; Dr. Chris B. LeDoux

    1992-01-01

    Large variations in timber harvesting cost and revenue can result from the differences between harvesting systems, the variable attributes of harvesting sites and timber stands, or changing product markets. Consequently, system and site specific estimates of production rates and costs are required to improve estimates of harvesting revenue. This paper describes...

  19. Low-cost data analysis systems for processing multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Whitely, S. L.

    1976-01-01

    The basic hardware and software requirements are described for four low cost analysis systems for computer generated land use maps. The data analysis systems consist of an image display system, a small digital computer, and an output recording device. Software is described together with some of the display and recording devices, and typical costs are cited. Computer requirements are given, and two approaches are described for converting black-white film and electrostatic printer output to inexpensive color output products. Examples of output products are shown.

  20. Computers for Your Classroom: CAI and CMI.

    ERIC Educational Resources Information Center

    Thomas, David B.; Bozeman, William C.

    1981-01-01

    The availability of compact, low-cost computer systems provides a means of assisting classroom teachers in the performance of their duties. Computer-assisted instruction (CAI) and computer-managed instruction (CMI) are two applications of computer technology with which school administrators should become familiar. CAI is a teaching medium in which…

  1. Radiation Tolerant, FPGA-Based SmallSat Computer System

    NASA Technical Reports Server (NTRS)

    LaMeres, Brock J.; Crum, Gary A.; Martinez, Andres; Petro, Andrew

    2015-01-01

    The Radiation Tolerant, FPGA-based SmallSat Computer System (RadSat) computing platform exploits a commercial off-the-shelf (COTS) Field Programmable Gate Array (FPGA) with real-time partial reconfiguration to provide increased performance, power efficiency and radiation tolerance at a fraction of the cost of existing radiation hardened computing solutions. This technology is ideal for small spacecraft that require state-of-the-art on-board processing in harsh radiation environments but where using radiation hardened processors is cost prohibitive.

  2. Computer Based Education.

    ERIC Educational Resources Information Center

    Fauley, Franz E.

    1980-01-01

    A case study of what one company did to increase the productivity of its sales force and generate cost savings by using computer-assisted instruction to teach salespeople at regional offices. (Editor)

  3. Battlefield awareness computers: the engine of battlefield digitization

    NASA Astrophysics Data System (ADS)

    Ho, Jackson; Chamseddine, Ahmad

    1997-06-01

    To modernize the army for the 21st century, the U.S. Army Digitization Office (ADO) initiated in 1995 the Force XXI Battle Command Brigade-and-Below (FBCB2) Applique program, which became a centerpiece in the U.S. Army's master plan to win future information wars. The Applique team led by TRW fielded a 'tactical Internet' for Brigade-and-below command to demonstrate the advantages of 'shared situation awareness' and battlefield digitization in advanced war-fighting experiments (AWE) conducted in March 1997 at the Army's National Training Center in California. Computing Devices is designated the primary hardware developer for the militarized version of the battlefield awareness computers. The first generation of militarized battlefield awareness computer, designated the V3 computer, was an integration of off-the-shelf components developed to meet the aggressive delivery requirements of the Task Force XXI AWE. The design efficiency and cost effectiveness of the computer hardware were secondary in importance to the delivery deadlines imposed by the March 1997 AWE. However, declining defense budgets will impose cost constraints on the Force XXI production hardware that can only be met by rigorous value engineering to further improve design optimization for battlefield awareness without compromising the level of reliability the military has come to expect in modern military hardened vetronics. To answer the Army's needs for a more cost-effective computing solution, Computing Devices developed a second-generation 'combat ready' battlefield awareness computer, designated the V3+, which is designed specifically to meet the upcoming demands of Force XXI (FBCB2) and beyond. The primary design objective is to achieve a technologically superior design, value engineered to strike an optimal balance between reliability, life cycle cost, and procurement cost. Recognizing that the diverse digitization demands of Force XXI cannot be adequately met by any one computer hardware

  4. Electricity from fossil fuels without CO2 emissions: assessing the costs of carbon dioxide capture and sequestration in U.S. electricity markets.

    PubMed

    Johnson, T L; Keith, D W

    2001-10-01

    The decoupling of fossil-fueled electricity production from atmospheric CO2 emissions via CO2 capture and sequestration (CCS) is increasingly regarded as an important means of mitigating climate change at a reasonable cost. Engineering analyses of CO2 mitigation typically compare the cost of electricity for a base generation technology to that for a similar plant with CO2 capture and then compute the carbon emissions mitigated per unit of cost. It can be hard to interpret mitigation cost estimates from this plant-level approach when a consistent base technology cannot be identified. In addition, neither engineering analyses nor general equilibrium models can capture the economics of plant dispatch. A realistic assessment of the costs of carbon sequestration as an emissions abatement strategy in the electric sector therefore requires a systems-level analysis. We discuss various frameworks for computing mitigation costs and introduce a simplified model of electric sector planning. Results from a "bottom-up" engineering-economic analysis for a representative U.S. North American Electric Reliability Council (NERC) region illustrate how the penetration of CCS technologies and the dispatch of generating units vary with the price of carbon emissions and thereby determine the relationship between mitigation cost and emissions reduction.
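
    The plant-level comparison this abstract describes (cost of electricity for a base plant versus a capture plant, divided by the emissions reduction) is conventionally summarized as a "cost of CO2 avoided". A minimal sketch, with illustrative numbers that are not drawn from the paper:

```python
def cost_of_co2_avoided(coe_base, coe_ccs, em_base, em_ccs):
    """COE in $/MWh, emissions in tCO2/MWh -> $/tCO2 avoided.
    Compares a capture plant against a base plant of the same type."""
    return (coe_ccs - coe_base) / (em_base - em_ccs)

# Hypothetical coal plant: COE rises from $50 to $70/MWh with capture,
# while emissions fall from 0.9 to 0.1 tCO2/MWh.
mitigation_cost = cost_of_co2_avoided(coe_base=50.0, coe_ccs=70.0,
                                      em_base=0.9, em_ccs=0.1)
# (70 - 50) / (0.9 - 0.1) = 25 $/tCO2 avoided
```

    As the abstract notes, this ratio is only well defined when a consistent base technology can be identified, which is one motivation for the paper's systems-level analysis.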

  5. Electricity from Fossil Fuels without CO2 Emissions: Assessing the Costs of Carbon Dioxide Capture and Sequestration in U.S. Electricity Markets.

    PubMed

    Johnson, Timothy L; Keith, David W

    2001-10-01

    The decoupling of fossil-fueled electricity production from atmospheric CO2 emissions via CO2 capture and sequestration (CCS) is increasingly regarded as an important means of mitigating climate change at a reasonable cost. Engineering analyses of CO2 mitigation typically compare the cost of electricity for a base generation technology to that for a similar plant with CO2 capture and then compute the carbon emissions mitigated per unit of cost. It can be hard to interpret mitigation cost estimates from this plant-level approach when a consistent base technology cannot be identified. In addition, neither engineering analyses nor general equilibrium models can capture the economics of plant dispatch. A realistic assessment of the costs of carbon sequestration as an emissions abatement strategy in the electric sector therefore requires a systems-level analysis. We discuss various frameworks for computing mitigation costs and introduce a simplified model of electric sector planning. Results from a "bottom-up" engineering-economic analysis for a representative U.S. North American Electric Reliability Council (NERC) region illustrate how the penetration of CCS technologies and the dispatch of generating units vary with the price of carbon emissions and thereby determine the relationship between mitigation cost and emissions reduction.

  6. Cloud computing for comparative genomics with windows azure platform.

    PubMed

    Kim, Insik; Jung, Jae-Yoon; Deluca, Todd F; Nelson, Tristan H; Wall, Dennis P

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative for cluster systems as the number of genomes and required computation power to analyze them increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services.

  7. The Health Care Costs of Violence against Women

    ERIC Educational Resources Information Center

    Kruse, Marie; Sorensen, Jan; Bronnum-Hansen, Henrik; Helweg-Larsen, Karin

    2011-01-01

    The aim of this study is to analyze the health care costs of violence against women. For the study, we used a register-based approach where we identified victims of violence and assessed their actual health care costs at individual level in a bottom-up analysis. Furthermore, we identified a reference population. We computed the attributable costs,…

  8. Reliability and cost analysis methods

    NASA Technical Reports Server (NTRS)

    Suich, Ronald C.

    1991-01-01

    In the design phase of a system, how does a design engineer or manager choose between a subsystem with .990 reliability and a more costly subsystem with .995 reliability? When is the increased cost justified? High reliability is not necessarily an end in itself but may be desirable in order to reduce the expected cost due to subsystem failure. However, this may not be the wisest use of funds since the expected cost due to subsystem failure is not the only cost involved. The subsystem itself may be very costly. We should not consider either the cost of the subsystem or the expected cost due to subsystem failure separately but should minimize the total of the two costs, i.e., the total of the cost of the subsystem plus the expected cost due to subsystem failure. This final report discusses the Combined Analysis of Reliability, Redundancy, and Cost (CARRAC) methods which were developed under Grant Number NAG 3-1100 from the NASA Lewis Research Center. CARRAC methods and a CARRAC computer program employ five models which can be used to cover a wide range of problems. The models contain an option which can include repair of failed modules.
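
    The report's total-cost criterion (minimize subsystem cost plus expected cost due to subsystem failure, rather than maximizing reliability for its own sake) can be made concrete with a small sketch. The subsystem prices and the failure-consequence cost below are hypothetical, chosen only to mirror the .990-versus-.995 example in the abstract.

```python
def total_expected_cost(subsystem_cost, reliability, failure_cost):
    """Total = purchase cost + P(failure) * cost incurred on failure."""
    return subsystem_cost + (1.0 - reliability) * failure_cost

FAILURE_COST = 2_000_000  # assumed cost if the subsystem fails in service

# Option A: cheaper, .990 reliability; Option B: costlier, .995 reliability.
option_a = total_expected_cost(50_000, 0.990, FAILURE_COST)  # 50000 + 20000
option_b = total_expected_cost(80_000, 0.995, FAILURE_COST)  # 80000 + 10000

best = min(("A", option_a), ("B", option_b), key=lambda t: t[1])
# At this failure cost, option A totals 70,000 vs 90,000 for B: the extra
# 0.005 reliability is not worth its $30,000 price premium.
```

    Raising the assumed failure cost eventually flips the decision, which is exactly the trade-off the CARRAC models are designed to explore.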

  9. Environmental Cracking and Irradiation Resistant Stainless Steels by Additive Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rebak, Raul B.; Lou, Xiaoyuan

    Metal additive manufacturing (AM), or metal 3D printing, is an emergent advanced manufacturing method that can create near net shape geometries directly from computer models. This technology can provide the capability to rapidly fabricate complex parts that may be required to enhance the integrity of reactor internals components. Such opportunities may be observed during a plant refueling outage, where AM parts can be rapidly custom designed, manufactured, and deployed within the outage interval. Additive manufacturing of stainless steel (SS) components can add business benefits such as fast delivery of repair hardware, installation tooling, new design prototype tests, etc. For the nuclear industry, the supply chain is always an issue for reactor service. AM can provide a through-life supply chain (40-60 years) for high-value, low-volume components. In the meantime, the capability of generating complex geometries and functionally graded materials will improve performance, reduce the overall component cost and plant asset management cost, and increase plant reliability through improved materials performance in nuclear environments. While extensive work has been conducted on additive manufacturing of austenitic SS parts, most efforts focused only on basic attributes such as porosity, residual stress, and basic tensile properties, along with component yield and process monitoring. Little work has been done to define and evaluate the material requirements for nuclear applications. Technical gaps exist which limit this technology's adoption in the nuclear industry, including high manufacturing cost, unknown risks, limited nuclear-related data, lack of specification and qualification methods, and no prior business experience. The main objective of this program was to generate research data to address all these technical gaps and establish a commercial practice to use AM technology in the nuclear power industry.
The detailed objectives are listed as follows

  10. PACE 2: Pricing and Cost Estimating Handbook

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.; Shepherd, T.

    1977-01-01

    An automatic data processing system for preparing industrial-engineering-type manhour and material cost estimates has been established. This computer system has evolved into a highly versatile and flexible tool that significantly reduces computation time, eliminates computational errors, and cuts typing and reproduction time for estimators and pricers, since all mathematical and clerical functions are automatic once the basic inputs are supplied.

  11. Automated Estimation Of Software-Development Costs

    NASA Technical Reports Server (NTRS)

    Roush, George B.; Reini, William

    1993-01-01

    COSTMODL is automated software development-estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as function of defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
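
    As an illustration of the kind of parametric estimate such a tool computes from a product description, here is a basic COCOMO-style model (effort = a * KLOC^b, schedule derived from effort). The coefficients are the published "organic mode" values for that generic model; they are an assumption for illustration, not a claim about COSTMODL's internal equations.

```python
def effort_person_months(kloc, a=2.4, b=1.05):
    """Nominal development effort for a project of `kloc` thousand
    delivered source lines (generic COCOMO organic-mode coefficients)."""
    return a * kloc ** b

def schedule_months(effort, c=2.5, d=0.38):
    """Calendar schedule derived from the effort estimate."""
    return c * effort ** d

# Hypothetical 32-KLOC product description:
effort = effort_person_months(32)   # roughly 91 person-months
months = schedule_months(effort)    # roughly 14 calendar months
staff = effort / months             # implied average staffing level
```

    Distributing `effort` and `staff` across defined life-cycle phases, as the abstract describes, is then a matter of applying per-phase percentage weights.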

  12. 47 CFR 32.2124 - General purpose computers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...

  13. 47 CFR 32.2124 - General purpose computers.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...

  14. 47 CFR 32.2124 - General purpose computers.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 2 2014-10-01 2014-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...

  15. 47 CFR 32.2124 - General purpose computers.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 2 2013-10-01 2013-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...

  16. 47 CFR 32.2124 - General purpose computers.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 2 2012-10-01 2012-10-01 false General purpose computers. 32.2124 Section 32.2124 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM... General purpose computers. (a) This account shall include the original cost of computers and peripheral...

  17. Performance Models for Split-execution Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S; McCaskey, Alex; Schrock, Jonathan

    Split-execution computing leverages the capabilities of multiple computational models to solve problems, but splitting program execution across different computational models incurs costs associated with the translation between domains. We analyze the performance of a split-execution computing system developed from conventional and quantum processing units (QPUs) by using behavioral models that track resource usage. We focus on asymmetric processing models built using conventional CPUs and a family of special-purpose QPUs that employ quantum computing principles. Our performance models account for the translation of a classical optimization problem into the physical representation required by the quantum processor while also accounting for hardware limitations and conventional processor speed and memory. We conclude that the bottleneck in this split-execution computing system lies at the quantum-classical interface and that the primary time cost is independent of quantum processor behavior.

  18. Cost optimization for buildings with hybrid ventilation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ji, Kun; Lu, Yan

    A method including: computing a total cost for a first zone in a building, wherein the total cost is equal to an actual energy cost of the first zone plus a thermal discomfort cost of the first zone; and heuristically optimizing the total cost to identify temperature setpoints for a mechanical heating/cooling system and a start time and an end time of the mechanical heating/cooling system, based on external weather data and occupancy data of the first zone.
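
    The optimization the claim describes can be sketched as a brute-force search over candidate setpoints and start/end times, minimizing total cost = actual energy cost + thermal discomfort cost. The energy and discomfort cost models below are stand-in assumptions, since the abstract does not specify the patent's actual models or its weather and occupancy inputs.

```python
import itertools

def energy_cost(setpoint, start, end, outdoor=30.0, rate=0.12):
    """Assumed model: cost grows with run length (hours) and with the
    gap between the setpoint and the outdoor temperature."""
    return rate * (end - start) * abs(outdoor - setpoint)

def discomfort_cost(setpoint, comfort=22.0, weight=0.5):
    """Assumed model: penalty grows with distance from a comfort point."""
    return weight * (setpoint - comfort) ** 2

def optimize(setpoints, starts, ends):
    """Heuristic (exhaustive) search for the lowest-total-cost schedule."""
    candidates = ((sp, s, e)
                  for sp, s, e in itertools.product(setpoints, starts, ends)
                  if s < e)
    return min(candidates,
               key=lambda c: energy_cost(*c) + discomfort_cost(c[0]))

best_sp, best_start, best_end = optimize(
    setpoints=[20, 22, 24, 26], starts=[6, 8], ends=[16, 18])
# With these stand-in models, the shortest run at the comfort setpoint wins.
```

    A real controller would replace the two cost functions with calibrated models driven by the external weather data and zone occupancy data the claim references.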

  19. Costs Associated with Implementation of Computer-Assisted Clinical Decision Support System for Antenatal and Delivery Care: Case Study of Kassena-Nankana District of Northern Ghana

    PubMed Central

    Dalaba, Maxwell Ayindenaba; Akweongo, Patricia; Williams, John; Saronga, Happiness Pius; Tonchev, Pencho; Sauerborn, Rainer; Mensah, Nathan; Blank, Antje; Kaltschmidt, Jens; Loukanova, Svetla

    2014-01-01

    Objective This study analyzed the cost of implementing a computer-assisted Clinical Decision Support System (CDSS) in selected health care centres in Ghana. Methods A descriptive cross sectional study was conducted in the Kassena-Nankana district (KND). CDSS was deployed in selected health centres in KND as an intervention to manage patients attending antenatal clinics and the labour ward. The CDSS users were mainly nurses, who were trained. Activities and associated costs involved in the implementation of CDSS (pre-intervention and intervention) were collected for the period between 2009–2013 from the provider perspective. The ingredients approach was used for the cost analysis. Costs were grouped into personnel, training, overheads (recurrent costs) and equipment costs (capital cost). We calculated cost without annualizing capital cost to represent financial cost and cost with annualized capital costs to represent economic cost. Results Twenty-two trained CDSS users (at least 2 users per health centre) participated in the study. Between April 2012 and March 2013, users managed 5,595 antenatal clients and 872 labour clients using the CDSS. We observed a decrease in the proportion of complications during delivery (pre-intervention 10.74% versus post-intervention 9.64%) and a reduction in the number of maternal deaths (pre-intervention 4 deaths versus post-intervention 1 death). The overall financial cost of CDSS implementation was US$23,316, approximately US$1,060 per CDSS user trained. Of the total cost of implementation, 48% (US$11,272) was pre-intervention cost and 52% (US$12,044) was intervention cost. Equipment costs accounted for the largest proportion of financial cost: 34% (US$7,917). When economic cost was considered, the total cost of implementation was US$17,128, lower than the financial cost by 26.5%. Conclusions The study provides useful information in the implementation of CDSS at health facilities to enhance health workers' adherence to practice guidelines
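
    The annualization step that separates the study's economic cost from its financial cost can be sketched with the standard capital recovery factor. The discount rate and equipment lifespan below are assumptions for illustration; the abstract does not report the values the authors used.

```python
def annualized_capital(capital, rate, years):
    """Equivalent annual cost of a capital outlay via the capital
    recovery factor: CRF = r(1+r)^n / ((1+r)^n - 1)."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return capital * crf

# Hypothetical: the study's US$7,917 equipment outlay, spread over an
# assumed 5-year useful life at an assumed 3% discount rate.
annual_equipment_cost = annualized_capital(7_917, rate=0.03, years=5)
# roughly US$1,700-1,800 per year instead of US$7,917 up front
```

    Counting only one year's annualized share (rather than the full purchase price) is what makes the economic cost total come in below the financial cost total.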

  20. Costs associated with implementation of computer-assisted clinical decision support system for antenatal and delivery care: case study of Kassena-Nankana district of northern Ghana.

    PubMed

    Dalaba, Maxwell Ayindenaba; Akweongo, Patricia; Williams, John; Saronga, Happiness Pius; Tonchev, Pencho; Sauerborn, Rainer; Mensah, Nathan; Blank, Antje; Kaltschmidt, Jens; Loukanova, Svetla

    2014-01-01

    This study analyzed the cost of implementing a computer-assisted Clinical Decision Support System (CDSS) in selected health care centres in Ghana. A descriptive cross sectional study was conducted in the Kassena-Nankana district (KND). CDSS was deployed in selected health centres in KND as an intervention to manage patients attending antenatal clinics and the labour ward. The CDSS users were mainly nurses, who were trained. Activities and associated costs involved in the implementation of CDSS (pre-intervention and intervention) were collected for the period between 2009-2013 from the provider perspective. The ingredients approach was used for the cost analysis. Costs were grouped into personnel, training, overheads (recurrent costs) and equipment costs (capital cost). We calculated cost without annualizing capital cost to represent financial cost and cost with annualized capital costs to represent economic cost. Twenty-two trained CDSS users (at least 2 users per health centre) participated in the study. Between April 2012 and March 2013, users managed 5,595 antenatal clients and 872 labour clients using the CDSS. We observed a decrease in the proportion of complications during delivery (pre-intervention 10.74% versus post-intervention 9.64%) and a reduction in the number of maternal deaths (pre-intervention 4 deaths versus post-intervention 1 death). The overall financial cost of CDSS implementation was US$23,316, approximately US$1,060 per CDSS user trained. Of the total cost of implementation, 48% (US$11,272) was pre-intervention cost and 52% (US$12,044) was intervention cost. Equipment costs accounted for the largest proportion of financial cost: 34% (US$7,917). When economic cost was considered, the total cost of implementation was US$17,128, lower than the financial cost by 26.5%.
The study provides useful information in the implementation of CDSS at health facilities to enhance health workers' adherence to practice guidelines and taking accurate decisions to improve