Sample records for cost model validation

  1. Modeling and validating the cost and clinical pathway of colorectal cancer.

    PubMed

    Joranger, Paal; Nesbakken, Arild; Hoff, Geir; Sorbye, Halfdan; Oshaug, Arne; Aas, Eline

    2015-02-01

    Cancer is a major cause of morbidity and mortality, and colorectal cancer (CRC) is the third most common cancer in the world. The estimated costs of CRC treatment vary considerably, and if CRC costs in a model are based on empirically estimated total costs of stage I, II, III, or IV treatments, then they lack some flexibility to capture future changes in CRC treatment. The purpose was 1) to describe how to model CRC costs and survival and 2) to validate the model in a transparent and reproducible way. We applied a semi-Markov model with 70 health states and tracked age and time since specific health states (using tunnels and 3-dimensional data matrix). The model parameters are based on an observational study at Oslo University Hospital (2049 CRC patients), the National Patient Register, literature, and expert opinion. The target population was patients diagnosed with CRC. The model followed the patients diagnosed with CRC from the age of 70 until death or 100 years. The study focused on the perspective of health care payers. The model was validated for face validity, internal and external validity, and cross-validity. The validation showed a satisfactory match with other models and empirical estimates for both cost and survival time, without any preceding calibration of the model. The model can be used to 1) address a range of CRC-related themes (general model) like survival and evaluation of the cost of treatment and prevention measures; 2) make predictions from intermediate to final outcomes; 3) estimate changes in resource use and costs due to changing guidelines; and 4) adjust for future changes in treatment and trends over time. The model is adaptable to other populations. © The Author(s) 2014.
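
    A minimal sketch (not the authors' implementation) of the tunnel-state idea described above: tunnel states let a cohort-level Markov model track time since entering a health state, so costs and transition probabilities can depend on that time. All state names, probabilities, and costs below are hypothetical.

      # Hedged sketch: cohort Markov trace with tunnel states that track time
      # since recurrence. States, probabilities, and costs are invented for
      # illustration; they are not the Oslo model's parameters.
      import numpy as np

      states = ["remission", "recur_yr1", "recur_yr2", "recur_yr3plus", "dead"]
      P = np.array([
          [0.90, 0.07, 0.00, 0.00, 0.03],   # remission
          [0.00, 0.00, 0.70, 0.00, 0.30],   # 1st year after recurrence (tunnel)
          [0.00, 0.00, 0.00, 0.75, 0.25],   # 2nd year after recurrence (tunnel)
          [0.00, 0.00, 0.00, 0.85, 0.15],   # 3+ years after recurrence
          [0.00, 0.00, 0.00, 0.00, 1.00],   # absorbing death state
      ])
      annual_cost = np.array([1_000.0, 40_000.0, 15_000.0, 5_000.0, 0.0])

      cohort = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # everyone starts in remission
      total_cost = 0.0
      for cycle in range(30):                         # 30 annual cycles
          total_cost += float(cohort @ annual_cost)
          cohort = cohort @ P

      print(f"expected undiscounted cost per patient: {total_cost:,.0f}")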

  2. Cost model validation: a technical and cultural approach

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  3. SAMICS Validation. SAMICS Support Study, Phase 3

    NASA Technical Reports Server (NTRS)

    1979-01-01

    SAMICS provides a consistent basis for estimating array costs and compares production technology costs. A review and a validation of the SAMICS model are reported. The review had the following purposes: (1) to test the computational validity of the computer model by comparison with preliminary hand calculations based on conventional cost estimating techniques; (2) to review and improve the accuracy of the cost relationships being used by the model; and (3) to provide an independent verification to users of the model's value in decision making for allocation of research and development funds and for investment in manufacturing capacity. It is concluded that the SAMICS model is a flexible, accurate, and useful tool for managerial decision making.

  4. Online or on Campus: A Student Tertiary Education Cost Model Comparing the Two, with a Quality Proviso

    ERIC Educational Resources Information Center

    Ioakimidis, Marilou

    2007-01-01

    This paper presents the development and validation of a two-level hierarchical cost model for tertiary education, which enables prospective students to compare the total cost of attending a traditional Baccalaureate degree education with that of the same programme taken through distance e-learning. The model was validated by a sample of Greek…

  5. What does it cost to prevent on-duty firefighter cardiac events? A content valid method for calculating costs.

    PubMed

    Patterson, P Daniel; Suyama, Joe; Reis, Steven E; Weaver, Matthew D; Hostler, David

    2013-01-01

    Cardiac arrest is a leading cause of mortality among firefighters. We sought to develop a valid method for determining the costs of a workplace prevention program for firefighters. In 2012, we developed a draft framework using human resource accounting and in-depth interviews with experts in the firefighting and insurance industries. The interviews produced a draft cost model with 6 components and 26 subcomponents. In 2013, we randomly sampled 100 fire chiefs out of >7,400 affiliated with the International Association of Fire Chiefs. We used the Content Validity Index (CVI) to identify the content valid components of the draft cost model. This was accomplished by having fire chiefs rate the relevancy of cost components using a 4-point Likert scale (highly relevant to not relevant). We received complete survey data from 65 fire chiefs (65% response rate). We retained 5 components and 21 subcomponents based on CVI scores ≥0.70. The five main components include: (1) investment costs, (2) orientation and training costs, (3) medical and pharmaceutical costs, (4) education and continuing education costs, and (5) maintenance costs. Data from a diverse sample of fire chiefs has produced a content valid method for calculating the cost of a prevention program among firefighters.
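
    The retention rule described above is easy to reproduce: the item-level Content Validity Index is the proportion of raters scoring an item 3 or 4 on the 4-point relevance scale, and items with CVI >= 0.70 are kept. A small sketch with fabricated ratings follows; the component names are illustrative, not the survey's actual items.

      # Hedged sketch of an item-level Content Validity Index (CVI) screen.
      # Ratings are hypothetical; the paper's rule retains items with CVI >= 0.70.
      ratings = {
          "investment_costs":       [4, 4, 3, 4, 2, 4, 3, 4, 4, 3],
          "orientation_training":   [3, 4, 4, 3, 3, 4, 2, 4, 3, 4],
          "facility_upgrade_costs": [2, 1, 3, 2, 2, 1, 2, 3, 2, 2],  # likely dropped
      }

      def item_cvi(scores, relevant_threshold=3):
          """Proportion of raters judging the item relevant (score 3 or 4)."""
          return sum(s >= relevant_threshold for s in scores) / len(scores)

      for item, scores in ratings.items():
          cvi = item_cvi(scores)
          verdict = "retain" if cvi >= 0.70 else "drop"
          print(f"{item:24s} CVI={cvi:.2f} -> {verdict}")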

  6. What Does It Cost to Prevent On-Duty Firefighter Cardiac Events? A Content Valid Method for Calculating Costs

    PubMed Central

    Patterson, P. Daniel; Suyama, Joe; Reis, Steven E.; Weaver, Matthew D.; Hostler, David

    2013-01-01

    Cardiac arrest is a leading cause of mortality among firefighters. We sought to develop a valid method for determining the costs of a workplace prevention program for firefighters. In 2012, we developed a draft framework using human resource accounting and in-depth interviews with experts in the firefighting and insurance industries. The interviews produced a draft cost model with 6 components and 26 subcomponents. In 2013, we randomly sampled 100 fire chiefs out of >7,400 affiliated with the International Association of Fire Chiefs. We used the Content Validity Index (CVI) to identify the content valid components of the draft cost model. This was accomplished by having fire chiefs rate the relevancy of cost components using a 4-point Likert scale (highly relevant to not relevant). We received complete survey data from 65 fire chiefs (65% response rate). We retained 5 components and 21 subcomponents based on CVI scores ≥0.70. The five main components include: (1) investment costs, (2) orientation and training costs, (3) medical and pharmaceutical costs, (4) education and continuing education costs, and (5) maintenance costs. Data from a diverse sample of fire chiefs has produced a content valid method for calculating the cost of a prevention program among firefighters. PMID:24455288

  7. Lessons Learned from a Cross-Model Validation between a Discrete Event Simulation Model and a Cohort State-Transition Model for Personalized Breast Cancer Treatment.

    PubMed

    Jahn, Beate; Rochau, Ursula; Kurzthaler, Christina; Paulden, Mike; Kluibenschädl, Martina; Arvandi, Marjan; Kühne, Felicitas; Goehler, Alexander; Krahn, Murray D; Siebert, Uwe

    2016-04-01

    Breast cancer is the most common malignancy among women in developed countries. We developed a model (the Oncotyrol breast cancer outcomes model) to evaluate the cost-effectiveness of a 21-gene assay when used in combination with Adjuvant! Online to support personalized decisions about the use of adjuvant chemotherapy. The goal of this study was to perform a cross-model validation. The Oncotyrol model evaluates the 21-gene assay by simulating a hypothetical cohort of 50-year-old women over a lifetime horizon using discrete event simulation. Primary model outcomes were life-years, quality-adjusted life-years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs). We followed the International Society for Pharmacoeconomics and Outcomes Research-Society for Medical Decision Making (ISPOR-SMDM) best practice recommendations for validation and compared modeling results of the Oncotyrol model with the state-transition model developed by the Toronto Health Economics and Technology Assessment (THETA) Collaborative. Both models were populated with Canadian THETA model parameters, and outputs were compared. The differences between the models varied among the different validation end points. The smallest relative differences were in costs, and the greatest were in QALYs. All relative differences were less than 1.2%. The cost-effectiveness plane showed that small differences in the model structure can lead to different sets of nondominated test-treatment strategies with different efficiency frontiers. We faced several challenges: distinguishing between differences in outcomes due to different modeling techniques and initial coding errors, defining meaningful differences, and selecting measures and statistics for comparison (means, distributions, multivariate outcomes). Cross-model validation was crucial to identify and correct coding errors and to explain differences in model outcomes. In our comparison, small differences in either QALYs or costs led to changes in ICERs because of changes in the set of dominated and nondominated strategies. © The Author(s) 2015.

  8. Development and application of Model of Resource Utilization, Costs, and Outcomes for Stroke (MORUCOS): an Australian economic model for stroke.

    PubMed

    Mihalopoulos, Catherine; Cadilhac, Dominique A; Moodie, Marjory L; Dewey, Helen M; Thrift, Amanda G; Donnan, Geoffrey A; Carter, Robert C

    2005-01-01

    To outline the development, structure, data assumptions, and application of an Australian economic model for stroke (Model of Resource Utilization, Costs, and Outcomes for Stroke [MORUCOS]). The model has a linked spreadsheet format with four modules to describe the disease burden and treatment pathways, estimate prevalence-based and incidence-based costs, and derive life expectancy and quality of life consequences. The model uses patient-level, community-based, stroke cohort data and macro-level simulations. An interventions module allows options for change to be consistently evaluated by modifying aspects of the other modules. To date, model validation has included sensitivity testing, face validity, and peer review. Further validation of technical and predictive accuracy is needed. The generic pathway model was assessed by comparison with a stroke subtypes (ischemic, hemorrhagic, or undetermined) approach and used to determine the relative cost-effectiveness of four interventions. The generic pathway model produced lower costs compared with a subtypes version (total average first-year costs/case AUD$ 15,117 versus AUD$ 17,786, respectively). Optimal evidence-based uptake of anticoagulation therapy for primary and secondary stroke prevention and intravenous thrombolytic therapy within 3 hours of stroke were more cost-effective than current practice (base year, 1997). MORUCOS is transparent and flexible in describing Australian stroke care and can effectively be used to systematically evaluate a range of different interventions. Adjusting results to account for stroke subtypes, as they influence cost estimates, could enhance the generic model.

  9. Development of the Galaxy Chronic Obstructive Pulmonary Disease (COPD) Model Using Data from ECLIPSE: Internal Validation of a Linked-Equations Cohort Model.

    PubMed

    Briggs, Andrew H; Baker, Timothy; Risebrough, Nancy A; Chambers, Mike; Gonzalez-McQuire, Sebastian; Ismaila, Afisi S; Exuzides, Alex; Colby, Chris; Tabberer, Maggie; Muellerova, Hana; Locantore, Nicholas; Rutten van Mölken, Maureen P M H; Lomas, David A

    2017-05-01

    The recent joint International Society for Pharmacoeconomics and Outcomes Research / Society for Medical Decision Making Modeling Good Research Practices Task Force emphasized the importance of conceptualizing and validating models. We report a new model of chronic obstructive pulmonary disease (COPD) (part of the Galaxy project) founded on a conceptual model, implemented using a novel linked-equation approach, and internally validated. An expert panel developed a conceptual model including causal relationships between disease attributes, progression, and final outcomes. Risk equations describing these relationships were estimated using data from the Evaluation of COPD Longitudinally to Identify Predictive Surrogate Endpoints (ECLIPSE) study, with costs estimated from the TOwards a Revolution in COPD Health (TORCH) study. Implementation as a linked-equation model enabled direct estimation of health service costs and quality-adjusted life years (QALYs) for COPD patients over their lifetimes. Internal validation compared 3 years of predicted cohort experience with ECLIPSE results. At 3 years, the Galaxy COPD model predictions of annual exacerbation rate and annual decline in forced expiratory volume in 1 second fell within the ECLIPSE data confidence limits, although 3-year overall survival was outside the observed confidence limits. Projections of the risk equations over time permitted extrapolation to patient lifetimes. Averaging the predicted cost/QALY outcomes for the different patients within the ECLIPSE cohort gives an estimated lifetime cost of £25,214 (undiscounted)/£20,318 (discounted) and lifetime QALYs of 6.45 (undiscounted)/5.24 (discounted) per ECLIPSE patient. A new form of model for COPD was conceptualized, implemented, and internally validated, based on a series of linked equations using epidemiological data (ECLIPSE) and cost data (TORCH). This Galaxy model predicts COPD outcomes from treatment effects on disease attributes such as lung function, exacerbations, symptoms, or exercise capacity; further external validation is required.

  10. Preliminary Multivariable Cost Model for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. Previously, the authors published two single variable cost models based on 19 flight missions. The current paper presents the development of a multi-variable space telescope cost model. The validity of previously published models is tested. Cost estimating relationships which are and are not significant cost drivers are identified, and interrelationships between variables are explored.
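
    Parametric cost models of this kind are typically built from cost estimating relationships (CERs) fitted by regression, for example a log-log fit of cost against aperture diameter and other drivers. The sketch below is a generic illustration with fabricated data points and variables; it is not the authors' model or dataset.

      # Hedged sketch: fitting a multivariable cost estimating relationship (CER)
      # of the form cost = a * diameter^b1 * wavelength^b2 via least squares in
      # log space. All data points are invented for illustration only.
      import numpy as np

      diameter_m    = np.array([0.3, 0.5, 0.85, 1.1, 2.4, 3.5])
      wavelength_um = np.array([0.5, 0.6, 0.5, 10.0, 0.5, 2.0])
      cost_musd     = np.array([60, 120, 250, 180, 2500, 3200])

      X = np.column_stack([np.ones_like(diameter_m),
                           np.log(diameter_m),
                           np.log(wavelength_um)])
      y = np.log(cost_musd)
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)
      log_a, b_diam, b_wave = coef
      print(f"cost ~ {np.exp(log_a):.1f} * D^{b_diam:.2f} * lambda^{b_wave:.2f} (M$)")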

  11. Economic analysis of model validation for a challenge problem

    DOE PAGES

    Paez, Paul J.; Paez, Thomas L.; Hasselman, Timothy K.

    2016-02-19

    It is now commonplace for engineers to build mathematical models of the systems they are designing, building, or testing. And, it is nearly universally accepted that phenomenological models of physical systems must be validated prior to use for prediction in consequential scenarios. Yet, there are certain situations in which testing only or no testing and no modeling may be economically viable alternatives to modeling and its associated testing. This paper develops an economic framework within which benefit–cost can be evaluated for modeling and model validation relative to other options. The development is presented in terms of a challenge problem. As a result, we provide a numerical example that quantifies when modeling, calibration, and validation yield higher benefit–cost than a testing only or no modeling and no testing option.
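
    The benefit-cost framing described above can be illustrated with simple expected-value arithmetic: each option (model and validate, test only, or neither) carries an upfront cost plus an expected downstream loss, and the lowest expected total wins. All numbers in the sketch are hypothetical.

      # Hedged sketch: expected-value comparison of "model + validate" vs
      # "test only" vs "no model, no test". Costs and probabilities are invented
      # to illustrate the benefit-cost framing only.
      options = {
          # (upfront cost, probability of a costly field failure, failure cost)
          "model_and_validate": (400_000, 0.02, 10_000_000),
          "test_only":          (900_000, 0.05, 10_000_000),
          "no_model_no_test":   (0,       0.20, 10_000_000),
      }

      for name, (upfront, p_fail, fail_cost) in options.items():
          expected_total = upfront + p_fail * fail_cost
          print(f"{name:20s} expected total cost = ${expected_total:,.0f}")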

  12. Population Screening for Hereditary Haemochromatosis in Australia: Construction and Validation of a State-Transition Cost-Effectiveness Model.

    PubMed

    de Graaff, Barbara; Si, Lei; Neil, Amanda L; Yee, Kwang Chien; Sanderson, Kristy; Gurrin, Lyle C; Palmer, Andrew J

    2017-03-01

    HFE-associated haemochromatosis, the most common monogenic disorder amongst populations of northern European ancestry, is characterised by iron overload. Excess iron is stored in parenchymal tissues, leading to morbidity and mortality. Population screening programmes are likely to improve early diagnosis, thereby decreasing associated disease. Our aim was to develop and validate a health economics model of screening using utilities and costs from a haemochromatosis cohort. A state-transition model was developed with Markov states based on disease severity. Australian males (aged 30 years) and females (aged 45 years) of northern European ancestry were the target populations. The screening strategy was the status quo approach in Australia; the model was run over a lifetime horizon. Costs were estimated from the government perspective and reported in 2015 Australian dollars ($A); costs and quality-adjusted life-years (QALYs) were discounted at 5% annually. Model validity was assessed using goodness-of-fit analyses. Second-order Monte Carlo simulation was used to account for uncertainty in multiple parameters. For validity, the model reproduced mortality, life expectancy (LE) and prevalence rates in line with published data. LE for C282Y homozygote males and females were 49.9 and 40.2 years, respectively, slightly lower than population rates. Mean (95% confidence interval) QALYs were 15.7 (7.7-23.7) for males and 14.4 (6.7-22.1) for females. Mean discounted lifetime costs for C282Y homozygotes were $A22,737 (3670-85,793) for males and $A13,840 (1335-67,377) for females. Sensitivity analyses revealed discount rates and prevalence had the greatest impacts on outcomes. We have developed a transparent, validated health economics model of C282Y homozygote haemochromatosis. The model will be useful to decision makers to identify cost-effective screening strategies.
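
    A minimal sketch of the discounting step such models apply to annual cost and QALY streams (5% per year in this study). The yearly values below are fabricated for illustration.

      # Hedged sketch: discounting annual cost and QALY streams at 5%/year,
      # as described in the abstract. The yearly values are hypothetical.
      def discount(stream, rate=0.05):
          return sum(v / (1 + rate) ** t for t, v in enumerate(stream))

      annual_costs = [800, 800, 1200, 5000, 9000]      # hypothetical $A per year
      annual_qalys = [0.92, 0.91, 0.88, 0.75, 0.60]    # hypothetical utilities

      print(f"discounted costs: {discount(annual_costs):,.0f} $A")
      print(f"discounted QALYs: {discount(annual_qalys):.2f}")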

  13. Validation of the OpCost logging cost model using contractor surveys

    Treesearch

    Conor K. Bell; Robert F. Keefe; Jeremy S. Fried

    2017-01-01

    OpCost is a harvest and fuel treatment operations cost model developed to function as both a standalone tool and an integrated component of the Bioregional Inventory Originated Simulation Under Management (BioSum) analytical framework for landscape-level analysis of forest management alternatives. OpCost is an updated implementation of the Fuel Reduction Cost Simulator...

  14. Cost Modeling for Space Optical Telescope Assemblies

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Henrichs, Todd; Luedtke, Alexander; West, Miranda

    2011-01-01

    Parametric cost models are used to plan missions, compare concepts and justify technology investments. This paper reviews an on-going effort to develop cost models for space telescopes, summarizes the methodology used to develop them, and documents how changes to the database have changed previously published preliminary cost models. While the cost models are evolving, the previously published findings remain valid: it costs less per square meter of collecting aperture to build a large telescope than a small telescope; technology development as a function of time reduces cost; and lower areal density telescopes cost more than more massive telescopes.

  15. Validation Test Report For The CRWMS Analysis and Logistics Visually Interactive Model Calvin Version 3.0, 10074-Vtr-3.0-00

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. Gillespie

    2000-07-27

    This report describes the tests performed to validate the CRWMS "Analysis and Logistics Visually Interactive" Model (CALVIN) Version 3.0 (V3.0) computer code (STN: 10074-3.0-00). To validate the code, a series of test cases was developed in the CALVIN V3.0 Validation Test Plan (CRWMS M&O 1999a) that exercises the principal calculation models and options of CALVIN V3.0. Twenty-five test cases were developed: 18 logistics test cases and 7 cost test cases. These cases test the features of CALVIN in a sequential manner, so that the validation of each test case is used to demonstrate the accuracy of the input to subsequent calculations. Where necessary, the test cases utilize reduced-size data tables to make the hand calculations used to verify the results more tractable, while still adequately testing the code's capabilities. Acceptance criteria were established for the logistics and cost test cases in the Validation Test Plan (CRWMS M&O 1999a). The logistics test cases were developed to test the following CALVIN calculation models: Spent nuclear fuel (SNF) and reactivity calculations; Options for altering reactor life; Adjustment of commercial SNF (CSNF) acceptance rates for fiscal year calculations and mid-year acceptance start; Fuel selection, transportation cask loading, and shipping to the Monitored Geologic Repository (MGR); Transportation cask shipping to and storage at an Interim Storage Facility (ISF); Reactor pool allocation options; and Disposal options at the MGR. Two types of cost test cases were developed: cases to validate the detailed transportation costs, and cases to validate the costs associated with the Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) and Regional Servicing Contractors (RSCs). For each test case, values calculated using Microsoft Excel 97 worksheets were compared to CALVIN V3.0 scenarios with the same input data and assumptions. All of the test case results agree with the CALVIN V3.0 results within the bounds of the acceptance criteria. Therefore, it is concluded that the CALVIN V3.0 calculation models and options tested in this report are validated.

  16. An integrated radar model solution for mission level performance and cost trades

    NASA Astrophysics Data System (ADS)

    Hodge, John; Duncan, Kerron; Zimmerman, Madeline; Drupp, Rob; Manno, Mike; Barrett, Donald; Smith, Amelia

    2017-05-01

    A fully integrated Mission-Level Radar model is in development as part of a multi-year effort under the Northrop Grumman Mission Systems (NGMS) sector's Model Based Engineering (MBE) initiative to digitally interconnect and unify previously separate performance and cost models. In 2016, an NGMS internal research and development (IR&D)-funded multidisciplinary team integrated radio frequency (RF), power, control, size, weight, thermal, and cost models together using commercial off-the-shelf software, ModelCenter, for an Active Electronically Scanned Array (AESA) radar system. Each represented model was digitally connected with standard interfaces and unified to allow end-to-end mission system optimization and trade studies. The radar model was then linked to the Air Force's own mission modeling framework (AFSIM). The team first had to identify the necessary models, and with the aid of subject matter experts (SMEs) understand and document the inputs, outputs, and behaviors of the component models. This agile development process and collaboration enabled rapid integration of disparate models and the validation of their combined system performance. This MBE framework will allow NGMS to design systems more efficiently and affordably, optimize architectures, and provide increased value to the customer. The model integrates detailed component models that validate cost and performance at the physics level with high-level models that provide visualization of a platform mission. This connectivity of component to mission models allows hardware and software design solutions to be better optimized to meet mission needs, creating cost-optimal solutions for the customer, while reducing design cycle time through risk mitigation and early validation of design decisions.

  17. Do decision-analytic models identify cost-effective treatments? A retrospective look at helicobacter pylori eradication.

    PubMed

    Fairman, Kathleen A; Motheral, Brenda R

    2003-01-01

    Pharmacoeconomic models of Helicobacter (H) pylori eradication have been frequently cited but never validated. The objective was to examine retrospectively whether H pylori pharmacoeconomic models direct decision makers to cost-effective therapeutic choices. We first replicated and then validated 2 models, replacing model assumptions with empirical data from a multipayer claims database. Database subjects were 435 commercially insured U.S. patients treated with bismuth-metronidazole-tetracycline (BMT), proton pump inhibitor (PPI)-clarithromycin, or PPI-amoxicillin. Patients met >1 clinical requirement (ulcer disease, gastritis/duodenitis, stomach function disorder, abdominal pain, H pylori infection, endoscopy, or H pylori assay). Sensitivity analyses included only patients with ulcer diagnosis or gastrointestinal specialist care. Outcome measures were: (1) rates of eradication retreatment; (2) use of office visits, hospitalizations, endoscopies, and antisecretory medication; and (3) cost per effectively treated (nonretreated) patient. Model results overstated the cost-effectiveness of PPI-clarithromycin and underestimated the cost-effectiveness of BMT. Prior to empirical adjustment, costs per effectively treated patient were 1,001 US dollars, 980 US dollars, and 1,730 US dollars for BMT, PPI-clarithromycin, and PPI-amoxicillin, respectively. Estimates after adjustment were US dollars for BMT, 1,118 US dollars for PPI-clarithromycin, and 1,131 US dollars for PPI-amoxicillin. Key model assumptions that proved retrospectively incorrect were largely unsupported by either empirical evidence or systematic assessment of expert opinion. Organizations with access to medical and pharmacy claims databases should test key assumptions of influential models to determine their validity. Journal peer-review processes should pay particular attention to the basis of model assumptions.
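
    The study's headline metric, cost per effectively treated (non-retreated) patient, is simple arithmetic: total regimen cost divided by the number of patients who did not require retreatment. The sketch below uses invented counts and costs, not the study's claims data.

      # Hedged sketch of "cost per effectively treated patient": total cost of a
      # regimen divided by the number of patients not retreated. Counts and costs
      # below are invented and do not reproduce the study's figures.
      regimens = {
          #                     (patients, retreated, total cost in US$)
          "BMT":                (150, 18, 140_000),
          "PPI-clarithromycin": (160, 12, 165_000),
          "PPI-amoxicillin":    (125, 30, 150_000),
      }
      for name, (n, retreated, total_cost) in regimens.items():
          effective = n - retreated
          print(f"{name:20s} cost per effectively treated = ${total_cost / effective:,.0f}")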

  18. Modeling complex treatment strategies: construction and validation of a discrete event simulation model for glaucoma.

    PubMed

    van Gestel, Aukje; Severens, Johan L; Webers, Carroll A B; Beckers, Henny J M; Jansonius, Nomdo M; Schouten, Jan S A G

    2010-01-01

    Discrete event simulation (DES) modeling has several advantages over simpler modeling techniques in health economics, such as increased flexibility and the ability to model complex systems. Nevertheless, these benefits may come at the cost of reduced transparency, which may compromise the model's face validity and credibility. We aimed to produce a transparent report on the construction and validation of a DES model using a recently developed model of ocular hypertension and glaucoma. Current evidence of associations between prognostic factors and disease progression in ocular hypertension and glaucoma was translated into DES model elements. The model was extended to simulate treatment decisions and effects. Utility and costs were linked to disease status and treatment, and clinical and health economic outcomes were defined. The model was validated at several levels. The soundness of design and the plausibility of input estimates were evaluated in interdisciplinary meetings (face validity). Individual patients were traced throughout the simulation under a multitude of model settings to debug the model, and the model was run with a variety of extreme scenarios to compare the outcomes with prior expectations (internal validity). Finally, several intermediate (clinical) outcomes of the model were compared with those observed in experimental or observational studies (external validity) and the feasibility of evaluating hypothetical treatment strategies was tested. The model performed well in all validity tests. Analyses of hypothetical treatment strategies took about 30 minutes per cohort and led to plausible health-economic outcomes. There is added value of DES models in complex treatment strategies such as glaucoma. Achieving transparency in model structure and outcomes may require some effort in reporting and validating the model, but it is feasible.

  19. Development and validation of chemistry agnostic flow battery cost performance model and application to nonaqueous electrolyte systems: Chemistry agnostic flow battery cost performance model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, Alasdair; Thomsen, Edwin; Reed, David

    2016-04-20

    A chemistry agnostic cost performance model is described for a nonaqueous flow battery. The model predicts flow battery performance by estimating the active reaction zone thickness at each electrode as a function of current density, state of charge, and flow rate using measured data for electrode kinetics, electrolyte conductivity, and electrode-specific surface area. Validation of the model is conducted using 4-kW stack data at various current densities and flow rates. This model is used to estimate the performance of a nonaqueous flow battery with electrode and electrolyte properties taken from the literature. The optimized cost for this system is estimated for various power and energy levels using component costs provided by vendors. The model allows optimization of design parameters such as electrode thickness, area, flow path design, and operating parameters such as power density, flow rate, and operating SOC range for various application duty cycles. A parametric analysis is done to identify components and electrode/electrolyte properties with the highest impact on system cost for various application durations. A pathway to $100 kWh-1 for the storage system is identified.

  20. Rationality Validation of a Layered Decision Model for Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Huaqiang; Alves-Foss, James; Zhang, Du

    2007-08-31

    We propose a cost-effective network defense strategy built on three key decision layers: security policies, defense strategies, and real-time defense tactics for countering immediate threats. A layered decision model (LDM) can be used to capture this decision process. The LDM helps decision-makers gain insight into the hierarchical relationships among inter-connected entities and decision types, and supports the selection of cost-effective defense mechanisms to safeguard computer networks. To be effective as a business tool, it is first necessary to validate the rationality of the model before applying it to real-world business cases. This paper describes our efforts in validating the LDM rationality through simulation.

  1. Logistics: Implementation of Performance - Based Logistics for the Javelin Weapon System

    DTIC Science & Technology

    2005-03-07

    the context of each line within the Automated Cost Estimating Integrated Tools (ACEIT) model, the Army's standard cost model, containing the EA was...fully validated the EA. The Javelin EA was validated through an extensive review of the EA cost documentation in the ACEIT file in coordination with... ACEIT file of the system cost estimate. This documentation was considered to be sufficient by the CEAC Director once the EA was determined to be valid

  2. TEAM-HF Cost-Effectiveness Model: A Web-Based Program Designed to Evaluate the Cost-Effectiveness of Disease Management Programs in Heart Failure

    PubMed Central

    Reed, Shelby D.; Neilson, Matthew P.; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H.; Polsky, Daniel E.; Graham, Felicia L.; Bowers, Margaret T.; Paul, Sara C.; Granger, Bradi B.; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara; Levy, Wayne C.

    2015-01-01

    Background: Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. Methods: We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics, use of evidence-based medications, and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model (SHFM). Projections of resource use and quality of life are modeled using relationships with time-varying SHFM scores. The model can be used to evaluate parallel-group and single-cohort designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. Results: The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. Conclusion: The TEAM-HF Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. PMID:26542504
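
    A minimal sketch of the incremental cost-effectiveness ratio (ICER) such simulations report, computed here as the ratio of mean cost and QALY differences over simulated cohort pairs; the random draws merely stand in for the model's 10,000 paired virtual cohorts and are not its outputs.

      # Hedged sketch: ICER over simulated cohort pairs (program vs usual care).
      # All distributions are placeholders, not the TEAM-HF model's projections.
      import random

      random.seed(1)
      n = 10_000
      d_cost = d_qaly = 0.0
      for _ in range(n):
          cost_program = random.gauss(52_000, 4_000)   # hypothetical lifetime costs
          cost_usual   = random.gauss(48_000, 4_000)
          qaly_program = random.gauss(5.9, 0.4)        # hypothetical lifetime QALYs
          qaly_usual   = random.gauss(5.6, 0.4)
          d_cost += cost_program - cost_usual
          d_qaly += qaly_program - qaly_usual

      print(f"ICER ~ ${(d_cost / n) / (d_qaly / n):,.0f} per QALY gained")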

  3. Ecological validity of cost-effectiveness models of universal HPV vaccination: a protocol for a systematic review.

    PubMed

    Favato, Giampiero; Noikokyris, Emmanouil; Vecchiato, Riccardo

    2017-01-25

    Sexually transmitted infection with high-risk, oncogenic strains of human papillomavirus (HPV) still induces a relevant burden of diseases on both men and women. Although vaccines appear to be highly efficacious in preventing the infection of the most common high-risk strains (HPV 6, 11, 16, 18), important questions regarding the appropriate target population for prophylactic vaccination are still debated. Models in the extant literature seem to converge on the cost-effectiveness of high coverage (>80%) of a single cohort of 12-year-old girls. This vaccination strategy should provide an adequate level of indirect protection (herd immunity) to the unvaccinated boys. This argument presupposes the ecological validity of the cost-effectiveness models; the implicit condition that the characteristics of the individuals and the sexual behaviours observed in the models are generalisable to the natural behaviours of the population. The primary aim of this review is to test the ecological validity of the cost-effectiveness models of universal HPV vaccination available in the literature. The ecological validity of each model will be defined by the number of representative characteristics and behaviours taken into consideration. Nine bibliographic databases will be searched: MEDLINE (via PubMed); Scopus; Science Direct; EMBASE via OVID SP, Web of Science, DARE, NHS EED and HTA (via NIHR CRD); and CINAHL Plus. An additional search for grey literature will be conducted on Google Scholar and Open Grey. A search strategy will be developed for each of the databases. Data will be extracted following a pre-determined spreadsheet and then clustered and prioritised: the main outcomes will report the inputs to the demographic and epidemiological model, while additional outcomes will refer to basic inputs to the cost-effectiveness valuation. Each study included in the review will be scored by the number of representative characteristics and behaviours taken into consideration (yes or no) on both dimensions. Individual studies' scores will be plotted in a 2 by 2 matrix: studies included in the upper right quadrant will be defined as ecologically valid, since both individuals' characteristics and their sexual behaviours are representative. The proposed systematic review will be the first to assess the ecological validity of cost-effectiveness studies. In the context of sexually transmitted diseases, when this condition is violated, an error in predicting the protective impact of herd immunity would occur. Hence, a vaccination policy informed by ecologically invalid models would potentially expose boys to a residual risk of contracting HPV-induced malignancies. PROSPERO CRD42016034145.

  4. Software Development Cost Estimation Executive Summary

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus M.; Menzies, Tim

    2006-01-01

    Identify simple fully validated cost models that provide estimation uncertainty with cost estimate. Based on COCOMO variable set. Use machine learning techniques to determine: a) Minimum number of cost drivers required for NASA domain based cost models; b) Minimum number of data records required and c) Estimation Uncertainty. Build a repository of software cost estimation information. Coordinating tool development and data collection with: a) Tasks funded by PA&E Cost Analysis; b) IV&V Effort Estimation Task and c) NASA SEPG activities.

  5. Optimization study on multiple train formation scheme of urban rail transit

    NASA Astrophysics Data System (ADS)

    Xia, Xiaomei; Ding, Yong; Wen, Xin

    2018-05-01

    The new organization method, represented by the mixed operation of multi-marshalling trains, can adapt to the uneven distribution of passenger flow, but research on this approach is still incomplete. This paper introduces the passenger sharing rate and a congestion penalty coefficient for different train formations. On this basis, an optimization model is established with minimum passenger cost and operation cost as the objectives, and operation frequency and passenger demand as the constraints. The ideal point method is used to solve this model. Compared with the fixed marshalling operation model, this scheme reduces the overall costs by 9.24% and 4.43%, respectively. This result not only validates the model but also illustrates the advantages of the multiple train formation scheme.
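
    The ideal point method mentioned above scores each candidate formation scheme by its normalised distance from the "ideal" point formed by the separate minima of the two cost objectives, and selects the closest scheme. The sketch below uses made-up costs in arbitrary units; the scheme names are illustrative.

      # Hedged sketch of the ideal point method for a bi-objective choice between
      # train formation schemes. Costs are hypothetical, in arbitrary units.
      import math

      schemes = {
          "fixed_6car":     (100.0, 100.0),   # (passenger cost, operator cost)
          "fixed_4car":     (115.0,  82.0),
          "mixed_4and6car": ( 96.0,  90.0),
      }
      ideal = (min(c[0] for c in schemes.values()),
               min(c[1] for c in schemes.values()))

      def distance(point, ideal):
          # normalise each objective by its ideal value before taking the distance
          return math.hypot((point[0] - ideal[0]) / ideal[0],
                            (point[1] - ideal[1]) / ideal[1])

      best = min(schemes, key=lambda k: distance(schemes[k], ideal))
      for k, v in schemes.items():
          print(f"{k:16s} distance to ideal = {distance(v, ideal):.3f}")
      print("selected scheme:", best)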

  6. Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model: A Web-based program designed to evaluate the cost-effectiveness of disease management programs in heart failure.

    PubMed

    Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C

    2015-11-01

    Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Development and validation of a cost-utility model for Type 1 diabetes mellitus.

    PubMed

    Wolowacz, S; Pearson, I; Shannon, P; Chubb, B; Gundgaard, J; Davies, M; Briggs, A

    2015-08-01

    To develop a health economic model to evaluate the cost-effectiveness of new interventions for Type 1 diabetes mellitus by their effects on long-term complications (measured through mean HbA1c) while capturing the impact of treatment on hypoglycaemic events. Through a systematic review, we identified complications associated with Type 1 diabetes mellitus and data describing the long-term incidence of these complications. An individual patient simulation model was developed and included the following complications: cardiovascular disease, peripheral neuropathy, microalbuminuria, end-stage renal disease, proliferative retinopathy, ketoacidosis, cataract, hypoglycemia and adverse birth outcomes. Risk equations were developed from published cumulative incidence data and hazard ratios for the effect of HbA1c, age and duration of diabetes. We validated the model by comparing model predictions with observed outcomes from studies used to build the model (internal validation) and from other published data (external validation). We performed illustrative analyses for typical patient cohorts and a hypothetical intervention. Model predictions were within 2% of expected values in the internal validation and within 8% of observed values in the external validation (percentages represent absolute differences in the cumulative incidence). The model utilized high-quality, recent data specific to people with Type 1 diabetes mellitus. In the model validation, results deviated less than 8% from expected values. © 2014 Research Triangle Institute d/b/a RTI Health Solutions. Diabetic Medicine © 2014 Diabetes UK.

  8. Comparative analysis for various redox flow batteries chemistries using a cost performance model

    NASA Astrophysics Data System (ADS)

    Crawford, Alasdair; Viswanathan, Vilayanur; Stephenson, David; Wang, Wei; Thomsen, Edwin; Reed, David; Li, Bin; Balducci, Patrick; Kintner-Meyer, Michael; Sprenkle, Vincent

    2015-10-01

    The total energy storage system cost is determined by means of a robust performance-based cost model for multiple flow battery chemistries. Systems aspects such as shunt current losses, pumping losses and various flow patterns through electrodes are accounted for. The system cost minimizing objective function determines stack design by optimizing the state of charge operating range, along with current density and current-normalized flow. The model cost estimates are validated using 2-kW stack performance data for the same size electrodes and operating conditions. Using our validated tool, it has been demonstrated that an optimized all-vanadium system has an estimated system cost of <$350 kWh-1 for a 4-h application. With an anticipated decrease in component costs facilitated by economies of scale from larger production volumes, coupled with performance improvements enabled by technology development, the system cost is expected to decrease to $160 kWh-1 for a 4-h application, and to $100 kWh-1 for a 10-h application. This tool has been shared with the redox flow battery community to enable cost estimation using their stack data and guide future direction.

  9. The clinical effectiveness and cost-effectiveness of testing for cytochrome P450 polymorphisms in patients with schizophrenia treated with antipsychotics: a systematic review and economic evaluation.

    PubMed

    Fleeman, N; McLeod, C; Bagust, A; Beale, S; Boland, A; Dundar, Y; Jorgensen, A; Payne, K; Pirmohamed, M; Pushpakom, S; Walley, T; de Warren-Penny, P; Dickson, R

    2010-01-01

    To determine whether testing for cytochrome P450 (CYP) polymorphisms in adults entering antipsychotic treatment for schizophrenia leads to improvement in outcomes, is useful in medical, personal or public health decision-making, and is a cost-effective use of health-care resources. The following electronic databases were searched for relevant published literature: Cochrane Controlled Trials Register, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effectiveness, EMBASE, Health Technology Assessment database, ISI Web of Knowledge, MEDLINE, PsycINFO, NHS Economic Evaluation Database, Health Economic Evaluation Database, Cost-effectiveness Analysis (CEA) Registry and the Centre for Health Economics website. In addition, publicly available information on various genotyping tests was sought from the internet and advisory panel members. A systematic review of analytical validity, clinical validity and clinical utility of CYP testing was undertaken. Data were extracted into structured tables and narratively discussed, and meta-analysis was undertaken when possible. A review of economic evaluations of CYP testing in psychiatry and a review of economic models related to schizophrenia were also carried out. For analytical validity, 46 studies of a range of different genotyping tests for 11 different CYP polymorphisms (most commonly CYP2D6) were included. Sensitivity and specificity were high (99-100%). For clinical validity, 51 studies were found. In patients tested for CYP2D6, an association between genotype and tardive dyskinesia (including Abnormal Involuntary Movement Scale scores) was found. The only other significant finding linked the CYP2D6 genotype to parkinsonism. One small unpublished study met the inclusion criteria for clinical utility. One economic evaluation assessing the costs and benefits of CYP testing for prescribing antidepressants and 28 economic models of schizophrenia were identified; none was suitable for developing a model to examine the cost-effectiveness of CYP testing. Tests for determining genotypes appear to be accurate although not all aspects of analytical validity were reported. Given the absence of convincing evidence from clinical validity studies, the lack of clinical utility and economic studies, and the unsuitability of published schizophrenia models, no model was developed; instead key features and data requirements for economic modelling are presented. Recommendations for future research cover both aspects of research quality and data that will be required to inform the development of future economic models.

  10. A review of the solar array manufacturing industry costing standards

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The solar array manufacturing industry costing standards model is designed to compare the cost of producing solar arrays using alternative manufacturing processes. Constructive criticism of the methodology used is intended to enhance its implementation as a practical design tool. Three main elements of the procedure include workbook format and presentation, theoretical model validity and standard financial parameters.

  11. Development and validation of a Markov microsimulation model for the economic evaluation of treatments in osteoporosis.

    PubMed

    Hiligsmann, Mickaël; Ethgen, Olivier; Bruyère, Olivier; Richy, Florent; Gathon, Henry-Jean; Reginster, Jean-Yves

    2009-01-01

    Markov models are increasingly used in economic evaluations of treatments for osteoporosis. Most of the existing evaluations are cohort-based Markov models missing comprehensive memory management and versatility. In this article, we describe and validate an original Markov microsimulation model to accurately assess the cost-effectiveness of prevention and treatment of osteoporosis. We developed a Markov microsimulation model with a lifetime horizon and a direct health-care cost perspective. The patient history was recorded and was used in calculations of transition probabilities, utilities, and costs. To test the internal consistency of the model, we carried out an example calculation for alendronate therapy. Then, external consistency was investigated by comparing absolute lifetime risk of fracture estimates with epidemiologic data. For women at age 70 years, with a twofold increase in the fracture risk of the average population, the costs per quality-adjusted life-year gained for alendronate therapy versus no treatment were estimated at €9,105 and €15,325, respectively, under full and realistic adherence assumptions. All the sensitivity analyses in terms of model parameters and modeling assumptions were coherent with expected conclusions and absolute lifetime risk of fracture estimates were within the range of previous estimates, which confirmed both internal and external consistency of the model. Microsimulation models present some major advantages over cohort-based models, increasing the reliability of the results and being largely compatible with the existing state of the art, evidence-based literature. The developed model appears to be a valid model for use in economic evaluations in osteoporosis.
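
    A minimal sketch of the microsimulation idea: each simulated individual carries her own event history, and that history modifies her subsequent transition probabilities, which a memoryless cohort Markov model cannot represent. The risks, multiplier, and cost below are invented.

      # Hedged sketch: individual-level microsimulation in which prior fracture
      # history raises subsequent fracture risk. All numbers are illustrative.
      import random

      random.seed(42)

      def simulate_patient(years=30, base_fracture_risk=0.02, history_multiplier=1.8):
          fractures, cost = 0, 0.0
          for _ in range(years):
              risk = base_fracture_risk * (history_multiplier if fractures else 1.0)
              if random.random() < risk:
                  fractures += 1
                  cost += 12_000          # hypothetical cost per fracture
          return fractures, cost

      results = [simulate_patient() for _ in range(10_000)]
      mean_cost = sum(c for _, c in results) / len(results)
      print(f"mean lifetime fracture cost per patient ~ {mean_cost:,.0f}")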

  12. Process-Improvement Cost Model for the Emergency Department.

    PubMed

    Dyas, Sheila R; Greenfield, Eric; Messimer, Sherri; Thotakura, Swati; Gholston, Sampson; Doughty, Tracy; Hays, Mary; Ivey, Richard; Spalding, Joseph; Phillips, Robin

    2015-01-01

    The objective of this report is to present a simplified, activity-based costing approach for hospital emergency departments (EDs) to use with Lean Six Sigma cost-benefit analyses. The cost model complexity is reduced by removing diagnostic and condition-specific costs, thereby revealing the underlying process activities' cost inefficiencies. Examples are provided for evaluating the cost savings from reducing discharge delays and the cost impact of keeping patients in the ED (boarding) after the decision to admit has been made. The process-improvement cost model provides a needed tool in selecting, prioritizing, and validating Lean process-improvement projects in the ED and other areas of patient care that involve multiple dissimilar diagnoses.

  13. A health economic model to determine the long-term costs and clinical outcomes of raising low HDL-cholesterol in the prevention of coronary heart disease.

    PubMed

    Roze, S; Liens, D; Palmer, A; Berger, W; Tucker, D; Renaudin, C

    2006-12-01

    The aim of this study was to describe a health economic model developed to project lifetime clinical and cost outcomes of lipid-modifying interventions in patients not reaching target lipid levels and to assess the validity of the model. The internet-based, computer simulation model is made up of two decision analytic sub-models, the first utilizing Monte Carlo simulation, and the second applying Markov modeling techniques. Monte Carlo simulation generates a baseline cohort for long-term simulation by assigning an individual lipid profile to each patient, and applying the treatment effects of interventions under investigation. The Markov model then estimates the long-term clinical (coronary heart disease events, life expectancy, and quality-adjusted life expectancy) and cost outcomes up to a lifetime horizon, based on risk equations from the Framingham study. Internal and external validation analyses were performed. The results of the model validation analyses, plotted against corresponding real-life values from Framingham, 4S, AFCAPS/TexCAPS, and a meta-analysis by Gordon et al., showed that the majority of values were close to the y = x line, which indicates a perfect fit. The R2 value was 0.9575 and the gradient of the regression line was 0.9329, both very close to the perfect fit (= 1). Validation analyses of the computer simulation model suggest the model is able to recreate the outcomes from published clinical studies and would be a valuable tool for the evaluation of new and existing therapy options for patients with persistent dyslipidemia.
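
    The validation statistics quoted (an R2 of 0.9575 and a regression gradient of 0.9329 against the perfect-fit line y = x) can be reproduced for any pair of predicted and observed series as sketched below; the data points are fabricated.

      # Hedged sketch: comparing model predictions with observed trial values by
      # regressing observed on predicted and reporting the slope and R^2, where a
      # perfect fit gives slope = 1 and R^2 = 1. Data points are invented.
      import numpy as np

      predicted = np.array([3.1, 5.8, 7.4, 10.2, 12.9, 16.0])   # model outputs
      observed  = np.array([3.3, 5.5, 7.9, 10.0, 12.1, 15.4])   # trial outcomes

      slope, intercept = np.polyfit(predicted, observed, 1)
      residuals = observed - (slope * predicted + intercept)
      r_squared = 1 - residuals.var() / observed.var()

      print(f"gradient = {slope:.4f}, R^2 = {r_squared:.4f}")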

  14. Designing an activity-based costing model for a non-admitted prisoner healthcare setting.

    PubMed

    Cai, Xiao; Moore, Elizabeth; McNamara, Martin

    2013-09-01

    To design and deliver an activity-based costing model within a non-admitted prisoner healthcare setting. Key phases from the NSW Health clinical redesign methodology were utilised: diagnostic, solution design and implementation. The diagnostic phase utilised a range of strategies to identify issues requiring attention in the development of the costing model. The solution design phase conceptualised distinct 'building blocks' of activity and cost based on the speciality of clinicians providing care. These building blocks enabled the classification of activity and comparisons of costs between similar facilities. The implementation phase validated the model. The project generated an activity-based costing model based on actual activity performed, gained acceptability among clinicians and managers, and provided the basis for ongoing efficiency and benchmarking efforts.

  15. A Novel Cost Based Model for Energy Consumption in Cloud Computing

    PubMed Central

    Horri, A.; Dastghaibyfard, Gh.

    2015-01-01

    Cloud data centers consume enormous amounts of electrical energy. To support green cloud computing, providers also need to minimize cloud infrastructure energy consumption while maintaining QoS. In this study, an energy consumption model is proposed for the time-shared policy in the virtualization layer of cloud environments. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator based upon results obtained from a real system, and the proposed model was then evaluated under different scenarios. In the proposed model, cache interference costs were considered; these costs were based upon the size of the data. The proposed model was implemented in the CloudSim simulator, and the related simulation results indicate that the energy consumption may be considerable and that it can vary with parameters such as the quantum parameter, data size, and the number of VMs on a host. Measured results validate the model and demonstrate that there is a tradeoff between energy consumption and QoS in the cloud environment. PMID:25705716
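
    A simplified sketch of the kind of cost term such a model adds for time-shared scheduling: a per-context-switch (cache interference) overhead that grows with data size, on top of the execution energy. The constants are illustrative assumptions, not CloudSim parameters.

      # Hedged sketch of a time-shared energy model with a cache-interference
      # penalty per context switch; all constants are illustrative only.
      def energy_joules(cpu_seconds, quantum_s, data_mb,
                        power_w=120.0, switch_overhead_j_per_mb=0.05):
          switches = int(cpu_seconds / quantum_s)
          compute_energy = power_w * cpu_seconds
          interference_energy = switches * switch_overhead_j_per_mb * data_mb
          return compute_energy + interference_energy

      for quantum in (0.5, 0.1, 0.02):   # smaller quantum -> more switches
          e = energy_joules(cpu_seconds=60.0, quantum_s=quantum, data_mb=512)
          print(f"quantum={quantum:5.2f}s  energy ~ {e:,.0f} J")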

  16. A novel cost based model for energy consumption in cloud computing.

    PubMed

    Horri, A; Dastghaibyfard, Gh

    2015-01-01

    Cloud data centers consume enormous amounts of electrical energy. To support green cloud computing, providers also need to minimize cloud infrastructure energy consumption while maintaining QoS. In this study, an energy consumption model is proposed for the time-shared policy in the virtualization layer of cloud environments. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator based upon results obtained from a real system, and the proposed model was then evaluated under different scenarios. In the proposed model, cache interference costs were considered; these costs were based upon the size of the data. The proposed model was implemented in the CloudSim simulator, and the related simulation results indicate that the energy consumption may be considerable and that it can vary with parameters such as the quantum parameter, data size, and the number of VMs on a host. Measured results validate the model and demonstrate that there is a tradeoff between energy consumption and QoS in the cloud environment.

  17. Further Validation of the Psychosocial Costs of Racism to Whites Scale on a Sample of University Students in the Southeastern United States

    ERIC Educational Resources Information Center

    Sifford, Amy; Ng, Kok-Mun; Wang, Chuang

    2009-01-01

    We examined the factor structure of the Psychosocial Costs of Racism to Whites Scale (PCRW; Spanierman & Heppner, 2004) on 766 White American university students from the southeastern United States. Results from confirmatory factor analyses supported the 3-factor model proposed by Spanierman and Heppner (2004). The construct validity of the…

  18. A systematic review on the quality, validity and usefulness of current cost-effectiveness studies for treatments of neovascular age-related macular degeneration.

    PubMed

    Elshout, Mari; Webers, Carroll A B; van der Reis, Margriet I; Schouten, Jan S A G

    2018-06-04

    Ophthalmologists increasingly depend on new drugs to advance their treatment options. These options are limited by restraints on reimbursements for new and expensive drugs. These restraints are put in place through health policy decisions based on cost-effectiveness analyses (CEA). Cost-effectiveness analyses need to be valid and of good quality to support correct decisions to create new treatment opportunities. In this study, we report the quality, validity and usefulness of CEAs for therapies for nAMD. A systematic review in PubMed, EMBASE and Cochrane was performed to include CEAs. Quality and validity assessment was based on current general quality criteria and on elements that are specific to the field of ophthalmology. Forty-eight CEAs were included in the review. Forty-four CEAs did not meet four basic model quality and validity criteria specific to CEAs in the field of ophthalmology (both eyes analysed instead of one; a time horizon extending beyond 4 years; extrapolating VA and treatment intervals beyond trial data realistically; and including the costs of low-vision). Four CEAs aligned with the quality and validity criteria. In two of these CEAs bevacizumab as-needed (PRN) was more cost-effective than bevacizumab monthly; aflibercept (VIEW); or ranibizumab monthly or PRN. In two CEAs, ranibizumab (PRN or treat and extent) was dominant over aflibercept. In two other CEAs, aflibercept was either more cost-effective or dominant over ranibizumab monthly or PRN. Two of the CEAs of sufficient quality and validity show that bevacizumab PRN is the most cost-effective treatment. Comparing ranibizumab and aflibercept, either treatment can be more cost-effective depending on the assumptions used for drug prices and treatment frequencies. The majority of the published CEAs are of insufficient quality and validity. They wrongly inform decision-makers at the cost of opportunities for ophthalmologists to treat patients. As such, they may negatively influence overall patient outcomes and societal costs. For future ophthalmic treatments, CEAs need to be improved and only published when they are of sufficient quality and validity. © 2018 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  19. The relationship between cost estimates reliability and BIM adoption: SEM analysis

    NASA Astrophysics Data System (ADS)

    Ismail, N. A. A.; Idris, N. H.; Ramli, H.; Rooshdi, R. R. Raja Muhammad; Sahamir, S. R.

    2018-02-01

    This paper presents the use of the Structural Equation Modelling (SEM) approach in analysing the effects of Building Information Modelling (BIM) technology adoption on improving the reliability of cost estimates. Based on the questionnaire survey results, SEM analysis using the SPSS-AMOS application examined the relationships between BIM-improved information and cost estimate reliability factors leading to BIM technology adoption. Six hypotheses were established prior to SEM analysis, which employed two types of SEM models, namely the Confirmatory Factor Analysis (CFA) model and the full structural model. The SEM models were then validated through assessment of their uni-dimensionality, validity, reliability, and fitness indices, in line with the hypotheses tested. The final SEM model fit measures are: P-value=0.000, RMSEA=0.079<0.08, GFI=0.824, CFI=0.962>0.90, TLI=0.956>0.90, NFI=0.935>0.90 and ChiSq/df=2.259, indicating that the overall index values achieved the required level of model fitness. The model supports all the hypotheses evaluated, confirming that all relationships amongst the constructs are positive and significant. Ultimately, the analysis verified that most of the respondents foresee a better understanding of project input information through BIM visualization, its reliable database, and coordinated data when developing more reliable cost estimates. They also expect BIM adoption to accelerate their cost estimating tasks.
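
    The fit indices quoted in this record are usually judged against conventional rules of thumb (e.g., RMSEA < 0.08, CFI/TLI/NFI > 0.90, ChiSq/df < 3). The snippet below simply checks a set of reported indices against such thresholds; the thresholds are common guidelines assumed here, not values taken from the paper beyond those it quotes.

      # Check reported SEM fit indices against commonly cited thresholds (assumed guidelines).
      THRESHOLDS = {
          "RMSEA":    lambda v: v < 0.08,
          "CFI":      lambda v: v > 0.90,
          "TLI":      lambda v: v > 0.90,
          "NFI":      lambda v: v > 0.90,
          "ChiSq/df": lambda v: v < 3.0,
      }

      reported = {"RMSEA": 0.079, "CFI": 0.962, "TLI": 0.956, "NFI": 0.935, "ChiSq/df": 2.259}

      for index, value in reported.items():
          ok = THRESHOLDS[index](value)
          print(f"{index:9s} = {value:<6} -> {'acceptable' if ok else 'poor fit'}")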

  20. Extending the Strategy Based Risk Model Using the Delphi Method: An Application to the Validation Process for Research and Developmental (R&D) Satellites

    DTIC Science & Technology

    2009-12-01

    correctly Risk before validation step: 41-60% - Is this too high/low? Why? Risk 8: Operational or data latency impacts based on relationship between...too high, too low, or correct. We also asked them to comment on why they felt this way. Finally, we left additional space on the survey for any...cost of each validation effort was too high, too low, or acceptable. They then gave us rationale for their beliefs. The second cost associated with

  1. Prediction of risk of recurrence of venous thromboembolism following treatment for a first unprovoked venous thromboembolism: systematic review, prognostic model and clinical decision rule, and economic evaluation.

    PubMed

    Ensor, Joie; Riley, Richard D; Jowett, Sue; Monahan, Mark; Snell, Kym Ie; Bayliss, Susan; Moore, David; Fitzmaurice, David

    2016-02-01

    Unprovoked first venous thromboembolism (VTE) is defined as VTE in the absence of a temporary provoking factor such as surgery, immobility and other temporary factors. Recurrent VTE in unprovoked patients is highly prevalent, but easily preventable with oral anticoagulant (OAC) therapy. The unprovoked population is highly heterogeneous in terms of risk of recurrent VTE. The first aim of the project is to review existing prognostic models which stratify individuals by their recurrence risk, therefore potentially allowing tailored treatment strategies. The second aim is to enhance the existing research in this field, by developing and externally validating a new prognostic model for individual risk prediction, using a pooled database containing individual patient data (IPD) from several studies. The final aim is to assess the economic cost-effectiveness of the proposed prognostic model if it is used as a decision rule for resuming OAC therapy, compared with current standard treatment strategies. Standard systematic review methodology was used to identify relevant prognostic model development, validation and cost-effectiveness studies. Bibliographic databases (including MEDLINE, EMBASE and The Cochrane Library) were searched using terms relating to the clinical area and prognosis. Reviewing was undertaken by two reviewers independently using pre-defined criteria. Included full-text articles were data extracted and quality assessed. Critical appraisal of included full texts was undertaken and comparisons made of model performance. A prognostic model was developed using IPD from the pooled database of seven trials. A novel internal-external cross-validation (IECV) approach was used to develop and validate a prognostic model, with external validation undertaken in each of the trials iteratively. Given good performance in the IECV approach, a final model was developed using all trials data. A Markov patient-level simulation was used to consider the economic cost-effectiveness of using a decision rule (based on the prognostic model) to decide on resumption of OAC therapy (or not). Three full-text articles were identified by the systematic review. Critical appraisal identified methodological and applicability issues; in particular, all three existing models did not have external validation. To address this, new prognostic models were sought with external validation. Two potential models were considered: one for use at cessation of therapy (pre D-dimer), and one for use after cessation of therapy (post D-dimer). Model performance measured in the external validation trials showed strong calibration performance for both models. The post D-dimer model performed substantially better in terms of discrimination (c = 0.69), better separating high- and low-risk patients. The economic evaluation identified that a decision rule based on the final post D-dimer model may be cost-effective for patients with predicted risk of recurrence of over 8% annually; this suggests continued therapy for patients with predicted risks ≥ 8% and cessation of therapy otherwise. The post D-dimer model performed strongly and could be useful to predict individuals' risk of recurrence at any time up to 2-3 years, thereby aiding patient counselling and treatment decisions. A decision rule using this model may be cost-effective for informing clinical judgement and patient opinion in treatment decisions. 
Further research may investigate new predictors to enhance model performance and aim to further externally validate to confirm performance in new, non-trial populations. Finally, it is essential that further research is conducted to develop a model predicting bleeding risk on therapy, to manage the balance between the risks of recurrence and bleeding. This study is registered as PROSPERO CRD42013003494. The National Institute for Health Research Health Technology Assessment programme.
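
    The internal-external cross-validation (IECV) described in this record can be sketched generically: each trial in the pooled individual patient data is held out in turn as an external validation set while the model is fitted on the remaining trials, and a final model is refitted on all data if performance is adequate. The sketch below uses scikit-learn's logistic regression and the c-statistic purely as stand-ins; the report's actual prognostic model, predictors and metrics differ.

      # Generic internal-external cross-validation (IECV) loop over pooled trials.
      # Logistic regression and the c-statistic are stand-ins, not the report's model.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      def iecv(X, y, trial_id):
          """X, y, trial_id are numpy arrays; each trial is held out once for validation."""
          results = {}
          for trial in np.unique(trial_id):
              train, test = trial_id != trial, trial_id == trial
              model = LogisticRegression(max_iter=1000).fit(X[train], y[train])
              p = model.predict_proba(X[test])[:, 1]
              results[trial] = roc_auc_score(y[test], p)   # discrimination in the held-out trial
          # if performance is adequate in every trial, refit the final model on all data
          final_model = LogisticRegression(max_iter=1000).fit(X, y)
          return results, final_model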

  2. Cost-effectiveness of wound management in France: pressure ulcers and venous leg ulcers.

    PubMed

    Meaume, S; Gemmen, E

    2002-06-01

    This study set out to define realistic protocols of care for the treatment of chronic venous leg ulcers and pressure ulcers in France and, by developing cost-effectiveness models, to compare the different protocols of care for the two ulcer groups, enabling a calculation of direct medical costs per ulcer healed in a typical French health insurance plan. Clinical outcomes and some treatment patterns were obtained from published literature. Validations of different treatment patterns were developed using an expert consensus panel similar to the Delphi approach. Costs were calculated based on national averages and estimates from the UK and Germany. The models were used to measure costs per healed ulcer over a 12-week period. For both the pressure ulcer and venous leg ulcer models, three protocols of care were identified. For pressure ulcers and venous leg ulcers, the hydrocolloid DuoDERM (ConvaTec, also known as Granuflex in the UK and Varihesive in Germany) was most cost-effective in France. The combination of published data and expert consensus opinion is a valid technique, and in this case suggests that treating pressure ulcers and venous leg ulcers with hydrocolloid dressings is more cost-effective than treating them with saline gauze, in spite of the lower unit cost of the latter.

  3. Ecological validity of cost-effectiveness models of universal HPV vaccination: A systematic literature review.

    PubMed

    Favato, Giampiero; Easton, Tania; Vecchiato, Riccardo; Noikokyris, Emmanouil

    2017-05-09

    The protective (herd) effect of the selective vaccination of pubertal girls against human papillomavirus (HPV) implies a high probability that one of the two partners involved in intercourse is immunised, hence preventing the other from this sexually transmitted infection. The dynamic transmission models used to inform immunisation policy should include consideration of sexual behaviours and population mixing in order to demonstrate ecological validity, whereby the scenarios modelled remain faithful to the real-life social and cultural context. The primary aim of this review is to test the ecological validity of the universal HPV vaccination cost-effectiveness modelling available in the published literature. The research protocol related to this systematic review has been registered in the International Prospective Register of Systematic Reviews (PROSPERO: CRD42016034145). Eight published economic evaluations were reviewed. None of the studies showed due consideration of the complexities of human sexual behaviour and the impact this may have on the transmission of HPV. Our findings indicate that all the included models might be affected by differing degrees of ecological bias, which implies an inability to reflect natural demographic and behavioural trends in their outcomes and, consequently, to accurately inform public healthcare policy. In particular, ecological bias has the effect of over-estimating the preference-based outcomes of selective immunisation. A relatively small (15-20%) over-estimation of the quality-adjusted life years (QALYs) gained with selective immunisation programmes could induce a significant error in the estimate of the cost-effectiveness of universal immunisation by inflating its incremental cost-effectiveness ratio (ICER) beyond the acceptability threshold. The results modelled here demonstrate the limitations of the cost-effectiveness studies for HPV vaccination, and highlight the concern that public healthcare policy might have been built upon incomplete studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
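
    To see how a modest over-estimation of the comparator's QALY gain can push an ICER across an acceptability threshold, here is a purely illustrative calculation; the programme costs, QALY gains, and bias levels below are invented numbers, not figures from the reviewed studies.

      # Illustrative only: how over-estimating QALYs gained under selective vaccination
      # inflates the ICER of universal vaccination. All numbers are invented.
      def icer(delta_cost, delta_qaly):
          return delta_cost / delta_qaly

      cost_selective, cost_universal = 40_000_000.0, 52_000_000.0    # programme costs
      qaly_selective_true, qaly_universal = 1_500.0, 1_900.0         # QALYs gained vs no vaccination

      true_icer = icer(cost_universal - cost_selective,
                       qaly_universal - qaly_selective_true)

      for bias in (0.15, 0.20):   # 15-20% over-estimation of the selective programme's QALYs
          qaly_selective_biased = qaly_selective_true * (1 + bias)
          biased_icer = icer(cost_universal - cost_selective,
                             qaly_universal - qaly_selective_biased)
          print(f"bias {bias:.0%}: ICER {true_icer:,.0f} -> {biased_icer:,.0f} per QALY")

    With these invented inputs the true ICER of about 30,000 per QALY rises to roughly 69,000 and 120,000 per QALY under 15% and 20% bias, illustrating how the ratio can be pushed past a typical acceptability threshold.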

  4. Uncertainty quantification and validation of 3D lattice scaffolds for computer-aided biomedical applications.

    PubMed

    Gorguluarslan, Recep M; Choi, Seung-Kyum; Saldana, Christopher J

    2017-07-01

    A methodology is proposed for uncertainty quantification and validation to accurately predict the mechanical response of lattice structures used in the design of scaffolds. Effective structural properties of the scaffolds are characterized using a developed multi-level stochastic upscaling process that propagates the quantified uncertainties at strut level to the lattice structure level. To obtain realistic simulation models for the stochastic upscaling process and minimize the experimental cost, high-resolution finite element models of individual struts were reconstructed from the micro-CT scan images of lattice structures which are fabricated by selective laser melting. The upscaling method facilitates the process of determining homogenized strut properties to reduce the computational cost of the detailed simulation model for the scaffold. Bayesian Information Criterion is utilized to quantify the uncertainties with parametric distributions based on the statistical data obtained from the reconstructed strut models. A systematic validation approach that can minimize the experimental cost is also developed to assess the predictive capability of the stochastic upscaling method used at the strut level and lattice structure level. In comparison with physical compression test results, the proposed methodology of linking the uncertainty quantification with the multi-level stochastic upscaling method enabled an accurate prediction of the elastic behavior of the lattice structure with minimal experimental cost by accounting for the uncertainties induced by the additive manufacturing process. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Input-Output Modeling and Control of the Departure Process of Congested Airports

    NASA Technical Reports Server (NTRS)

    Pujet, Nicolas; Delcaire, Bertrand; Feron, Eric

    2003-01-01

    A simple queueing model of busy airport departure operations is proposed. This model is calibrated and validated using available runway configuration and traffic data. The model is then used to evaluate preliminary control schemes aimed at alleviating departure traffic congestion on the airport surface. The potential impact of these control strategies on direct operating costs, environmental costs and overall delay is quantified and discussed.

  6. Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?

    NASA Technical Reports Server (NTRS)

    Lum, Karen; Hihn, Jairus; Menzies, Tim

    2006-01-01

    While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models because of the large variance problem inherent in cost data and because they include far more effort multipliers than the data support. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records and the use of multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem, which is a leading cause of cost model brittleness or instability.

  7. Replica Approach for Minimal Investment Risk with Cost

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2018-06-01

    In the present work, the optimal portfolio minimizing the investment risk with cost is discussed analytically, where an objective function is constructed in terms of two negative aspects of investment, the risk and cost. We note the mathematical similarity between the Hamiltonian in the mean-variance model and the Hamiltonians in the Hopfield model and the Sherrington-Kirkpatrick model, show that we can analyze this portfolio optimization problem by using replica analysis, and derive the minimal investment risk with cost and the investment concentration of the optimal portfolio. Furthermore, we validate our proposed method through numerical simulations.
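
    As a concrete, non-replica counterpart to the problem class this record analyses, the quadratic risk plus linear cost objective with a budget constraint can be minimized in closed form with a Lagrange multiplier. The sketch below is a direct numerical stand-in, not the paper's replica analysis; the covariance matrix, cost vector, and the particular definition of investment concentration used here are assumptions.

      # Sketch: minimize (1/2) w' C w + cost' w subject to sum(w) = N, solved in closed
      # form with a Lagrange multiplier. Random data, not the paper's setting.
      import numpy as np

      rng = np.random.default_rng(0)
      N = 50                                   # number of assets
      A = rng.normal(size=(200, N))
      C = A.T @ A / 200 + 0.1 * np.eye(N)      # positive-definite return covariance
      cost = rng.uniform(0.0, 0.05, size=N)    # per-asset linear cost

      Cinv_one = np.linalg.solve(C, np.ones(N))
      Cinv_cost = np.linalg.solve(C, cost)
      lam = (N + np.ones(N) @ Cinv_cost) / (np.ones(N) @ Cinv_one)   # Lagrange multiplier
      w = lam * Cinv_one - Cinv_cost                                  # optimal portfolio

      risk = 0.5 * w @ C @ w
      concentration = (w @ w) / N              # investment concentration, assumed as sum(w_i^2)/N
      print(round(risk, 3), round(concentration, 3), round(w.sum(), 3))  # w.sum() ~ N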

  8. An Innovative Software Tool Suite for Power Plant Model Validation and Parameter Calibration using PMU Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yuanyuan; Diao, Ruisheng; Huang, Renke

    Maintaining good quality of power plant stability models is of critical importance to ensure the secure and economic operation and planning of today’s power grid with its increasing stochastic and dynamic behavior. According to North American Electric Reliability Corporation (NERC) standards, all generators in North America with capacities larger than 10 MVA are required to validate their models every five years. Validation is quite costly and can significantly affect the revenue of generator owners, because the traditional staged testing requires generators to be taken offline. Over the past few years, validating and calibrating parameters using online measurements including phasor measurement units (PMUs) and digital fault recorders (DFRs) has been proven to be a cost-effective approach. In this paper, an innovative open-source tool suite is presented for validating power plant models using the PPMV tool, identifying bad parameters with trajectory sensitivity analysis, and finally calibrating parameters using an ensemble Kalman filter (EnKF) based algorithm. The architectural design and the detailed procedures to run the tool suite are presented, with results of tests on a realistic hydro power plant using PMU measurements for 12 different events. The calibrated parameters of machine, exciter, governor and PSS models demonstrate much better performance than the original models for all the events and show the robustness of the proposed calibration algorithm.
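
    The core of an EnKF-based parameter calibration step can be sketched generically. The function below is a minimal ensemble update, not the tool suite's implementation; `simulate` is a hypothetical placeholder mapping model parameters to predicted PMU quantities, and the noise handling is a simplifying assumption.

      # Minimal ensemble Kalman filter (EnKF) parameter-update step, as a generic sketch of
      # the calibration idea (not the tool suite's implementation). `simulate` maps model
      # parameters to predicted measurements and is a hypothetical placeholder.
      import numpy as np

      def enkf_update(params, simulate, y_obs, obs_noise_std, rng):
          """params: (n_ensemble, n_params) array of sampled model parameters."""
          preds = np.array([simulate(p) for p in params])          # (n_ens, n_obs)
          P_mean, Y_mean = params.mean(0), preds.mean(0)
          A, Y = params - P_mean, preds - Y_mean
          n = len(params) - 1
          cov_py = A.T @ Y / n                                      # parameter-output covariance
          cov_yy = Y.T @ Y / n + np.diag(obs_noise_std**2)          # output covariance + noise
          K = cov_py @ np.linalg.inv(cov_yy)                        # Kalman gain
          perturbed = y_obs + rng.normal(0, obs_noise_std, preds.shape)
          return params + (perturbed - preds) @ K.T                 # updated parameter ensemble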

  9. Temporal validation for landsat-based volume estimation model

    Treesearch

    Renaldo J. Arroyo; Emily B. Schultz; Thomas G. Matney; David L. Evans; Zhaofei Fan

    2015-01-01

    Satellite imagery can potentially reduce the costs and time associated with ground-based forest inventories; however, for satellite imagery to provide reliable forest inventory data, it must produce consistent results from one time period to the next. The objective of this study was to temporally validate a Landsat-based volume estimation model in a four county study...

  10. Operations Assessment of Launch Vehicle Architectures using Activity Based Cost Models

    NASA Technical Reports Server (NTRS)

    Ruiz-Torres, Alex J.; McCleskey, Carey

    2000-01-01

    The growing emphasis on affordability for space transportation systems requires the assessment of new space vehicles for all life cycle activities, from design and development, through manufacturing and operations. This paper addresses the operational assessment of launch vehicles, focusing on modeling the ground support requirements of a vehicle architecture, and estimating the resulting costs and flight rate. This paper proposes the use of Activity Based Costing (ABC) modeling for this assessment. The model uses expert knowledge to determine the activities, the activity times and the activity costs based on vehicle design characteristics. The approach provides several advantages over current approaches to vehicle architecture assessment, including easier validation and allowing vehicle designers to understand the cost and cycle time drivers.

  11. Experimental Validation of Plastic Mandible Models Produced by a "Low-Cost" 3-Dimensional Fused Deposition Modeling Printer.

    PubMed

    Maschio, Federico; Pandya, Mirali; Olszewski, Raphael

    2016-03-22

    The objective of this study was to investigate the accuracy of 3-dimensional (3D) plastic (ABS) models generated using a low-cost 3D fused deposition modelling printer. Two human dry mandibles were scanned with a cone beam computed tomography (CBCT) Accuitomo device. Preprocessing consisted of 3D reconstruction with Maxilim software and STL file repair with Netfabb software. Then, the data were used to print 2 plastic replicas with a low-cost 3D fused deposition modeling printer (Up plus 2®). Two independent observers performed the identification of 26 anatomic landmarks on the 4 mandibles (2 dry and 2 replicas) with a 3D measuring arm. Each observer repeated the identifications 20 times. The comparison between the dry and plastic mandibles was based on 13 distances: 8 distances less than 12 mm and 5 distances greater than 12 mm. The mean absolute difference (MAD) was 0.37 mm, and the mean dimensional error (MDE) was 3.76%. The MDE decreased to 0.93% for distances greater than 12 mm. Plastic models generated using the low-cost 3D printer UPplus2® provide dimensional accuracies comparable to other well-established rapid prototyping technologies. Validated low-cost 3D printers could represent a step toward the better accessibility of rapid prototyping technologies in the medical field.
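
    The mean absolute difference and mean dimensional error reported in this record are simple paired-distance statistics. The sketch below shows how they are typically computed; the distance values are invented, not the study's measurements.

      # Compute mean absolute difference (MAD) and mean dimensional error (MDE) between
      # distances measured on the dry mandible and on the printed replica. Values are invented.
      import numpy as np

      dry     = np.array([ 6.1,  8.4, 10.2, 11.5, 18.3, 25.7, 40.2])   # mm, reference
      printed = np.array([ 6.4,  8.1, 10.6, 11.1, 18.6, 25.3, 40.7])   # mm, replica

      abs_diff = np.abs(printed - dry)
      mad = abs_diff.mean()                          # mean absolute difference, in mm
      mde = (abs_diff / dry).mean() * 100            # mean dimensional error, in percent

      long_only = dry > 12                           # restrict to distances greater than 12 mm
      mde_long = (abs_diff[long_only] / dry[long_only]).mean() * 100

      print(f"MAD = {mad:.2f} mm, MDE = {mde:.2f} %, MDE(>12 mm) = {mde_long:.2f} %")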

  12. Engineering and fabrication cost considerations for cryogenic wind tunnel models

    NASA Technical Reports Server (NTRS)

    Boykin, R. M., Jr.; Davenport, J. B., Jr.

    1983-01-01

    Design and fabrication cost drivers for cryogenic transonic wind tunnel models are defined. The major cost factors for wind tunnel models are model complexity, tolerances, surface finishes, materials, material validation, and model inspection. The cryogenic temperatures require the use of materials with relatively high fracture toughness but at the same time high strength. Some of these materials are very difficult to machine, requiring extensive machine hours which can add significantly to the manufacturing costs. Some additional engineering costs are incurred to certify the materials through mechanical tests and nondestructive evaluation techniques, which are not normally required with conventional models. When instrumentation such as accelerometers and electronically scanned pressure modules is required, temperature control of these devices needs to be incorporated into the design, which requires added effort. Additional thermal analyses and subsystem tests may be necessary, which also adds to the design costs. The largest driver to the design costs is potentially the additional static and dynamic analyses required to insure structural integrity of the model and support system.

  13. [Modeling in value-based medicine].

    PubMed

    Neubauer, A S; Hirneiss, C; Kampik, A

    2010-03-01

    Modeling plays an important role in value-based medicine (VBM). It allows decision support by predicting potential clinical and economic consequences, frequently combining different sources of evidence. Based on relevant publications and examples focusing on ophthalmology, the key economic modeling methods are explained and definitions are given. The most frequently applied model types are decision trees, Markov models, and discrete event simulation (DES) models. Model validation includes, besides verifying internal validity, comparison with other models (external validity) and, ideally, validation of a model's predictive properties. The existing uncertainty with any modeling should be clearly stated. This is true for economic modeling in VBM as well as when using disease risk models to support clinical decisions. In economic modeling, uni- and multivariate sensitivity analyses are usually applied; the key concepts here are tornado plots and cost-effectiveness acceptability curves. Given the existing uncertainty, modeling helps to make better informed decisions than without this additional information.
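
    One of the concepts named above, the cost-effectiveness acceptability curve, can be computed directly from probabilistic sensitivity analysis samples: for each willingness-to-pay threshold it reports the share of simulations in which the intervention has a positive incremental net monetary benefit. The sampled distributions below are invented for illustration.

      # Cost-effectiveness acceptability curve (CEAC) from probabilistic sensitivity analysis
      # samples; the distributions of incremental cost and effect are invented.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 10_000
      delta_cost = rng.normal(3000, 1500, n)        # incremental cost per patient
      delta_qaly = rng.normal(0.08, 0.05, n)        # incremental QALYs per patient

      for wtp in (10_000, 20_000, 50_000, 100_000): # willingness-to-pay per QALY
          nmb = wtp * delta_qaly - delta_cost       # incremental net monetary benefit
          prob_ce = (nmb > 0).mean()
          print(f"WTP {wtp:>7}: P(cost-effective) = {prob_ce:.2f}")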

  14. Development and validation of a new population-based simulation model of osteoarthritis in New Zealand.

    PubMed

    Wilson, R; Abbott, J H

    2018-04-01

    To describe the construction and preliminary validation of a new population-based microsimulation model developed to analyse the health and economic burden and cost-effectiveness of treatments for knee osteoarthritis (OA) in New Zealand (NZ). We developed the New Zealand Management of Osteoarthritis (NZ-MOA) model, a discrete-time state-transition microsimulation model of the natural history of radiographic knee OA. In this article, we report on the model structure, derivation of input data, validation of baseline model parameters against external data sources, and validation of model outputs by comparison of the predicted population health loss with previous estimates. The NZ-MOA model simulates both the structural progression of radiographic knee OA and the stochastic development of multiple disease symptoms. Input parameters were sourced from NZ population-based data where possible, and from international sources where NZ-specific data were not available. The predicted distributions of structural OA severity and health utility detriments associated with OA were externally validated against other sources of evidence, and uncertainty resulting from key input parameters was quantified. The resulting lifetime and current population health-loss burden was consistent with estimates of previous studies. The new NZ-MOA model provides reliable estimates of the health loss associated with knee OA in the NZ population. The model structure is suitable for analysis of the effects of a range of potential treatments, and will be used in future work to evaluate the cost-effectiveness of recommended interventions within the NZ healthcare system. Copyright © 2018 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  15. Development and Validation of the Total HUman Model for Safety (THUMS) Toward Further Understanding of Occupant Injury Mechanisms in Precrash and During Crash.

    PubMed

    Iwamoto, Masami; Nakahira, Yuko; Kimpara, Hideyuki

    2015-01-01

    Active safety devices such as automatic emergency brake (AEB) and precrash seat belt have the potential to accomplish further reduction in the number of the fatalities due to automotive accidents. However, their effectiveness should be investigated by more accurate estimations of their interaction with human bodies. Computational human body models are suitable for investigation, especially considering muscular tone effects on occupant motions and injury outcomes. However, the conventional modeling approaches such as multibody models and detailed finite element (FE) models have advantages and disadvantages in computational costs and injury predictions considering muscular tone effects. The objective of this study is to develop and validate a human body FE model with whole body muscles, which can be used for the detailed investigation of interaction between human bodies and vehicular structures including some safety devices precrash and during a crash with relatively low computational costs. In this study, we developed a human body FE model called THUMS (Total HUman Model for Safety) with a body size of 50th percentile adult male (AM50) and a sitting posture. The model has anatomical structures of bones, ligaments, muscles, brain, and internal organs. The total number of elements is 281,260, which would realize relatively low computational costs. Deformable material models were assigned to all body parts. The muscle-tendon complexes were modeled by truss elements with Hill-type muscle material and seat belt elements with tension-only material. The THUMS was validated against 35 series of cadaver or volunteer test data on frontal, lateral, and rear impacts. Model validations for 15 series of cadaver test data associated with frontal impacts are presented in this article. The THUMS with a vehicle sled model was applied to investigate effects of muscle activations on occupant kinematics and injury outcomes in specific frontal impact situations with AEB. In the validations using 5 series of cadaver test data, force-time curves predicted by the THUMS were quantitatively evaluated using correlation and analysis (CORA), which showed good or acceptable agreement with cadaver test data in most cases. The investigation of muscular effects showed that muscle activation levels and timing had significant effects on occupant kinematics and injury outcomes. Although further studies on accident injury reconstruction are needed, the THUMS has the potential for predictions of occupant kinematics and injury outcomes considering muscular tone effects with relatively low computational costs.

  16. A global economic model to assess the cost-effectiveness of new treatments for advanced breast cancer in Canada.

    PubMed

    Beauchemin, C; Letarte, N; Mathurin, K; Yelle, L; Lachaine, J

    2016-06-01

    Objective: Considering the increasing number of treatment options for metastatic breast cancer (MBC), it is important to develop high-quality methods to assess the cost-effectiveness of new anti-cancer drugs. This study aims to develop a global economic model that could be used as a benchmark for the economic evaluation of new therapies for MBC. Methods: The Global Pharmacoeconomics of Metastatic Breast Cancer (GPMBC) model is a Markov model that was constructed to estimate the incremental cost per quality-adjusted life year (QALY) of new treatments for MBC from a Canadian healthcare system perspective over a lifetime horizon. Specific parameters included in the model are cost of drug treatment, survival outcomes, and incidence of treatment-related adverse events (AEs). Global parameters are patient characteristics, health state utilities, disutilities, and costs associated with treatment-related AEs, as well as costs associated with drug administration, medical follow-up, and end-of-life care. The GPMBC model was tested and validated in a specific context by assessing the cost-effectiveness of lapatinib plus letrozole compared with other widely used first-line therapies for post-menopausal women with hormone receptor-positive (HR+) and epidermal growth factor receptor 2-positive (HER2+) MBC. Results: When tested, the GPMBC model led to incremental cost-utility ratios of CA$131 811 per QALY, CA$56 211 per QALY, and CA$102 477 per QALY for the comparison of lapatinib plus letrozole vs letrozole alone, trastuzumab plus anastrozole, and anastrozole alone, respectively. Results of the model testing were quite similar to those obtained by Delea et al., who also assessed the cost-effectiveness of lapatinib in combination with letrozole in HR+/HER2+ MBC in Canada, thus suggesting that the GPMBC model can replicate results of well-conducted economic evaluations. Conclusions: The GPMBC model can be very valuable as it allows a quick and valid assessment of the cost-effectiveness of any new treatments for MBC in a Canadian context.
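
    The GPMBC structure itself is not given in the abstract, but the core of any Markov cost-utility comparison can be sketched with a small cohort trace that accumulates discounted costs and QALYs per cycle and reports the ICER between two arms. The states, transition probabilities, costs, and utilities below are invented, not the model's inputs.

      # Minimal Markov cohort trace for a cost-utility comparison (all inputs invented).
      import numpy as np

      def run_arm(trans, state_cost, state_utility, cycles=40, disc=0.03):
          cohort = np.array([1.0, 0.0, 0.0])                 # stable, progressed, dead
          total_cost = total_qaly = 0.0
          for t in range(cycles):
              d = 1.0 / (1.0 + disc) ** t                    # annual discounting
              total_cost += d * cohort @ state_cost
              total_qaly += d * cohort @ state_utility
              cohort = cohort @ trans                        # advance one cycle
          return total_cost, total_qaly

      trans_new = np.array([[0.85, 0.10, 0.05], [0.0, 0.80, 0.20], [0.0, 0.0, 1.0]])
      trans_std = np.array([[0.75, 0.18, 0.07], [0.0, 0.78, 0.22], [0.0, 0.0, 1.0]])

      c_new, q_new = run_arm(trans_new, np.array([9000.0, 6000.0, 0.0]), np.array([0.75, 0.55, 0.0]))
      c_std, q_std = run_arm(trans_std, np.array([4000.0, 6000.0, 0.0]), np.array([0.75, 0.55, 0.0]))
      print(f"ICER = {(c_new - c_std) / (q_new - q_std):,.0f} per QALY gained")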

  17. Implementation and validation of an economic module in the Be-FAST model to predict costs generated by livestock disease epidemics: Application to classical swine fever epidemics in Spain.

    PubMed

    Fernández-Carrión, E; Ivorra, B; Martínez-López, B; Ramos, A M; Sánchez-Vizcaíno, J M

    2016-04-01

    Be-FAST is a computer program based on a time-spatial stochastic spread mathematical model for studying the transmission of infectious livestock diseases within and between farms. The present work describes a new module integrated into Be-FAST to model the economic consequences of the spreading of classical swine fever (CSF) and other infectious livestock diseases within and between farms. CSF is financially one of the most damaging diseases in the swine industry worldwide. Specifically in Spain, the economic costs in the two last CSF epidemics (1997 and 2001) reached jointly more than 108 million euros. The present analysis suggests that severe CSF epidemics are associated with significant economic costs, approximately 80% of which are related to animal culling. Direct costs associated with control measures are strongly associated with the number of infected farms, while indirect costs are more strongly associated with epidemic duration. The economic model has been validated with economic information around the last outbreaks in Spain. These results suggest that our economic module may be useful for analysing and predicting economic consequences of livestock disease epidemics. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Analysis of various quality attributes of sunflower and soybean plants by near infra-red reflectance spectroscopy: Development and validation calibration models

    USDA-ARS?s Scientific Manuscript database

    Sunflower and soybean are summer annuals that can be grown as an alternative to corn and may be particularly useful in organic production systems. Rapid and low cost methods of analyzing plant quality would be helpful for crop management. We developed and validated calibration models for Near-infrar...

  19. Development and Validation of a Computational Model for Androgen Receptor Activity

    EPA Science Inventory

    Testing thousands of chemicals to identify potential androgen receptor (AR) agonists or antagonists would cost millions of dollars and take decades to complete using current validated methods. High-throughput in vitro screening (HTS) and computational toxicology approaches can mo...

  20. Development and Validation of High Precision Thermal, Mechanical, and Optical Models for the Space Interferometry Mission

    NASA Technical Reports Server (NTRS)

    Lindensmith, Chris A.; Briggs, H. Clark; Beregovski, Yuri; Feria, V. Alfonso; Goullioud, Renaud; Gursel, Yekta; Hahn, Inseob; Kinsella, Gary; Orzewalla, Matthew; Phillips, Charles

    2006-01-01

    SIM Planetquest (SIM) is a large optical interferometer for making microarcsecond measurements of the positions of stars and detecting Earth-sized planets around nearby stars. To achieve this precision, SIM requires stability of optical components to tens of picometers per hour. The combination of SIM's large size (9 meter baseline) and the high stability requirement makes it difficult and costly to measure all aspects of system performance on the ground. To reduce risks and costs and to allow for a design with fewer intermediate testing stages, the SIM project is developing an integrated thermal, mechanical and optical modeling process that will allow predictions of the system performance to be made at the required high precision. This modeling process uses commercial, off-the-shelf tools and has been validated against experimental results at the precision of the SIM performance requirements. This paper describes the model development, some of the models, and their validation in the Thermo-Opto-Mechanical (TOM3) testbed, which includes full scale brassboard optical components and the metrology to test them at the SIM performance requirement levels.

  1. Calibration and validation of an activated sludge model for greenhouse gases no. 1 (ASMG1): prediction of temperature-dependent N₂O emission dynamics.

    PubMed

    Guo, Lisha; Vanrolleghem, Peter A

    2014-02-01

    An activated sludge model for greenhouse gases no. 1 was calibrated with data from a wastewater treatment plant (WWTP) without control systems and validated with data from three similar plants equipped with control systems. A distinctive feature of the calibration/validation approach adopted in this paper is that the data are obtained from simulations with a mathematical model that is widely accepted to describe effluent quality and operating costs of actual WWTPs, the Benchmark Simulation Model No. 2 (BSM2). The calibration also aimed at fitting the model to typical observed nitrous oxide (N₂O) emission data, i.e., a yearly average of 0.5% of the influent total nitrogen load emitted as N₂O-N. Model validation was performed by challenging the model in configurations with different control strategies. The kinetic term describing the dissolved oxygen effect on the denitrification by ammonia-oxidizing bacteria (AOB) was modified into a Haldane term. Both original and Haldane-modified models passed calibration and validation. Even though their yearly averaged values were similar, the two models presented different dynamic N₂O emissions under cold temperature conditions and control. Therefore, data collected in such situations can potentially permit model discrimination. Observed seasonal trends in N₂O emissions are simulated well with both original and Haldane-modified models. A mechanistic explanation based on the temperature-dependent interaction between heterotrophic and autotrophic N₂O pathways was provided. Finally, while adding the AOB denitrification pathway to a model with only heterotrophic N₂O production showed little impact on effluent quality and operating cost criteria, it clearly affected N₂O emission productions.
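
    The Haldane modification mentioned above replaces a Monod-type dissolved-oxygen dependence with one that also includes inhibition at higher oxygen levels. A generic form is sketched below; the half-saturation and inhibition constants are illustrative, not the calibrated ASMG1 values.

      # Monod vs Haldane-type dependence of a rate term on dissolved oxygen S_O.
      # K_O and K_I are illustrative constants, not the calibrated ASMG1 parameters.
      def monod(S_O, K_O=0.2):
          return S_O / (K_O + S_O)

      def haldane(S_O, K_O=0.2, K_I=1.0):
          return S_O / (K_O + S_O + S_O**2 / K_I)

      for S_O in (0.05, 0.2, 0.5, 1.0, 2.0):        # dissolved oxygen, g O2 / m3
          print(f"S_O={S_O:>4}: Monod={monod(S_O):.2f}  Haldane={haldane(S_O):.2f}")

    The loop shows the qualitative difference: the Monod term saturates with increasing oxygen, whereas the Haldane term peaks and then declines, which is what allows the two model variants to diverge dynamically even when yearly averages match.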

  2. External Validation of Health Economic Decision Models for Chronic Obstructive Pulmonary Disease (COPD): Report of the Third COPD Modeling Meeting.

    PubMed

    Hoogendoorn, Martine; Feenstra, Talitha L; Asukai, Yumi; Briggs, Andrew H; Hansen, Ryan N; Leidl, Reiner; Risebrough, Nancy; Samyshkin, Yevgeniy; Wacker, Margarethe; Rutten-van Mölken, Maureen P M H

    2017-03-01

    To validate outcomes of presently available chronic obstructive pulmonary disease (COPD) cost-effectiveness models against results of two large COPD trials-the 3-year TOwards a Revolution in COPD Health (TORCH) trial and the 4-year Understanding Potential Long-term Impacts on Function with Tiotropium (UPLIFT) trial. Participating COPD modeling groups simulated the outcomes for the placebo-treated groups of the TORCH and UPLIFT trials using baseline characteristics of the trial populations as input. Groups then simulated treatment effectiveness by using relative reductions in annual decline in lung function and exacerbation frequency observed in the most intensively treated group compared with placebo as input for the models. Main outcomes were (change in) total/severe exacerbations and mortality. Furthermore, the absolute differences in total exacerbations and quality-adjusted life-years (QALYs) were used to approximate the cost per exacerbation avoided and the cost per QALY gained. Of the six participating models, three models reported higher total exacerbation rates than observed in the TORCH trial (1.13/patient-year) (models: 1.22-1.48). Four models reported higher rates than observed in the UPLIFT trial (0.85/patient-year) (models: 1.13-1.52). Two models reported higher mortality rates than in the TORCH trial (15.2%) (models: 20.0% and 30.6%) and the UPLIFT trial (16.3%) (models: 24.8% and 36.0%), whereas one model reported lower rates (9.8% and 12.1%, respectively). Simulation of treatment effectiveness showed that the absolute reduction in total exacerbations, the gain in QALYs, and the cost-effectiveness ratios did not differ from the trials, except for one model. Although most of the participating COPD cost-effectiveness models reported higher total exacerbation rates than observed in the trials, estimates of the absolute treatment effect and cost-effectiveness ratios do not seem different from the trials in most models. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  3. Base Flow Model Validation

    NASA Technical Reports Server (NTRS)

    Sinha, Neeraj; Brinckman, Kevin; Jansen, Bernard; Seiner, John

    2011-01-01

    A method was developed of obtaining propulsive base flow data in both hot and cold jet environments, at Mach numbers and altitude of relevance to NASA launcher designs. The base flow data was used to perform computational fluid dynamics (CFD) turbulence model assessments of base flow predictive capabilities in order to provide increased confidence in base thermal and pressure load predictions obtained from computational modeling efforts. Predictive CFD analyses were used in the design of the experiments, available propulsive models were used to reduce program costs and increase success, and a wind tunnel facility was used. The data obtained allowed assessment of CFD/turbulence models in a complex flow environment, working within a building-block procedure to validation, where cold, non-reacting test data was first used for validation, followed by more complex reacting base flow validation.

  4. Modeling the clinical and economic implications of obesity using microsimulation.

    PubMed

    Su, W; Huang, J; Chen, F; Iacobucci, W; Mocarski, M; Dall, T M; Perreault, L

    2015-01-01

    The obesity epidemic has raised considerable public health concerns, but there are few validated longitudinal simulation models examining the human and economic cost of obesity. This paper describes a microsimulation model as a comprehensive tool to understand the relationship between body weight, health, and economic outcomes. Patient health and economic outcomes were simulated annually over 10 years using a Markov-based microsimulation model. The obese population examined is nationally representative of obese adults in the US from the 2005-2012 National Health and Nutrition Examination Surveys, while a matched normal weight population was constructed to have similar demographics as the obese population during the same period. Prediction equations for onset of obesity-related comorbidities, medical expenditures, economic outcomes, mortality, and quality-of-life came from published trials and studies supplemented with original research. Model validation followed International Society for Pharmacoeconomics and Outcomes Research practice guidelines. Among surviving adults, relative to a matched normal weight population, obese adults averaged $3900 higher medical expenditures in the initial year, growing to $4600 higher expenditures in year 10. Obese adults had higher initial prevalence and higher simulated onset of comorbidities as they aged. Over 10 years, excess medical expenditures attributed to obesity averaged $4280 annually-ranging from $2820 for obese category I to $5100 for obese category II, and $8710 for obese category III. Each excess kilogram of weight contributed to $140 higher annual costs, on average, ranging from $136 (obese I) to $152 (obese III). Poor health associated with obesity increased work absenteeism and mortality, and lowered employment probability, personal income, and quality-of-life. This validated model helps illustrate why obese adults have higher medical and indirect costs relative to normal weight adults, and shows that medical costs for obese adults rise more rapidly with aging relative to normal weight adults.

  5. The OncoSim model: development and use for better decision-making in Canadian cancer control.

    PubMed

    Gauvreau, C L; Fitzgerald, N R; Memon, S; Flanagan, W M; Nadeau, C; Asakawa, K; Garner, R; Miller, A B; Evans, W K; Popadiuk, C M; Wolfson, M; Coldman, A J

    2017-12-01

    The Canadian Partnership Against Cancer was created in 2007 by the federal government to accelerate cancer control across Canada. Its OncoSim microsimulation model platform, which consists of a suite of specific cancer models, was conceived as a tool to augment conventional resources for population-level policy- and decision-making. The Canadian Partnership Against Cancer manages the OncoSim program, with funding from Health Canada and model development by Statistics Canada. Microsimulation modelling allows for the detailed capture of population heterogeneity and health and demographic history over time. Extensive data from multiple Canadian sources were used as inputs or to validate the model. OncoSim has been validated through expert consultation; assessments of face validity, internal validity, and external validity; and model fit against observed data. The platform comprises three in-depth cancer models (lung, colorectal, cervical), with another in-depth model (breast) and a generalized model (25 cancers) being in development. Unique among models of its class, OncoSim is available online for public sector use free of charge. Users can customize input values and output display, and extensive user support is provided. OncoSim has been used to support decision-making at the national and jurisdictional levels. Although simulation studies are generally not included in hierarchies of evidence, they are integral to informing cancer control policy when clinical studies are not feasible. OncoSim can evaluate complex intervention scenarios for multiple cancers. Canadian decision-makers thus have a powerful tool to assess the costs, benefits, cost-effectiveness, and budgetary effects of cancer control interventions when faced with difficult choices for improvements in population health and resource allocation.

  6. Validating MDS Data about Risk Factors for Perineal Dermatitis by Comparing With Nursing Home Records

    PubMed Central

    Toth, Anna M.; Bliss, Donna Z.; Savik, Kay; Wyman, Jean F.

    2011-01-01

    Perineal dermatitis is one of the main complications of incontinence and increases the cost of health care. The Minimum Data Set (MDS) contains data about factors associated with perineal dermatitis identified in a published conceptual model of perineal dermatitis. The purpose of this study was to determine the validity of MDS data related to perineal dermatitis risk factors by comparing them with data in nursing home chart records. Findings indicate that MDS items defining factors associated with perineal dermatitis were valid and supported use of the MDS in further investigation of a significant, costly, and understudied health problem of nursing home residents. PMID:18512629

  7. Predictive Engineering Tools for Injection-Molded Long-Carbon-Thermoplastic Composites: Weight and Cost Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Fifield, Leonard S.; Gandhi, Umesh N.

    This project proposed to integrate, optimize and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk Simulation Moldflow Insight (ASMI) package for injection-molded long-carbon-fiber thermoplastic composites into a cohesive prediction capability. The current effort focused on rendering the developed models more robust and efficient for automotive industry part design to enable weight savings and cost reduction. The project goal has been achieved by optimizing the developed models, improving and integrating their implementations in ASMI, and validating them for a complex 3D LCF thermoplastic automotive part (Figure 1). Both PP and PA66 were used as resin matrices. After validating ASMI predictions for fiber orientation and fiber length for this complex part against the corresponding measured data, in collaboration with Toyota and Magna, PNNL developed a method using the predictive engineering tool to assess LCF/PA66 complex part design in terms of stiffness performance. Structural three-point bending analyses of the complex part and similar parts in steel were then performed for this purpose, and the team then demonstrated the use of stiffness-based complex part design assessment to evaluate weight savings relative to the body system target (≥ 35%) set in Table 2 of DE-FOA-0000648 (AOI #1). In addition, starting from the part-to-part analysis, the PE tools enabled an estimated weight reduction for the vehicle body system using 50 wt% LCF/PA66 parts relative to the current steel system. Also, from this analysis an estimate of the manufacturing cost including the material cost for making the equivalent part in steel has been determined and compared to the costs for making the LCF/PA66 part to determine the cost per “saved” pound.

  8. Validation of the Economic and Health Outcomes Model of Type 2 Diabetes Mellitus (ECHO-T2DM).

    PubMed

    Willis, Michael; Johansen, Pierre; Nilsson, Andreas; Asseburg, Christian

    2017-03-01

    The Economic and Health Outcomes Model of Type 2 Diabetes Mellitus (ECHO-T2DM) was developed to address study questions pertaining to the cost-effectiveness of treatment alternatives in the care of patients with type 2 diabetes mellitus (T2DM). Naturally, the usefulness of a model is determined by the accuracy of its predictions. A previous version of ECHO-T2DM was validated against actual trial outcomes and the model predictions were generally accurate. However, there have been recent upgrades to the model, which modify model predictions and necessitate an update of the validation exercises. The objectives of this study were to extend the methods available for evaluating model validity, to conduct a formal model validation of ECHO-T2DM (version 2.3.0) in accordance with the principles espoused by the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) and the Society for Medical Decision Making (SMDM), and secondarily to evaluate the relative accuracy of four sets of macrovascular risk equations included in ECHO-T2DM. We followed the ISPOR/SMDM guidelines on model validation, evaluating face validity, verification, cross-validation, and external validation. Model verification involved 297 'stress tests', in which specific model inputs were modified systematically to ascertain correct model implementation. Cross-validation consisted of a comparison between ECHO-T2DM predictions and those of the seminal National Institutes of Health model. In external validation, study characteristics were entered into ECHO-T2DM to replicate the clinical results of 12 studies (including 17 patient populations), and model predictions were compared to observed values using established statistical techniques as well as measures of average prediction error, separately for the four sets of macrovascular risk equations supported in ECHO-T2DM. Sub-group analyses were conducted for dependent vs. independent outcomes and for microvascular vs. macrovascular vs. mortality endpoints. All stress tests were passed. ECHO-T2DM replicated the National Institutes of Health cost-effectiveness application with numerically similar results. In external validation of ECHO-T2DM, model predictions agreed well with observed clinical outcomes. For all sets of macrovascular risk equations, the results were close to the intercept and slope coefficients corresponding to a perfect match, resulting in a high R² and failure to reject concordance using an F test. The results were similar for sub-groups of dependent and independent validation, with some degree of under-prediction of macrovascular events. ECHO-T2DM continues to match health outcomes in clinical trials in T2DM, with prediction accuracy similar to other leading models of T2DM.
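
    The external-validation statistic described here (an intercept and slope of observed against predicted outcomes, with an F test of joint concordance with 0 and 1) can be reproduced generically. The sketch below uses statsmodels with invented paired outcomes; it is a stand-in for the idea, not the study's analysis.

      # Regress observed outcomes on model-predicted outcomes and jointly test intercept = 0
      # and slope = 1 (an F test). Invented data, generic sketch.
      import numpy as np
      import statsmodels.api as sm

      predicted = np.array([2.1, 4.8, 7.5, 10.2, 13.0, 15.8, 19.1])   # model predictions
      observed  = np.array([2.4, 4.5, 7.9, 10.0, 13.6, 15.1, 19.5])   # trial outcomes

      fit = sm.OLS(observed, sm.add_constant(predicted)).fit()
      intercept, slope = fit.params
      ftest = fit.f_test("const = 0, x1 = 1")     # joint test of perfect concordance
      print(f"intercept={intercept:.3f}, slope={slope:.3f}, R2={fit.rsquared:.3f}, "
            f"F p-value={float(ftest.pvalue):.3f}")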

  9. Validation of an effective, low cost, Free/open access 3D-printed stethoscope

    PubMed Central

    Pavlosky, Alexander; Glauche, Jennifer; Chambers, Spencer; Al-Alawi, Mahmoud; Yanev, Kliment

    2018-01-01

    The modern acoustic stethoscope is a useful clinical tool used to detect subtle, pathological changes in cardiac, pulmonary and vascular sounds. Currently, brand-name stethoscopes are expensive despite limited innovations in design or fabrication in recent decades. Consequently, the high cost of high quality, brand name models serves as a barrier to clinicians practicing in various settings, especially in low- and middle-income countries. In this publication, we describe the design and validation of a low-cost open-access (Free/Libre) 3D-printed stethoscope which is comparable to the Littmann Cardiology III for use in low-access clinics. PMID:29538426

  10. Simulation models in population breast cancer screening: A systematic review.

    PubMed

    Koleva-Kolarova, Rositsa G; Zhan, Zhuozhao; Greuter, Marcel J W; Feenstra, Talitha L; De Bock, Geertruida H

    2015-08-01

    The aim of this review was to critically evaluate published simulation models for breast cancer screening of the general population and provide a direction for future modeling. A systematic literature search was performed to identify simulation models with more than one application. A framework for qualitative assessment was developed which incorporated model type; input parameters; modeling approach; transparency of input data sources/assumptions, sensitivity analyses, and risk of bias; validation; and outcomes. Predicted mortality reduction (MR) and cost-effectiveness (CE) were compared to estimates from meta-analyses of randomized controlled trials (RCTs) and acceptability thresholds. Seven original simulation models were distinguished, all sharing common input parameters. The modeling approach was based on tumor progression (except one model) with internal and cross validation of the resulting models, but without any external validation. Differences in lead times for invasive or non-invasive tumors, and the option for cancers not to progress, were not explicitly modeled. The models tended to overestimate the MR due to screening (11-24%) as compared with the 10% MR (95% CI: -2% to 21%) from optimal RCTs. Only recently were potential harms due to regular breast cancer screening reported. Most scenarios resulted in acceptable cost-effectiveness estimates given current thresholds. The selected models have been repeatedly applied in various settings to inform decision making, and the critical analysis revealed high risk of bias in their outcomes. Given the importance of the models, there is a need for externally validated models which use systematic evidence for input data to allow for more critical evaluation of breast cancer screening. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Parametric Cost Modeling of Space Missions Using the Develop New Projects (DNP) Implementation Process

    NASA Technical Reports Server (NTRS)

    Rosenberg, Leigh; Hihn, Jairus; Roust, Kevin; Warfield, Keith

    2000-01-01

    This paper presents an overview of a parametric cost model that has been built at JPL to estimate costs of future, deep space, robotic science missions. Due to the recent dramatic changes in JPL business practices brought about by an internal reengineering effort known as Develop New Products (DNP), high-level historic cost data is no longer considered analogous to future missions. Therefore, the historic data is of little value in forecasting costs for projects developed using the DNP process. This has led to the development of an approach for obtaining expert opinion and also for combining actual data with expert opinion to provide a cost database for future missions. In addition, the DNP cost model has a maximum of objective cost drivers, which reduces the likelihood of model input error. Version 2, now under development, expands the model capabilities, links it more tightly with key design technical parameters, and is grounded in more rigorous statistical techniques. The challenges faced in building this model will be discussed, as well as its background, development approach, status, validation, and future plans.

  12. Reducing Interaction Costs for Self-interested Agents

    NASA Astrophysics Data System (ADS)

    Zhang, Yunqi; Larson, Kate

    In many multiagent systems, agents are not able to freely interact with each other or with a centralized mechanism. They may be limited in their interactions by cost or by the inherent structure of the system. Using a combinatorial auction application as motivation, we study the impact of interaction costs and structure on the strategic behaviour of self-interested agents. We present a particular model of costly agent-interaction, and argue that self-interested agents may wish to coordinate their actions with their neighbours so as to reduce their individual costs. We highlight the issues that arise in such a setting, propose a cost-sharing mechanism that agents can use, and discuss group coordination procedures. Experimental work validates our model.

  13. Global cost of child survival: estimates from country-level validation

    PubMed Central

    van Ekdom, Liselore; Scherpbier, Robert W; Niessen, Louis W

    2011-01-01

    Objective: To cross-validate the global cost of scaling up child survival interventions to achieve the fourth Millennium Development Goal (MDG4) as estimated by the World Health Organization (WHO) in 2007 by using the latest country-provided data and new assumptions. Methods: After the main cost categories for each country were identified, validation questionnaires were sent to 32 countries with high child mortality. Publicly available estimates for disease incidence, intervention coverage, prices and resources for individual-level and programme-level activities were validated against local data. Nine updates to the 2007 WHO model were generated using revised assumptions. Finally, estimates were extrapolated to 75 countries and combined with cost estimates for immunization and malaria programmes and for programmes for the prevention of mother-to-child transmission of the human immunodeficiency virus (HIV). Findings: Twenty-six countries responded. Adjustments were largest for system- and programme-level data and smallest for patient data. Country-level validation caused a 53% increase in original cost estimates (i.e., 9 billion 2004 United States dollars [US$]) for 26 countries owing to revised system and programme assumptions, especially surrounding community health worker costs. The additional effect of updated population figures was small; updated epidemiologic figures increased costs by US$ 4 billion (+15%). New unit prices in the 26 countries that provided data increased estimates by US$ 4.3 billion (+16%). Extrapolation to 75 countries increased the original price estimate by US$ 33 billion (+80%) for 2010–2015. Conclusion: Country-level validation had a significant effect on the cost estimate. Price adaptations and programme-related assumptions contributed substantially. An additional US$ 74 billion (in 2005 dollars, representing a 12% increase in total health expenditure) would be needed between 2010 and 2015. Given resource constraints, countries will need to prioritize health activities within their national resource envelope. PMID:21479091

  14. Microscopic simulation model calibration and validation handbook.

    DOT National Transportation Integrated Search

    2006-01-01

    Microscopic traffic simulation models are widely used in the transportation engineering field. Because of their cost-effectiveness, risk-free nature, and high-speed benefits, areas of use include transportation system design, traffic operations, and ...

  15. Outcome evaluation of a new model of critical care orientation.

    PubMed

    Morris, Linda L; Pfeifer, Pamela; Catalano, Rene; Fortney, Robert; Nelson, Greta; Rabito, Robb; Harap, Rebecca

    2009-05-01

    The shortage of critical care nurses and the service expansion of 2 intensive care units provided a unique opportunity to create a new model of critical care orientation. The goal was to design a program that assessed critical thinking, validated competence, and provided learning pathways that accommodated diverse experience. To determine the effect of a new model of critical care orientation on satisfaction, retention, turnover, vacancy, preparedness to manage patient care assignment, length of orientation, and cost of orientation. A prospective, quasi-experimental design with both quantitative and qualitative methods. The new model improved satisfaction scores, retention rates, and recruitment of critical care nurses. Length of orientation was unchanged. Cost was increased, primarily because a full-time education consultant was added. A new model for nurse orientation that was focused on critical thinking and competence validation improved retention and satisfaction and serves as a template for orientation of nurses throughout the medical center.

  16. Beyond discrimination: A comparison of calibration methods and clinical usefulness of predictive models of readmission risk.

    PubMed

    Walsh, Colin G; Sharman, Kavya; Hripcsak, George

    2017-12-01

    Prior to implementing predictive models in novel settings, analyses of calibration and clinical usefulness remain as important as discrimination, but they are not frequently discussed. Calibration is a model's reflection of actual outcome prevalence in its predictions. Clinical usefulness refers to the utilities, costs, and harms of using a predictive model in practice. A decision analytic approach to calibrating and selecting an optimal intervention threshold may help maximize the impact of readmission risk and other preventive interventions. To select a pragmatic means of calibrating predictive models that requires a minimum amount of validation data and that performs well in practice. To evaluate the impact of miscalibration on utility and cost via clinical usefulness analyses. Observational, retrospective cohort study with electronic health record data from 120,000 inpatient admissions at an urban, academic center in Manhattan. The primary outcome was thirty-day readmission for three causes: all-cause, congestive heart failure, and chronic coronary atherosclerotic disease. Predictive modeling was performed via L1-regularized logistic regression. Calibration methods were compared including Platt Scaling, Logistic Calibration, and Prevalence Adjustment. Performance of predictive modeling and calibration was assessed via discrimination (c-statistic), calibration (Spiegelhalter Z-statistic, Root Mean Square Error [RMSE] of binned predictions, Sanders and Murphy Resolutions of the Brier Score, Calibration Slope and Intercept), and clinical usefulness (utility terms represented as costs). The amount of validation data necessary to apply each calibration algorithm was also assessed. C-statistics by diagnosis ranged from 0.7 for all-cause readmission to 0.86 (0.78-0.93) for congestive heart failure. Logistic Calibration and Platt Scaling performed best and this difference required analyzing multiple metrics of calibration simultaneously, in particular Calibration Slopes and Intercepts. Clinical usefulness analyses provided optimal risk thresholds, which varied by reason for readmission, outcome prevalence, and calibration algorithm. Utility analyses also suggested maximum tolerable intervention costs, e.g., $1720 for all-cause readmissions based on a published cost of readmission of $11,862. Choice of calibration method depends on availability of validation data and on performance. Improperly calibrated models may contribute to higher costs of intervention as measured via clinical usefulness. Decision-makers must understand underlying utilities or costs inherent in the use-case at hand to assess usefulness and will obtain the optimal risk threshold to trigger intervention with intervention cost limits as a result. Copyright © 2017 Elsevier Inc. All rights reserved.
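
    Platt Scaling, one of the calibration methods compared above, can be sketched in a few lines. The data below are simulated and scikit-learn is assumed to be available; this shows the generic technique, not the study's implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical uncalibrated risk scores and observed 30-day readmissions
# on a held-out calibration set (the model over-predicts risk here).
raw_scores = rng.uniform(0, 1, 1000)
outcomes = rng.binomial(1, 0.6 * raw_scores)

# Platt scaling: regress the observed outcome on the raw score with a
# one-variable logistic model; its predictions are the calibrated risks.
platt = LogisticRegression().fit(raw_scores.reshape(-1, 1), outcomes)
calibrated = platt.predict_proba(raw_scores.reshape(-1, 1))[:, 1]

print("mean raw score:      ", round(raw_scores.mean(), 3))
print("mean calibrated risk:", round(calibrated.mean(), 3))
print("observed event rate: ", round(outcomes.mean(), 3))
```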

  17. A novel heuristic for optimization aggregate production problem: Evidence from flat panel display in Malaysia

    NASA Astrophysics Data System (ADS)

    Al-Kuhali, K.; Hussain M., I.; Zain Z., M.; Mullenix, P.

    2015-05-01

    Aim: This paper contributes to the flat panel display industry in terms of aggregate production planning. Methodology: To minimize the total production cost of LCD manufacturing, a linear programming model was applied. The decision variables are general production costs, additional costs incurred for overtime production, additional costs incurred for subcontracting, inventory carrying costs, backorder costs, and adjustments for changes in labour levels. The model was developed for a manufacturer with several product types, up to a maximum of N, over a total time period of T. Results: An industrial case study based in Malaysia is presented to test and validate the developed linear programming model for aggregate production planning. Conclusion: The developed model is suitable under stable environmental conditions. Overall, the proven linear programming model can be recommended for adaptation to production planning in the Malaysian flat panel display industry.
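
    A minimal linear-programming sketch of an aggregate production plan in this spirit, reduced to a single product with regular production, overtime, and inventory decisions (all costs, demands, and capacities are hypothetical, and scipy is assumed to be available):

```python
from scipy.optimize import linprog

# Decision variables per month t = 0..2: regular production R_t, overtime O_t,
# end-of-month inventory I_t. Minimise production + overtime + holding cost.
demand = [100, 150, 120]
c = [10, 10, 10,    # regular production cost per unit, months 0..2
     15, 15, 15,    # overtime cost per unit
     2, 2, 2]       # inventory holding cost per unit

# Inventory balance: I_{t-1} + R_t + O_t - I_t = d_t  (with I_{-1} = 0)
A_eq, b_eq = [], []
for t in range(3):
    row = [0.0] * 9
    row[t] = 1.0          # R_t
    row[3 + t] = 1.0      # O_t
    row[6 + t] = -1.0     # -I_t
    if t > 0:
        row[6 + t - 1] = 1.0  # +I_{t-1}
    A_eq.append(row)
    b_eq.append(demand[t])

bounds = [(0, 120)] * 3 + [(0, 40)] * 3 + [(0, None)] * 3  # capacity limits
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print(res.x, res.fun)
```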

  18. Design optimization of space launch vehicles using a genetic algorithm

    NASA Astrophysics Data System (ADS)

    Bayley, Douglas James

    The United States Air Force (USAF) continues to have a need for assured access to space. In addition to flexible and responsive spacelift, a reduction in the cost per launch of space launch vehicles is also desirable. For this purpose, an investigation of the design optimization of space launch vehicles has been conducted. Using a suite of custom codes, the performance aspects of an entire space launch vehicle were analyzed. A genetic algorithm (GA) was employed to optimize the design of the space launch vehicle. A cost model was incorporated into the optimization process with the goal of minimizing the overall vehicle cost. The other goals of the design optimization included obtaining the proper altitude and velocity to achieve a low-Earth orbit. Specific mission parameters that are particular to USAF space endeavors were specified at the start of the design optimization process. Solid propellant motors, liquid-fueled rockets, and air-launched systems in various configurations provided the propulsion systems for two-, three-, and four-stage launch vehicles. Mass properties models, an aerodynamics model, and a six-degree-of-freedom (6DOF) flight dynamics simulator were all used to model the system. The results show the feasibility of this method in designing launch vehicles that meet mission requirements. Comparisons to existing real-world systems provide the validation for the physical system models. However, the ability to obtain a truly minimized cost was elusive. The cost model uses an industry-standard approach; however, validation of this portion of the model was challenging due to the proprietary nature of cost figures and due to the dependence of many existing systems on surplus hardware.
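
    A minimal genetic-algorithm sketch of the general approach, using cost as the fitness function with a penalty for missing a required orbital velocity (the cost and performance relationships, variable ranges, and numbers are all hypothetical stand-ins, not the dissertation's models):

```python
import random

random.seed(1)

def cost(design):
    # Hypothetical cost model: cost grows with propellant mass and stage count.
    prop_mass, n_stages = design
    return 20.0 + 0.05 * prop_mass + 8.0 * n_stages

def delta_v(design):
    # Hypothetical performance model (stand-in for the 6DOF flight simulation).
    prop_mass, n_stages = design
    return 2.5 * n_stages + 0.015 * prop_mass

def fitness(design):
    penalty = max(0.0, 9.5 - delta_v(design)) * 1e3  # must reach ~9.5 km/s
    return cost(design) + penalty

def random_design():
    return [random.uniform(50, 400), random.randint(2, 4)]

pop = [random_design() for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness)
    parents = pop[:10]
    children = []
    while len(children) < 20:
        a, b = random.sample(parents, 2)
        child = [(a[0] + b[0]) / 2, random.choice([a[1], b[1]])]  # crossover
        child[0] += random.gauss(0, 10)                           # mutation
        child[0] = min(max(child[0], 50), 400)
        children.append(child)
    pop = parents + children

best = min(pop, key=fitness)
print("best design:", best, "cost:", round(cost(best), 1))
```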

  19. Virtual experiments, physical validation: dental morphology at the intersection of experiment and theory

    PubMed Central

    Anderson, P. S. L.; Rayfield, E. J.

    2012-01-01

    Computational models such as finite-element analysis offer biologists a means of exploring the structural mechanics of biological systems that cannot be directly observed. Validated against experimental data, a model can be manipulated to perform virtual experiments, testing variables that are hard to control in physical experiments. The relationship between tooth form and the ability to break down prey is key to understanding the evolution of dentition. Recent experimental work has quantified how tooth shape promotes fracture in biological materials. We present a validated finite-element model derived from physical compression experiments. The model shows close agreement with strain patterns observed in photoelastic test materials and reaction forces measured during these experiments. We use the model to measure strain energy within the test material when different tooth shapes are used. Results show that notched blades deform materials for less strain energy cost than straight blades, giving insights into the energetic relationship between tooth form and prey materials. We identify a hypothetical ‘optimal’ blade angle that minimizes strain energy costs and test alternative prey materials via virtual experiments. Using experimental data and computational models offers an integrative approach to understand the mechanics of tooth morphology. PMID:22399789

  20. Calibration and Validation of the COCOMO II.1997.0 Cost/Schedule Estimating Model to the Space and Missile Systems Center Database

    DTIC Science & Technology

    1997-09-01

    Daly chose five models (REVIC, PRICE-S, SEER, System-4, and SPQR/20) to estimate schedule for 21 separate projects from the Electronic System Division ... PRICE-S, two variants of COCOMO, System-3, SPQR/20, SASET, SoftCost-Ada) to eight Ada-specific programs. Ada was specifically designed for and is ...

  1. An integrated approach to evaluating alternative risk prediction strategies: a case study comparing alternative approaches for preventing invasive fungal disease.

    PubMed

    Sadique, Z; Grieve, R; Harrison, D A; Jit, M; Allen, E; Rowan, K M

    2013-12-01

    This article proposes an integrated approach to the development, validation, and evaluation of new risk prediction models illustrated with the Fungal Infection Risk Evaluation study, which developed risk models to identify non-neutropenic, critically ill adult patients at high risk of invasive fungal disease (IFD). Our decision-analytical model compared alternative strategies for preventing IFD at up to three clinical decision time points (critical care admission, after 24 hours, and end of day 3), followed with antifungal prophylaxis for those judged "high" risk versus "no formal risk assessment." We developed prognostic models to predict the risk of IFD before critical care unit discharge, with data from 35,455 admissions to 70 UK adult, critical care units, and validated the models externally. The decision model was populated with positive predictive values and negative predictive values from the best-fitting risk models. We projected lifetime cost-effectiveness and expected value of partial perfect information for groups of parameters. The risk prediction models performed well in internal and external validation. Risk assessment and prophylaxis at the end of day 3 was the most cost-effective strategy at the 2% and 1% risk threshold. Risk assessment at each time point was the most cost-effective strategy at a 0.5% risk threshold. Expected values of partial perfect information were high for positive predictive values or negative predictive values (£11 million-£13 million) and quality-adjusted life-years (£11 million). It is cost-effective to formally assess the risk of IFD for non-neutropenic, critically ill adult patients. This integrated approach to developing and evaluating risk models is useful for informing clinical practice and future research investment. © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR) Published by International Society for Pharmacoeconomics and Outcomes Research (ISPOR) All rights reserved.
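
    The decision model described is populated with positive and negative predictive values; a minimal sketch of how such values can be turned into an expected cost per patient for an "assess risk and give prophylaxis if high risk" strategy follows. All probabilities, costs, and the assumed prophylaxis effectiveness are hypothetical, not FIRE study estimates.

```python
def expected_cost(prevalence, ppv, npv, c_prophylaxis, c_ifd, prophylaxis_efficacy=0.7):
    """Expected cost per patient of 'assess risk, give prophylaxis if high risk'.

    All inputs are hypothetical; prophylaxis_efficacy is the assumed fraction
    of invasive fungal disease (IFD) cases averted in treated patients.
    """
    # Share of patients flagged high risk, implied by prevalence, PPV and NPV:
    # prevalence = p_pos*PPV + (1 - p_pos)*(1 - NPV)
    p_pos = (prevalence - (1 - npv)) / (ppv - (1 - npv))
    cost_flagged = c_prophylaxis + (1 - prophylaxis_efficacy) * ppv * c_ifd
    cost_unflagged = (1 - npv) * c_ifd
    return p_pos * cost_flagged + (1 - p_pos) * cost_unflagged

print(round(expected_cost(prevalence=0.02, ppv=0.10, npv=0.995,
                          c_prophylaxis=300.0, c_ifd=20000.0), 2))
```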

  2. The effect of adopting new storage methods for extending product validity periods on manufacturers expected inventory costs.

    PubMed

    Chen, Po-Yu

    2014-01-01

    The validity of the expiration dates (validity periods) that manufacturers provide on food product labels is a crucial food safety problem. Governments must consider how to use their authority, through fair rewards and penalties, to prompt manufacturers to weigh rigorously the effect that adopting new storage methods to extend product validity periods has on expected costs. A manufacturer selling fresh food or drugs must respond to stochastic demand at each unit of time when deciding how much product to purchase for sale. If the decision maker is capable and an opportunity arises, new packaging methods (e.g., aluminum foil packaging, vacuum packaging, high-temperature sterilization after glass packaging, or packaging with various degrees of dryness) or storage methods (i.e., adding desiccants or various antioxidants) can be chosen to extend the validity periods of products. To minimize expected costs, the decision maker must account for the processing costs of the new storage methods, inventory standards, inventory cycle lengths, and the resulting changes in relationships between factors such as the stochastic demand function within a cycle. Based on these changed relationships, this study establishes a mathematical model as a basis for analyzing these questions.

  3. Recent developments in the economic modeling of photovoltaic module manufacturing

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.

    1979-01-01

    Recent developments in the solar array manufacturing industry costing standards (SAMICS) are described. Consideration is given to the added capability to handle arbitrary operating schedules and the revised procedure for calculation of one-time costs. The results of an extensive validation study are summarized.

  4. Experimental Validation of Plastic Mandible Models Produced by a “Low-Cost” 3-Dimensional Fused Deposition Modeling Printer

    PubMed Central

    Maschio, Federico; Pandya, Mirali; Olszewski, Raphael

    2016-01-01

    Background The objective of this study was to investigate the accuracy of 3-dimensional (3D) plastic (ABS) models generated using a low-cost 3D fused deposition modelling printer. Material/Methods Two human dry mandibles were scanned with a cone beam computed tomography (CBCT) Accuitomo device. Preprocessing consisted of 3D reconstruction with Maxilim software and STL file repair with Netfabb software. Then, the data were used to print 2 plastic replicas with a low-cost 3D fused deposition modeling printer (Up plus 2®). Two independent observers performed the identification of 26 anatomic landmarks on the 4 mandibles (2 dry and 2 replicas) with a 3D measuring arm. Each observer repeated the identifications 20 times. The comparison between the dry and plastic mandibles was based on 13 distances: 8 distances less than 12 mm and 5 distances greater than 12 mm. Results The mean absolute difference (MAD) was 0.37 mm, and the mean dimensional error (MDE) was 3.76%. The MDE decreased to 0.93% for distances greater than 12 mm. Conclusions Plastic models generated using the low-cost 3D printer UPplus2® provide dimensional accuracies comparable to other well-established rapid prototyping technologies. Validated low-cost 3D printers could represent a step toward the better accessibility of rapid prototyping technologies in the medical field. PMID:27003456
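
    The two accuracy metrics reported, mean absolute difference (MAD) and mean dimensional error (MDE), are straightforward to compute; a minimal sketch with hypothetical paired landmark distances:

```python
import numpy as np

# Hypothetical paired distances (mm) measured on a dry mandible and its printed replica.
dry     = np.array([10.2, 25.4, 8.7, 31.0, 14.9])
replica = np.array([10.6, 25.1, 9.0, 30.4, 15.3])

mad = np.mean(np.abs(replica - dry))                 # mean absolute difference, mm
mde = np.mean(np.abs(replica - dry) / dry) * 100.0   # mean dimensional error, %

print(f"MAD = {mad:.2f} mm, MDE = {mde:.2f} %")
```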

  5. Mathematical model for dynamic cell formation in fast fashion apparel manufacturing stage

    NASA Astrophysics Data System (ADS)

    Perera, Gayathri; Ratnayake, Vijitha

    2018-05-01

    This paper presents a mathematical programming model for dynamic cell formation to minimize changeover-related costs (i.e., machine relocation costs and machine setup cost) and inter-cell material handling cost, to cope with the volatile production environments in the apparel manufacturing industry. The model is formulated from the findings of a comprehensive literature review. The developed model is validated with data collected from three factories in the apparel industry that manufacture fast fashion products. A program code is developed using the Lingo 16.0 software package to generate optimal cells for the developed model and to determine the possible cost-saving percentage when the existing layouts used in the three factories are replaced by the generated optimal cells. The optimal cells generated by the developed mathematical model result in significant cost savings when compared with the existing product layouts used in the production/assembly departments of the selected factories. The developed model can be considered effective in minimizing the considered cost terms in the dynamic production environment of fast fashion apparel manufacturing. Findings of this paper can be used for further research on minimizing changeover-related costs in the fast fashion apparel production stage.

  6. Modeling and applications in microbial food safety

    USDA-ARS?s Scientific Manuscript database

    Mathematical modeling is a scientific and systematic approach to study and describe the recurrent events or phenomena with successful application track for decades. When models are properly developed and validated, their applications may save costs and time. For the microbial food safety concerns, ...

  7. Evaluation of land use regression models in Detroit, Michigan

    EPA Science Inventory

    Introduction: Land use regression (LUR) models have emerged as a cost-effective tool for characterizing exposure in epidemiologic health studies. However, little critical attention has been focused on validation of these models as a step toward temporal and spatial extension of ...

  8. Generator Dynamic Model Validation and Parameter Calibration Using Phasor Measurements at the Point of Connection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Zhenyu; Du, Pengwei; Kosterev, Dmitry

    2013-05-01

    Disturbance data recorded by phasor measurement units (PMU) offers opportunities to improve the integrity of dynamic models. However, manually tuning parameters through play-back events demands significant efforts and engineering experiences. In this paper, a calibration method using the extended Kalman filter (EKF) technique is proposed. The formulation of EKF with parameter calibration is discussed. Case studies are presented to demonstrate its validity. The proposed calibration method is cost-effective, complementary to traditional equipment testing for improving dynamic model quality.
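
    A minimal sketch of the state-augmentation idea behind EKF parameter calibration: the unknown parameter is appended to the state vector and updated from measurements. The toy first-order system below is a hypothetical stand-in, not the generator dynamic equations used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system: x[k+1] = a*x[k] + w, measurement y[k] = x[k] + v. True a = 0.85.
# The EKF estimates the augmented state z = [x, a], treating a as nearly constant.
true_a, n_steps = 0.85, 300
x_true, meas = 1.0, []
for _ in range(n_steps):
    x_true = true_a * x_true + rng.normal(0, 0.2)
    meas.append(x_true + rng.normal(0, 0.05))

z = np.array([0.0, 0.5])                  # initial guess for [x, a]
P = np.eye(2)                             # state covariance
Q = np.diag([0.04, 1e-5])                 # process noise (parameter drifts slowly)
R = 0.05 ** 2                             # measurement noise variance
H = np.array([[1.0, 0.0]])                # we measure x only

for y in meas:
    # Predict: x' = a*x, a' = a; Jacobian of the transition is [[a, x], [0, 1]].
    F = np.array([[z[1], z[0]], [0.0, 1.0]])
    z = np.array([z[1] * z[0], z[1]])
    P = F @ P @ F.T + Q
    # Update with the measurement.
    S = H @ P @ H.T + R
    K = P @ H.T / S
    z = z + (K * (y - z[0])).ravel()
    P = (np.eye(2) - K @ H) @ P

print("calibrated parameter a:", round(z[1], 3))   # typically close to 0.85
```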

  9. Policy evaluation in diabetes prevention and treatment using a population-based macro simulation model: the MICADO model.

    PubMed

    van der Heijden, A A W A; Feenstra, T L; Hoogenveen, R T; Niessen, L W; de Bruijne, M C; Dekker, J M; Baan, C A; Nijpels, G

    2015-12-01

    To test a simulation model, the MICADO model, for estimating the long-term effects of interventions in people with and without diabetes. The MICADO model includes micro- and macrovascular diseases in relation to their risk factors. The strengths of this model are its population scope and the possibility to assess parameter uncertainty using probabilistic sensitivity analyses. Outcomes include incidence and prevalence of complications, quality of life, costs and cost-effectiveness. We externally validated MICADO's estimates of micro- and macrovascular complications in a Dutch cohort with diabetes (n = 498,400) by comparing these estimates with national and international empirical data. For the annual number of people undergoing amputations, MICADO's estimate was 592 (95% interquantile range 291-842), which compared well with the registered number of people with diabetes-related amputations in the Netherlands (728). The incidence of end-stage renal disease estimated using the MICADO model was 247 people (95% interquartile range 120-363), which was also similar to the registered incidence in the Netherlands (277 people). MICADO performed well in the validation of macrovascular outcomes of population-based cohorts, while it had more difficulty in reflecting a highly selected trial population. Validation by comparison with independent empirical data showed that the MICADO model simulates the natural course of diabetes and its micro- and macrovascular complications well. As a population-based model, MICADO can be applied for projections as well as scenario analyses to evaluate the long-term (cost-)effectiveness of population-level interventions targeting diabetes and its complications in the Netherlands or similar countries. © 2015 The Authors. Diabetic Medicine © 2015 Diabetes UK.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, Aladsair J.; Viswanathan, Vilayanur V.; Stephenson, David E.

    A robust performance-based cost model is developed for all-vanadium, iron-vanadium, and iron-chromium redox flow batteries. System aspects such as shunt current losses, pumping losses, and thermal management are accounted for. The objective function, set to minimize system cost, allows determination of stack design and operating parameters such as current density, flow rate, and depth of discharge (DOD). Component costs obtained from vendors are used to calculate system costs for various time frames. Data from a 2 kW stack were used to estimate unit energy costs and compared with model estimates for the same size electrodes. The tool has been shared with the redox flow battery community to both validate their stack data and guide future direction.
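
    A minimal sketch of the optimization idea described: minimize cost per usable kWh over operating variables subject to power and energy requirements. The cost coefficients and trade-off relationships below are hypothetical, not the published model's, and scipy is assumed to be available.

```python
from scipy.optimize import minimize

P_req_kw, E_req_kwh = 250.0, 1000.0       # required system power and energy

def cost_per_usable_kwh(x):
    current_density, dod = x              # mA/cm^2 and depth of discharge (0-1)
    # Hypothetical trade-offs: a higher current density shrinks (and cheapens)
    # the stack but lowers round-trip efficiency; a deeper DOD needs less
    # electrolyte for the same usable energy.
    stack_cost = 2000.0 * P_req_kw / current_density        # $
    electrolyte_cost = 150.0 * E_req_kwh / dod              # $
    efficiency = 0.9 - 0.001 * current_density
    return (stack_cost + electrolyte_cost) / (E_req_kwh * efficiency)

res = minimize(cost_per_usable_kwh, x0=[50.0, 0.7],
               bounds=[(20.0, 150.0), (0.4, 0.9)])
print("optimal current density and DOD:", res.x.round(2))
print("minimum cost ($/usable kWh):", round(res.fun, 1))
```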

  11. Analysis on the cost structure of product recall for reverse supply chain

    NASA Astrophysics Data System (ADS)

    Yanhua, Feng; Xuhui, Xia; Zheng, Yang

    2017-12-01

    Research on the reverse supply chain for product recall has mainly focused on the recall network structure, logistics mode, and so on. In this paper, with the product recall and supply channels fixed, the specific structure and functional expression of the cost are analyzed according to the peak and off-peak seasons of recall activity and according to whether the assembly manufacturer, supplier, and recyclers cooperate, in order to build a model of the total cost structure. Finally, the model is validated through cases from the automotive and electromechanical industries.

  12. A diagnostic model for the detection of sensitization to wheat allergens was developed and validated in bakery workers.

    PubMed

    Suarthana, Eva; Vergouwe, Yvonne; Moons, Karel G; de Monchy, Jan; Grobbee, Diederick; Heederik, Dick; Meijer, Evert

    2010-09-01

    To develop and validate a prediction model to detect sensitization to wheat allergens in bakery workers. The prediction model was developed in 867 Dutch bakery workers (development set, prevalence of sensitization 13%) and included questionnaire items (candidate predictors). First, principal component analysis was used to reduce the number of candidate predictors. Then, multivariable logistic regression analysis was used to develop the model. Internal validation and extent of optimism were assessed with bootstrapping. External validation was studied in 390 independent Dutch bakery workers (validation set, prevalence of sensitization 20%). The prediction model contained the predictors nasoconjunctival symptoms, asthma symptoms, shortness of breath and wheeze, work-related upper and lower respiratory symptoms, and traditional bakery. The model showed good discrimination, with an area under the receiver operating characteristic (ROC) curve of 0.76 (and 0.75 after internal validation). Application of the model in the validation set gave reasonable discrimination (ROC area=0.69) and good calibration after a small adjustment of the model intercept. A simple model with questionnaire items only can be used to stratify bakers according to their risk of sensitization to wheat allergens. Its use may increase the cost-effectiveness of (subsequent) medical surveillance.
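
    A minimal sketch of the validation workflow described (fit on a development set, assess discrimination on an external set, then adjust the model intercept), with simulated data standing in for the bakery cohorts; scikit-learn and scipy are assumed to be available, and the predictor effects are hypothetical.

```python
import numpy as np
from scipy.optimize import brentq
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

def simulate(n, shift=0.0):
    """Simulated questionnaire items (binary) and sensitization outcome."""
    X = rng.binomial(1, 0.3, size=(n, 5)).astype(float)
    logit = -2.5 + shift + X @ np.array([0.8, 0.6, 0.5, 0.4, 0.9])
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))
    return X, y

X_dev, y_dev = simulate(867)              # development set
X_val, y_val = simulate(390, shift=0.5)   # external set with higher prevalence

model = LogisticRegression().fit(X_dev, y_dev)
p_val = model.predict_proba(X_val)[:, 1]
print("external ROC area:", round(roc_auc_score(y_val, p_val), 2))

# Intercept-only recalibration: shift the intercept so the mean predicted
# risk matches the observed prevalence in the validation set.
lp = X_val @ model.coef_.ravel() + model.intercept_[0]
shift = brentq(lambda d: np.mean(1 / (1 + np.exp(-(lp + d)))) - y_val.mean(), -5, 5)
print("intercept adjustment:", round(shift, 2))
```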

  13. A mechanistic model for electricity consumption on dairy farms: definition, validation, and demonstration.

    PubMed

    Upton, J; Murphy, M; Shalloo, L; Groot Koerkamp, P W G; De Boer, I J M

    2014-01-01

    Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on dairy farms (MECD) capable of simulating total electricity consumption along with related CO2 emissions and electricity costs on dairy farms on a monthly basis; (2) validated the MECD using empirical data of 1yr on commercial spring calving, grass-based dairy farms with 45, 88, and 195 milking cows; and (3) demonstrated the functionality of the model by applying 2 electricity tariffs to the electricity consumption data and examining the effect on total dairy farm electricity costs. The MECD was developed using a mechanistic modeling approach and required the key inputs of milk production, cow number, and details relating to the milk-cooling system, milking machine system, water-heating system, lighting systems, water pump systems, and the winter housing facilities as well as details relating to the management of the farm (e.g., season of calving). Model validation showed an overall relative prediction error (RPE) of less than 10% for total electricity consumption. More than 87% of the mean square prediction error of total electricity consumption was accounted for by random variation. The RPE values of the milk-cooling systems, water-heating systems, and milking machine systems were less than 20%. The RPE values for automatic scraper systems, lighting systems, and water pump systems varied from 18 to 113%, indicating a poor prediction for these metrics. However, automatic scrapers, lighting, and water pumps made up only 14% of total electricity consumption across all farms, reducing the overall impact of these poor predictions. Demonstration of the model showed that total farm electricity costs increased by between 29 and 38% by moving from a day and night tariff to a flat tariff. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
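
    The validation statistics named above, the relative prediction error (RPE) and the share of mean square prediction error (MSPE) due to random variation, can be computed as follows; the monthly data are hypothetical, and the MSPE decomposition shown is the standard mean-bias / slope-bias / random-variation split.

```python
import numpy as np

# Hypothetical monthly electricity consumption (kWh): metered vs model-predicted.
actual    = np.array([2150, 1980, 2210, 2400, 2620, 2890, 3010, 2950, 2700, 2480, 2260, 2120])
predicted = np.array([2230, 2050, 2150, 2350, 2700, 2800, 3100, 2880, 2650, 2550, 2300, 2060])

mspe = np.mean((actual - predicted) ** 2)
rpe = 100.0 * np.sqrt(mspe) / actual.mean()        # relative prediction error, %

# Decompose MSPE into mean bias, slope (line) bias, and random variation.
sa, sp = actual.std(), predicted.std()
r = np.corrcoef(actual, predicted)[0, 1]
b = r * sa / sp                                    # slope of actual regressed on predicted
mean_bias = (actual.mean() - predicted.mean()) ** 2
slope_bias = (sp * (1 - b)) ** 2
random_var = sa ** 2 * (1 - r ** 2)

print(f"RPE = {rpe:.1f}%")
print("share of MSPE from random variation:", round(random_var / mspe, 2))
```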

  14. Validity of a small low-cost triaxial accelerometer with integrated logger for uncomplicated measurements of postures and movements of head, upper back and upper arms.

    PubMed

    Dahlqvist, Camilla; Hansson, Gert-Åke; Forsman, Mikael

    2016-07-01

    Repetitive work and work in constrained postures are risk factors for developing musculoskeletal disorders. Low-cost, user-friendly technical methods to quantify these risks are needed. The aims were to validate inclination angles and velocities of one model of the new generation of accelerometers with integrated data loggers against a previously validated one, and to compare measurements obtained using a plain reference posture with those obtained using a standardized one. All mean (n = 12 subjects) angular RMS-differences in 4 work tasks and 4 body parts were <2.5° and all mean median angular velocity differences <5.0 °/s. The mean correlation between the inclination signal-pairs was 0.996. This model of the new generation of triaxial accelerometers proved to be comparable to the validated accelerometer using a data logger. This makes it well-suited, for both researchers and practitioners, to measure postures and movements during work. Further work is needed for validation of the plain reference posture for upper arms.
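
    A minimal sketch of how inclination angle and angular velocity are typically derived from a triaxial accelerometer signal; the sampling rate and data are hypothetical, and this is the generic calculation rather than the specific loggers' processing.

```python
import numpy as np

fs = 20.0                                    # sampling rate, Hz (assumed)
rng = np.random.default_rng(3)

# Hypothetical gravity-normalised triaxial signal (x, y, z) during slow arm movement.
t = np.arange(0, 10, 1 / fs)
angle_true = np.deg2rad(30 + 20 * np.sin(0.5 * t))
acc = np.column_stack([np.sin(angle_true), np.zeros_like(t), np.cos(angle_true)])
acc += rng.normal(0, 0.01, acc.shape)

# Inclination relative to the vertical (z) axis from the gravity vector.
norm = np.linalg.norm(acc, axis=1)
inclination = np.degrees(np.arccos(acc[:, 2] / norm))

# Angular velocity as the time derivative of inclination; report the median.
velocity = np.abs(np.gradient(inclination, 1 / fs))
print("median inclination:", round(np.median(inclination), 1), "deg")
print("median angular velocity:", round(np.median(velocity), 1), "deg/s")
```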

  15. Predictive microbiology for food packaging applications

    USDA-ARS?s Scientific Manuscript database

    Mathematical modeling has been applied to describe the microbial growth and inactivation in foods for decades and is also known as ‘Predictive microbiology’. When models are developed and validated, their applications may save cost and time. The Pathogen Modeling Program (PMP), a collection of mode...

  16. NASA Instrument Cost/Schedule Model

    NASA Technical Reports Server (NTRS)

    Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George

    2011-01-01

    NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system-level cost estimation tool; a subsystem-level cost estimation tool; a database of cost and technical parameters of over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, data mining methods (cluster analysis, principal components analysis, regression analysis, and bootstrap cross-validation), the estimating equations themselves, and a demonstration of the NICM tool suite.
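
    One of the methods named, bootstrap cross-validation of a cost-estimating regression, can be sketched as follows; the log-linear cost-versus-mass relationship and the data are hypothetical, not NICM's estimating equations.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical instrument data: mass (kg) and cost ($M).
mass = rng.uniform(5, 200, 40)
cost = 2.0 * mass ** 0.7 * rng.lognormal(0, 0.25, 40)

X = np.column_stack([np.ones_like(mass), np.log(mass)])
y = np.log(cost)

errors = []
for _ in range(1000):
    idx = rng.integers(0, len(y), len(y))             # bootstrap resample
    oob = np.setdiff1d(np.arange(len(y)), idx)        # out-of-bag cases
    if oob.size == 0:
        continue
    beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    pred = np.exp(X[oob] @ beta)
    errors.append(np.mean(np.abs(pred - cost[oob]) / cost[oob]))

print("bootstrap cross-validated mean % error:", round(100 * np.mean(errors), 1))
```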

  17. Cost and cost effectiveness of vaginal progesterone gel in reducing preterm birth: an economic analysis of the PREGNANT trial.

    PubMed

    Pizzi, Laura T; Seligman, Neil S; Baxter, Jason K; Jutkowitz, Eric; Berghella, Vincenzo

    2014-05-01

    Preterm birth (PTB) is a costly public health problem in the USA. The PREGNANT trial tested the efficacy of vaginal progesterone (VP) 8 % gel in reducing the likelihood of PTB among women with a short cervix. We calculated the costs and cost effectiveness of VP gel versus placebo using decision analytic models informed by PREGNANT patient-level data. PREGNANT enrolled 459 pregnant women with a cervical length of 10-20 mm and randomized them to either VP 8 % gel or placebo. We used a cost model to estimate the total cost of treatment per mother and a cost-effectiveness model to estimate the cost per PTB averted with VP gel versus placebo. Patient-level trial data informed model inputs and included PTB rates in low- and high-risk women in each study group at <28 weeks gestation, 28-31, 32-36, and ≥37 weeks. Cost assumptions were based on 2010 US healthcare services reimbursements. The cost model was validated against patient-level data. Sensitivity analyses were used to test the robustness of the cost-effectiveness model. The estimated cost per mother was $US23,079 for VP gel and $US36,436 for placebo. The cost-effectiveness model showed savings of $US24,071 per PTB averted with VP gel. VP gel realized cost savings and cost effectiveness in 79 % of simulations. Based on findings from PREGNANT, VP gel was associated with cost savings and cost effectiveness compared with placebo. Future trials designed to include cost metrics are needed to better understand the value of VP.
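
    The headline cost-effectiveness arithmetic follows directly from the per-mother costs reported above; a minimal sketch using those figures and hypothetical PTB rates (the trial's actual rates are not restated in the abstract):

```python
cost_vp, cost_placebo = 23079.0, 36436.0          # per mother, from the abstract
ptb_rate_vp, ptb_rate_placebo = 0.30, 0.42        # hypothetical PTB rates per mother

incremental_cost = cost_vp - cost_placebo          # negative => VP gel saves money
ptb_averted = ptb_rate_placebo - ptb_rate_vp       # PTBs averted per mother

# Cost (here, savings) per preterm birth averted.
icer = incremental_cost / ptb_averted
print(f"incremental cost per PTB averted: ${icer:,.0f}")
```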

  18. Long-term medical costs and life expectancy of acute myeloid leukemia: a probabilistic decision model.

    PubMed

    Wang, Han-I; Aas, Eline; Howell, Debra; Roman, Eve; Patmore, Russell; Jack, Andrew; Smith, Alexandra

    2014-03-01

    Acute myeloid leukemia (AML) can be diagnosed at any age and treatment, which can be given with supportive and/or curative intent, is considered expensive compared with that for other cancers. Despite this, no long-term predictive models have been developed for AML, mainly because of the complexities associated with this disease. The objective of the current study was to develop a model (based on a UK cohort) to predict cost and life expectancy at a population level. The model developed in this study combined a decision tree with several Markov models to reflect the complexity of the prognostic factors and treatments of AML. The model was simulated with a cycle length of 1 month for a time period of 5 years and further simulated until age 100 years or death. Results were compared for two age groups and five different initial treatment intents and responses. Transition probabilities, life expectancies, and costs were derived from a UK population-based specialist registry-the Haematological Malignancy Research Network (www.hmrn.org). Overall, expected 5-year medical costs and life expectancy ranged from £8,170 to £81,636 and 3.03 to 34.74 months, respectively. The economic and health outcomes varied with initial treatment intent, age at diagnosis, trial participation, and study time horizon. The model was validated by using face, internal, and external validation methods. The results show that the model captured more than 90% of the empirical costs, and it demonstrated good fit with the empirical overall survival. Costs and life expectancy of AML varied with patient characteristics and initial treatment intent. The robust AML model developed in this study could be used to evaluate new diagnostic tools/treatments, as well as enable policy makers to make informed decisions. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
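
    A minimal Markov cohort sketch of the general structure described (a monthly cycle accumulating costs and life expectancy); the states, transition probabilities, and costs are hypothetical and far simpler than the combined decision tree and Markov models of the study.

```python
import numpy as np

# States: remission, relapse, dead. Monthly transition matrix (hypothetical).
P = np.array([[0.97, 0.02, 0.01],
              [0.05, 0.85, 0.10],
              [0.00, 0.00, 1.00]])
monthly_cost = np.array([500.0, 4000.0, 0.0])     # cost per month in each state

state = np.array([1.0, 0.0, 0.0])                 # cohort starts in remission
total_cost, life_months = 0.0, 0.0
for _ in range(60):                               # 5-year horizon, monthly cycles
    total_cost += state @ monthly_cost
    life_months += state[:2].sum()                # months spent alive
    state = state @ P

print(f"expected 5-year cost: £{total_cost:,.0f}")
print(f"life expectancy within horizon: {life_months:.1f} months")
```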

  19. When to use discrete event simulation (DES) for the economic evaluation of health technologies? A review and critique of the costs and benefits of DES.

    PubMed

    Karnon, Jonathan; Haji Ali Afzali, Hossein

    2014-06-01

    Modelling in economic evaluation is an unavoidable fact of life. Cohort-based state transition models are most common, though discrete event simulation (DES) is increasingly being used to implement more complex model structures. The benefits of DES relate to the greater flexibility around the implementation and population of complex models, which may provide more accurate or valid estimates of the incremental costs and benefits of alternative health technologies. The costs of DES relate to the time and expertise required to implement and review complex models, when perhaps a simpler model would suffice. The costs are not borne solely by the analyst, but also by reviewers. In particular, modelled economic evaluations are often submitted to support reimbursement decisions for new technologies, for which detailed model reviews are generally undertaken on behalf of the funding body. This paper reports the results from a review of published DES-based economic evaluations. Factors underlying the use of DES were defined, and the characteristics of applied models were considered, to inform options for assessing the potential benefits of DES in relation to each factor. Four broad factors underlying the use of DES were identified: baseline heterogeneity, continuous disease markers, time varying event rates, and the influence of prior events on subsequent event rates. If relevant, individual-level data are available, representation of the four factors is likely to improve model validity, and it is possible to assess the importance of their representation in individual cases. A thorough model performance evaluation is required to overcome the costs of DES from the users' perspective, but few of the reviewed DES models reported such a process. More generally, further direct, empirical comparisons of complex models with simpler models would better inform the benefits of DES to implement more complex models, and the circumstances in which such benefits are most likely.

  20. Model of delivery consolidation of critical spare part : case study of an oil and gas company

    NASA Astrophysics Data System (ADS)

    Hartanto, D.; Agustinita, A.

    2018-04-01

    The availability of spare parts in the oil and gas industry is very important for preventing very high opportunity costs, that is, the losses incurred when exploitation equipment must stop because a spare part is unavailable. Availability is typically ensured by holding safety stock at a very high service level, which leads to high inventory costs. If the company wants to lower inventory costs, the choice is not to lower the service level but to lower the ordering cost. One component of the ordering cost is the delivery cost. Exploitation facilities are usually located in remote areas, so delivery costs are high. In addition, many spare parts are supplied by the same supplier. Therefore, there is an opportunity to lower spare part delivery costs through consolidation. In this paper, a mixed integer linear programming (MILP) model is developed to plan the procurement of spare parts so that inventory costs, which include holding and ordering costs, can be minimized. The model has been verified and validated. Using this model, the company can lower spare part inventory costs by 32%.
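
    A minimal MILP sketch of the consolidation idea, in which two parts from the same supplier share a fixed delivery cost whenever they are ordered in the same period; all data are hypothetical, the paper's model is more detailed, and the PuLP package (with its bundled CBC solver) is assumed to be available.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

T, parts = range(4), ["seal", "valve"]                    # periods and spare parts
demand = {"seal": [2, 1, 3, 2], "valve": [1, 2, 1, 1]}    # hypothetical demand
hold_cost, delivery_cost = 5.0, 100.0                     # per unit-period / per delivery

prob = LpProblem("consolidated_ordering", LpMinimize)
q = {(p, t): LpVariable(f"q_{p}_{t}", lowBound=0) for p in parts for t in T}
inv = {(p, t): LpVariable(f"inv_{p}_{t}", lowBound=0) for p in parts for t in T}
ship = {t: LpVariable(f"ship_{t}", cat=LpBinary) for t in T}   # one delivery serves all parts

prob += lpSum(hold_cost * inv[p, t] for p in parts for t in T) \
      + lpSum(delivery_cost * ship[t] for t in T)

big_m = 100
for p in parts:
    for t in T:
        prev = inv[p, t - 1] if t > 0 else 0
        prob += prev + q[p, t] - demand[p][t] == inv[p, t]     # inventory balance
        prob += q[p, t] <= big_m * ship[t]                     # order only if a delivery occurs

prob.solve()
print("total cost:", value(prob.objective))
print("deliveries in periods:", [t for t in T if ship[t].value() > 0.5])
```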

  1. Demonstrating the Alaska Ocean Observing System in Prince William Sound

    NASA Astrophysics Data System (ADS)

    Schoch, G. Carl; McCammon, Molly

    2013-07-01

    The Alaska Ocean Observing System and the Oil Spill Recovery Institute developed a demonstration project over a 5 year period in Prince William Sound. The primary goal was to develop a quasi-operational system that delivers weather and ocean information in near real time to diverse user communities. This observing system now consists of atmospheric and oceanic sensors, and a new generation of computer models to numerically simulate and forecast weather, waves, and ocean circulation. A state of the art data management system provides access to these products from one internet portal at http://www.aoos.org. The project culminated in a 2009 field experiment that evaluated the observing system and performance of the model forecasts. Observations from terrestrial weather stations and weather buoys validated atmospheric circulation forecasts. Observations from wave gages on weather buoys validated forecasts of significant wave heights and periods. There was an emphasis on validation of surface currents forecasted by the ocean circulation model for oil spill response and search and rescue applications. During the 18 day field experiment a radar array mapped surface currents and drifting buoys were deployed. Hydrographic profiles at fixed stations, and by autonomous vehicles along transects, were made to acquire measurements through the water column. Terrestrial weather stations were the most reliable and least costly to operate, and in situ ocean sensors were more costly and considerably less reliable. The radar surface current mappers were the least reliable and most costly but provided the assimilation and validation data that most improved ocean circulation forecasts. We describe the setting of Prince William Sound and the various observational platforms and forecast models of the observing system, and discuss recommendations for future development.

  2. [Cost-effectiveness of new drugs impacts reimbursement decision making but room for improvement].

    PubMed

    Hoomans, Ties; van der Roer, Nicole; Severens, Johan L; Delwel, Gepke O

    2010-01-01

    For new drugs to be included in appendix 1B of the drug reimbursement system, they must have proven added therapeutic value, an acceptable budget impact, and be cost-effective. To validate the latter, pharmacoeconomic evaluations have become mandatory. These evaluations should adhere to guidelines for pharmacoeconomic research. Our study evaluates: 1) the extent to which the pharmacoeconomic evaluations adhere to the pharmacoeconomic guidelines; 2) which guidelines are decisive in validating the cost-effectiveness of new drugs; and 3) the impact of pharmacoeconomics on the recommendations and final decision making on drug reimbursement. Retrospective, descriptive study. We examined all 1B requests for reimbursement submitted to the Dutch Health Care Insurance Board and the Medicinal Products Reimbursement Committee between 1 January 2005 and 30 September 2008 on which recommendations on drug reimbursement had been published (n = 21). Data on adherence to guidelines, validation of cost-effectiveness, and recommendations and decision making on drug reimbursement were extracted from publicly available sources by two independent evaluators. Quantitative and qualitative descriptive analyses were carried out. Since pharmacoeconomic evaluations became mandatory, these evaluations increasingly adhere to guidelines for pharmacoeconomic research. This was particularly true of the perspective chosen, the relevant treatment comparator, and the incremental and total analyses of costs and effects of the drugs under comparison. However, the cost-effectiveness of new drugs was often inadequately validated owing to incorrect indications for drug use and incorrect forms of evaluation or periods of analysis. In addition, costs and effects were not always correctly analysed, and not enough insight was provided into the analysis model used. Partly on the basis of pharmacoeconomics, 12 new drugs were reimbursed and 9 were not. The cost-effectiveness of new drugs and more valid pharmacoeconomic evaluations appear to play an ever more important role in reimbursement decision making and the pursuit of better and affordable health care.

  3. Laparoscopic Common Bile Duct Exploration Four-Task Training Model: Construct Validity

    PubMed Central

    Otaño, Natalia; Rodríguez, Omaira; Sánchez, Renata; Benítez, Gustavo; Schweitzer, Michael

    2012-01-01

    Background: Training models in laparoscopic surgery allow the surgical team to practice procedures in a safe environment. We have proposed the use of a 4-task, low-cost inert model to practice critical steps of laparoscopic common bile duct exploration. Methods: The performance of 3 groups with different levels of expertise in laparoscopic surgery, novices (A), intermediates (B), and experts (C), was evaluated using a low-cost inert model in the following tasks: (1) intraoperative cholangiography catheter insertion, (2) transcystic exploration, (3) T-tube placement, and (4) choledochoscope management. Kruskal-Wallis and Mann-Whitney tests were used to identify differences among the groups. Results: A total of 14 individuals were evaluated: 5 novices (A), 5 intermediates (B), and 4 experts (C). The results involving intraoperative cholangiography catheter insertion were similar among the 3 groups. As for the other tasks, the expert group had better results than the other 2 groups, between which no significant differences occurred. The proposed model is able to discriminate among individuals with different levels of expertise, indicating that the abilities that the model evaluates are relevant in the surgeon's performance in CBD exploration. Conclusions: Construct validity for tasks 2 and 3 was demonstrated. However, task 1 was not capable of distinguishing between groups, and task 4 was not statistically validated. PMID:22906323

  4. Heavy Metal Adsorption onto Kappaphycus sp. from Aqueous Solutions: The Use of Error Functions for Validation of Isotherm and Kinetics Models

    PubMed Central

    Rahman, Md. Sayedur; Sathasivam, Kathiresan V.

    2015-01-01

    Biosorption process is a promising technology for the removal of heavy metals from industrial wastes and effluents using low-cost and effective biosorbents. In the present study, adsorption of Pb2+, Cu2+, Fe2+, and Zn2+ onto dried biomass of red seaweed Kappaphycus sp. was investigated as a function of pH, contact time, initial metal ion concentration, and temperature. The experimental data were evaluated by four isotherm models (Langmuir, Freundlich, Temkin, and Dubinin-Radushkevich) and four kinetic models (pseudo-first-order, pseudo-second-order, Elovich, and intraparticle diffusion models). The adsorption process was feasible, spontaneous, and endothermic in nature. Functional groups in the biomass involved in metal adsorption process were revealed as carboxylic and sulfonic acids and sulfonate by Fourier transform infrared analysis. A total of nine error functions were applied to validate the models. We strongly suggest the analysis of error functions for validating adsorption isotherm and kinetic models using linear methods. The present work shows that the red seaweed Kappaphycus sp. can be used as a potentially low-cost biosorbent for the removal of heavy metal ions from aqueous solutions. Further study is warranted to evaluate its feasibility for the removal of heavy metals from the real environment. PMID:26295032
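
    A minimal sketch of the error-function comparison described, here evaluating a Langmuir fit with a few common error functions; the equilibrium data are synthetic, scipy is assumed to be available, and the paper itself applied nine error functions across four isotherm and four kinetic models.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    """Langmuir isotherm: qe = qmax*KL*Ce / (1 + KL*Ce)."""
    return qmax * KL * Ce / (1 + KL * Ce)

# Hypothetical equilibrium data: Ce (mg/L) and qe (mg/g).
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 120.0])
qe = np.array([8.1, 13.9, 21.5, 28.8, 34.0, 35.9])

(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=[40.0, 0.05])
pred = langmuir(Ce, qmax, KL)

errors = {
    "SSE":  np.sum((qe - pred) ** 2),                       # sum of squared errors
    "ARE":  100 / len(qe) * np.sum(np.abs(qe - pred) / qe), # average relative error, %
    "chi2": np.sum((qe - pred) ** 2 / pred),                # nonlinear chi-square
    "EABS": np.sum(np.abs(qe - pred)),                      # sum of absolute errors
}
print("qmax =", round(qmax, 1), "KL =", round(KL, 3))
print({k: round(v, 3) for k, v in errors.items()})
```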

  5. Heavy Metal Adsorption onto Kappaphycus sp. from Aqueous Solutions: The Use of Error Functions for Validation of Isotherm and Kinetics Models.

    PubMed

    Rahman, Md Sayedur; Sathasivam, Kathiresan V

    2015-01-01

    Biosorption process is a promising technology for the removal of heavy metals from industrial wastes and effluents using low-cost and effective biosorbents. In the present study, adsorption of Pb(2+), Cu(2+), Fe(2+), and Zn(2+) onto dried biomass of red seaweed Kappaphycus sp. was investigated as a function of pH, contact time, initial metal ion concentration, and temperature. The experimental data were evaluated by four isotherm models (Langmuir, Freundlich, Temkin, and Dubinin-Radushkevich) and four kinetic models (pseudo-first-order, pseudo-second-order, Elovich, and intraparticle diffusion models). The adsorption process was feasible, spontaneous, and endothermic in nature. Functional groups in the biomass involved in metal adsorption process were revealed as carboxylic and sulfonic acids and sulfonate by Fourier transform infrared analysis. A total of nine error functions were applied to validate the models. We strongly suggest the analysis of error functions for validating adsorption isotherm and kinetic models using linear methods. The present work shows that the red seaweed Kappaphycus sp. can be used as a potentially low-cost biosorbent for the removal of heavy metal ions from aqueous solutions. Further study is warranted to evaluate its feasibility for the removal of heavy metals from the real environment.

  6. Experience with botulinum toxin therapy for axillary hyperhidrosis and comparison to modelled data for endoscopic thoracic sympathectomy - A quality of life and cost effectiveness analysis.

    PubMed

    Gibbons, John P; Nugent, Emmeline; O'Donohoe, Nollaig; Maher, Barry; Egan, Bridget; Feeley, Martin; Tierney, Sean

    2016-10-01

    To estimate cost-effectiveness of botulinum toxin therapy for axillary hyperhidrosis compared to the standard surgical intervention of endoscopic thoracic sympathectomy (ETS). The validated dermatology life quality index questionnaire was given to patients attending for treatment over a 4 month period, to assess their quality of life (QoL) over the preceding week (n = 44). Follow-up was performed 4-6 weeks later by telephone using the same questionnaire to validate the effectiveness of the treatment. The duration of effect of the botulinum toxin treatment was also recorded and this data was used as the basis for cost effectiveness analysis. Using HIPE data, the baseline cost for single intervention using botulinum toxin and ETS was retrieved. Using figures provided by HIPE and expert opinion of the costs of complications, a stochastic model for 10,000 patients was used to evaluate the total costs for ETS including the complications. The results from the QoL analysis show that botulinum toxin therapy is a successful therapy for improvement of symptoms. It was revealed that the mean interval before recurrence of original symptoms after botulinum toxin therapy was 5.6 months. The baseline cost for both treatments are €389 for botulinum toxin and €9389 for uncomplicated ETS. The stochastic model yields a mean cost of €11,390 for ETS including complications. Treatments reached cost equivalence after 13.3 years. However, given the efficacy of the botulinum toxin therapy and the low risk we propose that botulinum toxin therapy for hyperhidrosis should be considered the gold standard. Copyright © 2015 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.

  7. Meeting the needs of an ever-demanding market.

    PubMed

    Rigby, Richard

    2002-04-01

    Balancing cost and performance in packaging is critical. This article outlines techniques to assist in this whilst delivering added value and product differentiation. The techniques include a rigorous statistical process capable of delivering cost reduction and improved quality and a computer modelling process that can save time when validating new packaging options.

  8. Development and evaluation of a calibration and validation procedure for microscopic simulation models.

    DOT National Transportation Integrated Search

    2004-01-01

    Microscopic traffic simulation models have been widely accepted and applied in transportation engineering and planning practice for the past decades because simulation is cost-effective, safe, and fast. To achieve high fidelity and credibility for a ...

  9. Southern Regional Center for Lightweight Innovative Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Paul T.

    The Southern Regional Center for Lightweight Innovative Design (SRCLID) has developed an experimentally validated cradle-to-grave modeling and simulation effort to optimize automotive components in order to decrease weight and cost, yet increase performance and safety in crash scenarios. In summary, the three major objectives of this project are accomplished: To develop experimentally validated cradle-to-grave modeling and simulation tools to optimize automotive and truck components for lightweighting materials (aluminum, steel, and Mg alloys and polymer-based composites) with consideration of uncertainty to decrease weight and cost, yet increase the performance and safety in impact scenarios; To develop multiscale computational models that quantify microstructure-property relations by evaluating various length scales, from the atomic through component levels, for each step of the manufacturing process for vehicles; and To develop an integrated K-12 educational program to educate students on lightweighting designs and impact scenarios. In this final report, we divided the content into two parts: the first part contains the development of building blocks for the project, including materials and process models, process-structure-property (PSP) relationship, and experimental validation capabilities; the second part presents the demonstration task for Mg front-end work associated with USAMP projects.

  10. A Layered Decision Model for Cost-Effective System Security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Huaqiang; Alves-Foss, James; Soule, Terry

    System security involves decisions in at least three areas: identification of well-defined security policies, selection of cost-effective defence strategies, and implementation of real-time defence tactics. Although choices made in each of these areas affect the others, existing decision models typically handle these three decision areas in isolation. There is no comprehensive tool that can integrate them to provide a single efficient model for safeguarding a network. In addition, there is no clear way to determine which particular combinations of defence decisions result in cost-effective solutions. To address these problems, this paper introduces a Layered Decision Model (LDM) for use in deciding how to address defence decisions based on their cost-effectiveness. To validate the LDM and illustrate how it is used, we used simulation to test model rationality and applied the LDM to the design of system security for an e-commercial business case.

  11. Markov modeling and discrete event simulation in health care: a systematic comparison.

    PubMed

    Standfield, Lachlan; Comans, Tracy; Scuffham, Paul

    2014-04-01

    The aim of this study was to assess if the use of Markov modeling (MM) or discrete event simulation (DES) for cost-effectiveness analysis (CEA) may alter healthcare resource allocation decisions. A systematic literature search and review of empirical and non-empirical studies comparing MM and DES techniques used in the CEA of healthcare technologies was conducted. Twenty-two pertinent publications were identified. Two publications compared MM and DES models empirically, one presented a conceptual DES and MM, two described a DES consensus guideline, and seventeen drew comparisons between MM and DES through the authors' experience. The primary advantages described for DES over MM were the ability to model queuing for limited resources, capture individual patient histories, accommodate complexity and uncertainty, represent time flexibly, model competing risks, and accommodate multiple events simultaneously. The disadvantages of DES over MM were the potential for model overspecification, increased data requirements, specialized expensive software, and increased model development, validation, and computational time. Where individual patient history is an important driver of future events an individual patient simulation technique like DES may be preferred over MM. Where supply shortages, subsequent queuing, and diversion of patients through other pathways in the healthcare system are likely to be drivers of cost-effectiveness, DES modeling methods may provide decision makers with more accurate information on which to base resource allocation decisions. Where these are not major features of the cost-effectiveness question, MM remains an efficient, easily validated, parsimonious, and accurate method of determining the cost-effectiveness of new healthcare interventions.

  12. Estimating the clinical and economic benefit associated with incremental improvements in sustained virologic response in chronic hepatitis C.

    PubMed

    McEwan, Phil; Ward, Thomas; Bennett, Hayley; Kalsekar, Anupama; Webster, Samantha; Brenner, Michael; Yuan, Yong

    2015-01-01

    Hepatitis C virus (HCV) infection is one of the principal causes of chronic liver disease. Successful treatment significantly decreases the risk of hepatic morbidity and mortality. Current standard of care achieves sustained virologic response (SVR) rates of 40-80%; however, the HCV therapy landscape is rapidly evolving. The objective of this study was to quantify the clinical and economic benefit associated with increasing levels of SVR. A published Markov model (MONARCH) that simulates the natural history of hepatitis C over a lifetime horizon was used. Discounted and non-discounted life-years (LYs), quality-adjusted life-years (QALYs) and cost of complication management were estimated for various plausible SVR rates. To demonstrate the robustness of projections obtained, the model was validated against ten UK-specific HCV studies. QALY estimates ranged from 18.0 years for those treated successfully in fibrosis stage F0 to 7.5 years (discounted) for patients in fibrosis stage F4 who remain untreated. Predicted QALY gains per 10% improvement in SVR ranged from 0.23 (F0) to 0.64 (F4) and 0.58 (F0) to 1.35 (F4) in 40-year-old patients (discounted and non-discounted results, respectively). In those aged 40, projected discounted HCV-related costs are minimised with successful treatment in F0/F1 (at approximately £ 300), increasing to £ 49,300 in F4 patients who remain untreated. Validation of the model against published UK cost-effectiveness studies produces R2 goodness-of-fit statistics of 0.988, 0.978, and 0.973 for total costs, QALYs, and incremental cost-effectiveness ratios, respectively. Projecting the long-term clinical and economic consequences associated with chronic hepatitis C is a necessary requirement for the evaluation of new treatments. The principal analysis demonstrates the significant impact on expected costs, LYs, and QALYs associated with increasing SVR. A validation analysis demonstrated the robustness of the results reported.

  13. Budget Impact of a Comprehensive Nutrition-Focused Quality Improvement Program for Malnourished Hospitalized Patients

    PubMed Central

    Sulo, Suela; Feldstein, Josh; Partridge, Jamie; Schwander, Bjoern; Sriram, Krishnan; Summerfelt, Wm. Thomas

    2017-01-01

    Background Nutrition interventions can alleviate the burden of malnutrition by improving patient outcomes; however, evidence on the economic impact of medical nutrition intervention remains limited. A previously published nutrition-focused quality improvement program targeting malnourished hospitalized patients showed that screening patients with a validated screening tool at admission, rapidly administering oral nutritional supplements, and educating patients on supplement adherence result in significant reductions in 30-day unplanned readmissions and hospital length of stay. Objectives To assess the potential cost-savings associated with decreased 30-day readmissions and hospital length of stay in malnourished inpatients through a nutrition-focused quality improvement program using a web-based budget impact model, and to demonstrate the clinical and fiscal value of the intervention. Methods The reduction in readmission rate and length of stay for 1269 patients enrolled in the quality improvement program (between October 13, 2014, and April 2, 2015) were compared with the pre–quality improvement program baseline and validation cohorts (4611 patients vs 1319 patients, respectively) to calculate potential cost-savings as well as to inform the design of the budget impact model. Readmission rate and length-of-stay reductions were calculated by determining the change from baseline to post–quality improvement program as well as the difference between the validation cohort and the post–quality improvement program, respectively. Results As a result of improved health outcomes for the treated patients, the nutrition-focused quality improvement program led to a reduction in 30-day hospital readmissions and length of stay. The avoided hospital readmissions and reduced number of days in the hospital for the patients in the quality improvement program resulted in cost-savings of $1,902,933 versus the pre–quality improvement program baseline cohort, and $4,896,758 versus the pre–quality improvement program in the validation cohort. When these costs were assessed across the entire patient population enrolled in the quality improvement program, per-patient net savings of $1499 when using the baseline cohort as the comparator and savings per patient treated of $3858 when using the validated cohort as the comparator were achieved. Conclusion The nutrition-focused quality improvement program reduced the per-patient healthcare costs by avoiding 30-day readmissions and through reduced length of hospital stay. These clinical and economic outcomes provide a rationale for merging patient care and financial modeling to advance the delivery of value-based medicine in a malnourished hospitalized population. The use of a novel web-based budget impact model supports the integration of comparative effectiveness analytics and healthcare resource management in the hospital setting to provide optimal quality of care at a reduced overall cost. PMID:28975010
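
    The per-patient savings reported follow from simple budget-impact arithmetic; a minimal sketch with hypothetical event rates and unit costs (only the cohort size is taken from the abstract; the program's actual rates and unit costs are not restated there):

```python
n_patients = 1269                                   # QIP cohort size (from the abstract)

# Hypothetical baseline vs post-program event rates and unit costs.
readmit_rate_before, readmit_rate_after = 0.22, 0.17
los_days_before, los_days_after = 7.2, 6.4
cost_per_readmission, cost_per_bed_day = 12000.0, 2000.0

avoided_readmissions = n_patients * (readmit_rate_before - readmit_rate_after)
avoided_bed_days = n_patients * (los_days_before - los_days_after)

total_savings = (avoided_readmissions * cost_per_readmission
                 + avoided_bed_days * cost_per_bed_day)
print(f"total savings: ${total_savings:,.0f}")
print(f"per-patient net savings: ${total_savings / n_patients:,.0f}")
```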

  14. Budget Impact of a Comprehensive Nutrition-Focused Quality Improvement Program for Malnourished Hospitalized Patients.

    PubMed

    Sulo, Suela; Feldstein, Josh; Partridge, Jamie; Schwander, Bjoern; Sriram, Krishnan; Summerfelt, Wm Thomas

    2017-07-01

    Nutrition interventions can alleviate the burden of malnutrition by improving patient outcomes; however, evidence on the economic impact of medical nutrition intervention remains limited. A previously published nutrition-focused quality improvement program targeting malnourished hospitalized patients showed that screening patients with a validated screening tool at admission, rapidly administering oral nutritional supplements, and educating patients on supplement adherence result in significant reductions in 30-day unplanned readmissions and hospital length of stay. To assess the potential cost-savings associated with decreased 30-day readmissions and hospital length of stay in malnourished inpatients through a nutrition-focused quality improvement program using a web-based budget impact model, and to demonstrate the clinical and fiscal value of the intervention. The reduction in readmission rate and length of stay for 1269 patients enrolled in the quality improvement program (between October 13, 2014, and April 2, 2015) were compared with the pre-quality improvement program baseline and validation cohorts (4611 patients vs 1319 patients, respectively) to calculate potential cost-savings as well as to inform the design of the budget impact model. Readmission rate and length-of-stay reductions were calculated by determining the change from baseline to post-quality improvement program as well as the difference between the validation cohort and the post-quality improvement program, respectively. As a result of improved health outcomes for the treated patients, the nutrition-focused quality improvement program led to a reduction in 30-day hospital readmissions and length of stay. The avoided hospital readmissions and reduced number of days in the hospital for the patients in the quality improvement program resulted in cost-savings of $1,902,933 versus the pre-quality improvement program baseline cohort, and $4,896,758 versus the pre-quality improvement program in the validation cohort. When these costs were assessed across the entire patient population enrolled in the quality improvement program, per-patient net savings of $1499 when using the baseline cohort as the comparator and savings per patient treated of $3858 when using the validated cohort as the comparator were achieved. The nutrition-focused quality improvement program reduced the per-patient healthcare costs by avoiding 30-day readmissions and through reduced length of hospital stay. These clinical and economic outcomes provide a rationale for merging patient care and financial modeling to advance the delivery of value-based medicine in a malnourished hospitalized population. The use of a novel web-based budget impact model supports the integration of comparative effectiveness analytics and healthcare resource management in the hospital setting to provide optimal quality of care at a reduced overall cost.
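    The per-patient figures quoted above follow directly from dividing the cohort-level savings by the 1269 patients enrolled in the quality improvement program; a quick check of that arithmetic is shown below.

```python
# Quick check of the per-patient savings arithmetic using the cohort-level totals
# reported above; minor rounding differences from the published figures are expected.
patients_in_qip = 1269

savings_vs_baseline = 1_902_933     # USD, vs the pre-QIP baseline cohort
savings_vs_validation = 4_896_758   # USD, vs the pre-QIP validation cohort

print(f"per patient vs baseline cohort:   ${savings_vs_baseline / patients_in_qip:,.0f}")
print(f"per patient vs validation cohort: ${savings_vs_validation / patients_in_qip:,.0f}")
# prints roughly $1,500 and $3,859, in line with the ~$1,499 and ~$3,858 reported
```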

  15. Control Oriented Modeling and Validation of Aeroservoelastic Systems

    NASA Technical Reports Server (NTRS)

    Crowder, Marianne; deCallafon, Raymond (Principal Investigator)

    2002-01-01

    Lightweight aircraft design emphasizes the reduction of structural weight to maximize aircraft efficiency and agility at the cost of increasing the likelihood of structural dynamic instabilities. To ensure flight safety, extensive flight testing and active structural servo control strategies are required to explore and expand the boundary of the flight envelope. Aeroservoelastic (ASE) models can provide online flight monitoring of dynamic instabilities to reduce flight testing time and increase flight safety. The success of ASE models is determined by the ability to take into account varying flight conditions and to perform flight monitoring in the presence of active structural servo control strategies. In this continued study, these aspects are addressed by developing specific methodologies and algorithms for control-relevant robust identification and model validation of aeroservoelastic structures. The closed-loop robust identification and model validation are based on a fractional model approach where the model uncertainties are characterized in a closed-loop relevant way.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rainer, Leo I.; Hoeschele, Marc A.; Apte, Michael G.

    This report addresses the results of detailed monitoring completed under Program Element 6 of Lawrence Berkeley National Laboratory's High Performance Commercial Building Systems (HPCBS) PIER program. The purpose of the Energy Simulations and Projected State-Wide Energy Savings project is to develop reasonable energy performance and cost models for high performance relocatable classrooms (RCs) across California climates. A key objective of the energy monitoring was to validate DOE2 simulations for comparison to initial DOE2 performance projections. The validated DOE2 model was then used to develop statewide savings projections by modeling base case and high performance RC operation in the 16 California climate zones. The primary objective of this phase of work was to utilize detailed field monitoring data to modify DOE2 inputs and generate performance projections based on a validated simulation model. Additional objectives include the following: (1) Obtain comparative performance data on base case and high performance HVAC systems to determine how they are operated, how they perform, and how the occupants respond to the advanced systems. This was accomplished by installing both HVAC systems side-by-side (i.e., one per module of a standard two-module, 24 ft by 40 ft RC) on the study RCs and switching HVAC operating modes on a weekly basis. (2) Develop projected statewide energy and demand impacts based on the validated DOE2 model. (3) Develop cost effectiveness projections for the high performance HVAC system in the 16 California climate zones.

  17. The economics of improving medication adherence in osteoporosis: validation and application of a simulation model.

    PubMed

    Patrick, Amanda R; Schousboe, John T; Losina, Elena; Solomon, Daniel H

    2011-09-01

    Adherence to osteoporosis treatment is low. Although new therapies and behavioral interventions may improve medication adherence, questions are likely to arise regarding their cost-effectiveness. Our objectives were to develop and validate a model to simulate the clinical outcomes and costs arising from various osteoporosis medication adherence patterns among women initiating bisphosphonate treatment and to estimate the cost-effectiveness of a hypothetical intervention to improve medication adherence. We constructed a computer simulation using estimates of fracture rates, bisphosphonate treatment effects, costs, and utilities for health states drawn from the published literature. Probabilities of transitioning on and off treatment were estimated from administrative claims data. Patients were women initiating bisphosphonate therapy from the general community. We evaluated a hypothetical behavioral intervention to improve medication adherence. Changes in 10-yr fracture rates and incremental cost-effectiveness ratios were evaluated. A hypothetical intervention with a one-time cost of $250 and reducing bisphosphonate discontinuation by 30% had an incremental cost-effectiveness ratio (ICER) of $29,571 per quality-adjusted life year in 65-yr-old women initiating bisphosphonates. Although the ICER depended on patient age, intervention effectiveness, and intervention cost, the ICERs were less than $50,000 per quality-adjusted life year for the majority of intervention cost and effectiveness scenarios evaluated. Results were sensitive to bisphosphonate cost and effectiveness and assumptions about the rate at which intervention and treatment effects decline over time. Our results suggest that behavioral interventions to improve osteoporosis medication adherence will likely have favorable ICERs if their efficacy can be sustained.
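    The headline ratio is simply incremental cost divided by incremental QALYs. The sketch below reproduces that arithmetic; apart from the $250 intervention cost, the cost offset and QALY gain are hypothetical values chosen only so the ratio lands near the reported base case.

```python
# Illustrative ICER arithmetic. Only the $250 one-time intervention cost comes from
# the abstract; the downstream cost offset and QALY gain are hypothetical values
# chosen so the ratio lands near the reported base-case figure.
def icer(delta_cost, delta_qalys):
    """Incremental cost-effectiveness ratio: extra dollars per extra QALY."""
    return delta_cost / delta_qalys

intervention_cost = 250.0        # one-time cost of the adherence intervention
cost_offset = -28.0              # hypothetical savings from fractures avoided
incremental_qalys = 0.00751      # hypothetical QALY gain per 65-yr-old woman

print(f"ICER ≈ ${icer(intervention_cost + cost_offset, incremental_qalys):,.0f} per QALY")
```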

  18. System capacity and economic modeling computer tool for satellite mobile communications systems

    NASA Technical Reports Server (NTRS)

    Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.

    1988-01-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 123 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.

  19. Using risk-adjustment models to identify high-cost risks.

    PubMed

    Meenan, Richard T; Goodman, Michael J; Fishman, Paul A; Hornbrook, Mark C; O'Keeffe-Rosetti, Maureen C; Bachman, Donald J

    2003-11-01

    We examine the ability of various publicly available risk models to identify high-cost individuals and enrollee groups using multi-HMO administrative data. Five risk-adjustment models (the Global Risk-Adjustment Model [GRAM], Diagnostic Cost Groups [DCGs], Adjusted Clinical Groups [ACGs], RxRisk, and Prior-expense) were estimated on a multi-HMO administrative data set of 1.5 million individual-level observations for 1995-1996. Models produced distributions of individual-level annual expense forecasts for comparison to actual values. Prespecified "high-cost" thresholds were set within each distribution. The area under the receiver operating characteristic curve (AUC) for "high-cost" prevalences of 1% and 0.5% was calculated, as was the proportion of "high-cost" dollars correctly identified. Results are based on a separate 106,000-observation validation dataset. For "high-cost" prevalence targets of 1% and 0.5%, ACGs, DCGs, GRAM, and Prior-expense are very comparable in overall discrimination (AUCs, 0.83-0.86). Given a 0.5% prevalence target and a 0.5% prediction threshold, DCGs, GRAM, and Prior-expense captured $963,000 (approximately 3%) more "high-cost" sample dollars than other models. DCGs captured the most "high-cost" dollars among enrollees with asthma, diabetes, and depression; predictive performance among demographic groups (Medicaid members, members over 64, and children under 13) varied across models. Risk models can efficiently identify enrollees who are likely to generate future high costs and who could benefit from case management. The dollar value of improved prediction performance of the most accurate risk models should be meaningful to decision-makers and encourage their broader use for identifying high costs.
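    The two evaluation metrics used here, the AUC for flagging members above a high-cost prevalence threshold and the share of high-cost dollars captured by the top forecasts, can be sketched as follows on synthetic data; the simulated forecasts stand in for a fitted risk-adjustment model.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Sketch of the two evaluation metrics described above, on synthetic data: AUC for
# flagging members above a "high-cost" prevalence threshold, and the share of
# high-cost dollars captured by the top forecasts. The forecasts here are simulated
# noise around the true costs, standing in for a fitted risk-adjustment model.
rng = np.random.default_rng(0)
n = 100_000
true_cost = rng.lognormal(mean=7.0, sigma=1.2, size=n)             # skewed annual expense
forecast = true_cost * rng.lognormal(mean=0.0, sigma=0.8, size=n)  # noisy prediction

threshold = np.quantile(true_cost, 0.995)      # 0.5% "high-cost" prevalence target
is_high_cost = true_cost >= threshold

auc = roc_auc_score(is_high_cost, forecast)

flagged = forecast >= np.quantile(forecast, 0.995)   # flag the top 0.5% by forecast
dollars_captured = true_cost[flagged & is_high_cost].sum() / true_cost[is_high_cost].sum()

print(f"AUC = {auc:.2f}, high-cost dollars captured = {dollars_captured:.1%}")
```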

  20. Kathleen Krah | NREL

    Science.gov Websites

    tool validation. Prior to joining NREL, her master's thesis focused on a spatial analysis of strategies. Her undergraduate thesis focused on techno-economic and logistical cost modeling of offshore wind

  1. Modeling Imperfect Generator Behavior in Power System Operation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krad, Ibrahim

    A key component in power system operations is the use of computer models to quickly study and analyze different operating conditions and futures in an efficient manner. The outputs of these models are sensitive to the data used in them as well as the assumptions made during their execution. One typical assumption is that generators and load assets perfectly follow operator control signals. While this is a valid simulation assumption, generators may not always accurately follow control signals. This imperfect response of generators could impact cost and reliability metrics. This paper proposes a generator model that captures this imperfect behavior and examines its impact on production costs and reliability metrics using a steady-state power system operations model. Preliminary analysis shows that while costs remain relatively unchanged, there could be significant impacts on reliability metrics.

  2. Remote sensing validation through SOOP technology: implementation of Spectra system

    NASA Astrophysics Data System (ADS)

    Piermattei, Viviana; Madonia, Alice; Bonamano, Simone; Consalvi, Natalizia; Caligiore, Aurelio; Falcone, Daniela; Puri, Pio; Sarti, Fabio; Spaccavento, Giovanni; Lucarini, Diego; Pacci, Giacomo; Amitrano, Luigi; Iacullo, Salvatore; D'Andrea, Salvatore; Marcelli, Marco

    2017-04-01

    The development of low-cost instrumentation plays a key role in marine environmental studies and represents one of the most innovative aspects of marine research. The availability of low-cost technologies allows the realization of extended observatory networks for the study of marine phenomena through an integrated approach merging observations, remote sensing and operational oceanography. Marine services and practical applications critically depend on the availability of large amounts of data collected with sufficiently dense spatial and temporal sampling. This issue directly influences the robustness of both ocean forecasting models and remote sensing observations through data assimilation and validation processes, particularly in the biological domain. For this reason, it is necessary to develop cheap, small, integrated smart sensors that can serve both satellite data validation and forecasting-model data assimilation, as well as support early warning systems for environmental pollution control and prevention. This is particularly true in coastal areas, which are subjected to multiple anthropic pressures. Moreover, coastal waters can be classified as case 2 waters, where the optical properties of inorganic suspended matter and chromophoric dissolved organic matter must be considered and separated from the chlorophyll a contribution. Due to the high costs of mooring systems, research vessels, measurement platforms, and instrumentation, a major effort was dedicated to the design, development, and realization of a new low-cost mini-FerryBox system: Spectra. Thanks to its modularity and ease of use, Spectra allows continuous in situ measurements of temperature, conductivity, turbidity, chlorophyll a, and chromophoric dissolved organic matter (CDOM) fluorescence to be acquired from voluntary vessels, even by non-specialized operators (Marcelli et al., 2014; 2016). This work shows the preliminary application of this technology to remote sensing data validation.

  3. Cross Validation Through Two-Dimensional Solution Surface for Cost-Sensitive SVM.

    PubMed

    Gu, Bin; Sheng, Victor S; Tay, Keng Yeow; Romano, Walter; Li, Shuo

    2017-06-01

    Model selection plays an important role in cost-sensitive SVM (CS-SVM). It has been proven that the global minimum cross-validation (CV) error can be efficiently computed based on the solution path for one-parameter learning problems. However, it is a challenge to obtain the global minimum CV error for CS-SVM based on a one-dimensional solution path and traditional grid search, because CS-SVM has two regularization parameters. In this paper, we propose a solution- and error-surface-based CV approach (CV-SES). More specifically, we first compute a two-dimensional solution surface for CS-SVM based on a bi-parameter space partition algorithm, which can fit solutions of CS-SVM for all values of both regularization parameters. Then, we compute a two-dimensional validation error surface for each CV fold, which can fit validation errors of CS-SVM for all values of both regularization parameters. Finally, we obtain the CV error surface by superposing K validation error surfaces, which can find the global minimum CV error of CS-SVM. Experiments are conducted on seven datasets for cost-sensitive learning and on four datasets for imbalanced learning. Experimental results not only show that our proposed CV-SES has a better generalization ability than CS-SVM with various hybrids between grid search and solution path methods, and than the recently proposed cost-sensitive hinge loss SVM with three-dimensional grid search, but also show that CV-SES requires less running time.
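    The baseline that CV-SES improves upon is an ordinary grid search over the two regularization parameters with K-fold CV as the selection criterion. A minimal sketch of that baseline, using scikit-learn's SVC with a class-weight ratio standing in for the second cost parameter, is shown below; the exact solution/error-surface computation of CV-SES itself is not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# The baseline that CV-SES improves on: a plain grid search over the two CS-SVM
# regularization parameters (expressed here as C and a positive-class cost weight),
# scored by K-fold cross-validation. CV-SES replaces this coarse grid with an exact
# two-dimensional solution/error surface; that algorithm is not reproduced here.
X, y = make_classification(n_samples=600, weights=[0.9, 0.1], random_state=0)

param_grid = {
    "C": [0.1, 1, 10, 100],
    "class_weight": [{1: w} for w in (1, 2, 5, 10)],   # misclassification cost ratios
}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5, scoring="balanced_accuracy")
search.fit(X, y)
print(search.best_params_, f"CV score = {search.best_score_:.3f}")
```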

  4. A quasi-Monte-Carlo comparison of parametric and semiparametric regression methods for heavy-tailed and non-normal data: an application to healthcare costs.

    PubMed

    Jones, Andrew M; Lomas, James; Moore, Peter T; Rice, Nigel

    2016-10-01

    We conduct a quasi-Monte-Carlo comparison of the recent developments in parametric and semiparametric regression methods for healthcare costs, both against each other and against standard practice. The population of English National Health Service hospital in-patient episodes for the financial year 2007-2008 (summed for each patient) is randomly divided into two equally sized subpopulations to form an estimation set and a validation set. Evaluating out-of-sample using the validation set, a conditional density approximation estimator shows considerable promise in forecasting conditional means, performing best for accuracy of forecasting and among the best four for bias and goodness of fit. The best performing model for bias is linear regression with square-root-transformed dependent variables, whereas a generalized linear model with square-root link function and Poisson distribution performs best in terms of goodness of fit. Commonly used models utilizing a log-link are shown to perform badly relative to other models considered in our comparison.
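    The estimation/validation-set protocol described above can be sketched on synthetic skewed cost data: fit competing estimators on one half, then compare out-of-sample bias and fit on the other. The covariates, the two illustrative estimators (raw OLS and square-root OLS with a naive retransformation), and all figures below are stand-ins, not the NHS episode data or the full set of methods compared in the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Sketch of the estimation/validation-set protocol on synthetic skewed cost data:
# fit competing estimators on one half, compare out-of-sample bias and RMSE on the
# other. Raw OLS and square-root OLS with a naive retransformation are shown purely
# as placeholders for the richer set of estimators compared in the study.
rng = np.random.default_rng(1)
n = 20_000
x = rng.normal(size=(n, 3))                                          # stand-in covariates
cost = 100 * np.exp(1.0 + x @ np.array([0.5, 0.3, 0.2]) + rng.normal(0, 1.0, n))

half = n // 2
X_est, X_val, y_est, y_val = x[:half], x[half:], cost[:half], cost[half:]

ols = LinearRegression().fit(X_est, y_est)
sqrt_ols = LinearRegression().fit(X_est, np.sqrt(y_est))

predictions = {
    "raw OLS": ols.predict(X_val),
    "sqrt OLS": sqrt_ols.predict(X_val) ** 2,   # naive retransformation (ignores Jensen bias)
}
for name, pred in predictions.items():
    bias = np.mean(pred - y_val)
    rmse = np.sqrt(np.mean((pred - y_val) ** 2))
    print(f"{name:8s}  bias = {bias:10.1f}  RMSE = {rmse:10.1f}")
```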

  5. A multimodal logistics service network design with time windows and environmental concerns

    PubMed Central

    Zhang, Dezhi; He, Runzhong; Wang, Zhongwei

    2017-01-01

    The design of a multimodal logistics service network with customer service time windows and environmental costs is an important and challenging issue. Accordingly, this work established a model to minimize the total cost of multimodal logistics service network design with time windows and environmental concerns. The proposed model incorporates CO2 emission costs to determine the optimal transportation mode combinations and investment selections for transfer nodes, which consider transport cost, transport time, carbon emission, and logistics service time window constraints. Furthermore, genetic and heuristic algorithms are proposed to solve the abovementioned optimization model. A numerical example is provided to validate the model and the abovementioned two algorithms. Then, comparisons of the performance of the two algorithms are provided. Finally, this work investigates the effects of the logistics service time windows and CO2 emission taxes on the optimal solution. Several important management insights are obtained. PMID:28934272

  6. Low-Cost Optical Mapping Systems for Panoramic Imaging of Complex Arrhythmias and Drug-Action in Translational Heart Models

    NASA Astrophysics Data System (ADS)

    Lee, Peter; Calvo, Conrado J.; Alfonso-Almazán, José M.; Quintanilla, Jorge G.; Chorro, Francisco J.; Yan, Ping; Loew, Leslie M.; Filgueiras-Rama, David; Millet, José

    2017-02-01

    Panoramic optical mapping is the primary method for imaging electrophysiological activity from the entire outer surface of Langendorff-perfused hearts. To date, it is the only method of simultaneously measuring multiple key electrophysiological parameters, such as transmembrane voltage and intracellular free calcium, at high spatial and temporal resolution. Despite the impact it has already had on the fields of cardiac arrhythmias and whole-heart computational modeling, present-day system designs preclude its adoption by the broader cardiovascular research community because of their high costs. Taking advantage of recent technological advances, we developed and validated low-cost optical mapping systems for panoramic imaging using Langendorff-perfused pig hearts, a clinically-relevant model in basic research and bioengineering. By significantly lowering financial thresholds, this powerful cardiac electrophysiology imaging modality may gain wider use in research and, even, teaching laboratories, which we substantiated using the lower-cost Langendorff-perfused rabbit heart model.

  7. Low-Cost Optical Mapping Systems for Panoramic Imaging of Complex Arrhythmias and Drug-Action in Translational Heart Models.

    PubMed

    Lee, Peter; Calvo, Conrado J; Alfonso-Almazán, José M; Quintanilla, Jorge G; Chorro, Francisco J; Yan, Ping; Loew, Leslie M; Filgueiras-Rama, David; Millet, José

    2017-02-27

    Panoramic optical mapping is the primary method for imaging electrophysiological activity from the entire outer surface of Langendorff-perfused hearts. To date, it is the only method of simultaneously measuring multiple key electrophysiological parameters, such as transmembrane voltage and intracellular free calcium, at high spatial and temporal resolution. Despite the impact it has already had on the fields of cardiac arrhythmias and whole-heart computational modeling, present-day system designs preclude its adoption by the broader cardiovascular research community because of their high costs. Taking advantage of recent technological advances, we developed and validated low-cost optical mapping systems for panoramic imaging using Langendorff-perfused pig hearts, a clinically-relevant model in basic research and bioengineering. By significantly lowering financial thresholds, this powerful cardiac electrophysiology imaging modality may gain wider use in research and, even, teaching laboratories, which we substantiated using the lower-cost Langendorff-perfused rabbit heart model.

  8. A multimodal logistics service network design with time windows and environmental concerns.

    PubMed

    Zhang, Dezhi; He, Runzhong; Li, Shuangyan; Wang, Zhongwei

    2017-01-01

    The design of a multimodal logistics service network with customer service time windows and environmental costs is an important and challenging issue. Accordingly, this work established a model to minimize the total cost of multimodal logistics service network design with time windows and environmental concerns. The proposed model incorporates CO2 emission costs to determine the optimal transportation mode combinations and investment selections for transfer nodes, which consider transport cost, transport time, carbon emission, and logistics service time window constraints. Furthermore, genetic and heuristic algorithms are proposed to solve the abovementioned optimization model. A numerical example is provided to validate the model and the abovementioned two algorithms. Then, comparisons of the performance of the two algorithms are provided. Finally, this work investigates the effects of the logistics service time windows and CO2 emission taxes on the optimal solution. Several important management insights are obtained.
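    At its core, the objective trades transport cost against a CO2 emission charge subject to a delivery time window. The toy single-shipment version below, with invented mode parameters and tax rate, illustrates that trade-off; the paper's full network design model and its genetic/heuristic solution algorithms are much richer.

```python
# Toy single-shipment version of the trade-off at the heart of the model above:
# choose the transport mode minimising transport cost plus a CO2 emission charge,
# subject to a delivery time window. Mode parameters and the tax rate are invented.
MODES = {
    #         $ per tonne-km   hours per km   kg CO2 per tonne-km
    "road":  {"cost": 0.12, "time_per_km": 1 / 70, "co2": 0.062},
    "rail":  {"cost": 0.05, "time_per_km": 1 / 40, "co2": 0.022},
    "water": {"cost": 0.03, "time_per_km": 1 / 20, "co2": 0.016},
}

def best_mode(distance_km, tonnes, deadline_h, co2_tax_per_kg=0.05):
    feasible = {}
    for mode, p in MODES.items():
        if distance_km * p["time_per_km"] > deadline_h:
            continue                                       # violates the time window
        transport = p["cost"] * distance_km * tonnes
        emission_charge = p["co2"] * distance_km * tonnes * co2_tax_per_kg
        feasible[mode] = transport + emission_charge
    return min(feasible.items(), key=lambda kv: kv[1])

print(best_mode(distance_km=800, tonnes=20, deadline_h=30))   # water is too slow here
```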

  9. Predicting Production Costs for Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Samareh, J. A.; Weston, R. P.

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This paper outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high-payoff MDO problems. Another important consideration in this paper is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool.
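    The roll-up itself is a simple sum over elements, with a complexity modulus scaling each element's process cost. The sketch below uses invented geometry, rates and moduli purely to show the structure of such a roll-up; it does not reproduce the paper's first-order dynamics constants.

```python
# Sketch of a process-based (rather than weight-based) cost roll-up: each structural
# element gets a process cost driven by its geometry, scaled by a "cost modulus" for
# material, shape and precision. All numbers below are hypothetical placeholders; the
# paper's first-order dynamics constants are not reproduced.
ELEMENTS = [
    # name,            area m^2,  rate $/hr,  hr per m^2,  complexity modulus
    ("wing skin",         42.0,       95.0,        1.8,        1.4),
    ("fuselage barrel",   60.0,       95.0,        1.5,        1.2),
    ("cryo tank dome",    18.0,      110.0,        3.2,        2.1),
]

def element_cost(area, rate, hours_per_m2, modulus):
    return area * hours_per_m2 * rate * modulus

total = 0.0
for name, area, rate, hours, modulus in ELEMENTS:
    cost = element_cost(area, rate, hours, modulus)
    total += cost
    print(f"{name:16s} ${cost:>10,.0f}")
print(f"{'total':16s} ${total:>10,.0f}")
```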

  10. Flood loss modelling with FLF-IT: a new flood loss function for Italian residential structures

    NASA Astrophysics Data System (ADS)

    Hasanzadeh Nafari, Roozbeh; Amadio, Mattia; Ngo, Tuan; Mysiak, Jaroslav

    2017-07-01

    The damage triggered by different flood events costs the Italian economy millions of euros each year. This cost is likely to increase in the future due to climate variability and economic development. In order to avoid or reduce such significant financial losses, risk management requires tools which can provide a reliable estimate of potential flood impacts across the country. Flood loss functions are an internationally accepted method for estimating physical flood damage in urban areas. In this study, we derived a new flood loss function for Italian residential structures (FLF-IT), on the basis of empirical damage data collected from a recent flood event in the region of Emilia-Romagna. The function was developed based on a new Australian approach (FLFA), which represents the confidence limits that exist around the parameterized functional depth-damage relationship. After model calibration, the performance of the model was validated for the prediction of loss ratios and absolute damage values. It was also contrasted with an uncalibrated relative model that is frequently used in Europe. In this regard, a three-fold cross-validation procedure was carried out over the empirical sample to measure the range of uncertainty from the actual damage data. The predictive capability has also been studied for some sub-classes of water depth. The validation procedure shows that the newly derived function performs well (no bias and only 10% mean absolute error), especially when the water depth is high. Results of these validation tests illustrate the importance of model calibration. The advantages of the FLF-IT model over other Italian models include calibration with empirical data, consideration of the epistemic uncertainty of data, and the ability to change parameters based on building practices across Italy.
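    The core procedure, fitting a parameterized depth-damage curve to empirical loss points and checking it with three-fold cross-validation, can be sketched as follows. The saturating functional form and the synthetic survey points are illustrative assumptions, not the FLFA form or the Emilia-Romagna data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of fitting a parameterized depth-damage curve and checking it with 3-fold
# cross-validation. The saturating functional form and the synthetic loss points are
# illustrative assumptions, not the FLFA form or the Emilia-Romagna survey data.
def loss_ratio(depth_m, a, b):
    """Loss ratio in [0, 1] that rises with water depth and saturates."""
    return 1.0 - np.exp(-a * depth_m ** b)

rng = np.random.default_rng(2)
depth = rng.uniform(0.1, 3.0, 300)
observed = np.clip(loss_ratio(depth, 0.35, 1.1) + rng.normal(0, 0.05, 300), 0, 1)

folds = np.array_split(rng.permutation(300), 3)
errors = []
for k in range(3):
    test = folds[k]
    train = np.concatenate([folds[j] for j in range(3) if j != k])
    params, _ = curve_fit(loss_ratio, depth[train], observed[train], p0=(0.3, 1.0))
    errors.append(np.mean(np.abs(loss_ratio(depth[test], *params) - observed[test])))

print(f"3-fold mean absolute error of the predicted loss ratio: {np.mean(errors):.3f}")
```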

  11. Computational-experimental approach to drug-target interaction mapping: A case study on kinase inhibitors

    PubMed Central

    Ravikumar, Balaguru; Parri, Elina; Timonen, Sanna; Airola, Antti; Wennerberg, Krister

    2017-01-01

    Due to the relatively high costs and labor required for experimental profiling of the full target space of chemical compounds, various machine learning models have been proposed as cost-effective means to advance this process in terms of predicting the most potent compound-target interactions for subsequent verification. However, most of the model predictions lack direct experimental validation in the laboratory, making their practical benefits for drug discovery or repurposing applications largely unknown. Here, we therefore introduce and carefully test a systematic computational-experimental framework for the prediction and pre-clinical verification of drug-target interactions using a well-established kernel-based regression algorithm as the prediction model. To evaluate its performance, we first predicted unmeasured binding affinities in a large-scale kinase inhibitor profiling study, and then experimentally tested 100 compound-kinase pairs. The relatively high correlation of 0.77 (p < 0.0001) between the predicted and measured bioactivities supports the potential of the model for filling the experimental gaps in existing compound-target interaction maps. Further, we subjected the model to the more challenging task of predicting target interactions for a new candidate drug compound that lacks prior binding profile information. As a specific case study, we used tivozanib, an investigational VEGF receptor inhibitor with a currently unknown off-target profile. Among 7 kinases with high predicted affinity, we experimentally validated 4 new off-targets of tivozanib, namely the Src-family kinases FRK and FYN A, the non-receptor tyrosine kinase ABL1, and the serine/threonine kinase SLK. Our subsequent experimental validation protocol effectively avoids any possible information leakage between the training and validation data, and therefore enables rigorous model validation for practical applications. These results demonstrate that the kernel-based modeling approach offers practical benefits for probing novel insights into the mode of action of investigational compounds, and for the identification of new target selectivities for drug repurposing applications. PMID:28787438

  12. Screening of healthcare workers for tuberculosis: development and validation of a new health economic model to inform practice

    PubMed Central

    Eralp, Merve Nazli; Scholtes, Stefan; Martell, Geraldine; Winter, Robert

    2012-01-01

    Background Methods for determining cost-effectiveness of different treatments are well established, unlike appraisal of non-drug interventions, including novel diagnostics and biomarkers. Objective The authors develop and validate a new health economic model by comparing cost-effectiveness of tuberculin skin test (TST); blood test, interferon-gamma release assay (IGRA) and TST followed by IGRA in conditional sequence, in screening healthcare workers for latent or active tuberculosis (TB). Design The authors focus on healthy life years gained as the benefit metric, rather than quality-adjusted life years given limited data to estimate quality adjustments of life years with TB and complications of treatment, like hepatitis. Healthy life years gained refer to the number of TB or hepatitis cases avoided and the increase in life expectancy. The authors incorporate disease and test parameters informed by systematic meta-analyses and clinical practice. Health and economic outcomes of each strategy are modelled as a decision tree in Markov chains, representing different health states informed by epidemiology. Cost and effectiveness values are generated as the individual is cycled through 20 years of the model. Key parameters undergo one-way and Monte Carlo probabilistic sensitivity analyses. Setting Screening healthcare workers in secondary and tertiary care. Results IGRA is the most effective strategy, with incremental costs per healthy life year gained of £10 614–£20 929, base case, £8021–£18 348, market costs TST £45, IGRA £90, IGRA specificities of 99%–97%; mean (5%, 95%), £12 060 (£4137–£38 418) by Monte Carlo analysis. Conclusions Incremental costs per healthy life year gained, a conservative estimate of benefit, are comparable to the £20 000–£30 000 NICE band for IGRA alone, across wide differences in disease and test parameters. Health gains justify IGRA costs, even if IGRA tests cost three times TST. This health economic model offers a powerful tool for appraising non-drug interventions in the market and under development. PMID:22382118

  13. Validating hyperbilirubinemia and gut mucosal atrophy with a novel ultramobile ambulatory total parenteral nutrition piglet model

    USDA-ARS?s Scientific Manuscript database

    Total parenteral nutrition (TPN) provides all nutrition intravenously. Although TPN therapy has grown enormously, it causes significant complications, including gut and hepatic dysfunction. Current models use animal tethering which is unlike ambulatory human TPN delivery and is cost prohibitive. We ...

  14. A comparison of methods to handle skew distributed cost variables in the analysis of the resource consumption in schizophrenia treatment.

    PubMed

    Kilian, Reinhold; Matschinger, Herbert; Löeffler, Walter; Roick, Christiane; Angermeyer, Matthias C

    2002-03-01

    Transformation of the dependent cost variable is often used to solve the problems of heteroscedasticity and skewness in linear ordinary least squares (OLS) regression of health service cost data. However, transformation may cause difficulties in the interpretation of regression coefficients and the retransformation of predicted values. The study compares the advantages and disadvantages of different methods to estimate regression-based cost functions using data on the annual costs of schizophrenia treatment. Annual costs of psychiatric service use and clinical and socio-demographic characteristics of the patients were assessed for a sample of 254 patients with a diagnosis of schizophrenia (ICD-10 F 20.0) living in Leipzig. The clinical characteristics of the participants were assessed by means of the BPRS 4.0, the GAF, and the CAN for service needs. Quality of life was measured by WHOQOL-BREF. A linear OLS regression model with non-parametric standard errors, a log-transformed OLS model and a generalized linear model with a log-link and a gamma distribution were used to estimate service costs. For the estimation of robust non-parametric standard errors, the variance estimator by White and a bootstrap estimator based on 2000 replications were employed. Models were evaluated by the comparison of the R2 and the root mean squared error (RMSE). RMSE of the log-transformed OLS model was computed with three different methods of bias-correction. The 95% confidence intervals for the differences between the RMSE were computed by means of bootstrapping. A split-sample cross-validation procedure was used to forecast the costs for one half of the sample on the basis of a regression equation computed for the other half of the sample. All three methods showed significant positive influences of psychiatric symptoms and met psychiatric service needs on service costs. Only the log-transformed OLS model showed a significant negative impact of age, and only the GLM showed significant negative influences of employment status and partnership on costs. All three models provided an R2 of about 0.31. The residuals of the linear OLS model revealed significant deviations from normality and homoscedasticity. The residuals of the log-transformed model are normally distributed but still heteroscedastic. The linear OLS model provided the lowest prediction error and the best forecast of the dependent cost variable. The log-transformed model provided the lowest RMSE if the heteroscedastic bias correction was used. The RMSE of the GLM with a log link and a gamma distribution was higher than those of the linear OLS model and the log-transformed OLS model. The difference between the RMSE of the linear OLS model and that of the log-transformed OLS model without bias correction was significant at the 95% level. As a result of the cross-validation procedure, the linear OLS model provided the lowest RMSE, followed by the log-transformed OLS model with a heteroscedastic bias correction. The GLM showed the weakest model fit again. None of the differences between the RMSE resulting from the cross-validation procedure were found to be significant. The comparison of the fit indices of the different regression models revealed that the linear OLS model provided a better fit than the log-transformed model and the GLM, but the differences between the models' RMSE were not significant. Due to the small number of cases in the study, the lack of significance does not sufficiently prove that the differences between the RMSE for the different models are zero, and the superiority of the linear OLS model cannot be generalized. The lack of significant differences among the alternative estimators may reflect a lack of sample size adequate to detect important differences among the estimators employed. Further studies with larger case numbers are necessary to confirm the results. Specification of an adequate regression model requires a careful examination of the characteristics of the data. Estimation of standard errors and confidence intervals by nonparametric methods, which are robust against deviations from normality and homoscedasticity of the residuals, is a suitable alternative to transformation of the skew-distributed dependent variable. Further studies with more adequate case numbers are needed to confirm the results.
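    One of the retransformation issues discussed above is that predictions from a log-OLS model cannot simply be exponentiated without a bias correction. A common correction is Duan's nonparametric smearing factor; the sketch below applies it to synthetic stand-in data (not the Leipzig sample) to show how the naive and corrected retransformations differ.

```python
import numpy as np
import statsmodels.api as sm

# Sketch of a log-OLS cost model with Duan's nonparametric smearing factor, one
# common retransformation bias correction (the study compared several, including a
# heteroscedasticity-adjusted one). The data are synthetic stand-ins, not the
# Leipzig sample.
rng = np.random.default_rng(3)
n = 254
symptoms = rng.normal(size=n)
needs = rng.normal(size=n)
cost = np.exp(8.0 + 0.4 * symptoms + 0.3 * needs + rng.normal(0, 0.9, n))

X = sm.add_constant(np.column_stack([symptoms, needs]))
log_fit = sm.OLS(np.log(cost), X).fit()

smearing = np.mean(np.exp(log_fit.resid))       # Duan's smearing estimator
naive = np.exp(log_fit.fittedvalues)            # biased back-transformation
corrected = naive * smearing

print(f"mean observed cost:          {cost.mean():10.0f}")
print(f"mean naive retransformation: {naive.mean():10.0f}")
print(f"mean smearing-corrected:     {corrected.mean():10.0f}")
```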

  15. [A new low-cost webcam-based laparoscopic training model].

    PubMed

    Langeron, A; Mercier, G; Lima, S; Chauleur, C; Golfier, F; Seffert, P; Chêne, G

    2012-01-01

    To validate a new laparoscopic home training model (GYN Trainer®) for practising and learning basic laparoscopic surgery. Ten junior surgical residents and six experienced operators were timed and assessed during six laparoscopic exercises performed on the home training model. Acquisition of skill was 35%. All the novices significantly improved performance in surgical skills despite an 8% partial loss of acquisition between two training sessions. Qualitative evaluation of the system was good (3.8/5). This low-cost personal laparoscopic model seems to be a useful tool to assist surgical novices in learning basic laparoscopic skills. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  16. Simulation and Modeling Capability for Standard Modular Hydropower Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Kevin M.; Smith, Brennan T.; Witt, Adam M.

    Grounded in the stakeholder-validated framework established in Oak Ridge National Laboratory’s SMH Exemplary Design Envelope Specification, this report on Simulation and Modeling Capability for Standard Modular Hydropower (SMH) Technology provides insight into the concepts, use cases, needs, gaps, and challenges associated with modeling and simulating SMH technologies. The SMH concept envisions a network of generation, passage, and foundation modules that achieve environmentally compatible, cost-optimized hydropower using standardization and modularity. The development of standardized modeling approaches and simulation techniques for SMH (as described in this report) will pave the way for reliable, cost-effective methods for technology evaluation, optimization, and verification.

  17. A measurement system for large, complex software programs

    NASA Technical Reports Server (NTRS)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.

  18. Cost-Effectiveness of Procedures for Treatment of Ostium Secundum Atrial Septal Defects Occlusion Comparing Conventional Surgery and Septal Percutaneous Implant

    PubMed Central

    da Costa, Márcia Gisele Santos; Santos, Marisa da Silva; Sarti, Flávia Mori; Senna, Kátia Marie Simões e.; Tura, Bernardo Rangel; Goulart, Marcelo Correia

    2014-01-01

    Objectives The study performs a cost-effectiveness analysis of procedures for atrial septal defects occlusion, comparing conventional surgery to a septal percutaneous implant. Methods An analytical decision model was structured with symmetric branches to estimate the cost-effectiveness ratio between the procedures. The decision tree model was based on evidence gathered through a meta-analysis of the literature, and was validated by a panel of specialists. The lower number of surgical procedures performed for atrial septal defects occlusion at each branch was considered as the effectiveness outcome. Direct medical costs and probabilities for each event were inserted in the model using data available from the Brazilian public sector database system and information extracted from the literature review, using a micro-costing technique. Sensitivity analysis included price variations of the percutaneous implant. Results The results obtained from the decision model demonstrated that the percutaneous implant was more cost-effective, at a cost of US$8,936.34, with a reduction in the probability of surgery occurrence in 93% of the cases. The probability of atrial septal communication occlusion and the cost of the implant are the determinant factors of the cost-effectiveness ratio. Conclusions The proposal of a decision model seeks to fill a void in the academic literature. The decision model proposed includes the outcomes that have the greatest impact on the overall costs of the procedure. Atrial septal defects occlusion using the percutaneous implant reduces the physical and psychological distress to patients compared with conventional surgery, which represents intangible costs in the context of economic evaluation. PMID:25302806

  19. Cost-effectiveness of procedures for treatment of ostium secundum atrial septal defects occlusion comparing conventional surgery and septal percutaneous implant.

    PubMed

    da Costa, Márcia Gisele Santos; Santos, Marisa da Silva; Sarti, Flávia Mori; Simões e Senna, Kátia Marie; Tura, Bernardo Rangel; Correia, Marcelo Goulart; Goulart, Marcelo Correia

    2014-01-01

    The study performs a cost-effectiveness analysis of procedures for atrial septal defects occlusion, comparing conventional surgery to a septal percutaneous implant. An analytical decision model was structured with symmetric branches to estimate the cost-effectiveness ratio between the procedures. The decision tree model was based on evidence gathered through a meta-analysis of the literature, and was validated by a panel of specialists. The lower number of surgical procedures performed for atrial septal defects occlusion at each branch was considered as the effectiveness outcome. Direct medical costs and probabilities for each event were inserted in the model using data available from the Brazilian public sector database system and information extracted from the literature review, using a micro-costing technique. Sensitivity analysis included price variations of the percutaneous implant. The results obtained from the decision model demonstrated that the percutaneous implant was more cost-effective, at a cost of US$8,936.34, with a reduction in the probability of surgery occurrence in 93% of the cases. The probability of atrial septal communication occlusion and the cost of the implant are the determinant factors of the cost-effectiveness ratio. The proposal of a decision model seeks to fill a void in the academic literature. The decision model proposed includes the outcomes that have the greatest impact on the overall costs of the procedure. Atrial septal defects occlusion using the percutaneous implant reduces the physical and psychological distress to patients compared with conventional surgery, which represents intangible costs in the context of economic evaluation.

  20. Estimating Procurement Cost Growth Using Logistic and Multiple Regression

    DTIC Science & Technology

    2003-03-01

    Figure 4). The plots fail to pass the visual inspection for constant variance as well as the Breusch-Pagan test (Neter, 1996: 112) at an alpha level...plots fail to pass the visual inspection for constant variance as well as the Breusch-Pagan test at an alpha level of 0.05. Based on these findings...amount of cost growth a program will have once model A deems that the program will incur cost growth. Sipple conducts validation testing on

  1. A novel low-cost open-hardware platform for monitoring soil water content and multiple soil-air-vegetation parameters.

    PubMed

    Bitella, Giovanni; Rossi, Roberta; Bochicchio, Rocco; Perniola, Michele; Amato, Mariana

    2014-10-21

    Monitoring soil water content at high spatio-temporal resolution and coupled to other sensor data is crucial for applications oriented towards water sustainability in agriculture, such as precision irrigation or phenotyping root traits for drought tolerance. The cost of instrumentation, however, limits measurement frequency and number of sensors. The objective of this work was to design a low-cost "open hardware" platform for multi-sensor measurements including water content at different depths, air and soil temperatures. The system is based on an open-source ARDUINO microcontroller-board, programmed in a simple integrated development environment (IDE). Low-cost high-frequency dielectric probes were used in the platform and lab tested on three non-saline soils (ECe1: 2.5 < 0.1 mS/cm). Empirical calibration curves were subjected to cross-validation (leave-one-out method), and the normalized root mean square errors (NRMSE) were 0.09 for the overall model, 0.09 for the sandy soil, 0.07 for the clay loam, and 0.08 for the sandy loam. The overall model (pooled soil data) fitted the data very well (R2 = 0.89) and showed high stability, generating very similar RMSEs during training and validation (RMSE(training) = 2.63; RMSE(validation) = 2.61). Data recorded on the card were automatically sent to a remote server, allowing repeated field-data quality checks. This work provides a framework for the replication and upgrading of a customized low-cost platform, consistent with the open source approach whereby sharing information on equipment design and software facilitates the adoption and continuous improvement of existing technologies.

  2. Can automation in radiotherapy reduce costs?

    PubMed

    Massaccesi, Mariangela; Corti, Michele; Azario, Luigi; Balducci, Mario; Ferro, Milena; Mantini, Giovanna; Mattiucci, Gian Carlo; Valentini, Vincenzo

    2015-01-01

    Computerized automation is likely to play an increasingly important role in radiotherapy. The objective of this study was to report the results of the first part of a program to implement a model for economic evaluation based on the micro-costing method. To test the efficacy of the model, the financial impact of the introduction of an automation tool was estimated. A single- and multi-center validation of the model by a prospective collection of data is planned as the second step of the program. The model was implemented by using an interactive spreadsheet (Microsoft Excel, 2010). The variables to be included were identified across three components: productivity, staff, and equipment. To calculate staff requirements, the workflow of Gemelli ART center was mapped out and relevant workload measures were defined. Profit and loss, productivity and staffing were identified as significant outcomes. Results were presented in terms of earnings before interest and taxes (EBIT). Three different scenarios were hypothesized: baseline situation at Gemelli ART (scenario 1); reduction by 2 minutes of the average duration of treatment fractions (scenario 2); and increased incidence of advanced treatment modalities (scenario 3). By using the model, predicted EBIT values for each scenario were calculated across a period of eight years (from 2015 to 2022). For both scenarios 2 and 3, costs are expected to increase slightly compared with the baseline situation, mainly due to a small increase in clinical personnel costs. However, in both cases EBIT values are more favorable than in the baseline situation (EBIT values: scenario 1, 27%; scenario 2, 30%; scenario 3, 28% of revenues). A model based on a micro-costing method was able to estimate the financial consequences of the introduction of an automation tool in our radiotherapy department. A prospective collection of data at Gemelli ART and in a consortium of centers is currently under way to validate the model.
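    The kind of scenario comparison the spreadsheet model produces can be sketched in a few lines. Every revenue and cost figure below is an invented placeholder (not Gemelli ART data); the point is only the structure: EBIT as revenue minus staff, equipment and other operating costs, reported as a share of revenues per scenario.

```python
# Toy EBIT scenario comparison in the spirit of the spreadsheet model described
# above. Every revenue and cost figure is an invented placeholder, not Gemelli ART
# data; only the structure (EBIT as a share of revenues per scenario) is the point.
def ebit_margin(fractions_per_year, revenue_per_fraction, staff_cost, equipment_cost, other_cost):
    revenue = fractions_per_year * revenue_per_fraction
    ebit = revenue - (staff_cost + equipment_cost + other_cost)
    return ebit / revenue

scenarios = {
    "baseline":                dict(fractions_per_year=28_000, revenue_per_fraction=180.0,
                                    staff_cost=2_400_000, equipment_cost=900_000, other_cost=380_000),
    "2-min shorter fractions": dict(fractions_per_year=30_000, revenue_per_fraction=180.0,
                                    staff_cost=2_480_000, equipment_cost=900_000, other_cost=380_000),
}
for name, inputs in scenarios.items():
    print(f"{name:25s} EBIT = {ebit_margin(**inputs):.0%} of revenues")
```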

  3. Patient casemix classification for medicare psychiatric prospective payment.

    PubMed

    Drozd, Edward M; Cromwell, Jerry; Gage, Barbara; Maier, Jan; Greenwald, Leslie M; Goldman, Howard H

    2006-04-01

    For a proposed Medicare prospective payment system for inpatient psychiatric facility treatment, the authors developed a casemix classification to capture differences in patients' real daily resource use. Primary data on patient characteristics and daily time spent in various activities were collected in a survey of 696 patients from 40 inpatient psychiatric facilities. Survey data were combined with Medicare claims data to estimate intensity-adjusted daily cost. Classification and Regression Trees (CART) analysis of average daily routine and ancillary costs yielded several hierarchical classification groupings. Regression analysis was used to control for facility and day-of-stay effects in order to compare hierarchical models with models based on the recently proposed payment system of the Centers for Medicare & Medicaid Services. CART analysis identified a small set of patient characteristics strongly associated with higher daily costs, including age, psychiatric diagnosis, deficits in daily living activities, and detox or ECT use. A parsimonious, 16-group, fully interactive model that used five major DSM-IV categories and stratified by age, illness severity, deficits in daily living activities, dangerousness, and use of ECT explained 40% (out of a possible 76%) of daily cost variation not attributable to idiosyncratic daily changes within patients. A noninteractive model based on diagnosis-related groups, age, and medical comorbidity had explanatory power of only 32%. A regression model with 16 casemix groups restricted to using "appropriate" payment variables (i.e., those with clinical face validity and low administrative burden that are easily validated and provide proper care incentives) produced more efficient and equitable payments than did a noninteractive system based on diagnosis-related groups.
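    The CART step described above amounts to growing a shallow regression tree on patient characteristics so that the leaves form candidate casemix groups with distinct average daily cost. A sketch on simulated patient-level data (not the 696-patient survey) follows.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

# Sketch of the CART step: grow a shallow regression tree on patient characteristics
# so the leaves form candidate casemix groups with distinct mean daily cost. The
# patient-level data here are simulated, not the 696-patient survey.
rng = np.random.default_rng(4)
n = 696
patients = pd.DataFrame({
    "age": rng.integers(18, 90, n),
    "adl_deficits": rng.integers(0, 6, n),        # deficits in daily living activities
    "ect_or_detox": rng.integers(0, 2, n),        # 1 if ECT or detox used
})
daily_cost = (400 + 3 * patients["age"] + 60 * patients["adl_deficits"]
              + 250 * patients["ect_or_detox"] + rng.normal(0, 80, n))

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=30)
tree.fit(patients, daily_cost)
print(export_text(tree, feature_names=list(patients.columns)))   # leaves = casemix groups
```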

  4. Using multilevel models for assessing the variability of multinational resource use and cost data.

    PubMed

    Grieve, Richard; Nixon, Richard; Thompson, Simon G; Normand, Charles

    2005-02-01

    Multinational economic evaluations often calculate a single measure of cost-effectiveness using cost data pooled across several countries. To assess the validity of pooling international cost data, the reasons for cost variation across countries need to be assessed. Previously, ordinary least-squares (OLS) regression models have been used to identify factors associated with variability in resource use and total costs. However, multilevel models (MLMs), which accommodate the hierarchical structure of the data, may be more appropriate. This paper compares these different techniques using a multinational dataset comprising case-mix, resource use and cost data on 1300 stroke admissions from 13 centres in 11 European countries. OLS and MLMs were used to estimate the effect of patient- and centre-level covariates on the total length of hospital stay (LOS) and total cost. MLMs with normal and gamma distributions for the data within centres were compared. The results from the OLS model showed that both patient- and centre-level covariates were associated with LOS and total cost. The estimates from the MLMs showed that none of the centre-level characteristics were associated with LOS, and the level of spending on health was the centre-level variable most highly associated with total cost. We conclude that using OLS models for assessing international variation can lead to incorrect inferences, and that MLMs are more appropriate for assessing why resource use and costs vary across centres. Copyright (c) 2004 John Wiley & Sons, Ltd.
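    A random-intercept specification of the kind contrasted with OLS above can be written compactly with statsmodels; the simulated stroke-admission data below are placeholders for the real multinational dataset, and a full analysis would add the case-mix and centre-level covariates of interest.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Sketch of a random-intercept (multilevel) model with patients nested in centres,
# of the kind contrasted with OLS above. The simulated stroke admissions are
# placeholders; a real analysis would add the case-mix and centre-level covariates.
rng = np.random.default_rng(5)
centres = np.repeat(np.arange(13), 100)                 # 13 centres, 100 patients each
centre_effect = rng.normal(0, 0.4, 13)[centres]         # unobserved centre-level shift
severity = rng.normal(size=centres.size)
los = np.exp(2.5 + 0.3 * severity + centre_effect + rng.normal(0, 0.5, centres.size))

data = pd.DataFrame({"los": los, "severity": severity, "centre": centres})
fit = smf.mixedlm("np.log(los) ~ severity", data, groups=data["centre"]).fit()
print(fit.summary())
```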

  5. Reflector Technology Development and System Design for Concentrating Solar Power Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adam Schaut

    2011-12-30

    Alcoa began this program in March of 2008 with the goal of developing and validating an advanced CSP trough design to lower the levelized cost of energy (LCOE) as compared to existing glass based, space-frame trough technology. In addition to showing a pathway to a significant LCOE reduction, Alcoa also desired to create US jobs to support the emerging CSP industry. Alcoa's objective during Phase I: Concept Feasibility was to provide the DOE with a design approach that demonstrates significant overall system cost savings without sacrificing performance. Phase I consisted of two major tasks; reflector surface development and system conceptmore » development. Two specific reflective surface technologies were investigated, silver metallized lamination, and thin film deposition both applied on an aluminum substrate. Alcoa prepared samples; performed test validation internally; and provided samples to the NREL for full-spectrum reflectivity measurements. The final objective was to report reflectivity at t = 0 and the latest durability results as of the completion of Phase 1. The target criteria for reflectance and durability were as follows: (1) initial (t = 0), hemispherical reflectance >93%, (2) initial spectral reflectance >90% for 25-mrad reading and >87% for 7-mrad reading, and (3) predicted 20 year durability of less than 5% optical performance drop. While the results of the reflective development activities were promising, Alcoa was unable to down-select on a reflective technology that met the target criteria. Given the progress and potential of both silver film and thin film technologies, Alcoa continued reflector surface development activities in Phase II. The Phase I concept development activities began with acquiring baseline CSP system information from both CSP Services and the DOE. This information was used as the basis to develop conceptual designs through ideation sessions. The concepts were evaluated based on estimated cost and high-level structural performance. The target criteria for the concept development was to achieve a solar field cost savings of 25%-50% thereby meeting or exceeding the DOE solar field cost savings target of $350/m2. After evaluating various structural design approaches, Alcoa down-selected to a monocoque, dubbed Wing Box, design that utilizes the reflective surface as a structural, load carrying member. The cost and performance potential of the Wing Box concept was developed via initial finite element analysis (FEA) and cost modeling. The structural members were sized through material utilization modeling when subjected to representative loading conditions including wind loading. Cost modeling was utilized to refine potential manufacturing techniques that could be employed to manufacture the structural members. Alcoa concluded that an aluminum intensive collector design can achieve significant cost savings without sacrificing performance. Based on the cost saving potential of this Concept Feasibility study, Alcoa recommended further validation of this CSP approach through the execution of Phase II: Design and Prototype Development. Alcoa Phase II objective was to provide the DOE with a validated CSP trough design that demonstrates significant overall system cost savings without sacrificing performance. Phase II consisted of three major tasks; Detail System Design, Prototype Build, and System Validation. Additionally, the reflector surface development that began in Phase I was continued in Phase II. 
After further development work, Alcoa was unable to develop a reflective technology that demonstrated significant performance or cost benefits compared to commercially available CSP reflective products. After considering other commercially available reflective surfaces, Alcoa selected Alanod's MIRO-SUN product for use on the full-scale prototype. Although MIRO-SUN has a lower specular reflectivity compared to other options, its durability in terms of handling, cleaning, and long-term reflectivity was deemed the most important attribute for successfully validating Alcoa's advanced trough architecture. To validate the performance of the Wing Box trough, a 6-meter-aperture by 14-meter-long prototype trough was built. For ease of shipping to and assembly at NREL's test facility, the prototype was fabricated in two half modules and joined along the centerline to create the Wing Box trough. The trough components were designed to achieve high precision of the reflective surface while leveraging high-volume manufacturing and assembly techniques.

  6. Motivation and personality: relationships between putative motive dimensions and the five factor model of personality.

    PubMed

    Bernard, Larry C

    2010-04-01

    There are few multidimensional measures of individual differences in motivation available. The Assessment of Individual Motives-Questionnaire assesses 15 putative dimensions of motivation. The dimensions are based on evolutionary theory and preliminary evidence suggests the motive scales have good psychometric properties. The scales are reliable and there is evidence of their consensual validity (convergence of self-other ratings) and behavioral validity (relationships with self-other reported behaviors of social importance). Additional validity research is necessary, however, especially with respect to current models of personality. The present study tested two general and 24 specific hypotheses based on proposed evolutionary advantages/disadvantages and fitness benefits/costs of the five-factor model of personality together with the new motive scales in a sample of 424 participants (M age=28.8 yr., SD=14.6). Results were largely supportive of the hypotheses. These results support the validity of new motive dimensions and increase understanding of the five-factor model of personality.

  7. Cost-effective and business-beneficial computer validation for bioanalytical laboratories.

    PubMed

    McDowall, Rd

    2011-07-01

Computerized system validation is often viewed as a burden and a waste of time to meet regulatory requirements. This article presents a different approach by looking at validation in a bioanalytical laboratory from the perspective of the business benefits that computer validation can bring. Ask yourself the question, have you ever bought a computerized system that did not meet your initial expectations? This article will look at understanding the process to be automated, the paper to be eliminated and the records to be signed to meet the requirements of the GLP or GCP and Part 11 regulations. This paper will only consider commercial nonconfigurable and configurable software such as plate readers and LC-MS/MS data systems rather than LIMS or custom applications. Two streamlined life cycle models are presented. The first one consists of a single document for validation of nonconfigurable software. The second is for configurable software and is a five-stage model that avoids the need to write functional and design specifications. Both models are aimed at managing the risk each type of software poses whilst reducing the amount of documented evidence required for validation.

  8. Business intelligence modeling in launch operations

    NASA Astrophysics Data System (ADS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-05-01

The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of technologies that are most relevant to space exploration are experiencing the greatest change. Organizing around sets of processes, rather than organizational units, to achieve end-to-end automation is becoming a major objective of enterprise information technology. The cost element is a leading factor of future exploration systems. This technology project is to advance an integrated Planning and Management Simulation Model for evaluation of risks, costs, and reliability of launch systems from Earth to Orbit for Space Exploration. The approach builds on research done in the NASA ARC/KSC developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise-level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on recurring launch cost of operations, provide management a tool for assessing systems safety and dependability versus cost, and leverage lessons learned and empirical models from Shuttle and International Space Station to validate models applied to Exploration. The systems-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long-term sustainability. A planning and analysis test bed is needed for evaluation of enterprise-level options and strategies for transit and launch systems as well as surface and orbital systems. This environment can also support agency simulation-based acquisition process objectives. The technology development approach is based on the collaborative effort set forth in the VTB's integrating operations, process models, systems and environment models, and cost models as a comprehensive disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root cause analyses from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete event process and systems simulations, and large-scale simulation integration. The enterprise architecture is required for coherent integration of systems models. It will also require a plan for evolution over the life of the program. The proposed technology will produce long-term benefits in support of the NASA objectives for simulation-based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems.

  9. Business Intelligence Modeling in Launch Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-01-01

This technology project is to advance an integrated Planning and Management Simulation Model for evaluation of risks, costs, and reliability of launch systems from Earth to Orbit for Space Exploration. The approach builds on research done in the NASA ARC/KSC developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise-level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on recurring launch cost of operations, provide management a tool for assessing systems safety and dependability versus cost, and leverage lessons learned and empirical models from Shuttle and International Space Station to validate models applied to Exploration. The systems-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long-term sustainability. A planning and analysis test bed is needed for evaluation of enterprise-level options and strategies for transit and launch systems as well as surface and orbital systems. This environment can also support agency simulation-based acquisition process objectives. The technology development approach is based on the collaborative effort set forth in the VTB's integrating operations, process models, systems and environment models, and cost models as a comprehensive disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root cause analyses from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete event process and systems simulations, and large-scale simulation integration. The enterprise architecture is required for coherent integration of systems models. It will also require a plan for evolution over the life of the program. The proposed technology will produce long-term benefits in support of the NASA objectives for simulation-based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems. The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of technologies that are most relevant to space exploration are experiencing the greatest change. Organizing around sets of processes, rather than organizational units, to achieve end-to-end automation is becoming a major objective of enterprise information technology. The cost element is a leading factor of future exploration systems.

  10. Optimization Scheduling Model for Wind-thermal Power System Considering the Dynamic penalty factor

    NASA Astrophysics Data System (ADS)

    PENG, Siyu; LUO, Jianchun; WANG, Yunyu; YANG, Jun; RAN, Hong; PENG, Xiaodong; HUANG, Ming; LIU, Wanyu

    2018-03-01

In this paper, a new dynamic economic dispatch model for power systems is presented. The objective function of the proposed model introduces a major novelty for dynamic economic dispatch with wind farms: a "dynamic penalty factor". This factor is computed using fuzzy logic that considers both the variable nature of available wind power and the power demand, and it adjusts the wind curtailment cost according to the current state of the power system. Case studies were carried out on the IEEE 30-bus system. Results show that the proposed optimization model can effectively mitigate both wind curtailment and total cost, demonstrating the validity and effectiveness of the proposed model.
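    To indicate roughly where such a factor enters the formulation, a generic dispatch objective with a time-varying curtailment penalty is sketched below. The notation (thermal cost functions C_i, curtailment price c_curt, available and scheduled wind W, demand D, and a fuzzy mapping f) is illustrative only and is not the paper's exact model.

```latex
\min_{P,\;W^{\mathrm{sched}}}\;\sum_{t=1}^{T}\left[\sum_{i=1}^{N} C_i\!\left(P_{i,t}\right)
  + \lambda_t\, c_{\mathrm{curt}}\left(W_t^{\mathrm{avail}} - W_t^{\mathrm{sched}}\right)\right],
\qquad
\lambda_t = f_{\mathrm{fuzzy}}\!\left(W_t^{\mathrm{avail}},\, D_t\right)
```

    The key point is that the penalty weight lambda_t is recomputed each period from the system state rather than being a fixed constant.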

  11. A cross-validation package driving Netica with python

    USGS Publications Warehouse

    Fienen, Michael N.; Plant, Nathaniel G.

    2014-01-01

Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross-validation is a technique to avoid overfitting resulting from overly complex BNs. Overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented, due partly to a lack of software tools designed to work with available BN packages. CVNetica is open-source, written in Python, and extends the Netica software package to perform cross-validation and read, rebuild, and learn BNs from data. Insights gained from cross-validation, and their implications for prediction versus description, are illustrated with two examples: a data-driven oceanographic application and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than the supporting data allow, and that overfitting incurs computational costs as well as a reduction in prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and its impact on performance metrics (we used skill).
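    A minimal sketch of the k-fold idea that CVNetica automates is shown below; the fit and skill callbacks are hypothetical placeholders standing in for Netica's learning and prediction calls, which are not reproduced here.

```python
# Generic k-fold cross-validation skeleton; assumes `data` is a 2-D numpy array of cases.
import numpy as np

def k_fold_indices(n_samples, k, seed=0):
    """Shuffle sample indices and split them into k roughly equal folds."""
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(n_samples), k)

def cross_validate(data, k, fit, skill):
    """Return calibration and validation skill per fold.

    `fit(train_rows)` and `skill(model, rows)` are placeholder callbacks that
    would wrap BN learning and prediction skill (e.g. via the Netica API).
    """
    folds = k_fold_indices(len(data), k)
    results = []
    for i, test_idx in enumerate(folds):
        train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
        model = fit(data[train_idx])
        results.append({"fold": i,
                        "calibration_skill": skill(model, data[train_idx]),
                        "validation_skill": skill(model, data[test_idx])})
    return results
```

    A widening gap between calibration skill and validation skill across folds is the overfitting signal discussed in the abstract.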

  12. Current Status of Simulation-based Training Tools in Orthopedic Surgery: A Systematic Review.

    PubMed

    Morgan, Michael; Aydin, Abdullatif; Salih, Alan; Robati, Shibby; Ahmed, Kamran

    To conduct a systematic review of orthopedic training and assessment simulators with reference to their level of evidence (LoE) and level of recommendation. Medline and EMBASE library databases were searched for English language articles published between 1980 and 2016, describing orthopedic simulators or validation studies of these models. All studies were assessed for LoE, and each model was subsequently awarded a level of recommendation using a modified Oxford Centre for Evidence-Based Medicine classification, adapted for education. A total of 76 articles describing orthopedic simulators met the inclusion criteria, 47 of which described at least 1 validation study. The most commonly identified models (n = 34) and validation studies (n = 26) were for knee arthroscopy. Construct validation was the most frequent validation study attempted by authors. In all, 62% (47 of 76) of the simulator studies described arthroscopy simulators, which also contained validation studies with the highest LoE. Orthopedic simulators are increasingly being subjected to validation studies, although the LoE of such studies generally remain low. There remains a lack of focus on nontechnical skills and on cost analyses of orthopedic simulators. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  13. Cost Validation Using PRICE H

    NASA Technical Reports Server (NTRS)

    Jack, John; Kwan, Eric; Wood, Milana

    2011-01-01

PRICE H was introduced into the JPL cost estimation tool set circa 2003. It became more available at JPL when IPAO funded the NASA-wide site license for all NASA centers. PRICE H was mainly used as one of the cost tools to validate proposal grassroots cost estimates. Program offices at JPL view PRICE H as an additional crosscheck to Team X (JPL Concurrent Engineering Design Center) estimates. PRICE H became widely accepted at JPL circa 2007, when the program offices moved away from grassroots cost estimation for Step 1 proposals. PRICE H is now one of the key cost tools used for cost validation, cost trades, and independent cost estimates.

  14. Texas flexible pavements and overlays : year 5 report - complete data documentation.

    DOT National Transportation Integrated Search

    2017-05-01

Proper calibration and validation of pavement design and performance models to Texas conditions is essential for cost-effective flexible pavement design, performance predictions, and maintenance/rehab strategies. The veracity of the calibration o...

  15. A new methodology for modeling of direct landslide costs for transportation infrastructures

    NASA Astrophysics Data System (ADS)

    Klose, Martin; Terhorst, Birgit

    2014-05-01

The world's transportation infrastructure is at risk of landslides in many areas across the globe. Safety and affordability of traffic route operation are the two main criteria for transportation planning in landslide-prone areas. Balancing these often conflicting priorities requires, among other things, profound knowledge of the direct costs of landslide damage. These costs include capital investments for landslide repair and mitigation as well as operational expenditures for first response and maintenance works. This contribution presents a new methodology for ex post assessment of direct landslide costs for transportation infrastructures. The methodology includes tools to compile, model, and extrapolate landslide losses on different spatial scales over time. A landslide susceptibility model enables regional cost extrapolation by means of a cost figure obtained from local cost compilation for representative case study areas. On the local level, cost survey is closely linked with cost modeling, a toolset for cost estimation based on landslide databases. Cost modeling uses Landslide Disaster Management Process Models (LDMMs) and cost modules to simulate and monetize cost factors for certain types of landslide damage. The landslide susceptibility model provides a regional exposure index and updates the cost figure to a cost index which describes the costs per km of traffic route at risk of landslides. Both indexes enable the regionalization of local landslide losses. The methodology is applied and tested in a cost assessment for highways in the Lower Saxon Uplands, NW Germany, for the period 1980 to 2010. The basis of this research is a regional subset of a landslide database for the Federal Republic of Germany. In the 7,000 km² Lower Saxon Uplands, 77 km of highway are located in potential landslide hazard areas. Annual average costs of 52k per km of highway at risk of landslides are identified as the cost index for a local case study area in this region. The cost extrapolation for the Lower Saxon Uplands results in annual average costs for highways of 4.02mn. This test application, as well as a validation of selected modeling tools, verifies the functionality of the methodology.
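    The regional extrapolation reduces to simple arithmetic on the two figures quoted above; the snippet below only reproduces that multiplication (currency units are as reported in the abstract).

```python
# Back-of-the-envelope check of the regional extrapolation described above.
cost_index_per_km = 52_000   # annual average landslide cost per km of highway at risk (cost index)
km_at_risk = 77              # km of highway in potential landslide hazard areas (Lower Saxon Uplands)

annual_regional_cost = cost_index_per_km * km_at_risk
print(f"Extrapolated annual cost: {annual_regional_cost / 1e6:.2f} mn")
# ~4.00 mn, consistent with the 4.02 mn reported in the abstract.
```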

  16. The Diagnosis of Urinary Tract infection in Young children (DUTY): a diagnostic prospective observational study to derive and validate a clinical algorithm for the diagnosis of urinary tract infection in children presenting to primary care with an acute illness.

    PubMed Central

    Hay, Alastair D; Birnie, Kate; Busby, John; Delaney, Brendan; Downing, Harriet; Dudley, Jan; Durbaba, Stevo; Fletcher, Margaret; Harman, Kim; Hollingworth, William; Hood, Kerenza; Howe, Robin; Lawton, Michael; Lisles, Catherine; Little, Paul; MacGowan, Alasdair; O'Brien, Kathryn; Pickles, Timothy; Rumsby, Kate; Sterne, Jonathan Ac; Thomas-Jones, Emma; van der Voort, Judith; Waldron, Cherry-Ann; Whiting, Penny; Wootton, Mandy; Butler, Christopher C

    2016-01-01

    BACKGROUND It is not clear which young children presenting acutely unwell to primary care should be investigated for urinary tract infection (UTI) and whether or not dipstick testing should be used to inform antibiotic treatment. OBJECTIVES To develop algorithms to accurately identify pre-school children in whom urine should be obtained; assess whether or not dipstick urinalysis provides additional diagnostic information; and model algorithm cost-effectiveness. DESIGN Multicentre, prospective diagnostic cohort study. SETTING AND PARTICIPANTS Children < 5 years old presenting to primary care with an acute illness and/or new urinary symptoms. METHODS One hundred and seven clinical characteristics (index tests) were recorded from the child's past medical history, symptoms, physical examination signs and urine dipstick test. Prior to dipstick results clinician opinion of UTI likelihood ('clinical diagnosis') and urine sampling and treatment intentions ('clinical judgement') were recorded. All index tests were measured blind to the reference standard, defined as a pure or predominant uropathogen cultured at ≥ 10(5) colony-forming units (CFU)/ml in a single research laboratory. Urine was collected by clean catch (preferred) or nappy pad. Index tests were sequentially evaluated in two groups, stratified by urine collection method: parent-reported symptoms with clinician-reported signs, and urine dipstick results. Diagnostic accuracy was quantified using area under receiver operating characteristic curve (AUROC) with 95% confidence interval (CI) and bootstrap-validated AUROC, and compared with the 'clinician diagnosis' AUROC. Decision-analytic models were used to identify optimal urine sampling strategy compared with 'clinical judgement'. RESULTS A total of 7163 children were recruited, of whom 50% were female and 49% were < 2 years old. Culture results were available for 5017 (70%); 2740 children provided clean-catch samples, 94% of whom were ≥ 2 years old, with 2.2% meeting the UTI definition. Among these, 'clinical diagnosis' correctly identified 46.6% of positive cultures, with 94.7% specificity and an AUROC of 0.77 (95% CI 0.71 to 0.83). Four symptoms, three signs and three dipstick results were independently associated with UTI with an AUROC (95% CI; bootstrap-validated AUROC) of 0.89 (0.85 to 0.95; validated 0.88) for symptoms and signs, increasing to 0.93 (0.90 to 0.97; validated 0.90) with dipstick results. Nappy pad samples were provided from the other 2277 children, of whom 82% were < 2 years old and 1.3% met the UTI definition. 'Clinical diagnosis' correctly identified 13.3% positive cultures, with 98.5% specificity and an AUROC of 0.63 (95% CI 0.53 to 0.72). Four symptoms and two dipstick results were independently associated with UTI, with an AUROC of 0.81 (0.72 to 0.90; validated 0.78) for symptoms, increasing to 0.87 (0.80 to 0.94; validated 0.82) with the dipstick findings. A high specificity threshold for the clean-catch model was more accurate and less costly than, and as effective as, clinical judgement. The additional diagnostic utility of dipstick testing was offset by its costs. The cost-effectiveness of the nappy pad model was not clear-cut. CONCLUSIONS Clinicians should prioritise the use of clean-catch sampling as symptoms and signs can cost-effectively improve the identification of UTI in young children where clean catch is possible. Dipstick testing can improve targeting of antibiotic treatment, but at a higher cost than waiting for a laboratory result. 
Future research is needed to distinguish pathogens from contaminants, assess the impact of the clean-catch algorithm on patient outcomes, and the cost-effectiveness of presumptive versus dipstick versus laboratory-guided antibiotic treatment. FUNDING The National Institute for Health Research Health Technology Assessment programme. PMID:27401902
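    One common way to compute a "bootstrap-validated" (optimism-corrected) AUROC of the kind reported above is sketched below; this is a generic illustration with a logistic model and placeholder data, not the DUTY study's exact procedure or variables.

```python
# Harrell-style optimism correction of the apparent AUROC; X and y are placeholder numpy arrays.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def bootstrap_validated_auc(X, y, n_boot=200, seed=0):
    rng = np.random.default_rng(seed)
    model = LogisticRegression(max_iter=1000).fit(X, y)
    apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])
    optimism = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))      # bootstrap resample of the cohort
        if len(np.unique(y[idx])) < 2:             # need both classes to refit
            continue
        m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
        auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
        optimism.append(auc_boot - auc_orig)       # how much the resampled fit flatters itself
    return apparent - np.mean(optimism)            # optimism-corrected ("validated") AUROC
```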

  17. The Diagnosis of Urinary Tract infection in Young children (DUTY): a diagnostic prospective observational study to derive and validate a clinical algorithm for the diagnosis of urinary tract infection in children presenting to primary care with an acute illness.

    PubMed

    Hay, Alastair D; Birnie, Kate; Busby, John; Delaney, Brendan; Downing, Harriet; Dudley, Jan; Durbaba, Stevo; Fletcher, Margaret; Harman, Kim; Hollingworth, William; Hood, Kerenza; Howe, Robin; Lawton, Michael; Lisles, Catherine; Little, Paul; MacGowan, Alasdair; O'Brien, Kathryn; Pickles, Timothy; Rumsby, Kate; Sterne, Jonathan Ac; Thomas-Jones, Emma; van der Voort, Judith; Waldron, Cherry-Ann; Whiting, Penny; Wootton, Mandy; Butler, Christopher C

    2016-07-01

    It is not clear which young children presenting acutely unwell to primary care should be investigated for urinary tract infection (UTI) and whether or not dipstick testing should be used to inform antibiotic treatment. To develop algorithms to accurately identify pre-school children in whom urine should be obtained; assess whether or not dipstick urinalysis provides additional diagnostic information; and model algorithm cost-effectiveness. Multicentre, prospective diagnostic cohort study. Children < 5 years old presenting to primary care with an acute illness and/or new urinary symptoms. One hundred and seven clinical characteristics (index tests) were recorded from the child's past medical history, symptoms, physical examination signs and urine dipstick test. Prior to dipstick results clinician opinion of UTI likelihood ('clinical diagnosis') and urine sampling and treatment intentions ('clinical judgement') were recorded. All index tests were measured blind to the reference standard, defined as a pure or predominant uropathogen cultured at ≥ 10(5) colony-forming units (CFU)/ml in a single research laboratory. Urine was collected by clean catch (preferred) or nappy pad. Index tests were sequentially evaluated in two groups, stratified by urine collection method: parent-reported symptoms with clinician-reported signs, and urine dipstick results. Diagnostic accuracy was quantified using area under receiver operating characteristic curve (AUROC) with 95% confidence interval (CI) and bootstrap-validated AUROC, and compared with the 'clinician diagnosis' AUROC. Decision-analytic models were used to identify optimal urine sampling strategy compared with 'clinical judgement'. A total of 7163 children were recruited, of whom 50% were female and 49% were < 2 years old. Culture results were available for 5017 (70%); 2740 children provided clean-catch samples, 94% of whom were ≥ 2 years old, with 2.2% meeting the UTI definition. Among these, 'clinical diagnosis' correctly identified 46.6% of positive cultures, with 94.7% specificity and an AUROC of 0.77 (95% CI 0.71 to 0.83). Four symptoms, three signs and three dipstick results were independently associated with UTI with an AUROC (95% CI; bootstrap-validated AUROC) of 0.89 (0.85 to 0.95; validated 0.88) for symptoms and signs, increasing to 0.93 (0.90 to 0.97; validated 0.90) with dipstick results. Nappy pad samples were provided from the other 2277 children, of whom 82% were < 2 years old and 1.3% met the UTI definition. 'Clinical diagnosis' correctly identified 13.3% positive cultures, with 98.5% specificity and an AUROC of 0.63 (95% CI 0.53 to 0.72). Four symptoms and two dipstick results were independently associated with UTI, with an AUROC of 0.81 (0.72 to 0.90; validated 0.78) for symptoms, increasing to 0.87 (0.80 to 0.94; validated 0.82) with the dipstick findings. A high specificity threshold for the clean-catch model was more accurate and less costly than, and as effective as, clinical judgement. The additional diagnostic utility of dipstick testing was offset by its costs. The cost-effectiveness of the nappy pad model was not clear-cut. Clinicians should prioritise the use of clean-catch sampling as symptoms and signs can cost-effectively improve the identification of UTI in young children where clean catch is possible. Dipstick testing can improve targeting of antibiotic treatment, but at a higher cost than waiting for a laboratory result. 
Future research is needed to distinguish pathogens from contaminants, assess the impact of the clean-catch algorithm on patient outcomes, and the cost-effectiveness of presumptive versus dipstick versus laboratory-guided antibiotic treatment. The National Institute for Health Research Health Technology Assessment programme.

  18. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  19. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  20. Cost Analysis of a Digital Health Care Model in Sweden.

    PubMed

    Ekman, Björn

    2017-09-22

    Digital technologies in health care are expected to increase in scope and to affect ever more parts of the health care system. It is important to enhance the knowledge of whether new digital methods and innovations provide value for money compared with traditional models of care. The objective of the study was to evaluate whether a digital health care model for primary care is a less costly alternative compared with traditional in-office primary care in Sweden. Cost data for the two care models were collected and analyzed to obtain a measure in local currency per care contact. The comparison showed that the total economic cost of a digital consultation is 1960 Swedish krona (SEK) (SEK100 = US$11.29; February 2017) compared with SEK3348 for a traditional consultation at a health care clinic. Cost differences arose on both the provider side and on the user side. The digital health care model may be a less costly alternative to the traditional health care model. Depending on the rate of digital substitution, gross economic cost savings of between SEK1 billion and SEK10 billion per year could be realized if more digital consultations were made. Further studies are needed to validate the findings, assess the types of care most suitable for digital care, and also to obtain various quality-adjusted outcome measures.
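    The headline figures lend themselves to a quick sensitivity check. In the sketch below, the per-contact costs are taken from the abstract, while the substitution volumes are arbitrary placeholders, since the study reports only the resulting SEK 1-10 billion range.

```python
# Per-contact figures as reported above (SEK, February 2017 exchange rate given in the abstract).
digital_cost = 1960      # SEK per digital consultation
in_office_cost = 3348    # SEK per traditional in-office consultation

saving_per_contact = in_office_cost - digital_cost   # SEK 1388 per substituted contact

# Gross savings scale linearly with the number of substituted contacts;
# the volumes below are assumptions for illustration, not study results.
for substituted_contacts in (1_000_000, 5_000_000):
    savings_bn = saving_per_contact * substituted_contacts / 1e9
    print(f"{substituted_contacts:,} contacts -> {savings_bn:.2f} bn SEK")
```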

  1. Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation

    NASA Technical Reports Server (NTRS)

    DePriest, Douglas; Morgan, Carolyn

    2003-01-01

The cost and safety goals for NASA's next generation of reusable launch vehicles (RLVs) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase is focused on assessing the performance of these models in accurately predicting the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.

  2. A Cost Analysis Model for Army Sponsored Graduate Dental Education Programs.

    DTIC Science & Technology

    1997-04-01

...characteristics of a good measurement tool? Cooper and Emory, in their textbook Business Research Methods, state there are three major criteria for evaluating ... a measurement tool: validity, reliability, and practicality (Cooper and Emory 1995). Validity can be compartmentalized into internal and external ... tremendous expense? The AEGD-1 year program is used extensively as a recruiting tool to encourage senior dental students to join the Army Dental Corps. ...

  3. Role of optimization in the human dynamics of task execution

    NASA Astrophysics Data System (ADS)

    Cajueiro, Daniel O.; Maldonado, Wilfredo L.

    2008-03-01

    In order to explain the empirical evidence that the dynamics of human activity may not be well modeled by Poisson processes, a model based on queuing processes was built in the literature [A. L. Barabasi, Nature (London) 435, 207 (2005)]. The main assumption behind that model is that people execute their tasks based on a protocol that first executes the high priority item. In this context, the purpose of this paper is to analyze the validity of that hypothesis assuming that people are rational agents that make their decisions in order to minimize the cost of keeping nonexecuted tasks on the list. Therefore, we build and analytically solve a dynamic programming model with two priority types of tasks and show that the validity of this hypothesis depends strongly on the structure of the instantaneous costs that a person has to face if a given task is kept on the list for more than one period. Moreover, one interesting finding is that in one of the situations the protocol used to execute the tasks generates complex one-dimensional dynamics.

  4. Authentication of organic feed by near-infrared spectroscopy combined with chemometrics: a feasibility study.

    PubMed

    Tres, A; van der Veer, G; Perez-Marin, M D; van Ruth, S M; Garrido-Varo, A

    2012-08-22

    Organic products tend to retail at a higher price than their conventional counterparts, which makes them susceptible to fraud. In this study we evaluate the application of near-infrared spectroscopy (NIRS) as a rapid, cost-effective method to verify the organic identity of feed for laying hens. For this purpose a total of 36 organic and 60 conventional feed samples from The Netherlands were measured by NIRS. A binary classification model (organic vs conventional feed) was developed using partial least squares discriminant analysis. Models were developed using five different data preprocessing techniques, which were externally validated by a stratified random resampling strategy using 1000 realizations. Spectral regions related to the protein and fat content were among the most important ones for the classification model. The models based on data preprocessed using direct orthogonal signal correction (DOSC), standard normal variate (SNV), and first and second derivatives provided the most successful results in terms of median sensitivity (0.91 in external validation) and median specificity (1.00 for external validation of SNV models and 0.94 for DOSC and first and second derivative models). A previously developed model, which was based on fatty acid fingerprinting of the same set of feed samples, provided a higher sensitivity (1.00). This shows that the NIRS-based approach provides a rapid and low-cost screening tool, whereas the fatty acid fingerprinting model can be used for further confirmation of the organic identity of feed samples for laying hens. These methods provide additional assurance to the administrative controls currently conducted in the organic feed sector.
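    A bare-bones PLS-DA classifier of the kind used here can be sketched with scikit-learn as below; the number of latent variables and the 0.5 cut-off are arbitrary choices, and the study's spectral preprocessing steps (SNV, derivatives, DOSC) are omitted.

```python
# Minimal PLS-DA sketch for a binary (organic vs conventional) classification of NIR spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def plsda_fit_predict(X_train, y_train, X_test, n_components=5):
    """X_* are spectra (samples x wavelengths); y_train is 0/1 class labels."""
    pls = PLSRegression(n_components=n_components)
    pls.fit(X_train, y_train.astype(float))        # regress the class indicator on the spectra
    scores = pls.predict(X_test).ravel()
    return (scores > 0.5).astype(int)               # simple cut-off on the predicted class score
```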

  5. Classification and regression tree (CART) model to predict pulmonary tuberculosis in hospitalized patients.

    PubMed

    Aguiar, Fabio S; Almeida, Luciana L; Ruffino-Netto, Antonio; Kritski, Afranio Lineu; Mello, Fernanda Cq; Werneck, Guilherme L

    2012-08-07

Tuberculosis (TB) remains a public health issue worldwide. The lack of specific clinical symptoms to diagnose TB makes the correct decision to admit patients to respiratory isolation a difficult task for the clinician. Isolation of patients without the disease is common and increases health costs. Decision models for the diagnosis of TB in patients attending hospitals can increase the quality of care and decrease costs, without the risk of hospital transmission. We present a model for predicting pulmonary TB in hospitalized patients in a high-prevalence area, in order to contribute to a more rational use of isolation rooms without increasing the risk of transmission. Cross-sectional study of patients admitted to CFFH from March 2003 to December 2004. A classification and regression tree (CART) model was generated and validated. The area under the ROC curve (AUC), sensitivity, specificity, and positive and negative predictive values were used to evaluate the performance of the model. Validation of the model was performed with a different sample of patients admitted to the same hospital from January to December 2005. We studied 290 patients admitted with clinical suspicion of TB. Diagnosis was confirmed in 26.5% of them. Pulmonary TB was present in 83.7% of the patients with TB (62.3% with positive sputum smear) and HIV/AIDS was present in 56.9% of patients. The validated CART model showed sensitivity, specificity, positive predictive value, and negative predictive value of 60.00%, 76.16%, 33.33%, and 90.55%, respectively. The AUC was 79.70%. The CART model developed for these hospitalized patients with clinical suspicion of TB had fair to good predictive performance for pulmonary TB. The most important variable for predicting the TB diagnosis was the chest radiograph result. Prospective validation is still necessary, but our model offers an alternative for deciding whether to isolate patients with clinical suspicion of TB in tertiary health facilities in countries with limited resources.
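    The validation metrics reported above (sensitivity, specificity, PPV, NPV, AUC) can be computed in a few lines of scikit-learn, as sketched below; the tree settings and data interface are placeholders rather than the study's actual variables.

```python
# Fit a CART-style classifier on a derivation sample and score it on an external validation sample.
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score, confusion_matrix

def fit_and_validate(X_train, y_train, X_valid, y_valid):
    cart = DecisionTreeClassifier(max_depth=4, min_samples_leaf=20, random_state=0)
    cart.fit(X_train, y_train)
    proba = cart.predict_proba(X_valid)[:, 1]
    pred = cart.predict(X_valid)
    tn, fp, fn, tp = confusion_matrix(y_valid, pred).ravel()
    return {
        "AUC": roc_auc_score(y_valid, proba),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }
```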

  6. Comparing the performance of meta-classifiers—a case study on selected imbalanced data sets relevant for prediction of liver toxicity

    NASA Astrophysics Data System (ADS)

    Jain, Sankalp; Kotsampasakou, Eleni; Ecker, Gerhard F.

    2018-05-01

    Cheminformatics datasets used in classification problems, especially those related to biological or physicochemical properties, are often imbalanced. This presents a major challenge in development of in silico prediction models, as the traditional machine learning algorithms are known to work best on balanced datasets. The class imbalance introduces a bias in the performance of these algorithms due to their preference towards the majority class. Here, we present a comparison of the performance of seven different meta-classifiers for their ability to handle imbalanced datasets, whereby Random Forest is used as base-classifier. Four different datasets that are directly (cholestasis) or indirectly (via inhibition of organic anion transporting polypeptide 1B1 and 1B3) related to liver toxicity were chosen for this purpose. The imbalance ratio in these datasets ranges between 4:1 and 20:1 for negative and positive classes, respectively. Three different sets of molecular descriptors for model development were used, and their performance was assessed in 10-fold cross-validation and on an independent validation set. Stratified bagging, MetaCost and CostSensitiveClassifier were found to be the best performing among all the methods. While MetaCost and CostSensitiveClassifier provided better sensitivity values, Stratified Bagging resulted in high balanced accuracies.
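    A simple undersampling-based variant of the stratified bagging idea, with Random Forest as base classifier, is sketched below. It is an illustration only; the authors' actual stratified bagging, MetaCost, and CostSensitiveClassifier implementations (Weka meta-classifiers) are not reproduced here.

```python
# Balanced bagging: each bag pairs all minority-class samples (label 1) with an equal-size
# random draw from the majority class (label 0); Random Forest is the base classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def balanced_bagging_predict_proba(X_train, y_train, X_test, n_bags=10, seed=0):
    rng = np.random.default_rng(seed)
    minority = np.flatnonzero(y_train == 1)
    majority = np.flatnonzero(y_train == 0)
    probs = []
    for _ in range(n_bags):
        maj_sample = rng.choice(majority, size=len(minority), replace=False)
        idx = np.concatenate([minority, maj_sample])
        clf = RandomForestClassifier(n_estimators=100,
                                     random_state=int(rng.integers(1_000_000)))
        clf.fit(X_train[idx], y_train[idx])
        probs.append(clf.predict_proba(X_test)[:, 1])
    return np.mean(probs, axis=0)   # averaged minority-class probability across bags
```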

  7. Comparing the performance of meta-classifiers—a case study on selected imbalanced data sets relevant for prediction of liver toxicity

    NASA Astrophysics Data System (ADS)

    Jain, Sankalp; Kotsampasakou, Eleni; Ecker, Gerhard F.

    2018-04-01

    Cheminformatics datasets used in classification problems, especially those related to biological or physicochemical properties, are often imbalanced. This presents a major challenge in development of in silico prediction models, as the traditional machine learning algorithms are known to work best on balanced datasets. The class imbalance introduces a bias in the performance of these algorithms due to their preference towards the majority class. Here, we present a comparison of the performance of seven different meta-classifiers for their ability to handle imbalanced datasets, whereby Random Forest is used as base-classifier. Four different datasets that are directly (cholestasis) or indirectly (via inhibition of organic anion transporting polypeptide 1B1 and 1B3) related to liver toxicity were chosen for this purpose. The imbalance ratio in these datasets ranges between 4:1 and 20:1 for negative and positive classes, respectively. Three different sets of molecular descriptors for model development were used, and their performance was assessed in 10-fold cross-validation and on an independent validation set. Stratified bagging, MetaCost and CostSensitiveClassifier were found to be the best performing among all the methods. While MetaCost and CostSensitiveClassifier provided better sensitivity values, Stratified Bagging resulted in high balanced accuracies.

  8. Locomotive crashworthiness research : volume 1 : model development and validation

    DOT National Transportation Integrated Search

    1995-07-01

    This report is the first of four volumes concerning a study to investigate the costs and benefits of equipping locomotives with various crashworthiness features beyond those currently specified by the Association of American Railroads S-580 specifica...

  9. External Validation Study of First Trimester Obstetric Prediction Models (Expect Study I): Research Protocol and Population Characteristics.

    PubMed

    Meertens, Linda Jacqueline Elisabeth; Scheepers, Hubertina Cj; De Vries, Raymond G; Dirksen, Carmen D; Korstjens, Irene; Mulder, Antonius Lm; Nieuwenhuijze, Marianne J; Nijhuis, Jan G; Spaanderman, Marc Ea; Smits, Luc Jm

    2017-10-26

    A number of first-trimester prediction models addressing important obstetric outcomes have been published. However, most models have not been externally validated. External validation is essential before implementing a prediction model in clinical practice. The objective of this paper is to describe the design of a study to externally validate existing first trimester obstetric prediction models, based upon maternal characteristics and standard measurements (eg, blood pressure), for the risk of pre-eclampsia (PE), gestational diabetes mellitus (GDM), spontaneous preterm birth (PTB), small-for-gestational-age (SGA) infants, and large-for-gestational-age (LGA) infants among Dutch pregnant women (Expect Study I). The results of a pilot study on the feasibility and acceptability of the recruitment process and the comprehensibility of the Pregnancy Questionnaire 1 are also reported. A multicenter prospective cohort study was performed in The Netherlands between July 1, 2013 and December 31, 2015. First trimester obstetric prediction models were systematically selected from the literature. Predictor variables were measured by the Web-based Pregnancy Questionnaire 1 and pregnancy outcomes were established using the Postpartum Questionnaire 1 and medical records. Information about maternal health-related quality of life, costs, and satisfaction with Dutch obstetric care was collected from a subsample of women. A pilot study was carried out before the official start of inclusion. External validity of the models will be evaluated by assessing discrimination and calibration. Based on the pilot study, minor improvements were made to the recruitment process and online Pregnancy Questionnaire 1. The validation cohort consists of 2614 women. Data analysis of the external validation study is in progress. This study will offer insight into the generalizability of existing, non-invasive first trimester prediction models for various obstetric outcomes in a Dutch obstetric population. An impact study for the evaluation of the best obstetric prediction models in the Dutch setting with respect to their effect on clinical outcomes, costs, and quality of life-Expect Study II-is being planned. Netherlands Trial Registry (NTR): NTR4143; http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=4143 (Archived by WebCite at http://www.webcitation.org/6t8ijtpd9). ©Linda Jacqueline Elisabeth Meertens, Hubertina CJ Scheepers, Raymond G De Vries, Carmen D Dirksen, Irene Korstjens, Antonius LM Mulder, Marianne J Nieuwenhuijze, Jan G Nijhuis, Marc EA Spaanderman, Luc JM Smits. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 26.10.2017.

  10. Risk prediction model: Statistical and artificial neural network approach

    NASA Astrophysics Data System (ADS)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand suitable approaches and the development and validation process of such models. A qualitative review of the aims, methods, and main outcomes of nineteen published articles that developed risk prediction models in numerous fields was carried out. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, artificial neural network approaches to developing prediction models were more accurate than statistical approaches. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.
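    For a concrete feel of the comparison discussed here, the snippet below cross-validates a logistic regression (a statistical approach) against a small neural network on the same data; the dataset, network size, and scoring choice are placeholders, not drawn from the reviewed studies.

```python
# Compare a statistical model and a small ANN with the same cross-validation protocol.
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def compare_models(X, y, cv=5):
    models = {
        "logistic_regression": make_pipeline(StandardScaler(),
                                             LogisticRegression(max_iter=1000)),
        "neural_network": make_pipeline(StandardScaler(),
                                        MLPClassifier(hidden_layer_sizes=(16,),
                                                      max_iter=2000, random_state=0)),
    }
    # Mean cross-validated AUROC per model; a fair comparison uses identical folds and scoring.
    return {name: cross_val_score(m, X, y, cv=cv, scoring="roc_auc").mean()
            for name, m in models.items()}
```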

  11. Unifying Speed-Accuracy Trade-Off and Cost-Benefit Trade-Off in Human Reaching Movements.

    PubMed

    Peternel, Luka; Sigaud, Olivier; Babič, Jan

    2017-01-01

    Two basic trade-offs interact while our brain decides how to move our body. First, with the cost-benefit trade-off, the brain trades between the importance of moving faster toward a target that is more rewarding and the increased muscular cost resulting from a faster movement. Second, with the speed-accuracy trade-off, the brain trades between how accurate the movement needs to be and the time it takes to achieve such accuracy. So far, these two trade-offs have been well studied in isolation, despite their obvious interdependence. To overcome this limitation, we propose a new model that is able to simultaneously account for both trade-offs. The model assumes that the central nervous system maximizes the expected utility resulting from the potential reward and the cost over the repetition of many movements, taking into account the probability to miss the target. The resulting model is able to account for both the speed-accuracy and the cost-benefit trade-offs. To validate the proposed hypothesis, we confront the properties of the computational model to data from an experimental study where subjects have to reach for targets by performing arm movements in a horizontal plane. The results qualitatively show that the proposed model successfully accounts for both cost-benefit and speed-accuracy trade-offs.
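    In the spirit of the model described (without claiming its exact form), the expected utility of a single reach can be written schematically as below, where T is movement duration, a the aim/accuracy variable, P_hit the probability of hitting the target, R the reward, C_effort the muscular cost of the movement, and T_rest the inter-trial interval over which utility is averaged; all symbols are illustrative notation.

```latex
\mathbb{E}\!\left[U(T, a)\right] \;=\;
\frac{P_{\mathrm{hit}}(T, a)\, R \;-\; C_{\mathrm{effort}}(T, a)}{T + T_{\mathrm{rest}}}
```

    Maximizing this rate of utility couples the two trade-offs: slowing down raises P_hit and lowers effort (speed-accuracy), but delays and dilutes the reward (cost-benefit).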

  12. SEM-PLS Analysis of Inhibiting Factors of Cost Performance for Large Construction Projects in Malaysia: Perspective of Clients and Consultants

    PubMed Central

    Memon, Aftab Hameed; Rahman, Ismail Abdul

    2014-01-01

This study uncovered inhibiting factors to cost performance in large construction projects of Malaysia. Questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped in 7 categories were presented to the respondents for rating significant level of each factor. A total of 300 questionnaire forms were distributed. Only 144 completed sets were received and analysed using advanced multivariate statistical software of Structural Equation Modelling (SmartPLS v2). The analysis involved three iteration processes where several of the factors were deleted in order to make the model acceptable. The result of the analysis found that R² value of the model is 0.422 which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, contractor's site management category is the most prominent in exhibiting effect on cost performance of large construction projects. This finding is validated using advanced techniques of power analysis. This vigorous multivariate analysis has explicitly found the significant category which consists of several causative factors to poor cost performance in large construction projects. This will benefit all parties involved in construction projects for controlling cost overrun. PMID:24693227

  13. SEM-PLS analysis of inhibiting factors of cost performance for large construction projects in Malaysia: perspective of clients and consultants.

    PubMed

    Memon, Aftab Hameed; Rahman, Ismail Abdul

    2014-01-01

This study uncovered inhibiting factors to cost performance in large construction projects of Malaysia. Questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped in 7 categories were presented to the respondents for rating significant level of each factor. A total of 300 questionnaire forms were distributed. Only 144 completed sets were received and analysed using advanced multivariate statistical software of Structural Equation Modelling (SmartPLS v2). The analysis involved three iteration processes where several of the factors were deleted in order to make the model acceptable. The result of the analysis found that R² value of the model is 0.422 which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, contractor's site management category is the most prominent in exhibiting effect on cost performance of large construction projects. This finding is validated using advanced techniques of power analysis. This vigorous multivariate analysis has explicitly found the significant category which consists of several causative factors to poor cost performance in large construction projects. This will benefit all parties involved in construction projects for controlling cost overrun.

  14. Cost-effectiveness of enhanced recovery in hip and knee replacement: a systematic review protocol.

    PubMed

    Murphy, Jacqueline; Pritchard, Mark G; Cheng, Lok Yin; Janarthanan, Roshni; Leal, José

    2018-03-14

    Hip and knee replacement represents a significant burden to the UK healthcare system. 'Enhanced recovery' pathways have been introduced in the National Health Service (NHS) for patients undergoing hip and knee replacement, with the aim of improving outcomes and timely recovery after surgery. To support policymaking, there is a need to evaluate the cost-effectiveness of enhanced recovery pathways across jurisdictions. Our aim is to systematically summarise the published cost-effectiveness evidence on enhanced recovery in hip and knee replacement, both as a whole and for each of the various components of enhanced recovery pathways. A systematic review will be conducted using MEDLINE, EMBASE, Econlit and the National Health Service Economic Evaluations Database. Separate search strategies were developed for each database including terms relating to hip and knee replacement/arthroplasty, economic evaluations, decision modelling and quality of life measures.We will extract peer-reviewed studies published between 2000 and 2017 reporting economic evaluations of preoperative, perioperative or postoperative enhanced recovery interventions within hip or knee replacement. Economic evaluations alongside cohort studies or based on decision models will be included. Only studies with patients undergoing elective replacement surgery of the hip or knee will be included. Data will be extracted using a predefined pro forma following best practice guidelines for economic evaluation, decision modelling and model validation.Our primary outcome will be the cost-effectiveness of enhanced recovery (entire pathway and individual components) in terms of incremental cost per quality-adjusted life year. A narrative synthesis of all studies will be presented, focussing on cost-effectiveness results, study design, quality and validation status. This systematic review is exempted from ethics approval because the work is carried out on published documents. The results of the review will be disseminated in a peer-reviewed academic journal and at conferences. CRD42017059473. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  15. Cost-effectiveness of enhanced recovery in hip and knee replacement: a systematic review protocol

    PubMed Central

    Pritchard, Mark G; Cheng, Lok Yin; Janarthanan, Roshni

    2018-01-01

    Introduction Hip and knee replacement represents a significant burden to the UK healthcare system. ‘Enhanced recovery’ pathways have been introduced in the National Health Service (NHS) for patients undergoing hip and knee replacement, with the aim of improving outcomes and timely recovery after surgery. To support policymaking, there is a need to evaluate the cost-effectiveness of enhanced recovery pathways across jurisdictions. Our aim is to systematically summarise the published cost-effectiveness evidence on enhanced recovery in hip and knee replacement, both as a whole and for each of the various components of enhanced recovery pathways. Methods and analysis A systematic review will be conducted using MEDLINE, EMBASE, Econlit and the National Health Service Economic Evaluations Database. Separate search strategies were developed for each database including terms relating to hip and knee replacement/arthroplasty, economic evaluations, decision modelling and quality of life measures. We will extract peer-reviewed studies published between 2000 and 2017 reporting economic evaluations of preoperative, perioperative or postoperative enhanced recovery interventions within hip or knee replacement. Economic evaluations alongside cohort studies or based on decision models will be included. Only studies with patients undergoing elective replacement surgery of the hip or knee will be included. Data will be extracted using a predefined pro forma following best practice guidelines for economic evaluation, decision modelling and model validation. Our primary outcome will be the cost-effectiveness of enhanced recovery (entire pathway and individual components) in terms of incremental cost per quality-adjusted life year. A narrative synthesis of all studies will be presented, focussing on cost-effectiveness results, study design, quality and validation status. Ethics and dissemination This systematic review is exempted from ethics approval because the work is carried out on published documents. The results of the review will be disseminated in a peer-reviewed academic journal and at conferences. PROSPERO registration number CRD42017059473. PMID:29540418

  16. Developing a Universal Navy Uniform Adoption Model for Use in Forecasting

    DTIC Science & Technology

    2015-12-01

... manpower, and allowance data in order to build the model. Once chosen, the best candidate model will be validated against alternate sales data from a ... inventory shortage or excess inventory holding costs caused by overestimation. Subject terms: demand management, demand forecasting, Defense ... software will be used to identify relationships between uniform sales, time, manpower, and allowance data in order to build the model.

  17. Concept design theory and model for multi-use space facilities: Analysis of key system design parameters through variance of mission requirements

    NASA Astrophysics Data System (ADS)

    Reynerson, Charles Martin

    This research has been performed to create concept design and economic feasibility data for space business parks. A space business park is a commercially run multi-use space station facility designed for use by a wide variety of customers. Both space hardware and crew are considered as revenue producing payloads. Examples of commercial markets may include biological and materials research, processing, and production, space tourism habitats, and satellite maintenance and resupply depots. This research develops a design methodology and an analytical tool to create feasible preliminary design information for space business parks. The design tool is validated against a number of real facility designs. Appropriate model variables are adjusted to ensure that statistical approximations are valid for subsequent analyses. The tool is used to analyze the effect of various payload requirements on the size, weight and power of the facility. The approach for the analytical tool was to input potential payloads as simple requirements, such as volume, weight, power, crew size, and endurance. In creating the theory, basic principles are used and combined with parametric estimation of data when necessary. Key system parameters are identified for overall system design. Typical ranges for these key parameters are identified based on real human spaceflight systems. To connect the economics to design, a life-cycle cost model is created based upon facility mass. This rough cost model estimates potential return on investments, initial investment requirements and number of years to return on the initial investment. Example cases are analyzed for both performance and cost driven requirements for space hotels, microgravity processing facilities, and multi-use facilities. In combining both engineering and economic models, a design-to-cost methodology is created for more accurately estimating the commercial viability for multiple space business park markets.
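    A toy version of a mass-driven life-cycle cost and payback calculation, in the spirit described above, might look like the sketch below. Every coefficient is a made-up placeholder for illustration, not a value calibrated in this research.

```python
# Payback-period sketch for a mass-driven facility cost model; all rates are hypothetical placeholders.
def payback_years(facility_mass_kg,
                  cost_per_kg_development=150_000,   # placeholder $/kg development cost
                  cost_per_kg_launch=20_000,         # placeholder $/kg launch cost
                  annual_ops_cost=50e6,              # placeholder $/yr operations cost
                  annual_revenue=300e6):             # placeholder $/yr revenue from payloads and crew
    """Years to recover the initial investment, given a mass-proportional initial cost."""
    initial_investment = facility_mass_kg * (cost_per_kg_development + cost_per_kg_launch)
    net_annual = annual_revenue - annual_ops_cost
    if net_annual <= 0:
        return float("inf")                          # never pays back under these assumptions
    return initial_investment / net_annual

print(payback_years(100_000))   # e.g. a 100-tonne facility under the placeholder rates
```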

  18. The development of an integrated Indonesian health care model using Kano's model, quality function deployment and balanced scorecard

    NASA Astrophysics Data System (ADS)

    Jonny, Zagloed, Teuku Yuri M.

    2017-11-01

    This paper aims to present an integrated health care model for the Indonesian health care industry. Previous research identifies two models in the industry: disease-centered and patient-centered care. In practice, the patient-centered care model is widely applied due to its capability to reduce cost and improve quality simultaneously. However, there is still no comprehensive model that delivers cost reduction, quality improvement, patient satisfaction and hospital profitability simultaneously. This research is therefore intended to develop such a model. First, a conceptual model using Kano's Model, Quality Function Deployment (QFD) and Balanced Scorecard (BSC) is developed to generate the important elements of the model as required by stakeholders. Then, a case study of an Indonesian hospital is presented to evaluate the validity of the model using correlation analysis. The results support the model and imply several managerial insights among its elements: 1) leadership (r=0.85) and context of the organization (r=0.77) improve operations; 2) planning (r=0.96), support processes (r=0.87) and continual improvement (r=0.95) also improve operations; 3) operations improve customer satisfaction (r=0.89) and financial performance (r=0.93); and 4) customer satisfaction improves financial performance (r=0.98).

  19. A Patient Risk Model of Chemotherapy-Induced Febrile Neutropenia: Lessons Learned From the ANC Study Group.

    PubMed

    Lyman, Gary H; Poniewierski, Marek S

    2017-12-01

    Neutropenia and its complications, including febrile neutropenia (FN), represent major toxicities associated with cancer chemotherapy, resulting in considerable morbidity, mortality, and costs. The myeloid growth factors such as granulocyte colony-stimulating factor (G-CSF) have been shown to reduce the risk of neutropenia complications while enabling safe and effective chemotherapy dose intensity. Concerns about the high costs of these agents, along with limited physician adherence to clinical practice guidelines resulting in both overuse and underuse, have stimulated interest in models for individual patient risk assessment to guide appropriate use of G-CSF. In a model developed and validated by the ANC Study Group, half of patients were classified as high risk and half as low risk based on patient-, disease-, and treatment-related factors. This model has been further validated in an independent patient population. Physician-assessed risk of FN, as well as the decision to use prophylactic CSF, has been shown to correlate poorly with the FN risk estimated by the model. Additional modeling efforts in both adults and children receiving cancer treatment have been reported. Identification of patients at a high individual risk for FN and its consequences may offer the potential for optimal chemotherapy delivery and patient outcomes. Likewise, identification of patients at low risk for neutropenic events may reduce costs when such supportive care is not warranted. This article reviews and summarizes FN modeling studies and the opportunities for personalizing supportive care in patients receiving chemotherapy. Copyright © 2017 by the National Comprehensive Cancer Network.

  20. Multi-scale soil moisture model calibration and validation: An ARS Watershed on the South Fork of the Iowa River

    USDA-ARS?s Scientific Manuscript database

    Soil moisture monitoring with in situ technology is a time consuming and costly endeavor for which a method of increasing the resolution of spatial estimates across in situ networks is necessary. Using a simple hydrologic model, the resolution of an in situ watershed network can be increased beyond...

  1. Validation of an internal hardwood log defect prediction model

    Treesearch

    R. Edward Thomas

    2011-01-01

    The type, size, and location of internal defects dictate the grade and value of lumber sawn from hardwood logs. However, acquiring internal defect knowledge with x-ray/computed-tomography or magnetic-resonance imaging technology can be expensive both in time and cost. An alternative approach uses prediction models based on correlations among external defect indicators...

  2. The Launch Systems Operations Cost Model

    NASA Technical Reports Server (NTRS)

    Prince, Frank A.; Hamaker, Joseph W. (Technical Monitor)

    2001-01-01

    One of NASA's primary missions is to reduce the cost of access to space while simultaneously increasing safety. A key component, and one of the least understood, is the recurring operations and support cost for reusable launch systems. In order to predict these costs, NASA, under the leadership of the Independent Program Assessment Office (IPAO), has commissioned the development of a Launch Systems Operations Cost Model (LSOCM). LSOCM is a tool to predict the operations & support (O&S) cost of new and modified reusable (and partially reusable) launch systems. The requirements are to predict the non-recurring cost for the ground infrastructure and the recurring cost of maintaining that infrastructure, performing vehicle logistics, and performing the O&S actions to return the vehicle to flight. In addition, the model must estimate the time required to cycle the vehicle through all of the ground processing activities. The current version of LSOCM is an amalgamation of existing tools, leveraging our understanding of shuttle operations cost with a means of predicting how the maintenance burden will change as the vehicle becomes more aircraft-like. The use of the Conceptual Operations Manpower Estimating Tool/Operations Cost Model (COMET/OCM) provides a solid point of departure based on shuttle and expendable launch vehicle (ELV) experience. The incorporation of the Reliability and Maintainability Analysis Tool (RMAT) as expressed by a set of response surface model equations gives a method for estimating how changing launch system characteristics affects cost and cycle time as compared to today's shuttle system. Plans are being made to improve the model. The development team will be spending the next few months devising a structured methodology that will enable verified and validated algorithms to give accurate cost estimates. To assist in this endeavor the LSOCM team is part of an Agency-wide effort to combine resources with other cost and operations professionals to support models, databases, and operations assessments.

  3. High laboratory cost predicted per tuberculosis case diagnosed with increased case finding without a triage strategy.

    PubMed

    Dunbar, R; Naidoo, P; Beyers, N; Langley, I

    2017-09-01

    SETTING: Cape Town, South Africa. OBJECTIVE: To model the effects of increased case finding and triage strategies on laboratory costs per tuberculosis (TB) case diagnosed. DESIGN: We used a validated operational model and published laboratory cost data. We modelled the effect of varying the proportion with TB among presumptive cases, and of Xpert cartridge price reductions, on the cost per TB case and per additional TB case diagnosed in the Xpert-based vs. smear/culture-based algorithms. RESULTS: In our current scenario (18.3% with TB among presumptive cases), the proportion of cases diagnosed increased by 8.7% (16.7% vs. 15.0%), and the cost per case diagnosed increased by 142% (US$121 vs. US$50). The cost per additional case diagnosed was US$986. This would increase to US$1619 if the proportion with TB among presumptive cases was 10.6%. At 25.9-30.8% TB prevalence among presumptive cases and a 50% reduction in Xpert cartridge price, the cost per TB case diagnosed would range from US$50 to US$59 (comparable to the US$48.77 found in routine practice with smear/culture). CONCLUSION: The operational model illustrates the effect of increased case finding on laboratory costs per TB case diagnosed. Unless triage strategies are identified, the approach will not be sustainable, even if Xpert cartridge prices are reduced.
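
    The cost-per-additional-case logic above can be made explicit with a short calculation (Python). The inputs below are illustrative placeholders taken loosely from the abstract, not the study's full cost data, so the result does not reproduce the reported US$986 figure; the point is only to show how the incremental laboratory cost per additional TB case diagnosed is derived.

        def cost_per_additional_case(n_presumptive,
                                     prop_diagnosed_new, cost_per_case_new,
                                     prop_diagnosed_old, cost_per_case_old):
            """Incremental laboratory cost per additional TB case diagnosed (illustrative)."""
            cases_new = n_presumptive * prop_diagnosed_new
            cases_old = n_presumptive * prop_diagnosed_old
            total_cost_new = cases_new * cost_per_case_new
            total_cost_old = cases_old * cost_per_case_old
            return (total_cost_new - total_cost_old) / (cases_new - cases_old)

        # Hypothetical example: per 1000 presumptive TB cases, an Xpert-based algorithm
        # diagnoses 16.7% at $121/case vs. 15.0% at $50/case for smear/culture.
        print(round(cost_per_additional_case(1000, 0.167, 121, 0.150, 50)))  # ~747 under these placeholder inputs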

  4. Advanced Structural Optimization Under Consideration of Cost Tracking

    NASA Astrophysics Data System (ADS)

    Zell, D.; Link, T.; Bickelmaier, S.; Albinger, J.; Weikert, S.; Cremaschi, F.; Wiegand, A.

    2014-06-01

    In order to improve the design process of launcher configurations in the early development phase, the Multidisciplinary Optimization (MDO) software was developed. The tool combines efficient software tools such as Optimal Design Investigations (ODIN) for structural optimization and the Aerospace Trajectory Optimization Software (ASTOS) for trajectory and vehicle design optimization for a defined payload and mission. The present paper focuses on the integration and validation of ODIN. ODIN enables the user to optimize typical axisymmetric structures by sizing the stiffening designs with respect to strength and stability while minimizing structural mass. In addition, a fully automatic finite element model (FEM) generator module creates ready-to-run FEM models of a complete stage or launcher assembly. Cost tracking and planned future improvements concerning cost optimization are indicated.

  5. Experiences Using Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1996-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  6. Development and validation of a predictive model for 90-day readmission following elective spine surgery.

    PubMed

    Parker, Scott L; Sivaganesan, Ahilan; Chotai, Silky; McGirt, Matthew J; Asher, Anthony L; Devin, Clinton J

    2018-06-15

    OBJECTIVE Hospital readmissions lead to a significant increase in the total cost of care in patients undergoing elective spine surgery. Understanding factors associated with an increased risk of postoperative readmission could facilitate a reduction in such occurrences. The aims of this study were to develop and validate a predictive model for 90-day hospital readmission following elective spine surgery. METHODS All patients undergoing elective spine surgery for degenerative disease were enrolled in a prospective longitudinal registry. All 90-day readmissions were prospectively recorded. For predictive modeling, all covariates were selected by choosing those variables that were significantly associated with readmission and by incorporating other relevant variables based on clinical intuition and the Akaike information criterion. Eighty percent of the sample was randomly selected for model development and 20% for model validation. Multiple logistic regression analysis was performed with Bayesian model averaging (BMA) to model the odds of 90-day readmission. Goodness of fit was assessed via the C-statistic, that is, the area under the receiver operating characteristic curve (AUC), using the training data set. Discrimination (predictive performance) was assessed using the C-statistic, as applied to the 20% validation data set. RESULTS A total of 2803 consecutive patients were enrolled in the registry, and their data were analyzed for this study. Of this cohort, 227 (8.1%) patients were readmitted to the hospital (for any cause) within 90 days postoperatively. Variables significantly associated with an increased risk of readmission were as follows (OR [95% CI]): lumbar surgery 1.8 [1.1-2.8], government-issued insurance 2.0 [1.4-3.0], hypertension 2.1 [1.4-3.3], prior myocardial infarction 2.2 [1.2-3.8], diabetes 2.5 [1.7-3.7], and coagulation disorder 3.1 [1.6-5.8]. These variables, in addition to others determined a priori to be clinically relevant, comprised 32 inputs in the predictive model constructed using BMA. The AUC value for the training data set was 0.77 for model development and 0.76 for model validation. CONCLUSIONS Identification of high-risk patients is feasible with the novel predictive model presented herein. Appropriate allocation of resources to reduce the postoperative incidence of readmission may reduce the readmission rate and the associated health care costs.
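
    A minimal sketch of the modelling workflow described above (Python with scikit-learn): an 80/20 split, a logistic regression on a handful of binary risk factors, and discrimination assessed via the C-statistic (AUC). The synthetic data and the plain logistic regression stand in for the registry data and the Bayesian model averaging actually used in the study.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 2803  # cohort size matching the abstract; the predictors below are synthetic
        X = rng.integers(0, 2, size=(n, 6))  # lumbar surgery, gov't insurance, HTN, prior MI, diabetes, coag. disorder
        logit = -3.0 + X @ np.log([1.8, 2.0, 2.1, 2.2, 2.5, 3.1])  # abstract's odds ratios used as plausible effects
        y = rng.random(n) < 1 / (1 + np.exp(-logit))               # simulated 90-day readmission outcome

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
        model = LogisticRegression().fit(X_train, y_train)         # 80% development sample
        auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])  # 20% validation sample
        print("Validation AUC:", round(auc, 2))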

  7. Medical talent management: a model for physician deployment.

    PubMed

    Brightman, Baird

    2007-01-01

    This article aims to provide a focused, cost-effective method for triaging physicians into appropriate non-clinical roles to benefit both doctors and healthcare organizations. It reviews a validated career-planning process and customizes it for medical talent management. A structured career assessment can differentiate between different physician work styles and direct medical talent into best-fit positions. This allows healthcare organizations to create a more finely tuned career ladder than the familiar "in or out" binary choice. PRACTICAL IMPLICATIONS: Healthcare organizations can invest in cost-effective processes for the optimal utilization of their medical talent. The article provides a new use for a well-validated career assessment and planning system. The actual value of this approach should be studied using best practices in ROI research.

  8. Installed Cost Benchmarks and Deployment Barriers for Residential Solar Photovoltaics with Energy Storage: Q1 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ardani, Kristen; O'Shaughnessy, Eric; Fu, Ran

    2016-12-01

    In this report, we fill a gap in the existing knowledge about PV-plus-storage system costs and value by providing detailed component- and system-level installed cost benchmarks for residential systems. We also examine other barriers to increased deployment of PV-plus-storage systems in the residential sector. The results are meant to help technology manufacturers, installers, and other stakeholders identify cost-reduction opportunities and inform decision makers about regulatory, policy, and market characteristics that impede solar plus storage deployment. In addition, our periodic cost benchmarks will document progress in cost reductions over time. To analyze costs for PV-plus-storage systems deployed in the first quarter of 2016, we adapt the National Renewable Energy Laboratory's component- and system-level cost-modeling methods for standalone PV. In general, we attempt to model best-in-class installation techniques and business operations from an installed-cost perspective. In addition to our original analysis, model development, and review of published literature, we derive inputs for our model and validate our draft results via interviews with industry and subject-matter experts. One challenge to analyzing the costs of PV-plus-storage systems is choosing an appropriate cost metric. Unlike standalone PV, energy storage lacks universally accepted cost metrics, such as dollars per watt of installed capacity and lifetime levelized cost of energy. We explain the difficulty of arriving at a standard approach for reporting storage costs and then provide the rationale for using the total installed costs of a standard PV-plus-storage system as our primary metric, rather than using a system-size-normalized metric.

  9. Mapping the Paediatric Quality of Life Inventory (PedsQL™) Generic Core Scales onto the Child Health Utility Index-9 Dimension (CHU-9D) Score for Economic Evaluation in Children.

    PubMed

    Lambe, Tosin; Frew, Emma; Ives, Natalie J; Woolley, Rebecca L; Cummins, Carole; Brettell, Elizabeth A; Barsoum, Emma N; Webb, Nicholas J A

    2018-04-01

    The Paediatric Quality of Life Inventory (PedsQL™) questionnaire is a widely used, generic instrument designed for measuring health-related quality of life (HRQoL); however, it is not preference-based and therefore not suitable for cost-utility analysis. The Child Health Utility Index-9 Dimension (CHU-9D), however, is a preference-based instrument that has been primarily developed to support cost-utility analysis. This paper presents a method for estimating CHU-9D index scores from responses to the PedsQL™ using data from a randomised controlled trial of prednisolone therapy for treatment of childhood corticosteroid-sensitive nephrotic syndrome. HRQoL data were collected from children at randomisation, week 16, and months 12, 18, 24, 36 and 48. Observations on children aged 5 years and older were pooled across all data collection timepoints and were then randomised into an estimation (n = 279) and validation (n = 284) sample. A number of models were developed using the estimation data before internal validation. The best model was chosen using multi-stage selection criteria. Most of the models developed accurately predicted the CHU-9D mean index score. The best performing model was a generalised linear model (mean absolute error = 0.0408; mean square error = 0.0035). The proportion of index scores deviating from the observed scores by <  0.03 was 53%. The mapping algorithm provides an empirical tool for estimating CHU-9D index scores and for conducting cost-utility analyses within clinical studies that have only collected PedsQL™ data. It is valid for children aged 5 years or older. Caution should be exercised when using this with children younger than 5 years, older adolescents (>  13 years) or patient groups with particularly poor quality of life. 16645249.
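
    A hedged sketch of what such a mapping model can look like in practice (Python with statsmodels). The PedsQL™ scores and CHU-9D index values below are simulated, and the Gaussian GLM is only one plausible specification; the sketch illustrates fitting on an estimation sample, predicting on a validation sample, and reporting mean absolute error and mean squared error.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 563                                  # pooled observations, as in the abstract (279 + 284)
        pedsql = rng.uniform(20, 100, size=n)    # simulated PedsQL total scores (0-100 scale)
        chu9d = np.clip(0.3 + 0.007 * pedsql + rng.normal(0, 0.04, n), 0.2, 1.0)  # simulated utility index

        X = sm.add_constant(pedsql)
        est, val = slice(0, 279), slice(279, n)  # estimation and validation samples
        glm = sm.GLM(chu9d[est], X[est], family=sm.families.Gaussian()).fit()
        pred = glm.predict(X[val])
        print("MAE:", round(np.mean(np.abs(pred - chu9d[val])), 4),
              "MSE:", round(np.mean((pred - chu9d[val]) ** 2), 4))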

  10. Fixed gain and adaptive techniques for rotorcraft vibration control

    NASA Technical Reports Server (NTRS)

    Roy, R. H.; Saberi, H. A.; Walker, R. A.

    1985-01-01

    The results of an analysis effort performed to demonstrate the feasibility of employing approximate dynamical models and frequency-shaped cost functional control law design techniques for helicopter vibration suppression are presented. Both fixed-gain and adaptive control designs based on linear second-order dynamical models were implemented in a detailed Rotor Systems Research Aircraft (RSRA) simulation to validate these active vibration suppression control laws. Approximate models of fuselage flexibility were included in the RSRA simulation in order to more accurately characterize the structural dynamics. The results for both the fixed-gain and adaptive approaches are promising and provide a foundation for pursuing further validation in more extensive simulation studies and in wind tunnel and/or flight tests.

  11. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development and lower operating costs. However, as those system close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing a better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques and concrete experiments at NASA.

  13. Analysis procedures and subjective flight results of a simulator validation and cue fidelity experiment

    NASA Technical Reports Server (NTRS)

    Carr, Peter C.; Mckissick, Burnell T.

    1988-01-01

    A joint experiment to investigate simulator validation and cue fidelity was conducted by the Dryden Flight Research Facility of NASA Ames Research Center (Ames-Dryden) and NASA Langley Research Center. The primary objective was to validate the use of a closed-loop pilot-vehicle mathematical model as an analytical tool for optimizing the tradeoff between simulator fidelity requirements and simulator cost. The validation process includes comparing model predictions with simulation and flight test results to evaluate various hypotheses for differences in motion and visual cues and information transfer. A group of five pilots flew air-to-air tracking maneuvers in the Langley differential maneuvering simulator and visual motion simulator and in an F-14 aircraft at Ames-Dryden. The simulators used motion and visual cueing devices including a g-seat, a helmet loader, wide field-of-view horizon, and a motion base platform.

  14. The Mt. Hood challenge: cross-testing two diabetes simulation models.

    PubMed

    Brown, J B; Palmer, A J; Bisgaard, P; Chan, W; Pedula, K; Russell, A

    2000-11-01

    Starting from identical patients with type 2 diabetes, we compared the 20-year predictions of two computer simulation models, a 1998 version of the IMIB model and version 2.17 of the Global Diabetes Model (GDM). Primary measures of outcome were 20-year cumulative rates of: survival, first (incident) acute myocardial infarction (AMI), first stroke, proliferative diabetic retinopathy (PDR), macro-albuminuria (gross proteinuria, or GPR), and amputation. Standardized test patients were newly diagnosed males aged 45 or 75, with high and low levels of glycated hemoglobin (HbA(1c)), systolic blood pressure (SBP), and serum lipids. Both models generated realistic results and appropriate responses to changes in risk factors. Compared with the GDM, the IMIB model predicted much higher rates of mortality and AMI, and fewer strokes. These differences can be explained by differences in model architecture (Markov vs. microsimulation), different evidence bases for cardiovascular prediction (Framingham Heart Study cohort vs. Kaiser Permanente patients), and isolated versus interdependent prediction of cardiovascular events. Compared with IMIB, GDM predicted much higher lifetime costs, because of lower mortality and the use of a different costing method. It is feasible to cross-validate and explicate dissimilar diabetes simulation models using standardized patients. The wide differences in the model results that we observed demonstrate the need for cross-validation. We propose to hold a second 'Mt Hood Challenge' in 2001 and invite all diabetes modelers to attend.

  15. Cost-effectiveness of diabetes case management for low-income populations.

    PubMed

    Gilmer, Todd P; Roze, Stéphane; Valentine, William J; Emy-Albrecht, Katrina; Ray, Joshua A; Cobden, David; Nicklasson, Lars; Philis-Tsimikas, Athena; Palmer, Andrew J

    2007-10-01

    To evaluate the cost-effectiveness of Project Dulce, a culturally specific diabetes case management and self-management training program, in four cohorts defined by insurance status. Clinical and cost data on 3,893 persons with diabetes participating in Project Dulce were used as inputs into a diabetes simulation model. The Center for Outcomes Research Diabetes Model, a published, peer-reviewed and validated simulation model of diabetes, was used to evaluate life expectancy, quality-adjusted life expectancy (QALY), cumulative incidence of complications and direct medical costs over patient lifetimes (40-year time horizon) from a third-party payer perspective. Cohort characteristics, treatment effects, and case management costs were derived using a difference in difference design comparing data from the Project Dulce program to a cohort of historical controls. Long-term costs were derived from published U.S. sources. Costs and clinical benefits were discounted at 3.0 percent per annum. Sensitivity analyses were performed. Incremental cost-effectiveness ratios of $10,141, $24,584, $44,941, and $69,587 per QALY gained were estimated for Project Dulce participants versus control in the uninsured, County Medical Services, Medi-Cal, and commercial insurance cohorts, respectively. The Project Dulce diabetes case management program was associated with cost-effective improvements in quality-adjusted life expectancy and decreased incidence of diabetes-related complications over patient lifetimes. Diabetes case management may be particularly cost effective for low-income populations.
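
    The cost-effectiveness figures reported above follow from the standard incremental cost-effectiveness ratio. A small sketch (Python) with hypothetical discounted lifetime costs and QALYs shows the calculation; the inputs are placeholders, not values from the study.

        def icer(cost_intervention, cost_control, qaly_intervention, qaly_control):
            """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
            return (cost_intervention - cost_control) / (qaly_intervention - qaly_control)

        # Hypothetical discounted lifetime totals for one cohort (not study data):
        print(f"${icer(52_000, 47_000, 8.63, 8.14):,.0f} per QALY gained")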

  16. A Foraging Cost of Migration for a Partially Migratory Cyprinid Fish

    PubMed Central

    Chapman, Ben B.; Eriksen, Anders; Baktoft, Henrik; Brodersen, Jakob; Nilsson, P. Anders; Hulthen, Kaj; Brönmark, Christer; Hansson, Lars-Anders; Grønkjær, Peter; Skov, Christian

    2013-01-01

    Migration has evolved as a strategy to maximise individual fitness in response to seasonally changing ecological and environmental conditions. However, migration can also incur costs, and quantifying these costs can provide important clues to the ultimate ecological forces that underpin migratory behaviour. A key emerging model to explain migration in many systems posits that migration is driven by seasonal changes to a predation/growth potential (p/g) trade-off that a wide range of animals face. In this study we assess a key assumption of this model for a common cyprinid partial migrant, the roach Rutilus rutilus, which migrates from shallow lakes to streams during winter. By sampling fish from stream and lake habitats in the autumn and spring and measuring their stomach fullness and diet composition, we tested if migrating roach pay a cost of reduced foraging when migrating. Resident fish had fuller stomachs containing more high quality prey items than migrant fish. Hence, we document a feeding cost to migration in roach, which adds additional support for the validity of the p/g model of migration in freshwater systems. PMID:23723967

  17. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena with specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying such phenomena in the laboratory is costly and in most cases impossible. Miniaturising world phenomena within the framework of a model in order to simulate the real phenomena is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of users' growing interest in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and applied to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. Finding appropriate validation techniques for ABM therefore seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  18. Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans

    2015-02-01

    The Department of Energy (DOE) has made significant progress developing simulation tools to predict the behavior of nuclear systems with greater accuracy and of increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Nuclear Energy Knowledge and Validation Center (NEKVaC or the 'Center') will be a resource for industry, DOE Programs, and academia validation efforts.

  19. Projecting manpower to attain quality

    NASA Technical Reports Server (NTRS)

    Rone, K. Y.

    1983-01-01

    In these days of soaring software costs it becomes increasingly important to properly manage a software development project. One element of the management task is the projection and tracking of the manpower required to perform the task. In addition, since the total cost of the task is directly related to the initial quality built into the software, it becomes necessary to project the development manpower in a way that attains that quality. An approach to projecting and tracking manpower with quality in mind is described. The resulting model is useful as a projection tool but must be validated in order to be used as an ongoing software cost engineering tool. A procedure is developed to facilitate the tracking of model projections and actual data to allow the model to be tuned. Finally, since the model must be used in an environment of overlapping development activities on a progression of software elements in development and maintenance, a manpower allocation model is developed for use in a steady-state development/maintenance environment.

  20. A combined model to assess technical and economic consequences of changing conditions and management options for wastewater utilities.

    PubMed

    Giessler, Mathias; Tränckner, Jens

    2018-02-01

    The paper presents a simplified model that quantifies the economic and technical consequences of changing conditions in wastewater systems at the utility level. It was developed from data collected in a survey of stakeholders and ministries that determined the resulting effects and the measures adopted. The model comprises all substantial cost-relevant assets and activities of a typical German wastewater utility. It consists of three modules: i) Sewer, describing the state development of sewer systems; ii) WWTP, covering process parameters of wastewater treatment plants (WWTP); and iii) Cost Accounting, calculating expenses in the cost categories and the resulting charges. The validity and accuracy of the model were verified using historical data from an exemplary wastewater utility. Calculated process and economic parameters show high accuracy compared with measured parameters and actual expenses. The model is therefore proposed to support strategic, process-oriented decision making at the utility level. Copyright © 2017 Elsevier Ltd. All rights reserved.
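
    A structural sketch of the three-module layout described above (Python). The classes and the numbers inside them are hypothetical; the sketch only shows how sewer state development, WWTP process parameters and cost accounting might be wired together to produce a utility-level charge.

        from dataclasses import dataclass

        @dataclass
        class Sewer:
            length_km: float
            rehab_cost_per_km: float = 250_000.0   # hypothetical rehabilitation cost
            rehab_rate: float = 0.01               # fraction of the network rehabilitated per year
            def annual_cost(self) -> float:
                return self.length_km * self.rehab_rate * self.rehab_cost_per_km

        @dataclass
        class WWTP:
            load_m3_per_year: float
            opex_per_m3: float = 0.45              # hypothetical treatment cost
            def annual_cost(self) -> float:
                return self.load_m3_per_year * self.opex_per_m3

        @dataclass
        class CostAccounting:
            modules: list
            def charge_per_m3(self, billed_m3: float) -> float:
                return sum(m.annual_cost() for m in self.modules) / billed_m3

        utility = CostAccounting([Sewer(length_km=400), WWTP(load_m3_per_year=3.0e6)])
        print(round(utility.charge_per_m3(billed_m3=2.5e6), 2), "EUR/m3")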

  1. Information Search in Judgment Tasks: The Effects of Unequal Cue Validity and Cost.

    DTIC Science & Technology

    1984-05-01

    Only fragments of this record's abstract are available. The recoverable fragments refer to the bookbag-and-poker-chip sampling task of Edwards (1965), contrast a regression model of information search with the Bayesian model developed by Edwards (1965) and discussed by Snapper and Peterson (1971), and cite a regression model and preliminary findings published in Organizational Behavior and Human Performance, 1982, 30, 330-350.

  2. A discrete event modelling framework for simulation of long-term outcomes of sequential treatment strategies for ankylosing spondylitis.

    PubMed

    Tran-Duy, An; Boonen, Annelies; van de Laar, Mart A F J; Franke, Angelinus C; Severens, Johan L

    2011-12-01

    To develop a modelling framework which can simulate long-term quality of life, societal costs and cost-effectiveness as affected by sequential drug treatment strategies for ankylosing spondylitis (AS). The discrete event simulation paradigm was selected for model development. Drug efficacy was modelled as changes in disease activity (Bath Ankylosing Spondylitis Disease Activity Index (BASDAI)) and functional status (Bath Ankylosing Spondylitis Functional Index (BASFI)), which were linked to costs and health utility using statistical models fitted on an observational AS cohort. Published clinical data were used to estimate drug efficacy and time to events. Two strategies were compared: (1) five available non-steroidal anti-inflammatory drugs (strategy 1) and (2) the same as strategy 1 plus two tumour necrosis factor α inhibitors (strategy 2). 13,000 patients were followed up individually until death. For probabilistic sensitivity analysis, Monte Carlo simulations were performed with 1000 sets of parameters sampled from the appropriate probability distributions. The model successfully generated valid data on treatments, BASDAI, BASFI, utility, quality-adjusted life years (QALYs) and costs at time points with intervals of 1-3 months over the simulation length of 70 years. The incremental cost per QALY gained with strategy 2 compared with strategy 1 was €35,186. At a willingness-to-pay threshold of €80,000, it was 99.9% certain that strategy 2 was cost-effective. The modelling framework provides great flexibility to implement complex algorithms representing treatment selection, disease progression and changes in costs and utilities over time for patients with AS. Results obtained from the simulation are plausible.
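
    A minimal discrete event simulation skeleton in the spirit of the framework above (Python). The event times, response probabilities and costs are hypothetical; the sketch only illustrates the core loop of simulating individual patients through a treatment sequence while accumulating QALYs and costs.

        import random

        def simulate_patient(treatment_sequence, horizon_years=70.0, seed=None):
            """Walk one patient through a sequence of drugs until the horizon (illustrative)."""
            rng = random.Random(seed)
            t, qalys, costs = 0.0, 0.0, 0.0
            for drug in treatment_sequence:
                time_on_drug = min(rng.expovariate(1.0 / drug["mean_years_on_drug"]), horizon_years - t)
                utility = drug["utility_responder"] if rng.random() < drug["response_prob"] else 0.5
                qalys += utility * time_on_drug
                costs += drug["annual_cost"] * time_on_drug
                t += time_on_drug
                if t >= horizon_years:
                    break
            return qalys, costs

        nsaid = {"mean_years_on_drug": 2.0, "response_prob": 0.4, "utility_responder": 0.7, "annual_cost": 500}
        tnf_inhibitor = {"mean_years_on_drug": 8.0, "response_prob": 0.6, "utility_responder": 0.8, "annual_cost": 12_000}
        results = [simulate_patient([nsaid, nsaid, tnf_inhibitor], seed=i) for i in range(13_000)]
        print("Mean QALYs:", round(sum(q for q, _ in results) / len(results), 2))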

  3. A Test of the Validity of Inviscid Wall-Modeled LES

    NASA Astrophysics Data System (ADS)

    Redman, Andrew; Craft, Kyle; Aikens, Kurt

    2015-11-01

    Computational expense is one of the main deterrents to more widespread use of large eddy simulations (LES). As such, it is important to reduce computational costs whenever possible. In this vein, it may be reasonable to assume that high Reynolds number flows with turbulent boundary layers are inviscid when using a wall model. This assumption relies on the grid being too coarse to resolve either the viscous length scales in the outer flow or those near walls. We are not aware of other studies that have suggested or examined the validity of this approach. The inviscid wall-modeled LES assumption is tested here for supersonic flow over a flat plate on three different grids. Inviscid and viscous results are compared to those of another wall-modeled LES as well as experimental data - the results appear promising. Furthermore, the inviscid assumption reduces simulation costs by about 25% and 39% for supersonic and subsonic flows, respectively, with the current LES application. Recommendations are presented as are future areas of research. This work used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1053575. Computational resources on TACC Stampede were provided under XSEDE allocation ENG150001.

  4. Averting HIV Infections in New York City: A Modeling Approach Estimating the Future Impact of Additional Behavioral and Biomedical HIV Prevention Strategies

    PubMed Central

    Kessler, Jason; Myers, Julie E.; Nucifora, Kimberly A.; Mensah, Nana; Kowalski, Alexis; Sweeney, Monica; Toohey, Christopher; Khademi, Amin; Shepard, Colin; Cutler, Blayne; Braithwaite, R. Scott

    2013-01-01

    Background New York City (NYC) remains an epicenter of the HIV epidemic in the United States. Given the variety of evidence-based HIV prevention strategies available and the significant resources required to implement each of them, comparative studies are needed to identify how to maximize the number of HIV cases prevented most economically. Methods A new model of HIV disease transmission was developed integrating information from a previously validated micro-simulation HIV disease progression model. Specification and parameterization of the model and its inputs, including the intervention portfolio, intervention effects and costs were conducted through a collaborative process between the academic modeling team and the NYC Department of Health and Mental Hygiene. The model projects the impact of different prevention strategies, or portfolios of prevention strategies, on the HIV epidemic in NYC. Results Ten unique interventions were able to provide a prevention benefit at an annual program cost of less than $360,000, the threshold for consideration as a cost-saving intervention (because of offsets by future HIV treatment costs averted). An optimized portfolio of these specific interventions could result in up to a 34% reduction in new HIV infections over the next 20 years. The cost-per-infection averted of the portfolio was estimated to be $106,378; the total cost was in excess of $2 billion (over the 20 year period, or approximately $100 million per year, on average). The cost-savings of prevented infections was estimated at more than $5 billion (or approximately $250 million per year, on average). Conclusions Optimal implementation of a portfolio of evidence-based interventions can have a substantial, favorable impact on the ongoing HIV epidemic in NYC and provide future cost-saving despite significant initial costs. PMID:24058465

  5. Risk-adjusted econometric model to estimate postoperative costs: an additional instrument for monitoring performance after major lung resection.

    PubMed

    Brunelli, Alessandro; Salati, Michele; Refai, Majed; Xiumé, Francesco; Rocco, Gaetano; Sabbatini, Armando

    2007-09-01

    The objectives of this study were to develop a risk-adjusted model to estimate individual postoperative costs after major lung resection and to use it for internal economic audit. Variable and fixed hospital costs were collected for 679 consecutive patients who underwent major lung resection from January 2000 through October 2006 at our unit. Several preoperative variables were used to develop a risk-adjusted econometric model from all patients operated on during the period 2000 through 2003 by a stepwise multiple regression analysis (validated by bootstrap). The model was then used to estimate the postoperative costs in the patients operated on during the 3 subsequent periods (years 2004, 2005, and 2006). Observed and predicted costs were then compared within each period by the Wilcoxon signed rank test. Multiple regression and bootstrap analysis yielded the following model predicting postoperative cost: 11,078 + 1,340.3 × (age > 70 years) + 1,927.8 × (cardiac comorbidity) − 95 × ppoFEV1%. No differences between predicted and observed costs were noted in the first 2 periods analyzed (year 2004, $6188.40 vs $6241.40, P = .3; year 2005, $6308.60 vs $6483.60, P = .4), whereas in the most recent period (2006) observed costs were significantly lower than the predicted ones ($3457.30 vs $6162.70, P < .0001). Greater precision in predicting outcome and costs after therapy may assist clinicians in the optimization of clinical pathways and allocation of resources. Our economic model may be used as a methodologic template for economic audit in our specialty and complement more traditional outcome measures in the assessment of performance.
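
    The fitted equation above can be applied directly; a short sketch (Python) shows the prediction for a hypothetical patient. The coefficients are those reported in the abstract, while the 0/1 indicator coding of the age and comorbidity terms is an assumption.

        def predicted_postop_cost(age_years, cardiac_comorbidity, ppo_fev1_percent):
            """Risk-adjusted postoperative cost from the reported regression (indicator coding assumed)."""
            return (11_078
                    + 1_340.3 * (1 if age_years > 70 else 0)
                    + 1_927.8 * (1 if cardiac_comorbidity else 0)
                    - 95.0 * ppo_fev1_percent)

        # Hypothetical 75-year-old with cardiac comorbidity and a ppoFEV1 of 60%:
        print(round(predicted_postop_cost(75, True, 60)))  # -> 8646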

  6. VALUE - Validating and Integrating Downscaling Methods for Climate Change Research

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose

    2013-04-01

    Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over the last years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches such as dynamical downscaling, statistical downscaling and bias correction approaches have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been brought to life. VALUE is a research network with participants from currently 23 European countries running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in the definition of research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods. The results of this exercise will directly provide end users with important information about the uncertainty of regional climate scenarios, and will furthermore provide the basis for further developing downscaling methods. This presentation will provide background information on VALUE and discuss the identified characteristics and the validation framework.

  7. Engineering innovation to reduce wind power COE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ammerman, Curtt Nelson

    There are enough wind resources in the US to provide 10 times the electric power we currently use; however, wind power accounts for only 2% of our total electricity production. One of the main limitations to wind use is cost. Wind power currently costs 5 to 8 cents per kilowatt-hour, which is more than twice the cost of electricity generated by burning coal. Our Intelligent Wind Turbine LDRD Project is applying LANL's leading-edge engineering expertise in modeling and simulation, experimental validation, and advanced sensing technologies to challenges faced in the design and operation of modern wind turbines.

  8. Techno-economic assessment of novel vanadium redox flow batteries with large-area cells

    NASA Astrophysics Data System (ADS)

    Minke, Christine; Kunz, Ulrich; Turek, Thomas

    2017-09-01

    The vanadium redox flow battery (VRFB) is a promising electrochemical storage system for stationary megawatt-class applications. The currently limited cell area determined by the bipolar plate (BPP) could be enlarged significantly with a novel extruded large-area plate. For the first time a techno-economic assessment of VRFB in a power range of 1 MW-20 MW and energy capacities of up to 160 MWh is presented on the basis of the production cost model of large-area BPP. The economic model is based on the configuration of a 250 kW stack and the overall system including stacks, power electronics, electrolyte and auxiliaries. Final results include a simple function for the calculation of system costs within the above described scope. In addition, the impact of cost reduction potentials for key components (membrane, electrode, BPP, vanadium electrolyte) on stack and system costs is quantified and validated.
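
    The paper's "simple function for the calculation of system costs" is not reproduced in the abstract. The sketch below (Python) only illustrates the general power/energy form such a function can take, with every coefficient a hypothetical placeholder.

        def vrfb_system_cost_eur(power_mw, energy_mwh,
                                 stack_cost_per_kw=900.0,         # hypothetical EUR/kW (stacks incl. BPP)
                                 electrolyte_cost_per_kwh=150.0,  # hypothetical EUR/kWh (vanadium electrolyte)
                                 power_electronics_per_kw=200.0,  # hypothetical EUR/kW
                                 balance_of_plant=500_000.0):     # hypothetical fixed EUR for auxiliaries
            """Illustrative power/energy split of VRFB system costs (not the paper's fitted function)."""
            power_kw, energy_kwh = power_mw * 1_000, energy_mwh * 1_000
            return (power_kw * (stack_cost_per_kw + power_electronics_per_kw)
                    + energy_kwh * electrolyte_cost_per_kwh
                    + balance_of_plant)

        print(f"{vrfb_system_cost_eur(1, 8) / 1e6:.1f} MEUR for a 1 MW / 8 MWh system")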

  9. Modeling river total bed material load discharge using artificial intelligence approaches (based on conceptual inputs)

    NASA Astrophysics Data System (ADS)

    Roushangar, Kiyoumars; Mehrabani, Fatemeh Vojoudi; Shiri, Jalal

    2014-06-01

    This study presents Artificial Intelligence (AI)-based modeling of total bed material load aimed at improving the accuracy of the predictions of traditional models. Gene expression programming (GEP) and adaptive neuro-fuzzy inference system (ANFIS)-based models were developed and validated for the estimations. Sediment data from the Qotur River (northwestern Iran) were used for development and validation of the applied techniques. In order to assess the applied techniques against traditional models, stream power-based and shear stress-based physical models were also applied to the studied case. The results reveal that the developed AI-based models, using a minimum number of dominant factors, give more accurate results than the other applied models. Nonetheless, k-fold testing proved to be a practical but computationally costly technique for completely scanning the applied data and avoiding over-fitting.
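
    The k-fold testing mentioned above is straightforward to express; a minimal sketch (Python with scikit-learn) on synthetic discharge-sediment data shows the procedure. The random forest regressor stands in for the GEP/ANFIS models, which are not part of standard libraries.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import KFold, cross_val_score

        rng = np.random.default_rng(42)
        discharge = rng.uniform(1, 100, size=(200, 1))                    # synthetic flow-related predictor
        bed_load = 0.05 * discharge[:, 0] ** 1.5 + rng.normal(0, 5, 200)  # synthetic total bed material load

        model = RandomForestRegressor(n_estimators=200, random_state=0)
        scores = cross_val_score(model, discharge, bed_load,
                                 cv=KFold(n_splits=5, shuffle=True, random_state=0),
                                 scoring="r2")
        print("Per-fold R2:", np.round(scores, 2), "mean:", round(scores.mean(), 2))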

  10. The role of observational reference data for climate downscaling: Insights from the VALUE COST Action

    NASA Astrophysics Data System (ADS)

    Kotlarski, Sven; Gutiérrez, José M.; Boberg, Fredrik; Bosshard, Thomas; Cardoso, Rita M.; Herrera, Sixto; Maraun, Douglas; Mezghani, Abdelkader; Pagé, Christian; Räty, Olle; Stepanek, Petr; Soares, Pedro M. M.; Szabo, Peter

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of downscaling methods. Such assessments can be expected to crucially depend on the existence of accurate and reliable observational reference data. In dynamical downscaling, observational data can influence model development itself and, later on, model evaluation, parameter calibration and added value assessment. In empirical-statistical downscaling, observations serve as predictand data and directly influence model calibration with corresponding effects on downscaled climate change projections. We here present a comprehensive assessment of the influence of uncertainties in observational reference data and of scale-related issues on several of the above-mentioned aspects. First, temperature and precipitation characteristics as simulated by a set of reanalysis-driven EURO-CORDEX RCM experiments are validated against three different gridded reference data products, namely (1) the EOBS dataset (2) the recently developed EURO4M-MESAN regional re-analysis, and (3) several national high-resolution and quality-controlled gridded datasets that recently became available. The analysis reveals a considerable influence of the choice of the reference data on the evaluation results, especially for precipitation. It is also illustrated how differences between the reference data sets influence the ranking of RCMs according to a comprehensive set of performance measures.

  11. Unsteady Three-Dimensional Simulation of a Shear Coaxial GO2/GH2 Rocket Injector with RANS and Hybrid-RANS-LES/DES Using Flamelet Models

    NASA Technical Reports Server (NTRS)

    Westra, Doug G.; West, Jeffrey S.; Richardson, Brian R.

    2015-01-01

    Historically, the analysis and design of liquid rocket engines (LREs) has relied on full-scale testing and one-dimensional empirical tools. The testing is extremely expensive, and the one-dimensional tools are not designed to capture the highly complex and multi-dimensional features that are inherent to LREs. Recent advances in computational fluid dynamics (CFD) tools have made it possible to predict liquid rocket engine performance and stability, to assess the effect of complex flow features, and to evaluate injector-driven thermal environments, thereby mitigating the cost of testing. Extensive efforts to verify and validate these CFD tools have been conducted to provide confidence for using them during the design cycle. Previous validation efforts have documented comparisons of predicted heat flux thermal environments with test data for a single-element gaseous oxygen (GO2) and gaseous hydrogen (GH2) injector. The most notable was the comprehensive validation effort conducted by Tucker et al. [1], in which a number of different groups modeled a GO2/GH2 single-element configuration by Pal et al. [2]. The tools used for this validation comparison employed a range of algorithms, from both steady and unsteady Reynolds-Averaged Navier-Stokes (U/RANS) calculations, large-eddy simulations (LES), detached eddy simulations (DES), and various combinations. A more recent effort by Thakur et al. [3] focused on using a state-of-the-art CFD simulation tool, Loci/STREAM, on a two-dimensional grid. Loci/STREAM was chosen because it has a unique, very efficient flamelet parameterization of combustion reactions that are too computationally expensive to simulate with conventional finite-rate chemistry calculations. The current effort focuses on further advancement of validation efforts, again using the Loci/STREAM tool with the flamelet parameterization, but this time with a three-dimensional grid. Comparisons to the Pal et al. heat flux data will be made for both RANS and Hybrid RANS-LES/Detached Eddy Simulations (DES). Computational costs will be reported, along with a comparison of accuracy and cost to the much less expensive two-dimensional RANS simulations of the same geometry.

  12. Control Theory based Shape Design for the Incompressible Navier-Stokes Equations

    NASA Astrophysics Data System (ADS)

    Cowles, G.; Martinelli, L.

    2003-12-01

    A design method for shape optimization in incompressible turbulent viscous flow has been developed and validated for inverse design. The gradient information is determined using a control theory based algorithm. With such an approach, the cost of computing the gradient is negligible. An additional adjoint system must be solved which requires the cost of a single steady state flow solution. Thus, this method has an enormous advantage over traditional finite-difference based algorithms. The method of artificial compressibility is utilized to solve both the flow and adjoint systems. An algebraic turbulence model is used to compute the eddy viscosity. The method is validated using several inverse wing design test cases. In each case, the program must modify the shape of the initial wing such that its pressure distribution matches that of the target wing. Results are shown for the inversion of both finite thickness wings as well as zero thickness wings which can be considered a model of yacht sails.

  13. Template-Directed Instrumentation Reduces Cost and Improves Efficiency for Total Knee Arthroplasty: An Economic Decision Analysis and Pilot Study.

    PubMed

    McLawhorn, Alexander S; Carroll, Kaitlin M; Blevins, Jason L; DeNegre, Scott T; Mayman, David J; Jerabek, Seth A

    2015-10-01

    Template-directed instrumentation (TDI) for total knee arthroplasty (TKA) may streamline operating room (OR) workflow and reduce costs by preselecting implants and minimizing instrument tray burden. A decision model simulated the economics of TDI. Sensitivity analyses determined thresholds for model variables to ensure TDI success. A clinical pilot was reviewed. The accuracy of preoperative templates was validated, and 20 consecutive primary TKAs were performed using TDI. The model determined that preoperative component size estimation should be accurate to ±1 implant size for 50% of TKAs to implement TDI. The pilot showed that preoperative template accuracy exceeded 97%. There were statistically significant improvements in OR turnover time and in-room time for TDI compared to an historical cohort of TKAs. TDI reduces costs and improves OR efficiency. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Quantifying tobacco related health care expenditures in the Republic of the Marshall Islands: a case study in determining health costs in a developing US associated island nation.

    PubMed

    Palafox, N A; Ou, A C; Haberle, H; Chen, T H

    2001-01-01

    This case study examines the advantages, disadvantages, and utility of three research methods for measuring the medical costs of tobacco use in the Republic of the Marshall Islands (RMI). The authors used morbidity-based models, models based on the difference in utilization of medical facilities between smokers and non-smokers, and models based on inter-country comparisons. In the RMI, morbidity models would have a propensity to grossly under-estimate the medical costs of tobacco use. Models that measure the difference in medical service utilization between smokers and non-smokers can be confounded by cultural factors and by the level of health care that is provided. The RMI population structure affected the sampling methods. The external validity of the survey instrument may be increased by measuring more parameters with greater precision. Inter-country comparisons may be used to approximate and set upper and lower limits for past medical costs, and may be the only method for determining future health care costs from tobacco use. Determining the medical costs of tobacco use in a US-associated island nation with an under-developed health care infrastructure has not previously been attempted. Significant methodological challenges were encountered. The health, economic, cultural, and research environments in the RMI are unique and require innovative methods to determine the medical costs associated with tobacco use. Direct application of the methodologies used in the United States to determine the medical costs of tobacco use may grossly under-estimate those costs in the RMI. These research challenges can be addressed.

  15. Lessons Learned on Operating and Preparing Operations for a Technology Mission from the Perspective of the Earth Observing-1 Mission

    NASA Technical Reports Server (NTRS)

    Mandl, Dan; Howard, Joseph

    2000-01-01

    The New Millennium Program's first Earth-observing mission (EO-1) is a technology validation mission. It is managed by the NASA Goddard Space Flight Center in Greenbelt, Maryland and is scheduled for launch in the summer of 2000. The purpose of this mission is to flight-validate revolutionary technologies that will contribute to the reduction of cost and increase of capabilities for future land imaging missions. In the EO-1 mission, there are five instrument, five spacecraft, and three supporting technologies to flight-validate during a year of operations. EO-1 operations and the accompanying ground system were intended to be simple in order to maintain low operational costs. For purposes of formulating operations, it was initially modeled as a small science mission. However, it quickly evolved into a more complex mission due to the difficulties in effectively integrating all of the validation plans of the individual technologies. As a consequence, more operational support was required to confidently complete the on-orbit validation of the new technologies. This paper will outline the issues and lessons learned applicable to future technology validation missions. Examples of some of these include the following: (1) operational complexity encountered in integrating all of the validation plans into a coherent operational plan, (2) initial desire to run single shift operations subsequently growing to 6 "around-the-clock" operations, (3) managing changes in the technologies that ultimately affected operations, (4) necessity for better team communications within the project to offset the effects of change on the Ground System Developers, Operations Engineers, Integration and Test Engineers, S/C Subsystem Engineers, and Scientists, and (5) the need for a more experienced Flight Operations Team to achieve the necessary operational flexibility. The discussion will conclude by providing several cost comparisons for developing operations from previous missions to EO-1 and discuss some details that might be done differently for future technology validation missions.

  16. Long-term cost-effectiveness of disease management in systolic heart failure.

    PubMed

    Miller, George; Randolph, Stephen; Forkner, Emma; Smith, Brad; Galbreath, Autumn Dawn

    2009-01-01

    Although congestive heart failure (CHF) is a primary target for disease management programs, previous studies have generated mixed results regarding the effectiveness and cost savings of disease management when applied to CHF. We estimated the long-term impact of systolic heart failure disease management from the results of an 18-month clinical trial. We used data generated from the trial (starting population distributions, resource utilization, mortality rates, and transition probabilities) in a Markov model to project results of continuing the disease management program for the patients' lifetimes. Outputs included distribution of illness severity, mortality, resource consumption, and the cost of resources consumed. Both cost and effectiveness were discounted at a rate of 3% per year. Cost-effectiveness was computed as cost per quality-adjusted life year (QALY) gained. Model results were validated against trial data and indicated that, over their lifetimes, patients experienced a lifespan extension of 51 days. Combined discounted lifetime program and medical costs were $4850 higher in the disease management group than the control group, but the program had a favorable long-term discounted cost-effectiveness of $43,650/QALY. These results are robust to assumptions regarding mortality rates, the impact of aging on the cost of care, the discount rate, utility values, and the targeted population. Estimation of the clinical benefits and financial burden of disease management can be enhanced by model-based analyses to project costs and effectiveness. Our results suggest that disease management of heart failure patients can be cost-effective over the long term.
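
    A minimal Markov cohort sketch of the projection mechanics described in this record: the states, transition probabilities, costs, and utilities below are illustrative assumptions (not the trial's data), but the code shows how per-cycle costs and QALYs are discounted at 3% per year and combined into a cost per QALY gained.

      import numpy as np

      # Illustrative 3-state Markov cohort model (stable, severe, dead).
      # All parameters are assumptions, not the trial's values.
      def run(trans, annual_cost, utility, program_cost=0.0, years=30, disc=0.03):
          dist = np.array([1.0, 0.0, 0.0])          # everyone starts in the stable state
          total_cost = total_qaly = 0.0
          for t in range(years):
              d = 1.0 / (1.0 + disc) ** t           # discount factor for year t
              alive = dist[:2].sum()
              total_cost += d * (dist @ annual_cost + alive * program_cost)
              total_qaly += d * (dist @ utility)
              dist = dist @ trans                   # advance the cohort one cycle
          return total_cost, total_qaly

      # Usual care vs. disease management (assumed to slow progression and death).
      usual = np.array([[0.75, 0.15, 0.10],
                        [0.00, 0.70, 0.30],
                        [0.00, 0.00, 1.00]])
      managed = np.array([[0.80, 0.12, 0.08],
                          [0.00, 0.75, 0.25],
                          [0.00, 0.00, 1.00]])
      annual_cost = np.array([3000.0, 12000.0, 0.0])
      utility = np.array([0.80, 0.55, 0.0])

      c0, q0 = run(usual, annual_cost, utility)
      c1, q1 = run(managed, annual_cost, utility, program_cost=600.0)
      print(f"incremental cost {c1 - c0:,.0f}, incremental QALYs {q1 - q0:.3f}")
      print(f"cost per QALY gained: {(c1 - c0) / (q1 - q0):,.0f}")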

  17. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to highlight their inappropriateness for what is really needed at the conceptual phase of the design process. The First-Order Process Velocity Cost Model (FOPV) is discussed at length in the next section. This is followed by an application of the FOPV cost model to a generic wing. For designs that have no precedence as far as acquisition costs are concerned, cost data derived from the FOPV cost model may not be accurate enough because of new requirements for shape complexity, material, equipment and precision/tolerance. The concept of Cost Modulus is introduced at this point to compensate for these new burdens on the basic processes. This is treated in section 5. The cost of a design must be conveniently linked to its CAD representation. The interfacing of CAD models and spreadsheets containing the cost equations is the subject of the next section, section 6. The last section of the report is a summary of the progress made so far, and the anticipated research work to be achieved in the future.
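
    The structure of the proposed model can be sketched as follows: each structural element is costed from its geometry through a first-order process-velocity law, multiplied by cost moduli for material, shape complexity, and precision, and the elemental costs are summed. The velocity-law constants, the moduli, and the element lengths below are illustrative assumptions, not FOPV calibration data.

      import math

      def process_time_h(length_m, v_max=2.0, tau=0.5):
          """Hours to process a feature of the given length when the process velocity
          ramps up with first-order dynamics v(t) = v_max * (1 - exp(-t/tau))."""
          def processed(t):
              return v_max * (t - tau * (1.0 - math.exp(-t / tau)))
          t_lo, t_hi = 0.0, 1.0
          while processed(t_hi) < length_m:        # bracket the required time
              t_hi *= 2.0
          for _ in range(60):                      # bisection to invert x(t) = L
              t_mid = 0.5 * (t_lo + t_hi)
              t_lo, t_hi = (t_mid, t_hi) if processed(t_mid) < length_m else (t_lo, t_mid)
          return t_hi

      def element_cost(length_m, rate_per_h=120.0, m_material=1.0, m_shape=1.0, m_precision=1.0):
          # Elemental cost = process time x labor/machine rate x complexity moduli.
          return rate_per_h * process_time_h(length_m) * m_material * m_shape * m_precision

      # A generic wing broken into a few elements (lengths in metres are illustrative).
      elements = [
          ("spar", 12.0, dict(m_material=1.4, m_shape=1.1, m_precision=1.2)),
          ("ribs", 18.0, dict(m_material=1.0, m_shape=1.3, m_precision=1.0)),
          ("skin", 30.0, dict(m_material=1.2, m_shape=1.0, m_precision=1.1)),
      ]
      total = sum(element_cost(L, **mods) for _, L, mods in elements)
      print(f"estimated production cost: {total:,.0f} (arbitrary currency units)")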

  18. Validation of GC and HPLC systems for residue studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, M.

    1995-12-01

    For residue studies, GC and HPLC system performance must be validated prior to and during use. One excellent measure of system performance is the standard curve and the associated chromatograms used to construct that curve. The standard curve is a model of system response to an analyte over a specific time period, and is prima facie evidence of system performance beginning at the autosampler and proceeding through the injector, column, detector, electronics, data-capture device, and printer/plotter. This tool measures the performance of the entire chromatographic system; its power negates most of the benefits associated with costly and time-consuming validation of individual system components. Other measures of instrument and method validation will be discussed, including quality control charts and experimental designs for method validation.
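
    A minimal sketch of the standard-curve check described here: fit a linear calibration curve to standard injections, report the slope, intercept, and coefficient of determination, and flag the system if linearity falls below an acceptance limit. The concentrations, responses, and acceptance limit are illustrative assumptions.

      import numpy as np

      # Illustrative calibration standards: concentration (ng/mL) vs detector response.
      conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
      resp = np.array([1020., 2110., 4080., 10350., 20480., 41200.])

      slope, intercept = np.polyfit(conc, resp, 1)
      pred = slope * conc + intercept
      ss_res = np.sum((resp - pred) ** 2)
      ss_tot = np.sum((resp - resp.mean()) ** 2)
      r_squared = 1.0 - ss_res / ss_tot

      print(f"slope={slope:.1f}  intercept={intercept:.1f}  r^2={r_squared:.5f}")
      # The 0.995 acceptance limit is an illustrative assumption, not a regulatory value.
      print("system performance:", "acceptable" if r_squared >= 0.995 else "re-validate")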

  19. Minimum cost to control bovine tuberculosis in cow-calf herds

    PubMed Central

    Smith, Rebecca L.; Tauer, Loren W.; Sanderson, Michael W.; Grohn, Yrjo T.

    2014-01-01

    Bovine tuberculosis (bTB) outbreaks in US cattle herds, while rare, are expensive to control. A stochastic model for bTB control in US cattle herds was adapted to more accurately represent cow-calf herd dynamics and was validated by comparison to 2 reported outbreaks. Control cost calculations were added to the model, which was then optimized to minimize costs for either the farm or the government. The results of the optimization showed that test-and-removal costs were minimized for both farms and the government if only 2 negative whole-herd tests were required to declare a herd free of infection, with a 2–3 month testing interval. However, the optimal testing interval for governments was increased to 2–4 months if the model was constrained to reject control programs leading to an infected herd being declared free of infection. Although farms always preferred test-and-removal to depopulation from a cost standpoint, government costs were lower with depopulation more than half the time in 2 of 8 regions. Global sensitivity analysis showed that indemnity costs were significantly associated with a rise in the cost to the government, and that low replacement rates were responsible for the long time to detection predicted by the model, but that improving the sensitivity of slaughterhouse screening and the probability that a slaughtered animal’s herd of origin can be identified would result in faster detection times. PMID:24703601

  20. Minimum cost to control bovine tuberculosis in cow-calf herds.

    PubMed

    Smith, Rebecca L; Tauer, Loren W; Sanderson, Michael W; Gröhn, Yrjo T

    2014-07-01

    Bovine tuberculosis (bTB) outbreaks in US cattle herds, while rare, are expensive to control. A stochastic model for bTB control in US cattle herds was adapted to more accurately represent cow-calf herd dynamics and was validated by comparison to 2 reported outbreaks. Control cost calculations were added to the model, which was then optimized to minimize costs for either the farm or the government. The results of the optimization showed that test-and-removal costs were minimized for both farms and the government if only 2 negative whole-herd tests were required to declare a herd free of infection, with a 2-3 month testing interval. However, the optimal testing interval for governments was increased to 2-4 months if the model was constrained to reject control programs leading to an infected herd being declared free of infection. Although farms always preferred test-and-removal to depopulation from a cost standpoint, government costs were lower with depopulation more than half the time in 2 of 8 regions. Global sensitivity analysis showed that indemnity costs were significantly associated with a rise in the cost to the government, and that low replacement rates were responsible for the long time to detection predicted by the model, but that improving the sensitivity of slaughterhouse screening and the probability that a slaughtered animal's herd of origin can be identified would result in faster detection times. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Economic modeling of HIV treatments.

    PubMed

    Simpson, Kit N

    2010-05-01

    To review the general literature on microeconomic modeling and key points that must be considered in the general assessment of economic modeling reports, discuss the evolution of HIV economic models and identify models that illustrate this development over time, as well as examples of current studies. Recommend improvements in HIV economic modeling. Recent economic modeling studies of HIV include examinations of scaling up antiretroviral (ARV) in South Africa, screening prior to use of abacavir, preexposure prophylaxis, early start of ARV in developing countries and cost-effectiveness comparisons of specific ARV drugs using data from clinical trials. These studies all used extensively published second-generation Markov models in their analyses. There have been attempts to simplify approaches to cost-effectiveness estimates by using simple decision trees or cost-effectiveness calculations with short-time horizons. However, these approaches leave out important cumulative economic effects that will not appear early in a treatment. Many economic modeling studies were identified in the 'gray' literature, but limited descriptions precluded an assessment of their adherence to modeling guidelines, and thus to the validity of their findings. There is a need for developing third-generation models to accommodate new knowledge about adherence, adverse effects, and viral resistance.

  2. OC5 Project Phase Ib: Validation of hydrodynamic loading on a fixed, flexible cylinder for offshore wind applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.

    This paper summarizes the findings from Phase Ib of the Offshore Code Comparison, Collaboration, Continued with Correlation (OC5) project. OC5 is a project run under the International Energy Agency (IEA) Wind Research Task 30, and is focused on validating the tools used for modelling offshore wind systems through the comparison of simulated responses of select offshore wind systems (and components) to physical test data. For Phase Ib of the project, simulated hydrodynamic loads on a flexible cylinder fixed to a sloped bed were validated against test measurements made in the shallow water basin at the Danish Hydraulic Institute (DHI) with support from the Technical University of Denmark (DTU). The first phase of OC5 examined two simple cylinder structures (Phase Ia and Ib) to focus on validation of hydrodynamic models used in the various tools before moving on to more complex offshore wind systems and the associated coupled physics. As a result, verification and validation activities such as these lead to improvement of offshore wind modelling tools, which will enable the development of more innovative and cost-effective offshore wind designs.

  3. OC5 Project Phase Ib: Validation of hydrodynamic loading on a fixed, flexible cylinder for offshore wind applications

    DOE PAGES

    Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.; ...

    2016-10-13

    This paper summarizes the findings from Phase Ib of the Offshore Code Comparison, Collaboration, Continued with Correlation (OC5) project. OC5 is a project run under the International Energy Agency (IEA) Wind Research Task 30, and is focused on validating the tools used for modelling offshore wind systems through the comparison of simulated responses of select offshore wind systems (and components) to physical test data. For Phase Ib of the project, simulated hydrodynamic loads on a flexible cylinder fixed to a sloped bed were validated against test measurements made in the shallow water basin at the Danish Hydraulic Institute (DHI) with support from the Technical University of Denmark (DTU). The first phase of OC5 examined two simple cylinder structures (Phase Ia and Ib) to focus on validation of hydrodynamic models used in the various tools before moving on to more complex offshore wind systems and the associated coupled physics. As a result, verification and validation activities such as these lead to improvement of offshore wind modelling tools, which will enable the development of more innovative and cost-effective offshore wind designs.

  4. Systems biology-embedded target validation: improving efficacy in drug discovery.

    PubMed

    Vandamme, Drieke; Minke, Benedikt A; Fitzmaurice, William; Kholodenko, Boris N; Kolch, Walter

    2014-01-01

    The pharmaceutical industry is faced with a range of challenges with the ever-escalating costs of drug development and a drying out of drug pipelines. By harnessing advances in -omics technologies and moving away from the standard, reductionist model of drug discovery, there is significant potential to reduce costs and improve efficacy. Embedding systems biology approaches in drug discovery, which seek to investigate underlying molecular mechanisms of potential drug targets in a network context, will reduce attrition rates by earlier target validation and the introduction of novel targets into the currently stagnant market. Systems biology approaches also have the potential to assist in the design of multidrug treatments and repositioning of existing drugs, while stratifying patients to give a greater personalization of medical treatment. © 2013 Wiley Periodicals, Inc.

  5. A comparison of cost effectiveness using data from randomized trials or actual clinical practice: selective cox-2 inhibitors as an example.

    PubMed

    van Staa, Tjeerd-Pieter; Leufkens, Hubert G; Zhang, Bill; Smeeth, Liam

    2009-12-01

    Data on absolute risks of outcomes and patterns of drug use in cost-effectiveness analyses are often based on randomised clinical trials (RCTs). The objective of this study was to evaluate the external validity of published cost-effectiveness studies by comparing the data used in these studies (typically based on RCTs) to observational data from actual clinical practice. Selective Cox-2 inhibitors (coxibs) were used as an example. The UK General Practice Research Database (GPRD) was used to estimate the exposure characteristics and individual probabilities of upper gastrointestinal (GI) events during current exposure to nonsteroidal anti-inflammatory drugs (NSAIDs) or coxibs. A basic cost-effectiveness model was developed evaluating two alternative strategies: prescription of a conventional NSAID or coxib. Outcomes included upper GI events as recorded in GPRD and hospitalisation for upper GI events recorded in the national registry of hospitalisations (Hospital Episode Statistics) linked to GPRD. Prescription costs were based on the prescribed number of tablets as recorded in GPRD and the 2006 cost data from the British National Formulary. The study population included over 1 million patients prescribed conventional NSAIDs or coxibs. Only a minority of patients used the drugs long-term and daily (34.5% of conventional NSAIDs and 44.2% of coxibs), whereas coxib RCTs required daily use for at least 6-9 months. The mean cost of preventing one upper GI event as recorded in GPRD was US$104k (ranging from US$64k with long-term daily use to US$182k with intermittent use) and US$298k for hospitalizations. The mean costs (for GPRD events) over calendar time were US$58k during 1990-1993 and US$174k during 2002-2005. Using RCT data rather than GPRD data for event probabilities, the mean cost was US$16k with the VIGOR RCT and US$20k with the CLASS RCT. The published cost-effectiveness analyses of coxibs lacked external validity, did not represent patients in actual clinical practice, and should not have been used to inform prescribing policies. External validity should be an explicit requirement for cost-effectiveness analyses.

  6. National Variation in Costs and Mortality for Leukodystrophy Patients in U.S. Children’s Hospitals

    PubMed Central

    Brimley, Cameron J; Lopez, Jonathan; van Haren, Keith; Wilkes, Jacob; Sheng, Xiaoming; Nelson, Clint; Korgenski, E. Kent; Srivastava, Rajendu; Bonkowsky, Joshua L.

    2013-01-01

    Background Inherited leukodystrophies are progressive, debilitating neurological disorders with few treatment options and high mortality rates. Our objective was to determine national variation in the costs for leukodystrophy patients, and to evaluate differences in their care. Methods We developed an algorithm to identify inherited leukodystrophy patients in de-identified data sets using a recursive tree model based on ICD-9 CM diagnosis and procedure charge codes. Validation of the algorithm was performed independently at two institutions, and with data from the Pediatric Health Information System (PHIS) of 43 U.S. children’s hospitals, for a seven year time period, 2004–2010. Results A recursive algorithm was developed and validated, based on six ICD-9 codes and one procedure code, that had a sensitivity up to 90% (range 61–90%) and a specificity up to 99% (range 53–99%) for identifying inherited leukodystrophy patients. Inherited leukodystrophy patients comprise 0.4% of admissions to children’s hospitals and 0.7% of costs. Over seven years these patients required $411 million of hospital care, or $131,000/patient. Hospital costs for leukodystrophy patients varied at different institutions, ranging from 2 to 15 times more than the average pediatric patient. There was a statistically significant correlation between higher volume and increased cost efficiency. Increased mortality rates had an inverse relationship with increased patient volume that was not statistically significant. Conclusions We developed and validated a code-based algorithm for identifying leukodystrophy patients in deidentified national datasets. Leukodystrophy patients account for $59 million of costs yearly at children’s hospitals. Our data highlight potential to reduce unwarranted variability and improve patient care. PMID:23953952

  7. Application of target costing in machining

    NASA Astrophysics Data System (ADS)

    Gopalakrishnan, Bhaskaran; Kokatnur, Ameet; Gupta, Deepak P.

    2004-11-01

    In today's intensely competitive and highly volatile business environment, consistent development of low-cost, high-quality products meeting the functionality requirements is key to a company's survival. Companies continuously strive to reduce costs while still producing quality products to stay ahead of the competition. Many companies have turned to target costing to achieve this objective. Target costing is a structured approach to determine the cost at which a proposed product, meeting the quality and functionality requirements, must be produced in order to generate the desired profits. It subtracts the desired profit margin from the company's selling price to establish the manufacturing cost of the product. An extensive literature review revealed that companies in the automotive, electronics, and process industries have reaped the benefits of target costing. However, the target costing approach has not been applied in the machining industry; instead, techniques based on Geometric Programming, Goal Programming, and Lagrange multipliers have been proposed for application in this industry. These models follow a forward approach, first selecting a set of machining parameters and then determining the machining cost. Hence, in this study we have developed an algorithm to apply the concepts of target costing, a backward approach that selects the machining parameters based on the required machining cost, and is therefore more suitable for practical applications in process improvement and cost reduction. A target costing model was developed for the turning operation and was successfully validated using practical data.
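
    A minimal sketch of the backward, target-costing logic for a turning operation: the allowable machining cost is set from the selling price and desired profit, and cutting speeds are then screened against a standard machining-time-plus-tool-wear cost model using Taylor's tool-life equation. All parameter values are illustrative assumptions, not the study's data.

      import math

      # Target cost is set backwards from price and desired profit (illustrative values).
      selling_price, desired_profit = 40.0, 12.0
      other_costs = 26.0                        # material, setup, overhead per part
      target_machining_cost = selling_price - desired_profit - other_costs   # = 2.0

      # Standard turning cost model with Taylor tool life V * T^n = C (illustrative constants).
      D, L, f = 60.0, 200.0, 0.25               # diameter (mm), length (mm), feed (mm/rev)
      n_taylor, C_taylor = 0.25, 350.0
      labor_rate = 1.0                          # cost per minute of machine/operator time
      tool_change_time, tool_cost = 2.0, 4.0    # minutes per change, cost per cutting edge

      def machining_cost(v):                    # v = cutting speed in m/min
          t_m = math.pi * D * L / (1000.0 * f * v)          # machining time per part (min)
          tool_life = (C_taylor / v) ** (1.0 / n_taylor)    # Taylor tool life (min)
          changes_per_part = t_m / tool_life
          return labor_rate * t_m + changes_per_part * (labor_rate * tool_change_time + tool_cost)

      # Backward search: which cutting speeds achieve the target machining cost?
      feasible = [v for v in range(40, 401, 5) if machining_cost(v) <= target_machining_cost]
      if feasible:
          print(f"target {target_machining_cost:.2f} met for speeds {feasible[0]}-{feasible[-1]} m/min")
      else:
          print("target cost not achievable; redesign the process or renegotiate the target")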

  8. Decoding Problem Gamblers' Signals: A Decision Model for Casino Enterprises.

    PubMed

    Ifrim, Sandra

    2015-12-01

    The aim of the present study is to offer a validated decision model for casino enterprises. The model enables these users to detect problem gamblers early and fulfill their ethical duty of social cost minimization. To this end, the interpretation of casino customers' nonverbal communication is treated as a signal-processing problem. Indicators of problem gambling recommended by Delfabbro et al. (Identifying problem gamblers in gambling venues: final report, 2007) are combined with the Viterbi algorithm into an interdisciplinary model that helps decode signals emitted by casino customers. Model output consists of a historical path of mental states and the cumulated social costs associated with a particular client. Groups of problem and non-problem gamblers were simulated to investigate the model's diagnostic capability and its cost minimization ability. Each group consisted of 26 subjects and was subsequently enlarged to 100 subjects. In approximately 95% of the cases, mental states were correctly decoded for problem gamblers. Statistical analysis using planned contrasts revealed that the model is relatively robust to the suppression of signals by casino clientele facing gambling problems as well as to misjudgments made by staff regarding the clients' mental states. Only if the latter source of error is very pronounced, i.e., if staff judgment is extremely faulty, might the cumulated social costs be distorted.
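
    A minimal sketch of the decoding idea: a two-state hidden Markov model ('non-problem' vs. 'problem' gambling) whose most likely state path is recovered from observed behavioural indicators with the Viterbi algorithm and then converted into cumulated social costs. The transition and emission probabilities, the indicator coding, and the per-visit cost figures are illustrative assumptions, not the study's calibrated values.

      import numpy as np

      # Hidden states and illustrative HMM parameters (assumptions, not the study's values).
      states = ["non-problem", "problem"]
      start = np.log([0.9, 0.1])
      trans = np.log([[0.95, 0.05],      # P(next state | current state)
                      [0.10, 0.90]])
      # Observation per visit: 0 = no indicator shown, 1 = indicator of problem gambling shown.
      emit = np.log([[0.85, 0.15],       # P(observation | non-problem)
                     [0.30, 0.70]])      # P(observation | problem)
      social_cost = np.array([0.0, 120.0])    # illustrative cost booked per visit in each state

      def viterbi(obs):
          T, S = len(obs), len(states)
          delta = np.full((T, S), -np.inf)
          back = np.zeros((T, S), dtype=int)
          delta[0] = start + emit[:, obs[0]]
          for t in range(1, T):
              for s in range(S):
                  cand = delta[t - 1] + trans[:, s]
                  back[t, s] = int(np.argmax(cand))
                  delta[t, s] = cand[back[t, s]] + emit[s, obs[t]]
          path = [int(np.argmax(delta[-1]))]
          for t in range(T - 1, 0, -1):
              path.append(back[t, path[-1]])
          return path[::-1]

      observed = [0, 0, 1, 0, 1, 1, 1, 0, 1, 1]     # indicator signals over ten casino visits
      path = viterbi(observed)
      print("decoded states:", [states[s] for s in path])
      print("cumulated social cost:", social_cost[path].sum())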

  9. Measuring Value in Internal Medicine Residency Training Hospitals Using Publicly Reported Measures.

    PubMed

    Schickedanz, Adam; Gupta, Reshma; Arora, Vineet M; Braddock, Clarence H

    2018-03-01

    Graduate medical education (GME) lacks measures of resident preparation for high-quality, cost-conscious practice. The authors used publicly reported teaching hospital value measures to compare internal medicine residency programs on high-value care training and to validate these measures against program director perceptions of value. Program-level value training scores were constructed using Centers for Medicare & Medicaid Services Value-Based Purchasing (VBP) Program hospital quality and cost-efficiency data. Correlations with Association of Program Directors in Internal Medicine Annual Survey high-value care training measures were examined using logistic regression. For every point increase in program-level VBP score, residency directors were more likely to agree that GME programs have a responsibility to contain health care costs (adjusted odds ratio [aOR] 1.18, P = .04), their faculty model high-value care (aOR 1.07, P = .03), and residents are prepared to make high-value medical decisions (aOR 1.07, P = .09). Publicly reported clinical data offer valid measures of GME value training.

  10. Design of a Low-cost Oil Spill Tracking Buoy

    NASA Astrophysics Data System (ADS)

    Zhao, Y.; Hu, X.; Yu, F.; Dong, S.; Chen, G.

    2017-12-01

    With the rapid development of oil exploitation and transportation, oil spill accidents such as the Prestige and Gulf of Mexico spills have occurred frequently in recent years, causing long-term damage to the environment and human life. Rescue operations would benefit from locating the oil slick diffusion area in real time. Equipped with GNSS receivers, current tracking buoys (CTBs), such as the Lagrangian drifting buoy, the Surface Velocity Program (SVP) drifter, the iSLDMB (Iridium self-locating datum marker buoy), and the Argosphere buoy, have been used as oil tracking buoys in oil slick observation and as validation tools for oil spill simulation. However, surface wind affects the movement of an oil slick in a way that CTBs cannot reflect, so their oil spill tracking performance is limited. Here, we propose a novel oil spill tracking buoy (OSTB) that costs less than $140 and is equipped with a Beidou positioning module and sails to track the oil slick. Based on a hydrodynamic equilibrium model and ocean dynamic analysis, the wind sails and water sails are designed to be adjustable to different marine conditions to improve tracking efficiency. A quick-release device is designed to ensure easy deployment from air or ship. A sea experiment was carried out in Jiaozhou Bay, Northern China: the OSTB, SVP, iSLDMB, Argosphere buoy, and a piece of oil-simulating rubber sheet were deployed at the same time. Meanwhile, the oil spill simulation model GNOME (General NOAA Operational Modeling Environment) was configured with the wind and current fields collected by an unmanned surface vehicle (USV) carrying acoustic Doppler current profilers (ADCP) and wind speed and direction sensors. Experimental results show that the OSTB agrees more closely with the rubber sheet and the GNOME simulation results, validating its oil tracking ability. With low cost and easy deployment, the OSTB provides an effective way to validate oil spill numerical models and to respond quickly to oil spill accidents.

  11. In-Silico Screening of Ligand Based Pharmacophore, Database Mining and Molecular Docking on 2, 5-Diaminopyrimidines Azapurines as Potential Inhibitors of Glycogen Synthase Kinase-3β.

    PubMed

    Mishra, Pooja; Kesar, Seema; Paliwal, Sarvesh K; Chauhan, Monika; Madan, Kirtika

    2018-05-29

    Glycogen synthase kinase-3β plays a significant role in the regulation of various pathological pathways relating to the central nervous system (CNS). Dysregulation of glycogen synthase kinase 3 (GSK-3) activity gives rise to numerous neuroinflammatory and neurodegenerative disorders that affect the whole CNS. Through the sequential application of in-silico tools, efforts have been made to design novel GSK-3β inhibitors. Owing to the potential role of GSK-3β in nervous disorders, we developed a quantitative four-featured pharmacophore model comprising two hydrogen bond acceptors (HBA), one ring aromatic (RA), and one hydrophobe (HY), which was further affirmed by cost-function analysis, rm2 metrics, internal and external test set validation, and Güner-Henry (GH) scoring analysis. The validated pharmacophore model was used for virtual screening, and out of 345 compounds, two potential virtual hits were selected on the basis of fit value, estimated activity, and Lipinski violations. The chosen compounds were then docked within the active site of GSK-3β. Results: The four essential features, i.e., two hydrogen bond acceptors (HBA), one ring aromatic (RA), and one hydrophobe (HY), were used to build the pharmacophore model, which showed good correlation coefficient, RMSD, and cost difference values of 0.91, 0.94, and 42.9, respectively; the model was further validated employing cost-function analysis, rm2 metrics, and internal and external test set prediction with r2 values of 0.77 and 0.84. Docked conformations showed potential interactions between the features of the identified hits (NCI 4296, NCI 3034) and the amino acids present in the active site. In line with the above discussion, and through our stepwise computational approach, we have identified novel, structurally diverse glycogen synthase kinase inhibitors. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.

  12. Optimizing preventive maintenance policy: A data-driven application for a light rail braking system.

    PubMed

    Corman, Francesco; Kraijema, Sander; Godjevac, Milinko; Lodewijks, Gabriel

    2017-10-01

    This article presents a case study determining the optimal preventive maintenance policy for a light rail rolling stock system in terms of reliability, availability, and maintenance costs. The maintenance policy defines one of the three predefined preventive maintenance actions at fixed time-based intervals for each of the subsystems of the braking system. Based on work, maintenance, and failure data, we model the reliability degradation of the system and its subsystems under the current maintenance policy by a Weibull distribution. We then analytically determine the relation between reliability, availability, and maintenance costs. We validate the model against recorded reliability and availability and get further insights by a dedicated sensitivity analysis. The model is then used in a sequential optimization framework determining preventive maintenance intervals to improve on the key performance indicators. We show the potential of data-driven modelling to determine optimal maintenance policy: same system availability and reliability can be achieved with 30% maintenance cost reduction, by prolonging the intervals and re-grouping maintenance actions.
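
    The optimization step can be illustrated with the classical age-based replacement formulation: given Weibull reliability R(t), the long-run cost rate of preventive maintenance at interval T is (Cp*R(T) + Cf*(1 - R(T))) divided by the integral of R(t) from 0 to T, minimized over T. The Weibull shape and scale and the cost figures below are illustrative assumptions, not the fitted values from the braking-system data.

      import numpy as np

      # Illustrative Weibull degradation and cost parameters (assumptions only).
      beta, eta = 2.5, 400.0                        # shape (>1: wear-out) and scale, in days
      cost_preventive, cost_failure = 1_000.0, 8_000.0

      def reliability(t):
          return np.exp(-(t / eta) ** beta)

      def cost_rate(T, n_grid=2000):
          """Long-run cost per day of age-based preventive maintenance at age T."""
          t = np.linspace(0.0, T, n_grid)
          r = reliability(t)
          expected_cycle_length = np.sum((r[1:] + r[:-1]) * 0.5 * np.diff(t))   # trapezoid rule
          expected_cycle_cost = cost_preventive * r[-1] + cost_failure * (1.0 - r[-1])
          return expected_cycle_cost / expected_cycle_length

      candidates = np.arange(50.0, 1200.0, 10.0)
      rates = np.array([cost_rate(T) for T in candidates])
      best = candidates[rates.argmin()]
      print(f"optimal PM interval ~ {best:.0f} days at {rates.min():.2f} per day")
      print(f"for comparison, near run-to-failure: {cost_rate(5 * eta):.2f} per day")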

  13. Optimizing preventive maintenance policy: A data-driven application for a light rail braking system

    PubMed Central

    Corman, Francesco; Kraijema, Sander; Godjevac, Milinko; Lodewijks, Gabriel

    2017-01-01

    This article presents a case study determining the optimal preventive maintenance policy for a light rail rolling stock system in terms of reliability, availability, and maintenance costs. The maintenance policy defines one of the three predefined preventive maintenance actions at fixed time-based intervals for each of the subsystems of the braking system. Based on work, maintenance, and failure data, we model the reliability degradation of the system and its subsystems under the current maintenance policy by a Weibull distribution. We then analytically determine the relation between reliability, availability, and maintenance costs. We validate the model against recorded reliability and availability and get further insights by a dedicated sensitivity analysis. The model is then used in a sequential optimization framework determining preventive maintenance intervals to improve on the key performance indicators. We show the potential of data-driven modelling to determine optimal maintenance policy: same system availability and reliability can be achieved with 30% maintenance cost reduction, by prolonging the intervals and re-grouping maintenance actions. PMID:29278245

  14. Does box model training improve surgical dexterity and economy of movement during virtual reality laparoscopy? A randomised trial.

    PubMed

    Clevin, Lotte; Grantcharov, Teodor P

    2008-01-01

    Laparoscopic box model trainers have been used in training curricula for a long time; however, data on their impact on skills acquisition are still limited. Our aim was to validate a low-cost box model trainer as a tool for training skills relevant to laparoscopic surgery. Randomised, controlled trial (Canadian Task Force Classification I). University hospital. Sixteen gynaecologic residents with limited laparoscopic experience were randomised to a group that received a structured box model training curriculum, and a control group. Performance before and after the training was assessed in a virtual reality laparoscopic trainer (LapSim) and was based on objective parameters registered by the computer system (time, error, and economy of motion scores). Group A showed significantly greater improvement in all performance parameters compared with the control group: economy of movement (p=0.001), time (p=0.001) and tissue damage (p=0.036), confirming the positive impact of a box-trainer curriculum on laparoscopic skills acquisition. Structured laparoscopic skill training on a low-cost box model trainer improves performance as assessed using the VR system. Trainees who used the box model trainer showed significant improvement compared to the control group. Box model trainers are valid tools for laparoscopic skills training and should be implemented in comprehensive training curricula in gynaecology.

  15. Preliminary design, analysis, and costing of a dynamic scale model of the NASA space station

    NASA Technical Reports Server (NTRS)

    Gronet, M. J.; Pinson, E. D.; Voqui, H. L.; Crawley, E. F.; Everman, M. R.

    1987-01-01

    The difficulty of testing the next generation of large flexible space structures on the ground places an emphasis on other means for validating predicted on-orbit dynamic behavior. Scale model technology represents one way of verifying analytical predictions with ground test data. This study investigates the preliminary design, scaling and cost trades for a Space Station dynamic scale model. The scaling of nonlinear joint behavior is studied from theoretical and practical points of view. Suspension system interaction trades are conducted for the ISS Dual Keel Configuration and Build-Up Stages suspended in the proposed NASA/LaRC Large Spacecraft Laboratory. Key issues addressed are scaling laws, replication vs. simulation of components, manufacturing, suspension interactions, joint behavior, damping, articulation capability, and cost. These issues are the subject of parametric trades versus the scale model factor. The results of these detailed analyses are used to recommend scale factors for four different scale model options, each with varying degrees of replication. Potential problems in constructing and testing the scale model are identified, and recommendations for further study are outlined.

  16. An Investigation of Human Performance Model Validation

    DTIC Science & Technology

    2005-03-01

    of design decisions, what the costs and benefits are for each of the stages of analysis options. On the 'benefits' side, the manager needs to know ... confidence. But we also want to know that we are not expending any more effort (and other costs) than necessary to ensure that the right decision is ... supported at each stage. Ultimately, we want to enable SBA managers to have confidence that they are selecting the right HPM tools and using them correctly in

  17. Calibration and Validation of the Sage Software Cost/Schedule Estimating System to United States Air Force Databases

    DTIC Science & Technology

    1997-09-01

    factor values are identified. For SASET, revised cost estimating relationships are provided (Apgar et al., 1991). A 1991 AFIT thesis by Gerald Ourada ... description of the model is a paragraph directly quoted from the user's manual. This is not to imply that a lack of a thorough analysis indicates ... constraints imposed by the system. The effective technology rating is computed from the basic technology rating by the following equation (Apgar et al., 1991

  18. Screening for Chlamydia trachomatis: a systematic review of the economic evaluations and modelling

    PubMed Central

    Roberts, T E; Robinson, S; Barton, P; Bryan, S; Low, N

    2006-01-01

    Objective To review systematically and critically the evidence used to derive estimates of the costs and cost effectiveness of chlamydia screening. Methods Systematic review. A search of 11 electronic bibliographic databases from the earliest date available to August 2004 using keywords including chlamydia, pelvic inflammatory disease, economic evaluation, and cost. We included studies of chlamydia screening in males and/or females over 14 years, including studies of diagnostic tests, contact tracing, and treatment as part of a screening programme. Outcomes included cases of chlamydia identified and major outcomes averted. We assessed methodological quality and the modelling approach used. Results Of 713 identified papers we included 57 formal economic evaluations and two cost studies. Most studies found chlamydia screening to be cost effective, partner notification to be an effective adjunct, and testing with nucleic acid amplification tests and treatment with azithromycin to be cost effective. Methodological problems limited the validity of these findings: most studies used static models that are inappropriate for infectious diseases; restricted outcomes were used as a basis for policy recommendations; and high estimates of the probability of chlamydia associated complications might have overestimated cost effectiveness. Two high quality dynamic modelling studies found opportunistic screening to be cost effective, but poor reporting or uncertainty about complication rates makes interpretation difficult. Conclusion The inappropriate use of static models to study interventions to prevent a communicable disease means that uncertainty remains about whether chlamydia screening programmes are cost effective or not. The results of this review can be used by health service managers in the allocation of resources, and by health economists and other researchers who are considering further research in this area. PMID:16731666

  19. Accelerating cross-validation with total variation and its application to super-resolution imaging

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Ikeda, Shiro; Akiyama, Kazunori; Kabashima, Yoshiyuki

    2017-12-01

    We develop an approximation formula for the cross-validation error (CVE) of a sparse linear regression penalized by ℓ1-norm and total variation terms, based on a perturbative expansion that exploits the large dimensionality of both the data and the model. The formula significantly reduces the computational cost of evaluating the CVE. Its practicality is tested through application to simulated black-hole image reconstruction on the event-horizon scale with super-resolution. The results demonstrate that our approximation reproduces the CVE values obtained via literally conducted cross-validation with reasonably good precision.
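
    The general idea of replacing literal refitting with an analytic cross-validation shortcut can be illustrated with the classical closed-form leave-one-out identity for ridge regression (a linear smoother); this is not the paper's perturbative formula for the ℓ1 plus total-variation estimator, and the data below are synthetic.

      import numpy as np

      rng = np.random.default_rng(1)
      n, p, lam = 200, 50, 5.0
      X = rng.normal(size=(n, p))
      beta_true = np.zeros(p); beta_true[:5] = 2.0
      y = X @ beta_true + rng.normal(scale=1.0, size=n)

      # Analytic leave-one-out CV for ridge regression via the hat-matrix identity
      # (exact for this linear smoother; one fit instead of n refits).
      H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
      resid = y - H @ y
      loo_fast = np.mean((resid / (1.0 - np.diag(H))) ** 2)

      # Brute-force leave-one-out CV: refit n times.
      errs = []
      for i in range(n):
          mask = np.arange(n) != i
          b = np.linalg.solve(X[mask].T @ X[mask] + lam * np.eye(p), X[mask].T @ y[mask])
          errs.append((y[i] - X[i] @ b) ** 2)
      loo_slow = np.mean(errs)

      print(f"analytic LOO CV error {loo_fast:.4f}  vs  literal refits {loo_slow:.4f}")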

  20. Budget-impact model for colonoscopy cost calculation and comparison between 2 litre PEG+ASC and sodium picosulphate with magnesium citrate or sodium phosphate oral bowel cleansing agents.

    PubMed

    Gruss, H-J; Cockett, A; Leicester, R J

    2012-01-01

    With the availability of several bowel cleansing agents, physicians and hospitals performing colonoscopies will often base their choice of cleansing agent purely on acquisition cost. Therefore, an easy to use budget impact model has been developed and established as a tool to compare total colon preparation costs between different established bowel cleansing agents. The model was programmed in Excel and designed as a questionnaire evaluating information on treatment costs for a range of established bowel cleansing products. The sum of costs is based on National Health Service reference costs for bowel cleansing products. Estimations are made for savings achievable when using a 2-litre polyethylene glycol with ascorbate components solution (PEG+ASC) in place of other bowel cleansing solutions. Test data were entered into the model to confirm validity and sensitivity. The model was then applied to a set of audit cost data from a major hospital colonoscopy unit in the UK. Descriptive analysis of the test data showed that the main cost drivers in the colonoscopy process are the procedure costs and costs for bed days rather than drug acquisition costs, irrespective of the cleansing agent. Audit data from a colonoscopy unit in the UK confirmed the finding with a saving of £107,000 per year in favour of PEG+ASC when compared to sodium picosulphate with magnesium citrate solution (NaPic+MgCit). For every patient group the model calculated overall cost savings. This was irrespective of the higher drug expenditure associated with the use of PEG+ASC for bowel preparation. Savings were mainly realized through reduced costs for repeat colonoscopy procedures and associated costs, such as inpatient length of stay. The budget impact model demonstrated that the primary cost driver was the procedure cost for colonoscopy. Savings can be realized through the use of PEG+ASC despite higher drug acquisition costs relative to the comparator products. From a global hospital funding perspective, the acquisition costs of bowel preparations should not be used as the primary reason to select the preferred treatment agent, but should be part of the consideration, with an emphasis on the clinical outcome.

  1. A cost-effectiveness model to personalize antiviral therapy in naive patients with genotype 1 chronic hepatitis C.

    PubMed

    Iannazzo, Sergio; Colombatto, Piero; Ricco, Gabriele; Oliveri, Filippo; Bonino, Ferruccio; Brunetto, Maurizia R

    2015-03-01

    Rapid virologic response is the best predictor of sustained virologic response with dual therapy in genotype-1 chronic hepatitis C, and its evaluation was proposed to tailor triple therapy in F0-F2 patients. Bio-mathematical modelling of viral dynamics during dual therapy has potentially higher accuracy than rapid virologic response in identifying patients who will eventually achieve a sustained response. The study's objective was a cost-effectiveness analysis of a personalized therapy in naïve F0-F2 patients with chronic hepatitis C based on a bio-mathematical model (model-guided strategy) rather than on rapid virologic response (guideline-guided strategy). A deterministic bio-mathematical model of the infected cell dynamics was validated in a cohort of 135 patients treated with dual therapy. A decision-analytic economic model was then developed to compare model-guided and guideline-guided strategies in the Italian setting. The outcomes of the cost-effectiveness analysis with model-guided and guideline-guided strategy were 19.1-19.4 and 18.9-19.3 quality-adjusted-life-years. Total per-patient lifetime costs were €25,200-€26,000 with the model-guided strategy and €28,800-€29,900 with the guideline-guided strategy. When comparing the model-guided with the guideline-guided strategy, the former was more effective and less costly. The adoption of the bio-mathematical predictive criterion has the potential to improve the cost-effectiveness of a personalized therapy for chronic hepatitis C, reserving triple therapy for those patients who really need it. Copyright © 2014 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.

  2. Vitamin D and health care costs: Results from two independent population-based cohort studies.

    PubMed

    Hannemann, A; Wallaschofski, H; Nauck, M; Marschall, P; Flessa, S; Grabe, H J; Schmidt, C O; Baumeister, S E

    2017-10-31

    Vitamin D deficiency is associated with higher morbidity. However, there are few data regarding the effect of vitamin D deficiency on health care costs. This study examined the cross-sectional and longitudinal associations between the serum 25-hydroxy vitamin D concentration (25OHD) and direct health care costs and hospitalization in two independent samples of the general population in North-Eastern Germany. We studied 7217 healthy individuals from the 'Study of Health in Pomerania' (SHIP n = 3203) and the 'Study of Health in Pomerania-Trend' (SHIP-Trend n = 4014) who had valid 25OHD measurements and provided data on annual total costs, outpatient costs, hospital stays, and inpatient costs. The associations between 25OHD concentrations (modelled continuously using fractional polynomials) and health care costs were examined using a generalized linear model with gamma distribution and a log link. Poisson regression models were used to estimate relative risks of hospitalization. In cross-sectional analysis of SHIP-Trend, non-linear associations between the 25OHD concentration and inpatient costs and hospitalization were detected: participants with 25OHD concentrations of 5, 10 and 15 ng/ml had 226.1%, 51.5% and 14.1%, respectively, higher inpatient costs than those with 25OHD concentrations of 20 ng/ml (overall p-value = 0.001) in multivariable models. We found a relation between lower 25OHD concentrations and increased inpatient health care costs and hospitalization. Our results thus indicate an influence of vitamin D deficiency on health care costs in the general population. Copyright © 2017 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
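
    A minimal sketch of the regression described here, a gamma GLM with log link for right-skewed cost data, fitted to synthetic data with statsmodels. The variable names, effect size, and sample are assumptions; the study additionally modelled 25OHD with fractional polynomials and adjusted for covariates, which this sketch omits.

      import numpy as np
      import statsmodels.api as sm

      # Synthetic cost data: lower 25OHD associated with higher inpatient costs (illustrative).
      rng = np.random.default_rng(42)
      n = 2000
      vitd = rng.uniform(5, 40, size=n)                    # 25OHD in ng/mL
      mu = np.exp(8.0 - 0.03 * vitd)                       # log-link mean structure
      costs = rng.gamma(shape=2.0, scale=mu / 2.0)         # gamma-distributed annual costs

      X = sm.add_constant(vitd)
      model = sm.GLM(costs, X, family=sm.families.Gamma(link=sm.families.links.Log()))
      result = model.fit()
      print(result.summary())
      # exp(coefficient) gives the multiplicative change in expected cost per 1 ng/mL of 25OHD.
      print("cost ratio per ng/mL:", np.exp(result.params[1]))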

  3. Detecting and treating occlusal caries lesions: a cost-effectiveness analysis.

    PubMed

    Schwendicke, F; Stolpe, M; Meyer-Lueckel, H; Paris, S

    2015-02-01

    The health gains and costs resulting from using different caries detection strategies might not only depend on the accuracy of the used method but also the treatment emanating from its use in different populations. We compared combinations of visual-tactile, radiographic, or laser-fluorescence-based detection methods with 1 of 3 treatments (non-, micro-, and invasive treatment) initiated at different cutoffs (treating all or only dentinal lesions) in populations with low or high caries prevalence. A Markov model was constructed to follow an occlusal surface in a permanent molar in an initially 12-y-old male German patient over his lifetime. Prevalence data and transition probabilities were extracted from the literature, while validity parameters of different methods were synthesized or obtained from systematic reviews. Microsimulations were performed to analyze the model, assuming a German health care setting and a mixed public-private payer perspective. Radiographic and fluorescence-based methods led to more overtreatments, especially in populations with low prevalence. For the latter, combining visual-tactile or radiographic detection with microinvasive treatment retained teeth longest (mean 66 y) at lowest costs (329 and 332 Euro, respectively), while combining radiographic or fluorescence-based detections with invasive treatment was the least cost-effective (<60 y, >700 Euro). In populations with high prevalence, combining radiographic detection with microinvasive treatment was most cost-effective (63 y, 528 Euro), while sensitive detection methods combined with invasive treatments were again the least cost-effective (<59 y, >690 Euro). The suitability of detection methods differed significantly between populations, and the cost-effectiveness was greatly influenced by the treatment initiated after lesion detection. The accuracy of a detection method relative to a "gold standard" did not automatically convey into better health or reduced costs. Detection methods should be evaluated not only against their criterion validity but also the long-term effects resulting from their use in different populations. © International & American Associations for Dental Research 2014.

  4. Development and Application of New Quality Model for Software Projects

    PubMed Central

    Karnavel, K.; Dillibabu, R.

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594

  5. Development and application of new quality model for software projects.

    PubMed

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  6. A discrete event simulation tool to support and predict hospital and clinic staffing.

    PubMed

    DeRienzo, Christopher M; Shaw, Ryan J; Meanor, Phillip; Lada, Emily; Ferranti, Jeffrey; Tanaka, David

    2017-06-01

    We demonstrate how to develop a simulation tool to help healthcare managers and administrators predict and plan for staffing needs in a hospital neonatal intensive care unit using administrative data. We developed a discrete event simulation model of nursing staff needed in a neonatal intensive care unit and then validated the model against historical data. The process flow was translated into a discrete event simulation model. Results demonstrated that the model can be used to give a respectable estimate of annual admissions, transfers, and deaths based upon two different staffing levels. The discrete event simulation tool model can provide healthcare managers and administrators with (1) a valid method of modeling patient mix, patient acuity, staffing needs, and costs in the present state and (2) a forecast of how changes in a unit's staffing, referral patterns, or patient mix would affect a unit in a future state.
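
    A minimal discrete event simulation sketch of unit census and nurse demand, using a plain event queue rather than a simulation library: Poisson admissions, exponential length of stay, and a fixed nurse-to-patient ratio. The arrival rate, length of stay, and ratio are illustrative assumptions, not the unit's administrative data.

      import heapq
      import random

      random.seed(7)

      # Illustrative parameters (assumptions, not the unit's administrative data).
      ARRIVALS_PER_DAY = 3.0           # mean admissions per day (Poisson process)
      MEAN_LOS_DAYS = 12.0             # mean length of stay (exponential)
      NURSE_RATIO = 3                  # patients cared for per nurse
      HORIZON_DAYS = 365.0

      # Pre-generate admission and discharge events on an event queue.
      events, t = [], 0.0
      while True:
          t += random.expovariate(ARRIVALS_PER_DAY)
          if t >= HORIZON_DAYS:
              break
          heapq.heappush(events, (t, +1))                                             # admission
          heapq.heappush(events, (t + random.expovariate(1.0 / MEAN_LOS_DAYS), -1))   # discharge

      census = admissions = peak_census = 0
      census_time = 0.0                # time-integral of the census for the average
      last_t = 0.0
      while events:
          now, delta = heapq.heappop(events)
          census_time += census * (min(now, HORIZON_DAYS) - last_t)
          last_t = min(now, HORIZON_DAYS)
          if now >= HORIZON_DAYS:
              break
          census += delta
          if delta == +1:
              admissions += 1
          peak_census = max(peak_census, census)
      census_time += census * max(0.0, HORIZON_DAYS - last_t)

      avg_census = census_time / HORIZON_DAYS
      print(f"simulated admissions/year: {admissions}")
      print(f"average census {avg_census:.1f}, peak census {peak_census}")
      print(f"nurses needed: {avg_census / NURSE_RATIO:.1f} on average, "
            f"{-(-peak_census // NURSE_RATIO)} at the peak")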

  7. Cigar Box Arthroscopy: A Randomized Controlled Trial Validates Nonanatomic Simulation Training of Novice Arthroscopy Skills.

    PubMed

    Sandberg, Rory P; Sherman, Nathan C; Latt, L Daniel; Hardy, Jolene C

    2017-11-01

    The goal of this study was to validate the cigar box arthroscopy trainer (CBAT) as a training tool and then compare its effectiveness to didactic training and to another previously validated low-fidelity but anatomic model, the anatomic knee arthroscopy trainer (AKAT). A nonanatomic knee arthroscopy training module was developed at our institution. Twenty-four medical students with no prior arthroscopic or laparoscopic experience were enrolled as subjects. Eight subjects served as controls. The remaining 16 subjects were randomized to participate in 4 hours of either the CBAT or a previously validated AKAT. Subjects' skills were assessed by 1 of 2 faculty members through repeated attempts at performing a diagnostic knee arthroscopy on a cadaveric specimen. Objective scores were given using a minimally adapted version of the Basic Arthroscopic Knee Skill Scoring System. Total cost differences were calculated. Seventy-five percent of subjects in the CBAT and AKAT groups succeeded in reaching minimum proficiency in the allotted time compared with 25% in the control group (P < .05). There was no significant difference in the number of attempts to reach proficiency between the CBAT and AKAT groups. The cost to build the CBAT was $44.12, whereas the cost was $324.33 for the AKAT. This pilot study suggests the CBAT is an effective knee arthroscopy trainer that may decrease the learning curve of residents without significant cost to a residency program. This study demonstrates the need for an agreed-upon objective scoring system to properly evaluate residents and compare the effectiveness of different training tools. Copyright © 2017 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  8. Defining landscape resistance values in least-cost connectivity models for the invasive grey squirrel: a comparison of approaches using expert-opinion and habitat suitability modelling.

    PubMed

    Stevenson-Holt, Claire D; Watts, Kevin; Bellamy, Chloe C; Nevin, Owen T; Ramsey, Andrew D

    2014-01-01

    Least-cost models are widely used to study the functional connectivity of habitat within a varied landscape matrix. A critical step in the process is identifying resistance values for each land cover based upon the facilitating or impeding impact on species movement. Ideally resistance values would be parameterised with empirical data, but due to a shortage of such information, expert-opinion is often used. However, the use of expert-opinion is seen as subjective, human-centric and unreliable. This study derived resistance values from grey squirrel habitat suitability models (HSM) in order to compare the utility and validity of this approach with more traditional, expert-led methods. Models were built and tested with MaxEnt, using squirrel presence records and a categorical land cover map for Cumbria, UK. Predictions on the likelihood of squirrel occurrence within each land cover type were inverted, providing resistance values which were used to parameterise a least-cost model. The resulting habitat networks were measured and compared to those derived from a least-cost model built with previously collated information from experts. The expert-derived and HSM-inferred least-cost networks differ in precision. The HSM-informed networks were smaller and more fragmented because of the higher resistance values attributed to most habitats. These results are discussed in relation to the applicability of both approaches for conservation and management objectives, providing guidance to researchers and practitioners attempting to apply and interpret a least-cost approach to mapping ecological networks.
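
    A minimal sketch of the inversion and least-cost steps: grid habitat suitability predictions (e.g., MaxEnt output) are converted to resistance by inverting them, and the accumulated cost between two points is then computed with Dijkstra's algorithm over the grid. The grid values, the inversion formula, and the 4-neighbour movement rule are illustrative assumptions.

      import heapq
      import numpy as np

      # Illustrative habitat suitability grid (0..1), e.g. MaxEnt output on a coarse raster.
      suitability = np.array([
          [0.9, 0.8, 0.2, 0.7, 0.9],
          [0.8, 0.1, 0.1, 0.6, 0.8],
          [0.7, 0.6, 0.5, 0.2, 0.7],
          [0.3, 0.7, 0.8, 0.7, 0.9],
      ])
      # Invert suitability to landscape resistance (one common convention; an assumption here).
      resistance = 1.0 + 99.0 * (1.0 - suitability)       # 1 (permeable) .. 100 (hostile)

      def least_cost(res, start, goal):
          """Dijkstra over the 4-neighbour grid; moving into a cell costs its resistance."""
          rows, cols = res.shape
          dist = {start: 0.0}
          pq = [(0.0, start)]
          while pq:
              d, (r, c) = heapq.heappop(pq)
              if (r, c) == goal:
                  return d
              if d > dist.get((r, c), np.inf):
                  continue
              for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  nr, nc = r + dr, c + dc
                  if 0 <= nr < rows and 0 <= nc < cols:
                      nd = d + res[nr, nc]
                      if nd < dist.get((nr, nc), np.inf):
                          dist[(nr, nc)] = nd
                          heapq.heappush(pq, (nd, (nr, nc)))
          return np.inf

      cost = least_cost(resistance, start=(0, 0), goal=(3, 4))
      print(f"accumulated least-cost distance between patches: {cost:.1f}")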

  9. Simulation-based training for prostate surgery.

    PubMed

    Khan, Raheej; Aydin, Abdullatif; Khan, Muhammad Shamim; Dasgupta, Prokar; Ahmed, Kamran

    2015-10-01

    To identify and review the currently available simulators for prostate surgery and to explore the evidence supporting their validity for training purposes. A review of the literature between 1999 and 2014 was performed. The search terms included a combination of urology, prostate surgery, robotic prostatectomy, laparoscopic prostatectomy, transurethral resection of the prostate (TURP), simulation, virtual reality, animal model, human cadavers, training, assessment, technical skills, validation and learning curves. Furthermore, relevant abstracts from the American Urological Association, European Association of Urology, British Association of Urological Surgeons and World Congress of Endourology meetings, between 1999 and 2013, were included. Only studies related to prostate surgery simulators were included; studies regarding other urological simulators were excluded. A total of 22 validation studies were identified. Five validated models and/or simulators were identified for TURP, one for photoselective vaporisation of the prostate, two for holmium enucleation of the prostate, three for laparoscopic radical prostatectomy (LRP) and four for robot-assisted surgery. Of the TURP simulators, all five have demonstrated content validity, three face validity and four construct validity. The GreenLight laser simulator has demonstrated face, content and construct validity. The Kansai HoLEP Simulator has demonstrated face and content validity whilst the UroSim HoLEP Simulator has demonstrated face, content and construct validity. All three animal models for LRP have been shown to have construct validity whilst the chicken skin model was also content valid. Only two robotic simulators were identified with relevance to robot-assisted laparoscopic prostatectomy, both of which demonstrated construct validity. A wide range of different simulators are available for prostate surgery, including synthetic bench models, virtual-reality platforms, animal models, human cadavers, distributed simulation and advanced training programmes and modules. The currently validated simulators can be used by healthcare organisations to provide supplementary training sessions for trainee surgeons. Further research should be conducted to validate simulated environments, to determine which simulators have greater efficacy than others and to assess the cost-effectiveness of the simulators and the transferability of skills learnt. With surgeons investigating new possibilities for easily reproducible and valid methods of training, simulation offers great scope for implementation alongside traditional methods of training. © 2014 The Authors BJU International © 2014 BJU International Published by John Wiley & Sons Ltd.

  10. A single-vendor and a single-buyer integrated inventory model with ordering cost reduction dependent on lead time

    NASA Astrophysics Data System (ADS)

    Vijayashree, M.; Uthayakumar, R.

    2017-09-01

    Lead time is one of the major constraints that affect planning at every stage of the supply chain. This paper studies a continuous review inventory model in which ordering cost reduction depends on lead time, for a two-echelon supply chain consisting of a single vendor and a single buyer. The main contribution is that the integrated total cost of the vendor-buyer system is analyzed under two types of lead-time-dependent ordering cost reduction, linear and logarithmic. For each case, an effective solution procedure is developed to determine the optimal order quantity, ordering cost, lead time and number of deliveries per production run that minimize the integrated total cost, and the mathematical model is solved analytically. Numerical examples, solved in Matlab, are given to validate the model and illustrate the results, and a sensitivity analysis examines the effect of the major system parameters. The results indicate that the proposed integrated inventory model is applicable to supply chain manufacturing systems. Finally, graphical representations and a computational flowchart are presented for each model.
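
    As a sketch of the kind of numerical search such models require, the code below minimizes a generic single-vendor, single-buyer integrated cost over order quantity, lead time and number of deliveries, with the buyer's ordering cost assumed to fall logarithmically with lead time. The cost function and every parameter value are invented for illustration; they are not the authors' formulation.

      import math
      from itertools import product

      D, P = 600.0, 2000.0   # annual demand and production rate (hypothetical)
      h_b, h_v = 5.0, 4.0    # buyer / vendor holding cost per unit per year
      S = 400.0              # vendor setup cost per production run
      sigma, k = 7.0, 2.33   # weekly demand std dev and safety factor
      A0, a = 200.0, 40.0    # base ordering cost and logarithmic reduction coefficient

      def ordering_cost(L):
          # Assumed logarithmic fall of the buyer's ordering cost with lead time L (weeks).
          return max(A0 - a * math.log(L), 10.0)

      def total_cost(Q, L, n):
          # Generic integrated vendor-buyer cost: ordering + setup + holding + safety stock.
          buyer = (D / Q) * ordering_cost(L) + h_b * (Q / 2 + k * sigma * math.sqrt(L))
          vendor = (D / (n * Q)) * S + h_v * (Q / 2) * ((n - 1) - (n - 2) * D / P)
          return buyer + vendor

      # Brute-force search over a coarse grid of the decision variables.
      best = min((total_cost(Q, L, n), Q, L, n)
                 for Q, L, n in product(range(50, 401, 10), range(1, 9), range(1, 6)))
      print("minimum integrated cost %.1f at Q=%d, L=%d weeks, n=%d" % best)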

  11. Cost prediction following traumatic brain injury: model development and validation.

    PubMed

    Spitz, Gershon; McKenzie, Dean; Attwood, David; Ponsford, Jennie L

    2016-02-01

    The ability to predict costs following a traumatic brain injury (TBI) would assist in planning treatment and support services by healthcare providers, insurers and other agencies. The objective of the current study was to develop predictive models of hospital, medical, paramedical, and long-term care (LTC) costs for the first 10 years following a TBI. The sample comprised 798 participants with TBI, the majority of whom were male and aged between 15 and 34 at time of injury. Costing information was obtained for hospital, medical, paramedical, and LTC costs up to 10 years postinjury. Demographic and injury-severity variables were collected at the time of admission to the rehabilitation hospital. Duration of posttraumatic amnesia (PTA) was the most important single predictor for each cost type. The final models predicted 44% of hospital costs, 26% of medical costs, 23% of paramedical costs, and 34% of LTC costs. Greater costs were incurred, depending on cost type, for individuals with longer PTA duration, sustaining a limb or chest injury, a lower Glasgow Coma Scale (GCS) score, older age at injury, not being married or in a de facto relationship prior to injury, living in metropolitan areas, and those reporting premorbid excessive or problem alcohol use. This study has provided a comprehensive analysis of factors predicting various types of costs following TBI, with the combination of injury-related and demographic variables predicting 23-44% of costs. PTA duration was the strongest predictor across all cost categories. These factors may be used for the planning and case management of individuals following TBI. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  12. On-line experimental validation of a model-based diagnostic algorithm dedicated to a solid oxide fuel cell system

    NASA Astrophysics Data System (ADS)

    Polverino, Pierpaolo; Esposito, Angelo; Pianese, Cesare; Ludwig, Bastian; Iwanschitz, Boris; Mai, Andreas

    2016-02-01

    In the current energy scenario, Solid Oxide Fuel Cells (SOFCs) exhibit appealing features which make them suitable for environmentally friendly power production, especially for stationary applications. An example is represented by micro-combined heat and power (μ-CHP) generation units based on SOFC stacks, which are able to produce electric and thermal power with high efficiency and low pollutant and greenhouse gas emissions. However, the main limitations to their diffusion into the mass market are high maintenance and production costs and short lifetime. To improve these aspects, the current research activity focuses on the development of robust and generalizable diagnostic techniques, aimed at detecting and isolating faults within the entire system (i.e. SOFC stack and balance of plant). Coupled with appropriate recovery strategies, diagnosis can prevent undesired system shutdowns during faulty conditions, with a consequent increase in lifetime and reduction in maintenance costs. This paper deals with the on-line experimental validation of a model-based diagnostic algorithm applied to a pre-commercial SOFC system. The proposed algorithm exploits a Fault Signature Matrix based on a Fault Tree Analysis and improved through fault simulations. The algorithm is characterized on the considered system and it is validated by means of experimental induction of faulty states in controlled conditions.
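
    The isolation step behind a Fault Signature Matrix can be sketched in a few lines: each fault maps to a binary pattern of symptoms (residuals exceeding their thresholds), and the observed pattern is matched against the matrix. The fault names, symptoms and example pattern below are hypothetical and are not taken from the paper.

      # Columns: binary symptoms derived from monitored residuals (1 = threshold exceeded).
      symptoms = ["stack_voltage_low", "air_flow_low", "outlet_temp_high"]

      # Hypothetical fault signature matrix: expected symptom pattern for each fault.
      fsm = {
          "air_blower_degradation": (0, 1, 1),
          "fuel_leakage":           (1, 0, 0),
          "stack_degradation":      (1, 0, 1),
      }

      def isolate(observed):
          """Return the faults whose signature matches the observed symptom pattern."""
          return [fault for fault, signature in fsm.items() if signature == observed]

      # Example: residual evaluation flags low stack voltage and high outlet temperature.
      observed_pattern = (1, 0, 1)
      print("candidate faults:", isolate(observed_pattern))   # -> ['stack_degradation']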

  13. Pre- and Post-Planned Evaluation: Which Is Preferable?

    ERIC Educational Resources Information Center

    Strasser, Stephen; Deniston, O. Lynn

    1978-01-01

    Factors involved in pre-planned and post-planned evaluation of program effectiveness are compared: (1) reliability and cost of data; (2) internal and external validity; (3) obtrusiveness and threat; (4) goal displacement and program direction. A model to help program administrators decide which approach is more appropriate is presented. (Author/MH)

  14. Assessment of the Hypochondriasis Domain: The Multidimensional Inventory of Hypochondriacal Traits (MIHT)

    ERIC Educational Resources Information Center

    Longley, Susan L.; Watson, David; Noyes, Russell, Jr.

    2005-01-01

    Although hypochondriasis is associated with the costly use of unnecessary medical resources, this mental health problem remains largely neglected. A lack of clear conceptual models and valid measures has impeded accurate assessment and hindered progress. The Multidimensional Inventory of Hypochondriacal Traits (MIHT) addresses these deficiencies…

  15. Optimal test selection for prediction uncertainty reduction

    DOE PAGES

    Mullins, Joshua; Mahadevan, Sankaran; Urbina, Angel

    2016-12-02

    Economic factors and experimental limitations often lead to sparse and/or imprecise data used for the calibration and validation of computational models. This paper addresses resource allocation for calibration and validation experiments, in order to maximize their effectiveness within given resource constraints. When observation data are used for model calibration, the quality of the inferred parameter descriptions is directly affected by the quality and quantity of the data. This paper characterizes parameter uncertainty within a probabilistic framework, which enables the uncertainty to be systematically reduced with additional data. The validation assessment is also uncertain in the presence of sparse and imprecise data; therefore, this paper proposes an approach for quantifying the resulting validation uncertainty. Since calibration and validation uncertainty affect the prediction of interest, the proposed framework explores the decision of cost versus importance of data in terms of the impact on the prediction uncertainty. Often, calibration and validation tests may be performed for different input scenarios, and this paper shows how the calibration and validation results from different conditions may be integrated into the prediction. Then, a constrained discrete optimization formulation that selects the number of tests of each type (calibration or validation at given input conditions) is proposed. Furthermore, the proposed test selection methodology is demonstrated on a microelectromechanical system (MEMS) example.
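
    A toy version of the constrained discrete selection can be written as a brute-force search: under assumed per-test costs and an assumed diminishing-returns model of how each test type shrinks prediction uncertainty, enumerate the feasible numbers of calibration and validation tests and keep the lowest-uncertainty combination within budget. The uncertainty model, costs and budget below are placeholders, not the paper's formulation.

      from itertools import product

      budget = 10.0
      cost_cal, cost_val = 2.0, 1.5    # assumed cost per calibration / validation test

      def predicted_uncertainty(n_cal, n_val):
          # Placeholder diminishing-returns model: more tests of either type shrink
          # the prediction uncertainty, with different effectiveness per test type.
          return 1.0 / (1.0 + 0.6 * n_cal) + 0.5 / (1.0 + 0.4 * n_val)

      feasible = [(n_cal, n_val)
                  for n_cal, n_val in product(range(0, 6), range(0, 8))
                  if n_cal * cost_cal + n_val * cost_val <= budget]

      best = min(feasible, key=lambda nv: predicted_uncertainty(*nv))
      print("best allocation (calibration, validation):", best,
            "uncertainty %.3f" % predicted_uncertainty(*best))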

  16. High fidelity, low cost moulage as a valid simulation tool to improve burns education.

    PubMed

    Pywell, M J; Evgeniou, E; Highway, K; Pitt, E; Estela, C M

    2016-06-01

    Simulation allows the opportunity for repeated practice in controlled, safe conditions. Moulage uses materials such as makeup to simulate clinical presentations. Moulage fidelity can be assessed by face validity (realism) and content validity (appropriateness). The aim of this project is to compare the fidelity of professional moulage to non-professional moulage in the context of a burns management course. Four actors were randomly assigned to a professional make-up artist or a course faculty member for moulage preparation such that two actors were in each group. Participants completed the actor-based burn management scenarios and answered a ten-question Likert-scale questionnaire on face and content validity. Mean scores and a linear mixed effects model were used to compare professional and non-professional moulage. Cronbach's alpha assessed internal consistency. Twenty participants experienced three out of four scenarios and at the end of the course completed a total of 60 questionnaires. Professional moulage had higher average ratings for face (4.30 v 3.80; p=0.11) and content (4.30 v 4.00; p=0.06) validity. Internal consistency of face (α=0.91) and content (α=0.85) validity questions was very good. The fidelity of professionally prepared moulage, as assessed by content validity, was higher than non-professionally prepared moulage. We have shown that using professional techniques and low cost materials we can prepare quality high fidelity moulage simulations. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  17. The EPQ model under conditions of two levels of trade credit and limited storage capacity in supply chain management

    NASA Astrophysics Data System (ADS)

    Chung, Kun-Jen

    2013-09-01

    An inventory problem involves many factors that influence inventory decisions, and the traditional economic production quantity (EPQ) model plays an important role in analysing them. Although traditional EPQ models are still widely used in industry, practitioners frequently question the validity of their assumptions, so that their use encounters challenges and difficulties. This article therefore presents a new inventory model that considers two levels of trade credit, a finite replenishment rate and limited storage capacity together, relaxing the basic assumptions of the traditional EPQ model. Following a cost-minimisation strategy, four easy-to-use theorems are developed to characterise the optimal solution. Finally, sensitivity analyses are carried out to investigate the effects of the various parameters on ordering policies and on the annual total relevant cost of the inventory system.
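
    For orientation, the classical finite-replenishment EPQ lot size that such extensions build on has the closed form sketched below; the trade-credit and storage-capacity terms added by the article are not reproduced, and the parameter values are purely illustrative.

      from math import sqrt

      D = 4000.0   # annual demand (units/year), illustrative
      P = 10000.0  # annual production (replenishment) rate, illustrative
      K = 150.0    # setup cost per production run, illustrative
      h = 2.5      # holding cost per unit per year, illustrative

      # Classical finite-replenishment EPQ: Q* = sqrt(2DK / (h(1 - D/P)))
      Q_star = sqrt(2 * D * K / (h * (1 - D / P)))
      annual_cost = D / Q_star * K + h * (1 - D / P) * Q_star / 2
      print("EPQ lot size %.0f units, annual setup+holding cost %.2f" % (Q_star, annual_cost))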

  18. The participative method of subject definition as used in the quantitative modelling of hospital laundry services.

    PubMed

    Hammer, K A; Janes, F R

    1995-01-01

    The objectives for developing the participative method of subject definition were to gain all the relevant information to a high level of fidelity in the earliest stages of the work and so be able to build a realistic model at reduced labour cost. In order to better integrate the two activities--information acquisition and mathematical modelling--a procedure was devised using the methods of interactive management to facilitate teamwork. This procedure provided the techniques to create suitable working relationships between the two groups, the informants and the modellers, so as to maximize their free and accurate intercommunication, both during the initial definition of the linen service and during the monitoring of the accuracy and reality of the draft models. The objectives of this project were met in that the final model was quickly validated and approved, at a low labour cost.

  19. An instrument to assess subjective task value beliefs regarding the decision to pursue postgraduate training.

    PubMed

    Hagemeier, Nicholas E; Murawski, Matthew M

    2014-02-12

    To develop and validate an instrument to assess subjective ratings of the perceived value of various postgraduate training paths followed using expectancy-value as a theoretical framework; and to explore differences in value beliefs across type of postgraduate training pursued and type of pharmacy training completed prior to postgraduate training. A survey instrument was developed to sample 4 theoretical domains of subjective task value: intrinsic value, attainment value, utility value, and perceived cost. Retrospective self-report methodology was employed to examine respondents' (N=1,148) subjective task value beliefs specific to their highest level of postgraduate training completed. Exploratory and confirmatory factor analytic techniques were used to evaluate and validate value belief constructs. Intrinsic, attainment, utility, cost, and financial value constructs resulted from exploratory factor analysis. Cross-validation resulted in a 26-item instrument that demonstrated good model fit. Differences in value beliefs were noted across type of postgraduate training pursued and pharmacy training characteristics. The Postgraduate Training Value Instrument demonstrated evidence of reliability and construct validity. The survey instrument can be used to assess value beliefs regarding multiple postgraduate training options in pharmacy and potentially inform targeted recruiting of individuals to those paths best matching their own value beliefs.

  20. Computer simulation models of pre-diabetes populations: a systematic review protocol

    PubMed Central

    Khurshid, Waqar; Pagano, Eva; Feenstra, Talitha

    2017-01-01

    Introduction Diabetes is a major public health problem and prediabetes (intermediate hyperglycaemia) is associated with a high risk of developing diabetes. With evidence supporting the use of preventive interventions for prediabetes populations and the discovery of novel biomarkers stratifying the risk of progression, there is a need to evaluate their cost-effectiveness across jurisdictions. In diabetes and prediabetes, it is relevant to inform cost-effectiveness analysis using decision models due to their ability to forecast long-term health outcomes and costs beyond the time frame of clinical trials. To support good implementation and reimbursement decisions of interventions in these populations, models should be clinically credible, based on best available evidence, reproducible and validated against clinical data. Our aim is to identify recent studies on computer simulation models and model-based economic evaluations of populations of individuals with prediabetes, qualify them and discuss the knowledge gaps, challenges and opportunities that need to be addressed for future evaluations. Methods and analysis A systematic review will be conducted in MEDLINE, Embase, EconLit and National Health Service Economic Evaluation Database. We will extract peer-reviewed studies published between 2000 and 2016 that describe computer simulation models of the natural history of individuals with prediabetes and/or decision models to evaluate the impact of interventions, risk stratification and/or screening on these populations. Two reviewers will independently assess each study for inclusion. Data will be extracted using a predefined pro forma developed using best practice. Study quality will be assessed using a modelling checklist. A narrative synthesis of all studies will be presented, focussing on model structure, quality of models and input data, and validation status. Ethics and dissemination This systematic review is exempt from ethics approval because the work is carried out on published documents. The findings of the review will be disseminated in a related peer-reviewed journal and presented at conferences. Review registration number CRD42016047228. PMID:28982807

  1. [Drug expenditure in primary care: associated variables and allocation of drug budgets according to health district].

    PubMed

    García-Sempere, A; Peiró, S

    2001-01-01

    To identify factors explaining variability in prescribing costs from ecological data related to costs and socio-demographic characteristics of the health care zones in the autonomous region of Valencia, and to explore the usefulness of using the model to set prescribing budgets in basic healthcare zones. An ecological analysis of the value of socio-demographic characteristics and use of healthcare services in explaining prescribing costs in 1997. A prediction model based on multiple linear regression was developed using data on prescribing costs in 1997 and validated using data for 1998. Factors that correlated positively with prescribing costs were the percentage of inhabitants over the age of 80, the death rate, the percentage of inhabitants with only primary education or less, the percentage of inhabitants between the ages of 65 and 79 and the distance from the capital city. A multivariate model including the death rate, the percentage of inhabitants 80 years of age and older, the number of cars per 100 inhabitants and number of visits per inhabitant accounted for 44.5% of the variations in prescribing costs in 1997 and 32% in 1998. Socio-demographic factors and certain variables associated with health care utilization can be applied, within certain limitations, to set prescribing budgets in basic healthcare zones.
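
    The budgeting approach described above is essentially an ordinary least-squares regression of zone-level prescribing cost on socio-demographic and utilisation covariates, with the fitted values serving as need-adjusted budgets. The sketch below uses statsmodels on fabricated zone data; the variable names echo the abstract, but all numbers are invented.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n_zones = 60

      # Fabricated zone-level data mimicking the covariates reported in the abstract.
      zones = pd.DataFrame({
          "death_rate":     rng.uniform(5, 15, n_zones),   # deaths per 1000 inhabitants
          "pct_over_80":    rng.uniform(2, 10, n_zones),   # % inhabitants aged 80+
          "cars_per_100":   rng.uniform(20, 60, n_zones),
          "visits_per_inh": rng.uniform(3, 9, n_zones),
      })
      zones["cost_per_inh"] = (40 + 3 * zones.death_rate + 6 * zones.pct_over_80
                               + 4 * zones.visits_per_inh + rng.normal(0, 15, n_zones))

      # Fit the prediction model on one year of data ...
      model = smf.ols("cost_per_inh ~ death_rate + pct_over_80 + cars_per_100 "
                      "+ visits_per_inh", data=zones).fit()
      print("adjusted R-squared:", round(model.rsquared_adj, 3))

      # ... and use the fitted values as need-adjusted prescribing budgets per zone.
      zones["budget_per_inh"] = model.predict(zones)
      print(zones[["cost_per_inh", "budget_per_inh"]].head())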

  2. Cold-end Subsystem Testing for the Fission Power System Technology Demonstration Unit

    NASA Technical Reports Server (NTRS)

    Briggs, Maxwell; Gibson, Marc; Ellis, David; Sanzi, James

    2013-01-01

    The Fission Power System (FPS) Technology Demonstration Unit (TDU) consists of a pumped sodium-potassium (NaK) loop that provides heat to a Stirling Power Conversion Unit (PCU), which converts some of that heat into electricity and rejects the waste heat to a pumped water loop. Each of the TDU subsystems is being tested independently prior to full system testing at the NASA Glenn Research Center. The pumped NaK loop is being tested at NASA Marshall Space Flight Center; the Stirling PCU and electrical controller are being tested by Sunpower Inc.; and the pumped water loop is being tested at Glenn. This paper describes cold-end subsystem setup and testing at Glenn. The TDU cold end has been assembled in Vacuum Facility 6 (VF 6) at Glenn, the same chamber that will be used for TDU testing. Cold-end testing in VF 6 will demonstrate functionality; validate the cold-end fill, drain, and emergency backup systems; and generate pump performance and system pressure drop data used to validate models. In addition, a low-cost proof-of-concept radiator has been built and tested at Glenn, validating the design and demonstrating the feasibility of using low-cost metal radiators as an alternative to high-cost composite radiators in an end-to-end TDU test.

  3. A nearest neighbor approach for automated transporter prediction and categorization from protein sequences.

    PubMed

    Li, Haiquan; Dai, Xinbin; Zhao, Xuechun

    2008-05-01

    Membrane transport proteins play a crucial role in the import and export of ions, small molecules or macromolecules across biological membranes. Currently, there are a limited number of published computational tools which enable the systematic discovery and categorization of transporters prior to costly experimental validation. To approach this problem, we utilized a nearest neighbor method which seamlessly integrates homologous search and topological analysis into a machine-learning framework. Our approach satisfactorily distinguished 484 transporter families in the Transporter Classification Database, a curated and representative database for transporters. A five-fold cross-validation on the database achieved a positive classification rate of 72.3% on average. Furthermore, this method successfully detected transporters in seven model and four non-model organisms, ranging from archaean to mammalian species. A preliminary literature-based validation has cross-validated 65.8% of our predictions on the 11 organisms, including 55.9% of our predictions overlapping with 83.6% of the predicted transporters in TransportDB.
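
    A stripped-down analogue of the classification set-up is a nearest-neighbour classifier evaluated with five-fold cross-validation. Real transporter prediction works on homology and topology features derived from protein sequences; here the features and family labels are random stand-ins, used only to show the scaffolding.

      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)

      # Stand-in feature matrix (e.g. similarity/topology descriptors) and family labels.
      X = rng.normal(size=(300, 20))
      y = rng.integers(0, 4, size=300)           # 4 mock transporter families

      clf = KNeighborsClassifier(n_neighbors=1)  # nearest-neighbour assignment
      scores = cross_val_score(clf, X, y, cv=5)  # five-fold cross-validation
      print("fold accuracies:", np.round(scores, 3), "mean: %.3f" % scores.mean())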

  4. The cost of starting and maintaining a large home hemodialysis program.

    PubMed

    Komenda, Paul; Copland, Michael; Makwana, Jay; Djurdjev, Ogdjenka; Sood, Manish M; Levin, Adeera

    2010-06-01

    Home extended hours hemodialysis improves some measurable biological and quality-of-life parameters over conventional renal replacement therapies in patients with end-stage renal disease. Published small studies evaluating costs have shown savings in terms of ongoing operating costs with this modality. However, all estimates need to include the total costs, including infrastructure, patient training, and maintenance; patient attrition by death, transplantation, technique failure; and the necessity of in-center dialysis. We describe a comprehensive funding model for a large centrally administered but locally delivered home hemodialysis program in British Columbia, Canada that covered 122 patients, of which 113 were still in the program at study end. The majority of patients performed home nocturnal hemodialysis in this 2-year retrospective study. All training periods, both in-center and in-home dialysis, medications, hospitalizations, and deaths were captured using our provincial renal database and vital statistics. Comparative data from the provincial database and pricing models were used for costing purposes. The total comprehensive costs per patient-incorporating startup, home, and in-center dialysis; medications; home remodeling; and consumables-was $59,179 for years 2004-2005 and $48,648 for 2005-2006. The home dialysis patients required multiple in-center dialysis runs, significantly contributing to the overall costs. Our study describes a valid, comprehensive funding model delineating reliable cost estimates of starting and maintaining a large home-based hemodialysis program. Consideration of hidden costs is important for administrators and planners to take into account when designing budgets for home hemodialysis.

  5. A machine learning model to predict the risk of 30-day readmissions in patients with heart failure: a retrospective analysis of electronic medical records data.

    PubMed

    Golas, Sara Bersche; Shibahara, Takuma; Agboola, Stephen; Otaki, Hiroko; Sato, Jumpei; Nakae, Tatsuya; Hisamitsu, Toru; Kojima, Go; Felsted, Jennifer; Kakarmath, Sujay; Kvedar, Joseph; Jethwani, Kamal

    2018-06-22

    Heart failure is one of the leading causes of hospitalization in the United States. Advances in big data solutions allow for storage, management, and mining of large volumes of structured and semi-structured data, such as complex healthcare data. Applying these advances to complex healthcare data has led to the development of risk prediction models to help identify patients who would benefit most from disease management programs in an effort to reduce readmissions and healthcare costs, but the results of these efforts have been varied. The primary aim of this study was to develop a 30-day readmission risk prediction model for heart failure patients discharged from a hospital admission. We used longitudinal electronic medical record data of heart failure patients admitted within a large healthcare system. Feature vectors included structured demographic, utilization, and clinical data, as well as selected extracts of unstructured data from clinician-authored notes. The risk prediction model was developed using deep unified networks (DUNs), a new mesh-like network structure of deep learning designed to avoid over-fitting. The model was validated with 10-fold cross-validation and the results compared with models based on logistic regression, gradient boosting, and maxout networks. Overall model performance was assessed using the concordance statistic. We also selected a discrimination threshold based on maximum projected cost saving to the Partners Healthcare system. Data from 11,510 patients with 27,334 admissions and 6369 30-day readmissions were used to train the model. After data processing, the final model included 3512 variables. The DUNs model had the best performance after 10-fold cross-validation. AUCs for prediction models were 0.664 ± 0.015, 0.650 ± 0.011, 0.695 ± 0.016 and 0.705 ± 0.015 for logistic regression, gradient boosting, maxout networks, and DUNs, respectively. The DUNs model had an accuracy of 76.4% at the classification threshold that corresponded with maximum cost saving to the hospital. Deep learning techniques performed better than other traditional techniques in developing this EMR-based prediction model for 30-day readmissions in heart failure patients. Such models can be used to identify heart failure patients with impending hospitalization, enabling care teams to target interventions at their most high-risk patients and improving overall clinical outcomes.
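
    The model-comparison step (cross-validated AUC for logistic regression versus gradient boosting) can be reproduced in outline with scikit-learn. The DUNs architecture itself is not publicly packaged, so it is omitted here, and synthetic data stand in for the EMR features.

      from sklearn.datasets import make_classification
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      # Synthetic stand-in for the EMR feature matrix and 30-day readmission labels.
      X, y = make_classification(n_samples=2000, n_features=50, n_informative=10,
                                 weights=[0.77, 0.23], random_state=0)

      models = {
          "logistic regression": LogisticRegression(max_iter=1000),
          "gradient boosting":   GradientBoostingClassifier(random_state=0),
      }

      for name, model in models.items():
          auc = cross_val_score(model, X, y, cv=10, scoring="roc_auc")  # 10-fold CV AUC
          print("%s: AUC %.3f +/- %.3f" % (name, auc.mean(), auc.std()))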

  6. Assessing the Battery Cost at Which Plug-In Hybrid Medium-Duty Parcel Delivery Vehicles Become Cost-Effective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramroth, L. A.; Gonder, J. D.; Brooker, A. D.

    2013-04-01

    The National Renewable Energy Laboratory (NREL) validated diesel-conventional and diesel-hybrid medium-duty parcel delivery vehicle models to evaluate petroleum reductions and cost implications of hybrid and plug-in hybrid diesel variants. The hybrid and plug-in hybrid variants are run on a field data-derived design matrix to analyze the effect of drive cycle, distance, engine downsizing, battery replacements, and battery energy on fuel consumption and lifetime cost. For an array of diesel fuel costs, the battery cost per kilowatt-hour at which the hybridized configuration becomes cost-effective is calculated. This builds on a previous analysis that found the fuel savings from medium duty plug-in hybrids more than offset the vehicles' incremental price under future battery and fuel cost projections, but that they seldom did so under present day cost assumptions in the absence of purchase incentives. The results also highlight the importance of understanding the application's drive cycle specific daily distance and kinetic intensity.
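
    The underlying break-even calculation is simple arithmetic: the battery price per kilowatt-hour at which lifetime fuel savings offset the vehicle's incremental price. The figures below are placeholders chosen only to show the calculation; they are not NREL's inputs.

      # Placeholder inputs -- not NREL's values.
      fuel_price = 4.00               # $/gallon diesel
      annual_fuel_saved = 500.0       # gallons/year saved by the plug-in hybrid
      service_years = 10
      battery_kwh = 30.0              # pack size
      non_battery_increment = 8000.0  # extra vehicle cost excluding the battery, $

      lifetime_savings = fuel_price * annual_fuel_saved * service_years   # undiscounted
      breakeven_battery_cost = (lifetime_savings - non_battery_increment) / battery_kwh
      print("battery becomes cost-effective below $%.0f per kWh" % breakeven_battery_cost)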

  7. Testing Software Development Project Productivity Model

    NASA Astrophysics Data System (ADS)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available to guide an organization's software development estimates, and existing estimation models often underestimate software development effort by as much as 500 to 600 percent. To address this, existing models are usually calibrated using local data with small sample sizes, and the resulting estimates do not offer improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and a theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, command and control, and simulation. This research validates findings from previous work on software project productivity and leverages those results. The hypothesized project productivity model provides statistical support and validation of expert opinions used by practitioners in the field of software project estimation.

  8. The space shuttle payload planning working groups. Volume 8: Earth and ocean physics

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The findings and recommendations of the Earth and Ocean Physics working group of the space shuttle payload planning activity are presented. The requirements for the space shuttle mission are defined as: (1) precision measurement for earth and ocean physics experiments, (2) development and demonstration of new and improved sensors and analytical techniques, (3) acquisition of surface truth data for evaluation of new measurement techniques, (4) conduct of critical experiments to validate geophysical phenomena and instrumental results, and (5) development and validation of analytical/experimental models for global ocean dynamics and solid earth dynamics/earthquake prediction. Tables of data are presented to show the flight schedule estimated costs, and the mission model.

  9. Semipermeable Hollow Fiber Phantoms for Development and Validation of Perfusion-Sensitive MR Methods and Signal Models

    PubMed Central

    Anderson, J.R.; Ackerman, J.J.H.; Garbow, J.R.

    2015-01-01

    Two semipermeable, hollow fiber phantoms for the validation of perfusion-sensitive magnetic resonance methods and signal models are described. Semipermeable hollow fibers harvested from a standard commercial hemodialysis cartridge serve to mimic tissue capillary function. Flow of aqueous media through the fiber lumen is achieved with a laboratory-grade peristaltic pump. Diffusion of water and solute species (e.g., Gd-based contrast agent) occurs across the fiber wall, allowing exchange between the lumen and the extralumenal space. Phantom design attributes include: i) small physical size, ii) easy and low-cost construction, iii) definable compartment volumes, and iv) experimental control over media content and flow rate. PMID:26167136

  10. Evaluation of dynamical models: dissipative synchronization and other techniques.

    PubMed

    Aguirre, Luis Antonio; Furtado, Edgar Campos; Tôrres, Leonardo A B

    2006-12-01

    Some recent developments for the validation of nonlinear models built from data are reviewed. Besides giving an overall view of the field, a procedure is proposed and investigated based on the concept of dissipative synchronization between the data and the model, which is very useful in validating models that should reproduce dominant dynamical features, like bifurcations, of the original system. In order to assess the discriminating power of the procedure, four well-known benchmarks have been used: namely, Duffing-Ueda, Duffing-Holmes, and van der Pol oscillators, plus the Hénon map. The procedure, developed for discrete-time systems, is focused on the dynamical properties of the model, rather than on statistical issues. For all the systems investigated, it is shown that the discriminating power of the procedure is similar to that of bifurcation diagrams--which in turn is much greater than, say, that of correlation dimension--but at a much lower computational cost.

  11. Cost-utility analysis of bariatric surgery compared with conventional medical management in Germany: a decision analytic modeling.

    PubMed

    Borisenko, Oleg; Mann, Oliver; Duprée, Anna

    2017-08-03

    The objective was to evaluate the cost-utility of bariatric surgery in Germany over lifetime and 10-year horizons from a health care payer perspective. A state-transition Markov model provided absolute and incremental clinical and monetary results. In the model, obese patients could undergo surgery, develop post-surgery complications, develop type II diabetes or cardiovascular disease, or die. The German Quality Assurance in Bariatric Surgery Registry and literature sources provided data on clinical effectiveness and safety. The model considered three types of surgeries: gastric bypass, sleeve gastrectomy, and adjustable gastric banding. The model was extensively validated, and deterministic and probabilistic sensitivity analyses were performed to evaluate uncertainty. Cost data were obtained from German sources and presented in 2012 euros (€). Over 10 years, bariatric surgery led to an incremental cost of €2909 and generated an additional 0.03 years of life and 1.2 quality-adjusted life years (QALYs). Bariatric surgery was cost-effective at 10 years with an incremental cost-effectiveness ratio of €2457 per QALY. Over a lifetime, surgery led to savings of €8522 and generated an increment of 0.7 years of life or 3.2 QALYs. The analysis also depicted an association between surgery and a reduction of obesity-related adverse events (diabetes, cardiovascular disorders). Delaying surgery for up to 3 years resulted in a reduction of life years and QALYs gained, in addition to a moderate reduction in associated healthcare costs. Bariatric surgery is cost-effective at 10 years post-surgery and may result in a substantial reduction in the financial burden on the healthcare system over the lifetime of the treated individuals. It is also observed that delays in the provision of surgery may lead to a significant loss of clinical benefits.
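
    A toy state-transition (Markov) cohort model below shows how incremental costs, QALYs and the ICER are accumulated over a 10-year horizon. The states, transition probabilities, costs and utilities are invented for illustration and are not the parameters of the German model.

      import numpy as np

      # States (in order): obese, post_surgery, diabetes, dead.
      # Annual transition matrices (rows sum to 1) -- invented, not the paper's values.
      P_conservative = np.array([[0.90, 0.00, 0.08, 0.02],
                                 [0.00, 1.00, 0.00, 0.00],   # state unused in this arm
                                 [0.00, 0.00, 0.96, 0.04],
                                 [0.00, 0.00, 0.00, 1.00]])
      P_surgery      = np.array([[0.00, 0.97, 0.02, 0.01],   # cohort operated in cycle 1
                                 [0.02, 0.93, 0.03, 0.02],
                                 [0.00, 0.05, 0.91, 0.04],
                                 [0.00, 0.00, 0.00, 1.00]])

      annual_cost = {"conservative": np.array([1000.0,    0.0, 3500.0, 0.0]),
                     "surgery":      np.array([1000.0, 1000.0, 3500.0, 0.0])}
      utility = np.array([0.70, 0.85, 0.60, 0.00])
      discount, horizon = 0.03, 10

      def run(P, cost_per_state, upfront=0.0):
          dist = np.array([1.0, 0.0, 0.0, 0.0])         # whole cohort starts obese
          total_cost, total_qaly = upfront, 0.0
          for year in range(horizon):
              dist = dist @ P                            # advance the cohort one cycle
              df = 1.0 / (1.0 + discount) ** (year + 1)  # discount factor
              total_cost += df * float(dist @ cost_per_state)
              total_qaly += df * float(dist @ utility)
          return total_cost, total_qaly

      c0, q0 = run(P_conservative, annual_cost["conservative"])
      c1, q1 = run(P_surgery, annual_cost["surgery"], upfront=12000.0)
      print("incremental cost %.0f EUR, incremental QALYs %.2f, ICER %.0f EUR/QALY"
            % (c1 - c0, q1 - q0, (c1 - c0) / (q1 - q0)))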

  12. Validation of a reduced-order jet model for subsonic and underexpanded hydrogen jets

    DOE PAGES

    Li, Xuefang; Hecht, Ethan S.; Christopher, David M.

    2016-01-01

    Much effort has been made to model hydrogen releases from leaks during potential failures of hydrogen storage systems. A reduced-order jet model can be used to quickly characterize these flows, with low computational cost. Notional nozzle models are often used to avoid modeling the complex shock structures produced by the underexpanded jets by determining an "effective" source to produce the observed downstream trends. In our work, the mean hydrogen concentration fields were measured in a series of subsonic and underexpanded jets using a planar laser Rayleigh scattering system. Furthermore, we compared the experimental data to a reduced order jet model for subsonic flows and a notional nozzle model coupled to the jet model for underexpanded jets. The values of some key model parameters were determined by comparisons with the experimental data. Finally, the coupled model was also validated against hydrogen concentrations measurements for 100 and 200 bar hydrogen jets with the predictions agreeing well with data in the literature.

  13. Development and validation of a preference based measure derived from the Cambridge Pulmonary Hypertension Outcome Review (CAMPHOR) for use in cost utility analyses.

    PubMed

    McKenna, Stephen P; Ratcliffe, Julie; Meads, David M; Brazier, John E

    2008-08-21

    Pulmonary Hypertension is a severe and incurable disease with poor prognosis. A suite of new disease-specific measures, the Cambridge Pulmonary Hypertension Outcome Review (CAMPHOR), was recently developed for use in this condition. The purpose of this study was to develop and validate a preference-based measure from the CAMPHOR that could be used in cost-utility analyses. Items were selected to cover the major issues addressed by the CAMPHOR QoL scale (activities, travelling, dependence and communication). These were used to create 36 health states that were valued by 249 people representative of the UK adult population, using the time trade-off (TTO) technique. Data from the TTO interviews were analysed using both aggregate and individual level modelling. Finally, the original CAMPHOR validation data were used to validate the new preference-based model. The predicted health state values ranged from 0.962 to 0.136. The mean-level model selected for analysing the data had good explanatory power (0.936), did not systematically over- or underestimate the observed mean health state values and showed no evidence of autocorrelation in the prediction errors. The value of less than 1 reflects a background level of ill health in state 1111, as judged by the respondents. Scores derived from the new measure had excellent test-retest reliability (0.85) and construct validity. The CAMPHOR utility score appears better able to distinguish between WHO functional classes (II and III) than the EQ-5D and SF-6D. The tariff derived in this study can be used to classify an individual into a health state based on their responses to the CAMPHOR. The results of this study widen the evidence base for conducting economic evaluations of interventions designed to improve QoL for patients with PH.

  14. Molecular descriptor subset selection in theoretical peptide quantitative structure-retention relationship model development using nature-inspired optimization algorithms.

    PubMed

    Žuvela, Petar; Liu, J Jay; Macur, Katarzyna; Bączek, Tomasz

    2015-10-06

    In this work, performance of five nature-inspired optimization algorithms, genetic algorithm (GA), particle swarm optimization (PSO), artificial bee colony (ABC), firefly algorithm (FA), and flower pollination algorithm (FPA), was compared in molecular descriptor selection for development of quantitative structure-retention relationship (QSRR) models for 83 peptides that originate from eight model proteins. The matrix with 423 descriptors was used as input, and QSRR models based on selected descriptors were built using partial least squares (PLS), whereas root mean square error of prediction (RMSEP) was used as a fitness function for their selection. Three performance criteria, prediction accuracy, computational cost, and the number of selected descriptors, were used to evaluate the developed QSRR models. The results show that all five variable selection methods outperform interval PLS (iPLS), sparse PLS (sPLS), and the full PLS model, whereas GA is superior because of its lowest computational cost and higher accuracy (RMSEP of 5.534%) with a smaller number of variables (nine descriptors). The GA-QSRR model was validated initially through Y-randomization. In addition, it was successfully validated with an external testing set out of 102 peptides originating from Bacillus subtilis proteomes (RMSEP of 22.030%). Its applicability domain was defined, from which it was evident that the developed GA-QSRR exhibited strong robustness. All the sources of the model's error were identified, thus allowing for further application of the developed methodology in proteomics.
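
    The selection loop can be sketched as a small genetic algorithm wrapped around a cross-validated PLS model, with RMSEP as the fitness function, as in the study. The data below are synthetic stand-ins for the peptide descriptor matrix, and the GA settings (population size, mutation rate, generations) are arbitrary illustrative choices.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(42)

      # Synthetic stand-in: 83 "peptides" x 60 descriptors, retention driven by 8 of them
      # (the real study used 423 descriptors).
      n, p, informative = 83, 60, 8
      X = rng.normal(size=(n, p))
      y = X[:, :informative] @ rng.normal(size=informative) + rng.normal(0, 0.3, n)

      def rmsep(mask):
          """Cross-validated RMSEP of a PLS model built on the selected descriptors."""
          if mask.sum() < 2:
              return np.inf
          pls = PLSRegression(n_components=min(2, int(mask.sum())))
          pred = cross_val_predict(pls, X[:, mask], y, cv=5).ravel()
          return float(np.sqrt(np.mean((y - pred) ** 2)))

      # Minimal genetic algorithm over binary descriptor-selection masks.
      pop = rng.random((30, p)) < 0.2            # initial population of masks
      for generation in range(25):
          fitness = np.array([rmsep(ind) for ind in pop])
          order = np.argsort(fitness)            # lower RMSEP = fitter
          parents = pop[order[:10]]
          children = []
          while len(children) < len(pop) - len(parents):
              a, b = parents[rng.integers(10)], parents[rng.integers(10)]
              cut = rng.integers(1, p)           # one-point crossover
              child = np.concatenate([a[:cut], b[cut:]])
              child ^= rng.random(p) < 0.02      # bit-flip mutation
              children.append(child)
          pop = np.vstack([parents, children])

      best = pop[np.argmin([rmsep(ind) for ind in pop])]
      print("selected %d descriptors, cross-validated RMSEP %.3f" % (best.sum(), rmsep(best)))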

  15. A semi-analytical bearing model considering outer race flexibility for model based bearing load monitoring

    NASA Astrophysics Data System (ADS)

    Kerst, Stijn; Shyrokau, Barys; Holweg, Edward

    2018-05-01

    This paper proposes a novel semi-analytical bearing model addressing flexibility of the bearing outer race structure. It furthermore presents the application of this model in a bearing load condition monitoring approach. The bearing model is developed because current computationally low-cost bearing models fail to provide an accurate description of the increasingly common flexible, size- and weight-optimized bearing designs due to their assumptions of rigidity. In the proposed bearing model, raceway flexibility is described by the use of static deformation shapes. The excitation of the deformation shapes is calculated based on the modelled rolling element loads and a Fourier series based compliance approximation. The resulting model is computationally low cost and provides an accurate description of the rolling element loads for flexible outer raceway structures. The latter is validated by a simulation-based comparison study with a well-established bearing simulation software tool. An experimental study finally shows the potential of the proposed model in a bearing load monitoring approach.

  16. Validation of ACG Case-mix for equitable resource allocation in Swedish primary health care.

    PubMed

    Zielinski, Andrzej; Kronogård, Maria; Lenhoff, Håkan; Halling, Anders

    2009-09-18

    Adequate resource allocation is an important factor to ensure equity in health care. Previous reimbursement models have been based on age, gender and socioeconomic factors. An explanatory model based on individual need of primary health care (PHC) has not yet been used in Sweden to allocate resources. The aim of this study was to examine to what extent the ACG case-mix system could explain concurrent costs in Swedish PHC. Diagnoses were obtained from electronic PHC records of inhabitants in Blekinge County (approx. 150,000) listed with public PHC (approx. 120,000) for three consecutive years, 2004-2006. The inhabitants were then classified into six different resource utilization bands (RUB) using the ACG case-mix system. The mean costs for primary health care were calculated for each RUB and year. Using linear regression models and log-cost as dependent variable the adjusted R2 was calculated in the unadjusted model (gender) and in consecutive models where age, listing with specific PHC and RUB were added. In an additional model the ACG groups were added. Gender, age and listing with specific PHC explained 14.48-14.88% of the variance in individual costs for PHC. By also adding information on level of co-morbidity, as measured by the ACG case-mix system, to specific PHC the adjusted R2 increased to 60.89-63.41%. The ACG case-mix system explains patient costs in primary care to a high degree. Age and gender are important explanatory factors, but most of the variance in concurrent patient costs was explained by the ACG case-mix system.

  17. A systems approach to healthcare: agent-based modeling, community mental health, and population well-being.

    PubMed

    Silverman, Barry G; Hanrahan, Nancy; Bharathy, Gnana; Gordon, Kim; Johnson, Dan

    2015-02-01

    Explore whether agent-based modeling and simulation can help healthcare administrators discover interventions that increase population wellness and quality of care while, simultaneously, decreasing costs. Since important dynamics often lie in the social determinants outside the health facilities that provide services, this study models the problem at three levels (individuals, organizations, and society). The study explores the utility of translating existing (prize-winning) software for modeling complex societal systems and agents' daily life activities (in the style of SimCity) into the desired decision support system. A case study tests whether the three-level system modeling approach is feasible, valid, and useful. The case study involves an urban population with serious mental illness, specifically Philadelphia's Medicaid population (n=527,056). Section 3 explains the models using data from the case study and thereby establishes feasibility of the approach for modeling a real system. The models were trained and tuned using national epidemiologic datasets and various domain expert inputs. To avoid co-mingling of training and testing data, the simulations were then run and compared (Section 4.1) to an analysis of 250,000 Philadelphia patient hospital admissions for the year 2010 in terms of re-hospitalization rate, number of doctor visits, and days in hospital. Based on the Student t-test, deviations between simulated and real-world outcomes are not statistically significant. Validity is thus established for the 2008-2010 timeframe. We modeled various types of interventions that proved ineffective, as well as 4 categories of interventions (e.g., reduced per-nurse caseload, increased check-ins and stays) that improve well-being and cost. The three-level approach appears to be useful to help health administrators sort through system complexities to find effective interventions at lower costs. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. OC5 Project Phase II: Validation of Global Loads of the DeepCwind Floating Semisubmersible Wind Turbine

    DOE PAGES

    Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.; ...

    2017-10-01

    This paper summarizes the findings from Phase II of the Offshore Code Comparison, Collaboration, Continued, with Correlation project. The project is run under the International Energy Agency Wind Research Task 30, and is focused on validating the tools used for modeling offshore wind systems through the comparison of simulated responses of select system designs to physical test data. Validation activities such as these lead to improvement of offshore wind modeling tools, which will enable the development of more innovative and cost-effective offshore wind designs. For Phase II of the project, numerical models of the DeepCwind floating semisubmersible wind system were validated using measurement data from a 1/50th-scale validation campaign performed at the Maritime Research Institute Netherlands offshore wave basin. Validation of the models was performed by comparing the calculated ultimate and fatigue loads for eight different wave-only and combined wind/wave test cases against the measured data, after calibration was performed using free-decay, wind-only, and wave-only tests. The results show a decent estimation of both the ultimate and fatigue loads for the simulated results, but with a fairly consistent underestimation in the tower and upwind mooring line loads that can be attributed to an underestimation of wave-excitation forces outside the linear wave-excitation region, and the presence of broadband frequency excitation in the experimental measurements from wind. Participant results showed varied agreement with the experimental measurements based on the modeling approach used. Modeling attributes that enabled better agreement included: the use of a dynamic mooring model; wave stretching, or some other hydrodynamic modeling approach that excites frequencies outside the linear wave region; nonlinear wave kinematics models; and unsteady aerodynamics models. Also, it was observed that a Morison-only hydrodynamic modeling approach could create excessive pitch excitation and resulting tower loads in some frequency bands.

  20. Cost-effectiveness models for chronic obstructive pulmonary disease: cross-model comparison of hypothetical treatment scenarios.

    PubMed

    Hoogendoorn, Martine; Feenstra, Talitha L; Asukai, Yumi; Borg, Sixten; Hansen, Ryan N; Jansson, Sven-Arne; Samyshkin, Yevgeniy; Wacker, Margarethe; Briggs, Andrew H; Lloyd, Adam; Sullivan, Sean D; Rutten-van Mölken, Maureen P M H

    2014-07-01

    To compare different chronic obstructive pulmonary disease (COPD) cost-effectiveness models with respect to structure and input parameters and to cross-validate the models by running the same hypothetical treatment scenarios. COPD modeling groups simulated four hypothetical interventions with their model and compared the results with a reference scenario of no intervention. The four interventions modeled assumed 1) 20% reduction in decline in lung function, 2) 25% reduction in exacerbation frequency, 3) 10% reduction in all-cause mortality, and 4) all these effects combined. The interventions were simulated for a 5-year and lifetime horizon with standardization, if possible, for sex, age, COPD severity, smoking status, exacerbation frequencies, mortality due to other causes, utilities, costs, and discount rates. Furthermore, uncertainty around the outcomes of intervention four was compared. Seven out of nine contacted COPD modeling groups agreed to participate. The 5-year incremental cost-effectiveness ratios (ICERs) for the most comprehensive intervention, intervention four, was €17,000/quality-adjusted life-year (QALY) for two models, €25,000 to €28,000/QALY for three models, and €47,000/QALY for the remaining two models. Differences in the ICERs could mainly be explained by differences in input values for disease progression, exacerbation-related mortality, and all-cause mortality, with high input values resulting in low ICERs and vice versa. Lifetime results were mainly affected by the input values for mortality. The probability of intervention four to be cost-effective at a willingness-to-pay value of €50,000/QALY was 90% to 100% for five models and about 70% and 50% for the other two models, respectively. Mortality was the most important factor determining the differences in cost-effectiveness outcomes between models. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  1. Decision-makers' preferences for approving new medicines in Wales: a discrete-choice experiment with assessment of external validity.

    PubMed

    Linley, Warren G; Hughes, Dyfrig A

    2013-04-01

    Few studies to date have explored the stated preferences of national decision makers for health technology adoption criteria, and none of these have compared stated decision-making behaviours against actual behaviours. Assessment of the external validity of stated preference studies, such as discrete-choice experiments (DCEs), remains an under-researched area. The primary aim was to explore the preferences of All Wales Medicines Strategy Group (AWMSG) appraisal committee and appraisal sub-committee (the New Medicines Group) members ('appraisal committees') for specific new medicines adoption criteria. Secondary aims were to explore the external validity of respondents' stated preferences and the impact of question choice options upon preference structures in DCEs. A DCE was conducted to estimate appraisal committees members' preferences for incremental cost effectiveness, quality-adjusted life-years (QALYs) gained, annual number of patients expected to be treated, the impact of the disease on patients before treatment, and the assessment of uncertainty in the economic evidence submitted for new medicines compared with current UK NHS treatment. Respondents evaluated 28 pairs of hypothetical new medicines, making a primary forced choice between each pair and a more flexible secondary choice, which permitted either, neither or both new medicines to be chosen. The performance of the resultant models was compared against previous AWMSG decisions. Forty-one out of a total of 80 past and present members of AWMSG appraisal committees completed the DCE. The incremental cost effectiveness of new medicines, and the QALY gains they provide, significantly (p < 0.0001) influence recommendations. Committee members were willing to accept higher incremental cost-effectiveness ratios and lower QALY gains for medicines that treat disease impacting primarily upon survival rather than quality of life, and where uncertainty in the cost-effectiveness estimates has been thoroughly explored. The number of patients to be treated by the new medicine did not exert a significant influence upon recommendations. The use of a flexible-choice question format revealed a different preference structure to the forced-choice format, but the performance of the two models was similar. Aggregate decisions of the AWMSG were well predicted by both models, but their sensitivity (64 %, 68 %) and specificity (55 %, 64 %) were limited. A willingness to trade the cost effectiveness and QALY gains against other factors indicates that economic efficiency and QALY maximisation are not the only considerations of committee members when making recommendations on the use of medicines in Wales. On average, appraisal committee members' stated preferences appear consistent with their actual decision-making behaviours, providing support for the external validity of our DCEs. However, as health technology assessment involves complex decision-making processes, and each individual recommendation may be influenced to varying degrees by a multitude of different considerations, the ability of our models to predict individual medicine recommendations is more limited.
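
    For the forced-choice pairs, the standard analysis is a conditional (McFadden) logit, which for two-alternative choice sets reduces to a binary logit on the attribute differences between the two hypothetical medicines. The sketch below fits that reduced form with statsmodels on fabricated choice data; the attribute names echo the abstract, but the data and coefficients are invented.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n_choices = 41 * 28          # respondents x choice tasks, as in the study design

      # Attribute differences (medicine A minus medicine B) for each hypothetical pair.
      diffs = pd.DataFrame({
          "icer_diff":      rng.normal(0, 10, n_choices),   # GBP thousands per QALY
          "qaly_gain_diff": rng.normal(0, 0.5, n_choices),
          "uncertainty_explored_diff": rng.integers(-1, 2, n_choices),
      })

      # Fabricated "true" preferences: lower ICER, higher QALY gain, explored uncertainty.
      utility_diff = (-0.08 * diffs.icer_diff + 1.5 * diffs.qaly_gain_diff
                      + 0.6 * diffs.uncertainty_explored_diff)
      prob_choose_A = 1 / (1 + np.exp(-utility_diff))
      chose_A = rng.random(n_choices) < prob_choose_A

      # A two-alternative conditional logit is a binary logit on attribute differences.
      model = sm.Logit(chose_A.astype(int), sm.add_constant(diffs)).fit(disp=0)
      print(model.params)          # signs recover the assumed preference directions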

  2. Advanced scatter search approach and its application in a sequencing problem of mixed-model assembly lines in a case company

    NASA Astrophysics Data System (ADS)

    Liu, Qiong; Wang, Wen-xi; Zhu, Ke-ren; Zhang, Chao-yong; Rao, Yun-qing

    2014-11-01

    Mixed-model assembly line sequencing is significant in reducing the production time and overall cost of production. To improve production efficiency, a mathematical model aiming simultaneously to minimize overtime, idle time and total set-up costs is developed. To obtain high-quality and stable solutions, an advanced scatter search approach is proposed. In the proposed algorithm, a new diversification generation method based on a genetic algorithm is presented to generate a set of potentially diverse and high-quality initial solutions. Many methods, including reference set update, subset generation, solution combination and improvement methods, are designed to maintain the diversification of populations and to obtain high-quality ideal solutions. The proposed model and algorithm are applied and validated in a case company. The results indicate that the proposed advanced scatter search approach is significant for mixed-model assembly line sequencing in this company.
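
    The scatter-search components named in the abstract (diversification generation, reference set update, subset generation, solution combination and improvement) can be sketched generically. The toy skeleton below sequences eight jobs against a placeholder cost function, using random restarts in place of the paper's GA-based diversification generator and a simple swap-based local search as the improvement method; it illustrates the algorithm's structure, not the authors' implementation.

```python
import random

random.seed(0)

JOBS = list(range(8))                        # toy job indices; real data would come from the line

def cost(seq):
    """Toy surrogate for overtime + idle time + set-up cost (illustrative only)."""
    return sum(abs(a - b) for a, b in zip(seq, seq[1:]))

def improve(seq):
    """Improvement method: first-improvement pairwise swaps."""
    best = list(seq)
    improved = True
    while improved:
        improved = False
        for i in range(len(best)):
            for j in range(i + 1, len(best)):
                cand = list(best)
                cand[i], cand[j] = cand[j], cand[i]
                if cost(cand) < cost(best):
                    best, improved = cand, True
    return best

def combine(a, b):
    """Solution combination: keep the head of one parent, fill with the other's order."""
    head = a[:len(a) // 2]
    return head + [j for j in b if j not in head]

# Diversification generation: random permutations stand in for the GA-based generator
pool = [improve(random.sample(JOBS, len(JOBS))) for _ in range(20)]

ref_size = 5
for _ in range(10):                                       # scatter-search iterations
    ref_set = sorted(pool, key=cost)[:ref_size]           # reference set update
    subsets = [(a, b) for i, a in enumerate(ref_set)      # subset generation (all pairs)
               for b in ref_set[i + 1:]]
    pool = ref_set + [improve(combine(a, b)) for a, b in subsets]

best = min(pool, key=cost)
print(best, cost(best))
```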

  3. A Comparison of Four Software Programs for Implementing Decision Analytic Cost-Effectiveness Models.

    PubMed

    Hollman, Chase; Paulden, Mike; Pechlivanoglou, Petros; McCabe, Christopher

    2017-08-01

    The volume and technical complexity of both academic and commercial research using decision analytic modelling has increased rapidly over the last two decades. The range of software programs used for their implementation has also increased, but it remains true that a small number of programs account for the vast majority of cost-effectiveness modelling work. We report a comparison of four software programs: TreeAge Pro, Microsoft Excel, R and MATLAB. Our focus is on software commonly used for building Markov models and decision trees to conduct cohort simulations, given their predominance in the published literature around cost-effectiveness modelling. Our comparison uses three qualitative criteria as proposed by Eddy et al.: "transparency and validation", "learning curve" and "capability". In addition, we introduce the quantitative criterion of processing speed. We also consider the cost of each program to academic users and commercial users. We rank the programs based on each of these criteria. We find that, whilst Microsoft Excel and TreeAge Pro are good programs for educational purposes and for producing the types of analyses typically required by health technology assessment agencies, the efficiency and transparency advantages of programming languages such as MATLAB and R become increasingly valuable when more complex analyses are required.
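
    As a concrete illustration of the kind of cohort model these four programs are compared on, the sketch below runs a three-state Markov cohort trace (well, sick, dead) in plain Python. The transition probabilities, costs, utilities and discount rate are invented for illustration and are not drawn from the paper.

```python
import numpy as np

# Hypothetical annual transition matrix; states are [well, sick, dead]
P = np.array([[0.85, 0.10, 0.05],
              [0.00, 0.80, 0.20],
              [0.00, 0.00, 1.00]])

cost_per_state = np.array([500.0, 4_000.0, 0.0])     # annual cost by state
utility_per_state = np.array([0.90, 0.60, 0.0])      # annual utility by state
discount = 0.035

state = np.array([1.0, 0.0, 0.0])                    # cohort starts in 'well'
total_cost = total_qaly = 0.0
for year in range(1, 31):                            # 30-year horizon
    state = state @ P                                # advance the cohort one cycle
    df = 1.0 / (1.0 + discount) ** year
    total_cost += df * state @ cost_per_state
    total_qaly += df * state @ utility_per_state

print(f"Discounted cost per patient:  {total_cost:,.0f}")
print(f"Discounted QALYs per patient: {total_qaly:.2f}")
```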

  4. Health care use and costs of adverse drug events emerging from outpatient treatment in Germany: a modelling approach.

    PubMed

    Stark, Renee G; John, Jürgen; Leidl, Reiner

    2011-01-13

    This study's aim was to develop a first quantification of the frequency and costs of adverse drug events (ADEs) originating in ambulatory medical practice in Germany. The frequencies and costs of ADEs were quantified for a base case, building on an existing cost-of-illness model for ADEs. The model originates from the U.S. health care system; its structure of treatment probabilities linked to ADEs was transferred to Germany. Sensitivity analyses based on values determined from a literature review were used to test the postulated results. For Germany, the base case postulated that about 2 million adults ingesting medications would have an ADE in 2007. Health care costs related to ADEs in this base case totalled 816 million Euros; mean costs per case were 381 Euros. About 58% of costs resulted from hospitalisations, 11% from emergency department visits and 21% from long-term care. Base case estimates of frequency and costs of ADEs were lower than all estimates of the sensitivity analyses. The postulated frequency and costs of ADEs illustrate the possible size of the health problems and economic burden related to ADEs in Germany. The validity of the U.S. treatment structure used remains to be determined for Germany. The sensitivity analysis used assumptions from different studies and thus further quantified the information gap in Germany regarding ADEs. This study found costs of ADEs in the ambulatory setting in Germany to be significant. Due to data scarcity, results are only a rough indication.

  5. Scoring System for the Management of Acute Gallstone Pancreatitis: Cost Analysis of a Prospective Study.

    PubMed

    Prigoff, Jake G; Swain, Gary W; Divino, Celia M

    2016-05-01

    Predicting the presence of a persistent common bile duct (CBD) stone is a difficult and expensive task. The aim of this study is to determine if a previously described protocol-based scoring system is a cost-effective strategy. The protocol includes all patients with gallstone pancreatitis and stratifies them, based on laboratory values and imaging, into high, medium, and low likelihood of persistent stones. The patient's stratification then dictates the next course of management. A decision analytic model was developed to compare the costs for patients who followed the protocol versus those who did not. Clinical data model inputs were obtained from a prospective study conducted at The Mount Sinai Medical Center from October 2009 to May 2013 to validate the protocol. The study included all patients presenting with gallstone pancreatitis regardless of disease severity. Seventy-three patients followed the proposed protocol and 32 did not. The protocol group cost an average of $14,962/patient and the non-protocol group cost $17,138/patient for procedural costs. Mean length of stay for protocol and non-protocol patients was 5.6 and 7.7 days, respectively. The proposed protocol is a cost-effective way to determine the course for patients with gallstone pancreatitis, reducing total procedural costs by over 12%.

  6. External validation of risk prediction models for incident colorectal cancer using UK Biobank

    PubMed Central

    Usher-Smith, J A; Harshfield, A; Saunders, C L; Sharp, S J; Emery, J; Walter, F M; Muir, K; Griffin, S J

    2018-01-01

    Background: This study aimed to compare and externally validate risk scores developed to predict incident colorectal cancer (CRC) that include variables routinely available or easily obtainable via self-completed questionnaire. Methods: External validation of fourteen risk models from a previous systematic review in 373 112 men and women within the UK Biobank cohort with 5-year follow-up, no prior history of CRC and data for incidence of CRC through linkage to national cancer registries. Results: There were 1719 (0.46%) cases of incident CRC. The performance of the risk models varied substantially. In men, the QCancer10 model and models by Tao, Driver and Ma all had an area under the receiver operating characteristic curve (AUC) between 0.67 and 0.70. Discrimination was lower in women: the QCancer10, Wells, Tao, Guesmi and Ma models were the best performing with AUCs between 0.63 and 0.66. Assessment of calibration was possible for six models in men and women. All would require country-specific recalibration if estimates of absolute risks were to be given to individuals. Conclusions: Several risk models based on easily obtainable data have relatively good discrimination in a UK population. Modelling studies are now required to estimate the potential health benefits and cost-effectiveness of implementing stratified risk-based CRC screening. PMID:29381683
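
    External validation of a published risk score amounts to computing discrimination (AUC) and comparing predicted with observed risk in the new cohort. The sketch below does both on simulated data with an invented linear predictor; it is not a reimplementation of any of the fourteen models assessed in the study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 50_000

# Hypothetical risk factors and an invented published linear predictor
age = rng.normal(60, 8, n)
bmi = rng.normal(27, 4, n)
lin_pred = -9.5 + 0.08 * age + 0.03 * bmi
pred_risk = 1 / (1 + np.exp(-lin_pred))

# Simulated 5-year outcomes; in a real validation these come from registry linkage
outcome = rng.binomial(1, pred_risk * 0.8)           # deliberate miscalibration for illustration

print("AUC:", round(roc_auc_score(outcome, pred_risk), 3))

# Simple calibration check: predicted vs. observed risk by decile of predicted risk
deciles = np.digitize(pred_risk, np.quantile(pred_risk, np.linspace(0.1, 0.9, 9)))
for d in range(10):
    mask = deciles == d
    print(d, round(pred_risk[mask].mean(), 4), round(outcome[mask].mean(), 4))
```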

  7. Cost-utility of cognitive behavioral therapy for low back pain from the commercial payer perspective.

    PubMed

    Norton, Giulia; McDonough, Christine M; Cabral, Howard; Shwartz, Michael; Burgess, James F

    2015-05-15

    Markov cost-utility model. To evaluate the cost-utility of cognitive behavioral therapy (CBT) for the treatment of persistent nonspecific low back pain (LBP) from the perspective of US commercial payers. CBT is widely deemed clinically effective for LBP treatment. The evidence is suggestive of cost-effectiveness. We constructed and validated a Markov intention-to-treat model to estimate the cost-utility of CBT, with 1-year and 10-year time horizons. We applied likelihood of improvement and utilities from a randomized controlled trial assessing CBT to treat LBP. The trial randomized subjects to treatment but subjects freely sought health care services. We derived the cost of equivalent rates and types of services from US commercial claims for LBP for a similar population. For the 10-year estimates, we derived recurrence rates from the literature. The base case included medical and pharmaceutical services and assumed gradual loss of skill in applying CBT techniques. Sensitivity analyses assessed the distribution of service utilization, utility values, and rate of LBP recurrence. We compared health plan designs. Results are based on 5000 iterations of each model and expressed as an incremental cost per quality-adjusted life-year. The incremental cost-utility of CBT was $7197 per quality-adjusted life-year in the first year and $5855 per quality-adjusted life-year over 10 years. The results are robust across numerous sensitivity analyses. No change of parameter estimate resulted in a difference of more than 7% from the base case for either time horizon. Including chiropractic and/or acupuncture care did not substantively affect cost-effectiveness. The model with medical but no pharmaceutical costs was more cost-effective ($5238 for 1 yr and $3849 for 10 yr). CBT is a cost-effective approach to manage chronic LBP among commercial health plan members. Cost-effectiveness is demonstrated for multiple plan designs. Level of evidence: 2.

  8. Postoperative costs associated with outcomes after cardiac surgery with extracorporeal circulation: role of antithrombin levels.

    PubMed

    Muedra, Vicente; Llau, Juan V; Llagunes, José; Paniagua, Pilar; Veiras, Sonia; Fernández-López, Antonio R; Diago, Carmen; Hidalgo, Francisco; Gil, Jesús; Valiño, Cristina; Moret, Enric; Gómez, Laura; Pajares, Azucena; de Prada, Blanca

    2013-04-01

    To study the impact on postoperative costs of a patient's antithrombin levels associated with outcomes after cardiac surgery with extracorporeal circulation. An analytic decision model was designed to estimate costs and clinical outcomes after cardiac surgery in a typical patient with low antithrombin levels (<63.7%) compared with a patient with normal antithrombin levels (≥63.7%). The data used in the model were obtained from a literature review and subsequently validated by a panel of experts in cardiothoracic anesthesiology. Multi-institutional (14 Spanish hospitals). Consultant anesthesiologists. A sensitivity analysis of extreme scenarios was carried out to assess the impact of the major variables in the model results. The average cost per patient was €18,772 for a typical patient with low antithrombin levels and €13,881 for a typical patient with normal antithrombin levels. The difference in cost was due mainly to the longer hospital stay of a patient with low antithrombin levels compared with a patient with normal levels (13 v 10 days, respectively, representing a €4,596 higher cost) rather than to costs related to the management of postoperative complications (€215, mostly owing to transfusions). Sensitivity analysis showed a high variability range of approximately ±55% of the base case cost between the minimum and maximum scenarios, with the hospital stay contributing more significantly to the variation. Based on this analytic decision model, there could be a marked increase in the postoperative costs of patients with low antithrombin activity levels at the end of cardiac surgery, mainly ascribed to a longer hospitalization. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Development and Evaluation of an Automated Machine Learning Algorithm for In-Hospital Mortality Risk Adjustment Among Critical Care Patients.

    PubMed

    Delahanty, Ryan J; Kaufman, David; Jones, Spencer S

    2018-06-01

    Risk adjustment algorithms for ICU mortality are necessary for measuring and improving ICU performance. Existing risk adjustment algorithms are not widely adopted. Key barriers to adoption include licensing and implementation costs as well as labor costs associated with human-intensive data collection. Widespread adoption of electronic health records makes automated risk adjustment feasible. Using modern machine learning methods and open source tools, we developed and evaluated a retrospective risk adjustment algorithm for in-hospital mortality among ICU patients. The Risk of Inpatient Death score can be fully automated and is reliant upon data elements that are generated in the course of usual hospital processes. One hundred thirty-one ICUs in 53 hospitals operated by Tenet Healthcare. A cohort of 237,173 ICU patients discharged between January 2014 and December 2016. The data were randomly split into training (36 hospitals), and validation (17 hospitals) data sets. Feature selection and model training were carried out using the training set while the discrimination, calibration, and accuracy of the model were assessed in the validation data set. Model discrimination was evaluated based on the area under receiver operating characteristic curve; accuracy and calibration were assessed via adjusted Brier scores and visual analysis of calibration curves. Seventeen features, including a mix of clinical and administrative data elements, were retained in the final model. The Risk of Inpatient Death score demonstrated excellent discrimination (area under receiver operating characteristic curve = 0.94) and calibration (adjusted Brier score = 52.8%) in the validation dataset; these results compare favorably to the published performance statistics for the most commonly used mortality risk adjustment algorithms. Low adoption of ICU mortality risk adjustment algorithms impedes progress toward increasing the value of the healthcare delivered in ICUs. The Risk of Inpatient Death score has many attractive attributes that address the key barriers to adoption of ICU risk adjustment algorithms and performs comparably to existing human-intensive algorithms. Automated risk adjustment algorithms have the potential to obviate known barriers to adoption such as cost-prohibitive licensing fees and significant direct labor costs. Further evaluation is needed to ensure that the level of performance observed in this study could be achieved at independent sites.
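
    Two details of this abstract are easy to make concrete: holding out whole hospitals when splitting training and validation data, and reporting a scaled ('adjusted') Brier score alongside the AUC. The sketch below shows both on simulated admissions; the features, the gradient-boosting learner and all figures are placeholders rather than the Risk of Inpatient Death specification.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score, brier_score_loss
from sklearn.model_selection import GroupShuffleSplit

rng = np.random.default_rng(3)
n = 20_000
X = rng.normal(size=(n, 6))                           # stand-in clinical/administrative features
y = rng.binomial(1, 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1] - 2.0))))
hospital = rng.integers(0, 53, size=n)                # group label per admission

# Hold out whole hospitals, mirroring the split by hospital described above
splitter = GroupShuffleSplit(n_splits=1, test_size=0.33, random_state=0)
train_idx, valid_idx = next(splitter.split(X, y, groups=hospital))

clf = GradientBoostingClassifier(random_state=0).fit(X[train_idx], y[train_idx])
p = clf.predict_proba(X[valid_idx])[:, 1]
y_valid = y[valid_idx]

auc = roc_auc_score(y_valid, p)
brier = brier_score_loss(y_valid, p)
brier_null = brier_score_loss(y_valid, np.full_like(p, y_valid.mean()))
scaled_brier = 1.0 - brier / brier_null               # "adjusted" Brier: 1 = perfect, 0 = no skill
print(f"AUC {auc:.3f}  scaled Brier {scaled_brier:.3f}")
```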

  10. Physician outcome measurement: review and proposed model.

    PubMed

    Siha, S

    1998-01-01

    As health care moves from a fee-for-service environment to a capitated arena, outcome measurements must change. ABC Children's Medical Center is challenged with developing comprehensive outcome measures for an employed physician group. An extensive literature review validates that physician outcomes must move beyond revenue production and measure all aspects of care delivery. The proposed measurement model for this physician group is a trilogy model. It includes measures of cost, quality, and service. While these measures can be examined separately, it is imperative to understand their integration in determining an organization's competitive advantage. The recommended measurements for the physician group must be consistent with the overall organizational goals. The long-term impact will be better utilization of resources. This will result in the most cost-effective, quality care for the health care consumer.

  11. Model Description and Proposed Application for the Enlisted Personnel Inventory, Cost, and Compensation Model

    DTIC Science & Technology

    1994-07-01

    provide additional information for the user / policy analyst: Eichers, D., Sola, M., McLernan, G., EPICC User’s Manual , Systems Research and Applications...maintenance, and a set of on-line help screens. Each are further discussed below and a full discussion is included in the EPICC User’s Manual . Menu Based...written documentation (user’s manual ) that will be provided with the model. 55 The next chapter discusses the validation of the inventory projection and

  12. Design and analysis of electricity markets

    NASA Astrophysics Data System (ADS)

    Sioshansi, Ramteen Mehr

    Restructured competitive electricity markets rely on designing market-based mechanisms which can efficiently coordinate the power system and minimize the exercise of market power. This dissertation is a series of essays which develop and analyze models of restructured electricity markets. Chapter 2 studies the incentive properties of a co-optimized market for energy and reserves that pays reserved generators their implied opportunity cost---which is the difference between their stated energy cost and the market-clearing price for energy. By analyzing the market as a competitive direct revelation mechanism we examine the properties of efficient equilibria and demonstrate that generators have incentives to shade their stated costs below actual costs. We further demonstrate that the expected energy payments of our mechanism are less than those in a disjoint market for energy only. Chapter 3 is an empirical validation of a supply function equilibrium (SFE) model. By comparing theoretically optimal supply functions and actual generation offers into the Texas spot balancing market, we show that the SFE fits the actual behavior of the largest generators in the market. This not only serves to validate the model, but also demonstrates the extent to which firms exercise market power. Chapters 4 and 5 examine equity, incentive, and efficiency issues in the design of non-convex commitment auctions. We demonstrate that different near-optimal solutions to a central unit commitment problem which have similar-sized optimality gaps will generally yield vastly different energy prices and payoffs to individual generators. Although solving the mixed integer program to optimality will overcome such issues, we show that this relies on achieving optimality of the commitment---which may not be tractable for large-scale problems within the allotted timeframe. We then simulate and compare a competitive benchmark for a market with centralized and self commitment in order to bound the efficiency losses stemming from coordination losses (cost of anarchy) in a decentralized market.

  13. Discrete Event Simulation for Decision Modeling in Health Care: Lessons from Abdominal Aortic Aneurysm Screening

    PubMed Central

    Jones, Edmund; Masconi, Katya L.; Sweeting, Michael J.; Thompson, Simon G.; Powell, Janet T.

    2018-01-01

    Markov models are often used to evaluate the cost-effectiveness of new healthcare interventions but they are sometimes not flexible enough to allow accurate modeling or investigation of alternative scenarios and policies. A Markov model previously demonstrated that a one-off invitation to screening for abdominal aortic aneurysm (AAA) for men aged 65 y in the UK and subsequent follow-up of identified AAAs was likely to be highly cost-effective at thresholds commonly adopted in the UK (£20,000 to £30,000 per quality adjusted life-year). However, new evidence has emerged and the decision problem has evolved to include exploration of the circumstances under which AAA screening may be cost-effective, which the Markov model is not easily able to address. A new model to handle this more complex decision problem was needed, and the case of AAA screening thus provides an illustration of the relative merits of Markov models and discrete event simulation (DES) models. An individual-level DES model was built using the R programming language to reflect possible events and pathways of individuals invited to screening v. those not invited. The model was validated against key events and cost-effectiveness, as observed in a large, randomized trial. Different screening protocol scenarios were investigated to demonstrate the flexibility of the DES. The case of AAA screening highlights the benefits of DES, particularly in the context of screening studies.
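
    The authors built their model in R; purely as an illustration of what an individual-level DES looks like, the Python sketch below simulates men invited or not invited to screening using an event queue with invented prevalence, rupture and mortality rates. None of the parameters correspond to the trial or the published model.

```python
import heapq
import random

random.seed(4)

def simulate_man(invited, horizon=30.0):
    """Per-individual event simulation with invented rates (not the trial's parameters)."""
    events = []                                            # priority queue of (time, label)
    heapq.heappush(events, (random.expovariate(1 / 18.0), "death_other_causes"))
    if random.random() < 0.05:                             # assumed AAA prevalence at age 65
        rupture_rate = 1 / 40.0                            # assumed rupture hazard if undetected
        if invited and random.random() < 0.75:             # assumed attendance and detection
            rupture_rate = 1 / 120.0                       # surveillance/elective repair lowers risk
        heapq.heappush(events, (random.expovariate(rupture_rate), "rupture"))

    while events:
        t, label = heapq.heappop(events)
        if t > horizon:
            break
        if label == "death_other_causes":
            return t
        if label == "rupture" and random.random() < 0.8:   # assumed rupture case fatality
            return t
    return horizon                                         # survived the modeled horizon

for invited in (True, False):
    lifetimes = [simulate_man(invited) for _ in range(50_000)]
    label = "invited to screening" if invited else "not invited"
    print(f"{label}: {sum(lifetimes) / len(lifetimes):.2f} mean life-years over a 30-year horizon")
```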

  14. Estimating and validating ground-based timber harvesting production through computer simulation

    Treesearch

    Jingxin Wang; Chris B. LeDoux

    2003-01-01

    Estimating ground-based timber harvesting systems production with an object oriented methodology was investigated. The estimation model developed generates stands of trees, simulates chain saw, drive-to-tree feller-buncher, swing-to-tree single-grip harvester felling, and grapple skidder and forwarder extraction activities, and analyzes costs and productivity. It also...

  15. VALIDATION OF A METHOD FOR ESTIMATING POLLUTION EMISSION RATES FROM AREA SOURCES USING OPEN-PATH FTIR SPECTROSCOPY AND DISPERSION MODELING TECHNIQUES

    EPA Science Inventory

    The paper describes a methodology developed to estimate emissions factors for a variety of different area sources in a rapid, accurate, and cost effective manner. The methodology involves using an open-path Fourier transform infrared (FTIR) spectrometer to measure concentrations o...

  16. ASSESSMENT OF CHEMICAL EFFECTS ON NEURONAL DIFFERENTIATION USING THE ARRAYSCAN HIGH CONTENT SCREENING SYSTEM

    EPA Science Inventory

    The development of alternative methods for toxicity testing is driven by the need for scientifically valid data that can be obtained in a rapid and cost-efficient manner. In vitro systems provide a model in which chemical effects on cellular events can be examined using technique...

  17. Evaluating the intersection of a regional wildlife connectivity network with highways

    Treesearch

    Samuel A. Cushman; Jesse S. Lewis; Erin L. Landguth

    2013-01-01

    Reliable predictions of regional-scale population connectivity are needed to prioritize conservation actions. However, there have been few examples of regional connectivity models that are empirically derived and validated. The central goals of this paper were to (1) evaluate the effectiveness of factorial least cost path corridor mapping on an empirical...

  18. TOWARDS RELIABLE AND COST-EFFECTIVE OZONE EXPOSURE ASSESSMENT: PARAMETER EVALUATION AND MODEL VALIDATION USING THE HARVARD SOUTHERN CALIFORNIA CHRONIC OZONE EXPOSURE STUDY DATA

    EPA Science Inventory

    Accurate assessment of chronic human exposure to atmospheric criteria pollutants, such as ozone, is critical for understanding human health risks associated with living in environments with elevated ambient pollutant concentrations. In this study, we analyzed a data set from a...

  19. Use of 3-dimensional printing technology and silicone modeling in surgical simulation: development and face validation in pediatric laparoscopic pyeloplasty.

    PubMed

    Cheung, Carling L; Looi, Thomas; Lendvay, Thomas S; Drake, James M; Farhat, Walid A

    2014-01-01

    Pediatric laparoscopy poses unique training challenges owing to smaller workspaces, finer sutures used, and potentially more delicate tissues that require increased surgical dexterity when compared with adult analogs. We describe the development and face validation of a pediatric pyeloplasty simulator using a low-cost laparoscopic dry-laboratory model developed with 3-dimensional (3D) printing and silicone modeling. The organs (the kidney, renal pelvis, and ureter) were created in a 3-step process where molds were created with 3D modeling software, printed with a Spectrum Z510 3D printer, and cast with Dragon Skin 30 silicone rubber. The model was secured in a laparoscopy box trainer. A pilot study was conducted at a Canadian Urological Association meeting. A total of 24 pediatric urology fellows and 3 experienced faculty members then assessed our skills module during a minimally invasive surgery training course. Participants had 60 minutes to perform a right-side pyeloplasty using laparoscopic tools and 5-0 VICRYL suture. Face validity was demonstrated on a 5-point Likert scale. The dry-laboratory model consists of a kidney, a replaceable dilated renal pelvis and ureter with an obstructed ureteropelvic junction, and an overlying peritoneum with an inscribed fundamentals of laparoscopic surgery pattern-cutting exercise. During initial validation at the Canadian Urological Association, participants rated (out of 5) 4.75 ± 0.29 for overall impression, 4.50 ± 0.41 for realism, and 4.38 ± 0.48 for handling. During the minimally invasive surgery course, 22 of 24 fellows and all the faculty members completed the scoring. Usability was rated 4 or 5 by 14 participants (overall, 3.6 ± 1.22 by novices and 3.7 ± 0.58 by experts), indicating that they would use the model in their own training and teaching. Esthetically, the model was rated 3.5 ± 0.74 (novices) and 3.3 ± 0.58 (experts). We developed a pediatric pyeloplasty simulator by applying a low-cost reusable model for laparoscopic training and skills acquisition. The model's usability, realism, and feel are good, it can be imaged under common modalities, and it shows promise as an educational tool. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  20. Approaches to eliminate waste and reduce cost for recycling glass.

    PubMed

    Chao, Chien-Wen; Liao, Ching-Jong

    2011-12-01

    In recent years, the issue of environmental protection has received considerable attention. This paper adds to the literature by investigating a scheduling problem in the manufacturing of a glass recycling factory in Taiwan. The objective is to minimize the sum of the total holding cost and loss cost. We first represent the problem as an integer programming (IP) model, and then develop two heuristics based on the IP model to find near-optimal solutions for the problem. To validate the proposed heuristics, comparisons between optimal solutions from the IP model and solutions from the current method are conducted. The comparisons involve two problem sizes, small and large, where the small problems range from 15 to 45 jobs, and the large problems from 50 to 100 jobs. Finally, a genetic algorithm is applied to evaluate the proposed heuristics. Computational experiments show that the proposed heuristics can find good solutions in a reasonable time for the considered problem. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Cost Analysis In A Multi-Mission Operations Environment

    NASA Technical Reports Server (NTRS)

    Newhouse, M.; Felton, L.; Bornas, N.; Botts, D.; Roth, K.; Ijames, G.; Montgomery, P.

    2014-01-01

    Spacecraft control centers have evolved from dedicated, single-mission or single mission-type support to multi-mission, service-oriented support for operating a variety of mission types. At the same time, available money for projects is shrinking and competition for new missions is increasing. These factors drive the need for an accurate and flexible model to support estimating service costs for new or extended missions; the cost model in turn drives the need for an accurate and efficient approach to service cost analysis. The National Aeronautics and Space Administration (NASA) Huntsville Operations Support Center (HOSC) at Marshall Space Flight Center (MSFC) provides operations services to a variety of customers around the world. HOSC customers range from launch vehicle test flights; to International Space Station (ISS) payloads; to small, short duration missions; and has included long duration flagship missions. The HOSC recently completed a detailed analysis of service costs as part of the development of a complete service cost model. The cost analysis process required the team to address a number of issues. One of the primary issues involves the difficulty of reverse engineering individual mission costs in a highly efficient multi-mission environment, along with a related issue of the value of detailed metrics or data to the cost model versus the cost of obtaining accurate data. Another concern is the difficulty of balancing costs between missions of different types and size and extrapolating costs to different mission types. The cost analysis also had to address issues relating to providing shared, cloud-like services in a government environment, and then assigning an uncertainty or risk factor to cost estimates that are based on current technology, but will be executed using future technology. Finally the cost analysis needed to consider how to validate the resulting cost models taking into account the non-homogeneous nature of the available cost data and the decreasing flight rate. This paper presents the issues encountered during the HOSC cost analysis process, and the associated lessons learned. These lessons can be used when planning for a new multi-mission operations center or in the transformation from a dedicated control center to multi-center operations, as an aid in defining processes that support future cost analysis and estimation. The lessons can also be used by mature service-oriented, multi-mission control centers to streamline or refine their cost analysis process.

  2. Cost Analysis in a Multi-Mission Operations Environment

    NASA Technical Reports Server (NTRS)

    Felton, Larry; Newhouse, Marilyn; Bornas, Nick; Botts, Dennis; Ijames, Gayleen; Montgomery, Patty; Roth, Karl

    2014-01-01

    Spacecraft control centers have evolved from dedicated, single-mission or single mission-type support to multi-mission, service-oriented support for operating a variety of mission types. At the same time, available money for projects is shrinking and competition for new missions is increasing. These factors drive the need for an accurate and flexible model to support estimating service costs for new or extended missions; the cost model in turn drives the need for an accurate and efficient approach to service cost analysis. The National Aeronautics and Space Administration (NASA) Huntsville Operations Support Center (HOSC) at Marshall Space Flight Center (MSFC) provides operations services to a variety of customers around the world. HOSC customers range from launch vehicle test flights; to International Space Station (ISS) payloads; to small, short duration missions; and has included long duration flagship missions. The HOSC recently completed a detailed analysis of service costs as part of the development of a complete service cost model. The cost analysis process required the team to address a number of issues. One of the primary issues involves the difficulty of reverse engineering individual mission costs in a highly efficient multi-mission environment, along with a related issue of the value of detailed metrics or data to the cost model versus the cost of obtaining accurate data. Another concern is the difficulty of balancing costs between missions of different types and size and extrapolating costs to different mission types. The cost analysis also had to address issues relating to providing shared, cloud-like services in a government environment, and then assigning an uncertainty or risk factor to cost estimates that are based on current technology, but will be executed using future technology. Finally the cost analysis needed to consider how to validate the resulting cost models taking into account the non-homogeneous nature of the available cost data and the decreasing flight rate. This paper presents the issues encountered during the HOSC cost analysis process, and the associated lessons learned. These lessons can be used when planning for a new multi-mission operations center or in the transformation from a dedicated control center to multi-center operations, as an aid in defining processes that support future cost analysis and estimation. The lessons can also be used by mature service-oriented, multi-mission control centers to streamline or refine their cost analysis process.

  3. Using Length of Stay to Control for Unobserved Heterogeneity When Estimating Treatment Effect on Hospital Costs with Observational Data: Issues of Reliability, Robustness, and Usefulness.

    PubMed

    May, Peter; Garrido, Melissa M; Cassel, J Brian; Morrison, R Sean; Normand, Charles

    2016-10-01

    To evaluate the sensitivity of treatment effect estimates when length of stay (LOS) is used to control for unobserved heterogeneity when estimating treatment effect on cost of hospital admission with observational data. We used data from a prospective cohort study on the impact of palliative care consultation teams (PCCTs) on direct cost of hospital care. Adult patients with an advanced cancer diagnosis admitted to five large medical and cancer centers in the United States between 2007 and 2011 were eligible for this study. Costs were modeled using generalized linear models with a gamma distribution and a log link. We compared variability in estimates of PCCT impact on hospitalization costs when LOS was used as a covariate, as a sample parameter, and as an outcome denominator. We used propensity scores to account for patient characteristics associated with both PCCT use and total direct hospitalization costs. We analyzed data from hospital cost databases, medical records, and questionnaires. Our propensity score weighted sample included 969 patients who were discharged alive. In analyses of hospitalization costs, treatment effect estimates are highly sensitive to methods that control for LOS, complicating interpretation. Both the magnitude and significance of results varied widely with the method of controlling for LOS. When we incorporated intervention timing into our analyses, results were robust to LOS-controls. Treatment effect estimates using LOS-controls are not only suboptimal in terms of reliability (given concerns over endogeneity and bias) and usefulness (given the need to validate the cost-effectiveness of an intervention using overall resource use for a sample defined at baseline) but also in terms of robustness (results depend on the approach taken, and there is little evidence to guide this choice). To derive results that minimize endogeneity concerns and maximize external validity, investigators should match and analyze treatment and comparison arms on baseline factors only. Incorporating intervention timing may deliver results that are more reliable, more robust, and more useful than those derived using LOS-controls. © Health Research and Educational Trust.
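
    The cost regressions described above (generalized linear model, gamma family, log link, propensity-score weights) can be written in a few lines with statsmodels. The sketch below uses simulated costs and an invented covariate set, not the study data; swapping the length-of-stay term in and out of the formula mirrors the LOS-as-covariate question examined in the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 969
df = pd.DataFrame({
    "pcct": rng.binomial(1, 0.4, n),                   # palliative care consultation indicator
    "age": rng.normal(65, 10, n),
    "los": rng.gamma(2.0, 4.0, n),                     # length of stay in days (illustrative)
})
mu = np.exp(8.5 - 0.25 * df["pcct"] + 0.01 * df["age"])
df["cost"] = rng.gamma(2.0, mu / 2.0)                  # gamma-distributed direct hospital costs
weights = rng.uniform(0.5, 2.0, n)                     # stand-in propensity-score weights

# Gamma GLM with log link; adding "+ los" to the formula treats LOS as a covariate
model = smf.glm("cost ~ pcct + age", data=df,
                family=sm.families.Gamma(link=sm.families.links.Log()),
                var_weights=weights).fit()
print(model.summary().tables[1])
```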

  4. A Novel Low-Cost Open-Hardware Platform for Monitoring Soil Water Content and Multiple Soil-Air-Vegetation Parameters

    PubMed Central

    Bitella, Giovanni; Rossi, Roberta; Bochicchio, Rocco; Perniola, Michele; Amato, Mariana

    2014-01-01

    Monitoring soil water content at high spatio-temporal resolution and coupled to other sensor data is crucial for applications oriented towards water sustainability in agriculture, such as precision irrigation or phenotyping root traits for drought tolerance. The cost of instrumentation, however, limits measurement frequency and number of sensors. The objective of this work was to design a low cost “open hardware” platform for multi-sensor measurements including water content at different depths, air and soil temperatures. The system is based on an open-source ARDUINO microcontroller-board, programmed in a simple integrated development environment (IDE). Low cost high-frequency dielectric probes were used in the platform and lab tested on three non-saline soils (EC1:2.5 < 0.1 mS/cm). Empirical calibration curves were subjected to cross-validation (leave-one-out method), and normalized root mean square error (NRMSE) were respectively 0.09 for the overall model, 0.09 for the sandy soil, 0.07 for the clay loam and 0.08 for the sandy loam. The overall model (pooled soil data) fitted the data very well (R2 = 0.89) showing a high stability, being able to generate very similar RMSEs during training and validation (RMSEtraining = 2.63; RMSEvalidation = 2.61). Data recorded on the card were automatically sent to a remote server allowing repeated field-data quality checks. This work provides a framework for the replication and upgrading of a customized low cost platform, consistent with the open source approach whereby sharing information on equipment design and software facilitates the adoption and continuous improvement of existing technologies. PMID:25337742
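
    The calibration workflow described above (an empirical curve per soil checked by leave-one-out cross-validation and summarized as a normalized RMSE) is straightforward to reproduce in outline. The sketch below fits a quadratic calibration of volumetric water content against raw probe readings on made-up data, not the paper's measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(6)

raw = np.linspace(200, 600, 24)                          # raw dielectric probe readings (invented)
theta = 2.0 + 0.09 * raw - 4e-5 * raw**2 + rng.normal(0, 1.0, raw.size)  # water content, % vol
X = raw.reshape(-1, 1)

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())

errors = []
for train_idx, test_idx in LeaveOneOut().split(X):       # leave-one-out cross-validation
    model.fit(X[train_idx], theta[train_idx])
    pred = model.predict(X[test_idx])
    errors.append((pred[0] - theta[test_idx][0]) ** 2)

rmse = np.sqrt(np.mean(errors))
nrmse = rmse / (theta.max() - theta.min())               # one common normalization choice
print(f"LOO RMSE {rmse:.2f} %vol, NRMSE {nrmse:.3f}")
```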

  5. Integrated Model Reduction and Control of Aircraft with Flexible Wings

    NASA Technical Reports Server (NTRS)

    Swei, Sean Shan-Min; Zhu, Guoming G.; Nguyen, Nhan T.

    2013-01-01

    This paper presents an integrated approach to the modeling and control of aircraft with flexible wings. The coupled aircraft rigid body dynamics with a high-order elastic wing model can be represented in a finite dimensional state-space form. Given a set of desired output covariance, a model reduction process is performed by using the weighted Modal Cost Analysis (MCA). A dynamic output feedback controller, which is designed based on the reduced-order model, is developed by utilizing output covariance constraint (OCC) algorithm, and the resulting OCC design weighting matrix is used for the next iteration of the weighted cost analysis. This controller is then validated for full-order evaluation model to ensure that the aircraft's handling qualities are met and the fluttering motion of the wings suppressed. An iterative algorithm is developed in CONDUIT environment to realize the integration of model reduction and controller design. The proposed integrated approach is applied to NASA Generic Transport Model (GTM) for demonstration.

  6. Validation of a RANS transition model using a high-order weighted compact nonlinear scheme

    NASA Astrophysics Data System (ADS)

    Tu, GuoHua; Deng, XiaoGang; Mao, MeiLiang

    2013-04-01

    A modified transition model is given based on the shear stress transport (SST) turbulence model and an intermittency transport equation. The energy gradient term in the original model is replaced by flow strain rate to save computational costs. The model employs local variables only, so it can be conveniently implemented in modern computational fluid dynamics codes. The fifth-order weighted compact nonlinear scheme and the fourth-order staggered scheme are applied to discretize the governing equations for the purpose of minimizing discretization errors, so as to mitigate the confusion between numerical errors and transition model errors. The high-order package is compared with a second-order TVD method in simulating the transitional flow over a flat plate. Numerical results indicate that the high-order package gives better grid convergence properties than the second-order method. Validation of the transition model is performed for transitional flows ranging from low speed to hypersonic speed.

  7. Mathematical modeling of enzyme production using Trichoderma harzianum P49P11 and sugarcane bagasse as carbon source.

    PubMed

    Gelain, Lucas; da Cruz Pradella, José Geraldo; da Costa, Aline Carvalho

    2015-12-01

    A mathematical model to describe the kinetics of enzyme production by the filamentous fungus Trichoderma harzianum P49P11 was developed using a low-cost substrate as the main carbon source (pretreated sugarcane bagasse). The model describes the cell growth, variation of substrate concentration and production of three kinds of enzymes (cellulases, beta-glucosidase and xylanase) in different sugarcane bagasse concentrations (5; 10; 20; 30; 40 gL(-1)). The 10 gL(-1) concentration was used to validate the model and the others for parameter estimation. The model for enzyme production has terms implicitly representing induction and repression. Substrate variation was represented by a simple degradation rate. The models seem to represent the kinetics well, with a good fit for the majority of the assays. Validation results indicate that the models are adequate to represent the kinetics for a biotechnological process. Copyright © 2015 Elsevier Ltd. All rights reserved.
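
    Models of this kind are typically small ODE systems for biomass, residual substrate and enzyme titres. The sketch below integrates a generic Monod-style growth term with substrate consumption and growth-associated enzyme production; the kinetic forms and parameter values are assumptions for illustration, not the fitted model from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed parameters (illustrative, not the fitted values from the study)
mu_max, K_s = 0.08, 5.0          # 1/h, g/L
Y_xs = 0.4                       # g biomass per g substrate
k_e, k_d = 3.0, 0.005            # enzyme production (per unit growth) and decay (1/h)

def rates(t, y):
    X, S, E = y                  # biomass (g/L), substrate (g/L), enzyme activity (arbitrary units)
    mu = mu_max * S / (K_s + S)  # Monod-style specific growth rate
    dX = mu * X
    dS = -dX / Y_xs              # substrate consumed for growth
    dE = k_e * mu * X - k_d * E  # growth-associated production with first-order decay
    return [dX, dS, dE]

sol = solve_ivp(rates, (0.0, 120.0), y0=[0.2, 20.0, 0.0], dense_output=True)
for t in (24, 48, 96):
    X, S, E = sol.sol(t)
    print(f"t={t:3d} h  biomass={X:5.2f} g/L  substrate={S:5.2f} g/L  enzyme={E:6.1f}")
```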

  8. Development and validation of a risk model for identification of non-neutropenic, critically ill adult patients at high risk of invasive Candida infection: the Fungal Infection Risk Evaluation (FIRE) Study.

    PubMed

    Harrison, D; Muskett, H; Harvey, S; Grieve, R; Shahin, J; Patel, K; Sadique, Z; Allen, E; Dybowski, R; Jit, M; Edgeworth, J; Kibbler, C; Barnes, R; Soni, N; Rowan, K

    2013-02-01

    There is increasing evidence that invasive fungal disease (IFD) is more likely to occur in non-neutropenic patients in critical care units. A number of randomised controlled trials (RCTs) have evaluated antifungal prophylaxis in non-neutropenic, critically ill patients, demonstrating a reduction in the risk of proven IFD and suggesting a reduction in mortality. It is necessary to establish a method to identify and target antifungal prophylaxis at those patients at highest risk of IFD, who stand to benefit most from any antifungal prophylaxis strategy. To develop and validate risk models to identify non-neutropenic, critically ill adult patients at high risk of invasive Candida infection, who would benefit from antifungal prophylaxis, and to assess the cost-effectiveness of targeting antifungal prophylaxis to high-risk patients based on these models. Systematic review, prospective data collection, statistical modelling, economic decision modelling and value of information analysis. Ninety-six UK adult general critical care units. Consecutive admissions to participating critical care units. None. Invasive fungal disease, defined as a blood culture or sample from a normally sterile site showing yeast/mould cells in a microbiological or histopathological report. For statistical and economic modelling, the primary outcome was invasive Candida infection, defined as IFD-positive for Candida species. Systematic review: Thirteen articles exploring risk factors, risk models or clinical decision rules for IFD in critically ill adult patients were identified. Risk factors reported to be significantly associated with IFD were included in the final data set for the prospective data collection. Data were collected on 60,778 admissions between July 2009 and March 2011. Overall, 383 patients (0.6%) were admitted with or developed IFD. The majority of IFD patients (94%) were positive for Candida species. The most common site of infection was blood (55%). The incidence of IFD identified in unit was 4.7 cases per 1000 admissions, and for unit-acquired IFD was 3.2 cases per 1000 admissions. Statistical modelling: Risk models were developed at admission to the critical care unit, 24 hours and the end of calendar day 3. The risk model at admission had fair discrimination (c-index 0.705). Discrimination improved at 24 hours (c-index 0.823) and this was maintained at the end of calendar day 3 (c-index 0.835). There was a drop in model performance in the validation sample. Economic decision model: Irrespective of risk threshold, incremental quality-adjusted life-years of prophylaxis strategies compared with current practice were positive but small compared with the incremental costs. Incremental net benefits of each prophylaxis strategy compared with current practice were all negative. Cost-effectiveness acceptability curves showed that current practice was the strategy most likely to be cost-effective. Across all parameters in the decision model, results indicated that the value of further research for the whole population of interest might be high relative to the research costs. The results of the Fungal Infection Risk Evaluation (FIRE) Study, derived from a highly representative sample of adult general critical care units across the UK, indicated a low incidence of IFD among non-neutropenic, critically ill adult patients. IFD was associated with substantially higher mortality, more intensive organ support and longer length of stay. 
Risk modelling produced simple risk models that provided acceptable discrimination for identifying patients at 'high risk' of invasive Candida infection. Results of the economic model suggested that the current most cost-effective treatment strategy for prophylactic use of systemic antifungal agents among non-neutropenic, critically ill adult patients admitted to NHS adult general critical care units is a strategy of no risk assessment and no antifungal prophylaxis. Funding for this study was provided by the Health Technology Assessment programme of the National Institute for Health Research.

  9. Validating Human Performance Models of the Future Orion Crew Exploration Vehicle

    NASA Technical Reports Server (NTRS)

    Wong, Douglas T.; Walters, Brett; Fairey, Lisa

    2010-01-01

    NASA's Orion Crew Exploration Vehicle (CEV) will provide transportation for crew and cargo to and from destinations in support of the Constellation Architecture Design Reference Missions. Discrete Event Simulation (DES) is one of the design methods NASA employs for crew performance of the CEV. During the early development of the CEV, NASA and its prime Orion contractor Lockheed Martin (LM) strived to seek an effective low-cost method for developing and validating human performance DES models. This paper focuses on the method developed while creating a DES model for the CEV Rendezvous, Proximity Operations, and Docking (RPOD) task to the International Space Station. Our approach to validation was to attack the problem from several fronts. First, we began the development of the model early in the CEV design stage. Second, we adhered strictly to M&S development standards. Third, we involved the stakeholders, NASA astronauts, subject matter experts, and NASA's modeling and simulation development community throughout. Fourth, we applied standard and easy-to-conduct methods to ensure the model's accuracy. Lastly, we reviewed the data from an earlier human-in-the-loop RPOD simulation that had different objectives, which provided us an additional means to estimate the model's confidence level. The results revealed that a majority of the DES model was a reasonable representation of the current CEV design.

  10. Application of artificial intelligence (AI) concepts to the development of space flight parts approval model

    NASA Technical Reports Server (NTRS)

    Krishnan, G. S.

    1997-01-01

    A cost-effective model that uses artificial intelligence techniques in the selection and approval of parts is presented. The knowledge acquired from specialists for different part types is represented in a knowledge base in the form of rules and objects. The parts information is stored separately in a database and is isolated from the knowledge base. Validation, verification and performance issues are highlighted.

  11. An efficient Bayesian data-worth analysis using a multilevel Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Lu, Dan; Ricciuto, Daniel; Evans, Katherine

    2018-03-01

    Improving the understanding of subsurface systems and thus reducing prediction uncertainty requires collection of data. As the collection of subsurface data is costly, it is important that the data collection scheme is cost-effective. Design of a cost-effective data collection scheme, i.e., data-worth analysis, requires quantifying model parameter, prediction, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface hydrological model simulations using standard Monte Carlo (MC) sampling or surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian data-worth analysis using a multilevel Monte Carlo (MLMC) method. Compared to the standard MC that requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, the MLMC can substantially reduce computational costs using multifidelity approximations. Since the Bayesian data-worth analysis involves a great deal of expectation estimation, the cost saving of the MLMC in the assessment can be outstanding. While the proposed MLMC-based data-worth analysis is broadly applicable, we use it for a highly heterogeneous two-phase subsurface flow simulation to select an optimal candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data, and consistent with the standard MC estimation. But compared to the standard MC, the MLMC greatly reduces the computational costs.
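
    The saving behind MLMC comes from combining many cheap low-fidelity samples with a few paired corrections against the expensive model. The two-level sketch below estimates the mean of a toy quantity of interest; the 'fine' and 'coarse' functions are stand-ins for the subsurface simulators, and the example is not an implementation of the authors' Bayesian data-worth workflow.

```python
import numpy as np

rng = np.random.default_rng(7)

def fine_model(x):
    """Expensive high-fidelity stand-in: pretend each call costs a full simulation."""
    return np.sin(x) + 0.05 * x**2

def coarse_model(x):
    """Cheap low-fidelity approximation of the same quantity of interest."""
    return x - x**3 / 6 + 0.05 * x**2           # truncated series for sin(x) plus the same trend

n_coarse, n_fine = 100_000, 500                  # many cheap samples, few expensive corrections
x_coarse = rng.normal(0.5, 0.3, n_coarse)
x_fine = rng.normal(0.5, 0.3, n_fine)

# Level 0: plain Monte Carlo on the coarse model
level0 = coarse_model(x_coarse).mean()
# Level 1: correction term uses *paired* evaluations on the same inputs
level1 = (fine_model(x_fine) - coarse_model(x_fine)).mean()

mlmc_estimate = level0 + level1                  # telescoping-sum estimator
reference = fine_model(rng.normal(0.5, 0.3, 1_000_000)).mean()   # brute force, for comparison only
print(f"MLMC estimate {mlmc_estimate:.4f} vs brute-force MC {reference:.4f}")
```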

  12. Budget impact analysis of thrombolysis for stroke in Spain: a discrete event simulation model.

    PubMed

    Mar, Javier; Arrospide, Arantzazu; Comas, Mercè

    2010-01-01

    Thrombolysis within the first 3 hours after the onset of symptoms of a stroke has been shown to be a cost-effective treatment because treated patients are 30% more likely than nontreated patients to have no residual disability. The objective of this study was to calculate by means of a discrete event simulation model the budget impact of thrombolysis in Spain. The budget impact analysis was based on stroke incidence rates and the estimation of the prevalence of stroke-related disability in Spain and its translation to hospital and social costs. A discrete event simulation model was constructed to represent the flow of patients with stroke in Spain. If 10% of patients with stroke from 2000 to 2015 received thrombolytic treatment, the prevalence of dependent patients in 2015 would decrease from 149,953 to 145,922. For the first 6 years, the cost of intervention would surpass the savings. Nevertheless, the number of cases in which patient dependency was avoided would steadily increase, and after 2006 the cost savings would be greater, with a widening difference between the cost of intervention and the cost of nonintervention, until 2015. The impact of thrombolysis on society's health and social budget indicates a net benefit after 6 years, and the improvement in health grows continuously. The validation of the model demonstrates the adequacy of the discrete event simulation approach in representing the epidemiology of stroke to calculate the budget impact.

  13. Peering Strategic Game Models for Interdependent ISPs in Content Centric Internet

    PubMed Central

    Guan, Jianfeng; Xu, Changqiao; Su, Wei; Zhang, Hongke

    2013-01-01

    Emergent content-oriented networks prompt Internet service providers (ISPs) to evolve and take major responsibility for content delivery. Numerous content items and varying content popularities motivate interdependence between peering ISPs to elaborate their content caching and sharing strategies. In this paper, we propose the concept of peering for content exchange between interdependent ISPs in content centric Internet to minimize content delivery cost by a proper peering strategy. We model four peering strategic games to formulate four types of peering relationships between ISPs who are characterized by varying degrees of cooperative willingness from egoism to altruism and interconnected as profit-individuals or profit-coalition. Simulation results show the price of anarchy (PoA) and communication cost in the four games to validate that ISPs should decide their peering strategies by balancing intradomain content demand and interdomain peering relations for an optimal cost of content delivery. PMID:24381517

  14. Peering strategic game models for interdependent ISPs in content centric Internet.

    PubMed

    Zhao, Jia; Guan, Jianfeng; Xu, Changqiao; Su, Wei; Zhang, Hongke

    2013-01-01

    Emergent content-oriented networks prompt Internet service providers (ISPs) to evolve and take major responsibility for content delivery. Numerous content items and varying content popularities motivate interdependence between peering ISPs to elaborate their content caching and sharing strategies. In this paper, we propose the concept of peering for content exchange between interdependent ISPs in content centric Internet to minimize content delivery cost by a proper peering strategy. We model four peering strategic games to formulate four types of peering relationships between ISPs who are characterized by varying degrees of cooperative willingness from egoism to altruism and interconnected as profit-individuals or profit-coalition. Simulation results show the price of anarchy (PoA) and communication cost in the four games to validate that ISPs should decide their peering strategies by balancing intradomain content demand and interdomain peering relations for an optimal cost of content delivery.

  15. Cost-Sharing of Ecological Construction Based on Trapezoidal Intuitionistic Fuzzy Cooperative Games.

    PubMed

    Liu, Jiacai; Zhao, Wenjian

    2016-11-08

    There exist some fuzziness and uncertainty in the process of ecological construction. The aim of this paper is to develop a direct and an effective simplified method for obtaining the cost-sharing scheme when some interested parties form a cooperative coalition to improve the ecological environment of Min River together. Firstly, we propose the solution concept of the least square prenucleolus of cooperative games with coalition values expressed by trapezoidal intuitionistic fuzzy numbers. Then, based on the square of the distance in the numerical value between two trapezoidal intuitionistic fuzzy numbers, we establish a corresponding quadratic programming model to obtain the least square prenucleolus, which can effectively avoid the information distortion and uncertainty enlargement brought about by the subtraction of trapezoidal intuitionistic fuzzy numbers. Finally, we give a numerical example about the cost-sharing of ecological construction in Fujian Province in China to show the validity, applicability, and advantages of the proposed model and method.

  16. Review of hardware cost estimation methods, models and tools applied to early phases of space mission planning

    NASA Astrophysics Data System (ADS)

    Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.

    2012-08-01

    The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation on a system level, particularly for early mission phases during which specifications and requirements are not yet crystallised, and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. A first step to any program budget is a representative cost estimate which usually hinges on a particular estimation approach, or methodology. Therefore appropriate selection of specific cost models, methods and tools is paramount, a difficult task given the highly variable nature, scope as well as scientific and technical requirements applicable to each program. Numerous methods, models and tools exist. However new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase when system specifications are limited, but the available research budget needs to be established and defined. Due to their specificity, for vehicles such as reusable launchers with a manned capability, a lack of historical data implies that using either the classic heuristic approach such as parametric cost estimation based on underlying CERs, or the analogy approach, is therefore, by definition, limited. This review identifies prominent cost estimation models applied to the space sector, and their underlying cost driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted. Current approaches which strategically amalgamate various cost estimation strategies both for formulation and validation of an estimate, and techniques and/or methods to attain representative and justifiable cost estimates are consequently discussed. Ultimately, the aim of the paper is to establish a baseline for development of a non-commercial, low cost, transparent cost estimation methodology to be applied during very early program research phases at a complete vehicle system level, for largely unprecedented manned launch vehicles in the future. This paper takes the first step to achieving this through the identification, analysis and understanding of established, existing techniques, models, tools and resources relevant within the space sector.
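
    The parametric approach mentioned above rests on cost estimating relationships (CERs), typically power laws of the form cost = a * mass^b fitted to historical data. The sketch below fits such a CER in log-log space to invented subsystem data and applies it to a new design; the coefficients carry no engineering meaning.

```python
import numpy as np

# Invented historical data: dry mass (kg) and development cost (million USD)
mass = np.array([120.0, 250.0, 480.0, 900.0, 1500.0, 2300.0])
cost = np.array([35.0, 60.0, 95.0, 150.0, 215.0, 290.0])

# Fit cost = a * mass^b by ordinary least squares in log-log space
b, log_a = np.polyfit(np.log(mass), np.log(cost), deg=1)
a = np.exp(log_a)
print(f"CER: cost ~ {a:.2f} * mass^{b:.2f}")

# Apply the CER to a new concept, with a crude +/- range from the fit residuals
new_mass = 1_100.0
pred = a * new_mass**b
resid = np.log(cost) - (log_a + b * np.log(mass))
spread = np.exp(1.96 * resid.std())
print(f"Estimated cost: {pred:.0f} MUSD (rough range {pred/spread:.0f}-{pred*spread:.0f} MUSD)")
```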

  17. Cost and logistics for implementing the American College of Surgeons objective structured clinical examination.

    PubMed

    Sudan, Ranjan; Clark, Philip; Henry, Brandon

    2015-01-01

    The American College of Surgeons has developed a reliable and valid OSCE (objective structured clinical examination) to assess the clinical skills of incoming postgraduate year 1 surgery residents, but the cost and logistics of implementation have not been described. Fixed costs included staff time, medical supplies, facility fee, standardized patient (SP) training time, and one OSCE session. Variable costs were incurred for additional OSCE sessions. Costs per resident were calculated and modeled for increasing the number of test takers. American College of Surgeons OSCE materials and examination facilities were free. Fixed costs included training 11 SPs for 4 hours ($1,540), moulage and simulation material ($469), and administrative effort for 44 hours ($2,200). Variable cost for each session was $1,540 (SP time). Total cost for the first session was $6,649 ($664/resident), decreased to $324/resident for 3 sessions, and projected to further decline to $239/resident for 6 sessions. The cost decreased as the number of residents tested increased. To manage costs, testing more trainees by regional collaboration is recommended. Copyright © 2015 Elsevier Inc. All rights reserved.
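
    The reported per-resident figures follow from simple cost arithmetic. The sketch below reconstructs them under stated assumptions: total fixed costs of roughly $5,109 (implied by the $6,649 first-session total, somewhat above the itemised $4,209), a variable cost of $1,540 per additional session, and about 10 residents tested per session.

        # Hedged reconstruction of the OSCE cost-per-resident calculation.
        FIXED_COST = 5109            # implied by the reported $6,649 first-session total
        VARIABLE_PER_SESSION = 1540  # standardized-patient time per session (reported)
        RESIDENTS_PER_SESSION = 10   # assumed from $6,649 / ~$664 per resident

        def cost_per_resident(sessions: int) -> float:
            total = FIXED_COST + VARIABLE_PER_SESSION * sessions
            return total / (RESIDENTS_PER_SESSION * sessions)

        for n in (1, 3, 6):
            print(n, round(cost_per_resident(n)))   # 665, 324, 239; abstract reports $664, $324, $239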

  18. Cost and efficacy comparison of in vitro fertilization and tubal anastomosis for women after tubal ligation

    PubMed Central

    Messinger, Lauren B.; Alford, Connie E.; Csokmay, John M.; Henne, Melinda B.; Mumford, Sunni L.; Segars, James H.; Armstrong, Alicia Y.

    2016-01-01

    Objective To compare cost and efficacy of tubal anastomosis to in vitro fertilization (IVF) in women who desired fertility after a tubal ligation. Design Cost-effectiveness analysis. Setting Not applicable. Patient(s) Not applicable. Intervention(s) Not applicable. Main Outcome Measure(s) Cost per ongoing pregnancy. Result(s) Cost per ongoing pregnancy for women after tubal anastomosis ranged from $16,446 to $223,482 (2014 USD), whereas IVF ranged from $32,902 to $111,679 (2014 USD). Across maternal age groups <35 and 35–40 years, tubal anastomosis was more cost effective than IVF for ongoing pregnancy. Sensitivity analyses validated these findings across a wide range of ongoing pregnancy probabilities as well as costs per procedure. Conclusion(s) Tubal anastomosis was the most cost-effective approach for most women less than 41 years of age, whereas IVF was the most cost-effective approach for women aged ≥41 years who desired fertility after tubal ligation. A model was created that can be modified based on cost and success rates in individual clinics for improved patient counseling. PMID:26006734
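
    The headline metric in this record and the next, cost per ongoing pregnancy, is simply the cost of a treatment attempt divided by its probability of producing an ongoing pregnancy. The sketch below shows the arithmetic with made-up inputs; real per-cycle costs and success rates vary by clinic and maternal age.

        # Cost per ongoing pregnancy: expected spend to achieve one ongoing pregnancy.
        def cost_per_ongoing_pregnancy(cost_per_attempt: float, p_ongoing: float) -> float:
            return cost_per_attempt / p_ongoing

        print(round(cost_per_ongoing_pregnancy(12_000, 0.30)))   # e.g. $40,000 (illustrative inputs)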

  19. Cost and efficacy comparison of in vitro fertilization and tubal anastomosis for women after tubal ligation.

    PubMed

    Messinger, Lauren B; Alford, Connie E; Csokmay, John M; Henne, Melinda B; Mumford, Sunni L; Segars, James H; Armstrong, Alicia Y

    2015-07-01

    To compare cost and efficacy of tubal anastomosis to in vitro fertilization (IVF) in women who desired fertility after a tubal ligation. Cost-effectiveness analysis. Not applicable. Not applicable. Not applicable. Cost per ongoing pregnancy. Cost per ongoing pregnancy for women after tubal anastomosis ranged from $16,446 to $223,482 (2014 USD), whereas IVF ranged from $32,902 to $111,679 (2014 USD). Across maternal age groups <35 and 35-40 years, tubal anastomosis was more cost effective than IVF for ongoing pregnancy. Sensitivity analyses validated these findings across a wide range of ongoing pregnancy probabilities as well as costs per procedure. Tubal anastomosis was the most cost-effective approach for most women less than 41 years of age, whereas IVF was the most cost-effective approach for women aged ≥41 years who desired fertility after tubal ligation. A model was created that can be modified based on cost and success rates in individual clinics for improved patient counseling. Copyright © 2015 American Society for Reproductive Medicine. All rights reserved.

  20. Point-of-care CD4 testing to inform selection of antiretroviral medications in South African antenatal clinics: a cost-effectiveness analysis.

    PubMed

    Ciaranello, Andrea L; Myer, Landon; Kelly, Kathleen; Christensen, Sarah; Daskilewicz, Kristen; Doherty, Katie; Bekker, Linda-Gail; Hou, Taige; Wood, Robin; Francke, Jordan A; Wools-Kaloustian, Kara; Freedberg, Kenneth A; Walensky, Rochelle P

    2015-01-01

    Many prevention of mother-to-child HIV transmission (PMTCT) programs currently prioritize antiretroviral therapy (ART) for women with advanced HIV. Point-of-care (POC) CD4 assays may expedite the selection of three-drug ART instead of zidovudine, but are costlier than traditional laboratory assays. We used validated models of HIV infection to simulate pregnant, HIV-infected women (mean age 26 years, gestational age 26 weeks) in a general antenatal clinic in South Africa, and their infants. We examined two strategies for CD4 testing after HIV diagnosis: laboratory (test rate: 96%, result-return rate: 87%, cost: $14) and POC (test rate: 99%, result-return rate: 95%, cost: $26). We modeled South African PMTCT guidelines during the study period (WHO "Option A"): antenatal zidovudine (CD4 >350/μL) or ART (CD4 ≤350/μL). Outcomes included MTCT risk at weaning (age 6 months), maternal and pediatric life expectancy (LE), maternal and pediatric lifetime healthcare costs (2013 USD), and cost-effectiveness ($/life-year saved). In the base case, laboratory led to projected MTCT risks of 5.7%, undiscounted pediatric LE of 53.2 years, and undiscounted PMTCT plus pediatric lifetime costs of $1,070/infant. POC led to lower modeled MTCT risk (5.3%), greater pediatric LE (53.4 years) and lower PMTCT plus pediatric lifetime costs ($1,040/infant). Maternal outcomes following laboratory were similar to POC (LE: 21.2 years; lifetime costs: $23,860/person). Compared to laboratory, POC improved clinical outcomes and reduced healthcare costs. In antenatal clinics implementing Option A, the higher initial cost of a one-time POC CD4 assay will be offset by cost-savings from prevention of pediatric HIV infection.

  1. Cost-effectiveness of oral ibandronate compared with intravenous (i.v.) zoledronic acid or i.v. generic pamidronate in breast cancer patients with metastatic bone disease undergoing i.v. chemotherapy.

    PubMed

    De Cock, E; Hutton, J; Canney, P; Body, J J; Barrett-Lee, P; Neary, M P; Lewis, G

    2005-12-01

    Ibandronate is the first third-generation bisphosphonate to have both oral and intravenous (i.v.) efficacy. An incremental cost-effectiveness model compared oral ibandronate with i.v. zoledronic acid and i.v. generic pamidronate in female breast cancer patients with metastatic bone disease, undergoing i.v. chemotherapy. A global economic model was adapted to the UK National Health Service (NHS), with primary outcomes of direct healthcare costs and quality-adjusted life years (QALYs). Efficacy, measured as relative risk reduction of skeletal-related events (SREs), was obtained from clinical trials. Resource use data for i.v. bisphosphonates and the cost of managing SREs were obtained from published studies. Hospital management and SRE treatment costs were taken from unit cost databases. Monthly drug acquisition costs were obtained from the British National Formulary. Utility scores were applied to time with/without an SRE to adjust survival for quality of life. Model design and inputs were validated through expert UK clinician review. Total cost, including drug acquisition, was £386 less per patient with oral ibandronate vs. i.v. zoledronic acid and £224 less vs. i.v. generic pamidronate. Oral ibandronate gained 0.019 and 0.02 QALYs vs. i.v. zoledronic acid and i.v. pamidronate, respectively, making it the economically dominant option. At a threshold of £30,000 per QALY, oral ibandronate was cost-effective vs. zoledronic acid in 85% of simulations and vs. pamidronate in 79%. Oral ibandronate is a cost-effective treatment for metastatic bone disease from breast cancer due to reduced SREs, bone pain, and cost savings from avoidance of resource use commonly associated with bisphosphonate infusions.

  2. Indacaterol/glycopyrronium is cost-effective compared to salmeterol/fluticasone in COPD: FLAME-based modelling in a Swedish population.

    PubMed

    Bjermer, Leif; van Boven, Job F M; Costa-Scharplatz, Madlaina; Keininger, Dorothy L; Gutzwiller, Florian S; Lisspers, Karin; Mahon, Ronan; Olsson, Petter; Roche, Nicolas

    2017-12-11

    This study assessed the cost-effectiveness of indacaterol/glycopyrronium (IND/GLY) versus salmeterol/fluticasone (SFC) in chronic obstructive pulmonary disease (COPD) patients with moderate to very severe airflow limitation and ≥1 exacerbation in the preceding year. A previously published and validated patient-level simulation model was adapted using clinical data from the FLAME trial and real-world cost data from the ARCTIC study. Costs (total monetary costs comprising drug, maintenance, exacerbation, and pneumonia costs) and health outcomes (life-years (LYs), quality-adjusted life-years (QALYs)) were projected over various time horizons (1, 5, 10 years, and lifetime) from the Swedish payer's perspective and were discounted at 3% annually. Uncertainty in model input values was studied through one-way and probabilistic sensitivity analyses. Subgroup analyses were also performed. IND/GLY was associated with lower costs and better outcomes compared with SFC over all the analysed time horizons. Use of IND/GLY resulted in additional 0.192 LYs and 0.134 QALYs with cost savings of €1211 compared with SFC over lifetime. The net monetary benefit (NMB) was estimated to be €8560 based on a willingness-to-pay threshold of €55,000/QALY. The NMB was higher in the following subgroups: severe (GOLD 3), high risk and more symptoms (GOLD D), females, and current smokers. IND/GLY is a cost-effective treatment compared with SFC in COPD patients with mMRC dyspnea grade ≥ 2, moderate to very severe airflow limitation, and ≥1 exacerbation in the preceding year.
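
    The reported net monetary benefit follows from the standard NMB formula, incremental QALYs times the willingness-to-pay threshold minus incremental cost. Using the rounded inputs from the abstract reproduces the published figure to within rounding.

        # Net monetary benefit (NMB) arithmetic for IND/GLY vs SFC, inputs from the abstract.
        WTP = 55_000               # €/QALY willingness-to-pay threshold
        incremental_qalys = 0.134  # QALYs gained over the lifetime horizon
        incremental_cost = -1211   # € (negative: IND/GLY is cost saving)

        nmb = incremental_qalys * WTP - incremental_cost
        print(round(nmb))          # ≈ 8581, close to the reported €8,560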

  3. Application of multivariate probabilistic (Bayesian) networks to substance use disorder risk stratification and cost estimation.

    PubMed

    Weinstein, Lawrence; Radano, Todd A; Jack, Timothy; Kalina, Philip; Eberhardt, John S

    2009-09-16

    This paper explores the use of machine learning and Bayesian classification models to develop broadly applicable risk stratification models to guide disease management of health plan enrollees with substance use disorder (SUD). While the high costs and morbidities associated with SUD are understood by payers, who manage it through utilization review, acute interventions, coverage and cost limitations, and disease management, the literature shows mixed results for these modalities in improving patient outcomes and controlling cost. Our objective is to evaluate the potential of data mining methods to identify novel risk factors for chronic disease and stratification of enrollee utilization, which can be used to develop new methods for targeting disease management services to maximize benefits to both enrollees and payers. For our evaluation, we used DecisionQ machine learning algorithms to build Bayesian network models of a representative sample of data licensed from Thomson-Reuters' MarketScan consisting of 185,322 enrollees with three full-year claim records. Data sets were prepared, and a stepwise learning process was used to train a series of Bayesian belief networks (BBNs). The BBNs were validated using a 10 percent holdout set. The networks were highly predictive, with the risk-stratification BBNs producing area under the curve (AUC) for SUD positive of 0.948 (95 percent confidence interval [CI], 0.944-0.951) and 0.736 (95 percent CI, 0.721-0.752), respectively, and SUD negative of 0.951 (95 percent CI, 0.947-0.954) and 0.738 (95 percent CI, 0.727-0.750), respectively. The cost estimation models produced area under the curve ranging from 0.72 (95 percent CI, 0.708-0.731) to 0.961 (95 percent CI, 0.95-0.971). We were able to successfully model a large, heterogeneous population of commercial enrollees, applying state-of-the-art machine learning technology to develop complex and accurate multivariate models that support near-real-time scoring of novel payer populations based on historic claims and diagnostic data. Initial validation results indicate that we can stratify enrollees with SUD diagnoses into different cost categories with a high degree of sensitivity and specificity, and the most challenging issue becomes one of policy. Due to the social stigma associated with the disease and ethical issues pertaining to access to care and individual versus societal benefit, a thoughtful dialogue needs to occur about the appropriate way to implement these technologies.
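
    The validation scheme described, scoring a trained classifier on a 10 percent holdout set with the area under the ROC curve, can be sketched generically. The snippet below uses synthetic data and a scikit-learn logistic regression as a stand-in for the proprietary DecisionQ Bayesian belief networks; it illustrates the workflow, not the published results.

        # Generic 10% holdout AUC evaluation (synthetic stand-in data and model).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.normal(size=(5000, 12))                                          # claims-like features
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=5000) > 1).astype(int)    # outcome flag

        X_tr, X_ho, y_tr, y_ho = train_test_split(X, y, test_size=0.10, random_state=0)
        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        print(round(roc_auc_score(y_ho, model.predict_proba(X_ho)[:, 1]), 3))    # holdout AUC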

  4. Evaluating the effectiveness of management practices on hydrology and water quality at watershed scale with a rainfall-runoff model.

    PubMed

    Liu, Yaoze; Bralts, Vincent F; Engel, Bernard A

    2015-04-01

    The adverse influence of urban development on hydrology and water quality can be reduced by applying best management practices (BMPs) and low impact development (LID) practices. This study applied green roof, rain barrel/cistern, bioretention system, porous pavement, permeable patio, grass strip, grassed swale, wetland channel, retention pond, detention basin, and wetland basin practices to the Crooked Creek watershed. The model was calibrated and validated for annual runoff volume. A framework for simulating BMPs and LID practices at watershed scales was created, and the impacts of BMPs and LID practices on water quantity and water quality were evaluated with the Long-Term Hydrologic Impact Assessment-Low Impact Development 2.1 (L-THIA-LID 2.1) model for 16 scenarios. The various levels and combinations of BMPs/LID practices reduced runoff volume by 0 to 26.47%, Total Nitrogen (TN) by 0.30 to 34.20%, Total Phosphorus (TP) by 0.27 to 47.41%, Total Suspended Solids (TSS) by 0.33 to 53.59%, Lead (Pb) by 0.30 to 60.98%, Biochemical Oxygen Demand (BOD) by 0 to 26.70%, and Chemical Oxygen Demand (COD) by 0 to 27.52%. The implementation of grass strips in 25% of the watershed where this practice could be applied was the most cost-efficient scenario, with a cost per unit reduction of $1 per m3/yr of runoff, while the costs per unit reduction for the two pollutants of concern were $445 per kg/yr for TN and $4,871 per kg/yr for TP. The scenario with very high levels of BMP and LID practice adoption (scenario 15) reduced runoff volume and pollutant loads by 26.47% to 60.98%, and provided the greatest reduction in runoff volume and pollutant loads among all scenarios. However, this scenario was not as cost-efficient as most other scenarios. The L-THIA-LID 2.1 model is a valid tool that can be applied to various locations to help identify cost effective BMP/LID practice plans at watershed scales. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Ngram time series model to predict activity type and energy cost from wrist, hip and ankle accelerometers: implications of age

    PubMed Central

    Strath, Scott J; Kate, Rohit J; Keenan, Kevin G; Welch, Whitney A; Swartz, Ann M

    2016-01-01

    To develop and test time series single site and multi-site placement models, we used wrist, hip and ankle processed accelerometer data to estimate energy cost and type of physical activity in adults. Ninety-nine subjects in three age groups (18–39, 40–64, 65 + years) performed 11 activities while wearing three triaxial accelerometers: one each on the non-dominant wrist, hip, and ankle. During each activity, net oxygen cost (METs) was assessed. The time series of accelerometer signals were represented in terms of uniformly discretized values called bins. Support Vector Machine was used for activity classification with bins and every pair of bins used as features. Bagged decision tree regression was used for net metabolic cost prediction. To evaluate model performance we employed the jackknife leave-one-out cross validation method. Single accelerometer and multi-accelerometer site model estimates across and within age group revealed similar accuracy, with a bias range of −0.03 to 0.01 METs, bias percent of −0.8 to 0.3%, and an rMSE range of 0.81–1.04 METs. Multi-site accelerometer location models improved activity type classification over single site location models from a low of 69.3% to a maximum of 92.8% accuracy. For each accelerometer site location model, or combined site location model, percent classification accuracy decreased as a function of age group, or when younger age group models were generalized to older age groups. Specific age group models on average performed better than when all age groups were combined. The time series computation shows promising results for predicting energy cost and activity type. Differences in prediction across age groups, the lack of generalizability across age groups, and the finding that age-group-specific models perform better than models combining all ages need to be considered as analytic calibration procedures to detect energy cost and type are further developed. PMID:26449155
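
    The modelling pipeline described (binned accelerometer features, a support vector machine for activity type, bagged decision trees for METs, and leave-one-out cross-validation) can be sketched with scikit-learn. The data below are random placeholders, and leave-one-out is implemented here as leave-one-subject-out, which may differ from the paper's exact jackknife scheme.

        # Schematic activity-type classification and MET regression with subject-wise CV.
        import numpy as np
        from sklearn.ensemble import BaggingRegressor
        from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
        from sklearn.svm import SVC
        from sklearn.tree import DecisionTreeRegressor

        rng = np.random.default_rng(1)
        subjects = np.repeat(np.arange(20), 10)       # 20 stand-in subjects, 10 windows each
        n = subjects.size
        X = rng.normal(size=(n, 30))                  # binned signal features (placeholder)
        activity = rng.integers(0, 11, size=n)        # 11 activity types
        mets = 1 + 5 * rng.random(size=n)             # net oxygen cost (METs)

        logo = LeaveOneGroupOut()
        acc = cross_val_score(SVC(), X, activity, groups=subjects, cv=logo).mean()
        rmse = -cross_val_score(BaggingRegressor(DecisionTreeRegressor()), X, mets,
                                groups=subjects, cv=logo,
                                scoring="neg_root_mean_squared_error").mean()
        print(round(acc, 3), round(rmse, 2))          # classification accuracy, MET rMSE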

  6. The Prime Diabetes Model: Novel Methods for Estimating Long-Term Clinical and Cost Outcomes in Type 1 Diabetes Mellitus.

    PubMed

    Valentine, William J; Pollock, Richard F; Saunders, Rhodri; Bae, Jay; Norrbacka, Kirsi; Boye, Kristina

    Recent publications describing long-term follow-up from landmark trials and diabetes registries represent an opportunity to revisit modeling options in type 1 diabetes mellitus (T1DM). To develop a new product-independent model capable of predicting long-term clinical and cost outcomes. After a systematic literature review to identify clinical trial and registry data, a model was developed (the PRIME Diabetes Model) to simulate T1DM progression and complication onset. The model runs as a patient-level simulation, making use of covariance matrices for cohort generation and risk factor progression, and simulating myocardial infarction, stroke, angina, heart failure, nephropathy, retinopathy, macular edema, neuropathy, amputation, hypoglycemia, ketoacidosis, mortality, and risk factor evolution. Several approaches novel to T1DM modeling were used, including patient characteristics and risk factor covariance, a glycated hemoglobin progression model derived from patient-level data, and model averaging approaches to evaluate complication risk. Validation analyses comparing modeled outcomes with published studies demonstrated that the PRIME Diabetes Model projects long-term patient outcomes consistent with those reported for a number of long-term studies. Macrovascular end points were reliably reproduced across five different populations and microvascular complication risk was accurately predicted on the basis of comparisons with landmark studies and published registry data. The PRIME Diabetes Model is product-independent, available online, and has been developed in line with good practice guidelines. Validation has indicated that outcomes from long-term studies can be reliably reproduced. The model offers new approaches to long-standing challenges in diabetes modeling and may become a valuable tool for informing health care policy. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  7. Automation life-cycle cost model

    NASA Technical Reports Server (NTRS)

    Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne

    1992-01-01

    The problem domain being addressed by this contractual effort can be summarized by the following list: Automation and Robotics (A&R) technologies appear to be viable alternatives to current, manual operations; Life-cycle cost models are typically judged with suspicion due to implicit assumptions and little associated documentation; and Uncertainty is a reality for increasingly complex problems and few models explicitly account for its effect on the solution space. The objectives for this effort range from the near-term (1-2 years) to far-term (3-5 years). In the near-term, the envisioned capabilities of the modeling tool are annotated. In addition, a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: Assess desirable capabilities (structured into near- and far-term); Identify useful existing models/data; Identify parameters for utility analysis; Define tool framework; Encode scenario thread for model validation; and Provide transition path for tool development. This report contains all relevant, technical progress made on this contractual effort.

  8. The cost-effectiveness of iodine 131 scintigraphy, ultrasonography, and fine-needle aspiration biopsy in the initial diagnosis of solitary thyroid nodules.

    PubMed

    Khalid, Ayesha N; Hollenbeak, Christopher S; Quraishi, Sadeq A; Fan, Chris Y; Stack, Brendan C

    2006-03-01

    To compare the cost-effectiveness of fine-needle aspiration biopsy, iodine 131 scintigraphy, and ultrasonography for the initial diagnostic workup of a solitary palpable thyroid nodule. A deterministic cost-effectiveness analysis was conducted using a decision tree to model the diagnostic strategies. A single, mid-Atlantic academic medical center. Expected costs, expected number of cases correctly diagnosed, and incremental cost per additional case correctly diagnosed. Relative to the routine use of fine-needle aspiration biopsy, the incremental cost per case correctly diagnosed is $24,554 for the iodine 131 scintigraphy strategy and $1,212 for the ultrasound strategy. A diagnostic strategy using initial fine-needle aspiration biopsy for palpable thyroid nodules was found to be cost-effective compared with the other approaches as long as a payor's willingness to pay for an additional correct diagnosis is less than $1,212. Prospective studies are needed to validate these findings in clinical practice.
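
    The decision metric here, incremental cost per additional case correctly diagnosed, has the same form as an incremental cost-effectiveness ratio. The sketch below shows the calculation with illustrative cohort-level placeholders rather than the study's actual inputs.

        # Incremental cost per additional correct diagnosis, relative to a reference strategy.
        def incremental_cost_per_correct_dx(cost, correct, cost_ref, correct_ref):
            return (cost - cost_ref) / (correct - correct_ref)

        # hypothetical 100-patient cohort: alternative strategy vs routine FNA biopsy
        print(round(incremental_cost_per_correct_dx(61_200, 92, 60_000, 91)))   # $ per extra correct diagnosis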

  9. Status of costing hospital nursing work within Australian casemix activity-based funding policy.

    PubMed

    Heslop, Liza

    2012-02-01

    Australia has a long history of patient level costing initiated when casemix funding was implemented in several states in the early 1990s. Australia includes, to some extent, hospital payment based on nursing intensity adopted within casemix funding policy and the Diagnostic Related Group system. Costing of hospital nursing services in Australia has not changed significantly in the last few decades despite widespread introduction of casemix funding policy at the state level. Recent Commonwealth of Australia National Health Reform presents change to the management of the delivery of health care including health-care costing. There is agreement for all Australian jurisdictions to progress to casemix-based activity funding. Within this context, nurse costing infrastructure presents contemporary issues and challenges. An assessment is made of the progress of costing nursing services within casemix funding models in Australian hospitals. Valid and reliable Australian-refined nursing service weights might overcome present cost deficiencies and limitations. © 2012 Blackwell Publishing Asia Pty Ltd.

  10. Pediatric Specialty Care Model for Management of Chronic Respiratory Failure: Cost and Savings Implications and Misalignment With Payment Models.

    PubMed

    Graham, Robert J; McManus, Michael L; Rodday, Angie Mae; Weidner, Ruth Ann; Parsons, Susan K

    2018-05-01

    To describe program design, costs, and savings implications of a critical care-based care coordination model for medically complex children with chronic respiratory failure. All program activities and resultant clinical outcomes were tracked over 4 years using an adapted version of the Care Coordination Measurement Tool. Patient characteristics, program activity, and acute care resource utilization were prospectively documented in the adapted version of the Care Coordination Measurement Tool and retrospectively cross-validated with hospital billing data. Impact on total costs of care was then estimated based on program outcomes and nationally representative administrative data. Tertiary children's hospital. Critical Care, Anesthesia, Perioperative Extension and Home Ventilation Program enrollees. None. The program provided care for 346 patients and families over the study period. Median age at enrollment was 6 years with more than half deriving secondary respiratory failure from a primary neuromuscular disease. There were 11,960 encounters over the study period, including 1,202 home visits, 673 clinic visits, and 4,970 telephone or telemedicine encounters. Half (n = 5,853) of all encounters involved a physician and 45% included at least one care coordination activity. Overall, we estimated that program interventions were responsible for averting 556 emergency department visits and 107 hospitalizations. Conservative monetization of these alone accounted for annual savings of $1.2-2 million or $407/pt/mo net of program costs. Innovative models, such as extension of critical care services, for high-risk, high-cost patients can result in immediate cost savings. Evaluation of financial implications of comprehensive care for high-risk patients is necessary to complement clinical and patient-centered outcomes for alternative care models. When year-to-year cost variability is high and cost persistence is low, these savings can be estimated from documentation within care coordination management tools. Means of financial sustainability, scalability, and equal access of such care models need to be established.

  11. Formal specification and design techniques for wireless sensor and actuator networks.

    PubMed

    Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons

    2011-01-01

    A current trend in the development and implementation of industrial applications is to use wireless networks to interconnect the system nodes, mainly to increase application flexibility, reliability and portability, as well as to reduce the implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less than satisfactory performance in simulation and test bed scenarios, which is caused by using imprecise models to analyze, validate and design these systems. Moreover, there are some simulation platforms that do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) which is supported by a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system.

  12. Risk Prediction Score for HIV Infection: Development and Internal Validation with Cross-Sectional Data from Men Who Have Sex with Men in China.

    PubMed

    Yin, Lu; Zhao, Yuejuan; Peratikos, Meridith Blevins; Song, Liang; Zhang, Xiangjun; Xin, Ruolei; Sun, Zheya; Xu, Yunan; Zhang, Li; Hu, Yifei; Hao, Chun; Ruan, Yuhua; Shao, Yiming; Vermund, Sten H; Qian, Han-Zhu

    2018-05-21

    Receptive anal intercourse, multiple partners, condomless sex, sexually transmitted infections (STIs), and drug/alcohol addiction are familiar factors that correlate with increased human immunodeficiency virus (HIV) risk among men who have sex with men (MSM). To improve estimation of HIV acquisition risk, we created a composite score using questions from a routine survey of 3588 MSM in Beijing, China. The HIV prevalence was 13.4%. A risk scoring tool using penalized maximum likelihood multivariable logistic regression modeling was developed, deploying backward step-down variable selection to obtain a reduced-form model. The full penalized model included 19 sexual predictors, while the reduced-form model had 12 predictors. Both models calibrated well; bootstrap-corrected c-indices were 0.70 (full model) and 0.71 (reduced-form model). Non-Beijing residence, short-term living in Beijing, illegal drug use, multiple male sexual partners, receptive anal sex, inconsistent condom use, alcohol consumption before sex, and syphilis infection were the strongest predictors of HIV infection. Discriminating higher-risk MSM for targeted HIV prevention programming using a validated risk score could improve the efficiency of resource deployment for educational and risk reduction programs. A valid risk score can also help enroll higher-risk persons into prevention and vaccine clinical trials, which would improve trial cost-efficiency.
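
    A skeleton of the risk-score workflow, penalized logistic regression evaluated by a cross-validated c-index (equivalent to the AUC), is sketched below. It omits the paper's backward step-down selection and bootstrap correction, and the survey data are simulated placeholders.

        # Penalized logistic regression risk score with cross-validated c-index (synthetic data).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(2)
        X = rng.integers(0, 2, size=(3588, 19)).astype(float)          # 19 binary survey predictors
        logit = -3.0 + X[:, :4] @ np.array([0.8, 0.6, 0.5, 0.4])       # a few informative items
        y = rng.binomial(1, 1 / (1 + np.exp(-logit)))                  # roughly 13% prevalence stand-in

        model = LogisticRegression(penalty="l2", C=0.5, max_iter=1000)
        pred = cross_val_predict(model, X, y, cv=10, method="predict_proba")[:, 1]
        print(round(roc_auc_score(y, pred), 2))                        # c-index of the full model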

  13. Structural Design Optimization of Doubly-Fed Induction Generators Using GeneratorSE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sethuraman, Latha; Fingersh, Lee J; Dykes, Katherine L

    2017-11-13

    A wind turbine with a larger rotor swept area can generate more electricity; however, this increases costs disproportionately for manufacturing, transportation, and installation. This poster presents analytical models for optimizing doubly-fed induction generators (DFIGs), with the objective of reducing the costs and mass of wind turbine drivetrains. The structural design for the induction machine includes models for the casing, stator, rotor, and high-speed shaft developed within the DFIG module in the National Renewable Energy Laboratory's wind turbine sizing tool, GeneratorSE. The mechanical integrity of the machine is verified by examining stresses, structural deflections, and modal properties. The optimization results are then validated using finite element analysis (FEA). The results suggest that our analytical model correlates with the FEA in some areas, such as radial deflection, differing by less than 20 percent. But the analytical model requires further development for axial deflections, torsional deflections, and stress calculations.

  14. Effects of human running cadence and experimental validation of the bouncing ball model

    NASA Astrophysics Data System (ADS)

    Bencsik, László; Zelei, Ambrus

    2017-05-01

    The biomechanical analysis of human running is a complex problem, because of the large number of parameters and degrees of freedom. However, simplified models can be constructed, which are usually characterized by some fundamental parameters, like step length, foot strike pattern and cadence. The bouncing ball model of human running is analysed theoretically and experimentally in this work. It is a minimally complex dynamic model when the aim is to estimate the energy cost of running and the tendency of ground-foot impact intensity as a function of cadence. The model shows that cadence has a direct effect on energy efficiency of running and ground-foot impact intensity. Furthermore, it shows that higher cadence implies lower risk of injury and better energy efficiency. An experimental data collection of 121 amateur runners is presented. The experimental results validate the model and provides information about the walk-to-run transition speed and the typical development of cadence and grounded phase ratio in different running speed ranges.

  15. Testing the convergent validity of the contingent valuation and travel cost methods in valuing the benefits of health care.

    PubMed

    Clarke, Philip M

    2002-03-01

    In this study, the convergent validity of the contingent valuation method (CVM) and travel cost method (TCM) is tested by comparing estimates of the willingness to pay (WTP) for improving access to mammographic screening in rural areas of Australia. It is based on a telephone survey of 458 women in 19 towns, in which they were asked about their recent screening behaviour and their WTP to have a mobile screening unit visit their nearest town. After eliminating missing data and other non-usable responses, the contingent valuation experiment and travel cost model were based on information from 372 and 319 women, respectively. Estimates of the maximum WTP for the use of mobile screening units were derived using both methods and compared. The highest mean WTP estimated using the TCM was $83.10 (95% C.I. $68.53-$99.06), which is significantly less than the estimate of $148.09 ($131.13-$166.60) using the CVM. This could be due to the CVM estimates also reflecting non-use values such as altruism, or a range of potential biases that are known to affect both methods. Further tests of validity are required in order to gain a greater understanding of the relationship between these two methods of estimating WTP. Copyright 2001 John Wiley & Sons, Ltd.

  16. Design and validation of low-cost assistive glove for hand assessment and therapy during activity of daily living-focused robotic stroke therapy.

    PubMed

    Nathan, Dominic E; Johnson, Michelle J; McGuire, John R

    2009-01-01

    Hand and arm impairment is common after stroke. Robotic stroke therapy will be more effective if hand and upper-arm training is integrated to help users practice reaching and grasping tasks. This article presents the design, development, and validation of a low-cost, functional electrical stimulation grasp-assistive glove for use with task-oriented robotic stroke therapy. Our glove measures grasp aperture while a user completes simple-to-complex real-life activities, and when combined with an integrated functional electrical stimulator, it assists in hand opening and closing. A key function is a new grasp-aperture prediction model, which uses the position of the end-effectors of two planar robots to define the distance between the thumb and index finger. We validated the accuracy and repeatability of the glove and its capability to assist in grasping. Results from five nondisabled subjects indicated that the glove is accurate and repeatable for both static hand-open and -closed tasks when compared with goniometric measures and for dynamic reach-to-grasp tasks when compared with motion analysis measures. Results from five subjects with stroke showed that with the glove, they could open their hands but without it could not. We present a glove that is a low-cost solution for in vivo grasp measurement and assistance.

  17. Stochastic Hourly Weather Generator HOWGH: Validation and its Use in Pest Modelling under Present and Future Climates

    NASA Astrophysics Data System (ADS)

    Dubrovsky, M.; Hirschi, M.; Spirig, C.

    2014-12-01

    To quantify the impact of climate change on a specific pest (or any weather-dependent process) at a specific site, we may use a site-calibrated pest (or other) model and compare its outputs obtained with site-specific weather data representing present vs. perturbed climates. The input weather data may be produced by a stochastic weather generator. Apart from the quality of the pest model, the reliability of the results obtained in such an experiment depends on the ability of the generator to represent the statistical structure of the real-world weather series, and on the sensitivity of the pest model to possible imperfections of the generator. This contribution deals with the multivariate HOWGH weather generator, which is based on a combination of parametric and non-parametric statistical methods. Here, HOWGH is used to generate synthetic hourly series of three weather variables (solar radiation, temperature and precipitation) required by the dynamic pest model SOPRA to simulate the development of codling moth. The contribution presents results of the direct and indirect validation of HOWGH. In the direct validation, the synthetic series generated by HOWGH (various settings of its underlying model are assumed) are validated in terms of multiple climatic characteristics, focusing on the subdaily wet/dry and hot/cold spells. In the indirect validation, we assess the generator in terms of characteristics derived from the outputs of the SOPRA model fed by the observed vs. synthetic series. The weather generator may be used to produce weather series representing present and future climates. In the latter case, the parameters of the generator may be modified by climate change scenarios based on Global or Regional Climate Models. To demonstrate this feature, the results of codling moth simulations for future climate will be shown. Acknowledgements: The weather generator is developed and validated within the frame of projects WG4VALUE (project LD12029 sponsored by the Ministry of Education, Youth and Sports of CR) and VALUE (COST ES 1102 action).

  18. Modeling the economic impact of medication adherence in type 2 diabetes: a theoretical approach.

    PubMed

    Cobden, David S; Niessen, Louis W; Rutten, Frans Fh; Redekop, W Ken

    2010-09-07

    While strong correlations exist between medication adherence and health economic outcomes in type 2 diabetes, current economic analyses do not adequately consider them. We propose a new approach to incorporate adherence in cost-effectiveness analysis. We describe a theoretical approach to incorporating the effect of adherence when estimating the long-term costs and effectiveness of an antidiabetic medication. This approach was applied in a Markov model which includes common diabetic health states. We compared two treatments using hypothetical patient cohorts: injectable insulin (IDM) and oral (OAD) medications. Two analyses were performed, one which ignored adherence (analysis 1) and one which incorporated it (analysis 2). Results from the two analyses were then compared to explore the extent to which adherence may impact incremental cost-effectiveness ratios. In both analyses, IDM was more costly and more effective than OAD. When adherence was ignored, IDM generated an incremental cost-effectiveness of $12,097 per quality-adjusted life-year (QALY) gained versus OAD. Incorporation of adherence resulted in a slightly higher ratio ($16,241/QALY). This increase was primarily due to better adherence with OAD than with IDM, and the higher direct medical costs for IDM. Incorporating medication adherence into economic analyses can meaningfully influence the estimated cost-effectiveness of type 2 diabetes treatments, and should therefore be considered in health care decision-making. Future work on the impact of adherence on health economic outcomes, and validation of different approaches to modeling adherence, is warranted.
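
    The comparison between the two analyses reduces to the incremental cost-effectiveness ratio. The sketch below uses made-up absolute costs and QALYs chosen only so that the two ratios land near the reported $12,097 and $16,241 per QALY; it is not the Markov model itself.

        # ICER arithmetic for the adherence-ignored vs adherence-incorporated analyses.
        def icer(cost_new, qaly_new, cost_old, qaly_old):
            return (cost_new - cost_old) / (qaly_new - qaly_old)

        print(round(icer(45_000, 9.2, 32_900, 8.2)))     # analysis 1: ≈ $12,100/QALY
        print(round(icer(45_000, 9.2, 33_900, 8.516)))   # analysis 2: ≈ $16,200/QALY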

  19. Application of multi-objective optimization to pooled experiments of next generation sequencing for detection of rare mutations.

    PubMed

    Zilinskas, Julius; Lančinskas, Algirdas; Guarracino, Mario Rosario

    2014-01-01

    In this paper we propose some mathematical models to plan a Next Generation Sequencing experiment to detect rare mutations in pools of patients. A mathematical optimization problem is formulated for optimal pooling, with respect to minimization of the experiment cost. Then, two different strategies to replicate patients in pools are proposed, which have the advantage of decreasing the overall costs. Finally, a multi-objective optimization formulation is proposed, in which the trade-off between the probability of detecting a mutation and the overall costs is taken into account. The proposed solutions are designed to provide the following advantages: (i) the solution guarantees mutations are detectable in the experimental setting, and (ii) the cost of the NGS experiment and its biological validation using Sanger sequencing is minimized. Simulations show that replicating pools can decrease the overall experimental cost, thus making pooling an interesting option.

  20. Managing design excellence tools during the development of new orthopaedic implants.

    PubMed

    Défossez, Henri J P; Serhan, Hassan

    2013-11-01

    Design excellence (DEX) tools have been widely used for years in some industries for their potential to facilitate new product development. The medical sector, targeted by cost pressures, has therefore started adopting them. Numerous tools are available; however, only appropriate deployment during the new product development stages can optimize the overall process. The primary study objectives were to describe generic tools, illustrate their implementation and management during the development of new orthopaedic implants, and compile a reference package. Secondary objectives were to present the DEX tool investment costs and savings, since the method can require significant resources for which companies must carefully plan. The publicly available DEX method "Define Measure Analyze Design Verify Validate" was adopted and implemented during the development of a new spinal implant. Several tools proved most successful at developing the correct product, addressing clinical needs, and increasing market penetration potential, while reducing design iterations and manufacturing validations. Cost analysis and the Pugh Matrix coupled with multi-generation planning enabled developing a strong rationale to activate the project and set the vision and goals. Improved risk management and the product usage map established a robust technical verification-validation program. Design of experiments and process quantification facilitated design for manufacturing of critical features, as early as the concept phase. Biomechanical testing with analysis of variance provided a validation model with a recognized statistical performance baseline. Within those tools, only certain ones required minimum resources (i.e., business case, multi-generational plan, project value proposition, Pugh Matrix, critical-to-quality process validation techniques), while others required significant investments (i.e., voice of customer, product usage map, improved risk management, design of experiments, biomechanical testing techniques). All used techniques provided savings exceeding investment costs. Some other tools were considered and found less relevant. A matrix summarized the investment costs and generated estimated savings. Globally, all companies can benefit from using DEX by smartly selecting and estimating those tools with the best return on investment at the start of the project. For this, a good understanding of the available company resources, background and development strategy is needed. In conclusion, it was possible to illustrate that appropriate management of design excellence tools can greatly facilitate the development of new orthopaedic implant systems.

  1. Implementation of fuzzy logic to determining selling price of products in a local corporate chain store

    NASA Astrophysics Data System (ADS)

    Kristiana, S. P. D.

    2017-12-01

    Corporate chain stores are one type of retail company that is developing and growing rapidly in Indonesia. The competition between retail companies is very tight, so retailer companies should evaluate their performance continuously in order to survive. The selling price of products is one of the essential attributes that gets the attention of many consumers and is used to evaluate the performance of the industry. This research aimed to determine the optimal selling price of products by considering cost factors, namely the purchase price of the product from the supplier, holding costs, and transportation costs. A fuzzy logic approach is used for data processing with MATLAB software. Fuzzy logic is selected to solve the problem because this method can handle complex factors. The result is a model for determining the optimal selling price that takes the three cost factors as inputs. MAPE and model prediction ability for several products are used for validation and verification, with average values of 0.0525 for MAPE and 94.75% for prediction ability. The conclusion is that the model can predict the selling price with up to 94.75% accuracy, so it can be used as a tool for the corporate chain store to determine the optimal selling price for its products.
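
    The two validation figures quoted are linked by a simple relation if "prediction ability" is read as 100% minus the mean absolute percentage error (an assumption; the paper's exact definition may differ). The sketch below computes both from hypothetical price data.

        # MAPE and derived prediction ability for a handful of hypothetical products.
        import numpy as np

        actual = np.array([10500, 9800, 12250, 8900])       # hypothetical shelf prices
        predicted = np.array([10200, 10150, 11800, 9300])   # hypothetical model outputs

        mape = np.mean(np.abs((actual - predicted) / actual))
        print(round(mape, 4), f"{(1 - mape) * 100:.2f}%")    # MAPE, prediction ability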

  2. The Interrelationship between Promoter Strength, Gene Expression, and Growth Rate

    PubMed Central

    Klesmith, Justin R.; Detwiler, Emily E.; Tomek, Kyle J.; Whitehead, Timothy A.

    2014-01-01

    In exponentially growing bacteria, expression of heterologous protein impedes cellular growth rates. Quantitative understanding of the relationship between expression and growth rate will advance our ability to forward engineer bacteria, important for metabolic engineering and synthetic biology applications. Recently, a work described a scaling model based on optimal allocation of ribosomes for protein translation. This model quantitatively predicts a linear relationship between microbial growth rate and heterologous protein expression with no free parameters. With the aim of validating this model, we have rigorously quantified the fitness cost of gene expression by using a library of synthetic constitutive promoters to drive expression of two separate proteins (eGFP and amiE) in E. coli in different strains and growth media. In all cases, we demonstrate that the fitness cost is consistent with the previous findings. We expand upon the previous theory by introducing a simple promoter activity model to quantitatively predict how basal promoter strength relates to growth rate and protein expression. We then estimate the amount of protein expression needed to support high flux through a heterologous metabolic pathway and predict the sizable fitness cost associated with enzyme production. This work has broad implications across applied biological sciences because it allows for prediction of the interplay between promoter strength, protein expression, and the resulting cost to microbial growth rates. PMID:25286161

  3. Including crystal structure attributes in machine learning models of formation energies via Voronoi tessellations

    NASA Astrophysics Data System (ADS)

    Ward, Logan; Liu, Ruoqian; Krishna, Amar; Hegde, Vinay I.; Agrawal, Ankit; Choudhary, Alok; Wolverton, Chris

    2017-07-01

    While high-throughput density functional theory (DFT) has become a prevalent tool for materials discovery, it is limited by the relatively large computational cost. In this paper, we explore using DFT data from high-throughput calculations to create faster, surrogate models with machine learning (ML) that can be used to guide new searches. Our method works by using decision tree models to map DFT-calculated formation enthalpies to a set of attributes consisting of two distinct types: (i) composition-dependent attributes of elemental properties (as have been used in previous ML models of DFT formation energies), combined with (ii) attributes derived from the Voronoi tessellation of the compound's crystal structure. The ML models created using this method have half the cross-validation error and similar training and evaluation speeds to models created with the Coulomb matrix and partial radial distribution function methods. For a dataset of 435 000 formation energies taken from the Open Quantum Materials Database (OQMD), our model achieves a mean absolute error of 80 meV/atom in cross validation, which is lower than the approximate error between DFT-computed and experimentally measured formation enthalpies and below 15% of the mean absolute deviation of the training set. We also demonstrate that our method can accurately estimate the formation energy of materials outside of the training set and be used to identify materials with especially large formation enthalpies. We propose that our models can be used to accelerate the discovery of new materials by identifying the most promising materials to study with DFT at little additional computational cost.
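
    The evaluation protocol, a tree-based regressor mapping attribute vectors to formation energies and scored by cross-validated mean absolute error in eV/atom, can be sketched generically. A random forest stands in for the paper's decision-tree models, and the features and targets below are random placeholders rather than OQMD data or Voronoi-tessellation attributes.

        # Cross-validated MAE for a tree-ensemble formation-energy surrogate (placeholder data).
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        X = rng.normal(size=(2000, 50))                                        # attribute vectors
        y = 0.1 * X[:, :5].sum(axis=1) + rng.normal(scale=0.05, size=2000)     # eV/atom stand-in

        mae = -cross_val_score(RandomForestRegressor(n_estimators=100, random_state=0),
                               X, y, cv=10, scoring="neg_mean_absolute_error").mean()
        print(round(mae, 3), "eV/atom (10-fold cross-validated MAE)")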

  4. Fairness in optimizing bus-crew scheduling process.

    PubMed

    Ma, Jihui; Song, Cuiying; Ceder, Avishai Avi; Liu, Tao; Guan, Wei

    2017-01-01

    This work proposes a model that considers fairness in the problem of crew scheduling for bus drivers (CSP-BD) and a hybrid ant-colony optimization (HACO) algorithm to solve it. The main contributions of this work are the following: (a) a valid approach for cases with a special cost structure and constraints considering the fairness of working time and idle time; (b) an improved algorithm incorporating a Gamma heuristic function and selection rules. The relationships of each cost are examined with ten bus lines collected from the Beijing Public Transport Holdings (Group) Co., Ltd., one of the largest bus transit companies in the world. The results show that the unfair cost is indirectly related to the common, fixed and extra costs, and that the unfair cost approaches the common and fixed costs when its coefficient is twice the common cost coefficient. Furthermore, the longest computation time for the tested bus line, with 1108 pieces and 74 blocks, is less than 30 minutes. The results indicate that the HACO-based algorithm can be a feasible and efficient optimization technique for the CSP-BD, especially for large-scale problems.

  5. Cost-effectiveness simulation and analysis of colorectal cancer screening in Hong Kong Chinese population: comparison amongst colonoscopy, guaiac and immunologic fecal occult blood testing.

    PubMed

    Wong, Carlos K H; Lam, Cindy L K; Wan, Y F; Fong, Daniel Y T

    2015-10-15

    The aim of this study was to evaluate the cost-effectiveness of CRC screening strategies from the healthcare service provider perspective based on the Chinese population. A Markov model was constructed to compare the cost-effectiveness of recommended screening strategies including annual/biennial guaiac fecal occult blood testing (G-FOBT), annual/biennial immunologic FOBT (I-FOBT), and colonoscopy every 10 years in Chinese individuals aged 50 years over a 25-year period. External validity of the model was tested against data retrieved from published randomized controlled trials of G-FOBT. Resource use data collected from Chinese subjects across stages of colorectal neoplasm were combined with published unit cost data ($USD in 2009 price values) to estimate a stage-specific cost per patient. Quality-adjusted life-years (QALYs) were quantified based on the stage duration and SF-6D preference-based value of each stage. The cost-effectiveness outcome was the incremental cost-effectiveness ratio (ICER) represented by costs per life-year (LY) and costs per QALY gained. In the base-case scenario, the non-dominated strategies were annual and biennial I-FOBT. Compared with no screening, the ICER was $20,542/LY gained and $3,155/QALY gained for annual I-FOBT, and $19,838/LY gained and $2,976/QALY gained for biennial I-FOBT. The optimal screening strategy was annual I-FOBT, which attained the highest ICER at the threshold of $50,000 per LY or QALY gained. The Markov model informed the health policymakers that I-FOBT every year may be the most effective and cost-effective CRC screening strategy among recommended screening strategies, depending on the willingness-to-pay of mass screening for the Chinese population. ClinicalTrials.gov Identifier NCT02038283.

  6. CoMET: Cost and Mass Evaluation Tool for Spacecraft and Mission Design

    NASA Technical Reports Server (NTRS)

    Bieber, Ben S.

    2005-01-01

    New technology in space exploration is often developed without a complete knowledge of its impact. While the immediate benefits of a new technology are obvious, it is harder to understand its indirect consequences, which ripple through the entire system. CoMET is a technology evaluation tool designed to illuminate how specific technology choices affect a mission at each system level. CoMET uses simplified models for mass, power, and cost to analyze performance parameters of technologies of interest. The sensitivity analysis that CoMET provides shows whether or not developing a certain technology will greatly benefit the project. CoMET is an ongoing project approaching a web-based implementation phase. This year, development focused on the models for planetary daughter craft, such as atmospheric probes, blimps and balloons, and landers. These models are developed through research into historical data, well-established rules of thumb, and the engineering judgment of experts at JPL. The model is validated by corroboration with JPL advanced mission studies. Other enhancements to CoMET include adding launch vehicle analysis and integrating an updated cost model. When completed, CoMET will allow technological development to be focused on areas that will most drastically improve spacecraft performance.

  7. A financing model to solve financial barriers for implementing green building projects.

    PubMed

    Lee, Sanghyo; Lee, Baekrae; Kim, Juhyung; Kim, Jaejun

    2013-01-01

    Along with the growing interest in greenhouse gas reduction, the effect of greenhouse gas energy reduction from implementing green buildings is gaining attention. The government of the Republic of Korea has set green growth as its paradigm for national development, and there is a growing interest in energy saving for green buildings. However, green buildings may face financial barriers because of high initial construction costs and uncertainties about future project value. Under these circumstances, governmental support to attract private funding is necessary to implement green building projects. The objective of this study is to suggest a financing model for facilitating green building projects with a governmental guarantee based on Certified Emission Reduction (CER). In this model, the government provides a guarantee for the increased costs of a green building project in return for CER. This study also presents the validation of the model as well as its feasibility for implementing green building projects. In addition, the suggested model assumed governmental guarantees for the increased cost, but private guarantees seem feasible as well because of the promising value of the guarantee from CER. To do this, certification of Clean Development Mechanisms (CDMs) for green buildings must be obtained.

  8. A Financing Model to Solve Financial Barriers for Implementing Green Building Projects

    PubMed Central

    Lee, Baekrae; Kim, Juhyung; Kim, Jaejun

    2013-01-01

    Along with the growing interest in greenhouse gas reduction, the effect of greenhouse gas energy reduction from implementing green buildings is gaining attention. The government of the Republic of Korea has set green growth as its paradigm for national development, and there is a growing interest in energy saving for green buildings. However, green buildings may face financial barriers because of high initial construction costs and uncertainties about future project value. Under these circumstances, governmental support to attract private funding is necessary to implement green building projects. The objective of this study is to suggest a financing model for facilitating green building projects with a governmental guarantee based on Certified Emission Reduction (CER). In this model, the government provides a guarantee for the increased costs of a green building project in return for CER. This study also presents the validation of the model as well as its feasibility for implementing green building projects. In addition, the suggested model assumed governmental guarantees for the increased cost, but private guarantees seem feasible as well because of the promising value of the guarantee from CER. To do this, certification of Clean Development Mechanisms (CDMs) for green buildings must be obtained. PMID:24376379

  9. Validity and reliability of a low-cost digital dynamometer for measuring isometric strength of lower limb.

    PubMed

    Romero-Franco, Natalia; Jiménez-Reyes, Pedro; Montaño-Munuera, Juan A

    2017-11-01

    Lower limb isometric strength is a key parameter to monitor the training process or recognise muscle weakness and injury risk. However, valid and reliable methods to evaluate it often require high-cost tools. The aim of this study was to analyse the concurrent validity and reliability of a low-cost digital dynamometer for measuring isometric strength in lower limb. Eleven physically active and healthy participants performed maximal isometric strength for: flexion and extension of ankle, flexion and extension of knee, flexion, extension, adduction, abduction, internal and external rotation of hip. Data obtained by the digital dynamometer were compared with the isokinetic dynamometer to examine its concurrent validity. Data obtained by the digital dynamometer from 2 different evaluators and 2 different sessions were compared to examine its inter-rater and intra-rater reliability. Intra-class correlation (ICC) for validity was excellent in every movement (ICC > 0.9). Intra and inter-tester reliability was excellent for all the movements assessed (ICC > 0.75). The low-cost digital dynamometer demonstrated strong concurrent validity and excellent intra and inter-tester reliability for assessing isometric strength in the main lower limb movements.

  10. Modeling spatial segregation and travel cost influences on utilitarian walking: Towards policy intervention

    PubMed Central

    Yang, Yong; Auchincloss, Amy H.; Rodriguez, Daniel A.; Brown, Daniel G.; Riolo, Rick; Diez-Roux, Ana V.

    2015-01-01

    We develop an agent-based model of utilitarian walking and use the model to explore spatial and socioeconomic factors affecting adult utilitarian walking and how travel costs as well as various educational interventions aimed at changing attitudes can alter the prevalence of walking and income differentials in walking. The model is validated against US national data. We contrast realistic and extreme parameter values in our model and test effects of changing these parameters across various segregation and pricing scenarios while allowing for interactions between travel choice and place and for behavioral feedbacks. Results suggest that, in addition to income differences in the perceived cost of time, the concentration of mixed land use (differential density of residences and businesses) is an important determinant of income differences in walking (high-income groups walk less), whereas safety from crime and income segregation on their own do not have large influences on income differences in walking. We also show the difficulty in altering walking behaviors for higher income groups who are insensitive to price, and how adding to the cost of driving could increase the income differential in walking, particularly in the context of segregation by income and land use. We show that strategies to decrease positive attitudes towards driving can interact synergistically with shifting cost structures to favor walking in increasing the percent of walking trips. Agent-based models, with their ability to capture dynamic processes and incorporate empirical data, are powerful tools for exploring the influence of multiple factors on health behavior and for testing policy interventions. PMID:25733776
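
    A toy agent-based sketch of the mechanism described above: agents trade off the time cost of walking against the monetary and time cost of driving, and the walking share is tallied by income group as the cost of driving rises. All parameter values (value of time, trip lengths, speed ratio, attitude multiplier) are illustrative assumptions, not the calibrated model from the study.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 10_000

    # Hypothetical agent attributes (all values illustrative).
    income = rng.choice(["low", "high"], size=N, p=[0.5, 0.5])
    value_of_time = np.where(income == "high", 0.50, 0.20)   # $/min, higher for high income
    distance_min = rng.uniform(5, 40, size=N)                # walking time to destination
    drive_attitude = rng.normal(1.0, 0.3, size=N)            # preference multiplier for driving

    def walking_share(fuel_and_parking_cost: float) -> dict:
        walk_cost = value_of_time * distance_min
        drive_cost = (value_of_time * distance_min / 4        # driving assumed ~4x faster
                      + fuel_and_parking_cost) * drive_attitude
        walks = walk_cost < drive_cost
        return {g: walks[income == g].mean() for g in ("low", "high")}

    for price in (2.0, 6.0, 10.0):                             # rising cost of driving
        print(price, walking_share(price))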

  11. Predicting Aspergillus fumigatus exposure from composting facilities using a dispersion model: A conditional calibration and validation.

    PubMed

    Douglas, Philippa; Tyrrel, Sean F; Kinnersley, Robert P; Whelan, Michael; Longhurst, Philip J; Hansell, Anna L; Walsh, Kerry; Pollard, Simon J T; Drew, Gillian H

    2017-01-01

    Bioaerosols are released in elevated quantities from composting facilities and are associated with negative health effects, although dose-response relationships are unclear. Exposure levels are difficult to quantify as established sampling methods are costly and time-consuming, and current data provide limited temporal and spatial information. Confidence in dispersion model outputs in this context would be advantageous to provide a more detailed exposure assessment. We present the calibration and validation of a recognised atmospheric dispersion model (ADMS) for bioaerosol exposure assessments. The model was calibrated by a trial and error optimisation of observed Aspergillus fumigatus concentrations at different locations around a composting site. Validation was performed using a second dataset of measured concentrations for a different site. The best fit between modelled and measured data was achieved when emissions were represented as a single area source, with a temperature of 29°C. Predicted bioaerosol concentrations were within an order of magnitude of measured values (1000-10,000 CFU/m³) at the validation site, once minor adjustments were made to reflect local differences between the sites (r² > 0.7 at 150, 300, 500 and 600 m downwind of source). Results suggest that calibrated dispersion modelling can be applied to make reasonable predictions of bioaerosol exposures at multiple sites and may be used to inform site regulation and operational management. Copyright © 2016 The Authors. Published by Elsevier GmbH. All rights reserved.

  12. Impact of predictive model-directed end-of-life counseling for Medicare beneficiaries.

    PubMed

    Hamlet, Karen S; Hobgood, Adam; Hamar, Guy Brent; Dobbs, Angela C; Rula, Elizabeth Y; Pope, James E

    2010-05-01

    To validate a predictive model for identifying Medicare beneficiaries who need end-of-life care planning and to determine the impact on cost and hospice care of a telephonic counseling program utilizing this predictive model in 2 Medicare Health Support (MHS) pilots. Secondary analysis of data from 2 MHS pilot programs that used a randomized controlled design. A predictive model was developed using intervention group data (N = 43,497) to identify individuals at greatest risk of death. Model output guided delivery of a telephonic intervention designed to support educated end-of-life decisions and improve end-of-life provisions. Control group participants received usual care. As a primary outcome, Medicare costs in the last 6 months of life were compared between intervention group decedents (n = 3112) and control group decedents (n = 1630). Hospice admission rates and duration of hospice care were compared as secondary measures. The predictive model was highly accurate, and more than 80% of intervention group decedents were contacted during the 12 months before death. Average Medicare costs were $1913 lower for intervention group decedents compared with control group decedents in the last 6 months of life (P = .05), for a total savings of $5.95 million. There were no significant changes in hospice admissions or mean duration of hospice care. Telephonic end-of-life counseling provided as an ancillary Medicare service, guided by a predictive model, can reach a majority of individuals needing support and can reduce costs by facilitating voluntary election of less intensive care.

  13. Assessing health and economic outcomes of interventions to reduce pregnancy-related mortality in Nigeria.

    PubMed

    Erim, Daniel O; Resch, Stephen C; Goldie, Sue J

    2012-09-14

    Women in Nigeria face some of the highest maternal mortality risks in the world. We explore the benefits and cost-effectiveness of individual and integrated packages of interventions to prevent pregnancy-related deaths. We adapt a previously validated maternal mortality model to Nigeria. Model outcomes included clinical events, population measures, costs, and cost-effectiveness ratios. Separate models were adapted to Southwest and Northeast zones using survey-based data. Strategies consisted of improving coverage of effective interventions, and could include improved logistics. Increasing family planning was the most effective individual intervention to reduce pregnancy-related mortality, was cost saving in the Southwest zone and cost-effective elsewhere, and prevented nearly 1 in 5 abortion-related deaths. However, with a singular focus on family planning and safe abortion, mortality reduction would plateau below MDG 5. Strategies that could prevent 4 out of 5 maternal deaths included an integrated and stepwise approach that includes increased skilled deliveries, facility births, access to antenatal/postpartum care, improved recognition of referral need, transport, and availability and quality of EmOC in addition to family planning and safe abortion. The economic benefits of these strategies ranged from being cost-saving to having incremental cost-effectiveness ratios less than $500 per YLS, well below Nigeria's per capita GDP. Early intensive efforts to improve family planning and control of fertility choices, accompanied by a stepwise effort to scale-up capacity for integrated maternal health services over several years, will save lives and provide equal or greater value than many public health interventions we consider among the most cost-effective (e.g., childhood immunization).
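
    The incremental cost-effectiveness ratios quoted above come from comparing strategies incrementally, each against the next-cheapest non-dominated option. A minimal worked sketch of that arithmetic follows; the cost and years-of-life-saved figures are hypothetical, not the study's outputs.

    # Illustrative incremental cost-effectiveness calculation (all figures hypothetical).
    strategies = {
        # name: (total cost per 1,000 pregnancies, years of life saved per 1,000)
        "status quo":                      (0,       0),
        "family planning + safe abortion": (40_000,  350),
        "stepwise integrated package":     (190_000, 700),
    }

    ranked = sorted(strategies.items(), key=lambda kv: kv[1][0])   # order by cost
    prev_cost, prev_yls = ranked[0][1]
    for name, (cost, yls) in ranked[1:]:
        icer = (cost - prev_cost) / (yls - prev_yls)   # cost per year of life saved
        print(f"{name}: ICER = ${icer:,.0f} per YLS")
        prev_cost, prev_yls = cost, yls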

  14. NREL Begins On-Site Validation of Drivetrain Gearbox and Bearings | News |

    Science.gov Websites

    Drivetrain failure often leads to higher-than-expected operations and maintenance costs for the wind industry. NREL researchers are validating drivetrain gearbox and bearing performance on site; the validation is expected to last through the spring.

  15. Cost-effectiveness of routine varicella vaccination using the measles, mumps, rubella and varicella vaccine in France: an economic analysis based on a dynamic transmission model for varicella and herpes zoster.

    PubMed

    Littlewood, Kavi J; Ouwens, Mario J N M; Sauboin, Christophe; Tehard, Bertrand; Alain, Sophie; Denis, François

    2015-04-01

    Each year in France, varicella and zoster affect large numbers of children and adults, resulting in medical visits, hospitalizations for varicella- and zoster-related complications, and societal costs. Disease prevention by varicella vaccination is feasible, wherein a plausible option involves replacing the combined measles, mumps, and rubella (MMR) vaccine with the combined MMR and varicella (MMRV) vaccine. This study aimed to: (1) assess the cost-effectiveness of adding routine varicella vaccination through MMRV, using different vaccination strategies in France; and (2) address key uncertainties, such as the economic consequences of breakthrough varicella cases, the waning of vaccine-conferred protection, vaccination coverage, and indirect costs. Based on the outputs of a dynamic transmission model that used data on epidemiology and costs from France, a cost-effectiveness model was built. A conservative approach was taken regarding the impact of varicella vaccination on zoster incidence by assuming the validity of the hypothesis of an age-specific boosting of immunity against varicella. The model determined that routine MMRV vaccination is expected to be a cost-effective option, considering a cost-effectiveness threshold of €20,000 per quality-adjusted life-year saved; routine vaccination was cost-saving from the societal perspective. Results were driven by a large decrease in varicella incidence despite a temporary initial increase in the number of zoster cases due to the assumption of exogenous boosting. In the scenario analyses, despite moderate changes in assumptions about incidence and costs, varicella vaccination remained a cost-effective option for France. Routine vaccination with MMRV was associated with high gains in quality-adjusted life-years, substantial reduction in the occurrences of varicella- and zoster-related complications, and few deaths due to varicella. Routine MMRV vaccination is also expected to provide reductions in costs related to hospitalizations, medication use, and general-practitioner visits, as well as indirect costs, and it is expected to be a cost-effective intervention in France (GSK study identifier: HO-12-6924). Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  16. Cost-effectiveness of an HPV self-collection campaign in Uganda: comparing models for delivery of cervical cancer screening in a low-income setting

    PubMed Central

    Campos, Nicole G; Tsu, Vivien; Jeronimo, Jose; Njama-Meya, Denise; Mvundura, Mercy; Kim, Jane J

    2017-01-01

    With the availability of a low-cost HPV DNA test that can be administered by either a healthcare provider or a woman herself, programme planners require information on the costs and cost-effectiveness of implementing cervical cancer screening programmes in low-resource settings under different models of healthcare delivery. Using data from the START-UP demonstration project and a micro-costing approach, we estimated the health and economic impact of a once-in-a-lifetime HPV self-collection campaign relative to clinic-based provider-collection of HPV specimens in Uganda. We used an individual-based Monte Carlo simulation model of the natural history of HPV and cervical cancer to estimate lifetime health and economic outcomes associated with screening with HPV DNA testing once in a lifetime (clinic-based provider-collection vs a self-collection campaign). Test performance and cost data were obtained from the START-UP demonstration project using a micro-costing approach. Model outcomes included lifetime risk of cervical cancer, total lifetime costs (in 2011 international dollars [I$]), and life expectancy. Cost-effectiveness ratios were expressed using incremental cost-effectiveness ratios (ICERs). When both strategies achieved 75% population coverage, ICERs were below Uganda’s per capita GDP (self-collection: I$80 per year of life saved [YLS]; provider-collection: I$120 per YLS). When the self-collection campaign achieved coverage gains of 15–20%, it was more effective than provider-collection, and had a lower ICER unless coverage with both strategies was 50% or less. Findings were sensitive to cryotherapy compliance among screen-positive women and relative HPV test performance. The primary limitation of this analysis is that self-collection costs are based on a hypothetical campaign, albeit one using unit costs from Uganda. Once-in-a-lifetime screening with HPV self-collection may be very cost-effective and reduce cervical cancer risk by > 20% if coverage is high. Demonstration projects will be needed to confirm the validity of our logistical, costing and compliance assumptions. PMID:28369405

  17. Cost-effectiveness of an HPV self-collection campaign in Uganda: comparing models for delivery of cervical cancer screening in a low-income setting.

    PubMed

    Campos, Nicole G; Tsu, Vivien; Jeronimo, Jose; Njama-Meya, Denise; Mvundura, Mercy; Kim, Jane J

    2017-09-01

    With the availability of a low-cost HPV DNA test that can be administered by either a healthcare provider or a woman herself, programme planners require information on the costs and cost-effectiveness of implementing cervical cancer screening programmes in low-resource settings under different models of healthcare delivery. Using data from the START-UP demonstration project and a micro-costing approach, we estimated the health and economic impact of a once-in-a-lifetime HPV self-collection campaign relative to clinic-based provider-collection of HPV specimens in Uganda. We used an individual-based Monte Carlo simulation model of the natural history of HPV and cervical cancer to estimate lifetime health and economic outcomes associated with screening with HPV DNA testing once in a lifetime (clinic-based provider-collection vs a self-collection campaign). Test performance and cost data were obtained from the START-UP demonstration project using a micro-costing approach. Model outcomes included lifetime risk of cervical cancer, total lifetime costs (in 2011 international dollars [I$]), and life expectancy. Cost-effectiveness ratios were expressed using incremental cost-effectiveness ratios (ICERs). When both strategies achieved 75% population coverage, ICERs were below Uganda's per capita GDP (self-collection: I$80 per year of life saved [YLS]; provider-collection: I$120 per YLS). When the self-collection campaign achieved coverage gains of 15-20%, it was more effective than provider-collection, and had a lower ICER unless coverage with both strategies was 50% or less. Findings were sensitive to cryotherapy compliance among screen-positive women and relative HPV test performance. The primary limitation of this analysis is that self-collection costs are based on a hypothetical campaign, albeit one using unit costs from Uganda. Once-in-a-lifetime screening with HPV self-collection may be very cost-effective and reduce cervical cancer risk by > 20% if coverage is high. Demonstration projects will be needed to confirm the validity of our logistical, costing and compliance assumptions. © The Author 2017. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.

  18. Proceedings of the Military Operations Research Society (MORS) Simulation Validation Workshop (SIMVAL II) Held at Alexandria, Virginia on 31 March-2 April 1992

    DTIC Science & Technology

    1992-04-02

    ... demanded by VV&A. The cost of configuration management is increased by ... The CM should maintain a knowledgeable staff that can support adequate ... what are the variables of interest, what aspect of the model drives the results ... Strong teams have certain things in common. They are made up ...

  19. Using plot experiments to test the validity of mass balance models employed to estimate soil redistribution rates from 137Cs and 210Pb(ex) measurements.

    PubMed

    Porto, Paolo; Walling, Des E

    2012-10-01

    Information on rates of soil loss from agricultural land is a key requirement for assessing both on-site soil degradation and potential off-site sediment problems. Many models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution as a function of the local topography, hydrometeorology, soil type and land management, but empirical data remain essential for validating and calibrating such models and prediction procedures. Direct measurements using erosion plots are, however, costly and the results obtained relate to a small enclosed area, which may not be representative of the wider landscape. In recent years, the use of fallout radionuclides and more particularly caesium-137 ((137)Cs) and excess lead-210 ((210)Pb(ex)) has been shown to provide a very effective means of documenting rates of soil loss and soil and sediment redistribution in the landscape. Several of the assumptions associated with the theoretical conversion models used with such measurements remain essentially unvalidated. This contribution describes the results of a measurement programme involving five experimental plots located in southern Italy, aimed at validating several of the basic assumptions commonly associated with the use of mass balance models for estimating rates of soil redistribution on cultivated land from (137)Cs and (210)Pb(ex) measurements. Overall, the results confirm the general validity of these assumptions and the importance of taking account of the fate of fresh fallout. However, further work is required to validate the conversion models employed in using fallout radionuclide measurements to document soil redistribution in the landscape and this could usefully direct attention to different environments and to the validation of the final estimates of soil redistribution rate as well as the assumptions of the models employed. Copyright © 2012 Elsevier Ltd. All rights reserved.
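
    For orientation, the simplest member of the conversion-model family referred to above is the proportional model, which converts a percentage 137Cs inventory deficit on cultivated land into a mean annual soil loss. The sketch below implements only that simple form, not the full mass balance models validated in the study, and the bulk density, plough depth and time-since-fallout values are illustrative assumptions.

    def proportional_model(inv_sample: float, inv_reference: float,
                           bulk_density: float = 1300.0,   # kg m^-3 (assumed)
                           plough_depth: float = 0.25,     # m (assumed)
                           years_since_onset: float = 50.0) -> float:
        """Mean annual soil loss (t ha^-1 yr^-1) from a 137Cs inventory deficit.

        X is the percentage reduction of the measured inventory (Bq m^-2)
        relative to the local undisturbed reference inventory.
        """
        x_percent = 100.0 * (inv_reference - inv_sample) / inv_reference
        return 10.0 * bulk_density * plough_depth * x_percent / (100.0 * years_since_onset)

    # A plot with a 30% inventory loss relative to a 2,500 Bq m^-2 reference site.
    print(f"{proportional_model(1750.0, 2500.0):.1f} t ha^-1 yr^-1")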

  20. Cost-Effectiveness of Different Strategies to Monitor Adults on Antiretroviral Treatment: A Combined Analysis of Three Mathematical Models

    PubMed Central

    Keebler, Daniel; Revill, Paul; Braithwaite, Scott; Phillips, Andrew; Blaser, Nello; Borquez, Annick; Cambiano, Valentina; Ciaranello, Andrea; Estill, Janne; Gray, Richard; Hill, Andrew; Keiser, Olivia; Kessler, Jason; Menzies, Nicolas A; Nucifora, Kimberly A; Vizcaya, Luisa Salazar; Walker, Simon; Welte, Alex; Easterbrook, Philippa; Doherty, Meg; Hirnschall, Gottfried; Hallett, Timothy B

    2015-01-01

    Background The WHO’s 2013 revisions to its Consolidated Guidelines on ARVs will recommend routine viral load monitoring (VLM), rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings require guidance on the most cost-effective use of resources given other competing priorities, including expansion of ART coverage. Here we assess the cost-effectiveness of alternative patient monitoring strategies. Methods A range of monitoring strategies was evaluated, including clinical, CD4 and viral load monitoring alone and together at different frequencies and with different criteria for switching to second-line therapies. Three independently-constructed and validated models were analysed simultaneously. Costs were estimated based on resource use projected in the models and associated unit costs; impact was quantified as disability-adjusted life years (DALYs) averted. Alternatives were compared using incremental cost-effectiveness analysis. Results All models show that clinical monitoring delivers significant benefit compared to a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than VLM, which is currently more expensive. VLM without CD4 every six to 12 months provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing ART coverage or expanding ART eligibility. Interpretation The priority for HIV programmes should be to expand ART coverage, firstly at CD4 <350 cells and then at CD4 <500, using lower-cost clinical or CD4 monitoring. At current costs, VLM should be considered only after high ART coverage has been achieved. Point-of-care technologies and other factors reducing costs may make VLM more affordable in future. Funding The HIV Modelling Consortium is funded by the Bill and Melinda Gates Foundation. Funding for this work was also provided by the World Health Organization. PMID:25104633

  1. Forecast of dengue incidence using temperature and rainfall.

    PubMed

    Hii, Yien Ling; Zhu, Huaiping; Ng, Nawi; Ng, Lee Ching; Rocklöv, Joacim

    2012-01-01

    An accurate early warning system to predict impending epidemics enhances the effectiveness of preventive measures against dengue fever. The aim of this study was to develop and validate a forecasting model that could predict dengue cases and provide timely early warning in Singapore. We developed a time series Poisson multivariate regression model using weekly mean temperature and cumulative rainfall over the period 2000-2010. Weather data were modeled using piecewise linear spline functions. We analyzed various lag times between dengue and weather variables to identify the optimal dengue forecasting period. Autoregression, seasonality and trend were considered in the model. We validated the model by forecasting dengue cases for week 1 of 2011 up to week 16 of 2012 using weather data alone. Model selection and validation were based on Akaike's Information Criterion, standardized Root Mean Square Error, and residual diagnostics. A Receiver Operating Characteristics curve was used to analyze the sensitivity of the forecast of epidemics. The optimal period for dengue forecast was 16 weeks. Our model forecasted correctly with errors of 0.3 and 0.32 of the standard deviation of reported cases during the model training and validation periods, respectively. It was sensitive enough to distinguish between outbreak and non-outbreak periods, with a sensitivity of 96% (CI = 93-98%) in 2004-2010 and 98% (CI = 95-100%) in 2011. The model predicted the outbreak in 2011 accurately with less than 3% possibility of false alarm. We have developed a weather-based dengue forecasting model that allows warning 16 weeks in advance of dengue epidemics with high sensitivity and specificity. We demonstrate that models using temperature and rainfall could be simple, precise, and low-cost tools for dengue forecasting, which could be used to enhance decision making on the timing and scale of vector control operations and the utilization of limited resources.
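
    A minimal sketch of the core of such a model, a Poisson regression of weekly case counts on weather variables lagged by the forecasting horizon, is shown below on synthetic data. The published model additionally uses piecewise linear splines, autoregression, seasonality and trend, which are omitted here; all simulated values and coefficients are assumptions.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    weeks = 520
    df = pd.DataFrame({
        "mean_temp": 27 + 2 * np.sin(np.arange(weeks) * 2 * np.pi / 52) + rng.normal(0, 0.5, weeks),
        "rainfall": rng.gamma(2.0, 20.0, weeks),
    })
    # Simulated weekly dengue counts driven by temperature 16 weeks earlier (assumption).
    df["cases"] = rng.poisson(np.exp(1.5 + 0.08 * df["mean_temp"].shift(16).fillna(27)).to_numpy())

    # Lag the weather predictors by the forecasting horizon (16 weeks here).
    X = pd.DataFrame({
        "temp_lag16": df["mean_temp"].shift(16),
        "rain_lag16": df["rainfall"].shift(16),
    }).dropna()
    y = df.loc[X.index, "cases"]

    model = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
    print(model.summary())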

  2. Prediction of coronary artery disease in patients undergoing operations for mitral valve degeneration

    NASA Technical Reports Server (NTRS)

    Lin, S. S.; Lauer, M. S.; Asher, C. R.; Cosgrove, D. M.; Blackstone, E.; Thomas, J. D.; Garcia, M. J.

    2001-01-01

    OBJECTIVES: We sought to develop and validate a model that estimates the risk of obstructive coronary artery disease in patients undergoing operations for mitral valve degeneration and to demonstrate its potential clinical utility. METHODS: A total of 722 patients (67% men; age, 61 +/- 12 years) without a history of myocardial infarction, ischemic electrocardiographic changes, or angina who underwent routine coronary angiography before mitral valve prolapse operations between 1989 and 1996 were analyzed. A bootstrap-validated logistic regression model on the basis of clinical risk factors was developed to identify low-risk (< or =5%) patients. Obstructive coronary atherosclerosis was defined as 50% or more luminal narrowing in one or more major epicardial vessels, as determined by means of coronary angiography. RESULTS: One hundred thirty-nine (19%) patients had obstructive coronary atherosclerosis. Independent predictors of coronary artery disease include age, male sex, hypertension, diabetes mellitus, and hyperlipidemia. Two hundred twenty patients were designated as low risk according to the logistic model. Of these patients, only 3 (1.3%) had single-vessel disease, and none had multivessel disease. The model showed good discrimination, with an area under the receiver-operating characteristic curve of 0.84. Cost analysis indicated that application of this model could safely eliminate 30% of coronary angiograms, corresponding to cost savings of $430,000 per 1000 patients without missing any case of high-risk coronary artery disease. CONCLUSION: A model with standard clinical predictors can reliably estimate the prevalence of obstructive coronary atherosclerosis in patients undergoing mitral valve prolapse operations. This model can identify low-risk patients in whom routine preoperative angiography may be safely avoided.
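
    The modelling pattern described, a logistic risk model with bootstrap (optimism-corrected) validation of the area under the ROC curve, can be sketched as follows on synthetic data; the predictors echo those reported, but every coefficient and prevalence value is an illustrative assumption.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)
    n = 722

    # Hypothetical predictors mirroring those reported: age, male sex,
    # hypertension, diabetes, hyperlipidemia.
    X = np.column_stack([
        rng.normal(61, 12, n),            # age
        rng.integers(0, 2, (n, 4)),       # binary risk factors
    ])
    logits = -8 + 0.08 * X[:, 0] + 0.6 * X[:, 1] + 0.5 * X[:, 2] + 0.7 * X[:, 3] + 0.6 * X[:, 4]
    y = rng.binomial(1, 1 / (1 + np.exp(-logits)))

    model = LogisticRegression(max_iter=1000).fit(X, y)
    apparent_auc = roc_auc_score(y, model.predict_proba(X)[:, 1])

    # Bootstrap optimism correction: refit on each resample, compare the AUC on
    # the resample with the AUC on the original data, and subtract the mean gap.
    optimism = []
    for _ in range(200):
        idx = rng.integers(0, n, n)
        m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        optimism.append(roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
                        - roc_auc_score(y, m.predict_proba(X)[:, 1]))
    print(f"apparent AUC {apparent_auc:.2f}, "
          f"optimism-corrected {apparent_auc - np.mean(optimism):.2f}")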

  3. Modelling obesity trends in Australia: unravelling the past and predicting the future.

    PubMed

    Hayes, A J; Lung, T W C; Bauman, A; Howard, K

    2017-01-01

    Modelling is increasingly being used to predict the epidemiology of obesity progression and its consequences. The aims of this study were: (a) to present and validate a model for prediction of obesity among Australian adults and (b) to use the model to project the prevalence of obesity and severe obesity by 2025. Individual level simulation combined with survey estimation techniques to model changing population body mass index (BMI) distribution over time. The model input population was derived from a nationally representative survey in 1995, representing over 12 million adults. Simulations were run for 30 years. The model was validated retrospectively and then used to predict obesity and severe obesity by 2025 among different aged cohorts and at a whole population level. The changing BMI distribution over time was well predicted by the model and projected prevalence of weight status groups agreed with population level data in 2008, 2012 and 2014. The model predicts more growth in obesity among younger than older adult cohorts. Projections at a whole population level were that healthy weight will decline, overweight will remain steady, but obesity and severe obesity prevalence will continue to increase beyond 2016. Adult obesity prevalence was projected to increase from 19% in 1995 to 35% by 2025. Severe obesity (BMI>35), which was only around 5% in 1995, was projected to be 13% by 2025, two to three times the 1995 levels. The projected rise in obesity and severe obesity will have more substantial cost and healthcare system implications than in previous decades. Having a robust epidemiological model is key to predicting these long-term costs and health outcomes into the future.
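
    An individual-level simulation of a drifting BMI distribution of the kind described can be sketched in a few lines; the baseline distribution, cohort-specific annual drifts and noise below are illustrative assumptions, not the model's calibrated inputs.

    import numpy as np

    rng = np.random.default_rng(4)
    n = 100_000

    # Hypothetical 1995 baseline adult BMI distribution and ages.
    bmi = rng.normal(25.8, 4.5, n).clip(16, 60)
    age = rng.uniform(18, 75, n)

    annual_drift = np.where(age < 45, 0.18, 0.08)   # faster gain in younger cohorts (assumption)

    for year in range(1995, 2026):
        if year in (1995, 2008, 2016, 2025):
            obese = (bmi >= 30).mean()
            severe = (bmi >= 35).mean()
            print(f"{year}: obesity {obese:.1%}, severe obesity {severe:.1%}")
        bmi = (bmi + annual_drift + rng.normal(0, 0.35, n)).clip(16, 60)
        age += 1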

  4. HyPEP FY06 Report: Models and Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DOE report

    2006-09-01

    The Department of Energy envisions the next generation very high-temperature gas-cooled reactor (VHTR) as a single-purpose or dual-purpose facility that produces hydrogen and electricity. The Ministry of Science and Technology (MOST) of the Republic of Korea also selected VHTR for the Nuclear Hydrogen Development and Demonstration (NHDD) Project. This research project aims at developing a user-friendly program for evaluating and optimizing cycle efficiencies of producing hydrogen and electricity in a Very-High-Temperature Reactor (VHTR). Systems for producing electricity and hydrogen are complex and the calculations associated with optimizing these systems are intensive, involving a large number of operating parameter variations and many different system configurations. This research project will produce the HyPEP computer model, which is specifically designed to be an easy-to-use and fast running tool for evaluating nuclear hydrogen and electricity production facilities. The model accommodates flexible system layouts and its cost models will enable HyPEP to be well-suited for system optimization. Specific activities of this research are designed to develop the HyPEP model into a working tool, including (a) identifying major systems and components for modeling, (b) establishing system operating parameters and calculation scope, (c) establishing the overall calculation scheme, (d) developing component models, (e) developing cost and optimization models, and (f) verifying and validating the program. Once the HyPEP model is fully developed and validated, it will be used to execute calculations on candidate system configurations. The FY-06 report includes a description of reference designs, methods used in this study, and models and computational strategies developed for the first year effort. Results from computer codes such as HYSYS and GASS/PASS-H, used by Idaho National Laboratory and Argonne National Laboratory respectively, will be benchmarked with HyPEP results in the following years.

  5. Parametric study of different contributors to tumor thermal profile

    NASA Astrophysics Data System (ADS)

    Tepper, Michal; Gannot, Israel

    2014-03-01

    Treating cancer is one of the major challenges of modern medicine. There is great interest in assessing tumor development in in vivo animal and human models, as well as in in vitro experiments. Existing methods are either limited by cost and availability or by their low accuracy and reproducibility. Thermography holds the potential of being a noninvasive, low-cost, irradiation-free and easy-to-use method for tumor monitoring. Tumors can be detected in thermal images due to their relatively higher or lower temperature compared to the temperature of the healthy skin surrounding them. Extensive research is performed to show the validity of thermography as an efficient method for tumor detection and the possibility of extracting tumor properties from thermal images, showing promising results. However, extrapolating from one type of experiment to others is difficult due to the differences in tumor properties, especially between different types of tumors or different species. There is a need for research linking different types of tumor experiments. In this research, parametric analysis of possible contributors to tumor thermal profiles was performed. The effect of tumor geometric, physical and thermal properties was studied, both independently and together, in phantom model experiments and computer simulations. Theoretical and experimental results were cross-correlated to validate the models used and increase the accuracy of simulated complex tumor models. The contribution of different parameters in various tumor scenarios was estimated and the implication of these differences on the observed thermal profiles was studied. The correlation between animal and human models is discussed.

  6. Technology development status at McDonnell Douglas

    NASA Technical Reports Server (NTRS)

    Rowe, W. T.

    1981-01-01

    The significant technology items of the Concorde and the conceptual MCD baseline advanced supersonic transport are compared. The four major improvements are in the areas of range performance, structures (materials), aerodynamics, and community noise. Presentation charts show aerodynamic efficiency; the reoptimized wing; low scale lift/drag ratio; control systems; structural modeling and analysis; weight and cost comparisons for superplasticity diffusion bonded titanium sandwich structures and for aluminum brazed titanium honeycomb structures; operating cost reduction; suppressor nozzles; noise reduction and range; the bicone inlet; a market summary; environmental issues; high priority items; the titanium wing and fuselage test components; and technology validation.

  7. MEMS applications in space exploration

    NASA Astrophysics Data System (ADS)

    Tang, William C.

    1997-09-01

    Space exploration in the coming century will emphasize cost effectiveness and highly focused mission objectives, which will result in frequent multiple missions that broaden the scope of space science and validate new technologies on a timely basis. MEMS is one of the key enabling technologies for creating cost-effective, ultra-miniaturized, robust, and functionally focused spacecraft for both robotic and human exploration programs. Examples of MEMS devices at various stages of development include microgyroscope, microseismometer, microhygrometer, quadrupole mass spectrometer, and micropropulsion engine. These devices, when proven successful, will serve as models for developing components and systems for new-millennium spacecraft.

  8. Applications of MEMS for Space Exploration

    NASA Astrophysics Data System (ADS)

    Tang, William C.

    1998-03-01

    Space exploration in the coming century will emphasize cost effectiveness and highly focused mission objectives, which will result in frequent multiple missions that broaden the scope of space science and validate new technologies on a timely basis. Micro Electro Mechanical Systems (MEMS) is one of the key enabling technologies to create cost-effective, ultra-miniaturized, robust, and functionally focused spacecraft for both robotic and human exploration programs. Examples of MEMS devices at various stages of development include microgyroscope, microseismometer, microhygrometer, quadrupole mass spectrometer, and micropropulsion engine. These devices, when proven successful, will serve as models for developing components and systems for new-millennium spacecraft.

  9. Assembly line performance and modeling

    NASA Astrophysics Data System (ADS)

    Rane, Arun B.; Sunnapwar, Vivek K.

    2017-09-01

    The automobile sector forms the backbone of the manufacturing sector. The vehicle assembly line is an important section of an automobile plant, where repetitive tasks are performed one after another at different workstations. In this thesis, a methodology is proposed to reduce cycle time and time losses due to factors such as equipment failure, inventory shortage, absenteeism, set-up, material handling, rejection, and fatigue, in order to improve output within given cost constraints. Various relationships between these factors, the corresponding costs, and output are established by a scientific approach. This methodology is validated in three different vehicle assembly plants. The proposed methodology may help practitioners to optimize the assembly line using lean techniques.

  10. Two fuzzy possibilistic bi-objective zero-one programming models for outsourcing the equipment maintenance problem

    NASA Astrophysics Data System (ADS)

    Vahdani, Behnam; Jolai, Fariborz; Tavakkoli-Moghaddam, Reza; Meysam Mousavi, S.

    2012-07-01

    Maintenance outsourcing can be regarded as a strategic weapon to increase productivity and customer satisfaction in many companies, and this critical activity can be performed in a more efficient and effective way. This article presents two novel fuzzy possibilistic bi-objective zero-one programming (FPBOZOP) models for outsourcing equipment maintenance. In these models, cost parameters, including outsourcing cost, risk cost, operation times for performing the equipment maintenance, and reliability level, as well as other influential parameters, are considered throughout the outsourcing process. Moreover, unlike previous studies, the presented models can measure the company's capability to perform different activities, in order to assess the possibility of in-house maintenance, and can support making the best decision on the basis of the models' results. Both models are developed under uncertainty, giving top managers the possibility of assigning more than one piece of equipment or project to the supplier so that profit is maximized and cost is minimized by considering both objectives concurrently. Then, a new fuzzy mathematical programming-based possibilistic approach from the recent literature is introduced as a solution methodology to solve the proposed bi-objective zero-one programming (BOZOP) models and to reach a preferred compromise solution. Furthermore, a real case study is used to demonstrate and validate the effectiveness of the presented models. The computational results revealed that the models can be applied to a variety of problems in the domain of equipment maintenance and project outsourcing, from both theoretical and application perspectives.

  11. Development and Validation of a Practical Two-Step Prediction Model and Clinical Risk Score for Post-Thrombotic Syndrome.

    PubMed

    Amin, Elham E; van Kuijk, Sander M J; Joore, Manuela A; Prandoni, Paolo; Cate, Hugo Ten; Cate-Hoek, Arina J Ten

    2018-06-04

     Post-thrombotic syndrome (PTS) is a common chronic consequence of deep vein thrombosis that affects the quality of life and is associated with substantial costs. In clinical practice, it is not possible to predict the individual patient risk. We develop and validate a practical two-step prediction tool for PTS in the acute and sub-acute phase of deep vein thrombosis.  Multivariable regression modelling with data from two prospective cohorts in which 479 (derivation) and 1,107 (validation) consecutive patients with objectively confirmed deep vein thrombosis of the leg, from thrombosis outpatient clinic of Maastricht University Medical Centre, the Netherlands (derivation) and Padua University hospital in Italy (validation), were included. PTS was defined as a Villalta score of ≥ 5 at least 6 months after acute thrombosis.  Variables in the baseline model in the acute phase were: age, body mass index, sex, varicose veins, history of venous thrombosis, smoking status, provoked thrombosis and thrombus location. For the secondary model, the additional variable was residual vein obstruction. Optimism-corrected area under the receiver operating characteristic curves (AUCs) were 0.71 for the baseline model and 0.60 for the secondary model. Calibration plots showed well-calibrated predictions. External validation of the derived clinical risk scores was successful: AUC, 0.66 (95% confidence interval [CI], 0.63-0.70) and 0.64 (95% CI, 0.60-0.69).  Individual risk for PTS in the acute phase of deep vein thrombosis can be predicted based on readily accessible baseline clinical and demographic characteristics. The individual risk in the sub-acute phase can be predicted with limited additional clinical characteristics. Schattauer GmbH Stuttgart.

  12. Current status of validation for robotic surgery simulators - a systematic review.

    PubMed

    Abboudi, Hamid; Khan, Mohammed S; Aboumarzouk, Omar; Guru, Khurshid A; Challacombe, Ben; Dasgupta, Prokar; Ahmed, Kamran

    2013-02-01

    To analyse studies validating the effectiveness of robotic surgery simulators. The MEDLINE(®), EMBASE(®) and PsycINFO(®) databases were systematically searched until September 2011. References from retrieved articles were reviewed to broaden the search. The simulator name, training tasks, participant level, training duration and evaluation scoring were extracted from each study. We also extracted data on feasibility, validity, cost-effectiveness, reliability and educational impact. We identified 19 studies investigating simulation options in robotic surgery. There are five different robotic surgery simulation platforms available on the market. In all, 11 studies sought opinion and compared performance between two different groups: 'expert' and 'novice'. Experts ranged in experience from 21-2200 robotic cases. The novice groups consisted of participants with no prior experience on a robotic platform and were often medical students or junior doctors. The Mimic dV-Trainer(®), ProMIS(®), SimSurgery Educational Platform(®) (SEP) and Intuitive systems have shown face, content and construct validity. The Robotic Surgical Simulator(TM) system has only been face and content validated. All of the simulators except SEP have shown educational impact. Feasibility and cost-effectiveness of simulation systems were not evaluated in any trial. Virtual reality simulators were shown to be effective training tools for junior trainees. Simulation training holds the greatest potential to be used as an adjunct to traditional training methods to equip the next generation of robotic surgeons with the skills required to operate safely. However, current simulation models have only been validated in small studies. There is no evidence to suggest one type of simulator provides more effective training than any other. More research is needed to validate simulated environments further and investigate the effectiveness of animal and cadaveric training in robotic surgery. © 2012 BJU International.

  13. Thermal analysis of fused deposition modeling process using infrared thermography imaging and finite element modeling

    NASA Astrophysics Data System (ADS)

    Zhou, Xunfei; Hsieh, Sheng-Jen

    2017-05-01

    After years of development, Fused Deposition Modeling (FDM) has become the most popular technique in commercial 3D printing due to its cost effectiveness and easy-to-operate fabrication process. Mechanical strength and dimensional accuracy are two of the most important factors for reliability of FDM products. However, the solid-liquid-solid state changes of material in the FDM process make it difficult to monitor and model. In this paper, an experimental model was developed to apply cost-effective infrared thermography imaging method to acquire temperature history of filaments at the interface and their corresponding cooling mechanism. A three-dimensional finite element model was constructed to simulate the same process using element "birth and death" feature and validated with the thermal response from the experimental model. In 6 of 9 experimental conditions, a maximum of 13% difference existed between the experimental and numerical models. This work suggests that numerical modeling of FDM process is reliable and can facilitate better understanding of bead spreading and road-to-road bonding mechanics during fabrication.
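
    The "element birth and death" idea mentioned above can be illustrated with a much cruder one-dimensional explicit finite-difference sketch of a single road deposited segment by segment: each new segment appears at the nozzle temperature and then cools by conduction along the road and convection to the chamber air. This is not the authors' three-dimensional finite element model; the material properties, geometry, and heat transfer coefficient are nominal assumptions.

    import numpy as np

    k, rho, cp = 0.18, 1050.0, 1300.0          # W/mK, kg/m^3, J/kgK (nominal ABS values)
    alpha = k / (rho * cp)
    h = 60.0                                   # W/m^2K convective coefficient (assumed)
    bead_w = bead_h = 0.4e-3                   # m, bead cross-section (assumed)
    dx = 1.0e-3                                # m, segment length along the road
    dt = 0.002 * dx**2 / alpha                 # well below the explicit limit dx^2 / (2*alpha)
    T_amb, T_nozzle = 70.0, 230.0              # chamber and extrusion temperatures (C)

    n_segments, deposit_interval = 60, 0.02    # seconds between new segments (assumed)
    surface_per_vol = 2 * (bead_w + bead_h) / (bead_w * bead_h)

    T = np.full(0, T_nozzle)                   # no elements exist yet
    t, next_deposit = 0.0, 0.0
    while len(T) < n_segments or t < 2.0:
        if len(T) < n_segments and t >= next_deposit:
            T = np.append(T, T_nozzle)         # "birth" of a new element at the nozzle
            next_deposit += deposit_interval
        if len(T) > 1:
            lap = np.zeros_like(T)
            lap[1:-1] = T[2:] - 2 * T[1:-1] + T[:-2]
            lap[0] = T[1] - T[0]               # crude insulated ends
            lap[-1] = T[-2] - T[-1]
            T = T + alpha * dt / dx**2 * lap   # conduction along the road
        T = T - h * surface_per_vol / (rho * cp) * (T - T_amb) * dt   # convection to air
        t += dt
    print(f"temperature range along the road after {t:.1f}s: {T.min():.0f}-{T.max():.0f} C")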

  14. Prevention of recurrent rhinopharyngitis in at-risk children in France: a cost-effectiveness model for a nonspecific immunostimulating bacterial extract (OM-85 BV).

    PubMed

    Pessey, Jean-Jacques; Mégas, Françoise; Arnould, Benoît; Baron-Papillon, Florence

    2003-01-01

    To estimate the pharmacoeconomic impact for the French Social Security System of preventing recurrent acute rhinopharyngitis (RARP) in at-risk children with OM-85 BV, an immunostimulating agent indicated for the prevention of recurrences. A decision-analysis model. The probability of progression of the infection and of its associated care, the principal direct costs linked to them, and the effectiveness of OM-85 BV were established or calculated by reviewing the available literature (published between 1984 and 2000). Four experts validated the parameters and the model. For the French Social Security System, the mean direct cost for an acute rhinopharyngitis (ARP) infection was 49.39 Euro (2000 values). By using OM-85 BV prevention, 1.52 infections were prevented in 6 months, saving 67.83 Euro on the costs of care for the recurrently infected child. Sensitivity analyses confirmed the robustness of the model and indicated a saving of between 6.28 Euro and 303.64 Euro in direct costs for each individual treated preventively. Threshold analyses showed that OM-85 BV prophylaxis is economically profitable if more than 0.15 infections are prevented and if direct costs of care of an ARP are greater than 4.78 Euro. Non-specific immunotherapy should be considered for the child at risk of RARP and administered in addition to other recommended measures. The economic savings for the community of using a medication for which the clinical effectiveness has been demonstrated should also be taken into account in assessing its usefulness.
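
    The decision-analysis logic reduces to an expected-cost comparison of the two branches (with and without prophylaxis). A minimal sketch follows; the baseline recurrence rate and the drug acquisition cost are assumptions for illustration, not the study's inputs.

    # Hypothetical two-branch decision-analysis sketch: expected 6-month cost per child.
    cost_per_arp = 49.39                 # mean direct cost of one ARP episode (EUR, 2000 values)
    infections_no_prophylaxis = 5.0      # assumed baseline recurrences over 6 months
    infections_prevented = 1.52          # effectiveness reported for OM-85 BV
    drug_cost = 30.0                     # assumed acquisition cost of prophylaxis

    expected_cost_untreated = infections_no_prophylaxis * cost_per_arp
    expected_cost_treated = (infections_no_prophylaxis - infections_prevented) * cost_per_arp + drug_cost
    print(f"net saving per treated child: EUR {expected_cost_untreated - expected_cost_treated:.2f}")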

  15. Pediatric laryngeal simulator using 3D printed models: A novel technique.

    PubMed

    Kavanagh, Katherine R; Cote, Valerie; Tsui, Yvonne; Kudernatsch, Simon; Peterson, Donald R; Valdez, Tulio A

    2017-04-01

    Simulation to acquire and test technical skills is an essential component of medical education and residency training in both surgical and nonsurgical specialties. High-quality simulation education relies on the availability, accessibility, and reliability of models. The objective of this work was to describe a practical pediatric laryngeal model for use in otolaryngology residency training. Ideally, this model would be low-cost, have tactile properties resembling human tissue, and be reliably reproducible. Pediatric laryngeal models were developed using two manufacturing methods: direct three-dimensional (3D) printing of anatomical models and casted anatomical models using 3D-printed molds. Polylactic acid, acrylonitrile butadiene styrene, and high-impact polystyrene (HIPS) were used for the directly printed models, whereas a silicone elastomer (SE) was used for the casted models. The models were evaluated for anatomic quality, ease of manipulation, hardness, and cost of production. A tissue likeness scale was created to validate the simulation model. Fleiss' Kappa rating was performed to evaluate interrater agreement, and analysis of variance was performed to evaluate differences among the materials. The SE provided the most anatomically accurate models, with the tactile properties allowing for surgical manipulation of the larynx. Direct 3D printing was more cost-effective than the SE casting method but did not possess the material properties and tissue likeness necessary for surgical simulation. The SE models of the pediatric larynx created from a casting method demonstrated high quality anatomy, tactile properties comparable to human tissue, and easy manipulation with standard surgical instruments. Their use in a reliable, low-cost, accessible, modular simulation system provides a valuable training resource for otolaryngology residents. N/A. Laryngoscope, 127:E132-E137, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.

  16. An economic evaluation of pediatric small bowel transplantation in the United Kingdom.

    PubMed

    Longworth, Louise; Young, Tracey; Beath, Sue V; Kelly, Deirdre A; Mistry, Hema; Protheroe, Sue M; Ratcliffe, Julie; Buxton, Martin J

    2006-08-27

    Small bowel transplantation (SBTx) offers an alternative to parenteral nutrition (PN) for the treatment of chronic intestinal failure in children: this study estimated its cost-effectiveness in the early phase of a U.K. program. Children assessed for SBTx were categorized as: 1) requiring SBTx following PN-related complications (n=23), 2) stable at home not requiring SBTx (n=24), and 3) terminally ill and unsuitable for SBTx (n=6). Costs were estimated from detailed resource-use data. Two comparisons were used for effectiveness: actual survival following transplantation (n=14) compared to: 1) estimated survival without transplantation using a prognostic model, and 2) the waiting list experiences of all patients listed for SBTx (n=23). Mean costs up to 30 months were £207,000 for those transplanted or on the waiting list, £159,000 for those stable on home PN, and £56,000 for those terminally ill. The prognostic model estimated a mean survival gain from transplantation of 0.12 years over 30 months, and suggested that transplantation was cost-saving. The second approach suggested that transplantation reduced survival by 0.24 years at an additional cost of £131,000. Firm conclusions on cost-effectiveness of SBTx are not possible given the two different estimates. The prognostic model approach (suggesting that pediatric SBTx may provide a small survival benefit at a small reduction in costs) should be less subject to bias, but the model requires external validation. Meanwhile, children at risk of fatal PN-complications should be given the opportunity to receive a SBTx only within a continuing formal assessment of the technology.

  17. Medial compartment knee osteoarthritis: age-stratified cost-effectiveness of total knee arthroplasty, unicompartmental knee arthroplasty, and high tibial osteotomy.

    PubMed

    Smith, William B; Steinberg, Joni; Scholtes, Stefan; Mcnamara, Iain R

    2017-03-01

    To compare the age-based cost-effectiveness of total knee arthroplasty (TKA), unicompartmental knee arthroplasty (UKA), and high tibial osteotomy (HTO) for the treatment of medial compartment knee osteoarthritis (MCOA). A Markov model was used to simulate theoretical cohorts of patients 40, 50, 60, and 70 years of age undergoing primary TKA, UKA, or HTO. Costs and outcomes associated with initial and subsequent interventions were estimated by following these virtual cohorts over a 10-year period. Revision and mortality rates, costs, and functional outcome data were estimated from a systematic review of the literature. Probabilistic analysis was conducted to accommodate these parameters' inherent uncertainty, and both discrete and probabilistic sensitivity analyses were utilized to assess the robustness of the model's outputs to changes in key variables. HTO was most likely to be cost-effective in cohorts under 60, and UKA most likely in those 60 and over. Probabilistic results did not indicate one intervention to be significantly more cost-effective than another. The model was exquisitely sensitive to changes in utility (functional outcome), somewhat sensitive to changes in cost, and least sensitive to changes in 10-year revision risk. HTO may be the most cost-effective option when treating MCOA in younger patients, while UKA may be preferred in older patients. Functional utility is the primary driver of the cost-effectiveness of these interventions. For the clinician, this study supports HTO as a competitive treatment option in young patient populations. It also validates each one of the three interventions considered as potentially optimal, depending heavily on patient preferences and functional utility derived over time.
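
    A minimal Markov cohort sketch of the kind of model described, one intervention followed over a 10-year horizon with discounted costs and QALYs, is shown below; the states, transition probabilities, costs and utilities are illustrative assumptions rather than the study's calibrated inputs, and a full analysis would run one such model per intervention and compare them incrementally.

    import numpy as np

    states = ["post-primary", "revision year", "post-revision", "dead"]
    P = np.array([          # annual transition matrix, rows sum to 1 (assumed values)
        [0.965, 0.020, 0.000, 0.015],
        [0.000, 0.000, 0.980, 0.020],
        [0.000, 0.030, 0.955, 0.015],
        [0.000, 0.000, 0.000, 1.000],
    ])
    cost = np.array([300.0, 18000.0, 400.0, 0.0])      # annual cost per state (assumed)
    utility = np.array([0.80, 0.60, 0.72, 0.0])        # QALY weight per state (assumed)
    discount = 0.035

    dist = np.array([1.0, 0.0, 0.0, 0.0])              # whole cohort starts post-primary
    total_cost, total_qaly = 9000.0, 0.0               # include an assumed index-procedure cost
    for year in range(10):
        d = 1.0 / (1.0 + discount) ** year
        total_cost += d * dist @ cost
        total_qaly += d * dist @ utility
        dist = dist @ P                                # advance the cohort one cycle

    print(f"10-year discounted cost {total_cost:,.0f}, QALYs {total_qaly:.2f}")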

  18. In search of the economic sustainability of Hadron therapy: the real cost of setting up and operating a Hadron facility.

    PubMed

    Vanderstraeten, Barbara; Verstraete, Jan; De Croock, Roger; De Neve, Wilfried; Lievens, Yolande

    2014-05-01

    To determine the treatment cost and required reimbursement for a new hadron therapy facility, considering different technical solutions and financing methods. The 3 technical solutions analyzed are a carbon only (COC), proton only (POC), and combined (CC) center, each operating 2 treatment rooms and assumed to function at full capacity. A business model defines the required reimbursement and analyzes the financial implications of setting up a facility over time; activity-based costing (ABC) calculates the treatment costs per type of patient for a center in a steady state of operation. Both models compare a private, full-cost approach with public sponsoring, only taking into account operational costs. Yearly operational costs range between €10.0M (M = million) for a publicly sponsored POC to €24.8M for a CC with private financing. Disregarding inflation, the average treatment cost calculated with ABC (COC: €29,450; POC: €46,342; CC: €46,443 for private financing; respectively €16,059, €28,296, and €23,956 for public sponsoring) is slightly lower than the required reimbursement based on the business model (between €51,200 in a privately funded POC and €18,400 in COC with public sponsoring). Reimbursement for privately financed centers is very sensitive to a delay in commissioning and to the interest rate. Higher throughput and hypofractionation have a positive impact on the treatment costs. Both calculation methods are valid and complementary. The financially most attractive option of a publicly sponsored COC should be balanced to the clinical necessities and the sociopolitical context. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Greater first year effectiveness drives favorable cost-effectiveness of brand risedronate versus generic or brand alendronate: modeled Canadian analysis

    PubMed Central

    Papaioannou, A.; Thompson, M. F.; Pasquale, M. K.; Adachi, J. D.

    2016-01-01

    Summary The RisedronatE and ALendronate (REAL) study provided a unique opportunity to conduct cost-effectiveness analyses based on effectiveness data from real-world clinical practice. Using a published osteoporosis model, the researchers found risedronate to be cost-effective compared to generic or brand alendronate for the treatment of Canadian postmenopausal osteoporosis in patients aged 65 years or older. Introduction The REAL study provides robust data on the real-world performance of risedronate and alendronate. The study used these data to assess the cost-effectiveness of brand risedronate versus generic or brand alendronate for treatment of Canadian postmenopausal osteoporosis patients aged 65 years or older. Methods A previously published osteoporosis model was populated with Canadian cost and epidemiological data, and the estimated fracture risk was validated. Effectiveness data were derived from REAL and utility data from published sources. The incremental cost per quality-adjusted life-year (QALY) gained was estimated from a Canadian public payer perspective, and comprehensive sensitivity analyses were conducted. Results The base case analysis found fewer fractures and more QALYs in the risedronate cohort, providing an incremental cost per QALY gained of $3,877 for risedronate compared to generic alendronate. The results were most sensitive to treatment duration and effectiveness. Conclusions The REAL study provided a unique opportunity to conduct cost-effectiveness analyses based on effectiveness data taken from real-world clinical practice. The analysis supports the cost-effectiveness of risedronate compared to generic or brand alendronate and the use of risedronate for the treatment of osteoporotic Canadian women aged 65 years or older with a BMD T-score ≤−2.5. PMID:18008100

  20. A person based formula for allocating commissioning funds to general practices in England: development of a statistical model.

    PubMed

    Dixon, Jennifer; Smith, Peter; Gravelle, Hugh; Martin, Steve; Bardsley, Martin; Rice, Nigel; Georghiou, Theo; Dusheiko, Mark; Billings, John; Lorenzo, Michael De; Sanderson, Colin

    2011-11-22

    To develop a formula for allocating resources for commissioning hospital care to all general practices in England based on the health needs of the people registered in each practice. Multivariate prospective statistical models were developed in which routinely collected electronic information from 2005-6 and 2006-7 on individuals and the areas in which they lived was used to predict their costs of hospital care in the next year, 2007-8. Data on individuals included all diagnoses recorded at any inpatient admission. Models were developed on a random sample of 5 million people and validated on a second random sample of 5 million people and a third sample of 5 million people drawn from a random sample of practices. All general practices in England as of 1 April 2007. All NHS inpatient admissions and outpatient attendances for individuals registered with a general practice on that date. All individuals registered with a general practice in England at 1 April 2007. The power of the statistical models to predict the costs of the individual patient or each practice's registered population for 2007-8 was tested with a range of metrics (R(2) reported here). Comparisons of predicted costs in 2007-8 with actual costs incurred in the same year were calculated by individual and by practice. Models including person level information (age, sex, and recorded ICD-10 diagnostic codes) and a range of area level information (such as socioeconomic deprivation and supply of health facilities) were most predictive of costs. After accounting for person level variables, area level variables added little explanatory power. The best models for resource allocation could predict upwards of 77% of the variation in costs at practice level, and about 12% at the person level. With these models, the predicted costs of about a third of practices would exceed or undershoot the actual costs by 10% or more. Smaller practices were more likely to be in these groups. A model was developed that performed well by international standards, and could be used for allocations to practices for commissioning. The best formulas, however, could predict only about 12% of the variation in next year's costs of most inpatient and outpatient NHS care for each individual. Person-based diagnostic data significantly added to the predictive power of the models.
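
    The contrast between person-level and practice-level predictive power can be reproduced qualitatively with synthetic data: individual costs are heavy-tailed and largely unpredictable, but averaging predictions over a practice's registered list cancels much of the noise. The sketch below is illustrative only; the predictors, the cost-generating process and the practice sizes are assumptions.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(5)
    n_people, n_practices = 200_000, 400

    # Hypothetical person-level data: age, a morbidity flag, and area deprivation.
    practice = rng.integers(0, n_practices, n_people)
    age = rng.uniform(0, 95, n_people)
    morbidity = rng.binomial(1, 0.15, n_people)
    deprivation = rng.normal(0, 1, n_practices)[practice]

    expected = 200 + 8 * age + 2500 * morbidity + 150 * deprivation
    costs = np.maximum(0, expected * rng.lognormal(0, 1.2, n_people))   # heavy-tailed individual costs

    X = np.column_stack([age, morbidity, deprivation])
    pred = LinearRegression().fit(X, costs).predict(X)

    person_r2 = r2_score(costs, pred)
    # Aggregate predictions and actuals to practice level (per-head averages).
    actual_by_practice = np.bincount(practice, weights=costs) / np.bincount(practice)
    pred_by_practice = np.bincount(practice, weights=pred) / np.bincount(practice)
    practice_r2 = r2_score(actual_by_practice, pred_by_practice)

    print(f"person-level R2 = {person_r2:.2f}, practice-level R2 = {practice_r2:.2f}")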

  1. Guaranteed cost control of polynomial fuzzy systems via a sum of squares approach.

    PubMed

    Tanaka, Kazuo; Ohtake, Hiroshi; Wang, Hua O

    2009-04-01

    This paper presents the guaranteed cost control of polynomial fuzzy systems via a sum of squares (SOS) approach. First, we present a polynomial fuzzy model and controller that are more general representations of the well-known Takagi-Sugeno (T-S) fuzzy model and controller, respectively. Second, we derive a guaranteed cost control design condition based on polynomial Lyapunov functions. Hence, the design approach discussed in this paper is more general than the existing LMI approaches (to T-S fuzzy control system designs) based on quadratic Lyapunov functions. The design condition realizes a guaranteed cost control by minimizing the upper bound of a given performance function. In addition, the design condition in the proposed approach can be represented in terms of SOS and is numerically (partially symbolically) solved via the recently developed SOSTOOLS. To illustrate the validity of the design approach, two design examples are provided. The first example deals with a complicated nonlinear system. The second example presents micro helicopter control. Both examples show that our approach provides more extensive design results than the existing LMI approach.

  2. A Practical Measure of Student Motivation: Establishing Validity Evidence for the Expectancy-Value-Cost Scale in Middle School

    ERIC Educational Resources Information Center

    Kosovich, Jeff J.; Hulleman, Chris S.; Barron, Kenneth E.; Getty, Steve

    2015-01-01

    We present validity evidence for the Expectancy-Value-Cost (EVC) Scale of student motivation. Using a brief, 10-item scale, we measured middle school students' expectancy, value, and cost for their math and science classes in the Fall and Winter of the same academic year. Confirmatory factor analyses supported the three-factor structure of the EVC…

  3. Cost calculation and prediction in adult intensive care: a ground-up utilization study.

    PubMed

    Moran, J L; Peisach, A R; Solomon, P J; Martin, J

    2004-12-01

    The ability of various proxy cost measures, including therapeutic activity scores (TISS and Omega) and cumulative daily severity of illness scores, to predict individual ICU patient costs was assessed in a prospective "ground-up" utilization costing study over a six-month period in 1991. Daily activity (TISS and Omega scores) and utilization in consecutive admissions to three adult university-associated ICUs were recorded by dedicated data collectors. Cost prediction used linear regression with determination (80%) and validation (20%) data sets. The cohort, 1333 patients, had a mean (SD) age of 57.5 (19.4) years (41% female) and admission APACHE III score of 58 (27). ICU length of stay and mortality were 3.9 (6.1) days and 17.6% respectively. Mean total TISS and Omega scores were 117 (157) and 72 (113) respectively. Mean patient costs per ICU episode (1991 Australian dollars) were $6,801 ($10,311), with a median cost of $2,534 (range $106 to $95,602). Dominant cost fractions were nursing 43.3% and overheads 16.9%. Inflation-adjusted year 2002 mean costs were $9,343 (AUD). Total costs in survivors were predicted by Omega score, summed APACHE III score and ICU length of stay; determination R2, 0.91; validation 0.88. Omega was the preferred activity score. Without the Omega score, predictors were age, summed APACHE III score and ICU length of stay; determination R2, 0.73; validation 0.73. In non-survivors, predictors were age and ICU length of stay (plus interaction), and Omega score (determination R2, 0.97; validation 0.91). Patient costs may be predicted by a combination of ICU activity indices and severity scores.
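
    The determination/validation split described above can be illustrated with a small least-squares sketch: regress per-episode cost on an activity score, a severity score and length of stay using 80% of episodes, then score R² on the remaining 20%. All numbers below are synthetic placeholders, not the study data.

```python
# Hedged sketch of an 80/20 derivation/validation cost regression on
# synthetic ICU episodes (coefficients and noise are invented).
import numpy as np

rng = np.random.default_rng(1)
n = 1333
omega = rng.gamma(2.0, 36, n)          # summed Omega activity score
apache = rng.normal(58, 27, n)         # summed APACHE III score (illustrative)
los = rng.gamma(1.5, 2.6, n)           # ICU length of stay (days)
cost = 500 + 55 * omega + 12 * apache + 600 * los + rng.normal(0, 1500, n)

X = np.column_stack([np.ones(n), omega, apache, los])
idx = rng.permutation(n)
train, test = idx[: int(0.8 * n)], idx[int(0.8 * n):]

beta, *_ = np.linalg.lstsq(X[train], cost[train], rcond=None)
pred = X[test] @ beta
ss_res = np.sum((cost[test] - pred) ** 2)
ss_tot = np.sum((cost[test] - cost[test].mean()) ** 2)
print("validation R^2:", round(1 - ss_res / ss_tot, 2))
```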

  4. A geomorphic approach to 100-year floodplain mapping for the Conterminous United States

    NASA Astrophysics Data System (ADS)

    Jafarzadegan, Keighobad; Merwade, Venkatesh; Saksena, Siddharth

    2018-06-01

    Floodplain mapping using hydrodynamic models is difficult in data-scarce regions. Additionally, using hydrodynamic models to map floodplains over large stream networks can be computationally challenging. Some of these limitations of floodplain mapping using hydrodynamic modeling can be overcome by developing computationally efficient statistical methods to identify floodplains in large and ungauged watersheds using publicly available data. This paper proposes a geomorphic model to generate probabilistic 100-year floodplain maps for the Conterminous United States (CONUS). The proposed model first categorizes the watersheds in the CONUS into three classes based on the height of the water surface corresponding to the 100-year flood from the streambed. Next, the probability that any watershed in the CONUS belongs to one of these three classes is computed through supervised classification using watershed characteristics related to topography, hydrography, land use and climate. The result of this classification is then fed into a probabilistic threshold binary classifier (PTBC) to generate the probabilistic 100-year floodplain maps. The supervised classification algorithm is trained by using the 100-year Flood Insurance Rate Maps (FIRM) from the U.S. Federal Emergency Management Agency (FEMA). FEMA FIRMs are also used to validate the performance of the proposed model in areas not included in the training. Additionally, HEC-RAS model-generated flood inundation extents are used to validate the model performance at fifteen sites that lack FEMA maps. Validation results show that the probabilistic 100-year floodplain maps generated by the proposed model match well with both FEMA and HEC-RAS generated maps. On average, the error of predicted flood extents is around 14% across the CONUS. The high accuracy of the validation results shows the reliability of the geomorphic model as an alternative approach for fast and cost-effective delineation of 100-year floodplains for the CONUS.
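
    The supervised-classification step described above can be sketched roughly as follows: a classifier is trained on labelled watersheds and its class-membership probabilities are passed on to the threshold classifier. The feature set, class coding and random-forest choice below are illustrative assumptions rather than the paper's exact configuration.

```python
# Rough sketch of watershed classification feeding a probabilistic
# threshold classifier; features, labels and model choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
n = 2000
X = np.column_stack([
    rng.normal(0, 1, n),    # mean slope
    rng.normal(0, 1, n),    # drainage density
    rng.normal(0, 1, n),    # mean annual precipitation
    rng.normal(0, 1, n),    # fraction of developed land
])
y = rng.integers(0, 3, n)   # class 0/1/2 = low/medium/high 100-yr water height

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
# Class-membership probabilities would feed the downstream threshold step.
print(clf.predict_proba(X[:3]))
```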

  5. Assessing the construct validity and reliability of the Parental Perception on Antibiotics (PAPA) scales.

    PubMed

    Alumran, Arwa; Hou, Xiang-Yu; Sun, Jiandong; Yousef, Abdullah A; Hurst, Cameron

    2014-01-23

    The overuse of antibiotics is becoming an increasing concern. Antibiotic resistance, which increases both the burden of disease and the cost of health services, is perhaps the most profound impact of antibiotic overuse. Attempts have been made to develop instruments to measure the psychosocial constructs underlying antibiotics use; however, none of these instruments have undergone thorough psychometric validation. This study evaluates the psychometric properties of the Parental Perceptions on Antibiotics (PAPA) scales. The PAPA scales attempt to measure the factors influencing parental use of antibiotics in children. 1111 parents of children younger than 12 years old were recruited from primary schools' parental meetings in the Eastern Province of Saudi Arabia from September 2012 to January 2013. The structure of the PAPA instrument was validated using Confirmatory Factor Analysis (CFA) with measurement model fit evaluated using the raw and scaled χ2, Goodness of Fit Index, and Root Mean Square Error of Approximation. A five-factor model was confirmed with the model showing good fit. Constructs in the model include: Knowledge and Beliefs, Behaviors, Sources of information, Adherence, and Awareness about antibiotics resistance. The instrument was shown to have good internal consistency, and good discriminant and convergent validity. The availability of an instrument able to measure the psychosocial factors underlying antibiotics usage allows the risk factors underlying antibiotic use and overuse to now be investigated.

  6. Formal Specification and Design Techniques for Wireless Sensor and Actuator Networks

    PubMed Central

    Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons

    2011-01-01

    A current trend in the development and implementation of industrial applications is to use wireless networks to interconnect the system nodes, mainly to increase application flexibility, reliability and portability, as well as to reduce the implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less than satisfactory performance in simulation and test bed scenarios, which is caused by using imprecise models to analyze, validate and design these systems. Moreover, there are some simulation platforms that do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) which is built on a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system. PMID:22344203

  7. A design procedure for the handling qualities optimization of the X-29A aircraft

    NASA Technical Reports Server (NTRS)

    Bosworth, John T.; Cox, Timothy H.

    1989-01-01

    A design technique for handling qualities improvement was developed for the X-29A aircraft. As with any new aircraft, the X-29A control law designers were presented with a relatively high degree of uncertainty in their mathematical models. The presence of uncertainties, and the high level of static instability of the X-29A, caused the control law designers to stress stability and robustness over handling qualities. During flight test, the mathematical models of the vehicle were validated or corrected to match the vehicle dynamic behavior. The updated models were then used to fine tune the control system to provide fighter-like handling characteristics. A design methodology was developed which works within the existing control system architecture to provide improved handling qualities and acceptable stability with a minimum of cost in both implementation and software verification and validation.

  8. Assessing the empirical validity of the "take-the-best" heuristic as a model of human probabilistic inference.

    PubMed

    Bröder, A

    2000-09-01

    The boundedly rational "Take-The-Best" heuristic (TTB) was proposed by G. Gigerenzer, U. Hoffrage, and H. Kleinbölting (1991) as a model of fast and frugal probabilistic inferences. Although the simple lexicographic rule proved to be successful in computer simulations, direct empirical demonstrations of its adequacy as a psychological model are lacking because of several methodological problems. In 4 experiments with a total of 210 participants, this question was addressed. Whereas Experiment 1 showed that TTB is not valid as a universal hypothesis about probabilistic inferences, up to 28% of participants in Experiment 2 and 53% of participants in Experiment 3 were classified as TTB users. Experiment 4 revealed that investment costs for information seem to be a relevant factor leading participants to switch to a noncompensatory TTB strategy. The observed individual differences in strategy use imply the recommendation of an idiographic approach to decision-making research.

  9. An alternative to the balance error scoring system: using a low-cost balance board to improve the validity/reliability of sports-related concussion balance testing.

    PubMed

    Chang, Jasper O; Levy, Susan S; Seay, Seth W; Goble, Daniel J

    2014-05-01

    Recent guidelines advocate that sports medicine professionals use balance tests to assess sensorimotor status in the management of concussions. The present study sought to determine whether a low-cost balance board could provide a valid, reliable, and objective means of performing this balance testing. Criterion validity testing relative to a gold standard and 7-day test-retest reliability. University biomechanics laboratory. Thirty healthy young adults. Balance ability was assessed on 2 days separated by 1 week using (1) a gold standard measure (i.e., a scientific-grade force plate), (2) a low-cost Nintendo Wii Balance Board (WBB), and (3) the Balance Error Scoring System (BESS). Validity of the WBB center of pressure path length and BESS scores were determined relative to the force plate data. Test-retest reliability was established based on intraclass correlation coefficients. Composite scores for the WBB had excellent validity (r = 0.99) and test-retest reliability (R = 0.88). Both the validity (r = 0.10-0.52) and test-retest reliability (r = 0.61-0.78) were lower for the BESS. These findings demonstrate that a low-cost balance board can provide improved balance testing accuracy/reliability compared with the BESS. This approach provides a potentially more valid/reliable, yet affordable, means of assessing sports-related concussion compared with current methods.
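
    For the criterion-validity part of such a protocol, the device's composite measure is typically correlated against the gold-standard force-plate measure. A minimal sketch with synthetic path-length data follows; the array values are placeholders, not the study's measurements.

```python
# Minimal criterion-validity sketch: correlate balance-board path lengths
# against force-plate path lengths for the same trials (synthetic data).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
force_plate = rng.normal(120.0, 15.0, 30)              # gold-standard path length (cm)
wbb = 1.05 * force_plate + rng.normal(0.0, 2.0, 30)    # balance-board estimate

r, p = pearsonr(force_plate, wbb)
print(f"criterion validity r = {r:.2f} (p = {p:.3g})")
```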

  10. Using 3D Printing (Additive Manufacturing) to Produce Low-Cost Simulation Models for Medical Training.

    PubMed

    Lichtenberger, John P; Tatum, Peter S; Gada, Satyen; Wyn, Mark; Ho, Vincent B; Liacouras, Peter

    2018-03-01

    This work describes customized, task-specific simulation models derived from 3D printing in clinical settings and medical professional training programs. Simulation models/task trainers have an array of purposes and desired achievements for the trainee; defining these is the first step in the production process. After this purpose is defined, computer-aided design and 3D printing (additive manufacturing) are used to create a customized anatomical model. Simulation models then undergo initial in-house testing by medical specialists followed by a larger scale beta testing. Feedback is acquired, via surveys, to validate effectiveness and to guide or determine if any future modifications and/or improvements are necessary. Numerous custom simulation models have been successfully completed with resulting task trainers designed for procedures, including removal of ocular foreign bodies, ultrasound-guided joint injections, nerve block injections, and various suturing and reconstruction procedures. These task trainers have been frequently utilized in the delivery of simulation-based training with increasing demand. 3D printing has been integral to the production of limited-quantity, low-cost simulation models across a variety of medical specialties. In general, production cost is a small fraction of a commercial, generic simulation model, if available. These simulation and training models are customized to the educational need and serve an integral role in the education of our military health professionals.

  11. Validation of the Concurrent Atomistic-Continuum Method on Screw Dislocation/Stacking Fault Interactions

    DOE PAGES

    Xu, Shuozhi; Xiong, Liming; Chen, Youping; ...

    2017-04-26

    Dislocation/stacking fault interactions play an important role in the plastic deformation of metallic nanocrystals and polycrystals. These interactions have been explored in atomistic models, which are limited in scale length by high computational cost. In contrast, multiscale material modeling approaches have the potential to simulate the same systems at a fraction of the computational cost. In this paper, we validate the concurrent atomistic-continuum (CAC) method on the interactions between a lattice screw dislocation and a stacking fault (SF) in three face-centered cubic metallic materials—Ni, Al, and Ag. Two types of SFs are considered: intrinsic SF (ISF) and extrinsic SF (ESF). For the three materials at different strain levels, two screw dislocation/ISF interaction modes (annihilation of the ISF and transmission of the dislocation across the ISF) and three screw dislocation/ESF interaction modes (transformation of the ESF into a three-layer twin, transformation of the ESF into an ISF, and transmission of the dislocation across the ESF) are identified. Here, our results show that CAC is capable of accurately predicting the dislocation/SF interaction modes with greatly reduced DOFs compared to fully-resolved atomistic simulations.

  12. Multi-criteria anomaly detection in urban noise sensor networks.

    PubMed

    Dauwe, Samuel; Oldoni, Damiano; De Baets, Bernard; Van Renterghem, Timothy; Botteldooren, Dick; Dhoedt, Bart

    2014-01-01

    The growing concern of citizens about the quality of their living environment and the emergence of low-cost microphones and data acquisition systems triggered the deployment of numerous noise monitoring networks spread over large geographical areas. Due to the local character of noise pollution in an urban environment, a dense measurement network is needed in order to accurately assess the spatial and temporal variations. The use of consumer grade microphones in this context appears to be very cost-efficient compared to the use of measurement microphones. However, the lower reliability of these sensing units requires a strong quality control of the measured data. To automatically validate sensor (microphone) data, prior to their use in further processing, a multi-criteria measurement quality assessment model for detecting anomalies such as microphone breakdowns, drifts and critical outliers was developed. Each of the criteria results in a quality score between 0 and 1. An ordered weighted average (OWA) operator combines these individual scores into a global quality score. The model is validated on datasets acquired from a real-world, extensive noise monitoring network consisting of more than 50 microphones. Over a period of more than a year, the proposed approach successfully detected several microphone faults and anomalies.
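
    As a concrete illustration of the aggregation step described above, the sketch below applies an ordered weighted average (OWA) to a handful of per-criterion quality scores: the scores are sorted and a fixed weight vector is applied to the ordered values. The weight vector and scores are assumed examples, not those used in the deployed network.

```python
# Small sketch of an ordered weighted average (OWA) over quality scores.
import numpy as np

def owa(scores, weights):
    """Ordered weighted average of quality scores in [0, 1]."""
    scores = np.sort(np.asarray(scores, dtype=float))[::-1]  # descending order
    weights = np.asarray(weights, dtype=float)
    assert scores.shape == weights.shape and np.isclose(weights.sum(), 1.0)
    return float(scores @ weights)

# Three criterion scores for one microphone, e.g. drift, outlier and spectrum checks.
print(owa([0.9, 0.4, 0.7], weights=[0.5, 0.3, 0.2]))   # -> 0.74
```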

  13. Validation of the Concurrent Atomistic-Continuum Method on Screw Dislocation/Stacking Fault Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Shuozhi; Xiong, Liming; Chen, Youping

    Dislocation/stacking fault interactions play an important role in the plastic deformation of metallic nanocrystals and polycrystals. These interactions have been explored in atomistic models, which are limited in scale length by high computational cost. In contrast, multiscale material modeling approaches have the potential to simulate the same systems at a fraction of the computational cost. In this paper, we validate the concurrent atomistic-continuum (CAC) method on the interactions between a lattice screw dislocation and a stacking fault (SF) in three face-centered cubic metallic materials—Ni, Al, and Ag. Two types of SFs are considered: intrinsic SF (ISF) and extrinsic SF (ESF). For the three materials at different strain levels, two screw dislocation/ISF interaction modes (annihilation of the ISF and transmission of the dislocation across the ISF) and three screw dislocation/ESF interaction modes (transformation of the ESF into a three-layer twin, transformation of the ESF into an ISF, and transmission of the dislocation across the ESF) are identified. Here, our results show that CAC is capable of accurately predicting the dislocation/SF interaction modes with greatly reduced DOFs compared to fully-resolved atomistic simulations.

  14. The potential for machine learning algorithms to improve and reduce the cost of 3-dimensional printing for surgical planning.

    PubMed

    Huff, Trevor J; Ludwig, Parker E; Zuniga, Jorge M

    2018-05-01

    3D-printed anatomical models play an important role in medical and research settings. The recent successes of 3D anatomical models in healthcare have led many institutions to adopt the technology. However, there remain several issues that must be addressed before it can become more widespread. Of importance are the problems of cost and time of manufacturing. Machine learning (ML) could be utilized to solve these issues by streamlining the 3D modeling process through rapid medical image segmentation and improved patient selection and image acquisition. The current challenges, potential solutions, and future directions for ML and 3D anatomical modeling in healthcare are discussed. Areas covered: This review covers research articles in the field of machine learning as related to 3D anatomical modeling. Topics discussed include automated image segmentation, cost reduction, and related time constraints. Expert commentary: ML-based segmentation of medical images could potentially improve the process of 3D anatomical modeling. However, until more research is done to validate these technologies in clinical practice, their impact on patient outcomes will remain unknown. We have the necessary computational tools to tackle the problems discussed. The difficulty now lies in our ability to collect sufficient data.

  15. A Capable and Temporary Test Facility on a Shoestring Budget: The MSL Touchdown Test Facility

    NASA Technical Reports Server (NTRS)

    White, Christopher V.; Frankovich, John K.; Yates, Philip; Wells, George, Jr.; Robert, Losey

    2008-01-01

    The Mars Science Laboratory mission (MSL) has undertaken a developmental Touchdown Test Program that utilizes a full-scale rover vehicle and an overhead winch system to replicate the skycrane landing event. Landing surfaces consisting of flat and sloped granular media, planar, rigid surfaces, and various combinations of rocks and slopes were studied. Information gathered from these tests was vital for validating the rover analytical model, validating certain design or system behavior assumptions, and for exploring events and phenomena that are either very difficult or too costly to model in a credible way. This paper describes this test program, with a focus on the creation of the test facility, daily test operations, and some of the challenges faced and lessons learned along the way.

  16. ABEL model: Evaluates corporations' claims of inability to afford penalties and compliance costs (version 3.0.16). Model-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-11-01

    The easy-to-use ABEL software evaluates for-profit company claims of inability to afford penalties, clean-up costs, or compliance costs. Violators raise the issue of inability to pay in most of EPA's enforcement actions regardless of whether there is any hard evidence supporting those claims. The program enables Federal, State and local enforcement professionals to quickly determine if there is any validity to those claims. ABEL is a tool that promotes quick settlements by performing screening analyses of defendants and potentially responsible parties (PRPs) to determine their financial capacity. After analyzing some basic financial ratios that reflect a company's solvency, ABEL assesses the firm's ability to pay by focusing on projected cash flows. The model explicitly calculates the value of projected, internally generated cash flows from historical tax information, and compares these cash flows to the proposed environmental expenditure(s). The software is extremely easy to use. Version 3.0.16 updates the standard values for inflation and discount rate.

  17. Advanced Energy Storage Management in Distribution Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Guodong; Ceylan, Oguzhan; Xiao, Bailu

    2016-01-01

    With increasing penetration of distributed generation (DG) in the distribution networks (DN), the secure and optimal operation of DN has become an important concern. In this paper, an iterative mixed-integer quadratically constrained quadratic programming model to optimize the operation of a three-phase unbalanced distribution system with high penetration of Photovoltaic (PV) panels, DG and energy storage (ES) is developed. The proposed model minimizes not only the operating cost, including fuel cost and purchasing cost, but also voltage deviations and power loss. The optimization model is based on the linearized sensitivity coefficients between state variables (e.g., node voltages) and control variables (e.g., real and reactive power injections of DG and ES). To avoid slow convergence when close to the optimum, a golden search method is introduced to control the step size and accelerate the convergence. The proposed algorithm is demonstrated on a modified IEEE 13-node test feeder with multiple PV panels, DG and ES. Numerical simulation results validate the proposed algorithm. Various scenarios of system configuration are studied and critical findings are summarized.
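
    A golden search narrows the step size along the current update direction without derivatives by repeatedly shrinking a bracketing interval. The sketch below is an illustrative, generic golden-section search; the one-dimensional objective and interval are stand-ins, not the paper's power-flow cost.

```python
# Illustrative golden-section search over a step size alpha on [a, b].
import math

def golden_section_min(f, a, b, tol=1e-5):
    """Minimise a unimodal function f on [a, b] by golden-section search."""
    phi = (math.sqrt(5) - 1) / 2          # ~0.618, the inverse golden ratio
    c, d = b - phi * (b - a), a + phi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):                   # minimum lies in [a, d]
            b, d = d, c
            c = b - phi * (b - a)
        else:                             # minimum lies in [c, b]
            a, c = c, d
            d = a + phi * (b - a)
    return (a + b) / 2

# Stand-in cost as a function of the step size alpha.
cost = lambda alpha: (alpha - 0.37) ** 2 + 1.0
print(round(golden_section_min(cost, 0.0, 1.0), 4))    # -> ~0.37
```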

  18. Validation of a 30m resolution flood hazard model of the conterminous United States

    NASA Astrophysics Data System (ADS)

    Sampson, C. C.; Wing, O.; Smith, A.; Bates, P. D.; Neal, J. C.

    2017-12-01

    We present a 30m resolution two-dimensional hydrodynamic model of the entire conterminous US that has been used to simulate continent-wide flood extent for ten return periods. The model uses a highly efficient numerical solution of the shallow water equations to simulate fluvial flooding in catchments down to 50 km2 and pluvial flooding in all catchments. We use the US National Elevation Dataset (NED) to determine topography for the model and the US Army Corps of Engineers National Levee Dataset to explicitly represent known flood defences. Return period flows and rainfall intensities are estimated using regionalized frequency analyses. We validate these simulations against the complete catalogue of Federal Emergency Management Agency (FEMA) Special Flood Hazard Area maps. We also compare the results obtained from the NED-based continental model with results from a 90m resolution global hydraulic model built using SRTM terrain and identical boundary conditions. Where the FEMA Special Flood Hazard Areas are based on high quality local models the NED-based continental scale model attains a Hit Rate of 86% and a Critical Success Index (CSI) of 0.59; both are typical of scores achieved when comparing high quality reach-scale models to observed event data. The NED model also consistently outperformed the coarser SRTM model. The correspondence between the continental model and FEMA improves in temperate areas and for basins above 400 km2. Given typical hydraulic modeling uncertainties in the FEMA maps, it is probable that the continental-scale model can replicate them to within error. The continental model covers the entire continental US, compared to only 61% for FEMA, and also maps flooding in smaller watersheds not included in the FEMA coverage. The simulations were performed using computing hardware costing less than $100k, whereas the FEMA flood layers are built from thousands of individual local studies that took several decades to develop at an estimated cost (up to 2013) of $4.5-7.5bn. The continental model is relatively straightforward to modify and could be re-run under different scenarios, such as climate change. The results show that continental-scale models may now offer sufficient rigor to inform some decision-making needs with far lower cost and greater coverage than traditional patchwork approaches.
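
    The Hit Rate and Critical Success Index quoted above come from a cell-by-cell comparison of binary wet/dry maps: hits, misses and false alarms are counted and combined. A minimal sketch of that scoring on made-up toy arrays follows.

```python
# Hedged sketch of binary flood-map comparison metrics on toy arrays.
import numpy as np

def hit_rate_csi(model, benchmark):
    model, benchmark = np.asarray(model, bool), np.asarray(benchmark, bool)
    hits = np.sum(model & benchmark)            # wet in both maps
    misses = np.sum(~model & benchmark)         # wet in benchmark only
    false_alarms = np.sum(model & ~benchmark)   # wet in model only
    hit_rate = hits / (hits + misses)
    csi = hits / (hits + misses + false_alarms)
    return hit_rate, csi

model_map     = [1, 1, 1, 0, 0, 1, 0, 1]
benchmark_map = [1, 1, 0, 0, 1, 1, 0, 1]
print(hit_rate_csi(model_map, benchmark_map))   # -> (0.8, 0.666...)
```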

  19. Development of a funding, cost, and spending model for satellite projects

    NASA Technical Reports Server (NTRS)

    Johnson, Jesse P.

    1989-01-01

    The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) are used to predict the total costs of satellite projects. An effort was conducted to extend the modeling capabilities from total budget analysis to analysis of both the total budget and budget outlays over time. A statistically based, data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model which would describe the historical spending patterns. This raw data consisted of dollars spent in that specific year and their 1989 dollar equivalent. This data was converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach used is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large scale projects over extended periods of time is described by Life Cycle Cost Models (LCCM). The data was analyzed to find a model in the generic form of a LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. In order to use this model it is necessary to transform the problem from a dollar/time space to a percentage of total budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule which relates the amount of money spent to the percentage of project completion.
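
    A hedged sketch of the kind of fit described above: a Weibull cumulative curve fitted to normalised spending-versus-schedule data by nonlinear regression. The data points, shape/scale symbols and starting values are illustrative assumptions, not drawn from the 18 GSFC projects.

```python
# Illustrative Weibull fit of cumulative budget fraction vs schedule fraction.
import numpy as np
from scipy.optimize import curve_fit

def weibull_cdf(t, k, lam):
    """Cumulative fraction of budget spent by normalised time t."""
    return 1.0 - np.exp(-(t / lam) ** k)

t = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0])
spend = np.array([0.02, 0.08, 0.18, 0.32, 0.48, 0.63, 0.76, 0.86, 0.93, 0.97])

(k, lam), _ = curve_fit(weibull_cdf, t, spend, p0=[2.0, 0.5])
print(f"shape k = {k:.2f}, scale lambda = {lam:.2f}")
print("predicted spend at 75% of schedule:", round(weibull_cdf(0.75, k, lam), 2))
```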

  20. Evaluation of the long-term cost-effectiveness of liraglutide therapy for patients with type 2 diabetes in France.

    PubMed

    Roussel, Ronan; Martinez, Luc; Vandebrouck, Tom; Douik, Habiba; Emiel, Patrick; Guery, Matthieu; Hunt, Barnaby; Valentine, William J

    2016-01-01

    The present study aimed to compare the projected long-term clinical and cost implications associated with liraglutide, sitagliptin and glimepiride in patients with type 2 diabetes mellitus failing to achieve glycemic control on metformin monotherapy in France. Clinical input data for the modeling analysis were taken from two randomized, controlled trials (LIRA-DPP4 and LEAD-2). Long-term (patient lifetime) projections of clinical outcomes and direct costs (2013 Euros; €) were made using a validated computer simulation model of type 2 diabetes. Costs were taken from published France-specific sources. Future costs and clinical benefits were discounted at 3% annually. Sensitivity analyses were performed. Liraglutide was associated with an increase in quality-adjusted life expectancy of 0.25 quality-adjusted life years (QALYs) and an increase in mean direct healthcare costs of €2558 per patient compared with sitagliptin. In the comparison with glimepiride, liraglutide was associated with an increase in quality-adjusted life expectancy of 0.23 QALYs and an increase in direct costs of €4695. Based on these estimates, liraglutide was associated with an incremental cost-effectiveness ratio (ICER) of €10,275 per QALY gained vs sitagliptin and €20,709 per QALY gained vs glimepiride in France. Calculated ICERs for both comparisons fell below the commonly quoted willingness-to-pay threshold of €30,000 per QALY gained. Therefore, liraglutide is likely to be cost-effective vs sitagliptin and glimepiride from a healthcare payer perspective in France.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skrinak, V.M.

    The Eastern Devonian Gas Shales Technology Review is a technology transfer vehicle designed to keep industry and research organizations aware of major happenings in the shales. Four issues were published, and the majority of the readership was found to be operators. Under the other major task in this project, areal and analytic analyses of the basin resulted in reducing the study area by 30% while defining a rectangular coordinate system for the basin. Shale-well cost and economic models were developed and validated, and a simplified flow model was prepared.

  2. Pinatubo global cooling on target

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerr, R.A.

    1993-01-29

    When Pinatubo blasted millions of tons of debris into the stratosphere in June 1991, Hansen of NASA's Goddard Institute for Space Studies used his computer climate model to predict that the shade cast by the debris would cool the globe by about half a degree C. Year-end temperature reports for 1992 are now showing that the prediction was on target, confirming the tentative belief that volcanoes can temporarily cool the climate and validating at least one component of the computer models predicting a greenhouse warming.

  3. Cost-Value Analysis and the SAVE: A Work in Progress, But an Option for Localised Decision Making?

    PubMed

    Karnon, Jonathan; Partington, Andrew

    2015-12-01

    Cost-value analysis aims to address the limitations of the quality-adjusted life-year (QALY) by incorporating the strength of public concerns for fairness in the allocation of scarce health care resources. To date, the measurement of value has focused on equity weights to reflect societal preferences for the allocation of QALY gains. Another approach is to use a non-QALY-based measure of value, such as an outcome 'equivalent to saving the life of a young person' (a SAVE). This paper assesses the feasibility and validity of using the SAVE as a measure of value for the economic evaluation of health care technologies. A web-based person trade-off (PTO) survey was designed and implemented to estimate equivalent SAVEs for outcome events associated with the progression and treatment of early-stage breast cancer. The estimated equivalent SAVEs were applied to the outputs of an existing decision analytic model for early breast cancer. The web-based PTO survey was undertaken by 1094 respondents. Validation tests showed that 68% of eligible responses revealed consistent ordering of responses and 32% displayed ordinal transitivity, while 37% of respondents showing consistency and ordinal transitivity approached cardinal transitivity. Using consistent and ordinally transitive responses, the mean incremental cost per SAVE gained was £3.72 million. Further research is required to improve the validity of the SAVE, which may include a simpler web-based survey format or a face-to-face format to facilitate more informed responses. A validated method for estimating equivalent SAVEs is unlikely to replace the QALY as the globally preferred measure of outcome, but the SAVE may provide a useful alternative for localized decision makers with relatively small, constrained budgets, for example in programme budgeting and marginal analysis.

  4. Automated extraction and validation of children's gait parameters with the Kinect.

    PubMed

    Motiian, Saeid; Pergami, Paola; Guffey, Keegan; Mancinelli, Corrie A; Doretto, Gianfranco

    2015-12-02

    Gait analysis for therapy regimen prescription and monitoring requires patients to physically access clinics with specialized equipment. The timely availability of such infrastructure at the right frequency is especially important for small children. Besides being very costly, this is a challenge for many children living in rural areas. This is why this work develops a low-cost, portable, and automated approach for in-home gait analysis, based on the Microsoft Kinect. A robust and efficient method for extracting gait parameters is introduced, which copes with the high variability of noisy Kinect skeleton tracking data experienced across the population of young children. This is achieved by temporally segmenting the data with an approach based on coupling a probabilistic matching of stride template models, learned offline, with the estimation of their global and local temporal scaling. A preliminary study conducted on healthy children between 2 and 4 years of age is performed to analyze the accuracy, precision, repeatability, and concurrent validity of the proposed method against the GAITRite when measuring several spatial and temporal children's gait parameters. The method has excellent accuracy and good precision in segmenting temporal sequences of body joint locations into stride and step cycles. Also, the spatial and temporal gait parameters, estimated automatically, exhibit good concurrent validity with those provided by the GAITRite, as well as very good repeatability. In particular, on a range of nine gait parameters, the relative and absolute agreements were found to be good and excellent, and the overall agreements were found to be good and moderate. This work enables and validates the automated use of the Kinect for children's gait analysis in healthy subjects. In particular, the approach takes a step forward towards developing a low-cost, portable, parent-operated in-home tool for clinicians assisting young children.

  5. Cost-Effectiveness of Patient Navigation to Increase Adherence with Screening Colonoscopy Among Minority Individuals

    PubMed Central

    Ladabaum, Uri; Mannalithara, Ajitha; Jandorf, Lina; Itzkowitz, Steven H.

    2015-01-01

    Background Colorectal cancer (CRC) screening is underutilized by minority populations. Patient navigation increases adherence with screening colonoscopy. We estimated the cost-effectiveness of navigation for screening colonoscopy from the perspective of a payer seeking to improve population health. Methods We informed our validated model of CRC screening with inputs from navigation studies in New York City (population 43% African American, 49% Hispanic, 4% White, 4% Other; base case screening 40% without and 65% with navigation, navigation costs $29/colonoscopy completer, $21/non-completer, $3/non-navigated). We compared: 1) navigation vs. no navigation for one-time screening colonoscopy in unscreened persons age ≥50; 2) programs of colonoscopy with vs. without navigation, vs. fecal occult blood testing (FOBT) or immunochemical testing (FIT) for ages 50-80. Results In the base case: 1) one-time navigation gained quality-adjusted life-years (QALYs) and decreased costs; 2) longitudinal navigation cost $9,800/QALY gained vs. no navigation, and assuming comparable uptake rates, it cost $118,700/QALY gained vs. FOBT, but was less effective and more costly than FIT. Results were most dependent on screening participation rates and navigation costs: 1) assuming a 5% increase in screening uptake with navigation and navigation cost of $150/completer, one-time navigation cost $26,400/QALY gained; 2) longitudinal navigation with 75% colonoscopy uptake cost <$25,000/QALY gained vs. FIT when FIT uptake was <50%. Probabilistic sensitivity analyses did not alter the conclusions. Conclusions Navigation for screening colonoscopy appears to be cost-effective, and one-time navigation may be cost-saving. In emerging healthcare models that reward outcomes, payers should consider covering the costs of navigation for screening colonoscopy. PMID:25492455

  6. Three-Dimensional Modeling May Improve Surgical Education and Clinical Practice.

    PubMed

    Jones, Daniel B; Sung, Robert; Weinberg, Crispin; Korelitz, Theodore; Andrews, Robert

    2016-04-01

    Three-dimensional (3D) printing has been used in the manufacturing industry for rapid prototyping and product testing. The aim of our study was to assess the feasibility of creating anatomical 3D models from a digital image using 3D printers. Furthermore, we sought face validity of models and explored potential opportunities for using 3D printing to enhance surgical education and clinical practice. Computed tomography and magnetic resonance images were reviewed, converted to computer models, and printed by stereolithography to create near exact replicas of human organs. Medical students and surgeons provided feedback via survey at the 2014 Surgical Education Week conference. There were 51 respondents, and 95.8% wanted these models for their patients. Cost was a concern, but 82.6% found value in these models at a price less than $500. All respondents thought the models would be useful for integration into the medical school curriculum. Three-dimensional printing is a potentially disruptive technology to improve both surgical education and clinical practice. As the technology matures and cost decreases, we envision 3D models being increasingly used in surgery. © The Author(s) 2015.

  7. Principles of pharmacoeconomics and their impact on strategic imperatives of pharmaceutical research and development.

    PubMed

    Bodrogi, József; Kaló, Zoltán

    2010-04-01

    The importance of evidence-based health policy is widely acknowledged among health care professionals, patients and politicians. Health care resources available for medical procedures, including pharmaceuticals, are limited all over the world. Economic evaluations help to alleviate the burden of scarce resources by improving the allocative efficiency of health care financing. Reimbursement of new medicines is subject to their cost-effectiveness and affordability in more and more countries. There are three major approaches to calculate the cost-effectiveness of new pharmaceuticals. Economic analyses alongside pivotal clinical trials are often inconclusive due to the suboptimal collection of economic data and protocol-driven costs. The major limitation of observational naturalistic economic evaluations is the selection bias and that they can be conducted only after registration and reimbursement. Economic modelling is routinely used to predict the cost-effectiveness of new pharmaceuticals for reimbursement purposes. Accuracy of cost-effectiveness estimates depends on the quality of input variables; validity of surrogate end points; and appropriateness of modelling assumptions, including model structure, time horizon and sophistication of the model to differentiate clinically and economically meaningful outcomes. These economic evaluation methods are not mutually exclusive; in practice, economic analyses often combine data collection alongside clinical trials or observational studies with modelling. The need for pharmacoeconomic evidence has fundamentally changed the strategic imperatives of research and development (R&D). Therefore, professionals in pharmaceutical R&D have to be familiar with the principles of pharmacoeconomics, including the selection of health policy-relevant comparators, analytical techniques, measurement of health gain by quality-adjusted life-years and strategic pricing of pharmaceuticals.

  8. Principles of pharmacoeconomics and their impact on strategic imperatives of pharmaceutical research and development

    PubMed Central

    Bodrogi, József; Kaló, Zoltán

    2010-01-01

    The importance of evidence-based health policy is widely acknowledged among health care professionals, patients and politicians. Health care resources available for medical procedures, including pharmaceuticals, are limited all over the world. Economic evaluations help to alleviate the burden of scarce resources by improving the allocative efficiency of health care financing. Reimbursement of new medicines is subject to their cost-effectiveness and affordability in more and more countries. There are three major approaches to calculate the cost-effectiveness of new pharmaceuticals. Economic analyses alongside pivotal clinical trials are often inconclusive due to the suboptimal collection of economic data and protocol-driven costs. The major limitation of observational naturalistic economic evaluations is the selection bias and that they can be conducted only after registration and reimbursement. Economic modelling is routinely used to predict the cost-effectiveness of new pharmaceuticals for reimbursement purposes. Accuracy of cost-effectiveness estimates depends on the quality of input variables; validity of surrogate end points; and appropriateness of modelling assumptions, including model structure, time horizon and sophistication of the model to differentiate clinically and economically meaningful outcomes. These economic evaluation methods are not mutually exclusive; in practice, economic analyses often combine data collection alongside clinical trials or observational studies with modelling. The need for pharmacoeconomic evidence has fundamentally changed the strategic imperatives of research and development (R&D). Therefore, professionals in pharmaceutical R&D have to be familiar with the principles of pharmacoeconomics, including the selection of health policy-relevant comparators, analytical techniques, measurement of health gain by quality-adjusted life-years and strategic pricing of pharmaceuticals. PMID:20132213

  9. Cost analysis of objective resident cataract surgery assessments.

    PubMed

    Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M

    2015-05-01

    To compare 8 ophthalmology resident surgical training tools to determine which is most cost effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs in running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on cost of tools, cost of time in evaluations, and supply and maintenance costs. For wet laboratory simulation, Eyesi was the least expensive cataract surgery simulation method; however, it is only capable of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for resident evaluation in operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analysis of ophthalmology resident surgical training tools is needed so residency programs can implement tools that are valid, reliable, objective, and cost effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  10. Validating the Use of Performance Risk Indices for System-Level Risk and Maturity Assessments

    NASA Astrophysics Data System (ADS)

    Holloman, Sherrica S.

    With pressure on the U.S. Defense Acquisition System (DAS) to reduce cost overruns and schedule delays, system engineers' performance is only as good as their tools. Recent literature details a need for 1) objective, analytical risk quantification methodologies over traditional subjective qualitative methods such as expert judgment, and 2) mathematically rigorous system-level maturity assessments. The Mahafza, Componation, and Tippett (2005) Technology Performance Risk Index (TPRI) ties the assessment of technical performance to the quantification of risk of unmet performance; however, it is structured for component-level data as input. This study's aim is to establish a modified TPRI with system-level data as model input, and then validate the modified index with actual system-level data from the Department of Defense's (DoD) Major Defense Acquisition Programs (MDAPs). This work's contribution is the establishment and validation of the System-level Performance Risk Index (SPRI). With the introduction of the SPRI, system-level metrics are better aligned, allowing for better assessment, tradeoff and balance of time, performance and cost constraints. This will allow system engineers and program managers to ultimately make better-informed system-level technical decisions throughout the development phase.

  11. Observed Parenting Behavior with Teens: Measurement Invariance and Predictive Validity Across Race

    PubMed Central

    Skinner, Martie L.; MacKenzie, Elizabeth P.; Haggerty, Kevin P.; Hill, Karl G.; Roberson, Kendra C.

    2011-01-01

    Previous reports supporting measurement equality between European American and African American families have often focused on self-reported risk factors or observed parent behavior with young children. This study examines equality of measurement of observer ratings of parenting behavior with adolescents during structured tasks; mean levels of observed parenting; and predictive validity of teen self-reports of antisocial behaviors and beliefs using a sample of 163 African American and 168 European American families. Multiple-group confirmatory factor analyses supported measurement invariance across ethnic groups for 4 measures of observed parenting behavior: prosocial rewards, psychological costs, antisocial rewards, and problem solving. Some mean-level differences were found: African American parents exhibited lower levels of prosocial rewards, higher levels of psychological costs, and lower problem solving when compared to European Americans. No significant mean difference was found in rewards for antisocial behavior. Multigroup structural equation models suggested comparable relationships across race (predictive validity) between parenting constructs and youth antisocial constructs (i.e., drug initiation, positive drug attitudes, antisocial attitudes, problem behaviors) in all but one of the tested relationships. This study adds to existing evidence that family-based interventions targeting parenting behaviors can be generalized to African American families. PMID:21787057

  12. Pell Grant Validation Imposes Some Costs and Does Not Greatly Reduce Award Errors: New Strategies Are Needed. Report to the Honorable Paul Simon, United States Senate.

    ERIC Educational Resources Information Center

    Comptroller General of the U.S., Washington, DC.

    Efforts of the U.S. Department of Education to verify data submitted by applicants to the Pell Grant program were analyzed by the General Accounting Office. The effects of carrying out the Department's policy or methodology, called "validation," on financial aid applicants and colleges were assessed. Costs of 1982-1983 validation on…

  13. Study of indoor radon distribution using measurements and CFD modeling.

    PubMed

    Chauhan, Neetika; Chauhan, R P; Joshi, M; Agarwal, T K; Aggarwal, Praveen; Sahoo, B K

    2014-10-01

    Measurement and/or prediction of indoor radon ((222)Rn) concentration are important due to the impact of radon on indoor air quality and consequent inhalation hazard. In recent times, computational fluid dynamics (CFD) based modeling has become the cost effective replacement of experimental methods for the prediction and visualization of indoor pollutant distribution. The aim of this study is to implement CFD based modeling for studying indoor radon gas distribution. This study focuses on comparison of experimentally measured and CFD modeling predicted spatial distribution of radon concentration for a model test room. The key inputs for simulation viz. radon exhalation rate and ventilation rate were measured as a part of this study. Validation experiments were performed by measuring radon concentration at different locations of test room using active (continuous radon monitor) and passive (pin-hole dosimeters) techniques. Modeling predictions have been found to be reasonably matching with the measurement results. The validated model can be used to understand and study factors affecting indoor radon distribution for more realistic indoor environment. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. A predictive bone drilling force model for haptic rendering with experimental validation using fresh cadaveric bone.

    PubMed

    Lin, Yanping; Chen, Huajiang; Yu, Dedong; Zhang, Ying; Yuan, Wen

    2017-01-01

    Bone drilling simulators with virtual and haptic feedback provide a safe, cost-effective and repeatable alternative to traditional surgical training methods. To develop such a simulator, accurate haptic rendering based on a force model is required to feed back bone drilling forces based on user input. Current predictive bone drilling force models based on bovine bones with various drilling conditions and parameters are not representative of the bone drilling process in bone surgery. The objective of this study was to provide a bone drilling force model for haptic rendering based on calibration and validation experiments in fresh cadaveric bones with different bone densities. Using a commonly used drill bit geometry (2 mm diameter), feed rates (20-60 mm/min) and spindle speeds (4000-6000 rpm) in orthognathic surgeries, the bone drilling forces of specimens from two groups were measured and the calibration coefficients of the specific normal and frictional pressures were determined. The comparison of the predicted forces and the measured forces from validation experiments with a large range of feed rates and spindle speeds demonstrates that the proposed bone drilling forces can predict the trends and average forces well. The presented bone drilling force model can be used for haptic rendering in surgical simulators.
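
    As a hedged illustration of force-model calibration for haptic rendering, and not the paper's mechanistic pressure-based model, the sketch below fits a simplified linear thrust-force relation in feed rate and spindle speed by least squares and reuses it for prediction. All trial values and coefficients are invented placeholders.

```python
# Hedged sketch: calibrate a simplified thrust-force relation
# F = c0 + c1*feed_rate + c2*spindle_speed from measured trials.
import numpy as np

# Invented calibration trials: feed rate (mm/min), spindle speed (rpm), force (N).
feed  = np.array([20, 30, 40, 50, 60, 20, 40, 60], dtype=float)
speed = np.array([4000, 4000, 5000, 5000, 6000, 6000, 4000, 5000], dtype=float)
force = np.array([6.1, 7.8, 8.9, 10.4, 11.2, 5.3, 9.6, 12.0])

A = np.column_stack([np.ones_like(feed), feed, speed])
coef, *_ = np.linalg.lstsq(A, force, rcond=None)

def predict_force(feed_rate, spindle_speed):
    """Predicted thrust force (N) for use inside a haptic-rendering loop."""
    return float(coef @ [1.0, feed_rate, spindle_speed])

print(round(predict_force(45, 5000), 2))
```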

  15. Cervical Spine Injuries: A Whole-Body Musculoskeletal Model for the Analysis of Spinal Loading.

    PubMed

    Cazzola, Dario; Holsgrove, Timothy P; Preatoni, Ezio; Gill, Harinderjit S; Trewartha, Grant

    2017-01-01

    Cervical spine trauma from sport or traffic collisions can have devastating consequences for individuals and a high societal cost. The precise mechanisms of such injuries are still unknown as investigation is hampered by the difficulty in experimentally replicating the conditions under which these injuries occur. We harness the benefits of computer simulation to report on the creation and validation of i) a generic musculoskeletal model (MASI) for the analyses of cervical spine loading in healthy subjects, and ii) a population-specific version of the model (Rugby Model), for investigating cervical spine injury mechanisms during rugby activities. The musculoskeletal models were created in OpenSim, and validated against in vivo data of a healthy subject and a rugby player performing neck and upper limb movements. The novel aspects of the Rugby Model comprise i) population-specific inertial properties and muscle parameters representing rugby forward players, and ii) a custom scapula-clavicular joint that allows the application of multiple external loads. We confirm the utility of the developed generic and population-specific models via verification steps and validation of kinematics, joint moments and neuromuscular activations during rugby scrummaging and neck functional movements, which achieve results comparable with in vivo and in vitro data. The Rugby Model was validated and used for the first time to provide insight into anatomical loading and cervical spine injury mechanisms related to rugby, whilst the MASI introduces a new computational tool to allow investigation of spinal injuries arising from other sporting activities, transport, and ergonomic applications. The models used in this study are freely available at simtk.org and allow in silico analyses to be integrated with experimental approaches to injury prevention.

  16. Cervical Spine Injuries: A Whole-Body Musculoskeletal Model for the Analysis of Spinal Loading

    PubMed Central

    Holsgrove, Timothy P.; Preatoni, Ezio; Gill, Harinderjit S.; Trewartha, Grant

    2017-01-01

    Cervical spine trauma from sport or traffic collisions can have devastating consequences for individuals and a high societal cost. The precise mechanisms of such injuries are still unknown as investigation is hampered by the difficulty in experimentally replicating the conditions under which these injuries occur. We harness the benefits of computer simulation to report on the creation and validation of i) a generic musculoskeletal model (MASI) for the analyses of cervical spine loading in healthy subjects, and ii) a population-specific version of the model (Rugby Model), for investigating cervical spine injury mechanisms during rugby activities. The musculoskeletal models were created in OpenSim, and validated against in vivo data of a healthy subject and a rugby player performing neck and upper limb movements. The novel aspects of the Rugby Model comprise i) population-specific inertial properties and muscle parameters representing rugby forward players, and ii) a custom scapula-clavicular joint that allows the application of multiple external loads. We confirm the utility of the developed generic and population-specific models via verification steps and validation of kinematics, joint moments and neuromuscular activations during rugby scrummaging and neck functional movements, which achieve results comparable with in vivo and in vitro data. The Rugby Model was validated and used for the first time to provide insight into anatomical loading and cervical spine injury mechanisms related to rugby, whilst the MASI introduces a new computational tool to allow investigation of spinal injuries arising from other sporting activities, transport, and ergonomic applications. The models used in this study are freely available at simtk.org and allow in silico analyses to be integrated with experimental approaches to injury prevention. PMID:28052130

  17. Citrate-based fluorescent materials for low-cost chloride sensing in the diagnosis of Cystic Fibrosis.

    PubMed

    Kim, Jimin P; Xie, Zhiwei; Creer, Michael; Liu, Zhiwen; Yang, Jian

    2017-01-01

    Chloride is an essential electrolyte that maintains homeostasis within the body, where abnormal chloride levels in biological fluids may indicate various diseases such as Cystic Fibrosis. However, current analytical solutions for chloride detection fail to meet the clinical needs of both high performance and low material or labor costs, hindering translation into clinical settings. Here we present a new class of fluorescence chloride sensors derived from a facile citrate-based synthesis platform that utilize dynamic quenching mechanisms. Based on this low-cost platform, we demonstrate for the first time a selective sensing strategy that uses a single fluorophore to detect multiple halides simultaneously, promising both selectivity and automation to improve performance and reduce labor costs. We also demonstrate the clinical utility of citrate-based sensors as a new sweat chloride test method for the diagnosis of Cystic Fibrosis by performing analytical validation with sweat controls and clinical validation with sweat from individuals with or without Cystic Fibrosis. Lastly, molecular modeling studies reveal the structural mechanism behind chloride sensing, serving to expand this class of fluorescence sensors with improved chloride sensitivities. Thus, citrate-based fluorescent materials may enable low-cost, automated multi-analysis systems for simpler, yet accurate, point-of-care diagnostics that can be readily translated into clinical settings. More broadly, a wide range of medical, industrial, and environmental applications can be achieved with such a facile synthesis platform, demonstrated in our citrate-based biodegradable polymers with intrinsic fluorescence sensing.
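
    The dynamic quenching mentioned above is commonly described by the Stern-Volmer relation F0/F = 1 + Ksv[Cl-]. A minimal sketch of how a quenching constant could be fitted and then inverted to read out chloride is given below; the calibration data are hypothetical, not values from the paper.

```python
import numpy as np

# Hypothetical calibration data: chloride concentration (mM) vs. measured
# fluorescence intensity of a citrate-derived fluorophore (arbitrary units).
cl_mM = np.array([0.0, 10.0, 20.0, 40.0, 80.0])
intensity = np.array([1000.0, 910.0, 835.0, 715.0, 555.0])

# Stern-Volmer relation for dynamic quenching: F0 / F = 1 + Ksv * [Cl-]
f0 = intensity[0]
sv_ratio = f0 / intensity - 1.0

# Linear least-squares fit through the origin gives the Stern-Volmer constant.
ksv = np.sum(sv_ratio * cl_mM) / np.sum(cl_mM ** 2)   # in mM^-1

def chloride_from_intensity(f):
    """Invert the Stern-Volmer relation to estimate [Cl-] from intensity f."""
    return (f0 / f - 1.0) / ksv

print(f"Ksv = {ksv:.4f} mM^-1")
print(f"Estimated [Cl-] at F = 650: {chloride_from_intensity(650.0):.1f} mM")
```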

  18. The microeconomics of residential photovoltaics: Tariffs, network operation and maintenance, and ancillary services in distribution-level electricity markets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boero, Riccardo; Backhaus, Scott N.; Edwards, Brian K.

    Here, we develop a microeconomic model of a distribution-level electricity market that takes explicit account of residential photovoltaics (PV) adoption. The model allows us to study the consequences of most tariffs on PV adoption and the consequences of increased residential PV adoption under the assumption of economic sustainability for electric utilities. We also validate the model using U.S. data and extend it to consider different pricing schemes for operation and maintenance costs of the distribution network and for ancillary services. Results show that net metering promotes more environmental benefits and social welfare than other tariffs. But, if costs to operate the distribution network increase, net metering will amplify the unequal distribution of surplus among households. In conclusion, maintaining the economic sustainability of electric utilities under net metering may become extremely difficult unless the uneven distribution of surplus is legitimated by environmental benefits.

  19. The microeconomics of residential photovoltaics: Tariffs, network operation and maintenance, and ancillary services in distribution-level electricity markets

    DOE PAGES

    Boero, Riccardo; Backhaus, Scott N.; Edwards, Brian K.

    2016-11-12

    Here, we develop a microeconomic model of a distribution-level electricity market that takes explicit account of residential photovoltaics (PV) adoption. The model allows us to study the consequences of most tariffs on PV adoption and the consequences of increased residential PV adoption under the assumption of economic sustainability for electric utilities. We also validate the model using U.S. data and extend it to consider different pricing schemes for operation and maintenance costs of the distribution network and for ancillary services. Results show that net metering promotes more environmental benefits and social welfare than other tariffs. But, if costs to operate the distribution network increase, net metering will amplify the unequal distribution of surplus among households. In conclusion, maintaining the economic sustainability of electric utilities under net metering may become extremely difficult unless the uneven distribution of surplus is legitimated by environmental benefits.

  20. Two-Level Weld-Material Homogenization for Efficient Computational Analysis of Welded Structure Blast-Survivability

    NASA Astrophysics Data System (ADS)

    Grujicic, M.; Arakere, G.; Hariharan, A.; Pandurangan, B.

    2012-06-01

    The introduction of newer joining technologies like the so-called friction-stir welding (FSW) into automotive engineering requires knowledge of the joint-material microstructure and properties. Since the development of vehicles (including military vehicles capable of surviving blast and ballistic impacts) nowadays involves extensive use of computational engineering analyses (CEA), robust high-fidelity material models are needed for the FSW joints. A two-level material-homogenization procedure is proposed and utilized in this study to help manage computational cost and computer storage requirements for such CEAs. The method utilizes experimental (microstructure, microhardness, tensile testing, and x-ray diffraction) data to construct: (a) the material model for each weld zone and (b) the material model for the entire weld. The procedure is validated by comparing its predictions with the predictions of more detailed but more costly computational analyses.

  1. Optimization of space system development resources

    NASA Astrophysics Data System (ADS)

    Kosmann, William J.; Sarkani, Shahram; Mazzuchi, Thomas

    2013-06-01

    NASA has had a decades-long problem with cost growth during the development of space science missions. Numerous agency-sponsored studies have produced average mission-level cost growths ranging from 23% to 77%. A new study of 26 historical NASA science instrument set developments that used expert judgment to reallocate key development resources finds an average cost growth of 73.77%. Twice in history, a barter-based mechanism has been used to reallocate key development resources during instrument development. The mean instrument set development cost growth was -1.55%. A bivariate inference on the means of these two distributions provides statistical evidence to support the claim that using a barter-based mechanism to reallocate key instrument development resources will result in a lower expected cost growth than using the expert judgment approach. Agent-based discrete event simulation is the natural way to model a trade environment. A NetLogo agent-based barter-based simulation of science instrument development was created. The agent-based model was validated against the Cassini historical example, as the starting and ending instrument development conditions are available. The resulting validated agent-based barter-based science instrument resource reallocation simulation was used to perform 300 instrument development simulations, using barter to reallocate development resources. The mean cost growth was -3.365%. A bivariate inference on the means showed that additional significant statistical evidence exists to support the claim that using barter-based resource reallocation will result in lower expected cost growth, with respect to the historical expert judgment approach. Barter-based key development resource reallocation should work on spacecraft development as well as it has worked on instrument development. A new study of 28 historical NASA science spacecraft developments has an average cost growth of 46.04%. As barter-based key development resource reallocation has never been tried in a spacecraft development, no historical results exist, and a simulation of that approach must be developed. The instrument development simulation should be modified to account for spacecraft development market participant differences. The resulting agent-based barter-based spacecraft resource reallocation simulation would then be used to determine if significant statistical evidence exists to support the claim that using barter-based resource reallocation will result in lower expected cost growth.
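
    The "bivariate inference on the means" amounts to a two-sample comparison of mean cost growth between the expert-judgment and barter-based distributions. A minimal sketch with hypothetical cost-growth samples (the study's raw data are not reproduced here) follows.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical cost-growth samples (as fractions), standing in for the 26
# expert-judgment developments and the 300 simulated barter-based runs.
expert_judgment = rng.normal(loc=0.74, scale=0.50, size=26)   # ~73.77% mean growth
barter_based = rng.normal(loc=-0.03, scale=0.10, size=300)    # ~-3.365% mean growth

# Welch's t-test (unequal variances) comparing the two means.
t_stat, p_value = stats.ttest_ind(expert_judgment, barter_based, equal_var=False)

print(f"mean expert judgment: {expert_judgment.mean():.3f}")
print(f"mean barter based:    {barter_based.mean():.3f}")
print(f"t = {t_stat:.2f}, one-sided p = {p_value / 2:.3g}")
```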

  2. Cost-effectiveness analysis of ranibizumab plus prompt or deferred laser or triamcinolone plus prompt laser for diabetic macular edema.

    PubMed

    Dewan, Vinay; Lambert, Dennis; Edler, Joshua; Kymes, Steven; Apte, Rajendra S

    2012-08-01

    To perform a cost-effectiveness analysis of the treatment of diabetic macular edema (DME) with ranibizumab plus prompt or deferred laser versus triamcinolone plus prompt laser. Data for the analysis were drawn from reports of the Diabetic Retinopathy Clinical Research Network (DRCRnet) Protocol I. Computer simulation based on Protocol I data. Analyses were conducted from the payor perspective. Simulated participants were assigned characteristics reflecting those seen in Protocol I. Markov models were constructed to replicate Protocol I's 104-week outcomes using a microsimulation approach to estimation. Baseline characteristics, visual acuity (VA), treatments, and complications were based on Protocol I data. Costs were identified by literature search. One-way sensitivity analysis was performed, and the results were validated against Protocol I data. Direct cost of care for 2 years, change in VA from baseline, and incremental cost-effectiveness ratio (ICER) measured as cost per additional letter gained from baseline (Early Treatment of Diabetic Retinopathy Study). For sham plus laser (S+L), ranibizumab plus prompt laser (R+pL), ranibizumab plus deferred laser (R+dL), and triamcinolone plus laser (T+L), effectiveness through 104 weeks was predicted to be 3.46, 7.07, 8.63, and 2.40 letters correct, respectively. The ICER values in terms of dollars per VA letter were $393 (S+L vs. T+L), $5943 (R+pL vs. S+L), and $20 (R+dL vs. R+pL). For pseudophakics, the ICER value for the comparison of triamcinolone with laser versus ranibizumab with deferred laser was $14,690 per letter gained. No clinically relevant changes in model variables altered outcomes. Internal validation demonstrated good similarity to Protocol I treatment patterns. In treatment of phakic patients with DME, ranibizumab with deferred laser provided an additional 6 letters correct compared with triamcinolone with laser at an additional cost of $19,216 over 2 years. That would indicate that if the gain in VA seen at 2 years is maintained in subsequent years, then the treatment of phakic patients with DME using ranibizumab may meet accepted standards of cost-effectiveness. For pseudophakic patients, first-line treatment with triamcinolone seems to be the most cost-effective option. Copyright © 2012 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
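
    As an illustration of how the reported dollars-per-letter ICERs are formed, the sketch below recomputes cost per additional letter gained for two hypothetical arms; the cost and effectiveness figures are placeholders, not Protocol I inputs.

```python
# Incremental cost-effectiveness ratio (ICER) between two strategies:
# ICER = (cost_new - cost_ref) / (effect_new - effect_ref), here $ per letter gained.

def icer(cost_ref, effect_ref, cost_new, effect_new):
    """Cost per additional unit of effect of 'new' versus 'ref'."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# Hypothetical 2-year costs ($) and effectiveness (letters gained) for two arms.
reference_arm = {"cost": 10_000.0, "effect": 3.0}
new_arm = {"cost": 28_000.0, "effect": 8.0}

value = icer(reference_arm["cost"], reference_arm["effect"],
             new_arm["cost"], new_arm["effect"])
print(f"ICER: ${value:,.0f} per additional letter gained")
```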

  3. PRA (Probabilistic Risk Assessments) Participation versus Validation

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Banke, Richard

    2013-01-01

    Probabilistic Risk Assessments (PRAs) are performed for projects or programs where the consequences of failure are highly undesirable. PRAs primarily address the level of risk those projects or programs pose during operations. PRAs are often developed after the design has been completed. Design and operational details used to develop models include approved and accepted design information regarding equipment, components, systems and failure data. This methodology basically validates the risk parameters of the project or system design. For high risk or high dollar projects, using PRA methodologies during the design process provides new opportunities to influence the design early in the project life cycle to identify, eliminate or mitigate potential risks. Identifying risk drivers before the design has been set allows the design engineers to understand the inherent risk of their current design and consider potential risk mitigation changes. This can become an iterative process where the PRA model can be used to determine if the mitigation technique is effective in reducing risk. This can result in more efficient and cost-effective design changes. PRA methodology can be used to assess the risk of design alternatives and can demonstrate how major design changes or program modifications impact the overall program or project risk. PRA has been used for the last two decades to validate risk predictions and acceptability. By providing risk information that can positively influence final system and equipment design, the PRA tool can also participate in design development, supporting a safe and cost-effective product.

  4. Smart Water Conservation System for Irrigated Landscape. ESTCP Cost and Performance Report

    DTIC Science & Technology

    2016-10-01

    … water use by as much as 70% in support of meeting EO 13693. Additional performance objectives were to validate energy reduction, cost effectiveness, and system reliability while maintaining satisfactory plant health. … The demonstration was conducted for two different climatic regions in the southwestern part of the United States (U.S.), where a typical …

  5. The effect of time synchronization of wireless sensors on the modal analysis of structures

    NASA Astrophysics Data System (ADS)

    Krishnamurthy, V.; Fowler, K.; Sazonov, E.

    2008-10-01

    Driven by the need to reduce the installation cost and maintenance cost of structural health monitoring (SHM) systems, wireless sensor networks (WSNs) are becoming increasingly popular. Perfect time synchronization amongst the wireless sensors is a key factor enabling the use of low-cost, low-power WSNs for structural health monitoring applications based on output-only modal analysis of structures. In this paper we present a theoretical framework for analysis of the impact created by time delays in the measured system response on the reconstruction of mode shapes using the popular frequency domain decomposition (FDD) technique. This methodology directly estimates the change in mode shape values based on sensor synchronicity. We confirm the proposed theoretical model by experimental validation in modal identification experiments performed on an aluminum beam. The experimental validation was performed using a wireless intelligent sensor and actuator network (WISAN) which allows for close time synchronization between sensors (0.6-10 µs in the tested configuration) and guarantees lossless data delivery under normal conditions. The experimental results closely match theoretical predictions and show that even very small delays in output response impact the mode shapes.
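
    A sensor-to-sensor synchronization offset Δt appears in the cross-spectra used by FDD as a phase rotation of roughly 2πfΔt at a mode frequency f. The sketch below, under that simplified pure-time-shift assumption, shows how quickly the rotation grows with the delay.

```python
import numpy as np

def mode_shape_phase_error_deg(freq_hz, delay_s):
    """Phase rotation (degrees) of a mode-shape component caused by a
    time-synchronization offset 'delay_s' at modal frequency 'freq_hz',
    assuming the offset acts as a pure time shift of one sensor's output."""
    return np.degrees(2.0 * np.pi * freq_hz * delay_s)

# Example: a beam bending mode near 25 Hz, for delays from microseconds
# (wireless sync error) up to a millisecond.
for delay in (1e-6, 10e-6, 1e-3):
    err = mode_shape_phase_error_deg(25.0, delay)
    print(f"delay = {delay:8.1e} s -> phase error = {err:7.3f} deg")
```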

  6. Load leveling on industrial refrigeration systems

    NASA Astrophysics Data System (ADS)

    Bierenbaum, H. S.; Kraus, A. D.

    1982-01-01

    A computer model was constructed of a brewery with a 2000 horsepower compressor/refrigeration system. The various conservation and load management options were simulated using the validated model. The savings available for implementing the most promising options were verified by trials in the brewery. Results show that an optimized methodology for implementing load leveling and energy conservation consisted of: (1) adjusting (or tuning) refrigeration system controller variables to minimize unnecessary compressor starts, (2) carefully controlling (modulating) the primary refrigeration system operating parameters, compressor suction pressure, and discharge pressure, to satisfy product quality constraints (as well as in-process material cooling rates and temperature levels), (3) evaluating the energy cost savings associated with reject heat recovery, and (4) deciding whether to implement the reject heat recovery system based on a cost/benefit analysis.

  7. Crowdtruth validation: a new paradigm for validating algorithms that rely on image correspondences.

    PubMed

    Maier-Hein, Lena; Kondermann, Daniel; Roß, Tobias; Mersmann, Sven; Heim, Eric; Bodenstedt, Sebastian; Kenngott, Hannes Götz; Sanchez, Alexandro; Wagner, Martin; Preukschas, Anas; Wekerle, Anna-Laura; Helfert, Stefanie; März, Keno; Mehrabi, Arianeb; Speidel, Stefanie; Stock, Christian

    2015-08-01

    Feature tracking and 3D surface reconstruction are key enabling techniques for computer-assisted minimally invasive surgery. One of the major bottlenecks related to training and validation of new algorithms is the lack of large amounts of annotated images that fully capture the wide range of anatomical/scene variance in clinical practice. To address this issue, we propose a novel approach to obtaining large numbers of high-quality reference image annotations at low cost in an extremely short period of time. The concept is based on outsourcing the correspondence search to a crowd of anonymous users from an online community (crowdsourcing) and comprises four stages: (1) feature detection, (2) correspondence search via crowdsourcing, (3) merging multiple annotations per feature by fitting Gaussian finite mixture models, and (4) outlier removal using the result of the clustering as input for a second annotation task. On average, 10,000 annotations were obtained within 24 h at a cost of $100. The annotation of the crowd after clustering and before outlier removal was of expert quality with a median distance of about 1 pixel to a publicly available reference annotation. The threshold for the outlier removal task directly determines the maximum annotation error, but also the number of points removed. Our concept is a novel and effective method for fast, low-cost and highly accurate correspondence generation that could be adapted to various other applications related to large-scale data annotation in medical image computing and computer-assisted interventions.
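
    Stage (3), merging multiple crowd clicks per feature with a Gaussian mixture, can be sketched as follows; the two-component mixture and the click coordinates are illustrative assumptions rather than the authors' exact configuration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Hypothetical crowd clicks (pixel coordinates) for one feature: most cluster
# around the true correspondence, a few are far-off outliers.
good_clicks = rng.normal(loc=[120.0, 80.0], scale=1.5, size=(20, 2))
outliers = rng.uniform(low=0.0, high=300.0, size=(3, 2))
clicks = np.vstack([good_clicks, outliers])

# Fit a two-component Gaussian mixture: one tight "consensus" component,
# one broad "outlier" component.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
labels = gmm.fit_predict(clicks)

# Keep the component with the smallest spread (the consensus cluster) and
# use its member clicks' mean as the merged annotation.
spreads = [np.trace(c) for c in gmm.covariances_]
consensus = int(np.argmin(spreads))
merged = clicks[labels == consensus].mean(axis=0)
print("merged annotation (px):", np.round(merged, 1))
```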

  8. A model of head-related transfer functions based on a state-space analysis

    NASA Astrophysics Data System (ADS)

    Adams, Norman Herkamp

    This dissertation develops and validates a novel state-space method for binaural auditory display. Binaural displays seek to immerse a listener in a 3D virtual auditory scene with a pair of headphones. The challenge for any binaural display is to compute the two signals to supply to the headphones. The present work considers a general framework capable of synthesizing a wide variety of auditory scenes. The framework models collections of head-related transfer functions (HRTFs) simultaneously. This framework improves the flexibility of contemporary displays, but it also compounds the steep computational cost of the display. The cost is reduced dramatically by formulating the collection of HRTFs in the state-space and employing order-reduction techniques to design efficient approximants. Order-reduction techniques based on the Hankel-operator are found to yield accurate low-cost approximants. However, the inter-aural time difference (ITD) of the HRTFs degrades the time-domain response of the approximants. Fortunately, this problem can be circumvented by employing a state-space architecture that allows the ITD to be modeled outside of the state-space. Accordingly, three state-space architectures are considered. Overall, a multiple-input, single-output (MISO) architecture yields the best compromise between performance and flexibility. The state-space approximants are evaluated both empirically and psychoacoustically. An array of truncated FIR filters is used as a pragmatic reference system for comparison. For a fixed cost bound, the state-space systems yield lower approximation error than FIR arrays for D>10, where D is the number of directions in the HRTF collection. A series of headphone listening tests are also performed to validate the state-space approach, and to estimate the minimum order N of indiscriminable approximants. For D = 50, the state-space systems yield order thresholds less than half those of the FIR arrays. Depending upon the stimulus uncertainty, a minimum state-space order of 7≤N≤23 appears to be adequate. In conclusion, the proposed state-space method enables a more flexible and immersive binaural display with low computational cost.
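
    The Hankel-operator order reduction described above starts from the Hankel singular values of the state-space system. A minimal sketch for a stable discrete-time system, computed via the Lyapunov-equation route, is shown below; balancing and the actual truncation step are omitted, and the example system is arbitrary rather than an HRTF filter.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

def hankel_singular_values(A, B, C):
    """Hankel singular values of a stable discrete-time system (A, B, C)."""
    Wc = solve_discrete_lyapunov(A, B @ B.T)        # controllability Gramian
    Wo = solve_discrete_lyapunov(A.T, C.T @ C)      # observability Gramian
    eigvals = np.linalg.eigvals(Wc @ Wo)
    return np.sort(np.sqrt(np.abs(eigvals.real)))[::-1]

# Hypothetical stable 6th-order single-input, single-output system.
rng = np.random.default_rng(2)
A = np.diag(rng.uniform(0.3, 0.9, size=6))
B = rng.normal(size=(6, 1))
C = rng.normal(size=(1, 6))

hsv = hankel_singular_values(A, B, C)
print("Hankel singular values:", np.round(hsv, 4))
# States whose Hankel singular values are negligible are candidates for
# truncation when building a low-order approximant.
```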

  9. Start-up and operating costs for artisan cheese companies.

    PubMed

    Bouma, Andrea; Durham, Catherine A; Meunier-Goddik, Lisbeth

    2014-01-01

    Lack of valid economic data for artisan cheese making is a serious impediment to developing a realistic business plan and obtaining financing. The objective of this study was to determine approximate start-up and operating costs for an artisan cheese company. In addition, values are provided for the required size of processing and aging facilities associated with specific production volumes. Following in-depth interviews with existing artisan cheese makers, an economic model was developed to predict costs based on input variables such as production volume, production frequency, cheese types, milk types and cost, labor expenses, and financing. Estimated values for start-up cost for processing and aging facility ranged from $267,248 to $623,874 for annual production volumes of 3,402 kg (7,500 lb) and 27,216 kg (60,000 lb), respectively. First-year production costs ranged from $65,245 to $620,094 for the above-mentioned production volumes. It is likely that high start-up and operating costs remain a significant entry barrier for artisan cheese entrepreneurs. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
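
    Using only the two production volumes and start-up costs reported above, a rough estimate for intermediate volumes can be sketched by interpolation; the linear form is an illustrative assumption, not the paper's underlying cost model.

```python
import numpy as np

# Reported anchor points: annual production volume (kg) vs. start-up cost ($).
volumes_kg = np.array([3_402.0, 27_216.0])
startup_cost = np.array([267_248.0, 623_874.0])

def estimate_startup_cost(volume_kg):
    """Linearly interpolate start-up cost between the two reported volumes."""
    return float(np.interp(volume_kg, volumes_kg, startup_cost))

print(f"~${estimate_startup_cost(13_608):,.0f} for 13,608 kg/yr (30,000 lb/yr)")
```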

  10. The Missing Stakeholder Group: Why Patients Should be Involved in Health Economic Modelling.

    PubMed

    van Voorn, George A K; Vemer, Pepijn; Hamerlijnck, Dominique; Ramos, Isaac Corro; Teunissen, Geertruida J; Al, Maiwenn; Feenstra, Talitha L

    2016-04-01

    Evaluations of healthcare interventions, e.g. new drugs or other new treatment strategies, commonly include a cost-effectiveness analysis (CEA) that is based on the application of health economic (HE) models. As end users, patients are important stakeholders regarding the outcomes of CEAs, yet their knowledge of HE model development and application, or their involvement therein, is absent. This paper considers possible benefits and risks of patient involvement in HE model development and application for modellers and patients. An exploratory review of the literature has been performed on stakeholder-involved modelling in various disciplines. In addition, Dutch patient experts have been interviewed about their experience in, and opinion about, the application of HE models. Patients have little to no knowledge of HE models and are seldom involved in HE model development and application. Benefits of becoming involved would include a greater understanding and possible acceptance by patients of HE model application, improved model validation, and a more direct infusion of patient expertise. Risks would include patient bias and increased costs of modelling. Patient involvement in HE modelling seems to carry several benefits as well as risks. We claim that the benefits may outweigh the risks and that patients should become involved.

  11. MRAC Control with Prior Model Knowledge for Asymmetric Damaged Aircraft

    PubMed Central

    Zhang, Jing

    2015-01-01

    This paper develops a novel state-tracking multivariable model reference adaptive control (MRAC) technique utilizing prior knowledge of plant models to recover control performance of an aircraft with asymmetric structural damage. A modification of the linear model representation is given. With prior knowledge of structural damage, a polytope linear parameter varying (LPV) model is derived to cover all concerned damage conditions. An MRAC method is developed for the polytope model, of which the stability and asymptotic error convergence are theoretically proved. The proposed technique reduces the number of parameters to be adapted and thus decreases computational cost and requires less input information. The method is validated by simulations on the NASA generic transport model (GTM) with damage. PMID:26180839

  12. A low-cost three-dimensional laser surface scanning approach for defining body segment parameters.

    PubMed

    Pandis, Petros; Bull, Anthony Mj

    2017-11-01

    Body segment parameters are used in many different applications in ergonomics as well as in dynamic modelling of the musculoskeletal system. Body segment parameters can be defined using different methods, including techniques that involve time-consuming manual measurements of the human body, used in conjunction with models or equations. In this study, a scanning technique for measuring subject-specific body segment parameters in an easy, fast, accurate and low-cost way was developed and validated. The scanner can obtain the body segment parameters in a single scanning operation, which takes between 8 and 10 s. The results obtained with the system show a standard deviation of 2.5% in volumetric measurements of the upper limb of a mannequin and 3.1% difference between scanning volume and actual volume. Finally, the maximum mean error for the moment of inertia by scanning a standard-sized homogeneous object was 2.2%. This study shows that a low-cost system can provide quick and accurate subject-specific body segment parameter estimates.
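
    The scanner-derived body segment parameters ultimately come from integrating volume and inertia over the captured geometry. A minimal sketch for a voxelized segment with an assumed uniform density is given below; the voxel grid and density are illustrative, not the authors' processing pipeline.

```python
import numpy as np

def segment_parameters(voxels, voxel_size_m, density_kg_m3=1_000.0):
    """Volume, mass, centre of mass, and moment of inertia about the z-axis
    through the centre of mass for a boolean voxel grid of one body segment,
    assuming uniform density."""
    v = voxel_size_m ** 3
    coords = np.argwhere(voxels) * voxel_size_m      # voxel coordinates (m)
    volume = len(coords) * v
    mass = volume * density_kg_m3
    com = coords.mean(axis=0)
    r2 = (coords[:, 0] - com[0]) ** 2 + (coords[:, 1] - com[1]) ** 2
    izz = density_kg_m3 * v * r2.sum()               # Izz about the CoM
    return volume, mass, com, izz

# Hypothetical cylindrical "upper limb" segment on a 5 mm voxel grid.
grid = np.zeros((60, 60, 80), dtype=bool)
x, y = np.meshgrid(np.arange(60), np.arange(60), indexing="ij")
grid[(x - 30) ** 2 + (y - 30) ** 2 <= 15 ** 2, :] = True

vol, mass, com, izz = segment_parameters(grid, 0.005)
print(f"volume = {vol*1e3:.2f} L, mass = {mass:.2f} kg, Izz = {izz:.4f} kg m^2")
```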

  13. Cost-Sharing of Ecological Construction Based on Trapezoidal Intuitionistic Fuzzy Cooperative Games

    PubMed Central

    Liu, Jiacai; Zhao, Wenjian

    2016-01-01

    There exist some fuzziness and uncertainty in the process of ecological construction. The aim of this paper is to develop a direct and effective simplified method for obtaining the cost-sharing scheme when some interested parties form a cooperative coalition to improve the ecological environment of the Min River together. Firstly, we propose the solution concept of the least square prenucleolus of cooperative games with coalition values expressed by trapezoidal intuitionistic fuzzy numbers. Then, based on the squared numerical distance between two trapezoidal intuitionistic fuzzy numbers, we establish a corresponding quadratic programming model to obtain the least square prenucleolus, which can effectively avoid the information distortion and uncertainty enlargement brought about by the subtraction of trapezoidal intuitionistic fuzzy numbers. Finally, we give a numerical example about the cost-sharing of ecological construction in Fujian Province in China to show the validity, applicability, and advantages of the proposed model and method. PMID:27834830
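
    For a crisp (defuzzified) three-player version of such a cost game, the least square prenucleolus can be obtained by minimizing the sum of squared coalition excesses subject to efficiency. The sketch below uses hypothetical coalition costs; the paper itself works with trapezoidal intuitionistic fuzzy coalition values, which are not reproduced here.

```python
import itertools
import numpy as np
from scipy.optimize import minimize

players = (0, 1, 2)
# Hypothetical crisp coalition costs v(S) for an ecological construction game.
v = {(0,): 60.0, (1,): 80.0, (2,): 50.0,
     (0, 1): 120.0, (0, 2): 95.0, (1, 2): 110.0,
     (0, 1, 2): 150.0}

# All proper non-empty coalitions.
coalitions = [s for r in range(1, 3) for s in itertools.combinations(players, r)]

def objective(x):
    """Sum of squared excesses e(S, x) = v(S) - sum_{i in S} x_i."""
    return sum((v[s] - sum(x[i] for i in s)) ** 2 for s in coalitions)

efficiency = {"type": "eq", "fun": lambda x: sum(x) - v[(0, 1, 2)]}
res = minimize(objective, x0=np.full(3, v[(0, 1, 2)] / 3), constraints=[efficiency])
print("least square prenucleolus cost shares:", np.round(res.x, 2))
```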

  14. Investigating the performance of wavelet neural networks in ionospheric tomography using IGS data over Europe

    NASA Astrophysics Data System (ADS)

    Ghaffari Razin, Mir Reza; Voosoghi, Behzad

    2017-04-01

    Ionospheric tomography is a very cost-effective method that is frequently used to model electron density distributions. In this paper, a residual minimization training neural network (RMTNN) is used in voxel-based ionospheric tomography. Because a wavelet neural network (WNN) with a back-propagation (BP) algorithm is used within the RMTNN method, the new method is named modified RMTNN (MRMTNN). To train the WNN with the BP algorithm, two cost functions are defined: total and vertical cost functions. By minimizing these cost functions, temporal and spatial ionospheric variations are studied. GPS measurements from the International GNSS Service (IGS) in central Europe have been used to construct a 3-D image of the electron density. Three days (2009.04.15, 2011.07.20 and 2013.06.01) with different solar activity indices are used for the processing. To validate and better assess the reliability of the proposed method, 4 ionosonde stations and 3 testing stations have been used. The results of MRMTNN have also been compared with those of the RMTNN method, the international reference ionosphere model 2012 (IRI-2012) and the spherical cap harmonic (SCH) method as a local ionospheric model. The comparison of MRMTNN results with the RMTNN, IRI-2012 and SCH models shows that the root mean square error (RMSE) and standard deviation of the proposed approach are superior to those of the traditional method.

  15. A mechanistic, globally-applicable model of plant nitrogen uptake, retranslocation and fixation

    NASA Astrophysics Data System (ADS)

    Fisher, J. B.; Tan, S.; Malhi, Y.; Fisher, R. A.; Sitch, S.; Huntingford, C.

    2008-12-01

    Nitrogen is one of the nutrients that can most limit plant growth, and nitrogen availability may be a controlling factor on biosphere responses to climate change. We developed a plant nitrogen assimilation model based on a) advective transport through the transpiration stream, b) retranslocation whereby carbon is expended to resorb nitrogen from leaves, c) active uptake whereby carbon is expended to acquire soil nitrogen, and d) biological nitrogen fixation whereby carbon is expended for symbiotic nitrogen fixers. The model relies on 9 inputs: 1) net primary productivity (NPP), 2) plant C:N ratio, 3) available soil nitrogen, 4) root biomass, 5) transpiration rate, 6) saturated soil depth, 7) leaf nitrogen before senescence, 8) soil temperature, and 9) ability to fix nitrogen. A carbon cost of retranslocation is estimated based on leaf nitrogen and compared to an active uptake carbon cost based on root biomass and available soil nitrogen; for nitrogen fixers both costs are compared to a carbon cost of fixation dependent on soil temperature. The NPP is then allocated to optimize growth while maintaining the C:N ratio. The model outputs are total plant nitrogen uptake, remaining NPP available for growth, carbon respired to the soil and updated available soil nitrogen content. We test and validate the model (called FUN: Fixation and Uptake of Nitrogen) against data from the UK, Germany and Peru, and run the model under simplified scenarios of primary succession and climate change. FUN is suitable for incorporation into a land surface scheme of a General Circulation Model and will be coupled with a soil model and dynamic global vegetation model as part of a land surface model (JULES).
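
    The allocation logic compares the carbon cost of each nitrogen acquisition pathway and spends NPP on the cheapest one. The sketch below illustrates only that decision step, with entirely hypothetical cost functions rather than the published FUN formulations.

```python
def cheapest_nitrogen_pathway(leaf_n, root_biomass, soil_n, soil_temp_c, fixer):
    """Return the nitrogen acquisition pathway with the lowest carbon cost
    (kg C per kg N). All three cost functions below are illustrative
    placeholders, not the published FUN formulations."""
    costs = {
        # Resorption gets cheaper when leaves hold more nitrogen.
        "retranslocation": 20.0 / max(leaf_n, 1e-6),
        # Active uptake gets cheaper with more roots and more available soil N.
        "active_uptake": 50.0 / (max(root_biomass, 1e-6) * max(soil_n, 1e-6)),
    }
    if fixer:
        # Fixation cost falls as soil temperature approaches an assumed optimum.
        costs["fixation"] = 8.0 + 0.2 * abs(soil_temp_c - 25.0)
    return min(costs, key=costs.get), costs

pathway, costs = cheapest_nitrogen_pathway(
    leaf_n=1.5, root_biomass=0.8, soil_n=3.0, soil_temp_c=18.0, fixer=True)
print("chosen pathway:", pathway)
print({name: round(c, 2) for name, c in costs.items()})
```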

  16. Predictive modeling of addiction lapses in a mobile health application.

    PubMed

    Chih, Ming-Yuan; Patton, Timothy; McTavish, Fiona M; Isham, Andrew J; Judkins-Fisher, Chris L; Atwood, Amy K; Gustafson, David H

    2014-01-01

    The chronically relapsing nature of alcoholism leads to substantial personal, family, and societal costs. Addiction-comprehensive health enhancement support system (A-CHESS) is a smartphone application that aims to reduce relapse. To offer targeted support to patients who are at risk of lapses within the coming week, a Bayesian network model to predict such events was constructed using responses on 2,934 weekly surveys (called the Weekly Check-in) from 152 alcohol-dependent individuals who recently completed residential treatment. The Weekly Check-in is a self-monitoring service, provided in A-CHESS, to track patients' recovery progress. The model showed good predictability, with the area under receiver operating characteristic curve of 0.829 in the 10-fold cross-validation and 0.912 in the external validation. The sensitivity/specificity table assists the tradeoff decisions necessary to apply the model in practice. This study moves us closer to the goal of providing lapse prediction so that patients might receive more targeted and timely support. © 2013.
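
    The reported 10-fold cross-validated area under the ROC curve can be reproduced in outline with scikit-learn. The sketch below uses a logistic-regression stand-in for the Bayesian network and synthetic weekly-survey features, since the A-CHESS data are not public.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for 2,934 weekly check-in records with a binary
# "lapse within the coming week" label.
X, y = make_classification(n_samples=2934, n_features=12, weights=[0.85, 0.15],
                           random_state=0)

# Logistic regression as a placeholder for the Bayesian network classifier.
model = LogisticRegression(max_iter=1000)
auc = cross_val_score(model, X, y, cv=10, scoring="roc_auc")
print(f"10-fold cross-validated AUC: {auc.mean():.3f} +/- {auc.std():.3f}")
```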

  17. Studies and methodologies on vaginal drug permeation.

    PubMed

    Machado, Rita Monteiro; Palmeira-de-Oliveira, Ana; Gaspar, Carlos; Martinez-de-Oliveira, José; Palmeira-de-Oliveira, Rita

    2015-09-15

    The vagina stands as an important alternative to the oral route for those systemic drugs that are poorly absorbed orally or are rapidly metabolized by the liver. Drug permeation through the vaginal tissue can be estimated by using in vitro, ex vivo and in vivo models. The latter, although more realistic, entail ethical and biological limitations due to animal handling. Therefore, in vitro and ex vivo models have been developed to predict drug absorption through the vagina while allowing for simultaneous toxicity and pathogenesis studies. This review focuses on available methodologies to study vaginal drug permeation, discussing their advantages and drawbacks. The technical complexity, costs and ethical issues of an available model, along with its accuracy and reproducibility, will determine whether it is valid and applicable. Therefore, every model should be evaluated, validated and standardized in order to allow extrapolation and prediction of results, thereby improving vaginal drug research and underlining its benefits. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Predictive Modeling of Addiction Lapses in a Mobile Health Application

    PubMed Central

    Chih, Ming-Yuan; Patton, Timothy; McTavish, Fiona M.; Isham, Andrew; Judkins-Fisher, Chris L.; Atwood, Amy K.; Gustafson, David H.

    2013-01-01

    The chronically relapsing nature of alcoholism leads to substantial personal, family, and societal costs. Addiction-Comprehensive Health Enhancement Support System (A-CHESS) is a smartphone application that aims to reduce relapse. To offer targeted support to patients who are at risk of lapses within the coming week, a Bayesian network model to predict such events was constructed using responses on 2,934 weekly surveys (called the Weekly Check-in) from 152 alcohol-dependent individuals who recently completed residential treatment. The Weekly Check-in is a self-monitoring service, provided in A-CHESS, to track patients’ recovery progress. The model showed good predictability, with the area under receiver operating characteristic curve of 0.829 in the 10-fold cross-validation and 0.912 in the external validation. The sensitivity/specificity table assists the tradeoff decisions necessary to apply the model in practice. This study moves us closer to the goal of providing lapse prediction so that patients might receive more targeted and timely support. PMID:24035143

  19. Branch-and-Bound algorithm applied to uncertainty quantification of a Boiling Water Reactor Station Blackout

    DOE PAGES

    Nielsen, Joseph; Tokuhiro, Akira; Hiromoto, Robert; ...

    2015-11-13

    Evaluation of the impacts of uncertainty and sensitivity in modeling presents a significant set of challenges, in particular for high-fidelity modeling. Computational costs and validation of models create a need for cost-effective decision making with regard to experiment design. Experiments designed to validate computation models can be used to reduce uncertainty in the physical model. In some cases, large uncertainty in a particular aspect of the model may or may not have a large impact on the final results. For example, modeling of a relief valve may result in large uncertainty; however, the actual effects on final peak clad temperature in a reactor transient may be small, and the large uncertainty with respect to valve modeling may be considered acceptable. Additionally, the ability to determine the adequacy of a model and the validation supporting it should be considered within a risk-informed framework. Low-fidelity modeling with large uncertainty may be considered adequate if the uncertainty is considered acceptable with respect to risk. In other words, models that are used to evaluate the probability of failure should be evaluated more rigorously with the intent of increasing safety margin. Probabilistic risk assessment (PRA) techniques have traditionally been used to identify accident conditions and transients. Traditional classical event tree methods utilize analysts’ knowledge and experience to identify the important timing of events in coordination with thermal-hydraulic modeling. These methods lack the capability to evaluate complex dynamic systems. In these systems, time and energy scales associated with transient events may vary as a function of transition times and energies to arrive at a different physical state. Dynamic PRA (DPRA) methods provide a more rigorous analysis of complex dynamic systems. Unfortunately, DPRA methods introduce issues associated with combinatorial explosion of states. This study presents a methodology to address combinatorial explosion using a Branch-and-Bound algorithm applied to Dynamic Event Trees (DET), which utilize LENDIT (L – Length, E – Energy, N – Number, D – Distribution, I – Information, and T – Time) as well as a set theory to describe system, state, resource, and response (S2R2) sets to create bounding functions for the DET. The optimization of the DET in identifying high probability failure branches is extended to create a Phenomenological Identification and Ranking Table (PIRT) methodology to evaluate modeling parameters important to safety of those failure branches that have a high probability of failure. The PIRT can then be used as a tool to identify and evaluate the need for experimental validation of models that have the potential to reduce risk. Finally, in order to demonstrate this methodology, a Boiling Water Reactor (BWR) Station Blackout (SBO) case study is presented.
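
    The branch-and-bound idea applied to dynamic event trees can be sketched, in a deliberately simplified form, as a depth-first enumeration of branch sequences that prunes any partial branch whose probability (an upper bound on any completion) already falls below a cut-off. The events, probabilities and cut-off below are illustrative assumptions, not values from the BWR case study.

```python
# Each event has branch outcomes with probabilities; a path through all events
# is one DET branch. Prune partial paths whose running probability is already
# below the cut-off, since extending a path can only lower its probability.
EVENTS = [
    {"relief_valve_opens": 0.95, "relief_valve_stuck": 0.05},
    {"diesel_starts": 0.90, "diesel_fails": 0.10},
    {"injection_ok": 0.98, "injection_fails": 0.02},
]
CUTOFF = 1e-3

def high_probability_branches(level=0, path=(), prob=1.0):
    """Yield (path, probability) for complete branches above the cut-off."""
    if prob < CUTOFF:          # bound: prune this partial branch
        return
    if level == len(EVENTS):
        yield path, prob
        return
    for outcome, p in EVENTS[level].items():
        yield from high_probability_branches(level + 1, path + (outcome,), prob * p)

for branch, p in sorted(high_probability_branches(), key=lambda b: -b[1]):
    print(f"{p:.4f}  " + " -> ".join(branch))
```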

  20. How to Appropriately Extrapolate Costs and Utilities in Cost-Effectiveness Analysis.

    PubMed

    Bojke, Laura; Manca, Andrea; Asaria, Miqdad; Mahon, Ronan; Ren, Shijie; Palmer, Stephen

    2017-08-01

    Costs and utilities are key inputs into any cost-effectiveness analysis. Their estimates are typically derived from individual patient-level data collected as part of clinical studies the follow-up duration of which is often too short to allow a robust quantification of the likely costs and benefits a technology will yield over the patient's entire lifetime. In the absence of long-term data, some form of temporal extrapolation-to project short-term evidence over a longer time horizon-is required. Temporal extrapolation inevitably involves assumptions regarding the behaviour of the quantities of interest beyond the time horizon supported by the clinical evidence. Unfortunately, the implications for decisions made on the basis of evidence derived following this practice and the degree of uncertainty surrounding the validity of any assumptions made are often not fully appreciated. The issue is compounded by the absence of methodological guidance concerning the extrapolation of non-time-to-event outcomes such as costs and utilities. This paper considers current approaches to predict long-term costs and utilities, highlights some of the challenges with the existing methods, and provides recommendations for future applications. It finds that, typically, economic evaluation models employ a simplistic approach to temporal extrapolation of costs and utilities. For instance, their parameters (e.g. mean) are typically assumed to be homogeneous with respect to both time and patients' characteristics. Furthermore, costs and utilities have often been modelled to follow the dynamics of the associated time-to-event outcomes. However, cost and utility estimates may be more nuanced, and it is important to ensure extrapolation is carried out appropriately for these parameters.

  1. High Volume Pulsed EPC for T/R Modules in Satellite Constellation

    NASA Astrophysics Data System (ADS)

    Notarianni, Michael; Maynadier, Paul; Marin, Marc

    2014-08-01

    In the frame of the Iridium Next business, a mobile satellite service, Thales Alenia Space (TAS) has to produce more than 2400 x 65W and 162 x 250W pulsed Electronic Power Conditioners (EPC) to supply the RF transmit/receive modules that compose the active antenna of the satellites. The company has to deal with mass production constraints where cost, volume and performance are crucial factors. Compared to previous constellations realized by TAS, the overall challenge is to make further improvements in a short time: predictable electrical models, a deeper design-to-cost approach, and streamlined improvements and test coverage. As the active antenna drives the consumption of the payload, accurate performance has been evaluated early owing to the combined use of simulation (based on an average model) and breadboard tests. The necessary cost reduction has been achieved owing to the large use of COTS (Components Off The Shelf). In order to secure cost and schedule, each manufacturing step has been optimized to maximize test coverage in order to guarantee high reliability. At this time, more than 200 flight models have already been manufactured, validating this approach. This paper is focused on the 65W EPC, but the same activities have been led on the 250W EPC.

  2. Cost minimization in a full-scale conventional wastewater treatment plant: associated costs of biological energy consumption versus sludge production.

    PubMed

    Sid, S; Volant, A; Lesage, G; Heran, M

    2017-11-01

    Energy consumption and sludge production minimization represent rising challenges for wastewater treatment plants (WWTPs). The goal of this study is to investigate how energy is consumed throughout the whole plant and how operating conditions affect this energy demand. A WWTP based on the activated sludge process was selected as a case study. Simulations were performed using a pre-compiled model implemented in GPS-X simulation software. Model validation was carried out by comparing experimental and modeling data of the dynamic behavior of the mixed liquor suspended solids (MLSS) concentration and nitrogen compounds concentration, energy consumption for aeration, mixing and sludge treatment, and annual sludge production over a three-year exercise. In this plant, the energy required for bioreactor aeration was calculated at approximately 44% of the total energy demand. A cost optimization strategy was applied by varying the MLSS concentrations (from 1 to 8 gTSS/L) while recording energy consumption, sludge production and effluent quality. An increase of MLSS led to an increase of the oxygen requirement for biomass aeration, but it also reduced total sludge production. Results permit identification of a key MLSS concentration that offers the best compromise between the level of treatment required, biological energy demand and sludge production while minimizing the overall costs.

  3. Validated Feasibility Study of Integrally Stiffened Metallic Fuselage Panels for Reducing Manufacturing Costs: Cost Assessment of Manufacturing/Design Concepts

    NASA Technical Reports Server (NTRS)

    Metschan, S.

    2000-01-01

    The objective of the Integral Airframe Structures (IAS) program was to demonstrate, for an integrally stiffened structural concept, performance and weight equal to "built-up" structure with lower manufacturing cost. This report presents results of the cost assessment for several design configuration/manufacturing method combinations. The attributes of various cost analysis models were evaluated and COSTRAN selected for this study. A process/design cost evaluation matrix was developed based on material, forming, machining, and assembly of structural sub-elements and assembled structure. A hybrid design, made from high-speed machined extruded frames that are mechanically fastened to high-speed machined plate skin/stringer panels, was identified as the most cost-effective manufacturing solution. Recurring labor and material costs of the hybrid design are up to 61 percent less than the current built-up technology baseline. This would correspond to a total cost reduction of $1.7 million per ship set for a 777-sized airplane. However, there are important outstanding issues with regard to the cost of capacity of high technology machinery, and the ability to cost-effectively provide surface finish acceptable to the commercial aircraft industry. The projected high raw material cost of large extrusions also played an important role in the trade-off between plate and extruded concepts.

  4. A design procedure for the handling qualities optimization of the X-29A aircraft

    NASA Technical Reports Server (NTRS)

    Bosworth, John T.; Cox, Timothy H.

    1989-01-01

    The techniques used to improve the pitch-axis handling qualities of the X-29A wing-canard-planform fighter aircraft are reviewed. The aircraft and its FCS are briefly described, and the design method, which works within the existing FCS architecture, is characterized in detail. Consideration is given to the selection of design goals and design variables, the definition and calculation of the cost function, the validation of the mathematical model on the basis of flight-test data, and the validation of the improved design by means of nonlinear simulations. Flight tests of the improved design are shown to verify the simulation results.

  5. Validation of a numerical method for interface-resolving simulation of multicomponent gas-liquid mass transfer and evaluation of multicomponent diffusion models

    NASA Astrophysics Data System (ADS)

    Woo, Mino; Wörner, Martin; Tischer, Steffen; Deutschmann, Olaf

    2018-03-01

    The multicomponent model and the effective diffusivity model are well-established diffusion models for numerical simulation of single-phase flows consisting of several components but have so far seldom been used for two-phase flows. In this paper, a specific numerical model for interfacial mass transfer by means of a continuous single-field concentration formulation is combined with the multicomponent model and effective diffusivity model and is validated for multicomponent mass transfer. For this purpose, several test cases for one-dimensional physical or reactive mass transfer of ternary mixtures are considered. The numerical results are compared with analytical or numerical solutions of the Maxwell-Stefan equations and/or experimental data. The composition-dependent elements of the diffusivity matrix of the multicomponent and effective diffusivity model are found to substantially differ for non-dilute conditions. The species mole fraction or concentration profiles computed with both diffusion models are, however, for all test cases very similar and in good agreement with the analytical/numerical solutions or measurements. For practical computations, the effective diffusivity model is recommended due to its simplicity and lower computational costs.
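
    One common way to build an effective diffusivity from the Maxwell-Stefan binary diffusivities is the Wilke-type mixture rule D_i,eff = (1 - x_i) / sum_{j != i} (x_j / D_ij). The sketch below evaluates it for a hypothetical ternary mixture; the coefficients are placeholders, not the paper's test cases.

```python
import numpy as np

def effective_diffusivities(x, D_binary):
    """Wilke-type effective diffusivities D_i,eff = (1 - x_i) / sum_j (x_j / D_ij)
    for mole fractions x and a symmetric binary diffusivity matrix D_binary."""
    n = len(x)
    d_eff = np.zeros(n)
    for i in range(n):
        denom = sum(x[j] / D_binary[i, j] for j in range(n) if j != i)
        d_eff[i] = (1.0 - x[i]) / denom
    return d_eff

# Hypothetical ternary mixture: mole fractions and binary diffusivities (m^2/s).
x = np.array([0.2, 0.3, 0.5])
D = np.array([[0.0,    2.0e-5, 1.5e-5],
              [2.0e-5, 0.0,    1.0e-5],
              [1.5e-5, 1.0e-5, 0.0]])

print("effective diffusivities (m^2/s):", effective_diffusivities(x, D))
```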

  6. Foundation Heat Exchanger Final Report: Demonstration, Measured Performance, and Validated Model and Design Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Patrick; Im, Piljae

    2012-04-01

    Geothermal heat pumps, sometimes called ground-source heat pumps (GSHPs), have been proven capable of significantly reducing energy use and peak demand in buildings. Conventional equipment for controlling the temperature and humidity of a building, or supplying hot water and fresh outdoor air, must exchange energy (or heat) with the building's outdoor environment. Equipment using the ground as a heat source and heat sink consumes less non-renewable energy (electricity and fossil fuels) because the earth is cooler than outdoor air in summer and warmer in winter. The most important barrier to rapid growth of the GSHP industry is high first cost of GSHP systems to consumers. The most common GSHP system utilizes a closed-loop ground heat exchanger. This type of GSHP system can be used almost anywhere. There is reason to believe that reducing the cost of closed-loop systems is the strategy that would achieve the greatest energy savings with GSHP technology. The cost premium of closed-loop GSHP systems over conventional space conditioning and water heating systems is primarily associated with drilling boreholes or excavating trenches, installing vertical or horizontal ground heat exchangers, and backfilling the excavations. This project investigates reducing the cost of horizontal closed-loop ground heat exchangers by installing them in the construction excavations, augmented when necessary with additional trenches. This approach applies only to new construction of residential and light commercial buildings or additions to such buildings. In the business-as-usual scenario, construction excavations are not used for the horizontal ground heat exchanger (HGHX); instead the HGHX is installed entirely in trenches dug specifically for that purpose. The potential cost savings comes from using the construction excavations for the installation of ground heat exchangers, thereby minimizing the need and expense of digging additional trenches. The term foundation heat exchanger (FHX) has been coined to refer exclusively to ground heat exchangers installed in the overcut around the basement walls. The primary technical challenge undertaken by this project was the development and validation of energy performance models and design tools for FHX. In terms of performance modeling and design, ground heat exchangers in other construction excavations (e.g., utility trenches) are no different from conventional HGHX, and models and design tools for HGHX already exist. This project successfully developed and validated energy performance models and design tools so that FHX or hybrid FHX/HGHX systems can be engineered with confidence, enabling this technology to be applied in residential and light commercial buildings. The validated energy performance model also addresses and solves another problem, the longstanding inadequacy in the way ground-building thermal interaction is represented in building energy models, whether or not there is a ground heat exchanger nearby. Two side-by-side, three-level, unoccupied research houses with walkout basements, identical 3,700 ft² floor plans, and hybrid FHX/HGHX systems were constructed to provide validation data sets for the energy performance model and design tool. The envelopes of both houses are very energy efficient and airtight, and the HERS ratings of the homes are 44 and 45 respectively. Both houses are mechanically ventilated with energy recovery ventilators, with space conditioning provided by water-to-air heat pumps with 2 ton nominal capacities.
Separate water-to-water heat pumps with 1.5 ton nominal capacities were used for water heating. In these unoccupied research houses, human impact on energy use (hot water draw, etc.) is simulated to match the national average. At House 1 the hybrid FHX/HGHX system was installed in 300 linear feet of excavation, and 60% of that was construction excavation (needed to construct the home). At House 2 the hybrid FHX/HGHX system was installed in 360 feet of excavation, 50% of which was construction excavation. There are six pipes in all excavations (three parallel circuits - out and back), and the multiple instances of FHX and/or HGHX are all connected in series. The working fluid is 20% by weight propylene glycol in water. Model and design tool development was undertaken in parallel with constructing the houses, installing instrumentation, and monitoring performance for a year. Several detailed numerical models for FHX were developed as part of the project. Essentially the project team was searching for an energy performance model accurate enough to achieve project objectives while also having sufficient computational efficiency for practical use in EnergyPlus. A 3-dimensional, dual-coordinate-system, finite-volume model satisfied these criteria and was included in the October 2011 EnergyPlus Version 7 public release after being validated against measured data.

  7. The clinical effectiveness and cost-effectiveness of primary stroke prevention in children with sickle cell disease: a systematic review and economic evaluation.

    PubMed

    Cherry, M G; Greenhalgh, J; Osipenko, L; Venkatachalam, M; Boland, A; Dundar, Y; Marsh, K; Dickson, R; Rees, D C

    2012-01-01

    Sickle cell disease (SCD) is a recessive genetic blood disorder, caused by a mutation in the β-globin gene. For children with SCD, the risk of stroke is estimated to be up to 250 times higher than in the general childhood population. Transcranial Doppler (TCD) ultrasonography is a non-invasive technique which measures local blood velocity in the proximal portions of large intracranial arteries. Screening with TCD ultrasonography identifies individuals with high cerebral blood velocity; these children are at the highest risk of stroke. A number of primary stroke prevention strategies are currently used in clinical practice in the UK including blood transfusion, treatment with hydroxycarbamide and bone marrow transplantation (BMT). No reviews have yet assessed the clinical effectiveness and cost effectiveness of primary stroke prevention strategies in children with SCD identified to be at high risk of stroke using TCD ultrasonography. To assess the clinical effectiveness and cost-effectiveness of primary stroke prevention treatments for children with SCD who are identified (using TCD ultrasonography) to be at high risk of stroke. Electronic databases were searched from inception up to May 2011, including the Cochrane Database of Systematic Reviews (CDSR), the Cochrane Central Register of Controlled Trials (CENTRAL), the Database of Abstracts of Reviews of Effects (DARE), EMBASE, the Health Technology Assessment (HTA) database, ISI Web of Science Proceedings, ISI Web of Science Citation Index, the NHS Economic Evaluation Database (NHS EED) and MEDLINE. The assessment was conducted according to accepted procedures for conducting and reporting systematic reviews and economic evaluations. A de novo Markov model was developed to determine the cost-effectiveness of TCD ultrasonography and blood transfusion, where clinically appropriate, in patients with SCD. Two randomised controlled trials met the inclusion criteria involving a study population of 209 participants. One compared blood transfusion with standard care for children who are identified as being at high risk of stroke using TCD ultrasonography. In this trial, one patient in the transfusion group had a stroke (1/63) compared with 11 children in the standard care group (11/67). The other trial assessed the impact of halting chronic transfusion in patients with SCD. Sixteen patients in the transfusion-halted group had an event (16/41) (two patients experienced stroke and 14 reverted to abnormal TCD velocity); there were no events in the continued-transfusion group (0/38). No meta-analyses of these trials were undertaken. No relevant economic evaluations were identified for inclusion in the review. The de novo modelling suggests that blood transfusions plus TCD scans (compared with just TCD scans) for patients with SCD at high risk of stroke, aged ≥ 2 years, may be good value for money. The intervention has an incremental cost-effectiveness ratio of £24,075 per quality-adjusted life-year gained, and helps avoid 68 strokes over the lifetime of a population of 1000 patients. The intervention costs an additional £13,751 per patient and generates 0.6 extra years of life in full health per patient. The data available for the economic analysis are limited. Sensitivity analyses and validation against existing data and expert opinion provide some reassurance that the conclusion of the model is reliable but further research is required to validate these findings. 
The main limitations relate to the availability of published clinical data; no completed randomised controlled trials were identified which evaluated the efficacy of either BMT or hydroxycarbamide for primary stroke prevention. Both the clinical and cost data available for use in the economic analysis are limited. Sensitivity analyses and validation against existing data and expert opinion provide some reassurance that the conclusions of the model are reliable, but further research is required to validate these findings. The use of TCD ultrasonography to identify children at high risk of stroke, and treating these children with prophylactic blood transfusions, appears to be both clinically effective and cost-effective compared with TCD ultrasonography only. However, given the limitations in the data available, further research is required to verify this conclusion. Several research recommendations can be proposed from this review. Clinically, more research is needed to assess the effects and optimal duration of long-term blood transfusion and the potential role of hydroxycarbamide in primary stroke prevention. From an economics perspective, further research is required to generate more robust data on which to base estimates of cost-effectiveness or against which model outputs can be calibrated. More data are required to explain how utility weights vary with age, transfusions and strokes. Research is also needed around the cost of paediatric stroke in the UK. PROSPERO CRD42011001496. The National Institute for Health Research Health Technology Assessment programme.

  8. Nutrient profiling can help identify foods of good nutritional quality for their price: a validation study with linear programming.

    PubMed

    Maillot, Matthieu; Ferguson, Elaine L; Drewnowski, Adam; Darmon, Nicole

    2008-06-01

    Nutrient profiling ranks foods based on their nutrient content. They may help identify foods with a good nutritional quality for their price. This hypothesis was tested using diet modeling with linear programming. Analyses were undertaken using food intake data from the nationally representative French INCA (enquête Individuelle et Nationale sur les Consommations Alimentaires) survey and its associated food composition and price database. For each food, a nutrient profile score was defined as the ratio between the previously published nutrient density score (NDS) and the limited nutrient score (LIM); a nutritional quality for price indicator was developed and calculated from the relationship between its NDS:LIM and energy cost (in euro/100 kcal). We developed linear programming models to design diets that fulfilled increasing levels of nutritional constraints at a minimal cost. The median NDS:LIM values of foods selected in modeled diets increased as the levels of nutritional constraints increased (P = 0.005). In addition, the proportion of foods with a good nutritional quality for price indicator was higher (P < 0.0001) among foods selected (81%) than among foods not selected (39%) in modeled diets. This agreement between the linear programming and the nutrient profiling approaches indicates that nutrient profiling can help identify foods of good nutritional quality for their price. Linear programming is a useful tool for testing nutrient profiling systems and validating the concept of nutrient profiling.
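
    The kind of linear programme described above can be sketched in a few lines. The following is a minimal illustration only, not the authors' model: the food list, prices, nutrient contents and requirements are invented, and scipy.optimize.linprog is used to select food quantities that meet minimum nutrient constraints at the lowest cost.

      # Minimal diet linear programme (illustrative data, not the INCA database)
      import numpy as np
      from scipy.optimize import linprog

      cost = np.array([0.30, 0.15, 0.55, 0.80])        # euro per 100 g of four hypothetical foods
      nutrients = np.array([
          [8.0,  2.0,  12.0,  6.0],                    # protein (g per 100 g)
          [1.5,  3.0,   0.5,  2.5],                    # fibre (g per 100 g)
          [30.0, 15.0, 250.0, 40.0],                   # calcium (mg per 100 g)
      ])
      requirements = np.array([60.0, 25.0, 900.0])     # daily minimum for each nutrient

      # linprog minimises cost @ x subject to A_ub @ x <= b_ub, so the constraint
      # "nutrients @ x >= requirements" is written with both sides negated.
      res = linprog(c=cost, A_ub=-nutrients, b_ub=-requirements,
                    bounds=[(0, 10)] * len(cost), method="highs")
      print("quantities (x 100 g):", res.x.round(2))
      print("minimal diet cost (euro):", round(res.fun, 2))

    Tightening the nutritional constraints (raising the requirements) and re-solving reproduces, in miniature, the experiment reported above in which stricter constraints shift the selected foods towards those with a higher nutritional quality for their price.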

  9. A cluster randomized trial for the implementation of an antibiotic checklist based on validated quality indicators: the AB-checklist.

    PubMed

    van Daalen, Frederike V; Prins, Jan M; Opmeer, Brent C; Boermeester, Marja A; Visser, Caroline E; van Hest, Reinier M; Hulscher, Marlies E J L; Geerlings, Suzanne E

    2015-03-19

    Recently we developed and validated generic quality indicators that define 'appropriate antibiotic use' in hospitalized adults treated for a (suspected) bacterial infection. Previous studies have shown that with appropriate antibiotic use a reduction of 13% of length of hospital stay can be achieved. Our main objective in this project is to provide hospitals with an antibiotic checklist based on these quality indicators, and to evaluate the introduction of this checklist in terms of (cost-) effectiveness. The checklist applies to hospitalized adults with a suspected bacterial infection for whom antibiotic therapy is initiated, at first via the intravenous route. A stepped wedge study design will be used, comparing outcomes before and after introduction of the checklist in nine hospitals in the Netherlands. At least 810 patients will be included in both the control and the intervention group. The primary endpoint is length of hospital stay. Secondary endpoints are appropriate antibiotic use measured by the quality indicators, admission to and duration of intensive care unit stay, readmission within 30 days, mortality, total antibiotic use, and costs associated with implementation and hospital stay. Differences in numerical endpoints between the two periods will be evaluated with mixed linear models; for dichotomous outcomes generalized estimating equation models will be used. A process evaluation will be performed to evaluate the professionals' compliance with use of the checklist. The key question for the economic evaluation is whether the benefits of the checklist, which include reduced antibiotic use, reduced length of stay and associated costs, justify the costs associated with implementation activities as well as daily use of the checklist. If (cost-) effective, the AB-checklist will provide physicians with a tool to support appropriate antibiotic use in adult hospitalized patients who start with intravenous antibiotics. Dutch trial registry: NTR4872.
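
    The analysis plan (mixed linear models for numerical endpoints, generalized estimating equations for dichotomous ones, clustered by hospital) can be illustrated with a small sketch. All data, column names and effect sizes below are hypothetical; this is not the trial's analysis code.

      # Illustrative mixed model and GEE for a clustered before/after comparison
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 400
      df = pd.DataFrame({
          "hospital": rng.integers(0, 9, n),          # nine participating hospitals
          "intervention": rng.integers(0, 2, n),      # 0 = control period, 1 = checklist period
      })
      df["los"] = 7 - 0.9 * df["intervention"] + rng.normal(0, 2, n)     # length of stay (days)
      df["readmit"] = rng.binomial(1, 0.12 - 0.02 * df["intervention"])  # 30-day readmission

      # Numerical endpoint: linear mixed model with a random intercept per hospital
      mixed = smf.mixedlm("los ~ intervention", df, groups=df["hospital"]).fit()
      print(mixed.summary())

      # Dichotomous endpoint: GEE with exchangeable correlation within hospitals
      gee = smf.gee("readmit ~ intervention", "hospital", df,
                    family=sm.families.Binomial(),
                    cov_struct=sm.cov_struct.Exchangeable()).fit()
      print(gee.summary())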

  10. Development of a patient-specific anatomical foot model from structured light scan data.

    PubMed

    Lochner, Samuel J; Huissoon, Jan P; Bedi, Sanjeev S

    2014-01-01

    The use of anatomically accurate finite element (FE) models of the human foot in research studies has increased rapidly in recent years. Uses for FE foot models include advancing knowledge of orthotic design, shoe design, ankle-foot orthoses, pathomechanics, locomotion, plantar pressure, tissue mechanics, plantar fasciitis, joint stress and surgical interventions. Similar applications but for clinical use on a per-patient basis would also be on the rise if it were not for the high costs associated with developing patient-specific anatomical foot models. High costs arise primarily from the expense and challenges of acquiring anatomical data via magnetic resonance imaging (MRI) or computed tomography (CT) and reconstructing the three-dimensional models. The proposed solution morphs detailed anatomy from skin surface geometry and anatomical landmarks of a generic foot model (developed from CT or MRI) to surface geometry and anatomical landmarks acquired from an inexpensive structured light scan of a foot. The method yields a patient-specific anatomical foot model at a fraction of the cost of standard methods. Average error for bone surfaces was 2.53 mm for the six experiments completed. Highest accuracy occurred in the mid-foot and lowest in the forefoot due to the small, irregular bones of the toes. The method must be validated in the intended application to determine if the resulting errors are acceptable.

  11. Analysis of various quality attributes of sunflower and soybean plants by near infra-red reflectance spectroscopy: Development and validation of calibration models

    USDA-ARS's Scientific Manuscript database

    Soybean and sunflower are summer annuals that can be grown as an alternative to corn and may be particularly useful in organic production systems for forage in addition to their traditional use as protein and/or oil yielding crops. Rapid and low cost methods of analyzing plant quality would be helpf...

  12. Serviceberry [Amelanchier alnifolia (Nutt.) Nutt. ex. M. Roem. (Rosaceae)] leaf inhibits mammalian alpha-glucosidase activity and suppresses postprandial glycemic response in a mouse model of diet-induced obesity/hyperglycemia

    USDA-ARS's Scientific Manuscript database

    Several plant-based remedies offer cost-effective management of diabetes, but few plant species adapted to North America have been validated for their antidiabetic properties. One such species is serviceberry (Amelanchier alnifolia), found in Browning, MT, which has been traditionally used by the Am...

  13. Program test objectives milestone 3. [Integrated Propulsion Technology Demonstrator

    NASA Technical Reports Server (NTRS)

    Gaynor, T. L.

    1994-01-01

    The following conclusions have been developed relative to propulsion system technology adequacy for efficient development and operation of recoverable and expendable launch vehicles (RLV and ELV) and the benefits which the integrated propulsion technology demonstrator will provide for enhancing technology: (1) Technology improvements relative to propulsion system design and operation can reduce program cost. Many features or improvement needs to enhance operability, reduce cost, and improve payload are identified. (2) The Integrated Propulsion Technology Demonstrator (IPTD) Program provides a means of resolving the majority of issues associated with improvement needs. (3) The IPTD will evaluate complex integration of vehicle and facility functions in fluid management and propulsion control systems, and provides an environment for validating improved mechanical and electrical components. (4) The IPTD provides a mechanism for investigating operational issues focusing on reducing manpower and time to perform various functions at the launch site. These efforts include model development, collection of data to validate subject models, and ultimate development of complex time line models. (5) The IPTD provides an engine test bed for tri/bi-propellant engine development firings which is representative of the actual vehicle environment. (6) The IPTD provides for only a limited multiengine configuration integration environment for RLV. Multiengine efforts may be simulated for a number of subsystems and a number of subsystems are relatively independent of the multiengine influences.

  14. A model for estimating the impact of changes in children's vaccines.

    PubMed

    Simpson, K N; Biddle, A K; Rabinovich, N R

    1995-12-01

    To assist in strategic planning for the improvement of vaccines and vaccine programs, an economic model was developed and tested that estimates the potential impact of vaccine innovations on health outcomes and costs associated with vaccination and illness. A multistep, iterative process of data extraction/integration was used to develop the model and the scenarios. Parameter replication, sensitivity analysis, and expert review were used to validate the model. The greatest impact on the improvement of health is expected to result from the production of less reactogenic vaccines that require fewer inoculations for immunity. The greatest economic impact is predicted from improvements that decrease the number of inoculations required. Scenario analysis may be useful for integrating health outcomes and economic data into decision making. For childhood infections, this analysis indicates that large cost savings can be achieved in the future if we can improve vaccine efficacy so that the number of required inoculations is reduced. Such an improvement represents a large potential "payback" for the United States and might benefit other countries.

  15. Health economic potential of early nutrition programming: a model calculation of long-term reduction in blood pressure and related morbidity costs by use of long-chain polyunsaturated fatty acid-supplemented formula.

    PubMed

    Straub, Niels; Grunert, Philipp; von Kries, Rüdiger; Koletzko, Berthold

    2011-12-01

    The reported effect sizes of early nutrition programming on long-term health outcomes are often small, and it has been questioned whether early interventions would be worthwhile in enhancing public health. We explored the possible health economic consequences of early nutrition programming by performing a model calculation, based on the only published study currently available for analysis, to evaluate the effects of supplementing infant formula with long-chain polyunsaturated fatty acids (LC-PUFAs) on lowering blood pressure and lowering the risk of hypertension-related diseases in later life. The costs and health effects of LC-PUFA-enriched and standard infant formulas were compared by using a Markov model, including all relevant direct and indirect costs based on German statistics. We assessed the effect size of blood pressure reduction from LC-PUFA-supplemented formula, the long-term persistence of the effect, and the effect of lowered blood pressure on hypertension-related morbidity. The cost-effectiveness analysis showed an increased life expectancy of 1.2 quality-adjusted life-years and an incremental cost-effectiveness ratio of -630 Euros (discounted to present value) for the LC-PUFA formula in comparison with standard formula. LC-PUFA nutrition was the superior strategy even when the blood pressure-lowering effect was reduced to the lower 95% CI. Breastfeeding is the recommended feeding practice, but infants who are not breastfed should receive an appropriate infant formula. Following this model calculation, LC-PUFA supplementation of infant formula represents an economically worthwhile prevention strategy, based on the costs derived from hypertension-linked diseases in later life. However, because our analysis was based on a single randomized controlled trial, further studies are required to verify the validity of this thesis.
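
    The structure of such an analysis, an annual-cycle Markov cohort model with discounting and an incremental cost-effectiveness ratio, can be sketched compactly. The states, transition probabilities, costs and utilities below are invented for illustration and are not taken from the published model.

      # Tiny Markov cohort model comparing two feeding strategies (hypothetical numbers)
      import numpy as np

      def run_cohort(transition, annual_cost, utility, years=60, discount=0.03):
          """Discounted cost and QALYs per person for one strategy."""
          state = np.array([1.0, 0.0, 0.0])            # well, hypertensive disease, dead
          total_cost = total_qaly = 0.0
          for t in range(years):
              d = 1.0 / (1.0 + discount) ** t
              total_cost += d * float(state @ annual_cost)
              total_qaly += d * float(state @ utility)
              state = state @ transition                # advance one annual cycle
          return total_cost, total_qaly

      p_standard = np.array([[0.970, 0.020, 0.010],
                             [0.000, 0.930, 0.070],
                             [0.000, 0.000, 1.000]])
      p_supplemented = np.array([[0.975, 0.015, 0.010],   # slightly lower disease incidence
                                 [0.000, 0.930, 0.070],
                                 [0.000, 0.000, 1.000]])
      annual_cost = np.array([100.0, 2500.0, 0.0])        # euro per year in each state
      utility = np.array([1.00, 0.75, 0.0])               # QALY weight per year in each state
      extra_formula_cost = np.array([60.0, 0.0, 0.0])     # added cost of supplementation

      c0, q0 = run_cohort(p_standard, annual_cost, utility)
      c1, q1 = run_cohort(p_supplemented, annual_cost + extra_formula_cost, utility)
      icer = (c1 - c0) / (q1 - q0)
      print(f"incremental cost {c1 - c0:.0f} euro, incremental QALYs {q1 - q0:.3f}, ICER {icer:.0f} euro/QALY")

    A negative ICER with positive incremental QALYs, as reported above for the LC-PUFA strategy, corresponds to the supplemented arm being both cheaper and more effective (dominant).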

  16. [Construction of the addiction prevention core competency model for preventing addictive behavior in adolescents].

    PubMed

    Park, Hyun Sook; Jung, Sun Young

    2013-12-01

    This study was done to provide fundamental data for the development of competency reinforcement programs to prevent addictive behavior in adolescents through the construction and examination of an addiction prevention core competency model. In this study, core competencies for preventing addictive behavior in adolescents were identified through competency modeling, and the addiction prevention core competency model was developed and validated methodologically. Competencies for preventing addictive behavior in adolescents as defined by the addiction prevention core competency model are as follows: positive self-worth, self-control skill, time management skill, reality perception skill, risk coping skill, and positive communication with parents and with peers or social group. After construction, concurrent cross-validation of the addiction prevention core competency model showed that this model was appropriate. The study results indicate that the addiction prevention core competency model can be used as a foundation for an integral approach to enhance adolescent competencies and prevent addictive behavior. This approach can be a school-centered, cost-efficient strategy which not only reduces addictive behavior in adolescents, but also improves the quality of their resources.

  17. Comparison of costs and outcomes of dapagliflozin with other glucose-lowering therapy classes added to metformin using a short-term cost-effectiveness model in the US setting.

    PubMed

    Chakravarty, Abhiroop; Rastogi, Mohini; Dhankhar, Praveen; Bell, Kelly F

    2018-05-01

    To compare 1-year costs and benefits of dapagliflozin (DAPA), a sodium-glucose cotransporter-2 (SGLT-2) inhibitor, with those of other treatments for type 2 diabetes (T2D), such as glucagon-like peptide-1 receptor agonists (GLP-1RAs), sulfonylureas (SUs), thiazolidinediones (TZDs), and dipeptidyl peptidase-4 inhibitors (DPP-4i), all combined with metformin. A short-term decision-analytic model with a 1-year time horizon was developed from a payer's perspective in the United States setting. Costs and benefits associated with four clinical end-points (glycated hemoglobin [A1C], body weight, systolic blood pressure [SBP], and risk of hypoglycemia) were evaluated in the analysis. The impact of DAPA and other glucose-lowering therapy classes on these clinical end-points was estimated from a network meta-analysis (NMA). Data for costs and quality-adjusted life-years (QALYs) associated with a per-unit change in these clinical end-points were taken from published literature. Drug prices were taken from an annual wholesale price list. All costs were inflation-adjusted to December 2016 costs using the medical care component of the consumer price index. Total costs (both medical and drug costs), total QALYs, and incremental cost-effectiveness ratios (ICERs) were estimated. Sensitivity analyses (SA) were performed to explore uncertainty in the inputs. To assess face validity, results from the short-term model were compared with long-term models published for these drugs. The total annual medical cost for DAPA was less than that for GLP-1RA ($186 less), DPP-4i ($1,142 less), SU ($2,474 less), and TZD ($1,640 less). Treatment with DAPA resulted in an average QALY gain of 0.0107, 0.0587, 0.1137, and 0.0715 per treated patient when compared with GLP-1RA, DPP-4i, SU, and TZD, respectively. ICERs for DAPA vs SU and TZD were $19,005 and $25,835, respectively. DAPA was a cost-saving option when compared with GLP-1RAs and DPP-4is. Among all four clinical end-points, change in weight had the greatest impact on total annual costs and ICERS. Sensitivity analysis showed that results were robust, and results from the short-term model were found to be similar to those of published long-term models. This analysis showed that DAPA was cost-saving compared with GLP-1RA and DPP-4i, and cost-effective compared with SU and TZD in the US setting over 1 year. Furthermore, the results suggest that, among the four composite clinical end-points, change in weight and SBP had an impact on cost-effectiveness results.
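
    The short-term approach described, in which between-treatment differences in surrogate end-points are converted into 1-year costs and QALYs and then an ICER, can be illustrated as follows. Every number below (end-point differences, per-unit costs and utilities, drug cost) is invented; this is not the published model or its NMA inputs.

      # Hypothetical 1-year surrogate-end-point cost-effectiveness sketch
      improvement = {              # benefit of the new therapy vs comparator (illustrative)
          "a1c_pct": 0.1,          # extra HbA1c reduction, percentage points
          "weight_kg": 2.5,        # extra weight loss, kg
          "sbp_mmhg": 3.0,         # extra systolic blood pressure reduction, mmHg
          "hypo_events": 0.2,      # hypoglycaemic events avoided per patient-year
      }
      medical_cost_per_unit = {"a1c_pct": 150.0, "weight_kg": 40.0, "sbp_mmhg": 25.0, "hypo_events": 300.0}
      qaly_per_unit = {"a1c_pct": 0.004, "weight_kg": 0.0025, "sbp_mmhg": 0.001, "hypo_events": 0.008}
      extra_drug_cost = 900.0      # additional annual drug cost of the new therapy

      cost_offset = sum(improvement[k] * medical_cost_per_unit[k] for k in improvement)
      qaly_gain = sum(improvement[k] * qaly_per_unit[k] for k in improvement)
      incremental_cost = extra_drug_cost - cost_offset
      print(f"incremental cost: {incremental_cost:.0f}, QALY gain: {qaly_gain:.4f}")
      print("dominant (cost-saving)" if incremental_cost <= 0
            else f"ICER: {incremental_cost / qaly_gain:.0f} per QALY")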

  18. Cost Effectiveness of Screening Patients With Gastroesophageal Reflux Disease for Barrett's Esophagus With a Minimally Invasive Cell Sampling Device.

    PubMed

    Heberle, Curtis R; Omidvari, Amir-Houshang; Ali, Ayman; Kroep, Sonja; Kong, Chung Yin; Inadomi, John M; Rubenstein, Joel H; Tramontano, Angela C; Dowling, Emily C; Hazelton, William D; Luebeck, E Georg; Lansdorp-Vogelaar, Iris; Hur, Chin

    2017-09-01

    It is important to identify patients with Barrett's esophagus (BE), the precursor to esophageal adenocarcinoma (EAC). Patients with BE usually are identified by endoscopy, which is expensive. The Cytosponge, which collects tissue from the esophagus noninvasively, could be a cost-effective tool for screening individuals with gastroesophageal reflux disease (GERD) who are at increased risk for BE. We developed a model to analyze the cost effectiveness of using the Cytosponge in first-line screening of patients with GERD for BE with endoscopic confirmation, compared with endoscopy screening only. We incorporated data from a large clinical trial of Cytosponge performance into 2 validated microsimulation models of EAC progression (the esophageal adenocarcinoma model from Massachusetts General Hospital and the microsimulation screening analysis model from Erasmus University Medical Center). The models were calibrated for US Surveillance, Epidemiology and End Results data on EAC incidence and mortality. In each model, we simulated the effect of a 1-time screen for BE in male patients with GERD, 60 years of age, using endoscopy alone or Cytosponge collection of tissue, and analysis for the level of trefoil factor 3 with endoscopic confirmation of positive results. For each strategy we recorded the number of cases of EAC that developed, the number of EAC cases detected with screening by Cytosponge only or by subsequent targeted surveillance, and the number of endoscopies needed. In addition, we recorded the cumulative costs (including indirect costs) incurred and quality-adjusted years of life lived within each strategy, discounted at a rate of 3% per year, and computed incremental cost-effectiveness ratios (ICERs) among the 3 strategies. According to the models, screening patients with GERD by Cytosponge with follow-up confirmation of positive results by endoscopy would reduce the cost of screening by 27% to 29% compared with screening by endoscopy, but led to 1.8 to 5.5 (per 1000 patients) fewer quality-adjusted life years. The ICERs for Cytosponge screening compared with no screening ranged from $26,358 to $33,307. For screening patients by endoscopy compared with Cytosponge the ICERs ranged from $107,583 to $330,361. These results were sensitive to Cytosponge cost within a plausible range of values. In a comparative modeling analysis of screening strategies for BE in patients with GERD, we found Cytosponge screening with endoscopic confirmation to be a cost-effective strategy. The greatest benefit was achieved by endoscopic screening, but with an unfavorable cost margin. Copyright © 2017 AGA Institute. Published by Elsevier Inc. All rights reserved.

  19. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation.

    PubMed

    Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc

    2013-06-01

    An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge of downlink band occupation or service characteristics is required for the low-cost method. The method is applicable in situ. It requires only a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders.

  20. Development of a model for predicting reaction rate constants of organic chemicals with ozone at different temperatures.

    PubMed

    Li, Xuehua; Zhao, Wenxing; Li, Jing; Jiang, Jingqiu; Chen, Jianji; Chen, Jingwen

    2013-08-01

    To assess the persistence and fate of volatile organic compounds in the troposphere, the rate constants for the reaction with ozone (kO3) are needed. As kO3 values are only available for hundreds of compounds, and experimental determination of kO3 is costly and time-consuming, it is of importance to develop predictive models for kO3. In this study, a total of 379 log kO3 values at different temperatures were used to develop and validate a model for the prediction of kO3, based on quantum chemical descriptors, Dragon descriptors and structural fragments. Molecular descriptors were screened by stepwise multiple linear regression, and the model was constructed by partial least-squares regression. The cross-validation coefficient Q²CUM of the model is 0.836, and the external validation coefficient Q²ext is 0.811, indicating that the model has high robustness and good predictive performance. The most significant descriptor explaining log kO3 is the BELm2 descriptor with connectivity information weighted atomic masses. kO3 increases with increasing BELm2, and decreases with increasing ionization potential. The applicability domain of the proposed model was visualized by the Williams plot. The developed model can be used to predict kO3 at different temperatures for a wide range of organic chemicals, including alkenes, cycloalkenes, haloalkenes, alkynes, oxygen-containing compounds, nitrogen-containing compounds (except primary amines) and aromatic compounds. Copyright © 2013 Elsevier Ltd. All rights reserved.
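
    The modelling chain described, partial least-squares regression on molecular descriptors with internal cross-validation and an external validation set, can be sketched with scikit-learn. The descriptor matrix and response below are synthetic stand-ins, not the study's data or its exact validation protocol.

      # PLS regression with cross-validated and external-set Q2 (synthetic data)
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split, cross_val_predict
      from sklearn.metrics import r2_score

      rng = np.random.default_rng(42)
      n_compounds, n_descriptors = 379, 12
      X = rng.normal(size=(n_compounds, n_descriptors))    # stand-ins for quantum chemical / Dragon descriptors
      y = X[:, 0] * 1.5 - X[:, 1] * 0.8 + rng.normal(0, 0.3, n_compounds)   # synthetic log kO3

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=1)

      pls = PLSRegression(n_components=4)
      y_cv = np.ravel(cross_val_predict(pls, X_train, y_train, cv=5))   # internal cross-validation
      pls.fit(X_train, y_train)
      y_ext = np.ravel(pls.predict(X_test))                             # external validation set

      print("Q2 (cross-validation):", round(r2_score(y_train, y_cv), 3))
      print("Q2 (external set):    ", round(r2_score(y_test, y_ext), 3))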

  1. Long-Term Cost-Effectiveness of Insulin Glargine Versus Neutral Protamine Hagedorn Insulin for Type 2 Diabetes in Thailand.

    PubMed

    Permsuwan, Unchalee; Chaiyakunapruk, Nathorn; Dilokthornsakul, Piyameth; Thavorn, Kednapa; Saokaew, Surasak

    2016-06-01

    Even though Insulin glargine (IGlar) has been available and used in other countries for more than a decade, it has not been adopted into Thai national formulary. This study aimed to evaluate the long-term cost effectiveness of IGlar versus neutral protamine Hagedorn (NPH) insulin in type 2 diabetes from the perspective of Thai Health Care System. A validated computer simulation model (the IMS CORE Diabetes Model) was used to estimate the long-term projection of costs and clinical outcomes. The model was populated with published characteristics of Thai patients with type 2 diabetes. Baseline risk factors were obtained from Thai cohort studies, while relative risk reduction was derived from a meta-analysis study conducted by the Canadian Agency for Drugs and Technology in Health. Only direct costs were taken into account. Costs of diabetes management and complications were obtained from hospital databases in Thailand. Both costs and outcomes were discounted at 3 % per annum and presented in US dollars in terms of 2014 dollar value. Incremental cost-effectiveness ratio (ICER) was calculated. One-way and probabilistic sensitivity analyses were also performed. IGlar is associated with a slight gain in quality-adjusted life years (0.488 QALYs), an additional life expectancy (0.677 life years), and an incremental cost of THB119,543 (US$3522.19) compared with NPH insulin. The ICERs were THB244,915/QALY (US$7216.12/QALY) and THB176,525/life-year gained (LYG) (US$5201.09/LYG). The ICER was sensitive to discount rates and IGlar cost. At the acceptable willingness to pay of THB160,000/QALY (US$4714.20/QALY), the probability that IGlar was cost effective was less than 20 %. Compared to treatment with NPH insulin, treatment with IGlar in type 2 diabetes patients who had uncontrolled blood glucose with oral anti-diabetic drugs did not represent good value for money at the acceptable threshold in Thailand.

  2. Validation of the thermophysiological model by Fiala for prediction of local skin temperatures

    NASA Astrophysics Data System (ADS)

    Martínez, Natividad; Psikuta, Agnes; Kuklane, Kalev; Quesada, José Ignacio Priego; de Anda, Rosa María Cibrián Ortiz; Soriano, Pedro Pérez; Palmer, Rosario Salvador; Corberán, José Miguel; Rossi, René Michel; Annaheim, Simon

    2016-12-01

    The most complete and realistic physiological data are derived from direct measurements during human experiments; however, they present some limitations such as ethical concerns, time and cost burden. Thermophysiological models are able to predict human thermal response in a wide range of environmental conditions, but their use is limited due to lack of validation. The aim of this work was to validate the thermophysiological model by Fiala for prediction of local skin temperatures against a dedicated database containing 43 different human experiments representing a wide range of conditions. The validation was conducted based on root-mean-square deviation (rmsd) and bias. The thermophysiological model by Fiala showed a good precision when predicting core and mean skin temperature (rmsd 0.26 and 0.92 °C, respectively) and also local skin temperatures for most body sites (average rmsd for local skin temperatures 1.32 °C). However, an increased deviation of the predictions was observed for the forehead skin temperature (rmsd of 1.63 °C) and for the thigh during exercising exposures (rmsd of 1.41 °C). Possible reasons for the observed deviations are lack of information on measurement circumstances (hair, head coverage interference) or an overestimation of the sweat evaporative cooling capacity for the head and thigh, respectively. This work has highlighted the importance of collecting details about the clothing worn and how and where the sensors were attached to the skin for achieving more precise results in the simulations.
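
    The two validation statistics used, root-mean-square deviation and bias between predicted and measured temperatures, reduce to a few lines. The values below are illustrative, not the study's measurements.

      # rmsd and bias of predicted versus measured local skin temperature (illustrative values)
      import numpy as np

      measured  = np.array([34.1, 33.8, 32.9, 35.0, 33.2, 34.6])   # observed, deg C
      predicted = np.array([34.5, 33.2, 33.4, 34.6, 33.9, 34.4])   # model output, deg C

      residuals = predicted - measured
      rmsd = np.sqrt(np.mean(residuals ** 2))   # overall magnitude of prediction error
      bias = np.mean(residuals)                 # systematic over- or under-prediction
      print(f"rmsd = {rmsd:.2f} degC, bias = {bias:+.2f} degC")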

  3. An extension of trust and TAM model with IDT in the adoption of the electronic logistics information system in HIS in the medical industry.

    PubMed

    Tung, Feng-Cheng; Chang, Su-Chao; Chou, Chi-Min

    2008-05-01

    Ever since National Health Insurance was introduced in 1995, coverage has increased from 50-60% of the population to over 96%, with satisfaction ratings consistently around 70%. However, the premium accounted for 5.77% of GDP in 2001 and the Bureau of National Health Insurance had pressing financial difficulties, so it reformed its expenditure systems, such as fee for service, capitation, case payment and the global budget system, in order to control rising medical costs. Since the change in health insurance policy, most hospitals have attempted to reduce their operating expenses and improve efficiency. Introducing an electronic logistics information system is one way of reducing costs for the central warehouse department and the nursing stations. Hence, the study proposes a technology acceptance research model and examines how nurses' acceptance of the e-logistics information system has been affected in the medical industry. This research combines innovation diffusion theory and the technology acceptance model, adding two research parameters, trust and perceived financial cost, to propose a new hybrid technology acceptance model. Taking Taiwan's medical industry as an experimental example, this paper studies nurses' acceptance of the electronic logistics information system. The structural equation modeling technique was used to evaluate the causal model, and confirmatory factor analysis was performed to examine the reliability and validity of the measurement model. The results of the survey strongly support the new hybrid technology acceptance model in predicting nurses' intention to use the electronic logistics information system. The study shows that 'compatibility', 'perceived usefulness', 'perceived ease of use', and 'trust' all have a strong positive influence on 'behavioral intention to use'. On the other hand, 'perceived financial cost' has a strong negative influence on behavioral intention to use.

  4. Validation of sea ice models using an uncertainty-based distance metric for multiple model variables: NEW METRIC FOR SEA ICE MODEL VALIDATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urrego-Blanco, Jorge R.; Hunke, Elizabeth C.; Urban, Nathan M.

    Here, we implement a variance-based distance metric (Dn) to objectively assess skill of sea ice models when multiple output variables or uncertainties in both model predictions and observations need to be considered. The metric compares observations and model data pairs on common spatial and temporal grids improving upon highly aggregated metrics (e.g., total sea ice extent or volume) by capturing the spatial character of model skill. The Dn metric is a gamma-distributed statistic that is more general than the χ2 statistic commonly used to assess model fit, which requires the assumption that the model is unbiased and can only incorporate observational error in the analysis. The Dn statistic does not assume that the model is unbiased, and allows the incorporation of multiple observational data sets for the same variable and simultaneously for different variables, along with different types of variances that can characterize uncertainties in both observations and the model. This approach represents a step to establish a systematic framework for probabilistic validation of sea ice models. The methodology is also useful for model tuning by using the Dn metric as a cost function and incorporating model parametric uncertainty as part of a scheme to optimize model functionality. We apply this approach to evaluate different configurations of the standalone Los Alamos sea ice model (CICE) encompassing the parametric uncertainty in the model, and to find new sets of model configurations that produce better agreement than previous configurations between model and observational estimates of sea ice concentration and thickness.

  5. Validation of sea ice models using an uncertainty-based distance metric for multiple model variables: NEW METRIC FOR SEA ICE MODEL VALIDATION

    DOE PAGES

    Urrego-Blanco, Jorge R.; Hunke, Elizabeth C.; Urban, Nathan M.; ...

    2017-04-01

    Here, we implement a variance-based distance metric (Dn) to objectively assess skill of sea ice models when multiple output variables or uncertainties in both model predictions and observations need to be considered. The metric compares observations and model data pairs on common spatial and temporal grids improving upon highly aggregated metrics (e.g., total sea ice extent or volume) by capturing the spatial character of model skill. The Dn metric is a gamma-distributed statistic that is more general than the χ2 statistic commonly used to assess model fit, which requires the assumption that the model is unbiased and can only incorporate observational error in the analysis. The Dn statistic does not assume that the model is unbiased, and allows the incorporation of multiple observational data sets for the same variable and simultaneously for different variables, along with different types of variances that can characterize uncertainties in both observations and the model. This approach represents a step to establish a systematic framework for probabilistic validation of sea ice models. The methodology is also useful for model tuning by using the Dn metric as a cost function and incorporating model parametric uncertainty as part of a scheme to optimize model functionality. We apply this approach to evaluate different configurations of the standalone Los Alamos sea ice model (CICE) encompassing the parametric uncertainty in the model, and to find new sets of model configurations that produce better agreement than previous configurations between model and observational estimates of sea ice concentration and thickness.
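
    The idea of weighting model-observation mismatch by the combined uncertainty of both can be illustrated generically. The sketch below is only a variance-weighted distance over a common grid; it is not the paper's Dn statistic, whose exact form and gamma-distribution properties are given in the publication.

      # Generic variance-weighted model-observation distance over a grid (illustrative)
      import numpy as np

      rng = np.random.default_rng(3)
      shape = (40, 50)                                  # a common lat/lon grid
      obs = rng.normal(0.7, 0.2, shape)                 # e.g. observed sea ice concentration
      model = obs + rng.normal(0.05, 0.1, shape)        # a biased model field
      var_obs = np.full(shape, 0.02)                    # observational error variance
      var_mod = np.full(shape, 0.03)                    # model (e.g. parametric) variance

      weighted_sq = (model - obs) ** 2 / (var_obs + var_mod)
      print(f"variance-weighted distance: {weighted_sq.mean():.2f}")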

  6. A person based formula for allocating commissioning funds to general practices in England: development of a statistical model

    PubMed Central

    Smith, Peter; Gravelle, Hugh; Martin, Steve; Bardsley, Martin; Rice, Nigel; Georghiou, Theo; Dusheiko, Mark; Billings, John; Lorenzo, Michael De; Sanderson, Colin

    2011-01-01

    Objectives: To develop a formula for allocating resources for commissioning hospital care to all general practices in England based on the health needs of the people registered in each practice. Design: Multivariate prospective statistical models were developed in which routinely collected electronic information from 2005-6 and 2006-7 on individuals and the areas in which they lived was used to predict their costs of hospital care in the next year, 2007-8. Data on individuals included all diagnoses recorded at any inpatient admission. Models were developed on a random sample of 5 million people and validated on a second random sample of 5 million people and a third sample of 5 million people drawn from a random sample of practices. Setting: All general practices in England as of 1 April 2007. All NHS inpatient admissions and outpatient attendances for individuals registered with a general practice on that date. Subjects: All individuals registered with a general practice in England at 1 April 2007. Main outcome measures: Power of the statistical models to predict the costs of the individual patient or each practice's registered population for 2007-8, tested with a range of metrics (R2 reported here). Comparisons of predicted costs in 2007-8 with actual costs incurred in the same year were calculated by individual and by practice. Results: Models including person level information (age, sex, and recorded ICD-10 diagnostic codes) and a range of area level information (such as socioeconomic deprivation and supply of health facilities) were most predictive of costs. After accounting for person level variables, area level variables added little explanatory power. The best models for resource allocation could predict upwards of 77% of the variation in costs at practice level, and about 12% at the person level. With these models, the predicted costs of about a third of practices would exceed or undershoot the actual costs by 10% or more. Smaller practices were more likely to be in these groups. Conclusions: A model was developed that performed well by international standards, and could be used for allocations to practices for commissioning. The best formulas, however, could predict only about 12% of the variation in next year's costs of most inpatient and outpatient NHS care for each individual. Person-based diagnostic data significantly added to the predictive power of the models. PMID:22110252
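
    The contrast between person-level and practice-level predictive power can be reproduced qualitatively with synthetic data. Everything below (population, predictors, cost formula) is invented; the point is only to show how R2 is computed on individuals and on practice-level aggregates of the same predictions.

      # Person-level vs practice-level R2 for a next-year cost model (synthetic data)
      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LinearRegression
      from sklearn.metrics import r2_score

      rng = np.random.default_rng(5)
      n_people, n_practices = 50_000, 200
      deprivation = rng.normal(0, 1, n_practices)             # area-level variable per practice
      practice = rng.integers(0, n_practices, n_people)
      df = pd.DataFrame({
          "practice": practice,
          "age": rng.integers(0, 95, n_people),
          "morbidity": rng.poisson(1.2, n_people),            # stand-in for diagnostic groups
          "deprivation": deprivation[practice],
      })
      df["cost"] = (200 + 8 * df["age"] + 450 * df["morbidity"]
                    + 600 * df["deprivation"] + rng.gamma(1.0, 1500, n_people))

      train = df.sample(frac=0.5, random_state=0)
      test = df.drop(train.index)
      features = ["age", "morbidity", "deprivation"]
      model = LinearRegression().fit(train[features], train["cost"])
      test = test.assign(pred=model.predict(test[features]))

      person_r2 = r2_score(test["cost"], test["pred"])
      by_practice = test.groupby("practice")[["cost", "pred"]].sum()
      practice_r2 = r2_score(by_practice["cost"], by_practice["pred"])
      print(f"person-level R2 {person_r2:.2f}, practice-level R2 {practice_r2:.2f}")

    Because much of the person-level variation is effectively random, R2 is modest for individuals but much higher once predictions are aggregated to practice populations, mirroring the pattern reported above.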

  7. Process cost and facility considerations in the selection of primary cell culture clarification technology.

    PubMed

    Felo, Michael; Christensen, Brandon; Higgins, John

    2013-01-01

    The bioreactor volume delineating the selection of primary clarification technology is not always easily defined. Development of a commercial scale process for the manufacture of therapeutic proteins requires scale-up from a few liters to thousands of liters. While the separation techniques used for protein purification are largely conserved across scales, the separation techniques for primary cell culture clarification vary with scale. Process models were developed to compare monoclonal antibody production costs using two cell culture clarification technologies. One process model was created for cell culture clarification by disc stack centrifugation with depth filtration. A second process model was created for clarification by multi-stage depth filtration. Analyses were performed to examine the influence of bioreactor volume, product titer, depth filter capacity, and facility utilization on overall operating costs. At bioreactor volumes <1,000 L, clarification using multi-stage depth filtration offers cost savings compared to clarification using centrifugation. For bioreactor volumes >5,000 L, clarification using centrifugation followed by depth filtration offers significant cost savings. For bioreactor volumes of ∼ 2,000 L, clarification costs are similar between depth filtration and centrifugation. At this scale, factors including facility utilization, available capital, ease of process development, implementation timelines, and process performance characterization play an important role in clarification technology selection. In the case study presented, a multi-product facility selected multi-stage depth filtration for cell culture clarification at the 500 and 2,000 L scales of operation. Facility implementation timelines, process development activities, equipment commissioning and validation, scale-up effects, and process robustness are examined. © 2013 American Institute of Chemical Engineers.

  8. Case-Mix for Performance Management: A Risk Algorithm Based on ICD-10-CM.

    PubMed

    Gao, Jian; Moran, Eileen; Almenoff, Peter L

    2018-06-01

    Accurate risk adjustment is the key to a reliable comparison of cost and quality performance among providers and hospitals. However, the existing case-mix algorithms based on age, sex, and diagnoses can only explain up to 50% of the cost variation. More accurate risk adjustment is desired for provider performance assessment and improvement. To develop a case-mix algorithm that hospitals and payers can use to measure and compare cost and quality performance of their providers. All 6,048,895 patients with valid diagnoses and cost recorded in the US Veterans health care system in fiscal year 2016 were included in this study. The dependent variable was total cost at the patient level, and the explanatory variables were age, sex, and comorbidities represented by 762 clinically homogeneous groups, which were created by expanding the 283 categories from Clinical Classifications Software based on ICD-10-CM codes. The split-sample method was used to assess model overfitting and coefficient stability. The predictive power of the algorithms was ascertained by comparing the R2, mean absolute percentage error, root mean square error, predictive ratios, and c-statistics. The expansion of the Clinical Classifications Software categories resulted in higher predictive power. The R2 reached 0.72 and 0.52 for the transformed and raw scale cost, respectively. The case-mix algorithm we developed based on age, sex, and diagnoses outperformed the existing case-mix models reported in the literature. The method developed in this study can be used by other health systems to produce tailored risk models for their specific purpose.

  9. Arterial waveguide model for shear wave elastography: implementation and in vitro validation

    NASA Astrophysics Data System (ADS)

    Vaziri Astaneh, Ali; Urban, Matthew W.; Aquino, Wilkins; Greenleaf, James F.; Guddati, Murthy N.

    2017-07-01

    Arterial stiffness is found to be an early indicator of many cardiovascular diseases. Among various techniques, shear wave elastography has emerged as a promising tool for estimating local arterial stiffness through the observed dispersion of guided waves. In this paper, we develop efficient models for the computational simulation of guided wave dispersion in arterial walls. The models are capable of considering fluid-loaded tubes, immersed in fluid or embedded in a solid, which are encountered in in vitro/ex vivo, and in vivo experiments. The proposed methods are based on judiciously combining Fourier transformation and finite element discretization, leading to a significant reduction in computational cost while fully capturing complex 3D wave propagation. The developed methods are implemented in open-source code, and verified by comparing them with significantly more expensive, fully 3D finite element models. We also validate the models using the shear wave elastography of tissue-mimicking phantoms. The computational efficiency of the developed methods indicates the possibility of being able to estimate arterial stiffness in real time, which would be beneficial in clinical settings.

  10. Finding models to detect Alzheimer's disease by fusing structural and neuropsychological information

    NASA Astrophysics Data System (ADS)

    Giraldo, Diana L.; García-Arteaga, Juan D.; Velasco, Nelson; Romero, Eduardo

    2015-12-01

    Alzheimer's disease (AD) is a neurodegenerative disease that affects higher brain functions. Initial diagnosis of AD is based on the patient's clinical history and a battery of neuropsychological tests. The accuracy of the diagnosis is highly dependent on the examiner's skills and on the evolution of a variable clinical frame. This work presents an automatic strategy that learns probabilistic brain models for different stages of the disease, reducing the complexity, parameter adjustment and computational costs. The proposed method starts by setting a probabilistic class description using the information stored in the neuropsychological test, followed by constructing the different structural class models using membership values from the learned probabilistic functions. These models are then used as a reference frame for the classification problem: a new case is assigned to a particular class simply by projecting to the different models. Validation was performed using leave-one-out cross-validation with two classes: Normal Control (NC) subjects and patients diagnosed with mild AD. In this experiment it was possible to achieve a sensitivity and specificity of 80% and 79%, respectively.

  11. Active distribution network planning considering linearized system loss

    NASA Astrophysics Data System (ADS)

    Li, Xiao; Wang, Mingqiang; Xu, Hao

    2018-02-01

    In this paper, various distribution network planning techniques with DGs are reviewed, and a new distribution network planning method is proposed. It assumes that the locations of DGs and the topology of the network are fixed. The proposed model simultaneously optimizes DG capacities and distribution line capacities through a cost/benefit analysis, in which the benefit is quantified by the reduction in expected interruption cost. In addition, network loss is modeled explicitly. For simplicity, the network loss is approximated as a quadratic function of the voltage phase-angle difference and then piecewise linearized; a piecewise linearization technique with unequal segment lengths is proposed. To validate its effectiveness and superiority, the proposed distribution network planning model with this linearization technique is tested on the IEEE 33-bus distribution network system.
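
    The piecewise linearization of a quadratic loss term with unequal segment lengths can be illustrated directly. The coefficient and breakpoints below are arbitrary; this is a numerical illustration of the approximation, not the paper's planning formulation.

      # Piecewise-linear approximation of a quadratic loss term f(theta) = k * theta**2
      import numpy as np

      k = 0.8                                                        # loss coefficient (illustrative)
      breakpoints = np.array([0.0, 0.05, 0.12, 0.22, 0.35, 0.5])     # unequal segment lengths (rad)

      def f(th):
          return k * th ** 2

      def piecewise_linear(th):
          """Secant-based piecewise linear approximation of f on [0, 0.5]."""
          th = float(np.clip(th, breakpoints[0], breakpoints[-1]))
          i = min(int(np.searchsorted(breakpoints, th, side="right")) - 1, len(breakpoints) - 2)
          x0, x1 = breakpoints[i], breakpoints[i + 1]
          slope = (f(x1) - f(x0)) / (x1 - x0)
          return f(x0) + slope * (th - x0)

      grid = np.linspace(0.0, 0.5, 501)
      max_err = max(abs(piecewise_linear(t) - f(t)) for t in grid)
      print(f"maximum linearization error over the range: {max_err:.4f}")

    In an optimization model the same construction appears as one continuous variable per segment with the corresponding secant slope; shorter segments where accuracy matters most keep the error small without adding many variables.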

  12. Decision analysis to complete diagnostic research by closing the gap between test characteristics and cost-effectiveness.

    PubMed

    Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik

    2009-12-01

    The lack of a standard methodology in diagnostic research impedes adequate evaluation before implementation of constantly developing diagnostic techniques. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluation of diagnostic tests. Overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).

  13. Photogrammetry Methodology Development for Gossamer Spacecraft Structures

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.; Jones, Thomas W.; Walford, Alan; Black, Jonathan T.; Robson, Stuart; Shortis, Mark R.

    2002-01-01

    Photogrammetry, the science of calculating 3D object coordinates from images, is a flexible and robust approach for measuring the static and dynamic characteristics of future ultralightweight and inflatable space structures (a.k.a. Gossamer structures), such as large membrane reflectors, solar sails, and thin-film solar arrays. Shape and dynamic measurements are required to validate new structural modeling techniques and corresponding analytical models for these unconventional systems. This paper summarizes experiences at NASA Langley Research Center over the past three years to develop or adapt photogrammetry methods for the specific problem of measuring Gossamer space structures. Turnkey industrial photogrammetry systems were not considered a cost-effective choice for this basic research effort because of their high purchase and maintenance costs. Instead, this research uses mainly off-the-shelf digital-camera and software technologies that are affordable to most organizations and provide acceptable accuracy.

  14. Testing alternative ground water models using cross-validation and other methods

    USGS Publications Warehouse

    Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.

    2007-01-01

    Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. © 2007 National Ground Water Association.
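
    For least-squares calibrated models, the information criteria named above can be computed directly from the residual sum of squares and the number of parameters. The three alternative models and their fit statistics below are hypothetical.

      # AICc and BIC for alternative model parameterizations (hypothetical fits)
      import numpy as np

      def aicc_bic(sse, n_obs, n_params):
          """AICc and BIC for a least-squares fit with Gaussian errors (constants dropped)."""
          aic = n_obs * np.log(sse / n_obs) + 2 * n_params
          aicc = aic + 2 * n_params * (n_params + 1) / (n_obs - n_params - 1)
          bic = n_obs * np.log(sse / n_obs) + n_params * np.log(n_obs)
          return aicc, bic

      n_obs = 120
      alternatives = {                       # model -> (residual sum of squares, parameter count)
          "homogeneous K":  (8.4, 3),
          "two K zones":    (6.1, 5),
          "five K zones":   (5.6, 8),
      }
      for name, (sse, k) in alternatives.items():
          aicc, bic = aicc_bic(sse, n_obs, k)
          print(f"{name:15s}  AICc = {aicc:8.1f}  BIC = {bic:8.1f}")

    The lowest value identifies the preferred model under each criterion; cross-validation reaches a similar ranking but requires re-estimating the model many times, which is the computational trade-off discussed above.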

  15. A fast analytical undulator model for realistic high-energy FEL simulations

    NASA Astrophysics Data System (ADS)

    Tatchyn, R.; Cremer, T.

    1997-02-01

    A number of leading FEL simulation codes used for modeling gain in the ultralong undulators required for SASE saturation in the <100 Å range employ simplified analytical models both for field and error representations. Although it is recognized that both the practical and theoretical validity of such codes could be enhanced by incorporating realistic undulator field calculations, the computational cost of doing this can be prohibitive, especially for point-to-point integration of the equations of motion through each undulator period. In this paper we describe a simple analytical model suitable for modeling realistic permanent magnet (PM), hybrid/PM, and non-PM undulator structures, and discuss selected techniques for minimizing computation time.

  16. Easy and low-cost identification of metabolic syndrome in patients treated with second-generation antipsychotics: artificial neural network and logistic regression models.

    PubMed

    Lin, Chao-Cheng; Bai, Ya-Mei; Chen, Jen-Yeu; Hwang, Tzung-Jeng; Chen, Tzu-Ting; Chiu, Hung-Wen; Li, Yu-Chuan

    2010-03-01

    Metabolic syndrome (MetS) is an important side effect of second-generation antipsychotics (SGAs). However, many SGA-treated patients with MetS remain undetected. In this study, we trained and validated artificial neural network (ANN) and multiple logistic regression models without biochemical parameters to rapidly identify MetS in patients with SGA treatment. A total of 383 patients with a diagnosis of schizophrenia or schizoaffective disorder (DSM-IV criteria) with SGA treatment for more than 6 months were investigated to determine whether they met the MetS criteria according to the International Diabetes Federation. The data for these patients were collected between March 2005 and September 2005. The input variables of ANN and logistic regression were limited to demographic and anthropometric data only. All models were trained by randomly selecting two-thirds of the patient data and were internally validated with the remaining one-third of the data. The models were then externally validated with data from 69 patients from another hospital, collected between March 2008 and June 2008. The area under the receiver operating characteristic curve (AUC) was used to measure the performance of all models. Both the final ANN and logistic regression models had high accuracy (88.3% vs 83.6%), sensitivity (93.1% vs 86.2%), and specificity (86.9% vs 83.8%) to identify MetS in the internal validation set. The mean +/- SD AUC was high for both the ANN and logistic regression models (0.934 +/- 0.033 vs 0.922 +/- 0.035, P = .63). During external validation, high AUC was still obtained for both models. Waist circumference and diastolic blood pressure were the common variables that were left in the final ANN and logistic regression models. Our study developed accurate ANN and logistic regression models to detect MetS in patients with SGA treatment. The models are likely to provide a noninvasive tool for large-scale screening of MetS in this group of patients. (c) 2010 Physicians Postgraduate Press, Inc.
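
    The two classifiers compared above can be trained and compared on AUC in a few lines with scikit-learn. The data below are synthetic (a logistic relationship driven mainly by waist circumference and diastolic blood pressure), so the printed AUCs only illustrate the workflow, not the study's results.

      # ANN vs logistic regression for a binary outcome, compared by AUC (synthetic data)
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.neural_network import MLPClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(7)
      n = 383
      X = np.column_stack([
          rng.normal(40, 10, n),       # age
          rng.integers(0, 2, n),       # sex
          rng.normal(92, 12, n),       # waist circumference, cm
          rng.normal(82, 10, n),       # diastolic blood pressure, mmHg
      ])
      logit = -14 + 0.08 * X[:, 2] + 0.06 * X[:, 3]
      y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=1 / 3, random_state=0)

      ann = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)).fit(X_tr, y_tr)
      lr = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)).fit(X_tr, y_tr)

      print("ANN AUC:", round(roc_auc_score(y_te, ann.predict_proba(X_te)[:, 1]), 3))
      print("LR  AUC:", round(roc_auc_score(y_te, lr.predict_proba(X_te)[:, 1]), 3))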

  17. Development of Mobile Mapping System for 3D Road Asset Inventory.

    PubMed

    Sairam, Nivedita; Nagarajan, Sudhagar; Ornitz, Scott

    2016-03-12

    Asset Management is an important component of an infrastructure project. A significant cost is involved in maintaining and updating the asset information. Data collection is the most time-consuming task in the development of an asset management system. In order to reduce the time and cost involved in data collection, this paper proposes a low cost Mobile Mapping System using an equipped laser scanner and cameras. First, the feasibility of low cost sensors for 3D asset inventory is discussed by deriving appropriate sensor models. Then, through calibration procedures, respective alignments of the laser scanner, cameras, Inertial Measurement Unit and GPS (Global Positioning System) antenna are determined. The efficiency of this Mobile Mapping System is experimented by mounting it on a truck and golf cart. By using derived sensor models, geo-referenced images and 3D point clouds are derived. After validating the quality of the derived data, the paper provides a framework to extract road assets both automatically and manually using techniques implementing RANSAC plane fitting and edge extraction algorithms. Then the scope of such extraction techniques along with a sample GIS (Geographic Information System) database structure for unified 3D asset inventory are discussed.

  18. Development of Mobile Mapping System for 3D Road Asset Inventory

    PubMed Central

    Sairam, Nivedita; Nagarajan, Sudhagar; Ornitz, Scott

    2016-01-01

    Asset Management is an important component of an infrastructure project. A significant cost is involved in maintaining and updating the asset information. Data collection is the most time-consuming task in the development of an asset management system. In order to reduce the time and cost involved in data collection, this paper proposes a low cost Mobile Mapping System using an equipped laser scanner and cameras. First, the feasibility of low cost sensors for 3D asset inventory is discussed by deriving appropriate sensor models. Then, through calibration procedures, respective alignments of the laser scanner, cameras, Inertial Measurement Unit and GPS (Global Positioning System) antenna are determined. The efficiency of this Mobile Mapping System is experimented by mounting it on a truck and golf cart. By using derived sensor models, geo-referenced images and 3D point clouds are derived. After validating the quality of the derived data, the paper provides a framework to extract road assets both automatically and manually using techniques implementing RANSAC plane fitting and edge extraction algorithms. Then the scope of such extraction techniques along with a sample GIS (Geographic Information System) database structure for unified 3D asset inventory are discussed. PMID:26985897
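
    One of the extraction steps mentioned, RANSAC plane fitting to separate the road surface from other returns, can be sketched with scikit-learn's robust regressor. The point cloud and threshold below are synthetic and hypothetical.

      # RANSAC plane fit (z = a*x + b*y + c) to a synthetic road-plus-clutter point cloud
      import numpy as np
      from sklearn.linear_model import RANSACRegressor

      rng = np.random.default_rng(11)
      n_ground, n_clutter = 2000, 300
      xy_ground = rng.uniform(0, 20, (n_ground, 2))
      z_ground = 0.02 * xy_ground[:, 0] - 0.01 * xy_ground[:, 1] + rng.normal(0, 0.01, n_ground)
      xy_clutter = rng.uniform(0, 20, (n_clutter, 2))
      z_clutter = rng.uniform(0.3, 3.0, n_clutter)          # poles, signs, vegetation

      xy = np.vstack([xy_ground, xy_clutter])
      z = np.concatenate([z_ground, z_clutter])

      ransac = RANSACRegressor(residual_threshold=0.05, random_state=0).fit(xy, z)
      inliers = ransac.inlier_mask_                          # approximate road-surface points
      a, b = ransac.estimator_.coef_
      c = ransac.estimator_.intercept_
      print(f"plane: z = {a:.3f}x + {b:.3f}y + {c:.3f}, inliers: {inliers.sum()} of {len(z)}")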

  19. Analysis and prediction of agricultural pest dynamics with Tiko'n, a generic tool to develop agroecological food web models

    NASA Astrophysics Data System (ADS)

    Malard, J. J.; Rojas, M.; Adamowski, J. F.; Anandaraja, N.; Tuy, H.; Melgar-Quiñonez, H.

    2016-12-01

    While several well-validated crop growth models are currently widely used, very few crop pest models of the same caliber have been developed or applied, and pest models that take trophic interactions into account are even rarer. This may be due to several factors, including 1) the difficulty of representing complex agroecological food webs in a quantifiable model, and 2) the general belief that pesticides effectively remove insect pests from immediate concern. However, pests currently claim a substantial amount of harvests every year (and account for additional control costs), and the impact of insects and of their trophic interactions on agricultural crops cannot be ignored, especially in the context of changing climates and increasing pressures on crops across the globe. Unfortunately, most integrated pest management frameworks rely on very simple models (if at all), and most examples of successful agroecological management remain more anecdotal than scientifically replicable. In light of this, there is a need for validated and robust agroecological food web models that allow users to predict the response of these webs to changes in management, crops or climate, both in order to predict future pest problems under a changing climate as well as to develop effective integrated management plans. Here we present Tiko'n, a Python-based software package whose API allows users to rapidly build and validate trophic web agroecological models that predict pest dynamics in the field. The programme uses a Bayesian inference approach to calibrate the models according to field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection. We apply the model to the coconut black-headed caterpillar (Opisina arenosella) and associated parasitoid data from Sri Lanka, showing how the modeling framework can be used to rapidly develop, calibrate and validate models that elucidate how the internal structures of food webs determine their behaviour and allow users to evaluate different integrated management options.

  20. Calculating when elective abdominal aortic aneurysm repair improves survival for individual patients: development of the Aneurysm Repair Decision Aid and economic evaluation.

    PubMed

    Grant, Stuart W; Sperrin, Matthew; Carlson, Eric; Chinai, Natasha; Ntais, Dionysios; Hamilton, Matthew; Dunn, Graham; Buchan, Iain; Davies, Linda; McCollum, Charles N

    2015-04-01

    Abdominal aortic aneurysm (AAA) repair aims to prevent premature death from AAA rupture. Elective repair is currently recommended when AAA diameter reaches 5.5 cm (men) and 5.0 cm (women). Applying population-based indications may not be appropriate for individual patient decisions, as the optimal indication is likely to differ between patients based on age and comorbidities. The objectives were to develop an Aneurysm Repair Decision Aid (ARDA) to indicate when elective AAA repair optimises survival for individual patients, and to assess the cost-effectiveness and associated uncertainty of elective repair at the aneurysm diameter recommended by the ARDA compared with current practice. The UK Vascular Governance North West and National Vascular Database provided individual patient data to develop predictive models for perioperative mortality and survival. Data from published literature were used to model AAA growth and risk of rupture. The cost-effectiveness analysis used data from published literature and from local and national databases. A combination of systematic review methods and clinical registries was used to provide data to populate the models and inform the structure of the ARDA. Discrete event simulation (DES) was used to model the patient journey from diagnosis to death, and synthesised data were used to estimate patient outcomes and costs for elective repair at alternative aneurysm diameters. Eight patient clinical scenarios (vignettes) were used as exemplars. The DES structure was validated by clinical and statistical experts. The economic evaluation estimated costs, quality-adjusted life-years (QALYs) and incremental cost-effectiveness ratios (ICERs) from the NHS, social care provider and patient perspectives over a lifetime horizon. Cost-effectiveness acceptability analyses and probabilistic sensitivity analyses explored uncertainty in the data and the value for money of ARDA-based decisions. The ARDA outcome measures include perioperative mortality risk, annual risk of rupture, 1-, 5- and 10-year survival, postoperative long-term survival, median life expectancy and predicted time to the current threshold for aneurysm repair. The primary economic measure was the ICER, using the QALY as the measure of health benefit. The analysis demonstrated that it is feasible to build and run a complex clinical decision aid using DES. The model results support current guidelines for most vignettes but suggest that earlier repair may be effective in younger, fitter patients and that ongoing surveillance may be effective in elderly patients with comorbidities. The model adds information to support decisions for patients with aneurysms outside current indications. The economic evaluation suggests that using the ARDA compared with current guidelines could be cost-effective, but there is a high level of uncertainty. The lack of high-quality long-term data to populate all sections of the model meant that there is high uncertainty about the long-term clinical and economic consequences of repair. Modelling assumptions were necessary, and the developed survival models require external validation. The ARDA provides detailed information on the potential consequences of AAA repair, or of a decision not to repair, that may be helpful to vascular surgeons and their patients in reaching informed decisions. Further research is required to reduce uncertainty about key data, including reintervention following AAA repair, and to assess the acceptability and feasibility of the ARDA for use in routine clinical practice. Funding: The National Institute for Health Research Health Technology Assessment programme.
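
    A highly simplified, patient-level simulation in the spirit of the decision problem above is sketched below: it contrasts immediate elective repair with surveillance using invented perioperative mortality, rupture and background mortality rates. It illustrates the trade-off the ARDA quantifies; it is not the published DES model or its inputs.

```python
# Toy patient-level simulation contrasting "repair now" with "surveillance".
# All rates below are invented placeholders, not values from the published model.
import random

PERIOP_MORTALITY = 0.03        # assumed 30-day operative mortality
ANNUAL_RUPTURE_RISK = 0.08     # assumed yearly rupture risk at current diameter
ANNUAL_OTHER_MORTALITY = 0.05  # assumed background mortality for this patient
HORIZON_YEARS = 15

def simulate_patient(repair_now, rng):
    """Return survival time in years for one simulated patient."""
    if repair_now and rng.random() < PERIOP_MORTALITY:
        return 0.0
    for year in range(HORIZON_YEARS):
        if rng.random() < ANNUAL_OTHER_MORTALITY:
            return year + rng.random()
        if not repair_now and rng.random() < ANNUAL_RUPTURE_RISK:
            return year + rng.random()       # ruptures assumed fatal here
    return float(HORIZON_YEARS)

def mean_survival(repair_now, n=100_000, seed=42):
    rng = random.Random(seed)
    return sum(simulate_patient(repair_now, rng) for _ in range(n)) / n

print(f"mean survival, elective repair : {mean_survival(True):.2f} years")
print(f"mean survival, surveillance    : {mean_survival(False):.2f} years")
```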

  1. An economic assessment of embryo diagnostics (Dx) - the costs of introducing non-invasive embryo diagnostics into IVF standard treatment practices.

    PubMed

    Fugel, Hans-Joerg; Connolly, Mark; Nuijten, Mark

    2014-10-09

    New techniques for assessing oocyte and embryo quality are currently being explored to improve pregnancy and delivery rates per embryo transfer. While a better understanding of embryo quality could help optimize existing in vitro fertilization (IVF) therapy schemes, it is essential to address the economic viability of such technologies in the healthcare setting. An Embryo-Dx economic model was constructed to assess the cost-effectiveness of three different IVF strategies from a payer's perspective; it compares Embryo-Dx with single embryo transfer (SET) to elective single embryo transfer (eSET) and to double embryo transfer (DET) treatment practices. The introduction of a new non-invasive embryo technology (Embryo-Dx) with a cost of up to €460 is cost-effective compared with eSET and DET based on the cost per live birth. The model assumed that Embryo-Dx improves the ongoing pregnancy rate, realizing an absolute improvement in live births of 9% in this case. This study shows that improved embryo diagnosis combined with SET may have the potential to reduce the cost per live birth per couple treated in IVF practice. The results are likely more sensitive to changes in the ongoing pregnancy rate, and consequently the live birth rate, than to the diagnosis costs. The introduction of a validated Embryo-Dx technology would further support a move towards increased eSET procedures in IVF clinical practice, and vice versa.
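
    The core cost-per-live-birth comparison can be illustrated with a few lines of arithmetic. In the sketch below, only the €460 add-on cost and the 9% absolute improvement come from the abstract; the cycle cost and baseline live-birth rate are placeholder assumptions.

```python
# Back-of-envelope cost-per-live-birth comparison in the spirit of the Embryo-Dx
# model. Cycle cost and baseline birth rate are illustrative assumptions.
COST_IVF_CYCLE = 3000.0        # assumed cost of one SET/eSET cycle (EUR)
COST_EMBRYO_DX = 460.0         # add-on diagnostic cost per cycle (from the abstract)
LIVE_BIRTH_RATE_ESET = 0.25    # assumed live-birth rate per eSET cycle
LIVE_BIRTH_RATE_DX = 0.34      # assumed +9% absolute improvement with Embryo-Dx

def cost_per_live_birth(cost_per_cycle, live_birth_rate):
    """Expected cost divided by expected live births per cycle."""
    return cost_per_cycle / live_birth_rate

print(f"eSET alone     : {cost_per_live_birth(COST_IVF_CYCLE, LIVE_BIRTH_RATE_ESET):,.0f} EUR per live birth")
print(f"SET + Embryo-Dx: {cost_per_live_birth(COST_IVF_CYCLE + COST_EMBRYO_DX, LIVE_BIRTH_RATE_DX):,.0f} EUR per live birth")
```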

  2. How the Brain Decides When to Work and When to Rest: Dissociation of Implicit-Reactive from Explicit-Predictive Computational Processes

    PubMed Central

    Meyniel, Florent; Safra, Lou; Pessiglione, Mathias

    2014-01-01

    A pervasive cost-benefit problem is how to allocate effort over time, i.e. deciding when to work and when to rest. An economic decision perspective would suggest that the duration of effort is determined beforehand, depending on expected costs and benefits. However, the literature on exercise performance emphasizes that decisions are made on the fly, depending on physiological variables. Here, we propose and validate a general model of effort allocation that integrates these two views. In this model, a single variable, termed cost evidence, accumulates during effort and dissipates during rest, triggering effort cessation and resumption when reaching bounds. We assumed that such a basic mechanism could explain implicit adaptation, whereas the latent parameters (slopes and bounds) could be amenable to explicit anticipation. A series of behavioral experiments manipulating effort duration and difficulty was conducted in a total of 121 healthy humans to dissociate implicit-reactive from explicit-predictive computations. Results show 1) that effort and rest durations are adapted on the fly to variations in cost-evidence level, 2) that the cost-evidence fluctuations driving the behavior do not match explicit ratings of exhaustion, and 3) that actual difficulty impacts effort duration whereas expected difficulty impacts rest duration. Taken together, our findings suggest that cost evidence is implicitly monitored online, with an accumulation rate proportional to actual task difficulty. In contrast, cost-evidence bounds and dissipation rate might be adjusted in anticipation, depending on explicit task difficulty. PMID:24743711
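
    A minimal simulation of the proposed accumulation-to-bound mechanism is sketched below: cost evidence rises during effort and dissipates during rest, and the agent switches state at an upper and a lower bound. Parameter values are arbitrary; the sketch only illustrates that a higher accumulation rate (a harder task) shortens effort bouts.

```python
# Minimal simulation of the cost-evidence accumulation model described above.
# Parameter values are arbitrary placeholders.
import numpy as np

def simulate_effort_allocation(accum_rate, dissip_rate, upper, lower,
                               total_time=300.0, dt=0.1):
    """Return arrays of time, cost-evidence level, and working/resting state."""
    t = np.arange(0.0, total_time, dt)
    cost = np.zeros_like(t)
    working = np.zeros_like(t, dtype=bool)
    state, level = True, 0.0                 # start working with zero cost evidence
    for i in range(t.size):
        level += (accum_rate if state else -dissip_rate) * dt
        level = max(level, 0.0)
        if state and level >= upper:
            state = False                    # exhaustion bound reached -> rest
        elif not state and level <= lower:
            state = True                     # recovery bound reached -> resume effort
        cost[i], working[i] = level, state
    return t, cost, working

# A higher accumulation rate (harder task) should shorten bouts (dt = 0.1 s).
for rate in (0.5, 1.0):
    t, cost, working = simulate_effort_allocation(rate, 0.8, upper=10.0, lower=2.0)
    switches = np.flatnonzero(np.diff(working.astype(int)) != 0)
    bouts = np.diff(switches)                # samples between consecutive switches
    print(f"accumulation rate {rate}: mean bout duration ~ {bouts.mean() * 0.1:.1f} s")
```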

  3. [Pharmacoeconomic analysis of community-acquired pneumonia treatment with telithromycin or clarithromycin].

    PubMed

    Rubio-Terrés, C; Cots, J M; Domínguez-Gil, A; Herreras, A; Sánchez Gascón, F; Chang, J; Trilla, A

    2003-09-01

    A pharmacoeconomic analysis was carried out comparing the efficacy of two treatment options for community-acquired pneumonia (CAP): telithromycin and clarithromycin. It was a retrospective analysis using a decision tree model. The efficacy of the two treatment options was estimated from a randomized, double-blind clinical trial in which oral telithromycin 800 mg/day for 10 days was compared with oral clarithromycin 1000 mg/day for 10 days in patients with CAP (162 and 156 patients, respectively). Resource use was estimated from the clinical trial and Spanish sources, and unit costs from a Spanish health costs database. Costs were evaluated for the acquisition of antibiotic treatments, change of antibiotic due to therapeutic failure, hospital admissions, adverse reactions to treatment, primary care visits, tests and indirect costs (working days lost). The model was validated by a panel of Spanish clinical experts. As the clinical trial was designed to show equivalence, there were no significant differences in efficacy between the treatment options (clinical cure rates 88.3% and 88.5%, respectively), and a cost-minimization analysis was performed. In the base case, the average cost of the disease per patient was 308.29 euros with telithromycin and 331.50 euros with clarithromycin (a difference of 23.21 euros). The results were stable in the sensitivity analysis, with differences favorable to telithromycin ranging between 5.50 and 45.45 euros. Telithromycin results in a cost saving of up to 45.45 euros per CAP patient compared with clarithromycin.
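
    The cost-minimization logic reduces to comparing expected costs along the two branches of the decision tree. In the sketch below, only the per-patient totals (308.29 vs. 331.50 euros) and the failure rates implied by the reported cure rates are taken from the abstract; the drug-acquisition, failure and other cost figures are invented so that the totals come out approximately right.

```python
# Sketch of a cost-minimization comparison on a simple decision tree. The cost
# breakdown below is illustrative; only the totals echo the abstract.
def expected_cost(drug_cost, failure_rate, failure_cost, other_costs):
    """Expected cost per treated patient on one branch of the decision tree."""
    return drug_cost + failure_rate * failure_cost + other_costs

telithromycin = expected_cost(drug_cost=40.0, failure_rate=0.117,
                              failure_cost=900.0, other_costs=163.0)
clarithromycin = expected_cost(drug_cost=35.0, failure_rate=0.115,
                               failure_cost=900.0, other_costs=193.0)
print(f"expected cost/patient: telithromycin {telithromycin:.2f} EUR, "
      f"clarithromycin {clarithromycin:.2f} EUR, "
      f"difference {clarithromycin - telithromycin:.2f} EUR")
```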

  4. Residual standard deviation: Validation of a new measure of dual-task cost in below-knee prosthesis users.

    PubMed

    Howard, Charla L; Wallace, Chris; Abbas, James; Stokic, Dobrivoje S

    2017-01-01

    We developed and evaluated the properties of a new measure of variability in stride length and cadence, termed residual standard deviation (RSD). To calculate RSD, stride length and cadence are regressed against velocity to derive the best-fit line, from which the variability (SD) of the distances between the actual and predicted data points is calculated. We examined the construct, concurrent, and discriminative validity of RSD using a dual-task paradigm in 14 below-knee prosthesis users and 13 age- and education-matched controls. Subjects first walked over an electronic walkway while separately performing a serial subtraction task and a backwards spelling task, and then walked at self-selected slow, normal, and fast speeds, which were used to derive the best-fit lines for stride length and cadence against velocity. Construct validity was demonstrated by a significantly greater increase in RSD during dual-task gait in prosthesis users than in controls (group-by-condition interaction: stride length p=0.0006, cadence p=0.009). Concurrent validity was established against the coefficient of variation (CV) by moderate-to-high correlations (r=0.50-0.87) between dual-task cost RSD and dual-task cost CV for both stride length and cadence in prosthesis users and controls. Discriminative validity was documented by the ability of the dual-task cost calculated from RSD to effectively differentiate prosthesis users from controls (area under the receiver operating characteristic curve: stride length 0.863, p=0.001; cadence 0.808, p=0.007), which was better than that of the dual-task cost CV (0.692 and 0.648, respectively; not significant). These results validate RSD as a new measure of variability in below-knee prosthesis users. Future studies should include larger cohorts and other populations to ascertain its generalizability. Copyright © 2016 Elsevier B.V. All rights reserved.
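
    The RSD computation itself is straightforward to sketch: regress stride length on velocity and take the standard deviation of the residuals. The example below uses fabricated gait data and a simplified single-regression setup rather than the study's exact procedure.

```python
# Sketch of the residual standard deviation (RSD) measure: SD of residuals around
# the least-squares line stride_length ~ velocity. Data are fabricated.
import numpy as np

def residual_standard_deviation(velocity, stride_length):
    slope, intercept = np.polyfit(velocity, stride_length, 1)
    residuals = stride_length - (slope * velocity + intercept)
    return residuals.std(ddof=1)

rng = np.random.default_rng(3)
velocity = rng.uniform(0.6, 1.6, 40)                              # m/s, slow to fast walking
stride_single = 0.7 * velocity + 0.5 + rng.normal(0, 0.03, 40)    # m, single-task noise
stride_dual = 0.7 * velocity + 0.5 + rng.normal(0, 0.06, 40)      # m, noisier under dual task

rsd_single = residual_standard_deviation(velocity, stride_single)
rsd_dual = residual_standard_deviation(velocity, stride_dual)
print(f"RSD single-task: {rsd_single:.3f} m, dual-task: {rsd_dual:.3f} m, "
      f"dual-task cost: {100 * (rsd_dual - rsd_single) / rsd_single:.0f}%")
```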

  5. How a good understanding of the physical oceanography of your offshore renewables site can drive down project costs.

    NASA Astrophysics Data System (ADS)

    Royle, J.

    2016-02-01

    For an offshore renewables plant to be viable it must be safe and cost-effective to build and maintain (i.e. the conditions must not be so harsh as to excessively impede operations at the site), and it must also have an energetic enough resource to make the project attractive to investors. In order to strike the correct balance between cost and resource, reliable datasets describing the meteorological and oceanographic (metocean) environment need to be collected and analysed, and the findings correctly applied. This presentation will use three real-world examples from Iberdrola's portfolio of offshore wind farms in Europe to demonstrate the economic benefits of good-quality metocean data and robust analysis. The three examples are: 1) Moving from traditional frequency-domain persistence statistics to time-domain installation schedules driven by reliable metocean data reduces uncertainty and gives the developer a better handle on weather risk during contract negotiations. 2) By comparing the planned installation schedules from a well-validated metocean dataset with those from a coarser, low-cost, unvalidated metocean dataset, we can show that each euro invested in the quality of metocean data can reduce the uncertainty in installation schedules by four euros. 3) Careful consideration of co-varying wave and tidal parameters can justify lower-cost designs, such as lower platform levels leading to shorter and cheaper offshore wind turbine foundations. By considering the above examples we make the case for investing in the analysis of well-validated metocean models as a basis for sound financial planning of offshore renewables installations.
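
    Example 1 above concerns weather-window (persistence) analysis. The sketch below shows the basic time-domain idea on a synthetic hourly significant-wave-height series: count windows in which Hs stays below an operational limit for long enough to complete a task. The limits and the series are invented, not Iberdrola data.

```python
# Toy weather-window count from an hourly significant-wave-height (Hs) series.
# Series and operational limits are synthetic placeholders.
import numpy as np

def weather_windows(hs, hs_limit, window_hours):
    """Return start indices of non-overlapping runs where hs < hs_limit for window_hours."""
    starts, run = [], 0
    for i, below in enumerate(hs < hs_limit):
        run = run + 1 if below else 0
        if run == window_hours:
            starts.append(i - window_hours + 1)
            run = 0                          # count non-overlapping windows only
    return starts

rng = np.random.default_rng(7)
hours = 24 * 90                              # a 90-day installation campaign
hs = (1.5
      + 0.8 * np.sin(np.arange(hours) * 2 * np.pi / (24 * 5))   # ~5-day weather cycle
      + rng.gamma(2.0, 0.3, hours))                              # random roughness

windows = weather_windows(hs, hs_limit=2.0, window_hours=12)
print(f"{len(windows)} non-overlapping 12 h windows with Hs < 2.0 m in 90 days")
```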

  6. Utilizing random Forest QSAR models with optimized parameters for target identification and its application to target-fishing server.

    PubMed

    Lee, Kyoungyeul; Lee, Minho; Kim, Dongsup

    2017-12-28

    The identification of target molecules is important for understanding the mechanism of "target deconvolution" in phenotypic screening and the "polypharmacology" of drugs. Because conventional methods of identifying targets require considerable time and cost, in silico target identification has been considered an alternative solution. One of the well-known in silico methods of identifying targets involves structure-activity relationships (SARs). SARs have advantages such as low computational cost and high feasibility; however, the data dependency of the SAR approach causes an imbalance of active data and ambiguity of inactive data across targets. We developed a ligand-based virtual screening model comprising 1121 target SAR models built using a random forest algorithm. The performance of each target model was tested using the ROC curve and the mean score from an internal five-fold cross-validation. Moreover, recall rates for top-k targets were calculated to assess the performance of target ranking. A benchmark model using an optimized sampling method and parameters was examined on an external validation set. The results show recall rates of 67.6% and 73.9% for the top 11 (1% of the total targets) and top 33 targets, respectively. We provide a website where users can search the top-k targets for query ligands, publicly available at http://rfqsar.kaist.ac.kr . The target models we built can be used both for predicting the activity of ligands toward each target and for ranking candidate targets for a query ligand using a unified scoring scheme. The scores are additionally fitted to probabilities so that users can estimate how likely a ligand-target interaction is to be active. The user interface of our website is user-friendly and intuitive, offering useful information and cross-references.
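
    A stripped-down version of the per-target random forest approach and the top-k recall check is sketched below using scikit-learn. Fingerprints, activity labels and the number of targets are synthetic placeholders; the actual service models 1121 targets on curated bioactivity data.

```python
# Minimal sketch: one random forest SAR model per target, then "target fishing" by
# ranking all targets for a query ligand and checking top-k recall. Data are random
# placeholders rather than real ligands.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_targets, n_bits = 20, 128

# Fabricate one "binding profile" per target; actives share bits with the profile.
profiles = rng.random((n_targets, n_bits)) < 0.2

def make_ligands(target, n):
    x = (rng.random((n, n_bits)) < 0.1) | (profiles[target] & (rng.random((n, n_bits)) < 0.7))
    return x.astype(np.uint8)

models = []
for t in range(n_targets):
    actives = make_ligands(t, 100)
    decoys = (rng.random((100, n_bits)) < 0.15).astype(np.uint8)
    X = np.vstack([actives, decoys])
    y = np.r_[np.ones(100), np.zeros(100)]
    models.append(RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y))

def topk_recall(k, n_queries=50):
    """Fraction of query actives whose true target appears in the top-k ranking."""
    hits = 0
    for _ in range(n_queries):
        true_t = int(rng.integers(n_targets))
        query = make_ligands(true_t, 1)
        scores = [m.predict_proba(query)[0, 1] for m in models]
        hits += true_t in np.argsort(scores)[::-1][:k]
    return hits / n_queries

print(f"top-1 recall: {topk_recall(1):.2f}, top-3 recall: {topk_recall(3):.2f}")
```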

  7. Use of chemometrics to compare NIR and HPLC for the simultaneous determination of drug levels in fixed-dose combination tablets employed in tuberculosis treatment.

    PubMed

    Teixeira, Kelly Sivocy Sampaio; da Cruz Fonseca, Said Gonçalves; de Moura, Luís Carlos Brigido; de Moura, Mario Luís Ribeiro; Borges, Márcia Herminia Pinheiro; Barbosa, Euzébio Guimaraes; De Lima E Moura, Túlio Flávio Accioly

    2018-02-05

    The World Health Organization recommends that TB treatment be administered using combination therapy. The methodologies for simultaneously quantifying the associated drugs are highly complex, costly, extremely time-consuming, and produce chemical residues harmful to the environment. The need to seek alternative techniques that minimize these drawbacks is widely discussed in the pharmaceutical industry. Therefore, the objective of this study was to develop and validate a multivariate calibration model, in association with near-infrared spectroscopy (NIR), for the simultaneous determination of rifampicin, isoniazid, pyrazinamide and ethambutol. Such models allow the quality control of these medicines to be optimized using simple, fast, low-cost techniques that produce no chemical waste. In the NIR-PLS method, spectra were acquired in the 10,000-4000 cm-1 range using an infrared spectrophotometer (IRPrestige-21, Shimadzu) at a resolution of 4 cm-1 with 20 scans, under controlled temperature and humidity. For construction of the model, a central composite experimental design was employed in the program Statistica 13 (StatSoft Inc.). All spectra were treated with computational tools for multivariate analysis using partial least squares (PLS) regression in the software program Pirouette 3.11 (Infometrix, Inc.). Variable selection was performed with the QSAR modeling program. The models developed by NIR in association with multivariate analysis provided good prediction of the APIs for the external samples and were therefore validated. For the tablets, however, the slightly different quantitative compositions of excipients compared with the mixtures prepared for building the models led to results that were not statistically similar, despite having prediction errors considered acceptable in the literature. Copyright © 2017 Elsevier B.V. All rights reserved.
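
    A minimal PLS calibration of the kind described can be sketched with scikit-learn: fit PLS regression on spectra paired with known concentrations and report prediction error on held-out samples. The spectra below are simulated Gaussian bands, not real NIR measurements, and the number of latent variables is arbitrary.

```python
# Sketch of a PLS multivariate calibration: spectra -> concentrations, with a
# held-out test set. Spectra are simulated, not real NIR data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 60, 300

# Simulate spectra as concentration-weighted sums of two broad component bands + noise.
axis = np.linspace(0, 1, n_wavenumbers)
band_a = np.exp(-((axis - 0.3) ** 2) / 0.01)     # stand-in band for analyte A
band_b = np.exp(-((axis - 0.7) ** 2) / 0.02)     # stand-in band for analyte B
conc = rng.uniform(0.5, 1.5, (n_samples, 2))     # relative concentrations
spectra = conc @ np.vstack([band_a, band_b]) + rng.normal(0, 0.02, (n_samples, n_wavenumbers))

X_train, X_test, y_train, y_test = train_test_split(spectra, conc, test_size=0.25, random_state=1)
pls = PLSRegression(n_components=4).fit(X_train, y_train)
pred = pls.predict(X_test)
rmsep = np.sqrt(((pred - y_test) ** 2).mean(axis=0))
print(f"RMSEP per analyte (relative units): {np.round(rmsep, 3)}")
```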

  8. ERP evidence for selective drop in attentional costs in uncertain environments: challenging a purely premotor account of covert orienting of attention.

    PubMed

    Lasaponara, Stefano; Chica, Ana B; Lecce, Francesca; Lupianez, Juan; Doricchi, Fabrizio

    2011-07-01

    Several studies have shown that the reliability of endogenous spatial cues linearly modulates the reaction time advantage in the processing of targets at validly cued vs. invalidly cued locations, i.e. the "validity effect". This would imply that with non-predictive cues, no "validity effect" should be observed. However, contrary to this prediction, one could hypothesize that attentional benefits from valid cuing (i.e. the RT advantage for validly vs. neutrally cued targets) can still be maintained with non-predictive cues, if the brain were endowed with mechanisms allowing a selective reduction in the costs of reorienting from invalidly cued locations (i.e. a reduction of the RT disadvantage for invalidly vs. neutrally cued targets). This separate modulation of attentional benefits and costs would be adaptive in uncertain contexts where cues predict the location of targets at chance level. Through the joint recording of manual reaction times and event-related cerebral potentials (ERPs), we found that this is the case: relying on non-predictive endogenous cues results in an abatement of attentional costs and of the difference in the amplitude of the P1 brain responses evoked by invalidly vs. neutrally cued targets. In contrast, the use of non-predictive cues leaves unaffected the attentional benefits and the difference in the amplitude of the N1 responses evoked by validly vs. neutrally cued targets. At the individual level, the drop in costs with non-predictive cues was matched by equivalent lateral biases in RTs to neutrally and invalidly cued targets presented in the left and right visual fields. During the cue period, the drop in costs with non-predictive cues was preceded by a reduction of the Early Directing Attention Negativity (EDAN) over posterior occipital sites and by an enhancement of the frontal Anterior Directing Attention Negativity (ADAN) correlated with preparatory voluntary orienting. These findings demonstrate, for the first time, that the segregation of the mechanisms regulating attentional benefits and costs helps the efficiency of orienting in "uncertain" visual spatial contexts characterized by a poor probabilistic association between cues and targets. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Customizable Optical Force Sensor for Fast Prototyping and Cost-Effective Applications.

    PubMed

    Díez, Jorge A; Catalán, José M; Blanco, Andrea; García-Perez, José V; Badesa, Francisco J; García-Aracil, Nicolás

    2018-02-07

    This paper presents the development of an optical force sensor architecture aimed at prototyping and cost-effective applications, where the actual force requirements are not yet well defined or where the most suitable commercial technologies would greatly increase the cost of the device. The working principle of this sensor consists of determining the displacement of a lens by measuring the distortion of a refracted light beam. The lens is attached to an elastic interface whose elastic constant is known, allowing estimation of the force that disturbs the optical system. In order to satisfy the requirements of the design process inexpensively, the sensor can be built using rapid prototyping technologies and non-optical-grade elements. To deal with the imperfections of these manufacturing procedures and materials, four fitting models are proposed to calibrate the implemented sensor. In order to validate the system, two sensor implementations with measurement ranges of ±45 N and ±10 N are tested with the proposed models, comparing the resulting force estimates against an industrial-grade load cell. Results show that all models can estimate the loads with an error of about 6% of the measurement range.
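
    A generic calibration step for a sensor of this kind is sketched below: fit a polynomial mapping from the optical readout to the reference load-cell force and report the error as a percentage of the measurement range. The readout model and noise level are invented, and the cubic fit is a stand-in, not one of the paper's four fitting models.

```python
# Generic sensor calibration sketch: polynomial fit from readout to reference force,
# error reported as a percentage of full scale. Data are synthetic.
import numpy as np

rng = np.random.default_rng(5)
true_force = np.linspace(-45, 45, 200)                        # N, reference load cell
readout = 0.02 * true_force + 5e-5 * true_force ** 3 + rng.normal(0, 0.05, true_force.size)

coeffs = np.polyfit(readout, true_force, deg=3)               # inverse (readout -> force) fit
estimated = np.polyval(coeffs, readout)

full_scale = true_force.max() - true_force.min()              # 90 N span
error_pct = 100 * np.abs(estimated - true_force) / full_scale
print(f"max error: {error_pct.max():.1f}% of range, mean error: {error_pct.mean():.1f}% of range")
```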

  10. Customizable Optical Force Sensor for Fast Prototyping and Cost-Effective Applications

    PubMed Central

    Díez, Jorge A.; Catalán, José M.; Blanco, Andrea; García-Perez, José V.; Badesa, Francisco J.

    2018-01-01

    This paper presents the development of an optical force sensor architecture aimed at prototyping and cost-effective applications, where the actual force requirements are not yet well defined or where the most suitable commercial technologies would greatly increase the cost of the device. The working principle of this sensor consists of determining the displacement of a lens by measuring the distortion of a refracted light beam. The lens is attached to an elastic interface whose elastic constant is known, allowing estimation of the force that disturbs the optical system. In order to satisfy the requirements of the design process inexpensively, the sensor can be built using rapid prototyping technologies and non-optical-grade elements. To deal with the imperfections of these manufacturing procedures and materials, four fitting models are proposed to calibrate the implemented sensor. In order to validate the system, two sensor implementations with measurement ranges of ±45 N and ±10 N are tested with the proposed models, comparing the resulting force estimates against an industrial-grade load cell. Results show that all models can estimate the loads with an error of about 6% of the measurement range. PMID:29414861

  11. Space station advanced automation

    NASA Technical Reports Server (NTRS)

    Woods, Donald

    1990-01-01

    In the development of a safe, productive and maintainable space station, Automation and Robotics (A and R) has been identified as an enabling technology which will allow efficient operation at a reasonable cost. The Space Station Freedom's (SSF) systems are very complex and interdependent. The use of Advanced Automation (AA) will help restructure and integrate system status so that station and ground personnel can operate more efficiently. Using AA technology to augment system management functions requires a development model consisting of well-defined phases: evaluation, development, integration, and maintenance. The evaluation phase will consider system management functions against traditional solutions, implementation techniques and requirements; the end result of this phase should be a well-developed concept along with a feasibility analysis. In the development phase the AA system will be developed in accordance with a traditional Life Cycle Model (LCM) modified for Knowledge Based System (KBS) applications. A way in which both knowledge bases and reasoning techniques can be reused to control costs is explained. During the integration phase the KBS software must be integrated with conventional software, and verified and validated. The Verification and Validation (V and V) techniques applicable to these KBS are based on the ideas of consistency, minimal competency, and graph theory. The maintenance phase will be aided by having well-designed and documented KBS software.

  12. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model

    NASA Astrophysics Data System (ADS)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung

    2017-12-01

    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied across the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on a solar thermoelectric generator reported a high efficiency of 4.6%. The system had a vacuum glass enclosure, a flat-panel absorber, a thermoelectric generator and water circulation for the cold side. The theoretical and numerical approach of the current study validated the experimental results from Kraemer's study to a high degree. The numerical simulation uses a two-stage approach in ANSYS software, combining Fluent and the Thermal-Electric Systems module. The solar load model technique applies solar radiation under AM 1.5G conditions in Fluent. The analytical model applies Dr. Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, with two cover glasses and radiation shields, the STEG model can achieve a maximum efficiency of 7%.
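
    For orientation, the ideal conversion efficiency of a thermoelectric generator follows the standard expression eta = (1 - Tc/Th)(sqrt(1 + ZT) - 1)/(sqrt(1 + ZT) + Tc/Th), with ZT evaluated at the mean temperature. The snippet below evaluates it for illustrative temperatures and ZT = 1; these are not the operating conditions of the STEG in the study.

```python
# Ideal thermoelectric generator efficiency from the standard textbook formula.
# Temperatures and figure of merit below are illustrative assumptions.
import math

def teg_max_efficiency(t_hot, t_cold, zt_mean):
    carnot = 1.0 - t_cold / t_hot
    root = math.sqrt(1.0 + zt_mean)
    return carnot * (root - 1.0) / (root + t_cold / t_hot)

for t_hot in (400.0, 500.0, 600.0):            # hot-side temperature in kelvin
    eff = teg_max_efficiency(t_hot, t_cold=300.0, zt_mean=1.0)
    print(f"Th = {t_hot:.0f} K: ideal efficiency ~ {100 * eff:.1f}%")
```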

  13. Limited sampling strategy models for estimating the AUC of gliclazide in Chinese healthy volunteers.

    PubMed

    Huang, Ji-Han; Wang, Kun; Huang, Xiao-Hui; He, Ying-Chun; Li, Lu-Jin; Sheng, Yu-Cheng; Yang, Juan; Zheng, Qing-Shan

    2013-06-01

    The aim of this work is to reduce the cost of the sampling required to estimate the area under the gliclazide plasma concentration versus time curve over 60 h (AUC0-60t). Limited sampling strategy (LSS) models were established and validated using multiple regression models with 4 or fewer gliclazide concentration values. Absolute prediction error (APE), root mean square error (RMSE) and visual predictive check were used as criteria. The results of jack-knife validation showed that 10 (25.0%) of the 40 LSS models based on the regression analysis were not within an APE of 15% when using one concentration-time point. 90.2%, 91.5% and 92.4% of the 40 LSS models were capable of prediction using 2, 3 and 4 points, respectively. Limited sampling strategies were thus developed and validated for estimating the AUC0-60t of gliclazide. This study indicates that the implementation of an 80 mg dosage regimen enabled accurate prediction of AUC0-60t by the LSS model, and shows that 12, 6, 4 and 2 h after administration are the key sampling times. The combination of (12, 2 h), (12, 8, 2 h) or (12, 8, 4, 2 h) can be chosen as sampling times for predicting AUC0-60t in practical applications, according to requirements.
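
    The regression step of a limited sampling strategy is easy to sketch: simulate full concentration-time profiles, compute the reference AUC, and regress it on a small subset of sampling times. The one-compartment profiles and the chosen 2 h and 12 h samples below are illustrative assumptions, not the study's data or final model.

```python
# Limited sampling strategy sketch: predict full AUC(0-60 h) from two sampling times
# via multiple linear regression. Profiles are simulated, not the study's data.
import numpy as np

rng = np.random.default_rng(11)
n_subjects = 60
times = np.arange(0.5, 60.5, 0.5)                    # dense grid (h) for the reference AUC

# Simulate oral one-compartment profiles with between-subject variability.
ka = rng.lognormal(np.log(1.0), 0.3, n_subjects)     # absorption rate constants (1/h)
ke = rng.lognormal(np.log(0.08), 0.3, n_subjects)    # elimination rate constants (1/h)
scale = rng.lognormal(np.log(4.0), 0.2, n_subjects)  # dose/volume scaling
conc = (scale[:, None] * ka[:, None] / (ka - ke)[:, None]
        * (np.exp(-ke[:, None] * times) - np.exp(-ka[:, None] * times)))

# Reference AUC0-60 by trapezoidal integration.
auc_full = np.sum((conc[:, 1:] + conc[:, :-1]) / 2 * np.diff(times), axis=1)

# LSS: predict AUC from the 2 h and 12 h samples only.
idx = [int(np.argmin(np.abs(times - t))) for t in (2.0, 12.0)]
X = np.c_[np.ones(n_subjects), conc[:, idx]]
beta, *_ = np.linalg.lstsq(X, auc_full, rcond=None)
auc_pred = X @ beta

ape = 100 * np.abs(auc_pred - auc_full) / auc_full
print(f"subjects within 15% APE: {(ape < 15).mean() * 100:.0f}%")
```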

  14. A new framework for analysing automated acoustic species-detection data: occupancy estimation and optimization of recordings post-processing

    USGS Publications Warehouse

    Chambert, Thierry A.; Waddle, J. Hardin; Miller, David A.W.; Walls, Susan; Nichols, James D.

    2018-01-01

    The development and use of automated species-detection technologies, such as acoustic recorders, for monitoring wildlife are rapidly expanding. Automated classification algorithms provide a cost- and time-effective means to process information-rich data, but often at the cost of additional detection errors. Appropriate methods are necessary to analyse such data while dealing with the different types of detection errors. We developed a hierarchical modelling framework for estimating species occupancy from automated species-detection data. We explore the design and optimization of data post-processing procedures to account for detection errors and generate accurate estimates. Our proposed method accounts for both imperfect detection and false positive errors and utilizes information about both the occurrence and the abundance of detections to improve estimation. Using simulations, we show that our method provides much more accurate estimates than models ignoring the abundance of detections. The same findings are reached when we apply the methods to two real datasets on North American frogs surveyed with acoustic recorders. When false positives occur, estimator accuracy can be improved when a subset of detections produced by the classification algorithm is post-validated by a human observer. We use simulations to investigate the relationship between accuracy and effort spent on post-validation, and find that very accurate occupancy estimates can be obtained with as little as 1% of data being validated. Automated monitoring of wildlife provides opportunities and challenges. Our methods for analysing automated species-detection data help to meet key challenges unique to these data and will prove useful for many wildlife monitoring programs.
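
    The motivation for handling false positives can be shown with a toy simulation: when a classifier occasionally fires at unoccupied sites, the naive occupancy estimate (share of sites with at least one detection) is biased upward, and validating a small subset of detections quantifies the problem. This sketch is illustrative only and is not the hierarchical estimator developed in the paper; all rates are invented.

```python
# Toy demonstration of false-positive bias in naive occupancy estimation from
# automated acoustic detections. Rates are invented placeholders.
import numpy as np

rng = np.random.default_rng(4)
n_sites, n_visits = 200, 10
psi = 0.4           # true occupancy probability
p_det = 0.5         # per-visit detection probability at occupied sites
p_false = 0.05      # per-visit false-positive probability at any site

occupied = rng.random(n_sites) < psi
true_det = occupied[:, None] & (rng.random((n_sites, n_visits)) < p_det)
false_det = rng.random((n_sites, n_visits)) < p_false
detections = true_det | false_det

naive_occupancy = detections.any(axis=1).mean()
print(f"true occupancy: {psi:.2f}")
print(f"naive estimate with false positives: {naive_occupancy:.2f}")

# Post-validating ~1% of detections estimates the share of false positives, the kind
# of information a fuller model can exploit formally.
det_idx = np.argwhere(detections)
sample = det_idx[rng.choice(len(det_idx), size=max(1, len(det_idx) // 100), replace=False)]
share_true = np.mean([true_det[i, j] for i, j in sample])
print(f"share of validated detections that are true: {share_true:.2f}")
```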

  15. Spinal Cord Injury Clinical Registries: Improving Care across the SCI Care Continuum by Identifying Knowledge Gaps.

    PubMed

    Dvorak, Marcel F; Cheng, Christiana L; Fallah, Nader; Santos, Argelio; Atkins, Derek; Humphreys, Suzanne; Rivers, Carly S; White, Barry A B; Ho, Chester; Ahn, Henry; Kwon, Brian K; Christie, Sean; Noonan, Vanessa K

    2017-10-15

    Timely access to and ongoing delivery of care and therapeutic interventions are needed to maximize recovery and function after traumatic spinal cord injury (tSCI). To ensure these decisions are evidence-based, access to consistent, reliable, and valid sources of clinical data is required. The Access to Care and Timing Model used data from the Rick Hansen SCI Registry (RHSCIR) to generate a simulation of healthcare delivery for persons after tSCI and to test scenarios aimed at improving outcomes and reducing the economic burden of SCI. Through model development, we identified knowledge gaps and challenges in the literature and in current health outcomes data collection throughout the continuum of SCI care. The objectives of this article were to describe these gaps and to provide recommendations for bridging them. Obtaining accurate information on injury severity after tSCI was hindered by difficulties in conducting neurological assessments and classifications of SCI (e.g., timing), variations in reporting, and the lack of a validated SCI-specific measure of associated injuries. There was also limited availability of reliable data on patient factors such as multi-morbidity and patient-reported measures. Knowledge gaps related to structures (e.g., protocols) and processes (e.g., costs) at each phase of care have prevented comprehensive evaluation of system performance. Addressing these knowledge gaps will enhance comparative and cost-effectiveness evaluations to inform decision-making and standards of care. Recommendations to do so were: standardize data element collection and facilitate database linkages, validate and adopt more outcome measures for SCI, and increase opportunities for collaborations with stakeholders from diverse backgrounds.

  16. Differentiating functional brain regions using optical coherence tomography (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Gil, Daniel A.; Bow, Hansen C.; Shen, Jin-H.; Joos, Karen M.; Skala, Melissa C.

    2017-02-01

    The human brain is made up of functional regions governing movement, sensation, language, and cognition. Unintentional injury during neurosurgery can result in significant neurological deficits and morbidity. The current standard for localizing function to brain tissue during surgery, intraoperative electrical stimulation or recording, significantly increases the risk, time, and cost of the procedure. There is a need for a fast, cost-effective, and high-resolution intraoperative technique that can help avoid damage to functional brain regions. We propose that optical coherence tomography (OCT) can fill this niche by imaging differences in the cellular composition and organization of functional brain areas. We hypothesized that these would manifest as differences in the attenuation coefficient measured using OCT. Five functional regions (prefrontal, somatosensory, auditory, visual, and cerebellum) were imaged in ex vivo porcine brains (n=3), a model chosen for its white/gray matter ratio similar to that of human brains. The attenuation coefficient was calculated using a depth-resolved model and quantitatively validated with Intralipid phantoms across a physiological range of attenuation coefficients (absolute difference < 0.1 cm-1). Image analysis was performed on the attenuation coefficient images to derive quantitative endpoints. We observed a statistically significant difference among the median attenuation coefficients of these five regions (one-way ANOVA, p<0.05). Nissl-stained histology will be used to validate our results and to correlate OCT-measured attenuation coefficients with neuronal density. Additional development and validation of OCT algorithms to discriminate brain regions are planned to improve the safety and efficacy of neurosurgical procedures such as biopsy, electrode placement, and tissue resection.
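
    A commonly used depth-resolved attenuation estimate takes the general form mu[i] = I[i] / (2 * dz * sum of intensities below pixel i); whether this matches the exact model used in the presentation is an assumption. The sketch below applies that form to a synthetic A-scan.

```python
# Depth-resolved attenuation-coefficient sketch on a synthetic OCT A-scan.
# The estimate is biased near the bottom of the scan where the tail sum is truncated.
import numpy as np

def depth_resolved_attenuation(a_scan, dz_cm):
    """Per-pixel attenuation coefficient (1/cm) from a single A-scan."""
    tail = np.cumsum(a_scan[::-1])[::-1] - a_scan   # sum of intensities below each pixel
    tail = np.clip(tail, 1e-12, None)               # avoid division by zero at the bottom
    return a_scan / (2.0 * dz_cm * tail)

dz_cm = 5e-4                                         # 5 micrometre axial pixel size
depth = np.arange(1000) * dz_cm                      # 0.5 cm total depth
rng = np.random.default_rng(2)
a_scan = np.exp(-2 * 4.0 * depth) * (1 + rng.normal(0, 0.02, depth.size))  # true mu = 4 /cm

mu = depth_resolved_attenuation(a_scan, dz_cm)
print(f"median estimated mu over the top half: {np.median(mu[:500]):.2f} /cm (true value 4.00 /cm)")
```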

  17. Spinal Cord Injury Clinical Registries: Improving Care across the SCI Care Continuum by Identifying Knowledge Gaps

    PubMed Central

    Cheng, Christiana L.; Fallah, Nader; Santos, Argelio; Atkins, Derek; Humphreys, Suzanne; Rivers, Carly S.; White, Barry A.B.; Ho, Chester; Ahn, Henry; Kwon, Brian K.; Christie, Sean; Noonan, Vanessa K.

    2017-01-01

    Timely access to and ongoing delivery of care and therapeutic interventions are needed to maximize recovery and function after traumatic spinal cord injury (tSCI). To ensure these decisions are evidence-based, access to consistent, reliable, and valid sources of clinical data is required. The Access to Care and Timing Model used data from the Rick Hansen SCI Registry (RHSCIR) to generate a simulation of healthcare delivery for persons after tSCI and to test scenarios aimed at improving outcomes and reducing the economic burden of SCI. Through model development, we identified knowledge gaps and challenges in the literature and in current health outcomes data collection throughout the continuum of SCI care. The objectives of this article were to describe these gaps and to provide recommendations for bridging them. Obtaining accurate information on injury severity after tSCI was hindered by difficulties in conducting neurological assessments and classifications of SCI (e.g., timing), variations in reporting, and the lack of a validated SCI-specific measure of associated injuries. There was also limited availability of reliable data on patient factors such as multi-morbidity and patient-reported measures. Knowledge gaps related to structures (e.g., protocols) and processes (e.g., costs) at each phase of care have prevented comprehensive evaluation of system performance. Addressing these knowledge gaps will enhance comparative and cost-effectiveness evaluations to inform decision-making and standards of care. Recommendations to do so were: standardize data element collection and facilitate database linkages, validate and adopt more outcome measures for SCI, and increase opportunities for collaborations with stakeholders from diverse backgrounds. PMID:28745934

  18. Nonlinear information fusion algorithms for data-efficient multi-fidelity modelling.

    PubMed

    Perdikaris, P; Raissi, M; Damianou, A; Lawrence, N D; Karniadakis, G E

    2017-02-01

    Multi-fidelity modelling enables accurate inference of quantities of interest by synergistically combining realizations of low-cost/low-fidelity models with a small set of high-fidelity observations. This is particularly effective when the low- and high-fidelity models exhibit strong correlations, and can lead to significant computational gains over approaches that solely rely on high-fidelity models. However, in many cases of practical interest, low-fidelity models can only be well correlated to their high-fidelity counterparts for a specific range of input parameters, and potentially return wrong trends and erroneous predictions if probed outside of their validity regime. Here we put forth a probabilistic framework based on Gaussian process regression and nonlinear autoregressive schemes that is capable of learning complex nonlinear and space-dependent cross-correlations between models of variable fidelity, and can effectively safeguard against low-fidelity models that provide wrong trends. This introduces a new class of multi-fidelity information fusion algorithms that provide a fundamental extension to the existing linear autoregressive methodologies, while still maintaining the same algorithmic complexity and overall computational cost. The performance of the proposed methods is tested in several benchmark problems involving both synthetic and real multi-fidelity datasets from computational fluid dynamics simulations.
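
    A minimal sketch of the nonlinear autoregressive idea is shown below with scikit-learn Gaussian processes: a GP fit to plentiful cheap data feeds its prediction as an extra input to a second GP trained on a handful of expensive runs. The 1D test functions follow a common multi-fidelity benchmark; kernels and sample locations are arbitrary choices, and the framework in the paper is considerably richer.

```python
# Minimal nonlinear multi-fidelity sketch: gp_hi learns f_high as a function of
# (x, gp_lo(x)), so it can capture nonlinear cross-correlations between fidelities.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def f_high(x):
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

def f_low(x):
    return 0.5 * f_high(x) + 10 * (x - 0.5) - 5      # cheap, biased approximation

x_lo = np.linspace(0, 1, 25)[:, None]                 # many cheap runs
x_hi = np.array([0.0, 0.3, 0.55, 0.8, 1.0])[:, None]  # few expensive runs

gp_lo = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(0.2),
                                 normalize_y=True).fit(x_lo, f_low(x_lo).ravel())

# High-fidelity GP sees [x, gp_lo(x)] as its input.
X_hi = np.hstack([x_hi, gp_lo.predict(x_hi)[:, None]])
gp_hi = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.2, 10.0]),
                                 normalize_y=True).fit(X_hi, f_high(x_hi).ravel())

x_test = np.linspace(0, 1, 200)[:, None]
X_test = np.hstack([x_test, gp_lo.predict(x_test)[:, None]])
rmse = np.sqrt(np.mean((gp_hi.predict(X_test) - f_high(x_test).ravel()) ** 2))
print(f"multi-fidelity RMSE on [0,1]: {rmse:.3f} "
      f"(high-fidelity range ~ {np.ptp(f_high(x_test)):.1f})")
```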

  19. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict the future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data: if the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in laboratory-type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost-effective in the real world; thus the research centered on verifying the need for replication and on methodologies for generating replicated data in a cost-effective manner. The context of the debugging graph was pursued through simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model: reasonable parameter values were assigned and used to generate simulated data, which were then processed by the models in order to determine the limits of their accuracy. These experiments exploit the existing software and program specimens in AIR-LAB to measure the performance of reliability models.

  20. Multifactorial risk index for prediction of intraoperative blood transfusion in endovascular aneurysm repair.

    PubMed

    Mahmood, Eitezaz; Matyal, Robina; Mueller, Ariel; Mahmood, Feroze; Tung, Avery; Montealegre-Gallegos, Mario; Schermerhorn, Marc; Shahul, Sajid

    2018-03-01

    In some institutions, current blood-ordering practice does not distinguish minimally invasive endovascular aneurysm repair (EVAR) from open procedures, with consequent increased costs and a higher likelihood of blood product wastage for EVARs. This limitation in practice could be addressed with a reliable prediction model for transfusion risk in EVAR patients. We used the American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) database to create a model for predicting the occurrence of intraoperative blood transfusion in patients undergoing EVAR, and then tested the predictive model on the Vascular Study Group of New England (VSGNE) database. We used the ACS NSQIP database for patients who underwent EVAR from 2011 to 2013 (N = 4709) as our derivation set for identifying a risk index predicting intraoperative blood transfusion. We then developed a clinical risk score and validated this model using patients who underwent EVAR from 2003 to 2014 in the VSGNE database (N = 4478). The transfusion rates were 8.4% and 6.1% for the ACS NSQIP (derivation) and VSGNE (validation) databases, respectively. Hemoglobin concentration, American Society of Anesthesiologists class, age, and aneurysm diameter predicted blood transfusion in the derivation set. Our risk index demonstrated good discrimination in both the derivation and validation sets (C statistic = 0.73 and 0.70, respectively) and good calibration by the Hosmer-Lemeshow test (P = .27 and .31, respectively). We developed and validated a risk index for predicting the likelihood of intraoperative blood transfusion in EVAR patients; implementation of this index may facilitate blood management strategies specific to EVAR. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
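
    The modelling recipe here (fit a logistic risk index on a derivation cohort, then check discrimination on a validation cohort via the C statistic) can be sketched as follows. The cohorts below are synthetic, with made-up coefficient values; only the predictor set and cohort sizes echo the abstract.

```python
# Sketch of building and validating a transfusion risk index: logistic regression on a
# derivation set, C statistic (ROC AUC) on a validation set. Data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(8)

def make_cohort(n):
    hemoglobin = rng.normal(13.0, 1.8, n)            # g/dL
    asa_class = rng.integers(1, 5, n)                 # ASA I-IV
    age = rng.normal(74, 8, n)                        # years
    diameter = rng.normal(5.8, 0.9, n)                # aneurysm diameter, cm
    # Made-up "true" risk relationship used only to generate labels.
    logit = (-4.0 - 0.45 * (hemoglobin - 13) + 0.5 * asa_class
             + 0.03 * (age - 74) + 0.6 * (diameter - 5.8))
    y = rng.random(n) < 1 / (1 + np.exp(-logit))
    return np.c_[hemoglobin, asa_class, age, diameter], y.astype(int)

X_deriv, y_deriv = make_cohort(4709)                  # derivation-set size from the abstract
X_valid, y_valid = make_cohort(4478)                  # validation-set size from the abstract

model = LogisticRegression(max_iter=1000).fit(X_deriv, y_deriv)
c_stat = roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1])
print(f"validation C statistic: {c_stat:.2f}")
```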
