Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C
2015-11-01
Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.
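The cohort-simulation arithmetic described above can be sketched in miniature: draw paired cost and quality-adjusted-survival outcomes for many virtual control/intervention cohort pairs, then compute an incremental cost-effectiveness ratio. All distributions and dollar values below are invented for illustration; they are not the model's actual inputs.

```python
import random

def simulate_pair(rng):
    """Draw hypothetical lifetime cost ($) and QALY deltas for one
    control/intervention virtual-cohort pair (illustrative numbers only)."""
    control_cost = rng.gauss(60_000, 8_000)
    control_qaly = rng.gauss(4.0, 0.5)
    # Assume the program adds cost but improves quality-adjusted survival.
    interv_cost = control_cost + rng.gauss(5_000, 1_500)
    interv_qaly = control_qaly + rng.gauss(0.25, 0.08)
    return (interv_cost - control_cost, interv_qaly - control_qaly)

def mean_icer(n_pairs=10_000, seed=1):
    """Incremental cost-effectiveness ratio over n_pairs simulated cohort pairs."""
    rng = random.Random(seed)
    pairs = [simulate_pair(rng) for _ in range(n_pairs)]
    d_cost = sum(p[0] for p in pairs) / n_pairs
    d_qaly = sum(p[1] for p in pairs) / n_pairs
    return d_cost / d_qaly  # incremental cost per QALY gained

print(f"ICER ~= ${mean_icer():,.0f} per QALY")
```

With these assumed inputs the ICER lands near $20,000 per QALY; the real model derives the outcome distributions from Seattle Heart Failure Model scores rather than fixed Gaussians.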
A quality-based cost model for new electronic systems and products
NASA Astrophysics Data System (ADS)
Shina, Sammy G.; Saigal, Anil
1998-04-01
This article outlines a method for developing a quality-based cost model for the design of new electronic systems and products. The model incorporates a methodology for determining a cost-effective design margin allocation for electronic products and systems and its impact on manufacturing quality and cost. A spreadsheet-based cost estimating tool was developed to help implement this methodology in order for the system design engineers to quickly estimate the effect of design decisions and tradeoffs on the quality and cost of new products. The tool was developed with automatic spreadsheet connectivity to current process capability and with provisions to consider the impact of capital equipment and tooling purchases to reduce the product cost.
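The margin-to-quality-to-cost relationship described above can be illustrated with a hedged sketch: assuming a normally distributed process parameter, a wider design margin raises manufacturing yield and thus lowers the effective cost per good unit. The figures and the normality assumption are illustrative, not taken from the article's model.

```python
import math

def cost_per_good_unit(unit_cost, margin_sigma):
    """Unit cost divided by yield, where yield is the probability that a
    normally distributed parameter falls within +/- margin_sigma of nominal."""
    yield_fraction = math.erf(margin_sigma / math.sqrt(2))
    return unit_cost / yield_fraction

# Wider design margin -> higher yield -> lower effective cost per shipped unit.
for k in (2.0, 3.0, 4.0):
    print(f"{k} sigma margin -> ${cost_per_good_unit(100.0, k):.2f} per good unit")
```

A fuller model would also charge the cost of achieving the wider margin (tighter tolerances, capital equipment), which is the tradeoff the article's spreadsheet tool is built to explore.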
Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping
2015-01-01
Background: Biospecimens are essential resources for advancing basic and translational research. However, there are few data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. Methods: To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. Results: A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. Conclusion: A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data were then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products, and services, establish pricing, allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting. PMID:26697911
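The cost-recovery step the BEMT walks users through can be sketched as follows. The categories, indirect rate, and recovery percentage are illustrative assumptions, not the tool's actual formulas.

```python
def cost_recovery_fee(direct_costs, indirect_rate, percent_recovered, units):
    """Fee per biospecimen unit that recovers a chosen share of fully loaded cost.

    direct_costs: total direct costs ($) of producing `units` specimens
    indirect_rate: indirect costs as a fraction of direct costs (e.g. 0.40)
    percent_recovered: share of total cost to recover from fees (0..1)
    """
    total_cost = direct_costs * (1 + indirect_rate)
    return total_cost * percent_recovered / units

# Example: $200k direct costs, 40% indirect rate, recover 60% across 5,000 vials.
fee = cost_per_vial = cost_recovery_fee(200_000, 0.40, 0.60, 5_000)
print(f"fee per vial: ${fee:.2f}")  # 200k * 1.4 * 0.6 / 5000 = $33.60
```

Partial cost recovery of this kind is common in academic biobanks, where grants or institutional funds cover the remainder; the BEMT additionally supports forecasting and benchmarking against the anonymized survey data.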
NASA Instrument Cost/Schedule Model
NASA Technical Reports Server (NTRS)
Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George
2011-01-01
NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system level cost estimation tool; a subsystem level cost estimation tool; a database of cost and technical parameters of over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross validation), the estimating equations themselves and a demonstration of the NICM tool suite.
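One of the NICM methods named above, bootstrap cross validation of a cost estimating relationship, can be sketched on synthetic data. The power-law form cost = a * mass^b is a common CER shape assumed here for illustration; the real NICM equations and instrument database are not reproduced.

```python
import math
import random

def fit_cer(points):
    """Least-squares fit of log(cost) = log(a) + b * log(mass)."""
    n = len(points)
    xs = [math.log(m) for m, c in points]
    ys = [math.log(c) for m, c in points]
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

def bootstrap_cv(points, n_boot=200, seed=0):
    """Refit the CER on bootstrap resamples; score mean relative error
    on the out-of-bag points each resample leaves out."""
    rng = random.Random(seed)
    errors = []
    for _ in range(n_boot):
        sample = [rng.choice(points) for _ in points]
        oob = [p for p in points if p not in sample]
        if not oob:
            continue
        a, b = fit_cer(sample)
        errors.extend(abs(a * m ** b - c) / c for m, c in oob)
    return sum(errors) / len(errors)

# Synthetic "instrument" data: cost grows roughly as mass^0.8 with noise.
rng = random.Random(42)
data = [(m, 10 * m ** 0.8 * math.exp(rng.gauss(0, 0.1))) for m in range(5, 55, 5)]
print(f"mean out-of-bag error: {bootstrap_cv(data):.1%}")
```

The out-of-bag error gives an honest estimate of how the fitted CER performs on instruments it was not trained on, which is the point of the cross validation step.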
Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diakov, Victor; Cole, Wesley; Sullivan, Patrick
2015-11-01
Capacity expansion models (CEM) provide a high-level, long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on short-term (daily, hourly, and sub-hourly) scales. Production cost models (PCM) simulate routine power system operation on these shorter time scales using detailed load, transmission, and generation fleet data, minimizing production costs while meeting reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCM provide a more detailed simulation of short-term system operation and, consequently, may confirm the validity of capacity expansion predictions. Further, production cost model simulations of a system based on a capacity expansion model solution are evolutionarily sound: the generator mix is the result of a logical sequence of unit retirements and buildup driven by policy and incentives. This has motivated us to bridge CEM with PCM by building a capacity expansion-to-production cost model Linking Tool (CEPCoLT). The Linking Tool maps capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally defined ReEDS scenarios.
Landfill Gas Energy Cost Model Version 3.0 (LFGcost-Web V3.0)
To help stakeholders estimate the costs of a landfill gas (LFG) energy project, in 2002, LMOP developed a cost tool (LFGcost). Since then, LMOP has routinely updated the tool to reflect changes in the LFG energy industry. Initially the model was designed for EPA to assist landfil...
Calculation of the Actual Cost of Engine Maintenance
2003-03-01
Cost Estimating Integrated Tools (ACEIT) helps analysts store, retrieve, and analyze data; build cost models; analyze risk; time-phase budgets; and...Tools (ACEIT).” n. pag. http://www.aceit.com/ 21 February 2003. • USAMC Logistics Support Activity (LOGSA). “Cost Analysis Strategy Assessment
The Launch Systems Operations Cost Model
NASA Technical Reports Server (NTRS)
Prince, Frank A.; Hamaker, Joseph W. (Technical Monitor)
2001-01-01
One of NASA's primary missions is to reduce the cost of access to space while simultaneously increasing safety. A key component, and one of the least understood, is the recurring operations and support cost for reusable launch systems. In order to predict these costs, NASA, under the leadership of the Independent Program Assessment Office (IPAO), has commissioned the development of a Launch Systems Operations Cost Model (LSOCM). LSOCM is a tool to predict the operations & support (O&S) cost of new and modified reusable (and partially reusable) launch systems. The requirements are to predict the non-recurring cost for the ground infrastructure and the recurring cost of maintaining that infrastructure, performing vehicle logistics, and performing the O&S actions to return the vehicle to flight. In addition, the model must estimate the time required to cycle the vehicle through all of the ground processing activities. The current version of LSOCM is an amalgamation of existing tools, leveraging our understanding of shuttle operations cost with a means of predicting how the maintenance burden will change as the vehicle becomes more aircraft-like. The use of the Conceptual Operations Manpower Estimating Tool/Operations Cost Model (COMET/OCM) provides a solid point of departure based on shuttle and expendable launch vehicle (ELV) experience. The incorporation of the Reliability and Maintainability Analysis Tool (RMAT) as expressed by a set of response surface model equations gives a method for estimating how changing launch system characteristics affects cost and cycle time as compared to today's shuttle system. Plans are being made to improve the model. The development team will be spending the next few months devising a structured methodology that will enable verified and validated algorithms to give accurate cost estimates.
To assist in this endeavor the LSOCM team is part of an Agency wide effort to combine resources with other cost and operations professionals to support models, databases, and operations assessments.
Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens
2010-08-01
The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.
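The top-down finding above, that labor and overhead dominate per-slide cost and are missed by relative-value-unit estimates, can be sketched with a simple fully loaded cost calculation. All dollar figures and categories are hypothetical, not the Pathology Economic Model Tool's data.

```python
def cost_per_slide(labor, overhead, consumables, instrument_depreciation, slides):
    """Fully loaded cost per slide: spread a subarea's annual costs over
    its annual slide volume (top-down allocation)."""
    return (labor + overhead + consumables + instrument_depreciation) / slides

# Hypothetical annual figures for one histology subarea.
fully_loaded = cost_per_slide(labor=300_000, overhead=120_000,
                              consumables=60_000, instrument_depreciation=20_000,
                              slides=100_000)
materials_only = cost_per_slide(0, 0, 60_000, 20_000, 100_000)
print(f"fully loaded: ${fully_loaded:.2f}/slide vs "
      f"materials-only: ${materials_only:.2f}/slide")
```

In this invented example the materials-only figure understates the true per-slide cost severalfold, which mirrors the article's conclusion about methods that ignore specific labor and overhead components.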
A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions
NASA Technical Reports Server (NTRS)
Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver
2016-01-01
Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. 
To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.
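One concrete constellation-level effect that single-spacecraft parametric models can miss is recurring-cost learning across many identical units. A standard adjustment, assumed here for illustration rather than taken from the paper, is Wright's learning curve, where unit n costs the first-unit cost times n raised to log2 of the learning rate.

```python
import math

def constellation_recurring_cost(first_unit_cost, n_units, learning=0.90):
    """Total recurring cost of n_units under a Wright learning curve.

    learning=0.90 means each doubling of quantity cuts unit cost to 90%.
    """
    b = math.log2(learning)
    return sum(first_unit_cost * n ** b for n in range(1, n_units + 1))

# 24-satellite constellation, $10M theoretical first unit, 90% curve.
total = constellation_recurring_cost(10.0, 24)
naive = 10.0 * 24
print(f"learning-adjusted: ${total:.1f}M vs naive: ${naive:.1f}M")
```

The learning-adjusted total comes in well under the naive N-times-first-unit figure, one illustration of why estimating a constellation as N independent spacecraft misstates cost.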
NASA Astrophysics Data System (ADS)
Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.
2012-08-01
The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation on a system level, particularly for early mission phases during which specifications and requirements are not yet crystallised, and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. A first step to any program budget is a representative cost estimate, which usually hinges on a particular estimation approach, or methodology. Therefore, appropriate selection of specific cost models, methods and tools is paramount, a difficult task given the highly variable nature, scope, and scientific and technical requirements of each program. Numerous methods, models and tools exist. However, new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase, when system specifications are limited but the available research budget needs to be established and defined. For highly specific vehicles such as manned reusable launchers, the lack of historical data means that both the classic heuristic approach, such as parametric cost estimation based on underlying cost estimating relationships (CERs), and the analogy approach are, by definition, limited. This review identifies prominent cost estimation models applied to the space sector, and their underlying cost driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted.
Current approaches which strategically amalgamate various cost estimation strategies both for formulation and validation of an estimate, and techniques and/or methods to attain representative and justifiable cost estimates are consequently discussed. Ultimately, the aim of the paper is to establish a baseline for development of a non-commercial, low cost, transparent cost estimation methodology to be applied during very early program research phases at a complete vehicle system level, for largely unprecedented manned launch vehicles in the future. This paper takes the first step to achieving this through the identification, analysis and understanding of established, existing techniques, models, tools and resources relevant within the space sector.
Watershed Management Optimization Support Tool (WMOST) v2: Theoretical Documentation
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...
System capacity and economic modeling computer tool for satellite mobile communications systems
NASA Technical Reports Server (NTRS)
Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.
1988-01-01
A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain whether a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. The model is implemented in a Lotus 1-2-3 spreadsheet on a personal computer so that it can be as universally applicable as possible and can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.
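The engineering-to-financial handoff the model performs can be sketched as a capacity-constrained revenue check: the engineering side bounds usable channels, and the financial side asks what utilization covers cost. All figures below are hypothetical; the original implemented this logic in Lotus 1-2-3 spreadsheets.

```python
def annual_revenue(channels, fill_factor, price_per_channel_yr):
    """Revenue supported by a finite satellite channel capacity."""
    return channels * fill_factor * price_per_channel_yr

def breakeven_fill(annual_cost, channels, price_per_channel_yr):
    """Fill factor at which revenue exactly covers annual system cost."""
    return annual_cost / (channels * price_per_channel_yr)

# 2,000 usable channels at $5,000 per channel-year against a $6M/yr system cost.
fill = breakeven_fill(6_000_000, 2_000, 5_000)
print(f"break-even fill: {fill:.0%}")
```

If the break-even fill factor exceeds plausible demand, the system with that satellite resource cannot support itself financially, which is exactly the question the abstract says the model was built to answer.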
Timothy M. Young; James H. Perdue; Andy Hartsell; Robert C. Abt; Donald Hodges; Timothy G. Rials
2009-01-01
Optimal locations for biomass facilities that use mill residues are identified for 13 southern U.S. states. The Biomass Site Assessment Tool (BioSAT) model is used to identify the top 20 locations for 13 southern U.S. states. The trucking cost model of BioSAT is used with Timber Mart South 2009 price data to estimate the total cost, average cost, and marginal costs for...
Scale models: A proven cost-effective tool for outage planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, R.; Segroves, R.
1995-03-01
As generation costs for operating nuclear stations have risen, more nuclear utilities have initiated efforts to improve cost effectiveness. Nuclear plant owners are also being challenged with lower radiation exposure limits and newly revised radiation protection regulations (10 CFR 20), which place further stress on their budgets. As source term reduction activities continue to lower radiation fields, reducing the amount of time spent in radiation fields becomes one of the most cost-effective ways of reducing radiation exposure. An effective approach for minimizing time spent in radiation areas is to use a physical scale model for worker orientation planning and for monitoring maintenance, modifications, and outage activities. To meet the challenge of continued reduction in annual cumulative radiation exposures, new cost-effective tools are required. One field-tested and proven tool is the physical scale model.
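The exposure arithmetic behind the scale-model argument is simple: collective dose is dose rate times stay time times crew size, so cutting time in the field cuts dose proportionally. The numbers below are illustrative, not from the article.

```python
def collective_dose(dose_rate_mrem_hr, hours, workers):
    """Collective exposure (person-mrem) = field dose rate x stay time x crew size."""
    return dose_rate_mrem_hr * hours * workers

baseline = collective_dose(50, hours=40, workers=10)   # unrehearsed outage job
rehearsed = collective_dose(50, hours=28, workers=10)  # assume 30% less field time
print(f"saved: {baseline - rehearsed:.0f} person-mrem")
```

The assumed 30% time reduction from mock-up rehearsal is hypothetical; the point is only that any reduction in stay time translates linearly into dose and, through ALARA cost accounting, into money saved.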
NASA Technical Reports Server (NTRS)
Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.
1992-01-01
A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for the development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
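A theoretical cost function of the kind described, relating geometric design features to summed material cost and labor content, might look like the sketch below. Every coefficient is invented for illustration; COSTADE's actual cost functions are not reproduced here.

```python
def composite_part_cost(area_m2, thickness_mm, ply_count, rate_usd_hr=60.0,
                        material_usd_kg=40.0, density_kg_m3=1600.0):
    """Hypothetical first-order cost function for a flat composite panel:
    material cost from geometry plus labor that scales with plies laid."""
    mass_kg = area_m2 * (thickness_mm / 1000.0) * density_kg_m3
    material = mass_kg * material_usd_kg
    # Assume hand-layup labor scales with plies laid over the panel area.
    labor_hours = 0.2 * ply_count * area_m2
    return material + labor_hours * rate_usd_hr

# Trade study: baseline panel vs a thickened (more plies) variant.
base = composite_part_cost(2.0, 4.0, ply_count=16)
thick = composite_part_cost(2.0, 6.0, ply_count=24)
print(f"baseline ${base:.0f}, thickened ${thick:.0f}")
```

Because both cost and mass are closed-form functions of the same geometric features, a function like this can sit alongside a weight equation in a two-objective optimization, which is the designers'-tool integration the abstract describes.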
Watershed Management Optimization Support Tool (WMOST) v2: User Manual and Case Studies
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...
Bus Lifecycle Cost Model for Federal Land Management Agencies.
DOT National Transportation Integrated Search
2011-09-30
The Bus Lifecycle Cost Model is a spreadsheet-based planning tool that estimates capital, operating, and maintenance costs for various bus types over the full lifecycle of the vehicle. The model is based on a number of operating characteristics, incl...
Chaiyakunapruk, Nathorn; Somkrua, Ratchadaporn; Hutubessy, Raymond; Henao, Ana Maria; Hombach, Joachim; Melegaro, Alessia; Edmunds, John W; Beutels, Philippe
2011-05-12
Several decision support tools have been developed to aid policymaking regarding the adoption of pneumococcal conjugate vaccine (PCV) into national pediatric immunization programs. The lack of critical appraisal of these tools makes it difficult for decision makers to understand and choose between them. To guide policymakers on their optimal use, we compared publicly available decision-making tools in relation to their methods, influential parameters, and results. The World Health Organization (WHO) requested access to several publicly available cost-effectiveness (CE) tools for PCV from both public and private provenance. All tools were critically assessed according to the WHO's guide for economic evaluations of immunization programs. Key attributes and characteristics were compared, and a series of sensitivity analyses was performed to determine the main drivers of the results. The results were compared based on a standardized set of input parameters and assumptions. Three cost-effectiveness modeling tools were provided, including two cohort-based models (the Pan-American Health Organization (PAHO) ProVac Initiative TriVac, and PneumoADIP) and one population-based model (GlaxoSmithKline's SUPREMES). They all compared the introduction of PCV into a national pediatric immunization program with no PCV use. The models differed in terms of model attributes, structure, and data requirements, but captured a similar range of diseases. Herd effects were estimated using different approaches in each model. The main driving parameters were vaccine efficacy against pneumococcal pneumonia, vaccine price, vaccine coverage, serotype coverage, and disease burden. With a standardized set of input parameters developed for cohort modeling, TriVac and PneumoADIP produced similar incremental costs and health outcomes, and incremental cost-effectiveness ratios.
Vaccine cost (dose price and number of doses), vaccine efficacy, and the epidemiology of critical endpoints (for example, the incidence of pneumonia and the distribution of serotypes causing pneumonia) were influential parameters in the models we compared. Understanding the differences and similarities of such CE tools through regular comparisons could render decision-making processes in different countries more efficient, as well as provide guiding information for further clinical and epidemiological research. A tool comparison exercise using standardized data sets can help model developers be more transparent about their model structure and assumptions, and can provide analysts and decision makers with a more in-depth view of the underlying disease dynamics. Adherence to the WHO guide for economic evaluations of immunization programs may also facilitate this process. Please see related article: http://www.biomedcentral.com/1741-7007/9/55.
Ferry Lifecycle Cost Model for Federal Land Management Agencies : User's Guide.
DOT National Transportation Integrated Search
2011-09-30
The Ferry Lifecycle Cost Model (model) is a spreadsheet-based sketch planning tool that estimates capital, operating, and total cost for various vessels that could be used to provide ferry service on a particular route given known service parameters....
Cost analysis of objective resident cataract surgery assessments.
Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M
2015-05-01
To compare 8 ophthalmology resident surgical training tools to determine which is most cost effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs of running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on the cost of the tools, the cost of time spent in evaluations, and supply and maintenance costs. For wet laboratory simulation, Eyesi was the least expensive cataract surgery simulation method; however, it is only capable of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for resident evaluation of operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and the Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analysis of ophthalmology resident surgical training tools is needed so residency programs can implement tools that are valid, reliable, objective, and cost effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Validation of the OpCost logging cost model using contractor surveys
Conor K. Bell; Robert F. Keefe; Jeremy S. Fried
2017-01-01
OpCost is a harvest and fuel treatment operations cost model developed to function as both a standalone tool and an integrated component of the Bioregional Inventory Originated Simulation Under Management (BioSum) analytical framework for landscape-level analysis of forest management alternatives. OpCost is an updated implementation of the Fuel Reduction Cost Simulator...
NASA Technical Reports Server (NTRS)
Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.
1992-01-01
The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
Cost Modeling for Space Telescope
NASA Technical Reports Server (NTRS)
Stahl, H. Philip
2011-01-01
Parametric cost models are an important tool for planning missions, comparing concepts, and justifying technology investments. This paper presents ongoing efforts to develop single-variable and multivariable cost models for the space telescope optical telescope assembly (OTA). These models are based on data collected from historical space telescope missions. Standard statistical methods are used to derive cost estimating relationships (CERs) for OTA cost versus aperture diameter and mass. The results are compared with previously published models.
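Single-variable CERs of the kind described above typically take a power-law form, Cost = a·D^b, fit by least squares in log-log space. A minimal sketch of that fitting step; the aperture/cost data points below are entirely hypothetical, not values from the paper:

```python
import math

def fit_power_law_cer(diameters_m, costs_musd):
    """Fit Cost = a * D**b by ordinary least squares in log-log space."""
    xs = [math.log(d) for d in diameters_m]
    ys = [math.log(c) for c in costs_musd]
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    # Slope of the log-log regression line is the exponent b
    b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
        sum((x - x_bar) ** 2 for x in xs)
    a = math.exp(y_bar - b * x_bar)  # intercept back-transformed to a
    return a, b

# Hypothetical data points (aperture diameter in meters, OTA cost in $M)
a, b = fit_power_law_cer([0.5, 1.0, 2.4, 3.5], [30.0, 95.0, 480.0, 1100.0])
estimate = a * 1.5 ** b  # predicted cost for a 1.5 m aperture
```

The same fit applies to a mass-based CER by swapping the diameter column for mass.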
Watkins, David; Lubinga, Solomon J; Mayosi, Bongani; Babigumira, Joseph B
2016-08-01
Rheumatic heart disease (RHD) prevalence and mortality rates remain especially high in many parts of Africa. While effective prevention and treatment exist, coverage rates of the various interventions are low. Little is known about the comparative cost-effectiveness of different RHD interventions in limited resource settings. We developed an economic evaluation tool to assist ministries of health in allocating resources and planning RHD control programs. We constructed a Markov model of the natural history of acute rheumatic fever (ARF) and RHD, taking transition probabilities and intervention effectiveness data from previously published studies and expert opinion. Our model estimates the incremental cost-effectiveness of scaling up coverage of primary prevention (PP), secondary prevention (SP) and heart valve surgery (VS) interventions for RHD. We take a healthcare system perspective on costs and measure outcomes as disability-adjusted life-years (DALYs), discounting both at 3%. Univariate and probabilistic sensitivity analyses are also built into the modeling tool. We illustrate the use of this model in a hypothetical low-income African country, drawing on available disease burden and cost data. We found that, in our hypothetical country, PP would be cost saving and SP would be very cost-effective. International referral for VS (e.g., to a country like India that has existing surgical capacity) would be cost-effective, but building in-country VS services would not be cost-effective at typical low-income country thresholds. Our cost-effectiveness analysis tool is designed to inform priorities for ARF/RHD control programs in Africa at the national or subnational level. In contrast to previous literature, our preliminary findings suggest PP could be the most efficient and cheapest approach in poor countries. We provide our model for public use in the form of a Supplementary File. 
Our research has immediate policy relevance and calls for renewed efforts to scale up RHD prevention.
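The cost-effectiveness logic described above (healthcare-system costs and DALYs, both discounted at 3%) reduces to a ratio of discounted increments. A minimal sketch; the cost and DALY streams below are hypothetical illustrations, not outputs of the Markov model:

```python
def discounted_sum(values, rate=0.03):
    """Present value of a yearly stream, discounted at 3% as in the paper."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(values))

def icer(costs_base, dalys_base, costs_scaled, dalys_scaled, rate=0.03):
    """Incremental cost per DALY averted between two coverage scenarios."""
    d_cost = discounted_sum(costs_scaled, rate) - discounted_sum(costs_base, rate)
    d_dalys = discounted_sum(dalys_base, rate) - discounted_sum(dalys_scaled, rate)
    return d_cost / d_dalys

# Hypothetical 3-year streams for a secondary-prevention scale-up
value = icer(costs_base=[100_000] * 3, dalys_base=[900] * 3,
             costs_scaled=[160_000] * 3, dalys_scaled=[650] * 3)
```

A scenario is then judged against a country-specific cost-per-DALY threshold.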
Pros, Cons, and Alternatives to Weight Based Cost Estimating
NASA Technical Reports Server (NTRS)
Joyner, Claude R.; Lauriem, Jonathan R.; Levack, Daniel H.; Zapata, Edgar
2011-01-01
Many cost estimating tools use weight as a major parameter in projecting cost. Weight is often combined with modifying factors such as complexity, technical maturity of the design, and operating environment to increase the fidelity of the estimate. For a set of conceptual designs, all meeting the same requirements, increased weight can be a major driver of increased cost. However, once a design is fixed, increased weight generally decreases cost, while decreased weight generally increases cost, and the relationship is not linear. Alternative approaches to estimating cost without using weight (except perhaps for materials costs) have been attempted in order to produce a tool usable throughout the design process, from concept studies through development. This paper addresses the pros and cons of using weight-based models for cost estimating, using liquid rocket engines as the example. It then examines approaches that minimize the impact of weight-based cost estimating. The Rocket Engine Cost Model (RECM) is an attribute-based model developed internally by Pratt & Whitney Rocketdyne for NASA. RECM is presented primarily to show a successful method of using design and programmatic parameters instead of weight to estimate both design and development costs and production costs. An operations model developed by KSC, the Launch and Landing Effects Ground Operations model (LLEGO), is also discussed.
NASA Astrophysics Data System (ADS)
Lei, Xiaohui; Wang, Yuhui; Liao, Weihong; Jiang, Yunzhong; Tian, Yu; Wang, Hao
2011-09-01
Many regions of China are still threatened by frequent floods and water resource shortages. Consequently, the task of reproducing and predicting the hydrological process in watersheds is demanding and unavoidable for reducing the risk of damage and loss. It is therefore necessary to develop an efficient and cost-effective hydrological tool for China, as many areas need to be modeled. Currently, established hydrological tools such as Mike SHE and ArcSWAT (the soil and water assessment tool based on ArcGIS) show significant power in improving the precision of hydrological modeling in China by considering spatial variability in both land cover and soil type. However, adopting commercial tools in such a large developing country comes at a high cost. Commercial modeling tools usually contain large numbers of formulas, complicated data formats, and many preprocessing or postprocessing steps that may make it difficult for the user to carry out a simulation, thus lowering the efficiency of the modeling process. Moreover, commercial hydrological models usually cannot be modified or improved to suit some of the special hydrological conditions in China. Some other hydrological models are open source but are integrated into commercial GIS systems. Therefore, by integrating the hydrological simulation code EasyDHM, a hydrological simulation tool named MWEasyDHM was developed based on the open-source MapWindow GIS. Its purpose is to establish the first open-source, GIS-based distributed hydrological model tool in China by integrating modules for preprocessing, model computation, parameter estimation, result display, and analysis. MWEasyDHM provides users with a friendly MapWindow GIS interface, selectable multifunctional hydrological processing modules, and, more importantly, an efficient and cost-effective hydrological simulation tool.
The general construction of MWEasyDHM consists of four major parts: (1) a general GIS module for hydrological analysis, (2) a preprocessing module for modeling inputs, (3) a model calibration module, and (4) a postprocessing module. The general GIS module for hydrological analysis is developed on the basis of the fully open-source GIS software MapWindow, which contains basic GIS functions. The preprocessing module is made up of three submodules: a DEM-based submodule for hydrological analysis, a submodule for default parameter calculation, and a submodule for the spatial interpolation of meteorological data. The calibration module supports parallel computation, real-time computation, and visualization. The postprocessing module includes visualization of calibration results and of model output in tabular form and as spatial grids. MWEasyDHM makes efficient modeling and calibration of EasyDHM possible, and promises further development of cost-effective applications in various watersheds.
Guidelines and Metrics for Assessing Space System Cost Estimates
2008-01-01
Sang-Kyun Han; Han-Sup Han; William J. Elliot; Edward M. Bilek
2017-01-01
We developed a spreadsheet-based model, named ThinTool, to evaluate the cost of mechanical fuel reduction thinning including biomass removal, to predict net energy output, and to assess nutrient impacts from thinning treatments in northern California and southern Oregon. A combination of literature reviews, field-based studies, and contractor surveys was used to...
Lives Saved Tool (LiST) costing: a module to examine costs and prioritize interventions.
Bollinger, Lori A; Sanders, Rachel; Winfrey, William; Adesina, Adebiyi
2017-11-07
Achieving the Sustainable Development Goals will require careful allocation of resources in order to achieve the highest impact. The Lives Saved Tool (LiST) has been used widely to calculate the impact of maternal, neonatal and child health (MNCH) interventions for program planning and multi-country estimation in several Lancet Series commissions. As use of the LiST model increases, many have expressed a desire to cost interventions within the model, in order to support budgeting and prioritization of interventions by countries. A limited LiST costing module was introduced several years ago, but with gaps in cost types. Updates to inputs have now been added to make the module fully functional for a range of uses. This paper builds on previous work that developed an initial version of the LiST costing module to provide costs for MNCH interventions using an ingredients-based costing approach. Here, we update in 2016 the previous econometric estimates from 2013 with newly-available data and also include above-facility level costs such as program management. The updated econometric estimates inform percentages of intervention-level costs for some direct costs and indirect costs. These estimates add to existing values for direct cost requirements for items such as drugs and supplies and required provider time which were already available in LiST Costing. Results generated by the LiST costing module include costs for each intervention, as well as disaggregated costs by intervention including drug and supply costs, labor costs, other recurrent costs, capital costs, and above-service delivery costs. These results can be combined with mortality estimates to support prioritization of interventions by countries. The LiST costing module provides an option for countries to identify resource requirements for scaling up a maternal, neonatal, and child health program, and to examine the financial impact of different resource allocation strategies. 
It can be a useful tool for countries as they seek to identify the best investments for scarce resources. The purpose of the LiST model is to provide a tool to make resource allocation decisions in a strategic planning process through prioritizing interventions based on resulting impact on maternal and child mortality and morbidity.
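The ingredients-based costing approach described above can be sketched roughly as quantity times unit cost per case, scaled by coverage, with indirect (above-service-delivery) costs applied as a percentage markup; every figure below is a hypothetical placeholder, not a LiST default:

```python
def intervention_cost(target_pop, coverage, ingredients, indirect_pct=0.0):
    """Ingredients-based costing sketch: direct cost = cases reached times
    the sum over inputs of (quantity per case x unit cost), with indirect
    above-service-delivery costs added as a percentage markup."""
    cases = target_pop * coverage
    direct = cases * sum(qty * unit for qty, unit in ingredients)
    return direct * (1.0 + indirect_pct)

# Hypothetical example: oral rehydration therapy for one district
cost = intervention_cost(
    target_pop=120_000, coverage=0.6,
    ingredients=[(2.0, 0.08),        # 2 ORS sachets x $0.08 each
                 (10 / 60, 3.00)],   # 10 min provider time x $3.00/hour
    indirect_pct=0.25)               # program management markup, assumed
```

Summing such per-intervention costs across a scale-up plan yields the total resource requirement that can then be set against mortality impact.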
Animated-simulation modeling facilitates clinical-process costing.
Zelman, W N; Glick, N D; Blackmore, C C
2001-09-01
Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making.
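The advantage of reporting cost as a range rather than a single point estimate can be illustrated with a small Monte Carlo sketch; the process steps, time distributions, and rates below are hypothetical, not taken from the article:

```python
import random
import statistics

def simulate_process_cost(n_runs=10000, seed=42):
    """Monte Carlo sketch: each run draws per-step times and costs,
    producing a distribution of total cost rather than a point value."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_runs):
        triage_min = rng.triangular(5, 20, 10)   # minutes, hypothetical
        exam_min = rng.triangular(10, 45, 20)
        labor_cost = (triage_min + exam_min) / 60 * 85.0  # $/hour, assumed
        supplies = rng.uniform(12, 30)
        totals.append(labor_cost + supplies)
    totals.sort()
    return (totals[int(0.05 * n_runs)],          # 5th percentile
            statistics.mean(totals),
            totals[int(0.95 * n_runs)])          # 95th percentile

low, mean, high = simulate_process_cost()
```

A spreadsheet would return only the mean; the percentile bounds convey how much the process cost can plausibly vary.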
Public biobanks: calculation and recovery of costs.
Clément, Bruno; Yuille, Martin; Zatloukal, Kurt; Wichmann, Heinz-Erich; Anton, Gabriele; Parodi, Barbara; Kozera, Lukasz; Bréchot, Christian; Hofman, Paul; Dagher, Georges
2014-11-05
A calculation grid developed by an international expert group was tested across biobanks in six countries to evaluate costs for collections of various types of biospecimens. The assessment yielded a tool for setting specimen-access prices that were transparently related to biobank costs, and the tool was applied across three models of collaborative partnership. Copyright © 2014, American Association for the Advancement of Science.
Department of the Army Cost Analysis Manual
2002-05-01
Department of the Army Cost Analysis Manual
2001-05-01
The Shuttle Cost and Price model
NASA Technical Reports Server (NTRS)
Leary, Katherine; Stone, Barbara
1983-01-01
The Shuttle Cost and Price (SCP) model was developed as a tool to assist in evaluating major aspects of Shuttle operations that have direct and indirect economic consequences. It incorporates the major aspects of NASA Pricing Policy and corresponds to the NASA definition of STS operating costs. An overview of the SCP model is presented and the cost model portion of SCP is described in detail. Selected recent applications of the SCP model to NASA Pricing Policy issues are presented.
Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.
Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P
2010-12-22
Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource, Roundup, using cloud computing, describe the operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal cost. Utilizing the comparative genomics tool Roundup as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service Elastic MapReduce and maximize use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared, which determines in advance the optimal order in which jobs should be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable to other comparative genomics tools and is potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
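The idea of ordering jobs by predicted runtime so that cluster nodes stay busy can be sketched with a greedy longest-processing-time-first heuristic; the node count and runtimes below are hypothetical, and the published model's actual scheduling logic may differ:

```python
import heapq

def schedule_lpt(job_runtimes_h, n_nodes):
    """Greedy longest-processing-time-first assignment: submit the longest
    predicted jobs first, always to the currently least-loaded node.
    Returns per-node total runtimes (makespan = max of the list)."""
    loads = [(0.0, node) for node in range(n_nodes)]
    heapq.heapify(loads)
    for runtime in sorted(job_runtimes_h, reverse=True):
        load, node = heapq.heappop(loads)     # least-loaded node so far
        heapq.heappush(loads, (load + runtime, node))
    return [load for load, _ in sorted(loads, key=lambda t: t[1])]

# Hypothetical predicted runtimes (hours) for genome-pair comparisons
loads = schedule_lpt([7.0, 5.0, 4.0, 3.0, 2.0, 2.0, 1.0], 3)
```

Random submission order tends to leave nodes idle near the end of the batch; front-loading long jobs narrows that idle tail, which is the waste the paper's 40% saving refers to.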
NASA Astrophysics Data System (ADS)
Veerakamolmal, Pitipong; Lee, Yung-Joon; Fasano, J. P.; Hale, Rhea; Jacques, Mary
2002-02-01
In recent years, regulators, manufacturers, and consumers have focused increasingly on the issue of end-of-life management for electronics. This paper presents an overview of a conceptual study designed to examine the costs and benefits of several different Product Take Back (PTB) scenarios for used electronics equipment. The study utilized a reverse logistics supply chain model to examine the effects of several different factors in PTB programs. The model was built using the IBM supply chain optimization tool known as WIT (Watson Implosion Technology). Using the WIT tool, we were able to determine a theoretical optimal-cost scenario for PTB programs. The study was designed to assist IBM internally in determining theoretically optimal PTB program models and potential incentives for increasing participation rates.
Payload accommodation and development planning tools - A Desktop Resource Leveling Model (DRLM)
NASA Technical Reports Server (NTRS)
Hilchey, John D.; Ledbetter, Bobby; Williams, Richard C.
1989-01-01
The Desktop Resource Leveling Model (DRLM) has been developed as a tool to rapidly structure and manipulate accommodation, schedule, and funding profiles for any kind of experiments, payloads, facilities, and flight systems or other project hardware. The model creates detailed databases describing 'end item' parameters, such as mass, volume, power requirements or costs and schedules for payload, subsystem, or flight system elements. It automatically spreads costs by calendar quarters and sums costs or accommodation parameters by total project, payload, facility, payload launch, or program phase. Final results can be saved or printed out, automatically documenting all assumptions, inputs, and defaults.
DIDEM - An integrated model for comparative health damage costs calculation of air pollution
NASA Astrophysics Data System (ADS)
Ravina, Marco; Panepinto, Deborah; Zanetti, Maria Chiara
2018-01-01
Air pollution represents a continuous hazard to human health. Administrations, companies, and the population need efficient indicators of the possible effects of a change in decision, strategy, or habit. The monetary quantification of the health effects of air pollution through the definition of external costs is increasingly recognized as a useful indicator to support decision making and information at all levels. The development of modelling tools for the calculation of external costs can support analysts in producing consistent and comparable assessments. In this paper, the DIATI Dispersion and Externalities Model (DIDEM) is presented. The DIDEM model calculates the delta external costs of air pollution by comparing two alternative emission scenarios. This tool integrates CALPUFF's advanced dispersion modelling with the latest WHO recommendations on concentration-response functions. The model is based on the impact pathway method. It was designed to work at a fine spatial resolution with a local or national geographic scope. The modular structure allows users to input their own data sets. The DIDEM model was tested on a real case study, a comparative analysis of the district heating system in Turin, Italy. Additional advantages and drawbacks of the tool are discussed in the paper, along with a comparison with other existing models worldwide.
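At its core, the impact pathway method prices the concentration change between two scenarios through concentration-response functions and unit damage costs. A minimal sketch; the population, CRF slopes, and unit costs below are hypothetical placeholders, not DIDEM values:

```python
def delta_external_cost(scenarios, population, crf_per_ugm3, euro_per_case):
    """Impact-pathway sketch: external cost difference between two emission
    scenarios = sum over pollutants of (delta concentration) x exposed
    population x CRF slope (cases per person per ug/m3) x unit damage cost."""
    base, alt = scenarios
    total = 0.0
    for pollutant in base:
        d_conc = alt[pollutant] - base[pollutant]            # ug/m3
        cases = d_conc * population * crf_per_ugm3[pollutant]
        total += cases * euro_per_case[pollutant]
    return total

# Hypothetical annual-mean concentrations for two heating scenarios
base = {"PM2.5": 18.0, "NO2": 42.0}
alt = {"PM2.5": 15.5, "NO2": 40.0}
cost = delta_external_cost((base, alt), population=880_000,
                           crf_per_ugm3={"PM2.5": 6e-5, "NO2": 2e-5},
                           euro_per_case={"PM2.5": 60_000, "NO2": 15_000})
# a negative value means avoided external cost under the alternative scenario
```

In DIDEM the delta concentrations come from CALPUFF dispersion runs rather than being supplied directly, but the monetization step has this shape.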
NASA Astrophysics Data System (ADS)
Mikkili, Suresh; Panda, Anup Kumar; Prattipati, Jayanthi
2015-06-01
Researchers increasingly want to develop their models in a real-time environment. Simulation tools have been widely used for the design and improvement of electrical systems since the mid-twentieth century, and their evolution has progressed in step with the evolution of computing technologies. In recent years, computing technologies have improved dramatically in performance and become widely available at a steadily decreasing cost; consequently, simulation tools have also seen dramatic performance gains and steady cost decreases. Researchers and engineers now have access to affordable, high-performance simulation tools that were previously too cost-prohibitive for all but the largest manufacturers. This work introduces a specific class of digital simulator known as a real-time simulator by answering the questions "what is real-time simulation", "why is it needed", and "how does it work". The latest trend in real-time simulation consists of exporting simulation models to FPGAs. In this article, the steps involved in moving a model from MATLAB to real-time execution are described in detail.
Nanocomposites for Machining Tools
Loginov, Pavel; Mishnaevsky, Leon; Levashov, Evgeny
2017-01-01
Machining tools are used in many areas of production. To a considerable extent, the performance characteristics of the tools determine the quality and cost of the products obtained. The main materials used for producing machining tools are steel, cemented carbides, ceramics, and superhard materials. A promising way to improve the performance characteristics of these materials is to design new nanocomposites based on them. The application of micromechanical modeling during the development of composite materials for machining tools can reduce the financial and time costs of developing new tools with enhanced performance. This article reviews the main groups of nanocomposites for machining tools and their performance. PMID:29027926
A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions
NASA Technical Reports Server (NTRS)
Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver L.
2016-01-01
Satellite constellations and Distributed Spacecraft Mission (DSM) architectures offer unique benefits to Earth observation scientists and unique challenges to cost estimators. The Cost and Risk (CR) module of the Tradespace Analysis Tool for Constellations (TAT-C) being developed by NASA Goddard seeks to address some of these challenges by providing a new approach to cost modeling, which aggregates existing Cost Estimating Relationships (CER) from respected sources, cost estimating best practices, and data from existing and proposed satellite designs. Cost estimation through this tool is approached from two perspectives: parametric cost estimating relationships and analogous cost estimation techniques. The dual approach utilized within the TAT-C CR module is intended to address prevailing concerns regarding early design stage cost estimates, and offer increased transparency and fidelity by offering two preliminary perspectives on mission cost. This work outlines the existing cost model, details assumptions built into the model, and explains what measures have been taken to address the particular challenges of constellation cost estimating. The risk estimation portion of the TAT-C CR module is still in development and will be presented in future work. The cost estimate produced by the CR module is not intended to be an exact mission valuation, but rather a comparative tool to assist in the exploration of the constellation design tradespace. Previous work has noted that estimating the cost of satellite constellations is difficult given that no comprehensive model for constellation cost estimation has yet been developed, and as such, quantitative assessment of multiple spacecraft missions has many remaining areas of uncertainty. 
By incorporating well-established CERs with preliminary approaches to addressing these uncertainties, the CR module offers a more complete approach to constellation costing than has previously been available to mission architects or Earth scientists seeking to leverage the capabilities of multiple spacecraft working in support of a common goal.
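The dual perspective described here, parametric CERs alongside analogy-based scaling, can be sketched in a few lines; the coefficients, reference mission, and complexity factor below are hypothetical illustrations, not TAT-C values:

```python
def parametric_estimate(mass_kg, cers):
    """Average several mass-based CERs of the form cost = a * m**b."""
    estimates = [a * mass_kg ** b for a, b in cers]
    return sum(estimates) / len(estimates)

def analogous_estimate(reference_cost, reference_mass, mass_kg, complexity=1.0):
    """Scale a reference mission's cost by mass ratio and a complexity factor."""
    return reference_cost * (mass_kg / reference_mass) * complexity

# Hypothetical CER coefficients (a, b) and a hypothetical reference mission
cers = [(0.9, 0.93), (1.1, 0.90)]
p = parametric_estimate(150.0, cers)                   # parametric view, $M
a = analogous_estimate(reference_cost=120.0, reference_mass=140.0,
                       mass_kg=150.0, complexity=1.1)  # analogy view, $M
```

Presenting both numbers side by side, rather than blending them into one figure, is what gives the comparative transparency the abstract emphasizes.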
Babela, Robert; Jarcuska, Pavol; Uraz, Vladimir; Krčméry, Vladimír; Jadud, Branislav; Stevlik, Jan; Gould, Ian M
2017-11-01
No previous analyses have attempted to determine optimal therapy for upper respiratory tract infections (URTIs) on the basis of cost-minimization models and the prevalence of antimicrobial resistance among respiratory pathogens in Slovakia. This investigation compares macrolides and cephalosporins for empirical therapy and looks at this new tool as part of a potential antibiotic policy decision-making process. We employed a decision tree model to determine the threshold level of macrolide and cephalosporin resistance among community respiratory pathogens that would make cephalosporins or macrolides cost-minimising. To obtain information on the clinical outcomes and costs of URTIs, a systematic review of the literature was performed. The cost-minimization model of URTI treatment was derived from the review of the literature and published models. We found that the mean cost of empirical treatment with macrolides for a URTI was €93.27 when the percentage of resistant Streptococcus pneumoniae in the community was 0%; at 5%, the mean cost was €96.45; at 10%, €99.63; at 20%, €105.99; and at 30%, €112.36. Our model demonstrated that when the percentage of macrolide-resistant Streptococcus pneumoniae exceeds 13.8%, use of empirical cephalosporins rather than macrolides minimizes the treatment cost of URTIs. Empirical macrolide therapy is less expensive than cephalosporin therapy for URTIs unless macrolide resistance exceeds 13.8% in the community. The results have important antibiotic policy implications, since the presented model can be used as an additional decision-making tool for new guidelines and reimbursement processes by local authorities in an era of continually increasing antibiotic resistance.
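The published cost points are linear in the resistance percentage (€93.27 at 0%, rising by about €0.636 per percentage point), so the break-even threshold follows from simple algebra. A sketch, where the flat cephalosporin cost of roughly €102.05 is a hypothetical value back-calculated from the reported 13.8% threshold rather than a figure from the paper:

```python
def macrolide_cost(resistance_pct, cost_at_zero=93.27, slope=0.636):
    """Mean URTI treatment cost (EUR) as a linear function of the community
    macrolide-resistance percentage; intercept and slope reproduce the
    published points (93.27 at 0%, 96.45 at 5%, ...)."""
    return cost_at_zero + slope * resistance_pct

def breakeven_resistance(ceph_cost, cost_at_zero=93.27, slope=0.636):
    """Resistance level above which empirical cephalosporins minimize cost."""
    return (ceph_cost - cost_at_zero) / slope

# Assumed flat cephalosporin cost of ~EUR 102.05 (hypothetical, chosen so
# the break-even point matches the reported 13.8%)
threshold = breakeven_resistance(102.05)
```

Above the threshold the rising expected cost of macrolide failures outweighs the higher per-course cephalosporin price.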
Economic consequences of high throughput maskless lithography
NASA Astrophysics Data System (ADS)
Hartley, John G.; Govindaraju, Lakshmi
2005-11-01
Many people in the semiconductor industry bemoan the high cost of masks and view mask cost as one of the significant barriers to bringing new chip designs to market. All that is needed is a viable maskless technology and the problem will go away. Numerous sites around the world are working on maskless lithography, but inevitably the question asked is "Wouldn't a one-wafer-per-hour maskless tool make a really good mask writer?" Of course, the answer is yes; the hesitation you hear in the answer isn't based on technology concerns, it's financial. The industry needs maskless lithography because mask costs are too high. Mask costs are too high because mask pattern generators (PGs) are slow and expensive. If mask PGs become much faster, mask costs go down, the maskless market goes away, and the PG supplier is faced with an even smaller tool demand from the mask shops. Technical success becomes financial suicide - or does it? In this paper we present the results of a model that examines some of the consequences of introducing high-throughput maskless pattern generation. Specific features of the model include tool throughput for masks and wafers, market segmentation by node for masks and wafers, and mask cost as an entry barrier to new chip designs. How does the availability of low-cost masks and maskless tools affect the industry's tool makeup, and what is the ultimate potential market for high-throughput maskless pattern generators?
Streamlining the Design Tradespace for Earth Imaging Constellations
NASA Technical Reports Server (NTRS)
Nag, Sreeja; Hughes, Steven P.; Le Moigne, Jacqueline J.
2016-01-01
Satellite constellations and Distributed Spacecraft Mission (DSM) architectures offer unique benefits to Earth observation scientists and unique challenges to cost estimators. The Cost and Risk (CR) module of the Tradespace Analysis Tool for Constellations (TAT-C) being developed by NASA Goddard seeks to address some of these challenges by providing a new approach to cost modeling, which aggregates existing Cost Estimating Relationships (CER) from respected sources, cost estimating best practices, and data from existing and proposed satellite designs. Cost estimation through this tool is approached from two perspectives: parametric cost estimating relationships and analogous cost estimation techniques. The dual approach utilized within the TAT-C CR module is intended to address prevailing concerns regarding early design stage cost estimates, and offer increased transparency and fidelity by offering two preliminary perspectives on mission cost. This work outlines the existing cost model, details assumptions built into the model, and explains what measures have been taken to address the particular challenges of constellation cost estimating. The risk estimation portion of the TAT-C CR module is still in development and will be presented in future work. The cost estimate produced by the CR module is not intended to be an exact mission valuation, but rather a comparative tool to assist in the exploration of the constellation design tradespace. Previous work has noted that estimating the cost of satellite constellations is difficult given that no comprehensive model for constellation cost estimation has yet been developed, and as such, quantitative assessment of multiple spacecraft missions has many remaining areas of uncertainty. 
By incorporating well-established CERs with preliminary approaches to addressing these uncertainties, the CR module offers a more complete approach to constellation costing than has previously been available to mission architects or Earth scientists seeking to leverage the capabilities of multiple spacecraft working in support of a common goal.
NREL Suite of Tools for PV and Storage Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elgqvist, Emma M; Salasovich, James A
Many different factors such as the solar resource, technology costs and incentives, utility cost and consumption, space available, and financial parameters impact the technical and economic potential of a PV project. NREL has developed techno-economic modeling tools that can be used to evaluate PV projects at a site.
Operations and Modeling Analysis
NASA Technical Reports Server (NTRS)
Ebeling, Charles
2005-01-01
The Reliability and Maintainability Analysis Tool (RMAT) provides NASA the capability to estimate reliability and maintainability (R&M) parameters and operational support requirements for proposed space vehicles based upon relationships established from both aircraft and Shuttle R&M data. RMAT has matured both in its underlying database and in its level of sophistication in extrapolating this historical data to satisfy proposed mission requirements, maintenance concepts and policies, and type of vehicle (i.e., ranging from aircraft-like to Shuttle-like). However, a companion analysis tool, the Logistics Cost Model (LCM), has not reached the same level of maturity as RMAT due, in large part, to nonexistent or outdated cost estimating relationships and underlying cost databases, and its almost exclusive dependence on Shuttle operations and logistics cost input parameters. As a result, the full capability of the RMAT/LCM suite of analysis tools to take a conceptual vehicle and derive its operations and support requirements, along with the resulting operating and support costs, has not been realized.
OpCost: an open-source system for estimating costs of stand-level forest operations
Conor K. Bell; Robert F. Keefe; Jeremy S. Fried
2017-01-01
This report describes and documents the OpCost forest operations cost model, a key component of the BioSum analysis framework. OpCost is available in two editions: as a callable module for use with BioSum, and in a stand-alone edition that can be run directly from R. OpCost model logic and assumptions for this open-source tool are explained, references to the...
Constellation Program Life-cycle Cost Analysis Model (LCAM)
NASA Technical Reports Server (NTRS)
Prince, Andy; Rose, Heidi; Wood, James
2008-01-01
The Constellation Program (CxP) is NASA's effort to replace the Space Shuttle, return humans to the moon, and prepare for a human mission to Mars. The major elements of the Constellation Lunar sortie design reference mission architecture are shown. Unlike the Apollo Program of the 1960's, affordability is a major concern of United States policy makers and NASA management. To measure Constellation affordability, a total ownership cost life-cycle parametric cost estimating capability is required. This capability is being developed by the Constellation Systems Engineering and Integration (SE&I) Directorate, and is called the Lifecycle Cost Analysis Model (LCAM). The requirements for LCAM are based on the need to have a parametric estimating capability in order to do top-level program analysis, evaluate design alternatives, and explore options for future systems. By estimating the total cost of ownership within the context of the planned Constellation budget, LCAM can provide Program and NASA management with the cost data necessary to identify the most affordable alternatives. LCAM is also a key component of the Integrated Program Model (IPM), an SE&I developed capability that combines parametric sizing tools with cost, schedule, and risk models to perform program analysis. LCAM is used in the generation of cost estimates for system level trades and analyses. It draws upon the legacy of previous architecture level cost models, such as the Exploration Systems Mission Directorate (ESMD) Architecture Cost Model (ARCOM) developed for Simulation Based Acquisition (SBA), and ATLAS. LCAM is used to support requirements and design trade studies by calculating changes in cost relative to a baseline option cost. Estimated costs are generally low fidelity to accommodate available input data and available cost estimating relationships (CERs). LCAM is capable of interfacing with the Integrated Program Model to provide the cost estimating capability for that suite of tools.
Overview of SDCM - The Spacecraft Design and Cost Model
NASA Technical Reports Server (NTRS)
Ferebee, Melvin J.; Farmer, Jeffery T.; Andersen, Gregory C.; Flamm, Jeffery D.; Badi, Deborah M.
1988-01-01
The Spacecraft Design and Cost Model (SDCM) is a computer-aided design and analysis tool for synthesizing spacecraft configurations, integrating their subsystems, and generating information concerning on-orbit servicing and costs. SDCM uses a bottom-up method in which the cost and performance parameters for subsystem components are first calculated; the model then sums the contributions from individual components in order to obtain an estimate of sizes and costs for each candidate configuration within a selected spacecraft system. An optimum spacecraft configuration can then be selected.
Edenharter, Günther M; Gartner, Daniel; Pförringer, Dominik
2017-06-01
Increasing costs of material resources challenge hospitals to stay profitable. Particularly in anesthesia departments and intensive care units, bronchoscopes are used for various indications. Inefficient management of single- and multiple-use systems can influence the hospitals' material costs substantially. Using mathematical modeling, we developed a strategic decision support tool to determine the optimum mix of disposable and reusable bronchoscopy devices in the setting of an intensive care unit. A mathematical model with the objective to minimize costs in relation to demand constraints for bronchoscopy devices was formulated. The stochastic model decides whether single-use, multi-use, or a strategically chosen mix of both device types should be used. A decision support tool was developed in which parameters for uncertain demand such as mean, standard deviation, and a reliability parameter can be inserted. Furthermore, reprocessing costs per procedure and procurement and maintenance costs for devices can be parameterized. Our experiments show for which demand patterns and reliability measures it is efficient to use only reusable or only disposable devices, and under which circumstances a combination of both device types is beneficial. To determine the optimum mix of single-use and reusable bronchoscopy devices effectively and efficiently, managers can enter their hospital-specific parameters such as demand and prices into the decision support tool. The software can be downloaded at: https://github.com/drdanielgartner/bronchomix/.
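The reusable/disposable trade-off described in this abstract can be sketched as a small Monte Carlo cost minimization over fleet size. This is a generic sketch, not the authors' published model; all parameter values below (demand, prices, uses per device) are illustrative.

```python
import random

def expected_cost(n_reusable, demand_mean, demand_sd, uses_per_device,
                  c_procure, c_reprocess, c_disposable,
                  n_sims=20000, seed=1):
    """Expected periodic cost of a fleet of n_reusable scopes, with
    disposables covering any excess demand (normally distributed demand)."""
    rng = random.Random(seed)
    capacity = n_reusable * uses_per_device
    fixed = n_reusable * c_procure          # amortized procurement/maintenance
    variable = 0.0
    for _ in range(n_sims):
        demand = max(0.0, rng.gauss(demand_mean, demand_sd))
        reused = min(demand, capacity)      # procedures done with reusables
        variable += reused * c_reprocess + (demand - reused) * c_disposable
    return fixed + variable / n_sims

# search the optimum fleet size over a small range (illustrative parameters)
best_n = min(range(0, 11),
             key=lambda n: expected_cost(n, 30, 8, 20, 3000, 40, 250))
```

With these hypothetical prices, a small reusable fleet plus disposables for peak demand beats either pure strategy, which mirrors the "strategically chosen mix" outcome the abstract reports.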
NASA Astrophysics Data System (ADS)
Falinski, K. A.; Oleson, K.; Htun, H.; Kappel, C.; Lecky, J.; Rowe, C.; Selkoe, K.; White, C.
2016-12-01
Faced with anthropogenic stressors and declining coral reef states, managers concerned with restoration and resilience of coral reefs are increasingly recognizing the need to take a ridge-to-reef, ecosystem-based approach. An ecosystem services framing can help managers move towards these goals, helping to illustrate trade-offs and opportunities of management actions in terms of their impacts on society. We describe a research program building a spatial ecosystem services-based decision-support tool, and being applied to guide ridge-to-reef management in a NOAA priority site in West Maui. We use multiple modeling methods to link biophysical processes to ecosystem services and their spatial flows and social values in an integrating platform. Modeled services include water availability, sediment retention, nutrient retention and carbon sequestration on land. A coral reef ecosystem service model is under development to capture the linkages between terrestrial and coastal ecosystem services. Valuation studies are underway to quantify the implications for human well-being. The tool integrates techniques from decision science to facilitate decision making. We use the sediment retention model to illustrate the types of analyses the tool can support. The case study explores the tradeoffs between road rehabilitation costs and sediment export avoided. We couple the sediment and cost models with trade-off analysis to identify optimal distributed solutions that are most cost-effective in reducing erosion, and then use those models to estimate sediment exposure to coral reefs. We find that cooperation between land owners reveals opportunities for maximizing the benefits of fixing roads and minimizes costs. This research forms the building blocks of an ecosystem service decision support tool that we intend to continue to test and apply in other Pacific Island settings.
Selected Tether Applications Cost Model
NASA Technical Reports Server (NTRS)
Keeley, Michael G.
1988-01-01
Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.
A business case evaluation of workplace engineering noise control: a net-cost model.
Lahiri, Supriya; Low, Colleen; Barry, Michael
2011-03-01
This article provides a convenient tool for companies to determine the costs and benefits of alternative interventions to prevent noise-induced hearing loss (NIHL). Contextualized for Singapore and in collaboration with Singapore's Ministry of Manpower, the Net-Cost model evaluates costs of intervention for equipment and labor, avoided costs of productivity losses and medical care, and productivity gains from the employer's economic perspective. To pilot this approach, four case studies are presented, with varying degrees of economic benefits to the employer, including one in which multifactor productivity is the main driver. Although compliance agencies may not require economic analysis of NIHL, given scarce resources in a market-driven economy, this tool enables stakeholders to understand and compare the costs and benefits of NIHL interventions comprehensively and helps in determining risk management strategies.
Launch Vehicle Propulsion Parameter Design Multiple Selection Criteria
NASA Technical Reports Server (NTRS)
Shelton, Joey Dewayne
2004-01-01
The optimization tool described herein addresses and emphasizes the use of computer tools to model a system, focusing on a concept development approach for a liquid hydrogen/liquid oxygen single-stage-to-orbit system and, more particularly, on development of the optimized system using new techniques. This methodology uses new and innovative tools to run Monte Carlo simulations, genetic algorithm solvers, and statistical models in order to optimize a design concept. The concept launch vehicle and propulsion system were modeled and optimized to determine the best design for weight and cost by varying design and technology parameters. Uncertainty levels were applied using Monte Carlo simulations, and the model output was compared to the National Aeronautics and Space Administration Space Shuttle Main Engine. Several key conclusions from the model results are summarized here. First, the Gross Liftoff Weight and Dry Weight were 67% higher for the case minimizing Design, Development, Test and Evaluation cost when compared to the weights determined by the Gross Liftoff Weight minimization case. In turn, the Design, Development, Test and Evaluation cost was 53% higher for the optimized Gross Liftoff Weight case when compared to the cost determined by the case minimizing Design, Development, Test and Evaluation cost. Therefore, a 53% increase in Design, Development, Test and Evaluation cost results in a 67% reduction in Gross Liftoff Weight. Secondly, the tool outputs define the sensitivity of propulsion parameters, technology and cost factors, and how these parameters differ when cost and weight are optimized separately. A key finding was that, for a Space Shuttle Main Engine thrust level, an oxidizer/fuel ratio of 6.6 resulted in the lowest Gross Liftoff Weight rather than the ratio of 5.2 that gives maximum specific impulse, demonstrating the relationships between specific impulse, engine weight, tank volume, and tank weight.
Lastly, the optimum chamber pressure for Gross Liftoff Weight minimization was 2713 pounds per square inch, as compared to 3162 for the Design, Development, Test and Evaluation cost optimization case; this range brackets the approximately 3000 pounds per square inch of the Space Shuttle Main Engine.
Software Development Cost Estimation Executive Summary
NASA Technical Reports Server (NTRS)
Hihn, Jairus M.; Menzies, Tim
2006-01-01
Identify simple fully validated cost models that provide estimation uncertainty with cost estimate. Based on COCOMO variable set. Use machine learning techniques to determine: a) Minimum number of cost drivers required for NASA domain based cost models; b) Minimum number of data records required and c) Estimation Uncertainty. Build a repository of software cost estimation information. Coordinating tool development and data collection with: a) Tasks funded by PA&E Cost Analysis; b) IV&V Effort Estimation Task and c) NASA SEPG activities.
Estimating Software-Development Costs With Greater Accuracy
NASA Technical Reports Server (NTRS)
Baker, Dan; Hihn, Jairus; Lum, Karen
2008-01-01
COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their project. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development- effort-estimation techniques are used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performances of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
Use of Cost-Utility Decision Models in Business Education.
ERIC Educational Resources Information Center
Lewis, Darrell R.
1989-01-01
Explains how cost-utility analysis can be applied to the selection of curriculum and instructional methods. Describes the use of multiattribute utility models of decision making as a tool for more informed judgment in educational administration. (SK)
Mechanics and energetics in tool manufacture and use: a synthetic approach.
Wang, Liyu; Brodbeck, Luzius; Iida, Fumiya
2014-11-06
Tool manufacture and use are observed not only in humans but also in other animals such as mammals, birds and insects. Manufactured tools are used for biomechanical functions such as effective control of fluids and small solid objects and extension of reaching. These tools are passive and used with gravity and the animal users' own energy. From the perspective of evolutionary biology, manufactured tools are extended phenotypes of the genes of the animal and exhibit phenotypic plasticity. This incurs energetic cost of manufacture as compared to the case with a fixed tool. This paper studies mechanics and energetics aspects of tool manufacture and use in non-human beings. Firstly, it investigates possible mechanical mechanisms of the use of passive manufactured tools. Secondly, it formulates the energetic cost of manufacture and analyses when phenotypic plasticity benefits an animal tool maker and user. We take a synthetic approach and use a controlled physical model, i.e. a robot arm. The robot is capable of additively manufacturing scoop and gripper structures from thermoplastic adhesives to pick and place fluid and solid objects, mimicking primates and birds manufacturing tools for a similar function. We evaluate the effectiveness of tool use in pick-and-place and explain the mechanism for gripper tools picking up solid objects with a solid-mechanics model. We propose a way to formulate the energetic cost of tool manufacture that includes modes of addition and reshaping, and use it to analyse the case of scoop tools. Experiment results show that with a single motor trajectory, the robot was able to effectively pick and place water, rice grains, a pebble and a plastic box with a scoop tool or gripper tools that were manufactured by itself. They also show that by changing the dimension of scoop tools, the energetic cost of tool manufacture and use could be reduced. The work should also be interesting for engineers to design adaptive machines. 
© 2014 The Author(s) Published by the Royal Society. All rights reserved.
Goldfield, Norbert
2010-01-01
Policymakers are searching for ways to control health care costs and improve quality. Diagnosis-related groups (DRGs) are by far the most important cost control and quality improvement tool that governments and private payers have implemented. This article reviews why DRGs have had this singular success both in the hospital sector and, over the past 10 years, in ambulatory and managed care settings. Last, the author reviews current trends in the development and implementation of tools that have the key ingredients of DRG success: categorical clinical model, separation of the clinical model from payment weights, separate payment adjustments for nonclinical factors, and outlier payments. Virtually all current tools used to manage health care costs and improve quality do not have these characteristics. This failure explains a key reason for the failure, for example, of the Medicare Advantage program to control health care costs. This article concludes with a discussion of future developments for DRG-type models outside the hospital sector.
Comparative analysis for various redox flow batteries chemistries using a cost performance model
NASA Astrophysics Data System (ADS)
Crawford, Alasdair; Viswanathan, Vilayanur; Stephenson, David; Wang, Wei; Thomsen, Edwin; Reed, David; Li, Bin; Balducci, Patrick; Kintner-Meyer, Michael; Sprenkle, Vincent
2015-10-01
The total energy storage system cost is determined by means of a robust performance-based cost model for multiple flow battery chemistries. Systems aspects such as shunt current losses, pumping losses, and various flow patterns through electrodes are accounted for. The system cost minimizing objective function determines stack design by optimizing the state of charge operating range, along with current density and current-normalized flow. The model cost estimates are validated using 2-kW stack performance data for the same size electrodes and operating conditions. Using our validated tool, it has been demonstrated that an optimized all-vanadium system has an estimated system cost of < US$350 kWh⁻¹ for a 4-h application. With an anticipated decrease in component costs facilitated by economies of scale from larger production volumes, coupled with performance improvements enabled by technology development, the system cost is expected to decrease to US$160 kWh⁻¹ for a 4-h application, and to US$100 kWh⁻¹ for a 10-h application. This tool has been shared with the redox flow battery community to enable cost estimation using their stack data and guide future direction.
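The duration effect reported above (cost per kWh falling from the 4-h to the 10-h application) follows from splitting flow battery system cost into power-dependent parts (stack, power electronics) and energy-dependent parts (electrolyte, tanks). A minimal sketch of that decomposition, with assumed component prices that are not taken from the paper:

```python
def cost_per_kwh(power_cost_per_kw, energy_cost_per_kwh, hours):
    """$ per kWh of usable capacity: power-related cost is amortized
    over discharge duration; energy-related cost scales with capacity."""
    return power_cost_per_kw / hours + energy_cost_per_kwh

# illustrative component prices (assumptions, not the paper's inputs)
c4 = cost_per_kwh(400.0, 60.0, 4)    # 4-hour system, $/kWh
c10 = cost_per_kwh(400.0, 60.0, 10)  # 10-hour system, $/kWh
```

Longer-duration systems dilute the fixed power-block cost over more stored energy, which is why the per-kWh figure drops as the application lengthens from 4 h to 10 h.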
Economic Modeling as a Component of Academic Strategic Planning.
ERIC Educational Resources Information Center
MacKinnon, Joyce; Sothmann, Mark; Johnson, James
2001-01-01
Computer-based economic modeling was used to enable a school of allied health to define outcomes, identify associated costs, develop cost and revenue models, and create a financial planning system. As a strategic planning tool, it assisted realistic budgeting and improved efficiency and effectiveness. (Contains 18 references.) (SK)
Satellite broadcasting system study
NASA Technical Reports Server (NTRS)
1972-01-01
The study to develop a system model and computer program representative of broadcasting satellite systems employing community-type receiving terminals is reported. The program provides a user-oriented tool for evaluating performance/cost tradeoffs, synthesizing minimum cost systems for a given set of system requirements, and performing sensitivity analyses to identify critical parameters and technology. The performance/costing philosophy and what is meant by a minimum cost system is shown graphically. Topics discussed include: main line control program, ground segment model, space segment model, cost models and launch vehicle selection. Several examples of minimum cost systems resulting from the computer program are presented. A listing of the computer program is also included.
A review of the solar array manufacturing industry costing standards
NASA Technical Reports Server (NTRS)
1977-01-01
The solar array manufacturing industry costing standards model is designed to compare the cost of producing solar arrays using alternative manufacturing processes. Constructive criticism of the methodology used is intended to enhance its implementation as a practical design tool. Three main elements of the procedure include workbook format and presentation, theoretical model validity and standard financial parameters.
Cost-Effective Cloud Computing: A Case Study Using the Comparative Genomics Tool, Roundup
Kudtarkar, Parul; DeLuca, Todd F.; Fusaro, Vincent A.; Tonellato, Peter J.; Wall, Dennis P.
2010-01-01
Background Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource—Roundup—using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Methods Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon’s Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. Results We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon’s computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. 
Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure. PMID:21258651
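The runtime-ordering idea above, submitting jobs in a deliberate order rather than randomly so a fixed-size cluster stays busy, can be illustrated with a longest-processing-time-first schedule. This is a generic scheduling sketch, not the authors' actual Elastic MapReduce deployment; the runtimes below are made up.

```python
import heapq

def makespan(job_runtimes, n_workers):
    """Greedy list scheduling: each job goes to the currently
    least-loaded worker; returns the busiest worker's finish time."""
    loads = [0.0] * n_workers
    heapq.heapify(loads)
    for t in job_runtimes:
        heapq.heappush(loads, heapq.heappop(loads) + t)
    return max(loads)

jobs = [7, 5, 4, 3, 1]                        # predicted runtimes (hours)
lpt = makespan(sorted(jobs, reverse=True), 2)  # longest-first ordering
naive = makespan([1, 3, 4, 5, 7], 2)           # shortest-first ordering
```

Sorting by predicted runtime before submission shortens the schedule on the same two workers, which is the mechanism by which a runtime-prediction model reduces cluster idle time and hence cost.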
Estill, Janne; Salazar-Vizcaya, Luisa; Blaser, Nello; Egger, Matthias; Keiser, Olivia
2015-01-01
The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line, clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs). We calculated incremental cost-effectiveness ratios (ICER). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-US$5488/DALY averted and VL monitoring US$951-US$5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except for the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. 
Our Excel tool is useful for determining optimal monitoring strategies for specific settings, with specific sex- and age-distributions and unit costs.
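The incremental cost-effectiveness ratios quoted above follow the standard definition: extra lifetime cost divided by DALYs averted relative to the comparator strategy. A minimal sketch of that calculation; the cohort totals below are placeholders, not the study's inputs.

```python
def icer(cost_new, cost_comparator, dalys_new, dalys_comparator):
    """Incremental cost-effectiveness ratio in $ per DALY averted."""
    dalys_averted = dalys_comparator - dalys_new  # fewer DALYs is better
    return (cost_new - cost_comparator) / dalys_averted

# placeholder per-patient lifetime totals for two monitoring strategies
r = icer(cost_new=5200.0, cost_comparator=3000.0,
         dalys_new=8.0, dalys_comparator=10.0)
```

A strategy sits on the cost-effectiveness frontier only if no other strategy averts more DALYs at a lower ICER, which is why the frontier position of targeted VL monitoring in the study depends on the 1st-line/2nd-line cost gap.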
Linear versus quadratic portfolio optimization model with transaction cost
NASA Astrophysics Data System (ADS)
Razak, Norhidayah Bt Ab; Kamil, Karmila Hanim; Elias, Siti Masitah
2014-06-01
Optimization models have become one of the decision-making tools in investment. Hence, it is always a big challenge for investors to select the best model to fulfill their goals in investment with respect to risk and return. In this paper we aim to discuss and compare the portfolio allocation and performance generated by quadratic and linear portfolio optimization models, namely the Markowitz and Maximin models respectively. The application of these models has been proven to be significant and popular among others. However, transaction cost has been debated as one of the important aspects that should be considered in portfolio reallocation, as portfolio return can be significantly reduced when transaction cost is taken into consideration. Therefore, recognizing the importance of considering transaction cost when calculating portfolio return, we formulate this paper using data from Shariah-compliant securities listed on Bursa Malaysia. It is expected that results from this paper will effectively justify the advantage of one model over another and shed some light in the quest to find the best decision-making tool in investment for individual investors.
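The transaction-cost adjustment this abstract argues for can be written down directly as gross return minus proportional costs on turnover. This is a generic sketch of that adjustment, not the paper's specific formulation; the weights, returns, and cost rate below are illustrative.

```python
def net_return(w_new, w_old, asset_returns, tc_rate):
    """Portfolio return net of proportional transaction costs:
    gross weighted return minus tc_rate times one-way turnover."""
    gross = sum(w * r for w, r in zip(w_new, asset_returns))
    turnover = sum(abs(a - b) for a, b in zip(w_new, w_old))
    return gross - tc_rate * turnover

# two assets, 0.5% proportional cost rate (illustrative)
g = net_return([0.6, 0.4], [0.6, 0.4], [0.10, 0.05], 0.005)  # no rebalancing
n = net_return([0.8, 0.2], [0.6, 0.4], [0.10, 0.05], 0.005)  # rebalanced
```

An optimizer that ignores the turnover term will overstate the benefit of reallocating, which is the effect the paper quantifies for the Markowitz and Maximin models.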
Process-Improvement Cost Model for the Emergency Department.
Dyas, Sheila R; Greenfield, Eric; Messimer, Sherri; Thotakura, Swati; Gholston, Sampson; Doughty, Tracy; Hays, Mary; Ivey, Richard; Spalding, Joseph; Phillips, Robin
2015-01-01
The objective of this report is to present a simplified, activity-based costing approach for hospital emergency departments (EDs) to use with Lean Six Sigma cost-benefit analyses. The cost model complexity is reduced by removing diagnostic and condition-specific costs, thereby revealing the underlying process activities' cost inefficiencies. Examples are provided for evaluating the cost savings from reducing discharge delays and the cost impact of keeping patients in the ED (boarding) after the decision to admit has been made. The process-improvement cost model provides a needed tool in selecting, prioritizing, and validating Lean process-improvement projects in the ED and other areas of patient care that involve multiple dissimilar diagnoses.
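In the simplified activity-based spirit the report describes, the cost of a boarding delay can be monetized from process-activity rates alone, with diagnostic and condition-specific costs excluded. The hourly rates below are hypothetical:

```python
def boarding_cost(bed_hour_rate, staff_hour_rate, boarding_hours):
    """Process cost of ED boarding: resource rates x hours occupied,
    with diagnosis-specific costs deliberately excluded."""
    return (bed_hour_rate + staff_hour_rate) * boarding_hours

# Hypothetical rates: ED bed $40/h, nursing coverage $35/h, 6 h of boarding
print(boarding_cost(40.0, 35.0, 6.0))  # 450.0
```

The savings from a Lean project would then be estimated as the rate times the hours of boarding or discharge delay averted.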
Chung, Philip; Heller, J Alex; Etemadi, Mozziyar; Ottoson, Paige E; Liu, Jonathan A; Rand, Larry; Roy, Shuvo
2014-06-27
Biologically inert elastomers such as silicone are favorable materials for medical device fabrication, but forming and curing these elastomers using traditional liquid injection molding processes can be an expensive process due to tooling and equipment costs. As a result, it has traditionally been impractical to use liquid injection molding for low-cost, rapid prototyping applications. We have devised a method for rapid and low-cost production of liquid elastomer injection molded devices that utilizes fused deposition modeling 3D printers for mold design and a modified desiccator as an injection system. Low costs and rapid turnaround time in this technique lower the barrier to iteratively designing and prototyping complex elastomer devices. Furthermore, CAD models developed in this process can be later adapted for metal mold tooling design, enabling an easy transition to a traditional injection molding process. We have used this technique to manufacture intravaginal probes involving complex geometries, as well as overmolding over metal parts, using tools commonly available within an academic research laboratory. However, this technique can be easily adapted to create liquid injection molded devices for many other applications.
Microgrid Analysis Tools Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jimenez, Antonio; Haase, Scott G; Mathur, Shivani
2018-03-05
The over-arching goal of the Alaska Microgrid Partnership is to reduce the total imported fuel used to secure all energy services in Alaska's remote microgrids by at least 50%, without increasing system life-cycle costs, while also improving overall system reliability, security, and resilience. One goal of the partnership is to investigate whether a combination of energy efficiency and high-contribution (renewable energy) power systems can reduce total imported energy usage by 50% while reducing life-cycle costs and improving reliability and resiliency. This presentation provides an overview of four renewable energy optimization tools: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) Tool, and the Hybrid Optimization Model for Electric Renewables (HOMER). Information is from the respective tool websites, tool developers, and author experience.
Graham, Robert J; McManus, Michael L; Rodday, Angie Mae; Weidner, Ruth Ann; Parsons, Susan K
2018-05-01
To describe program design, costs, and savings implications of a critical care-based care coordination model for medically complex children with chronic respiratory failure. All program activities and resultant clinical outcomes were tracked over 4 years using an adapted version of the Care Coordination Measurement Tool. Patient characteristics, program activity, and acute care resource utilization were prospectively documented in the adapted version of the Care Coordination Measurement Tool and retrospectively cross-validated with hospital billing data. Impact on total costs of care was then estimated based on program outcomes and nationally representative administrative data. Tertiary children's hospital. Critical Care, Anesthesia, Perioperative Extension and Home Ventilation Program enrollees. None. The program provided care for 346 patients and families over the study period. Median age at enrollment was 6 years with more than half deriving secondary respiratory failure from a primary neuromuscular disease. There were 11,960 encounters over the study period, including 1,202 home visits, 673 clinic visits, and 4,970 telephone or telemedicine encounters. Half (n = 5,853) of all encounters involved a physician and 45% included at least one care coordination activity. Overall, we estimated that program interventions were responsible for averting 556 emergency department visits and 107 hospitalizations. Conservative monetization of these alone accounted for annual savings of $1.2-2 million or $407/pt/mo net of program costs. Innovative models, such as extension of critical care services, for high-risk, high-cost patients can result in immediate cost savings. Evaluation of financial implications of comprehensive care for high-risk patients is necessary to complement clinical and patient-centered outcomes for alternative care models. 
When year-to-year cost variability is high and cost persistence is low, these savings can be estimated from documentation within care coordination management tools. Means of financial sustainability, scalability, and equal access of such care models need to be established.
NASA Astrophysics Data System (ADS)
Wood, Brian M.; Wood, Zoë J.
2006-01-01
We present a visualization and computation tool for modeling the caloric cost of pedestrian travel across three dimensional terrains. This tool is being used in ongoing archaeological research that analyzes how costs of locomotion affect the spatial distribution of trails and artifacts across archaeological landscapes. Throughout human history, traveling by foot has been the most common form of transportation, and therefore analyses of pedestrian travel costs are important for understanding prehistoric patterns of resource acquisition, migration, trade, and political interaction. Traditionally, archaeologists have measured geographic proximity based on "as the crow flies" distance. We propose new methods for terrain visualization and analysis based on measuring paths of least caloric expense, calculated using well established metabolic equations. Our approach provides a human centered metric of geographic closeness, and overcomes significant limitations of available Geographic Information System (GIS) software. We demonstrate such path computations and visualizations applied to archaeological research questions. Our system includes tools to visualize: energetic cost surfaces, comparisons of the elevation profiles of shortest paths versus least cost paths, and the display of paths of least caloric effort on Digital Elevation Models (DEMs). These analysis tools can be applied to calculate and visualize 1) likely locations of prehistoric trails and 2) expected ratios of raw material types to be recovered at archaeological sites.
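A least-caloric-cost path of the kind described can be sketched as Dijkstra's algorithm over a DEM grid. The step-cost function below is a simplified slope penalty standing in for the well-established metabolic equations the authors use (e.g., Pandolf-type models); the DEM and parameters are hypothetical:

```python
import heapq

def caloric_cost(dz, dist, base=1.0, slope_penalty=5.0):
    """Simplified energetic cost of one step: distance plus a slope penalty.
    A placeholder for the metabolic equations used in the paper."""
    slope = dz / dist
    return dist * (base + slope_penalty * abs(slope))

def least_cost_path(dem, start, goal, cell=30.0):
    """Dijkstra over a DEM grid; returns the minimum total cost start -> goal."""
    rows, cols = len(dem), len(dem[0])
    best = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > best.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + caloric_cost(dem[nr][nc] - dem[r][c], cell)
                if nd < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# Toy 3x3 DEM (elevations in meters) with a costly ridge in the middle column
dem = [[0, 10, 0],
       [0, 50, 0],
       [0, 10, 0]]
print(least_cost_path(dem, (0, 0), (2, 2)))
```

The least-cost route skirts the high-cost ridge rather than climbing over it, which is exactly the behavior that distinguishes caloric paths from "as the crow flies" distance.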
A new methodology for modeling of direct landslide costs for transportation infrastructures
NASA Astrophysics Data System (ADS)
Klose, Martin; Terhorst, Birgit
2014-05-01
The world's transportation infrastructure is at risk of landslides in many areas across the globe. Safety and affordability of traffic-route operation are the two main criteria for transportation planning in landslide-prone areas. The right balancing of these often conflicting priorities requires, among other things, profound knowledge of the direct costs of landslide damage. These costs include capital investments for landslide repair and mitigation as well as operational expenditures for first response and maintenance works. This contribution presents a new methodology for ex post assessment of direct landslide costs to transportation infrastructure. The methodology includes tools to compile, model, and extrapolate landslide losses on different spatial scales over time. A landslide susceptibility model enables regional cost extrapolation by means of a cost figure obtained from local cost compilation in representative case study areas. On the local level, cost survey is closely linked with cost modeling, a toolset for cost estimation based on landslide databases. Cost modeling uses Landslide Disaster Management Process Models (LDMMs) and cost modules to simulate and monetize cost factors for certain types of landslide damage. The landslide susceptibility model provides a regional exposure index and updates the cost figure to a cost index describing the cost per km of traffic route at risk of landslides. Both indexes enable the regionalization of local landslide losses. The methodology is applied and tested in a cost assessment for highways in the Lower Saxon Uplands, NW Germany, for the period 1980 to 2010. The basis of this research is a regional subset of a landslide database for the Federal Republic of Germany. In the 7,000 km² Lower Saxon Uplands, 77 km of highway are located in potential landslide hazard areas. Annual average costs of €52k per km of highway at risk of landslides are identified as the cost index for a local case study area in this region.
The cost extrapolation for the Lower Saxon Uplands results in annual average highway costs of €4.02 million. This test application, together with a validation of selected modeling tools, verifies the functionality of the methodology.
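The regionalization step reduces to multiplying the local cost index by the regional length of route at risk. Using the figures given in the abstract (≈ 52k per km-year, 77 km of highway at risk) reproduces the reported order of magnitude:

```python
def regional_landslide_cost(cost_index_per_km, km_at_risk):
    """Extrapolate a local cost index (annual cost per km of road at risk
    of landslides) to a region's total exposed road length."""
    return cost_index_per_km * km_at_risk

# Figures from the abstract: ~52,000 per km-year and 77 km of highway at risk
print(regional_landslide_cost(52_000, 77))  # 4004000, i.e. ≈ 4.0 million/year
```

The small gap to the reported 4.02 million presumably reflects the rounding of the cost index to 52k.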
Computational algorithm to evaluate product disassembly cost index
NASA Astrophysics Data System (ADS)
Zeid, Ibrahim; Gupta, Surendra M.
2002-02-01
Environmentally conscious manufacturing is an important paradigm in today's engineering practice, and disassembly is a crucial factor in implementing it. Disassembly allows the reuse and recycling of parts and products that have reached the end of their life cycles. Many questions must be answered before a disassembly decision can be reached, and the most important is economic: the cost of disassembling a product must always be weighed against the cost of scrapping it. This paper develops a computational tool that allows decision-makers to calculate the disassembly cost of a product and makes it simple to perform 'what if' scenarios fairly quickly. The tool is Web based and has two main parts: the front end is a Web page that runs on the client side in a Web browser, while the back end is a disassembly engine (servlet) containing disassembly knowledge and costing algorithms that runs on the server side. The tool is thus based on the client/server model used pervasively throughout the World Wide Web. An example is used to demonstrate the implementation and capabilities of the tool.
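A minimal sketch of the economic comparison that drives a disassembly decision follows: labor-time costing of the disassembly operations versus the value of simply scrapping the product. This is a hedged illustration, not the servlet's actual algorithm, and all figures are hypothetical:

```python
def disassembly_cost_index(op_times_min, labor_rate_per_hr,
                           recovered_value, scrap_value):
    """Net benefit of disassembly versus scrapping.

    op_times_min: minutes per disassembly operation.
    A positive index favors disassembly; negative favors scrapping.
    """
    labor_cost = sum(op_times_min) / 60.0 * labor_rate_per_hr
    return (recovered_value - labor_cost) - scrap_value

# Hypothetical product: three operations (10, 5, 15 min) at $45/h labor,
# $60 of recoverable parts versus $20 scrap value
print(disassembly_cost_index([10, 5, 15], 45.0, 60.0, 20.0))  # 17.5
```

A "what if" scenario then amounts to re-running the function with different operation times, labor rates, or recovered values.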
Mars Rover/Sample Return - Phase A cost estimation
NASA Technical Reports Server (NTRS)
Stancati, Michael L.; Spadoni, Daniel J.
1990-01-01
This paper presents a preliminary cost estimate for the design and development of the Mars Rover/Sample Return (MRSR) mission. The estimate was generated using a modeling tool specifically built to provide useful cost estimates from design parameters of the type and fidelity usually available during early phases of mission design. The model approach and its application to MRSR are described.
The EPSA Project Finance Mapping Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hadley, Stanton W.; Chinthavali, Supriya
The Energy Policy and Systems Analysis Office of DOE has requested a tool to compare the impact of various Federal policies on the financial viability of generation resources across the country. Policy options could include production tax credits, investment tax credits, solar renewable energy credits, tax abatement, accelerated depreciation, tax-free loans, and others. The tool would model the finances of projects in all fifty states, and possibly other geographic units like utility service territories and RTO/ISO territories. The tool would consider the facility's cost, financing, production, and revenues under different capital and market structures to determine quantities like levelized cost of energy, return on equity, and cost impacts on others (e.g., load-serving entities, society). The tool would compare the cost and value of the facility to the local regional alternatives to determine how and where policy levers may provide sufficient incremental value to motivate investment. The results will be displayed through a purpose-built visualization that maps geographic variations and shows associated figures and tables.
Watershed Management Optimization Support Tool v3
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context that is, accou...
FIA BioSum: a tool to evaluate financial costs, opportunities and effectiveness of fuel treatments.
Jeremy Fried; Glenn Christensen
2004-01-01
FIA BioSum, a tool developed by the USDA Forest Service's Forest Inventory and Analysis (FIA) Program, generates reliable cost estimates, identifies opportunities, and evaluates the effectiveness of fuel treatments in forested landscapes. BioSum is an analytic framework that integrates a suite of widely used computer models with a foundation of attribute-rich,...
NASA Astrophysics Data System (ADS)
Juszczyk, Michał
2018-04-01
This paper reports some results of studies on the use of artificial intelligence tools for cost estimation based on building information models. The problem of macro-level cost estimation from building information models, supported by ensembles of artificial neural networks, is concisely discussed. In the course of the research, a regression model was built for cost estimation of buildings' floor structural frames as higher-level elements. Building information models serve as a repository of the data used for cost estimation; the core of the model is an ensemble of neural networks. The developed model allows the prediction of cost estimates with satisfactory accuracy.
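The ensemble idea can be sketched with bagged least-squares regressors standing in for the trained neural networks: each member is fit on a bootstrap resample and predictions are averaged. The toy data and cost drivers below are hypothetical, not the paper's BIM-derived features:

```python
import numpy as np

def fit_linear(X, y):
    """Least-squares fit; a stand-in for one trained network in the ensemble."""
    Xb = np.c_[np.ones(len(X)), X]          # prepend an intercept column
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def ensemble_predict(models, x):
    """Average the members' predictions, as ensemble estimators do."""
    xb = np.r_[1.0, x]
    return float(np.mean([xb @ w for w in models]))

rng = np.random.default_rng(0)
# Toy data: cost driven by two design parameters (hypothetical drivers)
X = rng.uniform(0.0, 1.0, (200, 2))
y = 50 + 120 * X[:, 0] + 30 * X[:, 1] + rng.normal(0.0, 1.0, 200)

# Bagging: fit each member on a bootstrap resample of the training set
models = []
for _ in range(10):
    idx = rng.integers(0, len(X), len(X))
    models.append(fit_linear(X[idx], y[idx]))

print(ensemble_predict(models, np.array([0.5, 0.5])))  # ≈ 125
```

Averaging over members reduces the variance of the estimate, which is the practical reason for using an ensemble rather than a single network.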
Automation life-cycle cost model
NASA Technical Reports Server (NTRS)
Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne
1992-01-01
The problem domain addressed by this contractual effort can be summarized as follows: automation and robotics (A&R) technologies appear to be viable alternatives to current manual operations; life-cycle cost models are typically viewed with suspicion due to implicit assumptions and little associated documentation; and uncertainty is a reality for increasingly complex problems, yet few models explicitly account for its effect on the solution space. The objectives for this effort range from the near term (1-2 years) to the far term (3-5 years). In the near term, the envisioned capabilities of the modeling tool are annotated, and a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: assess desirable capabilities (structured into near- and far-term); identify useful existing models and data; identify parameters for utility analysis; define the tool framework; encode a scenario thread for model validation; and provide a transition path for tool development. This report contains all relevant technical progress made on this contractual effort.
NASA/Air Force Cost Model: NAFCOM
NASA Technical Reports Server (NTRS)
Winn, Sharon D.; Hamcher, John W. (Technical Monitor)
2002-01-01
The NASA/Air Force Cost Model (NAFCOM) is a parametric estimating tool for space hardware. It is based on historical NASA and Air Force space projects and is primarily used in the very early phases of a development project. NAFCOM can be used at the subsystem or component levels.
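Parametric tools like NAFCOM rest on cost estimating relationships (CERs), frequently weight-based power laws calibrated against historical projects. A generic sketch follows; the coefficients are illustrative only, not NAFCOM's calibrated values:

```python
def parametric_cost(mass_kg, a=1000.0, b=0.8):
    """Weight-based cost estimating relationship: cost = a * mass^b.

    a and b are hypothetical coefficients; in a real tool they are
    calibrated per subsystem against historical project data.
    b < 1 encodes an economy of scale: cost grows slower than mass.
    """
    return a * mass_kg ** b

print(parametric_cost(500.0))   # estimated cost for a 500 kg subsystem
```

This form explains why such tools are usable in very early design phases: mass is often the only parameter available with any fidelity.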
Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckers, Koenraad J; McCabe, Kevin
This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal energy for Production of Heat and electricity ('IR') Economically Simulated). GEOPHIRES combines engineering models of the reservoir, wellbores, and surface plant facilities of a geothermal plant with an economic model to estimate the capital and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy. The available end-use options are electricity, direct-use heat, and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to import temperature data (e.g., measured or from a stand-alone reservoir simulator), updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. In this paper, we provide an overview of all the updates and two case studies to illustrate the tool's new capabilities.
Long, Keith R.; Singer, Donald A.
2001-01-01
Determining the economic viability of mineral deposits of various sizes and grades is a critical task in all phases of mineral supply, from land-use management to mine development. This study evaluates two simple tools for estimating the economic viability of porphyry copper deposits mined by open-pit, heap-leach methods when only limited information on these deposits is available. The two methods are useful for evaluating deposits that either (1) are undiscovered deposits predicted by a mineral resource assessment, or (2) have been discovered but for which little data has been collected or released. The first tool uses ordinary least-squares regression analysis of cost and operating data from selected deposits to estimate a predictive relationship between mining rate, itself estimated from deposit size, and capital and operating costs. The second method uses cost models developed by the U.S. Bureau of Mines (Camm, 1991), updated using appropriate cost indices. We find that the cost model method works best for estimating capital costs and the empirical model works best for estimating operating costs for mines to be developed in the United States.
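The first tool's empirical approach, an ordinary least-squares fit of cost against a size-related driver, can be sketched as a log-log regression (power laws become linear in log space). The data below are synthetic, not the study's deposit records:

```python
import numpy as np

def fit_cost_model(driver, cost):
    """Fit cost = a * driver^b by ordinary least squares in log-log space."""
    b, log_a = np.polyfit(np.log(driver), np.log(cost), 1)
    return float(np.exp(log_a)), float(b)

# Synthetic deposits that follow cost = 2 * t^0.7 exactly, so the
# fit should recover a ≈ 2.0 and b ≈ 0.7
t = np.array([1e3, 1e4, 1e5, 1e6])
a, b = fit_cost_model(t, 2.0 * t**0.7)
print(round(a, 3), round(b, 3))
```

With real cost data the residual scatter in log space also gives a quick sense of how tight such a predictive relationship is.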
Watershed Management Optimization Support Tool (WMOST) v3: User Guide
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context that is, accou...
Watershed Management Optimization Support Tool (WMOST) v3: Theoretical Documentation
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context, accounting fo...
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management...
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management.
DOT National Transportation Integrated Search
1998-09-16
This paper and presentation discuss some of the benefits of integrating travel demand models and desktop GIS (ArcInfo and ArcView for PCs) as a cost-effective and staff-saving tool, as well as specific improvements to transportation planning m...
Activity-based costing: a practical model for cost calculation in radiotherapy.
Lievens, Yolande; van den Bogaert, Walter; Kesteloot, Katrien
2003-10-01
The activity-based costing method was used to compute radiotherapy costs. This report describes the model developed, the calculated costs, and possible applications for the Leuven radiotherapy department. Activity-based costing is an advanced cost calculation technique that allocates resource costs to products based on activity consumption. In the Leuven model, a complex allocation principle with a large diversity of cost drivers was avoided by introducing an extra allocation step between activity groups and activities. A straightforward principle of time consumption, weighed by some factors of treatment complexity, was used. The model was developed in an iterative way, progressively defining the constituting components (costs, activities, products, and cost drivers). Radiotherapy costs are predominantly determined by personnel and equipment cost. Treatment-related activities consume the greatest proportion of the resource costs, with treatment delivery the most important component. This translates into products that have a prolonged total or daily treatment time being the most costly. The model was also used to illustrate the impact of changes in resource costs and in practice patterns. The presented activity-based costing model is a practical tool to evaluate the actual cost structure of a radiotherapy department and to evaluate possible resource or practice changes.
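The time-weighted allocation principle described above can be sketched as minutes of activity times a resource cost rate, scaled by a complexity factor. This is a hedged illustration of the Leuven allocation idea, and all rates and times are hypothetical:

```python
def treatment_cost(activities, complexity_weight=1.0):
    """Allocate resource costs to a treatment by time consumption,
    weighted by a treatment-complexity factor.

    activities: list of (minutes, cost_rate_per_minute) pairs.
    """
    return complexity_weight * sum(mins * rate for mins, rate in activities)

# Hypothetical course: 25 fractions x 12 min of treatment delivery at
# 8.0/min, plus 60 min of planning at 6.0/min, with a 1.2 complexity weight
print(treatment_cost([(25 * 12, 8.0), (60, 6.0)], complexity_weight=1.2))
```

The model's conclusion that long total or daily treatment times dominate cost falls directly out of this form: minutes are the primary cost driver.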
INTEGRATION OF COST MODELS AND PROCESS SIMULATION TOOLS FOR OPTIMUM COMPOSITE MANUFACTURING PROCESS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pack, Seongchan; Wilson, Daniel; Aitharaju, Venkat
Manufacturing cost of resin transfer molded composite parts is significantly influenced by the cycle time, which is strongly related to the time for both filling and curing of the resin in the mold. The time for filling can be optimized by various injection strategies and by suitably reducing the length of the resin flow distance during injection. The curing time can be reduced by the use of faster-curing resins, but these require high-pressure injection equipment, which is capital intensive. Predictive manufacturing simulation tools recently developed for composite materials can provide various scenarios of processing conditions virtually, well in advance of manufacturing the parts. In the present study, we integrate cost models with process simulation tools to study the influence of parameters such as injection strategy, injection pressure, compression control to minimize high-pressure injection, resin curing rate, and demold time on the manufacturing cost as affected by the annual part volume. A representative automotive component was selected for the study and the results are presented in this paper.
NASA Astrophysics Data System (ADS)
Reynerson, Charles Martin
This research has been performed to create concept design and economic feasibility data for space business parks. A space business park is a commercially run multi-use space station facility designed for use by a wide variety of customers. Both space hardware and crew are considered as revenue producing payloads. Examples of commercial markets may include biological and materials research, processing, and production, space tourism habitats, and satellite maintenance and resupply depots. This research develops a design methodology and an analytical tool to create feasible preliminary design information for space business parks. The design tool is validated against a number of real facility designs. Appropriate model variables are adjusted to ensure that statistical approximations are valid for subsequent analyses. The tool is used to analyze the effect of various payload requirements on the size, weight and power of the facility. The approach for the analytical tool was to input potential payloads as simple requirements, such as volume, weight, power, crew size, and endurance. In creating the theory, basic principles are used and combined with parametric estimation of data when necessary. Key system parameters are identified for overall system design. Typical ranges for these key parameters are identified based on real human spaceflight systems. To connect the economics to design, a life-cycle cost model is created based upon facility mass. This rough cost model estimates potential return on investments, initial investment requirements and number of years to return on the initial investment. Example cases are analyzed for both performance and cost driven requirements for space hotels, microgravity processing facilities, and multi-use facilities. In combining both engineering and economic models, a design-to-cost methodology is created for more accurately estimating the commercial viability for multiple space business park markets.
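The economic side of the methodology, estimating the number of years to return the initial investment, reduces in its simplest undiscounted form to initial investment over annual net revenue. The figures below are hypothetical, not outputs of the author's life-cycle cost model:

```python
def payback_years(initial_investment, annual_net_revenue):
    """Simple (undiscounted) payback period for a facility concept."""
    return initial_investment / annual_net_revenue

# Hypothetical space business park: $2B up front, $250M/yr net revenue
print(payback_years(2_000_000_000, 250_000_000))  # 8.0 years
```

A fuller treatment would discount the revenue stream, but even this simple ratio is enough to screen concepts whose payback exceeds any plausible investment horizon.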
ERIC Educational Resources Information Center
Kennedy, Eileen; Laurillard, Diana; Horan, Bernard; Charlton, Patricia
2015-01-01
This article reports on a design-based research project to create a modelling tool to analyse the costs and learning benefits involved in different modes of study. The Course Resource Appraisal Model (CRAM) provides accurate cost-benefit information so that institutions are able to make more meaningful decisions about which kind of…
Multi-criteria analysis for PM10 planning
NASA Astrophysics Data System (ADS)
Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa
To implement sound air quality policies, regulatory agencies require tools to evaluate the outcomes and costs associated with different emission reduction strategies. These tools are even more useful when considering atmospheric PM10 concentrations, due to the complex nonlinear processes that affect production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between air quality benefit and internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, an area often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
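The "set of non-dominated scenarios" is a Pareto frontier over (air-quality indicator, abatement cost) pairs, both to be minimized. A minimal filter, with hypothetical scenario data:

```python
def pareto_frontier(scenarios):
    """Return the non-dominated scenarios when minimizing both objectives.

    scenarios: list of (air_quality_indicator, abatement_cost) tuples.
    A scenario is dominated if another one is at least as good on both
    objectives (and is a different point).
    """
    front = []
    for s in scenarios:
        dominated = any(o != s and o[0] <= s[0] and o[1] <= s[1]
                        for o in scenarios)
        if not dominated:
            front.append(s)
    return front

# Hypothetical emission-reduction scenarios
scen = [(10, 5), (8, 7), (12, 4), (9, 9), (8, 6)]
print(pareto_frontier(scen))  # [(10, 5), (12, 4), (8, 6)]
```

The decision maker then picks a point on this frontier according to the acceptable trade-off between air quality and technology cost.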
Advanced Structural Optimization Under Consideration of Cost Tracking
NASA Astrophysics Data System (ADS)
Zell, D.; Link, T.; Bickelmaier, S.; Albinger, J.; Weikert, S.; Cremaschi, F.; Wiegand, A.
2014-06-01
In order to improve the design process of launcher configurations in the early development phase, the software Multidisciplinary Optimization (MDO) was developed. The tool combines efficient software packages such as Optimal Design Investigations (ODIN) for structural optimization and Aerospace Trajectory Optimization Software (ASTOS) for trajectory and vehicle design optimization for a defined payload and mission. The present paper focuses on the integration and validation of ODIN. ODIN enables the user to optimize typical axisymmetric structures by sizing the stiffening designs for strength and stability while minimizing the structural mass. In addition, a fully automatic finite element model (FEM) generator module creates ready-to-run FEM models of a complete stage or launcher assembly. Cost tracking, and planned future improvements concerning cost optimization, are outlined.
Models of Weather Impact on Air Traffic
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak; Wang, Yao
2017-01-01
Flight delays have been a serious problem in the national airspace system, costing about $30B per year. About 70% of delays are attributed to weather, and up to two thirds of these are avoidable. Better decision support tools would reduce these delays and improve air traffic management; such tools would benefit from models of weather impacts on airspace operations. This presentation discusses the use of machine learning methods to mine various types of weather and traffic data to develop such models.
Abou-El-Enein, Mohamed; Römhild, Andy; Kaiser, Daniel; Beier, Carola; Bauer, Gerhard; Volk, Hans-Dieter; Reinke, Petra
2013-03-01
Advanced therapy medicinal products (ATMP) have gained considerable attention in academia due to their therapeutic potential. Good Manufacturing Practice (GMP) principles ensure the quality and sterility of manufacturing these products. We developed a model for estimating the manufacturing costs of cell therapy products and optimizing the performance of academic GMP-facilities. The "Clean-Room Technology Assessment Technique" (CTAT) was tested prospectively in the GMP facility of BCRT, Berlin, Germany, then retrospectively in the GMP facility of the University of California-Davis, California, USA. CTAT is a two-level model: level one identifies operational (core) processes and measures their fixed costs; level two identifies production (supporting) processes and measures their variable costs. The model comprises several tools to measure and optimize performance of these processes. Manufacturing costs were itemized using adjusted micro-costing system. CTAT identified GMP activities with strong correlation to the manufacturing process of cell-based products. Building best practice standards allowed for performance improvement and elimination of human errors. The model also demonstrated the unidirectional dependencies that may exist among the core GMP activities. When compared to traditional business models, the CTAT assessment resulted in a more accurate allocation of annual expenses. The estimated expenses were used to set a fee structure for both GMP facilities. A mathematical equation was also developed to provide the final product cost. CTAT can be a useful tool in estimating accurate costs for the ATMPs manufactured in an optimized GMP process. These estimates are useful when analyzing the cost-effectiveness of these novel interventions. Copyright © 2013 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
Dashboard systems: implementing pharmacometrics from bench to bedside.
Mould, Diane R; Upton, Richard N; Wojciechowski, Jessica
2014-09-01
In recent years, there has been increasing interest in the development of medical decision-support tools, including dashboard systems. Dashboard systems are software packages that integrate information and calculations about therapeutics from multiple components into a single interface for use in the clinical environment. Given the high cost of medical care and the increasing need to demonstrate positive clinical outcomes for reimbursement, dashboard systems may become an important tool for improving patient outcomes, improving clinical efficiency, and containing healthcare costs. Similarly, the costs associated with drug development are rising. The use of model-based drug development (MBDD) has been proposed as a tool to streamline this process, facilitating the selection of appropriate doses and making informed go/no-go decisions. However, complete implementation of MBDD has not always been successful owing to a variety of factors, including the resources required to provide timely modeling and simulation updates. The application of dashboard systems in drug development reduces the resource requirement and may expedite updating models as new data are collected, allowing modeling results to be available in a timely fashion. In this paper, we present some background information on dashboard systems and propose the use of these systems both in the clinic and during drug development.
Projecting manpower to attain quality
NASA Technical Reports Server (NTRS)
Rone, K. Y.
1983-01-01
In these days of soaring software costs it becomes increasingly important to properly manage a software development project. One element of the management task is the projection and tracking of the manpower required to perform the task. In addition, since the total cost of the task is directly related to the initial quality built into the software, it becomes a necessity to project the development manpower in a way to attain that quality. An approach to projecting and tracking manpower with quality in mind is described. The resulting model is useful as a projection tool but must be validated in order to be used as an ongoing software cost engineering tool. A procedure is developed to facilitate the tracking of model projections and actual data so that the model can be tuned. Finally, since the model must be used in an environment of overlapping development activities on a progression of software elements in development and maintenance, a manpower allocation model is developed for use in a steady-state development/maintenance environment.
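The abstract does not specify the model's functional form; a common choice for software staffing profiles of this era is the Norden/Rayleigh curve, sketched here purely as an illustration of manpower projection with invented numbers.

```python
# Illustrative Norden/Rayleigh staffing curve (an assumption, not the paper's
# model): staff level rises to a peak and tails off, and the curve integrates
# to the project's total effort over [0, infinity).
import math

def rayleigh_staffing(total_effort, t_peak, t):
    """Staff level at time t (months) for a project with `total_effort`
    person-months whose staffing peaks at time t_peak."""
    a = 1.0 / (2.0 * t_peak ** 2)
    return total_effort * 2 * a * t * math.exp(-a * t * t)

profile = [rayleigh_staffing(200.0, 6.0, m) for m in range(0, 25)]
peak_month = max(range(25), key=lambda m: profile[m])
print(peak_month)  # staffing peaks at month 6, as parameterized
```

Tracking would then compare actual monthly headcount against this projected profile and re-tune the parameters.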
Quantifying Oldowan Stone Tool Production at Olduvai Gorge, Tanzania
Reti, Jay S.
2016-01-01
Recent research suggests that variation exists among and between Oldowan stone tool assemblages. Oldowan variation might represent differential constraints on raw materials used to produce these stone implements. Alternatively, variation among Oldowan assemblages could represent different methods that Oldowan producing hominins utilized to produce these lithic implements. Identifying differential patterns of stone tool production within the Oldowan has implications for assessing how stone tool technology evolved, how traditions of lithic production might have been culturally transmitted, and for defining the timing and scope of these evolutionary events. At present there is no null model to predict what morphological variation in the Oldowan should look like. Without such a model, quantifying whether Oldowan assemblages vary due to raw material constraints or whether they vary due to differences in production technique is not possible. This research establishes a null model for Oldowan lithic artifact morphological variation. To establish these expectations this research 1) models the expected range of variation through large scale reduction experiments, 2) develops an algorithm to categorize archaeological flakes based on how they are produced, and 3) statistically assesses the methods of production behavior used by Oldowan producing hominins at the site of DK from Olduvai Gorge, Tanzania via the experimental model. Results indicate that a subset of quartzite flakes deviate from the null expectations in a manner that demonstrates efficiency in flake manufacture, while some basalt flakes deviate from null expectations in a manner that demonstrates inefficiency in flake manufacture. 
The simultaneous presence of efficiency in stone tool production for one raw material (quartzite) and inefficiency in stone tool production for another raw material (basalt) suggests that Oldowan producing hominins at DK were able to mediate the economic costs associated with stone tool procurement by utilizing high-cost materials more efficiently than is expected and low-cost materials in an inefficient manner. PMID:26808429
Williams, Bradley S; D'Amico, Ellen; Kastens, Jude H; Thorp, James H; Flotemersch, Joseph E; Thoms, Martin C
2013-09-01
River systems consist of hydrogeomorphic patches (HPs) that emerge at multiple spatiotemporal scales. Functional process zones (FPZs) are HPs that exist at the river valley scale and are important strata for framing whole-watershed research questions and management plans. Hierarchical classification procedures aid in HP identification by grouping sections of river based on their hydrogeomorphic character; however, collecting the data required for such procedures with field-based methods is often impractical. We developed a set of GIS-based tools that facilitate rapid, low-cost riverine landscape characterization and FPZ classification. Our tools, termed RESonate, consist of a custom toolbox designed for ESRI ArcGIS®. RESonate automatically extracts 13 hydrogeomorphic variables from readily available geospatial datasets and datasets derived from modeling procedures. An advanced 2D flood model, FLDPLN, designed for MATLAB® is used to determine valley morphology by systematically flooding river networks. When used in conjunction with other modeling procedures, RESonate and FLDPLN can assess the character of large river networks quickly and at very low cost. Here we describe tool and model functions in addition to their benefits, limitations, and applications.
Wooley; Ruth; Glassner; Sheehan
1999-10-01
Bioethanol is a fuel-grade ethanol made from trees, grasses, and waste materials. It represents a sustainable substitute for gasoline in today's passenger cars. Modeling and design of processes for making bioethanol are critical tools used in the U.S. Department of Energy's bioethanol research and development program. We use such analysis to guide new directions for research and to help us understand the level at which and the time when bioethanol will achieve commercial success. This paper provides an update on our latest estimates for current and projected costs of bioethanol. These estimates are the result of very sophisticated modeling and costing efforts undertaken in the program over the past few years. Bioethanol could cost anywhere from $1.16 to $1.44 per gallon, depending on the technology and the availability of low cost feedstocks for conversion to ethanol. While this cost range opens the door to fuel blending opportunities, in which ethanol can be used, for example, to improve the octane rating of gasoline, it is not currently competitive with gasoline as a bulk fuel. Research strategies and goals described in this paper have been translated into cost savings for ethanol. Our analysis of these goals shows that the cost of ethanol could drop by 40 cents per gallon over the next ten years by taking advantage of exciting new tools in biotechnology that will improve yield and performance in the conversion process.
SAMICS Validation. SAMICS Support Study, Phase 3
NASA Technical Reports Server (NTRS)
1979-01-01
SAMICS provides a consistent basis for estimating array costs and for comparing production technology costs. A review and validation of the SAMICS model are reported. The review had the following purposes: (1) to test the computational validity of the computer model by comparison with preliminary hand calculations based on conventional cost estimating techniques; (2) to review and improve the accuracy of the cost relationships used by the model; and (3) to provide an independent verification to users of the model's value in decision making for the allocation of research and development funds and for investment in manufacturing capacity. It is concluded that the SAMICS model is a flexible, accurate, and useful tool for managerial decision making.
Distributed Space Mission Design for Earth Observation Using Model-Based Performance Evaluation
NASA Technical Reports Server (NTRS)
Nag, Sreeja; LeMoigne-Stewart, Jacqueline; Cervantes, Ben; DeWeck, Oliver
2015-01-01
Distributed Space Missions (DSMs) are gaining momentum in their application to earth observation missions owing to their unique ability to increase observation sampling in multiple dimensions. DSM design is a complex problem with many design variables, multiple objectives determining performance and cost, and emergent, often unexpected, behaviors. Very few open-access tools are available to explore the tradespace of variables, minimize cost and maximize performance for pre-defined science goals, and thereby select the most suitable design. This paper presents a software tool that can generate multiple DSM architectures based on pre-defined design variable ranges and size those architectures in terms of pre-defined science and cost metrics. The tool will help a user select Pareto-optimal DSM designs based on design-of-experiments techniques. It will be applied to some earth observation examples to demonstrate its applicability in making key decisions between different performance and cost metrics early in the design lifecycle.
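The tradespace-exploration step described above can be sketched generically: enumerate candidate architectures over design-variable ranges, score each on performance and cost, and keep the Pareto-optimal set. This is not the paper's tool; the design variables and metrics below are invented.

```python
# Toy tradespace enumeration with Pareto filtering on (performance, cost).
from itertools import product

def pareto_front(designs):
    """Keep designs not dominated on (higher performance, lower cost)."""
    front = []
    for d in designs:
        dominated = any(
            o["perf"] >= d["perf"] and o["cost"] <= d["cost"]
            and (o["perf"] > d["perf"] or o["cost"] < d["cost"])
            for o in designs)
        if not dominated:
            front.append(d)
    return front

designs = []
for n_sats, altitude_km in product([2, 4, 8], [500, 700]):
    designs.append({
        "n_sats": n_sats,
        "alt": altitude_km,
        # invented metrics: more satellites improve sampling; the 700 km
        # option is assumed here to perform worse and cost more to reach
        "perf": n_sats * (0.9 if altitude_km == 700 else 1.0),
        "cost": n_sats * 10 + (5 if altitude_km == 700 else 0),
    })

front = pareto_front(designs)  # only the 500 km designs survive in this toy set
```

A design-of-experiments approach would sample this enumeration rather than exhaust it when the variable ranges are large.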
DOE Office of Scientific and Technical Information (OSTI.GOV)
James Francfort; Kevin Morrow; Dimitri Hochard
2007-02-01
This report documents efforts to develop a computer tool for modeling the economic payback for comparative airport ground support equipment (GSE) that are propelled by either electric motors or gasoline and diesel engines. The types of GSE modeled are pushback tractors, baggage tractors, and belt loaders. The GSE modeling tool includes an emissions module that estimates the amount of tailpipe emissions saved by replacing internal combustion engine GSE with electric GSE. This report contains modeling assumptions, methodology, a user’s manual, and modeling results. The model was developed based on the operations of two airlines at four United States airports.
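The payback comparison the report describes reduces, in its simplest form, to recovering the electric-GSE price premium from annual fuel and maintenance savings. The sketch below uses invented numbers, not the report's airline data.

```python
# Hedged illustration of an electric-vs-diesel GSE payback calculation.
# All prices and operating costs below are made up for the example.

def simple_payback_years(extra_capital, annual_savings):
    """Years for operating savings to recover the electric-GSE price premium."""
    if annual_savings <= 0:
        return float("inf")
    return extra_capital / annual_savings

electric_price, diesel_price = 60_000, 40_000
diesel_opex, electric_opex = 9_000, 4_000   # fuel/energy + maintenance per year
payback = simple_payback_years(electric_price - diesel_price,
                               diesel_opex - electric_opex)
print(payback)  # 20000 / 5000 = 4.0 years
```

The report's emissions module would be layered on top of this, crediting avoided tailpipe emissions per operating hour.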
System-of-Systems Technology-Portfolio-Analysis Tool
NASA Technical Reports Server (NTRS)
O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne
2012-01-01
Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.
Cockpit System Situational Awareness Modeling Tool
NASA Technical Reports Server (NTRS)
Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara
2004-01-01
This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa
This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.
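The Total Force mix optimization can be illustrated with a deliberately tiny brute-force sketch: meet a required headcount at minimum total cost subject to caps on each personnel type. The prototype presumably uses a proper optimization model; the costs and caps here are invented.

```python
# Toy sketch of a Total Force mix: minimize total cost of meeting a
# headcount requirement across personnel types, respecting availability caps.
from itertools import product

COST = {"military": 120, "civilian": 100, "contractor": 90}   # $k/person-year
CAP = {"military": 20, "civilian": 30, "contractor": 150}     # max headcount

def cheapest_mix(required_headcount):
    best = None
    for m, c, k in product(range(CAP["military"] + 1),
                           range(CAP["civilian"] + 1),
                           range(CAP["contractor"] + 1)):
        if m + c + k < required_headcount:
            continue
        cost = m * COST["military"] + c * COST["civilian"] + k * COST["contractor"]
        if best is None or cost < best[0]:
            best = (cost, {"military": m, "civilian": c, "contractor": k})
    return best

cost, mix = cheapest_mix(100)
```

A real planning tool would add risk and capability constraints, under which the all-contractor answer this toy version finds would no longer be feasible.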
Predictive and Prognostic Models: Implications for Healthcare Decision-Making in a Modern Recession
Vogenberg, F. Randy
2009-01-01
Various modeling tools have been developed to address the lack of standardized processes that incorporate the perspectives of all healthcare stakeholders. Such models can assist in decision-making aimed at achieving specific clinical outcomes, as well as guide the allocation of healthcare resources and reduce costs. The current efforts in Congress to change the way healthcare is financed, reimbursed, and delivered have made the incorporation of modeling tools into clinical decision-making all the more important. Prognostic and predictive models are especially relevant to clinical decision-making, with implications for payers, patients, and providers. The use of these models is likely to increase as providers and patients seek to improve the clinical decision process to achieve better outcomes while reducing overall healthcare costs. PMID:25126292
Christin, Zachary; Bagstad, Kenneth J.; Verdone, Michael
2016-01-01
Restoring degraded forests and agricultural lands has become a global conservation priority. A growing number of tools can quantify ecosystem service tradeoffs associated with forest restoration. This evolving “tools landscape” presents a dilemma: more tools are available, but selecting appropriate tools has become more challenging. We present a Restoration Ecosystem Service Tool Selector (RESTS) framework that describes key characteristics of 13 ecosystem service assessment tools. Analysts enter information about their decision context, services to be analyzed, and desired outputs. Tools are filtered and presented based on five evaluative criteria: scalability, cost, time requirements, handling of uncertainty, and applicability to benefit-cost analysis. RESTS uses a spreadsheet interface but a web-based interface is planned. Given the rapid evolution of ecosystem services science, RESTS provides an adaptable framework to guide forest restoration decision makers toward tools that can help quantify ecosystem services in support of restoration.
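The RESTS filtering step (match a user's decision context against the five evaluative criteria) can be sketched as a simple filter over a tool catalog. The tool entries and ratings below are placeholders, not the actual RESTS database.

```python
# Minimal sketch of criteria-based tool selection in the spirit of RESTS.

TOOLS = [
    {"name": "ToolA", "scalable": True,  "cost": "free", "time": "days",
     "uncertainty": True,  "bca_ready": True},
    {"name": "ToolB", "scalable": False, "cost": "paid", "time": "weeks",
     "uncertainty": False, "bca_ready": True},
    {"name": "ToolC", "scalable": True,  "cost": "free", "time": "weeks",
     "uncertainty": True,  "bca_ready": False},
]

def select_tools(tools, need_uncertainty=False, max_cost="paid", need_bca=False):
    """Filter candidate tools by uncertainty handling, cost, and
    applicability to benefit-cost analysis (BCA)."""
    order = {"free": 0, "paid": 1}
    return [t["name"] for t in tools
            if (not need_uncertainty or t["uncertainty"])
            and order[t["cost"]] <= order[max_cost]
            and (not need_bca or t["bca_ready"])]

print(select_tools(TOOLS, need_uncertainty=True, need_bca=True))  # ['ToolA']
```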
Mayer, Gerhard; Quast, Christian; Felden, Janine; Lange, Matthias; Prinz, Manuel; Pühler, Alfred; Lawerenz, Chris; Scholz, Uwe; Glöckner, Frank Oliver; Müller, Wolfgang; Marcus, Katrin; Eisenacher, Martin
2017-10-30
Sustainable noncommercial bioinformatics infrastructures are a prerequisite to use and take advantage of the potential of big data analysis for research and economy. Consequently, funders, universities and institutes as well as users ask for a transparent value model for the tools and services offered. In this article, a generally applicable lightweight method is described by which bioinformatics infrastructure projects can estimate the value of tools and services offered without determining exactly the total costs of ownership. Five representative scenarios for value estimation from a rough estimation to a detailed breakdown of costs are presented. To account for the diversity in bioinformatics applications and services, the notion of service-specific 'service provision units' is introduced together with the factors influencing them and the main underlying assumptions for these 'value influencing factors'. Special attention is given on how to handle personnel costs and indirect costs such as electricity. Four examples are presented for the calculation of the value of tools and services provided by the German Network for Bioinformatics Infrastructure (de.NBI): one for tool usage, one for (Web-based) database analyses, one for consulting services and one for bioinformatics training events. Finally, from the discussed values, the costs of direct funding and the costs of payment of services by funded projects are calculated and compared. © The Author 2017. Published by Oxford University Press.
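The 'service provision unit' idea described above amounts to dividing personnel and indirect costs by the number of units delivered. The sketch below uses invented figures, not the de.NBI calculations.

```python
# Hedged sketch of a per-unit value estimate for a bioinformatics service:
# personnel time plus a share of indirect costs (electricity, hosting),
# divided by the service provision units delivered.

def cost_per_unit(personnel_hours, hourly_rate, indirect_costs, units_delivered):
    """Estimated value of one service provision unit (e.g. one analysis run)."""
    total = personnel_hours * hourly_rate + indirect_costs
    return total / units_delivered

# e.g. a web-based database service: 200 staff hours/year at 60 EUR/h,
# 3,000 EUR/year electricity + hosting, 10,000 queries served per year.
print(cost_per_unit(200, 60.0, 3_000, 10_000))  # (12000 + 3000) / 10000 = 1.5
```

Varying the 'value influencing factors' (staff time per unit, throughput) shifts this figure, which is the sensitivity the article's five scenarios explore.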
NASA Air Force Cost Model (NAFCOM): Capabilities and Results
NASA Technical Reports Server (NTRS)
McAfee, Julie; Culver, George; Naderi, Mahmoud
2011-01-01
NAFCOM is a parametric estimating tool for space hardware. It uses cost estimating relationships (CERs), which correlate historical costs with mission characteristics, to predict new project costs. It is based on historical NASA and Air Force space projects and is intended for use in the very early phases of a development project. NAFCOM can be used at the subsystem or component level and estimates development and production costs. It is applicable to various types of missions (crewed spacecraft, uncrewed spacecraft, and launch vehicles). There are two versions of the model: a restricted government version and a contractor-releasable version.
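NAFCOM's actual CERs are restricted, but the general shape of a weight-based CER can be illustrated: fit cost = a * mass^b to historical data points in log-log space, then apply it to a new subsystem. The data below are invented.

```python
# Illustrative weight-based cost estimating relationship (CER), not NAFCOM's:
# fit cost = a * mass**b by least squares in log-log space.
import math

def fit_power_cer(masses, costs):
    """Return (a, b) for cost = a * mass**b fitted to historical points."""
    n = len(masses)
    xs = [math.log(m) for m in masses]
    ys = [math.log(c) for c in costs]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    a = math.exp(ybar - b * xbar)
    return a, b

# invented historical subsystem data: (mass kg, cost $M)
masses, costs = [100, 200, 400, 800], [10, 16, 25.6, 40.96]
a, b = fit_power_cer(masses, costs)
estimate = a * 300 ** b   # predicted cost of a new 300 kg subsystem, ~$21M
```

The exponent b < 1 here reflects the economy of scale typical of such relationships.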
Skjerdal, Taran; Gefferth, Andras; Spajic, Miroslav; Estanga, Edurne Gaston; de Cecare, Alessandra; Vitali, Silvia; Pasquali, Frederique; Bovo, Federica; Manfreda, Gerardo; Mancusi, Rocco; Trevisiani, Marcello; Tessema, Girum Tadesse; Fagereng, Tone; Moen, Lena Haugland; Lyshaug, Lars; Koidis, Anastasios; Delgado-Pando, Gonzalo; Stratakos, Alexandros Ch; Boeri, Marco; From, Cecilie; Syed, Hyat; Muccioli, Mirko; Mulazzani, Roberto; Halbert, Catherine
2017-01-01
A prototype decision support IT-tool for the food industry was developed in the STARTEC project. Typical processes and decision steps were mapped using real life production scenarios of participating food companies manufacturing complex ready-to-eat foods. Companies looked for a more integrated approach when making food safety decisions that would align with existing HACCP systems. The tool was designed with shelf life assessments and data on safety, quality, and costs, using a pasta salad meal as a case product. The process flow chart was used as starting point, with simulation options at each process step. Key parameters like pH, water activity, costs of ingredients and salaries, and default models for calculations of Listeria monocytogenes , quality scores, and vitamin C, were placed in an interactive database. Customization of the models and settings was possible on the user-interface. The simulation module outputs were provided as detailed curves or categorized as "good"; "sufficient"; or "corrective action needed" based on threshold limit values set by the user. Possible corrective actions were suggested by the system. The tool was tested and approved by end-users based on selected ready-to-eat food products. Compared to other decision support tools, the STARTEC-tool is product-specific and multidisciplinary and includes interpretation and targeted recommendations for end-users.
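The "good" / "sufficient" / "corrective action needed" categorization described above is a straightforward threshold classification, sketched here with invented limits rather than the STARTEC defaults.

```python
# Sketch of STARTEC-style output categorization: a simulated value (e.g. a
# predicted L. monocytogenes count) is classified against user-set thresholds.

def categorize(value, good_limit, sufficient_limit):
    """Classify a simulated safety/quality output against threshold limits."""
    if value <= good_limit:
        return "good"
    if value <= sufficient_limit:
        return "sufficient"
    return "corrective action needed"

# e.g. log10 CFU/g at end of shelf life (limits invented for illustration)
print(categorize(1.5, good_limit=2.0, sufficient_limit=2.7))  # good
print(categorize(3.1, good_limit=2.0, sufficient_limit=2.7))  # corrective action needed
```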
OPTIMIZING BMP PLACEMENT AT WATERSHED-SCALE USING SUSTAIN
Watershed and stormwater managers need modeling tools to evaluate alternative plans for environmental quality restoration and protection needs in urban and developing areas. A watershed-scale decision-support system, based on cost optimization, provides an essential tool to suppo...
U.S. EPA's Watershed Management Research Activities
Watershed and stormwater managers need modeling tools to evaluate alternative plans for environmental quality restoration and protection needs in urban and developing areas. A watershed-scale decision-support system, based on cost optimization, provides an essential tool to suppo...
The Watershed Management Optimization Support Tool (WMOST v.1) was released by the US Environmental Protection Agency in December 2013 (http://www2.epa.gov/exposure-assessment-models/wmost-10-download-page). The objective of WMOST is to serve as a public-domain screening tool th...
Budget impact analysis of trastuzumab in early breast cancer: a hospital district perspective.
Purmonen, Timo T; Auvinen, Päivi K; Martikainen, Janne A
2010-04-01
Adjuvant trastuzumab is widely used in HER2-positive (HER2+) early breast cancer and, despite its cost-effectiveness, imposes substantial costs on health care. The purpose of the study was to develop a tool for estimating the budget impact of new cancer treatments. With this tool, we were able to estimate the budget impact of adjuvant trastuzumab, as well as the probability of staying within a given budget constraint. The created model-based evaluation tool was used to explore the budget impact of trastuzumab in early breast cancer in a single Finnish hospital district with 250,000 inhabitants. The model took into account the number of patients, HER2+ prevalence, length and cost of treatment, and the effectiveness of the therapy. Probabilistic sensitivity analysis and alternative case scenarios were performed to ensure the robustness of the results. Introduction of adjuvant trastuzumab imposed substantial costs on a relatively small hospital district. In the base-case analysis, the 4-year net budget impact was 1.3 million euro. The trastuzumab acquisition costs were partially offset by the reduction in costs associated with the treatment of cancer recurrence and metastatic disease. Budget impact analyses provide important information about the overall economic impact of new treatments and thus offer information complementary to cost-effectiveness analyses. Inclusion of treatment outcomes and probabilistic sensitivity analysis provides more realistic estimates of the net budget impact. The length of trastuzumab treatment has a strong effect on the budget impact.
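The core budget-impact logic described above (eligible patients times treatment cost, partially offset by avoided recurrence costs) can be sketched with a back-of-the-envelope calculation; all inputs below are illustrative, not the study's Finnish figures.

```python
# Hedged sketch of a net budget impact calculation for a new adjuvant therapy.

def net_budget_impact(new_bc_cases_per_year, her2_pos_share, cost_per_course,
                      recurrences_avoided, cost_per_recurrence, years):
    """Acquisition costs of treating eligible patients over `years`, minus
    the offset from recurrence/metastatic-disease costs avoided."""
    treated = new_bc_cases_per_year * her2_pos_share * years
    acquisition = treated * cost_per_course
    offset = recurrences_avoided * years * cost_per_recurrence
    return acquisition - offset

impact = net_budget_impact(new_bc_cases_per_year=150, her2_pos_share=0.15,
                           cost_per_course=30_000, recurrences_avoided=3,
                           cost_per_recurrence=25_000, years=4)
print(impact)  # 90 * 30000 - 12 * 25000 = 2,400,000
```

A probabilistic sensitivity analysis would replace these point inputs with distributions and report the probability of staying within a budget constraint.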
CMOST: an open-source framework for the microsimulation of colorectal cancer screening strategies.
Prakash, Meher K; Lang, Brian; Heinrich, Henriette; Valli, Piero V; Bauerfeind, Peter; Sonnenberg, Amnon; Beerenwinkel, Niko; Misselwitz, Benjamin
2017-06-05
Colorectal cancer (CRC) is a leading cause of cancer-related mortality. CRC incidence and mortality can be reduced by several screening strategies, including colonoscopy, but randomized CRC prevention trials face significant obstacles such as the need for large study populations with long follow-up. Therefore, CRC screening strategies will likely be designed and optimized based on computer simulations. Several computational microsimulation tools have been reported for estimating the efficiency and cost-effectiveness of CRC prevention. However, none of these tools is publicly available. There is a need for an open-source framework to answer practical questions, including testing of new screening interventions and adapting findings to local conditions. We developed and implemented a new microsimulation model, Colon Modeling Open Source Tool (CMOST), for modeling the natural history of CRC, simulating the effects of CRC screening interventions, and calculating the resulting costs. CMOST facilitates automated parameter calibration against epidemiological adenoma prevalence and CRC incidence data. Predictions of CMOST were highly similar to those of a large endoscopic CRC prevention study as well as those of existing microsimulation models. We applied CMOST to calculate the optimal timing of a screening colonoscopy. CRC incidence and mortality are reduced most efficiently by a colonoscopy between the ages of 56 and 59, while discounted life years gained (LYG) are maximal at 49-50 years. With a dwell time of 13 years, the most cost-effective screening is at 59 years, at $17,211 discounted USD per LYG. While cost-efficiency varied according to dwell time, it did not influence the optimal time point of screening interventions within the tested range. Predictions of CMOST are highly similar to those of a randomized CRC prevention trial and of other microsimulation tools.
This open-source tool will enable health-economics analyses for various countries, health-care scenarios and CRC prevention strategies. CMOST is freely available under the GNU General Public License at https://gitlab.com/misselwb/CMOST.
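CMOST itself is a full microsimulation; the summary economics it reports, however, reduce to discounting life years gained back to the present and dividing cost by them. The sketch below illustrates only that final step, with invented numbers and a standard 3% annual discount rate.

```python
# Discounted cost-per-LYG summary statistic, as reported by screening models.

def present_value(amount, year, rate=0.03):
    """Discount a quantity realized `year` years from now to the present."""
    return amount / (1 + rate) ** year

def cost_per_discounted_lyg(cost_now, life_years_gained, years_until_gained,
                            rate=0.03):
    """Screening cost divided by the discounted life years it buys."""
    dly = present_value(life_years_gained, years_until_gained, rate)
    return cost_now / dly

# e.g. $1,500 of screening cost buying 0.5 life years realized ~25 years later
value = cost_per_discounted_lyg(1_500, 0.5, 25)
```

Because the life-year gains arrive decades after the screening cost, discounting pushes the optimal screening age later, which is why the discounted-LYG optimum in the abstract (49-50 years) differs from the cost-effectiveness optimum (59 years).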
Park, Youn Shik; Engel, Bernie A; Kim, Jonggun; Theller, Larry; Chaubey, Indrajeet; Merwade, Venkatesh; Lim, Kyoung Jae
2015-03-01
A Total Maximum Daily Load (TMDL) is a water quality standard used to regulate the water quality of streams, rivers and lakes. A wide range of approaches are currently used to develop TMDLs for impaired streams and rivers. Flow and load duration curves (FDC and LDC) have been used in many states, along with other models and approaches, to evaluate the relationship between flow and pollutant loading. A web-based LDC Tool was developed to facilitate development of FDCs and LDCs as well as to support other hydrologic analyses. In this study, the FDC and LDC tool was enhanced to allow collection of water quality data via the web and to assist in establishing cost-effective Best Management Practice (BMP) implementations. The enhanced web-based tool provides access to water quality data not only from the US Geological Survey but also from the Water Quality Portal for the U.S. Moreover, the web-based tool identifies the pollutant reductions required to meet standard loads and suggests a BMP scenario based on the ability of BMPs to reduce pollutant loads and on BMP establishment and maintenance costs. In the study, flow and water quality data were collected via web access to develop the LDC and to identify the required reduction. The suggested BMP scenario from the web-based tool was evaluated using the EPA Spreadsheet Tool for the Estimation of Pollutant Load model to attain the required pollutant reduction at least cost. Copyright © 2014 Elsevier Ltd. All rights reserved.
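The FDC/LDC construction mentioned above follows a standard recipe: rank observed flows, assign exceedance probabilities, then multiply each flow by the water quality standard concentration to get the allowable load curve. The sketch below illustrates the mechanics with toy data and omits unit-conversion factors for clarity.

```python
# Minimal flow duration curve (FDC) and load duration curve (LDC) sketch.

def flow_duration_curve(flows):
    """Return (exceedance_probability, flow) pairs, highest flow first,
    using the Weibull plotting position (i + 1) / (n + 1)."""
    ranked = sorted(flows, reverse=True)
    n = len(ranked)
    return [((i + 1) / (n + 1), q) for i, q in enumerate(ranked)]

def load_duration_curve(fdc, standard_mg_per_l):
    """Allowable load at each flow; real tools apply unit-conversion factors
    here, which this sketch omits."""
    return [(p, q * standard_mg_per_l) for p, q in fdc]

flows = [12.0, 3.0, 7.5, 1.0, 20.0]           # daily mean flows (toy data)
fdc = flow_duration_curve(flows)
ldc = load_duration_curve(fdc, standard_mg_per_l=0.3)
```

Observed loads plotted against this curve then show at which flow regimes the standard is exceeded, which is what drives the required-reduction estimate.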
Measurement of W + bb and a search for MSSM Higgs bosons with the CMS detector at the LHC
NASA Astrophysics Data System (ADS)
O'Connor, Alexander Pinpin
Tooling used to cure composite laminates in the aerospace and automotive industries must provide a dimensionally stable geometry throughout the thermal cycle applied during the part curing process. This requires that the Coefficient of Thermal Expansion (CTE) of the tooling materials match that of the composite being cured. The traditional tooling material for production applications is a nickel alloy. Poor machinability and high material costs increase the expense of metallic tooling made from nickel alloys such as 'Invar 36' or 'Invar 42'. Currently, metallic tooling is unable to meet the needs of applications requiring rapid affordable tooling solutions. In applications where the tooling is not required to have the durability provided by metals, such as for small area repair, an opportunity exists for non-metallic tooling materials like graphite, carbon foams, composites, or ceramics and machinable glasses. Nevertheless, efficient machining of brittle, non-metallic materials is challenging due to low ductility, porosity, and high hardness. The machining of a layup tool comprises a large portion of the final cost. Achieving maximum process economy requires optimization of the machining process in the given tooling material. Therefore, machinability of the tooling material is a critical aspect of the overall cost of the tool. In this work, three commercially available, brittle/porous, non-metallic candidate tooling materials were selected, namely: (AAC) Autoclaved Aerated Concrete, CB1100 ceramic block and Cfoam carbon foam. Machining tests were conducted in order to evaluate the machinability of these materials using end milling. Chip formation, cutting forces, cutting tool wear, machining induced damage, surface quality and surface integrity were investigated using High Speed Steel (HSS), carbide, diamond abrasive and Polycrystalline Diamond (PCD) cutting tools. Cutting forces were found to be random in magnitude, which was a result of material porosity. 
The abrasive nature of Cfoam produced rapid tool wear when using HSS and PCD type cutting tools. However, tool wear was not significant in AAC or CB1100 regardless of the type of cutting edge. Machining induced damage was observed in the form of macro-scale chipping and fracture in combination with micro-scale cracking. Transverse rupture test results revealed significant reductions in residual strength and damage tolerance in CB1100. In contrast, AAC and Cfoam showed no correlation between machining induced damage and a reduction in surface integrity. Cutting forces in machining were modeled for all materials. Cutting force regression models were developed based on Design of Experiment and Analysis of Variance. A mechanistic cutting force model was proposed based upon conventional end milling force models and statistical distributions of material porosity. In order to validate the model, predicted cutting forces were compared to experimental results. Predicted cutting forces agreed well with experimental measurements. Furthermore, over the range of cutting conditions tested, the proposed model was shown to have comparable predictive accuracy to empirically produced regression models; greatly reducing the number of cutting tests required to simulate cutting forces. Further, this work demonstrates a key adaptation of metallic cutting force models to brittle porous material; a vital step in the research into the machining of these materials using end milling.
Regulation, the capital-asset pricing model, and the arbitrage pricing theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roll, R.W.; Ross, S.A.
1983-05-26
This article describes the arbitrage pricing theory (APT) and compares it with the capital-asset pricing model (CAPM) as a tool for computing the cost of capital in utility regulatory proceedings. The article argues that the APT is a significantly superior method for determining equity cost, and demonstrates that applying it to utilities yields more sensible estimates of the cost of equity capital than the CAPM. 8 references, 1 figure, 2 tables.
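The two models being compared reduce to a one-factor versus a multi-factor linear pricing equation for the cost of equity. A sketch with made-up inputs (the betas and factor risk premia below are illustrative, not estimates from the article):

```python
def capm_cost_of_equity(rf, beta, market_return):
    """CAPM: r = rf + beta * (expected market return - rf)."""
    return rf + beta * (market_return - rf)

def apt_cost_of_equity(rf, betas, risk_premia):
    """APT: r = rf + sum over factors of (factor sensitivity * factor risk premium)."""
    return rf + sum(b * p for b, p in zip(betas, risk_premia))

# Illustrative inputs only, not estimates from the article:
r_capm = capm_cost_of_equity(rf=0.04, beta=1.1, market_return=0.10)
r_apt = apt_cost_of_equity(rf=0.04, betas=[0.8, 0.3], risk_premia=[0.05, 0.02])
```

The APT's advantage argued in the article comes from estimating several factor sensitivities instead of a single market beta, not from a different functional form.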
Nishikiori, Nobuyuki; Van Weezenbeek, Catharina
2013-02-02
Despite the progress made in the past decade, tuberculosis (TB) control still faces significant challenges. In many countries with declining TB incidence, the disease tends to concentrate in vulnerable populations that often have limited access to health care. In light of the limitations of the current case-finding approach and the global urgency to improve case detection, active case-finding (ACF) has been suggested as an important complementary strategy to accelerate tuberculosis control, especially among high-risk populations. The present exercise aims to develop a model that can be used for country-level project planning. A simple deterministic model was developed to calculate the number of estimated TB cases diagnosed and the associated costs of diagnosis. The model was designed to compare cost-effectiveness parameters, such as the cost per case detected, for different diagnostic algorithms when they are applied to different risk populations. The model was transformed into a web-based tool that can support national TB programmes and civil society partners in designing ACF activities. According to the model output, tuberculosis active case-finding can be a costly endeavor, depending on the target population and the diagnostic strategy. The analysis suggests the following: (1) Active case-finding activities are cost-effective only if the tuberculosis prevalence among the target population is high. (2) Extensive diagnostic methods (e.g. X-ray screening for the entire group, use of sputum culture or molecular diagnostics) can be applied only to very high-risk groups such as TB contacts, prisoners or people living with human immunodeficiency virus (HIV) infection. (3) Basic diagnostic approaches such as TB symptom screening are always applicable, although the diagnostic yield is very limited. The cost-effectiveness parameter was sensitive to local diagnostic costs and the tuberculosis prevalence of target populations.
The prioritization of appropriate target populations and careful selection of cost-effective diagnostic strategies are critical prerequisites for rational active case-finding activities. A decision to conduct such activities should be based on the setting-specific cost-effectiveness analysis and programmatic assessment. A web-based tool was developed and is available to support national tuberculosis programmes and partners in the formulation of cost-effective active case-finding activities at the national and subnational levels.
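A deterministic yield-and-cost calculation of the kind described can be sketched as follows; the two-step screening algorithm, sensitivities, specificities and unit costs are illustrative assumptions, not the parameters of the WHO web tool:

```python
def cost_per_case_detected(population, prevalence, screening_cost,
                           confirm_cost, screen_sensitivity=0.77,
                           screen_specificity=0.70, confirm_sensitivity=0.85):
    """Deterministic sketch of an active case-finding yield/cost calculation.

    Everyone in the target group is screened (e.g. symptom screening) and
    screen-positives receive a confirmatory test. All parameter values here
    are illustrative assumptions.
    """
    true_cases = population * prevalence
    screen_pos = (true_cases * screen_sensitivity
                  + (population - true_cases) * (1 - screen_specificity))
    detected = true_cases * screen_sensitivity * confirm_sensitivity
    total_cost = population * screening_cost + screen_pos * confirm_cost
    return total_cost / detected if detected else float("inf")

# A higher-prevalence target group yields a lower cost per case detected:
low_prev = cost_per_case_detected(10_000, 0.001, 0.5, 10.0)
high_prev = cost_per_case_detected(10_000, 0.05, 0.5, 10.0)
```

This is the mechanism behind finding (1) above: with the same algorithm and unit costs, the cost per case detected falls steeply as prevalence in the target group rises.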
2010-09-01
NNWC) was used to calculate major cost components—labor, hardware, software, and transport, while a VMware tool was used to calculate power and...cooling costs for both solutions. In addition, VMware provided a cost estimate for the upfront hardware and software licensing costs needed to support...cost per seat (CPS) model developed by Naval Network Warfare Command (NNWC) was used to calculate major cost components—labor, hardware, software, and
Abdullah, Asnawi; Hort, Krishna; Abidin, Azwar Zaenal; Amin, Fadilah M
2012-01-01
Despite significant investment in improving service infrastructure and training of staff, public primary healthcare services in low-income and middle-income countries tend to perform poorly in reaching coverage targets. One of the factors identified in Aceh, Indonesia, was the lack of operational funds for service provision. The objective of this study was to develop a simple and transparent costing tool that enables health planners to calculate the unit costs of providing basic health services and to estimate the additional budgets required to deliver services in accordance with national targets. The tool was developed using a standard economic approach that linked the input activities to achieving six national priority programs at the primary healthcare level: health promotion; sanitation and environmental health; maternal and child health and family planning; nutrition; immunization and communicable disease control; and treatment of common illness. Costing focused on the costs of delivering the programs that need to be funded by local government budgets. The costing tool, consisting of 16 linked Microsoft Excel worksheets, was developed and tested in several districts; it enabled the calculation of the unit costs of delivering the six national priority programs per coverage target of each program (such as the unit cost of delivering the maternal and child health program per pregnant mother). This costing tool can be used by health planners to estimate the additional money required to achieve a certain level of coverage of programs, and it can be adjusted for different costs and program delivery parameters in different settings. Copyright © 2012 John Wiley & Sons, Ltd.
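The core calculation such a costing tool performs (a unit cost per coverage target, then a budget gap for a desired coverage level) can be sketched as follows; the cost categories and figures are hypothetical, not values from the Aceh worksheets:

```python
def unit_cost(program_costs, coverage_target):
    """Unit cost of a program per unit of its coverage target
    (e.g. maternal and child health program cost per pregnant mother)."""
    return sum(program_costs.values()) / coverage_target

def budget_gap(unit_cost_value, target_coverage, current_budget):
    """Additional budget required to reach the target coverage level."""
    return max(0.0, unit_cost_value * target_coverage - current_budget)

# Hypothetical district figures, not values from the study:
mch_costs = {"staff_time": 12_000.0, "supplies": 4_000.0, "transport": 2_000.0}
uc = unit_cost(mch_costs, coverage_target=600)          # cost per pregnant mother
gap = budget_gap(uc, target_coverage=900, current_budget=20_000.0)
```

The spreadsheet version simply repeats this arithmetic across the six priority programs, with the cost and delivery parameters adjustable per district.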
An Alternative Procedure for Estimating Unit Learning Curves,
1985-09-01
the model accurately describes the real-life situation, i.e., when the model is properly applied to the data, it can be a powerful tool for...predicting unit production costs. There are, however, some unique estimation problems inherent in the model. The usual method of generating predicted unit...production costs attempts to extend properties of least squares estimators to non-linear functions of these estimators. The result is biased estimates of
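The unit learning-curve model referenced in the excerpt is cost(n) = a·n^b, conventionally fitted by least squares in log space; the snippet's point is that transforming the log-space fit back to unit costs produces biased predictions. A sketch of the log-linear fit on simulated data (the 90% learning rate, noise level and sample size are assumptions chosen for illustration):

```python
import math
import random

def fit_learning_curve(units, costs):
    """OLS fit of log(cost) = log(a) + b*log(unit), the unit learning curve."""
    xs = [math.log(u) for u in units]
    ys = [math.log(c) for c in costs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return math.exp(my - b * mx), b

# Simulate a 90% learning curve (b = log2(0.90)) with multiplicative noise.
rng = random.Random(0)
a_true, b_true = 100.0, math.log(0.90, 2)
units = list(range(1, 51))
costs = [a_true * u ** b_true * math.exp(rng.gauss(0, 0.1)) for u in units]
a_hat, b_hat = fit_learning_curve(units, costs)
# Note: exp(fitted log-cost) estimates the *median* unit cost, not the mean;
# this retransformation bias is the estimation problem the excerpt refers to.
```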
NASA Technical Reports Server (NTRS)
ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.
2005-01-01
The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.
Hu, Wenfa; He, Xinhua
2014-01-01
Time, quality, and cost are three important but conflicting objectives in a building construction project, and it is a tough challenge for project managers to optimize them because they are measured in different units. This paper presents a time-cost-quality optimization model that enables managers to optimize multiple objectives. The model is based on the project breakdown structure method, in which the resources of a construction project are divided into a series of activities and further into construction labor, materials, equipment, and administration. The resources utilized in a construction activity ultimately determine its construction time, cost, and quality, and a complex time-cost-quality trade-off model is generated from the correlations between construction activities. A genetic algorithm is applied in the model to solve the comprehensive nonlinear time-cost-quality problems. The construction of a three-storey house serves as an example to illustrate the implementation of the model, demonstrate its advantages in optimizing the trade-off of construction time, cost, and quality, and help make winning decisions in construction practice. The computational time-cost-quality curves, presented as visual graphics from the case study, support traditional cost-time assumptions and demonstrate the sophistication of this time-cost-quality trade-off model.
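The trade-off structure described (activity-level execution modes jointly determining project time, cost and quality) can be illustrated with a toy model. The paper applies a genetic algorithm; for a problem this small exhaustive enumeration suffices, so the sketch below enumerates instead, and all mode data and objective weights are invented for illustration:

```python
from itertools import product

# Alternative execution modes per activity: (days, cost, quality index).
# All values are invented for illustration.
activities = [
    [(10, 1000, 0.90), (8, 1400, 0.85), (12, 800, 0.95)],
    [(6, 500, 0.92), (5, 700, 0.88)],
    [(9, 900, 0.91), (7, 1200, 0.87), (11, 700, 0.96)],
]

def evaluate(modes):
    """Project time and cost are sums over activities; quality is the weakest link."""
    return (sum(m[0] for m in modes),
            sum(m[1] for m in modes),
            min(m[2] for m in modes))

def best_tradeoff(w_time=1.0, w_cost=0.01, w_quality=100.0):
    """Weighted-objective optimum by exhaustive search; the paper's model
    uses a genetic algorithm for instances too large to enumerate."""
    best, best_score = None, float("inf")
    for modes in product(*activities):
        t, c, q = evaluate(modes)
        score = w_time * t + w_cost * c - w_quality * q
        if score < best_score:
            best, best_score = (t, c, q), score
    return best

t, c, q = best_tradeoff()
```

Raising the quality weight pushes the optimum toward slower or more expensive modes with higher quality, which is exactly the trade-off the computational curves in the paper visualize.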
Landfill Gas Energy Cost Model Version 3.0 (LFGcost-Web V3 ...
To help stakeholders estimate the costs of a landfill gas (LFG) energy project, LMOP developed a cost tool (LFGcost) in 2002. Since then, LMOP has routinely updated the tool to reflect changes in the LFG energy industry. Initially the model was designed for EPA use, to assist landfills in evaluating the economic and financial feasibility of LFG energy project development. In 2014, LMOP developed a public version of the model, LFGcost-Web (Version 3.0), to allow landfill and industry stakeholders to evaluate project feasibility on their own. LFGcost-Web can analyze costs for 12 energy recovery project types. These project costs can be estimated with or without the costs of a gas collection and control system (GCCS). The EPA used select equations from LFGcost-Web to estimate costs of the regulatory options in the 2015 proposed revisions to the MSW Landfills Standards of Performance (also known as New Source Performance Standards) and the Emission Guidelines (hereinafter referred to collectively as the Landfill Rules). More specifically, equations derived from LFGcost-Web were applied to each landfill expected to be impacted by the Landfill Rules to estimate annualized installed capital costs and annual O&M costs of a gas collection and control system. In addition, after applying the LFGcost-Web equations to the list of landfills expected to require a GCCS in year 2025 as a result of the proposed Landfill Rules, the regulatory analysis evaluated whether electr
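The abstract mentions estimating annualized installed capital costs plus annual O&M. A generic sketch of that calculation using the standard capital recovery factor (the discount rate, lifetime and cost figures are assumptions; LFGcost-Web's actual equations are not reproduced here):

```python
def capital_recovery_factor(rate, years):
    """CRF = i(1+i)^n / ((1+i)^n - 1): converts an installed capital cost
    into an equivalent uniform annual cost over the project lifetime."""
    growth = (1 + rate) ** years
    return rate * growth / (growth - 1)

def annualized_cost(capital, annual_om, rate=0.08, years=15):
    """Annualized installed capital plus annual O&M; a generic sketch with
    assumed rate and lifetime, not LFGcost-Web's parameter values."""
    return capital * capital_recovery_factor(rate, years) + annual_om

# Hypothetical GCCS figures:
total = annualized_cost(capital=1_000_000.0, annual_om=120_000.0)
```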
Development of materials for the rapid manufacture of die cast tooling
NASA Astrophysics Data System (ADS)
Hardro, Peter Jason
The focus of this research is to develop a material composition that can be processed by rapid prototyping (RP) to produce tooling for the die casting process. These rapidly produced tools would be superior to tooling made by traditional production methods by offering one or more of the following advantages: reduced tooling cost, shortened tooling creation time, reduced man-hours for tool creation, increased tool life, and shortened die casting cycle time. By utilizing RP's additive build process and vast material selection, there was a prospect that die cast tooling might be produced more quickly and with superior material properties. To this end, the material properties that influence die life and cycle time were determined, and a list of materials that fulfill these "optimal" properties was highlighted. Physical testing was conducted to grade the processability of each of the material systems and to optimize the manufacturing process for the downselected material system. Sample specimens were produced, and microscopy techniques were utilized to determine a number of physical properties of the material system. Additionally, a benchmark geometry was selected, and die casting dies were produced both from traditional tool materials (H13 steel) and techniques (machining) and from the newly developed materials and RP techniques (selective laser sintering (SLS) and laser engineered net shaping (LENS)). Once the tools were created, a die cast alloy was selected and a preset number of parts were shot into each tool. During tool creation, the manufacturing time and cost were closely monitored, and an economic model was developed to compare traditional tooling to RP tooling. This model allows one to determine, in the early design stages, when it is advantageous to implement RP tooling and when traditional tooling would be best.
The results of the physical testing and economic analysis have shown that RP tooling is able to achieve a number of the research objectives, namely, reduced tooling cost, shortened tooling creation time, and reduced man-hours for tool creation. However, identifying the appropriate time to use RP tooling appears to be the most important factor in achieving successful implementation.
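An economic comparison of traditional versus RP tooling of the kind described can be sketched as a simple total-cost model; the tool prices, tool lives, cycle times and machine rate below are hypothetical, not the thesis's measured data:

```python
def tooling_cost(tool_cost, tool_life_shots, cycle_time_s, shots,
                 machine_rate_per_hr=60.0):
    """Total cost of producing `shots` castings: tool purchases (a new tool
    each time the tool life is exhausted) plus machine time. All figures
    passed in below are illustrative assumptions."""
    tools_needed = -(-shots // tool_life_shots)   # ceiling division
    run_hours = shots * cycle_time_s / 3600.0
    return tools_needed * tool_cost + run_hours * machine_rate_per_hr

# Hypothetical comparison: a cheaper, shorter-lived RP tool with a slightly
# faster cycle vs. a machined H13 tool.
shots = 20_000
h13 = tooling_cost(tool_cost=50_000.0, tool_life_shots=100_000,
                   cycle_time_s=45, shots=shots)
rp = tooling_cost(tool_cost=15_000.0, tool_life_shots=30_000,
                  cycle_time_s=40, shots=shots)
```

Sweeping `shots` in such a model reveals the crossover volume below which RP tooling is advantageous and above which the longer-lived traditional tool wins, which is the early-design decision the thesis's economic model supports.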
A unified approach for composite cost reporting and prediction in the ACT program
NASA Technical Reports Server (NTRS)
Freeman, W. Tom; Vosteen, Louis F.; Siddiqi, Shahid
1991-01-01
The Structures Technology Program Office (STPO) at NASA Langley Research Center has held two workshops with representatives from the commercial airframe companies to establish a plan for development of a standard cost reporting format and a cost prediction tool for conceptual and preliminary designers. This paper reviews the findings of the workshop representatives together with a plan for implementation of their recommendations. The recommendations of the cost tracking and reporting committee will be implemented by reinstituting the collection of composite part fabrication data in a format similar to the DoD/NASA Structural Composites Fabrication Guide. The process of data collection will be automated by taking advantage of current technology with user-friendly computer interfaces and electronic data transmission. Development of a conceptual and preliminary designers' cost prediction model will be initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in modular form for interfacing with state-of-the-art preliminary design tools and computer-aided design (CAD) programs is assessed.
Teaching project: a low-cost swine model for chest tube insertion training.
Netto, Fernando Antonio Campelo Spencer; Sommer, Camila Garcia; Constantino, Michael de Mello; Cardoso, Michel; Cipriani, Raphael Flávio Fachini; Pereira, Renan Augusto
2016-02-01
To describe and evaluate the acceptance of a low-cost chest tube insertion porcine model in a medical education project in the southwest of Paraná, Brazil. We developed a low-cost, low-technology porcine model for teaching chest tube insertion and used it in a teaching project. Medical trainees - students and residents - received theoretical instructions about the procedure and performed thoracic drainage in this porcine model. After performing the procedure, the participants filled out a feedback questionnaire about the proposed experimental model. This study presents the model and analyzes the questionnaire responses. Seventy-nine medical trainees used and evaluated the model. The anatomical correlation between the porcine model and human anatomy was considered high, averaging 8.1±1.0 among trainees. All study participants approved the low-cost porcine model for chest tube insertion. The presented low-cost porcine model for chest tube insertion training was feasible and had good acceptability among trainees. This model has potential use as a teaching tool in medical education.
Halsing, David L; Moore, Michael R
2008-04-01
The mandate to increase endangered salmon populations in the Columbia River Basin of North America has created a complex, controversial resource-management issue. We constructed an integrated assessment model as a tool for analyzing biological-economic trade-offs in recovery of Snake River spring- and summer-run chinook salmon (Oncorhynchus tshawytscha). We merged 3 frameworks: a salmon-passage model to predict migration and survival of smolts; an age-structured matrix model to predict long-term population growth rates of salmon stocks; and a cost-effectiveness analysis to determine a set of least-cost management alternatives for achieving particular population growth rates. We assessed 6 individual salmon-management measures and 76 management alternatives composed of one or more measures. To reflect uncertainty, results were derived for different assumptions of effectiveness of smolt transport around dams. Removal of an estuarine predator, the Caspian Tern (Sterna caspia), was cost-effective and generally increased long-term population growth rates regardless of transport effectiveness. Elimination of adult salmon harvest had a similar effect over a range of its cost estimates. The specific management alternatives in the cost-effective set depended on assumptions about transport effectiveness. On the basis of recent estimates of smolt transport effectiveness, alternatives that discontinued transportation or breached dams were prevalent in the cost-effective set, whereas alternatives that maximized transportation dominated if transport effectiveness was relatively high. More generally, the analysis eliminated 80-90% of management alternatives from the cost-effective set. Application of our results to salmon management is limited by data availability and model assumptions, but these limitations can help guide research that addresses critical uncertainties and information. 
Our results thus demonstrate that linking biology and economics through integrated models can provide valuable tools for science-based policy and management.
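The cost-effectiveness analysis described retains only non-dominated management alternatives, which is how 80-90% of the alternatives were eliminated. A sketch of that screening step, with invented alternative names, costs and population growth rates:

```python
def cost_effective_set(alternatives):
    """Keep alternatives not dominated by any other: an alternative is
    dominated if another costs no more and achieves a population growth
    rate at least as high, with at least one strict inequality.
    The (name, cost, growth) tuples below are invented for illustration."""
    kept = []
    for name, cost, growth in alternatives:
        dominated = any(
            c <= cost and g >= growth and (c < cost or g > growth)
            for n, c, g in alternatives if n != name
        )
        if not dominated:
            kept.append(name)
    return kept

alts = [
    ("tern removal", 5.0, 1.01),
    ("harvest elimination", 8.0, 1.02),
    ("max transport", 12.0, 1.00),   # dominated: costs more, grows less
    ("dam breach", 30.0, 1.04),
]
frontier = cost_effective_set(alts)
```

Alternatives surviving this screen form the least-cost frontier; which ones survive shifts when assumed transport effectiveness changes the growth-rate column, matching the sensitivity the authors report.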
Optimal distribution of borehole geophones for monitoring CO2-injection-induced seismicity
NASA Astrophysics Data System (ADS)
Huang, L.; Chen, T.; Foxall, W.; Wagoner, J. L.
2016-12-01
The U.S. DOE initiative, the National Risk Assessment Partnership (NRAP), aims to develop quantitative risk assessment methodologies for carbon capture, utilization and storage (CCUS). As part of the tasks of the Strategic Monitoring Group of NRAP, we develop a tool for optimal design of a borehole geophone distribution for monitoring CO2-injection-induced seismicity. The tool consists of a number of steps, including building a geophysical model for a given CO2 injection site, defining target monitoring regions within CO2-injection/migration zones, generating synthetic seismic data, specifying acceptable uncertainties in input data, and determining the optimal distribution of borehole geophones. We use a synthetic geophysical model as an example to demonstrate the capability of our new tool to design an optimal, cost-effective passive seismic monitoring network using borehole geophones. The model is built based on the geologic features found at the Kimberlina CCUS pilot site located in the southern San Joaquin Valley, California. This tool can provide CCUS operators with a guideline for cost-effective microseismic monitoring of geologic carbon storage and utilization.
NASA Astrophysics Data System (ADS)
Sirirojvisuth, Apinut
In complex aerospace system design, making an effective design decision requires multidisciplinary knowledge from both product and process perspectives. Integrating manufacturing considerations into the design process is most valuable during the early design stages, since designers have more freedom to integrate new ideas when changes are relatively inexpensive in terms of time and effort. Several metrics related to manufacturability are cost, time, and manufacturing readiness level (MRL), yet there is a lack of structured methodology that quantifies how changes in design decisions impact these metrics. As a result, a new set of integrated cost analysis tools is proposed in this study to quantify these impacts. Equally important is the capability to integrate this new cost tool into existing design methodologies without sacrificing the agility and flexibility required during the early design phases. To demonstrate the applicability of this concept, a ModelCenter environment is used to develop a software architecture that represents the Integrated Product and Process Development (IPPD) methodology used in several aerospace system designs. The environment seamlessly integrates product and process analysis tools and makes an effective transition from one design phase to the next while retaining knowledge gained a priori. Then, an advanced cost estimating tool called the Hybrid Lifecycle Cost Estimating Tool (HLCET), a hybrid combination of weight-, process-, and activity-based estimating techniques, is integrated with the design framework. A new weight-based lifecycle cost model is created based on Tailored Cost Model (TCM) equations [3]. This lifecycle cost tool estimates the program cost based on vehicle component weights and programmatic assumptions. Additional high-fidelity cost tools such as process-based and activity-based cost analysis methods can be used to modify the baseline TCM result as more knowledge is accumulated over design iterations.
Therefore, with this concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher-fidelity trade-offs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM estimate of the composite part production cost. The reason for the replacement is that TCM estimates production costs from part weights as a result of subtractive manufacturing of metallic origin, such as casting, forging, and machining processes. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called the Maynard Operation Sequence Technique (MOST), used similarly to the Activity-Based Costing (ABC) approach, to estimate the manufacturing time of a part by breaking down the operations that occur during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to the traditional methods of estimating time by analogy or from response surface equations fitted to historical process data. The MOST concept provides a tailored study of an individual process, as is typically required for a new, innovative design. Nevertheless, the MOST idea has some challenges, one of which is its requirement to build a new process from the ground up. The process development requires a subject matter expert (SME) in the manufacturing method of the particular design. The SME must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside the design discipline and prior training in MOST.
To relieve this constraint, this study includes an entirely new sub-system architecture that comprises (1) a knowledge-based system to provide the required knowledge during process selection, and (2) a new user interface to guide parameter selection when building a process using MOST. Also included in this study is a demonstration of how HLCET and its constituents can be integrated with Georgia Tech's Integrated Product and Process Development (IPPD) methodology. The applicability of this work is shown through a complex aerospace design example, to gain insights into how manufacturing knowledge helps make better design decisions during the early stages. The setup process is explained, with an example of its utility demonstrated in a hypothetical fighter aircraft wing redesign. An evaluation of the system's effectiveness against existing methodologies concludes the thesis.
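MOST estimates an activity's time from sequence-model index values, with time expressed in time measurement units (TMU; 1 TMU = 0.036 s). A minimal sketch of that arithmetic, where the index values themselves are hypothetical:

```python
def most_time_seconds(index_values):
    """BasicMOST: activity time = (sum of sequence-model index values) x 10 TMU,
    converted to seconds at 1 TMU = 0.036 s. The index values passed in are
    hypothetical, not taken from the thesis."""
    tmu = 10 * sum(index_values)
    return tmu * 0.036

# General Move sequence A B G  A B P  A with illustrative indices:
t = most_time_seconds([1, 0, 1, 1, 0, 1, 0])   # 40 TMU
```

Summing such activity times across a part's operation breakdown yields the manufacturing-time input that HLCET's activity-based component consumes.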
A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.
ERIC Educational Resources Information Center
Kim, Jin Eun
A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…
Decision models in the evaluation of psychotropic drugs : useful tool or useless toy?
Barbui, Corrado; Lintas, Camilla
2006-09-01
A recent contribution in the European Journal of Health Economics employs a decision model to compare the health care costs of olanzapine and risperidone treatment for schizophrenia. The model suggests that a treatment strategy of first-line olanzapine is cost-saving over a 1-year period, with additional clinical benefits in the form of avoided relapses in the long term. From a clinical perspective this finding is indubitably relevant, but can physicians and policy makers believe it? The study is presented in a balanced way, its assumptions are based on data extracted from clinical trials published in major psychiatric journals, and the theoretical underpinnings of the model are reasonable. Despite these positive aspects, we believe that the methodology used in this study, the decision model approach, is an unsuitable and potentially misleading tool for evaluating psychotropic drugs. In this commentary, taking the olanzapine vs. risperidone model as an example, arguments are provided to support this statement.
The Watershed Management Optimization Support Tool (WMOST v.1) was released by the US Environmental Protection Agency in December 2013 (http://www2.epa.gov/exposure-assessment-models/wmost-10-download-page). The objective of WMOST is to serve as a public-domain screening tool th...
ERIC Educational Resources Information Center
Stewart, Ellen; Smith, Katherine E.
2015-01-01
Concerns about the limited influence of research on decision making have prompted the development of tools intended to mediate evidence for policy audiences. This article focuses on three examples, prominent in public health: impact assessments; systematic reviews; and economic decision-making tools (cost-benefit analysis and scenario modelling).…
A managerial accounting analysis of hospital costs.
Frank, W G
1976-01-01
Variance analysis, an accounting technique, is applied to an eight-component model of hospital costs to determine the contribution each component makes to cost increases. The method is illustrated by application to data on total costs from 1950 to 1973 for all U.S. nongovernmental not-for-profit short-term general hospitals. The costs of a single hospital are analyzed and compared to the group costs. The potential uses and limitations of the method as a planning and research tool are discussed. PMID:965233
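Two-way variance analysis of the kind applied here splits a cost component's change into price, quantity and joint effects that sum exactly to the total change. A sketch with invented figures:

```python
def cost_variance(base_price, base_qty, new_price, new_qty):
    """Classic two-way variance analysis of a cost component:
    price variance + quantity variance + joint variance = total cost change."""
    price_var = (new_price - base_price) * base_qty
    qty_var = (new_qty - base_qty) * base_price
    joint_var = (new_price - base_price) * (new_qty - base_qty)
    return price_var, qty_var, joint_var

# Illustrative component: labor cost = wage rate x hours per patient-day.
pv, qv, jv = cost_variance(base_price=2.0, base_qty=10.0,
                           new_price=3.0, new_qty=12.0)
total_change = 3.0 * 12.0 - 2.0 * 10.0
```

Applying this decomposition to each of the eight cost components shows how much of a hospital's cost increase is attributable to paying more per input versus using more inputs.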
[Cost of mother-child care in Morelos State].
Cahuana-Hurtado, Lucero; Sosa-Rubí, Sandra; Bertozzi, Stefano
2004-01-01
To compare the cost of maternal and child health care (current model) to that of the WHO Mother-Baby Package if it were implemented. A pilot cross-sectional case study was conducted in September 2001 in Sanitary District No. III, Morelos State, Mexico. Two rural health centers, an urban health center, and a general hospital, all managed by the Ministry of Health, were selected for the study. The Mother-Baby Package Costing Spreadsheet was used to estimate the total cost and cost per intervention for the current model and for the Mother-Baby Package model. The total cost of the Mother-Baby Package was twice the cost of the current model. Of the 18 interventions evaluated, the highest proportion of total costs corresponded to antenatal care and normal delivery. Personnel costs represented more than half of the total costs. The Mother-Baby Package Costing Spreadsheet is a practical tool to estimate and compare costs and is useful to guide the distribution of financial resources allocated to maternal and child healthcare. However, this model has limited application unless it is adapted to the structure of each healthcare system. The English version of this paper is available at: http://www.insp.mx/salud/index.html.
DOE Office of Scientific and Technical Information (OSTI.GOV)
PIERSON KL; MEINERT FL
2012-01-26
Two notable modeling efforts within the Hanford Tank Waste Operations Simulator (HTWOS) are currently underway to (1) increase the robustness of the underlying chemistry approximations through the development and implementation of an aqueous thermodynamic model, and (2) add enhanced planning capabilities to the HTWOS model through development and incorporation of the lifecycle cost model (LCM). Since even seemingly small changes in apparent waste composition or treatment parameters can result in large changes in quantities of high-level waste (HLW) and low-activity waste (LAW) glass, mission duration or lifecycle cost, a solubility model that more accurately depicts the phases and concentrations of constituents in tank waste is required. The LCM enables evaluation of the interactions of proposed changes on lifecycle mission costs, which is critical for decision makers.
Cost Models for MMC Manufacturing Processes
NASA Technical Reports Server (NTRS)
Elzey, Dana M.; Wadley, Haydn N. G.
1996-01-01
Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory, and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Fifield, Leonard S.; Gandhi, Umesh N.
This project proposed to integrate, optimize and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk Simulation Moldflow Insight (ASMI) package for injection-molded long-carbon-fiber thermoplastic composites into a cohesive prediction capability. The current effort focused on rendering the developed models more robust and efficient for automotive industry part design to enable weight savings and cost reduction. The project goal has been achieved by optimizing the developed models, improving and integrating their implementations in ASMI, and validating them for a complex 3D LCF thermoplastic automotive part (Figure 1). Both PP and PA66 were used as resin matrices. After validating ASMI predictions for fiber orientation and fiber length for this complex part against the corresponding measured data, in collaboration with Toyota and Magna, PNNL developed a method using the predictive engineering tool to assess LCF/PA66 complex part design in terms of stiffness performance. Structural three-point bending analyses of the complex part and similar parts in steel were then performed for this purpose, and the team has then demonstrated the use of stiffness-based complex part design assessment to evaluate weight savings relative to the body system target (≥ 35%) set in Table 2 of DE-FOA-0000648 (AOI #1). In addition, starting from the part-to-part analysis, the PE tools enabled an estimated weight reduction for the vehicle body system using 50 wt% LCF/PA66 parts relative to the current steel system. Also, from this analysis an estimate of the manufacturing cost, including the material cost, for making the equivalent part in steel has been determined and compared to the costs for making the LCF/PA66 part to determine the cost per “saved” pound.
Integrated modeling approach for optimal management of water, energy and food security nexus
NASA Astrophysics Data System (ADS)
Zhang, Xiaodong; Vesselinov, Velimir V.
2017-03-01
Water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production and delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will affect WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors and generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components, including energy supply, electricity generation, water supply and demand, and food production, as well as mitigation of environmental impacts. WEFO is demonstrated on a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. The results demonstrate how these types of analyses can help decision-makers and stakeholders make cost-effective decisions for optimal WEF management.
Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)
NASA Technical Reports Server (NTRS)
Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.
2005-01-01
The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.
Alternative Fuels Data Center: Tools
Vehicle Cost Calculator: compare cost of ownership and emissions for most vehicle models. Petroleum Reduction Planning Tool: ROI and payback period for natural gas vehicles and infrastructure. AFLEET Tool: calculate a fleet's costs and emissions for natural gas, hydrogen, or fuel cell infrastructure. GREET Fleet Footprint Calculator: calculate your fleet's petroleum use.
Information quality-control model
NASA Technical Reports Server (NTRS)
Vincent, D. A.
1971-01-01
Model serves as graphic tool for estimating complete product objectives from limited input information, and is applied to cost estimations, product-quality evaluations, and effectiveness measurements for manpower resources allocation. Six product quality levels are defined.
Cost minimizing of cutting process for CNC thermal and water-jet machines
NASA Astrophysics Data System (ADS)
Tavaeva, Anastasia; Kurennov, Dmitry
2015-11-01
This paper deals with the optimization of the cutting process for CNC thermal and water-jet machines. The accuracy with which the objective-function parameters of the optimization problem can be calculated is investigated. The paper shows that the working tool path speed is not a constant value; it depends on several parameters described in this paper. Relations are presented for the working tool path speed as a function of the number of NC program frames, the length of straight cuts, and the part configuration. Based on the results obtained, correction coefficients for the working tool speed are defined. Additionally, the optimization problem may be solved using a mathematical model that takes into account the additional restrictions of thermal cutting (choice of piercing and output tool points, precedence conditions, thermal deformations). The second part of the paper considers non-standard cutting techniques, which can reduce cutting cost and time compared with standard cutting techniques, and evaluates the effectiveness of their application. Future research directions are indicated at the end of the paper.
2001-04-12
A Comparison of the Audit and Accreditation Tools Used by The Health Care Financing Administration, The Texas Department of…
The Cost-Effectiveness of an Intensive Treatment Protocol for Severe Dyslexia in Children
ERIC Educational Resources Information Center
Hakkaart-van Roijen, Leona; Goettsch, Wim G.; Ekkebus, Michel; Gerretsen, Patty; Stolk, Elly A.
2011-01-01
Studies of interventions for dyslexia have focused entirely on outcomes related to literacy. In this study, we considered a broader picture assessing improved quality of life compared with costs. A model served as a tool to compare costs and effects of treatment according to a new protocol and care as usual. Quality of life was measured and valued…
Trade-Space Analysis Tool for Constellations (TAT-C)
NASA Technical Reports Server (NTRS)
Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja
2016-01-01
Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches, and hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost-risk value? The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variable trade space for pre-defined science, cost, and risk goals and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C, including: a User Interface (UI) interacting with multiple users (scientists, mission designers, or program managers); and an Executive Driver that gathers requirements from the UI, formulates Trade-space Search Requests for the Trade-space Search Iterator with inputs from the Knowledge Base, and then, in collaboration with the Orbit Coverage, Reduction Metrics, and Cost Risk modules, generates multiple potential architectures and their associated characteristics.
TAT-C leverages the General Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. The current version of TAT-C includes uniform Walker constellations as well as ad-hoc constellations, and its cost model is an aggregate model consisting of Cost Estimating Relationships (CERs) from widely accepted models. The Knowledge Base supports both analysis and exploration, and the current GUI prototype automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost.
Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckers, Koenraad J; McCabe, Kevin
This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal Energy for Production of Heat and electricity (IR) Economically Simulated). GEOPHIRES combines reservoir, wellbore, surface plant and economic models to estimate the capital, operation and maintenance costs, lifetime energy production, and overall levelized cost of energy of a geothermal plant. The available end-use options are electricity, direct-use heat and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to couple to an external reservoir simulator, updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. An overview of all the updates and two case studies to illustrate the tool's new capabilities are provided in this paper.
Information Technology: A Tool to Cut Health Care Costs
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi; Maly, K. J.; Overstreet, C. M.; Foudriat, E. C.
1996-01-01
Old Dominion University embarked on a project to see how current computer technology could be applied to reduce the cost and/or improve the efficiency of health care services. We designed and built a prototype for an integrated medical record system (MRS). The MRS is written in Tool Command Language/Toolkit (Tcl/Tk). While the initial version of the prototype had patient information hard-coded into the system, later versions used an INGRES database for storing patient information. Currently, we have proposed an object-oriented model for implementing the MRS. These projects involve developing information systems for physicians and medical researchers to enhance their ability to provide improved treatment at reduced cost. The move to computerized patient records is well underway: several standards exist for laboratory records, and several groups are working on standards for other portions of the patient record.
Interpreting cost of ownership for mix-and-match lithography
NASA Astrophysics Data System (ADS)
Levine, Alan L.; Bergendahl, Albert S.
1994-05-01
Cost of ownership modeling is a critical and emerging tool that provides significant insight into the ways to optimize device manufacturing costs. The development of a model to deal with a particular application, mix-and-match lithography, was performed in order to determine the level of cost savings and the optimum ways to create these savings. The use of sensitivity analysis with cost of ownership allows the user to make accurate trade-offs between technology and cost. The use and interpretation of the model results are described in this paper. Parameters analyzed include several manufacturing considerations -- depreciation, maintenance, engineering and operator labor, floorspace, resist, consumables and reticles. Inherent in this study is the ability to customize this analysis for a particular operating environment. Results demonstrate the clear advantages of a mix-and-match approach for three different operating environments. These case studies also demonstrate various methods to efficiently optimize cost savings strategies.
2014-01-01
Time, quality, and cost are three important but conflicting objectives in a building construction project, and optimizing all three simultaneously is a tough challenge for project managers. This paper presents a time-cost-quality optimization model that enables managers to optimize multiple objectives. The model is based on the project breakdown structure method, in which the task resources of a construction project are divided into a series of activities and further into construction labor, materials, equipment, and administration. The resources utilized in a construction activity ultimately determine its construction time, cost, and quality, and a complex time-cost-quality trade-off model is generated from the correlations between construction activities. A genetic algorithm is applied in the model to solve the resulting nonlinear time-cost-quality problems. The building of a three-storey house serves as an example to illustrate the implementation of the model, demonstrate its advantages in optimizing the trade-off among construction time, cost, and quality, and help make winning decisions in construction practice. The computed time-cost-quality curves, presented as visual graphics from the case study, support traditional cost-time assumptions and demonstrate the capability of the trade-off model. PMID:24672351
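The genetic-algorithm approach described in this abstract can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' model: the activity data (three activities, each with three resource options carrying hypothetical time/cost/quality values) and the weighted-sum fitness that scalarizes the three objectives are invented for demonstration.

```python
import random

# Hypothetical data: each activity offers three resource options,
# each a (time, cost, quality) triple -- illustrative numbers only.
ACTIVITIES = [
    [(10, 500, 0.90), (8, 700, 0.85), (12, 400, 0.95)],
    [(20, 1200, 0.80), (15, 1500, 0.75), (25, 900, 0.90)],
    [(5, 300, 0.95), (4, 450, 0.90), (7, 250, 0.99)],
]

def evaluate(chromosome):
    """Total project time and cost, and average quality, for one plan."""
    time = sum(ACTIVITIES[i][g][0] for i, g in enumerate(chromosome))
    cost = sum(ACTIVITIES[i][g][1] for i, g in enumerate(chromosome))
    quality = sum(ACTIVITIES[i][g][2] for i, g in enumerate(chromosome)) / len(chromosome)
    return time, cost, quality

def fitness(chromosome, w_t=0.4, w_c=0.4, w_q=0.2):
    """Weighted-sum scalarization: lower time/cost and higher quality score better.
    Time and cost are normalized by their worst-case totals (44 and 2650)."""
    t, c, q = evaluate(chromosome)
    return -(w_t * t / 44 + w_c * c / 2650) + w_q * q

def genetic_algorithm(pop_size=30, generations=50, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(3) for _ in ACTIVITIES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(ACTIVITIES))
            child = a[:cut] + b[cut:]             # one-point crossover
            if rng.random() < 0.1:                # occasional mutation
                child[rng.randrange(len(child))] = rng.randrange(3)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = genetic_algorithm()
print("best plan:", best, "->", evaluate(best))
```

Varying the weights traces out different points on the time-cost-quality trade-off surface, which is the kind of curve the case study reports.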
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yuanyuan; Diao, Ruisheng; Huang, Renke
Maintaining good quality of power plant stability models is of critical importance to ensure the secure and economic operation and planning of today's power grid, with its increasingly stochastic and dynamic behavior. According to North American Electric Reliability Corporation (NERC) standards, all generators in North America with capacities larger than 10 MVA are required to validate their models every five years. Validation is quite costly and can significantly affect the revenue of generator owners, because traditional staged testing requires generators to be taken offline. Over the past few years, validating and calibrating parameters using online measurements, including phasor measurement units (PMUs) and digital fault recorders (DFRs), has been proven to be a cost-effective approach. In this paper, an innovative open-source tool suite is presented for validating power plant models using the PPMV tool, identifying bad parameters with trajectory sensitivity analysis, and finally calibrating parameters using an ensemble Kalman filter (EnKF) based algorithm. The architectural design and the detailed procedures to run the tool suite are presented, with results of tests on a realistic hydro power plant using PMU measurements for 12 different events. The calibrated machine, exciter, governor, and PSS models demonstrate much better performance than the original models for all the events and show the robustness of the proposed calibration algorithm.
10 CFR 434.521 - The simulation tool.
Code of Federal Regulations, 2012 CFR
2012-01-01
... of the building including night setback during various times of the year; and 521.1.5Energy consumption information at a level necessary to determine the Energy Cost Budget and Design Energy Cost... buildings. In addition, models shall be capable of translating the Design Energy Consumption into energy...
Multi-objective reverse logistics model for integrated computer waste management.
Ahluwalia, Poonam Khanijo; Nema, Arvind K
2006-12-01
This study aimed to address the issues involved in the planning and design of a computer waste management system in an integrated manner. A decision-support tool is presented for selecting an optimum configuration of computer waste management facilities (segregation, storage, treatment/processing, reuse/recycle and disposal) and allocating waste to these facilities. The model is based on an integer linear programming method with the objectives of minimizing environmental risk as well as cost. The issue of uncertainty in the estimated waste quantities from multiple sources is addressed using the Monte Carlo simulation technique. An illustrative example of computer waste management in Delhi, India is presented to demonstrate the usefulness of the proposed model and to study tradeoffs between cost and risk. The results of the example problem show that it is possible to reduce the environmental risk significantly through a marginal increase in the available cost. The proposed model can serve as a powerful tool to address the environmental problems associated with exponentially growing quantities of computer waste, which are presently being managed using rudimentary methods of reuse, recovery and disposal by various small-scale vendors.
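The Monte Carlo treatment of uncertain waste quantities can be illustrated with a stripped-down sketch. The source names, triangular distributions, and per-tonne cost and risk factors below are all invented assumptions; the actual model couples such sampled quantities to an integer linear program rather than the simple linear roll-up shown here.

```python
import random
import statistics

# Hypothetical waste sources with uncertain annual quantities (tonnes),
# each modelled as a triangular distribution (low, mode, high).
SOURCES = {
    "households": (50, 80, 120),
    "offices": (30, 45, 70),
    "retailers": (10, 20, 35),
}
UNIT_COST = 250.0   # assumed treatment cost per tonne (illustrative)
UNIT_RISK = 0.8     # assumed environmental risk score per tonne (illustrative)

def simulate(n_runs=10_000, seed=42):
    """Sample total waste per run and roll up cost and risk distributions."""
    rng = random.Random(seed)
    costs, risks = [], []
    for _ in range(n_runs):
        total = sum(rng.triangular(lo, hi, mode)
                    for lo, mode, hi in SOURCES.values())
        costs.append(total * UNIT_COST)
        risks.append(total * UNIT_RISK)
    return costs, risks

costs, risks = simulate()
print(f"mean cost: {statistics.mean(costs):,.0f}")
print(f"95th-percentile cost: {sorted(costs)[int(0.95 * len(costs))]:,.0f}")
```

The spread between the mean and the upper percentile is what a planner would weigh when trading a marginal cost increase against a significant risk reduction, as in the Delhi example.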
Ashley, Dennis W; Mullins, Robert F; Dente, Christopher J; Garlow, Laura; Medeiros, Regina S; Atkins, Elizabeth V; Solomon, Gina; Abston, Dena; Ferdinand, Colville H
2017-09-01
Trauma center readiness costs are incurred to maintain essential infrastructure and capacity to provide emergent services on a 24/7 basis. These costs are not captured by traditional hospital cost accounting, and no national consensus exists on appropriate definitions for each cost. Therefore, in 2010, stakeholders from all Level I and II trauma centers developed a survey tool standardizing and defining trauma center readiness costs. The survey tool underwent minor revisions to provide further clarity, and the survey was repeated in 2013. The purpose of this study was to provide a follow-up analysis of readiness costs for Georgia's Level I and Level II trauma centers. Using the American College of Surgeons Resources for Optimal Care of the Injured Patient guidelines, four readiness cost categories were identified: Administrative, Clinical Medical Staff, Operating Room, and Education/Outreach. Through conference calls, webinars and face-to-face meetings with financial officers, trauma medical directors, and program managers from all trauma centers, standardized definitions for reporting readiness costs within each category were developed. This resulted in a survey tool for centers to report their individual readiness costs for one year. The total readiness cost for all Level I trauma centers was $34,105,318 (avg $6,821,064) and all Level II trauma centers was $20,998,019 (avg $2,333,113). Methodology to standardize and define readiness costs for all trauma centers within the state was developed. Average costs for Level I and Level II trauma centers were identified. This model may be used to help other states define and standardize their trauma readiness costs.
Selecting and Ranking Cost Research Projects
1993-09-01
[Ranking-table residue: candidate projects included STACM enhancements, Automated Cost Estimating Integrated Tools (ACEIT) libraries and support, BM/C3 GEP and BM/C3 EP engineering and cost, SCATS, PICES support, and GUARDIAN, each scored on cost range and priority ratings.]
Technology Transfer Challenges for High-Assurance Software Engineering Tools
NASA Technical Reports Server (NTRS)
Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.
2003-01-01
In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas that limit our ability to transfer high-assurance software engineering tools into practice.
Fiber optic submarine cables cuts cost modeling and cable protection aspects
NASA Astrophysics Data System (ADS)
Al-Lawati, Ali
2015-03-01
This work presents a model to calculate the costs associated with submarine fiber optic cable cuts. It accounts for both the fixed and variable factors determining the cost of repairing cables and restoring data transmission, considering the duration of a cut, fiber capacity, the number of fiber pairs, and the expected number of cuts during the cable's lifetime. Moreover, it provides templates for initial feasibility assessments by comparing cut costs to the cost of different cable protection schemes. It offers a needed tool to guide decision makers in selecting cable type and the length and depth of cable burial, weighing the increase in initial investment required by such protection methods against the cost of cut repairs and alternative restoration paths for data.
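The expected-cost arithmetic behind such a feasibility comparison can be sketched as follows. Every parameter name and value here is an illustrative assumption, not the paper's actual model: repair cost, outage duration, and revenue loss per unit of capacity stand in for the fixed and variable factors the abstract enumerates.

```python
def expected_cut_cost(cuts_per_year, cable_life_years, fixed_repair_cost,
                      repair_days, fiber_pairs, capacity_per_pair_gbps,
                      revenue_per_gbps_day):
    """Expected lifetime cost of cable cuts: fixed repair costs plus
    lost-traffic revenue while capacity is down (illustrative model)."""
    expected_cuts = cuts_per_year * cable_life_years
    outage_loss = (repair_days * fiber_pairs
                   * capacity_per_pair_gbps * revenue_per_gbps_day)
    return expected_cuts * (fixed_repair_cost + outage_loss)

def protection_worthwhile(extra_burial_cost, avoided_fraction, baseline_cut_cost):
    """Compare an extra burial/armoring investment to the cut costs it avoids."""
    return extra_burial_cost < avoided_fraction * baseline_cut_cost

baseline = expected_cut_cost(cuts_per_year=0.5, cable_life_years=25,
                             fixed_repair_cost=1_000_000, repair_days=14,
                             fiber_pairs=4, capacity_per_pair_gbps=100,
                             revenue_per_gbps_day=500)
print(f"expected lifetime cut cost: ${baseline:,.0f}")
print("deeper burial pays off:", protection_worthwhile(20_000_000, 0.6, baseline))
```

This is the template shape the paper describes: if deeper burial costs less than the cut costs it is expected to avoid, the protection scheme is worth the added initial investment.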
Satellite Systems Design/Simulation Environment: A Systems Approach to Pre-Phase A Design
NASA Technical Reports Server (NTRS)
Ferebee, Melvin J., Jr.; Troutman, Patrick A.; Monell, Donald W.
1997-01-01
A toolset for the rapid development of small satellite systems has been created. The objective of this tool is to support the definition of spacecraft mission concepts to satisfy a given set of mission and instrument requirements. The objective of this report is to provide an introduction to understanding and using the SMALLSAT Model. SMALLSAT is a computer-aided Phase A design and technology evaluation tool for small satellites. SMALLSAT enables satellite designers, mission planners, and technology program managers to observe the likely consequences of their decisions in terms of satellite configuration, non-recurring and recurring cost, and mission life cycle costs and availability statistics. It was developed by Princeton Synergetic, Inc. and User Systems, Inc. as a revision of the previous TECHSAT Phase A design tool, which modeled medium-sized Earth observation satellites. Both TECHSAT and SMALLSAT were developed for NASA.
Software Certification for Temporal Properties With Affordable Tool Qualification
NASA Technical Reports Server (NTRS)
Xia, Songtao; DiVito, Benedetto L.
2005-01-01
It has been recognized that a framework based on proof-carrying code (also called semantic-based software certification in its community) could be used as a candidate software certification process for the avionics industry. To meet this goal, tools in the "trust base" of a proof-carrying code system must be qualified by regulatory authorities. A family of semantic-based software certification approaches is described, each different in expressive power, level of automation and trust base. Of particular interest is the so-called abstraction-carrying code, which can certify temporal properties. When a pure abstraction-carrying code method is used in the context of industrial software certification, the fact that the trust base includes a model checker would incur a high qualification cost. This position paper proposes a hybrid of abstraction-based and proof-based certification methods so that the model checker used by a client can be significantly simplified, thereby leading to lower cost in tool qualification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, Aladsair J.; Viswanathan, Vilayanur V.; Stephenson, David E.
A robust performance-based cost model is developed for all-vanadium, iron-vanadium and iron-chromium redox flow batteries. System aspects such as shunt current losses, pumping losses and thermal management are accounted for. The objective function, set to minimize system cost, allows determination of stack design and operating parameters such as current density, flow rate and depth of discharge (DOD). Component costs obtained from vendors are used to calculate system costs for various time frames. Data from a 2 kW stack were used to estimate unit energy costs, which were compared with model estimates for the same size electrodes. The tool has been shared with the redox flow battery community both to validate their stack data and to guide future direction.
An approach to software cost estimation
NASA Technical Reports Server (NTRS)
Mcgarry, F.; Page, J.; Card, D.; Rohleder, M.; Church, V.
1984-01-01
A general procedure for software cost estimation in any environment is outlined. The basic concepts of work and effort estimation are explained, some popular resource estimation models are reviewed, and the accuracy of source estimates is discussed. A software cost prediction procedure based on the experiences of the Software Engineering Laboratory in the flight dynamics area and incorporating management expertise, cost models, and historical data is described. The sources of information and relevant parameters available during each phase of the software life cycle are identified. The methodology suggested incorporates these elements into a customized management tool for software cost prediction. Detailed guidelines for estimation in the flight dynamics environment developed using this methodology are presented.
The use of activity-based cost estimation as a management tool for cultural change
NASA Technical Reports Server (NTRS)
Mandell, Humboldt; Bilby, Curt
1991-01-01
It will be shown that the greatest barrier to American exploration of the planet Mars is not the development of the technology needed to deliver humans and return them safely to earth. Neither is it the cost of such an undertaking, as has been previously suggested, although certainly, such a venture may not be inexpensive by some measures. The predicted costs of exploration have discouraged serious political dialog on the subject. And, in fact, even optimistic projections of the NASA budget do not contain the resources required, under the existing development and management paradigm, for human space exploration programs. It will be demonstrated that the perception of the costs of such a venture, and the cultural responses to the perceptions are factors inhibiting American exploration of the moon and the planet Mars. Cost models employed in the aerospace industry today correctly mirror the history of past space programs, and as such, are representative of the existing management and development paradigms. However, if, under this current paradigm no major exploration programs are feasible, then cost analysis methods based in the past may not have great utility in exploring the needed cultural changes. This paper explores the use of a new type of model, the activity based cost model, which will treat management style as an input variable, in a sense providing a tool whereby a complete, affordable program might be designed, including both the technological and management aspects.
Evaluation of land use regression models in Detroit, Michigan
Introduction: Land use regression (LUR) models have emerged as a cost-effective tool for characterizing exposure in epidemiologic health studies. However, little critical attention has been focused on validation of these models as a step toward temporal and spatial extension of ...
Latest NASA Instrument Cost Model (NICM): Version VI
NASA Technical Reports Server (NTRS)
Mrozinski, Joe; Habib-Agahi, Hamid; Fox, George; Ball, Gary
2014-01-01
The NASA Instrument Cost Model, NICM, is a suite of tools which allow for probabilistic cost estimation of NASA's space-flight instruments at both the system and subsystem level. NICM also includes the ability to perform cost by analogy as well as joint confidence level (JCL) analysis. The latest version of NICM, Version VI, was released in Spring 2014. This paper will focus on the new features released with NICM VI, which include: 1) the NICM-E cost estimating relationship, applicable to instruments flying on Explorer-like class missions; 2) a new cluster analysis capability which, alongside the results of the parametric cost estimate for the user's instrument, provides a visualization of that instrument's similarity to previously flown instruments; and 3) new cost estimating relationships for in-situ instruments.
Banz, Kurt
2005-01-01
This article describes the framework of a comprehensive European model developed to assess clinical and economic outcomes of cardiac resynchronization therapy (CRT) versus optimal pharmacological therapy (OPT) alone in patients with heart failure. The model structure is based on information obtained from the literature, expert opinion, and a European CRT Steering Committee. The decision-analysis tool allows consideration of direct medical and indirect costs, and computes outcomes for distinct periods of time up to 5 years. Qualitative data can also be entered for cost-utility analysis. Model input data for a preliminary appraisal of the economic value of CRT in Germany were obtained from clinical trials, experts, health statistics, and medical tariff lists. The model offers comprehensive analysis capabilities and high flexibility, so it can easily be adapted to any European country or special setting. The illustrative analysis for Germany indicates that CRT is a cost-effective intervention. Although CRT is associated with average direct medical net costs of €5,880 per patient, 22% of its upfront implantation cost is recouped within 1 year because of significantly decreased hospitalizations. At €36,600, the incremental cost per quality-adjusted life-year (QALY) gained is below the euro equivalent (€41,300, at €1 = US$1.21) of the commonly used threshold of US$50,000 considered to represent cost-effectiveness. The sensitivity analysis showed these preliminary results to be fairly robust to changes in key assumptions. The European CRT model is an important tool to assess the economic value of CRT in patients with moderate to severe heart failure. In light of the planned introduction of Diagnosis Related Group (DRG) based reimbursement in various European countries, the economic data generated by the model can play an important role in the decision-making process.
Cost-effectiveness acceptability curves revisited.
Al, Maiwenn J
2013-02-01
Since the introduction of the cost-effectiveness acceptability curve (CEAC) in 1994, its use as a method to describe uncertainty around incremental cost-effectiveness ratios (ICERs) has steadily increased. In this paper, first the construction and interpretation of the CEAC is explained, both in the context of modelling studies and in the context of cost-effectiveness (CE) studies alongside clinical trials. Additionally, this paper reviews the advantages and limitations of the CEAC. Many of the perceived limitations can be attributed to the practice of interpreting the CEAC as a decision rule while it was not developed as such. It is argued that the CEAC is still a useful tool in describing and quantifying uncertainty around the ICER, especially in combination with other tools such as plots on the CE plane and value-of-information analysis.
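As a rough illustration of how a CEAC is constructed (hypothetical data, not from this paper): given bootstrap replicates of incremental costs and effects, the curve plots, for each willingness-to-pay threshold λ, the proportion of replicates whose incremental net monetary benefit, NMB = λ·ΔE − ΔC, is positive.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical bootstrap replicates of incremental cost (EUR) and
# incremental effect (QALYs) for an intervention vs. its comparator.
d_cost = rng.normal(5000, 1500, size=10_000)
d_effect = rng.normal(0.10, 0.05, size=10_000)

def ceac(d_cost, d_effect, thresholds):
    """For each willingness-to-pay threshold lam, return the fraction of
    replicates with positive net monetary benefit: lam * d_effect - d_cost."""
    return [float(np.mean(lam * d_effect - d_cost > 0)) for lam in thresholds]

thresholds = [0, 20_000, 50_000, 100_000]
probs = ceac(d_cost, d_effect, thresholds)
# at lam = 0, the value is simply the probability that the intervention saves money
```

The curve is descriptive: it quantifies uncertainty around the ICER rather than acting as a decision rule, which is the interpretation point the paper stresses.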
The Mission Planning Lab: A Visualization and Analysis Tool
NASA Technical Reports Server (NTRS)
Daugherty, Sarah C.; Cervantes, Benjamin W.
2009-01-01
Simulation and visualization are powerful decision-making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called the Mission Planning Lab (MPL).
Clark, Alistair; Moule, Pam; Topping, Annie; Serpell, Martin
2015-05-01
To review research in the literature on nursing shift scheduling/rescheduling, and to report key issues identified in a consultation exercise with managers in four English National Health Service trusts, to inform the development of mathematical tools for rescheduling decision-making. Shift rescheduling is largely unrecognised as an everyday, time-consuming management task with different imperatives from scheduling. Poor rescheduling decisions can have quality, cost, and morale implications. A systematic critical literature review identified rescheduling issues and existing mathematical modelling tools. A consultation exercise with nursing managers examined the complex challenges associated with rescheduling. Minimal research exists on rescheduling compared with scheduling. Poor rescheduling can result in greater disruption to planned nursing shifts and may impact negatively on the quality and cost of patient care, and on nurse morale and retention. Very little research examines management challenges or mathematical modelling for rescheduling. Shift rescheduling is a complex and frequent management activity that is more challenging than scheduling. Mathematical modelling may have potential as a tool to support managers in minimising rescheduling disruption. The lack of specific methodological support for rescheduling that takes into account its complexity increases the likelihood of harm for patients and stress for nursing staff and managers. © 2013 John Wiley & Sons Ltd.
Machinability of titanium metal matrix composites (Ti-MMCs)
NASA Astrophysics Data System (ADS)
Aramesh, Maryam
Titanium metal matrix composites (Ti-MMCs), as a new generation of materials, have various potential applications in the aerospace and automotive industries. The presence of ceramic particles enhances the physical and mechanical properties of the alloy matrix. However, the hard and abrasive nature of these particles causes various issues in the field of their machinability. Severe tool wear and short tool life are the most important drawbacks of machining this class of materials. There is very limited work in the literature regarding the machinability of this class of materials, especially in the areas of tool life estimation and tool wear. Polycrystalline diamond (PCD) tools are widely regarded by researchers as the best choice for machining MMCs; however, due to their high cost, economical alternatives are sought. Cubic boron nitride (CBN) inserts, the second-hardest available tools, show superior characteristics such as great wear resistance, high hardness at elevated temperatures, a low coefficient of friction, and a high melting point. Yet CBN tools have not so far been studied in machining of Ti-MMCs. In this work, a comprehensive study was performed to explore the tool wear mechanisms of CBN inserts during turning of Ti-MMCs. The unique morphology of the worn faces of the tools was investigated for the first time, which led to new insights into the identification of chemical wear mechanisms during machining of Ti-MMCs. Utilizing the full tool life capacity of cutting tools is also crucial, due to the considerable costs associated with suboptimal replacement of tools. This strongly motivates the development of a reliable model for tool life estimation under any cutting conditions. In this study, a novel model based on survival analysis methodology is developed to estimate the progressive states of tool wear under any cutting conditions during machining of Ti-MMCs.
This statistical model takes into account the machining time in addition to the effect of cutting parameters, and its estimates showed very good agreement with the experimental results. Moreover, a more advanced model was constructed by adding tool wear as another variable to the previous model. This yielded a new model for estimating the remaining life of worn inserts under different cutting conditions, using current tool wear data as an input. The results of this model were validated against experiments and were consistent with the measured outcomes.
NREL and Panasonic | Energy Systems Integration Facility | NREL
The tool combines NREL's building energy system models with distribution system models for the first time, and Panasonic will perform cost-benefit analyses.
CoMET: Cost and Mass Evaluation Tool for Spacecraft and Mission Design
NASA Technical Reports Server (NTRS)
Bieber, Ben S.
2005-01-01
New technology in space exploration is often developed without complete knowledge of its impact. While the immediate benefits of a new technology are obvious, it is harder to understand its indirect consequences, which ripple through the entire system. CoMET is a technology evaluation tool designed to illuminate how specific technology choices affect a mission at each system level. CoMET uses simplified models for mass, power, and cost to analyze performance parameters of technologies of interest. The sensitivity analysis that CoMET provides shows whether developing a certain technology will greatly benefit the project or not. CoMET is an ongoing project approaching a web-based implementation phase. This year, development focused on models for planetary daughter craft, such as atmospheric probes, blimps and balloons, and landers. These models were developed through research into historical data, well-established rules of thumb, and the engineering judgment of experts at JPL. The models are validated by corroboration with JPL advanced mission studies. Other enhancements to CoMET include adding launch vehicle analysis and integrating an updated cost model. When completed, CoMET will allow technological development to be focused on areas that will most drastically improve spacecraft performance.
NASA Technical Reports Server (NTRS)
Sundaram, Meenakshi
2005-01-01
NASA and the aerospace industry are extremely serious about reducing the cost and improving the performance of launch vehicles, both manned and unmanned. In the aerospace industry, sharing infrastructure for manufacturing more than one type of spacecraft is becoming a trend to achieve economies of scale. An example is the Boeing Decatur facility, where both Delta II and Delta IV launch vehicles are made. The author is not sure how Boeing estimates the costs of each spacecraft made in the same facility. Regardless of how a contractor estimates the cost, NASA's popular cost estimating tool, the NASA/Air Force Cost Model (NAFCOM), must have a built-in method to account for the effect of infrastructure sharing. Since there is no provision in the most recent version, NAFCOM2002, to take care of this, the Engineering Cost Community at MSFC has found that the tool overestimates the manufacturing cost by as much as 30%. Therefore, the objective of this study is to develop a methodology to assess the impact of infrastructure sharing so that better operations cost estimates may be made.
A web-based rapid assessment tool for production publishing solutions
NASA Astrophysics Data System (ADS)
Sun, Tong
2010-02-01
Solution assessment is a critical first step in understanding and measuring the business process efficiency enabled by an integrated solution package. However, assessing the effectiveness of any solution is usually a very expensive and time-consuming task that requires extensive domain knowledge: collecting and understanding the specific customer operational context, defining validation scenarios, and estimating the expected performance and operational cost. This paper presents an intelligent web-based tool that can rapidly assess any given solution package for production publishing workflows via a simulation engine and create a report of estimated performance metrics (e.g., throughput, turnaround time, resource utilization) and operational cost. By integrating a digital publishing workflow ontology and an activity-based costing model with a Petri-net-based workflow simulation engine, this web-based tool allows users to quickly evaluate potential digital publishing solutions side by side within their desired operational contexts, and provides a low-cost and rapid assessment for organizations before committing to any purchase. The tool also benefits solution providers by shortening sales cycles, establishing trustworthy customer relationships, and supplementing professional assessment services with a proven quantitative simulation and estimation technology.
Designing tools for oil exploration using nuclear modeling
NASA Astrophysics Data System (ADS)
Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike
2017-09-01
When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time-consuming and costly design iterations. During tool response characterization, modeling is benchmarked against experimental data and then used to complement and expand the database, making it more detailed and inclusive of measurement environments that are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross-section databases, focusing on the response to a few elements found in the tool, borehole, and subsurface formation. For neutron-induced inelastic and capture gamma-ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.
Vehicle Lightweighting: Challenges and Opportunities with Aluminum
NASA Astrophysics Data System (ADS)
Sachdev, Anil K.; Mishra, Raja K.; Mahato, Anirban; Alpas, Ahmet
Rising energy costs, consumer preferences and regulations drive requirements for fuel economy, performance, comfort, safety and cost of future automobiles. These conflicting situations offer challenges for vehicle lightweighting, for which aluminum applications are key. This paper describes product design needs and materials and process development opportunities driven by theoretical, experimental and modeling tools in the area of sheet and castings. Computational tools and novel experimental techniques used in their development are described. The paper concludes with challenges that lie ahead for pervasive use of aluminum and the necessary fundamental R&D that is still needed.
Frequency Domain Modeling of SAW Devices
NASA Technical Reports Server (NTRS)
Wilson, W. C.; Atkinson, G. M.
2007-01-01
New SAW sensors for integrated vehicle health monitoring of aerospace vehicles are being investigated. SAW technology is low cost, rugged, lightweight, and extremely low power. However, the lack of design tools for MEMS devices in general, and for Surface Acoustic Wave (SAW) devices specifically, has led to the development of tools that will enable integrated design, modeling, simulation, analysis and automatic layout generation of SAW devices. A frequency domain model has been created. The model is mainly first order, but it includes second order effects from triple transit echoes. This paper presents the model and results from the model for a SAW delay line device.
An Affordability Comparison Tool (ACT) for Space Transportation
NASA Technical Reports Server (NTRS)
McCleskey, C. M.; Bollo, T. R.; Garcia, J. L.
2012-01-01
NASA has recently emphasized the importance of affordability for the Commercial Crew Development Program (CCDP), the Space Launch System (SLS), and the Multi-Purpose Crew Vehicle (MPCV). System architects and designers are challenged to come up with architectures and designs that do not bust the budget. This paper describes the Affordability Comparison Tool (ACT), which analyzes different systems or architecture configurations for affordability and allows comparison of total life-cycle cost; annual recurring costs; affordability figures-of-merit such as cost per pound, cost per seat, and cost per flight; and productivity measures such as payload throughput. Although ACT is not a deterministic model, the paper develops algorithms and parametric factors that use characteristics of the architectures or systems being compared to produce important system outcomes (figures-of-merit). Example applications of outcome figures-of-merit are also documented to provide the designer with information on the relative affordability and productivity of different space transportation applications.
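The figures-of-merit named in the abstract are generic cost ratios; a sketch with hypothetical inputs (ACT's actual algorithms and parametric factors are developed in the paper itself):

```python
# Hypothetical architecture inputs; only the generic figure-of-merit
# definitions listed in the abstract are computed here.
def figures_of_merit(total_annual_cost, flights_per_year,
                     payload_lb_per_flight, seats_per_flight):
    cost_per_flight = total_annual_cost / flights_per_year
    cost_per_pound = cost_per_flight / payload_lb_per_flight
    cost_per_seat = cost_per_flight / seats_per_flight
    throughput = flights_per_year * payload_lb_per_flight  # lb delivered/year
    return cost_per_flight, cost_per_pound, cost_per_seat, throughput

fom = figures_of_merit(1_200_000_000, 6, 50_000, 4)
# cost/flight = $200M, cost/lb = $4,000, cost/seat = $50M, 300,000 lb/yr
```

Comparing two architectures then reduces to comparing these tuples side by side, which is the comparison ACT automates.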
Integrated Modeling Approach for Optimal Management of Water, Energy and Food Security Nexus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiaodong; Vesselinov, Velimir Valentinov
2016-12-28
We report that water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will impact WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors, generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, and food production, as well as mitigation of environmental impacts. WEFO is demonstrated on a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. The obtained results demonstrate how these types of analyses can be helpful for decision-makers and stakeholders to make cost-effective decisions for optimal WEF management.
Barrera-Valencia, Camilo; Benito-Devia, Alexis Vladimir; Vélez-Álvarez, Consuelo; Figueroa-Barrera, Mario; Franco-Idárraga, Sandra Milena
Telepsychiatry is defined as the use of information and communication technology (ICT) in providing remote psychiatric services. Telepsychiatry is applied using two types of communication: synchronous (real time) and asynchronous (store and forward). To determine the cost-effectiveness of a synchronous and an asynchronous telepsychiatry model in prison inmate patients with symptoms of depression, a cost-effectiveness study was performed on a population of 157 patients from the Establecimiento Penitenciario y Carcelario de Mediana Seguridad de Manizales, Colombia. The sample was determined by applying the Zung self-rating depression scale (1965) and the Hamilton Depression Rating Scale (HDRS), the latter being the tool used for the comparison. Initial Hamilton score, arrival time, duration of system downtime, and clinical effectiveness variables had normal distributions (P>.05). There were significant differences (P<.001) between care costs for the different models, showing that the mean cost of the asynchronous model is less than that of the synchronous model and making the asynchronous model more cost-effective. The asynchronous model is the most cost-effective model of telepsychiatry care for patients with depression admitted to a detention centre, according to the results of clinical effectiveness, cost measurement, and patient satisfaction. Copyright © 2016 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
Geothermal probabilistic cost study
NASA Technical Reports Server (NTRS)
Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.
1981-01-01
A tool to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM), is presented. The GPCM was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk which can shift the risk among different agents were analyzed: the leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance.
Fargo, Kelly L.; Johnston, Jessica; Stevenson, Kurt B.; Deutscher, Meredith
2015-01-01
Background: Studies evaluating the impact of passive cost visibility tools on antibiotic prescribing are lacking. Objective: The objective of this study was to evaluate whether the implementation of a passive antibiotic cost visibility tool would impact antibiotic prescribing and decrease antibiotic spending. Methods: An efficiency and effectiveness initiative (EEI) was implemented in October 2012. To support the EEI, an antibiotic cost visibility tool was created in June 2013 displaying the relative cost of antibiotics. Using an observational study of interrupted time series design, 3 time frames were studied: pre EEI, post EEI, and post cost visibility tool implementation. The primary outcome was antibiotic cost per 1,000 patient days. Secondary outcomes included case mix index (CMI)–adjusted antibiotic cost per 1,000 patient days and utilization of the cost visibility tool. Results: Initiation of the EEI was associated with a $4,675 decrease in antibiotic cost per 1,000 patient days (P = .003), and costs continued to decrease in the months following EEI (P = .009). After implementation of the cost visibility tool, costs remained stable (P = .844). Despite CMI increasing over time, adjustment for CMI had no impact on the directionality or statistical significance of the results. Conclusion: Our study demonstrated a significant and sustained decrease in antibiotic cost per 1,000 patient days when focused medication cost reduction efforts were implemented, but passive cost visibility tool implementation was not associated with additional cost reduction. Antibiotic cost visibility tools may be of most benefit when prior medication cost reduction efforts are lacking or when an active intervention is incorporated. PMID:26405341
Additive Manufacturing of Tooling for Refrigeration Cabinet Foaming Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Post, Brian K; Nuttall, David; Cukier, Michael
The primary objective of this project was to leverage the Big Area Additive Manufacturing (BAAM) process and materials into a long-term, quick-change tooling concept to drastically reduce product lead and development timelines and costs. Current refrigeration foam molds are complicated to manufacture, involving casting several aluminum parts in an approximate shape, machining components of the molds, and post-fitting and shimming of the parts in an articulated fixture. The total process can take over 6 months. The foaming process is slower than required for production; therefore, multiple fixtures, 10 to 27, are required per refrigerator model. Molds are particular to a specific product configuration, making mixed-model assembly challenging for sequencing, mold changes, or auto-changeover features. The initial goal was to create a tool leveraging the ORNL materials and additive process to build a tool in 4 to 6 weeks or less. A secondary goal was to create common fixture cores and provide lightweight fixture sections that could be revised in a very short time to increase equipment flexibility, reduce lead times, lower the barriers to first production trials, and reduce tooling costs.
COSTMODL: An automated software development cost estimation tool
NASA Technical Reports Server (NTRS)
Roush, George B.
1991-01-01
The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, in both the public and private sectors. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic that sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
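The abstract does not spell out COSTMODL's five algorithms, so as a hedged sketch of the kind of recalibratable model it describes, here is a COCOMO-style power law with a least-squares recalibration of its multiplier from local project history. The coefficients are the generic published COCOMO-81 "organic" values, not COSTMODL's own calibration, and the history data are invented.

```python
import math

def effort_person_months(kloc, A=2.4, B=1.05, drivers=()):
    """COCOMO-style estimate: effort = A * KLOC**B * product(cost drivers).
    A=2.4, B=1.05 are the generic COCOMO-81 'organic' mode constants."""
    return A * kloc ** B * math.prod(drivers)

def recalibrate_A(history, B=1.05):
    """Refit the multiplier A to local data with the exponent B held fixed:
    least squares on log(effort) = log(A) + B*log(kloc)."""
    logs = [math.log(e) - B * math.log(k) for k, e in history]
    return math.exp(sum(logs) / len(logs))

# history: (size in KLOC, actual effort in person-months) from past projects
A_local = recalibrate_A([(10, 30.0), (50, 170.0)], B=1.05)
est = effort_person_months(32, A=A_local)
```

This mirrors the customization the abstract highlights: the equation form stays fixed while its constants are refit to the user organization's demonstrated productivity.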
A Layered Decision Model for Cost-Effective System Security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Huaqiang; Alves-Foss, James; Soule, Terry
System security involves decisions in at least three areas: identification of well-defined security policies, selection of cost-effective defence strategies, and implementation of real-time defence tactics. Although choices made in each of these areas affect the others, existing decision models typically handle the three decision areas in isolation. There is no comprehensive tool that can integrate them to provide a single efficient model for safeguarding a network. In addition, there is no clear way to determine which particular combinations of defence decisions result in cost-effective solutions. To address these problems, this paper introduces a Layered Decision Model (LDM) for use in deciding how to address defence decisions based on their cost-effectiveness. To validate the LDM and illustrate how it is used, we used simulation to test model rationality and applied the LDM to the design of system security for an e-commerce business case.
Wang, Li Yan; O'Brien, Mary Jane; Maughan, Erin D
2016-11-01
This paper describes a user-friendly, Excel spreadsheet model and two data collection instruments constructed by the authors to help states and districts perform cost-benefit analyses of school nursing services delivered by full-time school nurses. Prior to applying the model, states or districts need to collect data using two forms: the "Daily Nurse Data Collection Form" and the "Teacher Survey." The former is used to record daily nursing activities, including number of student health encounters, number of medications administered, number of student early dismissals, and number of medical procedures performed. The latter is used to obtain estimates for the time teachers spend addressing student health issues. Once inputs are entered in the model, outputs are automatically calculated, including program costs, total benefits, net benefits, and benefit-cost ratio. The spreadsheet model, data collection tools, and instructions are available at the NASN website (http://www.nasn.org/The/CostBenefitAnalysis).
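A minimal sketch, with hypothetical dollar figures, of the outputs the abstract says the spreadsheet computes; the benefit categories below are illustrative stand-ins for values derived from the two data-collection forms, not the model's actual line items.

```python
# Compute the spreadsheet's four outputs: total benefits, net benefit,
# program cost (given), and benefit-cost ratio.
def cost_benefit(program_cost, benefits):
    total_benefits = sum(benefits.values())
    net_benefit = total_benefits - program_cost
    bcr = total_benefits / program_cost
    return total_benefits, net_benefit, bcr

benefits = {                       # hypothetical annual dollar values
    "teacher_time_saved": 30_000,  # e.g. from Teacher Survey estimates
    "parent_productivity": 20_000,
    "procedures_performed": 25_000,  # e.g. from the daily nurse form
}
total, net, bcr = cost_benefit(program_cost=50_000, benefits=benefits)
# a benefit-cost ratio above 1 means benefits exceed the nurse's cost
```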
Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Y; Glascoe, L
The computational modeling of the biodegradation of contaminated groundwater systems, accounting for biochemical reactions coupled to contaminant transport, is a valuable tool both for the field engineer/planner with limited computational resources and for the expert computational researcher less constrained by time and computer power. Several analytical and numerical computer models have been and are being developed to cover the practical needs put forth by users across this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information, with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.
Conducting systematic reviews of economic evaluations.
Gomersall, Judith Streak; Jadotte, Yuri Tertilus; Xue, Yifan; Lockwood, Suzi; Riddle, Dru; Preda, Alin
2015-09-01
In 2012, a working group was established to review and enhance the Joanna Briggs Institute (JBI) guidance for conducting systematic reviews of evidence from economic evaluations addressing questions about health intervention cost-effectiveness. The objective here is to present the outcomes of the working group. The group conducted three activities to inform the new guidance: a review of the literature on the utility/futility of systematic reviews of economic evaluations, and consideration of its implications for updating the existing methodology; an assessment of the critical appraisal tool in the existing guidance against criteria that promote validity in economic evaluation research and against two other commonly used tools; and a workshop. The debate in the literature on the limitations and value of systematic review of economic evidence cautions that such reviews are unlikely to generate one-size-fits-all answers to questions about the cost-effectiveness of interventions and their comparators. Informed by this finding, the working group adjusted the framing of the objectives definition in the existing JBI methodology. The shift is away from defining the objective as determining one cost-effectiveness measure, and toward summarizing study estimates of cost-effectiveness and, informed by consideration of the included study characteristics (patient, setting, intervention components, etc.), identifying conditions conducive to lowering costs and maximizing health benefits. The existing critical appraisal tool was included in the new guidance. The new guidance also recommends that a tool designed specifically for appraising model-based studies be used together with the generic appraisal tool when evaluating model-based economic evaluations.
The guidance produced by the group supports reviewers through each step of the systematic review process, the same steps followed in JBI reviews of other types of evidence. The updated JBI guidance will be useful for researchers wanting to synthesize evidence about economic questions, either as stand-alone reviews or as part of comprehensive or mixed-method evidence reviews. Although the working group's output has improved the JBI guidance for systematic reviews of economic evaluations, further work is required in several areas: adjusting the critical appraisal tool to separate questions addressing intervention cost and effectiveness measurement; providing more explicit guidance for assessing generalizability of findings; and offering a more robust method for evidence synthesis that facilitates the more ambitious review objectives.
Modeling and Simulation Tools for Heavy Lift Airships
NASA Technical Reports Server (NTRS)
Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John
2016-01-01
For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to give designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats differ from heavier-than-air (HTA) vehicles in important ways. To account for these differences, modifications to the standard design tools are required to fully characterize LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. This limited LTA tool set could be expanded by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools would mitigate the reliance on proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper presents the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available are surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for developing new modeling and analysis capabilities to supplement contemporary tools are also presented.
USDA-ARS?s Scientific Manuscript database
Hydrologic models are essential tools for environmental assessment of agricultural non-point source pollution. The automatic calibration of hydrologic models, though efficient, demands significant computational power, which can limit its application. The study objective was to investigate a cost e...
Re-evaluating causal modeling with mantel tests in landscape genetics
Samuel A. Cushman; Tzeidle N. Wasserman; Erin L. Landguth; Andrew J. Shirk
2013-01-01
The predominant analytical approach to associate landscape patterns with gene flow processes is based on the association of cost distances with genetic distances between individuals. Mantel and partial Mantel tests have been the dominant statistical tools used to correlate cost distances and genetic distances in landscape genetics. However, the inherent high...
Answering the Call for Accountability: An Activity and Cost Analysis Case Study
ERIC Educational Resources Information Center
Carducci, Rozana; Kisker, Carrie B.; Chang, June; Schirmer, James
2007-01-01
This article summarizes the findings of a case study on the creation and application of an activity-based cost accounting model that links community college salary expenditures to mission-critical practices within academic divisions of a southern California community college. Although initially applied as a financial management tool in private…
NASA Technical Reports Server (NTRS)
Cowderoy, A. J. C.; Jenkins, John O.; Poulymenakou, A.
1992-01-01
The tendency for software development projects to finish over schedule and over budget has been documented extensively. Additionally, many projects are completed within budgetary and schedule targets only because the customer agrees to accept reduced functionality. In his classic book, The Mythical Man-Month, Fred Brooks exposes the fallacy that effort and schedule are freely interchangeable. All current cost models assume that there is very limited scope for schedule compression unless there is a corresponding reduction in delivered functionality. The Metrication and Resources Modeling Aid (MERMAID) project, partially financed by the Commission of the European Communities (CEC) as Project 2046, began in October 1988 with the following goals: (1) to improve understanding of the relationships between software development productivity and product and process metrics; (2) to facilitate widespread technology transfer from the consortium to the European software industry; and (3) to facilitate widespread uptake of cost estimation techniques by providing prototype cost estimation tools. MERMAID developed a family of cost estimation methods, many of which have been implemented in prototype tools. These prototypes are best considered toolkits or workbenches.
A Cost Analysis Model for Army Sponsored Graduate Dental Education Programs.
1997-04-01
characteristics of a good measurement tool? Cooper and Emory, in their textbook Business Research Methods, state there are three major criteria for evaluating...a measurement tool: validity, reliability, and practicality (Cooper and Emory 1995). Validity can be compartmentalized into internal and external...tremendous expense? The AEGD-1 year program is used extensively as a recruiting tool to encourage senior dental students to join the Army Dental Corps. The
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gifford, Jason S.; Grace, Robert C.; Rickerson, Wilson H.
This report serves as a resource for policymakers who wish to learn more about levelized cost of energy (LCOE) calculations, including cost-based incentives. The report identifies key renewable energy cost modeling options, highlights the policy implications of choosing one approach over the other, and presents recommendations on the optimal characteristics of a model to calculate rates for cost-based incentives, FITs, or similar policies. These recommendations shaped the design of NREL's Cost of Renewable Energy Spreadsheet Tool (CREST), which is used by state policymakers, regulators, utilities, developers, and other stakeholders to assist with analyses of policy and renewable energy incentive payment structures. Authored by Jason S. Gifford and Robert C. Grace of Sustainable Energy Advantage LLC and Wilson H. Rickerson of Meister Consultants Group, Inc.
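The discounted-cash-flow definition of LCOE that underlies spreadsheet models of this kind can be sketched in a few lines. The function and the input figures below are illustrative assumptions, not values from the report:

```python
def lcoe(capex, annual_cost, annual_mwh, rate, years):
    """Levelized cost of energy: discounted lifetime cost divided by
    discounted lifetime generation, in $/MWh."""
    disc_cost = capex + sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))
    disc_mwh = sum(annual_mwh / (1 + rate) ** t for t in range(1, years + 1))
    return disc_cost / disc_mwh

# Illustrative 1 MW project: $1.5M capex, $40k/yr O&M, 2,800 MWh/yr,
# 8% discount rate, 20-year life
print(round(lcoe(1_500_000, 40_000, 2_800, 0.08, 20), 2))
```

A cost-based incentive rate (e.g., a feed-in tariff) would then be set at or near this break-even value.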
Sperstad, Iver Bakken; Stålhane, Magnus; Dinwoodie, Iain; ...
2017-09-23
Optimising the operation and maintenance (O&M) and logistics strategy of offshore wind farms implies the decision problem of selecting the vessel fleet for O&M. Different strategic decision support tools can be applied to this problem, but much uncertainty remains regarding both input data and modelling assumptions. Our paper aims to investigate and ultimately reduce this uncertainty by comparing four simulation tools, one mathematical optimisation tool, and one analytic spreadsheet-based tool applied to select the O&M access vessel fleet that minimizes the total O&M cost of a reference wind farm. The comparison shows that the tools generally agree on the optimal vessel fleet, but only partially agree on the relative ranking of the different vessel fleets in terms of total O&M cost. The robustness of the vessel fleet selection to various input data assumptions was tested, and the ranking was found to be particularly sensitive to the vessels' limiting significant wave height for turbine access. This is also the parameter with the greatest discrepancy between the tools, which implies that accurate quantification and modelling of this parameter is crucial. The ranking is moderately sensitive to turbine failure rates and vessel day rates but less sensitive to electricity price and vessel transit speed.
Prediction Of Abrasive And Diffusive Tool Wear Mechanisms In Machining
NASA Astrophysics Data System (ADS)
Rizzuti, S.; Umbrello, D.
2011-01-01
Tool wear prediction is regarded as a very important task for maximizing tool performance, minimizing cutting costs, and improving workpiece quality in cutting. In this research work, an experimental campaign was carried out under varying cutting conditions with the aim of measuring both crater and flank tool wear during machining of AISI 1045 steel with an uncoated carbide tool (P40). In parallel, a FEM-based analysis was developed to study the tool wear mechanisms, taking into account the influence of the cutting conditions and the temperatures reached on the tool surfaces. The results show that, when the temperature of the tool rake surface is lower than the activation temperature of the diffusive phenomenon, the wear rate can be estimated with an abrasive model. In contrast, in the tool area where the temperature is higher than the diffusive activation temperature, the wear rate can be evaluated with a diffusive model. Finally, for temperatures between these two values, a combined abrasive-diffusive wear model made it possible to correctly evaluate the tool wear phenomena.
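The temperature-switched logic described above can be sketched with an Archard-type abrasive term and an Arrhenius-type diffusive term, the standard forms in the tool-wear literature. All constants here are invented for illustration; the paper's calibrated coefficients are not given in the abstract:

```python
import math

# Hypothetical constants (illustrative only, not the paper's calibration)
K_ABRASIVE = 1e-7        # Archard-type abrasive wear coefficient
C1, C2 = 5e-3, 9000.0    # Arrhenius-type diffusive constants (C2 in kelvin)
T_ACTIVATION = 900.0     # diffusion activation temperature, K

def wear_rate(temp_k, normal_stress, slide_velocity):
    """Select the governing wear mechanism by tool-surface temperature:
    abrasive below the activation temperature, diffusive above it."""
    if temp_k < T_ACTIVATION:
        return K_ABRASIVE * normal_stress * slide_velocity
    return C1 * normal_stress * slide_velocity * math.exp(-C2 / temp_k)
```

For the transition range the abstract mentions, an implementation might blend the two terms (e.g., linear interpolation in a band around T_ACTIVATION) rather than switching abruptly.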
The Watershed Management Optimization Support Tool (WMOST v.1) was released by the US Environmental Protection Agency in December 2013 (http://www2.epa.gov/exposure-assessment-models/wmost-10-download-page). The objective of WMOST is to serve as a public-domain screening tool that ...
School Attendance Problems: Using the TQM Tools To Identify Root Causes.
ERIC Educational Resources Information Center
Weller, L. David
2000-01-01
Deming's principles and TQM problem-solving tools and techniques can be used to solve noninstructional problems such as vandalism, dropouts, and student absenteeism. This case study presents a model for principals to apply to identify root causes, resolve problems, and provide quality outcomes (at reduced cost) in noninstructional areas. (Contains…
3-D Printing as an Effective Educational Tool for MEMS Design and Fabrication
ERIC Educational Resources Information Center
Dahle, Reena; Rasel, Rafiul
2016-01-01
This paper presents a series of course modules developed as a high-impact and cost-effective learning tool for modeling and simulating the microfabrication process and design of microelectromechanical systems (MEMS) devices using three-dimensional (3-D) printing. Microfabrication technology is an established fabrication technique for small and…
The Application of Function Points to Predict Source Lines of Code for Software Development
1992-09-01
there are some disadvantages. Software estimating tools are expensive. A single tool may cost more than $15,000 due to the high market value of the...term and Lang variables simultaneously only added marginal improvements over models with these terms included singularly. Using all the available
User manual for the GSEModel, a spreadsheet analysis tool for quantifying emission benefits and calculating the cost-effectiveness of converting to cleaner-burning fuels and engine technologies.
Logistics: Implementation of Performance - Based Logistics for the Javelin Weapon System
2005-03-07
the context of each slice within the Automated Cost Estimating Integrated Tools (ACEIT) model, the Army's standard cost model, containing the EA was...fully validated the EA. The Javelin EA was validated through an extensive review of the EA cost documentation in the ACEIT file in coordination with... ACEIT file of the system cost estimate. This documentation was considered to be sufficient by the CEAC Director once the EA was determined to be valid
EVALUATING HYDROLOGICAL RESPONSE TO ...
Studies of future management and policy options based on different assumptions provide a mechanism to examine possible outcomes and especially their likely benefits or consequences. Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extensive data requirements and the difficult task of building input parameter files, however, have long been an obstacle to the timely and cost-effective use of such complex models by resource managers. The U.S. EPA Landscape Ecology Branch in collaboration with the USDA-ARS Southwest Watershed Research Center has developed a geographic information system (GIS) tool to facilitate this process. A GIS provides the framework within which spatially distributed data are collected and used to prepare model input files, and model results are evaluated. The Automated Geospatial Watershed Assessment (AGWA) tool uses widely available standardized spatial datasets that can be obtained via the internet at no cost to the user. The data are used to develop input parameter files for KINEROS2 and SWAT, two watershed runoff and erosion simulation models that operate at different spatial and temporal scales. AGWA automates the process of transforming digital data into simulation model results and provides a visualization tool
Applying CASE Tools for On-Board Software Development
NASA Astrophysics Data System (ADS)
Brammer, U.; Hönle, A.
For many space projects the software development is facing great pressure with respect to quality, costs and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix) featuring UML and ISG (BSSE) that provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.
The EPA Control Strategy Tool (CoST) is a software tool for projecting potential future control scenarios, their effects on emissions and estimated costs. This tool uses the NEI and the Control Measures Dataset as key inputs. CoST outputs are projections of future control scenarios.
Implementation of a cost-accounting model in a biobank: practical implications.
Gonzalez-Sanchez, Maria Beatriz; Lopez-Valeiras, Ernesto; García-Montero, Andres C
2014-01-01
Given the state of global economy, cost measurement and control have become increasingly relevant over the past years. The scarcity of resources and the need to use these resources more efficiently is making cost information essential in management, even in non-profit public institutions. Biobanks are no exception. However, no empirical experiences on the implementation of cost accounting in biobanks have been published to date. The aim of this paper is to present a step-by-step implementation of a cost-accounting tool for the main production and distribution activities of a real/active biobank, including a comprehensive explanation on how to perform the calculations carried out in this model. Two mathematical models for the analysis of (1) production costs and (2) request costs (order management and sample distribution) have stemmed from the analysis of the results of this implementation, and different theoretical scenarios have been prepared. Global analysis and discussion provides valuable information for internal biobank management and even for strategic decisions at the research and development governmental policies level.
Recent experience with the CQE™
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, C.D.; Kehoe, D.B.; O'Connor, D.C.
1997-12-31
CQE (the Coal Quality Expert) is a software tool that brings a new level of sophistication to fuel decisions by seamlessly integrating the system-wide effects of fuel purchase decisions on power plant performance, emissions, and power generation costs. The CQE technology, which addresses fuel quality from the coal mine to the busbar and the stack, is an integration and improvement of predecessor software tools including EPRI's Coal Quality Information System, EPRI's Coal Cleaning Cost Model, EPRI's Coal Quality Impact Model, and EPRI and DOE models to predict slagging and fouling. CQE can be used as a stand-alone workstation or as a network application for utilities, coal producers, and equipment manufacturers to perform detailed analyses of the impacts of coal quality, capital improvements, operational changes, and/or environmental compliance alternatives on power plant emissions, performance, and production costs. It can be used as a comprehensive, precise, and organized methodology for systematically evaluating all such impacts, or it may be used in pieces with some default data to perform more strategic or comparative studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jonkman, Jason; Annoni, Jennifer; Hayman, Greg
2017-01-01
This paper presents the development of FAST.Farm, a new multiphysics tool applicable to engineering problems in research and industry involving wind farm performance and cost optimization, which is needed to address the underperformance, failures, and expenses plaguing the wind industry. Achieving wind cost-of-energy targets - which requires improvements in wind farm performance and reliability, together with reduced uncertainty and expenditures - has been hindered by the complicated nature of the wind farm design problem, especially the sophisticated interaction between atmospheric phenomena, wake dynamics, and array effects. FAST.Farm aims to balance the need for accurate modeling of the relevant physics for predicting power performance and loads with the need to maintain low computational cost, so as to support a highly iterative and probabilistic design process and system-wide optimization. FAST.Farm uses FAST to model the aero-hydro-servo-elastics of distinct turbines in the wind farm, and it is based on some of the principles of the Dynamic Wake Meandering (DWM) model but avoids many of the limitations of existing DWM implementations.
2013 Cost of Wind Energy Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mone, C.; Smith, A.; Maples, B.
2015-02-01
This report uses representative project types to estimate the levelized cost of wind energy (LCOE) in the United States for 2013. Scheduled to be published on an annual basis, it relies on both market and modeled data to maintain a current understanding of wind generation cost trends and drivers. It is intended to provide insight into current component-level costs and a basis for understanding variability in the LCOE across the industry. Data and tools developed from this analysis are used to inform wind technology cost projections, goals, and improvement opportunities.
A practical tool for modeling biospecimen user fees.
Matzke, Lise; Dee, Simon; Bartlett, John; Damaraju, Sambasivarao; Graham, Kathryn; Johnston, Randal; Mes-Masson, Anne-Marie; Murphy, Leigh; Shepherd, Lois; Schacter, Brent; Watson, Peter H
2014-08-01
The question of how best to attribute the unit costs of the annotated biospecimen product that is provided to a research user is a common issue for many biobanks. Some of the factors influencing user fees are capital and operating costs, internal and external demand and market competition, and moral standards that dictate that fees must have an ethical basis. It is therefore important to establish a transparent and accurate costing tool that can be utilized by biobanks and aid them in establishing biospecimen user fees. To address this issue, we built a biospecimen user fee calculator tool, accessible online at www.biobanking.org . The tool was built to allow input of: i) annual operating and capital costs; ii) costs categorized by the major core biobanking operations; iii) specimen products requested by a biobank user; and iv) services provided by the biobank beyond core operations (e.g., histology, tissue micro-array); as well as v) several user defined variables to allow the calculator to be adapted to different biobank operational designs. To establish default values for variables within the calculator, we first surveyed the members of the Canadian Tumour Repository Network (CTRNet) management committee. We then enrolled four different participants from CTRNet biobanks to test the hypothesis that the calculator tool could change approaches to user fees. Participants were first asked to estimate user fee pricing for three hypothetical user scenarios based on their biobanking experience (estimated pricing) and then to calculate fees for the same scenarios using the calculator tool (calculated pricing). Results demonstrated significant variation in estimated pricing that was reduced by calculated pricing, and that higher user fees are consistently derived when using the calculator. We conclude that adoption of this online calculator for user fee determination is an important first step towards harmonization and realistic user fees.
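The core costing logic such a calculator embodies can be sketched as annualized cost spread over annual output, scaled by a cost-recovery fraction. The function and the figures below are an assumed simplification for illustration, not the online tool's actual formula:

```python
def user_fee(annual_operating, capital, amort_years, annual_units, cost_recovery=1.0):
    """Biospecimen user-fee sketch: operating cost plus straight-line
    amortized capital, divided by annual specimen output, scaled by the
    fraction of cost the biobank chooses to recover from users."""
    annual_capital = capital / amort_years
    unit_cost = (annual_operating + annual_capital) / annual_units
    return unit_cost * cost_recovery

# Hypothetical biobank: $400k/yr operating, $250k capital over 10 years,
# 2,000 specimens/yr, recovering 50% of cost from academic users
print(round(user_fee(400_000, 250_000, 10, 2_000, 0.5), 2))
```

Varying `cost_recovery` by user type (academic vs. commercial) is one way such a tool could reflect the ethical-pricing considerations the abstract mentions.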
Wildfire suppression cost forecasts from the US Forest Service
Karen L. Abt; Jeffrey P. Prestemon; Krista M. Gebert
2009-01-01
The US Forest Service and other land-management agencies seek better tools for nticipating future expenditures for wildfire suppression. We developed regression models for forecasting US Forest Service suppression spending at 1-, 2-, and 3-year lead times. We compared these models to another readily available forecast model, the 10-year moving average model,...
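The 10-year moving average benchmark the authors compare against is simple to state. This sketch assumes the expenditure history is a plain list of annual totals; the example figures are invented:

```python
def moving_average_forecast(annual_costs, window=10):
    """Forecast next year's suppression cost as the mean of the last
    `window` observed years (fewer if the series is shorter)."""
    recent = annual_costs[-window:]
    return sum(recent) / len(recent)

# Illustrative expenditure series in $M (not actual Forest Service data)
print(moving_average_forecast([600, 750, 900, 1100], window=10))
```

Regression models of the kind the authors develop aim to beat this baseline by exploiting predictors (e.g., climate or fuels indicators) available before the fire season.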
Software Technology for Adaptable, Reliable Systems (STARS)
1994-03-25
Timeline (3), SECOMO (3), SEER (3), GSFC Software Engineering Lab Model (1), SLIM (4), SEER-SEM (1), SPQR (2), PRICE-S (2), internally-developed models (3), APMSS (1)...Timeline - 3; SASET (Software Architecture Sizing Estimating Tool) - 2; MicroMan II - 2; LCM (Logistics Cost Model) - 2; SPQR - 2; PRICE-S - 2
Waters, Donald; Theodoratou, Evropi; Campbell, Harry; Rudan, Igor; Chopra, Mickey
2012-12-01
The aim of this study was to populate the Equitable Impact Sensitive Tool (EQUIST) framework with all necessary data and conduct the first implementation of EQUIST in studying cost-effectiveness of community case management of childhood pneumonia in 5 low- and middle-income countries with relation to equity impact. Wealth quintile-specific data were gathered or modelled for all contributory determinants of the EQUIST framework, namely: under-five mortality rate, cost of intervention, intervention effectiveness, current coverage of intervention and relative disease distribution. These were then combined statistically to calculate the final outcome of the EQUIST model for community case management of childhood pneumonia: US$ per life saved, in several different approaches to scaling-up. The current 'mainstream' approach to scaling-up of interventions is never the most cost-effective. Community-case management appears to strongly support an 'equity-promoting' approach to scaling-up, displaying the highest levels of cost-effectiveness in interventions targeted at the poorest quintile of each study country, although absolute cost differences vary by context. The relationship between cost-effectiveness and equity impact is complex, with many determinants to consider. One important way to increase intervention cost-effectiveness in poorer quintiles is to improve the efficiency and quality of delivery. More data are needed in all areas to increase the accuracy of EQUIST-based estimates.
Inexpensive anatomical trainer for bronchoscopy.
Di Domenico, Stefano; Simonassi, Claudio; Chessa, Leonardo
2007-08-01
Flexible fiberoptic bronchoscopy is an indispensable tool for optimal management of intensive care unit patients. However, acquiring sufficient training in bronchoscopy during residency is not straightforward, because of technical and ethical problems, and the use of commercial simulators is limited by their high cost. To overcome these limitations, we built a low-cost anatomical simulator for acquiring and maintaining the basic skills needed to perform bronchoscopy in ventilated patients. We used 1.5 mm diameter iron wire to construct the bronchial tree scaffold and applied glazier's putty to create the anatomical model. The model was covered with several layers of newspaper strips previously immersed in water and vinylic glue. When the model had completely dried, it was detached from the scaffold by cutting it into six pieces, then reassembled, painted, and fitted with an endotracheal tube. We used very cheap materials, and the final cost was EUR 16. The resulting trainer is real-scale and anatomically accurate, with appropriate correspondence between the endoscopic views of the model and of patients. All bronchial segments can be explored and easily identified by endoscopic and external vision. This cheap simulator is a valuable tool for practice, particularly in hospitals with limited resources for medical training.
Castañeda-Orjuela, Carlos; Romero, Martin; Arce, Patricia; Resch, Stephen; Janusz, Cara B; Toscano, Cristiana M; De la Hoz-Restrepo, Fernando
2013-07-02
The cost of Expanded Programs on Immunization (EPI) is an important aspect of the economic and financial analysis needed for planning purposes. Costs also are needed for cost-effectiveness analysis of introducing new vaccines. We describe a costing tool that improves the speed, accuracy, and availability of EPI costs and that was piloted in Colombia. The ProVac CostVac Tool is a spreadsheet-based tool that estimates overall EPI costs considering program inputs (personnel, cold chain, vaccines, supplies, etc.) at three administrative levels (central, departmental, and municipal) and one service delivery level (health facilities). It uses various costing methods. The tool was evaluated through a pilot exercise in Colombia. In addition to the costs obtained from the central and intermediate administrative levels, a survey of 112 local health facilities was conducted to collect vaccination costs. Total cost of the EPI, cost per dose of vaccine delivered, and cost per fully vaccinated child with the recommended immunization schedule in Colombia in 2009 were estimated. The ProVac CostVac Tool is a novel, user-friendly tool, which allows users to conduct an EPI costing study following guidelines for cost studies. The total costs of the Colombian EPI were estimated at US$ 107.8 million in 2009. The cost for a fully immunized child with the recommended schedule was estimated at US$ 153.62. Vaccines and vaccination supplies accounted for 58% of total costs, personnel for 21%, cold chain for 18%, and transportation for 2%. Most EPI costs are incurred at the central level (62%). The major cost driver at the department and municipal levels is personnel costs. The ProVac CostVac Tool proved to be a comprehensive and useful tool that will allow researchers and health officials to estimate the actual cost for national immunization programs. 
The present analysis shows that personnel, cold chain, and transportation are important components of EPI and should be carefully estimated in the cost analysis, particularly when evaluating new vaccine introduction. Copyright © 2013 Elsevier Ltd. All rights reserved.
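The two headline unit costs reported above follow directly from the total-cost roll-up. This sketch uses purely illustrative figures, not the Colombian estimates:

```python
def epi_unit_costs(total_cost, doses_delivered, fully_immunized):
    """Headline EPI metrics: cost per dose delivered and cost per
    fully immunized child (FIC)."""
    return total_cost / doses_delivered, total_cost / fully_immunized

# Illustrative program: US$1M total, 500k doses delivered, 10k children
# completing the full schedule
per_dose, per_fic = epi_unit_costs(1_000_000, 500_000, 10_000)
print(per_dose, per_fic)
```

In a full costing tool, `total_cost` would itself be built up from the input categories the abstract lists (vaccines and supplies, personnel, cold chain, transportation) across the administrative and service delivery levels.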
Finite element simulation of cutting grey iron HT250 by self-prepared Si3N4 ceramic insert
NASA Astrophysics Data System (ADS)
Wang, Bo; Wang, Li; Zhang, Enguang
2017-04-01
The finite element method can simulate and solve practical machining problems with the required accuracy and high reliability. In this paper, simulation models were created based on the material properties of the self-prepared Si3N4 insert and HT250 grey iron. Using these models, cutting force, cutting temperature, and tool wear rate were obtained, and the tool wear mode was predicted from the cutting simulation. These approaches may develop into a new method for testing cutting-tool materials, shortening the development cycle and reducing cost.
Reed, Shelby D; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L; Bowers, Margaret T; Samsa, Gregory P; Paul, Sara; Schulman, Kevin A; Whellan, David J; Riegel, Barbara J
2012-01-01
Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers and health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions.
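The roll-up the tool performs can be sketched as a sum over the cost categories named in the abstract, reported per program, per patient, and per week. The arithmetic below is an assumed simplification, not the TEAM-HF spreadsheet's actual layout:

```python
def program_costs(personnel, facilities, equipment, supplies, incentives,
                  startup, n_patients, n_weeks):
    """Aggregate cost categories (from the abstract) into the three
    summary figures the tool reports."""
    total = personnel + facilities + equipment + supplies + incentives + startup
    return {"total": total,
            "per_patient": total / n_patients,
            "per_week": total / n_weeks}

# Hypothetical 12-week program with 40 patients (illustrative dollar amounts)
print(program_costs(50_000, 10_000, 5_000, 2_000, 3_000, 10_000,
                    n_patients=40, n_weeks=12))
```

Side-by-side standardized vs. customized unit costs, as the abstract describes, would amount to running this roll-up twice with two different sets of category inputs.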
Predicting Production Costs for Advanced Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Bao, Han P.; Samareh, J. A.; Weston, R. P.
2002-01-01
For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and bear little resemblance to the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This paper outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed to provide an overall estimate of the total production cost for a design configuration. This capability to directly link any design configuration to a realistic cost estimate is a key requirement for high-payoff MDO problems. Another important consideration in this paper is the handling of part or product complexity. Here the concept of a cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool.
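The summation of geometry-driven elemental costs scaled by a complexity modulus can be sketched as follows. The rates, areas, and modulus values are invented for illustration and are not the paper's calibrated figures:

```python
def part_cost(base_rate, size_metric, modulus=1.0):
    """Process-based elemental cost: a geometry-driven base cost scaled by a
    'cost modulus' capturing material, shape, precision, and equipment effects."""
    return base_rate * size_metric * modulus

def vehicle_cost(parts):
    """Sum elemental costs over the design's physical elements."""
    return sum(part_cost(rate, size, mod) for rate, size, mod in parts)

# (rate $/m^2, panel area m^2, modulus): e.g., a conventional aluminum skin
# vs. a composite cryotank wall with a higher assumed complexity modulus
print(vehicle_cost([(500, 40, 1.0), (500, 25, 2.4)]))
```

Because each element's cost is a simple product, the whole roll-up maps directly onto spreadsheet rows, which is the implementation path the paper's conclusion points to.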
Morris, J A; Smith, P S
2001-12-01
According to DiBenedetto, "Occupational health nurses enhance and maximize the health, safety, and productivity of the domestic and global work force" (1999b). This project clearly defined the multiple roles and activities provided by an occupational and environmental health nurse and assistant, supported by a part time contract occupational health nurse. A well defined estimate of the personnel costs for each of these roles is helpful both in demonstrating current value and in future strategic planning for this department. The model highlighted both successes and a business cost savings opportunity for integrated disability management. The AAOHN's Success Tools (1998) were invaluable in launching the development of this cost effectiveness model. The three methods were selected from several tools of varying complexities offered. Collecting available data to develop these metrics required internal consultation with finance, human resources, and risk management, as well as communication with external health, safety, and environmental providers in the community. Benchmarks, surveys, and performance indicators can be found readily in the literature and online. The primary motivation for occupational and environmental health nurses to develop cost effectiveness analyses is to demonstrate the value and worth of their programs and services. However, it can be equally important to identify which services are not cost effective so knowledge and skills may be used in ways that continue to provide value to employers (AAOHN, 1996). As evidence based health care challenges the occupational health community to demonstrate business rationale and financial return on investment, occupational and environmental health nurses must meet that challenge if they are to define their preferred future (DiBenedetto, 2000).
NASA Astrophysics Data System (ADS)
Hall, Roger W.; Foster, Alistair; Herrmann Praturlon, Anja
2017-09-01
The Hot Forming and in-tool Quenching (HFQ®) process is a proven technique for manufacturing complex-shaped stampings from high-strength aluminium. Its widespread uptake for high-volume production will be maximised if it can wholly amortise the additional investment cost of this process compared with conventional deep-drawing techniques. This paper discusses the use of three techniques to guide development decisions taken during upscaling of the HFQ® process. Models of process timing, cost, and life-cycle impact were found to be effective tools for identifying where development budget should be focused in order to manufacture low-cost panels of different sizes from many different alloys in a sustainable way. The results confirm that raw material cost, panel trimming, and artificial ageing were among the highest contributing factors to final component cost. Additionally, the heat treatment and lubricant removal stages played a significant role in the overall life-cycle assessment of the final products. These findings confirmed novel furnace design, fast artificial ageing, and low-cost alloy development as development priorities.
Exploration Supply Chain Simulation
NASA Technical Reports Server (NTRS)
2008-01-01
The Exploration Supply Chain Simulation project was chartered by the NASA Exploration Systems Mission Directorate to develop a software tool, with proper data, to quantitatively analyze supply chains for future program planning. This tool is a discrete-event simulation that uses the basic supply chain concepts of planning, sourcing, making, delivering, and returning. This supply chain perspective is combined with other discrete or continuous simulation factors. Discrete resource events (such as launch or delivery reviews) are represented as organizational functional units. Continuous resources (such as civil service or contractor program functions) are defined as enabling functional units. Concepts of fixed and variable costs are included in the model to allow the discrete events to interact with cost calculations. The definition file is intrinsic to the model, but a blank start can be initiated at any time. The current definition file is an Orion Ares I crew launch vehicle. Parameters stretch from Kennedy Space Center across and into other program entities (Michoud Assembly Facility, Alliant Techsystems, Stennis Space Center, Johnson Space Center, etc.), though these will only gain detail as the file continues to evolve. The Orion Ares I file definition in the tool continues to evolve, and analysis from this tool is expected in 2008. This is the first application of such business-driven modeling to a NASA/government aerospace contractor endeavor.
Pettitt, D; Goldstein, J L; McGuire, A; Schwartz, J S; Burke, T; Maniadakis, N
2000-12-01
Pharmacoeconomic analyses have become useful and essential tools for health care decision makers who increasingly require such analyses prior to placing a drug on a national, regional or hospital formulary. Previous health economic models of non-steroidal anti-inflammatory drugs (NSAIDs) have been restricted to evaluating a narrow range of agents within specific health care delivery systems using medical information derived from homogeneous clinical trial data. This paper summarizes the Arthritis Cost Consequence Evaluation System (ACCES)--a pharmacoeconomic model that has been developed to predict and evaluate the costs and consequences associated with the use of celecoxib in patients with arthritis, compared with other NSAIDs and NSAIDs plus gastroprotective agents. The advantage of this model is that it can be customized to reflect local practice patterns, resource utilization and costs, as well as provide context-specific health economic information to a variety of providers and/or decision makers.
Deriving Forest Harvesting Machine Productivity from Positional Data
T.P. McDonald; S.E. Taylor; R.B. Rummer
2000-01-01
Automated production study systems will provide researchers a valuable tool for developing cost and impact models of forest operations under a wide range of conditions, making the development of true planning tools for tailoring logging systems to a particular site a reality. An automated time study system for skidders was developed, and in this study application of...
Nagata, Tomohisa; Mori, Koji; Aratake, Yutaka; Ide, Hiroshi; Ishida, Hiromi; Nobori, Junichiro; Kojima, Reiko; Odagami, Kiminori; Kato, Anna; Tsutsumi, Akizumi; Matsuda, Shinya
2014-01-01
The aim of the present study was to develop standardized cost estimation tools that provide information to employers about occupational safety and health (OSH) activities for effective and efficient decision making in Japanese companies. We interviewed OSH staff members, including full-time professional occupational physicians, to list all OSH activities. Using activity-based costing, cost data were obtained from retrospective analyses of OSH costs over a 1-year period in three manufacturing workplaces, and from retrospective analyses of occupational health services costs in four manufacturing workplaces. We additionally verified the tools in four workplaces, including service businesses. We created the OSH and occupational health standardized cost estimation tools. OSH costs consisted of personnel costs, expenses, outsourcing costs, and investments for 15 OSH activities. The tools provided accurate, relevant information on OSH activities and occupational health services. The standardized information obtained from our OSH and occupational health cost estimation tools can be used to manage OSH costs, make comparisons of OSH costs between companies and organizations, and help occupational health physicians and employers to determine the best course of action.
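The activity-based roll-up described above can be sketched as follows. This is a hypothetical illustration in the spirit of the tool; the activity names, hourly rate, and all figures are invented, not the study's data.

```python
# Hypothetical activity-based costing roll-up: each OSH activity accumulates
# personnel time, expenses, outsourcing costs, and investments, and costs
# are summed per activity and across all activities.

OSH_ACTIVITIES = {
    "health_exams": {"personnel_hours": 120, "expenses": 3000,
                     "outsourcing": 8000, "investment": 0},
    "risk_assessment": {"personnel_hours": 80, "expenses": 500,
                        "outsourcing": 0, "investment": 2000},
}
HOURLY_RATE = 50  # assumed loaded personnel cost per hour

def activity_cost(activity):
    """Cost of one activity: personnel plus the three non-labor categories."""
    return (activity["personnel_hours"] * HOURLY_RATE
            + activity["expenses"]
            + activity["outsourcing"]
            + activity["investment"])

total_osh_cost = sum(activity_cost(a) for a in OSH_ACTIVITIES.values())
```

Because every activity uses the same four cost categories, totals are directly comparable across activities, workplaces, and companies, which is what makes the standardization useful.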
Patient-centered medical home model: do school-based health centers fit the model?
Larson, Satu A; Chapman, Susan A
2013-01-01
School-based health centers (SBHCs) are an important component of health care reform. The SBHC model of care offers accessible, continuous, comprehensive, family-centered, coordinated, and compassionate care to infants, children, and adolescents. These same elements comprise the patient-centered medical home (PCMH) model of care being promoted by the Affordable Care Act with the hope of lowering health care costs by rewarding clinicians for primary care services. PCMH survey tools have been developed to help payers determine whether a clinician/site serves as a PCMH. Our concern is that current survey tools will be unable to capture how a SBHC may provide a medical home and therefore be denied needed funding. This article describes how SBHCs might meet the requirements of one PCMH tool. SBHC stakeholders need to advocate for the creation or modification of existing survey tools that allow the unique characteristics of SBHCs to qualify as PCMHs.
Integrated risk/cost planning models for the US Air Traffic system
NASA Technical Reports Server (NTRS)
Mulvey, J. M.; Zenios, S. A.
1985-01-01
A prototype network planning model for the U.S. Air Traffic control system is described. The model encompasses the dual objectives of managing collision risks and transportation costs where traffic flows can be related to these objectives. The underlying structure is a network graph with nonseparable convex costs; the model is solved efficiently by capitalizing on its intrinsic characteristics. Two specialized algorithms for solving the resulting problems are described: (1) truncated Newton, and (2) simplicial decomposition. The feasibility of the approach is demonstrated using data collected from a control center in the Midwest. Computational results with different computer systems are presented, including a vector supercomputer (CRAY-XMP). The risk/cost model has two primary uses: (1) as a strategic planning tool using aggregate flight information, and (2) as an integrated operational system for forecasting congestion and monitoring (controlling) flow throughout the U.S. In the latter case, access to a supercomputer is required due to the model's enormous size.
Offshore wind farm layout optimization
NASA Astrophysics Data System (ADS)
Elkinton, Christopher Neil
Offshore wind energy technology is maturing in Europe and is poised to make a significant contribution to the U.S. energy production portfolio. Building on the knowledge the wind industry has gained to date, this dissertation investigates the influences of different site conditions on offshore wind farm micrositing---the layout of individual turbines within the boundaries of a wind farm. For offshore wind farms, these conditions include, among others, the wind and wave climates, water depths, and soil conditions at the site. An analysis tool has been developed that is capable of estimating the cost of energy (COE) from offshore wind farms. For this analysis, the COE has been divided into several modeled components: major costs (e.g. turbines, electrical interconnection, maintenance, etc.), energy production, and energy losses. By treating these component models as functions of site-dependent parameters, the analysis tool can investigate the influence of these parameters on the COE. Some parameters result in simultaneous increases of both energy and cost. In these cases, the analysis tool was used to determine the value of the parameter that yielded the lowest COE and, thus, the best balance of cost and energy. The models have been validated and generally compare favorably with existing offshore wind farm data. The analysis technique was then paired with optimization algorithms to form a tool with which to design offshore wind farm layouts for which the COE was minimized. Greedy heuristic and genetic optimization algorithms have been tuned and implemented. The use of these two algorithms in series has been shown to produce the best, most consistent solutions. The influences of site conditions on the COE have been studied further by applying the analysis and optimization tools to the initial design of a small offshore wind farm near the town of Hull, Massachusetts. 
The results of an initial full-site analysis and optimization were used to constrain the boundaries of the farm. A more thorough optimization highlighted the features of the area that would result in a minimized COE. The results showed reasonable layout designs and COE estimates that are consistent with existing offshore wind farms.
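The cost-of-energy objective driving the optimization above can be illustrated with a minimal sketch. The simple fixed-charge-rate formulation and every number here are assumptions for illustration, not values from the dissertation.

```python
def cost_of_energy(capex, opex_per_year, fcr, gross_aep_kwh, loss_fraction):
    """Levelized COE: annualized capital (via a fixed charge rate) plus O&M,
    divided by net annual energy production (gross minus losses)."""
    net_aep = gross_aep_kwh * (1.0 - loss_fraction)
    return (fcr * capex + opex_per_year) / net_aep

# Invented small offshore farm: $40M capex, $1.5M/yr O&M, 10% fixed charge
# rate, 35 GWh/yr gross production, 12% combined wake + electrical losses.
coe = cost_of_energy(40e6, 1.5e6, 0.10, 35e6, 0.12)  # dollars per kWh
```

A layout change that raises both cost and energy (for example, moving a turbine into deeper water but stronger wind) is evaluated by whether the ratio falls, which is exactly the balance the analysis tool searches for.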
Developing integrated parametric planning models for budgeting and managing complex projects
NASA Technical Reports Server (NTRS)
Etnyre, Vance A.; Black, Ken U.
1988-01-01
The applicability of integrated parametric models for the budgeting and management of complex projects is investigated. Methods for building a very flexible, interactive prototype for a project planning system, and software resources available for this purpose, are discussed and evaluated. The prototype is required to be sensitive to changing objectives, changing target dates, changing cost relationships, and changing budget constraints. To achieve the integration of costs and project and task durations, parametric cost functions are defined by a process of trapezoidal segmentation, where the total cost for the project is the sum of the various project cost segments, and each project cost segment is the integral of a linearly segmented cost loading function over a specific interval. The cost can thus be expressed algebraically. The prototype was designed using Lotus-123 as the primary software tool. This prototype implements a methodology for interactive project scheduling that provides a model of a system that meets most of the goals for the first phase of the study and some of the goals for the second phase.
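The trapezoidal segmentation described above has a simple closed form: integrating a linearly varying cost-loading rate over a segment gives the area of a trapezoid, and the project total is the sum over segments. A minimal sketch (the breakpoints and rates are invented):

```python
def segment_cost(t0, t1, r0, r1):
    """Integral of a linearly varying cost-loading rate r(t) over [t0, t1]:
    the area of a trapezoid with parallel sides r0 and r1."""
    return 0.5 * (r0 + r1) * (t1 - t0)

def project_cost(times, rates):
    """Total project cost: the sum of the trapezoidal segment integrals.
    times: breakpoints t_0..t_n; rates: loading rate at each breakpoint."""
    return sum(
        segment_cost(times[i], times[i + 1], rates[i], rates[i + 1])
        for i in range(len(times) - 1)
    )

# Ramp spending 0 -> 10 $k/wk over weeks 0-4, hold through week 10,
# then ramp down to 0 by week 12:
total = project_cost([0, 4, 10, 12], [0, 10, 10, 0])  # 20 + 60 + 10 = 90
```

Because each segment cost is algebraic, shifting a target date or rescaling a rate updates the budget immediately, which is what makes the approach suited to an interactive spreadsheet prototype.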
Reese, Jared C; Karsy, Michael; Twitchell, Spencer; Bisson, Erica F
2018-04-11
Examining the costs of single- and multilevel anterior cervical discectomy and fusion (ACDF) is important for the identification of cost drivers and potentially reducing patient costs. A novel tool at our institution provides direct costs for the identification of potential drivers. To assess perioperative healthcare costs for patients undergoing an ACDF. Patients who underwent an elective ACDF between July 2011 and January 2017 were identified retrospectively. Factors adding to total cost were placed into subcategories to identify the most significant contributors, and potential drivers of total cost were evaluated using a multivariable linear regression model. A total of 465 patients (mean age 53 ± 12 yr; 54% male) met the inclusion criteria for this study. The distribution of total cost was broken down into supplies/implants (39%), facility utilization (37%), physician fees (14%), pharmacy (7%), imaging (2%), and laboratory studies (1%). A multivariable linear regression analysis showed that total cost was significantly affected by the number of levels operated on, operating room time, and length of stay. Costs also showed a narrow distribution with few outliers and did not vary significantly over time. These results suggest that facility utilization and supplies/implants are the predominant cost contributors, accounting for 76% of the total cost of ACDF procedures. Efforts at lowering costs within these categories should make the most impact on providing more cost-effective care.
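A multivariable linear cost model of the kind used above can be sketched generically with ordinary least squares. The records and coefficients below are invented for illustration and are not the study's data.

```python
import numpy as np

# Invented records: [levels fused, OR time (hr), length of stay (days)]
X = np.array([[1, 2.0, 2], [2, 3.0, 3], [1, 2.5, 2],
              [3, 4.0, 5], [2, 3.5, 4], [1, 2.2, 2]], dtype=float)
y = np.array([22.1, 30.3, 23.2, 40.6, 32.9, 22.5])  # total cost, $1000s

# Prepend an intercept column and fit by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, _, _, _ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - A @ coef
```

The fitted coefficients estimate the marginal cost of one additional level, one additional OR hour, and one additional hospital day, holding the other drivers fixed; that per-driver decomposition is what makes such a model useful for targeting cost reduction.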
NASA's X-Plane Database and Parametric Cost Model v 2.0
NASA Technical Reports Server (NTRS)
Sterk, Steve; Ogluin, Anthony; Greenberg, Marc
2016-01-01
The NASA Armstrong Cost Engineering Team, with technical assistance from NASA HQ (SID), has gone through the full process of developing new CERs from Version #1 to Version #2. We took a step backward and reexamined all of the data collected, such as dependent and independent variables: cost, dry weight, length, wingspan, manned versus unmanned, altitude, Mach number, thrust, and skin. We used a well-known statistical analysis tool called CO$TAT instead of "R" multiple linear regression or the "Regression" tool found in Microsoft Excel(TradeMark). We set up an "array of data" by adding 21 "dummy variables"; we analyzed the standard error (SE) and then determined the best fit. We have parametrically priced out several future X-planes and compared our results to those of other resources. More work needs to be done in getting accurate and traceable cost data from historical X-plane records.
Evolutionary Development of the Simulation by Logical Modeling System (SIBYL)
NASA Technical Reports Server (NTRS)
Wu, Helen
1995-01-01
Through the evolutionary development of the Simulation by Logical Modeling System (SIBYL), we have re-engineered the expensive and complex IBM mainframe-based Long-term Hardware Projection Model (LHPM) into a robust, cost-effective computer-based model that is easy to use. We achieved significant cost reductions and improved productivity in preparing long-term forecasts of Space Shuttle Main Engine (SSME) hardware. The LHPM for the SSME is a stochastic simulation model that projects hardware requirements over 10 years. SIBYL is now the primary modeling tool for developing SSME logistics proposals, the Program Operating Plan (POP) for NASA, and divisional marketing studies.
Predicting tool life in turning operations using neural networks and image processing
NASA Astrophysics Data System (ADS)
Mikołajczyk, T.; Nowicki, K.; Bustillo, A.; Yu Pimenov, D.
2018-05-01
A two-step method is presented for the automatic prediction of tool life in turning operations. First, experimental data are collected for three cutting edges under the same constant processing conditions. In these experiments, the parameter of tool wear, VB, is measured with conventional methods and the same parameter is estimated using Neural Wear, a customized software package that combines flank wear image recognition and Artificial Neural Networks (ANNs). Second, an ANN model of tool life is trained with the data collected from the first two cutting edges and the subsequent model is evaluated on two different subsets for the third cutting edge: the first subset is obtained from the direct measurement of tool wear and the second is obtained from the Neural Wear software that estimates tool wear using edge images. Although the complete-automated solution, Neural Wear software for tool wear recognition plus the ANN model of tool life prediction, presented a slightly higher error than the direct measurements, it was within the same range and can meet all industrial requirements. These results confirm that the combination of image recognition software and ANN modelling could potentially be developed into a useful industrial tool for low-cost estimation of tool life in turning operations.
A Linear City Model with Asymmetric Consumer Distribution
Azar, Ofer H.
2015-01-01
The article analyzes a linear-city model where the consumer distribution can be asymmetric, which is important because in real markets this distribution is often asymmetric. The model yields equilibrium price differences, even though the firms’ costs are equal and their locations are symmetric (at the two endpoints of the city). The equilibrium price difference is proportional to the transportation cost parameter and does not depend on the good's cost. The firms' markups are also proportional to the transportation cost. The two firms’ prices will be equal in equilibrium if and only if half of the consumers are located to the left of the city’s midpoint, even if other characteristics of the consumer distribution are highly asymmetric. An extension analyzes what happens when the firms have different costs and how the two sources of asymmetry – the consumer distribution and the cost per unit – interact together. The model can be useful as a tool for further development by other researchers interested in applying this simple yet flexible framework for the analysis of various topics. PMID:26034984
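The equilibrium behavior described above can be explored numerically. The sketch below is an assumption-laden illustration, not the article's model: it picks a particular asymmetric CDF (F(x) = x^0.6 on [0, 1]), linear transport cost, firms at the endpoints with equal unit cost, and finds prices by iterated grid-search best responses.

```python
import numpy as np

T = 1.0  # transportation cost parameter
C = 0.0  # unit production cost (equal for both firms)

def cdf(x, skew=0.6):
    """Hypothetical asymmetric consumer CDF on [0, 1]: F(x) = x**skew."""
    return np.clip(x, 0.0, 1.0) ** skew

def demands(p1, p2):
    """Split the city at the indifferent consumer x_hat, where
    p1 + T*x_hat = p2 + T*(1 - x_hat)."""
    x_hat = np.clip((p2 - p1 + T) / (2.0 * T), 0.0, 1.0)
    d1 = cdf(x_hat)
    return d1, 1.0 - d1

def best_response(p_other, is_firm1, grid):
    """Profit-maximizing price against the rival's price, by grid search."""
    profits = []
    for p in grid:
        d1, d2 = demands(p, p_other) if is_firm1 else demands(p_other, p)
        profits.append((p - C) * (d1 if is_firm1 else d2))
    return float(grid[int(np.argmax(profits))])

grid = np.linspace(0.01, 3.0, 600)
p1 = p2 = T  # start from the symmetric-uniform benchmark p = c + t
for _ in range(50):
    p1 = best_response(p2, True, grid)
    p2 = best_response(p1, False, grid)
```

With this skewed CDF more than half the consumers lie to the left of the midpoint, so the left firm sustains a higher equilibrium price, consistent with the abstract's equal-price condition holding only when F(1/2) = 1/2.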
Reed, Shelby D.; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L.; Bowers, Margaret T.; Samsa, Gregory P.; Paul, Sara; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara J.
2011-01-01
Background Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Methods and Results Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers or health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. Conclusions The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions. PMID:22147884
Developing a cost accounting system for a physician group practice.
Mays, J; Gordon, G
1996-10-01
Physicians in group practices must gain a competitive edge to survive in a healthcare environment in which cost efficiency has become critical to success. One tool that can help them is a cost accounting system that yields reliable, detailed data on the costs of delivering care. Such a system not only can enable physicians and group administrators to manage their operations more cost-effectively, but also can help them accurately assess the potential profitability of prospective managed care plans. An otolaryngology practice located in Mississippi provides a model for developing a cost accounting system that can be applied to physician group practices.
Space system operations and support cost analysis using Markov chains
NASA Technical Reports Server (NTRS)
Unal, Resit; Dean, Edwin B.; Moore, Arlene A.; Fairbairn, Robert E.
1990-01-01
This paper evaluates the use of Markov chain processes in probabilistic life cycle cost analysis and suggests further uses of the process as a design aid tool. A methodology is developed for estimating operations and support costs and expected life for reusable space transportation systems. Application of the methodology is demonstrated for the case of a hypothetical space transportation vehicle. A sensitivity analysis is carried out to explore the effects of uncertainty in key model inputs.
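The core calculation in such an approach is the expected cumulative cost of an absorbing Markov chain, obtained from the fundamental matrix. The sketch below is a toy illustration: the states, transition probabilities, and per-cycle costs are invented, not taken from the paper.

```python
import numpy as np

# Transient states: 0 = operational, 1 = in depot maintenance; the leftover
# probability mass in each row leads to retirement (the absorbing state).
Q = np.array([[0.90, 0.08],
              [0.60, 0.30]])
c = np.array([1.0, 5.0])  # support cost incurred per cycle in each state

# Fundamental matrix N = (I - Q)^-1: expected number of visits to each
# transient state before absorption, by starting state.
N = np.linalg.inv(np.eye(2) - Q)
expected_cost = N @ c           # expected life-cycle O&S cost by start state
expected_life = N @ np.ones(2)  # expected cycles before retirement
```

Sensitivity analysis then amounts to perturbing entries of Q or c and recomputing, which is cheap because the whole model is one small matrix inverse.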
Design of the VISITOR Tool: A Versatile ImpulSive Interplanetary Trajectory OptimizeR
NASA Technical Reports Server (NTRS)
Corpaccioli, Luca; Linskens, Harry; Komar, David R.
2014-01-01
The design of trajectories for interplanetary missions represents one of the most complex and important problems to solve during conceptual space mission design. To facilitate conceptual mission sizing activities, it is essential to obtain sufficiently accurate trajectories in a fast and repeatable manner. To this end, the VISITOR tool was developed. This tool modularly augments a patched conic MGA-1DSM model with a mass model, launch window analysis, and the ability to simulate more realistic arrival and departure operations. This was implemented in MATLAB, exploiting the built-in optimization tools and vector analysis routines. The chosen optimization strategy uses a grid search and pattern search, an iterative variable grid method. A genetic algorithm can be selectively used to improve search space pruning, at the cost of losing the repeatability of the results and increased computation time. The tool was validated against seven flown missions: the average total mission (Delta)V offset from the nominal trajectory was 9.1%, which was reduced to 7.3% when using the genetic algorithm at the cost of an increase in computation time by a factor 5.7. It was found that VISITOR was well-suited for the conceptual design of interplanetary trajectories, while also facilitating future improvements due to its modular structure.
Schutyser, M A I; Straatsma, J; Keijzer, P M; Verschueren, M; De Jong, P
2008-11-30
In the framework of a cooperative EU research project (MILQ-QC-TOOL), a web-based modelling tool (WebSim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs. It can be applied to existing products and processes but also to reduce time to market for new products. Important aspects of the tool are its user-friendliness and its specifications customised to the needs of small dairy companies. To challenge the web-based tool, it was applied for optimisation of thermal treatments in 16 dairy companies producing yoghurt, fresh cream, chocolate milk and cheese. Optimisation with WebSim-MILQ resulted in concrete improvements with respect to risk of microbial contamination, cheese yield, fouling and production costs. In this paper we illustrate the use of WebSim-MILQ for optimisation of a cheese milk pasteurisation process where we could increase the cheese yield (1 extra cheese for each 100 cheeses produced from the same amount of milk) and reduce the risk of contamination of pasteurised cheese milk with thermoresistant streptococci from critical to negligible. In another case we demonstrate the advantage of changing from an indirect to a direct heating method for a UHT process, resulting in 80% less fouling while improving product quality and maintaining product safety.
Ellett, Kevin M.; Middleton, Richard S.; Stauffer, Philip H.; ...
2017-08-18
The application of integrated system models for evaluating carbon capture and storage technology has expanded steadily over the past few years. To date, such models have focused largely on hypothetical scenarios of complex source-sink matching involving numerous large-scale CO2 emitters and high-volume, continuous reservoirs, such as deep saline formations, to function as geologic sinks for carbon storage. Though these models have provided unique insight on the potential costs and feasibility of deploying complex networks of integrated infrastructure, there remains a pressing need to translate such insight to the business community if this technology is ever to achieve a truly meaningful impact in greenhouse gas mitigation. Here, we present a new integrated system modelling tool termed SimCCUS aimed at providing crucial decision support for businesses by extending the functionality of a previously developed model called SimCCS. The primary innovation of the SimCCUS tool is the incorporation of stacked geological reservoir systems, with explicit consideration of the processes and costs associated with operating multiple CO2 utilization and storage targets from a single geographic location. Such locations provide significant efficiencies through economies of scale, effectively minimizing CO2 storage costs while simultaneously maximizing revenue streams via the utilization of CO2 as a commodity for enhanced hydrocarbon recovery.
Data Service Provider Cost Estimation Tool
NASA Technical Reports Server (NTRS)
Fontaine, Kathy; Hunolt, Greg; Booth, Arthur L.; Banks, Mel
2011-01-01
The Data Service Provider Cost Estimation Tool (CET) and Comparables Database (CDB) package provides to NASA's Earth Science Enterprise (ESE) the ability to estimate the full range of year-by-year lifecycle cost estimates for the implementation and operation of data service providers required by ESE to support its science and applications programs. The CET can make estimates dealing with staffing costs, supplies, facility costs, network services, hardware and maintenance, commercial off-the-shelf (COTS) software licenses, software development and sustaining engineering, and the changes in costs that result from changes in workload. Data service providers may be stand-alone or embedded in flight projects, field campaigns, research or applications projects, or other activities. The CET and CDB package employs a cost-estimation-by-analogy approach. It is based on a new, general data service provider reference model that provides a framework for construction of a database by describing existing data service providers that are analogs (or comparables) to planned, new ESE data service providers. The CET implements the staff effort and cost estimation algorithms that access the CDB and generates the lifecycle cost estimate for a new data services provider. These data create a common basis for ESE proposal evaluators to consider projected data service provider costs.
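Estimation by analogy can be sketched as scaling a comparable provider's observed staffing by a workload ratio and rolling up year-by-year costs. The scaling exponent, rates, and all numbers below are hypothetical illustrations; the CET's actual algorithms are not described in this abstract.

```python
def estimate_staff(analog_staff, analog_workload, new_workload, exponent=0.8):
    """Scale a comparable provider's staffing by a workload ratio; the
    sublinear exponent (an assumption, not from the CET) reflects
    economies of scale."""
    return analog_staff * (new_workload / analog_workload) ** exponent

def lifecycle_costs(staff_by_year, cost_per_fte, fixed_by_year):
    """Year-by-year cost: staffing cost plus fixed items (facilities,
    COTS licenses, network services, and so on)."""
    return [s * cost_per_fte + f for s, f in zip(staff_by_year, fixed_by_year)]

# Hypothetical new provider expected to handle twice an analog's workload:
staff = estimate_staff(analog_staff=10.0, analog_workload=500.0,
                       new_workload=1000.0)
costs = lifecycle_costs([staff] * 3, cost_per_fte=150_000.0,
                        fixed_by_year=[200_000.0] * 3)
```

The value of the comparables database in such a scheme is that each estimate is anchored to an observed, documented provider rather than to unvalidated parametric assumptions.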
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denholm, P.; Hummon, M.
2013-02-01
Concentrating solar power (CSP) deployed with thermal energy storage (TES) provides a dispatchable source of renewable energy. The value of CSP with TES, as with other potential generation resources, needs to be established using traditional utility planning tools. Production cost models, which simulate the operation of the grid, are often used to estimate the operational value of different generation mixes. CSP with TES has historically had limited analysis in commercial production simulations. This document describes the implementation of CSP with TES in a commercial production cost model. It also describes the simulation of grid operations with CSP in a test system consisting of two balancing areas located primarily in Colorado.
Decision science and cervical cancer.
Cantor, Scott B; Fahs, Marianne C; Mandelblatt, Jeanne S; Myers, Evan R; Sanders, Gillian D
2003-11-01
Mathematical modeling is an effective tool for guiding cervical cancer screening, diagnosis, and treatment decisions for patients and policymakers. This article describes the use of mathematical modeling as outlined in five presentations from the Decision Science and Cervical Cancer session of the Second International Conference on Cervical Cancer held at The University of Texas M. D. Anderson Cancer Center, April 11-14, 2002. The authors provide an overview of mathematical modeling, especially decision analysis and cost-effectiveness analysis, and examples of how it can be used for clinical decision making regarding the prevention, diagnosis, and treatment of cervical cancer. Included are applications as well as theory regarding decision science and cervical cancer. Mathematical modeling can answer such questions as the optimal frequency for screening, the optimal age to stop screening, and the optimal way to diagnose cervical cancer. Results from one mathematical model demonstrated that a vaccine against high-risk strains of human papillomavirus was a cost-effective use of resources, and discussion of another model demonstrated the importance of collecting direct non-health care costs and time costs for cost-effectiveness analysis. Research presented indicated that care must be taken when applying the results of population-wide, cost-effectiveness analyses to reduce health disparities. Mathematical modeling can encompass a variety of theoretical and applied issues regarding decision science and cervical cancer. The ultimate objective of using decision-analytic and cost-effectiveness models is to identify ways to improve women's health at an economically reasonable cost. Copyright 2003 American Cancer Society.
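Cost-effectiveness analyses of the kind surveyed here ultimately reduce to an incremental cost-effectiveness ratio (ICER): the extra cost of a strategy divided by its extra health benefit. A minimal sketch (the numbers in the usage comment are illustrative, not figures from the conference):

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    health effect (e.g., per quality-adjusted life year gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Illustrative: a screening strategy costing $20,000 yielding 6.0 QALYs,
# versus a comparator costing $10,000 yielding 5.5 QALYs.
ratio = icer(20000.0, 10000.0, 6.0, 5.5)  # $20,000 per QALY gained
```

A strategy that is both cheaper and more effective "dominates" its comparator, in which case no ratio is reported.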
MODELLING QUALITY ASSURANCE PLAN FOR THE LAKE MICHIGAN MASS BALANCE PROJECT
With the ever increasing complexity and costs of ecosystem protection and remediation, the USEPA is placing more emphasis on ensuring the quality and credibility of scientific tools, such as models, that are used to help guide decision-makers who are faced with difficult manageme...
APEX Model Simulation for Row Crop Watersheds with Agroforestry and Grass Buffers
USDA-ARS?s Scientific Manuscript database
Watershed model simulation has become an important tool in studying ways and means to reduce transport of agricultural pollutants. Conducting field experiments to assess buffer influences on water quality are constrained by the large-scale nature of watersheds, high experimental costs, private owner...
NASA Astrophysics Data System (ADS)
van Griensven, Ann; Haest, Pieter Jan; Broekx, Steven; Seuntjens, Piet; Campling, Paul; Ducos, Geraldine; Blaha, Ludek; Slobodnik, Jaroslav
2010-05-01
The European Union (EU) adopted the Water Framework Directive (WFD) in 2000, ensuring that all aquatic ecosystems meet 'good status' by 2015. However, it is a major challenge for river basin managers to meet this requirement in river basins with a high population density as well as intensive agricultural and industrial activities. The EU-financed AQUAREHAB project (FP7) specifically examines the ecological and economic impact of innovative rehabilitation technologies for multi-pressured degraded water bodies. For this purpose, a generic collaborative management tool, 'REACH-ER', is being developed that can be used by stakeholders, citizens and water managers to evaluate the ecological and economic effects of different remedial actions on waterbodies. The tool is built using databases from large-scale models simulating the hydrological dynamics of the river basin and sub-basins, the costs of the measures and the effectiveness of the measures in terms of ecological impact. Knowledge rules are used to describe the relationships between these data in order to compute the flux concentrations or to compute the effectiveness of measures. The management tool specifically addresses nitrate pollution and pollution by organic micropollutants. Detailed models are also used to predict the effectiveness of site remedial technologies using readily available global data. Rules describing ecological impacts are derived from ecotoxicological data for (mixtures of) specific contaminants (msPAF) and ecological indices relating effects to the presence of certain contaminants. Rules describing the cost-effectiveness of measures are derived from linear programming models identifying the least-cost combination of abatement measures to satisfy multi-pollutant reduction targets, and from multi-criteria analysis.
Reliability and Productivity Modeling for the Optimization of Separated Spacecraft Interferometers
NASA Technical Reports Server (NTRS)
Kenny, Sean (Technical Monitor); Wertz, Julie
2002-01-01
As technological systems grow in capability, they also grow in complexity. Due to this complexity, it is no longer possible for a designer to use engineering judgement to identify the components that have the largest impact on system life cycle metrics, such as reliability, productivity, cost, and cost effectiveness. One way of identifying these key components is to build quantitative models and analysis tools that can be used to aid the designer in making high level architecture decisions. Once these key components have been identified, two main approaches to improving a system using these components exist: add redundancy or improve the reliability of the component. In reality, the most effective approach to almost any system will be some combination of these two approaches, in varying orders of magnitude for each component. Therefore, this research tries to answer the question of how to divide funds, between adding redundancy and improving the reliability of components, to most cost effectively improve the life cycle metrics of a system. While this question is relevant to any complex system, this research focuses on one type of system in particular: Separated Spacecraft Interferometers (SSI). Quantitative models are developed to analyze the key life cycle metrics of different SSI system architectures. Next, tools are developed to compare a given set of architectures in terms of total performance, by coupling different life cycle metrics together into one performance metric. Optimization tools, such as simulated annealing and genetic algorithms, are then used to search the entire design space to find the "optimal" architecture design. Sensitivity analysis tools have been developed to determine how sensitive the results of these analyses are to uncertain user-defined parameters. Finally, several possibilities for future work in this area of research are presented.
USDA-ARS?s Scientific Manuscript database
Farms both produce greenhouse gas emissions that drive human-induced climate change and are impacted by that climate change. Whole farm and global climate models provide useful tools for studying the benefits and costs of greenhouse gas mitigation and the adaptation of farms to changing climate. The...
Implementing the Mother-Baby Model of Nursing Care Using Models and Quality Improvement Tools.
Brockman, Vicki
As family-centered care has become the expected standard, many facilities follow the mother-baby model, in which care is provided to both a woman and her newborn in the same room by the same nurse. My facility employed a traditional model of nursing care, which was not evidence-based or financially sustainable. After implementing the mother-baby model, we experienced an increase in exclusive breastfeeding rates at hospital discharge, increased patient satisfaction, improved staff productivity and decreased salary costs, all while the number of births increased. Our change was successful because it was guided by the use of quality improvement tools, change theory and evidence-based practice models. © 2015 AWHONN.
Yamaguchi, Satoshi; Yamada, Yuya; Yoshida, Yoshinori; Noborio, Hiroshi; Imazato, Satoshi
2012-01-01
The virtual reality (VR) simulator is a useful tool for developing dental hand skills. However, VR simulations that include patient reactions have only limited computational time in which to reproduce a face model. Our aim was to develop a patient face model that enables real-time collision detection and cutting operations by using stereolithography (STL) and deterministic finite automaton (DFA) data files. We evaluated the dependence of computational cost on the conditions for combining STL and DFA data files, constructed the patient face model using the optimum condition, and assessed the computational costs of the do-nothing, collision, cutting, and combined collision-and-cutting operations. The face model was successfully constructed, with low computational costs of 11.3, 18.3, 30.3, and 33.5 ms for do-nothing, collision, cutting, and collision and cutting, respectively. The patient face model could be useful for developing dental hand skills with VR.
Intelligent Controls for Net-Zero Energy Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Haorong; Cho, Yong; Peng, Dongming
2011-10-30
The goal of this project is to develop and demonstrate enabling technologies that can empower homeowners to convert their homes into net-zero energy buildings in a cost-effective manner. The project objectives and expected outcomes are as follows: • To develop rapid and scalable building information collection and modeling technologies that can obtain and process "as-built" building information in an automated or semiautomated manner. • To identify low-cost measurements and develop low-cost virtual sensors that can monitor building operations in a plug-n-play and low-cost manner. • To integrate and demonstrate low-cost building information modeling (BIM) technologies. • To develop decision support tools which can empower building owners to perform energy auditing and retrofit analysis. • To develop and demonstrate low-cost automated diagnostics and optimal control technologies which can improve building energy efficiency in a continual manner.
Casas-Mulet, Roser; Saltveit, Svein Jakob; Alfredsen, Knut Tore
2016-12-15
Alterations in hydrological and thermal regimes can affect the development and survival of salmonid early life stages. The dewatering of salmon spawning redds due to hydropeaking can cause mortality in early life stages, with higher impact on the alevins because they have lower tolerance to dewatering than the eggs. Flow-related mitigation measures can reduce early life stage mortality. We present a set of modelling tools to assess impacts and mitigation options to minimise the risk of mortality in early life stages in hydropeaking rivers. We successfully modelled long-term hydrological and thermal alterations and their consequences for development rates. We estimated the risk of early life stage mortality and assessed the cost-effectiveness of implementing three release-related mitigation options (A, B, C). The economic cost of mitigation was low, ranging between 0.7% and 2.6% of the annual hydropower production. Options reducing the flow during spawning (B and C), in addition to releasing only minimum flows during development (A), were considered more effective for egg and alevin survival. Options B and C were, however, constrained by water availability in the system in certain years, so only option A was always feasible. The set of modelling tools used in this study was satisfactory, and its applications can be useful especially in systems where little field data are available. Targeted measures built on well-informed modelling tools can be tested for their effectiveness in mitigating dewatering effects versus the hydropower system's capacity to release or conserve water for power production. Environmental flow releases targeting specific ecological objectives can provide more cost-effective options than conventional operational rules complying with general legislation. Copyright © 2016 Elsevier B.V. All rights reserved.
A discrete event simulation tool to support and predict hospital and clinic staffing.
DeRienzo, Christopher M; Shaw, Ryan J; Meanor, Phillip; Lada, Emily; Ferranti, Jeffrey; Tanaka, David
2017-06-01
We demonstrate how to develop a simulation tool that helps healthcare managers and administrators predict and plan for staffing needs in a hospital neonatal intensive care unit using administrative data. We developed a discrete event simulation model of the nursing staff needed in a neonatal intensive care unit and then validated the model against historical data. The process flow was translated into a discrete event simulation model. Results demonstrated that the model can give a respectable estimate of annual admissions, transfers, and deaths based upon two different staffing levels. The discrete event simulation model can provide healthcare managers and administrators with (1) a valid method of modeling patient mix, patient acuity, staffing needs, and costs in the present state and (2) a forecast of how changes in a unit's staffing, referral patterns, or patient mix would affect the unit in a future state.
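The published abstract does not include the model itself, but the core of such a discrete event simulation can be sketched as an event queue of admissions and discharges driving the unit census. All rates and lengths of stay below are invented for illustration; the actual model also tracked acuity, transfers, and deaths:

```python
import heapq
import random

def simulate_census(arrival_rate, mean_los, horizon, seed=0):
    """Toy discrete-event simulation of unit census.

    arrival_rate: admissions per day (Poisson arrivals, illustrative);
    mean_los: mean length of stay in days (exponential, illustrative);
    horizon: simulated days. Returns the peak census, a rough proxy
    for the staffing level the unit must be able to cover.
    """
    rng = random.Random(seed)
    events = []  # heap of (time, delta): +1 admission, -1 discharge
    t = 0.0
    while True:
        t += rng.expovariate(arrival_rate)  # next inter-arrival gap
        if t > horizon:
            break
        heapq.heappush(events, (t, +1))
        heapq.heappush(events, (t + rng.expovariate(1.0 / mean_los), -1))
    census = peak = 0
    while events:  # replay events in time order, tracking occupancy
        _, delta = heapq.heappop(events)
        census += delta
        peak = max(peak, census)
    return peak
```

Running the simulation many times with different seeds would give a distribution of peak census, which is what makes staffing comparisons like the one in the paper possible.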
Operations and support cost modeling using Markov chains
NASA Technical Reports Server (NTRS)
Unal, Resit
1989-01-01
Systems for future missions will be selected with life cycle cost (LCC) as a primary evaluation criterion. This reflects the current realization that only systems considered affordable will be built in the future, due to national budget constraints. Such an environment calls for innovative cost modeling techniques which address all of the phases a space system goes through during its life cycle, namely: design and development, fabrication, operations and support, and retirement. A significant portion of the LCC for reusable systems is generated during the operations and support (OS) phase. Typically, OS costs can account for 60 to 80 percent of the total LCC. Clearly, OS costs are wholly determined, or at least strongly influenced, by decisions made during the design and development phases of the project. As a result, OS costs need to be considered and estimated early in the conceptual phase. To be effective, an OS cost estimating model needs to account for actual instead of ideal processes by associating cost elements with probabilities. One approach that may be suitable for OS cost modeling is the Markov chain process. Markov chains are an important method of probabilistic analysis for operations research analysts, but they are rarely used for life cycle cost analysis. This research effort evaluates the use of Markov chains in LCC analysis by developing an OS cost model for a hypothetical reusable space transportation vehicle (HSTV) and suggests further uses of the Markov chain process as a design-aid tool.
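The approach can be sketched as a small Markov chain over turnaround states, with a cost attached to each visit to a state. The states, transition probabilities, and per-visit costs below are invented for illustration and are not taken from the HSTV study:

```python
# Hypothetical turnaround states for a reusable vehicle:
# 0 = routine turnaround, 1 = unscheduled repair, 2 = major overhaul.
P = [[0.80, 0.15, 0.05],   # transition probabilities per flight (illustrative)
     [0.60, 0.30, 0.10],
     [0.90, 0.05, 0.05]]
COST = [1.0, 4.0, 20.0]    # cost per visit to each state, $M (illustrative)

def expected_os_cost(P, cost, start, n_flights):
    """Expected O&S cost over n_flights, found by propagating the
    probability distribution over states one flight cycle at a time."""
    dist = [1.0 if i == start else 0.0 for i in range(len(cost))]
    total = 0.0
    for _ in range(n_flights):
        total += sum(d * c for d, c in zip(dist, cost))  # cost of this cycle
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]                  # advance the chain
    return total
```

Because the chain weights costly states (repair, overhaul) by their probability of occurring, it captures "actual instead of ideal" processes in exactly the sense the abstract describes.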
Heany, Julia; Torres, Jennifer; Zagar, Cynthia; Kostelec, Tiffany
2018-06-05
Introduction: In order to achieve the positive outcomes with parents and children demonstrated by many home visiting models, home visiting services must be well implemented. The Michigan Home Visiting Initiative developed a tool and procedure for monitoring implementation quality across models, referred to as Michigan's Home Visiting Quality Assurance System (MHVQAS). This study field tested the MHVQAS. This article focuses on one of the study's evaluation questions: Can the MHVQAS be applied across models? Methods: Eight local implementing agencies (LIAs) from four home visiting models (Healthy Families America, Early Head Start-Home Based, Parents as Teachers, Maternal Infant Health Program) and five reviewers participated in the study by completing site visits, tracking their time and costs, and completing surveys about the process. LIAs also submitted their most recent review by their model developer. The researchers conducted participant observation of the review process. Results: Ratings on the MHVQAS were not significantly different between models. There were some differences in interrater reliability and perceived reliability between models. There were no significant differences between models in perceived validity, satisfaction with the review process, or cost to participate. Observational data suggested that cross-model applicability could be improved by assisting sites in relating the requirements of the tool to the specifics of their model. Discussion: The MHVQAS shows promise as a tool and process for monitoring the implementation quality of home visiting services across models. The results of the study will be used to make improvements before the MHVQAS is used in practice.
Recent Advances in Algal Genetic Tool Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Dahlin, Lukas; T. Guarnieri, Michael
The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.
Recent Advances in Algal Genetic Tool Development
R. Dahlin, Lukas; T. Guarnieri, Michael
2016-06-24
The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.
Examining the Effect of the Die Angle on Tool Load and Wear in the Extrusion Process
NASA Astrophysics Data System (ADS)
Nowotyńska, Irena; Kut, Stanisław
2014-04-01
Tool durability is a crucial factor in every manufacturing process, including extrusion. Striving for higher product quality should be accompanied by long tool life and reduced production costs. This article presents comparative research on die load and wear at various working-cone angles during concurrent extrusion. The numerical calculations of tool load during concurrent extrusion were performed with the MSC MARC software using the finite element method (FEM). The Archard model, implemented in the software using the FEM, was used to determine and compare die wear. Tool deformations and stress distributions were determined from the analyses, and the die wear depth at various working-cone angles was obtained. A properly shaped die not only affects the extruded material properties but also controls loads, elastic deformation, and tool life.
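In its simplest local form, the Archard relation used for the wear comparison states that wear depth is proportional to contact pressure and sliding distance, and inversely proportional to hardness. The function below is a textbook sketch of that relation, not the MSC MARC implementation:

```python
def archard_wear_depth(k, contact_pressure, sliding_distance, hardness):
    """Local Archard wear law: h = k * p * s / H.

    k: dimensionless wear coefficient (material pair dependent),
    contact_pressure p and hardness H in the same units (e.g. MPa),
    sliding_distance s in length units; returns wear depth h in the
    same length units as s. All example inputs are illustrative.
    """
    return k * contact_pressure * sliding_distance / hardness
```

For example, k = 1e-4, p = 200 MPa, s = 1000 mm, and H = 2000 MPa give a wear depth of 0.01 mm, which illustrates why the die angle matters: it changes the local pressure p along the working cone.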
Reinventing The Design Process: Teams and Models
NASA Technical Reports Server (NTRS)
Wall, Stephen D.
1999-01-01
The future of space mission design will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return, with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules are fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.
BASIMO - Borehole Heat Exchanger Array Simulation and Optimization Tool
NASA Astrophysics Data System (ADS)
Schulte, Daniel O.; Bastian, Welsch; Wolfram, Rühaak; Kristian, Bär; Ingo, Sass
2017-04-01
Arrays of borehole heat exchangers are an increasingly popular source for renewable energy. Furthermore, they can serve as borehole thermal energy storage (BTES) systems for seasonally fluctuating heat sources like solar thermal energy or district heating grids. The high temperature level of these heat sources prohibits the use of the shallow subsurface for environmental reasons. Therefore, deeper reservoirs have to be accessed instead. The increased depth of the systems results in high investment costs and has hindered the implementation of this technology until now. Therefore, research of medium deep BTES systems relies on numerical simulation models. Current simulation tools cannot - or only to some extent - describe key features like partly insulated boreholes unless they run fully discretized models of the borehole heat exchangers. However, fully discretized models often come at a high computational cost, especially for large arrays of borehole heat exchangers. We give an update on the development of BASIMO: a tool, which uses one dimensional thermal resistance and capacity models for the borehole heat exchangers coupled with a numerical finite element model for the subsurface heat transport in a dual-continuum approach. An unstructured tetrahedral mesh bypasses the limitations of structured grids for borehole path geometries, while the thermal resistance and capacity model is improved to account for borehole heat exchanger properties changing with depth. Thereby, partly insulated boreholes can be considered in the model. Furthermore, BASIMO can be used to improve the design of BTES systems: the tool allows for automated parameter variations and is readily coupled to other code like mathematical optimization algorithms. Optimization can be used to determine the required minimum system size or to increase the system performance.
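For a single node, a thermal resistance and capacity model of the kind BASIMO couples to the subsurface mesh reduces to a first-order RC update. The one-node explicit step below is a minimal sketch of that idea, not BASIMO's multi-node, depth-varying, dual-continuum formulation:

```python
def rc_step(T_node, T_neighbor, R, C, dt):
    """One explicit Euler step of a single-node thermal RC model.

    T_node, T_neighbor: temperatures (K or degC) of the node and its
    coupled neighbor (e.g. borehole fluid and surrounding grout);
    R: thermal resistance between them (K/W); C: node heat capacity (J/K);
    dt: time step (s). The explicit step is stable only for dt < R*C.
    """
    return T_node + dt * (T_neighbor - T_node) / (R * C)
```

A real borehole model chains many such nodes along the borehole (which is where depth-dependent properties and partial insulation enter) and exchanges heat with the finite element subsurface model at each step.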
[A new low-cost webcam-based laparoscopic training model].
Langeron, A; Mercier, G; Lima, S; Chauleur, C; Golfier, F; Seffert, P; Chêne, G
2012-01-01
To validate a new laparoscopy home training model (GYN Trainer®) in order to practise and learn basic laparoscopic surgery. Ten junior surgical residents and six experienced operators were timed and assessed during six laparoscopic exercises performed on the home training model. Acquisition of skill was 35%. All the novices significantly improved performance in surgical skills despite an 8% partial loss of acquisition between two training sessions. Qualitative evaluation of the system was good (3.8/5). This low-cost personal laparoscopic model seems to be a useful tool to assist surgical novices in learning basic laparoscopic skills. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
Clinical process analysis and activity-based costing at a heart center.
Ridderstolpe, Lisa; Johansson, Andreas; Skau, Tommy; Rutberg, Hans; Ahlfeldt, Hans
2002-08-01
Cost studies, productivity, efficiency, and quality of care measures, the links between resources and patient outcomes, are fundamental issues for hospital management today. This paper describes the implementation of a model for process analysis and activity-based costing (ABC)/management at a Heart Center in Sweden as a tool for administrative cost information, strategic decision-making, quality improvement, and cost reduction. A commercial software package (QPR) containing two interrelated parts, "ProcessGuide and CostControl," was used. All processes at the Heart Center were mapped and graphically outlined. Processes and activities such as health care procedures, research, and education were identified together with their causal relationship to costs and products/services. The construction of the ABC model in CostControl was time-consuming. However, after the ABC/management system was created, it opened the way for new possibilities including process and activity analysis, simulation, and price calculations. Cost analysis showed large variations in the cost obtained for individual patients undergoing coronary artery bypass grafting (CABG) surgery. We conclude that a process-based costing system is applicable and has the potential to be useful in hospital management.
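Activity-based costing of the kind described assigns costs to a patient episode by multiplying each activity's cost rate by the episode's consumption of that activity's cost driver. A minimal sketch (the activities, rates, and driver counts are invented, not the Heart Center's figures):

```python
def abc_cost(activity_rates, driver_usage):
    """Activity-based cost of one patient episode.

    activity_rates: cost per driver unit for each activity
    (e.g. cost per operating-room minute). driver_usage: driver
    units consumed by this episode. Both are illustrative.
    """
    return sum(rate * driver_usage.get(activity, 0)
               for activity, rate in activity_rates.items())

# Hypothetical rates and a hypothetical CABG episode:
rates = {"or_minute": 30.0, "icu_hour": 200.0, "lab_test": 15.0}
usage = {"or_minute": 240, "icu_hour": 24, "lab_test": 10}
episode_cost = abc_cost(rates, usage)  # 7200 + 4800 + 150 = 12150
```

Costing episodes this way, rather than by a flat per-day charge, is what exposes the large patient-to-patient cost variation the study reports for CABG surgery.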
CubeSat mission design software tool for risk estimating relationships
NASA Astrophysics Data System (ADS)
Gamble, Katharine Brumbaugh; Lightsey, E. Glenn
2014-09-01
In an effort to make the CubeSat risk estimation and management process more scientific, a software tool has been created that enables mission designers to estimate mission risks. CubeSat mission designers are able to input mission characteristics, such as form factor, mass, development cycle, and launch information, in order to determine the mission risk root causes which historically present the highest risk for their mission. Historical data was collected from the CubeSat community and analyzed to provide a statistical background to characterize these Risk Estimating Relationships (RERs). This paper develops and validates the mathematical model based on the same cost estimating relationship methodology used by the Unmanned Spacecraft Cost Model (USCM) and the Small Satellite Cost Model (SSCM). The RER development uses general error regression models to determine the best fit relationship between root cause consequence and likelihood values and the input factors of interest. These root causes are combined into seven overall CubeSat mission risks which are then graphed on the industry-standard 5×5 Likelihood-Consequence (L-C) chart to help mission designers quickly identify areas of concern within their mission. This paper is the first to document not only the creation of a historical database of CubeSat mission risks, but, more importantly, the scientific representation of Risk Estimating Relationships.
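A risk estimating relationship, like the cost estimating relationships in USCM and SSCM, is at bottom a regression of an outcome (here, a root-cause consequence or likelihood value) on mission characteristics. The ordinary least-squares sketch below stands in for the paper's general error regression models, and the data in the usage example are invented:

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b).

    A single-predictor stand-in for an estimating relationship:
    x might be spacecraft mass, y a risk consequence score.
    """
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    return mean_y - b * mean_x, b

# Invented data: risk score rising with mass (kg).
a, b = fit_linear([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```

General error regression models generalize this by allowing multiplicative rather than additive error, which suits cost and risk data spanning orders of magnitude.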
Belciug, Smaranda; Gorunescu, Florin
2016-03-01
This study explores how efficient intelligent decision support systems, both easily understandable and straightforwardly implemented, can help modern hospital managers optimize both bed occupancy and utilization costs. The paper proposes a hybrid genetic algorithm-queuing multi-compartment model for patient flow in hospitals. A finite-capacity queuing model with phase-type service distribution is combined with a compartmental model, and an associated cost model is set up. An evolutionary-based approach is used to enhance the ability to optimize both bed management and associated costs. In addition, a "what-if" analysis shows how changing the model parameters could improve performance while controlling costs. The study uses bed-occupancy data collected at the Department of Geriatric Medicine, St. George's Hospital, London, for the period 1969-1984 and January 2000. The hybrid model revealed that a bed occupancy exceeding 91%, implying a patient rejection rate of around 1.1%, can be achieved with 159 staffed beds plus 8 unstaffed beds. The same holding and penalty costs, but significantly different bed allocations (156 vs. 184 staffed beds, and 8 vs. 9 unstaffed beds, respectively), will result in significantly different costs (£755 vs. £1172). Moreover, once the arrival rate exceeds 7 patients/day, the costs associated with the finite-capacity system become significantly smaller than those associated with an Erlang B queuing model (£134 vs. £947). By encoding the whole information provided by both the queuing system and the cost model in chromosomes, the genetic algorithm represents an efficient tool for optimizing bed allocation and associated costs. The methodology can be extended to different medical departments with minor modifications in structure and parameterization. Copyright © 2016 Elsevier B.V. All rights reserved.
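The Erlang B model used as the comparison benchmark has a standard recurrence for the blocking (patient rejection) probability of a loss system with a given number of beds and offered load. This is the classical formula, shown here as a sketch, not the paper's full hybrid model:

```python
def erlang_b(servers, offered_load):
    """Erlang B blocking probability via the standard stable recurrence.

    servers: number of beds (loss system, no waiting);
    offered_load: arrival rate times mean length of stay, in erlangs.
    """
    blocking = 1.0
    for k in range(1, servers + 1):
        blocking = offered_load * blocking / (k + offered_load * blocking)
    return blocking
```

For example, one bed offered one erlang of load blocks half of all arrivals, and each added bed lowers the blocking probability; a bed count can thus be chosen to push rejection below a target such as the paper's 1.1%.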
Ekman, Björn; Borg, Johan
2017-08-01
The aim of this study is to provide evidence on the costs and health effects of two alternative hearing aid delivery models: a community-based and a centre-based approach. The study is set in Bangladesh, and the study population is children between 12 and 18 years old. Data on resource use by participants and their caregivers were collected by a household survey. Follow-up data were collected after two months. Data on the costs to providers of the two approaches were collected by means of key informant interviews. The total cost per participant in the community-based model was BDT 6,333 (USD 79) compared with BDT 13,718 (USD 172) for the centre-based model. Both delivery models are found to be cost-effective, with an estimated cost per DALY averted of BDT 17,611 (USD 220) for the community-based model and BDT 36,775 (USD 460) for the centre-based model. Using a community-based approach to deliver hearing aids to children in a resource-constrained environment is a cost-effective alternative to the traditional centre-based approach. Further evidence is needed to draw conclusions for scale-up of the approaches; rigorous analysis is possible using well-prepared data collection tools and working closely with sector professionals. Implications for Rehabilitation: Delivery models vary in the resources needed for their implementation. Community-based delivery models of hearing aids to children in low-income countries are a cost-effective alternative. The assessment of the costs and effects of hearing aid delivery models in low-income countries is possible through planned collaboration between researchers and sector professionals.
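The reported cost-effectiveness figures come down to dividing total cost by DALYs averted. In the sketch below, the costs are the study's published USD figures, but the DALY denominators are back-calculated from the published ratios for illustration and are therefore an assumption, not data from the paper:

```python
def cost_per_daly(total_cost, dalys_averted):
    """Cost-effectiveness ratio: cost per disability-adjusted life year averted."""
    return total_cost / dalys_averted

# Published per-participant costs (USD); DALYs averted per participant
# back-calculated from the published ratios (an assumption, for illustration).
community = cost_per_daly(79.0, 0.359)   # roughly USD 220 per DALY averted
centre = cost_per_daly(172.0, 0.374)     # roughly USD 460 per DALY averted
```

The comparison shows why the community model wins: it averts nearly as many DALYs per child at less than half the cost per participant.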
Tracking PACS usage with open source tools.
French, Todd L; Langer, Steve G
2011-08-01
A typical choice faced by Picture Archiving and Communication System (PACS) administrators is deciding how many PACS workstations are needed and where they should be sited. Oftentimes, the social consequences of having too few are severe enough to encourage oversupply and underutilization. This is costly, at best in terms of hardware and electricity, and at worst (depending on the PACS licensing and support model) in capital costs and maintenance fees. The PACS administrator needs tools to assess accurately the use to which her fleet is being subjected, and thus make informed choices before buying more workstations. Lacking a vended solution for this challenge, we developed our own.
NASA Astrophysics Data System (ADS)
Gallo, E. M.; Hogue, T. S.; Bell, C. D.; Spahr, K.; McCray, J. E.
2017-12-01
Receiving streams and waterbodies in urban watersheds are increasingly polluted by stormwater runoff. The implementation of Green Infrastructure (GI), which includes Low Impact Developments (LIDs) and Best Management Practices (BMPs), within a watershed aims to mitigate the effects of urbanization by reducing pollutant loads, runoff volume, and storm peak flow. Stormwater modeling is generally used to assess the impact of GIs implemented within a watershed. These modeling tools are useful for determining the optimal suite of GIs to maximize pollutant load reduction and minimize cost. However, stormwater management for most resource managers and communities also includes the implementation of grey and hybrid stormwater infrastructure. An integrated decision support tool, called i-DST, that allows for the optimization and comprehensive life-cycle cost assessment of grey, green, and hybrid stormwater infrastructure is currently being developed. The i-DST tool will evaluate optimal stormwater runoff management by taking into account the diverse economic, environmental, and societal needs associated with watersheds across the United States. Three watersheds from southern California will act as a test site and assist in the development and initial application of the i-DST tool. The Ballona Creek, Dominguez Channel, and Los Angeles River Watersheds are located in highly urbanized Los Angeles County. The water quality of the river channels flowing through each is impaired by heavy metals, including copper, lead, and zinc. However, despite being adjacent to one another within the same county, modeling results using the EPA System for Urban Stormwater Treatment and Analysis INtegration (SUSTAIN) found that the optimal path to compliance in each watershed differs significantly. The differences include varied costs, suites of BMPs, and ancillary benefits.
This research analyzes how the economic, physical, and hydrological differences between the three watersheds shape the optimal plan for stormwater management.
Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melaina, M.; Penev, M.
2012-09-01
NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.
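The cash flow analysis described in the presentation rests on discounting; a minimal net-present-value sketch with invented station figures (the discount rate, capex, and revenue below are assumptions, not NREL inputs):

```python
def npv(rate, cash_flows):
    """Net present value of a series of end-of-period cash flows.
    cash_flows[0] is the upfront (period-0) flow, typically negative capex."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical station: $1.0M capex, then $250k net annual revenue for 6 years
flows = [-1_000_000] + [250_000] * 6
project_npv = npv(0.08, flows)
```

A positive NPV at the chosen discount rate is the usual go/no-go signal in a business case of this kind.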
On the use of high-frequency SCADA data for improved wind turbine performance monitoring
NASA Astrophysics Data System (ADS)
Gonzalez, E.; Stephen, B.; Infield, D.; Melero, J. J.
2017-11-01
SCADA-based condition monitoring of wind turbines facilitates the move from costly corrective repairs towards more proactive maintenance strategies. In this work, we advocate the use of high-frequency SCADA data and quantile regression to build a cost-effective performance monitoring tool. The benefits of the approach are demonstrated through a comparison between state-of-the-art deterministic power curve modelling techniques and the suggested probabilistic model. Detection capabilities are compared for low- and high-frequency SCADA data, providing evidence in favour of monitoring at higher resolutions. Operational data from healthy and faulty turbines are used to provide a practical example of usage of the proposed tool, effectively achieving the detection of an incipient gearbox malfunction more than one month prior to the actual occurrence of the failure.
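The method builds quantile bands of the power curve and flags operation falling below them. The sketch below substitutes a simple empirical per-bin quantile for full quantile regression, an assumption made only to keep the example self-contained:

```python
def percentile(xs, q):
    """Empirical q-quantile (0 <= q < 1) of a non-empty list."""
    xs = sorted(xs)
    return xs[max(0, min(len(xs) - 1, int(q * len(xs))))]

def fit_lower_band(records, bin_width=1.0, q=0.05):
    """Lower power quantile per wind-speed bin, fitted from healthy
    SCADA records given as (wind_speed, power) pairs."""
    bins = {}
    for ws, p in records:
        bins.setdefault(int(ws // bin_width), []).append(p)
    return {b: percentile(ps, q) for b, ps in bins.items()}

def flag_underperformance(band, records, bin_width=1.0):
    """Return the observations falling below the fitted lower band."""
    return [(ws, p) for ws, p in records
            if int(ws // bin_width) in band and p < band[int(ws // bin_width)]]
```

In practice, persistent flags in one bin, rather than isolated excursions, would be the trigger for an incipient-fault alarm.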
Principles of continuous quality improvement applied to intravenous therapy.
Dunavin, M K; Lane, C; Parker, P E
1994-01-01
Documentation of the application of the principles of continuous quality improvement (CQI) to the health care setting is crucial for understanding the transition from traditional management models to CQI models. A CQI project was designed and implemented by the IV Therapy Department at Lawrence Memorial Hospital to test the application of these principles to intravenous therapy and as a learning tool for the entire organization. Through a prototype inventory project, significant savings in cost and time were demonstrated using check sheets, flow diagrams, control charts, and other statistical tools, as well as the Plan-Do-Check-Act cycle. As a result, a primary goal, increased time for direct patient care, was achieved. Eight hours of nursing time per week were saved, relationships between two work areas were improved, and $6,000 was saved in personnel costs, storage space, and inventory.
Trade-space Analysis for Constellations
NASA Astrophysics Data System (ADS)
Le Moigne, J.; Dabney, P.; de Weck, O. L.; Foreman, V.; Grogan, P.; Holland, M. P.; Hughes, S. P.; Nag, S.
2016-12-01
Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches, and hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: "How many spacecraft should be included in the constellation? Which design has the best cost/risk value?" The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variable trade space for pre-defined science, cost, and risk goals and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C, including: a User Interface (UI) interacting with multiple users - scientists, mission designers, or program managers; and an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator, first with inputs from the Knowledge Base, then, in collaboration with the Orbit & Coverage, Reduction & Metrics, and Cost & Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance.
The current version of TAT-C includes uniform Walker constellations as well as ad-hoc constellations, and its cost model aggregates Cost Estimating Relationships (CERs) drawn from widely accepted models. The Knowledge Base supports both analysis and exploration, and the current GUI prototype automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost.
Can automation in radiotherapy reduce costs?
Massaccesi, Mariangela; Corti, Michele; Azario, Luigi; Balducci, Mario; Ferro, Milena; Mantini, Giovanna; Mattiucci, Gian Carlo; Valentini, Vincenzo
2015-01-01
Computerized automation is likely to play an increasingly important role in radiotherapy. The objective of this study was to report the results of the first part of a program to implement a model for economic evaluation based on the micro-costing method. To test the efficacy of the model, the financial impact of the introduction of an automation tool was estimated. A single- and multi-center validation of the model by a prospective collection of data is planned as the second step of the program. The model was implemented by using an interactive spreadsheet (Microsoft Excel, 2010). The variables to be included were identified across three components: productivity, staff, and equipment. To calculate staff requirements, the workflow of the Gemelli ART center was mapped out and relevant workload measures were defined. Profit and loss, productivity, and staffing were identified as significant outcomes. Results were presented in terms of earnings before interest and taxes (EBIT). Three different scenarios were hypothesized: the baseline situation at Gemelli ART (scenario 1); reduction by 2 minutes of the average duration of treatment fractions (scenario 2); and increased incidence of advanced treatment modalities (scenario 3). By using the model, predicted EBIT values for each scenario were calculated across a period of eight years (from 2015 to 2022). For both scenarios 2 and 3, costs are expected to increase slightly compared with the baseline situation, mainly due to a small increase in clinical personnel costs. However, in both cases EBIT values are more favorable than in the baseline situation (EBIT values: scenario 1, 27%; scenario 2, 30%; scenario 3, 28% of revenues). A model based on a micro-costing method was able to estimate the financial consequences of the introduction of an automation tool in our radiotherapy department. A prospective collection of data at Gemelli ART and in a consortium of centers is currently under way to prospectively validate the model.
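At its core, the spreadsheet model compares scenario EBIT margins; a minimal sketch with invented revenue and cost figures, chosen only so the margins land near the abstract's 27% and 30% (none of these numbers come from the study):

```python
def ebit(revenue, staff_cost, equipment_cost, other_cost):
    """Earnings before interest and taxes for one operating year."""
    return revenue - (staff_cost + equipment_cost + other_cost)

def ebit_margin(revenue, staff_cost, equipment_cost, other_cost):
    """EBIT expressed as a fraction of revenue."""
    return ebit(revenue, staff_cost, equipment_cost, other_cost) / revenue

# Scenario 1: baseline; scenario 2: shorter fractions -> higher throughput
# and revenue, with slightly higher clinical personnel costs (hypothetical)
baseline = ebit_margin(10_000_000, 4_500_000, 2_000_000, 800_000)
scenario2 = ebit_margin(10_600_000, 4_600_000, 2_000_000, 820_000)
```

The scenario comparison then reduces to projecting each term forward year by year and comparing the resulting margins.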
Models of supply function equilibrium with applications to the electricity industry
NASA Astrophysics Data System (ADS)
Aromi, J. Daniel
Electricity market design requires tools that result in a better understanding of the incentives of generators and consumers. Chapters 1 and 2 provide such tools and applications of them to analyze incentive problems in electricity markets. In chapter 1, models of supply function equilibrium (SFE) with asymmetric bidders are studied. I prove the existence and uniqueness of equilibrium in an asymmetric SFE model. In addition, I propose a simple algorithm to calculate the unique equilibrium numerically. As an application, a model of investment decisions is considered that uses the asymmetric SFE as an input. In this model, firms can invest in different technologies, each characterized by distinct variable and fixed costs. In chapter 2, option contracts are introduced to a supply function equilibrium (SFE) model. The uniqueness of the equilibrium in the spot market is established. Comparative statics results on the effect of option contracts on the equilibrium price are presented. A multi-stage game where option contracts are traded before the spot market stage is considered. When contracts are optimally procured by a central authority, the selected profile of option contracts is such that the spot market price equals marginal cost for any load level, resulting in a significant reduction in cost. If load serving entities (LSEs) are price takers, in equilibrium there is no trade of option contracts. Even when LSEs have market power, the central authority's solution cannot be implemented in equilibrium. In chapter 3, we consider a game in which a buyer must repeatedly procure an input from a set of firms. In our model, the buyer is able to sign long-term contracts that establish the likelihood with which the next-period contract is awarded to an entrant or the incumbent. We find that the buyer finds it optimal to favor the incumbent, as this generates more intense competition between suppliers. In a two-period model we are able to completely characterize the optimal mechanism.
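For orientation, the symmetric-information benchmark that asymmetric SFE models of this kind generalize is the Klemperer-Meyer (1989) supply function equilibrium, in which each firm $i$'s supply schedule $S_i(p)$ satisfies, given market demand $D(p)$ and rivals' schedules $S_j(p)$, the standard first-order condition below (stated here from the general literature, not taken from the dissertation itself):

```latex
S_i(p) \;=\; \bigl(p - C_i'\bigl(S_i(p)\bigr)\bigr)
\Bigl(\sum_{j \neq i} S_j'(p) \;-\; D'(p)\Bigr),
```

where $C_i$ is firm $i$'s cost function. With asymmetric costs this becomes a system of coupled differential equations, which is why existence, uniqueness, and a numerical solution algorithm are nontrivial contributions.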
T. W. Appelboom; G. M. Chescheir; R. W. Skaggs; J. W. Gilliam; Devendra M. Amatya
2006-01-01
Watershed modeling has become an important tool for researchers given the high costs of water quality monitoring. When modeling nitrate transport within drainage networks, denitrification within the sediments needs to be accounted for. Birgand et al. developed an equation using a term called a mass transfer coefficient to mathematically describe sediment...
Organizational Constraints and Goal Setting
ERIC Educational Resources Information Center
Putney, Frederick B.; Wotman, Stephen
1978-01-01
Management modeling techniques are applied to setting operational and capital goals using cost analysis techniques in this case study at the Columbia University School of Dental and Oral Surgery. The model was created as a planning tool used in developing a financially feasible operating plan and a 100 percent physical renewal plan. (LBH)
2001-07-21
APPENDIX A. ACRONYMS ACCES Attenuating Custom Communication Earpiece System ACEIT Automated Cost Estimating Integrated Tools AFSC Air Force... documented in the ACEIT cost estimating tool developed by Tecolote, Inc. The factor used was 14 percent of PMP. 1.3 System Engineering/Program... The data source is the ASC Aeronautical Engineering Products Cost Factor Handbook, which is documented in the ACEIT cost estimating tool developed
Managing design excellence tools during the development of new orthopaedic implants.
Défossez, Henri J P; Serhan, Hassan
2013-11-01
Design excellence (DEX) tools have been widely used for years in some industries for their potential to facilitate new product development. The medical sector, targeted by cost pressures, has therefore started adopting them. Numerous tools are available; however, only appropriate deployment during the new product development stages can optimize the overall process. The primary study objectives were to describe generic tools, illustrate their implementation and management during the development of new orthopaedic implants, and compile a reference package. Secondary objectives were to present the DEX tool investment costs and savings, since the method can require significant resources for which companies must carefully plan. The publicly available DEX method "Define Measure Analyze Design Verify Validate" was adopted and implemented during the development of a new spinal implant. Several tools proved most successful at developing the correct product, addressing clinical needs, and increasing market penetration potential, while reducing design iterations and manufacturing validations. Cost analysis and a Pugh Matrix coupled with multi-generation planning enabled the development of a strong rationale to activate the project and to set the vision and goals. Improved risk management and product usage mapping established a robust technical verification-validation program. Design of experiments and process quantification facilitated design for manufacturing of critical features as early as the concept phase. Biomechanical testing with analysis of variance provided a validation model with a recognized statistical performance baseline.
Within those tools, only certain ones required minimum resources (i.e., business case, multi-generational plan, project value proposition, Pugh Matrix, critical-to-quality process validation techniques), while others required significant investments (i.e., voice of customer, product usage map, improved risk management, design of experiments, biomechanical testing techniques). All used techniques provided savings exceeding investment costs. Some other tools were considered and found less relevant. A matrix summarized the investment costs and generated estimated savings. Globally, all companies can benefit from using DEX by smartly selecting and estimating those tools with the best return on investment at the start of the project. For this, a good understanding of the available company resources, background, and development strategy is needed. In conclusion, it was possible to illustrate that appropriate management of design excellence tools can greatly facilitate the development of new orthopaedic implant systems.
A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masoud
1998-01-01
This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis; and (3) optimization. The product and process definitions are part of the input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depends largely on the available data on specific methods of manufacture and assembly and the associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into the design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.
3D FEM Simulation of Flank Wear in Turning
NASA Astrophysics Data System (ADS)
Attanasio, Aldo; Ceretti, Elisabetta; Giardini, Claudio
2011-05-01
This work deals with tool wear simulation. Studying the influence of tool wear on tool life, tool substitution policy, final part quality, surface integrity, cutting forces, and power consumption is important for reducing the global process costs. Adhesion, abrasion, erosion, diffusion, corrosion, and fracture are some of the phenomena responsible for tool wear, depending on the selected cutting parameters: cutting velocity, feed rate, depth of cut, and so on. In some cases these wear mechanisms are described by analytical models as a function of process variables (temperature, pressure, and sliding velocity along the cutting surface). These analytical models are suitable to be implemented in FEM codes, and they can be utilized to simulate the tool wear. In the present paper a commercial 3D FEM software has been customized to simulate the tool wear during turning operations when cutting AISI 1045 carbon steel with an uncoated tungsten carbide tip. The FEM software was improved by means of a suitable subroutine able to modify the tool geometry on the basis of the estimated tool wear as the simulation goes on. Since, for the considered couple of tool and workpiece materials, the main phenomena generating wear are the abrasive and the diffusive ones, the tool wear model implemented in the subroutine was obtained as a combination of Usui's model and Takeyama and Murata's model. A comparison between experimental and simulated flank tool wear curves is reported, demonstrating that it is possible to simulate the tool wear development.
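A minimal sketch of such a combined wear law: an Usui-type abrasive/adhesive term plus an Arrhenius-type diffusive term, integrated over cutting time. The functional forms follow the named models, but every constant below is illustrative, not a calibrated value from the paper:

```python
import math

def wear_rate(sigma_n, v_s, T, A=1e-8, B=4000.0, D=5.0, E=75_000.0, R=8.314):
    """Combined wear rate: an Usui-type abrasive/adhesive term plus a
    Takeyama-Murata-type diffusive (Arrhenius) term.
    sigma_n: normal stress, v_s: sliding velocity, T: contact temperature (K).
    All constants are illustrative placeholders."""
    abrasive = A * sigma_n * v_s * math.exp(-B / T)
    diffusive = D * math.exp(-E / (R * T))
    return abrasive + diffusive

def flank_wear(sigma_n, v_s, T, t_end, dt=0.1):
    """Forward-Euler integration of flank wear over cutting time t_end (s)."""
    w, t = 0.0, 0.0
    while t < t_end:
        w += wear_rate(sigma_n, v_s, T) * dt
        t += dt
    return w
```

In the FEM subroutine, a rate of this kind would be evaluated locally along the flank contact and used to update the tool geometry at each simulation step.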
Modelling of Tool Wear and Residual Stress during Machining of AISI H13 Tool Steel
NASA Astrophysics Data System (ADS)
Outeiro, José C.; Umbrello, Domenico; Pina, José C.; Rizzuti, Stefania
2007-05-01
Residual stresses can enhance or impair the ability of a component to withstand loading conditions in service (fatigue, creep, stress corrosion cracking, etc.), depending on their nature: compressive or tensile, respectively. This poses enormous problems in structural assembly, as it affects the structural integrity of the whole part. In addition, tool wear issues are of critical importance in manufacturing, since they affect component quality, tool life, and machining cost. Therefore, prediction and control of both tool wear and residual stresses in machining are absolutely necessary. In this work, a two-dimensional Finite Element model using an implicit Lagrangian formulation with automatic remeshing was applied to simulate the orthogonal cutting process of AISI H13 tool steel. To validate the model, the predicted and experimentally measured chip geometry, cutting forces, temperatures, tool wear, and residual stresses in the machined affected layers were compared. The proposed FE model allowed us to investigate the influence of tool geometry, cutting regime parameters, and tool wear on the residual stress distribution in the machined surface and subsurface of AISI H13 tool steel. The obtained results permit the conclusion that, in order to reduce the magnitude of surface residual stresses, the cutting speed should be increased, the uncut chip thickness (or feed) should be reduced, and machining with honed tools having large cutting edge radii produces better results than with chamfered tools. Moreover, increasing tool wear increases the magnitude of surface residual stresses.
Multiperiod planning tool for multisite pig production systems.
Nadal-Roig, E; Plà, L M
2014-09-01
This paper presents a multiperiod planning tool for multisite pig production systems based on Linear Programming (LP). The aim of the model is to help pig managers of multisite systems in making short-term decisions (mainly related to pig transfers between farms and batch management in fattening units) and mid-term or long-term decisions (according to company targets and expansion strategy). The model skeleton follows the structure of a three-site system that can be adapted to any multisite system present in the modern pig industry. There are three basic phases, namely piglet production, rearing, and fattening. Each phase involves a different set of farms; therefore, transportation between farms and the delivery of pigs to the abattoir are both under consideration. The model maximizes the total gross margin, calculated from the income of sales to the abattoir and the production costs over the time horizon considered. Production cost depends on each type of farm involved in the process. Parameters like the number of farms per phase and the distances between them, farm capacity, reproduction management policies, feeding and veterinary expenses, and transportation costs are taken into account. The model also provides a schedule of transfers between farms in terms of animals to be transported and the number of trucks involved. The use of the model is illustrated with a case study based on a real instance of a company located in Catalonia (Spain).
Liu, Nan; D'Aunno, Thomas
2012-04-01
To develop simple stylized models for evaluating the productivity and cost-efficiency of different practice models that involve nurse practitioners (NPs) in primary care, and in particular to generate insights on what affects the performance of these models and how. The productivity of a practice model is defined as the maximum number of patients that can be accounted for by the model under a given timeliness-to-care requirement; cost-efficiency is measured by the corresponding annual cost per patient in that model. Appropriate queueing analysis is conducted to generate formulas and values for these two performance measures. Model parameters for the analysis are extracted from the previous literature and survey reports. Sensitivity analysis is conducted to investigate model performance under different scenarios and to verify the robustness of the findings. Employing an NP, whose salary is usually lower than that of a primary care physician, may not be cost-efficient, in particular when the NP's capacity is underutilized. Besides provider service rates, workload allocation among providers is one of the most important determinants of the cost-efficiency of a practice model involving NPs. Capacity pooling among providers could be a helpful strategy to improve efficiency in care delivery. The productivity and cost-efficiency of a practice model depend heavily on how providers organize their work and on a variety of other factors related to the practice environment. Queueing theory provides useful tools for taking these factors into account when making strategic decisions on staffing and panel size selection for a practice model. © Health Research and Educational Trust.
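The productivity measure, the largest panel compatible with a timeliness target, can be sketched with the simplest queueing building block. The M/M/1 waiting-time formula stands in for the paper's analysis, and every number below is an assumption, not a calibrated parameter:

```python
def mm1_expected_wait(arrival_rate, service_rate):
    """Expected time in system for an M/M/1 queue; infinite if unstable."""
    if arrival_rate >= service_rate:
        return float("inf")
    return 1.0 / (service_rate - arrival_rate)

def max_panel_size(visits_per_patient_year, service_rate_per_day,
                   wait_target_days, clinic_days_per_year=250):
    """Largest panel whose implied daily arrival rate keeps the expected
    time in system (in days) within the timeliness target."""
    n = 0
    while True:
        lam = (n + 1) * visits_per_patient_year / clinic_days_per_year
        if mm1_expected_wait(lam, service_rate_per_day) > wait_target_days:
            return n
        n += 1

# One provider seeing 20 patients/day, patients averaging 3 visits/year,
# and a half-day timeliness target (all figures hypothetical)
panel = max_panel_size(3.0, 20.0, 0.5)
```

Pooling two providers' capacity into one queue relaxes the constraint more than doubling two separate queues would, which is the intuition behind the paper's capacity-pooling recommendation.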
Vibroacoustic test plan evaluation: Parameter variation study
NASA Technical Reports Server (NTRS)
Stahle, C. V.; Gongloef, H. R.
1976-01-01
Statistical decision models are shown to provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology developed provides a major step toward a realistic tool to quantitatively tailor test programs to specific payloads. Testing is considered at the no-test, component, subassembly, or system level of assembly. Component redundancy and partial loss of flight data are considered. Deterministic and probabilistic costs are considered, and incipient failures resulting from ground tests are treated. Optima defining both component and assembly test levels are indicated for the modified test plans considered. Modeling simplifications must be considered in interpreting the results relative to a particular payload. New parameters introduced were a no-test option, flight-by-flight failure probabilities, and a cost to design components for higher vibration requirements. Parameters varied were the shuttle payload bay internal acoustic environment, the STS launch cost, the component retest/repair cost, and the amount of redundancy in the housekeeping section of the payload reliability model.
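The decision logic reduces to comparing expected programme costs across test plans; a toy sketch in which every test cost, failure probability, and failure cost is invented for illustration:

```python
def expected_cost(test_cost, p_flight_failure, failure_cost):
    """Expected programme cost of a test plan: ground-test cost plus the
    probability-weighted cost of a vibroacoustic failure in flight."""
    return test_cost + p_flight_failure * failure_cost

# Hypothetical plans: more thorough ground testing costs more up front
# but lowers the residual flight-failure probability
plans = {
    "no test":   expected_cost(0.0,   0.08, 20e6),
    "component": expected_cost(0.4e6, 0.03, 20e6),
    "system":    expected_cost(1.2e6, 0.01, 20e6),
}
best = min(plans, key=plans.get)
```

The cost-minimizing plan shifts as the acoustic environment, launch cost, or retest cost changes, which is exactly the trade the parameter variation study explores.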
NASA Astrophysics Data System (ADS)
Campbell, Diarmad; de Beer, Johannes; Lawrence, David; van der Meulen, Michiel; Mielby, Susie; Hay, David; Scanlon, Ray; Campenhout, Ignace; Taugs, Renate; Eriksson, Ingelov
2014-05-01
Sustainable urbanisation is the focus of SUB-URBAN, a European Cooperation in Science and Technology (COST) Action TU1206 - A European network to improve understanding and use of the ground beneath our cities. This aims to transform relationships between experts who develop urban subsurface geoscience knowledge - principally national Geological Survey Organisations (GSOs), and those who can most benefit from it - urban decision makers, planners, practitioners and the wider research community. Under COST's Transport and Urban Development Domain, SUB-URBAN has established a network of GSOs and other researchers in over 20 countries, to draw together and evaluate collective urban geoscience research in 3D/4D characterisation, prediction and visualisation. Knowledge exchange between researchers and City-partners within 'SUB-URBAN' is already facilitating new city-scale subsurface projects, and is developing a tool-box of good-practice guidance, decision-support tools, and cost-effective methodologies that are appropriate to local needs and circumstances. These are intended to act as catalysts in the transformation of relationships between geoscientists and urban decision-makers more generally. As a result, the importance of the urban sub-surface in the sustainable development of our cities will be better appreciated, and the conflicting demands currently placed on it will be acknowledged, and resolved appropriately. Existing city-scale 3D/4D model exemplars are being developed by partners in the UK (Glasgow, London), Germany (Hamburg) and France (Paris). These draw on extensive ground investigation (10s-100s of thousands of boreholes) and other data. Model linkage enables prediction of groundwater, heat, SuDS, and engineering properties. Combined subsurface and above-ground (CityGML, BIMs) models are in preparation. 
These models will provide valuable tools for more holistic urban planning, identifying subsurface opportunities and saving costs by reducing uncertainty in ground conditions. A key area of interest, and one of potential collaboration with COST Action TU1208, is characterising and parameterising the very near urban subsurface, and especially the anthropogenic deposits, to assist decision-making by civil engineers and others. Anthropogenic deposits may be many metres thick, are typically very heterogeneous, have complex histories of accumulation, and may include important archaeological assets. They display complex stratigraphies which are difficult to resolve using traditional methodologies, even with extensive invasive ground investigation. Ground Penetrating Radar and other non-destructive methods of ground investigation hold considerable promise for greatly improving the resolution, understanding, and modelling of these and other near-surface deposits in particular. This work is a contribution both to COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar" and to COST Action TU1206 "SUB-URBAN - A European network to improve understanding and use of the ground beneath our cities"
Time-driven activity-based costing: A dynamic value assessment model in pediatric appendicitis.
Yu, Yangyang R; Abbas, Paulette I; Smith, Carolyn M; Carberry, Kathleen E; Ren, Hui; Patel, Binita; Nuchtern, Jed G; Lopez, Monica E
2017-06-01
Healthcare reform policies are emphasizing value-based healthcare delivery. We hypothesize that time-driven activity-based costing (TDABC) can be used to appraise healthcare interventions in pediatric appendicitis. Triage-based standing delegation orders, surgical advanced practice providers, and a same-day discharge protocol were implemented to target deficiencies identified in our initial TDABC model. Post-intervention process maps for a hospital episode were created using electronic time stamp data for simple appendicitis cases during February to March 2016. Total personnel and consumable costs were determined using TDABC methodology. The post-intervention TDABC model featured 6 phases of care, 33 processes, and 19 personnel types. Our interventions reduced duration and costs in the emergency department (-41 min, -$23) and the pre-operative floor (-57 min, -$18). While post-anesthesia care unit duration and costs increased (+224 min, +$41), the same-day discharge protocol eliminated post-operative floor costs (-$306). Our model incorporating all three interventions reduced total direct costs by 11% ($2753.39 to $2447.68) and duration of hospitalization by 51% (1984 min to 966 min). Time-driven activity-based costing can dynamically model changes in our healthcare delivery as a result of process improvement interventions. It is an effective tool to continuously assess the impact of these interventions on the value of appendicitis care. Level of evidence: II (economic analysis). Copyright © 2017 Elsevier Inc. All rights reserved.
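Mechanically, TDABC prices each care phase as staff minutes times per-role capacity cost rates, plus consumables. A minimal sketch in which every rate, minute count, and consumable cost is a made-up illustration, not the study's data:

```python
def phase_cost(minutes_by_role, rate_per_minute, consumables=0.0):
    """TDABC cost of one care phase: staff minutes times capacity cost
    rates per role, plus consumable costs."""
    staff = sum(minutes_by_role[role] * rate_per_minute[role]
                for role in minutes_by_role)
    return staff + consumables

# Capacity cost rates in $/min -- illustrative, not the study's rates
rates = {"surgeon": 2.50, "nurse": 0.90, "app": 1.20}
ed = phase_cost({"nurse": 45, "app": 20}, rates, consumables=35.0)
operating_room = phase_cost({"surgeon": 40, "nurse": 40}, rates,
                            consumables=450.0)
episode = ed + operating_room
```

Re-running the same arithmetic on post-intervention process maps is what lets the model attribute cost changes to specific phases, as the abstract does for the emergency department and pre-operative floor.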
The Cost-Effectiveness of Real-Time Continuous Glucose Monitoring (RT-CGM) in Type 2 Diabetes.
Fonda, Stephanie J; Graham, Claudia; Munakata, Julie; Powers, Julia M; Price, David; Vigersky, Robert A
2016-07-01
This analysis models the cost-effectiveness of real-time continuous glucose monitoring (RT-CGM) using evidence from a randomized controlled trial (RCT) that demonstrated RT-CGM reduced A1C, for up to 9 months after using the technology, among patients with type 2 diabetes not on prandial insulin. RT-CGM was offered short-term and intermittently as a self-care tool to inform patients' behavior. The analyses projected lifetime clinical and economic outcomes for RT-CGM versus self-monitoring of blood glucose by fingerstick only. The base-case analysis was consistent with the RCT (RT-CGM for 2 weeks on/1 week off over 3 months). A scenario analysis simulated outcomes of an RT-CGM "refresher" after the active intervention of the RCT. Analyses used the IMS CORE Diabetes Model and were conducted from a US third-party payer perspective, including direct costs obtained from published sources and inflated to 2011 US dollars. Costs and health outcomes were discounted at 3% per annum. Life expectancy (LE) and quality-adjusted life expectancy (QALE) from RT-CGM were 0.10 and 0.07, with a cost of $653/patient over a lifetime. Incremental LE and QALE from a "refresher" were 0.14 and 0.10, with a cost of $1312/patient over a lifetime, and incremental cost-effectiveness ratios were $9319 and $13 030 per LY and QALY gained. RT-CGM, as a self-care tool, is a cost-effective disease management option in the US for people with type 2 diabetes not on prandial insulin. Repeated use of RT-CGM may result in additional cost-effectiveness. © 2016 Diabetes Technology Society.
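The reported ratios follow the standard incremental cost-effectiveness arithmetic; a minimal sketch using the abstract's refresher-scenario figures as inputs (the small gap versus the published $13,030/QALY reflects rounding inside the CORE model, which this sketch does not reproduce):

```python
def discounted(values, rate=0.03):
    """Present value of a stream of annual values at the given discount rate."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(values))

def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per unit of extra
    effect (e.g. per QALY gained)."""
    return delta_cost / delta_effect

# Refresher scenario: +$1312 lifetime cost, +0.10 QALYs
ratio = icer(1312.0, 0.10)

# Costs and outcomes in the analysis are discounted at 3% per annum;
# the three $100 annual values here are purely illustrative
pv = discounted([100.0, 100.0, 100.0])
```

A ratio well below common US willingness-to-pay thresholds is what supports the abstract's cost-effectiveness conclusion.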
Analytical Tools for Affordability Analysis
2015-04-30
flunk this basic test from their inception. —Honorable Ashton B. Carter (2010), Under Secretary of Defense for Acquisition, Technology, and Logistics... Testing, and Evaluation] funding has been lost to cancelled programs. (Decker & Wagner, 2011) The Army is scarcely unique in this regard. All... econometric model of how schedule affects cost should take advantage of these different cost categories and treat them separately when they are known
Modeling U.S. Air Force Occupational Health Costs
2009-03-01
for the 75th Aerospace Medicine Group, Hill Air Force Base, Utah. Bioenvironmental engineers sought a more robust cost comparison tool, allowing...to Major Feltenberger and Major Johns at the Air Force Medical Operations Agency and Captain Batchellor from the 75th Aerospace Medicine Squadron...resources on support functions is challenging, and rightly so. In a sense, commanders are fiduciaries to the taxpayers and must responsibly spend
Nano-Launcher Technologies, Approaches, and Life Cycle Assessment. Phase II
NASA Technical Reports Server (NTRS)
Zapata, Edgar
2014-01-01
Assist in understanding NASA technology and investment approaches, and other driving factors, necessary for enabling dedicated nano-launchers by industry at a cost and flight rate that (1) could support and be supported by an emerging nano-satellite market and (2) would benefit NASA's needs. Develop life-cycle cost, performance and other NASA analysis tools or models required to understand issues, drivers and challenges.
NASA Astrophysics Data System (ADS)
Yang, Y.; Chui, T. F. M.
2016-12-01
Green infrastructure (GI) comprises sustainable and environmentally friendly alternatives to conventional grey stormwater infrastructure. Commonly used GI (e.g. green roofs, bioretention, porous pavement) can provide multifunctional benefits, e.g. mitigation of urban heat island effects and improvements in air quality. Therefore, to optimize the design of GI and grey drainage infrastructure, it is essential to account for their benefits together with their costs. In this study, a comprehensive simulation-optimization modelling framework that considers the economic and hydro-environmental aspects of GI and grey infrastructure for small urban catchment applications is developed. Several modelling tools (i.e., the EPA SWMM model and the WERF BMP and LID Whole Life Cycle Cost Modelling Tools) and optimization solvers are coupled together to assess the life-cycle cost-effectiveness of GI and grey infrastructure, and to further develop optimal stormwater drainage solutions. A typical residential lot in New York City is examined as a case study. The life-cycle cost-effectiveness of various GI and grey infrastructure options is first examined at different investment levels. The results, together with the catchment parameters, are then provided to the optimization solvers to derive the optimal investment and contributing area for each type of stormwater control. The relationship between the investment and the optimized environmental benefit is found to be nonlinear. The optimized drainage solutions demonstrate that grey infrastructure is preferred at low total investments while more GI should be adopted at high investments. The sensitivity of the optimized solutions to the prices of the stormwater controls is evaluated and found to be highly associated with their utilization in the base optimization case. The overall simulation-optimization framework can be easily applied to other sites worldwide and further developed into powerful decision support systems.
Automating Risk Analysis of Software Design Models
Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
Scenario analysis for techno-economic model development of U.S. offshore wind support structures
Damiani, Rick; Ning, Andrew; Maples, Ben; ...
2016-09-22
Challenging bathymetry and soil conditions of future US offshore wind power plants might promote the use of multimember, fixed-bottom structures (or 'jackets') in place of monopiles. Support structures affect costs associated with the balance of system and operation and maintenance. Understanding the link between these costs and the main environmental design drivers is crucial in the quest for a lower levelized cost of energy, and it is the main rationale for this work. Actual cost and engineering data are still scarce; hence, we evaluated a simplified engineering approach to tie key site and turbine parameters (e.g. water depth, wave height, tower-head mass, hub height and generator rating) to the overall support weight. A jacket-and-tower sizing tool, part of the National Renewable Energy Laboratory's system engineering software suite, was utilized to achieve mass-optimized support structures for 81 different configurations. This tool set provides preliminary sizing of all jacket components. Results showed reasonable agreement with the available industry data, and that the jacket mass is mainly driven by water depth, but hub height and tower-head mass become more influential at greater turbine ratings. A larger sensitivity of the structural mass to wave height and target eigenfrequency was observed for the deepest water conditions (>40 m). Thus, techno-economic analyses using this model should be based on accurate estimates of actual metocean conditions and turbine parameters, especially for deep waters. Finally, the relationships derived from this study will inform the National Renewable Energy Laboratory's offshore balance of system cost model, and they will be used to evaluate the impact of changes in technology on the offshore wind levelized cost of energy.
Colston, Josh; Saboyá, Martha
2013-05-01
We present an example of a tool for quantifying the burden, the population in need of intervention, and the resources needed for the control of soil-transmitted helminth (STH) infection at multiple administrative levels for the region of Latin America and the Caribbean (LAC). The tool relies on published STH prevalence data along with data on the distribution of several STH transmission determinants for 12,273 sub-national administrative units in 22 LAC countries taken from national censuses. Data on these determinants were aggregated into a single risk index based on a conceptual framework, and the statistical significance of the association between this index and the STH prevalence indicators was tested using simple linear regression. The coefficient and constant from the output of this regression were then entered into a regression formula that was applied to the risk index values for all of the administrative units in order to model the estimated prevalence of each STH species. We then combined these estimates with population data, treatment thresholds and unit cost data to calculate total control costs. The model predicts an annual cost for the procurement of preventive chemotherapy of around US$ 1.7 million and a total cost of US$ 47 million for implementing a comprehensive STH control programme targeting an estimated 78.7 million school-aged children according to the WHO guidelines throughout the entirety of the countries included in the study. Considerable savings to this cost could potentially be made by embedding STH control interventions within existing health programmes and systems. A study of this scope is subject to many limitations that restrict the interpretation of the results and the uses to which its findings may be put. We discuss several of these limitations.
Animal Models and Bone Histomorphometry: Translational Research for the Human Research Program
NASA Technical Reports Server (NTRS)
Sibonga, Jean D.
2010-01-01
This slide presentation reviews the use of animal models to research and inform bone morphology, in particular relating to human research in bone loss as a result of low gravity environments. Reasons for using animal models as tools for human research programs include time efficiency, cost-effectiveness, the feasibility of invasive measures, and predictability, as some models are predictive of drug effects.
NASA Astrophysics Data System (ADS)
Jain, A.
2017-08-01
Computer-based methods can aid in the discovery of leads and can potentially eliminate the chemical synthesis and screening of many irrelevant compounds, saving both time and cost. Molecular modeling systems are powerful tools for building, visualizing, analyzing and storing models of complex molecular structures that help interpret structure-activity relationships. The use of molecular mechanics and dynamics techniques and software in computer-aided drug design, together with statistical analysis, is a powerful tool for the medicinal chemist seeking to synthesize effective therapeutic drugs with minimal side effects.
cryoem-cloud-tools: A software platform to deploy and manage cryo-EM jobs in the cloud.
Cianfrocco, Michael A; Lahiri, Indrajit; DiMaio, Frank; Leschziner, Andres E
2018-06-01
Access to streamlined computational resources remains a significant bottleneck for new users of cryo-electron microscopy (cryo-EM). To address this, we have developed tools that will submit cryo-EM analysis routines and atomic model building jobs directly to Amazon Web Services (AWS) from a local computer or laptop. These new software tools ("cryoem-cloud-tools") have incorporated optimal data movement, security, and cost-saving strategies, giving novice users access to complex cryo-EM data processing pipelines. Integrating these tools into the RELION processing pipeline and graphical user interface, we determined a 2.2 Å structure of β-galactosidase in ∼55 hours on AWS. We implemented a similar strategy to submit Rosetta atomic model building and refinement to AWS. These software tools dramatically reduce the barrier for entry of new users to cloud computing for cryo-EM and are freely available at cryoem-tools.cloud. Copyright © 2018. Published by Elsevier Inc.
The financial impact of a clinical academic practice partnership.
Greene, Mary Ann; Turner, James
2014-01-01
New strategies to provide clinical experiences for nursing students have caused nursing schools and hospitals to evaluate program costs. A Microsoft Excel model, which captures costs and associated benefits, was developed and is described here. The financial analysis shows that the Clinical Academic Practice Program framework for nursing clinical education, often preferred by students, can offer financial advantages to participating hospitals and schools of nursing. The model is potentially a tool for schools of nursing to enlist hospitals and to help manage expenses of clinical education. Hospitals may also use the Hospital Nursing Unit Staffing and Expense Worksheet in planning staffing when students are assigned to units and the cost/benefit findings to enlist management support.
simuwatt - A Tablet Based Electronic Auditing Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macumber, Daniel; Parker, Andrew; Lisell, Lars
2014-05-08
'simuwatt Energy Auditor' (TM) is a new tablet-based electronic auditing tool that is designed to dramatically reduce the time and cost to perform investment-grade audits and improve quality and consistency. The tool uses the U.S. Department of Energy's OpenStudio modeling platform and integrated Building Component Library to automate modeling and analysis. simuwatt's software-guided workflow helps users gather required data, and provides the data in a standard electronic format that is automatically converted to a baseline OpenStudio model for energy analysis. The baseline energy model is calibrated against actual monthly energy use to ASHRAE Standard 14 guidelines. Energy conservation measures from the Building Component Library are then evaluated using OpenStudio's parametric analysis capability. Automated reporting creates audit documents that describe recommended packages of energy conservation measures. The development of this tool was partially funded by the U.S. Department of Defense's Environmental Security Technology Certification Program. As part of this program, the tool is being tested at 13 buildings on 5 Department of Defense sites across the United States. Results of the first simuwatt audit tool demonstration are presented in this paper.
Duane, B G; Freeman, R; Richards, D; Crosbie, S; Patel, P; White, S; Humphris, G
2017-03-01
To commission dental services for vulnerable (special care) patient groups effectively, consistently and fairly, an evidence base of the costs involved is needed. The simplified Case Mix Tool (sCMT) can assess treatment mode complexity for these patient groups. The aim was to determine whether the sCMT can be used to identify the costs of service provision. Patients (n=495) attending the Sussex Community NHS Trust Special Care Dental Service for care were assessed using the sCMT. Recorded variables included sCMT score and costs (staffing, laboratory fees, etc.), as well as patient age, new-patient status, and use of general anaesthetic/intravenous sedation. Statistical analysis (adjusted linear regression modelling) compared sCMT scores with costs; sensitivity analyses of the costings to age, new-patient status and sedation use were then undertaken. Regression tables were produced to present estimates of service costs. Costs increased with sCMT total scale and single-item values in a predictable manner in all analyses except for 'cooperation'. Costs increased with the use of IV sedation, with each rising level of the sCMT, and with complexity in every sCMT category except cooperation. Costs increased with the complexity of treatment mode as measured by sCMT scores. Measures such as the sCMT can provide predictions of the resource allocations required when commissioning special care dental services. Copyright© 2017 Dennis Barber Ltd.
Gaziano, Thomas; Abrahams-Gessel, Shafika; Surka, Sam; Sy, Stephen; Pandya, Ankur; Denman, Catalina A.; Mendoza, Carlos; Puoane, Thandi; Levitt, Naomi S.
2016-01-01
In low-resource settings, a physician is not always available. We recently demonstrated that community health workers—instead of physicians or nurses—can efficiently screen adults for cardiovascular disease in South Africa, Mexico, and Guatemala. In this analysis we sought to determine the health and economic impacts of shifting this screening to community health workers equipped with either a paper-based or a mobile phone–based screening tool. We found that screening by community health workers was very cost-effective or even cost-saving in all three countries, compared to the usual clinic-based screening. The mobile application emerged as the most cost-effective strategy because it could save more lives than the paper tool at minimal extra cost. Our modeling indicated that screening by community health workers, combined with improved treatment rates, would increase the number of deaths averted from 15,000 to 110,000, compared to standard care. Policy makers should promote greater acceptance of community health workers by both national populations and health professionals and should increase their commitment to treating cardiovascular disease and making medications available. PMID:26355056
Leach, A W; Mumford, J D
2008-01-01
The Pesticide Environmental Accounting (PEA) tool provides a monetary estimate of environmental and health impacts per hectare-application for any pesticide. The model combines the Environmental Impact Quotient method with a methodology for absolute estimates of external pesticide costs in the UK, USA and Germany. Many countries lack the resources for intensive assessments of external pesticide costs. The model therefore converts external pesticide costs estimated for the UK, USA and Germany into estimates for Mediterranean countries. Economic and policy applications include estimating the impacts of pesticide reduction policies or the benefits of technologies that replace pesticides, such as the sterile insect technique. The system integrates disparate data and approaches into a single logical method. The assumptions in the system provide transparency and consistency, but at the cost of some specificity and precision, a reasonable trade-off for a method that provides both comparative estimates of pesticide impacts and area-based assessments of absolute impacts.
Afzali, Anita; Ogden, Kristine; Friedman, Michael L; Chao, Jingdong; Wang, Anthony
2017-04-01
Inflammatory bowel disease (IBD) (e.g. ulcerative colitis [UC] and Crohn's disease [CD]) severely impacts patient quality-of-life. Moderate-to-severe disease is often treated with biologics requiring infusion therapy, adding incremental costs beyond drug costs. This study evaluates US hospital-based infusion services costs for treatment of UC or CD patients receiving infliximab or vedolizumab therapy. A model was developed, estimating annual costs of providing monitored infusions using an activity-based costing framework approach. Multiple sources (published literature, treatment product inserts) informed base-case model input estimates. The total modeled per-patient infusion therapy costs in Year 1 with infliximab and vedolizumab were $38,782 and $41,320, respectively, and in Year 2+, $49,897 and $36,197, respectively. Drug acquisition cost was the largest total costs driver (90-93%), followed by costs associated with hospital-based infusion provision: labor (53-56%, non-drug costs), allocated overhead (23%, non-drug costs), non-labor (23%, non-drug costs), and laboratory (7-10%, non-drug costs). Limitations included reliance on published estimates, base-case cost estimates for the infusion drug and supplies that do not account for volume pricing, the assumption of a small hospital infusion center, and, because the model adopts the hospital perspective, the exclusion of costs to the patient from base-case infusion administration cost estimates. This model is an early step towards a framework to fully analyze infusion therapies' associated costs. Given the lack of published data, it would be beneficial for hospital administrators to assess total costs and trade-offs with alternative means of providing biologic therapies. This analysis highlights the value to hospital administrators of assessing cost associated with infusion patient mix to make more informed resource allocation decisions.
As the landscape for reimbursement changes, tools for evaluating the costs of infusion therapy may help hospital administrators make informed choices and weigh trade-offs associated with providing infusion services for IBD patients.
Evanoff, Bradley; Kymes, Steve
2010-06-01
The aim of this study was to evaluate the costs associated with pre-employment nerve conduction testing as a screening tool for carpal tunnel syndrome (CTS) in the workplace. We used a Markov decision analysis model to compare the costs associated with a strategy of screening all prospective employees for CTS and not hiring those with abnormal nerve conduction, versus a strategy of not screening for CTS. The variables in our model included employee turnover rate, the incidence of CTS, the prevalence of median nerve conduction abnormalities, the relative risk of developing CTS conferred by abnormal nerve conduction screening, the costs of pre-employment screening, and the workers' compensation costs to the employer for each case of CTS. In our base case, total costs from the perspective of the employer (cost of screening plus workers' compensation costs associated with CTS) were higher when screening was used. Median costs per employee position over five years were US$503 for the screening strategy versus US$200 for a no-screening strategy. A sensitivity analysis showed that a strategy of screening was cost-beneficial from the perspective of the employer only under a few circumstances. Using Monte Carlo simulation varying all parameters, we found a 30% probability that screening would be cost-beneficial. A strategy of pre-employment screening for CTS should be carefully evaluated for yield and social consequences before being implemented. Our model suggests such screening is not appropriate for most employers.
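The core trade-off in this screening model can be expressed in expected-cost terms: screening adds a per-applicant cost and excludes high-risk applicants, which lowers CTS incidence among those hired. A minimal deterministic sketch under simplified assumptions; the inputs are hypothetical, and the study itself used a full Markov model with Monte Carlo simulation:

```python
def expected_cost_no_screen(p_cts, comp_cost):
    # Expected employer cost per position: CTS incidence x workers' compensation cost.
    return p_cts * comp_cost

def expected_cost_screen(p_cts, comp_cost, prev_abnormal, rel_risk, screen_cost):
    # Incidence among applicants with normal nerve conduction, derived from
    # overall incidence p = (1-q)*p_n + q*r*p_n, where q is the prevalence of
    # abnormal screens and r the relative risk they confer.
    p_normal = p_cts / ((1 - prev_abnormal) + prev_abnormal * rel_risk)
    # On average 1/(1-q) applicants must be screened per hire, since abnormals
    # are not hired.
    return screen_cost / (1 - prev_abnormal) + p_normal * comp_cost

# Hypothetical inputs: 5% CTS incidence, $5,000 per case, 20% abnormal-screen
# prevalence, relative risk 2.0, $150 per screening test.
no_screen = expected_cost_no_screen(0.05, 5000)
screen = expected_cost_screen(0.05, 5000, 0.20, 2.0, 150)
```

With these inputs, screening costs more per position than not screening, directionally consistent with the study's base-case finding.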
An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System
1989-09-01
residual and it is described as the residual divided by its standard deviation (13:App A,17). Neter, Wasserman, and Kutner, in Applied Linear Regression Models...others. Applied Linear Regression Models. Homewood IL: Irwin, 1983. 19. Raduchel, William J. "A Professional’s Perspective on User-Friendliness," Byte
From Physical Process to Economic Cost - Integrated Approaches of Landslide Risk Assessment
NASA Astrophysics Data System (ADS)
Klose, M.; Damm, B.
2014-12-01
The nature of landslides is complex in many respects, with landslide hazard and impact being dependent on a variety of factors. This obviously requires an integrated assessment for fundamental understanding of landslide risk. Integrated risk assessment, according to the approach presented in this contribution, implies combining prediction of future landslide occurrence with analysis of landslide impact in the past. A critical step for assessing landslide risk in integrated perspective is to analyze what types of landslide damage affected people and property in which way and how people contributed and responded to these damage types. In integrated risk assessment, the focus is on systematic identification and monetization of landslide damage, and analytical tools that allow deriving economic costs from physical landslide processes are at the heart of this approach. The broad spectrum of landslide types and process mechanisms as well as nonlinearity between landslide magnitude, damage intensity, and direct costs are some main factors explaining recent challenges in risk assessment. The two prevailing approaches for assessing the impact of landslides in economic terms are cost survey (ex-post) and risk analysis (ex-ante). Both approaches are able to complement each other, but yet a combination of them has not been realized so far. It is common practice today to derive landslide risk without considering landslide process-based cause-effect relationships, since integrated concepts or new modeling tools expanding conventional methods are still widely missing. The approach introduced in this contribution is based on a systematic framework that combines cost survey and GIS-based tools for hazard or cost modeling with methods to assess interactions between land use practices and landslides in historical perspective. 
Fundamental understanding of landslide risk also requires knowledge of the economic and fiscal relevance of landslide losses, which is why analysis of their impact on public budgets is a further component of this approach. In integrated risk assessment, the combination of methods plays an important role, with the objective of collecting and integrating complex data sets on landslide risk.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brackney, L.
Broadly accessible, low cost, accurate, and easy-to-use energy auditing tools remain out of reach for managers of the aging U.S. building population (over 80% of U.S. commercial buildings are more than 10 years old*). concept3D and NREL's commercial buildings group will work to translate and extend NREL's existing spreadsheet-based energy auditing tool for a browser-friendly and mobile-computing platform. NREL will also work with concept3D to further develop a prototype geometry capture and materials inference tool operable on a smart phone/pad platform. These tools will be developed to interoperate with NREL's Building Component Library and OpenStudio energy modeling platforms, and will be marketed by concept3D to commercial developers, academic institutions and governmental agencies. concept3D is NREL's lead developer and subcontractor of the Building Component Library.
NASA Astrophysics Data System (ADS)
Stamenkovic, Dragan D.; Popovic, Vladimir M.
2015-02-01
Warranty is a powerful marketing tool, but it always involves additional costs to the manufacturer. In order to reduce these costs and make use of warranty's marketing potential, the manufacturer needs to master techniques for warranty cost prediction based on the reliability characteristics of the product. In this paper, a combined free-replacement and pro-rata warranty policy is analysed as the warranty model for one type of light bulb. Since operating conditions have a great impact on product reliability, they need to be considered in such an analysis. A neural network model is used to predict light bulb reliability characteristics based on data from tests of light bulbs under various operating conditions. Compared with a linear regression model used in the literature for similar tasks, the neural network model proved to be a more accurate method for such prediction. Reliability parameters obtained in this way are later used in Monte Carlo simulation to predict the times to failure needed for warranty cost calculation. The results of the analysis make it possible for the manufacturer to choose the optimal warranty policy based on expected product operating conditions. In this way, the manufacturer can lower costs and increase profit.
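The combined free-replacement/pro-rata structure lends itself to the Monte Carlo approach the abstract describes: draw times to failure from the fitted reliability distribution and price each failure under the policy. A minimal sketch with a hypothetical Weibull life distribution and hypothetical policy parameters, not the paper's fitted values:

```python
import random

def warranty_cost_per_unit(price, w_free, w_total, shape, scale, n=100_000, seed=1):
    """Expected manufacturer cost per unit sold under a combined policy:
    free replacement before w_free, linear pro-rata rebate between w_free
    and w_total, nothing after w_total."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # random.weibullvariate(alpha, beta): alpha is scale, beta is shape.
        t = rng.weibullvariate(scale, shape)  # simulated time to first failure
        if t < w_free:
            total += price                    # free replacement
        elif t < w_total:
            # rebate shrinks linearly to zero across the pro-rata window
            total += price * (w_total - t) / (w_total - w_free)
    return total / n

# Hypothetical: $3 bulb, 500 h free-replacement window, 1500 h total warranty,
# Weibull(shape=2, scale=2000 h) lifetimes.
cost = warranty_cost_per_unit(3.0, 500, 1500, 2.0, 2000)
```

Repeating this calculation for different policy windows lets the manufacturer compare expected warranty costs across candidate policies, which is the decision the paper supports.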
A Web-based cost-effective training tool with possible application to brain injury rehabilitation.
Wang, Peijun; Kreutzer, Ina Anna; Bjärnemo, Robert; Davies, Roy C
2004-06-01
Virtual reality (VR) has provoked enormous interest in the medical community. In particular, VR offers therapists new approaches for improving rehabilitation effects. However, most of these VR assistant tools are not very portable, extensible or economical. Due to the vast amount of 3D data, they are not suitable for Internet transfer. Furthermore, in order to run these VR systems smoothly, special hardware devices are needed. As a result, existing VR assistant tools tend to be available in hospitals but not in patients' homes. To overcome these disadvantages, as a case study, this paper proposes a Web-based Virtual Ticket Machine, called WBVTM, using VRML [VRML Consortium, The Virtual Reality Modeling Language: International Standard ISO/IEC DIS 14772-1, 1997, available at ], Java and EAI (External Authoring Interface) [Silicon Graphics, Inc., The External Authoring Interface (EAI), available at ], to help people with acquired brain injury (ABI) to relearn basic living skills at home at a low cost. As these technologies are open standard and feature usability on the Internet, WBVTM achieves the goals of portability, easy accessibility and cost-effectiveness.
Gerwin, Philip M; Norinsky, Rada M; Tolwani, Ravi J
2018-03-01
Laboratory animal programs and core laboratories often set service rates based on cost estimates. However, actual costs may be unknown, and service rates may not reflect the actual cost of services. Accurately evaluating the actual costs of services can be challenging and time-consuming. We used a time-driven activity-based costing (ABC) model to determine the cost of services provided by a resource laboratory at our institution. The time-driven approach is a more efficient approach to calculating costs than using a traditional ABC model. We calculated only 2 parameters: the time required to perform an activity and the unit cost of the activity based on employee cost. This method allowed us to rapidly and accurately calculate the actual cost of services provided, including microinjection of a DNA construct, microinjection of embryonic stem cells, embryo transfer, and in vitro fertilization. We successfully implemented a time-driven ABC model to evaluate the cost of these services and the capacity of labor used to deliver them. We determined how actual costs compared with current service rates. In addition, we determined that the labor supplied to conduct all services (10,645 min/wk) exceeded the practical labor capacity (8400 min/wk), indicating that the laboratory team was highly efficient and that additional labor capacity was needed to prevent overloading of the current team. Importantly, this time-driven ABC approach allowed us to establish a baseline model that can easily be updated to reflect operational changes or changes in labor costs. We demonstrated that a time-driven ABC model is a powerful management tool that can be applied to other core facilities as well as to entire animal programs, providing valuable information that can be used to set rates based on the actual cost of services and to improve operating efficiency.
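The two parameters of the time-driven model described in this abstract reduce to a simple product (time required × unit cost of capacity), and the same time data yields the capacity comparison reported. A sketch using the abstract's weekly capacity figures; the per-service minutes and labor rate are hypothetical:

```python
def service_cost(minutes, cost_per_minute):
    # Time-driven ABC: cost of a service = time required x unit cost of capacity.
    return minutes * cost_per_minute

def utilization(supplied_min_per_wk, practical_min_per_wk):
    # Ratio > 1.0 indicates the team is overloaded relative to practical capacity.
    return supplied_min_per_wk / practical_min_per_wk

embryo_transfer = service_cost(45, 1.10)  # hypothetical: 45 min at $1.10/min labor
load = utilization(10_645, 8_400)         # abstract's figures: ~1.27, overloaded
```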
2010-12-01
processes. Novice estimators must often use these complicated cost estimation tools (e.g., ACEIT, SEER-H, SEER-S, PRICE-H, PRICE-S, etc.) until...However, the thesis will leverage the processes embedded in cost estimation tools such as the Automated Cost Estimating Integration Tool (ACEIT) and the
Assessing the cost of a cardiology residency program with a cost construction model.
Franzini, L; Chen, S C; McGhie, A I; Low, M D
1999-09-01
Although the total costs of graduate medical education are difficult to quantify, this information is of great importance in planning over the next decade. A cost construction model was used to quantify the costs of teaching faculty, cardiology fellows' salaries and benefits, overhead (physical plant, equipment, and support staff), and other costs associated with the cardiology residency program at the University of Texas-Houston during the 1996 to 1997 academic year. Surveys of cardiology faculty and fellows, checked by the program director, were conducted to determine the time spent in teaching activities; access to institutional and departmental financial records was obtained to quantify associated costs. The model was then developed and examined for a range of assumptions concerning cardiology fellows' productivity, replacement costs, and the cost allocation of activities jointly producing clinical care and education. The instructional cost of training (cost of didactic, direct clinical supervision, preparation for teaching, and teaching-related administration, plus the support of the teaching program) was estimated at $73,939 per cardiology fellow per year. This cost was less than the estimated replacement value of the teaching and clinical services provided by cardiology fellows, $100,937 per cardiology fellow per year. Sensitivity analysis, with different assumptions on cardiology fellows' productivity and replacement costs, varied the cost estimates but generally represented the cardiology residency program as an asset. Cost construction models can be used as a tool to estimate variations in resource requirements resulting from changes in curriculum or educators' costs. In this residency, the value of the teaching and clinical services provided by cardiology fellows exceeded the cost of the resources used in the educational program.
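The cost construction model's bottom line is a simple comparison: the residency is a net asset when the replacement value of the fellows' services exceeds the instructional cost. A one-line sketch using the abstract's reported figures:

```python
def net_program_value(replacement_value, instructional_cost):
    # Positive value -> the residency program is a net asset to the institution.
    return replacement_value - instructional_cost

# Abstract's figures: $100,937 replacement value vs. $73,939 instructional cost
# per cardiology fellow per year.
net = net_program_value(100_937, 73_939)
```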
Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation
NASA Technical Reports Server (NTRS)
DePriest, Douglas; Morgan, Carolyn
2003-01-01
The cost and safety goals for NASA's next generation of reusable launch vehicle (RLV) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase is focused on assessing the performance of these models to accurately predict the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.
Cecchel, S; Chindamo, D; Turrini, E; Carnevale, C; Cornacchia, G; Gadola, M; Panvini, A; Volta, M; Ferrario, D; Golimbioschi, R
2018-02-01
This study presents a modelling system to evaluate the impact of weight reduction in light commercial vehicles with diesel engines on air quality and greenhouse gas emissions. The PROPS model assesses the emissions of one vehicle in the aforementioned category and its corresponding reduced-weight version. The results serve as an input to the RIAT+ tool, an air quality integrated assessment modelling system. This paper applies the tools in a case study in the Lombardy region (Italy) and discusses the input data pre-processing, the PROPS-RIAT+ modelling system runs, and the results. Copyright © 2017 Elsevier B.V. All rights reserved.
Human performance cognitive-behavioral modeling: a benefit for occupational safety.
Gore, Brian F
2002-01-01
Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.
NASA Technical Reports Server (NTRS)
Jack, John; Kwan, Eric; Wood, Milana
2011-01-01
PRICE H was introduced into the JPL cost estimation tool set circa 2003. It became more widely available at JPL when IPAO funded the NASA-wide site license for all NASA centers. PRICE H was mainly used as one of the cost tools to validate proposal grassroots cost estimates. Program offices at JPL view PRICE H as an additional crosscheck to Team X (JPL Concurrent Engineering Design Center) estimates. PRICE H became widely accepted circa 2007 at JPL when the program offices moved away from grassroots cost estimation for Step 1 proposals. PRICE H is now one of the key cost tools used for cost validation, cost trades, and independent cost estimates.
1988-11-17
cost. Simplicity will be of the essence, to keep costs down in what are likely...achieved by the use of a web in the extruder crosshead. In the cable lower layer in each compartment...one of the field models, JASMINE, "appears to offer a reliable tool". Also given are surface temperatures and heat for
RFI and SCRIMP Model Development and Verification
NASA Technical Reports Server (NTRS)
Loos, Alfred C.; Sayre, Jay
2000-01-01
Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies for the manufacturing of primary composite structures in the aircraft industry as well as in infrastructure. A great deal of work is still needed to reduce the costly trial-and-error methods of VARTM processing that are currently in practice. A computer simulation model of the VARTM process would provide a cost-effective tool in the manufacturing of composites utilizing this technique. Therefore, the objective of this research was to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool that would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible for all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed. This configuration evolved from varying the process constraints in the modeling of several different composite panels, considering such factors as infiltration time, the number of vacuum ports, and possible areas of void entrapment.
Kemp, Chandler E; Ravikumar, Arvind P; Brandt, Adam R
2016-04-19
We present a tool for modeling the performance of methane leak detection and repair programs that can be used to evaluate the effectiveness of detection technologies and proposed mitigation policies. The tool uses a two-state Markov model to simulate the evolution of methane leakage from an artificial natural gas field. Leaks are created stochastically, drawing from the current understanding of the frequency and size distributions at production facilities. Various leak detection and repair programs can be simulated to determine the rate at which each would identify and repair leaks. Integrating the methane leakage over time enables a meaningful comparison between technologies, using both economic and environmental metrics. We simulate four existing or proposed detection technologies: flame ionization detection, manual infrared camera, automated infrared drone, and distributed detectors. Comparing these four technologies, we found that over 80% of simulated leakage could be mitigated with a positive net present value, although the maximum benefit is realized by selectively targeting larger leaks. Our results show that low-cost leak detection programs can rely on high-cost technology, as long as it is applied in a way that allows for rapid detection of large leaks. Any strategy to reduce leakage requires careful consideration of the differences between low-cost technologies and low-cost programs.
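A toy version of the two-state (leaking / not leaking) Markov simulation this abstract describes might look like the following. The site count, daily leak-start probability, lognormal size distribution, and periodic-survey repair rule are all illustrative assumptions, not the paper's calibrated inputs:

```python
import random

def simulate_leakage(n_sites=100, days=365, p_start=0.002,
                     survey_interval=90, min_detectable=1.0, seed=42):
    rng = random.Random(seed)
    leaks = [0.0] * n_sites           # current leak rate per site, kg/h
    emitted = 0.0
    for day in range(days):
        for i in range(n_sites):
            # not-leaking -> leaking: a new leak starts with a small daily probability,
            # with a heavy-tailed (lognormal) size as a stand-in for field distributions
            if leaks[i] == 0.0 and rng.random() < p_start:
                leaks[i] = rng.lognormvariate(0.0, 1.5)
        if day > 0 and day % survey_interval == 0:
            # leaking -> repaired: the survey finds and fixes all detectable leaks
            leaks = [r if r < min_detectable else 0.0 for r in leaks]
        emitted += 24.0 * sum(leaks)  # kg of methane emitted during this day
    return emitted
```

Comparing runs with different `survey_interval` or `min_detectable` values gives the kind of technology comparison the paper performs, here in much-simplified form.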
NASA Astrophysics Data System (ADS)
Nyboer, John
Issues related to the reduction of greenhouse gases are encumbered with uncertainties for decision makers. Unfortunately, conventional analytical tools generate widely divergent forecasts of the effects of actions designed to mitigate these emissions. "Bottom-up" models show the costs of reducing emissions attained through the penetration of efficient technologies to be low or negative. In contrast, more aggregate "top-down" models show costs of reduction to be high. The methodological approaches of the different models used to simulate energy consumption generate, in part, the divergence found in model outputs. To address this uncertainty and bring convergence, I use a technology-explicit model that simulates turnover of equipment stock as a function of detailed data on equipment costs and stock characteristics and of verified behavioural data related to equipment acquisition and retrofitting. Such detail can inform the decision maker of the effects of actions to reduce greenhouse gases due to changes in (1) technology stocks, (2) products or services, or (3) the mix of fuels used. This thesis involves two main components: (1) the development of a quantitative model to analyse energy demand and (2) the application of this tool to a policy issue, abatement of CO2 emissions. The analysis covers all of Canada by sector (8 industrial subsectors, residential, commercial) and region. An electricity supply model to provide local electricity prices supplemented the quantitative model. Forecasts of growth and structural change were provided by national macroeconomic models. Seven different simulations were applied to each sector in each region, including a base case run and three runs simulating emissions charges of $75/tonne, $150/tonne and $225/tonne CO2. The analysis reveals that there is significant variation in the costs and quantity of emissions reduction by sector and region.
Aggregated results show that Canada can meet both stabilisation targets (1990 levels of emissions by 2000) and reduction targets (20% less than 1990 by 2010), but the cost of meeting reduction targets exceeds $225/tonne. After a review of the results, I provide several reasons for concluding that the costs are overestimated and the emissions reduction underestimated. I also provide several future research options.
Navigation Constellation Design Using a Multi-Objective Genetic Algorithm
2015-03-26
programs. This specific tool not only offers high fidelity simulations, but it also offers the visual aid provided by STK. The ability to...MATLAB and STK. STK is a program that allows users to model, analyze, and visualize space systems. Users can create objects such as satellites and...position dilution of precision (PDOP) and system cost. This thesis utilized Satellite Tool Kit (STK) to calculate PDOP values of navigation
Cost-effectiveness of human papillomavirus vaccination in the United States.
Chesson, Harrell W; Ekwueme, Donatus U; Saraiya, Mona; Markowitz, Lauri E
2008-02-01
We describe a simplified model, based on the current economic and health effects of human papillomavirus (HPV), to estimate the cost-effectiveness of HPV vaccination of 12-year-old girls in the United States. Under base-case parameter values, the estimated cost per quality-adjusted life year gained by vaccination in the context of current cervical cancer screening practices in the United States ranged from $3,906 to $14,723 (2005 US dollars), depending on factors such as whether herd immunity effects were assumed; the types of HPV targeted by the vaccine; and whether the benefits of preventing anal, vaginal, vulvar, and oropharyngeal cancers were included. The results of our simplified model were consistent with published studies based on more complex models when key assumptions were similar. This consistency is reassuring because models of varying complexity will be essential tools for policy makers in the development of optimal HPV vaccination strategies.
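The headline "cost per quality-adjusted life year" figures in abstracts like this one come from an incremental cost-effectiveness ratio computed over discounted cost and QALY streams. A generic sketch of that calculation, not the authors' actual model:

```python
def icer(cost_new, cost_old, qalys_new, qalys_old):
    # Incremental cost-effectiveness ratio: extra dollars per extra QALY gained.
    d_cost = cost_new - cost_old
    d_qaly = qalys_new - qalys_old
    if d_qaly <= 0:
        raise ValueError("new strategy gains no QALYs over the comparator")
    return d_cost / d_qaly

def present_value(yearly_stream, rate=0.03):
    # Discount a stream of yearly costs or QALYs back to year 0 at a constant rate
    # (3% is a common, but here merely illustrative, discount rate).
    return sum(x / (1.0 + rate) ** t for t, x in enumerate(yearly_stream))
```

In practice, both the cost and QALY streams would be discounted with `present_value` before being passed to `icer`.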
High Thermal Conductivity and High Wear Resistance Tool Steels for cost-effective Hot Stamping Tools
NASA Astrophysics Data System (ADS)
Valls, I.; Hamasaiid, A.; Padré, A.
2017-09-01
In hot stamping/press hardening, in addition to its shaping function, the tool controls the cycle time, the quality of the stamped components (by determining the cooling rate of the stamped blank), the production costs, and the feasibility frontier for stamping a given component. During stamping, heat is extracted from the stamped blank and transported through the tool to the cooling medium in the cooling lines. Hence, the tool's thermal properties determine the cooling rate of the blank, the heat transport mechanism, stamping times and temperature distribution. The tool surface's resistance to adhesive and abrasive wear is also an important cost factor, as it determines tool durability and maintenance costs. Wear is influenced by many tool material parameters, such as the microstructure, composition, hardness level and distribution of strengthening phases, as well as the tool's working temperature. A decade ago, Rovalma developed a hot work tool steel for hot stamping that features a thermal conductivity more than double that of any conventional hot work tool steel. Since then, many complementary grades have been developed to provide tailored material solutions as a function of the production volume, degree of blank cooling and wear resistance requirements, tool geometries, tool manufacturing method, type and thickness of the blank material, etc. Recently, Rovalma has developed a new generation of high thermal conductivity, high wear resistance tool steel grades that enable the manufacture of cost-effective tools for hot stamping, increasing process productivity and reducing tool manufacturing costs and lead times. Both of these novel grades feature high wear resistance and high thermal conductivity to enhance tool durability and cut cycle times in the production of hot stamped components.
Furthermore, one of these new grades reduces tool manufacturing costs through low tool material cost and hardening via readily available gas quenching, whereas the other enables faster tool manufacturing at reduced cost by eliminating the time- and money-consuming high-temperature hardening altogether. The latter grade can be hardened from a soft delivery state, for easy machining, to 52 HRC by way of a simple low-temperature precipitation hardening. In this work, these new grades and the role of the tool material's thermal, mechanical and tribological properties, as well as their processing features, are discussed in light of enabling the manufacture of intelligent hot stamping tools.
Liu, Nan; D'Aunno, Thomas
2012-01-01
Objective: To develop simple stylized models for evaluating the productivity and cost-efficiencies of different practice models to involve nurse practitioners (NPs) in primary care, and in particular to generate insights on what affects the performance of these models and how. Data Sources and Study Design: The productivity of a practice model is defined as the maximum number of patients that can be accounted for by the model under a given timeliness-to-care requirement; cost-efficiency is measured by the corresponding annual cost per patient in that model. Appropriate queueing analysis is conducted to generate formulas and values for these two performance measures. Model parameters for the analysis are extracted from the previous literature and survey reports. Sensitivity analysis is conducted to investigate the model performance under different scenarios and to verify the robustness of findings. Principal Findings: Employing an NP, whose salary is usually lower than a primary care physician's, may not be cost-efficient, in particular when the NP's capacity is underutilized. Besides provider service rates, workload allocation among providers is one of the most important determinants of the cost-efficiency of a practice model involving NPs. Capacity pooling among providers could be a helpful strategy to improve efficiency in care delivery. Conclusions: The productivity and cost-efficiency of a practice model depend heavily on how providers organize their work and a variety of other factors related to the practice environment. Queueing theory provides useful tools to take these factors into account in making strategic decisions on staffing and panel size selection for a practice model. PMID:22092009
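As one concrete reading of "appropriate queueing analysis", an M/M/1 queue links panel size to the expected wait for an appointment; the service rate, visit rate, and wait target below are illustrative assumptions, not the paper's parameters:

```python
def mm1_wait(lam, mu):
    # Mean queueing delay (days) for Poisson arrivals and exponential service
    # in a single-provider M/M/1 queue; unstable if demand meets/exceeds capacity.
    if lam >= mu:
        return float("inf")
    return lam / (mu * (mu - lam))

def max_panel(mu, visits_per_patient_year, max_wait_days, workdays=250):
    # Largest panel whose daily appointment demand keeps the mean wait under target;
    # this is the "productivity" measure (patients accounted for) in the abstract.
    panel = 0
    while True:
        lam = (panel + 1) * visits_per_patient_year / workdays
        if mm1_wait(lam, mu) > max_wait_days:
            return panel
        panel += 1
```

Dividing a provider's annual salary by the resulting panel size then gives an annual cost per patient, the paper's cost-efficiency measure, which is how a lower-salaried but underutilized NP can still fail to be cost-efficient.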
A Methodology for Developing Army Acquisition Strategies for an Uncertain Future
2007-01-01
manuscript for publication. Acronyms ABP Assumption-Based Planning ACEIT Automated Cost Estimating Integrated Tool ACR Armored Cavalry Regiment ACTD...decisions. For example, they employ the Automated Cost Estimating Integrated Tools (ACEIT) to simplify life cycle cost estimates; other tools are
Benefits of a Unified LaSRS++ Simulation for NAS-Wide and High-Fidelity Modeling
NASA Technical Reports Server (NTRS)
Glaab, Patricia; Madden, Michael
2014-01-01
The LaSRS++ high-fidelity vehicle simulation was extended in 2012 to support a NAS-wide simulation mode. Since the initial proof of concept, the LaSRS++ NAS-wide simulation has been maturing into a research-ready tool. A primary benefit of this new capability is the consolidation of the two modeling paradigms under a single framework to save cost, facilitate iterative concept testing between the two tools, and promote communication and model sharing between user communities at Langley. Specific benefits of each type of modeling are discussed along with the expected benefits of the unified framework. Current capability details of the LaSRS++ NAS-wide simulation are provided, including the visualization tool, live data interface, trajectory generators, terminal routing for arrivals and departures, maneuvering, re-routing, navigation, winds, and turbulence. The plan for future development is also described.
Emerging from the bottleneck: Benefits of the comparative approach to modern neuroscience
Brenowitz, Eliot A.; Zakon, Harold H.
2015-01-01
Neuroscience historically exploited a wide diversity of animal taxa. Recently, however, research focused increasingly on a few model species. This trend accelerated with the genetic revolution, as genomic sequences and genetic tools became available for a few species, which formed a bottleneck. This coalescence on a small set of model species comes with several costs often not considered, especially in the current drive to use mice explicitly as models for human diseases. Comparative studies of strategically chosen non-model species can complement model species research and yield more rigorous studies. As genetic sequences and tools become available for many more species, we are poised to emerge from the bottleneck and once again exploit the rich biological diversity offered by comparative studies. PMID:25800324
Whitfield, Geoffrey P; Meehan, Leslie A; Maizlish, Neil; Wendel, Arthur M
2016-01-01
The Integrated Transport and Health Impact Model (ITHIM) is a comprehensive tool that estimates the hypothetical health effects of transportation mode shifts through changes to physical activity, air pollution, and injuries. The purpose of this paper is to describe the implementation of ITHIM in greater Nashville, Tennessee (USA), describe important lessons learned, and serve as an implementation guide for other practitioners and researchers interested in running ITHIM. As might be expected in other metropolitan areas in the US, not all the required calibration data were available locally. We utilized data from local, state, and federal sources to fulfill the 14 ITHIM calibration items, which include disease burdens, travel habits, physical activity participation, air pollution levels, and traffic injuries and fatalities. Three scenarios were developed that modeled stepwise increases in walking and bicycling, and one that modeled reductions in car travel. Cost savings estimates were calculated by scaling national-level, disease-specific direct treatment costs and indirect lost productivity costs to the greater Nashville population of approximately 1.5 million. Implementation required approximately one year of intermittent, part-time work. Across the range of scenarios, results suggested that 24 to 123 deaths per year could be averted in the region through a 1%–5% reduction in the burden of several chronic diseases. This translated into $10–$63 million in estimated direct and indirect cost savings per year. Implementing ITHIM in greater Nashville has provided local decision makers with important information on the potential health effects of transportation choices. Other jurisdictions interested in ITHIM might find the Nashville example a useful guide to streamline the effort required to calibrate and run the model. PMID:27595067
2015 Cost of Wind Energy Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mone, Christopher; Hand, Maureen; Bolinger, Mark
This report uses representative commercial projects to estimate the levelized cost of energy (LCOE) for both land-based and offshore wind plants in the United States for 2015. Scheduled to be published on an annual basis, the analysis relies on both market and modeled data to maintain an up-to-date understanding of wind generation cost trends and drivers. It is intended to provide insight into current component-level costs and a basis for understanding variability in the LCOE across the industry. Data and tools developed by the National Renewable Energy Laboratory (NREL) are used in this analysis to inform wind technology cost projections, goals, and improvement opportunities.
2014 Cost of Wind Energy Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mone, Christopher; Stehly, Tyler; Maples, Ben
2015-10-01
This report uses representative commercial projects to estimate the levelized cost of energy (LCOE) for both land-based and offshore wind plants in the United States for 2014. Scheduled to be published on an annual basis, the analysis relies on both market and modeled data to maintain an up-to-date understanding of wind generation cost trends and drivers. It is intended to provide insight into current component-level costs and a basis for understanding variability in the LCOE across the industry. Data and tools developed by the National Renewable Energy Laboratory (NREL) are used in this analysis to inform wind technology cost projections, goals, and improvement opportunities.
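Reviews of this kind typically rest on a simple annualized LCOE: capital cost spread over the project life via a fixed charge rate, plus annual operating cost, divided by net annual energy production. A sketch of that common form, with placeholder input values rather than NREL's published figures:

```python
def lcoe(capex_per_kw, fixed_charge_rate, opex_per_kw_year, net_capacity_factor):
    # Annualize capital with a fixed charge rate, add annual O&M,
    # and divide by the net energy a kW of capacity produces in a year.
    mwh_per_kw_year = 8760.0 * net_capacity_factor / 1000.0
    return (fixed_charge_rate * capex_per_kw + opex_per_kw_year) / mwh_per_kw_year
```

For example, `lcoe(1700.0, 0.08, 51.0, 0.40)` gives an LCOE in the low-$50s/MWh range; the sensitivity of the result to CapEx, FCR, OpEx, and capacity factor is exactly the component-level variability these reports discuss.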
NETIMIS: Dynamic Simulation of Health Economics Outcomes Using Big Data.
Johnson, Owen A; Hall, Peter S; Hulme, Claire
2016-02-01
Many healthcare organizations are now making good use of electronic health record (EHR) systems to record clinical information about their patients and the details of their healthcare. Electronic data in EHRs are generated by people engaged in complex processes within complex environments, and their human input, albeit shaped by computer systems, is compromised by many human factors. These data are potentially valuable to health economists and outcomes researchers but are sufficiently large and complex to be considered part of the new frontier of 'big data'. This paper describes emerging methods that draw together data mining, process modelling, activity-based costing and dynamic simulation models. Our research infrastructure includes safe links to Leeds hospital's EHRs with 3 million secondary and tertiary care patients. We created a multidisciplinary team of health economists, clinical specialists, and data and computer scientists, and developed a dynamic simulation tool called NETIMIS (Network Tools for Intervention Modelling with Intelligent Simulation; http://www.netimis.com ) suitable for visualization of both human-designed and data-mined processes, which can then be used for 'what-if' analysis by stakeholders interested in costing, designing and evaluating healthcare interventions. We present two examples of model development to illustrate how dynamic simulation can be informed by big data from an EHR. We found the tool provided a focal point for multidisciplinary teamwork, helping the team iteratively and collaboratively 'deep dive' into big data.
Engagement and Empowerment Through Self-Service.
Endriss, Jason
2016-01-01
Self-service tools represent the next frontier for leave and disability. This article discusses several critical components of a successful leave and disability self-service tool. If given the proper investment and thoughtfully designed, self-service tools have the potential to augment an organization's existing interaction channels, improving the employee experience while delivering efficiencies for an administrative model. In an operating environment in which cost savings sometimes come at the expense of employee experience, such a win-win solution should not be taken lightly and, more importantly, should not be missed.
Systems Engineering and Integration (SE and I)
NASA Technical Reports Server (NTRS)
Chevers, ED; Haley, Sam
1990-01-01
The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements are interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; an enhanced automated code generation system tightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project databases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low cost avionics; cost estimation and benefits; computer aided software engineering; computer systems and software safety; system testability; and advanced avionics laboratories and rapid prototyping. This presentation is represented by viewgraphs only.
The migraine ACE model: evaluating the impact on time lost and medical resource Use.
Caro, J J; Caro, G; Getsios, D; Raggio, G; Burrows, M; Black, L
2000-04-01
To describe the Migraine Adaptive Cost-Effectiveness Model in the context of an analysis of a simulated population of Canadian patients with migraine. The high prevalence of migraine and its substantial impact on patients' ability to function normally present a significant economic burden to society. In light of the recent availability of improved pharmaceutical treatments, a model was developed to assess their economic impact. The Migraine Adaptive Cost-Effectiveness Model incorporates the costs of time lost from both work and nonwork activities, as well as medical resource and medication use. Using Monte Carlo techniques, the model simulates the experience of a population of patients with migraine over the course of 1 year. As an example, analyses of a Canadian population were carried out using data from a multinational trial, surveys, national statistics, and the available literature. Using customary therapy, mean productivity losses (amounting to 84 hours of paid work time, 48 hours of unpaid work time, and 113 hours of leisure time lost) were estimated to cost $1949 (in 1997 Canadian dollars) per patient, with medical expenditures adding an average of $280 to the cost of illness. With customary treatment patterns, the costs of migraine associated with reduced functional capacity are substantial. The migraine model represents a flexible tool for the economic evaluation of different migraine treatments in various populations.
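A heavily simplified Monte Carlo sketch in the spirit of the model described above: simulate each patient's attacks for a year, value the paid-work, unpaid-work, and medical components, and average over the population. All distributions, rates, and the half-wage valuation of unpaid time below are invented placeholders, not the trial-derived inputs the authors used:

```python
import random

def simulate_migraine_costs(n_patients=1000, wage=20.0, seed=1):
    # Monte Carlo over a simulated patient population for one year.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_patients):
        attacks = rng.randint(6, 30)                 # attacks during the year
        paid_hours = sum(rng.uniform(1.0, 6.0) for _ in range(attacks))
        unpaid_hours = sum(rng.uniform(0.5, 4.0) for _ in range(attacks))
        medical = attacks * rng.uniform(5.0, 25.0)   # visits/medication, $ per attack
        # value paid work time at the wage and unpaid time at half the wage
        total += wage * paid_hours + 0.5 * wage * unpaid_hours + medical
    return total / n_patients                        # mean annual cost per patient
```

Rerunning the simulation with treatment-specific attack and time-loss inputs is what turns a population simulation like this into a comparative economic evaluation.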
Vector production in an academic environment: a tool to assess production costs.
Boeke, Aaron; Doumas, Patrick; Reeves, Lilith; McClurg, Kyle; Bischof, Daniela; Sego, Lina; Auberry, Alisha; Tatikonda, Mohan; Cornetta, Kenneth
2013-02-01
Generating gene and cell therapy products under good manufacturing practices is a complex process. When determining the cost of these products, researchers must consider the large number of supplies used for manufacturing and the personnel and facility costs to generate vector and maintain a cleanroom facility. To facilitate cost estimates, the Indiana University Vector Production Facility teamed with the Indiana University Kelley School of Business to develop a costing tool that, in turn, provides pricing. The tool is designed in Microsoft Excel and is customizable to meet the needs of other core facilities. It is available from the National Gene Vector Biorepository. The tool allows cost determinations using three different costing methods and was developed in an effort to meet the A-21 circular requirements for U.S. core facilities performing work for federally funded projects. The costing tool analysis reveals that the cost of vector production does not have a linear relationship with batch size. For example, increasing the production from 9 to 18 liters of a retroviral vector product increases total costs a modest 1.2-fold rather than doubling the total cost. The analysis discussed in this article will help core facilities and investigators plan a cost-effective strategy for gene and cell therapy production.
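The reported sub-linear scaling follows directly from a cost structure dominated by fixed facility and personnel costs. An illustrative sketch, with the fixed and variable figures chosen only so that doubling the batch reproduces the reported 1.2-fold ratio; they are not the facility's actual costs:

```python
def batch_cost(liters, fixed=90000.0, variable_per_liter=2500.0):
    # Fixed cleanroom/personnel overhead dominates, so doubling batch size
    # raises the total cost far less than 2x.
    return fixed + variable_per_liter * liters
```

With these placeholder numbers, `batch_cost(18) / batch_cost(9)` comes out to 1.2, mirroring the abstract's 9-to-18-liter example.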
Simscape Modeling Verification in the Simulink Development Environment
NASA Technical Reports Server (NTRS)
Volle, Christopher E. E.
2014-01-01
The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using MathWorks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time consuming and costly due to the rigorous testing and peer reviews required for each custom-built block. Using MathWorks Simscape tools, modeling time can be reduced since no custom code would be developed. After careful research, the group concluded that it is feasible to use Simscape blocks in MATLAB's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.
NASA Technical Reports Server (NTRS)
Drysdale, Alan; Thomas, Mark; Fresa, Mark; Wheeler, Ray
1992-01-01
Controlled Ecological Life Support System (CELSS) technology is critical to the Space Exploration Initiative. NASA's Kennedy Space Center has been performing CELSS research for several years, developing data related to CELSS design. We have developed OCAM (Object-oriented CELSS Analysis and Modeling), a CELSS modeling tool, and have used this tool to evaluate CELSS concepts, using this data. In using OCAM, a CELSS is broken down into components, and each component is modeled as a combination of containers, converters, and gates which store, process, and exchange carbon, hydrogen, and oxygen on a daily basis. Multiple crops and plant types can be simulated. Resource recovery options modeled include combustion, leaching, enzyme treatment, aerobic or anaerobic digestion, and mushroom and fish growth. Results include printouts and time-history graphs of total system mass, biomass, carbon dioxide, and oxygen quantities; energy consumption; and manpower requirements. The contributions of mass, energy, and manpower to system cost have been analyzed to compare configurations and determine appropriate research directions.
NASA Astrophysics Data System (ADS)
Al-Alawi, Baha Mohammed
Plug-in hybrid electric vehicles (PHEVs) are an emerging automotive technology with the capability to reduce transportation environmental impacts, but at an increased production cost. PHEVs can draw and store energy from an electric grid and consequently show reductions in petroleum consumption, air emissions, ownership costs, regulation compliance costs, and various other externalities. Decision makers in the policy, consumer, and industry spheres would like to understand the impact of HEV and PHEV technologies on the U.S. vehicle fleet, but to date, only the disciplinary characteristics of PHEVs have been considered. The multidisciplinary tradeoffs between vehicle energy sources, policy requirements, market conditions, consumer preferences, and technology improvements are not well understood. For example, recent studies have posited the importance of PHEVs to the future U.S. vehicle fleet, but no studies have considered the value of PHEVs to automakers and policy makers as a tool for achieving U.S. corporate average fuel economy (CAFE) standards, which are planned to double by 2030. Previous studies have demonstrated the costs and benefits of PHEVs, but no study comprehensively accounts for the costs and benefits of PHEVs to consumers. The diffusion rate of hybrid electric vehicle (HEV) and PHEV technology into the marketplace has been estimated by existing studies using various tools and scenarios, but results show wide variations between studies. There is no comprehensive modeling study that combines policy, consumers, society, and automakers in an analysis of the costs and benefits of U.S. new vehicle sales. The aim of this research is to build a framework that can simulate and optimize the benefits of PHEVs for a multiplicity of stakeholders. This dissertation describes the results of modeling that integrates the effects of PHEV market penetration on the policy, consumer, and economic spheres.
A model of fleet fuel economy and CAFE compliance for a large US automaker will be developed. A comprehensive total cost of ownership model will be constructed to calculate and compare the costs and benefits of PHEVs, conventional vehicles (CVs), and HEVs. A comprehensive literature review of PHEV penetration-rate studies will then review and analyze the primary purposes, methods, and results of studies of PHEV market penetration. Finally, a multi-criteria modeling system will incorporate the results of these supporting models. The models, analysis, and results of this project will provide a broader understanding of the benefits and costs of PHEV technology and of the parties to whom those benefits accrue. The findings will provide important information for consumers, automakers, and policy makers to understand and define HEV and PHEV costs, benefits, and expected penetration rates, and to identify the preferred vehicle design and technology scenario to meet the requirements of policy, society, industry, and consumers.
Lessons learned in deploying software estimation technology and tools
NASA Technical Reports Server (NTRS)
Panlilio-Yap, Nikki; Ho, Danny
1994-01-01
Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.
TethysCluster: A comprehensive approach for harnessing cloud resources for hydrologic modeling
NASA Astrophysics Data System (ADS)
Nelson, J.; Jones, N.; Ames, D. P.
2015-12-01
Advances in water resources modeling are improving the information that can be supplied to support decisions affecting the safety and sustainability of society. However, as water resources models become more sophisticated and data-intensive, they require more computational power to run. Purchasing and maintaining the computing facilities needed to support certain modeling tasks has been cost-prohibitive for many organizations. With the advent of the cloud, the computing resources needed to address this challenge are now available and cost-effective, yet a significant technical barrier to leveraging these resources remains. This barrier inhibits many decision makers and even trained engineers from taking advantage of the best science and tools available. Here we present the Python tools TethysCluster and CondorPy, which have been developed to lower the barrier to model computation in the cloud by providing (1) programmatic access to dynamically scalable computing resources, (2) a batch scheduling system to queue and dispatch jobs to the computing resources, (3) data management for job inputs and outputs, and (4) the ability to dynamically create, submit, and monitor computing jobs. These Python tools leverage the open-source resource-management and job-management software HTCondor to offer a flexible and scalable distributed-computing environment. While TethysCluster and CondorPy can be used independently to provision computing resources and perform large modeling tasks, they have also been integrated into Tethys Platform, a development platform for water resources web apps, to enable computing support for modeling workflows and decision-support systems deployed as web apps.
Group Prenatal Care: A Financial Perspective.
Rowley, Rebecca A; Phillips, Lindsay E; O'Dell, Lisa; Husseini, Racha El; Carpino, Sarah; Hartman, Scott
2016-01-01
Multiple studies have demonstrated improved perinatal outcomes for group prenatal care (GPC) when compared to traditional prenatal care. Benefits of GPC include lower rates of prematurity and low birth weight, fewer cesarean deliveries, improved breastfeeding outcomes and improved maternal satisfaction with care. However, the outpatient financial costs of running a GPC program are not well established. This study involved the creation of a financial model that forecasted costs and revenues for prenatal care groups with various numbers of participants based on numerous variables, including patient population, payor mix, patient show rates, staffing mix, supply usage and overhead costs. The model was developed for use in an urban underserved practice. Adjusted revenue per pregnancy in this model was found to be $989.93 for traditional care and $1080.69 for GPC. Cost neutrality for GPC was achieved when each group enrolled an average of 10.652 women with an enriched staffing model or 4.801 women when groups were staffed by a single nurse and single clinician. Mathematical cost-benefit modeling in an urban underserved practice demonstrated that GPC can be not only financially sustainable but possibly a net income generator for the outpatient clinic. Use of this model could offer maternity care practices an important tool for demonstrating the financial practicality of GPC.
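The cost-neutrality figures above follow from a standard break-even calculation: a group becomes financially neutral when its enrollment times the per-patient contribution margin covers the fixed cost of running the group. The sketch below shows the arithmetic; the staffing and overhead inputs are hypothetical, since the abstract reports only the resulting break-even enrollments (10.652 and 4.801 women) and the GPC revenue of $1080.69.

```python
# Illustrative break-even enrollment for a group prenatal care cohort.
# Fixed and variable cost inputs are hypothetical, not the study's.
def break_even_enrollment(fixed_group_cost, revenue_per_pregnancy,
                          variable_cost_per_patient):
    """Smallest enrollment at which group revenue covers group costs."""
    margin = revenue_per_pregnancy - variable_cost_per_patient
    if margin <= 0:
        raise ValueError("each added patient must contribute positive margin")
    return fixed_group_cost / margin

# e.g. $6,000 of staffing/overhead per group, $1080.69 revenue per pregnancy,
# $200 of variable cost per patient
print(round(break_even_enrollment(6000, 1080.69, 200), 2))  # 6.81
```

The same function explains why the leaner staffing model in the study roughly halves the required enrollment: cutting the fixed group cost lowers the break-even point proportionally.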
Elements of Designing for Cost
NASA Technical Reports Server (NTRS)
Dean, Edwin B.; Unal, Resit
1992-01-01
During recent history in the United States, government systems development has been performance driven. As a result, systems within a class have experienced exponentially increasing cost over time in fixed year dollars. Moreover, little emphasis has been placed on reducing cost. This paper defines designing for cost and presents several tools which, if used in the engineering process, offer the promise of reducing cost. Although other potential tools exist for designing for cost, this paper focuses on rules of thumb, quality function deployment, Taguchi methods, concurrent engineering, and activity based costing. Each of these tools has been demonstrated to reduce cost if used within the engineering process.
Extending simulation modeling to activity-based costing for clinical procedures.
Glick, N D; Blackmore, C C; Zelman, W N
2000-04-01
A simulation model was developed to measure costs in an Emergency Department setting for patients presenting with possible cervical-spine injury who needed radiological imaging. Simulation, a tool widely used to account for process variability but typically focused on utilization and throughput analysis, is being introduced here as a realistic means to perform an activity-based-costing (ABC) analysis, because traditional ABC methods have difficulty coping with process variation in healthcare. Though the study model has a very specific application, it can be generalized to other settings simply by changing the input parameters. In essence, simulation was found to be an accurate and viable means to conduct an ABC analysis; in fact, the output provides more complete information than could be achieved through other conventional analyses, which gives management more leverage with which to negotiate contractual reimbursements.
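The core idea above is that simulation supplies the process variability that static ABC ignores: per-patient activity durations are drawn at random, so the costing output is a distribution rather than a single figure. A minimal sketch, with hypothetical activities and rates rather than the study's Emergency Department inputs:

```python
# Minimal sketch of simulation-driven activity-based costing: per-patient
# activity times are random, so costs form a distribution rather than a
# single static ABC figure. Activities, durations, and rates are hypothetical.
import random

random.seed(0)
RATES = {"triage": 1.5, "imaging": 4.0, "physician": 6.0}  # $ per minute

def simulated_patient_cost():
    minutes = {
        "triage": random.uniform(5, 15),
        "imaging": random.uniform(20, 60),
        "physician": random.uniform(10, 30),
    }
    return sum(RATES[a] * t for a, t in minutes.items())

costs = [simulated_patient_cost() for _ in range(10_000)]
mean = sum(costs) / len(costs)
print(f"mean cost ${mean:,.0f}; range ${min(costs):,.0f}-${max(costs):,.0f}")
```

The spread between the minimum and maximum simulated cost is exactly the information a single-point ABC estimate discards, and it is what gives management leverage in reimbursement negotiations.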
NASA Astrophysics Data System (ADS)
Christian, Paul M.; Wells, Randy
2001-09-01
This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool, known as the Aerospace Toolbox, is based on the MathWorks MATLAB/Simulink framework, a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that use of the Aerospace Toolbox can contribute significantly to the quest by NASA and other government agencies to meet aggressive cost-reduction goals in development programs. Part I of this paper provides a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through detailed design, analysis, and testing. Attributes addressed include its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in this part include flight vehicle models and algorithms and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this paper, to be published at a later date, will conclude with a description of how the Aerospace Toolbox is an integral part of developing embedded code directly from the simulation models by using the MathWorks Real-Time Workshop and optimization tools.
It will also address how the Toolbox can be used as a design hub for Internet based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).
R.T. McNider; C. Handyside; K. Doty; W.L. Ellenburg; J.F. Cruise; J.R. Christy; D. Moss; V. Sharda; G. Hoogenboom; Peter Caldwell
2015-01-01
The present paper discusses a coupled gridded crop modeling and hydrologic modeling system that can examine the benefits and costs of irrigation and the coincident impact of irrigation water withdrawals on surface water hydrology. The system is applied to the southeastern U.S. The system tools discussed include a gridded version (GriDSSAT) of...
Alisa A. Wade; Kevin S. McKelvey; Michael K. Schwartz
2015-01-01
Resistance-surface-based connectivity modeling has become a widespread tool for conservation planning. The current ease with which connectivity models can be created, however, masks the numerous untested assumptions underlying both the rules that produce the resistance surface and the algorithms used to locate low-cost paths across the target landscape. Here we present...
Assessing the potential of economic instruments for managing drought risk at river basin scale
NASA Astrophysics Data System (ADS)
Pulido-Velazquez, M.; Lopez-Nicolas, A.; Macian-Sorribes, H.
2015-12-01
Economic instruments work as incentives to adapt individual decisions to collectively agreed goals. Different types of economic instruments have been applied to manage water resources, such as water-related taxes and charges (water pricing, environmental taxes, etc.), subsidies, markets, and voluntary agreements. Hydroeconomic models (HEMs) provide useful insight into optimal strategies for coping with droughts by simultaneously analysing the engineering, hydrology, and economics of water resources management. We use HEMs to evaluate the potential of economic instruments for managing drought risk at river basin scale, considering three criteria for assessing drought risk: reliability, resilience, and vulnerability. HEMs allow calculation of water scarcity costs as the economic losses due to water deliveries below the target demands, which can be used as a vulnerability descriptor of drought risk. Two generic hydroeconomic DSS tools, SIMGAMS and OPTIGAMS (both programmed in GAMS), have been developed to evaluate water scarcity costs at river basin scale based on simulation and optimization approaches. The simulation tool SIMGAMS allocates water according to the system priorities and operating rules, and evaluates the scarcity costs using economic demand functions. The optimization tool allocates water resources to maximize net benefits (minimizing total water scarcity plus operating costs of water use). SIMGAMS allows simulation of incentive water pricing policies based on water availability in the system (scarcity pricing), while OPTIGAMS is used to simulate the effect of ideal water markets by economic optimization. These tools have been applied to the Jucar river system (Spain), which is highly regulated and has a high share of water use for crop irrigation (greater than 80%), and where water scarcity, irregular hydrology, and groundwater overdraft cause droughts to have significant economic, social, and environmental consequences.
An econometric model was first used to explain the variation of the production value of irrigated agriculture during droughts, assessing revenue responses to varying crop prices and water availability. Hydroeconomic approaches were then used to show the potential of economic instruments in setting incentives for a more efficient management of water resources systems.
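The scarcity cost described above is the economic loss when deliveries fall below the target demand, evaluated from an economic demand function. A simplified version of that calculation, with a linear marginal-benefit curve and hypothetical parameter values rather than the SIMGAMS formulation:

```python
# Simplified scarcity-cost calculation: the loss from delivering less than the
# target demand is the area under a (here linear) marginal-benefit curve
# between the delivered volume and the target. Parameter values are
# hypothetical, not the Jucar case study's.
def scarcity_cost(delivered, target, p_max):
    """Integral of marginal benefit p(q) = p_max*(1 - q/target) from delivered to target."""
    if delivered >= target:
        return 0.0
    # Antiderivative of p(q): p_max * (q - q^2 / (2*target))
    F = lambda q: p_max * (q - q * q / (2.0 * target))
    return F(target) - F(delivered)

# e.g. 60 of 100 units delivered, maximum willingness to pay 50 $/unit
print(scarcity_cost(60, 100, 50.0))  # 400.0
```

Because the marginal-benefit curve falls with quantity, the cost of a shortage grows faster than linearly as deliveries shrink, which is why the same deficit hurts more in a deep drought than in a mild one.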
The Locus of Tool-Transformation Costs
ERIC Educational Resources Information Center
Kunde, Wilfried; Pfister, Roland; Janczyk, Markus
2012-01-01
Transformations of hand movements by tools such as levers or electronic input devices can invoke performance costs compared to untransformed movements. This study investigated by means of the Psychological Refractory Period (PRP) paradigm at which stage of information processing such tool-transformation costs arise. We used an inversion…
NASA Astrophysics Data System (ADS)
Chen, J.; Xue, L.
2012-06-01
This paper summarizes our research on laser cladding of high-vanadium CPM® tool steels (3V, 9V, and 15V) onto the surfaces of low-cost hardened H13 hot-work tool steel to substantially enhance resistance against abrasive wear. The results provide great potential for fabricating high-performance automotive tooling (including molds and dies) at affordable cost. The microstructure and hardness development of the laser-clad tool steels so obtained are presented as well.
A cellular automaton model for evacuation flow using game theory
NASA Astrophysics Data System (ADS)
Guan, Junbiao; Wang, Kaihua; Chen, Fangyue
2016-11-01
Game theory serves as a good tool to explore crowd dynamic conflicts during evacuation processes. The purpose of this study is to simulate the complicated interaction behavior among conflicting pedestrians in an evacuation flow. Two types of pedestrians, namely defectors and cooperators, are considered, and two important factors, a fear index and a cost coefficient, are taken into account. By combining snowdrift game theory with a cellular automaton (CA) model, it is shown that increases in the fear index and cost coefficient lengthen the evacuation time, an effect that is more apparent for large values of the cost coefficient. Meanwhile, it is found that the ratio of defectors to cooperators always tends to a consistent state despite different parameter values, largely owing to self-organization effects.
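The snowdrift game underlying the model has the characteristic payoff ordering in which defecting against a cooperator pays best, but mutual defection pays worst. A minimal payoff sketch, with illustrative benefit/cost values; the coupling to the CA update rule (fear index, cost coefficient) is omitted:

```python
# Payoff sketch for the snowdrift game used to resolve pedestrian conflicts.
# b is the benefit of passing through a contested cell, c the cost of
# yielding (b > c > 0); values here are illustrative only.
def snowdrift_payoff(me, other, b=1.0, c=0.6):
    """Payoff to `me` ('C' cooperator or 'D' defector) against `other`."""
    if me == "C" and other == "C":
        return b - c / 2          # both yield: share the cost
    if me == "C" and other == "D":
        return b - c              # cooperator yields alone
    if me == "D" and other == "C":
        return b                  # defector passes at no cost
    return 0.0                    # mutual defection: gridlock

assert snowdrift_payoff("D", "C") > snowdrift_payoff("C", "C") > \
       snowdrift_payoff("C", "D") > snowdrift_payoff("D", "D")
```

This ordering is what sustains a mixed population: each strategy does best when it is rare, so the defector-to-cooperator ratio settles toward a consistent state.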
NASA Astrophysics Data System (ADS)
Zbiciak, R.; Grabowik, C.; Janik, W.
2015-11-01
The design-constructional process is a creative activity that strives to fulfil, as well as possible at a given moment in time, all demands and needs formulated by a user, taking into account social, technical, and technological advances. An engineer's knowledge, skills, and innate abilities have the greatest influence on final product quality and cost, and a deciding influence on a product's technical and economic value. It therefore seems advisable to build software tools that support an engineer in the process of manufacturing cost estimation. The Cost module is built with analytical procedures used for relative manufacturing cost estimation. As in the case of the Generator module, the Cost module was written in the object-oriented programming language C# in the Visual Studio environment. During the research, the following eight factors with the greatest influence on overall manufacturing cost were distinguished and defined: (i) gear tooth type, i.e. straight or helical; (ii) gear wheel design shape, A or B, with or without a wheel hub; (iii) gear tooth module; (iv) number of teeth; (v) gear rim width; (vi) gear wheel material; (vii) heat treatment or thermochemical treatment; (viii) accuracy class. Knowledge of parameters (i) to (v) is indispensable for proper modelling of 3D gear wheel models in a CAD system environment; these parameters are also processed in the Cost module. The last three parameters, (vi) to (viii), are used exclusively in the Cost module. The estimation of relative manufacturing cost is based on indexes calculated for each parameter. The relative manufacturing cost estimated in this way gives an overview of the influence of design parameters on the final gear wheel manufacturing cost. It takes values in the range 0.00 to 1.00; the larger the index value, the higher the relative manufacturing cost.
Whether the proposed algorithm for relative manufacturing cost estimation was designed properly was verified by comparing its results with those obtained from industry. This verification indicated that in most cases the two groups of results are similar. It is therefore possible to conclude that the Cost module could play a significant role in the design-constructional process by aiding an engineer at the stage of selecting among alternative gear wheel designs. It should be remembered that the real manufacturing cost can differ significantly depending on the manufacturing techniques and stock of machine tools available in a factory.
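An index of this kind can be sketched as a weighted sum of normalized per-factor sub-indexes, which keeps the result in the [0, 1] range described above. The weights and sub-index values below are hypothetical, not the Cost module's actual tables:

```python
# Sketch of a relative-cost index over the eight gear-wheel design factors:
# each factor contributes a normalized sub-index in [0, 1], and the weighted
# sum stays in [0, 1]. Weights and sub-indexes are hypothetical.
FACTOR_WEIGHTS = {
    "tooth_type": 0.05, "shape": 0.05, "module": 0.15, "tooth_count": 0.10,
    "rim_width": 0.10, "material": 0.20, "heat_treatment": 0.20, "accuracy": 0.15,
}
assert abs(sum(FACTOR_WEIGHTS.values()) - 1.0) < 1e-9  # weights must sum to 1

def relative_cost(sub_indexes):
    """Weighted sum of per-factor sub-indexes; result lies in [0, 1]."""
    return sum(FACTOR_WEIGHTS[f] * sub_indexes[f] for f in FACTOR_WEIGHTS)

cheap = dict.fromkeys(FACTOR_WEIGHTS, 0.1)    # every factor at a low-cost option
costly = dict.fromkeys(FACTOR_WEIGHTS, 0.9)   # every factor at a high-cost option
print(relative_cost(cheap), relative_cost(costly))
```

Normalizing the weights to sum to 1 guarantees the bounded 0.00-1.00 scale, so two candidate designs can be ranked directly by their index values.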
Linear regression metamodeling as a tool to summarize and present simulation model results.
Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M
2013-10-01
Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
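The approach above can be sketched in a few lines: draw standardized inputs, run the decision model on each PSA sample, and regress the outcome on the inputs so the intercept approximates the base-case result and each coefficient measures parameter sensitivity. The toy net-benefit model below is hypothetical (not the paper's cancer cure model); because the inputs are drawn independently, each slope can be estimated as cov(y, x)/var(x).

```python
# Pure-Python sketch of linear regression metamodeling for a PSA.
# toy_model and its parameter distributions are hypothetical.
import random

random.seed(1)

def toy_model(p_cure, cost_tx):
    """Hypothetical net-benefit model for a treatment decision."""
    return 50_000 * p_cure - cost_tx

n = 10_000
x1 = [random.gauss(0, 1) for _ in range(n)]   # standardized cure probability
x2 = [random.gauss(0, 1) for _ in range(n)]   # standardized treatment cost
y = [toy_model(0.6 + 0.05 * a, 20_000 + 2_000 * b) for a, b in zip(x1, x2)]

def slope(xs, ys):
    """OLS slope for one standardized, independently drawn input."""
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    var = sum((a - mx) ** 2 for a in xs)
    return cov / var

print(round(sum(y) / n))                         # ~ base-case net benefit (about 10,000)
print(round(slope(x1, y)), round(slope(x2, y)))  # ~ 2500 and -2000
```

Reading the coefficients side by side acts as a one-pass sensitivity analysis: here the cure-probability term dominates, which is the kind of ranking the paper derives from its regression metamodel.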
The software-cycle model for re-engineering and reuse
NASA Technical Reports Server (NTRS)
Bailey, John W.; Basili, Victor R.
1992-01-01
This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, to perform the identification, extraction, and re-engineering of components, and domain experts, to direct the applications of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.
Depth of manual dismantling analysis: A cost–benefit approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Achillas, Ch., E-mail: c.achillas@ihu.edu.gr; Aidonis, D.; Vlachokostas, Ch.
Highlights: A mathematical modeling tool for OEMs. The tool can be used by OEMs, recyclers of electr(on)ic equipment, or regulators of WEEE management systems. The tool uses cost-benefit analysis to determine the optimal depth of product disassembly. The reusable materials and the quantities of metals and plastics recycled can be quantified in an easy-to-comprehend manner. - Abstract: This paper presents a decision support tool for manufacturers and recyclers towards end-of-life strategies for waste electrical and electronic equipment. A mathematical formulation based on the cost-benefit analysis concept is analytically described in order to determine the parts and/or components of an obsolete product that should be either non-destructively recovered for reuse or recycled. The framework optimally determines the depth of disassembly for a given product, taking into account economic considerations. On this basis, it embeds all relevant cost elements to be included in the decision-making process, such as recovered materials and (depreciated) parts/components, labor costs, energy consumption, equipment depreciation, quality control, and warehousing. This tool can be part of the strategic decision-making process in order to maximize profitability or minimize end-of-life management costs. A case study demonstrating the model's applicability is presented for an electronic product typical in structure and material composition. Taking into account the market values of the pilot product's components, manual disassembly proves profitable, with the marginal revenues from recovered reusable materials estimated at 2.93-23.06 €, depending on the level of disassembly.
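The optimal depth of disassembly described above can be sketched as a simple marginal analysis: keep disassembling while the value recovered at each level exceeds the labor and processing cost of reaching it, and stop at the depth with the highest cumulative net revenue. The per-level figures below are hypothetical, not the case study's:

```python
# Sketch of choosing the optimal depth of manual disassembly by
# cost-benefit analysis. Marginal revenues/costs per level are hypothetical.
STEPS = [  # (marginal revenue €, marginal cost €) per disassembly level
    (12.0, 3.0),   # outer casing: reusable housing recovered
    (8.0, 4.0),    # circuit board: metals recovery
    (2.5, 3.5),    # fine separation: mostly labor
    (0.5, 2.0),    # full teardown: no longer worth it
]

def best_depth(steps):
    """Return (depth, net revenue) maximizing cumulative revenue minus cost."""
    best, net, cum = 0, 0.0, 0.0
    for depth, (rev, cost) in enumerate(steps, start=1):
        cum += rev - cost
        if cum > net:
            best, net = depth, cum
    return best, net

print(best_depth(STEPS))  # (2, 13.0)
```

With these assumed figures, disassembly pays up to level 2 and destroys value beyond it, mirroring the paper's finding that profitability depends on the level of disassembly chosen.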
NASA Technical Reports Server (NTRS)
Westra, Doug G.; West, Jeffrey S.; Richardson, Brian R.
2015-01-01
Historically, the analysis and design of liquid rocket engines (LREs) has relied on full-scale testing and one-dimensional empirical tools. The testing is extremely expensive, and the one-dimensional tools are not designed to capture the highly complex, multi-dimensional features that are inherent to LREs. Recent advances in computational fluid dynamics (CFD) tools have made it possible to predict liquid rocket engine performance and stability, to assess the effect of complex flow features, and to evaluate injector-driven thermal environments, thereby mitigating the cost of testing. Extensive efforts to verify and validate these CFD tools have been conducted to provide confidence for using them during the design cycle. Previous validation efforts have documented comparisons of predicted heat flux thermal environments with test data for a single-element gaseous oxygen (GO2) and gaseous hydrogen (GH2) injector. The most notable was a comprehensive effort by Tucker et al. [1], in which a number of different groups modeled a GO2/GH2 single-element configuration by Pal et al. [2]. The tools used for this validation comparison employed a range of algorithms, from steady and unsteady Reynolds-averaged Navier-Stokes (U/RANS) calculations to large-eddy simulations (LES), detached-eddy simulations (DES), and various combinations. A more recent effort by Thakur et al. [3] focused on using a state-of-the-art CFD simulation tool, Loci/STREAM, on a two-dimensional grid. Loci/STREAM was chosen because it has a unique, very efficient flamelet parameterization of combustion reactions that are too computationally expensive to simulate with conventional finite-rate chemistry calculations. The current effort focuses on further advancement of these validation efforts, again using the Loci/STREAM tool with the flamelet parameterization, but this time on a three-dimensional grid. Comparisons to the Pal et al.
heat flux data will be made for both RANS and hybrid RANS-LES/detached-eddy simulations (DES). Computational costs will be reported, along with a comparison of accuracy and cost against much less expensive two-dimensional RANS simulations of the same geometry.
The Lunar Mapping and Modeling Project
NASA Technical Reports Server (NTRS)
Noble, Sarah; French, Raymond; Nall, Mark; Muery, Kimberly
2009-01-01
LMMP was initiated in 2007 to help make the anticipated results of the LRO spacecraft useful and accessible to Constellation. The LMMP is managing and developing a suite of lunar mapping and modeling tools and products that support the Constellation Program (CxP) and other lunar exploration activities. In addition to the LRO Principal Investigators, relevant activities and expertise already funded by NASA were identified at ARC, CRREL (Army Cold Regions Research & Engineering Laboratory), GSFC, JPL, and USGS. LMMP is a cost-capped, design-to-cost project (the project budget was established prior to obtaining Constellation needs).
NASA Technical Reports Server (NTRS)
Rosenberg, L. S.; Revere, W. R.; Selcuk, M. K.
1981-01-01
Small solar thermal power systems (up to 10 MWe in size) were tested. The solar thermal power plant ranking study was performed to aid in experiment activity and support decisions for the selection of the most appropriate technological approach. The cost and performance were determined for insolation conditions by utilizing the Solar Energy Simulation computer code (SESII). This model optimizes the size of the collector field and energy storage subsystem for given engine generator and energy transport characteristics. The development of the simulation tool, its operation, and the results achieved from the analysis are discussed.
Calibration and Forward Uncertainty Propagation for Large-eddy Simulations of Engineering Flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Templeton, Jeremy Alan; Blaylock, Myra L.; Domino, Stefan P.
2015-09-01
The objective of this work is to investigate the efficacy of using calibration strategies from Uncertainty Quantification (UQ) to determine model coefficients for LES. As the target methods are for engineering LES, uncertainty from numerical aspects of the model must also be quantified. The ultimate goal of this research thread is to generate a cost-versus-accuracy curve for LES such that the cost could be minimized given an accuracy prescribed by an engineering need. Realization of this goal would enable LES to serve as a predictive simulation tool within the engineering design process.
Architecting Science: Practical Tools for Architecting Flexible Systems
2009-08-31
States of America and many other nations. Typically, designers would consider objectives such as cost, mass, volume, and capability, and the satellite...that minimized mass, volume, and cost at the greatest capability would be the optimal design. However, the lifetime of the system can often be measured...described in terms of geometric and mass properties. Figure 1 shows the various components of the MAV airframe. A physical model of the air vehicle
2013-06-30
QUANTITATIVE RISK ANALYSIS The use of quantitative cost risk analysis tools can be valuable in measuring numerical risk to the government (Galway, 2004...assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost, schedule, and...Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review (RAND Working Paper WR-112-RC
Digital telephony analysis model and issues
NASA Astrophysics Data System (ADS)
Keuthan, Lynn M.
1995-09-01
Experts in the fields of digital telephony and communications security have stated the need for an analytical tool for evaluating complex issues. Important policy issues recently discussed by experts include implementing digital wire-taps, implementation of the 'Clipper Chip', required registration of encryption/decryption keys, and export control of cryptographic equipment. Associated with the implementation of these policies are direct costs resulting from implementation, indirect cost benefits from implementation, and indirect costs resulting from the risks of implementation or factors reducing cost benefits. Presented herein is a model for analyzing digital telephony policies and systems and their associated direct costs and indirect benefit and risk factors. In order to present the structure of the model, issues of national importance and business-related issues are discussed. The various factors impacting the implementation of the associated communications systems and communications security are summarized, and various implementation tradeoffs are compared based on economic benefits and impact. The importance of the issues addressed herein, as well as of other digital telephony issues, has grown greatly with the enormous increase in communication system connectivity brought about by the advance of the National Information Infrastructure.
A user-friendly, open-source tool to project impact and cost of diagnostic tests for tuberculosis
Dowdy, David W; Andrews, Jason R; Dodd, Peter J; Gilman, Robert H
2014-01-01
Most models of infectious diseases, including tuberculosis (TB), do not provide results customized to local conditions. We created a dynamic transmission model to project TB incidence, TB mortality, multidrug-resistant (MDR) TB prevalence, and incremental costs over 5 years after scale-up of nine alternative diagnostic strategies. A corresponding web-based interface allows users to specify local costs and epidemiology. In settings with little capacity for up-front investment, same-day microscopy had the greatest impact on TB incidence and became cost-saving within 5 years if delivered at $10/test. With greater initial investment, population-level scale-up of Xpert MTB/RIF or microcolony-based culture often averted 10 times more TB cases than narrowly-targeted strategies, at minimal incremental long-term cost. Xpert for smear-positive TB had reasonable impact on MDR-TB incidence, but at substantial price and little impact on overall TB incidence and mortality. This user-friendly modeling framework improves decision-makers' ability to evaluate the local impact of TB diagnostic strategies. DOI: http://dx.doi.org/10.7554/eLife.02565.001 PMID:24898755
User's manual for The TIM benefit-cost (TIM-BC) Tool (Version: 1.0.0)
DOT National Transportation Integrated Search
2015-07-04
This document serves as a user's manual for the Traffic Incident Management Benefit-Cost Tool (TIM-BC) Version 1.0.0 - Safety Service Patrol Benefit-Cost (SSP-BC) Tool, which is used to assist State and local engineers and decision-makers with evalu...
Appendix W. Cost Analysis in Teacher Education Programs.
ERIC Educational Resources Information Center
Sell, G. Roger; And Others
This paper is an introduction to the basic cost-related tools available to management for planning, evaluating, and organizing resources for the purpose of achieving objectives within a teacher education preparation program. Three tools are presented in separate sections. Part I on the cost accounting tool for identifying, categorizing, and…
National Water Quality Benefits
This project will provide the basis for advancing the goal of producing tools in support of quantifying and valuing changes in water quality for EPA regulations. It will also identify specific data and modeling gaps and improve benefits estimation for more complete benefit-cost a...
NASA Technical Reports Server (NTRS)
ONeil, D. A.; Craig, D. A.; Christensen, C. B.; Gresham, E. C.
2005-01-01
The objective of this Technical Interchange Meeting was to increase the quantity and quality of technical, cost, and programmatic data used to model the impact of investing in different technologies. The focus of this meeting was the Technology Tool Box (TTB), a database of performance, operations, and programmatic parameters provided by technologists and used by systems engineers. The TTB is the data repository used by a system of models known as the Advanced Technology Lifecycle Analysis System (ATLAS). This report describes the result of the November meeting, and also provides background information on ATLAS and the TTB.
System-level perturbations of cell metabolism using CRISPR/Cas9
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jakočiūnas, Tadas; Jensen, Michael K.; Keasling, Jay D.
CRISPR/Cas9 (clustered regularly interspaced short palindromic repeats and the associated protein Cas9) techniques have made genome engineering and transcriptional reprogramming studies much more advanced and cost-effective. For metabolic engineering purposes, CRISPR-based tools have been applied to single and multiplex pathway modifications and transcriptional regulation. The effectiveness of these tools allows researchers to implement genome-wide perturbations, test model-guided genome editing strategies, and perform transcriptional reprogramming perturbations in a more advanced manner than previously possible. In this mini-review we highlight recent studies adopting CRISPR/Cas9 for systems-level perturbations and model-guided metabolic engineering.
Computational Process Modeling for Additive Manufacturing (OSU)
NASA Technical Reports Server (NTRS)
Bagg, Stacey; Zhang, Wei
2015-01-01
Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.
NASA Astrophysics Data System (ADS)
Loomis, John
2002-06-01
A travel cost demand model that uses intended trips if dams are removed and the river restored is presented as a tool for evaluating the potential recreation benefits in this counterfactual but increasingly policy-relevant analysis of dam removal. The model is applied to the Lower Snake River in Washington using data from mail surveys of households in the Pacific Northwest region. Five years after dam removal, about 1.5 million visitor days are estimated, with this number growing to 2.5 million annually during years 20-100. Using the travel cost model's estimate of the value of river recreation, if the four dams are removed and the 225 km river is restored, the annualized benefits at a 6.875% discount rate would be $310 million. This gain in river recreation exceeds the loss of reservoir recreation but is about $60 million less than the total costs of the dam removal alternative. The analysis suggests this extension of the standard travel cost method may be suitable for evaluating the gain in river recreation associated with restoration of river systems from dam removal or with dam relicensing conditions.
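The annualization arithmetic behind estimates like the $310 million figure above can be sketched in a few lines: discount a stream of annual recreation benefits to present value, then convert that to an equivalent constant annual amount over the horizon. The per-visitor-day value and the ramp profile below are invented assumptions for illustration, not figures from the study.

```python
# Illustrative sketch of travel-cost benefit annualization.
# value_per_day and the visit ramp are assumed, not from the study.

def present_value(flows, rate):
    """Discount a list of annual benefits (years 1..N) to present value."""
    return sum(b / (1.0 + rate) ** t for t, b in enumerate(flows, start=1))

def annualize(pv, rate, years):
    """Convert a present value to an equivalent constant annual benefit."""
    annuity_factor = (1.0 - (1.0 + rate) ** -years) / rate
    return pv / annuity_factor

rate, horizon = 0.06875, 100
value_per_day = 35.0  # assumed $ per visitor-day

def visits(year):
    """Assumed ramp: 1.5M visitor days by year 5, 2.5M from year 20 on."""
    if year <= 5:
        return 1.5e6 * year / 5
    if year < 20:
        return 1.5e6 + (2.5e6 - 1.5e6) * (year - 5) / 15
    return 2.5e6

flows = [visits(t) * value_per_day for t in range(1, horizon + 1)]
pv = present_value(flows, rate)
print(round(annualize(pv, rate, horizon) / 1e6, 1), "million $/year")
```

The annuity-factor division is the standard way a 100-year present value is expressed as a single annualized benefit, as in the abstract's $310 million/year figure.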
Cost-effectiveness modelling in diagnostic imaging: a stepwise approach.
Sailer, Anna M; van Zwam, Wim H; Wildberger, Joachim E; Grutters, Janneke P C
2015-12-01
Diagnostic imaging (DI) is the fastest growing sector in medical expenditures and takes a central role in medical decision-making. The increasing number of various and new imaging technologies induces a growing demand for cost-effectiveness analysis (CEA) in imaging technology assessment. In this article we provide a comprehensive framework of direct and indirect effects that should be considered for CEA in DI, suitable for all imaging modalities. We describe and explain the methodology of decision analytic modelling in six steps aiming to transfer theory of CEA to clinical research by demonstrating key principles of CEA in a practical approach. We thereby provide radiologists with an introduction to the tools necessary to perform and interpret CEA as part of their research and clinical practice. • DI influences medical decision making, affecting both costs and health outcome. • This article provides a comprehensive framework for CEA in DI. • A six-step methodology for conducting and interpreting cost-effectiveness modelling is proposed.
A model to minimize joint total costs for industrial waste producers and waste management companies.
Tietze-Stöckinger, Ingela; Fichtner, Wolf; Rentz, Otto
2004-12-01
The model LINKopt is a mixed-integer, linear programming model for mid- and long-term planning of waste management options on an inter-company level. There has been a large increase in the transportation of waste material in Germany, which has been attributed to the implementation of the European Directive 75/442/EEC on waste. Similar situations are expected to emerge in other European countries. The model LINKopt has been developed to determine a waste management system with minimal decision-relevant costs considering transportation, handling, storage and treatment of waste materials. The model can serve as a tool to evaluate various waste management strategies and to obtain the optimal combination of investment options. In addition to costs, ecological aspects are considered by determining the total mileage associated with the waste management system. The model has been applied to a German case study evaluating different investment options for a co-operation between Daimler-Chrysler AG at Rastatt, its suppliers, and the waste management company SITA P+R GmbH. The results show that the installation of waste management facilities at the premises of the waste producer would lead to significant reductions in costs and transportation.
Trends and Issues in Fuzzy Control and Neuro-Fuzzy Modeling
NASA Technical Reports Server (NTRS)
Chiu, Stephen
1996-01-01
Everyday experience in building and repairing things around the home have taught us the importance of using the right tool for the right job. Although we tend to think of a 'job' in broad terms, such as 'build a bookcase,' we understand well that the 'right job' associated with each 'right tool' is typically a narrowly bounded subtask, such as 'tighten the screws.' Unfortunately, we often lose sight of this principle when solving engineering problems; we treat a broadly defined problem, such as controlling or modeling a system, as a narrow one that has a single 'right tool' (e.g., linear analysis, fuzzy logic, neural network). We need to recognize that a typical real-world problem contains a number of different sub-problems, and that a truly optimal solution (the best combination of cost, performance and feature) is obtained by applying the right tool to the right sub-problem. Here I share some of my perspectives on what constitutes the 'right job' for fuzzy control and describe recent advances in neuro-fuzzy modeling to illustrate and to motivate the synergistic use of different tools.
Tools for studying dry-cured ham processing by using computed tomography.
Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena
2012-01-11
An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w) in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
Foundations of the Bandera Abstraction Tools
NASA Technical Reports Server (NTRS)
Hatcliff, John; Dwyer, Matthew B.; Pasareanu, Corina S.; Robby
2003-01-01
Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking. The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera's data abstraction mechanism which is used to reduce the cardinality (and the program's state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java which has a rich set of linguistic features.
Factors affecting economies of scale in combined sewer systems.
Maurer, Max; Wolfram, Martin; Herlyn, Anja
2010-01-01
A generic model is introduced that represents the combined sewer infrastructure of a settlement quantitatively. A catchment area module first calculates the length and size distribution of the required sewer pipes on the basis of rain patterns, housing densities and area size. These results are fed into the sewer-cost module in order to estimate the combined sewer costs of the entire catchment area. A detailed analysis of the relevant input parameters for Swiss settlements is used to identify the influence of size on costs. The simulation results confirm that an economy of scale exists for combined sewer systems. This is the result of two main opposing cost factors: (i) increased construction costs for larger sewer systems due to larger pipes and increased rain runoff in larger settlements, and (ii) lower costs due to higher population and building densities in larger towns. In Switzerland, the more or less organically grown settlement structures and limited land availability emphasise the second factor to show an apparent economy of scale. This modelling approach proved to be a powerful tool for understanding the underlying factors affecting the cost structure for water infrastructures.
NASA Astrophysics Data System (ADS)
S, Kyriacou; E, Kontoleontos; S, Weissenberger; L, Mangani; E, Casartelli; I, Skouteropoulou; M, Gattringer; A, Gehrer; M, Buchmayr
2014-03-01
An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (EASY software), a fast solver (block-coupled CFD) and a flexible geometry generation tool. The EASY optimization software is a PCA-driven, metamodel-assisted Evolutionary Algorithm (MAEA(PCA)) that can be used in both single-objective (SOO) and multiobjective optimization (MOO) problems. In MAEAs, low-cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem-specific evaluation, here the solution of the Navier-Stokes equations. For additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based, block-coupled approach, solving the governing equations simultaneously, which yields a large gain in computational cost in addition to robustness and speed. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on b-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail, and optimization results for hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
Maru, Shoko; Byrnes, Joshua; Whitty, Jennifer A; Carrington, Melinda J; Stewart, Simon; Scuffham, Paul A
2015-02-01
The reported cost effectiveness of cardiovascular disease management programs (CVD-MPs) is highly variable, potentially leading to different funding decisions. This systematic review evaluates published modeled analyses to compare study methods and quality. Articles were included if an incremental cost-effectiveness ratio (ICER) or incremental cost-utility ratio (ICUR) was reported, the intervention was a multi-component program designed to manage or prevent a cardiovascular disease condition, and it addressed all domains specified in the American Heart Association Taxonomy for Disease Management. Nine articles (reporting 10 clinical outcomes) were included. Eight cost-utility and two cost-effectiveness analyses targeted hypertension (n=4), coronary heart disease (n=2), coronary heart disease plus stroke (n=1), heart failure (n=2) and hyperlipidemia (n=1). Study perspectives included the healthcare system (n=5), societal and fund holders (n=1), a third-party payer (n=3), or the perspective was not explicitly stated (n=1). All analyses were modeled based on interventions of one to two years' duration. Time horizons were two years (n=1), 10 years (n=1) and lifetime (n=8). Model structures included Markov models (n=8) and 'decision analytic models' (n=1), or the structure was not explicitly stated (n=1). Considerable variation was observed in clinical and economic assumptions and reporting practices. Of all ICERs/ICURs reported, including those of subgroups (n=16), four were above a US$50,000 acceptability threshold, six were below and six were dominant. The majority of CVD-MPs were reported to have favorable economic outcomes, but 25% were at unacceptably high cost for the outcomes. Use of standardized reporting tools should increase transparency and inform what drives the cost-effectiveness of CVD-MPs. © The European Society of Cardiology 2014.
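The ICER arithmetic the review compares across studies is simple to sketch. The cost and QALY figures below are invented for illustration; "dominant" follows the usual convention of an intervention that is both cheaper and more effective than the comparator.

```python
# Minimal sketch of incremental cost-effectiveness ratio (ICER) arithmetic.
# All figures are invented for illustration.

def icer(cost_new, cost_old, qaly_new, qaly_old):
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_qaly > 0 and d_cost <= 0:
        return "dominant"            # cheaper and more effective
    if d_qaly <= 0:
        return "dominated or inferior"
    return d_cost / d_qaly           # $ per QALY gained

THRESHOLD = 50_000  # US$/QALY acceptability threshold used in the review

result = icer(cost_new=12_000, cost_old=9_000, qaly_new=6.5, qaly_old=6.0)
print(result)  # 6000.0, well below the US$50,000 threshold
```

A program costing $3,000 more but gaining half a QALY per patient lands at $6,000/QALY, which would fall in the review's "below threshold" group.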
The ABCs of Activity-Based Costing: A Cost Containment and Reallocation Tool.
ERIC Educational Resources Information Center
Turk, Frederick J.
1992-01-01
This article describes activity-based costing (ABC) and how this tool may help management understand the costs of major activities and identify possible alternatives. Also discussed are the traditional costing systems used by higher education and ways of applying ABC to higher education. (GLR)
Fault management for the Space Station Freedom control center
NASA Technical Reports Server (NTRS)
Clark, Colin; Jowers, Steven; Mcnenny, Robert; Culbert, Chris; Kirby, Sarah; Lauritsen, Janet
1992-01-01
This paper describes model based reasoning fault isolation in complex systems using automated digraph analysis. It discusses the use of the digraph representation as the paradigm for modeling physical systems and a method for executing these failure models to provide real-time failure analysis. It also discusses the generality, ease of development and maintenance, complexity management, and susceptibility to verification and validation of digraph failure models. It specifically describes how a NASA-developed digraph evaluation tool and an automated process working with that tool can identify failures in a monitored system when supplied with one or more fault indications. This approach is well suited to commercial applications of real-time failure analysis in complex systems because it is both powerful and cost effective.
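The digraph idea can be illustrated with a toy example: model failure propagation as a directed graph, then take the candidate root causes to be the nodes whose downstream effects cover every observed fault indication. The graph and node names below are invented; this is a sketch of the general technique, not the NASA tool's actual algorithm.

```python
# Illustrative sketch of digraph-based fault isolation: candidate root
# causes are nodes from which every observed indication is reachable.
# The graph is invented for demonstration.

from collections import deque

def reachable(graph, start):
    """All nodes reachable from `start` (including itself) via BFS."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def candidate_faults(graph, indications):
    """Nodes whose downstream effects cover every fault indication."""
    return {n for n in graph if set(indications) <= reachable(graph, n)}

# A pump or valve failure propagates to low flow, which trips two alarms.
graph = {
    "pump": ["low_flow"],
    "valve": ["low_flow"],
    "low_flow": ["alarm_A", "alarm_B"],
    "sensor_A": ["alarm_A"],
    "alarm_A": [], "alarm_B": [],
}
print(sorted(candidate_faults(graph, ["alarm_A", "alarm_B"])))
# ['low_flow', 'pump', 'valve']
```

With both alarms active, the sensor fault is ruled out automatically because it cannot explain alarm_B; this pruning of inconsistent hypotheses is what makes the digraph approach attractive for real-time monitoring.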
Space Station needs, attributes and architectural options, volume 2, book 3: Cost and programmatics
NASA Technical Reports Server (NTRS)
1983-01-01
The cost and programmatic considerations which integrate mission requirements and architectural options into a cohesive system for exploitation of space opportunities within affordable limits are discussed. The mission requirements, baseline architecture, a top level baseline schedule, and acquisition costs are summarized. The work breakdown structure (WBS) used to structure the program, and the WBS dictionary are included. The costing approach used, including the operation of the primary costing tool, the SPACE cost model are described. The rationale for the choice of cost estimating relationships is given and costs at the module level are shown. Detailed costs at the subsystem level are shown. The baseline schedule and annual funding profiles are provided. Alternate schedules are developed to provide different funding profiles. Alternate funding sources are discussed and foreign and contractor participation is outlined. The results of the benefit analysis are given and the accrued benefits deriving from an implemented space station program are outlined.
NASA Astrophysics Data System (ADS)
Maringanti, Chetan; Chaubey, Indrajeet; Popp, Jennie
2009-06-01
Best management practices (BMPs) are effective in reducing the transport of agricultural nonpoint source pollutants to receiving water bodies. However, selection of BMPs for placement in a watershed requires optimization of the available resources to obtain maximum possible pollution reduction. In this study, an optimization methodology is developed to select and place BMPs in a watershed to provide solutions that are both economically and ecologically effective. This novel approach develops and utilizes a BMP tool, a database that stores the pollution reduction and cost information of different BMPs under consideration. The BMP tool replaces the dynamic linkage of the distributed parameter watershed model during optimization and therefore reduces the computation time considerably. Total pollutant load from the watershed, and net cost increase from the baseline, were the two objective functions minimized during the optimization process. The optimization model, consisting of a multiobjective genetic algorithm (NSGA-II) in combination with a watershed simulation tool (Soil Water and Assessment Tool (SWAT)), was developed and tested for nonpoint source pollution control in the L'Anguille River watershed located in eastern Arkansas. The optimized solutions provided a trade-off between the two objective functions for sediment, phosphorus, and nitrogen reduction. The results indicated that buffer strips were very effective in controlling the nonpoint source pollutants from leaving the croplands. The optimized BMP plans resulted in potential reductions of 33%, 32%, and 13% in sediment, phosphorus, and nitrogen loads, respectively, from the watershed.
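The trade-off between the two minimized objectives can be illustrated with the Pareto-dominance filter at the heart of NSGA-II selection: keep only the BMP plans that no other plan beats on both pollutant load and cost. The plan data below are invented for illustration; the real optimization evaluates plans against the SWAT-derived BMP tool rather than fixed tuples.

```python
# Sketch of Pareto filtering for two minimized objectives:
# (pollutant load, net cost increase). Plan data are invented.

def dominates(a, b):
    """True if plan `a` is at least as good on every objective and
    strictly better on at least one (minimization)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(plans):
    """Plans not dominated by any other plan."""
    return [p for i, p in enumerate(plans)
            if not any(dominates(q, p) for j, q in enumerate(plans) if j != i)]

# (pollutant load in t/yr, net cost increase in $k/yr)
plans = [(120, 40), (100, 55), (100, 70), (140, 30), (90, 90)]
print(pareto_front(plans))
# [(120, 40), (100, 55), (140, 30), (90, 90)]
```

The plan (100, 70) drops out because (100, 55) achieves the same load at lower cost; the survivors form the trade-off curve the study reports for sediment, phosphorus, and nitrogen.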
The cost-effectiveness of an intensive treatment protocol for severe dyslexia in children.
Hakkaart-van Roijen, Leona; Goettsch, Wim G; Ekkebus, Michel; Gerretsen, Patty; Stolk, Elly A
2011-08-01
Studies of interventions for dyslexia have focused entirely on outcomes related to literacy. In this study, we considered a broader picture assessing improved quality of life compared with costs. A model served as a tool to compare costs and effects of treatment according to a new protocol and care as usual. Quality of life was measured and valued by proxies using a general quality-of-life instrument (EQ-5D). We considered medical cost and non-medical cost (e.g. remedial teaching). The model computed cost per successful treatment and cost per quality adjusted life year (QALY) in time. About 75% of the total costs was related to diagnostic tests to distinguish between children with severe dyslexia and children who have reading difficulties for other reasons. The costs per successful treatment of severe dyslexia were €36 366. Successful treatment showed a quality-of-life gain of about 11%. At primary school, the average cost per QALY for severe dyslexia amounted to €58 647. In the long term, the cost per QALY decreased to €26 386 at secondary school and €17 663 thereafter. The results of this study provide evidence that treatment of severe dyslexia is cost-effective when the investigated protocol is followed. Copyright © 2011 John Wiley & Sons, Ltd.
Azman, Andrew S; Golub, Jonathan E; Dowdy, David W
2014-10-30
Current approaches are unlikely to achieve the aggressive global tuberculosis (TB) control targets set for 2035 and beyond. Active case finding (ACF) may be an important tool for augmenting existing strategies, but the cost-effectiveness of ACF remains uncertain. Program evaluators can often measure the cost of ACF per TB case detected, but how this accessible measure translates into traditional metrics of cost-effectiveness, such as the cost per disability-adjusted life year (DALY), remains unclear. We constructed dynamic models of TB in India, China, and South Africa to explore the medium-term impact and cost-effectiveness of generic ACF activities, conceptualized separately as discrete (2-year) campaigns and as continuous activities integrated into ongoing TB control programs. Our primary outcome was the cost per DALY, measured in relationship to the cost per TB case actively detected and started on treatment. Discrete campaigns costing up to $1,200 (95% uncertainty range [UR] 850-2,043) per case actively detected and started on treatment in India, $3,800 (95% UR 2,706-6,392) in China, and $9,400 (95% UR 6,957-13,221) in South Africa were all highly cost-effective (cost per DALY averted less than per capita gross domestic product). Prolonged integration was even more effective and cost-effective. Short-term assessments of ACF dramatically underestimated potential longer term gains; for example, an assessment of an ACF program at 2 years might find a non-significant 11% reduction in prevalence, but a 10-year evaluation of that same intervention would show a 33% reduction. ACF can be a powerful and highly cost-effective tool in the fight against TB. Given that short-term assessments may dramatically underestimate medium-term effectiveness, current willingness to pay may be too low. 
ACF should receive strong consideration as a basic tool for TB control in most high-burden settings, even when it may cost over $1,000 to detect and initiate treatment for each extra case of active TB.
Rationality Validation of a Layered Decision Model for Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Huaqiang; Alves-Foss, James; Zhang, Du
2007-08-31
We propose a cost-effective network defense strategy built on three decision layers: security policies, defense strategies, and real-time defense tactics for countering immediate threats. A layered decision model (LDM) can be used to capture this decision process. The LDM helps decision-makers gain insight into the hierarchical relationships among interconnected entities and decision types, and supports the selection of cost-effective defense mechanisms to safeguard computer networks. To be effective as a business tool, it is first necessary to validate the rationality of the model before applying it to real-world business cases. This paper describes our efforts in validating the LDM rationality through simulation.
Process Cost Modeling for Multi-Disciplinary Design Optimization
NASA Technical Reports Server (NTRS)
Bao, Han P.; Freeman, William (Technical Monitor)
2002-01-01
For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. 
Next, a quick review of cost estimation techniques is made to highlight their inappropriateness for what is really needed at the conceptual phase of the design process. The First-Order Process Velocity Cost Model (FOPV) is discussed at length in the next section. This is followed by an application of the FOPV cost model to a generic wing. For designs that have no precedent as far as acquisition costs are concerned, cost data derived from the FOPV cost model may not be accurate enough because of new requirements for shape complexity, material, equipment and precision/tolerance. The concept of Cost Modulus is introduced at this point to compensate for these new burdens on the basic processes. This is treated in section 5. The cost of a design must be conveniently linked to its CAD representation. The interfacing of CAD models and spreadsheets containing the cost equations is the subject of the next section, section 6. The last section of the report is a summary of the progress made so far and of the anticipated research work to be achieved in the future.
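The elemental roll-up described here, with a cost modulus compensating for material, shape, and precision burdens, can be sketched as follows. All rates and modulus values below are invented assumptions for illustration, not the FOPV model's actual coefficients.

```python
# Sketch of an elemental, process-based cost roll-up with a cost modulus.
# BASE_RATE and all modulus values are invented assumptions.

BASE_RATE = 1200.0  # assumed $ per kg of fabricated structure

def element_cost(mass_kg, material_mod, shape_mod, precision_mod):
    """Geometry-driven base cost scaled by a multiplicative cost modulus."""
    modulus = material_mod * shape_mod * precision_mod
    return BASE_RATE * mass_kg * modulus

wing_elements = [
    # (mass kg, material, shape, precision) -- hypothetical generic wing
    (300.0, 1.4, 1.2, 1.1),   # spars: titanium, moderate curvature
    (450.0, 1.0, 1.0, 1.0),   # skins: baseline aluminum
    (120.0, 1.4, 1.5, 1.3),   # fittings: complex machined parts
]

total = sum(element_cost(*e) for e in wing_elements)
print(f"estimated wing production cost: ${total:,.0f}")
```

Because each element's cost is a function of its geometry and moduli alone, the whole estimate updates automatically when a design configuration changes, which is the seamless spreadsheet linkage the report emphasizes.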
Re-engineering the mission life cycle with ABC and IDEF
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Rackley, Michael; Karlin, Jay
1994-01-01
The theory behind re-engineering a business process is to remove non-value-added activities, thereby lowering the process cost. To achieve this, one must be able to identify where the non-value-added elements are located, which is not a trivial task because they are often hidden in the form of overhead and/or pooled resources. To isolate these non-value-added processes from the others, one must first decompose the overall top-level process into lower layers of sub-processes. In addition, costing data must be assigned to each sub-process, along with the value the sub-process adds toward the final product. IDEF0 is a Federal Information Processing Standard (FIPS) process-modeling tool that allows this functional decomposition through structured analysis. In addition, it illustrates the relationship between the process and the value added to the product or service. The value-added portion is further defined in IDEF1X, an entity-relationship diagramming tool. The entity-relationship model is the blueprint of the product as it moves along the 'assembly line' and therefore relates all of the parts to each other and to the final product. It also relates the parts to the tools that produce the product and to all of the paperwork used in their acquisition. The use of IDEF therefore facilitates the use of Activity-Based Costing (ABC). ABC is an essential method, in a high-variety product-customizing environment, for facilitating rapid response to externally caused change. This paper describes the work being done in the Mission Operations Division to re-engineer the development and operation life cycle of Mission Operations Centers using these tools.
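The cost roll-up over an IDEF0-style functional decomposition can be sketched as a recursive sum over a tree. The activities, rates, and driver volumes below are hypothetical, not from the paper; the point is only that once sub-processes carry their own driver-based costs, total cost (and any non-value-added share) falls out of the decomposition.

```python
# Parent -> children mapping standing in for an IDEF0 decomposition.
decomposition = {
    "Operate Mission Center": ["Plan Pass", "Execute Pass", "Report"],
    "Execute Pass": ["Configure Ground System", "Monitor Telemetry"],
}

# Leaf activities carry (cost per driver unit, driver volume consumed).
leaf_costs = {
    "Plan Pass":               (40.0, 2.0),   # ($/hr, hrs)
    "Configure Ground System": (55.0, 1.5),
    "Monitor Telemetry":       (35.0, 4.0),
    "Report":                  (40.0, 1.0),
}

def activity_cost(activity):
    """Cost of an activity: its own leaf cost, or the sum of its
    sub-activities' costs in the decomposition tree."""
    if activity in leaf_costs:
        rate, volume = leaf_costs[activity]
        return rate * volume
    return sum(activity_cost(child) for child in decomposition[activity])

top_level_cost = activity_cost("Operate Mission Center")
```

Costs that cannot be traced to a value-adding leaf would surface here as overhead pools, which is exactly what the re-engineering effort seeks to expose.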
NASA Technical Reports Server (NTRS)
Long, Dou; Lee, David; Johnson, Jesse; Gaier, Eric; Kostiuk, Peter
1999-01-01
This report describes an integrated model of air traffic management (ATM) tools under development in two National Aeronautics and Space Administration (NASA) programs: Terminal Area Productivity (TAP) and Advanced Air Transport Technologies (AATT). The model is made by adjusting parameters of LMINET, a queuing network model of the National Airspace System (NAS) that the Logistics Management Institute (LMI) developed for NASA. Operating LMINET with models of various combinations of TAP and AATT tools will give quantitative information about the effects of the tools on operations of the NAS. The costs of delays under different scenarios are calculated. An extension of the Air Carrier Investment Model (ACIM), developed by the Institute for NASA under the Aviation System Analysis Capability (ASAC), maps the technologies' impacts on NAS operations into cross-comparable benefits estimates for individual technologies and sets of technologies.
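As a toy illustration of how delay costs fall out of a queuing network model such as LMINET, consider a single M/M/1 node: the expected time in queue is W_q = rho / (mu - lambda), and multiplying the queue length by a unit delay cost gives a delay-cost rate. The arrival rate, service rate, and unit cost below are hypothetical; the real LMINET uses far richer airport and airspace queue models.

```python
def mm1_delay_cost(arrival_rate, service_rate, cost_per_hour):
    """Delay cost rate ($/hr) at one M/M/1 node: expected time in queue
    W_q = rho / (mu - lambda), times throughput, times unit delay cost."""
    assert arrival_rate < service_rate, "queue must be stable"
    rho = arrival_rate / service_rate
    wq = rho / (service_rate - arrival_rate)   # hours waiting in queue
    return arrival_rate * wq * cost_per_hour   # = L_q * cost_per_hour

# hypothetical runway: 50 arrivals/hr, capacity 60/hr, $1600/hr of delay
runway_delay_cost = mm1_delay_cost(50.0, 60.0, 1600.0)
```

Reducing lambda or raising mu (what the TAP/AATT tools effectively do) shrinks W_q nonlinearly, which is why modest capacity improvements can yield large delay-cost savings near saturation.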
Ramos-Jiliberto, Rodrigo; González-Olivares, Eduardo; Bozinovic, Francisco
2002-08-01
We present a predator-prey metaphysiological model, based on the available behavioral and physiological information of the sigmodontine rodent Phyllotis darwini. The model is focused on the population-level consequences of the antipredator behavior, performed by the rodent population, which is assumed to be an inducible response of predation avoidance. The decrease in vulnerability is explicitly considered to have two associated costs: a decreasing foraging success and an increasing metabolic loss. The model analysis was carried out on a reduced form of the system by means of numerical and analytical tools. We evaluated the stability properties of equilibrium points in the phase plane, and carried out bifurcation analyses of rodent equilibrium density under varying conditions of three relevant parameters. The bifurcation parameters chosen represent predator avoidance effectiveness (A), foraging cost of antipredator behavior (C1'), and activity-metabolism cost (C4'). Our analysis suggests that the trade-offs involved in antipredator behavior play a fundamental role in the stability properties of the system. Under conditions of high foraging cost, stability decreases as antipredator effectiveness increases. Under the complementary scenario (not considering the highest foraging costs), the equilibria are either stable when both costs are low, or unstable when both costs are higher, independent of antipredator effectiveness. No evidence of stabilizing effects of antipredator behavior was found. Copyright 2002 Elsevier Science (USA).
Calibration of a COTS Integration Cost Model Using Local Project Data
NASA Technical Reports Server (NTRS)
Boland, Dillard; Coon, Richard; Byers, Kathryn; Levitt, David
1997-01-01
The software measures and estimation techniques appropriate to a Commercial Off the Shelf (COTS) integration project differ from those commonly used for custom software development. Labor and schedule estimation tools that model COTS integration are available. Like all estimation tools, they must be calibrated with the organization's local project data. This paper describes the calibration of a commercial model using data collected by the Flight Dynamics Division (FDD) of the NASA Goddard Space Flight Center (GSFC). The model calibrated is SLIM Release 4.0 from Quantitative Software Management (QSM). By adopting the SLIM reuse model and by treating configuration parameters as lines of code, we were able to establish a consistent calibration for COTS integration projects. The paper summarizes the metrics, the calibration process and results, and the validation of the calibration.
Manufacturing Cost Levelization Model – A User’s Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morrow, William R.; Shehabi, Arman; Smith, Sarah Josephine
The Manufacturing Cost Levelization Model is a cost-performance techno-economic model that estimates the total large-scale manufacturing costs necessary to produce a given product. It is designed to provide production cost estimates for technology researchers to help guide technology research and development toward an eventually cost-effective product. The model presented in this user’s guide is generic and can be tailored to the manufacturing of any product, including the generation of electricity (as a product). This flexibility, however, requires the user to develop the processes and process efficiencies that represent a full-scale manufacturing facility. The generic model comprises several modules that estimate variable costs (material, labor, and operating), fixed costs (capital and maintenance), financing structures (debt and equity financing), and tax implications (taxable income after equipment and building depreciation, debt interest payments, and expenses) of a notional manufacturing plant. A cash-flow method is used to estimate a selling price necessary for the manufacturing plant to recover its total cost of production. A levelized unit sales price ($ per unit of product) is determined by dividing the net present value of the manufacturing plant’s expenses ($) by the net present value of its product output. A user-defined production schedule drives the cash-flow method that determines the levelized unit price. In addition, an analyst can increase the levelized unit price to include a gross profit margin to estimate a product sales price. This model allows an analyst to understand the effect that any input variable could have on the cost of manufacturing a product. In addition, the tool is able to perform sensitivity analysis, which can be used to identify the key variables and assumptions that have the greatest influence on the levelized costs. This component is intended to help technology researchers focus their research attention on tasks that offer the greatest opportunities for cost reduction early in the research and development stages of technology invention.
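The levelization step the guide describes, dividing the net present value of expenses by the net present value of output, can be sketched directly. The three-year plant, expense stream, output schedule, discount rate, and profit margin below are hypothetical placeholders, not values from the model.

```python
def npv(values, rate):
    """Net present value of a stream of yearly amounts (years 1..N)."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(values, start=1))

def levelized_unit_price(yearly_expenses, yearly_output, discount_rate):
    """Levelized price ($/unit) = NPV of plant expenses ($) divided by
    NPV of product output (units), per the cash-flow method."""
    return (npv(yearly_expenses, discount_rate)
            / npv(yearly_output, discount_rate))

# hypothetical 3-year plant: $1.2M/yr expenses, 100k units/yr, 8% discount
price = levelized_unit_price([1.2e6] * 3, [1e5] * 3, 0.08)
# a gross profit margin can be layered on top for a sales-price estimate
sales_price = price * 1.15
```

With constant expenses and output, the discount factors cancel and the levelized price reduces to the simple ratio (here about $12 per unit); a user-defined production ramp would make the schedule matter.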
Gaziano, Thomas; Abrahams-Gessel, Shafika; Surka, Sam; Sy, Stephen; Pandya, Ankur; Denman, Catalina A; Mendoza, Carlos; Puoane, Thandi; Levitt, Naomi S
2015-09-01
In low-resource settings, a physician is not always available. We recently demonstrated that community health workers-instead of physicians or nurses-can efficiently screen adults for cardiovascular disease in South Africa, Mexico, and Guatemala. In this analysis we sought to determine the health and economic impacts of shifting this screening to community health workers equipped with either a paper-based or a mobile phone-based screening tool. We found that screening by community health workers was very cost-effective or even cost-saving in all three countries, compared to the usual clinic-based screening. The mobile application emerged as the most cost-effective strategy because it could save more lives than the paper tool at minimal extra cost. Our modeling indicated that screening by community health workers, combined with improved treatment rates, would increase the number of deaths averted from 15,000 to 110,000, compared to standard care. Policy makers should promote greater acceptance of community health workers by both national populations and health professionals and should increase their commitment to treating cardiovascular disease and making medications available. Project HOPE—The People-to-People Health Foundation, Inc.
Depth of manual dismantling analysis: a cost-benefit approach.
Achillas, Ch; Aidonis, D; Vlachokostas, Ch; Karagiannidis, A; Moussiopoulos, N; Loulos, V
2013-04-01
This paper presents a decision support tool for manufacturers and recyclers towards end-of-life strategies for waste electrical and electronic equipment. A mathematical formulation based on the cost-benefit analysis concept is herein analytically described in order to determine the parts and/or components of an obsolete product that should be either non-destructively recovered for reuse or be recycled. The framework optimally determines the depth of disassembly for a given product, taking into account economic considerations. On this basis, it embeds all relevant cost elements to be included in the decision-making process, such as recovered materials and (depreciated) parts/components, labor costs, energy consumption, equipment depreciation, quality control and warehousing. This tool can be part of the strategic decision-making process in order to maximize profitability or minimize end-of-life management costs. A case study to demonstrate the model's applicability is presented for a typical electronic product in terms of structure and material composition. Taking into account the market values of the pilot product's components, manual disassembly is shown to be profitable, with marginal revenues from recovered reusable materials estimated at €2.93-23.06, depending on the level of disassembly. Copyright © 2013 Elsevier Ltd. All rights reserved.
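The depth-of-disassembly decision can be sketched as a one-dimensional search: each additional step frees a part's recovery value but accrues labor/energy cost, and the optimal depth maximizes the cumulative net revenue. The part values and step costs below are illustrative, not the paper's case-study data.

```python
# EUR freed at each successive disassembly step, and EUR spent on it
part_value = [4.10, 1.20, 9.80, 0.60, 11.40]
step_cost  = [1.50, 1.50, 2.00, 2.00, 3.00]

def best_depth(values, costs):
    """Return (depth, net revenue) maximizing cumulative value - cost;
    depth 0 means no disassembly (net revenue 0)."""
    best = (0, 0.0)
    cumulative = 0.0
    for depth, (v, c) in enumerate(zip(values, costs), start=1):
        cumulative += v - c
        if cumulative > best[1]:
            best = (depth, cumulative)
    return best

depth, net = best_depth(part_value, step_cost)
```

Note that an unprofitable intermediate step (here step 2) can still be worth taking when it unlocks valuable parts deeper in the product, which is why the framework must evaluate cumulative rather than per-step economics.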
Improving Balance in TBI Using a Low-Cost Customized Virtual Reality Rehabilitation Tool
2016-10-01
AWARD NUMBER: W81XWH-14-2-0150. TITLE: Improving Balance in TBI Using a Low-Cost Customized Virtual Reality Rehabilitation Tool. ABSTRACT: The proposed study will implement and evaluate a novel, low-cost, Virtual Reality (VR) ...
Crash Attenuator Data Collection and Life Cycle Tool Development
DOT National Transportation Integrated Search
2014-06-14
This research study was aimed at data collection and development of a decision support tool for life cycle cost assessment of crash attenuators. Assessing attenuator life cycle costs based on in-place expected costs, and not just the initial cost, enha...
Toward transient finite element simulation of thermal deformation of machine tools in real-time
NASA Astrophysics Data System (ADS)
Naumann, Andreas; Ruprecht, Daniel; Wensch, Joerg
2018-01-01
Finite element models without simplifying assumptions can accurately describe the spatial and temporal distribution of heat in machine tools as well as the resulting deformation. In principle, this makes it possible to correct for displacements of the Tool Centre Point and enables high precision manufacturing. However, the computational cost of FE models and the restriction to generic algorithms in commercial tools like ANSYS prevent their operational use, since simulations have to run faster than real-time. For the case where heat diffusion is slow compared to machine movement, we introduce a tailored implicit-explicit multi-rate time stepping method of higher order based on spectral deferred corrections. Using the open-source FEM library DUNE, we show that fully coupled simulations of the temperature field are possible in real-time for a machine consisting of a stock sliding up and down on rails attached to a stand.
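The implicit-explicit splitting behind such a scheme can be illustrated with a deliberately simple first-order sketch (the paper uses higher-order spectral deferred corrections on full FE models, not this): one IMEX Euler step for the 1-D heat equation, treating the slow diffusion implicitly via a tridiagonal solve so that large time steps remain stable, and the fast-moving heat load explicitly. Grid, coefficients, and the homogeneous-boundary treatment are illustrative assumptions only.

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal)."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def imex_heat_step(u, dt, dx, alpha, source):
    """One IMEX Euler step for u_t = alpha * u_xx + q(x, t):
    diffusion implicit (unconditionally stable), source explicit.
    Homogeneous Dirichlet values are assumed outside the grid."""
    n = len(u)
    r = alpha * dt / dx ** 2
    a = [-r] * n
    b = [1.0 + 2.0 * r] * n
    c = [-r] * n
    a[0] = c[-1] = 0.0
    d = [u[i] + dt * source[i] for i in range(n)]
    return thomas(a, b, c, d)
```

A multi-rate version would take several cheap explicit substeps of the fast source per implicit diffusion step, which is the structure the tailored method exploits when diffusion is slow relative to machine movement.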
The Psychological Cost of Making Control Responses in the Nonstereotype Direction.
Chan, Alan H S; Hoffmann, Errol R
2016-12-01
The aim of this study was to develop a scale for the "psychological cost" of making control responses in the nonstereotype direction. Wickens, Keller, and Small suggested values for the psychological cost arising from having control/display relationships that were not in the common stereotype directions. We provide values of such costs specifically for these situations. Working from data of Chan and Hoffmann for 168 combinations of display location, control type, and display movement direction, we define values for the cost and compare these with the suggested values of Wickens et al.'s Frame of Reference Transformation Tool (FORT) model. We found marked differences between the values of the FORT model and the data of our experiments. The differences arise largely from the effects of the Worringham and Beringer visual field principle not being adequately considered in the previous research. A better indication of the psychological cost for use of incorrect control/display stereotypes is given. It is noted that these costs are applicable only to the factor of stereotype strength and not other factors considered in the FORT model. Effects of having controls and displays that are not arranged to operate with population expectancies can be readily determined from the data in this paper. © 2016, Human Factors and Ergonomics Society.
The cost of clinical mastitis in the first 30 days of lactation: An economic modeling tool.
Rollin, E; Dhuyvetter, K C; Overton, M W
2015-12-01
Clinical mastitis results in considerable economic losses for dairy producers and is most commonly diagnosed in early lactation. The objective of this research was to estimate the economic impact of clinical mastitis occurring during the first 30 days of lactation for a representative US dairy. A deterministic partial budget model was created to estimate direct and indirect costs per case of clinical mastitis occurring during the first 30 days of lactation. Model inputs were selected from the available literature, or when none were available, from herd data. The average case of clinical mastitis resulted in a total economic cost of $444, including $128 in direct costs and $316 in indirect costs. Direct costs included diagnostics ($10), therapeutics ($36), non-saleable milk ($25), veterinary service ($4), labor ($21), and death loss ($32). Indirect costs included future milk production loss ($125), premature culling and replacement loss ($182), and future reproductive loss ($9). Accurate decision making regarding mastitis control relies on understanding the economic impacts of clinical mastitis, especially the longer term indirect costs that represent 71% of the total cost per case of mastitis. Future milk production loss represents 28% of total cost, and future culling and replacement loss represents 41% of the total cost of a case of clinical mastitis. In contrast to older estimates, these values represent the current dairy economic climate, including milk price ($0.461/kg), feed price ($0.279/kg DM (dry matter)), and replacement costs ($2,094/head), along with the latest published estimates on the production and culling effects of clinical mastitis. This economic model is designed to be customized for specific dairy producers and their herd characteristics to better aid them in developing mastitis control strategies. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
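The abstract's partial-budget arithmetic checks out and can be reproduced directly; the figures below are the ones reported (USD per case), and only the variable names are ours.

```python
# Direct and indirect costs per case of early-lactation clinical mastitis
direct = {"diagnostics": 10, "therapeutics": 36, "non-saleable milk": 25,
          "veterinary service": 4, "labor": 21, "death loss": 32}
indirect = {"future milk production loss": 125,
            "premature culling and replacement loss": 182,
            "future reproductive loss": 9}

direct_total = sum(direct.values())          # 128
indirect_total = sum(indirect.values())      # 316
total_cost = direct_total + indirect_total   # 444

# Shares cited in the abstract
indirect_share = round(100 * indirect_total / total_cost)              # 71 (%)
culling_share = round(100 * indirect["premature culling and replacement loss"]
                      / total_cost)                                    # 41 (%)
milk_share = round(100 * indirect["future milk production loss"]
                   / total_cost)                                       # 28 (%)
```

Laying the budget out this way makes it easy to substitute herd-specific prices (milk, feed, replacements) and recompute the case cost, which is the customization the model is designed for.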
NASA Astrophysics Data System (ADS)
Christian, Paul M.
2002-07-01
This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provided a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed included its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in Part I included flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this series will cover a more in-depth look at the analysis and simulation capability and provide an update on the toolbox enhancements. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).
A cost-efficiency and health benefit approach to improve urban air quality.
Miranda, A I; Ferreira, J; Silveira, C; Relvas, H; Duque, L; Roebeling, P; Lopes, M; Costa, S; Monteiro, A; Gama, C; Sá, E; Borrego, C; Teixeira, J P
2016-11-01
When ambient air quality standards established in the EU Directive 2008/50/EC are exceeded, Member States are obliged to develop and implement Air Quality Plans (AQP) to improve air quality and health. Notwithstanding the achievements in emission reductions and air quality improvement, additional efforts need to be undertaken to improve air quality in a sustainable way - i.e. through a cost-efficiency approach. This work was developed in the scope of the recently concluded MAPLIA project "Moving from Air Pollution to Local Integrated Assessment", and focuses on the definition and assessment of emission abatement measures and their associated costs, air quality and health impacts and benefits by means of air quality modelling tools, health impact functions and cost-efficiency analysis. The MAPLIA system was applied to the Grande Porto urban area (Portugal), addressing PM10 and NOx as the most important pollutants in the region. Four different measures to reduce PM10 and NOx emissions were defined and characterized in terms of emissions and implementation costs, and combined into 15 emission scenarios, simulated by the TAPM air quality modelling tool. Air pollutant concentration fields were then used to estimate health benefits in terms of avoided costs (external costs), using dose-response health impact functions. Results revealed that, among the 15 scenarios analysed, the scenario including all 4 measures leads to a total net benefit of 0.3 M€ per year. The largest net benefit is obtained for the scenario considering the conversion of 50% of open fire places into heat recovery wood stoves. Although the implementation costs of this measure are high, the benefits outweigh the costs. Research outcomes confirm that the MAPLIA system is useful for policy decision support on air quality improvement strategies, and could be applied to other urban areas where AQP need to be implemented and monitored. Copyright © 2016. Published by Elsevier B.V.
Biomass supply chain optimisation for Organosolv-based biorefineries.
Giarola, Sara; Patel, Mayank; Shah, Nilay
2014-05-01
This work aims at providing a Mixed Integer Linear Programming modelling framework to help define planning strategies for the development of sustainable biorefineries. The up-scaling of an Organosolv biorefinery was addressed via optimisation of the whole system economics. Three real world case studies were addressed to show the high-level flexibility and wide applicability of the tool to model different biomass typologies (i.e. forest fellings, cereal residues and energy crops) and supply strategies. Model outcomes have revealed how supply chain optimisation techniques could help shed light on the development of sustainable biorefineries. Feedstock quality, quantity, temporal and geographical availability are crucial to determine biorefinery location and the cost-efficient way to supply the feedstock to the plant. Storage costs are relevant for biorefineries based on cereal stubble, while wood supply chains present dominant pretreatment operations costs. Copyright © 2014 Elsevier Ltd. All rights reserved.
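The site-selection core of such a supply-chain MILP can be illustrated with a toy enumeration: each candidate biorefinery site pays a fixed cost plus per-tonne haulage from every feedstock source, and the cheapest total wins. A real model would solve this as a MILP with capacity, storage, and temporal constraints; the sources, sites, and costs here are invented for illustration.

```python
# hypothetical feedstock sources (tonnes available) and candidate sites
supply = {"A": 40.0, "B": 25.0, "C": 35.0}
transport_cost = {  # EUR per tonne from each source to each site
    "site1": {"A": 3.0, "B": 7.0, "C": 5.0},
    "site2": {"A": 6.0, "B": 2.0, "C": 4.0},
}
fixed_cost = {"site1": 120.0, "site2": 150.0}  # EUR, annualized plant cost

def site_total_cost(site):
    """Fixed cost plus haulage of the full supply to one candidate site."""
    haul = sum(transport_cost[site][src] * qty for src, qty in supply.items())
    return fixed_cost[site] + haul

def best_site():
    """Enumerate candidates and pick the minimum total cost."""
    return min(fixed_cost, key=site_total_cost)
```

Even in this toy case the cheapest plant (site1's lower fixed cost) loses to the site with better geographic access to feedstock, which mirrors the paper's finding that feedstock availability drives biorefinery location.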
GIS-based spatial decision support system for grain logistics management
NASA Astrophysics Data System (ADS)
Zhen, Tong; Ge, Hongyi; Jiang, Yuying; Che, Yi
2010-07-01
Grain logistics is an important component of social logistics, owing to its frequent circulation and great volume. At present, there is no modern grain logistics distribution management system, and logistics costs are high. Geographic Information Systems (GIS) have been widely used for spatial data manipulation and model operations, and provide effective decision support through their spatial database management capabilities and cartographic visualization. In the present paper, a spatial decision support system (SDSS) is proposed to support policy makers and to reduce the cost of grain logistics. The system is composed of two major components, a grain logistics goods tracking model and a vehicle routing problem optimization model, and also allows incorporation of data coming from external sources. The proposed system is an effective tool for managing grain logistics in order to increase the speed of grain logistics and reduce grain circulation costs.
USDA-ARS?s Scientific Manuscript database
Computer simulation is a useful tool for benchmarking the electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature short time (HTST) pasteurization was extended to include models for pr...
Ranked set sampling: cost and optimal set size.
Nahhas, Ramzi W; Wolfe, Douglas A; Chen, Haiying
2002-12-01
McIntyre (1952, Australian Journal of Agricultural Research 3, 385-390) introduced ranked set sampling (RSS) as a method for improving estimation of a population mean in settings where sampling and ranking of units from the population are inexpensive when compared with actual measurement of the units. Two of the major factors in the usefulness of RSS are the set size and the relative costs of the various operations of sampling, ranking, and measurement. In this article, we consider ranking error models and cost models that enable us to assess the effect of different cost structures on the optimal set size for RSS. For reasonable cost structures, we find that the optimal RSS set sizes are generally larger than had been anticipated previously. These results will provide a useful tool for determining whether RSS is likely to lead to an improvement over simple random sampling in a given setting and, if so, what RSS set size is best to use in this case.
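A minimal simulation makes the RSS mechanics concrete. Under the (strong) assumption of perfect ranking, each cycle draws set_size units per rank, ranks them for free, and measures only one order statistic per set; the sample mean over the measured units remains unbiased while its variance drops relative to simple random sampling. This sketch ignores ranking error and the cost structure the article analyzes, and all distributional choices are ours.

```python
import random

def rss_sample(draw, set_size, cycles, rng):
    """Ranked set sample with perfect ranking: for each rank r in a cycle,
    draw `set_size` units, rank them, and measure only the r-th smallest."""
    measured = []
    for _ in range(cycles):
        for r in range(set_size):
            ranked = sorted(draw(rng) for _ in range(set_size))
            measured.append(ranked[r])
    return measured

rng = random.Random(42)
draw = lambda rng: rng.gauss(0.0, 1.0)     # stand-in population: N(0, 1)
sample = rss_sample(draw, 3, 100, rng)     # 300 measured units
rss_mean = sum(sample) / len(sample)       # unbiased for the true mean 0
```

The cost trade-off is visible in the code: the 300 measurements consumed 900 sampled-and-ranked units, so RSS only pays off when ranking is much cheaper than measurement, which is precisely what the article's cost models quantify when choosing the optimal set size.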
The Applications of NASA Mission Technologies to the Greening of Human Impact
NASA Technical Reports Server (NTRS)
Sims, Michael H.
2009-01-01
I will give an overview talk about flight software systems, robotics technologies, and modeling for energy minimization as applied to vehicles and building infrastructures. A dominant issue in both the design and operation of robotic spacecraft is the minimization of energy use. In the design and building of spacecraft, increased power is acquired only at the cost of additional mass and volume, and ultimately cost. Consequently, interplanetary spacecraft are designed to have the minimum essential power, and those designs often incorporate careful timing of all power use. Operationally, the availability of power is the most influential constraint on the use of planetary surface robots, such as the Mars Exploration Rovers. The amount of driving done, the amount of science accomplished, and indeed the survivability of the spacecraft itself are determined by the power available for use. For the Mars Exploration Rovers, four tools are used: (1) models of the rover and its thermal and power use; (2) predictive environmental models of power input and thermal environment; (3) fine-grained manipulation of power use; and (4) optimization modeling and planning tools. In this talk I will discuss possible applications of this methodology to minimizing power use on Earth, especially in buildings.
NASA Astrophysics Data System (ADS)
Vincent, Timothy J.; Rumpfkeil, Markus P.; Chaudhary, Anil
2018-03-01
The complex, multi-faceted physics of laser-based additive metals processing tends to demand high-fidelity models and costly simulation tools to provide predictions accurate enough to aid in selecting process parameters. Of particular difficulty is the accurate determination of melt pool shape and size, which are useful for predicting lack-of-fusion, as this typically requires an adequate treatment of thermal and fluid flow. In this article we describe a novel numerical simulation tool which aims to achieve a balance between accuracy and cost. This is accomplished by making simplifying assumptions regarding the behavior of the gas-liquid interface for processes with a moderate energy density, such as Laser Engineered Net Shaping (LENS). The details of the implementation, which is based on the solver simpleFoam of the well-known software suite OpenFOAM, are given here and the tool is verified and validated for a LENS process involving Ti-6Al-4V. The results indicate that the new tool predicts width and height of a deposited track to engineering accuracy levels.
NASA Astrophysics Data System (ADS)
Pulido-Velazquez, Manuel; Lopez-Nicolas, Antonio; Harou, Julien J.; Andreu, Joaquin
2013-04-01
Hydrologic-economic models allow integrated analysis of water supply, demand and infrastructure management at the river basin scale. These models simultaneously analyze engineering, hydrology and economic aspects of water resources management. Two new tools have been designed to develop models within this approach: a simulation tool (SIM_GAMS), for models in which water is allocated each month based on supply priorities to competing uses and system operating rules, and an optimization tool (OPT_GAMS), in which water resources are allocated optimally following economic criteria. The characterization of the water resource network system requires a connectivity matrix representing the topology of the elements, generated using HydroPlatform. HydroPlatform, an open-source software platform for network (node-link) models, makes it possible to store, display, and export all information needed to characterize the system. Two generic non-linear models have been programmed in GAMS to use the inputs from HydroPlatform in simulation and optimization models. The simulation model allocates water resources on a monthly basis, according to different targets (demands, storage, environmental flows, hydropower production, etc.), priorities and other system operating rules (such as reservoir operating rules). The optimization model's objective function is designed so that the system meets operational targets (ranked according to priorities) each month while following system operating rules. This function is analogous to the one used in the simulation module of the DSS AQUATOOL. Each element of the system has its own contribution to the objective function through unit cost coefficients that preserve the relative priority rank and the system operating rules. The model incorporates groundwater and stream-aquifer interaction (allowing conjunctive use simulation) with a wide range of modeling options, from lumped and analytical approaches to distributed-parameter models (eigenvalue approach).
Such functionality is not typically included in other water DSS. Based on the resulting water resources allocation, the model calculates operating and water scarcity costs caused by supply deficits based on economic demand functions for each demand node. The optimization model allocates the available resource over time based on economic criteria (net benefits from demand curves and cost functions), minimizing the total water scarcity and operating cost of water use. This approach provides solutions that optimize the economic efficiency (as total net benefit) in water resources management over the optimization period. Both models must be used together in water resource planning and management. The optimization model provides an initial insight on economically efficient solutions, from which different operating rules can be further developed and tested using the simulation model. The hydro-economic simulation model allows assessing economic impacts of alternative policies or operating criteria, avoiding the perfect foresight issues associated with the optimization. The tools have been applied to the Jucar river basin (Spain) in order to assess the economic results corresponding to the current modus operandi of the system and compare them with the solution from the optimization that maximizes economic efficiency. Acknowledgments: The study has been partially supported by the European Community 7th Framework Project (GENESIS project, n. 226536) and the Plan Nacional I+D+I 2008-2011 of the Spanish Ministry of Science and Innovation (CGL2009-13238-C02-01 and CGL2009-13238-C02-02).
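The monthly priority-driven allocation and the scarcity-cost calculation described above can be sketched in miniature. This is a toy version: demands are served strictly in rank order, deficits are priced by an assumed quadratic economic demand function, and all node names, volumes, and cost coefficients are hypothetical (the GAMS models handle storage, environmental flows, groundwater, and reservoir rules besides).

```python
def allocate(available, demands):
    """Allocate the month's available water strictly by priority rank
    (1 = highest), as in a priority-driven simulation step."""
    allocation = {}
    for name, target, rank in sorted(demands, key=lambda d: d[2]):
        allocation[name] = min(target, available)
        available -= allocation[name]
    return allocation

# hypothetical month: 100 hm3 available across three demand nodes
demands = [("urban", 40.0, 1), ("irrigation", 80.0, 2), ("industry", 20.0, 3)]
alloc = allocate(100.0, demands)
deficit = {name: target - alloc[name] for name, target, _ in demands}

# scarcity cost from an assumed quadratic demand function per node
unit_cost = {"urban": 3.0, "irrigation": 0.5, "industry": 1.2}
scarcity_cost = sum(unit_cost[n] * d ** 2 for n, d in deficit.items())
```

An economic optimization step would instead allocate the last unit of water to whichever node's demand curve values it most, which is the efficiency comparison the Jucar application draws between the two models.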
Tax, Casper; Govaert, Paulien H M; Stommel, Martijn W J; Besselink, Marc G H; Gooszen, Hein G; Rovers, Maroeska M
2017-11-02
To illustrate how decision modeling may identify relevant uncertainty and can preclude or identify areas of future research in surgery. To optimize use of research resources, a tool is needed that assists in identifying relevant uncertainties and the added value of reducing them. The clinical pathway for laparoscopic distal pancreatectomy (LDP) versus open distal pancreatectomy (ODP) for nonmalignant lesions was modeled in a decision tree. Cost-effectiveness based on complications, hospital stay, costs, quality of life, and survival was analyzed. The effect of existing uncertainty on the cost-effectiveness was addressed, as well as the expected value of eliminating uncertainties. Based on 29 nonrandomized studies (3,701 patients), the model shows that LDP is more cost-effective than ODP. Scenarios in which LDP does not outperform ODP for cost-effectiveness seem unrealistic, e.g., a 30-day mortality rate 1.79 times higher after LDP than after ODP, conversion in 62.2%, surgical repair of incisional hernias in 21% after LDP, or an average 2.3 days longer hospital stay after LDP than after ODP. Taking all uncertainty into account, LDP remained more cost-effective. Minimizing these uncertainties did not change the outcome. The results show how decision-analytical modeling can help to identify relevant uncertainty and guide decisions for future research in surgery. Based on the currently available evidence, a randomized clinical trial on complications, hospital stay, costs, quality of life, and survival is highly unlikely to change the conclusion that LDP is more cost-effective than ODP.
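The comparison at the root of such decision-tree models reduces to an incremental cost-effectiveness ratio (ICER) over the expected cost and QALYs of each arm. A minimal sketch; the per-patient figures below are invented placeholders, not results from the study:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY.
    A negative ratio from lower cost and higher QALYs means the new
    strategy dominates the old one."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical per-patient expected values (illustrative only).
ldp_cost, ldp_qaly = 11000.0, 0.82
odp_cost, odp_qaly = 12500.0, 0.78
ratio = icer(ldp_cost, ldp_qaly, odp_cost, odp_qaly)  # negative: LDP dominates
```

Sensitivity analysis then amounts to recomputing this ratio while varying inputs (mortality, conversion rate, length of stay) over their plausible ranges, which is how the "unrealistic scenario" thresholds above are found.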
2017-03-21
Energy and Water Projects. March 21, 2017. [Report fragment; standard report documentation page boilerplate removed.] ...included reduced system energy use and cost as well as improved performance driven by autonomous commissioning and optimized system control. ...improve system performance and reduce energy use and cost. However, implementing these solutions into the extremely heterogeneous and often...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Timbario, Thomas A.; Timbario, Thomas J.; Laffen, Melissa J.
2011-04-12
Currently, several cost-per-mile calculators exist that can provide estimates of acquisition and operating costs for consumers and fleets. However, these calculators are limited in their ability to determine the difference in cost per mile for consumer versus fleet ownership, to calculate the costs beyond one ownership period, to show the sensitivity of the cost per mile to the annual vehicle miles traveled (VMT), and to estimate future increases in operating and ownership costs. Oftentimes, these tools apply a constant percentage increase over the time period of vehicle operation, or in some cases, no increase in direct costs at all over time. A more accurate cost-per-mile calculator has been developed that allows the user to analyze these costs for both consumers and fleets. Operating costs included in the calculation tool include fuel, maintenance, tires, and repairs; ownership costs include insurance, registration, taxes and fees, depreciation, financing, and tax credits. The calculator was developed to allow simultaneous comparisons of conventional light-duty internal combustion engine (ICE) vehicles, mild and full hybrid electric vehicles (HEVs), and fuel cell vehicles (FCVs). Additionally, multiple periods of operation, as well as three different annual VMT values for both the consumer case and fleets, can be investigated to the year 2024. These capabilities were included since today's "cost to own" calculators typically include the ability to evaluate only one VMT value and are limited to current model year vehicles. The calculator allows the user to select between default values or user-defined values for certain inputs including fuel cost, vehicle fuel economy, manufacturer's suggested retail price (MSRP) or invoice price, depreciation and financing rates.
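The core improvement described, escalating operating costs over the ownership period rather than holding them constant, can be sketched as follows. The 3% escalation rate and the sample dollar and mileage figures are illustrative assumptions, not values from the calculator:

```python
def cost_per_mile(annual_costs, annual_vmt, years, escalation=0.03):
    """Total ownership cost per mile over a multi-year period, with each
    year's costs grown by a fixed escalation rate (rather than the
    constant costs many calculators assume)."""
    total = sum(annual_costs * (1 + escalation) ** y for y in range(years))
    return total / (annual_vmt * years)

# $4,000/year in combined operating and ownership costs, 12,000 miles/year,
# 5-year ownership period: roughly $0.35 per mile.
cpm = cost_per_mile(4000.0, 12000.0, 5)
```

Comparing two powertrains then just means evaluating this function with each vehicle's cost and fuel-economy assumptions and comparing the results across several VMT values.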
ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis
NASA Technical Reports Server (NTRS)
Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.
2006-01-01
Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.
System Architecture Modeling for Technology Portfolio Management using ATLAS
NASA Technical Reports Server (NTRS)
Thompson, Robert W.; O'Neil, Daniel A.
2006-01-01
Strategic planners and technology portfolio managers have traditionally relied on consensus-based tools, such as the Analytical Hierarchy Process (AHP) and Quality Function Deployment (QFD), in planning the funding of technology development. While useful to a certain extent, these tools are limited in their ability to fully quantify the impact of a technology choice on system mass, system reliability, project schedule, and lifecycle cost. The Advanced Technology Lifecycle Analysis System (ATLAS) aims to provide strategic planners a decision support tool for analyzing technology selections within a Space Exploration Architecture (SEA). Using ATLAS, strategic planners can select physics-based system models from a library, configure the systems with technologies and performance parameters, and plan the deployment of a SEA. Key parameters for current and future technologies have been collected from subject-matter experts and other documented sources in the Technology Tool Box (TTB). ATLAS can be used to compare the technical feasibility and economic viability of a set of technology choices for one SEA, and compare it against another set of technology choices or another SEA. System architecture modeling in ATLAS is a multi-step process. First, the modeler defines the system-level requirements. Second, the modeler identifies technologies of interest whose impact on the SEA is to be assessed. Third, the system modeling team creates models of architecture elements (e.g., launch vehicles, in-space transfer vehicles, crew vehicles) if they are not already in the model library. Finally, the architecture modeler develops a script for the ATLAS tool to run, and the results for comparison are generated.
Commercial Building Energy Asset Score
DOE Office of Scientific and Technical Information (OSTI.GOV)
This software (Asset Scoring Tool) is designed to help building owners and managers gain insight into the as-built efficiency of their buildings. It is a web tool where users can enter their building information and obtain an asset score report. The asset score report consists of modeled building energy use (by end use and by fuel type), evaluations of building systems (envelope, lighting, heating, cooling, service hot water), and recommended energy efficiency measures. The intended users are building owners and operators who have limited knowledge of building energy efficiency. The scoring tool collects minimal building data (~20 data entries) from users and builds a full-scale energy model using the inference functionalities from the Facility Energy Decision System (FEDS). The scoring tool runs real-time building energy simulation using EnergyPlus and performs life-cycle cost analysis using FEDS. An API is also under development to allow third-party applications to exchange data with the web service of the scoring tool.
Alternative Fuels Data Center: Biodiesel Vehicle Emissions
[Interactive-page residue: the Alternative Fuels Data Center's Vehicle Cost Calculator lets users choose a vehicle (e.g., electric or hybrid electric) and compare its fuel cost and emissions with a conventional gasoline vehicle; related tools include the Petroleum Reduction Planning Tool and the AFLEET Tool.]
75 FR 13289 - Agency Information Collection Request, 60-Day Public Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-19
[Notice fragment: burden-estimate table for the Communities Putting Prevention to Work (CPPW) Cost Study Tool, listing respondent categories such as Program Director and Business Manager, together with the standard request for comments on the necessity of the collection for the performance of the agency's functions and the accuracy of the estimated burden. Communities Putting Prevention to Work Cost Study Instrument, OMB No. 0990-NEW, Office of the Assistant Secretary for Planning...]
Development of a Carbon Management Geographic Information System (GIS) for the United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howard Herzog; Holly Javedan
In this project a Carbon Management Geographical Information System (GIS) for the US was developed. The GIS stored, integrated, and manipulated information relating to the components of carbon management systems. Additionally, the GIS was used to interpret and analyze the effect of developing these systems. This report documents the key deliverables from the project: (1) Carbon Management Geographical Information System (GIS) Documentation; (2) Stationary CO2 Source Database; (3) Regulatory Data for CCS in United States; (4) CO2 Capture Cost Estimation; (5) CO2 Storage Capacity Tools; (6) CO2 Injection Cost Modeling; (7) CO2 Pipeline Transport Cost Estimation; (8) CO2 Source-Sink Matching Algorithm; and (9) CO2 Pipeline Transport and Cost Model.
Solving lot-sizing problem with quantity discount and transportation cost
NASA Astrophysics Data System (ADS)
Lee, Amy H. I.; Kang, He-Yau; Lai, Chun-Mei
2013-04-01
Owing to today's increasingly competitive market and ever-changing manufacturing environment, the inventory problem is becoming more complicated to solve. The incorporation of heuristic methods has become a new trend for tackling such complex problems in the past decade. This article considers a lot-sizing problem whose objective is to minimise total costs, where the costs include ordering, holding, purchase and transportation costs, under the requirement that no inventory shortage is allowed in the system. We first formulate the lot-sizing problem as a mixed integer programming (MIP) model. Next, an efficient genetic algorithm (GA) model is constructed for solving large-scale lot-sizing problems. An example with two cases from a touch panel manufacturer is used to illustrate the practicality of these models, and a sensitivity analysis is applied to understand the impact of changes in parameters on the outcomes. The results demonstrate that both the MIP model and the GA model are effective and relatively accurate tools for determining multi-period replenishment for touch panel manufacturing with quantity discounts and batch transportation. The contributions of this article are to construct an MIP model that obtains an optimal solution when the problem is not too complicated and to present a GA model that finds a near-optimal solution efficiently when the problem is complicated.
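For intuition, the trade-off at the heart of lot-sizing, paying a fixed ordering cost less often versus holding inventory longer, can be seen in the classic uncapacitated case, which is solvable exactly by dynamic programming. This Wagner-Whitin-style sketch omits the quantity discounts and batch transportation that motivate the article's MIP and GA; the demand and cost figures are invented:

```python
def wagner_whitin(demand, order_cost, hold_cost):
    """Uncapacitated single-item lot sizing by dynamic programming:
    best[t] = min cost of covering periods 0..t-1, where an order placed
    in period s covers all demand through period t-1."""
    n = len(demand)
    best = [0.0] + [float("inf")] * n
    for t in range(1, n + 1):
        for s in range(t):
            # Units for period j, ordered in period s, are held j - s periods.
            holding = sum(hold_cost * (j - s) * demand[j] for j in range(s, t))
            best[t] = min(best[t], best[s] + order_cost + holding)
    return best[n]

# Three periods of demand; ordering everything up front beats ordering
# each period whenever holding costs stay below the fixed order cost.
best_cost = wagner_whitin([20, 30, 40], order_cost=100.0, hold_cost=1.0)
```

Adding quantity discounts makes the per-unit purchase cost depend on the order size and breaks the simple optimal substructure, which is why the article turns to an MIP for exact solutions and a GA for large instances.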
Gruss, H-J; Cockett, A; Leicester, R J
2012-01-01
With the availability of several bowel cleansing agents, physicians and hospitals performing colonoscopies will often base their choice of cleansing agent purely on acquisition cost. Therefore, an easy to use budget impact model has been developed and established as a tool to compare total colon preparation costs between different established bowel cleansing agents. The model was programmed in Excel and designed as a questionnaire evaluating information on treatment costs for a range of established bowel cleansing products. The sum of costs is based on National Health Service reference costs for bowel cleansing products. Estimations are made for savings achievable when using a 2-litre polyethylene glycol with ascorbate components solution (PEG+ASC) in place of other bowel cleansing solutions. Test data were entered into the model to confirm validity and sensitivity. The model was then applied to a set of audit cost data from a major hospital colonoscopy unit in the UK. Descriptive analysis of the test data showed that the main cost drivers in the colonoscopy process are the procedure costs and costs for bed days rather than drug acquisition costs, irrespective of the cleansing agent. Audit data from a colonoscopy unit in the UK confirmed the finding with a saving of £107,000 per year in favour of PEG+ASC when compared to sodium picosulphate with magnesium citrate solution (NaPic+MgCit). For every patient group the model calculated overall cost savings. This was irrespective of the higher drug expenditure associated with the use of PEG+ASC for bowel preparation. Savings were mainly realized through reduced costs for repeat colonoscopy procedures and associated costs, such as inpatient length of stay. The budget impact model demonstrated that the primary cost driver was the procedure cost for colonoscopy. Savings can be realized through the use of PEG+ASC despite higher drug acquisition costs relative to the comparator products. 
From a global hospital funding perspective, the acquisition costs of bowel preparations should not be used as the primary reason to select the preferred treatment agent, but should be part of the consideration, with an emphasis on the clinical outcome.
Menzin, Joseph; Marton, Jeno P; Menzin, Jordan A; Willke, Richard J; Woodward, Rebecca M; Federico, Victoria
2012-06-25
Researchers and policy makers have determined that accounting for productivity costs, or "indirect costs," may be as important as including direct medical expenditures when evaluating the societal value of health interventions. These costs are also important when estimating the global burden of disease. The estimation of indirect costs is commonly done on a country-specific basis. However, there are few studies that evaluate indirect costs across countries using a consistent methodology. Using the human capital approach, we developed a model that estimates productivity costs as the present value of lifetime earnings (PVLE) lost due to premature mortality. Applying this methodology, the model estimates productivity costs for 29 selected countries, both developed and emerging. We also provide an illustration of how the inclusion of productivity costs contributes to an analysis of the societal burden of smoking. A sensitivity analysis is undertaken to assess productivity costs on the basis of the friction cost approach. PVLE estimates were higher for certain subpopulations, such as men, younger people, and people in developed countries. In the case study, productivity cost estimates from our model showed that productivity loss was a substantial share of the total cost burden of premature mortality due to smoking, accounting for over 75 % of total lifetime costs in the United States and 67 % of total lifetime costs in Brazil. Productivity costs were much lower using the friction cost approach among those of working age. Our PVLE model is a novel tool allowing researchers to incorporate the value of lost productivity due to premature mortality into economic analyses of treatments for diseases or health interventions. We provide PVLE estimates for a number of emerging and developed countries. 
Including productivity costs in a health economics study allows for a more comprehensive analysis, and, as demonstrated by our illustration, can have important effects on the results and conclusions.
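The human capital calculation at the core of the model, discounting expected earnings from the age at death through retirement, can be sketched as follows. The discount and growth rates, the flat survival curve, and the earnings figure are illustrative assumptions, not the country-specific inputs the published model uses:

```python
def pvle(age, retirement_age, annual_earnings, survival,
         discount=0.03, growth=0.01):
    """Present value of lifetime earnings lost to death at `age`
    (human capital approach). `survival[k]` is the probability of
    surviving k more years had the death not occurred."""
    pv = 0.0
    for k in range(retirement_age - age):
        earnings = annual_earnings * (1 + growth) ** k  # projected wage growth
        pv += survival[k] * earnings / (1 + discount) ** k
    return pv

# Degenerate check: no discounting, no growth, certain survival for
# 3 working years at $50,000/year.
loss = pvle(60, 63, 50000.0, [1.0, 1.0, 1.0], discount=0.0, growth=0.0)
```

The friction cost alternative mentioned in the sensitivity analysis would instead truncate the sum at the time needed to replace the worker, which is why it yields much lower estimates for those of working age.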
The Stirling engine as a low cost tool to educate mechanical engineers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gros, J.; Munoz, M.; Moreno, F.
1995-12-31
The University of Zaragoza, through CIRCE, the New Enterprise foundation, an Opel foundation and the local Government of Aragon, has developed a program to introduce the Stirling engine as a low cost tool to educate students in mechanical engineering. The promotion of a prize like GNAT Power, organized by the magazine Model Engineer in London, has improved the practical education of students in the field of mechanical devices and thermal engines. Two editions of the contest, 1993 and 1994, awarded the highest-power Stirling engine built using only a small paraffin candle as a heat source. Four engines were presented in the first edition, with an average power of about 100 mW, and seven engines in the second one, achieving a power of about 230 mW. Presentations in Technical Schools and the University have been carried out. Low cost measurement tools have also been made, including an electronic device to draw the real internal pressure-volume diagram using a PC. Very didactic software to design classic kinematic alpha, beta and gamma engines, plus Ringbom beta and gamma engines, has been created. A book is going to be published (in Spanish) explaining the design of small Stirling engines as a way to start low cost research in thermal engines, a very difficult target with IC engines.
Multifidelity Analysis and Optimization for Supersonic Design
NASA Technical Reports Server (NTRS)
Kroo, Ilan; Willcox, Karen; March, Andrew; Haas, Alex; Rajnarayan, Dev; Kays, Cory
2010-01-01
Supersonic aircraft design is a computationally expensive optimization problem, and multifidelity approaches offer a significant opportunity to reduce design time and computational cost. This report presents tools developed to improve supersonic aircraft design capabilities, including aerodynamic tools for supersonic aircraft configurations, a systematic way to manage model uncertainty, and multifidelity model management concepts that incorporate uncertainty. The aerodynamic analysis tools developed are appropriate for use in a multifidelity optimization framework and include four analysis routines to estimate the lift and drag of a supersonic airfoil, as well as a multifidelity supersonic drag code that estimates the drag of aircraft configurations with three different methods: an area rule method, a panel method, and an Euler solver. In addition, five multifidelity optimization methods are developed, which include local and global methods as well as gradient-based and gradient-free techniques.
On the Development of a Hospital-Patient Web-Based Communication Tool: A Case Study From Norway.
Granja, Conceição; Dyb, Kari; Bolle, Stein Roald; Hartvigsen, Gunnar
2015-01-01
Surgery cancellations are undesirable in hospital settings as they increase costs, reduce productivity and efficiency, and directly affect the patient. The problem of elective surgery cancellations in a North Norwegian University Hospital is addressed. Based on a three-step methodology conducted at the hospital, the preoperative planning process was modeled taking into consideration the narratives from different health professions. From the analysis of the generated process models, it is concluded that in order to develop a useful patient centered web-based communication tool, it is necessary to fully understand how hospitals plan and organize surgeries today. Moreover, process reengineering is required to generate a standard process that can serve as a tool for health ICT designers to define the requirements for a robust and useful system.
COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL
NASA Technical Reports Server (NTRS)
Roush, G. B.
1994-01-01
The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. 
COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo Professional 5.0 for recompilation. An executable is provided on the distribution diskettes. COSTMODL requires 512K RAM. The standard distribution medium for COSTMODL is three 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. COSTMODL was developed in 1991. IBM PC is a registered trademark of International Business Machines. Borland and Turbo Pascal are registered trademarks of Borland International, Inc. Turbo Professional is a trademark of TurboPower Software. MS-DOS is a registered trademark of Microsoft Corporation.
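One of the five algorithms COSTMODL incorporates, the Basic COCOMO model, estimates development effort as a power law in delivered source size. A minimal sketch using the standard published coefficients (COSTMODL's user-calibrated values may differ):

```python
def basic_cocomo(kloc, mode="organic"):
    """Basic COCOMO effort estimate in person-months for a project of
    `kloc` thousand delivered source lines. Coefficients are Boehm's
    standard values for the three development modes."""
    coeffs = {
        "organic": (2.4, 1.05),       # small teams, familiar environment
        "semidetached": (3.0, 1.12),  # intermediate
        "embedded": (3.6, 1.20),      # tight hardware/operational constraints
    }
    a, b = coeffs[mode]
    return a * kloc ** b

effort = basic_cocomo(10.0)  # ~27 person-months for a 10 KLOC organic project
```

The Intermediate and Ada COCOMO variants mentioned above multiply this nominal effort by cost drivers (personnel capability, product complexity, tool support, and so on), which is where COSTMODL's customization to a particular organization comes in.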
DOT National Transportation Integrated Search
2014-12-01
The objective of this project is to design decision-support tools for identifying biorefinery locations that ensure a cost-efficient and reliable supply chain. We built mathematical models which take into consideration the benefits (such as acces...
E-Commerce and Privacy: Conflict and Opportunity.
ERIC Educational Resources Information Center
Farah, Badie N.; Higby, Mary A.
2001-01-01
Electronic commerce has intensified conflict between businesses' need to collect data and customers' desire to protect privacy. Web-based privacy tools and legislation could add to the costs of e-commerce and reduce profitability. Business models not based on profiling customers may be needed. (SK)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glass, Samuel W.; Fifield, Leonard S.; Jones, Anthony M.
Cable insulation polymers are among the materials most susceptible to age-related degradation within a nuclear power plant. This is recognized by both regulators and utilities, so all plants have developed cable aging management programs to detect damage before critical component failure, in compliance with regulatory guidelines. Although a wide range of tools are available to evaluate cables and cable systems, cable aging management programs vary in how condition monitoring and NDE are conducted as utilities search for the most reliable and cost-effective ways to assess cable system condition. Frequency domain reflectometry (FDR) is emerging as a valuable tool to locate and assess damaged portions of a cable system with minimal cost; in most cases it requires access to only one of the cable's terminal ends. This work examines a physics-based model of a cable system and relates it to FDR measurements for a better understanding of specific damage influences on defect detectability.
Gray, Ewan; Butler, Holly J; Board, Ruth; Brennan, Paul M; Chalmers, Anthony J; Dawson, Timothy; Goodden, John; Hamilton, Willie; Hegarty, Mark G; James, Allan; Jenkinson, Michael D; Kernick, David; Lekka, Elvira; Livermore, Laurent J; Mills, Samantha J; O'Neill, Kevin; Palmer, David S; Vaqas, Babar; Baker, Matthew J
2018-05-24
To determine the potential costs and health benefits of a serum-based spectroscopic triage tool for brain tumours, which could be developed to reduce diagnostic delays in the current clinical pathway. A model-based health pre-trial economic assessment. Decision tree models were constructed based on simplified diagnostic pathways. Models were populated with parameters identified from rapid reviews of the literature and clinical expert opinion. Explored as a test in both primary and secondary care (neuroimaging) in the UK health service, as well as application to the USA. Calculations based on an initial cohort of 10 000 patients. In primary care, it is estimated that the volume of tests would approach 75 000 per annum. The volume of tests in secondary care is estimated at 53 000 per annum. The primary outcome measure was quality-adjusted life-years (QALY), which were employed to derive incremental cost-effectiveness ratios (ICER) in a cost-effectiveness analysis. Results indicate that using a blood-based spectroscopic test in both scenarios has the potential to be highly cost-effective in a health technology assessment agency decision-making process, as ICERs were well below standard threshold values of £20 000-£30 000 per QALY. This test may be cost-effective in both scenarios with test sensitivities and specificities as low as 80%; however, the price of the test would need to be lower (less than approximately £40). Use of this test as triage tool in primary care has the potential to be both more effective and cost saving for the health service. In secondary care, this test would also be deemed more effective than the current diagnostic pathway. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
University of California, Berkeley; Wei, Max; Lipman, Timothy
2014-06-23
A total cost of ownership model is described for low temperature proton exchange membrane stationary fuel cell systems for combined heat and power (CHP) applications from 1-250 kW and backup power applications from 1-50 kW. System designs and functional specifications for these two applications were developed across the range of system power levels. Bottom-up cost estimates were made for balance-of-plant costs, and detailed direct cost estimates for key fuel cell stack components were derived using design-for-manufacturing-and-assembly techniques. The development of high-throughput, automated processes achieving high yield is projected to reduce the cost of fuel cell stacks to the $300/kW level at an annual production volume of 100 MW. Several promising combinations of building types and geographical locations in the U.S. were identified for installation of fuel cell CHP systems based on the LBNL modelling tool DER-CAM. Life-cycle modelling and externality assessment were done for hotels and hospitals. Reduced electricity demand charges, heating credits and carbon credits can reduce the effective cost of electricity ($/kWhe) by 26-44 percent in locations such as Minneapolis, where high carbon intensity electricity from the grid is displaced by a fuel cell system operating on reformate fuel. This project extends the scope of existing cost studies to include externalities and ancillary financial benefits and thus provides a more comprehensive picture of fuel cell system benefits, consistent with a policy and incentive environment that increasingly values these ancillary benefits. The project provides a critical new modelling capacity and should aid a broad range of policy makers in assessing the integrated costs and benefits of fuel cell systems versus other distributed generation technologies.
Shireman, Theresa I.; Svarstad, Bonnie L.
2016-01-01
Objective To assess the cost-effectiveness of the 6-month Team Education and Adherence Monitoring (TEAM) intervention for black patients with hypertension in community pharmacies using prospectively collected cost data. Design Cost-effectiveness analysis of a cluster-randomized trial. Setting 28 chain pharmacies in five Wisconsin cities from December 2006 to February 2009. Participants 576 black patients with uncontrolled hypertension. Intervention Pharmacists and pharmacy technicians using novel tools for improving adherence and feedback to patients and physicians, as compared with an information-only control group. Main outcome measure(s) Incremental cost analysis of variable costs from the pharmacy perspective, captured prospectively at the participant level. Outcomes (effect measures) were 6-month refill adherence, changes in SBP and DBP, and the proportion of patients achieving BP control. Results Mean cost of intervention personnel time and tools was $104.8 ± 45.2. Incremental variable costs per mmHg decrease in SBP and DBP were $22.2 ± 16.3 and $66.0 ± 228.4, respectively. The cost of helping one more person achieve the BP goal (< 140/90) was $665.2 ± 265.2; the cost of helping one more person achieve good refill adherence was $463.3 ± 110.7. Prescription drug costs were higher for the TEAM group ($392.8, SD = 396.3 versus $307.0, SD = 295.2, p = 0.02). The start-up cost for pharmacy furniture, equipment, and privacy screen was $168 per pharmacy. Conclusions Our randomized, practice-based intervention demonstrated that community pharmacists can implement a cost-effective intervention to improve hypertension control in blacks. This approach imposes a nominal expense at the pharmacy level, can be integrated into the ongoing pharmacist-patient relationship, and can enhance clinical and behavioral outcomes. PMID:27184784
Shui, Wuyang; Zhou, Mingquan; Chen, Shi; Pan, Zhouxian; Deng, Qingqiong; Yao, Yong; Pan, Hui; He, Taiping; Wang, Xingce
2017-01-01
Virtual digital resources and printed models have become indispensable tools for medical training and surgical planning. Nevertheless, printed models of soft tissue organs are still challenging to reproduce. This study adopts open source packages and a low-cost desktop 3D printer to convert multiple modalities of medical images to digital resources (volume rendering images and digital models) and lifelike printed models, which are useful to enhance our understanding of the geometric structure and complex spatial nature of anatomical organs. Neuroimaging technologies such as CT, CTA, MRI, and TOF-MRA collect serial medical images. The procedures for producing digital resources can be divided into volume rendering and medical image reconstruction. To verify the accuracy of reconstruction, this study presents qualitative and quantitative assessments. Subsequently, digital models are archived as stereolithography format files and imported to the bundled software of the 3D printer. The printed models are produced using polylactide filament materials. We have successfully converted multiple modalities of medical images to digital resources and printed models for both hard organs (cranial base and tooth) and soft tissue organs (brain, blood vessels of the brain, the heart chambers and vessel lumen, and pituitary tumor). Multiple digital resources and printed models were provided to illustrate the anatomical relationship between organs and complicated surrounding structures. Three-dimensional printing (3DP) is a powerful tool to produce lifelike and tangible models. We present an available and cost-effective method for producing both digital resources and printed models. The choice of modality in medical images and the processing approach is important when reproducing soft tissue organs models. The accuracy of the printed model is determined by the quality of organ models and 3DP. 
With the ongoing improvement of printing techniques and the variety of materials available, 3DP will become an indispensable tool in medical training and surgical planning.
Context-dependent ‘safekeeping’ of foraging tools in New Caledonian crows
Klump, Barbara C.; van der Wal, Jessica E. M.; St Clair, James J. H.; Rutz, Christian
2015-01-01
Several animal species use tools for foraging, such as sticks to extract embedded arthropods and honey, or stones to crack open nuts and eggs. While providing access to nutritious foods, these behaviours may incur significant costs, such as the time and energy spent searching for, manufacturing and transporting tools. These costs can be reduced by re-using tools, keeping them safe when not needed. We experimentally investigated what New Caledonian crows do with their tools between successive prey extractions, and whether they express tool ‘safekeeping’ behaviours more often when the costs (foraging at height), or likelihood (handling of demanding prey), of tool loss are high. Birds generally took care of their tools (84% of 176 prey extractions, nine subjects), either trapping them underfoot (74%) or storing them in holes (26%)—behaviours we also observed in the wild (19 cases, four subjects). Moreover, tool-handling behaviour was context-dependent, with subjects: keeping their tools safe significantly more often when foraging at height; and storing tools significantly more often in holes when extracting more demanding prey (under these conditions, foot-trapping proved challenging). In arboreal environments, safekeeping can prevent costly tool losses, removing a potentially important constraint on the evolution of habitual and complex tool behaviour. PMID:25994674
NASA Astrophysics Data System (ADS)
Sadjadi, Seyed Jafar; Hamidi Hesarsorkh, Aghil; Mohammadi, Mehdi; Bonyadi Naeini, Ali
2015-06-01
Coordination and harmony between different departments of a company can be an important factor in achieving competitive advantage if the company aligns the strategies of those departments. This paper presents an integrated decision model based on recent advances in geometric programming techniques. The demand for a product is modeled as a power function of factors such as the product's price, marketing expenditures, and consumer service expenditures. Furthermore, production cost is modeled as a cubic power function of output. The model is solved using recent advances in convex optimization tools. Finally, the solution procedure is illustrated with a numerical example.
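A minimal numeric sketch of this kind of model, with invented constants (the paper's actual parameters and geometric-programming solution are not reproduced here): demand as a power function of price, marketing, and service expenditures; a cubic production cost; and a coarse grid search over price standing in for the convex-optimization solver.

```python
def demand(p, m, s, k=1000.0, a=1.5, b=0.2, c=0.1):
    """Demand as a power function of price p, marketing m, service s (constants invented)."""
    return k * p**-a * m**b * s**c

def production_cost(q, c1=5.0, c2=0.01, c3=0.0001):
    """Cubic production cost in output q (coefficients invented)."""
    return c1 * q + c2 * q**2 + c3 * q**3

def profit(p, m=10.0, s=5.0):
    q = demand(p, m, s)
    return p * q - production_cost(q) - m - s

# Coarse grid search over price; the paper uses geometric programming instead.
best_p = max((p / 10 for p in range(10, 500)), key=profit)
```

With these made-up constants the profit curve has an interior maximum: very low prices trigger large, costly production volumes, while very high prices choke off demand.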
2000-01-01
for flight test data, and both generic and specialized tools of data filtering, data calibration, modeling, system identification, and simulation... GRAMMATICAL MODEL AND PARSER FOR AIR TRAFFIC CONTROLLER'S COMMANDS... A SPEECH-CONTROLLED INTERACTIVE VIRTUAL ENVIRONMENT FOR SHIP FAMILIARIZATION... MODELING AND SIMULATION IN THE 21ST CENTURY... NEW COTS HARDWARE AND SOFTWARE REDUCE THE COST AND EFFORT IN REPLACING AGING FLIGHT SIMULATORS SUBSYSTEMS
Zanini, Gabriele
2009-01-01
Selecting the best emissions abatement strategy is very difficult due to the complexity of the processes that determine the impact of atmospheric pollutants and to the connection with climate change issues. Atmospheric pollution models can provide policy makers with a tool for assessing the effectiveness of abatement measures and their associated costs. The MINNI integrated model has been developed to link policy and atmospheric science and to assess the costs of the measures. The results have been carefully verified in order to identify uncertainties, and the models are continuously updated to represent the state of the art in atmospheric science. The fine spatial and temporal resolution of the simulations provides a strong basis for assessing impacts on environment and health.
NASA Astrophysics Data System (ADS)
Pervaiz, S.; Anwar, S.; Kannan, S.; Almarfadi, A.
2018-04-01
Ti6Al4V is known as a difficult-to-cut material due to its inherent properties such as high hot hardness, low thermal conductivity, and high chemical reactivity. Nonetheless, Ti6Al4V is widely used in industrial sectors such as aeronautics, energy generation, petrochemicals, and biomedical devices. For the metal cutting community, competent and cost-effective machining of Ti6Al4V is a challenging task. To optimize cost and machining performance for the machining of Ti6Al4V, finite element based cutting simulation can be a very useful tool. The aim of this paper is to develop a finite element machining model for the simulation of the Ti6Al4V machining process. The study incorporates material constitutive models, namely the Power Law (PL) and Johnson-Cook (JC) material models, to mimic the mechanical behaviour of Ti6Al4V. The study investigates cutting temperatures, cutting forces, stresses, and plastic strains with respect to different PL and JC material models with associated parameters. In addition, the numerical study also integrates different cutting tool rake angles in the machining simulations. The simulated results will be beneficial for drawing conclusions to improve the overall machining performance of Ti6Al4V.
McLeod, Melissa; Blakely, Tony; Kvizhinadze, Giorgi; Harris, Ricci
2014-01-01
A critical first step toward incorporating equity into cost-effectiveness analyses is to appropriately model interventions by population subgroups. In this paper we use a standardized treatment intervention to examine the impact of using ethnic-specific (Māori and non-Māori) data in cost-utility analyses for three cancers. We estimate gains in health-adjusted life years (HALYs) for a simple intervention (20% reduction in excess cancer mortality) for lung, female breast, and colon cancers, using Markov modeling. Base models include ethnic-specific cancer incidence with other parameters either turned off or set to non-Māori levels for both groups. Subsequent models add ethnic-specific cancer survival, morbidity, and life expectancy. Costs include intervention and downstream health system costs. For the three cancers, including existing inequalities in background parameters (population mortality and comorbidities) for Māori attributes less value to a year of life saved compared to non-Māori and lowers the relative health gains for Māori. In contrast, ethnic inequalities in cancer parameters have less predictable effects. Despite Māori having higher excess mortality from all three cancers, modeled health gains for Māori were less from the lung cancer intervention than for non-Māori but higher for the breast and colon interventions. Cost-effectiveness modeling is a useful tool in the prioritization of health services. But there are important (and sometimes counterintuitive) implications of including ethnic-specific background and disease parameters. In order to avoid perpetuating existing ethnic inequalities in health, such analyses should be undertaken with care.
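The modeling approach can be illustrated with a deliberately tiny Markov cohort sketch; all rates and the utility weight below are hypothetical, not the paper's calibrated inputs. Yearly survival depends on excess cancer mortality plus background mortality, each year alive contributes a morbidity-weighted utility, and the intervention cuts excess mortality by 20%.

```python
def halys(excess_mort, background_mort, utility, years=10, cohort=1.0):
    """Health-adjusted life years for a cohort over a fixed horizon (annual cycles)."""
    alive, total = cohort, 0.0
    for _ in range(years):
        total += alive * utility  # each year alive earns a utility-weighted year
        alive *= 1.0 - (excess_mort + background_mort)
    return total

def haly_gain(excess_mort, background_mort, utility, reduction=0.20):
    """HALYs gained by reducing excess cancer mortality by `reduction`."""
    return (halys(excess_mort * (1 - reduction), background_mort, utility)
            - halys(excess_mort, background_mort, utility))

# Higher background mortality shrinks the gain from the same relative
# intervention -- the equity effect the paper highlights.
gain_low_bg = haly_gain(0.10, 0.02, 0.85)
gain_high_bg = haly_gain(0.10, 0.05, 0.85)
```

Even in this toy version, raising background mortality lowers the modeled health gain, which is the mechanism by which ethnic-specific background parameters attribute less value to a year of life saved.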
Baldi, Giulia; Martini, Elviyanti; Catharina, Maria; Muslimatun, Siti; Fahmida, Umi; Jahari, Abas Basuni; Hardinsyah; Frega, Romeo; Geniez, Perrine; Grede, Nils; Minarto; Bloem, Martin W; de Pee, Saskia
2013-06-01
The Minimum Cost of a Nutritious Diet (MCNut) is the cost of a theoretical diet satisfying all nutrient requirements of a family at the lowest possible cost, based on availability, price, and nutrient content of local foods. A comparison with household expenditure shows the proportion of households that would be able to afford a nutritious diet. To explore using the Cost of Diet (CoD) tool for policy dialogue on food and nutrition security in Indonesia. From October 2011 to June 2012, market surveys collected data on food commodity availability and pricing in four provinces. Household composition and expenditure data were obtained from secondary data (SUSENAS 2010). Focus group discussions were conducted to better understand food consumption practices. Different types of fortified foods and distribution mechanisms were also modeled. Stark differences were found among the four areas: in Timor Tengah Selatan, only 25% of households could afford to meet the nutrient requirements, whereas in urban Surabaya, 80% could. The prevalence rates of underweight and stunting among children under 5 years of age in the four areas were inversely correlated with the proportion of households that could afford a nutritious diet. The highest reduction in the cost of the child's diet was achieved by modeling provision of fortified blended food through Social Safety Nets. Rice fortification, subsidized or at commercial price, can greatly improve nutrient affordability for households. The CoD analysis is a useful entry point for discussions on constraints on achieving adequate nutrition in different areas and on possible ways to improve nutrition, including the use of special foods and different distribution strategies.
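The MCNut calculation is essentially a diet problem: find the cheapest basket of local foods meeting all nutrient requirements. The toy sketch below brute-forces the search over discrete portions; prices, nutrient contents, and requirements are all invented, and the real CoD tool covers many more foods and nutrients (typically via linear programming).

```python
from itertools import product

FOODS = {  # price per portion, (energy kcal, protein g) per portion -- invented numbers
    "rice":  (0.30, (300, 5)),
    "beans": (0.50, (200, 12)),
    "oil":   (0.20, (250, 0)),
}
REQUIRED = (2000, 50)  # minimum daily energy (kcal) and protein (g)

def cheapest_diet(max_portions=10):
    """Exhaustively search portion combinations for the cheapest adequate basket."""
    best = None
    for counts in product(range(max_portions + 1), repeat=len(FOODS)):
        cost = sum(c * FOODS[f][0] for c, f in zip(counts, FOODS))
        nutrients = tuple(
            sum(c * FOODS[f][1][i] for c, f in zip(counts, FOODS))
            for i in range(len(REQUIRED))
        )
        if all(n >= r for n, r in zip(nutrients, REQUIRED)):
            if best is None or cost < best[0]:
                best = (cost, dict(zip(FOODS, counts)))
    return best

best_cost, best_basket = cheapest_diet()
```

Comparing `best_cost` against household food expenditure is what yields the affordability proportions quoted above; modeling a fortified food amounts to adding a new entry to `FOODS` and re-solving.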
NASA Astrophysics Data System (ADS)
Lopez-Nicolas, Antonio; Pulido-Velazquez, Manuel
2014-05-01
The main challenge of the BLUEPRINT to safeguard Europe's water resources (EC, 2012) is to guarantee that enough good quality water is available for people's needs, the economy and the environment. In this sense, economic policy instruments such as water pricing policies and water markets can be applied to enhance efficient use of water. This paper presents a method based on hydro-economic tools to assess the effect of economic instruments on water resource systems. Hydro-economic models allow integrated analysis of water supply, demand and infrastructure operation at the river basin scale, by simultaneously combining engineering, hydrologic and economic aspects of water resources management. The method makes use of the simulation and optimization hydroeconomic tools SIMGAMS and OPTIGAMS. The simulation tool SIMGAMS allocates water resources among the users according to priorities and operating rules, and evaluates economic scarcity costs of the system by using economic demand functions. The model's objective function is designed so that the system aims to meet the operational targets (ranked according to priorities) in each month while following the system operating rules. The optimization tool OPTIGAMS allocates water resources based on an economic efficiency criterion: maximize net benefits, or alternatively, minimize the total water scarcity and operating cost of water use. SIMGAMS allows simulation of incentive water pricing policies based on marginal resource opportunity costs (MROC; Pulido-Velazquez et al., 2013). Storage-dependent step pricing functions are derived from the time series of MROC values at a certain reservoir in the system. These water pricing policies are defined based on water availability in the system (scarcity pricing), so that when water storage is high, the MROC is low, while low storage (drought periods) is associated with high MROC and therefore high prices.
We also illustrate the use of OPTIGAMS to simulate the effect of ideal water markets by economic optimization, without considering the potential effect of transaction costs. These methods and tools have been applied to the Jucar River basin (Spain). The results show the potential of economic instruments in setting incentives for a more efficient management of water resources systems. Acknowledgments: The study has been partially supported by the European Community 7th Framework Project (GENESIS project, n. 226536), SAWARES (Plan Nacional I+D+i 2008-2011, CGL2009-13238-C02-01 and C02-02), SCARCE (Consolider-Ingenio 2010 CSD2009-00065) of the Spanish Ministry of Economy and Competitiveness; and EC 7th Framework Project ENHANCE (n. 308438) Reference: Pulido-Velazquez, M., Alvarez-Mendiola, E., and Andreu, J., 2013. Design of Efficient Water Pricing Policies Integrating Basinwide Resource Opportunity Costs. J. Water Resour. Plann. Manage., 139(5): 583-592.
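A storage-dependent step pricing function of the kind derived from MROC time series might look like the following sketch; the thresholds and prices are hypothetical placeholders, not values from the Jucar case study. Low storage maps to a high scarcity price, high storage to a low one.

```python
# (storage fraction at or above this threshold, price per m3) -- invented values
PRICE_STEPS = [
    (0.7, 0.02),   # reservoirs comfortably full: low marginal opportunity cost
    (0.4, 0.10),   # moderate storage
    (0.0, 0.35),   # drought conditions: high scarcity price
]

def scarcity_price(storage_fraction):
    """Map current reservoir storage (0..1) to a step scarcity price."""
    if not 0.0 <= storage_fraction <= 1.0:
        raise ValueError("storage fraction must be in [0, 1]")
    for threshold, price in PRICE_STEPS:
        if storage_fraction >= threshold:
            return price
```

Because the steps are keyed to storage rather than time, the same table implements scarcity pricing automatically: prices rise as a drought deepens and fall as reservoirs refill.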
Development of a Multivariable Parametric Cost Analysis for Space-Based Telescopes
NASA Technical Reports Server (NTRS)
Dollinger, Courtnay
2011-01-01
Over the past 400 years, the telescope has proven to be a valuable tool in helping humankind understand the Universe around us. The images and data produced by telescopes have revolutionized planetary, solar, stellar, and galactic astronomy and have inspired a wide range of people, from the child who dreams about the images seen on NASA websites to the most highly trained scientist. Like all scientific endeavors, astronomical research must operate within the constraints imposed by budget limitations. Hence the importance of understanding cost: to find the balance between the dreams of scientists and the restrictions of the available budget. By logically analyzing the data we have collected for over thirty different telescopes from more than 200 different sources, statistical methods, such as regression and residual analysis, can be used to determine what drives the cost of telescopes and to build a cost model for space-based telescopes. Previous cost models have focused their attention on ground-based telescopes due to limited data for space telescopes and the larger number and longer history of ground-based astronomy. Due to the increased availability of cost data from recent space-telescope construction, we have been able to produce and begin testing a comprehensive cost model for space telescopes, with guidance from the cost models for ground-based telescopes. By separating the variables that affect cost, such as diameter, mass, wavelength, density, data rate, and number of instruments, we advance the goal of better understanding the cost drivers of space telescopes. The use of sophisticated mathematical techniques to improve the accuracy of cost models has the potential to help society make informed decisions about proposed scientific projects. An improved knowledge of cost will allow scientists to get the maximum value returned for the money given and create harmony between the visions of scientists and the reality of a budget.
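Parametric cost models of this kind are commonly fit as power laws in log space. The sketch below shows the core statistical step with plain OLS; the fitting routine is generic, and the data in the test are synthetic, not the study's telescope data.

```python
import math

def fit_power_law(xs, ys):
    """Fit cost = a * x**b by least squares on log-transformed data."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    # slope of the log-log regression line is the power-law exponent b
    b = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
    a = math.exp(my - b * mx)  # intercept back-transformed to the scale factor
    return a, b
```

A multivariable version simply extends the regression to several log-transformed predictors (diameter, mass, wavelength, and so on), with residual plots used to judge which drivers matter.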
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pavlak, Gregory S.; Henze, Gregor P.; Hirsch, Adam I.
This paper demonstrates an energy signal tool to assess the system-level and whole-building energy use of an office building in downtown Denver, Colorado. The energy signal tool uses a traffic light visualization to alert a building operator to energy use which is substantially different from expected. The tool selects which light to display for a given energy end-use by comparing measured energy use to expected energy use, accounting for uncertainty. A red light is only displayed when a fault is likely enough, and abnormal operation costly enough, that taking action will yield the lowest cost result. While the theoretical advances and tool development were reported previously, it has only been tested using a basic building model and has not, until now, been experimentally verified. Expected energy use for the field demonstration is provided by a compact reduced-order representation of the Alliance Center, generated from a detailed DOE-2.2 energy model. Actual building energy consumption data is taken from the summer of 2014 for the office building immediately after a significant renovation project. The purpose of this paper is to demonstrate a first look at the building following its major renovation compared to the design intent. The tool indicated strong under-consumption in lighting and plug loads and strong over-consumption in HVAC energy consumption, which prompted several focused actions for follow-up investigation. In addition, this paper illustrates the application of Bayesian inference to the estimation of posterior parameter probability distributions to measured data. Practical discussion of the application is provided, along with additional findings from further investigating the significant difference between expected and actual energy consumption.
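The traffic-light decision itself can be reduced to a small rule. This sketch uses simple z-score thresholds as a stand-in for the tool's actual cost-weighted, uncertainty-aware decision analysis; the 1- and 2-sigma cutoffs below are hypothetical.

```python
def energy_signal(measured, expected_mean, expected_std):
    """Return a traffic-light flag comparing measured to expected energy use."""
    z = (measured - expected_mean) / expected_std
    if abs(z) <= 1.0:
        return "green"    # within normal variation
    if abs(z) <= 2.0:
        return "yellow"   # worth watching
    return "red"          # deviation large enough that investigation pays off
```

The real tool sharpens this by weighing the probability of a fault against the cost of investigating it, so a red light means action is the lowest-cost choice, not merely that the z-score is large.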
CATO: a CAD tool for intelligent design of optical networks and interconnects
NASA Astrophysics Data System (ADS)
Chlamtac, Imrich; Ciesielski, Maciej; Fumagalli, Andrea F.; Ruszczyk, Chester; Wedzinga, Gosse
1997-10-01
Increasing communication speed requirements have created a great interest in very high speed optical and all-optical networks and interconnects. The design of these optical systems is a highly complex task, requiring the simultaneous optimization of various parts of the system, ranging from optical components' characteristics to access protocol techniques. Currently there are no computer aided design (CAD) tools on the market to support the interrelated design of all parts of optical communication systems, thus the designer has to rely on costly and time-consuming testbed evaluations. The objective of the CATO (CAD tool for optical networks and interconnects) project is to develop a prototype of an intelligent CAD tool for the specification, design, simulation and optimization of optical communication networks. CATO allows the user to build an abstract, possibly incomplete, model of the system, and determine its expected performance. Based on design constraints provided by the user, CATO will automatically complete an optimum design, using mathematical programming techniques, intelligent search methods and artificial intelligence (AI). Initial design and testing of a CATO prototype (CATO-1) has been completed recently. The objective was to prove the feasibility of combining AI techniques, simulation techniques, an optical device library and a graphical user interface into a flexible CAD tool for obtaining optimal communication network designs in terms of system cost and performance. CATO-1 is an experimental tool for designing packet-switching wavelength division multiplexing all-optical communication systems using a LAN/MAN ring topology as the underlying network. The two specific AI algorithms incorporated are simulated annealing and a genetic algorithm. CATO-1 finds the optimal number of transceivers for each network node, using an objective function that includes the cost of the devices and the overall system performance.
Clevin, Lotte; Grantcharov, Teodor P
2008-01-01
Laparoscopic box model trainers have been used in training curricula for a long time; however, data on their impact on skills acquisition are still limited. Our aim was to validate a low cost box model trainer as a tool for the training of skills relevant to laparoscopic surgery. Randomised, controlled trial (Canadian Task Force Classification I). University Hospital. Sixteen gynaecologic residents with limited laparoscopic experience were randomised to a group that received a structured box model training curriculum, and a control group. Performance before and after the training was assessed in a virtual reality laparoscopic trainer (LapSim) and was based on objective parameters registered by the computer system (time, error, and economy of motion scores). Group A showed significantly greater improvement in all performance parameters compared with the control group: economy of movement (p=0.001), time (p=0.001) and tissue damage (p=0.036), confirming the positive impact of the box-trainer curriculum on laparoscopic skills acquisition. Structured laparoscopic skill training on a low cost box model trainer improves performance as assessed using the VR system. Trainees who used the box model trainer showed significant improvement compared to the control group. Box model trainers are valid tools for laparoscopic skills training and should be implemented in comprehensive training curricula in gynaecology.
NASA Astrophysics Data System (ADS)
Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.
2010-12-01
Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are “archivable”, transferable and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others that are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting Model (WRF), then ran several small experiments to assess the feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.
Integrated Campaign Probabilistic Cost, Schedule, Performance, and Value for Program Office Support
NASA Technical Reports Server (NTRS)
Cornelius, David; Sasamoto, Washito; Daugherty, Kevin; Deacon, Shaun
2012-01-01
This paper describes an integrated assessment tool developed at NASA Langley Research Center that incorporates probabilistic analysis of life cycle cost, schedule, launch performance, on-orbit performance, and value across a series of planned space-based missions, or campaign. Originally designed as an aid in planning the execution of missions to accomplish the National Research Council 2007 Earth Science Decadal Survey, it utilizes Monte Carlo simulation of a series of space missions for assessment of resource requirements and expected return on investment. Interactions between simulated missions are incorporated, such as competition for launch site manifest, to capture unexpected and non-linear system behaviors. A novel value model is utilized to provide an assessment of the probabilistic return on investment. A demonstration case is discussed to illustrate the tool utility.
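The Monte Carlo mechanics can be sketched compactly. The triangular cost distributions and mission parameters below are invented stand-ins for the tool's calibrated inputs, and the mission interactions the paper describes (e.g. launch-manifest competition) are omitted from this sketch.

```python
import random

def simulate_campaign(missions, n_trials=10_000, seed=42):
    """Draw total campaign cost per trial; each mission is (low, mode, high)."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(low, high, mode) for low, mode, high in missions)
        for _ in range(n_trials)
    )
    return {
        "mean": sum(totals) / n_trials,
        "p70": totals[int(0.7 * n_trials)],  # 70th-percentile cost estimate
    }
```

Reporting a high percentile alongside the mean is what turns the simulation into a resource-requirements statement: the p70 figure is a budget with a stated confidence of sufficiency.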
Crane, Glenis J; Kymes, Steven M; Hiller, Janet E; Casson, Robert; Martin, Adam; Karnon, Jonathan D
2013-11-01
Decision-analytic models are routinely used as a framework for cost-effectiveness analyses of health care services and technologies; however, these models mostly ignore resource constraints. In this study, we use a discrete-event simulation model to inform a cost-effectiveness analysis of alternative options for the organization and delivery of clinical services in the ophthalmology department of a public hospital. The model is novel, given that it represents both disease outcomes and resource constraints in a routine clinical setting. A 5-year discrete-event simulation model representing glaucoma patient services at the Royal Adelaide Hospital (RAH) was implemented and calibrated to patient-level data. The data were sourced from routinely collected waiting and appointment lists, patient record data, and the published literature. Patient-level costs and quality-adjusted life years were estimated for a range of alternative scenarios, including combinations of alternate follow-up times, booking cycles, and treatment pathways. The model shows that a) extending booking cycle length from 4 to 6 months, b) extending follow-up visit times by 2 to 3 months, and c) using laser in preference to medication are more cost-effective than current practice at the RAH eye clinic. The current simulation model provides a useful tool for informing improvements in the organization and delivery of glaucoma services at a local level (e.g., within a hospital), on the basis of expected effects on costs and health outcomes while accounting for current capacity constraints. Our model may be adapted to represent glaucoma services at other hospitals, whereas the general modeling approach could be applied to many other clinical service areas.
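A single-server toy version of the discrete-event idea shows how capacity constraints generate waiting; the arrival and service times are invented, and the actual model additionally represents booking cycles, follow-up intervals, and multiple resources.

```python
def mean_wait(arrival_gap, service_time, n_patients):
    """Patients arrive every `arrival_gap` minutes; one clinician serves in order."""
    waits, server_free_at = [], 0.0
    for i in range(n_patients):
        arrival = i * arrival_gap
        start = max(arrival, server_free_at)   # wait if the clinician is busy
        waits.append(start - arrival)
        server_free_at = start + service_time
    return sum(waits) / n_patients
```

When service time exceeds the arrival gap, waits grow linearly with queue position, which is exactly the kind of constraint conventional decision-analytic models ignore and this simulation captures.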
Advanced Engineering Environments: Implications for Aerospace Manufacturing
NASA Technical Reports Server (NTRS)
Thomas, D.
2001-01-01
There are significant challenges facing today's aerospace industry. Global competition, more complex products, geographically-distributed design teams, demands for lower cost, higher reliability and safer vehicles, and the need to incorporate the latest technologies quicker all face the developer of aerospace systems. New information technologies offer promising opportunities to develop advanced engineering environments (AEEs) to meet these challenges. Significant advances in the state-of-the-art of aerospace engineering practice are envisioned in the areas of engineering design and analytical tools, cost and risk tools, collaborative engineering, and high-fidelity simulations early in the development cycle. These advances will enable modeling and simulation of manufacturing methods, which will in turn allow manufacturing considerations to be included much earlier in the system development cycle. Significant cost savings, increased quality, and decreased manufacturing cycle time are expected to result. This paper will give an overview of the NASA's Intelligent Synthesis Environment, the agency initiative to develop an AEE, with a focus on the anticipated benefits in aerospace manufacturing.
Generalizability of cost-utility analyses across countries and settings.
Ginsberg, Gary M
2013-12-01
All societies have limited resources, so decisions have to be made about which public health interventions should be provided. A major tool used for prioritisation is cost-utility analysis (CUA), where the outcomes are measured in terms of Disability Adjusted Life Years (DALYs) prevented. Collecting data and building models to calculate the ratio of net costs (i.e., intervention costs less treatment costs averted due to decreases in morbidity and mortality) to outcomes (CUR) is complex and time-consuming. Therefore, there is a great appeal in using CUA calculations that have already been published in other countries. This paper points out the many limitations and inaccuracies caused by generalizing results from CUAs across different countries. However, if time constraints are pressing then first-order estimates of results could be presented after adjustments for the major drivers of the CUR, such as incidence rates, intervention costs and averted treatment costs. Copyright © 2013 Elsevier Ltd. All rights reserved.
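A first-order transfer adjustment of the kind suggested could be sketched as below; the two scaling ratios are hypothetical placeholders for whatever local-to-source adjustments (price levels, incidence) an analyst can justify, not a method prescribed by the paper.

```python
def adjusted_cur(net_cost, dalys_averted, cost_ratio, effect_ratio):
    """Rescale a published cost-utility ratio for a new setting:
    costs scaled by local price levels, outcomes by local epidemiology
    (e.g. relative incidence). Both ratios are target/source multipliers."""
    return (net_cost * cost_ratio) / (dalys_averted * effect_ratio)
```

For example, halving intervention costs while doubling local incidence cuts the transferred CUR fourfold, which illustrates why unadjusted transfers across countries can be badly misleading.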
High Thermal Conductivity Polymer Composites for Low Cost Heat Exchangers
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2017-08-01
This factsheet describes a project that identified and evaluated commercially available and state-of-the-art polymer-based material options for manufacturing industrial and commercial non-metallic heat exchangers. A heat exchanger concept was also developed and its performance evaluated with heat transfer modeling tools.
Emerging from the bottleneck: benefits of the comparative approach to modern neuroscience.
Brenowitz, Eliot A; Zakon, Harold H
2015-05-01
Neuroscience has historically exploited a wide diversity of animal taxa. Recently, however, research has focused increasingly on a few model species. This trend has accelerated with the genetic revolution, as genomic sequences and genetic tools became available for a few species, which formed a bottleneck. This coalescence on a small set of model species comes with several costs that are often not considered, especially in the current drive to use mice explicitly as models for human diseases. Comparative studies of strategically chosen non-model species can complement model species research and yield more rigorous studies. As genetic sequences and tools become available for many more species, we are poised to emerge from the bottleneck and once again exploit the rich biological diversity offered by comparative studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
Innovations in healthcare finance: lessons from the 401(k) model.
Myers, Chris; Lineen, Jason
2008-10-01
*Escalating health benefit expenses are leading employers to shift more of the costs to their employees. *Global financial services companies and startup entrepreneurs are competing to develop private-sector solutions to capitalize on the ailing and misaligned healthcare financing system. *Emerging innovations are targeting insured individuals who are facing increasing responsibility for first-dollar coverage. *Healthcare providers should view patients as individual "price-sensitive payers" as new tools enable them to shop around for services based on cost and quality.
An Investigation of Human Performance Model Validation
2005-03-01
of design decisions, what the costs and benefits are for each of the stages of analysis options. On the 'benefits' side, the manager needs to know ... confidence. But we also want to know that we are not expending any more effort (and other costs) than necessary to ensure that the right decision is ... supported at each stage. Ultimately, we want to enable SBA managers to have confidence that they are selecting the right HPM tools and using them correctly in
Bradshaw, Jonathan L; Luthy, Richard G
2017-10-17
Infrastructure systems that use stormwater and recycled water to augment groundwater recharge through spreading basins represent cost-effective opportunities to diversify urban water supplies. However, technical questions remain about how these types of managed aquifer recharge systems should be designed; furthermore, existing planning tools are insufficient for performing robust design comparisons. Addressing this need, we present a model for identifying the best-case design and operation schedule for systems that deliver recycled water to underutilized stormwater spreading basins. Resulting systems are optimal with respect to life cycle costs and water deliveries. Through a case study of Los Angeles, California, we illustrate how delivering recycled water to spreading basins could be optimally implemented. Results illustrate trade-offs between centralized and decentralized configurations. For example, while a centralized Hyperion system could deliver more recycled water to the Hansen Spreading Grounds, this system incurs approximately twice the conveyance cost of a decentralized Tillman system (mean of 44% vs 22% of unit life cycle costs). Compared to existing methods, our model allows for more comprehensive and precise analyses of cost, water volume, and energy trade-offs among different design scenarios. This model can inform decisions about spreading basin operation policies and the development of new water supplies.
Hsu, Jia-Lien; Hung, Ping-Cheng; Lin, Hung-Yen; Hsieh, Chung-Ho
2015-04-01
Breast cancer is one of the most common causes of cancer mortality. Early detection through mammography screening could significantly reduce mortality from breast cancer. However, most screening methods consume large amounts of resources. We propose a computational model, based solely on personal health information, for breast cancer risk assessment. Our model can serve as a pre-screening program in low-cost settings. In our study, the data set, consisting of 3976 records, was collected from Taipei City Hospital between January 1 and December 31, 2008. Based on this dataset, we first apply sampling techniques and a dimension-reduction method to preprocess the data. Then, we construct various kinds of classifiers (including basic classifiers, ensemble methods, and cost-sensitive methods) to predict risk. The cost-sensitive method with a random forest classifier achieves a recall (sensitivity) of 100 %. At a recall of 100 %, the precision (positive predictive value, PPV) and specificity of the cost-sensitive method with a random forest classifier were 2.9 % and 14.87 %, respectively. In our study, we build a breast cancer risk assessment model using data mining techniques. Our model has the potential to serve as an assisting tool in breast cancer screening.
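The abstract above does not specify how its cost-sensitive methods were implemented; one common mechanism, sketched here with entirely hypothetical risk scores and costs, is decision-threshold shifting: penalizing false negatives (missed cancers) more heavily than false positives lowers the probability threshold for flagging a case, trading precision for recall.

```python
# Illustrative sketch (not the paper's actual pipeline): cost-sensitive
# classification via threshold shifting. With false-positive cost c_fp and
# false-negative cost c_fn, the cost-minimizing probability threshold is
# t = c_fp / (c_fp + c_fn).

def cost_sensitive_threshold(c_fp, c_fn):
    """Probability threshold that minimizes expected misclassification cost."""
    return c_fp / (c_fp + c_fn)

def evaluate(probs, labels, threshold):
    """Recall and precision when predicting positive for p >= threshold."""
    preds = [1 if p >= threshold else 0 for p in probs]
    tp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 1)
    fn = sum(1 for p, y in zip(preds, labels) if p == 0 and y == 1)
    fp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 0)
    recall = tp / (tp + fn) if tp + fn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    return recall, precision

# Hypothetical risk scores from any classifier (e.g., a random forest).
probs  = [0.05, 0.20, 0.35, 0.60, 0.80, 0.15, 0.45, 0.90]
labels = [0,    0,    1,    1,    1,    0,    0,    1]

# Penalizing a missed cancer 9x more than a false alarm lowers the threshold.
t = cost_sensitive_threshold(c_fp=1, c_fn=9)  # threshold = 0.1
recall, precision = evaluate(probs, labels, t)
```

On these made-up scores, the low threshold yields perfect recall at the expense of precision, mirroring the 100 % sensitivity / low PPV pattern reported in the study.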
Self-monitoring induced savings on type 2 diabetes patients' travel and healthcare costs.
Leminen, Aapeli; Tykkyläinen, Markku; Laatikainen, Tiina
2018-07-01
Type 2 diabetes (T2DM) is a major health concern in most regions. In addition to direct healthcare costs, diabetes causes many indirect costs that are often ignored in economic analyses. Patients' travel and time costs associated with the follow-up of T2DM patients have not previously been calculated systematically over an entire healthcare district. The aim of the study was to develop a georeferenced cost model that could be used to measure healthcare accessibility and patients' travel and time costs in a sparsely populated healthcare district in Finland. Additionally, the model was used to test whether savings in the total costs of follow-up of T2DM patients can be achieved by increasing self-monitoring and implementing electronic feedback practices between healthcare staff and patients. Patient data for this study were obtained from the regional electronic patient database Mediatri. A georeferenced cost model of linear equations was developed with ESRI ArcGIS 10.3 software and its ModelBuilder tool. The model uses the OD Cost Matrix method of network analysis to calculate optimal routes for primary-care follow-up visits. In the study region of North Karelia, the average annual total costs of T2DM follow-up screening of HbA1c (9070 patients) conforming to the national clinical guidelines are 280 EUR/297 USD per patient. Combined travel and time costs are 21 percent of the total costs. By implementing self-monitoring for half of the follow-up visits, still within the guidelines, the average annual total costs of HbA1c screening could be reduced by 57 percent, from 280 EUR/297 USD to 121 EUR/129 USD per patient. Travel costs related to HbA1c screening of T2DM patients constitute a substantial cost item, and considering them in healthcare planning would enable the societal cost-efficiency of T2DM care to be improved. Even more savings in both travel costs and healthcare costs of T2DM can be achieved by making greater use of self-monitoring and electronic feedback practices.
Additionally, the cost model composed in the study can be developed and expanded further to address other healthcare processes and patient groups. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
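The structure of such a georeferenced cost model can be sketched in a few lines. All parameter values below are hypothetical placeholders, not the study's actual figures: annual follow-up cost per patient is visits multiplied by the sum of healthcare, travel, and time costs, and self-monitoring replaces some clinic visits with cheap electronic contacts.

```python
# Hypothetical sketch of the cost model's structure (illustrative numbers
# only). Annual cost = visits x (healthcare cost + travel cost + time cost).

def annual_cost(n_visits, healthcare_eur, km_round_trip, eur_per_km,
                hours_per_visit, eur_per_hour):
    travel = km_round_trip * eur_per_km        # distance-based travel cost
    time = hours_per_visit * eur_per_hour      # patient time cost
    return n_visits * (healthcare_eur + travel + time)

# Baseline: 4 clinic visits per year for a patient 20 km from the clinic.
baseline = annual_cost(4, healthcare_eur=55, km_round_trip=40,
                       eur_per_km=0.25, hours_per_visit=1.5, eur_per_hour=10)

# Self-monitoring: half the visits replaced by electronic feedback, which
# incurs a small processing cost but no travel or time cost.
self_monitoring = (annual_cost(2, 55, 40, 0.25, 1.5, 10)
                   + 2 * 8)  # 2 electronic contacts at a notional 8 EUR each

savings_pct = 100 * (baseline - self_monitoring) / baseline
```

In the real model the travel leg is computed per patient over the road network (OD Cost Matrix) rather than from a fixed distance, which is what makes the district-wide aggregation georeferenced.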
CRADA Final Report: Weld Predictor App
DOE Office of Scientific and Technical Information (OSTI.GOV)
Billings, Jay Jay
Welding is an important manufacturing process used in a broad range of industries and market sectors, including automotive, aerospace, heavy manufacturing, medical, and defense. During welded fabrication, high localized heat input and subsequent rapid cooling result in the creation of residual stresses and distortion. These residual stresses can significantly affect the fatigue resistance, cracking behavior, and load-carrying capacity of welded structures during service. Further, additional fitting and tacking time is often required to fit distorted subassemblies together, resulting in non-value-added cost. Using trial-and-error methods to determine which welding parameters, welding sequences, and fixture designs will most effectively reduce distortion is a time-consuming and expensive process. For complex structures with many welds, this approach can take several months. For this reason, efficient and accurate methods of mitigating distortion are in demand across all industries where welding is used. Analytical and computational methods and commercial software tools have been developed to predict welding-induced residual stresses and distortion. Welding process parameters, fixtures, and tooling can be optimized to reduce HAZ softening and minimize weld residual stress and distortion, improving performance and reducing design, fabrication, and testing costs. However, weld modeling technology tools are currently accessible only to engineers and designers with a background in finite element analysis (FEA) who work with large manufacturers, research institutes, and universities with access to high-performance computing (HPC) resources. Small and medium enterprises (SMEs) in the US do not typically have the human and computational resources needed to adopt and utilize weld modeling technology.
To allow engineers with no background in FEA, and SMEs, to gain access to this important design tool, EWI and the Ohio Supercomputer Center (OSC) developed the online weld application software tool “WeldPredictor” ( https://eweldpredictor.ewi.org ). About 1400 users have tested this application. This project marked the beginning of development of the next version of WeldPredictor, which addresses many limitations of the original: it supports 3D models, allows more material hardening laws, models material phase transformation, and uses open-source finite element solvers to solve problems quickly (as opposed to expensive commercial tools).
ISRU System Model Tool: From Excavation to Oxygen Production
NASA Technical Reports Server (NTRS)
Santiago-Maldonado, Edgardo; Linne, Diane L.
2007-01-01
In the late 80's, conceptual designs for an in situ oxygen production plant were documented in a study by Eagle Engineering [1]. In the "Summary of Findings" of this study, it is clearly pointed out that: "reported process mass and power estimates lack a consistent basis to allow comparison." The study goes on to say: "A study to produce a set of process mass, power, and volume requirements on a consistent basis is recommended." Today, approximately twenty years later, as humans plan to return to the moon and venture beyond, the need for flexible, up-to-date models of the oxygen extraction and production process has become even clearer. Multiple processes for the production of oxygen from lunar regolith are being investigated by NASA, academia, and industry. Three processes that have shown technical merit are molten regolith electrolysis, hydrogen reduction, and carbothermal reduction. These processes have been selected by NASA as the basis for the development of the ISRU System Model Tool (ISMT). In working to develop up-to-date system models for these processes, NASA hopes to accomplish the following: (1) help in the evaluation process to select the most cost-effective and efficient process for further prototype development, (2) identify key parameters, (3) optimize the excavation and oxygen production processes, and (4) provide estimates on energy and power requirements, mass and volume of the system, oxygen production rate, mass of regolith required, mass of consumables, and other important parameters. Also, as confidence and high fidelity are achieved with each component's model, new techniques and processes can be introduced and analyzed at a fraction of the cost of traditional hardware development and test approaches. A first-generation ISRU System Model Tool has been used to provide inputs to the Lunar Architecture Team studies.
NASA Technical Reports Server (NTRS)
McElroy, Mack; de Carvalho, Nelson; Estes, Ashley; Lin, Shih-yung
2017-01-01
Use of lightweight composite materials in space and aircraft structure designs is often challenging due to high costs associated with structural certification. Of primary concern in the use of composite structures is durability and damage tolerance. This concern is due to the inherent susceptibility of composite materials to both fabrication- and service-induced flaws. Due to a lack of generally accepted industry analysis tools applicable to composites damage simulation, a certification procedure relies almost entirely on testing. It is this reliance on testing, especially compared with structures made of legacy metallic materials, for which damage simulation tools are available, that can drive costs for using composite materials in aerospace structures. The observation that use of composites can be expensive due to testing requirements is not new, and as such, research on analysis tools for simulating damage in composite structures has been occurring for several decades. A convenient approach many researchers and model developers in this area have taken is to select a specific problem relevant to aerospace structural certification and develop a model that is accurate within that scope. Some examples are open-hole tension tests, compression-after-impact tests, low-velocity impact, damage tolerance of an embedded flaw, and fatigue crack growth. Based on the premise that running analyses is cheaper than running tests, many researchers in this area are motivated by the prospect that, if generally applicable and reliable damage simulation tools were available, the dependence on certification testing could be lessened, thereby reducing overall design cost. It is generally accepted that simulation tools applied in this manner would still need to be thoroughly validated and that composite testing will never be completely replaced by analysis.
Research and development is currently occurring at NASA to create numerical damage simulation tools applicable to damage in composites. The Advanced Composites Project (ACP) at NASA Langley has supported the development of composites damage simulation tools in a consortium of aerospace companies with a goal of reducing the certification time of a commercial aircraft by 30%. And while the scope of ACP does not include spacecraft, much of the methodology and simulation capabilities can apply to spacecraft certification in the Space Launch System and Orion programs as well. Some specific applications of composite damage simulation models in a certification program are (1) evaluation of damage during service when maintenance may be difficult or impossible, (2) a tool for early design iterations, (3) gaining insight into a particular damage process and applying this insight towards a test coupon or structural design, and (4) analysis of damage scenarios that are difficult or impossible to recreate in a test. As analysis capabilities improve, these applications and more will become realized resulting in a reduction in cost for use of composites in aerospace vehicles. NASA is engaged in this process from both research and application perspectives. In addition to the background information discussed previously, this presentation covers a look at recent research at NASA in this area and some current/potential applications in the Orion program.
NASA Technical Reports Server (NTRS)
Unal, Resit; Morris, W. Douglas; White, Nancy H.; Lepsch, Roger A.; Brown, Richard W.
2000-01-01
This paper describes the development of parametric models for estimating operational reliability and maintainability (R&M) characteristics for reusable vehicle concepts, based on vehicle size and technology support level. An R&M analysis tool (RMAT) and response surface methods are utilized to build parametric approximation models for rapidly estimating operational R&M characteristics such as mission completion reliability. These models, which approximate RMAT, can then be utilized for fast analysis of operational requirements, for life-cycle cost estimating, and for multidisciplinary design optimization.
Modeling the trade-off between diet costs and methane emissions: A goal programming approach.
Moraes, L E; Fadel, J G; Castillo, A R; Casper, D P; Tricarico, J M; Kebreab, E
2015-08-01
Enteric methane is a major greenhouse gas emission from livestock production systems worldwide. Dietary manipulation may be an effective emission-reduction tool; however, the associated costs may preclude its use as a mitigation strategy. Several studies have identified dietary manipulation strategies for the mitigation of emissions, but studies examining the costs of reducing methane by manipulating diets are scarce. Furthermore, the trade-off between increase in dietary costs and reduction in methane emissions has only been determined for a limited number of production scenarios. The objective of this study was to develop an optimization framework for the joint minimization of dietary costs and methane emissions based on the identification of a set of feasible solutions for various levels of trade-off between emissions and costs. Such a set of solutions was created by the specification of a systematic grid of goal programming weights, enabling the decision maker to choose the solution that achieves the desired trade-off level. Moreover, the model enables the calculation of emission-mitigation costs by imputing a trading value for methane emissions. Imputed emission costs can be used in emission-unit trading schemes, such as cap-and-trade policy designs. An application of the model using data from lactating cows from dairies in the California Central Valley is presented to illustrate the use of model-generated results in the identification of optimal diets when reducing emissions. The optimization framework is flexible and can be adapted to jointly minimize diet costs and other potential environmental impacts (e.g., nitrogen excretion). It is also flexible in that dietary costs, feed nutrient composition, and animal nutrient requirements can be altered to accommodate various production systems. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
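The weight-grid idea in the abstract above can be illustrated with a toy version: sweep a goal-programming weight w from 0 to 1 and, at each w, pick the option minimizing the weighted sum of normalized cost and normalized methane. The paper's model is a full diet-formulation optimization; the four candidate "diets" and their numbers here are hypothetical.

```python
# Minimal sketch of a goal-programming weight grid (hypothetical diets).
# Sweeping w from pure cost minimization (w=1) to pure methane minimization
# (w=0) traces out the cost-emissions trade-off frontier.

# (cost in $/cow/day, methane in g/cow/day) for four hypothetical diets
diets = {
    "A": (5.0, 420.0),
    "B": (5.6, 390.0),
    "C": (6.4, 360.0),
    "D": (7.5, 345.0),
}

def best_diet(w, diets):
    """Diet minimizing w*cost + (1-w)*methane, both min-max normalized."""
    costs = [c for c, _ in diets.values()]
    ch4s = [m for _, m in diets.values()]
    def norm(x, lo, hi):
        return (x - lo) / (hi - lo)
    def objective(item):
        _, (c, m) = item
        return (w * norm(c, min(costs), max(costs))
                + (1 - w) * norm(m, min(ch4s), max(ch4s)))
    return min(diets.items(), key=objective)[0]

# Systematic grid of weights, as in the paper's framework.
frontier = {w / 10: best_diet(w / 10, diets) for w in range(11)}
```

Each entry of `frontier` is one feasible trade-off solution; the decision maker then selects the weight, and hence the diet, matching the desired balance between cost and emissions.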
Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)
2002-01-01
Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.
Upstream solutions to coral reef conservation: The payoffs of smart and cooperative decision-making.
Oleson, Kirsten L L; Falinski, Kim A; Lecky, Joey; Rowe, Clara; Kappel, Carrie V; Selkoe, Kimberly A; White, Crow
2017-04-15
Land-based source pollutants (LBSP) actively threaten coral reef ecosystems globally. To achieve the greatest conservation outcome at the lowest cost, managers could benefit from appropriate tools that evaluate the benefits (in terms of LBSP reduction) and costs of implementing alternative land management strategies. Here we use a spatially explicit predictive model (InVEST-SDR) that quantifies change in sediment reaching the coast for evaluating the costs and benefits of alternative threat-abatement scenarios. We specifically use the model to examine trade-offs among possible agricultural road repair management actions (water bars to divert runoff and gravel to protect the road surface) across the landscape in West Maui, Hawaii, USA. We investigated changes in sediment delivery to coasts and costs incurred from management decision-making that is (1) cooperative or independent among landowners, and focused on (2) minimizing costs, reducing sediment, or both. The results illuminate which management scenarios most effectively minimize sediment while also minimizing the cost of mitigation efforts. We find targeting specific "hotspots" within all individual parcels is more cost-effective than targeting all road segments. The best outcomes are achieved when landowners cooperate and target cost-effective road repairs; however, a cooperative strategy can be counter-productive in some instances when cost-effectiveness is ignored. Simple models, such as the one developed here, have the potential to help managers make better choices about how to use limited resources. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Burgess, James F; Shwartz, Michael; Stolzmann, Kelly; Sullivan, Jennifer L
2018-05-18
To examine the relationship between cost and quality in Veterans Health Administration (VA) nursing homes (called Community Living Centers, CLCs) using longitudinal data. One hundred and thirty CLCs over 13 quarters (from FY2009 to FY2012) were studied. Costs, resident days, and resident severity (RUGs score) were obtained from the VA Managerial Cost Accounting System. Clinical quality measures were obtained from the Minimum Data Set, and resident-centered care (RCC) was measured using the Artifacts of Culture Change Tool. We used a generalized estimating equation model with facilities included as fixed effects to examine the relationship between total cost and quality after controlling for resident days and severity. The model included linear and squared terms for all independent variables and interactions with resident days. With the exception of RCC, all other variables had a statistically significant relationship with total costs. For most poorer performing smaller facilities (lower size quartile), improvements in quality were associated with higher costs. For most larger facilities, improvements in quality were associated with lower costs. The relationship between cost and quality depends on facility size and current level of performance. © Published 2018. This article is a U.S. Government work and is in the public domain in the USA.
Analyzing medical costs with time-dependent treatment: The nested g-formula.
Spieker, Andrew; Roy, Jason; Mitra, Nandita
2018-04-16
As medical expenses continue to rise, methods to properly analyze cost outcomes are becoming of increasing relevance when seeking to compare average costs across treatments. Inverse probability weighted regression models have been developed to address the challenge of cost censoring in order to identify intent-to-treat effects (i.e., to compare mean costs between groups on the basis of their initial treatment assignment, irrespective of any subsequent changes to their treatment status). In this paper, we describe a nested g-computation procedure that can be used to compare mean costs between two or more time-varying treatment regimes. We highlight the relative advantages and limitations of this approach when compared with existing regression-based models. We illustrate the utility of this approach as a means to inform public policy by applying it to a simulated data example motivated by costs associated with cancer treatments. Simulations confirm that inference regarding intent-to-treat effects versus the joint causal effects estimated by the nested g-formula can lead to markedly different conclusions regarding differential costs. Therefore, it is essential to prespecify the desired target of inference when choosing between these two frameworks. The nested g-formula should be considered as a useful, complementary tool to existing methods when analyzing cost outcomes. Copyright © 2018 John Wiley & Sons, Ltd.
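The g-computation logic described above can be sketched in a toy, single-time-point form (the paper's nested g-formula extends this to time-varying treatment regimes). The data, the severity confounder, and the cost values below are entirely hypothetical: standardization averages the stratum-specific mean cost under a fixed treatment over the marginal distribution of the confounder.

```python
# Toy, single-time-point g-computation for mean costs (hypothetical data).
# Each record is (severity stratum, treatment received, cost in $1000s).

data = [
    (0, 0, 10), (0, 0, 12), (0, 1, 14), (0, 1, 16),
    (1, 0, 20), (1, 1, 30), (1, 1, 28), (1, 0, 22),
]

def g_formula_mean(data, treat):
    """E[cost] if everyone received `treat`, standardized over severity."""
    strata = sorted({s for s, _, _ in data})
    total = 0.0
    for s in strata:
        # Mean cost among those in stratum s who actually received `treat`.
        stratum_costs = [c for sv, a, c in data if sv == s and a == treat]
        mean_cost = sum(stratum_costs) / len(stratum_costs)
        # Weight by the marginal probability of the stratum.
        p_s = sum(1 for sv, _, _ in data if sv == s) / len(data)
        total += mean_cost * p_s
    return total

# Causal contrast: mean cost under treat-everyone vs. treat-no-one.
effect = g_formula_mean(data, treat=1) - g_formula_mean(data, treat=0)
```

The nested version repeats this standardization step backward through time, conditioning on treatment and covariate history at each interval, which is what distinguishes the joint causal effects it targets from intent-to-treat contrasts.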
van Asselt, Thea; Ramaekers, Bram; Corro Ramos, Isaac; Joore, Manuela; Al, Maiwenn; Lesman-Leegte, Ivonne; Postma, Maarten; Vemer, Pepijn; Feenstra, Talitha
2018-01-01
The costs of performing research are an important input in value of information (VOI) analyses but are difficult to assess. The aim of this study was to investigate the costs of research, serving two purposes: (1) estimating research costs for use in VOI analyses; and (2) developing a costing tool to support reviewers of grant proposals in assessing whether the proposed budget is realistic. For granted study proposals from the Netherlands Organization for Health Research and Development (ZonMw), type of study, potential cost drivers, proposed budget, and general characteristics were extracted. Regression analysis was conducted in an attempt to generate a 'predicted budget' for certain combinations of cost drivers, for implementation in the costing tool. Of 133 drug-related research grant proposals, 74 were included for complete data extraction. Because an association between cost drivers and budgets was not confirmed, we could not generate a predicted budget based on regression analysis, but only historical reference budgets given certain study characteristics. The costing tool was designed accordingly, i.e., given a set of selection criteria, the tool returns the range of budgets of comparable studies. This range can be used in VOI analysis to estimate whether the expected net benefit of sampling will be positive, and thus to decide on the net value of future research. The absence of an association between study characteristics and budgets may indicate inconsistencies in the budgeting or granting process. Nonetheless, the tool generates useful information on historical budgets and offers the option to formally relate VOI to budgets. To our knowledge, this is the first attempt at creating such a tool, which can be complemented as new studies are granted, enlarging the underlying database and keeping estimates up to date.
Bakam, Innocent; Balana, Bedru Babulo; Matthews, Robin
2012-12-15
Market-based policy instruments to reduce greenhouse gas (GHG) emissions are generally considered more appropriate than command-and-control tools. However, the omission of transaction costs from policy evaluations and decision-making processes may result in inefficiency in public resource allocation and sub-optimal policy choices and outcomes. This paper aims to assess the relative cost-effectiveness of market-based GHG mitigation policy instruments in the agricultural sector by incorporating transaction costs. Assuming that farmers' responses to mitigation policies are economically rational, an individual-based model is developed to study the relative performance of an emission tax, a nitrogen fertilizer tax, and a carbon trading scheme using farm data from the Scottish farm account survey (FAS) and emissions and transaction cost data from a literature metadata survey. Model simulations show that none of the three schemes could be considered the most cost-effective in all circumstances. The cost-effectiveness depends on both the tax rate and the amount of free permits allocated to farmers. However, the emissions trading scheme appears to outperform both other policies in realistic scenarios. Copyright © 2012 Elsevier Ltd. All rights reserved.
Sequential Tool Use in Great Apes
Martin-Ordas, Gema; Schumacher, Lena; Call, Josep
2012-01-01
Sequential tool use is defined as using a tool to obtain another non-food object which subsequently will itself serve as a tool to act upon a further (sub)goal. Previous studies have shown that birds and great apes succeed in such tasks. However, the inclusion of a training phase for each of the sequential steps and the low cost associated with retrieving the longest tools limit the scope of the conclusions. The goal of the experiments presented here was, first, to replicate a previous study on sequential tool use conducted on New Caledonian crows and, second, to extend this work by increasing the cost of retrieving a tool in order to test tool selectivity in apes. In Experiment 1, we presented chimpanzees, orangutans, and bonobos with an out-of-reach reward, two tools that were available but too short to reach the food, and four out-of-reach tools differing in functionality. Similar to crows, apes spontaneously used up to 3 tools in sequence to get the reward and also showed a strong preference for the longest out-of-reach tool independently of the distance of the food. In Experiment 2, we increased the cost of reaching for the longest out-of-reach tool. Now apes used up to 5 tools in sequence to get the reward and became more selective in their choice of the longest tool as the cost of its retrieval increased. The findings of the studies presented here contribute to the growing body of comparative research on tool use. PMID:23300592
What Can the Diffusion Model Tell Us About Prospective Memory?
Horn, Sebastian S.; Bayen, Ute J.; Smith, Rebekah E.
2011-01-01
Cognitive process models, such as Ratcliff’s (1978) diffusion model, are useful tools for examining cost- or interference effects in event-based prospective memory (PM). The diffusion model includes several parameters that provide insight into how and why ongoing-task performance may be affected by a PM task and is ideally suited to analyze performance because both reaction time and accuracy are taken into account. Separate analyses of these measures can easily yield misleading interpretations in cases of speed-accuracy tradeoffs. The diffusion model allows us to measure possible criterion shifts and is thus an important methodological improvement over standard analyses. Performance in an ongoing lexical decision task (Smith, 2003) was analyzed with the diffusion model. The results suggest that criterion shifts play an important role when a PM task is added, but do not fully explain the cost effect on RT. PMID:21443332
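The core mechanism of the diffusion model referenced above can be simulated schematically: noisy evidence drifts toward one of two decision boundaries, and response time is the first-passage time plus a non-decision component. The parameter values below are illustrative only, not fitted to any data; a PM cost effect is often modeled as a change in drift rate or boundary separation.

```python
import random

# Schematic first-passage simulation of Ratcliff's diffusion model
# (illustrative parameters): evidence x starts at z and drifts at rate v
# with Gaussian noise until it crosses the upper boundary a ("correct")
# or the lower boundary 0 ("error"); RT adds non-decision time t0.

def simulate_trial(v, a, z, t0, rng, dt=0.001, s=1.0):
    x, t = z, 0.0
    while 0.0 < x < a:
        x += v * dt + s * (dt ** 0.5) * rng.gauss(0.0, 1.0)  # Euler step
        t += dt
    return ("upper" if x >= a else "lower", t + t0)

def summarize(n, v, a, z, t0, seed=0):
    """Accuracy (upper-boundary proportion) and mean RT over n trials."""
    rng = random.Random(seed)
    trials = [simulate_trial(v, a, z, t0, rng) for _ in range(n)]
    accuracy = sum(1 for boundary, _ in trials if boundary == "upper") / n
    mean_rt = sum(rt for _, rt in trials) / n
    return accuracy, mean_rt

# Moderate drift toward the correct (upper) boundary, unbiased start point.
acc, mean_rt = summarize(200, v=2.0, a=2.0, z=1.0, t0=0.3)
```

Because accuracy and RT emerge jointly from the same parameters, a criterion shift (larger a) slows responses while raising accuracy, which is exactly the speed-accuracy trade-off that separate RT and accuracy analyses can misattribute.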
Optimization of power systems with voltage security constraints
NASA Astrophysics Data System (ADS)
Rosehart, William Daniel
As open-access market principles are applied to power systems, significant changes in their operation and control are occurring. In the new marketplace, power systems are operating under higher loading conditions as market influences demand greater attention to operating cost versus stability margins. Since stability continues to be a basic requirement in the operation of any power system, new tools are being considered to analyze the effect of stability on the operating cost of the system, so that system stability can be incorporated into the costs of operating the system. In this thesis, new optimal power flow (OPF) formulations are proposed based on multi-objective methodologies to optimize active and reactive power dispatch while maximizing voltage security in power systems. The effects of minimizing operating costs, minimizing reactive power generation, and/or maximizing voltage stability margins are analyzed. Results obtained using the proposed Voltage Stability Constrained OPF formulations are compared and analyzed to suggest possible ways of costing voltage security in power systems. When considering voltage stability margins, system modeling becomes critical, since it has been demonstrated, based on bifurcation analysis, that modeling can have a significant effect on the behavior of power systems, especially at high loading levels. Therefore, this thesis also examines the effects of detailed generator models and several exponential load models. Furthermore, because of its influence on voltage stability, a Static Var Compensator model is also incorporated into the optimization problems.
Strategy and gaps for modeling, simulation, and control of hybrid systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rabiti, Cristian; Garcia, Humberto E.; Hovsapian, Rob
2015-04-01
The purpose of this report is to establish a strategy for modeling and simulation of candidate hybrid energy systems. Modeling and simulation are necessary to design, evaluate, and optimize the system's technical and economic performance. Accordingly, this report first establishes the simulation requirements for analyzing candidate hybrid systems. Simulation fidelity levels are established based on the temporal scale, real and synthetic data availability or needs, solution accuracy, and output parameters needed to evaluate case-specific figures of merit. Accordingly, the associated computational and co-simulation resources needed are established, including physical models when needed, code assembly and integrated solution platforms, mathematical solvers, and data processing. This report attempts to describe the figures of merit, system requirements, and constraints that are necessary and sufficient to characterize the grid and hybrid systems' behavior and market interactions. Loss of Load Probability (LOLP) and Effective Cost of Energy (ECE), as opposed to the standard Levelized Cost of Electricity (LCOE), are introduced as technical and economic indices for integrated energy system evaluations. Financial assessment methods are subsequently introduced for evaluation of non-traditional, hybrid energy systems. Algorithms for coupled and iterative evaluation of the technical and economic performance are subsequently discussed. This report further defines the modeling objectives, computational tools, solution approaches, and real-time data collection and processing (in some cases using real test units) that will be required to model, co-simulate, and optimize: (a) energy system components (e.g., power generation unit, chemical process, electricity management unit), (b) system domains (e.g., thermal, electrical or chemical energy generation, conversion, and transport), and (c) systems control modules.
Co-simulation of complex, tightly coupled, dynamic energy systems requires multiple simulation tools, potentially developed in several programming languages and resolved on separate time scales. Although further investigation and development of hybrid concepts will provide a more complete understanding of the joint computational and physical modeling needs, this report highlights areas in which co-simulation capabilities are warranted. The development status, quality assurance, availability, and maintainability of simulation tools currently available for hybrid systems modeling are presented. Existing gaps in the modeling and simulation toolsets, along with development needs, are subsequently discussed. This effort will feed into a broader Roadmap activity for designing, developing, and demonstrating hybrid energy systems.
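The LOLP and LCOE indices named in the abstract above can be sketched as follows. This is an illustrative implementation of the standard definitions only, not code from the report: LOLP as the share of periods in which demand exceeds available generation, and LCOE as discounted lifetime cost divided by discounted lifetime energy. The capacity, demand, cost, and energy series are hypothetical inputs.

```python
def lolp(available_capacity, demand):
    """Loss of Load Probability: fraction of periods in which
    demand exceeds the available generation capacity."""
    shortfalls = sum(1 for cap, d in zip(available_capacity, demand) if d > cap)
    return shortfalls / len(demand)

def lcoe(annual_costs, annual_energy, discount_rate):
    """Levelized Cost of Electricity: discounted lifetime costs
    divided by discounted lifetime energy output ($/MWh if costs
    are in $ and energy in MWh)."""
    num = sum(c / (1 + discount_rate) ** t
              for t, c in enumerate(annual_costs, start=1))
    den = sum(e / (1 + discount_rate) ** t
              for t, e in enumerate(annual_energy, start=1))
    return num / den
```

For example, `lolp([50, 50, 50, 50], [45, 55, 40, 60])` returns 0.5 (demand exceeds capacity in two of four periods). The report's ECE index extends this style of accounting to integrated hybrid systems, where revenue streams other than electricity complicate a plain LCOE comparison.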
Annual cost of antiretroviral therapy among three service delivery models in Uganda
Vu, Lung; Waliggo, Samuel; Zieman, Brady; Jani, Nrupa; Buzaalirwa, Lydia; Okoboi, Stephen; Okal, Jerry; Borse, Nagesh N; Kalibala, Samuel
2016-01-01
Introduction In response to the increasing burden of HIV, the Ugandan government has employed different service delivery models since 2004 that aim to reduce costs and remove barriers to accessing HIV care. These models include community-based approaches to delivering antiretroviral therapy (ART) and delegating tasks to lower-level health workers. This study aimed to provide data on annual ART cost per client among three different service delivery models in Uganda. Methods Costing data for the entire year 2012 were retrospectively collected as part of a larger task-shifting study conducted in three organizations in Uganda: Kitovu Mobile (KM), the AIDS Support Organisation (TASO) and Uganda Cares (UC). A standard cost data capture tool was developed and used to collect cost information on antiretroviral (ARV) drugs and non-ARV drugs, ART-related lab tests, personnel and administrative costs. A random sample of four TASO centres (out of 11), four UC clinics (out of 29) and all KM outreach units were selected for the study. Results Costs varied across sites within each organization as well as across the three organizations. In addition, annual ART visits were more frequent in rural areas and under KM (the community distribution model), which contributed substantially to the overall annual ART cost. The annual cost per client (in USD) was $404 for KM, $332 for TASO and $257 for UC. These estimates were lower than those from previous analyses in Uganda and the region based on data from 2001 to 2009, but comparable with recent estimates using data from 2010 to 2013. ARVs accounted for the majority of the total cost, followed by personnel and operational costs. Conclusions The study provides updated data on annual cost per ART client for three service delivery models in Uganda.
These data will be vital for in-country budgetary efforts to ensure that universal access to ART, as called for in the 2015 World Health Organization (WHO) guidelines, is achievable. The lower annual ART cost found in this study suggests that treating all people with HIV, as laid out in those guidelines, may be within reach. The variation of costs across sites and the three models indicates the potential for efficiency gains. PMID:27443270