Assuring Software Cost Estimates: Is it an Oxymoron?
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Tregre, Grant
2013-01-01
The software industry repeatedly observes cost growth of well over 100% even after decades of cost estimation research and well-known best practices, so we ask: "What's the problem?" In this paper we provide an overview of the current state of software cost estimation best practice. We then explore whether applying some of the methods used in software assurance might improve the quality of software cost estimates. This paper especially focuses on issues associated with model calibration, estimate review, and the development and documentation of estimates as part of an integrated plan.
Estimating Software-Development Costs With Greater Accuracy
NASA Technical Reports Server (NTRS)
Baker, Dan; Hihn, Jairus; Lum, Karen
2008-01-01
COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their projects. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development-effort-estimation techniques is used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performance of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
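To make the calibrate-while-estimating idea concrete, the sketch below fits a COCOMO-style power law by exhaustive search over a parameter grid, scoring each candidate by mean magnitude of relative error (MMRE). The toy dataset, grid ranges, and scoring choice are illustrative assumptions, not COCOMOST's actual internals.

    import itertools

    # Toy historical records: (size in KSLOC, actual effort in person-months).
    # Hypothetical data for illustration only.
    HISTORY = [(12, 55), (30, 160), (7, 28), (45, 270), (20, 95)]

    def effort(a, b, ksloc):
        """COCOMO-style power law: effort = a * size**b."""
        return a * ksloc ** b

    def mmre(a, b, records):
        """Mean magnitude of relative error over a set of records."""
        errs = [abs(act - effort(a, b, size)) / act for size, act in records]
        return sum(errs) / len(errs)

    # Exhaustive search over a coarse (a, b) grid, in the spirit of a
    # data-mining search over model-parameter space.
    grid_a = [x / 10 for x in range(10, 61)]    # a in 1.0 .. 6.0
    grid_b = [x / 100 for x in range(90, 131)]  # b in 0.90 .. 1.30
    a, b = min(itertools.product(grid_a, grid_b),
               key=lambda ab: mmre(ab[0], ab[1], HISTORY))

    print(f"calibrated a={a:.2f}, b={b:.2f}, MMRE={mmre(a, b, HISTORY):.3f}")
    print(f"estimate for a 25-KSLOC job: {effort(a, b, 25):.1f} person-months")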
An approach to software cost estimation
NASA Technical Reports Server (NTRS)
Mcgarry, F.; Page, J.; Card, D.; Rohleder, M.; Church, V.
1984-01-01
A general procedure for software cost estimation in any environment is outlined. The basic concepts of work and effort estimation are explained, some popular resource estimation models are reviewed, and the accuracy of resource estimates is discussed. A software cost prediction procedure based on the experiences of the Software Engineering Laboratory in the flight dynamics area and incorporating management expertise, cost models, and historical data is described. The sources of information and relevant parameters available during each phase of the software life cycle are identified. The methodology suggested incorporates these elements into a customized management tool for software cost prediction. Detailed guidelines for estimation in the flight dynamics environment, developed using this methodology, are presented.
COSTMODL: An automated software development cost estimation tool
NASA Technical Reports Server (NTRS)
Roush, George B.
1991-01-01
The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, in both the public and private sectors. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
NASA Software Cost Estimation Model: An Analogy Based Estimation Model
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James
2015-01-01
The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DoD and NASA, especially as software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown in terms of the size of its software. Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to the performance of COCOMO II, linear regression, and K-nearest-neighbor prediction models on the same data set.
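As a rough illustration of estimation by analogy, the sketch below predicts effort as the mean of the k most similar past projects. The feature set, toy data, and unweighted Euclidean distance are assumptions for illustration; the NASA model's actual clustering and feature engineering are more involved.

    import math

    # Hypothetical past projects: feature vector (KSLOC, team experience 1-5,
    # requirements volatility 1-5) and actual effort in person-months.
    PROJECTS = [
        ((10, 4, 2), 42), ((55, 3, 4), 310), ((22, 2, 3), 130),
        ((8, 5, 1), 25),  ((35, 3, 2), 170),
    ]

    def distance(x, y):
        return math.dist(x, y)  # Euclidean; real models first normalize features

    def knn_estimate(target, projects, k=2):
        """Analogy estimate: mean effort of the k most similar past projects."""
        nearest = sorted(projects, key=lambda p: distance(target, p[0]))[:k]
        return sum(e for _, e in nearest) / k

    print(knn_estimate((20, 3, 3), PROJECTS))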
DOT National Transportation Integrated Search
2013-04-01
The Rural Road Upgrade Inventory and Cost Estimation Software is designed by the AUTC : research team to help the Fairbanks North Star Borough (FNSB) estimate the cost of upgrading : rural roads located in the Borough's Service Areas. The Software pe...
Software Development Cost Estimation Executive Summary
NASA Technical Reports Server (NTRS)
Hihn, Jairus M.; Menzies, Tim
2006-01-01
Identify simple, fully validated cost models that provide estimation uncertainty along with the cost estimate, based on the COCOMO variable set. Use machine learning techniques to determine: (a) the minimum number of cost drivers required for NASA domain-based cost models; (b) the minimum number of data records required; and (c) estimation uncertainty. Build a repository of software cost estimation information. Coordinate tool development and data collection with: (a) tasks funded by PA&E Cost Analysis; (b) the IV&V Effort Estimation Task; and (c) NASA SEPG activities.
Software Cost-Estimation Model
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1985-01-01
Software Cost Estimation Model SOFTCOST provides automated resource and schedule model for software development. Combines several cost models found in open literature into one comprehensive set of algorithms. Compensates for nearly fifty implementation factors relative to size of task, inherited baseline, organizational and system environment and difficulty of task.
NASA Technical Reports Server (NTRS)
Mizell, Carolyn; Malone, Linda
2007-01-01
It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with a high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
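A minimal sketch of the idea of conveying uncertainty rather than a point estimate: Monte Carlo sampling over uncertain inputs yields a distribution of effort. The triangular input ranges below are hypothetical; the paper's process simulation models development phases in far more detail.

    import random
    import statistics

    def simulate_effort(n=10_000):
        """Propagate input uncertainty to an effort estimate by sampling."""
        samples = []
        for _ in range(n):
            ksloc = random.triangular(15, 40, 25)       # low, high, mode
            pm_per_ksloc = random.triangular(4, 9, 6)   # productivity spread
            samples.append(ksloc * pm_per_ksloc)
        samples.sort()
        return (statistics.mean(samples),
                samples[int(0.10 * n)],   # 10th percentile
                samples[int(0.90 * n)])   # 90th percentile

    mean, p10, p90 = simulate_effort()
    print(f"mean {mean:.0f} PM; 80% interval [{p10:.0f}, {p90:.0f}] PM")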
Cost benefits of advanced software: A review of methodology used at Kennedy Space Center
NASA Technical Reports Server (NTRS)
Joglekar, Prafulla N.
1993-01-01
To assist rational investments in advanced software, a formal, explicit, and multi-perspective cost-benefit analysis methodology is proposed. The methodology can be implemented through a six-stage process which is described and explained. The current practice of cost-benefit analysis at KSC is reviewed in the light of this methodology. The review finds that there is a vicious circle operating. Unsound methods lead to unreliable cost-benefit estimates. Unreliable estimates convince management that cost-benefit studies should not be taken seriously. Then, given external demands for cost-benefit estimates, management encourages software engineers to somehow come up with the numbers for their projects. Lacking the expertise needed to do a proper study, courageous software engineers with vested interests use ad hoc and unsound methods to generate some estimates. In turn, these estimates are unreliable, and the cycle continues. The proposed methodology should help KSC to break out of this cycle.
Deep space network software cost estimation model
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1981-01-01
A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.
Automated Estimation Of Software-Development Costs
NASA Technical Reports Server (NTRS)
Roush, George B.; Reini, William
1993-01-01
COSTMODL is automated software development-estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as function of defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
A measurement system for large, complex software programs
NASA Technical Reports Server (NTRS)
Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.
1994-01-01
This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
1988-09-01
[Table-of-contents fragment; recoverable headings: SLIM, SoftCost-R, SPQR/20, PRICE-S, System-3, Summary; Appendix G: SoftCost-R Input Values; Appendix H: SoftCost-R Resources Estimate; Appendix I: SPQR.]
A Decision Support System for Planning, Control and Auditing of DoD Software Cost Estimation.
1986-03-01
is frequently used in U.S. Air Force software cost estimates. Barry Boehm's Constructive Cost Estimation Model (COCOMO) was recently selected for use...are considered basic to the proper development of software. Pressman [Ref. 11] addresses these basic elements in a manner which attempts to integrate...H., Jr., and Carlson, Eric D., Building Effective Decision Support Systems, Prentice-Hall, Englewood Cliffs, NJ, 1982. 11. Pressman, Roger S., A Practitioner's A
Software cost/resource modeling: Deep space network software cost estimation model
NASA Technical Reports Server (NTRS)
Tausworthe, R. J.
1980-01-01
A parametric software cost estimation model prepared for JPL deep space network (DSN) data systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models, such as those of the General Research Corporation, Doty Associates, IBM (Walston-Felix), Rome Air Force Development Center, University of Maryland, and Rayleigh-Norden-Putnam. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software lifecycle statistics. The estimation model output scales a standard DSN work breakdown structure skeleton, which is then input to a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
Cost Estimation of Software Development and the Implications for the Program Manager
1992-06-01
Software Lifecycle Model (SLIM), the Jensen System-4 model, the Software Productivity, Quality, and Reliability Estimator (SPQR/20), the Constructive...function models in current use are the Software Productivity, Quality, and Reliability Estimator (SPQR/20) and the Software Architecture Sizing and...Estimator (SPQR/20) was developed by T. Capers Jones of Software Productivity Research, Inc., in 1985. The model is intended to estimate the outcome
An Exploratory Study of Software Cost Estimating at the Electronic Systems Division.
1976-07-01
actions to improve the software cost estimating process. While this research was limited to the ESD environment, the same types of problems may exist...Methods in Social Science. New York: Random House, 1969. 57. Smith, Ronald L. Structured Programming Series (Vol. XI) - Estimating Software Project
Deep space network software cost estimation model
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1981-01-01
A parametric software cost estimation model prepared for Deep Space Network (DSN) Data Systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit DSN software life cycle statistics. The estimation model output scales a standard DSN Work Breakdown Structure skeleton, which is then input into a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
Cost Estimating Cases: Educational Tools for Cost Analysts
1993-09-01
only appropriate documentation should be provided. In other words, students should not submit all of the documentation possible using ACEIT, only that...case was their lack of understanding of the ACEIT software used to conduct the estimate. Specifically, many students misinterpreted the cost...estimating relationships (CERs) embedded in the software. Additionally, few of the students were able to properly organize the ACEIT documentation output
Which factors affect software projects maintenance cost more?
Dehaghani, Sayed Mehdi Hejazi; Hajrahimi, Nafiseh
2013-03-01
The software industry has made significant progress in recent years. The life of software includes two phases: production and maintenance. Software maintenance cost is growing steadily, and estimates have indicated that about 90% of software life-cycle cost is related to the maintenance phase. Extracting and considering the factors affecting software maintenance cost helps to estimate that cost and to reduce it by controlling the factors. In this study, the factors affecting software maintenance cost were determined and then ranked by priority, after which effective ways to reduce maintenance costs were presented. This paper is a research study. Fifteen software systems related to health care center information systems at Isfahan University of Medical Sciences and its hospitals were studied over the years 2010 to 2011. Forty members of the medical software maintenance teams were selected as the sample. After interviews with experts in this field, the factors affecting maintenance cost were determined. To prioritize the derived factors by AHP, the measurement criteria (the factors found) were first rated by members of the maintenance team and then prioritized with the help of EC software. Based on the results of this study, 32 factors were obtained and classified into six groups. "Project" was ranked the most effective feature in maintenance cost, with the highest priority. By taking into account major elements such as careful feasibility studies of IT projects, full documentation, and involving the designers in the maintenance phase, good results can be achieved in reducing maintenance costs and increasing the longevity of the software.
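For readers unfamiliar with AHP, the sketch below computes a priority vector from a pairwise-comparison matrix using the common geometric-mean approximation. The matrix values and factor names are hypothetical, not the study's data.

    import math

    # Hypothetical 3x3 pairwise-comparison matrix for three maintenance cost
    # factors (e.g., Project, Personnel, Environment). Entry [i][j] says how
    # much more important factor i is than factor j on Saaty's 1-9 scale.
    A = [
        [1,   3,   5],
        [1/3, 1,   2],
        [1/5, 1/2, 1],
    ]

    def ahp_priorities(matrix):
        """Approximate AHP priority vector via the geometric-mean method."""
        gmeans = [math.prod(row) ** (1 / len(row)) for row in matrix]
        total = sum(gmeans)
        return [g / total for g in gmeans]

    print([round(w, 3) for w in ahp_priorities(A)])  # weights sum to 1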
Operations analysis (study 2.1): Shuttle upper stage software requirements
NASA Technical Reports Server (NTRS)
Wolfe, R. R.
1974-01-01
An investigation of software costs related to space shuttle upper stage operations, with emphasis on the additional costs attributable to space servicing, was conducted. The questions and problem areas include the following: (1) the key parameters involved with software costs; (2) historical data for extrapolation of future costs; (3) elements of the basic software development effort that are applicable to servicing functions; (4) the effect of multiple servicing on the complexity of the operation; and (5) whether recurring software costs are significant. The results address these questions and provide a foundation for estimating software costs based on the costs of similar programs and a series of empirical factors.
1988-12-01
The software development scene is often characterized by schedule and cost estimates that are grossly inaccurate... c. SPQR Model (Jones): T. Capers Jones has developed a software cost estimation model called the Software Productivity, Quality, and Reliability (SPQR) model. The basic approach is similar to that of Boehm's... d. COPMO (Thebaut)... Development time T (in seconds) is simply derived from the effort E by dividing by the Stroud number S: T = E/S.
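As a worked example of the T = E/S relation in the reconstructed snippet above, the sketch below computes an effort value E and converts it to time with a Stroud number of 18, assuming E is Halstead's software-science effort measure, as in the model surveys this report draws on; the operator/operand counts are hypothetical.

    import math

    def halstead_effort(n1, n2, N1, N2):
        """Halstead effort E = D * V, where V = (N1+N2) * log2(n1+n2) is
        volume and D = (n1/2) * (N2/n2) is difficulty."""
        volume = (N1 + N2) * math.log2(n1 + n2)
        difficulty = (n1 / 2) * (N2 / n2)
        return difficulty * volume

    E = halstead_effort(n1=20, n2=35, N1=120, N2=90)  # hypothetical counts
    S = 18   # Stroud number: mental discriminations per second
    print(f"E = {E:.0f}, T = E/S = {E / S:.0f} seconds")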
NASA Technical Reports Server (NTRS)
McNeill, Justin
1995-01-01
The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger-than-anticipated design work on software to be ported.
ERIC Educational Resources Information Center
Lafferty, Mark T.
2010-01-01
The number of project failures and those projects completed over cost and over schedule has been a significant issue for software project managers. Among the many reasons for failure, inaccuracy in software estimation--the basis for project bidding, budgeting, planning, and probability estimates--has been identified as a root cause of a high…
Software Size Estimation Using Expert Estimation: A Fuzzy Logic Approach
ERIC Educational Resources Information Center
Stevenson, Glenn A.
2012-01-01
For decades software managers have been using formal methodologies such as the Constructive Cost Model and Function Points to estimate the effort of software projects during the early stages of project development. While some research shows these methodologies to be effective, many software managers feel that they are overly complicated to use and…
CrossTalk: The Journal of Defense Software Engineering. Volume 20, Number 6, June 2007
2007-06-01
California. He has co-authored the book Software Cost Estimation With COCOMO II with Barry Boehm and others. Clark helped define the COCOMO II model...Software Engineering at the University of Southern California. She worked with Barry Boehm and Chris Abts to develop and calibrate a cost-estimation...2003/02/schorsch.html>. 2. See "Software Engineering: A Practitioner's Approach" by Roger Pressman for a good description of coupling, cohesion
Users guide for STHARVEST: software to estimate the cost of harvesting small timber.
Roger D. Fight; Xiaoshan Zhang; Bruce R. Hartsough
2003-01-01
The STHARVEST computer application is Windows-based, public-domain software used to estimate costs for harvesting small-diameter stands or the small-diameter component of a mixed-sized stand. The equipment production rates were developed from existing studies. Equipment operating cost rates were based on November 1998 prices for new equipment and wage rates for the...
Dennis R. Becker; Debra Larson; Eini C. Lowell; Robert B. Rummer
2008-01-01
The HCR (Harvest Cost-Revenue) Estimator is engineering and financial analysis software used to evaluate stand-level financial thresholds for harvesting small-diameter ponderosa pine (Pinus ponderosa Dougl. ex Laws.) in the Southwest United States. The Windows-based program helps contractors and planners to identify costs associated with tree...
Methods for cost estimation in software project management
NASA Astrophysics Data System (ADS)
Briciu, C. V.; Filip, I.; Indries, I. I.
2016-02-01
The speed with which the processes used in the software development field have changed makes forecasting the overall costs for a software project very difficult. Many researchers have considered this task unachievable, but there is a group of scientists for whom it can be solved using already known mathematical methods (e.g., multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building a cost estimation model for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. In the first part of the paper, a summary of the major achievements in the research area of finding a model for estimating overall project costs is presented, together with a description of the existing software development process models. In the last part, a basic proposal of a mathematical model for genetic programming is made, including the description of the chosen fitness function and chromosome representation. The perspective of the described model is linked with the current reality of software development, taking as its basis the software product life cycle and the current challenges and innovations in the software development area. Based on the authors' experience and the analysis of the existing models and product life cycles, it was concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
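A minimal sketch of the genetic-algorithm approach described: individuals are (a, b) pairs of a COCOMO-style effort equation, fitness is negative MMRE on historical records, and selection, crossover, and mutation evolve the population. The dataset and GA settings are illustrative stand-ins for the PROMISE/COCOMO 81 setup.

    import random

    # Hypothetical (KSLOC, actual person-months) records standing in for a
    # PROMISE/COCOMO-81-style dataset.
    DATA = [(5, 20), (14, 62), (25, 120), (40, 230), (60, 400), (9, 33)]

    def fitness(ind):
        """Negative MMRE of effort = a * size**b; higher is better."""
        a, b = ind
        errs = [abs(act - a * s ** b) / act for s, act in DATA]
        return -sum(errs) / len(errs)

    def mutate(ind, rate=0.3):
        a, b = ind
        if random.random() < rate: a *= random.uniform(0.8, 1.2)
        if random.random() < rate: b += random.uniform(-0.05, 0.05)
        return (a, b)

    def crossover(p, q):
        return (random.choice((p[0], q[0])), random.choice((p[1], q[1])))

    pop = [(random.uniform(1, 6), random.uniform(0.9, 1.3)) for _ in range(40)]
    for _ in range(100):                       # generations
        pop.sort(key=fitness, reverse=True)
        elite = pop[:10]                       # truncation selection
        pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                       for _ in range(30)]

    best = max(pop, key=fitness)
    print(f"a={best[0]:.2f} b={best[1]:.2f} MMRE={-fitness(best):.3f}")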
2010-09-01
(NNWC) was used to calculate major cost components (labor, hardware, software, and transport), while a VMware tool was used to calculate power and...cooling costs for both solutions. In addition, VMware provided a cost estimate for the upfront hardware and software licensing costs needed to support...cost per seat (CPS) model developed by Naval Network Warfare Command (NNWC) was used to calculate major cost components: labor, hardware, software, and
SOFTCOST - DEEP SPACE NETWORK SOFTWARE COST MODEL
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1994-01-01
The early-on estimation of required resources and a schedule for the development and maintenance of software is usually the least precise aspect of the software life cycle. However, it is desirable to make some sort of an orderly and rational attempt at estimation in order to plan and organize an implementation effort. The Software Cost Estimation Model program, SOFTCOST, was developed to provide a consistent automated resource and schedule model which is more formalized than the often used guesswork model based on experience, intuition, and luck. SOFTCOST was developed after the evaluation of a number of existing cost estimation programs indicated that there was a need for a cost estimation program with a wide range of application and adaptability to diverse kinds of software. SOFTCOST combines several software cost models found in the open literature into one comprehensive set of algorithms that compensate for nearly fifty implementation factors relative to size of the task, inherited baseline, organizational and system environment, and difficulty of the task. SOFTCOST produces mean and variance estimates of software size, implementation productivity, recommended staff level, probable duration, amount of computer resources required, and amount and cost of software documentation. Since the confidence level for a project using mean estimates is small, the user is given the opportunity to enter risk-biased values for effort, duration, and staffing, to achieve higher confidence levels. SOFTCOST then produces a PERT/CPM file with subtask efforts, durations, and precedences defined so as to produce the Work Breakdown Structure (WBS) and schedule having the asked-for overall effort and duration. The SOFTCOST program operates in an interactive environment prompting the user for all of the required input. The program builds the supporting PERT data base in a file for later report generation or revision. The PERT schedule and the WBS schedule may be printed and stored in a file for later use. The SOFTCOST program is written in Microsoft BASIC for interactive execution and has been implemented on an IBM PC-XT/AT running MS-DOS 2.1 or higher with 256K bytes of memory. SOFTCOST was originally developed for the Zilog Z80 system running under CP/M in 1981. It was converted to run on the IBM PC XT/AT in 1986. SOFTCOST is a copyrighted work with all copyright vested in NASA.
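The risk-biasing step can be illustrated briefly: given SOFTCOST-style mean and variance outputs, a planner picks an effort commitment at a chosen confidence level. The normality assumption and the numbers below are illustrative simplifications, not SOFTCOST's actual procedure.

    from statistics import NormalDist

    # A mean-based plan has roughly 50% confidence, so planners bias the
    # commitment upward. Numbers below are hypothetical.
    mean_effort = 120.0   # person-months
    std_effort = 30.0     # sqrt of the model's variance estimate

    def risk_biased(mean, std, confidence):
        """Effort commitment at a chosen confidence level, assuming the
        estimation error is approximately normal (a simplifying assumption)."""
        return mean + NormalDist().inv_cdf(confidence) * std

    for conf in (0.50, 0.70, 0.90):
        pm = risk_biased(mean_effort, std_effort, conf)
        print(f"{conf:.0%} confidence -> {pm:.0f} PM")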
On a Formal Tool for Reasoning About Flight Software Cost Analysis
NASA Technical Reports Server (NTRS)
Spagnuolo, John N., Jr.; Stukes, Sherry A.
2013-01-01
A report focuses on the development of flight software (FSW) cost estimates for 16 Discovery-class missions at JPL. The techniques and procedures developed enabled streamlining of the FSW analysis process, and provided instantaneous confirmation that the data and processes used for these estimates were consistent across all missions. The research provides direction as to how to build a prototype rule-based system for FSW cost estimation that would provide (1) FSW cost estimates, (2) explanation of how the estimates were arrived at, (3) mapping of costs, (4) mathematical trend charts with explanations of why the trends are what they are, (5) tables with ancillary FSW data of interest to analysts, (6) a facility for expert modification/enhancement of the rules, and (7) a basis for conceptually convenient expansion into more complex, useful, and general rule-based systems.
Estimating Software Effort Hours for Major Defense Acquisition Programs
ERIC Educational Resources Information Center
Wallshein, Corinne C.
2010-01-01
Software Cost Estimation (SCE) uses labor hours or effort required to conceptualize, develop, integrate, test, field, or maintain program components. Department of Defense (DoD) SCE can use initial software data parameters to project effort hours for large, software-intensive programs for contractors reporting the top levels of process maturity,…
1990-09-01
V. COCOMO MODEL. A. OVERVIEW. The COCOMO model, which stands for COnstructive COst MOdel, was developed by Barry Boehm and is...an estimation model which uses an expert system to automate the Intermediate COnstructive Cost Estimation MOdel (COCOMO), developed by Barry W. Boehm...
Quantitative software models for the estimation of cost, size, and defects
NASA Technical Reports Server (NTRS)
Hihn, J.; Bright, L.; Decker, B.; Lum, K.; Mikulski, C.; Powell, J.
2002-01-01
The presentation will provide a brief overview of the SQI measurement program as well as describe each of these models and how they are currently being used in supporting JPL project, task and software managers to estimate and plan future software systems and subsystems.
Data Service Provider Cost Estimation Tool
NASA Technical Reports Server (NTRS)
Fontaine, Kathy; Hunolt, Greg; Booth, Arthur L.; Banks, Mel
2011-01-01
The Data Service Provider Cost Estimation Tool (CET) and Comparables Database (CDB) package provides to NASA's Earth Science Enterprise (ESE) the ability to estimate the full range of year-by-year lifecycle cost estimates for the implementation and operation of data service providers required by ESE to support its science and applications programs. The CET can make estimates dealing with staffing costs, supplies, facility costs, network services, hardware and maintenance, commercial off-the-shelf (COTS) software licenses, software development and sustaining engineering, and the changes in costs that result from changes in workload. Data service providers may be stand-alone or embedded in flight projects, field campaigns, research or applications projects, or other activities. The CET and CDB package employs a cost-estimation-by-analogy approach. It is based on a new, general data service provider reference model that provides a framework for construction of a database by describing existing data service providers that are analogs (comparables) to planned, new ESE data service providers. The CET implements the staff effort and cost estimation algorithms that access the CDB and generates the lifecycle cost estimate for a new data service provider. These data create a common basis for ESE proposal evaluators to consider projected data service provider costs.
Fuzzy/Neural Software Estimates Costs of Rocket-Engine Tests
NASA Technical Reports Server (NTRS)
Douglas, Freddie; Bourgeois, Edit Kaminsky
2005-01-01
The Highly Accurate Cost Estimating Model (HACEM) is a software system for estimating the costs of testing rocket engines and components at Stennis Space Center. HACEM is built on a foundation of adaptive-network-based fuzzy inference systems (ANFIS), a hybrid software concept that combines the adaptive capabilities of neural networks with the ease of development and additional benefits of fuzzy-logic-based systems. In ANFIS, fuzzy inference systems are trained by use of neural networks. HACEM includes selectable subsystems that utilize various numbers and types of inputs, various numbers of fuzzy membership functions, and various input-preprocessing techniques. The inputs to HACEM are parameters of specific tests or series of tests. These parameters include test type (component or engine test), number and duration of tests, and thrust level(s) (in the case of engine tests). The ANFIS in HACEM are trained by use of sets of these parameters, along with costs of past tests. Thereafter, the user feeds HACEM a simple input text file that contains the parameters of a planned test or series of tests, the user selects the desired HACEM subsystem, and the subsystem processes the parameters into an estimate of cost(s).
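To give a flavor of fuzzy inference without the neural training, the sketch below runs a zero-order Sugeno-style rule base: triangular memberships weight crisp cost consequents. Membership shapes, rule consequents, and the single input are hypothetical; HACEM's ANFIS learns such parameters from historical test costs.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def estimate_cost(duration_hrs):
        """Weighted average of rule consequents by firing strength."""
        rules = [
            (tri(duration_hrs, 0, 2, 6),   80.0),   # short test -> low cost
            (tri(duration_hrs, 2, 6, 12),  220.0),  # medium test -> medium
            (tri(duration_hrs, 6, 12, 24), 500.0),  # long test -> high cost
        ]
        num = sum(w * c for w, c in rules)
        den = sum(w for w, _ in rules)
        return num / den if den else 0.0

    print(f"{estimate_cost(5.0):.0f} k$")  # hypothetical cost units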
Users guide for FRCS: fuel reduction cost simulator software.
Roger D. Fight; Bruce R. Hartsough; Peter Noordijk
2006-01-01
The Fuel Reduction Cost Simulator (FRCS) spreadsheet application is public domain software used to estimate costs for fuel reduction treatments involving removal of trees of mixed sizes in the form of whole trees, logs, or chips from a forest. Equipment production rates were developed from existing studies. Equipment operating cost rates are from December 2002 prices...
Improving Software Engineering on NASA Projects
NASA Technical Reports Server (NTRS)
Crumbley, Tim; Kelly, John C.
2010-01-01
The Software Engineering Initiative reduces the risk of software failure and increases mission safety; produces more predictable software cost estimates and delivery schedules; makes NASA a smarter buyer of contracted-out software; finds and removes more defects earlier; reduces duplication of effort between projects; and increases the ability to meet the challenges of evolving software technology.
Software Cost Estimation Using a Decision Graph Process: A Knowledge Engineering Approach
NASA Technical Reports Server (NTRS)
Stukes, Sherry; Spagnuolo, John, Jr.
2011-01-01
This paper is not a description per se of the efforts by two software cost analysts. Rather, it is an outline of the methodology used for FSW cost analysis presented in a form that would serve as a foundation upon which others may gain insight into how to perform FSW cost analyses for their own problems at hand.
Computer software to estimate timber harvesting system production, cost, and revenue
Dr. John E. Baumgras; Dr. Chris B. LeDoux
1992-01-01
Large variations in timber harvesting cost and revenue can result from the differences between harvesting systems, the variable attributes of harvesting sites and timber stands, or changing product markets. Consequently, system and site specific estimates of production rates and costs are required to improve estimates of harvesting revenue. This paper describes...
NASA Technical Reports Server (NTRS)
Hihn, Jairus
2011-01-01
This presentation is about cost risk identification and estimation, which is only a part of risk management. The purpose of this step is to identify common software risks, to assess their impact on the cost estimate, and to make revisions to the estimate based on these impacts.
Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?
NASA Technical Reports Server (NTRS)
Lum, Karen; Hihn, Jairus; Menzies, Tim
2006-01-01
While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models, both because of the large variance problem inherent in cost data and because far more effort multipliers are included than the data support. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records, together with multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem, a leading cause of cost model brittleness or instability.
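The paper's prescription, prune variables and validate out-of-sample, can be sketched as follows: a leave-one-out loop calibrates a deliberately minimal model with and without a cost-driver multiplier and compares MMRE and PRED(30). The data and the fixed exponent are illustrative assumptions, not the paper's method.

    import math

    # Hypothetical records: (KSLOC, cost-driver multiplier, person-months).
    DATA = [(10, 1.1, 45), (25, 1.3, 140), (8, 0.9, 30),
            (40, 1.2, 260), (18, 1.0, 70), (33, 1.4, 210)]

    def loo_scores(use_multiplier):
        """Leave-one-out MMRE and PRED(30) for effort = a * m * size**1.05,
        calibrating only the constant a (log-space mean) on held-in records.
        Dropping the multiplier mimics variable pruning."""
        rel_errors = []
        for i, (size, m, actual) in enumerate(DATA):
            train = DATA[:i] + DATA[i + 1:]
            def nominal(s, mult):
                return (mult if use_multiplier else 1.0) * s ** 1.05
            a = math.exp(sum(math.log(act / nominal(s, mu))
                             for s, mu, act in train) / len(train))
            pred = a * nominal(size, m)
            rel_errors.append(abs(actual - pred) / actual)
        mmre = sum(rel_errors) / len(rel_errors)
        pred30 = sum(e <= 0.30 for e in rel_errors) / len(rel_errors)
        return mmre, pred30

    for flag in (True, False):
        mmre, p30 = loo_scores(flag)
        print(f"multiplier={flag}: MMRE={mmre:.2f} PRED(30)={p30:.0%}")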
1982-05-13
Size Of The Software. A favourite measure for software system size is lines of operational code, or deliverable code (operational code plus...regression models, these conversions are either derived from productivity measures using the "cost per instruction" type of equation or they are...appropriate to different development organisations, different project types, different sets of units for measuring e and s, and different items
Lessons learned in deploying software estimation technology and tools
NASA Technical Reports Server (NTRS)
Panlilio-Yap, Nikki; Ho, Danny
1994-01-01
Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.
Software risk estimation and management techniques at JPL
NASA Technical Reports Server (NTRS)
Hihn, J.; Lum, K.
2002-01-01
In this talk we discuss how uncertainty has been incorporated into the JPL software model through probabilistic-based estimates, and how cost risk is currently being explored via a variety of approaches, from traditional risk lists, to detailed WBS-based risk estimates, to the Defect Detection and Prevention (DDP) tool.
CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 4
2005-04-01
older automated cost-estimating tools are no longer being actively marketed but are still in use, such as CheckPoint, COCOMO, ESTIMACS, REVIC, and SPQR...estimation tools: SPQR/20, Checkpoint, and KnowledgePlan. These software estimation tools pioneered the use of function point metrics for sizing and
Software management tools: Lessons learned from use
NASA Technical Reports Server (NTRS)
Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.
1985-01-01
Experience in inserting software project planning tools into more than 100 projects producing mission-critical software is discussed. The problems the software project manager faces are listed, along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.
COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL
NASA Technical Reports Server (NTRS)
Roush, G. B.
1994-01-01
The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo Professional 5.0 for recompilation. An executable is provided on the distribution diskettes. COSTMODL requires 512K RAM. The standard distribution medium for COSTMODL is three 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. COSTMODL was developed in 1991. IBM PC is a registered trademark of International Business Machines. Borland and Turbo Pascal are registered trademarks of Borland International, Inc. Turbo Professional is a trademark of TurboPower Software. MS-DOS is a registered trademark of Microsoft Corporation.
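For reference, the Basic COCOMO equations that form the simplest of COSTMODL's five algorithm families can be stated compactly; the coefficients below are Boehm's published values for the three development modes, and the 32-KSLOC input is just an example.

    # Basic COCOMO (Boehm 1981): effort = a * KSLOC**b person-months,
    # schedule = c * effort**d calendar months.
    MODES = {
        "organic":       (2.4, 1.05, 2.5, 0.38),
        "semi-detached": (3.0, 1.12, 2.5, 0.35),
        "embedded":      (3.6, 1.20, 2.5, 0.32),
    }

    def basic_cocomo(ksloc, mode="organic"):
        a, b, c, d = MODES[mode]
        effort = a * ksloc ** b        # person-months
        schedule = c * effort ** d     # calendar months
        return effort, schedule, effort / schedule  # average staffing

    for mode in MODES:
        e, t, staff = basic_cocomo(32, mode)
        print(f"{mode:13s}: {e:6.1f} PM, {t:4.1f} months, {staff:4.1f} staff")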
NASA Technical Reports Server (NTRS)
Hops, J. M.; Sherif, J. S.
1994-01-01
A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of expected software maintenance cost, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other software activities. Software design methods should be the starting point for alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software, which were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
NASA Technical Reports Server (NTRS)
Cowderoy, A. J. C.; Jenkins, John O.; Poulymenakou, A.
1992-01-01
The tendency for software development projects to be completed over schedule and over budget has been documented extensively. Additionally, many projects are completed within budgetary and schedule targets only as a result of the customer agreeing to accept reduced functionality. In his classic book, The Mythical Man-Month, Fred Brooks exposes the fallacy that effort and schedule are freely interchangeable. All current cost models are produced on the assumption that there is very limited scope for schedule compression unless there is a corresponding reduction in delivered functionality. The Metrication and Resources Modeling Aid (MERMAID) project, partially financed by the Commission of the European Communities (CEC) as Project 2046, began in Oct. 1988, and its goals were as follows: (1) to improve understanding of the relationships between software development productivity and product and process metrics; (2) to facilitate widespread technology transfer from the Consortium to the European software industry; and (3) to facilitate the widespread uptake of cost estimation techniques by the provision of prototype cost estimation tools. MERMAID developed a family of methods for cost estimation, many of which have had tools implemented in prototypes. These prototypes are best considered as toolkits or workbenches.
IDA Cost Research Symposium Held 25 May 1995.
1995-08-01
Excel Spreadsheet. Publications: MCR Report TR-9507/01. Category: II.B. Keywords: Government, Estimating, Missiles, Analysis, Production, Data...originally developed by Martin Marietta as part of the SASET software estimating model. To be implemented as part of the SoftEST Software Estimating Tool...following documents to report the results of its work. Reports are the most authoritative and most carefully considered products IDA
Utilizing Expert Knowledge in Estimating Future STS Costs
NASA Technical Reports Server (NTRS)
Fortner, David B.; Ruiz-Torres, Alex J.
2004-01-01
A method of estimating the costs of future space transportation systems (STSs) involves classical activity-based cost (ABC) modeling combined with systematic utilization of the knowledge and opinions of experts to extend the process-flow knowledge of existing systems to systems that involve new materials and/or new architectures. The expert knowledge is particularly helpful in filling gaps that arise in computational models of processes because of inconsistencies in historical cost data. Heretofore, the costs of planned STSs have been estimated following a "top-down" approach that tends to force the architectures of new systems to incorporate process flows like those of the space shuttles. In this ABC-based method, one makes assumptions about the processes, but otherwise follows a "bottom-up" approach that does not force the new system architecture to incorporate a space-shuttle-like process flow. Prototype software has been developed to implement this method. Through further development of software, it should be possible to extend the method beyond the space program to almost any setting in which there is a need to estimate the costs of a new system and to extend the applicable knowledge base in order to make the estimate.
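A minimal sketch of the bottom-up ABC roll-up described: each process-flow activity carries a rate, a quantity driver, and an expert-elicited adjustment factor where historical data is inconsistent. The activities, rates, and factors below are hypothetical.

    # Hypothetical process flow for a new launch-vehicle architecture.
    ACTIVITIES = [
        # (name, hours per unit, units per flight, expert adjustment factor)
        ("tank fabrication",   400, 2, 1.10),  # new material: +10% (expert)
        ("engine integration", 250, 4, 1.00),
        ("vehicle checkout",   600, 1, 0.85),  # self-test avionics: -15%
    ]
    LABOR_RATE = 120.0  # dollars per hour, hypothetical

    def flight_cost(activities, rate):
        """Roll up activity costs: hours * quantity * expert factor * rate."""
        return sum(hours * units * adj * rate
                   for _, hours, units, adj in activities)

    print(f"${flight_cost(ACTIVITIES, LABOR_RATE):,.0f} per flight")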
Selected Tether Applications Cost Model
NASA Technical Reports Server (NTRS)
Keeley, Michael G.
1988-01-01
Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.
Software for Tracking Costs of Mars Projects
NASA Technical Reports Server (NTRS)
Wong, Alvin; Warfield, Keith
2003-01-01
The Mars Cost Tracking Model is a computer program that administers a system set up for tracking the costs of future NASA projects that pertain to Mars. Previously, no such tracking system existed, and documentation was written in a variety of formats and scattered in various places. It was difficult to justify costs or even track the history of costs of a spacecraft mission to Mars. The present software enables users to maintain all cost-model definitions, documentation, and justifications of cost estimates in one computer system that is accessible via the Internet. The software provides sign-off safeguards to ensure the reliability of information entered into the system. This system may eventually be used to track the costs of projects other than only those that pertain to Mars.
The impact of organizational structure on flight software cost risk
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Lum, Karen; Monson, Erik
2004-01-01
This paper summarizes the final results of the follow-up study, which updates the estimated software effort growth for those projects that were still under development and adds an evaluation of organizational roles versus observed cost risk for the missions included in the original study, expanding the data set to thirteen missions.
Systems engineering and integration: Cost estimation and benefits analysis
NASA Technical Reports Server (NTRS)
Dean, ED; Fridge, Ernie; Hamaker, Joe
1990-01-01
Space Transportation Avionics hardware and software cost has traditionally been estimated in Phases A and B using cost techniques which predict cost as a function of various cost-predictive variables such as weight, lines of code, functions to be performed, quantities of test hardware, quantities of flight hardware, design and development heritage, complexity, etc. The output of such analyses has been life cycle costs, economic benefits, and related data. The major objectives of cost estimation and benefits analysis are twofold: (1) to play a role in the evaluation of potential new space transportation avionics technologies, and (2) to benefit from emerging technological innovations. Both aspects of cost estimation and technology are discussed here. The role of cost analysis in the evaluation of potential technologies should be one of offering additional quantitative and qualitative information to aid decision-making. The cost analysis process needs to be fully integrated into the design process in such a way that cost trades, optimizations, and sensitivities are understood. Current hardware cost models tend to use primarily weights, functional specifications, quantities, design heritage, and complexity as metrics to predict cost. Software models mostly use functionality, volume of code, heritage, and complexity as cost-descriptive variables. Basic research needs to be initiated to develop metrics more responsive to the trades which are required for future launch vehicle avionics systems. These would include cost estimating capabilities that are sensitive to technological innovations such as improved materials and fabrication processes, computer-aided design and manufacturing, self-checkout, and many others. In addition to basic cost estimating improvements, the process must be sensitive to the fact that no cost estimate can be quoted without also quoting a confidence associated with the estimate. In order to achieve this, better cost risk evaluation techniques are needed, as well as improved usage of risk data by decision-makers. More and better ways to display and communicate cost and cost risk to management are required.
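As a concrete instance of the weight-driven cost estimating relationships described, the sketch below fits cost = a * weight^b by least squares in log-log space and reports a rough one-sigma multiplicative band from the residuals, addressing the point that an estimate should carry a confidence statement. The data points are hypothetical.

    import math

    # Hypothetical (dry mass kg, cost $M) pairs for an avionics-box CER.
    DATA = [(20, 4.1), (35, 6.0), (50, 8.2), (80, 11.5), (120, 15.9)]

    def fit_power_cer(data):
        """Fit cost = a * weight**b via ordinary least squares on logs;
        return a, b, and a one-sigma multiplicative error band."""
        xs = [math.log(w) for w, _ in data]
        ys = [math.log(c) for _, c in data]
        n = len(data)
        mx, my = sum(xs) / n, sum(ys) / n
        b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
        a = math.exp(my - b * mx)
        resid = [y - (math.log(a) + b * x) for x, y in zip(xs, ys)]
        sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5
        return a, b, math.exp(sigma)

    a, b, band = fit_power_cer(DATA)
    est = a * 60 ** b
    print(f"cost(60 kg) ~ {est:.1f} $M, 1-sigma [{est/band:.1f}, {est*band:.1f}]")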
The EPA Control Strategy Tool (CoST) is a software tool for projecting potential future control scenarios and their effects on emissions and estimated costs. The tool uses the NEI and the Control Measures Dataset as key inputs. CoST outputs are projections of future control scenarios.
Mental Models of Software Forecasting
NASA Technical Reports Server (NTRS)
Hihn, J.; Griesel, A.; Bruno, K.; Fouser, T.; Tausworthe, R.
1993-01-01
The majority of software engineers resist the use of the currently available cost models. One problem is that the mathematical and statistical models that are currently available do not correspond with the mental models of the software engineers. In an earlier JPL-funded study (Hihn and Habib-agahi, 1991) it was found that software engineers prefer to use analogical or analogy-like techniques to derive size and cost estimates, whereas current CERs hide any analogy in the regression equations. In addition, the currently available models depend upon information which is not available during early planning, when the most important forecasts must be made.
Highway User Benefit Analysis System Research Project #128
DOT National Transportation Integrated Search
2000-10-01
In this research, a methodology for estimating road user costs of various competing alternatives was developed. Also, software was developed to calculate the road user cost, perform economic analysis and update cost tables. The methodology is based o...
Evaluation and selection of security products for authentication of computer software
NASA Astrophysics Data System (ADS)
Roenigk, Mark W.
2000-04-01
Software piracy is estimated to cost software companies over eleven billion dollars per year in lost revenue worldwide. Over fifty-three percent of all intellectual property in the form of software is pirated on a global basis. Software piracy also has a dramatic effect on employment figures for the information industry. In the US alone, over 130,000 jobs are lost annually as a result of software piracy.
Unmanned Aerial Vehicles unique cost estimating requirements
NASA Astrophysics Data System (ADS)
Malone, P.; Apgar, H.; Stukes, S.; Sterk, S.
Unmanned Aerial Vehicles (UAVs), also referred to as drones, are aerial platforms that fly without a human pilot onboard. UAVs are controlled autonomously by a computer in the vehicle or under the remote control of a pilot stationed at a fixed ground location. There are a wide variety of drone shapes, sizes, configurations, complexities, and characteristics. Use of these devices by the Department of Defense (DoD), NASA, and civil and commercial organizations continues to grow. UAVs are commonly used for intelligence, surveillance, and reconnaissance (ISR). They are also used for combat operations and civil applications, such as firefighting, non-military security work, and surveillance of infrastructure (e.g., pipelines, power lines, and country borders). UAVs are often preferred for missions that require sustained persistence (over 4 hours in duration) or are "too dangerous, dull, or dirty" for manned aircraft. Moreover, they can offer significant acquisition and operations cost savings over traditional manned aircraft. Because of these unique characteristics and missions, UAV estimates require some unique estimating methods. This paper describes a framework for estimating UAV systems' total ownership cost, including hardware components, software design, and operations. The challenge of collecting data, testing the sensitivities of cost drivers, and creating cost estimating relationships (CERs) for each key work breakdown structure (WBS) element is discussed. The autonomous operation of UAVs is especially challenging from a software perspective.
Software Technology for Adaptable, Reliable Systems (STARS)
1994-03-25
Timeline(3), SECOMO(3), SEER(3), GSFC Software Engineering Lab Model(1), SLIM(4), SEER-SEM(1), SPQR(2), PRICE-S(2), internally-developed models(3), APMSS(1...Timeline - 3; SASET (Software Architecture Sizing Estimating Tool) - 2; MicroMan II - 2; LCM (Logistics Cost Model) - 2; SPQR - 2; PRICE-S - 2
Managers Handbook for Software Development
NASA Technical Reports Server (NTRS)
Agresti, W.; Mcgarry, F.; Card, D.; Page, J.; Church, V.; Werking, R.
1984-01-01
Methods and aids for the management of software development projects are presented. The recommendations are based on analyses of and experiences with flight dynamics software development. The management aspects of organizing the project, producing a development plan, estimating costs, scheduling, staffing, preparing deliverable documents, using management tools, monitoring the project, conducting reviews, auditing, testing, and certifying are described.
Rule-Based Flight Software Cost Estimation
NASA Technical Reports Server (NTRS)
Stukes, Sherry A.; Spagnuolo, John N. Jr.
2015-01-01
This paper discusses the fundamental process for the computation of Flight Software (FSW) cost estimates. This process has been incorporated in a rule-based expert system [1] that can be used for Independent Cost Estimates (ICEs), Proposals, and for the validation of Cost Analysis Data Requirements (CADRe) submissions. A high-level directed graph (referred to here as a decision graph) illustrates the steps taken in the production of these estimated costs and serves as a basis of design for the expert system described in this paper. Detailed discussions are subsequently given elaborating upon the methodology, tools, charts, and caveats related to the various nodes of the graph. We present general principles for the estimation of FSW using SEER-SEM as an illustration of these principles when appropriate. Since Source Lines of Code (SLOC) is a major cost driver, a discussion of various SLOC data sources for the preparation of the estimates is given together with an explanation of how contractor SLOC estimates compare with the SLOC estimates used by JPL. Obtaining consistency in code counting will be presented as well as factors used in reconciling SLOC estimates from different code counters. When sufficient data is obtained, a mapping into the JPL Work Breakdown Structure (WBS) from the SEER-SEM output is illustrated. For across the board FSW estimates, as was done for the NASA Discovery Mission proposal estimates performed at JPL, a comparative high-level summary sheet for all missions with the SLOC, data description, brief mission description and the most relevant SEER-SEM parameter values is given to illustrate an encapsulation of the used and calculated data involved in the estimates. The rule-based expert system described provides the user with inputs useful or sufficient to run generic cost estimation programs. This system's incarnation is achieved via the C Language Integrated Production System (CLIPS) and will be addressed at the end of this paper.
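Since consistent code counting is called out above as a prerequisite for reconciling SLOC estimates, here is a minimal sketch of one counting convention (non-blank, non-comment physical lines). Real counters differ precisely in such conventions, which is why reconciliation factors between counters are needed; this is an illustration, not JPL's counting tool.

    import sys

    # One convention: count non-blank lines that do not start with a comment.
    # Another counter might count logical statements or include headers.
    def count_sloc(path, comment_prefix="#"):
        sloc = 0
        with open(path, encoding="utf-8") as f:
            for line in f:
                stripped = line.strip()
                if stripped and not stripped.startswith(comment_prefix):
                    sloc += 1
        return sloc

    if __name__ == "__main__":
        print(count_sloc(sys.argv[1]))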
Testing Software Development Project Productivity Model
NASA Astrophysics Data System (ADS)
Lipkin, Ilya
Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available to guide an organization's software development estimates, and existing estimation models often underestimate software development effort by as much as 500 to 600 percent. Attempts to address this issue usually calibrate existing models with local data of small sample size, and the resulting estimates do not offer improved cost analysis. This study presents a conceptual model for accurately estimating software development, built on an extensive literature review and a theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. The practical implications of this study allow practitioners to concentrate on the specific constructs of interest that provide the best value for the least amount of time. This study outlines the key contributing constructs that are unique for Software Size (E-SLOC), Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for simplified project analysis for source selection evaluation and audit task reviews for customers and suppliers. The theoretical contribution of this study is an initial theory-based, hypothesized project productivity model that can be used as a generic overall model across several application domains, such as IT, command and control, and simulation. This research validates findings from previous work concerning software project productivity and leverages those results in this study. The hypothesized project productivity model provides statistical support and validation of the expert opinions used by practitioners in the field of software project estimation.
Manager's handbook for software development, revision 1
NASA Technical Reports Server (NTRS)
1990-01-01
Methods and aids for the management of software development projects are presented. The recommendations are based on analyses and experiences of the Software Engineering Laboratory (SEL) with flight dynamics software development. The management aspects of the following subjects are described: organizing the project, producing a development plan, estimating costs, scheduling, staffing, preparing deliverable documents, using management tools, monitoring the project, conducting reviews, auditing, testing, and certifying.
A Descriptive Evaluation of Automated Software Cost-Estimation Models,
1986-10-01
…(Version 1.03D); PCOC (Version 7.01); PRICE S; SLIM (Version 1.1); SoftCost (Version 5.1); SPQR/20 (Version 1.1); WICOMO (Version 1.3). These… produce detailed GANTT and PERT charts. SPQR/20 is based on a cost model developed at ITT. In addition to cost, schedule, and staffing estimates, it… cases and test runs required, and the effectiveness of pre-test and test activities. SPQR/20 also predicts enhancement and maintenance activities.
Software forecasting as it is really done: A study of JPL software engineers
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.
1993-01-01
This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and the parameters used by internally recognized software cost estimating experts. Protocol analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is very suggestive of a forecasting life cycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or making an additional effort forecast. Virtually no subject used risk reduction steps in combination with these methods. The results of the analysis include the identification of a core set of well-defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
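Reliability estimation of the kind the handbook surveys is typically done with reliability-growth models. A minimal sketch follows, assuming the classic Goel-Okumoto NHPP mean value function m(t) = a(1 - e^(-bt)) and hypothetical failure data; the handbook does not say which specific models JSC's two tools implement:

```python
import numpy as np
from scipy.optimize import curve_fit

# Cumulative failures observed at cumulative test times t (hypothetical data).
t = np.array([10.0, 20.0, 40.0, 60.0, 90.0, 120.0])   # test hours
cum_failures = np.array([8.0, 14.0, 22.0, 27.0, 31.0, 33.0])

def m(t, a, b):
    # Goel-Okumoto mean value function: expected cumulative failures by time t.
    return a * (1.0 - np.exp(-b * t))

(a, b), _ = curve_fit(m, t, cum_failures, p0=[40.0, 0.01])
residual = a - cum_failures[-1]          # expected failures still latent
rate_now = a * b * np.exp(-b * t[-1])    # current failure intensity m'(t)
print(f"total defects ~{a:.1f}, remaining ~{residual:.1f}, rate {rate_now:.3f}/hr")
```

The fitted residual-defect count and current failure intensity are exactly the quantities a reliability/cost/schedule tradeoff would consume.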
Estimate of the Potential Costs and Effectiveness of Scaling Up CRESST Assessment Software.
ERIC Educational Resources Information Center
Chung, Gregory K. W. K.; Herl, Howard E.; Klein, Davina C. D.; O'Neil, Harold F., Jr.; Schacter, John
This report examines issues in the scale-up of assessment software from the Center for Research on Evaluation, Standards, and Student Testing (CRESST). "Scale-up" is used in a metaphorical sense, meaning adding new assessment tools to CRESST's assessment software. During the past several years, CRESST has been developing and evaluating a…
Harvesting costs for management planning for ponderosa pine plantations.
Roger D. Fight; Alex Gicqueau; Bruce R. Hartsough
1999-01-01
The PPHARVST computer application is Windows-based, public-domain software used to estimate harvesting costs for management planning for ponderosa pine (Pinus ponderosa Dougl. ex Laws.) plantations. The equipment production rates were developed from existing studies. Equipment cost rates were based on 1996 prices for new...
Software engineering project management - A state-of-the-art report
NASA Technical Reports Server (NTRS)
Thayer, R. H.; Lehman, J. H.
1977-01-01
The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.
System-of-Systems Technology-Portfolio-Analysis Tool
NASA Technical Reports Server (NTRS)
O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne
2012-01-01
Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.
76 FR 21393 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-15
... startup cost components or annual operation, maintenance, and purchase of service components. You should describe the methods you use to estimate major cost factors, including system and technology acquisition.... Capital and startup costs include, among other items, computers and software you purchase to prepare for...
76 FR 25367 - Agency Information Collection Activities: Proposed Collection, Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-04
... startup cost components or annual operation, maintenance, and purchase of service components. You should describe the methods you use to estimate major cost factors, including system and technology acquisition.... Capital and startup costs include, among other items, computers and software you purchase to prepare for...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-28
... disclose this information, you should comment and provide your total capital and startup cost components or... use to estimate major cost factors, including system and technology acquisition, expected useful life... startup costs include, among other items, computers and software you purchase to prepare for collecting...
Development and Application of New Quality Model for Software Projects
Karnavel, K.; Dillibabu, R.
2014-01-01
The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594
Cost Estimation Techniques for C3I System Software.
1984-07-01
…development manmonths have been determined for maxi, midi, and mini type computers. Small to median size timeshared developments used 0.2 to 1.5 hours… development schedule 1.23 1.00 1.10. 2.1.3 Detailed Model: the final codification of the COCOMO regressions was the development of separate effort… regardless of the software structure level being estimated: DEVC, the expected development computer (maxi, midi, mini, micro); MODE, the expected…
Software Cost Measuring and Reporting. One of the Software Acquisition Engineering Guidebook Series.
1979-01-02
…through the peripherals. However, his interaction is usually minimal since, by definition, the automatic test… performs its intended functions properly. Software estimating is still heavily dependent on experienced judgement. However, quantitative methods… can be distributed to specialists who are most familiar with the work. …apply to systems of totally different content. The Quantitative guideline may…
Software Engineering Education Directory
1988-01-01
Dana Hausman and Suzanne Woolf were crucial to the successful completion of this edition of the directory. Their teamwork, energy, and dedication… for this directory began in the summer of 1986 with a questionnaire mailed to schools selected from Peterson's Graduate Programs in Engineering and… Christopher, and Siegel, Stan; Software Cost Estimation and Life-Cycle Control by Putnam, Lawrence H.; Software Quality Assurance: A Practical Approach by…
2003-03-01
…within the Automated Cost Estimating Integrated Tools (ACEIT) software suite (version 5.x). With this capability, one can set cost targets or time… not allow the user to vary more than one decision variable. This limitation of the ACEIT approach thus hinders a holistic view when attempting to…
Investigating Team Cohesion in COCOMO II.2000
ERIC Educational Resources Information Center
Snowdeal-Carden, Betty A.
2013-01-01
Software engineering is team oriented and intensely complex, relying on human collaboration and creativity more than any other engineering discipline. Poor software estimation is a problem that costs the United States over a billion dollars per year. Effective measurement of team cohesion is foundationally important to gain accurate…
Software for Estimating Costs of Testing Rocket Engines
NASA Technical Reports Server (NTRS)
Hines, Merlon M.
2004-01-01
A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
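A minimal sketch of the polynomial relationship the abstract describes, using hypothetical thrust/labor-hour pairs and a cubic fit standing in for the CEM's third- and fourth-order polynomials:

```python
import numpy as np

# Hypothetical (thrust, labor-hours) history for one test category.
thrust = np.array([5e3, 1e4, 5e4, 1e5, 3e5, 6e5])       # lbf
hours  = np.array([300, 420, 900, 1400, 2600, 4100.0])  # labor-hours

# Fit labor time as a third-order polynomial in thrust level.
coeffs = np.polyfit(thrust, hours, 3)
predict = np.poly1d(coeffs)
print(f"estimated labor for a 2e5 lbf test: {predict(2e5):.0f} hours")
```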
An evolutionary morphological approach for software development cost estimation.
Araújo, Ricardo de A; Oliveira, Adriano L I; Soares, Sergio; Meira, Silvio
2012-08-01
In this work we present an evolutionary morphological approach to solve the software development cost estimation (SDCE) problem. The proposed approach consists of a hybrid artificial neuron based on the framework of mathematical morphology (MM), with algebraic foundations in complete lattice theory (CLT), referred to as the dilation-erosion perceptron (DEP). We also present an evolutionary learning process, called DEP(MGA), that uses a modified genetic algorithm (MGA) to design the DEP model; this avoids a drawback of the classical DEP learning process, in which gradient estimation is problematic because morphological operators are not differentiable in the usual way. Furthermore, an experimental analysis is conducted with the proposed model using five complex SDCE problems and three well-known performance metrics, demonstrating the good performance of the DEP model in solving SDCE problems. Copyright © 2012 Elsevier Ltd. All rights reserved.
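The DEP combines a dilation (max-plus) and an erosion (min-plus) of its inputs; a minimal sketch of that combination follows. The parameter values are illustrative only, and in the paper they are found by the MGA search rather than set by hand:

```python
import numpy as np

def dep(x, a, b, lam):
    """Dilation-erosion perceptron output for one input vector x: a convex
    combination of a dilation (max-plus) and an erosion (min-plus) term."""
    dilation = np.max(x + a)   # max_j (x_j + a_j)
    erosion  = np.min(x + b)   # min_j (x_j + b_j)
    return lam * dilation + (1.0 - lam) * erosion

# Illustrative normalized cost drivers and hand-picked parameters.
x = np.array([0.2, 0.7, 0.4])
a = np.array([0.1, -0.3, 0.0])
b = np.array([0.05, 0.2, -0.1])
print(dep(x, a, b, lam=0.6))
```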
Use of multispectral data in design of forest sample surveys
NASA Technical Reports Server (NTRS)
Titus, S. J.; Wensel, L. C.
1977-01-01
The use of multispectral data in design of forest sample surveys using a computer software package, WILLIAM, is described. The system allows evaluation of a number of alternative sampling systems and, with appropriate cost data, estimates the implementation cost for each.
Predictive Software Cost Model Study. Volume II. Software Package Detailed Data.
1980-06-01
…will not be limited to: a. ASN-91 NWDS Computer; b. Armament System Control Unit (ASCU); c. AN/ASN-90 IMS. 6. CONFIGURATION CONTROL. OFP/OTP… planned approach. 3. Detailed analysis and study; impacts on hardware, manuals, data, AGE, etc.; alternatives with pros and cons; cost estimates; ECP…
NASA Technical Reports Server (NTRS)
Cole, Stuart K.; Wallace, Jon; Schaffer, Mark; May, M. Scott; Greenberg, Marc W.
2014-01-01
As a leader in space technology research and development, NASA is continuing the development of the Technology Estimating process, initiated in 2012, for estimating the cost and schedule of low-maturity technology research and development, where the Technology Readiness Level (TRL) is less than TRL 6. NASA's Technology Roadmap consists of 14 technology areas. The focus of this continuing Technology Estimating effort included four Technology Areas (TAs): TA3 Space Power and Energy Storage, TA4 Robotics, TA8 Instruments, and TA12 Materials, to confine the research to the most abundant data pool. This research report continues the technology estimating efforts completed during 2013-2014 and addresses the refinement of the parameters selected and recommended for use in the estimating process; the parameters developed are applicable to the Cost Estimating Relationships (CERs) used in parametric cost analysis. The report also addresses the architecture for administering the Technology Cost and Schedule Estimating tool, the parameters suggested for computer software adjunct to any technology area, and the identification of gaps in the Technology Estimating process.
CubeSat mission design software tool for risk estimating relationships
NASA Astrophysics Data System (ADS)
Gamble, Katharine Brumbaugh; Lightsey, E. Glenn
2014-09-01
In an effort to make the CubeSat risk estimation and management process more scientific, a software tool has been created that enables mission designers to estimate mission risks. CubeSat mission designers are able to input mission characteristics, such as form factor, mass, development cycle, and launch information, in order to determine the mission risk root causes which historically present the highest risk for their mission. Historical data was collected from the CubeSat community and analyzed to provide a statistical background to characterize these Risk Estimating Relationships (RERs). This paper develops and validates the mathematical model based on the same cost estimating relationship methodology used by the Unmanned Spacecraft Cost Model (USCM) and the Small Satellite Cost Model (SSCM). The RER development uses general error regression models to determine the best fit relationship between root cause consequence and likelihood values and the input factors of interest. These root causes are combined into seven overall CubeSat mission risks which are then graphed on the industry-standard 5×5 Likelihood-Consequence (L-C) chart to help mission designers quickly identify areas of concern within their mission. This paper is the first to document not only the creation of a historical database of CubeSat mission risks, but, more importantly, the scientific representation of Risk Estimating Relationships.
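A minimal sketch of mapping risk scores onto the 5x5 L-C chart the paper mentions; the risk names, scores, and red/yellow/green thresholds below are illustrative, not the paper's RER outputs:

```python
# Hypothetical likelihood/consequence scores (each 1..5) for mission risks.
risks = {
    "late delivery": (4, 5),
    "radio failure": (2, 5),
    "power budget":  (1, 3),
}

def cell(likelihood: int, consequence: int) -> str:
    """Classify a 5x5 L-C cell with a simple product-threshold rule
    (thresholds are illustrative, not from the paper)."""
    score = likelihood * consequence
    return "red" if score >= 15 else "yellow" if score >= 6 else "green"

for name, (l, c) in risks.items():
    print(f"{name:15s} L={l} C={c} -> {cell(l, c)}")
```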
The Application of Function Points to Predict Source Lines of Code for Software Development
1992-09-01
…there are some disadvantages. Software estimating tools are expensive. A single tool may cost more than $15,000 due to the high market value of the… term and Lang variables simultaneously only added marginal improvements over models with these terms included singularly. Using all the available…
Space transfer vehicle concepts and requirements study. Volume 3, book 1: Program cost estimates
NASA Technical Reports Server (NTRS)
Peffley, Al F.
1991-01-01
The Space Transfer Vehicle (STV) Concepts and Requirements Study cost estimate and program planning analysis is presented. The cost estimating technique used to support STV system, subsystem, and component cost analysis is a mixture of parametric cost estimating and selective cost analogy approaches. The parametric cost analysis is aimed at developing cost-effective aerobrake, crew module, tank module, and lander designs using parametric cost estimate data. This is accomplished by using cost as a design parameter in an iterative process with conceptual design input information. The parametric estimating approach segregates costs by major program life cycle phase (development, production, integration, and launch support). These phases are further broken out into major hardware subsystems, software functions, and tasks according to the STV preliminary program work breakdown structure (WBS). The WBS is defined by the study team to a low enough level of detail to highlight STV system cost drivers. This level of cost visibility provided the basis for cost sensitivity analysis against various design approaches aimed at achieving a cost-effective design. The cost approach, methodology, and rationale are described. A chronological record of the interim review material relating to cost analysis is included, along with a brief summary of the study contract tasks accomplished during that period of review and the key conclusions or observations identified that relate to STV program cost estimates. The STV life cycle costs are estimated with the proprietary parametric cost model (PCM), with inputs organized by a project WBS. Preliminary life cycle schedules are also included.
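A minimal sketch of the phase-by-WBS cost segregation described, with hypothetical leaf estimates rolled up either by WBS element (for cost-driver visibility) or by life-cycle phase:

```python
# Hypothetical leaf estimates ($M) keyed by (WBS element, life-cycle phase).
estimates = {
    ("aerobrake",   "development"): 310.0,
    ("aerobrake",   "production"):   95.0,
    ("crew module", "development"): 480.0,
    ("crew module", "production"):  150.0,
    ("lander",      "development"): 260.0,
}

def rollup(by: int) -> dict:
    """Sum leaf costs by WBS element (by=0) or by phase (by=1)."""
    totals: dict = {}
    for key, cost in estimates.items():
        totals[key[by]] = totals.get(key[by], 0.0) + cost
    return totals

print(rollup(0))  # per-element totals highlight the cost drivers
print(rollup(1))  # per-phase totals give the life-cycle view
```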
Requirements: Towards an understanding on why software projects fail
NASA Astrophysics Data System (ADS)
Hussain, Azham; Mkpojiogu, Emmanuel O. C.
2016-08-01
Requirements engineering is at the foundation of every successful software project. There are many reasons for software project failures; however, a poorly engineered requirements process contributes immensely to why projects fail. Software project failure is usually costly and risky and could even be life threatening. Projects that undermine requirements engineering suffer, or are likely to suffer, from failures, challenges, and other attendant risks. The estimated cost of project failures and overruns is enormous. Furthermore, software project failures or overruns pose a challenge in today's competitive market environment; they affect the company's image, goodwill, and revenue drive and decrease the perceived satisfaction of customers and clients. In this paper, requirements engineering is discussed and its role in software project success is elaborated. The place of the software requirements process in relation to software project failure is explored and examined. Project success and failure factors are also discussed, with emphasis placed on requirements factors, as they play a major role in software projects' challenges, successes, and failures. The paper relies on secondary data and empirical statistics to explore and examine the factors responsible for the successes, challenges, and failures of software projects in large, medium, and small software companies.
2012-03-13
…Legacy Maintenance and Brownfield Development; 6.6.6 Agile and Kanban Development; 6.6.7 Putting It All Together at the Large-Project or Enterprise Level… (NDI)-intensive systems; ultrahigh software system assurance; legacy maintenance and brownfield development; and agile and kanban development. This chapter summarizes each… may be furnished by NDI components or may need to be developed for special systems. Legacy maintenance and brownfield development: fewer and fewer software…
Calibration of a COTS Integration Cost Model Using Local Project Data
NASA Technical Reports Server (NTRS)
Boland, Dillard; Coon, Richard; Byers, Kathryn; Levitt, David
1997-01-01
The software measures and estimation techniques appropriate to a Commercial Off the Shelf (COTS) integration project differ from those commonly used for custom software development. Labor and schedule estimation tools that model COTS integration are available. Like all estimation tools, they must be calibrated with the organization's local project data. This paper describes the calibration of a commercial model using data collected by the Flight Dynamics Division (FDD) of the NASA Goddard Spaceflight Center (GSFC). The model calibrated is SLIM Release 4.0 from Quantitative Software Management (QSM). By adopting the SLIM reuse model and by treating configuration parameters as lines of code, we were able to establish a consistent calibration for COTS integration projects. The paper summarizes the metrics, the calibration process and results, and the validation of the calibration.
NASA Technical Reports Server (NTRS)
Rosenberg, Linda H.; Arthur, James D.; Stapko, Ruth K.; Davani, Darush
1999-01-01
The Software Assurance Technology Center (SATC) at NASA Goddard Space Flight Center has been investigating how projects can determine when sufficient testing has been completed. For most projects, schedules are underestimated, and the last phase of software development, testing, must be decreased. Two questions are frequently asked: "To what extent is the software error-free?" and "How much time and effort is required to detect and remove the remaining errors?" Clearly, neither question can be answered with absolute certainty. Nonetheless, the ability to answer these questions with some acceptable level of confidence is highly desirable. First, knowing the extent to which a product is error-free, we can judge when it is time to terminate testing. Second, if errors are judged to be present, we can perform a cost/benefit trade-off analysis to estimate when the software will be ready for use and at what cost. This paper explains the efforts of the SATC to help projects determine what constitutes sufficient testing and when is the most cost-effective time to stop testing.
1979-12-01
…because of the use of complex computational algorithms (Ref 25). Another important factor affecting the cost of software is the size of the development… involved the alignment and navigational algorithm portions of the software. The second avionics system application was the development of an inertial…
Public safety answering point readiness for wireless E-911 in New York State.
Bailey, Bob W; Scott, Jay M; Brown, Lawrence H
2003-01-01
To determine the level of wireless enhanced 911 readiness among New York's primary public safety answering points. This descriptive study utilized a simple, single-page survey that was distributed in August 2001, with telephone follow-up concluding in January 2002. Surveys were distributed to directors of the primary public safety answering points in each of New York's 62 counties. Information was requested regarding current readiness for providing wireless enhanced 911 service, hardware and software needs for implementing the service, and the estimated costs for obtaining the necessary hardware and software. Two directors did not respond and could not be contacted by telephone; three declined participation; one did not operate an answering point; and seven provided incomplete responses, resulting in usable data from 49 (79%) of the state's public safety answering points. Only 27% of the responding public safety answering points were currently wireless enhanced 911 ready. Specific needs included obtaining or upgrading computer systems (16%), computer-aided dispatch systems (53%), mapping software (71%), telephone systems (27%), and local exchange carrier trunk lines (42%). The total estimated hardware and software cost for achieving wireless enhanced 911 readiness was between $16 million and $20 million. New York's primary public safety answering points are not currently ready to provide wireless enhanced 911 service, and the cost of achieving readiness could be as high as $20 million.
NASA Technical Reports Server (NTRS)
Zapata, Edgar
2012-01-01
This paper presents past and current work in dealing with indirect industry and NASA costs when providing cost estimation or analysis for NASA projects and programs. Indirect costs, defined as those project costs removed from the actual hands-on hardware or software labor, make up most of the cost of today's complex, large-scale NASA and industry space projects. This appears to be the case across phases, from research into development, into production, and into the operation of the system. Space transportation is the case of interest here. Modeling and cost estimation as a process rather than a product is emphasized, as is analysis as a series of belief systems in play among decision makers and decision factors, to provide context.
Systems Engineering and Integration (SE and I)
NASA Technical Reports Server (NTRS)
Chevers, ED; Haley, Sam
1990-01-01
The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements are interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; an enhanced automated code generation system tightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project databases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low cost avionics; cost estimation and benefits; computer aided software engineering; computer systems and software safety; system testability; advanced avionics laboratories; and rapid prototyping. This presentation is represented by viewgraphs only.
Top Level Space Cost Methodology (TLSCM)
1997-12-02
…Software; 6. ACEIT; C. Ground Rules and Assumptions; D. Typical Life Cycle Cost Distribution; E. Methodologies; 1. Cost/Budget Threshold; 2. Analogy… which is based on real-time Air Force and space programs (Ref. 25:2-8, 2-9). 6. ACEIT: Automated Cost Estimating Integrated Tools (ACEIT), Tecolote Research, Inc. There is a way to use the ACEIT cost program to get a printout of an expanded WBS; therefore, find someone who has ACEIT experience and…
1997-09-01
…factor values are identified. For SASET, revised cost estimating relationships are provided (Apgar et al., 1991). A 1991 AFIT thesis by Gerald Ourada… description of the model is a paragraph directly quoted from the user's manual. This is not to imply that a lack of a thorough analysis indicates… constraints imposed by the system. The effective technology rating is computed from the basic technology rating by the following equation (Apgar et al., 1991)…
Cost estimation and analysis using the Sherpa Automated Mine Cost Engineering System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stebbins, P.E.
1993-09-01
The Sherpa Automated Mine Cost Engineering System is a menu-driven software package designed to estimate capital and operating costs for proposed surface mining operations. The program is engineering based (as opposed to statistical), meaning that all equipment, manpower, and supply requirements are determined from deposit geology, project design, and mine production information using standard engineering techniques. These requirements are used in conjunction with equipment, supply, and labor cost databases internal to the program to estimate all associated costs. Because virtually all on-site cost parameters are interrelated within the program, Sherpa provides an efficient means of examining the impact of changes in the equipment mix on total capital and operating costs. If any aspect of the operation is changed, Sherpa immediately adjusts all related aspects as necessary. For instance, if the user wishes to examine the cost ramifications of selecting larger trucks, the program not only considers truck purchase and operation costs, it also automatically and immediately adjusts excavator requirements, operator and mechanic needs, repair facility size, haul road construction and maintenance costs, and ancillary equipment specifications.
Microcomputer software to facilitate costing in pathology laboratories.
Stilwell, J A; Woodford, F P
1987-01-01
A software program is described which enables laboratory managers to calculate, for their laboratory over a 12 month period, the cost of each test or investigation and the components of that cost. These comprise the costs of direct labour, consumables, and equipment maintenance and depreciation; allocated costs of intermediate operations--for example, specimen procurement, reception, and data processing; and apportioned indirect costs such as senior staff time, as well as external overheads such as telephone charges, rent, and rates. Total annual expenditure on each type of test is also calculated. The principles on which the program is based are discussed. Considered in particular are the problems of apportioning indirect costs (which are considerable in clinical laboratory work) over different test costs, and the merits of different ways of estimating the amount or fraction of staff members' time spent on each kind of test. The computer program is Crown copyright but is available under licence from one of us (JAS). PMID:3654982
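A minimal sketch of the costing logic described: direct costs per test plus indirect costs apportioned by an assumed staff-time share. All figures and test names below are hypothetical:

```python
# Per-test inputs: direct labour, consumables, annual volume, staff-time share.
tests = {
    "glucose": (1.20, 0.40, 52000, 0.20),
    "culture": (4.80, 2.10, 18000, 0.35),
}
indirect_pool = 120000.0  # senior staff time, rent, rates, phones (per year)

for name, (labour, consumables, volume, share) in tests.items():
    apportioned = indirect_pool * share / volume   # indirect cost per test
    per_test = labour + consumables + apportioned
    print(f"{name}: {per_test:.2f} per test, {per_test * volume:,.0f} per year")
```

Estimating the staff-time shares is the hard part the paper dwells on; everything downstream is arithmetic.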
Designing Control System Application Software for Change
NASA Technical Reports Server (NTRS)
Boulanger, Richard
2001-01-01
The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner that transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change, and the effort of that change, can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can be made before implementation, improving schedule and budget accuracy.
A Software Development Simulation Model of a Spiral Process
NASA Technical Reports Server (NTRS)
Mizell, Carolyn; Malone, Linda
2007-01-01
There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.
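A minimal sketch of the simulation idea, assuming illustrative activity durations: repeated stochastic runs of a spiral's design-code-test cycles yield schedule distributions rather than point estimates. The paper's actual model is a discrete event simulation with far more detail than this:

```python
import random

random.seed(1)

def simulate_spiral(n_cycles: int = 4) -> float:
    """One stochastic run: total schedule (weeks) for n spiral cycles,
    each with design, code, and test activities plus an evaluation gate."""
    clock = 0.0
    for _ in range(n_cycles):
        for mean in (6.0, 8.0, 5.0):   # design, code, test mean durations
            clock += max(random.gauss(mean, 0.2 * mean), 0.0)
        clock += 2.0                    # customer evaluation between spirals
    return clock

runs = sorted(simulate_spiral() for _ in range(1000))
print(f"median {runs[500]:.1f} wks, 80th percentile {runs[800]:.1f} wks")
```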
A comparison of time-shared vs. batch development of space software
NASA Technical Reports Server (NTRS)
Forthofer, M.
1977-01-01
In connection with a study regarding the ground support software development for the Space Shuttle, an investigation was conducted concerning the most suitable software development techniques to be employed. A time-sharing 'trial period' was used to determine whether or not time-sharing would be a cost-effective software development technique for the Ground Based Shuttle system. It was found that time-sharing substantially improved job turnaround and programmer access to the computer for the representative group of ground support programmers. Moreover, this improvement resulted in an estimated saving of over fifty programmer days during the trial period.
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
The 2009 DOD Cost Research Workshop: Acquisition Reform
2010-02-01
…DASA-CE-2: ACEIT Enhancement, Help-Desk/Training, Consulting; DASA-CE-3: Command, Control, Communications, Computers, Intelligence, Surveillance, and… Management Information System (OSMIS) online interactive relational database. DASA-CE-2 Title: ACEIT Enhancement, Help-Desk/Training, Consulting. Summary: …support and training for the Automated Cost Estimator Integrated Tools (ACEIT) software suite. ACEIT is the Army standard suite of analytical tools for…
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-08-01
An estimated 85% of the installed base of software is a custom application with a production quantity of one, and in practice almost 100% of military software systems are custom software. Paradoxically, the marginal cost of producing additional units is near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to the market pressures that have motivated a multilevel supply chain structure in other widget industries: design cost recovery, improved quality through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors' inevitable adoption of emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed object operating system, the lack of a standard Computer-Aided Software Environment (CASE) tool notation, and the lack of a standard CASE tool repository have limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java, and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations, as well as assemble new tools on demand, from existing tools and architecture design repositories.
NASA Technical Reports Server (NTRS)
Mizell, Carolyn Barrett; Malone, Linda
2007-01-01
The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
Sears, Jeanne M; Blanar, Laura; Bowman, Stephen M
2014-01-01
Acute work-related trauma is a leading cause of death and disability among U.S. workers. Occupational health services researchers have described the pressing need to identify valid injury severity measures for purposes such as case-mix adjustment and the construction of appropriate comparison groups in programme evaluation, intervention, quality improvement, and outcome studies. The objective of this study was to compare the performance of several injury severity scores and scoring methods in the context of predicting work-related disability and medical cost outcomes. Washington State Trauma Registry (WTR) records for injuries treated from 1998 to 2008 were linked with workers' compensation claims. Several Abbreviated Injury Scale (AIS)-based injury severity measures (ISS, New ISS, maximum AIS) were estimated directly from ICD-9-CM codes using two software packages: (1) ICDMAP-90, and (2) Stata's user-written ICDPIC programme (ICDPIC). ICDMAP-90 and ICDPIC scores were compared with existing WTR scores using the Akaike Information Criterion, amount of variance explained, and estimated effects on outcomes. Competing risks survival analysis was used to evaluate work disability outcomes. Adjusted total medical costs were modelled using linear regression. The linked sample contained 6052 work-related injury events. There was substantial agreement between WTR scores and those estimated by ICDMAP-90 (kappa=0.73), and between WTR scores and those estimated by ICDPIC (kappa=0.68). Work disability and medical costs increased monotonically with injury severity, and injury severity was a significant predictor of work disability and medical cost outcomes in all models. WTR and ICDMAP-90 scores performed better with regard to predicting outcomes than did ICDPIC scores, but effect estimates were similar. Of the three severity measures, maxAIS was usually weakest, except when predicting total permanent disability. Injury severity was significantly associated with work disability and medical cost outcomes for work-related injuries. Injury severity can be estimated using either ICDMAP-90 or ICDPIC when ICD-9-CM codes are available. We observed little practical difference between severity measures or scoring methods. This study demonstrated that using existing software to estimate injury severity may be useful to enhance occupational injury surveillance and research. Copyright © 2013 Elsevier Ltd. All rights reserved.
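The AIS-based scores compared here have standard definitions: ISS sums the squares of the highest AIS in the three most severely injured body regions, while NISS sums the squares of the three most severe injuries regardless of region. A minimal sketch (the body-region names and AIS values in the example are illustrative):

```python
def severity_scores(injuries):
    """Compute ISS, NISS, and maxAIS from (body_region, AIS) pairs."""
    worst_by_region = {}
    for region, ais in injuries:
        worst_by_region[region] = max(worst_by_region.get(region, 0), ais)
    top3_regions = sorted(worst_by_region.values(), reverse=True)[:3]
    iss = sum(a * a for a in top3_regions)          # ISS: 3 worst regions
    top3_any = sorted((a for _, a in injuries), reverse=True)[:3]
    niss = sum(a * a for a in top3_any)             # NISS: 3 worst injuries
    max_ais = max(a for _, a in injuries)           # maxAIS
    return iss, niss, max_ais

print(severity_scores([("head", 3), ("chest", 4), ("chest", 3), ("abdomen", 2)]))
# -> (29, 34, 4): NISS exceeds ISS when one region holds multiple severe injuries
```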
NASA Astrophysics Data System (ADS)
Zbiciak, R.; Grabowik, C.; Janik, W.
2015-11-01
The design-constructional process is a creative activity that strives to fulfil, as well as possible at a given moment in time, all demands and needs formulated by a user, taking into account social, technical, and technological advances. An engineer's knowledge, skills, and innate abilities have the greatest influence on the final product quality and cost, and a deciding influence on the product's technical and economic value. It therefore seems advisable to create software tools that support an engineer in the process of manufacturing cost estimation. The Cost module is built from analytical procedures used for relative manufacturing cost estimation. As with the Generator module, the Cost module was written in the object-oriented programming language C# in the Visual Studio environment. During the research, the following eight factors with the greatest influence on overall manufacturing cost were distinguished and defined: (i) gear wheel teeth type, i.e. straight or helicoidal; (ii) gear wheel design shape, A or B, with or without wheel hub; (iii) gear tooth module; (iv) teeth number; (v) gear rim width; (vi) gear wheel material; (vii) heat treatment or thermochemical treatment; (viii) accuracy class. Knowledge of parameters (i) to (v) is indispensable for proper modelling of 3D gear wheel models in a CAD system environment; these parameters are also processed in the Cost module. The last three parameters, (vi) to (viii), are used exclusively in the Cost module. The estimation of relative manufacturing cost is based on indexes calculated for each particular parameter. Estimated in this way, the relative manufacturing cost gives an overview of the design parameters' influence on the final gear wheel manufacturing cost. The relative manufacturing cost takes values in the range 0.00 to 1.00; the larger the index value, the higher the relative manufacturing cost. The algorithm for relative manufacturing cost estimation was verified by comparing its results with those obtained from industry; in most cases the two groups of results are similar. It is therefore possible to conclude that the Cost module might play a significant role in the design-constructional process by aiding an engineer in the selection of alternative gear wheel designs. It should be remembered that real manufacturing costs can differ significantly according to the manufacturing techniques and stock of machine tools available in a factory.
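A minimal sketch of the index idea, assuming hypothetical per-parameter lookup values and weights for four of the eight factors (the paper's actual index values are not given in the abstract):

```python
# Illustrative per-parameter cost indexes in [0, 1] and factor weights.
INDEX = {
    "teeth_type": {"straight": 0.3, "helicoidal": 0.7},
    "material":   {"C45": 0.4, "18CrMo4": 0.8},
    "treatment":  {"none": 0.1, "hardening": 0.5, "carburizing": 0.9},
    "accuracy":   {7: 0.5, 6: 0.8, 5: 1.0},   # tighter class costs more
}
WEIGHTS = {"teeth_type": 0.2, "material": 0.3, "treatment": 0.3, "accuracy": 0.2}

def relative_cost(params: dict) -> float:
    """Weighted mean of per-parameter indexes; result stays in 0.00..1.00."""
    return sum(WEIGHTS[k] * INDEX[k][v] for k, v in params.items())

print(relative_cost({"teeth_type": "helicoidal", "material": "18CrMo4",
                     "treatment": "carburizing", "accuracy": 6}))  # 0.81
```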
Implementation and Simulation Results using Autonomous Aerobraking Development Software
NASA Technical Reports Server (NTRS)
Maddock, Robert W.; DwyerCianciolo, Alicia M.; Bowes, Angela; Prince, Jill L. H.; Powell, Richard W.
2011-01-01
An Autonomous Aerobraking software system is currently under development with support from the NASA Engineering and Safety Center (NESC) that would move typically ground-based operations functions to onboard an aerobraking spacecraft, reducing mission risk and mission cost. The suite of software that will enable autonomous aerobraking is the Autonomous Aerobraking Development Software (AADS) and consists of an ephemeris model, onboard atmosphere estimator, temperature and loads prediction, and a maneuver calculation. The software calculates the maneuver time, magnitude and direction commands to maintain the spacecraft periapsis parameters within design structural load and/or thermal constraints. The AADS is currently tested in simulations at Mars, with plans to also evaluate feasibility and performance at Venus and Titan.
The National Solar Permitting Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gunderson, Renic
"The soft costs of solar — costs not associated with hardware — remain stubbornly high. Among the biggest soft costs are those associated with inefficiencies in local permitting and inspection. A study by the National Renewable Energy Laboratory and Lawrence Berkeley National Laboratory estimates that these costs add an average of $0.22/W per residential installation. This project helps reduce non-hardware/balance of system (BOS) costs by creating and maintaining a free and available site of permitting requirements and solar system verification software that installers can use to reduce time, capital, and resource investments in tracking permitting requirements. Software tools to identifymore » best permitting practices can enable government stakeholders to optimize their permitting process and remove superfluous costs and requirements. Like ""a Wikipedia for solar permitting"", users can add, edit, delete, and update information for a given jurisdiction. We incentivize this crowdsourcing approach by recognizing users for their contributions in the form of SEO benefits to their company or organization by linking back to users' websites."« less
Object-oriented productivity metrics
NASA Technical Reports Server (NTRS)
Connell, John L.; Eller, Nancy
1992-01-01
Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.
NASA Astrophysics Data System (ADS)
Fryling, Meg
2010-11-01
Organisations often make implementation decisions with little consideration for the maintenance phase of an enterprise resource planning (ERP) system, resulting in significant recurring maintenance costs. Poor cost estimations are likely related to the lack of an appropriate framework for enterprise-wide pre-packaged software maintenance, which requires an ongoing relationship with the software vendor (Markus, M.L., Tanis, C., and Fenema, P.C., 2000. Multisite ERP implementation. CACM, 43 (4), 42-46). The end result is that critical project decisions are made with little empirical data, resulting in substantial long-term cost impacts. The product of this research is a formal dynamic simulation model that enables theory testing, scenario exploration and policy analysis. The simulation model ERPMAINT1 was developed by combining and extending existing frameworks in several research domains, and by incorporating quantitative and qualitative case study data. The ERPMAINT1 model evaluates tradeoffs between different ERP project management decisions and their impact on post-implementation total cost of ownership (TCO). Through model simulations a variety of dynamic insights were revealed that could assist ERP project managers. Major findings from the simulation show that upfront investments in mentoring and system exposure translate to long-term cost savings. The findings also indicate that in addition to customisations, add-ons have a significant impact on TCO.
van Beek, J; Haanperä, M; Smit, P W; Mentula, S; Soini, H
2018-04-11
Culture-based assays are currently the reference standard for drug susceptibility testing for Mycobacterium tuberculosis. They provide good sensitivity and specificity but are time consuming. The objective of this study was to evaluate whether whole genome sequencing (WGS), combined with software tools for data analysis, can replace routine culture-based assays for drug susceptibility testing of M. tuberculosis. M. tuberculosis cultures sent to the Finnish mycobacterial reference laboratory in 2014 (n = 211) were phenotypically tested by Mycobacteria Growth Indicator Tube (MGIT) for first-line drug susceptibilities. WGS was performed for all isolates using the Illumina MiSeq system, and data were analysed using five software tools (PhyResSE, Mykrobe Predictor, TB Profiler, TGS-TB and KvarQ). Diagnostic time and reagent costs were estimated for both methods. The sensitivity of the five software tools to predict any resistance among strains was almost identical, ranging from 74% to 80%, and specificity was more than 95% for all software tools except for TGS-TB. The sensitivity and specificity to predict resistance to individual drugs varied considerably among the software tools. Reagent costs for MGIT and WGS were €26 and €143 per isolate respectively. Turnaround time for MGIT was 19 days (range 10-50 days) for first-line drugs, and turnaround time for WGS was estimated to be 5 days (range 3-7 days). WGS could be used as a prescreening assay for drug susceptibility testing with confirmation of resistant strains by MGIT. The functionality and ease of use of the software tools need to be improved. Copyright © 2018 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
Department of Defense Software Factbook
2017-07-07
parameters, these rules of thumb may not provide a lot of value to project managers estimating their software efforts. To get the information useful to them...organization determine the total cost of a particular project, but it is a useful metric to technical managers when they are required to submit an annual...outcome. It is most likely a combination of engineering, management, and funding factors. Although a project may resist planning a schedule slip, this
Rivera-Rodriguez, Claudia L; Resch, Stephen; Haneuse, Sebastien
2018-01-01
In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty.
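As a minimal sketch of the resampling idea described above, a nonparametric bootstrap can attach a standard error to a survey-weighted total-cost estimator; the facility costs and weights are hypothetical, and a real analysis would resample within strata and clusters (Python):

    import random

    # Bootstrap standard error for a survey-weighted total-cost estimator.
    # Facility costs and weights are hypothetical placeholders.
    costs   = [12_000, 8_500, 23_000, 5_400, 17_800, 9_900]  # sampled facilities
    weights = [10, 10, 4, 12, 4, 10]  # inverse inclusion probabilities

    def total(c, w):
        return sum(ci * wi for ci, wi in zip(c, w))

    random.seed(1)
    n, boots = len(costs), []
    for _ in range(2000):
        idx = [random.randrange(n) for _ in range(n)]
        boots.append(total([costs[i] for i in idx], [weights[i] for i in idx]))

    est = total(costs, weights)
    mean_b = sum(boots) / len(boots)
    se = (sum((b - mean_b) ** 2 for b in boots) / (len(boots) - 1)) ** 0.5
    print(f"total cost ~ {est:,.0f}, bootstrap SE ~ {se:,.0f}")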
Generalized Support Software: Domain Analysis and Implementation
NASA Technical Reports Server (NTRS)
Stark, Mike; Seidewitz, Ed
1995-01-01
For the past five years, the Flight Dynamics Division (FDD) at NASA's Goddard Space Flight Center has been carrying out a detailed domain analysis effort and is now beginning to implement Generalized Support Software (GSS) based on this analysis. GSS is part of the larger Flight Dynamics Distributed System (FDDS), and is designed to run under the FDDS User Interface/Executive (UIX). The FDD is transitioning from a mainframe-based environment to systems running on engineering workstations. The GSS will be a library of highly reusable components that may be configured within the standard FDDS architecture to quickly produce low-cost satellite ground support systems. The estimate for the first release is that this library will contain approximately 200,000 lines of code. The main driver for developing generalized software is development cost and schedule improvement. The goal is to ultimately have at least 80 percent of all software required for a spacecraft mission (within the domain supported by the GSS) configured from the generalized components.
Standardization in software conversion of (ROM) estimating
NASA Technical Reports Server (NTRS)
Roat, G. H.
1984-01-01
Technical problems and their solutions comprise by far the majority of work involved in space simulation engineering. Fixed price contracts with schedule award fees are becoming more and more prevalent. Accurate estimation of these jobs is critical to maintain costs within limits and to predict realistic contract schedule dates. Computerized estimating may hold the answer to these new problems, though up to now computerized estimating has been complex, expensive, and geared to the business world, not to technical people. The objective of this effort was to provide a simple program on a desk top computer capable of providing a Rough Order of Magnitude (ROM) estimate in a short time. This program is not intended to provide a highly detailed breakdown of costs to a customer, but to provide a number which can be used as a rough estimate on short notice. With more debugging and fine tuning, a more detailed estimate can be made.
Has your greenhouse gone virtual?
USDA-ARS?s Scientific Manuscript database
Virtual Grower is a free decision-support software program available from USDA-ARS that allows growers to build a virtual greenhouse. It was initially designed to help greenhouse growers estimate heating costs and conduct simple simulations to figure out where heat savings could be achieved. Featu...
Reducing Contingency through Sampling at the Luckey FUSRAP Site - 13186
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frothingham, David; Barker, Michelle; Buechi, Steve
2013-07-01
Typically, the greatest risk in developing accurate cost estimates for the remediation of hazardous, toxic, and radioactive waste sites is the uncertainty in the estimated volume of contaminated media requiring remediation. Efforts to address this risk in the remediation cost estimate can result in large cost contingencies that are often considered unacceptable when budgeting for site cleanups. Such was the case for the Luckey Formerly Utilized Sites Remedial Action Program (FUSRAP) site near Luckey, Ohio, which had significant uncertainty surrounding the estimated volume of site soils contaminated with radium, uranium, thorium, beryllium, and lead. Funding provided by the American Recovery and Reinvestment Act (ARRA) allowed the U.S. Army Corps of Engineers (USACE) to conduct additional environmental sampling and analysis at the Luckey Site between November 2009 and April 2010, with the objective to further delineate the horizontal and vertical extent of contaminated soils in order to reduce the uncertainty in the soil volume estimate. Investigative work included radiological, geophysical, and topographic field surveys, subsurface borings, and soil sampling. Results from the investigative sampling were used in conjunction with Argonne National Laboratory's Bayesian Approaches for Adaptive Spatial Sampling (BAASS) software to update the contaminated soil volume estimate for the site. This updated volume estimate was then used to update the project cost-to-complete estimate using the USACE Cost and Schedule Risk Analysis process, which develops cost contingencies based on project risks. An investment of $1.1 M of ARRA funds for additional investigative work resulted in a reduction of 135,000 in-situ cubic meters (177,000 in-situ cubic yards) in the estimated base volume. This refinement of the estimated soil volume resulted in a $64.3 M reduction in the estimated project cost-to-complete, through a reduction in the uncertainty in the contaminated soil volume estimate and the associated contingency costs. (authors)
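A toy illustration of why the sampling paid off: when a Monte Carlo cost distribution narrows, the contingency needed to cover a given percentile shrinks. The unit cost and volume distributions below are hypothetical placeholders, not Luckey Site values (Python):

    import random

    # Contingency sized from the spread of a Monte Carlo cost distribution,
    # before and after sampling narrows the volume estimate. Illustrative only.
    random.seed(7)
    UNIT_COST = 400  # $ per in-situ cubic meter, hypothetical

    def p80_minus_mean(vol_mu, vol_sigma, trials=20000):
        draws = sorted(max(0.0, random.gauss(vol_mu, vol_sigma)) * UNIT_COST
                       for _ in range(trials))
        mean = sum(draws) / trials
        p80 = draws[int(0.8 * trials)]
        return p80 - mean  # contingency above the expected cost

    before = p80_minus_mean(500_000, 150_000)
    after  = p80_minus_mean(365_000,  40_000)
    print(f"contingency before ~ ${before/1e6:.1f}M, after ~ ${after/1e6:.1f}M")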
The UARS and open data concept and analysis study. [upper atmosphere
NASA Technical Reports Server (NTRS)
Mittal, M.; Nebb, J.; Woodward, H.
1983-01-01
Alternative concepts for a common design for the UARS and OPEN Central Data Handling Facility (CDHF) are offered. Costs for alternative implementations of the UARS designs are presented, showing that the system design does not restrict the implementation to a single manufacturer. Processing demands on the alternative UARS CDHF implementations are then discussed. With this information at hand, together with estimates for OPEN processing demands, it is shown that any shortfall in system capability for OPEN support can be remedied by either component upgrades or array processing attachments rather than a system redesign. In addition to a common system design, it is shown that there is significant potential for common software design, especially in the areas of data management software and non-user-unique production software. Archiving of the CDHF data is discussed. Following that, cost examples for several modes of communications between the CDHF and Remote User Facilities are presented. Technology application is discussed.
NASA Technical Reports Server (NTRS)
Tamayo, Tak Chai
1987-01-01
Quality of software is not only vital to the successful operation of the space station, it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancies, making traditional statistical analysis unsuitable for evaluating the reliability of software. A statistical model was developed to provide a representation of the number and types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
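The report's model is not reproduced here; as a stand-in for the general approach, a common software reliability growth model (the Goel-Okumoto NHPP) can be fitted to cumulative failure counts to estimate remaining defects and support a stop-testing rule. The failure data below are hypothetical (Python):

    import numpy as np
    from scipy.optimize import curve_fit

    # Not the report's model: a common stand-in (Goel-Okumoto NHPP) fitted to
    # cumulative failure counts observed during testing. Data are hypothetical.
    def mean_failures(t, a, b):
        return a * (1.0 - np.exp(-b * t))  # expected cumulative failures by time t

    weeks = np.arange(1, 11, dtype=float)
    cum   = np.array([5, 9, 13, 15, 17, 19, 20, 21, 21, 22], dtype=float)

    (a, b), _ = curve_fit(mean_failures, weeks, cum, p0=(25.0, 0.3))
    remaining = a - mean_failures(weeks[-1], a, b)
    print(f"estimated total defects a={a:.1f}, expected remaining={remaining:.1f}")
    # A stopping rule might be: stop testing once 'remaining' falls below a target.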
Sizing and Lifecycle Cost Analysis of an Ares V Composite Interstage
NASA Technical Reports Server (NTRS)
Mann, Troy; Smeltzer, Stan; Grenoble, Ray; Mason, Brian; Rosario, Sev; Fairbairn, Bob
2012-01-01
The Interstage Element of the Ares V launch vehicle was sized using a commercially available structural sizing software tool. Two different concepts were considered, a metallic design and a composite design. Both concepts were sized using similar levels of analysis fidelity and included the influence of design details on each concept. Additionally, the impact of the different manufacturing techniques and failure mechanisms for composite and metallic construction were considered. Significant details were included in analysis models of each concept, including penetrations for human access, joint connections, as well as secondary loading effects. The designs and results of the analysis were used to determine lifecycle cost estimates for the two Interstage designs. Lifecycle cost estimates were based on industry provided cost data for similar launch vehicle components. The results indicated that significant mass as well as cost savings are attainable for the chosen composite concept as compared with a metallic option.
The Macintosh Based Design Studio.
ERIC Educational Resources Information Center
Earle, Daniel W., Jr.
1988-01-01
Describes the configuration of a workstation for a college design studio based on the Macintosh Plus microcomputer. Highlights include cost estimates, computer hardware peripherals, computer aided design software, networked studios, and potentials for new approaches to design activity in the computer based studio of the future. (Author/LRW)
Safikhani, Zhaleh; Sadeghi, Mehdi; Pezeshk, Hamid; Eslahchi, Changiz
2013-01-01
Recent advances in the sequencing technologies have provided a handful of RNA-seq datasets for transcriptome analysis. However, reconstruction of full-length isoforms and estimation of the expression level of transcripts with a low cost are challenging tasks. We propose a novel de novo method named SSP that incorporates interval integer linear programming to resolve alternatively spliced isoforms and reconstruct the whole transcriptome from short reads. Experimental results show that SSP is fast and precise in determining different alternatively spliced isoforms along with the estimation of reconstructed transcript abundances. The SSP software package is available at http://www.bioinf.cs.ipm.ir/software/ssp. © 2013.
Software Estimates Costs of Testing Rocket Engines
NASA Technical Reports Server (NTRS)
Smith, C. L.
2003-01-01
Simulation-Based Cost Model (SiCM), a discrete event simulation developed in Extend, simulates pertinent aspects of the testing of rocket propulsion test articles for the purpose of estimating the costs of such testing during time intervals specified by its users. A user enters input data for control of simulations; information on the nature of, and activity in, a given testing project; and information on resources. Simulation objects are created on the basis of this input. Costs of the engineering-design, construction, and testing phases of a given project are estimated from the numbers and labor rates of engineers and technicians employed in each phase; the duration of each phase; the costs of materials used in each phase; and, for the testing phase, the rate of maintenance of the testing facility. The three main outputs of SiCM are (1) a curve, updated at each iteration of the simulation, that shows overall expenditures vs. time during the interval specified by the user; (2) a histogram of the total costs from all iterations of the simulation; and (3) a table displaying means and variances of cumulative costs for each phase from all iterations. Other outputs include spending curves for each phase.
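A toy version of the SiCM idea, iterating random phase durations and labor costs to produce a spending curve per run and a cost distribution across runs; all rates and durations are hypothetical placeholders, not SiCM's calibrated inputs (Python):

    import random

    # Monte Carlo over phase durations and labor to get a cumulative spending
    # curve and a histogram-ready list of total costs. Figures hypothetical.
    random.seed(42)
    PHASES = [  # (name, mean months, engineers, technicians)
        ("design",       6, 4, 1),
        ("construction", 9, 2, 6),
        ("testing",     12, 3, 4),
    ]
    ENG_RATE, TECH_RATE = 14_000, 9_000  # $/person-month, hypothetical

    def one_run():
        total, curve = 0.0, []
        for name, mu, eng, tech in PHASES:
            months = max(1.0, random.gauss(mu, mu * 0.2))
            total += months * (eng * ENG_RATE + tech * TECH_RATE)
            curve.append((name, total))  # cumulative spend at end of phase
        return total, curve

    totals = [one_run()[0] for _ in range(5000)]
    print(f"mean cost ~ ${sum(totals)/len(totals)/1e6:.2f}M over {len(totals)} iterations")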
Cost and schedule estimation study report
NASA Technical Reports Server (NTRS)
Condon, Steve; Regardie, Myrna; Stark, Mike; Waligora, Sharon
1993-01-01
This report describes the analysis performed and the findings of a study of the software development cost and schedule estimation models used by the Flight Dynamics Division (FDD), Goddard Space Flight Center. The study analyzes typical FDD projects, focusing primarily on those developed since 1982. The study reconfirms the standard SEL effort estimation model that is based on size adjusted for reuse; however, guidelines for the productivity and growth parameters in the baseline effort model have been updated. The study also produced a schedule prediction model based on empirical data that varies depending on application type. Models for the distribution of effort and schedule by life-cycle phase are also presented. Finally, this report explains how to use these models to plan SEL projects.
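A sketch of the SEL-style effort form described above, with size adjusted for reuse feeding a power-law effort equation; the reuse credit and the productivity/growth parameters are placeholders, not the report's updated guideline values (Python):

    # SEL-style effort model sketch: effort driven by size adjusted for reuse.
    # The 20% reuse credit and the productivity/growth parameters below are
    # hypothetical placeholders.
    def adjusted_size(new_sloc, reused_sloc, reuse_factor=0.2):
        return new_sloc + reuse_factor * reused_sloc

    def effort_staff_months(size_ksloc, productivity=3.2, growth=1.05):
        # effort = productivity * size^growth, parameters hypothetical
        return productivity * size_ksloc ** growth

    size = adjusted_size(new_sloc=40_000, reused_sloc=60_000) / 1000.0  # KSLOC
    print(f"adjusted size {size:.0f} KSLOC -> {effort_staff_months(size):.0f} staff-months")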
Complexity as a Factor of Quality and Cost in Large Scale Software Development.
1979-12-01
allocating testing resources." [69] V. THE ROLE OF COMPLEXITY IN RESOURCE ESTIMATION AND ALLOCATION A. GENERAL It can be argued that blame for the...and allocation of testing resources by identifying independent substructures and identifying heavily used logic paths. 2. Setting a Design Threshold... RESOURCE ESTIMATION 1. New Dynamic Field 2. Quality and Testing 3. Programming Units of
Module Interconnection Frameworks for a Real-Time Spreadsheet
1993-10-19
Malaysian Navy by the Malaysian subsidiary of Encore, and that response has been favorable. Prototyping is another area where rapid, interactive...contract with Encore which provides software royalties for every Infinity system shipped. Royalties from the Encore product RTware, Inc. 714 9th St. Suite...Overhead: $153,035 Total Cost: $680,074 13.17. Royalties: N/A 13.18. Fee or Profit: $50,000 13.19. Total Estimated Cost: $730,743 13.20. Authorized
The Utility of Handheld Programmable Calculators in Aircraft Life Cycle Cost Estimation.
1982-09-01
are available for extended memory, hardcopy printout, video interface, and special application software. Any calculator of comparable memory could...conditioning system. OG Total number of engine, air turbine motor (ATM) and auxiliary power unit (APU) driven generator/alternators. OHP Total number
14 CFR 1214.205 - Revisit and/or retrieval services.
Code of Federal Regulations, 2012 CFR
2012-01-01
... a scheduled Shuttle flight, he will only pay for added mission planning, unique hardware or software... Section 1214.205 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION SPACE FLIGHT... priced on the basis of estimated costs. If a special dedicated Shuttle flight is required, the full...
14 CFR 1214.205 - Revisit and/or retrieval services.
Code of Federal Regulations, 2011 CFR
2011-01-01
... a scheduled Shuttle flight, he will only pay for added mission planning, unique hardware or software... Section 1214.205 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION SPACE FLIGHT... priced on the basis of estimated costs. If a special dedicated Shuttle flight is required, the full...
14 CFR 1214.205 - Revisit and/or retrieval services.
Code of Federal Regulations, 2013 CFR
2013-01-01
... a scheduled Shuttle flight, he will only pay for added mission planning, unique hardware or software... Section 1214.205 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION SPACE FLIGHT... priced on the basis of estimated costs. If a special dedicated Shuttle flight is required, the full...
Launch vehicle operations cost reduction through artificial intelligence techniques
NASA Technical Reports Server (NTRS)
Davis, Tom C., Jr.
1988-01-01
NASA's Kennedy Space Center has attempted to develop AI methods in order to reduce the cost of launch vehicle ground operations as well as to improve the reliability and safety of such operations. Attention is presently given to cost savings estimates for systems involving launch vehicle firing-room software and hardware real-time diagnostics, as well as the nature of configuration control and the real-time autonomous diagnostics of launch-processing systems by these means. Intelligent launch decisions and intelligent weather forecasting are additional applications of AI being considered.
Test Case Study: Estimating the Cost of Ada Software Development
1989-04-01
2.1.5 SPQR/20 ... 2.1.6 SYSTEM-3 ... 2.2 ADA ... SoftCost-Ada 3 out of 3, 0% to 6%, Commercial Contracts; SPQR/20 1 out of 4, -22%; Model Consistency on SoftCost-Ada 3 out of 3, -13% to -8%, Commercial Contracts ... PRICE-S 2 out of 4, -1% to 22%; Model Accuracy on SASET 3 out of 4, -7% to 29%, Command & Control; SPQR/20 3 out of 4, -22% to 19%, Applications Model
A simplified economic filter for open-pit gold-silver mining in the United States
Singer, Donald A.; Menzie, W. David; Long, Keith R.
1998-01-01
In resource assessments of undiscovered mineral deposits and in the early stages of exploration, including planning, a need for prefeasibility cost models exists. In exploration, such models help filter economic from uneconomic deposits and focus attention on targets that can really benefit the exploration enterprise. In resource assessment, these models can be used to eliminate deposits that would probably be uneconomic even if discovered. The U.S. Bureau of Mines (USBM) previously developed simplified cost models for such problems (Camm, 1991). These cost models estimate operating and capital expenditures for a mineral deposit given its tonnage, grade, and depth. These cost models were also incorporated in USBM prefeasibility software (Smith, 1991). Because the cost data used to estimate operating and capital costs in these models are now over ten years old, we decided that it was necessary to test these equations with more current data. We limited this study to open-pit gold-silver mines located in the United States.
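A prefeasibility filter of this kind can be sketched as power-law cost curves screened against contained-metal value; all coefficients below are hypothetical placeholders, not the USBM (Camm, 1991) equations (Python):

    # Prefeasibility-style filter in the spirit of Camm (1991): power-law cost
    # curves in daily mill capacity. All coefficients are hypothetical.
    def capital_cost(tonnes_per_day):
        return 250_000 * tonnes_per_day ** 0.6

    def operating_cost_per_tonne(tonnes_per_day):
        return 40.0 * tonnes_per_day ** -0.2

    def is_economic(tonnage, grade_g_per_t, gold_price_per_g=12.0, tpd=5000):
        revenue = tonnage * grade_g_per_t * gold_price_per_g * 0.9  # 90% recovery
        cost = capital_cost(tpd) + tonnage * operating_cost_per_tonne(tpd)
        return revenue > cost

    print(is_economic(tonnage=20e6, grade_g_per_t=1.2))  # -> True for these inputs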
NASA Astrophysics Data System (ADS)
Bell, Kevin D.; Dafesh, Philip A.; Hsu, L. A.; Tsuda, A. S.
1995-12-01
Current architectural and design trade techniques often carry unaffordable alternatives late into the decision process. Early decisions made during the concept exploration and development (CE&D) phase will drive the cost of a program more than any other phase of development; thus, designers must be able to assess both the performance and cost impacts of their early choices. The Space Based Infrared System (SBIRS) cost engineering model (CEM) described in this paper is an end-to-end process integrating engineering and cost expertise through commonly available spreadsheet software, allowing for concurrent design engineering and cost estimation to identify and balance system drivers to reduce acquisition costs. The automated interconnectivity between subsystem models using spreadsheet software allows for the quick and consistent assessment of the system design impacts and relative cost impacts due to requirement changes. It is different from most CEM efforts attempted in the past as it incorporates more detailed spacecraft and sensor payload models, and has been applied to determine the cost drivers for an advanced infrared satellite system acquisition. The CEM comprises integrated detailed engineering and cost estimating relationships describing performance, design, and cost parameters. Detailed models have been developed to evaluate design parameters for the spacecraft bus and sensor; both step-stare and scanner sensor types incorporate models of focal plane array, optics, processing, thermal, communications, and mission performance. The current CEM effort has provided visibility into requirements, design, and cost drivers for system architects and decision makers to determine the configuration of an infrared satellite architecture that meets essential requirements cost effectively. In general, the methodology described in this paper consists of process building blocks that can be tailored to the needs of many applications. Descriptions of the spacecraft and payload subsystem models provide insight into The Aerospace Corporation's expertise and the scope of the SBIRS concept development effort.
The costs of avoiding environmental impacts from shale-gas surface infrastructure.
Milt, Austin W; Gagnolet, Tamara D; Armsworth, Paul R
2016-12-01
Growing energy demand has increased the need to manage conflicts between energy production and the environment. As an example, shale-gas extraction requires substantial surface infrastructure, which fragments habitats, erodes soils, degrades freshwater systems, and displaces rare species. Strategic planning of shale-gas infrastructure can reduce trade-offs between economic and environmental objectives, but the specific nature of these trade-offs is not known. We estimated the cost of avoiding impacts from land-use change on forests, wetlands, rare species, and streams from shale-energy development within leaseholds. We created software for optimally siting shale-gas surface infrastructure to minimize its environmental impacts at reasonable construction cost. We visually assessed sites before infrastructure optimization to test whether such inspection could be used to predict whether impacts could be avoided at the site. On average, up to 38% of aggregate environmental impacts of infrastructure could be avoided for 20% greater development costs by spatially optimizing infrastructure. However, we found trade-offs between environmental impacts and costs among sites. In visual inspections, we often distinguished between sites that could be developed to avoid impacts at relatively low cost (29%) and those that could not (20%). Reductions in a metric of aggregate environmental impact could be largely attributed to potential displacement of rare species, sedimentation, and forest fragmentation. Planners and regulators can estimate and use heterogeneous trade-offs among development sites to create industry-wide improvements in environmental performance and do so at reasonable costs by, for example, leveraging low-cost avoidance of impacts at some sites to offset others. This could require substantial effort, but the results and software we provide can facilitate the process. © 2016 Society for Conservation Biology.
Examination of the Open Market Corridor
2003-12-01
D. BENEFITS OF THE PURCHASE CARD PROGRAM; 1. List of Benefits; 2. Additional Benefits and How OMC Can Increase the Benefits; E. WEAKNESSES OF...software licenses and support services. Estimated life-cycle costs for FY 1995 through FY 2005 are $3.7 billion. Operational benefits from SPS are
101 Computer Projects for Libraries. 101 Micro Series.
ERIC Educational Resources Information Center
Dewey, Patrick R.
The projects collected in this book represent a wide cross section of the way microcomputers are used in libraries. Each project description includes organization and contact information, hardware and software used, cost and project length estimates, and Web or print references when available. Projects come from academic and public libraries,…
Software Intensive Systems Cost and Schedule Estimation
2013-06-13
Radio communication systems (RTE); electronic navigation systems (RTE); space vehicle electronic tracking systems (RTE); sonar systems (RTE) ... 3.2.2 SEER-SEM
Sneak Analysis Application Guidelines
1982-06-01
List-of-figures excerpt: Hardware Program Change Cost Trend, Airborne Environment; 3-11 Relative Software Program Change Costs; 3-50 Derived Software Program Change Cost by Phase, Airborne Environment; 3-51 Derived Software Program Change Cost by Phase, Ground/Water Environment; 3-52 Total Software Program Change Costs; 3-53 Sneak Analysis
An Application of Discriminant Analysis to the Selection of Software Cost Estimating Models.
1984-09-01
the PRICE S Users Manual (29:111-25) was used with a slight modification. Based on the experience and advice of Captain Joe Dean, Electronic System...this study, and EXP is the expansion factor listed in the PRICE S User's Manual. Another important factor needing explanation is development cost...coefficients and a unique constant. According to the SPSS manual (26:445), "Under the assumption of a multivariate normal distribution, the
Effort Drivers Estimation for Brazilian Geographically Distributed Software Development
NASA Astrophysics Data System (ADS)
Almeida, Ana Carina M.; Souza, Renata; Aquino, Gibeon; Meira, Silvio
To meet the requirements of today's fast-paced markets, it is important to develop projects on time and with the minimum use of resources. A good estimate is the key to achieving this goal. Several companies have started to work with geographically distributed teams for reasons of cost reduction and time-to-market. Some researchers indicate that this approach introduces new challenges, because the teams work in different time zones and may differ in culture and language. It is already known that multisite development increases the software cycle time. Data from 15 DSD projects at 10 distinct companies were collected. The analysis shows drivers that significantly impact the total effort planned to develop systems using the DSD approach in Brazil.
Technology Benefit Estimator (T/BEST): User's Manual
NASA Technical Reports Server (NTRS)
Generazio, Edward R.; Chamis, Christos C.; Abumeri, Galib
1994-01-01
The Technology Benefit Estimator (T/BEST) system is a formal method to assess advanced technologies and quantify the benefit contributions for prioritization. T/BEST may be used to provide guidelines to identify and prioritize high payoff research areas, help manage research and limited resources, show the link between advanced concepts and the bottom line, i.e., accrued benefit and value, and to communicate credibly the benefits of research. The T/BEST software computer program is specifically designed to estimate benefits, and benefit sensitivities, of introducing new technologies into existing propulsion systems. Key engine cycle, structural, fluid, mission and cost analysis modules are used to provide a framework for interfacing with advanced technologies. An open-ended, modular approach is used to allow for modification and addition of both key and advanced technology modules. T/BEST has a hierarchical framework that yields varying levels of benefit estimation accuracy that are dependent on the degree of input detail available. This hierarchical feature permits rapid estimation of technology benefits even when the technology is at the conceptual stage. As knowledge of the technology details increases, the accuracy of the benefit analysis increases. Included in T/BEST's framework are correlations developed from a statistical data base that is relied upon if there is insufficient information given in a particular area, e.g., fuel capacity or aircraft landing weight. Statistical predictions are not required if these data are specified in the mission requirements. The engine cycle, structural, fluid, cost, noise, and emissions analyses interact with the default or user material and component libraries to yield estimates of specific global benefits: range, speed, thrust, capacity, component life, noise, emissions, specific fuel consumption, component and engine weights, pre-certification test, mission performance, engine cost, direct operating cost, life cycle cost, manufacturing cost, development cost, risk, and development time. Currently, T/BEST operates on stand-alone or networked workstations, and uses a UNIX shell or script to control the operation of interfaced FORTRAN based analyses. T/BEST's interface structure works equally well with non-FORTRAN or mixed software analysis. This interface structure is designed to maintain the integrity of the expert's analyses by interfacing with the expert's existing input and output files. Parameter input and output data (e.g., number of blades, hub diameters, etc.) are passed via T/BEST's neutral file, while copious data (e.g., finite element models, profiles, etc.) are passed via file pointers that point to the expert's analyses output files. In order to make the communications between T/BEST's neutral file and attached analysis codes simple, only two software commands, PUT and GET, are required. This simplicity permits easy access to all input and output variables contained within the neutral file. Both public domain and proprietary analysis codes may be attached with a minimal amount of effort, while maintaining full data and analysis integrity, and security. T/BEST's software framework, status, beginner-to-expert operation, interface architecture, analysis module addition, and key analysis modules are discussed. Representative examples of T/BEST benefit analyses are shown.
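The PUT/GET convention described above can be sketched in a few lines; the file format here is hypothetical (T/BEST itself is UNIX-shell and FORTRAN based), but it shows how scalar parameters pass through a neutral file while copious data pass as file pointers (Python):

    import json, pathlib

    # Sketch of the neutral-file idea: analyses exchange scalar parameters via
    # PUT/GET, and bulky outputs via file pointers. Format is hypothetical.
    NEUTRAL = pathlib.Path("neutral_file.json")

    def put(name, value):
        data = json.loads(NEUTRAL.read_text()) if NEUTRAL.exists() else {}
        data[name] = value
        NEUTRAL.write_text(json.dumps(data, indent=2))

    def get(name):
        return json.loads(NEUTRAL.read_text())[name]

    put("number_of_blades", 24)
    put("fem_model", "/data/fan_stage.fem")  # pointer to copious data
    print(get("number_of_blades"))  # -> 24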
Design and implementation of a Windows NT network to support CNC activities
NASA Technical Reports Server (NTRS)
Shearrow, C. A.
1996-01-01
The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer Automated Design and Computer Automated Manufacturing (CAD/CAM) abilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980's in relationship to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study of a CIM application capable of completing the changes necessary to bring the manufacturing practices into the 1990's. The documentation provided in this qualitative research effort includes discovery of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network, and mode of operation. The proposed direction of research includes a network design, the computers and software to be used, machine-to-computer connections, an estimated timeline for implementation, and a cost estimate. Recommendations for the division's improvement include actions to be taken, software to utilize, and computer configurations.
Predicting tool life in turning operations using neural networks and image processing
NASA Astrophysics Data System (ADS)
Mikołajczyk, T.; Nowicki, K.; Bustillo, A.; Yu Pimenov, D.
2018-05-01
A two-step method is presented for the automatic prediction of tool life in turning operations. First, experimental data are collected for three cutting edges under the same constant processing conditions. In these experiments, the parameter of tool wear, VB, is measured with conventional methods and the same parameter is estimated using Neural Wear, a customized software package that combines flank wear image recognition and Artificial Neural Networks (ANNs). Second, an ANN model of tool life is trained with the data collected from the first two cutting edges and the subsequent model is evaluated on two different subsets for the third cutting edge: the first subset is obtained from the direct measurement of tool wear and the second is obtained from the Neural Wear software that estimates tool wear using edge images. Although the complete-automated solution, Neural Wear software for tool wear recognition plus the ANN model of tool life prediction, presented a slightly higher error than the direct measurements, it was within the same range and can meet all industrial requirements. These results confirm that the combination of image recognition software and ANN modelling could potentially be developed into a useful industrial tool for low-cost estimation of tool life in turning operations.
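A sketch of the second step only: a small neural network mapping cutting time to flank wear VB, trained on some edges and applied to another. The data are synthetic and the paper's network topology and inputs are not reproduced (Python, scikit-learn):

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # ANN mapping cutting time to flank wear VB; training data are synthetic,
    # standing in for measurements from the first two cutting edges.
    rng = np.random.default_rng(0)
    t_train = rng.uniform(0, 30, size=(200, 1))                        # minutes of cutting
    vb_train = 0.01 * t_train[:, 0] ** 1.2 + rng.normal(0, 0.01, 200)  # wear (mm)

    model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
    model.fit(t_train, vb_train)

    t_test = np.array([[10.0], [20.0], [28.0]])
    print(model.predict(t_test))  # predicted VB (mm) for the held-out edge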
Optical Storage System For Small Software Package Distribution
NASA Astrophysics Data System (ADS)
Wehrenberg, Paul J.
1985-04-01
This paper describes an optical mass storage system being developed for extremely low cost distribution of small software packages. The structure of the media, design of the optical playback system, and some aspects of mastering and media production are discussed. This read-only system is designed solely for the purpose of downloading code in a spooling fashion from the media to the host machine. The media is configured as a plastic card with dimensions 85 mm x 12 mm x 2 mm. Each data region on a card is a rectangle 1.33 mm x 59.4 mm which carries up to 64 KB of user data. Cost estimates for production are $0.06 per card for the media and $38.00 for the playback device. The mastering process for the production tooling uses photolithography techniques and can provide production tooling within a few hours of software release. The playback mechanism is rugged and small, and does not require the use of any electromechanical servos.
Mouseli, Ali; Barouni, Mohsen; Amiresmaili, Mohammadreza; Samiee, Siamak Mirab; Vali, Leila
2017-04-01
It is believed that laboratory tariffs in Iran do not reflect the real costs. This might expose private laboratories to financial hardship. Activity Based Costing is widely used as a cost measurement instrument to more closely approximate the true cost of operations. This study aimed to determine the real price of different clinical tests at a selected private clinical laboratory. This was a cross-sectional study carried out in 2015. The study setting was the private laboratories in the city of Kerman, Iran. Of 629 tests in the tariff book of the laboratory (relative value), 188 tests were conducted in the laboratory, which used Activity Based Costing (ABC) methodology to estimate cost-price. Analysis and cost-price estimation of laboratory services were performed with MY ABCM software Version 5.0. In 2015, the total costs were $641,645. Direct and indirect costs were 78.3% and 21.7% respectively. Laboratory consumables (37%) and personnel costs (36.3%) had the largest shares of the total cost. The hormone test group cost the most, $147,741 (23.03%), and the other tests group cost the least, $3,611 (0.56%). After calculating the cost of laboratory services, a comparison was made between the calculated prices and the private sector's tariffs for 2015. This study showed that there was a difference between costs and tariffs in the private laboratory. One way to overcome this problem is to increase the number of laboratory tests with regard to the capacity of the laboratories.
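The ABC mechanics reduce to two allocation steps, resources to activities and activities to tests; the sketch below reuses only the published totals (the $641,645 total and the 37%/36.3% resource shares), while the activity shares and test volumes are hypothetical (Python):

    # Minimal activity-based costing allocation: resource costs flow to
    # activities, then to tests via driver quantities. The resource totals
    # follow the published figures; shares and volumes are hypothetical.
    resources = {"personnel": 233_000, "consumables": 237_400, "overhead": 171_245}

    # share of each resource consumed by each activity (each column sums to 1)
    activity_share = {
        "sampling":  {"personnel": 0.3, "consumables": 0.1, "overhead": 0.2},
        "analysis":  {"personnel": 0.5, "consumables": 0.8, "overhead": 0.5},
        "reporting": {"personnel": 0.2, "consumables": 0.1, "overhead": 0.3},
    }

    activity_cost = {
        a: sum(resources[r] * s for r, s in shares.items())
        for a, shares in activity_share.items()
    }

    tests_per_year = {"sampling": 90_000, "analysis": 90_000, "reporting": 90_000}
    unit_cost = sum(activity_cost[a] / tests_per_year[a] for a in activity_cost)
    print(f"average cost per test ~ ${unit_cost:.2f}")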
System Risk Balancing Profiles: Software Component
NASA Technical Reports Server (NTRS)
Kelly, John C.; Sigal, Burton C.; Gindorf, Tom
2000-01-01
The Software QA / V&V guide will be reviewed and updated based on feedback from NASA organizations and others with a vested interest in this area. Hardware, EEE Parts, Reliability, and Systems Safety are a sample of the future guides that will be developed. Cost Estimates, Lessons Learned, Probability of Failure and PACTS (Prevention, Avoidance, Control or Test) are needed to provide a more complete risk management strategy. This approach to risk management is designed to help balance the resources and program content for risk reduction for NASA's changing environment.
Investigating DRG cost weights for hospitals in middle income countries.
Ghaffari, Shahram; Doran, Christopher; Wilson, Andrew; Aisbett, Chris; Jackson, Terri
2009-01-01
Identifying the cost of hospital outputs, particularly acute inpatients measured by Diagnosis Related Groups (DRGs), is an important component of casemix implementation. Measuring the relative costliness of specific DRGs is useful for a wide range of policy and planning applications. Estimating the relative use of resources per DRG can be done through different costing approaches, depending on the availability of information, time, and budget. This study aims to guide costing efforts in Iran and other countries in the region that are pursuing casemix funding, by identifying the main issues facing cost-finding approaches and introducing costing models compatible with their hospitals' accounting and management structures. The results show that inadequate financial and utilisation information at the patient level, poorly computerized 'feeder systems', and low quality data make it impossible to estimate reliable DRG costs through clinical costing. A cost modelling approach estimates the average cost of 2.723 million Rials (Iranian currency) per DRG. Using standard linear regression, a coefficient of 0.14 (CI = 0.12-0.16) suggests that the average cost weight increases by 14% for every one-day increase in average length of stay (LOS). We concluded that calculation of DRG cost weights (CWs) using Australian service weights provides a sensible starting place for DRG-based hospital management; but restructuring hospital accounting systems, designing computerized feeder systems, using appropriate software, and development of national service weights that reflect local practice patterns will enhance the accuracy of DRG CWs.
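The reported LOS relationship is a one-variable linear fit; a worked sketch with hypothetical data points generated around the published coefficient of 0.14 (Python):

    import numpy as np

    # One-variable linear fit of DRG cost weight on average length of stay.
    # The data points are hypothetical, generated around the study's ~0.14 slope.
    los = np.array([1.5, 2.0, 3.1, 4.4, 5.8, 7.2, 9.0])  # mean LOS (days)
    cw  = 0.6 + 0.14 * los + np.random.default_rng(3).normal(0, 0.05, 7)

    slope, intercept = np.polyfit(los, cw, 1)
    print(f"cost weight ~ {intercept:.2f} + {slope:.2f} * LOS")
    # i.e., each extra day of average stay adds ~0.14 to the DRG cost weight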
Feasibility study for automatic reduction of phase change imagery
NASA Technical Reports Server (NTRS)
Nossaman, G. O.
1971-01-01
The feasibility of automatically reducing a form of pictorial aerodynamic heating data is discussed. The imagery, depicting the melting history of a thin coat of fusible temperature indicator painted on an aerodynamically heated model, was previously reduced by manual methods. Careful examination of various lighting theories and approaches led to an experimentally verified illumination concept capable of yielding high-quality imagery. Both digital and video image processing techniques were applied to reduction of the data, and it was demonstrated that either method can be used to develop superimposed contours. Mathematical techniques were developed to find the model-to-image and the inverse image-to-model transformation using six conjugate points, and methods were developed using these transformations to determine heating rates on the model surface. A video system was designed which is able to reduce the imagery rapidly, economically and accurately. Costs for this system were estimated. A study plan was outlined whereby the mathematical transformation techniques developed to produce model coordinate heating data could be applied to operational software, and methods were discussed and costs estimated for obtaining the digital information necessary for this software.
Costs of health IT: beginning to understand the financial impact of a dental school EHR.
Spallek, Heiko; Johnson, Lynn; Kerr, Joseph; Rankin, David
2014-11-01
Health Information Technology (Health IT) constitutes an integral component of the operations of most academic dental institutions nowadays. However, the expenses associated with the acquisition and the ongoing maintenance of these complex systems have often been buried among costs for other electronic infrastructure systems, distributed across various cost centers including unmeasured central campus support, covered centrally and therefore difficult to quantify, and spread over years, denying school administrators a clear understanding of the resources that have been dedicated to Health IT. The aim of this study was to understand the financial impact of Health IT at four similar U.S. dental schools: two schools using a purchased Electronic Health Record (EHR), and two schools that developed their own EHR. For these schools, the costs of creating ($2.5 million) and sustaining ($174,000) custom EHR software were significantly higher than acquiring ($500,000) and sustaining ($121,000) purchased software. These results are based on historical data and should not be regarded as a gold standard for what a complete Health IT suite should cost. The presented data are intended to inform school administrators about the myriad of costs associated with Health IT and give them a point of reference when comparing costs or making estimates for implementation projects.
Estimating Velocities of Glaciers Using Sentinel-1 SAR Imagery
NASA Astrophysics Data System (ADS)
Gens, R.; Arnoult, K., Jr.; Friedl, P.; Vijay, S.; Braun, M.; Meyer, F. J.; Gracheva, V.; Hogenson, K.
2017-12-01
In an international collaborative effort, software has been developed to estimate the velocities of glaciers by using Sentinel-1 Synthetic Aperture Radar (SAR) imagery. The technique, initially designed by the University of Erlangen-Nuremberg (FAU), has been previously used to quantify spatial and temporal variabilities in the velocities of surging glaciers in the Pakistan Karakoram. The software estimates surface velocities by first co-registering image pairs to sub-pixel precision and then by estimating local offsets based on cross-correlation. The Alaska Satellite Facility (ASF) at the University of Alaska Fairbanks (UAF) has modified the software to make it more robust and also capable of migration into the Amazon Cloud. Additionally, ASF has implemented a prototype that offers the glacier tracking processing flow as a subscription service as part of its Hybrid Pluggable Processing Pipeline (HyP3). Since the software is co-located with ASF's cloud-based Sentinel-1 archive, processing of large data volumes is now more efficient and cost effective. Velocity maps are estimated for Single Look Complex (SLC) SAR image pairs and a digital elevation model (DEM) of the local topography. A time series of these velocity maps then allows the long-term monitoring of these glaciers. Due to the all-weather capabilities and the dense coverage of Sentinel-1 data, the results are complementary to optically generated ones. Together with the products from the Global Land Ice Velocity Extraction project (GoLIVE) derived from Landsat 8 data, glacier speeds can be monitored more comprehensively. Examples from Sentinel-1 SAR-derived results are presented along with optical results for the same glaciers.
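At its core, the offset estimation step finds the shift that maximizes cross-correlation between co-registered patches; a minimal sketch on synthetic patches, omitting the oversampling, geocoding, and quality filtering of the production chain (Python):

    import numpy as np

    # Estimate the local shift between two co-registered SAR patches from the
    # peak of their circular cross-correlation. Patch data are synthetic.
    rng = np.random.default_rng(0)
    ref = rng.standard_normal((64, 64))
    sec = np.roll(ref, shift=(3, -2), axis=(0, 1))  # "glacier" moved (3, -2) px

    corr = np.fft.ifft2(np.fft.fft2(sec) * np.conj(np.fft.fft2(ref))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    wrap = lambda d, n: d - n if d > n // 2 else d  # map to signed shift
    print("offset (rows, cols):", wrap(dy, 64), wrap(dx, 64))  # -> (3, -2)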
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
... 215, 234, 242, and 252 Defense Federal Acquisition Regulation Supplement; Cost and Software Data... Regulation Supplement (DFARS) to set forth DoD Cost and Software Data Reporting system requirements for major... set forth the DoD requirement for offerors to: Describe the standard Cost and Software Data Reporting...
Software algorithm and hardware design for real-time implementation of new spectral estimator
2014-01-01
Background Real-time spectral analyzers can be difficult to implement for PC computer-based systems because of the potential for high computational cost and algorithm complexity. In this work a new spectral estimator (NSE) is developed for real-time analysis and compared with the discrete Fourier transform (DFT). Method Clinical data in the form of 216 fractionated atrial electrogram sequences were used as inputs. The sample rate for acquisition was 977 Hz, or approximately 1 millisecond between digital samples. Real-time NSE power spectra were generated for 16,384 consecutive data points. The same data sequences were used for spectral calculation using a radix-2 implementation of the DFT. The NSE algorithm was also developed for implementation as a real-time spectral analyzer electronic circuit board. Results The average interval for a single real-time spectral calculation in software was 3.29 μs for the NSE versus 504.5 μs for the DFT. Thus for real-time spectral analysis, the NSE algorithm is approximately 150× faster than the DFT. Over a 1 millisecond sampling period, the NSE algorithm had the capability to spectrally analyze a maximum of 303 data channels, while the DFT algorithm could only analyze a single channel. Moreover, for the 8 second sequences, the NSE spectral resolution in the 3-12 Hz range was 0.037 Hz while the DFT spectral resolution was only 0.122 Hz. The NSE was also found to be implementable as a standalone spectral analyzer board using approximately 26 integrated circuits at a cost of approximately $500. The software files used for analysis are included as a supplement; please see Additional files 1 and 2. Conclusions The NSE real-time algorithm has low computational cost and complexity, and is implementable in both software and hardware for 1 millisecond updates of multichannel spectra. The algorithm may be helpful to guide radiofrequency catheter ablation in real time. PMID:24886214
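The NSE algorithm itself is not specified in the abstract; purely to illustrate how a recursive per-bin update achieves O(1) cost per incoming sample (versus a full transform per update), a sliding DFT is sketched below at the paper's 977 Hz sample rate (Python):

    import cmath, math

    # Not the authors' NSE: a sliding DFT, shown only to illustrate recursive
    # per-bin spectral updates. One complex multiply-add refreshes bin k as
    # each new sample arrives.
    def sliding_dft_bin(samples, k, N):
        w = cmath.exp(2j * cmath.pi * k / N)
        X, window = 0j, [0.0] * N
        for n, x_new in enumerate(samples):
            x_old = window[n % N]
            window[n % N] = x_new
            X = (X + x_new - x_old) * w
            yield X  # DFT bin k over the most recent N samples

    N, k, fs = 64, 4, 977.0  # 977 Hz matches the paper's sample rate
    sig = [math.sin(2 * math.pi * (k * fs / N) * t / fs) for t in range(300)]
    mag = [abs(X) for X in sliding_dft_bin(sig, k, N)]
    print(f"bin {k} magnitude after warm-up: {mag[-1]:.1f}")  # ~ N/2 = 32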
Cost Model Comparison: A Study of Internally and Commercially Developed Cost Models in Use by NASA
NASA Technical Reports Server (NTRS)
Gupta, Garima
2011-01-01
NASA makes use of numerous cost models to accurately estimate the cost of various components of a mission - hardware, software, mission/ground operations - during the different stages of a mission's lifecycle. The purpose of this project was to survey these models and determine in which respects they are similar and in which they are different. The initial survey included a study of the cost drivers for each model, the form of each model (linear/exponential/other CER, range/point output, capable of risk/sensitivity analysis), and for what types of missions and for what phases of a mission lifecycle each model is capable of estimating cost. The models taken into consideration consisted of both those that were developed by NASA and those that were commercially developed: GSECT, NAFCOM, SCAT, QuickCost, PRICE, and SEER. Once the initial survey was completed, the next step in the project was to compare the cost models' capabilities in terms of Work Breakdown Structure (WBS) elements. This final comparison was then portrayed in a visual manner with Venn diagrams. All of the materials produced in the process of this study were then posted on the Ground Segment Team (GST) Wiki.
Source Lines Counter (SLiC) Version 4.0
NASA Technical Reports Server (NTRS)
Monson, Erik W.; Smith, Kevin A.; Newport, Brian J.; Gostelow, Roli D.; Hihn, Jairus M.; Kandt, Ronald K.
2011-01-01
Source Lines Counter (SLiC) is a software utility designed to measure software source code size using logical source statements and other common measures for 22 of the programming languages commonly used at NASA and in the aerospace industry. Such metrics can be used in a wide variety of applications, from parametric cost estimation to software defect analysis. SLiC has a variety of unique features such as automatic code search, automatic file detection, hierarchical directory totals, and spreadsheet-compatible output. SLiC was written for extensibility; new programming language support can be added with minimal effort in a short amount of time. SLiC runs on a variety of platforms including UNIX, Windows, and Mac OSX. Its straightforward command-line interface allows for customization and incorporation into the software build process for tracking development metrics.
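A toy logical-statement counter in the spirit of SLiC, for C-like source; real counters handle strings, preprocessor lines, and per-language rules that this sketch omits (Python):

    import re

    # Toy logical-SLOC counter: statements end with ';' or open a block with
    # '{'; comments and blank lines are ignored. Illustrative only.
    def logical_sloc(source: str) -> int:
        source = re.sub(r"/\*.*?\*/", "", source, flags=re.S)  # block comments
        count = 0
        for line in source.splitlines():
            line = re.sub(r"//.*", "", line).strip()           # line comments
            if not line:
                continue
            count += line.count(";") or line.endswith("{")
        return count

    sample = """
    int main(void) {            /* one statement: the function opener */
        int x = 1; int y = 2;   // two statements
        return x + y;           // one statement
    }
    """
    print(logical_sloc(sample))  # -> 4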
48 CFR 252.234-7004 - Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Cost and Software Data... AND CONTRACT CLAUSES Text of Provisions And Clauses 252.234-7004 Cost and Software Data Reporting System. (NOV 2010) As prescribed in 234.7101(b)(1), use the following clause: Cost and Software Data...
48 CFR 252.234-7004 - Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Cost and Software Data... AND CONTRACT CLAUSES Text of Provisions And Clauses 252.234-7004 Cost and Software Data Reporting System. (NOV 2010) As prescribed in 234.7101(b)(1), use the following clause: Cost and Software Data...
48 CFR 252.234-7004 - Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Cost and Software Data... AND CONTRACT CLAUSES Text of Provisions And Clauses 252.234-7004 Cost and Software Data Reporting System. (NOV 2010) As prescribed in 234.7101(b)(1), use the following clause: COST AND SOFTWARE DATA...
48 CFR 252.234-7004 - Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Cost and Software Data... AND CONTRACT CLAUSES Text of Provisions And Clauses 252.234-7004 Cost and Software Data Reporting System. (NOV 2010) As prescribed in 234.7101(b)(1), use the following clause: Cost and Software Data...
Software Measurement Guidebook. Version 02.00.02
1992-12-01
Only table-of-contents and index fragments of this guidebook survive in the record: Figure 9-2, Compatibility Testing Process; Figure 9-3, Development Effort Planning Curve; Figure 10-1; the Proposal Manager as the person responsible for describing and supporting the estimate; and material on collecting and analyzing requirements, design, code, and test data, including build/release ranges, variances, size growth, costs, completions, and units completing test.
Automated Reuse of Scientific Subroutine Libraries through Deductive Synthesis
NASA Technical Reports Server (NTRS)
Lowry, Michael R.; Pressburger, Thomas; VanBaalen, Jeffrey; Roach, Steven
1997-01-01
Systematic software construction offers the potential of elevating software engineering from an art form to an engineering discipline. The desired result is more predictable software development leading to better quality and more maintainable software. However, the overhead costs associated with the formalisms, mathematics, and methods of systematic software construction have largely precluded their adoption in real-world software development. In fact, many mainstream software development organizations, such as Microsoft, still maintain a predominantly oral culture for software development projects, which is far removed from a formalism-based culture for software development. An exception is the limited domain of safety-critical software, where the high assurance inherent in systematic software construction justifies the additional cost. We believe that systematic software construction will only be adopted by mainstream software development organizations when the overhead costs have been greatly reduced. Two approaches to cost mitigation are reuse (amortizing costs over many applications) and automation. For the last four years, NASA Ames has funded the Amphion project, whose objective is to automate software reuse through techniques from systematic software construction. In particular, deductive program synthesis (i.e., program extraction from proofs) is used to derive a composition of software components (e.g., subroutines) that correctly implements a specification. The construction of reuse libraries of software components is the standard software engineering solution for improving software development productivity and quality.
Saronga, Happiness Pius; Dalaba, Maxwell Ayindenaba; Dong, Hengjin; Leshabari, Melkizedeck; Sauerborn, Rainer; Sukums, Felix; Blank, Antje; Kaltschmidt, Jens; Loukanova, Svetla
2015-04-02
Poor quality of care is among the causes of the high maternal and newborn disease burden in Tanzania. A potential reason for poor quality of care is the existence of a "know-do gap" whereby health workers do not perform to the best of their knowledge. An electronic clinical decision support system (CDSS) for maternal health care was piloted in six rural primary health centers of Tanzania to improve the performance of health workers by facilitating adherence to World Health Organization (WHO) guidelines and ultimately improve the quality of maternal health care. This study aimed at assessing the cost of installing and operating the system in the health centers. This retrospective study was conducted in Lindi, Tanzania. Costs incurred by the project were analyzed using the ingredients approach. These costs broadly included vehicle, computers, furniture, facility, CDSS software, transport, personnel, training, supplies, and communication. These were grouped into installation and operation costs; recurrent and capital costs; and fixed and variable costs. We assessed the CDSS in terms of its financial and economic cost implications. We also conducted a sensitivity analysis on the estimations. The total financial cost of the CDSS intervention amounted to 185,927.78 USD; 77% of these costs were incurred in the installation phase and included all the activities in preparation for the actual operation of the system for client care. Generally, training made up the largest share of costs (33% of the total cost and more than half of the recurrent cost), followed by the CDSS software (32% of the total cost). There was a difference of 31.4% between the economic and financial costs. 92.5% of economic costs were fixed costs, consisting of inputs whose costs do not vary with the volume of activity within a given range. The economic cost per CDSS contact was 52.7 USD but was sensitive to discount rate, asset useful life, and input cost variations. Our study presents financial and economic cost estimates of installing and operating an electronic CDSS for maternal health care in six rural health centres. From these findings one can understand exactly what goes into a similar investment and thus determine the sorts of input modifications needed to fit their context.
Repository Planning, Design, and Engineering: Part II-Equipment and Costing.
Baird, Phillip M; Gunter, Elaine W
2016-08-01
Part II of this article discusses and provides guidance on the equipment and systems necessary to operate a repository. The various types of storage equipment and monitoring and support systems are presented in detail. While the material focuses on the large repository, the requirements for a small-scale startup are also presented. Cost estimates and a cost model for establishing a repository are presented. The cost model presents an expected range of acquisition costs for the large capital items in developing a repository. A range of 5,000-7,000 ft² constructed has been assumed, with 50 frozen storage units, to reflect a successful operation with growth potential. No design or engineering costs, permit or regulatory costs, or smaller items such as the computers, software, furniture, phones, and barcode readers required for operations have been included.
The cost of software fault tolerance
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1982-01-01
The proposed use of software fault tolerance techniques as a means of reducing software costs in avionics and as a means of addressing the issue of system unreliability due to faults in software is examined. A model is developed to provide a view of the relationships among cost, redundancy, and reliability which suggests strategies for software development and maintenance which are not conventional.
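The abstract does not reproduce Migneault's model, but the cost/redundancy/reliability relationship it examines can be illustrated with an N-version majority-voting calculation. This sketch assumes independent version failures, a strong assumption the fault-tolerance literature cautions against, and a unit development cost per version:

```python
# A minimal sketch of the cost/redundancy/reliability trade, assuming an
# N-version majority-voting scheme with independent version failures.
# This is an illustration, not the model developed in the paper.
from math import comb

def majority_vote_failure(p_fail: float, n: int) -> float:
    """P(system fails) when more than n/2 of n independent versions fail."""
    k_min = n // 2 + 1
    return sum(comb(n, k) * p_fail**k * (1 - p_fail)**(n - k)
               for k in range(k_min, n + 1))

UNIT_COST = 1.0  # relative cost of developing one version
for n in (1, 3, 5):
    print(f"n={n}: cost={n * UNIT_COST:.0f}x, "
          f"P(fail)={majority_vote_failure(0.01, n):.2e}")
```

Under independence, reliability improves rapidly while cost grows linearly with the number of versions; correlated faults erode the reliability gain without reducing the cost, which is one reason the cost/reliability relationship is worth modeling explicitly.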
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ouyang, Lizhi
An Advanced Ultra Supercritical Boiler (AUSC) requires materials that can operate in a corrosive environment at temperatures and pressures as high as 760°C (1400°F) and 5000 psi, respectively, while at the same time maintaining good ductility at low temperature. We developed automated simulation software tools to enable fast, large-scale screening studies of candidate designs. While direct evaluation of creep rupture strength and ductility is currently not feasible, properties such as energy, elastic constants, surface energy, interface energy, and stacking fault energy can be used to assess relative ductility and creep strength. We implemented software to automate the complex calculations and minimize human input in the tedious screening studies, which involve model structure generation, settings for first-principles calculations, and results analysis and reporting. The software developed in the project, and the library of computed mechanical properties of phases found in ferritic steels (many of them complex solid solutions estimated for the first time), will certainly help the development of low-cost ferritic steel for AUSC.
Open source software and low cost sensors for teaching UAV science
NASA Astrophysics Data System (ADS)
Kefauver, S. C.; Sanchez-Bragado, R.; El-Haddad, G.; Araus, J. L.
2016-12-01
Drones, also known as UASs (unmanned aerial systems), UAVs (unmanned aerial vehicles) or RPAS (remotely piloted aircraft systems), are both useful advanced scientific platforms and recreational toys that are appealing to younger generations. As such, they can make for excellent education tools as well as low-cost scientific research project alternatives. However, the process of moving from taking pretty pictures to remote sensing science can be daunting if one is presented with only expensive software and sensor options. There are a number of open-source tools and low-cost platform and sensor options available that can provide excellent scientific research results and, by often requiring more user involvement than commercial software and sensors, provide even greater educational benefits. Scale-invariant feature transform (SIFT) algorithm implementations are freely available, such as the Microsoft Image Composite Editor (ICE), which can create quality 2D image mosaics with some motion and terrain adjustments, and VisualSFM (Structure from Motion), which can provide full image mosaicking with movement and orthorectification capacities. RGB image quantification using alternate color space transforms, such as the BreedPix indices, can be calculated via plugins in the open-source software Fiji (http://fiji.sc/Fiji; http://github.com/george-haddad/CIMMYT). Recent analyses of aerial images from UAVs over different vegetation types and environments have shown that RGB metrics can outperform more costly commercial sensors. Specifically, hue-based pixel counts, the Triangle Greenness Index (TGI), and the Normalized Green Red Difference Index (NGRDI) consistently outperformed NDVI in estimating abiotic and biotic stress impacts on crop health. Also, simple kits are available for NDVI camera conversions. Furthermore, multivariate analyses of the different RGB indices in the R program for statistical computing, such as classification and regression trees, can allow for a more approachable interpretation of results in the classroom.
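As a concrete illustration, the two RGB indices named above have simple closed forms. The sketch below assumes float image arrays and uses a common RGB approximation for the TGI coefficients; field workflows like the one described use the Fiji/BreedPix plugins rather than hand-rolled code.

```python
# A minimal sketch of the NGRDI and TGI indices on RGB arrays; the TGI
# coefficients are a widely used RGB approximation, an assumption here.
import numpy as np

def ngrdi(r, g, b):
    """Normalized Green Red Difference Index: (G - R) / (G + R)."""
    return (g - r) / (g + r + 1e-12)   # epsilon guards dark pixels

def tgi(r, g, b):
    """Triangle Greenness Index, common RGB approximation."""
    return g - 0.39 * r - 0.61 * b

rgb = np.random.rand(3, 100, 100)      # stand-in for a UAV image mosaic
r, g, b = rgb
print(ngrdi(r, g, b).mean(), tgi(r, g, b).mean())
```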
ERIC Educational Resources Information Center
Thankachan, Briju; Moore, David Richard
2017-01-01
The use of Free and Open Source Software (FOSS), a subset of Information and Communication Technology (ICT), can reduce the cost of purchasing software. Despite the benefit in the initial purchase price of software, deploying software requires total cost that goes beyond the initial purchase price. Total cost is a silent issue of FOSS and can only…
Treatment Cost Analysis Tool (TCAT) for Estimating Costs of Outpatient Treatment Services
Flynn, Patrick M.; Broome, Kirk M.; Beaston-Blaakman, Aaron; Knight, Danica K.; Horgan, Constance M.; Shepard, Donald S.
2009-01-01
A Microsoft® Excel-based workbook designed for research analysts to use in a national study was retooled for treatment program directors and financial officers to allocate, analyze, and estimate outpatient treatment costs in the U.S. This instrument can also be used as a planning and management tool to optimize resources and forecast the impact of future changes in staffing, client flow, program design, and other resources. The Treatment Cost Analysis Tool (TCAT) automatically provides feedback and generates summaries and charts using comparative data from a national sample of non-methadone outpatient providers. TCAT is being used by program staff to capture and allocate both economic and accounting costs, and outpatient service costs are reported for a sample of 70 programs. Costs for an episode of treatment in regular, intensive, and mixed types of outpatient treatment types were $882, $1,310, and $1,381 respectively (based on 20% trimmed means and 2006 dollars). An hour of counseling cost $64 in regular, $85 intensive, and $86 mixed. Group counseling hourly costs per client were $8, $11, and $10 respectively for regular, intensive, and mixed. Future directions include use of a web-based interview version, much like some of the commercially available tax preparation software tools, and extensions for use in other modalities of treatment. PMID:19004576
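The per-episode figures above are 20% trimmed means, a robust summary that drops extreme observations before averaging. A minimal sketch, assuming the convention of trimming 20% of observations from each tail; the episode costs below are made up, not the study's data.

```python
# A minimal sketch of a 20% trimmed mean of per-episode treatment costs,
# assuming 20% trimmed from each tail; the cost list is illustrative.
import numpy as np

def trimmed_mean(values, prop=0.20):
    v = np.sort(np.asarray(values, dtype=float))
    k = int(len(v) * prop)            # observations to drop per tail
    return v[k:len(v) - k].mean()

episode_costs = [410, 655, 790, 880, 905, 990, 1100, 1350, 2600, 9800]
print(f"mean: {np.mean(episode_costs):.0f}, "
      f"20% trimmed: {trimmed_mean(episode_costs):.0f}")  # outliers excluded
```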
Treatment Cost Analysis Tool (TCAT) for estimating costs of outpatient treatment services.
Flynn, Patrick M; Broome, Kirk M; Beaston-Blaakman, Aaron; Knight, Danica K; Horgan, Constance M; Shepard, Donald S
2009-02-01
A Microsoft Excel-based workbook designed for research analysts to use in a national study was retooled for treatment program directors and financial officers to allocate, analyze, and estimate outpatient treatment costs in the U.S. This instrument can also be used as a planning and management tool to optimize resources and forecast the impact of future changes in staffing, client flow, program design, and other resources. The Treatment Cost Analysis Tool (TCAT) automatically provides feedback and generates summaries and charts using comparative data from a national sample of non-methadone outpatient providers. TCAT is being used by program staff to capture and allocate both economic and accounting costs, and outpatient service costs are reported for a sample of 70 programs. Costs for an episode of treatment in regular, intensive, and mixed types of outpatient treatment were $882, $1310, and $1381 respectively (based on 20% trimmed means and 2006 dollars). An hour of counseling cost $64 in regular, $85 intensive, and $86 mixed. Group counseling hourly costs per client were $8, $11, and $10 respectively for regular, intensive, and mixed. Future directions include use of a web-based interview version, much like some of the commercially available tax preparation software tools, and extensions for use in other modalities of treatment.
Lessons Learned for Planning and Estimating Operations Support Requirements
NASA Technical Reports Server (NTRS)
Newhouse, Marilyn
2011-01-01
Operations (phase E) costs are typically small compared to the spacecraft development and test costs. This, combined with the long lead time for realizing operations costs, can lead projects to focus on hardware development schedules and costs, de-emphasizing estimation of operations support requirements during proposal, early design, and replan cost exercises. The Discovery and New Frontiers (D&NF) programs comprise small, cost-capped missions supporting scientific exploration of the solar system. Even moderate yearly underestimates of the operations costs can present significant LCC impacts for deep space missions with long operational durations, and any LCC growth can directly impact the programs' ability to fund new missions. The D&NF Program Office at Marshall Space Flight Center recently studied cost overruns for 7 D&NF missions related to phase C/D development of operational capabilities and phase E mission operations. The goal was to identify the underlying causes for the overruns and develop practical mitigations to assist the D&NF projects in identifying potential operations risks and controlling the associated impacts to operations development and execution costs. The study found that the drivers behind these overruns include overly optimistic assumptions regarding the savings resulting from the use of heritage technology, late development of operations requirements, inadequate planning for sustaining engineering and the special requirements of long duration missions (e.g., knowledge retention and hardware/software refresh), and delayed completion of ground system development work. This presentation summarizes the study and the results, providing a set of lessons NASA can use to improve early estimation and validation of operations costs.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-05
.... We prepared a regulatory evaluation of the estimated costs to comply with this AD and placed it in... Friday, except Federal holidays. The AD docket contains the NPRM, the regulatory evaluation, any comments... corresponding requirements of paragraph (g) of this AD, install FWC software standard T2-0 in accordance with...
2006-03-01
Only fragments of this thesis survive in the record: since the confidence interval included zero, there was insufficient evidence to indicate that the error mean is not zero; the Breusch-Pagan test was used to test the constant-variance assumption; table-of-contents entries cover multicollinearity and testing OLS assumptions; and the discussion cites programming styles used by developers (Stamelos and others, 2003:733) and Kemerer's tests of models utilizing SLOC as an independent variable.
Dietary intake assessment using integrated sensors and software
NASA Astrophysics Data System (ADS)
Shang, Junqing; Pepin, Eric; Johnson, Eric; Hazel, David; Teredesai, Ankur; Kristal, Alan; Mamishev, Alexander
2012-02-01
The area of dietary assessment is becoming increasingly important as obesity rates soar, but valid measurement of the food intake in free-living persons is extraordinarily challenging. Traditional paper-based dietary assessment methods have limitations due to bias, user burden and cost, and therefore improved methods are needed to address important hypotheses related to diet and health. In this paper, we will describe the progress of our mobile Diet Data Recorder System (DDRS), where an electronic device is used for objective measurement on dietary intake in real time and at moderate cost. The DDRS consists of (1) a mobile device that integrates a smartphone and an integrated laser package, (2) software on the smartphone for data collection and laser control, (3) an algorithm to process acquired data for food volume estimation, which is the largest source of error in calculating dietary intake, and (4) database and interface for data storage and management. The estimated food volume, together with direct entries of food questionnaires and voice recordings, could provide dietitians and nutritional epidemiologists with more complete food description and more accurate food portion sizes. In this paper, we will describe the system design of DDRS and initial results of dietary assessment.
Understanding and Predicting the Process of Software Maintenance Releases
NASA Technical Reports Server (NTRS)
Basili, Victor; Briand, Lionel; Condon, Steven; Kim, Yong-Mi; Melo, Walcelio L.; Valett, Jon D.
1996-01-01
One of the major concerns of any maintenance organization is to understand and estimate the cost of maintenance releases of software systems. Planning the next release so as to maximize the increase in functionality and the improvement in quality is vital to successful maintenance management. The objective of this paper is to present the results of a case study in which an incremental approach was used to better understand the effort distribution of releases and build a predictive effort model for software maintenance releases. This study was conducted in the Flight Dynamics Division (FDD) of NASA Goddard Space Flight Center (GSFC). This paper presents three main results: 1) a predictive effort model developed for the FDD's software maintenance release process; 2) measurement-based lessons learned about the maintenance process in the FDD; and 3) a set of lessons learned about the establishment of a measurement-based software maintenance improvement program. In addition, this study provides insights and guidelines for obtaining similar results in other maintenance organizations.
A Novel Collaboration to Reduce the Travel-Related Cost of Residency Interviewing.
Shappell, Eric; Fant, Abra; Schnapp, Benjamin; Craig, Jill P; Ahn, James; Babcock, Christine; Gisondi, Michael A
2017-04-01
Interviewing for residency is a complicated and often expensive endeavor. Literature has estimated interview costs of $4,000 to $15,000 per applicant, mostly attributable to travel and lodging. The authors sought to reduce these costs and improve the applicant interview experience by coordinating interview dates between two residency programs in Chicago, Illinois. Two emergency medicine residency programs scheduled contiguous interview dates for the 2015-2016 interview season. We used a survey to assess applicant experiences interviewing in Chicago and attitudes regarding coordinated scheduling. Data on utilization of coordinated dates were obtained from interview scheduling software. The target group for this intervention consisted of applicants from medical schools outside Illinois who completed interviews at both programs. Of the 158 applicants invited to both programs, 84 (53%) responded to the survey. Scheduling data were available for all applicants. The total estimated cost savings for target applicants coordinating interview dates was $13,950. The majority of target applicants reported that this intervention increased the ease of scheduling (84%), made them less likely to cancel the interview (82%), and saved them money (71%). Coordinated scheduling of interview dates was associated with significant estimated cost savings and was reviewed favorably by applicants across all measures of experience. Expanding use of this practice geographically and across specialties may further reduce the cost of interviewing for applicants.
Impact of Agile Software Development Model on Software Maintainability
ERIC Educational Resources Information Center
Gawali, Ajay R.
2012-01-01
Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…
NASA Astrophysics Data System (ADS)
Uzunoglu, B.; Hussaini, Y.
2017-12-01
The implicit particle filter is a sequential Monte Carlo method for data assimilation that guides particles to high-probability regions via an implicit step. It optimizes a nonlinear cost function that can be inherited from legacy assimilation routines. Dynamic state estimation for near-real-time applications in power systems is becoming increasingly important with the integration of variable wind and solar power generation. New advanced state estimation tools that will replace the older generation of state estimators should, in addition to providing a general framework for handling complexity, be able to address legacy software and integrate that software into a mathematical framework, accommodating the power industry's need for cautious, evolutionary change rather than a complete revolution, while addressing nonlinearity and non-normal behaviour. This work implements the implicit particle filter as a state estimation tool for the estimation of the states of a power system and presents the first implicit particle filter application study on power system state estimation. The implicit particle filter is introduced into power systems, and simulations are presented for a three-node benchmark power system. The performance of the filter on the presented problem is analyzed and the results are presented.
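For contrast with the implicit variant described above, the sketch below implements a standard bootstrap (sampling-importance-resampling) particle filter on a scalar linear-Gaussian state; the implicit filter additionally solves an optimization step per particle to steer samples toward high-probability regions. All dynamics and noise parameters are illustrative, not taken from the study.

```python
# A minimal bootstrap particle filter for scalar state estimation; the
# implicit variant adds a per-particle optimization step not shown here.
import numpy as np

rng = np.random.default_rng(0)
N = 1000                        # number of particles
Q, R = 0.1, 0.5                 # process / measurement noise variances

x_true, particles = 0.0, rng.normal(0.0, 1.0, N)
for _ in range(50):
    x_true = 0.95 * x_true + rng.normal(0, np.sqrt(Q))        # true dynamics
    z = x_true + rng.normal(0, np.sqrt(R))                    # measurement
    particles = 0.95 * particles + rng.normal(0, np.sqrt(Q), N)  # propagate
    w = np.exp(-0.5 * (z - particles) ** 2 / R)               # likelihood
    w /= w.sum()
    particles = rng.choice(particles, size=N, p=w)            # resample
print(f"true={x_true:+.3f}  estimate={particles.mean():+.3f}")
```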
NASA Astrophysics Data System (ADS)
Benachenhou, D.
2009-04-01
Information-technology departments in large enterprises spend 40% of their budget on information integration: combining information from different data sources into a coherent form. IDC, a market-intelligence firm, estimates that the market for data integration and access software (which includes the key enabling technology for information integration) was about $2.5 billion in 2007 and is expected to grow to $3.8 billion in 2012. This is only the cost estimate for structured, or traditional, database information integration. Just imagine the market for transforming text into structured information and subsequently fusing it with traditional databases.
Software Solution Saves Dollars
ERIC Educational Resources Information Center
Trotter, Andrew
2004-01-01
This article discusses computer software that can give classrooms and computer labs the capabilities of costly PCs at a small fraction of the cost. A growing number of cost-conscious school districts are finding budget relief in low-cost computer software known as "open source" that can do everything from manage school Web sites to equip…
Horn, Brady P; Barragan, Gary N; Fore, Chis; Bonham, Caroline A
2016-01-01
The purpose of this study was to model the cost of delivering behavioural health services to rural Native American populations using telecommunications and compare these costs with the travel costs associated with providing equivalent care. Behavioural telehealth costs were modelled using equipment, transmission, administrative and IT costs from an established telecommunications centre. Two types of travel models were estimated: a patient travel model and a physician travel model. These costs were modelled using the New Mexico resource geographic information system program (RGIS) and ArcGIS software, with unit costs (e.g. fuel prices, vehicle depreciation, lodging, physician wages, and patient wages) obtained from the literature and US government agencies. The average per-patient cost of providing behavioural healthcare via telehealth was US$138.34, and the average per-patient travel cost was US$169.76 for physicians and US$333.52 for patients. Sensitivity analysis found these results to be rather robust to changes in imputed parameters, and preliminary evidence of economies of scale was found. Besides the obvious benefits of increased access to healthcare and reduced health disparities, providing behavioural telehealth for rural Native American populations was estimated to be less costly than modelled equivalent care provided by travelling. Additionally, as administrative and coordination costs are a major component of telehealth costs, as programmes grow to serve more patients, the relative costs of this initial infrastructure, as well as overall per-patient costs, should decrease. © The Author(s) 2015.
The cost of work-related physical assaults in Minnesota.
McGovern, P; Kochevar, L; Lohman, W; Zaidman, B; Gerberich, S G; Nyman, J; Findorff-Dennis, M
2000-01-01
OBJECTIVE: To describe the long-term productivity costs of occupational assaults. DATA SOURCES/STUDY SETTING: All incidents of physical assaults that resulted in indemnity payments, identified from the Minnesota Department of Labor and Industry (DLI) Workers' Compensation system in 1992. Medical expenditures were obtained from insurers, and data on lost wages, legal fees, and permanency ratings were collected from DLI records. Insurance administrative expenses were estimated. Lost fringe benefits and household production losses were imputed. STUDY DESIGN: The human capital approach was used to describe the long-term costs of occupational assaults. Economic software was used to apply a modified version of Rice, MacKenzie, and Associates' (1989) model for estimating the present value of past losses from 1992 through 1995 for all cases, and the future losses for cases open in 1996. PRINCIPAL FINDINGS: The total costs for 344 nonfatal work-related assaults were estimated at $5,885,448 (1996 dollars). Calculation of injury incidence and average costs per case and per employee identified populations with an elevated risk of assault. An analysis by industry revealed an elevated risk for workers employed in justice and safety (incidence: 198/100,000; $19,251 per case; $38 per employee), social service (incidence: 127/100,000; $24,210 per case; $31 per employee), and health care (incidence: 76/100,000; $13,197 per case; $10 per employee). CONCLUSIONS: Identified subgroups warrant attention for risk factor identification and prevention efforts. Cost estimates can serve as the basis for business calculations on the potential value of risk management interventions. PMID:10966089
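The human capital approach cited above values losses by discounting projected streams of lost wages and benefits to present value. A minimal sketch with illustrative numbers; the study's actual model, adapted from Rice, MacKenzie, and Associates (1989), is considerably more detailed.

```python
# A minimal present-value calculation in the spirit of the human capital
# approach; the annual loss, horizon, and 3% discount rate are
# illustrative assumptions, not the study's parameters.
def present_value(annual_loss: float, years: int, rate: float = 0.03) -> float:
    """Discount a constant annual loss stream to present value."""
    return sum(annual_loss / (1 + rate) ** t for t in range(1, years + 1))

# e.g., $20,000/year of lost wages and benefits over 10 years
print(f"PV of future losses: ${present_value(20_000, 10):,.0f}")
```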
Planning and Estimation of Operations Support Requirements
NASA Technical Reports Server (NTRS)
Newhouse, Marilyn E.; Barley, Bryan; Bacskay, Allen; Clardy, Dennon
2010-01-01
Life Cycle Cost (LCC) estimates during the proposal and early design phases, as well as project replans during the development phase, are heavily focused on hardware development schedules and costs. Operations (phase E) costs are typically small compared to the spacecraft development and test costs. This, combined with the long lead time for realizing operations costs, can lead to de-emphasizing estimation of operations support requirements during proposal, early design, and replan cost exercises. The Discovery and New Frontiers (D&NF) programs comprise small, cost-capped missions supporting scientific exploration of the solar system. Any LCC growth can directly impact the programs' ability to fund new missions, and even moderate yearly underestimates of the operations costs can present significant LCC impacts for deep space missions with long operational durations. The National Aeronautics and Space Administration (NASA) D&NF Program Office at Marshall Space Flight Center (MSFC) recently studied cost overruns and schedule delays for 5 missions. The goal was to identify the underlying causes for the overruns and delays, and to develop practical mitigations to assist the D&NF projects in identifying potential risks and controlling the associated impacts to proposed mission costs and schedules. The study found that 4 out of the 5 missions studied had significant overruns at or after launch due to underestimation of the complexity and supporting requirements for operations activities; the fifth mission had not yet launched at the time of the study. The drivers behind these overruns include overly optimistic assumptions regarding the savings resulting from the use of heritage technology, late development of operations requirements, inadequate planning for sustaining engineering and the special requirements of long duration missions (e.g., knowledge retention and hardware/software refresh), and delayed completion of ground system development work. This paper updates the D&NF LCC study, looking at the operations (phase E) cost drivers in more detail, extends the study to include 2 additional missions, and identifies areas for increased emphasis by project management in order to improve the fidelity of operations estimates.
Spacelab experiment computer study. Volume 1: Executive summary (presentation)
NASA Technical Reports Server (NTRS)
Lewis, J. L.; Hodges, B. C.; Christy, J. O.
1976-01-01
A quantitative cost for various Spacelab flight hardware configurations is provided along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented. The cost study is discussed based on utilization of a central experiment computer with optional auxiliary equipment. Ground rules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented. The ground rules and assumptions are analyzed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.
NASA Astrophysics Data System (ADS)
Bearden, David A.; Duclos, Donald P.; Barrera, Mark J.; Mosher, Todd J.; Lao, Norman Y.
1997-12-01
Emerging technologies and micro-instrumentation are changing the way remote sensing spacecraft missions are developed and implemented. Government agencies responsible for procuring space systems are increasingly requesting analyses to estimate cost, performance, and design impacts of advanced technology insertion for both state-of-the-art systems and systems to be built 5 to 10 years in the future. Numerous spacecraft technology development programs are being sponsored by Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) agencies with the goal of enhancing spacecraft performance, reducing mass, and reducing cost. However, it is often the case that technology studies, in the interest of maximizing subsystem-level performance and/or mass reduction, do not anticipate synergistic system-level effects. Furthermore, even though technical risks are often identified as one of the largest cost drivers for space systems, many cost/design processes and models ignore effects of cost risk in the interest of quick estimates. To address these issues, the Aerospace Corporation developed a concept analysis methodology and associated software tools. These tools, collectively referred to as the Concept Analysis and Design Evaluation Toolkit (CADET), facilitate system architecture studies and space system conceptual designs focusing on design heritage, technology selection, and associated effects on cost, risk, and performance at the system and subsystem level. CADET allows: (1) quick response to technical design and cost questions; (2) assessment of the cost and performance impacts of existing and new designs/technologies; and (3) estimation of cost uncertainties and risks. These capabilities aid mission designers in determining the configuration of remote sensing missions that meet essential requirements in a cost-effective manner. This paper discusses the development of CADET modules and their application to several remote sensing satellite mission concepts.
Kenneth A. Baerenklau; Armando González-Cabán; Catrina I. Páez; Edgard Chávez
2009-01-01
The U.S. Forest Service is responsible for developing tools to facilitate effective and efficient fire management on wildlands and urban-wildland interfaces. Existing GIS-based fire modeling software only permits estimation of the costs of fire prevention and mitigation efforts as well as the effects of those efforts on fire behavior. This research demonstrates how the...
Chris B. LeDoux; Gary W. Miller
2008-01-01
In this study we used data from 16 Appalachian hardwood stands, a growth and yield computer simulation model, and stump-to-mill logging cost-estimating software to evaluate the optimal economic timing of crop tree release (CTR) treatments. The simulated CTR treatments consisted of one-time logging operations at stand age 11, 23, 31, or 36 years, with the residual...
Code of Federal Regulations, 2010 CFR
2010-04-01
... computer hardware or software, or both, the cost of contracting for those services, or the cost of... operating budget. At the HA's option, the cost of the computer software may include service contracts to...
48 CFR 1845.7101-3 - Unit acquisition cost.
Code of Federal Regulations, 2010 CFR
2010-10-01
... services for designs, plans, specifications, and surveys. (6) Acquisition and preparation costs of... acquisition cost is under $100,000, it shall be reported as under $100,000. (g) Software acquisition costs include software costs incurred up through acceptance testing and material internal costs incurred to...
S-1 project. Volume I. Architecture. 1979 annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-01-01
The US Navy is one of the world's largest users of digital computing equipment having a procurement cost of at least $50,000, and is the single largest such computer customer in the Department of Defense. Its projected acquisition plan for embedded computer systems during the first half of the 80s contemplates the installation of over 10,000 such systems at an estimated cost of several billions of dollars. This expenditure, though large, is dwarfed by the 85 billion dollars which DOD is projected to spend during the next half-decade on computer software, the near-majority of which will be spent by the Navy; the life-cycle costs of the 700,000+ lines of software for a single large Navy weapons systems application (e.g., AEGIS) have been conservatively estimated at most of a billion dollars. The S-1 Project is dedicated to realizing potentially large improvements in the efficiency with which such very large sums may be spent, so that greater military effectiveness may be secured earlier, and with smaller expenditures. The fundamental objectives of the S-1 Project's work are first to enable the Navy to quickly, reliably, and inexpensively evaluate at any time what is available from the state of the art in digital processing systems and what the relevance of such systems may be to Navy data processing applications; and second to provide reference prototype systems to support possible competitive procurement action leading to deployment of such systems.
ERIC Educational Resources Information Center
Mitchell, Susan Marie
2012-01-01
Uncontrollable costs, schedule overruns, and poor end product quality continue to plague the software engineering field. Innovations formulated with the expectation to minimize or eliminate cost, schedule, and quality problems have generally fallen into one of three categories: programming paradigms, software tools, and software process…
Component Prioritization Schema for Achieving Maximum Time and Cost Benefits from Software Testing
NASA Astrophysics Data System (ADS)
Srivastava, Praveen Ranjan; Pareek, Deepak
Software testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results. Defining the end of software testing is a crucial feature of any software development project. A premature release will involve risks like undetected bugs, the cost of fixing faults later, and discontented customers. Any software organization would want to achieve the maximum possible benefits from software testing with minimum resources. Testing time and cost need to be optimized for achieving a competitive edge in the market. In this paper, we propose a schema, called the Component Prioritization Schema (CPS), to achieve an effective and uniform prioritization of the software components. This schema serves as an extension to the Non-Homogeneous Poisson Process (NHPP)-based Cumulative Priority Model. We also introduce an approach for handling time-intensive versus cost-intensive projects.
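The Cumulative Priority Model that the schema extends builds on a Non-Homogeneous Poisson Process. A minimal sketch of that kind of calculation, using the Goel-Okumoto mean value function m(t) = a(1 - e^(-bt)) with illustrative, unfitted parameters:

```python
# A minimal NHPP reliability-growth calculation of the kind the schema
# builds on; a (total expected faults) and b (detection rate) are
# illustrative assumptions, not values fitted to project data.
import math

def expected_faults(t: float, a: float = 120.0, b: float = 0.05) -> float:
    """Goel-Okumoto mean value function: cumulative faults found by time t."""
    return a * (1.0 - math.exp(-b * t))

total, horizon = 120.0, 40.0
remaining = total - expected_faults(horizon)
print(f"expected remaining faults after {horizon:.0f} units of test: "
      f"{remaining:.1f}")  # fewer expected remaining faults, lower priority
```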
Practical Methods for Estimating Software Systems Fault Content and Location
NASA Technical Reports Server (NTRS)
Nikora, A.; Schneidewind, N.; Munson, J.
1999-01-01
Over the past several years, we have developed techniques to discriminate between fault-prone software modules and those that are not, to estimate a software system's residual fault content, to identify those portions of a software system having the highest estimated number of faults, and to estimate the effects of requirements changes on software quality.
Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.
Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander
2018-04-10
Many models have been developed for predicting software reliability, but each is restricted to particular methodologies and a limited number of parameters. There are a number of techniques and methodologies that may be used for reliability prediction, and there is a need to focus on parameter selection when estimating reliability: the reliability of a system may increase or decrease depending on the parameters used, so the factors that heavily affect the reliability of the system must be identified. Reusability is now widely used across many areas of research and is the basis of Component-Based Systems (CBS). Cost, time, and human skill can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities are available for applying soft computing techniques to problems in medicine: clinical medicine makes significant use of fuzzy logic and neural network methodologies, while basic medical science most frequently and preferably uses neural networks and genetic algorithms, and medical scientists have shown sustained interest in using soft computing methodologies in the genetics, physiology, radiology, cardiology, and neurology disciplines. CBSE encourages users to reuse past and existing software in making new products, providing quality with a saving of time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques: the Genetic Algorithm (GA), Neural Networks (NN), Fuzzy Logic, the Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and the Artificial Bee Colony (ABC). The paper presents the working of these soft computing techniques and assesses their use in predicting reliability; the parameters considered in estimating and predicting reliability are also discussed. This study can be used in estimating and predicting the reliability of various instruments used in medical systems, and in software, computer, and mechanical engineering. These concepts can be applied to both software and hardware to predict reliability using CBSE.
Prediction and Estimation of Scaffold Strength with different pore size
NASA Astrophysics Data System (ADS)
Muthu, P.; Mishra, Shubhanvit; Sri Sai Shilpa, R.; Veerendranath, B.; Latha, S.
2018-04-01
This paper emphasizes the significance of predicting and estimating the mechanical strength of 3D functional scaffolds before the manufacturing process. Prior evaluation of the mechanical strength and structural properties of the scaffold will reduce fabrication cost and ease the design process. Detailed analysis and investigation of various mechanical properties, including shear stress equivalence, have helped to estimate the effect of porosity and pore size on the functionality of the scaffold. The influence of variation in porosity was examined by a computational approach via finite element analysis (FEA) and the ANSYS application software. The results indicate the adequacy of this evaluation method for regulating and optimizing the intricate engineering design process.
ROI Analysis of the System Architecture Virtual Integration Initiative
2018-04-01
Only fragments of this report survive in the record: the ROI analysis uses conservative estimates of costs and benefits, especially for those parameters that have a proven, strong correlation to the overall... formula; in Section 3, the exponential growth of avionics software systems in terms of SLOC is discussed by analyzing historical data to correlate... which implies that the system has good structure (high cohesion, low coupling) and good application clarity (good correlation between program and...
Coal gasification systems engineering and analysis. Appendix H: Work breakdown structure
NASA Technical Reports Server (NTRS)
1980-01-01
A work breakdown structure (WBS) is presented which encompasses the multiple facets (hardware, software, services, and other tasks) of the coal gasification program. The WBS is shown to provide the basis for the following: management and control; cost estimating; budgeting and reporting; scheduling activities; organizational structuring; specification tree generation; weight allocation and control; and procurement and contracting activities. It also serves as a tool for program evaluation.
A Toolchain for the Detection of Structural and Behavioral Latent System Properties
2011-03-19
Only fragments of this report survive in the record: detecting latent structural and behavioral properties early reduces the number of defects propagated to successive phases, since in software development the cost to repair a defect increases...
Autotune Calibrates Models to Building Use Data
None
2018-01-16
Models of existing buildings are currently unreliable unless calibrated manually by a skilled professional. Autotune, as the name implies, automates this process by calibrating the model of an existing building to measured data, and is now available as open source software. This enables private businesses to incorporate Autotune into their products so that their customers can more effectively estimate cost savings of reduced energy consumption measures in existing buildings.
Nkonki, Lungiswa Ll; Chola, Lumbwe L; Tugendhaft, Aviva A; Hofman, Karen K
2017-08-28
To estimate the costs and impact on reducing child mortality of scaling up interventions that can be delivered by community health workers at the community level, from a provider's perspective. In this study, we used the Lives Saved Tool (LiST), a module in the Spectrum software. Within Spectrum, LiST interacts with other modules, the AIDS Impact Module, Family Planning Module, and Demography Projections Module (DemProj), to model the impact of more than 60 interventions that affect cause-specific mortality. DemProj was based on national South African data. A total of nine interventions were considered, namely breastfeeding promotion, complementary feeding, vitamin supplementation, hand washing with soap, hygienic disposal of children's stools, oral rehydration solution, oral antibiotics for the treatment of pneumonia, therapeutic feeding for wasting, and treatment for moderate malnutrition, with the outcome of reducing child mortality. The nine interventions can prevent 8891 deaths by 2030. Hand washing with soap (21%) accounts for the highest number of deaths prevented, followed by therapeutic feeding (19%) and oral rehydration therapy (16%). The top 5 interventions account for 77% of all deaths prevented. At scale, an estimated cost of US$169.5 million (US$3 per capita) per year will be required in community health worker costs. The use of community health workers offers enormous opportunities for saving lives. These programmes require appropriate financial investments. Findings from this study show what can be achieved if concerted effort is channelled towards the identified set of life-saving interventions. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
A Networked Sensor System for the Analysis of Plot-Scale Hydrology.
Villalba, German; Plaza, Fernando; Zhong, Xiaoyang; Davis, Tyler W; Navarro, Miguel; Li, Yimei; Slater, Thomas A; Liang, Yao; Liang, Xu
2017-03-20
This study presents the latest updates to the Audubon Society of Western Pennsylvania (ASWP) testbed, a $50,000 USD, 104-node outdoor multi-hop wireless sensor network (WSN). The network collects environmental data from over 240 sensors, including the EC-5, MPS-1 and MPS-2 soil moisture and soil water potential sensors and self-made sap flow sensors, across a heterogeneous deployment comprised of MICAz, IRIS and TelosB wireless motes. A low-cost sensor board and software driver was developed for communicating with the analog and digital sensors. Innovative techniques (e.g., balanced energy efficient routing and heterogeneous over-the-air mote reprogramming) maintained high success rates (>96%) and enabled effective software updating, throughout the large-scale heterogeneous WSN. The edaphic properties monitored by the network showed strong agreement with data logger measurements and were fitted to pedotransfer functions for estimating local soil hydraulic properties. Furthermore, sap flow measurements, scaled to tree stand transpiration, were found to be at or below potential evapotranspiration estimates. While outdoor WSNs still present numerous challenges, the ASWP testbed proves to be an effective and (relatively) low-cost environmental monitoring solution and represents a step towards developing a platform for monitoring and quantifying statistically relevant environmental parameters from large-scale network deployments.
A Networked Sensor System for the Analysis of Plot-Scale Hydrology
Villalba, German; Plaza, Fernando; Zhong, Xiaoyang; Davis, Tyler W.; Navarro, Miguel; Li, Yimei; Slater, Thomas A.; Liang, Yao; Liang, Xu
2017-01-01
This study presents the latest updates to the Audubon Society of Western Pennsylvania (ASWP) testbed, a $50,000 USD, 104-node outdoor multi-hop wireless sensor network (WSN). The network collects environmental data from over 240 sensors, including the EC-5, MPS-1 and MPS-2 soil moisture and soil water potential sensors and self-made sap flow sensors, across a heterogeneous deployment comprised of MICAz, IRIS and TelosB wireless motes. A low-cost sensor board and software driver was developed for communicating with the analog and digital sensors. Innovative techniques (e.g., balanced energy efficient routing and heterogeneous over-the-air mote reprogramming) maintained high success rates (>96%) and enabled effective software updating, throughout the large-scale heterogeneous WSN. The edaphic properties monitored by the network showed strong agreement with data logger measurements and were fitted to pedotransfer functions for estimating local soil hydraulic properties. Furthermore, sap flow measurements, scaled to tree stand transpiration, were found to be at or below potential evapotranspiration estimates. While outdoor WSNs still present numerous challenges, the ASWP testbed proves to be an effective and (relatively) low-cost environmental monitoring solution and represents a step towards developing a platform for monitoring and quantifying statistically relevant environmental parameters from large-scale network deployments. PMID:28335534
32 CFR 37.550 - May I accept intellectual property as cost sharing?
Code of Federal Regulations, 2013 CFR
2013-07-01
... software) as cost sharing, because: (1) It is difficult to assign values to these intangible contributions... offer the use of commercially available software for which there is an established license fee for use of the product. The costs of the development of the software would not be a reasonable basis for...
32 CFR 37.550 - May I accept intellectual property as cost sharing?
Code of Federal Regulations, 2011 CFR
2011-07-01
... offer the use of commercially available software for which there is an established license fee for use of the product. The costs of the development of the software would not be a reasonable basis for... software) as cost sharing, because: (1) It is difficult to assign values to these intangible contributions...
The Effect of Utilization Review on Emergency Department Operations.
Desai, Shoma; Gruber, Phillip F; Eiting, Erick; Seabury, Seth A; Mack, Wendy J; Voyageur, Christian; Vasquez, Veronica; Kim, Hyung T; Terp, Sophie
2017-11-01
Increasingly, hospitals are using utilization review software to reduce hospital admissions in an effort to contain costs. Such practices have the potential to increase the number of unsafe discharges, particularly in public safety-net hospitals. Utilization review software tools are not well studied with regard to their effect on emergency department (ED) operations. We study the effect of prospectively used admission decision support on ED operations. In 2012, Los Angeles County + University of Southern California Medical Center implemented prospective use of computerized admission criteria. After implementation, only ED patients meeting primary review (diagnosis-based criteria) or secondary review (medical necessity as determined by an on-site emergency physician) were assigned inpatient beds. Data were extracted from electronic medical records from September 2011 through December 2013. Outcomes included operational metrics, 30-day ED revisits, and 30-day admission rates. Excluding a 6-month implementation period, monthly summary metrics were compared pre- and postimplementation with nonparametric and negative binomial regression methods. All adult ED visits, excluding incarcerated and purely behavioral health visits, were analyzed. The primary outcomes were disposition rates. Secondary outcomes were 30-day ED revisits, 30-day admission rate among return visitors to the ED, and estimated cost. Analysis of 245,662 ED encounters was performed. The inpatient admission rate decreased from 14.2% to 12.8%. Increases in discharge rate (82.4% to 83.4%) and ED observation unit utilization (2.5% to 3.4%) were found. Thirty-day revisits increased (20.4% to 24.4%), although the 30-day admission rate decreased (3.2% to 2.8%). Estimated cost savings totaled $193.17 per ED visit. The prospective application of utilization review software in the ED led to a decrease in the admission rate. This was tempered by a concomitant increase in ED observation unit utilization and 30-day ED revisits. Cost savings suggest that resources should be redirected to the more highly affected ED and ED observation unit, although more work is needed to confirm the generalizability of these findings. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Vitásek, Stanislav; Matějka, Petr
2017-09-01
The article deals with problematic aspects of automated quantity takeoff (QTO) processing from data generated in a BIM model. It focuses on models of road construction and uses volumes and dimensions of excavation work to create an estimate of construction costs. The article uses a case study and explorative methods to discuss possibilities and problems of data transfer from a model to a price system of construction production when such transfer is used for price estimates of construction works. Current QTOs and price tenders are made with 2D documents. This process is becoming obsolete because more modern tools can be used. The BIM phenomenon enables partial automation in processing volumes and dimensions of construction units and matching the data to units in a given price scheme. Therefore, the price of construction can be estimated and structured without lengthy and often imprecise manual calculations. The use of BIM for QTO is highly dependent on local market budgeting systems; therefore, a proper push/pull strategy is required, along with proper requirements specification, a compatible pricing database, and software.
NASA Astrophysics Data System (ADS)
Rios, J. Fernando; Ye, Ming; Wang, Liying; Lee, Paul Z.; Davis, Hal; Hicks, Rick
2013-03-01
Onsite wastewater treatment systems (OWTS), or septic systems, can be a significant source of nitrates in groundwater and surface water. The adverse effects that nitrates have on human and environmental health have given rise to the need to estimate the actual or potential level of nitrate contamination. With the goal of reducing data collection and preparation costs and decreasing the time required to produce an estimate compared to complex nitrate modeling tools, we developed the ArcGIS-based Nitrate Load Estimation Toolkit (ArcNLET) software. Leveraging the power of geographic information systems (GIS), ArcNLET is an easy-to-use software tool capable of simulating nitrate transport in groundwater and estimating long-term nitrate loads from groundwater to surface water bodies. Data requirements are reduced by using simplified models of groundwater flow and nitrate transport which consider nitrate attenuation mechanisms (subsurface dispersion and denitrification) as well as spatial variability in the hydraulic parameters and septic tank distribution. ArcNLET provides a spatial distribution of nitrate plumes from multiple septic systems and a load estimate to water bodies. ArcNLET's conceptual model is divided into three sub-models: a groundwater flow model, a nitrate transport and fate model, and a load estimation model, which are implemented as an extension to ArcGIS. The groundwater flow model uses a map of topography in order to generate a steady-state approximation of the water table. In a validation study, this approximation was found to correlate well with a water table produced by a calibrated numerical model, although it was found that the degree to which the water table resembles the topography can vary greatly across the modeling domain. The transport model uses a semi-analytical solution to estimate the distribution of nitrate within groundwater, which is then used to estimate a nitrate load using a mass balance argument. The estimates given by ArcNLET are suitable for a screening-level analysis.
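The water-table approximation described above, in which the steady-state surface follows smoothed topography, can be illustrated with repeated mean filtering of an elevation grid. The sketch below assumes SciPy and a synthetic DEM; ArcNLET's actual smoothing implementation differs in detail.

```python
# A minimal sketch of approximating a water table as smoothed topography,
# assuming SciPy's uniform_filter and a synthetic elevation grid.
import numpy as np
from scipy.ndimage import uniform_filter

dem = np.random.rand(50, 50) * 10.0     # synthetic topography (m)
water_table = dem.copy()
for _ in range(20):                     # more passes = a flatter,
    water_table = uniform_filter(water_table, size=3)  # smoother surface
print(dem.std(), water_table.std())     # relief drops as surface flattens
```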
A Cost Comparison of Alternative Approaches to Distance Education in Developing Countries
NASA Technical Reports Server (NTRS)
Ventre, Gerard G.; Kalu, Alex
1996-01-01
This paper presents a cost comparison of three approaches to two-way interactive distance learning systems for developing countries. Included are costs for distance learning hardware, terrestrial and satellite communication links, and designing instruction for two-way interactive courses. As part of this project, FSEC is developing a 30-hour course in photovoltaic system design that will be used in a variety of experiments using the Advanced Communications Technology Satellite (ACTS). A primary goal of the project is to develop an instructional design and delivery model that can be used for other education and training programs. Over two-thirds of the world photovoltaics market is in developing countries. One of the objectives of this NASA-sponsored project was to develop new and better energy education programs that take advantage of advances in telecommunications and computer technology. The combination of desktop video systems and the sharing of computer applications software is of special interest. Research is being performed to evaluate the effectiveness of some of these technologies as part of this project. The design of the distance learning origination and receive sites discussed in this paper was influenced by the educational community's growing interest in distance education. The following approach was used to develop comparative costs for delivering interactive distance education to developing countries: (1) Representative target locations for receive sites were chosen. The originating site was assumed to be Cocoa, Florida, where FSEC is located; (2) A range of course development costs was determined; (3) The cost of equipment for three alternative two-way interactive distance learning system configurations was determined or estimated. The types of system configurations ranged from a PC-based system that allows instructors to originate instruction from their office using desktop video and shared application software, to a high-cost system that uses an electronic classroom; (4) A range of costs for both satellite and terrestrial communications was investigated; (5) The costs of equipment and operation of the alternative configurations for the origination and receive sites were determined; (6) A range of costs for several alternative delivery scenarios (i.e., a mix of live interactive, asynchronous interactive, and videotape-based delivery) was determined; and (7) A preferred delivery scenario, including a cost estimate, was developed.
Creating an automated tool for measuring software cohesion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tutton, J.M.; Zucconi, L.
1994-05-06
Program modules with high complexity tend to be more error prone and more difficult to understand. These factors increase maintenance and enhancement costs. Hence, a tool that can help programmers determine a key factor in module complexity should be very useful. Our goal is to create a software tool that will automatically give a quantitative measure of the cohesiveness of a given module, and hence give us an estimate of the "maintainability" of that module. The tool will use a metric developed by Professors Linda M. Ott and James M. Bieman. The Ott/Bieman metric gives quantitative measures that indicate the degree of functional cohesion using abstract data slices.
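A simplified paraphrase of slice-based functional cohesion is sketched below: each output of a module defines a data slice (the set of data tokens affecting that output), and cohesion is measured by how much the slices overlap. The slices and token names are invented, and the two ratios only approximate the flavor of the Ott/Bieman measures, not their published definitions.

```python
# Hypothetical data slices for a module with three outputs: each slice is the
# set of data tokens that influence one output.
slices = [
    {"n", "i", "total", "count"},  # slice for output `mean`
    {"n", "i", "total"},           # slice for output `sum`
    {"n", "i", "count"},           # slice for output `count`
]

all_tokens = set().union(*slices)
super_glue = set.intersection(*slices)                     # tokens in every slice
glue = {t for t in all_tokens if sum(t in s for s in slices) > 1}

strong_cohesion = len(super_glue) / len(all_tokens)        # ~strong functional cohesion
weak_cohesion = len(glue) / len(all_tokens)                # ~weak functional cohesion
print(f"strong={strong_cohesion:.2f}, weak={weak_cohesion:.2f}")
```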
The Flight Optimization System Weights Estimation Method
NASA Technical Reports Server (NTRS)
Wells, Douglas P.; Horvath, Bryce L.; McCullers, Linwood A.
2017-01-01
FLOPS has been the primary aircraft synthesis software used by the Aeronautics Systems Analysis Branch at NASA Langley Research Center. It was created for rapid conceptual aircraft design and advanced technology impact assessments. FLOPS is a single computer program that includes weights estimation, aerodynamics estimation, engine cycle analysis, propulsion data scaling and interpolation, detailed mission performance analysis, takeoff and landing performance analysis, noise footprint estimation, and cost analysis. It is well known as a baseline and common denominator for aircraft design studies. FLOPS is capable of calibrating a model to known aircraft data, making it useful for new aircraft and modifications to existing aircraft. The weight estimation method in FLOPS is known to be of high fidelity for conventional tube-and-wing aircraft, and a substantial amount of effort went into its development. This report serves as comprehensive documentation of the FLOPS weight estimation method, presenting the development process along with the weight estimation process itself.
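FLOPS-style weight estimation typically rests on calibrated regression equations per component. The sketch below shows the general shape of such an equation and the calibration step the abstract mentions; the coefficient and exponents are hypothetical stand-ins, not the published FLOPS values.

```python
# Illustrative power-law component weight equation in the spirit of a
# conceptual-design weights module; a, b, c are hypothetical, NOT FLOPS values.
def wing_weight_lb(gross_weight_lb, ult_load_factor, span_ft,
                   a=0.50, b=0.65, c=0.35):
    return a * (gross_weight_lb * ult_load_factor) ** b * span_ft ** c

def calibrated_coefficient(actual_weight_lb, gross_weight_lb,
                           ult_load_factor, span_ft):
    """Scale the coefficient so the equation reproduces a known aircraft,
    mirroring the calibrate-to-known-data capability the abstract describes."""
    predicted = wing_weight_lb(gross_weight_lb, ult_load_factor, span_ft)
    return 0.50 * actual_weight_lb / predicted

print(f"{wing_weight_lb(170_000, 3.75, 125):,.0f} lb")  # hypothetical transport wing
```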
Modeling SMAP Spacecraft Attitude Control Estimation Error Using Signal Generation Model
NASA Technical Reports Server (NTRS)
Rizvi, Farheen
2016-01-01
Two ground simulation software packages are used to model the SMAP spacecraft dynamics. The CAST software uses a higher-fidelity model than the ADAMS software. The ADAMS software models the spacecraft plant, controller, and actuators, and assumes a perfect sensor and estimator. In this simulation study, the spacecraft dynamics results from the ADAMS software are used because the CAST software is unavailable. The main source of spacecraft dynamics error in the higher-fidelity CAST software is the estimation error. A signal generation model is developed to capture the effect of this estimation error on the overall spacecraft dynamics. This signal generation model is then included in the ADAMS spacecraft dynamics estimate so that the results are similar to CAST's. The signal generation model has characteristics (mean, variance, and power spectral density) similar to those of the true CAST estimation error. In this way, the ADAMS software can still be used while capturing the higher-fidelity spacecraft dynamics modeling from the CAST software.
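A minimal sketch of the signal-generation idea follows: shape white noise with a filter to obtain the desired power spectral density, then rescale to the target mean and variance. The moving-average low-pass filter and all numbers here are assumptions for illustration, not the SMAP model.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimation_error_signal(n, mean, std, taps=25):
    """Generate a noise signal with a prescribed mean and variance and a
    low-pass-shaped PSD, standing in for the attitude estimation error.
    The moving-average filter shape is an assumption, not the SMAP model."""
    white = rng.standard_normal(n + taps)
    kernel = np.ones(taps) / taps                          # simple low-pass FIR
    colored = np.convolve(white, kernel, mode="valid")[:n]
    colored = (colored - colored.mean()) / colored.std()   # normalize
    return mean + std * colored                            # match target stats

err = estimation_error_signal(10_000, mean=0.002, std=0.015)
print(err.mean(), err.std())  # reproduces the prescribed mean and variance
```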
de Souza, John Kennedy Schettino; Pinto, Marcos Antonio da Silva; Vieira, Pedro Gabrielle; Baron, Jerome; Tierra-Criollo, Carlos Julio
2013-12-01
The dynamic, accurate measurement of pupil size is extremely valuable for studying a large number of neuronal functions and dysfunctions. Despite tremendous and well-documented progress in image processing techniques for estimating pupil parameters, comparatively little work has been reported on practical hardware issues involved in designing image acquisition systems for pupil analysis. Here, we describe and validate the basic features of such a system, which is based on a relatively compact, off-the-shelf, low-cost FireWire digital camera. We successfully implemented two configurable modes of video recording: a continuous mode and an event-triggered mode. The interoperability of the whole system is guaranteed by a set of modular software components hosted on a personal computer and written in LabVIEW. An offline analysis suite of image processing algorithms for automatically estimating pupillary and eyelid parameters was assessed using data obtained in human subjects. Our benchmark results show that such measurements can be done in a temporally precise way at a sampling frequency of up to 120 Hz and with an estimated maximum spatial resolution of 0.03 mm. Our software is made available free of charge to the scientific community, allowing end users to either use the software as is or modify it to suit their own needs.
An Algorithm and R Program for Fitting and Simulation of Pharmacokinetic and Pharmacodynamic Data.
Li, Jijie; Yan, Kewei; Hou, Lisha; Du, Xudong; Zhu, Ping; Zheng, Li; Zhu, Cairong
2017-06-01
Pharmacokinetic/pharmacodynamic link models are widely used in dose-finding studies. By applying such models, the results of initial pharmacokinetic/pharmacodynamic studies can be used to predict the potential therapeutic dose range. This knowledge can improve the design of later comparative large-scale clinical trials by reducing the number of participants and saving time and resources. However, the modeling process can be challenging, time consuming, and costly, even when using cutting-edge, powerful pharmacological software. Here, we provide a freely available R program for expediently analyzing pharmacokinetic/pharmacodynamic data, including data importation, parameter estimation, simulation, and model diagnostics. First, we explain the theory related to the establishment of the pharmacokinetic/pharmacodynamic link model. Subsequently, we present the algorithms used for parameter estimation and potential therapeutic dose computation. The implementation of the R program is illustrated by a clinical example. The software package is then validated by comparing the model parameters and the goodness-of-fit statistics generated by our R package with those generated by the widely used pharmacological software WinNonlin. The pharmacokinetic and pharmacodynamic parameters as well as the potential recommended therapeutic dose can be acquired with the R package. The validation process shows that the parameters estimated using our package are satisfactory. The R program developed and presented here provides pharmacokinetic researchers with a simple and easy-to-access tool for pharmacokinetic/pharmacodynamic analysis on personal computers.
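The workflow the abstract describes (structural model, parameter estimation, link to a dose-response) can be miniaturized as below. This sketch fits a one-compartment PK model with first-order absorption by least squares and evaluates a hypothetical Emax link; the observed data, initial guesses, and PD parameters are all invented, and the paper's own R implementation may use a different structural model and estimator.

```python
import numpy as np
from scipy.optimize import curve_fit

def conc(t, ka, ke, v_f, dose=100.0):
    """One-compartment PK with first-order absorption (hypothetical model)."""
    return dose / v_f * ka / (ka - ke) * (np.exp(-ke * t) - np.exp(-ka * t))

# Invented concentration-time observations for a single 100 mg oral dose.
t = np.array([0.5, 1, 2, 4, 6, 8, 12, 24.0])                 # hours
c_obs = np.array([3.2, 5.1, 6.0, 5.2, 4.1, 3.2, 1.9, 0.5])   # mg/L

(ka, ke, v_f), _ = curve_fit(conc, t, c_obs, p0=[1.0, 0.2, 10.0])
print(f"ka={ka:.2f}/h, ke={ke:.2f}/h, V/F={v_f:.1f} L")

# Hypothetical Emax PD link driven by the fitted concentration profile;
# sweeping dose through this link is what yields a candidate dose range.
emax, ec50 = 100.0, 4.0
effect = emax * conc(t, ka, ke, v_f) / (ec50 + conc(t, ka, ke, v_f))
```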
Model for Simulating a Spiral Software-Development Process
NASA Technical Reports Server (NTRS)
Mizell, Carolyn; Curley, Charles; Nayak, Umanath
2010-01-01
A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.
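A toy version of the simulation's bookkeeping is sketched below: each spiral iteration is treated as a mini waterfall pass that adds code, injects defects, and spends test and rework effort. The rates are illustrative industry-average-style inputs of the kind the abstract mentions, not PATT data.

```python
# Toy spiral-process simulation: each iteration is a mini waterfall pass.
size_per_iter = 5_000         # new lines of code each spiral iteration
productivity = 10.0           # lines of code per hour (illustrative)
inject_per_kloc = 20.0        # defects injected per 1000 lines
detect_rate = 0.75            # fraction of open defects found by test each pass
rework_hours_per_defect = 4.0

open_defects, total_hours = 0.0, 0.0
for iteration in range(1, 5):
    total_hours += size_per_iter / productivity           # build effort
    open_defects += inject_per_kloc * size_per_iter / 1000
    found = detect_rate * open_defects                    # test step
    total_hours += found * rework_hours_per_defect        # rework effort
    open_defects -= found                                 # defects that escape
    print(f"iter {iteration}: cumulative effort {total_hours:,.0f} h, "
          f"escaped defects {open_defects:.1f}")
```

Because requirements can change between iterations, a spiral run like this can re-plan the per-iteration size as new information arrives, which is the comparison against a fixed waterfall pass that the model is built to support.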
Knowledge-based assistance in costing the space station DMS
NASA Technical Reports Server (NTRS)
Henson, Troy; Rone, Kyle
1988-01-01
The Software Cost Engineering (SCE) methodology developed over the last two decades at IBM Systems Integration Division (SID) in Houston is utilized to cost the NASA Space Station Data Management System (DMS). An ongoing project to capture this methodology, which is built on a foundation of experiences and lessons learned, has resulted in the development of an internal-use-only, PC-based prototype that integrates algorithmic tools with knowledge-based decision support assistants. This prototype Software Cost Engineering Automation Tool (SCEAT) is being employed to assist in the DMS costing exercises. At the same time, DMS costing serves as a forcing function and provides a platform for the continuing, iterative development, calibration, and validation and verification of SCEAT. The data that forms the cost engineering database is derived from more than 15 years of development of NASA Space Shuttle software, ranging from low criticality, low complexity support tools to highly complex and highly critical onboard software.
Guidance and Control Software,
1980-05-01
commitments of function, cost, and schedule. The phrase "software engineering" was intended to contrast with the phrase "computer science"; the latter aims...the software problems of cost, delivery schedule, and quality were gradually being recognized at the highest management levels. Thus, in a project... schedule dates. Although the analysis of software problems indicated that the entire software development process (figure 1) needed new methods, only
NASA Software Engineering Benchmarking Study
NASA Technical Reports Server (NTRS)
Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.
2013-01-01
To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: Improve the management and technical development of software intensive systems; Have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and Identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the NASA top software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R and D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area that included: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with the external organizations in most benchmark areas, but in every topic, there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. One of NASA's strengths was its software assurance practices, which seemed to rate well in comparison to the other organizational groups and also seemed to include a larger scope of activities. An unexpected benefit of the software benchmarking study was the identification of many opportunities for collaboration in areas including metrics, training, sharing of CMMI experiences and resources such as instructors and CMMI Lead Appraisers, and even sharing of assets such as documented processes. A further unexpected benefit of the study was the feedback on NASA practices that was received from some of the organizations interviewed. From that feedback, other potential areas where NASA could improve were highlighted, such as accuracy of software cost estimation and budgetary practices. The detailed report contains discussion of the practices noted in each of the topic areas, as well as a summary of observations and recommendations from each of the topic areas. The resulting 24 recommendations from the topic areas were then consolidated to eliminate duplication and culled into a set of 14 suggested actionable recommendations. This final set of actionable recommendations, listed below, are items that can be implemented to improve NASA's software engineering practices and to help address many of the items that were listed in the NASA top software engineering issues. 1. Develop and implement standard contract language for software procurements. 2. 
Advance accurate and trusted software cost estimates for both procured and in-house software and improve the capture of actual cost data to facilitate further improvements. 3. Establish a consistent set of objectives and expectations, specifically types of metrics at the Agency level, so key trends and models can be identified and used to continuously improve software processes and each software development effort. 4. Maintain the CMMI Maturity Level requirement for critical NASA projects and use CMMI to measure organizations developing software for NASA. 5. Consolidate, collect, and, if needed, develop common process principles and other assets across the Agency in order to provide more consistency in software development and acquisition practices and to reduce the overall cost of maintaining or increasing current NASA CMMI maturity levels. 6. Provide additional support for small projects that includes: (a) guidance for appropriate tailoring of requirements for small projects, (b) availability of suitable tools, including support tool set-up and training, and (c) training for small project personnel, assurance personnel and technical authorities on the acceptable options for tailoring requirements and performing assurance on small projects. 7. Develop software training classes for the more experienced software engineers using on-line training, videos, or small separate modules of training that can be accommodated as needed throughout a project. 8. Create guidelines to structure non-classroom training opportunities such as mentoring, peer reviews, lessons learned sessions, and on-the-job training. 9. Develop a set of predictive software defect data and a process for assessing software testing metric data against it. 10. Assess Agency-wide licenses for commonly used software tools. 11. Fill the knowledge gap in common software engineering practices for new hires and co-ops. 12. Work through the Science, Technology, Engineering and Mathematics (STEM) program with universities in strengthening education in the use of common software engineering practices and standards. 13. Follow up this benchmark study with a deeper look into what both internal and external organizations perceive as the scope of software assurance, the value they expect to obtain from it, and the shortcomings they experience in the current practice. 14. Continue interactions with the external software engineering environment through collaborations, knowledge sharing, and benchmarking.
48 CFR 252.234-7003 - Notice of Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2012 CFR
2012-10-01
... Software Data Reporting System. (NOV 2010) 252.234-7003 Section 252.234-7003 Federal Acquisition... Software Data Reporting System. (NOV 2010) As prescribed in 234-7101(a)(1), use the following provision: Notice of Cost and Software Data Reporting System (NOV 2010) (a) This solicitation includes— (1) The...
48 CFR 252.234-7003 - Notice of Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2014 CFR
2014-10-01
... Software Data Reporting System. (NOV 2010) 252.234-7003 Section 252.234-7003 Federal Acquisition... Software Data Reporting System. (NOV 2010) As prescribed in 234-7101(a)(1), use the following provision: Notice of Cost and Software Data Reporting System (NOV 2010) (a) This solicitation includes— (1) The...
48 CFR 252.234-7003 - Notice of Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2011 CFR
2011-10-01
... Software Data Reporting System. (NOV 2010) 252.234-7003 Section 252.234-7003 Federal Acquisition... Software Data Reporting System. (NOV 2010) As prescribed in 234-7101(a)(1), use the following provision: NOTICE OF COST AND SOFTWARE DATA REPORTING SYSTEM (NOV 2010) (a) This solicitation includes— (1) The...
48 CFR 252.234-7003 - Notice of Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2013 CFR
2013-10-01
... Software Data Reporting System. (NOV 2010) 252.234-7003 Section 252.234-7003 Federal Acquisition... Software Data Reporting System. (NOV 2010) As prescribed in 234-7101(a)(1), use the following provision: Notice of Cost and Software Data Reporting System (NOV 2010) (a) This solicitation includes— (1) The...
Software Cuts Homebuilding Costs, Increases Energy Efficiency
NASA Technical Reports Server (NTRS)
2015-01-01
To sort out the best combinations of technologies for a crewed mission to Mars, NASA Headquarters awarded grants to MIT's Department of Aeronautics and Astronautics to develop an algorithm-based software tool that highlights the most reliable and cost-effective options. Utilizing the software, Professor Edward Crawley founded Cambridge, Massachusetts-based Ekotrope, which helps homebuilders choose cost- and energy-efficient floor plans and materials.
Cost Estimation of Post Production Software Support in Ground Combat Systems
2007-09-01
request, year of the request, EINOMEN (a word description of the system), and a PRON (a unique identifier containing the year and weapons system...variance. The fiscal year of request, descriptive name (coded as EINOMEN), unique program identifier (PRON), amount funded, and total amount requested...entire data set loses this sophistication. Most of the unique PRONs in the database map to a specific ground combat system, as described in the
Incorporating cost-benefit analyses into software assurance planning
NASA Technical Reports Server (NTRS)
Feather, M. S.; Sigal, B.; Cornford, S. L.; Hutchinson, P.
2001-01-01
The objective is to use cost-benefit analyses to identify, for a given project, optimal sets of software assurance activities. Towards this end we have incorporated cost-benefit calculations into a risk management framework.
Estimating Economic Burden of Cancer Deaths Attributable to Smoking in Iran.
Rezaei, Satar; Akbari Sari, Ali; Arab, Mohammad; Majdzadeh, Reza; Mohammadpoorasl, Asghar
2015-01-01
There is a broad consensus among health policy-makers that smoking has a significant impact on both the health system and society. The purpose of this study was to estimate the economic burden of major cancer deaths caused by smoking in Iran in 2012. The number of major cancer deaths due to smoking by sex and age group in 2012 was obtained from the GLOBOCAN database. The life expectancy and retirement age were used to estimate years of potential life lost (YPLL) and the cost of productivity lost attributable to smoking, respectively. Data on the prevalence of smoking, the relative risk of smoking, the life expectancy table, annual wage, and employment rate were extracted from various resources such as previous studies, the WHO database, and Iranian statistical centers. The data analysis was conducted in Excel. Smoking was responsible for 4,623 cancer deaths, 80,808 YPLL, and US$ 83,019,583 in lost productivity. Lung cancer accounts for the largest proportion of total cancer deaths, YPLL, and cost of productivity lost attributable to smoking. Males account for 86.6% of cancer deaths, 82.6% of YPLL, and 85.3% of the cost of productivity lost caused by smoking. Smoking places a high economic burden on the health system and society as a whole. In addition, if no one in Iran had smoked, approximately two out of ten cancer deaths could have been prevented.
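The YPLL and productivity-loss arithmetic the abstract describes reduces to a few lines, sketched below with entirely hypothetical inputs; the population attributable fraction line shows how the prevalence and relative-risk data enter the calculation.

```python
# Hypothetical inputs standing in for the study's data sources.
p, rr = 0.22, 8.0                        # smoking prevalence, relative risk
paf = p * (rr - 1) / (p * (rr - 1) + 1)  # population attributable fraction

life_expectancy, retirement_age = 76.0, 60.0
annual_wage, employment_rate = 6_000.0, 0.85   # USD/yr, fraction employed

# (age at death, cancer deaths in the age group) -- invented numbers.
deaths_by_age = [(45, 500), (55, 1_500), (65, 2_500)]

ypll = sum(max(life_expectancy - age, 0) * n * paf for age, n in deaths_by_age)
cost = sum(max(retirement_age - age, 0) * n * paf * annual_wage * employment_rate
           for age, n in deaths_by_age)
print(f"attributable YPLL: {ypll:,.0f} years, productivity lost: ${cost:,.0f}")
```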
Stuart, Robyn M; Kerr, Cliff C; Haghparast-Bidgoli, Hassan; Estill, Janne; Grobicki, Laura; Baranczuk, Zofia; Prieto, Lorena; Montañez, Vilma; Reporter, Iyanoosh; Gray, Richard T; Skordis-Worrall, Jolene; Keiser, Olivia; Cheikh, Nejma; Boonto, Krittayawan; Osornprasop, Sutayut; Lavadenz, Fernando; Benedikt, Clemens J; Martin-Hughes, Rowan; Hussain, S Azfar; Kelly, Sherrie L; Kedziora, David J; Wilson, David P
2017-01-01
Prioritizing investments across health interventions is complicated by the nonlinear relationship between intervention coverage and epidemiological outcomes. It can be difficult for countries to know which interventions to prioritize for greatest epidemiological impact, particularly when budgets are uncertain. We examined four case studies of HIV epidemics in diverse settings, each with different characteristics. These case studies were based on public data available for Belarus, Peru, Togo, and Myanmar. The Optima HIV model and software package was used to estimate the optimal distribution of resources across interventions associated with a range of budget envelopes. We constructed "investment staircases", a useful tool for understanding investment priorities. These were used to estimate the best attainable cost-effectiveness of the response at each investment level. We find that when budgets are very limited, the optimal HIV response consists of a smaller number of 'core' interventions. As budgets increase, those core interventions should first be scaled up, and then new interventions introduced. We estimate that the cost-effectiveness of HIV programming decreases as investment levels increase, but that the overall cost-effectiveness remains below GDP per capita. It is important for HIV programming to respond effectively to the overall level of funding availability. The analytic tools presented here can help to guide program planners understand the most cost-effective HIV responses and plan for an uncertain future.
Code of Federal Regulations, 2014 CFR
2014-10-01
... DEFENSE SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Cost and Software Data Reporting 234.7100 Policy. (a) The cost and software data reporting (CSDR) requirement is mandatory for major defense... data reporting and software resources data reporting. (b) Prior to contract award, contracting officers...
Code of Federal Regulations, 2012 CFR
2012-10-01
... DEFENSE SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Cost and Software Data Reporting 234.7100 Policy. (a) The cost and software data reporting (CSDR) requirement is mandatory for major defense... data reporting and software resources data reporting. (b) Prior to contract award, contracting officers...
Code of Federal Regulations, 2011 CFR
2011-10-01
... DEFENSE SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Cost and Software Data Reporting 234.7100 Policy. (a) The cost and software data reporting (CSDR) requirement is mandatory for major defense... data reporting and software resources data reporting. (b) Prior to contract award, contracting officers...
Code of Federal Regulations, 2013 CFR
2013-10-01
... DEFENSE SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Cost and Software Data Reporting 234.7100 Policy. (a) The cost and software data reporting (CSDR) requirement is mandatory for major defense... data reporting and software resources data reporting. (b) Prior to contract award, contracting officers...
1986-09-01
point here is that the capital cost of design and development (including the cost of software tools and/or CAD/CAM programs which aided in the development...and capitalization, software is in many ways more like a hardware component than it is like the technical documentation which supports the hardware...invoked, the owner of intellectual property rights in software may attach appropriate copyright notices to software delivered under this contract. 2.2.2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Sandra Brown; Ellen Hawes
2002-09-01
The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas impacts. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: advanced videography testing; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.
Software Measurement Guidebook
NASA Technical Reports Server (NTRS)
1995-01-01
This Software Measurement Guidebook is based on the extensive experience of several organizations that have each developed and applied significant measurement programs over a period of at least 10 years. The lessons derived from those experiences reflect not only successes but also failures. By applying those lessons, an organization can minimize, or at least reduce, the time, effort, and frustration of introducing a software measurement program. The Software Measurement Guidebook is aimed at helping organizations to begin or improve a measurement program. It does not provide guidance for the extensive application of specific measures (such as how to estimate software cost or analyze software complexity) other than by providing examples to clarify points. It does contain advice for establishing and using an effective software measurement program and for understanding some of the key lessons that other organizations have learned. Some of that advice will appear counterintuitive, but it is all based on actual experience. Although all of the information presented in this guidebook is derived from specific experiences of mature measurement programs, the reader must keep in mind that the characteristics of every organization are unique. Some degree of measurement is critical for all software development and maintenance organizations, and most of the key rules captured in this report will be generally applicable. Nevertheless, each organization must strive to understand its own environment so that the measurement program can be tailored to suit its characteristics and needs.
Integrated testing and verification system for research flight software
NASA Technical Reports Server (NTRS)
Taylor, R. N.
1979-01-01
The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.
Livingood, William C; Coughlin, Susan; Bowman, Walter; Bryant, Thomas; Goldhagen, Jeffrey
2007-01-01
Public health systems are stressed by increasing demands and inadequate resources. This study was designed to demonstrate how economic impact analysis can estimate the economic value of a local public health system's infrastructure as well as the economic assets of an "Academic Health Department" model. This study involved the secondary analysis of publicly available data on health department finances and employment using proprietary software specifically designed to assess economic impacts. The health department's impact on the local community was estimated at over 100 million dollars, exceeding the economic impact of other recently studied local industries with no additional costs to local taxpayers.
Practical Issues in Implementing Software Reliability Measurement
NASA Technical Reports Server (NTRS)
Nikora, Allen P.; Schneidewind, Norman F.; Everett, William W.; Munson, John C.; Vouk, Mladen A.; Musa, John D.
1999-01-01
Many ways of estimating software systems' reliability, or reliability-related quantities, have been developed over the past several years. Of particular interest are methods that can be used to estimate a software system's fault content prior to test, or to discriminate between components that are fault-prone and those that are not. The results of these methods can be used to: 1) More accurately focus scarce fault identification resources on those portions of a software system most in need of it. 2) Estimate and forecast the risk of exposure to residual faults in a software system during operation, and develop risk and safety criteria to guide the release of a software system to fielded use. 3) Estimate the efficiency of test suites in detecting residual faults. 4) Estimate the stability of the software maintenance process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al-Karaghouli, Ali; Kazmerski, L.L.
2010-04-15
This paper addresses the need for electricity in rural areas of southern Iraq and proposes a photovoltaic (PV) solar system to power a health clinic in that region. The total daily health clinic load is 31.6 kWh and detailed loads are listed. The National Renewable Energy Laboratory (NREL) optimization computer model for distributed power, "HOMER," is used to estimate the system size and its life-cycle cost. The analysis shows that the optimal system's initial cost, net present cost, and electricity cost are US$ 50,700, US$ 60,375, and US$ 0.238/kWh, respectively. These values for the PV system are compared with those of a generator alone used to supply the load. We found that the initial cost, net present cost, and electricity cost of the generator system are US$ 4500, US$ 352,303, and US$ 1.332/kWh, respectively. We conclude that using the PV system is justified on humanitarian, technical, and economic grounds. (author)
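The life-cycle figures a tool like HOMER reports follow from standard discounting arithmetic, sketched below. The O&M cost, discount rate, and lifetime are assumptions for illustration; only the initial cost and daily load echo the abstract.

```python
# Life-cycle costing arithmetic of the kind HOMER reports (illustrative inputs).
r = 0.06                      # real discount rate (assumed)
years = 25                    # project lifetime (assumed)
capital = 50_700.0            # USD initial cost (from the abstract)
om_per_year = 500.0           # USD/yr operation & maintenance (hypothetical)
energy_per_year = 31.6 * 365  # kWh/yr served load

# Net present cost: capital plus discounted recurring costs.
npc = capital + sum(om_per_year / (1 + r) ** t for t in range(1, years + 1))
crf = r * (1 + r) ** years / ((1 + r) ** years - 1)  # capital recovery factor
lcoe = npc * crf / energy_per_year                   # levelized USD per kWh
print(f"NPC=${npc:,.0f}, levelized cost={lcoe:.3f} $/kWh")
```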
Integrated Cost and Schedule using Monte Carlo Simulation of a CPM Model - 12419
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hulett, David T.; Nosbisch, Michael R.
This discussion of the recommended practice (RP) 57R-09 of AACE International defines the integrated analysis of schedule and cost risk to estimate the appropriate level of cost and schedule contingency reserve on projects. The main contribution of this RP is to include the impact of schedule risk on cost risk and hence on the need for cost contingency reserves. Additional benefits include the prioritizing of the risks to cost, some of which are risks to schedule, so that risk mitigation may be conducted in a cost-effective way, scatter diagrams of time-cost pairs for developing joint targets of time and cost, and probabilistic cash flow which shows cash flow at different levels of certainty. Integrating cost and schedule risk into one analysis based on the project schedule loaded with costed resources from the cost estimate provides both (1) more accurate cost estimates than if the schedule risk were ignored or incorporated only partially, and (2) an illustration of the importance of schedule risk to cost risk when the durations of activities using labor-type (time-dependent) resources are risky. Many activities such as detailed engineering, construction or software development are mainly conducted by people who need to be paid even if their work takes longer than scheduled. Level-of-effort resources, such as the project management team, are extreme examples of time-dependent resources, since if the project duration exceeds its planned duration the cost of these resources will increase over their budgeted amount. The integrated cost-schedule risk analysis is based on: - A high quality CPM schedule with logic tight enough so that it will provide the correct dates and critical paths during simulation automatically without manual intervention. - A contingency-free estimate of project costs that is loaded on the activities of the schedule. - Resolution of inconsistencies between the cost estimate and schedule that often creep into those documents as project execution proceeds. - Good-quality risk data that are usually collected in risk interviews of the project team, management and others knowledgeable in the risk of the project. The risks from the risk register are used as the basis of the risk data in the risk driver method. The risk driver method is based on the fundamental principle that identifiable risks drive overall cost and schedule risk. - A Monte Carlo simulation software program that can simulate schedule risk, burn rate risk and time-independent resource risk. The results include the standard histograms and cumulative distributions of possible cost and time results for the project. However, by simulating both cost and time simultaneously we can collect the cost-time pairs of results and hence show the scatter diagram ('football chart') that indicates the joint probability of finishing on time and on budget. Also, we can derive the probabilistic cash flow for comparison with the time-phased project budget. Finally the risks to schedule completion and to cost can be prioritized, say at the P-80 level of confidence, to help focus the risk mitigation efforts. If the cost and schedule estimates including contingency reserves are not acceptable to the project stakeholders the project team should conduct risk mitigation workshops and studies, deciding which risk mitigation actions to take, and re-run the Monte Carlo simulation to determine the possible improvement to the project's objectives.
Finally, it is recommended that the contingency reserves of cost and of time, calculated at a level that represents an acceptable degree of certainty and uncertainty for the project stakeholders, be added as a resource-loaded activity to the project schedule for strategic planning purposes. The risk analysis described in this paper is correct only for the current plan, represented by the schedule. The project contingency reserves of time and cost that are the main results of this analysis apply if that plan is to be followed. Of course project managers have the option of re-planning and re-scheduling in the face of new facts, in part by mitigating risk. This analysis identifies the high-priority risks to cost and to schedule, which assists the project manager in planning further risk mitigation. Some project managers reject the results and argue that they cannot possibly be so late or so overrun. Those project managers may be wasting an opportunity to mitigate risk and get a more favorable outcome. (authors)
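The mechanics of the integrated simulation can be miniaturized as below: sample risky durations, take the critical path, and let labor cost follow duration so that schedule risk flows into cost risk. The three-activity serial network, triangular distributions, and daily rates are hypothetical; a real analysis would use the full resource-loaded CPM schedule and risk-driver inputs the RP describes.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000  # Monte Carlo iterations

# Three serial activities with triangular duration risk (days) and a daily
# labor ("burn") rate, so longer durations cost more: time-dependent resources.
low  = np.array([20, 40, 30])
mode = np.array([25, 50, 35])
high = np.array([45, 90, 60])
day_rate = np.array([4_000, 6_000, 3_000])  # USD/day, hypothetical

durations = rng.triangular(low, mode, high, size=(n, 3))
finish = durations.sum(axis=1)              # serial network: simple critical path
cost = (durations * day_rate).sum(axis=1)   # labor cost follows duration

p80_time = np.percentile(finish, 80)        # schedule contingency target
p80_cost = np.percentile(cost, 80)          # cost contingency target
print(f"P80 schedule: {p80_time:.0f} days, P80 cost: ${p80_cost:,.0f}")
# Plotting the (finish, cost) pairs gives the 'football chart' scatter diagram.
```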
1979-12-01
team programming in reducing software development costs relative to ad hoc approaches and improving software product quality relative to...are interpreted as demonstrating the advantages of disciplined team programming in reducing software development costs relative to ad hoc approaches...is due partially to the cost and impracticality of a valid experimental setup within a production environment. Thus the question remains, are
The Hidden Cost of Buying a Computer.
ERIC Educational Resources Information Center
Johnson, Michael
1983-01-01
In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)
An Evaluation of Software Cost Estimating Models.
1981-06-01
objectively. It may be best at this time to pursue a policy that favors the maximum use of measurable predictors but recognizes the possibility of using...
A Comparison of Air Force Data Systems
1993-08-01
a software cost model, SPQR. This model was chosen because it provides a straightforward means of modeling the enhancements as they would...estimated by SPQR (23,917) by $69 per hour for a total of $1,650,273. An additional 10 percent was added for generating or modifying the Middleware...equipment SLOC source lines of code SPO System Program Office SPQR System Product Quality Reporting SSC Standard Systems Center SSI system-to-system
Software Development Projects: Estimation of Cost and Effort (A Manager’s Digest).
1982-12-01
it later when changes need to be made to it. Various levels of difficulty are experienced due to the skill level of the programmer, poor program...impurities that if eliminated reduce the level of complexity of the program. They are as follows: 1. Complementary Operations: unreduced expressions 2...greater quality than that to support standard business applications. Remus defines quality as "...the number of program defects normalized by size over
Technology-driven dietary assessment: a software developer’s perspective
Buday, Richard; Tapia, Ramsey; Maze, Gary R.
2015-01-01
Dietary researchers need new software to improve nutrition data collection and analysis, but creating information technology is difficult. Software development projects may be unsuccessful due to inadequate understanding of needs, management problems, technology barriers or legal hurdles. Cost overruns and schedule delays are common. Barriers facing scientific researchers developing software include workflow, cost, schedule, and team issues. Different methods of software development and the role that intellectual property rights play are discussed. A dietary researcher must carefully consider multiple issues to maximize the likelihood of success when creating new software. PMID:22591224
IUWare and Computing Tools: Indiana University's Approach to Low-Cost Software.
ERIC Educational Resources Information Center
Sheehan, Mark C.; Williams, James G.
1987-01-01
Describes strategies for providing low-cost microcomputer-based software for classroom use on college campuses. Highlights include descriptions of the software (IUWare and Computing Tools); computing center support; license policies; documentation; promotion; distribution; staff, faculty, and user training; problems; and future plans. (LRW)
Software for Probabilistic Risk Reduction
NASA Technical Reports Server (NTRS)
Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto
2004-01-01
A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.
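One way to picture the planning step is a benefit-per-cost selection of risk-mitigation actions under a budget, as sketched below. This greedy loop is a stand-in for illustration only; the risks, mitigations, and numbers are invented, and the actual DDP-based optimization over requirements and design choices is more general.

```python
# Toy risk-based planning step: pick mitigation activities under a budget to
# maximize expected loss averted. All names and numbers are hypothetical.
risks = {"R1": 0.30 * 800, "R2": 0.10 * 2000, "R3": 0.05 * 5000}  # p * impact

# Each mitigation: (cost, {risk: fraction of that risk's expected loss removed})
mitigations = {
    "unit-test suite":  (40.0, {"R1": 0.6}),
    "design review":    (25.0, {"R2": 0.5, "R3": 0.2}),
    "hardware-in-loop": (90.0, {"R3": 0.7}),
}

budget, plan = 100.0, []
while True:
    # Pick the affordable, unused mitigation with the best benefit/cost ratio.
    best = max(
        ((name, sum(risks[r] * f for r, f in effs.items()) / cost)
         for name, (cost, effs) in mitigations.items()
         if name not in plan and cost <= budget),
        key=lambda x: x[1], default=None)
    if best is None:
        break
    name = best[0]
    cost, effs = mitigations[name]
    for r, f in effs.items():
        risks[r] *= (1 - f)  # reduce residual expected loss
    budget -= cost
    plan.append(name)

print(plan, f"residual expected loss: {sum(risks.values()):.0f}")
```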
Atif, Muhammad; Sulaiman, Syed Azhar Syed; Shafie, Asrul Akmal; Asif, Muhammad; Babar, Zaheer-Ud-Din
2014-08-19
Studies from both developed and developing countries have demonstrated a considerable fluctuation in the average cost of TB treatment. The objective of this study was to analyze the medical resource utilization among new smear positive pulmonary tuberculosis patients. We also estimated the cost of tuberculosis treatment from the provider and patient perspectives, and identified the significant cost driving factors. All new smear positive pulmonary tuberculosis patients who were registered at the chest clinic of the Penang General Hospital, between March 2010 and February 2011, were invited to participate in the study. Provider sector costs were estimated using bottom-up, micro-costing technique. For the calculation of costs from the patients' perspective, all eligible patients who agreed to participate in the study were interviewed after the intensive phase and subsequently at the end of the treatment by a trained nurse. PASW was used to analyze the data (Predictive Analysis SoftWare, version 19.0, Armonk, NY: IBM Corp.). During the study period, 226 patients completed the treatment. However, complete costing data were available for 212 patients. The most highly utilized resources were chest X-ray followed by sputum smear examination. Only a smaller proportion of the patients were hospitalized. The average provider sector cost was MYR 992.34 (i.e., USD 325.35 per patient) whereby the average patient sector cost was MYR 1225.80 (i.e., USD 401.90 per patient). The average patient sector cost of our study population accounted for 5.7% of their annual family income. In multiple linear regression analysis, prolonged treatment duration (i.e., > 6 months) was the only predictor of higher provider sector costs whereby higher patient sector costs were determined by greater household income and persistent cough at the end of the intensive phase of the treatment. In relation to average provider sector cost, our estimates are substantially higher than the budget allocated by the Ministry of Health for the treatment of a tuberculosis case in Malaysia. The expenses borne by the patients and their families on the treatment of the current episode of tuberculosis were not catastrophic for them.
Impact of the Volkswagen emissions control defeat device on US public health
NASA Astrophysics Data System (ADS)
Barrett, Steven R. H.; Speth, Raymond L.; Eastham, Sebastian D.; Dedoussi, Irene C.; Ashok, Akshay; Malina, Robert; Keith, David W.
2015-11-01
The US Environmental Protection Agency (EPA) has alleged that Volkswagen Group of America (VW) violated the Clean Air Act (CAA) by developing and installing emissions control system ‘defeat devices’ (software) in model year 2009-2015 vehicles with 2.0 litre diesel engines. VW has admitted the inclusion of defeat devices. On-road emissions testing suggests that in-use NOx emissions for these vehicles are a factor of 10 to 40 above the EPA standard. In this paper we quantify the human health impacts and associated costs of the excess emissions. We propagate uncertainties throughout the analysis. A distribution function for excess emissions is estimated based on available in-use NOx emissions measurements. We then use vehicle sales data and the STEP vehicle fleet model to estimate vehicle distance traveled per year for the fleet. The excess NOx emissions are allocated on a 50 km grid using an EPA estimate of the light duty diesel vehicle NOx emissions distribution. We apply a GEOS-Chem adjoint-based rapid air pollution exposure model to produce estimates of particulate matter and ozone exposure due to the spatially resolved excess NOx emissions. A set of concentration-response functions is applied to estimate mortality and morbidity outcomes. Integrated over the sales period (2008-2015) we estimate that the excess emissions will cause 59 (95% CI: 10 to 150) early deaths in the US. When monetizing premature mortality using EPA-recommended data, we find a social cost of ~$450 million over the sales period. For the current fleet, we estimate that a return to compliance for all affected vehicles by the end of 2016 will avert ~130 early deaths and avoid ~$840 million in social costs compared to a counterfactual case without recall.
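The estimation chain scales roughly as the back-of-envelope sketch below: fleet activity times an excess emission factor gives excess NOx, which a mortality coefficient and a value of statistical life convert to deaths and social cost. Every number here is a hypothetical placeholder rather than the paper's data, and the real analysis propagates uncertainty through each step instead of using point values.

```python
# Back-of-envelope version of the estimation chain (all inputs hypothetical).
vehicles = 482_000        # affected 2.0L diesels sold in the US (approx.)
km_per_year = 15_000      # average distance traveled per vehicle
years_on_road = 7.0       # sales-period exposure, roughly
std_nox_g_km = 0.044      # emission standard expressed per km (approx.)
excess_factor = 20.0      # in-use emissions ~10-40x the standard

excess_nox_tonnes = (vehicles * km_per_year * years_on_road *
                     std_nox_g_km * (excess_factor - 1)) / 1e6

deaths_per_tonne = 1.6e-3  # hypothetical NOx mortality coefficient
vsl = 9.0e6                # value of a statistical life, USD (approx. EPA-style)

early_deaths = excess_nox_tonnes * deaths_per_tonne
print(f"excess NOx: {excess_nox_tonnes:,.0f} t, early deaths: {early_deaths:.0f}, "
      f"social cost: ${early_deaths * vsl / 1e6:,.0f}M")
```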
Code of Federal Regulations, 2010 CFR
2010-04-01
... increases. (b) At the owner's option, the cost of the computer software may include service contracts to... requirements. (c) The source of funds for the purchase of hardware or software, or contracting for services for... formatted data, including either the purchase and maintenance of computer hardware or software, or both, the...
ExpertEyes: open-source, high-definition eyetracking.
Parada, Francisco J; Wyatte, Dean; Yu, Chen; Akavipat, Ruj; Emerick, Brandi; Busey, Thomas
2015-03-01
ExpertEyes is a low-cost, open-source package of hardware and software that is designed to provide portable high-definition eyetracking. The project involves several technological innovations, including portability, high-definition video recording, and multiplatform software support. It was designed for challenging recording environments, and all processing is done offline to allow for optimization of parameter estimation. The pupil and corneal reflection are estimated using a novel forward eye model that simultaneously fits both the pupil and the corneal reflection with full ellipses, addressing a common situation in which the corneal reflection sits at the edge of the pupil and therefore breaks the contour of the ellipse. The accuracy and precision of the system are comparable to or better than what is available in commercial eyetracking systems, with a typical accuracy of less than 0.4° and best accuracy below 0.3°, and with a typical precision (SD method) around 0.3° and best precision below 0.2°. Part of the success of the system comes from a high-resolution eye image. The high image quality results from uncasing common digital camcorders and recording directly to SD cards, which avoids the limitations of the analog NTSC format. The software is freely downloadable, and complete hardware plans are available, along with sources for custom parts.
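A bare-bones stand-in for the ellipse-fitting step is sketched below: an algebraic least-squares fit of a conic to contour points. ExpertEyes' forward eye model fits the pupil and corneal reflection jointly and handles contours broken by the reflection, which this simple fit does not; the synthetic contour is invented for the demo.

```python
import numpy as np

def fit_ellipse(x, y):
    """Least-squares fit of the conic a*x^2 + b*xy + c*y^2 + d*x + e*y = 1;
    a plain algebraic stand-in for ExpertEyes' forward-model fit."""
    A = np.column_stack([x * x, x * y, y * y, x, y])
    coeffs, *_ = np.linalg.lstsq(A, np.ones_like(x), rcond=None)
    return coeffs  # (a, b, c, d, e)

# Synthetic pupil contour: center (3, 2), semi-axes 2 and 1, plus pixel noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 60)
x = 3 + 2 * np.cos(t) + 0.01 * rng.standard_normal(t.size)
y = 2 + 1 * np.sin(t) + 0.01 * rng.standard_normal(t.size)
print(fit_ellipse(x, y))
```

In practice, points occluded by the corneal reflection would be down-weighted or excluded so the full ellipse can still be recovered, which is the situation the abstract's forward model is designed to address.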
Cost considerations in automating the library.
Bolef, D
1987-01-01
The purchase price of a computer and its software is but a part of the cost of any automated system. There are many additional costs, including one-time costs of terminals, printers, multiplexors, microcomputers, consultants, workstations and retrospective conversion, and ongoing costs of maintenance and maintenance contracts for the equipment and software, telecommunications, and supplies. This paper examines those costs in an effort to produce a more realistic picture of an automated system. PMID:3594021
Laboratory cost control and financial management software.
Mayer, M
1998-02-09
Economical constraints within the health care system advocate the introduction of tighter control of costs in clinical laboratories. Detailed cost information forms the basis for cost control and financial management. Based on the cost information, proper decisions regarding priorities, procedure choices, personnel policies and investments can be made. This presentation outlines some principles of cost analysis, describes common limitations of cost analysis, and exemplifies use of software to achieve optimized cost control. One commercially available cost analysis software, LabCost, is described in some detail. In addition to provision of cost information, LabCost also serves as a general management tool for resource handling, accounting, inventory management and billing. The application of LabCost in the selection process of a new high throughput analyzer for a large clinical chemistry service is taken as an example for decisions that can be assisted by cost evaluation. It is concluded that laboratory management that wisely utilizes cost analysis to support the decision-making process will undoubtedly have a clear advantage over those laboratories that fail to employ cost considerations to guide their actions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shonder, J.A.
Geothermal heat pumps (GHPs) have been shown to have a number of benefits over other technologies used to heat and cool buildings and provide hot water, combining high levels of occupant comfort with low operating and maintenance costs. Public facilities represent an increasingly important market for GHPs, and schools are a particularly good application, given the large land area that normally surrounds them. Nevertheless, some barriers remain to the increased use of GHPs in institutional and commercial applications. First, because GHPs are perceived as having higher installation costs than other space conditioning technologies, they are sometimes not considered as an option in feasibility studies. When they are considered, it can be difficult to compile the information required to compare them with other technologies. For example, a life cycle cost analysis requires estimates of installation costs and annually recurring energy and maintenance costs. But most cost estimators are unfamiliar with GHP technology, and no published GHP construction cost estimating guide is available. For this reason, estimates of installed costs tend to be very conservative, furthering the perception that GHPs are more costly than other technologies. Because GHP systems are not widely represented in the various software packages used by engineers to predict building energy use, it is also difficult to estimate the annual energy use of a building having GHP systems. Very little published data is available on expected maintenance costs either. Because of this lack of information, developing an accurate estimate of the life cycle cost of a GHP system requires experience and expertise that are not available in all institutions or in all areas of the country. In 1998, Oak Ridge National Laboratory (ORNL) entered into an agreement with the Lincoln, Nebraska, Public School District and Lincoln Electric Service, the local electric utility in the Lincoln area, to study four new, identical elementary schools built in the district that are served by GHPs. ORNL was provided with complete as-built construction plans for the schools and associated equipment, access to original design calculations and cost estimates, extensive equipment operating data [both from the buildings' energy management systems (EMSs) and from utility meters], and access to the school district's complete maintenance record database, not only for the four GHP schools, but for the other schools in the district using conventional space conditioning equipment. Using this information, we were able to reproduce the process used by the Lincoln school district and the consulting engineering firm to select GHPs over other options to provide space conditioning for the four schools. The objective was to determine whether this decision was the correct one, or whether some other technology would have been more cost-effective. An additional objective was to identify all of the factors that make it difficult for building owners and their engineers to consider GHPs in their projects so that ongoing programs can remove these impediments over time.
1977-05-01
C31) programs; (4) simulator/trainer programs; and (5) automatic test equipment software. Each of these five types of software represents a problem...coded in the same source language, say JOVIAL, then source-language statements would be a better measure, since that would automatically compensate...whether done at no (visible) cost or by renegotiation of the contract. Fig. 2.3 illustrates these with solid lines. It is conjectured that the change
Secure it now or secure it later: the benefits of addressing cyber-security from the outset
NASA Astrophysics Data System (ADS)
Olama, Mohammed M.; Nutaro, James
2013-05-01
The majority of funding for research and development (R&D) in cyber-security is focused on the end of the software lifecycle where systems have been deployed or are nearing deployment. Recruiting of cyber-security personnel is similarly focused on end-of-life expertise. By emphasizing cyber-security at these late stages, security problems are found and corrected when it is most expensive to do so, thus increasing the cost of owning and operating complex software systems. Worse, expenditures on expensive security measures often mean less money for innovative developments. These unwanted increases in cost and potential slowing of innovation are unavoidable consequences of an approach to security that finds and remediates faults after software has been implemented. We argue that software security can be improved and the total cost of a software system can be substantially reduced by an appropriate allocation of resources to the early stages of a software project. By adopting a similar allocation of R&D funds to the early stages of the software lifecycle, we propose that the costs of cyber-security can be better controlled and, consequently, the positive effects of this R&D on industry will be much more pronounced.
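The economics of early versus late fault remediation can be made concrete with a back-of-the-envelope model. The phase cost multipliers below are illustrative assumptions in the spirit of published defect-cost-escalation figures, not data from this paper.

```python
# Illustrative comparison: cost of removing the same 100 security faults when
# discovery is concentrated early vs. late in the lifecycle. Multipliers assumed.

FIX_COST = {"requirements": 1, "design": 2, "implementation": 5,
            "testing": 15, "deployment": 50}  # relative cost per fault fixed

def total_cost(faults_found_by_phase, unit_cost=1_000):
    return sum(FIX_COST[ph] * n * unit_cost for ph, n in faults_found_by_phase.items())

early = {"requirements": 40, "design": 30, "implementation": 20, "testing": 8, "deployment": 2}
late  = {"requirements": 2, "design": 5, "implementation": 13, "testing": 30, "deployment": 50}

print("early-emphasis cost:", total_cost(early))  # same faults, found sooner
print("late-emphasis cost: ", total_cost(late))
```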
A study on leakage radiation dose at ELV-4 electron accelerator bunker
NASA Astrophysics Data System (ADS)
Chulan, Mohd Rizal Md; Yahaya, Redzuwan; Ghazali, Abu Bakar Mhd
2014-09-01
Shielding is an important aspect of accelerator safety, and one of the most important elements of a bunker's shielding is the door. The door should be designed so that leakage radiation does not exceed the permitted limit of 2.5 μSv/hr. To determine the leakage radiation dose passing through the door and through gaps between the door and the wall, 2-dimensional manual calculations are often used. This method is difficult to apply because 2-dimensional visualization is limited and poorly represents the real geometry, so rough estimates are normally used instead. As a result, construction costs can rise: an overestimate or underestimate may require costly modification of the bunker. In this study, two methods are introduced to overcome the problem: simulation using MCNPX Version 2.6.0 software, and manual calculation using a 3-dimensional model built in Autodesk Inventor 2010 software. The values from the two methods were then compared with direct measurements taken with a Ludlum Model 3 survey meter with a Model 44-9 probe.
Autonomous Aerobraking Development Software: Phase 2 Summary
NASA Technical Reports Server (NTRS)
Cianciolo, Alicia D.; Maddock, Robert W.; Prince, Jill L.; Bowes, Angela; Powell, Richard W.; White, Joseph P.; Tolson, Robert; O'Shaughnessy, Daniel; Carrelli, David
2013-01-01
NASA has used aerobraking at Mars and Venus to reduce the fuel required to deliver a spacecraft into a desired orbit compared to an all-propulsive solution. Although aerobraking reduces the propellant required, it does so at the expense of longer mission duration, a large operations staff, and extensive DSN coverage. These factors make aerobraking a significant cost element in the mission design. By moving the current ground-based tasks of ephemeris determination, atmospheric density estimation, and maneuver sizing and execution on board the spacecraft, a flight project would realize significant cost savings. The NASA Engineering and Safety Center (NESC) sponsored Phases 1 and 2 of the Autonomous Aerobraking Development Software (AADS) study, which demonstrated the initial feasibility of moving these current ground-based functions to the spacecraft. This paper highlights key state-of-the-art advancements made in the Phase 2 effort to verify that the AADS algorithms are accurate, robust and ready to be considered for application on future missions that utilize aerobraking. The advancements discussed herein include both model updates and simulation and benchmark testing. Rigorous testing using observed flight atmospheres, operational environments and statistical analysis characterized the AADS operability in a perturbed environment.
2014-09-01
dollars annually to develop and implement enterprise resource planning (ERP) systems, which it considers critical to transforming the department's...modernization. DOD officials have stated that the implementation of the ERPs, such as the Global Combat Support System-Army (GCSS-Army), is a key...An ERP system is an automated system using commercial off-the-shelf software consisting of multiple, integrated functional modules that perform
Software-Enabled Project Management Techniques and Their Relationship to the Triple Constraints
ERIC Educational Resources Information Center
Elleh, Festus U.
2013-01-01
This study investigated the relationship between software-enabled project management techniques and the triple constraints (time, cost, and scope). There is a dearth of academic literature that focuses on the relationship between software-enabled project management techniques and the triple constraints (time, cost, and scope). Based on the gap…
48 CFR 234.7101 - Solicitation provision and contract clause.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Software Data Reporting 234.7101 Solicitation provision and contract clause. (a)(1) Use the provision at 252.234-7003, Notice of Cost and Software Data Reporting System, in all solicitations that include the clause at 252.234-7004, Cost and Software Data Reporting. (2) Use the provision with its Alternate I when...
48 CFR 234.7101 - Solicitation provision and contract clause.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Software Data Reporting 234.7101 Solicitation provision and contract clause. (a)(1) Use the provision at 252.234-7003, Notice of Cost and Software Data Reporting System, in all solicitations that include the clause at 252.234-7004, Cost and Software Data Reporting. (2) Use the provision with its Alternate I when...
48 CFR 234.7101 - Solicitation provision and contract clause.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Software Data Reporting 234.7101 Solicitation provision and contract clause. (a)(1) Use the provision at 252.234-7003, Notice of Cost and Software Data Reporting System, in all solicitations that include the clause at 252.234-7004, Cost and Software Data Reporting. (2) Use the provision with its Alternate I when...
48 CFR 234.7101 - Solicitation provision and contract clause.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Software Data Reporting 234.7101 Solicitation provision and contract clause. (a)(1) Use the provision at 252.234-7003, Notice of Cost and Software Data Reporting System, in all solicitations that include the clause at 252.234-7004, Cost and Software Data Reporting. (2) Use the provision with its Alternate I when...
ERIC Educational Resources Information Center
Lowe, John B.
1988-01-01
If the CD-ROM revolution is likened to gambling, players are information providers and consumers; the stakes are development, production, distribution, hardware, and software costs; and betting is represented by the costs of updating disks and hardware and software maintenance, and by pricing. Strategy should take into account cost savings,…
NASA Technical Reports Server (NTRS)
Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan
1993-01-01
Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. The latest release of the environment was in Feb. 1992.
Automating Risk Analysis of Software Design Models
Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
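As a rough sketch of how identification trees can drive automated threat detection over data-flow-diagram elements, consider the toy rules below. The element attributes, rules, and advice strings are invented for illustration; AutSEC's actual tree structures are defined in the paper.

```python
# Toy sketch of threat identification over data-flow-diagram elements.
# Element kinds, attributes, and rules are invented for illustration only.

from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    kind: str                       # "process", "data_store", or "data_flow"
    attrs: set = field(default_factory=set)

# Each rule pairs an identification predicate with a suggested mitigation.
RULES = [
    (lambda e: e.kind == "data_flow" and "encrypted" not in e.attrs,
     "eavesdropping", "encrypt the channel (e.g., TLS)"),
    (lambda e: e.kind == "process" and "validates_input" not in e.attrs,
     "injection", "validate and sanitize all inputs"),
    (lambda e: e.kind == "data_store" and "access_control" not in e.attrs,
     "tampering", "enforce access control on the store"),
]

def analyze(elements):
    for e in elements:
        for pred, threat, advice in RULES:
            if pred(e):
                print(f"{e.name}: potential {threat} -> {advice}")

analyze([Element("credentials", "data_flow", set()),
         Element("grid-service", "process", {"validates_input"}),
         Element("user-db", "data_store", set())])
```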
Analyzing the costs to deliver medication therapy management services.
Rupp, Michael T
2011-01-01
To provide pharmacy managers and consultant pharmacists with a step-by-step approach for analyzing the costs of delivering medication therapy management (MTM) services and to describe the use of a free online software application for determining the costs of delivering MTM. The process described is applicable to community pharmacies and consultant pharmacists who provide MTM services from nonpharmacy settings. The PharmAccount Service Cost Calculator is an Internet-based software application that uses a guided online interview to collect the information needed to conduct a comprehensive cost analysis of any specialized pharmacy service. In addition to direct variable and fixed costs, the software automatically allocates indirect and overhead costs to the service and generates an itemized report that details the components of service delivery costs. The service cost calculator is sufficiently flexible to support the analysis of virtually any specialized pharmacy service, irrespective of whether the service is being delivered from a physical pharmacy. The software application allows users to perform sensitivity analysis to quickly determine the potential impact that alternate scenarios would have on service delivery cost. It is therefore particularly well suited to assist in the design and planning of a new pharmacy service. Good management requires that the cost implications of service delivery decisions are known and considered. Analyzing the cost of an MTM service is an important step in developing a sustainable business model.
NASA Astrophysics Data System (ADS)
Demuzere, Matthias; Kassomenos, P.; Philipp, A.
2011-08-01
In the framework of the COST733 Action "Harmonisation and Applications of Weather Types Classifications for European Regions" a new circulation type classification software (hereafter referred to as the cost733class software) has been developed. The cost733class software contains a variety of (European) classification methods and is flexible with respect to the choice of domain of interest, input variables, time step, number of circulation types, sequencing and (weighted) target variables. This work introduces the capabilities of the cost733class software, in which the resulting circulation types (CTs) from various circulation type classifications (CTCs) are applied to observed summer surface ozone concentrations in Central Europe. Firstly, the main characteristics of the CTCs in terms of circulation pattern frequencies are addressed using the baseline COST733 catalogue (cat 2.0), at present the latest product of the new cost733class software. In a second step, the probabilistic Brier skill score is used to quantify the explanatory power of all classifications in terms of maximum 8-hourly mean ozone concentrations exceeding the 120 μg/m3 threshold, based on ozone concentrations from 130 Central European measurement stations. Averaged evaluation results over all stations indicate generally higher performance for CTCs with a higher number of types. Within the subset of methodologies with a similar number of types, the results suggest that CTCs based on optimisation algorithms perform slightly better than those based on other algorithms (predefined thresholds, principal component analysis and leader algorithms). The results are further elaborated by exploring additional capabilities of the cost733class software. Sensitivity experiments are performed using different domain sizes, input variables, seasonally based classifications and multiple-day sequencing. As an illustration, CTCs which are also conditioned on temperature with various weights are derived and tested similarly. All results are given a physical interpretation by adopting the environment-to-circulation approach, providing more detailed information on the specific synoptic conditions prevailing on days with high surface ozone concentrations. This research does not intend to bring forward a favourite classification methodology or construct a statistical ozone forecasting tool but should be seen as an introduction to the possibilities of the cost733class software. In this respect, the results presented here can provide basic user support for the cost733class software and the development of a more user- or application-specific CTC approach.
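The Brier-skill-score evaluation described above is easy to reproduce in outline. In the sketch below, the exceedance observations and per-type forecast probabilities are synthetic; the COST733 catalogue and station data are not reproduced.

```python
# Brier score and Brier skill score for probabilistic exceedance forecasts.
import numpy as np

def brier_score(p, o):
    """Mean squared difference between forecast probabilities p and binary outcomes o."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    return np.mean((p - o) ** 2)

def brier_skill_score(p, o):
    """BSS = 1 - BS/BS_ref, with climatology (the base rate) as reference forecast."""
    o = np.asarray(o, float)
    bs_ref = brier_score(np.full_like(o, o.mean()), o)
    return 1.0 - brier_score(p, o) / bs_ref

# o: days on which max 8-h ozone exceeded 120 ug/m3; p: per-circulation-type
# exceedance frequencies used as the probabilistic forecast (synthetic example).
o = np.array([0, 1, 1, 0, 1, 0, 0, 1])
p = np.array([0.2, 0.7, 0.8, 0.3, 0.6, 0.1, 0.2, 0.9])
print(f"BSS = {brier_skill_score(p, o):.3f}")  # > 0 means skill over climatology
```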
Kerr, Cliff C.; Haghparast-Bidgoli, Hassan; Estill, Janne; Grobicki, Laura; Baranczuk, Zofia; Prieto, Lorena; Montañez, Vilma; Reporter, Iyanoosh; Gray, Richard T.; Skordis-Worrall, Jolene; Keiser, Olivia; Cheikh, Nejma; Boonto, Krittayawan; Osornprasop, Sutayut; Lavadenz, Fernando; Benedikt, Clemens J.; Martin-Hughes, Rowan; Hussain, S. Azfar; Kelly, Sherrie L.; Kedziora, David J.; Wilson, David P.
2017-01-01
Background Prioritizing investments across health interventions is complicated by the nonlinear relationship between intervention coverage and epidemiological outcomes. It can be difficult for countries to know which interventions to prioritize for greatest epidemiological impact, particularly when budgets are uncertain. Methods We examined four case studies of HIV epidemics in diverse settings, each with different characteristics. These case studies were based on public data available for Belarus, Peru, Togo, and Myanmar. The Optima HIV model and software package was used to estimate the optimal distribution of resources across interventions associated with a range of budget envelopes. We constructed “investment staircases”, a useful tool for understanding investment priorities. These were used to estimate the best attainable cost-effectiveness of the response at each investment level. Findings We find that when budgets are very limited, the optimal HIV response consists of a smaller number of ‘core’ interventions. As budgets increase, those core interventions should first be scaled up, and then new interventions introduced. We estimate that the cost-effectiveness of HIV programming decreases as investment levels increase, but that the overall cost-effectiveness remains below GDP per capita. Significance It is important for HIV programming to respond effectively to the overall level of funding availability. The analytic tools presented here can help program planners understand the most cost-effective HIV responses and plan for an uncertain future. PMID:28972975
Ask Pete, software planning and estimation through project characterization
NASA Technical Reports Server (NTRS)
Kurtz, T.
2001-01-01
Ask Pete was developed by NASA to provide a tool for integrating the estimation and planning activities for a software development effort. It incorporates COCOMO II estimating with NASA's software development practices and IV&V criteria to characterize a project. This characterization is then used to generate estimates and tailored planning documents.
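For reference, the COCOMO II post-architecture effort equation that Ask Pete builds on has the published form Effort = A · Size^E · ΠEM with E = B + 0.01 · ΣSF. The sketch below uses the standard calibration constants (A = 2.94, B = 0.91); the scale-factor and effort-multiplier ratings are placeholder inputs, not NASA's calibration.

```python
# COCOMO II post-architecture effort sketch (standard constants A=2.94, B=0.91).
# Scale factors (SF) and effort multipliers (EM) below are placeholder ratings.

from math import prod

def cocomo2_effort(ksloc, scale_factors, effort_multipliers, A=2.94, B=0.91):
    E = B + 0.01 * sum(scale_factors)                 # economies/diseconomies of scale
    return A * ksloc ** E * prod(effort_multipliers)  # person-months

sf = [3.72, 3.04, 4.24, 3.29, 4.68]  # five scale factors, roughly nominal ratings
em = [1.10, 0.88, 1.00, 1.15]        # a few cost drivers; 1.0 = nominal
print(f"Estimated effort: {cocomo2_effort(50, sf, em):.1f} person-months")
```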
Jiang, Yu; Guarino, Peter; Ma, Shuangge; Simon, Steve; Mayo, Matthew S; Raghavan, Rama; Gajewski, Byron J
2016-07-22
Subject recruitment for medical research is challenging. Slow patient accrual leads to increased costs and delays in treatment advances. Researchers need reliable tools to manage and predict the accrual rate. The previously developed Bayesian method integrates researchers' experience on former trials and data from an ongoing study, providing a reliable prediction of the accrual rate for clinical studies. In this paper, we present a user-friendly graphical user interface program developed in R. A closed-form solution for the total number of subjects that can be recruited within a fixed time is derived. We also present a built-in Android application, written in Java, for web browsers and mobile devices. Using the accrual software, we re-evaluated the Veterans Affairs Cooperative Studies Program 558-ROBOTICS study. The application of the software in the monitoring and management of recruitment is illustrated for different stages of the trial. The accrual software provides a more convenient platform for estimation and prediction of the accrual process.
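A much-simplified version of the kind of accrual prediction the software performs is sketched below: blend a prior expectation of the accrual rate with the rate observed so far, then project enrollment to the deadline. The weighting scheme and numbers are illustrative assumptions, not the paper's closed-form Bayesian solution.

```python
# Simplified accrual projection: blend prior and observed accrual rates.
# Weights and numbers are illustrative; the paper derives a formal Bayesian solution.

def predict_total(n_enrolled, t_elapsed, t_total, prior_rate, prior_weight):
    observed_rate = n_enrolled / t_elapsed
    # Precision-style weighting: the prior counts as `prior_weight` months of data.
    rate = (prior_weight * prior_rate + t_elapsed * observed_rate) / (prior_weight + t_elapsed)
    return n_enrolled + rate * (t_total - t_elapsed)

# 120 subjects in 10 months of a 36-month study; prior expectation 15/month,
# trusted as much as 6 months of data.
print(f"Projected total: {predict_total(120, 10, 36, 15, 6):.0f} subjects")
```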
Prediction of Protein Configurational Entropy (Popcoen).
Goethe, Martin; Gleixner, Jan; Fita, Ignacio; Rubi, J Miguel
2018-03-13
A knowledge-based method for configurational entropy prediction of proteins is presented; this methodology is extremely fast, compared to previous approaches, because it does not involve any type of configurational sampling. Instead, the configurational entropy of a query fold is estimated by evaluating an artificial neural network, which was trained on molecular-dynamics simulations of ∼1000 proteins. The predicted entropy can be incorporated into a large class of protein software based on cost-function minimization/evaluation, in which configurational entropy is currently neglected for performance reasons. Software of this type is used for all major protein tasks such as structure prediction, protein design, NMR and X-ray refinement, docking, and mutation effect prediction. Integrating the predicted entropy can yield a significant accuracy increase, as we show exemplarily for native-state identification with the prominent protein software FoldX. The method has been termed Popcoen for Prediction of Protein Configurational Entropy. An implementation is freely available at http://fmc.ub.edu/popcoen/.
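Integration into cost-function-based protein software, as described above, amounts to adding an entropic term to the score being minimized. The sketch below is schematic: popcoen_entropy() is a stand-in for the published neural-network predictor, and the units and weighting are placeholders.

```python
# Schematic use of a predicted configurational entropy inside a protein scoring
# function. popcoen_entropy() is a stand-in for the actual neural-net predictor.

def popcoen_entropy(structure) -> float:
    """Placeholder for the Popcoen predictor (entropy in arbitrary units)."""
    return 0.8  # dummy value for illustration

def free_energy_score(structure, enthalpy_score: float, temperature: float = 298.0):
    # G = H - T*S: candidate structures are ranked by a full free-energy-like
    # score rather than by the enthalpy-like term alone.
    return enthalpy_score - temperature * popcoen_entropy(structure)

print(free_energy_score(structure=None, enthalpy_score=-120.0))
```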
NASA Astrophysics Data System (ADS)
Suresh, K.; Balaji, S.; Saravanan, K.; Navas, J.; David, C.; Panigrahi, B. K.
2018-02-01
We developed a simple, low-cost, user-friendly automated indirect ion beam fluence measurement system for ion irradiation and analysis experiments requiring indirect beam fluence measurements unperturbed by sample conditions such as low temperature, high temperature, and sample biasing, as well as for regular ion implantation experiments in ion implanters and electrostatic accelerators with continuous beam. The system, which uses simple, low-cost, off-the-shelf components/systems and two distinct layers of in-house built software, not only eliminates the need for costly data acquisition systems but also overcomes difficulties in using proprietary software. The hardware of the system is centered around a personal computer, a PIC16F887-based embedded system, a Faraday cup drive cum monitor circuit, a pair of Faraday cups, and a beam current integrator; the in-house developed software includes C-based microcontroller firmware and LabVIEW-based virtual instrument automation software. The automatic fluence measurement involves two important phases: a current sampling phase lasting 20-30 seconds, during which the ion beam current is continuously measured by intercepting the ion beam and the averaged beam current value is computed, and a subsequent charge computation phase lasting 700-900 seconds, during which the ion beam irradiates the samples and the incremental fluence received by the sample is estimated using the latest averaged beam current value from the sampling phase. The cycle of current sampling and charge computation is repeated until the required fluence is reached. Besides simplicity and cost-effectiveness, other important advantages of the developed system include easy reconfiguration of the system to suit customization of experiments, scalability, easy debugging and maintenance of the hardware/software, and the ability to work as a standalone system. The system was tested with different sets of samples and ion fluences, and the results were verified using the Rutherford backscattering technique, which showed satisfactory functioning of the system. The deviation of the fluence measurements is found to be less than 2%, which meets the demands of the irradiation experiments undertaken using the developed setup. The system has been incorporated for regular use at the existing ultra-high-vacuum (UHV) ion irradiation chamber of the 1.7 MV Tandem accelerator, and several ion implantation experiments on a variety of samples such as SS304, D9, and ODS alloys have been successfully carried out.
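The sample-then-irradiate duty cycle described above reduces to simple charge arithmetic: the fluence increment is (average current × irradiation time) / (charge state × e × beam area). The loop below simulates that cycle; the current, timings, and beam area are invented values, not the system's measurements.

```python
# Illustrative indirect-fluence loop: sample the beam current briefly, then
# accumulate fluence over the irradiation interval using the averaged current.
# All numeric values are invented for the example.

E_CHARGE = 1.602e-19   # elementary charge, C
BEAM_AREA_CM2 = 1.0    # assumed irradiated area
CHARGE_STATE = 1       # singly charged ions

def fluence_increment(i_avg_amps, t_irradiate_s):
    ions_per_second = i_avg_amps / (CHARGE_STATE * E_CHARGE)
    return ions_per_second * t_irradiate_s / BEAM_AREA_CM2  # ions/cm^2

target = 1e16          # ions/cm^2
fluence, cycles = 0.0, 0
while fluence < target:
    i_avg = 100e-9     # averaged current from the sampling phase: 100 nA (invented)
    fluence += fluence_increment(i_avg, t_irradiate_s=800)
    cycles += 1
print(f"Reached {fluence:.2e} ions/cm^2 after {cycles} sample/irradiate cycles")
```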
Economic evaluation of childhood epilepsy in a resource-challenged setting: A preliminary survey.
Ibrahim, Aliyu; Umar, Umar Isa; Usman, Umar Musa; Owolabi, Lukman Femi
2017-11-01
Considerable disease variability exists between patients with epilepsy, and the societal costs of epilepsy care are high overall because of its high frequency in the general population, especially among children in developing countries. A cross-sectional study was conducted in which children with an established diagnosis of epilepsy were interviewed using a semi-structured questionnaire. Prevalence-based costs were stratified by patients' sociodemographic characteristics and socioeconomic scores (SES). The 'bottom-up' and 'human capital' approaches were used to generate estimates of the direct and indirect (productivity losses) costs of epilepsy, respectively. All estimates of the financial burden of epilepsy were analyzed from the 'societal perspective' using IBM SPSS Statistics software, version 20.0. The study had 103 enrollees, most in the age group of 0-5 years (45.6%). The majority (61.3%) belonged to the low socioeconomic class (Ogunlesi SES class IV and V) and resided (80.6%) in an urban setting. The total direct and indirect costs per month were ₦2,149,965.00 ($8497.88) and ₦363,187.80 ($1435.52), respectively. The cost of care per patient per annum was ₦292,794.50 ($1157.29), and the total cost for all the patients per year was ₦30,157,833.60 ($119,200.92). Investigative procedures were the principal cost drivers (₦15,861.17 or $18.15), comprising approximately 58.7% of the total direct costs per patient. The cost of investigations contributed immensely to the total direct cost of care in our study. With the present economic situation in the country, out-of-pocket payments may contribute significantly to catastrophic expenditures and a worsening of the secondary treatment gap in children with epilepsy. Copyright © 2017 Elsevier Inc. All rights reserved.
Attributable cost and length of stay for central line-associated bloodstream infections.
Goudie, Anthony; Dynan, Linda; Brady, Patrick W; Rettiganti, Mallikarjuna
2014-06-01
Central line-associated bloodstream infections (CLABSI) are common types of hospital-acquired infections associated with high morbidity. Little is known about the attributable cost and length of stay (LOS) of CLABSI in pediatric inpatient settings. We determined the cost and LOS attributable to pediatric CLABSI from 2008 through 2011. A propensity score-matched case-control study was performed. Children <18 years with inpatient discharges in the Nationwide Inpatient Sample databases from the Healthcare Cost and Utilization Project from 2008 to 2011 were included. Discharges with CLABSI were matched to those without CLABSI by age, year, and high-dimensional propensity score (obtained from a logistic regression of CLABSI status on patient characteristics and the presence or absence of 262 individual clinical classification software diagnoses). Our main outcome measures were estimated costs obtained from cost-to-charge ratios and LOS for pediatric discharges. The mean attributable cost and LOS between matched CLABSI cases (1339) and non-CLABSI controls (2678) were $55,646 (2011 dollars) and 19 days, respectively. Between 2008 and 2011, the rate of pediatric CLABSI declined from 1.08 to 0.60 per 1000 (P < .001). Estimates of mean costs of treating patients with CLABSI declined from $111,852 to $98,621 (11.8%; P < .001) over this period, but the cost of treating matched non-CLABSI patients remained constant at ~$48,000. Despite significant improvement in rates, CLABSI remains a burden on patients, families, and payers. Continued attention to CLABSI-prevention initiatives and lower-cost CLABSI care management strategies to support high-value pediatric care delivery is warranted. Copyright © 2014 by the American Academy of Pediatrics.
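The matching design described above can be sketched in a few lines: fit a logistic regression of CLABSI status on patient characteristics, then match each case to the nearest unused controls on the estimated propensity score. The sketch below uses scikit-learn on synthetic data; the study's high-dimensional covariates (262 diagnosis indicators) are not reproduced.

```python
# Propensity-score matching sketch on synthetic data (1:2 nearest-neighbor).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 10))                            # patient characteristics
treated = rng.random(5000) < 0.2 / (1 + np.exp(-X[:, 0]))  # CLABSI status, confounded

ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

cases, controls = np.where(treated)[0], np.where(~treated)[0]
matched, available = [], set(controls)
for c in cases:
    # the two nearest unused controls by propensity score
    pool = sorted(available, key=lambda j: abs(ps[j] - ps[c]))[:2]
    matched.extend(pool)
    available -= set(pool)

print(f"{len(cases)} cases matched to {len(matched)} controls")
# Attributable cost/LOS would then be the mean difference between the groups.
```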
APPLICATION AND DEVELOPMENT OF APPROPRIATE TOOLS AND TECHNOLOGIES FOR COST-EFFECTIVE CARBON
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Sandra Brown; Ellen Hawes
2003-09-01
The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas impacts. The research described in this report occurred between July 1, 2002 and June 30, 2003. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: advanced videography testing; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Sandra Brown; Patrick Gonzalez
2004-07-10
The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas impacts. The research described in this report occurred between July 1, 2002 and June 30, 2003. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: remote sensing for carbon analysis; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.
Empirical cost models for estimating power and energy consumption in database servers
NASA Astrophysics Data System (ADS)
Valdivia Garcia, Harold Dwight
The explosive growth in the size of data centers, coupled with the widespread use of virtualization technology, has made power and energy consumption major concerns for data center administrators. Provisioning decisions must take into consideration not only target application performance but also the power demands and total energy consumption incurred by the hardware and software to be deployed at the data center. Failure to do so will result in damaged equipment, power outages, and inefficient operation. Since database servers comprise one of the most popular and important server applications deployed in such facilities, it becomes necessary to have accurate cost models that can predict the power and energy demands that each database workload will impose on the system. In this work we present an empirical methodology to estimate the power and energy cost of database operations. Our methodology uses multiple linear regression to derive accurate cost models that depend only on readily available statistics such as selectivity factors, tuple size, number of columns, and relational cardinality. Moreover, our method does not need measurements of individual hardware components, but rather total power and energy consumption measured at the server. We have implemented our methodology and run experiments with several server configurations. Our experiments indicate that we can predict power and energy more accurately than alternative methods found in the literature.
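The regression methodology described above can be sketched directly: regress total measured server power on readily available query statistics. The synthetic data and coefficients below are placeholders standing in for the operator statistics named in the abstract.

```python
# Multiple-linear-regression sketch: predict server power from query statistics.
# Synthetic data; features mirror the kinds of statistics named in the abstract.
import numpy as np

rng = np.random.default_rng(1)
n = 500
selectivity = rng.random(n)
tuple_size = rng.uniform(50, 500, n)      # bytes
n_columns = rng.integers(2, 40, n)
cardinality = rng.uniform(1e3, 1e7, n)    # rows

X = np.column_stack([np.ones(n), selectivity, tuple_size, n_columns, np.log(cardinality)])
true_w = np.array([120.0, 15.0, 0.02, 0.5, 4.0])   # watts; invented coefficients
power = X @ true_w + rng.normal(0, 2.0, n)         # "measured at the wall"

w, *_ = np.linalg.lstsq(X, power, rcond=None)      # ordinary least squares fit
print("fitted coefficients:", np.round(w, 3))
```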
Ayyanat, Jayachandran A; Harbour, Catherine; Kumar, Sanjeev; Singh, Manjula
2018-01-05
Many interventions have attempted to increase vulnerable and remote populations' access to ORS and zinc to reduce child mortality from diarrhoea. However, the impact of these interventions is difficult to measure. From 2010 to 2015, the Micronutrient Initiative (MI) worked with the public sector in Bihar, India to enable community health workers to treat and report uncomplicated child diarrhoea with ORS and zinc. We describe how we estimated the programme's impact on child mortality with Lives Saved Tool (LiST) modelling and data from MI's management information system (MIS). This study demonstrates that using LiST modelling and MIS data is a viable option for evaluating programmes to reduce child mortality. We used MI's programme monitoring data to estimate coverage rates and LiST modelling software to estimate programme impact on child mortality. Four scenarios estimated the effects of different rates of programme scale-up and programme coverage on estimated child mortality by measuring children's lives saved. The programme saved an estimated 806-975 children under 5 who had diarrhoea during the five-year project phase. Increasing ORS and zinc coverage rates to 19.8% and 18.3%, respectively, under public sector coverage with effective treatment would have increased the programme's impact on child mortality and could have achieved the project goal of saving 4200 children's lives during the five-year programme. Programme monitoring data can be used with LiST modelling software to estimate coverage rates and programme impact on child mortality. This modelling approach may cost less and yield estimates sooner than directly measuring programme impact with population-based surveys. However, users must be cautious about relying on modelled estimates of impact and ensure that the programme monitoring data used are complete and precise about the programme aspects that are modelled. Otherwise, LiST may mis-estimate impact on child mortality. Further, LiST software may require modifications to its built-in assumptions to capture programmatic inputs. LiST assumes that mortality rates and cause-of-death structure change only in response to changes in programme coverage. In Bihar, overall child mortality has decreased and diarrhoea seems to be less lethal than previously, but at present LiST does not adjust its estimates for these sorts of changes.
Petroleum Refinery Jobs and Economic Development Impact (JEDI) Model User Reference Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, Marshall
The Jobs and Economic Development Impact (JEDI) models, developed through the National Renewable Energy Laboratory (NREL), are user-friendly tools utilized to estimate the economic impacts at the local level of constructing and operating fuel and power generation projects for a range of conventional and renewable energy technologies. The JEDI Petroleum Refinery Model User Reference Guide was developed to assist users in employing and understanding the model. This guide provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted. Based on project-specific inputs from the user, the model estimates job creation, earnings and output (total economic activity) for a given petroleum refinery. This includes the direct, indirect and induced economic impacts to the local economy associated with the refinery's construction and operation phases. Project cost and job data used in the model are derived from the most current cost estimations available. Local direct and indirect economic impacts are estimated using economic multipliers derived from IMPLAN software. By determining the regional economic impacts and job creation for a proposed refinery, the JEDI Petroleum Refinery model can be used to field questions about the added value refineries may bring to the local community.
Using Pilots to Assess the Value and Approach of CMMI Implementation
NASA Technical Reports Server (NTRS)
Godfrey, Sara; Andary, James; Rosenberg, Linda
2002-01-01
At Goddard Space Flight Center (GSFC), we have chosen to use Capability Maturity Model Integrated (CMMI) to guide our process improvement program. Projects at GSFC consist of complex systems of software and hardware that control satellites, operate ground systems, run instruments, manage databases and data and support scientific research. It is a challenge to launch a process improvement program that encompasses our diverse systems, yet is manageable in terms of cost effectiveness. In order to establish the best approach for improvement, our process improvement effort was divided into three phases: 1) Pilot projects; 2) Staged implementation; and 3) Sustainment and continual improvement. During Phase 1 the focus of the activities was on a baselining process, using pre-appraisals in order to get a baseline for making a better cost and effort estimate for the improvement effort. Pilot pre-appraisals were conducted from different perspectives so different approaches for process implementation could be evaluated. Phase 1 also concentrated on establishing an improvement infrastructure and training of the improvement teams. At the time of this paper, three pilot appraisals have been completed. Our initial appraisal was performed in a flight software area, considering the flight software organization as the organization. The second appraisal was done from a project perspective, focusing on systems engineering and acquisition, and using the organization as GSFC. The final appraisal was in a ground support software area, again using GSFC as the organization. This paper will present our initial approach, lessons learned from all three pilots and the changes in our approach based on the lessons learned.
Expanding Software Productivity and Power while Reducing Costs.
ERIC Educational Resources Information Center
Winer, Ellen N.
1988-01-01
Microcomputer efficiency and software economy can be achieved through file transfer and data sharing. Costs can be reduced by purchasing computer systems that allow for expansion and portability of data. (MLF)
Development of a Low-cost, Comprehensive Recording System for Circadian Rhythm Behavior.
Kwon, Jea; Park, Min Gu; Lee, Seung Eun; Lee, C Justin
2018-02-01
Circadian rhythm is defined as a 24-hour biological oscillation, which persists even without any external cues but can also be re-entrained by various environmental cues. One widely accepted circadian rhythm behavioral experiment is measuring the wheel-running activity (WRA) of rodents. However, commercially available WRA recording systems are not easily affordable for researchers due to the high cost of implementing sensors for wheel rotation. Here, we developed a cost-effective and comprehensive system for circadian rhythm recording by measuring house-keeping activities (HKA). We monitored animals' HKA as an electrical signal by simply connecting the animal housing cage to a standard analog/digital converter: input to the metal lid and ground to the metal grid floor. We show that the acquired electrical signals are the combined activities of eating, drinking, and natural locomotor behaviors, which are well-known indicators of circadian rhythm. Post-processing of the measured electrical signals enabled us to draw actograms, verifying HKA to be a reliable circadian rhythm indicator. To give researchers easy access to HKA recording, we have developed user-friendly MATLAB-based software, Circa Analysis. This software provides functions for easy extraction of scalable "touch activity" from raw data files by automating seven steps of post-processing and drawing actograms with a highly intuitive user interface and various options. We estimate the cost of our HKA circadian rhythm recording system to be less than $150 per channel. We anticipate our system will benefit many researchers who would like to study circadian rhythm.
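The post-processing described above, thresholding the raw electrical signal into "touch activity" and binning it for an actogram row, can be outlined simply. The sampling rate, threshold, and bin width below are arbitrary illustrative choices, not the ones used by Circa Analysis.

```python
# Sketch: turn a raw house-keeping-activity voltage trace into actogram bins.
# Sampling rate, threshold, and bin width are arbitrary illustrative choices.
import numpy as np

def touch_activity(signal, fs_hz, threshold, bin_minutes=6):
    """Count supra-threshold samples per time bin (one row of an actogram)."""
    active = np.abs(signal) > threshold
    bin_len = int(fs_hz * 60 * bin_minutes)
    n_bins = len(active) // bin_len
    return active[: n_bins * bin_len].reshape(n_bins, bin_len).sum(axis=1)

fs = 10.0                          # samples per second (assumed)
n = int(24 * 3600 * fs)            # one day of recording
rng = np.random.default_rng(2)
signal = rng.normal(0, 0.05, n)    # baseline noise
dark = np.arange(n) >= n // 2      # second half of the day = dark phase
signal[dark] += (rng.random(dark.sum()) < 0.3) * 0.5   # nocturnal activity bursts

bins = touch_activity(signal, fs, threshold=0.2)
print(f"{bins.size} bins/day; dark-phase share of activity: "
      f"{bins[bins.size // 2:].sum() / bins.sum():.2f}")
```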
Parallel computers - Estimate errors caused by imprecise data
NASA Technical Reports Server (NTRS)
Kreinovich, Vladik; Bernat, Andrew; Villa, Elsa; Mariscal, Yvonne
1991-01-01
A new approach to the problem of estimating errors caused by imprecise data is proposed in the context of software engineering. An ideal solution to the problem would be a software device by which the computer is capable of computing the errors of arbitrary programs. The software engineering aspect of this problem is to describe a device for computing the error estimates in software terms and then to provide precise numbers with error estimates to the user. The feasibility of a program capable of computing both some quantity and its error estimate over the range of possible measurement errors is demonstrated.
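One standard way to realize the described capability, computing a quantity together with the error induced by imprecise inputs, is interval arithmetic: run the computation on intervals instead of point values. A minimal sketch under that interpretation, not the authors' actual device:

```python
# Minimal interval-arithmetic sketch: propagate input measurement errors
# through a computation to bound the error of the result.
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        # The product's bounds come from the extreme corner combinations.
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

def measured(value, err):
    """An input known only to within +/- err."""
    return Interval(value - err, value + err)

# f(x, y) = x*y + x, with x = 2.0 +/- 0.1 and y = 3.0 +/- 0.2
x, y = measured(2.0, 0.1), measured(3.0, 0.2)
result = x * y + x
print(f"result in [{result.lo:.3f}, {result.hi:.3f}]")  # guaranteed error bounds
```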
NASA Astrophysics Data System (ADS)
Marculescu, Bogdan; Feldt, Robert; Torkar, Richard; Green, Lars-Goran; Liljegren, Thomas; Hult, Erika
2011-08-01
Verification and validation is an important part of software development and accounts for a significant share of the costs associated with such a project. For developers of life- or mission-critical systems, such as software being developed for space applications, a balance must be reached between ensuring the quality of the system by extensive and rigorous testing and reducing costs so the company can compete. Ensuring the quality of any system starts with a quality development process. To evaluate both the software development process and the product itself, measurements are needed. A balance must then be struck between ensuring the best possible quality of both process and product on the one hand, and reducing the cost of performing measurements on the other. A number of measurements have already been defined and are being used. For some of these, data collection can be automated as well, further lowering the costs associated with implementing them. In practice, however, there may be situations where existing measurements are unsuitable for a variety of reasons. This paper describes a framework for creating low-cost, flexible measurements in areas where initial information is scarce. The framework, called the Measurements Exploration Framework, is aimed in particular at the space software development industry and was developed in such an environment.
48 CFR 242.503-2 - Post-award conference procedure.
Code of Federal Regulations, 2014 CFR
2014-10-01
... contracts that include the clause at 252.234-7004, Cost and Software Data Reporting, postaward conferences shall include a discussion of the contractor's standard cost and software data reporting (CSDR) process...
48 CFR 242.503-2 - Post-award conference procedure.
Code of Federal Regulations, 2012 CFR
2012-10-01
... contracts that include the clause at 252.234-7004, Cost and Software Data Reporting, postaward conferences shall include a discussion of the contractor's standard cost and software data reporting (CSDR) process...
48 CFR 242.503-2 - Post-award conference procedure.
Code of Federal Regulations, 2011 CFR
2011-10-01
... contracts that include the clause at 252.234-7004, Cost and Software Data Reporting, postaward conferences shall include a discussion of the contractor's standard cost and software data reporting (CSDR) process...
A Novel Rules Based Approach for Estimating Software Birthmark
Binti Alias, Norma; Anwar, Sajid
2015-01-01
Software birthmark is a unique quality of software that can be used to detect software theft. Comparing the birthmarks of software can tell us whether a program or software is a copy of another. Software theft and piracy are rapidly increasing problems involving the copying, stealing, and misuse of software without proper permission, in violation of the license agreement. The estimation of a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate software birthmarks based on the two most sought-after properties of birthmarks, that is, credibility and resilience. For this purpose, concepts of soft computing such as probabilistic and fuzzy computing have been taken into account, and fuzzy logic is used to estimate the properties of a birthmark. The proposed fuzzy rule based technique is validated through a case study, and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. PMID:25945363
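A toy version of the fuzzy rule-based assessment conveys the idea: fuzzify the credibility and resilience measurements, apply Mamdani-style rules, and defuzzify into a single birthmark-quality estimate. The membership functions and rules below are invented for illustration, not the paper's.

```python
# Toy fuzzy estimation of birthmark quality from credibility and resilience.
# Membership functions and rules are invented for illustration.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify(v):
    return {"low": tri(v, -0.5, 0.0, 0.5), "med": tri(v, 0.0, 0.5, 1.0),
            "high": tri(v, 0.5, 1.0, 1.5)}

def birthmark_quality(credibility, resilience):
    c, r = fuzzify(credibility), fuzzify(resilience)
    # Mamdani-style rules: quality is high only when both properties are high.
    rules = {1.0: min(c["high"], r["high"]),
             0.5: max(min(c["high"], r["med"]), min(c["med"], r["high"]),
                      min(c["med"], r["med"])),
             0.1: max(c["low"], r["low"])}
    num = sum(q * w for q, w in rules.items())
    den = sum(rules.values())
    return num / den if den else 0.0   # weighted-average defuzzification

print(f"quality = {birthmark_quality(0.8, 0.6):.2f}")
```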
NASA Technical Reports Server (NTRS)
Shell, Elaine M.; Lue, Yvonne; Chu, Martha I.
1999-01-01
Flight software is a mission critical element of spacecraft functionality and performance. When ground operations personnel interface to a spacecraft, they are typically dealing almost entirely with the capabilities of onboard software. This software, even more than critical ground/flight communications systems, is expected to perform perfectly during all phases of spacecraft life. Because it can be reprogrammed on-orbit to accommodate degradations or failures in flight hardware, new insights into spacecraft characteristics, new control options which permit enhanced science options, etc., the on-orbit flight software maintenance team is usually significantly responsible for the long term success of a science mission. Failure of flight software to perform as needed can result in very expensive operations work-around costs and lost science opportunities. There are three basic approaches to maintaining spacecraft software--namely using the original developers, using the mission operations personnel, or assembling a center of excellence for multi-spacecraft software maintenance. Not planning properly for flight software maintenance can lead to unnecessarily high on-orbit costs and/or unacceptably long delays, or errors, in patch installations. A common approach for flight software maintenance is to access the original development staff. The argument for utilizing the development staff is that the people who developed the software will be the best people to modify the software on-orbit. However, it can quickly become a challenge to obtain the services of these key people. They may no longer be available to the organization. They may have a more urgent job to perform, quite likely on another project under different project management. If they haven't worked on the software for a long time, they may need precious time for refamiliarization with the software, testbeds and tools. Further, a lack of insight into issues related to flight software in its on-orbit environment may find the developer unprepared for the challenges. The second approach is to train a member of the flight operations team to maintain the spacecraft software. This can prove to be a costly and inflexible solution. The person assigned to this duty may not have enough work to do during a problem-free period and may have too much to do when a problem arises. If the person is a talented software engineer, he/she may not enjoy the limited software opportunities available in this position, and may eventually leave for newer-technology computer science opportunities. Training replacement flight software personnel can be a difficult and lengthy process. The third approach is to assemble a center of excellence for on-orbit spacecraft software maintenance. Personnel in this specialty center can be managed to support the flight software of multiple missions at once. The variety of challenges among a set of on-orbit missions can result in a dedicated, talented staff which is fully trained and available to support each mission's needs. Such staff are not software developers but are rather spacecraft software systems engineers. The cost to any one mission is extremely low because the software staff works on, and charges minimally to, missions with no current operations issues; and their professional insight into on-orbit software troubleshooting and maintenance methods ensures low-risk, effective, and minimal-cost solutions to on-orbit issues.
Analysis of the Production Cost for Various Grades of Biomass Thermal Treatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cherry, Robert S.; Wood, Rick A.; Westover, Tyler L.
2013-12-01
Process flow sheets were developed for the thermal treatment of southern pine wood chips at four temperatures (150, 180, 230, and 270 degrees C) and two different scales (20 and 100 ton/hour). The larger capacity processes had as their primary heat source hot gas assumed to be available in quantity from an adjacent biorefinery. Mass and energy balances for these flow sheets were developed using Aspen Plus process simulation software. The hot gas demands in the larger processes, up to 1.9 million lb/hour, were of questionable feasibility because of the volume to be moved. This heat was of low utility because the torrefaction process, especially at higher temperatures, is a net heat producer if the organic byproduct gases are burned. A thermal treatment flow sheet using wood chips dried in the biorefinery to 10% moisture content (rather than 30% for green chips) with transfer of high temperature steam from the thermal treatment depot to the biorefinery was also examined. The equipment size information from all of these cases was used in several different equipment cost estimating methods to estimate the major equipment costs for each process. From these, factored estimates of other plant costs were determined, leading to estimates (± 30% accuracy) of total plant capital cost. The 20 ton/hour processes were close to 25 million dollars except for the 230 degrees C case using dried wood chips, which was only 15 million dollars because of its small furnace. The larger processes ranged from 64-120 million dollars. From these capital costs and projections of several categories of operating costs, the processing cost of thermally treated pine chips was found to be $28-33 per ton depending on the degree of treatment and without any credits for steam generation. If the excess energy output of the two 20 ton/hr depot cases at 270 degrees C can be sold for $10 per million BTU, the net processing cost drops to $13/ton product starting with green wood chips, or only $3 per ton if using dried chips from the biorefinery. Including a 12% return on invested capital raised all of the operating cost results by about $20/ton.
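The "factored estimates" step mentioned above is a standard capacity-exponent and Lang-factor technique, sketched below in outline. The exponent, factor, and base equipment cost are generic textbook-style placeholders, not the report's values.

```python
# Factored capital-cost sketch: scale equipment cost by a capacity exponent,
# then apply a Lang-type factor for total installed plant cost.
# All numbers are generic placeholders, not values from the report.

def scaled_equipment_cost(base_cost, base_capacity, capacity, exponent=0.6):
    """Six-tenths-rule scaling of purchased-equipment cost with capacity."""
    return base_cost * (capacity / base_capacity) ** exponent

def total_plant_cost(equipment_cost, lang_factor=4.0):
    """Installed plant cost from purchased-equipment cost via a Lang factor."""
    return equipment_cost * lang_factor

eq20 = 6.0e6                                  # purchased equipment, 20 t/h (placeholder)
eq100 = scaled_equipment_cost(eq20, 20, 100)  # scale to 100 t/h
print(f"100 t/h equipment: ${eq100 / 1e6:.1f}M, "
      f"plant: ${total_plant_cost(eq100) / 1e6:.1f}M")
```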
Comparative abilities of Microsoft Kinect and Vicon 3D motion capture for gait analysis.
Pfister, Alexandra; West, Alexandre M; Bronner, Shaw; Noah, Jack Adam
2014-07-01
Biomechanical analysis is a powerful tool in the evaluation of movement dysfunction in orthopaedic and neurologic populations. Three-dimensional (3D) motion capture systems are widely used, accurate systems, but are costly and not available in many clinical settings. The Microsoft Kinect™ has the potential to be used as an alternative low-cost motion analysis tool. The purpose of this study was to assess the concurrent validity of the Kinect™ with Brekel Kinect software, in comparison to Vicon Nexus, for sagittal-plane gait kinematics. Twenty healthy adults (nine male, 11 female) were tracked while walking and jogging at three velocities on a treadmill. Concurrent hip and knee peak flexion and extension and stride timing measurements were compared between Vicon and Kinect™. Although Kinect™ measurements were representative of normal gait, the Kinect™ generally under-estimated joint flexion and over-estimated extension. Kinect™ and Vicon hip angular displacement correlation was very low and error was large. Kinect™ knee measurements were somewhat better than hip, but were not consistent enough for clinical assessment. Correlation between Kinect™ and Vicon stride timing was high and error was fairly small. Variability in Kinect™ measurements was smallest at the slowest velocity. The Kinect™ has basic motion capture capabilities and with some minor adjustments will be an acceptable tool to measure stride timing, but sophisticated advances in software and hardware are necessary to improve Kinect™ sensitivity before it can be implemented for clinical use.
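Concurrent-validity comparisons of this kind typically reduce to correlation and error statistics between paired measurements from the two systems. A minimal sketch on synthetic peak-flexion angles (not the study's data):

```python
# Concurrent-validity sketch: correlation and error between paired Kinect and
# Vicon peak knee-flexion measurements. Synthetic data, not the study's.
import numpy as np

rng = np.random.default_rng(4)
vicon = rng.normal(60, 5, 20)                      # "gold standard" peak flexion (deg)
kinect = 0.9 * vicon - 2 + rng.normal(0, 3, 20)    # under-estimates, with noise

r = np.corrcoef(vicon, kinect)[0, 1]               # Pearson correlation
bias = np.mean(kinect - vicon)                     # systematic offset
rmse = np.sqrt(np.mean((kinect - vicon) ** 2))     # overall error magnitude
print(f"r = {r:.2f}, bias = {bias:.1f} deg, RMSE = {rmse:.1f} deg")
```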
A process model to estimate biodiesel production costs.
Haas, Michael J; McAloon, Andrew J; Yee, Winnie C; Foglia, Thomas A
2006-03-01
'Biodiesel' is the name given to a renewable diesel fuel that is produced from fats and oils. It consists of the simple alkyl esters of fatty acids, most typically the methyl esters. We have developed a computer model to estimate the capital and operating costs of a moderately-sized industrial biodiesel production facility. The major process operations in the plant were continuous-process vegetable oil transesterification, and ester and glycerol recovery. The model was designed using contemporary process simulation software, and current reagent, equipment and supply costs, following current production practices. Crude, degummed soybean oil was specified as the feedstock. Annual production capacity of the plant was set at 37,854,118 l (10 x 10^6 gal). Facility construction costs were calculated to be US $11.3 million. The largest contributors to the equipment cost, accounting for nearly one third of expenditures, were storage tanks to contain a 25 day capacity of feedstock and product. At a value of US $0.52/kg ($0.236/lb) for feedstock soybean oil, a biodiesel production cost of US $0.53/l ($2.00/gal) was predicted. The single greatest contributor to this value was the cost of the oil feedstock, which accounted for 88% of total estimated production costs. An analysis of the dependence of production costs on the cost of the feedstock indicated a direct linear relationship between the two, with a change of US $0.020/l ($0.075/gal) in product cost per US $0.022/kg ($0.01/lb) change in oil cost. Process economics included the recovery of coproduct glycerol generated during biodiesel production, and its sale into the commercial glycerol market as an 80% w/w aqueous solution, which reduced production costs by approximately 6%. The production cost of biodiesel was found to vary inversely and linearly with variations in the market value of glycerol, increasing by US $0.0022/l ($0.0085/gal) for every US $0.022/kg ($0.01/lb) reduction in glycerol value. The model is flexible in that it can be modified to calculate the effects on capital and production costs of changes in feedstock cost, changes in the type of feedstock employed, changes in the value of the glycerol coproduct, and changes in process chemistry and technology.
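The sensitivities reported in the abstract define a simple linear cost model that can be written out directly. The sketch below encodes the stated relationships (baseline $0.53/l at $0.52/kg oil; +$0.020/l per +$0.022/kg oil; +$0.0022/l per $0.022/kg drop in glycerol value); the baseline glycerol value is an assumed placeholder, and the sketch is a reading of the abstract, not the full process model.

```python
# Linear biodiesel production-cost model built from the sensitivities reported
# in the abstract. BASE_GLY_PER_KG is an assumed placeholder baseline.

BASE_COST_PER_L = 0.53
BASE_OIL_PER_KG = 0.52
BASE_GLY_PER_KG = 0.22   # assumed baseline glycerol value (placeholder)

def biodiesel_cost(oil_per_kg, glycerol_per_kg=BASE_GLY_PER_KG):
    cost = BASE_COST_PER_L
    cost += 0.020 * (oil_per_kg - BASE_OIL_PER_KG) / 0.022        # feedstock slope
    cost += 0.0022 * (BASE_GLY_PER_KG - glycerol_per_kg) / 0.022  # glycerol slope (inverse)
    return cost

print(f"oil at $0.60/kg:  ${biodiesel_cost(0.60):.3f}/l")
print(f"glycerol halves:  ${biodiesel_cost(0.52, BASE_GLY_PER_KG / 2):.3f}/l")
```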
Advanced software development workstation project: Engineering scripting language. Graphical editor
NASA Technical Reports Server (NTRS)
1992-01-01
Software development is widely considered to be a bottleneck in the development of complex systems, both in terms of development and in terms of maintenance of deployed systems. Cost of software development and maintenance can also be very high. One approach to reducing costs and relieving this bottleneck is increasing the reuse of software designs and software components. A method for achieving such reuse is a software parts composition system. Such a system consists of a language for modeling software parts and their interfaces, a catalog of existing parts, an editor for combining parts, and a code generator that takes a specification and generates code for that application in the target language. The Advanced Software Development Workstation is intended to be an expert system shell designed to provide the capabilities of a software part composition system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lukac, Martin; Ramanathan, Nithya; Graham, Eric
2013-09-10
Black carbon (BC) emissions from traditional cooking fires and other sources are significant anthropogenic drivers of radiative forcing. Clean cookstoves present a more energy-efficient and cleaner-burning vehicle for cooking than traditional wood-burning stoves, yet many existing cookstoves reduce emissions by only modest amounts. Further research into cookstove use, fuel types, and verification of emissions is needed as adoption rates for such stoves remain low. Accelerated innovation requires techniques for measuring and verifying such cookstove performance. The overarching goal of the proposed program was to develop a low-cost, wireless instrument to provide a high-resolution profile of the cookstove BC emissions and usage in the field. We proposed transferring the complexity of analysis away from the sampling hardware at the measurement site and to software at a centrally located server to easily analyze data from thousands of sampling instruments. We were able to build a low-cost field-based instrument that produces repeatable, low-cost estimates of cookstove usage, fuel estimates, and emission values with low variability. Emission values from our instrument were consistent with published ranges of emissions for similar stove and fuel types.
ERIC Educational Resources Information Center
Tan, Andrea; Ferreira, Aldónio
2012-01-01
This study investigates the influence of the use of accounting software in teaching activity-based costing (ABC) on the learning process. It draws upon the Theory of Planned Behaviour and uses the end-user computer satisfaction (EUCS) framework to examine students' satisfaction with the ABC software. The study examines students' satisfaction with…
NASA Technical Reports Server (NTRS)
Fridge, Ernest M., III
1991-01-01
Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools are passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.
Agile Development Methods for Space Operations
NASA Technical Reports Server (NTRS)
Trimble, Jay; Webster, Chris
2012-01-01
Mainstream industry software development practice has gone from a traditional waterfall process to agile iterative development that allows for fast response to customer inputs and produces higher quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing the Mission Control Technologies Software in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).
A less field-intensive robust design for estimating demographic parameters with Mark-resight data
McClintock, B.T.; White, Gary C.
2009-01-01
The robust design has become popular among animal ecologists as a means for estimating population abundance and related demographic parameters with mark-recapture data. However, two drawbacks of traditional mark-recapture are financial cost and repeated disturbance to animals. Mark-resight methodology may in many circumstances be a less expensive and less invasive alternative to mark-recapture, but the models developed to date for these data have overwhelmingly concentrated only on the estimation of abundance. Here we introduce a mark-resight model analogous to that used in mark-recapture for the simultaneous estimation of abundance, apparent survival, and transition probabilities between observable and unobservable states. The model may be implemented using standard statistical computing software, but it has also been incorporated into the freeware package Program MARK. We illustrate the use of our model with mainland New Zealand Robin (Petroica australis) data collected to ascertain whether this methodology may be a reliable alternative for monitoring endangered populations of a closely related species inhabiting the Chatham Islands. We found this method to be a viable alternative to traditional mark-recapture when cost or disturbance to species is of particular concern in long-term population monitoring programs. © 2009 by the Ecological Society of America.
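For intuition only (this is not the authors' model, which jointly estimates survival and state transitions), the simplest mark-resight abundance estimator scales the number of marked animals by the ratio of total to marked sightings:

```python
def simple_mark_resight_abundance(n_marked, sightings_marked, sightings_unmarked):
    """Canonical sighting-ratio estimator: N_hat = M * (m + u) / m.

    n_marked: number of animals known to carry marks (M)
    sightings_marked / sightings_unmarked: resighting counts (m, u)
    """
    if sightings_marked == 0:
        raise ValueError("need at least one resighting of a marked animal")
    return n_marked * (sightings_marked + sightings_unmarked) / sightings_marked

# Example: 40 marked birds, 65 marked and 90 unmarked sightings -> ~95 animals.
print(simple_mark_resight_abundance(40, 65, 90))
```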
Adapting Wireless Technology to Lighting Control and Environmental Sensing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dana Teasdale; Francis Rubinstein; Dave Watson
The high cost of retrofitting buildings with advanced lighting control systems is a barrier to adoption of this energy-saving technology. Wireless technology, however, offers a solution to mounting installation costs since it requires no additional wiring to implement. To demonstrate the feasibility of such a system, a prototype wirelessly-controlled advanced lighting system was designed and built. The system includes the following components: a wirelessly-controllable analog circuit module (ACM), a wirelessly-controllable electronic dimmable ballast, a T8 3-lamp fixture, an environmental multi-sensor, a current transducer, and control software. The ACM, dimmable ballast, multi-sensor, and current transducer were all integrated with SmartMesh™ wireless mesh networking nodes, called motes, enabling wireless communication, sensor monitoring, and actuator control. Each mote-enabled device has a reliable communication path to the SmartMesh Manager, a single-board computer that controls network functions and connects the wireless network to a PC running lighting control software. The ACM is capable of locally driving one or more standard 0-10 Volt electronic dimmable ballasts through relay control and a 0-10 Volt controllable output. The mote-integrated electronic dimmable ballast is designed to drive a standard 3-lamp T8 light fixture. The environmental multi-sensor measures occupancy, light level, and temperature. The current transducer is used to measure the power consumed by the fixture. Control software was developed to implement advanced lighting algorithms, including daylight ramping, occupancy control, and demand response. Engineering prototypes of each component were fabricated and tested in a bench-scale system. Based on standard industry practices, a cost analysis was conducted. It is estimated that the installation cost of a wireless advanced lighting control system for a retrofit application is at least 30% lower than a comparable wired system for a typical 16,000 square-foot office building, with a payback period of less than 3 years.
Characterizing and Modeling the Cost of Rework in a Library of Reusable Software Components
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Condon, Steven E.; ElEmam, Khaled; Hendrick, Robert B.; Melo, Walcelio
1997-01-01
In this paper we characterize and model the cost of rework in a Component Factory (CF) organization. A CF is responsible for developing and packaging reusable software components. Data was collected on corrective maintenance activities for the Generalized Support Software reuse asset library located at the Flight Dynamics Division of NASA's GSFC. We then constructed a predictive model of the cost of rework using the C4.5 system for generating a logical classification model. The predictor variables for the model are measures of internal software product attributes. The model demonstrates good prediction accuracy, and can be used by managers to allocate resources for corrective maintenance activities. Furthermore, we used the model to generate proscriptive coding guidelines to improve programming practices so that the cost of rework can be reduced in the future. The general approach we have used is applicable to other environments.
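C4.5 itself is a specific decision-tree learner; as a rough modern analogue (an assumption, not the authors' tooling), a CART-style classifier from scikit-learn can be trained on internal product metrics to flag components likely to need costly rework. All metric names and data below are hypothetical:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical internal product metrics per component:
# [lines of code, cyclomatic complexity, number of modules changed]
X = np.array([
    [120,  4, 1], [850, 22, 5], [300,  9, 2],
    [1500, 35, 8], [200,  6, 1], [950, 28, 6],
])
# 1 = high rework cost observed during corrective maintenance, 0 = low
y = np.array([0, 1, 0, 1, 0, 1])

clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(clf, feature_names=["loc", "complexity", "modules"]))
print(clf.predict([[700, 18, 4]]))  # classify a new component
```

The learned tree doubles as a readable rule set, which is what makes tree learners attractive for deriving coding guidelines from the model rather than just predictions.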
Makanza, R; Zaman-Allah, M; Cairns, J E; Eyre, J; Burgueño, J; Pacheco, Ángela; Diepenbrock, C; Magorokosho, C; Tarekegne, A; Olsen, M; Prasanna, B M
2018-01-01
Grain yield, ear and kernel attributes can assist to understand the performance of the maize plant under different environmental conditions and can be used in the variety development process to address farmers' preferences. These parameters are however still laborious and expensive to measure. A low-cost ear digital imaging method was developed that provides estimates of ear and kernel attributes, i.e., ear number and size, kernel number and size, as well as kernel weight, from photos of ears harvested from field trial plots. The image processing method uses a script that runs in batch mode on ImageJ, an open-source software package. Kernel weight was estimated using the total kernel number derived from the number of kernels visible on the image and the average kernel size. Data showed a good agreement in terms of accuracy and precision between ground truth measurements and data generated through image processing. Broad-sense heritability of the estimated parameters was in the range of, or higher than, that for measured grain weight. A limitation of the method for kernel weight estimation is discussed. The method developed in this work provides an opportunity to significantly reduce the cost of selection in the breeding process, especially for resource-constrained crop improvement programs, and can be used to learn more about the genetic bases of grain yield determinants.
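The authors' script runs in ImageJ; the snippet below is a minimal Python analogue (an illustration of the same idea, not their pipeline): threshold an ear photo, label connected regions as kernels, and scale a count-times-size product into a weight estimate. The file name, area cutoff, visibility factor, and weight coefficient are all hypothetical:

```python
import numpy as np
from skimage import io, filters, measure

img = io.imread("ear_photo.png", as_gray=True)   # hypothetical input image
mask = img > filters.threshold_otsu(img)         # separate kernels from background
labels = measure.label(mask)

MIN_AREA = 50                                    # hypothetical noise cutoff (pixels)
kernels = [r for r in measure.regionprops(labels) if r.area > MIN_AREA]

visible_count = len(kernels)
mean_size = float(np.mean([r.area for r in kernels]))

# Only the kernels facing the camera are visible, so the count is scaled up,
# then count x size is converted to weight with a calibrated coefficient.
VISIBILITY_FACTOR = 2.0                          # hypothetical
WEIGHT_PER_PIXEL_AREA = 0.0004                   # hypothetical g per pixel-area
total_kernels = visible_count * VISIBILITY_FACTOR
kernel_weight = total_kernels * mean_size * WEIGHT_PER_PIXEL_AREA
print(visible_count, round(kernel_weight, 1))
```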
Organizational management practices for achieving software process improvement
NASA Technical Reports Server (NTRS)
Kandt, Ronald Kirk
2004-01-01
The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.
Evaluating foodservice software: a suggested approach.
Fowler, K D
1986-09-01
In an era of cost containment, the computer has become a viable management tool. Its use in health care has demonstrated accelerated growth in recent years, and a literature review supports an increased trend in this direction. Foodservice, which is a major cost center, is no exception to this predicted trend. Because software has proliferated, foodservice managers and dietitians are experiencing growing concern about how to evaluate the numerous software packages from which to choose. A suggested approach to evaluating software is offered to dietitians and managers alike to lessen the confusion in software selection and to improve the post-purchase system satisfaction level. Steps of the software evaluation approach include: delineation of goals, assessment of needs, assignment of value weight factors, development of a vendor checklist, survey of vendors by means of the vendor checklist and elimination of inappropriate systems, thorough development of the request for proposal (RFP) for submission to the selected vendors, analysis of the returned RFPs in terms of system features and cost factors, and selection of the system(s) for implementation.
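The value-weight step described above is a standard weighted-scoring exercise; a minimal sketch (criteria, weights, and ratings all hypothetical) might look like:

```python
# Hypothetical criteria weights (summing to 1.0) and 1-5 vendor ratings.
weights = {"menu_planning": 0.30, "inventory": 0.25,
           "cost_reporting": 0.25, "vendor_support": 0.20}
vendors = {
    "Vendor A": {"menu_planning": 4, "inventory": 3,
                 "cost_reporting": 5, "vendor_support": 2},
    "Vendor B": {"menu_planning": 3, "inventory": 5,
                 "cost_reporting": 3, "vendor_support": 4},
}

def weighted_score(ratings, weights):
    """Composite score = sum of weight * rating over all criteria."""
    return sum(weights[c] * ratings[c] for c in weights)

for name, ratings in vendors.items():
    print(name, round(weighted_score(ratings, weights), 2))
# Vendors scoring below a chosen cutoff are eliminated before the RFP stage.
```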
Kobayashi, Masanao; Asada, Yasuki; Matsubara, Kosuke; Suzuki, Shouichi; Matsunaga, Yuta; Haba, Tomonobu; Kawaguchi, Ai; Daioku, Tomihiko; Toyama, Hiroshi; Kato, Ryoichi
2017-05-01
Adequate dose management during computed tomography is important. In the present study, the dosimetric application software ImPACT was extended with a functional calculator for the size-specific dose estimate and with scan settings for the auto exposure control (AEC) technique. This study aimed to assess the practicality and accuracy of the modified ImPACT software for dose estimation. We compared the conversion factors identified by the software with the values reported by the American Association of Physicists in Medicine Task Group 204, and we noted similar results. Moreover, doses were calculated with the AEC technique and a fixed tube current of 200 mA for the chest-pelvis region. The modified ImPACT software could estimate each organ dose based on the modulated tube current. The ability to perform beneficial modifications indicates the flexibility of the ImPACT software, which can be further modified for estimation of other doses. © The Author 2016. Published by Oxford University Press. All rights reserved.
RT-Syn: A real-time software system generator
NASA Technical Reports Server (NTRS)
Setliff, Dorothy E.
1992-01-01
This paper presents research into providing highly reusable and maintainable components by using automatic software synthesis techniques. This proposal uses domain knowledge combined with automatic software synthesis techniques to engineer large-scale mission-critical real-time software. The hypothesis centers on a software synthesis architecture that specifically incorporates application-specific (in this case real-time) knowledge. This architecture synthesizes complex system software to meet a behavioral specification and external interaction design constraints. Some examples of these external constraints are communication protocols, precisions, timing, and space limitations. The incorporation of application-specific knowledge facilitates the generation of mathematical software metrics which are used to narrow the design space, thereby making software synthesis tractable. Success has the potential to dramatically reduce mission-critical system life-cycle costs, not only by reducing development time but, more importantly, by facilitating maintenance, modifications, and extensions of complex mission-critical software systems, which currently dominate life-cycle costs.
NASA Astrophysics Data System (ADS)
Glier, Justin C.
In an effort to lower future CO2 emissions, a wide range of technologies are being developed to scrub CO2 from the flue gases of fossil fuel-based electric power and industrial plants. This thesis models one of several early-stage post-combustion CO2 capture technologies, the solid sorbent-based CO2 capture process, and presents performance and cost estimates of this system on pulverized coal power plants. The spreadsheet-based software package Microsoft Excel was used in conjunction with AspenPlus modeling results and the Integrated Environmental Control Model to develop performance and cost estimates for the solid sorbent-based CO2 capture technology. A reduced order model also was created to facilitate comparisons among multiple design scenarios. Assumptions about plant financing and utilization, as well as uncertainties in heat transfer and material design that affect heat exchanger and reactor design, were found to produce a wide range of cost estimates for solid sorbent-based systems. With uncertainties included, costs for a supercritical power plant with solid sorbent-based CO2 capture ranged from $167 to $533 per megawatt hour for a first-of-a-kind installation (with all costs in constant 2011 US dollars) based on a 90% confidence interval. The median cost was $209/MWh. Post-combustion solid sorbent-based CO2 capture technology is then evaluated in terms of the potential cost for a mature system based on historic experience as technologies are improved with sequential iterations of the currently available system. The range of costs for a supercritical power plant with solid sorbent-based CO2 capture was found to be $118 to $189 per megawatt hour, with a nominal value of $163 per megawatt hour, given the expected range of technological improvement in the capital and operating costs and efficiency of the power plant after 100 GW of cumulative worldwide experience. These results suggest that the solid sorbent-based system will not be competitive with currently available liquid amine systems in the absence of significant new improvements in solid sorbent properties and process system design to reduce the heat exchange surface area in the regenerator and cross-flow heat exchanger. Finally, the importance of these estimates for policy makers is discussed.
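Uncertainty ranges like the 90% interval above are typically produced by Monte Carlo sampling of the uncertain inputs. A minimal sketch of that pattern (with entirely hypothetical input distributions, not the thesis's model) is:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Hypothetical uncertain inputs for a $/MWh cost build-up.
capital = rng.triangular(60, 90, 180, N)    # annualized capital, $/MWh
fuel_om = rng.normal(70, 10, N)             # fuel + O&M, $/MWh
energy_penalty = rng.uniform(1.1, 1.5, N)   # multiplier from capture steam/power use

cost = (capital + fuel_om) * energy_penalty

lo, med, hi = np.percentile(cost, [5, 50, 95])
print(f"90% interval: {lo:.0f}-{hi:.0f} $/MWh, median {med:.0f} $/MWh")
```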
[Software for illustrating a cost-quality balance carried out by clinical laboratory practice].
Nishibori, Masahiro; Asayama, Hitoshi; Kimura, Satoshi; Takagi, Yasushi; Hagihara, Michio; Fujiwara, Mutsunori; Yoneyama, Akiko; Watanabe, Takashi
2010-09-01
We have no proper reference indicating the quality of clinical laboratory practice, one which would clearly illustrate that better medical tests require greater expense. The Japanese Society of Laboratory Medicine, concerned about the recent difficult medical economy, issued a committee report proposing a guideline to evaluate good laboratory practice. Following the guideline, we developed software that illustrates the cost-quality balance achieved by clinical laboratory practice. We encountered a number of controversial problems, for example, how to measure and weight each quality-related factor, how to calculate the costs of a laboratory test, and how to consider the characteristics of a clinical laboratory. Consequently, we finished only prototype software within the given period and budget. In this paper, the software implementation of the guideline and the above-mentioned problems are summarized. Aiming to stimulate these discussions, the operative software will be put on the Society's homepage for trial.
Low Cost Ways to Keep Software Current.
ERIC Educational Resources Information Center
Schultheis, Robert A.
1992-01-01
Discusses strategies for providing students with current computer software technology including acquiring previous versions of software, obtaining demonstration software, using student versions, getting examination software, buying from mail order firms, buying few copies, exploring site licenses, acquiring shareware or freeware, and applying for…
Adapting Wireless Technology to Lighting Control and Environmental Sensing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dana Teasdale; Francis Rubinstein; David S. Watson
Although advanced lighting control systems offer significant energy savings, the high cost of retrofitting buildings with advanced lighting control systems is a barrier to adoption of this energy-saving technology. Wireless technology, however, offers a solution to mounting installation costs since it requires no additional wiring to implement. To demonstrate the feasibility of such a system, a prototype wirelessly-controlled advanced lighting system was designed and built. The system includes the following components: a wirelessly-controllable analog circuit module (ACM), a wirelessly-controllable electronic dimmable ballast, a T8 3-lamp fixture, an environmental multi-sensor, a current transducer, and control software. The ACM, dimmable ballast, multi-sensor, and current transducer were all integrated with SmartMesh™ wireless mesh networking nodes, called motes, enabling wireless communication, sensor monitoring, and actuator control. Each mote-enabled device has a reliable communication path to the SmartMesh Manager, a single board computer that controls network functions and connects the wireless network to a PC running lighting control software. The ACM is capable of locally driving one or more standard 0-10 Volt electronic dimmable ballasts through relay control and a 0-10 Volt controllable output, in addition to 0-24 Volt and 0-10 Volt inputs. The mote-integrated electronic dimmable ballast is designed to drive a standard 3-lamp T8 light fixture. The environmental multi-sensor measures occupancy, light level and temperature. The current transducer is used to measure the power consumed by the fixture. Control software was developed to implement advanced lighting algorithms, including open and closed-loop daylight ramping, occupancy control, and demand response. Engineering prototypes of each component were fabricated and tested in a bench-scale system. Based on standard industry practices, a cost analysis was conducted. It is estimated that the installation cost of a wireless advanced lighting control system for a retrofit application is at least 20% lower than a comparable wired system for a typical 16,000 square-foot office building, with a payback period of less than 3 years. At 30% market saturation, a cumulative 695 billion kWh of energy could be saved through 2025, a cost savings of $52 billion.
Textbook Software versus Professional Software: Which Is Better for Instructional Purposes?
ERIC Educational Resources Information Center
Snell, Meggan; Yatsenko, Olga
2002-01-01
Compares textbook software with professional packages such as Peachtree for teaching accounting, in terms of cost, availability, ease of teaching and learning, and applicability. Makes suggestions for choosing accounting software. (SK)
Austin, Peter C
2010-04-22
Multilevel logistic regression models are increasingly being used to analyze clustered data in medical, public health, epidemiological, and educational research. Procedures for estimating the parameters of such models are available in many statistical software packages. There is currently little evidence on the minimum number of clusters necessary to reliably fit multilevel regression models. We conducted a Monte Carlo study to compare the performance of different statistical software procedures for estimating multilevel logistic regression models when the number of clusters was low. We examined procedures available in BUGS, HLM, R, SAS, and Stata. We found that there were qualitative differences in the performance of different software procedures for estimating multilevel logistic models when the number of clusters was low. Among the likelihood-based procedures, estimation methods based on adaptive Gauss-Hermite approximations to the likelihood (glmer in R and xtlogit in Stata) or adaptive Gaussian quadrature (Proc NLMIXED in SAS) tended to have superior performance for estimating variance components when the number of clusters was small, compared to software procedures based on penalized quasi-likelihood. However, only Bayesian estimation with BUGS allowed for accurate estimation of variance components when there were fewer than 10 clusters. For all statistical software procedures, estimation of variance components tended to be poor when there were only five subjects per cluster, regardless of the number of clusters.
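The Monte Carlo design behind such comparisons starts from a random-intercept logistic data-generating process; a minimal sketch of that generator (cluster counts, sizes, and parameters chosen arbitrarily here) is:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_clustered_binary(n_clusters=5, n_per_cluster=5,
                              beta0=-1.0, sigma_u=1.0):
    """Random-intercept logistic model: logit(p_ij) = beta0 + u_j, u_j ~ N(0, sigma_u^2)."""
    u = rng.normal(0.0, sigma_u, n_clusters)        # cluster random effects
    cluster = np.repeat(np.arange(n_clusters), n_per_cluster)
    p = 1.0 / (1.0 + np.exp(-(beta0 + u[cluster]))) # inverse logit
    y = rng.binomial(1, p)
    return cluster, y

cluster, y = simulate_clustered_binary()
print(np.bincount(cluster, weights=y))  # event counts per cluster
```

Fitting such data many times with each candidate estimator, then comparing bias in the recovered sigma_u, is the essence of the study's comparison across software procedures.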
NASA Astrophysics Data System (ADS)
Cork, Chris; Lugg, Robert; Chacko, Manoj; Levi, Shimon
2005-06-01
With the exponential increase in output database size due to the aggressive optical proximity correction (OPC) and resolution enhancement techniques (RET) required for deep sub-wavelength process nodes, the CPU time required for mask tape-out continues to increase significantly. For integrated device manufacturers (IDMs), this can impact the time-to-market for their products, where even a few days' delay could have a huge commercial impact and loss of market window opportunity. For foundries, a shorter turnaround time provides a competitive advantage in their demanding market: being too slow could mean customers look elsewhere for these services, while a fast turnaround may even command a higher price. With fab turnaround for a mature, plain-vanilla CMOS process at around 20-30 days, a delay of several days in mask tape-out would contribute a significant fraction to the total time to deliver prototypes. Unlike silicon processing, mask tape-out time can be decreased by simply purchasing extra computing resources and software licenses. Mask tape-out groups are taking advantage of the ever-decreasing hardware cost and increasing power of commodity processors. The significant distributability inherent in some commercial mask synthesis software can be leveraged to address this critical business issue. Different implementations have different fractions of code that cannot be parallelized, and this affects the efficiency with which they scale, as described by Amdahl's law. Very few are efficient enough to allow the effective use of 1000s of processors, enabling run times to drop from days to only minutes. What follows is a cost-aware methodology to quantify the scalability of this class of software, and thus act as a guide to estimating the optimal investment in terms of hardware and software licenses.
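Amdahl's law makes the scalability limit concrete: if a fraction p of the run parallelizes, n processors give a speedup of 1/((1-p) + p/n). A small sketch (the parallel fractions are hypothetical) of how that bounds tape-out turnaround:

```python
def amdahl_speedup(n_procs, parallel_fraction):
    """Speedup on n processors when parallel_fraction of the work parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_procs)

# Even 1% serial code caps the benefit of very large clusters:
for p in (0.90, 0.99, 0.999):       # hypothetical parallel fractions
    for n in (10, 100, 1000):
        print(f"p={p}, n={n}: speedup {amdahl_speedup(n, p):.1f}x")
# At p=0.99 the speedup tops out near 100x no matter how many processors
# (and licenses) are bought, which is why cost per unit of speedup rises
# steeply with n and motivates a cost-aware scalability analysis.
```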
Kinematic parameter estimation using close range photogrammetry for sport applications
NASA Astrophysics Data System (ADS)
Magre Colorado, Luz Alejandra; Martínez Santos, Juan Carlos
2015-12-01
In this article, we show the development of a low-cost hardware/software system based on close range photogrammetry to track the movement of a person performing weightlifting. The goal is to reduce costs for the trainers and athletes dedicated to this sport when it comes to analyzing the performance of the sportsman and avoiding injuries or accidents. We used a webcam as the data acquisition hardware and developed the software stack in Processing using the OpenCV library. Our algorithm extracts size, position, velocity, and acceleration measurements of the bar along the course of the exercise. We present detailed characteristics of the system with their results in a controlled setting. The current work improves the detection and tracking capabilities of a previous version of this system by using the HSV color model instead of RGB. Preliminary results show that the system is able to profile the movement of the bar as well as determine the size, position, velocity, and acceleration values of a marker/target in the scene. The average error in finding the size of an object at a distance of four meters is less than 4%, and the error in the acceleration value is 1.01% on average.
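The HSV-based tracking step can be illustrated with a few OpenCV calls (a generic Python sketch, not the authors' Processing code; the color bounds are hypothetical and would be tuned to the marker):

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                      # webcam as acquisition hardware
lower = np.array([100, 120, 70])               # hypothetical HSV bounds for the marker
upper = np.array([130, 255, 255])

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)      # isolate marker pixels
    m = cv2.moments(mask)
    if m["m00"] > 0:                           # centroid of marker -> bar position
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        print(cx, cy)                          # differentiate over time for velocity
    if cv2.waitKey(1) == 27:                   # Esc to quit
        break
cap.release()
```

Thresholding in HSV rather than RGB separates hue from brightness, which is why it tolerates the lighting changes that defeat RGB thresholds.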
Vulnerability in Determining the Cost of Information System Project to Avoid Loses
NASA Astrophysics Data System (ADS)
Haryono, Kholid; Ikhsani, Zulfa Amalia
2018-03-01
Context: This study discusses the priority of costs required in software development projects. Objectives: To show the costing models, the variables involved, and how practitioners assess and decide the priorities of each variable. To strengthen the information, each variable is also confirmed for the risk incurred if it is ignored. Method: Two approaches were used. First, a systematic literature review to find the models and variables used to decide the cost of software development. Second, confirmation and judgment by software developers of the level of importance and risk of each variable. Result: About 54 variables were obtained from the 10 models discussed. The variables were categorized into 15 groups based on similarity of meaning, and each group became a variable. Confirmation with practitioners on the level of importance and risk showed that two variables are considered very important and high risk if ignored: duration and effort. Conclusion: Relating the variables found in the literature studies to practitioner confirmation helps software business actors consider project cost variables.
Fthenakis, Vasilis; Atia, Adam A.; Morin, Olivier; ...
2015-01-28
Increased water demand and increased drought episodes in the Middle East and other regions necessitate an expansion in desalination projects and create a great market opportunity for photovoltaics (PV). PV-powered desalination has previously been regarded as not being a cost-competitive solution when compared with conventionally powered desalination; however, the decline in PV costs over the last few years has changed this outlook. This article presents up-to-date performance and cost analysis of reverse osmosis (RO) desalination powered with PV connected to the Saudi Arabian grid. Reference cases include a relatively small (i.e., producing 6550 m3 of water per day) and a large (i.e., 190,000 m3/day) desalination plant using seawater at a salinity of 40,000 ppm. We used data from a King Abdullah University of Science and Technology presentation and Hybrid Optimization Model for Electric Renewables 2.81 Energy Modeling Software (HOMER Energy LLC) in tandem with Desalination Economic Evaluation Program 4.0 (International Atomic Energy Agency) desalination software to analyze the techno-economic feasibility of these plants. The first phase of our work entailed a comparison between dual-axis high concentration PV (CPV) equipped with triple-junction III/V solar cells and CdTe PV-powered RO systems. The estimated levelized cost of electricity from CPV is $0.16/kWh, whereas that from CdTe PV is $0.10/kWh and $0.09/kWh for fixed-tilt and one-axis tracking systems, respectively. These costs are higher than the price of diesel-based grid electricity in the region because diesel fuel is heavily subsidized in Saudi Arabia.
Jha, Ashish Kumar
2015-01-01
Glomerular filtration rate (GFR) estimation by the plasma sampling method is considered the gold standard. However, this method is not widely used because of its complex technique and cumbersome calculations, coupled with the lack of availability of user-friendly software. The routinely used serum creatinine method (SrCrM) of GFR estimation also requires the use of online calculators, which cannot be used without internet access. We have developed user-friendly software, "GFR estimation software", which gives the options to estimate GFR by the plasma sampling method as well as SrCrM. We used Microsoft Windows® as the operating system, Visual Basic 6.0 as the front end, and Microsoft Access® as the database tool to develop this software. We used Russell's formula for GFR calculation by the plasma sampling method. GFR calculations using serum creatinine have been done using the MDRD, Cockcroft-Gault, Schwartz, and Counahan-Barratt methods. The developed software performs the mathematical calculations correctly and is user-friendly. This software also enables storage and easy retrieval of the raw data, the patient's information, and the calculated GFR for further processing and comparison. This is user-friendly software to calculate the GFR by various plasma sampling methods and blood parameters. This software is also a good system for storing the raw and processed data for future analysis.
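Of the serum-creatinine formulas named above, Cockcroft-Gault is the simplest to show; a minimal sketch of that standard formula (not the authors' software) is:

```python
def cockcroft_gault_crcl(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Estimated creatinine clearance (mL/min) by the Cockcroft-Gault formula."""
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

# Example: 60-year-old, 70 kg male with serum creatinine 1.0 mg/dL -> ~77.8 mL/min
print(round(cockcroft_gault_crcl(60, 70, 1.0, female=False), 1))
```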
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Patrick Gonzalez; Sandra Brown
2005-10-01
The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st, 2005 and June 30th, 2005. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.
Pointo - a Low Cost Solution to Point Cloud Processing
NASA Astrophysics Data System (ADS)
Houshiar, H.; Winkler, S.
2017-11-01
With advances in technology, access to data, especially 3D point cloud data, is becoming an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners, or with very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field, and usually comes as a large package containing a variety of methods and tools. The result is software that is expensive to acquire and difficult to use. The difficulty of use is caused by the complicated user interfaces required to accommodate a large list of features. The aim of these complex packages is to provide a powerful tool for a specific group of specialists, but they are not required by the majority of upcoming average users of point clouds. In addition to their complexity and high cost, these packages generally rely on expensive, modern hardware and are often compatible with only one operating system. Many point cloud customers are not point cloud processing experts and are not willing to bear the high acquisition costs of such software and hardware. In this paper we introduce a solution for low-cost point cloud processing. Our approach is designed to accommodate the needs of the average point cloud user. To reduce the cost and complexity of the software, our approach focuses on one functionality at a time, in contrast with most available software and tools that aim to solve as many problems as possible at the same time. This simple, user-oriented design improves the user experience and allows us to optimize our methods for building efficient software. We introduce the Pointo family as a series of connected programs that provide easy-to-use tools with a simple design for different point cloud processing requirements. PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family, providing fast and efficient visualization with the ability to add annotation and documentation to point clouds.
Adane, Kasaw; Abiy, Zenegnaw; Desta, Kassu
2015-01-01
The rapid and continuous growth of health care costs aggravates the low priority and limited attention frequently given to financing laboratory services. The poorest countries have the highest out-of-pocket spending as a percentage of income. Higher charges might provide a greater potential for revenue, and if fees raise quality sufficiently, they can enhance usage. Therefore, estimating the revenue generated from laboratory services could help in capacity building and improved quality service provision. A panel study design was used to determine revenue generated from clinical chemistry and hematology services at Tikur Anbessa Specialized Teaching Hospital, Addis Ababa, Ethiopia. An Activity-Based Costing (ABC) model was used to determine the true cost of tests performed from October 2011 to December 2011 in the hospital. The principle of Activity-Based Costing is that activities consume resources and services consume activities, which incur the costs; hence a service takes on the cost of the resources it consumes. All resources with costs were aggregated using the established causal relationships. The process maps designed were restructured in consultation with the senior staff working in and/or supervising the laboratory, and pretested checklists were used for observation. Moreover, office documents, receipts, and service bills were used while collecting data. The amount of revenue collected from services was compared with the cost of each test, and the profitability or return on investment (ROI) of services was calculated. Data were collected, entered, cleaned, and analyzed using Microsoft Excel 2007 and the Statistical Package for the Social Sciences version 19 (SPSS). A paired-sample t test was used to compare the price and cost of each test. P-values less than 0.05 were considered statistically significant. A total of 25,654 specimens were analyzed during 3 months of regular working hours. The total numbers of clinical chemistry and hematology tests performed during the study period were 45,959 (66.1%) and 23,570 (33.9%), respectively. Only 274,386 (25.3%) Ethiopian Birr (ETB) was recovered from the total cost of 1,086,008.09 ETB incurred on clinical chemistry and hematology laboratory tests. About 133,821 ETB (12.32%) of revenue was not collected from out-of-pocket payments for the services as a result of underpricing. The results showed that 18 out of 20 laboratory tests were underpriced. The cost burden related to free antiretroviral therapy (ART) services was 285,979.82 (26.3%) ETB. The estimated cost per test was significantly different from the existing price, and about 90% of the tests were underpriced. This information should prompt the hospital to reconsider the prices of tests with a profitability ratio of less than 1. The revenue collected could help to build capacity, upscale quality, and sustain service delivery.
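The Activity-Based Costing logic described above (activities consume resources; services consume activities) reduces to a straightforward roll-up; a minimal sketch with hypothetical activities and prices:

```python
# Hypothetical per-test activity costs (ETB) traced to one laboratory test.
activities = {"specimen collection": 4.0, "reagents": 18.5,
              "analyzer time": 9.0, "staff time": 12.5, "overhead share": 6.0}

def abc_cost(activities):
    """Total test cost = sum of the resources consumed by its activities."""
    return sum(activities.values())

cost = abc_cost(activities)
price = 25.0            # hypothetical posted price (ETB)
ratio = price / cost    # profitability ratio; < 1 means the test is underpriced
print(f"cost {cost:.1f} ETB, price {price:.1f} ETB, ratio {ratio:.2f}")
```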
2014-01-01
Objective To identify associations between market factors, especially relative reimbursement rates, and the probability of surgery and cost per episode for three medical conditions (cataract, benign prostatic neoplasm, and knee degeneration) with multiple treatment options. Methods We use 2004–2006 Medicare claims data for elderly beneficiaries from sixty nationally representative communities to estimate multivariate models for the probability of surgery and cost per episode of care as a function of local market factors, including Medicare physician reimbursement for surgical versus non-surgical treatment and the availability of primary care and specialty physicians. We used Symmetry’s Episode Treatment Groups (ETG) software to group claims into episodes for the three conditions (n = 540,874 episodes). Results Higher Medicare reimbursement for surgical episodes and greater availability of the relevant specialists are significantly associated with more surgery and higher cost per episode for all three conditions, while greater availability of primary care physicians is significantly associated with less frequent surgery and lower cost per episode. Conclusion Relative Medicare reimbursement rates for surgical vs. non-surgical treatments and the availability of both primary care physicians and relevant specialists are associated with the likelihood of surgery and cost per episode. PMID:24949281
Karanth, Siddharth S; Lairson, David R; Savas, Lara S; Vernon, Sally W; Fernández, María E
2017-08-01
Mobile technology is opening new avenues for healthcare providers to create and implement tailored and personalized health education programs. We estimate and compare the cost of developing an iPad-based tailored interactive multimedia intervention (TIMI) and a print-based (Photonovella) intervention to increase human papillomavirus (HPV) immunization. The development costs of the interventions were calculated using a societal perspective. Direct cost included the cost of planning the study, conducting focus groups, and developing the intervention materials by the research staff. Costs also included the amount paid to the vendors who produced the TIMI and Photonovella. Micro-cost data on staff time and materials were recorded in logs for tracking personnel time, meeting time, supplies, and software purchases. The costs were adjusted for inflation and reported in 2015 USD. The total cost of developing the Photonovella was $66,468 and the cost of developing the TIMI was $135,978. The amortized annual cost for the interventions, calculated at a 3% discount rate over a 7-year period, was $10,669 per year for the Photonovella and $21,825 per year for the TIMI intervention. The results would inform decision makers when planning and investing in the development of interactive multimedia health interventions. Copyright © 2017 Elsevier Ltd. All rights reserved.
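The annualized figures above follow from the standard capital-recovery (amortization) formula, A = C·r/(1 − (1 + r)^−n); a quick sketch reproduces the reported values:

```python
def annualized_cost(total_cost, rate=0.03, years=7):
    """Capital recovery: spread a one-time cost over `years` at discount `rate`."""
    return total_cost * rate / (1 - (1 + rate) ** -years)

print(round(annualized_cost(66_468)))    # Photonovella -> ~10,669 per year
print(round(annualized_cost(135_978)))   # TIMI         -> ~21,825 per year
```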
Inclusion of LCCA in Alaska flexible pavement design software manual.
DOT National Transportation Integrated Search
2012-10-01
Life cycle cost analysis is a key part for selecting materials and techniques that optimize the service life of a pavement in terms of cost and performance. While the Alaska : Flexible Pavement Design software has been in use since 2004, there is no ...
Integration and Assessment of Component Health Prognostics in Supervisory Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramuhalli, Pradeep; Bonebrake, Christopher A.; Dib, Gerges
Enhanced risk monitors (ERMs) for active components in advanced reactor concepts use predictive estimates of component failure to update, in real time, predictive safety and economic risk metrics. These metrics have been shown to be capable of use in optimizing maintenance scheduling and managing plant maintenance costs. Integrating this information with plant supervisory control systems increases the potential for making control decisions that utilize real-time information on component conditions. Such decision making would limit the possibility of plant operations that increase the likelihood of degrading the functionality of one or more components while maintaining the overall functionality of the plant. ERM uses sensor data for providing real-time information about equipment condition for deriving risk monitors. This information is used to estimate the remaining useful life and probability of failure of these components. By combining this information with plant probabilistic risk assessment models, predictive estimates of risk posed by continued plant operation in the presence of detected degradation may be estimated. In this paper, we describe this methodology in greater detail, and discuss its integration with a prototypic software-based plant supervisory control platform. In order to integrate these two technologies and evaluate the integrated system, software to simulate the sensor data was developed, prognostic models for feedwater valves were developed, and several use cases were defined. The full paper will describe these use cases and the results of the initial evaluation.
The Ruggedized STD Bus Microcomputer - A low cost computer suitable for Space Shuttle experiments
NASA Technical Reports Server (NTRS)
Budney, T. J.; Stone, R. W.
1982-01-01
Previous space flight computers have been costly in terms of both hardware and software. The Ruggedized STD Bus Microcomputer is based on the commercial Mostek/Pro-Log STD Bus. Ruggedized PC cards can be based on commercial cards from more than 60 manufacturers, reducing hardware cost and design time. Software costs are minimized by using standard 8-bit microprocessors and by debugging code using commercial versions of the ruggedized flight boards while the flight hardware is being fabricated.
Spacelab cost reduction alternatives study. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
1976-01-01
Alternative approaches to payload operations planning and control and flight crew training are defined for Spacelab payloads with the goals of: lowering FY77 and FY78 costs for new starts; lowering costs to achieve Spacelab operational capability; and minimizing the cost per Spacelab flight. These alternatives attempt to minimize duplication of hardware, software, and personnel, and the investment in supporting facilities and equipment. Of particular importance is the possible reduction of equipment, software, and manpower resources such as computational systems, trainers, and simulators.
CellProfiler and KNIME: open source tools for high content screening.
Stöter, Martin; Niederlein, Antje; Barsacchi, Rico; Meyenhofer, Felix; Brandl, Holger; Bickle, Marc
2013-01-01
High content screening (HCS) has established itself in the world of the pharmaceutical industry as an essential tool for drug discovery and drug development. HCS is currently starting to enter the academic world and might become a widely used technology. Given the diversity of problems tackled in academic research, HCS could experience some profound changes in the future, mainly with more imaging modalities and smart microscopes being developed. Two of the limitations on the establishment of HCS in academia are flexibility and cost. Flexibility is important to be able to adapt the HCS setup to accommodate the multiple different assays typical of academia. Many cost factors cannot be avoided, but the costs of the software packages necessary to analyze large datasets can be reduced by using Open Source software. We present and discuss the Open Source software CellProfiler for image analysis and KNIME for data analysis and data mining, which provide software solutions that increase flexibility and keep costs low.
NASA Technical Reports Server (NTRS)
2003-01-01
Topics covered include: Nulling Infrared Radiometer for Measuring Temperature; The Ames Power Monitoring System; Hot Films on Ceramic Substrates for Measuring Skin Friction; Probe Without Moving Parts Measures Flow Angle; Detecting Conductive Liquid Leaking from Nonconductive Pipe; Adaptive Suppression of Noise in Voice Communications; High-Performance Solid-State W-Band Power Amplifiers; Microbatteries for Combinatorial Studies of Conventional Lithium-Ion Batteries; Correcting for Beam Aberrations in a Beam-Waveguide Antenna; Advanced Rainbow Solar Photovoltaic Arrays; Metal Side Reflectors for Trapping Light in QWIPs; Software for Collaborative Engineering of Launch Rockets; Software Assists in Extensive Environmental Auditing; Software Supports Distributed Operations via the Internet; Software Estimates Costs of Testing Rocket Engines; yourSky: Custom Sky-Image Mosaics via the Internet; Software for Managing Inventory of Flight Hardware; Lower-Conductivity Thermal-Barrier Coatings; Process for Smoothing an Si Substrate after Etching of SiO2; Flexible Composite-Material Pressure Vessel; Treatment to Destroy Chlorohydrocarbon Liquids in the Ground; Noncircular Cross Sections Could Enhance Mixing in Sprays; Small, Untethered, Mobile Robots for Inspecting Gas Pipes; Paint-Overspray Catcher; Preparation of Regular Specimens for Atom Probes; Inverse Tomo-Lithography for Making Microscopic 3D Parts; Predicting and Preventing Incipient Flameout in Combustors; MEMS-Based Piezoelectric/Electrostatic Inchworm Actuator; Metallized Capillaries as Probes for Raman Spectroscopy; Adaptation of Mesoscale Weather Models to Local Forecasting; Aerodynamic Design using Neural Networks; Combining Multiple Gyroscope Outputs for Increased Accuracy; and Improved Collision-Detection Method for Robotic Manipulator.
Software Piracy, Ethics, and the Academician.
ERIC Educational Resources Information Center
Bassler, Richard A.
The numerous software programs available for easy, low-cost copying raise ethical questions. The problem can be examined from the viewpoints of software users, teachers, authors, vendors, and distributors. Software users might hesitate to purchase or use software which prevents the making of back-up copies for program protection. Teachers in…
ERIC Educational Resources Information Center
Clarke, Peter J.; Davis, Debra; King, Tariq M.; Pava, Jairo; Jones, Edward L.
2014-01-01
As software becomes more ubiquitous and complex, the cost of software bugs continues to grow at a staggering rate. To remedy this situation, there needs to be major improvement in the knowledge and application of software validation techniques. Although there are several software validation techniques, software testing continues to be one of the…
Software cost/resource modeling: Software quality tradeoff measurement
NASA Technical Reports Server (NTRS)
Lawler, R. W.
1980-01-01
A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.
Modular Rocket Engine Control Software (MRECS)
NASA Technical Reports Server (NTRS)
Tarrant, Charlie; Crook, Jerry
1997-01-01
The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state of the art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for a generic, advanced engine control system that will result in lower software maintenance (operations) costs. It effectively accommodates software requirements changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives and benefits of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, software reuse, and reduced software reverification time related to software changes. Currently, the program is focused on supporting MSFC in accomplishing a Space Shuttle Main Engine (SSME) hot-fire test at Stennis Space Center and the Low Cost Boost Technology (LCBT) Program.
NASA Technical Reports Server (NTRS)
1976-01-01
The work breakdown structure (WBS) for the Payload Specialist Station (PSS) is presented. The WBS is divided into two elements--PSS contractor and mission unique requirements. In accordance with the study ground rules, it is assumed that a single contractor, hereafter referred to as PSS Contractor will perform the following: (1) provide C and D hardware (MFDS and elements of MMSE), except for GFE; (2) identify software requirements; (3) provide GSE and ground test software; and (4) perform systems engineering and integration in support of the Aft Flight Deck (AFD) C and D concept. The PSS Contractor WBS element encompasses a core or standardized PSS concept. Payload peculiar C and D requirements identified by users will originate as a part of the WBS element mission unique requirements; these requirements will be provided to the PSS Contractor for implementation.
NASA Technical Reports Server (NTRS)
Mcbride, John G.
1990-01-01
The mission of the AdaNET research effort is to determine how to increase the availability of reusable Ada components and associated software engineering technology to both private and Federal sectors. The effort is structured to define the requirements for transfer of Federally developed software technology, study feasible approaches to meeting the requirements, and to gain experience in applying various technologies and practices. The overall approach to the development of the AdaNET System Specification is presented. A work breakdown structure is presented with each research activity described in detail. The deliverables for each work area are summarized. The overall organization and responsibilities for each research area are described. The schedule and necessary resources are presented for each research activity. The estimated cost is summarized for each activity. The project plan is fully described in the Super Project Expert data file contained on the floppy disk attached to the back cover of this plan.
Overview of technology modeling in the Remedial Action Assessment System (RAAS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, C.D.; Bagaasen, L.M.; Chan, T.C.
1994-08-01
There are numerous hazardous waste sites under the jurisdiction of the US Department of Energy (DOE). To assist the cleanup of these sites in a more consistent, timely, and cost-effective manner, the Remedial Action Assessment System (RAAS) is being developed by the Pacific Northwest Laboratory (PNL). RAAS is a software tool designed to automate the initial technology selection within the remedial investigation/feasibility study (RI/FS) process. The software does several things for the user: (1) provides information about available remedial technologies, (2) sorts possible technologies to recommend a list of technologies applicable to a given site, (3) points out technical issues that may prevent the implementation of a technology, and (4) provides an estimate of the effectiveness of a given technology at a particular site. Information from RAAS can be used to compare remediation options and guide selection of technologies for further study.
U.S. ENVIRONMENTAL PROTECTION AGENCY'S LANDFILL GAS EMISSION MODEL (LANDGEM)
The paper discusses EPA's available software for estimating landfill gas emissions. This software is based on a first-order decomposition rate equation using empirical data from U.S. landfills. The software provides a relatively simple approach to estimating landfill gas emissi...
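The first-order decomposition model underlying such tools treats each year's deposited waste as decaying exponentially, so annual gas generation is a sum over past deposits. The sketch below is a generic first-order model with hypothetical parameter values, not LANDGEM itself:

```python
import math

def annual_gas_generation(waste_by_year, t_eval, k=0.05, L0=100.0):
    """Generic first-order decay: each mass M deposited in year t_dep contributes
    k * L0 * M * exp(-k * (t_eval - t_dep)) to generation in year t_eval.

    k (1/yr) and L0 (m^3 of gas per Mg of waste) are hypothetical placeholders;
    a real model would use values fitted to empirical landfill data.
    """
    return sum(k * L0 * m * math.exp(-k * (t_eval - t_dep))
               for t_dep, m in waste_by_year.items() if t_dep <= t_eval)

# Hypothetical disposal history: Mg of waste landfilled per year.
history = {2000: 50_000, 2001: 52_000, 2002: 55_000}
print(round(annual_gas_generation(history, t_eval=2010)))  # m^3 in year 2010
```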
Projecting manpower to attain quality
NASA Technical Reports Server (NTRS)
Rone, K. Y.
1983-01-01
In these days of soaring software costs it becomes increasingly important to properly manage a software development project. One element of the management task is the projection and tracking of the manpower required to perform the task. In addition, since the total cost of the task is directly related to the initial quality built into the software, it becomes a necessity to project the development manpower in a way that attains that quality. An approach to projecting and tracking manpower with quality in mind is described. The resulting model is useful as a projection tool but must be validated in order to be used as an on-going software cost engineering tool. A procedure is developed to facilitate the tracking of model projections and actual data to allow the model to be tuned. Finally, since the model must be used in an environment of overlapping development activities on a progression of software elements in development and maintenance, a manpower allocation model is developed for use in a steady-state development/maintenance environment.
The Toxicity Estimation Software Tool (T.E.S.T.)
The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to estimate toxicological values for aquatic and mammalian species considering acute and chronic endpoints for screening purposes within TSCA and REACH programs.
Fault Tolerant Considerations and Methods for Guidance and Control Systems
1987-07-01
multifunction devices such as microprocessors with software. In striving toward the economic goal, however, a cost is incurred in a different coin, i.e. ... therefore been developed which reduces the software risk to acceptable proportions. Several of the techniques thus developed incur no significant cost ... complex that their design and implementation need computerized tools in order to be cost-effective (in a broad sense, including the capability of
Adams, Vanessa M.; Pressey, Robert L.; Stoeckl, Natalie
2014-01-01
The need to integrate social and economic factors into conservation planning has become a focus of academic discussions and has important practical implications for the implementation of conservation areas, both private and public. We conducted a survey in the Daly Catchment, Northern Territory, to inform the design and implementation of a stewardship payment program. We used a choice model to estimate the likely level of participation in two legal arrangements - conservation covenants and management agreements - based on payment level and proportion of properties required to be managed. We then spatially predicted landholders’ probability of participating at the resolution of individual properties and incorporated these predictions into conservation planning software to examine the potential for the stewardship program to meet conservation objectives. We found that the properties that were least costly, per unit area, to manage were also the least likely to participate. This highlights a tension between planning for a cost-effective program and planning for a program that targets properties with the highest probability of participation. PMID:24892520
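The abstract does not report the fitted coefficients, so the sketch below only illustrates the general shape of a logistic participation model of the kind used in such choice studies; all coefficients and units are invented.

```python
# Hedged sketch of a logistic participation model of the general kind used in
# choice experiments; coefficients here are invented for illustration only.
import math

def participation_probability(payment, managed_fraction,
                              b0=-2.0, b_pay=0.015, b_area=-1.5):
    """Probability a landholder enrolls, given a payment level ($/ha/yr) and
    the fraction of the property required to be managed."""
    utility = b0 + b_pay * payment + b_area * managed_fraction
    return 1.0 / (1.0 + math.exp(-utility))

# Higher payments raise, larger managed fractions lower, predicted uptake.
print(participation_probability(payment=100, managed_fraction=0.2))
print(participation_probability(payment=250, managed_fraction=0.8))
```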
Gastroesophageal reflux disease burden in Iran.
Delavari, Alireza; Moradi, Ghobad; Elahi, Elham; Moradi-Lakeh, Maziar
2015-02-01
Gastroesophageal reflux disease is one of the most common disorders of the gastrointestinal tract. The prevalence of this disease ranges from 5% to 20% in Asia, Europe, and North America. The aim of this study was to estimate the burden of gastroesophageal reflux disease in Iran. The burden of gastroesophageal reflux disease in Iran was estimated for one year, from 21 March 2006 to 20 March 2007. The definition was adjusted to ICD code K21. The incident-based disability-adjusted life year (DALY) was used as the unit of analysis to quantify disease burden. A simplified disease model and DisMod II software were used for modeling. The annual incidence for the total population of males and females in Iran was estimated at 17.72 and 28.06 per 1000, respectively. The average duration of gastroesophageal reflux disease as a chronic condition was estimated at around 10 years in both sexes. Total DALYs for an average of 59 symptomatic days per year were estimated at 153,554.3 (60,330.8 for males and 93,223.5 for females). The results of this study showed that reflux imposes a high burden and high financial costs on the Iranian population. The burden of this disease in Iran is more similar to that of European countries than to that of Asian countries. It is recommended to consider the disease a public health problem and to make decisions and public health plans to reduce the burden and financial costs of the disease in Iran.
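A minimal sketch of the incident-based DALY arithmetic (years of life lost plus years lived with disability) may help; the disability weight and case numbers below are assumptions for illustration, not the study's inputs.

```python
# Minimal incident-based DALY sketch using the standard YLL + YLD definitions;
# the disability weight and inputs are assumptions, not the study's values.
def dalys(incident_cases, deaths, years_lost_per_death,
          disability_weight, years_with_condition):
    yll = deaths * years_lost_per_death                      # fatal burden
    yld = incident_cases * disability_weight * years_with_condition
    return yll + yld

# Illustrative only: a largely nonfatal chronic condition lasting ~10 years
# with ~59 symptomatic days per year and a small disability weight.
print(dalys(incident_cases=1_500_000, deaths=0, years_lost_per_death=0,
            disability_weight=0.02, years_with_condition=10 * 59 / 365))
```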
Code of Federal Regulations, 2012 CFR
2012-07-01
... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...
Code of Federal Regulations, 2010 CFR
2010-07-01
... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...
Code of Federal Regulations, 2011 CFR
2011-07-01
... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...
Code of Federal Regulations, 2013 CFR
2013-07-01
... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...
Code of Federal Regulations, 2014 CFR
2014-07-01
... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...
Payload software technology: Software technology development plan
NASA Technical Reports Server (NTRS)
1977-01-01
Programmatic requirements for the advancement of software technology are identified for meeting the space flight requirements in the 1980 to 1990 time period. The development items are described, and software technology item derivation worksheets are presented along with the cost/time/priority assessments.
ToxPredictor: a Toxicity Estimation Software Tool
The Computational Toxicology Team within the National Risk Management Research Laboratory has developed a software tool that will allow the user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be ac...
Quality Market: Design and Field Study of Prediction Market for Software Quality Control
ERIC Educational Resources Information Center
Krishnamurthy, Janaki
2010-01-01
Given the increasing competition in the software industry and the critical consequences of software errors, it has become important for companies to achieve high levels of software quality. While cost reduction and timeliness of projects continue to be important measures, software companies are placing increasing attention on identifying the user…
Toxicity Estimation Software Tool (TEST)
The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...
Software Reuse Within the Earth Science Community
NASA Technical Reports Server (NTRS)
Marshall, James J.; Olding, Steve; Wolfe, Robert E.; Delnore, Victor E.
2006-01-01
Scientific missions in the Earth sciences frequently require cost-effective, highly reliable, and easy-to-use software, which can be a challenge for software developers to provide. The NASA Earth Science Enterprise (ESE) spends a significant amount of resources developing software components and other software development artifacts that may also be of value if reused in other projects requiring similar functionality. In general, software reuse is often defined as utilizing existing software artifacts. Software reuse can improve productivity and quality while decreasing the cost of software development, as documented by case studies in the literature. Since large software systems are often the results of the integration of many smaller and sometimes reusable components, ensuring reusability of such software components becomes a necessity. Indeed, designing software components with reusability as a requirement can increase the software reuse potential within a community such as the NASA ESE community. The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group is chartered to oversee the development of a process that will maximize the reuse potential of existing software components while recommending strategies for maximizing the reusability potential of yet-to-be-designed components. As part of this work, two surveys of the Earth science community were conducted. The first was performed in 2004 and distributed among government employees and contractors. A follow-up survey was performed in 2005 and distributed among a wider community, to include members of industry and academia. The surveys were designed to collect information on subjects such as the current software reuse practices of Earth science software developers, why they choose to reuse software, and what perceived barriers prevent them from reusing software. In this paper, we compare the results of these surveys, summarize the observed trends, and discuss the findings. The results are very similar, with the second, larger survey confirming the basic results of the first, smaller survey. The results suggest that reuse of ESE software can drive down the cost and time of system development, increase flexibility and responsiveness of these systems to new technologies and requirements, and increase effective and accountable community participation.
Test Driven Development of a Parameterized Ice Sheet Component
NASA Astrophysics Data System (ADS)
Clune, T.
2011-12-01
Test driven development (TDD) is a software development methodology that offers many advantages over traditional approaches including reduced development and maintenance costs, improved reliability, and superior design quality. Although TDD is widely accepted in many software communities, the suitability to scientific software is largely undemonstrated and warrants a degree of skepticism. Indeed, numerical algorithms pose several challenges to unit testing in general, and TDD in particular. Among these challenges are the need to have simple, non-redundant closed-form expressions to compare against the results obtained from the implementation as well as realistic error estimates. The necessity for serial and parallel performance raises additional concerns for many scientific applications. In previous work I demonstrated that TDD performed well for the development of a relatively simple numerical model that simulates the growth of snowflakes, but the results were anecdotal and of limited relevance to far more complex software components typical of climate models. This investigation has now been extended by successfully applying TDD to the implementation of a substantial portion of a new parameterized ice sheet component within a full climate model. After a brief introduction to TDD, I will present techniques that address some of the obstacles encountered with numerical algorithms. I will conclude with some quantitative and qualitative comparisons against climate components developed in a more traditional manner.
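A minimal example of the closed-form-oracle style of unit test described above, written in Python for brevity rather than in the Fortran tooling climate models actually use; the routine under test and the tolerance are illustrative.

```python
# Minimal closed-form-oracle unit test in the TDD style described above.
# Python stands in for the Fortran frameworks actually used; the routine
# under test and the tolerance are illustrative assumptions.
import math
import unittest

def trapezoid(f, a, b, n):
    """Composite trapezoid rule: the 'implementation' under test."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

class TestTrapezoid(unittest.TestCase):
    def test_against_closed_form(self):
        # Oracle: the integral of sin over [0, pi] is exactly 2.
        approx = trapezoid(math.sin, 0.0, math.pi, 1000)
        # Realistic error estimate: O(h^2) truncation error, not machine eps.
        self.assertAlmostEqual(approx, 2.0, delta=1e-5)

if __name__ == "__main__":
    unittest.main()
```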
DOE Office of Scientific and Technical Information (OSTI.GOV)
Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert
2005-01-01
The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.
NASA Technical Reports Server (NTRS)
Pitman, C. L.; Erb, D. M.; Izygon, M. E.; Fridge, E. M., III; Roush, G. B.; Braley, D. M.; Savely, R. T.
1992-01-01
The United States' big space projects of the next decades, such as Space Station and the Human Exploration Initiative, will need the development of many millions of lines of mission-critical software. NASA-Johnson (JSC) is identifying and developing some of the Computer Aided Software Engineering (CASE) technology that NASA will need to build these future software systems. The goal is to improve the quality and the productivity of large software development projects. New trends in CASE technology are outlined, and the efforts of the Software Technology Branch (STB) at JSC to provide some of these CASE solutions for NASA are described. Key software technology components include knowledge-based systems, software reusability, user interface technology, reengineering environments, management systems for the software development process, software cost models, repository technology, and open, integrated CASE environment frameworks. The paper presents the status and long-term expectations for CASE products. The STB's Reengineering Application Project (REAP), Advanced Software Development Workstation (ASDW) project, and software development cost model (COSTMODL) project are then discussed. Some of the general difficulties of technology transfer are introduced, and a process developed by STB for CASE technology insertion is described.
Aladul, Mohammed I; Fitzpatrick, Raymond W; Chapman, Stephen R
2018-05-15
The approval of new biosimilars of infliximab, etanercept and adalimumab by the European Medicines Agency is expected to produce further cost savings to the healthcare system budget. This study aimed to estimate the budget impact of the introduction of the new biosimilars Flixabi®, Erelzi®, Solymbic®, Amgevita® and Imraldi® in rheumatology and gastroenterology specialities in the UK. A published budget impact model was adapted to estimate the expected cost savings following the entry of these new biosimilars in the UK over a three-year time horizon. This model was based on retrospective market shares of biologics used in rheumatology and gastroenterology, which were derived from DEFINE Software and healthcare professional perspectives. The model predicted that infliximab and etanercept biosimilars would replace their corresponding reference agents by 2020. Adalimumab biosimilars were predicted to achieve 19% of the rheumatology and gastroenterology market by 2020. Without the introduction of further biosimilars, the model predicted a reduction in expenditure of £44 million on biologics over the next three years. With the entry of Flixabi®, Erelzi®, Solymbic®, Amgevita® and Imraldi®, the model estimates cumulative savings of £285 million by 2020. The introduction of new infliximab, etanercept and adalimumab biosimilars will be associated with considerable cost savings and have a substantial favourable impact on the UK NHS budget. The number of biosimilars and their time of entry are critical to creating the competition that will yield maximum cost savings. Copyright © 2018 Elsevier Inc. All rights reserved.
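The underlying budget-impact arithmetic is simple; the sketch below shows its general form with invented patient counts, prices, and uptake shares, not the study's figures.

```python
# Sketch of a budget-impact calculation of the general kind described above:
# cumulative savings from biosimilar uptake. All numbers are invented.
def cumulative_savings(patients, annual_cost_ref, annual_cost_bio,
                       uptake_by_year):
    """uptake_by_year: fraction of patients on the biosimilar in each year."""
    return sum(patients * u * (annual_cost_ref - annual_cost_bio)
               for u in uptake_by_year)

# e.g., 20,000 patients, GBP 9,000 vs 6,500 per patient-year, rising uptake:
print(f"GBP {cumulative_savings(20_000, 9_000, 6_500, [0.2, 0.5, 0.8]):,.0f}")
```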
TEST (Toxicity Estimation Software Tool) Ver 4.1
The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to allow users to easily estimate toxicity and physical properties using a variety of QSAR methodologies. T.E.S.T allows a user to estimate toxicity without requiring any external programs. Users can input a chem...
Luczak, Susan E; Rosen, I Gary
2014-08-01
Transdermal alcohol sensor (TAS) devices have the potential to allow researchers and clinicians to unobtrusively collect naturalistic drinking data for weeks at a time, but the transdermal alcohol concentration (TAC) data these devices produce do not consistently correspond with breath alcohol concentration (BrAC) data. We present and test the BrAC Estimator software, a program designed to produce individualized estimates of BrAC from TAC data by fitting mathematical models to a specific person wearing a specific TAS device. Two TAS devices were worn simultaneously by 1 participant for 18 days. The trial began with a laboratory alcohol session to calibrate the model and was followed by a field trial with 10 drinking episodes. Model parameter estimates and fit indices were compared across drinking episodes to examine the calibration phase of the software. Software-generated estimates of peak BrAC, time of peak BrAC, and area under the BrAC curve were compared with breath analyzer data to examine the estimation phase of the software. In this single-subject design with breath analyzer peak BrAC scores ranging from 0.013 to 0.057, the software created consistent models for the 2 TAS devices, despite differences in raw TAC data, and was able to compensate for the attenuation of peak BrAC and latency of the time of peak BrAC that are typically observed in TAC data. This software program represents an important initial step for making it possible for non-mathematician researchers and clinicians to obtain estimates of BrAC from TAC data in naturalistic drinking environments. Future research with more participants and greater variation in alcohol consumption levels and patterns, as well as examination of gain scheduling calibration procedures and nonlinear models of diffusion, will help to determine how precise these software models can become. Copyright © 2014 by the Research Society on Alcoholism.
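The published software fits physics-based diffusion models; as a much-simplified stand-in, the sketch below calibrates a first-order kinetic link between BrAC and TAC by least squares on synthetic data, conveying only the calibrate-then-estimate idea, not the authors' model.

```python
# Illustrative stand-in for BrAC -> TAC calibration: a first-order kinetic
# model fit by least squares. The real software fits physics-based diffusion
# models; this sketch only conveys the calibrate-then-estimate idea.
import numpy as np
from scipy.optimize import curve_fit

def tac_model(t, k, scale, brac_fn):
    """TAC as a lagged, attenuated convolution of BrAC with an exponential
    kernel: TAC(t) = scale * k * integral of exp(-k(t-s)) BrAC(s) ds."""
    dt = t[1] - t[0]
    kernel = k * np.exp(-k * t)
    return scale * np.convolve(brac_fn(t), kernel)[: len(t)] * dt

t = np.linspace(0, 8, 200)                               # hours
brac = lambda s: 0.06 * np.exp(-((s - 1.5) / 1.0) ** 2)  # synthetic episode
observed_tac = tac_model(t, k=0.8, scale=0.9, brac_fn=brac)  # "field data"

popt, _ = curve_fit(lambda tt, k, sc: tac_model(tt, k, sc, brac),
                    t, observed_tac, p0=[0.5, 0.5])
print("calibrated k=%.2f, scale=%.2f" % tuple(popt))
```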
NOSC Program Managers Handbook. Revision 1
1988-02-01
cost. The effects of application of life-cycle cost analysis through the planning and RDT&E phases of a program, and the "design to cost" concept on... is the plan for assuring the quality of the design, design documentation, and fabricated/assembled hardware and associated computer software. 13.5.3.2... listings and printouts, which document the requirements, design, or details of computer software; explain the capabilities and limitations of the
10 CFR 436.15 - Formatting cost data.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Procedures for Life Cycle Cost Analyses § 436.15 Formatting cost data. In establishing cost data under §§ 436... software referenced in the Life Cycle Cost Manual for the Federal Energy Management Program. ...
10 CFR 436.15 - Formatting cost data.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Procedures for Life Cycle Cost Analyses § 436.15 Formatting cost data. In establishing cost data under §§ 436... software referenced in the Life Cycle Cost Manual for the Federal Energy Management Program. ...
10 CFR 436.15 - Formatting cost data.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Procedures for Life Cycle Cost Analyses § 436.15 Formatting cost data. In establishing cost data under §§ 436... software referenced in the Life Cycle Cost Manual for the Federal Energy Management Program. ...
10 CFR 436.15 - Formatting cost data.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Procedures for Life Cycle Cost Analyses § 436.15 Formatting cost data. In establishing cost data under §§ 436... software referenced in the Life Cycle Cost Manual for the Federal Energy Management Program. ...
Virtual Sensor for Kinematic Estimation of Flexible Links in Parallel Robots
Cabanes, Itziar; Mancisidor, Aitziber; Pinto, Charles
2017-01-01
The control of flexible link parallel manipulators is still an open area of research, endpoint trajectory tracking being one of the main challenges in this type of robot. The flexibility and deformations of the limbs make the estimation of the Tool Centre Point (TCP) position a challenging task. Authors have proposed different approaches to estimate this deformation and deduce the location of the TCP. However, most of these approaches require expensive measurement systems or the use of high computational cost integration methods. This work presents a novel approach based on a virtual sensor which, according to simulation results, can precisely estimate not only the deformation of the flexible links in control applications (less than 2% error) but also its derivatives (less than 6% error in velocity and 13% error in acceleration). The validity of the proposed Virtual Sensor is tested in a Delta Robot, where the position of the TCP is estimated based on the Virtual Sensor measurements with less than 0.03% error in comparison with the flexible approach developed in ADAMS Multibody Software. PMID:28832510
[The planning of resource support of secondary medical care in hospital].
Kungurov, N V; Zil'berberg, N V
2010-01-01
The Ural Institute of Dermatovenerology and Immunopathology has developed and implemented software for personalized, comprehensive recording of medical services and pharmaceuticals. The Institute has also produced a catalogue of medical services, a software module for calculating the financial cost of implementing full standards of secondary medical care for chronic dermatopathy, a reference book of standards for direct specific costs of laboratory and physiotherapy services, and reference books of pharmaceuticals, testing systems, and consumables. The unified information system of management recording is a sound technique for substantiating the costs of implementing standards of medical care, including high-tech care, taking into account the results of comprehensive accounting of the medical services provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leonard Angello
2005-09-30
Power generators are concerned with the maintenance costs associated with the advanced turbines that they are purchasing. Since these machines do not have fully established Operation and Maintenance (O&M) track records, power generators face financial risk due to uncertain future maintenance costs. This risk is of particular concern as the electricity industry transitions to a competitive business environment in which unexpected O&M costs cannot be passed through to consumers. These concerns have accelerated the need for intelligent software-based diagnostic systems that can monitor the health of a combustion turbine in real time and provide valuable information on the machine's performance to its owner/operators. EPRI, Impact Technologies, Boyce Engineering, and Progress Energy have teamed to develop a suite of intelligent software tools integrated with a diagnostic monitoring platform that, in real time, interpret data to assess the 'total health' of combustion turbines. The 'Combustion Turbine Health Management System' (CTHMS) will consist of a series of 'Dynamic Link Library' (DLL) programs residing on a diagnostic monitoring platform that accepts turbine health data from existing monitoring instrumentation. CTHMS interprets sensor and instrument outputs, correlates them to a machine's condition, provides interpretive analyses, projects servicing intervals, and estimates remaining component life. In addition, CTHMS enables real-time anomaly detection and diagnostics of performance and mechanical faults, enabling power producers to more accurately predict critical component remaining useful life and turbine degradation.
Picozzi, Matteo; Milkereit, Claus; Parolai, Stefano; Jaeckel, Karl-Heinz; Veit, Ingo; Fischer, Joachim; Zschau, Jochen
2010-01-01
Over the last few years, the analysis of seismic noise recorded by two dimensional arrays has been confirmed to be capable of deriving the subsoil shear-wave velocity structure down to several hundred meters depth. In fact, using just a few minutes of seismic noise recordings and combining this with the well known horizontal-to-vertical method, it has also been shown that it is possible to investigate the average one dimensional velocity structure below an array of stations in urban areas with a sufficient resolution to depths that would be prohibitive with active source array surveys, while in addition reducing the number of boreholes required to be drilled for site-effect analysis. However, the high cost of standard seismological instrumentation limits the number of sensors generally available for two-dimensional array measurements (i.e., of the order of 10), limiting the resolution in the estimated shear-wave velocity profiles. Therefore, new themes in site-effect estimation research by two-dimensional arrays involve the development and application of low-cost instrumentation, which potentially allows the performance of dense-array measurements, and the development of dedicated signal-analysis procedures for rapid and robust estimation of shear-wave velocity profiles. In this work, we present novel low-cost wireless instrumentation for dense two-dimensional ambient seismic noise array measurements that allows the real-time analysis of the surface-wavefield and the rapid estimation of the local shear-wave velocity structure for site response studies. We first introduce the general philosophy of the new system, as well as the hardware and software that forms the novel instrument, which we have tested in laboratory and field studies. PMID:22319298
Flight Validation of On-Demand Operations: The Deep Space One Beacon Monitor Operations Experiment
NASA Technical Reports Server (NTRS)
Wyatt, Jay; Sherwood, Rob; Sue, Miles; Szijjarto, John
2000-01-01
After a brief overview of the operational concept, this paper will provide a detailed description of the "as-flown" flight software components, the DS1 experiment plan, and experiment results to date. Special emphasis will be given to experiment results and lessons learned, since the basic system design has been previously reported. Mission scenarios where beacon operations are highly applicable will be described. Detailed cost savings estimates for a sample science mission will be provided, as will cumulative savings that are possible over the next fifteen years of NASA missions.
Selecting a software development methodology. [of digital flight control systems
NASA Technical Reports Server (NTRS)
Jones, R. E.
1981-01-01
The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.
Pharmacy component of a hospital end-product cost-accounting system.
Smith, J E; Sheaffer, S L; Meyer, G E; Giorgilli, F
1988-04-01
Determination of pharmacy department standard costs for providing drug products to patients at Thomas Jefferson University Hospital in Philadelphia is described. The hospital is implementing a cost-accounting system (CAS) that uses software developed at the New England Medical Center, Boston. The pharmacy identified nine categories of intermediate products on the basis of labor consumption. Standard labor times for each product category are based on measurement or estimation of time for each task in the preparation and distribution of a dose. Variable-labor standard time was determined by adjusting the cumulative time for the tasks to account for nonproductive time and nonroutine activities, and a variable-labor standard cost for each category was calculated. The standard cost per dose included the costs of labor and supplies (variable and fixed) and equipment; this standard cost plus the acquisition cost of a drug line item is the total intermediate product cost. Because the CAS is based on the hospital's patient charges, clinical pharmacy services are excluded. Intermediate products that substantially affect end-product costs (costs per patient case) will be identified for inclusion in CAS reports. The CAS will give a more accurate picture of resource consumption, enabling managers to focus their efforts to improve efficiency and productivity and reduce supply use; it could also improve the accuracy of the budgeting process. The CAS will support hospital administration decisions about marketing end products and department managers' decisions about controlling intermediate-product costs.
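The standard-cost build-up described is straightforward arithmetic; a sketch with assumed figures follows.

```python
# Sketch of the standard-cost build-up described above, with assumed figures.
def intermediate_product_cost(std_minutes, labor_rate_per_min,
                              supplies, equipment, drug_acquisition):
    """Standard cost per dose = variable labor + supplies + equipment,
    plus the acquisition cost of the drug line item."""
    labor = std_minutes * labor_rate_per_min
    return labor + supplies + equipment + drug_acquisition

# e.g., a product category with 6 standard minutes of labor at $0.45/min:
print(f"${intermediate_product_cost(6, 0.45, 1.20, 0.35, 12.50):.2f} per dose")
```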
Open-source meteor detection software for low-cost single-board computers
NASA Astrophysics Data System (ADS)
Vida, D.; Zubović, D.; Šegon, D.; Gural, P.; Cupec, R.
2016-01-01
This work aims to overcome the current price threshold of meteor stations, which can sometimes deter meteor enthusiasts from owning one. In recent years, small card-sized computers became widely available and are used for numerous applications. To utilize such computers for meteor work, software which can run on them is needed. In this paper we present a detailed description of newly-developed open-source software for fireball and meteor detection optimized for running on low-cost single-board computers. Furthermore, an update on the development of automated open-source software which will handle video capture, fireball and meteor detection, astrometry and photometry is given.
2008-12-01
between our current project and the historical projects. Therefore, to refine the historical volatility estimate of the previously completed software... historical volatility estimates obtained in the form of beliefs and plausibility based on subjective probabilities that take into consideration unique
Estimation of toxicity using a Java based software tool
A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and ran as a stand alone applic...
NASA Technical Reports Server (NTRS)
Basili, V. R.
1981-01-01
Work on metrics is discussed. Factors that affect software quality are reviewed. Metrics are discussed in terms of criteria achievements, reliability, and fault tolerance. Subjective and objective metrics are distinguished. Product/process and cost/quality metrics are characterized and discussed.
Software IV and V Research Priorities and Applied Program Accomplishments Within NASA
NASA Technical Reports Server (NTRS)
Blazy, Louis J.
2000-01-01
The mission of this research is to be world-class creators and facilitators of innovative, intelligent, high performance, reliable information technologies that enable NASA missions to (1) increase software safety and quality through error avoidance, early detection and resolution of errors, by utilizing and applying empirically based software engineering best practices; (2) ensure customer software risks are identified and/or that requirements are met and/or exceeded; (3) research, develop, apply, verify, and publish software technologies for competitive advantage and the advancement of science; and (4) facilitate the transfer of science and engineering data, methods, and practices to NASA, educational institutions, state agencies, and commercial organizations. The goals are to become a national Center Of Excellence (COE) in software and system independent verification and validation, and to become an international leading force in the field of software engineering for improving the safety, quality, reliability, and cost performance of software systems. This project addresses the following problems: ensuring the safety of NASA missions, ensuring requirements are met, minimizing programmatic and technological risks of software development and operations, improving software quality, reducing costs and time to delivery, and improving the science of software engineering.
Software For Computing Reliability Of Other Software
NASA Technical Reports Server (NTRS)
Nikora, Allen; Antczak, Thomas M.; Lyu, Michael
1995-01-01
Computer Aided Software Reliability Estimation (CASRE) computer program developed for use in measuring reliability of other software. Easier for non-specialists in reliability to use than many other currently available programs developed for same purpose. CASRE incorporates mathematical modeling capabilities of public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in Windows software environment. Provides menu-driven command interface; enabling and disabling of menu options guides user through (1) selection of set of failure data, (2) execution of mathematical model, and (3) analysis of results from model. Written in C language.
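One of the classic reliability-growth models executed by tools of this kind is the Goel-Okumoto nonhomogeneous Poisson process; the sketch below fits it to synthetic failure counts and is an illustration of the technique, not CASRE's actual implementation.

```python
# Hedged sketch: fitting a Goel-Okumoto reliability-growth model, one of the
# classic NHPP models of the kind such tools execute. Data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    """Expected cumulative failures by time t: mu(t) = a * (1 - exp(-b*t))."""
    return a * (1.0 - np.exp(-b * t))

t = np.arange(1, 21, dtype=float)             # weeks of testing
failures = goel_okumoto(t, a=120, b=0.15)     # synthetic "observed" counts
failures += np.random.default_rng(0).normal(0, 2, t.size)

(a_hat, b_hat), _ = curve_fit(goel_okumoto, t, failures, p0=[100, 0.1])
remaining = a_hat - goel_okumoto(t[-1], a_hat, b_hat)
print(f"estimated total faults {a_hat:.0f}; ~{remaining:.0f} remaining")
```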
Nema, Shubham; Hasan, Whidul; Bhargava, Anamika; Bhargava, Yogesh
2016-09-15
Behavioural neuroscience relies on software-driven methods for behavioural assessment, but the field lacks cost-effective, robust, open-source software for behavioural analysis. Here we propose a novel method, which we call ZebraTrack. It includes a cost-effective imaging setup for distraction-free behavioural acquisition, automated tracking using the open-source ImageJ software, and a workflow for extraction of behavioural endpoints. Our ImageJ algorithm gives users control at key steps while maintaining automation in tracking, without the need to install external plugins. We have validated this method by testing novelty-induced anxiety behaviour in adult zebrafish. Our results, in agreement with established findings, showed that during state anxiety, zebrafish showed reduced distance travelled, increased thigmotaxis and more freezing events. Furthermore, we propose a method to represent both the spatial and temporal distribution of choice-based behaviour, which is currently not possible with simple videograms. The ZebraTrack method is simple and economical, yet robust enough to give results comparable with those obtained from costly proprietary software such as Ethovision XT. We have developed and validated a novel cost-effective method for behavioural analysis of adult zebrafish using open-source ImageJ software. Copyright © 2016 Elsevier B.V. All rights reserved.
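Endpoint extraction from a centroid track of the kind such workflows produce is easy to illustrate. The sketch below computes total distance travelled and a simple thigmotaxis index; the coordinates and the wall-band definition are assumptions for illustration.

```python
# Sketch of endpoint extraction from a centroid track: total distance
# travelled and a simple thigmotaxis index. Coordinates and the wall-band
# definition (outer 30% of the arena radius) are illustrative assumptions.
import math

def endpoints(track, arena_center, arena_radius, wall_band=0.3):
    """track: list of (x, y) centroids per frame, same units as radius."""
    dist = sum(math.dist(track[i - 1], track[i]) for i in range(1, len(track)))
    near_wall = sum(1 for p in track
                    if math.dist(p, arena_center) > (1 - wall_band) * arena_radius)
    return dist, near_wall / len(track)

track = [(0, 0), (1, 0), (1, 2), (4, 2)]
print(endpoints(track, arena_center=(2, 1), arena_radius=5))
```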
Computer Software for Life Cycle Cost.
1987-04-01
Air Command and Staff College student report: Computer Software for Life Cycle Cost... obsolete), physical life (utility before physically wearing out), or application life (utility in a given function)." (7:5) The costs are usually
The Elusive Cost of Library Software
ERIC Educational Resources Information Center
Breeding, Marshall
2009-01-01
Software pricing is not a straightforward issue, since each procurement involves a special business arrangement between a library and its chosen vendor. The author thinks that it is reasonable to scale the cost of a product to such factors as the size of the library, the complexity of the installation, the number of simultaneous users, or the…
Modular Rocket Engine Control Software (MRECS)
NASA Technical Reports Server (NTRS)
Tarrant, C.; Crook, J.
1998-01-01
The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state-of-the-art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for advanced engine control systems that will result in lower software maintenance (operations) costs. It effectively accommodates software requirement changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives, benefits, and status of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, software reuse, and reduced software reverification time related to software changes. MRECS was recently modified to support a Space Shuttle Main Engine (SSME) hot-fire test. Cold Flow and Flight Readiness Testing were completed before the test was cancelled. Currently, the program is focused on supporting NASA MSFC in accomplishing development testing of the Fastrac Engine, part of NASA's Low Cost Technologies (LCT) Program. MRECS will be used for all engine development testing.
Dimensional Error in Rapid Prototyping with Open Source Software and Low-cost 3D-printer.
Rendón-Medina, Marco A; Andrade-Delgado, Laura; Telich-Tarriba, Jose E; Fuente-Del-Campo, Antonio; Altamirano-Arcos, Carlos A
2018-01-01
Rapid prototyping models (RPMs) have been extensively used in craniofacial and maxillofacial surgery, especially in areas such as orthognathic surgery, posttraumatic or oncological reconstructions, and implantology. Economic limitations are higher in developing countries such as Mexico, where resources dedicated to health care are limited, therefore limiting the use of RPM to few selected centers. This article aims to determine the dimensional error of a low-cost fused deposition modeling 3D printer (Tronxy P802MA, Shenzhen, Tronxy Technology Co), with Open source software. An ordinary dry human mandible was scanned with a computed tomography device. The data were processed with open software to build a rapid prototype with a fused deposition machine. Linear measurements were performed to find the mean absolute and relative difference. The mean absolute and relative difference was 0.65 mm and 1.96%, respectively (P = 0.96). Low-cost FDM machines and Open Source Software are excellent options to manufacture RPM, with the benefit of low cost and a relative error similar to that of other more expensive technologies.
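The reported error metrics are simple to reproduce; a sketch computing mean absolute and mean relative differences over synthetic measurement pairs follows.

```python
# Sketch of the reported error metrics: mean absolute and mean relative
# difference between reference and printed linear measurements (synthetic).
def dimensional_error(reference_mm, printed_mm):
    abs_diffs = [abs(r - p) for r, p in zip(reference_mm, printed_mm)]
    rel_diffs = [d / r * 100 for d, r in zip(abs_diffs, reference_mm)]
    n = len(abs_diffs)
    return sum(abs_diffs) / n, sum(rel_diffs) / n

ref = [45.2, 102.7, 33.1, 78.4]   # CT-derived measurements, mm (synthetic)
prn = [44.6, 103.5, 32.4, 79.1]   # measurements taken on the printed model
mad, mrd = dimensional_error(ref, prn)
print(f"mean absolute difference {mad:.2f} mm, mean relative {mrd:.2f}%")
```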
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
Software and the future of programming languages.
Aho, Alfred V
2004-02-27
Although software is the key enabler of the global information infrastructure, the amount and extent of software in use in the world today are not widely understood, nor are the programming languages and paradigms that have been used to create the software. The vast size of the embedded base of existing software and the increasing costs of software maintenance, poor security, and limited functionality are posing significant challenges for the software R&D community.
Managing configuration software of ground software applications with glueware
NASA Technical Reports Server (NTRS)
Larsen, B.; Herrera, R.; Sesplaukis, T.; Cheng, L.; Sarrel, M.
2003-01-01
This paper reports on a simple, low-cost effort to streamline the configuration of the uplink software tools. Even though the existing ground system consisted of JPL and custom Cassini software rather than COTS, we chose a glueware approach--reintegrating with wrappers and bridges and adding minimal new functionality.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Patrick Gonzalez; Sandra Brown
2006-06-30
The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st and July 30th 2006. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool. Work is being carried out in Brazil, Belize, Chile, Peru and the USA.
Towards Test Driven Development for Computational Science with pFUnit
NASA Technical Reports Server (NTRS)
Rilee, Michael L.; Clune, Thomas L.
2014-01-01
Developers working in Computational Science & Engineering (CSE)/High Performance Computing (HPC) must contend with constant change due to advances in computing technology and science. Test Driven Development (TDD) is a methodology that mitigates software development risks due to change at the cost of adding comprehensive and continuous testing to the development process. Testing frameworks tailored for CSE/HPC, like pFUnit, can lower the barriers to such testing, yet CSE software faces unique constraints foreign to the broader software engineering community. Effective testing of numerical software requires a comprehensive suite of oracles, i.e., use cases with known answers, as well as robust estimates for the unavoidable numerical errors associated with implementation with finite-precision arithmetic. At first glance these concerns often seem exceedingly challenging or even insurmountable for real-world scientific applications. However, we argue that this common perception is incorrect and driven by (1) a conflation between model validation and software verification and (2) the general tendency in the scientific community to develop relatively coarse-grained, large procedures that compound numerous algorithmic steps. We believe TDD can be applied routinely to numerical software if developers pursue fine-grained implementations that permit testing, neatly side-stepping concerns about needing nontrivial oracles as well as the accumulation of errors. We present an example of a successful, complex legacy CSE/HPC code whose development process shares some aspects with TDD, which we contrast with current and potential capabilities. A mix of our proposed methodology and framework support should enable everyday use of TDD by CSE-expert developers.
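The fine-grained style argued for here can be made concrete: test one small kernel against an accurate oracle, with a tolerance derived from a standard floating-point error bound rather than an arbitrary epsilon. The kernel and data below are illustrative, and Python stands in for the Fortran tooling.

```python
# Sketch of the fine-grained testing style argued for above: one small kernel
# checked against an accurate oracle, with a tolerance from a standard
# floating-point error bound rather than an arbitrary epsilon.
import math

def dot(a, b):
    """Naive dot product: the fine-grained unit under test."""
    return sum(x * y for x, y in zip(a, b))

def test_dot():
    a = [((-1) ** i) / (i + 1) for i in range(10_000)]
    b = [math.sin(i) for i in range(10_000)]
    oracle = math.fsum(x * y for x, y in zip(a, b))  # compensated summation
    # Standard bound for recursive summation: |err| <= n*eps*sum(|a_i*b_i|).
    bound = len(a) * 2**-52 * math.fsum(abs(x * y) for x, y in zip(a, b))
    assert abs(dot(a, b) - oracle) <= bound

test_dot()
print("dot-product kernel within its floating-point error bound")
```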
ERIC Educational Resources Information Center
Stone, Antonia
1982-01-01
Provides general information on currently available microcomputers, computer programs (software), hardware requirements, software sources, costs, computer games, and programing. Includes a list of popular microcomputers, providing price category, model, list price, software (cassette, tape, disk), monitor specifications, amount of random access…
Gasparini, Roberto; Landa, Paolo; Amicizia, Daniela; Icardi, Giancarlo; Ricciardi, Walter; de Waure, Chiara; Tanfani, Elena; Bonanni, Paolo; Lucioni, Carlo; Testi, Angela; Panatto, Donatella
2016-08-02
The European Medicines Agency has approved a multicomponent serogroup B meningococcal vaccine (Bexsero®) for use in individuals of 2 months of age and older. A cost-effectiveness analysis (CEA) from the societal and Italian National Health Service perspectives was performed in order to evaluate the impact of vaccinating Italian infants less than 1 y of age with Bexsero®, as opposed to non-vaccination. The analysis was carried out by means of Excel Version 2011 and the TreeAge Pro® software Version 2012. Two basal scenarios that differed in terms of disease incidence (official and estimated data to correct for underreporting) were considered. In the basal scenarios, we considered a primary vaccination cycle with 4 doses (at 2, 4, 6 and 12 months of age) and 1 booster dose at the age of 11 y, the societal perspective and no cost for death. Sensitivity analyses were carried out in which crucial variables were changed over probable ranges. In Italy, on the basis of official data on disease incidence, vaccination with Bexsero® could prevent 82.97 cases and 5.61 deaths in each birth cohort, while these figures proved to be three times higher on considering the estimated incidence. The results of the CEA showed that the Incremental Cost Effectiveness Ratio (ICER) per QALY was €109,762 in the basal scenario if official data on disease incidence are considered and €26,599 if estimated data are considered. The tornado diagram indicated that the most influential factor on ICER was the incidence of disease. The probability of sequelae, the cost of the vaccine and vaccine effectiveness also had an impact. Our results suggest that vaccinating infants in Italy with Bexsero® has the ability to significantly reduce meningococcal disease and, if the probable underestimation of disease incidence is considered, routine vaccination is advisable.
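The ICER at the center of such an analysis is a simple ratio of incremental cost to incremental effect; a sketch with invented numbers follows.

```python
# Sketch of the incremental cost-effectiveness ratio (ICER) at the heart of
# the analysis; all numbers below are invented for illustration.
def icer(cost_vaccination, cost_no_vaccination,
         qalys_vaccination, qalys_no_vaccination):
    """Incremental cost per QALY gained."""
    return ((cost_vaccination - cost_no_vaccination)
            / (qalys_vaccination - qalys_no_vaccination))

# e.g., a strategy costing EUR 50M more that gains 500 QALYs:
print(f"EUR {icer(90e6, 40e6, 10_500, 10_000):,.0f} per QALY")
```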
Integrated testing and verification system for research flight software design document
NASA Technical Reports Server (NTRS)
Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.
1979-01-01
The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification and test options are provided with special attention on real time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock also are detected statically.
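Static data-flow anomaly detection of the kind described can be illustrated on a single event sequence. The toy sketch below flags the classic use-before-definition and redefinition-without-use anomalies; it stands in for, and is far simpler than, the real static analysis of communicating concurrent processes.

```python
# Toy sketch of data-flow anomaly detection on one event sequence: flags
# use-before-definition and definition-followed-by-redefinition anomalies.
def find_anomalies(events):
    """events: list of (op, var) pairs with op in {'def', 'use'}."""
    state, anomalies = {}, []
    for i, (op, var) in enumerate(events):
        prev = state.get(var)
        if op == "use" and prev is None:
            anomalies.append((i, var, "use before definition"))
        elif op == "def" and prev == "def":
            anomalies.append((i, var, "redefinition without use"))
        state[var] = op
    return anomalies

trace = [("use", "x"), ("def", "y"), ("def", "y"), ("use", "y")]
for idx, var, kind in find_anomalies(trace):
    print(f"event {idx}: {var}: {kind}")
```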
Proactive Security Testing and Fuzzing
NASA Astrophysics Data System (ADS)
Takanen, Ari
Software is bound to have security-critical flaws, and no testing or code auditing can ensure that software is flawless. But software security testing requirements have improved radically during the past years, largely due to criticism from security-conscious consumers and Enterprise customers. Whereas in the past, security flaws were taken for granted (and patches were quietly and humbly installed), they are now probably one of the most common reasons why people switch vendors or software providers. The maintenance costs from security updates often add up to become one of the biggest cost items for large Enterprise users. Fortunately, test automation techniques have also improved. Techniques like model-based testing (MBT) enable efficient generation of security tests that reach good confidence levels in discovering zero-day mistakes in software. This technique is called fuzzing.
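A minimal random-mutation fuzzer conveys the core idea; real fuzzers, and especially model-based ones, generate far more targeted inputs. The toy parser and seed input below are invented for illustration.

```python
# Minimal random-mutation fuzzer illustrating the core idea; real fuzzers
# (especially model-based ones) generate far more targeted inputs.
import random

def parse_record(data: bytes) -> int:
    """Toy parser under test: expects b'LEN:<n>:<payload>'."""
    fields = data.split(b":")
    n = int(fields[1].decode())   # raises on malformed input
    return len(fields[2]) - n

def fuzz(seed_input: bytes, rounds: int = 1000):
    rng = random.Random(0)
    for i in range(rounds):
        data = bytearray(seed_input)
        for _ in range(rng.randint(1, 4)):        # flip a few random bytes
            data[rng.randrange(len(data))] = rng.randrange(256)
        try:
            parse_record(bytes(data))
        except Exception as exc:                  # a crash is a finding
            print(f"round {i}: {exc!r} on {bytes(data)!r}")
            return

fuzz(b"LEN:5:hello")
```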
Supporting Indonesia's National Forest Monitoring System with LiDAR Observations
NASA Astrophysics Data System (ADS)
Hagen, S. C.
2015-12-01
Scientists at Applied GeoSolutions, Jet Propulsion Laboratory, Winrock International, and the University of New Hampshire are working with the government of Indonesia to enhance the National Forest Monitoring System (NFMS) in Kalimantan, Indonesia. The establishment of a reliable, transparent, and comprehensive NFMS has been limited by a dearth of relevant data that are accurate, low-cost, and spatially resolved at subnational scales. In this NASA-funded project, we are developing, evaluating, and validating several critical components of an NFMS in Kalimantan, Indonesia, focusing on the use of LiDAR and radar imagery for improved carbon stock and forest degradation information. Applied GeoSolutions and the University of New Hampshire have developed an Open Source Software package to process large amounts of LiDAR data quickly, easily, and accurately. The Open Source project is called lidar2dems and includes the classification of raw LAS point clouds and the creation of Digital Terrain Models (DTMs), Digital Surface Models (DSMs), and Canopy Height Models (CHMs). Preliminary estimates of forest structure and forest damage from logging from these data sets support the idea that comprehensive, well documented, freely available software for processing LiDAR data can enable countries such as Indonesia to cost-effectively monitor their forests with high precision.
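The canopy-height-model step of such a pipeline is a raster subtraction. The sketch below uses small numpy arrays as stand-ins for the GeoTIFF rasters a pipeline like lidar2dems would read; the values are invented.

```python
# Sketch of the canopy-height-model step: CHM = DSM - DTM on gridded rasters.
# Small numpy arrays stand in for the GeoTIFFs a real pipeline would read.
import numpy as np

dtm = np.array([[101.0, 102.5], [100.2, 101.8]])   # bare-earth elevations, m
dsm = np.array([[118.0, 102.6], [125.7, 119.3]])   # first-return surface, m
chm = np.clip(dsm - dtm, 0.0, None)                # canopy heights, m
print(chm)
```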
Evidence of absence (v2.0) software user guide
Dalthorp, Daniel; Huso, Manuela; Dail, David
2017-07-06
Evidence of Absence software (EoA) is a user-friendly software application for estimating bird and bat fatalities at wind farms and for designing search protocols. The software is particularly useful in addressing whether the number of fatalities is below a given threshold and what search parameters are needed to give assurance that thresholds were not exceeded. The software also includes tools (1) for estimating carcass persistence distributions and searcher efficiency parameters from field trials, (2) for projecting future mortality based on past monitoring data, and (3) for exploring the potential consequences of various choices in the design of long-term incidental take permits for protected species. The software was designed specifically for cases where tolerance for mortality is low and carcass counts are small or even 0, but the tools also may be used for mortality estimates when carcass counts are large.
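The central question (given the carcasses found and an overall detection probability, is total mortality credibly below a threshold?) can be sketched with a binomial likelihood and a flat prior; this illustrates the idea only and is not the software's exact estimator.

```python
# Sketch of the core question: given X carcasses found and an overall
# detection probability g, is total mortality M credibly below a threshold?
# Binomial likelihood with a flat prior on M; an illustration of the idea,
# not the software's exact estimator.
from math import comb

def posterior_upper_bound(x_found, g, alpha=0.2, m_max=500):
    """Smallest M* with P(M <= M* | X = x_found) >= 1-alpha, flat prior."""
    like = [comb(m, x_found) * g**x_found * (1 - g)**(m - x_found)
            if m >= x_found else 0.0 for m in range(m_max + 1)]
    total = sum(like)
    cum = 0.0
    for m, L in enumerate(like):
        cum += L / total
        if cum >= 1 - alpha:
            return m

# Two carcasses found with a 30% overall detection probability:
print("80% credible upper bound on mortality:", posterior_upper_bound(2, 0.3))
```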
Sign: large-scale gene network estimation environment for high performance computing.
Tamada, Yoshinori; Shimamura, Teppei; Yamaguchi, Rui; Imoto, Seiya; Nagasaki, Masao; Miyano, Satoru
2011-01-01
Our research group is currently developing software for estimating large-scale gene networks from gene expression data. The software, called SiGN, is specifically designed for the Japanese flagship supercomputer "K computer", which is planned to achieve 10 petaflops in 2012, and other high performance computing environments including the Human Genome Center (HGC) supercomputer system. SiGN is a collection of gene network estimation software with three different sub-programs: SiGN-BN, SiGN-SSM and SiGN-L1. In these three programs, five different models are available: static and dynamic nonparametric Bayesian networks, state space models, graphical Gaussian models, and vector autoregressive models. All these models require a huge amount of computational resources for estimating large-scale gene networks and therefore are designed to be able to exploit the speed of 10 petaflops. The software will be available freely for "K computer" and HGC supercomputer system users. The estimated networks can be viewed and analyzed by Cell Illustrator Online and SBiP (Systems Biology integrative Pipeline). The software project web site is available at http://sign.hgc.jp/.
Lyerla, R; Gouws, E; García-Calleja, J M; Zaniewski, E
2006-06-01
This paper describes improvements and updates to an established approach to making epidemiological estimates of HIV prevalence in countries with low-level and concentrated epidemics. The structure of the software used to make estimates is briefly described, with particular attention to changes and improvements. The approach focuses on identifying populations which, through their behaviour, are at high risk of infection with HIV or who are exposed through the risk behaviour of their sexual partners. Estimates of the size and HIV prevalence of these populations allow the total number of HIV-infected people in a country or region to be estimated. Major changes in the software focus on the move away from short-term projections and towards developing an epidemiological curve that more accurately represents the change in prevalence of HIV over time. The software continues to provide an output file for use in the Spectrum software so as to estimate the demographic impact of HIV infection at country level.
Artificial intelligence approaches to software engineering
NASA Technical Reports Server (NTRS)
Johannes, James D.; Macdonald, James R.
1988-01-01
Artificial intelligence approaches to software engineering are examined. The software development life cycle is a sequence of not-so-well-defined phases. Improved techniques for developing systems have been formulated over the past 15 years, but pressure to reduce costs continues. Software development technology seems to be standing still. The primary objective of the knowledge-based approach to software development presented in this paper is to avoid problem areas that lead to schedule slippages, cost overruns, or software products that fall short of their desired goals. Identifying and resolving software problems early, often in the phase in which they first occur, has been shown to contribute significantly to reducing risk in software development. Software development is not a mechanical process but a basic human activity. It requires clear thinking, work, and rework to be successful. The artificial intelligence approaches to software engineering presented here support the software development life cycle by changing current practices and methods, replacing them with better techniques that improve the process of software development and the quality of the resulting products. The software development process can be structured into well-defined steps whose interfaces are standardized, supported, and checked by automated procedures that provide error detection, produce documentation, and ultimately support the actual design of complex programs.
ERIC Educational Resources Information Center
Paquet, Katherine G.
2013-01-01
Cloud computing may provide cost benefits for organizations by eliminating the overhead costs of software, hardware, and maintenance (e.g., license renewals, upgrading software, servers and their physical storage space, administration along with funding a large IT department). In addition to the promised savings, the organization may require…
Reducing Lifecycle Sustainment Costs
2015-05-01
…ahead of government systems – specific O&S needs in government: depots, software centers, VAMOSC/ERP interfaces; implications of ERP systems… funding is not allocated for its implementation. Technology refresh often requires non-recurring engineering investment, but the Working Capital Funds… VAMOSC systems – Cost and Software Data Reports (CSDRs) • contractor logistics support contracts • includes subcontractor reporting – effects of…
NASA Technical Reports Server (NTRS)
Fordyce, Jess
1996-01-01
Work carried out to re-engineer the mission analysis segment of JPL's mission planning ground system architecture is reported on. The aim is to transform the existing software tools, originally developed for specific missions on different support environments, into an integrated, general purpose, multi-mission tool set. The issues considered are: the development of a partnership between software developers and users; the definition of key mission analysis functions; the development of a consensus based architecture; the move towards evolutionary change instead of revolutionary replacement; software reusability, and the minimization of future maintenance costs. The current status and aims of new developments are discussed and specific examples of cost savings and improved productivity are presented.
Advanced Structural Optimization Under Consideration of Cost Tracking
NASA Astrophysics Data System (ADS)
Zell, D.; Link, T.; Bickelmaier, S.; Albinger, J.; Weikert, S.; Cremaschi, F.; Wiegand, A.
2014-06-01
In order to improve the design process of launcher configurations in the early development phase, the Multidisciplinary Optimization (MDO) software was developed. The tool combines efficient software tools such as Optimal Design Investigations (ODIN) for structural optimization and the Aerospace Trajectory Optimization Software (ASTOS) for trajectory and vehicle design optimization for a defined payload and mission. The present paper focuses on the integration and validation of ODIN. ODIN enables the user to optimize typical axisymmetric structures by sizing the stiffening designs for strength and stability while minimizing the structural mass. In addition, a fully automatic finite element model (FEM) generator module creates ready-to-run FEM models of a complete stage or launcher assembly. Cost tracking and future improvements concerning cost optimization are also indicated.
Cost-effectiveness of external cephalic version for term breech presentation.
Tan, Jonathan M; Macario, Alex; Carvalho, Brendan; Druzin, Maurice L; El-Sayed, Yasser Y
2010-01-21
External cephalic version (ECV) is recommended by the American College of Obstetricians and Gynecologists to convert a breech fetus to vertex position and reduce the need for cesarean delivery. The goal of this study was to determine the incremental cost-effectiveness ratio, from society's perspective, of ECV compared to scheduled cesarean for term breech presentation. A computer-based decision model (TreeAge Pro 2008, TreeAge Software, Inc.) was developed for a hypothetical base-case parturient presenting with a term singleton breech fetus with no contraindications for vaginal delivery. The model incorporated actual hospital costs (e.g., $8,023 for cesarean and $5,581 for vaginal delivery), utilities to quantify health-related quality of life, and probabilities, based on analysis of the published literature, of successful ECV trial, spontaneous reversion, mode of delivery, and need for unanticipated emergency cesarean delivery. The primary endpoint was the incremental cost-effectiveness ratio in dollars per quality-adjusted life-year gained. A threshold of $50,000 per quality-adjusted life-year (QALY) was used to determine cost-effectiveness. The incremental cost-effectiveness of ECV, assuming a baseline 58% success rate, equaled $7,900/QALY. If the estimated probability of successful ECV is less than 32%, then ECV costs more to society and yields poorer QALYs for the patient. If the probability of successful ECV is between 32% and 63%, ECV costs more than cesarean delivery but with greater associated QALYs, such that the cost-effectiveness ratio is less than $50,000/QALY. If the probability of successful ECV is greater than 63%, the computer modeling indicates that a trial of ECV is less costly and yields better QALYs than a scheduled cesarean. The cost-effectiveness of a trial of ECV is most sensitive to its probability of success, and not to the probabilities of a cesarean after ECV, spontaneous reversion to breech, a successful second ECV trial, or an adverse outcome from emergency cesarean. From society's perspective, an ECV trial is cost-effective compared to a scheduled cesarean for breech presentation provided the probability of successful ECV is > 32%. Improved algorithms are needed to more precisely estimate the likelihood that a patient will have a successful ECV.
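The study's headline numbers follow directly from the incremental cost-effectiveness definition. Below is a minimal Python sketch of the ICER computation and threshold test; the per-strategy costs and QALY values are hypothetical placeholders, not the study's full decision tree:

```python
def icer(cost_new: float, qaly_new: float,
         cost_ref: float, qaly_ref: float) -> float:
    """Incremental cost-effectiveness ratio: delta cost / delta QALY."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# Hypothetical per-strategy expected costs and QALYs (the real model
# also branches on reversion, emergency cesarean, second ECV trial, ...).
cost_ecv, qaly_ecv = 8_250.0, 0.997
cost_cs,  qaly_cs  = 8_023.0, 0.968

ratio = icer(cost_ecv, qaly_ecv, cost_cs, qaly_cs)
print(f"ICER = ${ratio:,.0f} per QALY")          # ~ $7,800/QALY here
threshold = 50_000
print("cost-effective" if ratio < threshold else "not cost-effective")
```

The sensitivity analyses in the abstract amount to recomputing this ratio while sweeping the ECV success probability, which shifts the expected cost and QALYs of the ECV arm.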
Life cycle cost modeling of conceptual space vehicles
NASA Technical Reports Server (NTRS)
Ebeling, Charles
1993-01-01
This paper documents progress to date by the University of Dayton on the development of a life cycle cost model for use during the conceptual design of new launch vehicles and spacecraft. This research is being conducted under NASA Research Grant NAG-1-1327. The effort shifts the focus from the first two years of work, in which a reliability and maintainability (R&M) model was developed, to the initial development of a life cycle cost model. Cost categories are initially patterned after NASA's three-axis work breakdown structure, consisting of a configuration axis (vehicle), a function axis, and a cost axis. The focus will be on operations and maintenance costs and other recurring costs. Secondary tasks performed concurrently with the development of the life cycle costing model include continual support and upgrade of the R&M model. The primary result of the completed research will be a methodology, and a computer implementation of that methodology, to provide timely cost analysis in support of conceptual design activities. The major objectives of this research are: to obtain and to develop improved methods for estimating manpower, spares, software and hardware costs, facilities costs, and other cost categories as identified by NASA personnel; to construct a life cycle cost model of a space transportation system for budget exercises and performance-cost trade-off analysis during the conceptual and development stages; to continue to support modifications and enhancements to the R&M model; and to continue to assist in the development of a simulation model to provide an integrated view of the operations and support of the proposed system.
Evaluation of the Ethiopian Millennium Rural Initiative: Impact on Mortality and Cost-Effectiveness
Curry, Leslie A.; Byam, Patrick; Linnander, Erika; Andersson, Kyeen M.; Abebe, Yigeremu; Zerihun, Abraham; Thompson, Jennifer W.; Bradley, Elizabeth H.
2013-01-01
Main Objective: Few studies have examined the long-term impact of large-scale interventions to strengthen primary care services for women and children in rural, low-income settings. We evaluated the impact of the Ethiopian Millennium Rural Initiative (EMRI), an 18-month systems-based intervention to improve the performance of 30 primary health care units in rural areas of Ethiopia. Methods: We assessed the impact of EMRI on maternal and child survival using the Lives Saved Tool (LiST), Demography (DemProj) and AIDS Impact Model (AIM) tools in Spectrum software, inputting monthly data on six indicators: (1) antenatal coverage (ANC), (2) skilled birth attendance coverage (SBA), (3) post-natal coverage (PNC), (4) HIV testing during ANC, (5) measles vaccination coverage, and (6) pentavalent 3 vaccination coverage. We calculated a cost-benefit ratio of the EMRI program including lives saved during implementation and lives saved during implementation plus 5-year follow-up. Results: A total of 134 lives (all children) were estimated to have been saved due to the EMRI interventions during the 18-month intervention in 30 health centers and their catchment areas, with an estimated additional 852 lives (820 children and 32 adults) saved during the 5-year post-EMRI period. For the 18-month intervention period, EMRI cost $37,313 per life saved ($42,366 per life if evaluation costs are included). Calculated over the 18-month intervention plus 5 years post-intervention, EMRI cost $5,875 per life saved ($6,671 per life if evaluation costs are included). The cost-effectiveness of EMRI improves substantially if the performance achieved during the 18 months of the EMRI intervention is sustained for 5 years. Scaling up EMRI to operate for 5 years across the 4 major regions of Ethiopia could save as many as 34,908 lives. Significance: A systems-based approach to improving primary care in low-income settings can have transformational impact on lives saved and be cost-effective. PMID:24260307
Preventable drug waste among anesthesia providers: opportunities for efficiency.
Atcheson, Carrie Leigh Hamby; Spivack, John; Williams, Robert; Bryson, Ethan O
2016-05-01
Health care service bundling experiments at the state and regional levels have shown reduced costs by providing a single lump-sum reimbursement for anesthesia services, surgery, and postoperative care. Cost savings related to the provision of anesthesia care could therefore significantly impact sustainability. This study defines and quantifies routine and preventable anesthetic drug waste and the patient, procedure, and anesthesia provider characteristics associated with increased waste. Over a 12-month period, the type and quantity of clean drugs prepared by the anesthesia team for the first case of the day were recorded. The amount of each drug administered was obtained from the computerized anesthesia record, and data were analyzed to determine the incidence and cost of routine and preventable drug waste. The monthly and yearly cost of preventable waste, including the cost of pharmacy tech labor and materials where applicable, was estimated based on surgical case volume at the study institution. All analyses were performed using SAS software v9.2. Anesthetic drugs prepared for 543 separate surgical cases were observed. Less than 20% of cases generated routine waste. Preventable waste was generated most frequently for ephedrine (59.5% of cases), succinylcholine (33.7%), and lidocaine (25.1%), and least frequently for ondansetron (1.3%), phenylephrine (2.6%), and dexamethasone (2.8%). The estimated yearly cost of preventable anesthetic drug waste was $185,250. Significant potential savings with little impact on clinically significant availability may be achieved through the use of prefilled syringes for some commonly used anesthetic drugs. An intelligently implemented switch to prefilled syringes for select drugs is a potential cost saving measure, but savings might be diminished by disposal of prefilled syringes when they expire, hidden costs in the hospital pharmacy, and inability to supply some medications in prefilled syringes due to stability or manufacturing issues. Copyright © 2016 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Wulfson, Stephen
1988-01-01
Presents reviews of six computer software programs for teaching science. Provides the publisher, grade level, cost, and descriptions of software, including: (1) "Recycling Logic"; (2) "Introduction to Biochemistry"; (3) "Food for Thought"; (4) "Watts in a Home"; (5) "Geology in Action"; and (6)…
When to Renew Software Licences at HPC Centres? A Mathematical Analysis
NASA Astrophysics Data System (ADS)
Baolai, Ge; MacIsaac, Allan B.
2010-11-01
In this paper we study a common problem faced by many high performance computing (HPC) centres: when and how to renew commercial software licences. Software vendors often sell perpetual licences along with forward update and support contracts at an additional annual cost. Every year or so, software support personnel and the budget units of HPC centres must decide whether or not to renew such support, and usually such decisions are made intuitively. A continuing support contract can, however, be costly. One might therefore want a rational answer to the question of whether and when the renewal option should be exercised. In an attempt to study this problem within a market framework, we present the mathematical problem derived for the day-to-day operation of a hypothetical HPC centre that charges for the use of software packages. In the mathematical model, we assume that the uncertainty comes from the demand (the number of users using the packages) as well as the price. Further, we assume that the availability of up-to-date software versions may also affect demand. We develop a renewal strategy that aims to maximize the expected profit from the use of the software under consideration. The derived problem involves a decision tree, which constitutes a numerical procedure that can be processed in parallel.
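The renewal decision can be framed as an expected-profit comparison under demand uncertainty. The following Monte Carlo toy model is a minimal illustration of that framing, with made-up fee, price, and demand figures; the paper's decision-tree formulation is considerably richer:

```python
import numpy as np

rng = np.random.default_rng(42)

def expected_profit(renew: bool, n_sims: int = 100_000) -> float:
    """Monte Carlo expected annual profit for a hypothetical HPC centre
    charging for software use. All parameters are illustrative."""
    renewal_fee = 20_000.0
    price_per_user = 450.0
    # Demand is uncertain; an up-to-date version attracts more users.
    mean_users = 60 if renew else 45
    users = rng.poisson(mean_users, size=n_sims)
    revenue = users * price_per_user
    cost = renewal_fee if renew else 0.0
    return float(np.mean(revenue - cost))

for choice in (True, False):
    print(f"renew={choice}: expected profit = ${expected_profit(choice):,.0f}")
# With these made-up numbers, the demand uplift (15 users x $450 = $6,750)
# does not cover the $20,000 fee, so skipping renewal maximizes expected
# profit; with a larger uplift or lower fee the decision flips.
```

The point of a model like this, as the paper argues, is to replace an intuitive yes/no with a calculation whose inputs (demand and price uncertainty, version effects) are stated explicitly.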
Cost Effective Development of Usable Systems: Gaps between HCI and Software Architecture Design
NASA Astrophysics Data System (ADS)
Folmer, Eelke; Bosch, Jan
A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their software. Practice, however, shows that product quality (which includes usability, among other attributes) is not as high as it could be. Studies of software projects (Pressman, 2001) reveal that organizations spend a relatively large amount of money and effort on fixing usability problems during late-stage development. Some of these problems could have been detected and fixed much earlier. This avoidable rework leads to high costs and, because different tradeoffs have to be made during development (for example, between cost and quality), to systems with less than optimal usability. This problem has been around for a couple of decades, especially since software engineering (SE) and human computer interaction (HCI) became disciplines in their own right. While both disciplines developed, several gaps appeared which are now receiving increased attention in the research literature. Major gaps of understanding have been identified, both between suggested practice and how software is actually developed in industry, and between the best practices of each of the fields (Carrol et al, 1994, Bass et al, 2001, Folmer and Bosch, 2002). In addition, the fields differ in terminology, concepts, education, and methods.
Vognild, Lars K; Burkow, Tatjana M; Luque, Luis Fernandez
2009-01-01
In this paper we present an approach to building personal health services that support follow-up, physical exercise, health education, and psychosocial support for the chronically ill, based on free open source software and low-cost computers, mobile devices, and consumer health and fitness devices. We argue that this will lower the cost of the systems, which is important given the increasing number of people with chronic diseases and limited healthcare budgets.
Low-cost data analysis systems for processing multispectral scanner data
NASA Technical Reports Server (NTRS)
Whitely, S. L.
1976-01-01
The basic hardware and software requirements are described for four low cost analysis systems for computer generated land use maps. The data analysis systems consist of an image display system, a small digital computer, and an output recording device. Software is described together with some of the display and recording devices, and typical costs are cited. Computer requirements are given, and two approaches are described for converting black-white film and electrostatic printer output to inexpensive color output products. Examples of output products are shown.
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This document consists of 24 microcomputer software package evaluations prepared by the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Educational Laboratory. Each software review lists source, cost, ability level, subject, topic, medium of transfer, required hardware, required software,…
A Formal Application of Safety and Risk Assessment in Software Systems
2004-09-01
…characteristics of software engineering, development, and safety… against a comparison of planned and actual schedules, costs, and characteristics. Software safety is focused on the reduction of unsafe incidents… they merely carry out the role for which they were anatomically designed. Software is characteristically like an anatomical cell, as it merely…
Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic
NASA Technical Reports Server (NTRS)
Leucht, Kurt W.; Semmel, Glenn S.
2008-01-01
The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
Trends in computer hardware and software.
Frankenfeld, F M
1993-04-01
Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.
NASA Astrophysics Data System (ADS)
Lee, Jonghyun; Yoon, Hongkyu; Kitanidis, Peter K.; Werth, Charles J.; Valocchi, Albert J.
2016-07-01
Characterizing subsurface properties is crucial for reliable and cost-effective groundwater supply management and contaminant remediation. With recent advances in sensor technology, large volumes of hydrogeophysical and geochemical data can be obtained to achieve high-resolution images of subsurface properties. However, characterization with such a large amount of information requires prohibitive computational costs associated with "big data" processing and numerous large-scale numerical simulations. To tackle such difficulties, the principal component geostatistical approach (PCGA) has been proposed as a "Jacobian-free" inversion method that requires far fewer forward simulation runs per iteration than the numbers of unknown parameters and measurements needed in traditional inversion methods. PCGA can be conveniently linked to any multiphysics simulation software with independent parallel executions. In this paper, we extend PCGA to handle a large number of measurements (e.g., 10^6 or more) by constructing a fast preconditioner whose computational cost scales linearly with the data size. For illustration, we characterize the heterogeneous hydraulic conductivity (K) distribution in a laboratory-scale 3-D sand box using about 6 million transient tracer concentration measurements obtained using magnetic resonance imaging. Since each individual observation carries little information on the K distribution, the data were compressed by the zeroth temporal moment of the breakthrough curves, which is equivalent to the mean travel time under the experimental setting. Only about 2000 forward simulations in total were required to obtain the best estimate with corresponding estimation uncertainty, and the estimated K field captured key patterns of the original packing design, showing the efficiency and effectiveness of the proposed method.
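The data-compression step mentioned above is straightforward to illustrate: each breakthrough curve is reduced to a temporal moment before inversion. A minimal sketch with a synthetic curve (the PCGA inversion itself is far more involved):

```python
import numpy as np

def temporal_moments(t: np.ndarray, c: np.ndarray):
    """Zeroth and first temporal moments of a breakthrough curve c(t),
    plus the mean travel time m1/m0, via trapezoidal integration."""
    m0 = np.trapz(c, t)
    m1 = np.trapz(t * c, t)
    return m0, m1, m1 / m0

# Synthetic breakthrough curve: a concentration pulse centred near t = 12.
t = np.linspace(0.0, 40.0, 401)
c = np.exp(-0.5 * ((t - 12.0) / 3.0) ** 2)
m0, m1, mean_tt = temporal_moments(t, c)
print(f"m0 = {m0:.2f}, mean travel time = {mean_tt:.2f}")   # mean ~ 12
```

Compressing each voxel's time series to a single moment is what turns millions of raw measurements into a tractable inversion target.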
2012-02-01
…parameter estimation method, but rather to carefully describe how to use the ERDC software implementation of MLSL that accommodates the PEST model… model-independent, LM-method-based parameter estimation software PEST (Doherty, 2004, 2007a, 2007b), which quantifies model-to-measurement misfit… et al. (2011) focused on one drawback associated with LM-based model-independent parameter estimation as implemented in PEST; viz., that it requires…
Paretti, Nicholas V.; Kennedy, Jeffrey R.; Turney, Lovina A.; Veilleux, Andrea G.
2014-01-01
The regional regression equations were integrated into the U.S. Geological Survey’s StreamStats program. The StreamStats program is a national map-based web application that allows the public to easily access published flood frequency and basin characteristic statistics. The interactive web application allows a user to select a point within a watershed (gaged or ungaged) and retrieve flood-frequency estimates derived from the current regional regression equations and geographic information system data within the selected basin. StreamStats provides users with an efficient and accurate means for retrieving the most up to date flood frequency and basin characteristic data. StreamStats is intended to provide consistent statistics, minimize user error, and reduce the need for large datasets and costly geographic information system software.
PREMER: a Tool to Infer Biological Networks.
Villaverde, Alejandro F; Becker, Kolja; Banga, Julio R
2017-10-04
Inferring the structure of unknown cellular networks is a main challenge in computational biology. Data-driven approaches based on information theory can determine the existence of interactions among network nodes automatically. However, the elucidation of certain features - such as distinguishing between direct and indirect interactions or determining the direction of a causal link - requires estimating information-theoretic quantities in a multidimensional space. This can be a computationally demanding task, which acts as a bottleneck for the application of elaborate algorithms to large-scale network inference problems. The computational cost of such calculations can be alleviated by the use of compiled programs and parallelization. To this end we have developed PREMER (Parallel Reverse Engineering with Mutual information & Entropy Reduction), a software toolbox that can run in parallel and sequential environments. It uses information theoretic criteria to recover network topology and determine the strength and causality of interactions, and allows incorporating prior knowledge, imputing missing data, and correcting outliers. PREMER is a free, open source software tool that does not require any commercial software. Its core algorithms are programmed in FORTRAN 90 and implement OpenMP directives. It has user interfaces in Python and MATLAB/Octave, and runs on Windows, Linux and OSX (https://sites.google.com/site/premertoolbox/).
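The information-theoretic core of such tools is the pairwise mutual information estimate. A minimal histogram (plug-in) estimator in Python is sketched below; PREMER's FORTRAN 90/OpenMP implementation and its entropy-reduction and causality corrections are far more elaborate:

```python
import numpy as np

def mutual_information(x: np.ndarray, y: np.ndarray, bins: int = 16) -> float:
    """Histogram (plug-in) estimate of mutual information I(X;Y) in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0                          # avoid log(0) terms
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
x = rng.normal(size=5_000)
print(mutual_information(x, 0.8 * x + rng.normal(size=5_000)))  # dependent: > 0
print(mutual_information(x, rng.normal(size=5_000)))            # independent: ~ 0
```

Estimating such quantities in higher dimensions, for every node pair and conditioning set, is exactly the computational bottleneck that motivates the compiled, parallel implementation described in the abstract.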
Data systems and computer science: Software Engineering Program
NASA Technical Reports Server (NTRS)
Zygielbaum, Arthur I.
1991-01-01
An external review of the Integrated Technology Plan for the Civil Space Program is presented. This review is specifically concerned with the Software Engineering Program. The goals of the Software Engineering Program are as follows: (1) improve NASA's ability to manage development, operation, and maintenance of complex software systems; (2) decrease NASA's cost and risk in engineering complex software systems; and (3) provide technology to assure safety and reliability of software in mission critical applications.
Hospital cost structure in the USA: what's behind the costs? A business case.
Chandra, Charu; Kumar, Sameer; Ghildayal, Neha S
2011-01-01
Hospital costs in the USA are a large part of the national GDP. Medical billing and supplies processes are significant and growing contributors to hospital operations costs in the USA. This article aims to identify cost drivers associated with these processes and to suggest improvements to reduce hospital costs. A Monte Carlo simulation model that uses @Risk software facilitates cost analysis and captures variability associated with the medical billing process (administrative) and medical supplies process (variable). The model produces estimated savings for implementing new processes. Significant waste exists across the entire medical supply process that needs to be eliminated. Annual savings, by implementing the improved process, have the potential to save several billion dollars annually in US hospitals. The other analysis in this study is related to hospital billing processes. Increased spending on hospital billing processes is not entirely due to hospital inefficiency. The study lacks concrete data for accurately measuring cost savings, but there is obviously room for improvement in the two US healthcare processes. This article only looks at two specific costs associated with medical supply and medical billing processes, respectively. This study facilitates awareness of escalating US hospital expenditures. Cost categories, namely, fixed, variable and administrative, are presented to identify the greatest areas for improvement. The study will be valuable to US Congress policy makers and US healthcare industry decision makers. Medical billing process, part of a hospital's administrative costs, and hospital supplies management processes are part of variable costs. These are the two major cost drivers of US hospitals' expenditures that were examined and analyzed.
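The @Risk-style analysis described here is easy to reproduce in outline: draw cost-driver inputs from assumed distributions and summarize the resulting cost distribution. All distributions and parameters below are illustrative, not the article's:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000  # simulated billing cycles

# Illustrative cost-driver distributions for a medical billing process:
claims_per_month = rng.poisson(4_000, N)
cost_per_claim   = rng.triangular(6.0, 9.0, 18.0, N)  # min, mode, max ($)
rework_fraction  = rng.beta(2, 18, N)                 # share of claims reworked
rework_cost      = 25.0                               # $ per reworked claim

monthly_cost = claims_per_month * (cost_per_claim
                                   + rework_fraction * rework_cost)
print(f"mean monthly billing cost: ${monthly_cost.mean():,.0f}")
print(f"90th percentile:           ${np.percentile(monthly_cost, 90):,.0f}")
```

Comparing such distributions before and after a proposed process change is what yields the "estimated savings" figures the article reports.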
What Librarians Still Don't Know about Free Software
ERIC Educational Resources Information Center
Chudnov, Daniel
2009-01-01
Free software isn't about cost, and it isn't about hype, it isn't about taking business away from vendors. It's about four kinds of freedom--the freedom to use the software for any purpose, the freedom to study how the software works, the freedom to modify the software to adapt it to one's needs, and the freedom to copy and share copies of the…
10 CFR 961.11 - Text of the contract.
Code of Federal Regulations, 2014 CFR
2014-01-01
... program including information on cost projections, project plans and progress reports. 5. (a) Beginning on...-type documents or computer software (including computer programs, computer software data bases, and computer software documentation). Examples of technical data include research and engineering data...
10 CFR 961.11 - Text of the contract.
Code of Federal Regulations, 2013 CFR
2013-01-01
... program including information on cost projections, project plans and progress reports. 5. (a) Beginning on...-type documents or computer software (including computer programs, computer software data bases, and computer software documentation). Examples of technical data include research and engineering data...
NHPP-Based Software Reliability Models Using Equilibrium Distribution
NASA Astrophysics Data System (ADS)
Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi
Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
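For readers unfamiliar with NHPP-based SRMs, the classic exponential (Goel-Okumoto) model illustrates the machinery the paper builds on: a mean value function m(t) = a(1 - e^(-bt)) fitted to fault-detection times by maximum likelihood. The sketch below uses hypothetical detection times and the standard model, not the authors' equilibrium-distribution variant:

```python
import numpy as np
from scipy.optimize import minimize

def goel_okumoto_mvf(t, a, b):
    """Expected cumulative faults by time t: m(t) = a * (1 - exp(-b t))."""
    return a * (1.0 - np.exp(-b * t))

def neg_log_likelihood(params, times, t_end):
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    # NHPP log-likelihood: sum(log intensity(t_i)) - m(t_end)
    intensity = a * b * np.exp(-b * times)
    return -(np.sum(np.log(intensity)) - goel_okumoto_mvf(t_end, a, b))

# Hypothetical fault-detection times (days into the test phase).
times = np.array([2, 5, 7, 11, 16, 21, 29, 38, 52, 70], dtype=float)
res = minimize(neg_log_likelihood, x0=[15.0, 0.05],
               args=(times, 90.0), method="Nelder-Mead")
a_hat, b_hat = res.x
print(f"estimated total faults a = {a_hat:.1f}, rate b = {b_hat:.3f}")
print(f"faults remaining = {a_hat - goel_okumoto_mvf(90.0, a_hat, b_hat):.1f}")
```

The paper's contribution is to change the fault-detection time distribution inside this framework (via the equilibrium distribution) rather than the estimation machinery itself.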
NASA Astrophysics Data System (ADS)
Diefenbach, A. K.; Crider, J. G.; Schilling, S. P.; Dzurisin, D.
2007-12-01
We describe a low-cost application of digital photogrammetry using commercial grade software, an off-the-shelf digital camera, a laptop computer and oblique photographs to reconstruct volcanic dome morphology during the on-going eruption at Mount St. Helens, Washington. Renewed activity at Mount St. Helens provides a rare opportunity to devise and test new methods for better understanding and predicting volcanic events, because the new method can be validated against other observations on this well-instrumented volcano. Uncalibrated, oblique aerial photographs (snap shots) taken from a helicopter are the raw data. Twelve sets of overlapping digital images of the dome taken during 2004-2007 were used to produce digital elevation models (DEMs) from which dome height, eruption volume and extrusion rate can be derived. Analyses of the digital images were carried out using PhotoModeler software, which produces three dimensional coordinates of points identified in multiple photos. The steps involved include: (1) calibrating the digital camera using this software package, (2) establishing control points derived from existing DEMs, (3) identifying tie points located in each photo of any given model date, and (4) identifying points in pairs of photos to build a three dimensional model of the evolving dome at each photo date. Text files of three-dimensional points encompassing the dome at each date were imported into ArcGIS and three-dimensional models (triangulated irregular network or TINs) were generated. TINs were then converted to 2 m raster DEMs. The evolving morphology of the growing dome was modeled by comparison of successive DEMs. The volume of extruded lava visible in each DEM was calculated using the 1986 pre-eruption crater floor topography as a basal surface. Results were validated by comparing volume measurements derived from traditional aerophotogrammetric surveys run by the USGS Cascades Volcano Observatory. Our new "quick and cheap" technique yields estimates of eruptive volume consistently within 5% of the volumes estimated with traditional surveys. The end result of this project is a new technique that provides an inexpensive, rapid assessment tool for tracking lava dome growth or other topographic changes at restless volcanoes.
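Once successive DEMs are georegistered, the volume computation reduces to differencing each DEM against the basal surface and summing over cells. A minimal sketch with toy rasters (illustrative of the calculation, not the authors' GIS workflow):

```python
import numpy as np

def extruded_volume(dem: np.ndarray, base: np.ndarray,
                    cell_size_m: float = 2.0) -> float:
    """Volume (m^3) of material above a basal surface, from two aligned
    raster DEMs, as when differencing a dome DEM against a pre-eruption
    crater floor. Negative differences (no new material) are ignored."""
    dh = np.clip(dem - base, 0.0, None)
    return float(dh.sum() * cell_size_m ** 2)

# Toy 3x3 grids in metres; real rasters would come from the DEM files.
base = np.full((3, 3), 100.0)
dome = base + np.array([[0, 2, 0], [2, 9, 2], [0, 2, 0]])
print(f"volume = {extruded_volume(dome, base):.0f} m^3")  # 17 m x 4 m^2 = 68
```

Tracking this quantity across the twelve photo dates is what yields the eruption-volume and extrusion-rate time series described above.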
Implementing Software Safety in the NASA Environment
NASA Technical Reports Server (NTRS)
Wetherholt, Martha S.; Radley, Charles F.
1994-01-01
Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g., mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies, and sound software and assurance practices, to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of the system to be built. Shortly thereafter, as the system requirements are being defined, the second iteration of hazard analyses takes place, the systems hazard analysis (SHA). During the systems requirements phase, decisions are made as to what functions of the system will be the responsibility of software. This is the most critical time to affect the safety of the software. From this point, software safety analyses as well as software engineering practices are the main focus for assuring safe software. While many of the steps proposed in this paper seem like just sound engineering practices, they are the best technical and most cost effective means to assure safe software within a safe system.
Voice recognition software for clinical use.
Korn, K
1998-11-01
The current generation of voice recognition products truly offers the promise of voice recognition systems that are financially and operationally acceptable for use in a health care facility. Although the initial capital outlay for the purchase of such equipment may be substantial, the long-term benefit is felt to outweigh the expense. The ability to utilize computer equipment for educational purposes and information management alone helps to rationalize the cost. In addition, it is important to remember that the Internet has become a substantial source of information, which provides another functional use for this equipment. Although one can readily see the implication for such a program in clinical practice, other uses for the program should not be overlooked. Uses far beyond the writing of clinic notes and correspondence can be easily envisioned. Utilization of voice recognition software offers clinical practices the ability to produce quality printed records in a timely and cost-effective manner. After learning procedures for the selected product and appropriately formatting word processing software and printers, printed progress notes should be able to be produced in less time than with traditional dictation and transcription methods. Although certain procedures and practices may need to be altered, or may preclude optimal utilization of this type of system, many advantages are apparent. It is recommended that facilities consider utilization of voice recognition products such as Dragon Systems Naturally Speaking software, or at least consider a trial of this method with one of the limited-feature products, if current dictation practices are unsatisfactory or excessively costly. Free downloadable trial software or single-user software can provide a reduced-cost method for trial evaluation of such products if a major commitment is not desired. A list of voice recognition software manufacturer web sites may be accessed through the following: http://www.dragonsys.com/ http://www.software.ibm/com/is/voicetype/ http://www.lhs.com/
Estimation and enhancement of real-time software reliability through mutation analysis
NASA Technical Reports Server (NTRS)
Geist, Robert; Offutt, A. J.; Harris, Frederick C., Jr.
1992-01-01
A simulation-based technique for obtaining numerical estimates of the reliability of N-version, real-time software is presented. An extended stochastic Petri net is employed to represent the synchronization structure of N versions of the software, where dependencies among versions are modeled through correlated sampling of module execution times. Test results utilizing specifications for NASA's planetary lander control software indicate that mutation-based testing could hold greater potential for enhancing reliability than the desirable but perhaps unachievable goal of independence among N versions.
Optimal design of upstream processes in biotransformation technologies.
Dheskali, Endrit; Michailidi, Katerina; de Castro, Aline Machado; Koutinas, Apostolis A; Kookos, Ioannis K
2017-01-01
In this work a mathematical programming model for the optimal design of the bioreaction section of biotechnological processes is presented. Equations for the estimation of equipment cost, derived from a recent publication by the US National Renewable Energy Laboratory (NREL), are also summarized. The cost-optimal design of process units and the optimal scheduling of their operation can be obtained using the proposed formulation, which has been implemented in software available from the journal web page or the corresponding author. The proposed optimization model can be used to quantify the effects of decisions taken at lab scale on industrial-scale process economics. It is of paramount importance to note that this can be achieved at an early stage of the development of a biotechnological project. Two case studies are presented that demonstrate the usefulness and potential of the proposed methodology. Copyright © 2016. Published by Elsevier Ltd.
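Equipment-cost equations of the kind summarized from the NREL publication commonly take a power-law scaling form. As a hedged illustration (the paper's fitted coefficients differ by equipment type, and the base values below are hypothetical):

```python
def scaled_equipment_cost(base_cost: float, base_size: float,
                          new_size: float, exponent: float = 0.6) -> float:
    """Classic power-law (six-tenths rule) equipment cost scaling:
    cost = base_cost * (new_size / base_size) ** exponent.
    Illustrative only; not the paper's NREL-derived coefficients."""
    return base_cost * (new_size / base_size) ** exponent

# A 200 m^3 bioreactor, scaled from a hypothetical 50 m^3 quote:
print(f"${scaled_equipment_cost(400_000, 50, 200):,.0f}")  # ~ $919,000
```

Embedding such cost curves in an optimization model is what lets lab-scale choices (titres, cycle times, unit sizes) be priced out at industrial scale.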
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1986-01-01
Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirements to build large, reliable, and maintainable software systems increase with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process. This is the aim of the SAGA project, which comprises the research necessary to design and build a software development environment that automates the software engineering process. Progress under SAGA is described.
Florence, Curtis; Shepherd, Jonathan; Brennan, Iain; Simon, Thomas R
2014-04-01
To assess the costs and benefits of a partnership between health services, police, and local government shown to reduce violence-related injury, a benefit-cost analysis was conducted. Anonymised information sharing and use led to a reduction in wounding recorded by the police, which reduced the economic and social costs of violence by £6.9 million in 2007 compared with the costs the intervention city, Cardiff, UK, would have experienced in the absence of the programme. This includes a gross cost reduction of £1.25 million to the health service and £1.62 million to the criminal justice system in 2007. By contrast, the costs associated with the programme were modest: setup costs of software modifications and prevention strategies were £107 769, while the annual operating costs of the system were estimated at £210 433 (2003 UK pounds). The cumulative social benefit-cost ratio of the programme from 2003 to 2007 was £82 in benefits for each pound spent on the programme, including a benefit-cost ratio of 14.8 for the health service and 19.1 for the criminal justice system. Each of these benefit-cost ratios remains above 1 across a wide range of sensitivity analyses. An effective information-sharing partnership between health services, police and local government in Cardiff, UK, led to substantial cost savings for the health service and the criminal justice system compared with 14 other cities in England and Wales designated as similar by the UK government where this intervention was not implemented.
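A back-of-envelope check of the programme economics, using only the cost figures quoted above and assuming five years of operation (an assumption, since the abstract does not state the exact operating period):

```python
# Cumulative programme economics, 2003-2007 (costs from the abstract,
# in 2003 UK pounds; five operating years is an assumption).
setup_cost  = 107_769
annual_cost = 210_433
years       = 5
total_cost  = setup_cost + annual_cost * years   # ~ £1.16 million

# The reported cumulative social benefit-cost ratio was 82:1.
implied_benefits = 82 * total_cost
print(f"total programme cost:   £{total_cost:,}")
print(f"implied social benefit: £{implied_benefits:,.0f}")  # ~ £95 million
```

The striking ratio follows from the asymmetry the abstract highlights: violence imposes large social costs, while the information-sharing system itself is cheap to build and run.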
Front-End/Gateway Software: Availability and Usefulness.
ERIC Educational Resources Information Center
Kesselman, Martin
1985-01-01
Reviews features of front-end software packages (interface between user and online system)--database selection, search strategy development, saving and downloading, hardware and software requirements, training and documentation, online systems and database accession, and costs--and discusses gateway services (user searches through intermediary…
The Economics of Educational Software Portability.
ERIC Educational Resources Information Center
Oliveira, Joao Batista Araujo e
1990-01-01
Discusses economic issues that affect the portability of educational software. Topics discussed include economic reasons for portability, including cost effectiveness; the nature and behavior of educational computer software markets; the role of producers, buyers, and consumers; potential effects of government policies; computer piracy; and…
The Evolution of Software Pricing: From Box Licenses to Application Service Provider Models.
ERIC Educational Resources Information Center
Bontis, Nick; Chung, Honsan
2000-01-01
Describes three different pricing models for software. Findings of this case study support the proposition that software pricing is a complex and subjective process. The key determinant of alignment between vendor and user is the nature of value in the software to the buyer. This value proposition may range from increased cost reduction to…
Future Software Sizing Metrics and Estimation Challenges
2011-07-01
…systems; 4. ultrahigh software system assurance; 5. legacy maintenance and Brownfield development; 6. agile and lean/Kanban development. This paper… refined as the design of the maintenance modifications or Brownfield re-engineering is determined. VII.6 Agile and Lean/Kanban Development: The… difficulties of software maintenance estimation can often be mitigated by using lean workflow management techniques such as Kanban [25]. In Kanban…
[Evaluation of Organ Dose Estimation from Indices of CT Dose Using Dose Index Registry].
Iriuchijima, Akiko; Fukushima, Yasuhiro; Ogura, Akio
Direct measurement of each patient's organ dose from computed tomography (CT) is not possible. Most methods for estimating patient organ dose use Monte Carlo simulation with dedicated software. However, dedicated software is too expensive for small-scale hospitals, so not every hospital can estimate organ dose this way. The purpose of this study was to evaluate a simple method of organ dose estimation using common indices of CT dose. The Monte Carlo simulation software Radimetrics (Bayer) was used to calculate organ dose and to analyze the relationship between indices of CT dose and organ dose. Multidetector CT scanners from two manufacturers were compared (LightSpeed VCT, GE Healthcare; SOMATOM Definition Flash, Siemens Healthcare). Using stored patient data from Radimetrics, the relationships between indices of CT dose and organ dose were expressed as formulas for estimating organ dose. The accuracy of the estimation method was compared with the results of Monte Carlo simulation using Bland-Altman plots. In the results, SSDE was a feasible index for estimating organ dose in almost all organs because it reflects each patient's size. The differences in organ dose between estimation and simulation were within 23%. In conclusion, our method of estimating organ dose from indices of CT dose is convenient for clinical use, with acceptable accuracy.
Urban Earthquake Shaking and Loss Assessment
NASA Astrophysics Data System (ADS)
Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.
2009-04-01
This study, conducted under the JRA-3 component of the EU NERIES Project, develops a methodology and software (ELER) for the rapid estimation of earthquake shaking and losses in the Euro-Mediterranean region. This multi-level methodology, developed together with researchers from Imperial College, NORSAR and ETH-Zurich, is capable of incorporating regional variability and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships. GRM Risk Management, Inc. of Istanbul serves as sub-contractor for the coding of the ELER software. The methodology encompasses the following general steps: 1. Finding the most likely location of the source of the earthquake using a regional seismotectonic data base and basic source parameters and, if and when possible, by the estimation of fault rupture parameters from rapid inversion of data from on-line stations. 2. Estimation of the spatial distribution of selected ground motion parameters through region-specific ground motion attenuation relationships and using shear wave velocity distributions (Shake Mapping). 4. Incorporation of strong ground motion and other empirical macroseismic data for the improvement of the Shake Map. 5. Estimation of the losses (damage, casualty and economic) at different levels of sophistication (0, 1 and 2) commensurate with the availability of the inventory of the human-built environment (Loss Mapping). Level 2 analysis of the ELER software (similar to HAZUS and SELENA) is essentially intended for earthquake risk assessment (building damage, consequential human casualties and macro-economic loss quantifiers) in urban areas. The basic Shake Mapping is similar to the Level 0 and Level 1 analysis; however, options are available for more sophisticated treatment of site response through externally entered data and improvement of the shake map through incorporation of accelerometric and other macroseismic data (similar to the USGS ShakeMap system). The building inventory data for the Level 2 analysis consist of grid (geo-cell) based urban building and demographic inventories. For building grouping, the European building typology developed within the EU-FP5 RISK-EU project is used. The building vulnerability/fragility relationships to be used can be selected by the user from a list of applicable relationships developed on the basis of a comprehensive study. Both empirical and analytical relationships (based on the Coefficient Method, Equivalent Linearization Method and the Reduction Factor Method of analysis) can be employed. Casualties in the Level 2 analysis are estimated based on the number of buildings in different damage states and the casualty rates for each building type and damage level. Modifications to the casualty rates can be made if necessary. The ELER Level 2 analysis includes calculation of direct monetary losses as a result of building damage, allowing for repair-cost estimations and specific investigations associated with earthquake insurance applications (PML and AAL estimations). ELER Level 2 loss results obtained for Istanbul for a scenario earthquake using different techniques are presented, with comparisons using different earthquake damage assessment software. The urban earthquake shaking and loss information is intended for dissemination in a timely manner to related agencies for the planning and coordination of the post-earthquake emergency response.
However, the same software can also be used for scenario earthquake loss estimation, related Monte-Carlo-type simulations, and earthquake insurance applications.
Jelicic Kadic, Antonia; Vucic, Katarina; Dosenovic, Svjetlana; Sapunar, Damir; Puljak, Livia
2016-06-01
To compare the speed and accuracy of graphical data extraction using manual estimation and open source software. Data points from eligible graphs/figures published in randomized controlled trials (RCTs) from 2009 to 2014 were extracted by two authors independently, both by manual estimation and with Plot Digitizer, an open source software package. Corresponding authors of each RCT were contacted up to four times via e-mail to obtain the exact numbers that were used to create the graphs. The accuracy of each method was compared against the source data from which the original graphs were produced. Software data extraction was significantly faster, reducing extraction time by 47%. Percent agreement between the two raters was 51% for manual and 53.5% for software data extraction. Percent agreement between the raters and the original data was 66% vs. 75% for the first rater and 69% vs. 73% for the second rater, for manual and software extraction, respectively. Data extraction from figures should be conducted using software, whereas manual estimation should be avoided. Using software to extract data presented only in figures is faster and enables higher interrater reliability. Copyright © 2016 Elsevier Inc. All rights reserved.
Minimizing communication cost among distributed controllers in software defined networks
NASA Astrophysics Data System (ADS)
Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed
2016-08-01
Software Defined Networking (SDN) is a new paradigm that increases the flexibility of today's networks by promising a programmable network. The fundamental idea behind this architecture is to simplify network complexity by decoupling the control plane and the data plane of the network devices, and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure and to increase scalability and flexibility during workload distribution. Although distributed controllers are flexible and scalable enough to accommodate a larger number of network switches, the intercommunication cost between distributed controllers remains a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism that minimizes intercommunication cost with a graph partitioning algorithm, an NP-hard problem. The proposed methodology swaps network elements between controller domains to minimize communication cost by calculating a communication gain. The swapping of elements minimizes inter- and intra-domain communication cost among network domains. We validate our work with the OMNeT++ simulation environment tool. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
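The communication-gain calculation for a candidate swap can be sketched in Kernighan-Lin style: the gain is the external-minus-internal traffic of each node, less twice the traffic between the swapped pair. The following is a generic illustration of that partition-refinement idea, not the paper's exact mechanism:

```python
import numpy as np

def swap_gain(traffic: np.ndarray, domain: np.ndarray, u: int, v: int) -> float:
    """Kernighan-Lin style gain of swapping switches u and v between
    controller domains: the reduction in inter-domain traffic.
    'traffic' is a symmetric switch-to-switch communication matrix."""
    def external_minus_internal(node, own_domain):
        ext = traffic[node][domain != own_domain].sum()
        inter = traffic[node][domain == own_domain].sum() - traffic[node, node]
        return ext - inter
    d_u = external_minus_internal(u, domain[u])
    d_v = external_minus_internal(v, domain[v])
    return d_u + d_v - 2 * traffic[u, v]

traffic = np.array([[0, 8, 1, 1],
                    [8, 0, 1, 1],
                    [1, 1, 0, 7],
                    [1, 1, 7, 0]], dtype=float)
domain = np.array([0, 1, 0, 1])          # a deliberately bad assignment
print(swap_gain(traffic, domain, 1, 2))  # 13.0: swapping 1 and 2 helps
```

In this toy case, swapping switches 1 and 2 reduces the inter-domain traffic from 17 to 4, which matches the computed gain of 13; a partitioner repeats such gain evaluations until no positive-gain swap remains.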
Empirical Estimates of 0Day Vulnerabilities in Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miles A. McQueen; Wayne F. Boyer; Sean M. McBride
2009-01-01
We define a 0Day vulnerability to be any vulnerability, in deployed software, which has been discovered by at least one person but has not yet been publicly announced or patched. These 0Day vulnerabilities are of particular interest when assessing the risk to well managed control systems which have already effectively mitigated the publicly known vulnerabilities. In these well managed systems the risk contribution from 0Days will have proportionally increased. To aid understanding of how great a risk 0Days may pose to control systems, an estimate of how many are in existence is needed. Consequently, using the 0Day definition given above, we developed and applied a method for estimating how many 0Day vulnerabilities are in existence on any given day. The estimate is made by: empirically characterizing the distribution of the lifespans, measured in days, of 0Day vulnerabilities; determining the number of vulnerabilities publicly announced each day; and applying a novel method for estimating the number of 0Day vulnerabilities in existence on any given day using the number of vulnerabilities publicly announced each day and the previously derived distribution of 0Day lifespans. The method was first applied to a general set of software applications by analyzing the 0Day lifespans of 491 software vulnerabilities and using the daily rate of vulnerability announcements in the National Vulnerability Database. This led to a conservative estimate that in the worst year there were, on average, 2500 0Day software related vulnerabilities in existence on any given day. Using a smaller but intriguing set of 15 0Day software vulnerability lifespans representing the actual time from discovery to public disclosure, we then made a more aggressive estimate. In this case, we estimated that in the worst year there were, on average, 4500 0Day software vulnerabilities in existence on any given day. We then proceeded to identify the subset of software applications likely to be used in some control systems, analyzed the associated subset of vulnerabilities, and characterized their lifespans. Using the previously developed method of analysis, we very conservatively estimated 250 control system related 0Day vulnerabilities in existence on any given day. While reasonable, this first order estimate for control systems is probably far more conservative than those made for general software systems since the estimate did not include vulnerabilities unique to control system specific components. These control system specific vulnerabilities were unable to be included in the estimate for a variety of reasons, with the most problematic being that the public announcement of unique control system vulnerabilities is very sparse. Consequently, with the intent to improve the above 0Day estimate for control systems, we first identified the additional, unique to control systems, vulnerability estimation constraints and then investigated new mechanisms which may be useful for estimating the number of unique 0Day software vulnerabilities found in control system components. We proceeded to identify a number of new mechanisms and approaches for estimating and incorporating control system specific vulnerabilities into an improved 0Day estimation method. These new mechanisms and approaches appear promising and will be more rigorously evaluated during the course of the next year.
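In steady state, an estimate of this kind reduces to a Little's-law product: the average number of 0Days in existence equals the announcement rate times the mean 0Day lifespan. The values below are illustrative placeholders chosen to reproduce the order of magnitude of the report's worst-year figure; the report's actual inputs come from the empirical lifespan distribution, not a single mean:

```python
# Little's-law style steady-state estimate (illustrative inputs):
#   average 0Days in existence = (announcement rate) x (mean lifespan)
announcements_per_day = 25    # e.g., daily public disclosures (assumed)
mean_lifespan_days    = 100   # mean time from discovery to disclosure (assumed)

zero_days_in_existence = announcements_per_day * mean_lifespan_days
print(f"~{zero_days_in_existence} 0Day vulnerabilities on any given day")  # 2500
```

Working with the full lifespan distribution, as the report does, refines this by weighting announcements by how long each vulnerability plausibly lived before disclosure.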
Computational Simulations and the Scientific Method
NASA Technical Reports Server (NTRS)
Kleb, Bil; Wood, Bill
2005-01-01
As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.
NASA Astrophysics Data System (ADS)
Sirirojvisuth, Apinut
In complex aerospace system design, making an effective design decision requires multidisciplinary knowledge from both product and process perspectives. Integrating manufacturing considerations into the design process is most valuable during the early design stages since designers have more freedom to integrate new ideas when changes are relatively inexpensive in terms of time and effort. Several metrics related to manufacturability are cost, time, and manufacturing readiness level (MRL). Yet, there is a lack of a structured methodology that quantifies how changes in design decisions impact these metrics. As a result, a new set of integrated cost analysis tools is proposed in this study to quantify the impacts. Equally important is the capability to integrate this new cost tool into existing design methodologies without sacrificing the agility and flexibility required during the early design phases. To demonstrate the applicability of this concept, a ModelCenter environment is used to develop a software architecture that represents the Integrated Product and Process Development (IPPD) methodology used in several aerospace system designs. The environment seamlessly integrates product and process analysis tools and makes an effective transition from one design phase to the other while retaining knowledge gained a priori. Then, an advanced cost estimating tool called the Hybrid Lifecycle Cost Estimating Tool (HLCET), a hybrid combination of weight-, process-, and activity-based estimating techniques, is integrated with the design framework. A new weight-based lifecycle cost model is created based on Tailored Cost Model (TCM) equations [3]. This lifecycle cost tool estimates the program cost based on vehicle component weights and programmatic assumptions. Additional high-fidelity cost tools like process-based and activity-based cost analysis methods can be used to modify the baseline TCM result as more knowledge is accumulated over design iterations. Therefore, with this concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher fidelity tradeoffs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM result for the composite part production cost. The reason for the replacement is that TCM estimates production costs from part weights as a result of subtractive manufacturing of metallic origin, such as casting, forging, and machining processes. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called the Maynard Operation Sequence Technique (MOST) to be used, similarly to the Activity-Based Costing (ABC) approach, to estimate the manufacturing time of a part by breaking down the operations that occur during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to the traditional method of time estimation by analogy or using response surface equations from historical process data. The MOST concept provides a tailored study of an individual process typically required for a new, innovative design. Nevertheless, the MOST idea has some challenges, one of which is its requirement to build a new process from the ground up.
The process development requires a Subject Matter Expert (SME) in the manufacturing method of the particular design. The SME must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside the design discipline and a priori training in MOST. To relieve this constraint, this study includes an entirely new sub-system architecture that comprises 1) a knowledge-based system to provide the required knowledge during process selection; and 2) a new user interface to guide parameter selection when building the process using MOST. Also included in this study is a demonstration of how the HLCET and its constituents can be integrated with Georgia Tech's Integrated Product and Process Development (IPPD) methodology. The applicability of this work is shown through a complex aerospace design example to gain insights into how manufacturing knowledge helps make better design decisions during the early stages. The setup process is explained with an example of its utility demonstrated in a hypothetical fighter aircraft wing redesign. An evaluation of the system's effectiveness against existing methodologies concludes the thesis.
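To make the weight-based starting point concrete, here is a minimal sketch of a power-law cost estimating relationship (CER) of the general form such parametric models use, with a process-based figure later substituted for one component. The coefficients and the substituted value are illustrative placeholders, not the actual TCM or ACCEM equations.

    def component_cost(weight_lb, a=2500.0, b=0.9, complexity=1.0):
        # Weight-based CER: cost grows as a power of component weight,
        # scaled by a process complexity factor (all values illustrative).
        return complexity * a * weight_lb ** b

    wing = {"skin": 850.0, "spars": 420.0, "ribs": 310.0}   # weights, lb
    baseline = sum(component_cost(w) for w in wing.values())

    # As process knowledge accumulates, a process-based estimate (e.g., from
    # a composites model in the spirit of ACCEM) replaces the weight-based
    # figure for the affected part -- here a hypothetical $1.9M for the skin.
    refined = baseline - component_cost(wing["skin"]) + 1_900_000
    print(f"baseline ${baseline:,.0f} -> refined ${refined:,.0f}")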
CAI System Costs: Present and Future.
ERIC Educational Resources Information Center
Pressman, Israel; Rosenbloom, Bruce
1984-01-01
Discusses costs related to providing computer assisted instruction (CAI), considering hardware, software, user training, maintenance, and installation. Provides an example of the total cost of CAI broken down into these categories, giving an adjusted yearly cost. Projects future trends and costs of CAI as well as cost savings possibilities. (JM)
Elementary Keyboarding Software Product Reports.
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This report provides detailed product descriptions of 45 software programs designed to teach or improve the keyboarding skills of elementary school students that were identified by the MicroSIFT (Microcomputer Information and Software for Teachers) staff. The descriptions include program titles, producer names, costs, grade levels, hardware,…
45 CFR 307.5 - Mandatory computerized support enforcement systems.
Code of Federal Regulations, 2013 CFR
2013-10-01
... hardware, operational system software, and electronic linkages with the separate components of an... plans to use and how they will interface with the base system; (3) Provide documentation that the... and for operating costs including hardware, operational software and applications software of a...
45 CFR 307.5 - Mandatory computerized support enforcement systems.
Code of Federal Regulations, 2014 CFR
2014-10-01
... hardware, operational system software, and electronic linkages with the separate components of an... plans to use and how they will interface with the base system; (3) Provide documentation that the... and for operating costs including hardware, operational software and applications software of a...
45 CFR 307.5 - Mandatory computerized support enforcement systems.
Code of Federal Regulations, 2012 CFR
2012-10-01
... hardware, operational system software, and electronic linkages with the separate components of an... plans to use and how they will interface with the base system; (3) Provide documentation that the... and for operating costs including hardware, operational software and applications software of a...
10 CFR 603.550 - Acceptability of intellectual property.
Code of Federal Regulations, 2011 CFR
2011-01-01
... property (e.g., copyrighted material, including software) as cost sharing because: (1) It is difficult to... the contribution. For example, a for-profit firm may offer the use of commercially available software... the software would not be a reasonable basis for valuing its use. ...
10 CFR 603.550 - Acceptability of intellectual property.
Code of Federal Regulations, 2012 CFR
2012-01-01
... property (e.g., copyrighted material, including software) as cost sharing because: (1) It is difficult to... the contribution. For example, a for-profit firm may offer the use of commercially available software... the software would not be a reasonable basis for valuing its use. ...
Software engineering with application-specific languages
NASA Technical Reports Server (NTRS)
Campbell, David J.; Barker, Linda; Mitchell, Deborah; Pollack, Robert H.
1993-01-01
Application-Specific Languages (ASL's) are small, special-purpose languages that are targeted to solve a specific class of problems. Using ASL's on software development projects can provide considerable cost savings, reduce risk, and enhance quality and reliability. ASL's provide a platform for reuse within a project or across many projects and enable less-experienced programmers to tap into the expertise of application-area experts. ASL's have been used on several software development projects for the Space Shuttle Program. On these projects, the use of ASL's resulted in considerable cost savings over conventional development techniques. Two of these projects are described.
ERIC Educational Resources Information Center
Pressman, Israel; Rosenbloom, Bruce
1984-01-01
Describes and evaluates costs of hardware, software, training, and maintenance for computer assisted instruction (CAI) as they relate to total system cost. An example of an educational system provides an illustration of CAI cost analysis. Future developments, cost effectiveness, affordability, and applications in public and private environments…
NASA Technical Reports Server (NTRS)
Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.
1992-01-01
A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state of the art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a data base and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for the development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
The Large Synoptic Survey Telescope project management control system
NASA Astrophysics Data System (ADS)
Kantor, Jeffrey P.
2012-09-01
The Large Synoptic Survey Telescope (LSST) program is jointly funded by the NSF, the DOE, and private institutions and donors. From an NSF funding standpoint, the LSST is a Major Research Equipment and Facilities Construction (MREFC) project. The NSF funding process requires proposals and D&D reviews to include activity-based budgets and schedules; documented bases of estimates; risk-based contingency analysis; and cost escalation and categorization. Out of the box, the commercial tool Primavera P6 contains approximately 90% of the planning and estimating capability needed to satisfy R&D phase requirements, and it is customizable/configurable for the remainder with relatively little effort. We describe the customization/configuration and use of Primavera for the LSST Project Management Control System (PMCS), assess our experience to date, and describe future directions. Examples in this paper are drawn from the LSST Data Management System (DMS), which is one of three main subsystems of the LSST and is funded by the NSF. By astronomy standards the LSST DMS is a large data management project, processing and archiving over 70 petabytes of image data, producing over 20 petabytes of catalogs annually, and generating 2 million transient alerts per night. Over the 6-year construction and commissioning phase, the DM project is estimated to require 600,000 hours of engineering effort. In total, the DMS cost is approximately 60% hardware/system software and 40% labor.
NASA Astrophysics Data System (ADS)
Kumar, Ravi; Singh, Surya Prakash
2017-11-01
The dynamic cellular facility layout problem (DCFLP) is a well-known NP-hard problem. It has been estimated that efficient design of the DCFLP reduces the manufacturing cost of products by maintaining the minimum material flow among all machines in all cells, as material flow contributes around 10-30% of the total product cost. However, being NP-hard, the DCFLP is very difficult to solve optimally in reasonable time. Therefore, this article proposes a novel similarity-score-based two-phase heuristic approach to solve the DCFLP optimally, considering multiple products manufactured over multiple time periods in the layout. In the first phase of the proposed heuristic, machine-cell clusters are created based on similarity scores between machines. This is provided as an input to the second phase, which minimizes inter/intracell material handling costs and rearrangement costs over the entire planning period. The solution methodology of the proposed approach is demonstrated. To show the efficiency of the two-phase heuristic approach, 21 instances are generated and solved using the optimization software package LINGO. The results show that the proposed approach can solve the DCFLP optimally in reasonable time.
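The first phase can be pictured as follows: score each pair of machines by how many products they both process, then group machines whose mutual similarity is high. A minimal sketch assuming a Jaccard-style score and a greedy grouping rule (the paper's exact scoring and clustering rules may differ):

    def similarity(parts_a, parts_b):
        # Jaccard similarity between the product sets of two machines.
        common = len(parts_a & parts_b)
        return common / len(parts_a | parts_b) if common else 0.0

    def cluster_machines(machine_parts, threshold=0.5):
        # Greedily place each machine in the first cell whose members are
        # all sufficiently similar to it; otherwise open a new cell.
        cells = []
        for m, parts in machine_parts.items():
            for cell in cells:
                if all(similarity(parts, machine_parts[n]) >= threshold
                       for n in cell):
                    cell.append(m)
                    break
            else:
                cells.append([m])
        return cells

    machine_parts = {
        "M1": {"P1", "P2"}, "M2": {"P1", "P2", "P3"},
        "M3": {"P4"},       "M4": {"P4", "P5"},
    }
    print(cluster_machines(machine_parts))   # [['M1', 'M2'], ['M3', 'M4']]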
ERIC Educational Resources Information Center
Computer Symbolic, Inc., Washington, DC.
A pseudo assembly language, PAL, was developed and specified for use as the lowest level in a general, multilevel programing system for the realization of cost-effective, hardware-independent Naval software. The language was developed as part of the system called FIRMS (Fast Iterative Recursive Macro System) and is sufficiently general to allow…
Streamlining Software Aspects of Certification: Technical Team Report on the First Industry Workshop
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Holloway, C. Michael; Knight, John C.; Leveson, Nancy G.; Yang, Jeffrey C.; Dorsey, Cheryl A.; McCormick, G. Frank
1998-01-01
To address concerns about the time and expense associated with software aspects of certification, the Federal Aviation Administration (FAA) began the Streamlining Software Aspects of Certification (SSAC) program. As part of this program, a Technical Team was established to determine whether the cost and time associated with certifying aircraft can be reduced while maintaining or improving safety, with the intent of impacting the FAA's Flight 2000 program. The Technical Team conducted a workshop to gain a better understanding of the major concerns in industry about software cost and schedule. Over 120 people attended the workshop, including representatives from the FAA, commercial transport and general aviation aircraft manufacturers and suppliers, and procurers and developers of non-airborne systems; more than 200 issues about software aspects of certification were recorded. This paper provides an overview of the SSAC program, motivation for the workshop, details of the workshop activities and outcomes, and recommendations for follow-on work.
Software Certification for Temporal Properties With Affordable Tool Qualification
NASA Technical Reports Server (NTRS)
Xia, Songtao; DiVito, Benedetto L.
2005-01-01
It has been recognized that a framework based on proof-carrying code (also called semantic-based software certification in its community) could be used as a candidate software certification process for the avionics industry. To meet this goal, tools in the "trust base" of a proof-carrying code system must be qualified by regulatory authorities. A family of semantic-based software certification approaches is described, each different in expressive power, level of automation and trust base. Of particular interest is the so-called abstraction-carrying code, which can certify temporal properties. When a pure abstraction-carrying code method is used in the context of industrial software certification, the fact that the trust base includes a model checker would incur a high qualification cost. This position paper proposes a hybrid of abstraction-based and proof-based certification methods so that the model checker used by a client can be significantly simplified, thereby leading to lower cost in tool qualification.
Fully decentralized estimation and control for a modular wheeled mobile robot
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mutambara, A.G.O.; Durrant-Whyte, H.F.
2000-06-01
In this paper, the problem of fully decentralized data fusion and control for a modular wheeled mobile robot (WMR) is addressed. This is a vehicle system with nonlinear kinematics, distributed multiple sensors, and nonlinear sensor models. The problem is solved by applying fully decentralized estimation and control algorithms based on the extended information filter. This is achieved by deriving a modular, decentralized kinematic model by using plane motion kinematics to obtain the forward and inverse kinematics for a generalized simple wheeled vehicle. This model is then used in the decentralized estimation and control algorithms. WMR estimation and control are thus obtained locally using reduced order models with reduced communication. When communication of information between nodes is carried out after every measurement (full rate communication), the estimates and control signals obtained at each node are equivalent to those obtained by a corresponding centralized system. Transputer architecture is used as the basis for hardware and software design as it supports the extensive communication and concurrency requirements that characterize modular and decentralized systems. The advantages of a modular WMR vehicle include scalability, application flexibility, low prototyping costs, and high reliability.
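The reason the information filter decentralizes so cleanly is that, in information (inverse-covariance) form, each node's measurement enters the update as an additive term, so nodes only exchange information vectors and matrices. A minimal linear sketch of the fusion step (an extended information filter would first linearize the nonlinear sensor models; all numbers are illustrative):

    import numpy as np

    def node_contribution(H, R, z):
        # Information-form contribution of one local measurement z = Hx + v.
        Rinv = np.linalg.inv(R)
        return H.T @ Rinv @ z, H.T @ Rinv @ H        # (i_k, I_k)

    # Prior in information form: y = Y @ x_hat
    Y = np.eye(2) * 0.5                              # information matrix
    y = Y @ np.array([1.0, 0.0])                     # information vector

    # Two nodes, each observing one state with unit-variance noise
    contributions = [
        node_contribution(np.array([[1.0, 0.0]]), np.eye(1), np.array([1.2])),
        node_contribution(np.array([[0.0, 1.0]]), np.eye(1), np.array([-0.1])),
    ]
    for i_k, I_k in contributions:                   # fusion is just addition
        y, Y = y + i_k, Y + I_k

    print(np.linalg.solve(Y, y))                     # fused state estimate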
Active Self-Testing Noise Measurement Sensors for Large-Scale Environmental Sensor Networks
Domínguez, Federico; Cuong, Nguyen The; Reinoso, Felipe; Touhafi, Abdellah; Steenhaut, Kris
2013-01-01
Large-scale noise pollution sensor networks consist of hundreds of spatially distributed microphones that measure environmental noise. These networks provide historical and real-time environmental data to citizens and decision makers and are therefore a key technology to steer environmental policy. However, the high cost of certified environmental microphone sensors renders large-scale environmental networks prohibitively expensive. Several environmental network projects have started using off-the-shelf low-cost microphone sensors to reduce their costs, but these sensors have higher failure rates and produce lower quality data. To offset this disadvantage, we developed a low-cost noise sensor that actively checks its condition and, indirectly, the integrity of the data it produces. The main design concept is to embed a 13 mm speaker in the noise sensor casing and, by regularly scheduling a frequency sweep, estimate the evolution of the microphone's frequency response over time. This paper presents our noise sensor's hardware and software design together with the results of a test deployment in a large-scale environmental network in Belgium. Our mid-priced sensor (around €50) effectively detected all of the malfunctions experienced in laboratory tests and outdoor deployments, with a few false positives. Future improvements could further lower the cost of our sensor below €10. PMID:24351634
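The self-test amounts to comparing the microphone's coarse magnitude response to the scheduled speaker sweep against a known-good reference and flagging excessive drift. A minimal sketch, assuming the recorded sweep samples are already in hand; the sample rate, band count, and 6 dB threshold are illustrative, not the paper's parameters:

    import numpy as np

    FS = 16_000                                  # sample rate, Hz (assumed)

    def sweep(f0=100.0, f1=8_000.0, dur=2.0):
        # Linear frequency sweep used as the test stimulus.
        t = np.arange(int(FS * dur)) / FS
        return np.sin(2 * np.pi * (f0 + (f1 - f0) * t / (2 * dur)) * t)

    def band_response(x, n_bands=16):
        # Coarse magnitude response: FFT magnitude pooled into bands.
        mag = np.abs(np.fft.rfft(x))
        return np.array([b.mean() for b in np.array_split(mag, n_bands)])

    stimulus = sweep()
    reference = band_response(stimulus)          # response when healthy
    recorded = 0.2 * stimulus                    # simulate a failing capsule
    drift_db = 20 * np.log10(band_response(recorded) / reference)
    if np.any(np.abs(drift_db) > 6):             # 6 dB drift threshold
        print("microphone self-test FAILED", drift_db.round(1))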
Executive Guide to Software Maintenance. Reports on Computer Science and Technology.
ERIC Educational Resources Information Center
Osborne, Wilma M.
This guide is designed for federal executives and managers who have a responsibility for the planning and management of software projects and for federal staff members who are affected by, or involved in, making software changes, and who need to be aware of steps that can reduce both the difficulty and cost of software maintenance. Organized in a…
RAD-ADAPT: Software for modelling clonogenic assay data in radiation biology.
Zhang, Yaping; Hu, Kaiqiang; Beumer, Jan H; Bakkenist, Christopher J; D'Argenio, David Z
2017-04-01
We present a comprehensive software program, RAD-ADAPT, for the quantitative analysis of clonogenic assays in radiation biology. Two commonly used models for clonogenic assay analysis, the linear-quadratic model and the single-hit multi-target model, are included in the software. RAD-ADAPT uses the maximum likelihood estimation method to obtain parameter estimates, with the assumption that cell colony count data follow a Poisson distribution. The program has an intuitive interface, generates model prediction plots, tabulates model parameter estimates, and allows automatic statistical comparison of parameters between different groups. The RAD-ADAPT interface is written using the statistical software R, and the underlying computations are accomplished by the ADAPT software system for pharmacokinetic/pharmacodynamic systems analysis. The use of RAD-ADAPT is demonstrated using an example that examines the impact of pharmacologic ATM and ATR kinase inhibition on the human lung cancer cell line A549 after ionizing radiation.
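Concretely, under the linear-quadratic model the surviving fraction after dose D is SF(D) = exp(-(alpha*D + beta*D^2)), and the colony count from a dish seeded with n cells is modelled as Poisson with mean n * PE * SF(D), where PE is the plating efficiency. A minimal sketch of that maximum likelihood fit, with made-up counts (our reconstruction for illustration, not the RAD-ADAPT code, which is built on R and ADAPT):

    import numpy as np
    from scipy.optimize import minimize

    dose   = np.array([0, 0, 2, 2, 4, 4, 6, 6], float)          # Gy
    plated = np.array([100, 100, 200, 200, 500, 500, 2000, 2000], float)
    counts = np.array([62, 58, 75, 70, 80, 74, 60, 66], float)  # colonies

    def neg_log_lik(params):
        pe, alpha, beta = params
        mu = plated * pe * np.exp(-(alpha * dose + beta * dose ** 2))
        return np.sum(mu - counts * np.log(mu))   # Poisson NLL (+ constant)

    fit = minimize(neg_log_lik, x0=[0.6, 0.3, 0.03],
                   bounds=[(1e-4, 1.0), (0.0, 3.0), (0.0, 1.0)])
    pe, alpha, beta = fit.x
    print(f"PE={pe:.2f}  alpha={alpha:.3f}/Gy  beta={beta:.4f}/Gy^2")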
Software for Photometric and Astrometric Reduction of Video Meteors
NASA Astrophysics Data System (ADS)
Atreya, Prakash; Christou, Apostolos
2007-12-01
SPARVM is a Software for Photometric and Astrometric Reduction of Video Meteors being developed at Armagh Observatory. It is written in Interactive Data Language (IDL) and is designed to run primarily under the Linux platform. The basic features of the software will be derivation of light curves and estimation of angular velocity and radiant position for single-station data; for double-station data, calculation of the 3D coordinates, velocity, and brightness of meteors, and estimation of the meteoroid's orbit including uncertainties. Currently, the software supports extraction of time and date from video frames, estimation of the position of cameras (Azimuth, Altitude), finding stellar sources in video frames, and transformation of coordinates from video frames to the Horizontal coordinate system (Azimuth, Altitude) and the Equatorial coordinate system (RA, Dec).
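The horizontal-to-equatorial transformation is standard spherical astronomy: with site latitude phi and local sidereal time LST, sin(Dec) = sin(Alt)sin(phi) + cos(Alt)cos(phi)cos(Az), and the hour angle H (with RA = LST - H) follows from an atan2 of the remaining terms. A minimal sketch using the textbook formulas, not the SPARVM source:

    import math

    def altaz_to_radec(alt_deg, az_deg, lat_deg, lst_deg):
        # (Alt, Az measured from north through east) -> (RA, Dec), degrees.
        # lst_deg is the local sidereal time expressed in degrees.
        alt, az, lat = map(math.radians, (alt_deg, az_deg, lat_deg))
        dec = math.asin(math.sin(alt) * math.sin(lat)
                        + math.cos(alt) * math.cos(lat) * math.cos(az))
        ha = math.atan2(-math.sin(az) * math.cos(alt) * math.cos(lat),
                        math.sin(alt) - math.sin(dec) * math.sin(lat))
        return (lst_deg - math.degrees(ha)) % 360.0, math.degrees(dec)

    # A meteor seen due south at 45 deg altitude from Armagh (lat 54.35 N)
    # sits on the meridian: HA = 0, so RA = LST and Dec = lat - (90 - Alt).
    print(altaz_to_radec(45.0, 180.0, 54.35, 100.0))   # (100.0, ~9.35)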
A model for the cost of doing a cost estimate
NASA Technical Reports Server (NTRS)
Remer, D. S.; Buchanan, H. R.
1992-01-01
A model for estimating the cost required to do a cost estimate for Deep Space Network (DSN) projects that range from $0.1 to $100 million is presented. The cost of the cost estimate in thousands of dollars, C_E, is found to be approximately given by C_E = K (C_p)^0.35, where C_p is the cost of the project being estimated in millions of dollars and K is a constant depending on the accuracy of the estimate. For an order-of-magnitude estimate, K = 24; for a budget estimate, K = 60; and for a definitive estimate, K = 115. That is, for a specific project, the cost of doing a budget estimate is about 2.5 times as much as that for an order-of-magnitude estimate, and a definitive estimate costs about twice as much as a budget estimate. Use of this model should help provide the level of resources required for doing cost estimates and, as a result, provide insights towards more accurate estimates with less potential for cost overruns.
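The model is a one-liner in code; the K values come straight from the abstract. For a $10M project it yields roughly $54K, $134K, and $258K for the three estimate classes, reproducing the stated 2.5x and roughly 2x ratios. A minimal sketch:

    K_BY_CLASS = {"order-of-magnitude": 24, "budget": 60, "definitive": 115}

    def cost_of_estimate(project_cost_m_usd, estimate_class="budget"):
        # C_E (thousands of dollars) = K * C_p^0.35, with C_p in $M.
        return K_BY_CLASS[estimate_class] * project_cost_m_usd ** 0.35

    for cls in K_BY_CLASS:
        print(f"{cls:>18}: ${cost_of_estimate(10.0, cls):,.0f}K")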
Energy performance evaluation of AAC
NASA Astrophysics Data System (ADS)
Aybek, Hulya
The U.S. building industry constitutes the largest consumer of energy (i.e., electricity, natural gas, petroleum) in the world. The building sector uses almost 41 percent of the primary energy and approximately 72 percent of the available electricity in the United States. As global energy-generating resources are being depleted at exponential rates, the amount of energy consumed and wasted cannot be ignored. Professionals concerned about the environment have placed a high priority on finding solutions that reduce energy consumption while maintaining occupant comfort. Sustainable design and the judicious combination of building materials comprise one solution to this problem. A future including sustainable energy may result from using energy simulation software to accurately estimate energy consumption and from applying building materials that achieve the potential results derived through simulation analysis. Energy-modeling tools assist professionals with making informed decisions about energy performance during the early planning phases of a design project, such as determining the most advantageous combination of building materials, choosing mechanical systems, and determining building orientation on the site. By implementing energy simulation software to estimate the effect of these factors on the energy consumption of a building, designers can make adjustments to their designs during the design phase when the effect on cost is minimal. The primary objective of this research consisted of identifying a method with which to properly select energy-efficient building materials and involved evaluating the potential of these materials to earn LEED credits when properly applied to a structure. In addition, this objective included establishing a framework that provides suggestions for improvements to currently available simulation software that enhance the viability of the estimates concerning energy efficiency and the achievements of LEED credits. The primary objective was accomplished by conducting several simulation models to determine the relative energy efficiency of wood-framed, metal-framed, and Aerated Autoclaved Concrete (AAC) wall structures for both commercial and residential buildings.
Software Engineering Improvement Activities/Plan
NASA Technical Reports Server (NTRS)
2003-01-01
bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing a literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.
Software engineering technology transfer: Understanding the process
NASA Technical Reports Server (NTRS)
Zelkowitz, Marvin V.
1993-01-01
Technology transfer is of crucial concern to both government and industry today. In this report, the mechanisms developed by NASA to transfer technology are explored and the actual mechanisms used to transfer software development technologies are investigated. The time, cost, and effectiveness of software engineering technology transfer are reported.
Agent-based Modeling with MATSim for Hazards Evacuation Planning
NASA Astrophysics Data System (ADS)
Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.
2015-12-01
Hazard evacuation planning requires robust modeling tools and techniques, such as least-cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g., tsunami) arrival. Least-cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate the movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are constrained by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.
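The static least-cost view is essentially a multi-source shortest-path computation over a travel-cost surface: run Dijkstra's algorithm outward from all safe cells and read off an evacuation-time map. A minimal sketch on a walking-speed grid (illustrative grid and speeds; MATSim itself operates on network graphs with congestion-aware agents):

    import heapq

    def evacuation_times(speed, safe_cells, cell_m=10.0):
        # Earliest time (s) to reach any safe cell, from every grid cell.
        # speed[r][c] is walking speed in m/s; 0 marks impassable cells.
        rows, cols = len(speed), len(speed[0])
        best = {cell: 0.0 for cell in safe_cells}
        heap = [(0.0, cell) for cell in safe_cells]
        while heap:
            t, (r, c) = heapq.heappop(heap)
            if t > best.get((r, c), float("inf")):
                continue
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if 0 <= nr < rows and 0 <= nc < cols and speed[nr][nc] > 0:
                    nt = t + cell_m / speed[nr][nc]
                    if nt < best.get((nr, nc), float("inf")):
                        best[(nr, nc)] = nt
                        heapq.heappush(heap, (nt, (nr, nc)))
        return best

    speed = [[1.4, 1.4, 0.0],
             [1.4, 0.9, 1.4],
             [1.4, 1.4, 1.4]]                  # m/s; 0.0 marks water
    print(evacuation_times(speed, safe_cells=[(0, 0)]))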
ERIC Educational Resources Information Center
Classroom Computer Learning, 1988
1988-01-01
Provides reviews of three software packages including "MusicShapes,""For Comment," and "Colortrope," which were developed for music, writing, and science, respectively. Includes information on grade levels, publishers, hardware needed, and cost. (TW)
Grasping objects autonomously in simulated KC-135 zero-g
NASA Technical Reports Server (NTRS)
Norsworthy, Robert S.
1994-01-01
The KC-135 aircraft was chosen for simulated zero gravity testing of the Extravehicular Activity Helper/Retriever (EVAHR). A software simulation of the EVAHR hardware, KC-135 flight dynamics, collision detection, and grasp impact dynamics has been developed to integrate and test the EVAHR software prior to flight testing on the KC-135. The EVAHR software will perform target pose estimation, tracking, and motion estimation for rigid, freely rotating, polyhedral objects. Manipulator grasp planning and trajectory control software has also been developed to grasp targets while avoiding collisions.
10 CFR 436.15 - Formatting cost data.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Procedures for Life Cycle Cost Analyses § 436.15 Formatting cost data. In establishing cost data under §§ 436.16 and 436.17 and measuring cost effectiveness by the modes of analysis described by § 436.19 through... software referenced in the Life Cycle Cost Manual for the Federal Energy Management Program. ...
Preliminary design of flight hardware for two-phase fluid research
NASA Technical Reports Server (NTRS)
Hustvedt, D. C.; Oonk, R. L.
1982-01-01
This study defined the preliminary designs of flight hardware for the Space Shuttle Orbiter for three two-phase fluid research experiments: (1) liquid reorientation - to study the motion of liquid in tanks subjected to small accelerations; (2) pool boiling - to study low-gravity boiling from horizontal cylinders; and (3) flow boiling - to study low-gravity forced flow boiling heat transfer and flow phenomena in a heated horizontal tube. The study consisted of eight major tasks: reassessment of the existing experiment designs, assessment of the Spacelab facility approach, assessment of the individual carry-on approach, selection of the preferred approach, preliminary design of flight hardware, safety analysis, preparation of a development plan, and estimation of detailed design, fabrication, and ground testing costs. The most cost-effective design approach for the experiments is individual carry-ons in the Orbiter middeck. The experiments were designed to fit into one or two middeck lockers. Development schedules for the detailed design, fabrication, and ground testing ranged from 15 1/2 to 18 months. Minimum costs (in 1981 dollars) ranged from $463K for the liquid reorientation experiment to $998K for the pool boiling experiment.
Smith, R P; Dias, J J; Ullah, A; Bhowal, B
2009-05-01
Corrective surgery for Dupuytren's disease represents a significant proportion of a hand surgeon's workload. The decision to go ahead with surgery and the success of surgery require measuring the degree of contracture of the diseased finger(s). This is performed in clinic with a goniometer, pre- and postoperatively. Monitoring the recurrence of the contracture can inform surgical outcome, research, and audit. We compared visual and computer software-aided estimation of Dupuytren's contractures to clinical goniometric measurements in 60 patients with Dupuytren's disease. Patients' hands were digitally photographed. There were 76 contracted finger joints--70 proximal interphalangeal joints and six distal interphalangeal joints. The degrees of contracture in these images were visually assessed by six orthopaedic staff of differing seniority and then re-assessed with computer software. Across assessors, the Pearson correlation between the goniometric measurements and the visual estimations was 0.83, and this significantly improved to 0.88 with computer software. Reliability measured with intra-class correlations achieved 0.78 and 0.92 for the visual and computer-aided estimations, respectively, and with test-retest analysis, 0.92 for visual estimation and 0.95 for computer-aided measurements. Visual estimations of Dupuytren's contractures correlate well with actual clinical goniometric measurements and improve further if measured with computer software. Digital images permit monitoring of contracture after surgery and may facilitate research into disease progression and auditing of surgical technique.
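The software-aided measurement reduces to computing the angle subtended at the joint by landmarks placed on the digital photograph, with the contracture reported as the deviation from a straight (180 degree) finger. A minimal sketch of that geometry, our illustration only, since the paper does not describe its software internals:

    import math

    def joint_angle(p_prox, p_joint, p_dist):
        # Angle at the joint (degrees) from three landmark pixels.
        v1 = (p_prox[0] - p_joint[0], p_prox[1] - p_joint[1])
        v2 = (p_dist[0] - p_joint[0], p_dist[1] - p_joint[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

    # Contracture = deviation from a straight finger (180 degrees).
    print(180 - joint_angle((0, 0), (40, 0), (75, 20)))   # ~29.7 deg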
DOT National Transportation Integrated Search
2004-05-01
This manual provides basic instruction for using RealCost, software that was developed by the Federal Highway Administration (FHWA) to support the application of life-cycle cost analysis (LCCA) in the pavement project-level decisionmaking process. Th...
The design and implementation of postprocessing for depth map on real-time extraction system.
Tang, Zhiwei; Li, Bin; Li, Huosheng; Xu, Zheng
2014-01-01
Depth estimation is a key technology for stereo-vision communication. A real-time depth map can be obtained with hardware, but hardware cannot implement algorithms as complicated as software can because of restrictions in the hardware structure. Inevitably, then, some incorrect stereo matches will occur when depth estimation is performed in hardware such as an FPGA. In order to solve this problem, a postprocessing function is designed in this paper. After a uniqueness test on the matching cost, both left-right and right-left consistency checks are implemented; the cavities in the depth maps can then be filled with correct depth values on the basis of the right-left consistency check. The experimental results show that depth map extraction and the postprocessing function can be implemented in real time in the same system, and moreover that the quality of the resulting depth maps is satisfactory.
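The two checks are straightforward in software form: a disparity d assigned at column x in the left map must agree with the right map at column x - d, disagreements become holes, and each hole is then filled from the nearest valid disparity in its row. A minimal sketch of this postprocessing, our reconstruction for illustration, since the paper implements it in FPGA logic:

    import numpy as np

    def lr_check(disp_left, disp_right, tol=1):
        # Left-right consistency: disparity d at left column x must agree
        # with the right map at column x - d, else it becomes a hole (NaN).
        out = disp_left.astype(float)
        h, w = out.shape
        for y in range(h):
            for x in range(w):
                d = int(disp_left[y, x])
                if x - d < 0 or abs(int(disp_right[y, x - d]) - d) > tol:
                    out[y, x] = np.nan
        return out

    def fill_holes(disp):
        # Fill each cavity with the nearest valid disparity in its row,
        # preferring the left neighbour (usually the background surface).
        out = disp.copy()
        for row in out:
            valid = np.where(~np.isnan(row))[0]
            if valid.size == 0:
                continue
            for x in np.where(np.isnan(row))[0]:
                left = valid[valid < x]
                row[x] = row[left[-1]] if left.size else row[valid[0]]
        return out

    disp_l = np.array([[2, 2, 9, 2, 2, 2]])       # 9 is a spurious match
    disp_r = np.array([[2, 2, 2, 2, 2, 2]])
    print(fill_holes(lr_check(disp_l, disp_r)))   # [[2. 2. 2. 2. 2. 2.]]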
An ergonomic approach to design hand tool for agricultural production.
Khidiya, Mahendra Singh; Bhardwaj, Awadhesh
2012-01-01
Hand tool mechanisms designed to reduce risk factors have rarely been studied. In this paper a trowel is analyzed, first by designing it in CATIA and then by carrying out finite element analysis in ABAQUS. The main emphasis is on finding stresses using this software, then reducing them by suitable mechanical working of the tool and by ergonomic changes in the design of the handle to make it more comfortable. The body part discomfort scores and overall discomfort ratings experienced by the subjects were also estimated. During muscular activity, workers' physiological responses (i.e., energy expenditure rate, oxygen consumption rate, and heart rate) increase. This increase in physiological responses is related to the type, intensity, and duration of work and thus sets limits on the performance of heavy work. In this paper, oxygen consumption rate and heart rate were used for physiological cost estimation. These parameters were measured with the Computerized Ambulatory Metabolic Measurement System K4b2.
NASA Technical Reports Server (NTRS)
2015-01-01
Topics covered include: 3D Endoscope to Boost Safety, Cut Cost of Surgery; Audio App Brings a Better Night's Sleep Liquid Cooling Technology Increases Exercise Efficiency; Algae-Derived Dietary Ingredients Nourish Animals; Space Grant Research Launches Rehabilitation Chair; Vision Trainer Teaches Focusing Techniques at Home; Aircraft Geared Architecture Reduces Fuel Cost and Noise; Ubiquitous Supercritical Wing Design Cuts Billions in Fuel Costs; Flight Controller Software Protects Lightweight Flexible Aircraft; Cabin Pressure Monitors Notify Pilots to Save Lives; Ionospheric Mapping Software Ensures Accuracy of Pilots' GPS; Water Mapping Technology Rebuilds Lives in Arid Regions; Shock Absorbers Save Structures and Lives during Earthquakes; Software Facilitates Sharing of Water Quality Data Worldwide; Underwater Adhesives Retrofit Pipelines with Advanced Sensors; Laser Imaging Video Camera Sees through Fire, Fog, Smoke; 3D Lasers Increase Efficiency, Safety of Moving Machines; Air Revitalization System Enables Excursions to the Stratosphere; Magnetic Fluids Deliver Better Speaker Sound Quality; Bioreactor Yields Extracts for Skin Cream; Private Astronaut Training Prepares Commercial Crews of Tomorrow; Activity Monitors Help Users Get Optimum Sun Exposure; LEDs Illuminate Bulbs for Better Sleep, Wake Cycles; Charged Particles Kill Pathogens and Round Up Dust; Balance Devices Train Golfers for a Consistent Swing; Landsat Imagery Enables Global Studies of Surface Trends; Ruggedized Spectrometers Are Built for Tough Jobs; Gas Conversion Systems Reclaim Fuel for Industry; Remote Sensing Technologies Mitigate Drought; Satellite Data Inform Forecasts of Crop Growth; Probes Measure Gases for Environmental Research; Cloud Computing Technologies Facilitate Earth Research; Software Cuts Homebuilding Costs, Increases Energy Efficiency; Portable Planetariums Teach Science; Schedule Analysis Software Saves Time for Project Planners; Sound Modeling Simplifies Vehicle Noise Management; Custom 3D Printers Revolutionize Space Supply Chain; Improved Calibration Shows Images' True Colors; Micromachined Parts Advance Medicine, Astrophysics, and More; Metalworking Techniques Unlock a Unique Alloy; Low-Cost Sensors Deliver Nanometer-Accurate Measurements; Electrical Monitoring Devices Save on Time and Cost; Dry Lubricant Smooths the Way for Space Travel, Industry; and Compact Vapor Chamber Cools Critical Components.
Scientific Software: How to Find What You Need and Get What You Pay for.
ERIC Educational Resources Information Center
Gabaldon, Diana J.
1984-01-01
Provides examples of software for the sciences, including: packages for pathology/toxicology laboratories (costing over $15,000), DNA sequencing, and data acquisition/analysis; general-purpose software for scientific uses; and "custom" packages, including a program to maintain a listing of "Escherichia coli" strains and a…
NASA Tech Briefs, November/December 1986, Special Edition
NASA Technical Reports Server (NTRS)
1986-01-01
Topics: Computing: The View from NASA Headquarters; Earth Resources Laboratory Applications Software: Versatile Tool for Data Analysis; The Hypercube: Cost-Effective Supercomputing; Artificial Intelligence: Rendezvous with NASA; NASA's Ada Connection; COSMIC: NASA's Software Treasurehouse; Golden Oldies: Tried and True NASA Software; Computer Technical Briefs; NASA TU Services; Digital Fly-by-Wire.
Future of Software Engineering Standards
NASA Technical Reports Server (NTRS)
Poon, Peter T.
1997-01-01
In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools as well as consumer-oriented systems.
Cartographic applications software
,
1992-01-01
The Office of the Assistant Division Chief for Research, National Mapping Division, develops computer software for the solution of geometronic problems in the fields of surveying, geodesy, remote sensing, and photogrammetry. Software that has been developed using public funds is available on request for a nominal charge to recover the cost of duplication.
Campus-Based Practices for Promoting Student Success: Software Solutions. Research Brief
ERIC Educational Resources Information Center
Horn, Aaron S.; Reinert, Leah; Reis, Michael
2015-01-01
Colleges and universities are increasingly adopting various software solutions to raise degree completion rates and lower costs (Ferguson, 2012; Vendituoli, 2014; Yanosky, 2014). Student success software, also known as Integrated Planning and Advising Services (IPAS), appears to be in high demand among both students and faculty (Dahlstrom &…
How Virtual Technology Can Impact Total Ownership Costs on a USN Vessel
2012-03-01
[Table residue: client/alternative solutions compared by labor, hardware, software, transport, power and cooling, and virtualization costs, in $M (after Lam, 2010).] ...and will hold contractors accountable to ensure energy efficiency targets of new equipment are as advertised. 2. Total Cost of Ownership...automatically placed into Standby by the VMware software, reducing energy consumption by 230 watts, even though there were 12 virtual desktops online and in
Defense Facility Condition: Revised Guidance Needed to Improve Oversight of Assessments and Ratings
2016-06-01
are to implement the standardized process in part by assessing the condition of buildings, pavement, and rail using the same set of software tools...facility to current standards; costs for labor, equipment, materials, and currency exchange rates overseas; costs for project planning and design...example, the services are to assess the condition of buildings, pavement, and rail using Sustainment Management System software tools developed by the
A Comparison of Four Software Programs for Implementing Decision Analytic Cost-Effectiveness Models.
Hollman, Chase; Paulden, Mike; Pechlivanoglou, Petros; McCabe, Christopher
2017-08-01
The volume and technical complexity of both academic and commercial research using decision analytic modelling has increased rapidly over the last two decades. The range of software programs used for their implementation has also increased, but it remains true that a small number of programs account for the vast majority of cost-effectiveness modelling work. We report a comparison of four software programs: TreeAge Pro, Microsoft Excel, R and MATLAB. Our focus is on software commonly used for building Markov models and decision trees to conduct cohort simulations, given their predominance in the published literature around cost-effectiveness modelling. Our comparison uses three qualitative criteria as proposed by Eddy et al.: "transparency and validation", "learning curve" and "capability". In addition, we introduce the quantitative criterion of processing speed. We also consider the cost of each program to academic users and commercial users. We rank the programs based on each of these criteria. We find that, whilst Microsoft Excel and TreeAge Pro are good programs for educational purposes and for producing the types of analyses typically required by health technology assessment agencies, the efficiency and transparency advantages of programming languages such as MATLAB and R become increasingly valuable when more complex analyses are required.
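The workhorse behind all four programs is the Markov cohort simulation: a transition-probability matrix advances state-occupancy fractions cycle by cycle while discounted costs and QALYs accumulate. A minimal sketch with illustrative numbers; an ICER would come from running it twice, for intervention and comparator, and dividing the incremental cost by the incremental QALYs:

    import numpy as np

    P = np.array([[0.85, 0.10, 0.05],      # Well -> Well / Sick / Dead
                  [0.00, 0.70, 0.30],      # Sick
                  [0.00, 0.00, 1.00]])     # Dead (absorbing)
    cost = np.array([500.0, 6000.0, 0.0])  # annual cost per state
    qaly = np.array([0.95, 0.60, 0.0])     # annual utility per state

    cohort = np.array([1.0, 0.0, 0.0])     # whole cohort starts Well
    disc = 0.035                           # annual discount rate
    total_cost = total_qaly = 0.0
    for year in range(40):                 # forty one-year cycles
        total_cost += cohort @ cost / (1 + disc) ** year
        total_qaly += cohort @ qaly / (1 + disc) ** year
        cohort = cohort @ P                # advance the cohort one cycle

    print(f"discounted cost ${total_cost:,.0f}, QALYs {total_qaly:.2f}")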
Future GOES-R global ground receivers
NASA Astrophysics Data System (ADS)
Dafesh, P. A.; Grayver, E.
2006-08-01
The Aerospace Corporation has developed an end-to-end testbed to demonstrate a wide range of modern modulation and coding alternatives for future broadcast by the GOES-R Global Rebroadcast (GRB) system. In particular, this paper describes a compact, low-cost, flexible GRB digital receiver that was designed, implemented, fabricated, and tested as part of this effort. This receiver demonstrates a 10-fold increase in data rate compared to the rate achievable by the current GOES generation, without a major impact on either cost or size. The digital receiver is integrated on a single PCI card with an FPGA device and analog-to-digital converters. It supports a wide range of modulations (including 8-PSK and 16-QAM) and turbo coding. With appropriate FPGA firmware and software changes, it can also be configured to receive the current (legacy) GOES signals. The receiver has been validated by sending large image files over a high-fidelity satellite channel emulator, including a space-qualified power amplifier and a white noise source. The receiver is a key component of a future GOES-R weather receiver system (also called a user terminal) that includes the antenna, low-noise amplifier, downconverter, filters, digital receiver, and receiver system software. This work describes the receiver proof of concept and its application to providing a credible estimate of the impact of using modern modulation and coding techniques in the future GOES-R system.
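One ingredient of such a receiver chain is easy to show in miniature: mapping bit triplets onto Gray-coded 8-PSK symbols, so that adjacent constellation points differ in a single bit. The sketch below is illustrative only; the GRB receiver's actual mapping, pulse shaping, and turbo-coding chain is far more involved:

    import numpy as np

    GRAY3 = [0, 1, 3, 2, 6, 7, 5, 4]       # 3-bit Gray code sequence

    def mod_8psk(bits):
        # bits: flat 0/1 array, length a multiple of 3 -> complex symbols.
        b = np.asarray(bits).reshape(-1, 3)
        idx = b[:, 0] * 4 + b[:, 1] * 2 + b[:, 2]        # bits -> integer
        pos = np.array([GRAY3.index(i) for i in idx])    # Gray -> phase slot
        return np.exp(1j * 2 * np.pi * pos / 8)

    # Neighbouring symbols differ by one bit, limiting bit errors when a
    # noisy symbol is mistaken for an adjacent phase.
    print(mod_8psk([0, 0, 0, 0, 0, 1, 0, 1, 1]).round(3))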